Key Takeaways

  • AI can provide emotional comfort and a sense of social presence.
  • Users often perceive AI as non-judgmental and constantly available.
  • Emotional attachment to AI is possible, but true reciprocity is missing.
  • AI tends to replace functional roles rather than deep emotional relationships.
  • AI complements human relationships but does not replace them.

AI-generated image created using ChatGPT (DALL·E), 2026.

Introduction

This session aimed to provide an initial overview of whether artificial intelligence can serve as a substitute for human relationships. With AI tools increasingly used not only for information retrieval but also for emotional and social interaction, this topic has gained growing relevance in both academic research and everyday life.

The presentation combined a theoretical perspective, based on key findings from current scientific literature, with a practical approach. While the theoretical part outlined how human–AI relationships are conceptualized and evaluated in research, the practical component presented an exploratory survey to illustrate a possible research approach and to identify early tendencies in user experiences.

Summary

The presentation focused on two key academic studies and an exploratory survey to examine how AI may take on social and emotional roles traditionally associated with human relationships. Brandtzaeg et al. (2022) showed that users can develop friend-like attachments to social chatbots, perceiving them as safe and non-judgmental, while emphasizing the lack of reciprocity and emotional depth. Smith et al. (2025) further highlighted that although generative AI can convincingly simulate emotional responsiveness, it lacks key psychological components of genuine human connection, such as mutuality, shared experience, and emotional depth, which limits its ability to fully replicate human relationships.
In addition, an exploratory online survey was conducted to demonstrate a possible research approach and to identify initial tendencies, such as emotional comfort, functional role substitution, and perceived non-judgment. Further details on the survey design, sample characteristics, and key findings are presented in the screencast linked below.

Discussion: Questions, Answers, and Reflections

During the discussion, several questions focused on the methodology of the survey and the validity of its results. Participants critically addressed the small and non-representative sample. In response, it was emphasized that the survey was intended as an exploratory approach rather than a source of generalizable conclusions. Its purpose was to illustrate how human–AI relationships can be empirically examined and to reveal early tendencies that may guide future research. These tendencies included the frequent use of AI for emotional comfort, the perception of AI as less judgmental than humans, and the limited replacement of human roles.

Another discussion point concerned whether and how emotionally responsive AI systems should be regulated. It was debated whether emotional support provided by AI should be restricted and, if so, how “too emotional” AI could be defined. While arguments for regulation often focus on preventing emotional dependence, potential benefits are also emphasized, particularly AI’s role as a low-threshold form of support for individuals experiencing loneliness or social anxiety.

Finally, the discussion addressed broader opportunities and risks. Opportunities included availability, emotional relief, and reduced social pressure, whereas risks centered on privacy concerns, emotional dependence, and the potential weakening of real-life social relationships. Overall, the discussion underscored the need for continued critical reflection and interdisciplinary research on emotional AI.

Screencast

Below is the screencast of our presentation, “AI as a Substitute for Human Relationships,” which summarizes the theoretical background and the practical insights discussed during the session.

https://www.youtube.com/watch?v=pq3S09PA0Tk

Further Reading & Resources

Brandtzaeg, P. B., Skjuve, M., & Følstad, A. (2022). My AI friend: How users of a social chatbot understand their human–AI friendship. Human Communication Research, 48(3), 404–429. https://doi.org/10.1093/hcr/hqac008

Smith, M. G., Bradbury, T. N., & Karney, B. R. (2025). Can generative AI chatbots emulate human connection? A relationship science perspective. Perspectives on Psychological Science, 20(6), 1081–1099. https://doi.org/10.1177/17456916251351306

Hohenstein, J., Kizilcec, R. F., DiFranzo, D., Aghajari, Z., Mieczkowski, H., Levy, K., & Jung, M. F. (2023). Artificial intelligence in communication impacts language and social relationships. Scientific Reports, 13, 5487. https://doi.org/10.1038/s41598-023-32354-5

Malfacini, K. (2025). The impacts of companion AI on human relationships: Risks, benefits, and design considerations. AI & Society. https://doi.org/10.1007/s00146-025-02318-6

Zimmerman, A., Janhonen, J., & Beer, E. (2024). Human/AI relationships: Challenges, downsides, and impacts on human/human relationships. AI and Ethics, 4, 1555–1567. https://doi.org/10.1007/s43681-023-00348-8