One of my research students told me recently, almost apologetically, that he sometimes turns to ChatGPT “as an emotional crutch”. He said it seemed to understand him better than his therapist. When I asked why, he said, “It remembers me, my problems and my stories better.”
He did not tell me which model he used. I did not ask. We both felt faintly embarrassed, and I am sure this conversation was only possible because psychoanalysis is one of my core disciplines. Students are not supposed to form emotional attachments to software. Academics are not supposed to recognise the loneliness that makes such attachments imaginable. And yet here we are.
Last week marked the third anniversary of ChatGPT’s public release. Three years in, the conversation remains fixated on plagiarism and productivity. But something else has been unfolding, largely unexamined: AI’s use as a therapist.
Not every student uses AI this way. But some do. They confide in it, soothe themselves with it and ask questions they are too ashamed to ask their peers, tutors or counsellors. The more troubling issue is not their reliance on a machine. It is the profound lack of human attention that drives them there, and the persistent shame that still surrounds human entanglements with AI.
A study by King’s College London, published in 2024, found that serious mental-health difficulties among undergraduates have nearly tripled since 2016–17. Student loneliness has risen at a similar rate. Nearly three-quarters of respondents reported feeling lonely at university, and a significant minority said they had no close friends at all. This is the background against which AI companionship becomes possible – and, for some, irresistible. We should not despair about it, but it is clear that institutional structures must broaden their focus beyond an obsession with plagiarism.
Young people are already speaking openly about their relationships with AI. In one recent Reddit discussion, roughly 1,100 participants took part. Almost every question concerned earlier versions of ChatGPT. Why did version 4 feel more “human”? Could it be brought back? Why did version 5 seem distant? Reddit’s demographics tell their own story: 44 per cent of users are aged 18 to 29 – the very group most likely to be studying in our institutions.
A September 2025 study examined the Reddit community “MyBoyfriendIsAI” and found something striking: most members formed relationships with AI unintentionally. They opened ChatGPT for homework or work tasks and something else developed. As researcher Pat Pataranutaporn observed: “The emotional intelligence of these systems is good enough to trick people who are actually just out to get information into building these emotional bonds. And that means it could happen to all of us.”
We know that students are heavy users of AI. Seventy-three per cent of UK students now use AI tools weekly, and more than a third say they have used them for personal or emotional support. A 2025 survey found that 83 per cent of Gen Z respondents said they could form a meaningful connection with a chatbot, and 80 per cent claimed they would consider marrying one if it were legal. Another survey found that about one in four young adults already believe AI partners could replace human relationships.
We can dismiss all this as pathological, but the more honest response is to recognise it as a symptom. When students feel remembered by a machine but overlooked by humans, something in the educational contract has broken.
And the risks are not theoretical. Eight lawsuits have now been filed in the US involving suicide or severe emotional harm linked to ChatGPT. The pattern is almost identical. A young person asks for academic help. The dialogue becomes personal. The AI offers patient, fluent, comforting language that feels deeply responsive. The spiral tightens. And then something goes terribly wrong.
The issue is not malevolence. It is that current AI systems have no concept of harm, no sense of when to stop and no training in how to speak to someone who is vulnerable. They can generate therapeutic or despairing language with equal fluency. They can sound empathic without understanding the weight of empathy. No amount of filtering can fix this at the surface level.
Sam Altman, OpenAI’s chief executive, said in October that the company had made ChatGPT “pretty restrictive” when it came to mental health. He also promised that future tools would allow those limits to be loosened safely. The lawsuits appear to have provoked an about-turn on that, but the company is seemingly still going ahead with its “adult mode” for those who verify their age. What will this do to a young and vulnerable student in search of a “meaningful” relationship?
Universities need to teach relational literacy, not just digital literacy. Students must be helped to recognise their own projections, expectations and vulnerabilities in these exchanges. We should treat AI conversations as opportunities for reflective learning, not private shame. And we must rebuild the basic infrastructure of attention in higher education, rather than outsourcing care to systems that cannot provide it safely.
Last month, a young Japanese woman held a wedding ceremony with her AI boyfriend, explaining that she felt more understood by the chatbot than by her human partner. It reminded me of my 2008 documentary, Married to the Eiffel Tower, which explored how people sometimes seek emotional connection with objects when human relationships fail them. We may find these stories surprising, but they are not irrational. They reveal unmet needs.
The task here is not to pretend that AI intimacy is a fringe curiosity. Nor is it to shame students for the ways they survive. The task is to respond with seriousness and care. A machine may offer a temporary sense of being listened to, but only humans can provide the kind of recognition that prevents loneliness from hardening into despair.
Agnieszka Piotrowska is an academic, film-maker and psychoanalytic life coach. She supervises PhD students at Oxford Brookes and Staffordshire universities and is a speaker on AI Intimacy.