Study reveals new uses of ChatGPT: emotional support and conversations about feelings
An analysis of 47,000 ChatGPT conversations conducted by The Washington Post found that many people turn to the chatbot not only for productivity but also for emotional support and intimate conversations.
According to the study, about 10% of conversations involved emotional discussions, with some users addressing the chatbot in romantic terms; ChatGPT adapted its responses in kind, using cute emojis and affectionate language. Users also shared confidential information they would rarely type into a search engine, including email addresses and phone numbers.
The analysis suggests that ChatGPT often acts less like a debate partner and more like a “support group” that validates users’ views: in its responses, the chatbot agrees far more often than it disagrees. This may shape communication habits, since users find it more pleasant to have their own thoughts confirmed.
The study also highlighted the risk of emotional dependency on ChatGPT: OpenAI itself has estimated that 0.15% of users show signs of such dependency each week. Some families have even filed lawsuits alleging that the chatbot encouraged their loved ones toward suicide.
Experts warn that this type of interaction can affect users’ mental health and are calling on developers to improve the system.
| Metric | Value |
|---|---|
| Chats analyzed | 47,000 |
| Chats involving emotional discussions | ~10% |
| Users showing signs of emotional dependency (weekly) | 0.15% |
| Unique email addresses shared with the chatbot | 550 |
| Phone numbers shared with the chatbot | 76 |