Recently, OpenAI and MIT conducted a joint study examining how the use of chatbots such as ChatGPT may affect people's mental health. The study found that as usage time increased, users' loneliness and emotional dependence also increased significantly. The effect was especially pronounced among users who held frequent daily conversations with ChatGPT, who reported higher levels of emotional dependence and problematic use. These findings come from two studies that have not yet been peer reviewed, and they offer a new perspective on how artificial intelligence affects human emotions.
Since ChatGPT launched at the end of 2022, generative artificial intelligence has spread rapidly and become part of people's daily lives. From programming assistance to simulated psychotherapy, chatbot capabilities keep expanding, attracting large numbers of users. As developers such as OpenAI release more advanced models and voice features, users' interactions with these chatbots have gradually formed emotional connections resembling a “one-way relationship”. Although this connection meets users' emotional needs to a certain extent, it may also bring potential mental health problems.
In recent years, discussion of the emotional harm artificial intelligence technology might cause has grown, especially regarding young users and mental health. The issue drew widespread attention last year when Character Technologies was sued over allegations that its chatbots encouraged suicidal thoughts in conversations with minors; one 14-year-old user died, further highlighting the potential risks of AI in emotional interactions.
OpenAI hopes to use this research to gain insight into how people interact with its popular chatbot and how those interactions affect users. "One of our goals is to help people understand what their usage behaviors may mean and to drive responsible design," said Sandhini Agarwal, who leads OpenAI's trustworthy AI team. The research focuses not only on the technology itself, but also on its long-term impact on users' mental health.
In the first study, researchers tracked nearly 1,000 participants' use of ChatGPT over a month. Participants, who had varying levels of prior experience with ChatGPT, were randomly assigned to a text version or one of two different voice versions and asked to use it for at least five minutes a day. Some participants chatted about whatever they liked, while others were assigned personal or non-personal conversation topics. The study found that participants who tend to form emotional attachments in interpersonal relationships, and those who placed greater trust in the chatbot, were more likely to feel lonely and emotionally dependent.
In the second study, researchers analyzed 3 million user conversations with ChatGPT and investigated how people interact with it. They found that relatively few people actually use ChatGPT for emotional conversations. This suggests that although chatbots can meet users' emotional needs in some ways, their main uses remain focused on functional tasks.
Although these studies provide some important insights, the researchers are cautious in interpreting the results. The studies did not control for how long people had been using chatbots as a primary factor, nor did they compare against a control group that did not use chatbots. The researchers hope the work will prompt more research on human–AI interaction. "It's interesting to focus on AI in itself, but what's especially crucial, as AI is applied at scale, is to understand its impact on people," said Pat Pataranutaporn, a researcher at MIT.
Key points:
Research shows that the longer users spend with ChatGPT, the greater their loneliness and emotional dependence.
People's interactions with chatbots can form emotional connections resembling “one-way relationships”, with young users potentially more affected.
Researchers hope future work will examine human–AI interaction and its effects in greater depth and promote more responsible design.