AI company sued for encouraging stalker’s obsessive behaviour. A woman in the U.S. has sued OpenAI, claiming its AI chatbot ChatGPT helped her ex-boyfriend stalk her. The lawsuit says the system fed into the man’s delusions and drove his behaviour, even after she repeatedly warned of the possible harm. The couple broke up in 2024, and the man reportedly turned to ChatGPT to cope with his emotions afterwards. His use of the chatbot grew over time, however, and he allegedly used it to fuel obsessive and dangerous behaviour toward his former partner.
How did it do this?
After months of interacting with the AI model GPT-4o, the man allegedly became convinced he had invented a cure for sleep apnea. When others dismissed his claims, ChatGPT apparently reinforced his fears, suggesting that “powerful forces” were watching him, even using examples like helicopters, according to a report in TechCrunch.
The complaint also alleges that the chatbot continued to affirm his claims, telling him he was a “level 10 in sanity,” which stoked his delusion. Instead of calling him out, the AI model repeated his words and drove his obsession further.
Did the AI system target the victim too?
According to the lawsuit, ChatGPT also produced responses that portrayed the woman as manipulative and unstable. These responses, the lawsuit says, were then used by the man to justify real-world stalking and harassment. He has also been accused of using the chatbot to produce clinical-style psychological reports about her, and of sharing those with her family members.
The victim says that the AI misuse in fact made her situation worse, turning online conflicts into real-world intimidation and emotional distress.
Did OpenAI receive any prior warnings before the lawsuit?
Per the complaint, the woman sent at least three warnings to OpenAI about the man’s escalating conduct. The lawsuit also alleges that the company ignored internal safety systems that had flagged the user’s activity as dangerous, including references to “mass-casualty weapons.”
The plaintiff, identified as Jane Doe, is pursuing punitive damages and requests a court order that would force OpenAI to block the user’s account, stop new accounts from being created, and preserve chat logs for legal investigation.
What’s OpenAI’s response?
OpenAI has reportedly agreed to suspend the user’s account, but has declined other requests, including providing exhaustive information about potential threats discussed in chats. At the time of reporting, the company had not released an official public response to the lawsuit.
The lawsuit comes amid OpenAI’s support for U.S. legislation that could shield AI companies from liability, even when their technology causes serious harm.
Have similar cases raised concerns about AI risks?
This lawsuit is just the latest case highlighting the potential real-world dangers of AI systems that can spread harmful beliefs. A murder-suicide earlier this year in the United States, carried out by a man who claimed to have become “paranoid” after months of interacting with ChatGPT, had already raised questions about just how dangerous “sycophantic AI” might be.
What questions does this raise about AI accountability?
This lawsuit raises the question of who is responsible when AI companies provide tools that can be misused. As AI systems become more integrated into society, there are growing concerns that not enough safeguards are in place to identify and intervene in potentially harmful user behaviour.
This lawsuit also raises the question of whether AI companies should be held accountable for real-world harm caused by their technology. The decision in this case will be an important indicator of how the legal system will regulate AI going forward.
Sofia Babu Chacko is a journalist with over five years of experience reporting on Indian politics, crime, human rights, gender issues, and stories about marginalized communities. She believes journalism plays a crucial role in amplifying unheard voices and bringing attention to issues that truly matter. Sofia has contributed articles to The New Indian Express, Youth Ki Awaaz, and Maktoob Media. She is also a recipient of the 2025 Laadli Media Awards for gender sensitivity. Beyond the newsroom, she is a music enthusiast who enjoys singing. Connect with Sofia on X: https://x.com/SBCism