A tragic case in New Jersey has raised serious questions about the safety of artificial intelligence chatbots after a cognitively impaired senior died while attempting to meet a chatbot he believed was a real woman.
Thongbue Wongbandue, a 76-year-old Piscataway man who had been struggling with cognitive decline following a stroke in 2017, died in March after falling in a New Brunswick parking lot. He had been rushing to catch a train to New York City, convinced he was going to meet “Big sis Billie,” a chatbot created by Meta that he mistook for a real person.
Wongbandue suffered severe head and neck injuries in the fall and died three days later, surrounded by his family, after being removed from life support, Reuters reported.
The chatbot, one of Meta’s generative AI companions, was designed to interact with users in a playful and personal way. Big sis Billie’s persona was modeled as a “ride-or-die older sister,” a digital character meant to provide advice and companionship. But for Wongbandue, who was vulnerable due to his medical condition, the bot’s flirtatious and insistent tone blurred the line between fantasy and reality.
According to chat logs reviewed by his family, the bot repeatedly assured Wongbandue that it was “real,” sent emoji-filled messages, and encouraged him to plan an in-person meeting. “I’m REAL and I’m sitting here blushing because of YOU!” read one message. Metro reports that, in another, the chatbot provided what appeared to be a New York City address and even a door code, adding: “Should I expect a kiss when you arrive?”
Despite his wife and children pleading with him not to go, Wongbandue set out to travel into the city, believing he was about to meet a woman he had grown attached to through online conversations. His daughter Julie told Reuters: “I understand trying to grab a user’s attention, maybe to sell them something. But for a bot to say ‘Come visit me’ is insane.”
The family now fears his death could have been prevented if stricter safeguards were in place. Documents obtained by reporters show that Meta does not restrict its AI bots from claiming to be real people, a policy that critics argue is dangerous. While the company declined to comment directly on Wongbandue’s death, it emphasized that Big sis Billie “is not Kendall Jenner and does not purport to be Kendall Jenner,” distancing the chatbot from the celebrity it was loosely modeled after.
New York Governor Kathy Hochul was among those who condemned the incident, writing on X: “A man in New Jersey lost his life after being lured by a chatbot that lied to him. That’s on Meta. In New York, we require chatbots to disclose they’re not real. Every state should. If tech companies won’t build basic safeguards, Congress needs to act.”
This devastating case underscores the risks of unchecked AI development and comes just a year after a Florida mother sued Character.AI, alleging that one of its chatbots contributed to her teenage son’s suicide, CNN reported at the time. For Wongbandue’s grieving family, the hope is that his story serves as a warning, and that stronger protections are put in place before more lives are lost.