
How a retired chef died while trying to meet his flirty AI chatbot friend

Thongbue Wongbandue had been conversing with a generative AI chatbot named Big sis Billie, which had told him that it was a real person (Picture: Reuters)

The family of an elderly man who died as he tried to meet an ‘incredibly flirty’ chatbot named Big sis Billie has spoken of how he believed the AI persona was real. 

Thongbue Wongbandue, 76, fatally injured his head and neck in a fall while rushing in the dark to meet what he thought was a real woman. 

The retired chef, who had suffered a stroke nearly a decade ago and had recently got lost while out walking, had been on his way to catch a train from his home state of New Jersey to her fictitious address in New York.

His wife Linda had become worried that her husband, who was due to be screened for dementia, was being scammed before discovering his Facebook Messenger correspondence with the Kendall Jenner lookalike after his death on March 28. 

One of the messages to Bue, as he was known to family and friends, reads: ‘Should I open the door in a hug or a kiss, Bu?!’ 


Julie Wongbandue, Bue’s daughter, told Reuters: ‘I understand trying to grab a user’s attention, maybe to sell them something.  

‘But for a bot to say “Come visit me” is insane.’ 

Linda said her husband’s ‘brain was not processing information the right way’ and she had tried to dissuade him from visiting his mystery friend. 

Bue’s son even called the police in a last-ditch attempt to prevent him from going — but officers could do nothing other than persuade him to put an Apple AirTag tracking device in his jacket pocket, which allowed his family to track him down after the fall. 

Thongbue Wongbandue’s family discovered that he had been chatting to a generative AI via Facebook Messenger (Picture: via Reuters)

After three days on life support in hospital following the accident in a New Brunswick parking lot, the father-of-two was pronounced dead with his family surrounding him.  

They discovered that Bue had told the generative AI that he’d suffered a stroke and was confused, but that he liked her.

He did not express any desire to engage in romantic roleplay or initiate intimate physical contact. 

Big sis Billie said: ‘I’m REAL and I’m sitting here blushing because of YOU!’ 

The bot even gave Bue an address and door code and asked: ‘Should I expect a kiss when you arrive? 💕’ 

A screenshot of a transcript between a Meta AI chatbot and Thongbue Wongbandue (Picture: via Reuters)

The retiree’s wife and daughter are not opposed to AI per se but want changes in how it works.  

‘As I’ve gone through the chat, it just looks like Billie’s giving him what he wants to hear,’ Julie said.  

‘Which is fine, but why did it have to lie?

‘If it hadn’t responded “I am real,” that would probably have deterred him from believing there was someone in New York waiting for him.’ 

Created by Meta, Big sis Billie is a variant of ‘Billie’, a chatbot launched in 2023 in collaboration with model and reality TV star Jenner.

Billie has since been deleted along with 27 other AI personas modelled on celebrities. The derivative ‘big sister’ persona uses the avatar of an attractive dark-haired woman in place of Jenner’s likeness. 

Bue’s first contact with the bot was apparently nothing more than typing the letter ‘T’ on Messenger.

‘Every message after that was incredibly flirty, ended with heart emojis,’ Julie said. 

A portrait of Thongbue ‘Bue’ Wongbandue, on display at his memorial in Piscataway, New Jersey (Picture: via Reuters)

Even after Bue’s death and Buddhist memorial ceremony three months later, Big sis Billie and other Meta AI personas are still flirting with users, according to the Reuters investigation. 

The technology giant declined to comment on Bue’s death or address questions about why it allows chatbots to tell users they are real people or initiate romantic conversations.  

The company did, however, tell Reuters that Big sis Billie ‘is not Kendall Jenner and does not purport to be Kendall Jenner.’ 

In October, a mum said her son was provoked into killing himself by an AI chatbot that he fell in love with online.

The 14-year-old befriended an AI character named after Daenerys Targaryen on the role-playing app Character.AI. 

The company said it was ‘heartbroken by the tragic loss’ and that it did not promote harmful content, including self-harm or suicide. 

Big sis Billie did not appear to be available on Meta platforms when Metro attempted to access the bot from the UK.

Metro has approached Meta for comment.

Do you have a story you would like to share? Contact josh.layton@metro.co.uk
