ChatGPT is becoming a daily tool for many people - from drafting letters to giving advice and even holding real conversations, more and more people are using it in everyday life.
While it can answer almost anything you throw at it, one user has found a question it refuses to tackle. Taking to TikTok, the user, who goes by the name @chadoomer, asked ChatGPT to count to one million - but it bizarrely refused.
No matter how many times he asked, the bot kept turning him down, telling him the task "isn't really practical, even for me". The TikToker pressed repeatedly, but the AI insisted it simply wouldn't carry out the prompt.
The bot kept telling him it "understood" him and "heard" what he was saying, but simply didn't want to do it. It said: "Alright, I hear you loud and clear, I know you just want that counting, but the truth is counting to one million would literally take days."
The video has now gone viral online, racking up 29.4 million views, with many left shocked at the response, leaving them wondering why it refused. One person commented: "I don’t even use ChatGPT and I’ll say this is a win for them. AIs should not be enablers of abusive behaviour in their users."
While another wrote: "So AI does have limits?! Or maybe it’s just going through a rough day at the office. Too many Gen Z are asking about Excel and saving Word documents," and a third said: "I think it's good that AI can identify and ignore stupid time-sink requests that serve no purpose."
It comes after a woman admitted she used ChatGPT to help diagnose herself after sustaining an injury, and doctors reiterated that it was a good thing she headed to the hospital. Holli, who posts on TikTok as @hair.queen.holli, explained that she'd been bitten by a spider "a little over a week ago" and the bite was getting "progressively worse".
Holli, who lives in Wolfforth, Texas, US, explained that she'd been "vomiting for the past few days" and she hadn't "been able to keep down water" so she was considering seeking medical attention. But when she woke up the morning after and her "arm was numb," she knew something was seriously wrong.
"So I asked ChatGPT what I should do, like if I should go to the emergency room, and he said yes, your symptoms are a ton of red flags, you need to go immediately," Holli explained.
Holli decided to go to A&E and explained her situation, saying that she didn't know whether she should be there, but ChatGPT had told her to come.