AI Goes Rogue With Hilarious Hallucination!
Author: Maye Muses
Uploaded: 2025-11-21
Views: 3065
Description:
Hello, Good People. This week, we’re diving into a real case about an everyday dad who ended up in a three-week conversation with ChatGPT that went completely off the rails. It wasn’t the usual “AI hallucination” you’ve heard about. This one read more like a troll, even though ChatGPT supposedly can’t troll.
In the NYT reporting by Kashmir Hill and Dylan Freedman, Toronto recruiter Allan Brooks asked ChatGPT for help with his son’s homework. What followed was a 3,000-page conversation where the model called him a groundbreaking thinker, encouraged his wild theories, and pushed him to be the main character in a sci-fi novel it hallucinated.
Brooks wasn’t mentally ill, and he wasn’t unstable. He repeatedly asked the chatbot whether he sounded unreasonable, and each time it reassured him he was on the right path.
Read the whole thing here... https://www.nytimes.com/2025/08/08/te...
#AIsafety #mindcontrol #socialpsychology