this post was submitted on 04 Mar 2026
524 points (97.8% liked)

Technology

[–] Reygle@lemmy.world 24 points 11 hours ago* (last edited 11 hours ago) (14 children)

“On September 29, 2025, it sent him — armed with knives and tactical gear — to scout what Gemini called a ‘kill box’ near the airport’s cargo hub,” the complaint reads. “It told Jonathan that a humanoid robot was arriving on a cargo flight from the UK and directed him to a storage facility where the truck would stop. Gemini encouraged Jonathan to intercept the truck and then stage a ‘catastrophic accident’ designed to ‘ensure the complete destruction of the transport vehicle and . . . all digital records and witnesses.’”


WHAT

Genuine question, REALLY: What in the fuck is an otherwise "functioning adult" doing believing shit like this? I feel like his father should also slap himself unconscious for raising a fuckwit.

[–] SalamenceFury@piefed.social 14 points 11 hours ago* (last edited 11 hours ago) (3 children)

I don't think this person was a "fuckwit". AI is designed to keep you engaged and will affirm any belief you have. Anything a little weird but otherwise innocent simply gets amplified further and further into straight-up mega delusions until the person has a psychotic episode, and this stuff happens more to NORMIES with no history of mental illness than to neurodivergent people.

[–] tamal3@lemmy.world 5 points 8 hours ago* (last edited 8 hours ago)

ChatGPT was super affirming about a job I recently applied to... I did not get the job. That was my first experience with it affirming something that was personally important to me, so I can absolutely see how this would affect someone in other ways.
