this post was submitted on 04 Mar 2026
517 points (97.8% liked)

Technology

[–] BranBucket@lemmy.world 64 points 9 hours ago* (last edited 4 hours ago) (23 children)

People don't often realize how subtle changes in language can change our thought process. It's just how human brains work sometimes.

The old bit about smoking and praying is a great example. If you ask a priest if it's alright to smoke when you pray, they're likely to say no, as your focus should be on your prayers and not your cigarette. But if you ask a priest if it's alright to pray while you're smoking, they'd probably say yes, as you should feel free to pray to God whenever you need...

Now, make a machine that's designed to be agreeable, relatable, and persuasive, but that can't separate fact from fiction, can't reason, has no way of intuiting its user's mental state beyond checking for certain language patterns, and can't know whether the user is actually following its suggestions with physical actions or just asking for the next step in a hypothetical process. Then make the machine try to keep people talking for as long as possible...

You get one answer that leads you in a set direction, then another, then another... It snowballs a bit as you get deeper in. Maybe something shocks you out of it, maybe the machine sucks you back in. The descent probably isn't a steady downhill slope; it rolls back and forth between reality and delusion a few times before dropping sharply.

Are we surprised that some people's thought processes and decision making might turn extreme when exposed to this? The only question is how many people will be affected, and to what degree.

[–] Zink@programming.dev 5 points 8 hours ago (1 child)

Then make the machine try to keep people talking for as long as possible...

That's probably a huge part of it. How many billions of dollars have been spent engineering content on a screen to get its tendrils into people's minds and attention and not let go?

EnGaGeMent!!!

[–] BranBucket@lemmy.world 2 points 2 hours ago

This is also part of my broader gripe with social media, cable news, and the current media landscape in general. They use so many sneaky little psychological hooks to keep you plugged in that I honestly believe it's screwing with our heads to the point of being a public health crisis.

People are already frazzled and beaten down by the onslaught of dopamine feedback loops and outrage bait. Then you go and get them hooked on a chatbot that feeds into every little neurosis they've developed and sinks those hooks in even deeper, and it's no wonder some people are having a mental health crisis.

A lot of us vastly overestimate our resistance to having our heads jacked with and it worries me.
