davidgro

joined 2 years ago
[–] davidgro@lemmy.world 1 point 15 hours ago

That is a good point. Websites also if they are visited daily or if a web beacon or such can access the API.

Manually adjusting the brackets until 18+ (or just lying about the precise date) would grant more privacy. I can see making that trade-off though.

[–] davidgro@lemmy.world 1 point 1 day ago (2 children)

As an option, so it can automatically increment the brackets.

[–] davidgro@lemmy.world 7 points 1 day ago (4 children)

It's not though. It's literally asking the user "how old are you?" and not even caring if they lie. It's not even requiring a date, just a number of years.

[–] davidgro@lemmy.world 1 point 4 days ago

I guess you haven't heard of Avenue Q?

[–] davidgro@lemmy.world 0 points 7 months ago (1 child)

Why would someone direct the output of an LLM to a terminal on its own machine like that? That just sounds like an invitation to an ordinary disaster with all the 'rm -rf' content on the Internet (aka training data). That still wouldn't be access on a second machine though, and also even if it could make a copy, it would be an exact copy, or an incomplete (broken) copy. There's no reasonable way it could 'mutate' and still work using terminal commands.

And to be a meme requires minds. There were no humans or other minds in my analogy. Nor in your question.

[–] davidgro@lemmy.world 0 points 7 months ago* (last edited 7 months ago) (3 children)

If you know that it's fancy autocomplete then why do you think it could "copy itself"?

The output of an LLM is a different thing from the model itself. The output is a stream of tokens. It doesn't have access to the file systems it runs on, and certainly not the LLM's own compiled binaries (or even less source code) - it doesn't have access to the LLM's weights either. (Of course it would hallucinate that it does if asked)

This is like worrying that the music coming from a player piano might copy itself to another piano.