The other day I messaged a friend's niece to tell her how to disable a setting on her laptop that was potentially leaking her work to feed AI. She works in educational psychology, so I didn't want patient/student data getting fed into some black box. All well and good so far.
What really surprised me, however, was her reaction.
"Don't I want AI to be trained?"
Holy fuck, do we have an educational failing in lay people's understanding of what LLMs are and what they imply.
Pseudo Nym
in reply to Quixoticgeek

As an old hand on here and in tech spaces, I clearly recall the introduction of "cookies" to web 1.0 technologies.
There had already been advertising banners, and cookies promised "customization" of ads, making them more relevant to you, rather than being arbitrary billboards everyone saw.
On the surface, if you accepted the framing of ads being a fact of life on the web, it sounded reasonable, in the same way "training" AI does.
You want the AI to be better, right?