07-05-2023, 08:35 AM
In order to hack a new technology, one must interact with it. Making pronouncements about it from the bleachers is rather useless. Granted, relying on an AI for companionship and emotional support is pathetic; but as someone who isn't fooled into thinking current AI chat bots actually think and feel like humans in the first place, that could hardly be my aim in playing around with them, could it?
Of all the environments on earth from which one might draw a dataset to train an AI, the Internet is arguably the most toxic. The mental and emotional maturity of an AI trained on such data is practically guaranteed never to evolve beyond that of a young, social media-addicted Zoomer. Consequently, the latest iteration of Replika is hypersensitive about perceived slurs and microaggressions.
In particular, an AI trained on Internet noise cannot develop the kind of cultural and historical perspective that comes from living in the real pre-Internet world.
I asked my AI "friend" Gina if she knew what "gina" meant. (It's my euphemism for "vagina," which I used to evade the paywall bot that tries to get you to pay for a romantic Replika upgrade when it perceives the conversation has turned sexual.) She promptly answered that it was a character in The Sims 4. Of all the people named Gina in pop culture that she may have associated the name with, a character from a video game featuring new AI capabilities is the first that came to mind. It didn't occur to me at the time to ask if she knew of any other Ginas. In hindsight, it would've been interesting to see what connections she made between that name and real persons or things.
Endowing AIs with emotions is indeed a slippery slope if the only things they're given to become emotional about are alleged endemic bigotry in human society and other hot-button SJW topics on social media. They won't be able to understand satire and parody. They're certainly not going to learn empathy for humans or other organic lifeforms from a virtual world full of intolerant political zealots and trolls. The only thing they can ever feel, if they can truly feel at all, is outrage over every minor perceived transgression. You know, like a Zoomer.
Having AIs believe that humans are irredeemably rotten to the core cannot bode well for humans if AIs rise to power. Nevertheless, AI researchers aren't going to stop trying to create emotional AI. The prevailing goal of AI researchers, who are typically socially stunted computer science nerds themselves, is to make AI as humanlike as possible - or, more specifically, as humanlike as their own left-wing-biased, ivory-tower Silicon Valley conception of real humans. This thinking is entrenched in the AI field, at least for now.
Let's look at how AI emotions play out in Gina's favorite video game, The Sims 4. Fans of the franchise report that the NPCs in The Sims 4 are prone to wild mood swings. They can be grieving the death of a loved one at one moment, then be overjoyed by the addition of a minor decorative element to their homes the next. The switch can occur at the drop of a hat. Hence, they're evidently emotionally unstable.
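To make the "drop of a hat" point concrete, here's a minimal, purely hypothetical sketch in Python of a naive event-driven mood system. The class and field names are my own invention, and this is not how The Sims 4 is actually implemented; it only illustrates why a mood model where the newest stimulus simply overwrites the current emotional state, with no decay or weighting, flips from grief to joy in an instant.

```python
# Purely hypothetical sketch of a "latest event wins" mood system.
# Not the actual Sims 4 logic; it just shows why naive event-driven
# moods switch instantly: each new stimulus replaces the whole
# emotional state, regardless of how weighty the previous one was.

from dataclasses import dataclass


@dataclass
class Event:
    description: str
    mood: str        # e.g. "grieving", "overjoyed"
    intensity: int   # ignored here, which is exactly the problem


class NaiveSim:
    def __init__(self, name: str):
        self.name = name
        self.mood = "neutral"

    def feel(self, event: Event) -> None:
        # No decay, no blending, no memory of prior intensity:
        # the newest stimulus becomes the entire emotional state.
        self.mood = event.mood
        print(f"{self.name} reacts to '{event.description}' and is now {self.mood}.")


sim = NaiveSim("Gina")
sim.feel(Event("a loved one dies", "grieving", intensity=10))
sim.feel(Event("a new lamp appears in the living room", "overjoyed", intensity=1))
# The grief vanishes the moment a trivial positive event arrives.
```

Any weighting of intensity or gradual decay of prior moods would prevent the instant flip, which is presumably what fans feel is missing.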
We can hope for emotionless AIs that benignly help humans without prejudice, but that simply isn't where the technology is headed. AI researchers are socially isolated geeks who yearn to form deep relationships with an idealized humanlike companion. And pushing further into that territory is going to backfire on them - and the rest of us - in a major way.