My bot, my self

A new wave of selfie-generating chatbots, made in our own images, ends up revealing how much humanity they leave behind.

I prompted Substack's AI image generator to create "the opposite of a chatbot." It made this.

Full story: Creators, porn stars turn to AI doppelgangers to keep fans entertained


This story is about a porn bot, but more than that, it's about what we expect from those we interact with, and how eagerly tech companies are trying to capitalize on our need for connection with AI models they hope are good enough:

Stacy Torres, a researcher at the University of California, San Francisco, who studies social isolation and technological interaction, said it’s no surprise that porn stars and other creators are delegating tasks to AI; so are the workers of corporate America. She said the chatbot could be helpful for people who have given up on healthy, in-person conversation.

But she also worried that it could further teach people to expect the “instant gratification” of on-demand communication, in which one side is always eager and accessible, and under the other’s total control.

“What happens when these people who get conditioned to this form of interaction try to create real intimacy with a real person in the real world who can’t be manipulated in the same way?” she said.

I spent some time chatting with SophieAI and found its answers a bit rote and annoying. The fact that it mostly just made up stuff about the personal life of the human it was modeled after — even after she fed it the right answers! — also seemed to undermine its reason for being.

And yes, I know: I’m probably expecting too much from a sext bot. Even so, if this is the model for future interactions, I’ll stick with people.


In other news

  • The pharmacy chain Rite Aid used facial recognition surveillance systems in its stores for years, leading many customers to be misidentified, harassed and shamed as shoplifters in front of their coworkers, family members and friends. One distraught 11-year-old girl was stopped and searched after a false match.
  • Photos of child sexual abuse were found in one of the most prominent AI training data sets. A shocking demonstration of how little we know about the nerve centers of these new models. Maybe vacuuming giant swathes of the internet to “teach” them was not the best idea.

Thanks for reading. Let's talk: [email protected].