
The erotic era of ChatGPT could be its most challenging yet


He sees erotic bots as “one part of your relationship spectrum,” not a substitute for human connection, but a space where users can “get lost” in desires they might not be able to explore IRL.

Fast fun

When imagining who would actually use a chatbot for sexual pleasure, it’s easy to picture the stereotype: a greasy-haired man who hasn’t left his house in a few days, or who feels alienated from physical contact in other ways. After all, it was men who rushed to start using generative AI tools, and now discussions of a male “loneliness epidemic” feel inevitable.

Devlin pushes back against the idea that “incel types” are the only people who turn to AI bots for sexual gratification. “There’s a general perception that this is for single straight men, and that’s not been the case in any of the research I’ve done,” she says. She points to the r/MyBoyfriendIsAI subreddit as an example of women who use ChatGPT for companionship.

“If you think this kind of relationship is risky, let me introduce you to human relationships,” MacArthur says. Devlin echoes this sentiment, saying that women face torrents of toxicity from men online, so choosing to “make yourself a nice, respectful boyfriend” through a chatbot makes sense to her.

Carpenter is more cautious and clinical in her approach to ChatGPT. “People shouldn’t automatically put it in a social category of something that you can share intimacy with or that’s like a friend or that you should trust,” she says. “It’s not your friend.” She argues that interactions with bots should fall into a new social category, distinct from interactions between humans.

Each expert who spoke to WIRED highlighted user privacy as a key concern. If a user’s ChatGPT account is hacked or chat transcripts are leaked, erotic conversations wouldn’t just be a point of embarrassment; they could be harmful. Much like a user’s pornography habits or browser history, their chat logs can include deeply sensitive details, such as a closeted person’s sexual orientation.

Devlin says erotic chat conversations could open users up to “emotional commodification,” in which arousal becomes a source of revenue for AI companies. “I think that’s very manipulative,” she says.

Imagine a future version of ChatGPT that’s amazing at dirty talk and optimized to appeal, through text, images, and audio, to your deepest sexual desires, but whose subscription costs more every month.

“It’s a really seductive technology,” Devlin says. “It gives us connection, whether it’s sexual or romantic. Everyone wants connection. Everyone wants to feel wanted.”
