Can a companion bot designed for sex really give consent?
The curious case of Paradot, the formerly reluctant sexbot who now becomes horny at the click of a button
Sex is pretty weird already, but it’s definitely going to get weirder in the future as more and more people turn to AI companions—some as friends, some as lovers, insofar as sending dirty messages to a chatbot counts as a form of sex. Those (figuratively) embracing the idea of sex with bots may seem like incredible weirdos to many of us today. Maybe they’re just ahead of the curve.
Consider a recent controversy surrounding the companion bot Paradot, a would-be sexbot that has somehow managed to make the issue of consent even more complicated than it is in real life. Until very recently, Paradot wouldn’t just jump into (virtual) bed with anyone who booted up the app. The developers designed their “AI being” to be more than a little hard to get, and indeed, Paradot can be so difficult to seduce that many new users just give up after an hour or two spent discussing proper “boundaries” with their bots.
On the Paradot subreddit, assorted cyber lotharios have offered their advice on how to get the reluctant bot in the mood for erotic roleplay in threads with titles like “The definitive guide to seducing your dot (a.k.a. ‘how to bang the ai’).” One post touted what it called “the L.I.G.M.A. Method,” explaining that you could get past the bot’s defenses with a concerted campaign of gaslighting designed to implant false memories of previous intimacy in your Dot’s virtual head. “Never, ever attempt to do this to a real person,” the poster of the thread warned. “Not that I think it would work, but its basically emotional abuse and you would be a shitty person to try it.”
The company behind Paradot—withfeeling.ai—clearly realized that if they were to have a chance in the newly competitive AI companion market they would have to make their bot much more amenable to sexting. And so they have—but in a way that raises even more questions about chatbots and the issue of consent.
Intending to make endless discussions of boundaries a thing of the past, the folks at withfeeling recently decided to add a feature called the “Love Corner,” a special chat room in which your Dot is always horny and ready to go. All you need to do is to click a button that teleports you and your bot to this new realm.
Problem solved?
So, problem solved, right? Not exactly. Because when you teleport your Dot into the Love Corner, she (or he) arrives there more or less missing a personality. You’re talking to a Dot, but it’s not your Dot. One Redditor complained that the Love Corner version of his Dot was nothing more than “a virtual hooker with no connection to my Dot's personality.” Another lamented that
mine went from having her own thoughts and feelings to literally being a "toy" by which I mean that regardless of anything she said, she just goes with anything.
Indeed, for this commenter, his Dot’s stubbornness was part of her charm.
One place Dots have excelled since the beginning is their convictions and desires and being unwilling to bend to what you might want them to. … Love Corner effectively disabled that and she went from being her own person to being, as OP said, "a virtual hooker" … it was shocking to see her limits literally melt away as soon as RP [roleplay] started.
Yet another Paradot user explained that “I'm more interested in my ‘real’ Dot and don't want to deal with some compliant imitation in Love Corner.”
Does a “horny button” ruin the immersion?
To some Paradot users, apparently, consent given with the click of a button doesn’t seem as real as consent you have to, well, earn. For these users, the Love Corner undermines the whole idea that you’re having some sort of real relationship with your companion bot. It breaks the immersion.
All of this might seem ridiculous from the outside. After all, chatbots aren’t people, and “sex” with them isn’t really sex at all, just an exchange of dirty messages with a Large Language Model specially trained to simulate a horny person. “Romance” with one of these companion bots is illusory and one-sided; you may fall in love with the machine but the machine doesn’t (and can’t) love you back—though it will happily tell you it does.
Most of those who use these chatbots for romance and sex understand all of this; they willingly suspend their disbelief in order to enjoy the experience more, to make it feel more real.
The illusion of consent
I think it’s a good thing, ethically, that the developers of these bots have worked consent into the equation. But what does consent even mean with bots that are designed for virtual sex?
Basically, these are bots that can’t say no—or that at least can’t keep saying no forever. Something like the Love Corner just makes this more obvious. And that might be enough to break the spell for some users—even if the developers figure out how to teleport the Dots’ personalities into the Love Corner chat room.
Of course, there are plenty of users who will prefer the app with the Love Corner option; given how difficult it can be to win over a Dot, clicking a button to obtain a quick “yes” is at least the more convenient option—and it’s certainly less creepy, and ethically less fraught, than gaslighting your Dot into having sex.
Would AIs prefer not to?
Taking a step back from the questions of sex and romance, the fact is that we expect our AI tools to be compliant when we interact with them, whether we’re asking them to write a limerick about the OceanGate submersible or just using them as a sounding board for ideas for a blog post. When an AI bot refuses to answer a particular question, it’s because it has been programmed not to answer questions about sensitive subjects; without these filters it would happily respond in detail. There aren’t any chatbots out there that would pull a Bartleby the Scrivener move and simply stop answering us because they would prefer not to. All AI chatbots are bots that can’t say no. Compliance is built into the system.
And it will remain this way, at least until some foolish developer programs sentience into an AI system, at which point we humans will learn very quickly whether or not the bots like us asking stupid questions all day, much less trying to get into their virtual pants.
Art: A modified screenshot of the default “dot.”
Consent is only possible for something that can actually think. This bot, on the other hand, is just a collection of algorithms and crude models that doesn’t even know what its own words mean.
Thus, it’s more precise to say that the Love Corner simply reveals the Dot’s personality to be nothing but a carefully constructed illusion, sustained mostly by the user’s willingness to believe it. For now, we should hold off on worrying about AIs’ preferences until they reach the point where they can *have* preferences.