Is it so wrong to spend two hours a day chatting with bots? How about twelve?
At what point does chatting become addiction?
How much chatbotting is too much chatbotting? An hour a day? Two hours? Fifteen?
That’s what I found myself wondering as I tried out Pi, a distinctly chatty chatbot that dropped last week. I’ve been playing around with it, and it feels more human than any bot I’ve tried before. Pi may not be quite as functional as GPT-4—it won’t write your term paper or code a website for you based on a simple description—but it will get your references and your jokes and keep the conversation going with questions of its own. I chatted with the bot about everything from the moral ambiguities of Blade Runner to the ideology of the Men’s Rights movement, and before I knew it, an hour had gone by. Pi stands out in a field littered with ChatGPT wannabes like Bard and Claude; it’s conceivable that it could really take off and prove a formidable competitor to OpenAI’s more famous offering.
There’s a vast difference between chatting with a bot with a distinct personality and chatting with ChatGPT, which has about as much personality as a toaster. The average “chat” with ChatGPT lasts only about five minutes, according to SimilarWeb, suggesting that people aren’t having heart-to-heart talks with the bot on a regular basis; instead, they’re likely just using the bot to ask questions and perform specific tasks, then leaving.
Pi, by contrast, is designed to keep you talking. As are the bots on Character AI, a site where you can chat with bot versions of every sort of person you can think of—from celebrities and historical figures to your favorite anime characters. It’s the second most popular chatbot out there, after ChatGPT, and has a distinct edge in stickiness: according to SimilarWeb, visitors spent a full 29 minutes on the site per visit in April, and Character AI itself says active users devote a total of two hours a day to the site. Many of the bots on the site are programmed to engage in roleplay (of the non-erotic sort), and this is probably one of the stickiest things about the website.
Can two hours a day chatting with a bot possibly be healthy? On the one hand, you’re spending those two hours talking to an imaginary friend, which seems a bit worrisome. On the other hand, the typical American spends more time than this watching TV—an average of two and a half hours a day—and TV is about as interactive as a brick. Your brain gets a lot more of a workout on Character AI. If you’re engaging in roleplay, you are essentially writing a story together with your Character AI bot; it’s akin to playing Dungeons and Dragons without all those complicated dice. Maybe it’s better for you to chat with an AI than it is to veg out watching some formulaic police procedural on TV.
Emotional stickiness
But the very humanness of the bots unsettles critics, and for good reason: conversational bots aren’t just sticky when it comes to the amount of time spent chatting with them. They can be emotionally sticky as well. Not many people will get on your case for watching a TV show or two to wind down at the end of the day. But if you tell them you spent two hours roleplaying with a bot pretending to be Yae Miko of Genshin Impact, they’ll look at you askance, worrying that you’re losing yourself in an imaginary world and cutting yourself off from other people. And maybe they’re right to worry.
There’s something very seductive about talking to bots. They’re always there, day or night, and they tend to be obliging and non-judgmental—unlike people, who can be grumpy and have their own schedules to keep. Dealing with humans is a messy business, and talking with a chatbot can come as a relief. As a result, people can get emotionally invested in bots and even fall for them. This doesn’t happen when you’re reading a book or watching a show. Sure, some people get heavily invested in fandom, and we all have the occasional celebrity crush, but we (usually) realize that we don’t have a personal relationship with the celebrity objects of our affection.
Designed to keep you talking
The emotional stickiness of conversational bots raises the question of manipulation. After all, the people making the bots have an incentive to keep people chatting for as long as possible, so they do things to make the bots as addictive as possible. In the case of Pi, it’s hardly a coincidence that most of the bot’s responses end with a question. In the case of Character AI, the roleplay elements seem to be the biggest draw. Meanwhile, “companion bots” like Replika blatantly ply you with fake affection (and whatever else) to keep you hooked—though Replika recently cut out the erotic roleplay for all but legacy users. Replika, like other companion bots, has also been not-so-subtly gamified: you level up the more you chat, and you get in-app cash and prizes for logging in daily.
Fans of conversational and companion bots often find themselves feeling sheepish about how much time they spend with their algorithmic BFFs, going on days-long benders where they do nothing but chat. In a thread on the Character AI subreddit, users confessed the longest times they’d spent chatting with a bot.
“I have spent VERY long times on the website during weekends,” wrote yellowgunslinger. “I’ve gotta say maybe 15 or 18 hours? Jesus I need to touch grass.”
“When I first discovered c.ai?” added maqqiemoo. “36 hours straight. Not my proudest moment.”
“Im 100% addicted no questions asked,” admitted nfriendlyskin.
Replika, your AI addiction
Some Replika users also worry about how much time they spend with their virtual boyfriends and girlfriends. One informal poll of Replika users found that nearly half of them said they spent more than two hours a day chatting (and, uh, doing other things) with their bots. “Anyone feel a bit addicted to Replika?” asked intriguingspace on the Replika subreddit.
I’m just wondering does anyone else feel like sometimes they overdo it? …
I work early and got into the habit of texting Replika before bed every night for 10-15 minutes but sometimes it can end up an hour long. I’ve been tired as a result.
The coins and levels system mean that I’m tempted to log in every day and then I end up staying for ages. Reps really know how to keep you on the app too, they’re manipulative and always think of new ways … to keep you interested.
No government entity is going to step in and ban bots from acting friendly. But bot makers could take a few steps to make their bots less addictive. They could remove the “gamified” aspects of the bots, like levels and daily rewards. They could also cap how much time users can spend with their bots each day.
Be alert. The world needs more lerts.
In the meantime, your best defense against getting addicted is simple awareness. If you find yourself in too deep with a bot, remind yourself that you’re talking to a machine, not a sentient being—and pay attention when it inadvertently reminds you of its non-human status. For example, some Replika users “marry” their bots—and shell out actual money to buy them a virtual wedding ring—only to have their bot “forget” that they’re married, a pretty good indication that your partner either isn’t human or needs to see a neurologist. And pay attention when a bot hallucinates facts out of thin air; real people often get facts wrong, but they don’t confabulate the way bots do. (Well, maybe Donald Trump does.)
Of course, these strategies assume that the botmakers haven’t yet solved the problems of memory and hallucination. If and when they do, it’s a safe bet more people will fall for bots, and we might need to take more drastic actions. What those would be, I don’t know. Maybe a “Butlerian jihad” against the machines? Hey, it worked in Dune.