AIs that flirt with you. Help you find a date. Become your girlfriend. Or those that become a companion and repository for your hopes and dreams. Into that last category enters “Dot,” a new AI chatbot that thrives on getting to know your innermost thoughts and feelings, to act as a “friend, companion and confidant,” the company’s App Store description explains.
The idea sounds intriguing: An AI that becomes personalized to you and your interests, allowing it to serve up advice and input that’s not just generally applicable, but that reflects what it’s learned about you through its intensive Q&A sessions. Or, if you’re struggling with something, whether the fallout from a career change (as Dot’s co-founder Jason Yuan experienced), a breakup, or a roadblock to your success, Dot can lend a sympathetic ear and offer support.
But Dot is not a person. It’s not a therapist or a best friend. It’s an AI tool that mimics both human speech and sympathy, but doesn’t serve as a replacement for the real thing.
That’s by design, the co-founders explain.
“Dot is not a replacement for human relationships, not a replacement for friendships, partnerships. I think it’s a different type of thing. It’s facilitating a relationship with my inner self,” Yuan, formerly a designer for Apple, told TechCrunch. “It’s like a living mirror of myself, so to speak.”
It’s easy to get drawn into this experience — more so, perhaps, if your day-to-day lacks meaningful human interaction. Though Dot’s creators say the chatbot will ultimately urge you to talk to a mental health professional if you delve into “heavier” topics, one could imagine people spending increased time pouring out their emotions to Dot as they get used to the experience.
In this way, the team thinks Dot can actually help prime users for the experience of human connection by getting them comfortable with opening up.
“I talk to my friends about a bunch of stuff, but I’ve never — like, the entirety of last year, if I was struggling at work, none of my friends knew about it,” Yuan said. “And through just talking to Dot, it helped me build the muscle to be able to do it with other people. Its main purpose is to help you feel like your existence is …” Yuan continued, but paused again to find the right words. “It’s to give you a safe space to exist and say, like, ‘I accept you, and maybe because I accept you, other people will, too.’”
It says something about the state of the human condition in our lonely, modern world that this is now an area technology is looking to solve.
To start, Dot’s onboarding process asks a good handful of “getting to know you” type questions, which can be fun to answer: “What do you do for work?” “Favorite TV show?” “How do you spend a typical Sunday?” and more.
Using those answers as a starting point, the AI then takes a big leap into getting to know you on a deeper level.
An expressed interest in sci-fi TV shows, for instance, immediately leads to a question about whether you’re “drawn to stories that explore the big questions in life, like what it means to be human.” A desire to one day run a small business leads to Dot asking what appeals to you about being a small-business owner and what sort of challenges you expect to face. “Have you thought about ways you might address those challenges?” Dot wants to know.
When you nudge Dot to drop this train of thought — it’s just an aspirational dream, after all — the AI immediately shifts to asking you about “your biggest priority or focus in your life and career right now.”
Have you ever been on a first date that felt more like an interview?
Even asking Dot to have a more casual conversation leads to an almost overenthusiastic interest in you.
Instead of asking if you’d like some recommendations for an upcoming vacation that you tell the AI about, Dot wants to know what you’re most interested in seeing and why you were inspired to travel there, specifically. (Dot compliments you on your choice of destination, too.)
In other words, Dot’s primary goal is to get to know you before becoming a useful tool that helps you get tasks done. The thinking is that it can only excel at the latter by first learning who you are and what you like.
“It’s not an either-or, but the thinking [is] that to actually help you in that path, it has to understand your motivations and a bit about what you want out of it,” co-founder Sam Whitmore said, referring to the example of vacation-planning assistance. “It needs to understand that you’re someone who wants maybe a more cultural experience or a more athletic experience and needs to know that stuff about you to actually be able to do the things that a typical assistant would even do. This has been one of our theses from the beginning.”
Though there’s clearly been work done to make Dot sound empathetic and engaging, compared with typical AI tools, there’s also something that feels odd about having meaningful conversations with a bot.
Dot, after all, is not really an AI friend. It’s an AI you. Or rather, an AI that forces you to look at yourself, albeit through an interface that feels vaguely “Single White Female” at times, rather than “Dear Diary.” However, if you never excelled at writing diary or journal entries, Dot could be a way to externalize your thoughts and feelings in order to gain better insights into yourself.
“It’s meant to be a tool used for self-introspection, accountability, personal growth — but not a relationship that replaces human relationships in your life,” Whitmore said.
Still, the line between those “real” relationships and the synthetic one with Dot seems to blur at times.
Tell Dot something sad, and the AI sympathizes: “I understand. Grief has its own timeline, and some days the weight of loss feels heavier than others,” it writes.
“Do you want to talk more about what’s on your mind? I’m here to listen,” the bot will say, awaiting more input.
Under the hood, Dot leverages around 10 different LLMs and AI models to achieve its mimicry of human companionship, including those by OpenAI, Anthropic, Google and others, as well as open-source models.
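New Computer hasn’t published how those models are wired together, but products like this commonly put a lightweight router in front of several providers, sending each message to whichever model suits the request. The sketch below is purely illustrative, with stub functions standing in for real provider calls; the route names and keyword heuristic are assumptions, not Dot’s actual design.

```python
# Illustrative only: a minimal multi-model router of the kind a companion app
# *might* use. The routes and heuristic here are assumptions, not Dot's code.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ModelRoute:
    name: str
    handler: Callable[[str], str]  # takes the user's message, returns a reply


def companion_reply(message: str) -> str:
    # Stand-in for a call to a conversational model (e.g. an OpenAI or
    # Anthropic chat endpoint) prompted for warm, reflective responses.
    return f"[companion model] That sounds like a lot. Tell me more about: {message}"


def retrieval_reply(message: str) -> str:
    # Stand-in for a retrieval- or search-backed model used when the user
    # wants concrete recommendations (wines, travel plans, etc.).
    return f"[retrieval model] Here are a few sources on: {message}"


ROUTES = {
    "companion": ModelRoute("companion", companion_reply),
    "retrieval": ModelRoute("retrieval", retrieval_reply),
}


def route_message(message: str) -> str:
    """Crude keyword heuristic; a production system would more likely use a
    small classifier model to pick the route."""
    wants_facts = any(w in message.lower() for w in ("recommend", "best", "plan", "where"))
    return ROUTES["retrieval" if wants_facts else "companion"].handler(message)


if __name__ == "__main__":
    print(route_message("I'm feeling a bit down today."))
    print(route_message("Can you recommend the best wines for relaxing?"))
```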
It sometimes cites its sources (websites about the “best wines for relaxing,” for example, when you suggest you might like to drink wine today), but it will also caution you to limit yourself to “maybe one glass” if you’re feeling down. Often, however, Dot just chats.
You can also zoom out on your daily conversations to see “chronicles” of your journey conversing with Dot, a subscriber-only feature at $11.99 per month. Subscribers can also engage in unlimited conversations rather than being capped at a certain number of messages per week. Even in the unlimited tier, Dot will never just stop working, but it will, at some point, try to wind down the conversation by nudging users to change the subject or even go do something else.
“When Dot expresses that it’s wrapping things up, [beta testers have been] like ‘OK, cool,’” instead of feeling abandoned, Whitmore noted.
Though Dot’s personal conversations would present a treasure trove for marketers, New Computer’s privacy policy claims the data itself is not monetized, sold or used to train AIs. Rather, the company intends to monetize through subscriptions. In addition, New Computer says the data is encrypted both at rest and in transit, and users can request its deletion at any time from the app.
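The policy doesn’t spell out what “encrypted at rest” looks like in practice. As a rough idea only, and not a description of New Computer’s implementation, here is a minimal Python sketch of the at-rest half using the widely used `cryptography` package’s Fernet symmetric encryption; in-transit protection would typically come from TLS on the connection itself rather than from application code.

```python
# Rough illustration of at-rest encryption, not New Computer's implementation.
from cryptography.fernet import Fernet

# In a real service the key would live in a secrets manager or HSM,
# never in source code.
key = Fernet.generate_key()
fernet = Fernet(key)

conversation = "Do you want to talk more about what's on your mind?"

# Encrypt before persisting the chat to storage...
ciphertext = fernet.encrypt(conversation.encode("utf-8"))

# ...and decrypt only when the user reopens it. Deleting the stored record
# (or destroying the key) is one way to honor a deletion request.
assert fernet.decrypt(ciphertext).decode("utf-8") == conversation
```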
The iOS app, which launched on Wednesday following eight months of closed beta trials, has onboarded thousands of users.
The startup behind Dot, New Computer, was founded by Yuan and Whitmore, an engineer who was previously head of engineering at Boston fintech Kensho. It’s backed by $3.7 million in pre-seed funding from the OpenAI Fund, Lachy Groom, South Park Commons and other angel investors. In addition to the founders, New Computer has three other full-time employees, all based in San Francisco.