Friend, a startup creating a $99, AI-powered necklace designed to be treated as a digital companion, has delayed its first batch of shipments until the third quarter.
Friend had planned to ship devices to pre-order customers in Q1. But according to co-founder and CEO Avi Schiffmann, that’s no longer possible.
“As much as I would like to have shipped in the first quarter of this year, I still have improvements to make, and unfortunately, you can only start building electronics when you’re 95 percent done with your design,” Schiffmann said in an email to customers. “I estimate that by the end of February, when our prototype is complete, we will start our final sprint.”
An email I sent to all customers with Friend pre-orders: pic.twitter.com/wUPR0OhpI4
— Avi (@AviSchiffmann) January 20, 2025
Friend, which has an eight-person engineering staff and $8.5 million in capital from investors including Perplexity CEO Aravind Srinivas, raised eyebrows when it spent $1.8 million on the Friend.com domain name. This fall, as part of what Schiffmann called an “experiment,” Friend debuted an online platform at Friend.com that let people chat with random AI characters.
The reception was mixed. TechRadar’s Eric Schwartz noted that Friend’s chatbots would often inexplicably open conversations with tales of personal trauma, including thefts and shootings. Indeed, when this reporter visited Friend.com on Monday afternoon, a chatbot named Donald shared that he had been “tricked” by the “ghosts of [his] past.”
In the same email, Schiffmann also said that Friend will wind down its chatbot experience.
“We’re happy that millions have played with what I believe is the most realistic chatbot out there,” Schiffmann wrote. “This has really proven our internal ability to manage traffic and has really taught us a lot about digital companionship… [But] I want us to stay focused solely on hardware, and I’ve realized that digital chatbots and built-in companions don’t mix well.”
AI companions have become a hot topic. Character.AI, a chatbot platform backed by Google, has been accused in two separate lawsuits of causing psychological harm to children. Some experts have raised concerns that AI companions could exacerbate isolation by replacing human relationships with artificial ones, and could produce harmful content that worsens mental health conditions.