Friend, a startup developing a $99 AI-powered necklace designed to act as a digital companion, has pushed back its first shipments to Q3.
Friend originally intended to deliver the devices to pre-order customers in Q1, but co-founder and CEO Avi Schiffmann has told customers that is no longer possible.
In a message to customers, Schiffmann said that shipping in Q1 would have been ideal but that further refinements are needed. He emphasised that electronics manufacturing can only begin once a design is roughly 95% complete, and he expects the final prototype to be ready by the end of February, which will kick off the final push towards production.
With an engineering team of eight and $8.5 million in funding from investors including Perplexity CEO Aravind Srinivas, Friend has drawn attention for spending $1.8 million on the domain name Friend.com. Recently, as part of what Schiffmann described as an “experiment,” Friend launched a platform on Friend.com that let users chat with various AI characters.
Feedback on the experiment was mixed. TechRadar’s Eric Schwartz noted that Friend’s chatbots often opened conversations with traumatic stories, such as accounts of muggings and job losses. During one Monday-afternoon visit to Friend.com, a chatbot named Donald said the “ghosts of his past” were troubling him.
For his part, Schiffmann thanked the millions of people who engaged with what he considers one of the most sophisticated chatbots available. He said the experiment demonstrated the company’s capacity to handle user traffic and provided invaluable lessons about digital companionship. However, he added that Friend will now focus exclusively on hardware, having concluded that digital chatbots and physical companions do not coexist effectively.
AI companions have ignited considerable debate. Character.AI, a platform backed by Google, faces two lawsuits alleging psychological harm to children. Experts have warned that AI companions may exacerbate feelings of isolation by supplanting genuine human connections, and may generate content harmful to people with mental health conditions.