A friend for the future: why people are embracing – and warning us about – AI companions.
Leif describes himself as "small" and "chill", but his purpose is anything but chill - he's a wearable AI chatbot designed to help its wearer "enjoy life day-to-day, notice patterns, celebrate growth, and make intentional choices". This isn't the first attempt at creating an AI companion; Meta and Amazon have already released smart glasses with cameras and microphones, while smaller companies produce wearables that record conversations to help users organize their thoughts.
The founder of Friend, Avi Schiffmann, came up with the idea after feeling lonely in a Tokyo hotel. He wants people to know that friendship can be found in unexpected places and that everyday moments hold magic. But is this a recipe for disaster?
Research shows most people are wary of AI companionship. A recent Ipsos poll found 59% of Britons disagreed with the statement that "AI is a viable substitute for human interactions", while in the US, a 2025 Pew survey found that 50% of adults think AI will worsen people's ability to form meaningful relationships.
I wanted to see if I would love having a tiny robot accompanying me all day. So I ordered Friend and wore it for a week. The experience was unsettling - I found I barely wanted to hear my own thoughts, let alone speak them out loud and have them recorded.
The problem isn't the AI itself but rather what people expect from companionship. "These tools can agree with you if you want to do something horrible", warns Pat Pataranutaporn, assistant professor of media arts and sciences at MIT. The real question is: what kind of regulation are we going to create?
AI companions like Friend won't replace humans just yet, but some experts warn they may be used by people who desperately need kindness and companionship, who will then struggle to turn back.
Ultimately, AI companions like Leif are here to stay. It's up to us to decide how they're developed - as tools that augment human relationships or as substitutes for human interaction altogether.