AI Friendship Necklace Sparks Controversy: Does It Care or Not?
New York Residents Push Back Against AI Advertising Blitz
An AI startup has launched an aggressive advertising campaign across New York City’s subway system, spending over $1 million on posters promoting its device, called Friend—a necklace-like gadget that listens throughout the day and sends notifications. Many New Yorkers have pushed back, defacing the ads with warnings about the dangers of AI.
The campaign blanketed the West 4th Street subway station and spread across thousands of subway cars and platforms, making it reportedly the largest subway advertising campaign to date. The device itself resembles an AirTag worn as a necklace: kept close throughout the day, it answers questions, monitors its surroundings, and offers opinions on the wearer’s conversations via notifications powered by Claude.
Despite its promotional slogan—“Friend: someone who listens, responds, and supports you”—the public reaction has been hostile. Graffiti messages such as “BE A LUDDITE” and warnings like “AI will promote suicide” have appeared over the posters, reflecting resistance to intrusive AI technology.
Earlier tech experiments, such as Google Glass in 2013 and Zuckerberg’s Metaverse push, serve as cautionary tales: both were scaled back or abandoned after public rejection. Meanwhile, other tech companies are developing similar wearable AI devices, such as Meta’s AI glasses built with Oakley and Ray-Ban, which weave AI into daily life by answering questions about, and recording, whatever the wearer sees.
As society grows more aware of screen addiction—especially among teens—public skepticism is mounting. Recent surveys indicate that Americans are twice as likely to believe AI will harm society as to believe it will help it, and there are growing calls to reconnect with real-world relationships rather than virtual ones. Critics argue these devices threaten privacy and social bonds, revealing a widespread sentiment: it’s time to oppose invasive AI gadgets and prioritize genuine human connection.