Meta's smart glasses raise questions about the future of digital advertising

Meta's smart glasses raise privacy concerns and could revolutionize digital advertising by capturing real-world visual data.

[Image: Ray-Ban Meta glasses]

Meta, formerly known as Facebook, this month announced new AI features and partner integrations for its Ray-Ban Meta smart glasses, signaling the company's continued push into wearable technology and augmented reality. While touted as a convenient way for users to capture moments and interact hands-free, the glasses also represent a significant step toward giving Meta greater access to users' real-world visual data and behaviors.

The Ray-Ban Meta glasses, first launched in October 2023, allow wearers to take photos and videos, make calls, listen to music, and use voice commands - all through a device worn on their face. The latest updates, announced on September 25, 2024, add AI capabilities such as real-time language translation, memory assistance, and the ability to ask questions about what the user is looking at.

According to Meta's announcement, users will soon be able to "ask Meta AI to tag along" while exploring a new city, getting information about landmarks and suggestions for what to see next. The glasses can also now help users remember where they parked or set reminders based on visual cues.

While these features may prove useful for consumers, they also give Meta unprecedented access to users' real-world experiences and visual data. Unlike a smartphone that is often kept in a pocket or bag, smart glasses are worn continuously and can potentially capture everything the user sees.

This treasure trove of visual data and real-world behavior patterns could prove invaluable for Meta as it seeks to build more sophisticated advertising and AI models. By understanding exactly what catches users' attention in the physical world, Meta may be able to create more targeted and effective digital advertising.

The implications for digital out-of-home (DOOH) advertising could be particularly significant. DOOH ads, which appear on digital billboards and screens in public spaces, have traditionally been difficult to personalize, and their engagement hard to measure. If a critical mass of consumers begins wearing AI-enabled smart glasses, however, it may become possible to track exactly which outdoor ads people look at and for how long.

This data could then be used to serve personalized DOOH ads in real time as a user approaches a digital billboard. For example, if the AI notices a user often looks at ads for sports cars, it could trigger car ads to appear on nearby screens. The glasses' AI could even highlight or annotate real-world ads that match a user's interests.
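To make that mechanism concrete, the sketch below shows one way such a pipeline could work in principle: gaze events captured by the glasses are aggregated into an interest profile, which is then used to pick a creative for a nearby screen. This is a purely hypothetical illustration; the class names, thresholds, and data structures are assumptions for the example and do not reflect any API Meta has described.

```python
# Hypothetical sketch: gaze data from smart glasses driving real-time
# DOOH ad selection. All names, fields, and thresholds are illustrative
# assumptions, not part of any Meta product or API.
from collections import Counter
from dataclasses import dataclass


@dataclass
class GazeEvent:
    """One observation: the wearer looked at an ad of a given category."""
    ad_category: str
    dwell_seconds: float


def build_interest_profile(events: list[GazeEvent], min_dwell: float = 1.5) -> Counter:
    """Accumulate dwell time per ad category, ignoring brief glances."""
    profile: Counter = Counter()
    for event in events:
        if event.dwell_seconds >= min_dwell:
            profile[event.ad_category] += event.dwell_seconds
    return profile


def select_creative(profile: Counter, available_creatives: dict[str, str]) -> str | None:
    """Pick the creative for the category with the most accumulated attention."""
    for category, _ in profile.most_common():
        if category in available_creatives:
            return available_creatives[category]
    return None


if __name__ == "__main__":
    events = [
        GazeEvent("sports_cars", 4.2),
        GazeEvent("sports_cars", 3.1),
        GazeEvent("sneakers", 0.8),   # too brief to count as interest
        GazeEvent("travel", 2.0),
    ]
    creatives = {"sports_cars": "roadster_spot_v2", "travel": "island_getaway"}
    profile = build_interest_profile(events)
    # A nearby billboard would display this creative as the wearer approaches.
    print(select_creative(profile, creatives))  # -> roadster_spot_v2
```

Any real deployment of such a system would also have to address consent, opt-outs, and data retention - precisely the concerns privacy advocates describe below.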

While Meta has emphasized privacy controls and a visible LED to indicate when the glasses are recording, some privacy advocates have raised concerns about the implications of normalizing always-on, AI-enabled cameras in public spaces. There are also questions about how Meta may use or share the vast amounts of visual data collected through the glasses.

As smart glasses and augmented reality technology continue to advance, they have the potential to radically reshape how we interact with the world around us - and how companies interact with us. While offering new conveniences and capabilities, this technology also raises important questions about privacy, consent, and the future of advertising in both digital and physical spaces.

Meta's push into smart glasses represents just the beginning of this new frontier. As the technology becomes more sophisticated and widespread, society will need to grapple with its implications and establish new norms and regulations around its use. For now, the full impact remains to be seen, but it's clear that devices like the Ray-Ban Meta glasses have the potential to fundamentally change our relationship with technology, advertising, and the world around us.

Key Facts

  • Meta announced new AI features for Ray-Ban Meta smart glasses on September 25, 2024
  • New capabilities include real-time translation, visual question answering, and memory assistance
  • The glasses give Meta access to users' real-world visual data and attention patterns
  • This data could potentially be used to create more targeted advertising, including personalized digital out-of-home (DOOH) ads
  • Privacy advocates have raised concerns about the implications of normalizing always-on cameras in public spaces