
Date: 30 September 2024
Author: Stuart Brazier, Creative Director
Houston, We Have a Problem: First Impressions of Meta Ray-Ban Smart Glasses (Part 1)
This article explores my initial hands-on experience with the Meta Ray-Ban smart glasses following an exciting and motivating Meta Connect 2024. From promising demos to real-world challenges, I’ll share how the first few days of using them played out, and whether the recent release of Meta AI changes the game.
Meta Connect 2024 made me reconsider a product I had written off: Meta Ray-Ban smart glasses. Previously, I saw them as a gimmick, but the showcase of AI integration and a potential screenless future got me thinking: was this just a polished demonstration or the start of something bigger? After attending the event from inside my Meta Quest 3, I decided it was time to find out for myself.
The next day, I pitched the idea at work: “How can we advise clients on smart glasses without firsthand experience?” So, I ordered a pair. Delivery was fast, setup smooth—but then the real question: Are they any good?
First impressions? Oh dear. These glasses felt underwhelming. The demo promised so much, but the reality was different. They didn’t do everything I’d seen. Sure, they played music, but not through Audible. Yes, I could launch Calm and enjoy some much-needed relaxation, but where was Meta AI? I’d half-ignored the implications of the EU’s data privacy laws (GDPR), which blocked AI data collection and kept Meta AI from rolling out here, leaving me without the ‘wake word’ magic I’d hoped for. No ‘Hey Meta, look at this…’. Just a pair of glasses that weren’t delivering what I’d expected.
But then, a little notification this week gave me hope: Meta AI is now available on our Ray-Bans. Could this change everything?
With Meta AI now activated, I’m genuinely excited to see how this changes the game. The promise of having an AI assistant just a ‘Hey Meta’ away could transform these glasses from a gimmick to something truly impactful. I really think we’re at the point of saying, ‘Houston, we have liftoff.’