Hands-on with the visual AI in Meta’s Ray-Ban smart glasses


For the last several weeks, I’ve been playing with Meta’s AI assistant in its Ray-Ban smart glasses. It responds to the voice command “Hey Meta” and can answer a question or examine what you’re looking at. It’s far from perfect. But when it does work, it feels like a glimpse into the future.

Meta didn’t expect generative AI to play such a large role in the glasses until very recently. When CEO Mark Zuckerberg first revealed, in an interview with me last fall, that multimodal AI was coming to the glasses, he described it as a “whole new angle” on smart glasses that may end up being the killer feature before “super high-quality holograms” arrive.

Given the billions Meta has poured into AR glasses over the last six years and the lackluster reception to the first generation of Meta Ray-Bans, version two needed to be a win. Early indications are good. I’ve seen third-party estimates that over 1 million pairs have been sold. During Meta’s last earnings call, Zuckerberg mentioned that many styles were sold out. Now, with multimodal AI enabled, Meta may have the best AI wearable on the market.


