AI Glasses: The Year 2024

Apple’s pricey Vision Pro augmented reality platform is expected to arrive in the first quarter of 2024. But by the end of the year, I predict the platform of the year will be — drum roll, please — AI glasses!

Wait, what?

That’s right. Glasses that let you interact with artificial intelligence (AI) from the comfort of your own face will be the sleeper hit of the year. In fact, the buzz around market leader Meta has already begun.

Announced in September and shipped in October, the Ray-Ban Meta glasses were initially received with a collective shrug. They were assumed to be camera glasses, like Snap Spectacles, or virtual assistant glasses, like Amazon’s Echo Frames. Or, for that matter, a small upgrade from their Meta predecessor, Ray-Ban Stories. But Spectacles, Echo Frames and Ray-Ban Stories failed to thrill the gadget-loving public.

It took a while for everyone to learn that Ray-Ban Meta glasses, which start at $299, are orders of magnitude better and more powerful than any of these lackluster gadgets; they offer a much better camera, super high-quality audio, the ability to live-stream to social, and an incredibly good AI assistant.

(Here’s a look at the camera quality via my own photos and a video.)

Despite the lackluster launch, Ray-Ban Meta glasses started blowing up online in December when three things happened.

First, Meta announced a kind of closed beta of its “multimodal” feature. While users could already conjure up the Meta Assistant at any time using the “Hey, Meta” command, the multimodal feature adds a “look” command that sends a picture taken through the glasses’ camera to the Meta Assistant for processing and analysis. You can tell your glasses to look at a table full of ingredients and condiments and give you a recipe for using those items, for example — all hands-free. The combination of spoken and visual interaction with Meta’s powerful AI is mind-blowing and conspicuously world-changing.

Second, tech journalists started forming a consensus that Ray-Ban Meta glasses are actually transformative. Though I was praising them way back in October, a critical mass of my colleagues really started getting excited about them only last month.

Mashable’s Kimberly Gedeon said: “The Ray-Ban Meta Smart Glasses shocked me, in a good way.”

9to5Mac’s Filipe Espósito wrote, “Ray-Ban Meta glasses convinced me to believe in smart glasses.”

CNET’s Scott Stein said (about the “multimodal” feature) that “the demo wowed me because I had never seen anything like it.”

Third, Ray-Ban Meta videos started taking off on TikTok and other social networks (even though the glasses’ live-streaming feature supported only Facebook and Instagram).

Yes, it’s already a market

Several AI glasses products are already on the market. For example, Lucyd Lyte Smart Eyewear Powered with ChatGPT was announced in August. The glasses are inexpensive, but of fatally low quality — especially the sound quality — according to…
