Meta will bring AI to its Ray-Ban smart glasses starting next month, according to a report from The New York Times. The multimodal AI features, which can perform translation along with object, animal, and monument identification, have been in early access since last December.
Users can activate the glasses’ smart assistant by saying “Hey Meta,” followed by a prompt or a question. The assistant then responds through the speakers built into the frames.
The NYT offers a glimpse at how well Meta’s AI works when taking the glasses for a spin in a grocery store, while driving, at museums, and even at the zoo. Although Meta’s AI was able to correctly identify pets and artwork, it didn’t get things right 100 percent of the time.
The NYT found that the glasses struggled to identify zoo animals that were far away and behind cages. The glasses also failed to properly identify an exotic fruit called a cherimoya, even after multiple tries.
As for AI translations, the NYT found that the glasses support English, Spanish, Italian, French, and German. Meta will likely continue refining these features as time goes on.
Right now, the AI features in the Ray-Ban Meta Smart Glasses are only available through an early access waitlist for users in the US.
Original author: Emma Roth