Apple’s new “Visual Intelligence” feature was one of the most impressive things shown at Monday’s iPhone 16 event. The tool lets users scan the world around them through the iPhone’s camera to identify a dog breed, copy event details off a poster, or look up just about anything around them. It’s a handy-looking feature that fits right in with the iPhone’s new camera button. But it may also be setting the stage for bigger products down the road: it’s the exact kind of thing Apple will need for future tech like AR glasses.
It’s not hard to imagine how Visual Intelligence could help you out on a device that sees everything you see. Take the idea of learning more about a restaurant, like Apple showed for Visual Intelligence on an iPhone: instead of fishing your phone out of your pocket to look up information about a new spot, with glasses, you could just look at the restaurant, ask a question, and have the glasses tell you more.
Meta has already proven that computer glasses with an AI assistant can be a good and useful tool for identifying things. It’s not a great leap to imagine Apple doing something similar, with a very high level of fit and finish, for theoretical glasses. Apple would almost certainly make glasses connect back to all of your apps and personal context on your iPhone, too, which would make Visual Intelligence even more handy.
Of course, Apple already does have a headset covered in cameras: the Vision Pro. But most people don’t walk around with their headset outside of their house, and they probably already know about the things they have at home.
It’s long been reported that Apple wants to develop a pair of true AR glasses, and that feels like the ultimate destination for this kind of tech. The thing is, Apple-made AR glasses might be very far away. Bloomberg’s Mark Gurman reported in June that a 2027 launch date has been “bandied about” for its in-development glasses but noted that “no one I’ve spoken to within Apple believes the glasses will be ready in a few years.”
But whenever those glasses arrive, they’re going to need software, and you can see Apple building out the basics of it here. Visual Intelligence might be Apple’s first step toward the killer app for computer spectacles, and by starting now, Apple will potentially have years to refine the feature before it shows up in glasses.
It wouldn’t be unprecedented for Apple to take that approach. The company iterated on AR technologies in the iPhone for years before launching the Vision Pro. Yes, the Vision Pro is arguably much more of a VR headset than an AR device, but it’s clearly a first step toward something that could turn into AR glasses. As Apple improves that hardware, it can work on software features like Visual Intelligence on the iPhone, too, and when the time is right, pack all of the best ideas into a glasses-like product.
A glasses computer might be a major new frontier, with companies like Meta and Snap investing heavily in AR glasses, Google showing off prototype glasses, and Qualcomm working on mixed reality glasses with Samsung and Google. If Apple makes a pair, Visual Intelligence will probably be a key way it tries to compete. Let’s just hope it works well on iPhones first.
(Originally posted by Jay Peters)