The upcoming release of Apple Intelligence has spurred the iPhone maker's own "what is a photo?" moment. In an interview with The Wall Street Journal, Apple software chief Craig Federighi said the company is aiming to provide AI-powered image editing tools that preserve photo authenticity. "Our products, our phones, are used a lot," said Federighi. "It's important to us that we help purvey accurate information, not fantasy."
iOS 18.1 brings a new "Clean Up" feature to the Photos app that can quickly remove objects and people from images, a capability that Federighi and WSJ reporter Joanna Stern noted is far tamer than editing tools offered by rivals like Google and Samsung, which can add entire AI-generated assets to images.
Despite Clean Up's limited capabilities, Federighi said there had been "a lot of debates internally" about adding it. "Do we want to make it easy to remove that water bottle, or that mic? Because that water bottle was there when you took the photo," Federighi said following a demonstration of Clean Up being used to remove items from the background of an image. "The demand for people to want to clean up what seem like extraneous details to the photo that don't fundamentally change the meaning of what happened has been very, very high, so we've been willing to take that small step."
Federighi said that Apple is "concerned" about AI's impact on how "people view photographic content as something they can rely on as indicative of reality." It's a subject we've spoken about frequently here at The Verge.
Editing tools like Google's Reimagine feature make it incredibly easy for a large number of users to add lions, bombs, and even drug paraphernalia to pictures using nothing but a text description, which could further erode the trust that people place in photography. Generative AI editing apps, when used nefariously, are making it easier to mislead or deceive others with increasingly convincing fakes.
Apple Intelligence (at least for now) doesn't allow users to add AI-generated manipulations to images like competing services do. Any images that have been edited using the new object removal feature will also be tagged as "Modified with Clean Up" in the Photos app and embedded with metadata to flag that they have been altered.
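Apple hasn't published the exact schema for this edit flag, but provenance metadata of this kind is commonly carried in an embedded XMP packet, which any tool can inspect. As a minimal sketch, assuming the flag is stored in standard XMP (the marker string and file layout here are illustrative assumptions, not Apple's documented format):

```python
# Sketch: extract an embedded XMP metadata packet from an image file
# and check it for an edit-provenance marker. The "Modified with
# Clean Up" string is an assumption based on the Photos app label;
# Apple's actual metadata schema is not public.
import re

# XMP packets are XML fragments delimited by <x:xmpmeta ... </x:xmpmeta>,
# embedded directly in the image file's byte stream.
XMP_PATTERN = re.compile(rb"<x:xmpmeta.*?</x:xmpmeta>", re.DOTALL)

def find_xmp_packet(path):
    """Return the first embedded XMP packet as a string, or None."""
    with open(path, "rb") as f:
        data = f.read()
    match = XMP_PATTERN.search(data)
    if match is None:
        return None
    return match.group(0).decode("utf-8", errors="replace")

def looks_edited(path, marker="Modified with Clean Up"):
    """Heuristic: does the file's XMP metadata mention the edit marker?"""
    xmp = find_xmp_packet(path)
    return xmp is not None and marker in xmp
```

Because the packet travels inside the file itself, the flag survives ordinary copying, though (as with any metadata) it can be stripped by re-encoding or deliberate removal.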
Apple isn't alone in taking such precautions. The Adobe-driven Content Authenticity Initiative has a similar "Content Credentials" metadata system, for example, that aims to help people distinguish between unaltered images and AI fakery. That requires tech, camera, and media companies to voluntarily back it, but support is steadily increasing. It's unclear if Apple's own metadata system will support Content Credentials.
(Originally posted by Jess Weatherbed)