The iPhone 16 Pro is one of the most unfinished products Apple has ever shipped. Almost all of its highlight features will arrive in future software updates that will stretch well into next year. That’s big stuff, like the new Apple Intelligence AI features the company says will start slowly arriving in October, and little stuff, like the complete functionality of the new Camera Control button on the side. Even really minor things, like that new Siri animation that inspired the tagline “It’s Glowtime” for the phone’s launch event? Not here yet. You get the same old Siri bubble as ever until Apple Intelligence arrives.
Apple iPhone 16 Pro

The Good:
- New tone control in camera lets you dial back HDR processing
- Who doesn’t love a physical shutter button?
- Your video director friends will spend hours gleefully taking 4K120 video portraits of people at street festivals

The Bad:
- Camera Control button is a little fiddly
- Default photo processing is more aggressive than ever
- The most incremental of incremental upgrades over the iPhone 15 Pro

$999 at Apple / $1,000 at Best Buy
Apple iPhone 16 Pro Max

The Good:
- New tone control in camera lets you dial back HDR processing
- Who doesn’t love a physical shutter button?
- Your video director friends will spend hours gleefully taking 4K120 video portraits of people at street festivals

The Bad:
- The Pro Max is very close to being too big
- Camera Control button is a little fiddly
- Default photo processing is more aggressive than ever
- The most incremental of incremental upgrades over the iPhone 15 Pro

$1,199 at Apple / $1,200 at Best Buy
The hard rule of reviews at The Verge is that we always review what’s in the box — the thing you can buy right now. We never review products based on potential or the promise of software updates to come, even if a company is putting up billboards advertising those features, and even if people are playing with those features in developer betas right now.

When Apple Intelligence ships to the public, we’ll review it, and we’ll see if it makes the iPhone 16 Pro a different kind of phone. Until then, the iPhone 16 Pro we’re reviewing today is an incremental update — it’s mostly a set of very nice but ultimately minor changes to the iPhone 15 Pro.

It’s hard to make the case for an upgrade right now: there is almost no reason to upgrade to the 16 Pro or 16 Pro Max from the 15 Pro or 15 Pro Max — especially since the 15 Pros are the only older iPhones that will get Apple Intelligence when it arrives. And if you have an older Pro phone, it’s worth waiting to see if Apple Intelligence is any good before you upgrade; there’s no reason to throw money at hardware just to support unproven software.

All that said, the iPhone 16 Pro does contain one extremely notable camera update, and it’s a good one — although it’s probably not what you think. So let’s start there.
The Camera Control button sits where the mmWave 5G antenna used to be — it’s now been integrated into the other antennas.
There are two big changes to the iPhone 16 and 16 Pro cameras: the new Camera Control button, and a new set of controls for how images are processed.

The button itself is a hybrid: you can press it down all the way to take a photo, or give it a light press to trigger a haptic click and bring up a setting like zoom or exposure, which you can adjust with a swipe. A double light press lets you switch between those settings. (You can adjust the pressure sensitivity of the haptic press in the accessibility settings, which is nice, although I found the default to be just fine.)
By default, a single click opens the camera when the phone is unlocked, and another takes a photo. It’s pretty fun to flip the phone on its side and shoot with the button like a normal camera, although the physical button is a bit stiff — a few Verge staffers found themselves moving the phone slightly when pushing all the way down to take a photo, although I thought it was fine.

I found myself accidentally opening the camera a lot at first since I’m left-handed and the button is placed where my fingers tend to rest when I hold the phone. You can set it to require a double-click, and that solved the problem for me.
You can also set the button to open third-party camera apps; it works well with the new version of Halide that’s been updated to support that functionality.
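For the curious, here’s a minimal sketch of how a third-party camera app can respond to hardware capture presses, using AVKit’s AVCaptureEventInteraction (the capturePhoto() helper is hypothetical, standing in for your app’s own AVCapturePhotoOutput call):

```swift
import UIKit
import AVKit

final class CameraViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Fires for hardware capture presses (Camera Control and the
        // volume buttons) while this camera UI is frontmost and active.
        let interaction = AVCaptureEventInteraction { [weak self] event in
            switch event.phase {
            case .began:
                break // press started; a good moment to prep focus/exposure
            case .ended:
                self?.capturePhoto() // hypothetical helper in your app
            case .cancelled:
                break // the system reclaimed the press (e.g. an incoming call)
            @unknown default:
                break
            }
        }
        view.addInteraction(interaction)
    }

    private func capturePhoto() {
        // Trigger your AVCapturePhotoOutput capture here.
    }
}
```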
We’ll swipe down the surface of things.
The reason Apple calls it “Camera Control” and not just “shutter button” is the capacitive controls on the top, which should ideally let you adjust various settings with a quick swipe.
I was really hoping I’d find myself using the capacitive controls to adjust things like exposure and focal length, but switching between everything with the light presses is all a bit fiddly, and it’s far too easy to end up changing things you weren’t intending to.
The whole thing would be greatly improved if a second light press dismissed the control; once a control is open, it tends to stay open, leading to inadvertent changes when your finger slides along the button.
You can just tap on the screen to dismiss the control, which I found useful. You can also swipe on the onscreen settings to adjust them, which allows for more precise control than swiping along the button itself.
In a real theme for the iPhone this year, the Camera Control is shipping in an unfinished state. Apple says a software update later this year will allow the button to emulate a traditional two-stage shutter button, where a half-press focuses and a full press takes the shot. (I asked, but the company isn’t giving a firm date for this.)

It’s hard to know how big a deal this will be until it arrives; I’ve had a lot of complaints about iPhone cameras over the years, but setting focus has never been one of them.
It’s still easier to use the on-screen shutter button when shooting vertically.
Apple is very proud of the faster camera sensor in the iPhone 16 Pro, which it claims offers zero shutter lag, and you can indeed click away pretty fast on the camera button while shooting in HEIF or JPG mode. You can definitely outrun it if you’re shooting in RAW, though — I clocked it at around 4 frames per second, which is pretty great for a phone but nowhere close to what a modern mirrorless camera with an electronic shutter can do.
Overall, the button is very nice to have, but that’s about it right now — as it exists today, it’s not a huge improvement over shooting photos with any other iPhone. The actual photos, on the other hand? Well, it’s complicated.
Having camera controls right under your finger is nice, but it can be easy to accidentally change things while you’re shooting.
It’s safe to say that a lot of people did not love the cameras on the iPhone 15 and 15 Pro. Apple has gotten increasingly aggressive with its approach to computational photography over the past few years, and various forums and social platforms have been filled with complaints about that for a while now.

The New Yorker published a piece about iPhone photos looking unrealistic two years ago — the sense that these cameras are starting to look a little weird has been building. The iPhone 15 and 15 Pro hit a kind of tipping point — they produced photos so aggressively processed that all kinds of people started noticing and complaining about it.
I have been reviewing phones and cameras for a long time, but I will never publish a review as efficiently devastating as Alix Earle asking her 7 million followers why her iPhone 15 camera sucks. If people who’ve built multimillion-dollar content businesses with their phone cameras aren’t loving the cameras on their new phones, something’s gone wrong.
If I had to offer a radically simplified diagnosis of what’s going on with all these complaints, it’s just that the iPhone won’t simply leave shadows and highlights alone. You’re not just taking a photo when you press that shutter button — Apple’s fancy Photonic Engine HDR photography pipeline captures up to nine frames with each press, intelligently exposes things like the sky and faces in different ways, applies a great deal of sharpening and noise reduction, and drops a final processed image in your camera roll.
The whole process allows iPhones to preserve a great deal of detail across an image, but one side effect is that it inevitably brightens the dark parts of an image and brings down the bright parts so you can actually see that detail. The result is that images seem flat, because they lack contrast between light and dark.
I always think about this like dynamics in music: if every part of a song is loud, then nothing actually seems loud. That’s what’s been happening with the iPhone camera over time. Everything is getting so bright that nothing is bright, and the photos are starting to look flat, even gray.
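To make that concrete, here’s a toy sketch (my own illustration, not Apple’s actual pipeline) of what an aggressive global tone-mapping curve does to normalized luminance values:

```swift
import Foundation

// Toy global tone-mapping curve: lifts shadows far more than it moves
// highlights. Luminance is normalized to 0...1; strength 0 leaves the
// image alone, strength 1 is maximally "helpful."
func mapTone(_ luminance: Double, strength: Double) -> Double {
    let lifted = pow(luminance, 1.0 - 0.5 * strength) // gamma < 1 lifts shadows
    return luminance + (lifted - luminance) * strength
}

// A deep shadow at 0.05 climbs to ~0.22 at full strength, while a
// highlight at 0.95 only reaches ~0.97: the distance between light and
// dark shrinks, and the whole frame reads as flat.
```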
This time around, I have good news and bad news.
iPhone 16 (left), iPhone 16 Pro (right). The Pro camera can capture more light, and it’s never seen a shadow it didn’t have a problem with.
The bad news is that by default, the iPhone 16 Pro camera is even more aggressive about evening out shadows and highlights than the iPhone 15 Pro. It’s subtle, but it’s there — you can see it with basic photos of plants, with pictures of people, with street scenes — it’s all just a little bit brighter, a little bit flatter.

Shadows in iPhone 16 Pro photos are dramatically boosted compared to the regular iPhone 16, although the 16 Pro offers much nicer depth of field, does less sharpening, and performs better in low light. (I actually found it hard to make the 16 Pro go into night mode, while the regular 16 drops into night mode pretty easily.)
The larger sensor with bigger pixels on the 16 Pro can just capture more light than the sensor on the 16, and Apple’s default settings use all that extra light to wage absolute war on shadows. And while the 48-megapixel ultrawide camera on the iPhone 16 Pro produces 12-megapixel photos that look awfully similar to the iPhone 15 Pro’s, they are substantially better than the ultrawide photos from the iPhone 16.
iPhone 15 Pro Max (left), iPhone 16 Pro Max (right). The differences here are so subtle, but the newer phone boosts the shadows just a little more.
We’re going to do a much deeper camera comparison in the weeks to come, so I won’t overdo the comparison to the Galaxy S24 Ultra and the Pixel 9 Pro XL, since that requires intense pixel peeping. Suffice it to say that the Pixel has the best zoom, while Samsung’s color handling remains aggressively chaotic. But time and again, all three cameras produced photos that were essentially small variations on the same ultraprocessed look that these companies have seemed intent on chasing for a while now.
The Pixel 9 Pro XL at 5x zoom is notably clearer than the iPhone 16 Pro Max at 5x zoom.
The iPhone 16 Pro has a nice 5x telephoto lens, but you can see some artifacting in this medium-light shot.
But here’s the good news. The iPhone 16 and 16 Pro allow you to exclude yourself from this narrative entirely with a huge upgrade to the Photographic Styles feature that allows you to adjust how the camera processes colors, skin tones, and shadows, even after you’ve shot a photo.
The iPhone 16 and 16 Pro let you pick “undertones” to help dial in your preferred skin tone.
You can pick from five “undertone” settings that are meant to adjust skin tones and nine “mood” settings that feel a lot like high-quality Instagram filters. You can shoot with a live preview of any of the styles, and then you can tweak the settings or even switch styles entirely later on.
And all of these styles offer three new fine controls. There’s “color,” which is basically saturation, and “palette,” which is the range of colors being applied. Most importantly, there’s a new control called “tone,” which lets you add shadows back to your photos.
It turns out Apple is using “tone” in this context to mean “tone mapping,” and in my tests, turning the tone control down reliably brought the iPhone’s image processing back to reality.
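Apple hasn’t published the actual curve, but conceptually the slider scales how hard that kind of mapping is applied; a purely illustrative sketch, with the exponent mapping entirely my own invention:

```swift
import Foundation

// Illustrative only: maps the UI's -100...100 tone slider onto a gamma
// exponent. Negative values sink shadows back down; positive values
// lift them even harder.
func applyTone(_ luminance: Double, tone: Double) -> Double {
    let strength = tone / 100.0     // -1...1
    let gamma = pow(2.0, -strength) // tone -100 -> gamma 2, tone +100 -> gamma 0.5
    return pow(luminance, gamma)
}

// applyTone(0.05, tone: -60) ≈ 0.011: the shadow detail the default
// pipeline insists on surfacing sinks back into darkness.
```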
Default settings (left), with the tone control set to -60 (right). I set it a little more aggressively to make the effect obvious.
The tone control is semantically aware — it will adjust things like faces and the sky differently, so it’s still doing some intense computational photography, but the goal is for you to be able to take photos that look a lot more like what a traditional camera would produce if you bring the slider all the way down. (You can also go all the way up for the most intense smartphone HDR photos you’ve ever seen, if that’s the sort of thing that makes you happy.)
Default settings (left), Tone -70 and Color +73 (right). Just a little punchier!
Turning down the tone control felt like a sigh of relief — I prefer photos with less aggressive tone mapping to both the default iPhone 16 Pro settings and the photos produced by the iPhone 15 Pro. It’s like a haze has been lifted; images are a little punchier, a little more present.

You might feel differently, but I like shadows and highlights, and the addition of the tone control lets me have them on a phone camera without jumping through the hoops of shooting in RAW and processing the photos myself.
For me, the tone control offers such a meaningful improvement to iPhone photos that it’s possible to argue this single camera adjustment makes upgrading to an iPhone 16 or 16 Pro worth it. I am a huge photo nerd who cares a lot about these things, and even I don’t think that argument is 100 percent convincing, but it is very possible to make, which is wild.
Could you also just buy a camera app like Halide and use its very popular new Process Zero feature on any other iPhone to take less processed photos? You could. Could you just spend this money on a nice point-and-shoot, a category that’s staging a mini comeback? You definitely could, and you might change your relationship to photography in a deeply positive way by doing so. But if you take a lot of iPhone photos, and you’ve started to notice that they look a little weird, well, it’s possible to at least make the argument.
The 48MP ultrawide camera produces 12MP photos that look essentially the same as the iPhone 15 Pro’s ultrawide.
Allowing styles to be edited and changed after a photo is taken required Apple to reshuffle the Photonic Engine computational photography pipeline — it’s the same basic process on the iPhone 16 Pro as on the 15 Pro, but tone mapping is now one of the final steps. The idea here is for the edits to be “perceptually lossless”: when you take a photo in a style, that’s how the photo is saved, but the iPhone adds a little chunk of data to the image file that allows it to undo that style and revert the image to standard. This means you can tweak styles and even change them entirely whenever you want, and I had great fun making several different versions of the same shot.
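Apple doesn’t document the on-disk format, but the behavior it describes suggests a structure along these lines (all names here are mine, purely illustrative):

```swift
import Foundation

// Illustrative sketch of "perceptually lossless" style edits; the real
// file format is Apple's and unpublished. The HEIF is saved with the
// style baked in, alongside a small record of the adjustments applied.
struct StyleRecord: Codable {
    var undertone: String // e.g. "Neutral", "Rosy"
    var tone: Double      // -100...100
    var color: Double     // -100...100
    var palette: Double   // 0...100
}

struct StyledPhoto {
    var renderedHEIF: Data  // what appears in your camera roll
    var record: StyleRecord // enough information to invert the style

    // Because the record can be inverted, switching styles later means
    // re-rendering from (approximately) the original capture rather
    // than stacking a second lossy edit on top of the first.
}
```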
That bit of extra data results in files that are about 20 percent bigger than before — around 3MB instead of 2.5MB, generally — and the vagaries of compression mean that styles only work if you shoot in HEIF, a format that continues to bedevil basically everything outside of Apple’s ecosystem. If you set the camera to shoot standard JPGs, you don’t get styles or the new tone control.
(Apple’s also added the ability to shoot in the new JPEG XL format in both lossy and lossless modes, but that’s a RAW format, and styles won’t work.) My suggestion to Apple would be to allow the use of the tone control as a permanent exposure-like adjustment when shooting JPGs, but I’m just a guy who likes shadows.
The D-pad control for color and tone in Photographic Styles is fun but hard to use precisely.
Styles overall aren’t really ready for professional workflows — the only way to adjust them after shooting is a fiddly two-axis D-pad that also controls color, which makes the whole thing feel woefully imprecise. You also can’t apply a style to a bunch of photos at once, and trying to keep track of which photos have which styles applied requires staring into the absolute abyss of iOS file management.
The Photos app in macOS Sequoia will be able to adjust styles, but Apple won’t say if third-party apps will be able to support style editing in the future. The entire vibe of all these new controls is very much “you figure it out.”
The more you play with styles, especially the undertone styles meant for skin tones, the more it seems like Apple has simply given up on having a point of view about what this camera should look like. Google makes a lot of noise about its Real Tone project, which is supposed to allow the Pixel to capture accurate skin tones for all kinds of people, but Apple’s solution is to simply let people choose their own skin tone using the “undertones” styles (which works, although it often changed more than just skin tone in our test photos).

Undertones also apply to everyone in an image, so if you take a photo of people with a range of skin tones, they’re all going to get the same effect. I get the idea behind undertones, but the execution feels like it needs a little more refinement.
I don’t have a lot to say about the “mood” styles, which are very fun and expressive. Verge supervising producer Vjeran Pavic basically fell in love with these while he was testing the video features — they reminded him of the very popular Fujifilm recipes for emulating different kinds of film. You should watch our video review for more on both.
The one thing I will add is that the new spatial audio recording in video is surprisingly complicated, and doesn’t really result in spatial audio the way you’d expect when you play a video back. Apple’s Alex Kirschner told me that spatial audio capture is primarily there to enable the (very cool!) new audio mix feature that allows you to remove background noise from videos of people talking; you’ll get headphone-based spatial audio when listening through AirPods, but Apple’s bizarrely chosen to have the Apple TV play these videos in 5.1 or 7.1 surround instead of something like Atmos, so you’ll lose any height effects. (Worse: if you AirPlay a video captured with spatial audio, it will only play back in stereo.)

Is it bananas that a smartphone can record 4K video in surround sound? It absolutely is. It is just also getting increasingly hard to understand what Apple means by “spatial audio,” and how it can be edited and played back across various audio devices.
No AI-enhanced existential crisis moon photos here.
It’s also notable what isn’t present on the iPhone this year: there’s no generative AI wackiness at all. There’s no video boost or face-swapping, no adding yourself to group photos, no drawing to add stuff with AI like on the Pixel or Galaxy phones — really, none of it.

I asked Apple’s VP of camera software engineering Jon McCormack about Google’s view that the Pixel camera now captures “memories” instead of photos, and he told me that Apple has a strong point of view about what a photograph is — that it’s something that actually happened. It was a long and thoughtful answer, so I’m just going to print the whole thing:
Here’s our view of what a photograph is. The way we like to think of it is that it’s a personal celebration of something that really, actually happened. Whether that’s a simple thing, like a fancy cup of coffee that’s got some cool design on it, all the way through to my kid’s first steps, or my parents’ last breath, it’s something that really happened. It’s something that is a marker in my life, and it’s something that deserves to be celebrated. And that is why when we think about evolving in the camera, we also rooted it very heavily in tradition. Photography is not a new thing. It’s been around for 198 years. People seem to like it. There’s a lot to learn from that. There’s a lot to rely on from that. Think about stylization: the first example of stylization that we can find is Roger Fenton in 1854 — that’s 170 years ago. It’s a durable, long-term, lasting thing. We stand proudly on the shoulders of photographic history.
That’s a sharp and clear answer, but I’m curious how Apple contends with the relentless addition of AI editing to the iPhone’s competitors. The company is already taking small steps in that direction: a feature called “Clean Up” will arrive with Apple Intelligence, which will allow you to remove objects from photos, like Google’s Magic Eraser. McCormack told me that feature will somehow mark the resulting images as having been generatively edited, although he didn’t say how.
I did ask if Apple would adopt an image verification standard like C2PA, which companies like Adobe, Microsoft, OpenAI, and now Amazon and Google have decided to support; McCormack told me Apple was waiting to see how things evolved before making decisions. This is fair, as that standard is a bit of a mess right now, and it’s not clear it’ll even be that effective on social platforms.
But being able to trust the images we see is going to get more and more complex and important, and the iPhone is the most popular camera in the world, so it’s clear that the industry will ultimately bend around Apple’s approach. We’ll see.
The 16 Pro Max is a big phone.
Everything else about the iPhone 16 Pro is incredibly incremental. The displays now go down to 1 nit of brightness, which is very nice for not waking your partner while you doomscroll in bed. Those displays are also bigger now — the Pro is 6.3 inches, while the Pro Max is 6.9 inches, which is the largest ever on an iPhone.
The regular Pro doesn’t feel that much bigger, since the bezels are smaller and the phones didn’t get any thicker. But the 16 Pro Max feels meaningfully larger than the 15 Pro Max. I have big hands and I’ve always picked the big phone, and the 16 Pro Max is definitely right on the line of being too big to handle like a phone instead of a tablet.
Both phones have an A18 Pro chip inside, which Apple claims is faster than the A17 Pro by various impressive-sounding percentages. As with all iPhones, those performance numbers are mostly about headroom and longevity at this point — my iPhone 15 Pro doesn’t feel slow, and the 16 Pro doesn’t feel faster. I am very curious to see if the addition of Apple Intelligence changes this perception, but we’ll just have to wait and see.
I feel the same way about battery life. While the iPhone performance advantage means these phones will stay relevant for a long time, my experience with battery degradation is just the opposite. After about a year, my iPhone 15 Pro Max battery capacity has dropped to 93 percent, and it now struggles to make it through a day without enabling Low Power Mode.
Apple says the iPhone 16 Pro gets significantly better battery life than the iPhone 15 Pro, although the company won’t quote anything other than video playback times. The Pro Max is supposed to have the best battery life ever on an iPhone, and the batteries certainly held up for full days during my testing, which was very heavy on camera usage and screen-on time.
But it’s unclear how Apple Intelligence running in various places across iOS will affect battery life, and it’s similarly hard to know if the battery will stay strong after months and years of use. iPhone 16 Pro battery replacements from Apple cost more than before, so this is something to pay attention to over time.
Software-wise, my review units are running iOS 18.0, which allows you to radically customize the homescreen, lockscreen, and Control Center, and which supports RCS for better messaging with Android users.
The updated Qi2-enabled MagSafe puck can charge the iPhone 16 Pro at up to 25W from a 30W power adapter.
You can more or less theme the homescreen any way you want, down to adjusting icon colors globally, and the lockscreen now allows you to change the quick access buttons to third-party apps. The preview build of Halide I was testing supported this, so I switched it in for the system camera app, which was nice.
Once more camera apps support this, we’ll end up with a lot of ways to open camera apps from the lockscreen — you’ll be able to set the Action Button, the Camera Control button, and the lockscreen button all to different camera apps if you want, and still have the ability to open the system camera by swiping to the right. Pretty neat.
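Under the hood, that lockscreen swap goes through the new controls API in iOS 18’s WidgetKit. A minimal sketch, assuming the WidgetKit and App Intents names from Apple’s documentation (OpenCameraIntent is a hypothetical intent your app would define):

```swift
import WidgetKit
import SwiftUI
import AppIntents

// Hypothetical intent that launches the app's camera when the control fires.
struct OpenCameraIntent: AppIntent {
    static let title: LocalizedStringResource = "Open Camera"
    static let openAppWhenRun = true

    func perform() async throws -> some IntentResult {
        // Route to your capture UI here.
        .result()
    }
}

// A control the user can assign to the lockscreen quick access slot
// (or Control Center) in place of the system camera shortcut.
struct OpenCameraControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.open-camera") {
            ControlWidgetButton(action: OpenCameraIntent()) {
                Label("Open Camera", systemImage: "camera")
            }
        }
    }
}
```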
The revamped Control Center is a bit of an adjustment. If you’re like me and you’ve used it by pure muscle memory for years, even the switch from squarish icons to circles is a little disorienting.

The whole thing is now organized into vertical sheets: favorites, media controls, home controls, and the various radio and network controls. You can move controls and groups of controls around at will, resize them as you like, and generally create a little freeform command center of your go-to settings.
This will all come down to how much time you want to spend creating the perfect arrangement of controls — I’m a huge nerd, and I can’t wait to spend an hour or so getting it just right.
Apple also updated the MagSafe charging system, which can now charge at up to 25 watts using the new Qi2-compatible MagSafe puck and a 30W charger. I charged my review unit for quite a while at full speed, and it didn’t even get warm.
Price-wise, things are the same as last year: the iPhone 16 Pro starts at $999 for the model with 128GB of storage, while the larger Pro Max starts at $1,199 with 256GB of storage. You can get them in desert, natural, white, and black, which all look fine — I’m a little jealous that the regular iPhones get fun colors this year, but I just stick these things in cases anyway, so it doesn’t really matter.
Can Apple pull its AI software... into focus?
So that is the iPhone 16 Pro… so far. As it exists today, it’s a remarkably iterative update to the iPhone 15 Pro — it’s hard to find reasons to upgrade from last year’s model. And I’m not at all convinced that it’s worth upgrading to the 16 Pro from older Pro models just yet, either — the Camera Control and Action Button are nice but not game-changing, and unless you’re excited about dialing in the new Photographic Styles and the new tone control, you might find the even-brighter-and-flatter photos to actually be a step backward in photo processing.
If you can’t tell, I am personally thrilled by the tone control, so this is an easy choice for me, but it feels like it’s worth waiting a tick for everyone else.
A lot of people have asked us if the extra money for the Pro phone is worth it this year, since the spec sheet of the iPhone 16 appears to be very close to the Pro’s.
We’ve got a full review of the regular iPhone 16 here, but my short answer is that the Pro camera is meaningfully better, and that Apple shipping a 60Hz screen in 2024 is just silly, so I’m a Pro phone person all the way.

It really does feel like Apple intended to ship these things with Apple Intelligence, but it’s simply not here yet, and the complete feature set Apple’s announced, with things like image generation and ChatGPT integration, won’t be here until next year.
And if you’re in the EU or China, you might be waiting for quite a while, as Apple navigates various regulatory hurdles in those regions before it can launch this stuff at all.
That’s not to say the iPhone 16 Pro is a bad phone — it’s a great phone, with some fascinating ideas about smartphone photography embedded in it. But it’s also clearly unfinished, and I think it’s worth waiting to see if Apple Intelligence can complete some of these thoughts before spending the money on an upgrade.
Agree to continue: Apple iPhone 16, 16 Plus, 16 Pro, and 16 Pro Max
Every smart device now requires you to agree to a series of terms and conditions before you can use it — contracts that no one actually reads. It’s impossible for us to read and analyze every single one of these agreements. But we’re going to start counting exactly how many times you have to hit “agree” to use devices when we review them, since these are agreements most people don’t read and definitely can’t negotiate.
To use any of the iPhone 16 models, you have to agree to:

- The iOS terms and conditions, which you can have sent to you by email
- Apple’s warranty agreement, which you can have sent to you by email
These agreements are nonnegotiable, and you cannot use the phone at all if you don’t agree to them.
The iPhone also prompts you to set up Apple Cash and Apple Pay at setup, which further means you have to agree to:

- The Apple Cash agreement, which specifies that services are actually provided by Green Dot Bank and Apple Payments Inc. and further consists of the following agreements:
  - The Apple Cash terms and conditions
  - The electronic communications agreement
  - The Green Dot Bank privacy policy
  - Direct payments terms and conditions
  - Direct payments privacy notice
  - Apple Payments Inc. license
If you add a credit card to Apple Pay, you have to agree to:

- The terms from your credit card provider, which do not have an option to be emailed
Final tally: two mandatory agreements, seven optional agreements for Apple Cash, and one optional agreement for Apple Pay.