Apple
is
inviting
investigations
into
the
Private
Cloud
Compute
(PCC)
system
that
powers
more
computationally
intensive
Apple
Intelligence
requests.
The
company
is
also
expanding
its
bug
bounty
program
to
offer
payouts
of
up
to
$1,000,000
for
people
who
discover
PCC
vulnerabilities.
The company has boasted about how many AI features (branded as Apple Intelligence) will run on-device, without data ever leaving your Mac, iPhone, or other Apple hardware.
Still, it will send more demanding requests to PCC servers, which are built on Apple Silicon and a new operating system.
Many AI applications from other companies also rely on servers to handle their more demanding requests, but users don’t have much visibility into how secure those server-based operations are.
Apple,
of
course,
has
made
a
big
deal
over
the
years
about
how
much
it
cares
about
user
privacy,
so
poorly
designed
cloud
servers
for
AI
could
poke
a
hole
in
that
image.
To
prevent
that,
Apple
said
it
designed
the
PCC
so
that
the
company’s
security
and
privacy
guarantees
are
enforceable
and
that
security
researchers
can
independently
verify
those
guarantees.
For researchers, Apple is offering resources that include a PCC security guide and a Virtual Research Environment for inspecting the system from a Mac.
With
the
bug
bounty,
Apple
is
offering
payouts
from
$50,000
to
$1,000,000
for
vulnerabilities
discovered
across
a
few
different
categories.
Apple will also evaluate for a potential reward any security issue that “has a significant impact to PCC.”
The
first
Apple
Intelligence
features
are
set
to
launch
for
everyone
with
iOS
18.1,
which
is
expected
next
week.
Some
of
the
bigger
Apple
Intelligence
features,
including
Genmoji
and
ChatGPT
integration,
appeared
in
the
first
iOS
18.2
developer
beta
released
yesterday.
(Originally posted by Jay Peters)