In
March
2023,
a
North
Carolina
student
was
stepping
off
a
school
bus
when
he
was
struck
by
a
Tesla
Model
Y
traveling
at
“highway
speeds,”
according to a federal investigation published today.
The
Tesla
driver
was
using
Autopilot,
the
automaker’s
advanced
driver-assist
feature
that
Elon
Musk
insists
will
eventually
lead
to
fully
autonomous
cars.
The
17-year-old
student
who
was
struck
was
transported
to
a
hospital
by
helicopter
with
life-threatening
injuries.
But after examining hundreds of similar crashes, the investigation found a pattern of driver inattention, combined with the shortcomings of Tesla’s technology, that resulted in hundreds of injuries and dozens of deaths.
Drivers
using
Autopilot
or
the
system’s
more
advanced
sibling,
Full
Self-Driving,
“were
not
sufficiently
engaged
in
the
driving
task,”
and
Tesla’s
technology
“did
not
adequately
ensure
that
drivers
maintained
their
attention
on
the
driving
task,”
NHTSA
concluded.
In total, NHTSA investigated 956 crashes spanning January 2018 through August 2023.
Of
those
crashes,
some
of
which
involved
other
vehicles
striking
the Tesla,
29
people
died.
There
were
also
211
crashes
in
which
“the
frontal
plane
of
the
Tesla
struck
a
vehicle
or
obstacle
in
its
path.”
These
crashes,
which
were
often
the
most
severe,
resulted
in
14
deaths
and
49
injuries.
NHTSA
was
prompted
to
launch
its
investigation
after
several
incidents
of
Tesla
drivers
crashing
into
stationary
emergency
vehicles
parked
on
the
side
of
the
road.
Most
of
these
incidents
took
place
after
dark,
with
the
software
ignoring
scene
control
measures,
including
warning
lights,
flares,
cones,
and
an
illuminated
arrow
board.
In
its
report,
the
agency
found
that
Autopilot
— and,
in
some
cases,
FSD
— was
not
designed
to
keep
the
driver
engaged
in
the
task
of
driving.
Tesla
says
that
it
warns
its
customers
that
they
need
to
pay
attention
while
using
Autopilot
and
FSD,
which includes keeping their hands on the wheel and their eyes on the road.
But
NHTSA
says
that
in
many
cases,
drivers
would
become
overly
complacent
and
lose
focus.
And
when
it
came
time
to
react,
it
was
often
too
late.
In 59 of the crashes NHTSA examined, the agency found that Tesla drivers had enough time to react, “five or more seconds,” before crashing into another object.
In
19
of
those
crashes,
the
hazard
was
visible
for
10
or
more
seconds
before
the
collision.
Reviewing
crash
logs
and
data
provided
by
Tesla,
NHTSA
found
that
drivers
failed
to
brake
or
steer
to
avoid
the
hazard
in
a
majority
of
the
crashes
analyzed.
“Crashes
with
no
or
late
evasive
action
attempted
by
the
driver
were
found
across
all
Tesla
hardware
versions
and
crash
circumstances,”
NHTSA
said.
NHTSA
also
compared
Tesla’s
Level
2
(L2)
automation
features
to
products
available
in
other
companies’
vehicles.
Unlike other systems, Autopilot would disengage when drivers manually adjusted their steering, rather than staying active and allowing them to steer.
This
“discourages”
drivers
from
staying
involved
in
the
task
of
driving,
NHTSA
said.
“A
comparison
of
Tesla’s
design
choices
to
those
of
L2
peers
identified
Tesla
as
an
industry
outlier
in
its
approach
to
L2
technology
by
mismatching
a
weak
driver
engagement
system
with
Autopilot’s
permissive
operating
capabilities,”
the
agency
said.
Even
the
brand
name
“Autopilot”
is
misleading,
NHTSA
said,
conjuring
up
the
idea
that
drivers
are
not
in
control.
While
other
companies
use
some
version
of
“assist,”
“sense,”
or
“team,”
Tesla’s branding lures drivers into thinking the system is more capable than it is.
California’s
attorney
general
and
the
state’s
Department
of
Motor
Vehicles
are
both
investigating
Tesla
for
misleading
branding
and
marketing.
NHTSA
acknowledges
that
its
probe
may
be
incomplete
based
on
“gaps”
in
Tesla’s
telemetry
data.
That
could
mean
there
are
many
more
crashes
involving
Autopilot
and
FSD
than NHTSA was able to find.
Tesla
issued
a
voluntary
recall
late
last
year
in
response
to
the
investigation,
pushing
out
an
over-the-air
software
update
to
add
more
warnings
to
Autopilot.
NHTSA
said
today
it
was
launching
a
new
investigation
into
the
recall
after
a
number
of
safety
experts
said
the
update
was
inadequate
and
still
allowed
for
misuse.
The
findings
cut
against
Musk’s
insistence
that
Tesla
is
an
artificial
intelligence
company
that
is
on
the
cusp
of
releasing
a
fully
autonomous
vehicle
for
personal
use.
The
company
plans
to
unveil
a
robotaxi
later
this
year
that
is
supposed
to
usher
in
this
new
era
for
Tesla.
During
this
week’s
first-quarter
earnings
call,
Musk
doubled
down
on
the
notion
that
his
vehicles
were
safer
than
human-driven
cars.
“If
you’ve
got,
at
scale,
a
statistically
significant
amount
of
data
that
shows
conclusively
that
the
autonomous
car
has,
let’s
say,
half
the
accident
rate
of
a
human-driven
car,
I
think
that’s
difficult
to
ignore,”
Musk
said.
“Because
at
that
point,
stopping
autonomy
means
killing
people.”
(Originally posted by Andrew J. Hawkins)