The Second Eclectic

Technology changes how we relate to God and each other

Tesla Introduces “New Safety Features and Autopilot,” but also New Risks


Tesla may very well be the most forward-thinking car maker on the planet. Their sexy designs and fully electric motors make driving a Tesla the ultimate status symbol. Here in Illinois, every license plate has a number somewhere between 1 and 2,000, suffixed with “EL,” which I can only assume means “electric.” And the numbers probably mean that the state is counting each Tesla one by one. Deservedly so.

This week Tesla announced their newest design, the Dual Motor Model S, which they claim is “the fastest accelerating four-door production car of all time.” Sweet. But they buried the lede on this story: “New Safety Features and Autopilot.” Here’s what they said on their blog:
Our system is called Autopilot because it’s similar to systems that pilots use to increase comfort and safety when conditions are clear. Tesla’s Autopilot is a way to relieve drivers of the most boring and potentially dangerous aspects of road travel – but the driver is still responsible for, and ultimately in control of, the car. The Autopilot hardware opens up some exciting long term possibilities.
Now I’m a fan of Tesla, which may be obvious, but the promise of “exciting long term possibilities” doesn’t fool me. Tesla, like most tech companies, is presenting their technology with uncritical optimism, touting only the benefits of Autopilot. Consumers, however, would be wise to consider both sides.

Tesla promises that Autopilot could increase “comfort and safety” and “relieve drivers” of boredom and the “potentially dangerous aspects” of driving. But only “when conditions are clear.” Yet, safety is only half the story.

Tesla likens their Autopilot to “systems that pilots use.” However, autopilot isn’t entirely smooth sailing either. Nicholas Carr points to recent studies about problems that pilots are facing:
Pilots’ “automation addiction” has eroded their flying skills to the point that they sometimes don’t know how to recover from stalls and other mid-flight problems, say pilots and safety officials. The weakened skills have contributed to hundreds of deaths in airline crashes in the last five years. . . .
As the Dual Motor Model S merges with traffic, these same risks are entering our roads and highways. Yes, Tesla promises to relieve boredom and potential dangers, but Autopilot introduces new risks at the same time. Tesla’s Autopilot feature will present us with new problems. And yet.

“The driver is still responsible for . . . the car.” In case there is any doubt, Tesla wants to make that clear. And they want to assure you that you are “ultimately in control.” Yet, how can you be responsible for a car’s performance when you didn’t program the car’s software?

Tesla’s computer software is evaluating the road and making driving decisions without you. The person behind the wheel isn’t deciding. And just like airline pilots, why would you keep paying attention to decisions that are out of your control?

So who is responsible when an accident occurs? Tesla certainly wants it to be you. Why? Because if the car crashes, they don’t want to be held liable. Liability is too costly for them. But with more and more automation in automobiles, liability is a question that the courts may have to resolve.

The courts or, possibly, you.

If drivers take the time to really consider what Tesla is offering, the public may decide they’re not buying it. Despite the “exciting long term possibilities,” the long-term risks of “automation addiction” may not be worth it. The costs may simply be too high, no matter what the sticker price is.

Conversations are better than comments, don't you think? If you'd like to continue the conversation, email me.