The first known fatal car crash with Autopilot engaged occurred on May 7th in Florida. It’s the first of what will likely be many crashes and injuries involving partially or fully autonomous vehicles. It’s very likely true that over time, autonomous vehicles will make our roadways safer than ever before. The benefits will range from the direct, such as fewer accidents, to less obvious cascade effects: less pollution, fewer traffic jams holding up emergency vehicles, and much, much more that’s already been written about extensively. Motorists will be so happy, they’d be shooting sunbeams out of the exhaust if cars still had exhaust. (We’ll leave aside the reality that for now, most electric vehicles are likely juiced up with electrons produced at coal-fired power plants, so the pollution reduction benefit may have to wait. See Coal Powered Electric Cars and Electric cars and the coal that runs them)
So why is some more destruction on the way?
Well, there are a whole lot of reasons. Some should be obvious; some less so. In some of these cases, it’s likely that better user interfaces and experiences can help. The challenge is that we don’t yet know what “better” might mean here. And sadly, some of those lessons just won’t be learned in the lab or on test tracks. Whether you’re a Product person or a User Experience person, looking at this market segment (whether you work directly in it or not) is going to provide some powerful lessons on crafting industry-changing, mission-critical hardware and applications.
Part of the goal of more embedded technology, such as hands-free voice-activated devices, is to reduce this problem. But the issue isn’t just one of mechanics. It’s not just about taking your hands or eyes off what you’re supposed to be doing. It’s about cognitive load and being able to handle multiple tasks at once. Most people don’t truly multi-task; we just task-switch quickly. Maybe sometimes we do: driving while chewing gum and changing the radio station; that’s three. Now add a child running into the street. That’s a fourth thing to deal with. Maybe everything works out okay, but if not, was it because it was unavoidable? Or if one of those tasks had been shed, might there have been a better reaction time?
Any way you look at it, there are at least two factors that come into play here. There’s the fundamental human-factors issue of cognitive load and how much any typical person can handle at once. And then there’s training. Most drivers are not actively trained on new technology. They might not even read manuals, much less do drills or practice skills before they’re needed.
So what’s going to happen when they get loaded up with even more engaging technology while they’re barreling down the road?
What are some potential failure modes?
Obvious Physical Failures
- Mechanical things sometimes fail, whether from poor manufacturing, damage in transport, abuse, vandalism, or simply time.
- As of this writing, one of the significant remaining issues with autonomous cars is the reliability of their sensors. Specifically, a variety of inclement weather conditions can challenge their abilities.
- People are often poor drivers. (You know who you are!) (But actually, sadly, you don’t.)
- New systems won’t necessarily help many motorists; they just add complexity and “modes” that will be more than most users can handle.
- User interfaces and controls for this area are new. No matter how standardized the industry tries to make them, chances are there will be some mistakes made that lead to problems.
These are things we’re reasonably aware could happen. We don’t know how or when, but we try to guard against them.
- Hackers (For fun, profit, criminal, terroristic.)
- Sunspots or other electrical events that could do damage.
- Political / Police Controls – Might someone intentionally be given access to take control of another vehicle and mishandle that responsibility?
These are things we just haven’t anticipated at all. Everything is so new and different that things happen as a result of what has sometimes been called “Failure of Imagination.”
A Demanding (and Often Misunderstanding) Public
After the Space Shuttle Challenger exploded in the 1980s, President Ronald Reagan said, “The future doesn’t belong to the fainthearted; it belongs to the brave.” However, most motor vehicle operators and passengers are not trying to be test pilots. They may – from a marketing perspective – be early adopters of new technology, but they have an expectation of safety that’s probably a bit higher than astronauts have regarding their vehicle of choice.
This seems reasonable.
But it’s likely not the case. At least, not for some individuals. Let’s say it’s 10–20 years from now. Even if we assume a smooth implementation and evolution of autonomous vehicles (it won’t be smooth; it will be a mess for various reasons), it’s likely we’ll see far fewer accidents and associated injuries and deaths. This is of course a good thing. And yet, when something bad does happen, we’ll see the usual 24/7 news coverage of those “scary ghost autos that kill.” And we’ll see lawsuits where the lawyers go after car companies, software developers, sensor manufacturers, and… well… everyone they can, because that’s their job. While I may have shown poor taste in the space shuttle metaphor, the reality is that NASA’s wonderful and amazing space shuttles had a 40% catastrophic failure rate on a total unit basis: two of the five orbiters that flew were lost. New military aircraft under development have had infamously spectacular failures. And even supposedly well-understood technology like airbags has had major safety issues long after launch. (The Takata airbag crisis.)
So back to Tesla for a Moment…
There’s a concept in business called “First Mover Advantage.” The theory is that if you’re first to market, you can far exceed others and do very well. However, there are a couple of corollaries to this. One is “Second Mover Advantage,” whereby you learn from the mistakes of others and, as a fast follower, may garner even more advantage. Another is the notion that the pioneers are the ones with the arrows in their chests.
It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it.
Unfortunately, this doesn’t matter. And it likely never will. This is an assertion on my part, but I’ll go out on a limb and say that there’s very little between fully autonomous and fully manual that will not occasionally be wholly misused by motorists. Even “dumb” cruise control, which has been around for decades, causes or contributes to some accidents. (Units can be defective, or the feature can enable drowsy driving, among other issues.)
The essential problem is the human factor. I don’t know if there’s any data on how often traditional cruise control is used, by how many people, for what percentage of their drive time. I happen to believe it’s not much. But as technology gets better – and by better I mean more assistive here – we probably will be using it more. Just not all at once. And during the in-between period of partial functionality, when many vehicles may work more differently from one another than ever before, and before we’re fully autonomous, we’re in for some problems.
Any Takeaways for Product Management?
The Minimum Viable Product Concept: This popular concept in so-called “Lean” product development may be sound. It’s usually coupled with Agile methods for development that involve fast product development Sprints to build incremental value and learn quickly from the marketplace. However, serious judgment is involved in just what constitutes “Minimum” and “Viable.” Step one is don’t do things that kill your customers. I don’t personally believe Tesla has done anything wrong here. And yet, could this have been avoided? Will they modify designs or user experience to try to stop this sort of thing from happening again? If they do, it’s basically an admission that it could have been better in the first place. But is it reasonable to expect them to have just known how to do better in the first place?
Maybe. But probably not. This is a judgment call, and we have to make it for all manner of things. We most frequently see such questions regarding traditional auto manufacturing, aircraft issues, and other consumer products.
The Lean concept and MVP used to be proud of a “fail fast and iterate” mantra. That tragically hip thinking is thankfully morphing into a “learn fast and iterate” euphemism, which sounds much better. Still, it amounts to the same thing. Tesla has said it uses Agile methods, which is fine. There’s nothing to say that such methods can’t produce fine physical products as well as digital ones. (And I really don’t know how much they use Agile for software vs. physical production, though since the two are so tightly coupled in this case, it probably doesn’t matter.)
Designing the Physical User Experience: We have over the years developed a great many design patterns for interacting with our world. When most of us come to a set of double doors with handles on them, we’ll assume we need to pull to open them. And we’ll do so even when there’s a giant sign right there that says “Push.” Why? Because the handle is what’s called an “affordance” that gives us a means to accomplish an action. And we understand so many common actions that we have both mental models and the associated muscle memory that results from similar actions performed over time.
We have visual metaphors, such as a mailbox for email, even though at this point email has capabilities so far beyond its early incarnations that the metaphor is a bit of a joke in some ways. Still, we’ve come to understand this and many other conventions, from the visual to the gestural, on our devices and beyond.
What we really don’t have is established design patterns for the kind of interactivity we’ll be using in autos. It may be instructive to look at the aviation industry. Pilots have been using automation and advanced navigation systems for years. They often struggle with these things as they’re implemented inconsistently and advanced features can be challenging. And these are people with fairly advanced training in their systems who usually do read the owner’s manual.
Now, go ask a typical car owner if they’ve ever read their vehicle manual beyond looking up something like a headlight replacement part number. Some might say, “Well, the tech should be so easy you don’t need a manual anyway.” Maybe. We’ll see. The point is, bad UI / UX here doesn’t mean just a bad online shopping experience where a customer abandons their shopping cart. It can mean a distraction that causes injury or death.
Your products might not have the same critical potential failure modes. But in any case, it’s going to be instructive to watch this industry struggle and products get deployed beyond the beta and very early adopter phase.