November 21, 2018

#S02E11 - Self-Driving Cars: Building An Autonomous Vehicle (Part 2)

Uber was the first company to ever experience a fatal accident with a self-driving car, and in just a few years autonomous vehicles will start flooding the market. So what went wrong with Uber's driverless car, and how do companies rebuild trust with the public?

  • Dean Pomerleau - Autonomous vehicle consultant and early pioneer.
  • Sasha Lekach - Transportation Reporter at Mashable
  • Cibby Pulikkaseril - Co-Founder and CTO of Baraja
  • Chris Woods - Regional President of Chassis Systems Control at Bosch Australia

Dean and Todd's 'No Hands Across America' blog: https://www.cs.cmu.edu/~tjochem/nhaa/Journal.html

CREDITS

This episode of Moonshot was hosted by Kristofor Lawson (@kristoforlawson) and Andrew Moon (@moonytweets).

Research for this episode by Patrick Laverick.

Our theme music is by Breakmaster Cylinder.

And our cover artwork is by Andrew Millist.

TRANSCRIPT

Newsreader 1: “New concerns about self-driving cars after a fatal crash involving one of those cars.”

Newsreader 2: “After hearing about the news, Uber pulled its self driving cars off the road in all of its test cities”

KRIS: Welcome to Moonshot - I’m Kristofor Lawson. And this is Part 2 of our episode looking at the development of driverless vehicles.

KRIS: On the 18th of March 2018, an autonomous vehicle operated by Uber struck and killed Elaine Herzberg as she crossed a dark street at night in Tempe, Arizona. It was the first time an autonomous vehicle had killed a pedestrian, and it shook the entire industry.

Newsreader 3: “The National Transportation Safety Board is sending investigators to Tempe, Arizona. Police are still figuring out who was at fault. But there are serious questions now because this was part of Uber’s pilot program to see whether this works at all.”

Newsreader 4: “A human operator was inside behind the wheel but the car was in the self-driving mode.”

Newsreader 5: “This dashcam video shows the horrifying seconds before a self-driving Uber hit and killed a pedestrian in Arizona.”

Man: “Our investigation did not show at this time that there were significant signs of the vehicle slowing down.”

Newsreader 6: “And as they piece this together that video is both answering questions and raising new ones. It does show how difficult it was to see the victim on that dark road, but these autonomous vehicles are designed to detect obstacles even in the dark. So the question here is, what went wrong?”

Sasha Lekach: That didn't just affect Uber, that affected everyone.

KRIS: That’s Sasha Lekach - transportation reporter at Mashable.

Sasha Lekach: Waymo took a hit as much as Uber did, but just in a different way. This was a tragic, really terrible accident, and it made the whole self-driving car industry look like it couldn't go on, so it's really important that they all communicate and kind of hold a firm ground all together.

Dean Pomerleau: The Uber crash was tragic, as well as several of the recent Tesla fatalities when autopilot was on.

KRIS: This is Dean Pomerleau - one of the pioneers of autonomous vehicles at Carnegie Mellon University in the late ’80s and early ’90s. If you missed our previous episode, go back and have a listen to find out more about Dean’s work on those early driverless systems.

Dean Pomerleau: Uber may be an outlier to some extent in terms of its focus on safety. I have pretty good first or second hand knowledge about several of the others in the industry, Waymo and Cruise, and know with pretty good authority that they take safety quite seriously. And so, if anything, if you look at the California crash statistics, they err on the side of caution.

KRIS: In May, the National Transportation Safety Board released a preliminary report into the crash. It found that Uber’s vehicle was equipped with all the necessary sensors and first detected Elaine six seconds before impact - but the system took a while to work out what it was seeing. It classified her first as an unknown object, then as a vehicle, and then as a bicycle. She did have a bicycle with her, but she was walking on foot. The system only determined that emergency braking was needed 1.3 seconds before impact.

Dean Pomerleau: The problem I see and the problem we foresaw many years ago in the deployment of these systems, was this chicken and egg problem. Unless the system is perfect and you only have other self driving cars on the road, there's always going to be the possibility of mistakes and any human at the wheel or behind the wheel won't be ready to intervene quickly enough, because unlike in aeroplanes, where you typically have at least a few seconds, if not minutes to respond to an emergency situation, in cars it's often at most seconds, and often just fractions of a second, between when the system realises it's incapable of handling the situation, and the impending crash.

KRIS: The other issue at play was the way Uber engineered their system. It was designed to ignore some of the obstacles it came across, a decision Dean says could be due to the way radar sensors work.

Dean Pomerleau: Radars in particular are known for, for example, giving... showing what appears to be an obstacle in the middle of the road ahead of a vehicle, even when it's simply a sign, or an overhead sign, or a bridge that you're going under. The radars can sometimes give you returns off of things that aren't actually an obstacle in your path.

KRIS: With so much potential for false positives, the easiest option is simply to switch off the alarm. In Uber’s case, the company had disabled the car’s automatic emergency braking to prevent erratic driving behaviour. That meant the system couldn’t react at the point it realised it needed to - the back-up driver was expected to intervene and take over. However, the system was not designed to warn the operator that emergency braking was needed.

Dean Pomerleau: I think that has been both, for the Uber crash, and for these recent Tesla crashes, the expectation that stopped obstacles are very frequently false alarms, and therefore can safely be ignored, has been, sort of, the Achilles heel of many of the existing self driving cars systems, and on virtually all of the crashes that have made the news.
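To make that trade-off concrete, here is a minimal, hypothetical Python sketch of the kind of decision logic being described. It is not Uber’s actual software: the confidence numbers, thresholds, and the `suppress_emergency_braking` flag are assumptions used purely to illustrate how treating uncertain or stopped obstacles as false alarms, and delegating emergency braking to the human operator, delays a stop until very late.

```python
# Hypothetical sketch only -- not Uber's real software. It shows how suppressing
# automatic emergency braking, and ignoring low-confidence detections, can mean
# the system never brakes even though it "saw" the obstacle seconds earlier.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str                # e.g. "unknown", "vehicle", "bicycle"
    confidence: float         # classifier confidence, 0..1 (values below are invented)
    seconds_to_impact: float  # time remaining before a collision


def should_emergency_brake(det: Detection,
                           min_confidence: float = 0.8,
                           suppress_emergency_braking: bool = True) -> bool:
    """Decide whether the vehicle should trigger an emergency stop."""
    if suppress_emergency_braking:
        # Braking is delegated to the human operator, so the system never stops itself.
        return False
    if det.confidence < min_confidence:
        # Uncertain returns (overhead signs, bridges, radar clutter) are treated
        # as false alarms and ignored -- the "Achilles heel" Dean describes.
        return False
    return det.seconds_to_impact < 2.0


# A rough reconstruction of the NTSB timeline: first detection six seconds out,
# shifting classifications, and a braking decision only 1.3 seconds before impact.
timeline = [
    Detection("unknown", 0.4, 6.0),
    Detection("vehicle", 0.6, 4.0),
    Detection("bicycle", 0.7, 2.5),
    Detection("bicycle", 0.9, 1.3),
]
for det in timeline:
    print(f"{det.seconds_to_impact:>4}s  {det.label:<8}  brake={should_emergency_brake(det)}")
```

With the suppression flag left on, as described in the preliminary report, the sketch never brakes at all; even with it off, braking only triggers once the classification is confident, which in this reconstructed timeline happens just 1.3 seconds out.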

KRIS: After the Uber crash, the company immediately suspended their self-driving operations while an investigation was completed, and the industry scrambled to reassure the public that driverless vehicles are in fact safe. But how do you rebuild trust in autonomous vehicles after such a serious event? We’ll have more after this.

[Ad break]

KRIS: Welcome back to Moonshot, I’m Kristofor Lawson... and a week after the Uber crash, Waymo’s CEO John Krafcik was interviewed at the National Automobile Dealers Association conference, and said that their technology could have avoided tragedy.

John Krafcik: “And I can say with some confidence that in situations like that one, with pedestrians, in this case a pedestrian with a bicycle, we have a lot of confidence that our technology would be robust, and would be able to handle situations like that one.”

KRIS: Waymo has a long history in the autonomous vehicle space - but they aren’t free from incident. There have been many reports of Waymo’s vehicles being involved in crashes - however the majority of autonomous vehicle incidents are due to other road users not realising that the vehicle is about to stop.

Dean Pomerleau: Typically, if you look at the scenarios described in the crash reports, they're almost always the vehicle stopping short or pausing at an intersection when the driver behind wasn't expecting the vehicle to. And so they rear-end the vehicle, it's like four to one that the vehicles are hit. Four, five times more often than they actually run in to anything. And that's both because they have attentive safety drivers who take over when there is a threat that the autonomous vehicle itself will run into something and because the software itself is designed to be very conservative. And I think rightfully so to err on the side of caution. And that was something that appeared that Uber wasn't on board with. They were being a little bit aggressive in staying engaged or ignoring potential obstacles, because there will almost certainly be false alarms.

KRIS: After several months of investigation, Uber decided in July that they would resume testing, but only in Pittsburgh, only in manual mode, and with two highly trained specialists in the car. The collision avoidance system has also been turned back on so the vehicle can react automatically if it notices any problems, and drivers will be monitored to make sure they’re alert.

KRIS: It’s very clear that Uber’s focus is now squarely on making sure their vehicles are safe and rebuilding trust with the public. But the incident highlights a big problem with the implementation of autonomous vehicles: how do we deal with serious issues like this one, which are rare but have the potential to completely turn the public off autonomous vehicles?

Cibby Pulikkaseril: You know, when people develop self-driving cars, they are going to come up against edge cases, and that is where a lot of the challenge lies I think.

KRIS: That’s Cibby Pulikkaseril - Co-Founder and CTO of Baraja, a company building LIDAR sensors for autonomous vehicles.

Cibby Pulikkaseril: How the companies approach these safely and without the loss of human life is a very difficult challenge. I think by having the best sensors possible, they arm themselves with the best information. So making sure that they have really high quality sensors is key for them to develop this at the speed they're going.

KRIS: Autonomous cars have driven millions of kilometres on roads across the United States and the rest of the world, collecting massive amounts of data on how to drive safely. Waymo has passed 10 million miles of autonomous driving. But most of this distance was traveled in good weather in warmer climates, and Dean says the best way to safely navigate difficult driving conditions is to have access to the best data available.

Dean Pomerleau: In all of these learning systems, we've seen, across all domains, it's garbage in, garbage out. If you don't give it the right training data or you give it noisy training data, the system, these machine learning systems, artificial neural networks are prone to learn the wrong thing.

KRIS: As we’ve mentioned many times on Moonshot before - an AI system is only as good as the data it’s been trained with. And if a vehicle isn’t learning how to deal with complex situations like the one Uber’s car faced, it will run into problems in the real world. That’s why Waymo trains its vehicles in simulation, going over situations again and again so they learn to navigate in a more consistent way. And a lot of the issues being faced now are actually similar to the problems Dean faced back in the early 1990s when working on the NavLab.

Dean Pomerleau: What's fascinating to see is that two things that came together in those early days, artificial neural networks and self-driving cars, are the hot topics again in both the broader AI community, but using artificial neural networks to improve self-driving cars is basically what everyone is doing now. All of the leading self-driving car companies, including Tesla, are using deep learning networks to do much of the driving. And that was exactly what we were doing back then. So I've actually suggested, I consult on for several of those companies, and I've suggested, you should go back and look at some of my PhD thesis because many of the problems that they're trying to deal with today I had to solve back then and had some clever ways of doing it.

KRIS: One of the issues Dean spent time figuring out was how to train an autonomous vehicle for the edge cases we were speaking about earlier. Because autonomous vehicles are rarely involved in accidents and mostly drive in relatively good weather, they collect very little training data on how to handle extreme situations. That’s why companies like Tesla are using their network of human-driven vehicles to gather valuable real-world data, which can then be used to train the vehicles for autonomy.

Dean Pomerleau: One of the challenges is how do you use data collected from human drivers to show the variety of situations that a self-driving system might get into?

Dean Pomerleau: So, if for example, a person keeps the vehicle well centred in the lane, a learning system would never learn how to recover if it somehow got offset from the centre of the lane by more than a small amount that a human normally, the band that human normally stays within. And so a lot of companies have been using simulations to try and show a learning system, a neural network, what it looks like when the vehicle gets far from the centre of the lane because in the live data, you never see that or very rarely see that.

Dean Pomerleau: And so I had a number of clever ways in my PhD thesis to transform the image in software to make it appear that the vehicle was offset to one side of the lane or the other and then adjust the correct answer, the steering direction that would be appropriate, you know, based on a model of how you recover or how a person would recover if they ever encountered such an extreme situation.
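As a rough illustration of the augmentation technique Dean is describing, here is a minimal Python sketch: shift the camera image sideways to mimic a vehicle sitting off-centre in its lane, then adjust the steering label so the network sees what a recovery should look like. The function names, the pixel-to-metre scale, and the simple proportional recovery rule are illustrative assumptions, not code from his thesis.

```python
# Illustrative sketch of offset augmentation for a lane-keeping network.
# The geometry and recovery model are assumptions, not Dean Pomerleau's actual code.
import numpy as np


def shift_image(image: np.ndarray, offset_px: int) -> np.ndarray:
    """Translate the image horizontally, padding the exposed edge with the border column."""
    shifted = np.roll(image, offset_px, axis=1)
    if offset_px > 0:
        shifted[:, :offset_px] = image[:, :1]
    elif offset_px < 0:
        shifted[:, offset_px:] = image[:, -1:]
    return shifted


def corrected_steering(recorded_angle: float,
                       simulated_offset_m: float,
                       recovery_gain: float = 0.3) -> float:
    """Adjust the recorded steering label so it steers back toward the lane centre.

    A positive offset means the vehicle appears displaced to the right, so we steer
    left (negative) in proportion to the offset -- a deliberately simple recovery model.
    """
    return recorded_angle - recovery_gain * simulated_offset_m


# Example: a frame recorded while the car was well centred (steering label 0.0)
# becomes a training example for recovering from a 0.5 m offset.
frame = np.zeros((120, 160, 3), dtype=np.uint8)   # stand-in camera image
metres_per_pixel = 0.01                           # assumed camera geometry
offset_m = 0.5
augmented_frame = shift_image(frame, int(offset_m / metres_per_pixel))
augmented_label = corrected_steering(0.0, offset_m)   # now steers back toward centre
```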

KRIS: But assuming you can build a reliable autonomous vehicle that can react to all types of situations without error, what does the future for those vehicles actually look like? We’ll have more on that right after this break.

[AD BREAK]

KRIS: Welcome back to Moonshot - I’m Kristofor Lawson. Most people today own and drive their own car, but once vehicles can drive themselves, will car ownership be necessary at all?

KRIS: As we mentioned in our episode on Designing a Driverless City - Waymo and Uber are both working to develop their own fleets of autonomous vehicles, replacing individual cars with an on-demand, autonomous ride hailing system. Think of what Uber is today but the car that turns up will be driving itself.

KRIS: And this idea that you won’t own your own vehicle is becoming really popular. Although many manufacturers are working on autonomous vehicles that people might be able to purchase from around 2021 - most technology companies working in this space see car ownership as a thing of the past.

Chris Woods: There will be a fleet of these automated taxis running around picking people up which I think overall has a positive benefit that we don't then have a whole car park here at Bosch of people's vehicles sitting in there eight hours a day doing nothing. The utilisation of assets is much better and I think people will tend to operate or use mobility as a service in the future much more.

KRIS: That’s Chris Woods, the regional president of Chassis Systems Control at Bosch Australia.

KRIS: Bosch is working together with Mercedes-Benz to build a fleet of autonomous urban taxis that can be hailed at any time, removing the need for individuals to own cars, and it’s expected the service will have its final car design by around 2022. The average driver in the US and Australia only spends about an hour on the road each day, which means their car is parked somewhere for the other 23 hours doing absolutely nothing.

Sasha Lekach: If it's driving for you, the whole stat of cars sit idle 95% of the time. This is a way to really maximise how cars get used.

KRIS: That’s Sasha Lekach again. She says on-demand driverless vehicles unlock massive potential for mobility - not just for existing motorists, but for people who can’t drive.

Sasha Lekach: If you're too old or can't handle driving, this is an awesome option for you…. I mean drunk driving, it's gonna have a huge impact on accidents and things like that. So yeah, there's definitely benefits, I think it could have really cool potential.

KRIS: An on-demand car service will let people summon the right-sized car, when they want it, for as long as they need it - with fewer of the general expenses of owning and running a vehicle.

Cibby Pulikkaseril: Like when I was growing up, it was really important to own a car in Canada. But increasingly as people live urbanised lives, I think it's less important for them to own a car and to have shared mobility… If you want a van, you get a van. If you want a small car, you get a small car. And you don't have to worry about servicing and filling up gas or maintenance on other vehicles. You just have mobility at your fingertips.

Chris Woods: Look, I think there's a chance that my young children will never need to own a car, I think the need to own a car will definitely change, certainly for a long time, there will be a personal choice there, so if someone wants to own their own car and drive it, they can. But mobility as a service we talk about, which effectively is upon us today with Uber where you dial up your Uber taxi on your smartphone, it comes and picks you up and drops you off, that will happen in the future but without the driver there.

KRIS: Vehicles as a service promises a future full of car-free households. But this is a long-term vision - for now autonomous vehicles have to navigate roads filled with real human drivers, and that’s not an easy task.

Sasha Lekach: When you have a mix of human and robot controlled, you're gonna hit problems sharing the road with a robot, it's frustrating and can be annoying. Here in San Francisco where I live, there's cars being tested all the time and I routinely just drive around them because I'm like, “They are going so slow!” They're probably going the speed limit, I know, but no one drives on the street that slowly. So there’s going to be some friction with these vehicles as they get integrated into society. So it’s not going to be all smooth and cool and slick right away, there’s definitely going to be some problems.

KRIS: The end goal, of course, is to make roads safer by replacing every human driver with a level-five, fully automated vehicle. But that will not happen overnight. There will be a transition period where people and autonomous vehicles have to learn to share the road, and there are many different companies playing in this space, designing all different types of vehicles. So it’s important that we think about how we roll out this technology to make sure everyone is on the same page.

Cibby Pulikkaseril: The first step will be to have Level 4 vehicles. And these cars have to coexist with human-driven cars, and that's probably the most challenging state of all. If all the cars were self-driving, then at least they would have predictable behaviour. But because they interact with humans, the education probably is on humans on how to drive when there are autonomous vehicles around them. I know some cities have proposed things like dedicated lanes for self-driving cars, or maybe having certain regions where they’re confined. And those are probably the good first steps.

KRIS: Once these first level four vehicles hit the market, they will be pretty expensive. The sensors and computing equipment needed to drive these cars often double or triple the cost of the vehicle itself.

Dean Pomerleau: Both Uber and Waymo and Cruise, have the right model for these early deployments and that is, have a very expensive sophisticated car, with many sensors and a lot of computing power, and the cars, as I understand it, the Waymo and Cruise cars, have about $300,000 worth of sensing and computing on them, now. It's coming down as they begin to mass produce it, but for some time, it's pretty clear that the technology to do the autonomous driving will be in the neighbourhood at least, of the cost of the vehicle itself. So it will be prohibitively expensive, I think, if only for cost reasons, for people to own these systems, in the foreseeable future.

KRIS: Another barrier to the rollout of autonomous vehicles is the rivalry between tech companies. Autonomous vehicles create a strange dynamic, with research happening at both technology companies and traditional auto manufacturers. Waymo and Uber are pretty secretive about their testing, and few of the companies in this space share the test data they collect.

KRIS: So what’s going to happen if the roads are full of autonomous vehicles - all with their own sensors, and all designed in different ways? How do we make sure that there’s some kind of consistency in the way that these vehicles operate? And how do we make sure they don’t interfere with each other?

Cibby Pulikkaseril: I really wonder though when you start to have entire streets full of autonomous vehicles, how they're gonna deal with each other. That's gonna be... Are they going to be able to communicate with each other, but also will the sensors now start to interfere with each other, for example. You know, if you start having laser pulses flying everywhere, every autonomous vehicle is going to start picking up these pulses unless they have a very rigorous, robust method of reducing interference from them.
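One commonly discussed way to tackle that interference problem is to tag each sensor’s outgoing pulses with a unique code and reject returns that don’t correlate with it. The Python sketch below is a toy illustration of that idea only - not how Baraja or any other vendor actually builds their LIDAR - and the code length, noise level and threshold are arbitrary assumptions.

```python
# Toy illustration of pulse-coding to reject another sensor's laser returns.
# Not a real LIDAR implementation; parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(42)


def make_signature(length: int = 64) -> np.ndarray:
    """A pseudorandom +/-1 code unique to one sensor's pulse train."""
    return rng.choice([-1.0, 1.0], size=length)


def accept_return(received: np.ndarray, signature: np.ndarray, threshold: float = 0.5) -> bool:
    """Accept a return only if it correlates strongly with our own code."""
    correlation = float(np.dot(received, signature)) / len(signature)
    return correlation > threshold


own_code = make_signature()
foreign_code = make_signature()      # another vehicle's unrelated code

own_echo = own_code + rng.normal(0.0, 0.2, own_code.size)        # our pulse plus noise
foreign_echo = foreign_code + rng.normal(0.0, 0.2, own_code.size)

print(accept_return(own_echo, own_code))       # True: our own reflection
print(accept_return(foreign_echo, own_code))   # False: someone else's pulses
```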

KRIS: Some companies are beginning to address this problem. Bosch is part of the Autonomous Vehicles Alliance, which works with Volkswagen and Nvidia to try and standardise the way autonomous vehicles communicate, both within a vehicle’s components and beyond.

Chris Woods: Probably one of the interesting things there is what we call V2X communication - so vehicle to infrastructure, vehicle to vehicle communications - where different vehicles or devices in the network, which might be traffic lights or speed signs in the future, all of this data will be communicated directly to the vehicle. So it's very important to have standards in place such that that data can be shared.

KRIS: Once the whole industry develops a standardised way of communication, autonomous vehicles can potentially do a lot more than just drive themselves. They could then send messages to other vehicles on the road, and alert them to potential hazards so that they can reroute and get to their destination without delay.

Chris Woods: Some of the other use cases might be if you have a motorbike around a blind corner, where a driver can't see, the motorbike can be talking to the vehicle and telling the vehicle that there's a motorbike there and the vehicle can then act accordingly to reduce speed and avoid accidents before they happen.
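To give a sense of what having ‘standards in place’ could mean in practice, here is a small, hypothetical Python sketch of a shared hazard message like the one in Chris Woods’ blind-corner example. The field names and the JSON encoding are illustrative assumptions only - real V2X deployments use standardised binary message sets rather than ad-hoc payloads like this.

```python
# Hypothetical V2X hazard message -- field names and encoding are illustrative only.
import json
from dataclasses import dataclass, asdict


@dataclass
class HazardMessage:
    sender_id: str       # vehicle or roadside unit broadcasting the alert
    hazard_type: str     # e.g. "stopped_vehicle", "motorcycle_blind_corner"
    latitude: float
    longitude: float
    timestamp_utc: str   # ISO-8601, so every manufacturer parses time the same way

    def to_wire(self) -> bytes:
        """Serialise to a byte payload any receiver can decode."""
        return json.dumps(asdict(self)).encode("utf-8")


# The motorbike around the blind corner announces itself, and a nearby car
# built by a different manufacturer decodes the same payload and slows down.
msg = HazardMessage("bike-042", "motorcycle_blind_corner", -37.8136, 144.9631,
                    "2018-11-21T03:00:00Z")
payload = msg.to_wire()
decoded = HazardMessage(**json.loads(payload))
print(decoded.hazard_type)
```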

KRIS: People also need to become comfortable taking their hands off the wheel. It’s an unnatural experience at first, placing your trust in a machine to do something that took you years of practice and that, for many people, is actually a badge of honour. Showing people how the technology works could help. It might be a novelty to take a ride in an autonomous vehicle at first, but the experience really should be pretty mundane - and that’s probably a good thing.

Sasha Lekach: I got a self-driving Uber when I was in Pittsburgh two years ago, just accidentally it happened to come up when I ordered an Uber. It wasn't actually self-driving, there was two people, but for a portion of the ride it was self-driving. And just that mere exposure made me so much more excited and into self-driving cars.

Sasha Lekach: There was a really interesting study that the Society of Automotive Engineers did in Florida earlier this year, where they basically just had self-driving cars open to the public, and had a stretch of freeway that they cordoned off and said we're just doing test runs on here. And they let people just get in the vehicle… it was just regular people. It wasn't industry folks, it wasn't car people, it wasn't tech people. Gave them a free ride in a self-driving car, and then afterwards people were like, “Oh it wasn't as bad as I thought” Or, “Oh, I wasn't as scared.” Even just that 20 minute ride really changed their perception, and I thought that was an interesting indicator of what needs to happen. People just need to be exposed to them and see what it is. So that's mostly the problem, that they seem really mysterious right now.

KRIS: The hardest question to answer about autonomous vehicles is when they will actually reach the market. 2020 seems to be when you’re most likely to start seeing some autonomous vehicles in a city near you, and they will likely be part of a fleet. Waymo is best placed to provide that service - they have the most data and the best reliability, plus they’re already rolling the system out in the US.

KRIS: Uber could also be a contender - it has the much-needed infrastructure and an existing user base worldwide.

KRIS: Then there’s Tesla, which has long said that their vehicles could potentially become fully self-driving with a simple update to their Autopilot system. However, that timeline is frequently pushed back, and given their fairly limited track record, you would anticipate that they might need more data and research before it becomes a reality.

KRIS: And let’s not forget the traditional car manufacturers, who might be a couple of years behind but could dramatically lower the price of entry.

KRIS: But more than 20 years on from the No Hands Across America trip, Dean says we’re not likely to see full autonomy in production cars any time soon.

Dean Pomerleau: My car finally after 25 years, I have a Honda, that has lane keeping assist, and lane departure warning. And those are the kind of things that I think, we will be seeing more of, maybe more sophisticated versions of them, but not too much more, in the way of fully automated driving for decades to come.

KRIS: Does that disappoint you, that you were driving an autonomous vehicle in the early '90s, and now your existing Honda doesn't have those same abilities?

Dean Pomerleau: I mean I think at the time, in fact in 1995, we were doing this major demonstration of the state-of-the-art self driving cars, as part of the Automated Highway System project, which was a Federal Highway Administration sponsored project. It was originally scheduled to be $100 million project, to do a proof of concept, in the mid '90s of self-driving cars. And so we gave rides to, I think 3,000 people from all over. Both local, just people off the street, as well as the head of the department of transportation in the US, over a three day period. And as part of that pitch, people were asking us, "When can we expect to see this on the road?" And at the time, we said, "It will probably be about 20 years before you start seeing self driving cars." Just because of all the hurdles, both technically and administratively, and 20 years later, we're just about beginning to see these sorts of deployments.

KRIS: You were pretty spot on.

Dean Pomerleau: For example, airbags took 30 years between the time that there was a proof of concept, and when they were actually commercially deployed. And they still haven't received, or had full penetration. There are still lots of cars on the roads today, that don't have airbags, just because they're 15 or 20 years old. And so, it will be like that with self driving cars. GM, their Super Cruise system, as well as Tesla's autopilot, have reasonably competent level two, or level three systems today. And those will get better, and eventually, you'll be able to take your eyes off the road, but it will be, I would say, at least another decade, before your average person would be able to buy a system that you don't have to be constantly monitoring.