November 14, 2018

#S02E10 - Self-Driving Cars: Building An Autonomous Vehicle (Part 1)


Most cars these days have some level of automation such as cruise control or lane guidance, but as technology advances the experience of owning a car and even driving one is about to change drastically. In just a few years autonomous vehicles will start flooding the market and it's just a matter of time before we don't need to drive at all. But how did we get to this point and when will the transition happen?

  • Dean Pomerleau - Autonomous vehicle consultant and early pioneer.
  • Sasha Lekach - Transportation Reporter at Mashable
  • Cibby Pulikkaseril - Co-Founder and CTO of Baraja
  • Chris Woods - Regional President of Chassis Systems Control at Bosch Australia

Dean and Todd's 'No Hands Across America' blog:


This episode of Moonshot was hosted by Kristofor Lawson (@kristoforlawson) and Andrew Moon (@moonytweets).

Research for this episode by Patrick Laverick.

Our theme music is by Breakmaster Cylinder.

And our cover artwork is by Andrew Millist.


[Humans Need Not Apply - "Self-driving cars aren't the future: they're here and they work. Self-driving cars have traveled hundreds of thousands of miles up and down the California coast and through cities -- all without human intervention. The question is not if they'll replace cars, but how quickly. They don’t need to be perfect, they just need to be better than us. Human drivers, by the way, kill 40,000 people a year with cars just in the United States. Given that self-driving cars don’t blink, don’t text while driving, don’t get sleepy or stupid, it’s easy to see them being better than humans because they already are.”]

KRIS: Welcome to Moonshot. I’m Kristofor Lawson. And you’ve just been listening to YouTuber CGP Grey discuss the future of autonomous vehicles in his video essay from 2014 - Humans Need Not Apply - a future that is already here.

KRIS: Now if you’re listening to this podcast while driving, chances are your car already has some autonomous features.

KRIS: Cruise control has been around since the 1940s and is one of the first examples of automating part of the driving experience. While this feature is only considered to be the first level of automation, it really shows what’s possible as more parts of the driving experience become autonomous, to the point where steering wheels and gear sticks become an optional extra.

KRIS: But why aren’t we riding around in cars that drive themselves and how long will it be before we can walk into a car dealership and pick up a brand new self-driving car? That’s coming up on this episode of Moonshot.

KRIS: And before we get started - a quick disclaimer. As with our episode on Designing a Driverless City - one of the companies we’ll talk about in this episode is Uber - a company that my co-host Andrew Moon actually works for, so for that reason he’s going to sit this one out.

KRIS: But before we get into it, here’s a word from our sponsors.

[First Ad]

Dean Pomerleau: The vehicle itself, when we started out, was rather underwhelming. We would creep along at just a few centimetres per second on these bike paths. It was not the most impressive to begin with. By the end, we had gotten a faster vehicle and I had optimised things to the point where we could or ALVINN could actually drive at up to 55 miles per hour on the highway.

KRIS: This is Dean Pomerleau - one of the early pioneers of autonomous vehicles.

Dean Pomerleau: … I'm a PhD. Got my PhD in 1992 from Carnegie Mellon University and I was one of the first people, at least in the United States, to work in the area of autonomous cars.

KRIS: Companies like Waymo and Uber have led the way in autonomous vehicle development for the past few years. But the concept has been around much longer than you’d think, with researchers and engineers developing autonomous vehicles for decades.

KRIS: Dean started his career working on artificial neural networks, which are the backbone of autonomous vehicles and many artificial intelligence systems. But back in the late 80s the technology was brand new, and nobody really had a practical application for it - and that’s where Dean comes in.

Dean Pomerleau: I got into self-driving cars through the backdoor by trying to find an interesting application of this technology of artificial neural networks that everyone was excited about. I, as a naïve, new PhD student, thought hey, maybe we can apply these neural networks to drive a self-driving car, so that became my PhD project.

KRIS: Dean combined his interest in artificial neural networks with a project Carnegie Mellon was working on for the Department of Defence. It was called NavLab, and there were already plenty of people at the university working on the idea of having a car that drove itself - but most of the other concepts involved a vehicle being pre-programmed to move a particular way, or using edge-detection sensors. It was Dean who wondered whether this new concept of artificial neural networks could be used to help the car learn to drive on its own.

Dean Pomerleau: And I came in and I said, "Maybe we can use an end-to-end learning system based on artificial neural networks. Maybe we can feed an artificial retina the raw camera image coming from the camera on board the NavLab and train it by watching a human to steer," to figure out what the right steering direction was just looking at the images themselves.

KRIS: Dean named his new system the Autonomous Land Vehicle In a Neural Network, or ALVINN.

Dean Pomerleau: The way it worked was, a human driver would drive the NavLab for about five minutes while the ALVINN system watched. That is, it adjusted the weights in the artificial neural network to map the images that it was being fed as input into a steering direction. You know, turn this hard to the left in order to follow the curve that it saw in the video camera image that it was being fed.

Dean Pomerleau: And so it was basically simply learning by example. Learning by watching a human drive, how to mimic that same stimulus response. You know, you see a curve to the right, turn the steering wheel to the right.

KRIS: Now - before you go putting any pictures in your mind about what this vehicle may have looked like - remember that this was the late 80s and early 90s… computing systems were far less powerful than they are today, and they were significantly larger - which in turn meant you needed a bigger vehicle than the sleek autonomous cars that you may see today.

Dean Pomerleau: The first one, it was a blue Chevy panel van, very much like your larger ambulances that you see today. It had a 5,000 watt generator on board and this very bulky supercomputer, mini supercomputer basically that another faculty member at CMU had designed and had fabricated on board the vehicle that sucked down most of that 5,000 watts of power.

Dean Pomerleau: The interesting thing was, I did a calculation a couple years ago and it was impressive at the time, but the amount of computing power we had was like one sixth of the computing power that an iWatch, an Apple iWatch, has today.

KRIS: Now something we’ve mentioned on this show before is that autonomous systems require an awful lot of data. On modern autonomous vehicles that might be a gigabyte a second, due to all the sensors involved and the complexity of the data being collected. But back in the 80s and 90s that wasn’t the case - computers weren’t that powerful, and the camera being used to collect the data was by today’s standards fairly basic.

Dean Pomerleau: We had a 30x32 pixel input retina that was about 1K pixels of input and only four hidden neurons in a single layer connected with 30 different steering directions ranging from hard left to hard right. So we had about 4,000 connections versus tens of millions of connections in the latest deep neural networks that people are using to drive and to do many other things today. And that was solely because it took so long to run these neural networks, that we couldn't afford to have a lot of extra connections. But it turns out, at least for the kinds of highway driving and path-following that we were doing, that low resolution and a small number of connections was sufficient.
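Dean’s numbers pin down just how small ALVINN was: a 30x32 retina (960 inputs), four hidden units, and 30 steering outputs comes to roughly 4,000 weights. As a rough sketch only - the weights here are random placeholders, and the activation functions and training code are not ALVINN’s - that shape looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions as Dean describes them: a 30x32 "retina" (~1K inputs),
# 4 hidden units, and 30 outputs spanning hard left to hard right.
N_IN, N_HIDDEN, N_OUT = 30 * 32, 4, 30

# Randomly initialised weights stand in for a trained network.
# 960*4 + 4*30 = 3,960 connections -- Dean's "about 4,000".
W1 = rng.normal(0, 0.1, (N_HIDDEN, N_IN))
W2 = rng.normal(0, 0.1, (N_OUT, N_HIDDEN))

def steer(image):
    """Map a 30x32 camera image to one of 30 steering directions."""
    x = image.reshape(-1)        # flatten the retina to a vector
    h = np.tanh(W1 @ x)          # 4 hidden units
    out = W2 @ h                 # one activation per steering bin
    return int(np.argmax(out))   # 0 = hard left ... 29 = hard right

frame = rng.random((30, 32))     # stand-in for a camera frame
direction = steer(frame)
```

Training then consisted of nudging those weights so the network’s chosen bin matched the human driver’s steering angle for each frame.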

KRIS: Now most of this early autonomous driving occurred on the grounds of Carnegie Mellon University… but every PhD requires a big project, and for Dean that project was to drive his ALVINN vehicle 100 miles from Pittsburgh to Lake Erie.

Dean Pomerleau: ...The system drove virtually that whole 100 miles under automated control. There was very light traffic. I took the vehicle out on a Sunday morning to make sure there wasn't too much traffic on the highway, so it was mostly just steering, not controlling the speed, but it was a very exciting moment, to have driven that far under automated control back in 1991.

KRIS: When you took the vehicle on this 100 mile drive, did you have to get specific permissions to take it out on the road? Or did you just go for it?

Dean Pomerleau: We pretty much just went for it. I strongly suspect if we had been candid with the university lawyers about what we were doing, we probably would have been prohibited from doing it. It was a don't ask, don't tell sort of situation.

Dean Pomerleau: We were very attentive. I was very aware of the shortcomings of the system, having developed it myself, so I was always in the driver's seat with my hands hovering over the wheel and had a pretty good idea, looking at the road ahead myself, when the system was likely to work well and when it was likely to have problems.

KRIS: Dean built on the success of ALVINN, upgrading the neural network to one which was actually custom designed to be used on an autonomous vehicle - resulting in a system that would watch the road and detect lane markings, meaning the car could steer itself in far more complex conditions.

KRIS: The result was the Rapidly Adapting Lateral Position Handler, or RALPH for short. Dean and a colleague, Todd Jochem, decided to take RALPH on a 2,800 mile or 4,500 kilometre cross country trip from Pittsburgh to San Diego. They called it No Hands Across America.

Dean Pomerleau: That was actually a pun on Hands Across America, which was a fundraiser, I think for farm aid at the time. It was a tongue in cheek extension of that, No Hands Across America. It had its own logo with basically a steering wheel with hands waving above the steering wheel.

KRIS: The journey was funded by the National Highway Traffic Safety Administration, as well as the Federal Highway Administration, but Dean and Todd sold t-shirts to pay for additional expenses, like hotels and food. And they didn’t bother getting any permission from anyone - it was a lot different from today’s highly structured, well-funded vehicle tests.

Dean Pomerleau: It was so early that nobody was even thinking about this. We did get some PR along the trip. A Business Week reporter rode with us through part of Missouri. But other than that, we kept a pretty low profile with the university attorneys and any other administrative people who we would have had to get the permission from. I still don't know how companies do it today in the litigious environment we're in, particularly going across state lines. It could get very tricky I think if you tried to get permission.

KRIS: Besides a short test run from Pittsburgh to Washington DC, RALPH had never gone on a long distance drive. Nobody else had tried this before; it was unprecedented. Dean and Todd would be on high alert, watching for anything that might trip up the guidance systems. The car drove itself for the most part... although occasionally, when road conditions changed, a human did have to take the wheel.

Dean Pomerleau: We stayed mostly on the highway, and so we expected there to be pretty nice roads. The longest stretch where we were unable or where RALPH was unable to drive was a stretch of very freshly painted pavement that didn't even have lane markers on it yet. And so for a few miles there, we had to take over. There were a couple times where there was very low sun angle and we had all kinds of bugs splattered on our windshield and stuff where heading into the setting sun going west that RALPH had a little bit of trouble at sunset time.

KRIS: The car steered itself for 98 per cent of the journey, and once it reached 55 miles per hour, cruise control took over as well. However because they were travelling slower than many of the other vehicles - Dean and Todd drove mostly in the slow lane… which meant having to deal with other issues - like exit ramps.

Dean Pomerleau: We had implemented a very crude solution, and that was to basically blank out through a button on the keyboard of the computer that was next to the driver’s seat. You could hit a button, and it would basically mask out the right half or the left half of the image in order to tell the system, ignore that part of the scene to prevent it from getting locked on to a lane marker that was peeling off to take an exit ramp. So the system was able to drive with just one eye open basically, looking at the left half of the image and continue straight rather than get drawn off to follow an exit.
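The "one eye open" trick Dean describes amounts to zeroing out half of the input image before it reaches the lane tracker. This is an illustrative reconstruction only - the actual key binding and masking code aren’t described in the episode:

```python
import numpy as np

def mask_half(image, side):
    """Blank out one half of the camera frame so the lane tracker
    can't lock onto markings peeling off toward an exit ramp."""
    masked = image.copy()
    mid = image.shape[1] // 2
    if side == "right":
        masked[:, mid:] = 0.0   # ignore the right half of the scene
    elif side == "left":
        masked[:, :mid] = 0.0
    return masked

frame = np.ones((30, 32))            # stand-in for a camera frame
one_eye = mask_half(frame, "right")  # continue straight past the exit
```

With the right half blanked, the only lane marker left in view is the one continuing straight ahead, so the system follows it by default.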

KRIS: Now Dean and Todd actually kept a detailed blog of their trip which is still online. We’ll put a link to that in the show notes. But one of the fascinating stories that they share in this blog is actually about dealing with the heat.

Dean Pomerleau: It was fairly hot going across the United States in the middle of summer. We had trouble keeping the computing that was in the back of the vehicle cool. At one point, things were overheating, and we had to jerry-rig a duct from the air-conditioning, which was upfront where we, the passengers, were, all the way to the back of the van where all the hot computing hardware was. In the blog, we sort of tongue in cheek played it like it was the Apollo 13 mission, trying to use what we had on board the vehicle to solve the terrible problems we were running into, using the sunscreen that we had to block the sun coming in on the windshield as a duct to duct the cool air back to our overheating computers and solving our problems in a clever sort of way.

KRIS: Almost a decade on from the No Hands Across America trip, DARPA created a race for autonomous vehicles, with a prize of $1 million for the first team to successfully complete a 142 mile or approximately 230 kilometre course. Nobody won the first race in 2004. The furthest any team got was Carnegie Mellon University’s Sandstorm, which traveled 12 kilometres before being caught on a rock and catching fire.

[Darpa Grand Challenge 2005 - Announcer: "And we have movement from Stanley ladies and gentlemen, the start of the DARPA Grand Challenge…"]

KRIS: The following year they held another race, in which five vehicles managed to cross the finish line, with the Stanford Racing Team crossing the line first in just under seven hours.

KRIS: After No Hands Across America, it would take another two decades for a self-driving car to drive itself across the United States. Delphi took one of its autonomous vehicles from San Francisco to New York in 2015. Of course this time around, the car was completely autonomous, controlling steering, acceleration, lane changes and everything.

[News Report: “Delphi Automotive is launching a self-driving car from San Francisco tomorrow on a 3,500 mile trip to New York City. The computer driven car will be tested in a variety of conditions that could never be tested in a lab, from changing weather and terrain, to potential road hazards, but don’t worry - there will be a human inside, just in case.”]

KRIS: And we’ll continue our deep dive on autonomous vehicles - right after this break.


KRIS: Welcome back to Moonshot - I’m Kristofor Lawson - and as we mentioned at the start of the show, most cars already have some degree of automation. And it turns out there are actually six different levels of autonomy, as determined by the Society of Automotive Engineers.

Chris Woods: The lowest level obviously is no autonomy. Then we start at level one systems which is effectively what people know as a cruise control system so there is speed regulation of the vehicle but the driver still needs to maintain overall control there.

KRIS: This is Chris Woods, he’s the regional president of Chassis System Control at Bosch Australia. He works on autonomous vehicle components, as well as other safety features in cars. Bosch - in case you’re not aware - manufactures components for a lot of the vehicles that you drive today. And Chris broke down the different levels of autonomy for us.

Chris Woods: Level two systems are things like traffic jam assist, and these allow more automated functions, where the speed of the vehicle is controlled from zero to maximum speed potentially, but the driver always needs to keep monitoring that system so if something goes wrong, they need to be ready to take over.

KRIS: Level two autonomy is what you might see on Tesla’s Autopilot system or similar systems from Volvo or Mercedes-Benz. This is where most autonomous systems on production cars sit right now.

Chris Woods: Level three is when effectively the system can operate independently of the driver, they don't need to be there monitoring what's going on, they can perform some other functions, potentially reading emails in the future, but very much in limited driving situations.

KRIS: Now Audi claims its A8 is the first publicly available vehicle to reach level 3 autonomy. The car can drive itself on divided roads at up to 60 kilometres per hour, and manage itself in traffic without user assistance. However most governments around the world don’t yet allow vehicles to be placed into this level of autonomy - something which many manufacturers are working to change.

Chris Woods: Level four then becomes again more autonomous. Urban taxis are a level four type system that we're developing at the moment. Again, somewhat limited in geography, so when the urban taxis come, there will be dedicated routes where those vehicles can drive.

KRIS: Level four systems are cars which can for the most part drive themselves. They will have a steering wheel and users may have to take over in particular conditions. But mostly they are completely autonomous.

Chris Woods: And then, level five is effectively fully ubiquitous, unlimited, automated driving.

KRIS: Level five autonomy is the holy grail of self-driving cars. It’s where cars are completely autonomous, they are built with no steering wheel, no pedals, no gearsticks. None of that stuff that allows you to control the vehicle. And this is where companies like Waymo, Uber, and Cruise will eventually be heading. And a lot of the development is happening in Pittsburgh.
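Chris’s breakdown follows the SAE J3016 scale, which can be summarised as a simple enum. The short names below are paraphrases of the levels as described above, not official SAE terminology:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE driving-automation levels, paraphrased from the episode."""
    NO_AUTOMATION = 0      # the driver does everything
    DRIVER_ASSISTANCE = 1  # e.g. cruise control; driver retains control
    PARTIAL = 2            # e.g. traffic jam assist; driver must monitor
    CONDITIONAL = 3        # system drives itself in limited situations
    HIGH = 4               # self-driving on dedicated routes/geographies
    FULL = 5               # ubiquitous, unlimited automated driving

def driver_must_monitor(level):
    # At levels 0-2 the human is responsible for monitoring at all times;
    # from level 3 up, the system takes over that responsibility.
    return level <= SAELevel.PARTIAL
```

The key legal and engineering boundary sits between levels 2 and 3, where responsibility for monitoring the road shifts from human to machine.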

Sasha Lekach: Places like Pittsburgh, it has a long history of robotics there so that's kind of why that's grown there. The streets of Pittsburgh are very difficult to navigate, it's an older city.

KRIS: This is Sasha Lekach. She’s a transportation reporter at Mashable.

Sasha Lekach: So there's not as much actual testing out in the public there, but Arizona has been a hotbed for this. Same with Silicon Valley in California.

KRIS: Many states in the US have regulated autonomous vehicles, and earlier this year the federal government updated their guidelines for testing these systems. But you’ll only see self-driving cars on the street in a few places, with testing centred around Silicon Valley, Arizona and Pittsburgh.

Sasha Lekach: Arizona, I don't know if you've ever been, but most of it, especially in big suburban areas outside of Phoenix are warm most of the year, very big, wide streets and boulevards, it's not super congested, it's very grid like systems, it's not like you're winding through random roads, there's not very many one ways. So these are ideal conditions for these type of vehicles.

KRIS: Most autonomous vehicles being tested right now can only function in good weather and on clear days. Changes in the road surface and weather can make it a lot harder for the cars to navigate.

Sasha Lekach: The roads get cold, the roads get snowy, they get slushy, they get slick. Whatever it is, that's another challenge, and another situation that they have to be able to handle, and a lot of them aren't ready for that. They can only handle when the road is nice and toasty at 90 degrees or whatever it is.

Cibby Pulikkaseril: I grew up in Canada, where weather conditions are pretty brutal in the winter. When we have blizzards in Canada, all the drivers slow down. I think an autonomous car has that same insight, which is when visibility is poor, when the range on sensors is degraded, it can slow down. But because it has a diversity of sensors, it's actually an augmented intelligence. So compared to a human, where we only have eyesight from our stereo vision, the combination of radar, LiDAR, and vision is going to enable the cars to be superpowered in bad conditions.

KRIS: That’s Cibby Pulikkaseril. He’s the Co-Founder and CTO of Baraja, a company that wants to be the eyes of autonomous vehicles. They build LiDAR technology for self-driving cars. And Cibby sees their technology working together with a number of different sensors to map out a detailed picture of the road ahead.

Cibby Pulikkaseril: And the sensors are nicely overlapped in that ... The cameras give you colour perception and high-resolution images, but they don't work as well at night or at all at night. LiDAR works at night and gives you high-resolution. LiDAR can work in some of the conditions like rain and light snow. Radar works in lots of different weather conditions, including fog, so the combination of all of these things gives you a really enhanced sensor suite.
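Cibby’s description of how the sensors overlap can be boiled down to a lookup table. This is a deliberately simplistic sketch of the redundancy argument; real fusion stacks weigh probabilistic confidences rather than treating each sensor as simply working or not:

```python
# Which sensors remain useful in which conditions, per Cibby's
# description. A toy lookup, not a real sensor-fusion algorithm.
SENSOR_WORKS = {
    "camera": {"day"},                                   # weak or blind at night
    "lidar":  {"day", "night", "rain", "light_snow"},    # works in the dark
    "radar":  {"day", "night", "rain", "light_snow", "fog"},
}

def usable_sensors(condition):
    """Return the sensors expected to keep working in a given condition."""
    return sorted(s for s, ok in SENSOR_WORKS.items() if condition in ok)
```

In fog, only radar survives in this toy model - which is exactly the point: no single sensor handles every condition, but the suite as a whole always has something to fall back on.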

KRIS: The companies developing these cars have focused on building systems that implement complex sensor technology. It’s all really important to make sure the cars work, and they’re out there testing them on public roads as extensively as possible. But what happens when it all goes wrong?

[News Report: Woman’s Voice: "It is tough to watch, an Uber driver going hands free and slamming into a 49-year-old woman, and it was all captured on Uber’s in-car cameras. That horrifying video was released today by police in Tempe, Arizona."

Man’s Voice: "And tonight the investigation by the NTSB continues into what is the first known pedestrian fatality in a self-driving car."]

KRIS: That’s coming up in the next episode of Moonshot.