|
WEBVTT |
|
|
|
00:00.000 --> 00:03.120 |
|
The following is a conversation with Chris Urmson.
|
|
|
00:03.120 --> 00:06.040 |
|
He was the CTO of the Google self driving car team, |
|
|
|
00:06.040 --> 00:08.880 |
|
a key engineer and leader behind the Carnegie Mellon |
|
|
|
00:08.880 --> 00:11.240 |
|
University, autonomous vehicle entries |
|
|
|
00:11.240 --> 00:14.120 |
|
in the DARPA Grand Challenges and the winner |
|
|
|
00:14.120 --> 00:16.160 |
|
of the DARPA Urban Challenge. |
|
|
|
00:16.160 --> 00:19.480 |
|
Today, he's the CEO of Aurora Innovation, |
|
|
|
00:19.480 --> 00:21.360 |
|
an autonomous vehicle software company. |
|
|
|
00:21.360 --> 00:23.600 |
|
He started it with Sterling Anderson,
|
|
|
00:23.600 --> 00:26.000 |
|
who was the former director of Tesla Autopilot |
|
|
|
00:26.000 --> 00:30.160 |
|
and Drew Bagnell, Uber's former autonomy and perception lead.
|
|
|
00:30.160 --> 00:33.160 |
|
Chris is one of the top roboticists and autonomous vehicle
|
|
|
00:33.160 --> 00:37.440 |
|
experts in the world and a long time voice of reason |
|
|
|
00:37.440 --> 00:41.320 |
|
in a space that is shrouded in both mystery and hype. |
|
|
|
00:41.320 --> 00:43.600 |
|
He both acknowledges the incredible challenges |
|
|
|
00:43.600 --> 00:46.560 |
|
involved in solving the problem of autonomous driving |
|
|
|
00:46.560 --> 00:49.680 |
|
and is working hard to solve it. |
|
|
|
00:49.680 --> 00:52.440 |
|
This is the Artificial Intelligence Podcast. |
|
|
|
00:52.440 --> 00:54.720 |
|
If you enjoy it, subscribe on YouTube, |
|
|
|
00:54.720 --> 00:57.920 |
|
give it five stars on iTunes, support it on Patreon, |
|
|
|
00:57.920 --> 00:59.760 |
|
or simply connect with me on Twitter |
|
|
|
00:59.760 --> 01:03.280 |
|
at Lex Fridman, spelled F R I D M A N.
|
|
|
01:03.280 --> 01:09.160 |
|
And now, here's my conversation with Chris Urmson.
|
|
|
01:09.160 --> 01:11.960 |
|
You were part of both the DARPA Grand Challenge |
|
|
|
01:11.960 --> 01:17.040 |
|
and the DARPA Urban Challenge teams at CMU with Red Whittaker.
|
|
|
01:17.040 --> 01:19.720 |
|
What technical or philosophical things |
|
|
|
01:19.720 --> 01:22.280 |
|
have you learned from these races? |
|
|
|
01:22.280 --> 01:26.640 |
|
I think the high order bit was that it could be done. |
|
|
|
01:26.640 --> 01:32.880 |
|
I think that was the thing that was incredible about the first |
|
|
|
01:32.880 --> 01:36.440 |
|
of the Grand Challenges, that I remember I was a grad |
|
|
|
01:36.440 --> 01:41.440 |
|
student at Carnegie Mellon, and there
|
|
|
01:41.440 --> 01:46.320 |
|
was kind of this dichotomy of it seemed really hard, |
|
|
|
01:46.320 --> 01:48.800 |
|
so that would be cool and interesting. |
|
|
|
01:48.800 --> 01:51.720 |
|
But at the time, we were the only robotics |
|
|
|
01:51.720 --> 01:54.960 |
|
institute around, and so if we went into it and fell |
|
|
|
01:54.960 --> 01:58.320 |
|
on our faces, that would be embarrassing.
|
|
|
01:58.320 --> 02:01.160 |
|
So I think just having the will to go do it, |
|
|
|
02:01.160 --> 02:03.360 |
|
to try to do this thing that at the time was marked |
|
|
|
02:03.360 --> 02:07.120 |
|
as darn near impossible, and then after a couple of tries, |
|
|
|
02:07.120 --> 02:11.360 |
|
be able to actually make it happen, I think that was really |
|
|
|
02:11.360 --> 02:12.360 |
|
exciting. |
|
|
|
02:12.360 --> 02:15.120 |
|
But at which point did you believe it was possible? |
|
|
|
02:15.120 --> 02:17.000 |
|
Did you, from the very beginning, |
|
|
|
02:17.000 --> 02:18.360 |
|
did you personally, because you're |
|
|
|
02:18.360 --> 02:20.320 |
|
one of the lead engineers, you actually |
|
|
|
02:20.320 --> 02:21.800 |
|
had to do a lot of the work? |
|
|
|
02:21.800 --> 02:23.840 |
|
Yeah, I was the technical director there, |
|
|
|
02:23.840 --> 02:26.120 |
|
and did a lot of the work, along with a bunch |
|
|
|
02:26.120 --> 02:28.440 |
|
of other really good people. |
|
|
|
02:28.440 --> 02:29.760 |
|
Did I believe it could be done? |
|
|
|
02:29.760 --> 02:31.120 |
|
Yeah, of course. |
|
|
|
02:31.120 --> 02:33.400 |
|
Why would you go do something you thought was impossible, |
|
|
|
02:33.400 --> 02:34.880 |
|
completely impossible? |
|
|
|
02:34.880 --> 02:36.280 |
|
We thought it was going to be hard. |
|
|
|
02:36.280 --> 02:38.080 |
|
We didn't know how we were going to be able to do it.
|
|
|
02:38.080 --> 02:42.880 |
|
We didn't know if we'd be able to do it the first time. |
|
|
|
02:42.880 --> 02:46.000 |
|
Turns out we couldn't. |
|
|
|
02:46.000 --> 02:48.400 |
|
That, yeah, I guess you have to. |
|
|
|
02:48.400 --> 02:52.920 |
|
I think there's a certain benefit to naivete, |
|
|
|
02:52.920 --> 02:55.400 |
|
that if you don't know how hard something really is, |
|
|
|
02:55.400 --> 02:59.560 |
|
you try different things, and it gives you an opportunity |
|
|
|
02:59.560 --> 03:04.080 |
|
that others who are wiser maybe don't have. |
|
|
|
03:04.080 --> 03:05.680 |
|
What were the biggest pain points? |
|
|
|
03:05.680 --> 03:09.360 |
|
Mechanical, sensors, hardware, software, algorithms |
|
|
|
03:09.360 --> 03:12.760 |
|
for mapping, localization, just general perception, |
|
|
|
03:12.760 --> 03:15.440 |
|
control? Like, hardware or software, first of all?
|
|
|
03:15.440 --> 03:17.840 |
|
I think that's the joy of this field, |
|
|
|
03:17.840 --> 03:20.040 |
|
is that it's all hard. |
|
|
|
03:20.040 --> 03:25.160 |
|
And that you have to be good at each part of it. |
|
|
|
03:25.160 --> 03:32.280 |
|
So for the urban challenges, if I look back at it from today, |
|
|
|
03:32.280 --> 03:36.200 |
|
it should be easy today. |
|
|
|
03:36.200 --> 03:38.880 |
|
That it was a static world. |
|
|
|
03:38.880 --> 03:40.720 |
|
There weren't other actors moving through it. |
|
|
|
03:40.720 --> 03:42.440 |
|
That is what that means. |
|
|
|
03:42.440 --> 03:47.080 |
|
It was out in the desert, so you get really good GPS. |
|
|
|
03:47.080 --> 03:51.320 |
|
So that helped, and we could map it roughly.
|
|
|
03:51.320 --> 03:55.160 |
|
And so in retrospect now, it's within the realm of things |
|
|
|
03:55.160 --> 03:57.800 |
|
we could do back then. |
|
|
|
03:57.800 --> 03:59.200 |
|
Just actually getting the vehicle, |
|
|
|
03:59.200 --> 04:00.680 |
|
and there's a bunch of engineering work |
|
|
|
04:00.680 --> 04:04.200 |
|
to get the vehicle so that we could control and drive it. |
|
|
|
04:04.200 --> 04:09.520 |
|
That's still a pain today, but it was even more so back then. |
|
|
|
04:09.520 --> 04:12.920 |
|
And then the uncertainty of exactly what they wanted us |
|
|
|
04:12.920 --> 04:17.080 |
|
to do was part of the challenge as well. |
|
|
|
04:17.080 --> 04:19.360 |
|
Right, you didn't actually know the track, they were hiding it.
|
|
|
04:19.360 --> 04:21.520 |
|
You knew approximately, but you didn't actually |
|
|
|
04:21.520 --> 04:23.560 |
|
know the route that was going to be taken.
|
|
|
04:23.560 --> 04:26.560 |
|
That's right, we didn't even really, |
|
|
|
04:26.560 --> 04:28.640 |
|
the way the rules had been described, |
|
|
|
04:28.640 --> 04:29.840 |
|
you had to kind of guess. |
|
|
|
04:29.840 --> 04:33.440 |
|
So if you think back to that challenge, |
|
|
|
04:33.440 --> 04:37.000 |
|
the idea was that the government would give us, |
|
|
|
04:37.000 --> 04:40.360 |
|
the DARPA would give us a set of waypoints |
|
|
|
04:40.360 --> 04:44.240 |
|
and kind of the width that you had to stay within between the line |
|
|
|
04:44.240 --> 04:46.840 |
|
that went between each of those waypoints. |
|
|
|
04:46.840 --> 04:49.280 |
|
And so the most devious thing they could have done |
|
|
|
04:49.280 --> 04:53.720 |
|
is set a kilometer wide corridor across a field of scrub |
|
|
|
04:53.720 --> 04:58.520 |
|
brush and rocks and said, go figure it out. |
|
|
|
04:58.520 --> 05:02.200 |
|
Fortunately, it turned into basically driving along |
|
|
|
05:02.200 --> 05:06.800 |
|
a set of trails, which is much more relevant to the application |
|
|
|
05:06.800 --> 05:08.760 |
|
they were looking for. |
|
|
|
05:08.760 --> 05:12.080 |
|
But no, it was a hell of a thing back in the day. |
|
|
|
05:12.080 --> 05:16.640 |
|
So the legend, Red, was kind of leading that effort |
|
|
|
05:16.640 --> 05:19.120 |
|
in terms of just broadly speaking. |
|
|
|
05:19.120 --> 05:22.040 |
|
So you're a leader now. |
|
|
|
05:22.040 --> 05:25.000 |
|
What have you learned from Red about leadership? |
|
|
|
05:25.000 --> 05:26.360 |
|
I think there's a couple of things. |
|
|
|
05:26.360 --> 05:30.880 |
|
One is go and try those really hard things. |
|
|
|
05:30.880 --> 05:34.760 |
|
That's where there is an incredible opportunity. |
|
|
|
05:34.760 --> 05:36.560 |
|
I think the other big one, though, |
|
|
|
05:36.560 --> 05:41.720 |
|
is to see people for who they can be, not who they are. |
|
|
|
05:41.720 --> 05:46.080 |
|
One of the deepest lessons I learned from Red
|
|
|
05:46.080 --> 05:51.000 |
|
was that he would look at undergraduates or graduate |
|
|
|
05:51.000 --> 05:56.120 |
|
students and empower them to be leaders, |
|
|
|
05:56.120 --> 06:01.400 |
|
to have responsibility, to do great things, |
|
|
|
06:01.400 --> 06:04.760 |
|
that I think another person might look at them and think, |
|
|
|
06:04.760 --> 06:06.600 |
|
oh, well, that's just an undergraduate student. |
|
|
|
06:06.600 --> 06:08.720 |
|
What could they know? |
|
|
|
06:08.720 --> 06:13.520 |
|
And so I think that trust, but verify, have confidence |
|
|
|
06:13.520 --> 06:14.880 |
|
in what people can become, I think, |
|
|
|
06:14.880 --> 06:16.680 |
|
is a really powerful thing. |
|
|
|
06:16.680 --> 06:20.480 |
|
So through that, let's just fast forward through the history. |
|
|
|
06:20.480 --> 06:24.200 |
|
Can you maybe talk through the technical evolution |
|
|
|
06:24.200 --> 06:27.480 |
|
of autonomous vehicle systems from the first two |
|
|
|
06:27.480 --> 06:30.920 |
|
Grand Challenges to the Urban Challenge to today? |
|
|
|
06:30.920 --> 06:33.600 |
|
Are there major shifts in your mind, |
|
|
|
06:33.600 --> 06:37.240 |
|
or is it the same kind of technology just made more robust? |
|
|
|
06:37.240 --> 06:40.880 |
|
I think there's been some big, big steps. |
|
|
|
06:40.880 --> 06:46.600 |
|
So for the Grand Challenge, the real technology |
|
|
|
06:46.600 --> 06:51.400 |
|
that unlocked that was HD mapping. |
|
|
|
06:51.400 --> 06:55.200 |
|
Prior to that, a lot of the off road robotics work |
|
|
|
06:55.200 --> 06:58.920 |
|
had been done without any real prior model of what |
|
|
|
06:58.920 --> 07:01.400 |
|
the vehicle was going to encounter. |
|
|
|
07:01.400 --> 07:03.960 |
|
And so that innovation, the fact
|
|
|
07:03.960 --> 07:11.320 |
|
that we could get decimeter resolution models, |
|
|
|
07:11.320 --> 07:13.560 |
|
was really a big deal. |
|
|
|
07:13.560 --> 07:17.480 |
|
And that allowed us to kind of bound |
|
|
|
07:17.480 --> 07:19.680 |
|
the complexity of the driving problem the vehicle had |
|
|
|
07:19.680 --> 07:21.040 |
|
and allowed it to operate at speed, |
|
|
|
07:21.040 --> 07:23.800 |
|
because we could assume things about the environment |
|
|
|
07:23.800 --> 07:26.400 |
|
that it was going to encounter. |
|
|
|
07:26.400 --> 07:31.320 |
|
So that was one of the big steps there.
|
|
|
07:31.320 --> 07:38.520 |
|
For the Urban Challenge, one of the big technological |
|
|
|
07:38.520 --> 07:41.960 |
|
innovations there was the multi beam LiDAR. |
|
|
|
07:41.960 --> 07:45.720 |
|
And being able to generate high resolution,
|
|
|
07:45.720 --> 07:48.680 |
|
mid to long range 3D models of the world,
|
|
|
07:48.680 --> 07:54.120 |
|
and use that for understanding the world around the vehicle. |
|
|
|
07:54.120 --> 07:59.120 |
|
And that was really kind of a game changing technology. |
|
|
|
07:59.120 --> 08:02.880 |
|
And in parallel with that, we saw a bunch
|
|
|
08:02.880 --> 08:06.640 |
|
of other technologies that had been kind of converging |
|
|
|
08:06.640 --> 08:08.960 |
|
have their day in the sun.
|
|
|
08:08.960 --> 08:16.800 |
|
So Bayesian estimation, SLAM, had been a big field
|
|
|
08:16.800 --> 08:18.600 |
|
in robotics. |
|
|
|
08:18.600 --> 08:20.800 |
|
You would go to a conference a couple of years |
|
|
|
08:20.800 --> 08:23.800 |
|
before that, and every paper would effectively |
|
|
|
08:23.800 --> 08:25.640 |
|
have SLAM somewhere in it. |
|
|
|
08:25.640 --> 08:31.560 |
|
And so seeing that Bayesian estimation techniques |
|
|
|
08:31.560 --> 08:34.040 |
|
play out on a very visible stage, |
|
|
|
08:34.040 --> 08:38.680 |
|
I thought that was pretty exciting to see. |
|
|
|
08:38.680 --> 08:41.760 |
|
And mostly SLAM was done based on LiDAR at that time? |
|
|
|
08:41.760 --> 08:42.400 |
|
Well, yeah. |
|
|
|
08:42.400 --> 08:46.720 |
|
And in fact, we weren't really doing SLAM per se in real time, |
|
|
|
08:46.720 --> 08:48.120 |
|
because we had a model ahead of time. |
|
|
|
08:48.120 --> 08:51.560 |
|
We had a roadmap, but we were doing localization. |
|
|
|
08:51.560 --> 08:54.080 |
|
And we were using the LiDAR or the cameras, |
|
|
|
08:54.080 --> 08:55.920 |
|
depending on who exactly was doing it, |
|
|
|
08:55.920 --> 08:58.080 |
|
to localize to a model of the world. |
|
|
|
08:58.080 --> 09:00.720 |
|
And I thought that was a big step |
|
|
|
09:00.720 --> 09:07.160 |
|
from kind of naively trusting GPS/INS before that.
|
|
|
09:07.160 --> 09:10.400 |
|
And again, lots of work had been going on in this field. |
|
|
|
09:10.400 --> 09:14.080 |
|
Certainly, this was not doing anything particularly |
|
|
|
09:14.080 --> 09:17.400 |
|
innovative in SLAM or in localization, |
|
|
|
09:17.400 --> 09:20.160 |
|
but it was seeing that technology necessary |
|
|
|
09:20.160 --> 09:21.800 |
|
in a real application on a big stage. |
|
|
|
09:21.800 --> 09:23.080 |
|
I thought it was very cool. |
|
|
|
09:23.080 --> 09:25.600 |
|
So for the Urban Challenge, were those maps already
|
|
|
09:25.600 --> 09:28.120 |
|
constructed offline in general? |
|
|
|
09:28.120 --> 09:28.600 |
|
OK. |
|
|
|
09:28.600 --> 09:30.920 |
|
And did people do that individually? |
|
|
|
09:30.920 --> 09:33.600 |
|
Did individual teams do it individually? |
|
|
|
09:33.600 --> 09:36.440 |
|
So they had their own different approaches there? |
|
|
|
09:36.440 --> 09:41.720 |
|
Or did everybody kind of share that information, |
|
|
|
09:41.720 --> 09:42.880 |
|
at least intuitively? |
|
|
|
09:42.880 --> 09:49.560 |
|
So DARPA gave all the teams a model of the world, a map. |
|
|
|
09:49.560 --> 09:53.720 |
|
And then one of the things that we had to figure out back then |
|
|
|
09:53.720 --> 09:56.720 |
|
was, and it's still one of these things that trips people up |
|
|
|
09:56.720 --> 10:00.240 |
|
today, is actually the coordinate system. |
|
|
|
10:00.240 --> 10:03.000 |
|
So you get a latitude, longitude. |
|
|
|
10:03.000 --> 10:05.120 |
|
And to so many decimal places, you |
|
|
|
10:05.120 --> 10:07.800 |
|
don't really care about kind of the ellipsoid of the Earth |
|
|
|
10:07.800 --> 10:09.520 |
|
that's being used. |
|
|
|
10:09.520 --> 10:12.720 |
|
But when you want to get to 10 centimeter or centimeter |
|
|
|
10:12.720 --> 10:18.480 |
|
resolution, you care whether the coordinate system is NAD 83
|
|
|
10:18.480 --> 10:22.720 |
|
or WGS 84; these are different ways
|
|
|
10:22.720 --> 10:26.720 |
|
to describe both the kind of nonsphericalness of the Earth, |
|
|
|
10:26.720 --> 10:31.560 |
|
but also, and I can't remember
|
|
|
10:31.560 --> 10:33.560 |
|
which one, the tectonic shifts that are happening |
|
|
|
10:33.560 --> 10:36.920 |
|
and how to transform the global datum as a function of that. |
|
|
|
10:36.920 --> 10:40.400 |
|
So getting a map and then actually matching it |
|
|
|
10:40.400 --> 10:41.880 |
|
to reality to centimeter resolution, |
|
|
|
10:41.880 --> 10:44.000 |
|
that was kind of interesting and fun back then. |
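[A minimal sketch of the datum issue described above, assuming the pyproj library and a made-up point in the Mojave desert; neither comes from the conversation:]

```python
# Hedged illustration of the NAD 83 vs. WGS 84 issue; pyproj and the sample
# coordinates are assumptions for illustration only.
from pyproj import Geod, Transformer

# EPSG:4269 = NAD 83, EPSG:4326 = WGS 84; always_xy=True -> (lon, lat) order.
to_wgs84 = Transformer.from_crs("EPSG:4269", "EPSG:4326", always_xy=True)
geod = Geod(ellps="WGS84")

lon, lat = -117.3, 34.6  # made-up point in the Mojave desert
wlon, wlat = to_wgs84.transform(lon, lat)

# Distance in meters between "the same" numbers read in the two datums.
# The exact value depends on which datum realization PROJ selects, but in the
# continental US the discrepancy is on the order of a meter -- irrelevant for
# turn-by-turn navigation, fatal for a map that must line up to 10 cm.
_, _, offset_m = geod.inv(lon, lat, wlon, wlat)
print(f"datum offset: {offset_m:.3f} m")
```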
|
|
|
10:44.000 --> 10:46.800 |
|
So how much work was the perception doing there? |
|
|
|
10:46.800 --> 10:52.440 |
|
So how much were you relying on localization based on maps |
|
|
|
10:52.440 --> 10:55.720 |
|
without using perception to register to the maps? |
|
|
|
10:55.720 --> 10:57.960 |
|
And I guess the question is how advanced |
|
|
|
10:57.960 --> 10:59.720 |
|
was perception at that point? |
|
|
|
10:59.720 --> 11:01.920 |
|
It's certainly behind where we are today. |
|
|
|
11:01.920 --> 11:05.800 |
|
We're more than a decade since the urban challenge. |
|
|
|
11:05.800 --> 11:13.080 |
|
But the core of it was there, that we were tracking vehicles. |
|
|
|
11:13.080 --> 11:15.600 |
|
We had to do that at 100 plus meter range |
|
|
|
11:15.600 --> 11:18.280 |
|
because we had to merge with other traffic. |
|
|
|
11:18.280 --> 11:21.200 |
|
We were using, again, Bayesian estimates |
|
|
|
11:21.200 --> 11:23.800 |
|
for the state of these vehicles.
|
|
|
11:23.800 --> 11:25.560 |
|
We had to deal with a bunch of the problems |
|
|
|
11:25.560 --> 11:28.240 |
|
that you think of today of predicting |
|
|
|
11:28.240 --> 11:31.040 |
|
where that vehicle is going to be a few seconds into the future. |
|
|
|
11:31.040 --> 11:33.680 |
|
We had to deal with the fact that there |
|
|
|
11:33.680 --> 11:36.000 |
|
were multiple hypotheses for that because a vehicle |
|
|
|
11:36.000 --> 11:37.640 |
|
at an intersection might be going right |
|
|
|
11:37.640 --> 11:41.440 |
|
or it might be going straight or it might be making a left turn. |
|
|
|
11:41.440 --> 11:44.080 |
|
And we had to deal with the challenge of the fact |
|
|
|
11:44.080 --> 11:47.520 |
|
that our behavior was going to impact the behavior |
|
|
|
11:47.520 --> 11:48.880 |
|
of that other operator. |
|
|
|
11:48.880 --> 11:53.400 |
|
And we did a lot of that in relatively naive ways. |
|
|
|
11:53.400 --> 11:54.720 |
|
But it kind of worked. |
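[A minimal sketch of the kind of multi-hypothesis reasoning described here, not the team's actual code; the maneuver set, prior, and likelihood numbers are invented for illustration:]

```python
# Hedged sketch: a discrete belief over what a tracked vehicle at an
# intersection will do, updated with Bayes' rule as evidence arrives.
# All numbers below are invented.

def bayes_update(belief, likelihood):
    """posterior(h) is proportional to prior(h) * P(observation | h)."""
    posterior = {h: belief[h] * likelihood[h] for h in belief}
    total = sum(posterior.values())
    return {h: p / total for h, p in posterior.items()}

# Uniform prior over the three maneuvers mentioned in the conversation.
belief = {"left": 1 / 3, "straight": 1 / 3, "right": 1 / 3}

# Observation: the tracked vehicle's heading starts drifting to the right.
# P(observed heading change | maneuver), made-up values.
heading_evidence = {"left": 0.05, "straight": 0.25, "right": 0.70}
belief = bayes_update(belief, heading_evidence)

print(belief)  # most of the probability mass has shifted to "right"
```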
|
|
|
11:54.720 --> 11:57.000 |
|
Still had to have some kind of assumption. |
|
|
|
11:57.000 --> 12:00.640 |
|
And so where does that 10 years later, where does that take us |
|
|
|
12:00.640 --> 12:04.200 |
|
today from that artificial city construction |
|
|
|
12:04.200 --> 12:06.920 |
|
to real cities to the urban environment? |
|
|
|
12:06.920 --> 12:13.600 |
|
Yeah, I think the biggest thing is that the actors are truly |
|
|
|
12:13.600 --> 12:18.680 |
|
unpredictable, that most of the time, the drivers on the road, |
|
|
|
12:18.680 --> 12:24.000 |
|
the other road users are out there behaving well. |
|
|
|
12:24.000 --> 12:27.040 |
|
But every once in a while, they're not. |
|
|
|
12:27.040 --> 12:33.320 |
|
The variety of other vehicles is, you have all of them. |
|
|
|
12:33.320 --> 12:35.760 |
|
In terms of behavior, or in terms of perception, or both?
|
|
|
12:35.760 --> 12:38.320 |
|
Both. |
|
|
|
12:38.320 --> 12:40.480 |
|
Back then, we didn't have to deal with cyclists. |
|
|
|
12:40.480 --> 12:42.800 |
|
We didn't have to deal with pedestrians. |
|
|
|
12:42.800 --> 12:46.240 |
|
Didn't have to deal with traffic lights. |
|
|
|
12:46.240 --> 12:49.360 |
|
The scale over which you have to operate is now
|
|
|
12:49.360 --> 12:52.240 |
|
so much larger than the air base that we were thinking about back
|
|
|
12:52.240 --> 12:52.720 |
|
then. |
|
|
|
12:52.720 --> 12:56.280 |
|
So, easy question.
|
|
|
12:56.280 --> 12:59.720 |
|
What do you think is the hardest part about driving? |
|
|
|
12:59.720 --> 13:00.480 |
|
Easy question. |
|
|
|
13:00.480 --> 13:01.320 |
|
Yeah. |
|
|
|
13:01.320 --> 13:02.600 |
|
No, I'm joking. |
|
|
|
13:02.600 --> 13:07.440 |
|
I'm sure nothing really jumps out at you as one thing. |
|
|
|
13:07.440 --> 13:12.920 |
|
But in the jump from the urban challenge to the real world, |
|
|
|
13:12.920 --> 13:16.200 |
|
is there something in particular that you foresee
|
|
|
13:16.200 --> 13:18.480 |
|
as a very serious, difficult challenge? |
|
|
|
13:18.480 --> 13:21.120 |
|
I think the most fundamental difference |
|
|
|
13:21.120 --> 13:28.960 |
|
is that we're doing it for real, that in that environment, |
|
|
|
13:28.960 --> 13:31.840 |
|
it was both a limited complexity environment, |
|
|
|
13:31.840 --> 13:33.240 |
|
because certain actors weren't there, |
|
|
|
13:33.240 --> 13:35.360 |
|
because the roads were maintained. |
|
|
|
13:35.360 --> 13:38.720 |
|
There were barriers keeping people separate from robots |
|
|
|
13:38.720 --> 13:40.880 |
|
at the time. |
|
|
|
13:40.880 --> 13:44.480 |
|
And it only had to work for 60 miles, which looking at it |
|
|
|
13:44.480 --> 13:48.960 |
|
from 2006, it had to work for 60 miles. |
|
|
|
13:48.960 --> 13:52.720 |
|
Looking at it from now, we want things |
|
|
|
13:52.720 --> 13:57.200 |
|
that will go and drive for half a million miles. |
|
|
|
13:57.200 --> 14:00.960 |
|
And it's just a different game. |
|
|
|
14:00.960 --> 14:06.080 |
|
So how important, you said LiDAR came into the game early on,
|
|
|
14:06.080 --> 14:08.880 |
|
and it's really the primary driver of autonomous vehicles |
|
|
|
14:08.880 --> 14:10.240 |
|
today as a sensor. |
|
|
|
14:10.240 --> 14:12.880 |
|
So how important is the role of LiDAR in the sensor suite
|
|
|
14:12.880 --> 14:14.760 |
|
in the near term? |
|
|
|
14:14.760 --> 14:18.680 |
|
So I think it's essential. |
|
|
|
14:18.680 --> 14:20.520 |
|
But I also believe that cameras are essential, |
|
|
|
14:20.520 --> 14:22.160 |
|
and I believe the radar is essential. |
|
|
|
14:22.160 --> 14:27.400 |
|
I think that you really need to use the composition of data |
|
|
|
14:27.400 --> 14:28.920 |
|
from these different sensors if you |
|
|
|
14:28.920 --> 14:32.600 |
|
want the thing to really be robust. |
|
|
|
14:32.600 --> 14:35.440 |
|
The question I want to ask, let's see if we can untangle it, |
|
|
|
14:35.440 --> 14:40.240 |
|
is what are your thoughts on the Elon Musk provocative statement |
|
|
|
14:40.240 --> 14:45.840 |
|
that LiDAR is a crutch, that it's kind of, I guess,
|
|
|
14:45.840 --> 14:49.600 |
|
growing pains, and that much of the perception |
|
|
|
14:49.600 --> 14:52.160 |
|
task can be done with cameras? |
|
|
|
14:52.160 --> 14:56.920 |
|
So I think it is undeniable that people walk around |
|
|
|
14:56.920 --> 14:59.680 |
|
without lasers in their foreheads, |
|
|
|
14:59.680 --> 15:01.840 |
|
and they can get into vehicles and drive them. |
|
|
|
15:01.840 --> 15:05.560 |
|
And so there's an existence proof |
|
|
|
15:05.560 --> 15:10.840 |
|
that you can drive using passive vision. |
|
|
|
15:10.840 --> 15:12.680 |
|
No doubt, can't argue with that. |
|
|
|
15:12.680 --> 15:14.320 |
|
In terms of sensors, yeah. |
|
|
|
15:14.320 --> 15:14.800 |
|
So there's proof. |
|
|
|
15:14.800 --> 15:15.960 |
|
Yes, in terms of sensors, right? |
|
|
|
15:15.960 --> 15:18.720 |
|
So there's an example that we all |
|
|
|
15:18.720 --> 15:23.280 |
|
go do, or many of us do, every day.
|
|
|
15:23.280 --> 15:28.200 |
|
In terms of LiDAR being a crutch, sure.
|
|
|
15:28.200 --> 15:33.080 |
|
But in the same way that the combustion engine |
|
|
|
15:33.080 --> 15:35.240 |
|
was a crutch on the path to an electric vehicle, |
|
|
|
15:35.240 --> 15:40.840 |
|
in the same way that any technology ultimately gets |
|
|
|
15:40.840 --> 15:44.640 |
|
replaced by some superior technology in the future. |
|
|
|
15:44.640 --> 15:47.720 |
|
And really, the way that I look at this |
|
|
|
15:47.720 --> 15:51.720 |
|
is that the way we get around on the ground, the way |
|
|
|
15:51.720 --> 15:55.280 |
|
that we use transportation is broken. |
|
|
|
15:55.280 --> 15:59.720 |
|
And that we have this, I think the number I saw this morning, |
|
|
|
15:59.720 --> 16:04.040 |
|
37,000 Americans killed last year on our roads. |
|
|
|
16:04.040 --> 16:05.360 |
|
And that's just not acceptable. |
|
|
|
16:05.360 --> 16:09.440 |
|
And so any technology that we can bring to bear |
|
|
|
16:09.440 --> 16:12.840 |
|
that accelerates this technology, self driving technology, |
|
|
|
16:12.840 --> 16:15.720 |
|
coming to market and saving lives, |
|
|
|
16:15.720 --> 16:18.280 |
|
is technology we should be using. |
|
|
|
16:18.280 --> 16:24.040 |
|
And it feels just arbitrary to say, well, I'm not |
|
|
|
16:24.040 --> 16:27.800 |
|
OK with using lasers, because that's whatever. |
|
|
|
16:27.800 --> 16:30.760 |
|
But I am OK with using an 8 megapixel camera |
|
|
|
16:30.760 --> 16:32.880 |
|
or a 16 megapixel camera. |
|
|
|
16:32.880 --> 16:34.640 |
|
These are just bits of technology, |
|
|
|
16:34.640 --> 16:36.880 |
|
and we should be taking the best technology from the tool |
|
|
|
16:36.880 --> 16:41.600 |
|
bin that allows us to go and solve a problem. |
|
|
|
16:41.600 --> 16:45.160 |
|
So I often talk, well, obviously you do as well,
|
|
|
16:45.160 --> 16:48.320 |
|
to automotive companies. |
|
|
|
16:48.320 --> 16:51.880 |
|
And if there's one word that comes up more often than anything, |
|
|
|
16:51.880 --> 16:55.320 |
|
it's cost and driving costs down.
|
|
|
16:55.320 --> 17:01.440 |
|
So while it's true that it's a tragic number, the 37,000, |
|
|
|
17:01.440 --> 17:04.880 |
|
the question is, and I'm not the one asking this question, |
|
|
|
17:04.880 --> 17:07.160 |
|
because I hate this question, but we |
|
|
|
17:07.160 --> 17:11.680 |
|
want to find the cheapest sensor suite that |
|
|
|
17:11.680 --> 17:13.400 |
|
creates a safe vehicle. |
|
|
|
17:13.400 --> 17:18.240 |
|
So in that uncomfortable trade off, |
|
|
|
17:18.240 --> 17:23.680 |
|
do you foresee lidar coming down in cost in the future? |
|
|
|
17:23.680 --> 17:28.000 |
|
Or do you see a day where level 4 autonomy is possible |
|
|
|
17:28.000 --> 17:29.880 |
|
without lidar? |
|
|
|
17:29.880 --> 17:32.880 |
|
I see both of those, but it's really a matter of time. |
|
|
|
17:32.880 --> 17:35.080 |
|
And I think, really, maybe I would |
|
|
|
17:35.080 --> 17:38.760 |
|
talk to the question you asked about the cheapest sensor. |
|
|
|
17:38.760 --> 17:40.440 |
|
I don't think that's actually what you want. |
|
|
|
17:40.440 --> 17:45.720 |
|
What you want is a sensor suite that is economically viable. |
|
|
|
17:45.720 --> 17:49.480 |
|
And then after that, everything is about margin |
|
|
|
17:49.480 --> 17:52.320 |
|
and driving cost out of the system. |
|
|
|
17:52.320 --> 17:55.400 |
|
What you also want is a sensor suite that works. |
|
|
|
17:55.400 --> 18:01.280 |
|
And so it's great to tell a story about how it would be better |
|
|
|
18:01.280 --> 18:04.560 |
|
to have a self driving system with a $50 sensor instead |
|
|
|
18:04.560 --> 18:08.720 |
|
of a $500 sensor. |
|
|
|
18:08.720 --> 18:11.560 |
|
But if the $500 sensor makes it work and the $50 sensor |
|
|
|
18:11.560 --> 18:15.680 |
|
doesn't work, who cares? |
|
|
|
18:15.680 --> 18:21.680 |
|
So long as you can actually have an economic opportunity there. |
|
|
|
18:21.680 --> 18:23.760 |
|
And the economic opportunity is important, |
|
|
|
18:23.760 --> 18:27.800 |
|
because that's how you actually have a sustainable business. |
|
|
|
18:27.800 --> 18:30.440 |
|
And that's how you can actually see this come to scale |
|
|
|
18:30.440 --> 18:32.520 |
|
and be out in the world. |
|
|
|
18:32.520 --> 18:36.400 |
|
And so when I look at lidar, I see |
|
|
|
18:36.400 --> 18:41.200 |
|
a technology that has no underlying fundamental expense
|
|
|
18:41.200 --> 18:43.240 |
|
to it.
|
|
|
18:43.240 --> 18:46.120 |
|
It's going to be more expensive than an imager, |
|
|
|
18:46.120 --> 18:51.400 |
|
because CMOS processes or fab processes
|
|
|
18:51.400 --> 18:56.200 |
|
are dramatically more scalable than mechanical processes. |
|
|
|
18:56.200 --> 18:58.160 |
|
But we still should be able to drive cost |
|
|
|
18:58.160 --> 19:00.440 |
|
out substantially on that side. |
|
|
|
19:00.440 --> 19:05.880 |
|
And then I also do think that with the right business model, |
|
|
|
19:05.880 --> 19:08.440 |
|
you can absorb more, certainly more cost |
|
|
|
19:08.440 --> 19:09.480 |
|
on the bill of materials.
|
|
|
19:09.480 --> 19:12.600 |
|
Yeah, if the sensor suite works, extra value is provided. |
|
|
|
19:12.600 --> 19:15.480 |
|
Thereby, you don't need to drive cost down to zero. |
|
|
|
19:15.480 --> 19:17.120 |
|
It's basic economics.
|
|
|
19:17.120 --> 19:18.840 |
|
You've talked about your intuition |
|
|
|
19:18.840 --> 19:22.720 |
|
that level two autonomy is problematic because
|
|
|
19:22.720 --> 19:27.280 |
|
of the human factor of vigilance decrement, complacency,
|
|
|
19:27.280 --> 19:29.600 |
|
overtrust, and so on, just us being human. |
|
|
|
19:29.600 --> 19:33.000 |
|
When we overtrust the system, we start even more
|
|
|
19:33.000 --> 19:36.480 |
|
so partaking in secondary activities like smartphone use
|
|
|
19:36.480 --> 19:38.720 |
|
and so on. |
|
|
|
19:38.720 --> 19:42.960 |
|
Have your views evolved on this point in either direction? |
|
|
|
19:42.960 --> 19:44.760 |
|
Can you speak to it? |
|
|
|
19:44.760 --> 19:48.240 |
|
So I want to be really careful, because sometimes this |
|
|
|
19:48.240 --> 19:53.000 |
|
gets twisted in a way that I certainly didn't intend. |
|
|
|
19:53.000 --> 19:59.360 |
|
So active safety systems are a really important technology |
|
|
|
19:59.360 --> 20:03.400 |
|
that we should be pursuing and integrating into vehicles. |
|
|
|
20:03.400 --> 20:05.680 |
|
And there's an opportunity in the near term |
|
|
|
20:05.680 --> 20:09.400 |
|
to reduce accidents, reduce fatalities,
|
|
|
20:09.400 --> 20:13.400 |
|
and we should be pushing on that. |
|
|
|
20:13.400 --> 20:17.280 |
|
Level two systems are systems where |
|
|
|
20:17.280 --> 20:19.480 |
|
the vehicle is controlling two axes, |
|
|
|
20:19.480 --> 20:24.800 |
|
so braking and throttle slash steering.
|
|
|
20:24.800 --> 20:27.200 |
|
And I think there are variants of level two systems that |
|
|
|
20:27.200 --> 20:30.200 |
|
are supporting the driver that absolutely we |
|
|
|
20:30.200 --> 20:32.560 |
|
should encourage to be out there. |
|
|
|
20:32.560 --> 20:37.920 |
|
Where I think there's a real challenge is in the human factors |
|
|
|
20:37.920 --> 20:40.800 |
|
part around this and the misconception |
|
|
|
20:40.800 --> 20:44.920 |
|
from the public around the capability set that that enables |
|
|
|
20:44.920 --> 20:48.000 |
|
and the trust that they should have in it. |
|
|
|
20:48.000 --> 20:53.880 |
|
And that is where I'm actually incrementally more |
|
|
|
20:53.880 --> 20:55.800 |
|
concerned around level three systems |
|
|
|
20:55.800 --> 20:59.960 |
|
and how exactly a level two system is marketed and delivered |
|
|
|
20:59.960 --> 21:03.240 |
|
and how much effort people have put into those human factors. |
|
|
|
21:03.240 --> 21:07.000 |
|
So I still believe several things around this. |
|
|
|
21:07.000 --> 21:10.760 |
|
One is people will over trust the technology. |
|
|
|
21:10.760 --> 21:12.720 |
|
We've seen over the last few weeks |
|
|
|
21:12.720 --> 21:16.280 |
|
a spate of people sleeping in their Tesla. |
|
|
|
21:16.280 --> 21:23.240 |
|
I watched an episode last night of Trevor Noah talking |
|
|
|
21:23.240 --> 21:27.160 |
|
about this, and this is a smart guy |
|
|
|
21:27.160 --> 21:31.040 |
|
who has a lot of resources at his disposal describing |
|
|
|
21:31.040 --> 21:32.880 |
|
a Tesla as a self driving car. |
|
|
|
21:32.880 --> 21:35.640 |
|
And that, why shouldn't people be sleeping in their Tesla?
|
|
|
21:35.640 --> 21:38.800 |
|
It's like, well, because it's not a self driving car |
|
|
|
21:38.800 --> 21:41.120 |
|
and it is not intended to be. |
|
|
|
21:41.120 --> 21:48.400 |
|
And these people will almost certainly die at some point |
|
|
|
21:48.400 --> 21:50.400 |
|
or hurt other people. |
|
|
|
21:50.400 --> 21:52.640 |
|
And so we need to really be thoughtful about how |
|
|
|
21:52.640 --> 21:56.280 |
|
that technology is described and brought to market. |
|
|
|
21:56.280 --> 22:00.760 |
|
I also think that because of the economic issue, |
|
|
|
22:00.760 --> 22:03.320 |
|
economic challenges we were just talking about, |
|
|
|
22:03.320 --> 22:06.960 |
|
that technology path of these level two driver assistance
|
|
|
22:06.960 --> 22:08.400 |
|
systems will
|
|
|
22:08.400 --> 22:11.560 |
|
diverge from the technology path that we |
|
|
|
22:11.560 --> 22:15.800 |
|
need to be on to actually deliver truly self driving |
|
|
|
22:15.800 --> 22:19.120 |
|
vehicles, ones where you can get in it and sleep |
|
|
|
22:19.120 --> 22:21.480 |
|
and have the equivalent or better safety |
|
|
|
22:21.480 --> 22:24.600 |
|
than a human driver behind the wheel. |
|
|
|
22:24.600 --> 22:28.440 |
|
Because, again, the economics are very different |
|
|
|
22:28.440 --> 22:29.800 |
|
in those two worlds. |
|
|
|
22:29.800 --> 22:32.720 |
|
And so that leads to divergent technology. |
|
|
|
22:32.720 --> 22:36.920 |
|
So you just don't see the economics of gradually |
|
|
|
22:36.920 --> 22:41.520 |
|
increasing from level two and doing so quickly enough |
|
|
|
22:41.520 --> 22:44.400 |
|
to where it doesn't cause critical safety concerns.
|
|
|
22:44.400 --> 22:48.600 |
|
You believe that it needs to diverge at this point |
|
|
|
22:48.600 --> 22:50.600 |
|
into different, basically different routes. |
|
|
|
22:50.600 --> 22:53.760 |
|
And really that comes back to what |
|
|
|
22:53.760 --> 22:56.840 |
|
are those L2 and L1 systems doing? |
|
|
|
22:56.840 --> 22:59.800 |
|
And they are driver assistance functions |
|
|
|
22:59.800 --> 23:04.360 |
|
where the people that are marketing that responsibly |
|
|
|
23:04.360 --> 23:07.960 |
|
are being very clear and putting human factors in place |
|
|
|
23:07.960 --> 23:12.400 |
|
such that the driver is actually responsible for the vehicle |
|
|
|
23:12.400 --> 23:15.200 |
|
and that the technology is there to support the driver. |
|
|
|
23:15.200 --> 23:19.880 |
|
And the safety cases that are built around those |
|
|
|
23:19.880 --> 23:24.320 |
|
are dependent on that driver attention and attentiveness. |
|
|
|
23:24.320 --> 23:30.360 |
|
And at that point, you can kind of give up, to some degree, |
|
|
|
23:30.360 --> 23:34.280 |
|
for economic reasons, you can give up on, say, false negatives. |
|
|
|
23:34.280 --> 23:36.200 |
|
And so the way to think about this |
|
|
|
23:36.200 --> 23:40.760 |
|
is, for a forward collision mitigation braking system,
|
|
|
23:40.760 --> 23:45.080 |
|
if half the times the driver missed a vehicle in front of it, |
|
|
|
23:45.080 --> 23:47.640 |
|
it hit the brakes and brought the vehicle to a stop, |
|
|
|
23:47.640 --> 23:51.200 |
|
that would be an incredible, incredible advance |
|
|
|
23:51.200 --> 23:52.960 |
|
in safety on our roads, right? |
|
|
|
23:52.960 --> 23:55.080 |
|
That would be equivalent to seatbelts. |
|
|
|
23:55.080 --> 23:57.560 |
|
But it would mean that if that vehicle wasn't being monitored, |
|
|
|
23:57.560 --> 24:00.560 |
|
it would hit one out of two cars. |
|
|
|
24:00.560 --> 24:05.080 |
|
And so economically, that's a perfectly good solution |
|
|
|
24:05.080 --> 24:06.200 |
|
for a driver assistance system. |
|
|
|
24:06.200 --> 24:07.360 |
|
What you should do at that point, |
|
|
|
24:07.360 --> 24:09.200 |
|
if you can get it to work 50% of the time, |
|
|
|
24:09.200 --> 24:11.040 |
|
is drive the cost out of that so you can get it |
|
|
|
24:11.040 --> 24:13.320 |
|
on as many vehicles as possible. |
|
|
|
24:13.320 --> 24:16.920 |
|
But driving the cost out of it doesn't drive up performance |
|
|
|
24:16.920 --> 24:18.840 |
|
on the false negative case. |
|
|
|
24:18.840 --> 24:21.480 |
|
And so you'll continue to not have a technology |
|
|
|
24:21.480 --> 24:25.720 |
|
that could really be available for a self driven vehicle. |
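[A back-of-the-envelope sketch of the arithmetic in this answer; the driver miss rate and the independence assumption are invented for illustration:]

```python
# Hedged sketch: why a braking system that catches only half of imminent
# collisions is valuable as driver assistance but useless on its own.
# All rates below are assumptions, not measured numbers.

system_miss_rate = 0.50   # collision-mitigation braking misses half the threats
driver_miss_rate = 0.01   # assumed rate at which an attentive driver misses one

# Driver-assistance case: a crash requires BOTH the driver and the system to
# miss (treating the two failures as independent, itself an assumption).
assisted_miss = driver_miss_rate * system_miss_rate

# Unmonitored case: the system is the only line of defense.
unmonitored_miss = system_miss_rate

print(f"driver alone:        {driver_miss_rate:.2%} of threats end in a crash")
print(f"driver + 50% system: {assisted_miss:.2%}")
print(f"system alone:        {unmonitored_miss:.2%}  ('hits one out of two cars')")
```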
|
|
|
24:25.720 --> 24:28.480 |
|
So clearly the communication, |
|
|
|
24:28.480 --> 24:31.640 |
|
and this probably applies to L4 vehicles as well,
|
|
|
24:31.640 --> 24:34.440 |
|
the marketing and the communication |
|
|
|
24:34.440 --> 24:37.080 |
|
of what the technology is actually capable of, |
|
|
|
24:37.080 --> 24:38.440 |
|
how hard it is, how easy it is, |
|
|
|
24:38.440 --> 24:41.040 |
|
all that kind of stuff is highly problematic. |
|
|
|
24:41.040 --> 24:45.680 |
|
So say everybody in the world was perfectly communicated to
|
|
|
24:45.680 --> 24:48.400 |
|
and were made to be completely aware |
|
|
|
24:48.400 --> 24:50.040 |
|
of every single technology out there, |
|
|
|
24:50.040 --> 24:52.880 |
|
what it's able to do. |
|
|
|
24:52.880 --> 24:54.160 |
|
What's your intuition? |
|
|
|
24:54.160 --> 24:56.920 |
|
And now we're maybe getting into philosophical ground. |
|
|
|
24:56.920 --> 25:00.040 |
|
Is it possible to have a level two vehicle |
|
|
|
25:00.040 --> 25:03.280 |
|
where we don't overtrust it? |
|
|
|
25:04.720 --> 25:05.840 |
|
I don't think so. |
|
|
|
25:05.840 --> 25:10.840 |
|
If people truly understood the risks and internalized it, |
|
|
|
25:11.200 --> 25:14.320 |
|
then sure you could do that safely, |
|
|
|
25:14.320 --> 25:16.200 |
|
but that's a world that doesn't exist. |
|
|
|
25:16.200 --> 25:17.560 |
|
The people are going to, |
|
|
|
25:19.440 --> 25:20.800 |
|
if the facts are put in front of them, |
|
|
|
25:20.800 --> 25:24.480 |
|
they're gonna then combine that with their experience. |
|
|
|
25:24.480 --> 25:28.400 |
|
And let's say they're using an L2 system |
|
|
|
25:28.400 --> 25:31.040 |
|
and they go up and down the 101 every day
|
|
|
25:31.040 --> 25:32.800 |
|
and they do that for a month |
|
|
|
25:32.800 --> 25:35.200 |
|
and it just worked every day for a month. |
|
|
|
25:36.320 --> 25:37.400 |
|
Like that's pretty compelling. |
|
|
|
25:37.400 --> 25:41.880 |
|
At that point, just even if you know the statistics, |
|
|
|
25:41.880 --> 25:43.520 |
|
you're like, well, I don't know, |
|
|
|
25:43.520 --> 25:44.840 |
|
maybe there's something a little funny about those. |
|
|
|
25:44.840 --> 25:47.000 |
|
Maybe they're driving in difficult places. |
|
|
|
25:47.000 --> 25:49.960 |
|
Like I've seen it with my own eyes, it works. |
|
|
|
25:49.960 --> 25:52.480 |
|
And the problem is that that sample size that they have, |
|
|
|
25:52.480 --> 25:54.000 |
|
so it's 30 miles up and down, |
|
|
|
25:54.000 --> 25:58.800 |
|
so 60 miles times 30 days, so 1,800 miles.
|
|
|
26:01.720 --> 26:05.240 |
|
That's a drop in the bucket compared to the,
|
|
|
26:05.240 --> 26:07.640 |
|
what 85 million miles between fatalities. |
|
|
|
26:07.640 --> 26:11.400 |
|
And so they don't really have a true estimate |
|
|
|
26:11.400 --> 26:14.440 |
|
based on their personal experience of the real risks, |
|
|
|
26:14.440 --> 26:15.640 |
|
but they're gonna trust it anyway, |
|
|
|
26:15.640 --> 26:17.720 |
|
because it's hard not to, it worked for a month. |
|
|
|
26:17.720 --> 26:18.640 |
|
What's gonna change? |
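[A small sketch of why a month of personal experience says little about fatality-level risk; the mileage and the 85-million-mile figure are taken from the conversation, while the Poisson model and the "N times worse" factors are assumptions:]

```python
import math

# Figures quoted in the conversation; the comparison factors are invented.
miles_driven = 60 * 30               # 60 miles a day for a month = 1,800 miles
human_miles_per_fatality = 85e6

for factor in (1, 10, 100):          # hypothetical: system N times worse than a human
    rate = factor / human_miles_per_fatality        # expected fatal events per mile
    p_quiet_month = math.exp(-rate * miles_driven)  # Poisson P(zero events)
    print(f"{factor:>3}x worse than a human driver: "
          f"P(uneventful month) = {p_quiet_month:.6f}")
```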
|
|
|
26:18.640 --> 26:21.600 |
|
So even if you start with a perfect understanding of the system,
|
|
|
26:21.600 --> 26:24.160 |
|
your own experience will make it drift. |
|
|
|
26:24.160 --> 26:25.920 |
|
I mean, that's a big concern. |
|
|
|
26:25.920 --> 26:29.480 |
|
Over a year, over two years even, it doesn't have to be months. |
|
|
|
26:29.480 --> 26:33.720 |
|
And I think that as this technology moves from, |
|
|
|
26:35.440 --> 26:37.800 |
|
what I would say is kind of the more technology savvy |
|
|
|
26:37.800 --> 26:41.480 |
|
ownership group to the mass market, |
|
|
|
26:41.480 --> 26:44.640 |
|
you may be able to have some of those folks |
|
|
|
26:44.640 --> 26:46.320 |
|
who are really familiar with technology, |
|
|
|
26:46.320 --> 26:48.880 |
|
they may be able to internalize it better. |
|
|
|
26:48.880 --> 26:50.840 |
|
And their kind of immunization
|
|
|
26:50.840 --> 26:53.400 |
|
against this kind of false risk assessment |
|
|
|
26:53.400 --> 26:56.960 |
|
might last longer, but as folks who aren't as savvy |
|
|
|
26:56.960 --> 27:00.200 |
|
about that read the material |
|
|
|
27:00.200 --> 27:02.200 |
|
and they compare that to their personal experience, |
|
|
|
27:02.200 --> 27:08.200 |
|
I think there it's gonna move more quickly.
|
|
|
27:08.200 --> 27:11.320 |
|
So your work, the program that you've created at Google |
|
|
|
27:11.320 --> 27:16.320 |
|
and now at Aurora is focused more on the second path |
|
|
|
27:16.640 --> 27:18.520 |
|
of creating full autonomy. |
|
|
|
27:18.520 --> 27:20.920 |
|
So it's such a fascinating, |
|
|
|
27:21.800 --> 27:24.600 |
|
I think it's one of the most interesting AI problems |
|
|
|
27:24.600 --> 27:25.640 |
|
of the century, right? |
|
|
|
27:25.640 --> 27:28.320 |
|
It's a, I just talked to a lot of people, |
|
|
|
27:28.320 --> 27:30.400 |
|
just regular people, I don't know, my mom |
|
|
|
27:30.400 --> 27:33.840 |
|
about autonomous vehicles and you begin to grapple |
|
|
|
27:33.840 --> 27:38.080 |
|
with ideas of giving your life control over to a machine. |
|
|
|
27:38.080 --> 27:40.040 |
|
It's philosophically interesting, |
|
|
|
27:40.040 --> 27:41.760 |
|
it's practically interesting. |
|
|
|
27:41.760 --> 27:43.720 |
|
So let's talk about safety. |
|
|
|
27:43.720 --> 27:46.240 |
|
How do you think, we demonstrate, |
|
|
|
27:46.240 --> 27:47.880 |
|
you've spoken about metrics in the past, |
|
|
|
27:47.880 --> 27:51.880 |
|
how do you think we demonstrate to the world |
|
|
|
27:51.880 --> 27:56.160 |
|
that an autonomous vehicle, an Aurora system is safe? |
|
|
|
27:56.160 --> 27:57.320 |
|
This is one where it's difficult |
|
|
|
27:57.320 --> 27:59.280 |
|
because there isn't a sound bite answer. |
|
|
|
27:59.280 --> 28:04.280 |
|
That we have to show a combination of work |
|
|
|
28:05.960 --> 28:08.360 |
|
that was done diligently and thoughtfully. |
|
|
|
28:08.360 --> 28:10.840 |
|
And this is where something like a functional safety process |
|
|
|
28:10.840 --> 28:14.360 |
|
as part of that is like, here's the way we did the work. |
|
|
|
28:15.320 --> 28:17.200 |
|
That means that we were very thorough. |
|
|
|
28:17.200 --> 28:20.560 |
|
So, if you believe what we said about,
|
|
|
28:20.560 --> 28:21.480 |
|
this is the way we did it, |
|
|
|
28:21.480 --> 28:23.440 |
|
then you can have some confidence that we were thorough |
|
|
|
28:23.440 --> 28:27.000 |
|
in the engineering work we put into the system. |
|
|
|
28:27.000 --> 28:30.160 |
|
And then on top of that, to kind of demonstrate |
|
|
|
28:30.160 --> 28:32.000 |
|
that we weren't just thorough, |
|
|
|
28:32.000 --> 28:34.000 |
|
we were actually good at what we did. |
|
|
|
28:35.320 --> 28:38.240 |
|
There'll be a kind of a collection of evidence |
|
|
|
28:38.240 --> 28:40.480 |
|
in terms of demonstrating that the capabilities |
|
|
|
28:40.480 --> 28:43.960 |
|
work the way we thought they did, statistically |
|
|
|
28:43.960 --> 28:47.200 |
|
and to whatever degree we can demonstrate that |
|
|
|
28:48.200 --> 28:50.320 |
|
both in some combination of simulation, |
|
|
|
28:50.320 --> 28:54.720 |
|
some combination of unit testing and decomposition testing, |
|
|
|
28:54.720 --> 28:57.000 |
|
and then some part of it will be on road data. |
|
|
|
28:58.200 --> 29:03.200 |
|
And I think the way we'll ultimately convey this |
|
|
|
29:03.320 --> 29:06.800 |
|
to the public is there'll be clearly some conversation |
|
|
|
29:06.800 --> 29:08.240 |
|
with the public about it, |
|
|
|
29:08.240 --> 29:12.080 |
|
but we'll kind of invoke the trusted nodes
|
|
|
29:12.080 --> 29:14.360 |
|
and that we'll spend more time being able to go |
|
|
|
29:14.360 --> 29:17.280 |
|
into more depth with folks like NHTSA |
|
|
|
29:17.280 --> 29:19.760 |
|
and other federal and state regulatory bodies |
|
|
|
29:19.760 --> 29:22.600 |
|
and kind of given that they are operating |
|
|
|
29:22.600 --> 29:25.120 |
|
in the public interest and they're trusted |
|
|
|
29:26.240 --> 29:28.680 |
|
that if we can show enough work to them |
|
|
|
29:28.680 --> 29:30.040 |
|
that they're convinced, |
|
|
|
29:30.040 --> 29:33.840 |
|
then I think we're in a pretty good place. |
|
|
|
29:33.840 --> 29:35.040 |
|
That means that you work with people |
|
|
|
29:35.040 --> 29:36.960 |
|
that are essentially experts at safety |
|
|
|
29:36.960 --> 29:39.040 |
|
to try to discuss and show, |
|
|
|
29:39.040 --> 29:41.800 |
|
do you think the answer is probably no, |
|
|
|
29:41.800 --> 29:44.360 |
|
but just in case, do you think there exists a metric? |
|
|
|
29:44.360 --> 29:46.360 |
|
So currently people have been using |
|
|
|
29:46.360 --> 29:48.200 |
|
the number of disengagements.
|
|
|
29:48.200 --> 29:50.160 |
|
And it quickly turns into a marketing scheme |
|
|
|
29:50.160 --> 29:54.320 |
|
where you sort of alter the experiments you run.
|
|
|
29:54.320 --> 29:56.320 |
|
I think you've spoken about how you don't like it.
|
|
|
29:56.320 --> 29:57.160 |
|
Don't love it. |
|
|
|
29:57.160 --> 29:59.720 |
|
No, in fact, I was on the record telling DMV |
|
|
|
29:59.720 --> 30:02.000 |
|
that I thought this was not a great metric. |
|
|
|
30:02.000 --> 30:05.360 |
|
Do you think it's possible to create a metric, |
|
|
|
30:05.360 --> 30:09.480 |
|
a number that could demonstrate safety |
|
|
|
30:09.480 --> 30:12.400 |
|
outside of fatalities? |
|
|
|
30:12.400 --> 30:16.640 |
|
So I do and I think that it won't be just one number. |
|
|
|
30:16.640 --> 30:21.320 |
|
So what we are internally grappling with,
|
|
|
30:21.320 --> 30:23.600 |
|
and at some point we'll be able to talk |
|
|
|
30:23.600 --> 30:25.080 |
|
more publicly about it, |
|
|
|
30:25.080 --> 30:28.560 |
|
is how do we think about human performance |
|
|
|
30:28.560 --> 30:32.200 |
|
in different tasks, say detecting traffic lights |
|
|
|
30:32.200 --> 30:36.240 |
|
or safely making a left turn across traffic? |
|
|
|
30:37.720 --> 30:40.040 |
|
And what do we think the failure rates |
|
|
|
30:40.040 --> 30:42.520 |
|
are for those different capabilities for people? |
|
|
|
30:42.520 --> 30:44.760 |
|
And then demonstrating to ourselves |
|
|
|
30:44.760 --> 30:48.480 |
|
and then ultimately folks in regulatory role |
|
|
|
30:48.480 --> 30:50.760 |
|
and then ultimately the public, |
|
|
|
30:50.760 --> 30:52.400 |
|
that we have confidence that our system |
|
|
|
30:52.400 --> 30:54.800 |
|
will work better than that. |
|
|
|
30:54.800 --> 30:57.040 |
|
And so these individual metrics |
|
|
|
30:57.040 --> 31:00.720 |
|
will kind of tell a compelling story ultimately. |
|
|
|
31:01.760 --> 31:03.920 |
|
I do think at the end of the day, |
|
|
|
31:03.920 --> 31:06.640 |
|
what we care about in terms of safety |
|
|
|
31:06.640 --> 31:11.640 |
|
is lives saved and injuries reduced.
|
|
|
31:11.640 --> 31:15.320 |
|
And then ultimately kind of casualty dollars |
|
|
|
31:16.440 --> 31:19.360 |
|
that people aren't having to pay to get their car fixed. |
|
|
|
31:19.360 --> 31:22.680 |
|
And I do think that in aviation, |
|
|
|
31:22.680 --> 31:25.880 |
|
they look at a kind of an event pyramid |
|
|
|
31:25.880 --> 31:28.600 |
|
where a crash is at the top of that |
|
|
|
31:28.600 --> 31:30.440 |
|
and that's the worst event obviously. |
|
|
|
31:30.440 --> 31:34.240 |
|
And then there's injuries and near miss events and whatnot |
|
|
|
31:34.240 --> 31:37.320 |
|
and violation of operating procedures. |
|
|
|
31:37.320 --> 31:40.160 |
|
And you kind of build a statistical model |
|
|
|
31:40.160 --> 31:44.440 |
|
of the relevance of the low severity things |
|
|
|
31:44.440 --> 31:45.280 |
|
and the high severity things. |
|
|
|
31:45.280 --> 31:46.120 |
|
And I think that's something |
|
|
|
31:46.120 --> 31:48.240 |
|
where we'll be able to look at as well |
|
|
|
31:48.240 --> 31:51.920 |
|
because an event per 85 million miles |
|
|
|
31:51.920 --> 31:54.480 |
|
is statistically a difficult thing |
|
|
|
31:54.480 --> 31:59.440 |
|
even at the scale of the US to kind of compare directly. |
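[A hedged sketch of the aviation-style event pyramid idea described here; every number is invented for illustration:]

```python
# Hedged sketch: inferring rare, high-severity event rates from frequent,
# low-severity ones via an assumed pyramid ratio. All numbers are invented.

near_misses_observed = 40          # low-severity events logged in a test program
miles_observed = 2_000_000         # miles over which they were logged
near_misses_per_crash = 500        # assumed pyramid ratio (invented)

near_miss_rate = near_misses_observed / miles_observed        # per mile
implied_crash_rate = near_miss_rate / near_misses_per_crash   # per mile

print(f"observed near-miss rate: 1 per {1 / near_miss_rate:,.0f} miles")
print(f"implied crash rate:      1 per {1 / implied_crash_rate:,.0f} miles")
```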
|
|
|
31:59.440 --> 32:02.280 |
|
And that event, the fatality that's connected |
|
|
|
32:02.280 --> 32:07.280 |
|
to an autonomous vehicle is significantly, |
|
|
|
32:07.480 --> 32:09.160 |
|
at least currently magnified |
|
|
|
32:09.160 --> 32:12.320 |
|
in the amount of attention it gets.
|
|
|
32:12.320 --> 32:15.080 |
|
So that speaks to public perception. |
|
|
|
32:15.080 --> 32:16.720 |
|
I think the most popular topic |
|
|
|
32:16.720 --> 32:19.520 |
|
about autonomous vehicles in the public |
|
|
|
32:19.520 --> 32:23.080 |
|
is the trolley problem formulation, right? |
|
|
|
32:23.080 --> 32:27.040 |
|
Which has, let's not get into that too much |
|
|
|
32:27.040 --> 32:29.600 |
|
but is misguided in many ways. |
|
|
|
32:29.600 --> 32:32.320 |
|
But it speaks to the fact that people are grappling |
|
|
|
32:32.320 --> 32:36.160 |
|
with this idea of giving control over to a machine. |
|
|
|
32:36.160 --> 32:41.160 |
|
So how do you win the hearts and minds of the people |
|
|
|
32:41.560 --> 32:43.600 |
|
and convince them that autonomy is something
|
|
|
32:43.600 --> 32:45.480 |
|
that could be a part of their lives? |
|
|
|
32:45.480 --> 32:47.640 |
|
I think you let them experience it, right? |
|
|
|
32:47.640 --> 32:50.440 |
|
I think it's right. |
|
|
|
32:50.440 --> 32:52.720 |
|
I think people should be skeptical. |
|
|
|
32:52.720 --> 32:55.680 |
|
I think people should ask questions. |
|
|
|
32:55.680 --> 32:57.000 |
|
I think they should doubt |
|
|
|
32:58.040 --> 33:00.960 |
|
because this is something new and different. |
|
|
|
33:00.960 --> 33:01.960 |
|
They haven't touched it yet. |
|
|
|
33:01.960 --> 33:03.680 |
|
And I think it's perfectly reasonable. |
|
|
|
33:03.680 --> 33:07.360 |
|
But at the same time,
|
|
|
33:07.360 --> 33:09.360 |
|
it's clear there's an opportunity to make the road safer. |
|
|
|
33:09.360 --> 33:12.480 |
|
It's clear that we can improve access to mobility. |
|
|
|
33:12.480 --> 33:15.160 |
|
It's clear that we can reduce the cost of mobility. |
|
|
|
33:16.680 --> 33:19.520 |
|
And that once people try that |
|
|
|
33:19.520 --> 33:22.800 |
|
and understand that it's safe |
|
|
|
33:22.800 --> 33:24.480 |
|
and are able to use it in their daily lives,
|
|
|
33:24.480 --> 33:28.080 |
|
I think it's one of these things that will just be obvious. |
|
|
|
33:28.080 --> 33:32.280 |
|
And I've seen this practically in demonstrations |
|
|
|
33:32.280 --> 33:35.640 |
|
that I've given where I've had people come in |
|
|
|
33:35.640 --> 33:38.600 |
|
and they're very skeptical. |
|
|
|
33:38.600 --> 33:39.960 |
|
And they get in the vehicle. |
|
|
|
33:39.960 --> 33:42.640 |
|
My favorite one is taking somebody out on the freeway |
|
|
|
33:42.640 --> 33:46.080 |
|
and we're on the 101 driving at 65 miles an hour.
|
|
|
33:46.080 --> 33:48.560 |
|
And after 10 minutes, they kind of turn and ask, |
|
|
|
33:48.560 --> 33:49.560 |
|
is that all it does? |
|
|
|
33:49.560 --> 33:52.160 |
|
And you're like, it's a self driving car.
|
|
|
33:52.160 --> 33:54.920 |
|
I'm not sure exactly what you thought it would do, right? |
|
|
|
33:54.920 --> 33:57.960 |
|
But it becomes mundane, |
|
|
|
33:58.920 --> 34:01.560 |
|
which is exactly what you want a technology
|
|
|
34:01.560 --> 34:02.760 |
|
like this to be, right? |
|
|
|
34:02.760 --> 34:04.680 |
|
We don't really... |
|
|
|
34:04.680 --> 34:07.320 |
|
When I turn the light switch on in here, |
|
|
|
34:07.320 --> 34:12.040 |
|
I don't think about the complexity of those electrons |
|
|
|
34:12.040 --> 34:14.240 |
|
being pushed down a wire from wherever it was |
|
|
|
34:14.240 --> 34:15.880 |
|
being generated.
|
|
|
34:15.880 --> 34:19.120 |
|
It's like, I just get annoyed if it doesn't work, right? |
|
|
|
34:19.120 --> 34:21.440 |
|
And what I value is the fact |
|
|
|
34:21.440 --> 34:23.120 |
|
that I can do other things in this space. |
|
|
|
34:23.120 --> 34:24.600 |
|
I can see my colleagues. |
|
|
|
34:24.600 --> 34:26.200 |
|
I can read stuff on paper.
|
|
|
34:26.200 --> 34:29.240 |
|
I can not be afraid of the dark. |
|
|
|
34:29.240 --> 34:32.840 |
|
And I think that's what we want this technology to be like |
|
|
|
34:32.840 --> 34:34.160 |
|
is it's in the background |
|
|
|
34:34.160 --> 34:36.520 |
|
and people get to have those life experiences |
|
|
|
34:36.520 --> 34:37.880 |
|
and do so safely. |
|
|
|
34:37.880 --> 34:41.600 |
|
So putting this technology in the hands of people |
|
|
|
34:41.600 --> 34:45.800 |
|
speaks to scale of deployment, right? |
|
|
|
34:45.800 --> 34:50.360 |
|
So, the dreaded question about the future,
|
|
|
34:50.360 --> 34:52.840 |
|
because nobody can predict the future.
|
|
|
34:52.840 --> 34:57.080 |
|
But just maybe speak poetically about |
|
|
|
34:57.080 --> 35:00.600 |
|
when do you think we'll see a large scale deployment |
|
|
|
35:00.600 --> 35:05.600 |
|
of autonomous vehicles, 10,000, those kinds of numbers. |
|
|
|
35:06.360 --> 35:08.280 |
|
We'll see that within 10 years. |
|
|
|
35:09.280 --> 35:10.600 |
|
I'm pretty confident. |
|
|
|
35:10.600 --> 35:11.920 |
|
We... |
|
|
|
35:13.920 --> 35:15.920 |
|
What's an impressive scale? |
|
|
|
35:15.920 --> 35:19.000 |
|
What moment, so you've done the DARPA Challenge |
|
|
|
35:19.000 --> 35:20.240 |
|
where there's one vehicle, |
|
|
|
35:20.240 --> 35:22.000 |
|
at which moment does it become, |
|
|
|
35:22.000 --> 35:23.720 |
|
wow, this is serious scale? |
|
|
|
35:23.720 --> 35:27.960 |
|
So I think the moment it gets serious is when |
|
|
|
35:27.960 --> 35:32.040 |
|
we really do have a driverless vehicle |
|
|
|
35:32.040 --> 35:33.880 |
|
operating on public roads |
|
|
|
35:34.760 --> 35:37.760 |
|
and that we can do that kind of continuously. |
|
|
|
35:37.760 --> 35:38.640 |
|
Without a safety driver? |
|
|
|
35:38.640 --> 35:40.240 |
|
Without a safety driver in the vehicle. |
|
|
|
35:40.240 --> 35:41.320 |
|
I think at that moment, |
|
|
|
35:41.320 --> 35:44.160 |
|
we've kind of crossed the zero to one threshold. |
|
|
|
35:45.720 --> 35:50.000 |
|
And then it is about how do we continue to scale that? |
|
|
|
35:50.000 --> 35:53.720 |
|
How do we build the right business models? |
|
|
|
35:53.720 --> 35:56.040 |
|
How do we build the right customer experience around it |
|
|
|
35:56.040 --> 35:59.680 |
|
so that it is actually a useful product out in the world? |
|
|
|
36:00.720 --> 36:03.360 |
|
And I think that is really, |
|
|
|
36:03.360 --> 36:05.720 |
|
at that point, it moves from a, |
|
|
|
36:05.720 --> 36:08.960 |
|
what is this kind of mixed science engineering project |
|
|
|
36:08.960 --> 36:12.120 |
|
into engineering and commercialization |
|
|
|
36:12.120 --> 36:15.600 |
|
and really starting to deliver on the value |
|
|
|
36:15.600 --> 36:18.000 |
|
that we all see here. |
|
|
|
36:18.000 --> 36:20.680 |
|
And actually making that real in the world. |
|
|
|
36:20.680 --> 36:22.240 |
|
What do you think that deployment looks like? |
|
|
|
36:22.240 --> 36:24.920 |
|
Where do we first see the inkling of |
|
|
|
36:24.920 --> 36:28.600 |
|
no safety driver, one or two cars here and there? |
|
|
|
36:28.600 --> 36:29.760 |
|
Is it on the highway? |
|
|
|
36:29.760 --> 36:33.200 |
|
Is it in specific routes in the urban environment? |
|
|
|
36:33.200 --> 36:36.960 |
|
I think it's gonna be urban, suburban type environments. |
|
|
|
36:37.920 --> 36:38.920 |
|
You know, with Aurora, |
|
|
|
36:38.920 --> 36:41.560 |
|
when we thought about how to tackle this, |
|
|
|
36:42.400 --> 36:45.040 |
|
it was kind of in vogue to think about trucking
|
|
|
36:46.000 --> 36:47.760 |
|
as opposed to urban driving. |
|
|
|
36:47.760 --> 36:51.240 |
|
And again, the human intuition around this |
|
|
|
36:51.240 --> 36:55.360 |
|
is that freeways are easier to drive on |
|
|
|
36:57.040 --> 36:59.240 |
|
because everybody's kind of going in the same direction |
|
|
|
36:59.240 --> 37:01.560 |
|
and lanes are a little wider, et cetera. |
|
|
|
37:01.560 --> 37:03.280 |
|
And I think that that intuition is pretty good, |
|
|
|
37:03.280 --> 37:06.000 |
|
except we don't really care about most of the time. |
|
|
|
37:06.000 --> 37:08.360 |
|
We care about all of the time. |
|
|
|
37:08.360 --> 37:10.840 |
|
And when you're driving on a freeway with a truck, |
|
|
|
37:10.840 --> 37:15.840 |
|
say 70 miles an hour and you've got a 70,000 pound load
|
|
|
37:15.840 --> 37:17.840 |
|
to tow with you, that's just an incredible amount |
|
|
|
37:17.840 --> 37:18.840 |
|
of kinetic energy. |
|
|
|
37:18.840 --> 37:21.440 |
|
And so when that goes wrong, it goes really wrong. |
|
|
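To make the kinetic energy point concrete, here is a minimal back-of-the-envelope sketch in Python. The 70,000 pound load at 70 miles an hour comes from the conversation; the 4,000 pound car at 25 miles an hour is an assumed comparison point, not a figure Chris gives.

    def kinetic_energy_joules(mass_kg: float, speed_mps: float) -> float:
        """KE = 1/2 * m * v^2, in joules."""
        return 0.5 * mass_kg * speed_mps ** 2

    LB_TO_KG = 0.45359237
    MPH_TO_MPS = 0.44704

    # The loaded truck Chris describes: roughly 70,000 lb moving at 70 mph.
    truck_ke = kinetic_energy_joules(70_000 * LB_TO_KG, 70 * MPH_TO_MPS)

    # An assumed ~4,000 lb passenger car at the 25 mph urban speed mentioned later.
    car_ke = kinetic_energy_joules(4_000 * LB_TO_KG, 25 * MPH_TO_MPS)

    print(f"truck at 70 mph: {truck_ke / 1e6:.1f} MJ")  # roughly 15.5 MJ
    print(f"car at 25 mph:   {car_ke / 1e3:.0f} kJ")    # roughly 113 kJ, ~140x less

Roughly two orders of magnitude more energy to dissipate is the gap behind "when that goes wrong, it goes really wrong."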
|
37:22.600 --> 37:27.600 |
|
And those challenges that you see occur more rarely, |
|
|
|
37:27.760 --> 37:31.040 |
|
so you don't get to learn as quickly. |
|
|
|
37:31.040 --> 37:33.640 |
|
And they're incrementally more difficult |
|
|
|
37:33.640 --> 37:35.920 |
|
than urban driving, but they're not easier |
|
|
|
37:35.920 --> 37:37.400 |
|
than urban driving. |
|
|
|
37:37.400 --> 37:41.600 |
|
And so I think this happens in moderate speed, |
|
|
|
37:41.600 --> 37:43.840 |
|
urban environments, because there, |
|
|
|
37:43.840 --> 37:46.560 |
|
if two vehicles crash at 25 miles per hour, |
|
|
|
37:46.560 --> 37:50.040 |
|
it's not good, but probably everybody walks away. |
|
|
|
37:51.000 --> 37:53.680 |
|
And those events where there's the possibility |
|
|
|
37:53.680 --> 37:55.720 |
|
for that occurring happen frequently. |
|
|
|
37:55.720 --> 37:57.920 |
|
So we get to learn more rapidly. |
|
|
|
37:57.920 --> 38:01.320 |
|
We get to do that with lower risk for everyone. |
|
|
|
38:02.440 --> 38:04.280 |
|
And then we can deliver value to people |
|
|
|
38:04.280 --> 38:05.800 |
|
that need to get from one place to another. |
|
|
|
38:05.800 --> 38:08.160 |
|
And then once we've got that solved, |
|
|
|
38:08.160 --> 38:10.000 |
|
then the kind of the freeway driving part of this |
|
|
|
38:10.000 --> 38:12.440 |
|
just falls out, but we're able to learn |
|
|
|
38:12.440 --> 38:15.160 |
|
more safely, more quickly in the urban environment. |
|
|
|
38:15.160 --> 38:18.480 |
|
So 10 years and then scale 20, 30 years. |
|
|
|
38:18.480 --> 38:21.440 |
|
I mean, who knows, if a sufficiently compelling |
|
|
|
38:21.440 --> 38:24.320 |
|
experience is created, it could be faster or slower. |
|
|
|
38:24.320 --> 38:27.120 |
|
Do you think there could be breakthroughs |
|
|
|
38:27.120 --> 38:29.880 |
|
and what kind of breakthroughs might there be |
|
|
|
38:29.880 --> 38:32.360 |
|
that completely change that timeline? |
|
|
|
38:32.360 --> 38:35.320 |
|
Again, not only am I asking to predict the future, |
|
|
|
38:35.320 --> 38:37.280 |
|
I'm asking you to predict breakthroughs |
|
|
|
38:37.280 --> 38:38.280 |
|
that haven't happened yet. |
|
|
|
38:38.280 --> 38:41.800 |
|
So what's the, I think another way to ask that would be |
|
|
|
38:41.800 --> 38:44.240 |
|
if I could wave a magic wand, |
|
|
|
38:44.240 --> 38:46.640 |
|
what part of the system would I make work today |
|
|
|
38:46.640 --> 38:48.600 |
|
to accelerate it as quickly as possible? |
|
|
|
38:48.600 --> 38:49.440 |
|
Right. |
|
|
|
38:52.080 --> 38:54.080 |
|
Don't say infrastructure, please don't say infrastructure. |
|
|
|
38:54.080 --> 38:56.280 |
|
No, it's definitely not infrastructure. |
|
|
|
38:56.280 --> 39:00.520 |
|
It's really that perception forecasting capability. |
|
|
|
39:00.520 --> 39:04.760 |
|
So if tomorrow you could give me a perfect model |
|
|
|
39:04.760 --> 39:07.520 |
|
of what's happening and what will happen |
|
|
|
39:07.520 --> 39:11.400 |
|
for the next five seconds around a vehicle |
|
|
|
39:11.400 --> 39:14.480 |
|
on the roadway, that would accelerate things |
|
|
|
39:14.480 --> 39:15.320 |
|
pretty dramatically. |
|
|
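As a rough illustration of what that "perfect model" would have to produce, here is a hypothetical sketch: for each agent near the vehicle, a predicted trajectory over a five second horizon. The type names and the constant velocity baseline are illustrative assumptions, not Aurora's interfaces.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class AgentState:
        agent_id: int
        kind: str        # "car", "pedestrian", "cyclist", ...
        x_m: float       # position in the vehicle frame, meters
        y_m: float
        vx_mps: float    # velocity, meters per second
        vy_mps: float

    @dataclass
    class Forecast:
        agent_id: int
        dt_s: float                # spacing between predicted states
        states: List[AgentState]   # predicted states at t = dt, 2*dt, ...
        confidence: float          # how much a planner should trust this trajectory

    def constant_velocity_forecast(agent: AgentState,
                                   horizon_s: float = 5.0,
                                   dt_s: float = 0.5) -> Forecast:
        """Toy baseline: assume the agent keeps its current velocity."""
        states = []
        for i in range(1, int(horizon_s / dt_s) + 1):
            t = i * dt_s
            states.append(AgentState(agent.agent_id, agent.kind,
                                     agent.x_m + agent.vx_mps * t,
                                     agent.y_m + agent.vy_mps * t,
                                     agent.vx_mps, agent.vy_mps))
        return Forecast(agent.agent_id, dt_s, states, confidence=0.3)

The gap between a crude baseline like this and the "perfect" five second forecast is exactly the part he says would accelerate things most.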
|
39:15.320 --> 39:17.560 |
|
What keeps you up at night? |
|
|
|
39:17.560 --> 39:21.680 |
|
Are you mostly bothered by cars, pedestrians, or cyclists? |
|
|
|
39:21.680 --> 39:25.920 |
|
So I worry most about the vulnerable road users |
|
|
|
39:25.920 --> 39:28.000 |
|
about the combination of cyclists and pedestrians, right? |
|
|
|
39:28.000 --> 39:29.480 |
|
Just cyclists and pedestrians |
|
|
|
39:29.480 --> 39:31.880 |
|
because they're not in armor. |
|
|
|
39:33.240 --> 39:36.480 |
|
The cars, they're bigger, they've got protection |
|
|
|
39:36.480 --> 39:39.440 |
|
for the people and so the ultimate risk is lower there. |
|
|
|
39:39.440 --> 39:44.080 |
|
Whereas a pedestrian or cyclist, they're out on the road |
|
|
|
39:44.080 --> 39:46.520 |
|
and they don't have any protection. |
|
|
|
39:46.520 --> 39:49.760 |
|
And so we need to pay extra attention to that. |
|
|
|
39:49.760 --> 39:54.120 |
|
Do you think about a very difficult technical challenge |
|
|
|
39:55.760 --> 39:58.560 |
|
of the fact that pedestrians, |
|
|
|
39:58.560 --> 40:01.400 |
|
if you try to protect pedestrians by being careful |
|
|
|
40:01.400 --> 40:04.600 |
|
and slow, they'll take advantage of that. |
|
|
|
40:04.600 --> 40:07.560 |
|
So the game theoretic dance. |
|
|
|
40:07.560 --> 40:10.880 |
|
Does that worry you from a technical perspective |
|
|
|
40:10.880 --> 40:12.520 |
|
how we solve that? |
|
|
|
40:12.520 --> 40:14.600 |
|
Because as humans, the way we solve that |
|
|
|
40:14.600 --> 40:17.280 |
|
is to kind of nudge our way through the pedestrians, |
|
|
|
40:17.280 --> 40:20.040 |
|
which doesn't feel from a technical perspective |
|
|
|
40:20.040 --> 40:22.320 |
|
like an appropriate algorithm. |
|
|
|
40:23.240 --> 40:25.960 |
|
But do you think about how we solve that problem? |
|
|
|
40:25.960 --> 40:30.960 |
|
Yeah, I think there's two different concepts there. |
|
|
|
40:31.400 --> 40:35.880 |
|
So one is, am I worried that because these vehicles |
|
|
|
40:35.880 --> 40:37.640 |
|
are self driving, people will kind of step on the road |
|
|
|
40:37.640 --> 40:38.680 |
|
and take advantage of them. |
|
|
|
40:38.680 --> 40:43.680 |
|
And I've heard this and I don't really believe it |
|
|
|
40:43.800 --> 40:46.000 |
|
because if I'm driving down the road |
|
|
|
40:46.000 --> 40:48.920 |
|
and somebody steps in front of me, I'm going to stop. |
|
|
|
40:48.920 --> 40:49.760 |
|
Right? |
|
|
|
40:49.760 --> 40:53.720 |
|
Like even if I'm annoyed, I'm not gonna just drive |
|
|
|
40:53.720 --> 40:55.200 |
|
through a person standing on the road. |
|
|
|
40:55.200 --> 40:56.440 |
|
Right. |
|
|
|
40:56.440 --> 41:00.440 |
|
And so I think today people can take advantage of this |
|
|
|
41:00.440 --> 41:02.600 |
|
and you do see some people do it. |
|
|
|
41:02.600 --> 41:04.200 |
|
I guess there's an incremental risk |
|
|
|
41:04.200 --> 41:05.920 |
|
because maybe they have lower confidence |
|
|
|
41:05.920 --> 41:06.760 |
|
that I'm going to see them |
|
|
|
41:06.760 --> 41:09.360 |
|
than they might have for an automated vehicle. |
|
|
|
41:09.360 --> 41:12.080 |
|
And so maybe that shifts it a little bit. |
|
|
|
41:12.080 --> 41:14.400 |
|
But I think people don't want to get hit by cars. |
|
|
|
41:14.400 --> 41:17.120 |
|
And so I think that I'm not that worried |
|
|
|
41:17.120 --> 41:18.800 |
|
about people walking out onto the 101 |
|
|
|
41:18.800 --> 41:21.840 |
|
and creating chaos more than they would today. |
|
|
|
41:24.400 --> 41:27.040 |
|
Regarding kind of the nudging through a big stream |
|
|
|
41:27.040 --> 41:30.040 |
|
of pedestrians leaving a concert or something. |
|
|
|
41:30.040 --> 41:33.480 |
|
I think that is further down the technology pipeline. |
|
|
|
41:33.480 --> 41:36.920 |
|
I think that you're right, that's tricky. |
|
|
|
41:36.920 --> 41:38.560 |
|
I don't think it's necessarily, |
|
|
|
41:40.320 --> 41:43.360 |
|
I think the algorithm people use for this is pretty simple. |
|
|
|
41:43.360 --> 41:44.200 |
|
Right? |
|
|
|
41:44.200 --> 41:45.040 |
|
It's kind of just move forward slowly |
|
|
|
41:45.040 --> 41:47.600 |
|
and if somebody's really close, then stop. |
|
|
|
41:47.600 --> 41:50.840 |
|
And I think that that probably can be replicated |
|
|
|
41:50.840 --> 41:54.040 |
|
pretty easily and particularly given that it's, |
|
|
|
41:54.040 --> 41:57.240 |
|
you don't do this at 30 miles an hour, you do it at one, |
|
|
|
41:57.240 --> 41:59.080 |
|
that even in those situations, |
|
|
|
41:59.080 --> 42:01.200 |
|
the risk is relatively minimal. |
|
|
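A minimal sketch of the creep-and-stop behavior described here, assuming a single distance-to-nearest-pedestrian input; the speeds and thresholds are made-up illustrative values, not anything Aurora uses.

    CREEP_SPEED_MPS = 0.5     # roughly the "you do it at one" mile-an-hour pace
    STOP_DISTANCE_M = 1.5     # stop if anyone is inside this standoff bubble
    RESUME_DISTANCE_M = 2.5   # hysteresis so the vehicle doesn't stutter at the boundary

    def creep_command(nearest_pedestrian_m: float, currently_stopped: bool) -> float:
        """Return a target speed in m/s: move forward slowly, stop when someone is close."""
        if nearest_pedestrian_m < STOP_DISTANCE_M:
            return 0.0
        if currently_stopped and nearest_pedestrian_m < RESUME_DISTANCE_M:
            return 0.0  # wait until people have clearly moved away before creeping again
        return CREEP_SPEED_MPS

At walking pace the worst case is benign, which is why the risk stays low even while the policy is this simple.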
|
42:01.200 --> 42:03.440 |
|
But it's not something we're thinking |
|
|
|
42:03.440 --> 42:04.560 |
|
about in any serious way. |
|
|
|
42:04.560 --> 42:08.000 |
|
And probably that's less an algorithm problem |
|
|
|
42:08.000 --> 42:10.160 |
|
and more about creating a human experience. |
|
|
|
42:10.160 --> 42:14.320 |
|
So the HCI people that create a visual display |
|
|
|
42:14.320 --> 42:16.280 |
|
so that you, as a pedestrian, are pleasantly |
|
|
|
42:16.280 --> 42:17.680 |
|
nudged out of the way. |
|
|
|
42:17.680 --> 42:21.960 |
|
That's an experience problem, not an algorithm problem. |
|
|
|
42:22.880 --> 42:25.480 |
|
Who's the main competitor to Aurora today? |
|
|
|
42:25.480 --> 42:28.600 |
|
And how do you out compete them in the long run? |
|
|
|
42:28.600 --> 42:31.200 |
|
So we really focus a lot on what we're doing here. |
|
|
|
42:31.200 --> 42:34.440 |
|
I think that, I've said this a few times |
|
|
|
42:34.440 --> 42:37.960 |
|
that this is a huge difficult problem |
|
|
|
42:37.960 --> 42:40.280 |
|
and it's great that a bunch of companies are tackling it |
|
|
|
42:40.280 --> 42:42.320 |
|
because I think it's so important for society |
|
|
|
42:42.320 --> 42:43.760 |
|
that somebody gets there. |
|
|
|
42:45.200 --> 42:49.040 |
|
So we don't spend a whole lot of time |
|
|
|
42:49.040 --> 42:51.560 |
|
like thinking tactically about who's out there |
|
|
|
42:51.560 --> 42:55.480 |
|
and how do we beat that person individually? |
|
|
|
42:55.480 --> 42:58.680 |
|
What are we trying to do to go faster ultimately? |
|
|
|
42:58.680 --> 43:02.600 |
|
Well, part of it is the leadership team we have |
|
|
|
43:02.600 --> 43:04.160 |
|
has got pretty tremendous experience. |
|
|
|
43:04.160 --> 43:06.400 |
|
And so we kind of understand the landscape |
|
|
|
43:06.400 --> 43:09.120 |
|
and understand where the cul de sacs are to some degree. |
|
|
|
43:09.120 --> 43:10.920 |
|
And we try and avoid those. |
|
|
|
43:12.600 --> 43:14.240 |
|
I think part of it is |
|
|
|
43:14.240 --> 43:16.240 |
|
just this great team we've built. |
|
|
|
43:16.240 --> 43:19.040 |
|
People, this is a technology and a company |
|
|
|
43:19.040 --> 43:22.280 |
|
that people believe in the mission of. |
|
|
|
43:22.280 --> 43:24.760 |
|
And so it allows us to attract just awesome people |
|
|
|
43:24.760 --> 43:25.680 |
|
to go work. |
|
|
|
43:26.760 --> 43:28.000 |
|
We've got a culture, I think, |
|
|
|
43:28.000 --> 43:30.440 |
|
that people appreciate, that allows them to focus, |
|
|
|
43:30.440 --> 43:33.080 |
|
allows them to really spend time solving problems. |
|
|
|
43:33.080 --> 43:35.880 |
|
And I think that keeps them energized. |
|
|
|
43:35.880 --> 43:40.880 |
|
And then we've invested heavily in the infrastructure |
|
|
|
43:43.520 --> 43:46.520 |
|
and architectures that we think will ultimately accelerate us. |
|
|
|
43:46.520 --> 43:50.640 |
|
So because of the folks we're able to bring in early on, |
|
|
|
43:50.640 --> 43:53.520 |
|
because of the great investors we have, |
|
|
|
43:53.520 --> 43:56.760 |
|
we don't spend all of our time doing demos |
|
|
|
43:56.760 --> 43:58.680 |
|
and kind of leaping from one demo to the next. |
|
|
|
43:58.680 --> 44:02.800 |
|
We've been given the freedom to invest in |
|
|
|
44:03.960 --> 44:05.480 |
|
infrastructure to do machine learning, |
|
|
|
44:05.480 --> 44:08.600 |
|
infrastructure to pull data from our on road testing, |
|
|
|
44:08.600 --> 44:11.480 |
|
infrastructure to use that to accelerate engineering. |
|
|
|
44:11.480 --> 44:14.480 |
|
And I think that early investment |
|
|
|
44:14.480 --> 44:17.320 |
|
and continuing investment in those kind of tools |
|
|
|
44:17.320 --> 44:19.400 |
|
will ultimately allow us to accelerate |
|
|
|
44:19.400 --> 44:21.920 |
|
and do something pretty incredible. |
|
|
|
44:21.920 --> 44:23.400 |
|
Chris, beautifully put. |
|
|
|
44:23.400 --> 44:24.640 |
|
It's a good place to end. |
|
|
|
44:24.640 --> 44:26.520 |
|
Thank you so much for talking today. |
|
|
|
44:26.520 --> 44:27.360 |
|
Thank you very much. |
|
|
|
44:27.360 --> 44:57.200 |
|
I hope you enjoyed it. |
|
|
|
|