|
WEBVTT |
|
|
|
00:00.000 --> 00:02.240 |
|
The following is a conversation with Kyle Vogt. |
|
|
|
00:02.240 --> 00:05.120 |
|
He's the president and the CTO of Cruise Automation, |
|
|
|
00:05.120 --> 00:08.000 |
|
leading an effort to solve one of the biggest |
|
|
|
00:08.000 --> 00:10.880 |
|
robotics challenges of our time, vehicle automation. |
|
|
|
00:10.880 --> 00:13.120 |
|
He's a cofounder of two successful companies, |
|
|
|
00:13.120 --> 00:17.040 |
|
Twitch and Cruise, that have each sold for a billion dollars. |
|
|
|
00:17.040 --> 00:19.880 |
|
And he's a great example of the innovative spirit |
|
|
|
00:19.880 --> 00:22.160 |
|
that flourishes in Silicon Valley. |
|
|
|
00:22.160 --> 00:25.760 |
|
And he's now facing an interesting and exciting challenge
|
|
|
00:25.760 --> 00:30.040 |
|
of matching that spirit with the mass production |
|
|
|
00:30.040 --> 00:32.800 |
|
and the safety centered culture of a major automaker, |
|
|
|
00:32.800 --> 00:34.440 |
|
like General Motors. |
|
|
|
00:34.440 --> 00:36.520 |
|
This conversation is part of the MIT |
|
|
|
00:36.520 --> 00:38.560 |
|
Artificial General Intelligence series |
|
|
|
00:38.560 --> 00:41.040 |
|
and the Artificial Intelligence podcast. |
|
|
|
00:41.040 --> 00:44.840 |
|
If you enjoy it, please subscribe on YouTube, iTunes, |
|
|
|
00:44.840 --> 00:47.640 |
|
or simply connect with me on Twitter at Lex Fridman,
|
|
|
00:47.640 --> 00:49.800 |
|
spelled F R I D. |
|
|
|
00:49.800 --> 00:53.480 |
|
And now here's my conversation with Kyle Vogt. |
|
|
|
00:53.480 --> 00:55.600 |
|
You grew up in Kansas, right? |
|
|
|
00:55.600 --> 00:58.080 |
|
Yeah, and I just saw that picture you had pulled up
|
|
|
00:58.080 --> 01:00.400 |
|
there, so I'm a little bit worried about that now. |
|
|
|
01:00.400 --> 01:02.480 |
|
So in high school in Kansas City, |
|
|
|
01:02.480 --> 01:07.200 |
|
you joined Shawnee Mission North High School Robotics Team. |
|
|
|
01:07.200 --> 01:09.120 |
|
Now that wasn't your high school. |
|
|
|
01:09.120 --> 01:09.960 |
|
That's right. |
|
|
|
01:09.960 --> 01:13.920 |
|
That was the only high school in the area that had a teacher |
|
|
|
01:13.920 --> 01:16.080 |
|
who was willing to sponsor a FIRST Robotics team.
|
|
|
01:16.080 --> 01:18.360 |
|
I was gonna troll you a little bit. |
|
|
|
01:18.360 --> 01:20.320 |
|
Jog your memory a little bit with that pic.
|
|
|
01:20.320 --> 01:22.880 |
|
I was trying to look super cool and intense. |
|
|
|
01:22.880 --> 01:23.720 |
|
You did? |
|
|
|
01:23.720 --> 01:25.680 |
|
Because this was BattleBots, this is serious business. |
|
|
|
01:25.680 --> 01:28.840 |
|
So we're standing there with a welded steel frame |
|
|
|
01:28.840 --> 01:30.240 |
|
and looking tough. |
|
|
|
01:30.240 --> 01:31.800 |
|
So go back there. |
|
|
|
01:31.800 --> 01:33.840 |
|
What was it that drew you to robotics?
|
|
|
01:33.840 --> 01:36.480 |
|
Well, I think, I've been trying to figure this out |
|
|
|
01:36.480 --> 01:37.920 |
|
for a while, but I've always liked building things |
|
|
|
01:37.920 --> 01:38.760 |
|
with Legos. |
|
|
|
01:38.760 --> 01:39.920 |
|
And when I was really, really young, |
|
|
|
01:39.920 --> 01:42.360 |
|
I wanted the Legos that had motors and other things. |
|
|
|
01:42.360 --> 01:44.840 |
|
And then, you know, Lego Mindstorms came out |
|
|
|
01:44.840 --> 01:48.280 |
|
and for the first time you could program Lego contraptions. |
|
|
|
01:48.280 --> 01:52.560 |
|
And I think things just sort of snowballed from that. |
|
|
|
01:52.560 --> 01:56.800 |
|
But I remember seeing, you know, the BattleBots TV show |
|
|
|
01:56.800 --> 01:59.320 |
|
on Comedy Central and thinking that is the coolest thing |
|
|
|
01:59.320 --> 02:01.200 |
|
in the world, I wanna be a part of that. |
|
|
|
02:01.200 --> 02:03.680 |
|
And not knowing a whole lot about how to build |
|
|
|
02:03.680 --> 02:06.880 |
|
these 200 pound fighting robots. |
|
|
|
02:06.880 --> 02:11.000 |
|
So I sort of obsessively pored over the internet forums
|
|
|
02:11.000 --> 02:13.440 |
|
where all the creators for BattleBots would sort of hang out |
|
|
|
02:13.440 --> 02:16.120 |
|
and talk about, you know, document their build progress |
|
|
|
02:16.120 --> 02:17.120 |
|
and everything. |
|
|
|
02:17.120 --> 02:20.520 |
|
And I think I read, I must have read like, you know, |
|
|
|
02:20.520 --> 02:24.280 |
|
tens of thousands of forum posts from basically everything |
|
|
|
02:24.280 --> 02:26.400 |
|
that was out there on what these people were doing. |
|
|
|
02:26.400 --> 02:28.920 |
|
And eventually, like sort of triangulated how to put |
|
|
|
02:28.920 --> 02:33.040 |
|
some of these things together and ended up doing BattleBots, |
|
|
|
02:33.040 --> 02:34.800 |
|
which was, you know, when I was like 13 or 14,
|
|
|
02:34.800 --> 02:35.960 |
|
which was pretty awesome. |
|
|
|
02:35.960 --> 02:37.680 |
|
I'm not sure if the show's still running, |
|
|
|
02:37.680 --> 02:42.000 |
|
but so BattleBots is, there's not an artificial intelligence |
|
|
|
02:42.000 --> 02:44.200 |
|
component, it's remotely controlled. |
|
|
|
02:44.200 --> 02:46.720 |
|
And it's almost like a mechanical engineering challenge |
|
|
|
02:46.720 --> 02:49.560 |
|
of building things that can't be broken.
|
|
|
02:49.560 --> 02:50.680 |
|
They're radio controlled. |
|
|
|
02:50.680 --> 02:53.880 |
|
So, and I think that they allowed some limited form |
|
|
|
02:53.880 --> 02:56.600 |
|
of autonomy, but, you know, in a two minute match, |
|
|
|
02:56.600 --> 02:58.800 |
|
you're, in the way these things ran, |
|
|
|
02:58.800 --> 03:00.720 |
|
you're really doing yourself a disservice by trying |
|
|
|
03:00.720 --> 03:02.360 |
|
to automate it versus just, you know, |
|
|
|
03:02.360 --> 03:04.760 |
|
do the practical thing, which is drive it yourself. |
|
|
|
03:04.760 --> 03:06.960 |
|
And there's an entertainment aspect, |
|
|
|
03:06.960 --> 03:08.240 |
|
just going on YouTube. |
|
|
|
03:08.240 --> 03:11.200 |
|
There's like some of them wield an axe, some of them, |
|
|
|
03:11.200 --> 03:12.200 |
|
I mean, there's that fun. |
|
|
|
03:12.200 --> 03:13.760 |
|
So what drew you to that aspect? |
|
|
|
03:13.760 --> 03:15.400 |
|
Was it the mechanical engineering? |
|
|
|
03:15.400 --> 03:19.400 |
|
Was it the dream to create like Frankenstein |
|
|
|
03:19.400 --> 03:21.080 |
|
and sentient being? |
|
|
|
03:21.080 --> 03:23.960 |
|
Or was it just like the Lego, you like tinkering stuff? |
|
|
|
03:23.960 --> 03:26.000 |
|
I mean, that was just building something. |
|
|
|
03:26.000 --> 03:27.960 |
|
I think the idea of, you know, |
|
|
|
03:27.960 --> 03:30.920 |
|
this radio controlled machine that can do various things. |
|
|
|
03:30.920 --> 03:33.800 |
|
If it has like a weapon or something was pretty interesting. |
|
|
|
03:33.800 --> 03:36.440 |
|
I agree, it doesn't have the same appeal as, you know, |
|
|
|
03:36.440 --> 03:38.520 |
|
autonomous robots, which I, which I, you know, |
|
|
|
03:38.520 --> 03:40.320 |
|
sort of gravitated towards later on, |
|
|
|
03:40.320 --> 03:42.720 |
|
but it was definitely an engineering challenge |
|
|
|
03:42.720 --> 03:45.600 |
|
because everything you did in that competition |
|
|
|
03:45.600 --> 03:48.480 |
|
was pushing components to their limits. |
|
|
|
03:48.480 --> 03:52.960 |
|
So we would buy like these $40 DC motors |
|
|
|
03:52.960 --> 03:54.840 |
|
that came out of a winch, |
|
|
|
03:54.840 --> 03:57.280 |
|
like on the front of a pickup truck or something. |
|
|
|
03:57.280 --> 03:59.240 |
|
And we'd power the car with those |
|
|
|
03:59.240 --> 04:01.120 |
|
and we'd run them at like double or triple |
|
|
|
04:01.120 --> 04:02.440 |
|
their rated voltage. |
|
|
|
04:02.440 --> 04:04.160 |
|
So they immediately start overheating, |
|
|
|
04:04.160 --> 04:06.920 |
|
but for that two minute match, you can get, you know, |
|
|
|
04:06.920 --> 04:08.680 |
|
a significant increase in the power output |
|
|
|
04:08.680 --> 04:10.560 |
|
of those motors before they burn out. |
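
As a rough back-of-the-envelope sketch of why that works, assuming an idealized brushed DC motor (an illustration, not measurements from those robots): near stall the current is V/R, so both the short-burst power you can draw and the resistive heating in the windings scale with the square of the applied voltage:

P \approx \frac{V^2}{R} \quad\Rightarrow\quad P(2V) \approx 4\,P(V), \qquad P(3V) \approx 9\,P(V)

So doubling or tripling the rated voltage buys roughly four to nine times the burst power, with heat building up four to nine times faster, which is survivable for a two minute match but not for continuous duty.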
|
|
|
04:10.560 --> 04:12.760 |
|
And so you're doing the same thing for your battery packs, |
|
|
|
04:12.760 --> 04:14.360 |
|
all the materials in the system. |
|
|
|
04:14.360 --> 04:15.560 |
|
And I think there was something, |
|
|
|
04:15.560 --> 04:17.800 |
|
something intrinsically interesting |
|
|
|
04:17.800 --> 04:20.360 |
|
about just seeing like where things break. |
|
|
|
04:20.360 --> 04:23.360 |
|
And did you offline see where they break? |
|
|
|
04:23.360 --> 04:25.040 |
|
Did you take it to the testing point? |
|
|
|
04:25.040 --> 04:26.120 |
|
Like, how did you know two minutes? |
|
|
|
04:26.120 --> 04:29.680 |
|
Or was it a reckless, let's just go with it and see?
|
|
|
04:29.680 --> 04:31.320 |
|
We weren't very good at BattleBots.
|
|
|
04:31.320 --> 04:34.200 |
|
We lost all of our matches the first round. |
|
|
|
04:34.200 --> 04:36.240 |
|
The one I built first, |
|
|
|
04:36.240 --> 04:38.120 |
|
both of them were these wedge shaped robots |
|
|
|
04:38.120 --> 04:39.800 |
|
because the wedge, even though it's sort of boring |
|
|
|
04:39.800 --> 04:41.240 |
|
to look at is extremely effective. |
|
|
|
04:41.240 --> 04:42.600 |
|
You drive towards another robot |
|
|
|
04:42.600 --> 04:44.720 |
|
and the front edge of it gets under them |
|
|
|
04:44.720 --> 04:46.760 |
|
and then they sort of flip over, |
|
|
|
04:46.760 --> 04:48.280 |
|
it's kind of like a door stopper. |
|
|
|
04:48.280 --> 04:51.920 |
|
And the first one had a pneumatic polished stainless steel |
|
|
|
04:51.920 --> 04:54.880 |
|
spike on the front that would shoot out about eight inches. |
|
|
|
04:54.880 --> 04:56.240 |
|
The purpose of which is what? |
|
|
|
04:56.240 --> 04:58.800 |
|
Pretty ineffective actually, but it looked cool. |
|
|
|
04:58.800 --> 05:00.880 |
|
And was it to help with the lift? |
|
|
|
05:00.880 --> 05:04.080 |
|
No, it was just to try to poke holes in the other robot. |
|
|
|
05:04.080 --> 05:05.960 |
|
And then the second time I did it, |
|
|
|
05:05.960 --> 05:09.560 |
|
which is the following, I think maybe 18 months later, |
|
|
|
05:09.560 --> 05:14.400 |
|
we had a titanium axe with a hardened steel tip on it |
|
|
|
05:14.400 --> 05:17.200 |
|
that was powered by a hydraulic cylinder, |
|
|
|
05:17.200 --> 05:20.400 |
|
which we were activating with liquid CO2, |
|
|
|
05:20.400 --> 05:23.880 |
|
which had its own set of problems. |
|
|
|
05:23.880 --> 05:26.320 |
|
So great, so that's kind of on the hardware side. |
|
|
|
05:26.320 --> 05:28.360 |
|
I mean, at a certain point, |
|
|
|
05:28.360 --> 05:31.240 |
|
there must have been born a fascination |
|
|
|
05:31.240 --> 05:32.440 |
|
on the software side. |
|
|
|
05:32.440 --> 05:35.520 |
|
So what was the first piece of code you've written? |
|
|
|
05:35.520 --> 05:38.600 |
|
If you didn't go back there, see what language was it? |
|
|
|
05:38.600 --> 05:40.600 |
|
What was it, was it Emacs, Vim?
|
|
|
05:40.600 --> 05:44.640 |
|
Was it a more respectable, modern IDE?
|
|
|
05:44.640 --> 05:45.800 |
|
Do you remember any of this? |
|
|
|
05:45.800 --> 05:49.840 |
|
Yeah, well, I remember, I think maybe when I was in |
|
|
|
05:49.840 --> 05:52.440 |
|
third or fourth grade, my elementary school
|
|
|
05:52.440 --> 05:55.040 |
|
had a bunch of Apple II computers, |
|
|
|
05:55.040 --> 05:56.680 |
|
and we'd play games on those. |
|
|
|
05:56.680 --> 05:57.760 |
|
And I remember every once in a while, |
|
|
|
05:57.760 --> 06:01.320 |
|
something would crash or wouldn't start up correctly, |
|
|
|
06:01.320 --> 06:03.960 |
|
and it would dump you out to what I later learned |
|
|
|
06:03.960 --> 06:05.800 |
|
was like sort of a command prompt. |
|
|
|
06:05.800 --> 06:07.600 |
|
And my teacher would come over and type, |
|
|
|
06:07.600 --> 06:09.440 |
|
I actually remember this to this day for some reason, |
|
|
|
06:09.440 --> 06:12.160 |
|
like PR number six, or PR pound six, |
|
|
|
06:12.160 --> 06:13.840 |
|
which is peripheral six, which is the disk drive, |
|
|
|
06:13.840 --> 06:15.920 |
|
which would fire up the disk and load the program. |
|
|
|
06:15.920 --> 06:17.880 |
|
And I just remember thinking, wow, she's like a hacker, |
|
|
|
06:17.880 --> 06:20.760 |
|
like teach me these codes, these error codes, |
|
|
|
06:20.760 --> 06:22.720 |
|
that is what I called them at the time. |
|
|
|
06:22.720 --> 06:23.760 |
|
But she had no interest in that. |
|
|
|
06:23.760 --> 06:26.480 |
|
So it wasn't until I think about fifth grade |
|
|
|
06:26.480 --> 06:29.120 |
|
that I had a school where you could actually |
|
|
|
06:29.120 --> 06:30.600 |
|
go on these Apple II's and learn to program. |
|
|
|
06:30.600 --> 06:31.920 |
|
And so it was all in BASIC, you know,
|
|
|
06:31.920 --> 06:34.240 |
|
where every line, you know, the line numbers are all, |
|
|
|
06:34.240 --> 06:35.640 |
|
or that every line is numbered, |
|
|
|
06:35.640 --> 06:38.000 |
|
and you have to like leave enough space |
|
|
|
06:38.000 --> 06:40.760 |
|
between the numbers so that if you want to tweak your code, |
|
|
|
06:40.760 --> 06:42.600 |
|
you go back and if the first line was 10 |
|
|
|
06:42.600 --> 06:44.680 |
|
and the second line is 20, now you have to go back |
|
|
|
06:44.680 --> 06:45.640 |
|
and insert 15. |
|
|
|
06:45.640 --> 06:47.960 |
|
And if you need to add code in front of that, |
|
|
|
06:47.960 --> 06:49.720 |
|
you know, 11 or 12, and you hope you don't run out |
|
|
|
06:49.720 --> 06:51.880 |
|
of line numbers and have to redo the whole thing. |
|
|
|
06:51.880 --> 06:53.240 |
|
And there's GOTO statements?
|
|
|
06:53.240 --> 06:56.920 |
|
Yeah, GOTOs, and it's very basic, maybe hence the name,
|
|
|
06:56.920 --> 06:58.200 |
|
but a lot of fun. |
|
|
|
06:58.200 --> 07:00.800 |
|
And that was like, that was, you know, |
|
|
|
07:00.800 --> 07:02.600 |
|
that's when, you know, when you first program, |
|
|
|
07:02.600 --> 07:03.560 |
|
you see the magic of it. |
|
|
|
07:03.560 --> 07:06.640 |
|
It's like, just like this world opens up with, |
|
|
|
07:06.640 --> 07:08.200 |
|
you know, endless possibilities for the things |
|
|
|
07:08.200 --> 07:10.600 |
|
you could build or accomplish with that computer. |
|
|
|
07:10.600 --> 07:13.400 |
|
So you got the bug then, so even starting with BASIC
|
|
|
07:13.400 --> 07:16.720 |
|
and then what, C++ throughout, what did you, |
|
|
|
07:16.720 --> 07:18.200 |
|
were there computer programming,
|
|
|
07:18.200 --> 07:19.880 |
|
computer science classes in high school? |
|
|
|
07:19.880 --> 07:22.680 |
|
Not where I went, so it was self taught,
|
|
|
07:22.680 --> 07:24.640 |
|
but I did a lot of programming. |
|
|
|
07:24.640 --> 07:28.560 |
|
The thing that, you know, sort of pushed me in the path |
|
|
|
07:28.560 --> 07:30.600 |
|
of eventually working on self driving cars |
|
|
|
07:30.600 --> 07:33.280 |
|
is actually one of these really long trips |
|
|
|
07:33.280 --> 07:38.000 |
|
driving from my house in Kansas to, I think, Las Vegas, |
|
|
|
07:38.000 --> 07:39.480 |
|
where we did the BattleBots competition. |
|
|
|
07:39.480 --> 07:42.760 |
|
And I had just gotten my, I think my learners permit |
|
|
|
07:42.760 --> 07:45.080 |
|
or early drivers permit. |
|
|
|
07:45.080 --> 07:48.280 |
|
And so I was driving this, you know, 10 hour stretch |
|
|
|
07:48.280 --> 07:50.600 |
|
across Western Kansas where it's just, |
|
|
|
07:50.600 --> 07:51.800 |
|
you're going straight on a highway |
|
|
|
07:51.800 --> 07:53.640 |
|
and it is mind numbingly boring. |
|
|
|
07:53.640 --> 07:54.960 |
|
And I remember thinking even then |
|
|
|
07:54.960 --> 07:58.080 |
|
with my sort of mediocre programming background |
|
|
|
07:58.080 --> 08:00.040 |
|
that this is something that a computer can do, right? |
|
|
|
08:00.040 --> 08:01.440 |
|
Let's take a picture of the road, |
|
|
|
08:01.440 --> 08:02.880 |
|
let's find the yellow lane markers |
|
|
|
08:02.880 --> 08:04.880 |
|
and, you know, steer the wheel. |
|
|
|
08:04.880 --> 08:06.600 |
|
And, you know, later I'd come to realize |
|
|
|
08:06.600 --> 08:09.800 |
|
this had been done, you know, since the 80s |
|
|
|
08:09.800 --> 08:12.760 |
|
or the 70s or even earlier, but I still wanted to do it. |
|
|
|
08:12.760 --> 08:14.840 |
|
And sort of immediately after that trip, |
|
|
|
08:14.840 --> 08:16.280 |
|
switched from sort of BattleBots, |
|
|
|
08:16.280 --> 08:18.640 |
|
which is more radio controlled machines |
|
|
|
08:18.640 --> 08:21.800 |
|
to thinking about building, you know, |
|
|
|
08:21.800 --> 08:23.600 |
|
autonomous vehicles of some scale, |
|
|
|
08:23.600 --> 08:25.080 |
|
start off with really small electric ones |
|
|
|
08:25.080 --> 08:28.280 |
|
and then, you know, progress to what we're doing now. |
|
|
|
08:28.280 --> 08:30.040 |
|
So what was your view of artificial intelligence |
|
|
|
08:30.040 --> 08:30.880 |
|
at that point? |
|
|
|
08:30.880 --> 08:31.880 |
|
What did you think? |
|
|
|
08:31.880 --> 08:35.040 |
|
So this is before there's been waves |
|
|
|
08:35.040 --> 08:36.680 |
|
in artificial intelligence, right? |
|
|
|
08:36.680 --> 08:39.480 |
|
The current wave with deep learning |
|
|
|
08:39.480 --> 08:41.760 |
|
makes people believe that you can solve |
|
|
|
08:41.760 --> 08:43.520 |
|
in a really rich, deep way, |
|
|
|
08:43.520 --> 08:46.200 |
|
the computer vision perception problem. |
|
|
|
08:46.200 --> 08:51.200 |
|
But like before the deep learning craze, |
|
|
|
08:51.320 --> 08:52.800 |
|
you know, how do you think about |
|
|
|
08:52.800 --> 08:55.320 |
|
how would you even go about building a thing |
|
|
|
08:55.320 --> 08:56.920 |
|
that perceives the world,
|
|
|
08:56.920 --> 08:59.160 |
|
localizes itself in the world, moves around the world?
|
|
|
08:59.160 --> 09:00.360 |
|
Like when you were younger, I mean, |
|
|
|
09:00.360 --> 09:02.120 |
|
what was your thinking about it?
|
|
|
09:02.120 --> 09:03.960 |
|
Well, prior to deep neural networks |
|
|
|
09:03.960 --> 09:05.360 |
|
or convolutional neural nets, |
|
|
|
09:05.360 --> 09:06.520 |
|
these modern techniques we have, |
|
|
|
09:06.520 --> 09:09.040 |
|
or at least ones that are in use today, |
|
|
|
09:09.040 --> 09:10.280 |
|
it was all heuristics based.
|
|
|
09:10.280 --> 09:12.920 |
|
And so like old school image processing, |
|
|
|
09:12.920 --> 09:15.040 |
|
and I think extracting, you know, |
|
|
|
09:15.040 --> 09:18.000 |
|
yellow lane markers out of an image of a road |
|
|
|
09:18.000 --> 09:21.160 |
|
is one of the problems that lends itself |
|
|
|
09:21.160 --> 09:23.760 |
|
reasonably well to those heuristic based methods, you know,
|
|
|
09:23.760 --> 09:26.760 |
|
like just do a threshold on the color yellow |
|
|
|
09:26.760 --> 09:28.520 |
|
and then try to fit some lines to that |
|
|
|
09:28.520 --> 09:30.320 |
|
using a Hough transform or something
|
|
|
09:30.320 --> 09:32.280 |
|
and then go from there. |
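
As a minimal sketch of that heuristic pipeline, assuming OpenCV and NumPy are available; the HSV bounds and Hough parameters below are illustrative guesses, not values from any real system:

import cv2
import numpy as np

def find_yellow_lane_segments(bgr_frame):
    # Threshold on "yellow" in HSV space, which holds up better under
    # changing light than a raw RGB threshold.
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    lower = np.array([15, 80, 120])    # assumed lower HSV bound for yellow
    upper = np.array([35, 255, 255])   # assumed upper HSV bound for yellow
    mask = cv2.inRange(hsv, lower, upper)

    # Reduce the mask to edges, then fit line segments with a
    # probabilistic Hough transform.
    edges = cv2.Canny(mask, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=30, minLineLength=40, maxLineGap=20)
    # Each segment comes back as (x1, y1, x2, y2) in pixel coordinates.
    return [] if segments is None else [tuple(s[0]) for s in segments]

From there, a steering heuristic might average the segment angles and turn the wheel proportionally toward the lane center.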
|
|
|
09:32.280 --> 09:34.800 |
|
Traffic light detection and stop sign detection, |
|
|
|
09:34.800 --> 09:35.920 |
|
red, yellow, green. |
|
|
|
09:35.920 --> 09:38.160 |
|
And I think you could,
|
|
|
09:38.160 --> 09:39.840 |
|
I mean, if you wanted to do a full, |
|
|
|
09:39.840 --> 09:41.960 |
|
I was just trying to make something that would stay |
|
|
|
09:41.960 --> 09:43.520 |
|
in between the lanes on a highway, |
|
|
|
09:43.520 --> 09:44.960 |
|
but if you wanted to do the full, |
|
|
|
09:46.920 --> 09:48.960 |
|
you know, set of capabilities
|
|
|
09:48.960 --> 09:50.520 |
|
needed for a driverless car, |
|
|
|
09:50.520 --> 09:53.360 |
|
I think you could, and we've done this at Cruise,
|
|
|
09:53.360 --> 09:54.440 |
|
you know, in the very first days, |
|
|
|
09:54.440 --> 09:56.320 |
|
you can start off with a really simple, |
|
|
|
09:56.320 --> 09:58.000 |
|
you know, human written heuristic |
|
|
|
09:58.000 --> 09:59.800 |
|
just to get the scaffolding in place |
|
|
|
09:59.800 --> 10:01.720 |
|
for your system, traffic light detection, |
|
|
|
10:01.720 --> 10:02.960 |
|
probably a really simple, you know, |
|
|
|
10:02.960 --> 10:04.760 |
|
color thresholding on day one |
|
|
|
10:04.760 --> 10:06.520 |
|
just to get the system up and running |
|
|
|
10:06.520 --> 10:08.640 |
|
before you migrate to, you know, |
|
|
|
10:08.640 --> 10:11.080 |
|
a deep learning based technique or something else. |
|
|
|
10:11.080 --> 10:12.800 |
|
And, you know, back in, when I was doing this, |
|
|
|
10:12.800 --> 10:15.120 |
|
my first one, it was on a Pentium,
|
|
|
10:15.120 --> 10:17.840 |
|
a 233 megahertz computer.
|
|
|
10:17.840 --> 10:19.920 |
|
And I think I wrote the first version in BASIC,
|
|
|
10:19.920 --> 10:21.600 |
|
which is like an interpreted language. |
|
|
|
10:21.600 --> 10:23.760 |
|
It's extremely slow because that's the thing |
|
|
|
10:23.760 --> 10:24.800 |
|
I knew at the time. |
|
|
|
10:24.800 --> 10:27.840 |
|
And so there was no chance at all of using,
|
|
|
10:27.840 --> 10:30.440 |
|
there was no computational power to do
|
|
|
10:30.440 --> 10:33.480 |
|
any sort of reasonable deep nets like you have today. |
|
|
|
10:33.480 --> 10:35.360 |
|
So I don't know what kids these days are doing. |
|
|
|
10:35.360 --> 10:37.920 |
|
Are kids these days, you know, at age 13 |
|
|
|
10:37.920 --> 10:39.360 |
|
using neural networks in their garage? |
|
|
|
10:39.360 --> 10:40.200 |
|
I mean, that would be awesome. |
|
|
|
10:40.200 --> 10:43.040 |
|
I get emails all the time from, you know, |
|
|
|
10:43.040 --> 10:46.160 |
|
like 11, 12 year olds saying, I'm having, you know, |
|
|
|
10:46.160 --> 10:48.760 |
|
I'm trying to follow this TensorFlow tutorial |
|
|
|
10:48.760 --> 10:50.800 |
|
and I'm having this problem. |
|
|
|
10:50.800 --> 10:55.800 |
|
And the general approach in the deep learning community |
|
|
|
10:55.800 --> 11:00.200 |
|
is one of extreme optimism, as opposed to,
|
|
|
11:00.200 --> 11:02.000 |
|
you mentioned like heuristics, where
|
|
|
11:02.000 --> 11:04.800 |
|
you can separate the autonomous driving problem
|
|
|
11:04.800 --> 11:07.520 |
|
into modules and try to solve it sort of rigorously, |
|
|
|
11:07.520 --> 11:09.040 |
|
or you can just do it end to end.
|
|
|
11:09.040 --> 11:11.840 |
|
And most people just kind of love the idea that, |
|
|
|
11:11.840 --> 11:13.360 |
|
you know, us humans do it end to end, |
|
|
|
11:13.360 --> 11:15.360 |
|
we just perceive and act. |
|
|
|
11:15.360 --> 11:17.040 |
|
We should be able to use that, |
|
|
|
11:17.040 --> 11:18.720 |
|
do the same kind of thing with neural nets.
|
|
|
11:18.720 --> 11:20.920 |
|
And that kind of thinking,
|
|
|
11:20.920 --> 11:22.840 |
|
you don't want to criticize that kind of thinking |
|
|
|
11:22.840 --> 11:24.640 |
|
because eventually they will be right. |
|
|
|
11:24.640 --> 11:26.360 |
|
Yeah. And so it's exciting. |
|
|
|
11:26.360 --> 11:28.720 |
|
And especially when they're younger to explore that |
|
|
|
11:28.720 --> 11:30.640 |
|
is a really exciting approach. |
|
|
|
11:30.640 --> 11:35.480 |
|
But yeah, it's changed the language,
|
|
|
11:35.480 --> 11:37.240 |
|
the kind of stuff you're tinkering with. |
|
|
|
11:37.240 --> 11:40.920 |
|
It's kind of exciting to see when these teenagers grow up. |
|
|
|
11:40.920 --> 11:43.760 |
|
Yeah, I can only imagine if your starting point
|
|
|
11:43.760 --> 11:46.720 |
|
is, you know, Python and TensorFlow at age 13, |
|
|
|
11:46.720 --> 11:47.800 |
|
where you end up, you know, |
|
|
|
11:47.800 --> 11:51.040 |
|
after 10 or 15 years of that, that's pretty cool.
|
|
|
11:51.040 --> 11:53.760 |
|
Because of GitHub, the state of the art tools
|
|
|
11:53.760 --> 11:55.440 |
|
for solving most of the major problems |
|
|
|
11:55.440 --> 11:56.920 |
|
in artificial intelligence
|
|
|
11:56.920 --> 12:00.240 |
|
are within a few lines of code for most kids. |
|
|
|
12:00.240 --> 12:02.280 |
|
And that's incredible to think about, |
|
|
|
12:02.280 --> 12:04.280 |
|
also on the entrepreneurial side. |
|
|
|
12:04.280 --> 12:08.520 |
|
And at that point, was there any thought
|
|
|
12:08.520 --> 12:11.960 |
|
about entrepreneurship before you came to college |
|
|
|
12:11.960 --> 12:15.160 |
|
sort of building this into a thing
|
|
|
12:15.160 --> 12:17.800 |
|
that impacts the world on a large scale? |
|
|
|
12:17.800 --> 12:19.840 |
|
Yeah, I've always wanted to start a company. |
|
|
|
12:19.840 --> 12:22.600 |
|
I think that's, you know, just a cool concept |
|
|
|
12:22.600 --> 12:25.240 |
|
of creating something and exchanging it |
|
|
|
12:25.240 --> 12:28.360 |
|
for value or creating value, I guess. |
|
|
|
12:28.360 --> 12:31.120 |
|
So in high school, I was trying to build like,
|
|
|
12:31.120 --> 12:33.600 |
|
you know, servo motor drivers, little circuit boards |
|
|
|
12:33.600 --> 12:36.920 |
|
and sell them online, or other things like that.
|
|
|
12:36.920 --> 12:40.320 |
|
And certainly knew at some point I wanted to do a startup, |
|
|
|
12:40.320 --> 12:42.840 |
|
but it wasn't really, I'd say, until college that I felt
|
|
|
12:42.840 --> 12:46.720 |
|
like I had, I guess, the right combination
|
|
|
12:46.720 --> 12:48.960 |
|
of the environment, the smart people around you |
|
|
|
12:48.960 --> 12:52.360 |
|
and some free time and a lot of free time at MIT. |
|
|
|
12:52.360 --> 12:55.800 |
|
So you came to MIT as an undergrad 2004. |
|
|
|
12:55.800 --> 12:57.080 |
|
That's right. |
|
|
|
12:57.080 --> 12:59.040 |
|
And that's when the first DARPA Grand Challenge |
|
|
|
12:59.040 --> 12:59.880 |
|
was happening. |
|
|
|
12:59.880 --> 13:00.720 |
|
Yeah. |
|
|
|
13:00.720 --> 13:03.360 |
|
The timing of that is beautifully poetic. |
|
|
|
13:03.360 --> 13:05.680 |
|
So how'd you get yourself involved in that one? |
|
|
|
13:05.680 --> 13:07.080 |
|
Originally there wasn't a |
|
|
|
13:07.080 --> 13:07.920 |
|
Official entry? |
|
|
|
13:07.920 --> 13:09.520 |
|
Yeah, a faculty sponsored thing.
|
|
|
13:09.520 --> 13:12.760 |
|
And so a bunch of undergrads, myself included, |
|
|
|
13:12.760 --> 13:14.160 |
|
started meeting and got together |
|
|
|
13:14.160 --> 13:17.800 |
|
and tried to haggle together some sponsorships.
|
|
|
13:17.800 --> 13:20.120 |
|
We got a vehicle donated, a bunch of sensors |
|
|
|
13:20.120 --> 13:21.600 |
|
and tried to put something together. |
|
|
|
13:21.600 --> 13:24.640 |
|
And so our team was probably mostly freshmen
|
|
|
13:24.640 --> 13:26.800 |
|
and sophomores, you know, which was not really
|
|
|
13:26.800 --> 13:30.960 |
|
a fair fight against maybe the, you know, postdoc
|
|
|
13:30.960 --> 13:32.840 |
|
and faculty led teams from other schools. |
|
|
|
13:32.840 --> 13:35.000 |
|
But we got something up and running.
|
|
|
13:35.000 --> 13:37.400 |
|
We had our vehicle drive by wire and, you know, |
|
|
|
13:37.400 --> 13:42.440 |
|
very, very basic control and things, but on the day |
|
|
|
13:42.440 --> 13:46.800 |
|
of the qualifying, sort of pre qualifying round, |
|
|
|
13:46.800 --> 13:50.840 |
|
the one and only steering motor that we had purchased, |
|
|
|
13:50.840 --> 13:52.600 |
|
the thing that we had, you know, retrofitted to turn |
|
|
|
13:52.600 --> 13:55.760 |
|
the steering wheel on the truck died. |
|
|
|
13:55.760 --> 13:58.440 |
|
And so our vehicle was just dead in the water, couldn't steer. |
|
|
|
13:58.440 --> 13:59.880 |
|
So we didn't make it very far. |
|
|
|
13:59.880 --> 14:00.920 |
|
On the hardware side. |
|
|
|
14:00.920 --> 14:03.000 |
|
So was there a software component? |
|
|
|
14:03.000 --> 14:06.200 |
|
Was there, like, how did your view of autonomous vehicles |
|
|
|
14:06.200 --> 14:08.040 |
|
in terms of artificial intelligence |
|
|
|
14:09.520 --> 14:10.720 |
|
evolve in this moment? |
|
|
|
14:10.720 --> 14:12.400 |
|
I mean, you know, like you said, |
|
|
|
14:12.400 --> 14:14.080 |
|
autonomous vehicles have been around since the 80s,
|
|
|
14:14.080 --> 14:16.720 |
|
but really that was the birth of the modern wave. |
|
|
|
14:16.720 --> 14:20.080 |
|
The thing that captivated everyone's imagination
|
|
|
14:20.080 --> 14:21.520 |
|
that we can actually do this. |
|
|
|
14:21.520 --> 14:26.000 |
|
So were you captivated in that way?
|
|
|
14:26.000 --> 14:27.600 |
|
So how did your view of autonomous vehicles |
|
|
|
14:27.600 --> 14:29.000 |
|
change at that point? |
|
|
|
14:29.000 --> 14:33.760 |
|
I'd say at that point in time, it was a curiosity
|
|
|
14:33.760 --> 14:35.840 |
|
as in like, is this really possible? |
|
|
|
14:35.840 --> 14:38.440 |
|
And I think that was generally the spirit |
|
|
|
14:38.440 --> 14:43.280 |
|
and the purpose of that original DARPA Grand Challenge, |
|
|
|
14:43.280 --> 14:45.520 |
|
which was to just get a whole bunch |
|
|
|
14:45.520 --> 14:48.680 |
|
of really brilliant people exploring the space |
|
|
|
14:48.680 --> 14:49.880 |
|
and pushing the limits. |
|
|
|
14:49.880 --> 14:51.960 |
|
And I think like to this day,
|
|
|
14:51.960 --> 14:54.160 |
|
that DARPA challenge with its, you know, |
|
|
|
14:54.160 --> 14:57.120 |
|
million dollar prize pool was probably one |
|
|
|
14:57.120 --> 15:00.840 |
|
of the most effective, you know, uses of taxpayer money, |
|
|
|
15:00.840 --> 15:03.320 |
|
dollar for dollar that I've seen, you know, |
|
|
|
15:03.320 --> 15:06.720 |
|
because that, that small sort of initiative |
|
|
|
15:06.720 --> 15:10.440 |
|
that DARPA put out sort of, in my view,
|
|
|
15:10.440 --> 15:12.560 |
|
was the catalyst or the tipping point |
|
|
|
15:12.560 --> 15:14.800 |
|
for this whole next wave
|
|
|
15:14.800 --> 15:16.120 |
|
of autonomous vehicle development. |
|
|
|
15:16.120 --> 15:17.160 |
|
So that was pretty cool. |
|
|
|
15:17.160 --> 15:20.240 |
|
So let me jump around a little bit on that point. |
|
|
|
15:20.240 --> 15:23.240 |
|
They also did the urban challenge where it was in the city, |
|
|
|
15:23.240 --> 15:25.920 |
|
but it was very artificial and there's no pedestrians |
|
|
|
15:25.920 --> 15:27.640 |
|
and there's very little human involvement |
|
|
|
15:27.640 --> 15:30.480 |
|
except a few professional drivers. |
|
|
|
15:30.480 --> 15:31.640 |
|
Yeah. |
|
|
|
15:31.640 --> 15:33.560 |
|
Do you think there's room, and then there was |
|
|
|
15:33.560 --> 15:35.360 |
|
the robotics challenge with humanoid robots?
|
|
|
15:35.360 --> 15:36.200 |
|
Right. |
|
|
|
15:36.200 --> 15:38.720 |
|
So in your now role as looking at this, |
|
|
|
15:38.720 --> 15:41.640 |
|
you're trying to solve one of the, you know, |
|
|
|
15:41.640 --> 15:43.120 |
|
autonomous driving, in one of the harder,
|
|
|
15:43.120 --> 15:45.480 |
|
more difficult places, San Francisco.
|
|
|
15:45.480 --> 15:47.320 |
|
Is there a role for DARPA to step in |
|
|
|
15:47.320 --> 15:49.680 |
|
to also kind of help out, like, |
|
|
|
15:49.680 --> 15:54.000 |
|
challenge with new ideas, specifically pedestrians |
|
|
|
15:54.000 --> 15:55.880 |
|
and so on, all these kinds of interesting things? |
|
|
|
15:55.880 --> 15:57.680 |
|
Well, I haven't thought about it from that perspective. |
|
|
|
15:57.680 --> 15:59.280 |
|
Is there anything DARPA could do today |
|
|
|
15:59.280 --> 16:00.680 |
|
to further accelerate things? |
|
|
|
16:00.680 --> 16:04.880 |
|
And I would say my instinct is that that's maybe not |
|
|
|
16:04.880 --> 16:07.040 |
|
the highest and best use of their resources and time
|
|
|
16:07.040 --> 16:10.640 |
|
because, like, kick starting and spinning up the flywheel |
|
|
|
16:10.640 --> 16:12.720 |
|
is I think what they did in this case |
|
|
|
16:12.720 --> 16:14.200 |
|
for very, very little money. |
|
|
|
16:14.200 --> 16:15.800 |
|
But today,
|
|
|
16:16.880 --> 16:19.000 |
|
this has become, like, commercially interesting |
|
|
|
16:19.000 --> 16:20.680 |
|
to very large companies and the amount of money |
|
|
|
16:20.680 --> 16:23.040 |
|
going into it and the amount of people, |
|
|
|
16:23.040 --> 16:24.840 |
|
like, going through your class and learning |
|
|
|
16:24.840 --> 16:27.200 |
|
about these things and developing these skills |
|
|
|
16:27.200 --> 16:29.120 |
|
is just, you know, orders of magnitude |
|
|
|
16:29.120 --> 16:30.840 |
|
more than it was back then. |
|
|
|
16:30.840 --> 16:33.080 |
|
And so there's enough momentum and inertia |
|
|
|
16:33.080 --> 16:36.520 |
|
and energy and investment dollars into this space right now |
|
|
|
16:36.520 --> 16:39.960 |
|
that I think
|
|
|
16:39.960 --> 16:42.200 |
|
they can just say mission accomplished
|
|
|
16:42.200 --> 16:44.320 |
|
and move on to the next area of technology |
|
|
|
16:44.320 --> 16:45.360 |
|
that needs help. |
|
|
|
16:46.280 --> 16:49.120 |
|
So then stepping back to MIT, |
|
|
|
16:49.120 --> 16:50.880 |
|
you left MIT your junior year,
|
|
|
16:50.880 --> 16:53.080 |
|
what was that decision like? |
|
|
|
16:53.080 --> 16:55.680 |
|
As I said, I always wanted to do a company |
|
|
|
16:55.680 --> 16:59.080 |
|
or start a company and this opportunity landed in my lap |
|
|
|
16:59.080 --> 17:01.960 |
|
which was a couple of guys from Yale |
|
|
|
17:01.960 --> 17:04.240 |
|
were starting a new company and I Googled them |
|
|
|
17:04.240 --> 17:06.720 |
|
and found that they had started a company previously |
|
|
|
17:06.720 --> 17:10.640 |
|
and sold it actually on eBay for about a quarter million bucks |
|
|
|
17:10.640 --> 17:12.880 |
|
which was a pretty interesting story. |
|
|
|
17:12.880 --> 17:15.760 |
|
But so I thought to myself, these guys are, you know, |
|
|
|
17:15.760 --> 17:19.080 |
|
rock star entrepreneurs, they've done this before, |
|
|
|
17:19.080 --> 17:20.720 |
|
they must be driving around in Ferraris |
|
|
|
17:20.720 --> 17:22.320 |
|
because they sold their company. |
|
|
|
17:23.320 --> 17:26.000 |
|
And, you know, I thought I could learn a lot from them. |
|
|
|
17:26.000 --> 17:28.320 |
|
So I teamed up with those guys and, you know, |
|
|
|
17:28.320 --> 17:32.000 |
|
went out to California during IAP,
|
|
|
17:32.000 --> 17:36.440 |
|
which is MIT's month off, on a one way ticket,
|
|
|
17:36.440 --> 17:38.040 |
|
and basically never went back. |
|
|
|
17:38.040 --> 17:39.280 |
|
We were having so much fun, |
|
|
|
17:39.280 --> 17:42.040 |
|
we felt like we were building something and creating something |
|
|
|
17:42.040 --> 17:44.440 |
|
and it was gonna be interesting, so, you know,
|
|
|
17:44.440 --> 17:46.800 |
|
I was just all in and got completely hooked |
|
|
|
17:46.800 --> 17:49.640 |
|
and that business was Justin.tv,
|
|
|
17:49.640 --> 17:52.400 |
|
which was originally a reality show about a guy named Justin,
|
|
|
17:53.720 --> 17:57.120 |
|
which morphed into a live video streaming platform |
|
|
|
17:57.120 --> 18:00.320 |
|
which then morphed into what is Twitch today. |
|
|
|
18:00.320 --> 18:03.720 |
|
So that was quite an unexpected journey. |
|
|
|
18:04.720 --> 18:07.000 |
|
So no regrets? |
|
|
|
18:07.000 --> 18:07.840 |
|
No. |
|
|
|
18:07.840 --> 18:09.120 |
|
Looking back, it was just an obvious, |
|
|
|
18:09.120 --> 18:10.720 |
|
I mean, one way ticket. |
|
|
|
18:10.720 --> 18:12.760 |
|
I mean, if we just pause on that for a second, |
|
|
|
18:12.760 --> 18:17.680 |
|
there was no, how did you know these were the right guys? |
|
|
|
18:17.680 --> 18:19.520 |
|
That this was the right decision?
|
|
|
18:19.520 --> 18:22.640 |
|
Or was it just a follow the heart kind of thing?
|
|
|
18:22.640 --> 18:24.520 |
|
Well, I didn't know, but, you know, |
|
|
|
18:24.520 --> 18:26.520 |
|
just trying something for a month during IAP |
|
|
|
18:26.520 --> 18:28.240 |
|
seems pretty low risk, right? |
|
|
|
18:28.240 --> 18:30.760 |
|
And then, you know, well, maybe I'll take a semester off. |
|
|
|
18:30.760 --> 18:32.280 |
|
MIT's pretty flexible about that. |
|
|
|
18:32.280 --> 18:33.840 |
|
You can always go back, right? |
|
|
|
18:33.840 --> 18:35.680 |
|
And then after two or three cycles of that, |
|
|
|
18:35.680 --> 18:36.960 |
|
I eventually threw in the towel. |
|
|
|
18:36.960 --> 18:41.920 |
|
But, you know, I guess in that case,
|
|
|
18:41.920 --> 18:44.880 |
|
I felt like I could always hit the undo button if I had to. |
|
|
|
18:44.880 --> 18:45.720 |
|
Right. |
|
|
|
18:45.720 --> 18:49.600 |
|
But nevertheless, from when you look in retrospect, |
|
|
|
18:49.600 --> 18:51.680 |
|
I mean, it seems like a brave decision. |
|
|
|
18:51.680 --> 18:53.200 |
|
You know, it would be difficult |
|
|
|
18:53.200 --> 18:54.320 |
|
for a lot of people to make. |
|
|
|
18:54.320 --> 18:55.440 |
|
It wasn't as popular. |
|
|
|
18:55.440 --> 18:58.120 |
|
I'd say that the general, you know, |
|
|
|
18:58.120 --> 19:01.480 |
|
flux of people out of MIT at the time was mostly |
|
|
|
19:01.480 --> 19:04.120 |
|
into, you know, finance or consulting jobs |
|
|
|
19:04.120 --> 19:05.720 |
|
in Boston or New York. |
|
|
|
19:05.720 --> 19:07.840 |
|
And very few people were going to California |
|
|
|
19:07.840 --> 19:09.080 |
|
to start companies. |
|
|
|
19:09.080 --> 19:12.240 |
|
But today, I'd say that's probably inverted, |
|
|
|
19:12.240 --> 19:15.400 |
|
which is just a sign of the times, I guess.
|
|
|
19:15.400 --> 19:16.080 |
|
Yeah. |
|
|
|
19:16.080 --> 19:21.560 |
|
So there's a story about midnight of March 18, 2007, |
|
|
|
19:21.560 --> 19:25.720 |
|
where TechCrunch, I guess, announced Justin.tv earlier
|
|
|
19:25.720 --> 19:29.080 |
|
than it was supposed to by a few hours.
|
|
|
19:29.080 --> 19:30.360 |
|
The site didn't work. |
|
|
|
19:30.360 --> 19:32.520 |
|
I don't know if any of this is true, you can tell me. |
|
|
|
19:32.520 --> 19:36.200 |
|
And you and one of the folks at Justin.tv,
|
|
|
19:36.200 --> 19:39.240 |
|
Emmett Shear, coded through the night.
|
|
|
19:39.240 --> 19:41.440 |
|
Can you take me through that experience? |
|
|
|
19:41.440 --> 19:47.160 |
|
So let me say a few nice things. The article I read quoted
|
|
|
19:47.160 --> 19:49.600 |
|
Justin Kan saying that you were known for brute force coding
|
|
|
19:49.600 --> 19:53.520 |
|
through problems and being a creative genius. |
|
|
|
19:53.520 --> 19:59.440 |
|
So on that night, what was going through your head? |
|
|
|
19:59.440 --> 20:01.400 |
|
Or maybe I put another way, how do you |
|
|
|
20:01.400 --> 20:02.520 |
|
solve these problems? |
|
|
|
20:02.520 --> 20:05.480 |
|
What's your approach to solving these kind of problems |
|
|
|
20:05.480 --> 20:07.080 |
|
where the line between success and failure |
|
|
|
20:07.080 --> 20:09.680 |
|
seems to be pretty thin? |
|
|
|
20:09.680 --> 20:10.680 |
|
That's a good question. |
|
|
|
20:10.680 --> 20:13.400 |
|
Well, first of all, that's nice of Justin to say that. |
|
|
|
20:13.400 --> 20:16.880 |
|
I think I would have been maybe 21 years old then |
|
|
|
20:16.880 --> 20:18.800 |
|
and not very experienced at programming. |
|
|
|
20:18.800 --> 20:22.680 |
|
But as with everything in a startup, |
|
|
|
20:22.680 --> 20:24.720 |
|
you're sort of racing against the clock. |
|
|
|
20:24.720 --> 20:27.320 |
|
And so our plan was the second we |
|
|
|
20:27.320 --> 20:32.600 |
|
had this live streaming camera backpack up and running |
|
|
|
20:32.600 --> 20:33.600 |
|
where Justin could wear it. |
|
|
|
20:33.600 --> 20:35.320 |
|
And no matter where he went in the city, |
|
|
|
20:35.320 --> 20:36.400 |
|
it would be streaming live video. |
|
|
|
20:36.400 --> 20:37.960 |
|
And this is even before the iPhones, |
|
|
|
20:37.960 --> 20:40.880 |
|
this is like hard to do back then. |
|
|
|
20:40.880 --> 20:41.800 |
|
We would launch. |
|
|
|
20:41.800 --> 20:45.160 |
|
And so we thought we were there and the backpack was working. |
|
|
|
20:45.160 --> 20:47.080 |
|
And then we sent out all the emails |
|
|
|
20:47.080 --> 20:49.920 |
|
to launch the company and do the press thing. |
|
|
|
20:49.920 --> 20:53.000 |
|
And then we weren't quite actually there. |
|
|
|
20:53.000 --> 20:55.880 |
|
And then we thought, oh, well, they're |
|
|
|
20:55.880 --> 21:00.160 |
|
not going to announce it until maybe 10 AM the next morning. |
|
|
|
21:00.160 --> 21:01.880 |
|
And it's, I don't know, it's 5 PM now. |
|
|
|
21:01.880 --> 21:03.640 |
|
So how many hours do we have left? |
|
|
|
21:03.640 --> 21:08.000 |
|
What is that, like 17 hours to go? |
|
|
|
21:08.000 --> 21:10.440 |
|
And that was going to be fine. |
|
|
|
21:10.440 --> 21:11.440 |
|
Was the problem obvious? |
|
|
|
21:11.440 --> 21:13.280 |
|
Did you understand what could possibly be? |
|
|
|
21:13.280 --> 21:16.520 |
|
Like how complicated was the system at that point? |
|
|
|
21:16.520 --> 21:18.840 |
|
It was pretty messy. |
|
|
|
21:18.840 --> 21:22.760 |
|
So to get a live video feed that looked decent working |
|
|
|
21:22.760 --> 21:25.680 |
|
from anywhere in San Francisco, I |
|
|
|
21:25.680 --> 21:28.600 |
|
put together this system where we had like three or four |
|
|
|
21:28.600 --> 21:29.880 |
|
cell phone data modems. |
|
|
|
21:29.880 --> 21:32.200 |
|
And what we would do is take the video stream
|
|
|
21:32.200 --> 21:35.600 |
|
and sort of spray it across these three or four modems |
|
|
|
21:35.600 --> 21:38.080 |
|
and then try to catch all the packets on the other side |
|
|
|
21:38.080 --> 21:39.480 |
|
with unreliable cell phone networks. |
|
|
|
21:39.480 --> 21:41.080 |
|
Pretty low level networking. |
|
|
|
21:41.080 --> 21:41.720 |
|
Yeah. |
|
|
|
21:41.720 --> 21:44.760 |
|
And putting these sort of protocols |
|
|
|
21:44.760 --> 21:47.560 |
|
on top of all that to reassemble and reorder the packets |
|
|
|
21:47.560 --> 21:49.720 |
|
and have time buffers and error correction |
|
|
|
21:49.720 --> 21:50.960 |
|
and all that kind of stuff. |
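
A minimal sketch of that idea, striping packets across links with sequence numbers and reassembling them with a reorder buffer on the far side; the names and structure here are hypothetical illustrations, not Justin.tv's actual protocol:

import heapq

def stripe_packets(packets, num_links):
    # Tag each packet with a sequence number and spray them
    # round-robin across the available cell modems.
    links = [[] for _ in range(num_links)]
    for seq, payload in enumerate(packets):
        links[seq % num_links].append((seq, payload))
    return links

class ReorderBuffer:
    # Collect packets arriving out of order from several links
    # and release them in sequence to the video decoder.
    def __init__(self):
        self._heap = []
        self._next_seq = 0

    def push(self, seq, payload):
        heapq.heappush(self._heap, (seq, payload))

    def pop_ready(self):
        # Only release contiguous packets; a real system would also
        # time out and skip sequence numbers lost on a flaky link,
        # or recover them with forward error correction.
        ready = []
        while self._heap and self._heap[0][0] == self._next_seq:
            ready.append(heapq.heappop(self._heap)[1])
            self._next_seq += 1
        return ready

On the sending side each link's queue would go out over its own modem; on the receiving side every arriving packet is pushed into the buffer, and pop_ready feeds the decoder in order.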
|
|
|
21:50.960 --> 21:53.960 |
|
And the night before, it was just |
|
|
|
21:53.960 --> 21:56.280 |
|
staticky. Every once in a while, the image would go |
|
|
|
21:56.280 --> 21:59.640 |
|
staticky and there would be this horrible like screeching |
|
|
|
21:59.640 --> 22:02.080 |
|
audio noise because the audio was also corrupted. |
|
|
|
22:02.080 --> 22:04.600 |
|
And this would happen like every five to 10 minutes or so. |
|
|
|
22:04.600 --> 22:08.080 |
|
And it was really, you know, off putting to the viewers.
|
|
|
22:08.080 --> 22:08.880 |
|
Yeah. |
|
|
|
22:08.880 --> 22:10.200 |
|
How do you tackle that problem? |
|
|
|
22:10.200 --> 22:13.280 |
|
What was that like, were you just freaking out behind a computer?
|
|
|
22:13.280 --> 22:16.880 |
|
Were there other folks working on this problem?
|
|
|
22:16.880 --> 22:18.120 |
|
Like were you behind a whiteboard? |
|
|
|
22:18.120 --> 22:22.000 |
|
Were you doing pair coding?
|
|
|
22:22.000 --> 22:23.840 |
|
Yeah, it's a little lonely because there's four of us |
|
|
|
22:23.840 --> 22:26.880 |
|
working on the company and only two people really wrote code. |
|
|
|
22:26.880 --> 22:29.200 |
|
And Emmett wrote the website and the chat system
|
|
|
22:29.200 --> 22:32.400 |
|
and I wrote the software for this video streaming device |
|
|
|
22:32.400 --> 22:34.280 |
|
and video server. |
|
|
|
22:34.280 --> 22:36.240 |
|
And so, you know, it was my sole responsibility |
|
|
|
22:36.240 --> 22:37.320 |
|
to figure that out. |
|
|
|
22:37.320 --> 22:39.440 |
|
And I think it's those, you know, |
|
|
|
22:39.440 --> 22:42.200 |
|
setting deadlines, trying to move quickly and everything |
|
|
|
22:42.200 --> 22:44.200 |
|
where you're in that moment of intense pressure |
|
|
|
22:44.200 --> 22:46.960 |
|
that sometimes people do their best and most interesting work. |
|
|
|
22:46.960 --> 22:48.800 |
|
And so even though that was a terrible moment, |
|
|
|
22:48.800 --> 22:50.760 |
|
I look back on it fondly because that's like, you know, |
|
|
|
22:50.760 --> 22:54.720 |
|
that's one of those character defining moments, I think. |
|
|
|
22:54.720 --> 22:59.480 |
|
So in 2013, October, you founded Cruise Automation. |
|
|
|
22:59.480 --> 23:00.200 |
|
Yeah. |
|
|
|
23:00.200 --> 23:04.200 |
|
So progressing forward, another exceptionally successful |
|
|
|
23:04.200 --> 23:09.920 |
|
company was acquired by GM in 2016 for $1 billion. |
|
|
|
23:09.920 --> 23:14.120 |
|
But in October 2013, what was on your mind? |
|
|
|
23:14.120 --> 23:16.360 |
|
What was the plan? |
|
|
|
23:16.360 --> 23:19.840 |
|
How does one seriously start to tackle |
|
|
|
23:19.840 --> 23:22.800 |
|
one of the hardest robotics, most important impact |
|
|
|
23:22.800 --> 23:24.960 |
|
for robotics problems of our age? |
|
|
|
23:24.960 --> 23:28.760 |
|
After going through Twitch, Twitch was, |
|
|
|
23:28.760 --> 23:31.480 |
|
and is today pretty successful. |
|
|
|
23:31.480 --> 23:36.880 |
|
But the result was entertainment, mostly.
|
|
|
23:36.880 --> 23:39.840 |
|
Like the better the product was, the more we would entertain |
|
|
|
23:39.840 --> 23:42.760 |
|
people and then, you know, make money on the ad revenues |
|
|
|
23:42.760 --> 23:43.760 |
|
and other things. |
|
|
|
23:43.760 --> 23:45.000 |
|
And that was a good thing. |
|
|
|
23:45.000 --> 23:46.320 |
|
It felt good to entertain people. |
|
|
|
23:46.320 --> 23:49.120 |
|
But I figured like, you know, what is really the point |
|
|
|
23:49.120 --> 23:51.120 |
|
of becoming a really good engineer |
|
|
|
23:51.120 --> 23:53.160 |
|
and developing these skills other than, you know, |
|
|
|
23:53.160 --> 23:53.960 |
|
my own enjoyment. |
|
|
|
23:53.960 --> 23:55.760 |
|
And I realized I wanted something that scratched |
|
|
|
23:55.760 --> 23:57.680 |
|
more of an existential itch, like something |
|
|
|
23:57.680 --> 23:59.440 |
|
that truly matters. |
|
|
|
23:59.440 --> 24:03.680 |
|
And so I basically made this list of requirements |
|
|
|
24:03.680 --> 24:06.160 |
|
for a new, if I was going to do another company. |
|
|
|
24:06.160 --> 24:08.000 |
|
And the one thing I knew in the back of my head |
|
|
|
24:08.000 --> 24:12.320 |
|
was that Twitch took like eight years to become successful.
|
|
|
24:12.320 --> 24:14.880 |
|
And so whatever I do, I better be willing to commit, |
|
|
|
24:14.880 --> 24:17.000 |
|
you know, at least 10 years to something. |
|
|
|
24:17.000 --> 24:20.400 |
|
And when you think about things from that perspective, |
|
|
|
24:20.400 --> 24:21.760 |
|
you certainly, I think, raise the bar |
|
|
|
24:21.760 --> 24:23.200 |
|
on what you choose to work on. |
|
|
|
24:23.200 --> 24:24.320 |
|
So for me, the three things were
|
|
|
24:24.320 --> 24:27.120 |
|
it had to be something where the technology itself |
|
|
|
24:27.120 --> 24:28.960 |
|
determines the success of the product, |
|
|
|
24:28.960 --> 24:31.840 |
|
like hard, really juicy technology problems, |
|
|
|
24:31.840 --> 24:33.600 |
|
because that's what motivates me. |
|
|
|
24:33.600 --> 24:36.280 |
|
And then it had to have a direct and positive impact |
|
|
|
24:36.280 --> 24:37.640 |
|
on society in some way. |
|
|
|
24:37.640 --> 24:39.200 |
|
So an example would be like, you know, |
|
|
|
24:39.200 --> 24:41.560 |
|
health care, self driving cars because they save lives, |
|
|
|
24:41.560 --> 24:43.600 |
|
other things where there's a clear connection to somehow |
|
|
|
24:43.600 --> 24:45.200 |
|
improving other people's lives. |
|
|
|
24:45.200 --> 24:47.160 |
|
And the last one is it had to be a big business |
|
|
|
24:47.160 --> 24:50.200 |
|
because for the positive impact to matter, |
|
|
|
24:50.200 --> 24:51.240 |
|
it's got to be a large scale. |
|
|
|
24:51.240 --> 24:52.080 |
|
Scale, yeah. |
|
|
|
24:52.080 --> 24:53.840 |
|
And I was thinking about that for a while |
|
|
|
24:53.840 --> 24:55.960 |
|
and I tried writing a Gmail clone
|
|
|
24:55.960 --> 24:57.640 |
|
and looked at some other ideas. |
|
|
|
24:57.640 --> 24:59.480 |
|
And then a light bulb just sort of went off,
|
|
|
24:59.480 --> 25:00.440 |
|
like self driving cars. |
|
|
|
25:00.440 --> 25:02.360 |
|
Like that was the most fun I had ever had |
|
|
|
25:02.360 --> 25:04.040 |
|
in college working on that. |
|
|
|
25:04.040 --> 25:05.960 |
|
And I thought, well, what's the state of the technology?
|
|
|
25:05.960 --> 25:08.440 |
|
It's been 10 years, maybe times have changed,
|
|
|
25:08.440 --> 25:10.800 |
|
and maybe now is the time to make this work. |
|
|
|
25:10.800 --> 25:13.320 |
|
And I poked around, and the only other thing
|
|
|
25:13.320 --> 25:15.480 |
|
out there really at the time was the Google self driving |
|
|
|
25:15.480 --> 25:16.680 |
|
car project. |
|
|
|
25:16.680 --> 25:19.600 |
|
And I thought surely there's a way to, you know, |
|
|
|
25:19.600 --> 25:21.600 |
|
have an entrepreneurial mindset and sort of solve
|
|
|
25:21.600 --> 25:23.520 |
|
the minimum viable product here. |
|
|
|
25:23.520 --> 25:25.200 |
|
And so I just took the plunge right then and there |
|
|
|
25:25.200 --> 25:26.680 |
|
and said, this is something I know
|
|
|
25:26.680 --> 25:27.840 |
|
I can commit 10 years to. |
|
|
|
25:27.840 --> 25:30.760 |
|
It's probably the greatest applied AI problem |
|
|
|
25:30.760 --> 25:32.000 |
|
of our generation. |
|
|
|
25:32.000 --> 25:34.240 |
|
And if it works, it's going to be both a huge business |
|
|
|
25:34.240 --> 25:37.040 |
|
and therefore like probably the most positive impact |
|
|
|
25:37.040 --> 25:38.280 |
|
I can possibly have on the world. |
|
|
|
25:38.280 --> 25:40.920 |
|
So after that light bulb went off, |
|
|
|
25:40.920 --> 25:43.000 |
|
I went all in on cruise immediately |
|
|
|
25:43.000 --> 25:45.560 |
|
and got to work. |
|
|
|
25:45.560 --> 25:47.360 |
|
Did you have an idea how to solve this problem? |
|
|
|
25:47.360 --> 25:49.640 |
|
Which aspect of the problem to solve? |
|
|
|
25:49.640 --> 25:53.720 |
|
You know, slow, like we just had Oliver from Voyage here,
|
|
|
25:53.720 --> 25:56.560 |
|
slow moving retirement communities, |
|
|
|
25:56.560 --> 25:58.080 |
|
urban driving, highway driving. |
|
|
|
25:58.080 --> 26:00.400 |
|
Did you have like, did you have a vision |
|
|
|
26:00.400 --> 26:03.560 |
|
of the city of the future or, you know, |
|
|
|
26:03.560 --> 26:06.400 |
|
the transportation is largely automated, |
|
|
|
26:06.400 --> 26:07.240 |
|
that kind of thing. |
|
|
|
26:07.240 --> 26:12.240 |
|
Or was it sort of more fuzzy and gray area than that? |
|
|
|
26:12.240 --> 26:16.640 |
|
My analysis of the situation was that Google
|
|
|
26:16.640 --> 26:19.200 |
|
had been putting a lot of money into that project. |
|
|
|
26:19.200 --> 26:20.760 |
|
They had a lot more resources. |
|
|
|
26:20.760 --> 26:23.720 |
|
And they still hadn't cracked
|
|
|
26:23.720 --> 26:26.200 |
|
the fully driverless car. |
|
|
|
26:26.200 --> 26:28.520 |
|
You know, this is 2013, I guess. |
|
|
|
26:29.480 --> 26:33.360 |
|
So I thought, what can I do to sort of go from zero |
|
|
|
26:33.360 --> 26:35.600 |
|
to, you know, significant scale |
|
|
|
26:35.600 --> 26:37.280 |
|
so I can actually solve the real problem, |
|
|
|
26:37.280 --> 26:38.640 |
|
which is the driverless cars. |
|
|
|
26:38.640 --> 26:40.480 |
|
And I thought, here's the strategy. |
|
|
|
26:40.480 --> 26:44.080 |
|
We'll start by doing a really simple problem |
|
|
|
26:44.080 --> 26:45.560 |
|
or solving a really simple problem |
|
|
|
26:45.560 --> 26:48.080 |
|
that creates value for people. |
|
|
|
26:48.080 --> 26:50.040 |
|
So it eventually ended up deciding |
|
|
|
26:50.040 --> 26:51.800 |
|
on automating highway driving, |
|
|
|
26:51.800 --> 26:54.240 |
|
which is relatively more straightforward |
|
|
|
26:54.240 --> 26:56.440 |
|
as long as there's a backup driver there. |
|
|
|
26:56.440 --> 26:58.480 |
|
And, you know, the go to market |
|
|
|
26:58.480 --> 27:00.240 |
|
will be to retrofit people's cars
|
|
|
27:00.240 --> 27:02.240 |
|
and just sell these products directly. |
|
|
|
27:02.240 --> 27:04.520 |
|
And the idea was, we'll take all the revenue |
|
|
|
27:04.520 --> 27:08.320 |
|
and profits from that and use it
|
|
|
27:08.320 --> 27:10.920 |
|
to sort of reinvest in research for doing
|
|
|
27:10.920 --> 27:12.600 |
|
fully driverless cars. |
|
|
|
27:12.600 --> 27:13.960 |
|
And that was the plan. |
|
|
|
27:13.960 --> 27:15.720 |
|
The only thing that really changed along the way |
|
|
|
27:15.720 --> 27:17.360 |
|
between then and now is, |
|
|
|
27:17.360 --> 27:19.000 |
|
we never really launched the first product. |
|
|
|
27:19.000 --> 27:21.680 |
|
We had enough interest from investors |
|
|
|
27:21.680 --> 27:24.120 |
|
and enough of a signal that this was something |
|
|
|
27:24.120 --> 27:25.000 |
|
that we should be working on, |
|
|
|
27:25.000 --> 27:28.400 |
|
that after about a year of working on the highway autopilot, |
|
|
|
27:28.400 --> 27:31.040 |
|
we had it working, you know, at a prototype stage, |
|
|
|
27:31.040 --> 27:33.120 |
|
but we just completely abandoned that |
|
|
|
27:33.120 --> 27:34.960 |
|
and said, we're gonna go all in on driverless cars |
|
|
|
27:34.960 --> 27:36.480 |
|
now is the time. |
|
|
|
27:36.480 --> 27:38.120 |
|
Can't think of anything that's more exciting. |
|
|
|
27:38.120 --> 27:39.720 |
|
And if it works more impactful, |
|
|
|
27:39.720 --> 27:41.360 |
|
so we're just gonna go for it. |
|
|
|
27:41.360 --> 27:43.440 |
|
The idea of retrofit is kind of interesting. |
|
|
|
27:43.440 --> 27:44.280 |
|
Yeah. |
|
|
|
27:44.280 --> 27:46.880 |
|
Being able to, it's how you achieve scale. |
|
|
|
27:46.880 --> 27:47.880 |
|
It's a really interesting idea, |
|
|
|
27:47.880 --> 27:51.120 |
|
is it's something that's still in the back of your mind |
|
|
|
27:51.120 --> 27:52.800 |
|
as a possibility? |
|
|
|
27:52.800 --> 27:53.640 |
|
Not at all. |
|
|
|
27:53.640 --> 27:57.080 |
|
I've come full circle on that one after trying |
|
|
|
27:57.080 --> 27:58.880 |
|
to build a retrofit product. |
|
|
|
27:58.880 --> 28:01.240 |
|
And I'll touch on some of the complexities of that. |
|
|
|
28:01.240 --> 28:04.240 |
|
And then also having been inside an OEM |
|
|
|
28:04.240 --> 28:05.400 |
|
and seeing how things work |
|
|
|
28:05.400 --> 28:08.320 |
|
and how a vehicle is developed and validated. |
|
|
|
28:08.320 --> 28:09.360 |
|
When it comes to something |
|
|
|
28:09.360 --> 28:11.280 |
|
that has safety critical implications, |
|
|
|
28:11.280 --> 28:12.520 |
|
like controlling the steering |
|
|
|
28:12.520 --> 28:15.280 |
|
and other control inputs on your car, |
|
|
|
28:15.280 --> 28:17.720 |
|
it's pretty hard to get there with a retrofit. |
|
|
|
28:17.720 --> 28:20.520 |
|
Or if you did, even if you did, |
|
|
|
28:20.520 --> 28:23.280 |
|
it creates a whole bunch of new complications around |
|
|
|
28:23.280 --> 28:25.400 |
|
liability or how did you truly validate that? |
|
|
|
28:25.400 --> 28:27.480 |
|
Or, you know, something in the base vehicle fails |
|
|
|
28:27.480 --> 28:29.880 |
|
and causes your system to fail, whose fault is it? |
|
|
|
28:31.560 --> 28:34.080 |
|
Or if the car's anti lock brake systems |
|
|
|
28:34.080 --> 28:36.680 |
|
or other things kick in, or the software
|
|
|
28:36.680 --> 28:38.240 |
|
is different in one version of the car
|
|
|
28:38.240 --> 28:40.080 |
|
you retrofit versus another, and you don't know
|
|
|
28:40.080 --> 28:43.000 |
|
because the manufacturer has updated it behind the scenes. |
|
|
|
28:43.000 --> 28:45.400 |
|
There's basically an infinite list of long tail issues |
|
|
|
28:45.400 --> 28:46.240 |
|
that can get you. |
|
|
|
28:46.240 --> 28:47.760 |
|
And if you're dealing with a safety critical product, |
|
|
|
28:47.760 --> 28:48.960 |
|
that's not really acceptable. |
|
|
|
28:48.960 --> 28:52.160 |
|
That's a really convincing summary of why |
|
|
|
28:52.160 --> 28:53.160 |
|
it's really challenging. |
|
|
|
28:53.160 --> 28:54.360 |
|
But I didn't know all that at the time. |
|
|
|
28:54.360 --> 28:55.480 |
|
So we tried it anyway. |
|
|
|
28:55.480 --> 28:57.160 |
|
But as a pitch also at the time, |
|
|
|
28:57.160 --> 28:58.400 |
|
it's a really strong one. |
|
|
|
28:58.400 --> 29:00.720 |
|
That's how you achieve scale and that's how you beat |
|
|
|
29:00.720 --> 29:03.360 |
|
the leader at the time, Google, |
|
|
|
29:03.360 --> 29:04.720 |
|
or the only one in the market. |
|
|
|
29:04.720 --> 29:06.840 |
|
The other big problem we ran into, |
|
|
|
29:06.840 --> 29:08.240 |
|
which is perhaps the biggest problem |
|
|
|
29:08.240 --> 29:10.280 |
|
from a business model perspective, |
|
|
|
29:10.280 --> 29:15.280 |
|
is, well, we started with an Audi S4 |
|
|
|
29:15.440 --> 29:16.880 |
|
as the vehicle we retrofitted |
|
|
|
29:16.880 --> 29:18.760 |
|
with this highway driving capability. |
|
|
|
29:18.760 --> 29:21.040 |
|
And we had kind of assumed that if we just knock out |
|
|
|
29:21.040 --> 29:23.360 |
|
like three makes and models of vehicle, |
|
|
|
29:23.360 --> 29:25.880 |
|
that'll cover like 80% of the San Francisco market. |
|
|
|
29:25.880 --> 29:27.400 |
|
Doesn't everyone there drive, I don't know, |
|
|
|
29:27.400 --> 29:30.240 |
|
a BMW or a Honda Civic or one of these three cars? |
|
|
|
29:30.240 --> 29:32.040 |
|
And then we surveyed our users and we found out |
|
|
|
29:32.040 --> 29:33.480 |
|
that it's all over the place. |
|
|
|
29:33.480 --> 29:36.680 |
|
To get even a decent number of units sold, |
|
|
|
29:36.680 --> 29:39.880 |
|
we'd have to support like 20 or 50 different models. |
|
|
|
29:39.880 --> 29:42.200 |
|
And each one is a little butterfly that takes time |
|
|
|
29:42.200 --> 29:44.800 |
|
and effort to maintain that retrofit integration |
|
|
|
29:44.800 --> 29:47.120 |
|
and custom hardware and all this. |
|
|
|
29:47.120 --> 29:49.240 |
|
So it was a tough business. |
|
|
|
29:49.240 --> 29:54.240 |
|
So GM manufactures and sells over nine million cars a year. |
|
|
|
29:54.280 --> 29:58.560 |
|
And what you with Cruise are trying to do is |
|
|
|
29:58.560 --> 30:01.160 |
|
some of the most cutting edge innovation |
|
|
|
30:01.160 --> 30:03.000 |
|
in terms of applying AI. |
|
|
|
30:03.000 --> 30:06.040 |
|
And so how do those, you've talked about it a little bit |
|
|
|
30:06.040 --> 30:07.760 |
|
before, but it's also just fascinating to me, |
|
|
|
30:07.760 --> 30:09.360 |
|
we work with a lot of automakers. |
|
|
|
30:10.560 --> 30:12.880 |
|
The gap between Detroit |
|
|
|
30:12.880 --> 30:14.680 |
|
and Silicon Valley, let's say, |
|
|
|
30:14.680 --> 30:17.320 |
|
just to be sort of poetic about it, I guess. |
|
|
|
30:17.320 --> 30:18.680 |
|
How do you close that gap? |
|
|
|
30:18.680 --> 30:21.480 |
|
How do you take GM into the future |
|
|
|
30:21.480 --> 30:24.840 |
|
where a large part of the fleet would be autonomous perhaps? |
|
|
|
30:24.840 --> 30:28.520 |
|
I wanna start by acknowledging that GM is made up of |
|
|
|
30:28.520 --> 30:30.240 |
|
tens of thousands of really brilliant, |
|
|
|
30:30.240 --> 30:32.720 |
|
motivated people who wanna be a part of the future. |
|
|
|
30:32.720 --> 30:35.240 |
|
And so it's pretty fun to work with them. |
|
|
|
30:35.240 --> 30:37.480 |
|
The attitude inside a car company like that |
|
|
|
30:37.480 --> 30:41.240 |
|
is embracing this transformation and change |
|
|
|
30:41.240 --> 30:42.360 |
|
rather than fearing it. |
|
|
|
30:42.360 --> 30:45.440 |
|
And I think that's a testament to the leadership at GM |
|
|
|
30:45.440 --> 30:47.680 |
|
and that's flowed all the way through to everyone |
|
|
|
30:47.680 --> 30:49.280 |
|
you talk to, even the people in the assembly plants |
|
|
|
30:49.280 --> 30:51.200 |
|
working on these cars. |
|
|
|
30:51.200 --> 30:52.040 |
|
So that's really great. |
|
|
|
30:52.040 --> 30:55.160 |
|
So starting from that position makes it a lot easier. |
|
|
|
30:55.160 --> 30:59.160 |
|
So then when the people in San Francisco |
|
|
|
30:59.160 --> 31:01.400 |
|
at Cruise interact with the people at GM, |
|
|
|
31:01.400 --> 31:02.960 |
|
at least we have this common set of values, |
|
|
|
31:02.960 --> 31:05.000 |
|
which is that we really want this stuff to work |
|
|
|
31:05.000 --> 31:06.040 |
|
because we think it's important |
|
|
|
31:06.040 --> 31:07.440 |
|
and we think it's the future. |
|
|
|
31:08.360 --> 31:11.520 |
|
That's not to say those two cultures don't clash. |
|
|
|
31:11.520 --> 31:12.440 |
|
They absolutely do. |
|
|
|
31:12.440 --> 31:14.760 |
|
There's different sort of value systems. |
|
|
|
31:14.760 --> 31:17.960 |
|
Like in a car company, the thing that gets you promoted |
|
|
|
31:17.960 --> 31:22.600 |
|
and sort of the reward system is following the processes, |
|
|
|
31:22.600 --> 31:26.080 |
|
delivering the program on time and on budget. |
|
|
|
31:26.080 --> 31:30.440 |
|
So any sort of risk taking is discouraged in many ways |
|
|
|
31:30.440 --> 31:34.000 |
|
because if a program is late |
|
|
|
31:34.000 --> 31:36.200 |
|
or if you shut down the plant for a day, |
|
|
|
31:36.200 --> 31:37.560 |
|
you can count the millions of dollars |
|
|
|
31:37.560 --> 31:39.600 |
|
that burn by pretty quickly. |
|
|
|
31:39.600 --> 31:43.800 |
|
Whereas I think most Silicon Valley companies |
|
|
|
31:43.800 --> 31:48.280 |
|
and at Cruise, and in the methodology we were employing, |
|
|
|
31:48.280 --> 31:50.080 |
|
especially around the time of the acquisition, |
|
|
|
31:50.080 --> 31:53.800 |
|
the reward structure is about trying to solve |
|
|
|
31:53.800 --> 31:56.120 |
|
these complex problems in any way, shape or form |
|
|
|
31:56.120 --> 31:59.640 |
|
or coming up with crazy ideas, 90% of which won't work. |
|
|
|
31:59.640 --> 32:02.920 |
|
And so meshing that culture |
|
|
|
32:02.920 --> 32:05.480 |
|
of sort of continuous improvement and experimentation |
|
|
|
32:05.480 --> 32:07.400 |
|
with one where everything needs to be |
|
|
|
32:07.400 --> 32:08.480 |
|
rigorously defined up front |
|
|
|
32:08.480 --> 32:12.760 |
|
so that you never slip a deadline or miss a budget |
|
|
|
32:12.760 --> 32:13.600 |
|
was a pretty big challenge |
|
|
|
32:13.600 --> 32:16.960 |
|
and we're over three years in now |
|
|
|
32:16.960 --> 32:18.360 |
|
after the acquisition. |
|
|
|
32:18.360 --> 32:20.480 |
|
And I'd say like the investment we made |
|
|
|
32:20.480 --> 32:23.600 |
|
in figuring out how to work together successfully |
|
|
|
32:23.600 --> 32:24.440 |
|
and who should do what |
|
|
|
32:24.440 --> 32:26.360 |
|
and how we bridge the gaps |
|
|
|
32:26.360 --> 32:27.680 |
|
between these very different systems |
|
|
|
32:27.680 --> 32:29.520 |
|
and ways of doing engineering work |
|
|
|
32:29.520 --> 32:30.920 |
|
is now one of our greatest assets |
|
|
|
32:30.920 --> 32:32.320 |
|
because I think we have this really powerful thing |
|
|
|
32:32.320 --> 32:35.560 |
|
but for a while both GM and Cruise |
|
|
|
32:35.560 --> 32:37.440 |
|
were very steep on the learning curve. |
|
|
|
32:37.440 --> 32:38.920 |
|
Yeah, so I'm sure it was very stressful. |
|
|
|
32:38.920 --> 32:39.960 |
|
It's really important work |
|
|
|
32:39.960 --> 32:43.680 |
|
because that's how you revolutionize transportation. |
|
|
|
32:43.680 --> 32:46.640 |
|
Really to revolutionize any system, |
|
|
|
32:46.640 --> 32:48.200 |
|
you look at the healthcare system |
|
|
|
32:48.200 --> 32:49.680 |
|
or you look at the legal system. |
|
|
|
32:49.680 --> 32:52.040 |
|
I have people like lawyers come up to me all the time, |
|
|
|
32:52.040 --> 32:53.920 |
|
like everything they're working on |
|
|
|
32:53.920 --> 32:55.960 |
|
can easily be automated. |
|
|
|
32:55.960 --> 32:57.480 |
|
But then that's not a good feeling. |
|
|
|
32:57.480 --> 32:58.320 |
|
Yeah. |
|
|
|
32:58.320 --> 32:59.160 |
|
Well, it's not a good feeling, |
|
|
|
32:59.160 --> 33:01.200 |
|
but also there's no way to automate |
|
|
|
33:01.200 --> 33:06.200 |
|
because the entire infrastructure |
|
|
|
33:06.360 --> 33:08.360 |
|
is really old and it moves very slowly. |
|
|
|
33:08.360 --> 33:11.560 |
|
And so how do you close that gap? |
|
|
|
33:11.560 --> 33:13.880 |
|
I mean, how can you replace that? |
|
|
|
33:13.880 --> 33:15.720 |
|
Of course, lawyers won't be replaced with an app, |
|
|
|
33:15.720 --> 33:17.920 |
|
but you could replace a lot of aspects |
|
|
|
33:17.920 --> 33:20.160 |
|
when most of the data is still on paper. |
|
|
|
33:20.160 --> 33:23.400 |
|
And so the same thing with automotive. |
|
|
|
33:23.400 --> 33:26.080 |
|
I mean, it's fundamentally software. |
|
|
|
33:26.080 --> 33:28.560 |
|
So it's basically hiring software engineers. |
|
|
|
33:28.560 --> 33:30.320 |
|
It's the thinking of the software world. |
|
|
|
33:30.320 --> 33:32.560 |
|
I mean, I'm pretty sure nobody in Silicon Valley |
|
|
|
33:32.560 --> 33:34.640 |
|
has ever hit a deadline. |
|
|
|
33:34.640 --> 33:36.000 |
|
So and then on GM. |
|
|
|
33:36.000 --> 33:37.400 |
|
That's probably true, yeah. |
|
|
|
33:37.400 --> 33:39.920 |
|
And the GM side is probably the opposite. |
|
|
|
33:39.920 --> 33:42.720 |
|
So that culture gap is really fascinating. |
|
|
|
33:42.720 --> 33:45.160 |
|
So you're optimistic about the future of that. |
|
|
|
33:45.160 --> 33:47.440 |
|
Yeah, I mean, from what I've seen, it's impressive. |
|
|
|
33:47.440 --> 33:49.400 |
|
And I think like, especially in Silicon Valley, |
|
|
|
33:49.400 --> 33:51.440 |
|
it's easy to write off building cars |
|
|
|
33:51.440 --> 33:53.120 |
|
because people have been doing that |
|
|
|
33:53.120 --> 33:54.960 |
|
for over a hundred years now in this country. |
|
|
|
33:54.960 --> 33:57.080 |
|
And so it seems like that's a solved problem, |
|
|
|
33:57.080 --> 33:58.840 |
|
but that doesn't mean it's an easy problem. |
|
|
|
33:58.840 --> 34:02.280 |
|
And I think it would be easy to sort of overlook that |
|
|
|
34:02.280 --> 34:06.080 |
|
and think that we're Silicon Valley engineers, |
|
|
|
34:06.080 --> 34:08.960 |
|
we can solve any problem, building a car, |
|
|
|
34:08.960 --> 34:13.200 |
|
it's been done, therefore it's not a real engineering |
|
|
|
34:13.200 --> 34:14.600 |
|
challenge. |
|
|
|
34:14.600 --> 34:17.480 |
|
But after having seen just the sheer scale |
|
|
|
34:17.480 --> 34:21.360 |
|
and magnitude and industrialization that occurs |
|
|
|
34:21.360 --> 34:23.280 |
|
inside of an automotive assembly plant, |
|
|
|
34:23.280 --> 34:25.840 |
|
that is a lot of work that I am very glad |
|
|
|
34:25.840 --> 34:28.200 |
|
that we don't have to reinvent |
|
|
|
34:28.200 --> 34:29.480 |
|
to make self driving cars work. |
|
|
|
34:29.480 --> 34:31.680 |
|
And so to have partners who have done that for a hundred |
|
|
|
34:31.680 --> 34:32.960 |
|
years and have these great processes |
|
|
|
34:32.960 --> 34:35.720 |
|
and this huge infrastructure and supply base |
|
|
|
34:35.720 --> 34:38.760 |
|
that we can tap into is just remarkable |
|
|
|
34:38.760 --> 34:43.760 |
|
because the scope and surface area of the problem |
|
|
|
34:44.560 --> 34:47.400 |
|
of deploying fleets of self driving cars is so large |
|
|
|
34:47.400 --> 34:50.320 |
|
that we're constantly looking for ways to do less |
|
|
|
34:50.320 --> 34:52.920 |
|
so we can focus on the things that really matter more. |
|
|
|
34:52.920 --> 34:55.360 |
|
And if we had to figure out how to build and assemble |
|
|
|
34:55.360 --> 35:00.120 |
|
and test the cars themselves, |
|
|
|
35:00.120 --> 35:01.640 |
|
I mean, we work closely with GM on that, |
|
|
|
35:01.640 --> 35:03.240 |
|
but if we had to develop all that capability |
|
|
|
35:03.240 --> 35:08.240 |
|
in house as well, that would just make the problem |
|
|
|
35:08.320 --> 35:10.200 |
|
really intractable, I think. |
|
|
|
35:10.200 --> 35:14.880 |
|
So yeah, just like your first entry at the MIT DARPA |
|
|
|
35:14.880 --> 35:17.680 |
|
challenge, when it was, what, the motor that failed, |
|
|
|
35:17.680 --> 35:19.000 |
|
and you wished somebody that knows what they're doing |
|
|
|
35:19.000 --> 35:20.040 |
|
with the motor had done it. |
|
|
|
35:20.040 --> 35:22.080 |
|
It would have been nice if we could focus on the software |
|
|
|
35:22.080 --> 35:23.880 |
|
and not the hardware platform. |
|
|
|
35:23.880 --> 35:24.800 |
|
Yeah, right. |
|
|
|
35:24.800 --> 35:27.080 |
|
So from your perspective now, |
|
|
|
35:28.080 --> 35:29.960 |
|
there's so many ways that autonomous vehicles |
|
|
|
35:29.960 --> 35:34.280 |
|
can impact society in the next year, five years, 10 years. |
|
|
|
35:34.280 --> 35:37.080 |
|
What do you think is the biggest opportunity |
|
|
|
35:37.080 --> 35:39.360 |
|
to make money in autonomous driving, |
|
|
|
35:40.560 --> 35:44.720 |
|
sort of make it a financially viable thing in the near term? |
|
|
|
35:44.720 --> 35:49.120 |
|
What do you think would be the biggest impact there? |
|
|
|
35:49.120 --> 35:52.160 |
|
Well, the things that drive the economics |
|
|
|
35:52.160 --> 35:53.600 |
|
for fleets of self driving cars |
|
|
|
35:53.600 --> 35:56.440 |
|
come down to a handful of variables. |
|
|
|
35:56.440 --> 36:00.400 |
|
One is the cost to build the vehicle itself. |
|
|
|
36:00.400 --> 36:03.720 |
|
So the material cost, what's the cost of all your sensors, |
|
|
|
36:03.720 --> 36:05.200 |
|
plus the cost of the vehicle |
|
|
|
36:05.200 --> 36:07.560 |
|
and all the other components on it. |
|
|
|
36:07.560 --> 36:09.520 |
|
Another one is the lifetime of the vehicle. |
|
|
|
36:09.520 --> 36:12.480 |
|
It's very different if your vehicle drives 100,000 miles |
|
|
|
36:12.480 --> 36:14.800 |
|
and then it falls apart versus 2 million. |
|
|
|
36:16.720 --> 36:18.840 |
|
And then if you have a fleet, |
|
|
|
36:18.840 --> 36:22.920 |
|
it's kind of like an airplane or an airline |
|
|
|
36:22.920 --> 36:26.120 |
|
where once you produce the vehicle, |
|
|
|
36:26.120 --> 36:27.880 |
|
you want it to be in operation |
|
|
|
36:27.880 --> 36:30.760 |
|
as many hours a day as possible producing revenue. |
|
|
|
36:30.760 --> 36:32.480 |
|
And then the other piece of that |
|
|
|
36:32.480 --> 36:35.280 |
|
is how are you generating revenue? |
|
|
|
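(To make the four variables just listed concrete, here is a minimal back-of-envelope sketch in Python covering build cost, revenue, operating cost, and daily utilization. Every number is a hypothetical placeholder, not a Cruise figure.)

```python
# Minimal sketch of the fleet economics described above: build cost,
# vehicle lifetime, daily utilization, and revenue. All numbers are
# illustrative assumptions, not Cruise figures.

def breakeven_days(build_cost: float,
                   revenue_per_hour: float,
                   operating_cost_per_hour: float,
                   hours_in_service_per_day: float) -> float:
    """Days in service before cumulative margin covers the build cost."""
    daily_margin = (revenue_per_hour - operating_cost_per_hour) \
                   * hours_in_service_per_day
    return build_cost / daily_margin

# A vehicle that costs more to build but runs more hours a day and
# lasts more lifetime miles can still win on overall economics.
days = breakeven_days(build_cost=150_000,          # sensors + vehicle (assumed)
                      revenue_per_hour=30.0,       # ride-hail revenue (assumed)
                      operating_cost_per_hour=12.0,
                      hours_in_service_per_day=16.0)
print(f"Breakeven after roughly {days:.0f} days in service")
```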
36:35.280 --> 36:36.880 |
|
I think that's kind of what you're asking about. |
|
|
|
36:36.880 --> 36:38.400 |
|
I think the obvious things today |
|
|
|
36:38.400 --> 36:40.080 |
|
are the ride sharing business |
|
|
|
36:40.080 --> 36:42.760 |
|
because that's pretty clear that there's demand for that. |
|
|
|
36:42.760 --> 36:46.240 |
|
There's existing markets you can tap into and... |
|
|
|
36:46.240 --> 36:47.960 |
|
Large urban areas, that kind of thing. |
|
|
|
36:47.960 --> 36:48.800 |
|
Yeah, yeah. |
|
|
|
36:48.800 --> 36:51.200 |
|
And I think that there are some real benefits |
|
|
|
36:51.200 --> 36:54.520 |
|
to having cars without drivers |
|
|
|
36:54.520 --> 36:56.040 |
|
compared to sort of the status quo |
|
|
|
36:56.040 --> 36:58.520 |
|
for people who use ride share services today. |
|
|
|
36:58.520 --> 37:01.040 |
|
You know, your privacy, consistency, |
|
|
|
37:01.040 --> 37:02.440 |
|
hopefully significantly improved safety, |
|
|
|
37:02.440 --> 37:05.120 |
|
all these benefits versus the current product. |
|
|
|
37:05.120 --> 37:06.520 |
|
But it's a crowded market. |
|
|
|
37:06.520 --> 37:08.000 |
|
And then other opportunities |
|
|
|
37:08.000 --> 37:09.600 |
|
which you've seen a lot of activity in the last, |
|
|
|
37:09.600 --> 37:12.560 |
|
really in the last six or 12 months is delivery, |
|
|
|
37:12.560 --> 37:17.560 |
|
whether that's parcels and packages, food or groceries. |
|
|
|
37:17.800 --> 37:20.320 |
|
Those are all sort of, I think, opportunities |
|
|
|
37:20.320 --> 37:23.640 |
|
that are pretty ripe for these. |
|
|
|
37:23.640 --> 37:26.000 |
|
Once you have this core technology, |
|
|
|
37:26.000 --> 37:28.080 |
|
which is the fleet of autonomous vehicles, |
|
|
|
37:28.080 --> 37:30.920 |
|
there's all sorts of different business opportunities |
|
|
|
37:30.920 --> 37:32.080 |
|
you can build on top of that. |
|
|
|
37:32.080 --> 37:34.520 |
|
But I think the important thing, of course, |
|
|
|
37:34.520 --> 37:36.440 |
|
is that there's zero monetization opportunity |
|
|
|
37:36.440 --> 37:37.520 |
|
until you actually have that fleet |
|
|
|
37:37.520 --> 37:39.160 |
|
of very capable driverless cars |
|
|
|
37:39.160 --> 37:41.040 |
|
that are as good or better than humans. |
|
|
|
37:41.040 --> 37:44.120 |
|
And that's sort of where the entire industry |
|
|
|
37:44.120 --> 37:45.920 |
|
is sort of in this holding pattern right now. |
|
|
|
37:45.920 --> 37:47.960 |
|
Yeah, they're trying to achieve that baseline. |
|
|
|
37:47.960 --> 37:51.520 |
|
But you said, sort of, not reliability but consistency. |
|
|
|
37:51.520 --> 37:52.360 |
|
It's kind of interesting. |
|
|
|
37:52.360 --> 37:54.200 |
|
I think I heard you say somewhere, |
|
|
|
37:54.200 --> 37:55.440 |
|
not sure if that's what you meant, |
|
|
|
37:55.440 --> 37:58.240 |
|
but I can imagine a situation |
|
|
|
37:58.240 --> 38:01.200 |
|
where you would get an autonomous vehicle. |
|
|
|
38:01.200 --> 38:04.560 |
|
And when you get into an Uber or Lyft, |
|
|
|
38:04.560 --> 38:05.960 |
|
you don't get to choose the driver |
|
|
|
38:05.960 --> 38:07.320 |
|
in a sense that you don't get to choose |
|
|
|
38:07.320 --> 38:09.080 |
|
the personality of the driving. |
|
|
|
38:09.080 --> 38:12.040 |
|
Do you think there's room |
|
|
|
38:12.040 --> 38:14.120 |
|
to define the personality of the car |
|
|
|
38:14.120 --> 38:15.040 |
|
the way it drives you, |
|
|
|
38:15.040 --> 38:17.600 |
|
in terms of aggressiveness, for example, |
|
|
|
38:17.600 --> 38:21.120 |
|
in terms of sort of pushing the boundaries. |
|
|
|
38:21.120 --> 38:22.760 |
|
One of the biggest challenges in autonomous driving |
|
|
|
38:22.760 --> 38:27.760 |
|
is the trade off between sort of safety and assertiveness. |
|
|
|
38:28.600 --> 38:30.920 |
|
And do you think there's any room |
|
|
|
38:30.920 --> 38:35.920 |
|
for the human to take a role in that decision? |
|
|
|
38:36.040 --> 38:38.080 |
|
Sort of accept some of the liability, I guess. |
|
|
|
38:38.080 --> 38:41.000 |
|
I wouldn't say, no, I'd say within reasonable bounds, |
|
|
|
38:41.000 --> 38:42.280 |
|
as in we're not gonna, |
|
|
|
38:43.200 --> 38:44.360 |
|
I think it'd be highly unlikely |
|
|
|
38:44.360 --> 38:46.600 |
|
we'd expose any knob that would let you |
|
|
|
38:46.600 --> 38:50.240 |
|
significantly increase safety risk. |
|
|
|
38:50.240 --> 38:53.080 |
|
I think that's just not something we'd be willing to do. |
|
|
|
38:53.080 --> 38:56.760 |
|
But I think driving style or like, |
|
|
|
38:56.760 --> 38:59.120 |
|
are you gonna relax the comfort constraints slightly |
|
|
|
38:59.120 --> 39:00.160 |
|
or things like that? |
|
|
|
39:00.160 --> 39:02.400 |
|
All of those things make sense and are plausible. |
|
|
|
39:02.400 --> 39:04.480 |
|
I see all those as nice optimizations. |
|
|
|
39:04.480 --> 39:06.760 |
|
Once we get the core problem solved |
|
|
|
39:06.760 --> 39:08.120 |
|
and these fleets out there. |
|
|
|
39:08.120 --> 39:10.440 |
|
But the other thing we've sort of observed |
|
|
|
39:10.440 --> 39:12.560 |
|
is that you have this intuition |
|
|
|
39:12.560 --> 39:15.400 |
|
that if you sort of slam your foot on the gas |
|
|
|
39:15.400 --> 39:16.680 |
|
right after the light turns green |
|
|
|
39:16.680 --> 39:18.160 |
|
and aggressively accelerate, |
|
|
|
39:18.160 --> 39:19.720 |
|
you're gonna get there faster. |
|
|
|
39:19.720 --> 39:22.080 |
|
But the actual impact of doing that is pretty small. |
|
|
|
39:22.080 --> 39:23.680 |
|
You feel like you're getting there faster, |
|
|
|
39:23.680 --> 39:26.680 |
|
but the same would be true for AVs. |
|
|
|
39:26.680 --> 39:29.640 |
|
Even if they don't slam the pedal to the floor |
|
|
|
39:29.640 --> 39:31.000 |
|
when the light turns green, |
|
|
|
39:31.000 --> 39:32.520 |
|
they're gonna get you there within, |
|
|
|
39:32.520 --> 39:33.600 |
|
if it's a 15 minute trip, |
|
|
|
39:33.600 --> 39:36.400 |
|
within 30 seconds of what you would have done otherwise |
|
|
|
39:36.400 --> 39:37.800 |
|
if you were going really aggressively. |
|
|
|
39:37.800 --> 39:40.760 |
|
So I think there's this sort of self deception |
|
|
|
39:40.760 --> 39:44.440 |
|
that my aggressive driving style is getting me there faster. |
|
|
|
39:44.440 --> 39:46.640 |
|
Well, so that's, you know, some of the things I study, |
|
|
|
39:46.640 --> 39:48.760 |
|
some of the things I'm fascinated by, the psychology of that. |
|
|
|
39:48.760 --> 39:50.640 |
|
And I don't think it matters |
|
|
|
39:50.640 --> 39:52.240 |
|
that it doesn't get you there faster. |
|
|
|
39:52.240 --> 39:55.520 |
|
It's the emotional release. |
|
|
|
39:55.520 --> 39:59.080 |
|
Driving is a place, being inside our car, |
|
|
|
39:59.080 --> 40:00.880 |
|
somebody said it's like the real world version |
|
|
|
40:00.880 --> 40:02.920 |
|
of being a troll. |
|
|
|
40:02.920 --> 40:04.960 |
|
So you have this protection, this mental protection, |
|
|
|
40:04.960 --> 40:06.640 |
|
and you're able to sort of yell at the world, |
|
|
|
40:06.640 --> 40:08.200 |
|
like release your anger, whatever it is. |
|
|
|
40:08.200 --> 40:10.040 |
|
But so there's an element of that |
|
|
|
40:10.040 --> 40:12.000 |
|
that I think autonomous vehicles |
|
|
|
40:12.000 --> 40:15.400 |
|
would also have to, you know, give an outlet to people, |
|
|
|
40:15.400 --> 40:19.120 |
|
but it doesn't have to be through driving or honking |
|
|
|
40:19.120 --> 40:21.200 |
|
or so on, there might be other outlets. |
|
|
|
40:21.200 --> 40:24.040 |
|
But I think to just sort of even just put that aside, |
|
|
|
40:24.040 --> 40:26.880 |
|
the baseline is really, you know, that's the focus, |
|
|
|
40:26.880 --> 40:28.200 |
|
that's the thing you need to solve, |
|
|
|
40:28.200 --> 40:31.000 |
|
and then the fun human things can be solved after. |
|
|
|
40:31.000 --> 40:34.680 |
|
But so from the baseline of just solving autonomous driving, |
|
|
|
40:34.680 --> 40:36.000 |
|
you're working in San Francisco, |
|
|
|
40:36.000 --> 40:38.960 |
|
one of the more difficult cities to operate in, |
|
|
|
40:38.960 --> 40:42.080 |
|
what is the, in your view currently, |
|
|
|
40:42.080 --> 40:45.040 |
|
the hardest aspect of autonomous driving? |
|
|
|
40:46.880 --> 40:49.200 |
|
Negotiating with pedestrians, |
|
|
|
40:49.200 --> 40:51.400 |
|
is it edge cases of perception? |
|
|
|
40:51.400 --> 40:52.760 |
|
Is it planning? |
|
|
|
40:52.760 --> 40:54.520 |
|
Is it mechanical engineering? |
|
|
|
40:54.520 --> 40:57.040 |
|
Is it data, fleet stuff? |
|
|
|
40:57.040 --> 41:01.200 |
|
What are your thoughts on the more challenging aspects there? |
|
|
|
41:01.200 --> 41:02.240 |
|
That's a good question. |
|
|
|
41:02.240 --> 41:03.520 |
|
I think before we go to that though, |
|
|
|
41:03.520 --> 41:05.080 |
|
I just want to, I like what you said |
|
|
|
41:05.080 --> 41:07.600 |
|
about the psychology aspect of this, |
|
|
|
41:07.600 --> 41:09.680 |
|
because I think one observation I've made is, |
|
|
|
41:09.680 --> 41:11.760 |
|
I think I read somewhere that I think it's, |
|
|
|
41:11.760 --> 41:13.880 |
|
maybe Americans on average spend, you know, |
|
|
|
41:13.880 --> 41:16.520 |
|
over an hour a day on social media, |
|
|
|
41:16.520 --> 41:18.280 |
|
like staring at Facebook. |
|
|
|
41:18.280 --> 41:20.080 |
|
And so that's just, you know, |
|
|
|
41:20.080 --> 41:21.600 |
|
60 minutes of your life, you're not getting back. |
|
|
|
41:21.600 --> 41:23.120 |
|
It's probably not super productive. |
|
|
|
41:23.120 --> 41:26.200 |
|
And so that's 3,600 seconds, right? |
|
|
|
41:26.200 --> 41:29.160 |
|
And that's, that's time, you know, |
|
|
|
41:29.160 --> 41:30.600 |
|
it's a lot of time you're giving up. |
|
|
|
41:30.600 --> 41:34.080 |
|
And if you compare that to people being on the road, |
|
|
|
41:34.080 --> 41:35.360 |
|
if another vehicle, |
|
|
|
41:35.360 --> 41:37.600 |
|
whether it's a human driver or autonomous vehicle, |
|
|
|
41:37.600 --> 41:39.840 |
|
delays them by even three seconds, |
|
|
|
41:39.840 --> 41:41.920 |
|
they're laying on the horn, you know, |
|
|
|
41:41.920 --> 41:43.280 |
|
even though that's, that's, you know, |
|
|
|
41:43.280 --> 41:45.280 |
|
one 1,000th of the time they waste |
|
|
|
41:45.280 --> 41:46.360 |
|
looking at Facebook every day. |
|
|
|
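(A quick check of the arithmetic in this comparison: the exact ratio of a three-second delay to an hour of social media is 1/1,200, the same ballpark as the quoted one 1,000th.)

```python
# Sanity check on the time comparison above.
social_media_seconds = 60 * 60   # "over an hour a day" is about 3,600 seconds
delay_seconds = 3                # the delay that triggers the horn

print(delay_seconds / social_media_seconds)           # 0.000833...
print(f"1/{social_media_seconds // delay_seconds}")   # 1/1200
```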
41:46.360 --> 41:48.640 |
|
So there's, there's definitely some, |
|
|
|
41:48.640 --> 41:50.040 |
|
you know, psychology aspects of this, |
|
|
|
41:50.040 --> 41:50.880 |
|
I think that are pretty interesting. |
|
|
|
41:50.880 --> 41:51.720 |
|
Road rage in general. |
|
|
|
41:51.720 --> 41:52.960 |
|
And then the question, of course, |
|
|
|
41:52.960 --> 41:54.960 |
|
is if everyone is in self driving cars, |
|
|
|
41:54.960 --> 41:57.560 |
|
do they even notice these three second delays anymore? |
|
|
|
41:57.560 --> 41:58.920 |
|
Because they're doing other things |
|
|
|
41:58.920 --> 42:01.720 |
|
or reading or working or just talking to each other. |
|
|
|
42:01.720 --> 42:03.200 |
|
So it'll be interesting to see where that goes. |
|
|
|
42:03.200 --> 42:05.120 |
|
In a certain aspect, |
|
|
|
42:05.120 --> 42:06.360 |
|
people need to be distracted |
|
|
|
42:06.360 --> 42:07.360 |
|
by something entertaining, |
|
|
|
42:07.360 --> 42:09.160 |
|
something useful inside the car |
|
|
|
42:09.160 --> 42:10.960 |
|
so they don't pay attention to the external world. |
|
|
|
42:10.960 --> 42:14.240 |
|
And then, and then they can take whatever psychology |
|
|
|
42:14.240 --> 42:17.400 |
|
and bring it back to Twitter and then focus on that |
|
|
|
42:17.400 --> 42:19.640 |
|
as opposed to sort of interacting, |
|
|
|
42:20.920 --> 42:23.200 |
|
sort of putting the emotion out there into the world. |
|
|
|
42:23.200 --> 42:24.560 |
|
So it's an interesting problem, |
|
|
|
42:24.560 --> 42:26.960 |
|
but baseline autonomy. |
|
|
|
42:26.960 --> 42:28.760 |
|
I guess you could say self driving cars, |
|
|
|
42:28.760 --> 42:31.680 |
|
you know, at scale will lower the collective blood pressure |
|
|
|
42:31.680 --> 42:33.920 |
|
of society probably by a couple of points |
|
|
|
42:33.920 --> 42:35.760 |
|
without all that road rage and stress. |
|
|
|
42:35.760 --> 42:37.480 |
|
So that's a good, good externality. |
|
|
|
42:38.560 --> 42:41.760 |
|
So back to your question about the technology |
|
|
|
42:41.760 --> 42:43.760 |
|
and the, I guess the biggest problems. |
|
|
|
42:43.760 --> 42:45.560 |
|
And I have a hard time answering that question |
|
|
|
42:45.560 --> 42:48.680 |
|
because, you know, we've been at this, |
|
|
|
42:48.680 --> 42:51.440 |
|
like specifically focusing on driverless cars |
|
|
|
42:51.440 --> 42:53.520 |
|
and all the technology needed to enable that |
|
|
|
42:53.520 --> 42:55.160 |
|
for a little over four and a half years now. |
|
|
|
42:55.160 --> 42:58.080 |
|
And even a year or two in, |
|
|
|
42:58.080 --> 43:02.960 |
|
I felt like we had completed the functionality needed |
|
|
|
43:02.960 --> 43:04.800 |
|
to get someone from point A to point B. |
|
|
|
43:04.800 --> 43:07.280 |
|
As in, if we need to do a left turn maneuver |
|
|
|
43:07.280 --> 43:08.960 |
|
or if we need to drive around a, you know, |
|
|
|
43:08.960 --> 43:11.800 |
|
a double parked vehicle into oncoming traffic |
|
|
|
43:11.800 --> 43:13.840 |
|
or navigate through construction zones, |
|
|
|
43:13.840 --> 43:15.960 |
|
the scaffolding and the building blocks |
|
|
|
43:15.960 --> 43:17.800 |
|
was there pretty early on. |
|
|
|
43:17.800 --> 43:22.360 |
|
And so the challenge is not any one scenario or situation |
|
|
|
43:22.360 --> 43:25.520 |
|
where, you know, we fail 100% of the time. |
|
|
|
43:25.520 --> 43:28.960 |
|
It's more, you know, we're benchmarking against a pretty good |
|
|
|
43:28.960 --> 43:31.320 |
|
or pretty high standard, which is human driving. |
|
|
|
43:31.320 --> 43:33.320 |
|
All things considered, humans are excellent |
|
|
|
43:33.320 --> 43:36.240 |
|
at handling edge cases and unexpected scenarios |
|
|
|
43:36.240 --> 43:38.400 |
|
whereas computers are the opposite. |
|
|
|
43:38.400 --> 43:43.080 |
|
And so beating that baseline set by humans is the challenge. |
|
|
|
43:43.080 --> 43:46.520 |
|
And so what we've been doing for quite some time now |
|
|
|
43:46.520 --> 43:50.760 |
|
is basically it's this continuous improvement process |
|
|
|
43:50.760 --> 43:55.000 |
|
where we find sort of the most, you know, uncomfortable things |
|
|
|
43:55.000 --> 43:59.840 |
|
or the things that could lead to a safety issue |
|
|
|
43:59.840 --> 44:00.960 |
|
or other things, all these events. |
|
|
|
44:00.960 --> 44:02.520 |
|
And then we sort of categorize them |
|
|
|
44:02.520 --> 44:04.560 |
|
and rework parts of our system |
|
|
|
44:04.560 --> 44:06.200 |
|
to make incremental improvements |
|
|
|
44:06.200 --> 44:08.040 |
|
and do that over and over and over again. |
|
|
|
44:08.040 --> 44:10.160 |
|
And we just see sort of the overall performance |
|
|
|
44:10.160 --> 44:12.120 |
|
of the system, you know, |
|
|
|
44:12.120 --> 44:13.960 |
|
actually increasing at a pretty steady clip. |
|
|
|
44:13.960 --> 44:15.360 |
|
But there's no one thing. |
|
|
|
44:15.360 --> 44:17.360 |
|
There's actually like thousands of little things |
|
|
|
44:17.360 --> 44:19.880 |
|
and just like polishing functionality |
|
|
|
44:19.880 --> 44:21.640 |
|
and making sure that it handles, you know, |
|
|
|
44:21.640 --> 44:26.120 |
|
every version and possible permutation of a situation |
|
|
|
44:26.120 --> 44:29.200 |
|
by either applying more deep learning systems |
|
|
|
44:30.120 --> 44:32.960 |
|
or just by, you know, adding more test coverage |
|
|
|
44:32.960 --> 44:35.760 |
|
or new scenarios that we develop against |
|
|
|
44:35.760 --> 44:37.160 |
|
and just grinding on that. |
|
|
|
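(A hypothetical sketch of the categorize-and-rework loop described here; the event categories and data are invented for illustration and do not reflect Cruise's internal tooling.)

```python
# Hypothetical sketch of the continuous improvement loop described above:
# collect uncomfortable or safety-relevant events from the fleet, bucket
# them by category, and rework the biggest buckets first. Names invented.
from collections import Counter

def triage(events, top_n=3):
    """Return the most frequent event categories, worst first."""
    return Counter(e["category"] for e in events).most_common(top_n)

fleet_events = [
    {"category": "double_parked_vehicle"},
    {"category": "construction_zone"},
    {"category": "double_parked_vehicle"},
    {"category": "unprotected_left"},
    {"category": "double_parked_vehicle"},
]

# Each iteration: rework the top buckets, redeploy, measure again.
for category, count in triage(fleet_events):
    print(f"rework candidate: {category} ({count} events)")
```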
44:37.160 --> 44:40.120 |
|
We're sort of in the unsexy phase of development right now |
|
|
|
44:40.120 --> 44:41.800 |
|
which is doing the real engineering work |
|
|
|
44:41.800 --> 44:44.120 |
|
that it takes to go from prototype to production. |
|
|
|
44:44.120 --> 44:46.960 |
|
You're basically scaling the grinding. |
|
|
|
44:46.960 --> 44:50.560 |
|
So sort of taking seriously the process |
|
|
|
44:50.560 --> 44:54.040 |
|
of all those edge cases, both with human experts |
|
|
|
44:54.040 --> 44:57.520 |
|
and machine learning methods, |
|
|
|
44:57.520 --> 44:59.320 |
|
to cover all those situations. |
|
|
|
44:59.320 --> 45:00.760 |
|
Yeah, and the exciting thing for me is |
|
|
|
45:00.760 --> 45:03.000 |
|
I don't think that grinding ever stops |
|
|
|
45:03.000 --> 45:04.840 |
|
because there's a moment in time |
|
|
|
45:04.840 --> 45:08.760 |
|
where you've crossed that threshold of human performance |
|
|
|
45:08.760 --> 45:10.000 |
|
and become superhuman. |
|
|
|
45:11.200 --> 45:13.560 |
|
But there's no reason, there's no first principles reason |
|
|
|
45:13.560 --> 45:17.560 |
|
that AV capability will tap out anywhere near humans. |
|
|
|
45:17.560 --> 45:20.280 |
|
Like there's no reason it couldn't be 20 times better |
|
|
|
45:20.280 --> 45:22.120 |
|
whether that's, you know, just better driving |
|
|
|
45:22.120 --> 45:24.240 |
|
or safer driving or more comfortable driving |
|
|
|
45:24.240 --> 45:26.800 |
|
or even a thousand times better given enough time. |
|
|
|
45:26.800 --> 45:31.480 |
|
And we intend to basically chase that, you know, forever |
|
|
|
45:31.480 --> 45:32.840 |
|
to build the best possible product. |
|
|
|
45:32.840 --> 45:33.960 |
|
Better and better and better |
|
|
|
45:33.960 --> 45:36.400 |
|
and always new edge cases come up and new experiences. |
|
|
|
45:36.400 --> 45:39.520 |
|
So, and you want to automate that process |
|
|
|
45:39.520 --> 45:40.720 |
|
as much as possible. |
|
|
|
45:42.680 --> 45:45.160 |
|
So what do you think in general in society |
|
|
|
45:45.160 --> 45:48.200 |
|
when do you think we may have hundreds of thousands |
|
|
|
45:48.200 --> 45:50.200 |
|
of fully autonomous vehicles driving around? |
|
|
|
45:50.200 --> 45:53.560 |
|
So first of all, predictions, nobody knows the future. |
|
|
|
45:53.560 --> 45:55.360 |
|
You're a part of the leading people |
|
|
|
45:55.360 --> 45:56.560 |
|
trying to define that future, |
|
|
|
45:56.560 --> 45:58.560 |
|
but even then you still don't know. |
|
|
|
45:58.560 --> 46:02.240 |
|
But if you think about hundreds of thousands of vehicles, |
|
|
|
46:02.240 --> 46:05.840 |
|
so a significant fraction of vehicles |
|
|
|
46:05.840 --> 46:07.600 |
|
in major cities are autonomous. |
|
|
|
46:07.600 --> 46:10.800 |
|
Do you think, are you with Rodney Brooks |
|
|
|
46:10.800 --> 46:13.960 |
|
who says 2050 and beyond? |
|
|
|
46:13.960 --> 46:17.200 |
|
Or are you more with Elon Musk |
|
|
|
46:17.200 --> 46:20.600 |
|
who says we should have had that two years ago? |
|
|
|
46:20.600 --> 46:23.840 |
|
Well, I mean, I'd love to have it two years ago, |
|
|
|
46:23.840 --> 46:26.120 |
|
but we're not there yet. |
|
|
|
46:26.120 --> 46:28.480 |
|
So I guess the way I would think about that |
|
|
|
46:28.480 --> 46:31.240 |
|
is let's flip that question around. |
|
|
|
46:31.240 --> 46:34.200 |
|
So what would prevent you from reaching hundreds |
|
|
|
46:34.200 --> 46:36.320 |
|
of thousands of vehicles and... |
|
|
|
46:36.320 --> 46:38.200 |
|
That's a good rephrasing. |
|
|
|
46:38.200 --> 46:43.200 |
|
Yeah, I'd say it seems the consensus |
|
|
|
46:43.200 --> 46:45.200 |
|
among the people developing self driving cars today |
|
|
|
46:45.200 --> 46:49.200 |
|
is to sort of start with some form of an easier environment, |
|
|
|
46:49.200 --> 46:52.200 |
|
whether that means lacking inclement weather, |
|
|
|
46:52.200 --> 46:55.200 |
|
or mostly sunny or whatever it is. |
|
|
|
46:55.200 --> 46:59.200 |
|
And then add capability for more complex situations |
|
|
|
46:59.200 --> 47:00.200 |
|
over time. |
|
|
|
47:00.200 --> 47:05.200 |
|
And so if you're only able to deploy in areas |
|
|
|
47:05.200 --> 47:07.200 |
|
that meet sort of your criteria |
|
|
|
47:07.200 --> 47:09.200 |
|
or that fit within the current |
|
|
|
47:09.200 --> 47:13.200 |
|
operating domain of the software you've developed, |
|
|
|
47:13.200 --> 47:16.200 |
|
that may put a cap on how many cities you could deploy in. |
|
|
|
47:16.200 --> 47:19.200 |
|
But then as those restrictions start to fall away, |
|
|
|
47:19.200 --> 47:22.200 |
|
like maybe you add capability to drive really well |
|
|
|
47:22.200 --> 47:25.200 |
|
and safely in heavy rain or snow, |
|
|
|
47:25.200 --> 47:28.200 |
|
that probably opens up the market by two or three fold |
|
|
|
47:28.200 --> 47:31.200 |
|
in terms of the cities you can expand into and so on. |
|
|
|
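(A toy sketch of the operating-domain gating described here: a city is deployable only when every condition it presents falls inside the supported set, and each lifted restriction opens more cities. All cities and conditions are made up.)

```python
# Toy illustration of operating-domain gating. A city is deployable only
# if all conditions it presents are within the supported set; lifting a
# restriction (e.g., rain) opens more cities. Everything here is made up.

SUPPORTED = {"dry", "light_rain"}   # assumed current operating domain

CITIES = {
    "Phoenix":       {"dry"},
    "San Francisco": {"dry", "light_rain"},
    "Seattle":       {"dry", "light_rain", "heavy_rain"},
    "Detroit":       {"dry", "heavy_rain", "snow"},
}

def deployable(supported):
    return [city for city, conds in CITIES.items() if conds <= supported]

print(deployable(SUPPORTED))                          # Phoenix, San Francisco
print(deployable(SUPPORTED | {"heavy_rain", "snow"})) # all four cities
```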
47:31.200 --> 47:33.200 |
|
And so the real question is, |
|
|
|
47:33.200 --> 47:35.200 |
|
I know today if we wanted to, |
|
|
|
47:35.200 --> 47:39.200 |
|
we could produce that many autonomous vehicles, |
|
|
|
47:39.200 --> 47:41.200 |
|
but we wouldn't be able to make use of all of them yet |
|
|
|
47:41.200 --> 47:44.200 |
|
because we would sort of saturate the demand in the cities |
|
|
|
47:44.200 --> 47:47.200 |
|
in which we would want to operate initially. |
|
|
|
47:47.200 --> 47:49.200 |
|
So if I were to guess what the timeline is |
|
|
|
47:49.200 --> 47:51.200 |
|
for those things falling away |
|
|
|
47:51.200 --> 47:54.200 |
|
and reaching hundreds of thousands of vehicles. |
|
|
|
47:54.200 --> 47:55.200 |
|
Maybe a range is better. |
|
|
|
47:55.200 --> 47:57.200 |
|
I would say less than five years. |
|
|
|
47:57.200 --> 47:58.200 |
|
Less than five years. |
|
|
|
47:58.200 --> 47:59.200 |
|
Yeah. |
|
|
|
47:59.200 --> 48:02.200 |
|
And of course you're working hard to make that happen. |
|
|
|
48:02.200 --> 48:05.200 |
|
So you started two companies that were eventually acquired |
|
|
|
48:05.200 --> 48:08.200 |
|
for a billion dollars each. |
|
|
|
48:08.200 --> 48:10.200 |
|
So you're a pretty good person to ask, |
|
|
|
48:10.200 --> 48:13.200 |
|
what does it take to build a successful startup? |
|
|
|
48:13.200 --> 48:18.200 |
|
I think there's sort of survivor bias here a little bit, |
|
|
|
48:18.200 --> 48:20.200 |
|
but I can try to find some common threads |
|
|
|
48:20.200 --> 48:22.200 |
|
for the things that worked for me, which is... |
|
|
|
48:24.200 --> 48:26.200 |
|
In both of these companies, |
|
|
|
48:26.200 --> 48:28.200 |
|
I was really passionate about the core technology. |
|
|
|
48:28.200 --> 48:31.200 |
|
I actually lay awake at night thinking about these problems |
|
|
|
48:31.200 --> 48:33.200 |
|
and how to solve them. |
|
|
|
48:33.200 --> 48:35.200 |
|
And I think that's helpful because when you start a business, |
|
|
|
48:35.200 --> 48:37.200 |
|
there are... |
|
|
|
48:37.200 --> 48:40.200 |
|
To this day, there are these crazy ups and downs. |
|
|
|
48:40.200 --> 48:43.200 |
|
One day you think the business is just on top of the world |
|
|
|
48:43.200 --> 48:45.200 |
|
and unstoppable and the next day you think, |
|
|
|
48:45.200 --> 48:47.200 |
|
okay, this is all going to end. |
|
|
|
48:47.200 --> 48:50.200 |
|
It's just going south and it's going to be over tomorrow. |
|
|
|
48:52.200 --> 48:55.200 |
|
And so I think having a true passion that you can fall back on |
|
|
|
48:55.200 --> 48:57.200 |
|
and knowing that you would be doing it |
|
|
|
48:57.200 --> 48:58.200 |
|
even if you weren't getting paid for it |
|
|
|
48:58.200 --> 49:00.200 |
|
helps you weather those tough times. |
|
|
|
49:00.200 --> 49:02.200 |
|
So that's one thing. |
|
|
|
49:02.200 --> 49:05.200 |
|
I think the other one is really good people. |
|
|
|
49:05.200 --> 49:07.200 |
|
So I've always been surrounded by really good cofounders |
|
|
|
49:07.200 --> 49:09.200 |
|
that are logical thinkers, |
|
|
|
49:09.200 --> 49:11.200 |
|
are always pushing their limits |
|
|
|
49:11.200 --> 49:13.200 |
|
and have very high levels of integrity. |
|
|
|
49:13.200 --> 49:15.200 |
|
So that's Dan Kan in my current company |
|
|
|
49:15.200 --> 49:17.200 |
|
and actually his brother and a couple other guys |
|
|
|
49:17.200 --> 49:19.200 |
|
for Justin.tv and Twitch. |
|
|
|
49:19.200 --> 49:23.200 |
|
And then I think the last thing is just, |
|
|
|
49:23.200 --> 49:26.200 |
|
I guess, persistence or perseverance. |
|
|
|
49:26.200 --> 49:29.200 |
|
And that can apply to sticking to |
|
|
|
49:29.200 --> 49:33.200 |
|
having conviction around the original premise of your idea |
|
|
|
49:33.200 --> 49:36.200 |
|
and sticking around to do all the unsexy work |
|
|
|
49:36.200 --> 49:38.200 |
|
to actually make it come to fruition, |
|
|
|
49:38.200 --> 49:41.200 |
|
including dealing with whatever it is |
|
|
|
49:41.200 --> 49:43.200 |
|
that you're not passionate about, |
|
|
|
49:43.200 --> 49:47.200 |
|
whether that's finance or HR or operations or those things. |
|
|
|
49:47.200 --> 49:49.200 |
|
As long as you are grinding away |
|
|
|
49:49.200 --> 49:52.200 |
|
and working towards that North Star for your business, |
|
|
|
49:52.200 --> 49:54.200 |
|
whatever it is and you don't give up |
|
|
|
49:54.200 --> 49:56.200 |
|
and you're making progress every day, |
|
|
|
49:56.200 --> 49:58.200 |
|
it seems like eventually you'll end up in a good place. |
|
|
|
49:58.200 --> 50:00.200 |
|
And the only things that can slow you down |
|
|
|
50:00.200 --> 50:01.200 |
|
are running out of money |
|
|
|
50:01.200 --> 50:03.200 |
|
or I suppose your competitor is destroying you, |
|
|
|
50:03.200 --> 50:06.200 |
|
but I think most of the time it's people giving up |
|
|
|
50:06.200 --> 50:08.200 |
|
or somehow destroying things themselves |
|
|
|
50:08.200 --> 50:10.200 |
|
rather than being beaten by their competition |
|
|
|
50:10.200 --> 50:11.200 |
|
or running out of money. |
|
|
|
50:11.200 --> 50:14.200 |
|
Yeah, if you never quit, eventually you'll arrive. |
|
|
|
50:14.200 --> 50:16.200 |
|
It's a much more concise version |
|
|
|
50:16.200 --> 50:18.200 |
|
of what I was trying to say. |
|
|
|
50:18.200 --> 50:21.200 |
|
So you went the Y Combinator route twice. |
|
|
|
50:21.200 --> 50:23.200 |
|
What do you think, as a quick question, |
|
|
|
50:23.200 --> 50:25.200 |
|
do you think is the best way to raise funds |
|
|
|
50:25.200 --> 50:27.200 |
|
in the early days? |
|
|
|
50:27.200 --> 50:30.200 |
|
Or not just funds, but just community, |
|
|
|
50:30.200 --> 50:32.200 |
|
develop your idea and so on. |
|
|
|
50:32.200 --> 50:37.200 |
|
Can you do it solo or maybe with a cofounder |
|
|
|
50:37.200 --> 50:39.200 |
|
like self funded? |
|
|
|
50:39.200 --> 50:40.200 |
|
Do you think Y Combinator is good? |
|
|
|
50:40.200 --> 50:41.200 |
|
Is it good to do the VC route? |
|
|
|
50:41.200 --> 50:43.200 |
|
Is there no right answer or is there, |
|
|
|
50:43.200 --> 50:45.200 |
|
from the Y Combinator experience, |
|
|
|
50:45.200 --> 50:47.200 |
|
something that you could take away |
|
|
|
50:47.200 --> 50:49.200 |
|
that that was the right path to take? |
|
|
|
50:49.200 --> 50:50.200 |
|
There's no one size fits all answer, |
|
|
|
50:50.200 --> 50:54.200 |
|
but if your ambition I think is to see how big |
|
|
|
50:54.200 --> 50:57.200 |
|
you can make something or rapidly expand |
|
|
|
50:57.200 --> 50:59.200 |
|
and capture a market or solve a problem |
|
|
|
50:59.200 --> 51:02.200 |
|
or whatever it is, then going the venture |
|
|
|
51:02.200 --> 51:04.200 |
|
backed route is probably a good approach |
|
|
|
51:04.200 --> 51:07.200 |
|
so that capital doesn't become your primary constraint. |
|
|
|
51:07.200 --> 51:10.200 |
|
Y Combinator, I love because it puts you |
|
|
|
51:10.200 --> 51:13.200 |
|
in this sort of competitive environment |
|
|
|
51:13.200 --> 51:16.200 |
|
where you're surrounded by the top, |
|
|
|
51:16.200 --> 51:19.200 |
|
maybe 1% of other really highly motivated |
|
|
|
51:19.200 --> 51:22.200 |
|
peers who are in the same place. |
|
|
|
51:22.200 --> 51:26.200 |
|
That environment, I think, just breeds success. |
|
|
|
51:26.200 --> 51:28.200 |
|
If you're surrounded by really brilliant |
|
|
|
51:28.200 --> 51:30.200 |
|
hardworking people, you're going to feel |
|
|
|
51:30.200 --> 51:32.200 |
|
sort of compelled or inspired to try |
|
|
|
51:32.200 --> 51:35.200 |
|
to emulate them or beat them. |
|
|
|
51:35.200 --> 51:37.200 |
|
So even though I had done it once before |
|
|
|
51:37.200 --> 51:41.200 |
|
and I felt like I'm pretty self motivated, |
|
|
|
51:41.200 --> 51:43.200 |
|
I thought this is going to be a hard problem, |
|
|
|
51:43.200 --> 51:45.200 |
|
I can use all the help I can get. |
|
|
|
51:45.200 --> 51:46.200 |
|
So if surrounding myself with other entrepreneurs |
|
|
|
51:46.200 --> 51:48.200 |
|
is going to make me work a little bit harder |
|
|
|
51:48.200 --> 51:51.200 |
|
or push a little harder, then it's worth it. |
|
|
|
51:51.200 --> 51:54.200 |
|
That's why I did it, for example, the second time. |
|
|
|
51:54.200 --> 51:57.200 |
|
Let's go full soft, go existential. |
|
|
|
51:57.200 --> 52:00.200 |
|
If you go back and do something differently in your life, |
|
|
|
52:00.200 --> 52:06.200 |
|
starting in high school and MIT, leaving MIT, |
|
|
|
52:06.200 --> 52:08.200 |
|
you could have gone to the PhD route, |
|
|
|
52:08.200 --> 52:13.200 |
|
doing a startup, going to see about a startup in California, |
|
|
|
52:13.200 --> 52:15.200 |
|
or maybe some aspects of fundraising. |
|
|
|
52:15.200 --> 52:17.200 |
|
Is there something you regret, |
|
|
|
52:17.200 --> 52:20.200 |
|
not necessarily regret, but if you go back, |
|
|
|
52:20.200 --> 52:22.200 |
|
you could do differently? |
|
|
|
52:22.200 --> 52:24.200 |
|
I think I've made a lot of mistakes, |
|
|
|
52:24.200 --> 52:26.200 |
|
pretty much everything you can screw up, |
|
|
|
52:26.200 --> 52:28.200 |
|
I think I've screwed up at least once. |
|
|
|
52:28.200 --> 52:30.200 |
|
But I don't regret those things. |
|
|
|
52:30.200 --> 52:32.200 |
|
I think it's hard to look back on things, |
|
|
|
52:32.200 --> 52:34.200 |
|
even if they didn't go well and call it a regret, |
|
|
|
52:34.200 --> 52:37.200 |
|
because hopefully you took away some new knowledge |
|
|
|
52:37.200 --> 52:39.200 |
|
or learning from that. |
|
|
|
52:42.200 --> 52:45.200 |
|
I would say there's a period, |
|
|
|
52:45.200 --> 52:47.200 |
|
the closest I can come to this, |
|
|
|
52:47.200 --> 52:49.200 |
|
there's a period in Justin.tv, |
|
|
|
52:49.200 --> 52:54.200 |
|
I think after seven years where the company was going |
|
|
|
52:54.200 --> 52:57.200 |
|
one direction, which is towards Twitch and video gaming. |
|
|
|
52:57.200 --> 52:58.200 |
|
I'm not a video gamer. |
|
|
|
52:58.200 --> 53:01.200 |
|
I don't really even use Twitch at all. |
|
|
|
53:01.200 --> 53:04.200 |
|
I was still working on the core technology there, |
|
|
|
53:04.200 --> 53:06.200 |
|
but my heart was no longer in it, |
|
|
|
53:06.200 --> 53:08.200 |
|
because the business that we were creating |
|
|
|
53:08.200 --> 53:10.200 |
|
was not something that I was personally passionate about. |
|
|
|
53:10.200 --> 53:12.200 |
|
It didn't meet your bar of existential impact. |
|
|
|
53:12.200 --> 53:16.200 |
|
Yeah, and I'd say I probably spent an extra year or two |
|
|
|
53:16.200 --> 53:20.200 |
|
working on that, and I'd say I would have just tried |
|
|
|
53:20.200 --> 53:22.200 |
|
to do something different sooner. |
|
|
|
53:22.200 --> 53:26.200 |
|
Because those were two years where I felt like, |
|
|
|
53:26.200 --> 53:29.200 |
|
from this philosophical or existential thing, |
|
|
|
53:29.200 --> 53:31.200 |
|
I just felt that something was missing. |
|
|
|
53:31.200 --> 53:34.200 |
|
If I could look back now and tell myself, |
|
|
|
53:34.200 --> 53:35.200 |
|
I would have said exactly that. |
|
|
|
53:35.200 --> 53:38.200 |
|
You're not getting any meaning out of your work personally |
|
|
|
53:38.200 --> 53:39.200 |
|
right now. |
|
|
|
53:39.200 --> 53:41.200 |
|
You should find a way to change that. |
|
|
|
53:41.200 --> 53:44.200 |
|
And that's part of the pitch I used |
|
|
|
53:44.200 --> 53:46.200 |
|
to basically everyone who joins Cruise today. |
|
|
|
53:46.200 --> 53:48.200 |
|
It's like, hey, you've got that now by coming here. |
|
|
|
53:48.200 --> 53:51.200 |
|
Well, maybe you needed the two years of that existential dread |
|
|
|
53:51.200 --> 53:53.200 |
|
to develop the feeling that ultimately |
|
|
|
53:53.200 --> 53:55.200 |
|
it was the fire that created Cruise. |
|
|
|
53:55.200 --> 53:56.200 |
|
So you never know. |
|
|
|
53:56.200 --> 53:57.200 |
|
You can't repair. |
|
|
|
53:57.200 --> 53:58.200 |
|
Good theory, yeah. |
|
|
|
53:58.200 --> 53:59.200 |
|
So last question. |
|
|
|
53:59.200 --> 54:02.200 |
|
What does 2019 hold for Cruise? |
|
|
|
54:02.200 --> 54:05.200 |
|
After this, I guess we're going to go and talk to your class. |
|
|
|
54:05.200 --> 54:08.200 |
|
But one of the big things is going from prototype to production |
|
|
|
54:08.200 --> 54:09.200 |
|
for autonomous cars. |
|
|
|
54:09.200 --> 54:10.200 |
|
And what does that mean? |
|
|
|
54:10.200 --> 54:11.200 |
|
What does that look like? |
|
|
|
54:11.200 --> 54:14.200 |
|
2019 for us is the year that we try to cross over |
|
|
|
54:14.200 --> 54:17.200 |
|
that threshold and reach superhuman level of performance |
|
|
|
54:17.200 --> 54:20.200 |
|
to some degree with the software and have all the other |
|
|
|
54:20.200 --> 54:23.200 |
|
thousands of little building blocks in place |
|
|
|
54:23.200 --> 54:27.200 |
|
to launch our first commercial product. |
|
|
|
54:27.200 --> 54:30.200 |
|
So that's what's in store for us. |
|
|
|
54:30.200 --> 54:32.200 |
|
And we've got a lot of work to do. |
|
|
|
54:32.200 --> 54:35.200 |
|
We've got a lot of brilliant people working on it. |
|
|
|
54:35.200 --> 54:37.200 |
|
So it's all up to us now. |
|
|
|
54:37.200 --> 54:38.200 |
|
Yeah. |
|
|
|
54:38.200 --> 54:41.200 |
|
So Charlie Miller and Chris Valasek are like the people I've |
|
|
|
54:41.200 --> 54:42.200 |
|
crossed paths with. |
|
|
|
54:42.200 --> 54:43.200 |
|
Oh, great, yeah. |
|
|
|
54:43.200 --> 54:46.200 |
|
It sounds like you have an amazing team. |
|
|
|
54:46.200 --> 54:49.200 |
|
So like I said, it's one of the most, I think, one of the most |
|
|
|
54:49.200 --> 54:52.200 |
|
important problems in artificial intelligence of this century. |
|
|
|
54:52.200 --> 54:53.200 |
|
It'll be one of the most defining. |
|
|
|
54:53.200 --> 54:55.200 |
|
It's super exciting that you work on it. |
|
|
|
54:55.200 --> 54:59.200 |
|
And the best of luck in 2019. |
|
|
|
54:59.200 --> 55:01.200 |
|
I'm really excited to see what Cruise comes up with. |
|
|
|
55:01.200 --> 55:02.200 |
|
Thank you. |
|
|
|
55:02.200 --> 55:03.200 |
|
Thanks for having me today. |
|
|
|
55:03.200 --> 55:08.200 |
|
Thank you. |
|
|
|
|