WEBVTT
00:00.000 --> 00:02.520
The following is a conversation with George Hotz.
00:02.520 --> 00:04.480
He's the founder of Comma AI,
00:04.480 --> 00:07.400
a machine learning based vehicle automation company.
00:07.400 --> 00:10.200
He is most certainly an outspoken personality
00:10.200 --> 00:13.160
in the field of AI and technology in general.
00:13.160 --> 00:16.240
He first gained recognition for being the first person
00:16.240 --> 00:18.400
to carrier unlock an iPhone.
00:18.400 --> 00:21.280
And since then, he's done quite a few interesting things
00:21.280 --> 00:24.400
at the intersection of hardware and software.
00:24.400 --> 00:27.440
This is the artificial intelligence podcast.
00:27.440 --> 00:29.560
If you enjoy it, subscribe on YouTube,
00:29.560 --> 00:32.920
give it five stars on iTunes, support it on Patreon,
00:32.920 --> 00:34.920
or simply connect with me on Twitter.
00:34.920 --> 00:39.120
Lex Fridman, spelled F R I D M A N.
00:39.120 --> 00:42.000
And I'd like to give a special thank you to Jennifer
00:42.000 --> 00:45.880
from Canada for her support of the podcast on Patreon.
00:45.880 --> 00:47.720
Merci beaucoup, Jennifer.
00:47.720 --> 00:50.600
She's been a friend and an engineering colleague
00:50.600 --> 00:52.800
for many years since I was in grad school.
00:52.800 --> 00:55.520
Your support means a lot and inspires me
00:55.520 --> 00:57.920
to keep this series going.
00:57.920 --> 01:01.600
And now here's my conversation with George Hotz.
01:02.720 --> 01:04.720
Do you think we're living in a simulation?
01:06.480 --> 01:10.080
Yes, but it may be unfalsifiable.
01:10.080 --> 01:12.440
What do you mean by unfalsifiable?
01:12.440 --> 01:16.840
So if the simulation is designed in such a way
01:16.840 --> 01:19.640
that they did like a formal proof
01:19.640 --> 01:22.320
to show that no information can get in and out.
01:22.320 --> 01:25.200
And if their hardware is designed for anything
01:25.200 --> 01:27.880
in the simulation to always keep the hardware in spec,
01:27.880 --> 01:29.480
it may be impossible to prove
01:29.480 --> 01:31.280
whether we're in a simulation or not.
01:32.600 --> 01:35.680
So they've designed it such that it's a closed system,
01:35.680 --> 01:37.200
you can't get outside the system.
01:37.200 --> 01:38.760
Well, maybe it's one of three worlds.
01:38.760 --> 01:41.400
We're either in a simulation which can be exploited,
01:41.400 --> 01:44.200
we're in a simulation which not only can't be exploited,
01:44.200 --> 01:46.440
but like the same thing's true about VMs.
01:46.440 --> 01:48.160
A really well designed VM,
01:48.160 --> 01:50.520
you can't even detect if you're in a VM or not.
01:51.400 --> 01:52.520
That's brilliant.
01:52.520 --> 01:55.160
So we're, yeah, so the simulation is running
01:55.160 --> 01:56.800
on a virtual machine.
01:56.800 --> 01:59.440
But now in reality, all VMs have ways to detect.
01:59.440 --> 02:00.280
That's the point.
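NOTE
A minimal sketch of one such detection, assuming Linux: when a hypervisor is present, CPUID advertises it and the kernel surfaces a "hypervisor" flag in /proc/cpuinfo. This is just one heuristic of many, and a carefully designed VM can hide it.

  # Check the kernel-reported CPU flags for the hypervisor bit (Linux only).
  def looks_like_vm() -> bool:
      try:
          with open("/proc/cpuinfo") as f:
              return any("hypervisor" in line
                         for line in f if line.startswith("flags"))
      except FileNotFoundError:  # not Linux: this signal is unavailable
          return False

  print("hypervisor flag present" if looks_like_vm() else "no hypervisor flag")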
02:00.280 --> 02:04.840
I mean, is it, you've done quite a bit of hacking yourself.
02:04.840 --> 02:08.640
So you should know that really any complicated system
02:08.640 --> 02:11.000
will have ways in and out.
02:11.000 --> 02:14.240
So this isn't necessarily true going forward.
02:15.280 --> 02:18.080
I spent my time away from comma,
02:18.080 --> 02:21.240
I learned Coq, it's a dependently typed,
02:21.240 --> 02:24.360
like it's a language for writing math proofs.
02:24.360 --> 02:28.200
And if you write code that compiles in a language like that,
02:28.200 --> 02:30.840
it is correct by definition.
02:30.840 --> 02:33.560
The types check its correctness.
02:33.560 --> 02:35.000
So it's possible that the simulation
02:35.000 --> 02:39.640
is written in a language like this, in which case, yeah.
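NOTE
The flavor of "the types check its correctness", shown here in Lean 4 rather than Coq (the idea is the same): the theorem statement is a type, and the file only compiles if a term of that type, i.e. a proof, exists.

  -- `a + b = b + a` is a type; `Nat.add_comm a b` is a term inhabiting it.
  -- If this compiles, the proposition is proved: correct by construction.
  theorem addComm (a b : Nat) : a + b = b + a := Nat.add_comm a b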
02:39.640 --> 02:42.680
Yeah, but can that be sufficiently expressive,
02:42.680 --> 02:43.760
a language like that?
02:43.760 --> 02:44.600
Oh, it can.
02:44.600 --> 02:45.440
It can be?
02:45.440 --> 02:46.280
Oh, yeah.
02:46.280 --> 02:48.920
Okay, well, so, all right, so.
02:48.920 --> 02:50.640
The simulation doesn't have to be Turing complete
02:50.640 --> 02:52.320
if it has a scheduled end date.
02:52.320 --> 02:54.600
Looks like it does actually with entropy.
02:54.600 --> 02:58.520
I mean, I don't think that a simulation
02:58.520 --> 03:02.200
that results in something as complicated as the universe
03:03.080 --> 03:07.280
would have a formal proof of correctness, right?
03:08.240 --> 03:09.880
It's possible, of course.
03:09.880 --> 03:12.720
We have no idea how good their tooling is.
03:12.720 --> 03:14.640
And we have no idea how complicated
03:14.640 --> 03:16.280
the universe computer really is.
03:16.280 --> 03:17.920
It may be quite simple.
03:17.920 --> 03:19.680
It's just very large, right?
03:19.680 --> 03:22.160
It's very, it's definitely very large.
03:22.160 --> 03:24.480
But the fundamental rules might be super simple.
03:24.480 --> 03:26.240
Yeah, Conway's Game of Life kind of stuff.
03:26.240 --> 03:30.320
Right, so if you could hack,
03:30.320 --> 03:32.400
so imagine the simulation that is hackable,
03:32.400 --> 03:33.640
if you could hack it,
03:35.040 --> 03:37.960
what would you change about the universe?
03:37.960 --> 03:40.560
Like how would you approach hacking a simulation?
03:41.640 --> 03:44.360
The reason I gave that talk?
03:44.360 --> 03:46.680
By the way, I'm not familiar with the talk you gave.
03:46.680 --> 03:50.160
I just read that you talked about escaping the simulation
03:50.160 --> 03:51.280
or something like that.
03:51.280 --> 03:52.640
So maybe you can tell me a little bit
03:52.640 --> 03:55.360
about the theme and the message there too.
03:55.360 --> 03:57.680
It wasn't a very practical talk
03:57.680 --> 04:00.600
about how to actually escape a simulation.
04:00.600 --> 04:03.320
It was more about a way of restructuring
04:03.320 --> 04:05.120
an us versus them narrative.
04:05.120 --> 04:10.120
If we continue on the path we're going with technology,
04:12.360 --> 04:14.160
I think we're in big trouble,
04:14.160 --> 04:16.760
like as a species and not just as a species,
04:16.760 --> 04:19.480
but even as me as an individual member of the species.
04:19.480 --> 04:23.680
So if we could change rhetoric to be more like,
04:23.680 --> 04:24.920
to think upwards,
04:26.240 --> 04:29.080
like to think about that we're in a simulation
04:29.080 --> 04:30.360
and how we could get out,
04:30.360 --> 04:32.640
already we'd be on the right path.
04:32.640 --> 04:34.800
What you actually do once you do that,
04:34.800 --> 04:37.360
well, I assume I would have acquired way more intelligence
04:37.360 --> 04:39.760
in the process of doing that, so I'll just ask that.
04:39.760 --> 04:42.080
So the thinking upwards,
04:42.080 --> 04:43.760
what kind of ideas,
04:43.760 --> 04:45.640
what kind of breakthrough ideas do you think thinking
04:45.640 --> 04:47.280
in that way could inspire?
04:47.280 --> 04:49.800
And why did you say upwards?
04:49.800 --> 04:50.640
Upwards.
04:50.640 --> 04:51.480
Into space?
04:51.480 --> 04:54.120
Are you thinking sort of exploration in all forms?
04:54.120 --> 04:59.120
The space narrative that held for the modernist generation
04:59.880 --> 05:02.600
doesn't hold as well for the postmodern generation.
05:04.560 --> 05:05.480
What's the space narrative?
05:05.480 --> 05:06.520
Are we talking about the same space?
05:06.520 --> 05:07.360
The three dimensional space?
05:07.360 --> 05:08.840
No, no, space, like going up to space,
05:08.840 --> 05:10.040
like building like Elon Musk,
05:10.040 --> 05:11.160
like we're going to build rockets,
05:11.160 --> 05:12.080
we're going to go to Mars,
05:12.080 --> 05:13.560
we're going to colonize the universe.
05:13.560 --> 05:14.720
And the narrative you're referring to,
05:14.720 --> 05:16.040
I was born in the Soviet Union,
05:16.040 --> 05:18.000
you're referring to the race to space?
05:18.000 --> 05:18.840
The race to space, yeah.
05:18.840 --> 05:19.680
Yes, explore, okay.
05:19.680 --> 05:21.760
That was a great modernist narrative.
05:21.760 --> 05:23.360
Yeah.
05:23.360 --> 05:26.720
It doesn't seem to hold the same weight in today's culture.
05:27.640 --> 05:32.160
I'm hoping for good postmodern narratives that replace it.
05:32.160 --> 05:35.560
So let's think, so you work a lot with AI.
05:35.560 --> 05:39.080
So AI is one formulation of that narrative.
05:39.080 --> 05:40.080
There could be also,
05:40.080 --> 05:42.320
I don't know how much you do in VR and AR.
05:42.320 --> 05:43.160
Yeah.
05:43.160 --> 05:45.160
That's another, I know less about it,
05:45.160 --> 05:47.600
but every time I play with it in our research,
05:47.600 --> 05:49.640
it's fascinating, that virtual world.
05:49.640 --> 05:51.840
Are you interested in the virtual world?
05:51.840 --> 05:54.200
I would like to move to virtual reality.
05:55.360 --> 05:56.440
In terms of your work?
05:56.440 --> 05:58.760
No, I would like to physically move there.
05:58.760 --> 06:00.240
The apartment I can rent in the cloud
06:00.240 --> 06:03.240
is way better than the apartment I can rent in the real world.
06:03.240 --> 06:04.760
Well, it's all relative, isn't it?
06:04.760 --> 06:07.280
Because others will have very nice apartments too,
06:07.280 --> 06:09.200
so you'll be inferior in the virtual world as well.
06:09.200 --> 06:11.320
But that's not how I view the world, right?
06:11.320 --> 06:12.440
I don't view the world.
06:12.440 --> 06:15.640
I mean, that's a very, like, almost zero-sum-ish way
06:15.640 --> 06:16.480
to view the world.
06:16.480 --> 06:18.800
Say like, my great apartment isn't great
06:18.800 --> 06:20.400
because my neighbor has one too.
06:20.400 --> 06:21.640
No, my great apartment is great
06:21.640 --> 06:24.320
because like, look at this dishwasher, man.
06:24.320 --> 06:26.640
You just touch the dish and it's washed, right?
06:26.640 --> 06:28.680
And that is great in and of itself
06:28.680 --> 06:30.120
if I had the only apartment
06:30.120 --> 06:31.520
or if everybody had the apartment.
06:31.520 --> 06:32.400
I don't care.
06:32.400 --> 06:34.760
So you have fundamental gratitude.
06:34.760 --> 06:39.080
The world first learned of geohot, George Hotz,
06:39.080 --> 06:42.280
in August 2007, maybe before then,
06:42.280 --> 06:44.080
but certainly in August 2007
06:44.080 --> 06:46.760
when you were the first person to unlock,
06:46.760 --> 06:48.880
carrier unlock an iPhone.
06:48.880 --> 06:50.520
How did you get into hacking?
06:50.520 --> 06:53.080
What was the first system you discovered
06:53.080 --> 06:55.040
vulnerabilities for and broke into?
06:56.240 --> 07:01.240
So that was really kind of the first thing.
07:01.640 --> 07:06.640
I had a book in 2006 called Gray Hat Hacking.
07:07.480 --> 07:11.000
And I guess I realized that
07:11.000 --> 07:13.480
if you acquired these sort of powers
07:13.480 --> 07:15.280
you could control the world.
07:16.160 --> 07:18.920
But I didn't really know that much
07:18.920 --> 07:20.560
about computers back then.
07:20.560 --> 07:22.120
I started with electronics.
07:22.120 --> 07:24.200
The first iPhone hack was physical.
07:24.200 --> 07:25.040
Hardware.
07:25.040 --> 07:28.160
You had to open it up and pull an address line high.
07:28.160 --> 07:29.960
And it was because I didn't really know
07:29.960 --> 07:31.320
about software exploitation.
07:31.320 --> 07:32.960
I learned that all in the next few years
07:32.960 --> 07:33.920
and I got very good at it.
07:33.920 --> 07:36.560
But back then I knew about like
07:36.560 --> 07:38.920
how memory chips are connected to processors and stuff.
07:38.920 --> 07:41.040
But did you know about software and programming?
07:41.040 --> 07:43.200
I didn't, no.
07:43.200 --> 07:46.160
Oh really, so your view of the world
07:46.160 --> 07:49.320
and computers was physical, was hardware.
07:49.320 --> 07:52.400
Actually, if you read the code that I released with that
07:52.400 --> 07:55.760
in August 2007, it's atrocious.
07:55.760 --> 07:56.760
What language was it?
07:56.760 --> 07:57.600
C.
07:57.600 --> 07:58.440
C, nice.
07:58.440 --> 08:01.480
And in a broken sort of state-machine-esque C.
08:01.480 --> 08:02.960
I didn't know how to program.
08:02.960 --> 08:04.160
Yeah.
08:04.160 --> 08:06.600
So how did you learn to program?
08:07.520 --> 08:08.440
What was your journey?
08:08.440 --> 08:10.040
I mean, we'll talk about it.
08:10.040 --> 08:12.680
You've live streamed some of your programming.
08:12.680 --> 08:14.400
This chaotic, beautiful mess.
08:14.400 --> 08:16.480
How did you arrive at that?
08:16.480 --> 08:18.640
Years and years of practice.
08:18.640 --> 08:22.240
I interned at Google after,
08:22.240 --> 08:24.800
the summer after the iPhone unlock.
08:24.800 --> 08:26.720
And I did a contract for them
08:26.720 --> 08:29.040
where I built hardware for Street View
08:29.040 --> 08:31.760
and I wrote a software library to interact with it.
08:32.680 --> 08:34.920
And it was terrible code.
08:34.920 --> 08:36.560
And for the first time I got feedback
08:36.560 --> 08:38.760
from people who I respected saying,
08:38.760 --> 08:41.160
no, like, don't write code like this.
08:42.680 --> 08:45.680
Now, of course, just getting that feedback is not enough.
08:45.680 --> 08:50.680
The way that I really got good was,
08:51.000 --> 08:54.800
I wanted to write this thing that could emulate
08:54.800 --> 08:58.440
and then visualize like ARM binaries
08:58.440 --> 09:00.040
because I wanted to hack the iPhone better.
09:00.040 --> 09:01.960
And I didn't like that I couldn't see what the,
09:01.960 --> 09:03.800
I couldn't single step through the processor
09:03.800 --> 09:05.200
because I had no debugger on there,
09:05.200 --> 09:06.640
especially for the low level things like the boot ROM
09:06.640 --> 09:07.480
and the boot loader.
09:07.480 --> 09:09.440
So I tried to build this tool to do it.
09:10.920 --> 09:13.440
And I built the tool once and it was terrible.
09:13.440 --> 09:15.120
I built the tool a second time, it was terrible.
09:15.120 --> 09:16.320
I built the tool a third time.
09:16.320 --> 09:18.600
By this time I was at Facebook, and it was kind of okay.
09:18.600 --> 09:20.560
And then I built the tool fourth time
09:20.560 --> 09:22.560
when I was a Google intern again in 2014.
09:22.560 --> 09:24.320
And that was the first time I was like,
09:24.320 --> 09:25.880
this is finally usable.
09:25.880 --> 09:27.120
How do you pronounce this, QIRA?
09:27.120 --> 09:28.360
QIRA, yeah.
09:28.360 --> 09:31.840
So it's essentially the most efficient way
09:31.840 --> 09:35.720
to visualize the change of state of the computer
09:35.720 --> 09:37.200
as the program is running.
09:37.200 --> 09:38.920
That's what you mean by debugger.
09:38.920 --> 09:41.760
Yeah, it's a timeless debugger.
09:41.760 --> 09:45.080
So you can rewind just as easily as going forward.
09:45.080 --> 09:46.280
Think about, if you're using GDB,
09:46.280 --> 09:47.880
you have to put a watch on a variable.
09:47.880 --> 09:49.680
If you want to see if that variable changes.
09:49.680 --> 09:51.480
In QIRA, you can just click on that variable.
09:51.480 --> 09:53.880
And then it shows every single time
09:53.880 --> 09:56.520
when that variable was changed or accessed.
09:56.520 --> 09:59.760
Think about it like git for your computer's, the run log.
09:59.760 --> 10:04.760
So there's like a deep log of the state of the computer
10:05.640 --> 10:07.840
as the program runs and you can rewind.
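NOTE
A toy illustration of the idea, not QIRA's actual implementation: log every state change while the program runs, then query a variable's history after the fact instead of setting a watchpoint up front.

  history = []  # the "run log": (step, variable, value) tuples

  def traced_run():
      x = 0
      for step in range(5):
          x = x * 2 + step
          history.append((step, "x", x))  # record every write as it happens

  traced_run()
  # "Click on the variable": recover every change to x, no watchpoint needed.
  print([(step, value) for step, name, value in history if name == "x"])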
10:07.840 --> 10:11.480
Why isn't that, maybe it is, maybe you can educate me.
10:11.480 --> 10:14.640
Why isn't that kind of debugging used more often?
10:14.640 --> 10:16.320
Because the tooling's bad.
10:16.320 --> 10:17.160
Well, two things.
10:17.160 --> 10:19.360
One, if you're trying to debug Chrome,
10:19.360 --> 10:22.920
Chrome is a 200 megabyte binary
10:22.920 --> 10:25.440
that runs slowly on desktops.
10:25.440 --> 10:27.760
So that's gonna be really hard to use for that.
10:27.760 --> 10:30.160
But it's really good to use for like CTFs
10:30.160 --> 10:33.200
and for boot ROMs and for small parts of code.
10:33.200 --> 10:36.360
So it's hard if you're trying to debug like massive systems.
10:36.360 --> 10:38.200
What's a CTF and what's a boot ROM?
10:38.200 --> 10:40.480
A boot ROM is the first code that executes
10:40.480 --> 10:42.280
the minute you give power to your iPhone.
10:42.280 --> 10:43.520
Okay.
10:43.520 --> 10:46.040
And CTFs were these competitions that I played.
10:46.040 --> 10:46.880
Capture the flag.
10:46.880 --> 10:47.720
Capture the flag.
10:47.720 --> 10:48.560
I was gonna ask you about that.
10:48.560 --> 10:49.920
What are those? Those look,
10:49.920 --> 10:51.440
I watched a couple of videos on YouTube.
10:51.440 --> 10:52.920
Those look fascinating.
10:52.920 --> 10:55.560
What have you learned about maybe at the high level
10:55.560 --> 10:58.040
in the vulnerability of systems from these competitions?
11:00.840 --> 11:04.200
I feel like in the heyday of CTFs,
11:04.200 --> 11:08.160
you had all of the best security people in the world
11:08.160 --> 11:10.720
challenging each other and coming up
11:10.720 --> 11:13.640
with new toy exploitable things over here.
11:13.640 --> 11:15.400
And then everybody, okay, who can break it?
11:15.400 --> 11:17.160
And when you break it, you get like,
11:17.160 --> 11:19.360
there's like a file in the server called flag.
11:19.360 --> 11:20.960
And then there's a program running,
11:20.960 --> 11:22.680
listening on a socket that's vulnerable.
11:22.680 --> 11:25.000
So you write an exploit, you get a shell,
11:25.000 --> 11:27.160
and then you cat flag, and then you type the flag
11:27.160 --> 11:29.480
into like a web based scoreboard and you get points.
11:29.480 --> 11:33.000
So the goal is essentially to find an exploit in the system
11:33.000 --> 11:35.280
that allows you to run shell,
11:35.280 --> 11:38.040
to run arbitrary code on that system.
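NOTE
The loop described above, sketched with the pwntools library; the host, port, and payload are made-up placeholders, and in a real challenge crafting the payload is the whole game.

  from pwn import remote

  r = remote("challenge.example.ctf", 31337)  # the vulnerable listener
  payload = b"A" * 64 + b"\xef\xbe\xad\xde"   # stand-in exploit bytes
  r.sendline(payload)                         # trigger the bug, get a shell
  r.sendline(b"cat flag")                     # read the flag file
  print(r.recvline())                         # submit this to the scoreboard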
11:38.040 --> 11:40.200
That's one of the categories.
11:40.200 --> 11:41.960
That's like the pwnable category.
11:43.560 --> 11:44.400
Pwnable?
11:44.400 --> 11:45.240
Yeah, pwnable.
11:45.240 --> 11:47.600
It's like, you know, you pwn the program.
11:47.600 --> 11:48.440
It's a program.
11:48.440 --> 11:51.760
Oh, yeah.
11:51.760 --> 11:55.360
You know, first of all, I apologize, I'm gonna say,
11:55.360 --> 11:56.280
it's because I'm Russian,
11:56.280 --> 11:59.120
but maybe you can help educate me.
12:00.120 --> 12:01.680
Some video game like misspelled
12:01.680 --> 12:02.840
'owned' way back in the day.
12:02.840 --> 12:04.880
Yeah, and it's just,
12:04.880 --> 12:06.280
I wonder if there's a definition
12:06.280 --> 12:08.000
and I'll have to go to Urban Dictionary for it.
12:08.000 --> 12:09.800
Yeah, it'd be interesting to see what it says.
12:09.800 --> 12:12.760
Okay, so what was the heyday of CTF, by the way,
12:12.760 --> 12:15.480
but was it, what decade are we talking about?
12:15.480 --> 12:18.400
I think like, I mean, maybe I'm biased
12:18.400 --> 12:21.120
because it's the era that I played,
12:21.120 --> 12:25.800
but like 2011 to 2015,
12:27.200 --> 12:30.320
because the modern CTF scene
12:30.320 --> 12:32.640
is similar to the modern competitive programming scene.
12:32.640 --> 12:34.280
You have people who like do drills.
12:34.280 --> 12:35.880
You have people who practice.
12:35.880 --> 12:37.040
And then once you've done that,
12:37.040 --> 12:40.040
you've turned it less into a game of generic computer skill
12:40.040 --> 12:42.440
and more into a game of, okay, you memorize,
12:42.440 --> 12:44.620
you drill on these five categories.
12:45.760 --> 12:48.920
And then before that, it wasn't,
12:48.920 --> 12:51.560
it didn't have like as much attention as it had.
12:52.800 --> 12:53.640
I don't know, they were like,
12:53.640 --> 12:55.200
I won $30,000 once in Korea
12:55.200 --> 12:56.120
for one of these competitions.
12:56.120 --> 12:56.960
Holy crap.
12:56.960 --> 12:57.920
Yeah, they were, they were, that was...
12:57.920 --> 12:59.520
So that means, I mean, money is money,
12:59.520 --> 13:02.320
but that means there was probably good people there.
13:02.320 --> 13:03.600
Exactly, yeah.
13:03.600 --> 13:06.800
Are the challenges human constructed
13:06.800 --> 13:10.760
or are they grounded in some real flaws in real systems?
13:10.760 --> 13:13.080
Usually they're human constructed,
13:13.080 --> 13:15.760
but they're usually inspired by real flaws.
13:15.760 --> 13:17.320
What kind of systems, I imagine,
13:17.320 --> 13:19.080
is it really focused on mobile?
13:19.080 --> 13:20.920
Like what has vulnerabilities these days?
13:20.920 --> 13:25.120
Is it primarily mobile systems like Android?
13:25.120 --> 13:26.680
Oh, everything does.
13:26.680 --> 13:28.120
Yeah, of course.
13:28.120 --> 13:29.360
The price has kind of gone up
13:29.360 --> 13:31.280
because less and less people can find them.
13:31.280 --> 13:33.160
And what's happened in security is now,
13:33.160 --> 13:34.560
if you want to like jailbreak an iPhone,
13:34.560 --> 13:36.960
you don't need one exploit anymore, you need nine.
13:37.960 --> 13:39.160
Nine chained together?
13:39.160 --> 13:40.000
What would you mean?
13:40.000 --> 13:40.840
Yeah, wow.
13:40.840 --> 13:44.800
Okay, so it's really, what's the benefit?
13:44.800 --> 13:48.240
Speaking higher level philosophically about hacking.
13:48.240 --> 13:50.400
I mean, it sounds from everything I've seen about you,
13:50.400 --> 13:55.040
you just love the challenge and you don't want to do anything.
13:55.040 --> 13:58.120
You don't want to bring that exploit out into the world
13:58.120 --> 14:01.680
and do any actual, let it run wild.
14:01.680 --> 14:02.760
You just want to solve it
14:02.760 --> 14:05.400
and then you go on to the next thing.
14:05.400 --> 14:08.440
Oh yeah, I mean, doing criminal stuff's not really worth it.
14:08.440 --> 14:10.520
And I'll actually use the same argument
14:10.520 --> 14:15.440
for why I don't do defense as for why I don't do crime.
14:15.440 --> 14:16.840
If you want to defend a system,
14:16.840 --> 14:19.280
say the system has 10 holes, right?
14:19.280 --> 14:22.240
If you find nine of those holes as a defender,
14:22.240 --> 14:24.240
you still lose because the attacker gets in
14:24.240 --> 14:25.520
through the last one.
14:25.520 --> 14:26.360
If you're an attacker,
14:26.360 --> 14:28.720
you only have to find one out of the 10.
14:28.720 --> 14:30.760
But if you're a criminal,
14:30.760 --> 14:34.800
if you log on with a VPN nine out of the 10 times,
14:34.800 --> 14:37.760
but one time you forget, you're done.
14:37.760 --> 14:39.400
Because you're caught, okay.
14:39.400 --> 14:41.160
Because you only have to mess up once
14:41.160 --> 14:42.920
to be caught as a criminal.
14:42.920 --> 14:44.320
That's why I'm not a criminal.
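NOTE
The argument here is just compounding probability. Assuming, for illustration, a 1% chance of slipping per job:

  p_slip = 0.01
  for jobs in (1, 10, 100, 500):
      p_caught = 1 - (1 - p_slip) ** jobs  # at least one slip over n jobs
      print(f"{jobs:4d} jobs -> P(at least one slip) = {p_caught:.1%}")
  # 500 jobs -> 99.3%: do it long enough and you never get away with it.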
14:45.920 --> 14:47.080
But okay, let me,
14:47.080 --> 14:49.520
cause I was having a discussion with somebody
14:49.520 --> 14:52.440
just at a high level about nuclear weapons,
14:52.440 --> 14:56.240
actually, about why we haven't blown ourselves up yet.
14:56.240 --> 14:59.840
And my feeling is all the smart people in the world,
14:59.840 --> 15:04.120
if you look at the distribution of smart people,
15:04.120 --> 15:06.760
smart people are generally good.
15:06.760 --> 15:07.680
And then the other person,
15:07.680 --> 15:09.480
I was talking to Sean Carroll, the physicist,
15:09.480 --> 15:11.400
and he was saying no, good and bad people
15:11.400 --> 15:14.080
are evenly distributed amongst everybody.
15:14.080 --> 15:18.080
My sense was good hackers are in general good people
15:18.080 --> 15:20.400
and they don't want to mess with the world.
15:20.400 --> 15:21.920
What's your sense?
15:21.920 --> 15:24.720
I'm not even sure about that.
15:25.920 --> 15:30.520
Like, I have a nice life.
15:30.520 --> 15:32.120
Crime wouldn't get me anything.
15:34.320 --> 15:36.520
But if you're good and you have these skills,
15:36.520 --> 15:38.720
you probably have a nice life too, right?
15:38.720 --> 15:40.160
Right, you can use them for other things.
15:40.160 --> 15:41.120
But is there an ethical,
15:41.120 --> 15:44.200
is there a little voice in your head that says,
15:46.120 --> 15:49.040
well, yeah, if you could hack something
15:49.040 --> 15:50.720
to where you could hurt people
15:52.840 --> 15:54.960
and you could earn a lot of money doing it though,
15:54.960 --> 15:56.320
not hurt physically perhaps,
15:56.320 --> 15:59.000
but disrupt their life in some kind of way.
16:00.200 --> 16:02.360
Isn't there a little voice that says,
16:03.360 --> 16:04.560
Well, two things.
16:04.560 --> 16:06.800
One, I don't really care about money.
16:06.800 --> 16:08.680
So like the money wouldn't be an incentive.
16:08.680 --> 16:10.640
The thrill might be an incentive.
16:10.640 --> 16:14.440
But when I was 19, I read Crime and Punishment.
16:14.440 --> 16:16.120
That was another great one
16:16.120 --> 16:18.440
that talked me out of ever really doing crime.
16:19.400 --> 16:21.720
Cause it's like, that's gonna be me.
16:21.720 --> 16:25.040
I'd get away with it, but it would just run through my head.
16:25.040 --> 16:26.480
Even if I got away with it, you know?
16:26.480 --> 16:27.640
And then you do crime for long enough,
16:27.640 --> 16:28.960
you'll never get away with it.
16:28.960 --> 16:30.360
That's right, in the end.
16:30.360 --> 16:32.680
That's a good reason to be good.
16:32.680 --> 16:34.880
I wouldn't say I'm good, I would just say I'm not bad.
16:34.880 --> 16:38.080
You're a talented programmer and a hacker
16:38.080 --> 16:40.920
in a good positive sense of the word.
16:40.920 --> 16:43.360
You've played around, found vulnerabilities
16:43.360 --> 16:44.720
in various systems.
16:44.720 --> 16:46.120
What have you learned broadly
16:46.120 --> 16:49.480
about the design of systems and so on
16:49.480 --> 16:51.520
from that whole process?
16:53.280 --> 16:58.280
You learn to not take things
16:59.280 --> 17:02.160
for what people say they are,
17:02.160 --> 17:05.320
but you look at things for what they actually are.
17:07.040 --> 17:07.880
Yeah.
17:07.880 --> 17:10.080
I understand that's what you tell me it is,
17:10.080 --> 17:11.320
but what does it do?
17:12.960 --> 17:14.600
And you have nice visualization tools
17:14.600 --> 17:16.720
to really know what it's really doing.
17:16.720 --> 17:20.080
Oh, I wish. I'm a better programmer now than I was in 2014.
17:20.080 --> 17:21.880
I said QIRA was the first tool
17:21.880 --> 17:23.440
that I wrote that was usable.
17:23.440 --> 17:25.360
I wouldn't say the code was great.
17:25.360 --> 17:27.360
I still wouldn't say my code is great.
17:28.840 --> 17:30.760
So how was your evolution as a programmer?
17:30.760 --> 17:32.280
Except practice.
17:32.280 --> 17:33.880
You started with C,
17:33.880 --> 17:35.560
at what point did you pick up Python?
17:35.560 --> 17:37.080
Because you're pretty big in Python now.
17:37.080 --> 17:39.960
Now, yeah, in college,
17:39.960 --> 17:42.520
I went to Carnegie Mellon when I was 22.
17:42.520 --> 17:44.200
I went back, I'm like,
17:44.200 --> 17:46.640
I'm gonna take all your hardest CS courses
17:46.640 --> 17:47.640
and we'll see how I do, right?
17:47.640 --> 17:48.560
Like, did I miss anything
17:48.560 --> 17:51.520
by not having a real undergraduate education?
17:51.520 --> 17:54.240
Took operating systems, compilers, AI,
17:54.240 --> 17:56.880
and like a freshman weeder math course.
17:56.880 --> 18:01.880
And some of those classes you mentioned,
18:03.320 --> 18:04.240
pretty tough, actually.
18:04.240 --> 18:05.640
They're great.
18:05.640 --> 18:07.640
At least the 2012,
18:07.640 --> 18:10.240
circa 2012 operating systems and compilers
18:11.240 --> 18:14.440
were two of the best classes I've ever taken in my life.
18:14.440 --> 18:15.640
Because you write an operating system
18:15.640 --> 18:16.840
and you write a compiler.
18:18.080 --> 18:19.760
I wrote my operating system in C
18:19.760 --> 18:21.400
and I wrote my compiler in Haskell,
18:21.400 --> 18:26.400
but somehow I picked up Python that semester as well.
18:26.400 --> 18:28.080
I started using it for the CTFs, actually.
18:28.080 --> 18:30.320
That's when I really started to get into CTFs
18:30.320 --> 18:33.360
and in CTFs, you're always racing against the clock.
18:33.360 --> 18:35.120
So I can't write things in C.
18:35.120 --> 18:36.240
Oh, there's a clock component.
18:36.240 --> 18:37.840
So you really want to use the programming language
18:37.840 --> 18:38.960
just so you can be fastest.
18:38.960 --> 18:40.080
48 hours.
18:40.080 --> 18:41.440
Pwn as many of these challenges as you can.
18:41.440 --> 18:42.280
Pwn.
18:42.280 --> 18:43.120
Yeah.
18:43.120 --> 18:43.960
You get like 100 points a challenge,
18:43.960 --> 18:45.360
whatever team gets the most.
18:46.360 --> 18:50.240
You were both at Facebook and Google for a brief stint.
18:50.240 --> 18:51.080
Yeah.
18:51.080 --> 18:54.920
With Project Zero, actually, at Google for five months
18:54.920 --> 18:56.960
where you developed QIRA.
18:56.960 --> 18:59.280
What was Project Zero about in general?
19:01.760 --> 19:05.160
Just curious about the security efforts in these companies.
19:05.160 --> 19:08.840
Well, Project Zero started the same time I went there.
19:08.840 --> 19:10.080
What year was that?
19:11.080 --> 19:12.320
2015.
19:12.320 --> 19:13.160
2015.
19:13.160 --> 19:15.040
So that was right at the beginning of Project Zero.
19:15.040 --> 19:16.200
It's small.
19:16.200 --> 19:18.840
It's Google's offensive security team.
19:18.840 --> 19:23.840
I'll try to give the best public facing explanation
19:25.680 --> 19:26.520
that I can.
19:26.520 --> 19:30.960
So the idea is basically,
19:30.960 --> 19:33.240
these vulnerabilities exist in the world.
19:33.240 --> 19:35.240
Nation states have them.
19:35.240 --> 19:37.440
Some high powered bad actors have them.
19:39.840 --> 19:44.200
Sometimes people will find these vulnerabilities
19:44.200 --> 19:47.960
and submit them in bug bounties to the companies.
19:47.960 --> 19:49.440
But a lot of the companies don't even care.
19:49.440 --> 19:50.520
They don't even fix the bug.
19:50.520 --> 19:53.760
It doesn't hurt them for there to be a vulnerability.
19:53.760 --> 19:55.880
So Project Zero is like, we're going to do it different.
19:55.880 --> 19:57.840
We're going to announce a vulnerability
19:57.840 --> 19:59.640
and we're going to give them 90 days to fix it.
19:59.640 --> 20:00.800
And then whether they fix it or not,
20:00.800 --> 20:03.200
we're going to drop the Zero Day.
20:03.200 --> 20:04.080
Oh, wow.
20:04.080 --> 20:05.240
We're going to drop the weapon.
20:05.240 --> 20:06.080
That's so cool.
20:06.080 --> 20:07.480
That is so cool.
20:07.480 --> 20:09.200
I love that. Deadlines.
20:09.200 --> 20:10.040
Oh, that's so cool.
20:10.040 --> 20:10.880
Give them real deadlines.
20:10.880 --> 20:12.320
Yeah.
20:12.320 --> 20:15.800
And I think it's done a lot for moving the industry forward.
20:15.800 --> 20:20.360
I watched the coding sessions you streamed online.
20:20.360 --> 20:25.280
You code things up, the basic projects, usually from scratch.
20:25.280 --> 20:28.200
I would say, sort of as a programmer myself,
20:28.200 --> 20:30.360
just watching you, that you type really fast
20:30.360 --> 20:34.440
and your brain works in both brilliant and chaotic ways.
20:34.440 --> 20:35.800
I don't know if that's always true,
20:35.800 --> 20:37.600
but certainly for the live streams.
20:37.600 --> 20:41.320
So it's interesting to me because I'm much slower
20:41.320 --> 20:43.520
and systematic and careful.
20:43.520 --> 20:48.040
And you just move probably an order of magnitude faster.
20:48.040 --> 20:51.800
So I'm curious, is there a method to your madness?
20:51.800 --> 20:53.040
Or is it just who you are?
20:53.040 --> 20:54.720
There's pros and cons.
20:54.720 --> 20:58.080
There's pros and cons to my programming style.
20:58.080 --> 21:00.360
And I'm aware of them.
21:00.360 --> 21:04.480
If you ask me to get something up and working quickly
21:04.480 --> 21:06.800
with an API that's kind of undocumented,
21:06.800 --> 21:08.880
I will do this super fast because I will throw things
21:08.880 --> 21:10.200
at it until it works.
21:10.200 --> 21:14.720
If you ask me to take a vector and rotate it 90 degrees
21:14.720 --> 21:19.320
and then flip it over the X, Y plane,
21:19.320 --> 21:22.280
I'll spam program for two hours and won't get it.
21:22.280 --> 21:23.480
Oh, because it's something that you
21:23.480 --> 21:26.240
could do with a sheet of paper or think through design
21:26.240 --> 21:30.400
and then you really just throw stuff at the wall
21:30.400 --> 21:34.600
and you get so good at it that it usually works.
21:34.600 --> 21:36.920
I should become better at the other kind as well.
21:36.920 --> 21:39.440
Sometimes I will do things methodically.
21:39.440 --> 21:41.200
It's nowhere near as entertaining on the Twitch streams.
21:41.200 --> 21:43.520
I do exaggerate it a bit on the Twitch streams as well.
21:43.520 --> 21:45.480
The Twitch streams, I mean, what do you want to see, a gamer?
21:45.480 --> 21:46.840
You want to see actions per minute, right?
21:46.840 --> 21:48.200
I'll show you APM for programming too.
21:48.200 --> 21:50.280
Yeah, I'd recommend people go to it.
21:50.280 --> 21:53.800
I think I watched probably several hours that you put,
21:53.800 --> 21:57.480
like I've actually left you programming in the background
21:57.480 --> 22:00.400
while I was programming because you made me,
22:00.400 --> 22:03.120
it was like watching a really good gamer.
22:03.120 --> 22:06.240
It's like energizes you because you're like moving so fast
22:06.240 --> 22:08.840
and so it's awesome, it's inspiring.
22:08.840 --> 22:11.200
It made me jealous that like,
22:12.280 --> 22:14.280
because my own programming is inadequate
22:14.280 --> 22:16.960
in terms of speed, so I was like.
22:16.960 --> 22:20.520
So I'm twice as frantic on the live streams
22:20.520 --> 22:22.680
as I am when I code without it, oh.
22:22.680 --> 22:23.720
It's super entertaining.
22:23.720 --> 22:26.400
So I wasn't even paying attention to what you were coding,
22:26.400 --> 22:29.760
which is great, it's just watching you switch windows
22:29.760 --> 22:31.400
and Vim, I guess, is what you use?
22:31.400 --> 22:33.000
Yeah, it's Vim on screen.
22:33.000 --> 22:35.640
I developed a workflow at Facebook and stuck with it.
22:35.640 --> 22:37.320
How do you learn new programming tools,
22:37.320 --> 22:39.440
ideas, techniques these days?
22:39.440 --> 22:42.080
What's your like methodology for learning new things?
22:42.080 --> 22:45.920
So I wrote for comma,
22:47.200 --> 22:49.280
the distributed file systems out in the world
22:49.280 --> 22:50.720
are extremely complex.
22:50.720 --> 22:55.280
Like if you want to install something like Ceph,
22:55.280 --> 22:58.760
Ceph is I think the like open infrastructure
22:58.760 --> 23:03.040
distributed file system or there's like newer ones
23:03.040 --> 23:05.880
like SeaweedFS, but these are all like 10,000
23:05.880 --> 23:06.880
plus line projects.
23:06.880 --> 23:09.520
I think some of them are even 100,000 lines,
23:09.520 --> 23:11.120
and just configuring them is a nightmare.
23:11.120 --> 23:16.120
So I wrote, I wrote one, it's 200 lines
23:16.440 --> 23:18.880
and it uses like nginx for the volume servers
23:18.880 --> 23:21.600
and has this little master server that I wrote in Go.
23:21.600 --> 23:24.840
And the way I wrote this, if I would say
23:24.840 --> 23:27.240
that I'm proud per line of any code I wrote,
23:27.240 --> 23:29.160
maybe there's some exploits that I think are beautiful
23:29.160 --> 23:31.320
and then this, this is 200 lines
23:31.320 --> 23:33.720
and just the way that I thought about it,
23:33.720 --> 23:35.560
I think was very good and the reason it's very good
23:35.560 --> 23:37.640
is because that was the fourth version of it that I wrote
23:37.640 --> 23:39.320
and I had three versions that I threw away.
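NOTE
The shape of that design, sketched in Python rather than Go: the master only maps a key to a volume server and redirects, so the volume servers (nginx, in his version) do all the byte-serving. The hostnames are placeholders.

  import hashlib
  from http.server import BaseHTTPRequestHandler, HTTPServer

  VOLUMES = ["http://volume1:8001", "http://volume2:8002"]  # nginx servers

  class Master(BaseHTTPRequestHandler):
      def do_GET(self):
          key = self.path.lstrip("/")
          # deterministically pick a volume server for this key
          idx = int(hashlib.md5(key.encode()).hexdigest(), 16) % len(VOLUMES)
          self.send_response(302)  # redirect the client straight to the data
          self.send_header("Location", f"{VOLUMES[idx]}/{key}")
          self.end_headers()

  HTTPServer(("", 3000), Master).serve_forever()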
23:39.320 --> 23:41.000
You mentioned, did you say go?
23:41.000 --> 23:41.840
I wrote it in Go, yeah.
23:41.840 --> 23:42.680
And go.
23:42.680 --> 23:43.880
Is that a functional language?
23:43.880 --> 23:45.280
I forget what go is.
23:45.280 --> 23:47.160
Go is Google's language.
23:47.160 --> 23:48.200
Right.
23:48.200 --> 23:49.480
It's not functional.
23:49.480 --> 23:54.480
It's some, it's like, in a way it's C++, but easier.
23:56.160 --> 23:58.200
It's strongly typed.
23:58.200 --> 23:59.760
It has a nice ecosystem around it.
23:59.760 --> 24:01.680
When I first looked at it, I was like,
24:01.680 --> 24:03.800
this is like Python, but it takes twice as long
24:03.800 --> 24:05.600
to do anything.
24:05.600 --> 24:09.600
Now that openpilot is migrating to C,
24:09.600 --> 24:11.000
but it still has large Python components,
24:11.000 --> 24:12.760
I now understand why Python doesn't work
24:12.760 --> 24:15.840
for large code bases and why you want something like go.
24:15.840 --> 24:16.680
Interesting.
24:16.680 --> 24:18.680
So why, why doesn't Python work for,
24:18.680 --> 24:21.720
so even most, speaking for myself at least,
24:21.720 --> 24:24.960
like we do a lot of stuff, basically demo level work
24:24.960 --> 24:29.240
with autonomous vehicles and most of the work is Python.
24:29.240 --> 24:32.440
Why doesn't Python work for large code bases?
24:32.440 --> 24:37.440
Because, well, lack of type checking is a big one.
24:37.920 --> 24:39.360
So errors creep in.
24:39.360 --> 24:41.920
Yeah, and like you don't know,
24:41.920 --> 24:45.320
the compiler can tell you like nothing, right?
24:45.320 --> 24:48.440
So everything is either, you know,
24:48.440 --> 24:49.880
like syntax errors, fine,
24:49.880 --> 24:51.800
but if you misspell a variable in Python,
24:51.800 --> 24:53.000
the compiler won't catch that.
24:53.000 --> 24:56.600
There's like linters that can catch it some of the time.
24:56.600 --> 24:57.560
There's no types.
24:57.560 --> 25:00.520
This is really the biggest downside.
25:00.520 --> 25:02.640
And then, well, Python's slow, but that's not related to it.
25:02.640 --> 25:04.840
Well, maybe it's kind of related to it, the lack of types.
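NOTE
The failure mode being described: a misspelled name in Python passes every "compile" step and only blows up when that exact line runs. As noted above, linters catch some of this, but only some.

  def ship_it(order_total):
      if order_total > 100:
          discount = 0.1
          return order_total * (1 - discont)  # typo, caught only at runtime
      return order_total

  print(ship_it(50))   # fine: the buggy branch never executes
  print(ship_it(200))  # NameError: name 'discont' is not defined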
25:04.840 --> 25:06.600
So what's in your toolbox these days?
25:06.600 --> 25:07.760
Is it Python or what else?
25:07.760 --> 25:08.600
Go.
25:08.600 --> 25:10.240
I need to move to something else.
25:10.240 --> 25:12.880
My adventure into dependently typed languages,
25:12.880 --> 25:14.240
I love these languages.
25:14.240 --> 25:17.520
They just have like syntax from the 80s.
25:18.520 --> 25:21.120
What do you think about JavaScript?
25:21.120 --> 25:24.000
ES6, like the modern TypeScript?
25:24.000 --> 25:27.320
JavaScript is, the whole ecosystem
25:27.320 --> 25:29.320
is unbelievably confusing.
25:29.320 --> 25:32.840
npm updates a package from 0.2.2 to 0.2.5
25:32.840 --> 25:34.560
and that breaks your Babel linter,
25:34.560 --> 25:38.560
which translates your ES5 into ES6, which doesn't run on.
25:38.560 --> 25:42.480
So why do I have to compile my JavaScript again, huh?
25:42.480 --> 25:44.040
It may be the future though.
25:44.040 --> 25:45.800
You think about, I mean,
25:45.800 --> 25:47.400
I've embraced JavaScript recently
25:47.400 --> 25:52.280
just because just like I've continually embraced PHP,
25:52.280 --> 25:55.360
it seems that these worst possible languages live on
25:55.360 --> 25:57.480
for the longest, like cockroaches never die.
25:57.480 --> 26:00.760
Yeah, well, it's in the browser and it's fast.
26:00.760 --> 26:01.680
It's fast.
26:01.680 --> 26:02.520
Yeah.
26:02.520 --> 26:05.480
It's in the browser and compute might stay there, become,
26:05.480 --> 26:06.440
you know, the browser,
26:06.440 --> 26:09.040
it's unclear what the role of the browser is
26:09.040 --> 26:11.800
in terms of distributed computation in the future.
26:11.800 --> 26:12.640
So.
26:13.600 --> 26:15.240
JavaScript is definitely here to stay.
26:15.240 --> 26:16.080
Yeah.
26:16.080 --> 26:18.160
It's interesting if autonomous vehicles
26:18.160 --> 26:19.480
will run on JavaScript one day.
26:19.480 --> 26:21.760
I mean, you have to consider these possibilities.
26:21.760 --> 26:24.280
Well, all our debug tools are JavaScript.
26:24.280 --> 26:26.040
We actually just open source them.
26:26.040 --> 26:28.160
We have a tool, Explorer, with which you can annotate
26:28.160 --> 26:30.080
your disengagements, and we have a tool, Cabana,
26:30.080 --> 26:32.920
which lets you analyze the CAN traffic from the car.
26:32.920 --> 26:35.240
So basically any time you're visualizing something
26:35.240 --> 26:37.720
about the logs, we're using JavaScript.
26:37.720 --> 26:40.120
Well, the web is the best UI toolkit by far.
26:40.120 --> 26:40.960
Yeah.
26:40.960 --> 26:41.880
So, and then, you know what?
26:41.880 --> 26:42.760
You're coding in JavaScript.
26:42.760 --> 26:43.600
We have a React guy.
26:43.600 --> 26:44.440
He's good.
26:44.440 --> 26:46.080
React, nice.
26:46.080 --> 26:46.920
Let's get into it.
26:46.920 --> 26:49.120
So let's talk autonomous vehicles.
26:49.120 --> 26:50.640
You founded Comma AI.
26:51.440 --> 26:54.920
Let's, at a high level,
26:54.920 --> 26:57.880
how did you get into the world of vehicle automation?
26:57.880 --> 26:59.920
Can you also just, for people who don't know,
26:59.920 --> 27:01.400
tell the story of comma AI?
27:01.400 --> 27:02.920
Sure.
27:02.920 --> 27:06.120
So I was working at this AI startup
27:06.120 --> 27:09.240
and a friend approached me and he's like,
27:09.240 --> 27:12.080
dude, I don't know where this is going,
27:12.080 --> 27:15.160
but the coolest applied AI problem today
27:15.160 --> 27:16.480
is self driving cars.
27:16.480 --> 27:17.720
I'm like, well, absolutely.
27:18.800 --> 27:20.520
You wanna meet with Elon Musk
27:20.520 --> 27:24.560
and he's looking for somebody to build a vision system
27:24.560 --> 27:27.600
for autopilot.
27:27.600 --> 27:29.320
This is when they were still on AP one.
27:29.320 --> 27:30.840
They were still using Mobileye.
27:30.840 --> 27:33.680
Elon back then was looking for a replacement.
27:33.680 --> 27:37.320
And he brought me in and we talked about a contract
27:37.320 --> 27:39.040
where I would deliver something
27:39.040 --> 27:41.640
that meets Mobileye level performance.
27:41.640 --> 27:43.920
I would get paid $12 million if I could deliver it tomorrow
27:43.920 --> 27:46.720
and I would lose $1 million for every month I didn't deliver.
27:47.720 --> 27:49.080
So I was like, okay, this is a great deal.
27:49.080 --> 27:50.800
This is a super exciting challenge.
27:52.360 --> 27:53.200
You know what?
27:53.200 --> 27:55.840
It takes me 10 months, I get $2 million, it's good.
27:55.840 --> 27:57.160
Maybe I can finish up in five.
27:57.160 --> 27:58.880
Maybe I don't finish it at all and I get paid nothing
27:58.880 --> 28:00.880
and I'll work for 12 months for free.
28:00.880 --> 28:02.960
So maybe just take a pause on that.
28:02.960 --> 28:04.280
I'm also curious about this
28:04.280 --> 28:06.360
because I've been working in robotics for a long time.
28:06.360 --> 28:08.320
And I'm curious to see a person like you just step in
28:08.320 --> 28:12.000
and sort of somewhat naive, but brilliant, right?
28:12.000 --> 28:14.000
So that's the best place to be
28:14.000 --> 28:17.240
because you basically full steam take on a problem.
28:17.240 --> 28:19.720
How confident, from that time,
28:19.720 --> 28:21.320
because you know a lot more now,
28:21.320 --> 28:23.440
at that time, how hard did you think it was
28:23.440 --> 28:25.880
to solve all of autonomous driving?
28:25.880 --> 28:30.440
I remember I suggested to Elon in the meeting
28:30.440 --> 28:33.120
putting a GPU behind each camera
28:33.120 --> 28:35.120
to keep the compute local.
28:35.120 --> 28:38.000
This is an incredibly stupid idea.
28:38.000 --> 28:40.080
I leave the meeting 10 minutes later and I'm like,
28:40.080 --> 28:41.560
I could have spent a little bit of time
28:41.560 --> 28:42.880
thinking about this problem before I went in.
28:42.880 --> 28:44.200
Why is this a stupid idea?
28:44.200 --> 28:46.280
Oh, just send all your cameras to one big GPU.
28:46.280 --> 28:48.240
You're much better off doing that.
28:48.240 --> 28:50.160
Oh, sorry, you said behind every camera.
28:50.160 --> 28:51.000
Every camera.
28:51.000 --> 28:51.840
Every small GPU.
28:51.840 --> 28:52.720
I was like, oh, I'll put the first few layers
28:52.720 --> 28:54.520
of my convs there.
28:54.520 --> 28:56.080
Like why did I say that?
28:56.080 --> 28:56.920
That's possible.
28:56.920 --> 28:59.000
It's possible, but it's a bad idea.
28:59.000 --> 29:00.480
It's not obviously a bad idea.
29:00.480 --> 29:01.320
Pretty obviously bad.
29:01.320 --> 29:02.960
But whether it's actually a bad idea or not,
29:02.960 --> 29:05.240
I left that meeting with Elon, like beating myself up.
29:05.240 --> 29:07.080
I'm like, why did I say something stupid?
29:07.080 --> 29:09.360
Yeah, you haven't, like you haven't at least
29:09.360 --> 29:12.240
like thought through every aspect fully.
29:12.240 --> 29:13.200
He's very sharp too.
29:13.200 --> 29:15.760
Like usually in life, I get away with saying stupid things
29:15.760 --> 29:16.960
and then kind of course correct, but
29:16.960 --> 29:18.560
right away he called me out about it.
29:18.560 --> 29:19.800
And like, usually in life,
29:19.800 --> 29:21.120
I get away with saying stupid things.
29:21.120 --> 29:24.640
And then like people will, you know,
29:24.640 --> 29:26.080
a lot of times people don't even notice.
29:26.080 --> 29:28.200
And I'll like correct it and bring the conversation back.
29:28.200 --> 29:30.600
But with Elon, it was like, nope, like, okay.
29:30.600 --> 29:33.520
Well, that's not at all why the contract fell through.
29:33.520 --> 29:35.520
I was much more prepared the second time I met him.
29:35.520 --> 29:36.360
Yeah.
29:36.360 --> 29:39.640
But in general, how hard did you think it,
29:39.640 --> 29:43.680
like 12 months is a tough timeline?
29:43.680 --> 29:45.720
Oh, I just thought I'd clone the Mobileye EyeQ3.
29:45.720 --> 29:47.560
I didn't think I'd solve level five self driving
29:47.560 --> 29:48.400
or anything.
29:48.400 --> 29:51.000
So the goal there was to do lane keeping,
29:51.000 --> 29:52.840
good lane keeping.
29:52.840 --> 29:55.560
I saw my friend showed me the outputs from Mobileye.
29:55.560 --> 29:57.680
And the outputs from Mobileye were just basically two lanes
29:57.680 --> 29:59.440
and a position of a lead car.
29:59.440 --> 30:01.560
I'm like, I can gather a data set
30:01.560 --> 30:03.440
and train this net in weeks.
30:03.440 --> 30:04.840
And I did.
30:04.840 --> 30:07.600
Well, first time I tried the implementation of Mobileye
30:07.600 --> 30:11.240
in a Tesla, I was really surprised how good it is.
30:11.240 --> 30:12.320
It's quite incredibly good.
30:12.320 --> 30:14.080
Cause I thought it's just cause I've done
30:14.080 --> 30:14.920
a lot of computer vision.
30:14.920 --> 30:18.880
I thought it'd be a lot harder to create a system
30:18.880 --> 30:20.040
that that's stable.
30:21.000 --> 30:22.440
So I was personally surprised.
30:22.440 --> 30:25.000
Just, you know, have to admit it.
30:25.000 --> 30:27.840
Cause I was kind of skeptical before trying it.
30:27.840 --> 30:31.200
Cause I thought it would go in and out a lot more.
30:31.200 --> 30:33.160
It would get disengaged a lot more.
30:33.160 --> 30:35.000
And it's pretty robust.
30:36.200 --> 30:39.720
So what, how, how, how hard is the problem
30:39.720 --> 30:42.080
when you, when you tackled it?
30:42.080 --> 30:45.760
So I think AP one was great. Like Elon talked
30:45.760 --> 30:49.040
about disengagements on the 405 down in LA
30:49.040 --> 30:51.040
with like the lane marks were kind of faded
30:51.040 --> 30:52.960
and the Mobileye system would drop out.
30:53.960 --> 30:57.240
Like I had something up and working
30:57.240 --> 31:01.440
that I would say was like the same quality in three months.
31:02.480 --> 31:04.560
Same quality, but how do you know?
31:04.560 --> 31:07.400
You say stuff like that confidently, but you can't,
31:07.400 --> 31:12.120
and I love it, but the question is you can't,
31:12.120 --> 31:13.880
you're kind of going by feel cause you just,
31:13.880 --> 31:15.560
You're going by feel, absolutely, absolutely.
31:15.560 --> 31:17.280
Like, like I would take, I hadn't,
31:17.280 --> 31:18.480
I borrowed my friend's Tesla.
31:18.480 --> 31:20.760
I would take AP one out for a drive.
31:20.760 --> 31:22.320
And then I would take my system out for a drive.
31:22.320 --> 31:24.440
And seems reasonably like the same.
31:26.080 --> 31:30.480
So the 405, how hard is it to create something
31:30.480 --> 31:34.200
that could actually be a product that's deployed?
31:34.200 --> 31:39.200
I mean, I've read an article where Elon responded,
31:39.520 --> 31:41.880
it said something about you saying that
31:41.880 --> 31:46.880
to build autopilot is more complicated
31:47.080 --> 31:51.880
than a single George Hotz level job.
31:51.880 --> 31:55.520
How hard is that job to create something
31:55.520 --> 31:57.480
that would work globally?
31:58.960 --> 32:00.640
Well, I don't think global is the challenge,
32:00.640 --> 32:02.240
but Elon followed that up by saying
32:02.240 --> 32:04.920
it's going to take two years and a company of 10 people.
32:04.920 --> 32:07.920
And here I am four years later with a company of 12 people.
32:07.920 --> 32:09.960
And I think we still have another two to go.
32:09.960 --> 32:10.800
Two years.
32:10.800 --> 32:13.120
So yeah, so what do you think,
32:13.120 --> 32:15.960
what do you think about how Tesla's progressing
32:15.960 --> 32:19.200
with Autopilot V2, V3?
32:19.200 --> 32:23.120
I think we've kept pace with them pretty well.
32:24.080 --> 32:26.880
I think navigate on autopilot is terrible.
32:26.880 --> 32:31.120
We had some demo features internally of the same stuff
32:31.120 --> 32:32.720
and we would test it and I'm like,
32:32.720 --> 32:34.720
I'm not shipping this even as like open source software
32:34.720 --> 32:35.560
to people.
32:35.560 --> 32:37.400
What do you think is terrible?
32:37.400 --> 32:39.600
Consumer Reports does a great job of describing it.
32:39.600 --> 32:41.240
Like when it makes a lane change,
32:41.240 --> 32:43.600
it does it worse than a human.
32:43.600 --> 32:46.960
You shouldn't ship things like that. Autopilot, OpenPilot,
32:46.960 --> 32:49.760
they lane keep better than a human.
32:49.760 --> 32:53.440
If you turn it on for a stretch of highway,
32:53.440 --> 32:56.680
like an hour long, it's never going to touch a lane line.
32:56.680 --> 32:59.040
A human will probably touch a lane line twice.
32:59.040 --> 33:00.080
You just inspired me.
33:00.080 --> 33:02.200
I don't know if you're grounded in data on that.
33:02.200 --> 33:03.280
I read your paper.
33:03.280 --> 33:05.400
Okay, but no, but that's interesting.
33:06.720 --> 33:09.840
I wonder actually how often we touch lane lines
33:11.200 --> 33:13.400
a little bit because it is.
33:13.400 --> 33:14.960
I could answer that question pretty easily
33:14.960 --> 33:15.800
with the comma dataset.
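NOTE
One plausible way to answer that from driving logs; the log format here (lateral offset from lane center in meters, plus lane and car widths) is assumed for illustration.

  def lane_touches(offsets_m, lane_width_m=3.7, car_width_m=1.9):
      edge = (lane_width_m - car_width_m) / 2  # offset where a tire touches
      touching = [abs(o) >= edge for o in offsets_m]
      # count entries into the touching state, not samples spent there
      return sum(1 for prev, cur in zip(touching, touching[1:])
                 if cur and not prev)

  print(lane_touches([0.0, 0.3, 0.7, 0.95, 0.4, 0.1, 0.92, 0.5]))  # -> 2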
33:15.800 --> 33:16.920
Yeah, I'm curious.
33:16.920 --> 33:17.760
I've never answered it.
33:17.760 --> 33:18.600
I don't know.
33:18.600 --> 33:20.000
It was just like my personal feel.
33:20.000 --> 33:22.400
It feels right, but that's interesting
33:22.400 --> 33:23.800
because every time you touch a lane,
33:23.800 --> 33:26.760
that's a source of a little bit of stress
33:26.760 --> 33:29.320
and kind of lane keeping is removing that stress.
33:29.320 --> 33:31.840
That's ultimately the biggest value add
33:31.840 --> 33:34.240
honestly is just removing the stress
33:34.240 --> 33:35.480
of having to stay in lane.
33:35.480 --> 33:39.040
And I think I don't think people fully realize
33:39.040 --> 33:41.960
first of all that that's a big value add,
33:41.960 --> 33:45.000
but also that that's all it is.
33:45.000 --> 33:48.560
And that not only I find it a huge value add.
33:48.560 --> 33:50.440
I drove down when we moved to San Diego,
33:50.440 --> 33:52.640
I drove down in an enterprise rental car
33:52.640 --> 33:53.480
and I missed it.
33:53.480 --> 33:55.480
So I missed having the system so much.
33:55.480 --> 33:59.200
It's so much more tiring to drive
33:59.200 --> 34:00.320
without it.
34:00.320 --> 34:02.960
It's, it is that lane centering.
34:02.960 --> 34:04.840
That's the key feature.
34:04.840 --> 34:05.680
Yeah.
34:06.600 --> 34:08.960
And in a way it's the only feature
34:08.960 --> 34:11.040
that actually adds value to people's lives
34:11.040 --> 34:12.200
in autonomous vehicles today.
34:12.200 --> 34:13.840
Waymo does not add value to people's lives.
34:13.840 --> 34:15.880
It's a more expensive, slower Uber.
34:15.880 --> 34:18.640
Maybe someday it'll be this big cliff where it adds value,
34:18.640 --> 34:19.480
but I don't usually.
34:19.480 --> 34:20.320
It's fascinating.
34:20.320 --> 34:22.560
I haven't talked to, this is good.
34:22.560 --> 34:25.840
Cause I haven't, I have intuitively,
34:25.840 --> 34:28.320
but I think we're making it explicit now.
34:28.320 --> 34:33.320
I actually believe that really good lane keeping
34:35.480 --> 34:37.240
is a reason to buy a car.
34:37.240 --> 34:38.440
Will be a reason to buy a car.
34:38.440 --> 34:39.720
It is a huge value add.
34:39.720 --> 34:41.760
I've never, until we just started talking about it,
34:41.760 --> 34:43.880
haven't really quite realized it,
34:43.880 --> 34:48.880
that I've felt with Elon's chase of level four
34:49.440 --> 34:52.360
is not the correct chase.
34:52.360 --> 34:56.000
It was wrong, cause you should just say Tesla has the best,
34:56.000 --> 34:58.320
as in, from a Tesla perspective, say,
34:58.320 --> 35:00.600
Tesla has the best lane keeping.
35:00.600 --> 35:04.160
Comma AI should say Comma AI is the best at lane keeping.
35:04.160 --> 35:05.640
And that is it.
35:05.640 --> 35:06.480
Yeah.
35:06.480 --> 35:07.320
Yeah.
35:07.320 --> 35:08.160
Do you think?
35:08.160 --> 35:09.920
You have to do the longitudinal as well.
35:09.920 --> 35:10.960
You can't just lane keep.
35:10.960 --> 35:12.920
You have to do ACC,
35:12.920 --> 35:15.840
but ACC is much more forgiving than lane keep,
35:15.840 --> 35:17.400
especially on the highway.
35:17.400 --> 35:22.000
By the way, is Comma AI camera only, correct?
35:22.000 --> 35:23.440
No, we use the radar.
35:23.440 --> 35:26.960
From the car, you're able to get the radar, okay.
35:26.960 --> 35:28.800
We can do it camera only now.
35:28.800 --> 35:29.640
It's gotten to the point,
35:29.640 --> 35:31.600
but we leave the radar there as like a,
35:31.600 --> 35:33.440
it's fusion now.
35:33.440 --> 35:35.440
Okay, so let's maybe talk through
35:35.440 --> 35:37.920
some of the system specs on the hardware.
35:37.920 --> 35:42.880
What's the hardware side of what you're providing?
35:42.880 --> 35:44.720
What's the capabilities on the software side
35:44.720 --> 35:46.800
with OpenPilot and so on?
35:46.800 --> 35:51.800
So OpenPilot, and the box that we sell that it runs on,
35:51.800 --> 35:53.920
it's a phone in a plastic case.
35:53.920 --> 35:54.840
It's nothing special.
35:54.840 --> 35:56.200
We sell it without the software.
35:56.200 --> 35:57.840
So you're like, you know, you buy the phone,
35:57.840 --> 35:58.920
it's just easy.
35:58.920 --> 36:00.240
It'll be easy set up,
36:00.240 --> 36:01.720
but it's sold with no software.
36:03.480 --> 36:06.600
OpenPilot right now is about to be 0.6.
36:06.600 --> 36:07.880
When it gets to 1.0,
36:07.880 --> 36:09.680
I think we'll be ready for a consumer product.
36:09.680 --> 36:11.120
We're not gonna add any new features.
36:11.120 --> 36:13.800
We're just gonna make the lane keeping really, really good.
36:13.800 --> 36:15.120
Okay, I got it.
36:15.120 --> 36:16.120
So what do we have right now?
36:16.120 --> 36:18.200
It's a Snapdragon 820.
36:18.200 --> 36:23.200
It's a Sony IMX 298 forward facing camera,
36:23.680 --> 36:24.720
driver monitoring camera.
36:24.720 --> 36:26.400
It's just a selfie cam on the phone.
36:26.400 --> 36:30.000
And a CAN transceiver,
36:30.000 --> 36:32.320
they're little things called pandas.
36:32.320 --> 36:35.040
And they talk over USB to the phone
36:35.040 --> 36:36.400
and then they have three can buses
36:36.400 --> 36:37.560
that they talk to the car.
36:38.560 --> 36:40.920
One of those CAN buses is the radar CAN bus.
36:40.920 --> 36:42.920
One of them is the main car CAN bus.
36:42.920 --> 36:44.920
And the other one is the proxy camera CAN bus.
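For a rough picture of that bus layout in code, here is a minimal sketch using the python-can library over SocketCAN; the channel mapping and polling loop are illustrative assumptions, not the actual panda firmware interface, which talks to the phone over USB.

```python
import time

import can  # python-can; assumes a Linux SocketCAN setup, not the real USB panda link

# The three buses the panda bridges, as described above (mapping assumed).
BUSES = {"radar": "can0", "car": "can1", "camera_proxy": "can2"}

def dump_traffic(seconds=5.0):
    buses = {name: can.interface.Bus(channel=ch, interface="socketcan")
             for name, ch in BUSES.items()}
    end = time.time() + seconds
    while time.time() < end:
        for name, bus in buses.items():
            msg = bus.recv(timeout=0.0)  # non-blocking poll
            if msg is not None:
                print(f"{name}: 0x{msg.arbitration_id:03x} {msg.data.hex()}")
    for bus in buses.values():
        bus.shutdown()
```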
36:44.920 --> 36:47.320
We leave the existing camera in place.
36:47.320 --> 36:49.560
So we don't turn AEB off.
36:49.560 --> 36:51.040
Right now we still turn AEB off
36:51.040 --> 36:52.280
if you're using our longitudinal,
36:52.280 --> 36:54.320
but we're gonna fix that before 1.0.
36:54.320 --> 36:55.160
Got it.
36:55.160 --> 36:56.000
Wow, that's cool.
36:56.000 --> 36:57.960
So it can send CAN messages both ways.
36:57.960 --> 37:02.120
So how are you able to control vehicles?
37:02.120 --> 37:05.520
So we proxy it. The vehicles that we work with
37:05.520 --> 37:08.960
already have a lane keeping assist system.
37:08.960 --> 37:12.520
So lane keeping assist can mean a huge variety of things.
37:12.520 --> 37:16.120
It can mean it will apply a small torque
37:16.120 --> 37:18.920
to the wheel after you've already crossed a lane line
37:18.920 --> 37:22.720
by a foot, which is the system in the older Toyotas.
37:22.720 --> 37:26.360
Versus like, I think Tesla still calls it lane keeping assist
37:26.360 --> 37:28.920
where it'll keep you perfectly in the center of the lane
37:28.920 --> 37:29.960
on the highway.
37:31.240 --> 37:34.000
You can control, like with a joystick, the cars.
37:34.000 --> 37:36.600
So these cars already have the capability of drive by wire.
37:36.600 --> 37:41.600
So is it trivial to convert a car
37:41.600 --> 37:43.320
so that it operates with,
37:43.320 --> 37:47.480
so that OpenPilot is able to control the steering?
37:48.480 --> 37:49.720
Oh, a new car or a car that we,
37:49.720 --> 37:52.800
so we have support now for 45 different makes of cars.
37:52.800 --> 37:54.880
What are the cars in general?
37:54.880 --> 37:56.360
Mostly Honda's and Toyotas.
37:56.360 --> 38:00.640
We support almost every Honda and Toyota made this year.
38:01.680 --> 38:04.480
And then a bunch of GMs, a bunch of Subarus.
38:04.480 --> 38:05.960
But it doesn't have to be like a Prius.
38:05.960 --> 38:07.320
It could be a Corolla as well.
38:07.320 --> 38:10.760
Oh, the 2020 Corolla is the best car with OpenPilot.
38:10.760 --> 38:11.720
It just came out.
38:11.720 --> 38:14.200
The actuator has less lag than the older Corolla.
38:15.840 --> 38:18.240
I think I started watching a video with you.
38:18.240 --> 38:21.480
I mean, the way you make videos is awesome.
38:21.480 --> 38:24.320
It's just literally at the dealership streaming.
38:25.320 --> 38:26.160
I had my friend follow him.
38:26.160 --> 38:27.560
I probably watched the stream for an hour.
38:27.560 --> 38:31.120
Yeah, and basically like if stuff goes a little wrong,
38:31.120 --> 38:33.160
you just like, you just go with it.
38:33.160 --> 38:34.000
Yeah, I love it.
38:34.000 --> 38:34.840
It's real.
38:34.840 --> 38:35.680
Yeah, it's real.
38:35.680 --> 38:42.000
That's so beautiful and it's so in contrast to the way
38:42.000 --> 38:44.600
other companies would put together a video like that.
38:44.600 --> 38:46.000
Kind of why I like to do it like that.
38:46.000 --> 38:46.840
Good.
38:46.840 --> 38:49.720
I mean, if you become super rich one day and successful,
38:49.720 --> 38:52.280
I hope you keep it that way because I think that's actually
38:52.280 --> 38:54.600
what people love, that kind of genuine.
38:54.600 --> 38:56.520
Oh, that's all that has value to me.
38:56.520 --> 38:59.840
Money has no, if I sell out to like make money,
38:59.840 --> 39:00.680
I sold out.
39:00.680 --> 39:01.520
It doesn't matter.
39:01.520 --> 39:02.360
What do I get?
39:02.360 --> 39:04.440
A yacht? I don't want a yacht.
39:04.440 --> 39:09.440
And I think Tesla actually has a small inkling of that
39:09.440 --> 39:11.240
as well with autonomy day.
39:11.240 --> 39:14.000
They did reveal more than, I mean, of course,
39:14.000 --> 39:15.680
there's marketing communications, you could tell,
39:15.680 --> 39:17.640
but it's more than most companies would reveal,
39:17.640 --> 39:20.960
which is I hope they go towards that direction
39:20.960 --> 39:23.000
more other companies, GM, Ford.
39:23.000 --> 39:25.400
Oh, Tesla's going to win level five.
39:25.400 --> 39:26.560
They really are.
39:26.560 --> 39:27.800
So let's talk about it.
39:27.800 --> 39:33.000
You think, you're focused on level two currently.
39:33.000 --> 39:36.160
We're going to be one to two years behind Tesla
39:36.160 --> 39:37.160
getting to level five.
39:37.160 --> 39:38.520
OK.
39:38.520 --> 39:39.320
We're Android, right?
39:39.320 --> 39:39.880
We're Android.
39:39.880 --> 39:40.680
You're Android.
39:40.680 --> 39:42.240
I'm just saying once Tesla gets it,
39:42.240 --> 39:43.440
we're one to two years behind.
39:43.440 --> 39:45.680
I'm not making any timeline on when Tesla's going to get it.
39:45.680 --> 39:46.120
That's right.
39:46.120 --> 39:46.360
You did.
39:46.360 --> 39:46.960
That's brilliant.
39:46.960 --> 39:48.560
I'm sorry, Tesla investors, if you
39:48.560 --> 39:50.520
think you're going to have an autonomous robot taxi
39:50.520 --> 39:54.920
fleet by the end of the year, I'll bet against that.
39:54.920 --> 39:57.720
So what do you think about this?
39:57.720 --> 40:03.280
Most level four companies are kind of just
40:03.280 --> 40:08.360
doing their usual safety driver, full autonomy kind
40:08.360 --> 40:08.800
of testing.
40:08.800 --> 40:10.880
And then Tesla is basically trying
40:10.880 --> 40:15.280
to go from lane keeping to full autonomy.
40:15.280 --> 40:16.840
What do you think about that approach?
40:16.840 --> 40:18.360
How successful would it be?
40:18.360 --> 40:20.680
It's a ton better approach.
40:20.680 --> 40:23.960
Because Tesla is gathering data on a scale
40:23.960 --> 40:25.200
that none of them are.
40:25.200 --> 40:29.560
They're putting real users behind the wheel of the cars.
40:29.560 --> 40:34.440
It's, I think, the only strategy that works, the incremental.
40:34.440 --> 40:37.000
Well, so there's a few components to Tesla approach
40:37.000 --> 40:38.800
that's more than just the incremental.
40:38.800 --> 40:41.400
One that you spoke to is the software,
40:41.400 --> 40:43.720
so over the air software updates.
40:43.720 --> 40:44.800
Necessity.
40:44.800 --> 40:46.440
I mean, Waymo, Cruise have those too.
40:46.440 --> 40:47.560
Those aren't.
40:47.560 --> 40:48.080
But no.
40:48.080 --> 40:49.800
Those differentiate them from the automakers.
40:49.800 --> 40:50.080
Right.
40:50.080 --> 40:53.440
No cars with a lane keeping system
40:53.440 --> 40:54.760
have that except Tesla.
40:54.760 --> 40:55.720
Yeah.
40:55.720 --> 40:59.760
And the other one is the data, the other direction,
40:59.760 --> 41:01.840
which is the ability to query the data.
41:01.840 --> 41:03.480
I don't think they're actually collecting
41:03.480 --> 41:05.240
as much data as people think, but the ability
41:05.240 --> 41:09.440
to turn on collection and turn it off.
41:09.440 --> 41:13.400
So I'm in both the robotics world and the psychology,
41:13.400 --> 41:15.000
human factors world.
41:15.000 --> 41:17.320
Many people believe that level two autonomy
41:17.320 --> 41:20.040
is problematic because of the human factor.
41:20.040 --> 41:23.280
Like the more the task is automated,
41:23.280 --> 41:25.960
the more there's a vigilance decrement.
41:25.960 --> 41:27.200
You start to fall asleep.
41:27.200 --> 41:30.480
You start to become complacent, start texting more and so on.
41:30.480 --> 41:32.200
Do you worry about that?
41:32.200 --> 41:35.000
Because if you're talking about transition from lane keeping
41:35.000 --> 41:40.960
to full autonomy, if you're spending 80% of the time
41:40.960 --> 41:43.080
not supervising the machine, do you
41:43.080 --> 41:47.080
worry about what that means for the safety of the drivers?
41:47.080 --> 41:49.640
One, we don't consider OpenPilot to be 1.0
41:49.640 --> 41:52.880
until we have 100% driver monitoring.
41:52.880 --> 41:55.000
You can cheat our driver monitoring system right now.
41:55.000 --> 41:56.080
There's a few ways to cheat it.
41:56.080 --> 41:58.160
They're pretty obvious.
41:58.160 --> 41:59.680
We're working on making that better.
41:59.680 --> 42:02.520
Before we ship a consumer product that can drive cars,
42:02.520 --> 42:04.240
I want to make sure that I have driver monitoring
42:04.240 --> 42:05.440
that you can't cheat.
42:05.440 --> 42:09.000
What's a successful driver monitoring system look like?
42:09.000 --> 42:11.680
Is it all about just keeping your eyes on the road?
42:11.680 --> 42:12.760
Well, a few things.
42:12.760 --> 42:16.600
So that's what we went with at first for driver monitoring.
42:16.600 --> 42:17.160
I'm checking.
42:17.160 --> 42:19.000
I'm actually looking at where your head is looking.
42:19.000 --> 42:19.880
The camera's not that high resolution.
42:19.880 --> 42:21.840
Eyes are a little bit hard to get.
42:21.840 --> 42:22.880
Well, head is big.
42:22.880 --> 42:23.560
I mean, that's just.
42:23.560 --> 42:24.640
Head is good.
42:24.640 --> 42:28.720
And actually, a lot of it, just psychology wise,
42:28.720 --> 42:30.720
to have that monitor constantly there,
42:30.720 --> 42:33.400
it reminds you that you have to be paying attention.
42:33.400 --> 42:35.080
But we want to go further.
42:35.080 --> 42:36.760
We just hired someone full time to come on
42:36.760 --> 42:37.960
to do the driver monitoring.
42:37.960 --> 42:40.600
I want to detect phone in frame, and I
42:40.600 --> 42:42.600
want to make sure you're not sleeping.
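A minimal sketch of head-pose-based monitoring along the lines described here, where the head is tracked because the eyes are hard to resolve; the yaw limit and grace period are invented numbers.

```python
import time

class AttentionMonitor:
    """Head-pose driver monitoring sketch; thresholds are assumptions."""

    def __init__(self, yaw_limit_rad=0.5, grace_s=2.0):
        self.yaw_limit_rad = yaw_limit_rad  # how far off-road the head may point
        self.grace_s = grace_s              # how long before escalating
        self.last_attentive = time.monotonic()

    def update(self, face_detected, head_yaw_rad):
        now = time.monotonic()
        if face_detected and abs(head_yaw_rad) < self.yaw_limit_rad:
            self.last_attentive = now
        # True means: alert the driver, they have looked away too long.
        return (now - self.last_attentive) > self.grace_s
```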
42:42.600 --> 42:44.880
How much does the camera see of the body?
42:44.880 --> 42:47.480
This one, not enough.
42:47.480 --> 42:48.400
Not enough.
42:48.400 --> 42:50.720
The next one, everything.
42:50.720 --> 42:52.920
What's interesting is fisheye. We're
42:52.920 --> 42:55.200
doing just data collection, not real time.
42:55.200 --> 42:59.200
But fisheye is beautiful for being able to capture the body.
42:59.200 --> 43:03.280
And the smartphone is really the biggest problem.
43:03.280 --> 43:03.880
I'll show you.
43:03.880 --> 43:07.800
I can show you one of the pictures from our new system.
43:07.800 --> 43:08.160
Awesome.
43:08.160 --> 43:10.520
So you're basically saying the driver monitoring
43:10.520 --> 43:13.080
will be the answer to that.
43:13.080 --> 43:15.320
I think the other point that you raised in your paper
43:15.320 --> 43:16.920
is good as well.
43:16.920 --> 43:20.400
You're not asking a human to supervise a machine
43:20.400 --> 43:23.920
without giving them the ability to take over at any time.
43:23.920 --> 43:25.760
Our safety model, you can take over.
43:25.760 --> 43:27.720
We disengage on either the gas or the brake.
43:27.720 --> 43:28.880
We don't disengage on steering.
43:28.880 --> 43:29.920
I don't feel you have to.
43:29.920 --> 43:31.720
But we disengage on gas or brake.
43:31.720 --> 43:34.240
So it's very easy for you to take over.
43:34.240 --> 43:36.400
And it's very easy for you to reengage.
43:36.400 --> 43:39.320
That switching should be super cheap.
43:39.320 --> 43:40.800
The cars that require, even autopilot,
43:40.800 --> 43:42.400
requires a double press.
43:42.400 --> 43:44.360
That's almost, I see, I don't like that.
43:44.360 --> 43:46.440
And then the cancel.
43:46.440 --> 43:48.320
To cancel in autopilot, you either
43:48.320 --> 43:49.920
have to press cancel, which no one knows where that is.
43:49.920 --> 43:51.000
So they press the brake.
43:51.000 --> 43:53.360
But a lot of times you don't want to press the brake.
43:53.360 --> 43:54.560
You want to press the gas.
43:54.560 --> 43:56.880
So you should cancel on gas or wiggle the steering wheel,
43:56.880 --> 43:57.960
which is bad as well.
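The policy being described reduces to a few lines; this sketch uses hypothetical signal names, but the logic, disengage on gas or brake, never on steering, one press to reengage, is straight from the conversation.

```python
def update_engagement(engaged, gas_pressed, brake_pressed, engage_pressed):
    """Disengage on gas or brake, never on steering, so the driver can
    nudge the wheel without losing assistance and gas doubles as cancel.
    A single engage press keeps the switching cheap in both directions."""
    if gas_pressed or brake_pressed:
        return False
    if engage_pressed:
        return True
    return engaged
```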
43:57.960 --> 43:58.920
Wow, that's brilliant.
43:58.920 --> 44:01.440
I haven't heard anyone articulate that point.
44:01.440 --> 44:04.960
Oh, there's a lot I think about.
44:04.960 --> 44:09.800
Because I think actually Tesla has done a better job
44:09.800 --> 44:12.920
than most automakers at making that frictionless.
44:12.920 --> 44:16.600
But you just described that it could be even better.
44:16.600 --> 44:19.320
I love Super Cruise as an experience.
44:19.320 --> 44:21.120
Once it's engaged.
44:21.120 --> 44:22.800
I don't know if you've used it, but getting the thing
44:22.800 --> 44:25.040
to try to engage.
44:25.040 --> 44:27.480
Yeah, I've driven Super Cruise a lot.
44:27.480 --> 44:29.680
So what's your thoughts on the Super Cruise system in general?
44:29.680 --> 44:32.640
You disengage Super Cruise, and it falls back to ACC.
44:32.640 --> 44:34.600
So my car is still accelerating.
44:34.600 --> 44:36.280
It feels weird.
44:36.280 --> 44:39.000
Otherwise, when you actually have Super Cruise engaged
44:39.000 --> 44:41.200
on the highway, it is phenomenal.
44:41.200 --> 44:42.320
We bought that Cadillac.
44:42.320 --> 44:43.240
We just sold it.
44:43.240 --> 44:45.600
But we bought it just to experience this.
44:45.600 --> 44:47.440
And I wanted everyone in the office to be like,
44:47.440 --> 44:49.360
this is what we're striving to build.
44:49.360 --> 44:52.800
GM is pioneering with the driver monitoring.
44:52.800 --> 44:55.040
You like their driver monitoring system?
44:55.040 --> 44:56.440
It has some bugs.
44:56.440 --> 45:01.960
If the sun is shining back here, it'll be blind to you.
45:01.960 --> 45:03.360
But overall, mostly, yeah.
45:03.360 --> 45:05.960
That's so cool that you know all this stuff.
45:05.960 --> 45:09.960
I don't often talk to people that have it, because it's such a rare car,
45:09.960 --> 45:10.960
unfortunately, currently.
45:10.960 --> 45:12.760
We bought one explicitly for that.
45:12.760 --> 45:15.040
We lost like $25K in the depreciation,
45:15.040 --> 45:16.720
but it feels worth it.
45:16.720 --> 45:21.280
I was very pleasantly surprised that the GM system
45:21.280 --> 45:26.320
was so innovative and really wasn't advertised much,
45:26.320 --> 45:28.480
wasn't talked about much.
45:28.480 --> 45:31.840
And I was nervous that it would die, that it would disappear.
45:31.840 --> 45:33.520
Well, they put it on the wrong car.
45:33.520 --> 45:35.680
They should have put it on the Bolt and not some weird Cadillac
45:35.680 --> 45:36.640
that nobody bought.
45:36.640 --> 45:39.520
I think that's going to be in, they're saying at least
45:39.520 --> 45:41.840
it's going to be in their entire fleet.
45:41.840 --> 45:44.320
So what do you think about, as long as we're
45:44.320 --> 45:46.920
on the driver monitoring, what do you think
45:46.920 --> 45:51.920
about Elon Musk's claim that driver monitoring is not needed?
45:51.920 --> 45:53.680
Normally, I love his claims.
45:53.680 --> 45:55.560
That one is stupid.
45:55.560 --> 45:56.560
That one is stupid.
45:56.560 --> 46:00.320
And he's not going to have his level five fleet
46:00.320 --> 46:01.320
by the end of the year.
46:01.320 --> 46:04.880
Hopefully, he's like, OK, I was wrong.
46:04.880 --> 46:06.280
I'm going to add driver monitoring.
46:06.280 --> 46:08.240
Because when these systems get to the point
46:08.240 --> 46:10.320
that they're only messing up once every 1,000 miles,
46:10.320 --> 46:14.080
you absolutely need driver monitoring.
46:14.080 --> 46:15.880
So let me play, because I agree with you,
46:15.880 --> 46:17.320
but let me play devil's advocate.
46:17.320 --> 46:22.440
One possibility is that without driver monitoring,
46:22.440 --> 46:29.400
people are able to self regulate, monitor themselves.
46:29.400 --> 46:30.680
So your idea is, I'm just.
46:30.680 --> 46:34.160
You're seeing all the people sleeping in Teslas?
46:34.160 --> 46:35.280
Yeah.
46:35.280 --> 46:38.320
Well, I'm a little skeptical of all the people sleeping
46:38.320 --> 46:43.960
in Teslas because I've stopped paying attention to that kind
46:43.960 --> 46:45.680
of stuff because I want to see real data.
46:45.680 --> 46:47.240
It's too glorified.
46:47.240 --> 46:48.720
It doesn't feel scientific to me.
46:48.720 --> 46:52.560
So I want to know how many people are really sleeping
46:52.560 --> 46:55.080
in Teslas versus sleeping in regular cars.
46:55.080 --> 46:57.640
I was driving here, sleep deprived,
46:57.640 --> 46:59.520
in a car with no automation.
46:59.520 --> 47:01.040
I was falling asleep.
47:01.040 --> 47:02.120
I agree that it's hypey.
47:02.120 --> 47:04.840
It's just like, you know what?
47:04.840 --> 47:08.480
If Elon put driver monitoring, my last autopilot experience
47:08.480 --> 47:12.200
was I rented a Model 3 in March and drove it around.
47:12.200 --> 47:13.640
The wheel thing is annoying.
47:13.640 --> 47:15.440
And the reason the wheel thing is annoying.
47:15.440 --> 47:17.080
We use the wheel thing as well, but we
47:17.080 --> 47:18.720
don't disengage on wheel.
47:18.720 --> 47:21.720
For Tesla, you have to touch the wheel just enough
47:21.720 --> 47:25.320
to trigger the torque sensor to tell it that you're there,
47:25.320 --> 47:29.720
but not enough as to disengage it. Don't use it
47:29.720 --> 47:30.440
for two things.
47:30.440 --> 47:31.360
Don't disengage on wheel.
47:31.360 --> 47:32.400
You don't have to.
47:32.400 --> 47:35.360
That whole experience, wow, beautifully put.
47:35.360 --> 47:38.360
All those elements, even if you don't have driver monitoring,
47:38.360 --> 47:41.080
that whole experience needs to be better.
47:41.080 --> 47:43.760
Driver monitoring, I think would make,
47:43.760 --> 47:46.200
I mean, I think Super Cruise is a better experience
47:46.200 --> 47:48.440
once it's engaged, over autopilot.
47:48.440 --> 47:51.600
I think Super Cruise's transitions to engagement
47:51.600 --> 47:55.200
and disengagement are significantly worse.
47:55.200 --> 47:57.880
There's a tricky thing, because if I were to criticize
47:57.880 --> 48:00.800
Super Cruise, it's a little too crude.
48:00.800 --> 48:03.640
And I think it's like six seconds or something.
48:03.640 --> 48:06.080
If you look off road, it'll start warning you.
48:06.080 --> 48:09.120
It's some ridiculously long period of time.
48:09.120 --> 48:14.120
And just the way, I think it's basically, it's a binary.
48:15.840 --> 48:17.440
It should be adaptive.
48:17.440 --> 48:19.880
Yeah, it needs to learn more about you.
48:19.880 --> 48:23.160
It needs to communicate what it sees about you more.
48:23.160 --> 48:25.800
I'm not, you know, Tesla shows what it sees
48:25.800 --> 48:27.160
about the external world.
48:27.160 --> 48:29.120
It would be nice if Super Cruise would tell us
48:29.120 --> 48:30.840
what it sees about the internal world.
48:30.840 --> 48:31.960
It's even worse than that.
48:31.960 --> 48:33.320
You press the button to engage
48:33.320 --> 48:35.480
and it just says Super Cruise unavailable.
48:35.480 --> 48:36.320
Yeah, why?
48:36.320 --> 48:37.800
Why?
48:37.800 --> 48:41.480
Yeah, that transparency is good.
48:41.480 --> 48:43.520
We've renamed the driver monitoring packet
48:43.520 --> 48:45.360
to driver state.
48:45.360 --> 48:46.280
Driver state.
48:46.280 --> 48:48.360
We have a car state packet, which has the state of the car,
48:48.360 --> 48:51.040
and a driver state packet, which has the state of the driver.
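openpilot's actual messaging uses Cap'n Proto schemas; purely to illustrate the split being described, here is the shape of the two packets as Python dataclasses, with field names that are assumptions rather than the real schema.

```python
from dataclasses import dataclass

@dataclass
class CarState:            # state of the car, read off the CAN bus
    v_ego: float           # speed, m/s
    steering_angle: float  # degrees
    gas_pressed: bool
    brake_pressed: bool

@dataclass
class DriverState:         # state of the driver, from the interior camera
    face_detected: bool
    head_yaw: float        # radians, 0 = facing the road
    head_pitch: float
    eyes_closed_prob: float
    bac_estimate: float    # speculative, per the exchange that follows
```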
48:51.040 --> 48:52.240
So what is it?
48:52.240 --> 48:54.080
Estimate their BAC.
48:54.080 --> 48:54.920
What's BAC?
48:54.920 --> 48:55.920
Blood alcohol content, kind of.
48:57.360 --> 48:59.240
You think that's possible with computer vision?
48:59.240 --> 49:00.080
Absolutely.
49:02.560 --> 49:04.520
It's a, to me, it's an open question.
49:04.520 --> 49:06.600
I haven't looked into it too much.
49:06.600 --> 49:08.440
Actually, I have quite seriously looked at the literature.
49:08.440 --> 49:10.840
It's not obvious to me that from the eyes and so on,
49:10.840 --> 49:11.680
you can tell.
49:11.680 --> 49:13.440
You might need stuff from the car as well.
49:13.440 --> 49:15.760
You might need how they're controlling the car, right?
49:15.760 --> 49:17.360
And that's fundamentally at the end of the day
49:17.360 --> 49:18.640
what you care about.
49:18.640 --> 49:21.640
But I think, especially when people are really drunk,
49:21.640 --> 49:23.640
they're not controlling the car nearly as smoothly
49:23.640 --> 49:25.160
as they would normally. Look at them walking, right?
49:25.160 --> 49:27.240
They're, the car is like an extension of the body.
49:27.240 --> 49:29.360
So I think you could totally detect it.
49:29.360 --> 49:30.880
And if you could fix people who are drunk,
49:30.880 --> 49:32.840
distracted, asleep, if you fix those three.
49:32.840 --> 49:35.480
Yeah, that's a huge, that's huge.
49:35.480 --> 49:38.240
So what are the current limitations of OpenPilot?
49:38.240 --> 49:41.720
What are the main problems that still need to be solved?
49:41.720 --> 49:45.440
We're hopefully fixing a few of them in 0.6.
49:45.440 --> 49:48.400
We're not as good as autopilot at stopped cars.
49:49.440 --> 49:54.240
So if you're coming up to a red light at like 55,
49:55.200 --> 49:56.880
so it's the radar stopped car problem,
49:56.880 --> 49:59.200
which is responsible for two autopilot accidents,
49:59.200 --> 50:01.480
it's hard to differentiate a stopped car
50:01.480 --> 50:03.640
from, like, a signpost.
50:03.640 --> 50:05.320
Yeah, static object.
50:05.320 --> 50:07.520
So you have to fuse, you have to do this visually.
50:07.520 --> 50:09.600
There's no way from the radar data to tell the difference.
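A sketch of the kind of gating that fusion implies: a stationary radar return is only accepted as a lead car when vision agrees there is a vehicle there. The interface and thresholds are invented for illustration.

```python
def accept_stationary_radar(radar_xy, radar_speed_mps, vehicle_xy_detections,
                            gate_m=2.0):
    """Radar alone can't tell a stopped car from a signpost, so accept a
    stationary return only when a vision detection lands within a small
    gate of it. Moving targets are trusted from radar alone."""
    if abs(radar_speed_mps) > 0.5:  # clearly moving: radar is unambiguous
        return True
    rx, ry = radar_xy
    return any((rx - vx) ** 2 + (ry - vy) ** 2 < gate_m ** 2
               for vx, vy in vehicle_xy_detections)
```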
50:09.600 --> 50:10.680
Maybe you can make a map,
50:10.680 --> 50:13.840
but I don't really believe in mapping at all anymore.
50:13.840 --> 50:14.920
Wait, wait, wait, what?
50:14.920 --> 50:16.040
You don't believe in mapping?
50:16.040 --> 50:16.880
No.
50:16.880 --> 50:21.120
So you're basically, the OpenPilot solution is saying,
50:21.120 --> 50:22.480
react to the environment as you see it,
50:22.480 --> 50:24.480
just like human beings do.
50:24.480 --> 50:26.200
And then eventually, when you want to do navigate
50:26.200 --> 50:30.400
on OpenPilot, I'll train the net to look at Waze.
50:30.400 --> 50:31.360
I'll run Waze in the background,
50:31.360 --> 50:32.200
I'll train on it along the way.
50:32.200 --> 50:33.560
Are you using GPS at all?
50:33.560 --> 50:34.840
We use it to ground truth.
50:34.840 --> 50:37.440
We use it to very carefully ground truth the paths.
50:37.440 --> 50:39.560
We have a stack which can recover relative
50:39.560 --> 50:41.800
to 10 centimeters over one minute.
50:41.800 --> 50:43.440
And then we use that to ground truth
50:43.440 --> 50:45.880
exactly where the car went in that local part
50:45.880 --> 50:47.800
of the environment, but it's all local.
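Not the actual stack, but one simple way to see how a locally accurate relative path gets anchored to GPS for ground truthing is a rigid Kabsch fit of the odometry onto the noisy fixes:

```python
import numpy as np

def align_to_gps(odom_xy, gps_xy):
    """Rigidly fit a locally accurate odometry path onto noisy GPS fixes
    (Kabsch). Illustrative only; not comma's actual ground-truthing stack.
    odom_xy, gps_xy: (N, 2) arrays sampled at the same timestamps."""
    P = odom_xy - odom_xy.mean(axis=0)   # relative path, locally smooth
    Q = gps_xy - gps_xy.mean(axis=0)     # global fixes, noisy but anchored
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # keep a proper rotation
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    return P @ R.T + gps_xy.mean(axis=0)    # path expressed in the GPS frame
```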
50:47.800 --> 50:49.160
How are you testing in general?
50:49.160 --> 50:51.400
Just for yourself, like experiments and stuff.
50:51.400 --> 50:54.000
Where are you located?
50:54.000 --> 50:54.840
San Diego.
50:54.840 --> 50:55.680
San Diego.
50:55.680 --> 50:56.520
Yeah.
50:56.520 --> 50:57.360
Okay.
50:57.360 --> 50:59.760
So you basically drive around there,
50:59.760 --> 51:02.200
collect some data and watch the performance?
51:02.200 --> 51:04.800
We have a simulator now and we have,
51:04.800 --> 51:06.440
our simulator is really cool.
51:06.440 --> 51:08.120
Our simulator is not,
51:08.120 --> 51:09.720
it's not like a Unity based simulator.
51:09.720 --> 51:11.840
Our simulator lets us load in real state.
51:12.880 --> 51:13.720
What do you mean?
51:13.720 --> 51:16.760
We can load in a drive and simulate
51:16.760 --> 51:20.280
what the system would have done on the historical data.
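In spirit, that kind of simulator is a log replayer: feed the recorded drive back through the current model and score the disagreement with what was actually driven. A toy sketch with an invented log format:

```python
def replay_drive(log, model):
    """Re-run the current model over a logged drive. The per-frame dict
    and model interface are invented; the real tool replays full sensor
    streams rather than tidy records like these."""
    errs = []
    for frame in log:  # e.g. {"image": ..., "car_state": ..., "logged_steer": ...}
        steer = model(frame["image"], frame["car_state"])
        errs.append(abs(steer - frame["logged_steer"]))
    return sum(errs) / len(errs)  # mean steering disagreement over the drive
```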
51:20.280 --> 51:21.480
Ooh, nice.
51:22.520 --> 51:24.360
Interesting.
51:24.360 --> 51:26.080
Right now we're only using it for testing,
51:26.080 --> 51:28.640
but as soon as we start using it for training.
51:28.640 --> 51:29.480
That's it.
51:29.480 --> 51:30.840
That's all set up for us.
51:30.840 --> 51:33.040
What's your feeling about the real world versus simulation?
51:33.040 --> 51:34.320
Do you like simulation for training?
51:34.320 --> 51:35.720
If this moves to training?
51:35.720 --> 51:40.040
So we have to distinguish two types of simulators, right?
51:40.040 --> 51:44.720
There's a simulator that like is completely fake.
51:44.720 --> 51:46.720
I could get my car to drive around in GTA.
51:47.800 --> 51:51.080
I feel that this kind of simulator is useless.
51:51.880 --> 51:53.640
You're never, there's so many.
51:54.640 --> 51:57.000
My analogy here is like, okay, fine.
51:57.000 --> 51:59.920
You're not solving the computer vision problem,
51:59.920 --> 52:02.440
but you're solving the computer graphics problem.
52:02.440 --> 52:03.280
Right.
52:03.280 --> 52:04.600
And you don't think you can get very far
52:04.600 --> 52:08.040
by creating ultra realistic graphics?
52:08.040 --> 52:10.360
No, because you can create ultra realistic graphics
52:10.360 --> 52:13.160
of the road, now create ultra realistic behavioral models
52:13.160 --> 52:14.600
of the other cars.
52:14.600 --> 52:16.920
Oh, well, I'll just use myself driving.
52:16.920 --> 52:18.280
No, you won't.
52:18.280 --> 52:21.640
You need real, you need actual human behavior
52:21.640 --> 52:23.320
because that's what you're trying to learn.
52:23.320 --> 52:25.840
The driving does not have a spec.
52:25.840 --> 52:29.920
The definition of driving is what humans do when they drive.
52:29.920 --> 52:32.800
Whatever Waymo does, I don't think it's driving.
52:32.800 --> 52:33.640
Right.
52:33.640 --> 52:36.400
Well, I think actually Waymo and others,
52:36.400 --> 52:38.920
if there's any use for reinforcement learning,
52:38.920 --> 52:40.360
I've seen it used quite well.
52:40.360 --> 52:41.640
I studied pedestrians a lot too,
52:41.640 --> 52:44.360
is try to train models from real data
52:44.360 --> 52:46.920
of how pedestrians move and try to use reinforcement learning
52:46.920 --> 52:50.040
models to make pedestrians move in human like ways.
52:50.040 --> 52:53.520
By that point, you've already gone so many layers,
52:53.520 --> 52:55.680
you detected a pedestrian.
52:55.680 --> 52:59.640
Did you hand code the feature vector of their state?
52:59.640 --> 53:00.480
Right.
53:00.480 --> 53:02.880
Did you guys learn anything from computer vision
53:02.880 --> 53:04.600
before deep learning?
53:04.600 --> 53:07.160
Well, okay, I feel like this is...
53:07.160 --> 53:10.840
So perception to you is the sticking point.
53:10.840 --> 53:13.760
I mean, what's the hardest part of the stack here?
53:13.760 --> 53:18.760
There is no human understandable feature vector
53:19.680 --> 53:22.000
separating perception and planning.
53:23.040 --> 53:25.120
That's the best way I can put that.
53:25.120 --> 53:25.960
There is no...
53:25.960 --> 53:29.600
So it's all together and it's a joint problem.
53:29.600 --> 53:31.480
So you can take localization.
53:31.480 --> 53:32.960
Localization and planning,
53:32.960 --> 53:34.760
there is a human understandable feature vector
53:34.760 --> 53:36.000
between these two things.
53:36.000 --> 53:38.720
I mean, okay, so I have like three degrees position,
53:38.720 --> 53:40.560
three degrees orientation and those derivatives,
53:40.560 --> 53:42.000
maybe those second derivatives, right?
53:42.000 --> 53:44.520
That's human understandable, that's physical.
53:44.520 --> 53:48.560
The one between perception and planning.
53:49.520 --> 53:53.600
So like Waymo has a perception stack and then a planner.
53:53.600 --> 53:55.560
And one of the things Waymo does right
53:55.560 --> 54:00.000
is they have a simulator that can separate those two.
54:00.000 --> 54:02.920
They can like replay their perception data
54:02.920 --> 54:03.920
and test their system,
54:03.920 --> 54:04.880
which is what I'm talking about
54:04.880 --> 54:06.520
about like the two different kinds of simulators.
54:06.520 --> 54:08.240
There's the kind that can work on real data
54:08.240 --> 54:10.920
and there's the kind that can't work on real data.
54:10.920 --> 54:13.880
Now, the problem is that I don't think
54:13.880 --> 54:16.160
you can hand code a feature vector, right?
54:16.160 --> 54:17.360
Like you have some list of like,
54:17.360 --> 54:19.040
well, here's my list of cars in the scenes.
54:19.040 --> 54:21.280
Here's my list of pedestrians in the scene.
54:21.280 --> 54:23.240
This isn't what humans are doing.
54:23.240 --> 54:24.920
What are humans doing?
54:24.920 --> 54:25.760
Global.
54:27.200 --> 54:28.040
Some, some.
54:28.040 --> 54:31.960
You're saying that's too difficult to hand engineer.
54:31.960 --> 54:34.120
I'm saying that there is no state vector.
54:34.120 --> 54:36.560
Given a perfect, I could give you the best team
54:36.560 --> 54:38.520
of engineers in the world to build a perception system
54:38.520 --> 54:40.640
and the best team to build a planner.
54:40.640 --> 54:42.640
All you have to do is define the state vector
54:42.640 --> 54:43.960
that separates those two.
54:43.960 --> 54:48.560
I'm missing the state vector that separates those two.
54:48.560 --> 54:49.400
What do you mean?
54:49.400 --> 54:54.000
So what is the output of your perception system?
54:54.000 --> 54:56.880
Output of the perception system.
54:56.880 --> 55:01.560
It's, there's, okay, well, there's several ways to do it.
55:01.560 --> 55:03.840
One is the SLAM component, localization.
55:03.840 --> 55:05.920
The other is drivable area, drivable space.
55:05.920 --> 55:06.760
Drivable space, yep.
55:06.760 --> 55:09.000
And then there's the different objects in the scene.
55:09.000 --> 55:09.840
Yep.
55:11.000 --> 55:16.000
And different objects in the scene over time maybe
55:16.000 --> 55:18.720
to give you input to then try to start
55:18.720 --> 55:21.560
modeling the trajectories of those objects.
55:21.560 --> 55:22.400
Sure.
55:22.400 --> 55:23.240
That's it.
55:23.240 --> 55:25.160
I can give you a concrete example of something you missed.
55:25.160 --> 55:26.000
What's that?
55:26.000 --> 55:28.640
So say there's a bush in the scene.
55:28.640 --> 55:30.920
Humans understand that when they see this bush
55:30.920 --> 55:34.680
that there may or may not be a car behind that bush.
55:34.680 --> 55:37.280
Drivable area and a list of objects does not include that.
55:37.280 --> 55:38.920
Humans are doing this constantly
55:38.920 --> 55:40.920
at the simplest intersections.
55:40.920 --> 55:43.880
So now you have to talk about occluded area.
55:43.880 --> 55:44.720
Right.
55:44.720 --> 55:47.800
Right, but even that, what do you mean by occluded?
55:47.800 --> 55:49.640
Okay, so I can't see it.
55:49.640 --> 55:51.840
Well, if it's the other side of a house, I don't care.
55:51.840 --> 55:53.560
What's the likelihood that there's a car
55:53.560 --> 55:55.280
in that occluded area, right?
55:55.280 --> 55:58.080
And if you say, okay, we'll add that,
55:58.080 --> 56:00.680
I can come up with 10 more examples that you can't add.
56:01.680 --> 56:03.960
Certainly occluded area would be something
56:03.960 --> 56:06.760
that simulator would have because it's simulating
56:06.760 --> 56:11.320
the entire, you know, occlusion is part of it.
56:11.320 --> 56:12.680
Occlusion is part of a vision stack.
56:12.680 --> 56:13.520
Vision stack.
56:13.520 --> 56:16.600
But what I'm saying is if you have a hand engineered,
56:16.600 --> 56:20.040
if your perception system output can be written
56:20.040 --> 56:22.240
in a spec document, it is incomplete.
56:23.120 --> 56:27.800
Yeah, I mean, I certainly, it's hard to argue with that
56:27.800 --> 56:30.120
because in the end, that's going to be true.
56:30.120 --> 56:31.760
Yeah, and I'll tell you what the output
56:31.760 --> 56:32.720
of our perception system is.
56:32.720 --> 56:33.560
What's that?
56:33.560 --> 56:37.120
It's a 1024 dimensional vector.
56:37.120 --> 56:38.000
Transparent neural net.
56:38.000 --> 56:39.000
Oh, you know that.
56:39.000 --> 56:42.000
No, that's the 1024 dimensions of who knows what.
56:43.520 --> 56:45.160
Because it's operating on real data.
56:45.160 --> 56:47.000
Yeah.
56:47.000 --> 56:48.320
And that's the perception.
56:48.320 --> 56:50.360
That's the perception state, right?
56:50.360 --> 56:53.520
Think about an autoencoder for faces, right?
56:53.520 --> 56:54.720
If you have an autoencoder for faces
56:54.720 --> 56:59.720
and you say it has 256 dimensions in the middle,
56:59.720 --> 57:00.680
and I'm taking a face over here
57:00.680 --> 57:02.800
and projecting it to a face over here.
57:02.800 --> 57:05.360
Can you hand label all 256 of those dimensions?
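For concreteness, a minimal PyTorch autoencoder with a 256-dimensional bottleneck; the 64x64 grayscale input is an assumption of the sketch. The point of the analogy is that the network uses those 256 numbers fluently while no human can write a spec for them.

```python
import torch.nn as nn

class FaceAutoencoder(nn.Module):
    """Toy face autoencoder: 256 unlabelable dimensions in the middle."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64, 1024), nn.ReLU(),
            nn.Linear(1024, 256),          # the bottleneck with no spec
        )
        self.decoder = nn.Sequential(
            nn.Linear(256, 1024), nn.ReLU(),
            nn.Linear(1024, 64 * 64),
            nn.Unflatten(1, (64, 64)),
        )

    def forward(self, x):  # x: (batch, 64, 64)
        return self.decoder(self.encoder(x))
```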
57:06.280 --> 57:09.240
Well, no, but those are generated automatically.
57:09.240 --> 57:11.360
But even if you tried to do it by hand,
57:11.360 --> 57:15.520
could you come up with a spec between your encoder
57:15.520 --> 57:16.360
and your decoder?
57:17.400 --> 57:20.720
No, no, because it wasn't designed, but they're...
57:20.720 --> 57:22.600
No, no, no, but if you could design it,
57:23.600 --> 57:26.480
if you could design a face reconstructor system,
57:26.480 --> 57:28.080
could you come up with a spec?
57:29.240 --> 57:32.320
No, but I think we're missing here a little bit.
57:32.320 --> 57:35.080
I think you're just being very poetic
57:35.080 --> 57:37.880
about expressing a fundamental problem of simulators,
57:37.880 --> 57:41.640
that they are going to be missing so much
57:42.480 --> 57:44.680
that the features actually
57:44.680 --> 57:47.080
would just look fundamentally different
57:47.080 --> 57:50.440
in the simulated world than in the real world.
57:51.280 --> 57:53.800
I'm not making a claim about simulators.
57:53.800 --> 57:57.120
I'm making a claim about the spec division
57:57.120 --> 57:58.800
between perception and planning.
57:58.800 --> 57:59.640
And planning.
57:59.640 --> 58:00.840
Even in your system.
58:00.840 --> 58:01.800
Just in general.
58:01.800 --> 58:03.360
Right, just in general.
58:03.360 --> 58:05.680
If you're trying to build a car that drives,
58:05.680 --> 58:07.280
if you're trying to hand code
58:07.280 --> 58:08.760
the output of your perception system,
58:08.760 --> 58:10.960
like saying, here's a list of all the cars in the scene.
58:10.960 --> 58:11.920
Here's a list of all the people.
58:11.920 --> 58:13.120
Here's a list of the occluded areas.
58:13.120 --> 58:14.920
Here's a vector of drivable areas.
58:14.920 --> 58:16.600
It's insufficient.
58:16.600 --> 58:18.000
And if you start to believe that,
58:18.000 --> 58:20.840
you realize that what Waymo and Cruise are doing is impossible.
58:20.840 --> 58:24.320
Currently, what we're doing is the perception problem
58:24.320 --> 58:28.200
is converting the scene into a chessboard.
58:29.200 --> 58:31.720
And then you reason some basic reasoning
58:31.720 --> 58:33.400
around that chessboard.
58:33.400 --> 58:38.080
And you're saying that really there's a lot missing there.
58:38.080 --> 58:40.240
First of all, why are we talking about this?
58:40.240 --> 58:42.840
Because isn't this a full autonomy?
58:42.840 --> 58:44.720
Is this something you think about?
58:44.720 --> 58:47.680
Oh, I want to win self driving cars.
58:47.680 --> 58:52.680
So your definition of win includes the full level five.
58:53.680 --> 58:55.800
I don't think level four is a real thing.
58:55.800 --> 58:59.720
I want to build the AlphaGo of driving.
58:59.720 --> 59:04.720
So AlphaGo is really end to end.
59:06.160 --> 59:07.000
Yeah.
59:07.000 --> 59:09.840
Is, yeah, it's end to end.
59:09.840 --> 59:12.480
And do you think this whole problem,
59:12.480 --> 59:14.680
is that also kind of what you're getting at
59:14.680 --> 59:16.640
with the perception and the planning?
59:16.640 --> 59:19.440
Is that this whole problem, the right way to do it,
59:19.440 --> 59:21.600
is really to learn the entire thing?
59:21.600 --> 59:23.680
I'll argue that not only is it the right way,
59:23.680 --> 59:27.640
it's the only way that's going to exceed human performance.
59:27.640 --> 59:29.960
Well, it's certainly true for Go.
59:29.960 --> 59:31.520
Everyone who tried to hand code Go things
59:31.520 --> 59:33.440
built human inferior things.
59:33.440 --> 59:36.200
And then someone came along and wrote some 10,000 line thing
59:36.200 --> 59:39.800
that doesn't know anything about Go that beat everybody.
59:39.800 --> 59:41.080
It's 10,000 lines.
59:41.080 --> 59:43.360
True, in that sense.
59:43.360 --> 59:47.520
The open question then that maybe I can ask you
59:47.520 --> 59:52.520
is driving is much harder than Go.
59:53.440 --> 59:56.240
The open question is how much harder?
59:56.240 --> 59:59.480
So how, because I think the Elon Musk approach here
59:59.480 --> 1:00:01.600
with planning and perception is similar
1:00:01.600 --> 1:00:02.960
to what you're describing,
1:00:02.960 --> 1:00:07.960
which is really turning it into not some kind of modular thing,
1:00:08.280 --> 1:00:11.120
but really formulating it as a learning problem
1:00:11.120 --> 1:00:13.360
and solving the learning problem with scale.
1:00:13.360 --> 1:00:17.120
So how many years, part one,
1:00:17.120 --> 1:00:18.880
how many years would it take to solve this problem,
1:00:18.880 --> 1:00:21.680
or just how hard is this freaking problem?
1:00:21.680 --> 1:00:24.560
Well, the cool thing is,
1:00:24.560 --> 1:00:27.800
I think there's a lot of value
1:00:27.800 --> 1:00:29.840
that we can deliver along the way.
1:00:30.840 --> 1:00:35.840
I think that you can build lane keeping assist,
1:00:36.600 --> 1:00:41.440
actually, plus adaptive cruise control, plus, okay,
1:00:41.440 --> 1:00:46.000
looking at Waze, extends to like all of driving.
1:00:46.000 --> 1:00:47.920
Yeah, most of driving, right?
1:00:47.920 --> 1:00:49.760
Oh, your adaptive cruise control treats red lights
1:00:49.760 --> 1:00:51.200
like cars, okay.
1:00:51.200 --> 1:00:53.480
So let's jump around. You mentioned
1:00:53.480 --> 1:00:55.760
that you didn't like Navigate on Autopilot.
1:00:55.760 --> 1:00:57.760
What advice, how would you make it better?
1:00:57.760 --> 1:01:00.560
Do you think as a feature that if it's done really well,
1:01:00.560 --> 1:01:02.360
it's a good feature?
1:01:02.360 --> 1:01:07.360
I think that it's too reliant on like hand coded hacks
1:01:07.520 --> 1:01:10.400
for like, how does Navigate on Autopilot do a lane change?
1:01:10.400 --> 1:01:13.400
It actually does the same lane change every time
1:01:13.400 --> 1:01:14.320
and it feels mechanical.
1:01:14.320 --> 1:01:15.920
Humans do different lane changes.
1:01:15.920 --> 1:01:17.360
Humans, sometimes we'll do a slow one,
1:01:17.360 --> 1:01:18.920
sometimes do a fast one.
1:01:18.920 --> 1:01:20.880
Navigate on Autopilot, at least every time I use it,
1:01:20.880 --> 1:01:23.040
does the identical lane change.
1:01:23.040 --> 1:01:24.280
How do you learn?
1:01:24.280 --> 1:01:26.800
I mean, this is a fundamental thing actually
1:01:26.800 --> 1:01:30.400
is the braking and accelerating,
1:01:30.400 --> 1:01:33.960
something that still, Tesla probably does it better
1:01:33.960 --> 1:01:36.800
than most cars, but it still doesn't do a great job
1:01:36.800 --> 1:01:39.960
of creating a comfortable natural experience
1:01:39.960 --> 1:01:42.680
and Navigate on Autopilot's lane changes
1:01:42.680 --> 1:01:44.120
are just an extension of that.
1:01:44.120 --> 1:01:49.120
So how do you learn to do natural lane change?
1:01:49.120 --> 1:01:52.920
So we have it and I can talk about how it works.
1:01:52.920 --> 1:01:57.920
So I feel that we have the solution for lateral
1:01:58.720 --> 1:02:00.640
but we don't yet have the solution for longitudinal.
1:02:00.640 --> 1:02:03.360
There's a few reasons longitudinal is harder than lateral.
1:02:03.360 --> 1:02:06.920
The lane change component, the way that we train on it
1:02:06.920 --> 1:02:10.840
very simply is like our model has an input
1:02:10.840 --> 1:02:14.040
for whether it's doing a lane change or not.
1:02:14.040 --> 1:02:16.360
And then when we train the end to end model,
1:02:16.360 --> 1:02:19.560
we hand label all the lane changes because you have to.
1:02:19.560 --> 1:02:22.440
I've struggled a long time about not wanting to do that
1:02:22.440 --> 1:02:24.280
but I think you have to.
1:02:24.280 --> 1:02:25.320
Or the training data.
1:02:25.320 --> 1:02:26.520
For the training data, right?
1:02:26.520 --> 1:02:28.280
We actually have an automatic ground truther,
1:02:28.280 --> 1:02:30.600
which automatically labels all the lane changes.
1:02:30.600 --> 1:02:31.680
Was that possible?
1:02:31.680 --> 1:02:32.720
To automatically label lane changes?
1:02:32.720 --> 1:02:33.560
Yeah.
1:02:33.560 --> 1:02:34.800
And detect the lane lines and see when it crosses them, right?
1:02:34.800 --> 1:02:36.680
And I don't have to get that high of a percent accuracy,
1:02:36.680 --> 1:02:38.080
but it's like 95%, good enough.
1:02:38.080 --> 1:02:38.960
Okay.
1:02:38.960 --> 1:02:43.200
Now I set the bit when it's doing the lane change
1:02:43.200 --> 1:02:44.840
in the end to end learning.
1:02:44.840 --> 1:02:47.920
And then I set it to zero when it's not doing a lane change.
1:02:47.920 --> 1:02:49.720
So now if I want it to do a lane change at test time,
1:02:49.720 --> 1:02:52.360
I just put the bit to a one and it'll do a lane change.
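A sketch of that conditioning trick: concatenate the lane-change bit to the vision features during end-to-end training, with the bit supplied by the automatic labeler, then set it to one at test time to request a change. Feature sizes and the output head are invented.

```python
import torch
import torch.nn as nn

class LaneChangePolicy(nn.Module):
    """Only the lane-change-bit conditioning is from the conversation;
    the feature size and the output head are assumptions."""

    def __init__(self, feat_dim=1024):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(feat_dim + 1, 256), nn.ReLU(),
            nn.Linear(256, 2),  # e.g. parameters of a lateral path
        )

    def forward(self, vision_feat, lane_change_bit):
        # Training: the bit comes from the automatic labeler.
        # Test time: set the bit to 1 to request a context-appropriate change.
        x = torch.cat([vision_feat, lane_change_bit[:, None]], dim=1)
        return self.head(x)

# policy = LaneChangePolicy()
# plan = policy(torch.randn(8, 1024), torch.zeros(8))  # bit off: lane keep
```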
1:02:52.360 --> 1:02:54.640
Yeah, but so if you look at the space of lane changes,
1:02:54.640 --> 1:02:57.320
you know some percentage, not a hundred percent,
1:02:57.320 --> 1:03:01.120
that we make as humans is not a pleasant experience
1:03:01.120 --> 1:03:02.800
because we messed some part of it up.
1:03:02.800 --> 1:03:04.320
It's nerve wracking to change.
1:03:04.320 --> 1:03:05.760
If you look, you have to see,
1:03:05.760 --> 1:03:06.920
it has to accelerate.
1:03:06.920 --> 1:03:09.920
How do we label the ones that are natural and feel good?
1:03:09.920 --> 1:03:11.560
You know, that's the,
1:03:11.560 --> 1:03:13.360
because that's your ultimate criticism,
1:03:13.360 --> 1:03:17.000
the current Navigate on Autopilot just doesn't feel good.
1:03:17.000 --> 1:03:18.520
Well, the current Navigate on Autopilot
1:03:18.520 --> 1:03:21.720
is a hand coded policy written by an engineer in a room
1:03:21.720 --> 1:03:25.080
who probably went out and tested it a few times on the 280.
1:03:25.080 --> 1:03:28.560
Probably a more, a better version of that.
1:03:28.560 --> 1:03:29.400
But yes.
1:03:29.400 --> 1:03:30.560
That's how we would have written it.
1:03:30.560 --> 1:03:31.400
Yeah.
1:03:31.400 --> 1:03:33.480
Maybe at Tesla, they tested it in.
1:03:33.480 --> 1:03:34.920
That might have been two engineers.
1:03:34.920 --> 1:03:35.760
Two engineers.
1:03:35.760 --> 1:03:37.400
Yeah.
1:03:37.400 --> 1:03:40.120
No, but so if you learn the lane change,
1:03:40.120 --> 1:03:42.480
if you learn how to do a lane change from data,
1:03:42.480 --> 1:03:44.680
just like you have a label that says lane change
1:03:44.680 --> 1:03:48.040
and then you put it in when you want it to do the lane change,
1:03:48.040 --> 1:03:49.640
it'll automatically do the lane change
1:03:49.640 --> 1:03:51.600
that's appropriate for the situation.
1:03:51.600 --> 1:03:54.720
Now, to get at the problem of some humans
1:03:54.720 --> 1:03:55.960
do bad lane changes,
1:03:57.400 --> 1:03:59.920
we haven't worked too much on this problem yet.
1:03:59.920 --> 1:04:03.120
It's not that much of a problem in practice.
1:04:03.120 --> 1:04:06.160
My theory is that all good drivers are good in the same way
1:04:06.160 --> 1:04:08.440
and all bad drivers are bad in different ways.
1:04:09.360 --> 1:04:11.320
And we've seen some data to back this up.
1:04:11.320 --> 1:04:12.400
Well, beautifully put.
1:04:12.400 --> 1:04:16.560
So you just basically, if that hypothesis is true,
1:04:16.560 --> 1:04:19.920
then your task is to discover the good drivers.
1:04:19.920 --> 1:04:21.800
The good drivers stand out
1:04:21.800 --> 1:04:23.360
because they're in one cluster
1:04:23.360 --> 1:04:25.200
and the bad drivers are scattered all over the place
1:04:25.200 --> 1:04:27.240
and your net learns the cluster.
1:04:27.240 --> 1:04:28.080
Yeah.
1:04:28.080 --> 1:04:30.800
So you just learn from the good drivers
1:04:30.800 --> 1:04:32.200
and they're easy to cluster.
1:04:33.200 --> 1:04:34.240
In fact, we learned from all of them
1:04:34.240 --> 1:04:35.840
and the net automatically learns the policy
1:04:35.840 --> 1:04:36.920
that's like the majority.
1:04:36.920 --> 1:04:38.440
But we'll eventually probably have to filter some out.
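One way to probe that hypothesis: featurize each lane change, with something like duration and peak lateral acceleration, cluster, and treat far-from-center points as the scattered bad styles. The features, k, and the cutoff are all assumptions, not comma's method.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_lane_changes(features, k=4):
    """Cluster lane-change feature vectors; keep points near a center,
    flag scattered outliers. k and the 80th-percentile cutoff are
    invented for illustration."""
    km = KMeans(n_clusters=k, n_init=10).fit(features)
    dists = np.linalg.norm(features - km.cluster_centers_[km.labels_], axis=1)
    inliers = dists < np.percentile(dists, 80)  # scattered points = "bad" styles
    return km.labels_, inliers
```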
1:04:38.440 --> 1:04:41.560
So if that theory is true, I hope it's true
1:04:41.560 --> 1:04:46.440
because the counter theory is there are many clusters,
1:04:49.480 --> 1:04:53.680
maybe arbitrarily many clusters of good drivers.
1:04:53.680 --> 1:04:55.840
Because if there's one cluster of good drivers,
1:04:55.840 --> 1:04:57.600
you can at least discover a set of policies.
1:04:57.600 --> 1:04:59.000
You can learn a set of policies
1:04:59.000 --> 1:05:00.640
which would be good universally.
1:05:00.640 --> 1:05:01.640
Yeah.
1:05:01.640 --> 1:05:04.560
That would be nice if it's true.
1:05:04.560 --> 1:05:06.560
And you're saying that there is some evidence that...
1:05:06.560 --> 1:05:09.720
Let's say lane changes can be clustered into four clusters.
1:05:09.720 --> 1:05:10.560
Right.
1:05:10.560 --> 1:05:12.040
There's a finite level of...
1:05:12.040 --> 1:05:15.280
I would argue that all four of those are good clusters.
1:05:15.280 --> 1:05:18.360
All the things that are random are noise and probably bad.
1:05:18.360 --> 1:05:20.360
And which one of the four you pick?
1:05:20.360 --> 1:05:21.920
Or maybe it's 10 or maybe it's 20.
1:05:21.920 --> 1:05:22.760
You can learn that.
1:05:22.760 --> 1:05:23.800
It's context dependent.
1:05:23.800 --> 1:05:25.040
It depends on the scene.
1:05:26.760 --> 1:05:30.440
And the hope is it's not too dependent on the driver.
1:05:31.400 --> 1:05:34.240
Yeah, the hope is that it all washes out.
1:05:34.240 --> 1:05:36.960
The hope is that the distribution is not bimodal.
1:05:36.960 --> 1:05:39.080
The hope is that it's a nice Gaussian.
1:05:39.080 --> 1:05:41.640
So what advice would you give to Tesla?
1:05:41.640 --> 1:05:45.000
How to fix, how to improve, Navigate on Autopilot?
1:05:45.000 --> 1:05:48.240
Those are the lessons that you've learned from Comma AI.
1:05:48.240 --> 1:05:50.560
The only real advice I would give to Tesla
1:05:50.560 --> 1:05:52.920
is please put driver monitoring in your cars.
1:05:53.920 --> 1:05:55.160
With respect to improving it.
1:05:55.160 --> 1:05:56.000
You can't do that anymore.
1:05:56.000 --> 1:05:57.280
I started to interrupt.
1:05:57.280 --> 1:06:01.760
But there's a practical nature of many hundreds of thousands
1:06:01.760 --> 1:06:05.760
of cars being produced that don't have a good driver facing camera.
1:06:05.760 --> 1:06:07.520
The Model 3 has a selfie cam.
1:06:07.520 --> 1:06:08.680
Is it not good enough?
1:06:08.680 --> 1:06:10.800
Did they not put IR LEDs in for night?
1:06:10.800 --> 1:06:11.640
That's a good question.
1:06:11.640 --> 1:06:13.360
But I do know that it's fish eye
1:06:13.360 --> 1:06:15.800
and it's relatively low resolution.
1:06:15.800 --> 1:06:16.760
So it's really not designed.
1:06:16.760 --> 1:06:18.760
It wasn't designed for driver monitoring.
1:06:18.760 --> 1:06:21.760
You can hope that you can kind of scrape up
1:06:21.760 --> 1:06:24.400
and have something from it.
1:06:24.400 --> 1:06:27.520
But why didn't they put it in today?
1:06:27.520 --> 1:06:28.280
Put it in today.
1:06:28.280 --> 1:06:29.520
Put it in today.
1:06:29.520 --> 1:06:31.520
Every time I've heard Karpathy talk about the problem
1:06:31.520 --> 1:06:33.240
and talking about like software 2.0
1:06:33.240 --> 1:06:35.240
and how the machine learning is gobbling up everything,
1:06:35.240 --> 1:06:37.440
I think this is absolutely the right strategy.
1:06:37.440 --> 1:06:40.160
I think that he didn't write Navigate on Autopilot.
1:06:40.160 --> 1:06:43.240
I think somebody else did and kind of hacked it on top of that stuff.
1:06:43.240 --> 1:06:45.680
I think when Karpathy says, wait a second,
1:06:45.680 --> 1:06:47.440
why did we hand code this lane change policy
1:06:47.440 --> 1:06:48.360
with all these magic numbers?
1:06:48.360 --> 1:06:49.360
We're going to learn it from data.
1:06:49.360 --> 1:06:49.840
They'll fix it.
1:06:49.840 --> 1:06:51.040
They already know what to do there.
1:06:51.040 --> 1:06:54.360
Well, that's Andrej's job, to turn everything
1:06:54.360 --> 1:06:57.480
into a learning problem and collect a huge amount of data.
1:06:57.480 --> 1:07:01.120
The reality is, though, not every problem
1:07:01.120 --> 1:07:04.080
can be turned into a learning problem in the short term.
1:07:04.080 --> 1:07:07.280
In the end, everything will be a learning problem.
1:07:07.280 --> 1:07:12.880
The reality is, like if you want to build L5 vehicles today,
1:07:12.880 --> 1:07:15.600
it will likely involve no learning.
1:07:15.600 --> 1:07:20.320
And that's the reality. So at which point does learning start?
1:07:20.320 --> 1:07:23.480
It's the crutch statement that LiDAR is a crutch.
1:07:23.480 --> 1:07:27.240
At which point will learning get up to par with human performance?
1:07:27.240 --> 1:07:31.960
It's over human performance on ImageNet classification.
1:07:31.960 --> 1:07:34.000
On driving, it's a question still.
1:07:34.000 --> 1:07:35.760
It is a question.
1:07:35.760 --> 1:07:39.160
I'll say this, I'm here to play for 10 years.
1:07:39.160 --> 1:07:40.280
I'm not here to try to.
1:07:40.280 --> 1:07:42.960
I'm here to play for 10 years and make money along the way.
1:07:42.960 --> 1:07:45.040
I'm not here to try to promise people
1:07:45.040 --> 1:07:47.600
that I'm going to have my L5 taxi network up and working
1:07:47.600 --> 1:07:48.200
in two years.
1:07:48.200 --> 1:07:49.400
Do you think that was a mistake?
1:07:49.400 --> 1:07:50.520
Yes.
1:07:50.520 --> 1:07:53.160
What do you think was the motivation behind saying
1:07:53.160 --> 1:07:56.640
that other companies are also promising L5 vehicles
1:07:56.640 --> 1:08:01.880
with their different approaches in 2020, 2021, 2022?
1:08:01.880 --> 1:08:05.720
If anybody would like to bet me that those things do not pan out,
1:08:05.720 --> 1:08:07.000
I will bet you.
1:08:07.000 --> 1:08:10.800
Even money, even money, I'll bet you as much as you want.
1:08:10.800 --> 1:08:13.600
So are you worried about what's going to happen?
1:08:13.600 --> 1:08:16.040
Because you're not in full agreement on that.
1:08:16.040 --> 1:08:19.160
What's going to happen when 2022, 2021 come around
1:08:19.160 --> 1:08:22.800
and nobody has fleets of autonomous vehicles?
1:08:22.800 --> 1:08:25.000
Well, you can look at the history.
1:08:25.000 --> 1:08:26.880
If you go back five years ago, they
1:08:26.880 --> 1:08:29.880
were all promised by 2018 and 2017.
1:08:29.880 --> 1:08:32.200
But they weren't that strong of promises.
1:08:32.200 --> 1:08:36.240
I mean, Ford really declared.
1:08:36.240 --> 1:08:40.560
I think not many have declared as definitively
1:08:40.560 --> 1:08:42.600
as they have now these dates.
1:08:42.600 --> 1:08:43.320
Well, OK.
1:08:43.320 --> 1:08:45.040
So let's separate L4 and L5.
1:08:45.040 --> 1:08:46.800
Do I think that it's possible for Waymo
1:08:46.800 --> 1:08:50.960
to continue to hack on their system
1:08:50.960 --> 1:08:53.400
until it gets to level four in Chandler, Arizona?
1:08:53.400 --> 1:08:55.040
Yes.
1:08:55.040 --> 1:08:56.800
No safety driver?
1:08:56.800 --> 1:08:57.600
Chandler, Arizona?
1:08:57.600 --> 1:08:59.600
Yeah.
1:08:59.600 --> 1:09:02.440
By which year are we talking about?
1:09:02.440 --> 1:09:06.120
Oh, I even think that's possible by like 2020, 2021.
1:09:06.120 --> 1:09:09.480
But level four, Chandler, Arizona, not level five,
1:09:09.480 --> 1:09:11.480
New York City.
1:09:11.480 --> 1:09:15.920
Level four, meaning some very defined streets.
1:09:15.920 --> 1:09:17.400
It works out really well.
1:09:17.400 --> 1:09:18.280
Very defined streets.
1:09:18.280 --> 1:09:20.680
And then practically, these streets are pretty empty.
1:09:20.680 --> 1:09:24.680
If most of the streets are covered in Waymos,
1:09:24.680 --> 1:09:28.360
Waymo can kind of change the definition of what driving is.
1:09:28.360 --> 1:09:28.920
Right?
1:09:28.920 --> 1:09:31.720
If your self driving network is the majority
1:09:31.720 --> 1:09:34.120
of cars in an area, they only need
1:09:34.120 --> 1:09:35.720
to be safe with respect to each other,
1:09:35.720 --> 1:09:38.640
and all the humans will need to learn to adapt to them.
1:09:38.640 --> 1:09:41.120
Now go drive in downtown New York.
1:09:41.120 --> 1:09:42.200
Oh, yeah, that's.
1:09:42.200 --> 1:09:43.440
I mean, already.
1:09:43.440 --> 1:09:46.040
You can talk about autonomy, like on farms,
1:09:46.040 --> 1:09:48.520
it already works great, because you can really just
1:09:48.520 --> 1:09:51.320
follow the GPS line.
1:09:51.320 --> 1:09:56.800
So what does success look like for Comma AI?
1:09:56.800 --> 1:09:58.200
What are the milestones like where
1:09:58.200 --> 1:09:59.800
you can sit back with some champagne
1:09:59.800 --> 1:10:04.120
and say, we did it, boys and girls?
1:10:04.120 --> 1:10:06.320
Well, it's never over.
1:10:06.320 --> 1:10:07.800
Yeah, but don't be like that.
1:10:07.800 --> 1:10:10.400
You must drink champagne every time you celebrate.
1:10:10.400 --> 1:10:11.440
So what is good?
1:10:11.440 --> 1:10:13.160
What are some wins?
1:10:13.160 --> 1:10:19.480
A big milestone that we're hoping for by mid next year
1:10:19.480 --> 1:10:20.680
is profitability of the company.
1:10:20.680 --> 1:10:28.560
And we're going to have to revisit the idea of selling
1:10:28.560 --> 1:10:30.280
a consumer product.
1:10:30.280 --> 1:10:32.720
But it's not going to be like the Comma One.
1:10:32.720 --> 1:10:36.240
When we do it, it's going to be perfect.
1:10:36.240 --> 1:10:39.600
OpenPilot has gotten so much better in the last two years.
1:10:39.600 --> 1:10:41.680
We're going to have a few features.
1:10:41.680 --> 1:10:43.760
We're going to have 100% driver monitoring.
1:10:43.760 --> 1:10:46.720
We're going to disable no safety features in the car.
1:10:46.720 --> 1:10:48.760
Actually, I think it'd be really cool what we're doing right
1:10:48.760 --> 1:10:51.600
now, our project this week is we're analyzing the data set
1:10:51.600 --> 1:10:53.240
and looking for all the AEB triggers
1:10:53.240 --> 1:10:55.640
from the manufacturer systems.
1:10:55.640 --> 1:10:59.440
We have better data set on that than the manufacturers.
1:10:59.440 --> 1:11:02.960
Does Toyota have 10 million miles of real world
1:11:02.960 --> 1:11:05.360
driving to know how many times their AEB triggered?
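Counting stock AEB triggers in logged CAN data amounts to scanning for rising edges of a flag. A sketch with a placeholder address and bit; real values differ per car and come from a DBC file.

```python
def count_aeb_triggers(can_log, aeb_addr=0x343, aeb_bit=0x01):
    """Count rising edges of the stock AEB flag in a logged CAN stream.
    The address and bit are placeholders, not any real car's values.
    can_log: iterable of (address, payload_bytes) tuples."""
    count, prev = 0, False
    for addr, data in can_log:
        if addr != aeb_addr:
            continue
        active = bool(data[0] & aeb_bit)
        if active and not prev:
            count += 1
        prev = active
    return count
```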
1:11:05.360 --> 1:11:10.880
So let me give you, because you asked, financial advice.
1:11:10.880 --> 1:11:12.440
Because I work with a lot of automakers
1:11:12.440 --> 1:11:15.840
and one possible source of money for you,
1:11:15.840 --> 1:11:21.400
which I'll be excited to see you take on, is basically
1:11:21.400 --> 1:11:29.120
selling the data, which is something that most people,
1:11:29.120 --> 1:11:31.800
and not selling in a way where it's, here you go, Automaker,
1:11:31.800 --> 1:11:33.000
but creating.
1:11:33.000 --> 1:11:35.480
We've done this actually at MIT, not for money purposes,
1:11:35.480 --> 1:11:37.760
but you could do it for significant money purposes
1:11:37.760 --> 1:11:39.440
and make the world a better place
1:11:39.440 --> 1:11:44.240
by creating a consortia where automakers would pay in
1:11:44.240 --> 1:11:46.960
and then they get to have free access to the data.
1:11:46.960 --> 1:11:52.400
And I think a lot of people are really hungry for that
1:11:52.400 --> 1:11:54.200
and would pay significant amount of money for it.
1:11:54.200 --> 1:11:55.400
Here's the problem with that.
1:11:55.400 --> 1:11:56.840
I like this idea all in theory.
1:11:56.840 --> 1:11:59.640
It'd be very easy for me to give them access to my servers.
1:11:59.640 --> 1:12:02.280
And we already have all open source tools to access this data.
1:12:02.280 --> 1:12:03.400
It's in a great format.
1:12:03.400 --> 1:12:05.560
We have a great pipeline.
1:12:05.560 --> 1:12:07.120
But they're going to put me in the room
1:12:07.120 --> 1:12:10.120
with some business development guy.
1:12:10.120 --> 1:12:12.400
And I'm going to have to talk to this guy.
1:12:12.400 --> 1:12:15.040
And he's not going to know most of the words I'm saying.
1:12:15.040 --> 1:12:17.280
I'm not willing to tolerate that.
1:12:17.280 --> 1:12:18.840
OK, Mick Jagger.
1:12:18.840 --> 1:12:19.800
No, no, no, no.
1:12:19.800 --> 1:12:21.040
But I think I agree with you.
1:12:21.040 --> 1:12:21.720
I'm the same way.
1:12:21.720 --> 1:12:22.960
But you just tell them the terms
1:12:22.960 --> 1:12:24.640
and there's no discussion needed.
1:12:24.640 --> 1:12:30.480
If I could just tell them the terms, then like, all right.
1:12:30.480 --> 1:12:31.600
Who wants access to my data?
1:12:31.600 --> 1:12:36.680
I will sell it to you for, let's say,
1:12:36.680 --> 1:12:37.640
you want a subscription?
1:12:37.640 --> 1:12:40.680
I'll sell it to you for 100k a month.
1:12:40.680 --> 1:12:41.200
Anyone?
1:12:41.200 --> 1:12:42.000
100k a month?
1:12:42.000 --> 1:12:43.040
100k a month?
1:12:43.040 --> 1:12:45.080
I'll give you access to the data as a subscription?
1:12:45.080 --> 1:12:45.680
Yeah.
1:12:45.680 --> 1:12:46.680
Yeah, I think that's kind of fair.
1:12:46.680 --> 1:12:48.440
Came up with that number off the top of my head.
1:12:48.440 --> 1:12:50.840
If somebody sends me like a three line email where it's like,
1:12:50.840 --> 1:12:54.000
we would like to pay 100k a month to get access to your data.
1:12:54.000 --> 1:12:56.160
We would agree to like reasonable privacy terms
1:12:56.160 --> 1:12:58.360
of the people who are in the data set.
1:12:58.360 --> 1:12:59.520
I would be happy to do it.
1:12:59.520 --> 1:13:01.200
But that's not going to be the email.
1:13:01.200 --> 1:13:03.120
The email is going to be, hey, do you
1:13:03.120 --> 1:13:05.560
have some time in the next month where we can sit down
1:13:05.560 --> 1:13:07.000
and we can, I don't have time for that.
1:13:07.000 --> 1:13:08.360
We're moving too fast.
1:13:08.360 --> 1:13:10.040
You could politely respond to that email,
1:13:10.040 --> 1:13:13.240
but not saying I don't have any time for your bullshit.
1:13:13.240 --> 1:13:15.440
You say, oh, well, unfortunately, these are the terms.
1:13:15.440 --> 1:13:19.280
And so this is what we try to, we brought the cost down
1:13:19.280 --> 1:13:22.320
for you in order to minimize the friction, the communication.
1:13:22.320 --> 1:13:22.920
Yeah, absolutely.
1:13:22.920 --> 1:13:26.720
Here's the whatever it is, $1, $2 million a year.
1:13:26.720 --> 1:13:28.880
And you have access.
1:13:28.880 --> 1:13:31.440
And it's not like I get that email from like,
1:13:31.440 --> 1:13:32.720
but OK, am I going to reach out?
1:13:32.720 --> 1:13:34.200
Am I going to hire a business development person
1:13:34.200 --> 1:13:35.840
who's going to reach out to the automakers?
1:13:35.840 --> 1:13:36.480
No way.
1:13:36.480 --> 1:13:36.880
Yeah.
1:13:36.880 --> 1:13:37.840
OK, I got you.
1:13:37.840 --> 1:13:38.520
I admire that.
1:13:38.520 --> 1:13:39.680
If they reached out to me, I'm not
1:13:39.680 --> 1:13:40.600
going to ignore the email.
1:13:40.600 --> 1:13:42.160
I'll come back with something like, yeah,
1:13:42.160 --> 1:13:44.560
if you're willing to pay $100,000 for access to the data,
1:13:44.560 --> 1:13:46.080
I'm happy to set that up.
1:13:46.080 --> 1:13:48.200
That's worth my engineering time.
1:13:48.200 --> 1:13:49.520
That's actually quite insightful of you.
1:13:49.520 --> 1:13:50.440
You're right.
1:13:50.440 --> 1:13:52.480
Probably because many of the automakers
1:13:52.480 --> 1:13:54.480
are quite a bit old school, there
1:13:54.480 --> 1:13:56.200
will be a need to reach out.
1:13:56.200 --> 1:13:58.440
And they want it, but there will need
1:13:58.440 --> 1:13:59.800
to be some communication.
1:13:59.800 --> 1:14:00.160
You're right.
1:14:00.160 --> 1:14:06.760
Mobileye circa 2015 had the lowest R&D spend of any chipmaker.
1:14:06.760 --> 1:14:10.640
Like per, and you look at all the people who work for them,
1:14:10.640 --> 1:14:12.120
and it's all business development people
1:14:12.120 --> 1:14:15.320
because the car companies are impossible to work with.
1:14:15.320 --> 1:14:17.880
Yeah, so you have no patience for that,
1:14:17.880 --> 1:14:20.040
and you're a legit Android, huh?
1:14:20.040 --> 1:14:21.440
I have something to do, right?
1:14:21.440 --> 1:14:24.040
Like, I don't mean to be a dick and say,
1:14:24.040 --> 1:14:25.920
I don't have patience for that, but it's like,
1:14:25.920 --> 1:14:29.160
that stuff doesn't help us with our goal of winning
1:14:29.160 --> 1:14:30.560
self driving cars.
1:14:30.560 --> 1:14:33.800
If I want money in the short term,
1:14:33.800 --> 1:14:38.040
if I showed off the actual learning tech that we have,
1:14:38.040 --> 1:14:40.160
it's somewhat sad.
1:14:40.160 --> 1:14:43.000
It's years and years ahead of everybody else's.
1:14:43.000 --> 1:14:43.720
Maybe not Tesla's.
1:14:43.720 --> 1:14:45.720
I think Tesla has similar stuff to us, actually.
1:14:45.720 --> 1:14:47.640
I think Tesla has similar stuff, but when you compare it
1:14:47.640 --> 1:14:50.920
to what the Toyota Research Institute has,
1:14:50.920 --> 1:14:53.480
you're not even close to what we have.
1:14:53.480 --> 1:14:55.840
No comments, but I also can't.
1:14:55.840 --> 1:14:58.440
I have to take your comments.
1:14:58.440 --> 1:15:01.960
I intuitively believe you, but I have
1:15:01.960 --> 1:15:04.680
to take it with a grain of salt because,
1:15:04.680 --> 1:15:07.440
I mean, you are an inspiration because you basically
1:15:07.440 --> 1:15:10.000
don't care about a lot of things that other companies care
1:15:10.000 --> 1:15:10.880
about.
1:15:10.880 --> 1:15:16.600
You don't try to bullshit, in a sense, like make up stuff,
1:15:16.600 --> 1:15:18.600
so as to drive up valuation.
1:15:18.600 --> 1:15:19.960
You're really very real, and you're
1:15:19.960 --> 1:15:22.280
trying to solve the problem, and I admire that a lot.
1:15:22.280 --> 1:15:26.520
What I can't necessarily fully trust you on, with all due
1:15:26.520 --> 1:15:28.440
respect, is how good it is, right?
1:15:28.440 --> 1:15:33.320
I can only, but I also know how bad others are.
1:15:33.320 --> 1:15:36.680
I'll say two things about, trust, but verify, right?
1:15:36.680 --> 1:15:38.040
I'll say two things about that.
1:15:38.040 --> 1:15:42.360
One is try, get in a 2020 Corolla,
1:15:42.360 --> 1:15:46.680
and try OpenPilot 0.6 when it comes out next month.
1:15:46.680 --> 1:15:48.400
I think already, you'll look at this,
1:15:48.400 --> 1:15:51.400
and you'll be like, this is already really good.
1:15:51.400 --> 1:15:54.240
And then, I could be doing that all with hand labelers
1:15:54.240 --> 1:15:58.000
and all with the same approach that Mobileye uses.
1:15:58.000 --> 1:16:00.040
When we release a model that no longer
1:16:00.040 --> 1:16:05.000
has the lanes in it, that only outputs a path,
1:16:05.000 --> 1:16:08.720
then think about how we did that machine learning,
1:16:08.720 --> 1:16:10.080
and then right away, when you see,
1:16:10.080 --> 1:16:11.240
and that's going to be in OpenPilot,
1:16:11.240 --> 1:16:13.000
that's going to be in OpenPilot before 1.0,
1:16:13.000 --> 1:16:14.400
when you see that model, you'll know
1:16:14.400 --> 1:16:15.360
that everything I'm saying is true,
1:16:15.360 --> 1:16:16.840
because how else did I get that model?
1:16:16.840 --> 1:16:17.320
Good.
1:16:17.320 --> 1:16:19.240
You'll know what I'm saying is true about the simulator.
1:16:19.240 --> 1:16:20.600
Yeah, yeah, yeah, this is super exciting.
1:16:20.600 --> 1:16:22.680
That's super exciting.
1:16:22.680 --> 1:16:25.760
But I listened to your talk with Kyle,
1:16:25.760 --> 1:16:30.480
and Kyle was originally building the aftermarket system,
1:16:30.480 --> 1:16:34.920
and he gave up on it because of technical challenges,
1:16:34.920 --> 1:16:37.360
because of the fact that he's going
1:16:37.360 --> 1:16:39.160
to have to support 20 to 50 cars.
1:16:39.160 --> 1:16:41.120
We support 45, because what is he
1:16:41.120 --> 1:16:43.440
going to do when the manufacturer AEB system triggers?
1:16:43.440 --> 1:16:45.480
We have alerts and warnings to deal with all of that
1:16:45.480 --> 1:16:48.400
and all the cars, and how is he going to formally verify it?
1:16:48.400 --> 1:16:49.800
Well, I got 10 million miles of data.
1:16:49.800 --> 1:16:53.240
It's probably better verified than the spec.
1:16:53.240 --> 1:16:57.720
Yeah, I'm glad you're here talking to me.
1:16:57.720 --> 1:17:01.120
I'll remember this day, because it's interesting.
1:17:01.120 --> 1:17:04.160
If you look at Kyle, who's from Cruise,
1:17:04.160 --> 1:17:06.320
I'm sure they have a large number of business development
1:17:06.320 --> 1:17:10.200
folks, and he's working with GM.
1:17:10.200 --> 1:17:13.280
He could work with Argo AI, which works with Ford.
1:17:13.280 --> 1:17:18.520
It's interesting, because chances that you fail businesswise,
1:17:18.520 --> 1:17:21.120
like bankrupt, are pretty high.
1:17:21.120 --> 1:17:23.880
And yet, it's the Android model,
1:17:23.880 --> 1:17:26.440
is you're actually taking on the problem.
1:17:26.440 --> 1:17:28.160
So that's really inspiring.
1:17:28.160 --> 1:17:30.920
Well, I have a long term way for comma to make money, too.
1:17:30.920 --> 1:17:34.400
And one of the nice things when you really take on the problem,
1:17:34.400 --> 1:17:36.760
which is my hope for Autopilot, for example,
1:17:36.760 --> 1:17:41.040
is things you don't expect, ways to make money,
1:17:41.040 --> 1:17:44.160
or create value that you don't expect will pop up.
1:17:44.160 --> 1:17:48.560
I've known how to do it since 2017; that's the first time I said it.
1:17:48.560 --> 1:17:50.440
Which part? You know how to do which part?
1:17:50.440 --> 1:17:52.520
Our long term plan is to be a car insurance company.
1:17:52.520 --> 1:17:53.160
Insurance.
1:17:53.160 --> 1:17:55.320
Yeah, I love it.
1:17:55.320 --> 1:17:56.680
I make driving twice as safe.
1:17:56.680 --> 1:17:57.680
Not only that, I have the best data
1:17:57.680 --> 1:18:00.040
set to know who statistically are the safest drivers.
1:18:00.040 --> 1:18:02.160
And oh, oh, we see you.
1:18:02.160 --> 1:18:03.720
We see you driving unsafely.
1:18:03.720 --> 1:18:05.360
We're not going to insure you.
1:18:05.360 --> 1:18:08.960
And that causes a bifurcation in the market,
1:18:08.960 --> 1:18:10.920
because the only people who can't get comma insurance
1:18:10.920 --> 1:18:12.760
are the bad drivers. Geico can insure them.
1:18:12.760 --> 1:18:15.360
Their premiums are crazy high, our premiums are crazy low.
1:18:15.360 --> 1:18:16.240
We win car insurance.
1:18:16.240 --> 1:18:18.120
Take over that whole market.
1:18:18.120 --> 1:18:21.560
OK, so if we win, if we win, but that's
1:18:21.560 --> 1:18:23.800
I'm saying like how do you turn comma into a $10 billion
1:18:23.800 --> 1:18:24.640
company? Is that it?
1:18:24.640 --> 1:18:25.600
That's right.
1:18:25.600 --> 1:18:30.000
So you Elon Musk, who else?
1:18:30.000 --> 1:18:32.720
Who else is thinking like this and working like this
1:18:32.720 --> 1:18:33.160
in your view?
1:18:33.160 --> 1:18:34.800
Who are the competitors?
1:18:34.800 --> 1:18:36.160
Are there people seriously?
1:18:36.160 --> 1:18:39.480
I don't think anyone that I'm aware of is seriously
1:18:39.480 --> 1:18:45.280
taking on lane keeping, like to where it's a huge business that
1:18:45.280 --> 1:18:51.400
turns eventually to full autonomy that then creates
1:18:51.400 --> 1:18:53.440
other businesses on top of it and so on.
1:18:53.440 --> 1:18:56.480
Think insurance, think all kinds of ideas like that.
1:18:56.480 --> 1:19:00.480
Do you know anyone else thinking like this?
1:19:00.480 --> 1:19:02.200
Not really.
1:19:02.200 --> 1:19:02.960
That's interesting.
1:19:02.960 --> 1:19:06.560
I mean, my sense is everybody turns to that in like four
1:19:06.560 --> 1:19:07.800
or five years.
1:19:07.800 --> 1:19:11.240
Like Ford, once the autonomy thing falls through.
1:19:11.240 --> 1:19:12.600
But at this time.
1:19:12.600 --> 1:19:14.120
Elon's the iOS.
1:19:14.120 --> 1:19:16.720
By the way, he paved the way for all of us.
1:19:16.720 --> 1:19:18.000
It's not iOS, true.
1:19:18.000 --> 1:19:21.520
I would not be doing comma AI today if it was not
1:19:21.520 --> 1:19:23.480
for those conversations with Elon.
1:19:23.480 --> 1:19:26.840
And if it were not for him saying like,
1:19:26.840 --> 1:19:28.600
I think he said like, well, obviously we're not
1:19:28.600 --> 1:19:31.280
going to use LiDAR, we use cameras, humans use cameras.
1:19:31.280 --> 1:19:32.600
So what do you think about that?
1:19:32.600 --> 1:19:33.880
How important is LiDAR?
1:19:33.880 --> 1:19:36.960
Everybody else working on L5 is using LiDAR.
1:19:36.960 --> 1:19:39.160
What are your thoughts on his provocative statement
1:19:39.160 --> 1:19:41.320
that LiDAR is a crutch?
1:19:41.320 --> 1:19:43.520
See, sometimes he'll say dumb things, like the driver
1:19:43.520 --> 1:19:45.680
monitoring thing, but sometimes he'll say absolutely
1:19:45.680 --> 1:19:48.400
completely 100% obviously true things.
1:19:48.400 --> 1:19:50.840
Of course LiDAR is a crutch.
1:19:50.840 --> 1:19:53.040
It's not even a good crutch.
1:19:53.040 --> 1:19:54.200
You're not even using it.
1:19:54.200 --> 1:19:56.920
They're using it for localization,
1:19:56.920 --> 1:19:58.160
which isn't good in the first place.
1:19:58.160 --> 1:20:00.480
If you have to localize your car to centimeters
1:20:00.480 --> 1:20:04.280
in order to drive, that's not driving.
1:20:04.280 --> 1:20:06.320
They're currently not doing much machine learning.
1:20:06.320 --> 1:20:09.280
I thought LiDAR data, meaning like to help you
1:20:09.280 --> 1:20:12.840
in the general task of perception.
1:20:12.840 --> 1:20:15.320
The main goal of those LiDARs on those cars
1:20:15.320 --> 1:20:18.840
I think is actually localization more than perception,
1:20:18.840 --> 1:20:20.080
or at least that's what they use them for.
1:20:20.080 --> 1:20:20.920
Yeah, that's true.
1:20:20.920 --> 1:20:22.480
If you want to localize to centimeters,
1:20:22.480 --> 1:20:23.720
you can't use GPS.
1:20:23.720 --> 1:20:25.120
The fanciest GPS in the world can't do it,
1:20:25.120 --> 1:20:26.960
especially if you're under tree cover and stuff.
1:20:26.960 --> 1:20:28.480
With LiDAR you can do this pretty easily.
1:20:28.480 --> 1:20:30.240
So really they're not taking on,
1:20:30.240 --> 1:20:33.200
I mean in some research they're using it for perception,
1:20:33.200 --> 1:20:35.840
but and they're certainly not, which is sad,
1:20:35.840 --> 1:20:38.680
they're not fusing it well with vision.
1:20:38.680 --> 1:20:40.560
They do use it for perception.
1:20:40.560 --> 1:20:42.400
I'm not saying they don't use it for perception,
1:20:42.400 --> 1:20:45.480
but the thing is that they have vision based
1:20:45.480 --> 1:20:47.680
and radar based perception systems as well.
1:20:47.680 --> 1:20:51.440
You could remove the LiDAR and keep around
1:20:51.440 --> 1:20:54.040
a lot of the dynamic object perception.
1:20:54.040 --> 1:20:56.320
You want to get centimeter accurate localization.
1:20:56.320 --> 1:20:59.120
Good luck doing that with anything else.
1:20:59.120 --> 1:21:02.880
So what should a Cruise, a Waymo do?
1:21:02.880 --> 1:21:05.360
Like what would be your advice to them now?
1:21:06.400 --> 1:21:11.400
I mean Waymo is actually, they're serious.
1:21:11.400 --> 1:21:13.120
Waymo, out of all of them,
1:21:13.120 --> 1:21:16.120
are quite serious about the long game.
1:21:16.120 --> 1:21:20.680
If L5 is a long way off, if it requires 50 years,
1:21:20.680 --> 1:21:24.000
I think Waymo will be the only one left standing at the end
1:21:24.000 --> 1:21:26.560
given the financial backing that they have.
1:21:26.560 --> 1:21:28.640
The Google bucks.
1:21:28.640 --> 1:21:31.040
I'll say nice things about both Waymo and Cruise.
1:21:32.320 --> 1:21:33.480
Let's do it.
1:21:33.480 --> 1:21:34.320
Nice is good.
1:21:35.720 --> 1:21:39.200
Waymo is by far the furthest along with technology.
1:21:39.200 --> 1:21:41.160
Waymo has a three to five year lead
1:21:41.160 --> 1:21:42.880
on all the competitors.
1:21:43.960 --> 1:21:48.640
If the Waymo looking stack works,
1:21:48.640 --> 1:21:49.720
maybe three year lead.
1:21:49.720 --> 1:21:51.280
If the Waymo looking stack works,
1:21:51.280 --> 1:21:52.800
they have a three year lead.
1:21:52.800 --> 1:21:55.800
Now, I argue that Waymo has spent too much money
1:21:55.800 --> 1:21:59.240
to recapitalize, to gain back their losses
1:21:59.240 --> 1:22:00.160
in those three years.
1:22:00.160 --> 1:22:03.600
Also self driving cars have no network effect like that.
1:22:03.600 --> 1:22:04.800
Uber has a network effect.
1:22:04.800 --> 1:22:07.120
You have a market, you have drivers and you have riders.
1:22:07.120 --> 1:22:09.880
Self driving cars, you have capital and you have riders.
1:22:09.880 --> 1:22:11.400
There's no network effect.
1:22:11.400 --> 1:22:13.800
If I want to blanket a new city in self driving cars,
1:22:13.800 --> 1:22:16.000
I buy the off the shelf Chinese knockoff self driving cars
1:22:16.000 --> 1:22:17.160
and I buy enough of them in the city.
1:22:17.160 --> 1:22:18.360
I can't do that with drivers.
1:22:18.360 --> 1:22:20.840
And that's why Uber has a first mover advantage
1:22:20.840 --> 1:22:22.640
that no self driving car company will.
1:22:23.960 --> 1:22:26.520
Can you unpack that a little bit?
1:22:26.520 --> 1:22:28.160
Uber, you're not talking about Uber,
1:22:28.160 --> 1:22:29.240
the autonomous vehicle Uber.
1:22:29.240 --> 1:22:30.960
You're talking about the Uber cars.
1:22:30.960 --> 1:22:31.800
Yeah.
1:22:31.800 --> 1:22:32.640
I'm Uber.
1:22:32.640 --> 1:22:35.920
I open for business in Austin, Texas, let's say.
1:22:35.920 --> 1:22:38.760
I need to attract both sides of the market.
1:22:38.760 --> 1:22:41.200
I need to both get drivers on my platform
1:22:41.200 --> 1:22:42.720
and riders on my platform.
1:22:42.720 --> 1:22:45.320
And I need to keep them both sufficiently happy, right?
1:22:45.320 --> 1:22:46.520
Riders aren't going to use it
1:22:46.520 --> 1:22:48.960
if it takes more than five minutes for an Uber to show up.
1:22:48.960 --> 1:22:50.120
Drivers aren't going to use it
1:22:50.120 --> 1:22:52.120
if they have to sit around all day and there's no riders.
1:22:52.120 --> 1:22:54.480
So you have to carefully balance a market.
1:22:54.480 --> 1:22:56.240
And whenever you have to carefully balance a market,
1:22:56.240 --> 1:22:58.280
there's a great first mover advantage
1:22:58.280 --> 1:23:01.000
because there's a switching cost for everybody, right?
1:23:01.000 --> 1:23:02.120
The drivers and the riders
1:23:02.120 --> 1:23:04.080
would have to switch at the same time.
1:23:04.080 --> 1:23:08.880
Let's even say that, let's say, Luber shows up.
1:23:08.880 --> 1:23:13.880
And Luber somehow agrees to do things at a bigger,
1:23:14.800 --> 1:23:17.440
they've done it more efficiently, right?
1:23:17.440 --> 1:23:19.800
Luber only takes 5% of a fare
1:23:19.800 --> 1:23:21.600
instead of the 10% that Uber takes.
1:23:21.600 --> 1:23:22.760
No one is going to switch
1:23:22.760 --> 1:23:24.920
because the switching cost is higher than that 5%.
1:23:24.920 --> 1:23:27.200
So you actually can, in markets like that,
1:23:27.200 --> 1:23:28.520
you have a first mover advantage.
1:23:28.520 --> 1:23:29.360
Yeah.
1:23:30.160 --> 1:23:32.720
Autonomous vehicles of the level five variety
1:23:32.720 --> 1:23:34.560
have no first mover advantage.
1:23:34.560 --> 1:23:36.800
If the technology becomes commoditized,
1:23:36.800 --> 1:23:39.520
say I want to go to a new city, look at the scooters.
1:23:39.520 --> 1:23:41.480
It's going to look a lot more like scooters.
1:23:41.480 --> 1:23:44.040
Every person with a checkbook
1:23:44.040 --> 1:23:45.720
can blanket a city in scooters
1:23:45.720 --> 1:23:47.920
and that's why you have 10 different scooter companies.
1:23:47.920 --> 1:23:48.760
Which one's going to win?
1:23:48.760 --> 1:23:49.600
It's a race to the bottom.
1:23:49.600 --> 1:23:51.040
It's a terrible market to be in
1:23:51.040 --> 1:23:53.160
because there's no moat for scooters.
1:23:54.960 --> 1:23:56.520
And the scooters don't get a say
1:23:56.520 --> 1:23:57.480
in whether they want to be bought
1:23:57.480 --> 1:23:58.440
and deployed to a city or not.
1:23:58.440 --> 1:23:59.280
Right.
1:23:59.280 --> 1:24:00.120
So yeah.
1:24:00.120 --> 1:24:02.080
We're going to entice the scooters with subsidies
1:24:02.080 --> 1:24:02.920
and deals.
1:24:03.840 --> 1:24:05.480
So whenever you have to invest that capital,
1:24:05.480 --> 1:24:06.720
it doesn't...
1:24:06.720 --> 1:24:07.560
It doesn't come back.
1:24:07.560 --> 1:24:08.600
Yeah.
1:24:08.600 --> 1:24:12.320
That can't be your main criticism of the Waymo approach.
1:24:12.320 --> 1:24:14.840
Oh, I'm saying even if it does technically work.
1:24:14.840 --> 1:24:17.040
Even if it does technically work, that's a problem.
1:24:17.040 --> 1:24:18.000
Yeah.
1:24:18.000 --> 1:24:21.720
I don't know if I were to say, I would say,
1:24:22.840 --> 1:24:23.520
you're already there.
1:24:23.520 --> 1:24:24.560
I haven't even thought about that.
1:24:24.560 --> 1:24:26.520
But I would say the bigger challenge
1:24:26.520 --> 1:24:27.760
is the technical approach.
1:24:29.760 --> 1:24:31.840
So Waymo's, Cruise's...
1:24:31.840 --> 1:24:33.000
And not just the technical approach,
1:24:33.000 --> 1:24:34.800
but of creating value.
1:24:34.800 --> 1:24:39.800
I still don't understand how you beat Uber,
1:24:40.760 --> 1:24:43.480
the human driven cars.
1:24:43.480 --> 1:24:44.920
In terms of financially,
1:24:44.920 --> 1:24:47.160
it doesn't make sense to me
1:24:47.160 --> 1:24:50.080
that people want to get an autonomous vehicle.
1:24:50.080 --> 1:24:52.800
I don't understand how you make money.
1:24:52.800 --> 1:24:56.440
In the long term, yes, like real long term,
1:24:56.440 --> 1:24:58.640
but it just feels like there's too much
1:24:58.640 --> 1:24:59.960
capital investment needed.
1:24:59.960 --> 1:25:01.200
Oh, and they're going to be worse than Ubers
1:25:01.200 --> 1:25:02.440
because they're going to stop
1:25:02.440 --> 1:25:04.760
for every little thing everywhere.
1:25:06.320 --> 1:25:07.360
I'll say a nice thing about Cruise.
1:25:07.360 --> 1:25:08.440
That was my nice thing about Waymo.
1:25:08.440 --> 1:25:09.280
They're three years ahead of me.
1:25:09.280 --> 1:25:10.120
It was a nice...
1:25:10.120 --> 1:25:10.960
Oh, because they're three years.
1:25:10.960 --> 1:25:12.480
They're three years technically ahead of everybody.
1:25:12.480 --> 1:25:13.960
Their tech stack is great.
1:25:14.800 --> 1:25:17.920
My nice thing about Cruise is GM buying them
1:25:17.920 --> 1:25:19.160
was a great move for GM.
1:25:20.600 --> 1:25:22.240
For $1 billion,
1:25:22.240 --> 1:25:25.600
GM bought an insurance policy against Waymo.
1:25:26.560 --> 1:25:30.000
Let's say Cruise is three years behind Waymo.
1:25:30.000 --> 1:25:32.600
That means Google will get a monopoly
1:25:32.600 --> 1:25:35.160
on the technology for at most three years.
1:25:36.840 --> 1:25:38.880
And if technology works,
1:25:38.880 --> 1:25:40.840
you might not even be right about the three years.
1:25:40.840 --> 1:25:41.840
It might be less.
1:25:41.840 --> 1:25:42.680
Might be less.
1:25:42.680 --> 1:25:44.320
Cruise actually might not be that far behind.
1:25:44.320 --> 1:25:47.360
I don't know how much Waymo has waffled around
1:25:47.360 --> 1:25:49.760
or how much of it actually is just that long tail.
1:25:49.760 --> 1:25:50.600
Yeah, okay.
1:25:50.600 --> 1:25:53.600
If that's the best you could say in terms of nice things,
1:25:53.600 --> 1:25:55.200
that's more of a nice thing for GM
1:25:55.200 --> 1:25:58.560
that that's a smart insurance policy.
1:25:58.560 --> 1:25:59.680
It's a smart insurance policy.
1:25:59.680 --> 1:26:01.880
I mean, I think that's how...
1:26:01.880 --> 1:26:05.200
I can't see Cruise working out any other way.
1:26:05.200 --> 1:26:07.840
For Cruise to leapfrog Waymo would really surprise me.
1:26:10.400 --> 1:26:13.000
Yeah, so let's talk about the underlying assumptions
1:26:13.000 --> 1:26:13.840
of everything is...
1:26:13.840 --> 1:26:15.440
We're not gonna leapfrog Tesla.
1:26:17.560 --> 1:26:19.240
Tesla would have to seriously mess up
1:26:19.240 --> 1:26:20.440
for us to leapfrog them.
1:26:20.440 --> 1:26:23.240
Okay, so the way you leapfrog, right,
1:26:23.240 --> 1:26:26.120
is you come up with an idea
1:26:26.120 --> 1:26:28.560
or you take a direction, perhaps secretly,
1:26:28.560 --> 1:26:30.640
that the other people aren't taking.
1:26:31.640 --> 1:26:36.640
And so Cruise, Waymo, even Aurora...
1:26:38.080 --> 1:26:40.080
I don't know, Aurora, Zoox is the same stack as well.
1:26:40.080 --> 1:26:41.720
They're all the same code base even.
1:26:41.720 --> 1:26:44.120
They're all the same DARPA Urban Challenge code base.
1:26:44.120 --> 1:26:45.360
It's...
1:26:45.360 --> 1:26:47.760
So the question is, do you think there's room
1:26:47.760 --> 1:26:49.120
for brilliance and innovation there
1:26:49.120 --> 1:26:50.560
that will change everything?
1:26:51.560 --> 1:26:53.880
Like say, okay, so I'll give you examples.
1:26:53.880 --> 1:26:58.880
It could be a revolution in mapping, for example,
1:26:59.640 --> 1:27:03.040
that allows you to map things,
1:27:03.040 --> 1:27:05.840
do HD maps of the whole world,
1:27:05.840 --> 1:27:08.080
all weather conditions somehow really well,
1:27:08.080 --> 1:27:13.080
or a revolution in simulation,
1:27:14.480 --> 1:27:18.840
to where all the stuff you said before becomes incorrect.
1:27:20.480 --> 1:27:21.520
That kind of thing.
1:27:21.520 --> 1:27:23.920
Any room for breakthrough innovation?
1:27:24.920 --> 1:27:25.960
What I said before about,
1:27:25.960 --> 1:27:28.280
oh, they actually get the whole thing, well,
1:27:28.280 --> 1:27:32.600
I'll say this: we divide driving into three problems.
1:27:32.600 --> 1:27:33.800
And I actually haven't solved the third yet,
1:27:33.800 --> 1:27:34.800
but I have an idea how to do it.
1:27:34.800 --> 1:27:36.120
So there's the static.
1:27:36.120 --> 1:27:38.000
The static driving problem is assuming
1:27:38.000 --> 1:27:40.120
you are the only car on the road, right?
1:27:40.120 --> 1:27:42.000
And this problem can be solved 100%
1:27:42.000 --> 1:27:44.000
with mapping and localization.
1:27:44.000 --> 1:27:45.760
This is why farms work the way they do.
1:27:45.760 --> 1:27:48.440
If all you have to deal with is the static problem,
1:27:48.440 --> 1:27:50.160
and you can statically schedule your machines, right?
1:27:50.160 --> 1:27:52.680
It's the same as like statically scheduling processes.
1:27:52.680 --> 1:27:54.040
You can statically schedule your tractors
1:27:54.040 --> 1:27:56.160
to never hit each other on their paths, right?
1:27:56.160 --> 1:27:57.520
Because then you know the speed they go at.
1:27:57.520 --> 1:28:00.160
So that's the static driving problem.
1:28:00.160 --> 1:28:03.160
Maps only helps you with the static driving problem.
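A minimal sketch of the static-scheduling idea he's describing, in Python: if every machine's position is a known function of time, collision freedom can be checked entirely offline, with no perception at all. The path format, speeds, and gap threshold are illustrative assumptions, not any real farm or fleet system.

```python
import math

# Minimal sketch: with no dynamic agents, each machine's position is a
# known function of time, so safety can be verified offline. Illustrative.

def position(path, speed, t):
    """Point along a piecewise-linear path after t seconds at constant speed."""
    dist = speed * t
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if seg > 0 and dist <= seg:
            f = dist / seg
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
        dist -= seg
    return path[-1]  # parked at the end of its route

def schedules_conflict(a, b, horizon, dt=0.1, min_gap=3.0):
    """True if two statically scheduled machines ever get within min_gap meters.
    a and b are (path, speed) pairs."""
    t = 0.0
    while t <= horizon:
        (xa, ya), (xb, yb) = position(*a, t), position(*b, t)
        if math.hypot(xa - xb, ya - yb) < min_gap:
            return True
        t += dt
    return False
```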
1:28:03.920 --> 1:28:06.960
Yeah, the question about static driving,
1:28:06.960 --> 1:28:08.800
you've just made it sound like it's really easy.
1:28:08.800 --> 1:28:10.160
Static driving is really easy.
1:28:11.880 --> 1:28:13.040
How easy?
1:28:13.040 --> 1:28:16.480
How, well, because the whole drifting out of lane,
1:28:16.480 --> 1:28:18.760
when Tesla drifts out of lane,
1:28:18.760 --> 1:28:21.960
it's failing on the fundamental static driving problem.
1:28:21.960 --> 1:28:24.440
Tesla is drifting out of lane?
1:28:24.440 --> 1:28:27.720
The static driving problem is not easy for the world.
1:28:27.720 --> 1:28:30.320
The static driving problem is easy for one route.
1:28:31.840 --> 1:28:33.920
One route in one weather condition
1:28:33.920 --> 1:28:37.920
with one state of lane markings
1:28:37.920 --> 1:28:40.920
and like no deterioration, no cracks in the road.
1:28:40.920 --> 1:28:42.600
Well, I'm assuming you have a perfect localizer.
1:28:42.600 --> 1:28:44.200
So that's all for the weather condition
1:28:44.200 --> 1:28:45.560
and the lane marking condition.
1:28:45.560 --> 1:28:46.640
But that's the problem.
1:28:46.640 --> 1:28:47.680
How do you have a perfect localizer?
1:28:47.680 --> 1:28:50.560
You can build, perfect localizers are not that hard to build.
1:28:50.560 --> 1:28:53.360
Okay, come on now, with LIDAR.
1:28:53.360 --> 1:28:54.200
LIDAR, yeah.
1:28:54.200 --> 1:28:55.040
With LIDAR, okay.
1:28:55.040 --> 1:28:56.440
LIDAR, yeah, but you use LIDAR, right?
1:28:56.440 --> 1:28:58.640
Like you use LIDAR, build a perfect localizer.
1:28:58.640 --> 1:29:00.960
Building a perfect localizer without LIDAR,
1:29:03.000 --> 1:29:04.320
it's gonna be hard.
1:29:04.320 --> 1:29:05.760
You can get 10 centimeters without LIDAR,
1:29:05.760 --> 1:29:07.240
you can get one centimeter with LIDAR.
1:29:07.240 --> 1:29:09.280
I'm not even concerned about the one or 10 centimeters.
1:29:09.280 --> 1:29:12.680
I'm concerned if every once in a while you're just way off.
1:29:12.680 --> 1:29:17.480
Yeah, so this is why you have to carefully
1:29:17.480 --> 1:29:20.040
make sure you're always tracking your position.
1:29:20.040 --> 1:29:21.760
You wanna use LIDAR camera fusion,
1:29:21.760 --> 1:29:24.480
but you can get the reliability of that system
1:29:24.480 --> 1:29:28.000
up to 100,000 miles
1:29:28.000 --> 1:29:29.720
and then you write some fallback condition
1:29:29.720 --> 1:29:32.160
where it's not that bad if you're way off, right?
1:29:32.160 --> 1:29:33.800
I think that you can get it to the point,
1:29:33.800 --> 1:29:36.800
it's like ASIL D, that you're never in a case
1:29:36.800 --> 1:29:38.480
where you're way off and you don't know it.
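A minimal sketch of that "never way off without knowing it" idea, assuming you have independent pose estimates to cross-check; the threshold and the agreement scheme are illustrative, not how any particular stack actually does it.

```python
import math

# Minimal sketch: cross-check independent (x, y) pose estimates and fall
# back the moment they disagree, rather than trusting any single localizer.
# Threshold and source names are illustrative assumptions.

def localization_status(lidar_pose, camera_pose, gps_pose,
                        agree_threshold_m=0.5):
    """Return OK only when all independent estimates agree."""
    pairs = [(lidar_pose, camera_pose),
             (lidar_pose, gps_pose),
             (camera_pose, gps_pose)]
    worst = max(math.dist(a, b) for a, b in pairs)
    if worst < agree_threshold_m:
        return "OK"        # estimates agree; safe to use the map
    return "FALLBACK"      # stop relying on the map; degrade gracefully
```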
1:29:38.480 --> 1:29:40.240
Yeah, okay, so this is brilliant.
1:29:40.240 --> 1:29:41.160
So that's the static.
1:29:41.160 --> 1:29:42.280
Static.
1:29:42.280 --> 1:29:45.960
We can, especially with LIDAR and good HD maps,
1:29:45.960 --> 1:29:47.080
you can solve that problem.
1:29:47.080 --> 1:29:47.920
It's easy.
1:29:47.920 --> 1:29:51.840
The static, the static problem is so easy.
1:29:51.840 --> 1:29:54.000
It's very typical for you to say something's easy.
1:29:54.000 --> 1:29:54.840
I got it.
1:29:54.840 --> 1:29:56.920
It's not as challenging as the other ones, okay.
1:29:56.920 --> 1:29:58.760
Well, okay, maybe it's obvious how to solve it.
1:29:58.760 --> 1:29:59.760
The third one's the hardest.
1:29:59.760 --> 1:30:01.920
And a lot of people don't even think about the third one
1:30:01.920 --> 1:30:03.640
or even see it as different from the second one.
1:30:03.640 --> 1:30:05.720
So the second one is dynamic.
1:30:05.720 --> 1:30:08.560
The second one is like, say there's an obvious example,
1:30:08.560 --> 1:30:10.360
it's like a car stopped at a red light, right?
1:30:10.360 --> 1:30:12.520
You can't have that car in your map
1:30:12.520 --> 1:30:13.720
because you don't know whether that car
1:30:13.720 --> 1:30:14.880
is gonna be there or not.
1:30:14.880 --> 1:30:17.960
So you have to detect that car in real time
1:30:17.960 --> 1:30:21.600
and then you have to do the appropriate action, right?
1:30:21.600 --> 1:30:24.800
Also, that car is not a fixed object.
1:30:24.800 --> 1:30:26.600
That car may move and you have to predict
1:30:26.600 --> 1:30:28.680
what that car will do, right?
1:30:28.680 --> 1:30:30.840
So this is the dynamic problem.
1:30:30.840 --> 1:30:31.680
Yeah.
1:30:31.680 --> 1:30:32.800
So you have to deal with this.
1:30:32.800 --> 1:30:36.640
This involves, again, like you're gonna need models
1:30:36.640 --> 1:30:38.760
of other people's behavior.
1:30:38.760 --> 1:30:40.160
Do you, are you including in that?
1:30:40.160 --> 1:30:42.320
I don't wanna step on the third one.
1:30:42.320 --> 1:30:46.600
Oh, but are you including in that your influence
1:30:46.600 --> 1:30:47.440
on people?
1:30:47.440 --> 1:30:48.280
Ah, that's the third one.
1:30:48.280 --> 1:30:49.120
Okay.
1:30:49.120 --> 1:30:49.960
That's the third one.
1:30:49.960 --> 1:30:51.880
We call it the counterfactual.
1:30:51.880 --> 1:30:52.720
Yeah, brilliant.
1:30:52.720 --> 1:30:53.560
And that.
1:30:53.560 --> 1:30:54.920
I just talked to Judea Pearl, who's obsessed
1:30:54.920 --> 1:30:55.760
with counterfactuals.
1:30:55.760 --> 1:30:58.640
Counterfactual, oh yeah, yeah, I read his books.
1:30:58.640 --> 1:31:03.640
So the static and the dynamic, our approach right now
1:31:03.960 --> 1:31:07.600
for lateral will scale completely to the static and dynamic.
1:31:07.600 --> 1:31:10.760
The counterfactual, the only way I have to do it yet,
1:31:10.760 --> 1:31:14.000
the thing that I wanna do once we have all of these cars
1:31:14.000 --> 1:31:16.760
is I wanna do reinforcement learning on the world.
1:31:16.760 --> 1:31:18.880
I'm always gonna turn the exploit up to max.
1:31:18.880 --> 1:31:20.440
I'm not gonna have them explore.
1:31:20.440 --> 1:31:22.760
But the only real way to get at the counterfactual
1:31:22.760 --> 1:31:24.080
is to do reinforcement learning
1:31:24.080 --> 1:31:26.360
because the other agents are humans.
1:31:27.760 --> 1:31:30.080
So that's fascinating that you break it down like that.
1:31:30.080 --> 1:31:31.680
I agree completely.
1:31:31.680 --> 1:31:33.600
I've spent my life thinking about this problem.
1:31:33.600 --> 1:31:34.920
This is beautiful.
1:31:34.920 --> 1:31:37.880
And part of it, cause you're slightly insane,
1:31:37.880 --> 1:31:42.880
because not my life, just the last four years.
1:31:43.120 --> 1:31:48.120
No, no, you have some non zero percent of your brain
1:31:48.920 --> 1:31:52.360
has a madman in it, which is a really good feature.
1:31:52.360 --> 1:31:55.920
But there's a safety component to it
1:31:55.920 --> 1:31:57.320
that I think when there's sort of
1:31:57.320 --> 1:31:59.040
with counterfactuals and so on,
1:31:59.040 --> 1:32:00.280
that would just freak people out.
1:32:00.280 --> 1:32:03.320
How do you even start to think about this in general?
1:32:03.320 --> 1:32:07.600
I mean, you've had some friction with NHTSA and so on.
1:32:07.600 --> 1:32:12.600
I am frankly exhausted by safety engineers.
1:32:14.280 --> 1:32:19.280
The prioritization on safety over innovation
1:32:21.360 --> 1:32:23.720
to a degree where it kills, in my view,
1:32:23.720 --> 1:32:26.200
kills safety in the long term.
1:32:26.200 --> 1:32:28.080
So the counterfactual thing,
1:32:28.080 --> 1:32:31.560
like actually exploring this world
1:32:31.560 --> 1:32:33.600
of how do you interact with dynamic objects and so on?
1:32:33.600 --> 1:32:34.840
How do you think about safety?
1:32:34.840 --> 1:32:38.120
You can do reinforcement learning without ever exploring.
1:32:38.120 --> 1:32:39.200
And I said that, like,
1:32:39.200 --> 1:32:41.560
so you can think about your, in like reinforcement learning,
1:32:41.560 --> 1:32:44.320
it's usually called like a temperature parameter.
1:32:44.320 --> 1:32:45.360
And your temperature parameter
1:32:45.360 --> 1:32:48.080
is how often you deviate from the argmax.
1:32:48.080 --> 1:32:50.720
I could always set that to zero and still learn.
1:32:50.720 --> 1:32:52.840
And I feel that you'd always want that set to zero
1:32:52.840 --> 1:32:54.080
on your actual system.
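A minimal sketch of the temperature parameter he's describing, in Python: softmax sampling over action values at temperature T collapses to pure argmax exploitation as T goes to zero, which is what you'd run on the actual system. Names and shapes are illustrative.

```python
import numpy as np

# Minimal sketch: at T > 0 you sample (explore); at T == 0 you always take
# the argmax (pure exploitation). Illustrative, not any real policy code.

def select_action(action_values, temperature):
    if temperature == 0.0:
        return int(np.argmax(action_values))   # never deviate from argmax
    logits = np.asarray(action_values) / temperature
    probs = np.exp(logits - logits.max())      # numerically stable softmax
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))
```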
1:32:54.080 --> 1:32:54.920
Gotcha.
1:32:54.920 --> 1:32:58.160
But the problem is you first don't know very much
1:32:58.160 --> 1:32:59.560
and so you're going to make mistakes.
1:32:59.560 --> 1:33:01.680
So the learning, the exploration happens through mistakes.
1:33:01.680 --> 1:33:03.240
We're all ready, yeah, but.
1:33:03.240 --> 1:33:06.080
Okay, so the consequences of a mistake.
1:33:06.080 --> 1:33:09.400
OpenPilot and Autopilot are making mistakes left and right.
1:33:09.400 --> 1:33:12.560
We have 700 daily active users,
1:33:12.560 --> 1:33:14.080
1,000 weekly active users.
1:33:14.080 --> 1:33:18.920
OpenPilot makes tens of thousands of mistakes a week.
1:33:18.920 --> 1:33:21.160
These mistakes have zero consequences.
1:33:21.160 --> 1:33:22.520
These mistakes are,
1:33:22.520 --> 1:33:26.800
oh, I wanted to take this exit and it went straight.
1:33:26.800 --> 1:33:28.520
So I'm just going to carefully touch the wheel.
1:33:28.520 --> 1:33:29.360
The humans catch them.
1:33:29.360 --> 1:33:30.640
The humans catch them.
1:33:30.640 --> 1:33:33.120
And the human disengagement is labeling
1:33:33.120 --> 1:33:35.000
that reinforcement learning in a completely
1:33:35.000 --> 1:33:36.200
consequence free way.
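A minimal sketch of what that "disengagement as consequence-free labeling" could look like: the human takeover stamps a negative label onto the policy outputs that preceded it, and nothing bad happened in the real world because the human was in control. The log format, field names, and window are illustrative assumptions.

```python
# Minimal sketch: when the human takes over, the seconds of policy output
# leading up to it get a negative reward, at zero real-world cost since
# the human caught the mistake. All field names are illustrative.

def label_segment(frames, disengaged_at, window_s=3.0):
    """Attach rewards to logged frames around a human takeover."""
    labeled = []
    for f in frames:
        dt = disengaged_at - f["t"]
        if 0.0 <= dt <= window_s:
            reward = -1.0      # human corrected the policy here
        else:
            reward = 0.0       # no evidence either way
        labeled.append({**f, "reward": reward})
    return labeled
```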
1:33:37.240 --> 1:33:39.840
So driver monitoring is the way you ensure they keep.
1:33:39.840 --> 1:33:40.680
Yes.
1:33:40.680 --> 1:33:42.120
They keep paying attention.
1:33:42.120 --> 1:33:43.240
How's your messaging?
1:33:43.240 --> 1:33:45.200
Say I gave you a billion dollars,
1:33:45.200 --> 1:33:46.960
so you would be scaling it now.
1:33:47.800 --> 1:33:49.720
Oh, if I could scale, I couldn't scale with any amount of money.
1:33:49.720 --> 1:33:51.640
I'd raise money if I could, if I had a way to scale it.
1:33:51.640 --> 1:33:53.320
Yeah, you're not, no, I'm not focused on scale.
1:33:53.320 --> 1:33:54.160
I don't know how to do.
1:33:54.160 --> 1:33:55.760
Oh, like, I guess I could sell it to more people,
1:33:55.760 --> 1:33:56.960
but I want to make the system better.
1:33:56.960 --> 1:33:57.800
Better, better.
1:33:57.800 --> 1:33:58.840
And I don't know how to.
1:33:58.840 --> 1:34:01.080
But what's the messaging here?
1:34:01.080 --> 1:34:02.560
I got a chance to talk to Elon.
1:34:02.560 --> 1:34:07.560
And he basically said that the human factor doesn't matter.
1:34:09.280 --> 1:34:10.360
You know, the human doesn't matter
1:34:10.360 --> 1:34:12.280
because the system will perform.
1:34:12.280 --> 1:34:14.760
There'll be sort of a, sorry to use the term,
1:34:14.760 --> 1:34:16.120
but like a singularity, like a point
1:34:16.120 --> 1:34:17.920
where it gets just much better.
1:34:17.920 --> 1:34:20.800
And so the human, it won't really matter.
1:34:20.800 --> 1:34:25.000
But it seems like that human catching the system
1:34:25.000 --> 1:34:29.360
when it gets into trouble is like the thing
1:34:29.360 --> 1:34:32.720
which will make something like reinforcement learning work.
1:34:32.720 --> 1:34:35.640
So how do you, how do you think messaging for Tesla,
1:34:35.640 --> 1:34:39.080
for you, for the industry in general, should change?
1:34:39.080 --> 1:34:40.840
I think my messaging is pretty clear,
1:34:40.840 --> 1:34:43.080
at least like our messaging wasn't that clear
1:34:43.080 --> 1:34:45.200
in the beginning and I do kind of fault myself for that.
1:34:45.200 --> 1:34:48.480
We are proud right now to be a level two system.
1:34:48.480 --> 1:34:50.360
We are proud to be level two.
1:34:50.360 --> 1:34:51.640
If we talk about level four,
1:34:51.640 --> 1:34:53.200
it's not with the current hardware.
1:34:53.200 --> 1:34:55.920
It's not going to be just a magical OTA upgrade.
1:34:55.920 --> 1:34:57.280
It's going to be new hardware.
1:34:57.280 --> 1:35:00.000
It's going to be very carefully thought out right now.
1:35:00.000 --> 1:35:01.560
We are proud to be level two.
1:35:01.560 --> 1:35:03.320
And we have a rigorous safety model.
1:35:03.320 --> 1:35:05.680
I mean, not like, like, okay, rigorous.
1:35:05.680 --> 1:35:06.600
Who knows what that means?
1:35:06.600 --> 1:35:08.600
But we at least have a safety model
1:35:08.600 --> 1:35:09.560
and we make it explicit.
1:35:09.560 --> 1:35:11.800
It's in safety.md in OpenPilot.
1:35:11.800 --> 1:35:13.960
And it says, seriously though.
1:35:13.960 --> 1:35:14.800
Safety.md.
1:35:14.800 --> 1:35:15.840
Safety.md.
1:35:16.840 --> 1:35:18.400
This is really, this is so Android.
1:35:18.400 --> 1:35:21.800
So, well, this is, this is the safety model
1:35:21.800 --> 1:35:25.520
and I like to have conversations like if, like, you know,
1:35:25.520 --> 1:35:27.120
sometimes people will come to you and they're like,
1:35:27.120 --> 1:35:29.240
your system's not safe.
1:35:29.240 --> 1:35:30.080
Okay.
1:35:30.080 --> 1:35:31.080
Have you read my safety docs?
1:35:31.080 --> 1:35:32.720
Would you like to have an intelligent conversation
1:35:32.720 --> 1:35:33.560
about this?
1:35:33.560 --> 1:35:34.400
And the answer is always no.
1:35:34.400 --> 1:35:36.880
They just like scream about, it runs Python.
1:35:38.240 --> 1:35:39.080
Okay. What?
1:35:39.080 --> 1:35:41.560
So you're saying that, that because Python's not real time,
1:35:41.560 --> 1:35:44.240
Python not being real time never causes disengagement.
1:35:44.240 --> 1:35:47.640
Disengagements are caused by, you know, the model is QM.
1:35:47.640 --> 1:35:49.760
But safety.md says the following.
1:35:49.760 --> 1:35:50.600
First and foremost,
1:35:50.600 --> 1:35:53.000
the driver must be paying attention at all times.
1:35:54.240 --> 1:35:55.320
I don't consider, I do,
1:35:55.320 --> 1:35:57.720
I still consider the software to be alpha software
1:35:57.720 --> 1:36:00.080
until we can actually enforce that statement.
1:36:00.080 --> 1:36:03.280
But I feel it's very well communicated to our users.
1:36:03.280 --> 1:36:04.520
Two more things.
1:36:04.520 --> 1:36:09.080
One is the user must be able to easily take control
1:36:09.080 --> 1:36:10.880
of the vehicle at all times.
1:36:10.880 --> 1:36:14.440
So if you step on the gas or brake with OpenPilot,
1:36:14.440 --> 1:36:16.400
it gives full manual control back to the user
1:36:16.400 --> 1:36:18.680
or press the cancel button.
1:36:18.680 --> 1:36:23.240
Step two, the car will never react so quickly.
1:36:23.240 --> 1:36:26.000
We define so quickly to be about one second
1:36:26.000 --> 1:36:27.640
that you can't react in time.
1:36:27.640 --> 1:36:29.480
And we do this by enforcing torque limits,
1:36:29.480 --> 1:36:31.520
braking limits and acceleration limits.
1:36:31.520 --> 1:36:36.520
So we have like our torque limits way lower than Tesla's.
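A minimal sketch of the actuator-limit idea in that safety model: clamp both the magnitude and the rate of change of the commanded steering torque, so the system can never move the wheel faster than a human can catch. The numbers are made up for illustration, not OpenPilot's actual limits.

```python
# Minimal sketch: enforce both an absolute torque limit and a rate limit
# per control step, so the wheel can never be jerked faster than a human
# can react. Values are illustrative, not OpenPilot's actual limits.

MAX_TORQUE = 150        # absolute command limit (vehicle-specific units)
MAX_TORQUE_RATE = 10    # max change allowed per 10 ms control step

def clamp_torque(requested, last_applied):
    limited = max(-MAX_TORQUE, min(MAX_TORQUE, requested))
    step = limited - last_applied
    step = max(-MAX_TORQUE_RATE, min(MAX_TORQUE_RATE, step))
    return last_applied + step
```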
1:36:36.520 --> 1:36:39.080
This is another potential.
1:36:39.080 --> 1:36:40.240
If I could tweak Autopilot,
1:36:40.240 --> 1:36:41.360
I would lower their torque limit
1:36:41.360 --> 1:36:42.960
and I would add driver monitoring.
1:36:42.960 --> 1:36:46.240
Because Autopilot can jerk the wheel hard.
1:36:46.240 --> 1:36:47.520
OpenPilot can't.
1:36:47.520 --> 1:36:52.080
We limit it, and all this code is open source, readable.
1:36:52.080 --> 1:36:54.880
And I believe now it's all MISRA C compliant.
1:36:54.880 --> 1:36:55.800
What's that mean?
1:36:57.080 --> 1:37:00.400
MISRA is like the automotive coding standard.
1:37:00.400 --> 1:37:03.400
At first, I've come to respect,
1:37:03.400 --> 1:37:04.960
I've been reading like the standards lately
1:37:04.960 --> 1:37:05.920
and I've come to respect them.
1:37:05.920 --> 1:37:07.800
They're actually written by very smart people.
1:37:07.800 --> 1:37:09.920
Yeah, they're brilliant people actually.
1:37:09.920 --> 1:37:11.320
They have a lot of experience.
1:37:11.320 --> 1:37:13.360
They're sometimes a little too cautious,
1:37:13.360 --> 1:37:16.800
but in this case, it pays off.
1:37:16.800 --> 1:37:18.440
MISRA is written by like computer scientists
1:37:18.440 --> 1:37:19.840
and you can tell by the language they use.
1:37:21.080 --> 1:37:24.440
They talk about like whether certain conditions in MISRA
1:37:24.440 --> 1:37:26.520
are decidable or undecidable.
1:37:26.520 --> 1:37:28.360
And you mean like the halting problem?
1:37:28.360 --> 1:37:31.600
And yes, all right, you've earned my respect.
1:37:31.600 --> 1:37:33.120
I will read carefully what you have to say
1:37:33.120 --> 1:37:35.760
and we want to make our code compliant with that.
1:37:35.760 --> 1:37:38.160
All right, so you're proud level two, beautiful.
1:37:38.160 --> 1:37:42.320
So you were the founder and I think CEO of Comma AI,
1:37:42.320 --> 1:37:44.320
then you were the head of research.
1:37:44.320 --> 1:37:46.080
What the heck are you now?
1:37:46.080 --> 1:37:47.480
What's your connection to Comma AI?
1:37:47.480 --> 1:37:49.640
I'm the president, but I'm one of those like
1:37:49.640 --> 1:37:53.440
unelected presidents of like a small dictatorship country,
1:37:53.440 --> 1:37:55.200
not one of those like elected presidents.
1:37:55.200 --> 1:37:57.640
Oh, so you're like Putin when he was like the, yeah,
1:37:57.640 --> 1:37:58.980
I got you.
1:37:58.980 --> 1:38:02.120
So there's, what's the governance structure?
1:38:02.120 --> 1:38:04.800
What's the future of Comma AI financially?
1:38:04.800 --> 1:38:08.120
I mean, yeah, as a business, do you want,
1:38:08.120 --> 1:38:11.640
are you just focused on getting things right now,
1:38:11.640 --> 1:38:14.920
making some small amount of money in the meantime
1:38:14.920 --> 1:38:17.520
and then when it works, it works and you scale.
1:38:17.520 --> 1:38:20.480
Our burn rate is about 200 K a month
1:38:20.480 --> 1:38:23.040
and our revenue is about 100 K a month.
1:38:23.040 --> 1:38:24.920
So we need to 4x our revenue,
1:38:24.920 --> 1:38:28.200
but we haven't like tried very hard at that yet.
1:38:28.200 --> 1:38:30.160
And the revenue is basically selling stuff online.
1:38:30.160 --> 1:38:32.360
Yeah, we sell stuff at shop.comma.ai.
1:38:32.360 --> 1:38:33.920
Is there other, well, okay.
1:38:33.920 --> 1:38:35.360
So you'll have to figure out.
1:38:35.360 --> 1:38:37.880
That's our only, see, but to me,
1:38:37.880 --> 1:38:40.400
that's like respectable revenues.
1:38:40.400 --> 1:38:42.640
We make it by selling products to consumers
1:38:42.640 --> 1:38:45.040
and we're honest and transparent about what they are.
1:38:45.040 --> 1:38:49.000
Unlike most, actually, level four companies, right?
1:38:50.720 --> 1:38:54.320
Cause you could easily start blowing smoke,
1:38:54.320 --> 1:38:57.080
like overselling the hype and feeding into,
1:38:57.080 --> 1:38:59.080
getting some fundraising.
1:38:59.080 --> 1:39:00.520
Oh, you're the guy, you're a genius
1:39:00.520 --> 1:39:01.800
because you hacked the iPhone.
1:39:01.800 --> 1:39:02.920
Oh, I hate that.
1:39:02.920 --> 1:39:03.760
I hate that.
1:39:03.760 --> 1:39:06.360
Yeah, I can trade my social capital for more money.
1:39:06.360 --> 1:39:07.320
I did it once.
1:39:07.320 --> 1:39:10.320
I almost regret doing it the first time.
1:39:10.320 --> 1:39:11.640
Well, on a small tangent,
1:39:11.640 --> 1:39:16.560
what's your, you seem to not like fame
1:39:16.560 --> 1:39:18.840
and yet you're also drawn to fame.
1:39:18.840 --> 1:39:23.840
Where are you on that currently?
1:39:24.560 --> 1:39:27.200
Have you had some introspection, some soul searching?
1:39:27.200 --> 1:39:28.480
Yeah.
1:39:28.480 --> 1:39:32.200
I actually, I've come to a pretty stable position on that.
1:39:32.200 --> 1:39:33.880
Like after the first time,
1:39:33.880 --> 1:39:36.840
I realized that I don't want attention from the masses.
1:39:36.840 --> 1:39:39.160
I want attention from people who I respect.
1:39:39.160 --> 1:39:41.960
Who do you respect?
1:39:41.960 --> 1:39:43.960
I can give a list of people.
1:39:43.960 --> 1:39:47.200
So are these like Elon Musk type characters?
1:39:47.200 --> 1:39:49.040
Yeah.
1:39:49.040 --> 1:39:50.000
Actually, you know what?
1:39:50.000 --> 1:39:51.200
I'll make it more broad than that.
1:39:51.200 --> 1:39:52.600
I won't make it about a person.
1:39:52.600 --> 1:39:54.040
I respect skill.
1:39:54.040 --> 1:39:56.880
I respect people who have skills, right?
1:39:56.880 --> 1:40:00.280
And I would like to like be,
1:40:00.280 --> 1:40:01.400
I'm not gonna say famous,
1:40:01.400 --> 1:40:03.760
but be like known among more people
1:40:03.760 --> 1:40:05.440
who have like real skills.
1:40:05.440 --> 1:40:10.440
Who in cars do you think has skill?
1:40:12.560 --> 1:40:13.720
Not who do you respect.
1:40:15.000 --> 1:40:17.760
Oh, Kyle Vogt has skill.
1:40:17.760 --> 1:40:19.880
A lot of people at Waymo have skill.
1:40:19.880 --> 1:40:20.840
And I respect them.
1:40:20.840 --> 1:40:23.760
I respect them as engineers.
1:40:23.760 --> 1:40:24.920
Like I can think, I mean,
1:40:24.920 --> 1:40:26.280
I think about all the times in my life
1:40:26.280 --> 1:40:27.960
where I've been like dead set on approaches
1:40:27.960 --> 1:40:29.160
and they turn out to be wrong.
1:40:29.160 --> 1:40:30.000
Yeah.
1:40:30.000 --> 1:40:31.720
So I mean, this might, I might be wrong.
1:40:31.720 --> 1:40:34.720
I accept that, I accept that there's a decent chance
1:40:34.720 --> 1:40:36.600
that I'm wrong.
1:40:36.600 --> 1:40:38.400
And actually, I mean, having talked to Chris Urmson,
1:40:38.400 --> 1:40:40.480
Sterling Anderson, those guys,
1:40:40.480 --> 1:40:43.360
I mean, I deeply respect Chris.
1:40:43.360 --> 1:40:44.640
I just admire the guy.
1:40:46.040 --> 1:40:47.400
He's legit.
1:40:47.400 --> 1:40:48.960
When you drive a car through the desert
1:40:48.960 --> 1:40:52.400
when everybody thinks it's impossible, that's legit.
1:40:52.400 --> 1:40:53.840
And then I also really respect the people
1:40:53.840 --> 1:40:55.680
who are like writing the infrastructure of the world,
1:40:55.680 --> 1:40:57.360
like the Linus Torvalds and the Chris Lattners.
1:40:57.360 --> 1:40:59.080
Oh yeah, they were doing the real work.
1:40:59.080 --> 1:41:00.800
I know they're doing the real work.
1:41:02.000 --> 1:41:03.760
Having talked to Chris Lattner,
1:41:03.760 --> 1:41:05.680
you realize, especially when they're humble,
1:41:05.680 --> 1:41:07.680
it's like, you realize, oh, you guys,
1:41:07.680 --> 1:41:09.640
we're just using your...
1:41:09.640 --> 1:41:10.480
Oh yeah.
1:41:10.480 --> 1:41:11.520
All the hard work that you did.
1:41:11.520 --> 1:41:13.120
Yeah, that's incredible.
1:41:13.120 --> 1:41:17.160
What do you think of Mr. Anthony Levandowski?
1:41:18.440 --> 1:41:21.640
What do you, he's a, he's another mad genius.
1:41:21.640 --> 1:41:22.480
Sharp guy.
1:41:22.480 --> 1:41:23.320
Oh yeah.
1:41:23.320 --> 1:41:27.640
What, do you think he might long term become a competitor?
1:41:27.640 --> 1:41:28.840
Oh, to comma?
1:41:28.840 --> 1:41:32.400
Well, so I think that he has the other right approach.
1:41:32.400 --> 1:41:35.280
I think that right now, there's two right approaches.
1:41:35.280 --> 1:41:37.680
One is what we're doing and one is what he's doing.
1:41:37.680 --> 1:41:39.800
Can you describe, I think it's called Pronto AI,
1:41:39.800 --> 1:41:42.360
he's starting. Do you know what the approach is?
1:41:42.360 --> 1:41:43.200
I actually don't know.
1:41:43.200 --> 1:41:45.040
Embark is also doing the same sort of thing.
1:41:45.040 --> 1:41:47.280
The idea is almost that you want to,
1:41:47.280 --> 1:41:51.800
so if you're, I can't partner with Honda and Toyota.
1:41:51.800 --> 1:41:56.800
Honda and Toyota are like 400,000 person companies.
1:41:57.600 --> 1:41:59.400
It's not even a company at that point.
1:41:59.400 --> 1:42:01.400
Like I don't think of it like, I don't personify it.
1:42:01.400 --> 1:42:06.400
I think of it like an object, but a trucker drives for a fleet.
1:42:07.120 --> 1:42:10.280
Maybe that has like, some truckers are independent.
1:42:10.280 --> 1:42:12.080
Some truckers drive for fleets with a hundred trucks.
1:42:12.080 --> 1:42:14.960
There are tons of independent trucking companies out there.
1:42:14.960 --> 1:42:18.120
Start a trucking company and drive your costs down
1:42:18.120 --> 1:42:23.120
or figure out how to drive down the cost of trucking.
1:42:23.760 --> 1:42:26.560
Another company that I really respect is Nauto.
1:42:26.560 --> 1:42:28.320
Actually, I respect their business model.
1:42:28.320 --> 1:42:31.560
Nauto sells a driver monitoring camera
1:42:31.560 --> 1:42:33.920
and they sell it to fleet owners.
1:42:33.920 --> 1:42:38.920
If I owned a fleet of cars and I could pay 40 bucks a month
1:42:39.120 --> 1:42:41.280
to monitor my employees,
1:42:42.400 --> 1:42:45.520
this is gonna, like, reduce accidents 18%.
1:42:45.520 --> 1:42:48.960
So, like, in the space,
1:42:48.960 --> 1:42:52.000
that is the business model that I most respect
1:42:53.400 --> 1:42:55.360
because they're creating value today.
1:42:55.360 --> 1:42:57.840
Yeah, which is, that's a huge one.
1:42:57.840 --> 1:42:59.800
How do we create value today with some of this?
1:42:59.800 --> 1:43:01.680
And the lane keeping thing is huge.
1:43:01.680 --> 1:43:03.800
And it sounds like you're creeping in
1:43:03.800 --> 1:43:06.680
or full steam ahead on the driver monitoring too.
1:43:06.680 --> 1:43:09.240
Which I think is actually where the short term value is,
1:43:09.240 --> 1:43:10.480
if you can get it right.
1:43:10.480 --> 1:43:12.800
I still, I'm not a huge fan of the statement
1:43:12.800 --> 1:43:15.120
that everything has to have driver monitoring.
1:43:15.120 --> 1:43:16.120
I agree with that completely,
1:43:16.120 --> 1:43:18.680
but that statement usually misses the point
1:43:18.680 --> 1:43:21.920
that to get the experience of it right is not trivial.
1:43:21.920 --> 1:43:22.840
Oh, no, not at all.
1:43:22.840 --> 1:43:25.280
In fact, like, so right now we have,
1:43:25.280 --> 1:43:28.480
I think the timeout depends on speed of the car,
1:43:29.560 --> 1:43:32.520
but we want it to depend on, like, the scene state.
1:43:32.520 --> 1:43:35.440
If you're on like an empty highway,
1:43:35.440 --> 1:43:37.680
it's very different if you don't pay attention
1:43:37.680 --> 1:43:40.600
than if like you're like coming up to a traffic light.
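A minimal sketch of a scene-dependent attention timeout like the one he's describing, in Python: an empty highway tolerates a longer glance away than an approaching traffic light. The thresholds and inputs are illustrative assumptions, not OpenPilot's actual values.

```python
# Minimal sketch: the allowed no-attention time shrinks with speed and
# scene complexity instead of being a fixed constant. All numbers are
# illustrative assumptions.

def attention_timeout_s(speed_mps, approaching_light, lead_car_close):
    timeout = 6.0                       # generous baseline, empty road
    if speed_mps > 25:                  # fast highway driving
        timeout = min(timeout, 4.0)
    if lead_car_close:
        timeout = min(timeout, 2.0)
    if approaching_light:
        timeout = min(timeout, 1.0)     # demand attention near intersections
    return timeout
```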
1:43:42.040 --> 1:43:45.720
And long term, it should probably learn from the driver
1:43:45.720 --> 1:43:48.120
because that's the thing to do. I watched a lot of video.
1:43:48.120 --> 1:43:49.480
We've built a smartphone detector
1:43:49.480 --> 1:43:51.520
just to analyze how people are using smartphones
1:43:51.520 --> 1:43:53.400
and people are using it very differently.
1:43:53.400 --> 1:43:57.760
And there's this, there are texting styles.
1:43:57.760 --> 1:44:00.320
We haven't watched nearly enough of the videos.
1:44:00.320 --> 1:44:01.800
We haven't, I got millions of miles
1:44:01.800 --> 1:44:02.960
of people driving cars.
1:44:02.960 --> 1:44:05.960
At this point, I spend a large fraction of my time
1:44:05.960 --> 1:44:10.880
just watching videos because I never fail to learn.
1:44:10.880 --> 1:44:13.480
Like I've never failed from a video watching session
1:44:13.480 --> 1:44:15.400
to learn something I didn't know before.
1:44:15.400 --> 1:44:19.640
In fact, I usually, like when I eat lunch, I'll sit,
1:44:19.640 --> 1:44:20.680
especially when the weather is good
1:44:20.680 --> 1:44:22.080
and just watch pedestrians.
1:44:22.080 --> 1:44:26.400
With an eye to understand like from a computer vision eye,
1:44:26.400 --> 1:44:29.280
just to see, can this model, can you predict
1:44:29.280 --> 1:44:30.480
what are the decisions made?
1:44:30.480 --> 1:44:33.040
And there's so many things that we don't understand.
1:44:33.040 --> 1:44:34.760
This is what I mean about state vector.
1:44:34.760 --> 1:44:37.880
Yeah, it's, I'm trying to always think like,
1:44:37.880 --> 1:44:40.280
because I'm understanding in my human brain,
1:44:40.280 --> 1:44:42.000
how do we convert that into,
1:44:43.000 --> 1:44:44.960
how hard is the learning problem here?
1:44:44.960 --> 1:44:46.960
I guess is the fundamental question.
1:44:46.960 --> 1:44:51.800
So something that's from a hacking perspective,
1:44:51.800 --> 1:44:54.200
this always comes up, especially with folks.
1:44:54.200 --> 1:44:56.480
Well, first, the most popular question is
1:44:56.480 --> 1:44:58.440
the trolley problem, right?
1:44:58.440 --> 1:45:01.960
So that's not a sort of a serious problem.
1:45:01.960 --> 1:45:05.000
There are some ethical questions, I think that arise.
1:45:06.080 --> 1:45:09.600
Maybe you wanna, do you think there's any ethical,
1:45:09.600 --> 1:45:11.280
serious ethical questions?
1:45:11.280 --> 1:45:14.080
We have a solution to the trolley problem at comma.ai.
1:45:14.080 --> 1:45:15.920
Well, so there is actually an alert
1:45:15.920 --> 1:45:18.000
in our code, ethical dilemma detected.
1:45:18.000 --> 1:45:18.960
It's not triggered yet.
1:45:18.960 --> 1:45:21.040
We don't know how yet to detect the ethical dilemmas,
1:45:21.040 --> 1:45:22.360
but we're a level two system.
1:45:22.360 --> 1:45:23.760
So we're going to disengage and leave
1:45:23.760 --> 1:45:25.320
that decision to the human.
1:45:25.320 --> 1:45:26.680
You're such a troll.
1:45:26.680 --> 1:45:28.760
No, but the trolley problem deserves to be trolled.
1:45:28.760 --> 1:45:32.040
Yeah, that's a beautiful answer actually.
1:45:32.040 --> 1:45:34.440
I know, I gave it to someone who was like,
1:45:34.440 --> 1:45:36.600
sometimes people ask, like you asked, about the trolley problem.
1:45:36.600 --> 1:45:38.080
Like you can have a kind of discussion about it.
1:45:38.080 --> 1:45:39.720
Like when you get someone who's really
1:45:39.720 --> 1:45:43.600
earnest about it. Because it's the kind of thing where,
1:45:43.600 --> 1:45:45.600
if you ask a bunch of people in an office,
1:45:45.600 --> 1:45:48.360
whether we should use a SQL stack or a NoSQL stack,
1:45:48.360 --> 1:45:50.600
if they're not that technical, they have no opinion.
1:45:50.600 --> 1:45:52.360
But if you ask them what color they want to paint the office,
1:45:52.360 --> 1:45:54.040
everyone has an opinion on that.
1:45:54.040 --> 1:45:56.040
And that's what the trolley problem is.
1:45:56.040 --> 1:45:57.280
I mean, that's a beautiful answer.
1:45:57.280 --> 1:45:59.240
Yeah, we're able to detect the problem
1:45:59.240 --> 1:46:01.960
and we're able to pass it on to the human.
1:46:01.960 --> 1:46:03.760
Wow, I've never heard anyone say it.
1:46:03.760 --> 1:46:06.160
This is your nice escape route.
1:46:06.160 --> 1:46:07.320
Okay, but...
1:46:07.320 --> 1:46:08.680
Proud level two.
1:46:08.680 --> 1:46:09.760
I'm proud level two.
1:46:09.760 --> 1:46:10.600
I love it.
1:46:10.600 --> 1:46:14.400
So the other thing that people have some concern about
1:46:14.400 --> 1:46:17.800
with AI in general is hacking.
1:46:17.800 --> 1:46:21.400
So how hard is it, do you think, to hack an autonomous vehicle
1:46:21.400 --> 1:46:25.000
either through physical access, or through what's more
1:46:25.000 --> 1:46:28.240
popular now, these adversarial examples on the sensors?
1:46:28.240 --> 1:46:30.720
Okay, the adversarial examples one.
1:46:30.720 --> 1:46:32.320
You want to see some adversarial examples
1:46:32.320 --> 1:46:34.880
that affect humans, right?
1:46:34.880 --> 1:46:38.040
Oh, well, there used to be a stop sign here,
1:46:38.040 --> 1:46:40.000
but I put a black bag over the stop sign
1:46:40.000 --> 1:46:43.520
and then people ran it, adversarial, right?
1:46:43.520 --> 1:46:48.360
Like, there's tons of human adversarial examples too.
1:46:48.360 --> 1:46:52.240
The question in general about security, if you saw,
1:46:52.240 --> 1:46:54.040
something just came out today and there are always
1:46:54.040 --> 1:46:57.560
such hypey headlines about how Navigate on Autopilot
1:46:57.560 --> 1:47:00.960
was fooled by a GPS spoof to take an exit.
1:47:00.960 --> 1:47:01.800
Right.
1:47:01.800 --> 1:47:03.920
At least that's all they could do was take an exit.
1:47:03.920 --> 1:47:06.720
If your car is relying on GPS in order
1:47:06.720 --> 1:47:10.240
to have a safe driving policy, you're doing something wrong.
1:47:10.240 --> 1:47:12.680
If you're relying, and this is why V2V
1:47:12.680 --> 1:47:17.680
is such a terrible idea, V2V now relies on both parties
1:47:18.160 --> 1:47:19.800
getting communication right.
1:47:19.800 --> 1:47:24.800
This is not even, so I think of safety,
1:47:26.080 --> 1:47:28.480
security is like a special case of safety, right?
1:47:28.480 --> 1:47:31.880
Safety is like we put a little, you know,
1:47:31.880 --> 1:47:33.360
piece of caution tape around the hole
1:47:33.360 --> 1:47:35.560
so that people won't walk into it by accident.
1:47:35.560 --> 1:47:38.240
Security is like put a 10 foot fence around the hole
1:47:38.240 --> 1:47:40.120
so you actually physically cannot climb into it
1:47:40.120 --> 1:47:42.360
with barbed wire on the top and stuff, right?
1:47:42.360 --> 1:47:44.560
So like if you're designing systems
1:47:44.560 --> 1:47:47.440
that are like unreliable, they're definitely not secure.
1:47:48.440 --> 1:47:51.240
Your car should always do something safe
1:47:51.240 --> 1:47:53.400
using its local sensors.
1:47:53.400 --> 1:47:55.240
And then the local sensor should be hardwired.
1:47:55.240 --> 1:47:57.400
And then, could somebody hack into your CAN bus
1:47:57.400 --> 1:47:58.640
and turn your steering wheel or slam on your brakes?
1:47:58.640 --> 1:48:01.240
Yes, but they could do it before comma AI too, so.
1:48:02.800 --> 1:48:04.680
Let's think out of the box on some things.
1:48:04.680 --> 1:48:09.400
So do you think teleoperation has a role in any of this?
1:48:09.400 --> 1:48:13.880
So remotely stepping in and controlling the cars?
1:48:13.880 --> 1:48:18.880
No, I think that if the safety operation
1:48:21.320 --> 1:48:26.160
by design requires a constant link to the cars,
1:48:26.160 --> 1:48:27.560
I think it doesn't work.
1:48:27.560 --> 1:48:31.080
So that's the same argument used for V2I, V2V.
1:48:31.080 --> 1:48:34.280
Well, there's a lot of non safety critical stuff
1:48:34.280 --> 1:48:35.120
you can do with V2I.
1:48:35.120 --> 1:48:37.440
I like V2I, I like V2I way more than V2V
1:48:37.440 --> 1:48:39.280
because V2I is already like,
1:48:39.280 --> 1:48:40.880
I already have internet in the car, right?
1:48:40.880 --> 1:48:43.280
There's a lot of great stuff you can do with V2I.
1:48:44.280 --> 1:48:46.320
Like for example, you can,
1:48:46.320 --> 1:48:48.880
well, we already have V2I. Waze is V2I, right?
1:48:48.880 --> 1:48:50.520
Waze can route me around traffic jams.
1:48:50.520 --> 1:48:52.760
That's a great example of V2I.
1:48:52.760 --> 1:48:54.440
And then, okay, the car automatically talks
1:48:54.440 --> 1:48:55.800
to that same service, like it works.
1:48:55.800 --> 1:48:56.800
So it's improving the experience,
1:48:56.800 --> 1:48:59.480
but it's not a fundamental fallback for safety.
1:48:59.480 --> 1:49:04.160
No, if any of your things that require
1:49:04.160 --> 1:49:07.480
wireless communication are more than QM,
1:49:07.480 --> 1:49:10.640
like have an ASIL rating, you shouldn't.
[QM and ASIL A through D are the ISO 26262 automotive safety integrity levels; QM is the lowest.]
1:49:10.640 --> 1:49:14.200
You previously said that life is work
1:49:15.440 --> 1:49:17.480
and that you don't do anything to relax.
1:49:17.480 --> 1:49:20.800
So how do you think about hard work?
1:49:20.800 --> 1:49:22.200
Well, what is it?
1:49:22.200 --> 1:49:24.720
What do you think it takes to accomplish great things?
1:49:24.720 --> 1:49:25.840
And there's a lot of people saying
1:49:25.840 --> 1:49:28.280
that there needs to be some balance.
1:49:28.280 --> 1:49:29.600
You know, you need to,
1:49:29.600 --> 1:49:31.120
in order to accomplish great things,
1:49:31.120 --> 1:49:32.200
you need to take some time off,
1:49:32.200 --> 1:49:34.640
you need to reflect and so on.
1:49:34.640 --> 1:49:37.840
And then some people are just insanely working,
1:49:37.840 --> 1:49:39.640
burning the candle at both ends.
1:49:39.640 --> 1:49:41.360
How do you think about that?
1:49:41.360 --> 1:49:43.400
I think I was trolling in the Siraj interview
1:49:43.400 --> 1:49:45.600
when I said that off camera,
1:49:45.600 --> 1:49:47.240
but right before, I smoked a little bit of weed,
1:49:47.240 --> 1:49:49.800
like, you know, come on, this is a joke, right?
1:49:49.800 --> 1:49:50.880
Like I do nothing to relax.
1:49:50.880 --> 1:49:52.560
Look where I am, I'm at a party, right?
1:49:52.560 --> 1:49:53.960
Yeah, yeah, yeah, sure.
1:49:53.960 --> 1:49:55.200
That's true.
1:49:55.200 --> 1:49:58.040
So no, no, of course I don't.
1:49:58.040 --> 1:49:59.800
When I say that life is work though,
1:49:59.800 --> 1:50:01.960
I mean that like, I think that
1:50:01.960 --> 1:50:04.200
what gives my life meaning is work.
1:50:04.200 --> 1:50:05.720
I don't mean that every minute of the day
1:50:05.720 --> 1:50:06.560
you should be working.
1:50:06.560 --> 1:50:08.000
I actually think this is not the best way
1:50:08.000 --> 1:50:09.800
to maximize results.
1:50:09.800 --> 1:50:12.040
I think that if you're working 12 hours a day,
1:50:12.040 --> 1:50:14.920
you should be working smarter and not harder.
1:50:14.920 --> 1:50:17.880
Well, so work gives you meaning.
1:50:17.880 --> 1:50:20.520
For some people, other sorts of meaning
1:50:20.520 --> 1:50:24.560
are personal relationships, like family and so on.
1:50:24.560 --> 1:50:27.200
You've also, in that interview with Siraj,
1:50:27.200 --> 1:50:30.720
or the trolling mentioned that one of the things
1:50:30.720 --> 1:50:33.400
you look forward to in the future is AI girlfriends.
1:50:33.400 --> 1:50:34.360
Yes.
1:50:34.360 --> 1:50:38.800
So that's a topic that I'm very much fascinated by,
1:50:38.800 --> 1:50:39.840
not necessarily girlfriends,
1:50:39.840 --> 1:50:41.880
but just forming a deep connection with AI.
1:50:41.880 --> 1:50:42.960
Yeah.
1:50:42.960 --> 1:50:44.400
What kind of system do you imagine
1:50:44.400 --> 1:50:46.240
when you say AI girlfriend,
1:50:46.240 --> 1:50:47.800
whether you were trolling or not?
1:50:47.800 --> 1:50:49.720
No, that one I'm very serious about.
1:50:49.720 --> 1:50:52.360
And I'm serious about that on both a shallow level
1:50:52.360 --> 1:50:53.680
and a deep level.
1:50:53.680 --> 1:50:55.720
I think that VR brothels are coming soon
1:50:55.720 --> 1:50:57.800
and are gonna be really cool.
1:50:57.800 --> 1:50:59.760
It's not cheating if it's a robot.
1:50:59.760 --> 1:51:01.080
I see the slogan already.
1:51:01.080 --> 1:51:04.320
Um, but...
1:51:04.320 --> 1:51:06.200
There's a, I don't know if you've watched
1:51:06.200 --> 1:51:08.320
the just released Black Mirror episode.
1:51:08.320 --> 1:51:09.320
I watched the latest one, yeah.
1:51:09.320 --> 1:51:11.320
Yeah, yeah.
1:51:11.320 --> 1:51:13.160
Oh, the Ashley Too one?
1:51:13.160 --> 1:51:15.120
Or the...
1:51:15.120 --> 1:51:16.920
No, where there's two friends
1:51:16.920 --> 1:51:20.160
who were having sex with each other in...
1:51:20.160 --> 1:51:21.240
Oh, in the VR game.
1:51:21.240 --> 1:51:23.560
In the VR game, it's the two guys,
1:51:23.560 --> 1:51:26.720
but one of them was a female, yeah.
1:51:26.720 --> 1:51:27.560
Yeah, the...
1:51:27.560 --> 1:51:29.560
Which is another mind blowing concept.
1:51:29.560 --> 1:51:33.320
That in VR, you don't have to be your own form.
1:51:33.320 --> 1:51:37.720
You can be two animals having sex, it's weird.
1:51:37.720 --> 1:51:38.560
I mean, it'll depend on how nicely
1:51:38.560 --> 1:51:40.280
the software maps the nerve endings, right?
1:51:40.280 --> 1:51:41.600
Yeah, it's weird.
1:51:41.600 --> 1:51:44.480
I mean, yeah, they sweep a lot of the fascinating,
1:51:44.480 --> 1:51:46.440
really difficult technical challenges under the rug,
1:51:46.440 --> 1:51:48.360
like assuming it's possible
1:51:48.360 --> 1:51:51.160
to do the mapping of the nerve endings, then...
1:51:51.160 --> 1:51:52.000
I wish, yeah, I saw that.
1:51:52.000 --> 1:51:53.800
The way they did it with the little like stim unit
1:51:53.800 --> 1:51:55.400
on the head, that'd be amazing.
1:51:56.800 --> 1:51:58.760
So, well, no, no, on a shallow level,
1:51:58.760 --> 1:52:01.640
like you could set up like almost a brothel
1:52:01.640 --> 1:52:05.160
with, like, RealDolls and Oculus Quests,
1:52:05.160 --> 1:52:06.200
write some good software.
1:52:06.200 --> 1:52:08.280
I think it'd be a cool novelty experience.
1:52:09.280 --> 1:52:11.400
But no, on a deeper, like emotional level.
1:52:12.800 --> 1:52:16.960
I mean, yeah, I would really like to fall in love
1:52:16.960 --> 1:52:18.120
with the machine.
1:52:18.120 --> 1:52:23.120
Do you see yourself having a long term relationship
1:52:23.120 --> 1:52:27.520
of the kind, the monogamous kind, that we have now
1:52:27.520 --> 1:52:31.360
with the robot, with the AI system, even?
1:52:31.360 --> 1:52:32.680
Not even just the robot.
1:52:32.680 --> 1:52:37.680
So, I think about maybe my ideal future.
1:52:38.200 --> 1:52:43.200
When I was 15, I read Eliezer Yudkowsky's early writings
1:52:44.320 --> 1:52:49.120
on the singularity and like that AI
1:52:49.120 --> 1:52:53.040
is going to surpass human intelligence massively.
1:52:53.040 --> 1:52:55.480
He made some Moore's law based predictions
1:52:55.480 --> 1:52:57.400
that I mostly agree with.
1:52:57.400 --> 1:52:59.360
And then I really struggled
1:52:59.360 --> 1:53:01.360
for the next couple of years of my life.
1:53:01.360 --> 1:53:03.360
Like, why should I even bother to learn anything?
1:53:03.360 --> 1:53:06.160
It's all gonna be meaningless when the machines show up.
1:53:06.160 --> 1:53:07.000
Right.
1:53:07.000 --> 1:53:10.520
Well, maybe when I was that young,
1:53:10.520 --> 1:53:12.040
I was still a little bit more pure
1:53:12.040 --> 1:53:13.160
and really like clung to that.
1:53:13.160 --> 1:53:14.720
And then I'm like, well, the machines ain't here yet.
1:53:14.720 --> 1:53:16.800
You know, and I seem to be pretty good at this stuff.
1:53:16.800 --> 1:53:18.520
Let's try my best, you know,
1:53:18.520 --> 1:53:20.320
like what's the worst that happens?
1:53:20.320 --> 1:53:23.440
But the best possible future I see
1:53:23.440 --> 1:53:26.120
is me sort of merging with the machine.
1:53:26.120 --> 1:53:28.120
And the way that I personify this
1:53:28.120 --> 1:53:30.800
is in a long-term, monogamous relationship with the machine.
1:53:32.160 --> 1:53:33.320
Oh, you don't think there's room
1:53:33.320 --> 1:53:35.040
for another human in your life
1:53:35.040 --> 1:53:37.440
if you really truly merge with another machine?
1:53:38.440 --> 1:53:40.240
I mean, I see merging.
1:53:40.240 --> 1:53:44.240
I see like the best interface to my brain
1:53:45.520 --> 1:53:48.000
is like the same relationship interface
1:53:48.000 --> 1:53:49.320
to merge with an AI, right?
1:53:49.320 --> 1:53:51.440
What does that merging feel like?
1:53:52.440 --> 1:53:55.320
I've seen couples who've been together for a long time
1:53:55.320 --> 1:53:57.840
and like, I almost think of them as one person.
1:53:57.840 --> 1:54:01.280
Like couples who spend all their time together and...
1:54:01.280 --> 1:54:02.120
That's fascinating.
1:54:02.120 --> 1:54:03.320
You're actually putting,
1:54:03.320 --> 1:54:05.520
what does that merging actually look like?
1:54:05.520 --> 1:54:07.600
It's not just a nice channel.
1:54:07.600 --> 1:54:11.640
Like a lot of people imagine it's just an efficient link,
1:54:11.640 --> 1:54:13.800
a search link to Wikipedia or something.
1:54:13.800 --> 1:54:14.640
I don't believe in that.
1:54:14.640 --> 1:54:17.120
But it's more, you're saying that there's the same kind of,
1:54:17.120 --> 1:54:19.520
the same kind of relationship you have with another human
1:54:19.520 --> 1:54:22.960
as a deep relationship, that's what merging looks like.
1:54:22.960 --> 1:54:24.480
That's pretty...
1:54:24.480 --> 1:54:26.680
I don't believe that link is possible.
1:54:26.680 --> 1:54:28.120
I think that that link, so you're like,
1:54:28.120 --> 1:54:30.160
oh, I'm gonna download Wikipedia right to my brain.
1:54:30.160 --> 1:54:33.360
My reading speed is not limited by my eyes.
1:54:33.360 --> 1:54:36.800
My reading speed is limited by my inner processing loop.
1:54:36.800 --> 1:54:38.680
And to, like, bootstrap that,
1:54:38.680 --> 1:54:42.440
it sounds kind of unclear how to do it, and horrifying.
1:54:42.440 --> 1:54:46.560
But if I am with somebody, and I'll use somebody
1:54:46.560 --> 1:54:51.400
who is making a super sophisticated model of me
1:54:51.400 --> 1:54:53.200
and then running simulations on that model,
1:54:53.200 --> 1:54:54.120
I'm not gonna get into the question
1:54:54.120 --> 1:54:55.880
whether the simulations are conscious or not.
1:54:55.880 --> 1:54:58.240
I don't really wanna know what it's doing.
1:54:58.240 --> 1:55:01.600
But using those simulations to play out hypothetical futures
1:55:01.600 --> 1:55:04.880
for me, deciding what things to say to me
1:55:04.880 --> 1:55:08.720
to guide me along a path and that's how I envision it.
1:55:08.720 --> 1:55:13.720
So on that path to AI of superhuman level intelligence,
1:55:13.720 --> 1:55:15.680
you've mentioned that you believe in the singularity,
1:55:15.680 --> 1:55:17.280
that singularity is coming.
1:55:17.280 --> 1:55:20.440
Again, could be trolling, could be not, could be part...
1:55:20.440 --> 1:55:21.760
All trolling has truth in it.
1:55:21.760 --> 1:55:22.840
I don't know what that means anymore.
1:55:22.840 --> 1:55:24.520
What is the singularity?
1:55:24.520 --> 1:55:26.720
So yeah, so that's really the question.
1:55:26.720 --> 1:55:29.280
How many years do you think before the singularity,
1:55:29.280 --> 1:55:30.920
and what form do you think it will take?
1:55:30.920 --> 1:55:34.200
Does that mean fundamental shifts in capabilities of AI?
1:55:34.200 --> 1:55:36.960
Does it mean some other kind of ideas?
1:55:36.960 --> 1:55:40.120
Maybe that's just my roots, but...
1:55:40.120 --> 1:55:42.920
So I can buy a human being's worth of compute,
1:55:42.920 --> 1:55:46.000
a brain's worth of compute, for like a million bucks today.
1:55:46.000 --> 1:55:47.800
It's about one TPU pod V3.
1:55:47.800 --> 1:55:50.240
I want, like, I think they claim a hundred petaflops.
1:55:50.240 --> 1:55:51.080
That's being generous.
1:55:51.080 --> 1:55:52.320
I think humans are actually more like 20.
1:55:52.320 --> 1:55:53.160
So that's like five humans.
1:55:53.160 --> 1:55:54.040
That's pretty good.
1:55:54.040 --> 1:55:55.560
Google needs to sell their TPUs.
1:55:56.840 --> 1:55:58.640
But no, I could buy GPUs.
1:55:58.640 --> 1:56:02.280
I could buy a stack of, like, 1080 Tis,
1:56:02.280 --> 1:56:03.880
build data center full of them.
1:56:03.880 --> 1:56:07.280
And for a million bucks, I can get a human worth of compute.
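The arithmetic here is simple enough to spell out; the 100 petaflops (claimed TPU v3 pod peak) and 20 petaflops (his estimate for a human) are just the rough figures quoted in the conversation, not measurements:

```python
# Back-of-envelope using the numbers quoted in the conversation (all rough).
TPU_V3_POD_PFLOPS = 100  # claimed peak of one TPU v3 pod, in petaflops
HUMAN_PFLOPS = 20        # George's generous estimate of one human brain

print(TPU_V3_POD_PFLOPS / HUMAN_PFLOPS)  # 5.0 -> "that's like five humans"
```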
1:56:08.160 --> 1:56:12.280
But when you look at the total number of flops in the world,
1:56:12.280 --> 1:56:14.400
when you look at human flops,
1:56:14.400 --> 1:56:17.040
which goes up very, very slowly with the population,
1:56:17.040 --> 1:56:19.760
and machine flops, which goes up exponentially,
1:56:19.760 --> 1:56:22.360
but it's still nowhere near.
1:56:22.360 --> 1:56:24.040
I think that's the key thing
1:56:24.040 --> 1:56:25.880
to talk about for when the singularity happens.
1:56:25.880 --> 1:56:28.560
When most flops in the world are silicon
1:56:28.560 --> 1:56:32.280
and not biological, that's kind of the crossing point.
1:56:32.280 --> 1:56:35.480
Like they are now the dominant species on the planet.
1:56:35.480 --> 1:56:38.720
And just looking at how technology is progressing,
1:56:38.720 --> 1:56:40.360
when do you think that could possibly happen?
1:56:40.360 --> 1:56:41.680
Do you think it would happen in your lifetime?
1:56:41.680 --> 1:56:43.640
Oh yeah, definitely in my lifetime.
1:56:43.640 --> 1:56:44.480
I've done the math.
1:56:44.480 --> 1:56:47.560
I like 2038 because that's the UNIX timestamp rollover.
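For context, the rollover being referenced is signed 32-bit UNIX time overflowing in January 2038, which is easy to verify:

```python
from datetime import datetime, timezone

# time_t as a signed 32-bit integer counts seconds since 1970-01-01 UTC
# and tops out at 2**31 - 1; one second later the counter wraps negative.
print(datetime.fromtimestamp(2**31 - 1, tz=timezone.utc))
# -> 2038-01-19 03:14:07+00:00
```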
1:56:49.920 --> 1:56:51.840
Yeah, beautifully put.
1:56:52.680 --> 1:56:57.680
So you've said that the meaning of life is to win.
1:56:58.000 --> 1:56:59.560
If you look five years into the future,
1:56:59.560 --> 1:57:01.000
what does winning look like?
1:57:02.640 --> 1:57:03.720
So...
1:57:03.720 --> 1:57:08.720
I can go into technical depth on what I mean by that, to win.
1:57:11.720 --> 1:57:12.720
It may not mean...
1:57:12.720 --> 1:57:14.400
I was criticized for that in the comments.
1:57:14.400 --> 1:57:17.720
Like, doesn't this guy want to save the penguins in Antarctica?
1:57:17.720 --> 1:57:20.960
Or like, oh man, listen to what I'm saying.
1:57:20.960 --> 1:57:23.720
I'm not talking about like I have a yacht or something.
1:57:24.720 --> 1:57:26.720
I am an agent.
1:57:26.720 --> 1:57:28.720
I am put into this world.
1:57:28.720 --> 1:57:32.720
And I don't really know what my purpose is.
1:57:33.720 --> 1:57:36.720
But if you're a reinforcement learning agent, if you're an intelligent agent
1:57:36.720 --> 1:57:39.720
and you're put into a world, what is the ideal thing to do?
1:57:39.720 --> 1:57:41.720
Well, the ideal thing, mathematically,
1:57:41.720 --> 1:57:43.720
you can go back to, like, Schmidhuber's theories about this,
1:57:43.720 --> 1:57:46.720
is to build a compressive model of the world.
1:57:46.720 --> 1:57:49.720
To build a maximally compressive model, to explore the world
1:57:49.720 --> 1:57:52.720
such that your exploration function maximizes
1:57:52.720 --> 1:57:55.720
the derivative of compression of the past.
1:57:55.720 --> 1:57:58.720
Schmidhuber has a paper about this.
1:57:58.720 --> 1:58:01.720
And like, I took that kind of as like a personal goal function.
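Roughly, the formalism being gestured at, paraphrasing Schmidhuber's compression-progress work rather than quoting it: the intrinsic reward at time t is how much the compressor just improved on the history so far.

```latex
% Compression progress as intrinsic (curiosity) reward, paraphrased:
% C(h_{\le t}; \theta) = bits the compressor with parameters \theta
% needs to encode the history up to time t.
r_t \;=\; C(h_{\le t};\, \theta_{t-1}) \;-\; C(h_{\le t};\, \theta_t)
% Exploration then picks actions maximizing expected future r_t --
% the "derivative of compression of the past."
```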
1:58:02.720 --> 1:58:04.720
So what I mean by win, I mean like,
1:58:04.720 --> 1:58:08.720
maybe this is religious, but like I think that in the future
1:58:08.720 --> 1:58:10.720
I might be given a real purpose.
1:58:10.720 --> 1:58:12.720
Or I may decide this purpose myself.
1:58:12.720 --> 1:58:14.720
And then at that point, now I know what the game is
1:58:14.720 --> 1:58:15.720
and I know how to win.
1:58:15.720 --> 1:58:18.720
I think right now I'm still just trying to figure out what the game is.
1:58:18.720 --> 1:58:19.720
But once I know...
1:58:20.720 --> 1:58:22.720
So you have...
1:58:22.720 --> 1:58:25.720
You have imperfect information.
1:58:25.720 --> 1:58:27.720
You have a lot of uncertainty about the reward function
1:58:27.720 --> 1:58:28.720
and you're discovering it.
1:58:28.720 --> 1:58:29.720
Exactly.
1:58:29.720 --> 1:58:30.720
But the purpose is...
1:58:30.720 --> 1:58:31.720
That's a better way to put it.
1:58:31.720 --> 1:58:33.720
The purpose is to maximize it
1:58:33.720 --> 1:58:36.720
while you have a lot of uncertainty around it.
1:58:36.720 --> 1:58:38.720
And you're both reducing the uncertainty
1:58:38.720 --> 1:58:40.720
and maximizing at the same time.
1:58:40.720 --> 1:58:43.720
And so that's at the technical level.
1:58:43.720 --> 1:58:44.720
What is the...
1:58:44.720 --> 1:58:46.720
If you believe in the universal prior,
1:58:46.720 --> 1:58:48.720
what is the universal reward function?
1:58:48.720 --> 1:58:50.720
That's the better way to put it.
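One standard formalization of an ideal agent under the universal prior is Hutter's AIXI, sketched here as background rather than as anything stated in the conversation: actions maximize expected reward over all computable environments q, weighted by the Solomonoff prior 2^(-l(q)).

```latex
% AIXI (Hutter): expectimax over environments weighted by 2^{-\ell(q)},
% where U is a universal Turing machine and \ell(q) the program length.
a_k \;=\; \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m}
    \big( r_k + \cdots + r_m \big)
    \sum_{q\,:\,U(q,\,a_1 \ldots a_m)\,=\,o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}
```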
1:58:50.720 --> 1:58:53.720
So that win is interesting.
1:58:53.720 --> 1:58:56.720
I think I speak for everyone in saying that
1:58:56.720 --> 1:59:01.720
I wonder what that reward function is for you.
1:59:01.720 --> 1:59:06.720
And I look forward to seeing that in five years and ten years.
1:59:06.720 --> 1:59:09.720
I think a lot of people including myself are cheering you on, man.
1:59:09.720 --> 1:59:11.720
So I'm happy you exist.
1:59:11.720 --> 1:59:13.720
And I wish you the best of luck.
1:59:13.720 --> 1:59:14.720
Thanks for talking today, man.
1:59:14.720 --> 1:59:15.720
Thank you.
1:59:15.720 --> 1:59:20.720
This was a lot of fun.