WEBVTT

00:00.000 --> 00:02.800
The following is a conversation with Colin Angle.

00:02.800 --> 00:05.840
He's the CEO and cofounder of iRobot,

00:05.840 --> 00:09.680
a robotics company that for 29 years has been creating robots

00:09.680 --> 00:12.600
that operate successfully in the real world.

00:12.600 --> 00:15.600
Not as a demo or on a scale of dozens,

00:15.600 --> 00:18.880
but on a scale of thousands and millions.

00:18.880 --> 00:24.320
As of this year, iRobot has sold more than 25 million robots

00:24.320 --> 00:28.200
to consumers, including the Roomba vacuum cleaning robot,

00:28.200 --> 00:30.000
the Braava floor mopping robot,

00:30.000 --> 00:34.000
and soon the Terra lawn mowing robot.

00:34.000 --> 00:37.680
25 million robots successfully operating autonomously

00:37.680 --> 00:39.680
in real people's homes.

00:39.680 --> 00:42.120
To me, it's an incredible accomplishment

00:42.120 --> 00:45.120
of science, engineering, logistics,

00:45.120 --> 00:48.800
and all kinds of general entrepreneurial innovation.

00:48.800 --> 00:51.360
Most robotics companies fail.

00:51.360 --> 00:56.880
iRobot has survived and succeeded for 29 years.

00:56.880 --> 01:00.160
I spent all day at iRobot, including a long tour

01:00.160 --> 01:03.520
and conversation with Colin about the history of iRobot,

01:03.520 --> 01:06.720
and then sat down for this podcast conversation

01:06.720 --> 01:08.560
that would have been much longer

01:08.560 --> 01:10.760
if I didn't spend all day learning about

01:10.760 --> 01:13.960
and playing with the various robots in the company's history.

01:13.960 --> 01:17.480
I'll release the video of the tour separately.

01:17.480 --> 01:20.720
Colin, iRobot, its founding team,

01:20.720 --> 01:23.200
its current team, and its mission

01:23.200 --> 01:26.200
have been and continue to be an inspiration to me

01:26.200 --> 01:28.880
and thousands of engineers who are working hard

01:28.880 --> 01:33.000
to create AI systems that help real people.

01:33.000 --> 01:35.640
This is the Artificial Intelligence Podcast.

01:35.640 --> 01:38.000
If you enjoy it, subscribe on YouTube,

01:38.000 --> 01:41.240
give it five stars on iTunes, support it on Patreon,

01:41.240 --> 01:43.280
or simply connect with me on Twitter

01:43.280 --> 01:47.120
at Lex Fridman, spelled F R I D M A N.

01:47.120 --> 01:51.080
And now, here's my conversation with Colin Angle.

01:52.120 --> 01:55.120
In his 1942 short story, Runaround,

01:55.120 --> 01:58.520
from his I, Robot collection, Asimov

01:59.400 --> 02:02.840
proposed the three laws of robotics, in order:

02:02.840 --> 02:06.800
don't harm humans, obey orders, protect yourself.

02:06.800 --> 02:07.640
So two questions.

02:07.640 --> 02:11.640
First, does the Roomba follow these three laws?

02:11.640 --> 02:14.760
And also, more seriously,

02:14.760 --> 02:17.120
what role do you hope to see robots take

02:17.120 --> 02:20.280
in modern society and in the future world?

02:20.280 --> 02:25.280
So the three laws are very thought provoking

02:25.720 --> 02:30.720
and require such a profound understanding

02:31.360 --> 02:36.280
of the world a robot lives in,

02:36.280 --> 02:38.360
the ramifications of its actions

02:38.360 --> 02:40.040
and its own sense of self,

02:40.040 --> 02:45.040
that it's not a relevant bar,

02:46.640 --> 02:48.360
at least it won't be a relevant bar

02:48.360 --> 02:50.160
for decades to come.

02:50.160 --> 02:54.560
And so, if Roomba follows the three laws,

02:54.560 --> 02:56.840
and I believe it does,

02:58.040 --> 03:00.920
it is designed to help humans, not hurt them,

03:00.920 --> 03:03.120
it's designed to be inherently safe,

03:03.120 --> 03:05.960
and we design it to last a long time.

03:07.200 --> 03:11.600
It's not through any AI or intent on the robot's part.

03:11.600 --> 03:14.960
It's because following the three laws

03:14.960 --> 03:18.200
is aligned with being a good robot product.

03:19.600 --> 03:23.120
So I guess it does,

03:23.120 --> 03:27.240
but not by explicit design.

03:27.240 --> 03:28.800
So then the bigger picture,

03:28.800 --> 03:31.560
what role do you hope to see robotics,

03:31.560 --> 03:36.560
robots take in what's currently mostly a world of humans?

03:37.360 --> 03:42.360
We need robots to help us continue

03:42.360 --> 03:44.960
to improve our standard of living.

03:46.160 --> 03:51.160
We need robots because the average age of humanity

03:52.840 --> 03:55.040
is increasing very quickly,

03:55.040 --> 03:59.760
and simply the number of people young enough

03:59.760 --> 04:02.480
and spry enough to care for the elder,

04:03.880 --> 04:07.880
growing demographic is inadequate.

04:08.800 --> 04:11.440
And so what is the role of robots?

04:11.440 --> 04:15.000
Today, the role is to make our lives a little easier,

04:15.000 --> 04:17.400
a little cleaner, maybe a little healthier.

04:18.520 --> 04:22.160
But in time, robots are gonna be the difference

04:22.160 --> 04:25.720
between real gut-wrenching declines

04:25.720 --> 04:28.360
in our ability to live independently

04:28.360 --> 04:30.280
and maintain our standard of living,

04:30.280 --> 04:34.920
and a future that is the bright one

04:34.920 --> 04:37.520
where we have more control of our lives,

04:37.520 --> 04:40.560
can spend more of our time focused

04:40.560 --> 04:43.280
on activities we choose.

04:44.680 --> 04:48.840
And I'm so honored and excited to be

04:48.840 --> 04:50.520
playing a role in that journey.

04:50.520 --> 04:51.800
So you've given me a tour,

04:51.800 --> 04:54.280
you showed me some of the long history now,

04:54.280 --> 04:57.280
29 years that iRobot has been at it,

04:57.280 --> 04:59.280
creating some incredible robots.

04:59.280 --> 05:01.280
You showed me PackBot,

05:01.280 --> 05:03.200
you showed me a bunch of other stuff

05:03.200 --> 05:04.600
that led up to Roomba,

05:04.600 --> 05:08.760
that led to Braava and Terra.

05:08.760 --> 05:13.760
So let's skip that incredible history

05:14.080 --> 05:15.080
in the interest of time,

05:15.080 --> 05:16.160
because we already talked about it,

05:16.160 --> 05:18.080
I'll show this incredible footage.

05:18.080 --> 05:22.680
You mentioned elderly and robotics and society.

05:22.680 --> 05:26.320
I think the home is a fascinating place for robots to be.

05:26.320 --> 05:29.840
So where do you see robots in the home?

05:29.840 --> 05:31.680
Currently, I would say, once again,

05:31.680 --> 05:34.560
probably most homes in the world don't have a robot.

05:34.560 --> 05:36.200
So how do you see that changing?

05:36.200 --> 05:39.880
Where do you think is the big initial value add

05:39.880 --> 05:41.960
that robots can do?

05:41.960 --> 05:46.960
So iRobot has sort of over the years narrowed in on the home,

05:48.160 --> 05:53.160
the consumer's home as the place where we want to innovate

05:53.160 --> 05:58.160
and deliver tools that will help a home be

06:00.840 --> 06:04.280
a more automatically maintained place,

06:04.280 --> 06:06.840
a healthier place, a safer place

06:06.840 --> 06:11.520
and perhaps even a more efficient place to be.

06:11.520 --> 06:16.520
And today we vacuum, we mop, soon we'll be mowing your lawn,

06:17.000 --> 06:21.520
but where things are going is,

06:22.760 --> 06:27.120
when do we get to the point where the home,

06:27.120 --> 06:29.160
not just the robots that live in your home,

06:29.160 --> 06:32.200
but the home itself becomes part of a system

06:32.200 --> 06:36.000
that maintains itself and plays an active role

06:36.000 --> 06:40.800
in caring for and helping the people living in that home.

06:40.800 --> 06:43.240
And I see everything that we're doing

06:43.240 --> 06:46.200
as steps along the path toward that future.

06:46.200 --> 06:47.760
So what are the steps?

06:47.760 --> 06:51.760
So if we can summarize some of the history of Roomba,

06:53.280 --> 06:55.560
you've mentioned, and maybe you can elaborate on it,

06:55.560 --> 06:57.280
but you mentioned that the early days

06:57.280 --> 07:02.280
were really taking a robot from something that works

07:02.360 --> 07:04.920
either in the lab or something that works in the field

07:04.920 --> 07:09.920
that helps soldiers do the difficult work they do

07:10.240 --> 07:12.680
to actually be in the hands of consumers

07:12.680 --> 07:15.680
and tens of thousands, hundreds of thousands of robots

07:15.680 --> 07:18.520
that don't break down, or how much people love them

07:18.520 --> 07:21.480
over months of very extensive use.

07:21.480 --> 07:22.880
So that was the big first step.

07:22.880 --> 07:26.040
And then the second big step was the ability

07:26.040 --> 07:29.920
to sense the environment, to build a map, to localize,

07:29.920 --> 07:32.600
to be able to build a picture of the home

07:32.600 --> 07:34.640
that the human can then attach labels to

07:34.640 --> 07:38.440
in terms of giving some semantic knowledge

07:38.440 --> 07:40.920
to the robot about its environment.

07:40.920 --> 07:44.880
Okay, so that's like a huge, two big, huge steps.

07:46.280 --> 07:47.560
Maybe you can comment on them,

07:47.560 --> 07:51.080
but also what is the next step

07:51.080 --> 07:54.760
of making a robot part of the home?

07:54.760 --> 07:55.600
Sure.

07:55.600 --> 08:00.600
So the goal is to make a home that takes care of itself,

08:01.320 --> 08:03.720
takes care of the people in the home

08:03.720 --> 08:07.880
and gives a user an experience of just living their life

08:07.880 --> 08:10.880
and the home is somehow doing the right thing,

08:10.880 --> 08:14.160
turning on and off lights when you leave,

08:14.160 --> 08:17.280
cleaning up the environment.

08:17.280 --> 08:22.280
And we went from robots that were right in the lab,

08:22.280 --> 08:26.760
but were both too expensive and not sufficiently capable

08:26.760 --> 08:30.480
to ever do an acceptable job of anything

08:30.480 --> 08:34.080
other than being a toy or a curio in your home

08:34.080 --> 08:39.080
to something that was both affordable

08:38.880 --> 08:42.440
and sufficiently effective to drive,

08:42.440 --> 08:45.360
be above threshold and drive purchase intent.

08:47.520 --> 08:50.000
Now we've disrupted the intent of the work

08:50.000 --> 08:54.400
and now we've disrupted the entire vacuuming industry.

08:55.480 --> 08:59.840
The number one selling vacuums, for example, in the US

08:59.840 --> 09:02.480
are Roombas, so not robot vacuums,

09:02.480 --> 09:05.160
but vacuums, and that's really crazy and weird.

09:05.160 --> 09:08.080
We need to pause that, I mean, that's incredible.

09:08.080 --> 09:10.520
That's incredible that a robot

09:10.520 --> 09:15.520
is the number one selling thing that does something.

09:15.560 --> 09:16.400
Yep.

09:16.400 --> 09:18.240
Something as essential as vacuuming.

09:18.240 --> 09:20.120
So we're... Congratulations.

09:20.120 --> 09:20.960
Thank you.

09:20.960 --> 09:22.440
It's still kind of fun to say,

09:22.440 --> 09:26.600
but just because this was a crazy idea

09:26.600 --> 09:30.920
that just started in a room here,

09:30.920 --> 09:33.800
we're like, do you think we can do this?

09:35.000 --> 09:36.160
Hey, let's give it a try.

09:38.000 --> 09:40.440
But now the robots are starting

09:40.440 --> 09:42.840
to understand their environment.

09:42.840 --> 09:45.280
And if you think about the next step,

09:45.280 --> 09:47.880
there's two dimensions.

09:48.720 --> 09:53.040
I've been working so hard since the beginning of iRobot

09:53.040 --> 09:55.080
to make robots that are autonomous,

09:55.080 --> 09:59.120
that they're smart enough and understand their task enough

09:59.120 --> 10:02.480
that they can just go do it without human involvement.

10:04.160 --> 10:07.440
Now what I'm really excited about and working on

10:07.440 --> 10:09.520
is how do I make them less autonomous?

10:09.520 --> 10:14.520
Meaning that the robot is supposed to be your partner,

10:15.080 --> 10:20.080
not this automaton that just goes and does what a robot does.

10:20.240 --> 10:22.440
And so that if you tell it,

10:22.440 --> 10:27.160
hey, I just dropped some flour by the fridge in the kitchen.

10:27.160 --> 10:29.000
Can you deal with it?

10:29.000 --> 10:31.840
Wouldn't it be awesome if the right thing just happened

10:31.840 --> 10:35.240
based on that utterance.

10:35.240 --> 10:37.960
And to some extent that's less autonomous

10:37.960 --> 10:40.120
because it's actually listening to you,

10:40.120 --> 10:44.400
understanding the context and intent of the sentence,

10:44.400 --> 10:49.400
mapping it against its understanding of the home it lives in

10:50.040 --> 10:52.680
and knowing what to do.

10:52.680 --> 10:56.360
And so that's an area of research.

10:56.360 --> 10:59.360
It's an area where we're starting to roll out features.

10:59.360 --> 11:02.880
You can now tell your robot to clean up the kitchen

11:02.880 --> 11:05.880
and it knows what the kitchen is and can do that.

11:05.880 --> 11:09.360
And that's sort of 1.0 of where we're going.

11:10.440 --> 11:12.480
The other cool thing is that

11:12.480 --> 11:14.640
we're starting to know where stuff is.

11:14.640 --> 11:16.040
And why is that important?

11:16.040 --> 11:21.040
Well, robots are supposed to have arms, right?

11:21.520 --> 11:24.200
Data had an arm, Rosie had an arm,

11:24.200 --> 11:25.240
Robby the Robot had an arm.

11:25.240 --> 11:26.680
I mean, robots are, you know,

11:26.680 --> 11:29.560
they are physical things that move around in an environment

11:29.560 --> 11:31.280
they're supposed to, like, do work.

11:31.280 --> 11:34.120
And if you think about it,

11:34.120 --> 11:35.440
if a robot doesn't know anything,

11:35.440 --> 11:38.760
where anything is, why should it have an arm?

11:38.760 --> 11:43.760
But with this new dawn of home understanding

11:44.320 --> 11:47.680
that we're starting to go enjoy,

11:47.680 --> 11:49.320
I know where the kitchen is.

11:49.320 --> 11:52.040
I might in the future know where the refrigerator is.

11:52.040 --> 11:55.280
I might, if I had an arm, be able to find the handle,

11:55.280 --> 11:58.440
open it and even get myself a beer.

11:58.440 --> 12:01.920
Obviously that's one of the true dreams of robotics

12:01.920 --> 12:03.520
is to have robots bringing us a beer

12:03.520 --> 12:05.280
while we watch television.

12:05.280 --> 12:10.280
But I think that that new category of tasks

12:11.000 --> 12:14.240
where physical manipulation, robot arms

12:14.240 --> 12:19.240
is just a potpourri of new opportunity and excitement.

12:20.200 --> 12:23.800
And you see humans as a crucial part of that.

12:23.800 --> 12:26.280
So you kind of mentioned that.

12:26.280 --> 12:28.960
And I personally find that a really compelling idea.

12:28.960 --> 12:33.960
I think full autonomy can only take us so far,

12:33.960 --> 12:35.360
especially in the home.

12:35.360 --> 12:38.920
So you see humans as helping the robot understand

12:38.920 --> 12:42.600
or give deeper meaning to the spatial information.

12:43.600 --> 12:45.680
Right, it's a partnership.

12:46.800 --> 12:51.800
The robot is supposed to operate according to descriptors

12:52.800 --> 12:55.620
that you would use to describe your own home.

12:57.040 --> 13:02.040
The robot is supposed to, in lieu of better direction,

13:02.040 --> 13:03.840
kind of go about its routine,

13:03.840 --> 13:07.520
which ought to be basically right

13:07.520 --> 13:12.200
and lead to a home maintained

13:12.200 --> 13:14.840
in a way that it's learned you like,

13:14.840 --> 13:19.840
but also be perpetually ready to take direction

13:21.440 --> 13:26.440
that would activate a different set of behaviors or actions

13:26.880 --> 13:28.880
to meet a current need

13:28.880 --> 13:32.320
to the extent it could actually perform that task.

13:32.320 --> 13:33.560
So I gotta ask you,

13:33.560 --> 13:36.960
I think this is a fundamental and a fascinating question

13:36.960 --> 13:39.760
because iRobot has been a successful company

13:39.760 --> 13:42.320
and a rare successful robotics company.

13:42.320 --> 13:46.720
So Anki, Jibo, Mayfield Robotics with its robot Kuri,

13:46.720 --> 13:49.160
CyPhy Works, Rethink Robotics.

13:49.160 --> 13:51.320
These were robotics companies that were founded

13:51.320 --> 13:52.920
and run by brilliant people.

13:54.000 --> 13:58.760
But all, very unfortunately, at least for us roboticists,

13:58.760 --> 14:02.120
went out of business recently.

14:02.120 --> 14:05.160
So why do you think they didn't last longer?

14:05.160 --> 14:07.000
Why do you think it is so hard

14:07.000 --> 14:10.680
to keep a robotics company alive?

14:10.680 --> 14:14.160
You know, I say this only partially in jest

14:14.160 --> 14:16.760
that back in the day before Roomba,

14:17.840 --> 14:22.840
you know, I was a high tech entrepreneur building robots,

14:23.920 --> 14:26.440
but it wasn't until I became a vacuum cleaner salesman

14:26.440 --> 14:29.420
that we had any success.

14:29.420 --> 14:34.240
So I mean, the point is technology alone

14:34.240 --> 14:37.680
doesn't equal a successful business.

14:37.680 --> 14:42.680
We need to go and find the compelling need

14:43.600 --> 14:45.920
where the robot that we're creating

14:47.640 --> 14:52.640
can deliver clearly more value

14:52.800 --> 14:55.440
to the end user than it costs.

14:55.440 --> 14:59.040
And this is not a marginal thing

14:59.040 --> 15:01.800
where you're looking at the skin, like, it's close.

15:01.800 --> 15:04.380
Maybe we can hold our breath and make it work.

15:04.380 --> 15:09.380
It's clearly more value than the cost of the robot

15:11.560 --> 15:13.920
to bring, you know, in the store.

15:13.920 --> 15:17.360
And I think that the challenge has been finding

15:17.360 --> 15:22.360
those businesses where that's true

15:22.360 --> 15:26.400
in a sustainable fashion.

15:28.240 --> 15:33.240
You know, when you get into entertainment style things,

15:34.600 --> 15:38.240
you could be the cat's meow one year,

15:38.240 --> 15:42.440
but 85% of toys, regardless of their merit,

15:43.400 --> 15:45.640
fail to make it to their second season.

15:45.640 --> 15:47.720
It's just super hard to do so.

15:47.720 --> 15:52.720
And so that's just a tough business.

15:53.800 --> 15:58.800
And there's been a lot of experimentation around

15:59.200 --> 16:02.640
what is the right type of social companion?

16:02.640 --> 16:05.840
What is the right robot in the home

16:05.840 --> 16:10.840
that is doing something other than tasks people do every week

16:14.560 --> 16:16.400
that they'd rather not do?

16:16.400 --> 16:20.880
And I'm not sure we've got it all figured out right.

16:20.880 --> 16:22.960
And so you get brilliant roboticists

16:22.960 --> 16:25.680
with super interesting robots

16:25.680 --> 16:29.800
that ultimately don't quite have

16:29.800 --> 16:34.040
that magical user experience and thus the,

16:36.480 --> 16:40.560
that value benefit equation remains ambiguous.

16:40.560 --> 16:43.960
So you as somebody who dreams of robots,

16:43.960 --> 16:45.720
you know, changing the world,

16:45.720 --> 16:46.880
what's your estimate?

16:47.840 --> 16:52.840
Why, how big is the space of applications

16:53.200 --> 16:55.800
that fit the criteria that you just described

16:55.800 --> 16:58.080
where you can really demonstrate

16:58.080 --> 17:00.520
an obvious significant value

17:00.520 --> 17:04.620
over the alternative non-robotic solution?

17:05.720 --> 17:08.680
Well, I think that we're just about none of the way

17:10.000 --> 17:13.360
to achieving the potential of robotics at home.

17:13.360 --> 17:18.360
But we have to do it in a really eyes wide open,

17:20.400 --> 17:21.920
honest fashion.

17:21.920 --> 17:25.400
And so another way to put that is the potential is infinite

17:25.400 --> 17:27.040
because we did take a few steps

17:27.040 --> 17:29.640
but you're saying those steps are just very initial steps.

17:29.640 --> 17:32.560
So the Roomba is a hugely successful product

17:32.560 --> 17:34.440
but you're saying that's just the very, very beginning.

17:34.440 --> 17:36.520
That's just the very, very beginning.

17:36.520 --> 17:37.960
It's the foot in the door.

17:37.960 --> 17:40.840
And, you know, I think I was lucky

17:40.840 --> 17:45.120
that in the early days of robotics,

17:45.120 --> 17:48.360
people would ask me, when are you gonna clean my floor?

17:48.360 --> 17:52.240
It was something that I grew up saying,

17:53.560 --> 17:54.800
I got all these really good ideas

17:54.800 --> 17:58.080
but everyone seems to want their floor clean.

17:58.080 --> 18:02.480
And so maybe we should do that.

18:02.480 --> 18:03.320
Yeah, your good ideas.

18:03.320 --> 18:05.840
Earn the right to do the next thing after that.

18:05.840 --> 18:10.160
So the good ideas have to match with the desire of the people

18:10.160 --> 18:13.280
and then the actual cost has to, like the business,

18:13.280 --> 18:16.640
the financial aspect has to all match together.

18:16.640 --> 18:20.160
Yeah, during our partnership

18:20.160 --> 18:22.200
back a number of years ago with Johnson Wax,

18:22.200 --> 18:27.200
they would explain to me that they would go into homes

18:29.480 --> 18:32.520
and just watch how people lived

18:32.520 --> 18:35.560
and try to figure out what they were doing

18:35.560 --> 18:39.960
that they really didn't like to do

18:39.960 --> 18:42.440
but they had to do it frequently enough

18:42.440 --> 18:47.440
that it was top of mind and understood as a burden.

18:51.720 --> 18:53.000
Hey, let's make a product

18:54.480 --> 18:58.480
or come up with a solution to make that pain point

18:58.480 --> 19:02.040
less challenging.

19:02.040 --> 19:07.040
And sometimes we do certain burdens so often as a society

19:07.400 --> 19:09.400
that we actually don't even realize,

19:09.400 --> 19:10.680
like it's actually hard to see

19:10.680 --> 19:13.200
that that burden is something that could be removed.

19:13.200 --> 19:15.760
So it does require just going into the home

19:15.760 --> 19:19.560
and staring at, wait, how do I actually live life?

19:19.560 --> 19:21.080
What are the pain points?

19:21.080 --> 19:26.080
Yeah, and getting those insights is a lot harder

19:26.400 --> 19:29.360
than it would seem it should be in retrospect.

19:29.360 --> 19:33.120
So how hard, on that point,

19:33.120 --> 19:37.480
I mean, one of the big challenges of robotics

19:37.480 --> 19:40.240
is driving the cost to something,

19:40.240 --> 19:42.240
driving the cost down to something

19:42.240 --> 19:45.680
that consumers, people could afford.

19:45.680 --> 19:48.880
So people would be less likely to buy a Roomba

19:48.880 --> 19:52.280
if it costs $500,000, right?

19:52.280 --> 19:55.840
Which is probably sort of what a Roomba would cost

19:55.840 --> 19:58.040
several decades ago.

19:58.040 --> 20:02.200
So how do you drive, which I mentioned is very difficult,

20:02.200 --> 20:04.200
how do you drive the cost of a Roomba

20:04.200 --> 20:07.920
or a robot down such that people would want to buy it?

20:07.920 --> 20:09.720
When I started building robots,

20:09.720 --> 20:12.240
the cost of the robot had a lot to do

20:12.240 --> 20:15.480
with the amount of time it took to build it.

20:15.480 --> 20:18.400
And so we would build our robots out of aluminum,

20:18.400 --> 20:21.160
I would go spend my time in the machine shop

20:21.160 --> 20:22.840
on the milling machine,

20:23.840 --> 20:28.000
cutting out the parts and so forth.

20:28.000 --> 20:29.720
And then when we got into the toy industry,

20:29.720 --> 20:34.520
I realized that if we were building at scale,

20:34.520 --> 20:36.000
I could determine the cost of the Roomba

20:36.000 --> 20:38.920
not by adding up all the hours to mill out the parts,

20:38.920 --> 20:40.360
but by weighing it.

20:42.080 --> 20:44.200
And that's liberating.

20:44.200 --> 20:49.200
You can say, wow, the world has just changed

20:49.560 --> 20:53.160
as I think about construction in a different way.

20:53.160 --> 20:56.880
The 3D CAD tools that are available to us today,

20:56.880 --> 21:01.680
the operating at scale where I can do tooling

21:01.680 --> 21:06.680
and injection mold an arbitrarily complicated part,

21:07.080 --> 21:09.800
and the cost is going to be basically

21:09.800 --> 21:12.560
the weight of the plastic in that part

21:13.920 --> 21:16.360 |
|
is incredibly exciting and liberating |
|
|
|
21:16.360 --> 21:18.560 |
|
and opens up all sorts of opportunities. |
|
|
|
21:18.560 --> 21:23.560 |
|
And for the sensing part of it, where we are today |
|
|
|
21:23.560 --> 21:28.560 |
|
is instead of trying to build skin, |
|
|
|
21:29.280 --> 21:31.440 |
|
which is like really hard for a long time. |
|
|
|
21:31.440 --> 21:36.440 |
|
I spent creating strategies and ideas |
|
|
|
21:38.240 --> 21:42.720 |
|
around how could we duplicate the skin on the human body |
|
|
|
21:42.720 --> 21:45.360 |
|
because it's such an amazing sensor. |
|
|
|
21:47.880 --> 21:49.600 |
|
Instead of going down that path, |
|
|
|
21:49.600 --> 21:53.000 |
|
why don't we focus on vision? |
|
|
|
21:54.000 --> 21:59.000 |
|
And how many of the problems that face a robot |
|
|
|
22:00.080 --> 22:04.880 |
|
trying to do real work could be solved |
|
|
|
22:04.880 --> 22:08.480 |
|
with a cheap camera and a big ass computer? |
|
|
|
22:09.480 --> 22:12.440 |
|
And Moore's Law continues to work. |
|
|
|
22:12.440 --> 22:16.520 |
|
The cell phone industry, the mobile industry |
|
|
|
22:16.520 --> 22:18.800 |
|
is giving us better and better tools |
|
|
|
22:18.800 --> 22:21.120 |
|
that can run on these embedded computers. |
|
|
|
22:21.120 --> 22:26.120 |
|
And I think we passed an important moment, |
|
|
|
22:26.600 --> 22:31.600 |
|
maybe two years ago, where you could put |
|
|
|
22:32.760 --> 22:37.520 |
|
machine vision capable processors on robots |
|
|
|
22:37.520 --> 22:39.640 |
|
at consumer price points. |
|
|
|
22:39.640 --> 22:43.040 |
|
And I was waiting for it to happen. |
|
|
|
22:43.040 --> 22:46.600 |
|
We avoided putting lasers on our robots |
|
|
|
22:46.600 --> 22:51.600 |
|
to do navigation and instead spent years researching |
|
|
|
22:51.840 --> 22:54.640 |
|
how to do vision based navigation |
|
|
|
22:54.640 --> 22:58.440 |
|
because you could just see it |
|
|
|
22:58.440 --> 23:01.640 |
|
where these technology trends were going |
|
|
|
23:01.640 --> 23:05.880 |
|
and between injection molded plastic |
|
|
|
23:05.880 --> 23:08.040 |
|
and a camera with a computer |
|
|
|
23:08.040 --> 23:10.840 |
|
capable of running machine learning |
|
|
|
23:10.840 --> 23:12.560 |
|
and visual object recognition, |
|
|
|
23:12.560 --> 23:15.560 |
|
I could build an incredibly affordable, |
|
|
|
23:15.560 --> 23:18.760 |
|
incredibly capable robot |
|
|
|
23:18.760 --> 23:21.240 |
|
and that's gonna be the future. |
|
|
|
23:21.240 --> 23:23.440 |
|
So you know, on that point with a small tangent |
|
|
|
23:23.440 --> 23:25.000 |
|
but I think an important one, |
|
|
|
23:25.000 --> 23:27.600 |
|
another industry in which I would say |
|
|
|
23:27.600 --> 23:31.880 |
|
the only other industry in which there is automation |
|
|
|
23:31.880 --> 23:34.840 |
|
actually touching people's lives today |
|
|
|
23:34.840 --> 23:36.520 |
|
is autonomous vehicles. |
|
|
|
23:37.640 --> 23:42.360 |
|
What the vision is described of using computer vision |
|
|
|
23:42.360 --> 23:44.520 |
|
and using cheap camera sensors, |
|
|
|
23:44.520 --> 23:48.320 |
|
there's a debate on that of LiDAR versus computer vision |
|
|
|
23:48.320 --> 23:53.320 |
|
and sort of Elon Musk famously said |
|
|
|
23:53.320 --> 23:58.320 |
|
that LiDAR is a crutch, that really, in the long term, |
|
|
|
23:58.440 --> 24:00.880 |
|
camera only is the right solution |
|
|
|
24:00.880 --> 24:03.520 |
|
which echoes some of the ideas you're expressing. |
|
|
|
24:03.520 --> 24:05.120 |
|
Of course, the domain |
|
|
|
24:05.120 --> 24:07.720 |
|
in terms of its safety criticality is different. |
|
|
|
24:07.720 --> 24:10.720 |
|
But what do you think about that approach |
|
|
|
24:10.720 --> 24:13.480 |
|
in the autonomous vehicle space? |
|
|
|
24:13.480 --> 24:15.200 |
|
And in general, do you see a connection |
|
|
|
24:15.200 --> 24:18.560 |
|
between the incredible real world challenges |
|
|
|
24:18.560 --> 24:20.800 |
|
you have to solve in the home with Roomba |
|
|
|
24:20.800 --> 24:23.000 |
|
and I saw a demonstration of some of them, |
|
|
|
24:23.000 --> 24:27.920 |
|
corner cases, literally, and autonomous vehicles? |
|
|
|
24:27.920 --> 24:31.720 |
|
So there's absolutely a tremendous overlap |
|
|
|
24:31.720 --> 24:35.520 |
|
between both the problems, you know, |
|
|
|
24:35.520 --> 24:38.680 |
|
a robot vacuum and autonomous vehicle are trying to solve |
|
|
|
24:38.680 --> 24:41.880 |
|
and the tools and the types of sensors |
|
|
|
24:41.880 --> 24:46.720 |
|
that are being applied in the pursuit of the solutions. |
|
|
|
24:48.040 --> 24:53.040 |
|
In my world, my environment is actually much harder |
|
|
|
24:54.680 --> 24:57.320 |
|
than the environment an automobile travels in. |
|
|
|
24:57.320 --> 25:02.320 |
|
We don't have roads, we have T-shirts, we have steps, |
|
|
|
25:02.720 --> 25:07.120 |
|
we have a near infinite number of patterns and colors |
|
|
|
25:07.120 --> 25:10.200 |
|
and surface textures on the floor. |
|
|
|
25:10.200 --> 25:12.560 |
|
Especially from a visual perspective. |
|
|
|
25:12.560 --> 25:14.760 |
|
Yeah, visually it's really tough. |
|
|
|
25:14.760 --> 25:18.880 |
|
It's infinitely variable. |
|
|
|
25:18.880 --> 25:22.560 |
|
On the other hand, safety is way easier on the inside. |
|
|
|
25:22.560 --> 25:26.440 |
|
My robots, they're not very heavy, |
|
|
|
25:26.440 --> 25:28.280 |
|
they're not very fast. |
|
|
|
25:28.280 --> 25:31.480 |
|
If they bump into your foot, you think it's funny. |
|
|
|
25:32.720 --> 25:36.920 |
|
And, you know, and autonomous vehicles |
|
|
|
25:36.920 --> 25:39.400 |
|
kind of have the inverse problem. |
|
|
|
25:39.400 --> 25:44.400 |
|
And so, for me, saying vision is the future, |
|
|
|
25:45.920 --> 25:47.800 |
|
I can say that without reservation. |
|
|
|
25:49.480 --> 25:54.480 |
|
For autonomous vehicles, I think I believe what Elon's saying |
|
|
|
25:55.360 --> 25:59.040 |
|
about the future is ultimately gonna be vision. |
|
|
|
25:59.040 --> 26:01.000 |
|
Maybe if we put a cheap LiDAR on there |
|
|
|
26:01.000 --> 26:03.280 |
|
as a backup sensor, it might not be the worst idea |
|
|
|
26:03.280 --> 26:04.120 |
|
in the world. |
|
|
|
26:04.120 --> 26:04.960 |
|
So the stakes are much higher. |
|
|
|
26:04.960 --> 26:05.800 |
|
The stakes are much higher. |
|
|
|
26:05.800 --> 26:08.200 |
|
You have to be much more careful thinking through |
|
|
|
26:08.200 --> 26:10.720 |
|
how far away that future is. |
|
|
|
26:10.720 --> 26:14.240 |
|
Right, but I think that the primary |
|
|
|
26:16.080 --> 26:19.320 |
|
environmental understanding sensor |
|
|
|
26:19.320 --> 26:21.760 |
|
is going to be a visual system. |
|
|
|
26:21.760 --> 26:23.000 |
|
Visual system. |
|
|
|
26:23.000 --> 26:25.560 |
|
So on that point, well, let me ask, |
|
|
|
26:25.560 --> 26:29.440 |
|
do you hope there's an iRobot robot in every home |
|
|
|
26:29.440 --> 26:30.920 |
|
in the world one day? |
|
|
|
26:31.880 --> 26:34.880 |
|
I expect there to be at least one iRobot robot |
|
|
|
26:34.880 --> 26:36.440 |
|
in every home. |
|
|
|
26:36.440 --> 26:41.160 |
|
You know, we've sold 25 million robots. |
|
|
|
26:41.160 --> 26:44.600 |
|
So we're in about 10% of US homes, |
|
|
|
26:44.600 --> 26:45.760 |
|
which is a great start. |
|
|
|
26:47.120 --> 26:51.080 |
|
But I think that when we think about the number |
|
|
|
26:51.080 --> 26:55.840 |
|
of things that robots can do, you know, |
|
|
|
26:55.840 --> 26:58.560 |
|
today I can vacuum your floor, mop your floor, |
|
|
|
26:58.560 --> 27:01.240 |
|
cut your lawn or soon we'll be able to cut your lawn. |
|
|
|
27:01.240 --> 27:06.240 |
|
But there are more things that we could do in the home. |
|
|
|
27:06.720 --> 27:11.520 |
|
And I hope that we continue using the techniques |
|
|
|
27:11.520 --> 27:14.480 |
|
I described around exploiting computer vision |
|
|
|
27:14.480 --> 27:18.680 |
|
and low-cost manufacturing, so that we'll be able |
|
|
|
27:18.680 --> 27:22.680 |
|
to create these solutions at affordable price points. |
|
|
|
27:22.680 --> 27:25.600 |
|
So let me ask, on that point of a robot in every home, |
|
|
|
27:25.600 --> 27:26.880 |
|
that's my dream as well. |
|
|
|
27:26.880 --> 27:28.400 |
|
I'd love to see that. |
|
|
|
27:28.400 --> 27:31.880 |
|
I think the possibilities there are indeed |
|
|
|
27:31.880 --> 27:34.560 |
|
infinite positive possibilities. |
|
|
|
27:34.560 --> 27:39.560 |
|
But in our current culture, no thanks to science fiction |
|
|
|
27:39.800 --> 27:44.720 |
|
and so on, there's a serious kind of hesitation |
|
|
|
27:44.720 --> 27:47.120 |
|
and anxiety, concern about robots |
|
|
|
27:47.120 --> 27:50.080 |
|
and also a concern about privacy. |
|
|
|
27:51.480 --> 27:54.040 |
|
And it's a fascinating question to me |
|
|
|
27:54.040 --> 27:59.040 |
|
why that concern, amongst a certain group of people, |
|
|
|
27:59.600 --> 28:02.880 |
|
is as intense as it is. |
|
|
|
28:02.880 --> 28:04.280 |
|
So you have to think about it |
|
|
|
28:04.280 --> 28:05.520 |
|
because it's a serious concern, |
|
|
|
28:05.520 --> 28:08.040 |
|
but I wonder how you address it best. |
|
|
|
28:08.040 --> 28:09.840 |
|
So from a perspective of a vision sensor, |
|
|
|
28:09.840 --> 28:14.200 |
|
so robots that move about the home and sense the world, |
|
|
|
28:14.200 --> 28:19.200 |
|
how do you alleviate people's privacy concerns? |
|
|
|
28:19.720 --> 28:22.880 |
|
How do you make sure that they can trust iRobot |
|
|
|
28:22.880 --> 28:25.360 |
|
and the robots that they share their home with? |
|
|
|
28:26.640 --> 28:28.160 |
|
I think that's a great question. |
|
|
|
28:28.160 --> 28:33.160 |
|
And we've really leaned way forward on this |
|
|
|
28:33.760 --> 28:38.760 |
|
because given our vision as to the role the company |
|
|
|
28:38.880 --> 28:40.720 |
|
intends to play in the home, |
|
|
|
28:43.000 --> 28:45.480 |
|
really for us, make or break is, |
|
|
|
28:45.480 --> 28:50.480 |
|
can our approach be trusted to protect the data |
|
|
|
28:50.480 --> 28:53.560 |
|
and the privacy of the people who have our robots? |
|
|
|
28:53.560 --> 28:56.920 |
|
And so we've gone out publicly |
|
|
|
28:56.920 --> 29:00.440 |
|
with a privacy manifesto stating we'll never sell your data. |
|
|
|
29:00.440 --> 29:05.440 |
|
We've adopted GDPR, not just where GDPR is required, |
|
|
|
29:05.520 --> 29:10.520 |
|
but globally, we have ensured that images |
|
|
|
29:16.640 --> 29:18.080 |
|
don't leave the robot. |
|
|
|
29:18.080 --> 29:22.120 |
|
So processing data from the visual sensors |
|
|
|
29:22.120 --> 29:23.680 |
|
happens locally on the robot |
|
|
|
29:23.680 --> 29:28.680 |
|
and only semantic knowledge of the home |
|
|
|
29:29.520 --> 29:32.720 |
|
with the consumer's consent is sent up. |
|
|
|
29:32.720 --> 29:34.480 |
|
We show you what we know |
|
|
|
29:34.480 --> 29:39.480 |
|
and are trying to go use data as an enabler |
|
|
|
29:41.560 --> 29:44.120 |
|
for the performance of the robots |
|
|
|
29:44.120 --> 29:49.120 |
|
with the informed consent and understanding |
|
|
|
29:50.040 --> 29:52.440 |
|
of the people who own those robots. |
|
|
|
29:52.440 --> 29:56.880 |
|
And we take it very seriously. |
|
|
|
29:56.880 --> 30:01.880 |
|
And ultimately, we think that by showing a customer |
|
|
|
30:01.880 --> 30:06.880 |
|
that if you let us build a semantic map of your home |
|
|
|
30:07.360 --> 30:09.000 |
|
and know where the rooms are, |
|
|
|
30:09.000 --> 30:11.760 |
|
well, then you can say clean the kitchen. |
|
|
|
30:11.760 --> 30:13.720 |
|
If you don't want the robot to do that, |
|
|
|
30:13.720 --> 30:14.560 |
|
don't make the map, |
|
|
|
30:14.560 --> 30:17.000 |
|
it'll do its best job cleaning your home, |
|
|
|
30:17.000 --> 30:18.640 |
|
but it won't be able to do that. |
|
|
|
30:18.640 --> 30:20.280 |
|
And if you ever want us to forget |
|
|
|
30:20.280 --> 30:22.080 |
|
that we know that it's your kitchen, |
|
|
|
30:22.080 --> 30:26.680 |
|
you can have confidence that we will do that for you. |
|
|
|
30:26.680 --> 30:31.680 |
|
So we're trying to go and be a sort of data 2.0 |
|
|
|
30:34.520 --> 30:37.680 |
|
perspective company where we treat the data |
|
|
|
30:37.680 --> 30:40.800 |
|
that the robots have of the consumer's home |
|
|
|
30:40.800 --> 30:43.200 |
|
as if it were the consumer's data |
|
|
|
30:43.200 --> 30:47.360 |
|
and that they have rights to it. |
|
|
|
30:47.360 --> 30:50.960 |
|
So we think by being the good guys on this front, |
|
|
|
30:50.960 --> 30:53.840 |
|
we can build the trust and thus be entrusted |
|
|
|
30:55.120 --> 31:00.120 |
|
to enable robots to do more things that are thoughtful. |
|
|
|
31:00.200 --> 31:04.520 |
|
You think people's worries will diminish over time? |
|
|
|
31:04.520 --> 31:06.840 |
|
As a society, broadly speaking, |
|
|
|
31:06.840 --> 31:09.320 |
|
do you think you can win over trust, |
|
|
|
31:09.320 --> 31:10.640 |
|
not just for the company, |
|
|
|
31:10.640 --> 31:14.880 |
|
but just the comfort people have with AI in their home |
|
|
|
31:14.880 --> 31:17.040 |
|
enriching their lives in some way? |
|
|
|
31:17.040 --> 31:19.560 |
|
I think we're an interesting place today |
|
|
|
31:19.560 --> 31:22.400 |
|
where it's less about winning them over |
|
|
|
31:22.400 --> 31:26.240 |
|
and more about finding a way to talk about privacy |
|
|
|
31:26.240 --> 31:28.840 |
|
in a way that more people can understand. |
|
|
|
31:28.840 --> 31:30.920 |
|
I would tell you that today, |
|
|
|
31:30.920 --> 31:33.320 |
|
when there's a privacy breach, |
|
|
|
31:33.320 --> 31:37.040 |
|
people get very upset and then go to the store |
|
|
|
31:37.040 --> 31:38.320 |
|
and buy the cheapest thing, |
|
|
|
31:38.320 --> 31:41.000 |
|
paying no attention to whether or not the products |
|
|
|
31:41.000 --> 31:44.640 |
|
that they're buying honor privacy standards or not. |
|
|
|
31:44.640 --> 31:48.600 |
|
In fact, if I put on the package of my Roomba, |
|
|
|
31:50.080 --> 31:53.640 |
|
the privacy commitments that we have, |
|
|
|
31:53.640 --> 31:58.640 |
|
I would sell less than I would if I did nothing at all |
|
|
|
31:58.720 --> 32:00.400 |
|
and that needs to change. |
|
|
|
32:00.400 --> 32:02.880 |
|
So it's not a question about earning trust. |
|
|
|
32:02.880 --> 32:05.000 |
|
I think that's necessary but not sufficient. |
|
|
|
32:05.000 --> 32:08.440 |
|
We need to figure out how to have a comfortable |
|
|
|
32:08.440 --> 32:13.440 |
|
sort of 'grade A meat' standard applied to privacy |
|
|
|
32:14.200 --> 32:18.400 |
|
that customers can trust and understand |
|
|
|
32:18.400 --> 32:20.480 |
|
and then use in their buying decisions. |
|
|
|
32:23.040 --> 32:25.520 |
|
That will reward companies for good behavior |
|
|
|
32:25.520 --> 32:29.880 |
|
and that will ultimately be how this moves forward. |
|
|
|
32:29.880 --> 32:32.680 |
|
And maybe be part of the conversation |
|
|
|
32:32.680 --> 32:34.800 |
|
between regular people about what it means, |
|
|
|
32:34.800 --> 32:36.280 |
|
what privacy means. |
|
|
|
32:36.280 --> 32:38.400 |
|
If you have some standards, you can say, |
|
|
|
32:38.400 --> 32:41.080 |
|
you can start talking about who's following them, |
|
|
|
32:41.080 --> 32:42.680 |
|
who's not, and have more of a conversation. |
|
|
|
32:42.680 --> 32:45.440 |
|
Because most people are actually quite clueless |
|
|
|
32:45.440 --> 32:47.320 |
|
about all aspects of artificial intelligence |
|
|
|
32:47.320 --> 32:48.400 |
|
or data collection and so on. |
|
|
|
32:48.400 --> 32:49.920 |
|
It would be nice to change that |
|
|
|
32:49.920 --> 32:52.760 |
|
for people to understand the good that AI can do |
|
|
|
32:52.760 --> 32:56.520 |
|
and it's not some system that's trying to steal |
|
|
|
32:56.520 --> 32:58.760 |
|
all the most sensitive data. |
|
|
|
32:58.760 --> 33:02.640 |
|
Do you think, do you dream of a Roomba |
|
|
|
33:02.640 --> 33:05.240 |
|
with human level intelligence one day? |
|
|
|
33:05.240 --> 33:10.240 |
|
So you've mentioned a very successful localization |
|
|
|
33:10.520 --> 33:11.880 |
|
and mapping of the environment, |
|
|
|
33:11.880 --> 33:14.360 |
|
being able to do some basic communication |
|
|
|
33:14.360 --> 33:16.560 |
|
to say go clean the kitchen. |
|
|
|
33:16.560 --> 33:21.360 |
|
Do you see yourself, in your maybe more bored moments, |
|
|
|
33:22.880 --> 33:24.840 |
|
once you get the beer, |
|
|
|
33:24.840 --> 33:27.000 |
|
just sitting back with that beer |
|
|
|
33:27.000 --> 33:30.800 |
|
and having a chat on a Friday night with the Roomba |
|
|
|
33:30.800 --> 33:32.840 |
|
about how your day went? |
|
|
|
33:34.120 --> 33:37.560 |
|
So, to your latter question, absolutely. |
|
|
|
33:38.640 --> 33:40.720 |
|
To your former question as to whether Roomba |
|
|
|
33:40.720 --> 33:43.680 |
|
can have human level intelligence, not in my lifetime. |
|
|
|
33:45.200 --> 33:48.440 |
|
You can have a great conversation, |
|
|
|
33:49.680 --> 33:51.960 |
|
a meaningful conversation with a robot |
|
|
|
33:53.920 --> 33:55.400 |
|
without it having anything |
|
|
|
33:55.400 --> 33:57.760 |
|
that resembles human level intelligence. |
|
|
|
33:57.760 --> 34:02.760 |
|
And I think that as long as you realize |
|
|
|
34:02.800 --> 34:05.280 |
|
that conversation is not about the robot |
|
|
|
34:06.360 --> 34:08.600 |
|
and making the robot feel good. |
|
|
|
34:08.600 --> 34:13.600 |
|
That conversation is about you learning interesting things |
|
|
|
34:14.880 --> 34:18.400 |
|
that make you feel like the conversation |
|
|
|
34:18.400 --> 34:21.240 |
|
that you had with the robot is |
|
|
|
34:23.800 --> 34:27.280 |
|
a pretty awesome way of learning something. |
|
|
|
34:27.280 --> 34:30.800 |
|
And it could be about what kind of day your pet had. |
|
|
|
34:30.800 --> 34:35.040 |
|
It could be about, how can I make my home |
|
|
|
34:35.040 --> 34:36.160 |
|
more energy efficient? |
|
|
|
34:36.160 --> 34:39.600 |
|
It could be about, if I'm thinking about |
|
|
|
34:39.600 --> 34:41.720 |
|
climbing Mount Everest, what should I know? |
|
|
|
34:44.320 --> 34:46.480 |
|
And that's a very doable thing. |
|
|
|
34:48.600 --> 34:51.480 |
|
But if I think that that conversation |
|
|
|
34:51.480 --> 34:53.640 |
|
I'm gonna have with the robot is, |
|
|
|
34:53.640 --> 34:56.760 |
|
I'm gonna be rewarded by making the robot happy. |
|
|
|
34:56.760 --> 34:58.720 |
|
But I could have just put a button on the robot |
|
|
|
34:58.720 --> 35:00.280 |
|
that you could push and the robot would smile |
|
|
|
35:00.280 --> 35:02.120 |
|
and that sort of thing. |
|
|
|
35:02.120 --> 35:04.160 |
|
So I think you need to think about the question |
|
|
|
35:04.160 --> 35:06.680 |
|
in the right way. |
|
|
|
35:06.680 --> 35:11.520 |
|
And robots can be awesomely effective |
|
|
|
35:11.520 --> 35:14.400 |
|
at helping people feel less isolated, |
|
|
|
35:14.400 --> 35:17.520 |
|
learn more about the home that they live in |
|
|
|
35:17.520 --> 35:21.920 |
|
and fill some of those lonely gaps |
|
|
|
35:21.920 --> 35:23.760 |
|
where we wish we were engaged in |
|
|
|
35:23.760 --> 35:25.640 |
|
learning cool stuff about our world. |
|
|
|
35:25.640 --> 35:30.640 |
|
Last question, if you could hang out for a day |
|
|
|
35:30.640 --> 35:34.640 |
|
with a robot from science fiction, movies, books |
|
|
|
35:34.640 --> 35:39.640 |
|
and safely pick its brain for that day, |
|
|
|
35:39.640 --> 35:41.640 |
|
who would you pick? |
|
|
|
35:41.640 --> 35:42.640 |
|
Data. |
|
|
|
35:42.640 --> 35:43.640 |
|
Data. |
|
|
|
35:43.640 --> 35:44.640 |
|
From Star Trek. |
|
|
|
35:44.640 --> 35:48.640 |
|
I think that Data is really smart. |
|
|
|
35:48.640 --> 35:52.640 |
|
Data's been through a lot trying to go and save the galaxy |
|
|
|
35:52.640 --> 35:57.640 |
|
and I'm really interested actually in emotion and robotics. |
|
|
|
35:58.640 --> 36:00.640 |
|
And I think you'd have a lot to say about that |
|
|
|
36:00.640 --> 36:05.640 |
|
because I believe actually that emotion plays |
|
|
|
36:05.640 --> 36:10.640 |
|
an incredibly useful role in doing reasonable things |
|
|
|
36:11.640 --> 36:14.640 |
|
in situations where we have imperfect understanding |
|
|
|
36:14.640 --> 36:15.640 |
|
of what's going on. |
|
|
|
36:15.640 --> 36:18.640 |
|
In social situations when there's imperfect information. |
|
|
|
36:18.640 --> 36:23.640 |
|
In social situations, also in competitive |
|
|
|
36:23.640 --> 36:25.640 |
|
or dangerous situations, |
|
|
|
36:26.640 --> 36:30.640 |
|
we have emotion for a reason. |
|
|
|
36:30.640 --> 36:35.640 |
|
And so ultimately, my theory is that as robots |
|
|
|
36:35.640 --> 36:36.640 |
|
get smarter and smarter, |
|
|
|
36:36.640 --> 36:38.640 |
|
they're actually going to get more emotional |
|
|
|
36:38.640 --> 36:46.640 |
|
because you can't actually survive on pure logic. |
|
|
|
36:46.640 --> 36:51.640 |
|
Because only a very tiny fraction of the situations |
|
|
|
36:51.640 --> 36:55.640 |
|
we find ourselves in can be resolved reasonably with logic. |
|
|
|
36:55.640 --> 36:57.640 |
|
And so I think Data would have a lot to say about that |
|
|
|
36:57.640 --> 36:59.640 |
|
and so I could find out whether he agrees. |
|
|
|
36:59.640 --> 37:02.640 |
|
What, if you could ask Data one question |
|
|
|
37:02.640 --> 37:05.640 |
|
and you would get a deep, honest answer to, |
|
|
|
37:05.640 --> 37:06.640 |
|
what would you ask? |
|
|
|
37:06.640 --> 37:08.640 |
|
What's Captain Picard really like? |
|
|
|
37:08.640 --> 37:12.640 |
|
Okay, I think that's the perfect way to end the call |
|
|
|
37:12.640 --> 37:14.640 |
|
and thank you so much for talking today. |
|
|
|
37:14.640 --> 37:16.640 |
|
I really appreciate it. |
|
|
|
37:16.640 --> 37:45.640 |
|
My pleasure. |
|
|
|
|