|
WEBVTT |
|
|
|
00:00.000 --> 00:02.880 |
|
The following is a conversation with Rosalind Picard. |
|
|
|
00:02.880 --> 00:04.560 |
|
She's a professor at MIT, |
|
|
|
00:04.560 --> 00:06.880 |
|
director of the Affective Computing Research Group
|
|
|
00:06.880 --> 00:08.360 |
|
at the MIT Media Lab, |
|
|
|
00:08.360 --> 00:12.440 |
|
and cofounder of two companies, Affectiva and Empatica.
|
|
|
00:12.440 --> 00:13.560 |
|
Over two decades ago, |
|
|
|
00:13.560 --> 00:15.440 |
|
she launched the field of affective computing
|
|
|
00:15.440 --> 00:17.560 |
|
with her book of the same name. |
|
|
|
00:17.560 --> 00:20.040 |
|
This book described the importance of emotion |
|
|
|
00:20.040 --> 00:23.040 |
|
in artificial and natural intelligence. |
|
|
|
00:23.040 --> 00:25.320 |
|
The vital role emotional communication has
|
|
|
00:25.320 --> 00:28.520 |
|
in the relationship between people in general
|
|
|
00:28.520 --> 00:30.920 |
|
and in human-robot interaction.
|
|
|
00:30.920 --> 00:34.040 |
|
I really enjoyed talking with Roz about so many topics,
|
|
|
00:34.040 --> 00:36.440 |
|
including emotion, ethics, privacy, |
|
|
|
00:36.440 --> 00:39.720 |
|
wearable computing and her recent research in epilepsy |
|
|
|
00:39.720 --> 00:42.600 |
|
and even love and meaning. |
|
|
|
00:42.600 --> 00:44.000 |
|
This conversation is part |
|
|
|
00:44.000 --> 00:46.040 |
|
of the Artificial Intelligence Podcast. |
|
|
|
00:46.040 --> 00:48.200 |
|
If you enjoy it, subscribe on YouTube, |
|
|
|
00:48.200 --> 00:50.760 |
|
iTunes or simply connect with me on Twitter |
|
|
|
00:50.760 --> 00:54.000 |
|
at Lex Fridman, spelled F R I D.
|
|
|
00:54.000 --> 00:58.000 |
|
And now here's my conversation with Rosalind Picard. |
|
|
|
00:59.520 --> 01:00.760 |
|
More than 20 years ago, |
|
|
|
01:00.760 --> 01:03.360 |
|
you coined the term affective computing
|
|
|
01:03.360 --> 01:05.400 |
|
and led a lot of research in this area. |
|
|
|
01:05.400 --> 01:08.800 |
|
Since then, as I understand it, the goal is to make the machine
|
|
|
01:08.800 --> 01:12.400 |
|
detect and interpret the emotional state of a human being |
|
|
|
01:12.400 --> 01:14.240 |
|
and adapt the behavior of the machine
|
|
|
01:14.240 --> 01:16.160 |
|
based on the emotional state. |
|
|
|
01:16.160 --> 01:19.680 |
|
So how has your understanding of the problem space
|
|
|
01:19.680 --> 01:24.680 |
|
defined by affective computing changed in the past 24 years?
|
|
|
01:25.400 --> 01:28.960 |
|
So it's the scope, the applications, the challenges, |
|
|
|
01:28.960 --> 01:32.160 |
|
what's involved, how has that evolved over the years? |
|
|
|
01:32.160 --> 01:35.400 |
|
Yeah, actually, originally when I defined the term |
|
|
|
01:35.400 --> 01:38.080 |
|
affective computing, it was a bit broader
|
|
|
01:38.080 --> 01:41.320 |
|
than just recognizing and responding intelligently |
|
|
|
01:41.320 --> 01:44.560 |
|
to human emotion, although those are probably the two pieces |
|
|
|
01:44.560 --> 01:47.160 |
|
that we've worked on the hardest. |
|
|
|
01:47.160 --> 01:50.680 |
|
The original concept also encompassed machines |
|
|
|
01:50.680 --> 01:53.520 |
|
that would have mechanisms that functioned |
|
|
|
01:53.520 --> 01:55.680 |
|
like human emotion does inside them. |
|
|
|
01:55.680 --> 01:59.000 |
|
It would be any computing that relates to, arises from,
|
|
|
01:59.000 --> 02:01.600 |
|
or deliberately influences human emotion. |
|
|
|
02:02.600 --> 02:05.160 |
|
So the human computer interaction part |
|
|
|
02:05.160 --> 02:07.880 |
|
is the part that people tend to see. |
|
|
|
02:07.880 --> 02:11.000 |
|
Like if I'm really ticked off at my computer |
|
|
|
02:11.000 --> 02:13.480 |
|
and I'm scowling at it and I'm cursing at it |
|
|
|
02:13.480 --> 02:15.720 |
|
and it just keeps smiling and acting happy
|
|
|
02:15.720 --> 02:17.480 |
|
like that little paperclip used to do, |
|
|
|
02:17.480 --> 02:22.160 |
|
like dancing, winking, that kind of thing |
|
|
|
02:22.160 --> 02:24.640 |
|
just makes you even more frustrated. |
|
|
|
02:24.640 --> 02:29.080 |
|
And I thought that stupid thing needs to see my affect |
|
|
|
02:29.080 --> 02:30.600 |
|
and if it's gonna be intelligent, |
|
|
|
02:30.600 --> 02:33.000 |
|
which Microsoft researchers had worked really hard on, |
|
|
|
02:33.000 --> 02:35.120 |
|
it actually had some of the most sophisticated AI in it |
|
|
|
02:35.120 --> 02:37.960 |
|
at the time, if that thing's gonna actually be smart,
|
|
|
02:37.960 --> 02:41.560 |
|
it needs to respond to me and you
|
|
|
02:41.560 --> 02:45.320 |
|
and we can send it very different signals. |
|
|
|
02:45.320 --> 02:47.160 |
|
So by the way, just a quick interruption. |
|
|
|
02:47.160 --> 02:52.160 |
|
Clippy, maybe it was in Word 95 and 98,
|
|
|
02:52.360 --> 02:54.360 |
|
I don't remember when it was born, |
|
|
|
02:54.360 --> 02:58.280 |
|
but do you find, when you make that reference,
|
|
|
02:58.280 --> 03:00.720 |
|
that people recognize what you're talking about still |
|
|
|
03:00.720 --> 03:01.640 |
|
to this point? |
|
|
|
03:01.640 --> 03:05.160 |
|
I don't expect the newest students to know it these days,
|
|
|
03:05.160 --> 03:07.160 |
|
but I've mentioned it to a lot of audiences |
|
|
|
03:07.160 --> 03:09.200 |
|
like how many of you know this Clippy thing |
|
|
|
03:09.200 --> 03:11.680 |
|
and still the majority of people seem to know it. |
|
|
|
03:11.680 --> 03:15.280 |
|
So Clippy kind of looks at maybe natural language processing |
|
|
|
03:15.280 --> 03:19.200 |
|
where you are typing and tries to help you complete, I think. |
|
|
|
03:19.200 --> 03:22.440 |
|
I don't even remember what Clippy was except annoying. |
|
|
|
03:22.440 --> 03:25.760 |
|
Yeah, some people actually liked it. |
|
|
|
03:25.760 --> 03:27.480 |
|
I would hear those stories. |
|
|
|
03:27.480 --> 03:28.400 |
|
You miss it? |
|
|
|
03:28.400 --> 03:31.240 |
|
Well, I miss the annoyance. |
|
|
|
03:31.240 --> 03:35.680 |
|
It felt like there's an element, somebody was there
|
|
|
03:35.680 --> 03:37.560 |
|
and we're in it together and they were annoying. |
|
|
|
03:37.560 --> 03:40.800 |
|
It's like a puppy that just doesn't get it. |
|
|
|
03:40.800 --> 03:42.120 |
|
They keep ripping up the couch, kind of thing.
|
|
|
03:42.120 --> 03:44.920 |
|
And in fact, they could have done it smarter like a puppy. |
|
|
|
03:44.920 --> 03:47.960 |
|
If they had done, like if when you yelled at it |
|
|
|
03:47.960 --> 03:50.760 |
|
or cursed at it, if it had put its little ears back |
|
|
|
03:50.760 --> 03:52.880 |
|
and its tail down and slunk off,
|
|
|
03:52.880 --> 03:55.840 |
|
probably people would have wanted it back, right? |
|
|
|
03:55.840 --> 03:58.560 |
|
But instead when you yelled at it, what did it do? |
|
|
|
03:58.560 --> 04:01.200 |
|
It smiled, it winked, it danced, right? |
|
|
|
04:01.200 --> 04:03.160 |
|
If somebody comes to my office and I yell at them, |
|
|
|
04:03.160 --> 04:04.720 |
|
and they start smiling, winking and dancing,
|
|
|
04:04.720 --> 04:06.720 |
|
I'm like, I never want to see you again. |
|
|
|
04:06.720 --> 04:08.480 |
|
So Bill Gates got a standing ovation |
|
|
|
04:08.480 --> 04:10.080 |
|
when he said it was going away |
|
|
|
04:10.080 --> 04:12.320 |
|
because people were so ticked. |
|
|
|
04:12.320 --> 04:14.960 |
|
It was so emotionally unintelligent, right? |
|
|
|
04:14.960 --> 04:18.080 |
|
It was intelligent about whether you were writing a letter |
|
|
|
04:18.080 --> 04:20.800 |
|
or what kind of help you needed for that context. |
|
|
|
04:20.800 --> 04:23.360 |
|
It was completely unintelligent about, |
|
|
|
04:23.360 --> 04:25.720 |
|
hey, if you're annoying your customer, |
|
|
|
04:25.720 --> 04:28.320 |
|
don't smile in their face when you do it. |
|
|
|
04:28.320 --> 04:32.280 |
|
So that kind of mismatch was something |
|
|
|
04:32.280 --> 04:35.000 |
|
the developers just didn't think about. |
|
|
|
04:35.000 --> 04:39.440 |
|
And intelligence at the time was really all about math |
|
|
|
04:39.440 --> 04:44.920 |
|
and language and chess and games, |
|
|
|
04:44.920 --> 04:47.880 |
|
problems that could be pretty well defined. |
|
|
|
04:47.880 --> 04:51.440 |
|
Social emotional interaction is much more complex than chess |
|
|
|
04:51.440 --> 04:55.960 |
|
or Go or any of the games that people are trying to solve. |
|
|
|
04:55.960 --> 04:58.640 |
|
And understanding that required skills
|
|
|
04:58.640 --> 05:00.240 |
|
that most people in computer science |
|
|
|
05:00.240 --> 05:02.520 |
|
actually were lacking personally. |
|
|
|
05:02.520 --> 05:03.760 |
|
Well, let's talk about computer science |
|
|
|
05:03.760 --> 05:06.320 |
|
have things gotten better since the work,
|
|
|
05:06.320 --> 05:09.480 |
|
since the message, since you've really launched a field |
|
|
|
05:09.480 --> 05:11.280 |
|
with a lot of research work in the space, |
|
|
|
05:11.280 --> 05:14.080 |
|
I still find as a person like yourself |
|
|
|
05:14.080 --> 05:16.640 |
|
who's deeply passionate about human beings |
|
|
|
05:16.640 --> 05:18.840 |
|
and yet in computer science, |
|
|
|
05:18.840 --> 05:23.040 |
|
there still seems to be a lack of, sorry to say, |
|
|
|
05:23.040 --> 05:26.760 |
|
empathy in us computer scientists. |
|
|
|
05:26.760 --> 05:28.800 |
|
Yeah, well, or hasn't gotten better. |
|
|
|
05:28.800 --> 05:30.680 |
|
Let's just say there's a lot more variety |
|
|
|
05:30.680 --> 05:32.320 |
|
among computer scientists these days. |
|
|
|
05:32.320 --> 05:34.960 |
|
Computer scientists are a much more diverse group today |
|
|
|
05:34.960 --> 05:37.600 |
|
than they were 25 years ago. |
|
|
|
05:37.600 --> 05:39.000 |
|
And that's good. |
|
|
|
05:39.000 --> 05:41.760 |
|
We need all kinds of people to become computer scientists |
|
|
|
05:41.760 --> 05:45.600 |
|
so that computer science reflects more what society needs. |
|
|
|
05:45.600 --> 05:49.080 |
|
And there's brilliance among every personality type. |
|
|
|
05:49.080 --> 05:52.000 |
|
So it need not be limited to people |
|
|
|
05:52.000 --> 05:54.120 |
|
who prefer computers to other people. |
|
|
|
05:54.120 --> 05:55.400 |
|
How hard do you think it is? |
|
|
|
05:55.400 --> 05:58.600 |
|
What's your view of how difficult it is to recognize emotion
|
|
|
05:58.600 --> 06:03.960 |
|
or to create a deeply emotionally intelligent interaction
|
|
|
06:03.960 --> 06:07.440 |
|
has it gotten easier or harder as you've explored it further? |
|
|
|
06:07.440 --> 06:10.040 |
|
And how far away are we from cracking this? |
|
|
|
06:12.440 --> 06:16.040 |
|
If you think of the Turing test as solving intelligence,
|
|
|
06:16.040 --> 06:18.760 |
|
what would a Turing test for emotional intelligence look like?
|
|
|
06:20.720 --> 06:25.600 |
|
I think it is as difficult as I thought it was gonna be. |
|
|
|
06:25.600 --> 06:29.280 |
|
I think my prediction of its difficulty is spot on. |
|
|
|
06:29.280 --> 06:33.160 |
|
I think the time estimates are always hard |
|
|
|
06:33.160 --> 06:37.360 |
|
because they're always a function of society's love |
|
|
|
06:37.360 --> 06:39.560 |
|
and hate of a particular topic. |
|
|
|
06:39.560 --> 06:44.560 |
|
If society gets excited and you get thousands |
|
|
|
06:44.560 --> 06:49.080 |
|
of researchers working on it for a certain application, |
|
|
|
06:49.080 --> 06:52.080 |
|
that application gets solved really quickly. |
|
|
|
06:52.080 --> 06:54.360 |
|
The general intelligence, |
|
|
|
06:54.360 --> 06:58.200 |
|
the computer's complete lack of ability |
|
|
|
06:58.200 --> 07:03.200 |
|
to have awareness of what it's doing, |
|
|
|
07:03.560 --> 07:05.560 |
|
the fact that it's not conscious, |
|
|
|
07:05.560 --> 07:08.640 |
|
the fact that there's no signs of it becoming conscious, |
|
|
|
07:08.640 --> 07:11.880 |
|
the fact that it doesn't read between the lines, |
|
|
|
07:11.880 --> 07:15.080 |
|
those kinds of things that we have to teach it explicitly, |
|
|
|
07:15.080 --> 07:17.520 |
|
what other people pick up implicitly. |
|
|
|
07:17.520 --> 07:20.440 |
|
We don't see that changing yet. |
|
|
|
07:20.440 --> 07:23.600 |
|
There aren't breakthroughs yet that lead us to believe |
|
|
|
07:23.600 --> 07:25.360 |
|
that that's gonna go any faster, |
|
|
|
07:25.360 --> 07:28.680 |
|
which means that it's still gonna be kind of stuck |
|
|
|
07:28.680 --> 07:31.320 |
|
with a lot of limitations |
|
|
|
07:31.320 --> 07:34.080 |
|
where it's probably only gonna do the right thing |
|
|
|
07:34.080 --> 07:37.200 |
|
in very limited, narrow, prespecified contexts |
|
|
|
07:37.200 --> 07:40.960 |
|
where we can prescribe pretty much |
|
|
|
07:40.960 --> 07:42.880 |
|
what's gonna happen there. |
|
|
|
07:42.880 --> 07:44.920 |
|
So I don't see the, |
|
|
|
07:47.000 --> 07:48.040 |
|
it's hard to predict the date |
|
|
|
07:48.040 --> 07:51.800 |
|
because when people don't work on it, it's infinite. |
|
|
|
07:51.800 --> 07:55.240 |
|
When everybody works on it, you get a nice piece of it |
|
|
|
07:55.240 --> 07:58.600 |
|
well solved in a short amount of time. |
|
|
|
07:58.600 --> 08:01.560 |
|
I actually think there's a more important issue right now |
|
|
|
08:01.560 --> 08:04.560 |
|
than the difficulty of it. |
|
|
|
08:04.560 --> 08:07.440 |
|
And that's causing some of us to put the brakes on a little bit. |
|
|
|
08:07.440 --> 08:09.360 |
|
Usually we're all just like step on the gas, |
|
|
|
08:09.360 --> 08:11.160 |
|
let's go faster. |
|
|
|
08:11.160 --> 08:14.200 |
|
This is causing us to pull back and put the brakes on. |
|
|
|
08:14.200 --> 08:18.680 |
|
And that's the way that some of this technology |
|
|
|
08:18.680 --> 08:21.200 |
|
is being used in places like China right now. |
|
|
|
08:21.200 --> 08:24.520 |
|
And that worries me so deeply |
|
|
|
08:24.520 --> 08:27.800 |
|
that it's causing me to pull back myself |
|
|
|
08:27.800 --> 08:30.080 |
|
on a lot of the things that we could be doing |
|
|
|
08:30.080 --> 08:33.680 |
|
and try to get the community to think a little bit more |
|
|
|
08:33.680 --> 08:36.040 |
|
about, okay, if we're gonna go forward with that, |
|
|
|
08:36.040 --> 08:39.320 |
|
how can we do it in a way that puts in place safeguards |
|
|
|
08:39.320 --> 08:41.120 |
|
that protects people? |
|
|
|
08:41.120 --> 08:43.560 |
|
So the technology we're referring to is |
|
|
|
08:43.560 --> 08:46.400 |
|
just when a computer senses the human being, |
|
|
|
08:46.400 --> 08:47.720 |
|
like the human face. |
|
|
|
08:47.720 --> 08:48.560 |
|
Yeah, yeah. |
|
|
|
08:48.560 --> 08:51.840 |
|
And so there's a lot of exciting things there |
|
|
|
08:51.840 --> 08:53.960 |
|
like forming a deep connection with the human being. |
|
|
|
08:53.960 --> 08:55.440 |
|
So what are your worries? |
|
|
|
08:55.440 --> 08:56.880 |
|
How that could go wrong? |
|
|
|
08:58.000 --> 08:59.520 |
|
Is it in terms of privacy? |
|
|
|
08:59.520 --> 09:03.000 |
|
Is it in terms of other kinds of more subtle things? |
|
|
|
09:03.000 --> 09:04.320 |
|
Let's dig into privacy. |
|
|
|
09:04.320 --> 09:07.800 |
|
So here in the US, if I'm watching a video |
|
|
|
09:07.800 --> 09:11.920 |
|
of say a political leader and in the US |
|
|
|
09:11.920 --> 09:13.640 |
|
we're quite free as we all know |
|
|
|
09:13.640 --> 09:17.920 |
|
to even criticize the president of the United States, right? |
|
|
|
09:17.920 --> 09:19.440 |
|
Here that's not a shocking thing. |
|
|
|
09:19.440 --> 09:22.720 |
|
It happens about every five seconds, right? |
|
|
|
09:22.720 --> 09:27.720 |
|
But in China, what happens if you criticize |
|
|
|
09:27.880 --> 09:31.080 |
|
the leader of the government, right? |
|
|
|
09:31.080 --> 09:34.320 |
|
And so people are very careful not to do that. |
|
|
|
09:34.320 --> 09:37.880 |
|
However, what happens if you're simply watching a video |
|
|
|
09:37.880 --> 09:41.000 |
|
and you make a facial expression |
|
|
|
09:41.000 --> 09:45.280 |
|
that shows a little bit of skepticism, right? |
|
|
|
09:45.280 --> 09:48.200 |
|
Well, and here we're completely free to do that. |
|
|
|
09:48.200 --> 09:50.680 |
|
In fact, we're free to fly off the handle |
|
|
|
09:50.680 --> 09:54.720 |
|
and say anything we want, usually. |
|
|
|
09:54.720 --> 09:56.560 |
|
I mean, there are some restrictions |
|
|
|
09:56.560 --> 10:00.840 |
|
when the athlete does this as part of the national broadcast |
|
|
|
10:00.840 --> 10:03.840 |
|
maybe the teams get a little unhappy |
|
|
|
10:03.840 --> 10:05.880 |
|
about picking that forum to do it, right? |
|
|
|
10:05.880 --> 10:08.720 |
|
But that's more a question of judgment. |
|
|
|
10:08.720 --> 10:12.760 |
|
We have these freedoms and in places |
|
|
|
10:12.760 --> 10:14.160 |
|
that don't have those freedoms, |
|
|
|
10:14.160 --> 10:17.080 |
|
what if our technology can read |
|
|
|
10:17.080 --> 10:19.600 |
|
your underlying affective state? |
|
|
|
10:19.600 --> 10:22.440 |
|
What if our technology can read it even noncontact? |
|
|
|
10:22.440 --> 10:24.440 |
|
What if our technology can read it |
|
|
|
10:24.440 --> 10:28.840 |
|
without your prior consent? |
|
|
|
10:28.840 --> 10:31.400 |
|
And here in the US, with my first company
|
|
|
10:31.400 --> 10:32.960 |
|
we started, Affectiva,
|
|
|
10:32.960 --> 10:35.600 |
|
we have worked super hard to turn away money
|
|
|
10:35.600 --> 10:38.440 |
|
and opportunities that try to read people's affect |
|
|
|
10:38.440 --> 10:41.360 |
|
without their prior informed consent. |
|
|
|
10:41.360 --> 10:45.160 |
|
And even the software that is licensable,
|
|
|
10:45.160 --> 10:47.680 |
|
you have to sign things saying you will only use it |
|
|
|
10:47.680 --> 10:51.800 |
|
in certain ways, which essentially is, get people's buy-in,
|
|
|
10:51.800 --> 10:52.640 |
|
right? |
|
|
|
10:52.640 --> 10:55.400 |
|
Don't do this without people agreeing to it. |
|
|
|
10:56.840 --> 10:58.640 |
|
There are other countries where they're not interested |
|
|
|
10:58.640 --> 10:59.560 |
|
in people's buy-in.
|
|
|
10:59.560 --> 11:01.440 |
|
They're just gonna use it. |
|
|
|
11:01.440 --> 11:03.080 |
|
They're gonna inflict it on you. |
|
|
|
11:03.080 --> 11:04.480 |
|
And if you don't like it, |
|
|
|
11:04.480 --> 11:08.480 |
|
you better not scowl in the direction of any sensors. |
|
|
|
11:08.480 --> 11:11.440 |
|
So one, let me just comment on a small tangent. |
|
|
|
11:11.440 --> 11:16.000 |
|
You know, with the idea of adversarial examples
|
|
|
11:16.000 --> 11:20.880 |
|
and deep fakes and so on, what you bring up is actually, |
|
|
|
11:20.880 --> 11:25.800 |
|
in one sense, deep fakes provide a comforting protection |
|
|
|
11:25.800 --> 11:30.640 |
|
that you can no longer really trust |
|
|
|
11:30.640 --> 11:34.560 |
|
that the video of your face was legitimate. |
|
|
|
11:34.560 --> 11:37.080 |
|
And therefore you always have an escape clause |
|
|
|
11:37.080 --> 11:38.480 |
|
if a government is trying, |
|
|
|
11:38.480 --> 11:43.480 |
|
if a stable, balanced, ethical government
|
|
|
11:43.480 --> 11:46.200 |
|
is trying to accuse you of something, |
|
|
|
11:46.200 --> 11:47.040 |
|
at least you have protection. |
|
|
|
11:47.040 --> 11:48.720 |
|
You can say it was fake news, |
|
|
|
11:48.720 --> 11:50.560 |
|
as is a popular term now. |
|
|
|
11:50.560 --> 11:52.320 |
|
Yeah, that's the general thinking of it. |
|
|
|
11:52.320 --> 11:54.320 |
|
We know how to go into the video |
|
|
|
11:54.320 --> 11:58.320 |
|
and see, for example, your heart rate and respiration |
|
|
|
11:58.320 --> 12:02.160 |
|
and whether or not they've been tampered with. |
|
|
|
12:02.160 --> 12:05.480 |
|
And we also can put fake heart rate and respiration |
|
|
|
12:05.480 --> 12:06.320 |
|
in your video. |
|
|
|
12:06.320 --> 12:09.120 |
|
Now, we decided we needed to do that
|
|
|
12:09.120 --> 12:12.600 |
|
after we developed a way to extract it, |
|
|
|
12:12.600 --> 12:15.960 |
|
but we decided we also needed a way to jam it. |
|
|
|
12:15.960 --> 12:18.320 |
|
And so the fact that we took time |
|
|
|
12:18.320 --> 12:20.920 |
|
to do that other step too, |
|
|
|
12:20.920 --> 12:22.560 |
|
that was time that I wasn't spending |
|
|
|
12:22.560 --> 12:25.280 |
|
making the machine more affectively intelligent.
|
|
|
12:25.280 --> 12:28.520 |
|
And there's a choice in how we spend our time, |
|
|
|
12:28.520 --> 12:32.440 |
|
which is now being swayed a little bit less by this goal |
|
|
|
12:32.440 --> 12:34.360 |
|
and a little bit more like by concern |
|
|
|
12:34.360 --> 12:36.600 |
|
about what's happening in society |
|
|
|
12:36.600 --> 12:38.880 |
|
and what kind of future do we wanna build. |
|
|
|
12:38.880 --> 12:41.680 |
|
And as we step back and say, |
|
|
|
12:41.680 --> 12:44.600 |
|
okay, we don't just build AI to build AI |
|
|
|
12:44.600 --> 12:46.520 |
|
to make Elon Musk more money |
|
|
|
12:46.520 --> 12:48.760 |
|
or to make Amazon's Jeff Bezos more money.
|
|
|
12:48.760 --> 12:52.920 |
|
You could go, gosh, that's the wrong ethic.
|
|
|
12:52.920 --> 12:54.160 |
|
Why are we building it? |
|
|
|
12:54.160 --> 12:57.240 |
|
What is the point of building AI? |
|
|
|
12:57.240 --> 13:01.560 |
|
It used to be, it was driven by researchers in academia |
|
|
|
13:01.560 --> 13:04.200 |
|
to get papers published and to make a career for themselves |
|
|
|
13:04.200 --> 13:06.080 |
|
and to do something cool, right? |
|
|
|
13:06.080 --> 13:07.680 |
|
Cause maybe it could be done. |
|
|
|
13:07.680 --> 13:12.480 |
|
Now we realize that this is enabling rich people |
|
|
|
13:12.480 --> 13:14.280 |
|
to get vastly richer. |
|
|
|
13:15.560 --> 13:19.800 |
|
The poor are... the divide is even larger.
|
|
|
13:19.800 --> 13:22.880 |
|
And is that the kind of future that we want? |
|
|
|
13:22.880 --> 13:25.920 |
|
Maybe we wanna think about, maybe we wanna rethink AI. |
|
|
|
13:25.920 --> 13:29.120 |
|
Maybe we wanna rethink the problems in society |
|
|
|
13:29.120 --> 13:32.760 |
|
that are causing the greatest inequity |
|
|
|
13:32.760 --> 13:35.040 |
|
and rethink how to build AI |
|
|
|
13:35.040 --> 13:36.760 |
|
that's not about a general intelligence, |
|
|
|
13:36.760 --> 13:39.320 |
|
but that's about extending the intelligence |
|
|
|
13:39.320 --> 13:41.240 |
|
and capability of the have nots |
|
|
|
13:41.240 --> 13:43.800 |
|
so that we close these gaps in society. |
|
|
|
13:43.800 --> 13:46.640 |
|
Do you hope that kind of stepping on the brake
|
|
|
13:46.640 --> 13:48.000 |
|
happens organically? |
|
|
|
13:48.000 --> 13:51.240 |
|
Because I think still the majority of the force behind AI
|
|
|
13:51.240 --> 13:52.800 |
|
is the desire to publish papers, |
|
|
|
13:52.800 --> 13:55.800 |
|
is to make money without thinking about the why. |
|
|
|
13:55.800 --> 13:57.280 |
|
Do you hope it happens organically? |
|
|
|
13:57.280 --> 13:59.040 |
|
Is there a room for regulation? |
|
|
|
14:01.120 --> 14:02.960 |
|
Yeah, yeah, yeah, great questions. |
|
|
|
14:02.960 --> 14:07.360 |
|
I prefer the, they talk about the carrot versus the stick. |
|
|
|
14:07.360 --> 14:09.160 |
|
I definitely prefer the carrot to the stick. |
|
|
|
14:09.160 --> 14:14.160 |
|
And in our free world, there's only so much stick, right? |
|
|
|
14:14.920 --> 14:17.280 |
|
You're gonna find a way around it. |
|
|
|
14:17.280 --> 14:21.160 |
|
I generally think less regulation is better. |
|
|
|
14:21.160 --> 14:24.440 |
|
That said, even though my position is classically carrot, |
|
|
|
14:24.440 --> 14:26.280 |
|
no stick, no regulation, |
|
|
|
14:26.280 --> 14:29.080 |
|
I think we do need some regulations in this space. |
|
|
|
14:29.080 --> 14:30.720 |
|
I do think we need regulations |
|
|
|
14:30.720 --> 14:33.640 |
|
around protecting people with their data, |
|
|
|
14:33.640 --> 14:38.200 |
|
that you own your data, not Amazon, not Google. |
|
|
|
14:38.200 --> 14:40.800 |
|
I would like to see people own their own data. |
|
|
|
14:40.800 --> 14:42.520 |
|
I would also like to see the regulations |
|
|
|
14:42.520 --> 14:44.520 |
|
that we have right now around lie detection |
|
|
|
14:44.520 --> 14:48.200 |
|
being extended to emotion recognition in general. |
|
|
|
14:48.200 --> 14:51.040 |
|
That right now you can't use a lie detector on an employee |
|
|
|
14:51.040 --> 14:52.760 |
|
or on a candidate
|
|
|
14:52.760 --> 14:54.720 |
|
when you're interviewing them for a job. |
|
|
|
14:54.720 --> 14:57.800 |
|
I think similarly, we need to put in place protection |
|
|
|
14:57.800 --> 15:00.720 |
|
around reading people's emotions without their consent |
|
|
|
15:00.720 --> 15:02.200 |
|
and in certain cases, |
|
|
|
15:02.200 --> 15:06.160 |
|
like characterizing them for a job and other opportunities. |
|
|
|
15:06.160 --> 15:09.200 |
|
So I also think that when we're reading emotion |
|
|
|
15:09.200 --> 15:11.760 |
|
that's predictive around mental health, |
|
|
|
15:11.760 --> 15:14.240 |
|
that that should, even though it's not medical data, |
|
|
|
15:14.240 --> 15:16.160 |
|
that that should get the kinds of protections |
|
|
|
15:16.160 --> 15:18.520 |
|
that our medical data gets. |
|
|
|
15:18.520 --> 15:20.080 |
|
What most people don't know yet |
|
|
|
15:20.080 --> 15:22.640 |
|
is right now with your smartphone use, |
|
|
|
15:22.640 --> 15:25.280 |
|
and if you're wearing a sensor |
|
|
|
15:25.280 --> 15:27.760 |
|
and you wanna learn about your stress and your sleep, |
|
|
|
15:27.760 --> 15:29.040 |
|
and your physical activity, |
|
|
|
15:29.040 --> 15:30.840 |
|
and how much you're using your phone |
|
|
|
15:30.840 --> 15:32.640 |
|
and your social interaction, |
|
|
|
15:32.640 --> 15:34.960 |
|
all of that non-medical data,
|
|
|
15:34.960 --> 15:37.960 |
|
when we put it together with machine learning, |
|
|
|
15:37.960 --> 15:40.160 |
|
now called AI, even though the founders of AI |
|
|
|
15:40.160 --> 15:41.720 |
|
wouldn't have called it that, |
|
|
|
15:42.960 --> 15:47.960 |
|
that capability can not only tell that you're calm right now, |
|
|
|
15:48.440 --> 15:50.840 |
|
or that you're getting a little stressed, |
|
|
|
15:50.840 --> 15:53.920 |
|
but it can also predict how you're likely to be tomorrow. |
|
|
|
15:53.920 --> 15:55.880 |
|
If you're likely to be sick or healthy, |
|
|
|
15:55.880 --> 15:58.720 |
|
happy or sad, stressed or calm. |
|
|
|
15:58.720 --> 16:00.640 |
|
Especially when you're tracking data over time. |
|
|
|
16:00.640 --> 16:03.760 |
|
Especially when we're tracking a week of your data or more. |
|
|
|
16:03.760 --> 16:06.000 |
|
Do you have an optimism towards this?
|
|
|
16:06.000 --> 16:07.800 |
|
A lot of people on our phones
|
|
|
16:07.800 --> 16:10.360 |
|
are worried about this camera that's looking at us.
|
|
|
16:10.360 --> 16:12.520 |
|
For the most part, on balance, |
|
|
|
16:12.520 --> 16:16.120 |
|
are you optimistic about the benefits |
|
|
|
16:16.120 --> 16:17.480 |
|
that can be brought from that camera |
|
|
|
16:17.480 --> 16:19.640 |
|
that's looking at billions of us, |
|
|
|
16:19.640 --> 16:22.080 |
|
or should we be more worried? |
|
|
|
16:22.080 --> 16:27.080 |
|
I think we should be a little bit more worried |
|
|
|
16:28.840 --> 16:32.480 |
|
about who's looking at us and listening to us. |
|
|
|
16:32.480 --> 16:36.680 |
|
The device sitting on your countertop in your kitchen, |
|
|
|
16:36.680 --> 16:41.680 |
|
whether it's Alexa or Google Home or Apple Siri,
|
|
|
16:42.120 --> 16:46.440 |
|
these devices want to listen, |
|
|
|
16:47.480 --> 16:49.640 |
|
well, they say, ostensibly to help us.
|
|
|
16:49.640 --> 16:52.080 |
|
And I think there are great people in these companies |
|
|
|
16:52.080 --> 16:54.160 |
|
who do want to help people. |
|
|
|
16:54.160 --> 16:56.160 |
|
Let me not brand them all bad. |
|
|
|
16:56.160 --> 16:59.320 |
|
I'm a user of products from all of these companies. |
|
|
|
16:59.320 --> 17:04.320 |
|
I'm naming all the A companies, Alphabet, Apple, Amazon. |
|
|
|
17:04.360 --> 17:09.120 |
|
They are awfully big companies, right? |
|
|
|
17:09.120 --> 17:11.520 |
|
They have incredible power. |
|
|
|
17:11.520 --> 17:16.520 |
|
And what if China were to buy them, right? |
|
|
|
17:16.520 --> 17:19.320 |
|
And suddenly, all of that data |
|
|
|
17:19.320 --> 17:21.880 |
|
were not part of free America, |
|
|
|
17:21.880 --> 17:23.800 |
|
but all of that data were part of somebody |
|
|
|
17:23.800 --> 17:26.040 |
|
who just wants to take over the world |
|
|
|
17:26.040 --> 17:27.480 |
|
and you submit to them. |
|
|
|
17:27.480 --> 17:31.560 |
|
And guess what happens if you so much as smirk the wrong way |
|
|
|
17:31.560 --> 17:34.000 |
|
when they say something that you don't like? |
|
|
|
17:34.000 --> 17:36.840 |
|
Well, they have reeducation camps, right? |
|
|
|
17:36.840 --> 17:38.360 |
|
That's a nice word for them. |
|
|
|
17:38.360 --> 17:40.800 |
|
By the way, they have a surplus of organs |
|
|
|
17:40.800 --> 17:42.720 |
|
for people who have surgery these days. |
|
|
|
17:42.720 --> 17:44.440 |
|
They don't have an organ donation problem, |
|
|
|
17:44.440 --> 17:45.640 |
|
because they take your blood. |
|
|
|
17:45.640 --> 17:47.480 |
|
And they know you're a match. |
|
|
|
17:47.480 --> 17:51.640 |
|
And the doctors are on record as taking organs from people
|
|
|
17:51.640 --> 17:54.680 |
|
who are perfectly healthy and not prisoners. |
|
|
|
17:54.680 --> 17:57.960 |
|
They're just simply not the favored ones of the government. |
|
|
|
17:58.960 --> 18:03.960 |
|
And, you know, that's a pretty freaky evil society. |
|
|
|
18:03.960 --> 18:05.920 |
|
And we can use the word evil there. |
|
|
|
18:05.920 --> 18:07.280 |
|
I was born in the Soviet Union. |
|
|
|
18:07.280 --> 18:11.480 |
|
I can certainly connect to the worry |
|
|
|
18:11.480 --> 18:12.520 |
|
that you're expressing. |
|
|
|
18:12.520 --> 18:14.920 |
|
At the same time, probably both you and I |
|
|
|
18:14.920 --> 18:19.000 |
|
and you very much so, you know,
|
|
|
18:19.000 --> 18:22.440 |
|
there's an exciting possibility |
|
|
|
18:22.440 --> 18:27.000 |
|
that you can have a deep connection with the machine. |
|
|
|
18:27.000 --> 18:28.000 |
|
Yeah, yeah. |
|
|
|
18:28.000 --> 18:29.000 |
|
Right, so. |
|
|
|
18:29.000 --> 18:35.000 |
|
Those of us, I've admitted students who say that they, |
|
|
|
18:35.000 --> 18:37.400 |
|
you know, when you list, like, who do you most wish |
|
|
|
18:37.400 --> 18:40.640 |
|
you could have lunch with or dinner with, right? |
|
|
|
18:40.640 --> 18:42.680 |
|
And they'll write, like, I don't like people. |
|
|
|
18:42.680 --> 18:44.040 |
|
I just like computers. |
|
|
|
18:44.040 --> 18:45.760 |
|
And one of them said to me once |
|
|
|
18:45.760 --> 18:48.480 |
|
when I had this party at my house, |
|
|
|
18:48.480 --> 18:52.560 |
|
I want you to know this is my only social event of the year, |
|
|
|
18:52.560 --> 18:54.840 |
|
my one social event of the year. |
|
|
|
18:54.840 --> 18:58.080 |
|
Like, okay, now this is a brilliant machine learning person, |
|
|
|
18:58.080 --> 18:59.080 |
|
right? |
|
|
|
18:59.080 --> 19:01.320 |
|
And we need that kind of brilliance in machine learning.
|
|
|
19:01.320 --> 19:04.040 |
|
And I love that computer science welcomes people
|
|
|
19:04.040 --> 19:07.040 |
|
who love people and people who are very awkward around people. |
|
|
|
19:07.040 --> 19:12.040 |
|
I love that this is a field that anybody could join. |
|
|
|
19:12.040 --> 19:15.040 |
|
We need all kinds of people. |
|
|
|
19:15.040 --> 19:17.040 |
|
And you don't need to be a social person. |
|
|
|
19:17.040 --> 19:19.040 |
|
I'm not trying to force people who don't like people |
|
|
|
19:19.040 --> 19:21.040 |
|
to suddenly become social. |
|
|
|
19:21.040 --> 19:26.040 |
|
At the same time, if most of the people building the AIs |
|
|
|
19:26.040 --> 19:29.040 |
|
of the future are the kind of people who don't like people, |
|
|
|
19:29.040 --> 19:31.040 |
|
we've got a little bit of a problem. |
|
|
|
19:31.040 --> 19:32.040 |
|
Hold on a second. |
|
|
|
19:32.040 --> 19:34.040 |
|
So let me, let me push back on that. |
|
|
|
19:34.040 --> 19:39.040 |
|
So don't you think a large percentage of the world can, |
|
|
|
19:39.040 --> 19:41.040 |
|
you know, there's loneliness. |
|
|
|
19:41.040 --> 19:44.040 |
|
There is a huge problem with loneliness and it's growing. |
|
|
|
19:44.040 --> 19:47.040 |
|
And so there's a longing for connection. |
|
|
|
19:47.040 --> 19:51.040 |
|
If you're lonely, you're part of a big and growing group. |
|
|
|
19:51.040 --> 19:52.040 |
|
Yes. |
|
|
|
19:52.040 --> 19:54.040 |
|
So we're in it together, I guess. |
|
|
|
19:54.040 --> 19:56.040 |
|
If you're lonely, join the group. |
|
|
|
19:56.040 --> 19:57.040 |
|
You're not alone. |
|
|
|
19:57.040 --> 19:58.040 |
|
You're not alone. |
|
|
|
19:58.040 --> 20:00.040 |
|
That's a good line. |
|
|
|
20:00.040 --> 20:04.040 |
|
But do you think there's a, you talked about some worry, |
|
|
|
20:04.040 --> 20:07.040 |
|
but do you think there's an exciting possibility |
|
|
|
20:07.040 --> 20:11.040 |
|
that something like Alexa, when these kinds of tools |
|
|
|
20:11.040 --> 20:16.040 |
|
can alleviate that loneliness in a way that other humans can't? |
|
|
|
20:16.040 --> 20:17.040 |
|
Yeah. |
|
|
|
20:17.040 --> 20:18.040 |
|
Yeah, definitely. |
|
|
|
20:18.040 --> 20:21.040 |
|
I mean, a great book can kind of alleviate loneliness. |
|
|
|
20:21.040 --> 20:22.040 |
|
Right. |
|
|
|
20:22.040 --> 20:23.040 |
|
Exactly. |
|
|
|
20:23.040 --> 20:25.040 |
|
Because you just get sucked into this amazing story |
|
|
|
20:25.040 --> 20:27.040 |
|
and you can't wait to go spend time with that character. |
|
|
|
20:27.040 --> 20:28.040 |
|
Right. |
|
|
|
20:28.040 --> 20:30.040 |
|
And they're not a human character. |
|
|
|
20:30.040 --> 20:32.040 |
|
There is a human behind it. |
|
|
|
20:32.040 --> 20:35.040 |
|
But yeah, it can be an incredibly delightful way |
|
|
|
20:35.040 --> 20:37.040 |
|
to pass the hours. |
|
|
|
20:37.040 --> 20:39.040 |
|
And it can meet needs. |
|
|
|
20:39.040 --> 20:43.040 |
|
Even, you know, I don't read those trashy romance books, |
|
|
|
20:43.040 --> 20:44.040 |
|
but somebody does, right? |
|
|
|
20:44.040 --> 20:46.040 |
|
And what are they getting from this? |
|
|
|
20:46.040 --> 20:50.040 |
|
Well, probably some of that feeling of being there, right? |
|
|
|
20:50.040 --> 20:54.040 |
|
Being there in that social moment, that romantic moment, |
|
|
|
20:54.040 --> 20:56.040 |
|
or connecting with somebody. |
|
|
|
20:56.040 --> 20:59.040 |
|
I've had a similar experience reading some science fiction books, right? |
|
|
|
20:59.040 --> 21:01.040 |
|
And connecting with the character, Orson Scott Card. |
|
|
|
21:01.040 --> 21:05.040 |
|
And, you know, just amazing writing in Ender's Game
|
|
|
21:05.040 --> 21:07.040 |
|
and Speaker for the Dead, terrible title. |
|
|
|
21:07.040 --> 21:10.040 |
|
But those kind of books that pull you into a character, |
|
|
|
21:10.040 --> 21:13.040 |
|
and you feel like you're, you feel very social. |
|
|
|
21:13.040 --> 21:17.040 |
|
You feel very connected, even though it's not responding to you.
|
|
|
21:17.040 --> 21:19.040 |
|
And a computer, of course, can respond to you. |
|
|
|
21:19.040 --> 21:20.040 |
|
Right. |
|
|
|
21:20.040 --> 21:21.040 |
|
So it can deepen it, right? |
|
|
|
21:21.040 --> 21:25.040 |
|
You can have a very deep connection, |
|
|
|
21:25.040 --> 21:29.040 |
|
much more than the movie Her, you know, plays up, right? |
|
|
|
21:29.040 --> 21:30.040 |
|
Well, much more. |
|
|
|
21:30.040 --> 21:34.040 |
|
I mean, movie Her is already a pretty deep connection, right? |
|
|
|
21:34.040 --> 21:36.040 |
|
Well, but it's just a movie, right? |
|
|
|
21:36.040 --> 21:37.040 |
|
It's scripted. |
|
|
|
21:37.040 --> 21:38.040 |
|
It's just, you know. |
|
|
|
21:38.040 --> 21:42.040 |
|
But I mean, like, there can be a real interaction |
|
|
|
21:42.040 --> 21:46.040 |
|
where the character can learn and you can learn. |
|
|
|
21:46.040 --> 21:49.040 |
|
You could imagine it not just being you and one character. |
|
|
|
21:49.040 --> 21:51.040 |
|
You can imagine a group of characters. |
|
|
|
21:51.040 --> 21:53.040 |
|
You can imagine a group of people and characters, |
|
|
|
21:53.040 --> 21:56.040 |
|
human and AI connecting. |
|
|
|
21:56.040 --> 22:01.040 |
|
Where maybe a few people can't sort of be friends with everybody, |
|
|
|
22:01.040 --> 22:07.040 |
|
but the few people and their AIs can befriend more people. |
|
|
|
22:07.040 --> 22:10.040 |
|
There can be an extended human intelligence in there |
|
|
|
22:10.040 --> 22:14.040 |
|
where each human can connect with more people that way. |
|
|
|
22:14.040 --> 22:18.040 |
|
But it's still very limited. |
|
|
|
22:18.040 --> 22:21.040 |
|
But there are just, what I mean is there are many more possibilities |
|
|
|
22:21.040 --> 22:22.040 |
|
than what's in that movie. |
|
|
|
22:22.040 --> 22:24.040 |
|
So there's a tension here. |
|
|
|
22:24.040 --> 22:28.040 |
|
So one, you expressed a really serious concern about privacy, |
|
|
|
22:28.040 --> 22:31.040 |
|
about how governments can misuse the information. |
|
|
|
22:31.040 --> 22:34.040 |
|
And there's the possibility of this connection. |
|
|
|
22:34.040 --> 22:36.040 |
|
So let's look at Alexa. |
|
|
|
22:36.040 --> 22:38.040 |
|
So personal assistance. |
|
|
|
22:38.040 --> 22:42.040 |
|
For the most part, as far as I'm aware, they ignore your emotion. |
|
|
|
22:42.040 --> 22:47.040 |
|
They ignore even the context or the existence of you, |
|
|
|
22:47.040 --> 22:51.040 |
|
the intricate, beautiful, complex aspects of who you are, |
|
|
|
22:51.040 --> 22:56.040 |
|
except maybe aspects of your voice that help it with
|
|
|
22:56.040 --> 22:58.040 |
|
speech recognition. |
|
|
|
22:58.040 --> 23:03.040 |
|
Do you think they should move towards trying to understand your emotion? |
|
|
|
23:03.040 --> 23:07.040 |
|
All of these companies are very interested in understanding human emotion. |
|
|
|
23:07.040 --> 23:13.040 |
|
More people are telling Siri every day they want to kill themselves. |
|
|
|
23:13.040 --> 23:17.040 |
|
Apple wants to know the difference between if a person is really suicidal |
|
|
|
23:17.040 --> 23:21.040 |
|
versus if a person is just kind of fooling around with Siri. |
|
|
|
23:21.040 --> 23:25.040 |
|
The words may be the same, the tone of voice, |
|
|
|
23:25.040 --> 23:31.040 |
|
and what surrounds those words is pivotal to understand |
|
|
|
23:31.040 --> 23:35.040 |
|
if they should respond in a very serious way, bring help to that person, |
|
|
|
23:35.040 --> 23:40.040 |
|
or if they should kind of jokingly tease back, |
|
|
|
23:40.040 --> 23:45.040 |
|
you just want to sell me for something else. |
|
|
|
23:45.040 --> 23:48.040 |
|
Like, how do you respond when somebody says that? |
|
|
|
23:48.040 --> 23:53.040 |
|
Well, you do want to err on the side of being careful and taking it seriously. |
|
|
|
23:53.040 --> 23:59.040 |
|
People want to know if the person is happy or stressed. |
|
|
|
23:59.040 --> 24:03.040 |
|
In part, well, so let me give you an altruistic reason |
|
|
|
24:03.040 --> 24:08.040 |
|
and a business profit motivated reason, |
|
|
|
24:08.040 --> 24:13.040 |
|
and there are people in companies that operate on both principles. |
|
|
|
24:13.040 --> 24:17.040 |
|
Altruistic people really care about their customers |
|
|
|
24:17.040 --> 24:20.040 |
|
and really care about helping you feel a little better at the end of the day, |
|
|
|
24:20.040 --> 24:24.040 |
|
and it would just make those people happy if they knew that they made your life better. |
|
|
|
24:24.040 --> 24:29.040 |
|
If you came home stressed and after talking with their product, you felt better. |
|
|
|
24:29.040 --> 24:35.040 |
|
There are other people who maybe have studied the way affect affects decision making |
|
|
|
24:35.040 --> 24:39.040 |
|
and prices people pay, and they know, I don't know if I should tell you, |
|
|
|
24:39.040 --> 24:44.040 |
|
the work of Jen Lerner on heartstrings and purse strings. |
|
|
|
24:44.040 --> 24:50.040 |
|
If we manipulate you into a slightly sadder mood, you'll pay more. |
|
|
|
24:50.040 --> 24:54.040 |
|
You'll pay more to change your situation. |
|
|
|
24:54.040 --> 24:58.040 |
|
You'll pay more for something you don't even need to make yourself feel better. |
|
|
|
24:58.040 --> 25:01.040 |
|
If they sound a little sad, maybe I don't want to cheer them up. |
|
|
|
25:01.040 --> 25:05.040 |
|
Maybe first I want to help them get something, |
|
|
|
25:05.040 --> 25:09.040 |
|
a little shopping therapy that helps them. |
|
|
|
25:09.040 --> 25:13.040 |
|
Which is really difficult for a company that's primarily funded on advertisements, |
|
|
|
25:13.040 --> 25:16.040 |
|
so they're encouraged to get you to... |
|
|
|
25:16.040 --> 25:20.040 |
|
To offer you products or Amazon that's primarily funded on you buying things from their store. |
|
|
|
25:20.040 --> 25:25.040 |
|
Maybe we need regulation in the future to put a little bit of a wall |
|
|
|
25:25.040 --> 25:29.040 |
|
between these agents that have access to our emotion |
|
|
|
25:29.040 --> 25:32.040 |
|
and agents that want to sell us stuff. |
|
|
|
25:32.040 --> 25:38.040 |
|
Maybe there needs to be a little bit more of a firewall in between those. |
|
|
|
25:38.040 --> 25:42.040 |
|
So maybe digging in a little bit on the interaction with Alexa, |
|
|
|
25:42.040 --> 25:45.040 |
|
you mentioned, of course, a really serious concern about, |
|
|
|
25:45.040 --> 25:50.040 |
|
like recognizing emotion if somebody is speaking of suicide or depression and so on, |
|
|
|
25:50.040 --> 25:54.040 |
|
but what about the actual interaction itself? |
|
|
|
25:54.040 --> 25:57.040 |
|
Do you think... |
|
|
|
25:57.040 --> 26:01.040 |
|
You mentioned Clippy and it being annoying.
|
|
|
26:01.040 --> 26:04.040 |
|
What is the objective function we're trying to optimize? |
|
|
|
26:04.040 --> 26:09.040 |
|
Is it minimize annoyingness or maximize happiness? |
|
|
|
26:09.040 --> 26:12.040 |
|
Or if we look at human relations, |
|
|
|
26:12.040 --> 26:16.040 |
|
I think that push and pull, the tension, the dance, |
|
|
|
26:16.040 --> 26:20.040 |
|
the annoying, the flaws, that's what makes it fun. |
|
|
|
26:20.040 --> 26:23.040 |
|
So is there a room for... |
|
|
|
26:23.040 --> 26:25.040 |
|
What is the objective function? |
|
|
|
26:25.040 --> 26:29.040 |
|
There are times when you want to have a little push and pull. Think of kids sparring.
|
|
|
26:29.040 --> 26:34.040 |
|
You know, I see my sons and one of them wants to provoke the other to be upset. |
|
|
|
26:34.040 --> 26:35.040 |
|
And that's fun. |
|
|
|
26:35.040 --> 26:38.040 |
|
And it's actually healthy to learn where your limits are, |
|
|
|
26:38.040 --> 26:40.040 |
|
to learn how to self regulate. |
|
|
|
26:40.040 --> 26:43.040 |
|
You can imagine a game where it's trying to make you mad |
|
|
|
26:43.040 --> 26:45.040 |
|
and you're trying to show self control. |
|
|
|
26:45.040 --> 26:48.040 |
|
And so if we're doing an AI human interaction
|
|
|
26:48.040 --> 26:51.040 |
|
that's helping build resilience and self control, |
|
|
|
26:51.040 --> 26:54.040 |
|
whether it's to learn how to not be a bully |
|
|
|
26:54.040 --> 26:58.040 |
|
or how to turn the other cheek or how to deal with an abusive person in your life, |
|
|
|
26:58.040 --> 27:03.040 |
|
then you might need an AI that pushes your buttons, right? |
|
|
|
27:03.040 --> 27:07.040 |
|
But in general, do you want an AI that pushes your buttons? |
|
|
|
27:07.040 --> 27:11.040 |
|
Probably depends on your personality. |
|
|
|
27:11.040 --> 27:12.040 |
|
I don't. |
|
|
|
27:12.040 --> 27:17.040 |
|
I want one that's respectful, that is there to serve me |
|
|
|
27:17.040 --> 27:22.040 |
|
and that is there to extend my ability to do things. |
|
|
|
27:22.040 --> 27:24.040 |
|
I'm not looking for a rival. |
|
|
|
27:24.040 --> 27:26.040 |
|
I'm looking for a helper. |
|
|
|
27:26.040 --> 27:29.040 |
|
And that's the kind of AI I'd put my money on. |
|
|
|
27:29.040 --> 27:32.040 |
|
Your sense is, for the majority of people in the world,
|
|
|
27:32.040 --> 27:34.040 |
|
in order to have a rich experience, |
|
|
|
27:34.040 --> 27:36.040 |
|
that's what they're looking for as well. |
|
|
|
27:36.040 --> 27:40.040 |
|
So if you look at the movie Her, spoiler alert, |
|
|
|
27:40.040 --> 27:45.040 |
|
I believe the program, the woman in the movie Her, |
|
|
|
27:45.040 --> 27:50.040 |
|
leaves the person for somebody else. |
|
|
|
27:50.040 --> 27:53.040 |
|
Says they don't want to be dating anymore. |
|
|
|
27:53.040 --> 27:56.040 |
|
Right. |
|
|
|
27:56.040 --> 27:59.040 |
|
Your sense is, if Alexa said, you know what?
|
|
|
27:59.040 --> 28:03.040 |
|
I've actually had enough of you for a while,
|
|
|
28:03.040 --> 28:05.040 |
|
so I'm going to shut myself off. |
|
|
|
28:05.040 --> 28:07.040 |
|
You don't see that as... |
|
|
|
28:07.040 --> 28:10.040 |
|
I'd say you're trash because I paid for you, right?
|
|
|
28:10.040 --> 28:14.040 |
|
We've got to remember, |
|
|
|
28:14.040 --> 28:18.040 |
|
and this is where this blending of human and AI
|
|
|
28:18.040 --> 28:22.040 |
|
as if we're equals is really deceptive, |
|
|
|
28:22.040 --> 28:26.040 |
|
because AI is something at the end of the day |
|
|
|
28:26.040 --> 28:29.040 |
|
that my students and I are making in the lab, |
|
|
|
28:29.040 --> 28:33.040 |
|
and we're choosing what it's allowed to say, |
|
|
|
28:33.040 --> 28:36.040 |
|
when it's allowed to speak, what it's allowed to listen to, |
|
|
|
28:36.040 --> 28:39.040 |
|
what it's allowed to act on, |
|
|
|
28:39.040 --> 28:43.040 |
|
given the inputs that we choose to expose it to, |
|
|
|
28:43.040 --> 28:45.040 |
|
what outputs it's allowed to have. |
|
|
|
28:45.040 --> 28:49.040 |
|
It's all something made by a human, |
|
|
|
28:49.040 --> 28:52.040 |
|
and if we want to make something that makes our lives miserable, |
|
|
|
28:52.040 --> 28:55.040 |
|
fine, I wouldn't invest in it as a business, |
|
|
|
28:55.040 --> 28:59.040 |
|
unless it's just there for self regulation training. |
|
|
|
28:59.040 --> 29:02.040 |
|
But I think we need to think about what kind of future we want, |
|
|
|
29:02.040 --> 29:05.040 |
|
and actually your question, I really like the... |
|
|
|
29:05.040 --> 29:07.040 |
|
What is the objective function? |
|
|
|
29:07.040 --> 29:10.040 |
|
Is it to calm people down sometimes? |
|
|
|
29:10.040 --> 29:14.040 |
|
Is it to always make people happy and calm them down? |
|
|
|
29:14.040 --> 29:16.040 |
|
Well, there was a book about that, right? |
|
|
|
29:16.040 --> 29:19.040 |
|
The Brave New World, you know, make everybody happy, |
|
|
|
29:19.040 --> 29:22.040 |
|
take your Soma if you're unhappy, take your happy pill, |
|
|
|
29:22.040 --> 29:24.040 |
|
and if you refuse to take your happy pill, |
|
|
|
29:24.040 --> 29:29.040 |
|
well, we'll threaten you by sending you to Iceland to live there. |
|
|
|
29:29.040 --> 29:32.040 |
|
I lived in Iceland three years. It's a great place. |
|
|
|
29:32.040 --> 29:35.040 |
|
Don't take your Soma, then go to Iceland. |
|
|
|
29:35.040 --> 29:37.040 |
|
A little TV commercial there. |
|
|
|
29:37.040 --> 29:40.040 |
|
I was a child there for a few years. It's a wonderful place. |
|
|
|
29:40.040 --> 29:43.040 |
|
So that part of the book never scared me. |
|
|
|
29:43.040 --> 29:48.040 |
|
But really, do we want AI to manipulate us into submission, |
|
|
|
29:48.040 --> 29:49.040 |
|
into making us happy? |
|
|
|
29:49.040 --> 29:56.040 |
|
Well, if you are a power-obsessed, sick dictator, an individual
|
|
|
29:56.040 --> 29:59.040 |
|
who only wants to control other people to get your jollies in life, |
|
|
|
29:59.040 --> 30:04.040 |
|
then yeah, you want to use AI to extend your power and your scale
|
|
|
30:04.040 --> 30:07.040 |
|
to force people into submission. |
|
|
|
30:07.040 --> 30:11.040 |
|
If you believe that the human race is better off being given freedom |
|
|
|
30:11.040 --> 30:15.040 |
|
and the opportunity to do things that might surprise you, |
|
|
|
30:15.040 --> 30:20.040 |
|
then you want to use AI to extend people's ability to build, |
|
|
|
30:20.040 --> 30:23.040 |
|
you want to build AI that extends human intelligence, |
|
|
|
30:23.040 --> 30:27.040 |
|
that empowers the weak and helps balance the power |
|
|
|
30:27.040 --> 30:29.040 |
|
between the weak and the strong, |
|
|
|
30:29.040 --> 30:32.040 |
|
not that gives more power to the strong. |
|
|
|
30:32.040 --> 30:38.040 |
|
So in this process of empowering people and sensing people |
|
|
|
30:38.040 --> 30:43.040 |
|
what is your sense, in terms of recognizing emotion, of
|
|
|
30:43.040 --> 30:47.040 |
|
the difference between emotion that is shown and emotion that is felt?
|
|
|
30:47.040 --> 30:54.040 |
|
So yeah, emotion that is expressed on the surface through your face, |
|
|
|
30:54.040 --> 30:57.040 |
|
your body, and various other things, |
|
|
|
30:57.040 --> 31:00.040 |
|
and what's actually going on deep inside on the biological level, |
|
|
|
31:00.040 --> 31:04.040 |
|
on the neuroscience level, or some kind of cognitive level. |
|
|
|
31:04.040 --> 31:06.040 |
|
Yeah, yeah. |
|
|
|
31:06.040 --> 31:08.040 |
|
So no easy questions here. |
|
|
|
31:08.040 --> 31:11.040 |
|
Yeah, I'm sure there's no definitive answer, |
|
|
|
31:11.040 --> 31:16.040 |
|
but what's your sense, how far can we get by just looking at the face? |
|
|
|
31:16.040 --> 31:18.040 |
|
We're very limited when we just look at the face, |
|
|
|
31:18.040 --> 31:22.040 |
|
but we can get further than most people think we can get. |
|
|
|
31:22.040 --> 31:26.040 |
|
People think, hey, I have a great poker face, |
|
|
|
31:26.040 --> 31:28.040 |
|
therefore all you're ever going to get from me is neutral. |
|
|
|
31:28.040 --> 31:30.040 |
|
Well, that's naive. |
|
|
|
31:30.040 --> 31:35.040 |
|
We can read with the ordinary camera on your laptop or on your phone. |
|
|
|
31:35.040 --> 31:39.040 |
|
We can read from a neutral face if your heart is racing. |
|
|
|
31:39.040 --> 31:45.040 |
|
We can read from a neutral face if your breathing is becoming irregular |
|
|
|
31:45.040 --> 31:47.040 |
|
and showing signs of stress. |
|
|
|
31:47.040 --> 31:53.040 |
|
We can read under some conditions that maybe I won't give you details on, |
|
|
|
31:53.040 --> 31:57.040 |
|
how your heart rate variability power is changing. |
|
|
|
31:57.040 --> 32:03.040 |
|
That could be a sign of stress even when your heart rate is not necessarily accelerating. |
|
|
|
32:03.040 --> 32:06.040 |
|
Sorry, from physiosensors or from the face? |
|
|
|
32:06.040 --> 32:09.040 |
|
From the color changes that you cannot even see, |
|
|
|
32:09.040 --> 32:11.040 |
|
but the camera can see. |
|
|
|
32:11.040 --> 32:13.040 |
|
That's amazing. |
|
|
|
32:13.040 --> 32:15.040 |
|
So you can get a lot of signal. |
|
|
|
32:15.040 --> 32:18.040 |
|
So we get things people can't see using a regular camera, |
|
|
|
32:18.040 --> 32:22.040 |
|
and from that we can tell things about your stress. |
|
|
|
32:22.040 --> 32:25.040 |
|
So if you were just sitting there with a blank face |
|
|
|
32:25.040 --> 32:30.040 |
|
thinking nobody can read my emotion, well, you're wrong. |
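A minimal sketch of the kind of camera-based pulse reading described above, recovering a heart rate from tiny color changes in ordinary video of a face (remote photoplethysmography). This is not Picard's or Affectiva's actual pipeline; the function name, the assumed face box, and the frame rate are illustrative assumptions.

```python
# Illustrative sketch only: estimate pulse rate from subtle green-channel
# fluctuations over a face region in RGB video frames.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames, face_box, fps=30.0):
    """frames: sequence of HxWx3 RGB arrays; face_box: (top, bottom, left, right)."""
    top, bottom, left, right = face_box
    # 1. Average the green channel over the face region in each frame;
    #    blood volume changes modulate this value very slightly.
    green = np.array([f[top:bottom, left:right, 1].mean() for f in frames])
    green = green - green.mean()
    # 2. Band-pass to a plausible heart-rate band (0.7-4 Hz, i.e. 42-240 bpm).
    nyquist = fps / 2.0
    b, a = butter(3, [0.7 / nyquist, 4.0 / nyquist], btype="band")
    pulse = filtfilt(b, a, green)
    # 3. Take the dominant spectral peak as the pulse rate.
    spectrum = np.abs(np.fft.rfft(pulse))
    freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
    return 60.0 * freqs[np.argmax(spectrum)]  # beats per minute
```

In practice such signals are noisy and motion-sensitive; working systems add face tracking, chrominance modeling, and artifact rejection on top of this basic idea.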
|
|
|
32:30.040 --> 32:34.040 |
|
So that's really interesting, but that's from visual information from the face. |
|
|
|
32:34.040 --> 32:39.040 |
|
That's almost like cheating your way to the physiological state of the body |
|
|
|
32:39.040 --> 32:43.040 |
|
by being very clever with what you can do with vision. |
|
|
|
32:43.040 --> 32:44.040 |
|
With signal processing. |
|
|
|
32:44.040 --> 32:49.040 |
|
So that's really impressive, but if you just look at the stuff we humans can see, |
|
|
|
32:49.040 --> 32:54.040 |
|
the smile, the smirks, the subtle, all the facial actions. |
|
|
|
32:54.040 --> 32:57.040 |
|
So then you can hide that on your face for a limited amount of time. |
|
|
|
32:57.040 --> 33:01.040 |
|
Now, if you're just going in for a brief interview and you're hiding it, |
|
|
|
33:01.040 --> 33:03.040 |
|
that's pretty easy for most people. |
|
|
|
33:03.040 --> 33:08.040 |
|
If you are, however, surveilled constantly everywhere you go, |
|
|
|
33:08.040 --> 33:13.040 |
|
then it's going to say, gee, you know, Lex used to smile a lot, |
|
|
|
33:13.040 --> 33:15.040 |
|
and now I'm not seeing so many smiles. |
|
|
|
33:15.040 --> 33:22.040 |
|
And Roz used to, you know, laugh a lot and smile a lot very spontaneously. |
|
|
|
33:22.040 --> 33:26.040 |
|
And now I'm only seeing these not so spontaneous looking smiles |
|
|
|
33:26.040 --> 33:28.040 |
|
and only when she's asked these questions. |
|
|
|
33:28.040 --> 33:32.040 |
|
You know, she's probably not getting enough sleep.
|
|
|
33:32.040 --> 33:35.040 |
|
We could look at that too. |
|
|
|
33:35.040 --> 33:37.040 |
|
So now I have to be a little careful too. |
|
|
|
33:37.040 --> 33:42.040 |
|
When I say 'we can't read your emotion' or 'we can,' it's not that binary.
|
|
|
33:42.040 --> 33:48.040 |
|
What we're reading is more some physiological changes that relate to your activation. |
|
|
|
33:48.040 --> 33:52.040 |
|
Now, that doesn't mean that we know everything about how you feel. |
|
|
|
33:52.040 --> 33:54.040 |
|
In fact, we still know very little about how you feel. |
|
|
|
33:54.040 --> 33:56.040 |
|
Your thoughts are still private. |
|
|
|
33:56.040 --> 34:00.040 |
|
Your nuanced feelings are still completely private. |
|
|
|
34:00.040 --> 34:02.040 |
|
We can't read any of that. |
|
|
|
34:02.040 --> 34:06.040 |
|
So there's some relief that we can't read that. |
|
|
|
34:06.040 --> 34:09.040 |
|
Even brain imaging can't read that. |
|
|
|
34:09.040 --> 34:11.040 |
|
Wearables can't read that. |
|
|
|
34:11.040 --> 34:18.040 |
|
However, as we read your body state changes and we know what's going on in your environment, |
|
|
|
34:18.040 --> 34:21.040 |
|
and we look at patterns of those over time, |
|
|
|
34:21.040 --> 34:27.040 |
|
we can start to make some inferences about what you might be feeling. |
|
|
|
34:27.040 --> 34:31.040 |
|
And that is where it's not just the momentary feeling, |
|
|
|
34:31.040 --> 34:34.040 |
|
but it's more your stance toward things. |
|
|
|
34:34.040 --> 34:41.040 |
|
And that could actually be a little bit more scary with certain kinds of governmental
|
|
|
34:41.040 --> 34:48.040 |
|
control freak people who want to know more about, are you on their team or are you not.
|
|
|
34:48.040 --> 34:51.040 |
|
And getting that information over time.
|
|
|
34:51.040 --> 34:54.040 |
|
So you're saying there's a lot of signal by looking at the change over time. |
|
|
|
34:54.040 --> 35:00.040 |
|
So you've done a lot of exciting work both in computer vision and physiological sensing like wearables.
|
|
|
35:00.040 --> 35:03.040 |
|
What do you think is the best modality for, |
|
|
|
35:03.040 --> 35:08.040 |
|
what's the best window into the emotional soul? |
|
|
|
35:08.040 --> 35:10.040 |
|
Is it the face? Is it the voice? |
|
|
|
35:10.040 --> 35:12.040 |
|
Depends what you want to know. |
|
|
|
35:12.040 --> 35:14.040 |
|
It depends what you want to know. |
|
|
|
35:14.040 --> 35:16.040 |
|
Everything is informative. |
|
|
|
35:16.040 --> 35:18.040 |
|
Everything we do is informative. |
|
|
|
35:18.040 --> 35:20.040 |
|
So for health and well being and things like that, |
|
|
|
35:20.040 --> 35:29.040 |
|
do you find the wearable, measuring physiological signals, is the best for health-based stuff?
|
|
|
35:29.040 --> 35:35.040 |
|
So here I'm going to answer empirically with data and studies we've been doing. |
|
|
|
35:35.040 --> 35:37.040 |
|
We've been doing studies now. |
|
|
|
35:37.040 --> 35:40.040 |
|
These are currently running with lots of different kinds of people, |
|
|
|
35:40.040 --> 35:44.040 |
|
but where we've published data, and I can speak publicly to it, |
|
|
|
35:44.040 --> 35:47.040 |
|
the data are limited right now to New England college students. |
|
|
|
35:47.040 --> 35:50.040 |
|
So that's a small group. |
|
|
|
35:50.040 --> 35:52.040 |
|
Among New England college students, |
|
|
|
35:52.040 --> 36:01.040 |
|
when they are wearing a wearable like the Empatica Embrace here that's measuring skin conductance, movement, temperature,
|
|
|
36:01.040 --> 36:10.040 |
|
and when they are using a smartphone that is collecting their time of day of when they're texting, |
|
|
|
36:10.040 --> 36:17.040 |
|
who they're texting, their movement around it, their GPS, the weather information based upon their location, |
|
|
|
36:17.040 --> 36:22.040 |
|
and when we're using machine learning and putting all of that together and looking not just at right now,
|
|
|
36:22.040 --> 36:28.040 |
|
but looking at your rhythm of behaviors over about a week. |
|
|
|
36:28.040 --> 36:38.040 |
|
When we look at that, we are very accurate at forecasting tomorrow's stress, happy versus sad mood, and health.
|
|
|
36:38.040 --> 36:43.040 |
|
And when we look at which pieces of that are most useful, |
|
|
|
36:43.040 --> 36:48.040 |
|
first of all, if you have all the pieces, you get the best results. |
|
|
|
36:48.040 --> 36:53.040 |
|
If you have only the wearable, you get the next best results. |
|
|
|
36:53.040 --> 37:00.040 |
|
And that's still better than 80% accurate at forecasting tomorrow's levels. |
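To make the pipeline described above concrete, here is a minimal sketch of a next-day forecasting model, assuming daily summaries of wearable features (skin conductance, movement, temperature) and phone features (texting times, mobility, weather) as inputs. The feature layout, window length, and the choice of a random forest are illustrative assumptions, not the published study's code.

```python
# Minimal sketch of a next-day stress/mood forecasting pipeline.
# Feature names, window length, and the classifier are illustrative
# assumptions; this is not the published study's model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# One row per participant-day: wearable features (skin conductance, movement,
# temperature) and phone features (texting times, GPS mobility, weather),
# each summarized over the prior ~7 days to capture the rhythm of behavior.
n_days, n_features = 500, 12
X = rng.normal(size=(n_days, n_features))   # placeholder feature matrix
y = rng.integers(0, 2, size=n_days)         # tomorrow's label: 0 = low stress, 1 = high

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)                  # train on past participant-days
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

With real wearable and phone features in place of the random placeholders, this is the general shape of model that could yield the kind of better-than-80-percent next-day accuracy described here.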
|
|
|
37:00.040 --> 37:12.040 |
|
Isn't that exciting? Because the wearable stuff, with physiological information, it feels like it violates privacy less than the noncontact, face-based methods.
|
|
|
37:12.040 --> 37:14.040 |
|
Yeah, it's interesting. |
|
|
|
37:14.040 --> 37:18.040 |
|
I think what people sometimes don't, you know, it's fine. |
|
|
|
37:18.040 --> 37:22.040 |
|
In the early days people would say, oh, wearing something or giving blood is invasive, right?
|
|
|
37:22.040 --> 37:26.040 |
|
Whereas a camera is less invasive because it's not touching you. |
|
|
|
37:26.040 --> 37:33.040 |
|
I think on the contrary, the things that are not touching you are maybe the scariest because you don't know when they're on or off. |
|
|
|
37:33.040 --> 37:39.040 |
|
And you don't know when, and you don't know who's behind it, right? |
|
|
|
37:39.040 --> 37:52.040 |
|
A wearable, depending upon what's happening to the data on it, whether it's just stored locally or it's streaming, and what it is being attached to...
|
|
|
37:52.040 --> 37:59.040 |
|
In a sense, you have the most control over it because it's also very easy to just take it off, right? |
|
|
|
37:59.040 --> 38:01.040 |
|
Now it's not sensing me. |
|
|
|
38:01.040 --> 38:07.040 |
|
So if I'm uncomfortable with what it's sensing, now I'm free, right? |
|
|
|
38:07.040 --> 38:13.040 |
|
If I'm comfortable with what it's sensing, and I happen to know everything about this one and what it's doing with it...
|
|
|
38:13.040 --> 38:15.040 |
|
So I'm quite comfortable with it. |
|
|
|
38:15.040 --> 38:20.040 |
|
Then I'm, you know, I have control, I'm comfortable. |
|
|
|
38:20.040 --> 38:26.040 |
|
Control is one of the biggest factors for an individual in reducing their stress. |
|
|
|
38:26.040 --> 38:32.040 |
|
If I have control over it, if I know all I need to know about it, then my stress is a lot lower.
|
|
|
38:32.040 --> 38:38.040 |
|
And I'm making an informed choice about whether to wear it or not, or when to wear it or not. |
|
|
|
38:38.040 --> 38:40.040 |
|
I want to wear it sometimes, maybe not others. |
|
|
|
38:40.040 --> 38:43.040 |
|
Right. So that control, yeah, I'm with you. |
|
|
|
38:43.040 --> 38:49.040 |
|
That control, even if, yeah, the ability to turn it off is a really empowering thing. |
|
|
|
38:49.040 --> 38:50.040 |
|
It's huge. |
|
|
|
38:50.040 --> 39:00.040 |
|
And maybe, you know, if there are regulations, maybe the number one thing to protect is people's ability to opt out as easily as they can opt in.
|
|
|
39:00.040 --> 39:01.040 |
|
Right. |
|
|
|
39:01.040 --> 39:04.040 |
|
So you've studied a bit of neuroscience as well. |
|
|
|
39:04.040 --> 39:16.040 |
|
How has looking at our own minds, sort of the biological stuff, the neurobiological, the neuroscience look at the signals in our brain,
|
|
|
39:16.040 --> 39:20.040 |
|
helped you understand the problem and the approach of affective computing?
|
|
|
39:20.040 --> 39:28.040 |
|
So originally I was a computer architect and I was building hardware and computer designs and I wanted to build ones that work like the brain. |
|
|
|
39:28.040 --> 39:33.040 |
|
So I've been studying the brain as long as I've been studying how to build computers. |
|
|
|
39:33.040 --> 39:35.040 |
|
Have you figured out anything yet? |
|
|
|
39:35.040 --> 39:37.040 |
|
Very little. |
|
|
|
39:37.040 --> 39:39.040 |
|
It's so amazing. |
|
|
|
39:39.040 --> 39:46.040 |
|
You know, they used to think like, oh, if you remove this chunk of the brain and you find this function goes away, well, that's the part of the brain that did it. |
|
|
|
39:46.040 --> 39:53.040 |
|
And then later they realize if you remove this other chunk of the brain, that function comes back and oh no, we really don't understand it. |
|
|
|
39:53.040 --> 40:02.040 |
|
Brains are so interesting and changing all the time and able to change in ways that will probably continue to surprise us. |
|
|
|
40:02.040 --> 40:14.040 |
|
When we were measuring stress, you may know the story where we found an unusually big skin conductance pattern on one wrist in one of our kids with autism.
|
|
|
40:14.040 --> 40:20.040 |
|
And in trying to figure out how on earth you could be stressed on one wrist and not the other, like how can you get sweaty on one wrist, right? |
|
|
|
40:20.040 --> 40:26.040 |
|
When you get stressed with that sympathetic fight or flight response, like you kind of should sweat more in some places than others,
|
|
|
40:26.040 --> 40:28.040 |
|
but not more on one wrist than the other. |
|
|
|
40:28.040 --> 40:31.040 |
|
That didn't make any sense. |
|
|
|
40:31.040 --> 40:37.040 |
|
We learned that what had actually happened was a part of his brain had unusual electrical activity. |
|
|
|
40:37.040 --> 40:44.040 |
|
And that caused an unusually large sweat response on one wrist and not the other. |
|
|
|
40:44.040 --> 40:49.040 |
|
And since then we've learned that seizures cause this unusual electrical activity. |
|
|
|
40:49.040 --> 40:58.040 |
|
And depending where the seizure is, if it's in one place and it's staying there, you can have a big electrical response we can pick up with a wearable at one part of the body. |
|
|
|
40:58.040 --> 41:07.040 |
|
You can also have a seizure that spreads over the whole brain, a generalized grand mal seizure, and that response spreads and we can pick it up pretty much anywhere.
|
|
|
41:07.040 --> 41:13.040 |
|
As we learned this, we then later built Embrace, which is now FDA cleared for seizure detection.
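As a rough illustration of how a wrist sensor could flag a convulsive seizure, here is a sketch that looks for a large electrodermal (skin conductance) surge coinciding with sustained movement energy. The thresholds, sampling rate, and logic are invented for illustration; this is not the FDA-cleared Embrace algorithm.

```python
# Illustrative sketch only: flag a window in which a sharp skin conductance
# (EDA) surge coincides with sustained, high-energy movement. Thresholds and
# window length are made up; this is not the Embrace detection algorithm.
import numpy as np

def flag_possible_seizure(eda_uS, accel_g, fs=32.0,
                          eda_rise_uS=2.0, motion_rms_g=0.5, window_s=30.0):
    """Return True if both channels exceed their thresholds in the last window."""
    n = int(window_s * fs)
    eda_rise = eda_uS[-n:].max() - eda_uS[-n:].min()   # size of the EDA surge
    motion = np.sqrt(np.mean(accel_g[-n:] ** 2))       # RMS movement energy
    return eda_rise > eda_rise_uS and motion > motion_rms_g

# Synthetic 30-second example at 32 Hz:
t = np.arange(0, 30, 1 / 32.0)
eda = 1.0 + 3.0 * (t > 10)                 # simulated 3 microsiemens surge
accel = 0.8 * np.sin(2 * np.pi * 5 * t)    # simulated rhythmic shaking at 5 Hz
print(flag_possible_seizure(eda, accel))   # True for this synthetic episode
```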
|
|
|
41:13.040 --> 41:23.040 |
|
We have also built relationships with some of the most amazing doctors in the world who not only help people with unusual brain activity or epilepsy, |
|
|
|
41:23.040 --> 41:35.040 |
|
but some of them are also surgeons and they're going in and they're implanting electrodes, not just to momentarily read the strange patterns of brain activity that we'd like to see return to normal, |
|
|
|
41:35.040 --> 41:42.040 |
|
but also to read out continuously what's happening in some of these deep regions of the brain during most of life when these patients are not seizing. |
|
|
|
41:42.040 --> 41:45.040 |
|
Most of the time they're not seizing. Most of the time they're fine. |
|
|
|
41:45.040 --> 41:58.040 |
|
And so we are now working on mapping those deep brain regions that you can't even usually get with EEG scalp electrodes because the changes deep inside don't reach the surface. |
|
|
|
41:58.040 --> 42:04.040 |
|
But interesting when some of those regions are activated, we see a big skin conductance response. |
|
|
|
42:04.040 --> 42:08.040 |
|
Who would have thunk it, right? Like nothing here, but something here. |
|
|
|
42:08.040 --> 42:17.040 |
|
In fact, right after seizures that we think are the most dangerous ones, the ones that precede what's called SUDEP, sudden unexpected death in epilepsy.
|
|
|
42:17.040 --> 42:23.040 |
|
There's a period where the brain waves go flat and it looks like the person's brain has stopped, but it hasn't. |
|
|
|
42:23.040 --> 42:32.040 |
|
The activity has gone deep into a region that can make the cortical activity look flat, like a quick shutdown signal here. |
|
|
|
42:32.040 --> 42:38.040 |
|
It can unfortunately cause breathing to stop if it progresses long enough. |
|
|
|
42:38.040 --> 42:43.040 |
|
Before that happens, we see a big skin conductance response in the data that we have. |
|
|
|
42:43.040 --> 42:46.040 |
|
The longer this flattening, the bigger our response here. |
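A minimal sketch of how that relationship could be quantified: correlating the size of the wrist skin conductance response after each seizure with how long the EEG flattening lasts. The numbers below are synthetic placeholders, not patient data, and the choice of Spearman correlation is an assumption.

```python
# Sketch of correlating post-seizure skin conductance response size with the
# duration of the subsequent EEG flattening. Values are synthetic placeholders.
import numpy as np
from scipy.stats import spearmanr

eda_response_uS = np.array([0.5, 1.2, 2.0, 3.5, 4.1, 5.0])              # per-seizure wrist EDA amplitude
flattening_duration_s = np.array([0.0, 10.0, 25.0, 60.0, 90.0, 140.0])  # EEG suppression duration

rho, p_value = spearmanr(eda_response_uS, flattening_duration_s)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```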
|
|
|
42:46.040 --> 42:52.040 |
|
So we have been trying to learn, you know, initially like, why are we getting a big response here when there's nothing here? |
|
|
|
42:52.040 --> 42:55.040 |
|
Well, it turns out there's something much deeper. |
|
|
|
42:55.040 --> 43:05.040 |
|
So we can now go inside the brains of some of these individuals, fabulous people who usually aren't seizing and get this data and start to map it. |
|
|
|
43:05.040 --> 43:09.040 |
|
So that's active research that we're doing right now with top medical partners. |
|
|
|
43:09.040 --> 43:18.040 |
|
So this wearable sensor that's looking at skin conductance can capture sort of the ripples of the complexity of what's going on in our brain. |
|
|
|
43:18.040 --> 43:27.040 |
|
So this little device, you have a hope that you can start to get the signal from the interesting things happening in the brain. |
|
|
|
43:27.040 --> 43:35.040 |
|
Yeah, we've already published the strong correlations between the size of this response and the flattening that happens afterwards. |
|
|
|
43:35.040 --> 43:42.040 |
|
And unfortunately also in a real SUDEP case where the patient died because, well, we don't know why.
|
|
|
43:42.040 --> 43:48.040 |
|
We don't know if somebody had been there whether it would have definitely prevented it, but we know that most SUDEPs happen when the person's alone.
|
|
|
43:48.040 --> 44:01.040 |
|
And in this case, SUDEP, it's an acronym, S U D E P, and it's actually the number two cause of years of life lost among all neurological disorders.
|
|
|
44:01.040 --> 44:06.040 |
|
Stroke is number one, SUDEP is number two, but most people haven't heard of it.
|
|
|
44:06.040 --> 44:11.040 |
|
Actually, I'll plug my TED talk, it's on the front page of TED right now, that talks about this.
|
|
|
44:11.040 --> 44:14.040 |
|
And we hope to change that. |
|
|
|
44:14.040 --> 44:21.040 |
|
I hope everybody who's heard of SIDS and stroke will now hear of SUDEP, because we think in most cases it's preventable.
|
|
|
44:21.040 --> 44:27.040 |
|
If people take their meds and aren't alone when they have a seizure. It's not guaranteed to be preventable.
|
|
|
44:27.040 --> 44:31.040 |
|
There are some exceptions, but we think most cases probably are. |
|
|
|
44:31.040 --> 44:41.040 |
|
So you have this Embrace now, in the version two wristband, right, for epilepsy management. That's the one that's FDA approved.
|
|
|
44:41.040 --> 44:42.040 |
|
Yes. |
|
|
|
44:42.040 --> 44:44.040 |
|
Which is kind of cleared. |
|
|
|
44:44.040 --> 44:45.040 |
|
They say. |
|
|
|
44:45.040 --> 44:46.040 |
|
Sorry. |
|
|
|
44:46.040 --> 44:47.040 |
|
No, it's okay. |
|
|
|
44:47.040 --> 44:49.040 |
|
It essentially means it's approved for marketing. |
|
|
|
44:49.040 --> 44:50.040 |
|
Got it. |
|
|
|
44:50.040 --> 44:53.040 |
|
Just a side note, how difficult is that to do? |
|
|
|
44:53.040 --> 44:57.040 |
|
It's essentially getting FDA approval for computer science technology. |
|
|
|
44:57.040 --> 45:04.040 |
|
It's so agonizing. It's much harder than publishing multiple papers in top medical journals. |
|
|
|
45:04.040 --> 45:11.040 |
|
Yeah, we published peer-reviewed, top medical journal, Neurology, best results, and that's not good enough for the FDA.
|
|
|
45:11.040 --> 45:19.040 |
|
Is that system... So if we look at the peer review of medical journals, there are flaws, there are strengths. What about the FDA approval process?
|
|
|
45:19.040 --> 45:22.040 |
|
How does it compare to the peer review process? |
|
|
|
45:22.040 --> 45:24.040 |
|
Does it have its strengths?
|
|
|
45:24.040 --> 45:26.040 |
|
I take peer review over FDA any day. |
|
|
|
45:26.040 --> 45:28.040 |
|
Is that a good thing? Is that a good thing for FDA? |
|
|
|
45:28.040 --> 45:32.040 |
|
You're saying, does it stop some amazing technology from getting through? |
|
|
|
45:32.040 --> 45:33.040 |
|
Yeah, it does. |
|
|
|
45:33.040 --> 45:37.040 |
|
The FDA performs a very important good role in keeping people safe. |
|
|
|
45:37.040 --> 45:44.040 |
|
They put you through tons of safety testing, and that's wonderful and that's great.
|
|
|
45:44.040 --> 45:48.040 |
|
I'm all in favor of the safety testing. |
|
|
|
45:48.040 --> 45:54.040 |
|
Sometimes they put you through additional testing that they don't have to explain why they put you through it. |
|
|
|
45:54.040 --> 46:00.040 |
|
You don't understand why you're going through it and it doesn't make sense and that's very frustrating. |
|
|
|
46:00.040 --> 46:09.040 |
|
Maybe they have really good reasons and it would do people a service to articulate those reasons. |
|
|
|
46:09.040 --> 46:10.040 |
|
Be more transparent. |
|
|
|
46:10.040 --> 46:12.040 |
|
Be more transparent. |
|
|
|
46:12.040 --> 46:27.040 |
|
As part of Empatica, we have sensors. So what kind of problems can we crack? What kind of things, from seizures to autism to, I think I've heard you mention, depression?
|
|
|
46:27.040 --> 46:35.040 |
|
What kind of things can we alleviate? Can we detect? What's your hope of how we can make the world a better place with this wearable tech? |
|
|
|
46:35.040 --> 46:48.040 |
|
I would really like to see my fellow brilliant researchers step back and say, what are the really hard problems that we don't know how to solve |
|
|
|
46:48.040 --> 46:57.040 |
|
that come from people maybe we don't even see in our normal life because they're living in the poorer places, they're stuck on the bus, |
|
|
|
46:57.040 --> 47:05.040 |
|
they can't even afford the Uber or the Lyft or the data plan or all these other wonderful things we have that we keep improving on. |
|
|
|
47:05.040 --> 47:14.040 |
|
Meanwhile, there's all these folks left behind in the world and they're struggling with horrible diseases, with depression, with epilepsy, with diabetes, |
|
|
|
47:14.040 --> 47:25.040 |
|
with just awful stuff that maybe a little more time and attention hanging out with them and learning what are their challenges in life, what are their needs? |
|
|
|
47:25.040 --> 47:36.040 |
|
How do we help them have job skills? How do we help them have a hope and a future and a chance to have the great life that so many of us building technology have? |
|
|
|
47:36.040 --> 47:43.040 |
|
And then how would that reshape the kinds of AI that we build? How would that reshape the new apps that we build? |
|
|
|
47:43.040 --> 47:57.040 |
|
Or maybe we need to focus on how to make things more low cost and green instead of $1,000 phones. I mean, come on. Why can't we be thinking more about things that do more with less for these folks?
|
|
|
47:57.040 --> 48:09.040 |
|
Quality of life is not related to the cost of your phone. It's been shown that above about $75,000 of income, happiness is about the same.
|
|
|
48:09.040 --> 48:16.040 |
|
However, I can tell you, you get a lot of happiness from helping other people. You get a lot more than $75,000 buys. |
|
|
|
48:16.040 --> 48:30.040 |
|
So how do we connect up the people who have real needs with the people who have the ability to build the future and build the kind of future that truly improves the lives of all the people that are currently being left behind? |
|
|
|
48:30.040 --> 48:53.040 |
|
So let me return just briefly to a point, maybe from the movie Her. You said so much of the benefit from making our technology more empathetic to us human beings would be to make them better tools, empower us, make our lives better.
|
|
|
48:53.040 --> 49:08.040 |
|
If we look farther into the future, do you think we'll ever create an AI system that we can fall in love with, and that loves us back, on a level that is similar to human to human interaction, like in the movie Her, or beyond?
|
|
|
49:08.040 --> 49:25.040 |
|
I think we can simulate it in ways that could, you know, sustain engagement for a while. Would it be as good as another person? I don't think so, if you're used to, like, good people.
|
|
|
49:25.040 --> 49:39.040 |
|
If you've just grown up with nothing but abuse and you can't stand human beings, can we do something that helps you there, that gives you something through a machine? Yeah, but that's a pretty low bar, right? If you've only encountered pretty awful people.
|
|
|
49:39.040 --> 50:00.040 |
|
If you've encountered wonderful, amazing people, we're nowhere near building anything like that. And I would not bet on building it. I would bet instead on building the kinds of AI that help kind of raise all boats, that help all people be better people, that help all
|
|
|
50:00.040 --> 50:14.040 |
|
people figure out if they're getting sick tomorrow and give them what they need to stay well tomorrow. That's the kind of AI I want to build, the kind that improves human lives, not the kind of AI that just walks onto the Tonight Show and people go, wow, look how smart that is.
|
|
|
50:14.040 --> 50:18.040 |
|
You know, really, like, and then it goes back in a box, you know. |
|
|
|
50:18.040 --> 50:40.040 |
|
So on that point, if we continue looking a little bit into the future, do you think an AI that's empathetic and does improve our lives needs to have a physical presence, a body, and even, let me cautiously say the C word, consciousness, and even fear of mortality?
|
|
|
50:40.040 --> 50:59.040 |
|
So some of those human characteristics, do you think it needs to have those aspects? Or can it remain simply a machine learning tool that learns from data of behavior, that learns, based on previous patterns, to make us feel better?
|
|
|
50:59.040 --> 51:02.040 |
|
Or does it need those elements of consciousness? |
|
|
|
51:02.040 --> 51:13.040 |
|
It depends on your goals. If you're making a movie, it needs a body, it needs a gorgeous body, it needs to act like it has consciousness, it needs to act like it has emotion, right, because that's what sells. |
|
|
|
51:13.040 --> 51:17.040 |
|
That's what's going to get me to show up and enjoy the movie. Okay. |
|
|
|
51:17.040 --> 51:34.040 |
|
In real life, does it need all that? Well, if you've read Orson Scott Card, Ender's Game, Speaker for the Dead, you know, it could just be like a little voice in your earring, right, and you could have an intimate relationship and it could get to know you, and it doesn't need to be a robot. |
|
|
|
51:34.040 --> 51:43.040 |
|
But that doesn't make as compelling of a movie, right? I mean, we already think it's kind of weird when a guy looks like he's talking to himself on the train, you know, even though it's earbuds.
|
|
|
51:43.040 --> 51:57.040 |
|
So, embodied is more powerful. When you compare interactions with an embodied robot versus a video of a robot versus no robot,
|
|
|
51:57.040 --> 52:08.040 |
|
The robot is more engaging, the robot gets our attention more, the robot when you walk in your house is more likely to get you to remember to do the things that you asked it to do because it's kind of got a physical presence. |
|
|
|
52:08.040 --> 52:21.040 |
|
You can avoid it if you don't like it, it could see you're avoiding it. There's a lot of power to being embodied. There will be embodied AIs. They have great power and opportunity and potential. |
|
|
|
52:21.040 --> 52:32.040 |
|
There will also be AIs that aren't embodied, that are just our little software assistants that help us with different things, that may get to know things about us.
|
|
|
52:32.040 --> 52:42.040 |
|
Will they be conscious? There will be attempts to program them to make them appear to be conscious. We can already write programs that make it look like, what do you mean? |
|
|
|
52:42.040 --> 52:52.040 |
|
Of course, I'm aware that you're there, right? I mean, it's trivial to say stuff like that. It's easy to fool people. But does it actually have conscious experience like we do? |
|
|
|
52:52.040 --> 53:01.040 |
|
Nobody has a clue how to do that yet. That seems to be something that is beyond what any of us knows how to build now. |
|
|
|
53:01.040 --> 53:11.040 |
|
Will it have to have that? I think you can get pretty far with a lot of stuff without it. Will we accord it rights? |
|
|
|
53:11.040 --> 53:16.040 |
|
Well, that's more a political game than it is a question of real consciousness.
|
|
|
53:16.040 --> 53:25.040 |
|
Can you go to jail for turning off Alexa? It's a question for an election maybe a few decades from now. |
|
|
|
53:25.040 --> 53:34.040 |
|
Sophia the robot's already been given rights as a citizen in Saudi Arabia, right? Even before women have full rights.
|
|
|
53:34.040 --> 53:42.040 |
|
Then the robot was still put back in the box to be shipped to the next place where it would get a paid appearance, right? |
|
|
|
53:42.040 --> 53:50.040 |
|
Yeah, it's dark and almost comedic, if not absurd. |
|
|
|
53:50.040 --> 54:03.040 |
|
So I've heard you speak about your journey in finding faith and how you discovered some wisdoms about life and beyond from reading the Bible. |
|
|
|
54:03.040 --> 54:11.040 |
|
And I've also heard you say that scientists too often assume that nothing exists beyond what can be currently measured.
|
|
|
54:11.040 --> 54:28.040 |
|
Materialism and scientism. In some sense, this assumption enables the near term scientific method, assuming that we can uncover the mysteries of this world by the mechanisms of measurement that we currently have. |
|
|
|
54:28.040 --> 54:38.040 |
|
But we easily forget that we've made this assumption. So what do you think we missed out on by making that assumption? |
|
|
|
54:38.040 --> 54:49.040 |
|
It's fine to limit the scientific method to things we can measure and reason about and reproduce. That's fine. |
|
|
|
54:49.040 --> 54:57.040 |
|
I think we have to recognize that sometimes we scientists also believe in things that happen historically, you know, like I believe the Holocaust happened. |
|
|
|
54:57.040 --> 55:11.040 |
|
I can't prove events from past history scientifically. You prove them with historical evidence, right, with the impact they had on people with eyewitness testimony and things like that. |
|
|
|
55:11.040 --> 55:21.040 |
|
So a good thinker recognizes that science is one of many ways to get knowledge. It's not the only way. |
|
|
|
55:21.040 --> 55:31.040 |
|
And there's been some really bad philosophy and bad thinking recently called scientism where people say science is the only way to get to truth. |
|
|
|
55:31.040 --> 55:43.040 |
|
And it's not. It just isn't. There are other ways that work also, like knowledge of love with someone. You don't prove your love through science, right? |
|
|
|
55:43.040 --> 56:01.040 |
|
So history, philosophy, love, a lot of other things in life show us that there are more ways to gain knowledge and truth, if you're willing to believe there is such a thing, and I believe there is, than science.
|
|
|
56:01.040 --> 56:14.040 |
|
I am a scientist, however, and in my science, I do limit my science to the things that the scientific method can do. But I recognize that it's myopic to say that that's all there is. |
|
|
|
56:14.040 --> 56:28.040 |
|
Right. Just like you listed, there's all the why questions. And really, we know if we're being honest with ourselves, the percent of what we really know is basically zero relative to the full mystery of those. |
|
|
|
56:28.040 --> 56:34.040 |
|
Measure theory, a set of measure zero, if I have a finite amount of knowledge, which I do.
|
|
|
56:34.040 --> 56:45.040 |
|
So you said that you believe in truth. So let me ask that old question. What do you think this thing is all about? Life on earth? |
|
|
|
56:45.040 --> 56:53.040 |
|
Life, the universe and everything. I can't top Douglas Adams: 42. My favorite number. By the way, that's my street address.
|
|
|
56:53.040 --> 57:00.040 |
|
My husband and I guessed the exact same number for our house. We got to pick it. And there's a reason we picked 42. Yeah. |
|
|
|
57:00.040 --> 57:05.040 |
|
So is it just 42 or do you have other words that you can put around it? |
|
|
|
57:05.040 --> 57:15.040 |
|
Well, I think there's a grand adventure and I think this life is a part of it. I think there's a lot more to it than meets the eye and the heart and the mind and the soul here. |
|
|
|
57:15.040 --> 57:28.040 |
|
We see but through a glass dimly; in this life, we see only a part of all there is to know. If people haven't read the Bible, they should, if they consider themselves educated.
|
|
|
57:28.040 --> 57:36.040 |
|
And you could read Proverbs and find tremendous wisdom in there that cannot be scientifically proven. |
|
|
|
57:36.040 --> 57:43.040 |
|
But when you read it, there's something in you, like a musician knows when the instrument's played right and it's beautiful.
|
|
|
57:43.040 --> 57:54.040 |
|
There's something in you that comes alive and knows that there's a truth there that like your strings are being plucked by the master instead of by me playing when I pluck it. |
|
|
|
57:54.040 --> 57:57.040 |
|
But probably when you play, it sounds spectacular. |
|
|
|
57:57.040 --> 58:11.040 |
|
And when you encounter those truths, there's something in you that sings and knows that there is more than what I can prove mathematically or program a computer to do. |
|
|
|
58:11.040 --> 58:19.040 |
|
Don't get me wrong. The math is gorgeous. The computer programming can be brilliant. It's inspiring, right? We want to do more. |
|
|
|
58:19.040 --> 58:26.040 |
|
None of this squashes my desire to do science or to get knowledge through science. I'm not dissing the science at all. |
|
|
|
58:26.040 --> 58:33.040 |
|
I grow even more in awe of what the science can do because I'm more in awe of all there is we don't know. |
|
|
|
58:33.040 --> 58:41.040 |
|
And really at the heart of science, you have to have a belief that there's truth that there's something greater to be discovered. |
|
|
|
58:41.040 --> 58:48.040 |
|
And some scientists may not want to use the faith word, but it's faith that drives us to do science. |
|
|
|
58:48.040 --> 59:00.040 |
|
It's faith that there is truth, that there's something to know that we don't know, that it's worth knowing, that it's worth working hard, and that there is meaning, that there is such a thing as meaning,
|
|
|
59:00.040 --> 59:07.040 |
|
which by the way, science can't prove either. We have to kind of start with some assumptions that there's things like truth and meaning. |
|
|
|
59:07.040 --> 59:14.040 |
|
And these are really questions philosophers own, right? This is the space of philosophers and theologians at some level.
|
|
|
59:14.040 --> 59:24.040 |
|
So these are things science... You know, when people claim that science will tell you all truth, there's a name for that.
|
|
|
59:24.040 --> 59:29.040 |
|
It's its own kind of faith. It's scientism, and it's very myopic.
|
|
|
59:29.040 --> 59:37.040 |
|
Yeah, there's a much bigger world out there to be explored in ways that science may not, at least for now, allow us to explore.
|
|
|
59:37.040 --> 59:45.040 |
|
Yeah. And there's meaning and purpose and hope and joy and love and all these awesome things that make it all worthwhile too. |
|
|
|
59:45.040 --> 59:49.040 |
|
I don't think there's a better way to end it, Ros. Thank you so much for talking today.
|
|
|
59:49.040 --> 1:00:11.040 |
|
Thanks, Lex. What a pleasure. Great questions. |
|
|
|
|