|
WEBVTT |
|
|
|
00:00.000 --> 00:04.640 |
|
The following is a conversation with Pamela McCorduck. She's an author who has written
|
|
|
00:04.640 --> 00:08.400 |
|
on the history and the philosophical significance of artificial intelligence. |
|
|
|
00:09.040 --> 00:17.440 |
|
Her books include Machines Who Think in 1979, The Fifth Generation in 1983 with Ed Feigenbaum,
|
|
|
00:17.440 --> 00:22.960 |
|
who's considered to be the father of expert systems, The Edge of Chaos, The Futures of Women,
|
|
|
00:22.960 --> 00:28.960 |
|
and many more books. I came across her work in an unusual way by stumbling on a quote from
|
|
|
00:28.960 --> 00:35.280 |
|
Machines Who Think that is something like, artificial intelligence began with the ancient |
|
|
|
00:35.280 --> 00:41.920 |
|
wish to forge the gods. That was a beautiful way to draw a connecting line between our societal |
|
|
|
00:41.920 --> 00:48.720 |
|
relationship with AI from the grounded day to day science, math, and engineering to popular stories |
|
|
|
00:48.720 --> 00:55.520 |
|
and science fiction and myths of automatons that go back for centuries. Through her literary work, |
|
|
|
00:55.520 --> 01:00.400 |
|
she has spent a lot of time with the seminal figures of artificial intelligence, |
|
|
|
01:00.400 --> 01:07.760 |
|
including the founding fathers of AI from the 1956 Dartmouth summer workshop where the field |
|
|
|
01:07.760 --> 01:13.600 |
|
was launched. I reached out to Pamela for a conversation in hopes of getting a sense of |
|
|
|
01:13.600 --> 01:18.400 |
|
what those early days were like and how their dreams continued to reverberate |
|
|
|
01:18.400 --> 01:23.840 |
|
through the work of our community today. I often don't know where the conversation may take us, |
|
|
|
01:23.840 --> 01:29.600 |
|
but I jump in and see. Having no constraints, rules, or goals is a wonderful way to discover new |
|
|
|
01:29.600 --> 01:36.320 |
|
ideas. This is the Artificial Intelligence Podcast. If you enjoy it, subscribe on YouTube, |
|
|
|
01:36.320 --> 01:41.600 |
|
give it five stars on iTunes, support it on Patreon, or simply connect with me on Twitter |
|
|
|
01:41.600 --> 01:49.680 |
|
at Lex Fridman, spelled F R I D M A N. And now, here's my conversation with Pamela McCorduck.
|
|
|
01:49.680 --> 01:58.640 |
|
In 1979, your book, Machines Who Think, was published. In it, you interview some of the early |
|
|
|
01:58.640 --> 02:06.320 |
|
AI pioneers and explore the idea that AI was born not out of maybe math and computer science, |
|
|
|
02:06.320 --> 02:14.960 |
|
but out of myth and legend. So tell me if you could the story of how you first arrived at the |
|
|
|
02:14.960 --> 02:22.400 |
|
book, the journey of beginning to write it. I had been a novelist. I'd published two novels. |
|
|
|
02:23.120 --> 02:31.760 |
|
And I was sitting under the portal at Stanford one day in the house we were renting for the |
|
|
|
02:31.760 --> 02:37.760 |
|
summer. And I thought, I should write a novel about these weird people in AI, I know. And then I |
|
|
|
02:37.760 --> 02:44.240 |
|
thought, ah, don't write a novel, write a history. Simple. Just go around, you know, interview them, |
|
|
|
02:44.240 --> 02:50.400 |
|
splice it together. Voila, instant book. Ha, ha, ha. It was much harder than that. |
|
|
|
02:50.400 --> 02:58.320 |
|
But nobody else was doing it. And so I thought, well, this is a great opportunity. And there were |
|
|
|
02:59.760 --> 03:06.240 |
|
people who, John McCarthy, for example, thought it was a nutty idea. There were much, you know, |
|
|
|
03:06.240 --> 03:11.920 |
|
the field had not evolved yet, so on. And he had some mathematical thing he thought I should write |
|
|
|
03:11.920 --> 03:18.480 |
|
instead. And I said, no, John, I am not a woman in search of a project. I'm, this is what I want |
|
|
|
03:18.480 --> 03:24.000 |
|
to do. I hope you'll cooperate. And he said, oh, mother, mother, well, okay, it's your, your time. |
|
|
|
03:24.960 --> 03:31.280 |
|
What was the pitch for the, I mean, such a young field at that point. How do you write |
|
|
|
03:31.280 --> 03:37.520 |
|
a personal history of a field that's so young? I said, this is wonderful. The founders of the |
|
|
|
03:37.520 --> 03:43.120 |
|
field are alive and kicking and able to talk about what they're doing. Did they sound or feel like |
|
|
|
03:43.120 --> 03:48.560 |
|
founders at the time? Did they know that they've been found, that they've founded something? Oh, |
|
|
|
03:48.560 --> 03:55.120 |
|
yeah, they knew what they were doing was very important, very. What they, what I now see in |
|
|
|
03:55.120 --> 04:04.080 |
|
retrospect is that they were at the height of their research careers. And it's humbling to me |
|
|
|
04:04.080 --> 04:09.280 |
|
that they took time out from all the things that they had to do as a consequence of being there. |
|
|
|
04:10.320 --> 04:14.560 |
|
And to talk to this woman who said, I think I'm going to write a book about you. |
|
|
|
04:14.560 --> 04:23.840 |
|
No, it was amazing, just amazing. So who, who stands out to you? Maybe looking back 63 years to
|
|
|
04:23.840 --> 04:31.040 |
|
the Dartmouth conference. So Marvin Minsky was there. McCarthy was there. Claude Shannon, |
|
|
|
04:31.040 --> 04:36.960 |
|
Allen Newell, Herb Simon, some of the folks you've mentioned. Right. Then there's other characters,
|
|
|
04:36.960 --> 04:44.720 |
|
right? One of your coauthors. He wasn't at Dartmouth. He wasn't at Dartmouth, but I mean. |
|
|
|
04:44.720 --> 04:50.880 |
|
He was a, I think an undergraduate then. And, and of course, Joe Traub. I mean, |
|
|
|
04:50.880 --> 04:58.720 |
|
all of these are players, not at Dartmouth then, but in that era. Right, at CMU and so on.
|
|
|
04:58.720 --> 05:03.680 |
|
So who are the characters, if you could paint a picture that stand out to you from memory, |
|
|
|
05:03.680 --> 05:07.200 |
|
those people you've interviewed and maybe not people that were just in the, |
|
|
|
05:08.400 --> 05:13.760 |
|
in the, the atmosphere, in the atmosphere. Of course, the four founding fathers were |
|
|
|
05:13.760 --> 05:17.040 |
|
extraordinary guys. They really were. Who are the founding fathers? |
|
|
|
05:18.560 --> 05:22.480 |
|
Allen Newell, Herbert Simon, Marvin Minsky, John McCarthy,
|
|
|
05:22.480 --> 05:26.240 |
|
they were the four who were not only at the Dartmouth conference, |
|
|
|
05:26.240 --> 05:31.200 |
|
but Newell and Simon arrived there with a working program called the logic theorist. |
|
|
|
05:31.200 --> 05:38.400 |
|
Everybody else had great ideas about how they might do it, but they weren't going to do it yet. |
|
|
|
05:41.040 --> 05:48.720 |
|
And you mentioned Joe Traub, my husband. I was immersed in AI before I met Joe, |
|
|
|
05:50.080 --> 05:54.960 |
|
because I had been Ed Feigenbaum's assistant at Stanford. And before that, |
|
|
|
05:54.960 --> 06:01.520 |
|
I had worked on a book edited by Feigenbaum and Julian Feldman called
|
|
|
06:02.800 --> 06:09.200 |
|
Computers and Thought. It was the first textbook of readings of AI. And they, they only did it |
|
|
|
06:09.200 --> 06:13.120 |
|
because they were trying to teach AI to people at Berkeley. And there was nothing, you know, |
|
|
|
06:13.120 --> 06:17.600 |
|
you'd have to send them to this journal and that journal. This was not the internet where you could |
|
|
|
06:17.600 --> 06:26.080 |
|
go look at an article. So I was fascinated from the get go by AI. I was an English major, you know, |
|
|
|
06:26.080 --> 06:33.200 |
|
what did I know? And yet I was fascinated. And that's why you saw that historical, |
|
|
|
06:33.200 --> 06:40.320 |
|
that literary background, which I think is very much a part of the continuum of AI that |
|
|
|
06:40.320 --> 06:48.000 |
|
the AI grew out of that same impulse. Was that, yeah, that traditional? What, what was, what drew |
|
|
|
06:48.000 --> 06:54.800 |
|
you to AI? How did you even think of it back, back then? What, what was the possibilities, |
|
|
|
06:54.800 --> 07:03.200 |
|
the dreams? What was interesting to you? The idea of intelligence outside the human cranium, |
|
|
|
07:03.200 --> 07:08.000 |
|
this was a phenomenal idea. And even when I finished machines who think, |
|
|
|
07:08.000 --> 07:15.040 |
|
I didn't know if they were going to succeed. In fact, the final chapter is very wishy washy, |
|
|
|
07:15.040 --> 07:25.200 |
|
frankly. But the field did succeed. Yeah. Yeah. So was there the idea that AI began with
|
|
|
07:25.200 --> 07:32.000 |
|
the wish to forge the gods? So the spiritual component that we crave to create this other
|
|
|
07:32.000 --> 07:40.880 |
|
thing greater than ourselves? For those guys, I don't think so. Newell and Simon were cognitive |
|
|
|
07:40.880 --> 07:49.840 |
|
psychologists. What they wanted was to simulate aspects of human intelligence. And they found |
|
|
|
07:49.840 --> 07:57.600 |
|
they could do it on the computer. Minsky just thought it was a really cool thing to do. |
|
|
|
07:57.600 --> 08:07.120 |
|
Likewise, McCarthy. McCarthy had got the idea in 1949 when, when he was a Caltech student. And |
|
|
|
08:08.560 --> 08:15.520 |
|
he listened to somebody's lecture. It's in my book, I forget who it was. And he thought, |
|
|
|
08:15.520 --> 08:20.480 |
|
oh, that would be fun to do. How do we do that? And he took a very mathematical approach. |
|
|
|
08:20.480 --> 08:28.800 |
|
Minsky was hybrid. And Newell and Simon were very much cognitive psychology. How can we |
|
|
|
08:28.800 --> 08:37.280 |
|
simulate various things about human cognition? What happened over the many years is, of course, |
|
|
|
08:37.280 --> 08:42.480 |
|
our definition of intelligence expanded tremendously. I mean, these days, |
|
|
|
08:43.920 --> 08:48.960 |
|
biologists are comfortable talking about the intelligence of a cell, the intelligence of the
|
|
|
08:48.960 --> 08:57.840 |
|
brain, not just the human brain, but the intelligence of any kind of brain, cephalopods. I mean,
|
|
|
08:59.520 --> 09:05.840 |
|
an octopus is really intelligent by any measure. We wouldn't have thought of that in the 60s,
|
|
|
09:05.840 --> 09:12.560 |
|
even the 70s. So all these things have worked in. And I did hear one behavioral |
|
|
|
09:12.560 --> 09:20.640 |
|
primatologist, Frans de Waal, say AI taught us the questions to ask.
|
|
|
09:22.800 --> 09:27.760 |
|
Yeah, this is what happens, right? It's when you try to build it, is when you start to actually |
|
|
|
09:27.760 --> 09:35.360 |
|
ask questions; it puts a mirror to ourselves. So you were there in the middle of it. It seems
|
|
|
09:35.360 --> 09:41.920 |
|
like not many people were asking the questions that you were trying to look at this field, |
|
|
|
09:41.920 --> 09:48.480 |
|
the way you were. I was solo. When I went to get funding for this, because I needed somebody to |
|
|
|
09:48.480 --> 09:59.840 |
|
transcribe the interviews and I needed travel expenses, I went to everything you could think of,
|
|
|
09:59.840 --> 10:11.280 |
|
the NSF, the DARPA. There was an Air Force place that doled out money. And each of them said, |
|
|
|
10:11.840 --> 10:18.640 |
|
well, that was very interesting. That's a very interesting idea. But we'll think about it. |
|
|
|
10:19.200 --> 10:24.320 |
|
And the National Science Foundation actually said to me in plain English, |
|
|
|
10:24.320 --> 10:30.960 |
|
hey, you're only a writer. You're not an historian of science. And I said, yeah, that's true. But |
|
|
|
10:30.960 --> 10:35.360 |
|
the historians of science will be crawling all over this field. I'm writing for the general |
|
|
|
10:35.360 --> 10:44.000 |
|
audience, or so I thought. And they still wouldn't budge. I finally got a private grant without
|
|
|
10:44.000 --> 10:51.440 |
|
knowing who it was from, from Ed Fredkin at MIT. He was a wealthy man, and he liked what he called
|
|
|
10:51.440 --> 10:56.880 |
|
crackpot ideas. And he considered this a crackpot idea, and he was willing to
|
|
|
10:56.880 --> 11:04.240 |
|
support it. I am ever grateful. Let me say that. You know, some would say that a history of science |
|
|
|
11:04.240 --> 11:09.360 |
|
approach to AI, or even just a history or anything like the book that you've written, |
|
|
|
11:09.360 --> 11:16.640 |
|
hasn't been written since. Maybe I'm not familiar, but there certainly aren't many.
|
|
|
11:16.640 --> 11:24.000 |
|
If we think about bigger than just these couple of decades, a few decades, what are the roots |
|
|
|
11:25.120 --> 11:32.160 |
|
of AI? Oh, they go back so far. Yes, of course, there's all the legendary stuff, the |
|
|
|
11:32.800 --> 11:42.160 |
|
Golem and the early robots of the 20th century. But they go back much further than that. If |
|
|
|
11:42.160 --> 11:50.320 |
|
you read Homer, Homer has robots in the Iliad. And a classical scholar was pointing out to me |
|
|
|
11:50.320 --> 11:55.440 |
|
just a few months ago. Well, you said you just read the Odyssey. The Odyssey is full of robots. |
|
|
|
11:55.440 --> 12:01.520 |
|
It is, I said. Yeah, how do you think Odysseus's ship gets from one place to another? He
|
|
|
12:01.520 --> 12:09.040 |
|
doesn't have the crew people to do that, the crew men. Yeah, it's magic. It's robots. Oh, I thought. |
|
|
|
12:09.040 --> 12:17.680 |
|
How interesting. So we've had this notion of AI for a long time. And then toward the end of the |
|
|
|
12:17.680 --> 12:24.000 |
|
19th century, the beginning of the 20th century, there were scientists who actually tried to |
|
|
|
12:24.000 --> 12:29.280 |
|
make this happen some way or another, not successfully, they didn't have the technology |
|
|
|
12:29.280 --> 12:40.160 |
|
for it. And of course, Babbage, in the 1850s and 60s, he saw that what he was building was capable |
|
|
|
12:40.160 --> 12:47.040 |
|
of intelligent behavior. And he, when he ran out of funding, the British government finally said, |
|
|
|
12:47.040 --> 12:53.360 |
|
that's enough. He and Lady Lovelace decided, oh, well, why don't we make, you know, why don't we |
|
|
|
12:53.360 --> 13:00.560 |
|
play the ponies with this? He had other ideas for raising money too. But if we actually reach back |
|
|
|
13:00.560 --> 13:07.280 |
|
once again, I think people don't actually really know that robots do appear or ideas of robots. |
|
|
|
13:07.280 --> 13:14.240 |
|
You talk about the Hellenic and the Hebraic points of view. Oh, yes. Can you tell me about each? |
|
|
|
13:15.040 --> 13:22.480 |
|
I defined it this way, the Hellenic point of view is robots are great. You know, they're party help, |
|
|
|
13:22.480 --> 13:30.320 |
|
they help this guy, Hephaestus, this God Hephaestus in his forge. I presume he made them to help him, |
|
|
|
13:31.200 --> 13:38.560 |
|
and so on and so forth. And they welcome the whole idea of robots. The Hebraic view has to do with, |
|
|
|
13:39.360 --> 13:46.800 |
|
I think it's the second commandment, thou shalt not make any graven image. In other words, you |
|
|
|
13:46.800 --> 13:54.480 |
|
better not start imitating humans, because that's just forbidden. It's the second commandment. |
|
|
|
13:55.520 --> 14:05.840 |
|
And a lot of the reaction to artificial intelligence has been a sense that this is |
|
|
|
14:05.840 --> 14:16.800 |
|
this is somehow wicked. This is somehow blasphemous. We shouldn't be going there. Now, you can say, |
|
|
|
14:16.800 --> 14:21.600 |
|
yeah, but there're going to be some downsides. And I say, yes, there are. But blasphemy is not one of |
|
|
|
14:21.600 --> 14:29.520 |
|
them. You know, there's a kind of fear that feels to be almost primal. Is there religious roots to |
|
|
|
14:29.520 --> 14:36.160 |
|
that? Because so much of our society has religious roots. And so there is a feeling of, like you |
|
|
|
14:36.160 --> 14:44.080 |
|
said, blasphemy of creating the other, of creating something, you know, it doesn't have to be artificial |
|
|
|
14:44.080 --> 14:50.480 |
|
intelligence. It's creating life in general. It's the Frankenstein idea. There's the annotated |
|
|
|
14:50.480 --> 14:58.080 |
|
Frankenstein on my coffee table. It's a tremendous novel. It really is just beautifully perceptive. |
|
|
|
14:58.080 --> 15:06.800 |
|
Yes, we do fear this and we have good reason to fear it, because it can get out of hand.
|
|
|
15:06.800 --> 15:11.360 |
|
Maybe you can speak to that fear, the psychology, if you thought about it, you know, |
|
|
|
15:11.360 --> 15:16.160 |
|
there's a practical set of fears, concerns in the short term, you can think of, if we actually |
|
|
|
15:16.160 --> 15:22.720 |
|
think about artificial intelligence systems, you can think about bias and discrimination in
|
|
|
15:22.720 --> 15:32.160 |
|
algorithms. Or you can think about social networks that have algorithms that recommend the
|
|
|
15:32.160 --> 15:37.680 |
|
content you see, thereby these algorithms control the behavior of the masses. There's these concerns. |
|
|
|
15:38.240 --> 15:45.040 |
|
But to me, it feels like the fear that people have is deeper than that. So have you thought about |
|
|
|
15:45.040 --> 15:53.440 |
|
the psychology of it? I think in a superficial way I have. There is this notion that if we |
|
|
|
15:55.680 --> 16:01.200 |
|
produce a machine that can think, it will outthink us and therefore replace us. |
|
|
|
16:02.000 --> 16:11.840 |
|
I guess that's a primal fear of, almost, a kind of mortality. So around that time you said
|
|
|
16:11.840 --> 16:21.920 |
|
you worked at Stanford with Ed Feigenbaum. So let's look at that one person throughout his
|
|
|
16:21.920 --> 16:31.600 |
|
history, clearly a key person, one of the many in the history of AI. How has he changed in general |
|
|
|
16:31.600 --> 16:36.480 |
|
around him? How has Stanford changed in the last, how many years are we talking about here?
|
|
|
16:36.480 --> 16:44.720 |
|
Oh, since 65. So maybe it doesn't have to be about him. It could be bigger, but because he was a |
|
|
|
16:44.720 --> 16:51.360 |
|
key person in expert systems, for example, how are these folks who you've interviewed |
|
|
|
16:53.040 --> 16:58.400 |
|
in the 70s, 79, changed through the decades? |
|
|
|
16:58.400 --> 17:10.720 |
|
In Ed's case, I know him well. We are dear friends. We see each other every month or so. |
|
|
|
17:11.520 --> 17:16.160 |
|
He told me that when Machines Who Think first came out, he really thought all the front
|
|
|
17:16.160 --> 17:26.000 |
|
matter was kind of baloney. And 10 years later, he said, no, I see what you're getting at. Yes, |
|
|
|
17:26.000 --> 17:31.520 |
|
this is an impulse that has been, this has been a human impulse for thousands of years |
|
|
|
17:32.160 --> 17:36.640 |
|
to create something outside the human cranium that has intelligence. |
|
|
|
17:41.120 --> 17:47.520 |
|
I think it's very hard when you're down at the algorithmic level, and you're just trying to |
|
|
|
17:47.520 --> 17:53.840 |
|
make something work, which is hard enough to step back and think of the big picture. |
|
|
|
17:53.840 --> 18:02.000 |
|
It reminds me of when I was in Santa Fe, I knew a lot of archaeologists, which was a hobby of mine, |
|
|
|
18:02.800 --> 18:08.000 |
|
and I would say, yeah, yeah, well, you can look at the shards and say, oh, |
|
|
|
18:08.000 --> 18:14.160 |
|
this came from this tribe and this came from this trade route and so on. But what about the big |
|
|
|
18:14.160 --> 18:21.600 |
|
picture? And a very distinguished archaeologist said to me, they don't think that way. You do |
|
|
|
18:21.600 --> 18:27.920 |
|
know they're trying to match the shard to where it came from. That's, you know, where did
|
|
|
18:27.920 --> 18:34.480 |
|
this corn, the remainder of this corn come from? Was it grown here? Was it grown elsewhere? And I |
|
|
|
18:34.480 --> 18:44.560 |
|
think this is part of AI, of any scientific field. You're so busy doing the hard work. And it is
|
|
|
18:44.560 --> 18:49.920 |
|
hard work that you don't step back and say, oh, well, now let's talk about the, you know, |
|
|
|
18:49.920 --> 18:56.720 |
|
the general meaning of all this. Yes. So none of the, even Minsky and McCarthy, |
|
|
|
18:58.080 --> 19:04.880 |
|
they, oh, those guys did. Yeah. The founding fathers did early on or pretty early on. Well, |
|
|
|
19:04.880 --> 19:11.200 |
|
they had, but in a different way from how I looked at it, the two cognitive psychologists, |
|
|
|
19:11.200 --> 19:20.960 |
|
Newell and Simon, they wanted to imagine reforming cognitive psychology so that we would really, |
|
|
|
19:20.960 --> 19:31.520 |
|
really understand the brain. Yeah. Minsky was more speculative. And John McCarthy saw it as, |
|
|
|
19:32.960 --> 19:40.080 |
|
I think I'm doing, doing him right by this. He really saw it as a great boon for human beings to |
|
|
|
19:40.080 --> 19:49.440 |
|
have this technology. And that was reason enough to do it. And he had wonderful, wonderful fables |
|
|
|
19:50.240 --> 19:57.920 |
|
about how if you do the mathematics, you will see that these things are really good for human beings. |
|
|
|
19:57.920 --> 20:05.280 |
|
And if you had a technological objection, he had an answer, a technological answer. But here's how |
|
|
|
20:05.280 --> 20:10.480 |
|
we could get over that. And then blah, blah, blah, blah. And one of his favorite things was |
|
|
|
20:10.480 --> 20:15.680 |
|
what he called the literary problem, which of course, he presented to me several times. |
|
|
|
20:16.400 --> 20:23.680 |
|
That is, everything in literature, there are conventions in literature. One of the conventions |
|
|
|
20:23.680 --> 20:37.040 |
|
is that you have a villain and a hero. And the hero in most literature is human. And the villain |
|
|
|
20:37.040 --> 20:42.000 |
|
in most literature is a machine. And he said, no, that's just not the way it's going to be. |
|
|
|
20:42.560 --> 20:48.000 |
|
But that's the way we're used to it. So when we tell stories about AI, it's always with this |
|
|
|
20:48.000 --> 20:57.760 |
|
paradigm. I thought, yeah, he's right. Looking back at the classics, R.U.R. is certainly the machines
|
|
|
20:57.760 --> 21:07.040 |
|
trying to overthrow the humans. Frankenstein is different. Frankenstein is a creature. |
|
|
|
21:08.480 --> 21:14.560 |
|
He never has a name. Frankenstein, of course, is the guy who created him, the human Dr. Frankenstein. |
|
|
|
21:14.560 --> 21:23.120 |
|
And this creature wants to be loved, wants to be accepted. And it is only when Frankenstein |
|
|
|
21:24.720 --> 21:32.720 |
|
turns his head, in fact, runs the other way. And the creature is without love |
|
|
|
21:34.400 --> 21:38.560 |
|
that he becomes the monster that he later becomes. |
|
|
|
21:39.680 --> 21:43.840 |
|
So who's the villain in Frankenstein? It's unclear, right? |
|
|
|
21:43.840 --> 21:45.520 |
|
Oh, it is unclear. Yeah. |
|
|
|
21:45.520 --> 21:54.320 |
|
It's really the people who drive him, by driving him away, they bring out the worst. |
|
|
|
21:54.320 --> 22:00.800 |
|
That's right. They give him no human solace. And he is driven away, you're right. |
|
|
|
22:03.040 --> 22:10.160 |
|
He becomes, at one point, the friend of a blind man. And he serves this blind man, |
|
|
|
22:10.160 --> 22:16.640 |
|
and they become very friendly. But when the sighted people of the blind man's family come in, |
|
|
|
22:18.640 --> 22:26.000 |
|
you got a monster here. So it's very didactic in its way. And what I didn't know is that Mary Shelley |
|
|
|
22:26.000 --> 22:33.440 |
|
and Percy Shelley were great readers of the literature surrounding abolition in the United |
|
|
|
22:33.440 --> 22:41.200 |
|
States, the abolition of slavery. And they picked that up wholesale. You are making monsters of |
|
|
|
22:41.200 --> 22:45.680 |
|
these people because you won't give them the respect and love that they deserve. |
|
|
|
22:46.800 --> 22:54.880 |
|
Do you have, if we get philosophical for a second, do you worry that once we create |
|
|
|
22:54.880 --> 22:59.840 |
|
machines that are a little bit more intelligent? Let's look at Roomba, the vacuum cleaner, |
|
|
|
22:59.840 --> 23:05.360 |
|
that this darker part of human nature where we abuse |
|
|
|
23:07.760 --> 23:12.400 |
|
the other, somebody who's different, will come out? |
|
|
|
23:13.520 --> 23:22.640 |
|
I don't worry about it. I could imagine it happening. But I think that what AI has to offer |
|
|
|
23:22.640 --> 23:32.480 |
|
the human race will be so attractive that people will be won over. So you have looked deep into |
|
|
|
23:32.480 --> 23:40.080 |
|
these people, had deep conversations, and it's interesting to get a sense of stories of the |
|
|
|
23:40.080 --> 23:44.480 |
|
way they were thinking and the way it was changed, the way your own thinking about AI has changed. |
|
|
|
23:44.480 --> 23:53.360 |
|
As you mentioned, McCarthy, what about the years at CMU, Carnegie Mellon, with Joe? |
|
|
|
23:53.360 --> 24:02.800 |
|
Sure. Joe was not in AI. He was in algorithmic complexity. |
|
|
|
24:03.440 --> 24:09.040 |
|
Was there always a line between AI and computer science, for example? Is AI its own place of |
|
|
|
24:09.040 --> 24:15.920 |
|
outcasts? Was that the feeling? There was a kind of outcast period for AI. |
|
|
|
24:15.920 --> 24:28.720 |
|
For instance, in 1974, the new field was hardly 10 years old. The new field of computer science |
|
|
|
24:28.720 --> 24:33.200 |
|
was asked by the National Science Foundation, I believe, but it may have been the National |
|
|
|
24:33.200 --> 24:42.720 |
|
Academies, I can't remember, to tell our fellow scientists where computer science is and what |
|
|
|
24:42.720 --> 24:52.880 |
|
it means. And they wanted to leave out AI. And they only agreed to put it in because Don Knuth |
|
|
|
24:52.880 --> 24:59.760 |
|
said, hey, this is important. You can't just leave that out. Really? Don? Don Knuth, yes. |
|
|
|
24:59.760 --> 25:06.480 |
|
I talked to Mr. Knuth. Out of all the people. Yes. But you see, an AI person couldn't have made
|
|
|
25:06.480 --> 25:10.880 |
|
that argument. He wouldn't have been believed, but Knuth was believed. Yes. |
|
|
|
25:10.880 --> 25:18.160 |
|
So Joe Traub worked on the real stuff. Joe was working on algorithmic complexity,
|
|
|
25:18.160 --> 25:24.800 |
|
but he would say in plain English again and again, the smartest people I know are in AI. |
|
|
|
25:24.800 --> 25:32.320 |
|
Really? Oh, yes. No question. Anyway, Joe loved these guys. What happened was that |
|
|
|
25:34.080 --> 25:40.160 |
|
I guess it was as I started to write Machines Who Think, Herb Simon and I became very close
|
|
|
25:40.160 --> 25:46.000 |
|
friends. He would walk past our house on Northumberland Street every day after work. |
|
|
|
25:46.560 --> 25:52.160 |
|
And I would just be putting my cover on my typewriter and I would lean out the door and say, |
|
|
|
25:52.160 --> 25:58.800 |
|
Herb, would you like a sherry? And Herb almost always would like a sherry. So he'd stop in |
|
|
|
25:59.440 --> 26:06.000 |
|
and we'd talk for an hour, two hours. My journal says we talked this afternoon for three hours. |
|
|
|
26:06.720 --> 26:11.520 |
|
What was on his mind at the time in terms of on the AI side of things? |
|
|
|
26:12.160 --> 26:15.120 |
|
We didn't talk too much about AI. We talked about other things. Just life. |
|
|
|
26:15.120 --> 26:24.480 |
|
We both love literature and Herb had read Proust in the original French twice all the way through. |
|
|
|
26:25.280 --> 26:31.280 |
|
I can't. I read it in English in translation. So we talked about literature. We talked about |
|
|
|
26:31.280 --> 26:37.120 |
|
languages. We talked about music because he loved music. We talked about art because he was |
|
|
|
26:37.120 --> 26:45.840 |
|
actually enough of a painter that he had to give it up because he was afraid it was interfering
|
|
|
26:45.840 --> 26:53.120 |
|
with his research and so on. So no, it was really just chitchat, but it was very warm.
|
|
|
26:54.000 --> 26:59.840 |
|
So one summer I said to Herb, you know, my students have all the really interesting |
|
|
|
26:59.840 --> 27:04.480 |
|
conversations. I was teaching at the University of Pittsburgh then in the English department. |
|
|
|
27:04.480 --> 27:08.880 |
|
And, you know, they get to talk about the meaning of life and that kind of thing. |
|
|
|
27:08.880 --> 27:15.200 |
|
And what do I have? I have university meetings where we talk about the photocopying budget and, |
|
|
|
27:15.200 --> 27:20.160 |
|
you know, whether the course on romantic poetry should be one semester or two. |
|
|
|
27:21.200 --> 27:25.760 |
|
So Herb laughed. He said, yes, I know what you mean. He said, but, you know, you could do something |
|
|
|
27:25.760 --> 27:33.920 |
|
about that. Dot, that was his wife, Dot and I used to have a salon at the University of Chicago every |
|
|
|
27:33.920 --> 27:42.400 |
|
Sunday night. And we would have essentially an open house. And people knew it wasn't for small
|
|
|
27:42.400 --> 27:51.440 |
|
talk. It was really for some topic of depth. He said, but my advice would be that you choose |
|
|
|
27:51.440 --> 27:59.200 |
|
the topic ahead of time. Fine, I said. So the following, we exchanged mail over the summer. |
|
|
|
27:59.200 --> 28:09.120 |
|
That was US post in those days because you didn't have personal email. And I decided I would organize |
|
|
|
28:09.120 --> 28:16.880 |
|
it. And there would be eight of us: Allen Newell and his wife, Herb Simon and his wife, Dorothea.
|
|
|
28:16.880 --> 28:27.040 |
|
There was a novelist in town, a man named Mark Harris. He had just arrived and his wife, Josephine. |
|
|
|
28:27.840 --> 28:33.040 |
|
Mark was most famous then for a novel called Bang the Drum Slowly, which was about baseball. |
|
|
|
28:34.160 --> 28:43.360 |
|
And Joe and me, so eight people. And we met monthly and we just sank our teeth into really |
|
|
|
28:43.360 --> 28:52.160 |
|
hard topics. And it was great fun. How have your own views around artificial intelligence changed |
|
|
|
28:53.200 --> 28:57.520 |
|
through the process of writing Machines Who Think and afterwards, the ripple effects?
|
|
|
28:58.240 --> 29:04.400 |
|
I was a little skeptical that this whole thing would work out. It didn't matter. To me, it was |
|
|
|
29:04.400 --> 29:16.160 |
|
so audacious. This whole thing being AI generally. And in some ways, it hasn't worked out the way I |
|
|
|
29:16.160 --> 29:25.760 |
|
expected so far. That is to say, there is this wonderful lot of apps, thanks to deep learning |
|
|
|
29:25.760 --> 29:35.680 |
|
and so on. But those are algorithmic. And in the part of symbolic processing, |
|
|
|
29:36.640 --> 29:45.600 |
|
there is very little yet. And that's a field that lies waiting for industrious graduate students. |
|
|
|
29:46.800 --> 29:53.040 |
|
Maybe you can tell me some figures that popped up in your life in the 80s with expert systems, |
|
|
|
29:53.040 --> 30:01.840 |
|
where there was the symbolic AI possibilities of what most people think of as AI. If you dream |
|
|
|
30:01.840 --> 30:08.000 |
|
of the possibilities of AI, it's really expert systems. And those hit a few walls and there |
|
|
|
30:08.000 --> 30:12.960 |
|
were challenges there. And I think, yes, they will reemerge again with some new breakthroughs and so |
|
|
|
30:12.960 --> 30:18.640 |
|
on. But what did that feel like, both the possibility and the winter that followed, the |
|
|
|
30:18.640 --> 30:25.520 |
|
slowdown in research? This whole thing about AI winter is, to me, a crock. |
|
|
|
30:26.160 --> 30:33.200 |
|
There were no winters. Because I look at the basic research that was being done in the 80s, which is
|
|
|
30:33.200 --> 30:39.520 |
|
supposed to be, my God, it was really important. It was laying down things that nobody had thought |
|
|
|
30:39.520 --> 30:44.880 |
|
about before. But it was basic research. You couldn't monetize it. Hence the winter. |
|
|
|
30:44.880 --> 30:53.680 |
|
Science research goes in fits and starts. It isn't this nice, smooth,
|
|
|
30:54.240 --> 30:59.200 |
|
oh, this follows this, follows this. No, it just doesn't work that way. |
|
|
|
30:59.200 --> 31:03.600 |
|
Well, the interesting thing, the way winters happen, it's never the fault of the researchers. |
|
|
|
31:04.480 --> 31:11.920 |
|
It's some source of hype, overpromising. Well, no, let me take that back. Sometimes it
|
|
|
31:11.920 --> 31:17.200 |
|
is the fault of the researchers. Sometimes certain researchers might overpromise the |
|
|
|
31:17.200 --> 31:23.760 |
|
possibilities. They themselves believe that we're just a few years away, sort of just recently talked |
|
|
|
31:23.760 --> 31:30.240 |
|
to Elon Musk and he believes he'll have an autonomous vehicle in a year and he believes it. |
|
|
|
31:30.240 --> 31:33.520 |
|
A year? A year, yeah, we'd have mass deployment at that time.
|
|
|
31:33.520 --> 31:38.640 |
|
For the record, this is 2019 right now. So he's talking 2020. |
|
|
|
31:38.640 --> 31:44.800 |
|
To do the impossible, you really have to believe it. And I think what's going to happen when you |
|
|
|
31:44.800 --> 31:49.520 |
|
believe it, because there's a lot of really brilliant people around him, is some good stuff |
|
|
|
31:49.520 --> 31:54.640 |
|
will come out of it. Some unexpected brilliant breakthroughs will come out of it. When you |
|
|
|
31:54.640 --> 31:59.520 |
|
really believe it, when you work that hard. I believe that and I believe autonomous vehicles |
|
|
|
31:59.520 --> 32:05.280 |
|
will come. I just don't believe it'll be in a year. I wish. But nevertheless, there's |
|
|
|
32:05.280 --> 32:11.680 |
|
autonomous vehicles is a good example. There's a feeling many companies have promised by 2021, |
|
|
|
32:11.680 --> 32:18.000 |
|
by 2022 for GM. Basically, every single automotive company has promised they'll |
|
|
|
32:18.000 --> 32:22.480 |
|
have autonomous vehicles. So that kind of overpromise is what leads to the winter. |
|
|
|
32:23.040 --> 32:28.320 |
|
Because we'll come to those dates, there won't be autonomous vehicles, and there'll be a feeling, |
|
|
|
32:28.960 --> 32:34.160 |
|
well, wait a minute, if we took your word at that time, that means we just spent billions of |
|
|
|
32:34.160 --> 32:41.600 |
|
dollars and made no money. And there's a counter-response where everybody gives up on it.
|
|
|
32:41.600 --> 32:49.600 |
|
Sort of intellectually, at every level, the hope just dies. And all that's left is a few basic |
|
|
|
32:49.600 --> 32:56.800 |
|
researchers. So you're uncomfortable with some aspects of this idea. Well, it's the difference |
|
|
|
32:56.800 --> 33:04.160 |
|
between science and commerce. So you think science, science goes on the way it does? |
|
|
|
33:06.480 --> 33:14.800 |
|
Science can really be killed by not getting proper funding or timely funding. I think |
|
|
|
33:14.800 --> 33:22.080 |
|
Great Britain was a perfect example of that. The Lighthill report in the 1960s. |
|
|
|
33:22.080 --> 33:27.360 |
|
It essentially said, there's no use in Great Britain putting any money into this.
|
|
|
33:27.360 --> 33:35.600 |
|
It's going nowhere. And this was all about social factions in Great Britain. |
|
|
|
33:36.960 --> 33:44.400 |
|
Edinburgh hated Cambridge, and Cambridge hated Manchester, and somebody else can write that |
|
|
|
33:44.400 --> 33:53.760 |
|
story. But it really did have a hard effect on research there. Now, they've come roaring back |
|
|
|
33:53.760 --> 34:01.360 |
|
with DeepMind. But that's one guy and his visionaries around him.
|
|
|
34:01.360 --> 34:08.320 |
|
But just to push on that, it's kind of interesting, you have this dislike of the idea of an AI winter. |
|
|
|
34:08.320 --> 34:15.440 |
|
Where's that coming from? Where were you? Oh, because I just don't think it's true. |
|
|
|
34:16.560 --> 34:21.360 |
|
There was a particular period of time. It's a romantic notion, certainly. |
|
|
|
34:21.360 --> 34:32.960 |
|
Yeah, well, I admire science, perhaps more than I admire commerce. Commerce is fine. Hey, |
|
|
|
34:32.960 --> 34:42.960 |
|
you know, we all got to live. But science has a much longer view than commerce, |
|
|
|
34:44.080 --> 34:54.000 |
|
and continues almost regardless. It can't continue totally regardless, but it almost |
|
|
|
34:54.000 --> 34:59.600 |
|
regardless of what's saleable and what's not, what's monetizable and what's not. |
|
|
|
34:59.600 --> 35:05.840 |
|
So the winter is just something that happens on the commerce side, and the science marches. |
|
|
|
35:07.200 --> 35:12.560 |
|
That's a beautifully optimistic inspired message. I agree with you. I think |
|
|
|
35:13.760 --> 35:19.440 |
|
if we look at the key people that work in AI, or key scientists in most disciplines,
|
|
|
35:19.440 --> 35:25.360 |
|
they continue working out of the love for science. You can always scrape up some funding |
|
|
|
35:25.360 --> 35:33.120 |
|
to stay alive, and they continue working diligently. But there certainly is a huge |
|
|
|
35:33.120 --> 35:39.840 |
|
amount of funding now, and there's a concern on the AI side and deep learning. There's a concern |
|
|
|
35:39.840 --> 35:46.160 |
|
that we might, with over promising, hit another slowdown in funding, which does affect the number |
|
|
|
35:46.160 --> 35:51.280 |
|
of students, you know, that kind of thing. Yeah, it does. So the kind of ideas you had |
|
|
|
35:51.280 --> 35:56.400 |
|
in Machines Who Think, did you continue that curiosity through the decades that followed?
|
|
|
35:56.400 --> 36:04.800 |
|
Yes, I did. And what was your view, historical view of how AI community evolved, the conversations |
|
|
|
36:04.800 --> 36:11.520 |
|
about it, the work? Has it persisted the same way from its birth? No, of course not. It's just |
|
|
|
36:11.520 --> 36:21.360 |
|
we were just talking. The symbolic AI really kind of dried up and it all became algorithmic. |
|
|
|
36:22.400 --> 36:29.520 |
|
I remember a young AI student telling me what he was doing, and I had been away from the field |
|
|
|
36:29.520 --> 36:34.080 |
|
long enough. I'd gotten involved with complexity at the Santa Fe Institute. |
|
|
|
36:34.080 --> 36:40.960 |
|
I thought, algorithms, yeah, they're in the service of, but they're not the main event. |
|
|
|
36:41.680 --> 36:49.200 |
|
No, they became the main event. That surprised me. And we all know the downside of this. We |
|
|
|
36:49.200 --> 36:58.240 |
|
all know that if you're using an algorithm to make decisions based on a gazillion human decisions |
|
|
|
36:58.240 --> 37:05.040 |
|
baked into it, there are all the mistakes that humans make, the bigotries, the shortsightedness,
|
|
|
37:06.000 --> 37:14.000 |
|
so on and so on. So you mentioned Santa Fe Institute. So you've written the novel Edge |
|
|
|
37:14.000 --> 37:21.200 |
|
of Chaos, but it's inspired by the ideas of complexity, a lot of which have been extensively |
|
|
|
37:21.200 --> 37:30.160 |
|
explored at the Santa Fe Institute. It's another fascinating topic of just sort of |
|
|
|
37:31.040 --> 37:37.600 |
|
emergent complexity from chaos. Nobody knows how it happens really, but it seems to be where
|
|
|
37:37.600 --> 37:44.160 |
|
all the interesting stuff does happen. So how does, first, not your novel, but just
|
|
|
37:44.160 --> 37:49.520 |
|
complexity in general and the work at Santa Fe fit into the bigger puzzle of the history of AI?
|
|
|
37:49.520 --> 37:53.520 |
|
Or it may be even your personal journey through that. |
|
|
|
37:54.480 --> 38:03.040 |
|
One of the last projects I did concerning AI in particular was looking at the work of |
|
|
|
38:03.040 --> 38:12.960 |
|
Harold Cohen, the painter. And Harold was deeply involved with AI. He was a painter first. |
|
|
|
38:12.960 --> 38:27.600 |
|
And what his project, Aaron, which was a lifelong project, did, was reflect his own cognitive |
|
|
|
38:27.600 --> 38:34.800 |
|
processes. Okay. Harold and I, even though I wrote a book about it, we had a lot of friction between |
|
|
|
38:34.800 --> 38:44.560 |
|
us. And I went, I thought, this is it, you know, the book died. It was published and fell into a |
|
|
|
38:44.560 --> 38:53.040 |
|
ditch. This is it. I'm finished. It's time for me to do something different. By chance, |
|
|
|
38:53.040 --> 38:59.280 |
|
this was a sabbatical year for my husband. And we spent two months at the Santa Fe Institute |
|
|
|
38:59.280 --> 39:09.200 |
|
and two months at Caltech. And then the spring semester in Munich, Germany. Okay. Those two |
|
|
|
39:09.200 --> 39:19.520 |
|
months at the Santa Fe Institute were so restorative for me. And I began to, the institute was very |
|
|
|
39:19.520 --> 39:26.240 |
|
small then. It was in some kind of office complex on old Santa Fe trail. Everybody kept their door |
|
|
|
39:26.240 --> 39:33.440 |
|
open. So you could crack your head on a problem. And if you finally didn't get it, you could walk |
|
|
|
39:33.440 --> 39:42.480 |
|
in to see Stuart Kauffman or any number of people and say, I don't get this. Can you explain?
|
|
|
39:43.680 --> 39:51.120 |
|
And one of the people that I was talking to about complex adaptive systems was Murray Gell-Mann.
|
|
|
39:51.120 --> 39:58.960 |
|
And I told Murray what Harold Cohen had done. And I said, you know, this sounds to me |
|
|
|
39:58.960 --> 40:06.080 |
|
like a complex adaptive system. And he said, yeah, it is. Well, what do you know? Harold's |
|
|
|
40:06.080 --> 40:12.560 |
|
Aaron had all these kissing cousins all over the world in science and in economics and so on and |
|
|
|
40:12.560 --> 40:21.200 |
|
so forth. I was so relieved. I thought, okay, your instincts are okay. You're doing the right thing. I |
|
|
|
40:21.200 --> 40:25.920 |
|
didn't have the vocabulary. And that was one of the things that the Santa Fe Institute gave me. |
|
|
|
40:25.920 --> 40:31.040 |
|
If I could have rewritten that book, no, it had just come out. I couldn't rewrite it. I would have |
|
|
|
40:31.040 --> 40:37.680 |
|
had a vocabulary to explain what Aaron was doing. Okay. So I got really interested in |
|
|
|
40:37.680 --> 40:47.440 |
|
what was going on at the Institute. The people were again, bright and funny and willing to explain |
|
|
|
40:47.440 --> 40:54.800 |
|
anything to this amateur. George Cowan, who was then the head of the Institute, said he thought it |
|
|
|
40:54.800 --> 41:02.160 |
|
might be a nice idea if I wrote a book about the Institute. And I thought about it. And I had my |
|
|
|
41:02.160 --> 41:08.960 |
|
eye on some other project. God knows what. And I said, oh, I'm sorry, George. Yeah, I'd really love |
|
|
|
41:08.960 --> 41:13.840 |
|
to do it. But, you know, just not going to work for me at this moment. And he said, oh, too bad. |
|
|
|
41:13.840 --> 41:18.560 |
|
I think it would make an interesting book. Well, he was right and I was wrong. I wish I'd done it. |
|
|
|
41:18.560 --> 41:24.080 |
|
But that's interesting. I hadn't thought about that, that that was a road not taken that I wish |
|
|
|
41:24.080 --> 41:32.400 |
|
I'd taken. Well, you know, just on that point, it's quite brave for you as a writer,
|
|
|
41:32.400 --> 41:39.680 |
|
as sort of coming from a world of literature, the literary thinking and historical thinking. I mean, |
|
|
|
41:39.680 --> 41:52.640 |
|
just from that world and bravely talking to quite, I assume, large egos in AI or in complexity and so |
|
|
|
41:52.640 --> 42:00.560 |
|
on. How'd you do it? Like, where did you? I mean, I suppose they could be intimidated by you as well,
|
|
|
42:00.560 --> 42:06.160 |
|
because it's two different worlds. I never picked up that anybody was intimidated by me. |
|
|
|
42:06.160 --> 42:11.040 |
|
But how were you brave enough? Where did you find the guts to just... Dumb, dumb luck. I mean,
|
|
|
42:11.680 --> 42:16.160 |
|
this is an interesting rock to turn over. I'm going to write a book about it. And you know,
|
|
|
42:16.160 --> 42:21.840 |
|
people have enough patience with writers, if they think they're going to end up in a book
|
|
|
42:21.840 --> 42:27.840 |
|
that they let you flail around and so on. Well, but they also look to see if the writer has...
|
|
|
42:27.840 --> 42:33.440 |
|
There's like, if there's a sparkle in their eye, if they get it. Yeah, sure. Right. When were you |
|
|
|
42:33.440 --> 42:44.480 |
|
at the Santa Fe Institute? The time I'm talking about is 1990. Yeah, 1990, 1991, 1992. But we then, |
|
|
|
42:44.480 --> 42:49.920 |
|
because Joe was an external faculty member, we were in Santa Fe every summer; we bought a house there.
|
|
|
42:49.920 --> 42:55.600 |
|
And I didn't have that much to do with the Institute anymore. I was writing my novels, |
|
|
|
42:55.600 --> 43:04.320 |
|
I was doing whatever I was doing. But I loved the Institute and I loved |
|
|
|
43:06.720 --> 43:12.960 |
|
the, again, the audacity of the ideas. That really appeals to me. |
|
|
|
43:12.960 --> 43:22.160 |
|
I think that there's this feeling, much like in great institutes of neuroscience, for example, |
|
|
|
43:23.040 --> 43:29.840 |
|
that they're in it for the long game of understanding something fundamental about |
|
|
|
43:29.840 --> 43:36.800 |
|
reality and nature. And that's really exciting. So if we now look a little bit more recently,
|
|
|
43:36.800 --> 43:49.280 |
|
AI is really popular today. How is this world, you mentioned algorithmic, but in general,
|
|
|
43:50.080 --> 43:54.320 |
|
is the spirit of the people, the kind of conversations you hear through the grapevine |
|
|
|
43:54.320 --> 44:00.160 |
|
and so on, is that different than the roots that you remember? No, the same kind of excitement, |
|
|
|
44:00.160 --> 44:07.120 |
|
the same kind of, this is really going to make a difference in the world. And it will, it has. |
|
|
|
44:07.120 --> 44:13.280 |
|
A lot of folks, especially young, 20 years old or something, they think we've just found something |
|
|
|
44:14.080 --> 44:21.040 |
|
special here. We're going to change the world tomorrow. On a time scale, do you have |
|
|
|
44:22.000 --> 44:27.120 |
|
a sense of what, of the time scale at which breakthroughs in AI happen? |
|
|
|
44:27.120 --> 44:39.040 |
|
I really don't, because look at deep learning. That was, Geoffrey Hinton came up with the algorithm
|
|
|
44:39.920 --> 44:48.960 |
|
in 86. But it took all these years for the technology to be good enough to actually |
|
|
|
44:48.960 --> 44:57.680 |
|
be applicable. So no, I can't predict that at all. I can't, I wouldn't even try. |
|
|
|
44:58.320 --> 45:05.440 |
|
Well, let me ask you to, not to try to predict, but to speak to the, I'm sure in the 60s, |
|
|
|
45:06.000 --> 45:10.320 |
|
as it continues now, there's people that think, let's call it, we can call it this |
|
|
|
45:11.040 --> 45:17.120 |
|
fun word, the singularity. When there's a phase shift, there's some profound feeling where |
|
|
|
45:17.120 --> 45:23.040 |
|
we're all really surprised by what's able to be achieved. I'm sure those dreams are there. |
|
|
|
45:23.040 --> 45:29.200 |
|
I remember reading quotes in the 60s and those continued. How have your own views, |
|
|
|
45:29.200 --> 45:34.960 |
|
maybe if you look back, about the timeline of a singularity changed? |
|
|
|
45:37.040 --> 45:45.760 |
|
Well, I'm not a big fan of the singularity as Ray Kurzweil has presented it. |
|
|
|
45:45.760 --> 45:52.480 |
|
But how would you define it, the Ray Kurzweil version? How would you, how do you think of the singularity in
|
|
|
45:52.480 --> 45:58.880 |
|
those terms? If I understand Kurzweil's view, it's sort of, there's going to be this moment when
|
|
|
45:58.880 --> 46:06.320 |
|
machines are smarter than humans and, you know, game over. However, the game over is, |
|
|
|
46:06.320 --> 46:10.800 |
|
I mean, do they put us on a reservation? Do they, et cetera, et cetera. And |
|
|
|
46:10.800 --> 46:19.840 |
|
first of all, machines are smarter than humans in some ways, all over the place. And they have been |
|
|
|
46:19.840 --> 46:27.280 |
|
since adding machines were invented. So it's not, it's not going to come like some great |
|
|
|
46:27.280 --> 46:34.320 |
|
Oedipal crossroads, you know, where they meet each other and our offspring, Oedipus, says,
|
|
|
46:34.320 --> 46:41.040 |
|
you're dead. It's just not going to happen. Yeah. So it's already game over with calculators, |
|
|
|
46:41.040 --> 46:47.920 |
|
right? They're already able to do much better at basic arithmetic than us. But, you know,
|
|
|
46:47.920 --> 46:55.840 |
|
there's a human-like intelligence. And that's not the one that destroys us. But, you know,
|
|
|
46:55.840 --> 47:01.520 |
|
somebody that you can have as a, as a friend, you can have deep connections with that kind of |
|
|
|
47:01.520 --> 47:07.680 |
|
passing the Turing test and beyond, those kinds of ideas. Have you dreamt of those? |
|
|
|
47:07.680 --> 47:08.880 |
|
Oh, yes, yes, yes. |
|
|
|
47:08.880 --> 47:10.160 |
|
Those possibilities. |
|
|
|
47:10.160 --> 47:17.520 |
|
In a book I wrote with Ed Feigenbaum, there's a little story called the geriatric robot. And |
|
|
|
47:18.880 --> 47:24.240 |
|
how I came up with the geriatric robot is a story in itself. But here's, here's what the |
|
|
|
47:24.240 --> 47:29.520 |
|
geriatric robot does. It doesn't just clean you up and feed you and wheel you out into the sun.
|
|
|
47:29.520 --> 47:42.720 |
|
Its great advantage is that it listens. It says, tell me again about the great coup of '73.
|
|
|
47:43.520 --> 47:52.080 |
|
Tell me again about how awful or how wonderful your grandchildren are and so on and so forth. |
|
|
|
47:53.040 --> 47:59.440 |
|
And it isn't hanging around to inherit your money. It isn't hanging around because it can't get |
|
|
|
47:59.440 --> 48:08.320 |
|
any other job. This is its job and so on and so forth. Well, I would love something like that. |
|
|
|
48:09.120 --> 48:15.600 |
|
Yeah. I mean, for me, that deeply excites me. So I think there's a lot of us. |
|
|
|
48:15.600 --> 48:20.880 |
|
Lex, you got to know it was a joke. I dreamed it up because I needed to talk to college students |
|
|
|
48:20.880 --> 48:26.880 |
|
and I needed to give them some idea of what AI might be. And they were rolling in the aisles |
|
|
|
48:26.880 --> 48:32.160 |
|
as I elaborated and elaborated and elaborated. When it went into the book, |
|
|
|
48:34.320 --> 48:40.240 |
|
they took my hide off in the New York Review of Books. This is just what we've thought about |
|
|
|
48:40.240 --> 48:44.400 |
|
these people in AI. They're inhuman. Oh, come on. Get over it. |
|
|
|
48:45.120 --> 48:49.120 |
|
Don't you think that's a good thing for the world that AI could potentially... |
|
|
|
48:49.120 --> 48:58.800 |
|
Why? I do. Absolutely. And furthermore, I want... I'm pushing 80 now. By the time I need help |
|
|
|
48:59.360 --> 49:04.160 |
|
like that, I also want it to roll itself in a corner and shut the fuck up. |
|
|
|
49:06.960 --> 49:09.920 |
|
Let me linger on that point. Do you really, though? |
|
|
|
49:09.920 --> 49:11.040 |
|
Yeah, I do. Here's what. |
|
|
|
49:11.040 --> 49:15.120 |
|
But you wanted to push back a little bit a little. |
|
|
|
49:15.120 --> 49:21.520 |
|
But I have watched my friends go through the whole issue around having help in the house. |
|
|
|
49:22.480 --> 49:29.760 |
|
And some of them have been very lucky and had fabulous help. And some of them have had people |
|
|
|
49:29.760 --> 49:35.120 |
|
in the house who want to keep the television going on all day, who want to talk on their phones all day. |
|
|
|
49:35.760 --> 49:38.960 |
|
No. So basically... Just roll yourself in the corner. |
|
|
|
49:38.960 --> 49:45.760 |
|
Unfortunately, us humans, when we're assistants, we care... We're still... |
|
|
|
49:45.760 --> 49:48.400 |
|
Even when we're assisting others, we care about ourselves more. |
|
|
|
49:48.400 --> 49:49.280 |
|
Of course. |
|
|
|
49:49.280 --> 49:56.800 |
|
And so you create more frustration. And a robot AI assistant can really optimize |
|
|
|
49:57.360 --> 50:03.040 |
|
the experience for you. I was just speaking to the point... You actually bring up a very, |
|
|
|
50:03.040 --> 50:07.120 |
|
very good point. But I was speaking to the fact that us humans are a little complicated, |
|
|
|
50:07.120 --> 50:14.560 |
|
that we don't necessarily want a perfect servant. I don't... Maybe you disagree with that. |
|
|
|
50:14.560 --> 50:21.360 |
|
But there's... I think there's a push and pull with humans. |
|
|
|
50:22.240 --> 50:28.080 |
|
A little tension, a little mystery that, of course, that's really difficult for you to get right. |
|
|
|
50:28.080 --> 50:35.120 |
|
But I do sense, especially in today with social media, that people are getting more and more |
|
|
|
50:35.120 --> 50:42.000 |
|
lonely, even young folks. And sometimes, especially young folks, that loneliness, |
|
|
|
50:42.000 --> 50:47.760 |
|
there's a longing for connection. And AI can help alleviate some of that loneliness. |
|
|
|
50:48.560 --> 50:52.880 |
|
Some. Just somebody who listens. Like in person. |
|
|
|
50:54.640 --> 50:54.960 |
|
That... |
|
|
|
50:54.960 --> 50:55.680 |
|
So to speak. |
|
|
|
50:56.240 --> 51:00.080 |
|
So to speak, yeah. So to speak. |
|
|
|
51:00.080 --> 51:07.120 |
|
Yeah, that to me is really exciting. But so if we look at that level of intelligence, |
|
|
|
51:07.120 --> 51:12.560 |
|
which is exceptionally difficult to achieve, actually, as the singularity, or whatever, |
|
|
|
51:12.560 --> 51:18.320 |
|
that's the human-level bar, that people have dreamt of too. Turing dreamt of it.
|
|
|
51:19.520 --> 51:26.320 |
|
He had a date, a timeline. How has your own timeline evolved over time?
|
|
|
51:26.320 --> 51:27.520 |
|
I don't even think about it. |
|
|
|
51:27.520 --> 51:28.240 |
|
You don't even think? |
|
|
|
51:28.240 --> 51:35.520 |
|
No. Just this field has been so full of surprises for me. |
|
|
|
51:35.520 --> 51:39.120 |
|
So you're just taking it in and seeing it as a fun bunch of basic science?
|
|
|
51:39.120 --> 51:45.840 |
|
Yeah, I just can't. Maybe that's because I've been around the field long enough to think, |
|
|
|
51:45.840 --> 51:52.240 |
|
you know, don't go that way. Herb Simon was terrible about making these predictions of |
|
|
|
51:52.240 --> 51:53.840 |
|
when this and that would happen. |
|
|
|
51:53.840 --> 51:54.320 |
|
Right. |
|
|
|
51:54.320 --> 51:58.400 |
|
And he was a sensible guy. |
|
|
|
51:58.400 --> 52:01.600 |
|
Yeah. And his quotes are often used, right, as a... |
|
|
|
52:01.600 --> 52:02.880 |
|
That's a legend, yeah. |
|
|
|
52:02.880 --> 52:13.840 |
|
Yeah. Do you have concerns about AI, the existential threats that many people like Elon Musk and |
|
|
|
52:13.840 --> 52:16.160 |
|
Sam Harris and others are thinking about?
|
|
|
52:16.160 --> 52:21.200 |
|
Oh, yeah, yeah. That takes up a half a chapter in my book. |
|
|
|
52:21.200 --> 52:27.120 |
|
I call it the male gaze. |
|
|
|
52:27.120 --> 52:35.200 |
|
Well, hear me out. The male gaze is actually a term from film criticism.
|
|
|
52:36.240 --> 52:40.640 |
|
And I'm blocking on the woman who dreamed this up. |
|
|
|
52:41.280 --> 52:48.880 |
|
But she pointed out how most movies were made from the male point of view, that women were |
|
|
|
52:48.880 --> 52:55.520 |
|
objects, not subjects. They didn't have any agency and so on and so forth. |
|
|
|
52:56.080 --> 53:00.560 |
|
So when Elon and his pals, Hawking and so on, |
|
|
|
53:00.560 --> 53:05.760 |
|
okay, AI is going to eat our lunch and our dinner and our midnight snack too, |
|
|
|
53:06.640 --> 53:11.600 |
|
I thought, what? And I said to Ed Feigenbaum, oh, this is the first guy. |
|
|
|
53:11.600 --> 53:14.880 |
|
First, these guys have always been the smartest guys on the block.
|
|
|
53:14.880 --> 53:20.880 |
|
And here comes something that might be smarter. Ooh, let's stamp it out before it takes over. |
|
|
|
53:20.880 --> 53:23.360 |
|
And Ed laughed. He said, I didn't think about it that way. |
|
|
|
53:24.080 --> 53:30.320 |
|
But I did. I did. And it is the male gaze. |
|
|
|
53:32.000 --> 53:37.120 |
|
Okay, suppose these things do have agency. Well, let's wait and see what happens. |
|
|
|
53:37.120 --> 53:47.040 |
|
Can we imbue them with ethics? Can we imbue them with a sense of empathy? |
|
|
|
53:48.560 --> 53:54.960 |
|
Or are they just going to be, I don't know, we've had centuries of guys like that? |
|
|
|
53:55.760 --> 54:03.600 |
|
That's interesting that the ego, the male gaze is immediately threatened. |
|
|
|
54:03.600 --> 54:12.320 |
|
And so you can't think in a patient, calm way of how the tech could evolve. |
|
|
|
54:13.280 --> 54:21.520 |
|
Speaking of which, your 1996 book, The Futures of Women, I think at the time and now, certainly now,
|
|
|
54:21.520 --> 54:27.760 |
|
I mean, I'm sorry, maybe at the time, but I'm more cognizant of now, is extremely relevant. |
|
|
|
54:27.760 --> 54:34.160 |
|
And you and Nancy Ramsey talk about four possible futures of women in science and tech. |
|
|
|
54:35.120 --> 54:42.400 |
|
So if we look at the decades before and after the book was released, can you tell a history, |
|
|
|
54:43.120 --> 54:50.560 |
|
sorry, of women in science and tech and how it has evolved? How have things changed? Where do we |
|
|
|
54:50.560 --> 55:01.520 |
|
stand? Not enough. They have not changed enough. The way that women are ground down in computing is |
|
|
|
55:02.320 --> 55:09.680 |
|
simply unbelievable. But what are the four possible futures for women in tech from the book? |
|
|
|
55:10.800 --> 55:16.720 |
|
What you're really looking at are various aspects of the present. So for each of those, |
|
|
|
55:16.720 --> 55:22.800 |
|
you could say, oh, yeah, we do have backlash. Look at what's happening with abortion and so on and so |
|
|
|
55:22.800 --> 55:31.280 |
|
forth. We have one step forward, one step back. The golden age of equality was the hardest chapter |
|
|
|
55:31.280 --> 55:37.040 |
|
to write. And I used something from the Santa Fe Institute, which is the sand pile effect, |
|
|
|
55:37.760 --> 55:44.480 |
|
that you drop sand very slowly onto a pile, and it grows and it grows and it grows until |
|
|
|
55:44.480 --> 55:56.640 |
|
suddenly it just breaks apart. And in a way, MeToo has done that. That was the last drop of sand |
|
|
|
55:56.640 --> 56:02.800 |
|
that broke everything apart. That was a perfect example of the sand pile effect. And that made |
|
|
|
56:02.800 --> 56:07.440 |
|
me feel good. It didn't change all of society, but it really woke a lot of people up. |
|
|
|
56:07.440 --> 56:15.760 |
|
But are you in general optimistic about maybe after MeToo? MeToo is about a very specific kind |
|
|
|
56:15.760 --> 56:21.920 |
|
of thing. Boy, solve that and you'll solve everything. But are you in general optimistic |
|
|
|
56:21.920 --> 56:27.600 |
|
about the future? Yes, I'm a congenital optimist. I can't help it.
|
|
|
56:28.400 --> 56:35.600 |
|
What about AI? What are your thoughts about the future of AI? Of course, I get asked, |
|
|
|
56:35.600 --> 56:41.280 |
|
what do you worry about? And the one thing I worry about is the things we can't anticipate. |
|
|
|
56:44.320 --> 56:47.440 |
|
There's going to be something out of that field that we will just say, |
|
|
|
56:47.440 --> 56:59.040 |
|
we weren't prepared for that. I am generally optimistic. When I first took up being interested |
|
|
|
56:59.040 --> 57:06.560 |
|
in AI, like most people in the field, more intelligence was like more virtue. What could be |
|
|
|
57:06.560 --> 57:15.760 |
|
bad? And in a way, I still believe that, but I realize that my notion of intelligence has |
|
|
|
57:15.760 --> 57:22.240 |
|
broadened. There are many kinds of intelligence, and we need to imbue our machines with those many |
|
|
|
57:22.240 --> 57:32.720 |
|
kinds. So you've now just finished or in the process of finishing the book, even working on |
|
|
|
57:32.720 --> 57:40.160 |
|
the memoir. How have you changed? I know it's just writing, but how have you changed through the process?
|
|
|
57:40.800 --> 57:48.880 |
|
If you look back, what kind of stuff did it bring up to you that surprised you looking at the entirety |
|
|
|
57:48.880 --> 58:01.200 |
|
of it all? The biggest thing, and it really wasn't a surprise, is how lucky I was, oh my, to be, |
|
|
|
58:03.680 --> 58:08.880 |
|
to have access to the beginning of a scientific field that is going to change the world. |
|
|
|
58:08.880 --> 58:20.880 |
|
How did I luck out? And yes, of course, my view of things has widened a lot. |
|
|
|
58:23.040 --> 58:32.080 |
|
If I can get back to one feminist part of our conversation, without knowing it, it really |
|
|
|
58:32.080 --> 58:40.400 |
|
was subconscious. I wanted AI to succeed because I was so tired of hearing that intelligence was |
|
|
|
58:40.400 --> 58:47.920 |
|
inside the male cranium. And I thought if there was something out there that wasn't a male |
|
|
|
58:49.360 --> 58:57.120 |
|
thinking and doing well, then that would put the lie to this whole notion that intelligence resides
|
|
|
58:57.120 --> 59:04.560 |
|
in the male cranium. I did not know that until one night, Harold Cohen and I were |
|
|
|
59:05.760 --> 59:12.560 |
|
having a glass of wine, maybe two, and he said, what drew you to AI? And I said, oh, you know, |
|
|
|
59:12.560 --> 59:18.080 |
|
smartest people I knew, great project, blah, blah, blah. And I said, and I wanted something |
|
|
|
59:18.080 --> 59:30.640 |
|
besides male smarts. And it just bubbled up out of me, Lex. It's brilliant, actually. So AI really |
|
|
|
59:32.160 --> 59:36.080 |
|
humbles all of us and humbles the people that need to be humbled the most. |
|
|
|
59:37.040 --> 59:43.440 |
|
Let's hope. Oh, wow, that is so beautiful. Pamela, thank you so much for talking to |
|
|
|
59:43.440 --> 59:49.440 |
|
us. Oh, it's been a great pleasure. Thank you. |
|
|
|
|