WEBVTT 00:00.000 --> 00:03.240 The following is a conversation with Regina Barzilay. 00:03.240 --> 00:06.680 She's a professor at MIT and a world class researcher 00:06.680 --> 00:09.120 in natural language processing and applications 00:09.120 --> 00:12.440 of deep learning to chemistry and oncology, 00:12.440 --> 00:16.240 or the use of deep learning for early diagnosis, prevention, 00:16.240 --> 00:18.320 and treatment of cancer. 00:18.320 --> 00:21.000 She has also been recognized for her teaching 00:21.000 --> 00:24.680 of several successful AI related courses at MIT, 00:24.680 --> 00:27.360 including the popular introduction to machine 00:27.360 --> 00:28.920 learning course. 00:28.920 --> 00:32.160 This is the Artificial Intelligence Podcast. 00:32.160 --> 00:34.560 If you enjoy it, subscribe on YouTube, 00:34.560 --> 00:37.840 give it five stars on iTunes, support it on Patreon, 00:37.840 --> 00:39.840 or simply connect with me on Twitter 00:39.840 --> 00:43.760 at Lex Fridman, spelled F R I D M A N. 00:43.760 --> 00:47.720 And now here's my conversation with Regina Barzilay. 00:48.800 --> 00:50.320 In an interview you've mentioned 00:50.320 --> 00:51.960 that if there's one course you would take, 00:51.960 --> 00:54.600 it would be a literature course 00:54.600 --> 00:57.920 that a friend of yours teaches, just out of curiosity, 00:57.920 --> 01:00.320 because I couldn't find anything on it. 01:00.320 --> 01:04.480 Are there books or ideas that had profound impact 01:04.480 --> 01:07.760 on your life journey, books and ideas perhaps outside 01:07.760 --> 01:10.880 of computer science and the technical fields? 01:11.840 --> 01:14.800 I think because I'm spending a lot of my time at MIT 01:14.800 --> 01:18.360 and previously in other institutions where I was a student, 01:18.360 --> 01:21.080 I have a limited ability to interact with people. 
01:21.080 --> 01:22.720 So a lot of what I know about the world 01:22.720 --> 01:24.280 actually comes from books. 01:25.320 --> 01:27.280 And there were quite a number of books 01:27.280 --> 01:31.400 that had profound impact on me and how I view the world. 01:31.400 --> 01:35.840 Let me just give you one example of such a book. 01:35.840 --> 01:39.680 Maybe a year ago I read a book 01:39.680 --> 01:42.520 called The Emperor of All Maladies. 01:42.520 --> 01:45.760 It's a book about, it's kind of a history of science book 01:45.760 --> 01:50.760 on how the treatments and drugs for cancer were developed. 01:50.760 --> 01:54.040 And that book, despite the fact that I am 01:54.040 --> 01:57.040 in the business of science, really opened my eyes 01:57.040 --> 02:02.040 on how imprecise and imperfect the discovery process is 02:03.080 --> 02:05.920 and how imperfect our current solutions are, 02:07.000 --> 02:11.080 and what makes science succeed and be implemented. 02:11.080 --> 02:14.120 Sometimes it's actually not the strength of the idea 02:14.120 --> 02:17.480 but the devotion of the person who wants to see it implemented. 02:17.480 --> 02:19.800 So this is one of the books that, you know, 02:19.800 --> 02:22.320 at least for the last year quite changed the way 02:22.320 --> 02:24.920 I'm thinking about the scientific process, 02:24.920 --> 02:26.720 just from the historical perspective, 02:26.720 --> 02:31.720 and what I need to do to get my ideas really implemented. 02:33.440 --> 02:36.040 Let me give you an example of another book, 02:36.040 --> 02:39.560 which is a fiction book. 02:40.600 --> 02:43.080 It's a book called Americanah. 02:44.440 --> 02:48.760 And this is a book about a young female student 02:48.760 --> 02:53.240 who comes from Africa to study in the United States. 
02:53.240 --> 02:57.720 And it describes her path, you know, through her studies 02:57.720 --> 03:02.000 and her life transformation, you know, 03:02.000 --> 03:06.560 in a new country and kind of adaptation to a new culture. 03:06.560 --> 03:09.520 And when I read this book, 03:09.520 --> 03:13.520 I saw myself in many different points of it. 03:14.600 --> 03:19.600 But it also kind of gave me the lens on different events, 03:20.160 --> 03:22.040 some events that I never actually paid attention to. 03:22.040 --> 03:24.720 One of the funny stories in this book 03:24.720 --> 03:29.720 is how she arrives at her new college 03:30.400 --> 03:32.880 and she starts speaking in English, 03:32.880 --> 03:35.680 and she had this beautiful British accent 03:35.680 --> 03:39.840 because that's how she was educated in her country. 03:39.840 --> 03:40.960 This is not my case. 03:40.960 --> 03:45.400 And then she notices that the person who talks to her, 03:45.400 --> 03:47.160 you know, talks to her in a very funny way, 03:47.160 --> 03:48.280 in a very slow way. 03:48.280 --> 03:50.800 And she's thinking that this woman is disabled 03:50.800 --> 03:54.480 and she's also trying to kind of accommodate her. 03:54.480 --> 03:55.400 And then after a while, 03:55.400 --> 03:57.560 when she finishes her discussion with this officer 03:57.560 --> 04:01.040 from her college, she sees how she interacts 04:01.040 --> 04:03.000 with the other students, with American students, 04:03.000 --> 04:08.000 and she discovers that actually she talked to her this way 04:09.720 --> 04:12.200 because she thought that she doesn't understand English. 04:12.200 --> 04:15.080 And she thought, wow, this is a funny experience. 04:15.080 --> 04:18.000 And literally within a few weeks, 04:18.000 --> 04:21.880 I went to LA to a conference 04:21.880 --> 04:24.320 and I asked somebody in an airport, you know, 04:24.320 --> 04:26.560 how to find like a cab or something. 
04:26.560 --> 04:29.360 And then I noticed that this person is talking 04:29.360 --> 04:30.280 in a very strange way. 04:30.280 --> 04:32.080 And my first thought was that this person 04:32.080 --> 04:35.480 has some, you know, pronunciation issues or something. 04:35.480 --> 04:37.160 And I'm trying to talk very slowly to him, 04:37.160 --> 04:39.640 and I was with another professor, Ernest Fraenkel. 04:39.640 --> 04:43.120 And he's like laughing because it's funny 04:43.120 --> 04:45.720 that I don't get that the guy is talking in this way 04:45.720 --> 04:47.000 because he thinks that I cannot speak. 04:47.000 --> 04:50.120 So it was really kind of a mirroring experience, 04:50.120 --> 04:54.320 and it led me to think a lot about my own experiences 04:54.320 --> 04:57.000 moving, you know, between different countries. 04:57.000 --> 05:00.280 So I think that books play a big role 05:00.280 --> 05:02.720 in my understanding of the world. 05:02.720 --> 05:06.440 On the science question, you mentioned that 05:06.440 --> 05:08.760 it made you discover that personalities 05:08.760 --> 05:12.440 of human beings are more important than perhaps ideas. 05:12.440 --> 05:13.680 Is that what I heard? 05:13.680 --> 05:15.760 It's not necessarily that they are more important 05:15.760 --> 05:19.200 than ideas, but I think that ideas on their own 05:19.200 --> 05:20.520 are not sufficient. 05:20.520 --> 05:24.720 And many times, at least at the local horizon, 05:24.720 --> 05:29.200 it's the personalities and their devotion to their ideas 05:29.200 --> 05:33.000 that really locally changes the landscape. 
05:33.000 --> 05:37.520 Now, if you're looking at AI, like let's say 30 years ago, 05:37.520 --> 05:39.240 you know, dark ages of AI or whatever, 05:39.240 --> 05:42.480 the symbolic times, you can use any word, 05:42.480 --> 05:44.720 you know, there were some people, 05:44.720 --> 05:46.680 now we are looking at a lot of that work 05:46.680 --> 05:48.840 and we are kind of thinking this was not really 05:48.840 --> 05:50.760 maybe a relevant work. 05:50.760 --> 05:53.160 But you can see that some people managed to take it 05:53.160 --> 05:58.000 and to make it so shiny and dominate the, you know, 05:58.000 --> 06:02.440 the academic world and make it the standard. 06:02.440 --> 06:05.280 If you look at the area of natural language processing, 06:06.520 --> 06:09.240 it is a well known fact that the reason statistics 06:09.240 --> 06:14.080 in NLP took such a long time to become mainstream 06:14.080 --> 06:16.880 is because there were quite a number of personalities 06:16.880 --> 06:18.480 who didn't believe in this idea 06:18.480 --> 06:22.080 and who stopped research progress in this area. 06:22.080 --> 06:27.080 So I do not think that, you know, kind of asymptotically 06:27.360 --> 06:28.960 personalities matter, 06:28.960 --> 06:33.960 but I think locally it does make quite a bit of impact, 06:33.960 --> 06:38.080 and it generally, you know, 06:38.080 --> 06:41.440 speeds up the rate of adoption of new ideas. 06:41.440 --> 06:45.480 Yeah, and the other interesting question is in the early days 06:45.480 --> 06:50.000 of a particular discipline. I think you mentioned that book 06:50.000 --> 06:52.400 was, is ultimately a book about cancer? 06:52.400 --> 06:55.160 It's called The Emperor of All Maladies. 06:55.160 --> 06:58.640 Yeah, and those maladies included, the trying to, 06:58.640 --> 07:00.680 the medicine, what was it centered around? 
07:00.680 --> 07:04.960 So it was actually centered on, you know, 07:04.960 --> 07:07.240 how people thought of curing cancer. 07:07.240 --> 07:10.720 Like for me, it was really a discovery how people, 07:10.720 --> 07:14.200 what was the science of chemistry behind drug development, 07:14.200 --> 07:17.280 that it actually grew out of the dyeing, 07:17.280 --> 07:21.960 like coloring, industry, that people who developed chemistry 07:21.960 --> 07:25.840 in the 19th century in Germany and Britain to make, 07:25.840 --> 07:28.160 you know, really new dyes, 07:28.160 --> 07:30.200 they looked at the molecules and identified 07:30.200 --> 07:32.160 that they do certain things to cells. 07:32.160 --> 07:34.960 And from there, the process started and, you know, 07:34.960 --> 07:36.920 like historians think, yeah, this is fascinating 07:36.920 --> 07:38.720 that they managed to make the connection 07:38.720 --> 07:42.280 and look under the microscope and do all this discovery. 07:42.280 --> 07:44.440 But as you continue reading about it 07:44.440 --> 07:48.760 and you read about how chemotherapy drugs 07:48.760 --> 07:50.520 were actually developed in Boston, 07:50.520 --> 07:52.480 and some of them were developed 07:52.480 --> 07:57.480 by Dr. Farber, from Dana-Farber, 07:57.480 --> 08:00.480 you know, how the experiments were done, 08:00.480 --> 08:03.360 that, you know, there was some miscalculation, 08:03.360 --> 08:04.520 let's put it this way, 08:04.520 --> 08:05.960 and they tried it on the patients, 08:05.960 --> 08:08.480 and those were children with leukemia, 08:08.480 --> 08:11.640 and they died, and they tried another modification. 08:11.640 --> 08:15.040 You look at the process, how imperfect is this process. 08:15.040 --> 08:17.520 And, you know, like, if we're again looking back 08:17.520 --> 08:19.200 like 60 years ago, 70 years ago, 08:19.200 --> 08:20.800 you can kind of understand it. 
08:20.800 --> 08:23.040 But some of the stories in this book, 08:23.040 --> 08:24.640 which were really shocking to me, 08:24.640 --> 08:28.040 were happening, you know, maybe decades ago. 08:28.040 --> 08:30.680 And we still don't have a vehicle 08:30.680 --> 08:35.120 to do it much faster and more effectively and, you know, 08:35.120 --> 08:36.280 scientific the way I'm thinking, 08:36.280 --> 08:38.240 computer science scientific. 08:38.240 --> 08:40.480 So from the perspective of computer science, 08:40.480 --> 08:43.800 you've gotten a chance to work on the application to cancer 08:43.800 --> 08:44.920 and to medicine in general. 08:44.920 --> 08:48.480 From the perspective of an engineer and a computer scientist, 08:48.480 --> 08:51.840 how far along are we from understanding the human body, 08:51.840 --> 08:55.600 biology, and being able to manipulate it in a way 08:55.600 --> 08:59.760 we can cure some of the maladies, some of the diseases? 08:59.760 --> 09:02.240 So this is a very interesting question. 09:03.520 --> 09:06.080 And if you're thinking as a computer scientist 09:06.080 --> 09:09.880 about this problem, I think one of the reasons 09:09.880 --> 09:11.920 that we succeeded in the areas 09:11.920 --> 09:14.040 we as computer scientists succeeded 09:14.040 --> 09:16.320 is because we don't have, 09:16.320 --> 09:19.040 we are not trying to understand, in some ways. 09:19.040 --> 09:22.280 Like if you're thinking about like eCommerce, Amazon, 09:22.280 --> 09:24.280 Amazon doesn't really understand you. 09:24.280 --> 09:27.760 And yet it recommends you certain books 09:27.760 --> 09:29.600 or certain products, correct? 
09:30.720 --> 09:34.280 And in, you know, traditionally, 09:34.280 --> 09:36.440 when people were thinking about marketing, you know, 09:36.440 --> 09:37.840 they divided the population 09:37.840 --> 09:39.840 into different kinds of subgroups, 09:39.840 --> 09:41.800 identified the features of the subgroup, 09:41.800 --> 09:43.160 and came up with a strategy 09:43.160 --> 09:45.640 which is specific to that subgroup. 09:45.640 --> 09:47.120 If you're looking at recommendations, 09:47.120 --> 09:50.640 they're not claiming that they're understanding somebody, 09:50.640 --> 09:54.800 they're just matching the patterns of your behavior 09:54.800 --> 09:57.600 to recommend you a product. 09:57.600 --> 09:59.600 Now, if you look at traditional biology, 09:59.600 --> 10:04.600 obviously I wouldn't say that I am in any way, 10:04.920 --> 10:06.200 you know, educated in this field. 10:06.200 --> 10:07.120 But, you know, from what I see, 10:07.120 --> 10:09.320 there is really a lot of emphasis 10:09.320 --> 10:10.720 on mechanistic understanding. 10:10.720 --> 10:13.840 And it was very surprising to me, coming from computer science, 10:13.840 --> 10:17.640 how much emphasis is on this understanding. 10:17.640 --> 10:20.760 And given the complexity of the system, 10:20.760 --> 10:23.240 maybe the deterministic full understanding 10:23.240 --> 10:27.400 of this process is, you know, beyond our capacity. 10:27.400 --> 10:29.480 And the same way as in computer science, 10:29.480 --> 10:30.520 when we're doing recognition, 10:30.520 --> 10:32.760 when we're doing recommendation, in many other areas, 10:32.760 --> 10:35.960 it's just a probabilistic matching process. 
10:35.960 --> 10:40.080 And in some way, maybe in certain cases, 10:40.080 --> 10:42.960 we shouldn't even attempt to understand, 10:42.960 --> 10:45.760 or we can attempt to understand, but in parallel, 10:45.760 --> 10:48.040 we can actually do this kind of matching 10:48.040 --> 10:52.600 that would help us to find a cure, to do early diagnostics, 10:52.600 --> 10:54.120 and so on. 10:54.120 --> 10:55.840 And I know that in these communities, 10:55.840 --> 10:59.040 it's really important to understand, 10:59.040 --> 11:00.680 but I am sometimes wondering, 11:00.680 --> 11:02.920 what exactly does it mean to understand here? 11:02.920 --> 11:04.440 Well, there's stuff that works, 11:04.440 --> 11:07.600 but that can be, like you said, 11:07.600 --> 11:10.320 separate from this deep human desire 11:10.320 --> 11:12.720 to uncover the mysteries of the universe, 11:12.720 --> 11:16.160 of science, of the way the body works, 11:16.160 --> 11:17.600 the way the mind works. 11:17.600 --> 11:19.560 It's the dream of symbolic AI, 11:19.560 --> 11:24.560 of being able to reduce human knowledge into logic 11:25.200 --> 11:26.880 and be able to play with that logic 11:26.880 --> 11:28.680 in a way that's very explainable 11:28.680 --> 11:30.280 and understandable for us humans. 11:30.280 --> 11:31.760 I mean, that's a beautiful dream. 11:31.760 --> 11:34.840 So I understand it, but it seems that 11:34.840 --> 11:37.880 what seems to work today, and we'll talk about it more, 11:37.880 --> 11:40.760 is as much as possible, reduce stuff into data, 11:40.760 --> 11:43.880 reduce whatever problem you're interested in to data, 11:43.880 --> 11:47.040 and try to apply statistical methods, 11:47.040 --> 11:49.080 apply machine learning to that. 11:49.080 --> 11:51.120 On a personal note, 11:51.120 --> 11:54.160 you were diagnosed with breast cancer in 2014. 11:55.400 --> 11:58.400 Did facing your mortality make you think about it? 11:58.400 --> 12:00.200 How did it change you? 
12:00.200 --> 12:01.800 You know, this is a great question. 12:01.800 --> 12:03.800 And I think that I was interviewed many times 12:03.800 --> 12:05.680 and nobody actually asked me this question. 12:05.680 --> 12:09.640 I think I was 43 at the time. 12:09.640 --> 12:12.800 And it was the first time I realized in my life that I may die. 12:12.800 --> 12:14.400 And I never thought about it before. 12:14.400 --> 12:16.200 And yeah, and there is a long time 12:16.200 --> 12:17.920 from when you're diagnosed until you actually know 12:17.920 --> 12:20.120 what you have and how severe your disease is. 12:20.120 --> 12:23.480 For me, it was like maybe two and a half months. 12:23.480 --> 12:28.280 And I didn't know where I was during this time 12:28.280 --> 12:30.640 because I was getting different tests, 12:30.640 --> 12:32.200 and one would say it's bad, 12:32.200 --> 12:33.360 and another would say, no, it is not. 12:33.360 --> 12:34.840 So until I knew where I was, 12:34.840 --> 12:36.280 I really was thinking about 12:36.280 --> 12:38.200 all these different possible outcomes. 12:38.200 --> 12:39.680 Were you imagining the worst? 12:39.680 --> 12:41.920 Or were you trying to be optimistic? 12:41.920 --> 12:45.600 I don't really remember, you know, 12:45.600 --> 12:47.360 what my thinking was. 12:47.360 --> 12:50.880 It was really a mixture with many components at the time, 12:50.880 --> 12:54.080 speaking, you know, in our terms. 12:54.080 --> 12:59.320 And one thing that I remember, 12:59.320 --> 13:01.480 you know, every test comes and then you think, 13:01.480 --> 13:03.280 oh, it could be this, or it may not be this. 13:03.280 --> 13:04.680 And you're hopeful and then you're desperate. 13:04.680 --> 13:06.600 So it's like, there is a whole, you know, 13:06.600 --> 13:09.800 slew of emotions that goes through you. 
13:09.800 --> 13:15.120 But what I remember is that when I came back to MIT, 13:15.120 --> 13:17.160 I was kind of going to MIT the whole time 13:17.160 --> 13:18.280 through the treatment, 13:18.280 --> 13:19.760 but my brain was not really there. 13:19.760 --> 13:21.800 But when I came back, really finished my treatment 13:21.800 --> 13:24.560 and I was here teaching and everything, 13:24.560 --> 13:27.080 you know, I looked back at what my group was doing, 13:27.080 --> 13:28.840 what other groups were doing, 13:28.840 --> 13:30.840 and I saw these trivialities. 13:30.840 --> 13:33.240 It's like people are building their careers 13:33.240 --> 13:35.520 on improving some parts around two or three percent 13:35.520 --> 13:36.880 or whatever. 13:36.880 --> 13:38.400 I was, it's like, seriously, 13:38.400 --> 13:40.760 I did work on how to decipher Ugaritic, 13:40.760 --> 13:42.880 like a language that nobody speaks, and whatever, 13:42.880 --> 13:46.160 like, what is the significance? 13:46.160 --> 13:49.000 And I was sad, you know. I walked out of MIT, 13:49.000 --> 13:51.600 which is, you know, where people really do care, 13:51.600 --> 13:54.280 you know, what happened to your ICLR paper, 13:54.280 --> 13:56.680 you know, what is your next publication 13:56.680 --> 13:59.960 to ACL, to the world where people, you know, 13:59.960 --> 14:01.880 you see a lot of suffering 14:01.880 --> 14:04.920 that I'm kind of totally shielded from on a daily basis. 14:04.920 --> 14:07.520 And it's like the first time I've seen like real life 14:07.520 --> 14:08.720 and real suffering. 14:09.760 --> 14:13.280 And I was thinking, why are we trying to improve the parser 14:13.280 --> 14:16.120 or deal with some trivialities 14:16.120 --> 14:20.760 when we have the capacity to really make a change? 
14:20.760 --> 14:23.520 And it was really challenging to me 14:23.520 --> 14:24.640 because on one hand, you know, 14:24.640 --> 14:26.720 I have my graduate students who really want to do 14:26.720 --> 14:28.760 their papers and their work, 14:28.760 --> 14:30.880 and they want to continue to do what they were doing, 14:30.880 --> 14:31.960 which was great. 14:31.960 --> 14:36.360 And then it was me who really kind of reevaluated 14:36.360 --> 14:38.600 what is important, and also at that point, 14:38.600 --> 14:40.280 because I had to take some break, 14:42.560 --> 14:47.560 I looked back into like my years in science, 14:47.760 --> 14:49.080 and I was thinking, you know, 14:49.080 --> 14:51.720 like 10 years ago, this was the biggest thing, 14:51.720 --> 14:53.000 I don't know, topic models. 14:53.000 --> 14:55.400 We have like millions of papers on topic models 14:55.400 --> 14:56.720 and variations of topic models. 14:56.720 --> 14:58.640 Now it's totally, like, irrelevant. 14:58.640 --> 15:01.520 And you start looking at this, you know, 15:01.520 --> 15:03.280 what you perceive as important 15:03.280 --> 15:04.560 at different points of time 15:04.560 --> 15:08.960 and how, you know, it fades over time. 15:08.960 --> 15:13.040 And since we have limited time, 15:13.040 --> 15:14.960 all of us have limited time, 15:14.960 --> 15:18.440 it's really important to prioritize things 15:18.440 --> 15:19.760 that really matter to you, 15:19.760 --> 15:22.040 maybe matter to you at that particular point, 15:22.040 --> 15:24.400 but it's important to take some time 15:24.400 --> 15:26.960 and understand what matters to you, 15:26.960 --> 15:28.880 which may not necessarily be the same 15:28.880 --> 15:31.720 as what matters to the rest of your scientific community, 15:31.720 --> 15:34.600 and pursue that vision. 15:34.600 --> 15:38.480 And so through that moment, did it make you cognizant? 
15:38.480 --> 15:42.520 You mentioned suffering, of just the general amount 15:42.520 --> 15:44.360 of suffering in the world. 15:44.360 --> 15:45.640 Is that what you're referring to? 15:45.640 --> 15:49.480 So as opposed to topic models and specific detail problems 15:49.480 --> 15:54.480 in NLP, did you start to think about other people 15:54.480 --> 15:57.040 who have been diagnosed with cancer? 15:57.040 --> 15:58.360 Is that the way you saw the, 15:58.360 --> 16:00.040 started to see the world perhaps? 16:00.040 --> 16:00.880 Oh, absolutely. 16:00.880 --> 16:04.480 And it actually creates a community, because, like, for instance, 16:04.480 --> 16:06.080 you know, there are parts of the treatment 16:06.080 --> 16:08.520 where you need to go to the hospital every day, 16:08.520 --> 16:11.640 and you see, you know, the community of people that you see, 16:11.640 --> 16:16.080 and many of them were much worse off than I was at the time, 16:16.080 --> 16:20.480 and you all of a sudden see it all. 16:20.480 --> 16:23.920 And people who are happier some days 16:23.920 --> 16:25.320 just because they feel better. 16:25.320 --> 16:28.480 And for people who are in our normal realm, 16:28.480 --> 16:30.800 you take it totally for granted that you feel well, 16:30.800 --> 16:32.920 that if you decide to go running, you can go running, 16:32.920 --> 16:36.120 and you can, you know, you're pretty much free to do 16:36.120 --> 16:37.600 whatever you want with your body. 16:37.600 --> 16:40.200 Like I saw like a community, 16:40.200 --> 16:42.840 my community became those people. 16:42.840 --> 16:46.760 And I remember one of my friends, 16:46.760 --> 16:48.920 Dina Katabi, took me to the Prudential 16:48.920 --> 16:50.480 to buy me a gift for my birthday. 16:50.480 --> 16:52.360 And it was like the first time in months 16:52.360 --> 16:55.000 that I went out, kind of to see other people. 16:55.000 --> 16:56.640 And I was like, wow. 
16:56.640 --> 16:58.200 First of all, these people, you know, 16:58.200 --> 16:59.880 they're happy and they're laughing, 16:59.880 --> 17:02.680 and they're very different from these, my other people. 17:02.680 --> 17:04.680 And the second thing, are they totally crazy? 17:04.680 --> 17:06.400 They're like laughing and wasting their money 17:06.400 --> 17:08.480 on some stupid gifts. 17:08.480 --> 17:12.560 And, you know, they may die. 17:12.560 --> 17:14.280 They may already have cancer 17:14.280 --> 17:16.000 and they don't understand it. 17:16.000 --> 17:20.120 So you can really see how the mind changes, 17:20.120 --> 17:22.400 that you can see that, you know, 17:22.400 --> 17:23.240 before that you could ask, 17:23.240 --> 17:24.400 didn't you know that you're gonna die? 17:24.400 --> 17:28.360 Of course I knew, but it was kind of a theoretical notion. 17:28.360 --> 17:31.080 It wasn't something which was concrete. 17:31.080 --> 17:33.920 And at that point when you really see it, 17:33.920 --> 17:37.680 and see how little means sometimes the system 17:37.680 --> 17:40.480 has to help them, you really feel 17:40.480 --> 17:43.960 that we need to take a lot of our brilliance 17:43.960 --> 17:45.480 that we have here at MIT 17:45.480 --> 17:48.040 and translate it into something useful. 17:48.040 --> 17:48.880 Yeah. 17:48.880 --> 17:50.560 And useful can have a lot of definitions, 17:50.560 --> 17:53.640 but of course alleviating suffering, 17:53.640 --> 17:57.480 trying to cure cancer, is a beautiful mission. 17:57.480 --> 18:01.320 So I, of course, knew theoretically the notion 18:01.320 --> 18:04.200 of cancer, but just reading more and more 18:04.200 --> 18:08.040 about it: the 1.7 million new cancer cases 18:08.040 --> 18:09.880 in the United States every year, 18:09.880 --> 18:13.520 600,000 cancer related deaths every year. 18:13.520 --> 18:17.600 So this has a huge impact, in the United States and globally. 
18:19.360 --> 18:24.360 So broadly, before we talk about how machine learning, 18:24.400 --> 18:28.680 how MIT can help, when do you think 18:28.680 --> 18:32.160 we as a civilization will cure cancer? 18:32.160 --> 18:34.640 How hard of a problem is it, from everything 18:34.640 --> 18:36.240 you've learned about it recently? 18:37.280 --> 18:39.320 I cannot really assess it. 18:39.320 --> 18:42.120 What I do believe will happen, with the advancement 18:42.120 --> 18:45.960 in machine learning, is that a lot of types of cancer 18:45.960 --> 18:48.480 we will be able to predict way earlier 18:48.480 --> 18:53.400 and more effectively utilize existing treatments. 18:53.400 --> 18:57.520 I think, I hope at least, that with all the advancements 18:57.520 --> 19:01.200 in AI and drug discovery, we would be able 19:01.200 --> 19:04.680 to much faster find relevant molecules. 19:04.680 --> 19:08.240 What I'm not sure about is how long it will take 19:08.240 --> 19:11.920 the medical establishment and regulatory bodies 19:11.920 --> 19:14.800 to kind of catch up and to implement it. 19:14.800 --> 19:17.400 And I think this is a very big piece of the puzzle 19:17.400 --> 19:20.440 that is currently not addressed. 19:20.440 --> 19:21.800 That's the really interesting question. 19:21.800 --> 19:25.480 So first, a small detail that I think the answer is yes, 19:25.480 --> 19:30.480 but is cancer one of the diseases 19:30.480 --> 19:33.400 where, when detected earlier, 19:33.400 --> 19:36.800 that significantly improves the outcomes? 19:38.320 --> 19:40.720 Because, as we will talk about, there's the cure 19:40.720 --> 19:42.720 and then there is detection. 19:42.720 --> 19:44.880 And I think one place machine learning can really help 19:44.880 --> 19:46.360 is earlier detection. 19:46.360 --> 19:48.280 So does detection help? 19:48.280 --> 19:49.400 Detection is crucial. 
19:49.400 --> 19:53.640 For instance, the vast majority of pancreatic cancer patients 19:53.640 --> 19:57.040 are detected at the stage that they are incurable. 19:57.040 --> 20:02.040 That's why they have such a terrible survival rate. 20:03.680 --> 20:07.200 It's like just a few percent over five years. 20:07.200 --> 20:09.640 It's pretty much today a death sentence. 20:09.640 --> 20:13.400 But if you can discover this disease early, 20:14.320 --> 20:16.600 there are mechanisms to treat it. 20:16.600 --> 20:20.560 And in fact, I know a number of people who were diagnosed 20:20.560 --> 20:23.440 and saved just because they had food poisoning. 20:23.440 --> 20:24.840 They had terrible food poisoning. 20:24.840 --> 20:28.480 They went to the ER, they got a scan. 20:28.480 --> 20:30.560 There were early signs on the scan, 20:30.560 --> 20:33.440 and that saved their lives. 20:33.440 --> 20:35.720 But this was really an accidental case. 20:35.720 --> 20:37.520 So as we become better, 20:38.520 --> 20:42.720 we would be able to help many more people 20:42.720 --> 20:46.400 that are likely to develop diseases. 20:46.400 --> 20:50.840 And I just want to say that as I got more into this field, 20:50.840 --> 20:53.440 I realized that cancer is of course a terrible disease, 20:53.440 --> 20:55.600 but there is really a whole slew 20:55.600 --> 20:58.240 of terrible diseases out there, 20:58.240 --> 21:01.560 like neurodegenerative diseases and others. 21:01.560 --> 21:04.480 So, of course, a lot of us are fixated on cancer 21:04.480 --> 21:06.440 just because it's so prevalent in our society, 21:06.440 --> 21:08.560 and you see these people, but there are a lot of patients 21:08.560 --> 21:10.320 with neurodegenerative diseases 21:10.320 --> 21:12.520 and the kind of aging diseases 21:12.520 --> 21:17.120 that we still don't have a good solution for. 
21:17.120 --> 21:22.120 And I felt as a computer scientist, 21:22.120 --> 21:24.920 we kind of decided that it's other people's job 21:24.920 --> 21:26.360 to treat these diseases, 21:27.360 --> 21:29.760 because it's like traditionally people in biology 21:29.760 --> 21:33.720 or in chemistry or MDs are the ones who are thinking about it. 21:34.720 --> 21:36.720 And after I kind of started paying attention, 21:36.720 --> 21:39.720 I think that it's really a wrong assumption, 21:39.720 --> 21:42.280 and we all need to join the battle. 21:42.280 --> 21:45.880 So it seems like in cancer specifically 21:45.880 --> 21:48.480 there are a lot of ways that machine learning can help. 21:48.480 --> 21:51.240 So what's the role of machine learning 21:51.240 --> 21:54.160 in the diagnosis of cancer? 21:55.320 --> 21:57.280 So for many cancers today, 21:57.280 --> 22:02.280 we really don't know what is your likelihood to get cancer. 22:03.520 --> 22:06.360 And for the vast majority of patients, 22:06.360 --> 22:08.000 especially the younger patients, 22:08.000 --> 22:09.640 it really comes as a surprise. 22:09.640 --> 22:11.200 Like for instance, for breast cancer, 22:11.200 --> 22:13.920 80% of the patients are the first in their families, 22:13.920 --> 22:15.440 like me. 22:15.440 --> 22:18.520 And I never thought that I had any increased risk 22:18.520 --> 22:20.880 because nobody had it in my family, 22:20.880 --> 22:22.360 and for some reason in my head, 22:22.360 --> 22:24.880 it was kind of an inherited disease. 22:26.640 --> 22:28.440 But even if I would pay attention, 22:28.440 --> 22:30.280 the models, 22:30.280 --> 22:32.440 there are very simplistic statistical models 22:32.440 --> 22:34.600 that are currently used in clinical practice, 22:34.600 --> 22:37.520 really don't give you an answer, so you don't know. 
22:37.520 --> 22:40.440 And the same is true for pancreatic cancer, 22:40.440 --> 22:45.440 the same is true for non-smoking lung cancer, and many others. 22:45.440 --> 22:47.400 So what machine learning can do here 22:47.400 --> 22:51.680 is utilize all this data to tell us early 22:51.680 --> 22:53.200 who is likely to be susceptible, 22:53.200 --> 22:56.040 using all the information that is already there, 22:56.040 --> 23:00.040 be it imaging, be it your other tests, 23:00.040 --> 23:04.880 and eventually liquid biopsies and others, 23:04.880 --> 23:08.240 where the signal itself is not sufficiently strong 23:08.240 --> 23:11.360 for the human eye to do good discrimination, 23:11.360 --> 23:12.960 because the signal may be weak, 23:12.960 --> 23:15.680 but by combining many sources, 23:15.680 --> 23:18.120 a machine which is trained on large volumes of data 23:18.120 --> 23:20.680 can really detect it early. 23:20.680 --> 23:22.480 And that's what we've seen with breast cancer, 23:22.480 --> 23:25.920 and people are reporting it in other diseases as well. 23:25.920 --> 23:28.240 That really boils down to data, right, 23:28.240 --> 23:30.960 and the different kinds of sources of data. 23:30.960 --> 23:33.720 And you mentioned regulatory challenges. 23:33.720 --> 23:35.160 So what are the challenges 23:35.160 --> 23:39.240 in gathering large data sets in this space? 23:40.840 --> 23:42.640 Again, another great question. 23:42.640 --> 23:44.320 So after I decided 23:44.320 --> 23:48.720 that I wanted to work on it, it took me two years to get access to data. 23:48.720 --> 23:51.360 And did you get, like, any significant amount? 23:51.360 --> 23:53.560 Like right now in this country, 23:53.560 --> 23:58.040 there is no publicly available data set of modern mammograms 23:58.040 --> 23:59.560 that you can just go on your computer, 23:59.560 --> 24:01.840 sign a document, and get. 24:01.840 --> 24:03.320 It just doesn't exist. 
24:03.320 --> 24:06.880 I mean, obviously every hospital has its own collection 24:06.880 --> 24:10.160 of mammograms, there are data that came out 24:10.160 --> 24:11.320 of clinical trials. 24:11.320 --> 24:13.200 What we're talking about here is a computer scientist 24:13.200 --> 24:17.120 who just wants to run his or her model 24:17.120 --> 24:19.040 and see how it works. 24:19.040 --> 24:22.880 This data, like ImageNet, doesn't exist. 24:22.880 --> 24:27.880 And there is a data set which is called the Florida data set, 24:28.640 --> 24:30.880 which is film mammograms from the 90s, 24:30.880 --> 24:32.440 which is totally not representative 24:32.440 --> 24:33.880 of the current developments; 24:33.880 --> 24:35.800 whatever you're learning on them doesn't scale up. 24:35.800 --> 24:39.320 This is the only resource that is available. 24:39.320 --> 24:44.320 And today there are many agencies that govern access to data, 24:44.440 --> 24:46.280 like the hospital holds your data 24:46.280 --> 24:49.240 and the hospital decides whether they would give it 24:49.240 --> 24:52.320 to the researcher to work with this data or not. 24:52.320 --> 24:54.160 The individual hospital? 24:54.160 --> 24:57.160 Yeah, I mean, the hospital may, you know, 24:57.160 --> 24:59.200 assuming that you're doing a research collaboration, 24:59.200 --> 25:01.960 you can submit, you know, 25:01.960 --> 25:05.040 there is a proper approval process guided by the IRB 25:05.040 --> 25:07.800 and if you go through all the processes, 25:07.800 --> 25:10.120 you can eventually get access to the data. 25:10.120 --> 25:13.520 But as you yourself know, in our AI community 25:13.520 --> 25:14.680 there are not that many people 25:14.680 --> 25:16.560 who actually ever got access to data 25:16.560 --> 25:20.200 because it's a very challenging process. 25:20.200 --> 25:22.720 And sorry, just a quick comment, 25:22.720 --> 25:25.760 MGH or any kind of hospital, 25:25.760 --> 25:28.280 are they scanning the data?
25:28.280 --> 25:29.680 Are they digitally storing it? 25:29.680 --> 25:31.560 Oh, it is already digitally stored. 25:31.560 --> 25:34.120 You don't need to do any extra processing steps. 25:34.120 --> 25:36.320 It's already there in the right format. 25:36.320 --> 25:39.800 Is that right now there are a lot of issues 25:39.800 --> 25:41.200 that govern access to the data 25:41.200 --> 25:46.200 because the hospital is legally responsible for the data. 25:46.280 --> 25:51.120 And, you know, they have a lot to lose 25:51.120 --> 25:53.200 if they give the data to the wrong person, 25:53.200 --> 25:55.360 but they may not have a lot to gain 25:55.360 --> 25:58.680 if they give it as a hospital, as a legal entity 25:59.920 --> 26:00.760 as given it to you. 26:00.760 --> 26:02.800 And the way, you know, what I would mention 26:02.800 --> 26:04.840 happening in the future is the same thing 26:04.840 --> 26:06.840 that happens when you're getting your driving license. 26:06.840 --> 26:09.880 You can decide whether you want to donate your organs. 26:09.880 --> 26:11.600 So you can imagine that whenever a person 26:11.600 --> 26:13.080 goes to the hospital, 26:14.280 --> 26:18.080 it should be easy for them to donate their data for research 26:18.080 --> 26:19.480 and it can be different kind of, 26:19.480 --> 26:21.320 do they only give you a test results 26:21.320 --> 26:25.880 or only imaging data or the whole medical record? 26:27.080 --> 26:29.000 Because at the end, 26:30.560 --> 26:33.880 we all will benefit from all this insights. 26:33.880 --> 26:36.080 And it's only gonna say, I want to keep my data private, 26:36.080 --> 26:38.800 but I would really love to get it from other people 26:38.800 --> 26:40.760 because other people are thinking the same way. 
26:40.760 --> 26:45.760 So if there is a mechanism to do this donation 26:45.760 --> 26:48.040 and the patient has an ability to say 26:48.040 --> 26:50.840 how they want to use their data for research, 26:50.840 --> 26:54.120 it would be really a game changer. 26:54.120 --> 26:56.480 People, when they think about this problem, 26:56.480 --> 26:58.480 it depends on the population, 26:58.480 --> 27:00.160 depends on the demographics, 27:00.160 --> 27:02.400 but there are some privacy concerns. 27:02.400 --> 27:04.440 Generally, not just medical data, 27:04.440 --> 27:05.880 just any kind of data. 27:05.880 --> 27:07.840 It's what you said, my data, 27:07.840 --> 27:09.600 it should belong kinda to me, 27:09.600 --> 27:11.520 I'm worried how it's gonna be misused. 27:12.520 --> 27:15.640 How do we alleviate those concerns? 27:17.080 --> 27:19.440 Because that seems like a problem that needs to be, 27:19.440 --> 27:23.000 that problem of trust, of transparency, needs to be solved 27:23.000 --> 27:27.240 before we build large data sets that help detect cancer, 27:27.240 --> 27:30.160 help save those very people in the future. 27:30.160 --> 27:31.920 So I'm seeing two things that could be done: 27:31.920 --> 27:34.480 there are technical solutions 27:34.480 --> 27:38.240 and there are societal solutions. 27:38.240 --> 27:40.200 So on the technical end, 27:41.440 --> 27:46.440 we today have the ability to improve de-identification, 27:48.120 --> 27:49.720 like for instance, for imaging, 27:49.720 --> 27:54.720 it's, you know, for imaging, you can do it pretty well. 27:55.600 --> 27:56.760 What's de-identification? 27:56.760 --> 27:58.520 De-identification, sorry, de-identification: 27:58.520 --> 27:59.840 removing the identifying information, 27:59.840 --> 28:02.200 removing the names of the people.
28:02.200 --> 28:04.840 There are other data, like if it is raw text, 28:04.840 --> 28:08.200 you cannot really achieve 99.9%, 28:08.200 --> 28:10.080 but there are all these techniques, 28:10.080 --> 28:12.480 some of which are actually developed at MIT, 28:12.480 --> 28:15.440 for how you can do learning on encoded data, 28:15.440 --> 28:17.400 where you locally encode the image, 28:17.400 --> 28:19.040 you train a network 28:19.040 --> 28:22.440 which only works on the encoded images 28:22.440 --> 28:24.960 and then you send the outcome back to the hospital 28:24.960 --> 28:26.600 and you can open it up. 28:26.600 --> 28:28.040 So those are the technical solutions. 28:28.040 --> 28:30.720 There are a lot of people who are working in this space 28:30.720 --> 28:33.320 where the learning happens in the encoded form. 28:33.320 --> 28:36.160 We are still early, 28:36.160 --> 28:39.240 but this is an interesting research area 28:39.240 --> 28:41.880 where I think we'll make more progress. 28:43.360 --> 28:45.640 There is a lot of work in the natural language processing 28:45.640 --> 28:48.600 community on how to do de-identification better. 28:50.400 --> 28:54.040 But even today, there are already a lot of data 28:54.040 --> 28:55.920 which can be de-identified perfectly, 28:55.920 --> 28:58.760 like your test results, for instance, correct, 28:58.760 --> 29:00.040 where you can just, you know, remove 29:00.040 --> 29:01.000 the name of the patient; 29:01.000 --> 29:04.320 you just want to extract the part with the numbers. 29:04.320 --> 29:07.480 The big problem here is, again, 29:08.440 --> 29:10.440 hospitals don't see much incentive 29:10.440 --> 29:12.640 to give this data away on one hand, 29:12.640 --> 29:14.200 and then there is the general concern.
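The rule-based de-identification of structured test results she describes, stripping the patient's name while keeping the part with the numbers, can be sketched like this. A toy illustration only: the record layout and field names are invented, and real de-identification systems handle far more identifier types and free-text ambiguity.

```python
import re

def deidentify(report: str) -> str:
    """Drop lines that carry identity rather than measurements."""
    kept = []
    for line in report.splitlines():
        # Invented identifier fields for illustration.
        if re.match(r"\s*(Patient|Name|DOB|MRN)\s*:", line, re.IGNORECASE):
            continue
        kept.append(line)
    return "\n".join(kept)

record = """Patient: Jane Doe
MRN: 12345
Glucose: 95 mg/dL
HbA1c: 5.4 %"""
clean = deidentify(record)  # keeps only the measurement lines
```

As she notes, this style of rule works well on structured results; free text is the hard case the NLP community works on.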
29:14.200 --> 29:17.720 Now, when I'm talking about societal benefits 29:17.720 --> 29:19.640 and about the education, 29:19.640 --> 29:23.640 the public needs to understand 29:23.640 --> 29:27.800 and I think that there are situations 29:27.800 --> 29:29.360 that I still remember myself 29:29.360 --> 29:31.520 when I really needed an answer. 29:31.520 --> 29:33.280 I had to make a choice 29:33.280 --> 29:35.200 and there was no information to make a choice. 29:35.200 --> 29:36.640 You're just guessing. 29:36.640 --> 29:38.760 And at that moment, 29:38.760 --> 29:41.040 you feel that your life is at the stake, 29:41.040 --> 29:44.760 but you just don't have information to make the choice. 29:44.760 --> 29:48.680 And many times when I give talks, 29:48.680 --> 29:51.240 I get emails from women who say, 29:51.240 --> 29:52.760 you know, I'm in this situation, 29:52.760 --> 29:54.160 can you please run statistic 29:54.160 --> 29:55.920 and see what are the outcomes? 29:57.040 --> 30:00.000 We get almost every week a mammogram 30:00.000 --> 30:02.520 that comes by mail to my office at MIT. 30:02.520 --> 30:06.200 I'm serious that people ask to run 30:06.200 --> 30:07.840 because they need to make, you know, 30:07.840 --> 30:10.000 life changing decisions. 30:10.000 --> 30:11.320 And of course, you know, 30:11.320 --> 30:12.920 I'm not planning to open a clinic here, 30:12.920 --> 30:16.600 but we do run and give them the results for their doctors. 30:16.600 --> 30:20.040 But the point that I'm trying to make 30:20.040 --> 30:23.760 that we all at some point or our loved ones 30:23.760 --> 30:26.600 will be in the situation where you need information 30:26.600 --> 30:28.840 to make the best choice. 30:28.840 --> 30:31.840 And if this information is not available, 30:31.840 --> 30:35.080 you would feel vulnerable and unprotected. 30:35.080 --> 30:36.880 And then the question is, you know, 30:36.880 --> 30:37.840 what do I care more? 
30:37.840 --> 30:40.320 Because at the end everything is a trade off, correct? 30:40.320 --> 30:41.640 Yeah, exactly. 30:41.640 --> 30:43.080 Just out of curiosity, 30:43.080 --> 30:45.560 what it seems like one possible solution, 30:45.560 --> 30:47.160 I'd like to see what you think of it 30:47.160 --> 30:50.680 based on what you just said, 30:50.680 --> 30:52.480 based on wanting to know answers 30:52.480 --> 30:55.040 for when you're yourself in that situation. 30:55.040 --> 30:58.400 Is it possible for patients to own their data 30:58.400 --> 31:01.040 as opposed to hospitals owning their data? 31:01.040 --> 31:02.280 Of course, theoretically, 31:02.280 --> 31:04.120 I guess patients own their data, 31:04.120 --> 31:06.640 but can you walk out there with a USB stick 31:07.600 --> 31:10.600 containing everything or upload it to the cloud 31:10.600 --> 31:13.400 where a company, you know, 31:13.400 --> 31:15.680 I remember Microsoft had a service, 31:15.680 --> 31:17.760 like I try, I was really excited about 31:17.760 --> 31:19.240 and Google Health was there. 31:19.240 --> 31:21.880 I tried to give, I was excited about it. 31:21.880 --> 31:24.760 Basically companies helping you upload your data 31:24.760 --> 31:27.920 to the cloud so that you can move from hospital to hospital 31:27.920 --> 31:29.200 from doctor to doctor. 31:29.200 --> 31:32.680 Do you see a promise of that kind of possibility? 31:32.680 --> 31:34.640 I absolutely think this is, you know, 31:34.640 --> 31:38.160 the right way to exchange the data. 31:38.160 --> 31:41.680 I don't know now who's the biggest player in this field, 31:41.680 --> 31:46.280 but I can clearly see that even for totally selfish health 31:46.280 --> 31:49.280 reasons, when you are going to a new facility 31:49.280 --> 31:52.600 and many of us are sent to some specialized treatment, 31:52.600 --> 31:55.720 they don't easily have access to your data. 
31:55.720 --> 31:58.960 And today, you know, people who want to send their 31:58.960 --> 32:00.680 mammogram need to go to their hospital, 32:00.680 --> 32:01.760 find some small office 32:01.760 --> 32:04.760 which gives them the CD, and they ship it as a CD. 32:04.760 --> 32:08.280 So you can imagine we're looking at a kind of decades old 32:08.280 --> 32:10.080 mechanism of data exchange. 32:10.080 --> 32:15.080 So I definitely think this is an area where hopefully 32:15.600 --> 32:20.360 all the right regulatory and technical forces will align 32:20.360 --> 32:23.200 and we will see it actually implemented. 32:23.200 --> 32:25.720 It's sad because, unfortunately, 32:25.720 --> 32:28.400 and I need to research why that happened, 32:28.400 --> 32:32.080 but I'm pretty sure Google Health and Microsoft HealthVault, 32:32.080 --> 32:34.640 or whatever it's called, both closed down, 32:34.640 --> 32:37.560 which means that there was either regulatory pressure 32:37.560 --> 32:39.080 or there's no business case 32:39.080 --> 32:41.760 or there are challenges from hospitals, 32:41.760 --> 32:43.240 which is very disappointing. 32:43.240 --> 32:46.480 So when you say you don't know who the biggest players are, 32:46.480 --> 32:50.520 the two biggest that I was aware of closed their doors. 32:50.520 --> 32:53.120 So I'm hoping, I'd love to see why, 32:53.120 --> 32:54.760 and I'd love to see who else can come up. 32:54.760 --> 32:59.600 It seems like one of those Elon Musk style problems 32:59.600 --> 33:01.280 that obviously need to be solved 33:01.280 --> 33:02.360 and somebody needs to step up 33:02.360 --> 33:07.360 and actually do this large scale data collection.
33:07.360 --> 33:09.600 So I know there is an initiative in Massachusetts, 33:09.600 --> 33:11.720 a thing actually led by the governor 33:11.720 --> 33:15.440 to try to create this kind of health exchange system 33:15.440 --> 33:17.840 where at least to help people who are kind of when you show up 33:17.840 --> 33:20.160 in emergency room and there is no information 33:20.160 --> 33:22.520 about what are your allergies and other things. 33:23.480 --> 33:26.080 So I don't know how far it will go, 33:26.080 --> 33:30.280 but another thing that you said and I find it very interesting 33:30.280 --> 33:33.760 is actually who are the successful players in this space 33:33.760 --> 33:36.080 and the whole implementation. 33:36.080 --> 33:37.240 How does it go? 33:37.240 --> 33:40.280 To me, it is from the anthropological perspective, 33:40.280 --> 33:44.640 it's more fascinating that AI that today goes in health care. 33:44.640 --> 33:49.640 We've seen so many attempts and so very little successes 33:50.360 --> 33:54.200 and it's interesting to understand that I by no means 33:54.200 --> 33:58.280 have knowledge to assess why we are in the position 33:58.280 --> 33:59.600 where we are. 33:59.600 --> 34:02.920 Yeah, it's interesting because data is really fuel 34:02.920 --> 34:04.960 for a lot of successful applications 34:04.960 --> 34:08.480 and when that data requires regulatory approval 34:08.480 --> 34:12.400 like the FDA or any kind of approval, 34:14.160 --> 34:16.920 it seems that the computer scientists are not quite there yet 34:16.920 --> 34:18.840 in being able to play the regulatory game, 34:18.840 --> 34:21.200 understanding the fundamentals of it. 34:21.200 --> 34:26.200 I think that in many cases when even people do have data, 34:26.480 --> 34:31.480 we still don't know what exactly do you need to demonstrate 34:31.480 --> 34:35.040 to change the standard of care. 34:35.040 --> 34:40.040 Let me give you an example related to my breast cancer research. 
34:41.000 --> 34:45.400 So in traditional breast cancer risk assessment, 34:45.400 --> 34:47.040 there is something called density, 34:47.040 --> 34:50.400 which determines the likelihood of a woman to get cancer, 34:50.400 --> 34:52.680 and this pretty much says how much white 34:52.680 --> 34:54.120 you see on the mammogram. 34:54.120 --> 34:58.840 The whiter it is, the more likely the tissue is dense. 34:58.840 --> 35:02.640 And the idea behind density, 35:02.640 --> 35:03.560 it's not a bad idea: 35:03.560 --> 35:07.960 in 1967, a radiologist called Wolfe decided to look back 35:07.960 --> 35:09.680 at women who were diagnosed 35:09.680 --> 35:12.320 and see what is special in their images. 35:12.320 --> 35:14.600 Can we look back and say that they were likely to develop it? 35:14.600 --> 35:16.080 So he came up with some patterns, 35:16.080 --> 35:20.520 and it was the best that his human eye could identify; 35:20.520 --> 35:24.160 then it was kind of formalized and coded into four categories, 35:24.160 --> 35:26.840 and that's what we are using today. 35:26.840 --> 35:31.840 And today this density assessment is actually a federal law 35:32.240 --> 35:36.120 from 2019, approved by President Trump 35:36.120 --> 35:38.640 and by the previous FDA commissioner, 35:40.040 --> 35:43.560 where women are supposed to be advised by their providers 35:43.560 --> 35:45.040 if they have high density, 35:45.040 --> 35:47.200 putting them into a higher risk category, 35:47.200 --> 35:51.240 and in some states you can actually get supplementary screening 35:51.240 --> 35:53.640 paid by your insurance because you are in this category. 35:53.640 --> 35:56.720 Now you can ask how much science we have behind it, 35:56.720 --> 36:00.800 whether biological science or epidemiological evidence. 36:00.800 --> 36:05.080 So it turns out that between 40 and 50% of women 36:05.080 --> 36:06.600 have dense breasts.
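The "how much white do you see" heuristic she describes can be caricatured in a few lines: threshold the image, compute the fraction of bright (dense) pixels, and bin that fraction into four categories. The threshold and cut-points below are invented for illustration; clinical density assessment is a radiologist's judgment over standardized categories, not this formula.

```python
def density_category(pixels, bright_threshold=0.6):
    """Toy percent-density estimate: fraction of 'white' pixels,
    binned into four categories loosely analogous to the four
    density classes mentioned in the conversation."""
    flat = [p for row in pixels for p in row]
    frac = sum(p >= bright_threshold for p in flat) / len(flat)
    # Invented cut-points, for illustration only.
    if frac < 0.25:
        return 1, frac   # mostly fatty
    if frac < 0.50:
        return 2, frac   # scattered density
    if frac < 0.75:
        return 3, frac   # heterogeneously dense
    return 4, frac       # extremely dense

# A tiny fake "mammogram" with pixel intensities in [0, 1].
image = [[0.10, 0.70, 0.80],
         [0.20, 0.90, 0.30],
         [0.65, 0.10, 0.20]]
cat, frac = density_category(image)
```

The point of the contrast later in the conversation is that a deep model replaces this single hand-coded summary statistic with patterns learned directly from outcomes.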
36:06.600 --> 36:11.040 So above 40% of patients are coming out of their screening 36:11.040 --> 36:14.960 and somebody tells them you are in high risk. 36:14.960 --> 36:16.800 Now what exactly does it mean 36:16.800 --> 36:19.520 if you as half of the population in high risk 36:19.520 --> 36:21.960 gets from saying maybe I'm not, 36:21.960 --> 36:23.600 or what do I really need to do with it? 36:23.600 --> 36:28.280 Because the system doesn't provide me a lot of the solutions 36:28.280 --> 36:30.080 because there are so many people like me, 36:30.080 --> 36:34.560 we cannot really provide very expensive solutions for them. 36:34.560 --> 36:38.680 And the reason this whole density became this big deal 36:38.680 --> 36:40.720 it's actually advocated by the patients 36:40.720 --> 36:43.560 who felt very unprotected because many women 36:43.560 --> 36:46.200 when did the mammograms which were normal 36:46.200 --> 36:49.400 and then it turns out that they already had cancer, 36:49.400 --> 36:50.520 quite developed cancer. 36:50.520 --> 36:54.320 So they didn't have a way to know who is really at risk 36:54.320 --> 36:56.240 and what is the likelihood that when the doctor tells you 36:56.240 --> 36:58.000 you're okay, you are not okay. 36:58.000 --> 37:02.080 So at the time and it was 15 years ago, 37:02.080 --> 37:06.760 this maybe was the best piece of science that we had 37:06.760 --> 37:12.120 and it took quite 15, 16 years to make it federal law. 37:12.120 --> 37:15.600 But now this is a standard. 37:15.600 --> 37:17.560 Now with a deep learning model 37:17.560 --> 37:19.600 we can so much more accurately predict 37:19.600 --> 37:21.560 who is gonna develop breast cancer 37:21.560 --> 37:23.680 just because you're trained on a logical thing. 
37:23.680 --> 37:26.040 And instead of describing how much white 37:26.040 --> 37:27.360 and what kind of white, the machine 37:27.360 --> 37:30.120 can systematically identify the patterns, 37:30.120 --> 37:32.760 which was the original idea behind the thinking 37:32.760 --> 37:33.680 of that radiologist; 37:33.680 --> 37:35.680 machines can do it much more systematically 37:35.680 --> 37:38.240 and predict the risk when you're training the machine 37:38.240 --> 37:42.080 to look at the image and to say what the risk is in one to five years. 37:42.080 --> 37:45.000 Now you can ask me how long it will take 37:45.000 --> 37:46.400 to substitute this density, 37:46.400 --> 37:48.560 which is broadly used across the country 37:48.560 --> 37:53.560 and really is not helping, and to bring in these new models. 37:54.320 --> 37:56.640 And I would say it's not a matter of the algorithm. 37:56.640 --> 37:58.720 The algorithm is already orders of magnitude better 37:58.720 --> 38:00.360 than what is currently in practice. 38:00.360 --> 38:02.440 I think it's really the question of 38:02.440 --> 38:04.320 who you need to convince. 38:04.320 --> 38:07.400 How many hospitals do you need to run the experiment? 38:07.400 --> 38:11.560 You know, all this mechanism of adoption, 38:11.560 --> 38:15.120 and how do you explain to patients 38:15.120 --> 38:17.520 and to women across the country 38:17.520 --> 38:20.400 that this is really a better measure? 38:20.400 --> 38:22.680 And again, I don't think it's an AI question. 38:22.680 --> 38:25.880 We can work more and make the algorithm even better, 38:25.880 --> 38:29.240 but I don't think that this is the current, you know, 38:29.240 --> 38:32.000 the barrier; the barrier is really this other piece 38:32.000 --> 38:35.200 that for some reason is not really explored. 38:35.200 --> 38:36.800 It's like an anthropological piece. 38:36.800 --> 38:39.760 And coming back to your question about books, 38:39.760 --> 38:42.920 there is a book that I'm reading.
38:42.920 --> 38:47.920 It's called An American Sickness by Elisabeth Rosenthal 38:48.240 --> 38:51.560 and I got this book from my clinical collaborator, 38:51.560 --> 38:53.080 Dr. Connie Lehman. 38:53.080 --> 38:54.800 And I said, I know everything that I need to know 38:54.800 --> 38:56.000 about the American health system, 38:56.000 --> 38:59.200 but you know, every page doesn't fail to surprise me. 38:59.200 --> 39:03.080 And I think that there are a lot of interesting 39:03.080 --> 39:06.840 and really deep lessons for people like us 39:06.840 --> 39:09.600 from computer science who are coming into this field 39:09.600 --> 39:13.640 to really understand how complex the system of incentives 39:13.640 --> 39:17.160 in the system is, and to understand how you really need 39:17.160 --> 39:18.760 to play to drive adoption. 39:19.720 --> 39:21.120 You just said it's complex, 39:21.120 --> 39:23.960 but if we're trying to simplify it, 39:23.960 --> 39:27.360 who do you think most likely would be successful 39:27.360 --> 39:29.480 if we push on this group of people? 39:29.480 --> 39:30.720 Is it the doctors? 39:30.720 --> 39:31.760 Is it the hospitals? 39:31.760 --> 39:34.240 Is it the governments or policy makers? 39:34.240 --> 39:37.240 Is it the individual patients, consumers, 39:37.240 --> 39:42.240 who need to be inspired to most likely lead to adoption? 39:45.200 --> 39:47.120 Or is there no simple answer? 39:47.120 --> 39:48.280 There's no simple answer, 39:48.280 --> 39:52.000 but I think there are a lot of good people in the medical system 39:52.000 --> 39:55.240 who do want to make a change. 39:56.520 --> 40:01.520 And I think a lot of power will come from us as consumers, 40:01.600 --> 40:04.320 because we all are consumers or future consumers 40:04.320 --> 40:06.560 of healthcare services.
40:06.560 --> 40:11.560 And I think we can do so much more 40:12.080 --> 40:15.560 in explaining the potential and not in the hype terms 40:15.560 --> 40:17.920 and not saying that we're now cured or Alzheimer 40:17.920 --> 40:20.560 and I'm really sick of reading this kind of articles 40:20.560 --> 40:22.120 which make these claims. 40:22.120 --> 40:24.800 But really to show with some examples 40:24.800 --> 40:26.520 what this implementation does 40:26.520 --> 40:29.080 and how it changes the care. 40:29.080 --> 40:30.040 Because I can't imagine, 40:30.040 --> 40:33.240 it doesn't matter what kind of politician it is, 40:33.240 --> 40:35.240 we all are susceptible to these diseases. 40:35.240 --> 40:37.800 There is no one who is free. 40:37.800 --> 40:41.080 And eventually, we all are humans 40:41.080 --> 40:44.880 and we are looking for a way to alleviate the suffering. 40:44.880 --> 40:47.320 And this is one possible way 40:47.320 --> 40:49.360 where we currently are underutilizing, 40:49.360 --> 40:50.960 which I think can help. 40:51.880 --> 40:55.120 So it sounds like the biggest problems are outside of AI 40:55.120 --> 40:58.000 in terms of the biggest impact at this point. 40:58.000 --> 41:00.440 But are there any open problems 41:00.440 --> 41:03.800 in the application of ML to oncology in general? 41:03.800 --> 41:05.400 So improving the detection 41:05.400 --> 41:07.600 or any other creative methods, 41:07.600 --> 41:09.640 whether it's on the detection segmentations 41:09.640 --> 41:11.800 or the vision perception side 41:11.800 --> 41:16.320 or some other clever of inference. 41:16.320 --> 41:18.560 Yeah, what in general in your view 41:18.560 --> 41:20.320 are the open problems in this space? 
41:20.320 --> 41:22.480 So I just want to mention that besides detection, 41:22.480 --> 41:24.880 another area where I am kind of quite active 41:24.880 --> 41:28.640 and I think it's really an increasingly important area 41:28.640 --> 41:30.980 in healthcare is drug design. 41:30.980 --> 41:32.820 Absolutely. 41:32.820 --> 41:36.940 Because it's fine if you detect something early, 41:36.940 --> 41:41.140 but you still need to get drugs, 41:41.140 --> 41:43.900 and new drugs, for these conditions. 41:43.900 --> 41:48.300 And today, in all of drug design, ML is nonexistent. 41:48.300 --> 41:53.020 We don't have any drug that was developed by an ML model, 41:53.020 --> 41:54.940 or even not developed, 41:54.940 --> 41:56.220 but at least, you know, 41:56.220 --> 41:59.300 where an ML model plays some significant role. 41:59.300 --> 42:03.300 I think this area, with all the new ability 42:03.300 --> 42:05.820 to generate molecules with desired properties 42:05.820 --> 42:10.820 and to do in silico screening, is really a big open area. 42:11.300 --> 42:12.660 To be totally honest with you, 42:12.660 --> 42:14.940 when we are doing diagnostics and imaging, 42:14.940 --> 42:17.300 we are primarily taking the ideas that were developed 42:17.300 --> 42:20.500 for other areas and applying them with some adaptation. 42:20.500 --> 42:24.700 The area of drug design 42:24.700 --> 42:27.980 is really a technically interesting and exciting area. 42:27.980 --> 42:30.380 You need to work a lot with graphs 42:30.380 --> 42:34.620 and capture various 3D properties. 42:34.620 --> 42:37.420 There are lots and lots of opportunities 42:37.420 --> 42:39.820 to be technically creative. 42:39.820 --> 42:44.820 And I think there are a lot of open questions in this area.
42:46.780 --> 42:48.820 We're already getting a lot of successes 42:48.820 --> 42:52.700 even with the kind of the first generation of this models, 42:52.700 --> 42:56.540 but there is much more new creative things that you can do. 42:56.540 --> 43:01.540 And what's very nice to see is actually the more powerful, 43:03.060 --> 43:05.420 the more interesting models actually do better. 43:05.420 --> 43:10.420 So there is a place to innovate in machine learning 43:11.300 --> 43:12.500 in this area. 43:13.900 --> 43:16.820 And some of these techniques are really unique too, 43:16.820 --> 43:19.620 let's say to graph generation and other things. 43:19.620 --> 43:20.820 So... 43:20.820 --> 43:23.980 What just to interrupt really quick, I'm sorry. 43:23.980 --> 43:28.980 Graph generation or graphs, drug discovery in general. 43:30.660 --> 43:31.980 How do you discover a drug? 43:31.980 --> 43:33.340 Is this chemistry? 43:33.340 --> 43:37.500 Is this trying to predict different chemical reactions? 43:37.500 --> 43:39.620 Or is it some kind of... 43:39.620 --> 43:42.100 What do graphs even represent in this space? 43:42.100 --> 43:43.940 Oh, sorry. 43:43.940 --> 43:45.300 And what's a drug? 43:45.300 --> 43:47.820 Okay, so let's say you think that there are many different 43:47.820 --> 43:49.580 types of drugs, but let's say you're going to talk 43:49.580 --> 43:51.900 about small molecules because I think today, 43:51.900 --> 43:53.580 the majority of drugs are small molecules. 43:53.580 --> 43:55.020 So small molecule is a graph. 43:55.020 --> 44:00.020 The molecule is just where the node in the graph is an atom 44:00.060 --> 44:01.460 and then you have the bond. 44:01.460 --> 44:03.220 So it's really a graph representation 44:03.220 --> 44:05.540 if you look at it in 2D, correct? 44:05.540 --> 44:07.460 You can do it 3D, but let's say, well, 44:07.460 --> 44:09.540 let's keep it simple and stick in 2D. 
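The "a small molecule is a graph" picture she gives, atoms as nodes and bonds as edges, can be made concrete with a minimal adjacency-list sketch. The element symbols and bond orders below are picked by hand for a single toy molecule; in practice, libraries such as RDKit build these graphs for you from SMILES strings.

```python
# Ethanol (CH3-CH2-OH), heavy atoms only:
# nodes are atoms, edges are bonds with an order.
atoms = {0: "C", 1: "C", 2: "O"}    # node id -> element
bonds = [(0, 1, 1), (1, 2, 1)]      # (atom, atom, bond order)

def neighbors(atom_id):
    """Adjacency lookup over the undirected bond list."""
    out = []
    for a, b, order in bonds:
        if a == atom_id:
            out.append((b, order))
        elif b == atom_id:
            out.append((a, order))
    return out

# Degree of each atom, i.e. how many bonds it participates in.
degree = {i: len(neighbors(i)) for i in atoms}
```

Everything graph-based she mentions later, message passing, graph generation, builds on exactly this node-and-edge view (in 2D here; 3D adds coordinates per atom).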
44:11.500 --> 44:14.740 So pretty much, my understanding today of 44:14.740 --> 44:17.740 how it is done at scale in the companies is: 44:17.740 --> 44:20.220 without machine learning, 44:20.220 --> 44:22.100 you have high throughput screening. 44:22.100 --> 44:24.540 So you know that you are interested in getting a certain 44:24.540 --> 44:26.580 biological activity of the compounds. 44:26.580 --> 44:28.860 So you scan a lot of compounds, 44:28.860 --> 44:30.700 like maybe hundreds of thousands, 44:30.700 --> 44:32.980 some really big number of compounds. 44:32.980 --> 44:36.100 You identify some compounds which have the right activity, 44:36.100 --> 44:39.260 and then at this point, the chemists come in 44:39.260 --> 44:44.260 and they're now trying to optimize this original hit 44:44.340 --> 44:46.340 for different properties that you want it to have: 44:46.340 --> 44:49.100 maybe you want it soluble, you want to decrease toxicity, 44:49.100 --> 44:51.660 you want to decrease the side effects. 44:51.660 --> 44:54.060 Are those, sorry to interrupt again, 44:54.060 --> 44:55.540 can that be done in simulation 44:55.540 --> 44:57.700 or just by looking at the molecules, 44:57.700 --> 44:59.860 or do you need to actually run reactions 44:59.860 --> 45:02.180 in real labs with lab coats and stuff? 45:02.180 --> 45:04.060 So when you do high throughput screening, 45:04.060 --> 45:07.060 you really do screening, it's in the lab. 45:07.060 --> 45:09.180 It's really the lab screening, 45:09.180 --> 45:10.980 you screen the molecules, correct? 45:10.980 --> 45:12.540 I don't know what screening is. 45:12.540 --> 45:15.100 In screening, you just check them for a certain property. 45:15.100 --> 45:17.340 Like in the physical space, in the physical world, 45:17.340 --> 45:18.780 like actually there's a machine probably 45:18.780 --> 45:21.460 that's actually running the reaction. 45:21.460 --> 45:22.900 Actually running the reactions, yeah.
45:22.900 --> 45:25.420 So there is a process where you can run 45:25.420 --> 45:26.860 and that's why it's called high throughput, 45:26.860 --> 45:29.580 you know, it becomes cheaper and faster 45:29.580 --> 45:33.820 to do it on very big number of molecules. 45:33.820 --> 45:38.340 You run the screening, you identify potential, 45:38.340 --> 45:40.300 you know, potential good starts 45:40.300 --> 45:42.340 and then where the chemists come in 45:42.340 --> 45:44.060 who, you know, have done it many times 45:44.060 --> 45:45.900 and then they can try to look at it 45:45.900 --> 45:48.260 and say, how can you change the molecule 45:48.260 --> 45:53.260 to get the desired profile in terms of all other properties? 45:53.460 --> 45:56.500 So maybe how do I make it more bioactive and so on? 45:56.500 --> 45:59.460 And there, you know, the creativity of the chemists 45:59.460 --> 46:03.980 really is the one that determines the success 46:03.980 --> 46:07.460 of this design because again, 46:07.460 --> 46:10.180 they have a lot of domain knowledge of, you know, 46:10.180 --> 46:12.900 what works, how do you decrease the CCT and so on? 46:12.900 --> 46:15.020 And that's what they do. 46:15.020 --> 46:17.860 So all the drugs that are currently, you know, 46:17.860 --> 46:20.820 in the FDA approved drugs or even drugs 46:20.820 --> 46:22.140 that are in clinical trials, 46:22.140 --> 46:27.140 they are designed using these domain experts 46:27.140 --> 46:30.060 which goes through this combinatorial space 46:30.060 --> 46:31.980 of molecules or graphs or whatever 46:31.980 --> 46:35.180 and find the right one or adjust it to be the right ones. 46:35.180 --> 46:39.260 Sounds like the breast density heuristic from 67, 46:39.260 --> 46:40.500 the same echoes. 46:40.500 --> 46:41.820 It's not necessarily that. 46:41.820 --> 46:45.380 It's really, you know, it's really driven by deep understanding. 46:45.380 --> 46:46.820 It's not like they just observe it. 
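The screen-then-optimize pipeline described above can be sketched as: score a large library, keep the compounds that clear an activity threshold (the "hits"), then apply candidate modifications to a hit and keep whichever best improves a combined objective. Everything here is a stand-in: in reality the "assay" is a wet-lab measurement and the "modifications" are a medicinal chemist's redesigns, not lambdas.

```python
def screen(library, assay, threshold):
    """High-throughput screening caricature: keep compounds whose
    assayed activity clears the threshold (the 'hits')."""
    return [c for c in library if assay(c) >= threshold]

def optimize_hit(hit, modifications, objective):
    """One greedy step standing in for the chemist's iterative
    redesign: try each modification, keep the best-scoring result."""
    best, best_score = hit, objective(hit)
    for modify in modifications:
        candidate = modify(hit)
        score = objective(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best

# Toy compounds represented as (activity, solubility) pairs.
library = [(0.2, 0.5), (0.9, 0.1), (0.7, 0.4)]
hits = screen(library, assay=lambda c: c[0], threshold=0.6)
lead = optimize_hit(
    hits[0],
    modifications=[lambda c: (c[0] - 0.05, c[1] + 0.3)],  # trade a bit of activity for solubility
    objective=lambda c: c[0] + c[1],
)
```

The ML angle she raises later is replacing both pieces: a learned property predictor instead of scanning hundreds of thousands of compounds in the lab, and a generative model proposing the modifications.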
46:46.820 --> 46:48.540 I mean, they do deeply understand chemistry 46:48.540 --> 46:50.460 and they do understand how different groups 46:50.460 --> 46:53.140 and how does it change the properties. 46:53.140 --> 46:56.660 So there is a lot of science that gets into it 46:56.660 --> 46:58.740 and a lot of kind of simulation, 46:58.740 --> 47:00.940 how do you want it to behave? 47:01.900 --> 47:03.900 It's very, very complex. 47:03.900 --> 47:06.140 So they're quite effective at this design, obviously. 47:06.140 --> 47:08.420 Now, effective, yeah, we have drugs. 47:08.420 --> 47:10.780 Like depending on how do you measure effective? 47:10.780 --> 47:13.940 If you measure, it's in terms of cost, it's prohibitive. 47:13.940 --> 47:15.820 If you measure it in terms of times, you know, 47:15.820 --> 47:18.420 we have lots of diseases for which we don't have any drugs 47:18.420 --> 47:20.100 and we don't even know how to approach. 47:20.100 --> 47:23.460 I don't need to mention few drugs 47:23.460 --> 47:26.980 or degenerative disease drugs that fail, you know. 47:26.980 --> 47:30.900 So there are lots of, you know, trials that fail, 47:30.900 --> 47:32.180 you know, in later stages, 47:32.180 --> 47:35.180 which is really catastrophic from the financial perspective. 47:35.180 --> 47:38.260 So, you know, is it the effective, 47:38.260 --> 47:39.540 the most effective mechanism? 47:39.540 --> 47:42.740 Absolutely no, but this is the only one that currently works. 47:42.740 --> 47:46.660 And I would, you know, I was closely interacting 47:46.660 --> 47:48.020 with people in pharmaceutical industry. 47:48.020 --> 47:50.020 I was really fascinating on how sharp 47:50.020 --> 47:53.860 and what a deep understanding of the domain do they have. 47:53.860 --> 47:55.500 It's not observation driven. 47:55.500 --> 47:58.660 There is really a lot of science behind what they do. 47:58.660 --> 48:00.940 But if you ask me, can machine learning change it? 
48:00.940 --> 48:03.460 I firmly believe yes, 48:03.460 --> 48:07.020 because even the most experienced chemists cannot, you know, 48:07.020 --> 48:09.460 hold in their memory and understanding 48:09.460 --> 48:11.020 everything that you can learn, you know, 48:11.020 --> 48:14.140 from millions of molecules and reactions. 48:14.140 --> 48:18.380 And the space of graphs is a totally new space. 48:18.380 --> 48:20.460 I mean, it's a really interesting space 48:20.460 --> 48:22.540 for machine learning to explore, graph generation. 48:22.540 --> 48:24.740 Yeah, so there are a lot of things that you can do here. 48:24.740 --> 48:27.140 So we do a lot of work. 48:27.140 --> 48:29.940 So the first tool that we started with 48:29.940 --> 48:34.940 was the tool that can predict properties of the molecules. 48:34.940 --> 48:37.820 So you can just give the molecule and the property. 48:37.820 --> 48:39.900 It can be bioactivity properties. 48:39.900 --> 48:42.700 Or it can be some other property. 48:42.700 --> 48:48.580 And you train the molecules and you can now take a new molecule 48:48.580 --> 48:50.620 and predict this property. 48:50.620 --> 48:53.420 Now, when people started working in this area, 48:53.420 --> 48:54.620 it is something very simple. 48:54.620 --> 48:57.220 They do kind of existing, you know, fingerprints, 48:57.220 --> 48:59.380 which is kind of handcrafted features of the molecule 48:59.380 --> 49:01.420 when you break the graph to substructures 49:01.420 --> 49:04.420 and then you run, I don't know, a feedforward neural network. 49:04.420 --> 49:07.100 And what was interesting to see that clearly, you know, 49:07.100 --> 49:09.980 this was not the most effective way to proceed. 49:09.980 --> 49:12.900 And you need to have much more complex models 49:12.900 --> 49:15.140 that can induce a representation 49:15.140 --> 49:18.220 which can translate this graph into the embeddings 49:18.220 --> 49:20.100 and do these predictions. 
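The early pipeline described here — handcrafted fingerprints that hash substructures of the molecule into a fixed-length vector, fed into a feedforward network — can be sketched in a few lines. Everything below is illustrative, not the lab's actual tooling: the character-n-gram "substructures", the 64-bit fingerprint, and the randomly initialized, untrained network stand in for real chemical fingerprints and a fitted model.

```python
import zlib
import numpy as np

def fingerprint(smiles: str, n_bits: int = 64) -> np.ndarray:
    """Toy molecular fingerprint: hash short substrings of a SMILES string
    (standing in for graph substructures) into a fixed-length bit vector."""
    fp = np.zeros(n_bits)
    for size in (1, 2, 3):
        for i in range(len(smiles) - size + 1):
            fragment = smiles[i:i + size]
            fp[zlib.crc32(fragment.encode()) % n_bits] = 1.0
    return fp

def predict_property(fp: np.ndarray, w1, b1, w2, b2) -> float:
    """One-hidden-layer feedforward network mapping fingerprint -> property."""
    hidden = np.maximum(0.0, fp @ w1 + b1)  # ReLU hidden layer
    return float(hidden @ w2 + b2)

# Randomly initialized weights -- in practice these are trained against
# measured properties (bioactivity, toxicity, ...) of known molecules.
rng = np.random.default_rng(0)
w1, b1 = 0.1 * rng.normal(size=(64, 16)), np.zeros(16)
w2, b2 = 0.1 * rng.normal(size=16), 0.0

score = predict_property(fingerprint("CCO"), w1, b1, w2, b2)  # "CCO" = ethanol
```

Her point in the transcript is that the handcrafted fingerprint stage is the weak link: the more complex models she mentions replace `fingerprint` with a representation induced directly from the graph, e.g. by a graph neural network.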
49:20.100 --> 49:22.020 So this is one direction. 49:22.020 --> 49:24.180 And another direction, which is kind of related, 49:24.180 --> 49:27.940 is not only to stop by looking at the embedding itself, 49:27.940 --> 49:31.580 but actually modify it to produce better molecules. 49:31.580 --> 49:34.780 So you can think about it as the machine translation 49:34.780 --> 49:37.220 that you can start with a molecule 49:37.220 --> 49:39.500 and then there is an improved version of the molecule. 49:39.500 --> 49:41.380 And you can again, with an encoder, 49:41.380 --> 49:42.820 translate it into the hidden space 49:42.820 --> 49:45.020 and then learn how to modify it to improve 49:45.020 --> 49:48.220 the molecule in some ways. 49:48.220 --> 49:51.540 So that's, it's kind of really exciting. 49:51.540 --> 49:54.220 We already have seen that the property prediction works 49:54.220 --> 49:58.740 pretty well and now we are generating molecules 49:58.740 --> 50:00.780 and there are actually labs 50:00.780 --> 50:03.140 which are manufacturing these molecules. 50:03.140 --> 50:05.180 So we'll see where it will get us. 50:05.180 --> 50:06.540 Okay, that's really exciting. 50:06.540 --> 50:07.980 There's a lot of problems. 50:07.980 --> 50:10.740 Speaking of machine translation and embeddings, 50:10.740 --> 50:15.180 you have done a lot of really great research in NLP, 50:15.180 --> 50:16.740 natural language processing. 50:18.020 --> 50:20.380 Can you tell me your journey through NLP, 50:20.380 --> 50:23.860 what ideas, problems, approaches were you working on, 50:23.860 --> 50:26.820 were you fascinated with, did you explore 50:26.820 --> 50:31.820 before this magic of deep learning reemerged and after? 50:31.820 --> 50:35.740 So when I started my work in NLP, it was in 97. 50:35.740 --> 50:37.260 This was a very interesting time. 50:37.260 --> 50:40.780 It was exactly the time that I came to ACL, 50:40.780 --> 50:43.540 and I could barely understand English.
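The encode-then-modify idea — translate a molecule into a hidden vector, then move that vector toward better property values — can be caricatured as follows. The random projection "encoder" and the linear property predictor below are placeholders for the learned networks, and the hard step of decoding the improved vector back into a valid molecular graph is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(1)
enc = 0.1 * rng.normal(size=(64, 8))  # placeholder "encoder": fingerprint -> latent
w = rng.normal(size=8)                # placeholder linear property predictor

def encode(fp: np.ndarray) -> np.ndarray:
    """Map a (toy) molecular fingerprint into the hidden space."""
    return fp @ enc

def property_score(z: np.ndarray) -> float:
    """Predicted property of the molecule represented by latent vector z."""
    return float(z @ w)

# Start from some molecule's latent code and hill-climb the property:
z = encode(rng.integers(0, 2, size=64).astype(float))
before = property_score(z)
for _ in range(10):
    z = z + 0.1 * w                   # gradient of z @ w with respect to z is w
after = property_score(z)
```

In the real systems a learned decoder (e.g. a graph generator) turns the optimized latent vector back into a synthesizable molecule, which is what the manufacturing labs she mentions then test.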
50:43.540 --> 50:46.140 But it was exactly like the transition point 50:46.140 --> 50:51.140 because half of the papers were really rule based approaches 50:51.140 --> 50:53.820 where people took more kind of heavy linguistic approaches 50:53.820 --> 50:57.820 for small domains and try to build up from there. 50:57.820 --> 50:59.980 And then there were the first generation of papers 50:59.980 --> 51:01.980 which were corpus based papers. 51:01.980 --> 51:03.900 And they were very simple in our terms 51:03.900 --> 51:05.420 when you collect some statistics 51:05.420 --> 51:07.300 and do prediction based on them. 51:07.300 --> 51:10.700 But I found it really fascinating that one community 51:10.700 --> 51:16.700 can think so very differently about the problem. 51:16.700 --> 51:20.260 And I remember my first papers that I wrote, 51:20.260 --> 51:21.940 it didn't have a single formula, 51:21.940 --> 51:25.700 it didn't have evaluation, it just had examples of outputs. 51:25.700 --> 51:29.500 And this was a standard of the first generation 51:29.500 --> 51:32.020 of the field at a time. 51:32.020 --> 51:35.820 In some ways, I mean, people maybe just started emphasizing 51:35.820 --> 51:37.820 the empirical evaluation, 51:37.820 --> 51:39.780 but for many applications like summarization, 51:39.780 --> 51:42.780 you just wrote some examples of outputs. 51:42.780 --> 51:44.460 And then increasingly you can see 51:44.460 --> 51:48.300 that how the statistical approach has dominated the field. 51:48.300 --> 51:52.060 And we've seen increased performance 51:52.060 --> 51:56.020 across many basic tasks. 51:56.020 --> 52:00.020 The sad part of the story may be that if you look 52:00.020 --> 52:01.580 again through this journey, 52:01.580 --> 52:05.060 we see that the role of linguistics 52:05.060 --> 52:07.420 in some ways greatly diminishes. 
52:07.420 --> 52:11.580 And I think that you really need to look 52:11.580 --> 52:14.540 through the whole proceeding to find one or two papers 52:14.540 --> 52:17.260 which make some interesting linguistic references. 52:17.260 --> 52:18.100 It's really big. 52:18.100 --> 52:18.940 You mean today? 52:18.940 --> 52:19.780 Today. 52:19.780 --> 52:20.620 Today. 52:20.620 --> 52:21.460 This was definitely... 52:21.460 --> 52:22.300 Things like syntactic trees, 52:22.300 --> 52:24.380 just even basically against our conversation 52:24.380 --> 52:27.500 about human understanding of language, 52:27.500 --> 52:30.260 which I guess what linguistics would be, 52:30.260 --> 52:34.260 structured hierarchical representing language 52:34.260 --> 52:35.700 in a way that's human explainable, 52:35.700 --> 52:39.420 understandable is missing today. 52:39.420 --> 52:41.100 I don't know if it is, 52:41.100 --> 52:43.580 what is explainable and understandable. 52:43.580 --> 52:45.900 At the end, we perform functions 52:45.900 --> 52:50.100 and it's okay to have machine which performs a function. 52:50.100 --> 52:53.180 Like when you're thinking about your calculator, correct? 52:53.180 --> 52:55.420 Your calculator can do calculation 52:55.420 --> 52:57.580 very different from you would do the calculation, 52:57.580 --> 52:58.820 but it's very effective in it. 52:58.820 --> 52:59.700 And this is fine. 52:59.700 --> 53:04.420 If we can achieve certain tasks with high accuracy, 53:04.420 --> 53:07.100 it doesn't necessarily mean that it has to understand 53:07.100 --> 53:09.260 in the same way as we understand. 53:09.260 --> 53:11.220 In some ways, it's even naive to request 53:11.220 --> 53:14.900 because you have so many other sources of information 53:14.900 --> 53:17.860 that are absent when you are training your system. 53:17.860 --> 53:19.180 So it's okay. 53:19.180 --> 53:20.020 Is it delivered? 
53:20.020 --> 53:21.460 And I would tell you one application 53:21.460 --> 53:22.780 that's just really fascinating. 53:22.780 --> 53:24.300 In 97, when I came to ACL, 53:24.300 --> 53:25.900 there were some papers on machine translation. 53:25.900 --> 53:27.460 They were like primitive, 53:27.460 --> 53:31.100 like people were trying really, really simple things. 53:31.100 --> 53:34.300 And the feeling, my feeling was that, 53:34.300 --> 53:36.300 to make a real machine translation system, 53:36.300 --> 53:39.580 it's like to fly to the moon and build a house there 53:39.580 --> 53:41.620 and a garden and live happily ever after. 53:41.620 --> 53:42.620 I mean, it's like impossible. 53:42.620 --> 53:46.740 I never could imagine that within 10 years, 53:46.740 --> 53:48.580 we would already see the system working. 53:48.580 --> 53:51.620 And now nobody is even surprised 53:51.620 --> 53:54.460 to utilize the system on a daily basis. 53:54.460 --> 53:56.260 So this was like a huge, huge progress, 53:56.260 --> 53:57.900 saying that people for a very long time 53:57.900 --> 54:00.820 tried to solve it using other mechanisms 54:00.820 --> 54:03.220 and they were unable to solve it. 54:03.220 --> 54:06.180 That's why I'm coming back to a question about biology, 54:06.180 --> 54:10.820 that in linguistics, people tried to go this way 54:10.820 --> 54:13.540 and try to write the syntactic trees 54:13.540 --> 54:14.860 and try to abstract it 54:14.860 --> 54:17.060 and to find the right representation. 54:17.060 --> 54:22.060 And, you know, they couldn't get very far 54:22.220 --> 54:25.940 with this understanding while these models, 54:25.940 --> 54:29.620 using, you know, other sources, are actually capable 54:29.620 --> 54:31.660 of making a lot of progress.
54:31.660 --> 54:33.940 Now, I'm not naive to think 54:33.940 --> 54:36.860 that we are in this paradise space in NLP 54:36.860 --> 54:38.540 and I'm sure as you know, 54:38.540 --> 54:40.860 that when we slightly change the domain 54:40.860 --> 54:42.580 and when we decrease the amount of training, 54:42.580 --> 54:44.740 it can do like really bizarre and funny things. 54:44.740 --> 54:47.140 But I think it's just a matter of improving 54:47.140 --> 54:48.540 generalization capacity, 54:48.540 --> 54:51.500 which is just a technical question. 54:51.500 --> 54:54.300 Well, so that's the question. 54:54.300 --> 54:57.020 How much of language understanding 54:57.020 --> 54:59.180 can be solved with deep neural networks? 54:59.180 --> 55:03.740 In your intuition, I mean, it's unknown, I suppose. 55:03.740 --> 55:07.660 But as we start to creep towards romantic notions 55:07.660 --> 55:10.620 of the spirit of the Turing test 55:10.620 --> 55:14.220 and conversation and dialogue 55:14.220 --> 55:18.300 and something that maybe to me or to us, 55:18.300 --> 55:21.620 as humans, feels like it needs real understanding. 55:21.620 --> 55:23.500 How much can be achieved 55:23.500 --> 55:27.140 with these neural networks or statistical methods? 55:28.060 --> 55:33.060 So I guess I am very much driven by the outcomes. 55:33.340 --> 55:35.420 Can we achieve the performance 55:35.420 --> 55:40.420 which would be satisfactory for us for different tasks? 55:40.700 --> 55:43.020 Now, if you again look at machine translation systems, 55:43.020 --> 55:46.060 which are trained on large amounts of data, 55:46.060 --> 55:48.820 they really can do a remarkable job 55:48.820 --> 55:51.380 relative to where they've been a few years ago. 55:51.380 --> 55:54.660 And if you project into the future, 55:54.660 --> 55:57.020 if it will be the same speed of improvement, 55:59.380 --> 56:00.220 this is great.
56:00.220 --> 56:01.060 Now, does it bother me 56:01.060 --> 56:04.900 that it's not doing the same translation as we are doing? 56:04.900 --> 56:06.660 Now, if you go to cognitive science, 56:06.660 --> 56:09.460 we still don't really understand what we are doing. 56:10.460 --> 56:11.900 I mean, there are a lot of theories 56:11.900 --> 56:13.860 and there is obviously a lot of progress in studying it, 56:13.860 --> 56:17.580 but our understanding of what exactly goes on in our brains 56:17.580 --> 56:21.060 when we process language is still not crystal clear 56:21.060 --> 56:25.500 and precise enough that we can translate it into machines. 56:25.500 --> 56:29.820 What does bother me is that, again, 56:29.820 --> 56:31.740 that machines can be extremely brittle 56:31.740 --> 56:34.540 when you go out of your comfort zone, 56:34.540 --> 56:36.100 when there is a distributional shift 56:36.100 --> 56:37.340 between training and testing. 56:37.340 --> 56:39.060 And it has been years and years, 56:39.060 --> 56:41.540 every year when I teach an NLP class, 56:41.540 --> 56:43.580 I show them some examples of translation 56:43.580 --> 56:45.740 from some newspaper in Hebrew, 56:45.740 --> 56:47.340 whatever, it was perfect. 56:47.340 --> 56:48.860 And then I have a recipe 56:48.860 --> 56:52.620 that Tommi Jaakkola sent me a while ago, 56:52.620 --> 56:55.740 and it was written in Finnish, for Karelian pies. 56:55.740 --> 56:59.340 And it's just a terrible translation. 56:59.340 --> 57:01.500 You cannot understand anything of what it does. 57:01.500 --> 57:03.180 It's not like some syntactic mistakes. 57:03.180 --> 57:04.340 It's just terrible. 57:04.340 --> 57:07.140 And year after year, I tried it and it would translate it. 57:07.140 --> 57:09.020 And year after year, it does this terrible work 57:09.020 --> 57:12.060 because I guess recipes are not a big part 57:12.060 --> 57:14.620 of their training repertoire.
57:15.500 --> 57:17.780 So, but in terms of outcomes, 57:17.780 --> 57:20.260 that's a really clean, good way to look at it. 57:21.140 --> 57:23.180 I guess the question I was asking is, 57:24.100 --> 57:27.740 do you think, imagine a future, 57:27.740 --> 57:29.820 do you think the current approaches 57:29.820 --> 57:32.540 can pass the Turing test in the way, 57:34.740 --> 57:37.060 in the best possible formulation of the Turing test? 57:37.060 --> 57:39.500 Which is, would you want to have a conversation 57:39.500 --> 57:42.380 with a neural network for an hour? 57:42.380 --> 57:45.860 Oh God, no, no, there are not that many people 57:45.860 --> 57:48.140 that I would want to talk for an hour. 57:48.140 --> 57:51.540 But there are some people in this world, alive or not, 57:51.540 --> 57:53.300 that you would like to talk to for an hour, 57:53.300 --> 57:56.740 could a neural network achieve that outcome? 57:56.740 --> 57:58.220 So I think it would be really hard 57:58.220 --> 58:01.180 to create a successful training set, 58:01.180 --> 58:03.380 which would enable it to have a conversation 58:03.380 --> 58:07.140 for an intercontextual conversation for an hour. 58:07.140 --> 58:08.180 So you think it's a problem of data, perhaps? 58:08.180 --> 58:09.980 I think in some ways it's an important data. 58:09.980 --> 58:12.500 It's a problem both of data and the problem 58:12.500 --> 58:15.780 of the way we are training our systems, 58:15.780 --> 58:18.100 their ability to truly to generalize, 58:18.100 --> 58:21.300 to be very compositional, in some ways, it's limited, 58:21.300 --> 58:24.120 in the current capacity, at least. 58:25.580 --> 58:28.020 You know, we can translate well, 58:28.020 --> 58:32.540 we can find information well, we can extract information. 58:32.540 --> 58:35.220 So there are many capacities in which it's doing very well. 
58:35.220 --> 58:38.020 And you can ask me, would you trust the machine 58:38.020 --> 58:39.900 to translate for you and use it as a source? 58:39.900 --> 58:41.900 I would say absolutely, especially if we're talking 58:41.900 --> 58:44.180 about newspaper data or other data, 58:44.180 --> 58:46.780 which is in the realm of its own training set, 58:46.780 --> 58:47.940 I would say yes. 58:48.940 --> 58:52.940 But, you know, having conversations with the machine, 58:52.940 --> 58:56.500 it's not something that I would choose to do. 58:56.500 --> 58:58.180 But you know, I would tell you something, 58:58.180 --> 58:59.460 talking about Turing tests 58:59.460 --> 59:02.980 and about all this kind of ELIZA conversations. 59:02.980 --> 59:05.580 I remember visiting Tencent in China 59:05.580 --> 59:06.980 and they have this chatbot. 59:06.980 --> 59:09.540 And they claim that there is like a really humongous amount 59:09.540 --> 59:10.820 of the local population, 59:10.820 --> 59:12.980 which like for hours talks to the chatbot, 59:12.980 --> 59:15.380 to me it was, I cannot believe it, 59:15.380 --> 59:17.140 but apparently it's like documented 59:17.140 --> 59:20.820 that there are some people who enjoy this conversation. 59:20.820 --> 59:24.580 And you know, it brought to me another MIT story 59:24.580 --> 59:26.940 about ELIZA and Weizenbaum. 59:26.940 --> 59:29.380 I don't know if you're familiar with the story. 59:29.380 --> 59:31.060 So Weizenbaum was a professor at MIT 59:31.060 --> 59:32.620 and when he developed this ELIZA, 59:32.620 --> 59:36.740 which was just doing string matching, very trivial, 59:36.740 --> 59:38.580 like restating of what you said, 59:38.580 --> 59:41.300 with very few rules, no syntax. 59:41.300 --> 59:43.780 Apparently there were secretaries at MIT 59:43.780 --> 59:48.220 that would sit for hours and converse with this trivial thing. 59:48.220 --> 59:50.220 And at the time there were no beautiful interfaces.
59:50.220 --> 59:53.580 So you actually need to go through the pain of communicating. 59:53.580 --> 59:56.980 And Weizenbaum himself was so horrified by this phenomenon, 59:56.980 --> 59:59.300 that people can believe the machine enough, 59:59.300 --> 1:00:00.860 that you just need to give them the hint 1:00:00.860 --> 1:00:02.060 that the machine understands you, 1:00:02.060 --> 1:00:03.980 and they can complete the rest. 1:00:03.980 --> 1:00:05.460 So he kind of stopped this research 1:00:05.460 --> 1:00:08.700 and went into kind of trying to understand 1:00:08.700 --> 1:00:11.500 what this artificial intelligence can do to our brains. 1:00:12.780 --> 1:00:15.580 So my point is, you know, 1:00:15.580 --> 1:00:19.380 it's not how good is the technology, 1:00:19.380 --> 1:00:22.660 it's how ready we are to believe 1:00:22.660 --> 1:00:25.620 that it delivers the good that we are trying to get. 1:00:25.620 --> 1:00:27.220 That's a really beautiful way to put it. 1:00:27.220 --> 1:00:29.780 I, by the way, I'm not horrified by that possibility 1:00:29.780 --> 1:00:34.780 but inspired by it because, I mean, human connection, 1:00:35.940 --> 1:00:38.100 whether it's through language or through love, 1:00:39.180 --> 1:00:44.180 it seems like it's very amenable to machine learning 1:00:44.900 --> 1:00:49.340 and the rest is just challenges of psychology. 1:00:49.340 --> 1:00:52.460 Like you said, the secretaries who enjoy spending hours, 1:00:52.460 --> 1:00:55.020 I would say I would describe most of our lives 1:00:55.020 --> 1:00:58.060 as enjoying spending hours with those we love 1:00:58.060 --> 1:01:00.860 for very silly reasons. 1:01:00.860 --> 1:01:02.820 All we're doing is keyword matching as well. 1:01:02.820 --> 1:01:05.140 So I'm not sure how much intelligence 1:01:05.140 --> 1:01:08.180 we exhibit to each other with the people we love 1:01:08.180 --> 1:01:09.860 that we're close with.
1:01:09.860 --> 1:01:12.700 So it's a very interesting point 1:01:12.700 --> 1:01:16.060 of what it means to pass the Turing test with language. 1:01:16.060 --> 1:01:16.900 I think you're right. 1:01:16.900 --> 1:01:18.260 In terms of conversation, 1:01:18.260 --> 1:01:23.140 I think machine translation has very clear performance 1:01:23.140 --> 1:01:24.460 and improvement, right? 1:01:24.460 --> 1:01:28.060 What it means to have a fulfilling conversation 1:01:28.060 --> 1:01:31.060 is very, very person dependent 1:01:31.060 --> 1:01:33.580 and context dependent and so on. 1:01:33.580 --> 1:01:36.060 That's, yeah, it's very well put. 1:01:36.060 --> 1:01:38.340 So, but in your view, 1:01:38.340 --> 1:01:41.940 what's a benchmark in natural language, a test, 1:01:41.940 --> 1:01:43.700 that's just out of reach right now, 1:01:43.700 --> 1:01:46.060 but we might be able to, that's exciting. 1:01:46.060 --> 1:01:49.140 Is it perfecting machine translation 1:01:49.140 --> 1:01:51.940 or is it something else, is it summarization? 1:01:51.940 --> 1:01:53.340 What's out there just out of reach? 1:01:53.340 --> 1:01:55.860 It goes across specific applications. 1:01:55.860 --> 1:01:58.300 It's more about the ability to learn 1:01:58.300 --> 1:02:00.100 from few examples for real, 1:02:00.100 --> 1:02:03.340 what we call few-shot learning, in all these cases. 1:02:03.340 --> 1:02:05.940 Because, you know, the way we publish these papers today, 1:02:05.940 --> 1:02:09.940 we say, if we have like naively, we get 55, 1:02:09.940 --> 1:02:12.500 but now we had a few examples and we can move to 65. 1:02:12.500 --> 1:02:14.020 None of these methods are actually 1:02:14.020 --> 1:02:15.980 realistically doing anything useful. 1:02:15.980 --> 1:02:18.540 You cannot use them today.
1:02:18.540 --> 1:02:23.540 And the ability to be able to generalize and to move 1:02:25.460 --> 1:02:28.980 or to be autonomous in finding the data 1:02:28.980 --> 1:02:30.300 that you need to learn, 1:02:31.340 --> 1:02:34.260 to be able to perfect new tasks or new language. 1:02:35.300 --> 1:02:38.100 This is an area where I think we really need 1:02:39.220 --> 1:02:43.060 to move forward to and we are not yet there. 1:02:43.060 --> 1:02:45.060 Are you at all excited, 1:02:45.060 --> 1:02:48.540 curious by the possibility of creating human level intelligence? 1:02:49.900 --> 1:02:52.540 Is this, because you've been very in your discussion. 1:02:52.540 --> 1:02:54.340 So if we look at oncology, 1:02:54.340 --> 1:02:58.100 you're trying to use machine learning to help the world 1:02:58.100 --> 1:02:59.700 in terms of alleviating suffering. 1:02:59.700 --> 1:03:02.340 If you look at natural language processing, 1:03:02.340 --> 1:03:05.300 you're focused on the outcomes of improving practical things 1:03:05.300 --> 1:03:06.820 like machine translation. 1:03:06.820 --> 1:03:09.860 But, you know, human level intelligence is a thing 1:03:09.860 --> 1:03:13.100 that our civilizations dream about creating 1:03:13.100 --> 1:03:15.740 super human level intelligence. 1:03:15.740 --> 1:03:16.940 Do you think about this? 1:03:16.940 --> 1:03:19.140 Do you think it's at all within our reach? 1:03:20.420 --> 1:03:22.660 So as you said yourself earlier, 1:03:22.660 --> 1:03:26.660 talking about, you know, how do you perceive, 1:03:26.660 --> 1:03:28.980 you know, our communications with each other 1:03:28.980 --> 1:03:30.700 that, you know, we're matching keywords 1:03:30.700 --> 1:03:33.020 and certain behaviors and so on. 
1:03:33.020 --> 1:03:36.860 So at the end, whenever one assesses, 1:03:36.860 --> 1:03:38.660 let's say, relations with another person, 1:03:38.660 --> 1:03:41.460 you have separate kind of measurements and outcomes 1:03:41.460 --> 1:03:43.620 inside your head that determine, you know, 1:03:43.620 --> 1:03:45.860 what is the status of the relation. 1:03:45.860 --> 1:03:48.580 So one way, this is this classical level. 1:03:48.580 --> 1:03:49.580 What is the intelligence? 1:03:49.580 --> 1:03:51.260 Is it the fact that now we are going to do it 1:03:51.260 --> 1:03:52.940 the same way as a human is doing, 1:03:52.940 --> 1:03:55.500 when we don't even understand what the human is doing? 1:03:55.500 --> 1:03:59.100 Or we now have an ability to deliver these outcomes, 1:03:59.100 --> 1:04:01.300 but not in one area, not in NLP alone, 1:04:01.300 --> 1:04:03.940 not just to translate or just to answer questions, 1:04:03.940 --> 1:04:06.900 but across many, many areas that we can achieve 1:04:06.900 --> 1:04:09.740 the functionalities that humans can achieve 1:04:09.740 --> 1:04:12.380 with their ability to learn and do other things. 1:04:12.380 --> 1:04:15.500 I think this is, and this we can actually measure, 1:04:15.500 --> 1:04:20.340 how far we are, and that's what makes me excited 1:04:20.340 --> 1:04:22.420 that we, you know, in my lifetime, 1:04:22.420 --> 1:04:23.780 at least so far what we've seen, 1:04:23.780 --> 1:04:26.260 it's like tremendous progress across 1:04:26.260 --> 1:04:28.700 these different functionalities. 1:04:28.700 --> 1:04:32.260 And I think it will be really exciting 1:04:32.260 --> 1:04:35.540 to see where we will be. 1:04:35.540 --> 1:04:40.020 And again, one way to think about it is there are machines 1:04:40.020 --> 1:04:41.820 which are improving their functionality.
1:04:41.820 --> 1:04:44.900 Another one is to think about us with our brains, 1:04:44.900 --> 1:04:49.020 which are imperfect, how they can be accelerated 1:04:49.020 --> 1:04:54.020 by this technology as it becomes stronger and stronger. 1:04:55.860 --> 1:04:58.580 Coming back to another book that I love, 1:04:58.580 --> 1:05:02.060 Flowers for Algernon, have you read this book? 1:05:02.060 --> 1:05:02.900 Yes. 1:05:02.900 --> 1:05:05.740 You know, there is this point that the patient gets 1:05:05.740 --> 1:05:08.020 this miracle cure which changes his brain 1:05:08.020 --> 1:05:11.060 and all of a sudden they see life in a different way 1:05:11.060 --> 1:05:13.340 and can do certain things better, 1:05:13.340 --> 1:05:14.900 but certain things much worse. 1:05:16.540 --> 1:05:21.540 So you can imagine this kind of computer augmented cognition 1:05:22.420 --> 1:05:24.820 where it can bring you that now in the same way 1:05:24.820 --> 1:05:28.140 as, you know, the cars enable us to get to places 1:05:28.140 --> 1:05:30.100 where we've never been before. 1:05:30.100 --> 1:05:31.620 Can we think differently? 1:05:31.620 --> 1:05:32.820 Can we think faster? 1:05:32.820 --> 1:05:36.700 So, and we already see a lot of it happening 1:05:36.700 --> 1:05:38.260 in how it impacts us, 1:05:38.260 --> 1:05:42.180 but I think we have a long way to go there. 1:05:42.180 --> 1:05:45.020 So that's sort of artificial intelligence 1:05:45.020 --> 1:05:47.260 and technology affecting our, 1:05:47.260 --> 1:05:50.500 augmenting our intelligence as humans. 1:05:50.500 --> 1:05:55.500 Yesterday, a company called Neuralink announced 1:05:55.540 --> 1:05:56.820 they did this whole demonstration. 1:05:56.820 --> 1:05:57.980 I don't know if you saw it. 1:05:57.980 --> 1:06:00.980 It's, they demonstrated brain, computer, 1:06:00.980 --> 1:06:05.260 brain machine interface where there's like a sewing machine 1:06:05.260 --> 1:06:06.340 for the brain. 
1:06:06.340 --> 1:06:11.140 Do you, you know, a lot of that is quite out there 1:06:11.140 --> 1:06:15.300 in terms of things that some people would say are impossible, 1:06:15.300 --> 1:06:18.100 but they're dreamers and want to engineer systems like that. 1:06:18.100 --> 1:06:20.380 Do you see, based on what you just said, 1:06:20.380 --> 1:06:23.820 a hope for that more direct interaction with the brain? 1:06:25.140 --> 1:06:27.020 I think there are different ways. 1:06:27.020 --> 1:06:28.980 One is a direct interaction with the brain. 1:06:28.980 --> 1:06:30.900 And again, there are lots of companies 1:06:30.900 --> 1:06:32.220 that work in this space. 1:06:32.220 --> 1:06:35.060 And I think there will be a lot of developments. 1:06:35.060 --> 1:06:36.540 When I'm just thinking that many times 1:06:36.540 --> 1:06:39.020 we are not aware of our feelings 1:06:39.020 --> 1:06:41.420 of motivation, what drives us. 1:06:41.420 --> 1:06:44.100 Like let me give you a trivial example, our attention. 1:06:45.500 --> 1:06:47.260 There are a lot of studies that demonstrate 1:06:47.260 --> 1:06:49.220 that it takes a while to a person to understand 1:06:49.220 --> 1:06:51.100 that they are not attentive anymore. 1:06:51.100 --> 1:06:52.180 And we know that there are people 1:06:52.180 --> 1:06:54.540 who really have strong capacity to hold attention. 1:06:54.540 --> 1:06:55.980 There are another end of the spectrum, 1:06:55.980 --> 1:06:57.980 people with ADD and other issues 1:06:57.980 --> 1:07:00.740 that they have problem to regulate their attention. 1:07:00.740 --> 1:07:03.540 Imagine to yourself that you have like a cognitive aid 1:07:03.540 --> 1:07:06.260 that just alerts you based on your gaze. 1:07:06.260 --> 1:07:09.300 That your attention is now not on what you are doing. 1:07:09.300 --> 1:07:11.460 And instead of writing a paper, you're now dreaming 1:07:11.460 --> 1:07:12.740 of what you're gonna do in the evening. 
1:07:12.740 --> 1:07:16.340 So even this kind of simple measurement things, 1:07:16.340 --> 1:07:18.020 how they can change us. 1:07:18.020 --> 1:07:22.380 And I see it even in the simple ways with myself. 1:07:22.380 --> 1:07:26.460 I have my Zone app that I got in the MIT gym. 1:07:26.460 --> 1:07:28.780 It kind of records how much you run 1:07:28.780 --> 1:07:31.940 and you have some points and you can get some status, 1:07:31.940 --> 1:07:32.900 whatever. 1:07:32.900 --> 1:07:35.860 Like I said, what is this ridiculous thing? 1:07:35.860 --> 1:07:38.820 Who would ever care about some status in some app? 1:07:38.820 --> 1:07:39.660 Guess what? 1:07:39.660 --> 1:07:41.580 So to maintain the status, 1:07:41.580 --> 1:07:44.660 you have to get a number of points every month. 1:07:44.660 --> 1:07:48.060 And not only that, I did it every single month 1:07:48.060 --> 1:07:50.580 for the last 18 months, 1:07:50.580 --> 1:07:54.180 it went to the point that I was injured. 1:07:54.180 --> 1:07:56.180 And when I could run again, 1:07:56.180 --> 1:08:01.180 I, in two days, I did like some humongous amount 1:08:01.860 --> 1:08:04.020 of running just to complete the points. 1:08:04.020 --> 1:08:05.820 It was like really not safe. 1:08:05.820 --> 1:08:08.340 It's like, I'm not gonna lose my status 1:08:08.340 --> 1:08:10.100 because I want to get there.
1:08:10.100 --> 1:08:13.180 So you can already see that this direct measurement 1:08:13.180 --> 1:08:16.180 and the feedback, we're looking at video games 1:08:16.180 --> 1:08:18.540 and see, you know, the addiction aspect of it, 1:08:18.540 --> 1:08:20.340 but you can imagine that the same idea 1:08:20.340 --> 1:08:23.500 can be expanded to many other areas of our life 1:08:23.500 --> 1:08:25.820 when we really can get feedback 1:08:25.820 --> 1:08:28.380 and imagine in your case in relations, 1:08:29.740 --> 1:08:31.220 when we are doing keyword matching, 1:08:31.220 --> 1:08:36.100 imagine that the person who is generating the keywords, 1:08:36.100 --> 1:08:37.700 that person gets direct feedback 1:08:37.700 --> 1:08:39.540 before the whole thing explodes. 1:08:39.540 --> 1:08:41.940 Is it that maybe at this point, 1:08:41.940 --> 1:08:43.980 we are going in the wrong direction? 1:08:43.980 --> 1:08:47.980 Maybe it will be really a behavior modifying moment. 1:08:47.980 --> 1:08:51.300 So yeah, it's relationship management too. 1:08:51.300 --> 1:08:54.180 So yeah, that's a fascinating whole area 1:08:54.180 --> 1:08:56.100 of psychology actually as well, 1:08:56.100 --> 1:08:58.220 of seeing how our behavior has changed 1:08:58.220 --> 1:09:00.820 with basically all human relations 1:09:00.820 --> 1:09:05.820 now have other non human entities helping us out. 1:09:06.180 --> 1:09:09.420 So, you teach a large, 1:09:09.420 --> 1:09:12.600 a huge machine learning course here at MIT. 1:09:13.620 --> 1:09:15.340 I can ask you a million questions, 1:09:15.340 --> 1:09:17.580 but you've seen a lot of students. 1:09:17.580 --> 1:09:20.900 What ideas do students struggle with the most 1:09:20.900 --> 1:09:23.940 as they first enter this world of machine learning?
1:09:25.700 --> 1:09:28.020 Actually, this year was the first time 1:09:28.020 --> 1:09:30.060 I started teaching a small machine learning class 1:09:30.060 --> 1:09:32.860 and it came as a result of what I saw 1:09:32.860 --> 1:09:35.620 in my big machine learning class that Tommi Jaakkola 1:09:35.620 --> 1:09:38.280 and I built maybe six years ago. 1:09:39.660 --> 1:09:42.820 What we've seen is that as this area became more and more popular, 1:09:42.820 --> 1:09:46.940 more and more people at MIT want to take this class. 1:09:46.940 --> 1:09:49.940 And while we designed it for computer science majors, 1:09:49.940 --> 1:09:52.340 there were a lot of people who really are interested 1:09:52.340 --> 1:09:54.220 to learn it, but unfortunately, 1:09:54.220 --> 1:09:57.380 their background was not enabling them 1:09:57.380 --> 1:09:58.780 to do well in the class. 1:09:58.780 --> 1:10:01.000 And many of them associated machine learning 1:10:01.000 --> 1:10:02.980 with the words struggle and failure, 1:10:04.380 --> 1:10:06.380 primarily for non majors. 1:10:06.380 --> 1:10:08.660 And that's why we actually started a new class 1:10:08.660 --> 1:10:12.620 which we call machine learning from algorithms to modeling, 1:10:12.620 --> 1:10:16.740 which emphasizes more the modeling aspects of it 1:10:16.740 --> 1:10:21.700 and focuses on, it has majors and non majors. 1:10:21.700 --> 1:10:25.300 So we kind of try to extract the relevant parts 1:10:25.300 --> 1:10:27.340 and make it more accessible, 1:10:27.340 --> 1:10:29.620 because the fact that we're teaching 20 classifiers 1:10:29.620 --> 1:10:31.020 in a standard machine learning class 1:10:31.020 --> 1:10:34.100 is really a big question, whether we really need it.
1:10:34.100 --> 1:10:36.380 But it was interesting to see, 1:10:36.380 --> 1:10:38.300 from the first generation of students, 1:10:38.300 --> 1:10:40.900 when they came back from their internships 1:10:40.900 --> 1:10:43.940 and from their jobs, 1:10:43.940 --> 1:10:47.460 what different and exciting things they can do 1:10:47.460 --> 1:10:48.380 that they would never think 1:10:48.380 --> 1:10:51.100 you could even apply machine learning to. 1:10:51.100 --> 1:10:53.740 Some of them are, like, matching their relations 1:10:53.740 --> 1:10:56.020 and other things, a variety of different applications. 1:10:56.020 --> 1:10:58.060 Everything is amenable to machine learning. 1:10:58.060 --> 1:11:00.260 That actually brings up an interesting point 1:11:00.260 --> 1:11:02.580 of computer science in general. 1:11:02.580 --> 1:11:05.420 It almost seems, maybe I'm crazy, 1:11:05.420 --> 1:11:08.420 but it almost seems like everybody needs to learn 1:11:08.420 --> 1:11:10.060 how to program these days. 1:11:10.060 --> 1:11:13.340 If you're 20 years old or if you're starting school, 1:11:13.340 --> 1:11:15.900 even if you're an English major, 1:11:15.900 --> 1:11:18.100 it seems like programming 1:11:18.980 --> 1:11:21.860 unlocks so much possibility in this world. 1:11:21.860 --> 1:11:24.980 So when you interact with those non majors, 1:11:24.980 --> 1:11:29.980 are there skills that they were simply lacking at the time 1:11:30.220 --> 1:11:31.980 that you wish they had, 1:11:31.980 --> 1:11:34.620 that they had learned in high school and so on? 1:11:34.620 --> 1:11:37.460 Like, how should education change 1:11:37.460 --> 1:11:41.260 in this computerized world that we live in? 1:11:41.260 --> 1:11:43.500 So, you see, because they knew that there is a Python component 1:11:43.500 --> 1:11:44.820 in the class, 1:11:44.820 --> 1:11:47.020 their Python skills were okay, 1:11:47.020 --> 1:11:49.140 and the class is not really heavy on programming. 
1:11:49.140 --> 1:11:52.420 They primarily kind of add parts to the programs. 1:11:52.420 --> 1:11:55.420 I think it was more of their mathematical barriers, 1:11:55.420 --> 1:11:58.220 and the class, again, which was designed for the majors, 1:11:58.220 --> 1:12:01.220 was using notation like big O for complexity 1:12:01.220 --> 1:12:04.540 and others; people who come from different backgrounds 1:12:04.540 --> 1:12:05.820 just don't have it in their lexicon. 1:12:05.820 --> 1:12:09.140 It's not necessarily a very challenging notion, 1:12:09.140 --> 1:12:11.500 but they were just not aware of it. 1:12:12.380 --> 1:12:15.340 So I think that, you know, kind of linear algebra 1:12:15.340 --> 1:12:17.660 and probability, the basics, the calculus, 1:12:17.660 --> 1:12:20.860 one-variable calculus, things that can help. 1:12:20.860 --> 1:12:23.580 What advice would you give to students 1:12:23.580 --> 1:12:26.620 interested in machine learning, interested, 1:12:26.620 --> 1:12:30.100 you've talked about detecting and curing cancer, 1:12:30.100 --> 1:12:33.140 drug design, if they want to get into that field, 1:12:33.140 --> 1:12:34.540 what should they do? 1:12:36.380 --> 1:12:39.100 How do they get into it and succeed as researchers 1:12:39.100 --> 1:12:42.100 and entrepreneurs? 1:12:43.300 --> 1:12:45.260 The first good piece of news is that right now 1:12:45.260 --> 1:12:47.420 there are lots of resources 1:12:47.420 --> 1:12:50.180 that are created at different levels, 1:12:50.180 --> 1:12:51.860 and you can find online 1:12:51.860 --> 1:12:54.820 or at your school classes 1:12:54.820 --> 1:12:57.580 which are more mathematical or more applied and so on. 1:12:57.580 --> 1:13:01.340 So you can find kind of a teacher 1:13:01.340 --> 1:13:02.780 who teaches in your own language, 1:13:02.780 --> 1:13:04.580 where you can enter the field, 1:13:04.580 --> 1:13:06.740 and you can make many different types of contribution 1:13:06.740 --> 1:13:09.620 depending on what your strengths are. 
1:13:10.740 --> 1:13:12.020 And the second point, 1:13:12.020 --> 1:13:15.300 I think it's really important to find some area 1:13:15.300 --> 1:13:18.140 which you really care about, 1:13:18.140 --> 1:13:20.220 and it can motivate your learning. 1:13:20.220 --> 1:13:22.620 And it can be for somebody curing cancer 1:13:22.620 --> 1:13:25.380 or doing self driving cars or whatever, 1:13:25.380 --> 1:13:29.660 but to find an area where there is data, 1:13:29.660 --> 1:13:31.300 where you believe there are strong patterns 1:13:31.300 --> 1:13:32.340 and we should be doing it 1:13:32.340 --> 1:13:33.580 and we're still not doing it, 1:13:33.580 --> 1:13:35.260 or you can do it better, 1:13:35.260 --> 1:13:37.860 and just start there 1:13:37.860 --> 1:13:39.700 and see where it can bring you. 1:13:40.780 --> 1:13:44.060 So you've been very successful 1:13:44.060 --> 1:13:45.580 in many directions in life, 1:13:46.460 --> 1:13:48.860 but you also mentioned Flowers for Algernon. 1:13:51.020 --> 1:13:53.820 And I think I've read or listened to you mention somewhere 1:13:53.820 --> 1:13:55.340 that researchers often get lost 1:13:55.340 --> 1:13:56.740 in the details of their work. 1:13:56.740 --> 1:14:00.220 This is per our original discussion with cancer and so on, 1:14:00.220 --> 1:14:02.180 and don't look at the bigger picture, 1:14:02.180 --> 1:14:05.340 the bigger questions of meaning and so on. 1:14:05.340 --> 1:14:07.460 So let me ask you the impossible question 1:14:09.900 --> 1:14:11.620 of what's the meaning of this thing, 1:14:11.620 --> 1:14:16.740 of life, of your life, of research. 1:14:16.740 --> 1:14:21.460 Why do you think we, descendants of great apes, 1:14:21.460 --> 1:14:24.500 are here on this spinning ball? 1:14:26.820 --> 1:14:30.300 You know, I don't think that I have really a global answer. 1:14:30.300 --> 1:14:32.900 You know, maybe that's why I didn't go to humanities 1:14:33.780 --> 1:14:36.500 and I didn't take humanities classes in my undergrad. 
1:14:39.500 --> 1:14:43.580 But the way I am thinking about it, 1:14:43.580 --> 1:14:48.220 each one of us has inside of them their own set of, 1:14:48.220 --> 1:14:51.140 you know, things that we believe are important. 1:14:51.140 --> 1:14:53.380 And it just happens that we are busy 1:14:53.380 --> 1:14:54.820 with achieving various goals, 1:14:54.820 --> 1:14:56.260 busy listening to others 1:14:56.260 --> 1:14:58.100 and kind of trying to conform, 1:14:58.100 --> 1:15:03.100 to be part of the crowd, that we don't listen to that part. 1:15:04.580 --> 1:15:09.580 And, you know, we all should find some time to understand 1:15:09.580 --> 1:15:11.820 what are our own individual missions, 1:15:11.820 --> 1:15:14.060 and we may have very different missions, 1:15:14.060 --> 1:15:18.180 and to make sure that while we are running 10,000 things, 1:15:18.180 --> 1:15:21.900 we are not, you know, missing out 1:15:21.900 --> 1:15:24.420 and we're putting all the resources 1:15:24.420 --> 1:15:28.500 to satisfy our own mission. 1:15:28.500 --> 1:15:31.500 And if I look over my time, 1:15:31.500 --> 1:15:34.820 when I was younger, most of these missions, 1:15:34.820 --> 1:15:38.620 you know, I was primarily driven by the external stimulus, 1:15:38.620 --> 1:15:41.540 you know, to achieve this or to be that. 1:15:41.540 --> 1:15:46.540 And now a lot of what I do is driven by really thinking 1:15:47.660 --> 1:15:51.380 what is important for me to achieve, independently 1:15:51.380 --> 1:15:55.140 of the external recognition. 1:15:55.140 --> 1:16:00.140 And, you know, I don't mind being viewed in certain ways. 1:16:01.380 --> 1:16:05.740 The most important thing for me is to be true to myself, 1:16:05.740 --> 1:16:07.500 to what I think is right. 1:16:07.500 --> 1:16:08.700 How long did it take? 1:16:08.700 --> 1:16:13.220 How hard was it to find the you that you have to be true to? 
1:16:14.180 --> 1:16:17.740 So it takes time, and even now sometimes, you know, 1:16:17.740 --> 1:16:20.860 the vanity and the triviality can take over, you know. 1:16:20.860 --> 1:16:26.060 Yeah, it can, everywhere, you know, it's just the vanity. 1:16:26.060 --> 1:16:28.140 The vanity is different, the vanity is in different places, 1:16:28.140 --> 1:16:30.940 but we all have our piece of vanity. 1:16:30.940 --> 1:16:34.700 But I think actually for me, 1:16:34.700 --> 1:16:39.700 many times the place to get back to it is, you know, 1:16:41.700 --> 1:16:45.820 when I'm alone and also when I read. 1:16:45.820 --> 1:16:47.740 And I think by selecting the right books, 1:16:47.740 --> 1:16:52.740 you can get the right questions and learn from what you read. 1:16:54.900 --> 1:16:58.060 So, but again, it's not perfect, 1:16:58.060 --> 1:17:02.020 like vanity sometimes dominates. 1:17:02.020 --> 1:17:04.780 Well, that's a beautiful way to end. 1:17:04.780 --> 1:17:06.380 Thank you so much for talking today. 1:17:06.380 --> 1:17:07.820 Thank you. That was fun. 1:17:07.820 --> 1:17:17.820 It was fun.