WEBVTT
00:00.000 --> 00:02.960
The following is a conversation with Kai-Fu Lee.
00:02.960 --> 00:06.520
He's the chairman and CEO of Sinovation Ventures
00:06.520 --> 00:10.560
that manages a $2 billion dual currency investment fund
00:10.560 --> 00:13.160
with a focus on developing the next generation
00:13.160 --> 00:15.440
of Chinese high tech companies.
00:15.440 --> 00:17.840
He's the former president of Google China
00:17.840 --> 00:20.880
and the founder of what is now called Microsoft Research
00:20.880 --> 00:24.160
Asia, an institute that trained many
00:24.160 --> 00:26.520
of the artificial intelligence leaders in China,
00:26.520 --> 00:32.080
including CTOs or AI execs at Baidu, Tencent, Alibaba,
00:32.080 --> 00:34.840
Lenovo, and Huawei.
00:34.840 --> 00:38.520
He was named one of the 100 most influential people
00:38.520 --> 00:40.680
in the world by Time Magazine.
00:40.680 --> 00:43.880
He's the author of seven bestselling books in Chinese
00:43.880 --> 00:47.080
and most recently, the New York Times bestseller called
00:47.080 --> 00:50.600
AI Superpowers: China, Silicon Valley,
00:50.600 --> 00:52.760
and the New World Order.
00:52.760 --> 00:57.200
He has unparalleled experience in working across major tech
00:57.200 --> 01:00.120
companies and governments on applications of AI.
01:00.120 --> 01:02.440
And so he has a unique perspective
01:02.440 --> 01:05.080
on global innovation and the future of AI
01:05.080 --> 01:09.000
that I think is important to listen to and think about.
01:09.000 --> 01:11.960
This is the Artificial Intelligence Podcast.
01:11.960 --> 01:15.240
If you enjoy it, subscribe on YouTube and iTunes,
01:15.240 --> 01:18.880
support it on Patreon, or simply connect with me on Twitter
01:18.880 --> 01:21.040
at Lex Fridman.
01:21.040 --> 01:26.120
And now, here's my conversation with Kai-Fu Lee.
01:26.120 --> 01:29.440
I immigrated from Russia to the US when I was 13.
01:29.440 --> 01:32.480
You immigrated to the US at about the same age.
01:32.480 --> 01:35.920
The Russian people, the American people, the Chinese people,
01:35.920 --> 01:39.440
each have a certain soul, a spirit,
01:39.440 --> 01:42.080
that permeates throughout the generations.
01:42.080 --> 01:45.120
So maybe it's a little bit of a poetic question,
01:45.120 --> 01:49.240
but could you describe your sense of what
01:49.240 --> 01:52.080
defines the Chinese soul?
01:52.080 --> 01:56.160
I think the Chinese soul of people today, right,
01:56.160 --> 02:02.000
we're talking about people who have had centuries of burden
02:02.000 --> 02:05.240
because of the poverty that the country has gone through
02:05.240 --> 02:10.560
and suddenly shined with hope of prosperity
02:10.560 --> 02:13.440
in the past 40 years as China opened up
02:13.440 --> 02:16.440
and embraced market economy.
02:16.440 --> 02:20.200
And undoubtedly, there are two sets of pressures
02:20.200 --> 02:24.160
on the people, that of the tradition,
02:24.160 --> 02:28.040
that of facing difficult situations,
02:28.040 --> 02:31.160
and that of hope of wanting to be the first
02:31.160 --> 02:33.840
to become successful and wealthy,
02:33.840 --> 02:38.360
so that it's a very strong hunger and strong desire
02:38.360 --> 02:41.160
and strong work ethic that drives China forward.
02:41.160 --> 02:43.960
And are there roots to not just this generation,
02:43.960 --> 02:47.880
but before, that's deeper than just
02:47.880 --> 02:50.080
the new economic developments?
02:50.080 --> 02:52.520
Is there something that's unique to China
02:52.520 --> 02:54.960
that you could speak to that's in the people?
02:54.960 --> 02:56.000
Yeah.
02:56.000 --> 03:00.280
Well, the Chinese tradition is about excellence,
03:00.280 --> 03:02.680
dedication, and results.
03:02.680 --> 03:07.240
And the Chinese exams and study subjects in schools
03:07.240 --> 03:11.080
have traditionally started from memorizing 10,000 characters,
03:11.080 --> 03:13.600
not an easy task to start with.
03:13.600 --> 03:17.640
And further by memorizing historic philosophers,
03:17.640 --> 03:19.000
literature, poetry.
03:19.000 --> 03:22.480
So it really is probably the strongest rote
03:22.480 --> 03:26.920
learning mechanism created to make sure people had good memory
03:26.920 --> 03:30.080
and remembered things extremely well.
03:30.080 --> 03:33.720
That, I think, at the same time suppresses
03:33.720 --> 03:37.360
the breakthrough innovation.
03:37.360 --> 03:42.520
And it also enhances the speed of execution to get results.
03:42.520 --> 03:47.400
And that, I think, characterizes the historic basis of China.
03:47.400 --> 03:49.160
That's interesting, because there's echoes of that
03:49.160 --> 03:52.080
in Russian education as well, with rote memorization.
03:52.080 --> 03:53.800
So you memorize a lot of poetry.
03:53.800 --> 03:59.240
I mean, there's just an emphasis on perfection in all forms
03:59.240 --> 04:02.240
that's not conducive to perhaps what you're speaking to,
04:02.240 --> 04:03.640
which is creativity.
04:03.640 --> 04:05.640
But you think that kind of education
04:05.640 --> 04:09.040
holds back the innovative spirit that you
04:09.040 --> 04:10.960
might see in the United States?
04:10.960 --> 04:14.840
Well, it holds back the breakthrough innovative spirit
04:14.840 --> 04:16.480
that we see in the United States.
04:16.480 --> 04:21.880
But it does not hold back the valuable execution oriented,
04:21.880 --> 04:26.320
result oriented, value creating engines, at which we see China
04:26.320 --> 04:27.960
being very successful.
04:27.960 --> 04:32.320
So is there a difference between a Chinese AI engineer
04:32.320 --> 04:35.600
today and an American AI engineer perhaps rooted
04:35.600 --> 04:38.320
in the culture that we just talked about or the education
04:38.320 --> 04:41.160
or the very soul of the people or no?
04:41.160 --> 04:43.720
And what would your advice be to each
04:43.720 --> 04:45.520
if there's a difference?
04:45.520 --> 04:47.120
Well, there's a lot that's similar,
04:47.120 --> 04:51.240
because AI is about mastering sciences,
04:51.240 --> 04:54.880
about using known technologies and trying new things.
04:54.880 --> 04:59.760
But it's also about picking from many parts of possible networks
04:59.760 --> 05:02.920
to use and different types of parameters to tune.
05:02.920 --> 05:05.280
And that part is somewhat rote.
05:05.280 --> 05:09.040
And it is also, as anyone who's built AI products,
05:09.040 --> 05:12.680
can tell you, a lot about cleansing the data.
05:12.680 --> 05:15.200
Because AI runs better with more data.
05:15.200 --> 05:20.160
And data is generally unstructured, errorful,
05:20.160 --> 05:22.360
and unclean.
05:22.360 --> 05:26.280
And the effort to clean the data is immense.
05:26.280 --> 05:32.280
So I think the better part of the American AI engineering
05:32.280 --> 05:36.840
process is to try new things, to do things people haven't done
05:36.840 --> 05:41.840
before, and to use technology to solve most, if not all,
05:41.840 --> 05:43.480
problems.
05:43.480 --> 05:47.160
So to make the algorithm work despite not so great data,
05:47.160 --> 05:50.680
find error tolerant ways to deal with the data.
05:50.680 --> 05:55.960
The Chinese way would be to basically enumerate,
05:55.960 --> 05:58.560
to the fullest extent, all the possible ways
05:58.560 --> 06:01.000
by a lot of machines, try lots of different ways
06:01.000 --> 06:05.320
to get it to work, and spend a lot of resources and money
06:05.320 --> 06:07.720
and time cleaning up data.
06:07.720 --> 06:11.880
That means the AI engineer may be writing data cleansing
06:11.880 --> 06:15.600
algorithms, working with thousands of people
06:15.600 --> 06:19.160
who label or correct or do things with the data.
06:19.160 --> 06:21.920
That is the incredibly hard work that
06:21.920 --> 06:24.040
might lead to better results.
06:24.040 --> 06:28.240
So the Chinese engineer would rely on and ask for more and more
06:28.240 --> 06:31.120
data and find ways to cleanse them and make them work
06:31.120 --> 06:34.200
in the system, and probably less time thinking
06:34.200 --> 06:39.320
about new algorithms that can overcome data or other issues.
06:39.320 --> 06:40.560
So where's your intuition?
06:40.560 --> 06:43.160
Where do you think the biggest impact of the next 10 years
06:43.160 --> 06:43.920
lies?
06:43.920 --> 06:47.120
Is it in some breakthrough algorithms?
06:47.120 --> 06:53.920
Or is it in just this at scale rigor, a rigorous approach
06:53.920 --> 06:57.120
to data, cleaning data, organizing data
06:57.120 --> 06:58.440
to run on the same algorithms?
06:58.440 --> 07:02.600
What do you think the big impact in the applied world is?
07:02.600 --> 07:04.560
Well, if you're really in the company
07:04.560 --> 07:08.400
and you have to deliver results, using known techniques
07:08.400 --> 07:12.240
and enhancing data seems like the more expedient approach
07:12.240 --> 07:15.640
that's very low risk and likely to generate
07:15.640 --> 07:17.200
better and better results.
07:17.200 --> 07:20.520
And that's why the Chinese approach has done quite well.
07:20.520 --> 07:24.240
Now, there are a lot of more challenging startups
07:24.240 --> 07:28.440
and problems, such as autonomous vehicles,
07:28.440 --> 07:32.560
medical diagnosis, that existing algorithms probably
07:32.560 --> 07:34.240
won't solve.
07:34.240 --> 07:38.680
And that would make the Chinese approach more challenged
07:38.680 --> 07:43.720
and give the more breakthrough innovation approach more
07:43.720 --> 07:45.440
of an edge on those kinds of problems.
07:45.440 --> 07:47.040
So let me talk to that a little more.
07:47.040 --> 07:50.960
So my intuition, personally, is that data
07:50.960 --> 07:53.680
can take us extremely far.
07:53.680 --> 07:56.480
So you brought up autonomous vehicles and medical diagnosis.
07:56.480 --> 08:00.080
So your intuition is that huge amounts of data
08:00.080 --> 08:04.000
might not be able to completely help us solve that problem.
08:04.000 --> 08:04.600
Right.
08:04.600 --> 08:08.080
So breaking that down further, autonomous vehicle,
08:08.080 --> 08:10.080
I think huge amounts of data probably
08:10.080 --> 08:13.360
will solve trucks driving on highways, which
08:13.360 --> 08:15.640
will deliver significant value.
08:15.640 --> 08:19.320
And China will probably lead in that.
08:19.320 --> 08:24.880
And full L5 autonomous is likely to require new technologies
08:24.880 --> 08:26.320
we don't yet know.
08:26.320 --> 08:30.320
And that might require academia and great industrial research,
08:30.320 --> 08:32.480
both innovating and working together.
08:32.480 --> 08:35.360
And in that case, US has an advantage.
08:35.360 --> 08:37.040
So the interesting question there is,
08:37.040 --> 08:39.280
I don't know if you're familiar on the autonomous vehicle
08:39.280 --> 08:43.480
space and the developments with Tesla and Elon Musk,
08:43.480 --> 08:49.400
where they are, in fact, going full steam ahead
08:49.400 --> 08:53.480
into this mysterious, complex world of full autonomy, L5,
08:53.480 --> 08:55.080
L4, L5.
08:55.080 --> 08:58.800
And they're trying to solve that purely with data.
08:58.800 --> 09:00.800
So the same kind of thing that you're saying
09:00.800 --> 09:03.200
is just for highway, which is where a lot of people
09:03.200 --> 09:07.200
share your intuition, they're trying to solve with data.
09:07.200 --> 09:09.320
So just to linger on that moment further.
09:09.320 --> 09:13.600
Do you think it's possible for them to achieve success
09:13.600 --> 09:17.040
with simply just a huge amount of this training
09:17.040 --> 09:20.440
on edge cases, on difficult cases in urban environments,
09:20.440 --> 09:22.840
not just highway and so on?
09:22.840 --> 09:24.480
I think it'll be very hard.
09:24.480 --> 09:27.680
One could characterize Tesla's approach as kind
09:27.680 --> 09:31.600
of a Chinese strength approach, gather all the data you can,
09:31.600 --> 09:34.000
and hope that will overcome the problems.
09:34.000 --> 09:38.480
But in autonomous driving, clearly a lot of the decisions
09:38.480 --> 09:41.480
aren't merely solved by aggregating data
09:41.480 --> 09:43.520
and having a feedback loop.
09:43.520 --> 09:48.040
There are things that are more akin to human thinking.
09:48.040 --> 09:51.680
And how would those be integrated and built?
09:51.680 --> 09:54.000
There has not yet been a lot of success
09:54.000 --> 09:57.200
integrating human intelligence or, you know,
09:57.200 --> 09:58.800
so-called expert systems, if you will,
09:58.800 --> 10:02.960
even though that's a taboo word with the machine learning crowd.
10:02.960 --> 10:05.600
And the integration of the two types of thinking
10:05.600 --> 10:07.840
hasn't yet been demonstrated.
10:07.840 --> 10:09.600
And the question is, how much can you
10:09.600 --> 10:12.440
push a purely machine learning approach?
10:12.440 --> 10:15.480
And of course, Tesla also has an additional constraint
10:15.480 --> 10:18.520
that they don't have all the sensors.
10:18.520 --> 10:21.120
I know that they think it's foolish to use LIDARS,
10:21.120 --> 10:25.920
but that's clearly one less very valuable and reliable
10:25.920 --> 10:29.200
source of input that they're foregoing, which
10:29.200 --> 10:32.440
may also have consequences.
10:32.440 --> 10:33.840
I think the advantage, of course,
10:33.840 --> 10:37.040
is capturing data that no one has ever seen before.
10:37.040 --> 10:41.040
And in some cases, such as computer vision and speech
10:41.040 --> 10:44.800
recognition, I have seen Chinese companies accumulate data
10:44.800 --> 10:47.320
that's not seen anywhere in the Western world,
10:47.320 --> 10:50.200
and they have delivered superior results.
10:50.200 --> 10:53.720
But then speech recognition and object recognition
10:53.720 --> 10:57.080
are relatively suitable problems for deep learning
10:57.080 --> 11:02.440
and don't have the potential need for the human intelligence,
11:02.440 --> 11:04.440
analytical planning elements.
11:04.440 --> 11:06.400
And the same on the speech recognition side,
11:06.400 --> 11:09.440
your intuition that speech recognition and the machine
11:09.440 --> 11:11.440
learning approaches to speech recognition
11:11.440 --> 11:14.600
won't take us to a conversational system that
11:14.600 --> 11:19.160
can pass the Turing test, which is maybe akin to what
11:19.160 --> 11:20.040
driving is.
11:20.040 --> 11:25.120
So it needs to have something more than just simple
11:25.120 --> 11:27.480
language understanding, simple language generation.
11:27.480 --> 11:32.000
Roughly right, I would say that based on purely machine
11:32.000 --> 11:35.160
learning approaches, it's hard to imagine it
11:35.160 --> 11:40.520
could lead to a full conversational experience
11:40.520 --> 11:44.600
across arbitrary domains, which is akin to L5.
11:44.600 --> 11:46.920
I'm a little hesitant to use the word Turing test,
11:46.920 --> 11:50.280
because the original definition was probably too easy.
11:50.280 --> 11:52.320
We can probably do that.
11:52.320 --> 11:55.280
The spirit of the Turing test is what I was referring to.
11:55.280 --> 11:56.520
Of course.
11:56.520 --> 11:59.400
So you've had major leadership research positions
11:59.400 --> 12:01.640
at Apple, Microsoft, Google.
12:01.640 --> 12:06.320
So continuing on the discussion of American, Russian, Chinese soul
12:06.320 --> 12:10.520
and culture and so on, what is the culture of Silicon
12:10.520 --> 12:16.400
Valley in contrast to China and maybe US broadly?
12:16.400 --> 12:19.920
And what is the unique culture of each of these three
12:19.920 --> 12:22.040
major companies, in your view?
12:22.040 --> 12:25.120
I think in aggregate, Silicon Valley companies,
12:25.120 --> 12:27.200
we could probably include Microsoft in that,
12:27.200 --> 12:29.120
even though they're not in the Valley,
12:29.120 --> 12:33.960
really dream big and have visionary goals
12:33.960 --> 12:37.920
and believe that technology will conquer all
12:37.920 --> 12:42.240
and also have the self confidence and the self entitlement
12:42.240 --> 12:45.440
that whatever they produce, the whole world should use
12:45.440 --> 12:47.240
and must use.
12:47.240 --> 12:54.080
And those are historically important, I think.
12:54.080 --> 12:59.120
Steve Jobs's famous quote that he doesn't do focus groups.
12:59.120 --> 13:02.360
He looks in the mirror and asks the person in the mirror,
13:02.360 --> 13:03.520
what do you want?
13:03.520 --> 13:07.000
And that really is an inspirational comment
13:07.000 --> 13:10.480
that says the great company shouldn't just ask users
13:10.480 --> 13:13.240
what they want, but develop something
13:13.240 --> 13:16.200
that users will know they want when they see it,
13:16.200 --> 13:18.960
but they could never come up with themselves.
13:18.960 --> 13:23.880
I think that is probably the most exhilarating description
13:23.880 --> 13:26.560
of what the essence of Silicon Valley is,
13:26.560 --> 13:31.840
that this brilliant idea could cause you to build something
13:31.840 --> 13:35.520
that couldn't come out of focus groups or A/B tests.
13:35.520 --> 13:38.040
And iPhone would be an example of that.
13:38.040 --> 13:40.560
No one in the age of BlackBerry would write down
13:40.560 --> 13:43.720
they want an iPhone or multi touch. A browser
13:43.720 --> 13:44.800
might be another example.
13:44.800 --> 13:47.520
No one would say they want that in the days of FTP,
13:47.520 --> 13:49.440
but once they see it, they want it.
13:49.440 --> 13:55.680
So I think that is what Silicon Valley is best at.
13:55.680 --> 13:58.920
But it also came with a lot of success.
13:58.920 --> 14:01.960
These products became global platforms,
14:01.960 --> 14:05.080
and there were basically no competitors anywhere.
14:05.080 --> 14:08.400
And that has also led to a belief
14:08.400 --> 14:13.240
that these are the only things that one should do,
14:13.240 --> 14:17.960
that companies should not tread on other companies' territory,
14:17.960 --> 14:24.040
so that a Groupon and a Yelp and an OpenTable
14:24.040 --> 14:26.240
and a Grubhub would each feel,
14:26.240 --> 14:28.520
okay, I'm not going to do the other companies business
14:28.520 --> 14:33.280
because that would not be the pride of innovating
14:33.280 --> 14:36.920
what each of these four companies have innovated.
14:36.920 --> 14:42.720
But I think the Chinese approach is do whatever it takes to win.
14:42.720 --> 14:45.000
And it's a winner take all market.
14:45.000 --> 14:47.200
And in fact, in the internet space,
14:47.200 --> 14:50.840
the market leader will get predominantly all the value
14:50.840 --> 14:53.320
extracted out of the system.
14:53.320 --> 14:59.600
And the system isn't just defined as one narrow category,
14:59.600 --> 15:01.360
but gets broader and broader.
15:01.360 --> 15:07.960
So it's an amazing ambition for success and domination
15:07.960 --> 15:11.760
of increasingly larger product categories
15:11.760 --> 15:15.080
leading to clear market winner status
15:15.080 --> 15:19.120
and the opportunity to extract tremendous value.
15:19.120 --> 15:25.840
And that develops a practical, result oriented,
15:25.840 --> 15:31.520
ultra ambitious winner take all gladiatorial mentality.
15:31.520 --> 15:37.400
And if what it takes is to build what the competitors built,
15:37.400 --> 15:41.920
essentially a copycat, that can be done without infringing laws.
15:41.920 --> 15:46.280
If what it takes is to satisfy a foreign country's need
15:46.280 --> 15:48.480
by forking the code base and building something
15:48.480 --> 15:51.440
that looks really ugly and different, they'll do it.
15:51.440 --> 15:56.280
So it's contrasted very sharply with the Silicon Valley approach.
15:56.280 --> 16:00.080
And I think the flexibility and the speed and execution
16:00.080 --> 16:01.960
have helped the Chinese approach.
16:01.960 --> 16:05.040
And I think the Silicon Valley approach
16:05.040 --> 16:10.280
is potentially challenged if every Chinese entrepreneur is
16:10.280 --> 16:13.200
learning from the whole world, US and China,
16:13.200 --> 16:16.280
and the American entrepreneurs only look internally
16:16.280 --> 16:19.600
and write off China as a copycat.
16:19.600 --> 16:22.880
And the second part of your question about the three
16:22.880 --> 16:23.520
companies.
16:23.520 --> 16:26.000
The unique elements of the three companies, perhaps.
16:26.000 --> 16:26.840
Yeah.
16:26.840 --> 16:33.080
I think Apple represents wow the user, please the user,
16:33.080 --> 16:38.520
and the essence of design and brand,
16:38.520 --> 16:44.080
and it's the one company and perhaps the only tech company
16:44.080 --> 16:49.920
that draws people with a strong, serious desire
16:49.920 --> 16:53.560
for the product and the willingness to pay a premium
16:53.560 --> 16:57.160
because of the halo effect of the brand, which
16:57.160 --> 17:00.960
came from the attention to detail and great respect
17:00.960 --> 17:03.360
for user needs.
17:03.360 --> 17:09.200
Microsoft represents a platform approach
17:09.200 --> 17:14.280
that builds giant products that become very strong moats
17:14.280 --> 17:17.680
that others can't do because it's
17:17.680 --> 17:21.480
well architected at the bottom level
17:21.480 --> 17:26.640
and the work is efficiently delegated to individuals
17:26.640 --> 17:30.360
and then the whole product is built
17:30.360 --> 17:33.560
by adding small parts that sum together.
17:33.560 --> 17:37.760
So it's probably the most effective high tech assembly
17:37.760 --> 17:40.480
line that builds a very difficult product
17:40.480 --> 17:44.800
and the whole process of doing that
17:44.800 --> 17:50.800
is kind of a differentiation and something competitors
17:50.800 --> 17:52.480
can't easily repeat.
17:52.480 --> 17:54.800
Are there elements of the Chinese approach
17:54.800 --> 17:59.280
in the way Microsoft went about assembling those little pieces
17:59.280 --> 18:03.920
and essentially dominating the market for a long time?
18:03.920 --> 18:05.640
Or do you see those as distinct?
18:05.640 --> 18:08.240
I think there are elements that are the same.
18:08.240 --> 18:10.440
I think the three American companies
18:10.440 --> 18:13.880
that had or have Chinese characteristics,
18:13.880 --> 18:16.080
and obviously as well as American characteristics,
18:16.080 --> 18:20.400
are Microsoft, Facebook, and Amazon.
18:20.400 --> 18:21.720
Yes, that's right, Amazon.
18:21.720 --> 18:25.560
Because these are companies that will tenaciously
18:25.560 --> 18:31.320
go after adjacent markets, build up strong product offering,
18:31.320 --> 18:38.200
and find ways to extract greater value from a sphere that's
18:38.200 --> 18:39.960
ever increasing.
18:39.960 --> 18:43.520
And they understand the value of the platforms.
18:43.520 --> 18:45.600
So that's the similarity.
18:45.600 --> 18:53.760
And then with Google, I think it's a genuinely value oriented
18:53.760 --> 18:56.960
company that does have a heart and soul
18:56.960 --> 18:59.760
and that wants to do great things for the world
18:59.760 --> 19:06.040
by connecting information and that has also
19:06.040 --> 19:13.280
very strong technology genes and wants to use technology
19:13.280 --> 19:19.080
and has found out of the box ways to use technology
19:19.080 --> 19:23.680
to deliver incredible value to the end user.
19:23.680 --> 19:25.240
We can look at Google, for example.
19:25.240 --> 19:28.040
You mentioned heart and soul.
19:28.040 --> 19:31.840
There seems to be an element where Google
19:31.840 --> 19:34.840
is after making the world better.
19:34.840 --> 19:36.520
There's a more positive view.
19:36.520 --> 19:38.960
I mean, they used to have the slogan, don't be evil.
19:38.960 --> 19:43.120
And Facebook a little bit more has a negative tinge to it,
19:43.120 --> 19:46.000
at least in the perception of privacy and so on.
19:46.000 --> 19:51.280
Do you have a sense of how these different companies can
19:51.280 --> 19:53.400
achieve, because you've talked about how much
19:53.400 --> 19:55.600
we can make the world better in all these kinds of ways
19:55.600 --> 19:59.360
with AI, what is it about a company that can
19:59.360 --> 20:03.200
give it a heart and soul, gain the trust of the public,
20:03.200 --> 20:08.000
and actually just not be evil and do good for the world?
20:08.000 --> 20:09.000
It's really hard.
20:09.000 --> 20:13.120
And I think Google has struggled with that.
20:13.120 --> 20:15.160
First, their don't do evil
20:15.160 --> 20:18.880
mantra is very dangerous, because every employee's
20:18.880 --> 20:20.800
definition of evil is different.
20:20.800 --> 20:23.800
And that has led to some difficult employee situations
20:23.800 --> 20:25.240
for them.
20:25.240 --> 20:29.520
So I don't necessarily think that's a good value statement.
20:29.520 --> 20:31.840
But just watching the kinds of things
20:31.840 --> 20:36.440
Google or its parent company Alphabet does in new areas
20:36.440 --> 20:40.440
like health care, like eradicating mosquitoes,
20:40.440 --> 20:42.360
things that are really not in the business
20:42.360 --> 20:45.040
of an internet tech company, I think
20:45.040 --> 20:47.200
that shows that there is a heart and soul
20:47.200 --> 20:53.920
and desire to do good and willingness to put in the resources
20:53.920 --> 20:58.280
to do something when they see it's good, they will pursue it.
20:58.280 --> 21:00.640
That doesn't necessarily mean it has
21:00.640 --> 21:02.520
all the trust of the users.
21:02.520 --> 21:06.400
I realize that while most people would view Facebook
21:06.400 --> 21:09.760
as the primary target of their recent unhappiness
21:09.760 --> 21:12.720
about Silicon Valley companies, many would put Google
21:12.720 --> 21:14.080
in that category.
21:14.080 --> 21:16.800
And some have named Google's business practices
21:16.800 --> 21:19.840
as predatory also.
21:19.840 --> 21:24.240
So it's kind of difficult to have the two parts of a body.
21:24.240 --> 21:28.080
The brain wants to do what it's supposed to do for a shareholder,
21:28.080 --> 21:29.280
maximize profit.
21:29.280 --> 21:30.880
And then the heart and soul wants
21:30.880 --> 21:36.120
to do good things that may run against what the brain wants to do.
21:36.120 --> 21:40.320
So in this complex balancing that these companies have to do,
21:40.320 --> 21:44.520
you've mentioned that you're concerned about a future where
21:44.520 --> 21:47.360
too few companies like Google, Facebook, Amazon
21:47.360 --> 21:51.560
are controlling our data or are controlling too much
21:51.560 --> 21:53.360
of our digital lives.
21:53.360 --> 21:55.400
Can you elaborate on this concern?
21:55.400 --> 21:58.640
Perhaps do you have a better way forward?
21:58.640 --> 22:05.000
I think I'm hardly the most vocal complainer of this.
22:05.000 --> 22:07.280
There are a lot of louder complainers out there.
22:07.280 --> 22:11.840
I do observe that having a lot of data
22:11.840 --> 22:16.120
does perpetuate their strength and limits
22:16.120 --> 22:19.400
competition in many spaces.
22:19.400 --> 22:24.200
But I also believe AI is much broader than the internet space.
22:24.200 --> 22:26.280
So the entrepreneurial opportunities
22:26.280 --> 22:30.480
still exist in using AI to empower
22:30.480 --> 22:34.160
financial, retail, manufacturing, and education
22:34.160 --> 22:35.480
applications.
22:35.480 --> 22:39.800
So I don't think it's quite a case of full monopolistic dominance
22:39.800 --> 22:43.960
that totally stifles innovation.
22:43.960 --> 22:46.400
But I do believe in their areas of strength
22:46.400 --> 22:49.760
it's hard to dislodge them.
22:49.760 --> 22:53.280
I don't know if I have a good solution.
22:53.280 --> 22:57.160
Probably the best solution is let the entrepreneurial VC
22:57.160 --> 23:00.840
ecosystem work well and find all the places that
23:00.840 --> 23:04.200
can create the next Google, the next Facebook.
23:04.200 --> 23:08.560
So there will always be an increasing number of challengers.
23:08.560 --> 23:11.360
In some sense, that has happened a little bit.
23:11.360 --> 23:15.760
You see Uber, Airbnb having emerged despite the strength
23:15.760 --> 23:19.040
of the big three.
23:19.040 --> 23:22.400
And I think China as an environment
23:22.400 --> 23:25.280
may be more interesting for the emergence.
23:25.280 --> 23:28.920
Because if you look at companies between, let's say,
23:28.920 --> 23:36.320
$50 and $300 billion, China has had more such companies emerge
23:36.320 --> 23:39.880
than the US in the last three to four years.
23:39.880 --> 23:42.120
Because of the larger marketplace,
23:42.120 --> 23:47.000
because of the more fearless nature of the entrepreneurs.
23:47.000 --> 23:50.840
And the Chinese giants are just as powerful as American ones.
23:50.840 --> 23:52.920
Tencent and Alibaba are very strong.
23:52.920 --> 23:57.040
But ByteDance has emerged, worth $75 billion.
23:57.040 --> 24:00.120
And Ant Financial, while it's Alibaba affiliated,
24:00.120 --> 24:03.920
it's nevertheless independent and worth $150 billion.
24:03.920 --> 24:08.280
And so I do think if we start to extend
24:08.280 --> 24:12.640
to traditional businesses, we will see very valuable companies.
24:12.640 --> 24:18.120
So it's probably not the case that in five or 10 years,
24:18.120 --> 24:20.920
we'll still see the whole world with these five companies
24:20.920 --> 24:22.680
having such dominance.
24:22.680 --> 24:26.040
So you've mentioned a couple of times
24:26.040 --> 24:27.840
this fascinating world of entrepreneurship
24:27.840 --> 24:31.080
in China of the fearless nature of the entrepreneurs.
24:31.080 --> 24:32.640
So can you maybe talk a little bit
24:32.640 --> 24:35.520
about what it takes to be an entrepreneur in China?
24:35.520 --> 24:38.240
What are the strategies that are undertaken?
24:38.240 --> 24:41.120
What are the ways that you achieve success?
24:41.120 --> 24:43.960
What is the dynamic of VC funding,
24:43.960 --> 24:46.480
of the way the government helps companies, and so on?
24:46.480 --> 24:49.520
What are the interesting aspects here that are distinct from,
24:49.520 --> 24:52.880
that are different from the Silicon Valley world
24:52.880 --> 24:55.240
of entrepreneurship?
24:55.240 --> 24:58.080
Well, many of the listeners probably
24:58.080 --> 25:03.000
still would brand Chinese entrepreneurs as copycats.
25:03.000 --> 25:06.120
And no doubt, 10 years ago, that would not
25:06.120 --> 25:09.080
be an inaccurate description.
25:09.080 --> 25:12.320
Back 10 years ago, an entrepreneur probably
25:12.320 --> 25:14.840
could not get funding if he or she could not
25:14.840 --> 25:20.400
describe what product he or she is copying from the US.
25:20.400 --> 25:23.520
The first question is, who has proven this business model,
25:23.520 --> 25:27.200
which is a nice way of asking, who are you copying?
25:27.200 --> 25:29.520
And that reason is understandable,
25:29.520 --> 25:34.840
because China had a much lower internet penetration
25:34.840 --> 25:40.920
and didn't have enough indigenous experience
25:40.920 --> 25:43.200
to build innovative products.
25:43.200 --> 25:47.600
And secondly, internet was emerging.
25:47.600 --> 25:49.800
Lean startup was the way to do things,
25:49.800 --> 25:52.920
building a first minimally viable product,
25:52.920 --> 25:55.320
and then expanding was the right way to go.
25:55.320 --> 25:59.480
And the American successes have given a shortcut
25:59.480 --> 26:02.840
that if you build your minimally viable product based
26:02.840 --> 26:05.040
on an American product, it's guaranteed
26:05.040 --> 26:06.720
to be a decent starting point.
26:06.720 --> 26:08.400
Then you tweak it afterwards.
26:08.400 --> 26:11.720
So as long as there is no IP infringement, which,
26:11.720 --> 26:15.080
as far as I know, there hasn't been in the mobile and AI
26:15.080 --> 26:19.360
spaces, that's a much better shortcut.
26:19.360 --> 26:23.720
And I think Silicon Valley would view that as still not
26:23.720 --> 26:29.200
very honorable, because that's not your own idea to start with.
26:29.200 --> 26:32.600
But you can't really, at the same time,
26:32.600 --> 26:35.160
believe every idea must be your own
26:35.160 --> 26:38.120
and believe in the lean startup methodology,
26:38.120 --> 26:41.880
because lean startup is intended to try many, many things
26:41.880 --> 26:44.240
and then converge on what works.
26:44.240 --> 26:46.720
And it's meant to be iterated and changed.
26:46.720 --> 26:51.240
So finding a decent starting point without legal violations,
26:51.240 --> 26:55.520
there should be nothing morally dishonorable about that.
26:55.520 --> 26:57.080
So just a quick pause on that.
26:57.080 --> 27:01.920
It's fascinating. Why is that not honorable, right?
27:01.920 --> 27:04.680
It's exactly as you formulated.
27:04.680 --> 27:08.040
It seems like a perfect start for business
27:08.040 --> 27:12.440
is to take a look at Amazon and say, OK,
27:12.440 --> 27:14.560
we'll do exactly what Amazon is doing.
27:14.560 --> 27:16.800
Let's start there in this particular market.
27:16.800 --> 27:20.520
And then let's out innovate them from that starting point.
27:20.520 --> 27:22.200
Yes. Come up with new ways.
27:22.200 --> 27:26.520
I mean, the word copycat just
27:26.520 --> 27:28.800
sounds bad, but is it wrong to be a copycat?
27:28.800 --> 27:31.640
It just seems like a smart strategy.
27:31.640 --> 27:35.800
But yes, doesn't have a heroic nature to it
27:35.800 --> 27:42.280
that Steve Jobs, Elon Musk sort of have, in
27:42.280 --> 27:43.880
coming up with something completely new.
27:43.880 --> 27:45.480
Yeah, I like the way you describe it.
27:45.480 --> 27:50.440
It's a nonheroic, acceptable way to start the company.
27:50.440 --> 27:52.840
And maybe more expedient.
27:52.840 --> 27:58.920
So that's, I think, a baggage for Silicon Valley,
27:58.920 --> 28:01.320
that if it doesn't let go, then it
28:01.320 --> 28:05.160
may limit the ultimate ceiling of the company.
28:05.160 --> 28:07.200
Take Snapchat as an example.
28:07.200 --> 28:09.840
I think Evan's brilliant.
28:09.840 --> 28:11.480
He built a great product.
28:11.480 --> 28:14.160
But he's very proud that he wants
28:14.160 --> 28:16.800
to build his own features, not copy others.
28:16.800 --> 28:21.000
While Facebook was more willing to copy his features,
28:21.000 --> 28:23.440
and you see what happens in the competition.
28:23.440 --> 28:27.440
So I think putting that handcuff on the company
28:27.440 --> 28:31.560
would limit its ability to reach the maximum potential.
28:31.560 --> 28:33.800
So back to the Chinese environment,
28:33.800 --> 28:38.400
copying was merely a way to learn from the American masters.
28:38.400 --> 28:43.480
Just like if we learned to play piano or painting,
28:43.480 --> 28:44.560
you start by copying.
28:44.560 --> 28:46.160
You don't start by innovating when
28:46.160 --> 28:48.200
you don't have the basic skill sets.
28:48.200 --> 28:51.040
So very amazingly, the Chinese entrepreneurs
28:51.040 --> 28:56.160
about six years ago started to branch off
28:56.160 --> 28:59.520
with these lean startups built on American ideas
28:59.520 --> 29:02.280
to build better products than American products.
29:02.280 --> 29:04.960
But they did start from the American idea.
29:04.960 --> 29:08.600
And today, WeChat is better than WhatsApp.
29:08.600 --> 29:10.520
Weibo is better than Twitter.
29:10.520 --> 29:12.920
Zhihu is better than Quora, and so on.
29:12.920 --> 29:17.000
So that, I think, is Chinese entrepreneurs
29:17.000 --> 29:18.480
going to step two.
29:18.480 --> 29:21.760
And then step three is once these entrepreneurs have
29:21.760 --> 29:23.720
done one or two of these companies,
29:23.720 --> 29:27.400
they now look at the Chinese market and the opportunities
29:27.400 --> 29:30.600
and come up with ideas that didn't exist elsewhere.
29:30.600 --> 29:36.320
So products like Ant Financial, which includes Alipay,
29:36.320 --> 29:42.080
which is mobile payments, and also the financial products
29:42.080 --> 29:48.560
for loans built on that, and also, in education, VIPKid,
29:48.560 --> 29:54.880
and in social video, social network, TikTok,
29:54.880 --> 29:58.640
and in social eCommerce, Pinduoduo,
29:58.640 --> 30:01.720
and then in bike sharing, Mobike.
30:01.720 --> 30:05.640
These are all Chinese innovative products
30:05.640 --> 30:08.720
that now are being copied elsewhere.
30:08.720 --> 30:13.040
So an additional interesting observation
30:13.040 --> 30:16.000
is some of these products are built on unique Chinese
30:16.000 --> 30:19.360
demographics, which may not work in the US,
30:19.360 --> 30:23.160
but may work very well in Southeast Asia, Africa,
30:23.160 --> 30:27.840
and other developing worlds that are a few years behind China.
30:27.840 --> 30:31.040
And a few of these products maybe are universal
30:31.040 --> 30:33.760
and are getting traction even in the United States,
30:33.760 --> 30:35.360
such as TikTok.
30:35.360 --> 30:42.080
So this whole ecosystem is supported by VCs
30:42.080 --> 30:44.920
as a virtuous cycle, because a large market
30:44.920 --> 30:49.400
with innovative entrepreneurs will draw a lot of money
30:49.400 --> 30:51.560
and then invest in these companies.
30:51.560 --> 30:54.480
As the market gets larger and larger,
30:54.480 --> 30:58.400
the China market is easily three, four times larger than the US.
30:58.400 --> 31:01.120
They will create greater value and greater returns
31:01.120 --> 31:05.400
for the VCs, thereby raising even more money.
31:05.400 --> 31:10.000
So at Sinovation Ventures, our first fund was $15 million.
31:10.000 --> 31:12.040
Our last fund was $500 million.
31:12.040 --> 31:16.520
So it reflects the valuation of the companies
31:16.520 --> 31:19.840
and us going multi stage and things like that.
31:19.840 --> 31:23.840
It also has government support, but not
31:23.840 --> 31:26.080
in the way most Americans would think of it.
31:26.080 --> 31:29.520
The government actually leaves the entrepreneurial space
31:29.520 --> 31:33.200
to private enterprise, so it's self regulating.
31:33.200 --> 31:36.200
And the government would build infrastructure
31:36.200 --> 31:39.320
around it to make it work better.
31:39.320 --> 31:41.960
For example, the mass entrepreneurship, mass innovation
31:41.960 --> 31:44.880
plan builds 8,000 incubators.
31:44.880 --> 31:48.360
So the pipeline is very strong to the VCs.
31:48.360 --> 31:49.680
For autonomous vehicles,
31:49.680 --> 31:53.280
the Chinese government is building smart highways
31:53.280 --> 31:56.680
with sensors, smart cities that separate pedestrians
31:56.680 --> 32:01.560
from cars, which may allow an initially inferior autonomous
32:01.560 --> 32:05.760
vehicle company to launch a car
32:05.760 --> 32:11.520
with lower casualties, because the roads or the city are smart.
32:11.520 --> 32:13.800
And the Chinese government at local levels
32:13.800 --> 32:17.360
would have these guiding funds acting as LPs,
32:17.360 --> 32:19.400
passive LPs to funds.
32:19.400 --> 32:23.240
And when the fund makes money, part of the money made
32:23.240 --> 32:27.280
is given back to the GPs and potentially other LPs
32:27.280 --> 32:31.960
to increase everybody's return at the expense
32:31.960 --> 32:33.680
of the government's return.
32:33.680 --> 32:36.360
So that's an interesting incentive
32:36.360 --> 32:41.640
that entrusts the task of choosing entrepreneurs to VCs
32:41.640 --> 32:43.800
who are better at it than the government
32:43.800 --> 32:46.680
by letting some of the profits move that way.
32:46.680 --> 32:48.720
So this is really fascinating, right?
32:48.720 --> 32:51.800
So I look at the Russian government as a case study
32:51.800 --> 32:54.480
where, let me put it this way, there
32:54.480 --> 32:58.520
is no such government driven, large scale
32:58.520 --> 33:00.840
support of entrepreneurship.
33:00.840 --> 33:04.000
And probably the same is true in the United States.
33:04.000 --> 33:07.640
But the entrepreneurs themselves kind of find a way.
33:07.640 --> 33:11.680
So maybe in a form of advice or explanation,
33:11.680 --> 33:15.560
how did the Chinese government arrive to be this way,
33:15.560 --> 33:17.680
so supportive of entrepreneurship,
33:17.680 --> 33:21.520
to be in this particular way so forward thinking
33:21.520 --> 33:23.120
at such a large scale?
33:23.120 --> 33:28.280
And also perhaps, how can we copy it in other countries?
33:28.280 --> 33:29.800
How can we encourage other governments,
33:29.800 --> 33:31.600
like even the United States government,
33:31.600 --> 33:33.760
to support infrastructure for autonomous vehicles
33:33.760 --> 33:36.040
in that same kind of way, perhaps?
33:36.040 --> 33:36.680
Yes.
33:36.680 --> 33:44.440
So these techniques are the result of several key things,
33:44.440 --> 33:46.480
some of which may be learnable, some of which
33:46.480 --> 33:48.440
may be very hard.
33:48.440 --> 33:51.080
One is just trial and error and watching
33:51.080 --> 33:52.960
what everyone else is doing.
33:52.960 --> 33:54.960
I think it's important to be humble and not
33:54.960 --> 33:56.920
feel like you know all the answers.
33:56.920 --> 33:59.480
The guiding funds idea came from Singapore,
33:59.480 --> 34:01.440
which came from Israel.
34:01.440 --> 34:06.080
And China made a few tweaks and turned it into a competition,
34:06.080 --> 34:09.600
because the Chinese cities and government officials kind
34:09.600 --> 34:11.320
of compete with each other.
34:11.320 --> 34:14.640
Because they all want to make their city more successful,
34:14.640 --> 34:20.280
so they can get the next level in their political career.
34:20.280 --> 34:22.320
And it's somewhat competitive.
34:22.320 --> 34:25.200
So the central government made it a bit of a competition.
34:25.200 --> 34:26.840
Everybody has a budget.
34:26.840 --> 34:29.840
They can put it on AI, or they can put it on bio,
34:29.840 --> 34:32.200
or they can put it on energy.
34:32.200 --> 34:35.040
And then whoever gets the results, the city shines,
34:35.040 --> 34:38.000
the people are better off, the mayor gets a promotion.
34:38.000 --> 34:41.680
So this tool is kind of almost like an entrepreneurial
34:41.680 --> 34:44.840
environment for local governments
34:44.840 --> 34:47.480
to see who can do a better job.
34:47.480 --> 34:52.440
And also, many of them tried different experiments.
34:52.440 --> 34:58.440
Some have given awards to very smart researchers,
34:58.440 --> 35:00.840
just give them money and hope they'll start a company.
35:00.840 --> 35:05.840
Some have given money to academic research labs,
35:05.840 --> 35:08.440
maybe government research labs, to see
35:08.440 --> 35:11.920
if they can spin off some companies from the science
35:11.920 --> 35:14.040
lab or something like that.
35:14.040 --> 35:17.080
Some have tried to recruit overseas Chinese
35:17.080 --> 35:18.960
to come back and start companies.
35:18.960 --> 35:20.960
And they've had mixed results.
35:20.960 --> 35:23.400
The one that worked the best was the guiding funds.
35:23.400 --> 35:25.840
So it's almost like a lean startup idea
35:25.840 --> 35:29.160
where people try different things, and what works sticks,
35:29.160 --> 35:30.600
and everybody copies.
35:30.600 --> 35:32.880
So now every city has a guiding fund.
35:32.880 --> 35:35.680
So that's how that came about.
35:35.680 --> 35:40.400
The autonomous vehicle and the massive spending
35:40.400 --> 35:46.080
in highways and smart cities, that's a Chinese way.
35:46.080 --> 35:49.480
It's about building infrastructure to facilitate.
35:49.480 --> 35:52.840
It's a clear division of the government's responsibility
35:52.840 --> 35:55.400
from the market.
35:55.400 --> 36:00.560
The market should do everything in a private, free way.
36:00.560 --> 36:02.920
But there are things the market can't afford to do,
36:02.920 --> 36:04.520
like infrastructure.
36:04.520 --> 36:08.000
So the government always appropriates
36:08.000 --> 36:12.000
large amounts of money for infrastructure building.
36:12.000 --> 36:16.880
This happens with not only autonomous vehicle and AI,
36:16.880 --> 36:20.840
but happened with the 3G and 4G.
36:20.840 --> 36:25.320
You'll find that the Chinese wireless reception
36:25.320 --> 36:28.760
is better than in the US, because of massive spending that
36:28.760 --> 36:30.720
tries to cover the whole country.
36:30.720 --> 36:34.360
Whereas in the US, it may be a little spotty.
36:34.360 --> 36:36.160
It's government driven, because I think
36:36.160 --> 36:44.120
they view the coverage of cell access and 3G, 4G access
36:44.120 --> 36:47.080
to be a governmental infrastructure spending,
36:47.080 --> 36:49.880
as opposed to capitalistic.
36:49.880 --> 36:52.160
So of course, the state owned enterprises
36:52.160 --> 36:55.000
are also publicly traded, but they also
36:55.000 --> 36:57.720
carry a government responsibility
36:57.720 --> 37:00.240
to deliver infrastructure to all.
37:00.240 --> 37:01.880
So it's a different way of thinking
37:01.880 --> 37:05.400
that may be very hard to inject into Western countries
37:05.400 --> 37:09.280
to say starting tomorrow, bandwidth infrastructure
37:09.280 --> 37:13.840
and highways are going to be governmental spending
37:13.840 --> 37:16.240
with some characteristics.
37:16.240 --> 37:18.240
What's your sense, and sorry to interrupt,
37:18.240 --> 37:21.680
but because it's such a fascinating point,
37:21.680 --> 37:25.600
do you think on the autonomous vehicle space
37:25.600 --> 37:30.120
it's possible to solve the problem of full autonomy
37:30.120 --> 37:34.040
without significant investment in infrastructure?
37:34.040 --> 37:36.400
Well, that's really hard to speculate.
37:36.400 --> 37:38.960
I think it's not a yes, no question,
37:38.960 --> 37:41.920
but a how long does it take question.
37:41.920 --> 37:45.120
15 years, 30 years, 45 years.
37:45.120 --> 37:48.960
Clearly with infrastructure augmentation,
37:48.960 --> 37:52.320
whether it's roads, the city, or whole city planning,
37:52.320 --> 37:56.440
building a new city, I'm sure that will accelerate
37:56.440 --> 37:59.040
the day of the L5.
37:59.040 --> 38:01.520
I'm not knowledgeable enough, and it's
38:01.520 --> 38:03.920
hard to predict even when we're knowledgeable,
38:03.920 --> 38:07.120
because a lot of it is speculative.
38:07.120 --> 38:09.800
But in the US, I don't think people
38:09.800 --> 38:13.240
would consider building a new city the size of Chicago
38:13.240 --> 38:15.920
to make it the AI slash autonomous city.
38:15.920 --> 38:18.840
There are smaller ones being built, I'm aware of that.
38:18.840 --> 38:21.280
But is infrastructure spend really
38:21.280 --> 38:23.720
impossible for US or Western countries?
38:23.720 --> 38:25.680
I don't think so.
38:25.680 --> 38:28.920
The US highway system was built.
38:28.920 --> 38:31.960
Was that during President Eisenhower or Kennedy?
38:31.960 --> 38:33.160
Eisenhower, yeah.
38:33.160 --> 38:38.960
So maybe historians can study how President Eisenhower
38:38.960 --> 38:42.960
got the resources to build this massive infrastructure that
38:42.960 --> 38:47.560
surely gave the US a tremendous amount of prosperity
38:47.560 --> 38:50.800
over the next decade, if not century.
38:50.800 --> 38:53.240
If I may comment on that, then, it
38:53.240 --> 38:54.880
takes us to artificial intelligence
38:54.880 --> 38:58.080
a little bit, because building infrastructure
38:58.080 --> 39:00.520
creates a lot of jobs.
39:00.520 --> 39:02.840
So I'd actually be interested in what you
39:02.840 --> 39:06.120
would say. You're talking in your book about all kinds
39:06.120 --> 39:08.960
of jobs that could and could not be automated.
39:08.960 --> 39:12.000
I wonder if building infrastructure
39:12.000 --> 39:15.720
is one of the jobs that would not be easily automated,
39:15.720 --> 39:18.160
something you can think about, because I think you've mentioned
39:18.160 --> 39:21.160
somewhere in a talk that there
39:21.160 --> 39:24.280
might be, as jobs are being automated,
39:24.280 --> 39:28.160
a role for government to create jobs that can't be automated.
39:28.160 --> 39:31.040
Yes, I think that's a possibility.
39:31.040 --> 39:34.280
Back in the last financial crisis,
39:34.280 --> 39:40.320
China put a lot of money to basically give this economy
39:40.320 --> 39:45.520
a boost, and a lot of it went into infrastructure building.
39:45.520 --> 39:49.920
And I think that's a legitimate way, at the government level,
39:49.920 --> 39:55.680
to deal with the employment issues as well as build out
39:55.680 --> 39:58.960
the infrastructure, as long as the infrastructures are truly
39:58.960 --> 40:03.160
needed, and as long as there is an employment problem, which
40:03.160 --> 40:04.960
we don't know.
40:04.960 --> 40:07.920
So maybe taking a little step back,
40:07.920 --> 40:12.840
you've been a leader and a researcher in AI
40:12.840 --> 40:16.200
for several decades, at least 30 years,
40:16.200 --> 40:21.040
so how has AI changed in the West and the East
40:21.040 --> 40:23.120
as you've observed, as you've been deep in it
40:23.120 --> 40:25.120
over the past 30 years?
40:25.120 --> 40:28.520
Well, AI began as the pursuit of understanding
40:28.520 --> 40:34.160
human intelligence, and the term itself represents that.
40:34.160 --> 40:37.680
But it kind of drifted into the one subarea that
40:37.680 --> 40:40.880
worked extremely well, which is machine intelligence.
40:40.880 --> 40:45.080
And that's actually more using pattern recognition techniques
40:45.080 --> 40:51.280
to basically do incredibly well on a limited domain,
40:51.280 --> 40:54.840
with a large amount of data, but relatively simple kinds
40:54.840 --> 40:58.720
of planning tasks, and not very creative.
40:58.720 --> 41:02.480
So we didn't end up building human intelligence.
41:02.480 --> 41:04.760
We built a different machine that
41:04.760 --> 41:08.040
was a lot better than us on some problems,
41:08.040 --> 41:11.840
but nowhere close to us on other problems.
41:11.840 --> 41:14.200
So today, I think a lot of people still
41:14.200 --> 41:18.080
misunderstand when we say artificial intelligence
41:18.080 --> 41:20.720
and what various products can do.
41:20.720 --> 41:24.160
People still think it's about replicating human intelligence.
41:24.160 --> 41:26.160
But the products out there really
41:26.160 --> 41:31.680
are closer to having invented the internet or the spreadsheet
41:31.680 --> 41:35.360
or the database and getting broader adoption.
41:35.360 --> 41:38.400
And speaking further to the fears, near term fears
41:38.400 --> 41:41.240
that people have about AI, so you're commenting
41:41.240 --> 41:45.680
on the general intelligence that people
41:45.680 --> 41:48.040
in the popular culture from sci fi movies
41:48.040 --> 41:50.920
have a sense about AI, but there are practical fears
41:50.920 --> 41:54.800
about AI, the kind of narrow AI that you're talking about
41:54.800 --> 41:57.280
of automating particular kinds of jobs,
41:57.280 --> 41:59.400
and you talk about them in the book.
41:59.400 --> 42:01.520
So what are the kinds of jobs in your view
42:01.520 --> 42:04.840
that you see in the next five, 10 years beginning
42:04.840 --> 42:09.240
to be automated by AI systems and algorithms?
42:09.240 --> 42:13.000
Yes, this is also maybe a little bit counterintuitive
42:13.000 --> 42:15.440
because it's the routine jobs that
42:15.440 --> 42:18.360
will be displaced the soonest.
42:18.360 --> 42:23.120
And they may not be displaced entirely, maybe 50%, 80%
42:23.120 --> 42:26.320
of a job, but when the workload drops by that much,
42:26.320 --> 42:28.760
employment will come down.
42:28.760 --> 42:31.520
And also another part of misunderstanding
42:31.520 --> 42:35.720
is most people think of AI replacing routine jobs,
42:35.720 --> 42:38.760
then they think of the assembly line, the workers.
42:38.760 --> 42:40.960
Well, that will have some effects,
42:40.960 --> 42:44.600
but it's actually the routine white collar workers that are
42:44.600 --> 42:49.280
easiest to replace because to replace a white collar worker,
42:49.280 --> 42:50.720
you just need software.
42:50.720 --> 42:53.120
To replace a blue collar worker,
42:53.120 --> 42:57.200
you need robotics, mechanical excellence,
42:57.200 --> 43:01.880
and the ability to deal with dexterity,
43:01.880 --> 43:05.640
and maybe even unknown environments, very, very difficult.
43:05.640 --> 43:11.200
So if we were to categorize the most dangerous white collar
43:11.200 --> 43:15.600
jobs, they would be things like back office,
43:15.600 --> 43:20.800
people who copy and paste and deal with simple computer
43:20.800 --> 43:25.560
programs and data, and maybe paper and OCR,
43:25.560 --> 43:29.000
and they don't make strategic decisions,
43:29.000 --> 43:32.040
they basically facilitate the process.
43:32.040 --> 43:34.680
Where these software and paper systems don't work,
43:34.680 --> 43:40.520
so you have people dealing with new employee orientation,
43:40.520 --> 43:45.400
searching for past lawsuits and financial documents,
43:45.400 --> 43:49.800
and doing reference check, so basic searching and management
43:49.800 --> 43:52.800
of data that's the most in danger of being lost.
43:52.800 --> 43:56.440
In addition to the white collar repetitive work,
43:56.440 --> 43:59.360
a lot of simple interaction work can also
43:59.360 --> 44:02.840
be taken care of, such as tele sales, telemarketing,
44:02.840 --> 44:07.280
customer service, as well as many physical jobs
44:07.280 --> 44:09.880
that are in the same location and don't
44:09.880 --> 44:12.240
require a high degree of dexterity,
44:12.240 --> 44:17.840
so fruit picking, dishwashing, assembly line, inspection,
44:17.840 --> 44:20.360
are jobs in that category.
44:20.360 --> 44:25.440
So altogether, back office is a big part,
44:25.440 --> 44:29.840
and the other, the blue collar, may be smaller initially,
44:29.840 --> 44:32.560
but over time, AI will get better.
44:32.560 --> 44:36.880
And when we start to get, over the next 15, 20 years,
44:36.880 --> 44:39.120
the ability to actually have the dexterity
44:39.120 --> 44:42.600
of doing assembly line, that's a huge chunk of jobs.
44:42.600 --> 44:44.760
And when autonomous vehicles start
44:44.760 --> 44:47.400
to work initially starting with truck drivers,
44:47.400 --> 44:49.640
but eventually to all drivers, that's
44:49.640 --> 44:52.040
another huge group of workers.
44:52.040 --> 44:55.560
So I see modest numbers in the next five years,
44:55.560 --> 44:58.080
but increasing rapidly after that.
44:58.080 --> 45:01.240
On the worry of the jobs that are in danger
45:01.240 --> 45:04.320
and the gradual loss of jobs, I'm not
45:04.320 --> 45:06.680
sure if you're familiar with Andrew Yang.
45:06.680 --> 45:07.800
Yes, I am.
45:07.800 --> 45:10.560
So there's a candidate for president of the United States
45:10.560 --> 45:14.960
Andrew Yang, whose platform is based, in part,
45:14.960 --> 45:17.680
around job loss due to automation,
45:17.680 --> 45:21.120
and also, in addition, the need, perhaps,
45:21.120 --> 45:26.120
of universal basic income to support folks who
45:26.120 --> 45:28.560
lose their job due to automation and so on,
45:28.560 --> 45:31.960
and in general, support people under complex,
45:31.960 --> 45:34.320
unstable job market.
45:34.320 --> 45:36.720
So what are your thoughts about his concerns,
45:36.720 --> 45:40.000
him as a candidate, his ideas in general?
45:40.000 --> 45:44.600
I think his thinking is generally in the right direction,
45:44.600 --> 45:48.440
but his approach as a presidential candidate
45:48.440 --> 45:52.240
may be a little bit ahead of its time.
45:52.240 --> 45:56.080
I think the displacements will happen,
45:56.080 --> 45:58.280
but will they happen soon enough for people
45:58.280 --> 46:00.480
to agree to vote for him?
46:00.480 --> 46:03.760
The unemployment numbers are not very high yet.
46:03.760 --> 46:07.600
And I think he and I have the same challenge.
46:07.600 --> 46:11.520
If I want to theoretically convince people this is an issue
46:11.520 --> 46:13.880
and he wants to become the president,
46:13.880 --> 46:17.760
people have to see how can this be the case when
46:17.760 --> 46:19.680
unemployment numbers are low.
46:19.680 --> 46:21.360
So that is the challenge.
46:21.360 --> 46:27.360
And I think I do agree with him on the displacement issue.
46:27.360 --> 46:32.280
On universal basic income, at a very vanilla level,
46:32.280 --> 46:36.800
I don't agree with it because I think the main issue
46:36.800 --> 46:38.320
is retraining.
46:38.320 --> 46:43.200
So people need to be incented not by just giving a monthly
46:43.200 --> 46:47.160
$2,000 check or $1,000 check and do whatever they want
46:47.160 --> 46:50.920
because they don't have the know how
46:50.920 --> 46:56.840
to know what to retrain for or what type of job to go into,
46:56.840 --> 46:58.640
so guidance is needed.
46:58.640 --> 47:01.720
And retraining is needed because historically
47:01.720 --> 47:05.080
in technology revolutions, when routine jobs were displaced,
47:05.080 --> 47:06.920
new routine jobs came up.
47:06.920 --> 47:09.400
So there was always room for that.
47:09.400 --> 47:12.640
But with AI and automation, the whole point
47:12.640 --> 47:15.320
is replacing all routine jobs eventually.
47:15.320 --> 47:17.840
So there will be fewer and fewer routine jobs.
47:17.840 --> 47:22.640
And AI will create jobs, but it won't create routine jobs
47:22.640 --> 47:24.840
because if it creates routine jobs,
47:24.840 --> 47:26.880
why wouldn't AI just do it?
47:26.880 --> 47:30.360
So therefore, the people who are losing the jobs
47:30.360 --> 47:32.280
are losing routine jobs.
47:32.280 --> 47:35.720
The jobs that are becoming available are nonroutine jobs.
47:35.720 --> 47:39.320
So the social stipend that needs to be put in place
47:39.320 --> 47:42.040
is for the routine workers who lost their jobs
47:42.040 --> 47:46.120
to be retrained maybe in six months, maybe in three years.
47:46.120 --> 47:48.560
It takes a while to retrain for a nonroutine job,
47:48.560 --> 47:51.360
and then take on a job that will last
47:51.360 --> 47:53.400
for that person's lifetime.
47:53.400 --> 47:56.160
Now, having said that, if you look deeply
47:56.160 --> 47:58.240
into Andrew's document, he does cater for that.
47:58.240 --> 48:03.240
So I'm not disagreeing with what he's trying to do.
48:03.280 --> 48:06.360
But for simplification, sometimes he just says UBI,
48:06.360 --> 48:08.760
but simple UBI wouldn't work.
48:08.760 --> 48:10.600
And I think you've mentioned elsewhere
48:10.600 --> 48:15.600
that the goal isn't necessarily to give people enough money
48:15.760 --> 48:19.120
to survive or live or even to prosper.
48:19.120 --> 48:22.800
The point is to give them a job that gives them meaning.
48:22.800 --> 48:25.600
That meaning is extremely important.
48:25.600 --> 48:28.600
That our employment, at least in the United States
48:28.600 --> 48:31.200
and perhaps across the world,
48:31.200 --> 48:34.600
provides something that's, forgive me for saying,
48:34.600 --> 48:36.960
greater than money, it provides meaning.
48:38.400 --> 48:43.400
So now what kind of jobs do you think can't be automated?
48:44.840 --> 48:46.600
You talk a little bit about creativity
48:46.600 --> 48:48.200
and compassion in your book.
48:48.200 --> 48:50.720
What aspects do you think it's difficult
48:50.720 --> 48:52.320
to automate for an AI system?
48:52.320 --> 48:57.320
Because an AI system is currently merely optimizing.
48:57.360 --> 49:00.120
It's not able to reason, plan,
49:00.120 --> 49:02.920
or think creatively or strategically.
49:02.920 --> 49:05.320
It's not able to deal with complex problems.
49:05.320 --> 49:09.520
It can't come up with a new problem and solve it.
49:09.520 --> 49:12.320
A human needs to find the problem
49:12.320 --> 49:15.520
and pose it as an optimization problem,
49:15.520 --> 49:17.520
then have the AI work at it.
49:17.520 --> 49:21.320
So an AI would have a very hard time
49:21.320 --> 49:23.320
discovering a new drug
49:23.320 --> 49:26.320
or discovering a new style of painting
49:27.320 --> 49:30.320
or dealing with complex tasks
49:30.320 --> 49:32.320
such as managing a company
49:32.320 --> 49:35.320
that isn't just about optimizing the bottom line,
49:35.320 --> 49:39.320
but also about employee satisfaction, corporate brand,
49:39.320 --> 49:40.320
and many, many other things.
49:40.320 --> 49:44.320
So that is one category of things.
49:44.320 --> 49:48.320
And because these things are challenging, creative, complex,
49:48.320 --> 49:52.320
doing them creates a higher degree of satisfaction
49:52.320 --> 49:55.320
and therefore appealing to our desire for working,
49:55.320 --> 49:57.320
which isn't just to make the money,
49:57.320 --> 49:58.320
make the ends meet,
49:58.320 --> 50:00.320
but also that we've accomplished something
50:00.320 --> 50:03.320
that others maybe can't do or can't do as well.
50:04.320 --> 50:07.320
Another type of job that is much more numerous
50:07.320 --> 50:09.320
would be compassionate jobs,
50:09.320 --> 50:14.320
jobs that require compassion, empathy, human touch, human trust.
50:14.320 --> 50:18.320
AI can't do that because AI is cold, calculating,
50:18.320 --> 50:22.320
and even if it can fake that to some extent,
50:22.320 --> 50:26.320
it will make errors and that will make it look very silly.
50:26.320 --> 50:29.320
And also, I think even if AI did okay,
50:29.320 --> 50:33.320
people would want to interact with another person,
50:33.320 --> 50:38.320
whether it's for some kind of a service or a teacher or a doctor
50:38.320 --> 50:41.320
or a concierge or a masseuse or bartender.
50:41.320 --> 50:46.320
There are so many jobs where people just don't want to interact
50:46.320 --> 50:49.320
with a cold robot or software.
50:50.320 --> 50:53.320
I've had an entrepreneur who built an elderly care robot
50:53.320 --> 50:58.320
and they found that the elderly really only use it for customer service.
50:58.320 --> 51:00.320
But not to service the product,
51:00.320 --> 51:05.320
but they click on customer service and the video of a person comes up
51:05.320 --> 51:07.320
and then the person says,
51:07.320 --> 51:11.320
how come my daughter didn't call me? Let me show you a picture of my grandkids.
51:11.320 --> 51:15.320
So people yearn for that people-to-people interaction.
51:15.320 --> 51:19.320
So even if robots improve, people just don't want them.
51:19.320 --> 51:21.320
And those jobs are going to be increasing
51:21.320 --> 51:24.320
because AI will create a lot of value,
51:24.320 --> 51:29.320
$16 trillion to the world in the next 11 years, according to PwC,
51:29.320 --> 51:34.320
and that will give people money to enjoy services,
51:34.320 --> 51:39.320
whether it's eating a gourmet meal or tourism and traveling
51:39.320 --> 51:41.320
or having concierge services.
51:41.320 --> 51:44.320
The services revolving around, you know,
51:44.320 --> 51:47.320
every dollar of that $16 trillion will be tremendous.
51:47.320 --> 51:52.320
It will create more opportunities to service the people who did well
51:52.320 --> 51:55.320
through AI.
51:55.320 --> 52:01.320
But even at the same time, the entire society is very much short
52:01.320 --> 52:05.320
of people in many service-oriented, compassion-oriented jobs.
52:05.320 --> 52:10.320
The best example is probably in healthcare services.
52:10.320 --> 52:15.320
There's going to be 2 million new jobs, not counting replacement,
52:15.320 --> 52:20.320
just brand new incremental jobs in the next six years in healthcare services.
52:20.320 --> 52:24.320
That includes nurses and orderlies in the hospital,
52:24.320 --> 52:29.320
elderly care, and also at-home care.
52:29.320 --> 52:31.320
It's particularly lacking.
52:31.320 --> 52:34.320
And those jobs are not likely to be filled.
52:34.320 --> 52:36.320
So there's likely to be a shortage.
52:36.320 --> 52:41.320
And the reason they're not filled is simply because they don't pay very well
52:41.320 --> 52:47.320
and the social status of these jobs is not very good.
52:47.320 --> 52:52.320
So they pay about half as much as a heavy equipment operator,
52:52.320 --> 52:55.320
which will be replaced a lot sooner.
52:55.320 --> 52:59.320
And they pay probably comparably to someone on the assembly line.
52:59.320 --> 53:03.320
And so if we're ignoring all the other issues
53:03.320 --> 53:07.320
and just think about satisfaction from one's job,
53:07.320 --> 53:11.320
someone repetitively doing the same manual action at an assembly line,
53:11.320 --> 53:14.320
that can't create a lot of job satisfaction.
53:14.320 --> 53:17.320
But someone taking care of a sick person
53:17.320 --> 53:21.320
and getting a hug and thank you from that person and the family,
53:21.320 --> 53:24.320
I think is quite satisfying.
53:24.320 --> 53:28.320
So if only we could fix the pay for service jobs,
53:28.320 --> 53:33.320
there are plenty of jobs that require some training or a lot of training
53:33.320 --> 53:36.320
for the people coming off the routine jobs to take.
53:36.320 --> 53:43.320
We can easily imagine someone who was maybe a cashier at the grocery store,
53:43.320 --> 53:49.320
as stores become automated, learning to become a nurse or an at-home care worker.
53:49.320 --> 53:54.320
Also, I do want to point out the blue collar jobs are going to stay around a bit longer,
53:54.320 --> 53:57.320
some of them quite a bit longer.
53:57.320 --> 54:01.320
AI cannot be told, go clean an arbitrary home.
54:01.320 --> 54:03.320
That's incredibly hard.
54:03.320 --> 54:07.320
Arguably it's an L5 level of difficulty, to borrow the autonomous driving scale.
54:07.320 --> 54:09.320
And then AI cannot be a good plumber,
54:09.320 --> 54:12.320
because a plumber is almost like a mini detective
54:12.320 --> 54:15.320
that has to figure out where the leak came from.
54:15.320 --> 54:22.320
Yet AI probably can replace an assembly line worker or an auto mechanic, and so on.
54:22.320 --> 54:26.320
So one has to study which blue collar jobs are going away
54:26.320 --> 54:30.320
and facilitate retraining for the people to go into the ones that won't go away
54:30.320 --> 54:32.320
or maybe even will increase.
54:32.320 --> 54:39.320
I mean, it is fascinating that it's easier to build a world champion chess player
54:39.320 --> 54:41.320
than it is to build a mediocre plumber.
54:41.320 --> 54:43.320
Yes, very true.
54:43.320 --> 54:47.320
And that goes counter to a lot of people's intuitive understanding
54:47.320 --> 54:49.320
of what artificial intelligence is.
54:49.320 --> 54:53.320
So it sounds, I mean, you're painting a pretty optimistic picture
54:53.320 --> 54:56.320
about retraining, about the number of jobs
54:56.320 --> 55:01.320
and actually the meaningful nature of those jobs once we automate repetitive tasks.
55:01.320 --> 55:07.320
So overall, are you optimistic about the future
55:07.320 --> 55:11.320
where much of the repetitive tasks are automated,
55:11.320 --> 55:15.320
that there is a lot of room for humans, for the compassionate,
55:15.320 --> 55:19.320
for the creative input that only humans can provide?
55:19.320 --> 55:23.320
I am optimistic if we start to take action.
55:23.320 --> 55:27.320
If we have no action in the next five years,
55:27.320 --> 55:33.320
I think it's going to be hard to deal with the devastating losses that will emerge.
55:33.320 --> 55:39.320
So if we start thinking about retraining, maybe with the low hanging fruits,
55:39.320 --> 55:45.320
explaining to vocational schools why they should train more plumbers than auto mechanics,
55:45.320 --> 55:53.320
maybe starting with some government subsidy for corporations to have more training positions.
55:53.320 --> 55:57.320
We start to explain to people why retraining is important.
55:57.320 --> 56:00.320
We start to think about the future of education,
56:00.320 --> 56:04.320
how that needs to be tweaked for the era of AI.
56:04.320 --> 56:06.320
If we start to make incremental progress,
56:06.320 --> 56:09.320
and the greater number of people understand,
56:09.320 --> 56:12.320
then there's no reason to think we can't deal with this,
56:12.320 --> 56:16.320
because this technological revolution is arguably similar to
56:16.320 --> 56:20.320
what the electricity, industrial, and internet revolutions brought about.
56:20.320 --> 56:24.320
Do you think there's a role for policy, for governments to step in
56:24.320 --> 56:27.320
to help with policy to create a better world?
56:27.320 --> 56:32.320
Absolutely, and the governments don't have to believe
56:32.320 --> 56:39.320
that unemployment will go up, and they don't have to believe automation will be this fast to do something.
56:39.320 --> 56:42.320
Revamping vocational school would be one example.
56:42.320 --> 56:47.320
Another is if there's a big gap in healthcare service employment,
56:47.320 --> 56:54.320
and we know that a country's population is growing older, with greater longevity,
56:54.320 --> 56:59.320
because people over 80 require five times as much care as those under 80,
56:59.320 --> 57:04.320
then it is a good time to incent training programs for elderly care,
57:04.320 --> 57:07.320
to find ways to improve the pay.
57:07.320 --> 57:13.320
Maybe one way would be to offer as part of Medicare or the equivalent program
57:13.320 --> 57:18.320
for people over 80 to be entitled to a few hours of elderly care at home,
57:18.320 --> 57:21.320
and then that might be reimbursable,
57:21.320 --> 57:28.320
and that will stimulate the service industry around the policy.
57:28.320 --> 57:32.320
Do you have concerns about large entities,
57:32.320 --> 57:38.320
whether it's governments or companies, controlling the future of AI development in general?
57:38.320 --> 57:40.320
So we talked about companies.
57:40.320 --> 57:48.320
Do you have a sense that governments can better represent the interests of the people
57:48.320 --> 57:54.320
than companies, or do you believe companies are better at representing the interest of the people?
57:54.320 --> 57:56.320
Or is there no easy answer?
57:56.320 --> 57:59.320
I don't think there's an easy answer because it's a double edged sword.
57:59.320 --> 58:06.320
The companies and governments can provide better services with more access to data and more access to AI,
58:06.320 --> 58:13.320
but that also leads to greater power, which can lead to uncontrollable problems,
58:13.320 --> 58:17.320
whether it's monopoly or corruption in the government.
58:17.320 --> 58:24.320
So I think one has to be careful to look at how much data that companies and governments have,
58:24.320 --> 58:29.320
and some kind of checks and balances would be helpful.
58:29.320 --> 58:33.320
So again, I come from Russia.
58:33.320 --> 58:36.320
There's something called the Cold War.
58:36.320 --> 58:40.320
So let me ask a difficult question here, looking at conflict.
58:40.320 --> 58:45.320
Steven Pinker wrote a great book showing that conflict all over the world is decreasing in general.
58:45.320 --> 58:51.320
But, having written the book AI Superpowers,
58:51.320 --> 58:57.320
do you see a major international conflict potentially arising between major nations,
58:57.320 --> 59:02.320
whatever they are, whether it's Russia, China, European nations, United States,
59:02.320 --> 59:09.320
or others in the next 10, 20, 50 years around AI, around the digital space, cyber space?
59:09.320 --> 59:12.320
Do you worry about that?
59:12.320 --> 59:19.320
Is that something we need to think about and try to alleviate or prevent?
59:19.320 --> 59:22.320
I believe in greater engagement.
59:22.320 --> 59:33.320
A lot of the worries about more powerful AI are based on an arms race metaphor.
59:33.320 --> 59:41.320
And when you extrapolate into military kinds of scenarios,
59:41.320 --> 59:48.320
AI can automate autonomous weapons, which need to be controlled somehow.
59:48.320 --> 59:57.320
And autonomous decision making can leave not enough time to fix international crises.
59:57.320 --> 1:00:02.320
So I actually believe a Cold War mentality would be very dangerous
1:00:02.320 --> 1:00:07.320
because should two countries rely on AI to make certain decisions
1:00:07.320 --> 1:00:11.320
and they don't even talk to each other, they do their own scenario planning,
1:00:11.320 --> 1:00:14.320
then something could easily go wrong.
1:00:14.320 --> 1:00:24.320
I think engagement, interaction, and some protocols to avoid inadvertent disasters are actually needed.
1:00:24.320 --> 1:00:28.320
So it's natural for each country to want to be the best,
1:00:28.320 --> 1:00:34.320
whether it's in nuclear technologies or AI or bio.
1:00:34.320 --> 1:00:40.320
But I think it's important to realize if each country has a black box AI
1:00:40.320 --> 1:00:48.320
and don't talk to each other, that probably presents greater challenges to humanity
1:00:48.320 --> 1:00:50.320
than if they interacted.
1:00:50.320 --> 1:00:56.320
I think there can still be competition, but with some degree of protocol for interaction.
1:00:56.320 --> 1:01:01.320
Just like when there was a nuclear competition,
1:01:01.320 --> 1:01:07.320
there were some protocols for deterrence among the US, Russia, and China.
1:01:07.320 --> 1:01:10.320
And I think that engagement is needed.
1:01:10.320 --> 1:01:15.320
So of course, we're still far from AI presenting that kind of danger.
1:01:15.320 --> 1:01:22.320
But what I worry the most about is the level of engagement seems to be coming down.
1:01:22.320 --> 1:01:25.320
The level of distrust seems to be going up,
1:01:25.320 --> 1:01:32.320
especially from the US towards other large countries such as China and Russia.
1:01:32.320 --> 1:01:34.320
Is there a way to make that better?
1:01:34.320 --> 1:01:40.320
So that's beautifully put, level of engagement and even just basic trust and communication
1:01:40.320 --> 1:01:52.320
as opposed to making artificial enemies out of particular countries.
1:01:52.320 --> 1:02:01.320
Do you have a sense of how we can make it better, actionable items that as a society we can take on?
1:02:01.320 --> 1:02:10.320
I'm not an expert at geopolitics, but I would say that we look pretty foolish as humankind
1:02:10.320 --> 1:02:19.320
when we are faced with the opportunity to create $16 trillion for humanity.
1:02:19.320 --> 1:02:29.320
And yet we're not solving fundamental problems with parts of the world still in poverty.
1:02:29.320 --> 1:02:34.320
And for the first time, we have the resources to overcome poverty and hunger.
1:02:34.320 --> 1:02:38.320
We're not using it on that, but we're fueling competition among superpowers.
1:02:38.320 --> 1:02:41.320
And that's a very unfortunate thing.
1:02:41.320 --> 1:02:54.320
If we become utopian for a moment, imagine a benevolent world government that has this $16 trillion
1:02:54.320 --> 1:03:02.320
and maybe some AI to figure out how to use it to deal with diseases and problems and hate and things like that.
1:03:02.320 --> 1:03:04.320
World would be a lot better off.
1:03:04.320 --> 1:03:07.320
So what is wrong with the current world?
1:03:07.320 --> 1:03:13.320
I think the people with more skill than I should think about this.
1:03:13.320 --> 1:03:19.320
And then the geopolitics issue with superpower competition is one side of the issue.
1:03:19.320 --> 1:03:29.320
There's another side, which I worry about maybe even more, which is that as the $16 trillion all gets made by the U.S. and China
1:03:29.320 --> 1:03:34.320
and a few of the other developed countries, the poorer countries will get nothing
1:03:34.320 --> 1:03:42.320
because they don't have the technology, and the wealth disparity and inequality will increase.
1:03:42.320 --> 1:03:50.320
So a poorer country with a large population will not only fail to benefit from the AI boom or other technology booms,
1:03:50.320 --> 1:03:57.320
but their workers, who previously had hoped they could do the China model of outsourced manufacturing
1:03:57.320 --> 1:04:02.320
or the India model of outsourced process and call center work,
1:04:02.320 --> 1:04:05.320
will find all those jobs gone in 10 or 15 years.
1:04:05.320 --> 1:04:14.320
So the individual citizen may be a net liability, I mean financially speaking, to a poorer country
1:04:14.320 --> 1:04:19.320
and not an asset to claw itself out of poverty.
1:04:19.320 --> 1:04:29.320
So in that kind of situation, these large countries with not much tech are going to be facing a downward spiral
1:04:29.320 --> 1:04:37.320
and it's unclear what could be done and then when we look back and say there's $16 trillion being created
1:04:37.320 --> 1:04:43.320
and it's all being kept by U.S. China and other developed countries, it just doesn't feel right.
1:04:43.320 --> 1:04:50.320
So I hope people who know about geopolitics can find solutions that's beyond my expertise.
1:04:50.320 --> 1:04:54.320
So different countries that we've talked about have different value systems.
1:04:54.320 --> 1:05:02.320
If you look at the United States, there is, to an almost extreme degree, an absolute desire for freedom of speech.
1:05:02.320 --> 1:05:14.320
If you look at the country where I was raised, that desire amongst the people is not as elevated as it is in America, where it rises to a basically fundamental level,
1:05:14.320 --> 1:05:17.320
to the essence of what it means to be American, right?
1:05:17.320 --> 1:05:20.320
And the same is true with China; there are different value systems.
1:05:20.320 --> 1:05:30.320
There is some censorship of internet content that China and Russia and many other countries undertake.
1:05:30.320 --> 1:05:40.320
Do you see that having effects on innovation, on other aspects of the tech and AI development we talked about,
1:05:40.320 --> 1:05:52.320
and maybe from another angle, do you see that changing in different ways over the next 10 years, 20 years, 50 years as China continues to grow
1:05:52.320 --> 1:05:55.320
as it does now in its tech innovation?
1:05:55.320 --> 1:06:08.320
There's a common belief that full freedom of speech and expression is correlated with creativity, which is correlated with entrepreneurial success.
1:06:08.320 --> 1:06:15.320
I think empirically we have seen that is not true and China has been successful.
1:06:15.320 --> 1:06:25.320
That's not to say the fundamental values are not right or not the best, but it's just that perfect correlation isn't there.
1:06:25.320 --> 1:06:36.320
It's hard to read the tea leaves on opening up or not in any country and I've not been very good at that in my past predictions.
1:06:36.320 --> 1:06:46.320
But I do believe every country shares some fundamental value, a lot of fundamental values for the long term.
1:06:46.320 --> 1:07:02.320
So, you know, China is drafting its privacy policy for individual citizens and they don't look that different from the American or European ones.
1:07:02.320 --> 1:07:13.320
So, people do want to protect their privacy and have the opportunity to express themselves, and I think the fundamental values are there.
1:07:13.320 --> 1:07:21.320
The question is in the execution and timing, how soon or when will that start to open up?
1:07:21.320 --> 1:07:31.320
So, as long as each government knows, ultimately people want that kind of protection, there should be a plan to move towards that.
1:07:31.320 --> 1:07:35.320
As to when or how, again, I'm not an expert.
1:07:35.320 --> 1:07:38.320
On the point of privacy to me, it's really interesting.
1:07:38.320 --> 1:07:44.320
So, AI needs data to create a personalized awesome experience.
1:07:44.320 --> 1:07:47.320
I'm just speaking generally in terms of products.
1:07:47.320 --> 1:07:53.320
And then we have currently, depending on the age and depending on the demographics of who we're talking about,
1:07:53.320 --> 1:07:58.320
some people are more or less concerned about the amount of data they hand over.
1:07:58.320 --> 1:08:03.320
So, in your view, how do we get this balance right?
1:08:03.320 --> 1:08:09.320
That we provide an amazing experience to people that use products.
1:08:09.320 --> 1:08:15.320
You look at Facebook: the more Facebook knows about you, yes, it's scary to say,
1:08:15.320 --> 1:08:20.320
the better an experience it can probably create.
1:08:20.320 --> 1:08:24.320
So, in your view, how do we get that balance right?
1:08:24.320 --> 1:08:38.320
Yes, I think a lot of people have a misunderstanding that it's okay and possible to just rip all the data out from a provider and give it back to you.
1:08:38.320 --> 1:08:43.320
The idea is that you can then deny them access to further data and still enjoy the services we have.
1:08:43.320 --> 1:08:48.320
If we take back all the data, all the services will give us nonsense.
1:08:48.320 --> 1:08:57.320
We'll no longer be able to use products that function well in terms of, you know, the right ranking, the right products, the right user experience.
1:08:57.320 --> 1:09:04.320
So, yet I do understand we don't want to permit misuse of the data.
1:09:04.320 --> 1:09:16.320
From a legal and policy standpoint, I think there can be severe punishment for those guilty of egregious misuse of the data.
1:09:16.320 --> 1:09:19.320
That's, I think, a good first step.
1:09:19.320 --> 1:09:27.320
Actually, China on this aspect has very strong laws about people who sell or give data to other companies.
1:09:27.320 --> 1:09:40.320
And over the past few years, since that law came into effect, it has pretty much eradicated the illegal distribution and sharing of data.
1:09:40.320 --> 1:09:52.320
Additionally, I think technology is often a very good way to solve technology misuse.
1:09:52.320 --> 1:09:58.320
So, can we come up with new technologies that will let us have our cake and eat it too?
1:09:58.320 --> 1:10:07.320
People are looking into homomorphic encryption, which is letting you keep the data, have it encrypted, and train on encrypted data.
1:10:07.320 --> 1:10:13.320
Of course, we haven't solved that one yet, but that kind of direction may be worth pursuing.
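To make the "train on encrypted data" idea concrete, here is a toy sketch of the Paillier cryptosystem, a classic additively homomorphic scheme: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so some arithmetic can happen without ever decrypting. The tiny primes and all values below are illustrative only, a sketch of the direction rather than the research systems referred to above.

```python
# Toy Paillier cryptosystem: an additively homomorphic scheme.
# Illustration only: these key sizes are far too small for real use.
import math
import random

p, q = 293, 433               # toy primes (real keys use ~2048-bit moduli)
n, n2 = p * q, (p * q) ** 2   # public modulus and its square
g = n + 1                     # standard generator choice
lam = math.lcm(p - 1, q - 1)  # private key
# mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m: int) -> int:
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:   # randomizer must be invertible mod n
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
a, b = encrypt(20), encrypt(22)
assert decrypt(a * b % n2) == 42
print(decrypt(a * b % n2))  # 42
```

Fully homomorphic schemes, which also support multiplying plaintexts, are what training a model this way would need, and those remain very expensive; hence "we haven't solved that one yet."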
1:10:13.320 --> 1:10:22.320
Also federated learning, which would allow one hospital to train fully on its own patients' data, because they have a license for that.
1:10:22.320 --> 1:10:28.320
And then hospitals would share their models, not their data, to create a supra AI.
1:10:28.320 --> 1:10:30.320
And that also maybe has some promise.
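The hospital example can be sketched in a few lines. Below is a minimal, hypothetical illustration of federated averaging: three sites fit the same simple model on data that never leaves them, and a coordinator averages only the resulting weights. Everything here, the linear model, the synthetic data, the single aggregation round, is an invented stand-in.

```python
# Minimal sketch of federated averaging: sites share weights, never raw data.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])   # ground truth used to fabricate data

def local_train(X, y, epochs=200, lr=0.1):
    """Plain linear regression by gradient descent on one site's private data."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Three "hospitals", each holding data that never leaves the site.
sites = []
for _ in range(3):
    X = rng.normal(size=(100, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    sites.append((X, y))

# The coordinator sees only model weights, averaged by local sample counts.
local_models = [local_train(X, y) for X, y in sites]
counts = [len(y) for _, y in sites]
global_w = np.average(local_models, axis=0, weights=counts)
print(global_w)  # close to true_w, learned without pooling the raw data
```

A real deployment would iterate this exchange over many rounds and protect the shared updates themselves, but the core idea, models travel while data stays put, is the one described above.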
1:10:30.320 --> 1:10:39.320
So I would want to encourage us to be open minded and think of this not just as a binary yes or no policy question,
1:10:39.320 --> 1:10:48.320
but letting the technologists try to find solutions to let us have our cake and eat it too, or have most of our cake and eat most of it too.
1:10:48.320 --> 1:10:55.320
Finally, I think giving each end user a choice is important and having transparency is important.
1:10:55.320 --> 1:11:04.320
Also, I think that's universal, but the choice you give to the user should not be at a granular level that the user cannot understand.
1:11:04.320 --> 1:11:12.320
GDPR today causes all these pop ups of yes or no: will you give this site the right to use this part of your data?
1:11:12.320 --> 1:11:20.320
I don't think any user understands what they're saying yes or no to, and I suspect most are just saying yes because they don't understand it.
1:11:20.320 --> 1:11:30.320
So while GDPR in its current implementation has lived up to its promise of transparency and user choice,
1:11:30.320 --> 1:11:39.320
it did so in such a way that really didn't deliver the spirit of GDPR.
1:11:39.320 --> 1:11:41.320
It fit the letter, but not the spirit.
1:11:41.320 --> 1:11:50.320
So again, I think we need to think about is there a way to fit the spirit of GDPR by using some kind of technology?
1:11:50.320 --> 1:11:52.320
Can we have a slider?
1:11:52.320 --> 1:12:01.320
with an AI trying to figure out how much you want to slide between perfect protection and security of your personal data
1:12:01.320 --> 1:12:07.320
versus a high degree of convenience with some risk of not having full privacy.
1:12:07.320 --> 1:12:11.320
Each user should have some preference and that gives you the user choice,
1:12:11.320 --> 1:12:18.320
but maybe we should turn the problem on its head and ask can there be an AI algorithm that can customize this
1:12:18.320 --> 1:12:24.320
because we can understand the slider, but we sure cannot understand every pop up question.
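As a purely hypothetical sketch of that slider, one preference value could gate which data categories a service is allowed to use. The category names and sensitivity scores below are invented; the AI customization suggested above would amount to learning this mapping per user instead of hard-coding it.

```python
# Hypothetical privacy slider: one number in [0, 1] gates data sharing.
# Category names and sensitivity scores are invented for illustration.
CATEGORIES = [
    ("coarse location", 0.2),
    ("purchase history", 0.4),
    ("search history", 0.6),
    ("precise location", 0.8),
    ("message content", 0.95),
]

def allowed_categories(slider: float) -> list[str]:
    """slider=0.0 means maximum privacy (share nothing);
    slider=1.0 means maximum convenience (share everything)."""
    return [name for name, sensitivity in CATEGORIES if sensitivity <= slider]

print(allowed_categories(0.5))  # ['coarse location', 'purchase history']
print(allowed_categories(0.9))  # everything except message content
```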
1:12:24.320 --> 1:12:30.320
And I think getting that right requires getting the balance between what we talked about earlier,
1:12:30.320 --> 1:12:36.320
which is heart and soul versus profit driven decisions and strategy.
1:12:36.320 --> 1:12:45.320
I think from my perspective, the best way to make a lot of money in the long term is to keep your heart and soul intact.
1:12:45.320 --> 1:12:53.320
I think getting that slider right in the short term may feel like you'll be sacrificing profit,
1:12:53.320 --> 1:12:59.320
but in the long term, you'll be getting user trust and providing a great experience.
1:12:59.320 --> 1:13:01.320
Do you share that kind of view in general?
1:13:01.320 --> 1:13:11.320
Yes, absolutely. I sure would hope there is a way we can do long term projects that really do the right thing.
1:13:11.320 --> 1:13:16.320
I think a lot of people who embrace GDPR have their hearts in the right place.
1:13:16.320 --> 1:13:20.320
I think they just need to figure out how to build a solution.
1:13:20.320 --> 1:13:24.320
I've heard utopians talk about solutions that get me excited,
1:13:24.320 --> 1:13:29.320
but I'm not sure how, in the current funding environment, they can get started, right?
1:13:29.320 --> 1:13:37.320
People talk about, imagine this crowdsourced data collection that we all trust,
1:13:37.320 --> 1:13:45.320
and then we have these agents that we ask to talk to the trusted agent,
1:13:45.320 --> 1:13:48.320
only that agent, only that platform.
1:13:48.320 --> 1:14:02.320
A trusted joint platform that we all believe is trustworthy, one that can give us all the closed loop personal suggestions
1:14:02.320 --> 1:14:07.320
by the new social network, new search engine, new ecommerce engine
1:14:07.320 --> 1:14:12.320
that has access to even more of our data, not directly but indirectly.
1:14:12.320 --> 1:14:18.320
I think that general concept of licensing to some trusted engine
1:14:18.320 --> 1:14:22.320
and finding a way to trust that engine seems like a great idea,
1:14:22.320 --> 1:14:27.320
but if you think about how long it's going to take to implement, tweak, and develop it right,
1:14:27.320 --> 1:14:31.320
as well as to collect all the trust and the data from the people,
1:14:31.320 --> 1:14:34.320
it's beyond the current cycle of venture capital.
1:14:34.320 --> 1:14:37.320
How do you do that is a big question.
1:14:37.320 --> 1:14:44.320
You've recently had a fight with cancer, stage 4 lymphoma,
1:14:44.320 --> 1:14:54.320
and on a sort of deep, personal level, what did it feel like, in the darker moments, to face your own mortality?
1:14:54.320 --> 1:14:57.320
Well, I've been a workaholic my whole life,
1:14:57.320 --> 1:15:04.320
and I've basically worked 996: 9am to 9pm, 6 days a week, roughly.
1:15:04.320 --> 1:15:10.320
And I didn't really pay a lot of attention to my family, friends, and people who loved me,
1:15:10.320 --> 1:15:14.320
and my life revolved around optimizing for work.
1:15:14.320 --> 1:15:25.320
While my work was not routine, my optimization really made my life basically a very mechanical process.
1:15:25.320 --> 1:15:36.320
But I got a lot of highs out of it because of accomplishments that I thought were really important and dear and the highest priority to me.
1:15:36.320 --> 1:15:41.320
But when I faced mortality and possible death in a matter of months,
1:15:41.320 --> 1:15:45.320
I suddenly realized that this really meant nothing to me,
1:15:45.320 --> 1:15:48.320
that I didn't feel like working for another minute,
1:15:48.320 --> 1:15:54.320
that if I had 6 months left in my life, I would spend it all with my loved ones.
1:15:54.320 --> 1:16:02.320
And thanking them, giving them love back, and apologizing to them that I lived my life the wrong way.
1:16:02.320 --> 1:16:11.320
So that moment of reckoning caused me to really rethink why we exist in this world.
1:16:11.320 --> 1:16:22.320
We might be too much shaped by society to think that success and accomplishments are why we live.
1:16:22.320 --> 1:16:29.320
And while that can get you periodic successes and satisfaction,
1:16:29.320 --> 1:16:35.320
it's really in facing death that you see what's truly important to you.
1:16:35.320 --> 1:16:41.320
So as a result of going through the challenges with cancer,
1:16:41.320 --> 1:16:45.320
I've resolved to live a more balanced lifestyle.
1:16:45.320 --> 1:16:48.320
I'm now in remission, knock on wood,
1:16:48.320 --> 1:16:52.320
and I'm spending more time with my family.
1:16:52.320 --> 1:16:54.320
My wife travels with me.
1:16:54.320 --> 1:16:57.320
When my kids need me, I spend more time with them.
1:16:57.320 --> 1:17:02.320
And before, I used to prioritize everything around work.
1:17:02.320 --> 1:17:05.320
When I had a little bit of time, I would dole it out to my family.
1:17:05.320 --> 1:17:09.320
Now, when my family needs something, really needs something,
1:17:09.320 --> 1:17:12.320
I drop everything at work and go to them.
1:17:12.320 --> 1:17:15.320
And then in the time remaining, I allocate to work.
1:17:15.320 --> 1:17:18.320
But one's family is very understanding.
1:17:18.320 --> 1:17:22.320
It's not like they will take 50 hours a week from me.
1:17:22.320 --> 1:17:26.320
So I'm actually able to still work pretty hard,
1:17:26.320 --> 1:17:28.320
maybe 10 hours less per week.
1:17:28.320 --> 1:17:35.320
So I realize the most important thing in my life is really love and the people I love.
1:17:35.320 --> 1:17:38.320
And I give that the highest priority.
1:17:38.320 --> 1:17:40.320
It isn't the only thing I do.
1:17:40.320 --> 1:17:45.320
But when that is needed, I put that at the top priority.
1:17:45.320 --> 1:17:49.320
And I feel much better and I feel much more balanced.
1:17:49.320 --> 1:17:56.320
And I think this also gives a hint as to a life of routine work,
1:17:56.320 --> 1:17:58.320
a life of pursuit of numbers.
1:17:58.320 --> 1:18:03.320
While my job was not routine, it was a pursuit of numbers.
1:18:03.320 --> 1:18:05.320
Pursuit of, can I make more money?
1:18:05.320 --> 1:18:07.320
Can I fund more great companies?
1:18:07.320 --> 1:18:09.320
Can I raise more money?
1:18:09.320 --> 1:18:13.320
Can I make sure our VC is ranked higher and higher every year?
1:18:13.320 --> 1:18:20.320
This competitive nature of driving for bigger numbers and better numbers
1:18:20.320 --> 1:18:27.320
became an endless pursuit that's mechanical.
1:18:27.320 --> 1:18:31.320
And bigger numbers really didn't make me happier.
1:18:31.320 --> 1:18:36.320
And faced with death, I realized bigger numbers really meant nothing.
1:18:36.320 --> 1:18:42.320
And what was important is that people who have given their heart and their love to me
1:18:42.320 --> 1:18:45.320
deserve for me to do the same.
1:18:45.320 --> 1:18:52.320
So there's deep profound truth in that, that everyone should hear and internalize.
1:18:52.320 --> 1:18:56.320
And that's really powerful for you to say that.
1:18:56.320 --> 1:19:02.320
I have to ask sort of a difficult question here.
1:19:02.320 --> 1:19:07.320
So I've competed in sports my whole life, looking historically.
1:19:07.320 --> 1:19:14.320
I'd like to challenge some aspect of that a little bit on the point of hard work.
1:19:14.320 --> 1:19:20.320
It feels that one of the greatest,
1:19:20.320 --> 1:19:26.320
most beautiful aspects of human nature is the ability to become obsessed,
1:19:26.320 --> 1:19:33.320
to become extremely passionate to the point where, yes, flaws are revealed,
1:19:33.320 --> 1:19:36.320
and to just give yourself fully to a task.
1:19:36.320 --> 1:19:41.320
That is, in another sense, you mentioned love being important,
1:19:41.320 --> 1:19:47.320
but in another sense, this kind of obsession, this pure exhibition of passion and hard work
1:19:47.320 --> 1:19:50.320
is truly what it means to be human.
1:19:50.320 --> 1:19:53.320
What deeper lessons should we take from that?
1:19:53.320 --> 1:19:55.320
Because you've accomplished incredible things.
1:19:55.320 --> 1:19:57.320
Like chasing numbers.
1:19:57.320 --> 1:20:01.320
But really, there's some incredible work there.
1:20:01.320 --> 1:20:07.320
So how do you think about that when you look back in your 20s, your 30s?
1:20:07.320 --> 1:20:10.320
What would you do differently?
1:20:10.320 --> 1:20:16.320
Would you really take back some of the incredible hard work?
1:20:16.320 --> 1:20:17.320
I would.
1:20:17.320 --> 1:20:20.320
But it's in percentages, right?
1:20:20.320 --> 1:20:22.320
We're both now computer scientists.
1:20:22.320 --> 1:20:27.320
So I think when one balances one's life, when one is younger,
1:20:27.320 --> 1:20:33.320
you might give a smaller percentage to family, but you would still give them high priority.
1:20:33.320 --> 1:20:38.320
And when you get older, you would give a larger percentage to them and still the high priority.
1:20:38.320 --> 1:20:43.320
And when you're near retirement, you give most of it to them and the highest priority.
1:20:43.320 --> 1:20:50.320
So I think the key point is not that we would work 20 hours less for the whole life
1:20:50.320 --> 1:20:56.320
and just spend it aimlessly with the family, but that when the family has a need,
1:20:56.320 --> 1:21:02.320
when your wife is having a baby, when your daughter has a birthday,
1:21:02.320 --> 1:21:07.320
or when they're depressed, or when they're celebrating something,
1:21:07.320 --> 1:21:11.320
or when they have a get together, or when we have family time,
1:21:11.320 --> 1:21:18.320
it is important for us to put down our phone and PC and be 100% with them.
1:21:18.320 --> 1:21:26.320
And that priority on the things that really matter isn't going to be so taxing
1:21:26.320 --> 1:21:32.320
that it would eliminate or even dramatically reduce our accomplishments.
1:21:32.320 --> 1:21:36.320
It might have some impact, but it might also have other impact
1:21:36.320 --> 1:21:39.320
because if you have a happier family, maybe you fight less.
1:21:39.320 --> 1:21:45.320
If you fight less, you don't spend time taking care of all the aftermath of a fight.
1:21:45.320 --> 1:21:46.320
That's right.
1:21:46.320 --> 1:21:48.320
And I'm not sure that it would take more time.
1:21:48.320 --> 1:21:53.320
And if it did, I'd be willing to take that reduction.
1:21:53.320 --> 1:21:56.320
And it's not a dramatic number, but it's a number
1:21:56.320 --> 1:22:00.320
that I think would give me a greater degree of happiness
1:22:00.320 --> 1:22:03.320
and knowing that I've done the right thing
1:22:03.320 --> 1:22:10.320
and still have plenty of hours to get the success that I want to get.
1:22:10.320 --> 1:22:14.320
So given the many successful companies that you've launched
1:22:14.320 --> 1:22:17.320
and much success throughout your career,
1:22:17.320 --> 1:22:25.320
what advice would you give to young people today looking,
1:22:25.320 --> 1:22:28.320
or it doesn't have to be young, but people today looking to launch
1:22:28.320 --> 1:22:35.320
and to create the next $1 billion tech startup, or even AI based startup?
1:22:35.320 --> 1:22:42.320
I would suggest that people understand technology waves move quickly.
1:22:42.320 --> 1:22:45.320
What worked two years ago may not work today.
1:22:45.320 --> 1:22:49.320
And that is very much a case in point for AI.
1:22:49.320 --> 1:22:53.320
I think two years ago, or maybe three years ago,
1:22:53.320 --> 1:22:57.320
you certainly could say I have a couple of super smart PhDs
1:22:57.320 --> 1:22:59.320
and we're not sure what we're going to do,
1:22:59.320 --> 1:23:04.320
but here's how we're going to start and get funding for a very high valuation.
1:23:04.320 --> 1:23:11.320
Those days are over because AI is going from rocket science towards mainstream.
1:23:11.320 --> 1:23:14.320
Not yet commodity, but more mainstream.
1:23:14.320 --> 1:23:20.320
So first, the creation of any company, to a venture capitalist,
1:23:20.320 --> 1:23:25.320
has to be creation of business value and monetary value.
1:23:25.320 --> 1:23:29.320
And when you have a very scarce commodity,
1:23:29.320 --> 1:23:34.320
VCs may be willing to accept greater uncertainty.
1:23:34.320 --> 1:23:40.320
But now there are many more people with the equivalent of a PhD from three years ago,
1:23:40.320 --> 1:23:43.320
because that can be learned more quickly.
1:23:43.320 --> 1:23:45.320
Platforms are emerging.
1:23:45.320 --> 1:23:51.320
The cost to become an AI engineer is much lower and there are many more AI engineers.
1:23:51.320 --> 1:23:53.320
So the market is different.
1:23:53.320 --> 1:23:57.320
So I would suggest someone who wants to build an AI company
1:23:57.320 --> 1:24:01.320
be thinking about the normal business questions.
1:24:01.320 --> 1:24:05.320
What customer cases are you trying to address?
1:24:05.320 --> 1:24:08.320
What kind of pain are you trying to address?
1:24:08.320 --> 1:24:10.320
How does that translate to value?
1:24:10.320 --> 1:24:16.320
How will you extract value and get paid through what channel?
1:24:16.320 --> 1:24:19.320
And how much business value will get created?
1:24:19.320 --> 1:24:26.320
That today needs to be thought about much earlier up front than it did three years ago.
1:24:26.320 --> 1:24:30.320
The scarcity question of AI talent has changed.
1:24:30.320 --> 1:24:32.320
The amount of AI talent has changed.
1:24:32.320 --> 1:24:41.320
So now you need not just AI, but also an understanding of the business, the customer, and the marketplace.
1:24:41.320 --> 1:24:49.320
So I also think you should have a more reasonable valuation expectation
1:24:49.320 --> 1:24:51.320
and growth expectation.
1:24:51.320 --> 1:24:53.320
There's going to be more competition.
1:24:53.320 --> 1:25:00.320
But the good news though is that AI technologies are now more available in open source.
1:25:00.320 --> 1:25:06.320
TensorFlow, PyTorch and such tools are much easier to use.
1:25:06.320 --> 1:25:13.320
So you should be able to experiment and get results iteratively faster than before.
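As one small illustration of that lowered barrier, here is a minimal, self-contained PyTorch training loop on synthetic data. The model and data are arbitrary stand-ins, but a first experiment really can be this short.

```python
# Minimal PyTorch experiment: a tiny classifier trained on synthetic data.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(512, 10)               # synthetic features
y = (X.sum(dim=1) > 0).long()          # synthetic binary labels

model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for _ in range(50):                    # a few epochs of full-batch training
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

with torch.no_grad():
    acc = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"loss={loss.item():.3f} accuracy={acc:.2f}")
```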
1:25:13.320 --> 1:25:18.320
So take more of a business mindset to this.
1:25:18.320 --> 1:25:25.320
Think of this less as a laboratory taken into a company, because we've gone beyond that stage.
1:25:25.320 --> 1:25:31.320
The only exception is if you truly have a breakthrough in some technology that really no one has,
1:25:31.320 --> 1:25:34.320
then the old way still works.
1:25:34.320 --> 1:25:36.320
But I think that's harder and harder now.
1:25:36.320 --> 1:25:44.320
So I know you believe as many do that we're far from creating an artificial general intelligence system.
1:25:44.320 --> 1:25:54.320
But say once we do and you get to ask her one question, what would that question be?
1:25:54.320 --> 1:26:00.320
What is it that differentiates you and me?
1:26:00.320 --> 1:26:05.320
Beautifully put, Kaifu, thank you so much for your time today.
1:26:05.320 --> 1:26:26.320
Thank you.