WEBVTT
00:00.000 --> 00:03.440
The following is a conversation with Kevin Scott,
00:03.440 --> 00:06.080
the CTO of Microsoft.
00:06.080 --> 00:08.560
Before that, he was the senior vice president
00:08.560 --> 00:11.080
of engineering and operations at LinkedIn,
00:11.080 --> 00:13.520
and before that, he oversaw mobile ads
00:13.520 --> 00:14.960
engineering at Google.
00:15.960 --> 00:19.000
He also has a podcast called Behind the Tech
00:19.000 --> 00:21.880
with Kevin Scott, which I'm a fan of.
00:21.880 --> 00:24.280
This was a fun and wide-ranging conversation
00:24.280 --> 00:26.680
that covered many aspects of computing.
00:26.680 --> 00:28.840
It happened over a month ago,
00:28.840 --> 00:31.000
before the announcement of Microsoft's investment
00:31.000 --> 00:34.440
in OpenAI that a few people have asked me about.
00:34.440 --> 00:38.120
I'm sure there'll be one or two people in the future
00:38.120 --> 00:41.400
that'll talk with me about the impact of that investment.
00:42.280 --> 00:45.440
This is the Artificial Intelligence podcast.
00:45.440 --> 00:47.680
If you enjoy it, subscribe on YouTube,
00:47.680 --> 00:49.440
give it five stars on iTunes,
00:49.440 --> 00:50.960
support it on Patreon,
00:50.960 --> 00:53.000
or simply connect with me on Twitter,
00:53.000 --> 00:57.680
at Lex Fridman, spelled F R I D M A N.
00:57.680 --> 00:59.240
And I'd like to give a special thank you
00:59.240 --> 01:01.960
to Tom and Elanti Bighausen
01:01.960 --> 01:04.600
for their support of the podcast on Patreon.
01:04.600 --> 01:06.080
Thanks Tom and Elanti.
01:06.080 --> 01:08.400
Hope I didn't mess up your last name too bad.
01:08.400 --> 01:10.520
Your support means a lot,
01:10.520 --> 01:13.480
and inspires me to keep this series going.
01:13.480 --> 01:18.160
And now, here's my conversation with Kevin Scott.
01:18.160 --> 01:20.760
You've described yourself as a kid in a candy store
01:20.760 --> 01:23.000
at Microsoft because of all the interesting projects
01:23.000 --> 01:24.200
that are going on.
01:24.200 --> 01:28.000
Can you try to do the impossible task
01:28.000 --> 01:31.760
and give a brief whirlwind view
01:31.760 --> 01:34.520
of all the spaces that Microsoft is working in?
01:35.520 --> 01:37.440
Both research and product.
01:37.440 --> 01:42.440
If you include research, it becomes even more difficult.
01:46.480 --> 01:48.880
So, I think broadly speaking,
01:48.880 --> 01:53.720
Microsoft's product portfolio includes everything
01:53.720 --> 01:56.920
from a big cloud business,
01:56.920 --> 01:59.360
like a big set of SaaS services.
01:59.360 --> 02:01.720
We have sort of the original,
02:01.720 --> 02:05.560
or like some of what are among the original
02:05.560 --> 02:09.640
productivity software products that everybody uses.
02:09.640 --> 02:11.200
We have an operating system business.
02:11.200 --> 02:13.560
We have a hardware business
02:13.560 --> 02:17.240
where we make everything from computer mice
02:17.240 --> 02:20.760
and headphones to high end,
02:20.760 --> 02:23.520
high end personal computers and laptops.
02:23.520 --> 02:27.680
We have a fairly broad ranging research group
02:27.680 --> 02:29.680
where we have people doing everything
02:29.680 --> 02:31.880
from economics research.
02:31.880 --> 02:35.920
So, there's this really smart young economist,
02:35.920 --> 02:39.760
Glen Weyl, who like my group works with a lot,
02:39.760 --> 02:42.880
who's doing this research on these things
02:42.880 --> 02:45.120
called radical markets.
02:45.120 --> 02:48.120
Like he's written an entire technical book
02:48.120 --> 02:51.120
about this whole notion of radical markets.
02:51.120 --> 02:53.520
So, like the research group sort of spans from that
02:53.520 --> 02:56.840
to human computer interaction, to artificial intelligence.
02:56.840 --> 03:01.040
And we have GitHub, we have LinkedIn.
03:01.040 --> 03:05.800
We have a search advertising and news business
03:05.800 --> 03:07.360
and like probably a bunch of stuff
03:07.360 --> 03:11.240
that I'm embarrassingly not recounting in this list.
03:11.240 --> 03:12.920
On gaming too, Xbox and so on, right?
03:12.920 --> 03:14.120
Yeah, gaming for sure.
03:14.120 --> 03:17.320
Like I was having a super fun conversation
03:17.320 --> 03:19.520
this morning with Phil Spencer.
03:19.520 --> 03:21.280
So, when I was in college,
03:21.280 --> 03:25.560
there was this game that LucasArts made
03:25.560 --> 03:27.600
called Day of the Tentacle,
03:27.600 --> 03:30.160
that my friends and I played forever.
03:30.160 --> 03:33.920
And like we're doing some interesting collaboration now
03:33.920 --> 03:37.920
with the folks who made Day of the Tentacle.
03:37.920 --> 03:40.840
And I was like completely nerding out with Tim Schafer,
03:40.840 --> 03:43.880
like the guy who wrote Day of the Tentacle this morning,
03:43.880 --> 03:45.840
just a complete fanboy,
03:45.840 --> 03:49.880
which you know, sort of it like happens a lot.
03:49.880 --> 03:53.320
Like, you know, Microsoft has been doing so much stuff
03:53.320 --> 03:56.000
at such breadth for such a long period of time
03:56.000 --> 03:59.680
that, you know, like being CTO,
03:59.680 --> 04:02.200
like most of the time my job is very, very serious
04:02.200 --> 04:05.640
and sometimes that like I get caught up
04:05.640 --> 04:09.200
in like how amazing it is
04:09.200 --> 04:11.520
to be able to have the conversations
04:11.520 --> 04:14.640
that I have with the people I get to have them with.
04:14.640 --> 04:17.040
You had to reach back into the sentimental
04:17.040 --> 04:21.640
and what's radical markets in economics?
04:21.640 --> 04:24.760
So the idea with radical markets is like,
04:24.760 --> 04:29.760
can you come up with new market based mechanisms to,
04:32.320 --> 04:33.800
you know, I think we have this,
04:33.800 --> 04:35.240
we're having this debate right now,
04:35.240 --> 04:40.040
like does capitalism work, like free markets work?
04:40.040 --> 04:43.000
Can the incentive structures
04:43.000 --> 04:46.360
that are built into these systems produce outcomes
04:46.360 --> 04:51.360
that are creating sort of equitably distributed benefits
04:51.560 --> 04:53.520
for every member of society?
04:55.400 --> 04:58.720
You know, and I think it's a reasonable set of questions
04:58.720 --> 04:59.560
to be asking.
04:59.560 --> 05:02.160
And so what Glen, and so like, you know,
05:02.160 --> 05:04.400
one mode of thought there, like if you have doubts
05:04.400 --> 05:06.720
that the markets are actually working,
05:06.720 --> 05:08.560
you can sort of like tip towards like,
05:08.560 --> 05:10.800
okay, let's become more socialist
05:10.800 --> 05:14.240
and like have central planning and governments
05:14.240 --> 05:15.800
or some other central organization
05:15.800 --> 05:18.280
is like making a bunch of decisions
05:18.280 --> 05:22.040
about how sort of work gets done
05:22.040 --> 05:25.400
and like where the investments
05:25.400 --> 05:28.880
and where the outputs of those investments get distributed.
05:28.880 --> 05:32.160
Glen's notion is like lean more
05:32.160 --> 05:35.800
into like the market based mechanism.
05:35.800 --> 05:37.920
So like for instance,
05:37.920 --> 05:39.600
this is one of the more radical ideas,
05:39.600 --> 05:44.600
like suppose that you had a radical pricing mechanism
05:45.160 --> 05:47.120
for assets like real estate
05:47.120 --> 05:52.120
where you could be bid out of your position
05:53.600 --> 05:58.600
in your home, you know, for instance.
05:58.720 --> 06:01.120
So like if somebody came along and said,
06:01.120 --> 06:04.400
you know, like I can find higher economic utility
06:04.400 --> 06:05.760
for this piece of real estate
06:05.760 --> 06:08.720
that you're running your business in,
06:08.720 --> 06:13.040
like then like you either have to, you know,
06:13.040 --> 06:16.440
sort of bid to sort of stay
06:16.440 --> 06:19.960
or like the thing that's got the higher economic utility,
06:19.960 --> 06:21.440
you know, sort of takes over the asset
06:21.440 --> 06:23.720
and which would make it very difficult
06:23.720 --> 06:27.600
to have the same sort of rent seeking behaviors
06:27.600 --> 06:29.000
that you've got right now
06:29.000 --> 06:34.000
because like if you did speculative bidding,
06:34.000 --> 06:39.000
like you would very quickly like lose a whole lot of money.
06:40.440 --> 06:43.520
And so like the prices of the assets would be sort of
06:43.520 --> 06:47.600
like very closely indexed to like the value
06:47.600 --> 06:49.720
that they can produce.
06:49.720 --> 06:52.680
And like because like you'd have this sort of real time
06:52.680 --> 06:55.320
mechanism that would force you to sort of mark the value
06:55.320 --> 06:56.800
of the asset to the market,
06:56.800 --> 06:58.560
then it could be taxed appropriately.
06:58.560 --> 07:00.400
Like you couldn't sort of sit on this thing and say,
07:00.400 --> 07:03.040
oh, like this house is only worth 10,000 bucks
07:03.040 --> 07:06.600
when like everything around it is worth 10 million.
07:06.600 --> 07:07.440
That's really interesting.
07:07.440 --> 07:08.720
So it's an incentive structure
07:08.720 --> 07:13.200
where the prices match the value much better.
07:13.200 --> 07:14.040
Yeah.
07:14.040 --> 07:16.320
And Glen does a much, much better job than I do
07:16.320 --> 07:18.920
at selling it, and I probably picked the world's worst example,
07:18.920 --> 07:20.360
you know, and, and, and, but like,
07:20.360 --> 07:24.520
and it's intentionally provocative, you know,
07:24.520 --> 07:26.480
so like this whole notion, like I, you know,
07:26.480 --> 07:28.920
like I'm not sure whether I like this notion
07:28.920 --> 07:31.120
that like we can have a set of market mechanisms
07:31.120 --> 07:35.360
where I could get bid out of, out of my property, you know,
07:35.360 --> 07:37.680
but, but, you know, like if you're thinking about something
07:37.680 --> 07:42.480
like Elizabeth Warren's wealth tax, for instance,
07:42.480 --> 07:45.600
like you would have, I mean, it'd be really interesting
07:45.600 --> 07:50.080
in like how you would actually set the price on the assets.
07:50.080 --> 07:52.040
And like you might have to have a mechanism like that
07:52.040 --> 07:54.160
if you put a tax like that in place.
07:54.160 --> 07:56.440
It's really interesting that that kind of research,
07:56.440 --> 07:59.800
at least tangentially touching Microsoft Research.
07:59.800 --> 08:00.640
Yeah.
08:00.640 --> 08:02.560
So if you're really thinking broadly,
08:02.560 --> 08:07.560
maybe you can speak to how this connects to AI.
08:08.400 --> 08:10.680
So we have a candidate, Andrew Yang,
08:10.680 --> 08:13.480
who kind of talks about artificial intelligence
08:13.480 --> 08:16.640
and the concern that people have about, you know,
08:16.640 --> 08:19.000
automation's impact on society.
08:19.000 --> 08:22.680
And arguably Microsoft is at the cutting edge
08:22.680 --> 08:25.040
of innovation in all these kinds of ways.
08:25.040 --> 08:27.080
And so it's pushing AI forward.
08:27.080 --> 08:30.040
How do you think about combining all our conversations
08:30.040 --> 08:32.840
together here with radical markets and socialism
08:32.840 --> 08:37.520
and innovation in AI that Microsoft is doing?
08:37.520 --> 08:42.520
And then Andrew Yang's worry that that will,
08:43.520 --> 08:46.840
that will result in job loss for the lower and so on.
08:46.840 --> 08:47.680
How do you think about that?
08:47.680 --> 08:51.160
I think it's sort of one of the most important questions
08:51.160 --> 08:55.320
in technology, like maybe even in society right now
08:55.320 --> 09:00.320
about how is AI going to develop over the course
09:00.720 --> 09:02.000
of the next several decades
09:02.000 --> 09:03.600
and like what's it gonna be used for
09:03.600 --> 09:06.560
and like what benefits will it produce
09:06.560 --> 09:08.520
and what negative impacts will it produce
09:08.520 --> 09:13.520
and you know, who gets to steer this whole thing?
09:13.720 --> 09:16.320
You know, I'll say at the highest level,
09:17.240 --> 09:22.240
one of the real joys of getting to do what I do at Microsoft
09:22.240 --> 09:27.240
is Microsoft has this heritage as a platform company.
09:27.560 --> 09:31.040
And so, you know, like Bill has this thing
09:31.040 --> 09:32.880
that he said a bunch of years ago
09:32.880 --> 09:36.440
where the measure of a successful platform
09:36.440 --> 09:39.800
is that it produces far more economic value
09:39.800 --> 09:41.840
for the people who build on top of the platform
09:41.840 --> 09:46.840
than is created for the platform owner or builder.
09:47.320 --> 09:50.920
And I think we have to think about AI that way.
09:50.920 --> 09:55.920
Like it has to be a platform that other people can use
09:56.280 --> 10:01.280
to build businesses, to fulfill their creative objectives,
10:01.280 --> 10:04.640
to be entrepreneurs, to solve problems that they have
10:04.640 --> 10:07.680
in their work and in their lives.
10:07.680 --> 10:11.960
It can't be a thing where there are a handful of companies
10:11.960 --> 10:16.440
sitting in a very small handful of cities geographically
10:16.440 --> 10:19.120
who are making all the decisions
10:19.120 --> 10:24.120
about what goes into the AI and like,
10:24.240 --> 10:26.920
and then on top of like all this infrastructure,
10:26.920 --> 10:31.000
then build all of the commercially valuable uses for it.
10:31.000 --> 10:34.400
So like, I think like that's bad from a, you know,
10:34.400 --> 10:36.520
sort of, you know, economics
10:36.520 --> 10:39.720
and sort of equitable distribution of value perspective,
10:39.720 --> 10:42.080
like, you know, sort of back to this whole notion of,
10:42.080 --> 10:44.560
you know, like, do the markets work?
10:44.560 --> 10:47.600
But I think it's also bad from an innovation perspective
10:47.600 --> 10:51.360
because like I have infinite amounts of faith
10:51.360 --> 10:53.880
in human beings that if you, you know,
10:53.880 --> 10:58.280
give folks powerful tools, they will go do interesting things.
10:58.280 --> 11:02.320
And it's more than just a few tens of thousands of people
11:02.320 --> 11:03.360
with the interesting tools,
11:03.360 --> 11:05.400
it should be millions of people with the tools.
11:05.400 --> 11:07.200
So it's sort of like, you know,
11:07.200 --> 11:10.200
you think about the steam engine
11:10.200 --> 11:13.800
in the late 18th century, like it was, you know,
11:13.800 --> 11:16.800
maybe the first large scale substitute for human labor
11:16.800 --> 11:19.120
that we've built like a machine.
11:19.120 --> 11:21.680
And, you know, in the beginning,
11:21.680 --> 11:23.520
when these things are getting deployed,
11:23.520 --> 11:28.320
the folks who got most of the value from the steam engines
11:28.320 --> 11:30.160
were the folks who had capital
11:30.160 --> 11:31.600
so they could afford to build them.
11:31.600 --> 11:34.720
And like they built factories around them in businesses
11:34.720 --> 11:38.680
and the experts who knew how to build and maintain them.
11:38.680 --> 11:42.880
But access to that technology democratized over time.
11:42.880 --> 11:47.040
Like now like an engine is not a,
11:47.040 --> 11:48.800
it's not like a differentiated thing.
11:48.800 --> 11:50.280
Like there isn't one engine company
11:50.280 --> 11:51.560
that builds all the engines
11:51.560 --> 11:53.120
and all of the things that use engines
11:53.120 --> 11:54.240
are made by this company.
11:54.240 --> 11:57.440
And like they get all the economics from all of that.
11:57.440 --> 11:59.320
Like, no, like fully democratized.
11:59.320 --> 12:00.600
Like they're probably, you know,
12:00.600 --> 12:02.360
we're sitting here in this room
12:02.360 --> 12:03.680
and like even though they don't,
12:03.680 --> 12:05.280
they're probably things, you know,
12:05.280 --> 12:09.120
like the MEMS gyroscopes that are in both of our,
12:09.120 --> 12:11.480
like there's like little engines, you know,
12:11.480 --> 12:14.520
sort of everywhere, they're just a component
12:14.520 --> 12:16.240
in how we build the modern world.
12:16.240 --> 12:17.680
Like AI needs to get there.
12:17.680 --> 12:20.200
Yeah, so that's a really powerful way to think.
12:20.200 --> 12:25.120
If we think of AI as a platform versus a tool
12:25.120 --> 12:27.600
that Microsoft owns as a platform
12:27.600 --> 12:30.120
that enables creation on top of it,
12:30.120 --> 12:31.520
that's the way to democratize it.
12:31.520 --> 12:34.200
That's really interesting actually.
12:34.200 --> 12:36.040
And Microsoft throughout its history
12:36.040 --> 12:38.240
has been positioned well to do that.
12:38.240 --> 12:41.640
And the, you know, the tieback to this radical markets thing,
12:41.640 --> 12:46.640
like the, so my team has been working with Glen
12:47.800 --> 12:51.120
on this, and Jaron Lanier actually.
12:51.120 --> 12:56.120
So Jaron is like the sort of father of virtual reality.
12:56.440 --> 12:59.480
Like he's one of the most interesting human beings
12:59.480 --> 13:01.760
on the planet, like a sweet, sweet guy.
13:02.840 --> 13:07.120
And so Jaron and Glen and folks in my team
13:07.120 --> 13:10.360
have been working on this notion of data as labor
13:10.360 --> 13:13.160
or like they call it data dignity as well.
13:13.160 --> 13:16.880
And so the idea is that if you, you know,
13:16.880 --> 13:18.600
again, going back to this, you know,
13:18.600 --> 13:20.800
sort of industrial analogy,
13:20.800 --> 13:23.560
if you think about data as the raw material
13:23.560 --> 13:27.640
that is consumed by the machine of AI
13:27.640 --> 13:30.560
in order to do useful things,
13:30.560 --> 13:34.400
then like we're not doing a really great job right now
13:34.400 --> 13:37.760
in having transparent marketplaces for valuing
13:37.760 --> 13:39.800
those data contributions.
13:39.800 --> 13:42.680
So like, and we all make them like explicitly,
13:42.680 --> 13:43.600
like you go to LinkedIn,
13:43.600 --> 13:46.160
you sort of set up your profile on LinkedIn,
13:46.160 --> 13:47.800
like that's an explicit contribution.
13:47.800 --> 13:49.480
Like, you know exactly the information
13:49.480 --> 13:50.720
that you're putting into the system.
13:50.720 --> 13:53.000
And like you put it there because you have
13:53.000 --> 13:55.520
some nominal notion of like what value
13:55.520 --> 13:56.640
you're going to get in return,
13:56.640 --> 13:57.720
but it's like only nominal.
13:57.720 --> 13:59.680
Like you don't know exactly what value
13:59.680 --> 14:02.040
you're getting in return, like the service is free, you know,
14:02.040 --> 14:04.600
like it's a low amount of, like, perceived value.
14:04.600 --> 14:06.680
And then you've got all this indirect contribution
14:06.680 --> 14:08.960
that you're making just by virtue of interacting
14:08.960 --> 14:13.160
with all of the technology that's in your daily life.
14:13.160 --> 14:16.120
And so like what Glen and Jaron
14:16.120 --> 14:19.440
and this data dignity team are trying to do is like,
14:19.440 --> 14:22.240
can we figure out a set of mechanisms
14:22.240 --> 14:26.000
that let us value those data contributions
14:26.000 --> 14:28.200
so that you could create an economy
14:28.200 --> 14:31.480
and like a set of controls and incentives
14:31.480 --> 14:36.480
that would allow people to like maybe even in the limit
14:36.840 --> 14:38.880
like earn part of their living
14:38.880 --> 14:41.000
through the data that they're creating.
14:41.000 --> 14:42.680
And like you can sort of see it in explicit ways.
14:42.680 --> 14:46.000
There are these companies like Scale AI
14:46.000 --> 14:49.960
and like there are a whole bunch of them in China right now
14:49.960 --> 14:52.400
that are basically data labeling companies.
14:52.400 --> 14:54.560
So like if you're doing supervised machine learning,
14:54.560 --> 14:57.400
you need lots and lots of labeled training data.
14:58.600 --> 15:01.440
And like those people are getting like who work
15:01.440 --> 15:03.600
for those companies are getting compensated
15:03.600 --> 15:06.360
for their data contributions into the system.
15:06.360 --> 15:07.720
And so...
15:07.720 --> 15:10.280
That's easier to put a number on their contribution
15:10.280 --> 15:11.960
because they're explicitly labeling data.
15:11.960 --> 15:12.800
Correct.
15:12.800 --> 15:14.360
But you're saying that we're all contributing data
15:14.360 --> 15:15.720
in different kinds of ways.
15:15.720 --> 15:19.640
And it's fascinating to start to explicitly try
15:19.640 --> 15:20.880
to put a number on it.
15:20.880 --> 15:22.600
Do you think that's possible?
15:22.600 --> 15:23.640
I don't know, it's hard.
15:23.640 --> 15:25.480
It really is.
15:25.480 --> 15:30.480
Because, you know, we don't have as much transparency
15:30.480 --> 15:35.480
as I think we need in like how the data is getting used.
15:37.240 --> 15:38.720
And it's, you know, super complicated.
15:38.720 --> 15:41.000
Like, you know, we, you know,
15:41.000 --> 15:42.880
I think as technologists sort of appreciate
15:42.880 --> 15:44.160
like some of the subtlety there.
15:44.160 --> 15:47.880
It's like, you know, the data, the data gets created
15:47.880 --> 15:51.400
and then it gets, you know, it's not valuable.
15:51.400 --> 15:56.000
Like the data exhaust that you give off
15:56.000 --> 15:58.480
or the, you know, the explicit data
15:58.480 --> 16:03.240
that I am putting into the system isn't valuable.
16:03.240 --> 16:05.160
It's not super valuable atomically.
16:05.160 --> 16:08.360
Like it's only valuable when you sort of aggregate it together
16:08.360 --> 16:10.440
into, you know, sort of large numbers.
16:10.440 --> 16:11.960
It's true even for these like folks
16:11.960 --> 16:14.880
who are getting compensated for like labeling things.
16:14.880 --> 16:16.480
Like for supervised machine learning now,
16:16.480 --> 16:20.080
like you need lots of labels to train, you know,
16:20.080 --> 16:22.080
a model that performs well.
16:22.080 --> 16:24.440
And so, you know, I think that's one of the challenges.
16:24.440 --> 16:26.120
It's like, how do you, you know,
16:26.120 --> 16:28.000
how do you sort of figure out like
16:28.000 --> 16:31.480
because this data is getting combined in so many ways,
16:31.480 --> 16:33.880
like through these combinations,
16:33.880 --> 16:35.880
like how the value is flowing.
16:35.880 --> 16:38.520
Yeah, that's, that's fascinating.
16:38.520 --> 16:39.360
Yeah.
16:39.360 --> 16:41.880
And it's fascinating that you're thinking about this.
16:41.880 --> 16:44.160
And I wasn't even going into this conversation
16:44.160 --> 16:48.200
expecting the breadth of research really
16:48.200 --> 16:50.600
that Microsoft broadly is thinking about,
16:50.600 --> 16:52.360
that you're thinking about at Microsoft.
16:52.360 --> 16:57.360
So if we go back to '89 when Microsoft released Office
16:57.360 --> 17:00.920
or 1990 when they released Windows 3.0,
17:00.920 --> 17:04.960
how's the, in your view,
17:04.960 --> 17:07.280
I know you weren't there the entire, you know,
17:07.280 --> 17:09.760
through its history, but how has the company changed
17:09.760 --> 17:12.840
in the 30 years since as you look at it now?
17:12.840 --> 17:17.080
The good thing is it started off as a platform company.
17:17.080 --> 17:19.960
Like it's still a platform company,
17:19.960 --> 17:22.640
like the parts of the business that are like thriving
17:22.640 --> 17:26.560
and most successful are those that are building platforms,
17:26.560 --> 17:29.000
like the mission of the company now is,
17:29.000 --> 17:30.120
the mission's changed.
17:30.120 --> 17:32.480
It's like changing in a very interesting way.
17:32.480 --> 17:36.280
So, you know, back in '89, '90,
17:36.280 --> 17:39.040
like they were still on the original mission,
17:39.040 --> 17:43.840
which was like put a PC on every desk and in every home.
17:43.840 --> 17:47.480
Like, and it was basically about democratizing access
17:47.480 --> 17:50.000
to this new personal computing technology,
17:50.000 --> 17:52.680
which when Bill started the company,
17:52.680 --> 17:57.680
integrated circuit microprocessors were a brand new thing
17:57.680 --> 18:00.120
and like people were building, you know,
18:00.120 --> 18:03.840
homebrew computers, you know, from kits,
18:03.840 --> 18:07.520
like the way people build ham radios right now.
18:08.520 --> 18:10.680
And I think this is sort of the interesting thing
18:10.680 --> 18:12.840
for folks who build platforms in general.
18:12.840 --> 18:16.840
Bill saw the opportunity there
18:16.840 --> 18:18.720
and what personal computers could do.
18:18.720 --> 18:20.440
And it was like, it was sort of a reach.
18:20.440 --> 18:21.680
Like you just sort of imagined
18:21.680 --> 18:23.880
like where things were, you know,
18:23.880 --> 18:24.880
when they started the company
18:24.880 --> 18:26.120
versus where things are now.
18:26.120 --> 18:29.400
Like in success, when you democratize a platform,
18:29.400 --> 18:31.000
it just sort of vanishes into the platform.
18:31.000 --> 18:32.480
You don't pay attention to it anymore.
18:32.480 --> 18:35.600
Like operating systems aren't a thing anymore.
18:35.600 --> 18:38.040
Like they're super important, like completely critical.
18:38.040 --> 18:41.760
And like, you know, when you see one, you know, fail,
18:41.760 --> 18:43.520
like you just, you sort of understand,
18:43.520 --> 18:45.320
but like, you know, it's not a thing where you're,
18:45.320 --> 18:47.920
you're not like waiting for, you know,
18:47.920 --> 18:50.480
the next operating system thing
18:50.480 --> 18:52.960
in the same way that you were in 1995, right?
18:52.960 --> 18:54.280
Like in 1995, like, you know,
18:54.280 --> 18:56.000
we had the Rolling Stones on the stage
18:56.000 --> 18:57.600
with the Windows 95 rollout.
18:57.600 --> 18:59.320
Like it was like the biggest thing in the world.
18:59.320 --> 19:01.080
Everybody would like lined up for it
19:01.080 --> 19:03.400
the way that people used to line up for iPhone.
19:03.400 --> 19:05.120
But like, you know, eventually,
19:05.120 --> 19:07.160
and like this isn't necessarily a bad thing.
19:07.160 --> 19:09.000
Like it just sort of, you know,
19:09.000 --> 19:12.880
the success is that it's sort of, it becomes ubiquitous.
19:12.880 --> 19:14.800
It's like everywhere and like human beings
19:14.800 --> 19:16.640
when their technology becomes ubiquitous,
19:16.640 --> 19:18.240
they just sort of start taking it for granted.
19:18.240 --> 19:23.240
So the mission now that Satya rearticulated
19:23.640 --> 19:25.280
five plus years ago now
19:25.280 --> 19:27.360
when he took over as CEO of the company,
19:29.320 --> 19:33.480
our mission is to empower every individual
19:33.480 --> 19:37.760
and every organization in the world to be more successful.
19:39.200 --> 19:43.160
And so, you know, again, like that's a platform mission.
19:43.160 --> 19:46.320
And like the way that we do it now is different.
19:46.320 --> 19:48.680
It's like we have a hyperscale cloud
19:48.680 --> 19:51.680
that people are building their applications on top of.
19:51.680 --> 19:53.680
Like we have a bunch of AI infrastructure
19:53.680 --> 19:56.280
that people are building their AI applications on top of.
19:56.280 --> 20:01.280
We have, you know, we have a productivity suite of software
20:02.280 --> 20:05.800
like Microsoft Dynamics, which, you know,
20:05.800 --> 20:07.440
some people might not think is the sexiest thing
20:07.440 --> 20:10.040
in the world, but it's like helping people figure out
20:10.040 --> 20:12.720
how to automate all of their business processes
20:12.720 --> 20:16.800
and workflows and to, you know, like help those businesses
20:16.800 --> 20:19.120
using it to like grow and be more successful.
20:19.120 --> 20:24.120
So it's a much broader vision in a way now
20:24.240 --> 20:25.480
than it was back then.
20:25.480 --> 20:27.400
Like it was sort of a very particular thing.
20:27.400 --> 20:29.280
And like now, like we live in this world
20:29.280 --> 20:31.320
where technology is so powerful
20:31.320 --> 20:36.320
and it's like such a basic fact of life
20:36.320 --> 20:39.760
that it, you know, that it both exists
20:39.760 --> 20:42.760
and is going to get better and better over time
20:42.760 --> 20:46.000
or at least more and more powerful over time.
20:46.000 --> 20:48.200
So like, you know, what you have to do as a platform player
20:48.200 --> 20:49.920
is just much bigger.
20:49.920 --> 20:50.760
Right.
20:50.760 --> 20:52.600
There's so many directions in which you can transform.
20:52.600 --> 20:55.160
You didn't mention mixed reality too.
20:55.160 --> 20:59.200
You know, that's probably early days
20:59.200 --> 21:00.680
or depends how you think of it.
21:00.680 --> 21:02.240
But if we think in a scale of centuries,
21:02.240 --> 21:04.120
it's the early days of mixed reality.
21:04.120 --> 21:04.960
Oh, for sure.
21:04.960 --> 21:08.280
And so yeah, with HoloLens,
21:08.280 --> 21:10.600
Microsoft is doing some really interesting work there.
21:10.600 --> 21:13.560
Do you touch that part of the effort?
21:13.560 --> 21:14.840
What's the thinking?
21:14.840 --> 21:17.640
Do you think of mixed reality as a platform too?
21:17.640 --> 21:18.480
Oh, sure.
21:18.480 --> 21:21.320
When we look at what the platforms of the future could be.
21:21.320 --> 21:23.880
So like fairly obvious that like AI is one,
21:23.880 --> 21:26.600
like you don't have to, I mean, like that's,
21:26.600 --> 21:29.160
you know, you sort of say it to like someone
21:29.160 --> 21:31.920
and you know, like they get it.
21:31.920 --> 21:36.280
But like we also think of the like mixed reality
21:36.280 --> 21:39.560
and quantum as like these two interesting,
21:39.560 --> 21:40.920
you know, potentially.
21:40.920 --> 21:41.800
Quantum computing.
21:41.800 --> 21:42.640
Yeah.
21:42.640 --> 21:44.520
Okay, so let's get crazy then.
21:44.520 --> 21:48.920
So you're talking about some futuristic things here.
21:48.920 --> 21:50.920
Well, the mixed reality Microsoft is really,
21:50.920 --> 21:52.600
it's not even futuristic, it's here.
21:52.600 --> 21:53.440
It is.
21:53.440 --> 21:54.280
Incredible stuff.
21:54.280 --> 21:56.680
And look, and it's having an impact right now.
21:56.680 --> 21:58.720
Like one of the more interesting things
21:58.720 --> 22:01.280
that's happened with mixed reality over the past
22:01.280 --> 22:04.120
couple of years that I didn't clearly see
22:04.120 --> 22:08.400
is that it's become the computing device
22:08.400 --> 22:13.160
for folks who, for doing their work
22:13.160 --> 22:16.040
who haven't used any computing device at all
22:16.040 --> 22:16.960
to do their work before.
22:16.960 --> 22:19.800
So technicians and service folks
22:19.800 --> 22:24.200
and people who are doing like machine maintenance
22:24.200 --> 22:25.280
on factory floors.
22:25.280 --> 22:28.760
So like they, you know, because they're mobile
22:28.760 --> 22:30.280
and like they're out in the world
22:30.280 --> 22:32.320
and they're working with their hands
22:32.320 --> 22:34.080
and, you know, sort of servicing these
22:34.080 --> 22:36.520
like very complicated things.
22:36.520 --> 22:39.440
They're, they don't use their mobile phone
22:39.440 --> 22:41.440
and like they don't carry a laptop with them.
22:41.440 --> 22:43.480
And, you know, they're not tethered to a desk.
22:43.480 --> 22:46.920
And so mixed reality, like where it's getting
22:46.920 --> 22:48.840
traction right now, where HoloLens is selling
22:48.840 --> 22:53.840
a lot of units is for these sorts of applications
22:53.880 --> 22:55.440
for these workers and it's become like,
22:55.440 --> 22:58.040
I mean, like the people love it.
22:58.040 --> 23:00.600
They're like, oh my God, like this is like,
23:00.600 --> 23:02.840
for them like the same sort of productivity boosts
23:02.840 --> 23:05.520
that, you know, like an office worker had
23:05.520 --> 23:08.200
when they got their first personal computer.
23:08.200 --> 23:09.800
Yeah, but you did mention,
23:09.800 --> 23:13.400
it's certainly obvious AI as a platform,
23:13.400 --> 23:15.560
but can we dig into it a little bit?
23:15.560 --> 23:18.320
How does AI begin to infuse some of the products
23:18.320 --> 23:19.480
in Microsoft?
23:19.480 --> 23:24.480
So currently providing training of, for example,
23:25.040 --> 23:26.760
neural networks in the cloud
23:26.760 --> 23:30.960
or providing pre-trained models
23:30.960 --> 23:35.360
or just even providing computing resources
23:35.360 --> 23:37.520
and whatever different inference
23:37.520 --> 23:39.320
that you want to do using neural networks.
23:39.320 --> 23:40.160
Yep.
23:40.160 --> 23:43.560
Well, how do you think of AI infusing the,
23:43.560 --> 23:45.880
as a platform that Microsoft can provide?
23:45.880 --> 23:48.320
Yeah, I mean, I think it's, it's super interesting.
23:48.320 --> 23:49.560
It's like everywhere.
23:49.560 --> 23:54.560
And like we run these, we run these review meetings now
23:54.560 --> 23:59.560
where it's me and Satya and like members of Satya's
24:01.480 --> 24:04.600
leadership team and like a cross functional group
24:04.600 --> 24:06.200
of folks across the entire company
24:06.200 --> 24:11.200
who are working on like either AI infrastructure
24:11.840 --> 24:15.520
or like have some substantial part of their,
24:16.480 --> 24:21.480
of their product work using AI in some significant way.
24:21.480 --> 24:23.440
Now, the important thing to understand is like,
24:23.440 --> 24:27.040
when you think about like how the AI is going to manifest
24:27.040 --> 24:29.600
in like an experience for something
24:29.600 --> 24:30.760
that's going to make it better,
24:30.760 --> 24:35.760
like I think you don't want the AI in this
24:35.760 --> 24:37.760
to be the first order thing.
24:37.760 --> 24:40.600
It's like whatever the product is and like the thing
24:40.600 --> 24:42.440
that is trying to help you do,
24:42.440 --> 24:44.560
like the AI just sort of makes it better.
24:44.560 --> 24:46.840
And you know, this is a gross exaggeration,
24:46.840 --> 24:50.680
but like I, yeah, people get super excited about it.
24:50.680 --> 24:53.280
They're super excited about like where the AI is showing up
24:53.280 --> 24:55.440
in products and I'm like, do you get that excited
24:55.440 --> 24:59.880
about like where you're using a hash table like in your code?
24:59.880 --> 25:03.200
Like it's just another, it's a very interesting
25:03.200 --> 25:05.800
programming tool, but it's sort of like it's an engineering
25:05.800 --> 25:09.560
tool and so like it shows up everywhere.
25:09.560 --> 25:12.920
So like we've got dozens and dozens of features now
25:12.920 --> 25:17.400
in Office that are powered by like fairly sophisticated
25:17.400 --> 25:22.200
machine learning, our search engine wouldn't work at all
25:22.200 --> 25:24.840
if you took the machine learning out of it.
25:24.840 --> 25:28.560
And like increasingly, you know,
25:28.560 --> 25:33.560
things like content moderation on our Xbox and xCloud
25:34.800 --> 25:35.960
platform.
25:37.000 --> 25:39.160
When you say moderation, do you mean like the recommender,
25:39.160 --> 25:41.760
like showing what you want to look at next?
25:41.760 --> 25:44.000
No, no, no, it's like anti-bullying stuff.
25:44.000 --> 25:47.040
So the usual social network stuff that you have to deal with.
25:47.040 --> 25:47.880
Yeah, correct.
25:47.880 --> 25:50.080
But it's like really it's targeted,
25:50.080 --> 25:52.280
it's targeted towards a gaming audience.
25:52.280 --> 25:55.320
So it's like a very particular type of thing where,
25:55.320 --> 25:59.480
you know, the line between playful banter
25:59.480 --> 26:02.280
and like legitimate bullying is like a subtle one.
26:02.280 --> 26:06.080
And like you have to, it's sort of tough.
26:06.080 --> 26:09.080
I'd love to, if we could dig into it,
26:09.080 --> 26:11.720
because you're also, you led the engineering efforts
26:11.720 --> 26:14.920
of LinkedIn and if we look at,
26:14.920 --> 26:17.640
if we look at LinkedIn as a social network
26:17.640 --> 26:21.760
and if we look at the Xbox gaming as the social components,
26:21.760 --> 26:24.840
the very different kinds of, I imagine communication
26:24.840 --> 26:26.880
going on on the two platforms, right?
26:26.880 --> 26:29.520
And the line in terms of bullying and so on
26:29.520 --> 26:31.480
is different on the two platforms.
26:31.480 --> 26:33.480
So how do you, I mean,
26:33.480 --> 26:36.240
such a fascinating philosophical discussion
26:36.240 --> 26:37.240
of where that line is.
26:37.240 --> 26:39.840
I don't think anyone knows the right answer.
26:39.840 --> 26:42.040
Twitter folks are under fire now,
26:42.040 --> 26:45.120
Jack at Twitter for trying to find that line.
26:45.120 --> 26:46.920
Nobody knows what that line is,
26:46.920 --> 26:51.720
but how do you try to find the line for,
26:52.480 --> 26:57.480
you know, trying to prevent abusive behavior
26:58.040 --> 27:00.200
and at the same time let people be playful
27:00.200 --> 27:02.880
and joke around and that kind of thing.
27:02.880 --> 27:04.640
I think in a certain way, like, you know,
27:04.640 --> 27:09.640
if you have what I would call vertical social networks,
27:09.640 --> 27:12.200
it gets to be a little bit easier.
27:12.200 --> 27:14.440
So like if you have a clear notion
27:14.440 --> 27:17.960
of like what your social network should be used for
27:17.960 --> 27:22.280
or like what you are designing a community around,
27:22.280 --> 27:25.800
then you don't have as many dimensions
27:25.800 --> 27:28.960
to your sort of content safety problem
27:28.960 --> 27:33.720
as, you know, as you do in a general purpose platform.
27:33.720 --> 27:37.520
I mean, so like on LinkedIn,
27:37.520 --> 27:39.920
like the whole social network is about
27:39.920 --> 27:41.560
connecting people with opportunity,
27:41.560 --> 27:43.160
whether it's helping them find a job
27:43.160 --> 27:46.280
or to, you know, sort of find mentors
27:46.280 --> 27:49.320
or to, you know, sort of help them
27:49.320 --> 27:52.120
like find their next sales lead
27:52.120 --> 27:56.160
or to just sort of allow them to broadcast
27:56.160 --> 27:59.440
their, you know, sort of professional identity
27:59.440 --> 28:04.440
to their network of peers and collaborators
28:04.440 --> 28:05.880
and, you know, sort of professional community.
28:05.880 --> 28:07.400
Like that is, I mean, like in some ways,
28:07.400 --> 28:08.960
like that's very, very broad,
28:08.960 --> 28:12.480
but in other ways, it's sort of, you know, it's narrow.
28:12.480 --> 28:17.480
And so like you can build AIs like machine learning systems
28:18.360 --> 28:23.360
that are, you know, capable with those boundaries
28:23.360 --> 28:26.200
of making better automated decisions about like,
28:26.200 --> 28:28.240
what is, you know, sort of an inappropriate
28:28.240 --> 28:30.440
and offensive comment or dangerous comment
28:30.440 --> 28:31.920
or illegal content.
28:31.920 --> 28:34.800
When you have some constraints,
28:34.800 --> 28:37.400
you know, same thing with, you know,
28:37.400 --> 28:40.880
same thing with like the gaming social network.
28:40.880 --> 28:42.680
So for instance, like it's about playing games,
28:42.680 --> 28:44.880
about having fun and like the thing
28:44.880 --> 28:47.240
that you don't want to have happen on the platform.
28:47.240 --> 28:49.160
It's why bullying is such an important thing.
28:49.160 --> 28:50.600
Like bullying is not fun.
28:50.600 --> 28:53.400
So you want to do everything in your power
28:53.400 --> 28:56.240
to encourage that not to happen.
28:56.240 --> 29:00.320
And yeah, I think that's a really important thing,
29:00.320 --> 29:03.920
but I think it's sort of a tough problem in general.
29:03.920 --> 29:05.280
It's one where I think, you know,
29:05.280 --> 29:07.120
eventually we're gonna have to have
29:09.120 --> 29:13.800
some sort of clarification from our policy makers
29:13.800 --> 29:17.400
about what it is that we should be doing,
29:17.400 --> 29:20.880
like where the lines are, because it's tough.
29:20.880 --> 29:23.760
Like in a democracy, right?
29:23.760 --> 29:26.680
Like you want some sort
29:26.680 --> 29:28.880
of democratic involvement.
29:28.880 --> 29:30.440
Like people should have a say
29:30.440 --> 29:34.680
in like where the lines are drawn.
29:34.680 --> 29:36.920
Like you don't want a bunch of people
29:36.920 --> 29:39.480
making like unilateral decisions.
29:39.480 --> 29:43.120
And like we are in a state right now
29:43.120 --> 29:44.760
for some of these platforms where you actually
29:44.760 --> 29:46.280
do have to make unilateral decisions
29:46.280 --> 29:48.640
where the policy making isn't gonna happen fast enough
29:48.640 --> 29:52.520
in order to like prevent very bad things from happening.
29:52.520 --> 29:55.200
But like we need the policy making side of that
29:55.200 --> 29:58.480
to catch up I think as quickly as possible
29:58.480 --> 30:00.680
because you want that whole process
30:00.680 --> 30:02.000
to be a democratic thing,
30:02.000 --> 30:05.760
not a, you know, not some sort of weird thing
30:05.760 --> 30:08.040
where you've got a non representative group
30:08.040 --> 30:10.440
of people making decisions that have, you know,
30:10.440 --> 30:12.520
like national and global impact.
30:12.520 --> 30:14.720
And it's fascinating because the digital space
30:14.720 --> 30:17.520
is different than the physical space
30:17.520 --> 30:19.800
in which nations and governments were established.
30:19.800 --> 30:23.960
And so what policy looks like globally,
30:23.960 --> 30:25.760
what bullying looks like globally,
30:25.760 --> 30:28.360
what healthy communication looks like globally
30:28.360 --> 30:31.920
is an open question and we're all figuring it out together.
30:31.920 --> 30:32.760
Which is fascinating.
30:32.760 --> 30:37.160
Yeah, I mean with, you know, sort of fake news for instance
30:37.160 --> 30:42.160
and deep fakes and fake news generated by humans.
30:42.320 --> 30:44.600
Yeah, so we can talk about deep fakes.
30:44.600 --> 30:46.120
Like I think that is another like, you know,
30:46.120 --> 30:48.280
sort of very interesting level of complexity.
30:48.280 --> 30:51.480
But like if you think about just the written word, right?
30:51.480 --> 30:54.400
Like we have, you know, we invented papyrus
30:54.400 --> 30:56.760
what, 3000 years ago, where, you know,
30:56.760 --> 31:01.160
you could sort of put words on paper.
31:01.160 --> 31:06.160
And then 500 years ago, like we get the printing press
31:07.240 --> 31:11.480
like where the word gets a little bit more ubiquitous.
31:11.480 --> 31:14.600
And then like you really, really didn't get ubiquitous
31:14.600 --> 31:18.400
printed word until the end of the 19th century
31:18.400 --> 31:20.720
when the offset press was invented.
31:20.720 --> 31:22.360
And then, you know, just sort of explodes
31:22.360 --> 31:25.360
and like, you know, the cross product of that
31:25.360 --> 31:28.960
and the industrial revolutions need
31:28.960 --> 31:32.880
for educated citizens resulted in like
31:32.880 --> 31:34.720
this rapid expansion of literacy
31:34.720 --> 31:36.000
and the rapid expansion of the word.
31:36.000 --> 31:39.680
But like we had 3000 years up to that point
31:39.680 --> 31:44.040
to figure out like how to, you know, like what's,
31:44.040 --> 31:46.880
what's journalism, what's editorial integrity?
31:46.880 --> 31:50.120
Like what's, you know, what's scientific peer review?
31:50.120 --> 31:52.840
And so like you built all of this mechanism
31:52.840 --> 31:57.080
to like try to filter through all of the noise
31:57.080 --> 32:00.600
that the technology made possible to like, you know,
32:00.600 --> 32:04.000
sort of getting to something that society could cope with.
32:04.000 --> 32:06.600
And like, if you think about just the PC,
32:06.600 --> 32:09.800
the PC didn't exist 50 years ago.
32:09.800 --> 32:11.800
And so in like this span of, you know,
32:11.800 --> 32:16.160
like half a century, like we've gone from no digital,
32:16.160 --> 32:18.320
you know, no ubiquitous digital technology
32:18.320 --> 32:21.080
to like having a device that sits in your pocket
32:21.080 --> 32:23.760
where you can sort of say whatever is on your mind
32:23.760 --> 32:26.800
to, like, what was it,
32:26.800 --> 32:31.800
Mary Meeker just released her new like slide deck last week.
32:32.440 --> 32:37.360
You know, we've got 50% penetration of the internet
32:37.360 --> 32:38.520
to the global population.
32:38.520 --> 32:40.280
Like there are like three and a half billion people
32:40.280 --> 32:41.720
who are connected now.
32:41.720 --> 32:43.720
So it's like, it's crazy, crazy.
32:43.720 --> 32:45.000
It's like inconceivable,
32:45.000 --> 32:46.480
like how fast all of this happened.
32:46.480 --> 32:48.720
So, you know, it's not surprising
32:48.720 --> 32:51.000
that we haven't figured out what to do yet,
32:51.000 --> 32:55.640
but like we gotta really like lean into this set of problems
32:55.640 --> 33:00.200
because like we basically have three millennia worth of work
33:00.200 --> 33:02.520
to do about how to deal with all of this
33:02.520 --> 33:05.800
and like probably what amounts to the next decade
33:05.800 --> 33:07.040
worth of time.
33:07.040 --> 33:09.960
So since we're on the topic of tough, you know,
33:09.960 --> 33:11.600
tough challenging problems,
33:11.600 --> 33:15.200
let's look at more on the tooling side in AI
33:15.200 --> 33:18.440
that Microsoft is looking at as face recognition software.
33:18.440 --> 33:21.840
So there's a lot of powerful positive use cases
33:21.840 --> 33:24.240
for face recognition, but there's some negative ones
33:24.240 --> 33:27.200
and we're seeing those in different governments
33:27.200 --> 33:28.160
in the world.
33:28.160 --> 33:30.240
So how do you, how does Microsoft think
33:30.240 --> 33:33.880
about the use of face recognition software
33:33.880 --> 33:38.880
as a platform in governments and companies?
33:39.400 --> 33:42.280
Yeah, how do we strike an ethical balance here?
33:42.280 --> 33:47.280
Yeah, I think we've articulated a clear point of view.
33:47.280 --> 33:51.840
So Brad Smith wrote a blog post last fall,
33:51.840 --> 33:54.120
I believe, that sort of like outlined,
33:54.120 --> 33:57.000
like very specifically what, you know,
33:57.000 --> 33:59.280
what our point of view is there.
33:59.280 --> 34:02.240
And, you know, I think we believe that there are certain uses
34:02.240 --> 34:04.680
to which face recognition should not be put
34:04.680 --> 34:09.160
and we believe again that there's a need for regulation there.
34:09.160 --> 34:12.440
Like the government should like really come in and say
34:12.440 --> 34:15.720
that, you know, this is where the lines are.
34:15.720 --> 34:18.600
And like we very much want, like, figuring out
34:18.600 --> 34:20.680
where the lines are to be a democratic process.
34:20.680 --> 34:23.240
But in the short term, like we've drawn some lines
34:23.240 --> 34:26.640
where, you know, we push back against uses
34:26.640 --> 34:29.440
of face recognition technology.
34:29.440 --> 34:32.480
You know, like the city of San Francisco, for instance,
34:32.480 --> 34:36.480
I think has completely outlawed any government agency
34:36.480 --> 34:39.560
from using face recognition tech.
34:39.560 --> 34:44.560
And like that may prove to be a little bit overly broad.
34:44.560 --> 34:48.840
But for like certain law enforcement things,
34:48.840 --> 34:53.840
like you really, I would personally rather be overly
34:54.040 --> 34:57.400
sort of cautious in terms of restricting use of it
34:57.400 --> 34:58.920
until like we have, you know,
34:58.920 --> 35:02.160
sort of defined a reasonable, you know,
35:02.160 --> 35:04.880
democratically determined regulatory framework
35:04.880 --> 35:08.840
for like where we could and should use it.
35:08.840 --> 35:10.880
And, you know, the other thing there is
35:11.960 --> 35:14.000
like we've got a bunch of research that we're doing
35:14.000 --> 35:18.400
and a bunch of progress that we've made on bias there.
35:18.400 --> 35:20.880
And like there are all sorts of like weird biases
35:20.880 --> 35:23.640
that these models can have like all the way
35:23.640 --> 35:26.920
from like the most noteworthy one where, you know,
35:26.920 --> 35:31.680
you may have underrepresented minorities
35:31.680 --> 35:34.680
who are like underrepresented in the training data.
35:34.680 --> 35:39.240
And then you start learning like strange things.
35:39.240 --> 35:42.160
But like they're even, you know, other weird things
35:42.160 --> 35:46.480
like we've, I think we've seen in the public research
35:46.480 --> 35:49.520
like models can learn strange things
35:49.520 --> 35:54.520
like all doctors are men, for instance.
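One way to catch the kind of skew being described here, like a model learning that all doctors are men, is to slice its error rate by group rather than looking only at aggregate accuracy. A minimal sketch with made-up group names and predictions:

```python
# Sketch: checking a model's error rate per demographic group.
# Group names and data are hypothetical; the point is that aggregate
# accuracy can hide large per-group disparities.
from collections import defaultdict

def per_group_error(examples):
    """examples: list of (group, y_true, y_pred) tuples."""
    errors, counts = defaultdict(int), defaultdict(int)
    for group, y_true, y_pred in examples:
        counts[group] += 1
        errors[group] += int(y_true != y_pred)
    return {g: errors[g] / counts[g] for g in counts}

preds = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1),
    ("group_b", 1, 0), ("group_b", 1, 0), ("group_b", 0, 0),
]
print(per_group_error(preds))  # {'group_a': 0.0, 'group_b': ~0.67}
```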
35:54.520 --> 35:59.520
Yeah, I mean, and so like it really is a thing where
36:00.760 --> 36:03.600
it's very important for everybody
36:03.600 --> 36:08.440
who is working on these things before they push publish,
36:08.440 --> 36:12.800
they launch the experiment, they, you know, push the code,
36:12.800 --> 36:17.120
you know, online, or they even publish the paper,
36:17.120 --> 36:20.040
that they are at least starting to think
36:20.040 --> 36:25.040
about what some of the potential negative consequences
36:25.040 --> 36:25.880
are of some of this stuff.
36:25.880 --> 36:29.040
I mean, this is where, you know, like the deep fake stuff
36:29.040 --> 36:32.360
I find very worrisome just because
36:32.360 --> 36:37.360
there are going to be some very good beneficial uses
36:39.800 --> 36:44.800
of like GAN generated imagery.
36:46.080 --> 36:48.440
And like, and funny enough, like one of the places
36:48.440 --> 36:52.920
where it's actually useful is we're using the technology
36:52.920 --> 36:57.920
right now to generate synthetic visual data
36:58.640 --> 37:01.160
for training some of the face recognition models
37:01.160 --> 37:03.440
to get rid of the bias.
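The synthetic-data idea being described is roughly: sample latent vectors, decode them with a pretrained generator, and fold the results into the training set for an underrepresented group. The `generator` below is a hypothetical pretrained network, and this is not Microsoft's actual pipeline, just the shape of the technique:

```python
# Sketch: augmenting an underrepresented class with GAN-generated samples.
# `generator` is a hypothetical pretrained network (e.g. a StyleGAN-like
# model) that maps latent vectors to images.
import torch

def synthesize(generator, n_samples, latent_dim=512, device="cpu"):
    """Sample latent vectors and decode them into synthetic images."""
    z = torch.randn(n_samples, latent_dim, device=device)
    with torch.no_grad():
        images = generator(z)  # -> (n_samples, 3, H, W)
    return images

# Hypothetical usage: generate extra faces for a group that is
# underrepresented in the real data, then train on the union.
# synthetic = synthesize(generator, n_samples=10_000)
# train_set = real_images + label(synthetic, group="underrepresented")
```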
37:03.440 --> 37:05.800
So like that's one like super good use of the tech,
37:05.800 --> 37:09.640
but like, you know, it's getting good enough now
37:09.640 --> 37:12.320
where, you know, it's going to sort of challenge
37:12.320 --> 37:15.400
a normal human being's ability to tell what's real. Like right now you can just sort
37:15.400 --> 37:19.320
of say like it's very expensive for someone
37:19.320 --> 37:23.280
to fabricate a photorealistic fake video.
37:24.200 --> 37:26.920
And like GANs are going to make it fantastically cheap
37:26.920 --> 37:30.440
to fabricate a photorealistic fake video.
37:30.440 --> 37:33.920
And so like what you assume you can sort of trust
37:33.920 --> 37:38.400
is true versus like be skeptical about is about to change.
37:38.400 --> 37:40.560
And like we're not ready for it, I don't think.
37:40.560 --> 37:42.000
The nature of truth, right?
37:42.000 --> 37:46.360
That's, it's also exciting because I think both you
37:46.360 --> 37:49.600
and I probably would agree that the way to solve,
37:49.600 --> 37:52.080
to take on that challenge is with technology.
37:52.080 --> 37:52.920
Yeah. Right.
37:52.920 --> 37:56.800
There's probably going to be ideas of ways to verify
37:56.800 --> 38:00.800
which kind of video is legitimate, which kind is not.
38:00.800 --> 38:03.880
So to me, that's an exciting possibility.
38:03.880 --> 38:07.160
Most likely for just the comedic genius
38:07.160 --> 38:10.960
that the internet usually creates with these kinds of videos.
38:10.960 --> 38:13.960
And hopefully will not result in any serious harm.
38:13.960 --> 38:17.680
Yeah. And it could be, you know, like I think
38:17.680 --> 38:22.680
we will have technology that may be able to detect
38:23.040 --> 38:24.440
whether or not something's fake or real.
38:24.440 --> 38:29.440
Although the fakes are pretty convincing
38:30.160 --> 38:34.360
even like when you subject them to machine scrutiny.
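Detection itself is usually framed as a binary classifier over frames or videos. A skeletal, untrained sketch in PyTorch, purely to show the shape of the approach; as noted here, good fakes often survive exactly this kind of scrutiny:

```python
# Sketch: fake-vs-real detection as a binary image classifier.
# Purely illustrative; real detectors need large curated datasets.
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),  # logit: >0 leans "fake", <0 leans "real"
)

frame = torch.randn(1, 3, 224, 224)      # one video frame
p_fake = torch.sigmoid(detector(frame))  # untrained, so ~0.5
print(float(p_fake))
```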
38:34.360 --> 38:37.800
But, you know, we also have these increasingly
38:37.800 --> 38:40.520
interesting social networks, you know,
38:40.520 --> 38:45.520
that are under fire right now for some of the bad things
38:45.800 --> 38:46.640
that they do.
38:46.640 --> 38:47.720
Like one of the things you could choose to do
38:47.720 --> 38:51.760
with a social network is like you could,
38:51.760 --> 38:55.560
you could use crypto in the networks
38:55.560 --> 38:59.960
to like have content signed, where you could have like a
38:59.960 --> 39:02.160
full chain of custody that accompanied
39:02.160 --> 39:03.920
every piece of content.
39:03.920 --> 39:06.800
So like when you're viewing something
39:06.800 --> 39:09.640
and like you want to ask yourself like how, you know,
39:09.640 --> 39:11.040
how much can I trust this?
39:11.040 --> 39:12.400
Like you can click something
39:12.400 --> 39:15.640
and like have a verified chain of custody that shows like,
39:15.640 --> 39:19.040
oh, this is coming from, you know, from this source.
39:19.040 --> 39:24.040
And it's like signed by like someone whose identity I trust.
39:24.080 --> 39:25.400
Yeah, I think having that, you know,
39:25.400 --> 39:28.040
having that chain of custody like being able to like say,
39:28.040 --> 39:31.200
oh, here's this video, like it may or may not
39:31.200 --> 39:33.760
have been produced using some of this deep fake technology.
39:33.760 --> 39:35.640
But if you've got a verified chain of custody
39:35.640 --> 39:37.800
where you can sort of trace it all the way back
39:37.800 --> 39:39.960
to an identity and you can decide whether or not
39:39.960 --> 39:41.520
like I trust this identity.
39:41.520 --> 39:43.360
Like, oh no, this is really from the White House
39:43.360 --> 39:45.480
or like this is really from the, you know,
39:45.480 --> 39:48.840
the office of this particular presidential candidate
39:48.840 --> 39:50.960
or it's really from, you know,
39:50.960 --> 39:55.520
Jeff Weiner, CEO of LinkedIn, or Satya Nadella, CEO of Microsoft.
39:55.520 --> 39:58.400
Like that might be like one way
39:58.400 --> 39:59.960
that you can solve some of the problems.
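A minimal sketch of what such content signing could look like, using Ed25519 signatures from the Python `cryptography` package. The chain format here (each hop signing the content hash mixed with the previous signature) is invented for illustration; it is not any platform's actual scheme:

```python
# Sketch of a signed chain of custody for a piece of content.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def sign_hop(private_key, content: bytes, prev_sig: bytes = b"") -> bytes:
    """Each hop signs the content hash chained with the previous signature."""
    digest = hashlib.sha256(prev_sig + content).digest()
    return private_key.sign(digest)

def verify_hop(public_key, content: bytes, sig: bytes,
               prev_sig: bytes = b"") -> bool:
    digest = hashlib.sha256(prev_sig + content).digest()
    try:
        public_key.verify(sig, digest)
        return True
    except InvalidSignature:
        return False

video = b"...raw video bytes..."
source_key = Ed25519PrivateKey.generate()  # e.g. the original publisher
sig1 = sign_hop(source_key, video)
print(verify_hop(source_key.public_key(), video, sig1))  # True
```

A viewer's client could walk such a chain back to a public key it already trusts, which is the "verified chain of custody" idea in the conversation.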
39:59.960 --> 40:01.800
So like that's not super high tech.
40:01.800 --> 40:04.480
Like we've had all of this technology forever.
40:04.480 --> 40:06.720
But I think you're right.
40:06.720 --> 40:11.120
Like it has to be some sort of technological thing
40:11.120 --> 40:15.840
because the underlying tech that is used to create this
40:15.840 --> 40:18.800
is not going to do anything but get better over time
40:18.800 --> 40:21.160
and the genie is sort of out of the bottle.
40:21.160 --> 40:22.800
There's no stuffing it back in.
40:22.800 --> 40:24.520
And there's a social component
40:24.520 --> 40:26.600
which I think is really healthy for democracy
40:26.600 --> 40:30.200
where people will be skeptical about the thing they watch.
40:30.200 --> 40:31.040
Yeah.
40:31.040 --> 40:34.160
In general, so, you know, which is good.
40:34.160 --> 40:37.280
Skepticism in general is good for your personal content.
40:37.280 --> 40:40.400
So deep fakes in that sense are creating
40:40.400 --> 40:44.800
global skepticism about whether they can trust what they read.
40:44.800 --> 40:46.880
It encourages further research.
40:46.880 --> 40:48.840
I come from the Soviet Union
40:49.800 --> 40:53.320
where basically nobody trusted the media
40:53.320 --> 40:55.120
because you knew it was propaganda.
40:55.120 --> 40:59.160
And that kind of skepticism encouraged further research
40:59.160 --> 41:02.360
about ideas as opposed to just trusting any one source.
41:02.360 --> 41:05.440
Well, like I think it's one of the reasons why the,
41:05.440 --> 41:09.440
you know, the scientific method and our apparatus
41:09.440 --> 41:11.480
of modern science is so good.
41:11.480 --> 41:15.360
Like because you don't have to trust anything.
41:15.360 --> 41:18.520
Like you, like the whole notion of, you know,
41:18.520 --> 41:21.320
like modern science beyond the fact that, you know,
41:21.320 --> 41:23.440
this is a hypothesis and this is an experiment
41:23.440 --> 41:24.840
to test the hypothesis.
41:24.840 --> 41:27.360
And, you know, like this is a peer review process
41:27.360 --> 41:30.080
for scrutinizing published results.
41:30.080 --> 41:33.280
But like stuff's also supposed to be reproducible.
41:33.280 --> 41:35.240
So like, you know, it's been vetted by this process,
41:35.240 --> 41:38.000
but like you also are expected to publish enough detail
41:38.000 --> 41:41.480
where, you know, if you are sufficiently skeptical
41:41.480 --> 41:44.720
of the thing, you can go try to like reproduce it yourself.
41:44.720 --> 41:47.560
And like, I don't know what it is.
41:47.560 --> 41:49.920
Like, I think a lot of engineers are like this
41:49.920 --> 41:52.600
where like, you know, sort of this, like your brain
41:52.600 --> 41:55.520
is sort of wired for skepticism.
41:55.520 --> 41:58.000
Like you don't just first order trust everything
41:58.000 --> 42:00.040
that you see and encounter.
42:00.040 --> 42:02.560
And like you're sort of curious to understand,
42:02.560 --> 42:04.480
you know, the next thing.
42:04.480 --> 42:09.080
But like, I think it's an entirely healthy thing.
42:09.080 --> 42:12.280
And like we need a little bit more of that right now.
42:12.280 --> 42:16.200
So I'm not a large business owner.
42:16.200 --> 42:23.200
So I'm just a huge fan of many of Microsoft's products.
42:23.200 --> 42:25.360
I mean, I still, actually in terms of,
42:25.360 --> 42:27.000
I generate a lot of graphics and images
42:27.000 --> 42:28.640
and I still use PowerPoint to do that.
42:28.640 --> 42:30.440
It beats Illustrator for me.
42:30.440 --> 42:34.480
Even professional sort of, it's fascinating.
42:34.480 --> 42:39.560
So I wonder what does the future of, let's say,
42:39.560 --> 42:41.920
Windows and Office look like?
42:41.920 --> 42:43.840
Do you see it?
42:43.840 --> 42:45.880
I mean, I remember looking forward to XP.
42:45.880 --> 42:48.200
Was it exciting when XP was released?
42:48.200 --> 42:51.080
Just like you said, I don't remember when 95 was released.
42:51.080 --> 42:53.800
But XP for me was a big celebration.
42:53.800 --> 42:56.000
And when 10 came out, I was like,
42:56.000 --> 42:58.040
okay, well, it's nice, it's a nice improvement.
42:58.040 --> 43:02.600
But so what do you see the future of these products?
43:02.600 --> 43:04.640
You know, I think there's a bunch of excitement.
43:04.640 --> 43:07.160
I mean, on the office front,
43:07.160 --> 43:13.440
there are going to be these like increasing productivity
43:13.440 --> 43:17.080
wins that are coming out of some of these AI powered features
43:17.080 --> 43:19.000
that are coming, like the products will sort of get
43:19.000 --> 43:21.120
smarter and smarter in like a very subtle way.
43:21.120 --> 43:24.120
Like there's not going to be this big bang moment
43:24.120 --> 43:27.080
where, you know, like Clippy is going to reemerge
43:27.080 --> 43:27.960
and it's going to be...
43:27.960 --> 43:28.680
Wait a minute.
43:28.680 --> 43:30.520
Okay, well, I have to wait, wait, wait.
43:30.520 --> 43:31.960
Is Clippy coming back?
43:31.960 --> 43:34.560
Well, quite seriously.
43:34.560 --> 43:37.920
So injection of AI, there's not much,
43:37.920 --> 43:39.040
or at least I'm not familiar with,
43:39.040 --> 43:41.200
sort of assistive type of stuff going on
43:41.200 --> 43:43.600
inside the office products,
43:43.600 --> 43:47.600
like a Clippy style assistant, personal assistant.
43:47.600 --> 43:50.560
Do you think that there's a possibility
43:50.560 --> 43:52.000
of that in the future?
43:52.000 --> 43:54.680
So I think there are a bunch of like very small ways
43:54.680 --> 43:57.320
in which like machine learning powered
43:57.320 --> 44:00.080
assistive things are in the product right now.
44:00.080 --> 44:04.800
So there are a bunch of interesting things,
44:04.800 --> 44:09.280
like the auto response stuff's getting better and better
44:09.280 --> 44:12.160
and it's like getting to the point where, you know,
44:12.160 --> 44:14.960
it can auto respond with like, okay,
44:14.960 --> 44:19.080
this person is clearly trying to schedule a meeting,
44:19.080 --> 44:21.520
so it looks at your calendar and it automatically
44:21.520 --> 44:24.080
like tries to find like a time and a space
44:24.080 --> 44:26.240
that's mutually interesting.
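The scheduling piece described here reduces to intersecting two people's free time. A toy sketch; a real assistant would pull busy intervals from calendar APIs rather than hand-written lists:

```python
# Sketch: the scheduling step behind "find a time that works for both."
# Calendars are simplified to lists of busy (start, end) hours.
def first_free_slot(busy_a, busy_b, duration, day=(9, 17)):
    """Return (start, end) of the first common gap of `duration` hours."""
    busy = sorted(busy_a + busy_b)
    cursor = day[0]
    for start, end in busy:
        if start - cursor >= duration:
            return (cursor, cursor + duration)
        cursor = max(cursor, end)
    if day[1] - cursor >= duration:
        return (cursor, cursor + duration)
    return None

alice = [(9, 10), (13, 15)]
bob = [(10, 11.5), (15, 16)]
print(first_free_slot(alice, bob, duration=1))  # (11.5, 12.5)
```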
44:26.240 --> 44:31.240
Like we have this notion of Microsoft Search
44:33.520 --> 44:34.960
where it's like not just web search,
44:34.960 --> 44:38.200
but it's like search across like all of your information
44:38.200 --> 44:43.200
that's sitting inside of like your Office 365 tenant
44:43.320 --> 44:46.880
and like, you know, potentially in other products.
44:46.880 --> 44:49.680
And like we have this thing called the Microsoft Graph
44:49.680 --> 44:53.400
that is basically an API federator that, you know,
44:53.400 --> 44:57.960
sort of like gets you hooked up across the entire breadth
44:57.960 --> 44:59.760
of like all of the, you know,
44:59.760 --> 45:01.640
like what were information silos
45:01.640 --> 45:04.720
before they got woven together with the graph.
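For a concrete sense of the Graph as an API federator, here is a hedged sketch against its public search endpoint; token acquisition (OAuth) is omitted and the query string is made up:

```python
# Sketch: querying the Microsoft Graph search endpoint for a document.
# Endpoint and payload shape follow the public Graph docs; response
# handling is illustrative.
import requests

ACCESS_TOKEN = "..."  # placeholder; obtain via an OAuth flow

resp = requests.post(
    "https://graph.microsoft.com/v1.0/search/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "requests": [{
            "entityTypes": ["driveItem"],  # files in OneDrive/SharePoint
            "query": {"queryString": "quarterly planning deck"},
        }]
    },
)
resp.raise_for_status()
for response in resp.json()["value"]:
    for container in response["hitsContainers"]:
        for hit in container.get("hits", []):
            print(hit["resource"].get("name"))
```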
45:05.680 --> 45:07.880
Like that is getting,
45:07.880 --> 45:09.160
with increasing effectiveness,
45:09.160 --> 45:11.280
sort of plumbed into
45:11.280 --> 45:13.120
some of these auto response things
45:13.120 --> 45:15.840
where you're going to be able to see the system
45:15.840 --> 45:18.200
like automatically retrieve information for you.
45:18.200 --> 45:21.160
Like if, you know, like I frequently send out,
45:21.160 --> 45:24.080
you know, emails to folks where like I can't find a paper
45:24.080 --> 45:25.400
or a document or whatnot.
45:25.400 --> 45:26.840
There's no reason why the system won't be able
45:26.840 --> 45:27.680
to do that for you.
45:27.680 --> 45:29.560
And like, I think
45:29.560 --> 45:33.640
it's building towards like having things that look more
45:33.640 --> 45:37.880
like a fully integrated, you know, assistant,
45:37.880 --> 45:40.720
but like you'll have a bunch of steps
45:40.720 --> 45:42.800
that you will see before you,
45:42.800 --> 45:45.120
like it will not be this like big bang thing
45:45.120 --> 45:47.400
where like Clippy comes back and you've got this like,
45:47.400 --> 45:49.360
you know, manifestation of, you know,
45:49.360 --> 45:52.000
like a fully, fully powered assistant.
45:53.320 --> 45:56.920
So I think that's, that's definitely coming out.
45:56.920 --> 45:58.680
Like all of the, you know, collaboration,
45:58.680 --> 46:00.720
co authoring stuff's getting better.
46:00.720 --> 46:02.200
You know, it's like really interesting.
46:02.200 --> 46:07.200
Like if you look at how we use the Office product portfolio
46:08.320 --> 46:10.840
at Microsoft, like more and more of it is happening
46:10.840 --> 46:14.480
inside of like Teams as a canvas.
46:14.480 --> 46:17.160
And like it's this thing where, you know,
46:17.160 --> 46:19.840
that you've got collaboration is like
46:19.840 --> 46:21.560
at the center of the product.
46:21.560 --> 46:26.560
And like we built some like really cool stuff
46:26.720 --> 46:29.440
some of which is about to be open sourced,
46:29.440 --> 46:33.120
that are sort of framework level things
46:33.120 --> 46:35.600
for doing co authoring.
46:35.600 --> 46:36.440
That's awesome.
46:36.440 --> 46:38.920
So is there a cloud component to that?
46:38.920 --> 46:41.880
So on the web or is it,
46:41.880 --> 46:43.640
forgive me if I don't already know this,
46:43.640 --> 46:45.600
but with office 365,
46:45.600 --> 46:48.480
we still, the collaboration we do, if we're doing Word,
46:48.480 --> 46:50.640
we're still sending the file around.
46:50.640 --> 46:51.480
No, no, no, no.
46:51.480 --> 46:53.400
So this is,
46:53.400 --> 46:55.240
we're already a little bit better than that.
46:55.240 --> 46:57.360
And like, you know, so like the fact that you're unaware
46:57.360 --> 46:59.120
of it means we've got to do a better job,
46:59.120 --> 47:01.960
like helping you discover this stuff.
47:02.880 --> 47:06.360
But yeah, I mean, it's already like got a huge,
47:06.360 --> 47:07.200
huge cloud component.
47:07.200 --> 47:09.680
And like part of, you know, part of this framework stuff,
47:09.680 --> 47:12.640
I think we're calling it, like,
47:12.640 --> 47:14.520
we've been working on it for a couple of years.
47:14.520 --> 47:17.200
So like, I know the internal code name for it,
47:17.200 --> 47:18.640
but I think when we launched it at Build,
47:18.640 --> 47:20.720
it's called the Fluid Framework.
47:21.920 --> 47:25.080
But like what Fluid lets you do is like,
47:25.080 --> 47:27.920
you can go into a conversation that you're having in Teams
47:27.920 --> 47:30.280
and like reference, like part of a spreadsheet
47:30.280 --> 47:32.600
that you're working on,
47:32.600 --> 47:35.600
where somebody's like sitting in the Excel canvas,
47:35.600 --> 47:37.760
like working on the spreadsheet with a, you know,
47:37.760 --> 47:39.120
chart or whatnot.
47:39.120 --> 47:42.000
And like, you can sort of embed like part of the spreadsheet
47:42.000 --> 47:43.240
in the Teams conversation,
47:43.240 --> 47:46.520
where like all
47:46.520 --> 47:49.400
of the changes that you're making
47:49.400 --> 47:51.280
to this object are, like, you know,
47:51.280 --> 47:54.680
coordinated, and everything is sort of updating in real time.
47:54.680 --> 47:58.000
So like you can be in whatever canvas is most convenient
47:58.000 --> 48:00.400
for you to get your work done.
48:00.400 --> 48:03.400
So out of my own sort of curiosity as an engineer,
48:03.400 --> 48:06.280
I know what it's like to sort of lead a team
48:06.280 --> 48:08.280
of 10, 15 engineers.
48:08.280 --> 48:11.680
Microsoft has, I don't know what the numbers are,
48:11.680 --> 48:14.920
maybe 50, maybe 60,000 engineers, maybe 40.
48:14.920 --> 48:16.160
I don't know exactly what the number is.
48:16.160 --> 48:17.000
It's a lot.
48:17.000 --> 48:18.520
It's tens of thousands.
48:18.520 --> 48:20.640
Right. This is more than 10 or 15.
48:23.640 --> 48:28.640
I mean, you've led teams of different sizes,
48:28.720 --> 48:30.560
mostly large teams of engineers.
48:30.560 --> 48:33.840
What does it take to lead such a large group
48:33.840 --> 48:37.480
into continued innovation,
48:37.480 --> 48:40.240
continuing to be highly productive
48:40.240 --> 48:43.200
and yet develop all kinds of new ideas
48:43.200 --> 48:45.120
and yet maintain like, what does it take
48:45.120 --> 48:49.000
to lead such a large group of brilliant people?
48:49.000 --> 48:52.080
I think the thing that you learn
48:52.080 --> 48:55.120
as you manage larger and larger scale
48:55.120 --> 48:57.920
is that there are three things
48:57.920 --> 49:00.480
that are like very, very important
49:00.480 --> 49:02.360
for big engineering teams.
49:02.360 --> 49:06.320
Like one is like having some sort of forethought
49:06.320 --> 49:09.840
about what it is that you're going to be building
49:09.840 --> 49:11.040
over large periods of time.
49:11.040 --> 49:11.880
Like not exactly.
49:11.880 --> 49:13.760
Like you don't need to know that like,
49:13.760 --> 49:16.440
I'm putting all my chips on this one product
49:16.440 --> 49:17.760
and like this is going to be the thing.
49:17.760 --> 49:21.440
But it's useful to know what sort of capabilities
49:21.440 --> 49:23.080
you think you're going to need to have
49:23.080 --> 49:24.720
to build the products of the future
49:24.720 --> 49:28.000
and then like invest in that infrastructure.
49:28.000 --> 49:31.520
Like whether, and I'm not just talking about storage systems
49:31.520 --> 49:33.480
or cloud APIs, it's also like,
49:33.480 --> 49:35.360
what does your development process look like?
49:35.360 --> 49:36.720
What tools do you want?
49:36.720 --> 49:39.560
Like what culture do you want to build
49:39.560 --> 49:42.760
around like how you're sort of collaborating together
49:42.760 --> 49:45.720
to like make complicated technical things?
49:45.720 --> 49:48.080
And so like having an opinion and investing in that
49:48.080 --> 49:50.480
is like, it just gets more and more important.
49:50.480 --> 49:54.520
And like the sooner you can get a concrete set of opinions,
49:54.520 --> 49:57.680
like the better you're going to be.
49:57.680 --> 50:01.600
Like you can wing it for a while at small scales.
50:01.600 --> 50:03.160
Like, you know, when you start a company,
50:03.160 --> 50:06.320
like you don't have to be like super specific about it.
50:06.320 --> 50:10.000
But like the biggest miseries that I've ever seen
50:10.000 --> 50:12.640
as an engineering leader are in places
50:12.640 --> 50:14.440
where you didn't have a clear enough opinion
50:14.440 --> 50:16.800
about those things soon enough.
50:16.800 --> 50:20.240
And then you just sort of go create a bunch of technical debt
50:20.240 --> 50:24.000
and like culture debt that is excruciatingly painful
50:24.000 --> 50:25.760
to clean up.
50:25.760 --> 50:28.640
So like that's one bundle of things.
50:28.640 --> 50:33.640
Like the other, you know, another bundle of things is
50:33.640 --> 50:37.440
like it's just really, really important to
50:38.960 --> 50:43.960
like have a clear mission that's not just some cute crap
50:45.520 --> 50:48.880
you say because like you think you should have a mission,
50:48.880 --> 50:52.880
but like something that clarifies for people
50:52.880 --> 50:55.680
like where it is that you're headed together.
50:57.160 --> 50:58.520
Like I know it's like probably
50:58.520 --> 51:00.320
like a little bit too popular right now,
51:00.320 --> 51:05.320
but Yuval Harari's book, Sapiens,
51:07.240 --> 51:12.240
one of the central ideas in his book is that
51:12.440 --> 51:16.840
like storytelling is like the quintessential thing
51:16.840 --> 51:20.480
for coordinating the activities of large groups of people.
51:20.480 --> 51:22.320
Like once you get past Dunbar's number
51:23.360 --> 51:25.800
and like I've really, really seen that
51:25.800 --> 51:27.320
just managing engineering teams.
51:27.320 --> 51:32.080
Like you can just brute force things
51:32.080 --> 51:35.160
when you're less than 120, 150 folks
51:35.160 --> 51:37.520
where you can sort of know and trust
51:37.520 --> 51:40.920
and understand what the dynamics are between all the people.
51:40.920 --> 51:41.840
But like past that,
51:41.840 --> 51:45.440
like things just sort of start to catastrophically fail
51:45.440 --> 51:48.760
if you don't have some sort of set of shared goals
51:48.760 --> 51:50.480
that you're marching towards.
51:50.480 --> 51:52.960
And so like even though it sounds touchy feely
51:52.960 --> 51:55.640
and you know, like a bunch of technical people
51:55.640 --> 51:58.200
will sort of balk at the idea that like you need
51:58.200 --> 52:01.680
to like have a clear mission, like the mission is
52:01.680 --> 52:03.560
like very, very, very important.
52:03.560 --> 52:04.640
Yuval's right, right?
52:04.640 --> 52:07.520
Stories, that's how our society,
52:07.520 --> 52:09.360
that's the fabric that connects us all of us
52:09.360 --> 52:11.120
is these powerful stories.
52:11.120 --> 52:13.440
And that works for companies too, right?
52:13.440 --> 52:14.520
It works for everything.
52:14.520 --> 52:16.520
Like I mean, even down to like, you know,
52:16.520 --> 52:18.280
you sort of really think about like our currency
52:18.280 --> 52:19.960
for instance is a story.
52:19.960 --> 52:23.360
Our Constitution is a story, our laws are a story.
52:23.360 --> 52:27.840
I mean, like we believe very, very, very strongly in them
52:27.840 --> 52:29.960
and thank God we do.
52:29.960 --> 52:33.040
But like they are, they're just abstract things.
52:33.040 --> 52:34.000
Like they're just words.
52:34.000 --> 52:36.520
Like if we don't believe in them, they're nothing.
52:36.520 --> 52:39.440
And in some sense, those stories are platforms
52:39.440 --> 52:43.040
of the kind that Microsoft is creating, right?
52:43.040 --> 52:46.360
Yeah, platforms in which we define the future.
52:46.360 --> 52:48.600
So last question, what do you,
52:48.600 --> 52:50.080
let's get philosophical maybe,
52:50.080 --> 52:51.480
bigger than even Microsoft.
52:51.480 --> 52:56.280
What do you think the next 20, 30 plus years
52:56.280 --> 53:00.120
looks like for computing, for technology, for devices?
53:00.120 --> 53:03.760
Do you have crazy ideas about the future of the world?
53:04.600 --> 53:06.400
Yeah, look, I think we, you know,
53:06.400 --> 53:09.480
we're entering this time where we've got,
53:10.640 --> 53:13.360
we have technology that is progressing
53:13.360 --> 53:15.800
at the fastest rate that it ever has.
53:15.800 --> 53:20.800
And you've got some really big social problems
53:20.800 --> 53:25.800
like society scale problems that we have to tackle.
53:26.320 --> 53:28.720
And so, you know, I think we're gonna rise to the challenge
53:28.720 --> 53:30.560
and like figure out how to intersect
53:30.560 --> 53:32.400
like all of the power of this technology
53:32.400 --> 53:35.320
with all of the big challenges that are facing us,
53:35.320 --> 53:37.840
whether it's, you know, global warming,
53:37.840 --> 53:41.000
whether it's like the fact that the biggest remainder of the population
53:41.000 --> 53:46.000
boom is in Africa for the next 50 years or so.
53:46.800 --> 53:49.360
And like global warming is gonna make it increasingly
53:49.360 --> 53:52.600
difficult to feed the global population in particular,
53:52.600 --> 53:54.200
like in this place where you're gonna have
53:54.200 --> 53:56.600
like the biggest population boom.
53:57.720 --> 54:01.520
I think we, you know, like AI is gonna,
54:01.520 --> 54:03.560
like if we push it in the right direction,
54:03.560 --> 54:05.680
like it can do like incredible things
54:05.680 --> 54:10.160
to empower all of us to achieve our full potential
54:10.160 --> 54:15.160
and to, you know, like live better lives.
54:15.160 --> 54:20.160
But like that also means focusing on like
54:20.520 --> 54:22.040
some super important things,
54:22.040 --> 54:23.960
like how can you apply it to healthcare
54:23.960 --> 54:28.960
to make sure that, you know, like our quality and cost of,
54:29.640 --> 54:32.080
and sort of ubiquity of health coverage
54:32.080 --> 54:35.080
is better and better over time.
54:35.080 --> 54:37.960
Like that's more and more important every day,
54:40.880 --> 54:43.280
like in the United States
54:40.880 --> 54:43.280
and like the rest of the industrialized world.
54:43.280 --> 54:45.720
So Western Europe, China, Japan, Korea,
54:45.720 --> 54:48.880
like you've got this population bubble
54:48.880 --> 54:52.880
of like aging, you know, working age folks
54:52.880 --> 54:56.200
who are, you know, at some point over the next 20, 30 years
54:56.200 --> 54:58.000
they're gonna be largely retired
54:58.000 --> 55:00.160
and like you're gonna have more retired people
55:00.160 --> 55:01.200
than working age people.
55:01.200 --> 55:02.520
And then like you've got, you know,
55:02.520 --> 55:04.800
sort of natural questions about who's gonna take care
55:04.800 --> 55:07.120
of all the old folks and who's gonna do all the work.
55:07.120 --> 55:11.040
And the answers to like all of these sorts of questions
55:11.040 --> 55:13.200
like where you're sort of running into, you know,
55:13.200 --> 55:16.080
like constraints of the, you know,
55:16.080 --> 55:20.080
the world and of society have always been like
55:20.080 --> 55:23.000
what tech is gonna like help us get around this.
55:23.000 --> 55:26.360
You know, like when I was a kid in the 70s and 80s,
55:26.360 --> 55:29.800
like we talked all the time about like population boom,
55:29.800 --> 55:32.200
population boom, like we're gonna,
55:32.200 --> 55:34.360
like we're not gonna be able to like feed the planet.
55:34.360 --> 55:36.800
And like we were like right in the middle
55:36.800 --> 55:38.200
of the green revolution
55:38.200 --> 55:43.200
where like this massive technology driven increase
55:44.560 --> 55:47.520
in crop productivity like worldwide.
55:47.520 --> 55:49.320
And like some of that was like taking some of the things
55:49.320 --> 55:52.560
that we knew in the West and like getting them distributed
55:52.560 --> 55:55.760
to the, you know, to the developing world.
55:55.760 --> 55:59.360
And like part of it were things like, you know,
55:59.360 --> 56:03.280
just smarter biology like helping us increase.
56:03.280 --> 56:06.760
And like we don't talk about like, yeah,
56:06.760 --> 56:10.320
overpopulation anymore because like we can more or less,
56:10.320 --> 56:12.000
we sort of figured out how to feed the world.
56:12.000 --> 56:14.760
Like that's a technology story.
56:14.760 --> 56:19.480
And so like I'm super, super hopeful about the future
56:19.480 --> 56:24.080
and about the ways in which we will be able to apply technology
56:24.080 --> 56:28.040
to solve some of these super challenging problems.
56:28.040 --> 56:31.360
Like one of the things
56:31.360 --> 56:34.680
that I'm trying to spend my time doing right now
56:34.680 --> 56:36.600
is trying to get everybody else to be hopeful
56:36.600 --> 56:38.720
as well because, you know, back to Harari,
56:38.720 --> 56:41.160
like we are the stories that we tell.
56:41.160 --> 56:44.320
Like if we, you know, if we get overly pessimistic right now
56:44.320 --> 56:49.320
about like the potential future of technology, like we,
56:49.320 --> 56:53.680
you know, like we may fail to get all the things
56:53.680 --> 56:56.880
in place that we need to like have our best possible future.
56:56.880 --> 56:59.440
And that kind of hopeful optimism.
56:59.440 --> 57:03.160
I'm glad that you have it because you're leading large groups
57:03.160 --> 57:05.600
of engineers that are actually defining
57:05.600 --> 57:06.720
that are writing that story,
57:06.720 --> 57:08.320
that are helping build that future,
57:08.320 --> 57:10.000
which is super exciting.
57:10.000 --> 57:12.320
And I agree with everything you said,
57:12.320 --> 57:14.840
except I do hope Clippy comes back.
57:16.400 --> 57:17.760
We miss him.
57:17.760 --> 57:19.360
I speak for the people.
57:19.360 --> 57:21.800
So, Kevin, thank you so much for talking to me.
57:21.800 --> 57:22.640
Thank you so much for having me.
57:22.640 --> 57:43.640
It was a pleasure.