WEBVTT
00:00.000 --> 00:03.120
The following is a conversation with Jeremy Howard.
00:03.120 --> 00:07.080
He's the founder of Fast AI, a research institute dedicated
00:07.080 --> 00:09.760
to making deep learning more accessible.
00:09.760 --> 00:12.560
He's also a distinguished research scientist
00:12.560 --> 00:14.600
at the University of San Francisco,
00:14.600 --> 00:17.600
a former president of Kaggle, as well as a top ranking
00:17.600 --> 00:18.800
competitor there.
00:18.800 --> 00:21.680
And in general, he's a successful entrepreneur,
00:21.680 --> 00:25.240
educator, researcher, and an inspiring personality
00:25.240 --> 00:27.000
in the AI community.
00:27.000 --> 00:28.680
When someone asks me, how do I get
00:28.680 --> 00:30.240
started with deep learning?
00:30.240 --> 00:33.360
Fast AI is one of the top places I point them to.
00:33.360 --> 00:34.120
It's free.
00:34.120 --> 00:35.520
It's easy to get started.
00:35.520 --> 00:37.600
It's insightful and accessible.
00:37.600 --> 00:40.960
And if I may say so, it has very little of the BS
00:40.960 --> 00:44.160
that can sometimes dilute the value of educational content
00:44.160 --> 00:46.720
on popular topics like deep learning.
00:46.720 --> 00:49.440
Fast AI has a focus on practical application
00:49.440 --> 00:51.600
of deep learning and hands on exploration
00:51.600 --> 00:53.880
of the cutting edge that is, incredibly,
00:53.880 --> 00:57.960
both accessible to beginners and useful to experts.
00:57.960 --> 01:01.360
This is the Artificial Intelligence Podcast.
01:01.360 --> 01:03.760
If you enjoy it, subscribe on YouTube,
01:03.760 --> 01:06.920
give it five stars on iTunes, support it on Patreon,
01:06.920 --> 01:09.040
or simply connect with me on Twitter.
01:09.040 --> 01:13.280
Lex Fridman, spelled F R I D M A N.
01:13.280 --> 01:18.560
And now, here's my conversation with Jeremy Howard.
01:18.560 --> 01:21.680
What's the first program you've ever written?
01:21.680 --> 01:24.800
First program I wrote that I remember
01:24.800 --> 01:29.200
would be at high school.
01:29.200 --> 01:31.240
I did an assignment where I decided
01:31.240 --> 01:36.240
to try to find out if there were some better musical scales
01:36.240 --> 01:40.640
than the normal 12 tone, 12 interval scale.
01:40.640 --> 01:43.680
So I wrote a program on my Commodore 64 in BASIC
01:43.680 --> 01:46.080
that searched through other scale sizes
01:46.080 --> 01:48.440
to see if it could find one where there
01:48.440 --> 01:51.880
were more accurate harmonies.
01:51.880 --> 01:53.040
Like meantone?
01:53.040 --> 01:56.520
Like you want an actual exactly 3 to 2 ratio,
01:56.520 --> 01:59.400
whereas with a 12 interval scale,
01:59.400 --> 02:01.480
it's not exactly 3 to 2, for example.
02:01.480 --> 02:05.080
So that's well tempered, as they say.
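For context on what that program was measuring: in 12-tone equal temperament a fifth is 2^(7/12) ≈ 1.4983 rather than an exact 3:2 = 1.5. A minimal sketch of such a search might look like this in Python (the original was Commodore 64 BASIC, and the range of scale sizes searched here is an illustrative assumption):

```python
# Minimal sketch: for each candidate equal-tempered scale size n, find how
# closely any step approximates a just 3:2 perfect fifth.
for n in range(5, 54):  # candidate notes per octave (assumed range)
    best_error = min(abs(2 ** (k / n) - 1.5) for k in range(1, n))
    print(f"{n:2d}-tone equal temperament: best fifth error = {best_error:.5f}")
# 12-tone gives 2**(7/12) ~ 1.49831 (error ~0.0017); 53-tone does far better.
```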
02:05.080 --> 02:07.680
And BASIC on a Commodore 64.
02:07.680 --> 02:09.440
Where was the interest in music from?
02:09.440 --> 02:10.480
Or is it just technical?
02:10.480 --> 02:14.640
I did music all my life, so I played saxophone and clarinet
02:14.640 --> 02:18.120
and piano and guitar and drums and whatever.
02:18.120 --> 02:22.200
How does that thread go through your life?
02:22.200 --> 02:24.160
Where's music today?
02:24.160 --> 02:28.320
It's not where I wish it was.
02:28.320 --> 02:30.200
For various reasons, couldn't really keep it going,
02:30.200 --> 02:32.560
particularly because I had a lot of problems with RSI,
02:32.560 --> 02:33.480
with my fingers.
02:33.480 --> 02:37.360
And so I had to cut back anything that used hands
02:37.360 --> 02:39.360
and fingers.
02:39.360 --> 02:43.920
I hope one day I'll be able to get back to it health wise.
02:43.920 --> 02:46.240
So there's a love for music underlying it all.
02:46.240 --> 02:47.840
Sure, yeah.
02:47.840 --> 02:49.480
What's your favorite instrument?
02:49.480 --> 02:50.360
Saxophone.
02:50.360 --> 02:51.000
Sax.
02:51.000 --> 02:52.840
Baritone saxophone.
02:52.840 --> 02:57.440
Well, probably bass saxophone, but they're awkward.
02:57.440 --> 03:00.120
Well, I always love it when music is
03:00.120 --> 03:01.760
coupled with programming.
03:01.760 --> 03:03.800
There's something about a brain that
03:03.800 --> 03:07.520
utilizes those that emerges with creative ideas.
03:07.520 --> 03:11.200
So you've used and studied quite a few programming languages.
03:11.200 --> 03:15.120
Can you give an overview of what you've used?
03:15.120 --> 03:17.920
What are the pros and cons of each?
03:17.920 --> 03:21.960
Well, my favorite programming environment almost certainly
03:21.960 --> 03:26.520
was Microsoft Access back in the earliest days.
03:26.520 --> 03:29.080
So that was Visual Basic for Applications, which
03:29.080 --> 03:30.720
is not a good programming language,
03:30.720 --> 03:33.080
but the programming environment is fantastic.
03:33.080 --> 03:40.120
It's like the ability to create user interfaces and tie data
03:40.120 --> 03:43.720
and actions to them and create reports and all that.
03:43.720 --> 03:46.800
I've never seen anything as good.
03:46.800 --> 03:48.920
So things nowadays like Airtable, which
03:48.920 --> 03:56.200
are like small subsets of that, which people love for good reason.
03:56.200 --> 04:01.160
But unfortunately, nobody's ever achieved anything like that.
04:01.160 --> 04:03.320
What is that, if you could pause on that for a second?
04:03.320 --> 04:03.840
Oh, Access.
04:03.840 --> 04:04.340
Access.
04:04.340 --> 04:06.320
Is it fundamentally a database?
04:06.320 --> 04:09.600
It was a database program that Microsoft produced,
04:09.600 --> 04:13.440
part of Office, and it kind of withered.
04:13.440 --> 04:16.320
But basically, it lets you in a totally graphical way
04:16.320 --> 04:18.480
create tables and relationships and queries
04:18.480 --> 04:24.720
and tie them to forms and set up event handlers and calculations.
04:24.720 --> 04:28.680
And it was a very complete, powerful system designed
04:28.680 --> 04:35.000
for not massive scalable things, but for useful little applications
04:35.000 --> 04:36.400
that I loved.
04:36.400 --> 04:40.240
So what's the connection between Excel and Access?
04:40.240 --> 04:42.160
So very close.
04:42.160 --> 04:47.680
So Access was the relational database equivalent,
04:47.680 --> 04:48.360
if you like.
04:48.360 --> 04:51.080
So people still do a lot of that stuff
04:51.080 --> 04:54.120
that should be in Access in Excel because they know it.
04:54.120 --> 04:56.680
Excel's great as well.
04:56.680 --> 05:01.760
But it's just not as rich a programming model as VBA
05:01.760 --> 05:04.680
combined with a relational database.
05:04.680 --> 05:07.320
And so I've always loved relational databases.
05:07.320 --> 05:11.080
But today, programming on top of relational databases
05:11.080 --> 05:13.840
is just a lot more of a headache.
05:13.840 --> 05:16.680
You generally need
05:16.680 --> 05:19.040
something that connects to some kind
05:19.040 --> 05:21.560
of database server, unless you use SQLite, which
05:21.560 --> 05:25.000
has its own issues.
05:25.000 --> 05:26.320
Then you kind of often, if you want
05:26.320 --> 05:27.760
to get a nice programming model, you
05:27.760 --> 05:30.440
need to create an ORM on top.
05:30.440 --> 05:34.360
And then, I don't know, there's all these pieces tied together.
05:34.360 --> 05:37.000
And it's just a lot more awkward than it should be.
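To make those "pieces tied together" concrete, here is a minimal sketch of raw relational-database plumbing in Python using the standard library's sqlite3 module; the table and column names are made up for illustration:

```python
# Minimal sketch: talking to a relational database without an ORM.
# sqlite3 ships with Python, so no separate database server is needed.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("Ada",))
conn.commit()

# Rows come back as bare tuples; an ORM is the extra layer people add
# to get typed objects and a nicer programming model on top of this.
for row in conn.execute("SELECT id, name FROM users"):
    print(row)  # (1, 'Ada')
conn.close()
```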
05:37.000 --> 05:39.200
There are people that are trying to make it easier,
05:39.200 --> 05:44.480
so in particular, I think of F#. Don Syme
05:44.480 --> 05:49.320
and his team have done a great job of making something
05:49.320 --> 05:51.640
like a database appear in the type system,
05:51.640 --> 05:54.960
so you actually get tab completion for fields and tables
05:54.960 --> 05:57.840
and stuff like that.
05:57.840 --> 05:59.280
Anyway, so that was kind of, anyway,
05:59.280 --> 06:01.880
so that whole VBA Office thing, I guess,
06:01.880 --> 06:04.560
was a starting point, which you miss.
06:04.560 --> 06:07.800
And I got into standard Visual Basic.
06:07.800 --> 06:09.840
That's interesting, just to pause on that for a second.
06:09.840 --> 06:12.600
And it's interesting that you're connecting programming
06:12.600 --> 06:18.200
languages to the ease of management of data.
06:18.200 --> 06:20.600
So in your use of programming languages,
06:20.600 --> 06:24.880
you always had a love and a connection with data.
06:24.880 --> 06:28.640
I've always been interested in doing useful things for myself
06:28.640 --> 06:31.880
and for others, which generally means getting some data
06:31.880 --> 06:34.600
and doing something with it and putting it out there again.
06:34.600 --> 06:38.400
So that's been my interest throughout.
06:38.400 --> 06:41.560
So I also did a lot of stuff with AppleScript
06:41.560 --> 06:43.880
back in the early days.
06:43.880 --> 06:47.960
So it's kind of nice being able to get the computer
06:47.960 --> 06:52.960
and computers to talk to each other and to do things for you.
06:52.960 --> 06:56.600
And then I think the programming language
06:56.600 --> 06:59.960
I most loved then would have been Delphi, which
06:59.960 --> 07:05.960
was Object Pascal created by Anders Hejlsberg, who previously
07:05.960 --> 07:08.840
did Turbo Pascal and then went on to create .NET
07:08.840 --> 07:11.080
and then went on to create TypeScript.
07:11.080 --> 07:16.720
Delphi was amazing because it was like a compiled, fast language
07:16.720 --> 07:20.200
that was as easy to use as Visual Basic.
07:20.200 --> 07:27.480
Delphi, what is it similar to in more modern languages?
07:27.480 --> 07:28.840
Visual Basic.
07:28.840 --> 07:29.680
Visual Basic.
07:29.680 --> 07:32.320
Yeah, but a compiled, fast version.
07:32.320 --> 07:37.080
So I'm not sure there's anything quite like it anymore.
07:37.080 --> 07:42.520
If you took C Sharp or Java and got rid of the virtual machine
07:42.520 --> 07:45.040
and replaced it with something where you could compile a small, tight
07:45.040 --> 07:46.520
binary.
07:46.520 --> 07:51.680
I feel like it's where Swift could get to with the new SwiftUI
07:51.680 --> 07:56.640
and the cross platform development going on.
07:56.640 --> 08:01.600
That's one of my dreams is that we'll hopefully get back
08:01.600 --> 08:02.840
to where Delphi was.
08:02.840 --> 08:08.520
There is actually a Free Pascal project nowadays
08:08.520 --> 08:10.320
called Lazarus, which is also attempting
08:10.320 --> 08:13.960
to recreate Delphi.
08:13.960 --> 08:16.080
They're making good progress.
08:16.080 --> 08:21.000
So OK, Delphi, that's one of your favorite programming languages?
08:21.000 --> 08:22.360
Well, it's programming environments.
08:22.360 --> 08:26.280
Again, like I say, Pascal's not a nice language.
08:26.280 --> 08:27.880
If you wanted to know specifically
08:27.880 --> 08:30.360
about what languages I like, I would definitely
08:30.360 --> 08:35.480
pick J as being an amazingly wonderful language.
08:35.480 --> 08:37.000
What's J?
08:37.000 --> 08:39.600
J. Are you aware of APL?
08:39.600 --> 08:43.520
I am not, except from doing a little research on the work
08:43.520 --> 08:44.080
you've done.
08:44.080 --> 08:47.280
OK, so not at all surprising you're not
08:47.280 --> 08:49.040
familiar with it because it's not well known,
08:49.040 --> 08:55.480
but it's actually one of the main families of programming
08:55.480 --> 08:57.920
languages going back to the late 50s, early 60s.
08:57.920 --> 09:01.720
So there was a couple of major directions.
09:01.720 --> 09:04.440
One was the kind of lambda calculus,
09:04.440 --> 09:08.640
Alonzo Church direction, which I guess kind of Lisp and Scheme
09:08.640 --> 09:12.040
and whatever, which has a history going back
09:12.040 --> 09:13.440
to the early days of computing.
09:13.440 --> 09:17.360
The second was the kind of imperative slash
09:17.360 --> 09:23.240
OO, ALGOL, Simula, going on to C, C++, and so forth.
09:23.240 --> 09:26.960
There was a third, which are called array oriented languages,
09:26.960 --> 09:31.720
which started with a paper by a guy called Ken Iverson, which
09:31.720 --> 09:37.480
was actually a math theory paper, not a programming paper.
09:37.480 --> 09:41.520
It was called Notation as a Tool for Thought.
09:41.520 --> 09:45.320
And it was the development of a new type of math notation.
09:45.320 --> 09:48.560
And the idea is that this math notation was much more
09:48.560 --> 09:54.480
flexible, expressive, and also well defined than traditional
09:54.480 --> 09:56.440
math notation, which is none of those things.
09:56.440 --> 09:59.160
Math notation is awful.
09:59.160 --> 10:02.840
And so he actually turned that into a programming language.
10:02.840 --> 10:06.720
Because this was the late 50s, all the names were available.
10:06.720 --> 10:10.520
So he called his programming language APL.
10:10.520 --> 10:11.160
APL, what?
10:11.160 --> 10:15.360
So APL is an implementation of notation
10:15.360 --> 10:18.280
as a tool for thought, by which he means math notation.
10:18.280 --> 10:22.880
And Ken and his son went on to do many things,
10:22.880 --> 10:26.720
but eventually they actually produced a new language that
10:26.720 --> 10:28.440
was built on top of all the learnings of APL.
10:28.440 --> 10:32.800
And that was called J. And J is the most
10:32.800 --> 10:41.040
expressive, composable, beautifully designed language
10:41.040 --> 10:42.400
I've ever seen.
10:42.400 --> 10:44.520
Does it have object oriented components?
10:44.520 --> 10:45.520
Does it have that kind of thing?
10:45.520 --> 10:46.240
Not really.
10:46.240 --> 10:47.720
It's an array oriented language.
10:47.720 --> 10:51.400
It's the third path.
10:51.400 --> 10:52.760
Are you saying array?
10:52.760 --> 10:53.720
Array oriented.
10:53.720 --> 10:54.200
Yeah.
10:54.200 --> 10:55.480
What does it mean to be array oriented?
10:55.480 --> 10:57.480
So array oriented means that you generally
10:57.480 --> 10:59.520
don't use any loops.
10:59.520 --> 11:02.240
But the whole thing is done with kind
11:02.240 --> 11:06.360
of an extreme version of broadcasting,
11:06.360 --> 11:09.880
if you're familiar with that NumPy slash Python concept.
11:09.880 --> 11:14.240
So you do a lot with one line of code.
11:14.240 --> 11:17.520
It looks a lot like math.
11:17.520 --> 11:20.280
Notation is basically highly compact.
11:20.280 --> 11:22.800
And the idea is that you can kind of,
11:22.800 --> 11:24.760
because you can do so much with one line of code,
11:24.760 --> 11:27.720
a single screen of code is very unlikely to,
11:27.720 --> 11:31.080
you very rarely need more than that to express your program.
11:31.080 --> 11:33.240
And so you can kind of keep it all in your head.
11:33.240 --> 11:36.000
And you can kind of clearly communicate it.
11:36.000 --> 11:41.560
It's interesting that APL created two main branches, K and J.
11:41.560 --> 11:47.920
J is this kind of like open source niche community of crazy
11:47.920 --> 11:49.360
enthusiasts like me.
11:49.360 --> 11:52.120
And then the other path, K, was fascinating.
11:52.120 --> 11:56.600
It's an astonishingly expensive programming language,
11:56.600 --> 12:01.920
which many of the world's most ludicrously rich hedge funds
12:01.920 --> 12:02.840
use.
12:02.840 --> 12:06.640
So the entire K machine is so small,
12:06.640 --> 12:09.320
it sits inside level three cache on your CPU.
12:09.320 --> 12:14.040
And it easily wins every benchmark I've ever seen
12:14.040 --> 12:16.440
in terms of data processing speed.
12:16.440 --> 12:17.840
But you don't come across it very much,
12:17.840 --> 12:22.640
because it's like $100,000 per CPU to run it.
12:22.640 --> 12:26.240
But it's like this path of programming languages
12:26.240 --> 12:29.760
is just so much, I don't know, so much more powerful
12:29.760 --> 12:33.840
in every way than the ones that almost anybody uses every day.
12:33.840 --> 12:37.400
So it's all about computation.
12:37.400 --> 12:38.360
It's really focusing on it.
12:38.360 --> 12:40.640
Pretty heavily focused on computation.
12:40.640 --> 12:44.320
I mean, so much of programming is data processing
12:44.320 --> 12:45.640
by definition.
12:45.640 --> 12:49.000
And so there's a lot of things you can do with it.
12:49.000 --> 12:51.320
But yeah, there's not much work being
12:51.320 --> 12:57.080
done on making user interface toolkits or whatever.
12:57.080 --> 12:59.400
I mean, there's some, but they're not great.
12:59.400 --> 13:03.160
At the same time, you've done a lot of stuff with Perl and Python.
13:03.160 --> 13:08.320
So where does that fit into the picture of J and K and APL
13:08.320 --> 13:08.880
and Python?
13:08.880 --> 13:12.400
Well, it's just much more pragmatic.
13:12.400 --> 13:13.960
In the end, you kind of have to end up
13:13.960 --> 13:17.960
where the libraries are.
13:17.960 --> 13:21.320
Because to me, my focus is on productivity.
13:21.320 --> 13:23.800
I just want to get stuff done and solve problems.
13:23.800 --> 13:27.360
So Perl was great.
13:27.360 --> 13:29.760
I created an email company called Fastmail.
13:29.760 --> 13:35.200
And Perl was great, because back in the late 90s, early 2000s,
13:35.200 --> 13:38.160
it just had a lot of stuff it could do.
13:38.160 --> 13:41.840
I still had to write my own monitoring system
13:41.840 --> 13:43.840
and my own web framework and my own whatever,
13:43.840 --> 13:45.760
because none of that stuff existed.
13:45.760 --> 13:50.280
But it was a super flexible language to do that in.
13:50.280 --> 13:52.720
And you used Perl for Fastmail.
13:52.720 --> 13:54.520
You used it as a back end.
13:54.520 --> 13:55.800
So everything was written in Perl?
13:55.800 --> 13:56.520
Yeah.
13:56.520 --> 13:58.720
Yeah, everything was Perl.
13:58.720 --> 14:04.480
Why do you think Perl hasn't succeeded or hasn't dominated
14:04.480 --> 14:07.120
the market where Python really takes over a lot of the
14:07.120 --> 14:08.200
tasks?
14:08.200 --> 14:09.640
Well, I mean, Perl did dominate.
14:09.640 --> 14:13.080
It was everything, everywhere.
14:13.080 --> 14:19.920
But then the guy that ran Perl, Larry Wall,
14:19.920 --> 14:22.280
just didn't put the time in anymore.
14:22.280 --> 14:29.680
And no project can be successful if there isn't that time put in.
14:29.680 --> 14:32.640
Particularly one that started with a strong leader that
14:32.640 --> 14:35.040
loses that strong leadership.
14:35.040 --> 14:38.040
So then Python has kind of replaced it.
14:38.040 --> 14:45.040
Python is a lot less elegant language in nearly every way.
14:45.040 --> 14:48.880
But it has the data science libraries.
14:48.880 --> 14:51.240
And a lot of them are pretty great.
14:51.240 --> 14:58.280
So I kind of use it because it's the best we have.
14:58.280 --> 15:01.800
But it's definitely not good enough.
15:01.800 --> 15:04.040
What do you think the future of programming looks like?
15:04.040 --> 15:06.880
What do you hope the future of programming looks like if we
15:06.880 --> 15:10.200
zoom in on the computational fields on data science
15:10.200 --> 15:11.800
and machine learning?
15:11.800 --> 15:19.440
I hope Swift is successful because the goal of Swift,
15:19.440 --> 15:21.000
the way Chris Lattner describes it,
15:21.000 --> 15:22.640
is to be infinitely hackable.
15:22.640 --> 15:23.480
And that's what I want.
15:23.480 --> 15:26.920
I want something where me and the people I do research with
15:26.920 --> 15:30.360
and my students can look at and change everything
15:30.360 --> 15:32.000
from top to bottom.
15:32.000 --> 15:36.240
There's nothing mysterious and magical and inaccessible.
15:36.240 --> 15:38.600
Unfortunately, with Python, it's the opposite of that
15:38.600 --> 15:42.640
because Python is so slow, it's extremely unhackable.
15:42.640 --> 15:44.840
You get to a point where it's like, OK, from here on down
15:44.840 --> 15:47.320
it's C. So your debugger doesn't work in the same way.
15:47.320 --> 15:48.920
Your profiler doesn't work in the same way.
15:48.920 --> 15:50.880
Your build system doesn't work in the same way.
15:50.880 --> 15:53.760
It's really not very hackable at all.
15:53.760 --> 15:55.600
What's the part you'd like to be hackable?
15:55.600 --> 16:00.120
Is it for the objective of optimizing training
16:00.120 --> 16:02.600
of neural networks, inference of neural networks?
16:02.600 --> 16:04.360
Is it performance of the system?
16:04.360 --> 16:08.440
Or is there some nonperformance related, just creative idea?
16:08.440 --> 16:09.080
It's everything.
16:09.080 --> 16:15.480
I mean, in the end, I want to be productive as a practitioner.
16:15.480 --> 16:18.440
So at the moment, our understanding of deep learning
16:18.440 --> 16:20.080
is incredibly primitive.
16:20.080 --> 16:21.520
There's very little we understand.
16:21.520 --> 16:24.200
Most things don't work very well, even though it works better
16:24.200 --> 16:26.200
than anything else out there.
16:26.200 --> 16:28.760
There's so many opportunities to make it better.
16:28.760 --> 16:34.360
So you look at any domain area like speech recognition
16:34.360 --> 16:37.720
with deep learning or natural language processing
16:37.720 --> 16:39.440
classification with deep learning or whatever.
16:39.440 --> 16:41.960
Every time I look at an area with deep learning,
16:41.960 --> 16:44.480
I always see like, oh, it's terrible.
16:44.480 --> 16:47.560
There's lots and lots of obviously stupid ways
16:47.560 --> 16:50.000
to do things that need to be fixed.
16:50.000 --> 16:53.320
So then I want to be able to jump in there and quickly
16:53.320 --> 16:54.880
experiment and make them better.
16:54.880 --> 16:59.320
Do you think the programming language has a role in that?
16:59.320 --> 17:00.280
Huge role, yeah.
17:00.280 --> 17:07.080
So currently, Python has a big gap in terms of our ability
17:07.080 --> 17:11.880
to innovate particularly around recurrent neural networks
17:11.880 --> 17:16.840
and natural language processing because it's so slow.
17:16.840 --> 17:20.200
The actual loop where we actually loop through words,
17:20.200 --> 17:23.760
we have to do that whole thing in CUDA C.
17:23.760 --> 17:27.600
So we actually can't innovate with the kernel, the heart,
17:27.600 --> 17:31.560
of that most important algorithm.
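A hedged sketch of the inner loop being described, as a toy NumPy RNN cell (not any framework's actual kernel): the Python interpreter is re-entered once per word, which is the overhead that forces the real loop down into CUDA C.

```python
# Toy RNN forward pass: the per-word Python loop is exactly the part
# that is too slow to innovate on in pure Python.
import numpy as np

def rnn_forward(inputs, W_xh, W_hh):
    """inputs: (seq_len, input_dim); returns the final hidden state."""
    h = np.zeros(W_hh.shape[0])
    for x_t in inputs:                     # <-- interpreted loop, once per word
        h = np.tanh(W_xh @ x_t + W_hh @ h)
    return h

rng = np.random.default_rng(0)
seq = rng.standard_normal((100, 8))        # 100 "words" of 8 features each
h = rnn_forward(seq, rng.standard_normal((16, 8)), rng.standard_normal((16, 16)))
print(h.shape)                             # (16,)
```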
17:31.560 --> 17:33.680
And it's just a huge problem.
17:33.680 --> 17:36.600
And this happens all over the place.
17:36.600 --> 17:40.080
So we hit research limitations.
17:40.080 --> 17:42.840
Another example, convolutional neural networks, which
17:42.840 --> 17:46.800
are actually the most popular architecture for lots of things,
17:46.800 --> 17:48.920
maybe most things in deep learning.
17:48.920 --> 17:50.360
We almost certainly should be using
17:50.360 --> 17:54.600
sparse convolutional neural networks, but only like two
17:54.600 --> 17:56.800
people are because to do it, you have
17:56.800 --> 17:59.920
to rewrite all of that CUDA C level stuff.
17:59.920 --> 18:04.520
And yeah, researchers and practitioners just don't.
18:04.520 --> 18:09.240
So there's just big gaps between what people actually research
18:09.240 --> 18:11.640
and what people actually implement, because of the programming
18:11.640 --> 18:13.240
language problem.
18:13.240 --> 18:17.560
So you think it's just too difficult
18:17.560 --> 18:23.480
to write in CUDA C that a higher level programming language
18:23.480 --> 18:30.520
like Swift should enable easier
18:30.520 --> 18:33.160
fooling around, creating stuff with RNNs,
18:33.160 --> 18:34.920
or sparse convolutional neural networks?
18:34.920 --> 18:35.920
Kind of.
18:35.920 --> 18:38.520
Who is at fault?
18:38.520 --> 18:42.320
Who is in charge of making it easy for a researcher to play around?
18:42.320 --> 18:43.520
I mean, no one's at fault.
18:43.520 --> 18:45.120
Just nobody's gotten around to it yet.
18:45.120 --> 18:47.080
Or it's just it's hard.
18:47.080 --> 18:51.800
And I mean, part of the fault is that we ignored that whole APL
18:51.800 --> 18:55.640
kind of direction, or nearly everybody did for 60 years,
18:55.640 --> 18:57.720
50 years.
18:57.720 --> 18:59.920
But recently, people have been starting
18:59.920 --> 19:04.840
to reinvent pieces of that and kind of create some interesting
19:04.840 --> 19:07.400
new directions in the compiler technology.
19:07.400 --> 19:11.760
So the place where that's particularly happening right now
19:11.760 --> 19:14.920
is something called MLIR, which is something that, again,
19:14.920 --> 19:18.000
Chris Lattner, the Swift guy, is leading.
19:18.000 --> 19:20.080
And because it's actually not going
19:20.080 --> 19:22.160
to be Swift on its own that solves this problem.
19:22.160 --> 19:24.880
Because the problem is that currently writing
19:24.880 --> 19:32.360
an acceptably fast GPU program is too complicated,
19:32.360 --> 19:33.680
regardless of what language you use.
19:36.480 --> 19:38.680
And that's just because if you have to deal with the fact
19:38.680 --> 19:43.160
that I've got 10,000 threads and I have to synchronize between them
19:43.160 --> 19:45.360
all, and I have to put my thing into grid blocks
19:45.360 --> 19:47.040
and think about warps and all this stuff,
19:47.040 --> 19:50.720
it's just so much boilerplate that to do that well,
19:50.720 --> 19:52.240
you have to be a specialist at that.
19:52.240 --> 19:58.200
And it's going to be a year's work to optimize that algorithm
19:58.200 --> 19:59.720
in that way.
19:59.720 --> 20:04.640
But with things like Tensor Comprehensions, and Tile,
20:04.640 --> 20:08.880
and MLIR, and TVM, there's all these various projects which
20:08.880 --> 20:11.840
are all about saying, let's let people
20:11.840 --> 20:16.080
create domain specific languages for tensor
20:16.080 --> 20:16.880
computations.
20:16.880 --> 20:19.120
These are the kinds of things we do generally
20:19.120 --> 20:21.640
on the GPU for deep learning, and then
20:21.640 --> 20:28.280
have a compiler which can optimize that tensor computation.
20:28.280 --> 20:31.440
A lot of this work is actually sitting on top of a project
20:31.440 --> 20:36.040
called Halide, which is a mind-blowing project
20:36.040 --> 20:38.880
where they came up with such a domain specific language.
20:38.880 --> 20:41.240
In fact, two: one domain specific language for expressing,
20:41.240 --> 20:43.840
this is what my tensor computation is.
20:43.840 --> 20:46.320
And another domain specific language for expressing,
20:46.320 --> 20:50.320
this is the way I want you to structure
20:50.320 --> 20:53.040
the compilation of that, and do it block by block
20:53.040 --> 20:54.960
and do these bits in parallel.
20:54.960 --> 20:57.760
And they were able to show how you can compress
20:57.760 --> 21:02.880
the amount of code by 10x compared to optimized GPU
21:02.880 --> 21:05.600
code and get the same performance.
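As an aside, the core Halide idea, stating what to compute once and choosing how to execute it separately, can be mimicked in a few lines of plain Python. This is only a toy illustration of the concept, not Halide's actual API.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# The "algorithm": a horizontal 3-point box blur, defined once.
def blur_rows(img, r0, r1):
    return (img[r0:r1, :-2] + img[r0:r1, 1:-1] + img[r0:r1, 2:]) / 3

img = np.random.rand(1024, 1024).astype(np.float32)

# Schedule A: one big serial pass over the whole image.
out_a = blur_rows(img, 0, img.shape[0])

# Schedule B: the identical algorithm, executed tile by tile in parallel.
def blur_tiled(img, tile=128):
    with ThreadPoolExecutor() as ex:
        starts = range(0, img.shape[0], tile)
        parts = ex.map(lambda r: blur_rows(img, r, r + tile), starts)
    return np.vstack(list(parts))

out_b = blur_tiled(img)
assert np.allclose(out_a, out_b)  # same answer, different execution strategy
```

Halide's schedules go much further (two-dimensional tiling, vectorization, fusing stages), but the separation of concerns is the same.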
21:05.600 --> 21:08.480
So these are the things that are sitting on top
21:08.480 --> 21:12.240
of that kind of research, and MLIR
21:12.240 --> 21:15.160
is pulling a lot of those best practices together.
21:15.160 --> 21:17.160
And now we're starting to see work done
21:17.160 --> 21:21.400
on making all of that directly accessible through Swift
21:21.400 --> 21:25.040
so that I could use Swift to write those domain specific
21:25.040 --> 21:25.880
languages.
21:25.880 --> 21:29.520
And hopefully we'll then get Swift CUDA kernels
21:29.520 --> 21:31.720
written in a very expressive and concise way that
21:31.720 --> 21:36.280
looks a bit like J and APL, and then Swift layers on top
21:36.280 --> 21:38.360
of that, and then a Swift UI on top of that,
21:38.360 --> 21:42.600
and it'll be so nice if we can get to that point.
21:42.600 --> 21:48.560
Now does it all eventually boil down to CUDA and NVIDIA GPUs?
21:48.560 --> 21:50.120
Unfortunately at the moment it does,
21:50.120 --> 21:52.600
but one of the nice things about MLIR,
21:52.600 --> 21:56.120
if AMD ever gets their act together, which they probably
21:56.120 --> 21:59.040
won't, is that they or others could
21:59.040 --> 22:05.000
write MLIR backends for other GPUs
22:05.000 --> 22:10.320
or rather tensor computation devices, of which today
22:10.320 --> 22:15.520
there are an increasing number, like Graphcore or Vertex AI
22:15.520 --> 22:18.840
or whatever.
22:18.840 --> 22:22.600
So yeah, being able to target lots of backends
22:22.600 --> 22:23.960
would be another benefit of this,
22:23.960 --> 22:26.680
and the market really needs competition,
22:26.680 --> 22:28.680
because at the moment NVIDIA is massively
22:28.680 --> 22:33.640
overcharging for their kind of enterprise class cards,
22:33.640 --> 22:36.720
because there is no serious competition,
22:36.720 --> 22:39.280
because nobody else is doing the software properly.
22:39.280 --> 22:41.400
In the cloud there is some competition, right?
22:41.400 --> 22:45.080
But not really, other than TPUs perhaps,
22:45.080 --> 22:49.040
but TPUs are almost unprogrammable at the moment.
22:49.040 --> 22:51.080
TPUs have the same problem that you can't.
22:51.080 --> 22:51.760
It's even worse.
22:51.760 --> 22:54.800
So TPUs, Google actually made an explicit decision
22:54.800 --> 22:57.200
to make them almost entirely unprogrammable,
22:57.200 --> 22:59.960
because they felt that there was too much IP in there,
22:59.960 --> 23:02.640
and if they gave people direct access to program them,
23:02.640 --> 23:04.360
people would learn their secrets.
23:04.360 --> 23:09.720
So you can't actually directly program
23:09.720 --> 23:12.120
the memory in a TPU.
23:12.120 --> 23:16.360
You can't even directly create code that runs on it,
23:16.360 --> 23:19.080
or look at it on the machine that has the TPU.
23:19.080 --> 23:20.920
It all goes through a virtual machine.
23:20.920 --> 23:23.680
So all you can really do is this kind of cookie cutter
23:23.680 --> 23:27.760
thing of like plug in high level stuff together,
23:27.760 --> 23:31.440
which is just super tedious and annoying
23:31.440 --> 23:33.920
and totally unnecessary.
23:33.920 --> 23:40.960
So tell me, if you could, the origin story of fast.ai.
23:40.960 --> 23:45.760
What is the motivation, its mission, its dream?
23:45.760 --> 23:50.040
So I guess the founding story is heavily
23:50.040 --> 23:51.840
tied to my previous startup, which
23:51.840 --> 23:53.960
is a company called Enlitic, which
23:53.960 --> 23:58.280
was the first company to focus on deep learning for medicine.
23:58.280 --> 24:03.240
And I created that because I saw there was a huge opportunity
24:03.240 --> 24:07.960
to, there's about a 10x shortage of the number of doctors
24:07.960 --> 24:12.120
in the developing world relative to what we need.
24:12.120 --> 24:13.840
I expected it would take about 300 years
24:13.840 --> 24:16.120
to train enough doctors to meet that gap.
24:16.120 --> 24:20.760
But I guessed that maybe if we used
24:20.760 --> 24:23.760
deep learning for some of the analytics,
24:23.760 --> 24:25.760
we could maybe make it so you don't need
24:25.760 --> 24:27.320
as highly trained doctors.
24:27.320 --> 24:28.320
For diagnosis?
24:28.320 --> 24:29.840
For diagnosis and treatment planning.
24:29.840 --> 24:33.440
Where's the biggest benefit, just before we get to fast.ai?
24:33.440 --> 24:37.280
Where's the biggest benefit of AI and medicine that you see
24:37.280 --> 24:39.440
today and in the future?
24:39.440 --> 24:41.960
Not much happening today in terms of stuff that's actually
24:41.960 --> 24:42.440
out there.
24:42.440 --> 24:43.160
It's very early.
24:43.160 --> 24:45.320
But in terms of the opportunity, it's
24:45.320 --> 24:51.080
to take markets like India and China and Indonesia, which
24:51.080 --> 24:58.120
have big populations, and Africa, with small numbers of doctors,
24:58.120 --> 25:02.440
and provide diagnostic, particularly treatment
25:02.440 --> 25:05.160
planning and triage kind of on device
25:05.160 --> 25:10.360
so that if you do a test for malaria or tuberculosis
25:10.360 --> 25:12.800
or whatever, you immediately get something
25:12.800 --> 25:14.840
that even a health care worker that's
25:14.840 --> 25:20.360
had a month of training can get a very high quality
25:20.360 --> 25:23.480
assessment of whether the patient might be at risk
25:23.480 --> 25:27.480
and decide, OK, we'll send them off to a hospital.
25:27.480 --> 25:31.720
So for example, in Africa, outside of South Africa,
25:31.720 --> 25:34.080
there's only five pediatric radiologists
25:34.080 --> 25:35.320
for the entire continent.
25:35.320 --> 25:37.200
So most countries don't have any.
25:37.200 --> 25:39.240
So if your kid is sick and they need something
25:39.240 --> 25:41.200
diagnosed through medical imaging,
25:41.200 --> 25:44.040
the person, even if you're able to get medical imaging done,
25:44.040 --> 25:48.920
the person that looks at it will be a nurse at best.
25:48.920 --> 25:52.480
But actually, in India, for example, and China,
25:52.480 --> 25:54.760
almost no x rays are read by anybody,
25:54.760 --> 25:59.400
by any trained professional, because they don't have enough.
25:59.400 --> 26:02.880
So if instead we had an algorithm that
26:02.880 --> 26:10.080
could take the most likely high risk 5% and say triage,
26:10.080 --> 26:13.280
basically say, OK, somebody needs to look at this,
26:13.280 --> 26:16.240
it would massively change
26:16.240 --> 26:20.640
what's possible with medicine in the developing world.
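For illustration, once a model emits a risk score per case, the triage step he describes is only a few lines; this is a sketch with stand-in scores, where the 5% review fraction comes straight from the example above.

```python
import numpy as np

def triage(risk_scores, review_fraction=0.05):
    """Flag the highest-risk fraction of cases for human review."""
    n_review = max(1, int(len(risk_scores) * review_fraction))
    order = np.argsort(risk_scores)[::-1]  # highest risk first
    return order[:n_review]

scores = np.random.rand(1000)              # stand-in for model outputs
flagged = triage(scores)
print(f"{len(flagged)} of {len(scores)} cases sent to a trained reader")
```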
26:20.640 --> 26:23.680
And remember, increasingly, they have money.
26:23.680 --> 26:24.800
They're the developing world.
26:24.800 --> 26:26.160
They're not the poor world; they're the developing world.
26:26.160 --> 26:26.920
So they have the money.
26:26.920 --> 26:28.480
So they're building the hospitals.
26:28.480 --> 26:31.960
They're getting the diagnostic equipment.
26:31.960 --> 26:34.880
But for a very long time, there's no way
26:34.880 --> 26:38.480
they'll be able to have the expertise.
26:38.480 --> 26:39.760
Shortage of expertise.
26:39.760 --> 26:42.720
OK, and that's where the deep learning systems
26:42.720 --> 26:46.040
can step in and magnify the expertise they do have.
26:46.040 --> 26:47.840
Exactly.
26:47.840 --> 26:54.160
So you do see, just to linger a little bit longer,
26:54.160 --> 26:58.520
the interaction, do you see the human experts still
26:58.520 --> 26:59.840
at the core of the system?
26:59.840 --> 27:00.480
Yeah, absolutely.
27:00.480 --> 27:01.720
Is there something in medicine that
27:01.720 --> 27:03.760
could be automated almost completely?
27:03.760 --> 27:06.360
I don't see the point of even thinking about that,
27:06.360 --> 27:08.480
because we have such a shortage of people.
27:08.480 --> 27:12.160
Why would we want to find a way not to use them?
27:12.160 --> 27:13.840
Like, we have people.
27:13.840 --> 27:17.200
So the idea of, even from an economic point of view,
27:17.200 --> 27:19.800
if you can make them 10x more productive,
27:19.800 --> 27:21.600
getting rid of the person doesn't
27:21.600 --> 27:23.880
impact your unit economics at all.
27:23.880 --> 27:26.680
And it totally ignores the fact that there are things
27:26.680 --> 27:28.760
people do better than machines.
27:28.760 --> 27:33.120
So it's just, to me, that's not a useful way
27:33.120 --> 27:34.120
of framing the problem.
27:34.120 --> 27:36.440
I guess, just to clarify, I guess I
27:36.440 --> 27:40.560
meant there may be some problems where you can avoid even
27:40.560 --> 27:42.160
going to the expert ever.
27:42.160 --> 27:46.160
Sort of maybe preventative care or some basic stuff,
27:46.160 --> 27:47.800
the low hanging fruit, allowing the expert
27:47.800 --> 27:51.320
to focus on the things that are really hard.
27:51.320 --> 27:52.960
Well, that's what the triage would do, right?
27:52.960 --> 28:00.760
So the triage would say, OK, 99% sure there's nothing here.
28:00.760 --> 28:04.040
So that can be done on device.
28:04.040 --> 28:05.920
And they can just say, OK, go home.
28:05.920 --> 28:10.520
So the experts are being used to look at the stuff which
28:10.520 --> 28:12.240
has some chance it's worth looking at,
28:12.240 --> 28:15.720
which most things is not.
28:15.720 --> 28:16.280
It's fine.
28:16.280 --> 28:19.840
Why do you think we haven't quite made progress on that yet
28:19.840 --> 28:27.480
in terms of the scale of how much AI is applied in medicine?
28:27.480 --> 28:28.400
There's a lot of reasons.
28:28.400 --> 28:29.640
I mean, one is it's pretty new.
28:29.640 --> 28:32.040
I only started in late 2014.
28:32.040 --> 28:35.920
And before that, it's hard to express
28:35.920 --> 28:37.760
to what degree the medical world was not
28:37.760 --> 28:40.720
aware of the opportunities here.
28:40.720 --> 28:45.520
So I went to RSNA, which is the world's largest radiology
28:45.520 --> 28:46.240
conference.
28:46.240 --> 28:50.040
And I told everybody I could, like,
28:50.040 --> 28:51.800
I'm doing this thing with deep learning.
28:51.800 --> 28:53.320
Please come and check it out.
28:53.320 --> 28:56.880
And no one had any idea what I was talking about.
28:56.880 --> 28:59.640
No one had any interest in it.
28:59.640 --> 29:05.040
So we've come from absolute zero, which is hard.
29:05.040 --> 29:09.920
And then the whole regulatory framework, education system,
29:09.920 --> 29:13.400
everything is just set up to think of doctoring
29:13.400 --> 29:14.920
in a very different way.
29:14.920 --> 29:16.400
So today, there is a small number
29:16.400 --> 29:22.040
of people who are deep learning practitioners and doctors
29:22.040 --> 29:22.960
at the same time.
29:22.960 --> 29:25.040
And we're starting to see the first ones come out
29:25.040 --> 29:26.520
of their PhD programs.
29:26.520 --> 29:33.960
So Zak Kohane over in Boston, Cambridge
29:33.960 --> 29:41.040
has a number of students now who are data science experts,
29:41.040 --> 29:46.400
deep learning experts, and actual medical doctors.
29:46.400 --> 29:49.480
Quite a few doctors have completed our fast.ai course
29:49.480 --> 29:54.920
now and are publishing papers and creating journal reading
29:54.920 --> 29:58.040
groups in the American College of Radiology.
29:58.040 --> 30:00.280
And it's just starting to happen.
30:00.280 --> 30:02.840
But it's going to be a long process.
30:02.840 --> 30:04.920
The regulators have to learn how to regulate this.
30:04.920 --> 30:08.720
They have to build guidelines.
30:08.720 --> 30:12.120
And then the lawyers at hospitals
30:12.120 --> 30:15.080
have to develop a new way of understanding
30:15.080 --> 30:18.680
that sometimes it makes sense for data
30:18.680 --> 30:24.880
to be looked at in raw form in large quantities
30:24.880 --> 30:27.000
in order to create world changing results.
30:27.000 --> 30:30.080
Yeah, there's a regulation around data, all that.
30:30.080 --> 30:33.840
It sounds probably the hardest problem,
30:33.840 --> 30:36.760
but it sounds reminiscent of autonomous vehicles as well.
30:36.760 --> 30:38.760
Many of the same regulatory challenges,
30:38.760 --> 30:40.560
many of the same data challenges.
30:40.560 --> 30:42.160
Yeah, I mean, funnily enough, the problem
30:42.160 --> 30:44.880
is less the regulation and more the interpretation
30:44.880 --> 30:48.200
of that regulation by lawyers in hospitals.
30:48.200 --> 30:52.560
So HIPAA was actually designed.
30:52.560 --> 30:56.400
The P in HIPAA does not stand for privacy.
30:56.400 --> 30:57.640
It stands for portability.
30:57.640 --> 31:01.200
It's actually meant to be a way that data can be used.
31:01.200 --> 31:04.400
And it was created with lots of gray areas
31:04.400 --> 31:06.560
because the idea is that it would be more practical
31:06.560 --> 31:10.480
and it would help people to use this legislation
31:10.480 --> 31:13.680
to actually share data in a more thoughtful way.
31:13.680 --> 31:15.320
Unfortunately, it's done the opposite
31:15.320 --> 31:18.880
because when a lawyer sees a gray area, they see, oh,
31:18.880 --> 31:22.440
if we can't be sure we won't get sued, then we can't do it.
31:22.440 --> 31:26.360
So HIPAA is not exactly the problem.
31:26.360 --> 31:30.080
The problem is more that hospital lawyers
31:30.080 --> 31:34.720
are not incented to make bold decisions
31:34.720 --> 31:36.520
about data portability.
31:36.520 --> 31:40.480
Or even to embrace technology that saves lives.
31:40.480 --> 31:42.440
They more want to not get in trouble
31:42.440 --> 31:44.280
for embracing that technology.
31:44.280 --> 31:47.840
Also, it saves lives in a very abstract way,
31:47.840 --> 31:49.840
which is like, oh, we've been able to release
31:49.840 --> 31:52.360
these 100,000 anonymous records.
31:52.360 --> 31:55.360
I can't point at the specific person whose life that's saved.
31:55.360 --> 31:57.760
I can say like, oh, we've ended up with this paper
31:57.760 --> 32:02.200
which found this result, which diagnosed 1,000 more people
32:02.200 --> 32:04.200
than we would have otherwise, but it's like,
32:04.200 --> 32:07.360
which ones were helped, it's very abstract.
32:07.360 --> 32:09.400
Yeah, and on the counter side of that,
32:09.400 --> 32:13.080
you may be able to point to a life that was taken
32:13.080 --> 32:14.360
because of something that was...
32:14.360 --> 32:18.240
Yeah, or a person whose privacy was violated.
32:18.240 --> 32:20.360
It's like, oh, this specific person,
32:20.360 --> 32:25.480
you know, was reidentified.
32:25.480 --> 32:27.360
Just a fascinating topic.
32:27.360 --> 32:28.360
We're jumping around.
32:28.360 --> 32:32.880
We'll get back to fast AI, but on the question of privacy,
32:32.880 --> 32:38.160
data is the fuel for so much innovation in deep learning.
32:38.160 --> 32:39.840
What's your sense on privacy,
32:39.840 --> 32:44.080
whether we're talking about Twitter, Facebook, YouTube,
32:44.080 --> 32:48.720
just the technologies like in the medical field
32:48.720 --> 32:53.440
that rely on people's data in order to create impact?
32:53.440 --> 32:58.840
How do we get that right, respecting people's privacy
32:58.840 --> 33:03.360
and yet creating technology that is learned from data?
33:03.360 --> 33:11.480
One of my areas of focus is on doing more with less data,
33:11.480 --> 33:15.000
whereas most vendors, unfortunately, are strongly
33:15.000 --> 33:20.000
incented to find ways to require more data and more computation.
33:20.000 --> 33:24.000
So Google and IBM being the most obvious...
33:24.000 --> 33:26.000
IBM.
33:26.000 --> 33:30.600
Yeah, so Watson, you know, so Google and IBM both strongly push
33:30.600 --> 33:35.400
the idea that they have more data and more computation
33:35.400 --> 33:37.800
and more intelligent people than anybody else,
33:37.800 --> 33:39.840
and so you have to trust them to do things
33:39.840 --> 33:42.600
because nobody else can do it.
33:42.600 --> 33:45.360
And Google's very upfront about this,
33:45.360 --> 33:48.680
like Jeff Dean has gone out there and given talks and said,
33:48.680 --> 33:52.840
our goal is to require 1,000 times more computation,
33:52.840 --> 33:55.120
but less people.
33:55.120 --> 34:00.600
Whereas our goal is to use the people that you have better
34:00.600 --> 34:02.960
and the data you have better and the computation you have better.
34:02.960 --> 34:06.000
So one of the things that we've discovered is,
34:06.000 --> 34:11.080
or at least highlighted, is that you very, very, very often
34:11.080 --> 34:13.360
don't need much data at all.
34:13.360 --> 34:16.160
And so the data you already have in your organization
34:16.160 --> 34:19.240
will be enough to get state of the art results.
34:19.240 --> 34:22.600
So like my starting point would be to kind of say around privacy
34:22.600 --> 34:25.760
is a lot of people are looking for ways
34:25.760 --> 34:28.120
to share data and aggregate data,
34:28.120 --> 34:29.920
but I think often that's unnecessary.
34:29.920 --> 34:32.160
They assume that they need more data than they do
34:32.160 --> 34:35.240
because they're not familiar with the basics of transfer
34:35.240 --> 34:38.440
learning, which is this critical technique
34:38.440 --> 34:42.000
for needing orders of magnitude less data.
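For readers who haven't seen it, the basic transfer learning move is a few lines in PyTorch: reuse a backbone pretrained on a large dataset and train only a small new head on your own data. A minimal sketch, assuming torchvision 0.13+ for the weights API:

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on ImageNet.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the backbone: its weights are reused, not relearned.
for p in model.parameters():
    p.requires_grad = False

# Replace the final layer with a fresh head for our task (say, 10 classes).
model.fc = nn.Linear(model.fc.in_features, 10)

# Only the new head is trained, which is why so little data is needed.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```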
34:42.000 --> 34:44.680
Is your sense, one reason you might want to collect data
34:44.680 --> 34:50.440
from everyone is like in the recommender system context,
34:50.440 --> 34:54.520
where your individual, Jeremy Howard's individual data
34:54.520 --> 34:58.600
is the most useful for providing a product that's
34:58.600 --> 34:59.880
impactful for you.
34:59.880 --> 35:02.240
So for giving you advertisements,
35:02.240 --> 35:07.640
for recommending to you movies, for doing medical diagnosis.
35:07.640 --> 35:11.720
Is your sense we can build with a small amount of data,
35:11.720 --> 35:16.040
general models that will have a huge impact for most people,
35:16.040 --> 35:19.120
that we don't need to have data from each individual?
35:19.120 --> 35:20.560
On the whole, I'd say yes.
35:20.560 --> 35:26.400
I mean, there are things like, recommender systems
35:26.400 --> 35:30.960
have this cold start problem, where Jeremy is a new customer.
35:30.960 --> 35:33.280
We haven't seen him before, so we can't recommend him things
35:33.280 --> 35:36.520
based on what else he's bought and liked with us.
35:36.520 --> 35:39.440
And there's various workarounds to that.
35:39.440 --> 35:41.160
A lot of music programs will start out
35:41.160 --> 35:44.920
by saying, which of these artists do you like?
35:44.920 --> 35:46.800
Which of these albums do you like?
35:46.800 --> 35:49.800
Which of these songs do you like?
35:49.800 --> 35:51.040
Netflix used to do that.
35:51.040 --> 35:55.320
Nowadays, people don't like that because they think, oh,
35:55.320 --> 35:57.400
we don't want to bother the user.
35:57.400 --> 36:00.560
So you could work around that by having some kind of data
36:00.560 --> 36:04.240
sharing where you get my marketing record from Acxiom
36:04.240 --> 36:06.360
or whatever and try to guess from that.
36:06.360 --> 36:12.360
To me, the benefit to me and to society
36:12.360 --> 36:16.520
of saving me five minutes on answering some questions
36:16.520 --> 36:23.520
versus the negative externalities of the privacy issue
36:23.520 --> 36:24.800
doesn't add up.
36:24.800 --> 36:26.600
So I think a lot of the time, the places
36:26.600 --> 36:30.520
where people are invading our privacy in order
36:30.520 --> 36:35.360
to provide convenience is really about just trying
36:35.360 --> 36:36.880
to make them more money.
36:36.880 --> 36:40.760
And they move these negative externalities
36:40.760 --> 36:44.360
into places that they don't have to pay for them.
36:44.360 --> 36:48.120
So when you actually see regulations
36:48.120 --> 36:50.560
appear that actually cause the companies that
36:50.560 --> 36:52.360
create these negative externalities to have
36:52.360 --> 36:54.320
to pay for it themselves, they say, well,
36:54.320 --> 36:56.160
we can't do it anymore.
36:56.160 --> 36:58.240
So the cost is actually too high.
36:58.240 --> 37:02.280
But for something like medicine, the hospital
37:02.280 --> 37:06.440
has my medical imaging, my pathology studies,
37:06.440 --> 37:08.920
my medical records.
37:08.920 --> 37:11.920
And also, I own my medical data.
37:11.920 --> 37:16.960
So I help a startup called doc.ai.
37:16.960 --> 37:19.760
One of the things DocAI does is that it has an app.
37:19.760 --> 37:26.120
You can connect to Sutter Health and LabCorp and Walgreens
37:26.120 --> 37:29.840
and download your medical data to your phone
37:29.840 --> 37:33.560
and then upload it, again, at your discretion
37:33.560 --> 37:36.040
to share it as you wish.
37:36.040 --> 37:38.440
So with that kind of approach, we
37:38.440 --> 37:41.160
can share our medical information
37:41.160 --> 37:44.840
with the people we want to.
37:44.840 --> 37:45.720
Yeah, so control.
37:45.720 --> 37:48.240
I mean, really being able to control who you share it with
37:48.240 --> 37:49.760
and so on.
37:49.760 --> 37:53.080
So that has a beautiful, interesting tangent
37:53.080 --> 37:59.360
to return to the origin story of fast.ai.
37:59.360 --> 38:02.520
Right, so before I started FastAI,
38:02.520 --> 38:07.160
I spent a year researching where are the biggest
38:07.160 --> 38:10.400
opportunities for deep learning.
38:10.400 --> 38:14.080
Because I knew from my time at Kaggle in particular
38:14.080 --> 38:17.960
that deep learning had hit this threshold point where it was
38:17.960 --> 38:20.520
rapidly becoming the state of the art approach in every area
38:20.520 --> 38:21.600
that it was applied to.
38:21.600 --> 38:25.400
And I'd been working with neural nets for over 20 years.
38:25.400 --> 38:27.440
I knew that from a theoretical point of view,
38:27.440 --> 38:30.760
once it hit that point, it would do that in just about every
38:30.760 --> 38:31.600
domain.
38:31.600 --> 38:34.480
And so I spent a year researching
38:34.480 --> 38:37.120
what are the domains it's going to have the biggest low hanging
38:37.120 --> 38:39.400
fruit in the shortest time period.
38:39.400 --> 38:43.880
I picked medicine, but there were so many I could have picked.
38:43.880 --> 38:47.640
And so there was a level of frustration for me of like, OK,
38:47.640 --> 38:50.840
I'm really glad we've opened up the medical deep learning
38:50.840 --> 38:53.880
world and today it's huge, as you know.
38:53.880 --> 38:58.280
But we can't do, you know, I can't do everything.
38:58.280 --> 39:00.400
I don't even know, like, in medicine,
39:00.400 --> 39:02.760
it took me a really long time to even get a sense of like,
39:02.760 --> 39:05.080
what kind of problems do medical practitioners solve?
39:05.080 --> 39:06.400
What kind of data do they have?
39:06.400 --> 39:08.520
Who has that data?
39:08.520 --> 39:12.480
So I kind of felt like I need to approach this differently
39:12.480 --> 39:16.200
if I want to maximize the positive impact of deep learning.
39:16.200 --> 39:19.480
Rather than me picking an area and trying
39:19.480 --> 39:21.720
to become good at it and building something,
39:21.720 --> 39:24.480
I should let people who are already domain experts
39:24.480 --> 39:29.240
in those areas and who already have the data do it themselves.
39:29.240 --> 39:35.520
So that was the reason for fast.ai: to basically try
39:35.520 --> 39:38.840
and figure out how to get deep learning
39:38.840 --> 39:41.800
into the hands of people who could benefit from it
39:41.800 --> 39:45.400
and help them to do so in as quick and easy and effective
39:45.400 --> 39:47.080
a way as possible.
39:47.080 --> 39:47.560
Got it.
39:47.560 --> 39:50.240
So sort of empower the domain experts.
39:50.240 --> 39:51.320
Yeah.
39:51.320 --> 39:54.200
And like partly it's because like,
39:54.200 --> 39:56.280
unlike most people in this field,
39:56.280 --> 39:59.960
my background is very applied and industrial.
39:59.960 --> 40:02.480
Like my first job was at McKinsey & Company.
40:02.480 --> 40:04.640
I spent 10 years in management consulting.
40:04.640 --> 40:10.240
I spent a lot of time with domain experts.
40:10.240 --> 40:12.800
You know, so I kind of respect them and appreciate them.
40:12.800 --> 40:16.440
And I know that's where the value generation in society is.
40:16.440 --> 40:21.560
And so I also know how most of them can't code.
40:21.560 --> 40:26.320
And most of them don't have the time to invest, you know,
40:26.320 --> 40:29.320
three years in a graduate degree or whatever.
40:29.320 --> 40:33.520
So it's like, how do I upskill those domain experts?
40:33.520 --> 40:36.080
I think that would be a super powerful thing,
40:36.080 --> 40:40.200
you know, the biggest societal impact I could have.
40:40.200 --> 40:41.680
So yeah, that was the thinking.
40:41.680 --> 40:45.680
So much of what fast.ai students and researchers do,
40:45.680 --> 40:50.120
and the things you teach, are programmatically minded,
40:50.120 --> 40:51.520
practically minded,
40:51.520 --> 40:55.840
figuring out ways to solve real problems, and fast.
40:55.840 --> 40:57.480
So from your experience,
40:57.480 --> 41:02.040
what's the difference between theory and practice of deep learning?
41:02.040 --> 41:03.680
Hmm.
41:03.680 --> 41:07.520
Well, most of the research in the deep learning world
41:07.520 --> 41:09.840
is a total waste of time.
41:09.840 --> 41:11.040
Right. That's what I was getting at.
41:11.040 --> 41:12.200
Yeah.
41:12.200 --> 41:16.240
It's a problem in science in general.
41:16.240 --> 41:19.600
Scientists need to be published,
41:19.600 --> 41:21.480
which means they need to work on things
41:21.480 --> 41:24.040
that their peers are extremely familiar with
41:24.040 --> 41:26.200
and can recognize as an advance in that area.
41:26.200 --> 41:30.040
So that means that they all need to work on the same thing.
41:30.040 --> 41:33.040
And so, really, there's
41:33.040 --> 41:35.640
nothing to encourage them to work on things
41:35.640 --> 41:38.840
that are practically useful.
41:38.840 --> 41:41.120
So you get just a whole lot of research,
41:41.120 --> 41:43.200
which is minor advances in stuff
41:43.200 --> 41:44.600
that's been very highly studied
41:44.600 --> 41:49.280
and has no significant practical impact.
41:49.280 --> 41:50.840
Whereas the things that really make a difference
41:50.840 --> 41:52.760
like I mentioned transfer learning,
41:52.760 --> 41:55.560
like if we can do better at transfer learning,
41:55.560 --> 41:58.160
then it's this like world changing thing
41:58.160 --> 42:02.880
where suddenly like lots more people can do world class work
42:02.880 --> 42:06.760
with less resources and less data.
42:06.760 --> 42:08.480
But almost nobody works on that.
42:08.480 --> 42:10.760
Or another example, active learning,
42:10.760 --> 42:11.880
which is the study of like,
42:11.880 --> 42:15.880
how do we get more out of the human beings in the loop?
42:15.880 --> 42:17.120
That's my favorite topic.
42:17.120 --> 42:18.520
Yeah. So active learning is great,
42:18.520 --> 42:21.160
but it's almost nobody working on it
42:21.160 --> 42:23.800
because it's just not a trendy thing right now.
42:23.800 --> 42:27.040
You know what, sorry to interrupt.
42:27.040 --> 42:29.720
You were saying that nobody is publishing
42:29.720 --> 42:31.520
on active learning, right?
42:31.520 --> 42:33.440
But there's people inside companies,
42:33.440 --> 42:36.800
anybody who actually has to solve a problem,
42:36.800 --> 42:39.600
they're going to innovate on active learning.
42:39.600 --> 42:42.080
Yeah. Everybody kind of reinvents active learning
42:42.080 --> 42:43.760
when they actually have to work in practice
42:43.760 --> 42:46.360
because they start labeling things and they think,
42:46.360 --> 42:49.280
gosh, this is taking a long time and it's very expensive.
42:49.280 --> 42:51.200
And then they start thinking,
42:51.200 --> 42:52.640
well, why am I labeling everything?
42:52.640 --> 42:54.840
I'm only, the machine's only making mistakes
42:54.840 --> 42:56.040
on those two classes.
42:56.040 --> 42:56.880
They're the hard ones.
42:56.880 --> 42:58.840
Maybe I'll just start labeling those two classes
42:58.840 --> 43:00.360
and then you start thinking,
43:00.360 --> 43:01.560
well, why did I do that manually?
43:01.560 --> 43:03.000
Why can't I just get the system to tell me
43:03.000 --> 43:04.760
which things are going to be hardest?
43:04.760 --> 43:06.200
It's an obvious thing to do.
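The loop he is describing, label only what the model finds hard, is classic uncertainty sampling and fits in a few lines; a minimal sketch with stand-in predictions:

```python
import numpy as np

def pick_next_to_label(probs, k=100):
    """Select the k pool examples the model is least sure about.

    probs: (n_examples, n_classes) predicted class probabilities.
    """
    # Predictive entropy is high when probability mass is spread out.
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return np.argsort(entropy)[::-1][:k]  # most uncertain first

# Stand-in predictions over an unlabeled pool of 10,000 examples:
pool_probs = np.random.dirichlet(np.ones(5), size=10_000)
to_label = pick_next_to_label(pool_probs)
```

In practice you retrain after each labeling round and re-rank the remaining pool, which is exactly the loop practitioners keep reinventing.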
43:06.200 --> 43:11.400
But yeah, it's just like transfer learning.
43:11.400 --> 43:14.120
It's understudied and the academic world
43:14.120 --> 43:17.440
just has no reason to care about practical results.
43:17.440 --> 43:18.360
The funny thing is, like,
43:18.360 --> 43:19.920
I've only really ever written one paper.
43:19.920 --> 43:21.520
I hate writing papers.
43:21.520 --> 43:22.760
And I didn't even write it.
43:22.760 --> 43:25.480
It was my colleague, Sebastian Ruder, who actually wrote it.
43:25.480 --> 43:28.040
I just did the research for it.
43:28.040 --> 43:31.640
But it was basically introducing successful transfer learning
43:31.640 --> 43:34.200
to NLP for the first time.
43:34.200 --> 43:37.000
And the algorithm is called ULMFiT.
43:37.000 --> 43:42.320
And I actually wrote it for the course,
43:42.320 --> 43:43.720
for the first day of the course.
43:43.720 --> 43:45.360
I wanted to teach people NLP.
43:45.360 --> 43:47.520
And I thought I only want to teach people practical stuff.
43:47.520 --> 43:50.560
And I think the only practical stuff is transfer learning.
43:50.560 --> 43:53.360
And I couldn't find any examples of transfer learning in NLP.
43:53.360 --> 43:54.560
So I just did it.
43:54.560 --> 43:57.320
And I was shocked to find that as soon as I did it,
43:57.320 --> 44:01.080
which, you know, the basic prototype took a couple of days,
44:01.080 --> 44:02.520
smashed the state of the art
44:02.520 --> 44:04.760
on one of the most important data sets in a field
44:04.760 --> 44:06.720
that I knew nothing about.
44:06.720 --> 44:10.400
And I just thought, well, this is ridiculous.
44:10.400 --> 44:13.800
And so I spoke to Sebastian about it.
44:13.800 --> 44:17.680
And he kindly offered to write up the results.
44:17.680 --> 44:21.360
And so it ended up being published in ACL,
44:21.360 --> 44:25.560
which is the top computational linguistics conference.
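For the curious, the ULMFiT recipe compresses to a few lines in the fastai library: fine-tune a pretrained language model on your corpus, then reuse its encoder in a classifier. A sketch assuming fastai v2's text API (names shift slightly between versions, and `path` is a hypothetical folder of texts):

```python
from fastai.text.all import *

# Stage 1: fine-tune a pretrained language model on the target corpus.
dls_lm = TextDataLoaders.from_folder(path, is_lm=True, valid_pct=0.1)
learn_lm = language_model_learner(dls_lm, AWD_LSTM)
learn_lm.fine_tune(1)
learn_lm.save_encoder('ft_enc')

# Stage 2: reuse that encoder in a classifier and fine-tune gradually.
dls_clf = TextDataLoaders.from_folder(path)
learn_clf = text_classifier_learner(dls_clf, AWD_LSTM)
learn_clf.load_encoder('ft_enc')
learn_clf.fine_tune(4)
```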
44:25.560 --> 44:28.880
So like, people do actually care once you do it.
44:28.880 --> 44:34.160
But I guess it's difficult for maybe junior researchers.
44:34.160 --> 44:37.720
I don't care whether I get citations or papers or whatever.
44:37.720 --> 44:39.640
There's nothing in my life that makes that important,
44:39.640 --> 44:41.240
which is why I've never actually
44:41.240 --> 44:43.040
bothered to write a paper myself.
44:43.040 --> 44:44.400
But for people who do, I guess they
44:44.400 --> 44:50.960
have to pick the kind of safe option, which is like,
44:50.960 --> 44:52.720
yeah, make a slight improvement on something
44:52.720 --> 44:55.160
that everybody's already working on.
44:55.160 --> 44:59.040
Yeah, nobody does anything interesting or succeeds
44:59.040 --> 45:01.240
in life with the safe option.
45:01.240 --> 45:02.960
Well, I mean, the nice thing is nowadays,
45:02.960 --> 45:05.320
everybody is now working on NLP transfer learning.
45:05.320 --> 45:12.240
Because since that time, we've had GPT and GPT-2 and BERT.
45:12.240 --> 45:15.400
So yeah, once you show that something's possible,
45:15.400 --> 45:17.680
everybody jumps in, I guess.
45:17.680 --> 45:19.320
I hope to be a part of it.
45:19.320 --> 45:21.600
I hope to see more innovation and active learning
45:21.600 --> 45:22.160
in the same way.
45:22.160 --> 45:24.560
I think transfer learning and active learning
45:24.560 --> 45:27.360
are fascinating open areas of work.
45:27.360 --> 45:30.160
I actually helped start a startup called platform.ai, which
45:30.160 --> 45:31.760
is really all about active learning.
45:31.760 --> 45:34.200
And yeah, it's been interesting trying
45:34.200 --> 45:36.920
to kind of see what research is out there
45:36.920 --> 45:37.800
and make the most of it.
45:37.800 --> 45:39.200
And there's basically none.
45:39.200 --> 45:41.040
So we've had to do all our own research.
45:41.040 --> 45:44.240
Once again, and just as you described,
45:44.240 --> 45:47.640
can you tell the story of the Stanford competition,
45:47.640 --> 45:51.520
DAWNBench, and fast.ai's achievement on it?
45:51.520 --> 45:51.960
Sure.
45:51.960 --> 45:55.560
So something which I really enjoy is that I basically
45:55.560 --> 45:59.000
teach two courses a year, the practical deep learning
45:59.000 --> 46:02.120
for coders, which is kind of the introductory course,
46:02.120 --> 46:04.280
and then cutting edge deep learning for coders, which
46:04.280 --> 46:08.080
is the kind of research level course.
46:08.080 --> 46:14.320
And while I teach those courses, I basically
46:14.320 --> 46:18.440
have a big office at the University of San Francisco.
46:18.440 --> 46:19.800
It'd be enough for like 30 people.
46:19.800 --> 46:22.960
And I invite any student who wants to come and hang out
46:22.960 --> 46:25.320
with me while I build the course.
46:25.320 --> 46:26.640
And so generally, it's full.
46:26.640 --> 46:30.880
And so we have 20 or 30 people in a big office
46:30.880 --> 46:33.880
with nothing to do but study deep learning.
46:33.880 --> 46:35.880
So it was during one of these times
46:35.880 --> 46:38.640
that somebody in the group said, oh, there's
46:38.640 --> 46:41.480
a thing called DAWNBench that looks interesting.
46:41.480 --> 46:42.800
And I say, what the hell is that?
46:42.800 --> 46:44.120
And they say, it's some competition
46:44.120 --> 46:46.440
to see how quickly you can train a model.
46:46.440 --> 46:50.080
It seems kind of not exactly relevant to what we're doing,
46:50.080 --> 46:51.440
but it sounds like the kind of thing
46:51.440 --> 46:52.480
which you might be interested in.
46:52.480 --> 46:53.960
And I checked it out and I was like, oh, crap.
46:53.960 --> 46:55.840
There's only 10 days till it's over.
46:55.840 --> 46:58.120
It's pretty much too late.
46:58.120 --> 47:01.000
And we're kind of busy trying to teach this course.
47:01.000 --> 47:05.640
But we're like, oh, it would make an interesting case study
47:05.640 --> 47:08.200
for the course like it's all the stuff we're already doing.
47:08.200 --> 47:11.120
Why don't we just put together our current best practices
47:11.120 --> 47:12.480
and ideas.
47:12.480 --> 47:16.880
So me and I guess about four students just decided
47:16.880 --> 47:17.560
to give it a go.
47:17.560 --> 47:19.880
And we focused on this small one called
47:19.880 --> 47:24.640
CIFAR-10, which is little 32 by 32 pixel images.
47:24.640 --> 47:26.160
Can you say what DAWNBench is?
47:26.160 --> 47:29.560
Yeah, so it's a competition to train a model as fast as possible.
47:29.560 --> 47:31.000
It was run by Stanford.
47:31.000 --> 47:32.480
And as cheap as possible, too.
47:32.480 --> 47:34.320
There's also another category for as cheap as possible.
47:34.320 --> 47:38.160
And there's a couple of categories, ImageNet and CIFAR-10.
47:38.160 --> 47:42.080
So ImageNet's this big 1.3 million image thing
47:42.080 --> 47:45.400
that took a couple of days to train.
47:45.400 --> 47:51.240
I remember a friend of mine, Pete Warden, who's now at Google.
47:51.240 --> 47:53.760
I remember he told me how he trained ImageNet a few years
47:53.760 --> 47:59.440
ago when he basically had this little granny flat out
47:59.440 --> 48:01.920
the back that he turned into his ImageNet training center.
48:01.920 --> 48:04.240
And after a year of work, he figured out
48:04.240 --> 48:07.040
how to train it in 10 days or something.
48:07.040 --> 48:08.480
It's like that was a big job.
48:08.480 --> 48:10.640
Whereas CIFAR-10, at that time, you
48:10.640 --> 48:13.040
could train in a few hours.
48:13.040 --> 48:14.520
It's much smaller and easier.
48:14.520 --> 48:18.120
So we thought we'd try CIFAR-10.
48:18.120 --> 48:23.800
And yeah, I've really never done that before.
48:23.800 --> 48:27.920
Like, things like using more than one GPU at a time
48:27.920 --> 48:29.800
was something I tried to avoid.
48:29.800 --> 48:32.160
Because to me, it's very against the whole idea
48:32.160 --> 48:35.080
of accessibility, which is to be able to do things with one GPU.
48:35.080 --> 48:36.480
I mean, have you asked in the past
48:36.480 --> 48:39.680
before, after having accomplished something,
48:39.680 --> 48:42.520
how do I do this faster, much faster?
48:42.520 --> 48:43.240
Oh, always.
48:43.240 --> 48:44.680
But it's always, for me, it's always,
48:44.680 --> 48:47.640
how do I make it much faster on a single GPU
48:47.640 --> 48:50.400
that a normal person could afford in their day to day life?
48:50.400 --> 48:54.760
It's not, how could I do it faster by having a huge data
48:54.760 --> 48:55.280
center?
48:55.280 --> 48:57.240
Because to me, it's all about, like,
48:57.240 --> 48:59.560
as many people should be able to use something as possible
48:59.560 --> 49:04.160
without fussing around with infrastructure.
49:04.160 --> 49:06.080
So anyway, so in this case, it's like, well,
49:06.080 --> 49:10.240
we can use 8 GPUs just by renting an AWS machine.
49:10.240 --> 49:11.920
So we thought we'd try that.
49:11.920 --> 49:16.560
And yeah, basically, using the stuff we were already doing,
49:16.560 --> 49:20.360
we were able to get the speed.
49:20.360 --> 49:25.360
Within a few days, we had the speed down to a very small
49:25.360 --> 49:26.040
number of minutes.
49:26.040 --> 49:28.800
I can't remember exactly how many minutes it was,
49:28.800 --> 49:31.440
but it might have been like 10 minutes or something.
49:31.440 --> 49:34.200
And so yeah, we found ourselves at the top of the leaderboard
49:34.200 --> 49:38.720
easily for both time and money, which really shocked me.
49:38.720 --> 49:40.160
Because the other people competing in this
49:40.160 --> 49:41.880
were like Google and Intel and stuff,
49:41.880 --> 49:45.360
who know a lot more about this stuff than I think we do.
49:45.360 --> 49:46.800
So then we were emboldened.
49:46.800 --> 49:50.640
We thought, let's try the ImageNet one too.
49:50.640 --> 49:53.280
I mean, it seemed way out of our league.
49:53.280 --> 49:57.120
But our goal was to get under 12 hours.
49:57.120 --> 49:59.280
And we did, which was really exciting.
49:59.280 --> 50:01.440
And we didn't put anything up on the leaderboard,
50:01.440 --> 50:03.080
but we were down to like 10 hours.
50:03.080 --> 50:10.000
But then Google put in like five hours or something,
50:10.000 --> 50:13.360
and we're just like, oh, we're so screwed.
50:13.360 --> 50:16.880
But we kind of thought, well, keep trying.
50:16.880 --> 50:17.880
If Google can do it in five hours.
50:17.880 --> 50:20.760
I mean, Google did it on five hours on like a TPU pod
50:20.760 --> 50:24.280
or something, like a lot of hardware.
50:24.280 --> 50:26.360
But we kind of like had a bunch of ideas to try.
50:26.360 --> 50:28.920
Like a really simple thing was, why
50:28.920 --> 50:30.480
are we using these big images?
50:30.480 --> 50:36.280
They're like 224 by 224 or 256 by 256 pixels.
50:36.280 --> 50:37.640
Why don't we try smaller ones?
50:37.640 --> 50:41.360
And just to elaborate, there's a constraint on the accuracy
50:41.360 --> 50:43.080
that your trained model is supposed to achieve.
50:43.080 --> 50:45.760
Yeah, you've got to achieve 93%.
50:45.760 --> 50:47.640
I think it was for ImageNet.
50:47.640 --> 50:49.160
Exactly.
50:49.160 --> 50:50.240
Which is very tough.
50:50.240 --> 50:51.240
So you have to repeat that.
50:51.240 --> 50:52.120
Yeah, 93%.
50:52.120 --> 50:54.680
Like they picked a good threshold.
50:54.680 --> 50:58.920
It was a little bit higher than what the most commonly used
50:58.920 --> 51:03.320
ResNet 50 model could achieve at that time.
51:03.320 --> 51:08.080
So yeah, so it's quite a difficult problem to solve.
51:08.080 --> 51:09.920
But yeah, we realized if we actually just
51:09.920 --> 51:16.200
use 64 by 64 images, it trained a pretty good model.
51:16.200 --> 51:17.960
And then we could take that same model
51:17.960 --> 51:19.560
and just give it a couple of epochs
51:19.560 --> 51:21.880
to learn 224 by 224 images.
51:21.880 --> 51:24.440
And it was basically already trained.
51:24.440 --> 51:25.480
It makes a lot of sense.
51:25.480 --> 51:27.200
Like if you teach somebody, like here's
51:27.200 --> 51:30.240
what a dog looks like, and you show them low res versions,
51:30.240 --> 51:33.640
and then you say, here's a really clear picture of a dog.
51:33.640 --> 51:36.000
They already know what a dog looks like.
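That progressive resizing trick is straightforward to express; a minimal PyTorch sketch, where the dataset path, epoch counts, and learning rate are all placeholders:

```python
import torch
import torchvision
from torchvision import transforms

def make_loader(root, size, batch_size=256):
    # Same dataset, just resized differently for each phase.
    tfm = transforms.Compose([transforms.RandomResizedCrop(size),
                              transforms.ToTensor()])
    ds = torchvision.datasets.ImageFolder(root, transform=tfm)
    return torch.utils.data.DataLoader(ds, batch_size=batch_size,
                                       shuffle=True, num_workers=8)

def train(model, loader, epochs, lr=0.1):
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

model = torchvision.models.resnet50()
train(model, make_loader('imagenet/train', 64), epochs=20)  # cheap low-res epochs
train(model, make_loader('imagenet/train', 224), epochs=2)  # brief full-res polish
```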
51:36.000 --> 51:39.920
So with that, like, we just jumped to the front,
51:39.920 --> 51:46.400
and we ended up winning parts of that competition.
51:46.400 --> 51:49.680
We actually ended up doing a distributed version
51:49.680 --> 51:51.960
over multiple machines a couple of months later
51:51.960 --> 51:53.560
and ended up at the top of the leaderboard.
51:53.560 --> 51:55.440
We had 18 minutes.
51:55.440 --> 51:56.280
ImageNet.
51:56.280 --> 52:00.560
Yeah, and people have just kept on blasting through again
52:00.560 --> 52:02.320
and again since then.
52:02.320 --> 52:06.760
So what's your view on multi GPU or multiple machine
52:06.760 --> 52:11.960
training in general as a way to speed code up?
52:11.960 --> 52:13.680
I think it's largely a waste of time.
52:13.680 --> 52:15.880
Both multi GPU on a single machine and?
52:15.880 --> 52:17.640
Yeah, particularly multi machines,
52:17.640 --> 52:18.880
because it's just clunky.
52:21.840 --> 52:25.320
Multi GPUs is less clunky than it used to be.
52:25.320 --> 52:28.520
But to me, anything that slows down your iteration speed
52:28.520 --> 52:31.800
is a waste of time.
52:31.800 --> 52:36.960
So you could maybe do your very last perfecting of the model
52:36.960 --> 52:38.960
on multi GPUs if you need to.
52:38.960 --> 52:44.560
But so for example, I think doing stuff on ImageNet
52:44.560 --> 52:46.000
is generally a waste of time.
52:46.000 --> 52:48.240
Why test things on 1.3 million images?
52:48.240 --> 52:51.040
Most of us don't use 1.3 million images.
52:51.040 --> 52:54.360
And we've also done research that shows that doing things
52:54.360 --> 52:56.840
on a smaller subset of images gives you
52:56.840 --> 52:59.280
the same relative answers anyway.
52:59.280 --> 53:02.120
So from a research point of view, why waste that time?
53:02.120 --> 53:06.200
So actually, I released a couple of new data sets recently.
53:06.200 --> 53:08.880
One is called Imagenette.
53:08.880 --> 53:12.920
The French ImageNet, which is a small subset of ImageNet,
53:12.920 --> 53:15.200
which is designed to be easy to classify.
53:15.200 --> 53:17.320
How do you spell Imagenette?
53:17.320 --> 53:19.200
It's got an extra T and E at the end,
53:19.200 --> 53:20.520
because it's very French.
53:20.520 --> 53:21.640
Imagenette, OK.
53:21.640 --> 53:24.720
And then another one called Imagewoof,
53:24.720 --> 53:29.840
which is a subset of ImageNet that only contains dog breeds.
53:29.840 --> 53:31.120
But that's a hard one, right?
53:31.120 --> 53:32.000
That's a hard one.
53:32.000 --> 53:34.360
And I've discovered that if you just look at these two
53:34.360 --> 53:39.120
subsets, you can train things on a single GPU in 10 minutes.
53:39.120 --> 53:42.040
And the results you get are directly transferable
53:42.040 --> 53:44.320
to ImageNet nearly all the time.
53:44.320 --> 53:46.600
And so now I'm starting to see some researchers start
53:46.600 --> 53:48.960
to use these smaller data sets.
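Both datasets are one download away; a sketch using fastai's dataset registry (API names vary a little between fastai versions):

```python
from fastai.vision.all import *

path = untar_data(URLs.IMAGENETTE_160)   # the 160 px version of Imagenette
dls = ImageDataLoaders.from_folder(path, valid='val', item_tfms=Resize(128))
learn = vision_learner(dls, resnet18, metrics=accuracy)
learn.fit_one_cycle(5)                   # minutes on a single GPU
```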
53:48.960 --> 53:51.120
I so deeply love the way you think,
53:51.120 --> 53:57.000
because I think you might have written a blog post saying
53:57.000 --> 54:00.200
that going with these big data sets
54:00.200 --> 54:03.920
is encouraging people to not think creatively.
54:03.920 --> 54:04.560
Absolutely.
54:04.560 --> 54:08.320
So, yeah, it sort of constrains you
54:08.320 --> 54:09.840
to train on large resources.
54:09.840 --> 54:11.280
And because you have these resources,
54:11.280 --> 54:14.040
you think more research will be better.
54:14.040 --> 54:17.760
And then you start to like somehow you kill the creativity.
54:17.760 --> 54:18.000
Yeah.
54:18.000 --> 54:20.760
And even worse than that, Lex, I keep hearing from people
54:20.760 --> 54:23.480
who say, I decided not to get into deep learning
54:23.480 --> 54:26.080
because I don't believe it's accessible to people
54:26.080 --> 54:28.560
outside of Google to do useful work.
54:28.560 --> 54:31.640
So like I see a lot of people make an explicit decision
54:31.640 --> 54:36.000
to not learn this incredibly valuable tool
54:36.000 --> 54:39.840
because they've drunk the Google Kool Aid, which is that only
54:39.840 --> 54:42.440
Google's big enough and smart enough to do it.
54:42.440 --> 54:45.400
And I just find that so disappointing and it's so wrong.
54:45.400 --> 54:49.200
And I think all of the major breakthroughs in AI
54:49.200 --> 54:53.280
in the next 20 years will be doable on a single GPU.
54:53.280 --> 54:57.120
Like I would say, my sense is all the big sort of.
54:57.120 --> 54:58.200
Well, let's put it this way.
54:58.200 --> 55:00.200
None of the big breakthroughs of the last 20 years
55:00.200 --> 55:01.720
have required multiple GPUs.
55:01.720 --> 55:05.920
So like batch norm, ReLU, dropout,
55:05.920 --> 55:08.080
to demonstrate that there's something to them.
55:08.080 --> 55:11.840
Every one of them, none of them has required multiple GPUs.
55:11.840 --> 55:15.800
GANs, the original GANs, didn't require multiple GPUs.
55:15.800 --> 55:18.040
Well, and we've actually recently shown
55:18.040 --> 55:19.680
that you don't even need GANs.
55:19.680 --> 55:23.360
So we've developed GAN level outcomes
55:23.360 --> 55:24.720
without needing GANs.
55:24.720 --> 55:26.880
And we can now do it with, again,
55:26.880 --> 55:29.680
by using transfer learning, we can do it in a couple of hours
55:29.680 --> 55:30.520
on a single GPU.
55:30.520 --> 55:31.600
So you're using a generator model
55:31.600 --> 55:32.960
without the adversarial part?
55:32.960 --> 55:33.440
Yeah.
55:33.440 --> 55:35.880
So we've found loss functions that
55:35.880 --> 55:38.680
work super well without the adversarial part.
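One well-known family of such non-adversarial losses is the perceptual or feature loss in the spirit of Johnson et al.: compare images in the feature space of a fixed pretrained network instead of training a discriminator. A minimal sketch, assuming torchvision 0.13+ and images as normalized (N, 3, H, W) tensors:

```python
import torch
import torch.nn.functional as F
from torchvision import models

# A frozen pretrained network stands in for the adversarial critic.
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:16].eval()
for p in vgg.parameters():
    p.requires_grad = False

def feature_loss(generated, target):
    # Match in VGG feature space (perceptual detail) plus raw pixel space.
    return F.l1_loss(vgg(generated), vgg(target)) + F.l1_loss(generated, target)
```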
55:38.680 --> 55:41.840
And then one of our students, a guy called Jason Antic,
55:41.840 --> 55:44.640
has created a system called DeOldify,
55:44.640 --> 55:47.280
which uses this technique to colorize
55:47.280 --> 55:48.840
old black and white movies.
55:48.840 --> 55:51.480
You can do it on a single GPU, colorize a whole movie
55:51.480 --> 55:52.920
in a couple of hours.
55:52.920 --> 55:56.080
And one of the things that Jason and I did together
55:56.080 --> 56:00.480
was we figured out how to add a little bit of GAN
56:00.480 --> 56:03.000
at the very end, which it turns out for colorization,
56:03.000 --> 56:06.000
makes it just a bit brighter and nicer.
56:06.000 --> 56:07.920
And then Jason did masses of experiments
56:07.920 --> 56:10.000
to figure out exactly how much to do.
56:10.000 --> 56:12.840
But it's still all done on his home machine,
56:12.840 --> 56:15.400
on a single GPU in his lounge room.
56:15.400 --> 56:19.200
And if you think about colorizing Hollywood movies,
56:19.200 --> 56:21.720
that sounds like something a huge studio would have to do.
56:21.720 --> 56:25.280
But he has the world's best results on this.
56:25.280 --> 56:27.040
There's this problem of microphones.
56:27.040 --> 56:28.640
We're just talking to microphones now.
56:28.640 --> 56:29.140
Yeah.
56:29.140 --> 56:32.520
It's such a pain in the ass to have these microphones
56:32.520 --> 56:34.440
to get good quality audio.
56:34.440 --> 56:36.720
And I tried to see if it's possible to plop down
56:36.720 --> 56:39.960
a bunch of cheap sensors and reconstruct higher quality
56:39.960 --> 56:41.840
audio from multiple sources.
56:41.840 --> 56:45.440
Because right now, I haven't seen work from, OK,
56:45.440 --> 56:48.760
we can take inexpensive mics, automatically combining
56:48.760 --> 56:52.280
audio from multiple sources to improve the combined audio.
56:52.280 --> 56:53.200
People haven't done that.
56:53.200 --> 56:55.080
And that feels like a learning problem.
56:55.080 --> 56:56.800
So hopefully somebody can.
56:56.800 --> 56:58.760
Well, I mean, it's evidently doable.
56:58.760 --> 57:01.000
And it should have been done by now.
57:01.000 --> 57:03.640
I felt the same way about computational photography
57:03.640 --> 57:04.480
four years ago.
57:04.480 --> 57:05.240
That's right.
57:05.240 --> 57:08.240
Why are we investing in big lenses when
57:08.240 --> 57:13.160
three cheap lenses plus actually a little bit of intentional
57:13.160 --> 57:16.640
movement, so like take a few frames,
57:16.640 --> 57:19.840
gives you enough information to get excellent subpixel
57:19.840 --> 57:22.440
resolution, which particularly with deep learning,
57:22.440 --> 57:25.840
you would know exactly what you're meant to be looking at.
57:25.840 --> 57:28.200
We can totally do the same thing with audio.
57:28.200 --> 57:30.720
I think it's madness that it hasn't been done yet.
57:30.720 --> 57:33.320
Has there been progress on photography companies?
57:33.320 --> 57:33.820
Yeah.
57:33.820 --> 57:36.720
Computational photography is basically standard now.
57:36.720 --> 57:41.120
So the Google Pixel Night Sight, I
57:41.120 --> 57:43.240
don't know if you've ever tried it, but it's astonishing.
57:43.240 --> 57:45.440
You take a picture and almost pitch black
57:45.440 --> 57:49.120
and you get back a very high quality image.
57:49.120 --> 57:51.440
And it's not because of the lens.
57:51.440 --> 57:55.280
Same stuff with like adding the bokeh to the background
57:55.280 --> 57:55.800
blurring.
57:55.800 --> 57:57.200
It's done computationally.
57:57.200 --> 57:58.520
Like the Pixel over here.
57:58.520 --> 57:59.020
Yeah.
57:59.020 --> 58:05.000
Basically, everybody now is doing most of the fanciest stuff
58:05.000 --> 58:07.120
on their phones with computational photography
58:07.120 --> 58:10.640
and also increasingly, people are putting more than one lens
58:10.640 --> 58:11.840
on the back of the camera.
58:11.840 --> 58:14.360
So the same will happen for audio, for sure.
58:14.360 --> 58:16.520
And there's applications in the audio side.
58:16.520 --> 58:19.360
If you look at an Alexa type device,
58:19.360 --> 58:21.840
most people I've seen, especially I worked at Google
58:21.840 --> 58:26.000
before, when you look at noise background removal,
58:26.000 --> 58:29.480
you don't think of multiple sources of audio.
58:29.480 --> 58:31.920
You don't play with that as much as I would hope people would.
58:31.920 --> 58:33.640
But I mean, you can still do it even with one.
58:33.640 --> 58:36.120
Like, again, it's not much work's been done in this area.
58:36.120 --> 58:38.440
So we're actually going to be releasing an audio library
58:38.440 --> 58:41.040
soon, which hopefully will encourage development of this
58:41.040 --> 58:43.200
because it's so underused.
58:43.200 --> 58:46.480
The basic approach we used for our super resolution,
58:46.480 --> 58:49.960
which Jason uses for DeOldify, of generating
58:49.960 --> 58:51.920
high quality images, the exact same approach
58:51.920 --> 58:53.480
would work for audio.
58:53.480 --> 58:57.160
No one's done it yet, but it would be a couple of months work.
58:57.160 --> 59:01.600
OK, also learning rate, in terms of DAWNBench.
59:01.600 --> 59:04.280
There's some magic on learning rate that you played around
59:04.280 --> 59:04.760
with.
59:04.760 --> 59:05.800
It's kind of interesting.
59:05.800 --> 59:08.120
Yeah, so this is all work that came from a guy called Leslie
59:08.120 --> 59:09.360
Smith.
59:09.360 --> 59:12.760
Leslie's a researcher who, like us,
59:12.760 --> 59:17.720
cares a lot about just the practicalities of training
59:17.720 --> 59:20.000
neural networks quickly and accurately,
59:20.000 --> 59:22.120
which you would think is what everybody should care about,
59:22.120 --> 59:25.000
but almost nobody does.
59:25.000 --> 59:28.120
And he discovered something very interesting,
59:28.120 --> 59:30.000
which he calls super convergence, which
59:30.000 --> 59:32.360
is there are certain networks that with certain settings
59:32.360 --> 59:34.320
of hyperparameters could suddenly
59:34.320 --> 59:37.440
be trained 10 times faster by using
59:37.440 --> 59:39.480
a 10 times higher learning rate.
59:39.480 --> 59:44.680
Now, no one would publish that paper
59:44.680 --> 59:49.520
because it's not an area of active research
59:49.520 --> 59:50.440
in the academic world.
59:50.440 --> 59:52.840
No academics recognize this is important.
59:52.840 --> 59:56.080
And also, deep learning in academia
59:56.080 --> 1:00:00.040
is not considered an experimental science.
1:00:00.040 --> 1:00:02.440
So unlike in physics, where you could say,
1:00:02.440 --> 1:00:05.360
I just saw a subatomic particle do something
1:00:05.360 --> 1:00:07.240
which the theory doesn't explain,
1:00:07.240 --> 1:00:10.440
you could publish that without an explanation.
1:00:10.440 --> 1:00:12.120
And then in the next 60 years, people
1:00:12.120 --> 1:00:14.120
can try to work out how to explain it.
1:00:14.120 --> 1:00:16.200
We don't allow this in the deep learning world.
1:00:16.200 --> 1:00:20.720
So it's literally impossible for Leslie to publish a paper that
1:00:20.720 --> 1:00:23.560
says, I've just seen something amazing happen.
1:00:23.560 --> 1:00:25.680
This thing trained 10 times faster than it should have.
1:00:25.680 --> 1:00:27.080
I don't know why.
1:00:27.080 --> 1:00:28.600
And so the reviewers were like, well,
1:00:28.600 --> 1:00:30.280
you can't publish that because you don't know why.
1:00:30.280 --> 1:00:31.000
So anyway.
1:00:31.000 --> 1:00:32.680
That's important to pause on because there's
1:00:32.680 --> 1:00:36.160
so many discoveries that would need to start like that.
1:00:36.160 --> 1:00:39.280
Every other scientific field I know of works that way.
1:00:39.280 --> 1:00:42.520
I don't know why ours is uniquely
1:00:42.520 --> 1:00:46.480
disinterested in publishing unexplained
1:00:46.480 --> 1:00:47.680
experimental results.
1:00:47.680 --> 1:00:48.680
But there it is.
1:00:48.680 --> 1:00:51.200
So it wasn't published.
1:00:51.200 --> 1:00:55.080
Having said that, I read a lot more
1:00:55.080 --> 1:00:56.840
unpublished papers than published papers
1:00:56.840 --> 1:01:00.080
because that's where you find the interesting insights.
1:01:00.080 --> 1:01:02.680
So I absolutely read this paper.
1:01:02.680 --> 1:01:08.120
And I was just like, this is astonishingly mind blowing
1:01:08.120 --> 1:01:09.760
and weird and awesome.
1:01:09.760 --> 1:01:12.400
And why isn't everybody only talking about this?
1:01:12.400 --> 1:01:15.520
Because if you can train these things 10 times faster,
1:01:15.520 --> 1:01:18.480
they also generalize better because you're doing less epochs,
1:01:18.480 --> 1:01:20.080
which means you look at the data less,
1:01:20.080 --> 1:01:22.400
you get better accuracy.
1:01:22.400 --> 1:01:24.640
So I've been kind of studying that ever since.
1:01:24.640 --> 1:01:28.520
And eventually Leslie kind of figured out
1:01:28.520 --> 1:01:30.160
a lot of how to get this done.
1:01:30.160 --> 1:01:32.280
And we added minor tweaks.
1:01:32.280 --> 1:01:34.840
And a big part of the trick is starting
1:01:34.840 --> 1:01:37.920
at a very low learning rate, very gradually increasing it.
1:01:37.920 --> 1:01:39.800
So as you're training your model,
1:01:39.800 --> 1:01:42.120
you take very small steps at the start.
1:01:42.120 --> 1:01:44.080
And you gradually make them bigger and bigger
1:01:44.080 --> 1:01:46.440
until eventually you're taking much bigger steps
1:01:46.440 --> 1:01:49.400
than anybody thought was possible.
1:01:49.400 --> 1:01:52.280
There's a few other little tricks to make it work.
1:01:52.280 --> 1:01:55.240
Basically, we can reliably get super convergence.
1:01:55.240 --> 1:01:56.640
And so for the DAWNBench thing,
1:01:56.640 --> 1:01:59.320
we were using just much higher learning rates
1:01:59.320 --> 1:02:02.200
than people expected to work.
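As a rough sketch of that warm-up-then-anneal shape in Python (the function name and constants here are illustrative assumptions, not fast.ai's exact one-cycle policy):

```python
# Tiny steps at the start, ramp up to a very high peak, then anneal back down.
import math

def one_cycle_lr(step, total_steps, max_lr=0.1, start_div=25, end_div=1e4):
    """Cosine warm-up to max_lr over the first 30% of steps, then cosine anneal."""
    warmup = int(0.3 * total_steps)
    if step < warmup:
        t = step / warmup
        lo = max_lr / start_div
        return lo + (max_lr - lo) * (1 - math.cos(math.pi * t)) / 2
    t = (step - warmup) / (total_steps - warmup)
    lo = max_lr / end_div
    return max_lr + (lo - max_lr) * (1 - math.cos(math.pi * t)) / 2

lrs = [one_cycle_lr(s, 1000) for s in range(1000)]
assert lrs[0] < lrs[299] <= 0.1  # very small steps first, much bigger steps later
```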
1:02:02.200 --> 1:02:03.880
What do you think the future of,
1:02:03.880 --> 1:02:05.200
I mean, it makes so much sense for learning rate
1:02:05.200 --> 1:02:08.640
to be a critical hyperparameter that you vary.
1:02:08.640 --> 1:02:13.480
What do you think the future of learning rate magic looks like?
1:02:13.480 --> 1:02:14.960
Well, there's been a lot of great work
1:02:14.960 --> 1:02:17.400
in the last 12 months in this area.
1:02:17.400 --> 1:02:20.800
And people are increasingly realizing that we just
1:02:20.800 --> 1:02:23.120
have no idea really how optimizers work.
1:02:23.120 --> 1:02:25.840
And the combination of weight decay,
1:02:25.840 --> 1:02:27.480
which is how we regularize optimizers,
1:02:27.480 --> 1:02:30.120
and the learning rate, and then other things
1:02:30.120 --> 1:02:32.760
like the epsilon we use in the Adam optimizer,
1:02:32.760 --> 1:02:36.560
they all work together in weird ways.
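For reference, these are the knobs being mentioned as they appear in PyTorch's AdamW (the decoupled-weight-decay variant of Adam); the values below are illustrative only:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # stand-in model
opt = torch.optim.AdamW(
    model.parameters(),
    lr=1e-3,           # the learning rate
    weight_decay=1e-2, # the regularization strength
    eps=1e-8,          # the epsilon inside Adam's update denominator
)
```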
1:02:36.560 --> 1:02:38.560
And different parts of the model,
1:02:38.560 --> 1:02:40.480
this is another thing we've done a lot of work on,
1:02:40.480 --> 1:02:43.480
is research into how different parts of the model
1:02:43.480 --> 1:02:46.600
should be trained at different rates in different ways.
1:02:46.600 --> 1:02:49.040
So we do something we call discriminative learning rates,
1:02:49.040 --> 1:02:51.040
which is really important, particularly for transfer
1:02:51.040 --> 1:02:53.200
learning.
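In plain PyTorch, that idea can be expressed with parameter groups; the layer split and learning rates below are illustrative assumptions (fast.ai wraps this pattern in its own API):

```python
# Give earlier, pretrained layers a smaller learning rate than the new head.
import torch
import torch.nn as nn

body = nn.Sequential(nn.Linear(784, 256), nn.ReLU())  # stand-in for a pretrained backbone
head = nn.Linear(256, 10)                             # freshly initialized classifier

opt = torch.optim.SGD([
    {"params": body.parameters(), "lr": 1e-4},  # pretrained layers: small steps
    {"params": head.parameters(), "lr": 1e-2},  # new head: larger steps
], momentum=0.9)
```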
1:02:53.200 --> 1:02:54.880
So really, I think in the last 12 months,
1:02:54.880 --> 1:02:57.360
a lot of people have realized that all this stuff is important.
1:02:57.360 --> 1:03:00.000
There's been a lot of great work coming out.
1:03:00.000 --> 1:03:02.880
And we're starting to see algorithms
1:03:02.880 --> 1:03:06.880
appear which have very, very few dials, if any,
1:03:06.880 --> 1:03:07.920
that you have to touch.
1:03:07.920 --> 1:03:09.240
So I think what's going to happen
1:03:09.240 --> 1:03:10.840
is the idea of a learning rate, well,
1:03:10.840 --> 1:03:14.360
it almost already has disappeared in the latest research.
1:03:14.360 --> 1:03:18.240
And instead, it's just like, we know enough
1:03:18.240 --> 1:03:22.440
about how to interpret the gradients
1:03:22.440 --> 1:03:23.840
and the change of gradients we see
1:03:23.840 --> 1:03:25.440
to know how to set every parameter the right way.
1:03:25.440 --> 1:03:26.440
Then you can automate it.
1:03:26.440 --> 1:03:31.720
So you see the future of deep learning, where really,
1:03:31.720 --> 1:03:34.600
where is the input of a human expert needed?
1:03:34.600 --> 1:03:36.520
Well, hopefully, the input of a human expert
1:03:36.520 --> 1:03:39.680
will be almost entirely unneeded from the deep learning
1:03:39.680 --> 1:03:40.560
point of view.
1:03:40.560 --> 1:03:43.480
So again, Google's approach to this
1:03:43.480 --> 1:03:46.000
is to try and use thousands of times more compute
1:03:46.000 --> 1:03:49.400
to run lots and lots of models at the same time
1:03:49.400 --> 1:03:51.040
and hope that one of them is good.
1:03:51.040 --> 1:03:51.960
A lot of AutoML kind of stuff.
1:03:51.960 --> 1:03:56.800
Yeah, a lot of AutoML kind of stuff, which I think is insane.
1:03:56.800 --> 1:04:01.720
When you better understand the mechanics of how models learn,
1:04:01.720 --> 1:04:03.800
you don't have to try 1,000 different models
1:04:03.800 --> 1:04:05.680
to find which one happens to work the best.
1:04:05.680 --> 1:04:08.240
You can just jump straight to the best one, which
1:04:08.240 --> 1:04:12.720
means that it's more accessible in terms of compute, cheaper,
1:04:12.720 --> 1:04:14.920
and also with less hyperparameters to set.
1:04:14.920 --> 1:04:16.800
That means you don't need deep learning experts
1:04:16.800 --> 1:04:19.360
to train your deep learning model for you,
1:04:19.360 --> 1:04:22.480
which means that domain experts can do more of the work, which
1:04:22.480 --> 1:04:24.960
means that now you can focus the human time
1:04:24.960 --> 1:04:28.320
on the kind of interpretation, the data gathering,
1:04:28.320 --> 1:04:31.360
identifying model errors, and stuff like that.
1:04:31.360 --> 1:04:32.840
Yeah, the data side.
1:04:32.840 --> 1:04:34.720
How often do you work with data these days
1:04:34.720 --> 1:04:38.680
in terms of the cleaning? Darwin looked
1:04:38.680 --> 1:04:43.120
at different species while traveling about,
1:04:43.120 --> 1:04:45.040
do you look at data?
1:04:45.040 --> 1:04:49.400
From your roots in Kaggle, do you just look at data?
1:04:49.400 --> 1:04:51.320
Yeah, I mean, it's a key part of our course.
1:04:51.320 --> 1:04:53.480
It's like before we train a model in the course,
1:04:53.480 --> 1:04:55.160
we see how to look at the data.
1:04:55.160 --> 1:04:57.920
And then the first thing we do after we train our first model,
1:04:57.920 --> 1:05:00.520
which is fine-tuning an ImageNet model for five minutes.
1:05:00.520 --> 1:05:02.240
And then the thing we immediately do after that
1:05:02.240 --> 1:05:05.760
is we learn how to analyze the results of the model
1:05:05.760 --> 1:05:08.920
by looking at examples of misclassified images,
1:05:08.920 --> 1:05:10.880
and looking at a confusion matrix,
1:05:10.880 --> 1:05:15.080
and then doing research on Google
1:05:15.080 --> 1:05:18.000
to learn about the kinds of things that it's misclassifying.
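A minimal sketch of that inspection step, using fastai v1's interpretation helpers; this assumes `learn` is an already-trained fastai Learner, so it isn't runnable on its own:

```python
# Look at what the model gets wrong before trusting it.
from fastai.vision import ClassificationInterpretation

interp = ClassificationInterpretation.from_learner(learn)
interp.plot_top_losses(9, figsize=(7, 7))  # the most confidently wrong images
interp.plot_confusion_matrix()             # which classes get confused with which
```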
1:05:18.000 --> 1:05:19.520
So to me, one of the really cool things
1:05:19.520 --> 1:05:21.840
about machine learning models in general
1:05:21.840 --> 1:05:24.480
is that when you interpret them, they
1:05:24.480 --> 1:05:27.360
tell you about things like what are the most important features,
1:05:27.360 --> 1:05:29.400
which groups you're misclassifying,
1:05:29.400 --> 1:05:32.440
and they help you become a domain expert more quickly,
1:05:32.440 --> 1:05:34.880
because you can focus your time on the bits
1:05:34.880 --> 1:05:38.680
that the model is telling you is important.
1:05:38.680 --> 1:05:40.760
So it lets you deal with things like data leakage,
1:05:40.760 --> 1:05:43.080
for example, if it says, oh, the main feature I'm looking at
1:05:43.080 --> 1:05:45.240
is customer ID.
1:05:45.240 --> 1:05:47.640
And you're like, oh, customer ID shouldn't be predictive.
1:05:47.640 --> 1:05:52.280
And then you can talk to the people that manage customer IDs,
1:05:52.280 --> 1:05:56.840
and they'll tell you, oh, yes, as soon as a customer's application
1:05:56.840 --> 1:05:59.480
is accepted, we add a one on the end of their customer ID
1:05:59.480 --> 1:06:01.200
or something.
1:06:01.200 --> 1:06:04.360
So yeah, looking at data, particularly
1:06:04.360 --> 1:06:06.600
from the lens of which parts of the data the model says
1:06:06.600 --> 1:06:09.400
is important, is super important.
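A hedged sketch of that "use the model to debug the data" idea: fit a quick random forest and look at which columns it leans on. A suspiciously dominant ID-like column is the classic leakage smell from the story above. The column names and data here are made up:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.DataFrame({
    "customer_id": [101, 211, 311, 400, 511, 611],
    "income":      [40,  55,  30,  80,  20,  65],
    "accepted":    [1,   1,   1,   0,   1,   1],
})
X, y = df[["customer_id", "income"]], df["accepted"]
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# If an ID column dominates the importances, go ask the people who manage it.
for name, imp in zip(X.columns, rf.feature_importances_):
    print(f"{name}: {imp:.2f}")
```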
1:06:09.400 --> 1:06:11.480
Yeah, kind of using the model
1:06:11.480 --> 1:06:14.240
to almost debug the data to learn more about the data.
1:06:14.240 --> 1:06:16.800
Exactly.
1:06:16.800 --> 1:06:18.600
What are the different cloud options
1:06:18.600 --> 1:06:20.160
for training your networks?
1:06:20.160 --> 1:06:22.000
Last question related to DAWNBench.
1:06:22.000 --> 1:06:24.240
Well, it's part of a lot of the work you do,
1:06:24.240 --> 1:06:27.280
but from a perspective of performance,
1:06:27.280 --> 1:06:29.480
I think you've written this in a blog post.
1:06:29.480 --> 1:06:32.720
There's AWS, there's a TPU from Google.
1:06:32.720 --> 1:06:33.440
What's your sense?
1:06:33.440 --> 1:06:34.520
What the future holds?
1:06:34.520 --> 1:06:37.360
What would you recommend now in terms of training in the cloud?
1:06:37.360 --> 1:06:40.520
So from a hardware point of view,
1:06:40.520 --> 1:06:45.520
Google's TPUs and the best Nvidia GPUs are similar.
1:06:45.520 --> 1:06:47.880
And maybe the TPUs are like 30% faster,
1:06:47.880 --> 1:06:51.160
but they're also much harder to program.
1:06:51.160 --> 1:06:54.720
There isn't a clear leader in terms of hardware right now,
1:06:54.720 --> 1:06:57.840
although, much more importantly, Nvidia's GPUs
1:06:57.840 --> 1:06:59.560
are much more programmable.
1:06:59.560 --> 1:07:01.280
They've got much more written for them.
1:07:01.280 --> 1:07:03.480
That's the clear leader for me and where
1:07:03.480 --> 1:07:08.640
I would spend my time as a researcher and practitioner.
1:07:08.640 --> 1:07:12.280
But then in terms of the platform,
1:07:12.280 --> 1:07:15.680
I mean, we're super lucky now with stuff like
1:07:15.680 --> 1:07:21.520
Google Cloud (GCP) and AWS that you can access a GPU
1:07:21.520 --> 1:07:25.440
pretty quickly and easily.
1:07:25.440 --> 1:07:28.280
But I mean, for AWS, it's still too hard.
1:07:28.280 --> 1:07:33.760
You have to find an AMI and get the instance running
1:07:33.760 --> 1:07:37.080
and then install the software you want and blah, blah, blah.
1:07:37.080 --> 1:07:40.400
GCP is currently the best way to get
1:07:40.400 --> 1:07:42.320
started on a full server environment
1:07:42.320 --> 1:07:46.120
because they have a fantastic fast.ai and PyTorch
1:07:46.120 --> 1:07:51.120
ready-to-go instance, which has all the courses preinstalled.
1:07:51.120 --> 1:07:53.040
It has Jupyter Notebook prerunning.
1:07:53.040 --> 1:07:57.080
Jupyter Notebook is this wonderful interactive computing
1:07:57.080 --> 1:07:59.440
system, which everybody basically
1:07:59.440 --> 1:08:02.920
should be using for any kind of data driven research.
1:08:02.920 --> 1:08:05.880
But then even better than that, there
1:08:05.880 --> 1:08:09.560
are platforms like Salamander, which we own,
1:08:09.560 --> 1:08:13.600
and Paperspace, where literally you click a single button
1:08:13.600 --> 1:08:17.240
and it pops up and you've got a notebook straight away
1:08:17.240 --> 1:08:22.240
without any kind of installation or anything.
1:08:22.240 --> 1:08:25.800
And all the course notebooks are preinstalled.
1:08:25.800 --> 1:08:28.560
So for me, this is one of the things
1:08:28.560 --> 1:08:34.160
we spent a lot of time curating and working on.
1:08:34.160 --> 1:08:35.960
Because when we first started our courses,
1:08:35.960 --> 1:08:39.560
the biggest problem was people dropped out of lesson one
1:08:39.560 --> 1:08:42.680
because they couldn't get an AWS instance running.
1:08:42.680 --> 1:08:44.880
So things are so much better now.
1:08:44.880 --> 1:08:47.760
And we actually have, if you go to course.fast.ai,
1:08:47.760 --> 1:08:49.040
the first thing it says is, here's
1:08:49.040 --> 1:08:50.480
how to get started with your GPU.
1:08:50.480 --> 1:08:52.120
And it's like, you just click on the link
1:08:52.120 --> 1:08:55.120
and you click start and you're going.
1:08:55.120 --> 1:08:56.240
So you would go GCP?
1:08:56.240 --> 1:08:58.760
I have to confess, I've never used the Google GCP.
1:08:58.760 --> 1:09:01.600
Yeah, GCP gives you $300 of compute for free,
1:09:01.600 --> 1:09:04.920
which is really nice.
1:09:04.920 --> 1:09:10.960
But as I say, Salamander and Paperspace are even easier still.
1:09:10.960 --> 1:09:15.120
So from the perspective of deep learning frameworks,
1:09:15.120 --> 1:09:18.400
you work with fast.ai, if you think of it as a framework,
1:09:18.400 --> 1:09:22.960
and PyTorch and TensorFlow, what are the strengths
1:09:22.960 --> 1:09:25.840
of each platform in your perspective?
1:09:25.840 --> 1:09:29.240
So in terms of what we've done our research on and taught
1:09:29.240 --> 1:09:34.400
in our course, we started with Theano and Keras.
1:09:34.400 --> 1:09:38.120
And then we switched to TensorFlow and Keras.
1:09:38.120 --> 1:09:40.400
And then we switched to PyTorch.
1:09:40.400 --> 1:09:43.360
And then we switched to PyTorch and Fast.ai.
1:09:43.360 --> 1:09:47.560
And that kind of reflects a growth and development
1:09:47.560 --> 1:09:52.560
of the ecosystem of deep learning libraries.
1:09:52.560 --> 1:09:57.040
Theano and TensorFlow were great,
1:09:57.040 --> 1:10:01.360
but were much harder to teach and to do research and development
1:10:01.360 --> 1:10:04.560
on because they define what's called a computational graph
1:10:04.560 --> 1:10:06.680
up front, a static graph, where you basically
1:10:06.680 --> 1:10:08.360
have to say, here are all the things
1:10:08.360 --> 1:10:12.040
that I'm going to eventually do in my model.
1:10:12.040 --> 1:10:15.080
And then later on, you say, OK, do those things with this data.
1:10:15.080 --> 1:10:17.160
And you can't debug them.
1:10:17.160 --> 1:10:18.560
You can't do them step by step.
1:10:18.560 --> 1:10:20.160
You can't program them interactively
1:10:20.160 --> 1:10:22.280
in a Jupyter notebook and so forth.
1:10:22.280 --> 1:10:24.320
PyTorch was not the first, but PyTorch
1:10:24.320 --> 1:10:27.400
was certainly the strongest entrant to come along
1:10:27.400 --> 1:10:28.720
and say, let's not do it that way.
1:10:28.720 --> 1:10:31.320
Let's just use normal Python.
1:10:31.320 --> 1:10:32.880
And everything you know about in Python
1:10:32.880 --> 1:10:34.000
is just going to work.
1:10:34.000 --> 1:10:37.880
And we'll figure out how to make that run on the GPU
1:10:37.880 --> 1:10:40.800
as and when necessary.
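A tiny runnable illustration of that define-by-run style: the "graph" is just ordinary Python executing, so you can print, branch, and debug mid-forward-pass:

```python
import torch

x = torch.randn(3, requires_grad=True)
# Data-dependent control flow, written as plain Python:
y = (x * 2).sum() if x.mean() > 0 else (x ** 2).sum()
print(y)      # inspect intermediate values immediately
y.backward()  # gradients computed for whichever path actually ran
print(x.grad)
```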
1:10:40.800 --> 1:10:45.120
That turned out to be a huge leap in terms
1:10:45.120 --> 1:10:46.800
of what we could do with our research
1:10:46.800 --> 1:10:49.720
and what we could do with our teaching.
1:10:49.720 --> 1:10:51.160
Because it wasn't limiting.
1:10:51.160 --> 1:10:52.760
Yeah, I mean, it was critical for us
1:10:52.760 --> 1:10:55.960
for something like DAWNBench to be able to rapidly try things.
1:10:55.960 --> 1:10:58.560
It's just so much harder to be a researcher and practitioner
1:10:58.560 --> 1:11:00.520
when you have to do everything upfront
1:11:00.520 --> 1:11:03.400
and you can't inspect it.
1:11:03.400 --> 1:11:07.360
The problem with PyTorch is it's not at all
1:11:07.360 --> 1:11:09.360
accessible to newcomers because you
1:11:09.360 --> 1:11:11.600
have to write your own training loop
1:11:11.600 --> 1:11:15.680
and manage the gradients and all this stuff.
1:11:15.680 --> 1:11:17.920
And it's also not great for researchers
1:11:17.920 --> 1:11:20.680
because you're spending your time dealing with all this
1:11:20.680 --> 1:11:23.920
boilerplate and overhead rather than thinking about your algorithm.
1:11:23.920 --> 1:11:27.760
So we ended up writing this very multi-layered API
1:11:27.760 --> 1:11:31.040
that at the top level, you can train a state of the art neural
1:11:31.040 --> 1:11:33.640
network in three lines of code.
1:11:33.640 --> 1:11:35.920
And which talks to an API, which talks to an API,
1:11:35.920 --> 1:11:38.880
which talks to an API, which you can dive into at any level
1:11:38.880 --> 1:11:45.400
and get progressively closer to the machine levels of control.
1:11:45.400 --> 1:11:47.480
And this is the fast AI library.
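In fastai v1 syntax, that top-level, three-line experience looks roughly like this; the path is a placeholder for a folder of labeled images:

```python
from fastai.vision import ImageDataBunch, cnn_learner, models, accuracy

data = ImageDataBunch.from_folder("path/to/images", train=".", valid_pct=0.2, size=224)
learn = cnn_learner(data, models.resnet34, metrics=accuracy)
learn.fit_one_cycle(4)  # the one-cycle policy from the super-convergence discussion
```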
1:11:47.480 --> 1:11:51.920
That's been critical for us and for our students
1:11:51.920 --> 1:11:54.200
and for lots of people that have won big machine learning
1:11:54.200 --> 1:11:58.560
competitions with it and written academic papers with it.
1:11:58.560 --> 1:12:00.680
It's made a big difference.
1:12:00.680 --> 1:12:03.960
We're still limited though by Python.
1:12:03.960 --> 1:12:05.920
And particularly this problem with things
1:12:05.920 --> 1:12:10.640
like recurrent neural nets, say, where you just can't change
1:12:10.640 --> 1:12:13.320
things unless you accept it going so slowly
1:12:13.320 --> 1:12:15.680
that it's impractical.
1:12:15.680 --> 1:12:18.320
So in the latest incarnation of the course
1:12:18.320 --> 1:12:20.880
and with some of the research we're now starting to do,
1:12:20.880 --> 1:12:24.480
we're starting to do some stuff in Swift.
1:12:24.480 --> 1:12:28.920
I think we're three years away from that being
1:12:28.920 --> 1:12:31.080
super practical, but I'm in no hurry.
1:12:31.080 --> 1:12:35.480
I'm very happy to invest the time to get there.
1:12:35.480 --> 1:12:38.000
But with that, we actually already
1:12:38.000 --> 1:12:41.840
have a nascent version of the fast AI library for vision
1:12:41.840 --> 1:12:44.720
running on Swift and TensorFlow.
1:12:44.720 --> 1:12:48.040
Because Python for TensorFlow is not going to cut it.
1:12:48.040 --> 1:12:49.920
It's just a disaster.
1:12:49.920 --> 1:12:54.440
What they did was they tried to replicate the bits
1:12:54.440 --> 1:12:56.640
that people were saying they like about PyTorch,
1:12:56.640 --> 1:12:59.160
this kind of interactive computation.
1:12:59.160 --> 1:13:02.760
But they didn't actually change their foundational runtime
1:13:02.760 --> 1:13:03.920
components.
1:13:03.920 --> 1:13:06.640
So they kind of added this syntactic sugar
1:13:06.640 --> 1:13:08.560
they call TF Eager, TensorFlow Eager, which
1:13:08.560 --> 1:13:10.880
makes it look a lot like PyTorch.
1:13:10.880 --> 1:13:16.400
But it's 10 times slower than PyTorch to actually do a step.
1:13:16.400 --> 1:13:19.080
That's because they didn't invest the time
1:13:19.080 --> 1:13:22.080
in retooling the foundations because their code base
1:13:22.080 --> 1:13:23.520
is so horribly complex.
1:13:23.520 --> 1:13:25.280
Yeah, I think it's probably very difficult
1:13:25.280 --> 1:13:26.440
to do that kind of retooling.
1:13:26.440 --> 1:13:28.680
Yeah, well, particularly the way TensorFlow was written,
1:13:28.680 --> 1:13:31.480
it was written by a lot of people very quickly
1:13:31.480 --> 1:13:33.320
in a very disorganized way.
1:13:33.320 --> 1:13:36.000
So when you actually look in the code, as I do often,
1:13:36.000 --> 1:13:38.840
I'm always just like, oh, god, what were they thinking?
1:13:38.840 --> 1:13:41.480
It's just, it's pretty awful.
1:13:41.480 --> 1:13:47.080
So I'm really extremely negative about the potential future
1:13:47.080 --> 1:13:52.120
for Python TensorFlow. But Swift for TensorFlow
1:13:52.120 --> 1:13:53.760
can be a different beast altogether.
1:13:53.760 --> 1:13:57.560
It can be like, it can basically be a layer on top of MLIR
1:13:57.560 --> 1:14:02.640
that takes advantage of all the great compiler stuff
1:14:02.640 --> 1:14:04.760
that Swift builds on with LLVM.
1:14:04.760 --> 1:14:07.040
And yeah, it could be absolutely.
1:14:07.040 --> 1:14:10.320
I think it will be absolutely fantastic.
1:14:10.320 --> 1:14:11.920
Well, you're inspiring me to try.
1:14:11.920 --> 1:14:17.640
I haven't truly felt the pain of TensorFlow 2.0 Python.
1:14:17.640 --> 1:14:19.040
It's fine by me.
1:14:19.040 --> 1:14:21.080
But of course.
1:14:21.080 --> 1:14:23.240
Yeah, I mean, it does the job if you're using
1:14:23.240 --> 1:14:27.720
predefined things that somebody's already written.
1:14:27.720 --> 1:14:29.920
But if you actually compare, like I've
1:14:29.920 --> 1:14:33.680
had to do a lot of stuff with TensorFlow recently,
1:14:33.680 --> 1:14:35.480
you actually compare like, I want
1:14:35.480 --> 1:14:37.360
to write something from scratch.
1:14:37.360 --> 1:14:39.040
And you're like, I just keep finding it's like, oh,
1:14:39.040 --> 1:14:41.560
it's running 10 times slower than PyTorch.
1:14:41.560 --> 1:14:43.800
So what is the biggest cost?
1:14:43.800 --> 1:14:47.360
Let's throw running time out the window.
1:14:47.360 --> 1:14:49.640
How long does it take you to program?
1:14:49.640 --> 1:14:51.000
That's not too different now.
1:14:51.000 --> 1:14:54.080
Thanks to TensorFlow Eager, that's not too different.
1:14:54.080 --> 1:14:58.640
But because so many things take so long to run,
1:14:58.640 --> 1:15:00.320
you wouldn't run it at 10 times slower.
1:15:00.320 --> 1:15:03.000
Like, you just go like, oh, this is taking too long.
1:15:03.000 --> 1:15:04.240
And also, there's a lot of things
1:15:04.240 --> 1:15:05.840
which are just less programmable,
1:15:05.840 --> 1:15:09.000
like tf.data, which is the way data processing works
1:15:09.000 --> 1:15:11.400
in TensorFlow, is just this big mess.
1:15:11.400 --> 1:15:13.160
It's incredibly inefficient.
1:15:13.160 --> 1:15:14.800
And they kind of had to write it that way
1:15:14.800 --> 1:15:19.160
because of the TPU problems I described earlier.
1:15:19.160 --> 1:15:24.680
So I just feel like they've got this huge technical debt,
1:15:24.680 --> 1:15:27.960
which they're not going to solve without starting from scratch.
1:15:27.960 --> 1:15:29.440
So here's an interesting question then.
1:15:29.440 --> 1:15:34.720
If there's a new student starting today,
1:15:34.720 --> 1:15:37.480
what would you recommend they use?
1:15:37.480 --> 1:15:39.160
Well, I mean, we obviously recommend
1:15:39.160 --> 1:15:42.760
FastAI and PyTorch because we teach new students.
1:15:42.760 --> 1:15:43.960
And that's what we teach with.
1:15:43.960 --> 1:15:46.080
So we would very strongly recommend that
1:15:46.080 --> 1:15:50.280
because it will let you get on top of the concepts much
1:15:50.280 --> 1:15:51.960
more quickly.
1:15:51.960 --> 1:15:53.160
So then you'll become an effective practitioner.
1:15:53.160 --> 1:15:56.400
And you'll also learn the actual state of the art techniques.
1:15:56.400 --> 1:15:59.240
So you actually get world class results.
1:15:59.240 --> 1:16:03.000
Honestly, it doesn't much matter what library
1:16:03.000 --> 1:16:09.240
you learn because switching from Chainer to MXNet to TensorFlow
1:16:09.240 --> 1:16:12.000
to PyTorch is going to be a couple of days work
1:16:12.000 --> 1:16:15.280
as long as you understand the foundations well.
1:16:15.280 --> 1:16:21.600
But do you think Swift will creep in there as a thing
1:16:21.600 --> 1:16:22.960
that people start using?
1:16:22.960 --> 1:16:26.400
Not for a few years, particularly because Swift
1:16:26.400 --> 1:16:33.440
has no data science community, libraries, or tooling.
1:16:33.440 --> 1:16:39.080
And the Swift community has a total lack of appreciation
1:16:39.080 --> 1:16:41.040
and understanding of numeric computing.
1:16:41.040 --> 1:16:43.640
So they keep on making stupid decisions.
1:16:43.640 --> 1:16:47.480
For years, they've just done dumb things around performance
1:16:47.480 --> 1:16:50.280
and prioritization.
1:16:50.280 --> 1:16:56.360
That's clearly changing now because the developer of Swift, Chris
1:16:56.360 --> 1:16:59.960
Lattner is working at Google on Swift for TensorFlow.
1:16:59.960 --> 1:17:03.200
So that's a priority.
1:17:03.200 --> 1:17:05.000
It'll be interesting to see what happens with Apple
1:17:05.000 --> 1:17:10.000
because Apple hasn't shown any sign of caring
1:17:10.000 --> 1:17:12.960
about numeric programming in Swift.
1:17:12.960 --> 1:17:16.600
So hopefully they'll get off their arse
1:17:16.600 --> 1:17:18.840
and start appreciating this because currently all
1:17:18.840 --> 1:17:24.240
of their low level libraries are not written in Swift.
1:17:24.240 --> 1:17:27.640
They're not particularly Swifty at all, stuff like Core ML.
1:17:27.640 --> 1:17:30.840
They're really pretty rubbish.
1:17:30.840 --> 1:17:32.760
So yeah, so there's a long way to go.
1:17:32.760 --> 1:17:35.360
But at least one nice thing is that Swift for TensorFlow
1:17:35.360 --> 1:17:40.000
can actually directly use Python code and Python libraries.
1:17:40.000 --> 1:17:44.240
Literally, the entire lesson one notebook of fast AI
1:17:44.240 --> 1:17:47.800
runs in Swift right now in Python mode.
1:17:47.800 --> 1:17:50.800
So that's a nice intermediate thing.
1:17:50.800 --> 1:17:56.800
How long does it take if you look at the two fast AI courses,
1:17:56.800 --> 1:18:00.360
how long does it take to get from 0.0 to completing
1:18:00.360 --> 1:18:02.360
both courses?
1:18:02.360 --> 1:18:04.800
It varies a lot.
1:18:04.800 --> 1:18:12.360
Somewhere between two months and two years, generally.
1:18:12.360 --> 1:18:15.360
So for two months, how many hours a day on average?
1:18:15.360 --> 1:18:20.360
So like somebody who is a very competent coder
1:18:20.360 --> 1:18:27.360
can do 70 hours per course and pick it up.
1:18:27.360 --> 1:18:28.360
70, 70.
1:18:28.360 --> 1:18:29.360
That's it?
1:18:29.360 --> 1:18:30.360
OK.
1:18:30.360 --> 1:18:36.360
But a lot of people I know take a year off to study fast AI
1:18:36.360 --> 1:18:39.360
full time and say at the end of the year,
1:18:39.360 --> 1:18:42.360
they feel pretty competent.
1:18:42.360 --> 1:18:45.360
Because generally, there's a lot of other things you do.
1:18:45.360 --> 1:18:48.360
Generally, they'll be entering Kaggle competitions.
1:18:48.360 --> 1:18:51.360
They might be reading Ian Goodfellow's book.
1:18:51.360 --> 1:18:54.360
They might be doing a bunch of stuff.
1:18:54.360 --> 1:18:57.360
And often, particularly if they are a domain expert,
1:18:57.360 --> 1:19:01.360
their coding skills might be a little on the pedestrian side.
1:19:01.360 --> 1:19:04.360
So part of it's just doing a lot more coding.
1:19:04.360 --> 1:19:07.360
What do you find is the bottleneck for people usually,
1:19:07.360 --> 1:19:11.360
except getting started and setting stuff up?
1:19:11.360 --> 1:19:13.360
I would say coding.
1:19:13.360 --> 1:19:17.360
The people who are strong coders pick it up the best.
1:19:17.360 --> 1:19:21.360
Although another bottleneck is people who have a lot of
1:19:21.360 --> 1:19:27.360
experience of classic statistics can really struggle
1:19:27.360 --> 1:19:30.360
because the intuition is so opposite to what they're used to.
1:19:30.360 --> 1:19:33.360
They're very used to trying to reduce the number of parameters
1:19:33.360 --> 1:19:38.360
in their model and looking at individual coefficients
1:19:38.360 --> 1:19:39.360
and stuff like that.
1:19:39.360 --> 1:19:42.360
So I find people who have a lot of coding background
1:19:42.360 --> 1:19:45.360
and know nothing about statistics are generally
1:19:45.360 --> 1:19:48.360
going to be the best off.
1:19:48.360 --> 1:19:51.360
So you taught several courses on deep learning
1:19:51.360 --> 1:19:54.360
and as Feynman says, the best way to understand something
1:19:54.360 --> 1:19:55.360
is to teach it.
1:19:55.360 --> 1:19:58.360
What have you learned about deep learning from teaching it?
1:19:58.360 --> 1:20:00.360
A lot.
1:20:00.360 --> 1:20:03.360
It's a key reason for me to teach the courses.
1:20:03.360 --> 1:20:06.360
Obviously, it's going to be necessary to achieve our goal
1:20:06.360 --> 1:20:09.360
of getting domain experts to be familiar with deep learning,
1:20:09.360 --> 1:20:12.360
but it was also necessary for me to achieve my goal
1:20:12.360 --> 1:20:16.360
of being really familiar with deep learning.
1:20:16.360 --> 1:20:24.360
I mean, to see so many domain experts from so many different
1:20:24.360 --> 1:20:28.360
backgrounds, it's definitely, I wouldn't say taught me,
1:20:28.360 --> 1:20:31.360
but convinced me something that I liked to believe was true,
1:20:31.360 --> 1:20:34.360
which was anyone can do it.
1:20:34.360 --> 1:20:37.360
So there's a lot of kind of snobbishness out there about
1:20:37.360 --> 1:20:39.360
only certain people can learn to code,
1:20:39.360 --> 1:20:42.360
only certain people are going to be smart enough to do AI.
1:20:42.360 --> 1:20:44.360
That's definitely bullshit.
1:20:44.360 --> 1:20:48.360
I've seen so many people from so many different backgrounds
1:20:48.360 --> 1:20:52.360
get state of the art results in their domain areas now.
1:20:52.360 --> 1:20:56.360
It's definitely taught me that the key differentiator
1:20:56.360 --> 1:21:00.360
between people that succeed and people that fail is tenacity.
1:21:00.360 --> 1:21:03.360
That seems to be basically the only thing that matters.
1:21:03.360 --> 1:21:07.360
A lot of people give up.
1:21:07.360 --> 1:21:13.360
But of the ones who don't give up, pretty much everybody succeeds,
1:21:13.360 --> 1:21:17.360
even if at first I'm just kind of thinking,
1:21:17.360 --> 1:21:20.360
wow, they really aren't quite getting it yet, are they?
1:21:20.360 --> 1:21:24.360
But eventually people get it and they succeed.
1:21:24.360 --> 1:21:27.360
So I think those are both things I liked
1:21:27.360 --> 1:21:29.360
to believe were true, but I didn't feel like I really had
1:21:29.360 --> 1:21:31.360
strong evidence for them to be true,
1:21:31.360 --> 1:21:34.360
but now I've seen it again and again.
1:21:34.360 --> 1:21:39.360
So what advice do you have for someone
1:21:39.360 --> 1:21:42.360
who wants to get started in deep learning?
1:21:42.360 --> 1:21:44.360
Train lots of models.
1:21:44.360 --> 1:21:47.360
That's how you learn it.
1:21:47.360 --> 1:21:51.360
So I think, it's not just me.
1:21:51.360 --> 1:21:53.360
I think our course is very good,
1:21:53.360 --> 1:21:55.360
but also lots of people independently have said it's very good.
1:21:55.360 --> 1:21:58.360
It recently won the CogX award for AI courses,
1:21:58.360 --> 1:22:00.360
as being the best in the world.
1:22:00.360 --> 1:22:02.360
I'd say come to our course, course.fast.ai.
1:22:02.360 --> 1:22:05.360
And the thing I keep on harping on in my lessons is
1:22:05.360 --> 1:22:08.360
train models, print out the inputs to the models,
1:22:08.360 --> 1:22:10.360
print out the outputs of the models,
1:22:10.360 --> 1:22:14.360
like study, you know, change the inputs a bit,
1:22:14.360 --> 1:22:16.360
look at how the outputs vary,
1:22:16.360 --> 1:22:22.360
just run lots of experiments to get an intuitive understanding
1:22:22.360 --> 1:22:24.360
of what's going on.
1:22:24.360 --> 1:22:28.360
To get hooked, do you think, you mentioned training,
1:22:28.360 --> 1:22:32.360
is just running the models, doing inference, enough?
1:22:32.360 --> 1:22:35.360
If we talk about getting started.
1:22:35.360 --> 1:22:37.360
No, you've got to fine tune the models.
1:22:37.360 --> 1:22:39.360
So that's the critical thing,
1:22:39.360 --> 1:22:43.360
because at that point, you now have a model that's in your domain area.
1:22:43.360 --> 1:22:46.360
So there's no point running somebody else's model,
1:22:46.360 --> 1:22:48.360
because it's not your model.
1:22:48.360 --> 1:22:50.360
So it only takes five minutes to fine tune a model
1:22:50.360 --> 1:22:52.360
for the data you care about.
1:22:52.360 --> 1:22:54.360
And in lesson two of the course,
1:22:54.360 --> 1:22:56.360
we teach you how to create your own dataset from scratch
1:22:56.360 --> 1:22:58.360
by scripting Google image search.
1:22:58.360 --> 1:23:02.360
And we show you how to actually create a web application running online.
1:23:02.360 --> 1:23:05.360
So I create one in the course that differentiates
1:23:05.360 --> 1:23:08.360
between a teddy bear, a grizzly bear, and a brown bear.
1:23:08.360 --> 1:23:10.360
And it does it with basically 100% accuracy.
1:23:10.360 --> 1:23:13.360
It took me about four minutes to scrape the images
1:23:13.360 --> 1:23:15.360
from Google search in the script.
1:23:15.360 --> 1:23:18.360
There are little graphical widgets we have in the notebook
1:23:18.360 --> 1:23:21.360
that help you clean up the dataset.
1:23:21.360 --> 1:23:24.360
There's other widgets that help you study the results
1:23:24.360 --> 1:23:26.360
and see where the errors are happening.
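A sketch of that lesson-2 flow in fastai v1: download images from URL lists (one file per class, scraped from an image search), drop broken files, and build a databunch. The file and folder names below are placeholders:

```python
from pathlib import Path
from fastai.vision import download_images, verify_images, ImageDataBunch

path = Path("data/bears")
for cls in ["teddy", "grizzly", "brown"]:
    dest = path / cls
    dest.mkdir(parents=True, exist_ok=True)
    download_images(path / f"urls_{cls}.txt", dest, max_pics=200)
    verify_images(dest, delete=True)  # remove images that fail to open

data = ImageDataBunch.from_folder(path, train=".", valid_pct=0.2, size=224)
```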
1:23:26.360 --> 1:23:29.360
And so now we've got over a thousand replies
1:23:29.360 --> 1:23:32.360
in our Share Your Work Here thread of students saying,
1:23:32.360 --> 1:23:34.360
here's the thing I built.
1:23:34.360 --> 1:23:36.360
And so there are people who, like,
1:23:36.360 --> 1:23:38.360
and a lot of them are state of the art.
1:23:38.360 --> 1:23:40.360
Like somebody said, oh, I tried looking at Devanagari characters
1:23:40.360 --> 1:23:42.360
and I couldn't believe it.
1:23:42.360 --> 1:23:44.360
The thing that came out was more accurate
1:23:44.360 --> 1:23:46.360
than the best academic paper after lesson one.
1:23:46.360 --> 1:23:48.360
And then there's others which are just more kind of fun,
1:23:48.360 --> 1:23:53.360
like somebody who's doing Trinidad and Tobago hummingbirds.
1:23:53.360 --> 1:23:55.360
So that's kind of their national bird.
1:23:55.360 --> 1:23:57.360
And so he's got something that can now classify Trinidad
1:23:57.360 --> 1:23:59.360
and Tobago hummingbirds.
1:23:59.360 --> 1:24:02.360
So yeah, train models, fine tune models with your dataset
1:24:02.360 --> 1:24:05.360
and then study their inputs and outputs.
1:24:05.360 --> 1:24:07.360
How much are the fast.ai courses?
1:24:07.360 --> 1:24:09.360
Free.
1:24:09.360 --> 1:24:11.360
Everything we do is free.
1:24:11.360 --> 1:24:13.360
We have no revenue sources of any kind.
1:24:13.360 --> 1:24:15.360
It's just a service to the community.
1:24:15.360 --> 1:24:17.360
You're a saint.
1:24:17.360 --> 1:24:20.360
Okay, once a person understands the basics,
1:24:20.360 --> 1:24:22.360
trains a bunch of models,
1:24:22.360 --> 1:24:25.360
if we look at the scale of years,
1:24:25.360 --> 1:24:27.360
what advice do you have for someone wanting
1:24:27.360 --> 1:24:30.360
to eventually become an expert?
1:24:30.360 --> 1:24:32.360
Train lots of models.
1:24:32.360 --> 1:24:35.360
Specifically, train lots of models in your domain area.
1:24:35.360 --> 1:24:37.360
An expert at what, right?
1:24:37.360 --> 1:24:40.360
We don't need more experts at, like,
1:24:40.360 --> 1:24:45.360
creating slightly evolutionary research in areas
1:24:45.360 --> 1:24:47.360
that everybody's studying.
1:24:47.360 --> 1:24:50.360
We need experts at using deep learning
1:24:50.360 --> 1:24:52.360
to diagnose malaria.
1:24:52.360 --> 1:24:55.360
Well, we need experts at using deep learning
1:24:55.360 --> 1:25:00.360
to analyze language to study media bias.
1:25:00.360 --> 1:25:08.360
So we need experts in analyzing fisheries
1:25:08.360 --> 1:25:11.360
to identify problem areas in the ocean.
1:25:11.360 --> 1:25:13.360
That's what we need.
1:25:13.360 --> 1:25:17.360
So become the expert in your passion area.
1:25:17.360 --> 1:25:21.360
And this is a tool which you can use for just about anything,
1:25:21.360 --> 1:25:24.360
and you'll be able to do that thing better than other people,
1:25:24.360 --> 1:25:26.360
particularly by combining it with your passion
1:25:26.360 --> 1:25:27.360
and domain expertise.
1:25:27.360 --> 1:25:28.360
So that's really interesting.
1:25:28.360 --> 1:25:30.360
Even if you do want to innovate on transfer learning
1:25:30.360 --> 1:25:32.360
or active learning,
1:25:32.360 --> 1:25:34.360
your thought is, I mean,
1:25:34.360 --> 1:25:38.360
what I certainly share is you also need to find
1:25:38.360 --> 1:25:41.360
a domain or data set that you actually really care for.
1:25:41.360 --> 1:25:42.360
Right.
1:25:42.360 --> 1:25:45.360
If you're not working on a real problem that you understand,
1:25:45.360 --> 1:25:47.360
how do you know if you're doing it any good?
1:25:47.360 --> 1:25:49.360
How do you know if your results are good?
1:25:49.360 --> 1:25:51.360
How do you know if you're getting bad results?
1:25:51.360 --> 1:25:52.360
Why are you getting bad results?
1:25:52.360 --> 1:25:54.360
Is it a problem with the data?
1:25:54.360 --> 1:25:57.360
How do you know you're doing anything useful?
1:25:57.360 --> 1:26:00.360
Yeah, to me, the only really interesting research is,
1:26:00.360 --> 1:26:03.360
not the only, but the vast majority of interesting research
1:26:03.360 --> 1:26:06.360
is try and solve an actual problem and solve it really well.
1:26:06.360 --> 1:26:10.360
So both understanding sufficient tools on the deep learning side
1:26:10.360 --> 1:26:14.360
and becoming a domain expert in a particular domain
1:26:14.360 --> 1:26:18.360
are really things within reach for anybody.
1:26:18.360 --> 1:26:19.360
Yeah.
1:26:19.360 --> 1:26:23.360
To me, I would compare it to studying self driving cars,
1:26:23.360 --> 1:26:26.360
having never looked at a car or been in a car
1:26:26.360 --> 1:26:29.360
or turned a car on, which is like the way it is
1:26:29.360 --> 1:26:30.360
for a lot of people.
1:26:30.360 --> 1:26:33.360
They'll study some academic data set
1:26:33.360 --> 1:26:36.360
where they literally have no idea about it.
1:26:36.360 --> 1:26:37.360
By the way, I'm not sure how familiar
1:26:37.360 --> 1:26:39.360
you are with autonomous vehicles,
1:26:39.360 --> 1:26:42.360
but you've literally described a large percentage
1:26:42.360 --> 1:26:45.360
of robotics folks working in self driving cars,
1:26:45.360 --> 1:26:48.360
in that they actually haven't considered driving.
1:26:48.360 --> 1:26:50.360
They haven't actually looked at what driving looks like.
1:26:50.360 --> 1:26:51.360
They haven't driven.
1:26:51.360 --> 1:26:52.360
And it applies.
1:26:52.360 --> 1:26:54.360
It's a problem because you know when you've actually driven,
1:26:54.360 --> 1:26:57.360
these are the things that happened to me when I was driving.
1:26:57.360 --> 1:26:59.360
There's nothing that beats the real world examples
1:26:59.360 --> 1:27:02.360
or just experiencing them.
1:27:02.360 --> 1:27:04.360
You've created many successful startups.
1:27:04.360 --> 1:27:08.360
What does it take to create a successful startup?
1:27:08.360 --> 1:27:12.360
Same thing as becoming successful deep learning practitioner,
1:27:12.360 --> 1:27:14.360
which is not giving up.
1:27:14.360 --> 1:27:22.360
So you can run out of money or run out of time
1:27:22.360 --> 1:27:24.360
or run out of something, you know,
1:27:24.360 --> 1:27:27.360
but if you keep costs super low
1:27:27.360 --> 1:27:29.360
and try and save up some money beforehand
1:27:29.360 --> 1:27:34.360
so you can afford to have some time,
1:27:34.360 --> 1:27:37.360
then just sticking with it is one important thing.
1:27:37.360 --> 1:27:42.360
Doing something you understand and care about is important.
1:27:42.360 --> 1:27:44.360
By something, I don't mean...
1:27:44.360 --> 1:27:46.360
The biggest problem I see with deep learning people
1:27:46.360 --> 1:27:49.360
is they do a PhD in deep learning
1:27:49.360 --> 1:27:52.360
and then they try and commercialize their PhD.
1:27:52.360 --> 1:27:53.360
That's a waste of time
1:27:53.360 --> 1:27:55.360
because that doesn't solve an actual problem.
1:27:55.360 --> 1:27:57.360
You picked your PhD topic
1:27:57.360 --> 1:28:00.360
because it was an interesting kind of engineering
1:28:00.360 --> 1:28:02.360
or math or research exercise.
1:28:02.360 --> 1:28:06.360
But yeah, if you've actually spent time as a recruiter
1:28:06.360 --> 1:28:10.360
and you know that most of your time was spent sifting through resumes
1:28:10.360 --> 1:28:12.360
and you know that most of the time
1:28:12.360 --> 1:28:14.360
you're just looking for certain kinds of things
1:28:14.360 --> 1:28:19.360
and you can try doing that with a model for a few minutes
1:28:19.360 --> 1:28:21.360
and see whether that's something which a model
1:28:21.360 --> 1:28:23.360
seems to be able to do as well as you could,
1:28:23.360 --> 1:28:27.360
then you're on the right track to creating a startup.
1:28:27.360 --> 1:28:30.360
And then I think just being...
1:28:30.360 --> 1:28:34.360
Just be pragmatic and...
1:28:34.360 --> 1:28:36.360
try and stay away from venture capital money
1:28:36.360 --> 1:28:38.360
as long as possible, preferably forever.
1:28:38.360 --> 1:28:42.360
So yeah, on that point, do you...
1:28:42.360 --> 1:28:43.360
venture capital...
1:28:43.360 --> 1:28:46.360
So were you able to successfully run startups
1:28:46.360 --> 1:28:48.360
self funded for quite a while?
1:28:48.360 --> 1:28:50.360
Yeah, so my first two were self funded
1:28:50.360 --> 1:28:52.360
and that was the right way to do it.
1:28:52.360 --> 1:28:53.360
Is that scary?
1:28:53.360 --> 1:28:55.360
No.
1:28:55.360 --> 1:28:57.360
VC-backed startups are much more scary
1:28:57.360 --> 1:29:00.360
because you have these people on your back
1:29:00.360 --> 1:29:01.360
who do this all the time
1:29:01.360 --> 1:29:03.360
and who have done it for years
1:29:03.360 --> 1:29:05.360
telling you grow, grow, grow, grow.
1:29:05.360 --> 1:29:07.360
And they don't care if you fail.
1:29:07.360 --> 1:29:09.360
They only care if you don't grow fast enough.
1:29:09.360 --> 1:29:10.360
So that's scary.
1:29:10.360 --> 1:29:13.360
Whereas doing the ones myself
1:29:13.360 --> 1:29:17.360
with partners who were friends.
1:29:17.360 --> 1:29:20.360
It's nice because we just went along
1:29:20.360 --> 1:29:22.360
at a pace that made sense
1:29:22.360 --> 1:29:24.360
and we were able to build it to something
1:29:24.360 --> 1:29:27.360
which was big enough that we never had to work again
1:29:27.360 --> 1:29:29.360
but was not big enough that any VC
1:29:29.360 --> 1:29:31.360
would think it was impressive
1:29:31.360 --> 1:29:35.360
and that was enough for us to be excited.
1:29:35.360 --> 1:29:38.360
So I thought that's a much better way
1:29:38.360 --> 1:29:40.360
to do things for most people.
1:29:40.360 --> 1:29:42.360
And generally speaking, for yourself,
1:29:42.360 --> 1:29:44.360
but how do you make money during that process?
1:29:44.360 --> 1:29:47.360
Do you cut into savings?
1:29:47.360 --> 1:29:49.360
So yeah, so I started Fast Mail
1:29:49.360 --> 1:29:51.360
and Optimal Decisions at the same time
1:29:51.360 --> 1:29:54.360
in 1999 with two different friends.
1:29:54.360 --> 1:29:59.360
And for Fast Mail,
1:29:59.360 --> 1:30:03.360
I guess I spent $70 a month on the server.
1:30:03.360 --> 1:30:06.360
And when the server ran out of space
1:30:06.360 --> 1:30:09.360
I put a payments button on the front page
1:30:09.360 --> 1:30:11.360
and said if you want more than 10 meg of space
1:30:11.360 --> 1:30:15.360
you have to pay $10 a year.
1:30:15.360 --> 1:30:18.360
So run lean, like, keep your costs down.
1:30:18.360 --> 1:30:19.360
Yeah, so I kept my cost down
1:30:19.360 --> 1:30:22.360
and once I needed to spend more money
1:30:22.360 --> 1:30:25.360
I asked people to spend the money for me
1:30:25.360 --> 1:30:29.360
and that was that basically from then on.
1:30:29.360 --> 1:30:34.360
We were making money and were profitable from then on.
1:30:34.360 --> 1:30:37.360
For Optimal Decisions it was a bit harder
1:30:37.360 --> 1:30:40.360
because we were trying to sell something
1:30:40.360 --> 1:30:42.360
that was more like a $1 million sale
1:30:42.360 --> 1:30:46.360
but what we did was we would sell scoping projects
1:30:46.360 --> 1:30:50.360
so kind of like prototypy projects
1:30:50.360 --> 1:30:51.360
but rather than doing it for free
1:30:51.360 --> 1:30:54.360
we would sell them $50,000 to $100,000.
1:30:54.360 --> 1:30:57.360
So again we were covering our costs
1:30:57.360 --> 1:30:58.360
and also making the client feel like
1:30:58.360 --> 1:31:00.360
we were doing something valuable.
1:31:00.360 --> 1:31:06.360
So in both cases we were profitable from six months in.
1:31:06.360 --> 1:31:08.360
Nevertheless it's scary.
1:31:08.360 --> 1:31:10.360
I mean, yeah, sure.
1:31:10.360 --> 1:31:13.360
I mean it's scary before you jump in
1:31:13.360 --> 1:31:18.360
and I guess I was comparing it to the scariness of VC.
1:31:18.360 --> 1:31:20.360
I felt like with VC stuff it was more scary.
1:31:20.360 --> 1:31:24.360
Much more in somebody else's hands.
1:31:24.360 --> 1:31:26.360
Will they fund you or not?
1:31:26.360 --> 1:31:28.360
What do they think of what you're doing?
1:31:28.360 --> 1:31:30.360
I also found it very difficult with VC-backed startups
1:31:30.360 --> 1:31:33.360
to actually do the thing which I thought was important
1:31:33.360 --> 1:31:35.360
for the company rather than doing the thing
1:31:35.360 --> 1:31:38.360
which I thought would make the VC happy.
1:31:38.360 --> 1:31:40.360
Now, VCs always tell you not to do the thing
1:31:40.360 --> 1:31:41.360
that makes them happy
1:31:41.360 --> 1:31:43.360
but then if you don't do the thing that makes them happy
1:31:43.360 --> 1:31:45.360
they get sad.
1:31:45.360 --> 1:31:48.360
And do you think optimizing for the whatever they call it
1:31:48.360 --> 1:31:52.360
the exit is a good thing to optimize for?
1:31:52.360 --> 1:31:54.360
I mean it can be but not at the VC level
1:31:54.360 --> 1:31:59.360
because the VC exit needs to be, you know, a thousand X.
1:31:59.360 --> 1:32:02.360
Whereas for the lifestyle exit,
1:32:02.360 --> 1:32:04.360
if you can sell something for $10 million
1:32:04.360 --> 1:32:06.360
then you've made it, right?
1:32:06.360 --> 1:32:08.360
So it depends.
1:32:08.360 --> 1:32:10.360
If you want to build something that's going to,
1:32:10.360 --> 1:32:13.360
you're kind of happy to do forever then fine.
1:32:13.360 --> 1:32:16.360
If you want to build something you want to sell
1:32:16.360 --> 1:32:18.360
then three years time that's fine too.
1:32:18.360 --> 1:32:21.360
I mean they're both perfectly good outcomes.
1:32:21.360 --> 1:32:24.360
So you're learning Swift now?
1:32:24.360 --> 1:32:26.360
In a way, I mean, you already are.
1:32:26.360 --> 1:32:31.360
And I read that you use at least in some cases
1:32:31.360 --> 1:32:34.360
spaced repetition as a mechanism for learning new things.
1:32:34.360 --> 1:32:38.360
I use Anki quite a lot myself.
1:32:38.360 --> 1:32:41.360
I actually never talk to anybody about it.
1:32:41.360 --> 1:32:44.360
Don't know how many people do it
1:32:44.360 --> 1:32:46.360
and it works incredibly well for me.
1:32:46.360 --> 1:32:48.360
Can you talk to your experience?
1:32:48.360 --> 1:32:52.360
Like how did you, what do you, first of all, okay,
1:32:52.360 --> 1:32:53.360
let's back it up.
1:32:53.360 --> 1:32:55.360
What is spaced repetition?
1:32:55.360 --> 1:33:00.360
So spaced repetition is an idea created
1:33:00.360 --> 1:33:03.360
by a psychologist named Ebbinghaus,
1:33:03.360 --> 1:33:06.360
I don't know, must be a couple hundred years ago
1:33:06.360 --> 1:33:08.360
or maybe 150 years ago.
1:33:08.360 --> 1:33:11.360
He did something which sounds pretty damn tedious.
1:33:11.360 --> 1:33:16.360
He wrote random sequences of letters on cards
1:33:16.360 --> 1:33:21.360
and tested how well he would remember those random sequences
1:33:21.360 --> 1:33:23.360
a day later, a week later, whatever.
1:33:23.360 --> 1:33:26.360
He discovered that there was this kind of a curve
1:33:26.360 --> 1:33:29.360
where his probability of remembering one of them
1:33:29.360 --> 1:33:31.360
would be dramatically smaller the next day
1:33:31.360 --> 1:33:32.360
and then a little bit smaller the next day
1:33:32.360 --> 1:33:34.360
and a little bit smaller the next day.
1:33:34.360 --> 1:33:37.360
What he discovered is that if he revised those cards
1:33:37.360 --> 1:33:42.360
after a day, the probability would decrease at a smaller rate
1:33:42.360 --> 1:33:44.360
and then if he revised them again a week later,
1:33:44.360 --> 1:33:46.360
they would decrease at a smaller rate again.
1:33:46.360 --> 1:33:51.360
And so he basically figured out a roughly optimal equation
1:33:51.360 --> 1:33:56.360
for when you should revise something you want to remember.
1:33:56.360 --> 1:34:00.360
So spaced repetition learning is using this simple algorithm,
1:34:00.360 --> 1:34:03.360
just something like revise something after a day
1:34:03.360 --> 1:34:06.360
and then three days and then a week and then three weeks
1:34:06.360 --> 1:34:07.360
and so forth.
1:34:07.360 --> 1:34:10.360
And so if you use a program like Anki, as you know,
1:34:10.360 --> 1:34:12.360
it will just do that for you.
1:34:12.360 --> 1:34:14.360
And it will say, did you remember this?
1:34:14.360 --> 1:34:18.360
And if you say no, it will reschedule it to
1:34:18.360 --> 1:34:22.360
appear again, like, 10 times sooner than it otherwise would have.
1:34:22.360 --> 1:34:27.360
It's a kind of a way of being guaranteed to learn something
1:34:27.360 --> 1:34:30.360
because by definition, if you're not learning it,
1:34:30.360 --> 1:34:33.360
it will be rescheduled to be revised more quickly.
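A toy scheduler in the spirit of what's described, where the review gap stretches while you keep remembering and collapses when you forget; the tripling rule is an illustrative assumption, not Anki's or SuperMemo's actual algorithm:

```python
from datetime import date, timedelta

def next_interval(prev_days: float, remembered: bool) -> float:
    """Roughly triple the gap on success (1 -> 3 -> 9 days...); reset on failure."""
    return prev_days * 3 if remembered else 1.0

interval = 1.0
for remembered in [True, True, False, True]:
    interval = next_interval(interval, remembered)
    print("next review on:", date.today() + timedelta(days=interval))
```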
1:34:33.360 --> 1:34:37.360
Unfortunately though, it doesn't let you fool yourself.
1:34:37.360 --> 1:34:42.360
If you're not learning something, you know your revisions
1:34:42.360 --> 1:34:44.360
will just pile up more and more.
1:34:44.360 --> 1:34:48.360
So you have to find ways to learn things productively
1:34:48.360 --> 1:34:50.360
and effectively treat your brain well.
1:34:50.360 --> 1:34:57.360
So using mnemonics and stories and context and stuff like that.
1:34:57.360 --> 1:34:59.360
So yeah, it's a super great technique.
1:34:59.360 --> 1:35:01.360
It's like learning how to learn is something
1:35:01.360 --> 1:35:05.360
which everybody should learn before they actually learn anything.
1:35:05.360 --> 1:35:07.360
But almost nobody does.
1:35:07.360 --> 1:35:10.360
Yes, so it certainly works well
1:35:10.360 --> 1:35:14.360
for learning new languages, for, I mean, for learning,
1:35:14.360 --> 1:35:16.360
like small projects almost.
1:35:16.360 --> 1:35:19.360
But do you, you know, I started using it for,
1:35:19.360 --> 1:35:22.360
I forget who wrote a blog post about this inspired me.
1:35:22.360 --> 1:35:25.360
It might have been you, I'm not sure.
1:35:25.360 --> 1:35:28.360
I started when I read papers.
1:35:28.360 --> 1:35:31.360
Concepts and ideas, I'll put them in.
1:35:31.360 --> 1:35:32.360
Was it Michael Nielsen?
1:35:32.360 --> 1:35:33.360
It was Michael Nielsen.
1:35:33.360 --> 1:35:34.360
Yeah, it was Michael Nielsen.
1:35:34.360 --> 1:35:36.360
Michael started doing this recently
1:35:36.360 --> 1:35:39.360
and has been writing about it.
1:35:39.360 --> 1:35:44.360
So the kind of today's Ebbinghaus is a guy called Piotr Wozniak
1:35:44.360 --> 1:35:47.360
who developed a system called SuperMemo.
1:35:47.360 --> 1:35:51.360
And he's been basically trying to become like
1:35:51.360 --> 1:35:55.360
the world's greatest renaissance man over the last few decades.
1:35:55.360 --> 1:36:00.360
He's basically lived his life with space repetition learning
1:36:00.360 --> 1:36:03.360
for everything.
1:36:03.360 --> 1:36:07.360
And, sort of like Michael, he's only very recently got into this,
1:36:07.360 --> 1:36:09.360
but he started really getting excited about doing it
1:36:09.360 --> 1:36:10.360
for a lot of different things.
1:36:10.360 --> 1:36:14.360
For me personally, I actually don't use it
1:36:14.360 --> 1:36:16.360
for anything except Chinese.
1:36:16.360 --> 1:36:21.360
And the reason for that is that Chinese is specifically a thing
1:36:21.360 --> 1:36:26.360
I made a conscious decision that I want to continue to remember
1:36:26.360 --> 1:36:29.360
even if I don't get much of a chance to exercise it
1:36:29.360 --> 1:36:33.360
because like I'm not often in China, so I don't.
1:36:33.360 --> 1:36:37.360
Whereas for something like programming languages or papers,
1:36:37.360 --> 1:36:39.360
I have a very different approach,
1:36:39.360 --> 1:36:42.360
which is I try not to learn anything from them,
1:36:42.360 --> 1:36:46.360
but instead I try to identify the important concepts
1:36:46.360 --> 1:36:48.360
and like actually ingest them.
1:36:48.360 --> 1:36:53.360
So like really understand that concept deeply
1:36:53.360 --> 1:36:54.360
and study it carefully.
1:36:54.360 --> 1:36:56.360
Well, decide if it really is important.
1:36:56.360 --> 1:37:00.360
If it is, I incorporate it into our library,
1:37:00.360 --> 1:37:03.360
you know, incorporate it into how I do things
1:37:03.360 --> 1:37:06.360
or decide it's not worth it.
1:37:06.360 --> 1:37:12.360
So I find I then remember the things that I care about
1:37:12.360 --> 1:37:15.360
because I'm using it all the time.
1:37:15.360 --> 1:37:19.360
So for the last 25 years,
1:37:19.360 --> 1:37:23.360
I've committed to spending at least half of every day
1:37:23.360 --> 1:37:25.360
learning or practicing something new,
1:37:25.360 --> 1:37:28.360
which is all my colleagues have always hated
1:37:28.360 --> 1:37:30.360
because it always looks like I'm not working on
1:37:30.360 --> 1:37:31.360
what I'm meant to be working on,
1:37:31.360 --> 1:37:34.360
but that always means I do everything faster
1:37:34.360 --> 1:37:36.360
because I've been practicing a lot of stuff.
1:37:36.360 --> 1:37:39.360
So I kind of give myself a lot of opportunity
1:37:39.360 --> 1:37:41.360
to practice new things.
1:37:41.360 --> 1:37:47.360
And so I find now I don't often kind of find myself
1:37:47.360 --> 1:37:50.360
wishing I could remember something
1:37:50.360 --> 1:37:51.360
because if it's something that's useful,
1:37:51.360 --> 1:37:53.360
then I've been using it a lot.
1:37:53.360 --> 1:37:55.360
It's easy enough to look it up on Google.
1:37:55.360 --> 1:37:59.360
But speaking Chinese, you can't look it up on Google.
1:37:59.360 --> 1:38:01.360
Do you have advice for people learning new things?
1:38:01.360 --> 1:38:04.360
What have you learned as a process?
1:38:04.360 --> 1:38:07.360
I mean, it all starts just making the hours
1:38:07.360 --> 1:38:08.360
in the day available.
1:38:08.360 --> 1:38:10.360
Yeah, you've got to stick with it,
1:38:10.360 --> 1:38:12.360
which is, again, the number one thing
1:38:12.360 --> 1:38:14.360
that 99% of people don't do.
1:38:14.360 --> 1:38:16.360
So the people I started learning Chinese with,
1:38:16.360 --> 1:38:18.360
none of them were still doing it 12 months later.
1:38:18.360 --> 1:38:20.360
I'm still doing it 10 years later.
1:38:20.360 --> 1:38:22.360
I tried to stay in touch with them,
1:38:22.360 --> 1:38:24.360
but they just, no one did it.
1:38:24.360 --> 1:38:26.360
For something like Chinese,
1:38:26.360 --> 1:38:28.360
like study how human learning works.
1:38:28.360 --> 1:38:31.360
So every one of my Chinese flashcards
1:38:31.360 --> 1:38:33.360
is associated with a story,
1:38:33.360 --> 1:38:36.360
and that story is specifically designed to be memorable.
1:38:36.360 --> 1:38:38.360
And we find things memorable,
1:38:38.360 --> 1:38:41.360
funny or disgusting or sexy
1:38:41.360 --> 1:38:44.360
or related to people that we know or care about.
1:38:44.360 --> 1:38:47.360
So I try to make sure all the stories that are in my head
1:38:47.360 --> 1:38:50.360
have those characteristics.
1:38:50.360 --> 1:38:52.360
Yeah, so you have to, you know,
1:38:52.360 --> 1:38:55.360
you won't remember things well if they don't have some context.
1:38:55.360 --> 1:38:57.360
And yeah, you won't remember them well
1:38:57.360 --> 1:39:00.360
if you don't regularly practice them,
1:39:00.360 --> 1:39:02.360
whether it be just part of your day to day life
1:39:02.360 --> 1:39:05.360
or, for the Chinese in my case, flashcards.
1:39:05.360 --> 1:39:09.360
I mean, the other thing is, let yourself fail sometimes.
1:39:09.360 --> 1:39:11.360
So like, I've had various medical problems
1:39:11.360 --> 1:39:13.360
over the last few years,
1:39:13.360 --> 1:39:16.360
and basically my flashcards just stopped
1:39:16.360 --> 1:39:18.360
for about three years.
1:39:18.360 --> 1:39:21.360
And then there've been other times I've stopped
1:39:21.360 --> 1:39:24.360
for a few months, and it's so hard because you get back to it,
1:39:24.360 --> 1:39:27.360
and it's like, you have 18,000 cards due.
1:39:27.360 --> 1:39:30.360
It's like, and so you just have to go,
1:39:30.360 --> 1:39:33.360
all right, well, I can either stop and give up everything
1:39:33.360 --> 1:39:36.360
or just decide to do this every day for the next two years
1:39:36.360 --> 1:39:38.360
until I get back to it.
1:39:38.360 --> 1:39:41.360
The amazing thing has been that even after three years,
1:39:41.360 --> 1:39:45.360
you know, the Chinese was still in there.
1:39:45.360 --> 1:39:47.360
Like, it was so much faster to relearn
1:39:47.360 --> 1:39:49.360
than it was to learn it the first time.
1:39:49.360 --> 1:39:51.360
Yeah, absolutely.
1:39:51.360 --> 1:39:52.360
It's in there.
1:39:52.360 --> 1:39:55.360
I have the same with guitar, with music and so on.
1:39:55.360 --> 1:39:58.360
It's sad because work sometimes takes you away
1:39:58.360 --> 1:40:00.360
and then you won't play for a year.
1:40:00.360 --> 1:40:03.360
But really, if you then just get back to it every day,
1:40:03.360 --> 1:40:05.360
you're right there again.
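
The "cards due" backlog described in this exchange is a byproduct of spaced-repetition scheduling: each successful review pushes a card's next due date further into the future, and skipped days pile up. Below is a minimal sketch of the classic SM-2 update rule, the algorithm that schedulers like Anki's descend from; the Card fields and review() signature here are illustrative, not any particular app's API.

from dataclasses import dataclass

@dataclass
class Card:
    """One flashcard with SM-2 scheduling state (fields are hypothetical)."""
    front: str
    back: str
    ease: float = 2.5        # ease factor; SM-2 starts every card at 2.5
    interval: int = 0        # days until the card comes due again
    repetitions: int = 0     # consecutive successful reviews

def review(card: Card, quality: int) -> Card:
    """Update a card after a review graded 0-5 (5 = perfect recall).

    Classic SM-2: failed cards reset to a one-day interval, while
    successful cards get exponentially growing intervals.
    """
    if quality < 3:
        # Lapse: relearn from a short interval, keeping the ease history.
        card.repetitions = 0
        card.interval = 1
    else:
        if card.repetitions == 0:
            card.interval = 1
        elif card.repetitions == 1:
            card.interval = 6
        else:
            card.interval = round(card.interval * card.ease)
        card.repetitions += 1
    # SM-2 ease update, clamped at the conventional floor of 1.3.
    card.ease = max(1.3, card.ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return card

# Example: a fresh card answered well three times in a row
# ends up scheduled roughly two weeks out.
card = Card("你好", "hello")
for q in (5, 5, 4):
    card = review(card, q)
print(card.interval)  # 16 (days)

Because every good review multiplies the interval by the ease factor, a few years of unreviewed cards all falling due at once is exactly the 18,000-card pile-up described above.
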
1:40:05.360 --> 1:40:08.360
What do you think is the next big breakthrough
1:40:08.360 --> 1:40:09.360
in artificial intelligence?
1:40:09.360 --> 1:40:12.360
What are your hopes in deep learning or beyond
1:40:12.360 --> 1:40:14.360
that people should be working on,
1:40:14.360 --> 1:40:16.360
or you hope there'll be breakthroughs?
1:40:16.360 --> 1:40:18.360
I don't think it's possible to predict.
1:40:18.360 --> 1:40:20.360
I think what we already have
1:40:20.360 --> 1:40:23.360
is an incredibly powerful platform
1:40:23.360 --> 1:40:26.360
to solve lots of societally important problems
1:40:26.360 --> 1:40:28.360
that are currently unsolved.
1:40:28.360 --> 1:40:30.360
I just hope that lots of people
1:40:30.360 --> 1:40:33.360
will learn this toolkit and try to use it.
1:40:33.360 --> 1:40:36.360
I don't think we need a lot of new technological breakthroughs
1:40:36.360 --> 1:40:39.360
to do a lot of great work right now.
1:40:39.360 --> 1:40:42.360
And when do you think we're going to create
1:40:42.360 --> 1:40:44.360
a human-level intelligence system?
1:40:44.360 --> 1:40:45.360
Do you think?
1:40:45.360 --> 1:40:46.360
I don't know.
1:40:46.360 --> 1:40:47.360
How hard is it?
1:40:47.360 --> 1:40:48.360
How far away are we?
1:40:48.360 --> 1:40:49.360
I don't know.
1:40:49.360 --> 1:40:50.360
I have no way to know.
1:40:50.360 --> 1:40:51.360
I don't know.
1:40:51.360 --> 1:40:53.360
Like, I don't know why people make predictions about this
1:40:53.360 --> 1:40:57.360
because there's no data and nothing to go on.
1:40:57.360 --> 1:40:59.360
And it's just like,
1:40:59.360 --> 1:41:03.360
there's so many societally important problems
1:41:03.360 --> 1:41:04.360
to solve right now,
1:41:04.360 --> 1:41:08.360
I just don't find it a really interesting question
1:41:08.360 --> 1:41:09.360
to even answer.
1:41:09.360 --> 1:41:12.360
So in terms of societally important problems,
1:41:12.360 --> 1:41:15.360
what's the problem that is within reach?
1:41:15.360 --> 1:41:17.360
Well, I mean, for example,
1:41:17.360 --> 1:41:19.360
there are problems that AI creates, right?
1:41:19.360 --> 1:41:21.360
So more specifically,
1:41:22.360 --> 1:41:26.360
labor force displacement is going to be huge
1:41:26.360 --> 1:41:28.360
and people keep making this
1:41:28.360 --> 1:41:31.360
frivolous econometric argument of being like,
1:41:31.360 --> 1:41:33.360
oh, there's been other things that aren't AI
1:41:33.360 --> 1:41:34.360
that have come along before
1:41:34.360 --> 1:41:37.360
and haven't created massive labor force displacement.
1:41:37.360 --> 1:41:39.360
Therefore, AI won't.
1:41:39.360 --> 1:41:41.360
So that's a serious concern for you?
1:41:41.360 --> 1:41:42.360
Oh, yeah.
1:41:42.360 --> 1:41:43.360
Andrew Yang is running on it.
1:41:43.360 --> 1:41:44.360
Yeah.
1:41:44.360 --> 1:41:46.360
I'm desperately concerned.
1:41:46.360 --> 1:41:52.360
And you see already that the changing workplace
1:41:52.360 --> 1:41:55.360
has led to a hollowing out of the middle class.
1:41:55.360 --> 1:41:58.360
You're seeing that students coming out of school today
1:41:58.360 --> 1:42:03.360
have a less rosy financial future ahead of them
1:42:03.360 --> 1:42:04.360
than their parents did,
1:42:04.360 --> 1:42:06.360
which has never happened in recent history,
1:42:06.360 --> 1:42:08.360
in the last 300 years.
1:42:08.360 --> 1:42:11.360
We've always had progress before.
1:42:11.360 --> 1:42:16.360
And you see this turning into anxiety and despair
1:42:16.360 --> 1:42:19.360
and even violence.
1:42:19.360 --> 1:42:21.360
So I very much worry about that.
1:42:21.360 --> 1:42:24.360
You've written quite a bit about ethics, too.
1:42:24.360 --> 1:42:27.360
I do think that every data scientist
1:42:27.360 --> 1:42:32.360
working with deep learning needs to recognize
1:42:32.360 --> 1:42:34.360
they have an incredibly high leverage tool
1:42:34.360 --> 1:42:36.360
that they're using that can influence society
1:42:36.360 --> 1:42:37.360
in lots of ways.
1:42:37.360 --> 1:42:38.360
And if they're doing research,
1:42:38.360 --> 1:42:41.360
that research is going to be used by people
1:42:41.360 --> 1:42:42.360
doing this kind of work
1:42:42.360 --> 1:42:44.360
and they have a responsibility
1:42:44.360 --> 1:42:46.360
to consider the consequences
1:42:46.360 --> 1:42:49.360
and to think about things like
1:42:49.360 --> 1:42:53.360
how will humans be in the loop here?
1:42:53.360 --> 1:42:55.360
How do we avoid runaway feedback loops?
1:42:55.360 --> 1:42:58.360
How do we ensure an appeals process for humans
1:42:58.360 --> 1:43:00.360
that are impacted by my algorithm?
1:43:00.360 --> 1:43:04.360
How do I ensure that the constraints of my algorithm
1:43:04.360 --> 1:43:08.360
are adequately explained to the people that end up using them?
1:43:08.360 --> 1:43:11.360
There's all kinds of human issues,
1:43:11.360 --> 1:43:13.360
which only data scientists
1:43:13.360 --> 1:43:17.360
are actually in the right place to educate people about,
1:43:17.360 --> 1:43:21.360
but data scientists tend to think of themselves as
1:43:21.360 --> 1:43:22.360
just engineers
1:43:22.360 --> 1:43:24.360
and that they don't need to be part of that process,
1:43:24.360 --> 1:43:26.360
which is wrong.
1:43:26.360 --> 1:43:29.360
Well, you're in the perfect position to educate them better,
1:43:29.360 --> 1:43:32.360
to read literature, to read history,
1:43:32.360 --> 1:43:35.360
to learn from history.
1:43:35.360 --> 1:43:38.360
Well, Jeremy, thank you so much for everything you do
1:43:38.360 --> 1:43:40.360
for inspiring a huge amount of people,
1:43:40.360 --> 1:43:42.360
getting them into deep learning
1:43:42.360 --> 1:43:44.360
and having the ripple effects,
1:43:44.360 --> 1:43:48.360
the flap of a butterfly's wings that will probably change the world.
1:43:48.360 --> 1:44:17.360
So thank you very much.