WEBVTT
00:00.000 --> 00:03.120
The following is a conversation with Chris Urmson.
00:03.120 --> 00:06.000
He was the CTO of the Google self driving car team,
00:06.000 --> 00:08.880
a key engineer and leader behind the Carnegie Mellon
00:08.880 --> 00:12.000
University autonomous vehicle entries in the DARPA Grand
00:12.000 --> 00:16.160
Challenges and the winner of the DARPA Urban Challenge.
00:16.160 --> 00:20.100
Today, he's the CEO of Aurora Innovation, an autonomous
00:20.100 --> 00:21.360
vehicle software company.
00:21.360 --> 00:23.600
He started it with Sterling Anderson,
00:23.600 --> 00:25.960
the former director of Tesla Autopilot,
00:25.960 --> 00:30.120
and Drew Bagnell, Uber's former autonomy and perception lead.
00:30.120 --> 00:32.880
Chris is one of the top roboticists and autonomous
00:32.880 --> 00:36.320
vehicle experts in the world, and a longtime voice
00:36.320 --> 00:38.840
of reason in a space that is shrouded
00:38.840 --> 00:41.320
in both mystery and hype.
00:41.320 --> 00:43.600
He both acknowledges the incredible challenges
00:43.600 --> 00:46.480
involved in solving the problem of autonomous driving
00:46.480 --> 00:49.760
and is working hard to solve it.
00:49.760 --> 00:52.400
This is the Artificial Intelligence podcast.
00:52.400 --> 00:54.720
If you enjoy it, subscribe on YouTube,
00:54.720 --> 00:57.920
give it five stars on iTunes, support it on Patreon,
00:57.920 --> 00:59.720
or simply connect with me on Twitter
00:59.720 --> 01:03.240
at Lex Fridman, spelled F R I D M A N.
01:03.240 --> 01:09.120
And now, here's my conversation with Chris Urmson.
01:09.120 --> 01:11.960
You were part of both the DARPA Grand Challenge
01:11.960 --> 01:13.880
and the DARPA Urban Challenge teams
01:13.880 --> 01:17.040
at CMU with Red Whittaker.
01:17.040 --> 01:19.720
What technical or philosophical things
01:19.720 --> 01:22.240
have you learned from these races?
01:22.240 --> 01:26.600
I think the high order bit was that it could be done.
01:26.600 --> 01:30.200
I think that was the thing that was
01:30.200 --> 01:34.880
incredible about the first of the Grand Challenges,
01:34.880 --> 01:38.160
that I remember I was a grad student at Carnegie Mellon,
01:38.160 --> 01:45.360
and there was kind of this dichotomy of it
01:45.360 --> 01:46.720
seemed really hard, so that would
01:46.720 --> 01:48.800
be cool and interesting.
01:48.800 --> 01:52.800
But at the time, we were the only robotics institute around,
01:52.800 --> 01:55.560
and so if we went into it and fell on our faces,
01:55.560 --> 01:58.360
that would be embarrassing.
01:58.360 --> 02:01.120
So I think just having the will to go do it,
02:01.120 --> 02:02.880
to try to do this thing that at the time
02:02.880 --> 02:05.000
was marked as darn near impossible,
02:05.000 --> 02:06.960
and then after a couple of tries,
02:06.960 --> 02:08.420
be able to actually make it happen,
02:08.420 --> 02:12.320
I think that was really exciting.
02:12.320 --> 02:15.040
But at which point did you believe it was possible?
02:15.040 --> 02:16.960
Did you from the very beginning?
02:16.960 --> 02:18.000
Did you personally?
02:18.000 --> 02:19.800
Because you're one of the lead engineers.
02:19.800 --> 02:21.800
You actually had to do a lot of the work.
02:21.800 --> 02:23.880
Yeah, I was the technical director there,
02:23.880 --> 02:26.120
and did a lot of the work, along with a bunch
02:26.120 --> 02:28.420
of other really good people.
02:28.420 --> 02:29.760
Did I believe it could be done?
02:29.760 --> 02:31.080
Yeah, of course.
02:31.080 --> 02:32.760
Why would you go do something you thought
02:32.760 --> 02:34.800
was completely impossible?
02:34.800 --> 02:36.260
We thought it was going to be hard.
02:36.260 --> 02:37.800
We didn't know how we were going to be able to do it.
02:37.800 --> 02:42.880
We didn't know if we'd be able to do it the first time.
02:42.880 --> 02:45.960
Turns out we couldn't.
02:45.960 --> 02:48.400
That, yeah, I guess you have to.
02:48.400 --> 02:52.960
I think there's a certain benefit to naivete, right?
02:52.960 --> 02:55.440
That if you don't know how hard something really is,
02:55.440 --> 02:59.600
you try different things, and it gives you an opportunity
02:59.600 --> 03:04.120
that others who are wiser maybe don't have.
03:04.120 --> 03:05.720
What were the biggest pain points?
03:05.720 --> 03:08.880
Mechanical, sensors, hardware, software,
03:08.880 --> 03:11.800
algorithms for mapping, localization,
03:11.800 --> 03:13.680
just general perception, control?
03:13.680 --> 03:15.320
Like hardware, software, first of all?
03:15.320 --> 03:20.120
I think that's the joy of this field, is that it's all hard
03:20.120 --> 03:25.360
and that you have to be good at each part of it.
03:25.360 --> 03:32.360
So for the Grand Challenges, if I look back at it from today,
03:32.360 --> 03:38.960
it should be easy today, in that it was a static world.
03:38.960 --> 03:40.800
There weren't other actors moving through it,
03:40.800 --> 03:42.480
is what that means.
03:42.480 --> 03:47.080
It was out in the desert, so you get really good GPS.
03:47.080 --> 03:51.400
So that helped, and we could map it roughly.
03:51.400 --> 03:55.160
And so in retrospect now, it's within the realm of things
03:55.160 --> 03:57.840
we could do back then.
03:57.840 --> 03:59.720
Just actually getting the vehicle and the,
03:59.720 --> 04:00.680
there's a bunch of engineering work
04:00.680 --> 04:04.760
to get the vehicle so that we could control it and drive it.
04:04.760 --> 04:09.600
That's still a pain today, but it was even more so back then.
04:09.600 --> 04:14.280
And then the uncertainty of exactly what they wanted us to do
04:14.280 --> 04:17.040
was part of the challenge as well.
04:17.040 --> 04:19.440
Right, you didn't actually know the track heading into it.
04:19.440 --> 04:21.480
You knew approximately, but you didn't actually
04:21.480 --> 04:23.520
know the route that was going to be taken.
04:23.520 --> 04:24.920
That's right, we didn't know the route.
04:24.920 --> 04:28.600
We didn't even really, the way the rules had been described,
04:28.600 --> 04:29.800
you had to kind of guess.
04:29.800 --> 04:33.360
So if you think back to that challenge,
04:33.360 --> 04:36.960
the idea was that the government would give us,
04:36.960 --> 04:40.320
the DARPA would give us a set of waypoints
04:40.320 --> 04:43.520
and kind of the width that you had to stay within
04:43.520 --> 04:46.800
between the line that went between each of those waypoints.
04:46.800 --> 04:49.280
And so the most devious thing they could have done
04:49.280 --> 04:53.280
is set a kilometer wide corridor across a field
04:53.280 --> 04:58.280
of scrub brush and rocks and said, go figure it out.
04:58.520 --> 05:01.920
Fortunately, it really, it turned into basically driving
05:01.920 --> 05:05.000
along a set of trails, which is much more relevant
05:05.000 --> 05:07.920
to the application they were looking for.
05:08.760 --> 05:12.080
But no, it was a hell of a thing back in the day.
05:12.080 --> 05:16.640
So the legend, Red, was kind of leading that effort
05:16.640 --> 05:19.120
in terms of just broadly speaking.
05:19.120 --> 05:22.040
So you're a leader now.
05:22.040 --> 05:25.000
What have you learned from Red about leadership?
05:25.000 --> 05:26.200
I think there's a couple things.
05:26.200 --> 05:31.080
One is go and try those really hard things.
05:31.080 --> 05:34.480
That's where there is an incredible opportunity.
05:34.480 --> 05:36.560
I think the other big one, though,
05:36.560 --> 05:40.680
is to see people for who they can be, not who they are.
05:41.720 --> 05:43.720
It's one of the things that I actually,
05:43.720 --> 05:46.080
one of the deepest lessons I learned from Red
05:46.080 --> 05:50.200
was that he would look at undergraduates
05:50.200 --> 05:55.200
or graduate students and empower them to be leaders,
05:56.120 --> 06:00.320
to have responsibility, to do great things
06:00.320 --> 06:04.480
that I think another person might look at them
06:04.480 --> 06:06.600
and think, oh, well, that's just an undergraduate student.
06:06.600 --> 06:07.720
What could they know?
06:08.680 --> 06:12.720
And so I think that kind of trust but verify,
06:12.720 --> 06:14.480
have confidence in what people can become,
06:14.480 --> 06:16.680
I think is a really powerful thing.
06:16.680 --> 06:20.440
So through that, let's just fast forward through the history.
06:20.440 --> 06:24.160
Can you maybe talk through the technical evolution
06:24.160 --> 06:26.200
of autonomous vehicle systems
06:26.200 --> 06:29.960
from the first two Grand Challenges to the Urban Challenge
06:29.960 --> 06:33.560
to today, are there major shifts in your mind
06:33.560 --> 06:37.240
or is it the same kind of technology just made more robust?
06:37.240 --> 06:39.840
I think there's been some big, big steps.
06:40.880 --> 06:43.720
So for the Grand Challenge,
06:43.720 --> 06:48.720
the real technology that unlocked that was HD mapping.
06:51.400 --> 06:54.200
Prior to that, a lot of the off road robotics work
06:55.160 --> 06:58.480
had been done without any real prior model
06:58.480 --> 07:01.400
of what the vehicle was going to encounter.
07:01.400 --> 07:04.880
And so that innovation that the fact that we could get
07:05.960 --> 07:10.960
decimeter resolution models was really a big deal.
07:13.440 --> 07:18.200
And that allowed us to kind of bound the complexity
07:18.200 --> 07:19.680
of the driving problem the vehicle had
07:19.680 --> 07:21.040
and allowed it to operate at speed
07:21.040 --> 07:23.800
because we could assume things about the environment
07:23.800 --> 07:25.360
that it was going to encounter.
07:25.360 --> 07:29.720
So that was the big step there.
07:31.280 --> 07:35.280
For the Urban Challenge,
07:37.240 --> 07:39.280
one of the big technological innovations there
07:39.280 --> 07:41.040
was the multi-beam LiDAR
07:41.960 --> 07:45.760
and being able to generate high resolution,
07:45.760 --> 07:48.680
mid to long range 3D models of the world
07:48.680 --> 07:53.680
and use that for understanding the world around the vehicle.
07:53.680 --> 07:56.600
And that was really kind of a game changing technology.
07:58.600 --> 08:00.000
In parallel with that,
08:00.000 --> 08:04.360
we saw a bunch of other technologies
08:04.360 --> 08:06.120
that had been kind of converging
08:06.120 --> 08:08.440
have their day in the sun.
08:08.440 --> 08:12.560
So Bayesian estimation had been,
08:12.560 --> 08:17.560
SLAM had been a big field in robotics.
08:17.840 --> 08:20.760
You would go to a conference a couple of years before that
08:20.760 --> 08:24.880
and every paper would effectively have SLAM somewhere in it.
08:24.880 --> 08:29.320
And so seeing that the Bayesian estimation techniques
08:30.720 --> 08:33.400
play out on a very visible stage,
08:33.400 --> 08:36.520
I thought that was pretty exciting to see.
08:38.080 --> 08:41.560
And mostly SLAM was done based on LiDAR at that time.
08:41.560 --> 08:44.560
Yeah, and in fact, we weren't really doing SLAM per se
08:45.600 --> 08:47.480
in real time because we had a model ahead of time,
08:47.480 --> 08:51.040
we had a roadmap, but we were doing localization.
08:51.040 --> 08:53.560
And we were using the LiDAR or the cameras
08:53.560 --> 08:55.400
depending on who exactly was doing it
08:55.400 --> 08:57.560
to localize to a model of the world.
08:57.560 --> 09:00.160
And I thought that was a big step
09:00.160 --> 09:05.160
from kind of naively trusting GPS/INS before that.
09:06.640 --> 09:09.840
And again, lots of work had been going on in this field.
09:09.840 --> 09:13.040
Certainly this was not doing anything
09:13.040 --> 09:16.840
particularly innovative in SLAM or in localization,
09:16.840 --> 09:20.200
but it was seeing that technology necessary
09:20.200 --> 09:21.800
in a real application on a big stage,
09:21.800 --> 09:23.080
I thought was very cool.
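
To make that localization idea concrete, here is a minimal one-dimensional sketch of the Bayesian update being described: a pose belief predicted from GPS/INS gets corrected by a lidar registration against the prior map. The scalar state and every number are illustrative, not from any actual system.

```python
# A minimal sketch (not any team's actual code) of map-based localization:
# predict with GPS/INS, correct with a lidar match against the prior map.
# One dimension, Gaussian beliefs, for clarity.

def kalman_update(mean, var, z, z_var):
    """Fuse a Gaussian belief (mean, var) with a measurement z of variance z_var."""
    k = var / (var + z_var)          # Kalman gain: trust in measurement vs prior
    return mean + k * (z - mean), (1 - k) * var

# Prior from GPS/INS dead reckoning: roughly right, meters of uncertainty.
pose, pose_var = 105.0, 4.0          # position along the road (m), variance

# Lidar scan registered against the prior map: centimeter-level uncertainty.
lidar_fix, lidar_var = 103.2, 0.01

pose, pose_var = kalman_update(pose, pose_var, lidar_fix, lidar_var)
print(f"fused pose: {pose:.2f} m, std: {pose_var ** 0.5:.2f} m")
# The fused estimate sits almost on the lidar fix, because the map match is
# far more certain than raw GPS -- the step up from "trusting GPS/INS".
```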
09:23.080 --> 09:24.000
So for the urban challenge,
09:24.000 --> 09:28.600
those maps were already constructed offline in general.
09:28.600 --> 09:30.920
And did people do that individually,
09:30.920 --> 09:33.600
did individual teams do it individually
09:33.600 --> 09:36.440
so they had their own different approaches there
09:36.440 --> 09:41.440
or did everybody kind of share that information
09:41.720 --> 09:42.880
at least intuitively?
09:42.880 --> 09:47.880
So DARPA gave all the teams a model of the world, a map.
09:49.640 --> 09:53.240
And then one of the things that we had to figure out
09:53.240 --> 09:56.080
back then was, and it's still one of these things
09:56.080 --> 09:57.280
that trips people up today
09:57.280 --> 10:00.280
is actually the coordinate system.
10:00.280 --> 10:03.080
So you get a latitude longitude
10:03.080 --> 10:05.040
and to so many decimal places,
10:05.040 --> 10:07.360
you don't really care about kind of the ellipsoid
10:07.360 --> 10:09.560
of the earth that's being used.
10:09.560 --> 10:12.240
But when you want to get to 10 centimeter
10:12.240 --> 10:14.400
or centimeter resolution,
10:14.400 --> 10:18.520
you care whether the coordinate system is NAD 83
10:18.520 --> 10:23.520
or WGS 84 or these are different ways to describe
10:24.200 --> 10:26.760
both the kind of non sphericalness of the earth,
10:26.760 --> 10:31.080
but also kind of the, I think,
10:31.080 --> 10:32.080
I can't remember which one,
10:32.080 --> 10:33.600
the tectonic shifts that are happening
10:33.600 --> 10:37.000
and how to transform the global datum as a function of that.
10:37.000 --> 10:41.020
So getting a map and then actually matching it to reality
10:41.020 --> 10:42.880
to centimeter resolution, that was kind of interesting
10:42.880 --> 10:44.040
and fun back then.
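
A rough back-of-the-envelope sketch of that precision point: how much ground one unit in each decimal place of a latitude covers, and why a meter-scale datum mismatch breaks a centimeter-level map match. The figures are approximations on a spherical Earth, not surveying-grade numbers.

```python
# Back-of-the-envelope: ground distance per decimal place of latitude.
# Illustrative only; real geodesy uses the proper ellipsoid, not a sphere.
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, good enough for scale

def lat_step_to_meters(dlat_deg: float) -> float:
    """Approximate ground distance spanned by a small latitude change."""
    return math.radians(dlat_deg) * EARTH_RADIUS_M

for decimals in range(3, 9):
    step = 10 ** -decimals
    print(f"{decimals} decimals -> ~{lat_step_to_meters(step) * 100:8.2f} cm")

# NAD 83 vs WGS 84 positions of the same point differ by roughly 1-2 m in
# the continental US: irrelevant for raw GPS, fatal for a 10 cm map match.
```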
10:44.040 --> 10:46.760
So how much work was the perception doing there?
10:46.760 --> 10:51.760
So how much were you relying on localization based on maps
10:52.480 --> 10:55.760
without using perception to register to the maps?
10:55.760 --> 10:58.000
And I guess the question is how advanced
10:58.000 --> 10:59.800
was perception at that point?
10:59.800 --> 11:01.960
It's certainly behind where we are today, right?
11:01.960 --> 11:05.840
We're more than a decade since the urban challenge.
11:05.840 --> 11:08.640
But the core of it was there.
11:08.640 --> 11:13.120
That we were tracking vehicles.
11:13.120 --> 11:15.640
We had to do that at 100 plus meter range
11:15.640 --> 11:18.320
because we had to merge with other traffic.
11:18.320 --> 11:21.240
We were using, again, Bayesian estimates
11:21.240 --> 11:23.860
for the state of these vehicles.
11:23.860 --> 11:25.580
We had to deal with a bunch of the problems
11:25.580 --> 11:26.920
that you think of today,
11:26.920 --> 11:29.820
of predicting where that vehicle's going to be
11:29.820 --> 11:31.060
a few seconds into the future.
11:31.060 --> 11:32.380
We had to deal with the fact
11:32.380 --> 11:35.320
that there were multiple hypotheses for that
11:35.320 --> 11:37.660
because a vehicle at an intersection might be going right
11:37.660 --> 11:38.780
or it might be going straight
11:38.780 --> 11:40.620
or it might be making a left turn.
11:41.500 --> 11:44.120
And we had to deal with the challenge of the fact
11:44.120 --> 11:47.600
that our behavior was going to impact the behavior
11:47.600 --> 11:48.960
of that other operator.
11:48.960 --> 11:53.480
And we did a lot of that in relatively naive ways,
11:53.480 --> 11:54.820
but it kind of worked.
11:54.820 --> 11:57.080
Still had to have some kind of solution.
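
A toy sketch of the multiple-hypothesis idea just described: keep a belief over the other vehicle's possible maneuvers and update it with Bayes' rule as evidence arrives. The likelihood values are invented purely for illustration.

```python
# Toy multi-hypothesis prediction at an intersection. All numbers made up.
hypotheses = {"left": 1 / 3, "straight": 1 / 3, "right": 1 / 3}

# P(observation | maneuver) for one observation: "vehicle drifts toward
# the right edge of its lane". Purely illustrative values.
likelihood = {"left": 0.05, "straight": 0.35, "right": 0.60}

posterior = {m: p * likelihood[m] for m, p in hypotheses.items()}
total = sum(posterior.values())
posterior = {m: p / total for m, p in posterior.items()}

print(posterior)  # belief shifts toward "right" but keeps the alternatives
# A planner would hedge against every maneuver with non-negligible mass
# rather than committing to the single most likely one.
```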
11:57.080 --> 11:59.960
And so where does that, 10 years later,
11:59.960 --> 12:01.520
where does that take us today
12:01.520 --> 12:04.260
from that artificial city construction
12:04.260 --> 12:07.000
to real cities to the urban environment?
12:07.000 --> 12:09.160
Yeah, I think the biggest thing
12:09.160 --> 12:14.160
is that the actors are truly unpredictable.
12:15.720 --> 12:18.800
That most of the time, the drivers on the road,
12:18.800 --> 12:23.800
the other road users are out there behaving well,
12:24.080 --> 12:25.880
but every once in a while they're not.
12:27.080 --> 12:32.080
The variety of other vehicles is, you have all of them.
12:32.080 --> 12:35.840
In terms of behavior, in terms of perception, or both?
12:35.840 --> 12:36.680
Both.
12:38.740 --> 12:40.520
Back then we didn't have to deal with cyclists,
12:40.520 --> 12:42.800
we didn't have to deal with pedestrians,
12:42.800 --> 12:44.800
didn't have to deal with traffic lights.
12:46.260 --> 12:49.400
The scale over which you have to operate now
12:49.400 --> 12:51.120
is much larger than the air base
12:51.120 --> 12:52.720
that we were thinking about back then.
12:52.720 --> 12:55.420
So what, easy question,
12:56.280 --> 12:59.720
what do you think is the hardest part about driving?
12:59.720 --> 13:00.560
Easy question.
13:00.560 --> 13:02.560
Yeah, no, I'm joking.
13:02.560 --> 13:07.440
I'm sure nothing really jumps out at you as one thing,
13:07.440 --> 13:12.440
but in the jump from the urban challenge to the real world,
13:12.920 --> 13:15.320
is there something that's a particular,
13:15.320 --> 13:18.480
you foresee as very serious, difficult challenge?
13:18.480 --> 13:21.080
I think the most fundamental difference
13:21.080 --> 13:25.340
is that we're doing it for real.
13:26.760 --> 13:28.960
That in that environment,
13:28.960 --> 13:31.880
it was both a limited complexity environment
13:31.880 --> 13:33.240
because certain actors weren't there,
13:33.240 --> 13:35.380
because the roads were maintained,
13:35.380 --> 13:37.360
there were barriers keeping people separate
13:37.360 --> 13:39.400
from robots at the time,
13:40.840 --> 13:43.300
and it only had to work for 60 miles.
13:43.300 --> 13:46.160
Which, looking at it from 2006,
13:46.160 --> 13:48.960
it had to work for 60 miles, right?
13:48.960 --> 13:50.940
Looking at it from now,
13:51.880 --> 13:53.720
we want things that will go and drive
13:53.720 --> 13:57.160
for half a million miles,
13:57.160 --> 14:00.020
and it's just a different game.
14:00.940 --> 14:03.480
So how important,
14:03.480 --> 14:06.080
you said LiDAR came into the game early on,
14:06.080 --> 14:07.880
and it's really the primary driver
14:07.880 --> 14:10.240
of autonomous vehicles today as a sensor.
14:10.240 --> 14:11.920
So how important is the role of LiDAR
14:11.920 --> 14:14.800
in the sensor suite in the near term?
14:14.800 --> 14:16.740
So I think it's essential.
14:17.920 --> 14:20.480
I believe, but I also believe that cameras are essential,
14:20.480 --> 14:22.120
and I believe that radar is essential.
14:22.120 --> 14:26.280
I think that you really need to use
14:26.280 --> 14:28.720
the composition of data from these different sensors
14:28.720 --> 14:32.640
if you want the thing to really be robust.
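
One way to see why composing data from different sensors buys robustness is a minimal inverse-variance fusion of three noisy range estimates. The variances below are illustrative stand-ins for each modality's typical strengths, not real sensor specs.

```python
# Minimal sketch: inverse-variance fusion of lidar + radar + camera ranges.
readings = {          # estimated range to a lead vehicle (m), variance (m^2)
    "lidar":  (42.1, 0.05),   # precise geometry, degrades in rain and fog
    "radar":  (41.6, 0.60),   # coarse, but weather-robust, measures velocity
    "camera": (43.0, 2.50),   # rich semantics, weak on absolute depth
}

weights = {s: 1.0 / var for s, (_, var) in readings.items()}
fused = sum(w * readings[s][0] for s, w in weights.items()) / sum(weights.values())
fused_var = 1.0 / sum(weights.values())

print(f"fused range: {fused:.2f} m (var {fused_var:.3f})")
# The fused variance is smaller than the best single sensor's, and the
# estimate keeps working when any one modality has a bad day.
```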
14:32.640 --> 14:34.360
The question I wanna ask,
14:34.360 --> 14:35.600
let's see if we can untangle it,
14:35.600 --> 14:39.320
is what are your thoughts on the Elon Musk
14:39.320 --> 14:42.340
provocative statement that LiDAR is a crutch,
14:42.340 --> 14:47.340
that it's a kind of, I guess, growing pains,
14:47.760 --> 14:49.920
and that much of the perception task
14:49.920 --> 14:52.120
can be done with cameras?
14:52.120 --> 14:55.440
So I think it is undeniable
14:55.440 --> 14:59.360
that people walk around without lasers in their foreheads,
14:59.360 --> 15:01.880
and they can get into vehicles and drive them,
15:01.880 --> 15:05.600
and so there's an existence proof
15:05.600 --> 15:09.600
that you can drive using passive vision.
15:10.880 --> 15:12.720
No doubt, can't argue with that.
15:12.720 --> 15:14.680
In terms of sensors, yeah, so there's proof.
15:14.680 --> 15:16.000
Yeah, in terms of sensors, right?
15:16.000 --> 15:20.200
So there's an example that we all go do it,
15:20.200 --> 15:21.380
many of us every day.
15:21.380 --> 15:26.380
In terms of LiDAR being a crutch, sure.
15:28.180 --> 15:33.100
But in the same way that the combustion engine
15:33.100 --> 15:35.260
was a crutch on the path to an electric vehicle,
15:35.260 --> 15:39.300
in the same way that any technology ultimately gets
15:40.840 --> 15:44.380
replaced by some superior technology in the future,
15:44.380 --> 15:47.740
and really the way that I look at this
15:47.740 --> 15:51.460
is that the way we get around on the ground,
15:51.460 --> 15:53.920
the way that we use transportation is broken,
15:55.280 --> 15:59.740
and that we have this, I think the number I saw this morning,
15:59.740 --> 16:04.060
37,000 Americans killed last year on our roads,
16:04.060 --> 16:05.380
and that's just not acceptable.
16:05.380 --> 16:09.460
And so any technology that we can bring to bear
16:09.460 --> 16:12.860
that accelerates this self driving technology
16:12.860 --> 16:14.640
coming to market and saving lives
16:14.640 --> 16:17.320
is technology we should be using.
16:18.280 --> 16:20.840
And it feels just arbitrary to say,
16:20.840 --> 16:25.840
well, I'm not okay with using lasers
16:26.240 --> 16:27.820
because that's whatever,
16:27.820 --> 16:30.720
but I am okay with using an eight megapixel camera
16:30.720 --> 16:32.880
or a 16 megapixel camera.
16:32.880 --> 16:34.640
These are just bits of technology,
16:34.640 --> 16:36.360
and we should be taking the best technology
16:36.360 --> 16:41.360
from the tool bin that allows us to go and solve a problem.
16:41.360 --> 16:45.160
The question, I often talk to, well, obviously you do as well,
16:45.160 --> 16:48.280
sort of automotive companies,
16:48.280 --> 16:51.360
and if there's one word that comes up more often
16:51.360 --> 16:55.280
than anything, it's cost, and trying to drive costs down.
16:55.280 --> 17:00.280
So while it's true that it's a tragic number, the 37,000,
17:01.400 --> 17:04.880
the question is, and I'm not the one asking this question
17:04.880 --> 17:05.820
because I hate this question,
17:05.820 --> 17:09.960
but we want to find the cheapest sensor suite
17:09.960 --> 17:13.280
that creates a safe vehicle.
17:13.280 --> 17:18.220
So in that uncomfortable trade off,
17:18.220 --> 17:23.220
do you foresee LiDAR coming down in cost in the future,
17:23.680 --> 17:26.680
or do you see a day where level four autonomy
17:26.680 --> 17:29.880
is possible without LiDAR?
17:29.880 --> 17:32.880
I see both of those, but it's really a matter of time.
17:32.880 --> 17:36.040
And I think really, maybe I would talk to the question
17:36.040 --> 17:37.840
you asked about the cheapest sensor.
17:37.840 --> 17:40.360
I don't think that's actually what you want.
17:40.360 --> 17:45.360
What you want is a sensor suite that is economically viable.
17:45.680 --> 17:49.440
And then after that, everything is about margin
17:49.440 --> 17:52.120
and driving costs out of the system.
17:52.120 --> 17:55.360
What you also want is a sensor suite that works.
17:55.360 --> 17:58.200
And so it's great to tell a story about
17:59.600 --> 18:03.260
how it would be better to have a self driving system
18:03.260 --> 18:08.040
with a $50 sensor instead of a $500 sensor.
18:08.040 --> 18:10.520
But if the $500 sensor makes it work
18:10.520 --> 18:14.760
and the $50 sensor doesn't work, who cares?
18:15.680 --> 18:20.020
So long as you can actually have an economic opportunity,
18:20.020 --> 18:21.520
there's an economic opportunity there.
18:21.520 --> 18:23.760
And the economic opportunity is important
18:23.760 --> 18:27.760
because that's how you actually have a sustainable business
18:27.760 --> 18:31.120
and that's how you can actually see this come to scale
18:31.120 --> 18:32.400
and be out in the world.
18:32.400 --> 18:34.780
And so when I look at LiDAR,
18:35.960 --> 18:38.880
I see a technology that has no underlying
18:38.880 --> 18:42.420
fundamental expense to it.
18:42.420 --> 18:46.080
It's going to be more expensive than an imager
18:46.080 --> 18:50.360
because CMOS processes or fab processes
18:51.360 --> 18:55.080
are dramatically more scalable than mechanical processes.
18:56.200 --> 18:58.320
But we still should be able to drive costs down
18:58.320 --> 19:00.120
substantially on that side.
19:00.120 --> 19:04.840
And then I also do think that with the right business model
19:05.880 --> 19:07.560
you can absorb more,
19:07.560 --> 19:09.480
certainly more cost on the bill of materials.
19:09.480 --> 19:12.600
Yeah, if the sensor suite works, extra value is provided,
19:12.600 --> 19:15.480
therefore you don't need to drive costs down to zero.
19:15.480 --> 19:17.100
It's the basic economics.
19:17.100 --> 19:18.820
You've talked about your intuition
19:18.820 --> 19:22.200
that level two autonomy is problematic
19:22.200 --> 19:25.920
because of the human factor of vigilance
19:25.920 --> 19:28.040
decrement, complacency, overtrust, and so on,
19:28.040 --> 19:29.600
just us being human.
19:29.600 --> 19:31.120
We over trust the system,
19:31.120 --> 19:34.240
we start, even more so, partaking
19:34.240 --> 19:37.180
in the secondary activities like smartphones and so on.
19:38.680 --> 19:43.000
Have your views evolved on this point in either direction?
19:43.000 --> 19:44.800
Can you speak to it?
19:44.800 --> 19:47.480
So, and I want to be really careful
19:47.480 --> 19:50.380
because sometimes this gets twisted in a way
19:50.380 --> 19:53.040
that I certainly didn't intend.
19:53.040 --> 19:58.040
So active safety systems are a really important technology
19:58.040 --> 20:00.680
that we should be pursuing and integrating into vehicles.
20:02.080 --> 20:04.280
And there's an opportunity in the near term
20:04.280 --> 20:06.520
to reduce accidents, reduce fatalities,
20:06.520 --> 20:10.320
and we should be pushing on that.
20:11.960 --> 20:14.680
Level two systems are systems
20:14.680 --> 20:18.080
where the vehicle is controlling two axes.
20:18.080 --> 20:21.720
So braking/throttle and steering.
20:23.480 --> 20:25.680
And I think there are variants of level two systems
20:25.680 --> 20:27.280
that are supporting the driver.
20:27.280 --> 20:31.080
That absolutely we should encourage to be out there.
20:31.080 --> 20:32.880
Where I think there's a real challenge
20:32.880 --> 20:37.640
is in the human factors part around this
20:37.640 --> 20:41.240
and the misconception from the public
20:41.240 --> 20:43.600
around the capability set that that enables
20:43.600 --> 20:45.640
and the trust that they should have in it.
20:46.640 --> 20:50.000
And that is where I kind of,
20:50.000 --> 20:52.920
I'm actually incrementally more concerned
20:52.920 --> 20:54.440
around level three systems
20:54.440 --> 20:58.440
and how exactly a level two system is marketed and delivered
20:58.440 --> 21:01.840
and how much effort people have put into those human factors.
21:01.840 --> 21:05.640
So I still believe several things around this.
21:05.640 --> 21:09.440
One is people will overtrust the technology.
21:09.440 --> 21:11.440
We've seen over the last few weeks
21:11.440 --> 21:14.040
a spate of people sleeping in their Tesla.
21:14.920 --> 21:19.920
I watched an episode last night of Trevor Noah
21:19.920 --> 21:23.920
talking about this and him,
21:23.920 --> 21:26.720
this is a smart guy who has a lot of resources
21:26.720 --> 21:30.720
at his disposal describing a Tesla as a self driving car
21:30.720 --> 21:33.480
and asking why shouldn't people be sleeping in their Tesla?
21:33.480 --> 21:36.560
And it's like, well, because it's not a self driving car
21:36.560 --> 21:38.840
and it is not intended to be
21:38.840 --> 21:43.840
and these people will almost certainly die at some point
21:46.400 --> 21:48.040
or hurt other people.
21:48.040 --> 21:50.080
And so we need to really be thoughtful
21:50.080 --> 21:51.840
about how that technology is described
21:51.840 --> 21:53.280
and brought to market.
21:54.240 --> 21:59.240
I also think that because of the economic challenges
21:59.240 --> 22:01.240
we were just talking about,
22:01.240 --> 22:05.160
that these level two driver assistance systems,
22:05.160 --> 22:07.280
that technology path will diverge
22:07.280 --> 22:10.200
from the technology path that we need to be on
22:10.200 --> 22:14.080
to actually deliver truly self driving vehicles,
22:14.080 --> 22:16.920
ones where you can get in it and drive it.
22:16.920 --> 22:20.800
Can get in it and sleep and have the equivalent
22:20.800 --> 22:24.680
or better safety than a human driver behind the wheel.
22:24.680 --> 22:27.520
Because again, the economics are very different
22:28.480 --> 22:30.880
in those two worlds and so that leads
22:30.880 --> 22:32.800
to divergent technology.
22:32.800 --> 22:34.680
So you just don't see the economics
22:34.680 --> 22:38.560
of gradually increasing from level two
22:38.560 --> 22:41.600
and doing so quickly enough
22:41.600 --> 22:44.480
to where it doesn't cause critical safety concerns.
22:44.480 --> 22:47.680
You believe that it needs to diverge at this point
22:48.680 --> 22:50.800
into basically different routes.
22:50.800 --> 22:55.560
And really that comes back to what are those L2
22:55.560 --> 22:57.080
and L1 systems doing?
22:57.080 --> 22:59.840
And they are driver assistance functions
22:59.840 --> 23:04.400
where the people that are marketing that responsibly
23:04.400 --> 23:08.000
are being very clear and putting human factors in place
23:08.000 --> 23:12.440
such that the driver is actually responsible for the vehicle
23:12.440 --> 23:15.160
and that the technology is there to support the driver.
23:15.160 --> 23:19.880
And the safety cases that are built around those
23:19.880 --> 23:24.040
are dependent on that driver attention and attentiveness.
23:24.040 --> 23:28.000
And at that point, you can kind of give up
23:29.160 --> 23:31.240
to some degree for economic reasons,
23:31.240 --> 23:33.480
you can give up on say false negatives.
23:34.800 --> 23:36.200
And the way to think about this
23:36.200 --> 23:39.320
is for a forward collision mitigation braking system,
23:39.320 --> 23:43.960
if, half the times the driver missed a vehicle
23:43.960 --> 23:46.080
in front of it, it hit the brakes
23:46.080 --> 23:47.680
and brought the vehicle to a stop,
23:47.680 --> 23:51.640
that would be an incredible, incredible advance
23:51.640 --> 23:53.040
in safety on our roads, right?
23:53.040 --> 23:55.000
That would be equivalent to seat belts.
23:55.000 --> 23:56.600
But it would mean that if that vehicle
23:56.600 --> 23:59.440
wasn't being monitored, it would hit one out of two cars.
24:00.600 --> 24:05.120
And so economically, that's a perfectly good solution
24:05.120 --> 24:06.280
for a driver assistance system.
24:06.280 --> 24:07.240
What you should do at that point,
24:07.240 --> 24:09.240
if you can get it to work 50% of the time,
24:09.240 --> 24:10.520
is drive the cost out of that
24:10.520 --> 24:13.320
so you can get it on as many vehicles as possible.
24:13.320 --> 24:14.760
But driving the cost out of it
24:14.760 --> 24:18.800
doesn't drive up performance on the false negative case.
24:18.800 --> 24:21.440
And so you'll continue to not have a technology
24:21.440 --> 24:25.680
that could really be available for a self driven vehicle.
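
The arithmetic behind that point, with made-up numbers purely for scale: a system that catches half of a driver's misses halves crashes while a driver is supervising, but fails half of all events when it is on its own.

```python
# Illustrative numbers only -- not real crash statistics.
encounters = 1_000_000           # lead-vehicle braking situations
p_driver_misses = 0.001          # human fails to brake in 0.1% of them
p_system_catches = 0.5           # assist catches half of those misses

crashes_human_alone = encounters * p_driver_misses
crashes_human_plus_l2 = crashes_human_alone * (1 - p_system_catches)
# Without a monitoring driver, the system faces every event by itself
# and, at 50% effectiveness, "hits one out of two cars":
crashes_system_alone = encounters * (1 - p_system_catches)

print(f"human alone:    {crashes_human_alone:>9,.0f} crashes")
print(f"human + assist: {crashes_human_plus_l2:>9,.0f} crashes (halved)")
print(f"system alone:   {crashes_system_alone:>9,.0f} crashes")
```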
24:25.680 --> 24:28.440
So clearly the communication,
24:28.440 --> 24:31.600
and this probably applies to L4 vehicles as well,
24:31.600 --> 24:34.440
the marketing and communication
24:34.440 --> 24:37.040
of what the technology is actually capable of,
24:37.040 --> 24:38.400
how hard it is, how easy it is,
24:38.400 --> 24:41.000
all that kind of stuff is highly problematic.
24:41.000 --> 24:45.640
So say everybody in the world was perfectly communicated to
24:45.640 --> 24:48.400
and were made to be completely aware
24:48.400 --> 24:50.000
of every single technology out there,
24:50.000 --> 24:52.840
what it's able to do.
24:52.840 --> 24:54.120
What's your intuition?
24:54.120 --> 24:56.880
And now we're maybe getting into philosophical ground.
24:56.880 --> 25:00.000
Is it possible to have a level two vehicle
25:00.000 --> 25:03.280
where we don't over trust it?
25:04.680 --> 25:05.800
I don't think so.
25:05.800 --> 25:10.800
If people truly understood the risks and internalized it,
25:11.160 --> 25:14.320
then sure, you could do that safely.
25:14.320 --> 25:16.160
But that's a world that doesn't exist.
25:16.160 --> 25:17.520
The people are going to,
25:18.720 --> 25:20.760
if the facts are put in front of them,
25:20.760 --> 25:24.440
they're gonna then combine that with their experience.
25:24.440 --> 25:28.360
And let's say they're using an L2 system
25:28.360 --> 25:30.800
and they go up and down the 101 every day
25:30.800 --> 25:32.720
and they do that for a month.
25:32.720 --> 25:36.200
And it just worked every day for a month.
25:36.200 --> 25:39.000
Like that's pretty compelling at that point,
25:39.000 --> 25:41.800
just even if you know the statistics,
25:41.800 --> 25:43.400
you're like, well, I don't know,
25:43.400 --> 25:44.760
maybe there's something funny about those.
25:44.760 --> 25:46.920
Maybe they're driving in difficult places.
25:46.920 --> 25:49.840
Like I've seen it with my own eyes, it works.
25:49.840 --> 25:52.400
And the problem is that that sample size that they have,
25:52.400 --> 25:53.880
so it's 30 miles up and down,
25:53.880 --> 25:56.360
so 60 miles times 30 days,
25:56.360 --> 25:58.720
so 1,800 miles.
25:58.720 --> 26:03.280
Like that's a drop in the bucket
26:03.280 --> 26:07.640
compared to the, what, 85 million miles between fatalities.
26:07.640 --> 26:11.400
And so they don't really have a true estimate
26:11.400 --> 26:14.440
based on their personal experience of the real risks,
26:14.440 --> 26:15.640
but they're gonna trust it anyway,
26:15.640 --> 26:16.480
because it's hard not to.
26:16.480 --> 26:18.640
It worked for a month, what's gonna change?
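
A quick worked version of that sample-size argument, using a simple Poisson model and the 85-million-mile figure quoted above; the "times worse than human" factors are hypothetical.

```python
# Why a month of personal experience says almost nothing about fatality risk.
import math

human_miles_per_fatality = 85e6          # figure quoted in the conversation
commute_miles = 60 * 30                  # 60 miles/day for a month = 1,800

for factor in (1, 10, 100):              # system 1x, 10x, 100x worse than human
    rate = factor / human_miles_per_fatality      # fatal events per mile
    p_no_event = math.exp(-rate * commute_miles)  # P(seeing nothing all month)
    print(f"{factor:>3}x worse than human: P(uneventful month) = {p_no_event:.4f}")

# Even a system 100x worse than a human driver gives an uneventful month
# ~99.8% of the time -- "it worked for a month" is not evidence of safety.
```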
26:18.640 --> 26:21.640
So even if you start with a perfect understanding of the system,
26:21.640 --> 26:24.160
your own experience will make it drift.
26:24.160 --> 26:25.920
I mean, that's a big concern.
26:25.920 --> 26:28.160
Over a year, over two years even,
26:28.160 --> 26:29.440
it doesn't have to be months.
26:29.440 --> 26:32.920
And I think that as this technology moves
26:32.920 --> 26:37.760
from what I would say is kind of the more technology savvy
26:37.760 --> 26:40.880
ownership group to the mass market,
26:42.640 --> 26:44.600
you may be able to have some of those folks
26:44.600 --> 26:46.280
who are really familiar with technology,
26:46.280 --> 26:48.840
they may be able to internalize it better.
26:48.840 --> 26:50.800
And your kind of immunization
26:50.800 --> 26:53.360
against this kind of false risk assessment
26:53.360 --> 26:54.280
might last longer,
26:54.280 --> 26:58.680
but as folks who aren't as savvy about that
26:58.680 --> 27:00.880
read the material and they compare that
27:00.880 --> 27:02.160
to their personal experience,
27:02.160 --> 27:07.160
I think it's going to move more quickly there.
27:08.160 --> 27:11.280
So your work, the program that you've created at Google
27:11.280 --> 27:16.280
and now at Aurora is focused more on the second path
27:16.600 --> 27:18.480
of creating full autonomy.
27:18.480 --> 27:20.880
So it's such a fascinating,
27:20.880 --> 27:24.560
I think it's one of the most interesting AI problems
27:24.560 --> 27:25.600
of the century, right?
27:25.600 --> 27:28.280
It's, I just talked to a lot of people,
27:28.280 --> 27:29.440
just regular people, I don't know,
27:29.440 --> 27:31.720
my mom, about autonomous vehicles,
27:31.720 --> 27:34.520
and you begin to grapple with ideas
27:34.520 --> 27:38.080
of giving your life control over to a machine.
27:38.080 --> 27:40.040
It's philosophically interesting,
27:40.040 --> 27:41.760
it's practically interesting.
27:41.760 --> 27:43.720
So let's talk about safety.
27:43.720 --> 27:46.240
How do you think we demonstrate,
27:46.240 --> 27:47.880
you've spoken about metrics in the past,
27:47.880 --> 27:51.880
how do you think we demonstrate to the world
27:51.880 --> 27:56.160
that an autonomous vehicle, an Aurora system is safe?
27:56.160 --> 27:57.320
This is one where it's difficult
27:57.320 --> 27:59.280
because there isn't a soundbite answer.
27:59.280 --> 28:04.280
That we have to show a combination of work
28:05.960 --> 28:08.360
that was done diligently and thoughtfully,
28:08.360 --> 28:10.840
and this is where something like a functional safety process
28:10.840 --> 28:11.680
is part of that.
28:11.680 --> 28:14.360
It's like here's the way we did the work,
28:15.280 --> 28:17.160
that means that we were very thorough.
28:17.160 --> 28:20.040
So if you believe that what we said
28:20.040 --> 28:21.440
about this is the way we did it,
28:21.440 --> 28:22.720
then you can have some confidence
28:22.720 --> 28:25.200
that we were thorough in the engineering work
28:25.200 --> 28:26.920
we put into the system.
28:26.920 --> 28:28.920
And then on top of that,
28:28.920 --> 28:32.000
to kind of demonstrate that we weren't just thorough,
28:32.000 --> 28:33.800
we were actually good at what we did,
28:35.280 --> 28:38.200
there'll be a kind of a collection of evidence
28:38.200 --> 28:40.440
in terms of demonstrating that the capabilities
28:40.440 --> 28:42.920
worked the way we thought they did,
28:42.920 --> 28:45.320
statistically and to whatever degree
28:45.320 --> 28:47.280
we can demonstrate that,
28:48.160 --> 28:50.320
both in some combination of simulations,
28:50.320 --> 28:53.080
some combination of unit testing
28:53.080 --> 28:54.640
and decomposition testing,
28:54.640 --> 28:57.000
and then some part of it will be on road data.
28:58.160 --> 29:02.680
And I think the way we'll ultimately
29:02.680 --> 29:04.000
convey this to the public
29:04.000 --> 29:06.760
is there'll be clearly some conversation
29:06.760 --> 29:08.200
with the public about it,
29:08.200 --> 29:12.040
but we'll kind of invoke the trusted nodes
29:12.040 --> 29:13.880
and that we'll spend more time
29:13.880 --> 29:17.280
being able to go into more depth with folks like NHTSA
29:17.280 --> 29:19.720
and other federal and state regulatory bodies
29:19.720 --> 29:22.080
and kind of given that they are
29:22.080 --> 29:25.200
operating in the public interest and they're trusted,
29:26.240 --> 29:28.640
that if we can show enough work to them
29:28.640 --> 29:30.000
that they're convinced,
29:30.000 --> 29:33.800
then I think we're in a pretty good place.
29:33.800 --> 29:35.000
That means you work with people
29:35.000 --> 29:36.920
that are essentially experts at safety
29:36.920 --> 29:39.000
to try to discuss and show.
29:39.000 --> 29:41.720
Do you think, the answer's probably no,
29:41.720 --> 29:42.920
but just in case,
29:42.920 --> 29:44.360
do you think there exists a metric?
29:44.360 --> 29:46.320
So currently people have been using
29:46.320 --> 29:48.200
number of disengagements.
29:48.200 --> 29:50.120
And it quickly turns into a marketing scheme
29:50.120 --> 29:54.280
where you sort of alter the experiments you run to adjust it.
29:54.280 --> 29:56.280
I think you've said that you don't like it.
29:56.280 --> 29:57.120
Don't love it.
29:57.120 --> 29:59.680
No, in fact, I was on the record telling DMV
29:59.680 --> 30:01.960
that I thought this was not a great metric.
30:01.960 --> 30:05.280
Do you think it's possible to create a metric,
30:05.280 --> 30:09.440
a number that could demonstrate safety
30:09.440 --> 30:12.320
outside of fatalities?
30:12.320 --> 30:13.440
So I do.
30:13.440 --> 30:16.560
And I think that it won't be just one number.
30:17.600 --> 30:21.280
So as we are internally grappling with this,
30:21.280 --> 30:23.560
and at some point we'll be able to talk
30:23.560 --> 30:25.040
more publicly about it,
30:25.040 --> 30:28.520
is how do we think about human performance
30:28.520 --> 30:29.840
in different tasks,
30:29.840 --> 30:32.160
say detecting traffic lights
30:32.160 --> 30:36.200
or safely making a left turn across traffic?
30:37.680 --> 30:40.080
And what do we think the failure rates are
30:40.080 --> 30:42.520
for those different capabilities for people?
30:42.520 --> 30:44.760
And then demonstrating to ourselves
30:44.760 --> 30:48.480
and then ultimately folks in the regulatory role
30:48.480 --> 30:50.760
and then ultimately the public
30:50.760 --> 30:52.400
that we have confidence that our system
30:52.400 --> 30:54.760
will work better than that.
30:54.760 --> 30:57.040
And so these individual metrics
30:57.040 --> 31:00.720
will kind of tell a compelling story ultimately.
31:01.760 --> 31:03.920
I do think at the end of the day
31:03.920 --> 31:06.640
what we care about in terms of safety
31:06.640 --> 31:11.320
is life saved and injuries reduced.
31:12.160 --> 31:15.280
And then ultimately kind of casualty dollars
31:16.440 --> 31:19.360
that people aren't having to pay to get their car fixed.
31:19.360 --> 31:22.680
And I do think that in aviation
31:22.680 --> 31:25.880
they look at a kind of an event pyramid
31:25.880 --> 31:28.600
where a crash is at the top of that
31:28.600 --> 31:30.440
and that's the worst event obviously
31:30.440 --> 31:34.240
and then there's injuries and near miss events and whatnot
31:34.240 --> 31:37.320
and violation of operating procedures
31:37.320 --> 31:40.160
and you kind of build a statistical model
31:40.160 --> 31:44.440
of the relevance of the low severity things
31:44.440 --> 31:45.280
or the high severity things.
31:45.280 --> 31:46.120
And I think that's something
31:46.120 --> 31:48.200
where we'll be able to look at as well
31:48.200 --> 31:51.840
because an event per 85 million miles
31:51.840 --> 31:54.440
is statistically a difficult thing
31:54.440 --> 31:56.800
even at the scale of the U.S.
31:56.800 --> 31:59.360
to kind of compare directly.
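
A sketch of that event-pyramid idea: estimate the rate of rare severe events from frequent low-severity ones through assumed ratios. Every number below is an invented placeholder, not aviation or AV data.

```python
# Toy event-pyramid model: infer severe-event rates from measurable ones.
miles_driven = 5_000_000
observed = {"procedure_violation": 250, "near_miss": 40, "minor_contact": 4}

# Assumed conversion: expected severe events per low-severity event.
severity_ratio = {"procedure_violation": 1e-4, "near_miss": 1e-3, "minor_contact": 1e-2}

est_severe = sum(n * severity_ratio[k] for k, n in observed.items())
print(f"expected severe events so far: {est_severe:.3f}")
print(f"implied rate: one per {miles_driven / est_severe:,.0f} miles")

# The point: you never wait to observe fatality-level events directly;
# you validate the model linking them to events you can actually measure.
```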
31:59.360 --> 32:02.240
And an event, a fatality, that's connected
32:02.240 --> 32:07.240
to an autonomous vehicle is, at least currently,
32:07.440 --> 32:09.160
significantly magnified
32:09.160 --> 32:12.320
in the amount of attention it gets.
32:12.320 --> 32:15.080
So that speaks to public perception.
32:15.080 --> 32:16.720
I think the most popular topic
32:16.720 --> 32:19.480
about autonomous vehicles in the public
32:19.480 --> 32:23.080
is the trolley problem formulation, right?
32:23.080 --> 32:27.000
Which, let's not get into that too much,
32:27.000 --> 32:29.600
but is misguided in many ways.
32:29.600 --> 32:32.320
But it speaks to the fact that people are grappling
32:32.320 --> 32:36.160
with this idea of giving control over to a machine.
32:36.160 --> 32:41.160
So how do you win the hearts and minds of the people
32:41.560 --> 32:44.600
that autonomy is something that could be a part
32:44.600 --> 32:45.520
of their lives?
32:45.520 --> 32:47.640
I think you let them experience it, right?
32:47.640 --> 32:50.440
I think it's right.
32:50.440 --> 32:52.800
I think people should be skeptical.
32:52.800 --> 32:55.680
I think people should ask questions.
32:55.680 --> 32:57.000
I think they should doubt
32:57.000 --> 33:00.120
because this is something new and different.
33:00.120 --> 33:01.880
They haven't touched it yet.
33:01.880 --> 33:03.640
And I think that's perfectly reasonable.
33:03.640 --> 33:07.320
But at the same time,
33:07.320 --> 33:09.320
it's clear there's an opportunity to make the road safer.
33:09.320 --> 33:12.440
It's clear that we can improve access to mobility.
33:12.440 --> 33:14.960
It's clear that we can reduce the cost of mobility.
33:16.640 --> 33:19.480
And that once people try that
33:19.480 --> 33:22.720
and understand that it's safe
33:22.720 --> 33:24.440
and are able to use it in their daily lives,
33:24.440 --> 33:25.280
I think it's one of these things
33:25.280 --> 33:28.040
that will just be obvious.
33:28.040 --> 33:32.240
And I've seen this practically in demonstrations
33:32.240 --> 33:35.560
that I've given where I've had people come in
33:35.560 --> 33:38.840
and they're very skeptical.
33:38.840 --> 33:40.440
Again, in a vehicle, my favorite one
33:40.440 --> 33:42.560
is taking somebody out on the freeway
33:42.560 --> 33:46.000
and we're on the 101 driving at 65 miles an hour.
33:46.000 --> 33:48.400
And after 10 minutes, they kind of turn and ask,
33:48.400 --> 33:49.480
is that all it does?
33:49.480 --> 33:52.080
And you're like, it's a self-driving car.
33:52.080 --> 33:54.840
I'm not sure exactly what you thought it would do, right?
33:54.840 --> 33:57.920
But it becomes mundane,
33:58.840 --> 34:01.480
which is exactly what you want a technology
34:01.480 --> 34:02.720
like this to be, right?
34:02.720 --> 34:07.280
When I turn the light switch on in here,
34:07.280 --> 34:12.000
I don't think about the complexity of those electrons
34:12.000 --> 34:14.200
being pushed down a wire from wherever it was
34:14.200 --> 34:15.240
being generated.
34:15.240 --> 34:19.080
It's like, I just get annoyed if it doesn't work, right?
34:19.080 --> 34:21.400
And what I value is the fact
34:21.400 --> 34:23.080
that I can do other things in this space.
34:23.080 --> 34:24.560
I can see my colleagues.
34:24.560 --> 34:26.160
I can read stuff on a paper.
34:26.160 --> 34:29.200
I can not be afraid of the dark.
34:30.360 --> 34:33.320
And I think that's what we want this technology to be like
34:33.320 --> 34:34.640
is it's in the background
34:34.640 --> 34:37.120
and people get to have those life experiences
34:37.120 --> 34:38.440
and do so safely.
34:38.440 --> 34:42.160
So putting this technology in the hands of people
34:42.160 --> 34:46.320
speaks to scale of deployment, right?
34:46.320 --> 34:50.880
So, the dreaded question about the future,
34:50.880 --> 34:53.560
because nobody can predict the future,
34:53.560 --> 34:57.240
but just maybe speak poetically
34:57.240 --> 35:00.880
about when do you think we'll see a large scale deployment
35:00.880 --> 35:05.880
of autonomous vehicles, 10,000, those kinds of numbers?
35:06.680 --> 35:08.240
We'll see that within 10 years.
35:09.240 --> 35:10.240
I'm pretty confident.
35:14.040 --> 35:16.040
What's an impressive scale?
35:16.040 --> 35:19.200
What moment, so you've done the DARPA challenge
35:19.200 --> 35:20.440
where there's one vehicle.
35:20.440 --> 35:23.960
At which moment does it become, wow, this is serious scale?
35:23.960 --> 35:26.520
So I think the moment it gets serious
35:26.520 --> 35:31.520
is when we really do have a driverless vehicle
35:32.240 --> 35:34.120
operating on public roads
35:35.000 --> 35:37.960
and that we can do that kind of continuously.
35:37.960 --> 35:38.880
Without a safety driver.
35:38.880 --> 35:40.440
Without a safety driver in the vehicle.
35:40.440 --> 35:41.560
I think at that moment,
35:41.560 --> 35:44.400
we've kind of crossed the zero to one threshold.
35:45.920 --> 35:50.200
And then it is about how do we continue to scale that?
35:50.200 --> 35:53.960
How do we build the right business models?
35:53.960 --> 35:56.320
How do we build the right customer experience around it
35:56.320 --> 35:59.960
so that it is actually a useful product out in the world?
36:00.960 --> 36:03.600
And I think that is really,
36:03.600 --> 36:05.920
at that point it moves from
36:05.920 --> 36:09.200
what is this kind of mixed science engineering project
36:09.200 --> 36:12.360
into engineering and commercialization
36:12.360 --> 36:15.840
and really starting to deliver on the value
36:15.840 --> 36:20.680
that we all see here and actually making that real in the world.
36:20.680 --> 36:22.240
What do you think that deployment looks like?
36:22.240 --> 36:26.440
Where do we first see the inkling of no safety driver,
36:26.440 --> 36:28.600
one or two cars here and there?
36:28.600 --> 36:29.800
Is it on the highway?
36:29.800 --> 36:33.160
Is it in specific routes in the urban environment?
36:33.160 --> 36:36.920
I think it's gonna be urban, suburban type environments.
36:37.880 --> 36:41.560
Yeah, with Aurora, when we thought about how to tackle this,
36:41.560 --> 36:45.040
it was kind of in vogue to think about trucking
36:46.040 --> 36:47.800
as opposed to urban driving.
36:47.800 --> 36:51.280
And again, the human intuition around this
36:51.280 --> 36:55.400
is that freeways are easier to drive on
36:57.080 --> 36:59.280
because everybody's kind of going in the same direction
36:59.280 --> 37:01.560
and lanes are a little wider, et cetera.
37:01.560 --> 37:03.320
And I think that that intuition is pretty good,
37:03.320 --> 37:06.040
except we don't really care about most of the time.
37:06.040 --> 37:08.400
We care about all of the time.
37:08.400 --> 37:10.880
And when you're driving on a freeway with a truck,
37:10.880 --> 37:13.440
say 70 miles an hour,
37:14.600 --> 37:16.240
and you've got a 70,000 pound load with you,
37:16.240 --> 37:18.880
that's just an incredible amount of kinetic energy.
37:18.880 --> 37:21.440
And so when that goes wrong, it goes really wrong.
37:22.640 --> 37:27.640
And those challenges that you see occur more rarely,
37:27.800 --> 37:31.120
so you don't get to learn as quickly.
37:31.120 --> 37:34.720
And they're incrementally more difficult than urban driving;
37:34.720 --> 37:37.440
they're certainly not easier than urban driving.
37:37.440 --> 37:41.640
And so I think this happens in moderate speed
37:41.640 --> 37:45.280
urban environments because if two vehicles crash
37:45.280 --> 37:48.120
at 25 miles per hour, it's not good,
37:48.120 --> 37:50.120
but probably everybody walks away.
37:51.080 --> 37:53.720
And those events where there's the possibility
37:53.720 --> 37:55.800
for that occurring happen frequently.
37:55.800 --> 37:58.000
So we get to learn more rapidly.
37:58.000 --> 38:01.360
We get to do that with lower risk for everyone.
38:02.520 --> 38:04.360
And then we can deliver value to people
38:04.360 --> 38:05.880
that need to get from one place to another.
38:05.880 --> 38:08.160
And once we've got that solved,
38:08.160 --> 38:11.320
then the freeway driving part of this just falls out.
38:11.320 --> 38:13.080
But we're able to learn more safely,
38:13.080 --> 38:15.200
more quickly in the urban environment.
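
The kinetic energy comparison is easy to check with back-of-the-envelope numbers. The sketch below is an added illustration using KE = ½mv²; the 4,000 pound car weight is an assumed typical figure, not one from the conversation.

```python
# Back-of-the-envelope illustration of the kinetic energy argument.
# The 4,000 lb car is an assumed typical weight; the truck numbers
# are the ones discussed above.
LB_TO_KG = 0.4536
MPH_TO_MS = 0.44704

def kinetic_energy_joules(weight_lb, speed_mph):
    mass_kg = weight_lb * LB_TO_KG
    speed_ms = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * speed_ms ** 2  # KE = 1/2 * m * v^2

truck = kinetic_energy_joules(70_000, 70)  # loaded truck on the freeway
car = kinetic_energy_joules(4_000, 25)     # typical car at urban speed
print(f"truck: {truck / 1e6:.1f} MJ, car: {car / 1e6:.2f} MJ, "
      f"ratio ~{truck / car:.0f}x")
```

Roughly two orders of magnitude more energy in the freeway scenario, which is the quantitative version of "when that goes wrong, it goes really wrong."
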
38:15.200 --> 38:18.760
So 10 years, and then scale in 20, 30 years,
38:18.760 --> 38:22.040
who knows if a sufficiently compelling experience
38:22.040 --> 38:24.400
is created, it could be faster or slower.
38:24.400 --> 38:27.160
Do you think there could be breakthroughs
38:27.160 --> 38:29.920
and what kind of breakthroughs might there be
38:29.920 --> 38:32.400
that completely change that timeline?
38:32.400 --> 38:35.360
Again, not only am I asking you to predict the future,
38:35.360 --> 38:37.360
I'm asking you to predict breakthroughs
38:37.360 --> 38:38.360
that haven't happened yet.
38:38.360 --> 38:41.440
I think another way to ask that
38:41.440 --> 38:44.320
would be if I could wave a magic wand,
38:44.320 --> 38:46.720
what part of the system would I make work today
38:46.720 --> 38:49.480
to accelerate it as quickly as possible?
38:52.120 --> 38:54.200
Don't say infrastructure, please don't say infrastructure.
38:54.200 --> 38:56.320
No, it's definitely not infrastructure.
38:56.320 --> 39:00.600
It's really that perception forecasting capability.
39:00.600 --> 39:04.840
So if tomorrow you could give me a perfect model
39:04.840 --> 39:06.960
of what's happened, what is happening
39:06.960 --> 39:09.200
and what will happen for the next five seconds
39:10.360 --> 39:13.040
around a vehicle on the roadway,
39:13.040 --> 39:15.360
that would accelerate things pretty dramatically.
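
For a sense of what "what will happen for the next five seconds" means in code, the crude baseline that real forecasting has to beat is just extrapolating each tracked object at its current velocity. This is a toy sketch added for illustration, not how Aurora or any production stack does it.

```python
# Toy forecasting baseline (illustration only): extrapolate each tracked
# object at constant velocity over a five second horizon. Real systems
# must also model interactions, intent, and road context.
from dataclasses import dataclass

@dataclass
class Track:
    x: float   # position (m)
    y: float
    vx: float  # velocity (m/s)
    vy: float

def constant_velocity_forecast(track, horizon_s=5.0, dt=0.5):
    """Predicted (x, y) waypoints at dt intervals out to horizon_s."""
    steps = int(horizon_s / dt)
    return [(track.x + track.vx * i * dt, track.y + track.vy * i * dt)
            for i in range(1, steps + 1)]

cyclist = Track(x=10.0, y=-2.0, vx=4.0, vy=0.3)
print(constant_velocity_forecast(cyclist)[:3])  # first 1.5 seconds
```

Everything hard about the forecasting problem lives in the gap between this straight-line guess and what people actually do.
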
39:15.360 --> 39:17.600
Are you, in terms of staying up at night,
39:17.600 --> 39:21.760
are you mostly bothered by cars, pedestrians or cyclists?
39:21.760 --> 39:25.960
So I worry most about the vulnerable road users
39:25.960 --> 39:28.480
about the combination of cyclists and cars, right?
39:28.480 --> 39:31.960
Or pedestrians and cars, because they're not in armor.
39:31.960 --> 39:36.480
The cars, they're bigger, they've got protection
39:36.480 --> 39:39.440
for the people and so the ultimate risk is lower there.
39:41.080 --> 39:43.240
Whereas a pedestrian or a cyclist,
39:43.240 --> 39:46.480
they're out on the road and they don't have any protection
39:46.480 --> 39:49.720
and so we need to pay extra attention to that.
39:49.720 --> 39:54.120
Do you think about a very difficult technical challenge
39:55.720 --> 39:58.520
of the fact that pedestrians,
39:58.520 --> 40:00.240
if you try to protect pedestrians
40:00.240 --> 40:04.560
by being careful and slow, they'll take advantage of that.
40:04.560 --> 40:09.040
So the game theoretic dance, does that worry you
40:09.040 --> 40:12.480
of how, from a technical perspective, how we solve that?
40:12.480 --> 40:14.560
Because as humans, the way we solve that
40:14.560 --> 40:17.240
is to kind of nudge our way through the pedestrians,
40:17.240 --> 40:20.000
which doesn't feel, from a technical perspective,
40:20.000 --> 40:22.300
like an appropriate algorithm.
40:23.200 --> 40:25.920
But do you think about how we solve that problem?
40:25.920 --> 40:30.920
Yeah, I think there's two different concepts there.
40:31.360 --> 40:35.820
So one is, am I worried that because these vehicles
40:35.820 --> 40:37.600
are self-driving, people will kind of step into the road
40:37.600 --> 40:38.640
and take advantage of them?
40:38.640 --> 40:43.640
And I've heard this and I don't really believe it
40:43.760 --> 40:45.960
because if I'm driving down the road
40:45.960 --> 40:48.400
and somebody steps in front of me, I'm going to stop.
40:50.600 --> 40:53.660
Even if I'm annoyed, I'm not gonna just drive
40:53.660 --> 40:56.400
through a person standing in the road.
40:56.400 --> 41:00.400
And so I think today people can take advantage of this
41:00.400 --> 41:02.560
and you do see some people do it.
41:02.560 --> 41:04.180
I guess there's an incremental risk
41:04.180 --> 41:05.880
because maybe they have lower confidence
41:05.880 --> 41:07.720
that I'm gonna see them than they might have
41:07.720 --> 41:10.400
for an automated vehicle and so maybe that shifts
41:10.400 --> 41:12.040
it a little bit.
41:12.040 --> 41:14.360
But I think people don't wanna get hit by cars.
41:14.360 --> 41:17.080
And so I think that I'm not that worried
41:17.080 --> 41:18.760
about people walking out onto the 101
41:18.760 --> 41:23.760
and creating chaos more than they would today.
41:24.400 --> 41:27.040
Regarding kind of the nudging through a big stream
41:27.040 --> 41:30.040
of pedestrians leaving a concert or something,
41:30.040 --> 41:33.520
I think that is further down the technology pipeline.
41:33.520 --> 41:36.960
I think that you're right, that's tricky.
41:36.960 --> 41:38.620
I don't think it's necessarily,
41:40.360 --> 41:43.600
I think the algorithm people use for this is pretty simple.
41:43.600 --> 41:44.800
It's kind of just move forward slowly
41:44.800 --> 41:46.800
and if somebody's really close then stop.
41:46.800 --> 41:50.880
And I think that that probably can be replicated
41:50.880 --> 41:54.040
pretty easily and particularly given that
41:54.040 --> 41:55.720
you don't do this at 30 miles an hour,
41:55.720 --> 41:59.080
you do it at one, that even in those situations
41:59.080 --> 42:01.200
the risk is relatively minimal.
42:01.200 --> 42:03.640
But it's not something we're thinking about
42:03.640 --> 42:04.560
in any serious way.
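
The rule described above, creep forward and stop if anyone is too close, is simple enough to sketch. The thresholds below are made up for illustration; this is not Aurora's planner.

```python
# Toy sketch of the creep-through-a-crowd rule described above.
# Speeds and distances are invented for illustration only.
CREEP_SPEED_MPS = 0.5   # roughly a 1 mph crawl
STOP_DISTANCE_M = 1.5   # stop if any pedestrian is closer than this

def creep_speed(pedestrian_distances_m):
    """Crawl forward, but command zero speed if someone is too close."""
    if any(d < STOP_DISTANCE_M for d in pedestrian_distances_m):
        return 0.0
    return CREEP_SPEED_MPS

print(creep_speed([4.2, 2.8]))  # clear enough: keep creeping -> 0.5
print(creep_speed([4.2, 1.1]))  # someone close: stop -> 0.0
```

At walking pace the worst case is benign, which is the point about doing this at one mile an hour rather than thirty.
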
42:04.560 --> 42:07.920
And probably that's less an algorithm problem
42:07.920 --> 42:10.160
and more creating a human experience.
42:10.160 --> 42:14.300
So the HCI people that create a visual display
42:14.300 --> 42:16.260
so that, as a pedestrian, you're pleasantly
42:16.260 --> 42:20.760
nudged out of the way, that's an experience problem,
42:20.760 --> 42:22.000
not an algorithm problem.
42:22.880 --> 42:25.480
Who's the main competitor to Aurora today?
42:25.480 --> 42:28.640
And how do you outcompete them in the long run?
42:28.640 --> 42:31.200
So we really focus a lot on what we're doing here.
42:31.200 --> 42:34.480
I think that, I've said this a few times,
42:34.480 --> 42:37.960
that this is a huge difficult problem
42:37.960 --> 42:40.320
and it's great that a bunch of companies are tackling it
42:40.320 --> 42:42.320
because I think it's so important for society
42:42.320 --> 42:43.800
that somebody gets there.
42:43.800 --> 42:48.800
So we don't spend a whole lot of time
42:49.120 --> 42:51.600
thinking tactically about who's out there
42:51.600 --> 42:55.240
and how do we beat that person individually.
42:55.240 --> 42:58.720
What are we trying to do to go faster ultimately?
42:59.760 --> 43:02.640
Well part of it is the leadership team we have
43:02.640 --> 43:04.200
has got pretty tremendous experience.
43:04.200 --> 43:06.440
And so we kind of understand the landscape
43:06.440 --> 43:09.160
and understand where the cul-de-sacs are to some degree
43:09.160 --> 43:10.980
and we try and avoid those.
43:10.980 --> 43:14.260
I think part of it is
43:14.260 --> 43:16.260
just this great team we've built.
43:16.260 --> 43:19.080
People, this is a technology and a company
43:19.080 --> 43:22.320
that people believe in the mission of
43:22.320 --> 43:23.740
and so it allows us to attract
43:23.740 --> 43:25.740
just awesome people to go work on it.
43:26.800 --> 43:29.320
We've got a culture I think that people appreciate
43:29.320 --> 43:30.460
that allows them to focus,
43:30.460 --> 43:33.120
allows them to really spend time solving problems.
43:33.120 --> 43:35.900
And I think that keeps them energized.
43:35.900 --> 43:38.940
And then we've invested heavily
43:38.940 --> 43:43.500
in the infrastructure
43:43.500 --> 43:46.540
and architectures that we think will ultimately accelerate us.
43:46.540 --> 43:50.660
So because of the folks we're able to bring in early on,
43:50.660 --> 43:53.540
because of the great investors we have,
43:53.540 --> 43:56.780
we don't spend all of our time doing demos
43:56.780 --> 43:58.660
and kind of leaping from one demo to the next.
43:58.660 --> 44:02.820
We've been given the freedom to invest in
44:03.940 --> 44:05.500
infrastructure to do machine learning,
44:05.500 --> 44:08.600
infrastructure to pull data from our on-road testing,
44:08.600 --> 44:11.500
infrastructure to use that to accelerate engineering.
44:11.500 --> 44:14.480
And I think that early investment
44:14.480 --> 44:17.340
and continuing investment in those kind of tools
44:17.340 --> 44:19.780
will ultimately allow us to accelerate
44:19.780 --> 44:21.940
and do something pretty incredible.
44:21.940 --> 44:23.420
Chris, beautifully put.
44:23.420 --> 44:24.660
It's a good place to end.
44:24.660 --> 44:26.500
Thank you so much for talking today.
44:26.500 --> 44:47.940
Thank you very much. Really enjoyed it.