WEBVTT
00:00.000 --> 00:03.120
The following is a conversation with Chris Urmson.
00:03.120 --> 00:06.040
He was the CTO of the Google self driving car team,
00:06.040 --> 00:08.880
a key engineer and leader behind the Carnegie Mellon
00:08.880 --> 00:11.240
University, autonomous vehicle entries
00:11.240 --> 00:14.120
in the DARPA Grand Challenges and the winner
00:14.120 --> 00:16.160
of the DARPA Urban Challenge.
00:16.160 --> 00:19.480
Today, he's the CEO of Aurora Innovation,
00:19.480 --> 00:21.360
an autonomous vehicle software company.
00:21.360 --> 00:23.600
He started with Sterling Anderson,
00:23.600 --> 00:26.000
who was the former director of Tesla Autopilot
00:26.000 --> 00:30.160
and Drew Bagnell, Uber's former autonomy and perception lead.
00:30.160 --> 00:33.160
Chris is one of the top roboticists and autonomous vehicle
00:33.160 --> 00:37.440
experts in the world and a long time voice of reason
00:37.440 --> 00:41.320
in a space that is shrouded in both mystery and hype.
00:41.320 --> 00:43.600
He both acknowledges the incredible challenges
00:43.600 --> 00:46.560
involved in solving the problem of autonomous driving
00:46.560 --> 00:49.680
and is working hard to solve it.
00:49.680 --> 00:52.440
This is the Artificial Intelligence Podcast.
00:52.440 --> 00:54.720
If you enjoy it, subscribe on YouTube,
00:54.720 --> 00:57.920
give it five stars on iTunes, support it on Patreon,
00:57.920 --> 00:59.760
or simply connect with me on Twitter
00:59.760 --> 01:03.280
at Lex Fridman, spelled F R I D M A N.
01:03.280 --> 01:09.160
And now, here's my conversation with Chris Urmson.
01:09.160 --> 01:11.960
You were part of both the DARPA Grand Challenge
01:11.960 --> 01:17.040
and the DARPA Urban Challenge teams at CMU with Red Whittaker.
01:17.040 --> 01:19.720
What technical or philosophical things
01:19.720 --> 01:22.280
have you learned from these races?
01:22.280 --> 01:26.640
I think the high order bit was that it could be done.
01:26.640 --> 01:32.880
I think that was the thing that was incredible about the first
01:32.880 --> 01:36.440
of the Grand Challenges, that I remember I was a grad
01:36.440 --> 01:41.440
student at Carnegie Mellon, and there
01:41.440 --> 01:46.320
was kind of this dichotomy of it seemed really hard,
01:46.320 --> 01:48.800
so that would be cool and interesting.
01:48.800 --> 01:51.720
But at the time, we were the only robotics
01:51.720 --> 01:54.960
institute around, and so if we went into it and fell
01:54.960 --> 01:58.320
on our faces, that would be embarrassing.
01:58.320 --> 02:01.160
So I think just having the will to go do it,
02:01.160 --> 02:03.360
to try to do this thing that at the time was marked
02:03.360 --> 02:07.120
as darn near impossible, and then after a couple of tries,
02:07.120 --> 02:11.360
be able to actually make it happen, I think that was really
02:11.360 --> 02:12.360
exciting.
02:12.360 --> 02:15.120
But at which point did you believe it was possible?
02:15.120 --> 02:17.000
Did you, from the very beginning,
02:17.000 --> 02:18.360
did you personally, because you're
02:18.360 --> 02:20.320
one of the lead engineers, you actually
02:20.320 --> 02:21.800
had to do a lot of the work?
02:21.800 --> 02:23.840
Yeah, I was the technical director there,
02:23.840 --> 02:26.120
and did a lot of the work, along with a bunch
02:26.120 --> 02:28.440
of other really good people.
02:28.440 --> 02:29.760
Did I believe it could be done?
02:29.760 --> 02:31.120
Yeah, of course.
02:31.120 --> 02:33.400
Why would you go do something you thought was impossible,
02:33.400 --> 02:34.880
completely impossible?
02:34.880 --> 02:36.280
We thought it was going to be hard.
02:36.280 --> 02:38.080
We didn't know how we were going to be able to do it.
02:38.080 --> 02:42.880
We didn't know if we'd be able to do it the first time.
02:42.880 --> 02:46.000
Turns out we couldn't.
02:46.000 --> 02:48.400
That, yeah, I guess you have to.
02:48.400 --> 02:52.920
I think there's a certain benefit to naivete,
02:52.920 --> 02:55.400
that if you don't know how hard something really is,
02:55.400 --> 02:59.560
you try different things, and it gives you an opportunity
02:59.560 --> 03:04.080
that others who are wiser maybe don't have.
03:04.080 --> 03:05.680
What were the biggest pain points?
03:05.680 --> 03:09.360
Mechanical, sensors, hardware, software, algorithms
03:09.360 --> 03:12.760
for mapping, localization, just general perception,
03:12.760 --> 03:15.440
control, like hardware, software, first of all.
03:15.440 --> 03:17.840
I think that's the joy of this field,
03:17.840 --> 03:20.040
is that it's all hard.
03:20.040 --> 03:25.160
And that you have to be good at each part of it.
03:25.160 --> 03:32.280
So for the Grand Challenges, if I look back at it from today,
03:32.280 --> 03:36.200
it should be easy today.
03:36.200 --> 03:38.880
That it was a static world.
03:38.880 --> 03:40.720
There weren't other actors moving through it.
03:40.720 --> 03:42.440
That is what that means.
03:42.440 --> 03:47.080
It was out in the desert, so you get really good GPS.
03:47.080 --> 03:51.320
So that went, and we could map it roughly.
03:51.320 --> 03:55.160
And so in retrospect now, it's within the realm of things
03:55.160 --> 03:57.800
we could do back then.
03:57.800 --> 03:59.200
Just actually getting the vehicle,
03:59.200 --> 04:00.680
and there's a bunch of engineering work
04:00.680 --> 04:04.200
to get the vehicle so that we could control and drive it.
04:04.200 --> 04:09.520
That's still a pain today, but it was even more so back then.
04:09.520 --> 04:12.920
And then the uncertainty of exactly what they wanted us
04:12.920 --> 04:17.080
to do was part of the challenge as well.
04:17.080 --> 04:19.360
Right, you didn't actually know the track heading into it.
04:19.360 --> 04:21.520
You knew approximately, but you didn't actually
04:21.520 --> 04:23.560
know the route that's going to be taken.
04:23.560 --> 04:26.560
That's right, we didn't even really,
04:26.560 --> 04:28.640
the way the rules had been described,
04:28.640 --> 04:29.840
you had to kind of guess.
04:29.840 --> 04:33.440
So if you think back to that challenge,
04:33.440 --> 04:37.000
the idea was that the government would give us,
04:37.000 --> 04:40.360
the DARPA would give us a set of waypoints
04:40.360 --> 04:44.240
and kind of the width that you had to stay within around the line
04:44.240 --> 04:46.840
that went between each of those waypoints.
04:46.840 --> 04:49.280
And so the most devious thing they could have done
04:49.280 --> 04:53.720
is set a kilometer wide corridor across a field of scrub
04:53.720 --> 04:58.520
brush and rocks and said, go figure it out.
04:58.520 --> 05:02.200
Fortunately, it turned into basically driving along
05:02.200 --> 05:06.800
a set of trails, which is much more relevant to the application
05:06.800 --> 05:08.760
they were looking for.
05:08.760 --> 05:12.080
But no, it was a hell of a thing back in the day.
05:12.080 --> 05:16.640
So the legend, Red, was kind of leading that effort
05:16.640 --> 05:19.120
in terms of just broadly speaking.
05:19.120 --> 05:22.040
So you're a leader now.
05:22.040 --> 05:25.000
What have you learned from Red about leadership?
05:25.000 --> 05:26.360
I think there's a couple of things.
05:26.360 --> 05:30.880
One is go and try those really hard things.
05:30.880 --> 05:34.760
That's where there is an incredible opportunity.
05:34.760 --> 05:36.560
I think the other big one, though,
05:36.560 --> 05:41.720
is to see people for who they can be, not who they are.
05:41.720 --> 05:46.080
It's one of the deepest lessons I learned from Red,
05:46.080 --> 05:51.000
was that he would look at undergraduates or graduate
05:51.000 --> 05:56.120
students and empower them to be leaders,
05:56.120 --> 06:01.400
to have responsibility, to do great things,
06:01.400 --> 06:04.760
that I think another person might look at them and think,
06:04.760 --> 06:06.600
oh, well, that's just an undergraduate student.
06:06.600 --> 06:08.720
What could they know?
06:08.720 --> 06:13.520
And so I think that trust, but verify, have confidence
06:13.520 --> 06:14.880
in what people can become, I think,
06:14.880 --> 06:16.680
is a really powerful thing.
06:16.680 --> 06:20.480
So through that, let's just fast forward through the history.
06:20.480 --> 06:24.200
Can you maybe talk through the technical evolution
06:24.200 --> 06:27.480
of autonomous vehicle systems from the first two
06:27.480 --> 06:30.920
Grand Challenges to the Urban Challenge to today?
06:30.920 --> 06:33.600
Are there major shifts in your mind,
06:33.600 --> 06:37.240
or is it the same kind of technology just made more robust?
06:37.240 --> 06:40.880
I think there's been some big, big steps.
06:40.880 --> 06:46.600
So for the Grand Challenge, the real technology
06:46.600 --> 06:51.400
that unlocked that was HD mapping.
06:51.400 --> 06:55.200
Prior to that, a lot of the off road robotics work
06:55.200 --> 06:58.920
had been done without any real prior model of what
06:58.920 --> 07:01.400
the vehicle was going to encounter.
07:01.400 --> 07:03.960
And so that innovation, that the fact
07:03.960 --> 07:11.320
that we could get decimeter resolution models,
07:11.320 --> 07:13.560
was really a big deal.
07:13.560 --> 07:17.480
And that allowed us to kind of bound
07:17.480 --> 07:19.680
the complexity of the driving problem the vehicle had
07:19.680 --> 07:21.040
and allowed it to operate at speed,
07:21.040 --> 07:23.800
because we could assume things about the environment
07:23.800 --> 07:26.400
that it was going to encounter.
07:26.400 --> 07:31.320
So that was one of the big steps there.
07:31.320 --> 07:38.520
For the Urban Challenge, one of the big technological
07:38.520 --> 07:41.960
innovations there was the multi beam LiDAR.
07:41.960 --> 07:45.720
And being able to generate high resolution,
07:45.720 --> 07:48.680
mid to long range 3D models of the world,
07:48.680 --> 07:54.120
and use that for understanding the world around the vehicle.
07:54.120 --> 07:59.120
And that was really kind of a game changing technology.
07:59.120 --> 08:02.880
And parallel with that, we saw a bunch
08:02.880 --> 08:06.640
of other technologies that had been kind of converging
08:06.640 --> 08:08.960
to have their day in the sun.
08:08.960 --> 08:16.800
So Bayesian estimation had been, SLAM had been a big field
08:16.800 --> 08:18.600
in robotics.
08:18.600 --> 08:20.800
You would go to a conference a couple of years
08:20.800 --> 08:23.800
before that, and every paper would effectively
08:23.800 --> 08:25.640
have SLAM somewhere in it.
08:25.640 --> 08:31.560
And so seeing that Bayesian estimation techniques
08:31.560 --> 08:34.040
play out on a very visible stage,
08:34.040 --> 08:38.680
I thought that was pretty exciting to see.
08:38.680 --> 08:41.760
And mostly SLAM was done based on LiDAR at that time?
08:41.760 --> 08:42.400
Well, yeah.
08:42.400 --> 08:46.720
And in fact, we weren't really doing SLAM per se in real time,
08:46.720 --> 08:48.120
because we had a model ahead of time.
08:48.120 --> 08:51.560
We had a roadmap, but we were doing localization.
08:51.560 --> 08:54.080
And we were using the LiDAR or the cameras,
08:54.080 --> 08:55.920
depending on who exactly was doing it,
08:55.920 --> 08:58.080
to localize to a model of the world.
08:58.080 --> 09:00.720
And I thought that was a big step
09:00.720 --> 09:07.160
from kind of naively trusting GPS/INS before that.
09:07.160 --> 09:10.400
And again, lots of work had been going on in this field.
09:10.400 --> 09:14.080
Certainly, this was not doing anything particularly
09:14.080 --> 09:17.400
innovative in SLAM or in localization,
09:17.400 --> 09:20.160
but it was seeing that technology necessary
09:20.160 --> 09:21.800
in a real application on a big stage.
09:21.800 --> 09:23.080
I thought it was very cool.
09:23.080 --> 09:25.600
So for the Urban Challenge, were those maps already
09:25.600 --> 09:28.120
constructed offline, in general?
09:28.120 --> 09:28.600
OK.
09:28.600 --> 09:30.920
And did people do that individually?
09:30.920 --> 09:33.600
Did individual teams do it individually?
09:33.600 --> 09:36.440
So they had their own different approaches there?
09:36.440 --> 09:41.720
Or did everybody kind of share that information,
09:41.720 --> 09:42.880
at least intuitively?
09:42.880 --> 09:49.560
So DARPA gave all the teams a model of the world, a map.
09:49.560 --> 09:53.720
And then one of the things that we had to figure out back then
09:53.720 --> 09:56.720
was, and it's still one of these things that trips people up
09:56.720 --> 10:00.240
today, is actually the coordinate system.
10:00.240 --> 10:03.000
So you get a latitude, longitude.
10:03.000 --> 10:05.120
And to so many decimal places, you
10:05.120 --> 10:07.800
don't really care about kind of the ellipsoid of the Earth
10:07.800 --> 10:09.520
that's being used.
10:09.520 --> 10:12.720
But when you want to get to 10 centimeter or centimeter
10:12.720 --> 10:18.480
resolution, you care whether the coordinate system is NAD 83
10:18.480 --> 10:22.720
or WGS 84, or these are different ways
10:22.720 --> 10:26.720
to describe both the kind of nonsphericalness of the Earth,
10:26.720 --> 10:31.560
but also kind of, actually, and I can't remember
10:31.560 --> 10:33.560
which one, the tectonic shifts that are happening
10:33.560 --> 10:36.920
and how to transform the global datum as a function of that.
10:36.920 --> 10:40.400
So getting a map and then actually matching it
10:40.400 --> 10:41.880
to reality to centimeter resolution,
10:41.880 --> 10:44.000
that was kind of interesting and fun back then.
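
To make the datum point concrete, here is a back-of-the-envelope sketch (an illustration added here, not from the conversation) of how many decimal places of latitude you need before the roughly meter-level offset between NAD 83 and WGS 84 stops being ignorable:

```python
# A back-of-the-envelope sketch (illustration only, not from the interview)
# of why datum choice starts to matter at centimeter resolution.
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, meters

# Ground distance covered by one degree of latitude: ~111 km.
meters_per_degree = math.pi * EARTH_RADIUS_M / 180

for decimals in range(1, 9):
    precision_m = meters_per_degree * 10 ** -decimals
    print(f"{decimals} decimal places of latitude ~ {precision_m:.5f} m")

# Around 7 decimal places correspond to centimeter precision -- the scale
# at which the (roughly meter-level, and slowly drifting with plate
# tectonics) offset between NAD 83 and WGS 84 can no longer be ignored.
```
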
10:44.000 --> 10:46.800
So how much work was the perception doing there?
10:46.800 --> 10:52.440
So how much were you relying on localization based on maps
10:52.440 --> 10:55.720
without using perception to register to the maps?
10:55.720 --> 10:57.960
And I guess the question is how advanced
10:57.960 --> 10:59.720
was perception at that point?
10:59.720 --> 11:01.920
It's certainly behind where we are today.
11:01.920 --> 11:05.800
We're more than a decade since the Urban Challenge.
11:05.800 --> 11:13.080
But the core of it was there, that we were tracking vehicles.
11:13.080 --> 11:15.600
We had to do that at 100 plus meter range
11:15.600 --> 11:18.280
because we had to merge with other traffic.
11:18.280 --> 11:21.200
We were using, again, Bayesian estimates
11:21.200 --> 11:23.800
for state of these vehicles.
11:23.800 --> 11:25.560
We had to deal with a bunch of the problems
11:25.560 --> 11:28.240
that you think of today of predicting
11:28.240 --> 11:31.040
where that vehicle is going to be a few seconds into the future.
11:31.040 --> 11:33.680
We had to deal with the fact that there
11:33.680 --> 11:36.000
were multiple hypotheses for that because a vehicle
11:36.000 --> 11:37.640
at an intersection might be going right
11:37.640 --> 11:41.440
or it might be going straight or it might be making a left turn.
11:41.440 --> 11:44.080
And we had to deal with the challenge of the fact
11:44.080 --> 11:47.520
that our behavior was going to impact the behavior
11:47.520 --> 11:48.880
of that other operator.
11:48.880 --> 11:53.400
And we did a lot of that in relatively naive ways.
11:53.400 --> 11:54.720
But it kind of worked.
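
As a rough illustration of the kind of Bayesian estimation being described, here is a minimal constant-velocity Kalman filter sketch (all parameters and measurements hypothetical) that tracks another vehicle's position and extrapolates it a few seconds into the future:

```python
# A minimal constant-velocity Kalman filter sketch (all parameters
# hypothetical) for tracking another vehicle and predicting its position
# a few seconds ahead, in the spirit of the approach described above.
import numpy as np

dt = 0.1                       # update period, seconds
F = np.array([[1.0, dt],       # state transition: position += velocity*dt
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])     # we only measure position (e.g., from lidar)
Q = np.diag([0.01, 0.1])       # process noise: the driver may change speed
R = np.array([[0.5]])          # measurement noise

x = np.array([[0.0], [20.0]])  # initial belief: 0 m ahead, moving 20 m/s
P = np.eye(2) * 10.0           # initial uncertainty

def kalman_step(x, P, z):
    # Predict forward one time step.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new measurement z.
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [2.1, 4.0, 5.8, 8.1, 10.0]:           # noisy position measurements
    x, P = kalman_step(x, P, np.array([[z]]))

# Extrapolate 3 seconds ahead: the mean follows the velocity estimate,
# and the positional uncertainty grows with the prediction horizon.
horizon = 3.0
F_h = np.array([[1.0, horizon], [0.0, 1.0]])
x_future = F_h @ x
P_future = F_h @ P @ F_h.T + Q * (horizon / dt)
print(x_future.ravel(), float(np.sqrt(P_future[0, 0])))
```

A multi-hypothesis tracker of the kind mentioned (right turn, straight, left turn) would run one such filter per maneuver hypothesis and maintain a belief over them.
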
11:54.720 --> 11:57.000
Still had to have some kind of assumption.
11:57.000 --> 12:00.640
And so where does that 10 years later, where does that take us
12:00.640 --> 12:04.200
today from that artificial city construction
12:04.200 --> 12:06.920
to real cities to the urban environment?
12:06.920 --> 12:13.600
Yeah, I think the biggest thing is that the actors are truly
12:13.600 --> 12:18.680
unpredictable, that most of the time, the drivers on the road,
12:18.680 --> 12:24.000
the other road users are out there behaving well.
12:24.000 --> 12:27.040
But every once in a while, they're not.
12:27.040 --> 12:33.320
The variety of other vehicles is, you have all of them.
12:33.320 --> 12:35.760
In terms of behavior, or terms of perception, or both?
12:35.760 --> 12:38.320
Both.
12:38.320 --> 12:40.480
Back then, we didn't have to deal with cyclists.
12:40.480 --> 12:42.800
We didn't have to deal with pedestrians.
12:42.800 --> 12:46.240
Didn't have to deal with traffic lights.
12:46.240 --> 12:49.360
The scale over which you have to operate is now
12:49.360 --> 12:52.240
so much larger than the air base that we were thinking about back
12:52.240 --> 12:52.720
then.
12:52.720 --> 12:56.280
So, easy question.
12:56.280 --> 12:59.720
What do you think is the hardest part about driving?
12:59.720 --> 13:00.480
Easy question.
13:00.480 --> 13:01.320
Yeah.
13:01.320 --> 13:02.600
No, I'm joking.
13:02.600 --> 13:07.440
I'm sure nothing really jumps out at you as one thing.
13:07.440 --> 13:12.920
But in the jump from the urban challenge to the real world,
13:12.920 --> 13:16.200
is there something that in particular you foresee
13:16.200 --> 13:18.480
as a very serious, difficult challenge?
13:18.480 --> 13:21.120
I think the most fundamental difference
13:21.120 --> 13:28.960
is that we're doing it for real, that in that environment,
13:28.960 --> 13:31.840
it was both a limited complexity environment,
13:31.840 --> 13:33.240
because certain actors weren't there,
13:33.240 --> 13:35.360
because the roads were maintained.
13:35.360 --> 13:38.720
There were barriers keeping people separate from robots
13:38.720 --> 13:40.880
at the time.
13:40.880 --> 13:44.480
And it only had to work for 60 miles, which looking at it
13:44.480 --> 13:48.960
from 2006, it had to work for 60 miles.
13:48.960 --> 13:52.720
Looking at it from now, we want things
13:52.720 --> 13:57.200
that will go and drive for half a million miles.
13:57.200 --> 14:00.960
And it's just a different game.
14:00.960 --> 14:06.080
So how important, you said lidar came into the game early on,
14:06.080 --> 14:08.880
and it's really the primary driver of autonomous vehicles
14:08.880 --> 14:10.240
today as a sensor.
14:10.240 --> 14:12.880
So how important is the role of lidar in the sensor suite
14:12.880 --> 14:14.760
in the near term?
14:14.760 --> 14:18.680
So I think it's essential.
14:18.680 --> 14:20.520
But I also believe that cameras are essential,
14:20.520 --> 14:22.160
and I believe the radar is essential.
14:22.160 --> 14:27.400
I think that you really need to use the composition of data
14:27.400 --> 14:28.920
from these different sensors if you
14:28.920 --> 14:32.600
want the thing to really be robust.
14:32.600 --> 14:35.440
The question I want to ask, let's see if we can untangle it,
14:35.440 --> 14:40.240
is what are your thoughts on the Elon Musk provocative statement
14:40.240 --> 14:45.840
that lidar is a crutch, that it's a kind of, I guess,
14:45.840 --> 14:49.600
growing pains, and that much of the perception
14:49.600 --> 14:52.160
task can be done with cameras?
14:52.160 --> 14:56.920
So I think it is undeniable that people walk around
14:56.920 --> 14:59.680
without lasers in their foreheads,
14:59.680 --> 15:01.840
and they can get into vehicles and drive them.
15:01.840 --> 15:05.560
And so there's an existence proof
15:05.560 --> 15:10.840
that you can drive using passive vision.
15:10.840 --> 15:12.680
No doubt, can't argue with that.
15:12.680 --> 15:14.320
In terms of sensors, yeah.
15:14.320 --> 15:14.800
So there's proof.
15:14.800 --> 15:15.960
Yes, in terms of sensors, right?
15:15.960 --> 15:18.720
So there's an example that we all
15:18.720 --> 15:23.280
go do it, many of us every day.
15:23.280 --> 15:28.200
In terms of lidar being a crutch, sure.
15:28.200 --> 15:33.080
But in the same way that the combustion engine
15:33.080 --> 15:35.240
was a crutch on the path to an electric vehicle,
15:35.240 --> 15:40.840
in the same way that any technology ultimately gets
15:40.840 --> 15:44.640
replaced by some superior technology in the future.
15:44.640 --> 15:47.720
And really, the way that I look at this
15:47.720 --> 15:51.720
is that the way we get around on the ground, the way
15:51.720 --> 15:55.280
that we use transportation is broken.
15:55.280 --> 15:59.720
And that we have this, I think the number I saw this morning,
15:59.720 --> 16:04.040
37,000 Americans killed last year on our roads.
16:04.040 --> 16:05.360
And that's just not acceptable.
16:05.360 --> 16:09.440
And so any technology that we can bring to bear
16:09.440 --> 16:12.840
that accelerates this technology, self driving technology,
16:12.840 --> 16:15.720
coming to market and saving lives,
16:15.720 --> 16:18.280
is technology we should be using.
16:18.280 --> 16:24.040
And it feels just arbitrary to say, well, I'm not
16:24.040 --> 16:27.800
OK with using lasers, because that's whatever.
16:27.800 --> 16:30.760
But I am OK with using an 8 megapixel camera
16:30.760 --> 16:32.880
or a 16 megapixel camera.
16:32.880 --> 16:34.640
These are just bits of technology,
16:34.640 --> 16:36.880
and we should be taking the best technology from the tool
16:36.880 --> 16:41.600
bin that allows us to go and solve a problem.
16:41.600 --> 16:45.160
The question... I often talk to, well, obviously you do as well,
16:45.160 --> 16:48.320
to automotive companies.
16:48.320 --> 16:51.880
And if there's one word that comes up more often than anything,
16:51.880 --> 16:55.320
it's cost and drive costs down.
16:55.320 --> 17:01.440
So while it's true that it's a tragic number, the 37,000,
17:01.440 --> 17:04.880
the question is, and I'm not the one asking this question,
17:04.880 --> 17:07.160
because I hate this question, but we
17:07.160 --> 17:11.680
want to find the cheapest sensor suite that
17:11.680 --> 17:13.400
creates a safe vehicle.
17:13.400 --> 17:18.240
So in that uncomfortable trade off,
17:18.240 --> 17:23.680
do you foresee lidar coming down in cost in the future?
17:23.680 --> 17:28.000
Or do you see a day where level 4 autonomy is possible
17:28.000 --> 17:29.880
without lidar?
17:29.880 --> 17:32.880
I see both of those, but it's really a matter of time.
17:32.880 --> 17:35.080
And I think, really, maybe I would
17:35.080 --> 17:38.760
talk to the question you asked about the cheapest sensor.
17:38.760 --> 17:40.440
I don't think that's actually what you want.
17:40.440 --> 17:45.720
What you want is a sensor suite that is economically viable.
17:45.720 --> 17:49.480
And then after that, everything is about margin
17:49.480 --> 17:52.320
and driving cost out of the system.
17:52.320 --> 17:55.400
What you also want is a sensor suite that works.
17:55.400 --> 18:01.280
And so it's great to tell a story about how it would be better
18:01.280 --> 18:04.560
to have a self driving system with a $50 sensor instead
18:04.560 --> 18:08.720
of a $500 sensor.
18:08.720 --> 18:11.560
But if the $500 sensor makes it work and the $50 sensor
18:11.560 --> 18:15.680
doesn't work, who cares?
18:15.680 --> 18:21.680
So long as you can actually have an economic opportunity there.
18:21.680 --> 18:23.760
And the economic opportunity is important,
18:23.760 --> 18:27.800
because that's how you actually have a sustainable business.
18:27.800 --> 18:30.440
And that's how you can actually see this come to scale
18:30.440 --> 18:32.520
and be out in the world.
18:32.520 --> 18:36.400
And so when I look at lidar, I see
18:36.400 --> 18:41.200
a technology that has no underlying fundamental expense
18:41.200 --> 18:43.240
to it.
18:43.240 --> 18:46.120
It's going to be more expensive than an imager,
18:46.120 --> 18:51.400
because CMOS processes or fab processes
18:51.400 --> 18:56.200
are dramatically more scalable than mechanical processes.
18:56.200 --> 18:58.160
But we still should be able to drive cost
18:58.160 --> 19:00.440
out substantially on that side.
19:00.440 --> 19:05.880
And then I also do think that with the right business model,
19:05.880 --> 19:08.440
you can absorb more, certainly more cost
19:08.440 --> 19:09.480
on the bill of materials.
19:09.480 --> 19:12.600
Yeah, if the sensor suite works, extra value is provided.
19:12.600 --> 19:15.480
Thereby, you don't need to drive cost down to zero.
19:15.480 --> 19:17.120
It's basic economics.
19:17.120 --> 19:18.840
You've talked about your intuition
19:18.840 --> 19:22.720
that level two autonomy is problematic because
19:22.720 --> 19:27.280
of the human factors of vigilance decrement, complacency,
19:27.280 --> 19:29.600
overtrust, and so on, just us being human.
19:29.600 --> 19:33.000
When we overtrust the system, we start even more
19:33.000 --> 19:36.480
so partaking in secondary activities like smartphone use
19:36.480 --> 19:38.720
and so on.
19:38.720 --> 19:42.960
Have your views evolved on this point in either direction?
19:42.960 --> 19:44.760
Can you speak to it?
19:44.760 --> 19:48.240
So I want to be really careful, because sometimes this
19:48.240 --> 19:53.000
gets twisted in a way that I certainly didn't intend.
19:53.000 --> 19:59.360
So active safety systems are a really important technology
19:59.360 --> 20:03.400
that we should be pursuing and integrating into vehicles.
20:03.400 --> 20:05.680
And there's an opportunity in the near term
20:05.680 --> 20:09.400
to reduce accidents, reduce fatalities,
20:09.400 --> 20:13.400
and we should be pushing on that.
20:13.400 --> 20:17.280
Level two systems are systems where
20:17.280 --> 20:19.480
the vehicle is controlling two axes,
20:19.480 --> 20:24.800
so braking and throttle slash steering.
20:24.800 --> 20:27.200
And I think there are variants of level two systems that
20:27.200 --> 20:30.200
are supporting the driver that absolutely we
20:30.200 --> 20:32.560
should encourage to be out there.
20:32.560 --> 20:37.920
Where I think there's a real challenge is in the human factors
20:37.920 --> 20:40.800
part around this and the misconception
20:40.800 --> 20:44.920
from the public around the capability set that that enables
20:44.920 --> 20:48.000
and the trust that they should have in it.
20:48.000 --> 20:53.880
And that is where I'm actually incrementally more
20:53.880 --> 20:55.800
concerned around level three systems
20:55.800 --> 20:59.960
and how exactly a level two system is marketed and delivered
20:59.960 --> 21:03.240
and how much effort people have put into those human factors.
21:03.240 --> 21:07.000
So I still believe several things around this.
21:07.000 --> 21:10.760
One is people will over trust the technology.
21:10.760 --> 21:12.720
We've seen over the last few weeks
21:12.720 --> 21:16.280
a spate of people sleeping in their Tesla.
21:16.280 --> 21:23.240
I watched an episode last night of Trevor Noah talking
21:23.240 --> 21:27.160
about this, and this is a smart guy
21:27.160 --> 21:31.040
who has a lot of resources at his disposal describing
21:31.040 --> 21:32.880
a Tesla as a self driving car.
21:32.880 --> 21:35.640
And that why shouldn't people be sleeping in their Tesla?
21:35.640 --> 21:38.800
It's like, well, because it's not a self driving car
21:38.800 --> 21:41.120
and it is not intended to be.
21:41.120 --> 21:48.400
And these people will almost certainly die at some point
21:48.400 --> 21:50.400
or hurt other people.
21:50.400 --> 21:52.640
And so we need to really be thoughtful about how
21:52.640 --> 21:56.280
that technology is described and brought to market.
21:56.280 --> 22:00.760
I also think that because of the economic issue,
22:00.760 --> 22:03.320
economic challenges we were just talking about,
22:03.320 --> 22:06.960
that technology path will, these level two driver assistance
22:06.960 --> 22:08.400
systems, that technology path will
22:08.400 --> 22:11.560
diverge from the technology path that we
22:11.560 --> 22:15.800
need to be on to actually deliver truly self driving
22:15.800 --> 22:19.120
vehicles, ones where you can get in it and sleep
22:19.120 --> 22:21.480
and have the equivalent or better safety
22:21.480 --> 22:24.600
than a human driver behind the wheel.
22:24.600 --> 22:28.440
Because, again, the economics are very different
22:28.440 --> 22:29.800
in those two worlds.
22:29.800 --> 22:32.720
And so that leads to divergent technology.
22:32.720 --> 22:36.920
So you just don't see the economics of gradually
22:36.920 --> 22:41.520
increasing from level two and doing so quickly enough
22:41.520 --> 22:44.400
to where it doesn't cause critical safety concerns.
22:44.400 --> 22:48.600
You believe that it needs to diverge at this point
22:48.600 --> 22:50.600
into different, basically different routes.
22:50.600 --> 22:53.760
And really that comes back to what
22:53.760 --> 22:56.840
are those L2 and L1 systems doing?
22:56.840 --> 22:59.800
And they are driver assistance functions
22:59.800 --> 23:04.360
where the people that are marketing that responsibly
23:04.360 --> 23:07.960
are being very clear and putting human factors in place
23:07.960 --> 23:12.400
such that the driver is actually responsible for the vehicle
23:12.400 --> 23:15.200
and that the technology is there to support the driver.
23:15.200 --> 23:19.880
And the safety cases that are built around those
23:19.880 --> 23:24.320
are dependent on that driver attention and attentiveness.
23:24.320 --> 23:30.360
And at that point, you can kind of give up, to some degree,
23:30.360 --> 23:34.280
for economic reasons, you can give up on, say, false negatives.
23:34.280 --> 23:36.200
And so the way to think about this
23:36.200 --> 23:40.760
is for a forward collision mitigation braking system,
23:40.760 --> 23:45.080
if half the time the driver missed a vehicle in front of it,
23:45.080 --> 23:47.640
it hit the brakes and brought the vehicle to a stop,
23:47.640 --> 23:51.200
that would be an incredible, incredible advance
23:51.200 --> 23:52.960
in safety on our roads, right?
23:52.960 --> 23:55.080
That would be equivalent to seatbelts.
23:55.080 --> 23:57.560
But it would mean that if that vehicle wasn't being monitored,
23:57.560 --> 24:00.560
it would hit one out of two cars.
24:00.560 --> 24:05.080
And so economically, that's a perfectly good solution
24:05.080 --> 24:06.200
for a driver assistance system.
24:06.200 --> 24:07.360
What you should do at that point,
24:07.360 --> 24:09.200
if you can get it to work 50% of the time,
24:09.200 --> 24:11.040
is drive the cost out of that so you can get it
24:11.040 --> 24:13.320
on as many vehicles as possible.
24:13.320 --> 24:16.920
But driving the cost out of it doesn't drive up performance
24:16.920 --> 24:18.840
on the false negative case.
24:18.840 --> 24:21.480
And so you'll continue to not have a technology
24:21.480 --> 24:25.720
that could really be available for a self driven vehicle.
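
The trade-off he describes can be put in simple numbers. A hypothetical sketch (rates invented for illustration): a system that brakes for half the events a driver misses is a large safety win as an assistant, and a non-starter as the sole safeguard:

```python
# Simple arithmetic (rates invented for illustration) behind the point
# above: a 50%-effective braking system is a big win as a driver aid and
# a non-starter as the only safeguard.

driver_miss_rate = 0.001   # hypothetical: driver fails to react 1 in 1,000
system_recall = 0.5        # system brakes for half of the missed events

# With an attentive driver plus the system, misses drop by half:
assisted_miss_rate = driver_miss_rate * (1 - system_recall)   # 0.0005

# With nobody monitoring, the system alone fails whenever it misses:
unsupervised_miss_rate = 1 - system_recall                    # 0.5

print(assisted_miss_rate, unsupervised_miss_rate)
```
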
24:25.720 --> 24:28.480
So clearly the communication,
24:28.480 --> 24:31.640
and this probably applies to level four vehicles as well,
24:31.640 --> 24:34.440
the marketing and the communication
24:34.440 --> 24:37.080
of what the technology is actually capable of,
24:37.080 --> 24:38.440
how hard it is, how easy it is,
24:38.440 --> 24:41.040
all that kind of stuff is highly problematic.
24:41.040 --> 24:45.680
So say everybody in the world was perfectly communicated to
24:45.680 --> 24:48.400
and were made to be completely aware
24:48.400 --> 24:50.040
of every single technology out there,
24:50.040 --> 24:52.880
what it's able to do.
24:52.880 --> 24:54.160
What's your intuition?
24:54.160 --> 24:56.920
And now we're maybe getting into philosophical ground.
24:56.920 --> 25:00.040
Is it possible to have a level two vehicle
25:00.040 --> 25:03.280
where we don't overtrust it?
25:04.720 --> 25:05.840
I don't think so.
25:05.840 --> 25:10.840
If people truly understood the risks and internalized it,
25:11.200 --> 25:14.320
then sure you could do that safely,
25:14.320 --> 25:16.200
but that's a world that doesn't exist.
25:16.200 --> 25:17.560
The people are going to,
25:19.440 --> 25:20.800
if the facts are put in front of them,
25:20.800 --> 25:24.480
they're gonna then combine that with their experience.
25:24.480 --> 25:28.400
And let's say they're using an L2 system
25:28.400 --> 25:31.040
and they go up and down the 101 every day
25:31.040 --> 25:32.800
and they do that for a month
25:32.800 --> 25:35.200
and it just worked every day for a month.
25:36.320 --> 25:37.400
Like that's pretty compelling.
25:37.400 --> 25:41.880
At that point, just even if you know the statistics,
25:41.880 --> 25:43.520
you're like, well, I don't know,
25:43.520 --> 25:44.840
maybe there's something a little funny about those.
25:44.840 --> 25:47.000
Maybe they're driving in difficult places.
25:47.000 --> 25:49.960
Like I've seen it with my own eyes, it works.
25:49.960 --> 25:52.480
And the problem is that that sample size that they have,
25:52.480 --> 25:54.000
so it's 30 miles up and down,
25:54.000 --> 25:58.800
so 60 miles times 30 days, so 60, 180, 1,800 miles.
26:01.720 --> 26:05.240
That's a drop in the bucket compared to the,
26:05.240 --> 26:07.640
what, 85 million miles between fatalities.
26:07.640 --> 26:11.400
And so they don't really have a true estimate
26:11.400 --> 26:14.440
based on their personal experience of the real risks,
26:14.440 --> 26:15.640
but they're gonna trust it anyway,
26:15.640 --> 26:17.720
because it's hard not to, it worked for a month.
26:17.720 --> 26:18.640
What's gonna change?
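
The arithmetic here is worth making explicit. A quick sanity check (my arithmetic, built on the numbers quoted above), treating fatal crashes as a Poisson process:

```python
# A quick sanity check (my arithmetic, built on the numbers quoted above)
# of how little a month of uneventful commuting tells you about risk.
import math

personal_miles = 60 * 30        # 60 miles a day for a month: 1,800 miles
miles_per_fatality = 85e6       # rough US average quoted above

# Treating fatal crashes as a Poisson process, the probability of an
# uneventful 1,800 miles is essentially 1 whether or not the system is safe:
p_uneventful = math.exp(-personal_miles / miles_per_fatality)
print(f"P(uneventful month) = {p_uneventful:.6f}")   # ~0.999979

# An outcome you would expect either way carries almost no information,
# yet it is exactly the evidence personal experience supplies.
```
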
26:18.640 --> 26:21.600
So even if you start with a perfect understanding of the system,
26:21.600 --> 26:24.160
your own experience will make it drift.
26:24.160 --> 26:25.920
I mean, that's a big concern.
26:25.920 --> 26:29.480
Over a year, over two years even, it doesn't have to be months.
26:29.480 --> 26:33.720
And I think that as this technology moves from,
26:35.440 --> 26:37.800
what I would say is kind of the more technology savvy
26:37.800 --> 26:41.480
ownership group to the mass market,
26:41.480 --> 26:44.640
you may be able to have some of those folks
26:44.640 --> 26:46.320
who are really familiar with technology,
26:46.320 --> 26:48.880
they may be able to internalize it better.
26:48.880 --> 26:50.840
And your kind of immunization
26:50.840 --> 26:53.400
against this kind of false risk assessment
26:53.400 --> 26:56.960
might last longer, but as folks who aren't as savvy
26:56.960 --> 27:00.200
about that read the material
27:00.200 --> 27:02.200
and they compare that to their personal experience,
27:02.200 --> 27:08.200
I think there that it's gonna move more quickly.
27:08.200 --> 27:11.320
So your work, the program that you've created at Google
27:11.320 --> 27:16.320
and now at Aurora is focused more on the second path
27:16.640 --> 27:18.520
of creating full autonomy.
27:18.520 --> 27:20.920
So it's such a fascinating,
27:21.800 --> 27:24.600
I think it's one of the most interesting AI problems
27:24.600 --> 27:25.640
of the century, right?
27:25.640 --> 27:28.320
It's a, I just talked to a lot of people,
27:28.320 --> 27:30.400
just regular people, I don't know, my mom
27:30.400 --> 27:33.840
about autonomous vehicles and you begin to grapple
27:33.840 --> 27:38.080
with ideas of giving control of your life over to a machine.
27:38.080 --> 27:40.040
It's philosophically interesting,
27:40.040 --> 27:41.760
it's practically interesting.
27:41.760 --> 27:43.720
So let's talk about safety.
27:43.720 --> 27:46.240
How do you think, we demonstrate,
27:46.240 --> 27:47.880
you've spoken about metrics in the past,
27:47.880 --> 27:51.880
how do you think we demonstrate to the world
27:51.880 --> 27:56.160
that an autonomous vehicle, an Aurora system is safe?
27:56.160 --> 27:57.320
This is one where it's difficult
27:57.320 --> 27:59.280
because there isn't a sound bite answer.
27:59.280 --> 28:04.280
That we have to show a combination of work
28:05.960 --> 28:08.360
that was done diligently and thoughtfully.
28:08.360 --> 28:10.840
And this is where something like a functional safety process
28:10.840 --> 28:14.360
as part of that is like, here's the way we did the work.
28:15.320 --> 28:17.200
That means that we were very thorough.
28:17.200 --> 28:20.560
So, if you believe what we said about,
28:20.560 --> 28:21.480
this is the way we did it,
28:21.480 --> 28:23.440
then you can have some confidence that we were thorough
28:23.440 --> 28:27.000
in the engineering work we put into the system.
28:27.000 --> 28:30.160
And then on top of that, to kind of demonstrate
28:30.160 --> 28:32.000
that we weren't just thorough,
28:32.000 --> 28:34.000
we were actually good at what we did.
28:35.320 --> 28:38.240
There'll be a kind of a collection of evidence
28:38.240 --> 28:40.480
in terms of demonstrating that the capabilities
28:40.480 --> 28:43.960
work the way we thought they did, statistically
28:43.960 --> 28:47.200
and to whatever degree we can demonstrate that
28:48.200 --> 28:50.320
both in some combination of simulation,
28:50.320 --> 28:54.720
some combination of unit testing and decomposition testing,
28:54.720 --> 28:57.000
and then some part of it will be on road data.
28:58.200 --> 29:03.200
And I think the way we'll ultimately convey this
29:03.320 --> 29:06.800
to the public is there'll be clearly some conversation
29:06.800 --> 29:08.240
with the public about it,
29:08.240 --> 29:12.080
but we'll kind of invoke the trusted nodes,
29:12.080 --> 29:14.360
and that we'll spend more time being able to go
29:14.360 --> 29:17.280
into more depth with folks like NHTSA
29:17.280 --> 29:19.760
and other federal and state regulatory bodies
29:19.760 --> 29:22.600
and kind of given that they are operating
29:22.600 --> 29:25.120
in the public interest and they're trusted
29:26.240 --> 29:28.680
that if we can show enough work to them
29:28.680 --> 29:30.040
that they're convinced,
29:30.040 --> 29:33.840
then I think we're in a pretty good place.
29:33.840 --> 29:35.040
That means that you work with people
29:35.040 --> 29:36.960
that are essentially experts at safety
29:36.960 --> 29:39.040
to try to discuss and show,
29:39.040 --> 29:41.800
do you think the answer is probably no,
29:41.800 --> 29:44.360
but just in case, do you think there exists a metric?
29:44.360 --> 29:46.360
So currently people have been using
29:46.360 --> 29:48.200
a number of disengagement.
29:48.200 --> 29:50.160
And it quickly turns into a marketing scheme
29:50.160 --> 29:54.320
where you sort of alter the experiments you run.
29:54.320 --> 29:56.320
I think you've spoken about how you don't like it.
29:56.320 --> 29:57.160
Don't love it.
29:57.160 --> 29:59.720
No, in fact, I was on the record telling DMV
29:59.720 --> 30:02.000
that I thought this was not a great metric.
30:02.000 --> 30:05.360
Do you think it's possible to create a metric,
30:05.360 --> 30:09.480
a number that could demonstrate safety
30:09.480 --> 30:12.400
outside of fatalities?
30:12.400 --> 30:16.640
So I do and I think that it won't be just one number.
30:16.640 --> 30:21.320
So as we are internally grappling with this
30:21.320 --> 30:23.600
and at some point we'll be able to talk
30:23.600 --> 30:25.080
more publicly about it,
30:25.080 --> 30:28.560
is how do we think about human performance
30:28.560 --> 30:32.200
in different tasks, say detecting traffic lights
30:32.200 --> 30:36.240
or safely making a left turn across traffic?
30:37.720 --> 30:40.040
And what do we think the failure rates
30:40.040 --> 30:42.520
are for those different capabilities for people?
30:42.520 --> 30:44.760
And then demonstrating to ourselves
30:44.760 --> 30:48.480
and then ultimately folks in regulatory role
30:48.480 --> 30:50.760
and then ultimately the public,
30:50.760 --> 30:52.400
that we have confidence that our system
30:52.400 --> 30:54.800
will work better than that.
30:54.800 --> 30:57.040
And so these individual metrics
30:57.040 --> 31:00.720
will kind of tell a compelling story ultimately.
31:01.760 --> 31:03.920
I do think at the end of the day,
31:03.920 --> 31:06.640
what we care about in terms of safety
31:06.640 --> 31:11.640
is lives saved and injuries reduced.
31:11.640 --> 31:15.320
And then ultimately kind of casualty dollars
31:16.440 --> 31:19.360
that people aren't having to pay to get their car fixed.
31:19.360 --> 31:22.680
And I do think that in aviation,
31:22.680 --> 31:25.880
they look at a kind of an event pyramid
31:25.880 --> 31:28.600
where a crash is at the top of that
31:28.600 --> 31:30.440
and that's the worst event obviously.
31:30.440 --> 31:34.240
And then there's injuries and near miss events and whatnot
31:34.240 --> 31:37.320
and violation of operating procedures.
31:37.320 --> 31:40.160
And you kind of build a statistical model
31:40.160 --> 31:44.440
of the relevance of the low severity things
31:44.440 --> 31:45.280
and the high severity things.
31:45.280 --> 31:46.120
And I think that's something
31:46.120 --> 31:48.240
where we'll be able to look at as well
31:48.240 --> 31:51.920
because an event per 85 million miles
31:51.920 --> 31:54.480
is statistically a difficult thing
31:54.480 --> 31:59.440
even at the scale of the US to kind of compare directly.
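
A toy sketch of that pyramid idea (all counts and tier ratios invented for illustration): if historical data gave you a ratio between each low-severity tier and fatal events, each tier would yield its own higher-statistics estimate of the fatality rate:

```python
# A toy illustration of the event-pyramid idea (all counts and tier ratios
# invented): frequent low-severity events give higher-statistics estimates
# of the rare severe-event rate you cannot observe directly.

observed_miles = 1_000_000
tier_counts = {                      # hypothetical on-road observations
    "procedure_violation": 200,
    "near_miss": 20,
    "injury_crash": 1,
}
# Hypothetical historical ratios: lower-tier events per fatal crash.
events_per_fatal = {
    "procedure_violation": 100_000,
    "near_miss": 10_000,
    "injury_crash": 500,
}

for tier, count in tier_counts.items():
    rate_per_mile = count / observed_miles
    implied_fatal_rate = rate_per_mile / events_per_fatal[tier]
    print(f"{tier}: ~1 fatal event per {1 / implied_fatal_rate:,.0f} miles")

# Each tier independently implies a fatality-level rate, using far more
# data than waiting tens of millions of miles between actual fatalities.
```
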
31:59.440 --> 32:02.280
And that event, the fatality that's connected
32:02.280 --> 32:07.280
to an autonomous vehicle is significantly,
32:07.480 --> 32:09.160
at least currently magnified
32:09.160 --> 32:12.320
in the amount of attention it gets.
32:12.320 --> 32:15.080
So that speaks to public perception.
32:15.080 --> 32:16.720
I think the most popular topic
32:16.720 --> 32:19.520
about autonomous vehicles in the public
32:19.520 --> 32:23.080
is the trolley problem formulation, right?
32:23.080 --> 32:27.040
Which has, let's not get into that too much
32:27.040 --> 32:29.600
but is misguided in many ways.
32:29.600 --> 32:32.320
But it speaks to the fact that people are grappling
32:32.320 --> 32:36.160
with this idea of giving control over to a machine.
32:36.160 --> 32:41.160
So how do you win the hearts and minds of the people
32:41.560 --> 32:43.600
that autonomy is something
32:43.600 --> 32:45.480
that could be a part of their lives?
32:45.480 --> 32:47.640
I think you let them experience it, right?
32:47.640 --> 32:50.440
I think it's right.
32:50.440 --> 32:52.720
I think people should be skeptical.
32:52.720 --> 32:55.680
I think people should ask questions.
32:55.680 --> 32:57.000
I think they should doubt
32:58.040 --> 33:00.960
because this is something new and different.
33:00.960 --> 33:01.960
They haven't touched it yet.
33:01.960 --> 33:03.680
And I think it's perfectly reasonable.
33:03.680 --> 33:07.360
But at the same time,
33:07.360 --> 33:09.360
it's clear there's an opportunity to make the road safer.
33:09.360 --> 33:12.480
It's clear that we can improve access to mobility.
33:12.480 --> 33:15.160
It's clear that we can reduce the cost of mobility.
33:16.680 --> 33:19.520
And that once people try that
33:19.520 --> 33:22.800
and understand that it's safe
33:22.800 --> 33:24.480
and are able to use it in their daily lives,
33:24.480 --> 33:28.080
I think it's one of these things that will just be obvious.
33:28.080 --> 33:32.280
And I've seen this practically in demonstrations
33:32.280 --> 33:35.640
that I've given where I've had people come in
33:35.640 --> 33:38.600
and they're very skeptical.
33:38.600 --> 33:39.960
And they get in the vehicle.
33:39.960 --> 33:42.640
My favorite one is taking somebody out on the freeway
33:42.640 --> 33:46.080
and we're on the 101 driving at 65 miles an hour.
33:46.080 --> 33:48.560
And after 10 minutes, they kind of turn and ask,
33:48.560 --> 33:49.560
is that all it does?
33:49.560 --> 33:52.160
And you're like, it's a self driving car.
33:52.160 --> 33:54.920
I'm not sure exactly what you thought it would do, right?
33:54.920 --> 33:57.960
But it becomes mundane,
33:58.920 --> 34:01.560
which is exactly what you want a technology
34:01.560 --> 34:02.760
like this to be, right?
34:02.760 --> 34:04.680
We don't really...
34:04.680 --> 34:07.320
When I turn the light switch on in here,
34:07.320 --> 34:12.040
I don't think about the complexity of those electrons
34:12.040 --> 34:14.240
being pushed down a wire from wherever it was
34:14.240 --> 34:15.880
and being generated.
34:15.880 --> 34:19.120
It's like, I just get annoyed if it doesn't work, right?
34:19.120 --> 34:21.440
And what I value is the fact
34:21.440 --> 34:23.120
that I can do other things in this space.
34:23.120 --> 34:24.600
I can see my colleagues.
34:24.600 --> 34:26.200
I can read stuff on a paper.
34:26.200 --> 34:29.240
I can not be afraid of the dark.
34:29.240 --> 34:32.840
And I think that's what we want this technology to be like
34:32.840 --> 34:34.160
is it's in the background
34:34.160 --> 34:36.520
and people get to have those life experiences
34:36.520 --> 34:37.880
and do so safely.
34:37.880 --> 34:41.600
So putting this technology in the hands of people
34:41.600 --> 34:45.800
speaks to scale of deployment, right?
34:45.800 --> 34:50.360
So now for the dreaded question about the future,
34:50.360 --> 34:52.840
because nobody can predict the future.
34:52.840 --> 34:57.080
But just maybe speak poetically about
34:57.080 --> 35:00.600
when do you think we'll see a large scale deployment
35:00.600 --> 35:05.600
of autonomous vehicles, 10,000, those kinds of numbers.
35:06.360 --> 35:08.280
We'll see that within 10 years.
35:09.280 --> 35:10.600
I'm pretty confident.
35:10.600 --> 35:11.920
We...
35:13.920 --> 35:15.920
What's an impressive scale?
35:15.920 --> 35:19.000
What moment, so you've done the DARPA Challenge
35:19.000 --> 35:20.240
where there's one vehicle,
35:20.240 --> 35:22.000
at which moment does it become,
35:22.000 --> 35:23.720
wow, this is serious scale?
35:23.720 --> 35:27.960
So I think the moment it gets serious is when
35:27.960 --> 35:32.040
we really do have a driverless vehicle
35:32.040 --> 35:33.880
operating on public roads
35:34.760 --> 35:37.760
and that we can do that kind of continuously.
35:37.760 --> 35:38.640
Without a safety driver?
35:38.640 --> 35:40.240
Without a safety driver in the vehicle.
35:40.240 --> 35:41.320
I think at that moment,
35:41.320 --> 35:44.160
we've kind of crossed the zero to one threshold.
35:45.720 --> 35:50.000
And then it is about how do we continue to scale that?
35:50.000 --> 35:53.720
How do we build the right business models?
35:53.720 --> 35:56.040
How do we build the right customer experience around it
35:56.040 --> 35:59.680
so that it is actually a useful product out in the world?
36:00.720 --> 36:03.360
And I think that is really,
36:03.360 --> 36:05.720
at that point, it moves from
36:05.720 --> 36:08.960
this kind of mixed science and engineering project
36:08.960 --> 36:12.120
into engineering and commercialization
36:12.120 --> 36:15.600
and really starting to deliver on the value
36:15.600 --> 36:18.000
that we all see here.
36:18.000 --> 36:20.680
And actually making that real in the world.
36:20.680 --> 36:22.240
What do you think that deployment looks like?
36:22.240 --> 36:24.920
Where do we first see the inkling of
36:24.920 --> 36:28.600
no safety driver, one or two cars here and there?
36:28.600 --> 36:29.760
Is it on the highway?
36:29.760 --> 36:33.200
Is it in specific routes in the urban environment?
36:33.200 --> 36:36.960
I think it's gonna be urban, suburban type environments.
36:37.920 --> 36:38.920
You know, with Aurora,
36:38.920 --> 36:41.560
when we thought about how to tackle this,
36:42.400 --> 36:45.040
it was kind of in vogue to think about trucking
36:46.000 --> 36:47.760
as opposed to urban driving.
36:47.760 --> 36:51.240
And again, the human intuition around this
36:51.240 --> 36:55.360
is that freeways are easier to drive on
36:57.040 --> 36:59.240
because everybody's kind of going in the same direction
36:59.240 --> 37:01.560
and lanes are a little wider, et cetera.
37:01.560 --> 37:03.280
And I think that that intuition is pretty good,
37:03.280 --> 37:06.000
except we don't really care about most of the time.
37:06.000 --> 37:08.360
We care about all of the time.
37:08.360 --> 37:10.840
And when you're driving on a freeway with a truck,
37:10.840 --> 37:15.840
say 70 miles an hour and you've got a 70,000 pound load
37:15.840 --> 37:17.840
to tow with you, that's just an incredible amount
37:17.840 --> 37:18.840
of kinetic energy.
37:18.840 --> 37:21.440
And so when that goes wrong, it goes really wrong.
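[Editor's note: to put rough numbers on that point, here is a back-of-the-envelope comparison of the kinetic energies involved. The car's mass and the urban speed below are illustrative assumptions, not figures from the conversation; only the 70 mph, 70,000 pound truck comes from the discussion above.]

```python
# Back-of-the-envelope kinetic energy comparison: KE = 1/2 * m * v^2.
# The car's mass and the 25 mph urban speed are assumptions.

LB_TO_KG = 0.453592   # pounds -> kilograms
MPH_TO_MS = 0.44704   # miles/hour -> meters/second

def kinetic_energy_j(mass_lb: float, speed_mph: float) -> float:
    """Kinetic energy in joules for a mass in pounds at a speed in mph."""
    m = mass_lb * LB_TO_KG
    v = speed_mph * MPH_TO_MS
    return 0.5 * m * v ** 2

truck_ke = kinetic_energy_j(70_000, 70)  # loaded truck on the freeway
car_ke = kinetic_energy_j(3_300, 25)     # typical car at urban speed

print(f"Truck at 70 mph: {truck_ke / 1e6:.1f} MJ")  # ~15.5 MJ
print(f"Car at 25 mph:   {car_ke / 1e3:.1f} kJ")    # ~93 kJ
print(f"Ratio: ~{truck_ke / car_ke:.0f}x")          # ~166x
```

[Roughly two orders of magnitude more energy to dissipate is what makes the rare freeway failure so much less forgiving than the frequent low-speed urban one.]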
37:22.600 --> 37:27.600
And those challenges that you see occur more rarely,
37:27.760 --> 37:31.040
so you don't get to learn as quickly.
37:31.040 --> 37:33.640
And those challenges are incrementally more difficult
37:33.640 --> 37:35.920
than in urban driving, so freeway driving is not easier
37:35.920 --> 37:37.400
than urban driving.
37:37.400 --> 37:41.600
And so I think this happens in moderate speed,
37:41.600 --> 37:43.840
urban environments, because there,
37:43.840 --> 37:46.560
if two vehicles crash at 25 miles per hour,
37:46.560 --> 37:50.040
it's not good, but probably everybody walks away.
37:51.000 --> 37:53.680
And those events where there's the possibility
37:53.680 --> 37:55.720
for that occurring happen frequently.
37:55.720 --> 37:57.920
So we get to learn more rapidly.
37:57.920 --> 38:01.320
We get to do that with lower risk for everyone.
38:02.440 --> 38:04.280
And then we can deliver value to people
38:04.280 --> 38:05.800
that need to get from one place to another.
38:05.800 --> 38:08.160
And then once we've got that solved,
38:08.160 --> 38:10.000
then the freeway driving part of this
38:10.000 --> 38:12.440
just falls out, but we're able to learn
38:12.440 --> 38:15.160
more safely, more quickly in the urban environment.
38:15.160 --> 38:18.480
So 10 years, and then scale in 20, 30 years.
38:18.480 --> 38:21.440
I mean, who knows. If a sufficiently compelling
38:21.440 --> 38:24.320
experience is created, it could be faster or slower.
38:24.320 --> 38:27.120
Do you think there could be breakthroughs
38:27.120 --> 38:29.880
and what kind of breakthroughs might there be
38:29.880 --> 38:32.360
that completely change that timeline?
38:32.360 --> 38:35.320
Again, not only am I asking to predict the future,
38:35.320 --> 38:37.280
I'm asking you to predict breakthroughs
38:37.280 --> 38:38.280
that haven't happened yet.
38:38.280 --> 38:41.800
So what's the, I think another way to ask that would be
38:41.800 --> 38:44.240
if I could wave a magic wand,
38:44.240 --> 38:46.640
what part of the system would I make work today
38:46.640 --> 38:48.600
to accelerate it as quickly as possible?
38:48.600 --> 38:49.440
Right.
38:52.080 --> 38:54.080
Don't say infrastructure, please don't say infrastructure.
38:54.080 --> 38:56.280
No, it's definitely not infrastructure.
38:56.280 --> 39:00.520
It's really that perception forecasting capability.
39:00.520 --> 39:04.760
So if tomorrow you could give me a perfect model
39:04.760 --> 39:07.520
of what's happening and what will happen
39:07.520 --> 39:11.400
for the next five seconds around a vehicle
39:11.400 --> 39:14.480
on the roadway, that would accelerate things
39:14.480 --> 39:15.320
pretty dramatically.
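[Editor's note: a minimal sketch of what such a perception-forecasting interface might look like, with a constant-velocity baseline standing in for the "perfect model" described above. All names and structures here are the editor's illustration, not Aurora's system.]

```python
# Sketch of a perception-forecasting interface: given tracked agents
# around the vehicle, predict their positions over the next 5 seconds.
# Constant-velocity extrapolation stands in for a real forecaster,
# which would be multi-modal and interaction-aware.
from dataclasses import dataclass

@dataclass
class AgentState:
    agent_id: int
    x: float   # meters, in the vehicle's frame
    y: float
    vx: float  # meters/second
    vy: float

def forecast(agents: list[AgentState], horizon_s: float = 5.0,
             dt: float = 0.5) -> dict[int, list[tuple[float, float]]]:
    """Return predicted (x, y) waypoints per agent, one every dt seconds."""
    steps = int(horizon_s / dt)
    return {
        a.agent_id: [(a.x + a.vx * k * dt, a.y + a.vy * k * dt)
                     for k in range(1, steps + 1)]
        for a in agents
    }

# Example: a car 20 m ahead, closing at 2 m/s.
print(forecast([AgentState(agent_id=1, x=20.0, y=0.0, vx=-2.0, vy=0.0)]))
```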
39:15.320 --> 39:17.560
In terms of what keeps you up at night,
39:17.560 --> 39:21.680
are you mostly bothered by cars, pedestrians, or cyclists?
39:21.680 --> 39:25.920
So I worry most about the vulnerable road users,
39:25.920 --> 39:28.000
the combination of cyclists and cars, right?
39:28.000 --> 39:29.480
And cyclists and pedestrians generally,
39:29.480 --> 39:31.880
because they're not in armor.
39:33.240 --> 39:36.480
The cars, they're bigger, they've got protection
39:36.480 --> 39:39.440
for the people and so the ultimate risk is lower there.
39:39.440 --> 39:44.080
Whereas a pedestrian or cyclist, they're out on the road
39:44.080 --> 39:46.520
and they don't have any protection.
39:46.520 --> 39:49.760
And so we need to pay extra attention to that.
39:49.760 --> 39:54.120
Do you think about the very difficult technical challenge
39:55.760 --> 39:58.560
of the fact that pedestrians,
39:58.560 --> 40:01.400
if you try to protect pedestrians by being careful
40:01.400 --> 40:04.600
and slow, they'll take advantage of that.
40:04.600 --> 40:07.560
So the game theoretic dance.
40:07.560 --> 40:10.880
Does that worry you from a technical perspective
40:10.880 --> 40:12.520
how we solve that?
40:12.520 --> 40:14.600
Because as humans, the way we solve that
40:14.600 --> 40:17.280
is to kind of nudge our way through the pedestrians,
40:17.280 --> 40:20.040
which doesn't feel, from a technical perspective,
40:20.040 --> 40:22.320
like an appropriate algorithm.
40:23.240 --> 40:25.960
But do you think about how we solve that problem?
40:25.960 --> 40:30.960
Yeah, I think there's two different concepts there.
40:31.400 --> 40:35.880
So one is, am I worried that because these vehicles
40:35.880 --> 40:37.640
are self driving, people will kind of step out on the road
40:37.640 --> 40:38.680
and take advantage of them.
40:38.680 --> 40:43.680
And I've heard this and I don't really believe it
40:43.800 --> 40:46.000
because if I'm driving down the road
40:46.000 --> 40:48.920
and somebody steps in front of me, I'm going to stop.
40:48.920 --> 40:49.760
Right?
40:49.760 --> 40:53.720
Like even if I'm annoyed, I'm not gonna just drive
40:53.720 --> 40:55.200
through a person standing in the road.
40:55.200 --> 40:56.440
Right.
40:56.440 --> 41:00.440
And so I think today people can take advantage of this
41:00.440 --> 41:02.600
and you do see some people do it.
41:02.600 --> 41:04.200
I guess there's an incremental risk
41:04.200 --> 41:05.920
because maybe they have lower confidence
41:05.920 --> 41:06.760
that I'm going to see them
41:06.760 --> 41:09.360
than they might have for an automated vehicle.
41:09.360 --> 41:12.080
And so maybe that shifts it a little bit.
41:12.080 --> 41:14.400
But I think people don't want to get hit by cars.
41:14.400 --> 41:17.120
And so I think that I'm not that worried
41:17.120 --> 41:18.800
about people walking out on the 101
41:18.800 --> 41:21.840
and creating chaos more than they would today.
41:24.400 --> 41:27.040
Regarding kind of the nudging through a big stream
41:27.040 --> 41:30.040
of pedestrians leaving a concert or something.
41:30.040 --> 41:33.480
I think that is further down the technology pipeline.
41:33.480 --> 41:36.920
I think that you're right, that's tricky.
41:36.920 --> 41:38.560
I don't think it's necessarily,
41:40.320 --> 41:43.360
I think the algorithm people use for this is pretty simple.
41:43.360 --> 41:44.200
Right?
41:44.200 --> 41:45.040
It's kind of just move forward slowly
41:45.040 --> 41:47.600
and if somebody's really close, stop.
41:47.600 --> 41:50.840
And I think that that probably can be replicated
41:50.840 --> 41:54.040
pretty easily and particularly given that it's,
41:54.040 --> 41:57.240
you don't do this at 30 miles an hour, you do it at one,
41:57.240 --> 41:59.080
that even in those situations,
41:59.080 --> 42:01.200
the risk is relatively minimal.
42:01.200 --> 42:03.440
But it's not something we're thinking
42:03.440 --> 42:04.560
about in any serious way.
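[Editor's note: the heuristic described above, creep forward slowly and stop if somebody is really close, is simple enough to write down. A minimal sketch; the speed and distance thresholds are the editor's assumptions.]

```python
# Minimal sketch of the crowd-nudging heuristic: creep forward at
# roughly walking pace, and stop whenever any pedestrian is too close.
# Both thresholds are illustrative assumptions.

CREEP_SPEED_MS = 0.5    # target creep speed, ~1 mph
STOP_DISTANCE_M = 1.5   # stop if anyone is within this range

def creep_command(pedestrian_distances_m: list[float]) -> float:
    """Return a target speed: creep forward, or 0.0 if someone is close."""
    if any(d < STOP_DISTANCE_M for d in pedestrian_distances_m):
        return 0.0
    return CREEP_SPEED_MS

print(creep_command([3.2, 4.1]))  # 0.5 -> path is clear, keep creeping
print(creep_command([3.2, 0.9]))  # 0.0 -> somebody is close, stop
```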
42:04.560 --> 42:08.000
And probably that's less an algorithm problem
42:08.000 --> 42:10.160
and more one of creating a human experience.
42:10.160 --> 42:14.320
So the HCI people would create a visual display
42:14.320 --> 42:16.280
such that as a pedestrian, you're pleasantly
42:16.280 --> 42:17.680
nudged out of the way.
42:17.680 --> 42:21.960
That's an experience problem, not an algorithm problem.
42:22.880 --> 42:25.480
Who's the main competitor to Aurora today?
42:25.480 --> 42:28.600
And how do you out compete them in the long run?
42:28.600 --> 42:31.200
So we really focus a lot on what we're doing here.
42:31.200 --> 42:34.440
I think that, I've said this a few times
42:34.440 --> 42:37.960
that this is a huge difficult problem
42:37.960 --> 42:40.280
and it's great that a bunch of companies are tackling it
42:40.280 --> 42:42.320
because I think it's so important for society
42:42.320 --> 42:43.760
that somebody gets there.
42:45.200 --> 42:49.040
So we don't spend a whole lot of time
42:49.040 --> 42:51.560
like thinking tactically about who's out there
42:51.560 --> 42:55.480
and how do we beat that person individually?
42:55.480 --> 42:58.680
What are we trying to do to go faster ultimately?
42:58.680 --> 43:02.600
Well, part of it is the leadership team we have
43:02.600 --> 43:04.160
has got pretty tremendous experience.
43:04.160 --> 43:06.400
And so we kind of understand the landscape
43:06.400 --> 43:09.120
and understand where the cul de sacs are to some degree.
43:09.120 --> 43:10.920
And we try and avoid those.
43:12.600 --> 43:14.240
I think there's a part of it
43:14.240 --> 43:16.240
that's just this great team we've built.
43:16.240 --> 43:19.040
This is a technology and a company
43:19.040 --> 43:22.280
that people believe in the mission of.
43:22.280 --> 43:24.760
And so it allows us to attract just awesome people
43:24.760 --> 43:25.680
to go work on it.
43:26.760 --> 43:28.000
We've got a culture, I think,
43:28.000 --> 43:30.440
that people appreciate, that allows them to focus,
43:30.440 --> 43:33.080
allows them to really spend time solving problems.
43:33.080 --> 43:35.880
And I think that keeps them energized.
43:35.880 --> 43:40.880
And then we've invested heavily in the infrastructure
43:43.520 --> 43:46.520
and architectures that we think will ultimately accelerate us.
43:46.520 --> 43:50.640
So because of the folks we're able to bring in early on,
43:50.640 --> 43:53.520
because of the great investors we have,
43:53.520 --> 43:56.760
we don't spend all of our time doing demos
43:56.760 --> 43:58.680
and kind of leaping from one demo to the next.
43:58.680 --> 44:02.800
We've been given the freedom to invest in
44:03.960 --> 44:05.480
infrastructure to do machine learning,
44:05.480 --> 44:08.600
infrastructure to pull data from our on road testing,
44:08.600 --> 44:11.480
infrastructure to use that to accelerate engineering.
44:11.480 --> 44:14.480
And I think that early investment
44:14.480 --> 44:17.320
and continuing investment in those kind of tools
44:17.320 --> 44:19.400
will ultimately allow us to accelerate
44:19.400 --> 44:21.920
and do something pretty incredible.
44:21.920 --> 44:23.400
Chris, beautifully put.
44:23.400 --> 44:24.640
It's a good place to end.
44:24.640 --> 44:26.520
Thank you so much for talking today.
44:26.520 --> 44:27.360
Thank you very much.
44:27.360 --> 44:57.200
I hope you enjoyed it.