For over five years, San Francisco's hilly streets were mapped – by funky-looking cars adorned with spinning sensors – into a high-precision virtual world. Since then, the streets have hosted everyday drivers like you and me alongside car-like robots with no one in the front seat. Residents of the City, and the Bay Area at large, no longer show any surprise when they turn their heads at a traffic light and find an empty driver's seat in the adjacent car.
In a contest to name something both futuristic and of-the-moment, "Autonomous Driving" and "Robo-Taxi" could easily win the prize. Tech giants and major car brands around the world have been sending their autonomous vehicles out on the road for years now to test their readiness for the new reality. But after years of sci-fi wishful thinking, exactly what convergence makes this moment possible?
To align the industry on understanding and language, SAE International released its Levels of Driving Automation standard, J3016 [i], defining Level 0 through Level 5 of driving automation. In California, the Autonomous Vehicles branch of the DMV has also established regulations governing the testing and deployment of self-driving vehicles.
From Level 0 to Level 2, a human is expected to drive the car with constant supervision to maintain safety, even though the driver's feet may be off the pedals and the driver may not be steering. Features at these levels are categorized as Driver Support Features.
Level 3, a stopgap before the most advanced autonomous driving technology, is when the vehicle can drive itself under limited conditions but must be able to signal when a human should take over driving.
From Level 4 to Level 5, a human is not driving, even when seated in the driver's seat. Features at these levels are categorized as Automated Driving Features. The granular differences among the levels are carefully enumerated in the standard and the accompanying government regulations.
To offer some examples (a short code sketch after this list summarizes the full mapping):
- Automatic emergency braking, blind spot warning, and lane departure warning are standard features in many affordable car models; they are categorized under Level 0.
- Lane centering combined with adaptive cruise control at the same time falls under Level 2.
- At Level 3, the vehicle would be equipped with a traffic jam chauffeur (an automated lane-keeping system); Mercedes-Benz was the first to get U.S. approval for its Level 3 automated driving system [ii].
- The "Autonomous Driving" cars mentioned above on San Francisco streets belong to either Level 4 or Level 5. Waymo, the self-driving technology company established under Alphabet, operates at Level 4 as defined by SAE [iii].
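To keep the taxonomy easy to scan, here is a minimal Python sketch of the level-to-category mapping described above. The one-line role descriptions are paraphrased from the J3016 chart [i] and are illustrative only, not an authoritative restatement of the standard:

```python
# A minimal lookup of SAE J3016 levels: feature category and the human's role.
# Role descriptions are paraphrased from the J3016 chart [i]; illustrative only.

SAE_LEVELS = {
    0: ("Driver Support", "human drives; features only warn or briefly intervene"),
    1: ("Driver Support", "human drives; steering OR speed assistance"),
    2: ("Driver Support", "human drives and supervises; steering AND speed assistance"),
    3: ("Automated Driving", "vehicle drives in limited conditions; human takes over on request"),
    4: ("Automated Driving", "vehicle drives in limited conditions; no takeover required"),
    5: ("Automated Driving", "vehicle drives everywhere, under all conditions"),
}

def describe(level: int) -> str:
    """Return a one-line summary for a given SAE automation level."""
    category, role = SAE_LEVELS[level]
    return f"Level {level} ({category} Features): {role}"

for level in SAE_LEVELS:
    print(describe(level))
```

Reading the table top to bottom makes the key break point obvious: responsibility for the driving task shifts from the human (Levels 0–2) to the vehicle (Levels 3–5).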
In short, Autonomous Driving can be thought of as a convergence of technology and standards to meet human expectations about how a driverless vehicle should perform.
It's a weekday afternoon, right as rush hour is about to hit. Starting from the upper east side of San Francisco, we are heading toward busy Montgomery Street and the Financial District. Earlier that day, we had decided that trying out a Robo-Taxi experience in San Francisco would, without a doubt, be worth the hour-long trip up to the city from San Jose.
Hailing and check-in
The Waymo One app is intuitive and carefully designed, and it works just like the other ride-hailing services we had used. Around 10 minutes after we made a request, a white Jaguar arrived. Just as a traditional ride-share driver would confirm the pickup by checking my name, Waymo asked for confirmation through its app. After I pressed "Unlock" on my phone, the door handle popped out so I could enter.
As we climbed into the back seat, a soft voice welcomed us and reminded us to fasten our seatbelts. The infotainment screen offered options to play music, view the route, and display objects detected around the vehicle; up front, of course, was an empty driver's seat.
The surprisingly stylish white Jaguar is clearly well designed and loaded with cutting-edge software and hardware. Cameras, LiDAR (light detection and ranging), and other sensors are visible on the front, sides, back, and top of the vehicle. As a first-time rider, actually seeing those sensors helped put me at ease.
En route
With a group of six, we hailed two Waymo rides. My ride was so smooth that the apprehension I felt after hopping in quickly wore off. The system did everything a perfect driver is expected to do. Traffic got busy, and the narrow streets of San Francisco became congested quickly. My attention stayed mainly on the steering wheel, enjoying how smooth the maneuvers were while turning, changing lanes, and yielding to pedestrians.
However, the story from the other group of riders is much more fun to hear. "The car pulled over on the side of the street for no reason, and we were concerned and even panicked a little," said Jim Elliot, EVP of Samsung Memory Sales. Having the car stop for no apparent reason, with no driver to ask why, leaves the riders truly clueless. Traffic builds up during these hours, and so does the stress. "And… a fire truck came from behind in the distance. Then it all made sense: Waymo had detected the situation before any human rider in the car did."
In just 30 minutes, both rides arrived and dropped us off safe and sound. After we stepped out, both cars quietly drove away to pick up their next customers.
Hardware
LiDAR, cameras, and radar are commonly available equipment; many conventional car models on the road today are already equipped with them. What is different for autonomous vehicles is the quantity and quality, as redundancy is demanded for mission-critical tasks. According to S&P Global Mobility research, SAE Level 4 passenger vehicles are equipped with, on average, a total of 25 sensors: 8 radar units, 12 cameras, and 4 LiDAR units [iv].
Real-time computing, training, and inferencing
While autonomous vehicles are transporting riders on the road, their computing system – hidden in the trunk – is making real-time decisions just as we human drivers are. "For instance, when Waymo started testing the Jaguar I-Pace in late 2019, the crossover SUV came with more powerful sensors that generated a bigger stream of information—to the point that full logs for an hour's driving equated to more than 1,100 gigabytes, enough to fill 240 DVDs." [v] Given how much data each ride generates, these vehicles must now be considered data centers on wheels.
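The quoted figures check out with simple arithmetic. Here is a quick back-of-envelope sketch (the 4.7 GB single-layer DVD capacity is our assumption) that also converts the hourly log volume into a sustained data rate we will reuse below:

```python
# Back-of-envelope check of the logging figures quoted above from WIRED [v].
# The 4.7 GB single-layer DVD capacity is an assumption; the hourly log
# volume comes straight from the quote.

log_gb_per_hour = 1_100                   # full logs for one hour of driving
dvd_capacity_gb = 4.7                     # single-layer DVD (assumed)

dvds = log_gb_per_hour / dvd_capacity_gb  # ~234, i.e. roughly "240 DVDs"
gbytes_per_s = log_gb_per_hour / 3600     # ~0.31 GB/s sustained
gbits_per_s = gbytes_per_s * 8            # ~2.4 Gbps sustained

print(f"{dvds:.0f} DVDs; {gbytes_per_s:.2f} GB/s; {gbits_per_s:.1f} Gbps sustained")
```

In other words, merely logging an hour of driving demands a continuous write rate of roughly 2.4 Gbps, before any real-time perception or planning workload is counted.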
Data is processed on the car during each ride and also uploaded to the data center for further training and inference. The industry predicts that Level 4+ autonomous vehicles will require over 1,000 TeraOPS of processing. In recent years, advanced AI models have also enabled more complex training in the cloud and within data centers. The demand increase in data centers shows up in recent earnings releases from semiconductor corporations [vi], many reporting around 80% year-over-year revenue growth. Server OEMs such as Dell Technologies have likewise reported nearly 40% year-over-year order growth [vii] amid surging demand for AI training and inference.
Backbone for all that power – Memory
"By 2030, we expect DRAM demand from gen AI applications to be five to 13 million wafers… translating to four to 12 dedicated fabs… [and] the total NAND demand to be two to eight million wafers, corresponding to one to five fabs," according to a recent McKinsey & Company article [viii] on the resources the semiconductor industry will have to dedicate to generative AI applications.
Vishal Devadiya, Sr. Manager of Automotive BE at Samsung Semiconductor, shared his thoughts on this topic: “Automotive HW systems for the vehicles manufactured today are equipped with one of the following storage options, depending on the ADAS level support: PCIe Gen3 SSD with theoretical maximum bandwidth of 4GB/s per x4 lane configuration, UFS2.x with theoretical maximum bandwidth of 11.6 Gbps per x2 lane configuration, or UFS3.1 with theoretical maximum bandwidth of 23.2 Gbps per x2 lane configuration speed. However, the future generation of autonomous vehicles with ADAS level 3 and beyond will require higher performance storage, such as PCIe Gen4 SSD with theoretical maximum bandwidth of 8GB/s per x4 lane configuration or UFS4.1 with theoretical maximum bandwidth of 46.4 Gbps speed per device.”
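To put those numbers side by side, here is an illustrative sketch comparing each quoted interface's theoretical maximum against the roughly 2.4 Gbps sustained logging rate estimated earlier. These are theoretical maximums only; real-world sustained writes, burst peaks, and redundancy requirements differ, so treat this as a sketch rather than a sizing guide:

```python
# Side-by-side view of the interface bandwidths quoted above, measured against
# the ~2.4 Gbps sustained logging rate estimated earlier. Theoretical maximums
# only; real sustained writes, burst peaks, and redundancy needs will differ.

SUSTAINED_LOG_GBPS = 2.4  # from the 1,100 GB/hour estimate above

interfaces_gbps = {
    "PCIe Gen3 SSD (x4)": 4 * 8,   # 4 GB/s -> 32 Gbps
    "UFS 2.x (x2 lanes)": 11.6,
    "UFS 3.1 (x2 lanes)": 23.2,
    "PCIe Gen4 SSD (x4)": 8 * 8,   # 8 GB/s -> 64 Gbps
    "UFS 4.1":            46.4,
}

for name, bandwidth in interfaces_gbps.items():
    headroom = bandwidth / SUSTAINED_LOG_GBPS
    print(f"{name:20s} {bandwidth:5.1f} Gbps  ({headroom:4.1f}x the sustained log rate)")
```

Every interface clears the sustained average with room to spare; the case for PCIe Gen4 and UFS 4.x is headroom for burst peaks, parallel sensor streams, and the redundancy that mission-critical Level 3+ systems demand.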
As the top provider of both DRAM and NAND memory products, Samsung Semiconductor is supporting, enabling, and leading the technology breakthroughs behind the autonomous driving transformation and the AI era. Our complete lineup of automotive memory products empowers traditional OEMs, Tier 1 trailblazers, and new automotive disruptor brands with high-performance, low-power, automotive-grade solutions.
[i] SAE International, "SAE Levels of Driving Automation" visual chart (sae-j3016-visual-chart_5.3.21.pdf)
[ii] "Mercedes-Benz First To Get U.S. Approval For Level 3 Automated Driving," Forbes (forbes.com)
[iii] "Tesla Full Self-Driving cars and Waymo taxis: Two autonomous vehicle strategies," Vox
[iv] S&P Global Mobility, AutoTechInsight, Autonomy Forecast Q3 2023
[v] "Self-Driving Cars Are Being Put on a Data Diet," WIRED
[vi] "Huge data center growth pushes AMD revenues," MSN (msn.com)
[vii] Dell Technologies presentation (delltechnologies.com)
[viii] "AI in semiconductor manufacturing: The next S curve?," McKinsey & Company