Today’s Advanced Driver Assistance Systems (ADAS) are gradually evolving into full autonomous vehicle systems.
By Lou Frenzel, Contributing Editor
The hype and overblown expectations continue, and the auto companies, along with their many OEM and technology suppliers, are collectively spending billions of dollars to create the first practical self-driving (SD) cars. Good progress is being made, and analyzing the effort to create the first autonomous vehicles (AVs) makes it clear what must happen. The challenges are mostly technical, but there are also major legal, social, and human issues to resolve.
ADAS as the Prelude to AV
Most of the essential technologies for self-driving are already at the core of the advanced driver assistance systems (ADAS) now widely available in most vehicles. Typical features include backup cameras, lane-departure warning, blind-spot detection, adaptive cruise control, automatic braking, and a few others. Some ADAS features are optional now, but the trend is to make them standard in all vehicles. ADAS will eventually morph into a real AV. Table 1 shows the different levels of automated vehicles as defined by the Society of Automotive Engineers.
| Level | Amount of Automation |
|---|---|
| Level 0 | No automation. Driver performs all functions. |
| Level 1 | Driver assistance. Driver performs all functions, but ADAS provides alerts and partial control of braking, steering, or throttle. |
| Level 2 | Partial automation. Driver must still monitor the road, but automated systems control braking, steering, and throttle. |
| Level 3 | Conditional automation. Automated driving systems perform all driving activities, but the driver must still be available to take control in special circumstances. |
| Level 4 | High automation. Automated driving systems perform all driving activities. The driver may still control the vehicle if needed. |
| Level 5 | Full automation. No driver is needed, but a driver may intervene if necessary. |
This evolution from ADAS to AVs will occur as these seven critical issues are adequately addressed.
A self-driving car is a robot that must have sensing ability equivalent to a human’s, or close to it, to function safely. Most of that is “seeing.” Developers use multiple sensors to create an equivalent composite view suitable for driving. These include:
- Video cameras. Highly developed, small, and cheap, and they see color. Multiple RGB cameras are used for different functions. The big problem is that they do not see well at night, and weather like fog, snow, and hard rain can interfere with their view. Short range is another limitation, and distances to objects are difficult to determine. Furthermore, the complex pictures they capture require lots of memory and special software like machine vision and artificial intelligence to interpret what they see. New vision systems using two cameras in a stereo configuration overcome some of these obstacles.
- IR cameras. Infrared sensors have been available for eons. They are widely used in weapons and satellites but not in cars. They “see” heat signatures, giving the cameras ever-important night vision. They bring a different and complementary view that adds to the quality of coverage. Standard RGB cameras are also being combined with IR cameras in a collaborative way to make object detection easier and more accurate.
- Radar. Single-chip radars are now available that give an alternate view of the driving surroundings. Using reflected radio waves they detect objects at a significant distance. The new 77 GHz CMOS devices are cheaper and consume less power. Their field of view and beamwidth can be set from a few degrees up to about 75 deg. by antenna design and selection. And these radars can see out to distances from a few feet up to 300 meters. Multiple devices can be used to fill in the gaps that may remain in the 360-deg. view goal.
- Ultrasonic. These short-range sensors give an entirely different view. They are typically built into side mirrors for close object detection. Parking assist is another use. Operating at 58 kHz, these sensors are inexpensive and easily incorporated into the vehicle.
- Lidar. Light detection and ranging sensors are laser-based and radar-like. They paint the surrounding area with narrow laser beams and detect the reflections to generate a picture of the environment. Their advantage is the ability to create a precision 360-deg. 3D image that is superior for object detection. They are not used in current ADAS systems because of their very high cost and there is some doubt about their use in the forthcoming self-drivers. Recent developments show considerable progress.
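Radar, ultrasonic, and lidar all estimate distance the same basic way: time the round trip of a reflected wave. A minimal sketch of that time-of-flight arithmetic (the echo times below are illustrative values, not measurements from the article):

```python
# Time-of-flight ranging, common to radar, lidar, and ultrasonic sensors.
SPEED_OF_LIGHT = 3.0e8   # m/s, used by radar and lidar
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 deg. C, used by ultrasonics

def range_from_echo(round_trip_s: float, wave_speed: float) -> float:
    """Distance to target = (propagation speed * round-trip time) / 2."""
    return wave_speed * round_trip_s / 2.0

# A 2-microsecond radar echo corresponds to a target 300 m away --
# the upper limit cited for 77-GHz automotive radar.
radar_m = range_from_echo(2.0e-6, SPEED_OF_LIGHT)
# A 5.8-millisecond ultrasonic echo corresponds to roughly 1 m,
# consistent with close-range parking assist.
ultrasonic_m = range_from_echo(5.8e-3, SPEED_OF_SOUND)
```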
None of these sensors is good enough alone to safely guide the vehicle. But collectively they provide the 360-deg. view around the vehicle (Fig. 1) needed to detect nearby objects, identify them, avoid them, and feed that information to the steering, throttle, and braking controls. Collectively, these sensors also create a massive problem: they generate a huge amount of data in parallel that must be stored and transported over appropriate buses to the processors that turn it into driving instructions. The real processing is in the software.
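To get a feel for the data volume, consider just the raw video. A rough back-of-the-envelope estimate, using an assumed camera resolution, pixel depth, and frame rate (these are typical values, not figures from the article):

```python
def raw_video_rate_mbps(width: int, height: int,
                        bytes_per_pixel: int, fps: int) -> float:
    """Uncompressed video data rate in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# One assumed 1080p RGB camera at 30 frames/s produces ~1.5 Gb/s raw.
# Several cameras plus radar, lidar, and ultrasonic streams multiply
# that quickly, hence the need for high-speed buses and lots of memory.
one_camera_mbps = raw_video_rate_mbps(1920, 1080, 3, 30)
```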
It begins with an operating system such as BlackBerry’s QNX, Microsoft’s, or some Linux offshoot. Add to that a collection of programs that take the sensor data and interpret it. The software must meld the various sensor inputs into a composite view of the area around the vehicle. And don’t forget, the view is changing constantly, so the software must respond quickly to the changes. Real-time processing with minimal latency is essential. The software is a mix of standard algorithms plus DSP. Artificial intelligence (AI) software using neural networks and deep machine learning does the sensor-fusion computations.
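Production sensor fusion is far more involved (Kalman filters, neural networks), but the core idea of weighting each sensor’s estimate by its confidence can be sketched in a few lines. The distances and variances here are hypothetical:

```python
# Minimal sensor-fusion sketch: combine independent estimates of the
# same quantity by inverse-variance weighting. A low-variance (more
# trusted) sensor dominates the fused result.
def fuse(estimates):
    """estimates: list of (value, variance) pairs; returns fused value."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(w * v for (v, _), w in zip(estimates, weights))
    return total / sum(weights)

camera = (48.0, 4.0)    # camera distance estimate: 48 m, large uncertainty
radar = (50.0, 0.25)    # radar distance estimate: 50 m, small uncertainty
fused_m = fuse([camera, radar])   # close to the radar's 50 m
```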
With the massive amount of data to crunch, many special processors are needed. This calls for heterogeneous processing, a group of diverse processors including general-purpose CPUs, DSPs, and graphics processing units (GPUs) from vendors like AMD and NVIDIA. Multiple vendors are getting some of the processor business, including Intel, NXP, Qualcomm, and Texas Instruments. FPGAs are also being applied to get the speed needed for some processing operations.
After analyzing the inputs, the processors will make the decisions about what to do. They will warn the driver (if any) with visible or audible alerts. Or they will assist the driver in the driving effort. But in a self-driver the decision will typically translate directly into control actions for braking, steering, or accelerating/decelerating. Figure 2 shows a generalized picture of the complete system.
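How a detection turns into a warning versus a direct control action depends on the automation level. A toy decision rule, with an invented time-to-collision threshold (not taken from any standard), might look like:

```python
def respond(time_to_collision_s: float, sae_level: int) -> str:
    """Map an estimated time-to-collision to an action.

    At Levels 0-1 the system only alerts the driver; at Level 2 and
    above it may actuate the brakes itself. The 3-second threshold is
    purely illustrative.
    """
    if time_to_collision_s > 3.0:
        return "monitor"          # no imminent threat
    if sae_level <= 1:
        return "alert_driver"     # ADAS: visible or audible warning
    return "brake"                # automated system takes control action
```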
A missing part of the AV puzzle is the proposed and generally accepted technology known as vehicle-to-vehicle (V2V) or vehicle-to-everything (V2X) communications. An early accepted standard is the dedicated short-range communications (DSRC) system based upon the IEEE 802.11p standard. It lets vehicles talk to one another and exchange location, speed, direction, turn intent, and other relevant factors that will help vehicles avoid collisions. Links to nearby roadside units (RSUs) will also eventually provide weather, road, traffic, and construction-work data to further fill the needs of self-driving.
A competing technology, called C-V2X, uses cellular technology. It would deploy 3GPP cellular standards like LTE initially and eventually evolve to the 5G NR standards. No decision has been made on which to use. With so much data to transmit quickly and the vital need for low latency, C-V2X may be the better choice; it also offers cellular-network connectivity. This communications technology will greatly enhance current ADAS and will form a vital part of the self-driving system. Communications inputs will be merged with all the sensor inputs to create a composite picture that the driving system responds to.
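The kind of record vehicles would broadcast over DSRC or C-V2X can be sketched as a simple data structure. The field names below are illustrative, loosely modeled on a DSRC basic safety message; they are not the standard’s actual format:

```python
from dataclasses import dataclass

@dataclass
class SafetyMessage:
    """Illustrative V2V broadcast: identity, position, motion, intent."""
    vehicle_id: str
    lat: float
    lon: float
    speed_mps: float
    heading_deg: float
    turn_signal: str   # "left", "right", or "none"

def rear_end_warning(own_speed_mps: float, gap_m: float,
                     lead: SafetyMessage) -> bool:
    """Warn when closing on a slower lead vehicle with under 2 s headway.

    The 2-second headway threshold is an assumption for illustration.
    """
    closing = own_speed_mps - lead.speed_mps
    return closing > 0 and gap_m / own_speed_mps < 2.0
```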
The Infotainment System
The infotainment system is a critical part of most newer vehicles, and it will play an increasing role in AVs. With no formal driving responsibility, the former driver-now-passenger will want something to do. Staring at a smartphone is the most likely outcome, but others will want music, video, or other entertainment. An OLED TV and a killer audio system may be the prime features.
Let’s face it, laws will be enacted to regulate self-drivers. There will be some federal rules, but much of the guidance will come from the states. That work is only beginning, so we can look forward to this important issue being resolved.
Then there is the insurance problem. Whose fault is it when two level 5 AVs crash into one another? Or if an AV kills a pedestrian? And will the resulting monthly insurance payment be more than the car payment? Those and other legal questions are seeking answers.
The Market
The auto manufacturers have done considerable market research to determine the interest in self-driving cars. There is interest, but surveys also indicate that not everyone is enamored with the concept. A recent AAA survey reveals that about 75% of those polled would be uncomfortable with a self-driver. DOT Secretary Elaine Chao recently cited another survey indicating that 78% of those polled would be afraid to ride in a driverless car. But that is not all. The market is complex, as the entire automotive market is highly segmented. Which segments like AVs?
- High-end buyers want comfort. Luxury buyers want to be seen as rich; image is important.
- Performance and sports-car buyers love to drive. Real drivers want to be seen as sporty and skilled.
- Green buyers want less pollution and better economy, and want you to see that they are trying to save the world. Their interest may be more in electric vehicles (EVs) than self-drivers.
- Just-transportation buyers want a cheap way to get from point A to point B. These are frequently used-car buyers. No self-drivers here.
- SUV and minivan buyers. Over 50% of all vehicles sold in the U.S. are now SUVs. Some, but not total, interest in self-drivers.
- Pickup-truck buyers. The highest-volume vehicle sold in the U.S. is the Ford F-150 pickup, and competitive pickups from Chevy, Dodge, and others also sell well. Pickup drivers want to be seen as macho; these are not likely self-driving enthusiasts.
To most of these categories, image is a major issue. It has been said that the vehicle you drive defines who you are: the type of vehicle reflects your personality, lifestyle, and outlook. What does a self-driving car say about a person?
The market is a mixed bag, and it is anyone’s guess at this time how well autonomous vehicles will sell. No doubt they will initially be expensive, generally limiting sales, and many potential buyers will take a wait-and-see attitude. For these reasons, AVs will likely be a niche market like hybrid electric vehicles.
The Case for the Electric Vehicle
A current parallel development by most automakers is the electric vehicle (EV). While ADAS is evolving toward the AV, the internal combustion engine (ICE) is evolving toward the EV. We already have hybrid electrics like the Prius and some full electrics like the Tesla S and 3, the Chevy Bolt and Volt, and others from BMW and VW, with many more on the way. Manufacturers are basically working to overcome the multiple limitations inhibiting sales. The battery is one of them: everyone is seeking the holy grail of EVs, a smaller, lighter battery with more energy capacity. Lithium-ion batteries have proven successful and are the solution for now, but something better is needed.
Another limitation is the short range of most EVs, typically 100 to 200 miles on a charge. Compare that to the 300- to 400-mile range of the average internal-combustion vehicle; most consumers demand the greater range they are used to. Another limitation is the lack of charging stations. There are roughly 18,000 public charging stations and about 45,000 charging points at companies, parking lots, and other locations. Their number is increasing, but it will be a while before it is on par with the roughly 120,000 gasoline stations in the U.S., available 24/7. And even when charging stations reach a reasonable level, even a partial recharge still takes a minimum of a half hour. For drivers used to a five- to 10-minute fill-up at any gas station, a half hour or more is not acceptable.
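The half-hour figure follows from simple arithmetic: energy to be replaced divided by charger power. A sketch with an assumed pack size, charger rating, and charging efficiency (none of these numbers come from the article):

```python
def charge_hours(battery_kwh: float, charger_kw: float,
                 start_frac: float, end_frac: float,
                 efficiency: float = 0.9) -> float:
    """Hours to charge between two states of charge at a given power.

    Pack size, charger power, and the 90% efficiency are assumptions
    for illustration; real charging also tapers near a full pack.
    """
    energy_kwh = battery_kwh * (end_frac - start_frac)
    return energy_kwh / (charger_kw * efficiency)

# An assumed 60-kWh pack charged from 20% to 80% on a 50-kW DC fast
# charger: 36 kWh / 45 kW effective = 0.8 h, i.e. about 48 minutes --
# consistent with the "half hour or more" for a partial recharge.
t_hours = charge_hours(60, 50, 0.2, 0.8)
```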
EVs are inevitable. When their limitations are overcome, they will dominate, perhaps not through popularity but because of legislation. Already, China and some European countries have mandated the phase-out of internal combustion engines in favor of EVs in the coming decades. If the price and the range/charging issues are addressed, EVs could be the more popular choice. But the ultimate electric vehicle, of course, is a self-driving EV.
The industry is well on its way to achieving a practical self-driving car. All the technology needed is here now, but legal, social, and market issues inhibit the forward motion for the moment. Predictions for when the first sales will occur run from a few years to a decade or longer, but the goal will ultimately be reached. In the meantime, celebrate ADAS and its real benefits to safety and convenience: these systems work, they are affordable, and they are easy to adapt to. No doubt they will ultimately be mandated. Then the AV will emerge. With EVs on the horizon, should the current priority be EVs instead of AVs?