A self-driving car, also known as an autonomous car, driver-less car, or robotic car (robo-car), is a car incorporating vehicular automation, that is, a ground vehicle that is capable of sensing its environment and moving safely with little or no human input. Self-driving cars combine a variety of sensors to perceive their surroundings, such as thermographic cameras, radar, lidar, sonar, GPS, odometry and inertial measurement units. Advanced control systems interpret sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage. Control methods based on artificial intelligence can then learn from the gathered sensory information in order to control the vehicle and support various autonomous-driving tasks.
As a future technology, self-driving cars are predicted to have a comprehensive impact on the automobile industry, health, welfare, urban planning, traffic, insurance, the labor market and other fields. Together with other emerging automotive technologies such as vehicle electrification, connected vehicles and shared mobility, self-driving cars form a future mobility vision called Connected, Autonomous, Shared and Electric (CASE) Mobility. Autonomy in vehicles is often categorized in six levels, according to a system developed by SAE International (SAE J3016, revised periodically). The SAE levels can be roughly understood as Level 0 – no automation; Level 1 – hands on/shared control; Level 2 – hands off; Level 3 – eyes off; Level 4 – mind off; and Level 5 – steering wheel optional.
As of March 2022, vehicles operating at Level 3 and above remained a marginal portion of the market. In December 2020, Waymo became the first service provider to offer driver-less taxi rides to the general public, in a part of Phoenix, Arizona. In March 2021, Honda became the first manufacturer to provide a legally approved Level 3 car, and Toyota operated a potentially Level 4 service around the Tokyo 2020 Olympic Village. Nuro was allowed to start autonomous commercial delivery operations in California in 2021. In December 2021, Mercedes-Benz became the second manufacturer to receive legal approval for a Level 3 car. In February 2022, Cruise became the second service provider to offer driver-less taxi rides to the general public, in San Francisco in the United States.
In China, two publicly accessible trials of robotaxis have been launched, in 2020 in Shenzhen's Pingshan District by Chinese firm AutoX and in 2021 at Shougang Park in Beijing by Baidu, a venue for the 2022 Winter Olympics.
History
Experiments have been conducted on automated driving systems (ADS) since at least the 1920s; trials began in the 1950s. The first semi-automated car was developed in 1977, by Japan's Tsukuba Mechanical Engineering Laboratory, which required specially marked streets that were interpreted by two cameras on the vehicle and an analog computer. The vehicle reached speeds up to 30 kilometres per hour (19 mph) with the support of an elevated rail.
A landmark autonomous car appeared in the 1980s, with Carnegie Mellon University's Navlab and ALV projects funded by the United States' Defense Advanced Research Projects Agency (DARPA) starting in 1984 and Mercedes-Benz and Bundeswehr University Munich's EUREKA Prometheus Project in 1987. By 1985, the ALV had demonstrated self-driving speeds on two-lane roads of 31 kilometres per hour (19 mph), with obstacle avoidance added in 1986, and off-road driving in day and night time conditions by 1987. A major milestone was achieved in 1995, with Carnegie Mellon University's Navlab 5 completing the first autonomous coast-to-coast drive of the United States. Of the 2,849 mi (4,585 km) between Pittsburgh, Pennsylvania and San Diego, California, 2,797 mi (4,501 km) were autonomous (98.2%), completed with an average speed of 63.8 mph (102.7 km/h). From the 1960s through the second DARPA Grand Challenge in 2005, automated vehicle research in the United States was primarily funded by DARPA, the US Army, and the US Navy, yielding incremental advances in speeds, driving competence in more complex conditions, controls, and sensor systems. Companies and research organizations have developed prototypes.
The US allocated US$650 million in 1991 for research on the National Automated Highway System, which demonstrated automated driving through a combination of automation embedded in the highway with automated technology in vehicles, and cooperative networking between the vehicles and with the highway infrastructure. The programme concluded with a successful demonstration in 1997 but without clear direction or funding to implement the system on a larger scale. Partly funded by the National Automated Highway System and DARPA, the Carnegie Mellon University Navlab drove 4,584 kilometres (2,848 mi) across America in 1995, 4,501 kilometres (2,797 mi) or 98% of it autonomously. Navlab's record achievement stood unmatched for two decades until 2015, when Delphi improved it by piloting an Audi, augmented with Delphi technology, over 5,472 kilometres (3,400 mi) through 15 states while remaining in self-driving mode 99% of the time. In 2015, the US states of Nevada, Florida, California, Virginia, and Michigan, together with Washington, DC, allowed the testing of automated cars on public roads.
From 2016 to 2018, the European Commission funded an innovation strategy development for connected and automated driving through the Coordination Actions CARTRE and SCOUT. Moreover, the Strategic Transport Research and Innovation Agenda (STRIA) Roadmap for Connected and Automated Transport was published in 2019.
In November 2017, Waymo announced that it had begun testing driver-less cars without a safety driver in the driver position; however, there was still an employee in the car. An October 2017 report by the Brookings Institution found that $80 billion had been reported as invested in all facets of self-driving technology up to that point, but that it was "reasonable to presume that total global investment in autonomous vehicle technology is significantly more than this."
In October 2018, Waymo announced that its test vehicles had traveled in automated mode for over 10,000,000 miles (16,000,000 km), increasing by about 1,000,000 miles (1,600,000 kilometres) per month. In December 2018, Waymo was the first to commercialize a fully autonomous taxi service in the US, in Phoenix, Arizona. In October 2020, Waymo launched a geo-fenced driver-less ride hailing service in Phoenix. The cars are being monitored in real-time by a team of remote engineers, and there are cases where the remote engineers need to intervene.
In March 2019, ahead of the autonomous racing series Roborace, Robocar set the Guinness World Record for being the fastest autonomous car in the world. In pushing the limits of self-driving vehicles, Robocar reached 282.42 km/h (175.49 mph) – an average confirmed by the UK Timing Association at Elvington in Yorkshire, UK.
In 2020, a National Transportation Safety Board chairman stated that no self-driving cars (SAE level 3+) were available for consumers to purchase in the US in 2020:
There is not a vehicle currently available to US consumers that is self-driving. Period. Every vehicle sold to US consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated. If you are selling a car with an advanced driver assistance system, you’re not selling a self-driving car. If you are driving a car with an advanced driver assistance system, you don't own a self-driving car.
On 5 March 2021, Honda began leasing in Japan a limited edition of 100 Legend Hybrid EX sedans equipped with newly approved Level 3 automated driving equipment, whose "Traffic Jam Pilot" technology had been granted safety certification by the Japanese government and legally allows drivers to take their eyes off the road.
Definitions
There is some inconsistency in the terminology used in the self-driving car industry. Various organizations have proposed to define an accurate and consistent vocabulary.
In 2014, this confusion was documented in SAE J3016, which states that "some vernacular usages associate autonomous specifically with full driving automation (Level 5), while other usages apply it to all levels of driving automation, and some state legislation has defined it to correspond approximately to any ADS [automated driving system] at or above Level 3 (or to any vehicle equipped with such an ADS)."
Terminology and safety considerations
Modern vehicles provide features such as keeping the car within its lane, speed control, or emergency braking. Those features alone are considered driver assistance technologies because they still require a human driver to remain in control, whereas fully automated vehicles drive themselves without human driver input.
According to Fortune, some newer vehicles' technology names—such as AutonoDrive, PilotAssist, Full-Self Driving or DrivePilot—might confuse the driver, who may believe no driver input is expected when in fact the driver needs to remain involved in the driving task. According to the BBC, confusion between those concepts leads to deaths.
For this reason, some organizations such as the AAA try to provide standardized naming conventions for features such as ALKS, which aim to manage the driving task but are not yet approved as automated vehicles in any country. The Association of British Insurers considers the usage of the word autonomous in marketing for modern cars to be dangerous, because car ads make motorists think 'autonomous' and 'autopilot' mean a vehicle can drive itself when such systems still rely on the driver to ensure safety. Technology able to drive a car is still in its beta stage.
Some car makers suggest or claim that their vehicles are self-driving when they are not able to manage some driving situations. Despite being called Full Self-Driving, Tesla stated that its offering should not be considered a fully autonomous driving system. This puts drivers at risk of becoming excessively confident and driving while distracted, which can lead to crashes. In Great Britain, a fully self-driving car is only a car registered on a specific list. There have also been proposals to adopt aviation automation safety knowledge into the discussions of safe implementation of autonomous vehicles, due to the experience that has been gained over the decades by the aviation sector on safety topics.
According to the SMMT, "There are two clear states – a vehicle is either assisted with a driver being supported by technology or automated where the technology is effectively and safely replacing the driver."
Autonomous vs. automated
Autonomous means self-governing. Many historical projects related to vehicle automation have been automated (made automatic) while relying heavily on artificial aids in their environment, such as magnetic strips. Autonomous control implies satisfactory performance under significant uncertainties in the environment, and the ability to compensate for system failures without external intervention.
One approach is to implement communication networks both in the immediate vicinity (for collision avoidance) and farther away (for congestion management). Such outside influences in the decision process reduce an individual vehicle's autonomy, while still not requiring human intervention.
As of 2017, most commercial projects focused on automated vehicles that did not communicate with other vehicles or with an enveloping management regime. Euro NCAP defines autonomous in "Autonomous Emergency Braking" as: "the system acts independently of the driver to avoid or mitigate the accident", which implies the autonomous system is not the driver.
In Europe, the words automated and autonomous might be used together. For instance, Regulation (EU) 2019/2144 of the European Parliament and of the Council of 27 November 2019 on type-approval requirements for motor vehicles (...) defines "automated vehicle" and "fully automated vehicle" based on their autonomous capacity:
- "automated vehicle" means a motor vehicle designed and constructed to move autonomously for certain periods of time without continuous driver supervision but in respect of which driver intervention is still expected or required;
- "fully automated vehicle" means a motor vehicle that has been designed and constructed to move autonomously without any driver supervision;
In British English, the word automated alone might have several meanings, as in the sentence: "Thatcham also found that the automated lane keeping systems could only meet two out of the twelve principles required to guarantee safety, going on to say they cannot, therefore, be classed as ‘automated driving’, instead it claims the tech should be classed as ‘assisted driving’." The first occurrence of the word "automated" refers to a UNECE automated system, while the second occurrence refers to the British legal definition of an automated vehicle. British law interprets the meaning of "automated vehicle" based on the interpretation section related to a vehicle "driving itself" and an insured vehicle.
Autonomous versus cooperative
To enable a car to travel without any driver embedded within the vehicle, some companies use a remote driver.
According to SAE J3016,
Some driving automation systems may indeed be autonomous if they perform all of their functions independently and self-sufficiently, but if they depend on communication and/or cooperation with outside entities, they should be considered cooperative rather than autonomous.
Classifications
Self-driving car
PC Magazine defines a self-driving car as "a computer-controlled car that drives itself." The Union of Concerned Scientists states that self-driving cars are "cars or trucks in which human drivers are never required to take control to safely operate the vehicle. Also known as autonomous or 'driver-less' cars, they combine sensors and software to control, navigate, and drive the vehicle."
The British Automated and Electric Vehicles Act 2018 defines a vehicle as "driving itself" if the vehicle "is operating in a mode in which it is not being controlled, and does not need to be monitored, by an individual".
Another British definition assumes: "Self-driving vehicles are vehicles that can safely and lawfully drive themselves."
SAE classification
A classification system with six levels – ranging from fully manual to fully automated systems – was published in 2014 by standardization body SAE International as J3016, Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems; the details are revised periodically. This classification is based on the amount of driver intervention and attentiveness required, rather than the vehicle's capabilities, although these are loosely related. In the United States in 2013, the National Highway Traffic Safety Administration (NHTSA) had released its original formal classification system. After SAE updated its classification in 2016, called J3016_201609, NHTSA adopted the SAE standard, and SAE classification became widely accepted.
Levels of driving automation
In SAE's automation level definitions, "driving mode" means "a type of driving scenario with characteristic dynamic driving task requirements (e.g., expressway merging, high speed cruising, low speed traffic jam, closed-campus operations, etc.)"
- Level 0: The automated system issues warnings and may momentarily intervene but has no sustained vehicle control.
- Level 1 ("hands on"): The driver and the automated system share control of the vehicle. Examples are systems where the driver controls steering and the automated system controls engine power to maintain a set speed (Cruise control) or engine and brake power to maintain and vary speed (Adaptive cruise control or ACC); and Parking Assistance, where steering is automated while speed is under manual control. The driver must be ready to retake full control at any time. Lane Keeping Assistance (LKA) Type II is a further example of Level 1 self-driving. Automatic emergency braking which alerts the driver to a crash and permits full braking capacity is also a Level 1 feature, according to Autopilot Review magazine.
- Level 2 ("hands off"): The automated system takes full control of the vehicle: accelerating, braking, and steering. The driver must monitor the driving and be prepared to intervene immediately at any time if the automated system fails to respond properly. The shorthand "hands off" is not meant to be taken literally – contact between hand and wheel is often mandatory during SAE 2 driving, to confirm that the driver is ready to intervene. The driver's eyes might be monitored by cameras to confirm that the driver is keeping their attention on traffic. Literal hands-off driving is considered level 2.5, although there are no half levels officially. A common example is adaptive cruise control combined with lane keeping assist technology so that the driver simply monitors the vehicle, such as "Super Cruise" in the Cadillac CT6 by General Motors or Ford's F-150 BlueCruise.
- Level 3 ("eyes off"): The driver can safely turn their attention away from the driving tasks, e.g. the driver can text or watch a film. The vehicle will handle situations that call for an immediate response, like emergency braking. The driver must still be prepared to intervene within some limited time, specified by the manufacturer, when called upon by the vehicle to do so. This level of automation can be thought of as a co-driver or co-pilot that is ready to alert the driver in an orderly fashion when it is time to hand control back. An example would be a Traffic Jam Chauffeur; another example would be a car satisfying the international Automated Lane Keeping Systems (ALKS) regulations.
- Level 4 ("mind off"): As level 3, but no driver attention is ever required for safety, e.g. the driver may safely go to sleep or leave the driver's seat. However, self-driving is supported only in limited spatial areas (geofenced) or under special circumstances. Outside of these areas or circumstances, the vehicle must be able to safely abort the trip, e.g. slow down and park the car, if the driver does not retake control. An example would be a robotic taxi or a robotic delivery service that covers selected locations in an area, at specific times and in specific quantities. Automated valet parking is another example.
- Level 5 ("steering wheel optional"): No human intervention is required at all. An example would be a robotic vehicle that works on all kinds of surfaces, all over the world, all year round, in all weather conditions.
In the formal SAE definition below, an important transition is from SAE Level 2 to SAE Level 3 in which the human driver is no longer expected to monitor the environment continuously. At SAE 3, the human driver still has responsibility to intervene when asked to do so by the automated system. At SAE 4 the human driver is always relieved of that responsibility and at SAE 5 the automated system will never need to ask for an intervention.
| SAE Level | Name | Narrative definition | Execution of steering and acceleration/deceleration | Monitoring of driving environment | Fallback performance of dynamic driving task | System capability (driving modes) |
|---|---|---|---|---|---|---|
| *Human driver monitors the driving environment* | | | | | | |
| 0 | No Automation | The full-time performance by the human driver of all aspects of the dynamic driving task, even when "enhanced by warning or intervention systems" | Human driver | Human driver | Human driver | n/a |
| 1 | Driver Assistance | The driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration, using information about the driving environment and with the expectation that the human driver performs all remaining aspects of the dynamic driving task | Human driver and system | Human driver | Human driver | Some driving modes |
| 2 | Partial Automation | The driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration, using information about the driving environment and with the expectation that the human driver performs all remaining aspects of the dynamic driving task | System | Human driver | Human driver | Some driving modes |
| *Automated driving system monitors the driving environment* | | | | | | |
| 3 | Conditional Automation | The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, with the expectation that the human driver will respond appropriately to a request to intervene | System | System | Human driver | Some driving modes |
| 4 | High Automation | The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene (the car can pull over safely by its guiding system) | System | System | System | Many driving modes |
| 5 | Full Automation | The full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver | System | System | System | All driving modes |
Criticism of SAE
The SAE Automation Levels have been criticized for their technological focus. It has been argued that the structure of the levels suggests that automation increases linearly and that more automation is better, which may not always be the case. The SAE Levels also do not account for changes that may be required to infrastructure and road user behavior.
Technology
General perspectives
To deal with the broad range of technology discussions regarding self-driving cars, there have been a few proposals for classifying the technology. One proposal uses the following categories: car navigation, path planning, environment perception and car control. In the 2020s, these technologies came to be recognized as far more complicated and involved than previously thought. Even video games have been used as a platform to test autonomous vehicles.
Hybrid navigation is the simultaneous use of more than one navigation system for location data determination, needed for navigation.
Sensing
To reliably and safely operate an autonomous vehicle, usually a mixture of sensors is utilized.
Typical sensors include lidar (Light Detection and Ranging), stereo vision, GPS and IMU. Modern self-driving cars generally use Bayesian simultaneous localization and mapping (SLAM) algorithms, which fuse data from multiple sensors and an off-line map into current location estimates and map updates. Waymo has developed a variant of SLAM with detection and tracking of other moving objects (DATMO), which also handles obstacles such as cars and pedestrians. Simpler systems may use roadside real-time locating system (RTLS) technologies to aid localization.
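The Bayesian fusion behind such localization can be illustrated with a one-dimensional Kalman-style filter that blends an odometry prediction with a noisy GPS fix. The sketch below is a minimal illustration under assumed noise values and variable names; it is not a production SLAM system.

```python
# Minimal 1-D illustration of Bayesian localization: an odometry-based
# prediction is corrected by a noisy GPS fix, each weighted by its
# uncertainty. Noise values and names are illustrative assumptions.

def predict(position, variance, odometry_delta, odometry_noise):
    """Motion step: shift the estimate by the odometry reading."""
    return position + odometry_delta, variance + odometry_noise

def update(position, variance, gps_measurement, gps_noise):
    """Measurement step: blend the prediction with the GPS fix."""
    gain = variance / (variance + gps_noise)
    new_position = position + gain * (gps_measurement - position)
    new_variance = (1.0 - gain) * variance
    return new_position, new_variance

pos, var = 0.0, 10.0                      # start with an uncertain estimate
for gps_fix in [1.2, 2.1, 2.9, 4.2]:      # drive roughly 1 m per step
    pos, var = predict(pos, var, odometry_delta=1.0, odometry_noise=0.5)
    pos, var = update(pos, var, gps_measurement=gps_fix, gps_noise=1.0)
    print(f"estimated position {pos:.2f} m (variance {var:.2f})")
```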
Maps
Self-driving cars require a new class of high-definition maps (HD maps) that represent the world at up to two orders of magnitude more detail. In May 2018, researchers from the Massachusetts Institute of Technology (MIT) announced that they had built an automated car that can navigate unmapped roads. Researchers at their Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a new system, called MapLite, which allows self-driving cars to drive on roads that they have never been on before, without using 3D maps. The system combines the GPS position of the vehicle, a "sparse topological map" such as OpenStreetMap (i.e. having 2D features of the roads only), and a series of sensors that observe the road conditions.
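A rough sketch of that idea, under stated assumptions (this is not MapLite's actual code): the GPS position selects the next waypoint from a sparse 2-D road map, and on-board perception, stubbed out here as a fixed lateral offset, adjusts the local goal to stay on the observed road.

```python
import math

# Hypothetical sketch of navigating from a sparse topological map:
# the map supplies 2-D waypoints only; sensors (stubbed here) keep the
# vehicle on the drivable surface between them.
sparse_map = [(0.0, 0.0), (50.0, 0.0), (100.0, 30.0)]  # 2-D waypoints

def next_waypoint(gps_position, waypoints, reached_radius=5.0):
    """Return the first waypoint not yet reached, judged by GPS alone."""
    for wp in waypoints:
        if math.dist(gps_position, wp) > reached_radius:
            return wp
    return waypoints[-1]

def local_goal(waypoint, road_edge_offset):
    """Shift the goal laterally by whatever perception reports about the
    road edges (stubbed as a fixed offset in this sketch)."""
    return (waypoint[0], waypoint[1] + road_edge_offset)

gps = (12.0, 1.5)
goal = local_goal(next_waypoint(gps, sparse_map), road_edge_offset=-0.8)
print("drive toward", goal)
```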
Sensor fusion
Control systems on automated cars may use sensor fusion, which is an approach that integrates information from a variety of sensors on the car to produce a more consistent, accurate, and useful view of the environment.
Self-driving cars tend to use a combination of cameras, LiDAR sensors, and radar sensors in order to enhance performance and ensure the safety of the passenger and other drivers on the road. An increased consistency in self-driving performance prevents accidents that may occur because of one faulty sensor.
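As a simple illustration of the fusion step, the sketch below combines independent distance estimates from three sensors by inverse-variance weighting; the readings and noise figures are invented example values, not measurements from any real system.

```python
# Toy sensor fusion: combine distance estimates from camera, lidar and
# radar by inverse-variance weighting. All numbers are made-up examples.

readings = {            # sensor: (distance to obstacle in m, variance)
    "camera": (24.8, 4.0),
    "lidar":  (25.3, 0.3),
    "radar":  (25.9, 1.0),
}

weights = {name: 1.0 / var for name, (_, var) in readings.items()}
fused = sum(w * readings[name][0] for name, w in weights.items()) / sum(weights.values())
fused_variance = 1.0 / sum(weights.values())

print(f"fused distance: {fused:.2f} m (variance {fused_variance:.3f})")
# A single faulty sensor contributes little once its variance is set high,
# which is one way fusion keeps one bad reading from causing an accident.
```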
Path planning
Path planning is a computational problem to find a sequence of valid configurations that moves the object from the source to the destination. Self-driving cars rely on path planning technology in order to follow the rules of traffic and prevent accidents from occurring. The large-scale path of the vehicle can be determined by using a Voronoi diagram, an occupancy grid mapping, or a driving corridors algorithm. A driving corridors algorithm allows the vehicle to locate and drive within open free space that is bounded by lanes or barriers. While these algorithms work in a simple situation, path planning has not been proven to be effective in a complex scenario. Two techniques used for path planning are graph-based search and variational-based optimization techniques. Graph-based techniques can make harder decisions such as how to pass another vehicle or obstacle. Variational-based optimization techniques require a higher level of planning in setting restrictions on the vehicle's driving corridor to prevent collisions.
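As an illustration of the graph-based search mentioned above, the following minimal sketch runs A* over a small occupancy grid. It is a generic textbook planner with an assumed toy grid, not any manufacturer's implementation.

```python
import heapq

# Minimal graph-based path search (A*) over an occupancy grid.
# 0 = free cell, 1 = obstacle. The grid below is a toy example.
GRID = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
    frontier = [(heuristic(start, goal), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier,
                               (cost + 1 + heuristic((nr, nc), goal),
                                cost + 1, (nr, nc), path + [(nr, nc)]))
    return None  # no path found

print(astar(GRID, start=(0, 0), goal=(3, 3)))
```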
Drive by wire
Drive by wire technology in the automotive industry is the use of electrical or electro-mechanical systems for performing vehicle functions traditionally achieved by mechanical linkages.
Driver monitoring system
Driver monitoring system is a vehicle safety system to assess the driver's alertness and warn the driver if needed. Developers recognize that the role of these systems will grow as SAE Level 2 systems become more commonplace, and that predicting the driver's readiness for handover becomes more challenging at Level 3 and above.
Vehicular communication
Vehicular communication is a growing field covering communication between vehicles and with roadside infrastructure. Vehicular communication systems use vehicles and roadside units as the communicating nodes in a peer-to-peer network, providing each other with information. This connectivity enables autonomous vehicles to interact with non-autonomous traffic and pedestrians to increase safety. Autonomous vehicles will also need to connect to the cloud to update their software and maps, and to feed information back that improves their manufacturer's maps and software.
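The sketch below illustrates the basic idea of a vehicle broadcasting its state to nearby nodes. The field names and plain UDP/JSON transport are simplifying assumptions for the example; real V2X stacks use standardized message sets (such as the SAE J2735 Basic Safety Message) over dedicated radio links rather than ordinary IP broadcast.

```python
import json
import socket

# Illustrative-only broadcast of a simple vehicle status message.
# Field names are assumptions made for this sketch.
message = {
    "vehicle_id": "demo-001",
    "latitude": 37.7793,
    "longitude": -122.4193,
    "speed_mps": 8.4,
    "heading_deg": 92.0,
    "braking": False,
}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.sendto(json.dumps(message).encode(), ("255.255.255.255", 47000))
sock.close()
```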
Re-programmable
Autonomous vehicles have software systems that drive the vehicle, meaning that updates through reprogramming or editing the software can enhance the benefits to the owner (e.g. an update that better distinguishes a blind pedestrian from a sighted one, so that the vehicle takes extra caution when approaching a blind person). A characteristic of this re-programmable part of autonomous vehicles is that the updates need not come only from the supplier: through machine learning, smart autonomous vehicles can generate certain updates and install them accordingly (e.g. new navigation maps or new intersection computer systems). These reprogrammable characteristics of the digital technology and the possibility of smart machine learning give manufacturers of autonomous vehicles the opportunity to differentiate themselves on software.
In March 2021, the UNECE regulation on software updates and software update management systems was published.
Modularity
Autonomous vehicles are more modular, since they are made up of several modules, explained hereafter through a Layered Modular Architecture; a simplified code sketch of these layers follows the list below. The Layered Modular Architecture extends the architecture of purely physical vehicles by incorporating four loosely coupled layers of devices, networks, services and contents into autonomous vehicles. These loosely coupled layers can interact through certain standardized interfaces.
- The first layer of this architecture is the device layer, which consists of two parts: logical capability and physical machinery. The physical machinery refers to the actual vehicle itself (e.g. chassis and body). When it comes to digital technologies, the physical machinery is accompanied by a logical capability in the form of operating systems that help to guide the vehicle itself and make it autonomous. The logical capability provides control over the vehicle and connects it with the other layers;
- On top of the device layer comes the network layer, which also consists of two parts: physical transport and logical transmission. The physical transport part refers to the radars, sensors and cables of the autonomous vehicles which enable the transmission of digital information. In addition, the network layer of autonomous vehicles has a logical transmission part which contains communication protocols and network standards to communicate the digital information with other networks and platforms or between layers. This increases the accessibility of the autonomous vehicles and lets them draw on the computational power of a network or platform;
- The service layer contains the applications and their functionalities that serve the autonomous vehicle (and its owners) as they extract, create, store and consume content regarding, for example, their own driving history, traffic congestion, roads or parking availability;
- The final layer of the model is the contents layer. This layer contains the sounds, images and videos that the autonomous vehicles store, extract and use to act upon and improve their driving and understanding of the environment. The contents layer also provides metadata and directory information about the content's origin, ownership, copyright, encoding methods, content tags, geo-time stamps, and so on (Yoo et al., 2010).
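As a rough illustration of the four loosely coupled layers above, the sketch below maps each one to a minimal Python class with a narrow interface. All class and method names are assumptions made for the example, not any vehicle's actual software.

```python
# Illustrative mapping of the layered modular architecture (device,
# network, service, contents) onto minimal classes. Names are assumptions.

class DeviceLayer:            # physical machinery + logical capability
    def apply_controls(self, steering, throttle):
        print(f"steer {steering:+.2f}, throttle {throttle:.2f}")

class NetworkLayer:           # physical transport + logical transmission
    def transmit(self, payload):
        print("sending over standardized interface:", payload)

class ServiceLayer:           # applications that use the vehicle's data
    def plan_route(self, traffic_report):
        return "alternate route" if traffic_report["congested"] else "default route"

class ContentsLayer:          # sounds, images, videos and their metadata
    def store(self, frame, metadata):
        print("stored frame with metadata:", metadata)

# The layers interact only through these narrow interfaces, which is what
# makes each one separately replaceable or upgradable.
device, network, service, contents = DeviceLayer(), NetworkLayer(), ServiceLayer(), ContentsLayer()
route = service.plan_route({"congested": True})
network.transmit({"route": route})
device.apply_controls(steering=0.05, throttle=0.3)
contents.store(frame=b"...", metadata={"geo_time_stamp": "2022-01-01T00:00:00Z"})
```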
Homogenization
In order for autonomous vehicles to perceive their surroundings, they have to use different techniques each with their own accompanying digital information (e.g. radar, GPS, motion sensors and computer vision). Homogenization requires that the digital information from these different sources is transmitted and stored in the same form. This means their differences are decoupled, and digital information can be transmitted, stored, and computed in a way that the vehicles and their operating system can better understand and act upon it.
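A small sketch of what homogenization can look like in software: readings from different sensors are converted into one common record with a shared time base and spatial frame. The field names are assumptions made for illustration.

```python
import math
from dataclasses import dataclass

# Sketch of homogenization: heterogeneous sensor outputs are normalized
# into one common record format. Field names are illustrative assumptions.
@dataclass
class SensorReading:
    source: str          # "radar", "gps", "camera", ...
    timestamp_s: float   # shared time base
    position_m: tuple    # shared x-y frame, metres
    confidence: float    # 0..1

def from_radar(raw):
    # radar reports range and azimuth; convert to the shared x-y frame
    x = raw["range_m"] * math.cos(raw["azimuth_rad"])
    y = raw["range_m"] * math.sin(raw["azimuth_rad"])
    return SensorReading("radar", raw["t"], (x, y), 0.9)

def from_gps(raw):
    # GPS already reports a position; only the record format changes
    return SensorReading("gps", raw["time"], (raw["x"], raw["y"]), 0.7)

readings = [from_radar({"t": 3.2, "range_m": 25.0, "azimuth_rad": 0.05}),
            from_gps({"time": 3.2, "x": 24.9, "y": 1.2})]
for r in readings:
    print(r)
```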
In international standardization field, ISO/TC 22 is in charge of in-vehicle transport information and control systems, and ISO/TC 204 is in charge of information, communication and control systems in the field of urban and rural surface transportation. International standards have been actively developed in the domains of AD/ADAS functions, connectivity, human interaction, in-vehicle systems, management/engineering, dynamic map and positioning, privacy and security.
Mathematical safety model
In 2017, Mobileye published a mathematical model for automated vehicle safety which is called "Responsibility-Sensitive Safety (RSS)". It is under standardization at IEEE Standards Association as "IEEE P2846: A Formal Model for Safety Considerations in Automated Vehicle Decision Making".
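The core of RSS is a set of formulas, such as the minimum safe longitudinal distance to a lead vehicle travelling in the same direction. The sketch below renders that published formula in Python; the parameter values in the example call are chosen purely for illustration.

```python
# Responsibility-Sensitive Safety (RSS): minimum safe longitudinal
# distance behind a lead vehicle in the same direction, as published by
# Mobileye. The numbers in the example call are illustrative assumptions.

def rss_safe_longitudinal_distance(v_rear, v_front, response_time,
                                   a_max_accel, a_min_brake, a_max_brake):
    """Speeds in m/s, accelerations in m/s^2, result in metres."""
    v_rear_after_response = v_rear + response_time * a_max_accel
    d = (v_rear * response_time
         + 0.5 * a_max_accel * response_time ** 2
         + v_rear_after_response ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(d, 0.0)

# Example: ego vehicle at 20 m/s following a lead car at 15 m/s.
print(rss_safe_longitudinal_distance(v_rear=20.0, v_front=15.0,
                                     response_time=0.5, a_max_accel=2.0,
                                     a_min_brake=4.0, a_max_brake=8.0))
```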
In 2022, a research group of the National Institute of Informatics (NII, Japan) expanded RSS and developed "Goal-Aware RSS", which makes it possible for RSS rules to deal with complex scenarios via program logic.
Challenges
Obstacles
The potential benefits from increased vehicle automation described may be limited by foreseeable challenges such as disputes over liability, the time needed to turn over the existing stock of vehicles from non-automated to automated, and thus a long period of humans and autonomous vehicles sharing the roads, resistance by individuals to forfeiting control of their cars, concerns about safety, and the implementation of a legal framework and consistent global government regulations for self-driving cars. In addition, cyberattacks could be a potential threat to autonomous driving in the future.
Other obstacles could include de-skilling and lower levels of driver experience for dealing with potentially dangerous situations and anomalies, ethical problems where an automated vehicle's software is forced during an unavoidable crash to choose between multiple harmful courses of action ('the trolley problem'), concerns about making large numbers of people currently employed as drivers unemployed, the potential for more intrusive mass surveillance of location, association and travel as a result of police and intelligence agency access to large data sets generated by sensors and pattern-recognition AI, and possibly insufficient understanding of verbal sounds, gestures and non-verbal cues by police, other drivers or pedestrians.
Possible technological obstacles for automated cars are:
- Artificial intelligence is still not able to function properly in chaotic inner-city environments.
- A car's computer could potentially be compromised, as could a communication system between cars.
- Susceptibility of the car's sensing and navigation systems to different types of weather (such as snow) or deliberate interference, including jamming and spoofing.
- Avoidance of large animals requires recognition and tracking, and Volvo found that software suited to caribou, deer, and elk was ineffective with kangaroos.
- Autonomous cars may require high-definition maps to operate properly. Where these maps may be out of date, they would need to be able to fall back to reasonable behaviors.
- Competition for the radio spectrum desired for the car's communication.
- Field programmability for the systems will require careful evaluation of product development and the component supply chain.
- Current road infrastructure may need changes for automated cars to function optimally.
- Validation challenge of Automated Driving and need for novel simulation-based approaches comprising digital twins and agent-based traffic simulation.
Concerns
Regulation
In the 2010s, researchers openly worried about the potential of future regulation to delay deployment of automated cars on the road. However, as written in UNECE WP.29 GRVA, international regulation for Level 3 was smoothly established in 2020, and the uncertainty was resolved. As of 2022, in practice it is actually very difficult to be approved as Level 3, with the Mercedes-Benz Drive Pilot being one of the few commercially available options to receive such approval.
Deceptive marketing
As Tesla's "Full Self-Driving (FSD)" actually corresponds to Level 2, senators called on the Federal Trade Commission (FTC) in August 2021 to investigate Tesla's marketing claims.
In December 2021 in Japan, Mercedes-Benz Japan Co., Ltd. was penalized by the Consumer Affairs Agency for descriptions in its handouts that differed from the facts.
In July 2016, following a fatal crash by a Tesla car operating in "Autopilot" mode, Mercedes-Benz was also criticized for a misleading commercial advertising E-Class models, which had been available with "Drive Pilot". At that time, Mercedes-Benz rejected the claims and stopped its "self-driving car" ad campaign which had been running in the United States.
Employment
Companies working on the technology have an increasing recruitment problem in that the available talent pool has not grown with demand. As such, education and training by third-party organizations such as providers of online courses and self-taught community-driven projects such as DIY Robocars and Formula Pi have quickly grown in popularity, while university-level extracurricular programmes such as Formula Student Driverless have bolstered graduate experience. Industry is steadily increasing freely available information sources, such as code, datasets and glossaries, to widen the recruitment pool.
National security
In the 2020s, given the importance of the automotive sector to nations, self-driving cars have become a topic of national security. Concerns regarding cybersecurity and data protection are important not only for user protection, but also in the context of national security. The trove of data collected by self-driving cars, paired with cybersecurity vulnerabilities, creates an appealing target for intelligence collection. Self-driving cars therefore need to be considered in a new way when it comes to espionage risk.
In July 2018, a former Apple engineer was arrested by the Federal Bureau of Investigation (FBI) at San Jose International Airport (SJC) while preparing to board a flight to China, and charged with stealing proprietary information related to Apple's self-driving car project. In January 2019, another Apple employee was charged with stealing self-driving car project secrets. In July 2021, the United States Department of Justice (DOJ) accused Chinese security officials of coordinating a vast hacking campaign to steal sensitive and secret information from government entities, including research related to autonomous vehicles. On the China side, the government has already prepared the "Provisions on Management of Automotive Data Security (Trial)".
There is concern that such leapfrogging ability could be applied to autonomous car technology. Also, emerging Cellular V2X (Cellular Vehicle-to-Everything) technologies are based on 5G wireless networks.
Human factors
Moving obstacles
Self-driving cars are already exploring the difficulties of determining the intentions of pedestrians, bicyclists, and animals, and models of behavior must be programmed into driving algorithms.
Human road users also have the challenge of determining the intentions of autonomous vehicles, where there is no driver with which to make eye contact or exchange hand signals. Drive.ai is testing a solution to this problem that involves LED signs mounted on the outside of the vehicle, announcing status such as "going now, don't cross" vs. "waiting for you to cross".
Handover and risk compensation
Two human-factor challenges are important for safety. One is the handover from automated driving to manual driving. Human factors research on automated systems has shown that people are slow to detect a problem with automation and slow to understand the problem after it is detected. When automation failures occur, unexpected transitions that require a driver to take over will occur suddenly and the driver may not be ready to take over.
The second challenge is known as risk compensation: as a system is perceived to be safer, instead of benefiting entirely from all of the increased safety, people engage in riskier behavior and enjoy other benefits. Semi-automated cars have been shown to suffer from this problem, for example with users of Tesla Autopilot ignoring the road and using electronic devices or other activities against the advice of the company that the car is not capable of being completely autonomous. In the near future, pedestrians and bicyclists may travel in the street in a riskier fashion if they believe self-driving cars are capable of avoiding them.
Trust
In order for people to buy self-driving cars and vote for the government to allow them on roads, the technology must be trusted as safe. Self-driving elevators were invented in 1900, but the high number of people refusing to use them slowed adoption for several decades until operator strikes increased demand and trust was built with advertising and features like the emergency stop button. There are three types of trust between human and automation: dispositional trust, the trust between the driver and the company's product; situational trust, the trust arising from different scenarios; and learned trust, where trust is built from similar past events.
Moral issues
Rationale for liability
There are different opinions on who should be held liable in case of a crash, especially with people being hurt.
One study suggests requesting the owners of self-driving cars to sign end-user license agreements (EULAs), assigning to them accountability for any accidents.
Other studies suggest introducing a tax or insurance schemes that would protect owners and users of automated vehicles from claims made by victims of an accident.
Other possible parties that can be held responsible in case of a technical failure include the software engineers who programmed the code for the automated operation of the vehicles, and the suppliers of components of the AV.
Implications from the Trolley Problem
A moral dilemma that a software engineer or car manufacturer might face in programming the operating software of a self-driving vehicle is captured in a variation of the traditional ethical thought experiment, the trolley problem: an AV is driving with passengers when a person suddenly appears in its way, and the car has to choose between two options, either to run the person over or to avoid hitting the person by swerving into a wall, killing the passengers.
Researchers have suggested, in particular, two ethical theories as applicable to the behavior of automated vehicles in cases of emergency: deontology and utilitarianism. Deontological theory suggests that an automated car needs to follow strict written-out rules in any situation. Utilitarianism, on the other hand, promotes maximizing the number of people surviving a crash. Critics suggest that automated vehicles should adopt a mix of multiple theories to be able to respond in a morally right way in the event of a crash. Recently, some specific ethical frameworks, i.e. utilitarianism, deontology, relativism, absolutism (monism), and pluralism, have been investigated empirically with respect to the acceptance of self-driving cars in unavoidable accidents.
According to research, people overwhelmingly express a preference for autonomous vehicles to be programmed with utilitarian ideas, that is, in a manner that generates the least harm and minimizes driving casualties. While people want others to purchase utilitarian promoting vehicles, they themselves prefer to ride in vehicles that prioritize the lives of people inside the vehicle at all costs. This presents a paradox in which people prefer that others drive utilitarian vehicles designed to maximize the lives preserved in a fatal situation but want to ride in cars that prioritize the safety of passengers at all costs. People disapprove of regulations that promote utilitarian views and would be less willing to purchase a self-driving car that may opt to promote the greatest good at the expense of its passengers.
Bonnefon et al. concluded that the regulation of autonomous vehicle ethical prescriptions may be counterproductive to societal safety. This is because, if the government mandates utilitarian ethics and people prefer to ride in self-protective cars, it could prevent the large scale implementation of self-driving cars. Delaying the adoption of autonomous cars vitiates the safety of society as a whole because this technology is projected to save so many lives.
Privacy
Privacy-related issues arise mainly from the interconnectivity of automated cars, making it just another mobile device that can gather any information about an individual (see data mining). This information gathering ranges from tracking of the routes taken, voice recording, video recording, preferences in media that is consumed in the car, behavioral patterns, to many more streams of information. The data and communications infrastructure needed to support these vehicles may also be capable of surveillance, especially if coupled to other data sets and advanced analytics.
Testing
Approaches
The testing of vehicles with varying degrees of automation can be carried out either physically, in a closed environment or, where permitted, on public roads (typically requiring a license or permit, or adherence to a specific set of operating principles), or in a virtual environment, i.e. using computer simulations. When driven on public roads, automated vehicles require a person to monitor their proper operation and "take over" when needed. For example, New York has strict requirements for the test driver, such that the vehicle can be corrected at all times by a licensed operator, as highlighted by Cardian Cube Company's application and discussions with New York State officials and the NYS DMV.
Disengagements in the 2010s
In California, self-driving car manufacturers are required to submit annual reports to share how often their vehicles disengaged from autonomous mode during tests. It was believed that how often the vehicles needed such "disengagements" would show how reliable they were becoming.
In 2017, Waymo reported 63 disengagements over 352,545 mi (567,366 km) of testing, an average distance of 5,596 mi (9,006 km) between disengagements, the highest among companies reporting such figures. Waymo also traveled a greater total distance than any of the other companies. Their 2017 rate of 0.18 disengagements per 1,000 mi (1,600 km) was an improvement over the 0.2 disengagements per 1,000 mi (1,600 km) in 2016, and 0.8 in 2015. In March 2017, Uber reported an average of just 0.67 mi (1.08 km) per disengagement. In the final three months of 2017, Cruise (now owned by GM) averaged 5,224 mi (8,407 km) per disengagement over a total distance of 62,689 mi (100,888 km). In July 2018, the first electric driver-less racing car, "Robocar", completed a 1.8-kilometer track, using its navigation system and artificial intelligence.
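The figures above are quoted in two interchangeable forms, miles between disengagements and disengagements per 1,000 miles, which are simply reciprocals of one another. The short sketch below shows the conversion using the Waymo 2017 numbers quoted in the text.

```python
# Convert between the two ways disengagement figures are reported,
# using the Waymo 2017 numbers quoted above.
miles = 352_545
disengagements = 63

miles_per_disengagement = miles / disengagements
per_1000_miles = 1000 * disengagements / miles

print(f"{miles_per_disengagement:,.0f} mi between disengagements")  # ~5,596
print(f"{per_1000_miles:.2f} disengagements per 1,000 mi")          # ~0.18
```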
| Car maker | 2016: Distance between disengagements | 2016: Total distance traveled | 2018: Distance between disengagements | 2018: Total distance traveled | 2019: Distance between disengagements | 2019: Total distance traveled |
|---|---|---|---|---|---|---|
| Waymo | 5,128 mi (8,253 km) | 635,868 mi (1,023,330 km) | 11,154 mi (17,951 km) | 1,271,587 mi (2,046,421 km) | 11,017 mi (17,730 km) | 1,450,000 mi (2,330,000 km) |
| BMW | 638 mi (1,027 km) | 638 mi (1,027 km) | | | | |
| Nissan | 263 mi (423 km) | 6,056 mi (9,746 km) | 210 mi (340 km) | 5,473 mi (8,808 km) | | |
| Ford | 197 mi (317 km) | 590 mi (950 km) | | | | |
| General Motors | 55 mi (89 km) | 8,156 mi (13,126 km) | 5,205 mi (8,377 km) | 447,621 mi (720,376 km) | 12,221 mi (19,668 km) | 831,040 mi (1,337,430 km) |
| Aptiv | 15 mi (24 km) | 2,658 mi (4,278 km) | | | | |
| Tesla | 3 mi (4.8 km) | 550 mi (890 km) | | | | |
| Mercedes-Benz | 2 mi (3.2 km) | 673 mi (1,083 km) | 1.5 mi (2.4 km) | 1,749 mi (2,815 km) | | |
| Bosch | 7 mi (11 km) | 983 mi (1,582 km) | | | | |
| Zoox | | | 1,923 mi (3,095 km) | 30,764 mi (49,510 km) | 1,595 mi (2,567 km) | 67,015 mi (107,850 km) |
| Nuro | | | 1,028 mi (1,654 km) | 24,680 mi (39,720 km) | 2,022 mi (3,254 km) | 68,762 mi (110,662 km) |
| Pony.ai | | | 1,022 mi (1,645 km) | 16,356 mi (26,322 km) | 6,476 mi (10,422 km) | 174,845 mi (281,386 km) |
| Baidu (Apolong) | | | 206 mi (332 km) | 18,093 mi (29,118 km) | 18,050 mi (29,050 km) | 108,300 mi (174,300 km) |
| Aurora | | | 100 mi (160 km) | 32,858 mi (52,880 km) | 280 mi (450 km) | 39,729 mi (63,938 km) |
| Apple | | | 1.1 mi (1.8 km) | 79,745 mi (128,337 km) | 118 mi (190 km) | 7,544 mi (12,141 km) |
| Uber | | | 0.4 mi (0.64 km) | 26,899 mi (43,290 km) | | 0 mi (0 km) |
In the 2020s
Compliance
In April 2021, WP.29 GRVA issued the master document on "Test Method for Automated Driving (NATM)".
In October 2021, Europe's comprehensive pilot test of automated driving on public roads, L3Pilot, demonstrated automated systems for cars in Hamburg, Germany, in conjunction with the ITS World Congress 2021. SAE Level 3 and 4 functions were tested on ordinary roads. At the end of February 2022, the final results of the L3Pilot project were published.
Simulation and validation
In September 2022, Biprogy released a software system called the "Driving Intelligence Validation Platform (DIVP)" as an outcome of the Japanese national project "SIP-adus", led by the Cabinet Office, under a subproject of the same name; the platform is interoperable with the Open Simulation Interface (OSI) of ASAM.
Topics
In November 2021, the California Department of Motor Vehicles (DMV) notified Pony.ai that it was suspending its driverless testing permit following a reported collision in Fremont on 28 October. This incident stands out because the vehicle was in autonomous mode and didn't involve any other vehicle.
In May 2022, the DMV revoked Pony.ai's permit for failing to monitor the driving records of the safety drivers on its testing permit.
As of 2022, "disengagements" are at the center of the controversy. The problem is that reporting companies have varying definitions of what qualifies as a disengagement, and that definition can change over time.
In April 2022, it was reported that a Cruise test vehicle blocked a fire engine responding to an emergency call, sparking questions about an autonomous vehicle's ability to handle unexpected roadway issues.
Applications
Autonomous trucks and vans
Companies such as Otto and Starsky Robotics have focused on autonomous trucks. Automation of trucks is important, not only due to the improved safety aspects of these very heavy vehicles, but also due to the ability of fuel savings through platooning. Autonomous vans are being developed for use by online grocers such as Ocado.
Research has also indicated that goods distribution on the macro (urban distribution) and micro level (last mile delivery) could be made more efficient with the use of autonomous vehicles thanks to the possibility of smaller vehicle sizes.
Transport systems
China trialled the first automated public bus in Henan province in 2015, on a highway linking Zhengzhou and Kaifeng. Baidu and King Long produce an automated minibus, a vehicle with 14 seats but no driving seat. With 100 vehicles produced, 2018 was expected to be the first year with commercial automated service in China.
In Europe, cities in Belgium, France, Italy and the UK are planning to operate transport systems for automated cars, and Germany, the Netherlands, and Spain have allowed public testing in traffic. In 2015, the UK launched public trials of the LUTZ Pathfinder automated pod in Milton Keynes. Beginning in summer 2015, the French government allowed PSA Peugeot-Citroen to conduct trials in real conditions in the Paris area. The experiments were planned to be extended to other cities such as Bordeaux and Strasbourg by 2016. The alliance between French companies THALES and Valeo (provider of the first self-parking car system that equips Audi and Mercedes premium cars) is testing its own system. New Zealand is planning to use automated vehicles for public transport in Tauranga and Christchurch.
Tesla Autopilot
As of November 2021, Tesla's advanced driver-assistance system (ADAS) Autopilot is classified as a Level 2.
On 20 January 2016, the first of five known fatal crashes of a Tesla with Autopilot occurred in China's Hubei province. According to China's 163.com news channel, this marked "China's first accidental death due to Tesla's automatic driving (system)". Initially, Tesla pointed out that the vehicle was so badly damaged from the impact that their recorder was not able to conclusively prove that the car had been on autopilot at the time; however, 163.com pointed out that other factors, such as the car's absolute failure to take any evasive actions prior to the high speed crash, and the driver's otherwise good driving record, seemed to indicate a strong likelihood that the car was on autopilot at the time. A similar fatal crash occurred four months later in Florida. In 2018, in a subsequent civil suit between the father of the driver killed and Tesla, Tesla did not deny that the car had been on autopilot at the time of the accident, and sent evidence to the victim's father documenting that fact.
The second known fatal accident involving a vehicle being driven by itself took place in Williston, Florida on 7 May 2016 while a Tesla Model S electric car was engaged in Autopilot mode. The occupant was killed in a crash with an 18-wheel tractor-trailer. On 28 June 2016 the US National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into the accident working with the Florida Highway Patrol. According to NHTSA, preliminary reports indicate the crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway, and the car failed to apply the brakes. The car continued to travel after passing under the truck's trailer. NHTSA's preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, which involved a population of an estimated 25,000 Model S cars. On 8 July 2016, NHTSA requested Tesla Motors provide the agency detailed information about the design, operation and testing of its Autopilot technology. The agency also requested details of all design changes and updates to Autopilot since its introduction, and Tesla's planned updates schedule for the next four months.
According to Tesla, "neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied." The car attempted to drive full speed under the trailer, "with the bottom of the trailer impacting the windshield of the Model S". Tesla also claimed that this was Tesla's first known Autopilot death in over 130 million miles (210 million kilometers) driven by its customers with Autopilot engaged; by this statement, however, Tesla was apparently declining to acknowledge claims that the January 2016 fatality in Hubei, China had also been the result of an Autopilot system error. According to Tesla there is a fatality every 94 million miles (151 million kilometers) among all types of vehicles in the US. However, this figure also includes fatalities from other kinds of crashes, for instance those involving motorcycle riders and pedestrians.
In July 2016, the US National Transportation Safety Board (NTSB) opened a formal investigation into the fatal accident that occurred while Autopilot was engaged. The NTSB is an investigative body that has the power to make only policy recommendations. An agency spokesman said "It's worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible." In January 2017, NHTSA released a report that concluded Tesla was not at fault; the investigation found that, for Tesla cars, the crash rate dropped by 40 percent after Autopilot was installed.
In 2021, the NTSB chair called on Tesla to change the design of its Autopilot to ensure it cannot be misused by drivers, according to a letter sent to the company's CEO.
Waymo
Waymo originated as a self-driving car project within Google. In August 2012, Google announced that their vehicles had completed over 300,000 automated-driving miles (500,000 km) accident-free, typically involving about a dozen cars on the road at any given time, and that they were starting to test with single drivers instead of in pairs. In late-May 2014, Google revealed a new prototype that had no steering wheel, gas pedal, or brake pedal, and was fully automated. As of March 2016, Google had test-driven their fleet in automated mode a total of 1,500,000 mi (2,400,000 km). In December 2016, Google Corporation announced that its technology would be spun off to a new company called Waymo, with both Google and Waymo becoming subsidiaries of a new parent company called Alphabet.
According to Google's accident reports as of early 2016, their test cars had been involved in 14 collisions, of which other drivers were at fault 13 times, although in 2016 the car's software caused a crash.
In June 2015, Google co-founder Sergey Brin confirmed that 12 vehicles had suffered collisions as of that date. Eight involved being rear-ended at a stop sign or traffic light, two involved being side-swiped by another driver, one involved another driver rolling through a stop sign, and one occurred while a Google employee was controlling the car manually. In July 2015, three Google employees suffered minor injuries when their vehicle was rear-ended by a car whose driver failed to brake at a traffic light. This was the first time that a collision resulted in injuries. On 14 February 2016 a Google vehicle attempted to avoid sandbags blocking its path. During the maneuver it struck a bus. Google stated, "In this case, we clearly bear some responsibility, because if our car hadn't moved, there wouldn't have been a collision." Google characterized the crash as a misunderstanding and a learning experience. No injuries were reported in the crash.
Uber's Advanced Technologies Group (ATG)
In March 2018, Elaine Herzberg died after being hit by a self-driving car being tested by Uber's Advanced Technologies Group in the US state of Arizona. There was a safety driver in the car. Herzberg was crossing the road about 400 feet from an intersection. This marked the first time an individual is known to have been killed by an autonomous vehicle, and the incident raised questions about regulation of the self-driving car industry. Some experts said a human driver could have avoided the fatal crash. Arizona governor Doug Ducey suspended the company's ability to test and operate its automated cars on public roadways, citing an "unquestionable failure" of the expectation that Uber make public safety its top priority. Uber then stopped self-driving tests in California until it was issued a new permit in 2020.
In May 2018, the US National Transportation Safety Board issued a preliminary report. The final report, issued 18 months later, determined that the immediate cause of the accident was the safety driver's failure to monitor the road because she was distracted by her phone, and that Uber ATG's "inadequate safety culture" contributed to the crash. Citing the post-mortem examination, the report noted that the victim had "a very high level" of methamphetamine in her body. The board also called on federal regulators to carry out a review before allowing automated test vehicles to operate on public roads.
In September 2020, the backup driver, Rafael Vasquez, was charged with negligent homicide because she had not looked at the road for several seconds while her phone was streaming the television show The Voice via Hulu. She pleaded not guilty and was released to await trial. Uber itself did not face criminal charges because, in the US, there was no basis for criminal liability on the part of the corporation. The safety driver was considered responsible for the accident because she was in the driver's seat and in a position to avoid it (as with a Level 3 system). The trial was planned for February 2021.
Navya Arma
On 9 November 2017, a Navya Arma automated self-driving bus carrying passengers was involved in a crash with a truck. The truck was found to be at fault, having reversed into the stationary automated bus. The automated bus did not take evasive action or apply defensive driving techniques such as flashing its headlights or sounding its horn. As one passenger commented, "The shuttle didn't have the ability to move back. The shuttle just stayed still."
NIO
On 12 August 2021, a 31-year-old Chinese man was killed after his NIO ES8 collided with a construction vehicle. NIO's self-driving feature was still in beta and could not yet handle static obstacles. Although the vehicle's manual clearly states that the driver must take over when approaching construction sites, the question raised was whether the feature had been improperly marketed and was unsafe. Lawyers for the deceased's family also questioned NIO's unsupervised access to the vehicle after the crash, arguing that this could allow the data to be tampered with.
Toyota e-Palette operation
On 26 August 2021, a Toyota e-Palette, a vehicle used to provide transport within the Athletes' Village at the Olympic and Paralympic Games Tokyo 2020, collided with a visually impaired pedestrian who was about to cross at a pedestrian crossing. The service was suspended after the accident and restarted on 31 August with improved safety measures.
Public opinion surveys
In a 2011 online survey of 2,006 US and UK consumers by Accenture, 49% said they would be comfortable using a "driverless car".
A 2012 survey of 17,400 vehicle owners by J.D. Power and Associates found that 37% initially said they would be interested in purchasing a "fully autonomous car". However, that figure dropped to 20% when respondents were told the technology would cost US$3,000 more.
In a 2012 survey of about 1,000 German drivers by automotive researcher Puls, 22% of the respondents had a positive attitude towards these cars, 10% were undecided, 44% were skeptical and 24% were hostile.
A 2013 survey of 1,500 consumers across 10 countries by Cisco Systems found 57% "stated they would be likely to ride in a car controlled entirely by technology that does not require a human driver", with Brazil, India and China the most willing to trust automated technology.
In a 2014 US telephone survey by Insurance.com, over three-quarters of licensed drivers said they would at least consider buying a self-driving car, rising to 86% if car insurance were cheaper. 31.7% said they would not continue to drive once an automated car was available instead.
In a February 2015 survey of top auto journalists, 46% predicted that either Tesla or Daimler would be first to market with a fully autonomous vehicle, while Daimler (at 38%) was predicted to offer the most functional, safe, and in-demand autonomous vehicle.
In 2015, a questionnaire survey by Delft University of Technology explored the opinions of 5,000 people from 109 countries on automated driving. Results showed that respondents, on average, found manual driving the most enjoyable mode of driving. 22% of the respondents did not want to spend any money on a fully automated driving system. Respondents were found to be most concerned about software hacking/misuse, and were also concerned about legal issues and safety. Finally, respondents from more developed countries (in terms of lower accident statistics, higher education, and higher income) were less comfortable with their vehicle transmitting data. The survey also asked about purchase interest: 37% of surveyed current car owners said they were either "definitely" or "probably" interested in purchasing an automated car.
In 2016, a survey in Germany examined the opinions of 1,603 people, representative of the German population in terms of age, gender, and education, towards partially, highly, and fully automated cars. Results showed that men and women differed in their willingness to use them: men felt less anxiety and more joy towards automated cars, whereas women showed the exact opposite. The gender difference in anxiety was especially pronounced between young men and women but decreased with participants' age.
A 2016 PwC survey of 1,584 people in the United States found that "66 percent of respondents said they think autonomous cars are probably smarter than the average human driver". Respondents remained worried about safety, particularly the possibility of the car being hacked. Nevertheless, only 13% of those interviewed saw no advantages in this new kind of car.
In 2017, Pew Research Center surveyed 4,135 US adults from 1–15 May and found that many Americans anticipate significant impacts from various automation technologies in the course of their lifetimes—from the widespread adoption of automated vehicles to the replacement of entire job categories with robot workers.
In 2019, results from two opinion surveys of 54 and 187 US adults, respectively, were published. A new standardized questionnaire, the autonomous vehicle acceptance model (AVAM), was developed, including additional descriptions to help respondents better understand the implications of different automation levels. Results showed that users were less accepting of high autonomy levels and reported significantly lower intention to use highly autonomous vehicles. Additionally, partial autonomy (regardless of level) was perceived as requiring uniformly higher driver engagement (use of hands, feet and eyes) than full autonomy.
Regulation
Regulation of self-driving cars is an increasingly important issue that spans multiple subtopics, among them self-driving car liability, regulations governing approval, and international conventions.
Anticipated launch
Between manually driven vehicles (SAE Level 0) and fully autonomous vehicles (SAE Level 5), there is a variety of vehicle types that can be described as having some degree of automation. These are collectively known as semi-automated vehicles. As it may be some time before the technology and infrastructure needed for full automation mature, vehicles are likely to gain increasing levels of automation incrementally. These semi-automated vehicles could potentially harness many of the advantages of fully automated vehicles while still keeping the driver in charge of the vehicle.
Anticipated Level 2
Tesla vehicles are equipped with hardware that Tesla claims will allow full self-driving in the future. In October 2020, Tesla released a "beta" version of its "Full Self-Driving" software to a small group of testers in the United States; however, this "Full Self-Driving" corresponds to Level 2 autonomy.
Anticipated Level 3
In 2017, BMW was trialling the 7 Series as an automated car on public urban motorways in the United States, Germany, and Israel, with the aim of commercializing it in 2021. Although that goal was not realized, BMW is still preparing the 7 Series for Level 3, aiming to become the next manufacturer to reach that level in the second half of 2022.
In September 2021, Stellantis presented its findings from a pilot programme testing Level 3 autonomous vehicles on public Italian highways. Its Highway Chauffeur system, which claims Level 3 capability, was tested on Maserati Ghibli and Fiat 500X prototypes. Stellantis plans to roll out Level 3 capability in its cars in 2024.
In January 2022, Polestar, a Volvo Cars brand, indicated its plan to offer a Level 3 autonomous driving system in the Polestar 3 SUV and the Volvo XC90's successor, with technologies from Luminar Technologies, Nvidia, and Zenseact.
As of February 2022, Hyundai Motor Company was enhancing the cybersecurity of its connected cars in order to put the Level 3 self-driving Genesis G90 on Korean roads.
Mercedes-Benz plans to submit an application in early 2023 for approval of its Level 3 Drive Pilot in California and Nevada, targeting approval by mid-2023.
Anticipated Level 4
In July 2020, Toyota started public demonstration rides in the TRI-P4, a Level 4-capable test vehicle based on the fifth-generation Lexus LS. In August 2021, Toyota operated a potentially Level 4 service using the e-Palette around the Tokyo 2020 Olympic Village.
In September 2020, Mercedes-Benz introduced the world's first commercial Level 4 automated valet parking (AVP) system, named Intelligent Park Pilot, for its new S-Class. The system can be pre-installed, but its use is conditional on future national legal approval.
In September 2021, Honda started a testing programme toward the launch of a Level 4 mobility service business in Japan, in collaboration with Cruise and General Motors and using the Cruise AV. In October 2021, at the World Congress on Intelligent Transport Systems, Honda stated that it was already testing Level 4 technology on a modified Legend Hybrid EX. At the end of that month, Honda explained that it was conducting a verification project for Level 4 technology on a test course in Tochigi prefecture, with public-road testing planned for early 2022.
In February 2022, General Motors and Cruise petitioned NHTSA for permission to build and deploy the Cruise Origin, a self-driving vehicle without human controls such as a steering wheel or brake pedals. The vehicle was developed with GM and Cruise investor Honda, and production is expected to begin in late 2022 at GM's Factory Zero in Detroit. As of April 2022, the petition was pending.
In April 2022, Honda unveiled its Level 4 mobility service partners, with plans to roll out the service in central Tokyo in the mid-2020s using the Cruise Origin.
Also in April 2022, Volkswagen began testing its autonomous ID. Buzz AD prototype with Argo AI on public roads, and in May 2022, Argo AI began testing on public roads in Austin and Miami using modified Ford Escape Hybrids.