Musk pushes the boundaries in Tesla's autonomous driving campaign
Tesla Chief Executive Elon Musk often touts the arrival of completely autonomous vehicles as imminent, but exactly how close that future is for the electric automaker remains murky.
Meanwhile, the company is launching new features in a US regulatory environment that has often taken a laissez-faire approach to emerging technologies, while using terms like Full Self-Driving (FSD) that critics view as misleading.
Videos posted online by Tesla owners show erratic performance by "FSD Beta," the latest update to Tesla's driver-assistance system.
Cars can be seen turning awkwardly, knocking down safety cones and lurching unexpectedly.
Earlier this month, Tesla initiated a recall of some 54,000 vehicles equipped with FSD Beta to disable a feature that had allowed the cars to go through a stop sign without fully halting in certain situations.
The episode highlights a downside to Musk's envelope-pushing approach, which has also been credited with making electric vehicles a mainstream option in the United States and other markets.
"The rolling stop recall was not because of an honest mistake made in engineering, but rather a decision Tesla says was intentional to break traffic laws," said Phil Koopman, a professor at Carnegie Mellon University and a specialist on autonomous vehicles.
The National Highway Traffic Safety Administration (NHTSA) launched a probe last year following a series of collisions with first-response vehicles involving Teslas equipped with the company's "Autopilot" driver-assistance system.
"Tesla's doing a lot of things that tiptoe around violations of the Safety Act and a lot of marketing that inflates the consumer perspective of what their vehicles are capable," said Michael Brooks, acting executive director of the Center for Auto Safety.
- Shift under Biden -
Under US regulations, new vehicles are not systematically certified by safety officials before they hit the market. Rather, automakers must simply certify that the products comply with the rules.
The NHTSA only steps in if there is a problem with a vehicle that raises questions about its compliance, or if it is thought to be unsafe.
In some cases, regulators may not have any rules governing systems like adaptive cruise control, said Bryant Walker Smith, a specialist on law and mobility affiliated with Stanford Law School.
During Donald Trump's presidency, NHTSA avoided actions that slowed the development of driverless technology.
But after President Joe Biden took office, the NHTSA began to look more closely at the safety questions connected to driver-assistance programs.
In June 2021, the agency required Tesla and other auto manufacturers that make cars with driver-assistance or automated driving systems to report crashes.
It has also made repeated requests for information from Tesla and other automakers during its investigation into the crashes involving emergency vehicles.
"We continue to research new technologies, including the driver support features, and monitor their real-world performances," said a NHTSA spokesperson.
- 'Dangerous and irresponsible' -
Tesla now installs Autopilot, a system that can match a vehicle's speed to that of surrounding traffic and assist with steering, on all new vehicles.
The company also offers features such as automatic lane changes and parking assistance in packages called "Enhanced Autopilot" or "Full Self-Driving Capability," depending on the market.
Tesla describes the "auto-steering on city streets" feature as "upcoming."
However, the company has already started testing this function on about 60,000 vehicles that are authorized to download FSD Beta.
"While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car," Tesla says on its website.
Tesla has told California officials that its current systems are at "level 2" on the Society of Automotive Engineers scale of autonomy and therefore do not need to comply with rules for autonomous driving.
But Musk has said his ultimate aim is a vehicle that can operate without a driver, a goal that critics say has already been muddied by Tesla's use of terms like "Autopilot" and "Full Self-Driving."
"What it calls 'full self driving' literally needs a human driver," Smith said. "Tesla is really trying to have it both ways, in a way that is disingenuous and irresponsible."
Smith contrasted Tesla's approach with other companies such as Waymo, which have developed technologies that are further along on the autonomy scale with less fanfare.
He called on Tesla to employ technologies that ensure drivers are attentive, to avoid misleading consumers and to "act like a trustworthy company."
J.M.Gillet--JdB