
“Eyes-off” driving is coming, and we are not ready for it
By Andrew J. Hawkins | Published: 2025-11-03 21:27:00 | Source: The Verge
Last month, General Motors added its name to the growing list of automakers pursuing a new type of partially automated technology known as “eyes-off” driving. What none of these companies has provided is a comprehensive explanation of who takes responsibility when things go wrong.
Not to be confused with the kind of distracted driving that many people seem to practice these days, GM’s system would be a step toward the automaker’s ultimate goal of selling privately owned, fully autonomous vehicles. Some GM vehicles already include the company’s Super Cruise system, which allows drivers to take their hands off the wheel but uses eye-tracking technology to make sure they keep their eyes on the road. The new system sits at Level 3 on the six-level autonomy scale and will allow drivers to take their hands off the steering wheel and their eyes off the road on some US highways.
GM says it aims to bring its Level 3 system to market by 2028, starting with the Cadillac Escalade IQ. From there, the technology will likely spread to the automaker’s other brands, such as Chevy, Buick, and GMC. Soon, drivers will be able to look at their phones without shame or the risk of a traffic violation. In some cases, they may even be encouraged to play video games or watch YouTube while the car drives itself.
But only sometimes. Importantly, in a Level 3 system, drivers still need to be prepared to take back control of the vehicle when asked to do so. If they fail to do so quickly enough, they may be held liable when something goes wrong. And when it comes to driving in today’s world, something always goes wrong.
“With conditional automation, Level 3 automation, things get a lot messier,” said Dr. Alexandra Mueller, a senior research scientist at the Insurance Institute for Highway Safety. “And that’s where I think a lot of the concerns come from, because there’s a lot that we simply don’t know when it comes to Level 3 driving automation.”
The uncertainty becomes even more troubling as you go down the list of automakers actively pursuing this technology. In addition to GM, Ford, Stellantis (Jeep’s parent company), and Honda have all placed their chips on Level 3. Mercedes-Benz already has a Level 3 system, which it calls Drive Pilot, but its use is only legal on select highways in California and Nevada.
Herein lies the catch. The industry is actively planning to release new technology that remains largely prohibited in most places. Germany and Japan have carved out limited allowances for BMW and Honda, respectively. But for now, Level 3 is tightly restricted, and it will likely stay that way until lawmakers figure it out.
It is a thorny problem for regulators. How do you assign responsibility in a system that can bounce back and forth between an automated driving system and a human driver? In the case of Drive Pilot, Mercedes says it will accept liability for crashes caused by its technology when the system is active. But that acceptance is conditional, and the driver is still liable if they fail to take control of the vehicle when instructed to do so or if they misuse the system.
Tesla is already using this ambiguity to its advantage with its Level 2 systems, Autopilot and Full Self-Driving. An investigation into dozens of Tesla-related crashes found that Autopilot would disengage “less than one second” before impact. Investigators found no evidence that Tesla was trying to shirk its responsibility, but it certainly doesn’t look good for the company.
Companies can also use the sensors that guide these systems, such as cameras, infrared trackers, and torque sensors, to provide evidence, in the event of a crash, about who was in control and when. At the event announcing the new eyes-off system, GM CEO Mary Barra pointed to the growing number of sensors as something that could vindicate the company in these cases. “We will have so much more sensing that we will pretty much know exactly what happened,” she said when asked about liability concerns related to Level 3 automation. “And I think you’ve seen GM, you know, always take responsibility for what we need.”
The definition of Level 3 itself contains a contradiction: drivers are told they can disengage, but they must also remain available to quickly re-engage. When transitions are planned, such as when a driver is entering or leaving a designated area, the handover process should be smooth. But unexpected events, such as sudden changes in weather or road conditions, can trip these systems up. Research has shown that humans generally struggle with this kind of “out of the loop” task recovery.
When people have not been driving for an extended period, they may overreact when they suddenly have to take control in an emergency. They may overcorrect the steering, hit the brakes too hard, or fail to respond properly because they haven’t been paying attention. These actions can create a domino effect that can be dangerous, even fatal.
“The mixed fleet scenario, which will likely continue long after our lifetimes, provides a largely uncontrolled environment in which many highly automated systems and even partially and conditionally automated systems will struggle,” Mueller said. “And they will struggle with it indefinitely because frankly we live in a very chaotic and dynamic environment where things are constantly changing.”
We are already beginning to see case law emerge that places the onus on the human driver rather than the automated system.
In Arizona, the safety driver of a self-driving Uber test vehicle pleaded guilty after being charged with negligent homicide in a fatal 2018 crash that occurred while the autonomous system was operating. Before that, a Tesla driver pleaded no contest to vehicular manslaughter charges over two deaths resulting from a crash that took place while the company’s Autopilot system was in use. In both cases, prosecutors brought criminal charges against the human behind the wheel, on the theory that, even with an automated system engaged, the driver is ultimately responsible for the vehicle.
Automakers will likely be happy with the outcomes of those cases. But other cases have found that a car company can share responsibility when something goes wrong. Take, for example, a recent jury verdict in Florida, where Tesla was held partially liable for a crash that killed one person and severely injured another. In that case, the owner of the Model S who was using Autopilot was also found partly at fault, but it was Elon Musk’s company that was ordered to pay $243 million in damages.
Mike Nelson, a mobility lawyer, points out that legal precedent for automation-related crashes is still in its early stages. Cases involving Level 2 systems will be used to inform judgments about Level 3 and beyond. But judges, lawyers, and jurors tend to lack technical expertise, so the path ahead will be fundamentally unpredictable.
As we move into this chaotic middle period, when human drivers find themselves sharing the road with more and more robots, automakers would do well to be as transparent as possible, Nelson said. The reason? Juries tend to like it when companies don’t try to cover up their misdeeds.
“I’m not happy about this mess, but it’s not unexpected,” Nelson said. “This has happened every time we’ve had an industrial revolution.”





