
Legal Questions Remain as Waymo Begins Testing Self-Driving Vehicles in NYC

Last Updated: September 18, 2025


Reviewed By: Christopher DiBella


MANHATTAN, NEW YORK (September 14, 2025) – Waymo, formerly known as the Google Self-Driving Car Project, is set to begin testing self-driving vehicles in parts of Manhattan and Brooklyn. The move follows approval from Mayor Eric Adams and the New York City Department of Transportation (DOT), which granted Waymo a waiver to test a limited number of vehicles. After this pilot testing period, Waymo can apply for additional testing permissions.

Key Conditions of the Pilot Test:

  • No Public Access: Members of the public will not be allowed to use the cars for transportation during the testing phase.
  • Driver Presence: A trained safety driver must sit in the front seat at all times.
  • Collaboration with DOT: Waymo must coordinate closely with the DOT throughout the testing process.

Other U.S. cities, including Phoenix, San Francisco, Los Angeles, and Boston, have already begun testing or operating self-driving vehicles. Companies like Tesla sell vehicles with varying levels of self-driving capability in all 50 states and plan to offer fully autonomous rideshare services in the future.

While the technology holds great promise, self-driving vehicles have already been involved in numerous crashes, raising concerns about liability and accountability. As self-driving technology advances, the legal framework remains inconsistent, with varying state and local laws. This patchwork of regulations poses challenges for those involved in accidents with these vehicles. Despite these concerns, tech companies continue to put profits over consumer safety and push to offer self-driving vehicles as a mainstream commercial service.

Who Can Be Held Responsible in Autonomous Vehicle Crashes?

As self-driving vehicles become more mainstream, there are real questions about how different lawsuits will move forward. Some attorneys see promise in pursuing negligence, failure-to-warn, and product liability claims, and one systems engineer has argued that lawyers should pursue gross negligence and fraud claims. There is also the question of which parties may bear legal responsibility for a crash:

  • Other Drivers: Many crashes involving self-driving cars are caused by the negligence of other drivers, potentially leading to personal injury claims.
  • Software Developers: If faulty software leads to an accident, the company that developed the software could be held liable.
  • Vehicle Manufacturers: A crash due to a malfunctioning sensor or other vehicle components could result in a product liability claim.

However, given the complexity of self-driving technology, explaining algorithm malfunctions or engineering defects clearly to a jury will likely present challenges for trial lawyers.

A History of Crashes Involving Self-Driving Vehicles

According to Consumer Affairs, between July 2021 and December 15, 2023, there were 508 reported autonomous vehicle crashes in the U.S., along with 1,239 crashes involving a Level 2 automation system such as Tesla's Autopilot.

Despite their promise, self-driving vehicles have been involved in serious accidents:

  • Fatal Incidents: In 2018, Elaine Herzberg tragically died when an Uber self-driving test vehicle struck her while she was crossing the street with a bicycle in Tempe, Arizona.
  • Tesla Incidents: In addition to many reports across the U.S. of Teslas driving into buildings or onto pedestrian sidewalks, Tesla settled a lawsuit after an engineer died in a crash involving one of its vehicles.

While these cars generally perform well in ideal conditions, they struggle in complex or unexpected driving situations. One humorous incident involved Waymo cars repeatedly honking at each other, causing a traffic jam in a parking lot.

Shortcomings of Self-Driving Vehicles

Self-driving vehicles rely on a suite of sensors and algorithms to make driving decisions. Those sensors, and the AI systems that interpret their data, can misread the environment, which can lead to delayed or incorrect decisions behind the wheel.

A few widely reported shortcomings of self-driving technology include:

  • Sensor Issues: Sensors can be impaired by weather conditions like rain, snow, or fog.
  • Inability to Adapt to Unique Situations: Self-driving systems often struggle with unusual situations, such as a plastic bag floating in the wind or sudden debris in the vehicle's path.
  • System Malfunctions: Like any technology, self-driving systems can experience technical malfunctions.
  • Predicting Human Behavior: Self-driving cars often struggle to predict how humans will behave on the road.

Legal Help for Victims of Self-Driving Car Accidents

DiBella Law is dedicated to helping individuals who have been injured in self-driving vehicle accidents across Massachusetts. As self-driving cars become more common, we are ready to do whatever it takes to hold the responsible companies accountable for their negligence.

If you or someone you know has been involved in a self-driving car accident, we’re here to help. Our Boston personal injury attorneys can guide you through the legal process. Contact us anytime for a free legal consultation.

Source: Reuters – Waymo Gets Permit to Test in NYC