
Experts in the auto industry predict that by the year 2020 there will be 10 million self-driving cars on the road. Automakers such as BMW, Mercedes-Benz, and Tesla have already introduced, or are currently developing, vehicles with self-driving features, and even tech companies such as Google have announced that they have self-driving cars in development.

However, auto safety advocates are calling for federal regulators to put the brakes on self-driving cars, at least temporarily, after two horrific car accidents involving Tesla models in which both drivers had the car's Autopilot system engaged.

In the first crash, a 40-year-old Ohio man was driving a 2015 Tesla Model S on US 27A in Williston, FL, with the vehicle in Autopilot mode. An 18-wheel tractor trailer was crossing the road, but the car's sensor system failed to distinguish the white trailer against the bright sky, and the car drove under the truck at full speed. The force of the impact tore the top of the Tesla off completely. The car did not stop until it had plowed through two fences and slammed into a utility pole. The driver was killed instantly.

The second self-driving car crash occurred a few days later on the Pennsylvania Turnpike. The driver, who was operating a Tesla Model X, told police the car was in Autopilot mode when it suddenly hit a guardrail on the right side of the highway. The car then crossed all the eastbound lanes, struck a concrete median, flipped over, and landed on its roof. Police did not release details on whether the driver or his passenger was injured in the crash.

The National Highway Traffic Safety Administration (NHTSA) is investigating both car crashes.

As of today, there are no federal regulations in place governing autonomous vehicles. The NHTSA plans to release safety guidelines for automakers to follow in manufacturing these vehicles. However, the law does not currently require manufacturers to follow any guidelines the agency issues.

This means that the automaker, not the government, decides whether its car is ready to drive off the dealership lot and onto the road. The only thing that might stop a carmaker from putting an unready self-driving model on the market is concern over liability in the event of an accident or death. Given the record number of recalls over the past several years for issues like defective ignition switches and defective airbags, as well as the revelations that automakers covered up known dangerous defects, can we really trust them to regulate themselves?

No matter what product is being manufactured, companies owe it to consumers to put only the safest products on the market. Our Virginia defective products lawyers have successfully represented many clients who have suffered tragic injuries, and even death, because a product manufacturer chose to put profits over safety.

The problem with self-driving cars is simply that the current state of computers, even the finest technology, cannot adapt to all of the dangerous situations that can arise in a second while you are driving a car, particularly at interstate speeds of 65 to 75 mph. If we adopted safety regulations limiting self-driving cars, and all other cars, to 25 mph, then with appropriate airbag protection you might survive almost any type of crash at that speed. But putting these cars on interstate highways while the driver scrolls through email, ignoring the operation of the car? We may well reach a level of computerization where that is possible, but it is a long way in the future.

One Comment

  1. James

    Autonomous vehicles open up a new side of liability, including questions of negligence and who is to blame.

    Don't you find it interesting that there has been zero effort on the government's part to implement policy or regulation on autonomous vehicles to prevent them from becoming weapons of destruction (aside from the cyber security issue)? Right now the automotive and technology companies are on a fast track to get autonomous vehicles on the streets, and the first consumer interaction will be the autonomous taxicab. Several federal government agencies have already stated that this is a problem, and NATO has even mentioned that ISIS is working on its own autonomous vehicle as a bomb delivery vehicle (a drone on wheels).

    Just how easy would it be for a lone bad actor or terrorist organization to call up an autonomous taxi, load it with explosives, punch in the coordinates, shut the door, and send the vehicle to its destination? It is a low-cost approach for bad actors to cause major harm to society. Now take that same scenario and scale it to a coordinated attack like those in Paris or Brussels. What would the impact be on a major city (New York or Washington, DC) if 7-10 autonomous taxicabs were loaded with explosives or other hazardous materials and spread out across the city to major infrastructure? It seems logical that the government should require all autonomous vehicles to have sensors that detect hazardous or WMD (explosive) materials and, once such materials are detected, disable certain autonomous features.

    However, there are zero policies or regulations addressing this, and for the most part the government has been sound asleep on the issue. Granted, cyber security, vehicle safety, and data privacy are important and should be addressed, but closing off the possibility that someone could use an autonomous vehicle as a weapon of destruction should be a priority for everyone. So are the automotive and technology companies to be held grossly negligent and liable, since they are fully aware a safety issue exists and have done nothing to mitigate it, or is the government responsible?
