What Matters About New Self-Driving Car Guidelines

If there is one new technology that promises to affect virtually every American, it is the autonomous, or self-driving, vehicle. Everything from highway big rigs to taxis is under development by car and truck makers, and tests have begun on a variety of vehicles with autonomous technology.

The U.S. Department of Transportation (DOT) on Tuesday released its “Federal Automated Vehicles Policy,” designed to ensure that the development of self-driving (autonomous) vehicles proceeds with safety as the primary concern. The department’s guidelines diverge from the past practice of letting carmakers build their vehicles to meet federal safety rules, with regulators enforcing those standards only once the cars were sold to consumers.

This policy outlines a new process wherein automakers will need to provide documentation and information on 15 specific topics for their autonomous vehicles. According to Transportation Secretary Anthony Foxx, the policy guiding the development of autonomous vehicles “envisions greater transparency as DOT works with manufacturers to ensure that safety is appropriately addressed on the front end of development.”

As proposed, vehicle makers will complete a safety assessment outlining the procedures they used to meet the guidelines at the time their vehicles are ready for testing on public roads or for sale to the public. Here are the DOT’s guidelines for the specified 15 areas, along with a brief description of each. The full document is available at the DOT website.

Data Recording and Sharing

There should be a documented process for testing, validating and collecting event, incident and crash data, so that malfunctions, degradations or failures are recorded in a way that can be used to establish their cause. Data should be collected for both testing and operational purposes, including event reconstruction.
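
As a purely illustrative sketch (the policy describes what should be recorded, not how it should be structured), an incident record of the kind described above might capture the time, vehicle, system status and information useful for establishing a cause. Every field name and value here is a hypothetical assumption, not part of the DOT guidance.

```python
# Hypothetical sketch of an event/incident/crash data record.
# Field names are illustrative assumptions; the DOT policy specifies what
# should be recorded, not a data format.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class IncidentRecord:
    timestamp: datetime   # when the malfunction, degradation or failure occurred
    vehicle_id: str       # which vehicle was involved
    event_type: str       # e.g., "malfunction", "degradation", "crash"
    system_state: dict    # sensor and software status at the time of the event
    description: str      # details useful for establishing the cause

# Example record that could later support event reconstruction.
record = IncidentRecord(
    timestamp=datetime(2016, 9, 20, 14, 30),
    vehicle_id="TEST-001",
    event_type="degradation",
    system_state={"lidar": "ok", "camera": "obstructed"},
    description="Forward camera obstructed; vehicle handed control to safety driver.",
)
```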

Privacy

Manufacturers’ privacy policies should ensure transparency; choice; respect for context; minimization, de-identification and retention; data security; integrity and access; and accountability.

System Safety

A robust design and validation process should be followed, one based on a systems-engineering approach with the goal of designing autonomous systems free of unreasonable safety risks.

Vehicle Cybersecurity

A robust product development process should be based on a systems-engineering approach to minimize risks to safety, including those due to cybersecurity threats and vulnerabilities.

Human Machine Interface

Because the vehicle must be capable of accurately conveying information to the human driver regarding intentions and vehicle performance, as well as conveying information to other vehicles and pedestrians, automakers should have a documented process for assessment, testing and validation of the human-machine interface.

Crashworthiness

Autonomous vehicles are “expected to meet” crashworthiness standards, regardless of their claimed crash avoidance capabilities. Fully autonomous vehicles (with no driver at all) are held to similar standards.

Consumer Education and Training

Automakers should develop, document and maintain employee, dealer, distributor and consumer education and training programs to address the anticipated differences between the use and operation of autonomous vehicles and those of the conventional vehicles the public owns and operates today.

Registration and Certification

As autonomous vehicles add capabilities by means of new software, and as even some older vehicles are kitted out with autonomous functionality, it may be necessary to update a particular vehicle’s level of automation.

Post-Crash Behavior

This calls for a documented process for the assessment, testing and validation of how an autonomous vehicle is reinstated into service after being involved in a crash.

Federal, State and Local Laws

Manufacturers’ documented plans should detail how they intend to comply with all applicable federal, state and local laws.

Ethical Considerations

Manufacturers and other entities, working cooperatively with regulators and other stakeholders (e.g., drivers, passengers and vulnerable road users), should address these considerations to ensure that ethical judgments and decisions are made consciously and intentionally.

Operational Design Domain (ODD)

The ODD should describe the specific operating domain(s) in which the autonomous vehicle is designed to properly operate. The list should include roadway types, geographic areas, speed ranges, environmental conditions and other constraints.
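
By way of illustration only, a specification of the kind described above could be captured in a simple data structure. The policy does not prescribe any format; the field names and example values below are hypothetical assumptions.

```python
# Hypothetical sketch of an Operational Design Domain (ODD) specification.
# Field names and values are illustrative assumptions, not DOT requirements.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class OperationalDesignDomain:
    roadway_types: List[str]             # e.g., divided highways, urban streets
    geographic_areas: List[str]          # e.g., geofenced regions of operation
    speed_range_mph: Tuple[int, int]     # (minimum, maximum) operating speeds
    environmental_conditions: List[str]  # e.g., daylight, light rain
    other_constraints: List[str] = field(default_factory=list)

# Example: a vehicle designed for daytime highway driving in mild weather.
highway_odd = OperationalDesignDomain(
    roadway_types=["divided highway"],
    geographic_areas=["mapped interstate corridors"],
    speed_range_mph=(0, 70),
    environmental_conditions=["daylight", "dry or light rain"],
)
```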

Object and Event Detection and Response

This guideline refers to the detection by the driver or autonomous system of any circumstance that is relevant to the immediate driving task, as well as the implementation of the appropriate driver or autonomous system response to such circumstances.

Fall Back (Minimal Risk Condition)

There should be a documented process for transitioning to a minimal risk condition when a problem is encountered.
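
To illustrate the idea only, the fallback behavior described above can be thought of as a simple state transition: when a fault is detected, the system moves the vehicle toward a minimal risk condition such as a controlled stop. The states and actions below are hypothetical, not drawn from the policy.

```python
# Hypothetical sketch of a fallback-to-minimal-risk-condition transition.
# States and actions are illustrative assumptions, not DOT-specified behavior.
from enum import Enum, auto

class DrivingState(Enum):
    NORMAL_OPERATION = auto()
    FALLBACK = auto()       # problem detected, seeking a minimal risk condition
    MINIMAL_RISK = auto()   # e.g., stopped safely out of the travel lane

def handle_fault(current_state: DrivingState, fault_detected: bool) -> DrivingState:
    """Transition toward a minimal risk condition when a problem is encountered."""
    if fault_detected and current_state is DrivingState.NORMAL_OPERATION:
        return DrivingState.FALLBACK      # begin slowing and alerting any driver
    if current_state is DrivingState.FALLBACK:
        return DrivingState.MINIMAL_RISK  # complete a controlled stop
    return current_state
```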

Validation Methods

Manufacturers and other entities should develop tests and validation methods to ensure a high level of safety in the operation of their autonomous vehicles.

The devil, of course, is in the details, and the DOT expects those details to emerge as carmakers and the agency monitor and analyze data while the vehicles become more common on U.S. roads and highways.
