Self-driving car accidents are causing a crisis of confidence

Human trust is a delicate thing. Although the capacity for it is innate, trust itself must be accumulated gradually over time, and once lost it takes a long time to regain. Trust matters especially for safety-critical technologies such as self-driving cars. Often, a single self-driving accident becomes common knowledge, even if the cars involved have passed safety tests many times.

Last year’s Uber crash and the recent Tesla crashes were far from being merely corporate issues; they triggered an industry-wide crisis of trust.

New technologies bring new challenges to functional safety and cybersecurity

An investigation into one crash found that the Tesla had repeatedly lost control on a particular section of highway before ultimately hitting a concrete barrier. On earlier passes the driver was able to take manual control of the vehicle, but on the final pass the driver was distracted, and the crash occurred. More worrying still, the car’s speed jumped from 62 mph to 71 mph just before it hit the barrier.

Another accident was investigated by the National Transportation Safety Board (NTSB). The accident, a fatal collision between a car with self-driving capabilities and a slow-moving truck, was also caused in part by the driver’s failure to regain control of the vehicle in time. In addition, the NTSB noted that in a previous incident, “the automatic emergency braking (AEB) system was only able to recognize the rear of other vehicles, in part because radar-based systems had difficulty distinguishing between objects on the road and those on the side of the road.”

This is the challenge facing current self-driving technology. Rather than focusing on driving, drivers place too much responsibility on low-level autonomous functions, pushing the technology beyond its capabilities and ultimately causing irreparable harm. In fact, whether in an autonomous vehicle or a conventional one, inattentive driving can be life-threatening.

There are two reasons for this misconception about autonomous driving. On the one hand, some companies overstate the self-driving capabilities of their cars, putting consumers at risk with a misguided and dangerous message. On the other hand, some companies do not have a clear understanding of what autonomous driving can do. Low-level autonomy is meant to augment, not replace, human drivers: it helps with tasks humans are poor at, such as maintaining focus for long periods or checking every blind spot.

However, there are some things humans still do better than cars, such as situational understanding and object recognition. Even so, people are more inclined to blame the technology than the humans involved, and the resulting erosion of trust has affected the entire automotive industry.

Distrust or lack of trust

Another reason consumers lose trust is the frequency of cybersecurity incidents. If a self-driving function malfunctions because of a cyberattack, the resulting publicity could bring self-driving technology to a complete standstill. Even when the driver is in manual control, these highly connected vehicles can become infected with malware when they connect to mobile devices, download traffic reports, or install software updates for potential maintenance issues, with potentially catastrophic consequences.
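One standard defense against the malware-via-update path described above is to verify every software image before it is installed. The sketch below is a minimal illustration, not any vendor's actual implementation: the function name `verify_update` and the use of a shared HMAC key are assumptions for the example (production automotive update systems typically use asymmetric signatures and hardware-backed keys).

```python
import hashlib
import hmac


def verify_update(firmware: bytes, expected_digest: str, key: bytes, tag: str) -> bool:
    """Reject an update unless two checks pass.

    1. The firmware image's SHA-256 matches the digest listed in the manifest,
       so an image altered in transit is refused.
    2. The manifest digest itself carries a valid HMAC tag, so a tampered
       manifest cannot simply advertise the attacker's own hash.
    """
    digest = hashlib.sha256(firmware).hexdigest()
    if not hmac.compare_digest(digest, expected_digest):
        return False  # firmware image does not match the manifest
    manifest_tag = hmac.new(key, expected_digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(manifest_tag, tag)
```

Note the use of `hmac.compare_digest` for both comparisons: a constant-time comparison avoids leaking, via response timing, how many leading characters of a forged tag were correct.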

Many cyberattack scenarios come to people’s minds: hackers manipulating cars to collide with each other, or hijacking cars on a highway to block traffic. In reality, attackers are more likely to use malware to steal payment credentials stored in the car’s system, which enable automatic payments at businesses where drivers shop without leaving the car, such as gas stations, drive-thru restaurants, and car washes.

There is also an almost unavoidable scenario in which marketing data-collection companies monitor vehicle owners’ communications to learn when and where they travel, how long they stay, what they listen to or watch, and more.

Adapting current security solutions to new technologies

Connected and autonomous vehicles therefore require the same kinds of security technologies used elsewhere, such as firewalls, endpoint detection and response (EDR), and distributed ledger technology (DLT). The sheer number of connected endpoints in modern cars brings new safety hazards, but one question cannot be avoided: can current security solutions be applied effectively to self-driving cars?

A self-driving car weighs more than a ton and can travel at over 100 mph, yet from a security standpoint it is essentially a large network on wheels. As a practical matter, the systems that make up a vehicle vary widely in nature, and security providers must work closely with automakers to develop advanced safety systems designed specifically for self-driving cars.

While ensuring the functional safety and cybersecurity of self-driving cars is a huge challenge, it is reassuring that many leading technologists have already put some effort into addressing these issues.

BlackBerry recently released The Road to Mobility: A 2020 Guide to Trends and Technologies for Smart Cities and Transportation, which explores some of the key questions we should consider as we enter the world of autonomous vehicles, including:

· “Barriers and Solutions to Vehicle Electrification,” by Austin Brown, Executive Director, Institute for Energy, Economic, and Environmental Policy, University of California, Davis.

· “The Challenges of Smart Mobility and Smart Cities,” by Roger Lanctot, Associate Director, Global Automotive Business Unit, Strategy Analytics.

· “Regulatory Policy, Functional Safety and Cybersecurity for Connected and Autonomous Vehicles,” by Parham Eftekhari, Executive Director of the Institute for Critical Infrastructure Technology (ICIT), a leading US cybersecurity think tank.

The Guide also includes articles written by experts and scholars from Auto-ISAC, ITSA, Carnegie Mellon, the Cyber Future Foundation, and more.

Functional Safety, Cybersecurity and Trust

Future mobility is one of the most exciting technology fields, and it continues to grow and advance rapidly.

If the industry as a whole is to build and maintain the trust required for these technologies to be adopted, functional safety and cybersecurity concerns must be at the forefront from the start and throughout development and production. Functional safety, cybersecurity, and trust are fundamental to this entire effort, and none can be treated as secondary.

Author: Jeff Davis

Jeff joined BlackBerry in February 2019 and is primarily responsible for strategic innovation and development in the intelligent transportation market. Jeff has over 15 years of technology and project development experience in the defense and transportation sectors. He has developed projects focused on cybersecurity, mobility and connectivity, with a particular focus on human interaction with advanced technologies and new concepts.