Self-driving car

The Life or Death Decisions of Autonomous Vehicles

May 20, 2020 By Ben DuBose

Though not yet a common sight, autonomous vehicles are already on the streets, bringing the advantages of artificial intelligence (AI). The goal is to make driving decisions without the human tendencies toward distraction or impairment. But these vehicles must also make life-or-death decisions when an unexpected event occurs. On what basis are those critical decisions made? Can they be programmed into a software-driven car or truck?

Autonomous Vehicles Making Life or Death Decisions

Can moral decisions be programmed? If a self-driving car judges that a potentially fatal accident is imminent, does it choose to sacrifice its passengers or the pedestrians? Does the number in each group matter: two passengers or five pedestrians? What if the passengers include a child and the pedestrians are a group of elderly citizens? Should the car endanger its passengers to avoid hitting an animal? These and millions of other difficult scenarios have been discussed for years. There is even a website, the Moral Machine, created by the MIT Media Lab, where anyone is invited to make judgments in these situations. So far, people in over 200 countries have contributed.

What does the data show?

The Moral Machine has produced interesting data about the decisions people would make. In general, there is a consensus to save children over adults, yet in Far Eastern countries the elderly would be saved first. Different parts of the world (Western, Eastern, Southern) reach different conclusions, especially in complex situations.

Nicholas Evans, a philosophy professor at the University of Massachusetts, writes, “You could program a car to minimize the number of deaths or life-years lost in any situation, but then something counter-intuitive happens. When there’s a choice between a two-person car and you alone in your self-driving car, the result would be to run you off the road. People are much less likely to buy self-driving vehicles if they think theirs might kill them on purpose and be programmed to do so.” What people say in surveys often differs greatly from what they would want to happen if they, or their loved ones, were actually involved.
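To make Evans’s point concrete, here is a minimal, purely hypothetical sketch of a harm-minimizing policy. The function names, outcomes, and risk numbers are invented for illustration; no production vehicle decides anything this simply.

```python
# Hypothetical sketch of the "minimize expected deaths" policy Evans describes.
# All names and numbers below are invented for illustration only.

def expected_deaths(outcome):
    """Sum each affected person's estimated probability of a fatality."""
    return sum(person["fatality_risk"] for person in outcome["people"])

def choose_maneuver(outcomes):
    """Pick the maneuver whose expected death toll is lowest."""
    return min(outcomes, key=expected_deaths)

# Evans's example: a two-person car versus you alone in your self-driving car.
stay_in_lane = {
    "name": "stay in lane",
    "people": [
        {"who": "occupant of other car", "fatality_risk": 0.5},
        {"who": "occupant of other car", "fatality_risk": 0.5},
    ],
}
swerve_off_road = {
    "name": "swerve off the road",
    "people": [{"who": "you, the sole occupant", "fatality_risk": 0.6}],
}

best = choose_maneuver([stay_in_lane, swerve_off_road])
print(best["name"])  # "swerve off the road"
```

Under this purely utilitarian rule the car runs its own occupant off the road, which is exactly the counter-intuitive result Evans warns would discourage buyers.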

Though extremely difficult to program for so many scenarios, the AI must include some form of moral programming if autonomous vehicles are to be accepted, especially since there is no global consensus on the morality of any given situation. And in addition to life-or-death decisions, the vehicle is simultaneously computing routes, traffic, obstacles, speed, vehicle condition, and countless other parameters.

What is next?

The ultimate goal is to reduce accidents dramatically. That will not happen until nearly all vehicles are controlled by AI and can interact with one another. Even then, anyone who owns anything electronic knows there can be bugs and glitches, service outages, hackers, and the unknown.

When an accident does occur, liability will need to be decided. These computer-driven vehicles will be equipped with what are essentially “black boxes” that record the previous 30 seconds or so of data. That information will make it easier to reconstruct what occurred, but who is to blame? Which party in the vehicle’s development is responsible: the software developer, the vehicle manufacturer, the communication provider, or one of the many other vendors supplying parts?
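As an illustration of how such a recorder might work, here is a minimal sketch of a rolling “black box” buffer that keeps only the most recent 30 seconds or so of telemetry. The sampling rate, field names, and class name are assumptions for the example, not a description of any real system.

```python
from collections import deque

class EventDataRecorder:
    """Hypothetical vehicle 'black box': a ring buffer holding ~30 seconds of telemetry."""

    def __init__(self, seconds=30, samples_per_second=10):
        # A deque with maxlen automatically discards the oldest sample
        # once the buffer is full, so memory use stays fixed.
        self.buffer = deque(maxlen=seconds * samples_per_second)

    def record(self, sample):
        """Append one telemetry sample (speed, steering angle, braking, ...)."""
        self.buffer.append(sample)

    def snapshot(self):
        """Freeze the current contents, e.g. the moment a crash is detected."""
        return list(self.buffer)

# Example: record simulated samples, then take a post-crash snapshot.
recorder = EventDataRecorder()
for t in range(1000):
    recorder.record({"t": t, "speed_mph": 40, "steering_deg": 0.0})
crash_data = recorder.snapshot()
print(len(crash_data))  # 300 (only the last 30 seconds of samples survive)
```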

Much is still unknown, as this is a huge change in transportation globally. As more of these vehicles are introduced to our streets and highways, the data collected will lead to more answers, and probably more questions as well.


Uber Self-Driving Car Kills Pedestrian

April 5, 2018 By Ben DuBose

The first case of a pedestrian killed by a self-driving car occurred in Tempe, Arizona. The technology is still in the experimental phase; however, Uber self-driving cars are on the roads in Pittsburgh, San Francisco, and Toronto, as well as Tempe.

Are self-driving cars safer?

Increased safety was the original premise for unmanned self-driving cars. One of the primary causes of automobile accidents is distracted driving, and in a robotic car that should not be a problem. Obviously, though, there are still issues to be addressed: building systems that automatically react appropriately to unexpected situations is a difficult task.

What happened in Tempe?

In this case, the vehicle did have a human backup driver on board. It appears the car, a Volvo XC90, was traveling about 40 mph on dry streets in clear weather at 10 p.m. on a Sunday. An investigation is ongoing, since the safety driver did not appear impaired. The victim, a 49-year-old woman, was walking her bicycle in the street. Though this was the first accident involving a pedestrian, one of these autonomous vehicles collided with another car in March 2017.

Safety measures in place

Because states are eager to encourage testing, there are few regulations in place, not only for Uber but also for Waymo, Lyft, and Cruise, which is owned by General Motors. Doug Ducey, governor of Arizona, said in 2017, “We needed our message to Uber, Lyft and other entrepreneurs in Silicon Valley to be that Arizona was open to new ideas.” Arizona originally mandated a backup driver in the car, but that requirement was recently relaxed to allow testing of fully unmanned self-driving cars. The leniency is intended to boost the economy by maintaining a light regulatory environment.

California is on the cusp of allowing unmanned vehicles, but is still investigating the proposition. In its testing of manned self-driving cars, Waymo has produced statistics required by California. When a human has to take control, it’s called a disengagement. In just over a year, Waymo’s cars drove 350,000 miles with 63 disengagements. This averages approximately 5,600 miles between disengagement events, clearly demonstrating the need for more testing before widespread use of unmanned vehicles.
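The mileage figure quoted above is straightforward to verify; the short calculation below (a back-of-the-envelope check, not Waymo’s own methodology) reproduces it from the reported totals.

```python
# Back-of-the-envelope check of the disengagement rate cited for Waymo's California testing.
miles_driven = 350_000
disengagements = 63

miles_per_disengagement = miles_driven / disengagements
print(round(miles_per_disengagement))  # about 5,556 miles, i.e. roughly 5,600 miles between disengagements
```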

Michael Bennett, an associate research professor at Arizona State University, has been studying the public’s reaction to driverless cars and their artificial intelligence. His comments on the aftermath of the Tempe pedestrian incident reveal his conclusion: “We’ve imagined an event like this as a huge inflection point for the technology and the companies advocating for it. They’re going to have to do a lot to prove that the technology is safe.”

Whose Fault Is an Injury from a Self-Driving Car?

“Look for insurance companies to lobby for profound changes in state insurance requirements and state laws,” says Dallas lawyer Ben DuBose. “However, in the Arizona Uber incident, there was a backup driver in the car. So there’s still a question of why the backup driver didn’t disengage, and potential liability for her. People will still need auto insurance and will still have potential liability.”

At the same time, as we head into the self-driving car era, DuBose says to look for new legal theories to address autonomous car features. Automobile manufacturers and software designers will face new product liability claims. As even Volvo Car Corp. Vice President Anders Karrberg recently testified before Congress, “[c]armakers should take liability for any system in the car. So we have declared that if there is a malfunction to the [autonomous driving] system when operating autonomously, we would take the product liability.”


Are Self-Driving Cars Accidents Waiting to Happen?

July 26, 2016 By Ben DuBose

Most people already know that there was a fatality in May 2016, when a Tesla Model S drove under a left-turning tractor-trailer. There are indications that driver distraction may have been involved, but that is still unknown. This was the first fatality of any Level 2 self-driving car in autonomous mode, out of 130 million miles driven in that mode, but it was not the first accident. Self-driving cars are rated from Level 0 (no automatic driving features) to Level 4 (totally self-driving, with no driver or steering wheel required). The Level 2 car involved in the fatality did require the driver to agree, before activating Autopilot, that the feature “is an assist feature that requires you to keep your hands on the steering wheel at all times . . . to maintain control and responsibility for your vehicle while in use . . . and to be prepared to take over at any time.” This is not the self-driving, no-driver-needed vehicle of the future.

Why Develop Self-Driving Cars?

Here are a few of the numerous reasons given for developing smart vehicles:

  • Since 81% of car accidents are due to driver error, it is possible there would be fewer accidents with a computer in control.
  • Insurance costs might decrease since there is less chance for human error.
  • Drivers could better use commuting time reading, working, talking on the phone, or even sleeping.
  • Congestion, “rubber necking,” and other traffic problems caused by humans would be minimized through smart communication between all smart cars on the road, along with smart traffic signals.
  • Cars would probably be fueled by electricity, minimizing gasoline use.
  • Disabled individuals and elderly drivers would no longer have to depend on public transportation or other assistance.

So What is the Downside?

At least for the near future, there are a number of issues:

  • Costs could be significant and out of reach for most people.
  • A mix of standard and smart cars on the road decreases the benefits of smart communication. There will be a number of early adopters, but most people will still want control of their vehicle and will not trust the car with it.
  • Hackers could infiltrate the communication between smart cars, either across the whole system or in individual vehicles.
  • Heavy rain or snow would be a problem for the roof-mounted laser sensor, raising the question of what a driver could do in the event of a technology breakdown.
  • If a smart traffic light broke down and there was a police officer or other individual directing traffic, what would be the response of a self-driving vehicle?
  • These vehicles rely on GPS mapping, at least for now. GPS is not always accurate and could direct a car into oncoming traffic on a one-way street or into other dangerous maneuvers.
  • What about accidents? They will happen, as no technology is foolproof. When there is an accident, who is the responsible party? There are several potential culprits – drivers, software developers, or manufacturers.

These fully self-driving, smart vehicles will surely be on the road one day, but mass adoption may not come in the foreseeable future. There are still many issues to resolve, not only in the technology and the cars themselves, but with roadways, traffic signaling, a new set of rules of the road, and – most of all – the acceptance and knowledge of drivers themselves.

