The Ethics of Self-Driving Tech

The thread from @Digitaldavos95 about the adoption of electric trucks raised a question, which is as follows:

As self-driving technology continues to improve, how will an acceptable accident (and fatality) rate be determined?

The logical answer is that if fully autonomous self-driving technology results in a lower fatality rate, then of course it is better: it preserves human life and should be adopted right away.

For example, in 2022, 42,795 people (may God rest their souls) lost their lives in vehicle accidents in the U.S. Wouldn’t it be better to release fully autonomous self-driving tech if that figure could be significantly lowered, say to 35,000? Seems like a no-brainer.

But it raises an ethical question: how can technology be released to the public when it is certain to make tens of thousands of fatal mistakes?

There are also ethical decisions that the cars themselves will have to make in milliseconds. When an accident is unavoidable, the car might have to decide whom to hit. It is a dark topic, sorry.

Will AI consider the entire history of human ethical debate in a nanosecond and then make a judgment call? Possibly.
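At bottom, the "whom to hit" decision is an optimization problem: score each available maneuver and pick the least bad one. Here is a toy sketch of that idea; every name, probability, and severity weight is invented for illustration, and real planners are vastly more complex (and the ethical weighting itself is exactly what is in dispute):

```python
# Hypothetical sketch: rank candidate evasive maneuvers by expected harm.
# expected harm = probability of collision x severity weight (both made up here).

def expected_harm(maneuver):
    """Estimated harm of a maneuver: collision probability times severity."""
    return maneuver["p_collision"] * maneuver["severity"]

def choose_maneuver(maneuvers):
    """Pick the maneuver with the lowest expected harm."""
    return min(maneuvers, key=expected_harm)

options = [
    {"name": "brake_hard",   "p_collision": 0.9, "severity": 2},  # likely low-speed rear impact
    {"name": "swerve_left",  "p_collision": 0.4, "severity": 8},  # risks oncoming traffic
    {"name": "swerve_right", "p_collision": 0.2, "severity": 5},  # risks hitting a barrier
]

best = choose_maneuver(options)
print(best["name"])  # → swerve_right (0.2 * 5 = 1.0, the lowest expected harm)
```

The uncomfortable part is not the `min()` call; it is who assigns the severity weights, and on what ethical basis.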

Interesting @KLook
[Screenshot: table of autonomous-vehicle collision statistics]

@MFrank the data indicates that even at its current level of development, autonomous systems are much safer than human drivers. The table above shows that 99% of collisions involving autonomous vehicles are in fact caused by human error, specifically by the driver of the other vehicle.

An ethicist might say that eliminating human drivers as fast as possible is the right thing to do.

Two things that I wonder about in regard to ethics and automated vehicles: abdication of responsibility, and commercialization.

Abdication of responsibility: I remember one study showing an increased likelihood of crashes with increasing automation and “easy to use” features in cars. The point being that the more humans subconsciously feel like the driving experience has been “handled” for them (navigation system, anti-lock brakes, power steering, etc.), the less attention we are subconsciously prone to paying. As the argument goes, the decrease in attention and the increase in “safety” features go hand in hand.

Commercialization: As one comedian quipped, will the difference between the basic package and the premium package for a self-driving car’s software be that the premium version prioritizes your safety over that of others on the road? Will there be subtle ways in which more money means more safety, regardless of whether more safety actually NEEDS to cost more money?

@David-CKC-Fund At first I laughed, but it actually already happens. Certain braking and traction systems that add extra safety are not always standard; you have to pay more if you want extra-safe brakes.