Self-driving cars on the road


Comment: Are self-driving cars one day going to have to decide who to kill?

During a recent running monologue, the topic of self-driving cars one day having to make very difficult decisions was high on the agenda

6:46 am, March 5, 2022

I run every weekend with a good friend of mine. It’s not really to keep fit, more McDonald’s offsetting as I’m really rather partial to a chicken nugget.

I reason with myself that a 10km run makes 20 nuggets, a triple cheeseburger, chicken sandwich meal and a McFlurry essentially calorie neutral.

While we attempt to improve our dad bods, conversation topics are varied – nothing is off limits. During one rather long plod, we ended up talking about cryptocurrencies and then goaded each other into wasting money on some Bitretheum.


Charting the ups and downs of our £200 ‘investment’ gave us much to talk about – mostly about how stupid we were buying it in the first place.

Another of our favourite topics is cars.

Here, musings can range from why people want to buy Vauxhall Zafiras for 60 per cent more than they were worth a year ago (answer: they're lunatics) to an ongoing, missing-person-style hunt for a reasonably priced 4×4 for my running companion.


It was some time last year, on a particularly soggy jog, that we chatted about the Mercedes Marco Polo my friend had bought after lockdown and how it was likely by then to be worth more than he'd paid for it.

A few hours later, he'd decided to sell it; a couple of days after that, he'd banked a £5k profit. That was one of the run's more successful topics.


One of our run’s more successful topics – a £5k profit on a Marco Polo

Anyway, this weekend’s chat was one that got my head pounding – and this time, it wasn’t anything to do with not drinking enough water. 

Essentially we decided that sooner or later cars are going to have to make a decision whether to kill you or not. Bear with me…

I read a fascinating interview with a leading expert in artificial intelligence in The Times recently (which started our train of thought).

Mo Gawdat is a seriously bright brain who worked for Google X, the search giant's weird blue-sky thinking lab that dabbles in future tech.

I’m going to truncate the fantastic interview: After working on AI-powered robot arms, Mo realised that one day robots could try and kill us. (Presumably, with strong handshakes, but that’s not the point).

Ok, I've simplified that a bit, but he thinks that in the not-too-distant future artificial intelligence will eclipse humans as the smartest consciousness on the planet. And when that happens, if we haven't programmed it correctly from the beginning, it could very well try to kill us.

Yes, I know this sounds like the plot of Terminator, and believe me Mo is well aware of that, but the points he makes are scary and real. He’s so worried, he’s written a book about it.


He’s not the only one who’s concerned.

Other great minds who work closely with the tech are worried too, including fellow author Stuart Russell. He's convinced AI is something we need to worry about and that those developing it are 'in denial'.

I’ve read plenty of AI experts’ opinions that unless AI is properly programmed from the start it could make decisions that would ultimately lead to our demise. 

For example, said one, ask AI to find a cure for cancer and it may decide it needs a larger sample size, and so give everyone a tumour. Extreme, yes, but so too was the Terminator.

Anyway, back to the run and cars deciding to kill us. My running partner has worked on robotic boats so actually knows the AI subject pretty well. I had a Tamagotchi, so don't.

The AI that runs the self-driving cars of the future will have to make decisions, sometimes at lightning speed, faster than a human could ever compute them.

Hypothetically, you're travelling in a fully autonomous self-driving car at a speed too fast to stop, approaching two stationary vehicles you will crash into.

On one side of the road is a school bus with 10 children in it, on the other a coach load of grannies on a day trip. How does the car decide which to crash into? Or does it decide to crash you into a wall instead, because there's only one of you and 20 of them?

These decisions will have to be made by the car – the driver will have no input – they’ll be no different to a passenger in a plane in free fall.

Somewhere along the line, that car's 'brain' will have to have been programmed with decision-making criteria so it knows how to deal with the sort of things we never want to think about.

Including whether to run down a coach load of grannies or protect the driver at all costs.
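Just to make the dilemma concrete, here's a deliberately crude toy sketch in Python of the kind of 'decision criteria' we were imagining on the run – a made-up, harm-minimising rule with invented names and numbers, and nothing at all like how real autonomous-vehicle software is actually written:

# Purely hypothetical toy illustration of 'decision criteria' for this
# column's thought experiment – not real autonomous-vehicle code.
from dataclasses import dataclass
from typing import List

@dataclass
class Option:
    name: str            # e.g. "school bus", "coach", "wall"
    people_at_risk: int  # invented headcount affected by choosing this option

def choose_least_harm(options: List[Option]) -> Option:
    """Pick the option putting the fewest people at risk (a crude, invented rule)."""
    return min(options, key=lambda o: o.people_at_risk)

scenario = [
    Option("school bus", 10),
    Option("coach of grannies", 20),
    Option("wall (driver only)", 1),
]
print(choose_least_harm(scenario).name)  # prints: wall (driver only)

Even this crude rule quietly decides the driver is the expendable one – which is exactly the uncomfortable bit.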

Fortunately, the people building this tech are far smarter than two middle-aged dad joggers and have probably thought about all this already.

But what if they haven’t? Maybe we should stop thinking about future technology and go back to the hunt for a sensibly priced 4×4. The latter appears to be just as challenging at the moment.

This column first appeared in the latest issue of Car Dealer Magazine, which you can read in full below.


James is the founder and editor-in-chief of Car Dealer Magazine, and CEO of parent company Baize Group. James has been a motoring journalist for more than 20 years, writing about cars and the car industry.



