The only way I will accept it is if the board members are all held to the same degree of responsibility as a driver on the road. Your AI car ran someone over? There goes the company's driver's license for a few years… Good luck keeping your AI car company afloat in that time.
That would collapse the industry overnight. Who in their right mind would take on that risk?!
I want self-driving vehicles someday. They are incapable of road rage, getting drunk, etc. There’s going to be risk no matter what, but once this tech is truly mature, that risk is going to be far below what we see from humans. I foresee a time when you have to pay extra insurance to self drive, and it may even eventually be banned outside of closed tracks.
You won’t accept an AI car killing someone, but you’re OK with losing ~41,000 people a year to human drivers? If the former upsets you, the latter should enrage you. (American numbers; probably better in places like Europe, with fewer drivers and shorter distances.)
That’s a great idea.