"I was the logical choice. It calculated that I had a 45% chance of survival. Sarah only had an 11% chance. That was somebody's baby. 11% is more than enough. A human being would've known that."My GPS can be programmed to find me the closest restaurant, but not the closest restaurant that I like. Hypothetically, there's no reason why a GPS couldn't be able to do that. It just has to integrate preferences into its algorithm to trade off value and distance. So the GPS knows to direct me to FiveGuys instead of Arby's, but not if FiveGuys is more than 8.2 miles further away.
Google. Make this.
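Just to show I'm not asking for magic, here's roughly what I mean, sketched in a few lines of Python. The preference scores and the 4.1-miles-per-point exchange rate are numbers I made up so that Five Guys loses exactly when it's more than 8.2 miles out of the way; nothing here is a real GPS API.

```python
# A minimal sketch of a preference-aware GPS.
# Restaurant list, preference scores, and the miles_per_point tradeoff
# are all made-up numbers, not anything a real GPS exposes.

def pick_restaurant(options, preferences, miles_per_point=4.1):
    """Trade off how much I like a place against how far out of the way it is.

    options:         list of (name, extra_miles) tuples
    preferences:     dict mapping name -> how much I like it (0-10)
    miles_per_point: how many extra miles one point of preference is worth
    """
    def utility(option):
        name, extra_miles = option
        return preferences.get(name, 0) * miles_per_point - extra_miles

    return max(options, key=utility)

# Five Guys is worth 2 preference points more than Arby's to me,
# so at 4.1 miles per point I'll drive up to 8.2 extra miles for it.
options = [("Arby's", 0.0), ("Five Guys", 7.5)]
preferences = {"Arby's": 5, "Five Guys": 7}
print(pick_restaurant(options, preferences))  # ("Five Guys", 7.5)
```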
My GPS is stupid because it's 10 years old, it doesn't have access to my own personal indifference curves (or does it?), and what it would take to make it smarter in this way isn't worth the cost. The robots from the movie, however, are dealing with human lives, which is really important. It's also a more broadly recognized value to prioritize children over adults. This is an easy problem to solve.
I'm not sure I would give a 12-year-old girl 4x the moral value of a 36-year-old Will Smith. In fact, I know I wouldn't have saved the 12-year-old girl if I knew she only had an 11% chance of survival. Has Will Smith fully absorbed the reality that if the robot had chosen to save the girl, 9 times out of 10 both he and the girl would have died? Maybe he knows that if he were dead too, he wouldn't have to think about it, be all mopey all the time, and have an irrational prejudice against robots.
Here's what the robot taking the picture should see, juuuuust in case they all fall into the creek behind them:
If you think the problem with death is that you don't get to be alive anymore, then this isn't actually that difficult. A 12-year-old female can be expected to live 70 more life years. A 36-year-old male can be expected to live 42 more life years. If the robot had been designed to maximize life years, the guy in I, Robot should have been rated at 60% of the value of the 12-year-old girl. He says an 11% chance is "more than enough," but he had 4x her probability of survival, sooooo, not really.
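If you want to check my math, here's the expected-life-years arithmetic, using the movie's survival odds and my rough life-expectancy guesses from above (the 42 and 70 are ballpark actuarial figures, not anything official):

```python
# Expected life years saved, using the survival odds from the movie
# and the rough remaining-life-expectancy figures above.
will_smith = {"p_survival": 0.45, "remaining_years": 42}
sarah      = {"p_survival": 0.11, "remaining_years": 70}

for name, person in [("Will Smith", will_smith), ("Sarah", sarah)]:
    expected = person["p_survival"] * person["remaining_years"]
    print(f"{name}: {expected:.1f} expected life years")

# Will Smith: 18.9 expected life years
# Sarah:       7.7 expected life years
# Even valuing her years at face value, saving him wins.
```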
You might object that a robot would never be able to evaluate different lives perfectly. You'd be right. You might object that sometimes the robot would make a mistake and the wrong person would die. You'd be right about that too. But the question isn't whether a robot with an updated model of human life is perfect; the question is whether it's superior to a low-resolution model that treats every life as identical even though all humans agree they're not.
In the world of I, Robot, in the middle of a disaster, a robot can assess the probability of survival for potential victims on the fly - 11% for her, 45% for him - but for some ungodly reason it can't take into account the ages of said victims and make an updated judgment about who to save? I'm sorry, this is not a robot problem. This is a human problem. Most likely it's the human mistake of letting a surplus of human life years go to waste because it feels bad to attach numerical values to individuals.
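And here's a sketch of the "updated judgment" I'm talking about, assuming the robot already has its on-the-fly survival estimates. The flat 80-year life expectancy is a crude stand-in for a real actuarial table, and the decision rule is just the maximize-life-years idea from above:

```python
# The "updated judgment": same on-the-fly survival estimates the movie's
# robot already has, plus a crude age adjustment.
# remaining_years() is a stand-in for a real actuarial table.

def remaining_years(age, life_expectancy=80):
    return max(life_expectancy - age, 1)

def who_to_save(victims):
    """victims: list of (name, age, p_survival) tuples.
    Pick the rescue that maximizes expected remaining life years."""
    return max(victims, key=lambda v: v[2] * remaining_years(v[1]))

victims = [("Will Smith", 36, 0.45), ("Sarah", 12, 0.11)]
print(who_to_save(victims))  # ("Will Smith", 36, 0.45)
```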
But more on that later...