Machine Morality

August 2014
By Bryan Bergeron

Machine intelligence that is in some way superior to human intelligence is often touted as the ultimate goal of AI research and development. Machines have long been capable of making decisions and, in many cases, those decisions are superior to the ones made by average humans. A common GPS wouldn’t pass the Turing Test, but if I were lost in some big city, I’d refer to it before asking a random biped on the street.

In medicine, cardiac monitoring machines have been able to interpret EKG waveforms with excellent accuracy for decades. And yet, humans have remained in the loop. One reason is to be doubly certain that the machine-rendered diagnosis is correct. Another is legal liability. Interestingly, morality isn’t an issue.

When it comes to autonomous weaponry — whether a heat-seeking missile or a drone equipped with optical recognition — machines are at least as capable as human operators. However, at least publicly, drones and other autonomous and semi-autonomous machines all have a human in the loop — not because of limited “intelligence,” but for moral reasons.

I think human morality in the loop is a stopgap measure. Today, it’s politically correct. Tomorrow, humans will be so outnumbered by autonomous machines — including autonomous weapons — that it won’t be possible to have a human in every kill-or-no-kill decision loop. I suppose that will be the time of the autonomous “terminator.”

Similarly, in medicine, in a resource-limited situation with all else being equal, should a robotic triage nurse attend to the old woman or the young child first? Or what of the robotic surgeon performing an operation on a pregnant woman with complications? Should the robot save the woman or the child?

The way I see it, intelligence is a minor hurdle. So, what’s the point of an intelligent machine that lacks a moral compass? It may not matter if the machine is operating, say, the power grid – unless, of course, there’s a decision to be made about where to divert power in an emergency. The local preschool? Hospital? Shopping mall?
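To make that concrete: in software, a decision like that usually reduces to a priority table someone had to write. Here’s a minimal C++ sketch (the facility names and priority numbers are purely my own illustrative choices, not anything from a real grid controller) showing that the machine’s “moral compass” is nothing more than an ordering a programmer typed in:

    // Hypothetical example: the "moral" choice of where to divert power
    // in an emergency is just a hard-coded priority table.
    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Facility {
        std::string name;
        int priority;  // lower = served first; chosen by a human, not the machine
    };

    int main() {
        // The machine's entire "moral compass" lives in these numbers.
        std::vector<Facility> grid = {
            {"hospital", 1},
            {"preschool", 2},
            {"shopping mall", 3},
        };
        std::sort(grid.begin(), grid.end(),
                  [](const Facility& a, const Facility& b) {
                      return a.priority < b.priority;
                  });
        std::cout << "Divert remaining power to: " << grid.front().name << "\n";
        return 0;
    }

Change the three integers and the machine’s “ethics” change with them. That’s the point: the values aren’t discovered by the machine; they’re installed.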

To my knowledge, there’s no Turing Test for morality. Part of the reason is that, slippery as intelligence is, it has at least been defined. Morality, on the other hand, is intertwined with local customs, religion, and politics. For example, one culture might revere the elderly at the expense of the youth, while another considers the elderly overhead that has to be dealt with. Then, there’s war.

Just something to ponder when you’re programming your carpet crawler to autonomously avoid a wall — something a mouse does without thinking. SV
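For the bench-top version of that closing thought, here’s a minimal Arduino-style wall-avoidance sketch. The HC-SR04 ultrasonic sensor, the pin assignments, and the 20 cm threshold are all assumptions on my part, not a specific published design; notice that even here, every number is a value judgment of sorts, just a trivially low-stakes one:

    // Assumed hardware: HC-SR04 on pins 9/10, motor driver enables on 5/6.
    const int TRIG_PIN = 9;
    const int ECHO_PIN = 10;
    const int LEFT_MOTOR = 5;
    const int RIGHT_MOTOR = 6;

    void setup() {
      pinMode(TRIG_PIN, OUTPUT);
      pinMode(ECHO_PIN, INPUT);
      pinMode(LEFT_MOTOR, OUTPUT);
      pinMode(RIGHT_MOTOR, OUTPUT);
    }

    long distanceCm() {
      // Fire a 10 us trigger pulse, then time the echo.
      // Sound travels roughly 29 us/cm, so the round trip is ~58 us/cm.
      digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
      digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
      digitalWrite(TRIG_PIN, LOW);
      long us = pulseIn(ECHO_PIN, HIGH, 30000);  // timeout means no echo
      return us == 0 ? 999 : us / 58;            // treat no echo as open floor
    }

    void loop() {
      if (distanceCm() < 20) {
        // Wall ahead: stop the right wheel so the robot swerves away.
        digitalWrite(RIGHT_MOTOR, LOW);
        digitalWrite(LEFT_MOTOR, HIGH);
      } else {
        digitalWrite(LEFT_MOTOR, HIGH);
        digitalWrite(RIGHT_MOTOR, HIGH);
      }
      delay(50);
    }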


Posted by Michael Kaudze on 07/17 at 02:46 PM

