Noting that the US military was developing armed, autonomous robots to serve as battlefield soldiers in our wars, a 2015 New York Times article asked an important question: “Can they learn to make moral choices?”
But wait, what about us humans – when it comes to war, can we learn to make moral choices? Remember that President George W. Bush chose to torture al Qaeda suspects (including innocent civilians) by waterboarding them. Where was the morality in that? It was, of course, immoral. But, at the request of the White House, an ambitious, eager-to-please lawyer in the Justice Department wrote a memo that whitewashed waterboarding, summarily decreeing that such torture was not, technically, torture.
The author of that memo sanctioning Bush’s immoral warfare was John Yoo. His name is relevant to the current question about robot morality, because Yoo is now at the American Enterprise Institute – a nest of far-right, neo-con war hawks – where he’s become a leading booster of turning robots into our killing machines. In a September Wall Street Journal article, Yoo exults that, unlike humans, robots won’t get fatigued in battle or become “emotionally involved” in the business of killing humans. Not merely cold-blooded warriors, these efficient machines are no-blooded – plus, they’re much cheaper than a flesh-and-blood army.
But, you might ask, what if they go rogue, turning into an army of rampaging “Terminators” and using their artificial intelligence against us civilians? Tut-tut, says Yoo, admonishing us to “have more confidence in our ability to develop autonomous weapons within the traditional legal and political safeguards.”
Huh? Come on, John – you’re the guy who carelessly, flagrantly, and immorally violated those very safeguards in your torture memo! We’re to trust you? No thanks.