This is the third part of a four-part series which began with “What If?  What If?  Why Shouldn’t?” on Friday, 03-07-2014.

Mad science has gone mainstream.  We don’t even call people who meddle with human nature mad scientists any more.  The term is considered insulting.  Now we call them transhumanists, performance enhancement researchers, and people who work for a better future.

One early omen was the conference, “Converging Technologies for Improving Human Performance,” sponsored by the National Science Foundation and the Department of Commerce in 2001 -- by today’s standards, “in ancient times.”  The idea was to use nanotechnology, biotechnology, and information technology for the “advancement of mental, physical, and overall human performance.”*  What this bland language means doesn’t become clear unless one has the patience to wade through more than 400 pages and a knack for reading between the lines.

But the transhumanist program becomes more blatant every day, and it is making great inroads in the press.  Case in point:  A recent Wall Street Journal feature on using implants to change the properties of the human mind and body.**

The authors make very clear that they aren’t talking about familiar and morally unproblematic things like pacemakers, dental crowns, or implantable insulin pumps, which merely restore abilities that have been lost or damaged.  No, they are talking about pushing abilities beyond human limits -- and inventing new abilities that humans have never possessed.

It would be interesting to dissect the article in a college rhetoric class, if only college rhetoric classes still did that sort of thing.

First comes the appeal to more or less innocent desires.  Would you like to be able to remember things better than you do now?

Gradually, though, the authors blend darker desires into their appeal.  Would you like to be able to hear “any conversation in a noisy restaurant, no matter how loud?”  I hope it is obvious that they are not talking about hearing your own conversations more perfectly.  They refer to any conversation.  What they mean is, “Would you like to be able to spy on the conversations of other people?”

Then they lull us, digressing for several paragraphs on the fact that “neuroprosthetics aren’t new.”  This part of the article completely disregards the distinction the authors have previously conceded between healing damaged abilities and granting new ones.  A “prosthetic” is something that compensates for an infirmity, not something that tries to “augment” us, and the difference is morally relevant.

But next they suppress the moral question.  “The real question isn't so much whether something like this can be done but how and when.”

More and more strongly they appeal to insecurity, first to the insecurity of parents.  “Many people will resist the first generation of elective implants ….  But anybody who thinks that the products won’t sell is naive ….  The chance to make a ‘superchild’ … will be too tempting for many.”

All the other moms and dads will be altering their children.  You wouldn’t want yours to fall behind, would you?

Soon the appeal to insecurity is broadened, for in every sphere of life, the augmented will outperform the unaugmented.  “Even if parents don’t invest in brain implants, the military will ….  Who could blame a general for wanting a soldier with hypernormal focus, a perfect memory for maps and no need to sleep for days on end?”***

The authors parenthetically add, “Of course, spies might well also try to eavesdrop on such a soldier's brain, and hackers might want to hijack it.  Security will be paramount, encryption de rigueur.”  Of course; of course.  But wait a moment.  To hijack something is to employ stealth or cunning to transfer its direction from one controller to another. Which means that it was already under control.

At last all is clear.  Transhumanism isn’t about relieving human infirmities.  It isn’t even about making superhumans.  It is about making subhumans -- turning people into mere things to be manipulated -- changing them from whos into whats.

One wonders what other temptations nobody with power could resist.  Could a general be “blamed for wanting” soldiers who never asked disturbing moral questions, never suffered nightmares for anything they had done, and never asked to see their wives or children?  Could a mine supervisor be “blamed for wanting” miners who never complained about cave-ins, never asked for a raise, and never went on strike?  Could a political boss be “blamed for wanting” citizens who were willing to starve for the regime, always did as they were told, and accepted euthanasia when they were no longer able to work?

If the authors are right that we cannot resist the temptation to develop such technology, then it is hard to see why they think we can resist the temptation to do further evil with it.  The inconsistency is even more glaring in view of the fact that they don’t see the evil of such a thing as taking over a soldier’s brain.  For them it is just another step in international competition.

They ask, “Will these devices make our society as a whole happier, more peaceful and more productive?  What kind of world might they create?”  They answer, “It's impossible to predict.  But, then again, it is not the business of the future to be predictable or sugarcoated.  As President Ronald Reagan once put it, ‘The future doesn't belong to the fainthearted; it belongs to the brave.’”

C.S. Lewis once provided a translation of that sort of prose:  “I don’t know what will happen, but I want it to happen very much.”

Next time:  What if we transcended our nature?

* The conference report is available at http://www.wtec.org/ConvergingTechnologies/Report/NBIC_report.pdf.

** Gary Marcus and Christof Koch, “The Plug-and-Play Brain,” Wall Street Journal, 15 March 2014, pp. C1-2.

*** For a sobering examination of how far this sort of thinking has gone already, see Christopher Coker, Warrior Geeks: How 21st Century Technology is Changing the Way We Fight and Think About War (New York: Columbia University Press, 2013).