Sunday, July 26, 2020

Do computers make us more safe or less safe?

Norbert Wiener wrote a paper, Some Moral and Technical Consequences of Automation, in 1960. It warns of the dangers of computers in two ways:

1) If a chess program is trained only against expert chess players, then it might get confused when its opponent makes a bad move. That is not dangerous. But imagine a nuclear missile system that assumes the opponent is rational. If the opponent is not rational, the system might launch and start an accidental nuclear war. So there must be a human in the loop to prevent this.

I offer a story and a counter-narrative. In the TV show Castle (Season 5, Episode 23, "The Human Factor"), a character tells the following story:

The drone, on its own, was going to bomb a car. But the human noticed that there were red roses on the car, so it was a wedding couple, not a terrorist. If a human had not been involved, the drone might have killed an innocent just-married couple!

This scene bothered me. It could EASILY be the other way around: the human wants to bomb, and the drone (which has better vision) notices the roses. There may be many other ways that a computer could be BETTER than a human. I am not saying that a completely automated system is better; I am saying that it's not obvious which way to go. Both in some combination? What combination? Who has the final say? And in the drone scenario there may not be time for a human to consider the options.

2) The Sorcerer's Apprentice scenario. In the Sorcerer's Apprentice segment of the (original) movie Fantasia, Mickey Mouse tells a broom to get him a glass of water. The broom keeps bringing him water and Mickey almost drowns. Computers may take orders too literally and not stop. I wonder if automated stock-trading and automated auctions may have this problem. Is there a known case where this really did cause a problem?
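The broom-that-won't-stop worry can be made concrete with a toy simulation (entirely made up, not a real market model): momentum-following trading bots that each sell whenever the price is falling, where every sale pushes the price down further, so a small initial dip snowballs.

```python
# Toy sketch of a Sorcerer's Apprentice feedback loop in trading.
# Two hypothetical momentum bots each sell when they see a falling
# price; each sale knocks 5% off the price, which keeps the price
# falling, which triggers more selling -- nobody tells the broom
# to stop. All numbers here are illustrative.

def run_market(initial_price, dip, steps):
    """Simulate the price: after an initial dip, each step where the
    price is falling triggers both bots to sell (two 5% hits)."""
    prices = [initial_price, initial_price - dip]
    for _ in range(steps):
        falling = prices[-1] < prices[-2]
        if falling:
            prices.append(prices[-1] * 0.95 * 0.95)  # both bots dump
        else:
            prices.append(prices[-1])
    return prices

prices = run_market(100.0, 1.0, 30)
# A 1% dip cascades into a near-total collapse within 30 steps.
```

The point of the sketch is that no single bot is malfunctioning; each follows its order literally, and the crash emerges from the loop between them.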

So what do you think?

NOW- do computers (or, more generally, technology) make us more safe or less safe?

FUTURE- same question.


  1. The average lifespan keeps growing.

  2. Less safe because increased complexity is less manageable, be it by humans or (other) computers.

  3. In stock trading there could be a "simulated market panic" (where the computers go into a feedback loop and crash the markets) that is too fast for human intervention. Maybe it is a good idea to impose an upper speed limit on computer trading.

  4. Who would program a drone with human logic like noticing roses? There are too many things to try to predict.

    Automated stock-trading was revised after an AI-driven panic crash to halt the market if certain patterns emerge.
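The halt mechanism mentioned in the comments can be sketched in a few lines. This is a simplified stand-in for real exchange "circuit breaker" rules; the 7% threshold and the use of the opening price as the reference are illustrative choices, not the actual regulations.

```python
# Minimal sketch of a market-wide circuit breaker: halt trading
# once the price falls a given fraction below the session's
# opening price. Threshold and reference price are illustrative.

def check_halt(open_price, current_price, threshold=0.07):
    """Return True if the decline from the open meets or exceeds
    the threshold fraction, signaling a trading halt."""
    decline = (open_price - current_price) / open_price
    return decline >= threshold

check_halt(100.0, 92.0)  # 8% drop: halt
check_halt(100.0, 95.0)  # 5% drop: keep trading
```

Real U.S. markets did adopt circuit breakers of roughly this shape after large crashes; the sketch just shows how crude the safety valve is compared to the speed of the feedback loop it is meant to stop.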