
Computers will kill us all someday, just like Hollywood warns

March 7, 2011 - Cory Giger
Having just crammed the tremendous "Battlestar Galactica" series into a four-week crash-course viewing session, it was unnerving a few days later to watch a computer named Watson crush the two best "Jeopardy!" players ever, Ken Jennings and Brad Rutter.

Haven't the Watson designers at IBM seen "WarGames?" Or how about the "Terminator" series and "I, Robot?"

Call me paranoid or a geek or anything of the sort, but I believe that someday computers will take over the world and doom mankind. Seriously, I do.

Hollywood has been warning us about it for decades, ever since HAL 9000 decided to take command of the spaceship and kill the humans in Stanley Kubrick's "2001: A Space Odyssey."

In "Battlestar Galactica," which aired on the Syfy Channel from 2004-09, humans create robots called Cylons. These beings eventually become so intelligent that they learn to take on human form, then they kill billions of people by destroying colonies on 12 planets.

If all of that sounds far-fetched, spend a few minutes on the Internet researching a topic called "the singularity."

Many scientists and futurists believe that at some point, technology will lead to the creation of a superintelligent machine, or machines, able to think and reason like humans.

The exact moment this will take place is known as the singularity, and scientists believe it will be impossible to predict the future of mankind once this technological breakthrough occurs.

How long will it take? About 300 years? Maybe 200 years?

Try 34 years.

A terrific article in Time magazine last month detailed how singularity expert Ray Kurzweil believes that by around 2045, machines will be able to think abstractly and creatively and solve problems just as the human brain does.

How exactly is that a good thing?

Scientists even admit there's no telling what a machine with that kind of ability would do in a given situation. Singularity researchers acknowledge the potential dangers and ethical problems that come with that level of technology, yet they continue to work toward the goal of creating a machine with true artificial intelligence and problem-solving capabilities.

In the "Terminator" series, a computer network called Skynet becomes self-aware and starts a nuclear war to wipe out its human creators. "WarGames" tells a similar story: a computer treats nuclear war as a game and nearly triggers an all-out attack just to see which side would win.

Yeah, all this sounds like science fiction now. But just think about how rapidly technology has advanced in the past 20 years, then fast forward 30 years and try to imagine where technology will take us.

Then ask yourself: Do we really want to go there?

Not if the future ends up being anything like the movies.
