A Collapse: 2 – Technological Singularity

Initially Published 23 April 2018

The Technological Singularity is a term for a notion that dates back at least to the 1950s and the birth of modern computing.  The idea is that eventually we will build computers more intelligent than us and machines more capable than us, at which point humanity will become redundant.  Think Terminator here: at some point the machines rise up and try to get rid of us.  The opposite view is that when the Singularity occurs, humanity will move to the next level of existence and be free to concentrate on pure thought and the arts.  Think techno-hippies living in some kind of dream world.

I will freely admit that I am something of a computer geek and love playing around with code.  I just don’t see how computers are going to become self-aware, even if we create a computer that is able to self-correct its own code.  Face it: computers are only superficially brilliant.  They are only as smart as the programmers who wrote the code, and they can only do what you tell them to.  That is why you sometimes get error screens when you try to do something; whatever you tried to do was not covered in the program’s instructions.
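To make that concrete, here is a toy Python sketch (the function and values are mine, purely for illustration): the program handles exactly the cases its author anticipated, and anything outside them ends in an error rather than improvisation.

```python
# Toy illustration: a program covers only the cases its author thought of.

def divide(a: float, b: float) -> float:
    """Divide a by b -- but only for the inputs the programmer planned for."""
    return a / b

print(divide(10, 2))   # 5.0 -- the case the author anticipated
print(divide(10, 0))   # crashes with ZeroDivisionError -- the case he forgot
```

Nothing in that failure involves the machine "thinking"; the author simply never told it what to do when b is zero, so it stops with an error.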

Don’t get me wrong, computers can do some amazing things, and they are very good at doing lots of calculations very fast.  That makes them seem smart.  What computers do not have, and cannot have, is intuition: the faculty that lets a person look at a problem and come to the right decision without first grinding through thousands or millions of alternatives and discarding them.
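As a minimal sketch of that brute-force style (the numbers and target below are made up for illustration): a person might spot 25 + 75 at a glance, while the machine happily checks every one of the 1,023 possible combinations.

```python
from itertools import combinations

# Brute force: the machine's substitute for intuition is raw speed.
numbers = [3, 34, 4, 12, 5, 2, 25, 75, 18, 41]
target = 100

checked = 0
solution = None
# Enumerate every non-empty subset (2**10 - 1 = 1023 of them) and test each.
for r in range(1, len(numbers) + 1):
    for combo in combinations(numbers, r):
        checked += 1
        if solution is None and sum(combo) == target:
            solution = combo

print(f"Checked {checked} combinations; first hit: {solution}")
```

It gets the right answer, and quickly, but there is no insight involved, just speed.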

It is entirely possible that computers will before long be developed that can reliably mimic consciousness; they might even be able to pass a Turing Test.  Even so, I believe the danger is not from the computers themselves but from rogue programmers writing software that causes autonomous or semi-autonomous machines to start killing people or disabling pieces of critical infrastructure.  As machines and robots become more autonomous and are given more leeway in decision making, the biggest threat I see is these machines being hacked.  Think of an armed Predator drone being hacked and turned against its operators.  Another example would be more viruses like Stuxnet being deployed to corrupt control software and cause machines to destroy themselves.

I see the threats above as more realistic than some version of Terminator gaining self-awareness and deciding humanity is an inconvenient problem.  The Singularity might happen, but if it does, I don’t think it will look like what its proponents believe.

Now, what do I see as the likelihood of a technological singularity occurring?  Here are my cloudy crystal ball predictions.

  1. Short term (the next 5 years) – roughly a 5% chance of a collapse occurring
  2. Medium term (5-15 years) – roughly a 10% chance of a collapse occurring, because I just don’t think we are that close to developing sentient machines
  3. Long term (more than 15 years from now) – roughly a 35% chance of a collapse occurring.  I think that at some point we will develop thinking AIs; I just think it is a leap to posit that they will go all Terminator on us and try to destroy us.