The technological singularity
The premise of the technological singularity is that in less than 40 years your computer will be cleverer than you are. Can you imagine what that means?
A man called Ray Kurzweil is largely responsible for developing this concept.
If you're a fan of TED Talks, you'll already have heard what Raymond Kurzweil has to say. If you're not, you need to listen to him talk. I've put a few links into this post.
There are two reasons for this post about Kurzweil and his ideas.
Firstly, Kurzweil is a genius. And I'm glad to say, a genius in the traditional sense: slightly potty, slightly detached from reality, but massively insightful. Among his many inventions, he created a female alter ego of himself. "Ramona" allowed him to become the female pop star he always wanted to be. I've always been a fan of out-there inventors like Buckminster Fuller, and Kurzweil fits the bill as a modern-day version of the same.
So, back to the technological singularity. Among other things, Kurzweil believes that technological development will continue to follow Moore's law in the coming years, which means it will grow exponentially. That puts the point at which computing power exceeds the human mind quite close: about 40 years away.
Kurzweil writes that, due to paradigm shifts, a trend of exponential growth extends Moore's law from integrated circuits to earlier transistors, vacuum tubes, relays, and electromechanical computers. He predicts that the exponential growth will continue, and that in a few decades the computing power of all computers will exceed that of human brains, with superhuman artificial intelligence appearing around the same time.
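The extrapolation behind that prediction is simple compound doubling. Here's a back-of-envelope sketch in Python; every number in it is an illustrative assumption (a starting machine at 10^12 operations per second in 2005, a rough brain estimate of 10^16 operations per second, and a Moore's-law-style doubling every 18 months), not a figure from Kurzweil's own models.

```python
import math

def crossover_year(start_year=2005, start_ops=1e12,
                   target_ops=1e16, doubling_years=1.5):
    """Year at which steady exponential doubling reaches the target capacity."""
    doublings_needed = math.log2(target_ops / start_ops)
    return start_year + doublings_needed * doubling_years

print(round(crossover_year()))  # roughly 2025 under these toy numbers
```

The point isn't the exact answer; it's that under exponential growth, even a 10,000-fold gap closes in a couple of decades, which is why the predicted crossover feels so near.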
A technological singularity includes the concept of an intelligence explosion. Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not changed significantly for millennia. However, with the increasing power of computers and other technologies, it might soon be possible to build a machine that is more intelligent than humanity. If a superhuman intelligence were invented, whether through the amplification of human intelligence or through artificial intelligence, it would bring to bear greater problem-solving and inventive skills than humans can. It could then design a yet more capable machine, or rewrite its own source code to become more intelligent. That more capable machine could in turn design a machine of even greater capability. These iterations could accelerate, leading to recursive self-improvement, potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in.
Kurzweil reserves the term "Singularity" for a rapid increase in intelligence. He writes that "The Singularity will allow us to transcend these limitations of our biological bodies and brains ... There will be no distinction, post-Singularity, between human and machine". He also defines his predicted date of the singularity (2045) in terms of when he expects computer-based intelligences to significantly exceed the sum total of human brainpower, writing that advances in computing before that date "will not represent the Singularity" because they do "not yet correspond to a profound expansion of our intelligence."
There are lots of sceptics of Kurzweil's thinking, of course. However, I prefer to err on the optimistic side: our generation could well be living through a landmark period in human evolution.