Anti-Buzz: The Singularity

by Andrew Emmott on July 3, 2012


The Buzz Word: The Singularity

Surprise! The last two weeks were a build-up for what will likely be one of my more philosophical, “out there” columns. Readers here vary in their technological savvy, but the more technophilic among you are probably at least a little familiar with the idea of the technological singularity, the trendiest of futurist topics today. If I want to claim to be on theme, you could say that I am honoring Independence Day in the United States by discussing a topic that enjoys independence from reality, which may not be the slam you might think it is, as the stricter definitions of the technological singularity herald the end of reality-as-we-know-it.

As is usual for buzz words, “the singularity” has come to mean a little more than originally intended, working its way into the popular consciousness by means of being generalized into simpler and broader concepts. Personally, I don’t quite buy the idea of the singularity, but that’s neither here nor there because there are people much more credible than I who have put stock in it. Instead, let’s just look at the idea.

The Vingian Singularity

First, the “true” non-buzz definition of the concept. If you pull apart the history, the idea was on the tips of many tongues for a good while, but it was finally given a name by computer scientist and science fiction author Vernor Vinge circa 1993. Begin with your sci-fi concept of Artificial Intelligence – that is, a true machine intelligence. Assume that this machine is smarter than humans. The implication would be that, because we can create an intelligence greater than our own, this machine could create an intelligence greater than itself, and that machine could create something even smarter, and so on and so forth, until whatever physical limits on intelligence were reached, ostensibly in the form of some universe-sized brain (I warned you this was going to get weird).

I’m being glib, because to postulate anything like a universe-sized brain is dishonest to the idea of the singularity, which is careful to point out that we have no idea what this sort of super-intelligence would look or feel like, how it would choose to interact with us, or even what its capabilities would be. This is why Vinge uses the singularity analogy: he likens the arrival of this post-human intelligence to the event horizon of a black hole, a point in time beyond which we are completely incapable of predicting what happens. Perhaps it is like Skynet and we get wiped out, or perhaps we all gain greater intelligence and become like gods, or perhaps the greater intelligence is a giant space squid and we are bugs on the sidewalk, left alone but occasionally stepped on when the squid walks home from space work. The point is that our lives will no longer carry the universal significance they do now.
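If it helps to see the shape of that argument, here is a toy sketch of the recursion: each machine designs a successor a bit smarter than itself, until some physical ceiling is hit. The growth factor and the ceiling are numbers invented purely for illustration; nothing in real AI research suggests any particular values.

```python
# Toy illustration of the recursive self-improvement argument. NOT a model
# of real AI: the growth factor and the physical ceiling are made-up numbers,
# chosen only to show the shape of the recursion.

HUMAN_BASELINE = 1.0     # call human-level intelligence "1.0"
GROWTH_FACTOR = 1.5      # assume each generation is 50% smarter than its maker
PHYSICAL_LIMIT = 10_000  # assume physics caps intelligence somewhere

def generations_to_ceiling():
    """Count how many design generations it takes to reach the ceiling."""
    intelligence = HUMAN_BASELINE
    generations = 0
    while intelligence < PHYSICAL_LIMIT:
        intelligence *= GROWTH_FACTOR  # each machine designs a smarter successor
        generations += 1
    return generations, intelligence

if __name__ == "__main__":
    gens, final = generations_to_ceiling()
    print(f"Ceiling reached after {gens} generations (intelligence ~ {final:,.0f})")
```

The point of the thought experiment is not the count of generations, of course, but that we can say nothing about what sits on the other side of that loop.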

There are a lot of things I’m leaving out. If you’re really interested, Vinge’s 1993 essay, “The Coming Technological Singularity,” covers the idea in full.

The Popular Singularity

The singularity is a rather apocalyptic idea, which is part of the allure for some people, and strangely the part that gets left out by others. As the singularity moved into popular understanding, the “we can’t predict the future” message seemed to be what stuck, which is sort of funny because we all already knew that. A good portion of the buzzification of the singularity comes from generalizing Moore’s Law – the observation that the number of transistors on a chip roughly doubles every couple of years – into a stand-in for our general astonishment over the last two decades of paradigm shifts. Sure enough, computing technology has required us to rethink a lot of long-standing notions of the world. The perception of faster-and-faster technological progress and the general difficulty of “keeping up” have led many to slap the term “singularity” on anything new and unforeseen, much in the same way the tin-foil hat league slaps “big brother” on every video camera in existence.

Alternatively, many have softened the harsh reality of a Vingian singularity by focusing on “reality as we know it will be over.” This is a hybrid of the pop-singularity and the Vinge-singularity, where we make observations about, say, nanotechnology, and what it might mean for human existence if machines allow us to live an extra 200 years and watch YouTube on the back of our eyelids. But it is too safe to say that human existence changes with new technologies. When geometry was invented, the Egyptians were able to manage property even as the Nile moved and flooded, the printing press allowed common folk the luxury of reading, and the modern sewer made urban life feasible. Technology has always changed the reality of being a human, but I’m not so sure we need to categorize every paradigm shift as evidence of the singularity. Even if change seems more rapid now, most of the recent upheaval in human life is indirectly caused by the advent of one technology: the computer. The printing press did not change the world overnight – it took a long time for the social ramifications of that technology to play out. It is dishonest to count what is effectively one new revolutionary technology as many – we are simply still discovering the benefits of one of the most influential technologies of all time. Ultimately, the pop notions of the singularity are in fact a bit hollow: little more than the borrowing of a word.

Criticism

Returning to the superhuman-intelligence singularity: I am a PhD student in Machine Learning, so I’m not a lightweight on the issue, but I’m also not a world-renowned expert. Still, I will say that it is my opinion that true machine intelligence is still a long way away. To quote somebody more credible than me: “Sheer processing power is not a pixie dust that magically solves all your problems.” One thing I have said in this column many times, because I keep rediscovering it, is that computers are stupid and people are very smart. While human-like intelligence isn’t the practical day-to-day goal of AI research anyway, we aren’t being held back from the sci-fi dream of true AI simply because we don’t have enough hardware; it’s because we don’t really even understand the depth of regular human intelligence yet.

Further, the apocalyptic notions of the singularity are considered by many to be suspect, not unlike Christians who keep adjusting the date of the Rapture, or cynics who thought nuclear holocaust was just around the corner, or environmentalists who think a superstorm will consume us all – there are people, there have always been people, and there always will be people who want to believe they are among the last generation of humans on Earth. To the singularity’s credit, this particular apocalypse is not necessarily bad – maybe we get lucky and turn into gods – but I’m betting that all that is going to happen is that we will share more cat pictures.
