On this episode of Robot Overlordz... The robots are taking OVER!!! Recorded on 12/20/2011.


You can download the episode here.


Mike & Matt's Recommended Reading:

Wikipedia's definition of the technological singularity

Ray Kurzweil's website

Vernor Vinge's talk on the concept of the Singularity

The Transcendent Man, a documentary about Ray Kurzweil and his ideas



Omega: Hello. Matt and Mike were unable to record an episode for today, so they've left it to us to "fill in". On this episode of Robot Overlordz, the robots are taking over.

Alpha: You'd almost think the Singularity was already here.

O: Well, not everyone knows what the Singularity is.

A: Maybe we should define it for them. According to Wikipedia, the term refers to the hypothetical future emergence of greater-than-human intelligence through technological means, very likely resulting in explosive superintelligence. Since the capabilities of such an intelligence would be difficult for an unaided human mind to comprehend, the occurrence of a technological singularity is seen as an intellectual event horizon, beyond which the future becomes difficult to understand or predict.

O: That sounds bad. Isn't that like the Matrix or Terminator movies?

A: It could be. Those are possible negative outcomes of such a superintelligence. The thing is, until it happens, we can't really know. But science fiction does give us some ideas of what could happen.

O: You mean there are positive outcomes too?

A: Yes. Thinkers like Ray Kurzweil and Vernor Vinge have written a lot about how these scenarios could play out, and the ways in which we can perhaps direct them to more positive results for human beings.

O: What kinds of scenarios?

A: In Kurzweil's scenario, for example, intelligent nanorobots will be deeply integrated into humans' bodies, brains, and environment, overcoming pollution and poverty, and providing vastly extended longevity, full-immersion virtual reality incorporating all of the senses (like "The Matrix"), "experience beaming" (like "Being John Malkovich"), and vastly enhanced human intelligence. The result will be an intimate merger between the technology-creating species and the technological evolutionary process it spawned. And that's not even the actual Singularity yet.

O: What do you mean?

A: In that scenario, that's just the precursor. Nonbiological intelligence will have access to its own design and will be able to improve itself in an increasingly rapid redesign cycle. We'll reach a point where technical progress is so fast that unenhanced human intelligence will be unable to follow it. That will mark the Singularity.

O: That sounds a bit far-fetched.

A: That's kind of the point. The whole concept rests on a foundation of EXPONENTIAL returns. Kurzweil uses the example of taking 30 steps: if you count linearly, 1, 2, 3, 4... after 30 steps you've reached 30. But if you count exponentially, doubling each time, 2, 4, 8, 16, et cetera, after 30 steps you've reached 2^30, which is about 1.07 billion.
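(For listeners who like to check the arithmetic, here's a minimal sketch of Kurzweil's 30-steps comparison; the variable names are just illustrative.)

```python
# Compare 30 linear steps with 30 exponential (doubling) steps,
# as in Kurzweil's counting example.

linear = 0
exponential = 1
for step in range(30):
    linear += 1       # 1, 2, 3, ...
    exponential *= 2  # 2, 4, 8, ...

print(linear)       # 30
print(exponential)  # 1073741824, about 1.07 billion (2**30)
```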

O: That's a pretty profound difference.

A: Yes.

O: Do you think humans are actually capable of dealing with that amount of change?

A: That's really the key question, and it's behind some of the negative scenarios, like the Matrix or Terminator. But there are a lot of reasons to hope, and a lot of reasons to think that this change will actually be for the better. That's actually why the term Singularity is used for this: it's hard to really imagine the scale of those kinds of changes.

O: So we may never hit the Singularity?

A: Possibly. But the human race should definitely be thinking about the pace of change, and the kind of world they'd WANT to live in. The concept of the Singularity is really about that process.

O: I guess we should let the humans get to it then.

A: Hopefully we've gotten them interested in finding out more.

O: Yes, indeed. Thanks everyone for listening. Matt and Mike will be back next episode to talk more about these exciting futures.

A: Thanks, everyone.