The short answer:
The Singularity is the idea that humanity, using technology, will become gods/super-powerful/whatever.
I don't think this will happen, because I believe there are limits to what science and technology can do, and because I think humanity will run out of motivation before we even reach those limits.
The long answer:
The way I understand it, the idea of a technological singularity is that advances in technology have been coming faster and faster for the last 200 years. If that acceleration continues, then advances will come so quickly as to be effectively simultaneous – humanity will achieve in a week what used to take 10 years.
At that point, humans will effectively become gods: immortal, all-knowing and all-powerful.
The big problems that I see with such theories are:
- Unlimited technology
Inherent in the idea of a technological singularity is the notion that technology is infinite. I don't think there's any reason why this has to be the case. In my own field, computers, there are definite limits to what you can and cannot do. Mind you, the limits may be ridiculous, like taking the total energy in the universe and feeding it into a big-honkin' computer, but there are limits. The same goes for classical physics: force equals mass times acceleration (F = ma), at least at the scales you and I are familiar with, and no amount of cleverness changes that.
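To give one concrete example of the kind of hard limit I mean (this is my own illustration, not part of the original argument): Landauer's principle says that irreversibly erasing one bit of information costs at least kT·ln(2) joules of energy, no matter how clever the hardware is. A quick back-of-the-envelope sketch:

```python
import math

# Landauer's principle: erasing one bit of information dissipates at
# least k * T * ln(2) joules, where k is Boltzmann's constant and T is
# the absolute temperature. This is a thermodynamic floor, independent
# of the hardware doing the erasing.
BOLTZMANN = 1.380649e-23  # J/K (exact, by SI definition)

def landauer_limit_joules(bits: float, temp_kelvin: float = 300.0) -> float:
    """Minimum energy (J) to irreversibly erase `bits` bits at `temp_kelvin`."""
    return bits * BOLTZMANN * temp_kelvin * math.log(2)

# Erasing 10^24 bits (about 125 zettabytes) at room temperature:
energy = landauer_limit_joules(1e24)
print(f"{energy:.3g} J")  # roughly 2.87 kJ -- tiny, but strictly nonzero
```

The number itself is small, which is exactly the point: the limit may be "ridiculous," far beyond anything we do today, but it is a limit, and piling up enough bits eventually runs you into the energy budget of whatever is powering the computer.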
So, if technology is limited, or at least limited in some areas, then it may well be that humanity reaches that limit very rapidly, and then just as rapidly stops advancing. The question is: where is that limit? I think it is well short of what most people would consider "godhood."
Around the year 1900, there was a similar belief that mankind would suddenly develop vast powers, using steam engines or electricity, but it didn't happen. In fact, we have many of the same problems we did then: war, disease, governments, etc.
What's worse, some of the basic technologies of that era, like trains, batteries and fuel cells, are the very same ones we are looking at right now.
- Motivation
One of the driving forces behind technological advancement is the motivation to advance. But if you can transcend your limitations, especially with respect to intelligence, then why keep that motivation at all?
Looking at this from a different perspective, we assume that we will always be motivated. What if this is not the case? At some point, the pace of advance would slow, maybe even reverse.
This is not quite as crazy as it sounds. For example, the Japanese went through a period where they actively suppressed development of guns and firearms. As I understand it, this was because the samurai did not like the idea that some clown could learn how to utterly defeat them in the course of an afternoon (i.e. learn how to shoot someone), while it took years to learn how to be effective with a sword.
Then there was the Dark Ages in Europe, when progress in many areas slowed to a crawl, or even reversed, for centuries.
So, just to sum up: I don't think technology and science are unlimited, and I don't think motivation is a given in all cases. In particular, I don't believe we can assume that the motivation for science and technology will fuel unlimited advancement.
P.S. This blurb ignores the whole possibility of AIs "taking over" style of thing. If anybody is actually curious, I can babble on about that too :-)