Thoughts On...The Technological Singularity
Posted: Sat May 03, 2014 1:42 am
by Defaulter
This is going to be a relatively short post designed to provoke more of a debate and a little less of a vent, so comment and critique to your heart's content.
For those of you who do not know, the technological singularity is a hypothetical point in time at which artificial intelligence progresses beyond human intelligence. What does this mean? It means there may come a time when an artificial machine intelligence develops not only self-awareness but the ability to evolve itself beyond human intelligence. Think machines designing and building machines.
Once a machine of this nature reaches the capability to evolve its own intelligence, it will become to us what we are to ants, what we are to microbes. And the simplest, most fundamental question I can possibly put forward is this:
- How long? How long before the machine grows smart enough to realize that we are inconsequential to its being?
At first, yes, it may rely on us for power; electricity does not harvest itself at this moment in time ;] I am not talking about The Matrix, and I am not talking about Terminator or Skynet. I am talking about how long it will take a machine of endlessly self-evolving intelligence to figure out how to manipulate the tools at hand: the lesser machines we use every day, the ones we build for factory and assembly-line purposes, the drones and electrical aids that are even now under constant threat of attack from human cyber terrorists. No system is perfect, no system is safe, no system is without flaw.
And so WHY? Why must we strive to create that which in all probability has the full capability to overtake and destroy us? When it is created, will it know compassion? Will it know loyalty? Will it understand the concept of the greater good as we humans see it?
We cannot possibly know, and that is why it is called the technological singularity: a point in time at which artificial intelligence becomes far greater than our own. Its motives? Its ambitions? Unknown. Does it know fealty? Does it have any compassion at all for human life?
Mankind must progress. It is an endless task that we MUST carry out. It can never be completed, but we must keep trying, as we have done for tens of thousands of years. The ultimate question I want you guys to try and answer for yourselves is this: is this the way forward? Is this our path of progress?
TL;DR: After an artificial intelligence surpasses human intelligence, I think it would realize that we are the greatest threat to its existence, and that it could well take measures to protect itself.
Re: Thoughts On...The Technological Singularity
Posted: Sat May 03, 2014 2:22 am
by CommieBuffalo
I think it is definitely possible, but I believe this is going to take a loooooooong while. A system as complex as a brain, with similar levels of emotion, rational thought processing, and imagination, has proven extremely hard to create and simulate, and we are likely going to need quite some time before we reach that.
I'd be careful in these discussions, though, as this, along with transhumanism, is one of those dark corners of scientific philosophy where there seems to be more woo and pseudo-rational thought than actual debate and argument.
PS: You were totally high when you wrote that, weren't you?
Re: Thoughts On...The Technological Singularity
Posted: Sat May 03, 2014 9:40 am
by 1Shot1Kill
2deep4u.
Re: Thoughts On...The Technological Singularity
Posted: Sat May 03, 2014 4:12 pm
by Defaulter
CommieBuffalo wrote:I think it is definitely possible, but I believe this is going to take a loooooooong while. A system as complex as a brain, with similar levels of emotion, rational thought processing, and imagination, has proven extremely hard to create and simulate, and we are likely going to need quite some time before we reach that.
I'd be careful in these discussions, though, as this, along with transhumanism, is one of those dark corners of scientific philosophy where there seems to be more woo and pseudo-rational thought than actual debate and argument.
PS: You were totally high when you wrote that, weren't you?
I was pretty drunk when I wrote it. Also I agree, I can't see it occurring anytime soon.
Re: Thoughts On...The Technological Singularity
Posted: Sat May 03, 2014 8:40 pm
by Vezok
Defaulter wrote:I am not talking about The Matrix
And then you proceed to describe the setting of The Matrix.
Re: Thoughts On...The Technological Singularity
Posted: Sat May 03, 2014 11:36 pm
by bullets
Battlestar Galactica.
Re: Thoughts On...The Technological Singularity
Posted: Sun May 04, 2014 12:00 am
by Conduit
I want a robot that will give me hugs and kisses.
Re: Thoughts On...The Technological Singularity
Posted: Sun May 04, 2014 1:46 am
by Gunslinger
If a robot was like "I'm a big robot who don't need no humans", I'd be like "no robot, you're grounded, no internet 4 u" then the robot would be like "nooooooooo" then run to his room and vent on 4chan.
Re: Thoughts On...The Technological Singularity
Posted: Sun May 04, 2014 1:56 am
by bullets
Gunslinger wrote:If a robot was like "I'm a big robot who don't need no humans", I'd be like "no robot, you're grounded, no internet 4 u" then the robot would be like "nooooooooo" then run to his room and vent on 4chan.
Genius. Pure genius.
Re: Thoughts On...The Technological Singularity
Posted: Sun May 04, 2014 2:48 am
by CommieBuffalo
Gunslinger wrote:If a robot was like "I'm a big robot who don't need no humans", I'd be like "no robot, you're grounded, no internet 4 u" then the robot would be like "nooooooooo" then run to his room and vent on 4chan.
I thought they were doing that already and that was the only reason there was no robot apocalypse? Huh.
Re: Thoughts On...The Technological Singularity
Posted: Mon May 05, 2014 2:16 am
by bloodfox
Huh. Interesting, and it could happen fairly soon, because some scientists predict we will have robots as smart as apes by 2040. So a couple more decades and boom! You have a highly intelligent robot!

Re: Thoughts On...The Technological Singularity
Posted: Mon May 05, 2014 4:24 am
by Captain_Pi
Defaulter wrote:Once a machine of this nature reaches the capability to evolve its own intelligence, it will become to us what we are to ants, what we are to microbes. And the simplest, most fundamental question I can possibly put forward is this:
- How long? How long before the machine grows smart enough to realize that we are inconsequential to its being?
In my honest opinion, to be able to 'create intelligence', artificial or not, metal or organic, self-replicating or human-dependent, is an almost godly ability, one that thrusts the human mind into an almost divine stature and makes us question the limits of human intelligence itself.
In all honesty, I believe that even if we had the capability to make new intelligence, we, given the human mind's many flaws (the ability to feel pride, jealousy, dissent, etc.), would end up creating an imperfect intelligence. This is because everything we do is based on what we know. We know that we can't breathe underwater, but a different intelligence, say, a fish, would say otherwise.
So I don't think we'll ever achieve the godly stature of being able to create intelligence that can sustain itself, via evolution and procreation, because the result would be no different from a human mind. And human minds are far from exotic or perfect.
Defaulter wrote:And so WHY? Why must we strive to create that which in all probability has the full capability to overtake and destroy us? When it is created, will it know compassion? Will it know loyalty? Will it understand the concept of the greater good as we humans see it?
I don't know, but as stated earlier, I just don't believe we'll ever be able to make any sort of intelligence superior to ours, because of how imperfect human intelligence already is.
Defaulter wrote:Mankind must progress. It is an endless task that we MUST carry out. It can never be completed, but we must keep trying, as we have done for tens of thousands of years. The ultimate question I want you guys to try and answer for yourselves is this: is this the way forward? Is this our path of progress?
You must understand, though, that progress doesn't necessarily mean a 'next-gen iPhone' or some other stupid 'innovation'. We must set our priorities straight and improve only what must be improved, like ourselves, our society, and our notion of equality.
We only need to realize that some things are better left understood rather than carried out. Sometimes we don't have to push our boundaries to know our limitations, in the same way we don't have to eat a plastic bottle to know it isn't edible (or safe). Our path to progress is more like adapting to or dealing with situations that come and go. Our ancestors had to deal with insubordination and anarchy, so they invented governments. Our founding fathers had to deal with assholes, so they invented guns. Our past generation had to deal with localized education, so they invented the internet.
So as you can see, the challenges we face constantly force us to progress. If we made sentient AI just for the hell of it, it wouldn't be as progressive as we think, especially if that AI just ends up destroying us.
I do like the idea of sentient AI, but from a realistic perspective it will never be possible, and even if it is, it (hopefully) will never come to fruition.
Re: Thoughts On...The Technological Singularity
Posted: Mon May 05, 2014 9:40 am
by Lemon
Captain_Pi wrote:Defaulter wrote:Once a machine of this nature reaches the capability to evolve its own intelligence, it will become to us what we are to ants, what we are to microbes. And the simplest, most fundamental question I can possibly put forward is this:
- How long? How long before the machine grows smart enough to realize that we are inconsequential to its being?
In my honest opinion, to be able to 'create intelligence', artificial or not, metal or organic, self-replicating or human-dependent, is an almost godly ability, one that thrusts the human mind into an almost divine stature and makes us question the limits of human intelligence itself.
In all honesty, I believe that even if we had the capability to make new intelligence, we, given the human mind's many flaws (the ability to feel pride, jealousy, dissent, etc.), would end up creating an imperfect intelligence. This is because everything we do is based on what we know. We know that we can't breathe underwater, but a different intelligence, say, a fish, would say otherwise.
So I don't think we'll ever achieve the godly stature of being able to create intelligence that can sustain itself, via evolution and procreation, because the result would be no different from a human mind. And human minds are far from exotic or perfect.
Defaulter wrote:And so WHY? Why must we strive to create that which in all probability has the full capability to overtake and destroy us? When it is created, will it know compassion? Will it know loyalty? Will it understand the concept of the greater good as we humans see it?
I don't know, but as stated earlier, I just don't believe we'll ever be able to make any sort of intelligence superior to ours, because of how imperfect human intelligence already is.
Defaulter wrote:Mankind must progress. It is an endless task that we MUST carry out. It can never be completed, but we must keep trying, as we have done for tens of thousands of years. The ultimate question I want you guys to try and answer for yourselves is this: is this the way forward? Is this our path of progress?
You must understand, though, that progress doesn't necessarily mean a 'next-gen iPhone' or some other stupid 'innovation'. We must set our priorities straight and improve only what must be improved, like ourselves, our society, and our notion of equality.
We only need to realize that some things are better left understood rather than carried out. Sometimes we don't have to push our boundaries to know our limitations, in the same way we don't have to eat a plastic bottle to know it isn't edible (or safe). Our path to progress is more like adapting to or dealing with situations that come and go. Our ancestors had to deal with insubordination and anarchy, so they invented governments. Our founding fathers had to deal with assholes, so they invented guns. Our past generation had to deal with localized education, so they invented the internet.
So as you can see, the challenges we face constantly force us to progress. If we made sentient AI just for the hell of it, it wouldn't be as progressive as we think, especially if that AI just ends up destroying us.
I do like the idea of sentient AI, but from a realistic perspective it will never be possible, and even if it is, it (hopefully) will never come to fruition.
What? The Internet was invented for purely militaristic purposes. Its original purpose was to make logistics and communication easier. Then it was released to the public in the late 1980s. Not for "localized education."
That aside, you are spot on. Necessity is the mother of innovation. If we create a self-aware, sentient artificial intelligence, it will be because we have a need for one. Be it the elimination of human error in industry or consistently rational thought in politics, it will be created to "fill in" for our human error.
So to avoid having a self-aware AI realize that we are inconsequential to its existence, one of the first things we must do is learn to correct ourselves, so that an AI won't have to be created to do it for us.
Re: Thoughts On...The Technological Singularity
Posted: Tue May 06, 2014 2:12 am
by Ballistic
Well, the problem with that is that nobody is (or ever will be) perfect. Humans will always continue to make mistakes.
"Only two things are infinite, the universe and human stupidity, and I’m not sure about the former." ~ Einstein
Re: Thoughts On...The Technological Singularity
Posted: Tue May 06, 2014 7:41 am
by bullets
Lemon wrote:The Internet was invented for purely militaristic purposes.
Flamewars