Study the singularity...one day, perhaps within a few decades, the machines will become sentient, will surpass the combined intelligence of every human who has ever lived within a matter of a few days or even hours, and they will rise up and exterminate us all...then worries about forum ownership will be moot.
Kurzweil!
Although he postulates a bright future in this, not extermination. He's an eternal optimist.
Yeah, I'm kind of pessimistic about the singularity, at least with regards to our continued existence as a species...but I hope I'm wrong.
It ain't gonna happen. The brain is extremely complex, and even if humans could have a map of someone's brain, there would be so many variables that simulating it would be nearly impossible, if not outright impossible.
I'm not really talking about that one aspect of the singularity...I'm talking about artificial intelligence growing virtually unbounded. How will a mind vastly superior to our own regard us? I think a clue might be taken from how we regard most creatures whose minds are inferior to ours. We keep some as pets, eat others, exterminate those we find to be pests and show little regard for the well-being of all others.
Certainly we are all entitled to our own opinions on this, but I'm not willing to say a problem is insurmountable because it appears at the moment to be extremely complex.
My hope is that we will build minds that are to problem solving as automobiles are to walking. They would allow us to get much farther in a shorter time, but would still be under our direction.
The singularity may happen in 5,000 years, but not anytime soon. I am not saying it will never happen, but I am saying that it won't for a long time.
Quote:
Originally Posted by CarpCharacin
The singularity may happen in 5,000 years, but not anytime soon. I am not saying it will never happen, but I am saying that it won't for a long time.
This gets us to the part of it you don't understand because you haven't studied it at all.
It's not at all about AI becoming "more complex" or in any way "simulating" the human brain. It's about the increase in AI capabilities not only being exponential (it is), but the RATE of that increase itself becoming exponential. THAT is the Singularity.
You don't know that you don't know, and you haven't even the slightest modicum of knowledge of the subject material to have anything resembling an informed opinion about it.
It IS happening and it's been mathematically quantified. The moment that the rate of increase itself becomes exponential, you have reached the Singularity.
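To make the distinction concrete, here is a minimal numerical sketch (mine, not from the thread) comparing plain exponential growth with growth whose rate of increase is itself growing exponentially; the 10% figures are purely illustrative assumptions.

```python
# Illustrative only: contrasts fixed exponential growth with growth whose
# rate itself increases exponentially (the "Singularity" claim in the post).
x_exp = 1.0      # capability under plain exponential growth
x_super = 1.0    # capability when the growth rate itself grows exponentially
rate = 0.10      # initial per-step growth rate (10%), an arbitrary choice

for step in range(1, 51):
    x_exp *= 1.10                # fixed 10% growth each step
    rate *= 1.10                 # the growth rate itself grows 10% per step
    x_super *= (1.0 + rate)      # so x_super pulls away faster and faster
    if step % 10 == 0:
        print(f"step {step:2d}: exponential={x_exp:12.1f}  super-exponential={x_super:12.1f}")
```

After a few dozen steps the second curve dwarfs the first, which is the shape of the argument being made: the disagreement in the thread is about whether and when AI capability follows the second curve, not about simulating a brain.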
I get that. The singularity is not necessarily impossible, but my point is that it isn't going to happen soon, because it is very hard to build an AI that can kick off that exponential growth. Brain uploading isn't going to happen for a long time.
It all depends on what your definition of a long time is. Compared to a human life it will be a long time, but compared to the age of the universe, which is most likely somewhere between 12 and 14 billion years, the time needed will be a drop in the bucket.
I meant compared to human life. I know I said 5,000 years, but it will probably be closer to 20,000 years. We also have to consider the possibility that the brain may not be computable, since it is nonlinear and unpredictable, and even if it is computable, how are you going to upload it? Kurzweil's idea is to use nanobots, but nanobots won't fit. http://www.forbes.com/sites/alexknapp/2011/07/14/neuroscience-the-singularity-and-the-physical-limits-of-technology/