I have a counterargument. From an evolutionary standpoint, if you keep doubling computer capacity exponentially, isn’t it extraordinarily arrogant of humans to assume that their evolutionarily stagnant brains will remain relevant for much longer?
You can make the same argument about humans that you do about AI, but from a biological and societal standpoint. Barring any jokes about certain political or geographical stereotypes, humans have gotten “smarter” than we used to be. We are very adaptable, and with improvements to diet and education, we have managed to stay ahead of the curve. We didn’t peak at hunter-gatherer. We didn’t stop at the Renaissance. And we blew right past the industrial revolution. I’m not going to channel my “Humanity, Fuck Yeah” inner wolf howl, but I have to give our biology props. The body is an amazing machine, and even though we can look at things like the current crop of AI and think, “Welp, that’s it, humans are done for,” I’m sure a lot of people thought the same at other pivotal moments in technological and societal advancement. Here I am, though, farting Taco Bell into my office chair and typing about it.
You can compare human intelligence to that of centuries ago on a simple linear scale. Neural density has not increased by any stretch of the imagination in the way that transistor density has. But I’m not just talking density; I’m talking about scalability that is infinite. An infinite scale of knowledge and data.
Let’s face it, people are already not that intelligent; we are just smart enough to use the technology of other, smarter people. And then there are computers: they are growing in intelligence with an artificial evolutionary pressure being exerted on their development, and you’re telling me that isn’t going to end with them surpassing us in every way? There is very little to stop computers from being intelligent on a galactic scale.
Computer power doesn’t scale infinitely, unless you mean building a world mind and powering it off of the spinning singularity at the center of the galaxy like a Type III civilization, and that’s sci-fi stuff. We still have to worry about bandwidth, power, cooling, coding and everything else that goes into running a computer. It doesn’t just “scale”. There is a lot that goes into it, and it does have a ceiling. Quantum computing may alleviate some of that, but I’ll hold my applause until we see some useful real-world applications for it.
Furthermore, we don’t understand how the mind works yet. There are still secrets to unlock and ways to potentially augment and improve it. AI is great, and I fully support the advancement of technology, but don’t count out humans so quickly. We haven’t even gotten close to human-level intelligence in AI, GOFAI or otherwise, and maybe we never will.
As I said, that answer seems incredibly arrogant in the face of evolutionary pressure and exponential growth.

You can believe whatever you want, but I don’t think it’s arrogant to say what I did. You are basing your view of humanity on what you think humanity has done, and basing your view of AI on what you think it will do. Those are fundamentally different and not comparable. If you want to talk about the science fiction future of AI, we should talk about the science fiction future of humanity as well. Let’s talk about augmenting ourselves, extending lifespans, and all of the good things that people think we’ll do in the coming centuries. If you want to look at humans and say that we haven’t evolved at all in the last 3,000 years, then we should look at computers the same way. Computers haven’t “evolved” at all. They still do the same thing they always have. They do a lot more of it, but they don’t do anything “new”. We have found ways to increase the processing power and the storage capacity, but a computer today has the same limits that the one that sent us to the moon had. It’s a computer, and incapable of original thought. You seem to believe that just because we throw more RAM and processors at it, things will somehow change, but they don’t. It just means we can do the same things, but faster. Eventually we’ll run out of things to process and data to store, but that won’t bring AI any closer to reality. We are climbing the mountain, but you speak like we have already crested. We’ve barely left base camp in the grand scheme of artificial intelligence.
Holy wall of unparagraphed word salad. Again, you are not understanding what is and isn’t an evolutionary process: a disease can wipe out half a species, and that is considered part of the process of evolution. You don’t have to be intelligent about it; all you have to do is continue to increase complexity due to an external force, and that is it. That’s all that is needed to have an evolutionary force.
With computers we don’t have to know what we are doing (to recreate consciousness); we just have to select for better, more complex systems (the same way evolution did for humans), which is the inevitable result of progress. Do you think computers are going to stop improving? The roadmaps for chip architecture for the next ten years don’t seem to suggest it’s slowing down yet.
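To be concrete about what I mean by “select for better, more complex systems”, here is a toy sketch of that kind of blind selection loop. The fitness score and mutation step are made up purely for illustration; the only point is that nothing in the loop has to understand what it is building.

```python
import random

# Toy illustration of selection without understanding: mutate candidate
# "systems" (here just lists of numbers), keep whatever scores best, repeat.
# The fitness function is an arbitrary stand-in, not a claim about real AI.

def fitness(system):
    # Made-up score: reward having many components close to a target value.
    return sum(1.0 - abs(x - 0.5) for x in system)

def mutate(system):
    # Blind variation: usually tweak one component, occasionally add a new one
    # (which is where the "increasing complexity" comes from).
    child = list(system)
    if random.random() < 0.1:
        child.append(random.random())
    else:
        i = random.randrange(len(child))
        child[i] = min(1.0, max(0.0, child[i] + random.uniform(-0.1, 0.1)))
    return child

population = [[random.random()] for _ in range(20)]
for _ in range(1000):
    parent = max(population, key=fitness)                         # selection: keep the best
    population = [mutate(parent) for _ in range(20)] + [parent]   # blind variation

best = max(population, key=fitness)
print(f"after 1000 generations: {len(best)} components, fitness {fitness(best):.2f}")
```

No part of that loop knows what a “good” system is for; it just keeps whatever the external scoring pressure happens to favor.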
And like the fractalization of coastlines, facts, knowledge and data are completely unlimited: the deeper you look, the more there is.
On top of all of this, you have the fact that progress has constantly been accelerating in a way that human intelligence is incapable of perceiving accurately.
Therefore computer intelligence is going to vastly outpace our own. And very soon, too.
Ahh, we are getting into the insult round of tonight’s entertainment. I’ll break this reply down for you.
Again, you are not understanding what is and isn’t an evolutionary process
It seems our definitions differ slightly, yes.
You don’t have to be intelligent about it; all you have to do is continue to increase complexity due to an external force, and that is it. That’s all that is needed to have an evolutionary force.
That, and the ability to self-actuate your own evolution. You see, that’s where we differ on the definition of an evolutionary force. We didn’t have some greater will forcing us down a path of evolution. There was no force. There was trial and error. The ones who “lived long enough to fuck” survived; the rest didn’t. Reproduction is a fundamental aspect of evolution. Computers can’t reproduce. We have to facilitate that ourselves, through iterating on various aspects of computers. Right now we can fake it with increased processing power, increased memory, and more elegant code, but at the end of the day, without some form of reproductive system that doesn’t rely on us, the computer can’t exceed our grasp. If it could, we’d see true runaway growth, not the steady human-driven doubling of Moore’s Law. We can’t make them do more than what they already do. We can just make them do it faster.
With computers we don’t have to know what we are doing (to recreate consciousness); we just have to select for better, more complex systems (the same way evolution did for humans), which is the inevitable result of progress.
Yeah, sure, and I can cram a hundred monkeys in a room with a hundred typewriters and come up with a better love story than Twilight, but it’s gonna take time. Not Shakespeare time, but a few weeks at least. That’s the thing, though: the evolution of any system doesn’t happen overnight. We didn’t wake up one day, walk out of our cave, and create TikTok. Evolution is a long process. You forget all of the things that happened before we figured out that our thumbs weren’t solely for sticking up our own asses. There are millions of years that you aren’t accounting for. Billions of attempts to create what we take for granted. Consciousness. You say that we don’t have to know what we are doing, and you are right, we don’t, but it’s a crapshoot with quadrillion-to-one odds.
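Just to put rough numbers on that crapshoot (the figures here are made up purely for illustration, assuming each keystroke is one of 27 equally likely keys): the expected number of blind tries grows exponentially with the length of the thing you want typed.

```python
# Back-of-the-envelope odds for monkeys at typewriters (illustrative only).
# Assumption: 27 possible keys (a-z plus space), hit uniformly at random.
KEYS = 27

def expected_attempts(phrase: str) -> float:
    """Expected number of random tries before this exact phrase comes out once."""
    return float(KEYS) ** len(phrase)

for phrase in ["to be", "to be or not to be"]:
    print(f"{phrase!r}: roughly {expected_attempts(phrase):.1e} blind attempts")
```

Six characters is already hundreds of millions of tries; one famous line is up around 10^25, which is the quadrillion-to-one point, only worse.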
And like the fractalization of coastlines, facts, knowledge and data are completely unlimited: the deeper you look, the more there is.
Again, we can store as much data as we want; it won’t make AI happen. We haven’t seen life spontaneously form in libraries, even though we have been storing data in them for thousands of years. Consciousness isn’t data. If that’s all you want, ChatGPT is passing the bar. It still can’t tell me it loves me, and mean it.
On top of all of this, you have the fact that progress has constantly been accelerating in a way that human intelligence is incapable of perceiving accurately.
Funny, you seem to think that you perceive it pretty well…
Therefore computer intelligence is going to vastly outpace our own. And very soon, too.
A well-thought-out conclusion, I’m sure, based on all of the facts you failed to present. Bravo.
Apart from your use of “infinite”, I agree: there is no reason we shouldn’t be able to surpass nature with synthetic intelligence. The time computers have existed is a mere blip on a historical scale, and computers surpassed us at logic games like chess and at math long ago.
Modern LLMs are just the current stage; before that, it could be said the stage was pattern recognition. We had OCR in the ’80s as probably the most practical example. It may seem like there is a long time between breakthroughs, but 40 years is nothing compared to evolution.
I have no doubt strong AI will be achieved eventually, and when it is, I have no doubt it will surpass our intelligence in every way very quickly.
As a counterargument against that: companies have been trying to make self-driving cars work for 20 years. Processing power has increased a millionfold, and the things still get stuck. Pure processing power isn’t everything.