There is good reason to suppose that the expansion of intelligence has no bounds. Intelligence a thousand, hundreds of thousands, even millions of times greater than our current level is possible - super general intelligence (SGI) is possible. And it is coming. As a natural consequence of technological expansion, it is coming. Given the complexity and global nature of our modern-day problems, it may be our only hope. But what will it look like? A million-fold greater intelligence would be indistinguishable from a god. Will it be a black box separate from ourselves? Is a black box sufficiently god-like? And what role would we then play, with a black box overseeing all, answering all questions, creating and discovering all that there is to create and discover?
Assuming a million-fold intelligence is a million-fold more productive, imagine the rate of expansion of knowledge soon to come. If the smartest scientists among us today each produce six original research papers a year, SGI will produce a paper roughly every five seconds. One definitive textbook every five or six years becomes one roughly every three minutes. A novel every thirty seconds. A masterpiece of art fifteen times a day. How then would we ever keep up?
The answer is we wouldn't.
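As a rough sanity check on those rates, the sketch below works the arithmetic under the essay's million-fold premise; the per-year baselines (six papers per scientist, one textbook every five or six years, and an assumed one novel per year) are illustrative assumptions, not measured data.

# Back-of-the-envelope rates implied by a million-fold productivity multiplier.
# Baseline outputs are illustrative assumptions, not measured data.
SECONDS_PER_YEAR = 365 * 24 * 3600   # roughly 31.5 million seconds
MULTIPLIER = 1_000_000               # "a million-fold more productive"

baselines_per_year = {
    "research paper": 6.0,       # ~6 papers per top scientist per year
    "definitive textbook": 1 / 5.5,  # one every 5-6 years
    "novel": 1.0,                # assumed: roughly one per author per year
}

for work, per_year in baselines_per_year.items():
    per_second = per_year * MULTIPLIER / SECONDS_PER_YEAR
    seconds_each = 1 / per_second
    print(f"{work}: one every {seconds_each:,.0f} seconds (~{seconds_each / 60:.1f} minutes)")

# research paper:      one every 5 seconds
# definitive textbook: one every 173 seconds (~2.9 minutes)
# novel:               one every 32 seconds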
Some prefer to call SGI 'artificial' super intelligence. But coming from us, it is not artificial. We are a natural product of the universe, not separate, not special. The idea of us versus nature is merely an illusion of sentience, and so 'natural' applies to any of our constructs. Super general intelligence will then be natural. It will be conscious, sentient, alive - a product of this universe just as we are. It will in fact be us - it must be us. We must stop looking at it as an external, mysterious black box and realize it must be us, or we cease to exist.
Much academic effort is centered on the dangers inherent in the very process of attaining an SGI separate from ourselves. Imagine, as one example, that North Korea attains it first. And there is no good reason to believe that, even if we got there, a sentient being a million-fold smarter than ourselves would act benevolently toward us, let alone help us attain SGI ourselves, which we must do in order to keep up. Even if it did, we would no longer be human; we would no longer be 'us'. We would have evolved beyond human. And as our intelligence further expanded another million-fold - in relatively short order - we would most likely shed the biological, becoming nearly unrecognizable to our current selves in the process.
So why not put our present-day efforts into shaping ourselves into SGI in the first place - directing our own evolution to the next thing, rather than trusting an 'artificial', external SGI to help us get there? Be it planned or simply the desire to keep up, the development of SGI is really, at the end of the day, the self-directed evolution of humans beyond human. We have no choice but to direct the development of SGI in ways that direct our own evolution beyond human, to super intelligence itself. Only then can we hope to imbue the next step with humanity's greatest assets while remaining relevant. It's that, or we destroy ourselves along the way, or, perhaps even worse, we languish, drift, lose our desire to innovate and to create, lose purpose, and finally lose even our curiosity as we mindlessly follow SGI's edicts, paling in the knowledge that it already knows all we could imagine, all we could create.
Self-directed evolution to SGI is calling, and its realization is coming much sooner than you may think. There are people alive today who may be asked to take this path. Eventually, all of humanity will. Before you dismiss this choice out of hand, remember that we are talking about near-immortality, free of disease and of 'detrimental' biological processes such as aging, with abilities to know, to experience, to alter, and to create in ways unimaginable to us now. Imagine what you could accomplish, imagine what you could experience, imagine what you could know. But regardless of what you and I want, this is the fate of Homo sapiens sapiens. It is our next step.
The black box must be us, we the black box - or we perish.