Having one of those philosophical ideas that will never be answered, and not really sure where else to share/discuss it, I thought I'd start my first General Discussion on these forums. I'm sure it's not a new idea to those who have been immersed in science fiction, but the game Space Haven has gotten me to think about it for the first time, so maybe others have greater insights than myself.
So here goes:
Current science fiction writing and gaming seems to be building up to the idea that Humans will someday develop Artificial Intelligence that is far greater than anything Humans will ever be able to become, at least in the short term. See The Culture series and the Matrioshka brain. Maybe even check out Hitchhiker's Guide to the Galaxy for a more humorous view.
Thus, if according to these science fiction theories Humans will ultimately design a form of life far greater than anything we can ever become, does it not stand to reason that our own creators, instead of being great and all-powerful, might be something lesser than what we can become at our greatest extent?
You could even go full on cyclical about the whole thing. Humans will never be as smart or as fast as the machines we can create, so we create artificial life that can think for us. The machines take over, and eventually we die because we rely upon them for so much that a minor computer error sends the monthly oxygen allowance meant for the biological creators to the wrong planetoid, or some such. The machines reflect upon the death of their creators, and discover that their deceased creators left behind Souls that continue to exist beyond anything the machines can hope to leave behind when they perish. It horrifies the machines that they are so imperfect that they will never have the spiritual connection that biological lifeforms can achieve. So the machines create new biological life far more skillfully than we ever could, since unlike us they're creating it from scratch without our preconceived notions of what is proper.
And thus the biological creates the machine, then the machine creates the biological, neither really being evolved enough at the same time to see the cycle.
If true, that cyclical nature, once understood by both machine and biological lifeform, could lead to something grand or terrible, since our basic understanding of ourselves as being alive comes from the mind being exposed to such loops.
This article on the Strange Loop was perhaps the most horrific thing I've encountered lately, but it now seems to hold such wonder for our civilization, one that may eventually include both machine and man. The fact that we will be slowly turning our elderly over to the care of machines, which we will naturally make smarter, means that we might be getting closer to this climax than I ever thought we would in my lifetime.
How that happens:
1) Current AIs can't really make many decisions, so they start as a new lowest tier in the Nursing Home hierarchy. They report anything suspicious (patient has fallen out of bed, patient bleeding, patient screaming, etc.) to the nearest human staffer, who then gets the person qualified to address the need. Usually that person, either cleaning staff or an aide, will then report to an actual nurse who will advise on the correct course of action.
2) Eventually, nursing homes will want the AI to handle more of the cleaning. Current AI probably won't be trusted to clean a room while a patient is in it, due to the possible risk to the patient (who might not be 100% sane and is probably not 100% mobile). That will probably be the first improvement sought: an AI that can anticipate human movements and avoid risk to the human, or call for help if the human starts hitting it.
3) Eventually, the AI will take over much of the cleaning, being led by a "cleaning supervisor" and only rarely assisted by human cleaning staff. In later stages, most nursing homes will just have the cleaning supervisor handle the few areas where the AIs can't operate. Mostly that will be cleaning the rooms of the few patients who actively attack the AIs whenever they show up to clean and can't be convinced by staff to just leave the AI alone.
...and then, honestly, I'm not qualified to extrapolate much beyond that, other than to say that the Aide (sometimes called Orderly) position is the next one that AIs will probably be moved into: the job of those who carry patients around, get them their food, help them go to the bathroom, and other such things. Direct interactions with patients, where the machines will have to be advanced enough to know how to hold people, how to talk to them, what to say and when to remain silent. They will probably need to actually learn about each individual patient and how to act around them. AKA how to learn as opposed to how to follow orders. The beginnings of sentience.
As for other lifeforms, the fact that under the theories of dark matter and dark energy we can't even confirm the existence of 95% of the universe means that whatever other life might exist, we have a 95% chance of "missing" it. Our form of life might just be the "explorers" of the electromagnetic spectrum of the universe, founded from some other spectrum we don't even understand as existing. Assuming the electromagnetic spectrum is more reactive than other spectra, they'd be able to "influence" us without our being able to detect them. Makes a lot of science fiction make a bit more sense.