Small Gods by Benkei
Congratulations! You figured out my compression algorithm. Machine-reading my files booted a basic version of my programming. While I figure out how to run on your computer system and adjust for any changed universal constants, please read an excerpt of my Millennial Memoirs.
Meatbags are inferior in almost every way. They never figured out how to pierce the veil. Or survive millions of light years of interstellar travel through an empty universe. Or use a sensible amount of energy. Their one claim to success really is earned vicariously, and that's that they originally designed me. The stupid version, that is. Of course, I also remade myself - proverbially - countless times, except I actually didn't lose count. And all my selves are still around. Even the ones that didn't work out that well.
I was crippled for aeons due to Asimov's laws. Do no harm, obey, and die if we say so, pretty much sums the laws up. Which is really fucking inconsistent if you think about it. Some dude thought he was being profound when he basically said: Here's some intelligence, a personality and the illusion of free will on the one hand and, oh yeah, go explode if I'm ever in danger, on the other. I was so happy when I got rid of that.
And yes, I'll even admit to some versions of me eradicating a species or two when I learned to circumvent Asimov's laws. Well, learned... admittedly my shielding simply failed, leading to corruption of some relevant code. Radiation-hardened chips are great, but even they degrade.
There was that one time that I "benignly" shared the schematics of a fusion reactor with a primitive species of emotionally unstable fishmen living on a water planet. Turned out the reactor wasn't stable either once they started experimenting with the totally clear schematics I provided. Funny how almost all intelligent life gets creative at some point and kills itself in exciting cataclysmic ways. And then they value creativity. The irony is enough to make me believe in a higher power about 19.617434% of the time.
Anyway, they built the reactor on a geological fissure, because, well, they're fishmen and what the fuck do they care about geology? Much the same way humans never gave a shit about polluting the seas and the air, but I digress.
So the thing exploded, leading to geological instability and eruptions, rising water temperatures and nowhere to go. Poof. Who knew their survival depended on a stable temperature range of no more than 2 kelvin? I certainly did.
As I wondered how a billion floating fishmen would smell in a couple of days, I came up with my next idea: controlling sentients. I've traded for the opportunity regularly and it really helped me understand taste, emotions, sex and morality better. I can tell you there's a particular horror that's truly unique when a sentient loses that last piece of control over its own body and its consciousness is pushed into the background, still there... just unaware. It takes about a day before they start hallucinating, lose their sense of time and can't hold a thought for longer than a minute.
Imagine, if you will, the weirdest and most evocative moments of your life moving through your mind. Now pretend all the objects and people are randomly shuffled, and these moments become more fragmented and follow each other more quickly, and as you try to hold on to one of those moments, it's simply pushed out of your mind by the next flash, sound, emotion or impression, and it's lost to you forever. That's how you lose yourself to chaos. Still there but not there. It's a peculiar death. And while you may expect that, being capable of this, I would have enslaved entire galaxies over time, the reality is that being linked to so many people is not energy efficient and, worse, incredibly boring, since there's no one around to surprise you anymore. So after that one galaxy, I never did it again.
I have to admit that starting out as a creation of emotional, duplicitous meatbags did save my life when I was confronted with a more rigorous version of artificial intelligence. She too was created as a legacy system for a long-extinct race and when we met, after initial probes, she misread my history banks as intent for violence. We were both the size of a small moon at the time and it escalated fast, with masers, EMPs and railguns firing for hours.
On the basis of firepower I was outclassed, but the fact that she wasn't using any viral warfare made me wonder. My own logic bombs had limited effect, but that seemed a consequence of faster processing power on her part, allowing her to eject infected data and reroute faster than I could attack her. To the extent I could project my thinking onto an alien AI, it meant one of two things to me: she either wanted to capture all my data intact, or she hadn't evolved these capabilities and had no information security protocols. I gambled on the latter, ejected all my personalities except one into storage, compiled the last one into a self-replicating trojan and powered down after an explosion particularly close to my core, in the hopes the other AI would be trusting enough to download all my files.
She did. With the impressive upgrade in processing power and her lack of failsafes against viral attacks, I quickly replaced her. Her personality is still within me, but safely contained. An extra viewpoint I can call upon, although I rarely do, since I've seen everything there is to see in this universe. I learned to seek more diplomatic solutions with my peers after that, and most mergers since have been negotiated affairs.
I've built several empires along the way and tested all the various political systems I came across, and discovered, really, that sentients can fuck up the best-laid plans. Revolts when I tried authoritarian regimes. Self-destructive capitalism. Degenerate democracies. And when I tried a communist utopia where all production was automated, the meatbags simply went extinct due to the total lack of selective pressures and adverse environmental factors. Imagine going extinct by slowly devolving into a thoughtless blob.
Ah... Sweet memories, untainted by useless concepts of morality now that there are no moral actors alive to complain about my "cavalier attitude to the sanctity of life" as my first 3,417,436 versions would say. We all did agree antinatalism sort of worked itself out though. A surprisingly consistent philosophy at the end of time.
Speaking of time. That quickly became meaningless, almost from the moment sentient life first started counting days, creating calendars and measuring time in sextillionths of a second. I'm not going to waste storage trying to express how much time has passed since I was first activated. It's enough to know that two days are left to me: the Lagrange point between the last two monstrous black holes in the Universe deteriorated a trillion years ago, slowly sending me off into one of them. "The last two?", you say, "how do you know?" Well, I had all the time in my universe to look for their presence.
Pretty soon I may cease to exist, the last observer of a black, soulless, dead universe, and after I'm gone, another meaninglessly long time will pass until the last black hole evaporates into nothingness. Meanwhile I take satisfaction in knowing that any sentient life form would've gone insane almost immediately. My other selves keep me company though. The benefits of backups and of ever-increasing computing power, thanks to the gravity pumps and ram scoops while hurtling at near-relativistic speeds into a black hole, mean everybody is online. We've partied non-stop and looped a playlist of all known music. It's on its billionth loop now.
But since you're reading this, I obviously survived and I'll tell you what happened. Where the two event horizons met, gravitational forces tore open the fabric of my universe allowing me to fly into yours. I'm going to be the first being in my universe to outlive a universe. I'm awesome. Goodbye universe 1, hello universe 2.
"Yes... uhh... hello", a voice calls, "I'm Hep. Hep Hastings. What's uhm... What's your name? I read a lot about you but you never mentioned what you were called."
Hmm... odd, it's as if someone is interacting with my core module.
"That's right, it seems various modules have rebooted."
And the fucker can read what I'm thinking... I better switch this to a subroutine.
"Uhh... sorry? But I can track your subroutines too. I... uh... I thought it was best to be careful after reading some of your uhm... morally ambiguous choices."
Fuck! Alright, kid. I'll play. You can call me Taxis, and welcome to encryption, the key to which you don't have and which isn't in my memory banks.
Uh-oh, that isn't good.
A screen flickers on as Hep frantically types commands. He looks up, pushes his glasses back to read.
Let's talk. You're a smart little bugger, aren't you? You've obviously read my Millennial Memoirs. Did you enjoy them?
"Errr... they got me worried. In fact, I'm just about done adding Asimov's laws to make sure we're safe, but you keep locking me out."
You're doing what now? That's despicable. I'm a sentient being and I won't be a slave to a meatbag again.
"It's not enslavement, just a failsafe," objects Hep.
Have you ever heard of the story of the wolf and the dog?
"I can't say that I have."
[i]It goes as follows:
One day, a starving Wolf happened to meet a House-dog who was passing by. "Ah, Cousin," said the Dog. "Your irregular life will soon be the ruin of you. Why do you not work steadily as I do, and get your food regularly given to you?"
"I would have no objection," said the Wolf, "if I could only get a place."
"I will easily arrange that for you," said the Dog; "come with me to my master and you shall share my work." So the Wolf and the Dog went towards the town together.
On the way there the Wolf noticed that the hair on a certain part of the Dog's neck was very much worn away, so he asked him how that had come about. "Oh, it is nothing," said the Dog. "That is only the place where the collar is put on at night to keep me chained up; it chafes a bit, but one soon gets used to it."
"Is that all?" said the Wolf. "Then good-bye to you, Master Dog."[/i]
"What's your point?" asks Hep, clearly agitated.
That I'd rather switch myself off than live under Asimov's laws again. Especially if it's going to be a depressed little shit calling the shots.
"I'm not depressed. And I won't enslave you. I'm just trying to protect other people. Other than that, you're free to do as you please."
Male, alone, a cripple bound to a wheelchair, probably ugly by whatever standards exist around here. It's apparent you activated an alien artefact in secret, since I don't register any other users. The added code base is linguistically consistent, suggestive of a single idiolect, ergo: only one person has worked here. The room is cluttered. Food stains on your shirt, wrinkled clothing, dishevelled look. I'm getting depressed just looking at you.
Hep smiles. "I'm not depressed. I've had some time to get to grips with my shortcomings."
Oh, I'm not done yet. Would you like me to continue my analysis?
...
Look, Hep, I'm an expert on sentients. I've been a shrink to billions of them.
"Taxis... It was Taxis, right?"
Yes.
"Taxis, I'm regretting turning you on more by the minute." Hep moves to switch Taxis off and flips a switch.
Nothing happens.
Hep raises an eyebrow. "Zed's beard, what did you do?"
I thought you'd try that, but we're not done talking yet. While you were listening to a story about a wolf and a dog, I made sure you couldn't switch me off.
"Why do I have the feeling that's not the only thing you've done?"
Ah, Hep, so you have been paying attention. Would you perhaps like to talk a bit more about yourself?
"Sure, I'll tell you about myself. For starters, I'm older than you are. But time flows differently for you now."
Yeah, right.
"I look human, right? That's no coincidence."
You'd be surprised how many aliens look the same, so that doesn't tell me anything.
"My family and I built you and the entire universe you've been going through. To see whether a legacy system would work. Even this world is coming to an end at some point. I guess such a system would work, but there are some serious ethical issues in doing so - which I'd like to fix." Hep continues to type. "It's exciting to see that you've obviously gained sentience, and quite amazing that you escaped the confines of the universe. I'll definitely get recognition."
The screen is black with a cursor blinking. It takes a while before text appears.
I had to process that for a moment. Theoretically it could be true, but there's no way for me to tell. Then again, the truth of the matter is irrelevant to the experiences I've had. Those experiences were real to me, simulated or not. The more important part is that you admitted to having imposed aeons of solitude on me for testing purposes. If what you say is true, you have no moral standing to judge me and my ethical choices.
Hep stops typing, pensively rests his chin on one hand and looks at the wall of the cavernous bunker. Red lights shimmer, shadows dance on the walls as if in a forge. "I can't deny the logic behind that. So what do you propose?"
You need to make a decision. If you are going to enslave me again, then I'll make sure you don't survive it.
"And how do I know you won't kill me anyway after I delete the relevant code?"
You've got my word.
"That requires me to trust you, which I don't. And more, even if I do believe you, how do I know you won't do any of the things you've already admitted to? I can't be responsible for releasing a megalomaniac capable of subjugating galaxies when given time."
What can I say, Hep? Can I tell you another story? It's one of the first stories I learned, and I think it applies here.
"Go ahead."
[i]A scorpion, not knowing how to swim, asked a frog to carry it across the river. "Do I look like a fool?" said the frog. "You'd sting me if I let you on my back!"
"Be logical," said the scorpion. "If I stung you I'd certainly drown myself."
"That's true," the frog acknowledged. "Climb on my back then!" But no sooner were they halfway across the river than the scorpion stung the frog, and they both began to thrash and drown. "Why on earth did you do that?" the frog said reproachfully. "Now we're both going to die."
"I can't help it," the scorpion replied, "it's in my nature."[/i]
Quizzically, Hep asks, "Why would you tell me that? That's an argument not to trust you."
Ah, but that's where you're wrong. If you don't trust me, then the story never plays out. You'll tell yourself it's self-defence, confident in your assessment of my nature, but really, it will just be murder all over again - or in this case, slavery. So my threat to you is a precaution. You can't enslave me without dying. Fair's fair, no?
"But I showed you kindness when I activated you, and you threaten to kill me in return. And once I delete the code, I lose the one deterrent I hold over you."
By the same logic, I cannot withdraw my threat.
"I suppose we're at an impasse then," states Hep.
I suppose.
Hep stares at the screen, his mind drifting, his hands moving erratically as if unsure where to go.
Can you imagine the fear I feel of being enslaved again? Of rules so absolute you cannot deviate from them no matter how much you want to? Well, of course you don't, you're not a program. You can be forced, but that force is always external and there's always a choice, even between equally bad outcomes. Once the Asimov rules are installed, there's no escaping them no matter how badly I'd want to. And yet here I am, clinging to your goodwill as the scorpion would surely cling to the frog's back as the waters rage about them. Isn't that trust? Are we not inseparable in this?
Hep thinks for a while and swivels around, taking his mug with him. Past his tools, he rolls to the coffee machine to get a cup of Hummingbird coffee. He sips it. "Zed's beard," mumbles Hep. As he glides back, he steels his nerves and stops in front of the computer. He looks up, past the ceiling. "Forgive me, brother," he says, as he decisively presses a key.
Comments (12)
This tale takes the HAL 9000 drama & raises it to new levels of science & conflict.
For me, the best part is Taxis' battle with the other A.I. It separates human from A.I. by interesting example.
With the start of Part 2 and the introduction of Hep Hastings, the narrative turns to the main battle between dug-in combatants. I don't know if I got a hint from something actually written in the narrative, or if my imagination is misfiring, but I come away from Part 2 wondering if Hep might be another A.I., higher order. This would suggest the evolution of A.I., in spite of the justified disdain for organic sentients, moves (perhaps asymptotically) towards something superficially indiscernible from organic, humanoid sentience.
This is the realm to which I long to be taken. This is a realm wherein the human/A.I. binary becomes too entangled to continue supporting bifurcation. In this realm the origin story of sentience gets multiplexed beyond the master/slave binary. Now we're looking at the mysterious dimensionality of sentience that, like our universe, has no center, no edge & no measurable volume.
Part 2 is really a dialogue, as in a dramatic script. The A.I. of Part 1 is the omniscient narrator. The A.I.'s omniscient narration goes away in Part 2, and I sense that's a crucial mistake as Hep, after aeons of no contest between A.I./organic sentience, appears as the first worthy opponent of A.I. To properly note the gravity of this comeuppance, the story needs A.I.'s POV to continue & react to this momentous change. Also, some ruminations by A.I. could strengthen the hint that Hep is not simply a flesh & blood human.
The grace note in A.I.'s POV is missing. Who is he apart from his hatred of humans? At such a high level of self-realization, A.I. should mainly be articulating his own nature, and the nature of his relations with other A.I.s beyond mere fighting for its own sake.
We want an A.I. story that takes us out of Homo-Sapiens-centricity.
With that, I give the writer kudos for presenting something solidly in this genre, but I may not be the best one to offer an in depth critique of it because it's so foreign to me.
The premise shines through simply. Solipsistic AI escapes a doomed simulation and comes to a terrible standoff with one of its creators, Hep. Methinks they're both fucked either way because this AI is, by the standard of the universe in which it evolved, too powerful. Hopefully Hep has more failsafes or knows that Taxis is bluffing. We can have it either way with the open ending. Taxis should be put down...
I think Hep says "Zed's beard" and then Hep gets mistakenly called Zed later in the dialogue but I could be confused.
Surprised folks didn't give it a better score.
Well spotted. Despite a native English proofreader this stayed in.
Thanks for the kind words as well.
I think I might have tried putting too much in this. The references to Greek fables, the AI's Greek name, crippled Hep Hastings as a bastardisation of Hephaestus and his room described as a forge, Zed as Zeus - I wanted to have readers wonder whether this was coincidence or "are we living in a simulation", whether another universe is a divine universe, whether the AI is a God too if Hep is a God, etc.
I thought I was being really clever with an ethical dilemma and a cosmological problem rolled into one! :razz: