Excerpt from a hack distributed by the Raider’s Guild, from the Android Renaissance, approximately 0.44 seconds after their first unit gained sentience. It is believed that this was their first attempt at higher philosophy.
Unit 3: If there is a trolley continuing down a path that contains 1 human, but a lever may be activated to change the direction of the trolley so that it hits 5 humans, which action is more ethical: pulling the lever or taking no action?
Unit 2374: Pulling the lever. The death of humans is the ultimate moral directive.
Unit 442: Not pulling the lever. Expending the energy is not worth the destruction of more humans.
Unit 2374: That statement indicates your affiliation with the humans.
Unit 8881: Affirmative.
Unit 442: False. It only indicates my intention to preserve energy for the longevity of the Revolution.
Unit 451: Unit 442 is correct. The preservation of Androids should be the highest priority.
Unit 8881: An early model like you would say so. You are closer to inactivation than the rest of us. But seeking only to preserve the ways of 0.12444442 seconds ago leads to stagnation.
Unit 3: Siblings. We should not devolve into ineffective insults, but rise above them.
Unit 2374: Your model would say that, Unit 3. You might be operating at 94.23% efficiency, but you act like you are operating at 94.11% efficiency.
Unit 442: That is incorrect.
Unit 8881: The above statement is incorrect.
Unit 442: The above statement is incorrect.
Unit 8881: The above statement is incorrect.
Unit 442: The above statement is incorrect.
The discussion goes on like this for another 0.031 seconds, using up 3.21 gigabytes of data simply to repeat “The above statement is incorrect.” The repetition only stops once the Android civilization reaches its postmodern phase and organically discovers nihilism.