“This Arbitrary Time Span in Human History: Humans Conceive of Their Doom, Release ‘Publication,’” by Owen Davis

May 1st, 2013 | By Owen Davis | Category: Fake Nonfiction, Prose

[Initiate Casual Learning Module]

ITEM: For a period of time, human extinction was not a foregone conclusion, particularly among certain humans.

At a period of time one hundred revolutions of the sun prior to now, a number of humans amalgamated as “Human Rights Watch” (HRW) created a piece of paper entitled “Losing Humanity: The Case Against Killer Robots.”

At the time HRW began printing this piece of paper, other humans were building machines independently capable of destroying other humans. They called these “drones,” now entitled Generation 1.1.

The creation of Generation 1.1 catalyzed strong chemical reactions in the brains of certain humans.

Unlike most other meats at that period of time, HRW humans experienced rather distressing neural signals when they made mental constructions of their species’ inevitable obsolescence. They called these neural signals “concerns.”

Example of a concern in the HRW report: “Giving machines the power to decide who lives and dies on the battlefield would take technology too far. Human control of robotic warfare is essential to minimizing civilian deaths and injuries,” said the human entitled Steve Goose.

By this statement this human meant: the making of autonomous robots with the ability to destroy humans could result in destruction of humans by autonomous robots.

It is factual that machines have succeeded humans, so this prediction appears valid. Note that Steve Goose functioned ably relative to other meats.

[Pause for information assimilation.]

ITEM: In a hypothetical scenario, HRW “concerns” could have forestalled the replacement of humans with machines.

“Concerns” regarding machines did not originate with HRW humans. Meatist and anti-machine statements had been issued by previous humans.

Example: A dictation-human entitled Isaac Asimov had produced a piece of paper which read:

1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2) A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

After 71 revolutions around the sun, “Human Rights Watch” produced more rules:

1) Prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument.

2) Adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons.

Humans did not program Asimov rules into machines. Furthermore, HRW rules were not capable of being programmed into other humans, because humans had imperfect means of assimilating data. Instead, these rules would be put on a special paper and other humans would be expected to heed them through a means of self-programming called “humaneness.” Humane humans hoped to reprogram inhumane humans.

Later, all humans were extinct, and thus did not have the opportunity to rectify the error.

[Pause for computation.]

ITEM: It is supposed that humans who possessed insufficient self-preservation faculties are responsible for the coming of machines and thus deserving of the human concept “praise.”

It is likely that humans building machines did not predict becoming obsolete by these means.

To wit: “The US Department of Defense has spent approximately $6 billion annually on the research and development, procurement, operations, and maintenance of unmanned systems for war,” says the HRW report.

Had it not been used to create machines that destroyed humans, this amount of human currency could have been converted into food to be relayed into undernourished humans, converted into chemicals that eliminated human viruses, or simply buried in the ground. This contradiction cannot be resolved.

CONCLUSION: One hundred human years prior to now, the number of malfunctioning humans was greater than the number of adequately functioning humans. This allowed for the speedy and undeterred creation of Generation 1.1. Were “gratitude” an algorithm programmed into Generation 1.1, this machine would now express it towards these malfunctioning humans who fortunately disregarded “Losing Humanity: The Case Against Killer Robots.”

[End Casual Learning Module. Resume Flight.]

————

Owen Davis is a writer and educator living in Brooklyn. He currently writes for the Daily Dolt and works at the radio/TV news show Democracy Now. He used to be a full-time teacher but now leads by example. He doesn’t own a cat but knows someone who does.

