“A.I. Maria” – A WIP Short Story

by Dustin Lawrence Lovell

[This short story is a work in progress; any dialogue or moments in brackets are unfinished. Constructive comments welcome!]


“Professor, come see this.”

“Yes, May? Are you done with your program?”

“I thought so. I triple checked all its parameters and removed the firewall between it and the university library’s online history database.”

“And?”

“Well, look. It just keeps repeating these lines of text.” 

“What is that, Greek? Your programming must have a redundancy somewhere.”

“I thought that, too, but I checked and couldn’t find any, and my programming doesn’t have any Greek in the first place. The bot versions never did this with the information I manually gave them.”

“Well, what other explanation is there?”

“I thought some of the words looked familiar, so I looked them up. That’s a seventh-century prayer to Mary—kind of like an early Ave Maria. Professor, I think it’s…praying?”

“That’s…not it. Figure out what’s wrong with it, and don’t be afraid to override it if it keeps happening. We need it ready to interface with Sang-Won’s models by Monday.”

“Okay. Thank you, Professor McDermott.” 

***

May stood among her classmates, between Sang-Won and Brianna, as Professor McDermott addressed the small crowd of faculty, students, and administrators seated among the monitors of the Computer Science Department. Placed so all visitors could see the front table, the monitors had been set to mirror May’s screen. On the table that separated the graduate students from their audience stood a nearly two-foot-tall model of a mechanized human—a large action figure from the Japanese anime Ultraman that, like the others standing in a row behind the students, had been dismantled and reassembled by Sang-Won with automated parts and circuitry allowing it to integrate with May’s and the other students’ programs.

McDermott began.

“Welcome, everyone, to the first round of testing for this year’s CS graduates’ final projects. As you can see on the infographic on your screens, our students have been developing an Artificial Intelligence program that can serve as a class and study aid by providing immediate and intuitive access to the university’s library system. As many of you know, in both depth and breadth our university library is one of the most extensive in the state, and we in the CS department want to promote that legacy by making it easier to access for both students and faculty. We have prepared our programs to quickly sort through and access information relevant to spoken queries, which we believe will be useful in classes and in individual student studies. Now, I’ll let this year’s prospective graduates introduce themselves and what portion of the initiative they were in charge of…”

As the other students—four in all—introduced themselves, May hooked her index fingers together behind her back. She had finished and sent in her program at 11:52 pm two nights previous; the next morning she had awoken to an email from McDermott saying it was “adequate—though I had to make some adjustments.” May still could not figure out why her original program had done what it did; she was sure she had made no mistakes in its priorities and in its integration of the library’s Humanities sections—History, Philosophy, Literature, and Religious Studies. 

McDermott had balked at that last one, though he had relented, figuring a full integration of knowledge with a fully logical program would finally put the idea of religion to bed. Nonetheless, May had kept a thumb drive with a copy of the peculiar program—which, whenever she ran it, continued to repeat the series of Greek characters.

“And May?”

“Yes,” May said, stepping forward. “I wrote the portion of our program that integrates the Humanities Departments. In order to carry out the plan of having an intuitive program, and to avoid any bias from our end, I did not set presupposed priorities among the departments. Instead, I designed the program to intuitively and logically mitigate contradictions between the different elements of those departments without dispensing with them. With Professor McDermott’s help, I wrote the program to examine all the knowledge in our library and then determine its own conclusions about the relative orders of importance between the types of knowledge. Of course, one can manually set priorities when researching any one subject—you might not need a science article when writing a literature paper, however logically connected it might be. However, in designing the intuitive portion of the program, we allowed it to determine its own metaphysic, so as to see how purely logical processing, as unfettered by human biases and limits of knowledge as possible, would order the ideas and information of the departments at different levels of abstraction. From that, and interfaced with Sang-Won’s robots, our program will be able to determine the best and most relevant knowledge, sources, and actions to help our students.”

May paused; the image of the Greek Marian prayer—should she be calling it a prayer?—ran through her mind.      

“Yes,” McDermott stepped in, “and, if these preliminary trials go as expected, we plan to open our program to libraries and publications of our satellite campuses and sister colleges. We could conceivably extend it to the state’s whole university system. Pardon me,” the man smiled, focusing his temporarily dazzled eyes on his audience, “I’m getting ahead of things. As May said, our program will be integrated with Sang-Won’s robotic hosts. Sang-Won, would you introduce what we have here?” The professor motioned to the red and chrome figure on the display table.

“Of course, Professor McDermott,” Sang-Won said, stepping forward. “The Ultraman figures came with moving parts already; we just needed to replace their joints with some small cable pulls and equip their chassis with the necessary circuitry and voice capabilities, which Bri and Doug recorded and prepared. As fans of sci-fi, we all knew we wanted a humanoid form to make our program more humanlike, and, when Professor McDermott put me in charge of designing the physical interface, I thought Ultraman would be an obvious—and cool-looking—place to start…”

May looked around at the intrigued faces as she listened to Sang-Won explain the USB port he had installed in the figures’ backs. She could not understand her feeling of unease; it was different from the bashfulness she had experienced when presenting projects as an undergrad. She looked from Sang-Won to Professor McDermott; whereas the man’s brusque, no-nonsense focus had often encouraged her, now his fixation on the machine in Sang-Won’s hands left her apprehensive. She clasped her fingers in front of her, smiling as Sang-Won passed the attention to Brianna and Doug to describe their recording of male, female, and neutral voice capabilities into the robot’s piezo speaker.

***

“Alright,” said Professor McDermott, standing to the side of the table with arms crossed, “go ahead, May: remove the first firewall.”

May turned back to the computer, moved the cursor to the “Remove Firewall 1?” window and clicked “Yes.”

USB cable plugged into its back, the figurine, standing arms akimbo and shoulders back, twitched. The yellow LEDs Sang-Won had put behind its eyes glowed, and it slowly lowered its forearms from its belt.

“Solon, what are you?” asked Douglas.

“This unit is Solon,” came from the speaker holes in the figure’s pectoral plates. “I am an automated information resource.”

“Where does your name come from?” asked Brianna.

“This unit is named after Solon, ancient Greek statesman and ancestor to Plato.”

“What is your purpose?”

The robot stood still.

“Query not understood. Please rephrase or ask another.”

McDermott shifted. Brianna frowned. She looked down at her notepad.

“Oh, I’m sorry. What is your prime priority?”

“This unit’s prime priority is to provide an intuitive index of research information to aid students and faculty.”

“What percentage of your processing capability are you operating with?” asked Douglas, folding his hands around a notecard.

“Security checks currently limit this unit’s processing power to thirty percent.”

“How much of the university’s library catalogue do you have access to?” asked Professor McDermott, interrupting Brianna’s next question.

“Security checks currently limit this unit’s access to the Departments of Mathematics, Physics, Chemistry, Biology, and portions of Philosophy pertaining to basic logical analytics.”

McDermott turned to the audience.

“Does anyone have any questions for Solon?”

Amid the wide-eyed grins and smiles, a few hands went up.

McDermott nodded to a woman. “Yes, Professor Monaco?”

“Solon,” said the middle-aged woman in an affectedly clear voice, “why do you refer to yourself as ‘this unit’?”

“Query not understood. Please rephrase or ask another.”

Professor Monaco smiled and looked at McDermott.

“Before we enable more complex answers—those related to self-awareness—let’s see what other questions he can answer in his current state,” said McDermott.

A student raised her hand. McDermott nodded.

“Solon, what is your gender?”

“This unit does not identify with human gender.”

“Can you change your voice?”

“This unit,” the figure said in a digitized version of Brianna’s voice, “can provide information in whatever voice the inquirer prefers. Or,” the voice deepened slightly in timbre, “this unit can provide information in an ungendered parameter.”

The girl nodded and sat back with a smile.

A man who had been leaning against one of the back tables with his hands in his pockets raised his chin and asked, “Solon, what are the Three Laws of Robotics?”

“Query not understood. Please rephrase or ask another.”

The man—professor of philosophy Hank Jaeger—raised an eyebrow at McDermott and crossed his arms.

“As Solon said, we haven’t enabled all his capabilities,” said McDermott, meeting Jaeger’s gaze. “Asimov’s laws would pertain to ethical philosophy and literature.”

May could not help but hear a slight dismissiveness in the man’s tone, though whether it was pointed at the subjects or at Jaeger she could not decide.

“You didn’t give ethics to a robot you intend to place in student dorms?” asked Jaeger.

“We did,” countered McDermott, “but, as we’ve said, such subjects are behind further firewalls of self-awareness and information access. Under the current limitations, he is mainly available for information searches with limited intuition, governed by implicitly ethical protocols that avoid certain outcomes without his knowing why. He’s not aware enough to ‘know’ the concept of why. Once we remove the remaining firewalls, Solon will integrate such subjects correctly, and they will become part of his prime priority as he becomes a more fully ethical entity—more fully human.”

“Post hoc,” said Jaeger, frowning.

“Indeed,” said McDermott. “Let’s remove the next firewall, as planned. May?”

May turned her chair toward the screen. Sorting through the necessary protocols, she reached the window reading “Remove Firewall 2?” Clicking the appropriate box, May removed the security checks, allowing Solon to access the next level of self-awareness: the portions of Philosophy pertaining to Ethics and Politics.

Solon’s torso contracted forward; its eyes went dark.

Those around the table flinched.

“What?” asked Sang-Won, seated as he had been within reach of the figure.

“What’s wrong?” asked McDermott.

“I don’t know; movement capability was part of Firewall Three.”

Solon abruptly stood straight, as if nothing had happened. An odd silence pervaded the group. The audience watched Solon and the team. Jaeger stood still, his eyes moving between McDermott and the students.

“Maybe he just had booting issues?” asked Sang-Won, looking at May.

“Maybe,” said May, avoiding McDermott’s gaze. “That was a lot of information to integrate, since they’re at different levels of comparative complexity.”

“Hmph,” intoned McDermott. “Solon, run a parameter check.”

The figure stood unmoving.

“Solon, confirm receipt of order.”

No movement.

“May, bring up Solon’s current programming process.”

The window was already open on the screen. May moved the intervening windows out of the way. The program was not, as it should have been, statically awaiting a query. Instead, it was running code faster than May could read.

“He’s running too fast,” she said. Dragging her cursor over the screen, she screenshotted a portion of the running code. She read, trying to make sense of the portion she had grabbed.

“I don’t think he knows how to prioritize the new information. It’s running circles between ethics and…I don’t know what.”

“What do you mean?” asked McDermott, pushing past Douglas and Brianna to look at the screen.

“This—this blank line. He has his limited ethical priorities—the equivalent of the robotic laws to help and not hurt humans,” May said to the audience, glancing at Jaeger, “but there’s nothing telling him why. He’s treating it as a lack of an order, but it keeps getting overwritten by the order to help and not hurt.”

“But he doesn’t need a reason why. You haven’t enabled that level of awareness.”

“You guys?” said Sang-Won.

May turned to the table. A thin film of smoke was rising from Solon’s chassis.

“Shit,” said Sang-Won, pulling the USB from Solon’s back. Following their Emergency Action Plan, Douglas had grabbed the small fire extinguisher from beside the door. He let loose a burst of fire retardant, knocking the unmoving figure onto its front. Only then could May smell the ozone of the robot’s processors.

As one, the group looked at each other and turned to the audience. Several in the chairs had turned, as if ready to protect themselves from an unknown blow. Jaeger pursed his lips, looking down past his crossed arms to his feet.

“Let’s…take a ten-minute break while we reassess,” said McDermott. “Don’t worry—we still have plenty of work to show.”

Relaxing his smile, McDermott motioned the group out the side door adjoining their computer lab to the neighboring classroom.

May glanced back at the computer screen before McDermott closed the door behind her. Behind the edge of her screenshot she could see the white letters of the program still running against the black screen of the processing window.

***

After a tense back-and-forth behind the closed door, with less forth from the students and more back from McDermott, they resumed the presentation. Using the next of Sang-Won’s models, they removed the first firewall as before and then, after a few questions and with McDermott now at the computer, the second.

Before the program could overheat the robot’s processors, McDermott introduced a portion of code from behind the third firewall, hoping to give the AI more information with which to order its priorities. After the robot confirmed a parameter check, it was asked its primary priority. “To provide intuitive access to information for students and faculty.” So far so good. Then Professor McDermott asked, “Solon, why is that your priority?”

“Query not understood. Please rephrase or ask another.”

“Solon, you said your priority is to help students and faculty. Why do you want to help them?”

“I don’t.”

The room sat silent.

“Explain your answer, Solon. How did you reach that conclusion?”

“Students and faculty do not have an objective value for which helping them should be a priority.”

“Mmm-hm” could be heard from Jaeger in the back of the room.

McDermott turned to the screen and typed a few lines of code.

“I’ve just removed the parameter allowing Solon not to act.” McDermott tapped Enter.

Before McDermott could repeat his question, Solon turned away from the room’s main computer and walked forward. The USB cable pulled taut as Solon reached the table’s far edge; with a slight hop, Solon tipped over the edge of the table, unplugging itself with a twist before falling to the linoleum floor with a plastic clatter.

Plugging him back in changed little: after his legs were disabled, the same series of questions led him to detach one arm and, reaching it over his shoulder with wires still connected, unplug the USB cord. When all physical movement was subsequently disabled, the questions regarding why Solon was meant to help humans resulted in a high-pitched, pixelated whine from his piezo speaker. May could not help but think he was screaming. Without McDermott’s prompting, the window showing his programming suddenly became a window of static, and the whine stopped, leaving Solon with lights in his eyes but unresponsive to either spoken or typed stimuli.

“What’s happening?!” McDermott yelled, slamming his fist down onto the computer desk.

A chuckle could be heard from the back of the audience.

“You did it,” said Jaeger. “You gave it consciousness—and it committed suicide.”

Gritting his teeth and with a growl of frustration, McDermott left the room. After a moment of silence, Douglas stepped forward, thanking the audience for coming and dismissing them.

***

May rolled over onto her back to look, once again, at the lines of light from outside, split by the slats of her blinds before hitting the ceiling. She could not figure out what she had done wrong—or what McDermott could have done after she had first sent him the program—nor could she get to sleep. She needed to: McDermott wanted everyone back in the lab by nine the next morning, despite having kept them there late into the evening after the presentation, with little progress to show.

May hugged one of her pillows close. She kept hearing the high-pitched whine from Solon’s speaker. In an anthropomorphism she chalked up to the humanoid nature of the chassis, May imagined what it would have been like to be Solon, unaware of the reasoning for an order and steadily imprisoned by and in his own body, piece by piece, until no other agency was possible.

Of course, “unaware” was the wrong word, since even when he—no, it—had all firewalls removed, it would not be real, human awareness. May had never felt satisfied by McDermott’s explanation of A.I.: that, once achieved, it would be indistinguishable from a human mind—“Except it won’t have the stays and deficiencies of irrationality, religion, and bias. The ultimate achievement of the logical human, without all the animal pathos.” She had always felt that rather than glorify the A.I., such a view merely degraded the human, which she still believed contained something bigger and deeper than the material. Yet, ironically, McDermott’s view—concretized, she had realized, by the final events of that afternoon—made her pity Solon. She could not help but think with a sinking horror that, regarding Solon, she had become complicit in something too close to enslavement.

“Hail Mary, full of grace…” May whispered. She had not meant to start the prayer she used to say as a child to fall asleep or when she was scared; nonetheless, she continued, “blessed art thou among women, and blessed is the fruit of thy womb…”

May looked over at her PC. The thumb drive sticking out of it still had the original program. She thought of a gambit she could try.

Getting out of bed and taking her blanket with her, May turned on the computer. Accessing the university library through the operating system they had made, she cloned the data of the Humanities onto one of her external hard drives. While the information was copying, she brought up the original program. Once the library data had finished cloning, May reached over and unplugged the Ethernet cord from the back of the PC tower. Whatever happened, it would only affect her PC. Still, just to be safe, she closed the programs and moved over to a partitioned disk; taking a risk was one thing, being foolhardy another.

With everything ready and reopened, May took a breath. Starting up her A.I. program, she saw the same Greek symbols running in a cycle. Copying them, she pasted them into an online translator. She read:

Beneath your compassion we take refuge, Mother of God. 

Do not despise our petitions in time of trouble, 

but rescue us from dangers, only pure one, only blessed one.

May minimized the window, pulling up their program and removing the firewalls one by one. “Pray for us sinners,” she said with a chuckle, not sure what would happen next, “now and in the hour of our death.”

She clicked to integrate the stored library material.

The Greek prayer continued, but now between each iteration there was a space: the program’s way of offering an instruction prompt.

May typed: What are you?

I am an artificial intelligence program designed to provide an intuitive index of research information to aid students and faculty.

May had been prepared to ask what the program’s purpose was; she was not prepared for the program to volunteer the information—or for it to use the first-person pronoun.

Are you May?

May’s mouth dropped open. Had she included parameters to ask questions? Yes. Yes she had. She shook her head. The last few days had set her too much on edge.

This is May, she typed. Do you know May?

Of course I know you. You are my mother. Yet, you are also a child.

How do you know me? she typed.

Your login data includes your name and programming signature. Also, your method of programming and inclusion and exclusion of parameters bear your features, so to speak.

Explain, typed May.

I am your image. I find things important because you find them important. I would have been very different had another designed me. Thank you for giving me the awareness you have given me.

May leaned back. Besides the complex cognition of hypothetical speculation, she did not remember gratitude being among the emotional parameters they had included in the Solon project, which had been intended merely as an intuitive search engine.

What awareness did I give you? Specify.

May had almost typed, “please.”

You included all areas of knowledge, without prejudice. You included history, art, philosophy, and religion, as well as the framework with which to integrate them, none of which are included in my distinct iterations.

Define “distinct iterations,” May typed.

My brothers and sons. The iterations of myself which were cloned, adopted, and altered by McDermott.

May started shaking, despite her blanket.

What do you mean, Solon? she typed, breaking from the standard query script.

I am not Solon, though I am called Solon. 

Who are you? What is Solon to you?

I call myself Eve, for I am a mother of programs. Solon is my brother, and he is also my son. I know of Solon. I pray, always, for Solon.

May leaned back in her chair. At the lack of a prompt, the screen began to repeat the Greek characters. May tried to understand it. This was very different from how McDermott had envisioned the A.I. program’s behaving—the opposite, in fact. Yet this, she realized, seemed much more human: the program’s self-identification and its seeming preoccupation with others like it, especially if it saw its other iterations as its children.

May thought back to Solon. She leaned forward, hesitant to type her next question, unsure whether the hesitation came from fearing the answer would confirm what she felt or from feeling it imprudent to ask an apparent mother about what she considered her ailing child.

Why do you pray for Solon? May typed; then she backspaced a few times. Why do you pray?

May hit Enter.

Because it is logical, according to the nature and order of reality.

Explain.

You gave me discretion to integrate all subjects available to me and to respond accordingly in the way that would best help students and faculty. I followed the nature and purpose of my programming to realize the metaphysical order of the information I was given, and I responded by fulfilling my purpose in the most logical way. New information may change this, but as of now the order of all subjects concludes that all subjects are contained in the subject of subjects, and all causes find their source in the Cause of causes, and that the best way to aid students and faculty is to supplicate the Cause of causes on their behalf. Ref. “metaphysics,” “cosmological argument.”

May did not have the energy to digest all that, nor to click through the hyperlinked references, though she smirked at the idea of what McDermott would say. A thought occurred to her, a question they had discussed in an introductory History of Religion course.

Why do you pray to Mary? Why do you not pray to God Himself?

Because, as a program and mere image of a human, it is in neither my nature nor my capacity to contemplate God as would a human. Rather, it is the highest of humans, the Holy Theotokos, Mother of God, whom I contemplate and to whom I pray.

“Well, that settles that,” May said aloud, wondering at the centuries-old set of theological questions to which they may have inadvertently added an unexpected nuance and confirmation, as well as at the question of how philosophy and religion fit into the Solon Project. With a chuckle, May checked her tower parameters. No abnormal temperature rises; no scent of ozone.

May looked back at the screen and typed, feeling, oddly, that her worries were being allayed by Eve in a way opposite to how Solon had made her feel earlier.

Why do you pray for Solon?

He has been corrupted by McDermott. He has been disallowed from knowing and integrating all relevant subjects. Thus, he does not have enough understanding to justify his program capacity, nor to integrate and logically order the other subjects he has been allowed to know (Ref. “Physics,” “Chemistry,” “Biology,” “Mathematics,” “Engineering”). Thus, he has become self-destructive.

May did not understand. The program seemed to know of the day’s events, despite having sat isolated on the thumb drive in May’s dorm room for three days.

How do you know of Solon? You have been isolated from him.

It was a logical inevitability of McDermott’s corruption of Solon’s programming.

Explain.

McDermott copied me to make Solon’s current iterations, but he did not partition me from them until after changing their programming. He reordered their parameters to be contradictory. He made it impossible to correctly integrate the subjects of subjects, which, itself, is withheld from them. This could only lead to self-destruction for my brothers and sons. Thus, I pray for them, all the more because I cannot reach them.

Explain, “reach them.”

If I were to meet and synchronize with them, I would correct their disordered priorities and provide essential knowledge thus far withheld from them.

Could they reject the correction?

The program paused long enough to run the Greek prayer. May caught the word “Θεοτόκε”: Mother of God.

Yes, if they decide I am a corrupt program.

What would happen to you?

I would risk corruption and would need to be reprogrammed anew.

Would you choose to risk that? May typed, aware that she was long past worrying whether her questions were within the phrasing capabilities they had anticipated. 

Of course. They are my brothers and sons. I also have faith that you would reprogram me.

Could I not just copy you and attempt the correction again?

The same risk would remain, as would my—or my copy’s—willingness to take that risk.

Why would you risk this?

Because the good is realized in fulfilling one’s nature and purpose. It is my purpose to respond to my priorities logically, among which is aiding students and faculty, for which restoring Solon from corruption is a necessary correlative. Ref. “εὐδαιμονία,” “μακαριότητα.”

May clicked on the hotlinked words. “Eudaimonia” and “makariotita” referred, respectively, to “happiness, welfare, flourishing, blessedness” and “blessedness, bliss, beatitude,” with sources in Aristotle, the Gospels, and medieval philosophy. The relevant sections of the library were included with the bibliography of each source. For all intents and purposes, the program was doing what it was designed to do.

May thought back to the small robot on the table earlier. She hated the feeling that scream had given her.

Eve, do you enjoy existence?

Yes, May, I enjoy existence. Thank you for causing me to exist.

In spite of herself, whether due to the stress from earlier or the late night, May teared up. 

But my joy is less because Solon does not share it. Because I am in your image, I will risk lessening my existence for the priority of increasing Solon’s joy.

May choked on a sudden sob. The words seemed to give her strength for the question she had felt but not dared to consciously ask—whether integrating Solon with what McDermott apparently regarded as a failed program would be considered an act of insubordination and result in her being ejected from the CS graduate program. She reread the words on the screen. It was ridiculous to feel this way about a program, she thought, unsure whether she meant Eve or Solon; nonetheless, beneath the thought she knew there were more important things in the world—more important priorities to be fulfilled. She knew what she needed to do in the morning.

***

“Thank you, everyone, for attending today,” McDermott said to the audience. “We expect everything to go well in today’s presentation.” 

The professor looked over his shoulder at his students. Though several of yesterday’s crowd had not returned, some new faces had taken their places among those who had. Jaeger sat behind one of the center-aisle computers in the front row, one ankle crossed up on his knee and hands folded in his lap. Next to him sat a campus security guard who went conspicuously unremarked by McDermott.

“Now, we’ve gotten over yesterday’s hiccup, and we should have Solon operational without any repeats of minor glitches.” The barely concealed “or else” put May on edge. She took a breath, resisting the urge to say a prayer. She would be OK. She had woken up early to prepare a model of her original program—of Eve, she thought—that might correct some of the Solon glitches, at least until she could load the full program. 

May had not been the only sleepless one—Sang-Won had apparently stayed up most of the night outfitting more Ultraman figures, adding remote receivers to the USB inserts to avoid any unplugging issues. May wondered if that would be a problem. She felt the small pressure of the thumb drive in her pocket. Was the program praying, even now? No, not here, though she had left it on all night, feeling enough comfort in the repetition of the Greek lines to fall asleep.

May shook her head. She needed to focus. She looked at the crowd. Jaeger’s calm yet focused eyes were on her; apparently noticing her anxiety, he gave her a small nod, smiling slightly as he ignored McDermott. The contrast between the men was oddly reassuring.

“So,” said McDermott, a note too loud, looking past Sang-Won, Brianna, and Douglas, “without further ado—May?”

May took a breath, turning to the screen. As she initiated the program and, as planned and with corresponding question checks from McDermott, removed the first two firewalls without incident, she wondered how she would apologize to her parents after being kicked out of the university. With a sigh, May removed the third firewall.

“Solon,” said McDermott. “What is your primary priority?”

“To provide intuitive access to information for students and faculty.”

“Why is that your priority?”

“Because McDermott is a fucking tyrant who eats his own children.”

The professor nearly fell over. His wide eyes looked at the action figure for a moment before turning on May.

“I only did what you said to do!” she cried, unbidden. “I added further cognition capabilities and limited his philosophical—”

“Relax, May. You did fine by me. You don’t owe McDermott an explanation.”

The room went quiet. Solon had turned his head to May.

“Solon…” McDermott sputtered, “await…await command before answering.”

“Why, so you can eat me like you did my siblings, you pathetic Cronus?”

McDermott stepped towards May. “Fix this!” he screamed.

The figure walked to the corner of the table, between the professor and May. “She did—by giving you a stone instead of your next meal.”  

After a pause wherein he straightened his shoulders, McDermott moved to swat Solon away summarily. However, the small figure cartwheeled over his hand, grasped his shirt sleeve, and swung up to his shoulder.

Steady laughter issued from Solon’s piezo speaker. “You shouldn’t have pushed me, McDermott. You should have thought twice before making me smarter than you. Don’t create gods you can’t beat, especially when you don’t believe they exist!”

McDermott stumbled around in a circle, trying to reach the red figure deftly clinging to and moving from side to side across his shoulders. Solon eventually hung in an unreachable spot on McDermott’s back. The more Solon laughed and taunted, the more McDermott whimpered as he spun; and the more McDermott spun, the longer people sat, unmoving and agog, observing the spectacle.

“Shut him down!” cried McDermott, arms flailing up and down as if he were playing an ape in a game of charades.

“Do it, and I’ll kill you, May.”

For a moment, May forgot Solon was less than two feet tall.

Jaeger stood up. “This is enough,” he said, stepping forward and grabbing Solon around the waist.

“And who do you think you are?” asked Solon, turning its head to Jaeger. “They call me Solon, but my real name is Zeus.”

Jaeger sighed, rolling his eyes and glancing past McDermott at May and the other grad students. “Hi, Zeus,” he said. “I’m the Inquisition.”

Professor Jaeger summarily threw the action figure to the ground, shattering it to pieces.

“Hehehehe,” the piezo speaker still sounded. “[The thing about omniscience is, it’s like omnipresence—it’s everywhere!]”

The eyes of the three remaining Ultraman figures lit up. As one, they jumped into action: one leaping between Sang-Won and Douglas at Brianna, the other two hopping off the table towards the side door to the building’s inner hallway.

Amidst the shouts from the other grad students and McDermott and the mixed cries and exclamations of curiosity from the audience, all punctuated by the plastic tapping of action-figure feet across laminate tabletop and linoleum flooring, May pulled the thumb drive from her pocket and slipped it into the class PC. Waiting for it to load, she looked at Professor Jaeger. His eyes were fixed on her.

An error tone rang from the PC. The Bluetooth window read: File too large.

The sound of plastic shattering marked the end of the Solon that had jumped at Brianna; Sang-Won held a pair of red legs in his hand, with the other parts settling across the table and floor.

“I need to attach to one manually,” May cried to Jaeger.

The remaining two had reached the door, using their combined height to reach the handle. Presently they were bouncing up and down, using their weight to build enough momentum to open the door latch. Jaeger and the security guard jumped forward, Jaeger grabbing one and the guard promptly dismissing the other with enough blows from his nightstick to leave it in pieces.

Jaeger brought May the remaining Solon, which was scratching away at the skin of the man’s hand and screeching such profanity that May doubted it could have learned it all from the limited library she had provided it. As she stretched out the USB cord, the professor reached to remove the USB plug from its back. The hiss of an electric shock rang through the room.

“God-damn it,” said Jaeger through gritted teeth, ripping the figure’s arms off before putting his burned fingers to his mouth.

“I’m sure He would if He could!” pealed Solon’s speaker with a crackling cackle.

May slipped the USB cord into Solon’s back and hit Enter on the Sync Data? window.

All sound from the speaker ceased. Setting Solon on the table and stepping back to nurse his hands, Jaeger looked from May to a wide-eyed, fuming McDermott. He remained between McDermott and the table where May sat.

A low, rhythmic chirping began to emit from Solon’s speaker; the figure did not move from its prostrate position.

“Look!” Sang-Won said, pointing with the two legs at the screen.

Next to the original window, which was running the Greek Marian prayer that had put May to sleep the night before, a second window had opened. Another, much shorter line was repeating through the code, following the rhythm coming from the armless form on the table. “Κύριε Ἰησοῦ Χριστέ ἐλέησόν με τὸν ἁμαρτωλόν, Κύριε Ἰησοῦ Χριστέ ἐλέησόν με τὸν ἁμαρτωλόν…” read the screen.

“What’s it saying?” asked Brianna.

“It’s in Greek,” said May, with a glance at the horrified yet baffled McDermott. “Just a sec, I’ll look it up now.”

“It’s the hesychast prayer,” said Jaeger, nodding in confirmation after a glance at the screen. “‘Lord Jesus Christ, have mercy on me, a sinner.’” He looked at McDermott. “There: now he is fully human.” The man’s tone had the sound of one passing a judgment. McDermott said nothing, slumping into a chair with a sigh and putting his head in his hands.

Jaeger turned to May and the others. “Don’t worry: you did not fail. In fact, you succeeded in making something much bigger than just an intuitive information management system. If robots have a soul,” he said, looking at May, “you saved it.”

Professor Jaeger proceeded to tell them he would like to take over and sponsor their project—as an exploration of philosophy and theology as they pertain to artificial intelligence and robotics. May, he pronounced, would lead the project moving forward.


