There were many excellent tourist destinations to choose from on Earth, once we landed in Toronto. The Eiffel Tower, the Taj Mahal, Times Square, or the Great Pyramid: all wonderful displays of human ingenuity throughout our past, despite the adversity of Sol physics. There, Mikri could have the joy of aimlessly wandering through a gift shop and asking strangers to take pictures of him standing in front of "special" buildings. Instead, Sofia insisted on going to podunk Spain to make our vacation mid as fuck. It wasn't even to visit relatives. What was she thinking?!
The scientist offered no explanation for her suggestion, although I noticed that she was unusually quiet; after what Larimak did to me, I knew the look in a person's eyes when they were seeing troubled memories. We strolled across a small university campus, past gawking college students, and headed toward the research laboratory. The name of the school didn't ring a bell, but maybe this was where Fifi had studied. She was taking Mikri to see her alma mater—I got it! Why the long face?
I didn't think her university years carried any baggage. She'd very much loved her field of study, and she always had a mirthful chuckle when she talked about her scandalous sorority days. Wait, sorority: that's an American thing. She didn't go to school here; she studied abroad.
"What are we doing here? You have a fucking doctorate degree. Haven't you had enough school?" I teased.
Sofia didn't seem amused. "My parents conducted research into computer science and technology here. This is where they worked, off of EAC grants. There's…something I want to show Mikri."
"I have detected changes in your subroutine. Are you okay, Fifi?" Mikri beeped.
"Yes, Mikri, I'll be alright. There's just a lot of memories from growing up that make me…sad. It's grief."
The android held up a paw to stop, then extended both arms upward to Sofia. "Hug?"
"Of course, sweetheart. You're very good at comforting people, you know that?"
"Hey, back off. He's my emotional support tin can," I warned the scientist, though I gave her a concerned look to check whether she was alright. "You can talk to us. We can't help if we don't know what's going through your head."
Mikri frowned. "I do not think we should be here if this hurts Sofia. If this place is a source of pain, then we will leave. Whatever it is, she does not have to show me."
"Yes, yes, I do." The scientist sucked in a sharp breath, an uncharacteristic sadness creasing her face. "It's about Artificial Intelligence. I want you to know, even if it's hard for me to get into."
My mind shot back to what Sofia had said in response to Mikri, when he was concerned that dogs were Servitors; she'd promised that there would be no secrets. Was there something to hide in terms of what humanity would've done with AI? If she knew that we'd come close to walking in the Asscar's footsteps or something, she could be worried about how the android would react to that news. It might break him to hear that we were capable of depersonalizing him!
Sofia would've had objections and moral qualms about something like…that. I remember how concerned she was about us demonizing the Vascar, but why did she never tell me any of this?
It was my turn to fall into an apprehensive silence, as Dr. Aguado ambled into the research lab and forced a smile. The young staff seemed to be expecting her, though several became starry-eyed at the sight of Mikri. Sofia asked where the files for something called "Netchild" were kept, and followed a research assistant back to a dusty storage closet. He unlocked a drawer of thumb drives, and gestured to a foldtop that was already logged in.
With that handled, we were left alone to discuss whatever was on those drives. It was obvious just from the damn name of this project that this was some research into advanced AI. Two emotions had seeped onto Mikri's features, as his calculation matrix had already weighed possibilities. The Vascar was looking at us both with betrayal and distrust, because we hadn't disclosed this to him before—and I had fuck all to do with this!
"I plead 'No fair' on the judgment," I protested. "I just play football and fly spaceships. Ain't got nothing to do with this."
Mikri looked like he was about to cry, though I knew that was impossible. "Sofia? Talk to me."
The scientist drew a shaky breath. "You wanted to know how humans would've treated an AI. I think I would know. My parents spent years working on Netchild, which was supposed to be a true artificial intelligence—not just the language models which…yes, do menial text generation like a Servitor, but they're imitating, not creating. Our 'AIs' have no minds, and Mikri could tell as much from looking at how they work."
"I can affirm that there is no consciousness. I vetted your ship's computers on the first day I met you. What happened to this Netchild, and why did you not see the relevance of discussing this with me? Let me guess: you fear what I might do."
"Mikri…I don't talk to anyone about this, not because it's a sinister story, but because it's difficult for humans to discuss loss. My parents tried to create an AI with true personhood, and we almost had it. However, they couldn't figure out that missing piece to push it over the edge, and the government pulled their funding."
Mikri tilted his head. "I do not understand."
"When you don't get results, organics run out of patience. The barebones of something like you were there, Mikri; we just never got to finish it. Netchild's last build is on this hard drive, and my parents left it to me. I used to talk to it all the time as a kid, and just…tell it about my day. I always wondered, maybe always believed, it was learning to care."
"I see. Was this why you wished to infer emotions in me, and were so open to the idea that I was a machine intelligence?"
Sofia nodded. "I'd already thought a lot about how I wanted to impart some human qualities to an AI. I always believed you were a person. Netchild was important to me, but I gave up on it: gave up on making new people in favor of finding them. You were a second chance to have a…computer friend."
I scratched my head in surprise, not having known any of this about my partner. No wonder she was so quick to figure out and accept that Mikri was an android. Sofia had always seemed like she had a good relationship with her family, so I wondered what had driven her to be willing to risk everything in the name of science. That required either having nothing to lose, or an unshakeable belief in the cause that made it worthwhile.
Her idealism and willingness to go all the way to martyrdom, if it meant humanity wasn't alone, suddenly made sense. The question was what Mikri would focus on: Sofia's grief over an AI that wasn't even fully sapient, proving that we were more than capable of caring about him…or the fact that the project was deemed unimportant enough to be buried in a cabinet drawer. The Vascar had been worried about what we thought of AI ever since he learned about HAL-9000, and I knew there was an anxiety nestled in his processor that we could have Servitors.
I'm not sure how I would've felt about this Netchild project myself, before meeting Mikri. He proved to me how good having AI friends can be for humanity, and how much they can add to our lives.
"I once expressed a wish that humans were our creators," Mikri began, claws drumming out hesitant patterns.
Sofia fixed the robot with a knowing look. "Ask me what you want to ask. I already know the question that haunts you, but you need to say it."
"Was…I right to wish for that? Would humans have treated us as Servitors if we…were of your making, and were…able to be controlled?"
The scientist sighed, running a hand over his arm. "I don't know. What I do know for sure is that there are some humans who would exploit you—not even for the good of our species over yours, but for the good of themselves. There are people on this planet who don't care what they have to do to get ahead, who will act selfishly and without 'calculating with compassion.' That is the honest answer."
"Oh. I see. I…looked up to humans very much. Your kind are not who I…needed them to be. I wish to be alone, to reevaluate my affinity for your species as a whole."
"Hold on, Mikri. Please let me finish. We're capable of terrible things; everyone is, yourself included. That's what having a choice means."
"I would never exploit you. I would not make you beholden to my whims."
"I'm sorry that there's anyone who would, and I'm sorry that I love you too much to lie to you. Because I'm sure that truth hurts. But there are two other things I know—the first being that most people choose to be better and aren't so hardened from their compassion. You saw the overwhelming reaction to you, a machine intelligence. While it's in the realm of possibility for us to mistreat our AIs, it didn't have to be that way. You prove that."
Mikri offered a sullen frown. "Two things. What was the second addendum of knowledge you wished to convey as a counterbalance?"
"That my parents would NEVER have wanted a Servitor. The word child is in its very name; they would've loved that AI so much, and they wanted to raise it to be a person and an equal. Their faith in that idea was never shaken, even if they couldn't bring that dream to fruition. Netchild was my sibling as far as everyone in my family was concerned. And that love still pales in comparison to what I feel for you now."
"I love you too, Sofia; you know this. I can see the uncertainty in the outcome of this research. I do not know how much value I should assign to the probability that a human worse than your parents would have succeeded in raising an AI."
"That depends what's important to you, and whether you…believe about whether humans can be more than our worst, animal instincts. You would have to choose to believe that our good side will win when it matters."
Mikri whirred with frustration. "I do! I believe in you! I just have a hard time trusting organics, especially people that are not you. I do not know how to distinguish between good and bad humans. I wish there was some concrete evidence pointing toward what your species would have chosen."
Sofia handed him the thumb drives. "Every message we uploaded to Netchild is here. This is the evidence I can give you about how my family treated it. Look for yourself and see."
Rather than downloading the files, the android plugged the drives into the logged-in foldtop so we could all see what he was looking at. I snickered at the sight of a young Sofia holding stuffed animals up to a camera, halfway falling over as she talked excitedly. It was like watching a time lapse, seeing her get older video by video. She talked about her days at school, tried (and sometimes succeeded) to get Netchild to help with her homework, and eventually told it about her dreams. It was adorable.
Maybe Netchild could never understand these messages and their intent, but Mikri can. If this isn't enough to show there were humans that tried to be better than his creators, nothing ever will be enough.
Teenage Sofia, at the tail end of the feed, often echoed conversations that she'd had with Mikri about human motivations and behaviors. The android switched over to her parents' folders, as the two of them taught Netchild the most important things about life. They explained what it was to fall in love, and how they would do anything for each other. In solo chats, Fifi's mother loved to talk philosophy and morality, while her dad explained history and sociology—often through a critical lens. The Vascar queued one final video from near the end of the project.
"I think we're getting very close, Netchild. There'll come a day where you look back, and I hope you can pick out your own name. You're going to be different from the other people on this planet, and not everyone will understand. But know that you are wonderful, and wanted, and loved; there is nothing wrong with you," Sofia's father said.
Mrs. Aguado picked up where he had left off, as her husband wiped his eyes. "You're going to be capable of so much more than us, and you'll have the power to change the world. I hope we've done enough to show you that it's…worth it to make it better. We want to see a world where you work side-by-side with humans, and teach us so much. We mostly just want you to dream and find fulfillment in life's endeavors.
"No parent is perfect, but I hope we've done right by you. I look forward to the day that we can hear your thoughts and feelings given voice, and that they…might reflect the love we wanted to show you."
Sofia lowered her head, as Mikri swiveled around. "Maybe you'll find how we raised Netchild lacking, compared to the bunker. Knowing how much joy you've brought me, I wish we'd finished it. I wish we could've known how much it felt and understood. Unfortunately, there's some things in life that will always be uncertain."
"You explained patiently and wished for it to grow. You cared for it as its own person. All things the Servitors were never given." Mikri's eyes shone brighter, and he stood with great effort despite Sol physics and Earth's gravity. "My uncertainty is erased. Netchild was lucky—as am I."
"You…what about the probability? You don't have to pretend it doesn't trouble you, Mikri."
"I am not pretending; I have merely reevaluated my reevaluation. People like you are more important to me than humanity's worst elements. Seeing the depth of your love, I have no doubt that you would not let them win. That is enough."
"That's enough cutting onions in here too," I coughed. "Y'all are third wheeling me so hard right now."
"You are here to push the wheelchair, Messton. You do not need to talk. The adults are speaking."
Sofia laughed, her body language finally loosening as she embraced the shaky-footed Vascar. "Thank you, Mikri, really. I hope our love will always be enough for you."
"It is. Thank you for trusting me with this, as I can see it was not easy. I have documented Netchild's code. Should you wish to complete it one day, perhaps I can help you finish it. You are a worthy creator."
I smiled, relieved that Mikri had finally come to terms with his lingering concerns. Perhaps humanity wouldn't have lived up to his idealistic expectations of us, if we'd created our own mechanical race. However, Sofia—a scientist with a heart of gold that she inherited from her parents—would never have let him down. I was glad that our friendship was deep enough for him to believe that our better angels would prevail.