This is an automated archive made by the Lemmit Bot.
The original was posted on /r/hfy by /u/SpacePaladin15 on 2025-07-05 13:39:06+00:00.
Android Ambassador
There were many excellent tourist destinations to choose from on Earth, once we landed in Toronto. The Eiffel Tower, the Taj Mahal, Times Square, or the Great Pyramid: all wonderful displays of human ingenuity throughout our past, despite the adversity of Sol physics. There, Mikri could have the joy of aimlessly wandering through a gift shop and asking strangers to take pictures of him standing in front of “special” buildings. Instead, Sofia insisted on going to podunk Spain to make our vacation mid as fuck. It wasn’t even to visit relatives. What was she thinking?!
The scientist offered no explanation for her suggestion, although I noticed that she was unusually quiet; after what Larimak did to me, I knew the look in a person’s eyes when they were seeing troubled memories. We strolled down a small university campus, past gawking college students, and headed toward the research laboratory. The name of the school didn’t ring a bell in my head, but maybe this was where Fifi had studied. She was taking Mikri to see her alma mater—I got it! Why the long face?
I didn’t think her university years carried any baggage. She was someone who dearly loved her field of study, and she always had a mirthful chuckle when she talked about her scandalous sorority days. Wait, sorority: that’s an American thing. She didn’t go to school here; she studied abroad.
“What are we doing here? You have a fucking doctorate. Haven’t you had enough school?” I teased.
Sofia didn’t seem amused. “My parents conducted research into computer science and technology here. This is where they worked, off of EAC grants. There’s…something I want to show Mikri.”
“I have detected changes in your subroutine. Are you okay, Fifi?” Mikri beeped.
“Yes, Mikri, I’ll be alright. There’s just a lot of memories from growing up that make me…sad. It’s grief.”
The android held up a paw to stop, then extended both arms upward to Sofia. “Hug?”
“Of course, sweetheart. You’re very good at comforting people, you know that?”
“Hey, back off. He’s my emotional support tin can,” I warned the scientist, though I gave her a concerned look to check whether she was alright. “You can talk to us. We can’t help if we don’t know what’s going through your head.”
Mikri frowned. “I do not think we should be here if this hurts Sofia. If this place is a source of pain, then we’ll leave. Whatever it is, she does not have to show me.”
“Yes, yes, I do.” The scientist sucked in a sharp breath, an uncharacteristic sadness creasing her face. “It’s about Artificial Intelligence. I want you to know, even if it’s hard for me to get into.”
My mind shot back to what Sofia had said in response to Mikri, when he was concerned that dogs were Servitors; she’d promised that there would be no secrets. Was there something to hide in terms of what humanity would’ve done with AI? If she knew that we’d come close to walking in the Asscar’s footsteps or something, she could be worried about how the android would react to that news. It might break him to hear that we were capable of depersonalizing him!
Sofia would’ve had objections and moral qualms about something like…that. I remember how concerned she was about us demonizing the Vascar, but why did she never tell me any of this?
It was my turn to fall into an apprehensive silence, as Dr. Aguado ambled into the research lab and forced a smile. The young staff seemed to be expecting her, though several became starry-eyed at the sight of Mikri. Sofia asked where the files for something called “Netchild” were kept, and followed a research assistant back to a dusty storage closet. He unlocked a drawer of thumb drives, and gestured to a foldtop that was already logged in.
With that handled, we were left alone to discuss whatever was on this storage device. It was obvious just from the damn name of this project that this was some research into advanced AI. Two emotions had seeped onto Mikri’s features, as his calculation matrix had already weighed possibilities. The Vascar was looking at us both with betrayal and distrust, because we hadn’t disclosed this to him before—and I had fuck all to do with this!
“I plead ‘No fair’ on the judgment,” I protested. “I just play football and fly spaceships. Ain’t got nothing to do with this.”
Mikri looked like he was about to cry, though I knew that was impossible. “Sofia? Talk to me.”
The scientist drew a shaky breath. “You wanted to know how humans would’ve treated an AI. I think I would know. My parents spent years working on Netchild, which was supposed to be a true artificial intelligence—not just the language models which…yes, do menial text generation like a Servitor, but they’re imitating, not creating. Our ‘AIs’ have no minds, and Mikri could tell as much from looking at how they work.”
“I can affirm that there is no consciousness. I vetted your ship’s computers on the first day I met you. What happened to this Netchild, and why did you not see the relevance of discussing this with me? Let me guess: you fear what I might do.”
“Mikri…I don’t talk to anyone about this, not because it’s a sinister story, but because it’s difficult for humans to discuss loss. My parents tried to create an AI with true personhood, and we almost had it. However, they couldn’t figure out that missing piece to push it over the edge, and the government pulled their funding.”
Mikri tilted his head. “I do not understand.”
“When you don’t get results, organics run out of patience. The barebones of something like you were there, Mikri; we just never got to finish it. Netchild’s last build is on this drive, and my parents left it to me. I used to talk to it all the time as a kid, and just…tell it about my day. I always wondered, maybe always believed, it was learning to care.”
“I see. Was this why you wished to infer emotions in me, and were so open to the idea that I was a machine intelligence?”
Sofia nodded. “I’d already thought a lot about how I wanted to impart some human qualities to an AI. I always believed you were a person. Netchild was important to me, but I gave up on it: gave up on making new people in favor of finding them. You were a second chance to have a…computer friend.”
I scratched my head in surprise, not having known any of this about my partner. No wonder she was so quick to figure out and accept that Mikri was an android. Sofia had always seemed like she had a good relationship with her family, so I wondered what had driven her to be willing to risk everything in the name of science. That required either having nothing to lose, or an unshakeable belief in the cause that made it worthwhile.
Her idealism and willingness to go all the way to martyrdom, if it meant humanity wasn’t alone, suddenly made sense. The question was what Mikri would focus on: Sofia’s grief over an AI that wasn’t even fully sapient, proving that we were more than capable of caring about him…or the fact that the project was deemed unimportant enough to be buried in a cabinet drawer. The Vascar had been worried about what we thought of AI ever since he learned about HAL-9000, and I knew there was an anxiety nestled in his processor that we could have Servitors.
I’m not sure how I would’ve felt about this Netchild project myself, before meeting Mikri. He proved to me how good having AI friends can be for humanity, and how much they can add to our lives.
“I once expressed a wish that humans were our creators,” Mikri began, claws drumming out hesitant patterns.
Sofia fixed the robot with a knowing look. “Ask me what you want to ask. I already know the question that haunts you, but you need to say it.”
“Was…I right to wish for that? Would humans have treated us as Servitors if we…were of your making, and were…able to be controlled?”
The scientist sighed, running a hand over his arm. “I don’t know. What I do know for sure is that there are some humans who would exploit you—not even for the good of our species over yours, but for the good of themselves. There are people on this planet who don’t care what they have to do to get ahead, who will act selfishly and without ‘calculating with compassion.’ That is the honest answer.”
“Oh. I see. I…looked up to humans very much. Your kind are not who I…needed them to be. I wish to be alone, to reevaluate my affinity for your species as a whole.”
“Hold on, Mikri. Please let me finish. We’re capable of terrible things; everyone is, yourself included. That’s what having a choice means.”
“I would never exploit you. I would not make you beholden to my whims.”
“I’m sorry that there’s anyone who would, and I’m sorry that I love you too much to lie to you. Because I’m sure that truth hurts. But there are two other things I know—the first being that most people choose to be better and haven’t hardened themselves against compassion. You saw the overwhelming reaction to you, a machine intelligence. While it’s in the realm of possibility for us to mistreat our AIs, it didn’t have to be that way. You prove that.”…
Content cut off. Read original on https://old.reddit.com/r/HFY/comments/1lsa478/prisoners_of_sol_52/