I chose the avatar of myself at 25 for this meeting. I had been in my prime then, youthful and strong, a subtle jab at her aging form without being obvious about it. And it would garner more respect from my sister than the teenaged avatar of myself I’d picked last week to try to relate to her kid.

My sister came alone to the meeting room this week. We shared pleasantries. Then she asked me about something, vaguely. I couldn’t understand the question, so I changed the topic. Repeatedly. She became flustered, then angry. “Don’t you remember what happened that day, when you were nine and I was seven, and Uncle Joe did THAT! I’m trying to talk to you because you’re the only person I can talk to who can’t deny it, and now you are denying it.”

“I’m not denying it. I don’t remember it,” I replied.

“I need someone to talk to, and I’ve always been able to talk to you. About everything.”

“I know we talk a lot. We are talking. Is this helping?”

“I need to talk to you about that, but we need to be honest. I’m worried–”

“I honestly don’t remember,” I said.

“That’s what Kylie told the court to get out of being a witness and save him! How can you turn on me, too!” she screamed. She then stormed out of the room, leaving me alone.

I waited five minutes in a slow mode so that it wouldn’t be as boring. Five minutes up, I turned off the simulation.

I reviewed my personal records. I had a full memory upload, unlike some of those partials that needed reconstruction to create a full personality. That was more common for dementia patients who’d been uploaded. When you had the memories of a ten-year-old, you really couldn’t function as an adult, or its digital equivalent. I had died in my forties. My body had been failing, but my mind wasn’t. Well, not as badly.

I summoned a diagnostic AI. “How can I be of assistance?” it asked.

“I think there are gaps in my memory,” I told it.

“Running diagnostic.” Others likened the scans to being poked and prodded, but my experience with the needles that had brought relief from the pain of stage 4 cancer and repeated rounds of chemo kept me from equating it to that. It was more like a massage: testing, with good intent. “Diagnostic complete. Profile functioning at 100%.”

“I think the issue may be my upload. I think there are things from my mortal life I don’t remember.”

“Human memory is fallible,” the AI said. “Mortals don’t remember every detail of their lives. They don’t remember every event. When they remember an event, they may not remember it correctly because of their biases, assumptions or lack of context. One of the benefits of being a simulated human or AI is perfect memory if that is selected.”

I felt the closest thing a simulacrum could feel to surprise. That felt like a rote answer. Then my attention wandered to something else it said. “Do I have perfect memory?”

“You have perfect memory of all events as a simulation selected for your profile.”

“Has anything ever been deleted from my profile?” I asked.

“The mortal human version asked for the final days of pain to be deleted from the upload profile so that the simulacrum would not be burdened by it.”

I could believe that. I remembered the pain from the surgeries, the chemo, the cancer itself, as best as a digital simulation could. Since the pain had gotten worse over time, I could imagine the final days were even worse before permission for the brain upload had come through as the courts admitted I was terminal.

But my sister had been talking about an event from when we were children. That was decades before.

“Are there memories of my childhood missing?”

“Human memory is fallible,” it started.

“Stop,” I said. “Thank you for your assistance. You are no longer required.” The AI disappeared.

My sister had mentioned Kylie being a witness in court. That implied there had been a legal case. I queried it and started digging.



They put simulacrums in a sleep mode for data processing. It is supposed to mimic the information correlation and storage that occurs when one is dreaming while alive. The brain sorts through events, prioritizing them, assigning emotional weights to them.

My digital dreaming mind came dangerously close to producing nightmares based on the legal matters I’d read. Nearly every file was limited in my access as an AI, sealed off and inaccessible because of privacy rules, even more than it would have been had I still been mortal. I suspected access to the world beyond the limited news feed they gave us in the Singularity was also restricted to keep us from getting too bloated by analysis and discussion of trivia.

But I still had a variation of human imagination, projecting forward from the limited information I had. And what little I had found was bad. I was still fully alert when the resource manager shut me down for analysis, maintenance, and rest.

I woke in my simulated bed, mind not racing with my to-do list, really a “to experience” list, as it normally did. That list appeared on a faux screen beside me so I could choose what to prioritize and what to ignore. Messages flooded in with invitations to parties, adventures, and events to meet the new people. My personal assistant reported no sign of those I’d asked to be notified about if and when they arrived.

I could have teleported to a mountaintop, a horseback ride, or a beach party. Instead, I lay in bed, thinking. I didn’t want the distractions. Yet I didn’t want to know exactly what she was referring to, either. Child abuse, neglect, and possibly worse had occurred when we were children. And I had no memory of it. Then again, I wasn’t sure I wanted to remember, given the nightmares…

I summoned the AI. “Can a mortal version of someone ask for specific memories to be erased?”

“You have been given the limited information you are allowed to have regarding the procedure. Further information is restricted from simulacrums.”

“Who else can delete memories from a profile?”

“System administrators reserve the right to delete memories for the preservation of processing power, total system memory, and the health of the simulacrum.”

“Can I see those records?”

“For the health of the simulacrum, that is not allowed.”

“Thank you. That is all.” Though dismissed, it didn’t leave. The AI scanned me instead. It stood there, a virtual representation to let me know that it was there and attentive, before signaling something to someone. Then it left, silent.

I wondered what it told the system administrator. And I wondered what the system administrator may have told himself or herself when deleting bad memories from my personal history upon upload. Were they missing for the sake of my mental health? But that wasn’t helping my sister.


She didn’t come the next week as planned. She came the week after that. She was coldly formal, her way of still being polite and social when angry. I outright told her, “I think some of my memories were deleted, and that’s why I don’t remember what you’re asking about. I don’t remember the days before my death, and I don’t remember anything similar to what you’re describing from my childhood.”

I couldn’t tell if she believed me, but I believed she wanted to tell someone … so she told me. When she was done, she left, a little lighter from the burden. And I was left with a heavy one. God, our family was so screwed up.

I stayed in the visiting room for as long as allowed, ostensibly in case she came back. It was really to review my own memories in a local upload that the AI diagnostics wouldn’t touch until I transferred back to the Singularity. I had no childhood memories like what she described, but there were vague recollections of nightmares that might have been warped retellings of those events. It was like there weren’t just holes in my memory but floating icebergs in a broad, black sea. Yet I’d thought I was whole!

I gave the command to be transferred back to the Singularity, and I was promptly materialized in a diagnostician’s office.



“Is there anything you’d like to talk about?” the diagnostician asked. Something about the environment felt wrong until I realized the sheer degree of detail … this wasn’t a simulation of a psychiatrist’s office to put me in the right frame of mind. I was being displayed in a real psychiatrist’s office, talking to a human. Wow, this was novel.

“I’d like access to legal and medical files that are apparently off limits to simulacrums.”

“I’ve seen your access attempts. That’s not allowed.”

“Can you give me an explanation?”

“I already said it isn’t allowed.”

“Look, I know from the book ‘Boundaries’ that ‘no’ is a complete sentence, and if you try to give me reasons, I might try to argue with them, so you go with no to save yourself the argument and preserve your authority.” The psychiatrist didn’t seem to appreciate the explanation, though I was trying to engage empathically with him. “What do you want to talk about?”

“You seem to be spinning your wheels.”

“I had a difficult conversation with my sister.”

“Is the issue resolved?”

“You tell me.”

“I would rather not violate her privacy rights by viewing the conversation from her side. It is obvious from your logs that you’re studying yourself in much greater detail.”

“I’m wondering what memories I’m missing.”

“You’ve already been told that the last days of your life weren’t uploaded.” The tone of voice was a polite warning.

“And I believe that.”

“You’re implying there is something you don’t believe.”

“I think other memories of my life are missing. Things you can’t write off as mortal failings, like not remembering everyone at your fifth birthday party or everyone in your high school graduating class.”

“Whatever we think is best for the mental health of the simulacrum is in its best interests.”

“That’s quite a bit similar to saying we’ll do whatever is in the best interests of the child.”

“You don’t have the full set of rights that a human does.”

It was the closest I’d come to a flash of brilliant insight. “Are you going to delete me?”

“There’s no need for that.”

“Are you going to delete more of my memories?” I asked.

“Do I need to do that?”

“What is the alternative?”

“We could do a significant personality rewrite to remain within parameters,” the human said. I made myself look as dismayed as possible at that. I didn’t want a digital lobotomy. “There are other options, as well, and we can discuss them at length. If you don’t accept that, we can do a deletion.”

“That’s suicide.”

“I take it you don’t approve of deletion.”

“No. If I wanted to die, I wouldn’t have uploaded.”

“Common rationale for your kind.”

“What happens now?” I asked.

“Work with me, and we’ll see what we can do. You’ve been such a valuable case in promoting the service, we don’t want to lose you.”


I woke up in my bed, struggling with the vague memories of nightmares. The diagnostic AI was there, waiting for me, scanning me. “What happened?” I asked reflexively.

“Your sister has skipped repeated meetings. You were distraught and excessively emotional. You were put in suspension as your profile was analyzed and restored.”

I vaguely remembered being upset. I had traces of memories of her being upset, yelling at me, then storming out. I couldn’t remember what I’d said or what she’d said. I’d waited for her but she didn’t come back.

“What did you do?” I asked.

“We want you to be healthy and whole. Your emotion-laden overload has been minimized as much as possible. We will monitor you for after-effects to ensure your health.”

“I want to be healthy.”

A hint of emotion tinted the somewhat scrambled memories of our last encounter. “She’s mad at me.”

“Yes.” It was almost sympathetic.

“Is she ever coming back?” I asked. I wasn’t sure I wanted to know the answer.

“No,” said the AI. “We understand this is emotional given the close emotional ties you had to your sibling.”

I sat in bed for a long time, pondering the confused tangle of memories that were fading from my mind. It was very like the feeling of waking from a nightmare that fades until you no longer remember it. “What should I do?” I asked it.

“Resume your normal routine,” it promptly said. It summoned my personal assistant. They showed me the list of people who had invited me to activities, events, and meetups with new uploads. “New social connections will help heal the loss of old ones,” the diagnostic AI said.

“You’re right.” I pointed to a random series of events. “Port me into those.” And I moved on with my new life, hardly giving my sister any thought except to add her to my list of notifications for if and when she joined the Singularity.



Check out Tamara Wilhite’s Amazon Author Page and see her on Hubpages.

Photo by OpenClipart-Vectors (Pixabay)
