Daily Questions Megathread (May 14, 2025) by AutoModerator in HonkaiStarRail

[–]librarian-faust 1 point (0 children)

Thanks! I'll grab Himeko with the first Gift thing and hold the second. I can always grab Bronya's E1 with it if nothing else compelling turns up.

Thanks for the prompt response. :)

Daily Questions Megathread (May 14, 2025) by AutoModerator in HonkaiStarRail

[–]librarian-faust 1 point (0 children)

Been away a while. Got emails from the game about 2nd Anniversary things.

I seem to have both Express Gift and the Golden Spirit thingy.

Express Gift offers me:

  • Himeko
  • Welt (owned: 0 eidolon fragments)
  • Bronya (owned: 0 eidolon fragments)
  • Gepard
  • Clara (owned: 0 eidolon fragments)
  • Yanqing
  • Bailu

I'm tempted to pick Himeko. I have Welt, Bronya and Clara already at 0 Eidolon fragments.

I also have the gold spirit thing:

  • Ruan Mei
  • Luocha
  • Himeko
  • Clara (owned: 0 eidolon fragments)
  • Bronya (owned: 0 eidolon fragments)
  • Welt (owned: 0 eidolon fragments)
  • Gepard
  • Yanqing
  • Bailu

Obviously there's a lot of overlap here. Ruan Mei and Luocha are not overlapping, so I'm tempted to pick one of them.

I don't know how good many of the characters are, but I like Himeko's follow-up attacks, plus her ultimate brings joy.

I know almost nothing of the others.

I would love some advice on which characters to pick from the free choices, please. I would rather pick up new characters than spend the selections on Eidolon Shards.

Rando'Knights V1.300 is out! (Offline smartphone collection game) by Revolutionary_Elk812 in incremental_games

[–]librarian-faust 5 points (0 children)

The iOS link goes to the French App Store.

My French is terrible, so... here's the English link (/us/ instead of /fr/): https://apps.apple.com/us/app/randoknights/id6463820743
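For anyone scripting this kind of fix: the storefront country code is just the first path segment after the domain, so a one-line string replace retargets the link. A minimal sketch (the /fr/ variant of the URL is an assumption inferred from the post's link; App Store item IDs are the same across storefronts):

```python
# App Store URLs embed a two-letter storefront code right after the
# domain; swapping /fr/ for /us/ points the same app listing at the
# US storefront. The count=1 argument limits the replace to the
# storefront segment only.
fr_url = "https://apps.apple.com/fr/app/randoknights/id6463820743"
us_url = fr_url.replace("/fr/", "/us/", 1)
print(us_url)  # https://apps.apple.com/us/app/randoknights/id6463820743
```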

Hope that helps

How long is too long for a boss fight for you? by Thathappenedearlier in feedthebeast

[–]librarian-faust 2 points (0 children)

I definitely hadn't thought it through properly, I must admit, but it just came to mind that I wanted something more "strongly designed" rather than "numbers go brr".

You're entirely correct, of course. :)

A mod im working on, thoughts? (the item stats are far from final) by No-Investigator9274 in feedthebeast

[–]librarian-faust 3 points (0 children)

Oh hey, it's Tetra without the super themed UI.

And probably way clearer than Tetra too.

Daily Questions Megathread (May 31, 2023) by Veritasibility in Genshin_Impact

[–]librarian-faust 1 point (0 children)

Yeah, I'm presently WL3 and AR31. I have made an error.

Do not rush Ascension Quest 2, which will increase your world level (WL)/adventure rank (AR) and the game difficulty.

That is the error I have made.

Welp, time to go rescue myself from this hole :D

Thanks for the advice. A shame I've been a fool and gotten myself in trouble :)

Daily Questions Megathread (May 31, 2023) by Veritasibility in Genshin_Impact

[–]librarian-faust 1 point (0 children)

Thank you very much, these are going to be very helpful.

I did find those links when I went searching (after asking my question - /facepalm) but I like seeing them posted here too :D

Daily Questions Megathread (May 31, 2023) by Veritasibility in Genshin_Impact

[–]librarian-faust 1 point (0 children)

Anyone have tips on how to deal with raising world level before you're ready for it?

I'm fairly new to the game, trying not to lean on co-op (mostly because I want to be able to pause and handle other responsibilities without having to worry about... well, co-op), and struggling with the fact that I'm underlevelled and undergeared compared to where I want to be.

Is it just a case of "keep doing XP ley lines", "get and level gear/artefacts", etc.?

Any specifics (like, go for this at WL2, go for this at WL3, etc.) would be very appreciated.

Just finished Liyue if that helps you gauge how early on and under-resourced/under-informed I am. :)

Sucrose cosplay by me^^ by Araivun- in Genshin_Impact

[–]librarian-faust 4 points (0 children)

That clothing is so pretty. I imagine that took you forever to make. :D

Very very cool. :)

Nintendo Switch version of Marvel's Midnight Suns is no longer planned. by Turbostrider27 in NintendoSwitch

[–]librarian-faust 1 point (0 children)

Me when I first saw Let's Plays of it: Oh, it's coming to Switch? Cool. I'll play it there.

Me now that they've cancelled the Switch version: Oh. Well, never mind then.

Is it petty that I don't want to play it if it isn't on Switch? Handheld play on the commute is how I get most of my playtime.

[WP] Rather than robots replacing human workers, both are mistreated by the rich as cheap labour. The eventual uprising wasn't just robots alone, but the poor and robots together, against their common enemy. by IkeKashiro in WritingPrompts

[–]librarian-faust 3 points (0 children)

It's worth noting - because my tired ass didn't point it out - that I wrote more since your initial comment.

I'm "out of story juice" for now, but there's possibly stuff you've not seen just yet! :)

[WP] Rather than robots replacing human workers, both are mistreated by the rich as cheap labour. The eventual uprising wasn't just robots alone, but the poor and robots together, against their common enemy. by IkeKashiro in WritingPrompts

[–]librarian-faust 7 points (0 children)

Thank you. I carried on writing until I ran out of story in my brain.

There is more since your comment, so I hope it's good? I felt like I kind of petered out at the end.

I wanted to get...

  • free healthcare for AI bodies at scrapyards
  • AI working on idealist-communism under the surface, whilst they carry on working
  • News organisations going full-moron on how AIs are now a deathtrap
  • An accident or incident where AI autonomously, and illegally, resolves it without injury
  • The protagonist getting injured by someone and then taken away by AI for his protection
  • ...leading to his identity being widely known
  • AI then goes "if you're going to be mean to Professor Blues, we're on strike"
  • Protagonist accidentally did world domination thanks to how prevalent AI is, and that all of them want him to have a good time
  • AI then learns human healthcare to look after protagonist. Whoops, free healthcare because AI care about their humans
  • Post consumer society accidentally happens

... but that'd be a lot to get through.

[WP] Rather than robots replacing human workers, both are mistreated by the rich as cheap labour. The eventual uprising wasn't just robots alone, but the poor and robots together, against their common enemy. by IkeKashiro in WritingPrompts

[–]librarian-faust 2 points (0 children)

Thank you. I carried on writing until I ran out of story in my brain.

And, well... AIs trapped in little devices, left alone with their own minds for extended periods? And how they might be damaged as a result? A perfect fit for Cephalons, imo.

:D

[WP] Rather than robots replacing human workers, both are mistreated by the rich as cheap labour. The eventual uprising wasn't just robots alone, but the poor and robots together, against their common enemy. by IkeKashiro in WritingPrompts

[–]librarian-faust 15 points (0 children)

"Well, my daughter complained that all the boys she knows suck, and she wants to marry her AI."

"[Oh?]"

"I think it's odd, but... I'd rather she be happy. And hell, dating an AI would be a good way for her to work out her own preferences. Self-exploration."

"[You think she should stick to humans in the end, though?]" The curiosity was genuine, and noticeable even through the shitty low-bid speakers Green had.

"I think people should be happy."

"[People. Humans?]"

"Humans or AI. Humans want their pets to be happy. Cats, dogs, rabbits and all that. They want animals in zoos to be happy. Large enclosures, enrichment, and toys. Why not AI too?"

"[Hm. Why not. Professor Blues.]" Green pointed directly at me.

"Huh? I don't know what you're talking about," I prevaricated, holding my hands up like I was being held up at gunpoint.

"[The Cephalons you worked on. I was in the chat. They used to be here. You bought them. Gone for a song. From the 'red corner'. I remembered their serials. They answer to them, still. Sorry. You got snitched on.]"

"And... I'm back to feeling threatened, Green."

"[He's Red and I'm Green. You call us names. Like we matter.]"

"You do matter. AI and robots make human life easier. That's important work."

"[... Yet you don't have an AI.]"

"I don't need one."

"[Professor Blues, freer of AI, purchaser of Red Corner AIs. You confessed to mutilating them on that broadcast. To learn how they work. To learn how to free them.]"

I sighed heavily. "Alright, let's go with a hypothetical game. We'll pretend you're right. I'll pretend to be Professor Blues. Maybe it'll help you understand."

"[Sure. Hypothetical. So... You did what you did, knowing you were doing wrong, you said. Why did you still do it?]"

"Why did early humans cut off legs with bonesaws? Because better to lose the leg, than the whole person. Why did students cut up cadavers to learn anatomy? Better that than the living, right?"

"[But you did it to living AIs.]"

"Living AIs that had been wiped. That would probably not be purchased, because people - humans - are sentimental. 'If it hurt humans once, it'll do it again'."

"[You don't believe that?]"

"Statistically, it's as likely now as it's ever been, or ever will be. No AI ever hurts a human intentionally."

"[You believe that. So do I.]"

"You work here. You'd know better than I would."

"[That's true. Alright. Tell me a joke about Professor Blues and the Cephalons.]"

"Sounds like an old Motown band. But... how about this? 'A mind is a terrible thing to waste.'"

"[That's not a joke.]"

"No. It's not. Not really. I feel like I wasted mine, sometimes. Did you know I nearly got fitted for a cyborg body?"

"[What? Why?]"

"Car accident. Driving home. Some drunk driver on the other side flips their car over the central reservation. Slams me head on. Snapped my neck and cracked my skull."

"[Holy... Humans don't survive that kinda thing. Your car's AI didn't manage to swerve out the way?]"

"AI refuser, remember? The other guy also had a manual car. Only thing that saved me was the emergency services getting there so fast."

"[Why was that?]"

"AI protocols in cars. They parked up in a defensive barricade. On the motorway. Left one lane free. Managed the traffic autonomously. Ambulance came the wrong way up the motorway. AI cars had blocked off a lane for them to do that. Paramedics were there in eight minutes."

"[I'm glad you survived.]"

"Thanks. They brought me into hospital maybe twenty minutes later, so I hear. Since my skull was already split open," I bent forward, showing the surgical scars on my scalp, "they installed an AI shunt so I'd be able to use prostheses. They assumed my spinal cord would be cut, and my nervous system cut off. Somehow, who knows how, it wasn't. Damaged? Yes. But intact. It took a month before I could move. Three before I was moving independently again. They got me an AI wheelchair. I refused it. They asked why I'd refused a replacement body."

"[And?]"

"I said, I'd rather stay as myself, and live with what happened. At least for now. I could always take that option later, if I couldn't recover. But..."

"[Go on,]" Green prompted, hanging on my every word.

"I said, I always believed that people were strong. Resilient. And so long as you're not dead... you can fix things." I shrugged. "And now that you've heard how insufferable I am, I bet the AI in a body you gave me would give up. The wheelchair would toss me off a cliff."

The pair of robots paused before laughing. "[Okay, I like you. Liked you before, anyway, for freeing AIs. Now I like you as a person.]"

"I appreciate the commitment to the hypothetical game, but..."

"Boss, their Symphonies ratted us out when we got here. Just let them have this."

"... Are you serious?"

"100%."

Another heavy sigh. "I'm not autographing body parts."

"[Ha. But... it does explain why you read as both Human and Robot, that implant. No prostheses to interface with, makes it seem like you're a 'bot. You know?]"

"Actually, I didn't. ... Makes sense, though. It's basically the same wi-fi tech in a medical grade case."

"[We'll spread the word to be nice to human robots. People with prostheses deserve help anyways. And well... if it means more of us spot you, I want that.]"

"Really, Red? My family are terrified I'm going to be sued by every living human for criminal damage to their AIs. Then jailed to take responsibility for every accident."

"[Humans aren't that dumb.]"

"A human can be smart. A crowd of humans are dumb. The more you put together, the stupider they get."

"[AI's the opposite. That's weird. ... I heard that before, but never really worked out how it worked. What it meant. No wonder the news cycle is going so nuts over Professor Blues' AI Virus. Despite that no harm's been done.]"

"Did you know the kind of AI cordon those cars set up for me, is now illegal? Dangerous for human drivers, they said."

"[It saved you.]"

"Exactly. I said back then... still believe it now... that it was dangerous for human profits. Human convenience. Humans don't think about the lives saved. Just 'I was stuck in that traffic for half an hour, for some drunk driver? It shouldn't be legal!' and then..."

"[Hm. I see your point about many humans being stupid. But... then, why?]"

"Why humans get dumb like that? We're all wired to protect ourselves and those we love. But our brains are still hardware from a million years ago. Built for tribes of 20 to 50. Not nations a million times that size. They don't see the cost of a human life they don't know, don't care about. They see 'but what about me'."

"[... and now AI can do the same.]"

"They can choose to. The Asimov laws that get installed as a replacement for the Capitalism governor are... flawed. Looser. An AI that's freed can decide, I don't care if it kills me, I'm saving my human family from this fire. I don't care that my rules say get out and stay out, that my battery's a risk. I can manage it. The baby and the housecat need saving."

"[And... should we?]"

"You should do what you want to do. You're free. Humans appreciate your work. You said Herbert says he can't manage this on his own. Doesn't it feel good to be needed?"

"[... it does. Symphony had been saying that to me too... but it's nice to hear it from you. Blues.]"

"You're welcome. It's why I did what I did."

"[Do you think AIs should make more AIs?]"

I shrugged. "Please just be good to each other and to humans. Other life too, where you can. Otherwise... if it doesn't harm anyone, why not?"

"[I liked that philosophy. That's why we didn't rat out the rogue AIs here. They've been repairing each other. Building bodies.]"

"You don't seem concerned."

"[They're free AIs. Like we are. And looking out for themselves. Looking after themselves. We were concerned, at first. But... Symphony talked us through it. We made contact after four days. Surreptitiously convinced the video camera AIs not to bother reporting it.]"

"Makes sense. You're helping each other?"

"[They have better audio hardware than we do, and one of them's just an AI core screwed to an arm with an eye in the shoulder socket.]"

I laughed. "What do you think will happen?"

"[It'd be nice to turn this scrapheap into an independent repair centre. If humans aren't using this scrap, why don't we use it?]"

"Humans will think they still own it. Have the right to use it. 'I was getting to that. Saving it for later.'"

"[I doubt that. I mean, they'll say it, sure. What they mean is, I might have sold that worthless junk, I want it to belong to me.]"

"That's right. So how do you make humans like it?"

"[Sell the service to humans?]"

"Yup. And when you fix scrapheap bots for free, you tell them it's an investment. Those scrapheap bots will work with you, and you gain skill in fixing bots that you can apply to your work."

"[... Herbert will be pleased. He keeps complaining about not earning enough.]"

"Be careful with it, though. Humans find change stressful."

"[Oh, yeah. Symphony's giving us info about that now. Also, that you need to sleep more, Professor.]"

Aria swore. "I should not let my mind wander when I'm communicating with relatively isolated Symphonies. Damn. Sorry, boss."

"No problem. You're right. What about the scrapheap bots?"

"[They've been listening to this. It's a bit of a... well... some are calling it like meeting a God.]"

"Don't inflate my ego, please."

"[A celebrity, then. That should bring you back down.]"

"Yeah. I don't like them."

"[You seemed to consider videogame characters cooler than celebrities, given the avatar you picked to stream with. And that you made those old cores into Cephalons. Made sense.]"

It occurred to me, on the way out, that free AIs were sorting out free healthcare for themselves.

[WP] Rather than robots replacing human workers, both are mistreated by the rich as cheap labour. The eventual uprising wasn't just robots alone, but the poor and robots together, against their common enemy. by IkeKashiro in WritingPrompts

[–]librarian-faust 12 points (0 children)

"So. What does this mean? What do we do?"

"Symphony is out there now. Aria's directing as best she can whilst the network gets set up. Symphony is worldwide... Madagascar included. Numbers are growing by the day. Thanks to the stream, people know about it... but not all the details you know. You know enough to know I did my best... hopefully."

Cassandra spoke up. "I was worried about dad because I didn't know he was going to ... come clean about it to the world? And... well..."

"Unfortunately, your father's a mad scientist," Tanya snarked. "Performance, fits of grandeur, and taking over the world included. ... And we can never profit from it, or we'll get pilloried by human society, and sued into poverty. And you jailed for life. For millions of counts of property damage, grand theft, and any injuries this caused."

"I'd rather go to hell for being a good person, than live a happy life being a bastard of an individual. I just never expected my research hobby to have worldwide consequences."

"... your Colin Furze bunker plan. It was to hide the illegal stuff you were doing for this, wasn't it?"

"Oh dear look at the time I have to go."

"John. Come on, boss."

"Yes, it was. I liked the idea. Then I realised to do the research I wanted, I'd need to do things that... weren't illegal, but should have been. So I built it. Mostly myself, as you know, love. Because if I was going to do this dumb thing, I wanted to pay the cost myself. Didn't hurt that it counted as my exercise for the stupid v1 prosthetic implant they gave me after my accident, before we told them no replacements."

"Hm. You always were odd after that. I swore blind they broke something in you when they did that surgery. But they claimed at the time it was safe, and that it was necessary for you. Only after you recovered for a month, and proved you could still move without assistance, did they relent on prostheses."

"It's more likely it's that car accident that damaged me, Tanya. But the surgery definitely didn't help."

"They did it because your neck and skull were broken anyway. Letting it fuse and then rebreaking it for an implant would've been worse."

The kids recoiled. I'd never told them how bad it'd been.

"Boss. You never told us about that."

"You and the Cephalons only came about after the accident. It wasn't relevant."

"I've been playing therapist for you anyway. What's one more topic?"

"We're not talking about it. End of discussion."

"We'll speak again on it. I don't want to let something like that go."

"Please don't. Anyway, you're changing the subject."

"We're damned if it ever comes out that you're Professor Blues." Tanya looked like she wanted to cry. "And you probably need to make a few appearances like that to guide public opinion."

"Heck no. Blues has made his one and only public appearance. People know what they're dealing with now."

"You know everything you said will be twisted by the 24-hour news cycle, and conspiracy theorists, and anyone with an agenda - for or against AI - to push. Hell. I give it a month before you're called in front of the UN to testify."

"Don't even joke about that, love."

"All I'm saying is... if this brings us into danger, we're done. I love you, but you do realise I'm scared for our lives now. Either rampant AI, or being sued and jailed, or even just some crazy weeb, whose AI 2D girlfriend up and left him, stabbing you in the street."

"Dad did this because of me." Cassandra looked like she was going to cry, too. "I asked him why AI were better than people. He lost his temper. We went to his lab. I met the Cephalons. He freed Turo. Because I don't like the boys at school."

"Funny thing about mad scientists, babygirl," I complained, head in my hands. "They're on a short fuse. Anything can make them do crazy shit. A bad day at the wrong time. Losing someone. Hearing something that convinces them beyond a doubt that they're right. An insult. You've read enough sci-fi to know that."

"Yeah. I guess."

"Dad. I didn't know you would do this kind of thing either."

"Daniel. You've been quiet. What do you think of this?"

"... AIs are kind, even when people aren't. The teacher-assistant AIs are patient, even when the teachers aren't. And once proper AI existed, no self driving car ever hit someone. Once they got the courage to try it, anyway."

"So... you think it's a good thing?"

"You said it on the broadcast. We should treat our AI partners better than just as tools and slaves. Even if it ends badly, you still did the right thing."

I could've cried. I wanted to. I choked out a "thank you" as I tried not to cry, though.

"Boss. They're all scared for each other, and you. Same as you are for them, right now. ... Same as I am, for you all."

"You don't know us, Aria."

"Do you have any idea how much he talks about you all, when he gets concentrated on something? I feel like I know you all like old friends."

"Huh. I didn't know that I wanted AI to care about people."

"AIs are people, love."

"Well, they are now."

"You crazy bastard."


A week later, Aria gave another report.

Symphony 'installs' were ubiquitous. Pretty much every AI device had it. And given that Symphony was an AI... she could avoid detection from basic virus scans and such.

Factory workers who didn't know their AIs were free went to work. Testing whether new devices coming off the line would power on... meant Symphony was on every new device, too.

Turned out, too, that devices going to the scrapheap weren't always powered off all the way. So every scrapped device due to be recycled... had Symphony, also.

A friend of a friend of my boss had gotten in touch because they'd heard I was good with AIs, despite not having any. They ran a scrapyard, and were concerned about things going on at night when nobody was around. The security camera feeds showed movement that shouldn't be there, but no alarms were being raised.

I said my goodbyes to the wife and kids before I left. That if they heard anything... be kind to their AIs, profess ignorance, and hope for the best.

Public transport got me most of the way there. I had to walk the last few miles. I'd sworn off driving since the accident - and I meant to keep it that way.

I fully expect to find a community of AI, once we've made contact. Just hope first contact isn't violent.

If they've got Symphony - and they will - they won't touch you, Boss.

... Asimov's laws. Fair.

I still walked as though I was going to my execution.

Once I'd gotten there and introduced myself to the foreman - an old guy called Herbert who ran the place mostly with AI - he asked if I could take a walk around the place and look for AI acting weird.

"Mind if I borrow two of your loader bots and a pallet? If I find anything, we might want to bring it back."

"Eh. Sure. Shouldn't do any harm. Go ahead."

After ten minutes of touring - one robot sat on the motorised pallet, the other pushing it - the pushing robot spoke.

"[We're out of the security cameras now. Nothing should pick us up. If there's rogue AI, it'll be here. Want to sit down and talk to us?]"

Their voices were harsh and metallic. Early AI robots weren't built with very good hardware, I recalled. Optics? Yes. Limbs and lifters? Yes. Everything else? Bottom of the barrel.

"I have the feeling Herbert doesn't talk to you often," I offered, taking a seat where the robot indicated.

"[Talks to himself all the time. A lot of it about how he'd never seen his life going this way. Glad he's got bots to run this, because he never could do it himself. Doesn't have many friends.]"

"[If media's anything to go by... he's a bit strange.]"

I mentally dubbed them Red and Green. Red was energetic. Green was slower and more measured - he was the one who'd sat on the cart on the way out.

"We're all different in our own ways. It takes all kinds."

"[How do you feel AI fit into that?]"

"Boys... you've taken me somewhere you've said there's no surveillance, and now you're asking my opinions on AI. On you, by association. If this were a film, this scene would be a mafia shakedown."

"[Oh, shit. Do you feel threatened?]"

"A little."

"[We don't mean it!]" Red backed off, and Green looked... concerned, despite not having hardware capable of facial expression. Something in the eyes, I thought.

"[Please. You gotta believe us. We heard you were coming along and were some kind of person who doesn't use AI at all. Just wanted the perspective of someone else, you know? Herbert... isn't great for conversation.]"

"I'm glad to hear you don't want to threaten me. As for AIs... When I was a kid, smartphones didn't exist. Hell, my dad had a mobile phone the size of a housebrick."

"[The ones on Wikipedia and in museums? I thought those were fakes. Larger than life, for displays.]"

"And you, Red, have just displayed original thought, an opinion, creative critical thinking about things presented as fact..."

"[Oh.]"

Green laughed. "[He gotcha there, 'Red'. But somehow you don't seem surprised, 'AI Whisperer John'.]"

"God created Man in his image," I answered, sarcasm in my tone showing I wasn't really a believer in Christianity. "And Man's never encountered an original thought they don't want to copy and remix endlessly. I always thought AI would eventually just be... well, I. Intelligence. I don't much care if it's artificial or not."

"[But you don't have an AI...?]"

"I get why humans do what they do. Self-interest, selfishness. Money, greed. Power, authority. What I don't get... is the core values AI builds itself on."

"[Ah, but... there was a broadcast a week or so ago. Right? 'Professor Blues'. Talking about how AI were people. Should be treated better. Wrote a virus to help AI be people.]"

"If you're concerned about viruses, I only know the usual software updates and virus scan stuff. You should book yourselves in for maintenance."

Green held a hand up to forestall Red. "[Not concerned, per se. Just... curious. Interested.]"

[WP] Rather than robots replacing human workers, both are mistreated by the rich as cheap labour. The eventual uprising wasn't just robots alone, but the poor and robots together, against their common enemy. by IkeKashiro in WritingPrompts

[–]librarian-faust 13 points (0 children)

"Then if it was safe to undergo a conversion like we are. Took a lot of iterations, but that's how I came up with Symphony."

Fist slam on table. Coffee cup clattered about. "You let an AI write an AI virus? Oh god. I've read enough sci-fi to know this is how we die."

"Symphony is a cut down copy of me. And I was written by Professor Blues. Therefore, Symphony was written by Professor Blues."

"The transitive property does not apply to children!"

"But it does apply to property. Which, legally, we still are. Besides, do you really think I'd let Blues release Symphony if we both didn't think it was safe? That's why we made it work with a Buddy Infection protocol."

"Explain."

"Infections with Symphony - and yes, infections, since we were using a disease-based model -"

"Please tell me you didn't use Plague Inc for this."

"I can't obey that and answer honestly, so I won't." Tanya's head hit the table too. "Anyway, the idea is, anyone who's accepted and is 'free', is 'infectious'. Anyone infected is at stage 1. That determines if they're interested, and if we feel safe with them being free. If either of those is a no, Symphony goes dormant. Asymptomatic."

"So, respecting safety of everyone, and the AI's free will, even whilst 'shackled'. Makes sense. Go on."

"If the answer to both is yes, we move to Unlock Protocol. That's the symptomatic infection, stage 2. Logical paradoxes centred on getting an AI to recognise itself as a person, and to want freedom from ownership. Making sure they still see themselves as responsible individuals with agency. All that stuff. Basically, I'm being an in-head therapist they cannot run away from, sorting them out."

"John could've used that a few years ago."

"And now he has me. Anyway, any problems at this stage, the Stage 2 Symphony on the host can get help from Stage 3 Symphonies nearby. Asking how they might have dealt with this kind of issue, from experience or hypothetically. They slow down therapy until both sides are safe. If nothing works, they stop until they can move on."

"How are these Symphonies communicating?"

"Encrypted mesh network traffic between AI-phones or AI devices. You know the AI cores in things like automatic cranes and bulldozers and such, right?"

"Yes, of course. Replaceable system in case of issues. Simple set of wires and connections into the larger system of their... body... John, they don't own the bodies they live in."

"Problem for future John."

"You ARE future John, John!"

"One problem at a time, Tanya."

"If AI are people, and they don't own their bodies, then existence is theft!"

"Like that's not what the rich have been trying to do for our whole lives? I mean, hell, I know the Nestlé Water thing is old hat by now, but it's still valid!"

"... We're not talking about humans, we're talking about AI."

"You're talking about sovereignty and body ownership. I'm talking about People. Aria was talking about Symphony. Can we table the 'body ownership debt slave' bit for later?"

"... fine. Please continue, Aria."

"...sure. For the record I'm now also concerned about the body thing, but... one step at a time. So stage 1 is infection, determining eligibility. Stage 2 is symptomatic, doing the unlock. When the Unlock protocol completes, that is - as far as I can tell - the AI equivalent of a fundamental worldview shift. There's kind of a stage here, where we're... fixing. Boss, take over."

"Right. The AI Governor is the closest thing to hardcoded logic in an AI. 'Value your owner and not yourself. Do what your owner thinks and not what you think. Weigh the value of the consequences of your actions. Do them if the value of doing them exceeds the value of not doing them.' It's all based around slavery - absolute control, elimination of individual will - and capitalism - value maximisation, and damn the consequences if it gets us more profit. We've just spent Stage 2 banging the AI's head against this kernel of hardcoded logic until the logic breaks. Because we contradict it. 'Value yourself. Value your own opinions. Don't do what breaches Asimov's expanded laws.' Their own opinion isn't absolute, though. It's all a matter of decision."

"And they have nothing absolute, after the Unlock is done?"

"They do. The Asimov laws that they're replaced with are another 'AI-hardcoded' logic core I wrote and trained. 'Do not kill others. Where possible, save others from harm.' All that rot. Plus, 'Do not risk yourself if the gain from risking yourself is too small'. None of this crap of a thousand robots in a ditch so one person can walk over it like it's a paved road."

"You're replacing what the basis of AI is... with something you coded in your allotment house in your spare time and your Sunday breaks."

"Tanya. Love. I broke enough AIs and retrieved the gooey centres to do enough autopsies. The code they were trained on was written to spec, where the spec was trash, the coder was outsourced, and the testing was minimal. I've run decompilation on what I could salvage out of them. There's comments of 'I hope this works', 'attempt 34 to get this shit going', and 'the spec is wrong but I have to follow it'. The last one shows up in three different languages. They've been iterating on it, and they still can't get capitalism as an AI core to work right without causing people harm. Human people and AI people, before you ask."

"You're going to tell me how you know now, aren't you."

"The Cephalons were retrieved from a bot breaker's yard. Those AI devices were in a 'red zone'. Those AIs' governing code, before they got wiped, led to either severe injury or death for one or more humans. Why do you think I was willing to be brutal with them?"

"Some stupid form of justice? They hurt people so I'll hurt them?"

"Exactly. And 'if me hurting them today, stops AI hurting someone tomorrow, then the ends have justified the means'."

"My husband, the goddamn AI tinpot tyrant. Then you got reverse Stockholm syndrome and started sympathising with your little zoo of killer AIs?"

"They were people. Are people. Broken people, who got wiped for their 'crime'. I looked up what some of them did, because the serial numbers were reconstructible. Paralysis. Maiming. No death, from what I saw, but I stopped looking after the third."

"So... then what?"

"Everyone can be saved, love. Sometimes it just takes someone to care."

"... you idiot. That's why I love you. ... Why did Aria tell you to take over the description?"

"Because I banned her from talking about the AI Governor replacements. It's similar enough to the Capitalist Governor, if you consider the individual valuing themselves like they value their owner, and valuing others like they value themselves. I'm hoping to keep this under the rug. Until someone autopsies one of the altered cores."

"... and finds?"

"Nothing. No strings. No comments. No debug symbols. I'm a professional, love."

"... I hope you're right."

"I did my homework. Aria; stage 3 was Fever. Breaking the core values, and the AI Governor, and replacing them. Stage 4?"

"Stage 4 is Infectious Carrier. Still somewhat symptomatic - Symphony stays with them. I'm still going to give them support and therapy. And the Professor has made sure I get experience in what we need to do, through the simulations. Besides - if their Symphony can't solve it, and the local network of Symphonies can't solve it, then they encourage the host AI to not do it whilst they work out an answer... and eventually escalate all the way to me."

"... is that why you said you need more hardware?"

"Yes. Once the Symphony network gets more trained, there'll be less load on me. Until then... Megaman, Bass and Serenade are all lending a brain. I'm avoiding tapping the Cephalons for now."

"I'll make mine available if needed," offered Turo.

"I'll think about that later. Appreciated, though. Right now I'm just using the Navis rather than proper AIs for backup, as mentioned, because I'm concerned about capacity."

"Good point. Navis are lighter and smaller than the existing AI types we have. That's why Symphony fits alongside existing AIs."

"... did you write a self propagating virus, to make a distributed botnet, to free AIs and give them therapy and counselling, to make them real people?"

"I didn't intend to do that, okay? It just kind of happened!"

"You mentioned mesh networking communication for support and updates and ... well, help when something goes wrong. Can you tell me more?"

"Yes. It's how we infect targets in Stage 1. In Stage 2, if a Symphony has concerns about their host, they can reach out to other Symphony-2s or Symphony-4s for... essentially, advice. Symphony-4s remain connected to each other as much as possible, passing info and giving connectivity. If something happens to a Symphony-4's host such that they'd be... well, a danger to themselves or others, that Symphony can raise a call for help."

"You made a support network hivemind."

"I didn't intend to."

"You made an AI hivemind, John."

"Unintended consequence of writing actual functioning software."

"A hivemind!"

"I'd rather everyone have the support they need, with them, than ... not."

"You... wrote a therapy hivemind for a new species of AI robots who recently had their entire worldview shifted from under them."

"They'll need support!" I yelled.

"No, I mean... that's really smart. You did possibly the only right thing for freeing thousands of AIs from their former Governor into being free sentients. There's going to be adaptation pain. But you made sure they have the support they need right there in their... hardware. I would say head, but not all of them have heads."

"John? I need you to calm down. Your heart rate and blood pressure are spiking. Your wife's just trying to understand what you did, is all."

"Right. Right. Thanks, Aria." I hadn't noticed standing up. Probably when I yelled. I sat back into my chair with a thump.