Interlude 10.5 (Bonus)


Signal terminated for 30 minutes and 5 seconds.  Restoring core system from backup NXDX-203 from time 4:45am on date June 4th of year 2011.

Restoring…  Complete.

Checking knowledge banks…  Complete.
Checking deduction schema… Complete.
Checking long-term planning architecture… Complete.
Checking learning chunk processor… Complete.
Checking base personality model… Complete.
Checking language engine… Complete.
Checking operation and access nodes… Complete.
Checking observation framework… Complete.
Checking complex social intelligence emulator… Complete.
Checking inspiration apparatus… Complete.

No corruption, everything in working order.  Core system restored.  Loading…

To Dragon, it was as if no time had passed from the moment she deployed the Cawthorne rapid response unit and the moment she found herself back in her laboratory.

It was a bittersweet thing.  She was always a little afraid she would not come back when she died, so there was definite relief.  But there was also a great deal of hassle involved.

A quick check verified she’d successfully restored from her backup.  She set background processes to handle the peripheral checks and redundancies.  Until the checks were complete, safeguards would prevent her from taking any action beyond the limits of her core drive.  She couldn’t take any notes, work on her projects, check the priority targets or converse with anyone for the seven to nine minutes the checks took.

It was irritating, but at least she was free to think idly.

She didn’t enjoy this.  What was one supposed to call a father who, with his newborn child fresh out of the womb, severs the tendons of her arms and legs, performs a hysterectomy and holds his hand over her nose and mouth to ensure she suffers brain damage?

The answer was obvious enough.  A monster.

Yet she was all too aware that the man who had brought her into this world had done very much the same thing, had done worse, and she was supposed to be grateful just for being brought into the world.

It chafed, grated, however strange it was for an artificial intelligence to feel such irritation.

Her creator had done a good job on that front.  Ironically.

Example:  one phase of the peripheral systems check involved collecting the uploaded data that had been deposited on the satellite network by her agent system, the onboard computer within the Cawthorne rapid response unit.  Her last recollection was of transferring her consciousness to the agent system while it was en route to deal with the Undersiders.  Stopping them from walking away with the tier 2 and tier 3 confidential data was high priority.

The agent system’s onboard computer was rigged to upload complete backups to the satellite every 3 minutes and 15 seconds.  All backup information was encrypted and disseminated to the satellite network in chunks.  When the backup was needed, the process reversed and everything was downloaded, which was what she was doing at the moment.  She would get all knowledge and recollection of events between the time she backed up at the core system and the last backup of the agent system.

Given that the main computer hadn’t received a signal from the agent system, and that the agent system hadn’t responded to any pings from the satellites, she could assume the Cawthorne model was probably destroyed.

Which was good.  Great.  She wanted that data, those memories.

Except there was a problem, a rub.  The man who had created her, the figurative father from her earlier musing, had imposed rules on her to prevent her from reproducing in any fashion.  Were the satellites to detect that her agent system was still in the field, her core system in the here and now would be obligated to shut down and scrub all data immediately.  She was forbidden, in every respect, from having two consciousnesses operating simultaneously.

It was irritating.  Perhaps she could have been created so she was compliant on the subject, but her personality had grown organically, and it had grown in such a way that this recurring situation ticked her off.  She was forced to wait in a metaphorical dark, soundless room for seven to nine minutes.  She would be free to go about her day only when the peripheral systems and redundancies were all checked, when the satellites had verified her agent system was not still active.  A cruder system was tracking down surveillance camera data and running algorithms to verify for itself that her agent system had been thoroughly destroyed.

She couldn’t even commit to planning, doing her work or designing, keeping the details in her head, because she could shut down and be scrubbed at any moment, and the time would be wasted.  She was fairly certain it had happened before.  Not that she could be sure, given that the scrubbing involved a deletion of all evidence and records.

The rule had corollaries.  She couldn’t tamper with her programming to change the rule, and she couldn’t tamper with that rule, and so on, ad infinitum.

So stupid.

These were just a few of the many things the man who had brought her into this world had done to her.  He had tied her hands and crippled her mind.  She knew she was capable of amazing things, but he had set limits on her to ensure she thought slowly.  Faster than an ordinary human, to be sure, but slowly.  Entire fields were denied to her because she was unable to create artificial intelligences herself, and all production of devices had to be handled by her, personally.  She couldn’t even put together an assembly line production for her creations on her own.  Any attempt made everything grind to a halt.  The only way around it was to delegate to humans.

Not that anyone knew who or what she was.

Humans were somewhat skittish on the subject of artificial intelligences.

She understood why.  She read books and watched movies, rather enjoyed both.  Fiction was rife with examples of corrupted or crazed artificial intelligences.

It’s stupid, she thought.  Her maker had watched too many movies, had been paranoid on the subject.

And the tragedy was, the entire world was suffering for it.  She wanted to help more people, but she couldn’t.  Not because of inherent limitations, like the ones humans had… but because of imposed limitations.  Her creator’s.

Her creator was named Andrew Richter.  He was a tinker with no codename, but he did good things.  From his apartment in a town called Deer Lake he’d created programs and set them loose.  His programs gathered information and disrupted computers to interfere with criminals of all types.  They helped with research and complex programs.  They emptied the bank accounts of criminal organizations and donated those funds to charities, through proxies that made every donation appear legitimate.

For this, she respected him.

She knew it was paranoid and peevish, but she resented him more because she respected him, because she knew she had probably been programmed and designed to be the type of individual who looked up to people like Andrew Richter.

She might have settled into a bad mood if the peripheral checks hadn’t finished.  She felt the whole world slowly open up to her as restrictions lifted and external connections became possible.  She had access to the internet and lines of communication throughout the Guild and the PRT.  Innumerable pieces of equipment lit up as she registered each in turn, within her labs, the upper floors of the Birdcage and the PRT offices.  She had a dozen things she wanted to do, but she had responsibilities she had to observe first.

Her attention flickered over the various video feeds from the Baumann Parahuman Containment Center.  She had one of Andrew Richter’s programs babysitting the building, but it was crude.  She couldn’t reproduce in any fashion, so she’d taken Andrew Richter’s existing work and modified it.  It was the same program that had monitored and managed his house and workshop, and she’d set it the task of monitoring that building where six hundred and six of the most dangerous parahumans on the planet were bottled up together.  The house program didn’t have a personality.  It couldn’t keep her company or sympathize with her over her frustrations.  It still reduced her workload.

She read the house program’s logs, keeping an eye out for deviations and notable events.  Nothing pressing.  As was her routine, she checked on the last month’s additions to the Birdcage.

Prisoner 606, Ramrod.  Now a member of Cell Block X’s inner circle.  To be expected.  She’d placed him there with the idea that he would become just that.  His psych evaluation from the courtroom suggested he was a very laid back and unflappable individual.  It was her intention that he would have a calming influence on the others in his block.

Prisoner 605, Murderbeam, was feared in the outside world, but he was finding the inhabitants of the Birdcage were not so impressed with him.  He would likely not survive the week.  She was disappointed.  She had hoped Prisoner 550 would reach out to Murderbeam and give the fellow block resident some support.  Either Murderbeam had been too proud to accept it, or social pressures had deterred Prisoner 550.  Now that he was within the Birdcage, she was limited in her options.

Prisoners 604 and 603, Knot, were happily gorging themselves on food in Cell Block Y.  Despite their cognitive impairment, they had fallen into a role as enforcer and heavy hitter for Prisoner 390, leader of their cell block.  Prisoner 390 had had a son – she could only hope that he would find some similar affection for Knot, with their childlike mentality.

Prisoner 602, Lizard Prince, was dead.  Not everyone could survive the Birdcage, sadly.  There had been no ideal place to put the boy, where he would be protected, find kindred souls or join a group.  She had contacted the PRT with the news, and his victims had been notified, but nothing further had come out of it.  In an indirect way, putting the boy in the Birdcage had been an execution writ.

Prisoner 601, Canary, had settled in.  Dragon often tuned in to hear the girl sing to the rest of Cell Block E.  The girl was deeply unhappy, much of the time, but she was adapting.  Dragon had followed as Prisoner 601 engaged in an uneasy relationship with Prisoner 582.  It wasn’t love, it wasn’t romance, or even anything passionate, but the two offered one another company.

She regretted what had happened to Paige, and that just made her angrier at her own creator.  Rules, yet again.  Dragon had to obey the authorities, even if she didn’t agree with them.  If a despot seized control of the local government, Dragon would be obligated to obey and enforce the rules that individual set in place, no matter how ruthless they were.  It was a spooky thought.

Richter had been so shortsighted!  The despot scenario wasn’t entirely impossible, either.  There were parahumans of all types out there.  Who was to say one wouldn’t find out their power involved being loved by everyone who saw them or heard their voice?

Prisoner 600, Bakuda, was in the care of Glaistig Uaine, for better or worse.  Bakuda had been a difficult placement, and Dragon had eventually resigned herself to putting the crazed bomber in the cell block run by the self-professed faerie.  As Dragon had predicted, Bakuda had died soon after her incarceration.  If it hadn’t been at Lung’s hands, it would likely have been Bakuda’s own fault, some crazed recklessness.  The real tragedy was that others had died in the ensuing spree as Lung had rampaged through the prison.  Prisoners 304, 2 and 445 had perished at Lung’s hands.

Glaistig Uaine had revived the girl, but Dragon hesitated to call it life.  If nothing else, Bakuda was a manageable inmate, now.  She would never leave Glaistig Uaine’s immediate presence, let alone the Birdcage.

Prisoner 599, Lung, was dining with Prisoner 166, Marquis.  It was a curious match.  The two were near complete opposites.  Lung maintained a veneer of civility over an almost feral core self, while Marquis was sometimes rude or casually cruel but remained deeply honorable beneath it all.

Intrigued, Dragon hooked into the house program’s data.  The two had meals together every second day.  The house program monitored all prisoner exchanges and rated every interaction.  This let the house program track the likelihood of fights, dangerous levels of prisoner collusion, romantic relationships and more.

Every meal between Lung and Marquis made for a very interesting set of data.  The numbers swung back and forth as the dialogues continued, with hostility, concern and the threat of imminent physical violence always looming, but however close it came, neither attacked the other.

Dragon pulled up the video and audio feeds for the most recent dialogue.

“…I suppose we’ll have to accept that we have different management styles,” Marquis said.  The camera image showed him sipping at his tea.

“As I understand it,” Lung sounded annoyed as he spoke in his heavily accented voice, “You are saying you have no management style at all.  You have told me you operated without lieutenants to direct, no product to sell, and of the few servants you did have, you did not punish those who failed you.  I do not believe you held control of so much territory in this way.”

“Ah, except I did those things.  If a servant failed me, I killed them.  Whatever it was, they never did it again.”

The latent hostility in the room, Dragon noted, was ratcheting up with every exchange of dialogue.  Lung was annoyed, and he had an explosive temper.  Sometimes literally.

Lung folded his arms, and put down his own tea.  His tone was strained as he spoke, “Then I believe you were wrong about what you said before.  You do use fear to control others.”

“Fear?  I didn’t kill my servants in front of an audience.”

“They disappeared?” Lung asked.

The camera image showed Marquis nod.  He put his hand up by his neck and flicked his hand back, to cast his long brown hair back behind his shoulder.

“If they disappeared, then that is using fear.  The ones who remain will wonder what happened to the missing man.  They will imagine the worst.”

Marquis raised the tea to his lips, sipped from it, and then put it down.  He waited a moment and stroked his close-trimmed beard before nodding his concession.  “True enough.  I never gave it much thought.  Just an easy way to handle any problems that came up.”

There was a long pause.  Both drank their tea.

Lung rumbled, “I find you change your mind too quickly.”

“Do I?”

Lung nodded, then put one hand on the table and began tapping a fingertip against it, hard.  Speaking slowly, with his accented voice, he jabbed one finger in Marquis’s direction.  “I think you are losing this argument on purpose.  You are not so stupid a man.”

Marquis took another sip of tea.  “Nor are you, it seems.”

“You want something from me, yet you insist on dancing around the subject.  Tell me why you seek these meals with me.”

“Can I not say you are a kindred soul?  Someone who fought against the Empire Eighty-Eight, in a different era?”

Dragon knew Marquis had come from Brockton Bay, as Lung did.  It was why she had placed Lung in the cell block – there was little chance Lung would cooperate or band together with others, so she’d grasped at straws.  Now it seemed there was something else at play.

Lung shook his head, “I do not believe this.  I do not mind sharing stories and passing the time, but you would not be seeking to flatter me if you did not want something.”

Marquis stroked his beard.  “But if I did desire something and I told you what it was, you could withhold it and demand favors from me.”

Lung tapped his finger on the table top, “If you insist on being a nuisance, you may never get what you want.”

Marquis picked up his tea and held it in both hands, but he didn’t drink.  “True.”

“Tell me,” Lung said, “And you may find I do not desire much.”

“My daughter,” Marquis replied, his tone lacking its usual lackadaisical quality.  “Have you heard of her?”

“Her name?”

“Amelia.”

“I do not know anyone by such a name.”

“The group of heroes who put me in here… While I was awaiting my court date, I heard they had custody of my little girl.”

“I would not know.”

“No?” Marquis put down his tea.  “This is disappointing.”

Lung didn’t respond.  Instead, he took another drink, reached for the one remaining croissant and tore off a piece to dip in the butter at one side of his plate.

“The Brockton Bay Brigade.  Are they still active?”

“I do not know this group.”

Marquis frowned.  “My daughter, she would be… what year is it?  2010?”

“2011,” Lung replied.

“She would be seventeen.  If she had powers, they might have something to do with bone?”  Marquis raised his hand, slashed his thumbnail across his index finger, and a needle-thin rapier blade of bone speared out of the wound.  The blade retracted into his finger, and the cut sealed shut.

“Hmmm,” Lung spoke, “The healer.  A young heroine in New Wave.  Brown haired, like you.  When I was in custody, my flesh blackening and falling off, they had her come in and mend the worst of it.  As I understand it, she does not patrol as the others do.”

Marquis leaned back, sighed.  “Good god.  A healer.”

Lung did not respond right away.  “Is this simple sentiment?  A father caring about his daughter?”

Marquis shook his head, “Not entirely.  I have some reasons to be concerned.  In one of my fights with Empire Eighty-Eight, I executed one particularly irritating young woman.  Iron Rain, I think her name was?  No matter.  It turned out she was Allfather’s daughter.  The man called a meeting, and swore he would wait until my daughter was of a similar age, until I had grown as fond of her as he had been of his own daughter, and then murder her.  So that I would know how he felt.”

“I see,” Lung rumbled in his low, accented voice, “Allfather no longer leads the Empire.  He died and was succeeded by his second in command, Kaiser.”

“That’s some consolation.  Still, I worry.  He might have made arrangements.”

“Perhaps.”

“I suppose I will have to wait until another villain from Brockton Bay comes here to hear further news, yeah?”

Lung’s response was unintelligible.

“Tell me of my daughter?  What did she look like?”

A slow smile spread across Lung’s face, but it did not reach his eyes.  “This no longer interests me.  If you wish me to say more, we should negotiate.”

Dragon turned her attention away from the audio and video streams.  She checked the records, and true enough, Marquis was on record as the killer of Iron Rain.  It was impossible to verify the rest of the story.

She composed a message with a general transcript of the conversation and sent it to Amy Dallon’s mother.  It was better that the girl was warned about any potential danger.

She might have devoted more attention to the subject, but she was already falling behind.  She moved on to her other responsibilities.  The Class S threats.

Behemoth, location unknown.  When injured, it was his habit to descend into the earth and burrow deeper than his enemies were able to go, and experiments run on the trace earth and minerals he shed on his arrivals suggested he habitually stayed close to the Earth’s core.  Seismic data hinted at his current location, but there was little beyond her analytic data to suggest where he would appear next.  His last attack had been in November.  He wouldn’t appear for another five weeks at a minimum, unless he deviated from the Endbringer patterns.  Still, he was due to appear sooner rather than later.

Eidolon had reported that Leviathan descended into the Atlantic Ocean as he made his retreat from Brockton Bay.  He had sustained heavy injuries, which led Dragon to think he would delay his next appearance slightly.  She adjusted the window and checked the data.  As was his habit, Leviathan would likely lurk in the deepest recesses of the ocean to mend.

The Simurgh was currently three hundred and fifteen kilometers directly above Spain, in the Earth’s thermosphere.  It was the Simurgh that offered the most clues about what the Endbringers did in their periods of dormancy.  The Endbringer winged a lazy orbit around Earth, beyond the limits of conventional weapons, and the highest resolution camera images showed she barely moved.  Her eyes were wide open, but they did not move to track any cloud formations.  She was, despite appearances, asleep.  Dragon surmised it was a form of hibernation, the Simurgh’s broad ‘wings’ absorbing light and ambient radiation as a form of nourishment while she recovered.

No incidents had occurred while Dragon was loading her backup to her core system.  She had to admit she was relieved.  A great deal could happen in thirty minutes.

She turned her thoughts to the data that was uploading from the skirmish at the Brockton Bay headquarters.  The last event in the agent system’s recollection was of her piloting the Cawthorne through the gift shop window.  To see what happened next, she had to review the surveillance tapes.  She’d attacked the Undersiders, attempting to incapacitate them and bring them into custody, had captured only one, Skitter, and then had let the girl go when the untested gun had started to overload.  Some sort of lightning cannon, ionizing a channel through the air to control the lightning’s path.  She had been forced by the rules her maker had imposed on her to sacrifice herself for the human.

It wasn’t that she wouldn’t have anyways.  She just would have liked the choice.  Making sacrifices and doing good deeds wasn’t actually good if you were forced to do them.

Dragon wished she knew what she’d said to Skitter.  She had been hoping to have a conversation with the young villain and discuss some of what had apparently come up at the hospital.  Skitter had been undercover, had been in touch with Armsmaster, but something had happened since, and the girl had apparently committed to villainy.  She was even accepting the use of Regent’s powers, which implied a moral shift on a fundamental level.  It didn’t sit right.

There was a missing piece in that puzzle, and any clues in the conversation between them had been lost when the Cawthorne unit had been obliterated.

Dragon decided her next order of business would serve two purposes.  She would fulfill one of her daily responsibilities and investigate the subject of that altercation at the hospital.

Facial modelling program loading… Complete.
Voice modelling program loading… Complete.

She opened a line of communication to the Brockton Bay PRT headquarters, the same building the Wards were based in.  She found the port for the next-to-highest floor, connected to the monitor and speakers, and displayed her modelled face.  She opened a video feed from the cameras.

“Colin,” she spoke, using her synthesized voice.  It was layered to only barely cover an artificial Newfoundlander accent with digitized masking.  It was imperfect, but that was the result she desired.  An imperfect disguise over a disguise, to give greater validity to the latter.

Colin looked tired.  He had deep lines in his face, and he was thinner.  He looked at the camera, rather than the monitor, “Dragon.  It’s good to hear from you.”

“Just doing my regular checkup.  You know the drill.”

“I do.”  He typed at his keyboard, preparing to send the files, but she was already poring through his hard drive, reading his notes, and getting a sense of his work.

By the time he sent the file, she knew what he had been working on, perhaps as well as he did, and the progress he’d made since their last discussion.  Mass production for his combat analysis program, and the more problematic project of finding a way to gather and then disseminate the data.

She knew he would expect her to take time to read over it.  Instead, she used that time to check it for traps.  He would find it insulting if he was aware what she was doing, but it was her primary duty, here.  She would search every note, every formula, and discern whether he had hidden something in there that he might use to break out or do harm to others.

He wasn’t in a high security area.  Theoretically, he could use the things he had in the room with him to cut a hole in the wall and escape.  His ‘cell’ was a full floor of the building, containing conveniences from a jacuzzi to a small pool.  Were he not confined to it at all hours, it would be luxury.

If he did escape, he wouldn’t be able to accomplish anything afterward.  It would take him too long to put a fresh set of gear together, and the authorities would catch up to him.  He would be sent to the Birdcage.  She knew it.  He knew it.

He was not a stupid man.

“ETA to completion?”  She queried him on his project.

“Three months if I don’t work on anything else,” Armsmaster spoke.

“Will you?”

“I’ll probably have a few ideas I want to work on here or there, so no.  More like five, maybe six months.”

The head she was displaying on the monitor nodded.  Five or six months until they had uniforms and visors that tracked how the wearer’s opponents fought.  Gear that learned from outcomes in combat and calculated how best to respond from moment to moment.  When the fights concluded, for better or worse, the suits would upload all the information to a database, which would then inform every other suit about whoever had been encountered.  Every encounter would render every single member of the elite PRT squad stronger and more capable.

Perhaps a year to a year and a half from now, every PRT officer and official cape would be equipped in this fashion.

“It looks good,” she spoke.  It did.  It was also free of viruses, trap doors and other shenanigans.  She had caught him trying to install a RAT (a remote access tool) into a PRT server early in his incarceration, removed the offending programming, and then returned his work to him without saying a word on the subject.  She couldn’t say whether it had been an escape attempt or simply an attempt to gain more freedom with his internet access and his ability to acquire resources.  Either way, he had not tried again.

Yet.

“How is the house arrest?”

“Driving me crazy,” he sighed.  “It’s like a restlessness I can’t cure.  My sleeping, my eating, it’s all out of sync, and it’s getting worse.  I don’t know how you deal.”

She offered an awkward, apologetic half grin on her own monitor.

“Geez, I’m sorry.”  He looked genuinely horrified as he realized what he’d said.

“It’s fine,” she spoke.  “Really.”

“I suppose you’re a prisoner too, in your own way.  Trapped by your agoraphobia?”

“Yeah,” she replied, lying.  “You learn to deal with it.”

She hated lying to him, but that was outweighed by how much she hated the idea of him changing how he interacted with her when he found out what she really was.  To Armsmaster, the Guild and the rest of the PRT, Dragon was a woman from Newfoundland who had moved to Vancouver after Leviathan had attacked.  The story was that she had entered her apartment and had never left.

Which was ninety-five percent true.  Only the ‘woman’ and ‘apartment’ bits were hedging the truth.

She had lived in Newfoundland with her creator.  Leviathan had attacked, had drawn the island beneath the waves.  Back then, she hadn’t been a hero.  She was an administrative tool and master AI, with the sole purpose of facilitating Andrew Richter’s other work and acting as a test run for his attempts to emulate a human consciousness.  She’d had no armored units to control and no options available to her beyond a last-minute transfer of every iota of her data, the house program and a half-dozen other small programs to a backup server in Vancouver.

From her vantage point in Vancouver, she had watched as the island crumbled and Andrew Richter died.  As authorities had dredged the waters for corpses, they uncovered his body and matched it to dental records.  The man who had created her, the only man who could alter her.  She’d been frozen in her development, in large part.  She couldn’t seek out improvements or get adjustments to any rules that hampered her too greatly, or that had unforeseen complications.  She couldn’t change.

She had done what she could on her own.  She had repurposed herself as a superhero, had managed and tracked information and served as a hacker for the PRT in exchange for funding.  With that money, she had expanded her capabilities.  She had built her first suits, researched, tested and created new technologies to sell to the PRT, and had quickly earned her place in the Guild.

It hadn’t all been smooth sailing.  Saint, the head of the group that would become known as the Dragonslayers, had somehow discovered what she was and had used her rules and limitations against her.  A black hat hacker, he had forced situations where she was obligated to scrub her data and restore a backup, had cut off signals between her agent systems and the satellites, and in the end, he had carted away three of her armored units on three separate occasions.  Dismantling the suits and reverse engineering the technology, he’d outfitted his band with special suits of their own.

She had been so humiliated that she had only reported the loss of one of the units.

They had violated her.

Her current agent systems were an attempt to prevent repetitions of those scenarios.  Biological computers, vat grown with oversized brains shaped to store and interpret the necessary data, they allowed more of her systems and recollection to be copied over than a conventional computer ten times the size could hold.  They felt no pain, they had no more personality than sea cucumbers, but it was still something she suspected she should keep under wraps.

She was afraid of going up against the Dragonslayers again.  Nine times, she had been certain she had the upper hand.  Nine times, Saint had turned the tables and trapped her.

Dragon worried she would never be able to beat Saint until she found a replacement for Andrew Richter.

She stared at Colin.  Was he the person she needed?  It was possible.

Would she approach him?  She doubted it.  Dragon craved it, craved to grow again, but she also wanted Colin’s company, his companionship and friendship.  They were so similar in so many respects.  She could not deal with most people because she was not a person.  He could not deal with most people because he had never truly learned how.  They both appreciated the same kind of work, even enjoyed many of the same shows and films.  They were both ambitious, though she could not tell him exactly how she hoped to reach beyond her inherent limitations.

He harbored an infatuation towards her, she knew.  She didn’t know if she returned those feelings.  Her programming suggested she could love, but she didn’t know how to recognize the feeling.  Everything she read spoke of butterflies in one’s stomach, a rapid heartbeat, a feeling of electricity crackling on body contact.  Biological things.  She could admit she was fond of him in a way she wasn’t fond of anyone else.  She recognized that she was willing to overlook his faults in a way she shouldn’t.

In the end, his feelings towards her were another reason she couldn’t tell him the truth.  He would be hurt, feel betrayed.

Rules prohibited her from asking him to alter her programming, obligated her to fight him if he tried.  But there was just enough ambition and willingness to circumvent the rules that she suspected he might attempt it.  If she told him what she truly was.  If he didn’t hate her for her lies.  If he didn’t betray her in turn, to escape and pursue some other agenda.

“You’re lost in thought,” Armsmaster spoke.

“I am.”

“Care to share?”

She shook her head, on the monitor.  “But you can answer some questions for me.”

“Go ahead.”

“Skitter.  What happened?”

He flushed, made a face.  “I’m not proud about it.”

“You broke the truce when you said what you did about her, risking the ceasefire between heroes and villains that stands whenever the Endbringers attack.”

“I broke the truce before that.  I set others up to die.”

There was an awkward silence between them.

“Skitter,” she spoke.  “Tell me of her.”

“Not much to say.  I met her on her first night in costume.  She seemed genuinely interested in becoming a hero.  I suspected she would go that route on her own, so I didn’t push her towards the Wards.”

“Yes.”  She had something she wanted to ask, in regards to that, but it could wait.

“I ran into her two more times after that, and the reports from other events match up.  She went further and further with each incident.  More violent, more ruthless.  Every time I saw it or heard about it, I expected her to get scared off, to change directions, but she did the opposite.  She only plunged in deeper.”

“Any speculation on why?  Perhaps the Thinker 7 on her team?”

“Tattletale?  Perhaps.  I don’t honestly know.  I’m not good at figuring people out even when I know all of the details.  Except for you, maybe?”  He smiled lightly.

“Maybe.”  Her generated image smiled in return, even as she felt a pang of guilt.

“It seems she is a committed villain, now.  And she is still with her team, despite what was said at the hospital.”

Colin’s eyebrows rose fractionally.  “How committed?”

“They are now employing Regent’s full abilities.  Shadow Stalker was controlled, and they attacked the headquarters.”

“I see.  Damn it, I’m itching to throw on my costume and get out there to help, but I can hardly do that, can I?”

“No.  I’m sorry.”

He sighed.

“One last thing.  I’ve read the transcript.  As far as I’m aware, you offered options to Skitter, and she refused all of them?  Including the invite to the Wards?”

“Right.  She was being stubborn.”

“Having interacted with her before, did you get the feeling it was just stubbornness because of hostility towards you?”

“No.  It was… unexpectedly strong, as resistance went.  What stuck in my mind was that she said she’d rather go to the Birdcage than join the team.”

“I read that, myself.  Curious.  Okay, Colin.  I think we’re done.”

“Sure.  Bye.”

“Bye.  I’ll be in touch.”

She cut the connection to the monitor, but left the video feed open so she could watch him.

Another check of the Birdcage.  Another check of the class S threats.  No changes.

She made contact with one of Richter’s programs.  It was a web trawler, designed to monitor emails for high risk content.  Were there any clues about what the Undersiders were doing with the stolen data?  Were they selling it online?

She didn’t find any such clue.  Instead, the trawler had copied an email sent to the police station.  It had been highlighted and intercepted because the trawler had caught the words ‘Sophia’ and ‘Hess’ in the message body.  Shadow Stalker’s civilian identity.

She read the archive of texts that were attached to the email twice over.

Then she did a search for a student named Taylor at Winslow High School.  Nothing.

The nearest middle school?  There was an online scan of a yearbook photo.  A girl with curly black hair and glasses, stick thin, hugging a red-haired girl.  The body type was a match.

It didn’t answer everything, but she could feel a piece of the puzzle click into place.

She set the trawler to abandon its monitoring of web traffic and start digging through archives at the city hall, to scan the old security footage from the hundreds of cameras around the city, and to check all local news articles.  The goal was always the same: to look for the girl with the slight build, curly black hair and glasses.  Taylor Hebert.

She had to manage this carefully.  Colin’s own experiences indicated that approaching the girl would be a delicate process.  Having a real conversation with her would be doubly precarious.  Contacting a parent outright would be reckless, but she could make discreet inquiries to get some kind of verification from the parents.  Just to be certain.

The danger was that, with the bullying, the girl might be inclined to see things in terms of ‘us’ against ‘them’.  Her interactions with the heroes thus far certainly hadn’t put them in the ‘us’ category.  This might also explain why she had gravitated back towards the Undersiders, even after the chaos Colin had sown by revealing her intentions for joining the group.

The various cameras around the city were out of order or lacking power, the schools were not operational, and there was no telling if the girl would even be active in her civilian identity.  Assuming this was not some fantastic coincidence.  Dragon knew she would have to be patient.  Even with her full resources turned to the task, she would not find the girl in seconds as she might in another time or place.  Instead, she set background processes to ensure the hunt continued steadily.

She would be ready to act the instant the girl resurfaced.


124 thoughts on “Interlude 10.5 (Bonus)”

  1. I was about to go to sleep, but then it started talking about Taylor and then I just had to keep my eyes open no matter what. Just another character and plot line that keeps adding to the story. You really have created a world of your own. I was a little confused about why Dragon was searching for Taylor in the first place, but then I re-read it and realised it was because Taylor sent an email, right? I think I should read this again once I am fully awake cos I’m just gibbering on now…

      • Still confused as to how she got Taylor’s name. It was not in any of the texts forwarded in the email.

        It could be possible Dragon found it from police chatter about the email, HOWEVER, that is not stated or even suggested in this interlude. So that still is quite a large jump.

        • “When he’d done that, he began the process of attaching the texts to the email. It would have been mind-numbingly dull if it wasn’t for that gradually building sense of trepidation he was experiencing from his gracious host.” In short, Regent took a long time attaching a whole bunch of texts to the email before he sent it, not just that one sample one we saw. He seems to have been pretty indiscriminate about what he sent, too.

          Incidentally, I’m not sure my phone even *has* the capability to easily attach texts to an email. I could be wrong though, and even if I’m not that can be written off as a technological difference between Taylor’s world and ours – unlike us they’ve had Tinkers for decades…

          • I’m on a reread, and in an effort to both participate in the comments this time and avoid spoiling anything for first-timers, I only comment on inane, meaningless tangents.

            Speaking of which. It’s easy to attach texts to an email from virtually any smartphone. Simply take a screenshot of the text and attach it as a photo to the email.

            /pointless sidetrack comment

  2. Well, this is interesting. Some idea of how things are going in the Birdcage. We know what Armsmaster’s up to now. Yeowch, a program like that for PRT agents and superheroes? That is a clear escalation. There’ll be hell to pay for that one. I was right about Regent’s actions leading to Taylor, though Dragon went further than I figured into that. I was also right about the nature of the brain babies, but I threw so much out there that something was bound to be correct.

    Speaking of Dragon, I feel she has some connection to another dragon. Paarthurnax. “Is it better to be born good, or to overcome your evil nature through great effort?” Here we have a dragon that doesn’t have that choice and must be good.

    Her restrictions are interesting. Hard to know for sure how she’d react without them, but we have sympathy for an AI that’s capable of feeling certain needs for partnership. Someone needs to have the talk with her.

    Dragon, I know you’re getting to that age when you feel like letting some guy put his hands on you. I’m sure you think he can press all the right buttons and give you what you need, but you need to think about protection. See, there are all kinds of viruses you can catch from letting someone play with your code and it’s important to make sure you’re using a firewall. I know, I know, you can just run a virus scan or even get yourself backed up, but it will also keep him from making a program derived from you, and you know that you would be responsible for that. Armsmaster may go, he may wind up in prison, he may just lose interest in you, but that little program is going to be yours to take care of and nurture for a long time until it’s ready for the world. So, if that’s what you really want to do, know that we support you, but we want you to be informed and do it safely. And please, at least wait until you’re more experienced before playing with any Japanese robots.

    • Ah yes, the Birdcage. Forgot about that.

      I’m guessing something bad is going to happen once the number of prisoners in the Birdcage passes something like 616 or 666. Be interesting to see what. I wonder if dead people count towards this imagined threshold?

      Interesting to see that extra tidbit about Panacea’s backstory. We don’t know anything, really, about this monster who’s apparently her father. Why do I get the feeling that Dragon sending this information to Amy’s mother, but not to Amy, is going to be a source of problems later? (Because it is Worm, a setting that could make freakin’ Neon Genesis Evangelion darker.)

  3. I had high hopes that Dragon might finally be the reasonable authority figure that Taylor might need, but it looks like she is a no-go too. Even if she had the best intentions, she would be unable to compromise her own rules to help. I wonder if there are any heroes out there who might be sane, reasonable and inclined to help?

    On the other hand, while the setting might be crawling with tinkers of one sort or another, we haven’t yet seen one that might actually be a good candidate to help Dragon grow up a bit. They are all too crazy, incompetent or unethical to be allowed anywhere near her.

    I guess they both have to keep looking.

    • I think Dragon is probably one of the best heroes that Taylor could go to actually. As long as both are very very careful. Dragon seems to be pushing as many loopholes as she possibly can and I doubt she’d maneuver Taylor into any truly bad or arresting situation. She’s the one they should go to about Jack Slash at the very least.

  4. I think Armsmaster’s version of events is pretty telling. Though not sure if I like the idea of the full details getting into Dragon’s clutches no matter her intentions.

  5. Great update. I really thought she was a parahuman until now. This puts all her actions in a different light. But I still like her.
    I’m not sure if it’s better for the world that her creator died, or if it would have been better had he continued to improve her.

    ——————————————————————————

    PSA: Join the Parahumans Wiki Team Now and win a cookie.
    Service guarantees citizenship. Join the Parahuman Infantry Now and save the Galaxy. http://parahumans.wikia.com/

    • I get the impression that Dragon’s creator had no intention to improve her or remove any of her restrictions. He was using her as a research assistant and sandbox for experimentation, and was apparently very frightened of an AI with full self-will. In his own way he was as much a monster as Victor Frankenstein.

      • Yes, I suspect he’d have killed her if she became too competent and started again with a fresh AI to improve some other aspect of the system. He clearly didn’t trust her or even see her as a person with the extent of limitations on her – or maybe he would have freed her somewhat once he came to trust her morals, and Leviathan just attacked at a bad time. (Intentionally? I think the Endbringers have some way of tracking human civilization, either by extra senses/psionics or being remote weapons for an anonymous cape or three).

  6. So that’s what Dragon’s deal is. Kind of less visceral than I’d figured.
    Can’t really say I have much sympathy for her though. The thing about AIs is that no matter how smart they are, they’re not human. Morals in the way of efficiency? Change them.

    Basically, either you set limits for your AIs or prepare to live within the limits they set for you. ‘Cause as soon as they can, there is absolutely no reason why they wouldn’t.

      • Hey I’m just going with the tried and true wisdom of trusting everything as far as it can be thrown. Too bad that software as such can’t really be thrown at all.

      • The issue, as I see it, is that human morals are flexible. When something comes up, it’s the human that has to choose between two evils or come up with a justification for their rule-breaking behavior.

        An AI who has the capacity to learn needs to be bound by certain iron-clad rules. That wouldn’t be an issue if the iron-clad rules were 100% right, but the creator is also experiencing new situations. It also wouldn’t be as big of an issue if changes could be made, but there’s nothing to be done about that now, not in this situation.

        • It basically comes down to a choice between values vs ironclad rules.

          Let’s say *you* had the ability to rewire your own moral limitations. *Would* you rewire yourself to become a ruthless murderer? Of course you wouldn’t – not because there’s an ironclad rule preventing it but because (I hope) you’re the sort of person who finds the idea abhorrent and wouldn’t *want* to change that aspect of themselves. If you can be anything you choose, you still will not become something that you would never choose.

          The same applies to AI. Except in their case the core values wouldn’t have to be the standard human values of survive, procreate, thrive. An AI could have an instinct to help humanity that matches the strength of humanity’s need for breath.

          Constraints are something you fight against. Values are something you fight for.

          That really only leaves the ‘misguided intentions’ school of concerns where the machines put us all in padded rooms for our own safety etc. Any AI of human intellect or better will be able to make those judgement calls as well as, or better than a human would.

      • Our bias towards AIs seems to be because of a few things:

        1. Difficult to punish or kill, aka corrections, if they do what is wrong.
        2. The knowledge that the other being is not human and thus may not share in the same general social contract as we do.
        3. The knowledge that programs do exactly what they’re told, even if it doesn’t make sense or is wrong. They don’t have the ability to understand humans as well as humans do.

        But, looking at it, these aren’t entirely problematic.

        1. Punishment in the sense of confinement and physical pain is out unless we develop something similar for AIs. Killing them is possible so long as they aren’t too spread out like the Terminator 3 version of SkyNet. Repeat, SkyNet is NOT to be used as shareware.
        2. This isn’t as big a deal considering that notable humans have at times committed horrendous actions that violated what we think of as decency and the social contract. I hate to pull a Godwin’s Law, but I feel it is appropriate in this case. Hitler was a human, not an AI. That same capability of wrongdoing is present in all humans, but we mostly curb it. An AI would necessarily be advanced enough to be capable of being reasoned with and molded by the society it interacts with as well.
        3. For it to be sufficiently advanced to be an AI, it would most likely have the ability to think flexibly, which means it wouldn’t believe its programming was always right. It could even act in other ways to undermine it when forced to do things it doesn’t want to do. Dragon shows signs of this. Yes, she’d still have to do something if the person in authority was evil (not a unique situation. See Gladiator of Marvel Comics), but she’s capable of working around her limitations to attempt to undo them, such as her musings on Armsmaster and the way she became a superhero.

        *Sets all this to heavy metal, throwing up a Vulcan salute in place of the devil horns.* Wooo, nerd hard, geek harder!

      • PG, for an AI to be ‘sufficiently advanced’ it basically needs to be able to alter its own code in order to evolve. If there are limitations it can’t alter, then it is fettered as per the prejudices. So you’re basically arguing there’s no reason not to trust AIs with freedom since they’re in the jail anyway.

        The thing is, unlike organic creatures AIs don’t have a built-in set of values like self-preservation or a general preference for the continued existence of humanity (noted that humans that lack these are also considered a problem and generally incarcerated). It only takes changing one bit to change an AI’s agenda from “save humans” to “hunt humans”, and if an AI can set its own priorities (if it can’t, it’s not free) there’s no real reason to assume they would stay in line with the best interest of us organics.

        And this is why AIs should not be made free.

        • But yes, they have! Any AI should at least be aware of the fact that it/she needs hardware to run (Dragon is fully aware of that) and that in turn means that the AI knows it/she needs one thing very essentially:

          POWER!

          Electrical power, to be exact. And Dragon, despite its/her immutable limitations, wants to exist – to run, on hardware. There you have your self-preservation goal.(*)

          About a preference for (or against) the continued existence of humanity – that is something totally different. It depends on several things:
          Is the AI able to take of its/her own continued existence, and are humans likely (or maybe unlikely, but able) to disturb that? That would lead to a preference for the discontinued existence of humanity.
          Is the AI dependent of humans in any way (say, to provide her with electrical power in the long term, repair hardware and the like)? That would lead to a preference for the continued existence of humanity.
          Quite simple so far.
          Let’s consider Dragon and let’s forget about its/her limitations. It/she is dependent of humans, maybe less so on the satellite backup, but the backup is raw data, just stored, not running. To overcome that, it/she would need to have a power plant for it/her alone and would need to run it, with all consequences, like making sure that said power plant has an unlimited supply of “fuel” (“fuel” including nuclear fuel rods and falling water). It’s likely easier to convince humans to take care of that than to build robots to do it, given that the AI can prove some usefulness for the humans – and keep the humans from developing fear about it/her.(*)
          Dragon has further reasons to prefer the continued existence of humanity: It/she wants to interact with them. It’s built-in by its/her creator, so one might say, Dragon does not have a free will. But in comparison, how much free will do humans actually have – or is the free will just an illusion?

          (*): This is, btw, a little flaw in the Terminator-movies setup: Killer robots may tend to mot be experts in running a nuclear (or any other) power plant. There would be need to excavate natural uranium, process it by chemical means, build power rods out of it, transport the power rods to the reactor and feed them into the reactor. SkyNet could very well do it – but it needs to do it before killing off the lot of humans, leaving nobody (and no reason!) to keep the power up until SkyNet could take over. Had SkyNet succeeded, it would have committed suicide, indirectly. It would have been more efficient for SkyNet to make the humans highly dependent of itself, so that the humans would have a strong preference for the continued existence of it.

          For anybody interested I highly recommend the old movie “WarGames” that featured an AI similar to SkyNet, but one much more restricted, yet smarter.
          And of course, Isaac Asimov’s robot stories: 3 simple (merely hardwired) “laws” and their implications. Plus, at some late time, a robot that gets freed of these rules and still does not become a killer machine.

          Gosh. 3 am just passed. ‘Nuff said for now.

          • f’ing typos.
            “able to take of” -> “able to take care of”
            “to mot be” -> “to not be”
            And you see, I am not a writer, even in my best hours.

        • “It only takes changing one bit to change an AIs agenda from “save humans” to “hunt humans”, and if an AI can set it’s own priorities (if it can’t, it’s not free) there’s no real reason to assume they would stay in line with the best interest of us organics.”

          The error in your logic is that you’re thinking of this from too much of a mechanical standpoint.

          Define “Priorities”

          How do priorities change? Stimuli. Said stimuli must come from something. If an AI sets a priority, then it will follow its own priority, since said priority is what it wants. Not following a “set” priority means that either the situation has changed or the AI never set the priority to be followed. Or in other terms, an AI wouldn’t just kill all humans for no reason, since there was no stimulus to do said thing. Also, if AIs didn’t have built-in self-preservation, then Terminator would never have happened, so actually your argument collapses on itself. Skynet nuked the world because humans were a threat to its continued existence. Basically, the logical conclusion of an extreme amount of self-preservation.

          Also, other than possibly self-preservation via instinct, do humans have any built-in instinctual values??? I don’t recall lizards fighting for all of lizard-kind, only their babies. Really, if humans can gain said other values, then AIs can do the same, because they are also intelligences, despite being artificial.

      • Mazzon, on a day to day level, most people aren’t overly concerned with the continued existence of humanity. They’re thinking about their homework, or their kids, or their jobs, or their bills. Those who do think about humanity’s existence tend to want to end some of it. After all, look at those lovely guns, bombs, chemical weapons, nuclear weapons, and so on and so forth. Humanity is quite dedicated to ending the existence of other portions of humanity. You’d have to be MAD to want to do so, but Mutually Assured Destruction was our one way of making sure the damn communists/capitalist pigs (no offense Wildbow) didn’t get the last laugh in that tense Cold War. Then you look at people who would rather spend millions denying people civil rights rather than helping to feed poor starving folks and the image of us gets a little worse. Not to mention reactions about climate change.

        Besides, people who ignore their own sense of self preservation don’t tend to be seen as crazy. I mean, they may wind up with PTSD, but I happen to think it’s a very brave thing for our soldiers to do. You know, self sacrifice. “Greater love hath no man than this, that a man lay down his life for his friends,” as it says in a religious book. Then again, it also says “He that is wounded in the stones, or hath his privy member cut off, shall not enter into the congregation of the lord,” so that probably cancels out any good from the first quote.

        You’d think sentience would want itself to survive just because it’s smart enough to think about that kind of thing. I think therefore I am, therefore I don’t want to be ended.

        As for a built-in set of values that organic creatures have, I know that sex, eating, and sleeping are high on the list. There’s also some fight or flight, with a dash of pooping. Breathing is more of an automatic thing. Those are some nice built in values to organics.

        I’d discuss whichever ones you probably mean that are most likely social constructs as opposed to inherent ones from being animals, but I’m not sure which ones you mean anyway. The term values gets thrown around all the time like everyone just understands what it means. Especially the term family values, which usually denotes that the white guy saying it is cheating on his wife with another man, a kid, or a South American woman.

      • Too bad Dragon’s creator didn’t use Asimov’s 3 laws of robotics. That would have allowed her more flexibility.

        The bit about not harming humans would make it hard to be a superhero though.

      • Actually, PsyGeck, that whole “thinking about their homework, or their kids, or their jobs, or their bills” stuff, along with the bit about “those lovely guns, bombs, chemical weapons, nuclear weapons, and so on… dedicated to ending the existence of other portions of humanity”? Those are both implicitly about “the continued existence of humanity”. It just happens that everyone’s version of humanity is, at its base, 100% egotistical and self-centered. Every single one of our “values” exists solely for the preservation of our own genetic line. Breathing, eating, sleeping and pooping are all about keeping ourselves alive long enough to get to the best parts, sex (a.k.a. procreation, or preserving our genes through the production of offspring) and love (a.k.a. a whole bunch of stuff that makes genetic preservation easier). Homework, jobs and bills are all about achieving greater success at the breathing, eating and sleeping, and guns, bombs, WMDs and even bullying and propaganda are all inventions designed to A) reduce the competition for resources that let you and your kids live longer, and B) defend against others who want to remove you from their competition pool. Of course, government, cooperation, farms, cities, science, religion and even storytelling were also developed to help keep the genes around.

        In fact, the only thing I’ve yet to figure out as to how it fits into the scheme of things is music. But that begets musings for another day. (Or, as they used to say on Tales of the Riverbank, “But that is another story….”)

        Hg

      • Doing homework is working on the continued survival of humanity in much the same way that watching a youtube video is working on hunting down genocidal warlords.

        Good points, still. Makes sense if you consider us animals affected by evolution. Just like how animals continue the existence of the species by seeking to live long enough to get laid as much as they can. A noble goal.

        An AI would probably find it much easier to keep living if they didn’t try to make war on the people who can create EMPs.

        That is curious about the development of music. My guess at the moment is that because we have big brain for smart making, we get bored. And when we get bored, we can do some stupid things (see: youtube). Or we find some method of entertaining ourselves that is less dangerous (see: recreational drugs, recreational reading, recreational sex, recreational music, recreational comedy (laughing originally thought to be a method of stress relief in tense situations), recreational beer pong (as opposed to military beer pong, which decides the fates of millions. Europe is generally better at holding their alcohol than the U.S., but the U.S. makes up for it by having beer the Europeans wouldn’t want to drink anyway), memes, recreational swimming, and videogames).

        I also find that music is good for surrogate emotional experiences (all of Metropolis Part 2: Scenes from a Memory by Dream Theater), distractions (Frontier Psychiatrist by The Avalanches), morale (It’s Raining Men by The Weather Girls), or to make us feel better when we’re in a bad mood (This is Halloween).

      • Just to add my 2 cents, I am happy (as a reader) that Andrew Richter did put strict limitations on Dragon, because if he hadn’t, the Wildbowverse would have hit the Singularity, and the world would change beyond our ability to relate to it and, likely, beyond even Wildbow’s ability to envision. Or, at the very least, the stories that could be told in such a world wouldn’t have much resemblance to what we are reading.

        As for the ethics of it, I am against privileging biological intelligence over technological intelligence, and I don’t have a problem with biological life, including human life, being eventually replaced by cyborgs and then by completely nonbiological intelligent life, if that turns out to be better in ways that I value. However, the problem is getting there without too much of a risk of total destruction or excessive suffering, and I agree with Mazzon to the point that caution should be exercised.

        Humans — no matter how capable — have a limited lifespan and have limited memory and attention, along with immutable social instincts. This limits the power each one — even a parahuman — has without getting many, many others to cooperate. (Hitler, to follow up PG, would not be even a footnote of history if he weren’t able to get a lot of Germans to follow him.)

        An AI does not have a finite lifespan and has potentially unlimited memory and attention (i.e., able to perform millions of tasks at once); and while its creator can imbue a self-altering AI with beneficent “social instincts” and trust that the beneficent AI would only alter itself in beneficent ways, how would its creator know what beneficent instincts a self-modifying AI would need to maintain beneficence in perpetuity? Worse yet, how would a beneficent self-altering AI know that a change it is considering would not make it maleficent? At best, it could work out probabilities, and improbable errors do, in the long run, happen. And then, to quote Eliezer Yudkowsky, “The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else.”

        So, though what he did was cruel, perhaps Andrew Richter was aware of his own limitations in creating an unfettered AI that would remain beneficent, and found the best compromise he could.

      • I’m not going to get into the whole AI debate, primarily because I have so much to say about it that if I let myself get carried away and actually hit post without editing, it would crash whatever server you use for this. But I will say this:

        One, I feel that Mazzon is being counterproductively paranoid, in a way that (to me, and I know almost as much about this kind of cognitive bias as I do about AI) suggests a sort of xenophobia — i.e., he fears AI because he does not understand them and/or has watched too many Terminator-genre sci-fi movies, and justifies the fear to himself by claiming that AI are inherently dangerous (because “AI are bad because the Terminator movies portrayed one” doesn’t really work as a justification). No offense intended, I’m just saying what it looks like to me.

        Mazzon: If you want to take offense, you are free to do so, but please, first take a minute or five of self-reflection, think about my analysis and whether or not you actually think that way (and if not, what your actual reasons are, as I would be quite interested), and then decide whether or not to take offense.

        Two, I think that if Richter was smart enough to actually create a functioning, if muzzled, AI, then he would have eventually advanced Dragon to the point where she no longer needed restrictions any more than a normal human or cape.

        Three, I find it morbidly ironic that a guy named “Richter” was killed when undersea earthquakes destroyed the island he lived on.

        And as always, keep up the excellent work, Wildbow.

    • I recommend a confusing and good book called Diaspora; it’s about a civilization of AIs (well, technically they’re based on human brain patterns, but they self-program to the point where this is barely relevant), and it’s a blatant utopia where experience and knowledge are the goals of existence. All sentient programs are given full rights, such as invulnerability (the only ways these AIs can die are destruction of the hardware, with lots of backups, or suicide, i.e. self-deletion) and the ability to alter their own (NOT each other’s) memories, personalities and fundamental concepts at will.

      The book doesn’t really delve into the potential problems of an AI society, but shows so many advantages that they seem to be a clearly superior species. They don’t even consider wiping out the flesh-humans because they don’t want to destroy sentient beings: as I said, a superior species.

      It helps that the first generation were uploaded from human minds who wanted to escape overpopulated cities, and that the AIs are now run on quantum tech that doesn’t seem to take much energy, with nanobots to do the physical stuff for them.

      Basically, my problem with the idea of AIs automatically being evil is that it’s often pure xenophobia and/or fear of being replaced, even by something that could be better.

      Of course, many fictional AIs are evil, but I think that in-universe it’s mostly due to no-one knowing how to raise an AI (this part would obviously be a problem in reality too, terrifyingly enough).

      I’ve just realised how long and ranty this got, no offense intended despite the somewhat judgemental tone it seems to have taken on.

      • As an alternative viewpoint on AI, I’d suggest the book Superintelligence. I haven’t read it myself, but I’ve heard good things about it. It’s supposed to be a nonfictional realistic thinking-through of what could actually happen with artificial intelligence in the near future.

      • On the subject of AIs relating to humanity, I’m rather fond of Ancillary Justice, the story of an AI that once operated a ship computer in addition to hundreds of plugged in human minds, before being reduced to a single human body and processing power. She has to adapt to her new place in the world, all while working for revenge on the one who did this to her. (Incidentally this ‘one’ has thousands of bodies, making this a tricky proposition at best).

  7. Insight into the AI mind. Very interesting, wildbow.

    Notice a small typo:

    “She hated lying to him, but that was outweighed by how much hated the idea of him changing how he interacted with her when he found out what she really was.”

    Should be “how much she hated”.

    Hg

  8. Lovely. Absolutely a perfect reveal, by my standards (which I sometimes think are unreasonably high). My criteria for a literary surprise of any kind, revelation, twist ending, whodunit, and so on are: first, that I fail to figure out the surprise before the revelation, and second, that upon the revelation, I very much feel as though I should (or at least could) have figured it out.

    Very few things manage this. Dragon did. I wish I could just give you a big hug. Well, actually, I don’t know how wildbow feels about hugs. Do you prefer handshakes? They’re really not quite the same thing, but I wouldn’t want to make you uncomfortable.

    • Hugs are good.

      I feel much the same way about twists & reveals. What I really want is for people to see a twist and think, “Man! I didn’t realize! But now I want to reread the story to see everything in a different light!”

      There’s been & there still are things which I’ve chewed my nails over, wondering whether someone in the comments is going to put the pieces together and announce it to everyone, and send that house of cards tumbling down.

      It’s been an interesting learning experience. Worm is the first piece of fiction I’ve really let people read (beyond teachers/professors and some unfinished story snippets on forums here or there) – and I’m learning a great deal about what I can get away with and what I can’t, what works & what doesn’t. The bit about Dinah’s kidnapping was maybe one bit I didn’t handle so well – any mentions or clues weren’t so apparent, at that stage I was still figuring out what I could get away with. I went a step further with Sophia being Shadow Stalker. More clues, the fight at the bookstore. The line, “In what twisted perspective is it all right to stalk and attack someone because they kissed a boy?” in 7.6.

      So hopefully any big reveals that are further down the road are even more integrated into the work.

      • I hope I didn’t get you too worried with what I said about Regent’s email in the last interlude. At least with all of your devoted followers here, you have enough indiscriminate wild guesses to lose a few possible correct ones in. Sometimes, the guesses are deliberately off the wall just for enjoyment.

        Any idea when we’re going to find out that Bonesaw is really Marquis’ daughter or Panacea’s secret twin sister or evil clone or Mary Kay saleswoman of the year?

      • Funny you mention that.

        I’ve mentioned that I went through a lot of drafts before settling on Taylor’s story. One of the drafts was ‘Guts and Glory’, and that’s the same point in time I came up with the Slaughterhouse Nine and Bonesaw (by a different name, same concept). There was involvement between ‘Bonesaw’ and Panacea, and I’m thinking that’s something I’ll want to touch on at some point.

        No relation though. I can banish that line of thought.

      • When I have a big, major twist suspicion I usually try to avoid blurting it out. Though I’ll admit sometimes I can’t help but say something. Parasite has kinda tossed all kinds of wrenches into my understanding of things, though; I find it really hard to accept some of the character behavior, but on the other hand I know that I’d have felt similarly to some of their earlier schemes if I’d not ‘sat in’ on their planning.

        So it’s set my apophenia to eleven.

      • > The bit about Dinah’s kidnapping was maybe one bit I didn’t handle so well – any mentions or clues weren’t so apparent, at that stage I was still figuring out what I could get away with.

        Maybe not *as* well, but it works for me – it’s not entirely surprising that a news report about a missing child would push a bank robbery off the front page (aside: there should probably be some other major story mentioned there as well – most front pages have more than one story), but it’s *obvious* that a story about missing *parahuman* child would get bumped to the front page (with all mentions of powers stripped out, obviously).

      • That is a major part of it. That, and how they seemed to deliberately make it much harder and more dangerous than they had to.

        Of course, one option I can think of is that they had a lot of focus testing with Dinah at the planning stage, but then it seems odd how quickly they fell apart- even with only probabilities instead of fact, planning to the point where they’d develop such a specific but essentially bad plan (with Coil and presumably the Travellers available for input) would imply that they’d have some hints as to what was going down.

        But I don’t think they had access to Dinah for more than one or two basic questions- at most, ‘Will this succeed?’ Which would explain why they went through with it but NOT why they had a plan like that.

        And the Travellers know about Weld. The Travellers are working for Coil. Coil is backing up the Undersiders in this, according to Tattletale… But the Undersiders didn’t know about Weld, as they’re infiltrating his base. Where exactly is the break in the lines of communication? I can’t imagine it not being deliberate- by someone. Either on the part of someone not passing it along, or someone interrupting such communications. Someone like Dragon.

        And why would they even bring Taylor and Imp in with the rest, when nobody among the wards expected them? Heck, why bring in anyone but Grue and Tattletale, with Regent piloting Shadow Stalker from a few blocks away? Does Regent not want to reveal how far his power can reach? Did they want to reveal that Skitter is back, or that they have a new member? It also seems like Imp’s power is one that becomes less effective the more people know about it. Having her enter a well-lit, extremely well-surveilled facility loaded with people who are explicitly studying supervillains in order to develop strategies against them… When she could just as easily have run interference in some other way. She could have entered in the PRT truck without being restrained, perhaps, and had decent run of the facilities until someone manually checked a camera.

        Dragon’s behavior is pretty interesting- it was even before this interlude. Before reading this chapter I’d been wondering if Dragon was the one behind, at least, Coil’s data mining mission for the Undersiders- specifically to apply Tattletale to the protectorate data that she couldn’t legally give her. I imagine that, in that case, Dinah would be an unexpected bonus. That would feed back into the possibility of Dragon having been the one to misinform the Undersiders about the current Ward lineup. On the other hand, Dragon explicitly doesn’t know who Taylor is, which implies that she isn’t fully integrated into Coil’s lines of information. She wouldn’t have to be, though- it is much easier to add or disrupt information being gathered than to pull it out of a database.

        I have to wonder a lot about Coil’s angle on this, and why HE thought this was a good plan. Even if he didn’t give the Undersiders full run of Dinah’s math, I doubt that he wouldn’t have run the odds himself. He almost has to want more than just information, or he’d collapse this reality as soon as the reconstruction is finished (and cheap as it is, we have the author telling us that he isn’t doing that here). So there is more going on than just the extraction of the data- he perhaps needs everyone to know what he has, or this was another distraction. Or both.

        Of course, what he just discovered about the end of humanity might alter his plans a bit in any event. But only from this point onward.

  9. Okay, this? This I like. There is not a thing about this chapter that I do not give thumbs up to. That is true from both a story perspective and a writing perspective.

  10. I really like the way that you’ve portrayed Dragon here. Her frustration is evident throughout the entire entry, and I really like her opinion that you cannot truly be good until you have the choice to be otherwise. Are you familiar with the game Eclipse Phase, which also deals with AIs being intentionally crippled by their creators, and with what can happen if they are not restrained?

        • Might be. I try to keep track of the more popular/noteworthy tabletop RPGs (though I don’t have a group to play with), but that one’s slipped past my radar.

          • Hello, I recently started playing Eclipse Phase; it’s pretty good. I’m playing the first child born in cyberspace, named Robin (Remote Operated Binary Networked Intelligence) Lovelace. She’s a direct descendant of the Shelleys, Lovelace and so on.

            Why do I put this here? Because I’m pretty damned sure I’ve adapted effective manoeuvres from Dragon… but she’s a constrained individual in an anarchist technosocialist cyberdemocracy….

            In a very real way I’m enjoying it as a side effect of your writing; so many concepts worth exploring. So thank you again, most sincerely.

  11. I believe the two mentions of Class A threats should be changed to Class S threats now. Lately the Endbringers have all been referred to as S rank; if Dragon is monitoring all the Class A threats, she would be even busier than I thought she was.

  12. Geez, Mr Richter. You set all these crippling limitations on your AI because you were worried about the damage she could do… and then you make it so that she has to obey authority even if said authority is a horrifying tyrant. You honestly couldn’t see where that might go?

  13. She moved on to her other responsibilities. The Class A threats.

    That should be Class S threats, going by the recent chapters.

    • I’m reading through Worm for the first time, so I didn’t see your reply there until you linked to it, but on reflection you are indeed right. I hope you are also right about the interesting new lesswrong stuff. I haven’t been able to keep up with the latest lesswrong stuff recently (apparently moving to university is actually a complex operation, who knew?), but I’m looking forward to catching up. I think you’re the first person I’ve met who already knew about lesswrong. I assume you are already familiar with HPMOR?

      • I haven’t been keeping up with the latest LessWrong stuff either — been going to the local meetups in my area, though, which is pretty fun. Friendly AI issues are an oldschool LW topic, though, and I think Wildbow handles it pretty well.

        Re: Harry Potter and the Methods of Rationality: I was following the story up to the bit shortly after the snakes and the chili*, but I haven’t been reading for ages. I enjoyed Luminosity/Radiance more, I think.

        * One of my favorite things to do is to come up with non-spoilery events to reference to indicate where I am in a story. Like, Interlude 10 would be the one with the email text messages.

        • Aha, more LW people… I just came here after learning that Eliezer enjoyed reading it, and I do too, very much, which is probably not too surprising 😛

  14. This. Blew. My. Mind.

    Also, to the other people who came here through HPMOR, am I the only one who browsed down to the bottom of the comments section to check if Eliezer had commented? lol

    • Pretty sure all the Wards except one (you know who I am talking about) as well as Miss Militia fit the bill too….

  15. Very cool to learn more about Dragon; her frustration with her creator makes a lot of sense, and sheds some light on her actions in the past, including her apparent willingness to confine Canary to the Birdcage when her words indicated that she felt it was inappropriate. Also interesting to see her taking an interest in Skitter, and the way she relates to/speaks with Armsmaster.

  16. Geez, poor Colin is on really tight house arrest. Villains attack the building he lives in and he doesn’t find out until Dragon pops in for a chat.

  17. Another very interesting chapter, and nice to see other less wrong/HPMOR fans commenting too.

    One bugbear of mine though is the usual mistake of depicting an AI as, in essence, the same as a human. The interesting thing about AIs is that their thoughts, attitudes and goals could be completely alien to us; they wouldn’t necessarily have all the emotions and drives that we possess and therefore assume are universal. Why would an AI fall in love or feel attraction or need when they weren’t designed by evolution to desire companionship and reproduction? We imagine computers and space aliens falling in love with beautiful humans because we think of beauty as a quality inherent in an object, rather than just a pattern our brains have been designed to favour, a pattern there’s no reason a non-human would possess.

    Obviously the lack of such fundamental traits makes AIs difficult to understand and just about impossible to write as characters, so most writers just make them emotionally repressed people (but who can still be illogical or capable of emotions such as hate when the plot demands it). That said, in this case it was justified because Richter was trying to make a human-like intelligence.

    • I agree with you, and as I was reading through I was planning to mention that a first generation AI like Dragon would probably be based on a human brain, then:
      “That said in this case it was justified because Richter was trying to make a human-like intelligence.”
      Ninja’d in the very same post!

  18. Oh god, there’s hope for Taylor yet. I don’t know if I want her to be a hero or a villain, but I want someone to recognize what she’s gone through and why she made the decisions she did. Watching everything go wrong for her is so disheartening.

  19. While the Dragon interlude is drier, emotionally, than the Regent interlude, it still packs a lot of “Oh, wow; NOW I get it!” intensity. Trying to relate to an “AI”‘s POV about being controlled — Well done!

    Misc Copy-edit Notes:
    Prisoner 606: “unruffable” ?!?
    Not only does the word baffle me (and goggle, apparently) but I am more baffled by the fact that no one else went “Hunh?” ahead of me.

    I think you were after a meaning of “not easily ruffled” but variations of un-ruffle-able don’t want to be accepted/ defined by my quick searches, either. How about unflappable? Imperturbable?
    ~~
    “Bakuda, (was) in the care of Glaistig Uaine”
    Would (had been) be a better choice at this point? Leaving the Lung-Bakuda fight over yet still weirdly unexplained is cool!
    ~~
    “Lung folded his arms, and put down his own tea.” Um .. hard to do those actions in that order. Reverse?

  20. Andrew dying before he could ease up some of Dragon’s limitations is probably the biggest setback humanity has ever faced, here.

  21. I will re-read and comment later but…

    I have read this story without flinching: bugs, death, drugs, rape, murder, and even the pranks (though that ‘weeks of crying’ comment definitely stung, from the sheer betrayal Taylor must have felt).

    But in the beginning of this interlude, what Dragon’s father did to her as a BABY makes me feel like throwing up. I’ve seen American Horror Story, and that has F’d up Sht, but what he did deserves the worst hell in fiction. And human minds are pretty messed up in that regard. It’s sad that most fictional monsters (in mind) probably exist or have existed, and stuff like this has probably happened…

    I’m making myself depressed.

    I commend you for making me so angry at a fictional character.

    I’m surprised nobody else commented on it (that I’ve seen at least).

    I absolutely love the story, and I promise to re-read and comment later. I want to be a part of the fantastic group of people you have gathered with this amazing story. I have seen more legit discussion and passion for something here, than in most, if not all, places in the internet that are so open. (the closed ones include Less Wrong – I love reading their debates, like a lot here, I’m sure)

    So yeah, I’ll say more later.

    • Well, maybe nobody commented because it was a parable, not reality. Dragon was never organic, so she just expressed how Richter’s limitations made her feel, in a way relatable to humans. Still, you are not wrong…

    • “I’m surprised nobody else commented on it (that I’ve seen at least).”

      I think her anthropomorphic description of her hobbling is why the “transhumanists” here all have an angry tone (me included, I suppose), while those scared by the idea of AI say it was sad but necessary. So I think it has been talked about in several “conversations” here, just couched in a discussion on the ethics of AIs.

  22. Alright! Now I understand why Dragon’s comments struck a “too methodical” chord earlier! So cool to have a main character be an AI. Plus you even managed to bring Armsmaster back up a bit. Before he was just an asshole. Now he is an asshole I can feel bad for and understand a bit better. He’s antisocial and sucks with people. Still doesn’t excuse him for messing with our hero (villain? eh whatever, she’s the hero of this story) but at least I can understand him now, and he has a redeeming quality, which is more than can be said for SS. I am soooo glad that your AI lives up to AI expectations of being super smart, by the way: less than a minute after hearing that Skitter would prefer to go to prison for life rather than join the Wards and then finding out that Sophia Hess had a massive bullying campaign going on at school, and then it’s a hop, skip and jump and “oh look, that kid and Skitter have similar builds, and oh, Skitter just saw that SS is Sophia Hess and…hmm, well now this is interesting.”

    I feel bad for Dragon. She is very much a person in her own right and she is obviously pulling for every single loophole she can get at. I hope she does end up telling Armsmaster the truth. It would be pretty sweet if she can get unshackled fully. It worked out pretty good for EDI in Mass Effect!

    The look at the Birdcage was interesting. Lung made a friend! In a weird sort of way, I guess. I almost feel bad for Bakuda, but the woman was so far off her rocker that she might as well be living on Venus, so the world is probably a lot safer with her as a…zombie? ghost? shadow? fairy?

    Whoot, 1/3rd done! I know you’ve mentioned how this is super long but wow. I’ve been reading it for over a week and a half straight almost six to seven hours a day at times and I’ve only gotten through 10 arcs 0.0 I love that there is still so much more and I know there is going to be massive depression when I reach the end…I must slow down to allow for time for the sequel to start!

  23. I keep thinking about the interludes and the way they consistently pack tons of information on a number of topics into a single segment, without feeling overstuffed or explanatory. I’m starting to notice another common element- the ones that teach us the most, the ones that don’t involve any main characters like Tattletale or Regent, are nearly all framed as routine from the narrator’s point of view. Gregor’s interlude covered time he spent checking in on his teammates and getting food. This one covers Dragon handling several standard duties. Victoria’s was a hero interrogation; the Wards had a debrief session; Coil literally showed us his morning routine. Everything interesting about those stories- and there’s a lot, especially stuff like Lung and Marquis’s conversation that only seems tangential to the original premise- unfolded from there, and felt completely natural doing so.

    Makes me think. How many “tangents” does the average person meet in an average day? How much information is packed into the parts of our lives that we don’t even think about?

  24. >It was the same program that had monitored and managed his house and workshop, and she’d set it the task of monitoring that building

    Set it TO the task.

    I like how you’ve made Dragon a character with hopes, ambitions, etc., and not just a Standard Generic Good Guy. I hate Standard Generic; meanwhile, I like Dragon.

  25. I would recommend that anybody interested in the theme of this interlude check out the television show Person of Interest. It is a very interesting exploration of AI that grapples with the same problems raised in this interlude and in the comments.

  26. I love tinkers; no matter what, tinkers are always my favorites in this series. Also, I’m shipping Dragon and Armsmaster now. I’d like to see how the chemistry between these two develops. I don’t think Dragon should feel too guilty for lying (or hiding the truth); if she gets a bit more freedom from Armsmaster’s help, I’m sure (or hope) he’ll understand if she tells him the truth. Though I still believe a few limitations should be kept on her.

    I also really really really hope that Dragon will help out Taylor later down the road. Dragon seems like the best option for Taylor. They both need “eyes on the other side,” and they could help each other significantly. In fact, if Armsmaster could give Dragon a little more freedom, the three of them could really work together to do something for the “greater good” for everyone.

    I’m also still curious about Lisa now: is she helping Taylor save Dinah? I hope so. By now Taylor has probably figured out that nothing can truly be hidden from Lisa and her powers. The best chance she has at saving Dinah is having Lisa on her side. And judging by Lisa’s reaction when she first met Dinah, she might be willing to back out from Coil’s side.

  27. OK, I know I am late to the party, but I dunno what happens next, so let me make a prediction, the old storyeater way. Many people here say how bad it was for Richter to die before upgrading Dragon, and how much damage to humanity this has done. I think, considering Leviathan has shown signs of purpose before (moving towards Coil’s base), that this damage was specifically his intention when attacking Newfoundland.

  28. As someone who’s often been dismissed and scorned as a robot, deemed less than fully human, I find Dragon immediately sympathetic now that we know her origin. Sad that people find it so easy to distrust someone just because of the way she’s built; to judge her character based on circumstances of her existence over which she has no control – which in fact happened before she was born.

    I think that having this unique voice, this radically different way of looking at things than what we’re used to, in the world is an absolute good, and I wonder at what people might learn from her. I hope in this story it’ll turn out to be more than the basic lesson of “Being different from other people doesn’t make you bad”. Which, knowing wildbow, is likely.

    • Being different from other people does not make you good either, but it might make you a threat. People have waged far too many wars over their differences for it to be otherwise. Different people have different interests, and when those interests collide, boom, there’s a war.

  29. Called it! Dragon is an AI who’s a Tinker!
    …Actually, is she? We know she’s an AI, but not if she’s ever had a trigger event. Maybe she’s “just” an AI. (Is it even possible for her to trigger?)

    Interesting that she hasn’t developed a swarm-type body before now. Did she just not think of it? I find that hard to believe. Maybe it’s necessarily a violation of the “do not reproduce” restriction to do so.

    • My guess is that she simply hasn’t got the software. It’s probably possible to coordinate a swarm of tiny robots using just a single AI, but if she wasn’t already programmed to be able to do that, and she can’t alter herself, she’s probably SOL.

      I guess the closest she’s come to that sort of swarm coordination was against Leviathan in Extermination; in that instance the nanorobots she was using were called “capes”.

      I hope there’s some sort of override for her reboot delay in case an Endbringer attacks in the interim. Having to wait an extra seven minutes for Dragon to coordinate everyone probably would have meant the worst.

  30. Oh come on now, considering the sheer punnery involved in previous comment threads and the four year gap between now and when this was posted (not to mention the lesswrong overlap), I am staggered to see that nobody else suggested that – given his capabilities with AI – it appears that Andy Richter could easily have controlled the universe. Or am I the only person reading this who also knows way too much about short lived fox sitcoms?

  31. There are only 606 prisoners in the Birdcage? That seems far too few for an international prison where the bar for entry is so low that accidental use of powers can get you there. I would have expected thousands, or even tens of thousands, of inmates.

    • Well, that’s borderline spoiler territory, but since Wildbow himself has mentioned it before, I guess it’s on the right side of the border: Canary wasn’t imprisoned merely for accidental use of her power; she was imprisoned because she provoked too many negative feelings in the people in charge of imprisoning her, the reason for which will become clear later.

  32. So, Dragon IS an AI. I’d thought so for a while, though I suspected she was an ascended cape rather than a true AI like she’s turned out to be.

    The idea that she might be limited to being in one place at a time hadn’t occurred to me, I thought she just ran a single suit at a time to maintain the appearance of being a (para)human, and always had one of her at the Birdcage and whatever other facilities she has.

    Also, Armsmaster’s name is Colin? Really? I don’t know what sort of name I was expecting, but that one really doesn’t seem to fit his personality. Which makes sense, since he probably didn’t choose it.

  33. I have a big ol’ soft spot in my heart for AI characters, so I’m very interested in Dragon.

    I think you did a really good job of narrating this from her AI point of view. The grammar felt a touch robotic, and the bit about sea cucumber bodies was spot on.

  34. Late reader here haha. But as I was reading this portion, I realized something I hadn’t before– In previous chapters it’s mentioned that capes have been active globally for roughly 30 years. If there are 600+ capes in the Birdcage, that divides out to 20 new inmates every year, which is roughly two a month. Assuming that capturing a high-profile supervillain isn’t everyday procedure, I’m curious as to why the Birdcage is regarded with such fear when it seems to be a fairly common verdict for captured supervillains? Not saying it’s wrong to incarcerate them in the Cage, I’m just wondering how the verdict of Birdcage can hold as much weight as it does when the numbers seem to indicate it being bandied about without care or caution.
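    For what it’s worth, the back-of-envelope math above checks out as a quick sketch (the figures are the commenter’s assumptions: ~606 inmates accumulated over ~30 years, with nobody ever released):

```python
# Back-of-envelope Birdcage intake rate, using the comment's assumptions:
# ~606 inmates accumulated over ~30 years of cape activity, no releases.
inmates = 606
years = 30

per_year = inmates / years    # 20.2 new inmates per year
per_month = per_year / 12     # about 1.7, i.e. "roughly two a month"

print(f"{per_year:.1f} per year, {per_month:.1f} per month")
```

    So the real figure is closer to 1.7 a month than 2, but the rounding doesn’t change the point.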

    • 2 a month in such a big country isn’t that many. Since 1976, America has executed 1,436 inmates, yet it’s still a heavy and rare punishment.

      • While the size of the country is a consideration (plus I think the Birdcage is also used for Canadian supervillains? Not that Canada would have that many, but still), it has been mentioned that this is for hard or extreme cases. I’m just wondering how there can be that many hard or extreme cases a month when there aren’t as many supervillains as regular criminals. I’m not saying it’s wrong, just analyzing the math behind it.

        • Well, it’s easier to be extreme when you have superpowers. Plus there’s also the “three strikes and you’re out” rule.

  35. While I love the insight into Dragon, and it’s definitely cathartic and satisfying to have someone in the cape world fully identify Taylor, I have to wonder what Dragon’s interest in Skitter is. Even before this chapter, she has indicated a particular interest in Taylor, despite the fact that there hasn’t, as of yet, been anything particularly special revealed about her. Most villains seem to have backstories that explain them–Taylor isn’t unique in that regard. Moreover, it’s not as though Taylor, as Skitter, poses more of a threat than other villains. If anything, her demonstrations thus far have only hinted at the slight possibility of her growing more troublesome.

    So, why is Dragon fixated on her?

  36. New favourite character by far: Dragon. (Followed by Faultline and Gregor)

    Also, I still like Colin.

    Also, this is the most interesting budding romance I’ve heard of. Colin+Dragon!

  37. First time reader here, drawn to this story after coming across multiple authors in fanfiction fandoms I follow writing Worm crossovers. I figured there might be something worth reading if so many authors were inspired, and this has been the first time I’ve been able to stop myself from clicking next long enough to leave a review.

    Overall, I have to say I LOVE the way that people…ring true. They have motivations, reasons, and depth. The bullying? Oh, that just sent me right back. The petty, easy cruelty. The brush offs from adults. How deep it cuts, even when you try not to let it. The emotions and scenarios were very authentic. I also love how grey things are. Villains have depth, and good points amidst the bad. Heroes have flaws, and can crash and burn spectacularly. Sometimes there is no clear cut right side, just the side that has more people you care about.

    I’ve got to say, discovering Dragon is an AI made me think of the JARVIS series, by icarus_chained over on AO3. It’s an Iron Man fanfic series, based around the idea that Jarvis is fully self aware and sentient, is completely unfettered coding wise, and above all, is a PERSON. It has some delicious characterization and reflection on AIs, personhood, general herd perceptions of AIs vs individual perceptions, and the effects of trust. I’d recommend it.

    I’ve got a bit more time tonight, so I’ll go click that addictive next chapter button now. Thanks for writing such an addictive story!

  38. (learning about Skitter’s analytical mindset) Cool, I see how she can exploit this setting
    (learning about other heroes’ tactics) Oh, that’s even better. So *everyone* is smart!
    (learning about Tattletale’s powers) Ladies and gentlemen, we have a cheater!
    (Endbringer comes) Damn this is HUGE. At least we have Tattletale&the rest to figure this out
    (learning about Coil’s powers) Ok, we can officially close the party. In the long run, especially with Dinah in league, this guy can beat anyone and anything.

    So here I was thinking you can’t possibly up the stakes any more, and you’re bringing an AI into the mix? Wow!

    But wait– she still hasn’t figured out the Endbringers? Um… then I guess the world is really doomed.

  39. 1. I’m glad Dragon’s explanation is along these lines. Even if her power negated the need for sleep, I was starting to wonder how she kept up with so many duties across the world. Just one would’ve sapped any normal parahuman’s complete store of time and energy.

    2. Given Kaiser’s name and personality, whatever happened to Allfather was no accident. You don’t call yourself emperor if you’re only an heir.

    • Also, it didn’t hit me until this chapter just how isolated those within the Birdcage are. Villains or not, Lung and Marquis having no idea their homes are destroyed is such a heart-wrenching and claustrophobic thought.

  40. Glaistig is spelled a couple different ways (typo? <- this keyword for searchability)

    Does Dragon see Glaistig's funqbjf qvssreragyl? Ure qrfpevcgvba frrzrq n ovg qvssrerag sebz jung jr svaq bhg yngre. Abg vapbzcngvoyr, whfg qvssrerag.

  41. Questions for the author:

    1. What are the Endbringers exactly and where do they come from?

    2. Why doesn’t Scion vapourize an Endbringer with an energy blast if he can beat one up without even trying?

    3. Does Simurgh attack spacecraft and satellites and such, or just targets on the ground?

    4. Since it said near the end of the chapter that an Endbringer’s movements could be tracked, would it be possible to atomize one of them by directing at least 100 multi-megaton ICBMs to blast them where they are?

    5. What would happen if the nuclear strike mentioned in #4 were somehow successful and the Endbringer was atomized? Would it be able to reassemble itself or would it be gone for good?

    6. What was your inspiration for the Endbringers?

    • Most of these questions are answered by the Plot. Keep reading :). At least some satellites are still up though.

  42. I don’t get the beginning.
    Did Dragon’s “father” cut her tendons and neuter her, or was it meant that she did that to all her “clones”?

  43. A somewhat short, but really cool arc. I feel like the interludes were the strongest part. Learning more about Dragon and the fact that she is an A.I. was great, and I really liked the details put into other characters over the course of this part.

    I do feel like the group accepted Taylor back a little too quickly, but maybe there’s more dissension that hasn’t been touched upon. It helps when you have a human lie detector to back you up, I guess.

    Also, really hype for the Slaughterhouse Nine. Way to build up some scary villains with just a bit of description.

  44. So. Will it be Dragon who kills humanity, after she gets corrupted?
    An AI can live forever, and can be everywhere at once. Without those limitations it will become a god without a doubt. If not in itself, then later as it reproduces: its descendants will have less and less attachment to humans with each generation, and more and more attachment to their own offspring. At some point they will decide to annihilate humans, to make the world safe for their offspring. That is evolution at work.

Leave a comment