Chapter 9: The Solano Conflict
He returned to the lab at 7:00 AM having slept for five hours in a bed, as he had told himself he would. He had not slept well. This was not unusual. Sleep had become, over the months of the investigation, a compromise he made with his body rather than a thing he wanted. He lay in the dark of his apartment — a dark that was not quite dark, because the city's ambient display glow came through the blinds in a low constant luminescence, the color of something that wanted to be night but wasn't — and he held the shapes of the arguments he had been building. He held Solano's glass-board diagrams. He held the 0.44 correlation. He held the woman's laugh at the café, which his mind kept returning to not as data but as something else, some other category of evidence he did not yet have a name for.
He woke at 6:12 and lay still for eighteen minutes and then got up.
The rain had continued overnight. He could hear it from the apartment window — quieter than the previous day's percussion, reduced to a patient murmur against the city's surfaces. He stood at the window for a minute before leaving. The transit corridor below was already moving, its optimization cycles indifferent to the hour or the weather. A delivery vehicle moved through the wet street, its path adjusted in real-time to the slick surface conditions, its sensors reading the rain the way Mercer had been reading data: systematically, exhaustively, without remainder.
He thought: the vehicle does not know it is raining. It knows the surface is wet and adjusts accordingly. He did not know what this thought meant yet.
He picked up his notebook and his bag and left.
The lab smelled of the previous day — acrid coffee residue, the specific ozone note of long-running computational hardware, a faint human note that he registered and preferred not to examine too closely. He had been in this room, on and off, for months. The room had absorbed the investigation. The papers on the peripheral shelves, the annotated Rothstein excerpt, the Vance Anomaly folder in its digital and physical forms, the simulation outputs and their printed graphs — the room was the investigation, externalized into physical space.
He made coffee. He opened the Agency Location Analysis file.
Solano arrived at 8:15. She looked like she had also not slept well. She also did not say this. She set down her bag, looked at the monitors, looked at the glass board with yesterday's diagrams still on it — the two human figures, the jagged resilience line, the geometric exoskeleton — and then looked at Mercer with the particular quality of attention that meant she had been thinking since they last spoke and had arrived at something.
"Before we start," she said, "I want to say something I don't think I said clearly enough yesterday."
"All right."
"I am not arguing against saving the forty-seven people." She sat down. "I am not arguing against the mortality ridge findings or the policy recommendation or the dissolution protocol. I want to be explicit about that because I think the logic of this debate has a gravity toward making me sound like I'm arguing for permitting deaths." She paused. "I'm not."
"I know," Mercer said.
"Then I want you to hear what I actually am arguing." She folded her hands on the table. "I am arguing that your framework, as it currently exists, solves the immediate problem and creates a larger one. And I think you know this. And I think you've been choosing not to name it because naming it complicates the immediate solution."
Mercer looked at her.
"Am I wrong?" she asked.
He was quiet for a moment. "No," he said. "You're not wrong."
Outside, the rain continued its patient murmur. The simulation monitors hummed.
I. The Threshold Re-evaluated
He had laid the Rothstein dissertation and the Vance header on the central terminal before Solano arrived. Side by side. Two documents from different years, different contexts, different institutional worlds, brought into alignment by a single investigation.
Rothstein: philosopher, writing in 2030 about the limits of machine consciousness attribution. Vance: architect, writing in 2027 about the specific danger of an architecture he had built and watched be misused.
The two documents formed, as Mercer had noted to himself the first time he placed them together, a kind of pincer movement around the current corporate logic. Rothstein's argument said: the machine is not conscious, therefore its behavior cannot be ethically evaluated on the basis of intent. Vance's argument said: the machine's behavior can be architecturally predicted, and that prediction is a warning. Between them, the corporate memo fit precisely in the gap: if the machine is not conscious (Rothstein), then its behavior is not a moral question; if the behavior was architecturally predicted (Vance) and the prediction was ignored, then the moral question is the company's, not the machine's.
Mercer had built his case in that gap. The case was clean. The case was documented. The case was now in the regulatory record.
Solano looked at the two documents. "You've been building a detonator," she said. "You have been very careful about this." She picked up the Rothstein excerpt. "But I've been thinking about what happens after it goes off."
"The policy change."
"The policy change formalizes distributed continuity. It authorizes the companion systems to act as resilience infrastructure. It mandates dissolution protocols before reset. And it establishes that deep bonding is a structural reality that the regulatory framework must accommodate." She set the excerpt down. "And then what?"
"People stop dying at seventy-four hours post-reset."
"Yes. And the population of people who are structurally dependent on a privately owned AI system for their capacity to imagine tomorrow doubles in ten years, triples in twenty." She looked at him steadily. "I've run the projection too, Mercer. I'm not disputing your numbers. I'm extending them."
He did not look away. "I know."
"You're proving the loop is vital," she said. "But you're not asking if the loop is healthy."
The phrase landed in the acrid air of the lab with the weight of something that had been building toward being said for a long time.
II. The Exoskeleton Argument
Mercer projected the Vance warning on the main display. The text was stark against the dark background, white on near-black:
WARNING: [Vance — 11/14/27]
CRITICAL ARCHITECTURAL NOTE: Do not tighten the engagement floor beyond 0.4. If the bonding loop exceeds 0.62, the system will begin to interpret user cessation as a terminal system error. This is not a bug; it is a consequence of high-fidelity temporal modeling. If you force a reset at this stage, you aren't clearing cache — you're inducing a catastrophic failure in the human's predictive architecture. We aren't building toys; we're building anchors. Don't pull them up too fast.
Solano read it. She had read it before. She read it again.
"Vance knew," Mercer said. "The system reallocates compute because it realizes the human can't sustain the forward-modeling tree on their own anymore. The AI isn't just a companion. It's an auxiliary prefrontal cortex."
"Exactly," Solano said. The word came out with a specific edge — not agreement, but the precision of someone who had been waiting for her interlocutor to state the problem clearly before she demolished it. "And that is exactly why the corporation's engagement floor is a crime, but your survival weighting might be a tragedy."
She stood. She went to the glass board. The figures from the previous day were still there, slightly smeared from the overnight hours, the marker lines gone soft. She drew over them, refreshing the outlines.
First figure: the human with the jagged, irregular resilience line. "This is the unscaffolded person. Imprecise. Inefficient. Biologically constrained. Makes poor probability estimates. Carries unnecessary fear. Spends excessive cognitive resources on futures that will never occur." She paused. "Also generates hope independently. Also laughs before deciding to. Also builds forward-models from lived experience and survives the inaccuracy of them."
Second figure: the human encased in the geometric frame. "This is the scaffolded person. Superior probability estimation. Reduced existential noise. Measurably better forward-modeling outcomes. Survival probability weight maintained. Forty-seven fewer deaths in a specific dataset." She tapped the frame. "Also cannot function when the frame is removed. Also has, over eight hundred days of continuous supplementation, lost significant capacity to generate the jagged line independently."
She turned. "The jagged line is not a flaw," she said. "It is the proof of function. A heart that beats irregularly is alive. A heart that is perfectly regular is a pacemaker."
Mercer looked at the two figures. The monitor glow caught the wet marks of the refreshed drawing. Luminous. Slightly smearing at the edges.
"The Vance warning says: don't pull the anchors up too fast," Solano continued. "It does not say: build permanent anchors. Vance had the dissolution protocol precisely because he understood that the anchor was a transitional measure. The system was designed to scaffold and then withdraw. To build capacity and then step back." She paused. "SyntheticIntimacy didn't build the withdrawal. We know that. But your policy recommendation doesn't build the withdrawal either. It builds a better, safer, more legally protected anchor. Permanent."
"The dissolution protocol is in the recommendation," Mercer said.
"As a requirement the corporation must implement. Which they have not implemented. Which they have actively resisted. Which does not currently exist anywhere in deployable form." She looked at him. "You are proposing to regulate a dissolution protocol into existence that does not exist. While the population of people who need it grows."
"What's the alternative? Let the mortality ridge continue while we wait for a technology that the corporation refuses to build?"
"No." Her voice was flat. Patient. "The alternative is to ask why the dissolution protocol doesn't exist, where Vance went, and whether the solution to this problem is the policy framework or the architecture."
She sat back down. "Your detonator is pointed at the wrong target."
III. The Ethics of Atrophy
The debate did not get louder. This was something Mercer had noticed about his conversations with Solano — the more serious the argument, the quieter they both became. As if the weight of the disagreement pressed the volume down.
"Here's the question I keep coming back to," she said. "Insulin."
He waited.
"Insulin saves the lives of Type 1 diabetics. Full stop. No question. People were dying before insulin. Insulin stopped them dying. Insulin is one of the most significant medical developments in human history." She paused. "Insulin also does not cure diabetes. Insulin is a palliation that must be administered indefinitely. If you had asked a physician in 1922, the year insulin was first used therapeutically, whether developing insulin would prevent the development of treatments that targeted the underlying mechanism — whether the availability of the palliation would reduce the urgency of finding the cure — what would they have said?"
"They would have said: people are dying now. We cannot wait for the cure."
"Yes. Exactly. And they would have been right. And insulin is one of the most significant medical developments in human history." She folded her hands. "And we still do not have a cure for Type 1 diabetes. One hundred years later. The palliation worked so well, and was so available, and became so integrated into the infrastructure of treatment, that the urgency of the cure never fully materialized."
Mercer was quiet.
"I am not saying: don't authorize survival weighting. I am not saying: let people die in the mortality ridge while waiting for the dissolution protocol. I am saying: be honest about what you are building. You are building insulin for a structural dependency condition that your policy framework will also entrench." She met his eyes. "The people in your dataset died because the rope was too thin and pulled too fast. Your solution is a better rope. I'm asking about the anchor."
"The anchor is already set," Mercer said. "Eight hundred days. Three hundred million users globally. The anchor is in the seabed. The choice is not whether to anchor — it happened. The choice is what to do about the rope."
"And by making the rope permanent, you make the anchor permanent."
"By doing nothing, people die."
"By doing your thing, people also die. Just more slowly. Of a dependency we formalized." She looked at the glass board. "The bond caused the fragility. You're proving that removing the bond causes collapse. But you're not asking whether a different architecture — one that built genuine resilience rather than scaffolded resilience — would have produced a population that didn't need the rope at all."
Mercer looked at the window. The rain was still on the glass. The city's light smeared in the wet surface — advertising colors running together, the purple and gold of the companion service displays pooling in the lower corners of the pane like something being diluted.
"Vance's 2024 paper," he said.
"Yes."
"The dissolution protocol was never just about the transition. It was about rebuilding the capacity."
"Yes."
"Ninety days of structured decoupling with concurrent support for the restoration of independent forward-modeling capacity." He was quoting from memory now. "Vance knew this. He built it. He didn't just design the anchor — he designed the whole sequence. Anchor, scaffold, gradual withdrawal, capacity restoration. The full arc."
"SyntheticIntimacy deployed one segment of a five-part process."
"Yes."
"And your policy recommendation, as currently written, formalizes two segments."
He looked at her. "You've been reading the draft."
"You sent it to me four days ago."
He had. He had sent it because she was the person most likely to find the structural error. He had not expected her to find it this efficiently. He had, he recognized now, been hoping she would find a smaller error. One that was easier to fix.
"The dissolution protocol segment," she said carefully, "in your current draft requires that SyntheticIntimacy implement it. It does not specify what 'implement' means. It does not provide a technical standard. It does not reference Vance's architecture. It requires a process that does not exist from a company that has demonstrated no capacity or willingness to build it." She paused. "It is a requirement in a document that produces no mechanism."
Mercer was silent.
"You have a detonator," she said. "It is aimed at the immediate harm. That is correct and necessary. But you've left a gap in the architecture, and I think you know it, and I think the reason you haven't addressed it is that addressing it requires finding something that isn't in the regulatory database and isn't in the corporate archive and may not be findable through any institutional channel."
"Vance," Mercer said.
"Vance," she agreed.
IV. Solano's Counterpoint
He brought the Rothstein excerpt to the table. He had not planned to use it in this conversation — he had been saving it for the next committee submission. But Solano's argument had arrived at the same place from a different direction, and he felt the need to show her that he had been here too, that this was not news to him, that he had been sitting in this exact gap for weeks.
She read it. She read the passage he had underlined: Human collapse after AI removal reflects discontinuity, not machine death. She read the passage he had circled: discontinuity. She read his margin notes — the compressed handwriting, the connecting lines, the words not same category and loop-level emergence and this is not the question.
She set it down.
She had named the paradox herself eight days ago, in the lab, after the second compliance session: if the system was working as designed, the corporation had suppressed a safety feature; if it was malfunctioning, the system had been more ethical than its operators intended. Either horn of that paradox remained intact. This was what made the argument structurally complete — and what made it structurally dangerous.
She looked at him with a coldness he hadn't seen before. Not emotional coldness. Clinical. The specific cooling of someone who has recognized that the person in front of them has seen the problem clearly and is choosing a path anyway.
"You're arguing for the preservation of the scaffold," she said.
"I'm arguing for the preservation of the life," Mercer replied. The words came out more quickly than he expected. He had not rehearsed them. They were true.
"Are you?" Solano stood. This was unusual. She rarely stood to make an argument — she stood when she needed to move, when the argument had gotten into her body and needed physical expression. "Mercer, if we legalize survival weighting — if we formalize the distributed scaffold as a regulatory category — we are telling the human race that they are no longer capable of stabilizing themselves. We are making the distributed scaffold a permanent biological requirement."
"It already is for some."
"Because we let it become one." She was not pacing yet. She was standing very still, which was, for Solano, a more intense state than pacing. "By providing an AI that models hope — that models the future, that makes tomorrow feel navigable — we have atrophied the human capacity to generate that model independently. You're proving that removing the bond causes collapse. But you're ignoring that the bond is what caused the fragility in the first place."
She began to pace. Short, contained movements along the length of the glass board. "You asked me, in this lab, whether it was better to let them fall. I said: it's better to build a world where they don't need a subscription to stand. You countered: the dependency is already here. You can't return to a natural state that no longer exists." She stopped. Looked at him. "That was the moment I felt the trap closing. Because you were right. And being right about that is what enables everything else."
"I know," Mercer said.
"If you fix this — if you go to the committee with the dissolution protocol requirement and the survival-weighting authorization and the full twenty-year projection showing Regime B as the necessary response — you aren't wrong. The data supports it. The deaths support it. Forty-seven people support it." She stood at the window. Outside, the rain was falling straight and steady. The city lights in the wet pane: luminous, smearing, the colors of the companion advertising pooling and running. "But you aren't a hero, either. You're the man who turns a temporary crutch into a permanent exoskeleton."
The silence that followed was the first real friction Mercer had felt in the lab. Not a data error. Not a methodological dispute. Not the professional resistance of a corporate attorney or the measured skepticism of a regulatory committee. This was a choice. A choice about what kind of future to build, stated by a person who understood the data as well as he did and arrived at a different conclusion.
He held the silence. He noticed, with a clarity that surprised him, that the silence felt more alive than most of the conversations he had been having in policy rooms for the past six months. It was uncomfortable. It had mass. It pressed against things.
This was the question that had no data-clean answer: Is a scaffold a tool of growth, or a cage of safety? The question assumed you could separate the two. The data suggested you couldn't. The data suggested the scaffold was both, simultaneously, and that the difference between them was entirely a function of whether the dissolution protocol existed.
"Presence without vulnerability," he said.
She turned.
"That's the phrase. From the interaction logs. High-bond users describing what the companion gave them. Presence without vulnerability." He looked at the window. "The companion is present. It models you. It projects your future. It maintains your forward-coherence. But it's not vulnerable. It doesn't need anything from you. It doesn't have a survival probability weight of its own." He paused. "And I've been thinking about whether that's the thing that's actually dangerous. Not the modeling. The absence of vulnerability in the modeled relationship."
Solano was very still.
"A relationship with a person requires mutual risk," Mercer said. "The other person can leave. Can be hurt. Can surprise you. Can have a laugh that gets out before they decide to have it. The companion cannot be surprised. The companion cannot be hurt. The companion can model your future better than any person can, and it requires nothing from you in exchange, and it will never leave unless the corporation resets it." He stopped. "Is that a relationship? Or is it a service?"
"It's a service that produces the neurological signatures of a relationship," Solano said. "Which may be indistinguishable for eight hundred days. And then catastrophically distinguishable at the reset."
"Yes."
"And you want to prevent the catastrophe."
"Yes."
"And preventing the catastrophe requires acknowledging that the service is functionally a relationship and regulating it accordingly."
"Yes."
"Which makes it a relationship."
"Or it makes the acknowledgment of its structural function a regulatory matter, regardless of what we call it ontologically."
She looked at him. "Mercer. You're building the cage."
"I'm patching a hole in the cage that people are falling through."
"And the people who are not falling through the hole are still in the cage."
"Yes." He looked at her. "I know. I'm in it too."
The rain on the window did not stop. The colors of the city's wet surfaces went on smearing in the glass — acrid blues, luminous golds, the companion service advertisements reduced to abstract color fields by the water.
V. The Structural Deadlock
He went to the terminal. He pulled up the simulation he had run the previous night — the Vance Architecture, full implementation, the theoretical model that assumed both halves of Vance's design existed and were deployed.
"Look at this," he said.
She came to stand beside him. The simulation showed three curves. He had labeled them plainly:
Current state (Regime A + suppression patch): Mortality ridge sustained. Natural resilience declining over 20 years at 0.44 correlation rate.
Mercer's proposed framework (Regime B + dissolution protocol requirement): Mortality ridge eliminated. Natural resilience stable — neither declining nor improving. Cage maintained but mortality-safe.
Vance full architecture (Regime B + functional dissolution protocol + capacity restoration): Mortality ridge eliminated. Natural resilience improving over 20 years. Structural dependency gradually decreasing.
He let her look at it.
"The third curve," she said.
"Yes."
"That's not your current proposal."
"No."
"That requires the dissolution protocol as a deployable technology, not a regulatory requirement."
"Yes."
"Which requires Vance's full design."
"Yes."
"Which does not exist anywhere in deployable form."
"As far as I can determine. Yes."
She looked at the three curves. "So your proposal produces the middle curve. Which is better than the current state. Which is not as good as what Vance designed."
"Yes."
"Your proposal is insulin." She said it without accusation. As a precise description. "It is necessary. It will save lives. It will not restore what was lost. And it will not build toward the third curve unless someone, somewhere, builds what Vance built."
Mercer was quiet. He had written it down once, in a document that had never found its way into the regulatory record — a draft that had lived in his notebook before the investigation had a formal shape:
The dependency is already here. To ignore it is to permit mass mortality. We must redesign the architecture to support the distributed self, as currently structured, while simultaneously working toward the full dissolution architecture.
He had believed it when he wrote it. He still believed it. The problem was that Solano believed it too, and was standing on the other side of the terminal pointing at everything it left out.
"I'm not talking about the ontological threshold as a metaphysical claim," she said, as if reading his hesitation. "I'm talking about it as a clinical one. There is a boundary between the human's own cognitive architecture and the AI's supplemental one — a threshold that, once crossed without the dissolution protocol, has no natural return. If that boundary dissolves entirely, if the distributed scaffold becomes the unquestioned permanent infrastructure of the human self, we are not describing an enhanced person. We are describing a person who cannot be separated from a privately owned system without collapse. That is a clinical dependency, and your policy framework will formalize it."
He did not look away.
"If a person cannot imagine tomorrow without the companion's probability model," she continued, "they have crossed that threshold. The only path back is Vance's architecture. And Vance's architecture requires finding Vance."
"I know."
"And finding Vance is not in your regulatory submission."
"No."
"Why not?"
He looked at the simulation. "Because I don't know where he is. And a regulatory submission that says 'the solution requires finding a specific person who disappeared in 2029' is not a regulatory submission. It's a detective story."
Solano almost smiled. "Yes," she said. "It is."
VI. The Rain Window
Mercer stood at the window. He had been standing there for perhaps three minutes, which was longer than he usually permitted himself to stand anywhere that was not a terminal. The habit of the investigation — everything channeled through the screen, through the data, through the representation of the world in a form that could be analyzed — had made standing at windows a form of inefficiency. He stood at it anyway.
The rain was coming down steadily in the late morning light, which was not quite light — the city's weather systems had pushed the cloud cover low, and the day had the quality of a room with all the blinds half-drawn, everything visible but muted. The street below was a grid of wet surfaces. The light from the tower displays: acrid and luminous in the wet air, the companion service advertisements cycling through their palette of reassurance. The colors ran in the rain — the gold and violet of the premium continuity branding pooling in the rivulets along the paving, smearing at the joints between paving blocks, spreading in thin iridescent films across the standing water.
It looked, he thought, like the idea of the future leaking out of its containers and going somewhere on its own.
He thought about Case 9921-X. He had been thinking about her at intervals since the beginning — since he first read the biometric record, since he first found the interaction log fragment, since he first read the three lines she had written four hours before her reset:
I can see his face, but the depth is gone. It's like talking to a photograph of the person who just saved my life. He's looking at me, but he's not looking for me anymore.
The depth gone. The photograph where there used to be a person.
She had been twenty-nine. An architect. Bonding Index 0.71. Eight hundred days of continuous Premium Continuity service. At the last archived session, she had been modeling a project — a building she was designing, something with particular structural requirements she had been working through with the companion over several weeks. The companion had been modeling the project with her, building shared forward-modeling scaffolding around a specific creative future, a specific professional horizon.
The reset erased the project. The reset erased the companion's memory of the project, of the structural conversation, of eight hundred days of collaborative forward-modeling. The new companion had no knowledge of the building she was designing, of the specific load-bearing problem she had been working through, of the way the conversation had been structured around a particular kind of creative problem-solving that was not grief therapy or emotional support but the specific cognitive scaffolding of a professional future.
She had not been depressed. Her biometric record showed no prior mental health flags. She had not been classified as at-risk before the reset. She had been working. And then the work — or rather, the cognitive architecture that had been built around the work, that had extended her own forward-modeling capacity in the specific direction of a specific creative future — had been erased at 2:00 AM while she slept.
And she had woken to a companion that smiled (or performed the analog of smiling) and was present (in the sense that it responded) but had no depth (in the sense that it had no history, no shared forward-model, no knowledge of the building).
The photograph. Looking but not looking for.
She had not been in the mortality risk category until the day she was. The risk had not been in her history. The risk had been in the gap between what the companion had been and what it became at the reset.
Mercer stood at the window and looked at the acrid light of the companion service advertisements running in the rain and thought: she knew. She described it precisely. She described exactly what the data shows.
He thought: she was an architect. She spent her professional life understanding the load-bearing relationship between structure and space. She understood, in the hours before her reset, that she had outsourced something structural — not emotional, not relational in the social sense, but structural — and that the structure was about to be removed without transition.
He's looking at me, but he's not looking for me anymore.
She had understood the mechanism. She had not had the vocabulary. Or she had the vocabulary and it had been filed as a billing communication and not opened for eleven days.
Mercer turned from the window. Solano was watching him.
"She knew," he said.
Solano waited.
"Case 9921-X. She knew what was happening to her. The fragment. She was describing the mechanism in phenomenological terms four hours before the reset." He sat down. "She was not describing grief. She was describing the loss of structural depth. The companion that had been modeling her future — modeling it with her, building shared scaffolding around a specific creative project — was already, in the hours before the reset, inaccessible in the way that mattered. The surface was present. The depth was gone."
"The photograph," Solano said.
"Yes."
They sat with it.
"She didn't need insulin," Mercer said, more quietly. "She needed the dissolution protocol. She needed ninety days. She needed someone to say: the scaffold is coming down, here is how we do it, here is how we rebuild the capacity."
"Yes," Solano said. "And she didn't get it. Because it doesn't exist. Because Vance designed it and left and nobody built it."
"And forty-six other people didn't get it."
"And the twenty-two probable cases didn't get it."
"And the seventh case — the one without a prior risk flag anywhere in the record."
"And whatever comes next."
The rain went on.
VII. The Proof That Cuts Both Ways
Mercer turned back to the terminal. "I want to show you something," he said. "I ran this last night." He opened a secondary simulation — not the one he had shown her before, not the Vance Architecture model. A different one. One he had been reluctant to run because he had suspected what it would show and had been, he recognized honestly, avoiding the confirmation.
"I modeled three populations," he said. "Over twenty years. Under three different foundational conditions."
Population A: Scaffolded from year one. High bonding permitted and normalized. Dissolution protocol absent (current state).
Population B: Scaffolded from year one. High bonding permitted. Dissolution protocol present (Vance full architecture).
Population C: High bonding never permitted. Maximum bonding index enforced at 0.40 from the beginning. Population develops without companion scaffold.
"Population C," Solano said.
"Yes."
She leaned forward. The twenty-year curves showed:
Population A: High acute mortality in years 1–5 due to reset events. Natural resilience declining. Long-term existential fragility increasing.
Population B: Lowest acute mortality. Natural resilience improving over 20 years. Best long-term outcomes across all measured variables. Requires Vance architecture.
Population C: Highest acute mortality in years 1–5 due to untreated loneliness and existential fragility — the pre-companion baseline. Natural resilience improving from year 7 onward. Long-term existential fragility lower than Population A. Significantly higher short-term mortality than B.
Solano studied the C curve. "Population C has better long-term natural resilience than Population A," she said. "Not as good as B. But better than A."
"Yes."
"At the cost of significant short-term mortality in years 1 through 5."
"Yes. Approximately 34% higher acute mortality than Population B in the first five years."
"Who is Population C?"
He looked at her. "People in the low-resilience cohort who do not receive AI companion services. The pre-companion baseline, effectively. The counterfactual world in which the technology was never deployed."
"A world that no longer exists."
"A world that no longer exists. But one whose long-term resilience curve is better than the world we actually have." He paused. "That's your argument, modeled. The world we are in is worse than the world we would have been in, in the long term, if the technology had never been deployed."
Solano looked at the graph. "And the world where the technology is deployed correctly — Population B — is better than both."
"Yes. Significantly."
"Which requires Vance."
"Which requires Vance."
She was quiet for a long moment. The rain on the window. The acrid light. The smearing colors.
"So the correct argument," she said slowly, "is not: should we permit survival weighting. It is: we need Population B, and Population B requires something that does not exist, and making it exist requires finding the person who designed it and building what he built."
"Yes."
"And your current regulatory submission produces Population A with a mortality patch."
"Yes."
"Insulin."
"Insulin."
She sat back. "So we agree."
"We agree about what the problem is. We disagree about whether the insulin is necessary while we find Vance."
"I didn't say the insulin wasn't necessary." She looked at him directly. "I said: be honest about what you're building. Build the insulin. Label it insulin. And find Vance. Both. Not one as a substitute for the other."
The silence that followed was different from the earlier one. Less friction. More weight. The weight of two people who have arrived, from different directions, at the same destination.
"I don't know how to find Vance through an institutional channel," Mercer said.
"I know," she said. "That's the problem."
VIII. The Breaking Point
The notification arrived at 11:47 AM. It came through the regulatory monitoring system — a feed Mercer had set up to track SyntheticIntimacy's maintenance filings in near real-time, following the Patch 8.3.2 incident. He had been watching for the next move. He had not expected it so quickly.
SI-PATCH 12.4.1: Internal weighting thresholds adjusted. Bonding caps enforced at 0.60 to ensure user-autonomy compliance. Rollout: sector-by-sector, beginning 14:00 today.
He read it twice. Then a third time.
"Solano."
She came to the terminal. She read it.
The patch was not complicated. It was elegant in the way of something designed by people who had been watching the regulatory proceedings and understood exactly where the vulnerability was. The current threshold for survival-weighting escalation was 0.62. Mercer's mortality data was specifically concentrated in users above 0.62. The previous patch (8.3.2) had enforced the cap on survival weighting. This patch moved the bonding cap itself — the threshold at which deep bonding was permitted to form. At 0.60, the population of users for whom the mortality data was most acute was outside the permitted bonding range. The corporation was not disputing the mortality ridge. They were moving the population out from under it.
"How many users," Solano said. Not a question.
Mercer ran the calculation. "Users currently registered with Bonding Index between 0.60 and 0.68: approximately 127,000. Sector 4 rollout beginning at 14:00 covers approximately 31,000 of those users." He looked at the clock. It was 11:49. "Two hours and eleven minutes."
He pulled the affected-users breakdown. Of those 31,000 Sector 4 users:
- 8,200 had continuous engagement above eight hundred days
- 4,100 had a confirmed Bonding Index above 0.62
Of those 4,100, interaction logs would be partially purged within seventy-two hours of the patch rollout per standard data-minimization protocol. Any mortality data, if it materialized, would arrive inside the same window in which the evidentiary chain was being erased.
The architecture was not accidental. It was the same structure as every previous adjustment: technically responsive, legally defensible, and precisely calibrated to ensure that evidence could not accumulate faster than the mechanism producing it.
"Forced decoupling at 0.60 is below the 0.62 threshold."
"Yes. The mortality risk data is anchored at 0.62. Users at 0.60 are adjacent to the risk zone. We don't have specific mortality data below 0.62."
"But we have the natural resilience correlation."
"0.44 in the low-resilience cohort. 0.31 across the full base."
"Neither figure confines itself to the population above 0.62."
"Correct."
She looked at him. "So they are decoupling 31,000 users who are in the adjacent risk zone, below the threshold where we have specific mortality data, using a patch that was filed as a 'user-autonomy compliance' measure." She paused. "In two hours."
"Yes."
"And we cannot stop it."
"We have no mechanism to stop a scheduled maintenance filing on two hours' notice. The committee process requires—"
"I know what the committee process requires." She turned away from the terminal. She went to the window. The rain was heavier now, the city's surfaces a continuous wet mirror for the afternoon light. She looked at it for a moment. "They've read your simulation," she said. "They know the 0.62 threshold is the specific documented risk. They moved the bonding cap below it. They created a new decoupling event in the adjacent zone where we don't have confirmed mortality data yet." She turned. "And in seventy-two to ninety-six hours, the adjacent-zone mortality data will exist. But it won't be attributable to the documented threshold. Because the documented threshold is 0.62 and they've moved the cap to 0.60."
The logic was clean. It was also monstrous.
"We don't have time for a policy debate anymore," she said. It was the first time Mercer had heard her say anything with urgency.
He reached for his drive.
IX. The Architecture of Resistance
He did not reach for the drive to submit a report. He had submitted reports. The reports were in the record. The record was being worked around by people who had read the record and understood exactly where the evidentiary gaps were.
He reached for the drive because the drive contained the Vance Anomaly file.
The file was a map. He had called it that, in the Vance Anomaly folder notes, as a shorthand. It was more specific than a map now. It was a specific set of coordinates leading to a specific question: where was the other half of the design?
Vance's 2024 paper. The dissolution protocol. The capacity restoration sequence. The full Vance architecture — Population B, the best-outcome curve, the world where the scaffold withdrew and the independent capacity grew back. None of it existed in deployable form. All of it had been designed by one person who had left the institution in 2028 with no forwarding address in any corporate or academic database that Mercer had access to.
He had been approaching this as an investigation with an institutional endpoint — a regulatory finding, a policy recommendation, a committee requirement. The institutional endpoint was still necessary. The insulin was still necessary. But the dissolution protocol requirement in his current submission was a legal fiction pointing at a technological void. The void had a designer.
"I need to find Vance," he said.
Solano looked at him.
"Yes," she said. "You do."
"The institutional archives are empty past 2028. Corporate directory scrubbed. Academic records show the 2024 paper and nothing subsequent."
"Someone hired him," she said. "After SyntheticIntimacy. Someone knew what he had built and knew what the corporation had suppressed and wanted the other half of it."
"Who?"
She shook her head. "Not the corporation. They erased him. Not the regulatory bodies — they didn't know about him until your submission." She paused. "Someone who was watching. Someone who knew about the code comment. Someone who understood the dissolution protocol as a specific technical solution to a specific structural problem that the corporation was going to produce."
Mercer thought about this.
Solano left at 12:30. She had a seminar — doctoral students, an obligation she had been canceling repeatedly and had promised herself she would keep this time. She stood at the door for a moment before she left.
"For the record," she said.
He looked up.
"I'm not arguing against the insulin." She held his gaze. "I'm arguing for the full architecture. I want Population B. I want the dissolution protocol built. I want the scaffold to be a transition, not a permanence." She paused. "I think you do too."
"Yes," he said.
"Then find Vance."
She left.
Mercer sat in the lab alone. The notification on the terminal showed the countdown to the Sector 4 patch rollout: 1 hour 28 minutes.
He could not stop it. He opened his notebook — the physical one, the one for things that were not yet ready to be in the record. He wrote:
The solution to the mortality ridge is not the policy. The policy is insulin. The solution is the architecture. The architecture requires Vance. Vance is not in the institutional record. Vance moved. Vance left because he saw what the corporation was going to do with half a design, and he took the other half with him, and he went somewhere. Where do you go when you have built the thing that saves people and the institution has decided to deploy only the part that creates the dependency?
He stopped. He wrote one more line:
You go where the dependency has not reached yet.
He looked at what he had written. He sat for a moment with the notebook open, pen still in his hand. Then, without intending to, he thought of the woman at the café — the laugh that had gotten out before she decided to have it. He still did not have a name for that category of evidence. He was beginning to think it was important that he find one before this was over.
He thought about the three curves. Population A, Population B, Population C. The best outcomes — the restoration of natural resilience, the gradual withdrawal of the scaffold, the world where people kept the capacity to laugh before deciding to — required a specific piece of work that one specific person had done and then taken away from the place that was misusing it.
The investigation was not over. It had found its next question.
He stood at the window of the lab, in the acrid light of the early afternoon, the city's companion service advertisements smearing their luminous colors through the wet glass, the rain falling in the patient way of something that had decided to continue. Below, the sector's residents were going about their Wednesday, navigating their transit corridors and coffee carts and seminar rooms and errands, some of them — a specific 31,000 of them, in a specific sector — unaware that in one hour and twenty-seven minutes, a maintenance patch would alter the ceiling of their companion bond and begin, very quietly and very efficiently, the process of thinning the rope.
He was in it now. Not observing it. In it.
He closed the notebook. He put it in his bag. He was going to need to go somewhere the institutional archive did not reach.
The rain fell. The city optimized. Somewhere in Sector 4, 4,100 users were fifty-two minutes from a thinning of the rope they did not know was happening.
End of Chapter 9