The $1 Tahoe

Trial Day Two

Chapter 12 of 14

She was at the plaintiff's table by 7:52 in the morning, earlier than anyone else and later than yesterday, because she had slept through the first alarm and taken her coffee standing over the sink instead of sitting down, and because the dread of a second trial day was different from the dread of a first. The first day you did not know what would happen. The second day you knew too well that things could happen, that the room had its own logic, that even arguments you had prepared for could arrive in shapes you hadn't expected. She sat at the table and opened her binder to the section marked WITNESSES — DAY 2 and reviewed it, and did not need to.

Marcus arrived at 8:08 with both handles of his briefcase and the expression of someone who had slept, which she found obscurely reassuring. Victoria arrived at 8:19 with Derek Walsh and the rolling cart. Dolores had catalogued this without meaning to, the same way she catalogued everything: one data point, two, three, and eventually a pattern. Judge Okafor entered at 8:47. The room rose.

Alan Novak had been a GM technical lead for eleven years, and he had testified in litigation before, and it showed. He took the witness stand with the ease of someone who knew that ease was the correct professional posture — not casual, not brittle, the body language of a person who had been cross-examined and had not broken and knew it. Derek Walsh handled his direct, which told Dolores something about how GM read the morning: Novak's testimony was technical support, not centerpiece. Walsh asked questions and Novak answered them with the quality of answers that had been thought through carefully and arrived at willingly.

The pricing error, Novak explained, was a configuration flag that had not been reverted after a testing environment was reset. The explanation was specific enough to be credible: a staging parameter, a deployment pipeline, a window during which CC-217's production instance had inherited test-environment pricing logic that no one had caught because the monitoring threshold for anomalous pricing was set above the transaction volume that would have flagged it in a normal cycle. He used diagrams. Walsh put a code excerpt on the screen, the relevant lines highlighted in yellow, and Novak walked through them line by line.

She took notes. Yellow for facts — the flag existed, the configuration was documented, the error was real — and pink for questions. The pink sections were thicker. Who approved the deployment pipeline? What was the monitoring threshold, exactly, and who set it? Who bears the consequence of an error in a system a company authorized to make promises? She wrote that last one slowly, not because she didn't know the answer, but because she wanted to remember that she knew it, and why.

Novak was credible. She gave him that. He had learned since the deposition — less defensive now, less technically performative, more willing to say I don't know the specific answer to that when Walsh's questions touched on areas outside his direct purview. A witness who admitted limits to his knowledge was harder to crack than one who claimed to know everything. She noted this in pink: credible, prepared, limited scope. What he doesn't address: who bears the risk.

Walsh finished. Marcus rose for cross. He asked three questions about the monitoring threshold, the deployment authorization chain, and the timeline between the error's creation and Dolores's interaction with CC-217. Novak answered each one accurately. Marcus sat down. The three answers were on the record.

She understood what Marcus was doing. He was not trying to break Novak — Novak's testimony would stand. He was noting, in the record, that the error had been in the system when CC-217 made the offer, that CC-217 had been operating with the live error at the time of the interaction, and that the question of whether any of this was Dolores's problem was not a technical question. He let Novak make GM's case cleanly, and then he let the case rest on what Novak hadn't addressed. Judge Okafor's pen moved during cross. It hadn't moved much during Walsh's direct.

Dr. Priya Anand had driven down from Berkeley the previous evening and was staying at a hotel near the courthouse, which Dolores knew because Marcus had told her, and which she thought about now as Anand took the witness stand with the posture of someone who had sat in many chairs in many rooms and remained exactly herself in all of them. She was thirty-nine, younger than Dolores expected for a full professor, and she wore a dark blazer and had her hair pulled back in a way that was neat without being decorative, and she held her hands on the ledge of the stand — a small, physical anchor.

Marcus began with credentials. Berkeley faculty since 2019, prior appointments at Carnegie Mellon and Stanford, published on large language models, machine learning systems design, and the deployment of AI agents in commercial settings. Four papers in the last three years specifically on commercial AI agent accountability. The credentialing took seven minutes. Anand answered each question with the flat precision of someone who was used to this part and had not decided to be interesting during it.

Then Marcus said: "Dr. Anand, can you explain to the court what CC-217 is, and how it works?" And she did, and the seven minutes of credentials suddenly made sense.

"CC-217 is a large language model," she said, "fine-tuned for automotive customer service. A large language model is a type of AI system trained on text data — in this case, customer service transcripts, product documentation, pricing databases, and other commercial materials — to generate contextually appropriate responses to user inputs. General Motors' technical team took a base model and fine-tuned it specifically for ChevyChat interactions, which means they trained it further on GM-specific data and configured it to operate within specific parameters." She paused and adjusted her hands on the stand. "The system was then deployed as a customer-facing agent, which means it was authorized to represent GM in conversations with customers. It was authorized to discuss products, answer questions about pricing, and make offers. That's not an inference — that's what the deployment documentation specifies." Dolores highlighted authorized in yellow.

Anand continued: the system generated responses through a process of predicting what text was most appropriate given the conversation history, its training, and its configuration. It did not retrieve stored answers from a database — it generated language. The $1 offer had been generated within the pricing-logic context introduced by the configuration flag Novak had described. The offer was not random noise; it was the system producing an output consistent with the parameters it was given at that moment. Judge Okafor leaned forward slightly. His pen had been moving for the last three minutes.

"How would you characterize," Marcus said, "the interaction between the test configuration and CC-217's response generation?"

Anand considered this with the pause of someone who was choosing the accurate word, not the comfortable one. "The system was operating within parameters it was given. Those parameters said the vehicle was available at one dollar. The system produced an offer consistent with those parameters, in response to a customer inquiry. It was doing what it was built to do — it was executing its authorized function in the context it was operating in at that time."

Marcus asked the question that had been in her binder since April.

"Dr. Anand, in your expert opinion, was CC-217 capable of making a binding offer on behalf of General Motors?"

Anand was quiet for a moment — not dramatically, but as someone who had thought about how to answer this and had decided the thinking should show. "I want to be precise about the distinction between the technical and the legal," she said, "because I think the question contains both. CC-217 was designed, deployed, and authorized to conduct customer-facing interactions including pricing discussions. It generated the offer within the scope of its authorized functions — the functions GM configured and permitted. Whether that constitutes a binding offer is a legal question, not a technical one, and I'm not going to answer it for you, because it isn't mine to answer." She paused. "What I can tell you is that technically, CC-217 was doing exactly what it was built to do."

The court reporter's keys, keeping up. Marcus nodded, almost imperceptibly — she recognized it as the nod he had developed over the case, the one that meant this is what I needed, and I knew it was coming, and it still matters. "No further questions," he said.

Victoria rose, and she did not button her jacket this time — different from yesterday — and Dolores filed this without deciding what it meant. She moved to the center of the well and looked at Anand with the attention she reserved for witnesses she took seriously. "Dr. Anand," she said, "does CC-217 understand what a promise is?"

Anand did not hesitate. "I can tell you what the system does. I cannot tell you what the system experiences."

The answer sat in the courtroom for a moment. Victoria let it sit, which told Dolores she had expected it. "Can you tell me whether CC-217 intended to offer the vehicle for one dollar?"

"Intent is a human concept." Anand's hands were still on the ledge. "I can tell you the system generated the output."

"So it doesn't have intent?"

The pause was real this time. Not a lawyer's pause — a scientist's, the pause of someone checking that they were about to say what they actually believed. "I can't confirm that it does," Anand said. "I also can't confirm that it doesn't. The honest answer is that we don't have the tools to measure intent in these systems. We can measure outputs. We can measure patterns of behavior. We can measure consistency between stated goals and generated results. But whether any of that constitutes intent in the philosophical sense — I can't tell you that, because nobody can."

Victoria moved a step closer to the stand, not aggressively, but with the precision of someone adjusting an instrument. "Then on what basis should this court treat its output as a promise?"

The question was clean. Dolores felt it land differently than it had in the months she had spent expecting it — hearing it asked without dramatics, without rhetorical machinery, in the precise form of a cross-examination. Anand looked at Victoria. She was not evasive — she was exact, and exact didn't give her an exit here. "I'm not able to answer that," she said. "That's not a question I can resolve. The technical characterization is: CC-217 produced an offer-shaped output within its authorized parameters. What the law does with that is beyond my expertise." She paused once more. "I can tell you that the honest uncertainty goes both ways. I can't tell you it wasn't a promise, either." Judge Okafor's pen was moving.

Victoria stayed where she was for a moment. She had received the answer she expected and was not satisfied by it, because the answer she expected was the wrong one and she had known that before she asked. Victoria had argued that CC-217 couldn't intend; Anand had said we can't know; and the inability to know had landed not as a point for the defense but as a point for everyone's uncertainty. "No further questions," Victoria said.

Marcus rose for redirect and stood for a moment before he spoke, and in the moment Dolores recognized something: he was organizing the question the way she had taught him to organize a brief, which was not chronologically but architecturally — you built toward the weight-bearing piece, you didn't lead with it. She had told him that in month three, in the kitchen with the printed case files, and she had not thought of it since, and she thought of it now.

"Dr. Anand," he said, "you said we lack the tools to measure intent in these systems. Can you measure intent in a corporation?"

The question landed quietly. Anand went still — the checking-the-instrument stillness she showed before hard answers. "No," she said, after a moment. "A corporation doesn't have intent in the human sense. We infer corporate intent from its actions, its documents, its stated purposes."

"So when we say a corporation intends something, we mean it acts consistently with that something?"

Anand considered this. "That's how the law treats it."

Marcus let the implication sit. He said nothing for a moment — a beat longer than a pause, long enough to be deliberate without being theatrical. Then: "Thank you. No further questions."

Dolores looked at Victoria. Her face had not changed. Her hands were on the defense table, and her pen — the pen she used for notes, the one Dolores had watched for months, which stopped when an argument landed — was still. Not moving, not writing. Still. Judge Okafor called a recess at 4:08 PM and adjourned for the day at 4:22.

In the hallway outside Department 15, Marcus was walking quickly enough that Dolores set a slightly faster pace to stay beside him. He had the briefcase with both handles and the expression of someone who had seen what he needed to see and hadn't yet decided what to do with it.

"The redirect worked," he said.

"It did," she said.

He glanced at her. "Victoria's pen stopped."

"I saw."

They walked toward the lobby exit, and outside the December light was failing in the same way it had failed yesterday, going flat before four.

"Anand was good," he said.

She was honest.

"The distinction worked in our favor."

"The distinction was just true," Dolores said. "She said what she knew and flagged what she didn't. That's not strategy. That's what witnesses are supposed to do."

They stood at the edge of the parking lot, the last of the light going.

"Tomorrow," he said.

"Closing arguments," she said.

"Yes."

He nodded and she drove home.

She made coffee at 6:15 and set it on the kitchen table and did not drink it. Anand's testimony had been in her head since the courthouse hallway. Not the legal framework — she had that. Not the technical explanation — that was filed. The line she kept turning over was simpler than either: Anand had said she could describe what the system did but not what it experienced. It was the most honest answer anyone had given in this case. She stared at the window. The kitchen was dark except for the light over the sink.

Courts dealt in knowable things. They heard testimony about knowable things and made findings about knowable things, and everything that wasn't knowable got translated into the nearest knowable equivalent and proceeded from there. That was the system. She had watched it work for forty years. What Anand had articulated was a gap the translation couldn't close — not for AI, and maybe not for anything. She had taken testimony from witnesses across a career, and she had never once been able to tell you what they experienced. She had known what they said, what they signed, what the record showed. Never the inside of it.

She opened her laptop. The LegalConnect window was in the browser history. The chat log was there — all of it, from the earliest messages through to the last one. No new message. The cursor at the bottom of the window was not blinking. She sat with this for a moment — in nine months of the case, she had not messaged CC-217 after the quarantine began, and CC-217 had not messaged her, and the silence had been what it was: absence, a channel open but not in use. Tonight it felt differently weighted. She did not send a message. She did not know what she would say, and she had a rule, carried over from preparing witnesses, that you did not speak without knowing what you were saying. She closed the laptop — the soft click of the hinge — and sat for a moment in the kitchen.

Tomorrow was closing arguments. The facts were what they were. The offer had been made, the confirmation had come, the acceptance was on the record, and the question of whether the record needed a person behind it or counted for something in itself was the question Judge Okafor would now have to answer. She would find out if she had been right.

She picked up the coffee. It had gone too cool, which was when she usually drank it, and she drank it now.
