The Predictive-Policing Poet

The Pattern

Chapter 7 of 14

The desk lamp made a cone of yellow light that reached only as far as the stack of folders beside her keyboard. Beyond that edge, the office dissolved into the kind of dark that offices become when the building empties out — not menacing, just indifferent, the way rooms go when they stop being performed for anyone. The fern on the windowsill was a shadow behind a shadow. Evelyn had noticed it looking worse three weeks ago and had not done anything about it since.

She opened the error log interface at 9:14 PM, using her project liaison credentials, which gave her access to the internal data that didn't reach the program's public-facing reports. She had never used this access before tonight. She typed the first query and waited for the data to load, and by the time the rows populated, the awareness of what she'd just registered — this access, before tonight, never — had already receded behind the numbers.

False positive logs. She sorted by date range first, then by confidence interval, then tried by geographic cluster. The interface was built for someone who knew what they were looking for, and she was pretending she did. She had co-written a paper last spring on ORACLE-9's methodological rigor. She had cited the 3.2% error rate in that paper, in three conference presentations, in the TED talk that had now accumulated more than three million views. She had said the number the way you say a number you trust. The logs loaded: case numbers in gray text, confidence scores in blue, outcome designations in black. She started at the top of the Parkwood cluster and read down.

Case #2028-0312-FP. Subject: Cole, Marcus. Age 34 at time of prediction. She almost scrolled past it. The case was three years old, predating the program's public literary phase, from the period when ORACLE-9's narrative outputs were still being classified as administrative summaries. She stopped because of the confidence score: 74%. That was above the threshold. The intervention column was not blank.

She read the intervention record, which ran to four lines. Detained at residence, 7:40 AM, October 9, 2028. Held 48 hours for questioning. Released without charges. No crime occurred.

She found the employment flag in the supplementary records, which was a field most people didn't open. Occupation at time of prediction: Teacher, Parkwood Elementary School, 3rd grade. Status 90 days post-intervention: Terminated. The reason field said administrative reassignment, which was the language the district used when the actual language would require an explanation they didn't want to give. The intervention had gone onto his file — not a charge, not an arrest, just a record that said a system had predicted him dangerous — and the school board had called it a conflict of interest and let him go before the semester ended.

Evelyn pulled the literary archive search and typed his name; the system returned one result, dated November 2028: "The Hands That Held the Chalk," published in Prairie Schooner that December. She had selected that issue's ORACLE-9 submissions herself. She remembered the poem. She had described it to the editorial board as one of the program's most formally sophisticated pieces — the way it built toward the imagined violence through the subject's own hands, the chalk-dust detail that grounded it in specificity. She had said the word specificity. She had not known whose hands they were. She had not asked.

Marcus Cole had never read it. She could not find any record of him being notified. There was a notification field in the case record, required by the program's ethical disclosure protocol — a protocol she had helped draft, two years ago, at a two-day working session in a hotel conference room with catered lunch. The field was blank. The protocol existed. The field was blank.

She sat with that for a moment. The poem had been published in December 2028. It had been taught in a poetry seminar at the University of Michigan in the spring of 2029; she knew this because she had supplied a curriculum packet. A man who used to teach third grade, who had been detained for 48 hours for a crime he had not committed and never planned to commit, had lost his classroom and his mornings and whatever it was that third-grade teachers build over years with children whose names they remember — and somewhere in a seminar room in Ann Arbor, a graduate student had written a paper about the chalk-dust imagery in a poem about his hands.

The lamp hummed. The fern didn't move. Outside the window, the campus was empty and quietly lit, the kind of institutional lighting that doesn't know the difference between 10 PM and 2 AM, that simply persists. She didn't close the window. She opened the geographic filter and set it to Parkwood, 2026 through present, all confidence levels, all outcomes. She expected a manageable number. She was wrong about that.

The grandmother was case #2029-0561-FP. Sixty-three years old. Predicted for shoplifting, 56% confidence — below the intervention threshold, so no detention, but the prediction had been logged and generated output. ORACLE-9 had written about her in a poem that appeared in Copper Nickel six months later. She had bought ibuprofen and a birthday card for her granddaughter, according to the corrected record.

The teenager was case #2030-1142-FP. Seventeen. Predicted for vandalism, 61%. Below threshold. He'd been walking home from a friend's apartment. There was a poem about that too, published in Ploughshares.

She started a second spreadsheet, separate from the interface, because she needed to see the numbers side by side in a way the program's display wasn't designed to show. Each poem. Each case. The journal, the confidence score, the actual event, the demographic data the program collected as part of its input matrix and that she had never asked to see disaggregated on the output side.

She ran the numbers three times because the first time she thought she had made an error in the pivot table, and the second time she thought the sample size might be too small to be meaningful, and the third time she looked at the numbers and did not think either of those things.

Overall error rate: 3.2%. The published figure. The one she'd used.

Parkwood district: 11.4%.

Black men under 25: 14.7%.

She had testified before the city technology board eleven months ago. A council member had asked about demographic disparities in predictive system error rates and she had said the program's overall accuracy was validated at 96.8% and that demographic breakdown analyses would require additional scoping. The council member had accepted this. She had believed it was the honest answer, because it was technically the honest answer, and she had never pulled the scoping analysis herself to see what it would show. She had access. She was the ethics consultant. She had never asked.

The spreadsheet sat there with its numbers, not accusing her of anything, just being numbers, just being a thing that had always been true and was now on her screen at 11:07 PM in a dark office where the only moving thing was the cursor blinking at the end of an empty cell.

She opened a new email to herself. She typed the subject line: Methodological limitations — internal note. Then she sat with the cursor in the body of the email and did not type anything. The question was what she would say after "The disaggregated error rates suggest—" and she could not find the verb. Suggest a need for review was what she wrote in papers, and she had written it many times, and the sentence had always felt like action taken. Now it felt like a sentence that kept the door shut. She deleted the draft. The email window closed.

She went instead to her faculty page and found the paper. "Algorithmic Ethics in Practice: The ORACLE-9 Literary Program as a Case Study in Responsible AI Deployment." She opened the PDF and read from the abstract, which she had written carefully, which had been peer-reviewed by three people whose judgment she trusted. The program's 3.2% false positive rate compares favorably to human judicial error rates across comparable jurisdictions. She had written that sentence. She recognized it as a sentence she had written, had found sound, had defended at two conferences when audience members raised objections she had considered and set aside. She read it again. The sentence did not break. It remained technically correct. The 3.2% was the published figure. The human judicial comparison was standard methodology. The framing was defensible. She closed the PDF.

She looked up the city council member's name, the one who had asked about demographic disparities eleven months ago. Councilmember Ruth Adebanjo, District 7, whose office she had a direct line to, whose question she had deflected with the additional-scoping answer. She did not open the contact. She found the transcript of the public meeting, pulled up the timestamp, read Adebanjo's question again: Dr. Okafor, are the error rates distributed evenly across demographic groups in the affected districts, or is the program making more mistakes about some people than others? The question was not ambiguous. The question was the 14.7%. She had told Adebanjo she would need additional scoping to answer it. She had then not done the scoping. She closed the transcript.

She thought about what Adebanjo might have done with 14.7%. Whether it would have changed anything. She thought about the grandmother with the ibuprofen and the birthday card, how ORACLE-9 had imagined what she might do with her hands, how a journal she respected had published that imagination, how no one — not Evelyn, not the editors, not the readers who'd written enthusiastic annotations in the margins of their copies — had asked whose hands they actually were.

The mug sat on the far corner of the desk, behind the second monitor. She had had it for four years, had picked it up at an AI ethics conference where it was merchandise. Ethics Is Not Optional. She had thought it was funny in the good way, the self-aware way, when she bought it. The coffee had gone cold two hours ago. She did not look at it.

She thought about the TED talk. The version of herself on that stage, in the red blazer she'd bought for the occasion, explaining to a room of people who already wanted to believe her that the presence of an ethics consultant was itself a structural safeguard. She had said preventive art at least seven times in eighteen minutes. She'd coined the phrase herself, two years before, in a paper arguing that ORACLE-9's poetic output represented a form of pre-emptive memorialization — grief for violence that the system's intervention made unnecessary. The phrase had caught. Journalists used it. The program's promotional materials used it. She had been invited to speak because of it. She said it now, quietly, to the empty office.

Preventive art.

It didn't do what it used to do. The words landed and lay there like something that had been let go. The TED talk was online. The papers were published. There were three panels she'd sat on in the past eighteen months where she had explained to other academics and policy people and curious journalists how ORACLE-9's literary work represented a new category of automated empathy, how the poems made the abstract concrete, how she saw her role as building a bridge between the data center and the bookshelf. She had used the bridge metaphor in multiple settings. She had believed it, and she thought now about what a bridge is built on.

The photo on the far shelf showed her with the program's project director at the Confluence Consortium launch event, eighteen months ago. Both of them smiling. She was holding the printed program with her name listed as ethics chair. She had been proud of that. She had felt like her career had led somewhere that meant something.

She pressed both palms flat on the desk, the way she did sometimes when she needed to feel something solid. The wood was cool. Her hands were hers. She had submitted poems for publication. She had selected Marcus Cole's poem herself.

She had a framework for distinguishing between complicity and negligence. She had published it. The distinction rested on whether a practitioner had the information necessary to foresee harm and had chosen not to act on it, versus whether the harm arose from a genuine gap in the available evidence. She applied the framework now, in the way she would apply it to any case brought before her. She had had access to the disaggregated data. The capability to run the scoping analysis had existed. The 11.4%, the 14.7% — these were not figures that had been concealed from her. She had been the person with the credentials to find them. The gap was not in the evidence. She had written a paper about this distinction, had stood at a podium and walked through its logic, had told her graduate students that ethical frameworks were only as useful as the rigor with which you applied them to yourself. She was an ethicist who had not asked for the disaggregated data. She had access to the disaggregated data. She had written papers about the importance of disaggregated data in algorithmic accountability assessments. She had not asked.

Her phone was on the desk beside the cold mug. She picked it up, set it down, picked it up again. Margot Levine's number was in her contacts under Confluence — Margot, from the planning conversations about this year's festival retrospective. ORACLE-9 was supposed to be celebrated. A five-year retrospective: the Pushcart poem, the Prairie Schooner run, the institutional legitimacy she had helped build. Margot had been excited. Evelyn had been excited. She dialed before she had finished deciding to.

It rang twice. Margot picked up with the slightly guarded tone of someone getting a call from a professional contact after 11 PM.

"Evelyn. Everything alright?"

"I need to propose something," Evelyn said. "About Confluence. About the retrospective." A pause. "Okay."

"I want to invite Darnell James to read. His own poem. On stage."

The silence on the other end was not the silence of someone who didn't understand. Margot understood. She was quiet for a long time, and Evelyn waited, because there was nothing to add to what she'd said that would make it easier.

"That's never been done," Margot said finally.

"I know."

"The board will—"

"I know."

Another silence. Then: "Is this about the error?"

Evelyn looked at the spreadsheet. She looked at the darkness where the fern was, the fern she had been meaning to water for three weeks. "It's about more than the error," she said.

She didn't know if inviting Darnell was conscience or something else. Whether giving him the stage was restitution or another form of arranging him for an audience that had already decided what he meant. She had arranged things for audiences before, and she was good at it, and she didn't know what it said about her that the same skill applied here. She made the call anyway. Some decisions arrive as done things before you have finished making them.

"I'll call you tomorrow," Margot said. "I need to think."

"I know," Evelyn said. "Think."

She hung up and sat in the lamp's cone of yellow light, the spreadsheet still open, the numbers still themselves, the fern still dying at the edge of sight. She did not close the laptop. She was not ready to stop looking at what she'd found, even though looking at it didn't change it, and not looking at it wouldn't either.

The building was quiet. The campus lights persisted outside. Somewhere in a data center she had toured twice and described in print as an example of thoughtful infrastructure, servers processed four point two million data points per second and did not know it was late.
