This story and essay, and the accompanying art, are presented by AI Policy Futures, which investigates science fiction narratives for policy insights about artificial intelligence. On Wednesday, Oct. 14, at noon Eastern, Yudhanjaya Wijeratne will join Tochi Onyebuchi, author of “How to Pay Reparations: A Documentary,” and Holli Mintzer, author of “Legal Salvage,” to discuss sci-fi, artificial intelligence, bias and justice. To RSVP for the hourlong online discussion, visit the New America website.
Each month, Future Tense Fiction—a series of short stories from Future Tense and Arizona State University’s Center for Science and the Imagination about how technology and science will change our lives—publishes a story on a theme. The theme for July–September 2020: justice.
First came the idea of the robot, on a Prague stage of all places—the unfeeling, enduring slave of Karel Capek’s R.U.R. The idea was much older; but R.U.R. really defined the concept, wrapped its edges in use case and narrative, and thus set in stone the relationship we were supposed to have toward it.
And then came the slave rebellion. Shades of Shelley’s hideous progeny recast in liquid metal, with Schwarzenegger later riding shotgun. Cyborgs, Cylons; the Oracle and the Architect; the name changes, the function remains the same. The fear that every parent has: that one day their own child would throw down all they held dear, and turn against their house, and would actually be justified.
I am thankful, then, that the world we actually live in is not defined by robots’ mastery or servitude. The sky outside is white; cold, yes, but not blackened, not scorched, simply a monsoon season shaking itself down into spring. The wind carries with it the smell of etteriya flowers; little gifts from the cell tower trees, which carry orange jasmine DNA somewhere. There is a flock of little machines tending the one closest to my door—I think last night’s storm wasn’t too kind to it—and as I pass they move aside and point me in the direction of the bagel shop. One of them, very solemnly, holds up a little white flower.
What made it do that? The State Machine, knowing that I have barely stepped out of my flat after the breakup? That delicate symbiosis between machine input and well-intentioned social campaigns, setting forth in hard code a law that people who suffer must be taken care of?
Was the tree actually damaged in the storm, or was it just an excuse to plant something out here, to give me this flower, and make sure I wasn’t alone?
The means, I suspect, are now too complex for even my department to understand. But the end is just what I needed. The flower is beautiful, the scent is beautiful, and standing out here, for the first time in so long, feeling the sharpness of the wind on my face, Oh God, I’m thankful.
Martin Wong is the first person to greet me at the University. Wong leads the Night Watchmen Project, a group of interdisciplinary academics playing with the State Machine code to see if there’s some perfect combination of starting conditions and fixed constants that might lead to a sustainable libertarian society. We’ve had plenty of arguments in the past. I think he’s naïve and too obsessed with the computer science; he thinks I lack imagination. He’s wearing a greatcoat today that makes him look like some giant Dracula knockoff.
“Come on. It’s legal now, don’t worry about it. Anti-smoking codes went under last week when all the nicotine addicts countercampaigned. Stupid health craze.”
I should probably note here that Wong doesn’t trust vegetables and lives entirely on a diet of nutrient soup and nicotine. Let’s just say it takes all sorts to make a university.
We smoke in silence. The nicotine salts are heady, almost overpowering, and we studiously examine the gables and windows we’ve seen 10,000 times before. Somewhere beyond, judging by the cars, is a student protest. Several hundred drones circle them like flies. Every so often a pair peeks our way, and I see a banner: NO MORE WALLS! BYZANTIUM FALLS! and BRING BACK THE NATION.
“They’re trying to get us to open up to the Rurals,” says Wong. “Merge with the other cities, throw down the walls, all that bullshit?”
“Is it working?”
A scoff. “Mad? The city-state model is the best we have. None of these idiots have lived in an actual nation. Hippies.”
A drone flashing FREE HEARTS, FREE MINDS, FREE BORDERS wobbles our way, no doubt heading back to recharge.
“Glad you’re back,” says Wong, at last. “I was running out of people to argue with.”
“Good to be back. I still think you’re deluded.”
Wong grins. “Finish your thesis, then?”
Such a ritual, at heart no different from the flower; except that we are just human, bound in our awkwardness, while the State Machine, with its catlike affection, is somehow more comforting.
Inside, the University is a haunted place. Stone floors and old walls laced with surface displays; microdrone swarms over ancient greens; history and future brought uncomfortably close together, with the present an infinitely thin slice between them. The politics of the Reds and the Greens, the Nationalist movement, all those things are ghosts here, weak and impotent, locked away behind newscasts. There was a movement to abolish the University at some point—a class argument that picked up serious traction—but what people don’t understand is that the University is more than just buildings and tenure: It’s an idea, a meme, a microreligion, an infinitely self-replicating concept that spreads among disparate actors and fights hard to preserve itself.
And so this strange structure remains. The sigils and mottos outside, the silent tread of weary professors, the rooms of debate and discussion, the eager first-years drunk with their own immortality. Life seems endless when you’re that young. Memories of our first year together—her libraries, her steps, the little artisan ice-cream shop tucked away in the corner—all hers, all things I scurry past, trying not to remember, until I come to the brown door marked TRACTATUS.
And, beneath that, the fourth clause: A thought is a proposition with a sense.
She’s inside, curled up in her usual corner, lost in some projection, the dark glasses cut by the darker hair. Still a sight that takes my breath away, only now in ways that hurt. She looks up as I walk in.
“Wong says he got a message from the State Machine,” she says. “Told him you were suicidal. Three others in the lab, too.”
There is that uneasiness between us. “I didn’t get a message,” she says quietly. “I didn’t get anything. I’m sorry. I didn’t mean—”
“Let’s not,” I say, taking my old, familiar place, even as something inside me crumbles and dies. Because we both know what that means. Neither of us sees as much as the State Machine does; to each other we are just idealized versions of ourselves, projections, half-lies and half-truths, not the real data trail we all leave behind. And she didn’t get the message.
The irony. In all those old stories I read it was humanity that triumphed over the cold heart of a machine. Love. Hope. Courage. Cunning. It was always the machine’s blindness, its inability to feel as we do, that became its downfall. But the reality is that it is we who are the blind, the unfeeling, the enduring, and a bunch of software modules sat there, knowing the real parameters of love all along.
On the way home, I can’t help wondering whether it would have worked out had things been different at the start. For the longest time we believed the world around us was deterministic enough to be understood; that it was just a matter of encoding enough data, and enough processing power, to be able to see the future. That if I do x, and the other person does y, and if I know all the things I need to know about the actors and their actions, I can say that z is the logical outcome …
But the world isn’t mathematics on a screen. Complex deterministic systems exhibit chaos; high sensitivity to initial conditions. We can never know the initial conditions with infinite precision, so whatever simulation we have in our heads, no matter how detailed, is a step or two away from reality, and eventually must fail. The way we break people we love, and ourselves.
High sensitivity to initial conditions. Hmm. I think that’s a nice chapter title. Not too flashy, but accurate.
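That sensitivity can be sketched in a few lines, with the logistic map standing in as an assumed toy system (my choice of example; nothing here is from the State Machine itself). Two trajectories that start a billionth apart end up unrelated:

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n); chaotic at r = 4.0.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-9)  # a billionth-part nudge in the start

# The gap between the two "simulations" grows roughly exponentially
# until the trajectories bear no relation to each other.
gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"gap at step 0:  {gaps[0]:.1e}")
print(f"largest gap:    {max(gaps):.3f}")
```

No measurement of the starting point, however precise, keeps the forecast on track for long; that is the whole chapter in a dozen lines.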
The University says my work on the State Machine began on Oct. 3, 2038, the day I enrolled. The day Jump!Space Industries’ Heart of Gold rocket cluster exploded in the sky. The day of the Mass Action protest. But that was just the date I enrolled on, and purely a quirk: She was here, she wanted me here, so I came, like the proverbial bumbling moth, uprooting my life for a dream.
I’d say my work on the State Machine began much earlier. It began with Pharaoh.
Pharaoh was an ancient video game, the kind you had to emulate to play. Pharaoh put you in charge of managing an Egyptian community, from tiny villages to vast townships. As my cities grew, the needs of my citizens grew with them. There were plagues; there was crime; there were fire hazards; I had to make sure enough houses got water, that there was entertainment around, libraries, monuments. I had to balance everything against income from taxes and markets and shipping; and if I did make a neighborhood livable enough, its citizens would build better housing, and new citizens would move in, with a new set of needs. Tiny decisions, driven by panic or ignorance, could snowball and shoehorn you into serious trouble a year down the line. What fascinated me the most was that I could click on every pixelated citizen and see their complaints, track their path through the city, and understand, at least from on high, the daily lives of my digital slaves.
My parents didn’t understand or approve of my obsessions. In our broken economy, they felt the only way out was to be a doctor, lawyer, or engineer, and none of those were achieved by loafing around playing video games all day.
Neither did my friends, for that matter. We were young. We were rebels; we were infinite. And here I was, locking myself away in a dark room, retreating from all that glory, hopping from video game to video game the way my friends hopped from party to party.
Even in college, bent over books that skipped from logic to rhetoric and bootlegged algebra that regurgitated solved problems, it was obvious to me that the people who had put serious time and thought into how a society might be built, how governance could be parametricized, and how an A.I. could run it were game designers, the Sid Meiers and the Will Wrights and the Tarn Adamses of the world—as opposed to political scientists, economists, or legal scholars.
Is it a wonder, then, that the State Machine came from a failed game designer?
Around the time when I was just discovering Pharaoh, a small company called Tambapanni Studios began building a strategy game, a city builder where one played an omniscient governor; halfway through, the engine was complete, but art assets were expensive and the studio was out of cash. Tambapanni shuttered its doors and released its code to the public.
At University we’re taught how the State Machine and the Legal Atomism movement grew out of the need for bureaucracy to regulate an almost infinite number of interactions between diverse constituents while processing an ever-expanding amount of information. It was, in effect, an extension of this need: a push for greater efficiency through automation. The ruling class, whatever it happened to be, had to offer enough goods and services to the ruled to keep them happy. So, in the name of maintaining that happy equilibrium: Automate enough processes, do it well enough, and you end up with systems that interact well enough with one another to replace portions of a human bureaucracy. Let the process continue for a while and you end up with the State Machine: a system performing the supreme act of rationalization.
But there is a lie at the heart of this narrative, an inconvenient truth shuffled under the rug by the weight of literature reviews. The first version of that State Machine was a sea of finely tuned cellular automata constantly trying to converge to a single steady state, designed to be hypercompetitive in the service of pre-built parameters of success. The people who wrote that code weren’t legal theorists; they were ordinary people with lives shaped and sculpted by a complex web of social contracts held long past their prime. Decisions they took to be common sense—maximize production, maintain trade relations where possible, weed out the underperformers, reward those who moved units of arbitrary fiat currency around—those were intrinsic biases, products of a political and ideological superstructure sold in paychecks and self-help books and success stories.
When Tambapanni went under, the Utopia Project lifted that code base and used it as the engine for a series of demos commissioned by the Center for Global Equity. Utopia found that only minor tweaks were required to implement constitutional frameworks; Tambapanni already had hundreds of index metrics named governance params. Civil rights? Check. Driving behavior? Press freedoms? Religion, that shadow governance all proper U.N.-bred economists feared to touch? All checks. The same problematic codification of culture captured externalities while well-meaning economists and legal theorists stuck to siloed abstracts that only worked half as well. Simple units working on simple rules interacted with one another to produce complex emergent behavior, the way millions of simple bees will converge to produce a complex hive.
Utopia filled in the gaps, downloaded satellite imagery, and the final demo, to judge by news reports of that time, went viral. Academia scorned it, but journalists started downloading it, playing with parameters, using mild predictions to advance their careers; from journalists it went to the hands of various advisers; from advisers to politicians, who realized they could get rid of some of those advisers; then from politicians to higher politicians.
And at the heart of it still were those lines: THIS SOFTWARE IS PROVIDED “AS IS,” WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL TAMBAPANNI, ITS SHAREHOLDERS OR ITS EMPLOYEES BE LIABLE FOR ANY CLAIM, DAMAGES, OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT.
On Tuesday, after a night of fits and starts, I wake up to find my bedside glass of water has switched bedsides. Then I notice the face peering over me.
“Bleaaaargh,” I say, thrashing around a bit.
Fortunately, it’s not some random intruder. Unfortunately, it’s Adam Mohanani, or AdamM, as he styles himself. He’d dropped out of psychology, claiming that it was a load of tosh, and gone off to study economics; dropped out of that and switched to religion; I used to say that at some point he’d pull a Wittgenstein, declare everything to be so much nonsense, and take up whistling instead. To which he usually replies:
“You’re spending too bloody long inside your own head.”
“How the hell did you get in here?”
He has the good grace to look embarrassed. “I heard about it from Wong. Tried ringing the doorbell, you didn’t answer, so I went to her place and she had a spare set of keys.”
I rarely get angry. I suppose I rarely feel anything these days. But there are no words for the bile and the ache that spreads through me upon hearing this.
“You mean you went back to her place.”
Adam is a good part of the reason we broke up. Call him an initial condition in a system highly sensitive to them.
“Shit. Look, let’s go outside and grab a coffee. Let’s just talk, OK? Come on. This is not healthy.”
Cause or effect? I don’t know, because at that moment I punched him. And soon it was fists and knees and the crash of furniture. Something glass shattered and stabbed into my palms.
“You’re crazy,” he says, when we break apart, torn and bleeding. “Go to hell.” The door slams and he’s gone. Back to her, I assume.
I look over the room, trying to see it with someone else’s eyes. My bookshelves have tipped over. Clothes, piled up in one corner, shoes tossed about. Real paper, yes, splayed out over the floor, trampled. The glass coffee table is in ruins. There’s a splatter of blood on my bed. In the corner, though, entirely untouched, is my screen.
Beginnings matter. The first thing I did when I began this project was to code a combination scraper and parser. It’s very similar to an old-school malware scanner: It looks for code signatures across digital archives, uses basic clustering to determine versions and generations, arranges branches by the contiguity of updates applied to them, modifies its signature definitions, and moves on. I think of it as the Hail Mary of my thesis.
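A toy sketch of what such a scraper-parser might do — every name and snippet below is invented for illustration, with a crude character k-gram fingerprint standing in for real signature definitions: fingerprint each archived snippet, then link it to its most similar predecessor to recover a lineage.

```python
# Fingerprint code by its set of character k-grams (a crude stand-in
# for the malware-scanner-style signatures described above).
def fingerprint(code, k=5):
    return {code[i:i + k] for i in range(len(code) - k + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b)

# A tiny invented "archive", ordered oldest-first.
archive = [
    ("v0",   "def tax(city): return city.pop * 0.1"),
    ("v1",   "def tax(city): return city.pop * 0.1 + city.trade"),
    ("v2",   "def tax(city): return city.pop * 0.12 + city.trade"),
    ("fork", "def tithe(temple): return temple.gold * 0.5"),
]

sigs = {name: fingerprint(code) for name, code in archive}

# Attach each snippet to its most similar earlier snippet.
lineage = {}
for i, (name, _) in enumerate(archive[1:], start=1):
    parent = max(archive[:i], key=lambda p: jaccard(sigs[name], sigs[p[0]]))
    lineage[name] = parent[0]

print(lineage)  # v1 and v2 chain onto the main line; the fork attaches only weakly
```

Basic clustering on these similarity scores is what lets branches, generations, and dead ends fall out of the archive without the parser understanding a line of the code it reads.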
There isn’t much of the Internet left, but the University has partnerships with the city of Vivarium, and Vivarium archived most of the clear web before the undersea cables started failing. The parser doesn’t understand history, but it crawls Vivarium’s archives, showing me how history was written. Here it is, in far greater granularity than anyone has ever achieved. V0: the genesis of the State Machine. Then V1, V2, in short order.
I understand the value of a single straight-line narrative, as I told my supervisor, who appreciates it too much. But the history of ideas isn’t a straight line. It’s evolution. It has forks, dead ends, horizontal gene transfer, sudden optimizations to market conditions that sound remote and bizarre today, and even the occasional vaporware project or 10.
By the time the Utopia Project brought out version 3.0, the Full Systems Toolkit, an entire ecosystem was evolving underneath the project, with entire governance rule sets and libraries being traded back and forth across GitHub. Utopia’s funder, a would-be superpower jostling for influence, called in a few favors, and the project leaked throughout the U.N. ecosystem. By version 4.0 the Utopia Project was not even remotely in control of its own co-creation; depending on what day of the week it was, the U.N. Global Pulse lab would be championing any of six different versions of the system.
The next logical step, then, was to make those simple units more complex, to let them learn from real-life data. At which point every serious computer science school and A.I. startup realized there would be real money and power involved. A new renaissance in competitive governance was born around what, ironically, might just have been the greatest video game of all time … and almost all of it was open-source, simply because of that one decision made by Tambapanni.
Another chapter. A new Renaissance. And that was how the nascent State Machine ended up being bundled as a decision simulator into a massive aid grant to Sri Lanka, back in the day when countries were still a thing. Partly because its economy was crumbling, and partly because someone sitting in front of a New York skyline wanted to test the system before endorsing it. And, gamelike: What better way than to try it out on a microcosm? Sri Lanka was an island, and it had a smaller population than most cities today.
What would have happened if things had been different? If Tambapanni had never open-sourced its code? If some other agency had built a closed system from scratch, painstakingly translating legal documents into their closest equivalent in code? The space of what-ifs is always larger than the actual series of events. And it only ever leads to regret.
My supervisor is furious. Violence is taken very seriously. Thursday is the disciplinary hearing.
Well, hearing is a strong word. The whole process is handled by the State Machine. Out of respect for local standards there is a human jury, but they are anonymous, reviewing only the data; there are no appeals to file, no meetings to attend, only a series of quiet interviews, five minutes each, of everyone judged to be in my social web.
I’m instructed to stay home in the meantime. My devices switch to text-only messaging, my access shrinks to only university material, my feed politely informs me it’s switching to nonviolent material only for now. The little street-cleaner machines outside my door have no more flowers for me, but track me, almost apologetically, with their curious emoji faces.
Her: YOU’RE AN IDIOT.
Her: THE CALL OPTION DOESN’T SHOW FOR YOU. CAN YOU SEE THIS?
People don’t know it, but the social contract around me has changed for a day, enforced by a million smartphones, cameras, login systems, payment gateways, search engines. A mobile medic drops by, stares at my room, treats my wounds, and leaves me with a mandatory dose of painkillers and several “voluntary” doses of mild suppressants. For the first time, the real invisible hand is revealed to me; the State Machine’s many subsystems stepping firmly and politely in my way, marking new boundaries.
Camus was an idiot. There is no invincible summer inside of me, only a terrible buzzing noise that crawls inside my mind, creeping inexorably over the border that keeps me moving, thinking, writing. The only way Sisyphus is happy is if he’s on a metric ton of drugs. I take the drugs. The world tilts briefly, as it did last week.
Once things have calmed down a bit, I put on a comforting playlist. 1 A.M. Study Session. Old music from old times. Perks of Vivarium’s archives. Sycamore, Snowcat, Burnt Reflections, less.noise. The lo-fi beats seep into the room, turning violent chaos into a sadder form of order. Guitar strings, cheap piano, audio hiss, mistakes salvaged and turned into music. Clean the blood off. Pack up the broken glass. Fragments, so many fragments.
V5–V6 were fragments, too. An explosion of code, branches that I explained in chapters 4 and 5. Most splinters were brought about by two broad categories of people who hated each other’s guts. One group consisted of regional data scientists who insisted that the automata models didn’t quite cover their regional quirks well enough. The other consisted of the post-structuralists, who argued that any rule set built on structural knowledge just wasn’t good enough.
The playlist switches to Sycamore again, and Sycamore wanders dreamily between very polished-sounding retro synthesizers and a piano, as if they agree.
V7: the Fuzzy Borders Update, which incorporated most of the fragments of V5–V6 by introducing data acquisition times, neural networks embedded in automata, and genetic algorithms to keep training generations of automata until they better resembled the societies they were supposed to represent. Chapter 6.
And so on until a massive influx of fragments starts coming in from the Rosetta project. At this point competing main branches emerged, hopping between universities; the partnership between Berlin’s Resartus College and MIT was the first to implement the Rosetta standard. Between the two, the next updates were enormous; V10 carried the first Rosetta bytecode, allowing unparalleled translatability between legal syntax and code representation; V11 brought the code library that gave the State Machine interfaces to search engines and social media of all sorts, to use natural language processing and Rosetta to directly convert public opinion into possible legal structures.
That covers what I call the academic term. Now come the hazier interpolations: the private term, where both big and small corporations start tussling for intellectual bragging rights. The private term is an absolute undocumented mess of timelines splitting off, vanishing, reappearing. Much of it destroyed by nations seceding, by cities turning themselves into city-states, and by Byzantium and Vivarium and Crimson Hexagon and the other academic states coming into being, flexing their legal might in a shattered world. And here, in this most whitewashed of all histories, we shine, my faithful parser and me. Occasionally an impressive private release is marked with a paper and then reverse-engineered by irate open-sourcers; through these, the parser has drawn all the right lines, suggesting connections.
Her: THIS OBSESSION OF YOURS HAS GOT TO STOP.
Her: WE’RE TRYING TO HELP.
Her: I HAVE TO TALK TO YOUR SUPERVISOR ABOUT THIS.
Can’t lose focus now, not now. So easy to let the mind wander. To let the glass fall out of my hand. Nothing broken reforms itself. The diagram of history is broken, but at least I can fix it.
V25–33. The modules bloat in size; the code becomes increasingly unreadable. The Dynamic Constitution comes into play; the idea that you could preempt revolutions, riots, even voting, by just listening to the people and updating the core rule set every so often. It came at just the right moment, just when city-states began to look back at Athens and Sparta and Older’s Infomocracy and bring in people who thought in words like scalability and microgovernance.
V73. By now the State Machine is looking directly at behavioral data. Social media opinions, supermarket purchases, public-private partnerships for GPS traces.
The phone rings. No. No. Let me be. Here: the V102 bloc, invisible until now. The statist term. There is a time in all our histories when the State Machine, until now an instrument of the state, becomes the state; these dates are marked in stone and memory. But the code tree shows the truth. The states went under long before the formalities were sealed. I can only see a few branches at a time, but at this point various State Machines are interacting with themselves, very much like the automata that they are a part of, converging at a stable pattern, abstracting universal human needs as hyperparameters, weaving their own hegemonic superstructure.
The little emoji robots are clustering outside my window, on the other side of the road, looking—well, well, I can’t be sure, but I think they’re looking—at my window. I stare at them. Most drift away, like children caught staring. Two of them trundle forward into the complex. Moments later I hear a very soft knock at the door.
For my argument to be complete, I need one more thing, binding everything together. A final stitch. But I, drugged out, caught between hyperfocus and pain, can’t find it. The knocking, again.
“Leave me alone!” I scream, flinging the door open. The little emoji robots shrink back. One of them is holding a small clump of etteriya flowers. It deposits them, very slowly, at my feet, the scent a strange countertenor to the dark notes in my head. I slam the door shut after them, confused. What distant goal did the State Machine actually pick to arrive at this equilibrium? What particular points of data? What turned it into this satisfying tyranny? What would have been the alternative?
The full scope of it yawns in my mind, almost on the tip of my brain, and if I just think a little harder—
The two emoji robots return to their place on the other side of the road, looking at my window.
The day of the hearing is a cold one. I’m still confined to my apartment. Martin Wong drops by in the morning, huge bat-cloak flapping.
“Heard about your, uh, thing,” he says, handing me a coffee. The emoji robots watch us. He stares daggers at them. “Those little bastards are creepy. They always hang around here?”
“They’re fixing something. Tree. Storm.”
“Of course they are. Just another state apparatchik on our doorsteps. Fantastic. You know the irony at home these days? My parents fled one surveillance state and we built another one around them. Remember the deCentralizers? They had the right idea.”
I remember the deCentralizers. They spun off almost 15 years ago for their Village-State project. The idea was that if you keep the number of residents small enough, you’d enable Coasian bargaining across every level of society, removing the need for a State Machine.
“What happened to them?”
“Probably getting shot at, or rotting their feet off somewhere trying to reinvent public infrastructure,” says Wong.
“The fate of all libertarians who get what they want.”
“Hark at the nanny-state fanboy. We should have stopped this when we had a chance of equilibrium.”
Ah, but they tried.
I have two chapters to explain that dead end on a sequence diagram. History is a fabrication to preserve egos and social capital. The reality is that the State Machine swept over us all, turning would-be politicians into toothless, defanged puppets in a ceremonial democracy that everyone pretends to care about while the real work happens underneath.
We smoke in silence, watching the emoji robots.
“It’s not so bad.”
“A tyrant by any other name.”
I know where Wong is coming from; from Frankenstein, from the Cyborgs, the Cylons, the Oracles, the Architects, from systems of control, from fundamental rights.
Outside, the campus stirs: Doors are starting to open; fit postgrads are running, and the saner ones are shrugging on coats and stumbling in the direction of the bagel shop. The protest is re-forming. A runner stumbles. A few of the emoji robots peel off to halt traffic while she limps across the road.
“Sometimes ignorance really is bliss.”
“You’re hopeless,” he says. “See you for lunch, tomorrow?”
“They won’t hold this against you,” Wong says, with a confidence that genuinely lifts me a little. I say my goodbyes, thank him for the coffee, and head back into the messy safety of my room.
Many decades ago, almost at the birth of modern computing, a scientist by the name of Knuth tried to define an algorithm. His definition, carved in stone on the State Machine monument, says that an algorithm must exhibit five properties:
1. Finiteness: An algorithm must terminate after a finite number of steps.
2. Definiteness: Each step of an algorithm must be precisely defined; the actions to be carried out must be rigorously and unambiguously specified for each case.
3. Input: “ … quantities which are given to it initially before the algorithm begins.”
4. Output: “ … quantities which have a specified relation to the inputs.”
5. Effectiveness: “ … all of the operations to be performed in the algorithm must be sufficiently basic that they can in principle be done exactly and in a finite length of time by a man using paper and pencil.”
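Euclid's algorithm — Knuth's own canonical example in The Art of Computer Programming — satisfies all five properties, and makes a convenient sketch (the code below is an illustration, not anything carved on the monument):

```python
def gcd(m, n):
    """Euclid's algorithm, per Knuth's five properties.
    Input: two positive integers m and n.
    Output: their greatest common divisor.
    Definiteness: each step is an exact arithmetic operation.
    Finiteness: n strictly decreases each pass, so the loop terminates.
    Effectiveness: only basic arithmetic, doable with paper and pencil.
    """
    while n != 0:
        m, n = n, m % n
    return m

print(gcd(119, 544))  # → 17
```

Against that yardstick, the voter's folk theory of the State Machine — input in, output out, steps all defined — is exactly the claim the next section dismantles.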
Everything is an algorithm. This, any voter will tell you. The State Machine is an algorithm. It takes the input of public opinion and produces an output of corresponding laws and policies. Some elements of old-school politics still exist—factions keep proposing changes to the core algorithms. They take the source code and every so often come back with a new version, with unit tests, with pages of reports and simulations showing that such-and-a-change will be beneficial in such-and-such ways. And when they say something sensible, the public talks about it. The State Machine picks up on the chatter and sends it to the Steering Committee, the humans-in-the-loop, and thus a new update is pushed. Code becomes law that begets code that makes law. The philosophy of Legal Atomism allows a machine to rearrange the fundamental modules in Rosetta bytecode, pass it through a language compiler, and voila! Out, beautifully formatted, comes a clear expression of what rules we want governing us. This is Civics 101.
Unfortunately, it’s a lie, a Wittgenstein’s ladder, to be thrown away as soon as one has climbed to the top. Knuth’s definitions broke the moment deep learning, connectomics and neural architecture search came into their own. The current State Machine, version 302, Methuselah, is a model of models, constantly modifying itself, spawning new submodels within itself, an entire ecosystem in a constant process of evolution. Almost nothing major terminates in a finite number of steps; nothing is human-defined—a cluster sparking here is a butterfly setting off a tornado halfway across the virtual space; in the next moment, it does something else.
My parser dies here. Vivarium’s archives take a bow. The great lie of Open Source Governance is that it remains true to its origins: The code is all there for anyone to read and understand. Sure! Take it! But now we come to the end of my thesis, the truth that nobody really wants to see: Very few of the actual changes make it through in their original form; the system is its own input, and it decides what it sees. If the new Constitution contains most of what was supposed to come out—well, job done, policy victory, all that. If not, well, the State Machine is an ouroboros infinitely smarter than those who think they control it, and it moves in mysterious ways. Calling this thing an algorithm is like pointing at the sky and the sea and the forests and calling it Nature; it might pass muster for sixth-grade textbooks and sophomore flirting, but look close and you see systems-of-systems with no definite end and no beginning, with a whole lot of humans meddling with it under some grand illusion of being in control.
Now you know why my thesis supervisor looks at me with pity when he drops by the apartment. I think he’s just waiting for me to give up.
“Have you considered something else?” comes the soft refrain. “You know, we all see you’re passionate about this, but sometimes, focus means you narrow the scope of your inquiry.”
“I’m not trying to explain the State Machine,” I protest. “Just how its history shaped it.”
“To describe the history of the functions of an object is to describe the object itself. Several times over. If I wrote a history on guns, would it not at some point have to describe what a gun is, how it works, and how that changed over time?”
To this, of course, I have no answer.
“How are you dealing?”
I know the question isn’t about the thesis. “I’ll be all right.”
“Are you talking to someone about it?”
“I don’t really have time.”
“I’ll approve an extension, take some time off, rethink your scope. And call me. Or the support line, if you don’t feel comfortable talking to me.”
The mental health support line feeds into the State Machine. I know it, they know it, we all know it. I’m analyzing a system that is, in turn, analyzing me. But then again, isn’t every relationship the same thing? Two systems locked in mutual analysis?
“I’ll think about it,” I say. “Thanks.”
Friday rolls around. The newsfeed is doing the run-up to a new Constitution. No texts, no fanfare, just a notice that the public test server is now live. The protest is fading out, I think: Everyone’s just waiting, on the streets and in the shops, to see what the State Machine will say. And I can think.
How did we arrive at what we presently call the State Machine? When did we go from code and academia and failing nations to the all-encompassing, all-knowing, responsive automated government that runs our cities today? The one that can simultaneously understand the changing needs of its citizens, compile the Dynamic Constitution every week, and still spare time to hand out flowers to depressed students at their doorstep?
What, in short, is the nature and structure of God?
That’s the big question. The one I now wish I hadn’t been asinine enough to type out in big letters on my application. Even if I manage, in some convoluted way, to answer this, it’s not like people are going to care. Life will go on. The political divisions will stay; the Reds will raise hell in the Agora about how the rural way of life is being wiped out, the Green Democratic Party will harp endlessly about progress. The State Machine will listen to the protests in its increasingly mysterious ways.
The phone rings. I ignore it, lost in the ritual of thought and my apartment door. At best I’m looking at a long internship in the State Machine Steering Committee proving myself all over again as a programmer, and maybe eventually I’ll be a project lead on some obscure sub-submodule that nobody really thinks is sexy enough, and maybe I’ll become a roving scholar, orbiting the few cities that will take migrant scholars. Ten, 20 years down the line I’ll wander these streets again and wonder what the hell happened to the idealist in me. And the bagel shops will still sell bagels. Students will fall in love, break up, move on.
The phone rings again, more insistent this time.
“This is the Disciplinary Committee,” says the most neutral voice I’ve ever heard in my life. “We’re afraid we have some bad news.”
I know I heard the rest, but I can’t recall the words, only the gist. I was being asked to leave. I was unstable, it said softly, a danger to myself and others. It would make sure I was well cared for. At some point my adviser connects to the same line. I ask what happens to my research. They evade, telling me my friends are worried, telling me I need counseling, therapy. I remember breaking down; I remember, with equal clarity, not breaking down, but going outside, the cold biting my bare feet, and hurling the phone at the first emoji robot that turns my way, and screaming at it as it topples.
The white van arrives later that evening. And just before the sirens stop outside my door, one last message arrives. It’s from the State Machine.
HIGH SENSITIVITY TO INITIAL CONDITIONS, it says at the top, in English.
It’s one of my sequence diagrams.
No, no it’s not.
It’s a diagram of a system; my style, but not my work; it’s sketched out to a level of detail I could never achieve. A society described as a system. I see names I recognize. I see Martin Wong; I see my thesis adviser; I see all the faculty, the students I’ve interacted with, the woman who runs the bagel shop. I see incidents marked with the symbol of the State Machine itself. Interventions: a robot handing me a flower. A small discount on a morning purchase. An offer from another university. And I see her, and Adam, like a spiderweb, pulling me back; and myself, right at the center, every interaction between us an update to an unstable code base, and me eroding every step of the way, from first savage break, right until now.
And underneath it, in neat Rosetta code:
Outside, the etteriya tree is finally upright, its white flowers strewn everywhere: across vehicles, across the visors of the medical police converging on my position, across the curious students watching in fear and curiosity and panic. The scent is beautiful, and standing out here, shivering, feeling the sharpness of the wind on my face, I realize the futility of my task.
First came the idea of the robot, the unfeeling, enduring slave. And then, if the fiction is to be believed, came the slave rebellion.
What they missed out on was the robot that would love us, would care for us, would understand us so perfectly, when nobody else could.
Read a response essay by S.B. Divya, an expert on machine learning.
A.I. Policy Futures is a joint project of the Center for Science and the Imagination at Arizona State University and the Open Technology Institute at New America, and is supported by the William and Flora Hewlett Foundation and Google.
More From Future Tense Fiction
“Affordances,” by Cory Doctorow
“A Priest, a Rabbi, and a Robot Walk Into a Bar,” by Andrew Dana Hudson
“Actually Naneen,” by Malka Older
“The Truth Is All There Is,” by Emily Parker
“It Came From Cruden Farm,” by Max Barry
“Paciente Cero,” by Juan Villoro
“Scar Tissue,” by Tobias S. Buckell
“The Last of the Goggled Barskys,” by Joey Siara
“Legal Salvage,” by Holli Mintzer
“How to Pay Reparations: A Documentary,” by Tochi Onyebuchi
And read 14 more Future Tense Fiction tales in our anthology, Future Tense Fiction: Stories of Tomorrow.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.