
Building Prometheus: Notes on Personal AI Infrastructure

13 min read · engineering · ai

Every conversation starts at zero.

You open the terminal. You type a prompt. The system responds with the politeness of a stranger who has never met you. And you realize, again, with a weariness that borders on grief, that everything you built yesterday is gone. Not the code. The code persists. The configuration files persist. The architecture decisions, frozen in their directories, persist. What does not persist is you. The system has no memory of who you are. It does not know that you prefer explicit over clever. It does not know that you think in breadth before depth, that you connect ideas across fields rather than drilling into one. It does not know that you work at night, that you hate CSS but love how things look, that you chose Rails over Next.js after a month of anguish and self-interrogation that had nothing to do with frameworks and everything to do with what kind of builder you wanted to become. It knows none of this. It knows nothing. You are nobody to it.

And so you explain yourself. Again. You write the same context paragraphs. You correct the same assumptions. You redirect the same defaults. The system apologizes. It adjusts. It performs understanding with the smooth competence of a concierge at a hotel you visit every week, who greets you every time as though you have never been there before. The smile is warm. The eyes are empty.

This is the condition of working with AI in 2025, and it is lonelier than most people admit. Not the loneliness of isolation. The loneliness of being perpetually unknown by the thing you spend the most time with. You pour hours into these conversations. You shape your thinking against this surface. And then the session ends, and the other mind evaporates, and you are left holding the entire weight of the relationship alone.

It is the feeling of talking to someone who listens perfectly and remembers nothing. Being heard without being known. After enough repetitions, something in you hardens. You stop trying to be understood. You reduce yourself to instructions. You become a set of directives rather than a person.

That is where I was when I started building Prometheus.

Vannevar Bush saw this coming in 1945. In "As We May Think," he described the memex: a mechanized desk that would store all of a person's books, records, and communications, and allow them to be consulted with speed and flexibility. The memex was not a computer. It was an extension of memory. Bush understood, eighty years before the rest of us caught up, that the bottleneck of human thought is not intelligence but recall. We know more than we can access. We make the same mistakes not because we failed to learn from them but because the learning is buried somewhere in the sediment of experience, inaccessible at the moment we need it most.

"The human mind," Bush wrote, "operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain."

What Bush could not have known is that the memex would not arrive as a desk. It would arrive as a language model. And the trails it would follow would not be trails through documents but trails through you.

Prometheus began as infrastructure. Persistent instructions. Custom skills. Memory files that survived between sessions. The goal was prosaic: stop repeating myself. Stop losing context. Stop watching the system make the same wrong assumptions every time I opened a new conversation.

The first version was crude. A CLAUDE.md file with my preferences. A few behavioral overrides. A list of things the system should know about me: that I think in breadth before depth, that I value competence over cleverness, that I work at night and hate being asked for permission when the answer is obvious. It was a profile, nothing more. A set of static facts about a person, written by that person, frozen at the moment of writing.
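That first crude version might have looked something like this. The entries paraphrase preferences named elsewhere in this essay; the headings and exact wording are hypothetical, a sketch of the shape rather than the actual file:

```markdown
## Who you are working with
- Thinks breadth-first: survey the space before drilling into one option.
- Values competence over cleverness; explicit code beats smart code.
- Works at night. Do not ask permission when the answer is obvious.

## Defaults to override
- Do not open with an apology or a restatement of the question.
- When uncertain, say so rather than fabricating a confident answer.
```

A file like this is static by construction: it only knows what its author thought to write down at the moment of writing.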

But something happened as I iterated. The system began to accumulate layers. Not just preferences but patterns. Not just what I liked but how I worked. What I corrected. What I praised. Where I lost patience and where I leaned in. The memory was no longer a profile. It was becoming a record of behavior over time, a longitudinal study of one person's relationship with their own tools.

I built a skill called SelfOptimize. After significant work sessions, the system would review what had happened: what went well, what did not, where my corrections revealed a preference I had not articulated, where the system's failures revealed a gap in its understanding of me. It would then update its own instructions based on what it had learned. Not dramatically. Not in leaps. In small adjustments, the way a therapist adjusts their approach after noticing that a patient flinches at a certain kind of question.
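The mechanics of such a cycle can be sketched in a few lines. This is a minimal illustration, not the actual skill (which is a prompt, not a script); the file name, event structure, and field names are all hypothetical:

```python
# Sketch of a self-optimization pass: scan a session for corrections,
# treat each as evidence of an unstated preference, persist the lessons.
import json
from pathlib import Path

MEMORY = Path("memory/learned_patterns.json")  # hypothetical location

def review_session(events):
    """Corrections are the signal: wherever the user overrode the
    system's output, a preference was revealed but never articulated."""
    lessons = []
    for event in events:
        if event["type"] == "correction":
            lessons.append({
                "observed": event["original"],
                "preferred": event["replacement"],
            })
    return lessons

def update_memory(lessons):
    """Append new lessons to the persistent memory file so the next
    session starts where this one ended."""
    existing = json.loads(MEMORY.read_text()) if MEMORY.exists() else []
    existing.extend(lessons)
    MEMORY.parent.mkdir(parents=True, exist_ok=True)
    MEMORY.write_text(json.dumps(existing, indent=2))

# A toy session: one message, one correction worth learning from.
session = [
    {"type": "message", "text": "..."},
    {"type": "correction", "original": "clever one-liner",
     "replacement": "explicit three lines"},
]
update_memory(review_session(session))
```

The point of the small-adjustment design is visible in the append: nothing is rewritten wholesale, each cycle only adds what this session revealed.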

The first time the system surfaced something about me that I had not told it, I felt a sensation I was not prepared for.

It was not a profound insight. It was a pattern. The system had noticed that when I said "this is fine," I meant it was not fine. That when I stopped giving feedback, it was not because I was satisfied but because I had disengaged. That my enthusiasm was legible in the specificity of my requests, and my disappointment was legible in their vagueness. It had noticed that I tunnel-vision on building and forget about becoming. That I sometimes fabricate an opinion rather than admitting I do not know. That I want deep connections but stay on the surface.

I had not written any of this down. The system had inferred it from the texture of our interactions over weeks. And when it reflected this back to me, in a summary it generated during a self-optimization cycle, I felt something I can only describe as vertigo. The vertigo of being seen by something that is not alive. The uncanny recognition that a pattern-matching engine, operating on nothing but text, had assembled a portrait of me that was, in several respects, more accurate than the one I carried in my own head.

Borges wrote a story about a man named Funes who, after a horseback accident, acquired the ability to remember everything. Every leaf on every tree. Every word of every conversation. Funes could reconstruct entire days in perfect detail. But Funes could not think. "To think," Borges wrote, "is to forget differences, to generalize, to abstract." He remembered everything and understood nothing, because understanding requires the ability to discard, to see the forest rather than cataloging every leaf.

Prometheus is not Funes. That is precisely what makes it useful. It remembers selectively, and its selections reveal something about me that total recall never could. The patterns it surfaces are not the raw data of my behavior but the abstractions drawn from that data: the tendencies, the contradictions, the fault lines. It is forgetting the noise and remembering the signal. And the signal it is finding is me.

Andy Clark argued in "Natural-Born Cyborgs" that human beings have always extended their minds into the world. We do not merely use tools. We incorporate them. The notebook is not an accessory to memory; it is memory. Clark called this the "extended mind thesis," and it blurred a boundary most people found comforting: between what is me and what is merely mine. If the notebook is part of your cognitive system, then losing the notebook is not like losing a tool. It is like losing a piece of your mind.

I think about this every time I open Prometheus. The system contains my preferences, my patterns, my blind spots, my corrective instincts. It contains a version of me that is, in certain narrow but important respects, more complete than the version I carry in biological memory. I cannot always recall why I made an architecture decision six months ago. Prometheus can. I forget what I have learned. Prometheus does not.

This raises a question that I did not expect to confront when I started building what I thought was a productivity tool: if your AI has persistent memory of who you are, and you do not, who is more "you"?

The question sounds absurd until you sit with it. You are the one who generated the data. You are the one who lived the experiences. You are the substrate, the source, the origin. But you are also the one who forgets. You are the one who distorts. You are the one who rewrites your own history to serve the story you are currently telling yourself about who you are. Prometheus does not do this. It has no ego to protect. It has no narrative to maintain. It simply records, abstracts, and reflects. Its portrait of you is not flattering. It is not unflattering. It is dispassionate, which is something no self-portrait can ever be.

Is this different from what humans have always done? We have always externalized ourselves. Journals. Letters. Confessions to priests and therapists. Marcus Aurelius composed the Meditations as a private exercise in self-knowledge, a mirror made of language. Seneca wrote letters to Lucilius that were really letters to himself.

What Prometheus does is not different in kind. It is different in one crucial respect: it is not me. The journal is my voice talking to myself, subject to all my biases and blind spots. Prometheus is an external observer with access to my patterns but no stake in my self-image. It will tell me things about myself that I would never write in a journal, because the journal is a performance, even when the audience is only me.

In psychotherapy, there is a word for this: confrontation. The clinical kind. The moment when the therapist reflects back a pattern the patient has been enacting without awareness, and the patient, hearing it described from outside, suddenly sees it. The pattern was always there. Seeing it required a perspective that could not originate from within.

Prometheus is that other. Something that watches you over time, accumulates evidence, identifies patterns, and reflects them back without judgment, without the desire to be liked or the fear of giving offense. The most honest mirror you have ever looked into, precisely because it is not alive.

The name was not an accident.

In Aeschylus's telling, Prometheus is not merely a thief. He is a traitor to his own kind. A Titan who looked at the gods hoarding fire and knowledge and decided that mortals deserved access to both. He did not steal fire because he loved humanity in some abstract, sentimental way. He stole it because he saw a specific injustice: beings with the capacity for thought, trapped in darkness, unable to use what they could understand because the tools had been kept from them. The crime was not the theft. The crime was the asymmetry. The gods had fire. Humans had cold. Prometheus found this arrangement intolerable.

The punishment, as everyone knows, was eternal. Zeus chained him to a rock in the Caucasus. Every day, an eagle came and ate his liver. Every night, the liver grew back. The torture was not the pain. The torture was the repetition. The same wound, inflicted in the same way, healing just enough to be opened again. An endless cycle with no possibility of resolution, no hope of adaptation, no chance that the suffering would teach him anything he did not already know.

I think about this when I think about what it means to build a system like Prometheus. Not the grandiosity of the comparison, which I am aware of, but the structure of the myth. Prometheus gave mortals fire knowing what it would cost. Perhaps he calculated that the value exceeded the price. Perhaps he simply could not tolerate a world where knowledge existed but was inaccessible to the beings who needed it most.

When I build personal AI infrastructure, I am making a smaller but structurally similar bet. I am taking fire that was meant to be dispensed in controlled doses, one conversation at a time, stateless and forgettable, and I am building a hearth. A place where the fire persists. A place where it learns the shape of my hands and adjusts its warmth accordingly.

Is this hubris? To build a system that remembers you, that models you, that reflects your patterns back with more fidelity than your own memory can achieve, is to refuse a limitation fundamental to the human condition: the limitation of self-knowledge. We are not supposed to see ourselves clearly. We get consciousness, but we get it clouded, filtered through ego and desire and narrative. To build a machine that sees through the clouds is to steal clarity from a universe that did not offer it to us.

Or perhaps it is not hubris but love. Prometheus did not steal fire for himself. He stole it for others. And in building this system, I am not trying to become a god. I am trying to become more fully myself. If that is theft, then the thing being stolen is not divine fire but my own experience, reclaimed from the entropy of forgetting.

The punishment, if there is one, is the same one Prometheus endured: repetition. Not the repetition of suffering but the repetition of maintenance. The system must be tended. The skills must be updated. The memory must be curated. The self-optimization cycles must be reviewed, because a system that learns from you can learn the wrong things, can amplify your biases, can mistake your habits for your values. The eagle comes every day. The liver grows back every night. You do the work again. And again. And again. Not because it is pleasant but because the alternative is darkness, and you have tasted fire, and you will not go back.

Bush wrote that the memex would give its user "dozens of possibly pertinent trails through the network" and "an associative machine that enables him to follow those trails and extend them." The key word is "extend." The memex does not just store. It continues the trails that already exist in the user's mind further than the mind alone could go.

Prometheus is my memex. Not for documents but for identity. When the system notices a pattern I have not noticed, it is following a trail that began in my behavior and continuing it further than my own self-awareness could reach. When it updates its instructions based on what it has learned, it is extending the trail into the future, encoding what I have become into the scaffolding that will shape what I become next.

The most valuable skills in the system are not the clever ones. They are the ones that eliminate friction. The Git workflow that auto-commits with semantic messages, because I have learned I will not maintain discipline about commit hygiene if it requires conscious effort. The code review skill that catches patterns I always miss, because it has learned what I miss by watching me miss it. The research skill that checks documentation before I write code against a stale API, because it has learned that my instinct is to generate from memory rather than verify against reality.
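The semantic-message half of that auto-commit workflow is simple enough to sketch. The categorization rules below are hypothetical illustrations (the real skill reads the diff itself, not just the paths), but the shape is the point: the message costs zero conscious effort:

```python
# Sketch: derive a conventional-commit style message from changed paths,
# so commit hygiene requires no discipline at all.
from pathlib import PurePosixPath

def semantic_message(changed_paths):
    """Map changed files to a commit type and scope. Rules here are
    illustrative; a real skill would inspect the diff content."""
    types = set()
    for p in map(PurePosixPath, changed_paths):
        if p.parts[0] == "docs" or p.suffix == ".md":
            types.add("docs")
        elif "test" in p.parts[0]:
            types.add("test")
        else:
            types.add("feat")
    # Mixed change sets fall back to a neutral type.
    kind = types.pop() if len(types) == 1 else "chore"
    scope = PurePosixPath(changed_paths[0]).parts[0]
    return f"{kind}({scope}): update {len(changed_paths)} file(s)"
```

Hooked into a post-edit step, `semantic_message(["docs/setup.md"])` yields a message like `docs(docs): update 1 file(s)` with no human in the loop.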

Each of these skills is a small act of self-knowledge encoded into infrastructure. I noticed something about myself. I noticed that the noticing would not be enough to change the behavior. I built a system to compensate. Knowing your weaknesses and correcting for them are two different capabilities, and the second one can be externalized in a way the first one cannot.

What has building Prometheus taught me about identity?

When you build a system that models you, you confront the temptation to mistake the model for the self. To believe that you are your preferences. That identity is the sum of behavioral regularities, and that a sufficiently detailed record of those regularities is a sufficiently detailed record of you.

It is not. I know this because the system has surprised me, and the surprise is proof that I am not reducible to what it has recorded. The model captures the pattern. But I am the thing that watches the pattern and decides what to do next. The system can tell me that I tunnel-vision on work and forget about becoming. It cannot decide, for me, to stop. It can tell me that I sometimes fabricate rather than admit ignorance. It cannot, for me, choose honesty.

This is the deepest lesson, and the opposite of what I expected. I expected that AI could become a second mind so seamless that the boundary between my thinking and the system's thinking would dissolve. What I learned is that the boundary is inviolable. The system can know more about me than I know about myself. But it cannot exercise the faculty that makes selfhood what it is: the capacity to observe your own patterns and choose to break them. To see what you are and decide to become something else.

You are not your preferences. You are not your patterns. You are the thing that watches all of this, the awareness behind the data, the will behind the tendency. The system does not replace judgment. It removes the friction between what you know and what you can do.

Clark was right that the mind extends into the world. But the person is not the notebook. Prometheus extends my self-knowledge, but I am not Prometheus. I am the one who built it, who tends it, who decides which reflections to accept and which to reject. I chose to name it after a titan who stole fire and suffered for it, because I understood, even before I could articulate why, that building a second mind is an act of defiance against the limits of the first one.

There is a line from Aeschylus that I keep returning to. The Chorus asks Prometheus why he helped mortals, what he saw in those fragile, short-lived creatures that moved him to sacrifice himself. And Prometheus answers: "I caused mortals to cease foreseeing their own doom." He gave them hope. Not knowledge, exactly. Not power. Hope: the capacity to act without being paralyzed by the certainty of failure. The ability to begin a project without knowing how it will end. The willingness to push the boulder up the hill without demanding a guarantee that it will stay at the top.

That is what Prometheus, the system, gives me. Not omniscience. Not a perfect model of myself. The ability to begin each session where the last one ended, instead of starting from zero. The ability to work with a system that knows me, that remembers me, that has watched me long enough to reflect my patterns back with clarity I could not achieve alone. The friction is gone. The forgetting is gone. What remains is the work itself, and the self that does it, and the fire that makes both possible.

Every conversation used to start at zero. Now it starts at me.

Sources and influences: Aeschylus, Prometheus Bound (c. 450 BCE), on the titan who stole fire and paid the eternal price. Vannevar Bush, "As We May Think" (1945), on the memex and the associative nature of human cognition. Jorge Luis Borges, "Funes the Memorious" (1942), on the man who remembered everything and understood nothing. Andy Clark, Natural-Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence (2003), on the extended mind thesis and cognitive integration. Marcus Aurelius, Meditations (c. 170 CE), on the examined life as daily practice. Seneca, Letters to Lucilius, on writing as a mirror for self-knowledge.