---
title: The Architecture of Autonomy (§Preface & §Intro)
version: 0.93
date: 2025-07-30
status: Public draft for comments; edit complete
tags: taoa
robots: noindex, nofollow
---

## **Preface: On Coercion, Systems, and Sovereignty**

I'm writing this during a quiet moment in Washington, D.C., just days before July 4th — an anniversary meant to honor freedom, self-determination, and the fragile ideal of rule by consent. Yet I find myself not in a celebratory mood. I've just come from a digital policy summit where I was invited to speak about decentralized digital identity, a field I've helped pioneer. But what I see today in our digital systems, in our politics, and in our institutions fills me less with pride and more with concern.

Around the world, coercion is on the rise. Surveillance tightens. Digital systems close in. Authoritarian tendencies are no longer the exception; they are metastasizing into the default. Here in the United States, a nation that once enshrined dissent, privacy, and the right to leave as civic virtues, our own commitment to autonomy is fraying.

Support for individual autonomy is why I've spent the past three decades working at the intersection of law, identity, and cryptography: helping build foundational internet infrastructure used by billions (including TLS 1.0 and the W3C DID standard for decentralized identity) and establishing the Ten Principles of Self-Sovereign Identity. These efforts were born from a belief that technology could protect autonomy, that it could be a shield against coercion rather than a conduit for it.

Today, I'm developing the [Gordian Stack](https://developer.blockchaincommons.com/): tools for developers that demonstrate privacy without secrecy, verification without surveillance, and exit without erasure. I'm not just building technical infrastructure in isolation. I'm actively working with human-scale communities, from creative collectives and advocacy organizations to open source security teams, exploring how coordination tools can enable democratic participation without requiring platform intermediation.

Yet watching how this infrastructure has evolved has forced a reckoning. I promised sovereignty, but I didn't know how slippery [the slope of compromise would be](https://www.blockchaincommons.com/musings/musings-ssi-bankruptcy/). These systems were built with the best intentions: to make the web more private and authentication more secure. But they have also created new vulnerabilities we didn't anticipate.

> "I am, at heart, a systems thinker."

My dual perspective as both technical architect and community practitioner gives me standing to diagnose how systems I helped build have been weaponized — and to prove that alternatives that preserve human dignity remain not just possible, but practical.

These are not partisan or culture-war hypotheticals. Some are concerned about censorship or debanking, others about losing access to health records, education, or community. What unites these concerns is power without accountability: systems that revoke without warning, without explanation, and without appeal. When no one can be heard, everyone is at risk — regardless of which flag they wave.

Nor is this an academic observation. It's a practitioner's responsibility to understand how tools meant to expand freedom have sometimes constrained it, and to help chart a better path forward.
What I see is not a flaw in this or that system but a deeper, systemic problem: an inversion of our goals. Something fundamental in our legal, technical, and institutional architectures has turned against its original purpose.

What disturbs me most is how identity, and the threat of its loss, has become a coercive vector of control. In India, the Aadhaar biometric system has become mandatory for essential services, leaving millions without basic rights. Syrian refugees find their digital identities frozen in time, unable to update credentials from a government that may no longer exist.

This threat of statelessness is also used as a political cudgel. Ukrainian and Russian expatriates cannot renew passports without returning home to face conscription, leaving them documented only by expiring papers. Lithuanian-born children of Russian parents are denied birthright citizenship, trapped between nations that won't claim them.

These people haven't just lost access to services; they've lost the ability to prove they exist in a world that demands digital verification for everything from banking to border crossings. As identity becomes digital, statelessness becomes invisibility.

The same systems that can erase identity can also weaponize it through financial exclusion. Canadian truckers saw their bank accounts frozen for protesting. Nicaraguan NGOs critical of the government found their banking access revoked without trial. Russian dissidents in exile watched their accounts closed and their funds seized. The system in China is even more insidious: social credit pilots blend financial history with social behavior, restricting "untrustworthy" individuals from premium travel, certain jobs, or their children's school admissions while publicizing their names to create shame.

These are not criminal penalties imposed by courts; they are infrastructural punishments executed by platforms and payment rails, often at governmental request but without governmental accountability. When money becomes digital, dissent becomes expensive — or impossible.

Mechanisms that define us and our relationships before we define ourselves are just as troubling. They assign identity without participation and deny us the power to refuse. When we cannot contest how we are seen, we are no longer treated as citizens, but as data points: fixed, ranked, and acted upon.

An even deeper harm extends beyond individuals to groups. Platforms have fundamentally altered how we organize, collaborate, and create together. Where coordination was once direct and mutual, it now requires algorithmic permission. What we've lost isn't just individual rights, but the very infrastructure of spontaneous human cooperation.

As I've written [elsewhere](https://www.blockchaincommons.com/musings/musings-ssi-bankruptcy/), I now believe that self-sovereign identity — as a movement, as an industry — may be losing its way. Our systems grow more complex, but our commitments grow more hollow. We promise sovereignty while building dependency. We invoke consent while stripping it of substance. We talk of decentralization, yet reconstruct the same asymmetries of power through new infrastructure.

This series of articles is an attempt to reckon with that contradiction. It's not a manifesto, nor a memoir, nor a technical roadmap. It is, instead, an effort to trace one edge of a much larger problem: how digital systems — intentionally or not — are inverting the legal structures once designed to guard against coercion.
I focus especially on property law, contract, enforcement, exit, and identity, not because they are the only vectors of freedom, but because they show clearly how the inversion works. Where once the law constrained arbitrary power, digital architecture now encodes it. Where once legal rights created room for dissent and refusal, digital platforms now optimize for dependency and extraction.

This is not just a technical failure. It is a moral one. When systems revoke without notice, extract without consent, enforce without recourse, and define without reply, they do more than violate design principles. They dismantle the conditions of freedom itself. What's lost is not just access or control, but the capacity to dissent, to depart, to be heard — ***to be human***.

But I don't believe this future is inevitable. I believe it is a design problem — and therefore, a solvable one.

The articles that follow begin with critique, but they do not end there. I explore how cryptographic secrets, like private keys, might restore real digital possession; how self-sovereign identity, if grounded in its original principles, might return meaning to consent; how agency law could expand our legal toolkit to counter coercion in a world of programmable agents and opaque infrastructure.

This is only a part of a larger struggle, one that includes institutional reform, cultural change, economic redesign, and moral renewal. But this is the part I know how to contribute to.

My hope is not just that this work informs, but that it inspires: not to performative outrage or shallow optimism, but to the deeper intellectual work of understanding, designing, and governing systems that resist coercion rather than reproduce it.

If we do not reclaim our basic human rights — through law, through code, and through culture — we will not be governed as persons, but as digital projections. Not as selves, but as shadows.

The future is not yet written. The systems we inhabit are still malleable. And if we act with care, clarity, and courage, we may yet turn the inversion around.

— *Christopher Allen*

> _(STATUS: 0.93 2025-07-30: Edit complete.)_

## INTRODUCTION: ***Law, Code, and the Design of Power***

Freedom has never been a default. It must be constructed — through laws, institutions, practices, and norms that constrain coercion and hold power accountable. In the physical world, we have developed legal architectures to do just that, including property law, contract law, due process, and the right of exit. These are not perfect tools, but they are the mechanisms by which liberal societies have sought to preserve space for consent, dissent, and self-determination.

In the digital world, that architecture is unraveling.

This series is about a profound and accelerating transformation: the **systematic inversion** of the anti-coercive logic of law by digital systems. Where law once constrained power, code now often embeds it. Where rights once enabled refusal, platforms now engineer dependence. Where identities were once inalienable and self-defined, they are now fragmented, inferred, and commodified.

This inversion is not the product of malice or mistake, but of design — or more precisely, misaligned design rooted in perverse incentives. It emerges from architectures that optimize for extraction over empowerment, dependence over autonomy, and scale over human dignity — inevitably hollowing out the very freedoms they claim to serve.
In the early days of the web, we designed protocols like TLS to democratize secure communications, making encryption available to anyone with a server, not just large corporations. The architecture we built enabled not only banks and major services, but also small merchants, independent blogs, and emerging voices to communicate securely without seeking permission.

But over time, that democratic architecture was subverted. Certificate authorities consolidated. Browser vendors became gatekeepers. Platform monopolies emerged. What we designed as infrastructure for autonomy was gradually enclosed — not through technical failure, but through economic and political capture.

Infrastructure plus money equals power — and digital platforms have captured both. They've replaced shared infrastructure with closed ecosystems, transforming coordination from a public good into a private service. They simulate ownership for us while retaining revocation powers. Contracts offer consent as ritual, not reality. Enforcement is privatized, invisible, and often irrevocable.

We've seen a father lose a decade of family photos when Google's AI scanner mistakenly flagged medical images he'd sent to his doctor. We've seen a musician lose access to years of work when SoundCloud changed its terms overnight. We've seen small business owners watch their PayPal accounts frozen — and their livelihoods with them — based on opaque "risk" assessments.

But the harm extends beyond individual loss to collective capacity: communities are scattered when platforms change algorithms, creative collaborations are destroyed by policy shifts, and organizing efforts are silenced by opaque moderation.

These are failures of infrastructure, where memories, creative work, livelihoods, and collaborative discovery vanish not by law, but by invisible systems with no recourse. Systems that revoke without warning, without explanation, and without appeal demonstrate power without accountability. When no one can be heard, everyone is at risk.

These systems do not merely fail to protect autonomy. They systematically invert the very legal functions meant to resist coercion. Possession becomes conditional access. Contract becomes adhesion. Enforcement becomes algorithmic judgment. Exit becomes erasure.

The result is not just a loss of control, but a loss of agency: legal, political, and personal. Where once rights guaranteed room for refusal, we now face systems that presume compliance by default and render resistance invisible.

Perhaps most alarmingly, some of these systems do not just govern what we do — they seek to define who we are. Identity, once grounded in continuity and consent, is increasingly inferred, fragmented, and assigned without input or appeal. This is not just misrepresentation; it is a denial of standing: the ability to object, revise, and be recognized as a self.

Emerging decentralized identity protocols are facing these issues as well. Some proposals reintroduce "phone home" behaviors — where resolving an identifier, verifying a credential, or checking revocation status requires contacting a centralized endpoint. This is just one of many patterns that undermine privacy, enable surveillance, and risk recentralizing control, while quietly rebuilding the dependencies decentralization set out to dismantle.
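To make the "phone home" pattern concrete, here is a minimal sketch in Python. It is illustrative only: the endpoint, field names, and functions are hypothetical and not drawn from any particular specification. The first check cannot complete without calling an endpoint the issuer controls, so every verification event is visible to the issuer; the second consults a status list the issuer publishes in bulk, which a verifier can cache and check locally, so no individual presentation is revealed.

```python
# Hypothetical sketch of a "phone home" revocation check vs. a local one.
# Names, endpoint, and response fields are made up for illustration.

import json
import urllib.request

ISSUER_STATUS_ENDPOINT = "https://issuer.example/status"  # hypothetical centralized endpoint


def check_revocation_phone_home(credential_id: str) -> bool:
    """Ask the issuer whether a credential is still valid.

    Side effect: the issuer now learns that this credential was presented,
    roughly when, and (via request metadata) to whom.
    """
    url = f"{ISSUER_STATUS_ENDPOINT}?credential={credential_id}"
    with urllib.request.urlopen(url) as response:  # the "phone home"
        status = json.load(response)
    return not status.get("revoked", False)


def check_revocation_local(credential_index: int, status_bitstring: bytes) -> bool:
    """Consult a locally cached, bulk-published status list instead.

    The verifier fetches the whole list periodically, so the issuer never
    observes which individual credential is being checked.
    """
    byte_index, bit_index = divmod(credential_index, 8)
    revoked = bool(status_bitstring[byte_index] & (1 << bit_index))
    return not revoked
```

The difference is architectural, not cosmetic: in the first design the issuer can observe, throttle, or silently deny verification; in the second it cannot.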
Legal protections against coercion form an interlocking architecture: identity grounds selfhood in systems that would otherwise overwrite it; property law grants independence; contract law ensures voluntary agreement; due process protects against arbitrary enforcement; visibility makes power challengeable; and the right of exit provides leverage. When these safeguards are inverted, freedom doesn't just weaken. It unravels.

### Mapping the Argument

This series begins by diagnosing the problems with digital identity, assets, and other systems, then explores possible paths toward new systems that better resist coercion.

* **Section 1**, *Shield to Snare*, traces six key mechanisms through which legal protections were designed to resist coercion — possession, contract, enforcement, power visibility, exit, and identity — and shows how their digital analogs systematically invert that function, embedding power and engineering dependence rather than guarding against them.

The sections that follow examine how we might recover — or reimagine — autonomy within digital systems:

* **Section 2**, *Possession Without Power*, examines how cryptographic secrets, particularly private keys, offer one authentic reversal of digital dispossession — yet remain fragile without legal recognition and social infrastructure. Wyoming's pioneering statute on private key disclosure marks a crucial shift: treating secrets not just as information, but as authority. (A brief code sketch of this idea follows the list below.)
* **Section 3**, *Delegable Authority*, recognizes that autonomy cannot be reduced to possession alone. Through agency law, entrustment doctrines, and fiduciary principles, it explores how to structure relationships of dependence that preserve recourse, making platforms answerable to those who rely on them.
* **Section 4**, *Sovereignty, Refusal, and the Architecture of the Self*, returns to the foundations of self-sovereign identity — not as data control, but as the right to refuse a system's definition of us. True sovereignty means retaining standing in the systems that claim to know us, preserving the power to say "that is not me" and have that refusal matter.
* **Section 5**, *Beyond Ownership*, challenges the conflation of freedom with individual property. It explores institutional alternatives — commons, cooperatives, and participatory infrastructures — that resist coercion through collective stewardship rather than isolated control.
* **Section 6**, *Designing for Exit and Resilience*, examines how interoperability and portability can make departure meaningful rather than destructive. Freedom requires not just the right to leave, but the ability to persist through disruption — to maintain identity and relationships across systems.
* **Section 7**, *Governing the Invisible*, confronts algorithmic governance and infrastructural opacity. Whether through instant execution or deliberate friction ("sludge"), uncontestable systems govern without accountability. Making power visible — and challengeable — is a prerequisite to justice.
* **Section 8**, *Building for the Long Arc*, argues that meaningful change requires a dual strategy: building alternatives that serve specific communities while supporting direct challenges to platform power. It explores how technical innovation and political action must work together rather than competing for attention and resources.
* Finally, the **Epilogue** reflects on dignity in system time — how to build institutions that can remember, repair, and renew themselves. It issues an invitation to builders: to create systems worthy of our trust, because we chose to make them worthy.
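As a preview of the idea behind *Possession Without Power*, here is a minimal sketch of possession as authority: a signature proves control of a secret to anyone who checks it, without any platform granting, mediating, or being able to revoke that proof. This is illustrative only — written in Python against the third-party `cryptography` package, with made-up names, and not drawn from the Gordian Stack or any statute discussed later.

```python
# A minimal sketch of "possession as authority" (illustrative names; assumes the
# third-party `cryptography` package, not any tool from this series).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The secret is generated and held locally -- no registry, account, or platform involved.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Signing a statement proves control of the secret to anyone who checks it.
statement = b"I authorize the transfer of this asset."
signature = private_key.sign(statement)

# Verification needs only the public key and the signature; it raises an
# exception if either has been tampered with. No intermediary can approve,
# deny, or silently revoke this act of possession.
public_key.verify(signature, statement)
print("Verified: control of the key, not permission from a platform, is what counts.")
```

The fragility Section 2 examines is equally visible here: lose the secret, or be compelled to disclose it, and the authority it embodies goes with it.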
Throughout, this series insists: **autonomy is not isolation**. It is the right to define our dependencies, contest our representations, and revise the terms of collective life. The goal is not to eliminate intermediaries or governance, but to make them answerable — through law that constrains, code that respects, and culture that remembers.

This breadth — spanning philosophy, law, and technology — reflects the problem itself. While readers need not master every domain (technologists may gravitate toward Sections 2, 6, and 8; legal practitioners toward 3 and 7; community builders toward 5 and 8), what matters is recognizing how these forces interconnect to shape our digital lives. The inversions affect us all.

### Why It Matters

In our reality, three forces shape our freedom: **legal code**, which governs through statutes and rights; **software code**, which governs through architecture and algorithms; and **social code**, which guides how systems are used (and misused) through cultural norms, habits, and expectations. While Lawrence Lessig famously argued that ['code is law,'](https://www.harvardmagazine.com/2000/01/code-is-law-html) the reality is more complex: these forces do not merely coexist. They **interact, reinforce, and sometimes collide**, often in ways we barely perceive. When aligned, they can protect freedom; when misaligned, they accelerate its erosion. To preserve autonomy, we must understand how each operates — and ensure they align in service of freedom, not coercion.

Before we go further, a word of caution: these articles do not idealize the law. Legal systems have long reflected power as much as they've constrained it — from colonial land regimes to coerced labor and selective enforcement. Yet even within that mixed legacy, liberal traditions carved out tools — flawed, partial, but real — that created space for dissent, constraint, and refusal. What concerns me now is that even these modest affordances are being silently eroded — not through lawmaking, but through the architectures of the digital systems we increasingly depend on.

This is not an abstract debate. Our digital systems increasingly define the architecture of our society. As work, identity, money, and governance migrate online, the rules encoded in software become the rules we live under. As the digital world becomes ever more intertwined with the physical one — through smart locks, payment rails, biometric IDs, cloud-controlled infrastructure, and platform-mediated labor — its systemic design defects are no longer theoretical. They are producing real harms in real lives. People are losing access to assets they thought they owned, accounts they rely on to work or communicate, or services that shape their visibility and agency in society — often with no warning, no explanation, and no recourse.

We've seen this story before. In the industrial age, unchecked infrastructure produced monopolies, exploitation, and environmental collapse. It took labor movements, antitrust law, public utilities, and democratic reform to restore balance. Today, we face a similar inflection point, only this time the infrastructure is invisible and the coercion feels like choice.
If we fail to act, we risk entrenching a new form of digital feudalism: a world where rights are simulated but control is absolute; where persons are reduced to data and governance is outsourced to code; where freedom is offered in interface but denied in architecture.

We are losing the brittle, hard-won affordances that once held power to account, tools that (however flawed) made resistance possible: legal scaffolds that kept some powers visible, some processes contestable, some exits intact. Their disappearance is not always visible and rarely dramatic. But it is consequential.

Unless we name that loss, and unless we call out how our systems have been redesigned to silence refusal, obscure power, and close the doors of departure, we will be left with infrastructure that rules without being seen and governs without being challenged.

And if people cannot contest what is enforced upon them — if they cannot take their possessions, revoke their consent, seek remedy, or exit with dignity — then we are not building freedom at all. We are building systems that preempt autonomy before it can act.

If we can reshape these increasingly coercive structures — if we can realign law and code to resist coercion rather than reproduce it — we can instead build systems that support genuine autonomy, shared governance, and human dignity. The tools are in our hands. The challenge is to use them wisely.

### How to Read This Work

This series spans over 45,000 words across eight sections — a significant investment of time and attention. I ask for that investment not because these problems are simple, but because they are foundational. The systems that govern our digital lives were built by people like me, and they can be rebuilt by people like you. But first we must understand precisely how they fail, why they persist, and what alternatives remain possible. This is practitioner's knowledge offered in service of practical change.

### Who This Work Serves

This series speaks to three audiences whose work must align if we are to reclaim digital autonomy:

**For Technologists**: You'll find diagnostic tools for understanding how well-intentioned systems become coercive and frameworks for building alternatives. This is not another call for decentralization, but a roadmap for understanding which architectures preserve dignity and which erode it. Sections 2, 6, and 8 may particularly resonate with the implementation challenges you face.

**For Policymakers and Legal Practitioners**: You'll discover how digital systems systematically evade legal constraints and why traditional regulatory approaches often fail. More importantly, you'll find new approaches — from recognizing cryptographic possession to mandating fiduciary duties for digital platforms. Sections 3 and 7 directly address legal frameworks and governance challenges.

**For Citizens and Digital Rights Advocates**: You'll gain language for articulating why digital systems feel oppressive even when they claim to serve you. This work names the patterns of coercion and points toward alternatives we can demand. The broader arc from diagnosis (Section 1) through collective solutions (Sections 5 and 8) maps a path from understanding to action.

### What This Work Attempts

This is neither a technical manual nor a policy prescription. It is:
1. **A Diagnosis**: Mapping how digital systems invert law's protections against coercion
2. **A Framework**: Providing tools for recognizing and resisting digital domination
3. **A Bridge**: Connecting technical architecture to human dignity through law, code, and culture
4. **An Invitation**: Calling builders to create systems worthy of our trust

What distinguishes this work is its insistence that **technical architecture is never neutral**. It either preserves space for human agency or it forecloses it. By making the inverted patterns of coercion visible, we can begin to demand and build systems that resist domination rather than reproduce it.

The goal is not to eliminate power or governance, but to ensure that when power is exercised — whether through code, law, or culture — it remains **contestable, accountable, and bounded**. This is the essence of anti-coercive design: creating systems where power must answer to those it affects.

This series is one attempt to meet that challenge — not the only path, but a necessary one. It maps how digital systems invert law's protections against coercion, then explores how we might build alternatives through cryptographic tools, legal frameworks, cooperative institutions, and technical architectures that preserve contestability and choice. But mapping is not building. What follows is both diagnosis and invitation: to understand how power operates in digital systems, and to join the work of encoding dignity in the infrastructure we depend on.

Responding to the challenge requires not just analysis but also action. These inversions can be addressed through two complementary approaches: building alternatives for communities that need them now, while also supporting direct challenges to the economic structures that make extraction profitable. Technical innovation without political power remains fragile. Political strategies without technical grounding lack credibility. What follows explores both strategies as interconnected rather than competing approaches to reclaiming digital autonomy.

> *"The future is written in the code we build today. We must write carefully, write with intention, and write together — because the systems we create now will govern the next generation's capacity for freedom."*