---
title: The Architecture of Autonomy (§3. Delegable Authority)
version: 0.93
date: 2025-07-30
status: Public draft for comments ; edit complete
tags: taoa
robots: noindex, nofollow
---

## Section 3: Delegable Authority — Legal Foundations for Relational Autonomy

***Agency, Entrustment, and the Power of Reliance***

In 2014, TaskRabbit transformed overnight. What had been an open marketplace where workers set their own rates and chose their clients became an algorithmically controlled system. Workers who had spent years building specialized businesses — home organizers commanding premium rates, skilled handyworkers with loyal client bases — suddenly found the platform setting their rates and assigning jobs algorithmically. As one worker told researchers: "They took away my ability to run my own business. I went from being an independent contractor to being managed by an app — but without any of the protections of employment."

These workers had invested years building platform-specific capital — reviews, client relationships, operational knowledge — that they couldn't transfer elsewhere. Platform advocates might respond that TaskRabbit needed to standardize to survive in a competitive market, and that workers had accepted terms allowing such changes. But this misses the relational reality. The issue isn't whether TaskRabbit had the legal right to change; it's whether platforms accumulating power over livelihoods should bear any obligation to those who depend on them. This case illustrates the gap between possession and power: workers possessed their accounts but had no authority over the system that governed them.

TaskRabbit's betrayal of its users followed a predictable pattern that extends far beyond gig work. Compare MySpace's collapse between 2005 and 2008. Musicians had built entire careers on the platform. Bands like Arctic Monkeys and Lily Allen launched from MySpace pages that housed their music, fan connections, and promotional materials.
When Facebook's growth forced MySpace to redesign repeatedly, each change broke existing functionality. Musicians lost carefully crafted profiles, fan messages, and promotional tools. The 2013 redesign was so catastrophic that it deleted 12 years of user uploads: 50 million songs and photos vanished overnight.

The pattern repeats across platforms: Tumblr's acquisition by Verizon led to sweeping content purges that destroyed creative communities. Medium's repeated business model pivots left writers watching their audiences evaporate with each platform redesign. Google's shutdown of Google+ erased a decade's worth of collaborative discussions. Vine's shutdown gave creators six months to download years of work before everything disappeared forever.

***Each transition reveals the same structural flaw: when platforms are owned by entities whose interests diverge from those of their users, those users become expendable.*** Ownership models that extract value from user dependency inevitably betray that trust when market conditions change.

But the deepest form of digital dependence operates through economic infrastructure. As explored in Section One, when platforms control banking, payments, or financial access, they govern the conditions under which all other digital rights can be exercised. ***Economic exclusion precedes and enables every other form of digital dispossession.*** Without transactional capacity, communication platforms become unusable, cloud files become inaccessible, and digital identities become worthless.

There are other answers. Around the world, alternative ownership models are proving that digital infrastructure can serve users rather than extracting from them — a topic we will explore in Section Five. But even within existing ownership structures, legal frameworks can constrain platform power and create accountability where market forces fail.

### Real Harms, Absent Remedies

Unfortunately, users are usually treated as **tenants without rights**.
Apple disables an Apple ID, cutting off access to all connected devices. Meta revokes access to social login credentials without explanation. Payment processors freeze accounts based on opaque risk assessments.

Certainly, platform governance often succeeds precisely because of this sort of speed and consistency. Content moderation removes harmful material millions of times daily, fraud detection prevents financial crimes in real time, and algorithm updates improve user experience. **But when these systems fail, the scale of impact matches that efficiency.**

Worse, these lockouts cause cascading damage: lost access to work tools and client relationships, frozen funds and payment histories, inaccessible healthcare portals and medical records, vanished family photos and personal archives, and broken authentication chains that lock users out of dependent services. The pattern for this cascade is consistent: no transparent procedures, no notice, no meaningful appeal, no proportionality. Platforms assert **ownership-like power** without corresponding responsibility.

### The Legal Foundation for Challenging Platform Power

Though cryptographic secrets, as explored in Section Two, provide a foundation for autonomy through direct control, life cannot be reduced to possession alone. Ultimately, there will always be situations where we must share with other systems: delegating, depending, entrusting. When we engage with these systems, they act for us, over us, or on our behalf. When those systems fail, we are too often left without recourse. A platform revokes access, and with it go livelihood, identity, even personhood.

This Section argues that the response lies not only in technical redesign, but in **reclaiming legal frameworks that recognize and constrain power in relationships** — especially when that power is delegated or relied upon. **Agency law**, though limited to cases of explicit delegation, offers a model for bounding authority with fiduciary duty.
Even where there is no formal agency, **reliance and entrustment** can still impose obligations: duties grounded not in contract, but in asymmetry and dependence. Platforms that mediate identity, access, or continuity must not be treated as mere vendors or owners. They are, at minimum, **entrusted stewards**; in some cases, they are **digital fiduciaries**.

> *Autonomy is not the absence of reliance. It is the right to **define, revoke, and contest the terms under which others act in our name**.*

### Agency Law: Delegated Power, Bounded by Duty

**Agency is an old and well-defined legal structure** for granting power to another. Its core elements include consent (the principal must authorize the agent), control (the agent acts under the principal's direction), fiduciary duty (the agent must act with loyalty, care, and accountability to the principal), and revocability (the principal can withdraw power).

Digital analogs of agency appear throughout modern systems: a DAO member delegates vote casting to a proxy holder, a user designates an identity agent or key recovery entity, or a group of users creates a multi-sig setup where a quorum acts on a user's behalf. These actors are best described as **delegated keepers** — they hold authority not through ownership, but through entrusted power.

The legal implication is clear: if someone is acting on your behalf with your consent, **they are bound by fiduciary obligations** under traditional agency law. This means they must log actions, respect boundaries, and allow revocation.

But the promise of agency law extends beyond formal delegation. As legal scholar Jack Balkin argues in his groundbreaking work on ["Information Fiduciaries and the First Amendment"](https://perma.cc/67MG-MCFB), the relationship between platforms and users often resembles traditional fiduciary relationships — marked by vulnerability, dependence, and trust.
When we grant Google access to our emails or Apple access to our health records, we place a special confidence in them that they will not use this information against our interests.

Yet Balkin's framework faces serious challenges identified by legal scholars such as Neil Richards. Critics note that traditional fiduciary principles assume an alignment between fiduciary and beneficiary interests that platform business models contradict: platforms profit from user data in ways that may conflict with user welfare. The scale problem is equally real: fiduciary duties developed for individual professional relationships don't easily govern millions of algorithmic interactions across conflicting legal jurisdictions.

This creates a conceptual challenge for applying agency-derived principles. There's no intentional delegation from users, no mutual understanding of representation, and no direct principal control over platform actions. **Users don't think Facebook is their "agent" — they think it's a service provider.**

**This is precisely why traditional agency law is insufficient.** But this tension points toward practical solutions rather than theoretical deadlock. A number of legal options move _beyond traditional agency_: graduated obligations, including reliance-based obligations; public service obligations; and the recognition of principal authority over digital assets.
### Beyond Traditional Agency: Graduated Obligations

***Instead of requiring platforms to become perfect fiduciaries (impossible given their business models) or abandoning accountability entirely (unacceptable given their power), one option is to develop **graduated obligations** that recognize different levels of dependence and different types of platform functions.***

**One approach is to demand procedural rather than substantive fiduciary duty.** Instead of requiring platforms to optimize for user welfare (which cannot be done at scale given conflicting interests), require them to use fair procedures: transparent decision-making processes, consistent application of stated policies, appeals mechanisms for significant impacts, and disclosure of conflicts of interest.

**Another approach recognizes that platform "loyalty" might mean serving the user ecosystem rather than individual users.** A platform that removes one user's content to protect others from harassment isn't betraying fiduciary duty; it's balancing competing interests within its user base. The obligation is to make such tradeoffs transparently and proportionally, not to avoid them entirely.

Effectively, we need to create "quasi-fiduciary" obligations calibrated to digital realities: transparency about data use, prohibition on manipulation, and accountability for harms — without pretending platforms are traditional fiduciaries or destroying the operational flexibility that makes them valuable.

> *The goal isn't to choose one framework but to match regulatory approaches to platform functions and user dependencies.*

### Beyond Traditional Agency: When Reliance Creates Obligation

Platforms don't just provide services; they mediate relationships, store irreplaceable content, and control access to essential infrastructure.
Users may not intend to create agency relationships, but they do entrust platforms with authority over digital assets, identity, and social connections that can't easily be moved elsewhere. This creates another path toward quasi-fiduciary obligations.

**The legal innovation is recognizing that reliance-based obligations can exist without formal agency.** Just as common carriers bear duties to serve fairly without being agents of their customers, platforms that control essential digital infrastructure can bear obligations proportional to their role — not because users delegated authority, but because platforms accumulated power that affects user welfare in ways traditional service relationships don't contemplate.

This innovation faces difficulties because courts have upheld unilateral digital "agreements," retroactive *terms-of-service* changes, and automated moderation policies. In these cases, law did not constrain the coercion encoded in code, but instead validated it. This complicity reveals a deeper challenge: not simply the displacement of law by code, but their collusion in producing governance regimes with minimal accountability.

Yet legal doctrine does support obligations based on reliance. I've already noted **common carrier** law, which requires businesses offering essential services to serve fairly and predictably. In addition, **bailment** law governs entrusting goods to someone for safekeeping (like valet parking or bank deposit boxes), while **entrustment** doctrines recognize that reliance can create fiduciary or quasi-fiduciary duties.

> *The core claim is that **dependence creates obligation** — not by contract or delegation, but through reliance.*

But where does reliance become legally significant? Not every dependency creates duties. We rely on countless services without special obligations arising.
The key factors should be switching costs (how difficult is exit in practice?), integration depth (how embedded is the service in daily life?), power asymmetry (can users meaningfully negotiate terms?), and societal impact (does the service affect fundamental rights or needs?). A search engine used casually differs from an identity provider controlling access to government services. Apple's and Google's digital wallet services, which store driver's licenses, health records, and authentication credentials, represent the deepest form of infrastructure dependence — controlling access to both digital and physical spaces.

I call actors such as Apple and Google, acting as digital wallet providers, **entrusted stewards**. They operate as critical infrastructure whose power must be bound by duty, even absent consent.

### Beyond Traditional Agency: Public Service Obligations

Digital services that function as essential infrastructure require an even higher level of oversight: public utility-style accountability, regardless of ownership structure. This can be done through a **public service obligation**: when infrastructure serves essential functions, it bears duties proportional to its role in citizens' lives. This doesn't require public ownership; it requires public accountability through legal frameworks that recognize infrastructure dependencies.

Estonia's e-governance system demonstrates how legal frameworks can embed public service obligations from the design stage: citizens retain democratic recourse when policies need changing, privacy protections are embedded by design rather than balanced against advertising revenue, and service quality depends on citizen satisfaction rather than engagement metrics.

Municipal broadband networks like EPB in Chattanooga demonstrate how public service obligations create accountability.
When EPB faced a major outage in 2019, the city-owned utility was required to provide hourly updates, to compensate affected businesses, and to submit to independent technical review — accountability measures that private ISPs typically avoid through terms-of-service disclaimers.

But the model works even with private operators under public service obligations. In Vermont, private ISPs receiving public subsidies must meet "digital equity" requirements: transparent service quality reporting, standardized appeals processes for service disputes, and community representation on network governance boards. These requirements haven't driven ISPs away; they've created more sustainable business models based on community trust rather than captive customers.

"When you're accountable to the community you serve, you make different decisions," explains Tom Stevens, who manages rural broadband for Kingdom Fiber in Vermont. "We invest in customer service and network redundancy because our business depends on community satisfaction, not just subscriber lock-in."

### Beyond Traditional Authority: Wyoming's Principal Authority

Wyoming has an alternative answer, at least for digital identity, and it's one the state has already codified as law. [Wyoming SF0039 (2021)](https://www.wyoleg.gov/Legislation/2021/SF0039) defines personal digital identity as something a person has **principal authority** over — a conceptual breakthrough that emerged from collaborative work between technologists and legal scholars. I was directly involved in making this law: asked by Wyoming legislators to join a subcommittee on digital identity, I helped to recruit legal experts to explore frameworks that would align with the [principles of self-sovereign identity](https://www.lifewithalacrity.com/article/the-path-to-self-soverereign-identity/#ten-principles-of-self-sovereign-identity) that I had articulated.
It was Professor Clare Sullivan of Georgetown University — an Australian legal scholar recognized for her pioneering work on the legal implications of digital identity — who first highlighted to our team the opportunity to formally define digital identity in terms of principal authority.

This framing represents a fundamental shift in how law conceptualizes digital identity. Identity is framed as **relational and delegable**, not as property. Authority flows **from the person**, not from platform ownership or *terms-of-service*. Like an agent in law, identity infrastructure must act with care and fidelity to the user's intent.

Professor Sullivan's insights showed that principal authority creates a more flexible and dignified framework than property rights. Where property can be sold, seized, or abandoned, principal authority remains inherently connected to the person. It can be delegated but not alienated, shared but not surrendered.

***Recognition creates visibility but not accountability. A group can be seen, yet still be governed in silence. Without contestability, recognition is symbolic, not structural.***

The statute remains largely symbolic: no court has yet used it to constrain platform behavior, and no platform has changed practices in response. But even symbolic law can shift discourse. By establishing that identity involves "principal authority" rather than property, Wyoming creates conceptual space for future development that can be paired with enforcement mechanisms.

> *Identity is not property to be owned, but personhood expressed through relations.*

### Legal Frameworks in Practice: Healthcare's Fiduciary Model

Healthcare platforms demonstrate how quasi-fiduciary obligations can work at scale without destroying platform economics. Under HIPAA and state medical privacy laws, telehealth platforms like Teladoc and Epic MyChart operate under strict fiduciary obligations that resemble what we need for other platform relationships.
These platforms must obtain explicit consent before sharing patient data, provide transparent appeals processes for treatment decisions, maintain detailed audit logs of all access to patient records, and allow patients to port their complete medical history when switching providers. Violations carry significant penalties — both financial and reputational.

The economic impact has been positive rather than destructive. Healthcare platforms differentiate themselves on privacy protection and patient control, leading to innovation in user-friendly consent mechanisms and data portability tools. Patient satisfaction scores consistently rank platforms with stronger privacy protections higher than those with weaker controls.

Dr. Sarah Chen, who runs a telehealth practice through multiple platforms, notes: "When platforms have real accountability to patients, they compete on trust rather than lock-in. That benefits everyone — patients get better care, and doctors get better tools."

This success suggests that graduated fiduciary obligations could work for other high-stakes platform relationships without destroying their economic viability. The key is matching the obligation to the actual dependence and potential harm.

### Legal Frameworks in Practice: Credit Unions' Transparency

Credit unions provide another model for how quasi-fiduciary obligations can govern platform relationships. Unlike traditional banks, credit unions operate under legal requirements to serve member interests rather than maximize shareholder returns. Members elect boards, contest decisions through established appeals processes, and receive transparent financial reporting.

This model has scaled successfully. Credit unions serve over 130 million Americans while maintaining higher customer satisfaction scores than traditional banks. The key insight: graduated obligations based on the depth of member relationship and dependence.

Digital financial platforms are beginning to adopt similar approaches.
Chime, a digital banking platform, voluntarily implements credit union-style transparency despite being a for-profit corporation: quarterly impact reports showing how customer fees are used, appeals processes for account restrictions, and member representation on product development decisions.

"When customers can see how their money creates value for the platform, they become partners rather than products," explains Ryan King, Chime's former head of member advocacy. "Transparency costs us some short-term profit but creates much stronger long-term relationships."

These examples demonstrate that quasi-fiduciary obligations can enhance rather than threaten platform economics when implemented thoughtfully.

### Legal Frameworks in Practice: Robinhood's Restrictions

Without legal frameworks like HIPAA, or even self-imposed frameworks such as Chime's adoption of credit union-like transparency, the opportunity for harm increases. Consider a complex case: the 2021 Robinhood trading freeze during the GameStop surge. When clearinghouse collateral requirements spiked, Robinhood restricted purchases to avoid insolvency. This wasn't arbitrary discrimination but systemic risk management responding to genuine liquidity constraints that could have collapsed the platform entirely. Yet users who had been promised "democratized finance" discovered their access was contingent on backend financial plumbing they never knew existed.

**The lesson isn't that Robinhood acted wrongly**, but that platforms operating critical infrastructure owe users transparency about systemic dependencies. Quasi-fiduciary obligations might not have prevented the restrictions, but they could have required clearer disclosure about when and why access might be limited.

### Calibrating Obligations to Dependence: A Graduated Framework

A fiduciary framework need not be all-or-nothing.
Rather than imposing uniform duties across all platforms, obligations should scale with the depth of user dependence and the severity of potential harms. These are not just cost burdens: successful platforms increasingly treat these duties as competitive advantages. Users gravitate toward platforms that provide transparency, appeals processes, and respect for user agency — creating market incentives for accountability. Signal demonstrates how mission-aligned funding models can support accountability at scale, while healthcare platforms show how fiduciary obligations can enhance rather than threaten competitive positioning. The economic model isn't the barrier; it's often the solution.

**True fiduciary status** should apply to platforms controlling core identity infrastructure or other essential infrastructure, or charging premium fees: Apple's and Google's digital wallet services storing driver's licenses, health records, and government credentials; legally enabled wallet services like the emerging EU Digital Identity wallets; government software like Trusted Traveler apps. When platforms mediate access to physical spaces, government services, or essential documents, they function as digital trustees and should bear corresponding obligations.

- Duties: full fiduciary duties with individual accountability.
- Cost: significant but justified by revenue model and societal impact.

**Moderate obligations** could govern large social platforms and services that mediate professional relationships or store irreplaceable user-created content: robust appeals processes, proportional responses to violations, data portability guarantees, and human review of automated decisions with significant impact. Platforms like Facebook, LinkedIn, or YouTube fall into this category — not quite fiduciaries, but more than mere service providers.

- Duties: human review of automated decisions affecting high-value accounts, data portability guarantees, proportional sanctions.
- Cost: manageable with proper implementation.
- Benefit: differentiation in crowded markets and reduced legal liability.

**Minimal obligations** might apply to entertainment or casual-use platforms: basic transparency about major policy changes, reasonable notice before service termination, and simple appeals processes for account suspensions.

- Duties: quarterly transparency reports, basic appeals processes, reasonable notice periods.
- Cost: minimal administrative overhead.
- Benefit: reduced regulatory risk and improved user trust.

This graduated approach recognizes that a social gaming platform poses different risks than a service controlling access to unemployment benefits. The goal is not to burden every service with identical duties, but to ensure accountability matches actual dependence.

### Designing for Delegation and Contestation

Legal frameworks must be **matched by design**. Systems that hold or route authority should log all delegated actions, make delegation and revocation explicit, and allow for **auditable trust**, not blind trust.

The W3C's [Verifiable Credentials standard](https://www.w3.org/TR/vc-data-model-2.0/) shows how delegation can be made explicit and revocable in technical systems. Users can issue credentials to services acting on their behalf, with clear scopes and automatic expiration. This isn't theoretical: the Verifiable Organizations Network in Canada uses these patterns to let nonprofits delegate authority while maintaining accountability. These protocols connect directly to the cryptographic possession principles explored in Section Two, showing how technical and legal frameworks can reinforce each other.

Similarly, the emerging field of "competitive compatibility" — what Cory Doctorow calls "comcom" — recognizes that true delegation requires the ability to switch agents. Technical approaches to exit and interoperability, explored in Section Six, become essential infrastructure for making legal obligations meaningful.
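The delegation pattern described above, with explicit scopes, automatic expiration, and revocability, can be sketched in a few lines of Python. This is a hypothetical, simplified illustration: the class and identifiers are invented for this sketch, and real verifiable credentials add JSON-LD contexts, cryptographic proofs, and status lists for revocation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class DelegationGrant:
    """A hypothetical, simplified delegation record (not the actual
    W3C Verifiable Credentials data model)."""
    principal: str                 # who grants the power
    agent: str                     # who acts in the principal's name
    scopes: frozenset[str]         # least privilege: only the named powers
    expires_at: datetime           # automatic expiration
    revoked: bool = False
    audit_log: list = field(default_factory=list)

    def authorize(self, action: str) -> bool:
        """Check an action against scope, expiry, and revocation,
        and record the decision either way (auditable trust)."""
        now = datetime.now(timezone.utc)
        allowed = (not self.revoked
                   and now < self.expires_at
                   and action in self.scopes)
        self.audit_log.append(
            f"{now.isoformat()} {self.agent} '{action}': "
            f"{'allowed' if allowed else 'denied'}")
        return allowed

    def revoke(self) -> None:
        """Reversibility: the principal can always withdraw power."""
        self.revoked = True


# Usage: a user delegates narrow recovery authority to a keeper service.
grant = DelegationGrant(
    principal="did:example:alice",            # hypothetical identifiers
    agent="did:example:recovery-service",
    scopes=frozenset({"rotate-recovery-key"}),
    expires_at=datetime.now(timezone.utc) + timedelta(days=30),
)
assert grant.authorize("rotate-recovery-key")      # within scope: allowed
assert not grant.authorize("transfer-funds")       # outside scope: denied
grant.revoke()
assert not grant.authorize("rotate-recovery-key")  # revoked: denied
```

Logging denials as well as approvals is the design choice that makes the trust auditable rather than blind: the principal can reconstruct exactly what was attempted in their name.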
Design principles for delegable authority include:

- **Least privilege**: agents have only the authority they need
- **Reversibility**: revocation is always possible
- **Transparency**: actions must be legible to the user
- **Resilience**: no single point of failure should define one's digital self

These technical designs must work in concert with evolving legal frameworks to create meaningful accountability.

> *We do not need to eliminate intermediaries. We need to name them, bind them, and reclaim the right to define what they may do in our name.*

### Closing: Reclaiming Autonomy Through Legal Design

Autonomy does not mean never depending on others. It means being able to **define those dependencies** — and to revoke them when they fail.

Agency law shows us how to structure authority: revocable, accountable, fiduciary. Entrustment doctrines show us how to constrain power even when delegation is not explicit — when reliance itself creates obligation. Wyoming's "principal authority" framing reminds us that identity is not property to be owned, but personhood expressed through relations.

Systems that act for us must act **by our will**, **in our interest**, and **under our terms** — or face consequences. The goal is not pure decentralization or independence. It is **relational autonomy**: systems designed so that when power flows, **it remains accountable to those who grant or depend on it**.

These frameworks strain to address digital relationships because those relationships are genuinely new. Courts resist extending old doctrines not from mere conservatism but because the analogies are imperfect. The solution isn't to abandon legal constraint on platform power but to develop frameworks specific to digital intermediation. This requires patience. Legal evolution is slow, especially when it must balance innovation, user protection, and economic sustainability.
But the alternative — accepting platform power without accountability — abandons the anti-coercive commitments that define free societies. The challenge is to constrain without crushing, to protect without paralyzing.

The examples explored here — from healthcare platform accountability to municipal service obligations to credit union governance — demonstrate that these frameworks work when implemented thoughtfully. They don't require perfect solutions or unanimous adoption. They require patient experimentation, careful measurement of costs and benefits, and gradual expansion of what works.

The cooperative and public ownership models explored in Section Five will show how these legal frameworks enable alternative platform relationships. The technical architectures examined in Section Six will demonstrate how design can support rather than undermine accountability. Together, they form a toolkit for reclaiming digital autonomy through law, ownership, and code working in concert. The question is not whether these frameworks are theoretically sound — it's whether enough practitioners will commit to the practical work of making them real.

> *We do not need to eliminate intermediaries. We need to name them, bind them, and reclaim the right to define what they may do in our name.*

### Appendix: Implementation Roadmap

**Immediate (6-12 months)**: Professional platform pilots with high-dependence users. Uber and Lyft could implement enhanced appeals processes for drivers above certain income thresholds. Healthcare platforms could expand HIPAA-style protections to mental health and wellness apps. Financial platforms could adopt credit union-style transparency for high-value accounts.

**Near-term (1-2 years)**: Municipal digital services embedding accountability from launch. Cities implementing digital ID systems could require transparent algorithms, user representation in governance, and robust appeals processes.
State professional licensing boards could implement portable, user-controlled credentials demonstrating principal authority principles.

**Medium-term (2-5 years)**: Model legislation expanding successful pilots. Wyoming-style statutes recognizing user authority over identity delegation. Federal safe harbors for platforms implementing graduated fiduciary obligations. International coordination through digital governance frameworks.

**Long-term (5+ years)**: Mature regulatory ecosystem where platform accountability matches user dependence. Legal precedents establishing when reliance creates obligation. Technical infrastructure supporting auditable delegation and revocation across platforms.

Each phase builds on demonstrated successes rather than theoretical frameworks, creating sustainable progress toward platform accountability.