Markus de Medeiros
## 1. Overview

- The purpose of the model is to explain why a Rust program is memory-safe in terms of what a Rust user is allowed to do to memory at each point in the program's execution.
- Necessarily, this analysis is approximate (perfect aliasing information, for example, is undecidable).
- By building our model around the same approximations Rust is already using, we hope that our model is
  - easy to automate, given the compiler facts, and
  - easy to connect to other features of the compiler.
- We model the permissions the program state has for locations in memory as _capabilities_. The _PCS_ represents the program state in terms of capabilities.
- We model the changes that pending borrow expiries will have on the PCS using the _coupling graph_. The coupling graph represents the state of the borrow checker; it is expressed in terms of origins (lifetimes) and takes the liveness of origins as an input.
- The semantics of both the _PCS_ and the _coupling graph_ have analogous components in the compiler/borrow checker. Their semantics is designed to mimic the compiler's behavior. Some properties of the PCS and coupling graph are testable by performing rewrites on the original Rust program.
- Our analysis obtains a _trace_ of both the PCS and the coupling graph. The _reborrowing DAG_ transforms this trace into a _fully annotated program_, with fully elaborated Hoare triples around each statement, which is ready to be encoded into a verification backend.

## 2. Model

### 2.1. Resources

- **Permission**: one of `E|e` (exclusive mutable), `R|r` (read only/exclusive immutable), or `S` (shared immutable).
  - Permissions `E`, `R`, and `S` are called _deep_.
  - Permissions `e` and `r` are called _shallow_.
- **Resource**: a _place_ or a _lifetime variable_.
- **Tag**: either `None`, a `Location`, or a `Function Call Lifetime`. All contain a location; the last also contains an index.
- **Capability**: a Tag-Permission-Resource triple.
- **Deep Capability of a place `p`**: denoted `deep(p)`.
  - When `p` has no projections (i.e. it is a local), define `deep(p) = E p` if `p` is declared mutable and `deep(p) = R p` if not.
  - Likewise, the **Shallow Capability** is denoted `shallow(p)`. The shallow capabilities for mutable and immutable locals are `e` and `r` respectively.
  - The `field` and `downcast` projections preserve `deep` and `shallow`.
  - `shallow(*p)` is `e *p` when `p: &mut _` and undefined when `p: &_`.
  - `deep(*p)` is `E *p` when `p: &mut _` and `S *p` when `p: &_`.
- **Lifetime Capability**: a capability for an abstract set of resources which are regulated by a lifetime `'a`.
- **Loan Capability**: the Polonius borrow index for a syntactic issue of a loan.
  - Interpreted as a stand-in for the right-hand-side lifetime `'a` in `let _ = &'a _` or `let _ = &'a mut _`.

### 2.2. Semantic Domain

- **Place Capability Summary (PCS)**: a separating conjunction of capabilities which is valid under Prusti's fractional separation logic. Typically written as a set.
  - A PCS contains only untagged resources (not `E@bb0[12] x`).
  - A PCS contains only places, i.e. no lifetimes (not `'a`).
- **Free PCS Operations**: the following statements:
  - `{ P p.x0, ..., P p.xn } pack(P p) { P p }`
    - where `P` is a permission, and `p` is of type ...
  - `{ Deep(*x), Shallow(x) } pack (Deep(x)) { Deep(x) }`
    - when `x` is borrow-typed.
  - `{ P p } unpack (P p) { X }`
    - where `{ X } pack (P p) { P p }`
  - downcasts
  - `{ Deep(x) } drop(Deep(x)) { Shallow(x) }`
  - `{ e x } drop(e x) { r x }` ^[3]
  - (needs work) `{ E x[i] } pack (Deep(x)) { E x }`
    - when `x` is indexable.
    - (needs work) separate the capabilities for the list length from the list elements.
- **Free PCS**: the set of legal PCS modulo repacking by any sequence of free PCS operations.
  - For all borrow-checked code, there exists a free PCS trace.
- **Coupling Graph (CDG)**: a directed acyclic hypergraph whose vertices are the set of capabilities. Its edges are described in the next section.
- **PCS Trace**: a map from locations to PCS-CDG pairs.

### 2.3. Analysis

#### State

- **Subgraph Identifier**: a Polonius **origin**, tagged with `None` or a `Location`.
- **Coupled Borrow Index**: a collection of symbols.
  - To reach a fixed point, we need to either consider coupled borrow indices as equal only up to alpha equivalence, or have a deterministic gensym at each join point (one that is stable under iterated coupling). Either is a good option.
- **Implementation of the Coupling Graph**: The coupling graph is the smallest^[1] graph which contains a family of subgraphs, indexed by subgraph identifiers.
- **Subgraph core edges**: Each subgraph must contain exactly one of the following edges:
  - **owned** edge: corresponding to a loan issue, from a _loan capability_ to the borrowed-from capability.
  - **subgraph** edge: an edge labeled with a subgraph identifier, representing that origin's _subgraph signature_^[2].
  - **coupled** edge: an abstract edge, labeled with a _coupled borrow index_ and a set of resources. The destination nodes of all coupled edges of the same index must be the same. The set of resources for a coupled edge is the union of the preconditions of the coupled borrows of the same index.
    - It also suffices to track the set of eliminated resources as coupled borrows of the same index are removed. The purpose of this set is to remember which resources are allowed to be used once the last coupled borrow of that index has expired.
- **Children** of a subgraph: the subgraph identifier in a _subgraph_ edge, or the set of subgraph identifiers in a _coupled_ edge. Owned edges may have no children.
- **Subgraph signature**: Each subgraph represents an abstract transfer of capabilities in the PCS.
  - Subgraphs take the form
    ```
    { Pre } ---repack edges--> { 'a } --core edge-> { Post }
    ```
  - We say `{ Pre } --* { Post }` is the subgraph signature. `Post` is contained in the preconditions of its children (except for owned edges).
  - The node `'a` is always present, but may be tagged. This node takes the same name as the subgraph itself.
  - The coupling graph can only ever remove entire subgraphs at once.

#### Removing Subgraphs

- **Expiry Instructions**:
  - `consume` a place: removes a place from the PCS by tagging it at `l`.
  - `consume` a lifetime: removes all resources in the PCS which are deeply initialized behind that lifetime by tagging the lifetime.
    - Alex has some idea relating to how this is encoded (for borrows in structs). The case of a simple borrow `p` amounts to tagging `E *p`.
  - `apply`: performs an abstract exchange of capabilities by way of an opaque list of annotations (to be inserted by the reborrowing DAG, in a later pass).
- **Origin Live on Entry**: A Polonius fact which regulates the subgraphs contained in the coupling graph.
  - At each location, we expire all subgraphs whose origins are no longer live.
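  - For intuition, a minimal illustrative snippet (the names here are mine, not one of the linked examples) in which an origin's death drives an expiry:
    ```rust
    fn origin_liveness() {
        let mut x = 0u32;
        let r = &mut x; // loan issued: the PCS gives up `E x`, gains `deep(r)`, and a subgraph for `r`'s origin is added
        *r += 1;        // last use of `r`: its origin is no longer live at the next location
        x += 1;         // the subgraph was expired above, so `E x` is available again
    }
    ```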
- **Expiring Origins** at a location `l`: once origin `o` is no longer live,
  - tag all untagged subplaces of `o` (this includes the entire left-hand side) at `l`;
  - iteratively, for `r` in the set of all regions which are expiring, and until no tagged subgraphs are children of live subgraphs, `consume(r)` and all remaining preconditions of `r`;
  - `apply` that subgraph.
  - There always exists an order in which we can apply these origins so that all such subgraphs are removed.
- **Applying a subgraph**:
  - The precondition must be completely tagged.
  - Delete the subgraph from the coupling graph.
  - Emit an `apply` instruction.
  - The Hoare triple of that `apply` instruction, as relevant to the PCS, takes no capabilities and returns the new leaves of the coupling graph.

#### Owned PCS Semantics

We build a Hoare triple for each MIR statement according to the following rules.

| `s : StatementKind` | `requires(s)` | `ensures(s)` |
|---|---|---|
| `assign(p, r)` | `e p * requires(r)` | `Deep(p) * ensures(r)` |
| `FakeRead(_, _)` | `True` | `True` |
| `SetDiscriminant(p, _)` | `Deep(p)` | `Deep(p)` |
| `Deinit(p)` | TBD | TBD |
| `StorageLive(p)` | `True` | `e p` |
| `StorageDead(p)` | `Shallow(p)` | `True` |
| `Retag(_)` | TBD | TBD |
| `PlaceMention(p)` | `Deep(p)` | `Deep(p)` |
| `AscribeUserType(_, _)` | TBD | TBD |
| `Coverage(_)` | `True` | `True` |
| `Intrinsic(_)` | unsupported | unsupported |
| `ConstEvalCounter(_)` | `True` | `True` |
| `Nop` | `True` | `True` |

| `r : rvalue` | `requires(r)` | `ensures(r)` |
|---|---|---|
| `Use(op)` | `requires(op)` | `ensures(op)` |
| `Repeat(op)` | TBD | TBD |
| `Ref(rgn, Shared, p)` | `Deep(p)` | `Deep(p)` |
| `Ref(rgn, Shallow, p)` | `X p` where `X ∈ {E,S,R}` | `S p` |
| `Ref(rgn, Unique, p)` | TBD | TBD |
| `Ref(rgn, Mut(two_phase: true), p)` | TBD | TBD |
| `Ref(rgn, Mut(two_phase: false), p)` | `Deep(p)` | `True` |
| `ThreadLocalRef(_)` | unsupported | unsupported |
| `AddressOf(_)` | unsupported | unsupported |
| `Len(p)` | `Deep(p)` | `Deep(p)` |
| `Cast(_, _, _)` | TBD | TBD |
| `BinaryOp(_, op_1, op_2)` | `requires(op_1) * requires(op_2)` | `ensures(op_1) * ensures(op_2)` |
| `CheckedBinaryOp(_, op_1, op_2)` | `requires(op_1) * requires(op_2)` | `ensures(op_1) * ensures(op_2)` |
| `NullaryOp(_, _)` | `True` | `True` |
| `UnaryOp(_, _)` | `True` | `True` |
| `Discriminant(p)` | `Deep(p)` | `Deep(p)` |
| `Aggregate(..., [ops])` | | |
| `ShallowInitBox(_, _)` | TBD | TBD |
| `CopyForDeref(p)` | TBD | TBD |

| `o : Operand` | `requires(o)` | `ensures(o)` |
|---|---|---|
| `Copy(p)` | `(E\|R\|S) p` | `(E\|R\|S) p` |
| `Move(p)` | `Deep(p)` | `Shallow(p)` |
| `Constant(_)` | `True` | `True` |

- Note that `S p` cannot be moved-from in legal Rust code.

| `t : TerminatorKind` | `requires(t)` | `BasicBlock -> ensures(t)` |
|---|---|---|
| `goto(bb)` | `True` | `bb -> True` |
| `SwitchInt(op, (_ -> bbs))` | `requires(op)` | `(_ in bbs) -> ensures(op)` |
| `Resume` | TBD | TBD |
| `Terminate` | TBD | TBD |
| `Return` | `E 0` | `True` |
| `Unreachable` | `True` | `False` |
| `Drop(p, bb, _, _)` | `e p` | `bb -> e p` |
| `Call(f, ops, p_dest, Some(bb_trg), _, _, _)` | `e p_dest * (sep. fold requires over ops)` | `bb_trg -> deep(p_dest) * (sep. fold ensures over ops)` |
| `Assert(op, _, _, bb, _)` | `requires(op)` | `bb -> ensures(op)` |
| `FalseEdge(bb_real, bb_false)` | `True` | `bb_real -> True`; `bb_false -> False` |
| `FalseUnwind(bb, _)` | `True` | `bb -> True` |
| `Yield(_, _, _, _)` | TBD | TBD |
| `GeneratorDrop` | TBD | TBD |
| `InlineAsm(...)` | unsupported | unsupported |

- The return place `_0` is implicitly declared as mutable.
- Functions implicitly drop their arguments at the end of the call (this is not written as a `StorageDead` in the MIR).

#### Coupling Graph Semantics

- **Issuing Mutable Borrows**: of the form `(x: &'a mut T) = (&mut y : &bwx mut T)`
  - Has Hoare triple `{ e x, deep(y) } ... { deep(x) }`.
  - Adds an `'a --> bwx` _instance_ edge in `'a`.
  - Adds a `bwx --> deep(y)` _owned_ edge in `bwx`.
- **Issuing Shared Borrows**: of the form `(x: &'a T) = (& y : &bwx T)`
  - Has Hoare triple `{ e x, P y } ... { deep(x), S y }` where `P` is `E|R|S`.
  - Adds an `'a --> bwx` _instance_ edge in `'a`.
  - If `P = E|R` and `P y` is not in the graph, adds a `{ bwx, S y } --> { P y }` edge in `bwx`.
  - If `P = E|R|S` and `P x` is a precondition of `o`, adds a `{ S y, S y } --> P y` repack edge in `o` and a `bwx --> S y` _owned_ edge in `bwx`.
    - The two `S y` nodes should be considered distinct, and distinct from `P y` as well.
  - Any time `P = S`, `P y` is a node in the graph.
- **Unpacking Borrows**: To perform `unpack(x: &'a T)`, add a `deep(*x) --> 'a` _unpack_ edge.
- **Killing**: Tag all subplaces in the graph with a location. This includes subplaces in labels.
- **Moving Borrows**: of the form `(x: &'a _ T) = move (y : &'b _ T)`.
  - The Hoare triple is `{ e x, deep(y) } ... { deep(x), shallow(y) }`.
    - This triple is the same as for any other move.
  - There should be a `'b -inst-> W` edge in the graph, since `y` is fully packed.
    - `W` is a lifetime node.

##### Option 1: No `un-move` annotations needed

- Kill `'a`.
- Issue an `'a --> W` instance edge.
- Kill `'b`, expire as appropriate.

##### Option 2: `un-move` annotations required

- Kill `'a`.
- Kill `'b`.
- Issue an `'a --> 'b@l` edge corresponding to the un-move.
- Edges formerly in `'b` are now considered part of `'a` and are not expired.

- **Invariance**:
  - Certain moves (e.g. moves of nested borrows) add invariance constraints between lifetimes.
  - We only distinguish origins up to invariance in these cases. See the examples.
- **Function Calls**:
  - With the exception of some cases for invariance, add a fresh node for each lifetime in the function call.
    - These should all have corresponding origins.
  - Add an _instantiates_ core edge between these nodes and the moved-into/moved-out-of nodes.
  - Add a _predicate_ edge between all origins with _predicate_ subset facts.
- **Universal Origins**:
  - In general, we can handle function arguments and the return place without too much hassle, since they are generated with fresh origins.
  - To fully explain the borrow checker, we also need to explain the universal origins. Many of the return place fields and function arguments will have origins that are invariant with a universal origin.
  - Polonius ensures that there are no transitive subset relationships between universal origins when those subset relationships are not declared.
  - At the end of the function, we can use the regular algorithm for expiring origins to determine which origins need to be packaged (and in what order) to meet all of the predicate edges.
  - TODO: what happens when we add edges throughout the execution that make a cycle?
    - Not a DAG?
      `#1 =:= #7 <: #3 <: #8 <: #4 <: #6 =:= #1`
  - It may be possible, instead of treating universal origins as if they were regular origin nodes, to see them as groups of nodes only at the end of the function.
    - As in, during the verification of the function body we ignore the fact that `#6 =:= #1 =:= #7`. When we're creating the magic wands at the end of the function, the case of `expires_first(#1)` means both `#6` and `#7` are permitted to expire.

#### Coupling

- **The Coupling Algorithm**
  - The coupling algorithm is a way to traverse several coupling graphs at once.
  - The output describes which capabilities are definitely, and conditionally, accessible after each origin expiry.
  - At a join point, the set of live origins in each coupling graph is the same (because "live" means live in the compiler's sense).
  - Like the compiler, the coupling graph interprets dead origins as eagerly expired.
- **Initial state**
  - For each graph, list its **leaves** as a set. Leaves come from the preconditions of live origins which are not children of any other origin.
- **Find Next Definitely Accessible Resource**
  - There must exist an origin which is live and which can become definitely accessible by at most one expiry in each branch.
    - Intuition: if not, then any use of a place one edge away from a leaf is illegal, as it surely will be maybe-borrowed. Thus, either the place is not used (so is not live) or the program does not borrow-check.
  - Call this resource `r`.
  - For each branch, define a **signature** `Sig = { pre(o) } o { post(o) }`, where `post(o)` contains `r`. This is the Hoare triple associated to the expiry in one branch.
- **LHS: Definitely consume the same set of resources**
  - Ensure the same resources are definitely consumed in all branches by expiring additional origins. Add their pre- and postconditions to the signature.
  - Intuition: definitely accessing `r` means `r` is definitely not maybe-blocked. So when we access `r`, any place blocking `r` in any branch can't be live. Thus, we should consume them before regaining access to `r`.
- **RHS: Freeze extras**
  - Once the definitely consumed resources are the same in every branch, emit `freeze` statements to ensure the right-hand sides are the same.
  - `freeze` instructs the backend to remember that it conditionally has permission to a place.
    - In our examples, we write frozen resources as `|x|`.
  - Frozen resources can be unfrozen via `unfreeze`. This counts as a legal step when we're finding the next definitely accessible resource.
  - If a resource will be frozen in all branches, we `unfreeze` it in all branches. This exchanges `|y|` for `y` in all signatures.
- **Apply the signatures as Hoare triples**
  - Remove any expired edges.
  - The change in the leaves of the graph in each branch is the unfrozen part of the signature. Perform this exchange on the set of leaves in each branch. Because of freezing and unfreezing, this action may be different in each branch.
  - The unfrozen part of the signature should be the same in all branches; this is the signature of a new **coupled edge** in the output.
    - If all branches are identical, the content of those branches can be copied to the output.
    - Otherwise, introduce a new coupled edge from each definitely consumed precondition to all postconditions.
  - Record the origins which are expired, and the freezes and unfreezes, as annotations in each branch.
- **Repeat**
  - Repeat this procedure until all graphs are empty.
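
To make the join-point situation concrete, here is a minimal illustrative program (not one of the linked examples; the struct and names are mine) in which the algorithm above must introduce a coupled edge: each branch reborrows a different field, so neither `p.a` nor `p.b` is definitely accessible after the join until the returned borrow expires.

```rust
// Illustrative sketch only: a two-branch join that requires coupling.
struct Pair { a: u32, b: u32 }

fn choose(p: &mut Pair, flag: bool) -> &mut u32 {
    let r = if flag {
        &mut p.a // branch bb0: the loan blocks `p.a`
    } else {
        &mut p.b // branch bb1: the loan blocks `p.b`
    };
    // Join point: the incoming coupling graphs disagree on which field is
    // blocked, so a coupled edge is introduced; only expiring it (when the
    // origin of `r` dies) definitely regains `deep(*p)`.
    r
}
```

Which annotations are emitted when that coupled edge expires depends on the branch taken, which is exactly the conditional information the coupled edge records.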

##### Edge cases in the coupling algorithm

- When I write "same" above, I mean "same up to repacking".
  - When an origin is differently packed in two branches, we might need to repack to make them equal. These annotations should be included with the `expire`, `freeze`, and `unfreeze` annotations.
  - Repacking may also involve the `{ deep(x) } weaken(x) { S x }` annotation, which can be treated as a `{ S x } --* { deep(x) }` edge.
    - `Weaken` is needed when a place is shared-borrowed-from in one branch, but not in the other.
- Care must be taken with coupled edges: above I said that we can always definitely expire one edge to definitely access a resource. What I mean by this is that we can always expire one _coupled_ edge (since we only regain resources once we expire all coupled edges of the same kind)... this may be represented by several coupled hyper-edges.
  - This situation can happen when trying to find a loop invariant.

#### Problems with Shared Reborrows

- Slices: unpack a mutable slice. As mentioned by Aurel, there should be separate capabilities for the slice element(s) and the slice length.

#### Problems with variance

- In general, an edge in the coupling graph corresponds to an _outlives_ constraint between origins.
  - This is because assignment between borrow-typed data is covariant in the outermost lifetime:
    ```
    let (_: &'a mut T) = (_: &'b mut T)
      => (_: &'b mut T) <: (_: &'a mut T)
      => 'b <: 'a, T <: T and T <: T   by variance rules
    ```
- One reason why the coupling/reborrowing graph is a good abstraction is that `'b <: 'a` can be interpreted as an exchange of the resources in `'b` to regain the resources held in `'a`. However, other subset relations don't admit such a straightforward interpretation.
- For example, lifetimes are _invariant_ under mutable borrows (see the `T <: T` and `T <: T` constraints above).
  - It seems to work to consider lifetimes in this invariant position only up to invariance.
- Another invariance: lifetimes declared in function signatures are invariant with the lifetimes of the locals to which they are bound (i.e. function arguments and the return place).
  - In this situation, we distinguish between _input_ (arguments) and _output_ (return place) instantiations of that universal lifetime. In input position, we add an edge from the resource to the lifetime, and in output position we add an edge from the lifetime to the resource.
  - It is unclear if we can always do this.
  - An alternative strategy is to ignore the universal lifetimes all the way until the end, when we're trying to package expiry tools (in that sense, we only use the lifetimes to abstract over a set of nodes).
- Lifetimes with user-declared lifetimes are invariant with their declared lifetime as well. It's unclear how to model this; in some situations it's enough to just consider lifetimes up to invariance, but I'm not sure in general.
- Finally, I'm not sure about modelling contravariance.
  - For contravariance from ghost variables, it may not be meaningful to model a contravariant constraint on lifetime expiry in terms of a resource exchange. As in, if `'b <: 'a` but `'b` and `'a` are otherwise unrelated, the fact that `'b <: 'a` can be completely ignored.
  - That said, I don't know that contravariance is always meaningless. For example, the arguments of function pointers are contravariant. Does this matter? How are function pointers handled in Prusti at a lower level?

## 3. Examples

- Simple features will be abbreviated when they appear as part of more complex examples.
- I include the types of places in the PCS.

### Owned Data

- [Example: Moves of Owned Data](/ViVFHBptSUGtQN_p1c84Iw)
- [Example: Delayed Initialization of Owned Data](/UR9tgzeJRWqi4vfwjSRJIw)
- [Example: Multiple Move-In](/Ynyp8waNTgqp8aA7t4UNHQ)
- [Example: Copy of Owned Data](/-WUneQMHThui83zs4XS3JQ)
- [Example: Move-out of Struct Fields](/JUenpp0GR-ySiQHP3dCKQQ)

### Mutable Borrows

- [Example: Mutable Borrow](/f7EZ09oES7K628aoryoTHw)
- [Example: Mutable Reborrow](/bIXCZmn_QEagqdp0QTNjww)
- [Example: Later Initialized Borrow](/jen0SOcVRZ2cxIeC6jygVw)
- [Example: Mutate Through Borrow](/ibLuVdxuQcy15awDnm2HlA)
- [Example: Borrow Multi Move-in](/Q0yxcJyrTvuJdVFiFP5HlA)
- [Example: Killing Target of Mutable Reborrow](/AIKJyvNLRuy7c5hMytUrTQ)
- [Example: Iterated Mutable Borrow](/u07KyqUzTrmuyaugjyf_aQ)
- [Example: Split Mutable Borrow of Tuple](/fwDziMfMR3qRRGSZkUVJqg)

### Shared Borrows

_Note_: these examples are kind of hard to read in text form. Sorry about that.

- [Example: Shared Borrow of Mutable Data](/dMZgYlyBSSq9Z_xkglKETg)
- [Example: Shared Borrow of Immutable Data](/FC5YMwixTFecriADfY0BIA)
- [Example: Shared Reborrow](/Xr_LiM_3RfakxNI5BzU4VA)
- [Example: Shared Reborrow Through Mut Borrow](/r_7BSeLlRKKL7EZeqLpPAQ)
- [Example: Kill of Shared Reborrow Target](/69zsnmS2S0KVaBrXiVVEmw)
- [Example: Clone Shared-Borrowed-From Place](/_Ew0jG5QRP20HK5XrlwAPw)
- [Example: Unpack Shared Reborrow](/bloRTdReRhCVPsWOZvg59A)
- [Example: Pack a Shared Reborrow](/XdXZv5m5QwOHzTOunqEmXA)
- TODO [Example: Kill of Shared Reborrow Through Mut Borrow](/Oq8_2jjeTBijImXlHSnw5w)
- [Example: Split Mutable and Shared Borrow of Struct](/3ZCdBjKpS1SrdsRKD_P7yg)
- TODO [Example: Shared Borrow of Mutable Borrow](/ZRJRbLmlSAWNahO6bC5U7g)
- TODO [Example: Mutable Borrow of Shared Borrow](/25i1OqdQTOKLfUmUDm7bZA)
- [Example: Maybe Shared-Borrowed-From Coupling](/cQQmBSJ0Q8O62BINIqgMmA)
- TODO [Example: Copy Shared Borrowed From Struct (Jonas 1)](/uq0xwurtSKCobP01DW1L4A)
- TODO [Example: Conditionally Shared Borrowed From](/OKtRJNd9QpSf4SF7SmsfkA)
- TODO [Example: Shared Borrow of Coupled Data](/Y80ONVHfR56gjJ4zzc_ibw)
- TODO [Example: Shared Borrows Loop Swap](/a_DqISz0SXeimNuB24rSEA)

### Coupling at Simple Join Points

- [Example: Mutable Conditional Field Drop](/1VLbFPizS8e2zJa1Jk26Xg)
- [Example: Triangular Coupling](/zAEksPVRSZ-cEeILIvBPAQ)
- [Example: CADB Square Coupling](/JlrOqRtdT9iJMjGaegJZ3Q)
- [Example: ACBD Square Coupling](/k6OO3zn3RTivFnLDZyCSXw)
- [Example: Degenerate Triangle Coupling](/R0PsTk2cQZKvU1fdXTpidg)
- TODO [Example: Couple Different Shared Borrows](/8JB0y9-GTtCssyktmFWstw)

### Loops

- [Example: Drop Struct Fields in Loop Body](/6FaQA0CEQge3i-5Wyz9nPw)
- TODO [Example: Mutable Swap in Loop](/ACYTWyhNQ-KA7NVcCQ32Qg)

### Function Calls

- See what happens when we put a struct containing borrows through a function call.
- See what happens when we equate lifetimes via a function call.
- [Example: Ownership Passing Function](/rnjDkwoPRzq4_QIrrF6pOQ)
- [Example: Reborrowing Function](/_InExldtSEeXIF930LIHJQ)
- [Example: Clone Mutable Borrow Function](/KH89RZw9RRSe-zq2rpRKlA)
- [Example: One Lifetime Reborrower Function](/A10SmJ7UTwKd9S1Q39Z-dw)
- [Example: Unimplementable Reborrower Function](/-gSoIMsDRlyTx3KEAs0s_g)
- [Example: Zero-Possibility Reborrower Function](/mtaiZtMfRq6_nboSOGMuYw)
- [Example: One-Possibility Reborrower Function](/LvSZBuujSJC6NYqB7W6IRg)
- [Example: Possible Outlives Reborrower Function](/78TkwsvTQE-hByTvNJK4Cg)
- [Example: Double-Possibility Same-Lifetime Reborrower Function](/fpYe6dvISsGM3JuHioYq4w)
- [Example: Double-Possibility Invariant-Lifetime Reborrower Function](/RXJlQMClTsqPWXzb7NxC7A)
- [Example: 5-Lifetime W-Reborrower](/9zrD0lqdRO2NulyAUJtRwQ)
- TODO [Example: Function on Iterated Mutable Borrows](/pM9SWZVvQMi75YwZE2gfQw)
- TODO [Example: Function on Iterated Mutable Borrows Same Lifetime](/SWY_8wDcQKauu0tZfwY2yg)

### Borrows in Structs

- [Example: Single-Field Borrow in Struct](/TRfSWxbYTzWz3gqMbu1Xyw)
- [Example: Independent Mutable Two-Lifetime Struct](/N6QwTVc9RFqRSrF0DbMo8w)
- [Example: Double Mutable Field, Single Lifetime Struct](/u0urXqDtSlqZg9QMa8lvuQ)
- [Example: Struct Information Loss via Function Call](/zwGhKKlRSqqQQWxuUuxbYA)
- TODO [Example: Option+Borrow Enum Type](/mZ5oY3wHSaCB2BLhKFLpJQ)
- TODO [Example: Reborrows of Struct with Borrows](/zkfCK32HQBSTA92-HM6NyA)
- TODO [Example: Moving Struct with Borrows](/oSsBwmHzRhOmaNDTVEZtqw)

### Real World Examples

- TODO [Example: Mutable Borrow Swap](/W2O9XxF3Tp2PwxjBls1WTg)
- TODO [Example: Reborrowing Linked List Traversal (Loop)](/DYtP__yZTSmzTiBh1VBCXQ)
- TODO [Example: Reborrowing Linked List Traversal (Recursion)](/Lh6PpbphQiuNA0m1e9qfvg)
- EXPLAINED, NEEDS TRACE [Example: List of Mutable Borrows to List of Shared Borrows](/vZZ6qnBBS9KkJhHCQP4wow)
- Sorting algorithms on linked lists for interesting specs?
- Other basic data structures with interesting representations?
  - Kinds of maps, maybe?
  - Graphs are hard for borrowck.

## 4. The Model and the Prusti Pipeline

The main purpose of the model, in terms of the larger Prusti pipeline, is to

1. Collect and interpret facts from the borrow checker.
2. Construct a sound model of the abstract transfers of capabilities in the program.
3. Present this information in a way the reborrowing DAG can understand.

To this end, when we compute the PCS we also emit a trace of _Reborrowing DAG Instructions_ that the reborrowing DAG can read off operationally to fill in the concrete annotations for the program. These instructions serve as an interface to the reborrowing DAG.

:::info
**Design note: Division of Responsibilities**

The coupling graph and the reborrowing DAG are similar data structures; their purposes are best described by their differences.

- The coupling graph computes the abstract transfers of capabilities that must happen in the PCS at each origin expiry. The reborrowing DAG manages annotations, and inserts them into the program when instructed.
- The coupling graph is computed by a fixed-point analysis (hypothesized to be bounded above by $2^d$ iterations, where $d$ is the maximum loop depth). The reborrowing DAG is computed in a single pass over the program, given foresight about loop invariants and join points.
- The coupling graph must be stable under some notion of equality (e.g. repacking and alpha-equivalence of certain variables) to ensure the fixed-point calculation converges. The reborrowing DAG does not need to do this.
- The coupling graph has a rigid structure governed by the PCS and the live origins at each point. The reborrowing DAG can be _any_ directed acyclic hypergraph, provided it has groups of edges which can expire as directed by the coupling graph.
  - Furthermore: the coupling graph structure is lightweight because it does not track pack/unpack edges. The reborrowing DAG must track pack/unpack edges, though it is free to optimize the many redundant edges the coupling graph would require.
- The reborrowing DAG is free to be lazy if desired, whereas the coupling graph is always eager.
- The leaves of the coupling graph must always be contained in the PCS, i.e. they are definitely accessible. There is no such constraint on the reborrowing DAG: reborrowing DAG leaves may even be only conditionally in the proof state.
:::

There are three classes of instructions. Each edge in the reborrowing DAG is associated to exactly one _group_: either an origin, or the index of a coupled borrow. The coupling graph instructs the reborrowing DAG with the set and order of groups it is allowed to expire.

#### 1. Intro Instructions

These do not depend on the coupling analysis, and can be read off compiler facts beforehand^[4].

- `new_borrow(bwx, c_from, mut?)`
  Issue a new mutable or shared borrow with temporary value `bwx` from capability `c_from`. This edge expires when the origin associated to `bwx` expires. If `c_from` is in the graph then ensure this edge points to `c_from` (i.e. it is a reborrow); the coupling graph will ensure `c_from` is in the graph if a reborrow is needed.
- `move(c_from, c_to, o_to)`
  Move `c_from` to `c_to`. If the proof needs annotations to undo this operation, they belong to group `o_to`.
- `kill(p, l)`
  Tag all unkilled subplaces of `p` with tag `l`. Tagged places represent old versions of a place. The places may still be required in order to emit a sequence of annotations.
- `tag_origin(o, l)`
  Rename group `o` to group `o@l` to maintain parity with the coupling graph. This is not semantically meaningful to the reborrowing DAG, since it does not decide when to expire groups, but it means that some other origin is blocking `o` from expiry.
- `unpack(p, ..., o)`, `pack(..., p, o)`
  Add a pack or unpack edge to group `o`.

#### 2. Elim Instructions

Elim instructions are calculated by the coupling graph.

- `expire(o)`
  Emit all annotations in group `o` and remove group `o` from the graph. This should only be emitted for coupled edges once the last coupled edge is expired.
- `consume(c, o)`
  Make capability `c` unusable except for expiring annotations in `o` (the reborrowing graph should no longer expect `c` to be in the PCS when `o` expires). For backends which are capable of tracking conditionally accessible places, such as Viper, this can be a no-op.
  - For simplicity, we require that all capabilities required for an `expire` are explicitly `consume`d beforehand.

Note that group expiries are _not_ simply translations of the compiler's origin liveness facts: for example, intermediate edges in long chains of reborrows may stop being live in the compiler well before we want to emit their associated annotations. The coupling graph is able to determine exactly when the annotations should be emitted into the program.

#### 3. Coupling Instructions

Coupling instructions are only emitted by the coupling graph at join points in the CFG. They operate on _all_ incoming graphs simultaneously; we will discuss what that means for loop heads below.

- `couple(id, B -> [C])`
  Introduce a coupled edge `id`, where `B` draws from basic block indices and `C` draws from _coupling expressions_. A coupling expression is one of:
  - any elim instruction: `expire` or `consume`;
  - `freeze(c)`, `unfreeze(c)`: mark a capability as conditionally inaccessible. Can be implemented as a no-op in backends that track conditional accessibility, such as Viper.
    - Unfrozen capabilities can be consumed, even though they are not in the PCS.
    - For example, let `c0` and `c1` be two coupled groups such that only under branch `bb0` will the edges of `c0` regain capability `c`, and `c1` also requires `c` to `consume`.

      Inserting `freeze(c)` at the end of the instructions for `bb0` in `c0` tells a backend that `c` will be used in a future coupled edge and that it should not drop it, for example. Inserting `unfreeze(c)` at the start of the `bb0` branch of `c1` tells the backend that it certainly can still `consume(c)` alongside the rest of the `bb0` annotations, even though `c` is not in the PCS.

##### Implementing coupled edges

For coupled edges `cx` introduced at loop heads, a reborrowing DAG should introduce a _magic wand_ of the shape

```
package ... {
    if bb0 {
        // annotations for bb0
    } else if bb1 {
        // annotations for bb1
    }
    ...
}
```

into the program, replacing the coupled subgraph with an _opaque edge_ in `cx` which will `apply` this magic wand upon `expire(cx)`.

At other join points, coupled edges can be implemented in several ways. One straightforward way is to add conditional branching to a standard reborrowing DAG. In this implementation coupled edges are no more than groups of edges with the same label, which has the potential to be very fast, though asymptotically this will introduce `2^n` branches for a **sequence** of `n` 2-branch `if` statements:

```rust=
// 1 possible path => 1 copy of graph leaves
if ... { ... }
// 2 paths => 2 copies of graph leaves, 2 total edges
if ... { ... }
// 4 paths => 4 copies of graph leaves, 4 total edges
if ... { ... }
if ... { ... }
if ... { ... }
// 32 paths => 32 copies of graph leaves, 32 total edges
```

Another option is to introduce a _translucent_^[5] edge for each coupled borrow, which stores annotations of a similar shape to the aforementioned magic wand, but simply _inserts_ those annotations when the coupled edge expires rather than _applying_ a dynamically _packaged_ magic wand. This system uses asymptotically `n` translucent edges for a sequence of `n` join points, each translucent edge tracking only `2` branches in the example above:

```rust=
// 0 translucent edges => 1 copy of graph leaves
if ... { ... }
// 1 translucent edge => 1 copy of graph leaves, 2 total branches
if ... { ... }
// 2 translucent edges => 1 copy of graph leaves, 4 total branches
if ... { ... }
if ... { ... }
if ... { ... }
// 5 translucent edges => 1 copy of graph leaves, 10 total branches
```

A downside of this strategy is that it may involve moving more graphs around in memory, which is potentially expensive in comparison to a more ad-hoc subgraph-sharing algorithm on top of the former approach. It is also not asymptotically better than the former approach for _nested_ `if` statements, since those lead to nested translucent edges of the same quantity.

Origins which are the same in both branches are not coupled, even though they may have different representations in the reborrowing DAG. In particular, it is possible for a group `o` to use two different sequences of pack/unpack edges and obtain the same set of live leaves (and thereby be indistinguishable in the coupling graph). In this case, it is sound to choose either sequence, provided the backend only requires tracking live leaves, or to guard the packs and unpacks conditionally underneath a translucent edge that belongs to `o`.

### Implementing the new features in the Reborrowing DAG

Since the reborrowing DAG is able to be more backend-specific, we discuss how some of the new features might be implemented in ways well-suited for Viper.
We do not discuss optimizations for the reborrowing DAG (I haven't thought about this at all, because I do not understand what typically causes performance issues in Viper), but I would be interested in having this discussion if anyone has ideas.

#### Function calls

We build our work on top of Lorenz Gorse's [prior work](https://ethz.ch/content/dam/ethz/special-interest/infk/chair-program-method/pm/documents/Education/Theses/Lorenz_Gorse_MS_Report.pdf) on expiry tools. At the end of a function which returns any lifetime-typed data, we seek to construct a tree of magic wands for each legal order in which the universal origins of the function may expire. Since no other origins should be live at the return point of a function, we can (broadly speaking) iterate over these orderings with the regular origin expiry methods to determine the signatures for each sequence of wands.

From the caller's perspective, we are able to model a function call with a series of predicate edges between the origins passed into the function. In the coupling graph, these edges are not coupled, since they can expire in any (legal) order. In the reborrowing DAG, they should be implemented as a special kind of opaque edge whose _apply_ annotation instead takes the next step `expires_before` through the expiry tool.

#### Shared Borrows

The only notable feature of shared borrows is that each borrow of a place introduces a `{ S x, S x } --> { deep(x) }` edge in the coupling graph. In particular, a program like

```rust=
let x = ...;
let y = &x;
let z = &x;
```

will have several of these identical edges. We rely on abstract read capabilities to correctly model expiring these edges in any order: expiring either one of these edges should _not_ regain the full capability for `deep(x)`, but expiring both edges in any order _should_. More precisely, if $r_1$ and $r_2$ are the abstract fractions associated to `*y` and `*z` respectively, the behavior of the coupling graph is modeled by the commutativity of abstract read permissions:

$$
1 - r_1 - r_2 = 1 - r_2 - r_1, \\
r_1 + r_2 + (1 - r_1 - r_2) = r_2 + r_1 + (1 - r_2 - r_1).
$$

This property means that we do not need special logic to wait for all `{ S x, S x } --> { deep(x) }` edges to expire before reconstructing `deep(x)`, or to take different action depending on the order in which shared borrows of `x` expire (for example, assigning certain versions of `S` _different_ concrete fractions). Instead, our semantics is able to eagerly give fractions back to `deep(x)` as shared borrows expire, and `x` will still be `S` until all shared borrows of it have been eliminated.

#### Borrows inside structs

I personally don't understand how we're planning to encode these, but maybe we can speak more on this after a prototype? The biggest unresolved question on my end is about translating lifetime nodes into the reborrowing DAG. We've claimed that we should do something like this, but I don't think we ever talked about the details.

#### Repacking Algorithm

Suppose we have two PCS's `P` and `Q`. We will solve two related problems: repacking `P` and `Q` to be identical, with the correct `drop` and `weaken` operations, at join points; and repacking `P` to contain `Q` in order to apply Hoare triples.

Partition `P = P_E + P_R + P_S + P_e + P_r` based on capability (interpreting `+` as disjoint union). We will denote a sequence of `unpack` operations inside a partite set `P_X` as `upk(P_X)`, which can be _reversed_ to get a sequence of `pack` operations `pk(P_X)`.
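
As a rough data-structure sketch (hypothetical; these are not the prototype's actual types), the repack steps used below could be recorded as a reversible sequence of operations:

```rust
// Hypothetical sketch; `Place` and the names here are illustrative only.
type Place = String;

#[derive(Clone, Debug)]
enum Repack {
    Pack(Place),   // { P p.f0, ..., P p.fn } -> { P p }
    Unpack(Place), // { P p } -> { P p.f0, ..., P p.fn }
}

// Reversing an `upk(P_X)` sequence step by step yields the corresponding `pk(P_X)`.
fn reverse(upk: &[Repack]) -> Vec<Repack> {
    upk.iter()
        .rev()
        .map(|step| match step {
            Repack::Unpack(p) => Repack::Pack(p.clone()),
            Repack::Pack(p) => Repack::Unpack(p.clone()),
        })
        .collect()
}
```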

Define

```
sup(deep(x)) = S x
```

and its inverse `sub`.

- For sequential statements, `sub` is implemented as a no-op, but only in _subtypable_ contexts. For example, in the Hoare triple for copies we are allowed to use `E` or `R` in terms of `S`, so `S` is _subtypable_. This is not the case for, e.g., repacking to expire a shared borrow.
- At join points, `sub` adds an `S x --> deep(x)` edge to the graph. This allows coupling to work as normal when a place is maybe-shared-borrowed-from.

Define

```
deepen(shallow(x)) = deep(x)
```

and its inverse `drop`. Only `drop` can be implemented as an annotation.

Finally, we make the following observation:

:::info
For sets of capabilities `P_X`, `Q_X` of the same permission `X`, there always exist `A`, `P'_E`, and `Q'_E` such that
```
P_E -> A + P'_E
Q_E -> A + Q'_E
```
where both rows are witnessed purely by a sequence of `unpack`s.
:::

Below, I call this a _Division_. A greedy algorithm suffices to find this sequence of unpacks (as implemented both in my prototype and in prior Prusti).

Now we will describe a general calculation, which involves determining several sequences of rewrites to both `P` and `Q`. We will show below how these sequences of rewrites suffice to provide annotations at join points and before applying Hoare triples.

```
General repacking algorithm:
Let (P_E/R + P_S + P_e/r) and (Q_E/R + Q_S + Q_e/r) be PCS

/* meet the E/R requirements */
Divide     P_E/R -> A + P'_E/R                         upk(P_E/R)
           Q_E/R -> A + Q'_E/R                         upk(Q_E/R)

/* meet as many S caps as possible */
Divide     P_S -> B + P'_S                             upk(P_S)
           Q_S -> B + Q'_S                             upk(Q_S)

/* Subtype unmet requirements if possible */
Partition  P'_S = RemainderSharp_{P_S} + P''_S         where P''_S is subtypable
           Q'_S = RemainderSharp_{Q_S} + Q''_S         where Q''_S is subtypable
Subtype    P''_S -> (P_S)_E/R                          sub(P''_S)
           Q''_S -> (Q_S)_E/R                          sub(Q''_S)

/* Meet subtyped requirements with E/R */
Divide     Q'_E/R    -> C + Q''_E/R                    upk(Q'_E/R)
           (P_S)_E/R -> C + Remainder_{P_S}            upk((P_S)_E/R)
Divide     P'_E/R    -> D + P''_E/R                    upk(P'_E/R)
           (Q_S)_E/R -> D + Remainder_{Q_S}            upk((Q_S)_E/R)

/* Meet as many drop requirements as possible */
Divide     P_e/r -> F + P'_e/r                         upk(P_e/r)
           Q_e/r -> F + Q'_e/r                         upk(Q_e/r)

/* Subtype unmet drop requirements */
Deepen     P'_e/r -> (P'_e/r)_E/R                      deepen(P'_e/r)
           Q'_e/r -> (Q'_e/r)_E/R                      deepen(Q'_e/r)

/* Meet unmet drop requirements */
Divide     P''_E/R      -> G + Remainder_{P_E/R}       upk(P''_E/R)
           (Q'_e/r)_E/R -> G + Remainder_{(Q_e/r)_E/R} upk((Q'_e/r)_E/R)
Divide     Q''_E/R      -> H + Remainder_{Q_E/R}       upk(Q''_E/R)
           (P'_e/r)_E/R -> H + Remainder_{(P_e/r)_E/R} upk((P'_e/r)_E/R)
```

Now we will use these steps to describe how to implement the two applications. Each has some assumptions, which I've called the _free PCS assumptions_ in the past.

##### Application 1: Join Points

We seek to apply some of these rewrites to both `P` and `Q` in order to obtain a common state.

```
Assumption: All remainders are empty (every capability is accounted for).

Annotations              PCS
-----------------------------------------------------------------------
                         (P_E/R + P_S + P_e/r)
upk(P_E/R)            -> (A + P'_E/R + P_S + P_e/r)
upk(P_S)              -> (A + P'_E/R + B + P'_S + P_e/r)
                       = (A + P'_E/R + B + P''_S + P_e/r)
upk(P'_E/R)           -> (A + D + P''_E/R + B + P''_S + P_e/r)
pk((Q_S)_E/R)         -> (A + (Q_S)_E/R + P''_E/R + B + P''_S + P_e/r)
sup(Q''_S)            -> (A + Q''_S + P''_E/R + B + P''_S + P_e/r)
upk(P''_E/R)          -> (A + Q''_S + G + B + P''_S + P_e/r)
pk((Q'_e/r)_E/R)      -> (A + Q''_S + (Q'_e/r)_E/R + B + P''_S + P_e/r)
drop(Q'_E/R)          -> (A + Q''_S + Q'_e/r + B + P''_S + P_e/r)
upk(P_e/r)            -> (A + Q''_S + Q'_e/r + B + P''_S + F + P'_e/r)
                       = (A + B + F + Q''_S + Q'_e/r + P''_S + P'_e/r)
```

An analogous calculation on `Q` obtains the same state, as required.

##### Application 2: Sequential Statements

We seek to apply a sequence of operations to `P` such that it includes `Q`.

```
Assumptions: All Q remainders are empty (the entire precondition is met).
             P''_S is empty (we never only have (S x) when we need (E x / R x)).
             Similarly P'_e/r is empty (we never only have (shallow(x)) when we need deep(x)).

Annotations              PCS
-----------------------------------------------------------------------------------------------------------
                         (P_E/R + P_S + P_e/r)
upk(P_E/R)            -> (A + P'_E/R + P_S + P_e/r)
upk(P_S)              -> (A + P'_E/R + B + P'_S + P_e/r)
                       = (A + P'_E/R + B + P''_S + P_e/r + RemainderSharp_{P_S})
upk(P'_E/R)           -> (A + D + P''_E/R + B + P''_S + P_e/r + RemainderSharp_{P_S})
pk((Q_S)_E/R)         -> (A + (Q_S)_E/R + P''_E/R + B + P''_S + P_e/r + RemainderSharp_{P_S})
sup(Q''_S)            -> (A + Q''_S + P''_E/R + B + P''_S + P_e/r + RemainderSharp_{P_S})
                       = (A + Q'_S + P''_E/R + B + P''_S + P_e/r + RemainderSharp_{P_S})
pk(Q_S)               -> (A + Q_S + P''_E/R + P''_S + P_e/r + RemainderSharp_{P_S})
upk(P_e/r)            -> (A + Q_S + P''_E/R + P''_S + F + P'_e/r + RemainderSharp_{P_S})
upk(P''_E/R)          -> (A + Q_S + G + Remainder_{P_E/R} + P''_S + F + P'_e/r + RemainderSharp_{P_S})
pk((Q'_e/r)_E/R)      -> (A + Q_S + (Q'_e/r)_E/R + Remainder_{P_E/R} + P''_S + F + P'_e/r + RemainderSharp_{P_S})
weaken(Q'_e/r)        -> (A + Q_S + Q'_e/r + Remainder_{P_E/R} + P''_S + F + P'_e/r + RemainderSharp_{P_S})
pk(Q_e/r)             -> (A + Q_S + Q_e/r + Remainder_{P_E/R} + P''_S + P'_e/r + RemainderSharp_{P_S})
                       = (A + Q_S + Q_e/r + Remainder_{P_E/R} + P''_S + (P'_e/r)_E/R + RemainderSharp_{P_S})
upk((P'_e/r)_E/R)     -> (A + Q_S + Q_e/r + Remainder_{P_E/R} + P''_S + H + Remainder_{(P_e/r)_E/R} + RemainderSharp_{P_S})
                       = (A + Q_S + Q_e/r + Remainder_{P_E/R} + (P_S)_E/R + H + Remainder_{(P_e/r)_E/R} + RemainderSharp_{P_S})
pk((P_S)_E/R)         -> (A + Q_S + Q_e/r + Remainder_{P_E/R} + C + H + Remainder_{(P_e/r)_E/R} + RemainderSharp_{P_S})
pk(Q''_E/R)           -> (A + Q_S + Q_e/r + Remainder_{P_E/R} + C + Q''_E/R + Remainder_{(P_e/r)_E/R} + RemainderSharp_{P_S})
pk(Q'_E/R)            -> (A + Q_S + Q_e/r + Remainder_{P_E/R} + Q'_E/R + Remainder_{(P_e/r)_E/R} + RemainderSharp_{P_S})
pk(Q_E/R)             -> (Q_S + Q_e/r + Remainder_{P_E/R} + Q_E/R + Remainder_{(P_e/r)_E/R} + RemainderSharp_{P_S})
                       = (Q_E/R + Q_S + Q_e/r + Remainder_{P_E/R} + Remainder_{(P_e/r)_E/R} + RemainderSharp_{P_S})

This contains Q.
```

##### Problems with this proof

The proof assumes that the unpacks of deep capabilities are all deep; this is not true for borrows. I think the algorithm is still correct.

Possible problem with this proof for borrows: we have `E x` and need a Hoare triple requiring `e x`. We will drop `E x` (in this case, _drop_ means _kill_).

## Footnotes

^[3] used only in the case of `let x; drop(x);`.

^[1]

^[2] when the signature changes...

^[4] You could think of `intro` instructions like Polonius `InputFacts`. In particular, the coupling graph could also read the `intro` instructions to ensure the two graphs are in sync.

^[5] To stretch an analogy where statically known annotations are _concrete_ and dynamically known annotations (magic wands) are _opaque_: _translucent_ edges statically know the right annotations for each of a set of control-flow paths, but do not statically know which path will be taken.
