
Upstreaming Rust-Enzyme

After 3 years of experimenting, we (Manuel Drehwald, Lorenz Schmidt, Jed Brown, William Moses, Ningning Xie) propose to upstream our work on automatic differentiation (in the calculus sense) into rustc. The user-facing change would be the introduction of a rustc_builtin_macro called autodiff. A compiler-team MCP was previously submitted and accepted here.

Use Cases

Automatic differentiation (a.k.a. algorithmic differentiation or backpropagation) is used across scientific computing and HPC (e.g., climate, shape optimization, and other simulations), computer vision (bundle adjustment, SLAM), and machine learning (e.g., training of neural networks) for model formulation, optimization, and sensitivity analysis/uncertainty quantification.

Companies, national labs, and university research groups

This list of stakeholders that would like to use Rust AD is biased by the communities the authors work in. It includes groups that are not currently working in Rust but are interested in doing so, as well as groups already working in Rust who would like AD.

| Autodiff users / interests | Application |
| --- | --- |
| Alan Aspuru-Guzik (University of Toronto) | Manuel's supervisor; automation of chemistry laboratories (https://www.matter.toronto.edu/) |
| Alec Jacobson (University of Toronto) / Xiaochun Tong (University of Waterloo) | Differentiable rendering with C++-Enzyme: GitHub; differentiable rendering in Rust: SIGGRAPH paper. Differentiable rendering with Rust + Enzyme next? |
| Burn/Rai/<TBA> | ML frameworks for Rust that want an Enzyme backend |
| Earth, Atmosphere, Planetary Science (MIT) | Climate simulations in Julia, using Enzyme (unlikely to move to Rust) |
| Jan Hückelheim (Argonne National Lab) | US Department of Energy applications |
| Jed Brown (CU Boulder) | Constitutive modeling and differentiable computational fluid and structural mechanics (partial Rust support) |
| Martin Robinson (Oxford Research Software Engineering) | ODE solver using Rust + Enzyme: link |
| Ningning Xie (UofT, JAX team @ Google DeepMind) | Interested in autodiff in Rust; co-supervises Manuel's research |
| Nils Peters (FAU Erlangen-Nürnberg) | Lorenz's supervisor; autodiff for embedded audio devices for IoT applications in Rust |
| Paul Goulart (University of Oxford) | Convex optimization in Rust and Julia. Interested in "making a general non-convex optimiser [], in which case automatic differentiation might be very useful" |
| Rodrigo Vargas (McMaster University) | Computing molecular forces with Rust (Faer + Enzyme); paper under review |
| Sasha Rush (Cornell) | Interested in ML/NLP in Rust; would like to experiment with Rust-Enzyme |
| Timo Betcke (UCL) | Numerical methods for inverse problems in Rust; interested in the batching (vectorization) feature of Enzyme |
| William Moses (University of Illinois Urbana-Champaign) | Enzyme core lead developer |
| Xanadu.ai | Enzyme for quantum computing: GitHub |

PhD students and independent developers/researchers

We have also received Discord/GitHub/email reactions from individual Ph.D. students and other developers.

| Autodiff users | Assumed application |
| --- | --- |
| Quinn Sinclair (m-rph) | More flexible AD for reinforcement learning: zulip |
| Easton Potokar (CMU) | "Evaluating whether the needed tools for robotics were available in Rust yet. [] (linalg, autodiff, message passing, etc.)" |
| Andreas Longva (RWTH Aachen) | Constitutive modeling; differentiation of energies and forces that arise in multiphysics simulation |
| John ArbitRandomUser | Interactive example using Enzyme + wasm |
| Lukas Lipp (fknfilewalker, Vienna University of Technology) | Computer graphics; considers switching to Rust/Enzyme |
| Jose Melo (jmelo11) | Finance / risk assessment (?) |

Our approach is based on Enzyme, an LLVM incubator project that, given a function in LLVM-IR, computes gradients with respect to a user-defined subset of input and output parameters. Doing this work on LLVM-IR has two benefits. First, running AD after/in combination with LLVM optimizations gave a 4.2x median performance improvement over doing AD before optimizations. Second, the work to develop such an AD tool can be shared between multiple languages.

AD at the Rust level instead of the rustc level is already explored by various packages like burn, candle, dfdx, rai, and others. They strongly focus on neural networks, which in a certain sense are "nice": you have a somewhat limited set of operations and usually a few large matrices. Scientific computing and HPC projects do not always map nicely onto these two properties. We talked with burn and rai devs who would be happy to experiment with Enzyme as a nightly feature/backend in order to cover more cases. We also know of various groups in scientific computing and HPC who would like to use Rust-Enzyme directly once it is on nightly. Their main reasons are the performance of Enzyme and the ability to differentiate arbitrary functions and arbitrary types: other tools require users to adopt their own tensor types to compute gradients, which requires rewriting applications and limits interaction with other libraries.
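
To make the intended user-facing surface concrete, here is a minimal sketch of the `autodiff` macro on a scalar function, mirroring the signature transformation described in point 3 further below. The feature gate, import path, and activity names (`Reverse`, `Duplicated`, `Active`) are assumptions based on the Rust-Enzyme frontend documentation and may differ in detail on nightly.

```rust
// Sketch only: feature gate, import path, and activity names are assumptions
// and may differ from what eventually lands on nightly.
#![feature(autodiff)]
use std::autodiff::autodiff;

// Reverse mode: `Duplicated` gives `x` a mutable shadow `bx` that receives the
// gradient, and `Active` on the return value adds a seed argument `by`.
// The macro generates (roughly): fn df(x: &f32, bx: &mut f32, by: f32) -> f32
#[autodiff(df, Reverse, Duplicated, Active)]
fn f(x: &f32) -> f32 {
    x * x
}

fn main() {
    let x = 3.0_f32;
    let mut bx = 0.0_f32;
    let y = df(&x, &mut bx, 1.0); // seed ȳ = 1.0
    assert_eq!(y, 9.0);               // primal value f(x)
    assert!((bx - 6.0).abs() < 1e-6); // df/dx = 2x = 6
}
```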

We wrote a Major Change Proposal to upstream Rust-Enzyme one year ago, which got accepted, but no one involved at the time realized that this would need t-lang approval, so now we are back. Over the last year we addressed all three main concerns that were raised in the discussion of the MCP:

  1. We moved our autodiff macro from a proc-macro to a rustc builtin macro (so we no longer break cross-compilation).

  2. We added Enzyme CI that runs against LLVM main, so we know immediately when an LLVM commit breaks Enzyme. Enzyme maintains compatibility with a broad range of LLVM versions, and we expect support will be in place well ahead of rustc bumping to a new version of LLVM. We also added an x.py flag to toggle Enzyme builds on and off; if rustc wants to update to a new LLVM and Enzyme is a few days late with support, we can simply ship some nightlies with autodiff disabled (we also added an error message to recognize this case and report it to users). We also added Rust CI tests and Rust-style error messages for usage mistakes.

  3. We started adding safety checks and warnings as a proof of concept. Those safety concerns mainly regard mutability and size compatibility, as in the following example. When you differentiate a function fn f(x: &f32) -> f32 with respect to the argument x, Enzyme (reverse mode) will create a function fn df(x: &f32, bx: &mut f32, by: f32) -> f32, where bx is a new variable into which Enzyme will write the gradient. (Specifically, \(\bar x \gets \bar y \frac{\partial f}{\partial x}\).) bx should therefore be mutable and as large as x. This is trivial when x: &f32 as above, but becomes more subtle for a DST such as x: &[f32] or x: &[&[f32]], where the corresponding bx: &mut [f32] or bx: &mut [&mut [f32]] must have matching (nested) shape (see the slice sketch after this list). We have added safety checks for both cases (at both run-time and compile-time) and have a plan to tackle the not-yet-covered cases. Those improvements will, however, probably be best implemented on MIR, which is new for us and therefore a good point to stop and ask for upstreaming first. Maintaining an out-of-tree fork takes time that we would rather spend improving safety and usability. Also, getting into nightly will enable richer and more diverse feedback, since many library authors and research groups interested in using Rust-Enzyme are reluctant to invest significant time before it becomes available on nightly.
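
To make the shape requirement from point 3 concrete, here is a hedged sketch for the slice case, reusing the assumed attribute spelling from the example above; the generated signature follows the description in point 3 and may differ in detail.

```rust
// Sketch only: the shadow `bx` must have the same length as `x`; a mismatch
// is what the run-time/compile-time checks mentioned above are meant to catch.
#[autodiff(d_sum_sq, Reverse, Duplicated, Active)]
fn sum_sq(x: &[f32]) -> f32 {
    x.iter().map(|v| v * v).sum()
}

fn main() {
    let x = [1.0_f32, 2.0, 3.0];
    let mut bx = [0.0_f32; 3]; // same shape as `x`
    let y = d_sum_sq(&x, &mut bx, 1.0);
    assert_eq!(y, 14.0);
    assert_eq!(bx, [2.0, 4.0, 6.0]); // d/dx_i (Σ x_i²) = 2 x_i
}
```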

For prior art on autodiff in general, see here.

For other Enzyme frontends, see here.

For using our macro, see here.

For three research papers on LLVM-based autodiff, see here.

Current limitations are described here.

Future work is described here.

The motivation for the implementation inside of rustc and a discussion of alternative approaches are here.

Our question to t-lang:

We currently support various environment variables to simplify debugging, mentioned here.
We would prefer to leave them in all builds, but we understand that some of them are quite invasive to the compilation process, and we might not want to give end users this level of control, even on nightly. We assume we should therefore disable them in official nightly builds and only enable them for developers building their own rustc?

Previous questions:

To repeat (and in part update) the main questions raised in the MCP discussion:

  1. How closely does Enzyme follow LLVM head? -> Resolved. We have had CI running against HEAD for several months, and a single flag disables building AD. Nikita was fine with that the last time we talked.

  2. What about technical debt? -> Improved. We cleaned up / upstreamed some Enzyme-specific code into LLVM, and we support an increasing number of languages. Remaining work to support the MSVC target is tracked here.

  3. Unfinished features -> Resolved. As mentioned, not all safety checks are implemented, but we implemented some of them, and in earlier discussions we were told that not everything merged into nightly is expected to be 100% safe. If users actively try to break Rust-Enzyme, they can, but that seems acceptable. Having more users would of course speed up testing.

  4. What about generics and constexpr? -> Already resolved. Generics work out of the box; after all, they just create multiple LLVM-IR function copies (see the sketch after this list). We can only differentiate what is lowered to LLVM-IR, but that is not an issue unless someone tries to train neural networks or run climate simulations at compile time.

  5. What about codegen backends other than rustc_codegen_llvm? -> Unchanged. Enzyme does support CUDA, so rustc_codegen_nvvm should work. People who want to use multiple backends could use Enzyme through Burn, rai, etc., so that they have multiple backends available and use Enzyme only when supported. People who care more about performance and features can use it directly. A lot of people in scicomp/HPC seem to prefer better runtime performance and support for more features over compile-time improvements, but if more people care about AD for cg_clif or cg_gcc, it shouldn't be impossible to implement, see 7). After all, Enzyme, Rust-Enzyme, and the other frontends started to exist because for each of them 1-2 students were interested in doing the work.

  6. How much code will you add to rustc and who will maintain it? -> Last year it was 2k LoC + 1k tests; now it will be around 3k + 1.5k, mostly because we added a large number of checks and warnings. Otherwise unchanged. We (Manuel, Lorenz, Jed) will maintain the Enzyme frontend as we have for the last few years. We also got various messages from other users on Discord/email/GitHub who would like to help, so we hope to grow a small wg-autodiff/scicomp/hpc/ml community.

  7. (continued from 6) Generally, Enzyme keeps its "rules" for differentiating LLVM instructions in a declarative TableGen file, InstructionDerivatives.td, and generates most code from there. The remaining code to handle LLVM-IR is ~50k LoC of C++, though that also covers features not relevant for autodiff or Rust. There are intentions to ask for LLVM upstreaming later this year.
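
As a small illustration of point 4: generic code is monomorphized before Enzyme ever sees the LLVM-IR, so a generic helper can be differentiated through concrete instantiations without special handling. Again a hedged sketch, reusing the assumed attribute spelling from the earlier examples.

```rust
// Sketch only: `poly` is generic, but codegen produces concrete LLVM-IR copies
// for f32 and f64, which is all Enzyme needs to see.
fn poly<T>(x: T) -> T
where
    T: Copy + std::ops::Add<Output = T> + std::ops::Mul<Output = T>,
{
    x * x + x
}

// Generates (roughly): fn d_poly32(x: &f32, bx: &mut f32, by: f32) -> f32
#[autodiff(d_poly32, Reverse, Duplicated, Active)]
fn poly32(x: &f32) -> f32 {
    poly(*x)
}

// Generates (roughly): fn d_poly64(x: &f64, bx: &mut f64, by: f64) -> f64
#[autodiff(d_poly64, Reverse, Duplicated, Active)]
fn poly64(x: &f64) -> f64 {
    poly(*x)
}
```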

Further notes:

  1. Strongly related features: "Batching" and "Custom Derivatives" are currently not implemented due to open design questions. Batching ("merge N function calls into one run and compute N outputs at once") reduces the overhead of backpropagation/autodiff significantly. Custom derivatives tell AD tools not to differentiate certain subfunctions (often for numerical stability) but instead to "plug in" a user-provided derivative that is better suited in the context of AD. Both features are supported by all major AD tools and we will implement them later (Enzyme supports them already; it is just a question of how to expose them to Rust). We can ask t-lang again for approval later, or it could be considered part of this request.

  2. Further benefits: Autodiff cares a lot about aliasing and mutability. If Rust fails to annotate something with noalias, LLVM might merely miss an optimization. Enzyme, in contrast, might have to recompute and/or cache a whole matrix if it cannot prove that an unrelated write to some pointer does not overwrite that matrix. The overhead of failing to provide good information to LLVM is therefore higher for AD, which in turn could benefit Rust, since the motivation to fix these issues is much higher for people working with autodiff.

  3. Future/Independent work: If rustc ever decides to have an MLIR backend or one for LLVM-based GPU computing, Enzyme will already have support for it. (Please reach out if you are interested.) Manuel and Jed are going to work on two different approaches to improve GPU support for Rust in the summer.

  4. Reviewer: Oli-obk agreed to review our upstreaming PR.

  5. Various past discussions on the design happened in zulip in the wg-autodiff stream: https://rust-lang.zulipchat.com/#narrow/stream/390790-wg-autodiff
