# Rebuilding the Wheel - Becoming a Build Job
The [rebuilding the wheel](https://github.com/markmc/rebuilding-the-wheel) prototype has been hugely instructive. We can now bootstrap a self-contained package index containing binary distributions (wheels) for the entire dependency chains of complex packages like PyTorch and LangChain, all built from source. Where to from here?
Assume we are hosting a package index containing these wheels, and we wish for new versions of existing packages to automatically be built and added to the index.
Human input would usually only be required when new packages are to be added to the index - either because we wish to add a new toplevel package, or because a new version of an existing package has added a new package as a dependency.
The purpose of this document is to design the steps of a build job that would be triggered under a number of circumstances:
1. The upload of a new version of an existing package to PyPI
2. The approval of a new PyPI package by a human reviewer
3. The rebuild of an existing package version
4. The change of any per-package metadata, patches, etc. that are used by the build process
## Download the source
The default case is to download an sdist from PyPI given the package name and the new version (or the latest version, for new packages).
We might support alternative source locations for some packages (e.g. a github repo), but even in those cases we would ultimately create an archive (e.g. from a git clone) so that all projects have the same source distribution format.
We would validate the downloaded source, starting with its checksum, but over time this could include security scanning or other automated analysis.
The archive would be stored for the next step, or subsequent rebuilds, or for auditing purposes.
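As a rough illustration of the default case (not the prototype's actual code - see the link below), a minimal sketch using only the standard library and PyPI's JSON API; the function name and error handling are placeholders:

```python
import hashlib
import json
import pathlib
import urllib.request


def download_sdist(name: str, version: str, dest_dir: str = ".") -> pathlib.Path:
    """Download the sdist for name==version from PyPI and verify its sha256."""
    url = f"https://pypi.org/pypi/{name}/{version}/json"
    with urllib.request.urlopen(url) as resp:
        release = json.load(resp)
    # Pick the sdist out of the release's file list (the rest are wheels).
    sdist = next(f for f in release["urls"] if f["packagetype"] == "sdist")
    dest = pathlib.Path(dest_dir) / sdist["filename"]
    urllib.request.urlretrieve(sdist["url"], dest)
    # Validate the download against the checksum PyPI publishes.
    digest = hashlib.sha256(dest.read_bytes()).hexdigest()
    if digest != sdist["digests"]["sha256"]:
        raise RuntimeError(f"checksum mismatch for {dest}")
    return dest
```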
In the case of rebuilds, this step would be skipped.
Relevant code: [sources.download_source()](https://github.com/markmc/rebuilding-the-wheel/blob/59fc62d7730ffbc32d6f453b6fe16d552eb6f776/mirror_builder/sources.py#L14) - but note that no "resolution" is required except in the "latest version for new packages" case.
## Prepare build environment
Create a fresh build environment, where we ensure reproducibility of what is available during the build.
We'd use containers for this, but the philosophy would borrow from e.g. [mock](https://rpm-software-management.github.io/mock/).
Some build requirements will be installed using RPMs - e.g. [cmake, auto*, gcc, rust, etc.](https://github.com/markmc/rebuilding-the-wheel/blob/main/Containerfile) - the list of these build requirements is not available from any upstream metadata source, so this will need to be stored somewhere.
All other build requirements will be wheels - let's assume for now we know what these requirements are, and we install them in advance from our package index into a virtual environment.
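A minimal sketch of that last part, assuming we already have the list of wheel-based build requirements and an index URL as inputs - this snippet does not derive them:

```python
import subprocess
import venv


def prepare_build_venv(env_dir: str, build_requires: list[str], index_url: str) -> None:
    """Create a fresh virtualenv and pre-install the wheel-based build
    requirements, resolving them only from our own package index."""
    venv.EnvBuilder(with_pip=True, clear=True).create(env_dir)
    subprocess.run(
        # --index-url replaces PyPI entirely, so nothing can leak in from upstream.
        [f"{env_dir}/bin/pip", "install", "--index-url", index_url, *build_requires],
        check=True,
    )
```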
### Determining (wheel-based) build requirements
In the prepare build environment step, we deferred the question of how to know which wheels to install into the build environment.
One option for how to handle this is a build requirements analysis step, where we use packages from our package index to recursively analyze the build requirements of the package ([relevant code](https://github.com/markmc/rebuilding-the-wheel/blob/59fc62d7730ffbc32d6f453b6fe16d552eb6f776/mirror_builder/sdist.py#L9)) and then store that list for the subsequent step.
In this case, if a new package was detected, this would get queued up as a new package request for review by a human, and the build would fail.
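The simplest layer of that analysis is reading the statically declared requirements; a sketch, assuming `pyproject.toml` is present in the unpacked source:

```python
import tomllib  # standard library in Python 3.11+


def static_build_requires(src_dir: str) -> list[str]:
    """Read the statically declared build requirements from pyproject.toml.

    This is only the first layer: the PEP 517 get_requires_for_build_wheel
    hook can add dynamic requirements, and each requirement has build
    requirements of its own - hence the recursive analysis.
    """
    try:
        with open(f"{src_dir}/pyproject.toml", "rb") as f:
            pyproject = tomllib.load(f)
    except FileNotFoundError:
        # Legacy setup.py-only projects get the historical default.
        return ["setuptools", "wheel"]
    return pyproject.get("build-system", {}).get("requires", ["setuptools", "wheel"])
```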
However, there is a simpler option - we could simply allow the build process to automatically install the build dependencies it requires from our package index, and fail if a required dependency is not in the index.
PyTorch [lists cmake as a build dependency](https://github.com/pytorch/pytorch/blob/836a86064cf27de91479dddd1d834dd57bb0bd07/pyproject.toml#L9) such that it gets installed from [a package on PyPI](https://pypi.org/project/cmake/). But it seems pretty straightforward that we would instead want to use [the system package](https://packages.fedoraproject.org/pkgs/cmake/cmake/).
## Prepare source
Unpack the source into the (not-yet-running) build environment and apply any patches.
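A sketch of this step, assuming the per-package patches are plain files applied with `patch -p1` (the patch handling here is hypothetical, not the prototype's):

```python
import pathlib
import subprocess
import tarfile


def prepare_source(sdist_path: str, work_dir: str, patches: list[str]) -> pathlib.Path:
    """Unpack the sdist into the build environment and apply any patches."""
    with tarfile.open(sdist_path) as tf:
        # The "data" filter guards against path traversal (Python 3.12+,
        # also backported to recent 3.8-3.11 releases).
        tf.extractall(work_dir, filter="data")
    # An sdist unpacks to a single <name>-<version>/ directory.
    (src_dir,) = pathlib.Path(work_dir).iterdir()
    for patch in patches:
        with open(patch) as f:
            subprocess.run(["patch", "-p1"], stdin=f, cwd=src_dir, check=True)
    return src_dir
```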
## Build wheel
Launch the build environment - presumably a container, per the above.
However, ensure that network access is blocked, so that everything required by the build comes from the build environment itself.
If all required wheels are already installed, there is no need to provide access even to our package index.
Build the package with `pip wheel` by default, as per [build_wheel()](https://github.com/markmc/rebuilding-the-wheel/blob/59fc62d7730ffbc32d6f453b6fe16d552eb6f776/mirror_builder/wheels.py#L20).
Store the produced wheel.
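Putting the pieces of this section together, a hedged sketch using podman - the image name, mount points, and the assumption that `pip` is on the image's PATH are all illustrative:

```python
import subprocess


def build_wheel(image: str, src_dir: str, wheel_dir: str) -> None:
    """Run `pip wheel` inside the prepared container image with
    networking disabled, so the build can only use what is already
    in the build environment."""
    subprocess.run(
        [
            "podman", "run", "--rm",
            "--network", "none",        # block all network access
            "-v", f"{src_dir}:/src:Z",
            "-v", f"{wheel_dir}:/wheels:Z",
            image,
            # --no-build-isolation: use the build requirements we
            # pre-installed, rather than letting pip fetch them.
            "pip", "wheel", "--no-build-isolation",
            "--wheel-dir", "/wheels", "/src",
        ],
        check=True,
    )
```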
## Validate install-time dependencies
Using the newly built wheel, analyze its regular dependencies.
If a new package is detected, this would get queued up as a new package request for review by a human, and the build would fail.
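A sketch of the dependency extraction: a wheel is a zip file, and its install-time dependencies are the `Requires-Dist` headers in the RFC 822-style `.dist-info/METADATA` file inside it.

```python
import zipfile
from email.parser import BytesHeaderParser


def install_requires(wheel_path: str) -> list[str]:
    """Read the Requires-Dist entries from the wheel's .dist-info/METADATA.

    Entries may carry extras and environment markers, e.g.
    'idna<4,>=2.5; extra == "socks"' - a real implementation would
    evaluate those before comparing against the index contents.
    """
    with zipfile.ZipFile(wheel_path) as zf:
        name = next(n for n in zf.namelist() if n.endswith(".dist-info/METADATA"))
        with zf.open(name) as f:
            metadata = BytesHeaderParser().parse(f)
    return metadata.get_all("Requires-Dist") or []
```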
Now test that the wheel and all of its dependencies can be installed using only our package index.
Potentially perform other smoke-testing.
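A minimal install smoke test, in a throwaway virtualenv pointed only at our index (the temporary-directory handling is illustrative):

```python
import subprocess
import tempfile
import venv


def smoke_test_install(wheel_path: str, index_url: str) -> None:
    """Verify the wheel and its full dependency chain install cleanly
    using nothing but our package index."""
    with tempfile.TemporaryDirectory() as env_dir:
        venv.EnvBuilder(with_pip=True).create(env_dir)
        subprocess.run(
            [f"{env_dir}/bin/pip", "install", "--index-url", index_url, wheel_path],
            check=True,
        )
```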
## Update index
Place the newly built wheel into the index, and re-generate the index listing.
We would likely have a separate index storing source archives for reference. With a wheels-only index, nobody using the index will unwittingly be building from source - on a different architecture, Python version, etc. The source archives could be stored on a regular web server with a simple index.
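For the regeneration step, a sketch of a minimal PEP 503 "simple" project page; a complete implementation would also emit the top-level project list and `#sha256=...` hash fragments on each link:

```python
import html
import pathlib


def write_project_index(wheel_dir: str) -> None:
    """Regenerate a minimal PEP 503 index page listing every wheel
    we have built for one project."""
    wheels = sorted(pathlib.Path(wheel_dir).glob("*.whl"))
    links = "\n".join(
        f'    <a href="{html.escape(w.name)}">{html.escape(w.name)}</a><br/>'
        for w in wheels
    )
    page = f"<!DOCTYPE html>\n<html>\n  <body>\n{links}\n  </body>\n</html>\n"
    pathlib.Path(wheel_dir, "index.html").write_text(page)
```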
## Misc/TODO
Thoughts/comments/discussion points to incorporate above.
* Automated update tracking - for packages that we have accepted into our system's allow-list, we would monitor PyPI for new versions and automatically kick off build jobs for the new version.
* Build job mode - a build job for package `foo-1.2.3` should only build `foo-1.2.3.whl`. Or conversely, you should be able to query for the build job that built `foo-1.2.3.whl` and look through the logs etc.
* Bootstrap mode - analyze an entire dependency tree and kick off build jobs bottom-up all the way to the toplevel package.
* Ideally, we should be able to bump a part of the version number when we rebuild a particular version of a package - e.g. `requests-2.28.2-5` is the fifth rebuild of the 2.28.2 release. Are [post-releases](https://packaging.python.org/en/latest/specifications/version-specifiers/#post-releases) the answer? Or [local versions](https://packaging.python.org/en/latest/specifications/version-specifiers/#local-version-identifiers)? (See the sketch after this list.)
* There may not be a consistent way to inject a special version number into the metadata when building, so we may need multiple approaches depending on the build tool.
* Unlike in Fedora, users expect multiple package versions in a Python Package Index - e.g. `torch==2.1.1`, `torch==2.2.*` - so part of our bootstrapping should involve building versions other than the latest.
* The ability to audit/inspect the build for a given wheel is of value - one of our build artifacts should be the output of `pip freeze` in the build environment.
* Potential upstream improvements for Python packaging
* Some way to differentiate copies of the same distribution for optimized builds using info other than just CPU architecture (GPUs, etc.). This could be an extension to the architecture field (combining CPU and GPU arch, for example), or some other metadata that can be queried via a marker that the installer can determine. Maybe there are other options?
* The prototype builds wheels for multiple versions of Python, but what will make sense in practice? Should an index for a given Fedora version be expected to work with the default version of Python in that version of Fedora? Or all versions of Python available in that version of Fedora? If we did decide to support multiple Python versions for a given Fedora version, should we have a separate set of wheels for each version?
* [dhellmann] Different versions of Python do typically require different binary wheels for the same dist because the ABI of Python can change and the builds have to take that into account. The python version is included in the wheel filename, and pip is smart enough to identify the right wheel and use it, so we should be able to serve the same content out of a single wheel repo.
* [dhellmann] If we start doing optimized builds for reasons other than the Python version, CPU arch, OS, etc. -- such as to incorporate the GPU architecture -- then we will have to do something different because the python toolchain does not understand that sort of binary difference. (See the previous point about multiple versions of packages.)
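For reference on the rebuild-numbering question above, a short sketch with the [packaging](https://pypi.org/project/packaging/) library showing how the two candidate schemes behave:

```python
from packaging.version import Version

# A post-release sorts after the base version, so `pip install requests`
# would pick up the rebuild automatically.
post = Version("2.28.2.post5")
assert post > Version("2.28.2")

# A local version identifier also sorts after the base release, and
# per PEP 440 a pin like `requests==2.28.2` still matches it, since
# local labels are ignored when matching a public version specifier.
local = Version("2.28.2+rebuild.5")
assert local > Version("2.28.2")
assert local.local == "rebuild.5"
```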