When building Python libraries, a developer only wants to declare the compatibility range of the runtime dependencies. This ensures that consumers can benefit from future patched releases of those dependencies.
Still, when it comes to testing, always installing the bleeding-edge versions published on PyPI is a recipe for breaking your pipelines. Most of the bugs introduced by newly published packages are fixed within a few days. As you probably do not want to pay the price of being an early adopter, you may like the idea of controlling the test dependencies and upgrading them only when you want.
Python developers can learn here from the JavaScript world, where you get pinned dependencies inside package-lock.json while keeping compatibility ranges defined in package.json.
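The Python equivalent of that split looks roughly like the sketch below: compatibility ranges declared in setup.cfg, and exact pins in a generated constraints file. The package names and versions here are purely illustrative.

```ini
# setup.cfg -- declares only compatibility ranges (illustrative names)
[options]
install_requires =
    requests>=2.20,<3.0
    PyYAML>=5.1

# constraints.txt -- generated, holds the exact pins (illustrative output):
#   requests==2.25.1
#       # via my-package (setup.cfg)
#   PyYAML==5.4.1
#       # via my-package (setup.cfg)
```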
Using the pip-compile utility, which is part of the pip-tools package, one can easily produce lock files very similar to the output of pip freeze. The difference is that they can be produced without creating a virtualenv, and that they can be nicely annotated to document where each dependency comes from.
pip-compile can read dependencies from any combination of setup.py, setup.cfg, or standalone requirements.in files, making it very easy to run. By default, the output is saved to requirements.txt.
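Assuming pip-tools is installed, a typical invocation looks like the sketch below; the exact source files and flags vary per project.

```shell
# Install pip-tools (provides pip-compile and pip-sync)
pip install pip-tools

# Read dependencies from setup.cfg (or setup.py / requirements.in)
# and write pinned, annotated output to requirements.txt
pip-compile setup.cfg --output-file requirements.txt
```

The generated file annotates each pin with the source it came from, which makes reviewing dependency bumps much easier.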
I would really advise you to read the entire README of the project, as it explains its use very well.
pip-compile can produce different results depending on the operating system or Python version being used. If this happens for your project, you may have to produce a separate file for each supported platform. That case is also covered in the pip-tools README, but so far I have been able to avoid it.
If you have ever used multiple requirement files with pip, you probably discovered that it is very easy to end up with conflicts between them. The secret is to avoid using multiple requirement files: if you use pip-compile to produce a single requirements file that covers runtime and all testing requirements, you can use that file as a pip constraint when installing them.
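In practice that means compiling everything into one file and then pointing pip at it with -c; the input file names below are illustrative.

```shell
# Compile runtime + test requirements into a single pinned file
pip-compile setup.py test-requirements.in --output-file requirements.txt

# Install the package itself, constrained by the pinned versions,
# so every dependency resolves to exactly the locked release
pip install -e . -c requirements.txt
```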
So for our use case we want to generate a constraints.txt. This file never ships; it only ensures that the test requirements stay locked/pinned until we decide to update them.
When we run pip-compile, we make use of the upstream OpenStack upper-constraints, so our result stays compatible with OpenStack. The difference is that we include more dependencies, ones which cannot be included in global-requirements for various reasons (popularity, licensing, or bureaucracy).
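A sketch of such a run, assuming the OpenStack upper-constraints file is fetched and referenced from the .in file (pip-tools honors `-c` lines inside requirements.in); the URL and file names are illustrative:

```shell
# Fetch the upstream OpenStack upper-constraints (URL is illustrative)
curl -o upper-constraints.txt \
    https://releases.openstack.org/constraints/upper/master

# requirements.in contains a "-c upper-constraints.txt" line, so
# pip-compile resolves pins that stay within the OpenStack bounds
pip-compile requirements.in --output-file constraints.txt
```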
Some time ago I saw others using a tox -e deps
environment which basically calls pip-compile to update the pinned dependencies. As the arguments passed to pip-compile may differ from project to project, and some projects may even require multiple runs, it is much easier to have a tox environment that does only that. Keep in mind that this environment is not supposed to run in a CI/CD pipeline, as it would likely always report changes to the tracked files.
This is part of the https://projects.engineering.redhat.com/browse/TRIPLEOCI-512 ticket, and we have currently adopted it in the following repos:
Merged
Pending
Next
Next (stage2)