Python Package Management
Update 2024-04-07
I've moved to using remote containers over SSH.
I use pdm as a Python package manager, which is something like npm. I've found it much better in terms of updating dependencies (pdm update -u) and ensuring build reproducibility through the use of lockfiles.
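With pdm, dependencies live in pyproject.toml and the resolved versions are recorded in pdm.lock; pdm install then reproduces that exact environment. A minimal sketch of such a file (project name and dependency versions are illustrative, not from a real project):

```toml
# pyproject.toml — minimal sketch of a pdm-managed project
[project]
name = "my-app"              # hypothetical project name
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
    "requests>=2.31.0",      # resolved version is locked in pdm.lock
]
```

Running pdm lock resolves these specifiers into pdm.lock, and pdm update -u re-resolves them, ignoring the existing pins.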
Regarding automatic dependency updates and deployment:
- The main use of GitHub workflows is automatically rebuilding hosted web apps when dependencies are detected to have changed.
- If the project is hosted locally, I can update dependencies myself with pdm update -u or ncu -u now and then.
- Dependabot doesn't support pdm lockfiles, and even if it did, when there was an update I would still have to access the terminal to rebuild the container, in which case I might as well have done the update and tested it there and then myself.
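The rebuild-on-change idea from the first bullet can be sketched as a GitHub Actions workflow that fires only when the lockfile changes. This is a minimal sketch, not a workflow from the post: the deploy.sh script is a hypothetical stand-in for whatever rebuilds the hosted app.

```yaml
# .github/workflows/rebuild.yml — sketch only
name: rebuild-on-dependency-change
on:
  push:
    paths:
      - "pdm.lock"          # trigger only when locked dependencies change
jobs:
  rebuild:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Rebuild and deploy
        run: ./deploy.sh    # hypothetical deploy script
```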
Update 17/11/21
I now use WSL2 containers in VSCode. Additionally, I use requirements.txt files generated with:

pip list --format freeze --not-required

The --not-required flag ensures only top-level packages (those not required by any other installed package) are printed.
In addition, I use == X.Y.* specifiers (equivalent in effect to the compatible release specifier ~= X.Y.0) to pin the major and minor version of packages, while allowing patch versions to be updated automatically each time the container is rebuilt. This prevents surprises during deployment (e.g. missing functionality in minor versions) while allowing bugfix versions to be upgraded automatically without hassle.
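Turning the exact pins from pip list --format freeze into these X.Y.* pins is mechanical. A small sketch of that conversion, with a hypothetical helper name (pin_minor) not taken from the post:

```python
import re


def pin_minor(freeze_line: str) -> str:
    """Convert an exact pin like "requests==2.31.0" into a
    major.minor wildcard pin like "requests==2.31.*"."""
    match = re.fullmatch(
        r"([A-Za-z0-9._-]+)==(\d+)\.(\d+)(?:\..*)?",
        freeze_line.strip(),
    )
    if match is None:
        return freeze_line  # leave anything unexpected untouched
    name, major, minor = match.groups()
    return f"{name}=={major}.{minor}.*"


# Example: each freeze line becomes a patch-floating pin.
print(pin_minor("requests==2.31.0"))  # requests==2.31.*
```

Piping each line of the freeze output through a helper like this yields a requirements.txt that pins major and minor versions while letting patch releases float.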