Create pinned requirements.txt inside a Docker image using pip-tools

Overview

Pin your Python dependencies!

pin-requirements.py is a script that lets you pin your Python dependencies inside a Docker container.

  • Pinning your dependencies is great because it gives you reproducible builds. See below for more motivation.
  • pip resolves different dependencies depending on your Python version and operating system. So if you're deploying on Linux, doing the pinning inside a matching Docker container gives you consistent, correct pins.

pin-requirements.py is a script, based on pip-tools, that takes the high-level requirements from requirements.in and transitively pins them into an output file, requirements.txt.

Just create a requirements.in listing your top-level dependencies:

flask>1.0

And then run:

$ pin-requirements.py --image python:3.9-slim

You will now have a requirements.txt file that looks a little like this:

# ...
click==8.0.3 \
    --hash=sha256:353f466495adaeb40b6b5f592f9f91cb22372351c84caeb068132442a4518ef3 \
    --hash=sha256:410e932b050f5eed773c4cda94de75971c89cdb3155a72a0831139a79e5ecb5b
    # via flask
flask==2.0.2 \
    --hash=sha256:7b2fb8e934ddd50731893bdcdb00fc8c0315916f9fcd50d22c7cc1a95ab634e2 \
    --hash=sha256:cb90f62f1d8e4dc4621f52106613488b5ba826b2e1e10a33eac92f723093ab6a
    # via -r /input/requirements.in
# ...

(Choose the matching Docker image for whatever Python version you actually use in production.)

Every time you run the script, all requirements are re-pinned to the latest versions that satisfy the constraints in requirements.in.

Check both requirements.in and requirements.txt into version control, install your application's dependencies from the latter, and update your dependencies by re-running the command above.
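
For example, when you build your application (in a Dockerfile RUN step or anywhere else), install from the pinned file; because the file includes hashes, pip can verify every package it downloads. A minimal sketch:

$ pip install --require-hashes -r requirements.txt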

To learn more about what the tool is doing, see the underlying pip-tools documentation.
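
Under the hood, the script is roughly equivalent to running pip-compile inside the chosen image, with your project directory mounted at /input. A sketch of such an invocation (an approximation for illustration, not the script's exact command):

$ docker run --rm -v "$PWD":/input python:3.9-slim \
    sh -c 'pip install pip-tools && pip-compile --generate-hashes --output-file /input/requirements.txt /input/requirements.in'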

This tool is sponsored by the Python on Docker Production Handbook, your complete reference for packaging Python applications for Docker in production.

Motivation

Note that everything I'm discussing here is focused on applications; libraries are a whole different story.

On the one hand, you want your builds to be reproducible: whenever you package or install your software, it should install the same dependencies. Pinning your dependencies to specific versions is how you do this, and you want to pin all dependencies, including dependencies-of-dependencies.

On the other hand, you need to update your dependencies... and a fully pinned set of dependencies is a pain in the ass to update, since it is overly constrained.

Thus, every application really requires two different sets of dependency description files:

  1. The logical, direct dependencies. For example, "this needs at least Flask 1.0 to run".
  2. The complete set of dependencies, including transitive dependencies (dependencies-of-dependencies), pinned to particular versions. For example, this might be "Flask==1.0.3, itsdangerous==1.1.0, werkzeug==0.15.4, click==7.0, jinja2==2.10.1, markupsafe==1.1.1".

The first set of dependencies can be used to easily update the second set of dependencies when you want to upgrade (e.g. to get security updates).
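
In practice that update is just a re-run of the pinning step. A sketch of the workflow, assuming the same image as above and that both files are tracked in git:

$ $EDITOR requirements.in    # only if a top-level constraint needs to change
$ pin-requirements.py --image python:3.9-slim
$ git diff requirements.txt  # review the updated pins before committing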

The second set of dependencies is what you should use to build the application, in order to get reproducible builds: that is, to ensure each build will have the exact same dependencies installed as the previous build.
