The official Python Docker container (the onbuild variant) does exactly what it should: it automatically copies requirements.txt and your current directory into /usr/src/app, pip-installs the dependencies listed in requirements.txt, and only then runs your own Dockerfile commands.
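That automatic behaviour comes from ONBUILD triggers baked into the official image. Roughly, paraphrasing the python onbuild Dockerfile (exact paths and pip flags may differ between image versions):

```dockerfile
# These ONBUILD instructions fire at the start of any build
# that uses this image in its FROM line:
WORKDIR /usr/src/app
ONBUILD COPY requirements.txt /usr/src/app/
ONBUILD RUN pip install --no-cache-dir -r requirements.txt
ONBUILD COPY . /usr/src/app
```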
But some Python libraries require additional system-level setup of their own (one might consider it a pip shortcoming). For example, ssdeep, a fuzzy-hashing library I'm working with for smart comparison of hash identities: running "pip install ssdeep" by itself fails, so running the official container with a requirements.txt containing "ssdeep" fails with:
Command “python setup.py egg_info” failed with error code 1 in /tmp/pip-build-Afec40/ssdeep/
Let's solve it the "object-oriented" way, using a base container and a sub container.
In a Dockerfile-base file we'll build an ssdeep-included base Python container on top of an official Python container (the image tag below is assumed; use whichever Python version you need):

FROM python:2.7
RUN apt-get update && apt-get install -y build-essential libffi-dev python python-dev python-pip automake autoconf libtool
RUN BUILD_LIB=1 pip install ssdeep
docker build -f Dockerfile-base -t my-python-base .
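Before building on top of it, you can sanity-check that the base image really contains a working ssdeep (a hedged one-liner; ssdeep.hash is the library's fuzzy-hash function, and the image name matches the build command above):

```shell
docker run --rm my-python-base python -c "import ssdeep; print(ssdeep.hash('hello world'))"
```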
In the regular Dockerfile we'll build the sub Python container from the previous base container. We'll need to copy our current directory into /usr/src/app and pip install the requirements.txt file ourselves, since this is no longer the official onbuild Python container (note the WORKDIR, which the relative "./my-python-app.py" path relies on):

FROM my-python-base
RUN mkdir -p /usr/src/app
COPY . /usr/src/app
WORKDIR /usr/src/app
RUN pip install -r requirements.txt
CMD [ "python", "./my-python-app.py" ]
Build and run it:

docker build -f Dockerfile -t my-python-sub . && docker run -it --name my-python-sub my-python-sub
This should install our other Python dependencies (if any), bundle our Python app, and run it with ssdeep support 🙂
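For completeness, a minimal sketch of what my-python-app.py might do once ssdeep imports cleanly (the file name matches the CMD above; the inputs are made up, and since it imports ssdeep it only runs inside the container):

```python
import ssdeep

# Compute context-triggered piecewise (fuzzy) hashes of two similar inputs;
# the repetition just gives ssdeep enough data to chunk meaningfully
h1 = ssdeep.hash("The quick brown fox jumps over the lazy dog. " * 16)
h2 = ssdeep.hash("The quick brown fox jumps over the lazy cat. " * 16)

# compare() scores the similarity of two fuzzy hashes on a 0-100 scale
print(ssdeep.compare(h1, h2))
```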