Pip install with Docker remote interpreter is ephemeral

Hello, we are evaluating PyCharm Professional as an IDE for Python development. Since our production environment is Docker-based, we would like to implement a Docker-based development workflow for production parity. I would prefer to avoid Vagrant.

My idea is to implement the following workflow:

  1. clone the repository containing the requirements.txt file
  2. manually add a remote Docker interpreter, specifying a Python base image (e.g. python:3.5.4)
  3. let PyCharm provision the required packages by running pip install -r requirements.txt in the Docker container
  4. develop, debug and test

Steps 1 and 2 work fine. In step 3, PyCharm senses that the remote environment is missing the dependencies, and when I ask it to download them, it spins up a container from the python:3.5.4 image and runs the pip install command. After that the container is stopped and removed, so I am not able to proceed to step 4.

What am I missing here?

I'm seeing the same thing. The project's Dockerfile doesn't install any of the dev and test dependencies, for obvious reasons. When I instruct PyCharm to use the test requirements file, it correctly prompts to install the missing dependencies. Each one is installed in an individual Docker run, but PyCharm then continues to use the container it built from the original Dockerfile.


PyCharm won't install packages that way. I suggest writing a Dockerfile to build your image and installing the packages there, with something like:

    COPY requirements.txt /app/requirements.txt
    RUN pip install -r /app/requirements.txt

PyCharm will build a new image, executing the instructions from the Dockerfile, and then stop the container. Each time you run your script, it will fire up a new container from that image, but Docker's build cache avoids reinstalling the packages each time.
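Putting the suggestion together, a minimal Dockerfile might look like the sketch below. The python:3.5.4 base image and the /app path come from the thread; the WORKDIR choice and the final COPY are assumptions about a typical project layout:

```dockerfile
# Base image matching the interpreter version mentioned in the thread
FROM python:3.5.4

# Copy only requirements.txt first, so the pip install layer is
# cached and reused as long as requirements.txt does not change
WORKDIR /app
COPY requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt

# Copy the rest of the project after dependencies are cached
COPY . /app
```

Build the image once (for example with `docker build -t myproject .`, where the tag is arbitrary) and point PyCharm's remote Docker interpreter at that image instead of the bare python:3.5.4 base image; the installed packages then persist across the containers PyCharm starts.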

