Connect Postgres service with custom container image

In my Django project, I have a CI workflow for running tests, which requires a Postgres service. Recently a new app introduced heavier packages such as pandas, matplotlib and PyTorch, and this increased the run-tests job time from 2 to 12 minutes, which is absurd. My project also has a base Docker image with Python and these heavier packages preinstalled, to speed up image builds. So I was thinking of using that same image as the job container in the workflow, since the packages would already be installed.

Unfortunately, everything goes well until it reaches the step that actually runs the tests: it seems that the postgres service is not reachable from the container, and I get the following error:

psycopg2.OperationalError: could not connect to server: Connection refused
	Is the server running on host "localhost" (127.0.0.1) and accepting
	TCP/IP connections on port 5432?

This is my workflow right now. Any ideas on what I am doing wrong?

name: server-ci

on:
  pull_request:
    types: [opened]

env:
  DJANGO_SETTINGS_MODULE: settings_test

jobs:

  run-tests:
    name: Run tests

    runs-on: ubuntu-latest

    container:
      image: myimage/django-server:base
      credentials:
        username: ${{ secrets.DOCKERHUB_USERNAME }}
        password: ${{ secrets.DOCKERHUB_PASSWORD }}
      ports:
        - 8000:8000

    services:
      postgres:
        image: postgres
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: admin
          POSTGRES_DB: mydb
        ports:
          - 5432:5432
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5

    env:
      POSTGRES_HOST: localhost
      POSTGRES_PORT: 5432
      POSTGRES_PASSWORD: admin
      POSTGRES_USER: postgres

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Cache dependencies
        id: cache
        uses: actions/cache@v2
        with:
          path: /opt/venv
          key: /opt/venv-${{ hashFiles('**/requirements.txt') }}

      - name: Install dependencies
        if: steps.cache.outputs.cache-hit != 'true'
        run: |
          python -m pip install --upgrade pip
          python -m pip install -r requirements.txt

      - name: Run tests
        run: |
          ./manage.py test --parallel --verbosity=2

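For context, settings_test reads the connection details from those job-level environment variables, roughly along these lines (a minimal sketch; the exact keys and defaults in my settings file may differ):

```python
import os

# Sketch of the DATABASES block in settings_test: every connection
# detail comes from the environment variables set in the workflow,
# falling back to the same values used in the job-level env above.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB", "mydb"),
        "USER": os.environ.get("POSTGRES_USER", "postgres"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD", "admin"),
        "HOST": os.environ.get("POSTGRES_HOST", "localhost"),
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    }
}
```

So whatever host the job-level POSTGRES_HOST points at is where psycopg2 tries to connect.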
Your workflow is now running in a container of its own, next to the postgres container. So the port mapping to the runner VM doesn’t do anything any more, because port mappings affect the host, not other Docker containers on it.

The job and service containers get attached to the same Docker network, so you should only need to change POSTGRES_HOST to postgres (the name of the service container) and Docker’s DNS should do the rest.
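Concretely, that means the only change needed in the job-level env is the host (a sketch, assuming the service keeps the name postgres):

    env:
      POSTGRES_HOST: postgres
      POSTGRES_PORT: 5432
      POSTGRES_PASSWORD: admin
      POSTGRES_USER: postgres

The service’s ports mapping can stay or go; container-to-container traffic on the shared network reaches port 5432 directly, without the mapping.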

@airtower-luna Thank you so much for your help. I spent too much time looking at the wrong places!
