Docker container running tests keeps exiting on GitHub Actions with exit code 137

I am running Jest tests inside a Docker container on GitHub Actions. This worked well until the last commit; since then, the job always exits with code 137:

> jest --coverage

Error: Process completed with exit code 137.

Not sure if this is Docker exiting with 137 or Jest inside the container. On Linux machines, the memory limit of a Docker container defaults to the memory of the host; there is no fine-tuning like in Docker for Mac or Docker for Windows. And nothing was added in the last commit that should eat up all the memory.

Searching this forum, I didn't find any post related to exit code 137.

I also can't find any docs around the specs of GitHub Actions runners: how much memory do they have? Not sure how to debug this. Any suggestions?


Usually exit code 137 means the process was killed with SIGKILL (137 = 128 + 9), which in CI is most often the kernel's OOM killer. I suggest you monitor how much memory your process uses.
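Since the runner here is Jest, a quick way to get a per-test-file memory readout is Jest's own CLI flags; this is just a sketch, and the worker count is an arbitrary choice:

run: |
  # --logHeapUsage prints heap usage after each test file;
  # --maxWorkers caps parallel workers (each worker is its own Node process)
  jest --coverage --logHeapUsage --maxWorkers=2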


I’m having this same issue when running tests with pytest. The tests run fine on my PC but not in GitHub Actions.

@gaussianrecurrence How would you go about monitoring memory usage? Is there a way to track usage on GitHub Actions?

I checked memory usage of my local Docker containers (using docker stats) and the container never went over 150 MB of memory.
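One thing I might try next is capping the container's memory locally to see if I can reproduce the kill before it ever reaches CI. Something like the following, where the 7g figure is only a guess at the hosted runner's size (check GitHub's runner docs for the real number) and the image and command names are placeholders:

docker run --rm --memory=7g --memory-swap=7g your-test-image your_test_command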

I am not aware of any official support for monitoring GitHub-hosted runners, but I am guessing you can track memory usage by calling free -m periodically. You'd need to run your command in the background so that free -m can poll in parallel. Something like:

run: |
  your_command &        # run the test command in the background
  test_pid=$!
  while kill -0 "$test_pid" > /dev/null 2>&1; do
    free -m             # print memory usage while the command is still running
    sleep <interval>
  done
  wait "$test_pid"      # propagate the command's exit status to the step
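If the step does get killed, another way to confirm it was the kernel OOM killer is to dump the kernel log in a follow-up step that only runs on failure. A rough sketch (the step name is arbitrary):

- name: Show OOM killer messages
  if: failure()
  run: sudo dmesg | grep -i -E "killed process|out of memory" || true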

I think it may have been Jest leaking memory: Your Jest Tests are Leaking Memory | chanind.github.io
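For anyone else hitting this, one way to watch the heap grow across test files (not necessarily exactly what the linked post suggests) is to run the tests serially with heap logging enabled; the path to the Jest binary may differ in your setup:

run: |
  # --expose-gc exposes Node's garbage collector (global.gc) to the test process;
  # --runInBand runs tests in a single process; --logHeapUsage prints heap per test file
  node --expose-gc ./node_modules/.bin/jest --runInBand --logHeapUsage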