503 Response object too large when trying to pull

Hi everyone,

I used a GitHub Action to build & push an image to the GitHub Container Registry.
But when I try to pull the image (either locally or from a Kubernetes cluster), one layer fails with the error `received unexpected HTTP status: 503 Response object too large`.
The status of that layer stays at “downloading”. Docker retries the download several times, but gets the same error every time.

All the other layers are properly pulled.
The layer that cannot be pulled is pretty big (several GB).
Is there a per-layer size limit? How can I work around this and manage to pull?
Is this the proper place to report this, or is there another place to report bugs/weird behaviour? :slight_smile:

Here is the workflow I used to build & push the image:

I’m available to provide more details.


Hi Hartorn,

Due to the way our caching works, there is currently a limit of 4GB per layer. We’re hoping to fix this, but I don’t have an ETA for exactly when.

The workaround would be to break your image into layers of 4GB or less. Might this be feasible?
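For example, a minimal sketch of the chunking idea (file names and sizes here are placeholders, and a tiny stand-in file is used; in the real case the chunk size would be something like 3G to stay under the limit):

```shell
# Split a large artifact into chunks so each one can live in its own
# (sub-4GB) image layer; a small stand-in file is used here.
head -c 1048576 /dev/urandom > artifact.bin   # stand-in for the big binary
split -b 262144 artifact.bin chunk.           # real case: split -b 3G

# In the Dockerfile you would then COPY each chunk as a separate layer
# and reassemble at container start (e.g. in an ENTRYPOINT script),
# since reassembling in a RUN step would recreate one large layer:
cat chunk.* > reassembled.bin
cmp artifact.bin reassembled.bin && echo "chunks reassemble cleanly"
```

The `chunk.*` glob expands in lexicographic order, which matches the order `split` produced the pieces in, so the simple `cat` reassembly is safe.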

Hi @jcansdale,

No, not really. The layer is about 16GB (yeah, quite big ^^’)
I used a multi-stage build to download and build the binaries, so the secrets and auth are handled in the first stage.

Nvm, not a big issue for me; I’m using a GCP container registry for the time being as a workaround.

I feel like it would still be better to make the push command fail for now, to avoid people being able to push but not pull. It feels quite weird, and it took me some time to realise: “oh, it must be GitHub”.

I’m in no hurry for a fix, but I’m available to test if needed.


@Hartorn this should be fixed now. Please give it another try and let me know if it’s still a problem for you.



@markphelps Pulling it right now.
Seems a bit slow, but it’s working nonetheless.

Thanks!
