What is eating our bandwidth?

Hey there,

We are using GitHub with a couple of integrations (via webhooks). On every merge, these integrations pull the repository and build our app.

The issue, however, is that our bandwidth is being consumed far faster than expected. Our repository's download size is 1.7 GB, yet 5 builds over 3 days consumed 60 GB. Some of that may have come from our members pulling and pushing, but certainly no more than 5 GB altogether. And yes, we are using git-lfs.
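To get a sense of where the data actually lives, a quick sanity check along these lines should break it down after a fresh clone (the .git/lfs/objects path assumes the default local LFS storage layout):

# Size of the plain Git object store (packs and loose objects)
git count-objects -vH

# Size of the LFS objects downloaded during the clone
du -sh .git/lfs/objects

# Per-file sizes of everything tracked by git-lfs
git lfs ls-files -s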

When I clone the repository locally, this is the output:

Cloning into 'our-project'...
remote: Enumerating objects: 886, done.
remote: Counting objects: 100% (886/886), done.
remote: Compressing objects: 100% (361/361), done.
remote: Total 47628 (delta 698), reused 633 (delta 525), pack-reused 46742
Receiving objects: 100% (47628/47628), 1.70 GiB | 13.62 MiB/s, done.
Resolving deltas: 100% (33822/33822), done.
Updating files: 100% (21548/21548), done.
Filtering content:  76% (2926/3850), 4.94 GiB | 15.00 KiB/s
Filtering content: 100% (3850/3850), 5.30 GiB | 4.94 MiB/s, done.
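One idea I had for isolating the LFS share of the transfer is to clone with LFS smudging disabled and then fetch the LFS objects in a separate step (GIT_LFS_SKIP_SMUDGE is the documented switch for this; the repository URL below is a placeholder):

# Clone without downloading LFS objects; only the small pointer files are checked out
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/our-org/our-project.git
cd our-project

# Now fetch the LFS objects explicitly, so their transfer size shows up on its own
git lfs pull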

Judging by the "Filtering content" lines above, each full clone transfers roughly 1.7 GiB of Git data plus another 5.3 GiB of LFS objects, so about 7 GiB in total. If both integrations do a fresh clone on every merge, 5 merges would already be in the region of 70 GB, which is in line with what we are seeing. Is there a way to see in detail how the bandwidth was consumed?