Weird GitHub error

So I'm making an FNF modpack with friends, but I've noticed there's an issue that's been happening. I'm not exactly the most tech-savvy, so I don't know what it means. Any help here?

It seems like you’re handling lots of files, and big ones too. The error states that cURL can’t transfer files that big.

Try enabling Git LFS (Large File Storage) support in your repository settings, and see if that helps.
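If you end up doing it from the command line, the basic setup looks roughly like this (the `*.ogg` and `*.png` patterns are just examples; track whatever asset types your modpack actually uses):

```shell
# One-time setup: installs the Git LFS hooks for your user account
git lfs install

# Inside the repository: tell LFS which file patterns to manage
# (the patterns below are examples -- use your mod's actual asset types)
git lfs track "*.ogg"
git lfs track "*.png"

# The tracking rules are written to .gitattributes, which must be committed
git add .gitattributes
git commit -m "Track large assets with Git LFS"
```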

How would I do this through the GitHub Desktop app? I went to repository settings and I don't see such an option. I recently downloaded Git Large File Storage as well.

Under Settings » Archives

I don’t use GH Desktop, but I believe it already ships with native Git LFS support, and that once you’ve handled all the LFS settings via the command line and in the repository settings, that should be enough for GH Desktop to pick them up. But I’m adding a GH Docs link to the topic, just in case.

That’s a good idea: that way you have access to it in the command-line Git installation too, not just in GH Desktop’s own Git (last time I tried GH Desktop, it came with its own Git setup for internal use).

I'm afraid that, since this is my first time having this issue, I have no idea how this works. I've been trying to do this for a long time and yet I'm still struggling, even when looking up help. Is there any way I could get help through Discord, where people could walk me through the errors I'm making?

Real-time exchanges would probably be a better solution for live support, but I have no idea where to find such a channel. You could always ask a fellow developer and collaborator to join a Discord chat to help you out with this; you'd probably be able to solve this whole issue fairly quickly.

As for this community, you might have to be patient, since replies can take some time.

I don’t use Git LFS, but if you can provide detailed info on the encountered errors I’ll do my best to help you. I’ll be at the PC for another three hours, and I should get notified in real time of any updates to this post, so you could take advantage of this.

I suggest you first try to ensure that LFS is working properly with Git command line, and once you’ve ascertained that it works (and the repository settings are OK), it should be easier to focus on GH Desktop.
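A quick sanity check from the command line would be something like this (these are standard git-lfs subcommands; run the last one inside the repository):

```shell
git lfs version    # confirms the LFS extension is installed and on PATH
git lfs env        # shows the endpoints and hook status Git will use
git lfs ls-files   # lists the files currently managed by LFS
```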

I don’t have GH Desktop, so I can’t help you much there, but hopefully someone else might join the discussion in the meantime.

My main issue is that even with LFS I can't get all of my files on at once, because it's over 2 GB, so I tried to do it one at a time, but it doesn't seem to want to work.

From the error report it seems that 2 GB (more or less) is the hard-coded limit on file sizes; this might be due to filesystem limitations on the various OSs supported by Git (e.g. Windows NT didn’t support transferring files that big, and other OSs might have that limit too).

Let me check online about this size limitation, and I’ll get back to you…

It's set by GitHub. The issue I'm having now is getting GitHub to let me separate the folders and post each one at a time, in chunks of 2 GB and under.

I read in GH Docs » About Git Large File Storage that the 2 GB size limit is tied to the type of account (i.e. Free accounts). Team accounts can store files up to 4 GB, and Enterprise up to 5 GB.

I don't have the money to pay for anything of the sort, so I have to find a way to get around this somehow.

That won’t solve the issue; the problem is the actual file size. All you can do is split large files with some splitter tool (e.g. 7-Zip can do that), but then the repository will contain the split files.

What sort of files are these? Are they compressible in some way, or would that jeopardize their use in the project?

I was planning on doing one subfolder at a time, as that might get it to work.

It’s not the size of the commit, but that of the file on disk that is at stake here.

If there are single files that exceed the size limit, all I can think of right now is that you create a script to split big files into multiple parts (e.g. using Zip without compression) and another script to recreate the original files from the parts. You then have Git ignore the original files, publish their multi-part versions, and use the scripts to split and rejoin them locally (i.e. on the end users' side).
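As a minimal sketch of the splitting idea, the standard `split` tool can do it without any compression step (the file name and the 1900 MB part size here are hypothetical; any size under the limit works):

```shell
# Split a big file into parts that stay under GitHub's size limit
# (big_asset.bin is a placeholder name)
split -b 1900m big_asset.bin big_asset.bin.part_

# On the end user's side, rejoin the parts into the original file
cat big_asset.bin.part_* > big_asset.bin
```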

You can add those scripts to the repository's Git hooks, so they are executed at commit time, pull time, or whatever is best, which would automate the whole process on both your side and the end users' side (beware that this would make the affected Git operations slower, e.g. committing and pulling).
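As an illustrative sketch of the hook idea (the `.part_aa` suffix follows `split`'s default naming; adapt it to however your splitter names the parts), a `post-checkout` hook could rejoin the parts automatically:

```shell
#!/bin/sh
# .git/hooks/post-checkout (must be executable): rejoin split assets.
# Finds the first part produced by `split` and concatenates each set.
for first in *.part_aa; do
  [ -e "$first" ] || continue        # no split files present
  base=${first%.part_aa}
  cat "$base".part_* > "$base"       # recreate the original file
done
```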

I don’t see any other way to circumvent the limitation of file sizes.

If these are ISO images, splitting them into multiple parts with a Zip tool shouldn’t be a problem. Also, depending on the file types, you should check whether their format natively supports a multi-part version.

It's not a single thing; there is no single file that's that big. It's just that I'm trying to upload multiple files which add up to around 4 GB.

I noticed in the error message that the repository URL ends in .git. My guess is that this doesn’t work over HTTPS, and that if you tried to push a commit with a small text change it would also fail.

Try fixing the remote (you’ll have to delete it and recreate it), adding the URL without the trailing .git.
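From the command line, recreating the remote would look something like this (OWNER/REPO stands in for your actual repository path):

```shell
# Check what the remote currently points at
git remote -v

# Remove it and add it back without the trailing .git
# (OWNER/REPO is a placeholder for your own repository path)
git remote remove origin
git remote add origin https://github.com/OWNER/REPO
```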

Then let's see what new error shows up (I think we're dealing with multiple problems at once right now).

Another problem I see in the screenshot is that you’ve used the wrong slashes in the path (\ instead of /), which Git is reporting as being outside the repository.

You need to use Unix-style forward slashes with Git.
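For example (the path here is hypothetical):

```shell
# Backslashes are treated as part of the file name, not as separators,
# so Git can't match the path:
#   git add "assets\songs\music.ogg"   # fails: pathspec does not match

# Forward slashes work everywhere, including on Windows:
git add assets/songs/music.ogg
```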

Once you’ve fixed all these issues, we’ll be able to determine whether the size problem also affects single commits (i.e. the collective payload), in which case it should be simple to work around, as you suggested, by uploading one folder at a time (if that meets the requirements) or n chunks of files per commit.
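Committing in chunks would then look roughly like this (the folder names are hypothetical, and I'm assuming your default branch is called main):

```shell
# Stage, commit, and push one subfolder at a time to keep each payload small
git add assets/songs
git commit -m "Add songs"
git push origin main

git add assets/sprites
git commit -m "Add sprites"
git push origin main
```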

The whole issue of using LFS with GitHub is rather complex, because besides the limits imposed by Git you’ll have to deal with account limitations.

Refer to:

In the latter linked document you can read:

Every account using Git Large File Storage receives 1 GB of free storage and 1 GB a month of free bandwidth. If the bandwidth and storage quotas are not enough, you can choose to purchase an additional quota for Git LFS.

So, even if you overcome the technical problems you’re facing right now, you’ll have trouble uploading 4 GB into a single repository, both due to the monthly bandwidth limit (it would take you 4 commits spread across 4 months) and due to the 1 GB storage limit.

From the StackOverflow discussion you can clearly see that these policies have undergone frequent changes, and that approaching (or exceeding) these limits might result in a warning email sent by GitHub.

Anyhow, let’s first try to fix the problem until you manage to push at least one big file to the repository, to confirm that all the settings are correct in both the repository and GH Desktop.