Constant "fatal: the remote end hung up unexpectedly" when pushing to new repositories

I have two new repositories that I can’t push to because I am constantly getting this error. GitHub support has not been able to help, and the issue hasn’t resolved itself despite it being a few days. I am at a complete loss as to what to do next.

I initially contacted GitHub support on January 25th and am still waiting for help.

Hi! This happened to me last week too. Today I increased Git’s postBuffer and changed some system settings, and my clone has been running fine since. I hope the following commands help:

export GIT_TRACE_PACKET=1
export GIT_TRACE=1
export GIT_CURL_VERBOSE=1

source ~/.bashrc


git config --global http.postBuffer 1048576000
git config --global https.postBuffer 1048576000

I was having the same issue.  I tried increasing the http.postBuffer, https.postBuffer, and ssh.postBuffer configs but nothing seemed to help.  I finally figured out that I could push *other* branches, just not the particular branch I was sitting on.  So I started pushing individual commits from the “bad” branch one at a time until I finally hit:

Pushing to git@github.com:directangular/unicorn.git
Counting objects: 100% (9/9), done.
Delta compression using up to 20 threads
Writing objects: 100% (5/5), 549 bytes | 549.00 KiB/s, done.
Total 5 (delta 4), reused 0 (delta 0)
remote: error: object 74c7584ff0b93591c19d3a3c19695889dd2274d2: badEmail: invalid author/committer line - bad email
remote: fatal: fsck error in packed object
error: remote unpack failed: index-pack abnormal exit
To github.com:directangular/unicorn.git
 ! [remote rejected] pizzafeast -> pizzafeast (failed)
error: failed to push some refs to 'git@github.com:directangular/unicorn.git'

So it looks like the “remote end hung up unexpectedly” is sort of “swallowing” the actual error message, which is probably some kind of malformed commit as I have here.
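For anyone who wants to try the same bisection, here is a rough sketch. The branch name `pizzafeast` comes from the log above; `origin/main` is an assumption for the newest ancestor the remote already has, so adjust both for your own repo:

```shell
# Push the "bad" branch one commit at a time until the remote
# rejects one; that commit is the culprit.
for sha in $(git rev-list --reverse origin/main..pizzafeast); do
  git push origin "$sha:refs/heads/pizzafeast" || {
    echo "remote rejected commit $sha"
    break
  }
done
```

Once the offending commit is found, something like `git rebase -i` onto its parent lets you amend the bad author/committer line before pushing again.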

I’ve encountered this issue for the second time now, and still GitHub is absolutely useless in giving us reasonable error messages.

I have the same issue when I push. I increased http.postBuffer and https.postBuffer, but it still doesn’t work.

Git LFS: (0 of 0 files, 2 skipped) 0 B / 0 B, 329.51 MB skipped
Counting objects: 36968, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (20717/20717), done.
error: RPC failed; curl 55 SSL_write() returned SYSCALL, errno = 10053
fatal: The remote end hung up unexpectedly
Writing objects: 100% (36968/36968), 4.12 GiB | 10.15 MiB/s, done.
Total 36968 (delta 14717), reused 36967 (delta 14717)
fatal: The remote end hung up unexpectedly
Everything up-to-date

I did. Nothing changed for me; I still get exactly the same output as above.

Has anyone figured this out yet? 

Same problem…

I have the same problem too! 

I’m having the same issue. What I found is that if I push over HTTPS instead of SSH, my commits go through. Then I can switch back to SSH and it works fine.

Horrible workaround, but at least I didn’t lose any info.

I have the same problem when pushing my project with GitHub Desktop. I am not an expert; how do I push over HTTPS? @NextStepGuru

I use LFS for some huge textures > 100 MB. Is this the issue?

Greetings from the Netherlands

I assume so. Check what File and repository size limitations says:

Individual files in a repository are strictly limited to a maximum size of 100 MB.

Look at Versioning large files for how to work around that limitation.

After deleting the huge files, I still get the error. So it looks like it has nothing to do with huge files.

Did you also remove the large files from history? If you just delete them and commit that, the huge files are still going to be part of the repository history, and thus still subject to the upload limits.

How do I remove files from history?
I am using GitHub Desktop. If I click on History, there doesn’t seem to be an option to delete a file from history.

Please check the link in my previous post. It’s GitHub documentation on the topic, and there’s no mention of GitHub Desktop, so I assume you’ll have to use the command line.

Alternatively, if you don’t mind losing the existing history, you could create a new repository and start fresh with your current sources (and maybe the large files in LFS).
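For reference, the command-line route can be sketched with git’s built-in filter-branch. The path `Assets/huge-texture.zip` is a made-up example, and tools like BFG or git-filter-repo are faster for big histories; this is just the minimal built-in version:

```shell
# Rewrite all branches, dropping the named file from every commit.
git filter-branch --force --index-filter \
  'git rm --cached --ignore-unmatch "Assets/huge-texture.zip"' \
  --prune-empty --tag-name-filter cat -- --all

# Delete the backup refs and repack so the big blob is really gone.
git for-each-ref --format='delete %(refname)' refs/original | git update-ref --stdin
git reflog expire --expire=now --all
git gc --prune=now --aggressive
```

After that the rewritten branches have to be force-pushed (e.g. `git push origin --all --force`), since the history no longer matches what the remote has.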

Thanks, I chose another solution: deleted everything and started from scratch.
The huge texture files went into a separate zip file on Google Drive.

I was having this problem too. The issue turned out to be large files (> 100 MB). GitHub was throwing an error, but it was getting swallowed somehow, as suggested by @mgalgs above. I found that out accidentally by following @mgalgs’s advice to push a different commit other than the most recent, in case there was a malformed commit making things choke. When I pushed a much older commit, the large-file error showed up.

To find large files in the repo history I used the answer here from user raphinesse: https://stackoverflow.com/questions/10622179/how-to-find-identify-large-commits-in-git-history
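That answer boils down to a pipeline along these lines (a sketch; it lists every blob ever committed, sorted by size with the largest last):

```shell
# List blobs from all history, sorted by size.
# Output columns: object id, size in bytes, path.
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  sed -n 's/^blob //p' |
  sort -n -k2 |
  tail -n 10
```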

To remove the large files I used BFG (I was already using it to remove other things from the repo history that I didn’t want): https://rtyley.github.io/bfg-repo-cleaner/ It’s super fast. BFG can automatically remove files over a certain size, but I felt more comfortable knowing first what was going to be removed, so I used the scripting method mentioned above.

After that, things worked. :smile:

@HMassink See my answer above. I used BFG Repo-Cleaner, and removing large files fixed this error for me. Also, you can use Git Large File Storage (LFS) to keep files > 100 MB in your repo.
