Description
Recently I observed a very unpleasant issue. I have a repository that was initialized with a single commit of about 16,000 files, including a few binary files, totaling about 1 GB checked out (git repo size ~500 MB). This repository works fine for browsing the file tree, cloning, pushing, etc. It does, however, "freeze" the server when that first commit is shown, e.g.
http://xxxx/commit/xxxx.git/71c07836f21bc42b0fc9999c1117be1d6e147307
The first attempt to load the commit page after a server restart puts the server on hold for about 2 minutes. A second attempt (page reload) NEVER finishes (I waited about half an hour), and after a few minutes the server stops responding on both the web and git access channels (web pages do not load, git pull does not respond). The only solution I have found is to kill the server and restart it.
What I can observe is that the server's CPU is running at 100%.
Tested with 1.8.0 on an Ubuntu machine (my server) and with 1.9.1 on a Windows 10 machine (my desktop).
On page reload the Windows machine logged a java.lang.OutOfMemoryError and returned to normal operation after about one minute, while the Ubuntu machine simply froze. The Windows machine has ten times more memory and is about ten times faster than the Ubuntu machine.
I suppose the culprit is the moment when the commit page computes the "diffs" to show in the commit log, which amount to 5,095,165 changes in total and include all the binary files. The "out of memory" error on Windows suggests that the computation flow is "load all models for all repos and THEN extract the necessary information" instead of "load the model for one repo, extract, repeat for the next repo". Inspecting the source code points to "getFilesInCommit" in JGitUtils.java.
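For context, here is a minimal JGit sketch of how the changed paths of a commit can be enumerated without reading file contents or detecting renames. This is not Gitblit's actual getFilesInCommit implementation, just an illustration of the kind of path-only scan I would expect to stay cheap even for a 16,000-file commit; the class and method names around the JGit API are my own.

```java
import java.io.IOException;
import java.util.List;

import org.eclipse.jgit.diff.DiffEntry;
import org.eclipse.jgit.diff.DiffFormatter;
import org.eclipse.jgit.lib.ObjectId;
import org.eclipse.jgit.lib.ObjectReader;
import org.eclipse.jgit.lib.Repository;
import org.eclipse.jgit.revwalk.RevCommit;
import org.eclipse.jgit.revwalk.RevWalk;
import org.eclipse.jgit.treewalk.AbstractTreeIterator;
import org.eclipse.jgit.treewalk.CanonicalTreeParser;
import org.eclipse.jgit.treewalk.EmptyTreeIterator;
import org.eclipse.jgit.util.io.DisabledOutputStream;

public class CommitPathsSketch {

    // Lists ChangeType + path for every file touched by the given commit.
    // File contents are never read and rename detection is off, so the
    // memory cost is proportional to the number of changed paths only.
    static void listChangedPaths(Repository repo, String commitId) throws IOException {
        ObjectId id = repo.resolve(commitId);
        try (RevWalk walk = new RevWalk(repo);
                ObjectReader reader = repo.newObjectReader();
                DiffFormatter df = new DiffFormatter(DisabledOutputStream.INSTANCE)) {
            RevCommit commit = walk.parseCommit(id);
            df.setRepository(repo);
            df.setDetectRenames(false); // rename detection on huge commits is expensive

            // The problematic commit is the very first one, so it has no parent:
            // diff against an empty tree in that case.
            AbstractTreeIterator oldTree;
            if (commit.getParentCount() > 0) {
                RevCommit parent = walk.parseCommit(commit.getParent(0).getId());
                CanonicalTreeParser p = new CanonicalTreeParser();
                p.reset(reader, parent.getTree());
                oldTree = p;
            } else {
                oldTree = new EmptyTreeIterator();
            }
            CanonicalTreeParser newTree = new CanonicalTreeParser();
            newTree.reset(reader, commit.getTree());

            List<DiffEntry> entries = df.scan(oldTree, newTree);
            for (DiffEntry entry : entries) {
                System.out.println(entry.getChangeType() + "\t" + entry.getNewPath());
            }
        }
    }
}
```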
I will try to work around it by assigning more memory to Java than would otherwise be a reasonable amount, and by rewriting the repository history so that it does not contain such huge commits.