Git server memory requirements

+1 vote
530 views

I'm trying to get an idea of how much memory is required for a git server that hosts Linux kernel repos.

What we're seeing is that git uses around 1 GB of RAM on the server when a user clones the Linux kernel source over SSH. Does this seem about right? Is this amount fixed, or is it a tunable trade-off of memory for speed? Subsequent concurrent clones seem to use less memory.

Is there any practical way to reduce the memory usage? We're running into occasional issues if there are multiple clones at once.
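For reference, these are the server-side knobs we've been looking at, run inside each hosted repo (or with --system); the values below are guesses on our part, not tested recommendations:

    # Limit memory used for delta compression when git packs objects for a clone
    git config pack.windowMemory 256m
    git config pack.threads 1
    git config pack.deltaCacheSize 128m
    # Limit how much packfile data git maps into memory at once
    git config core.packedGitLimit 256m
    git config core.packedGitWindowSize 32m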

Is setting gc.auto=0 a good idea for large kernel repos? The idea is that we could repack manually, or from a cron job on weekends, rather than during user operations. However, manually running git gc seems to use about as much memory as a user clone.
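Concretely, what we have in mind is something like this (untested sketch; the repository path and schedule are made up):

    # In each hosted repository: turn off automatic gc during user operations
    git config gc.auto 0

    # /etc/cron.d/git-repack (hypothetical) -- note the user field in this format.
    # Repack the kernel repo early Sunday morning instead.
    0 3 * * 0  git  cd /srv/git/linux.git && git gc --quiet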

It may be that our EC2 small instance (2.5 GB of RAM) is not up to the task, but we would like to understand our options before upgrading (we can easily trade some speed for lower memory usage).

posted Mar 21, 2014 by Sheetal Chauhan


1 Answer

+1 vote

We're working on our own large-repo migration, and this was a concern for us originally as well. So far it's best mitigated by the pack-bitmap patches in the next branch.

The initial clones become significantly faster and much less resource intensive.
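If you want to experiment with it, enabling bitmaps in a served repository looks roughly like this (assuming a build that includes the series; older builds spell the setting pack.writeBitmaps):

    # Write a reachability bitmap on future full repacks
    git config repack.writeBitmaps true
    # Or request it explicitly for a one-off full repack
    git repack -a -d --write-bitmap-index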

answer Mar 21, 2014 by Satish Mishra

Similar Questions

+1 vote

I've been using git for some time now, and host my remote bare repositories on my shared hosting account at Dreamhost.com. As a protective feature on their shared host setup, they enact a policy that kills processes that consume too much memory. This happens to git sometimes.

By "sometimes" I mean on large repos (>~500MB), when performing operations like git gc and git fsck and, most annoyingly, when doing a clone. It seems to happen in the pack phase, but I can't be sure exactly.

I've messed around with config options like pack.threads and pack.packSizeLimit, and basically anything on the git-config manpage that mentions memory. I limit all of these to 1, or 0, or 1m when applicable, just to be sure. To be honest, I really don't know what I'm doing ;)

Oddly enough, I'm having trouble reproducing my issue with anything but git fsck. Clones were failing in the past, but after a successful git gc, everything seems to be ok(?)

Anyway, I'd like some advice on what settings limit memory usage, and exactly how to determine what the memory usage will be with certain values.
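For what it's worth, the settings I've been toggling look roughly like this; the values are just arbitrary low numbers I picked, not ones I know to be correct:

    # Run inside the bare repository on the shared host
    git config pack.threads 1              # single-threaded packing: one delta window
    git config pack.windowMemory 10m       # cap memory for the delta-search window
    git config pack.deltaCacheSize 1m      # cap the cache of computed deltas
    git config pack.packSizeLimit 100m     # split output into smaller packs
    git config core.packedGitLimit 32m     # cap total mapped packfile data
    git config core.packedGitWindowSize 16m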

0 votes

I am unable to clone my repository; I get an out-of-memory message. The server is a Hyper-V virtual machine hosted on Windows Server 2008 R2 SP1, and the OS is Windows 7 64-bit.

+1 vote

Has anyone encountered the following error?

On the remote central server, I have 2 git repositories.
I am able to clone/pull from the 2 repositories over HTTPS, but when I issue a pull, it fails with return code 22: fatal: git-http-push failed.

+1 vote

How do I configure a read-only copy of a remote Git repository on a local server in bare mode and automatically synchronize its contents?

I need to set up a mirror of a repository hosted at another location, and the mirrored repository should automatically sync its contents at regular intervals.
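The shape I have in mind is roughly the following; the URL, local path, and schedule are placeholders:

    # One-time setup: create a bare mirror of the upstream repository
    git clone --mirror https://example.com/upstream/project.git /srv/git/project.git

    # Crontab entry: refresh the mirror every 15 minutes
    */15 * * * *  cd /srv/git/project.git && git remote update --prune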

...