Cloning Large Git Repo

Stuti Jain
Feb 25, 2021 · 2 min read


Git is a fantastic choice for tracking the evolution of your code base and collaborating efficiently with your peers.

Even though the threshold for qualifying a repository as “massive” is pretty high, such repositories are still a pain to clone.

The first way to speed up a clone, and to save both the developer’s time and the system’s disk space, is to copy only the most recent revision (a shallow clone):

git clone --depth=1 [remote-url]
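
As a quick sanity check (on Git 2.15 or later), you can confirm that the clone really is shallow and currently holds only a single commit:

git rev-parse --is-shallow-repository
git log --oneline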

Later, to fetch the full history, or to deepen the clone to just the last n commits:

git fetch --unshallow
git fetch --depth=n
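
For example, to pull in just the last 50 commits without downloading the entire history (50 is an arbitrary depth chosen here for illustration):

git fetch --depth=50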

Another great solution is git filter-branch. This command walks through the entire history of the project, filtering out, modifying, or skipping files according to predefined patterns. Permanently removing large, unwanted assets from history shrinks the repository for everyone who clones it afterwards.

git filter-branch --tree-filter 'rm -rf [/path/to/spurious/asset/folder]' HEAD
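
Note that after the rewrite finishes, the old objects remain reachable through the backup refs that filter-branch leaves under refs/original/, so the size reduction only shows up once you drop them and repack. A typical cleanup sequence, to be run only on a repository you have backed up, looks roughly like this:

git for-each-ref --format='%(refname)' refs/original/ | xargs -n 1 git update-ref -d
git reflog expire --expire=now --all
git gc --prune=now --aggressive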

You can also limit the amount of history you clone by cloning a single branch, like so:

git clone [remote-url] --branch [branch_name] --single-branch [folder]
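
If you later need the branches you skipped, one option (assuming the remote is named origin) is to widen the fetch configuration and fetch again:

git remote set-branches origin '*'
git fetch origin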

Other helpful Git commands:

git config: Use the git config command to set global configuration variables for your system, such as your name and email address.

git config --global user.name "John Doe"
git config --global user.email johndoe@example.com
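
To confirm that the values were saved, you can list the global configuration (the output will vary with your setup):

git config --global --list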

git status: This command shows the state of the working directory and the staging area: which files are staged for commit, which are modified, and which are untracked.
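
For a compact, one-line-per-file summary, you can use the short format:

git status --short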

git clone: This command is used to clone a repository from a remote server to your local machine.

git log: This command shows the commit history of the repository.
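
For a condensed view of the history, for example:

git log --oneline --graph --decorate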
