Back up multiple Git repositories to S3 at once using a Python script with Docker support.
I run my own private GitLab instance (on-premise). I want a fully automatic periodic script that creates a backup of my private projects, but just the source code, so it's fast and cheap.
- Tested sources: GitLab, GitHub.
- Backs up a list of repos at once.
- Requires HTTPS access to the repos.
- Zips each repo before uploading it to S3, so storage stays cheap.
The script should work with any other Git host that supports cloning over HTTPS, as sketched below.
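For reference, the overall flow is: clone each repo over HTTPS, zip it, and upload the archive to S3. Below is a minimal sketch of that loop in Python, assuming the git CLI and boto3 are available; the function and variable names are illustrative, not the script's actual internals.

import os
import shutil
import subprocess
import tempfile

import boto3

def backup_repo(repo, user, token, bucket, region):
    # Build an authenticated HTTPS clone URL:
    # https://<user>:<token>@github.com/<user>/<repo>.git
    url = f"https://{user}:{token}@{repo}"
    name = repo.rstrip("/").split("/")[-1].removesuffix(".git")  # Python 3.9+
    with tempfile.TemporaryDirectory() as tmp:
        clone_dir = os.path.join(tmp, name)
        # --mirror fetches all branches and tags, source code only
        subprocess.run(["git", "clone", "--mirror", url, clone_dir], check=True)
        # Zip the bare repo so the upload (and the S3 bill) stays small
        archive = shutil.make_archive(os.path.join(tmp, name), "zip", clone_dir)
        # boto3 reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment
        s3 = boto3.client("s3", region_name=region)
        s3.upload_file(archive, bucket, f"{name}.zip")

Such a function would be called once per entry in repos.json.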
docker run --rm \
--env-file=.env \
-v ${PWD}/repos.json:/git-backup-s3/repos.json \
pirobtumen/git-backup-s3
The .env and repos.json files are required. Check the Configuration section below.
Fill in your environment variables inside the .env file:
GIT_USER=
GIT_TOKEN=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_S3_BUCKET=
AWS_REGION=
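The AWS credentials are picked up by boto3's default credential chain, and GIT_USER / GIT_TOKEN are presumably spliced into each clone URL. A hedged sketch of loading and validating these variables (the helper name is illustrative, not part of the script):

import os

REQUIRED = ["GIT_USER", "GIT_TOKEN", "AWS_S3_BUCKET", "AWS_REGION"]

def load_config():
    # Fail fast if any required variable is missing or empty
    missing = [key for key in REQUIRED if not os.environ.get(key)]
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
    return {key: os.environ[key] for key in REQUIRED}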
Create the repos.json file:
[
"github.com/<user>/<repo>.git",
"github.com/<user>/<repo>.git",
"gitlab.com/<user>/<repo>.git",
"gitlab.com/<user>/<repo>.git",
...
]
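Note that the trailing ... above is a placeholder; the real file must be valid JSON. Entries omit the https:// scheme, presumably so the script can prepend the credentials itself. A small sketch of loading the list (assuming the container path /git-backup-s3/repos.json from the docker run command above):

import json

def load_repos(path="/git-backup-s3/repos.json"):
    # Expects a flat JSON array of "host/<user>/<repo>.git" strings
    with open(path) as f:
        return json.load(f)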
Requires Docker to be installed.
$ make run