Backups are as important as writing code. If you already know how to create a single repository dump (see our guide), you can scale that process to hundreds of repositories with very little effort.
Script overview
The script below loops through every folder inside your SVN root, runs svnadmin
dump on each repository, and writes the resulting archive to a dedicated target
directory. Place it on the server where you have shell access, make it
executable, and adjust a few variables at the top.
System requirements
- Linux server with a Bash/POSIX shell
- Subversion installed (find svnadmin with: whereis svnadmin)
- A writable directory for the dumps, with enough free storage
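If you prefer to let the shell resolve the path for you, a small sketch like the following works; the /usr/bin/svnadmin fallback is an assumption for when svnadmin is not on the PATH, so confirm it with whereis svnadmin on your system:

```shell
# Resolve svnadmin from PATH; fall back to the common location if not found.
# (/usr/bin/svnadmin is an assumed default -- verify with: whereis svnadmin)
SVNADMIN=$(command -v svnadmin || echo /usr/bin/svnadmin)
echo "Using: $SVNADMIN"
```

Whatever path this prints is what goes into the SVNADMIN variable in the script below.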
Start by creating the script and making it executable:
root@server:~# touch multidump.sh
root@server:~# chmod +x multidump.sh
Copy the following content into multidump.sh and edit the three
variables at the top to match your paths:
#!/bin/bash
REPO_BASE=/location/where/your/svn/root/folder/is/
TARGET=/location/of/storage/
SVNADMIN=/usr/bin/svnadmin

# Abort if the repository root is missing, instead of dumping the wrong directory
cd "$REPO_BASE" || exit 1
for f in *; do
  FILE="$TARGET$f.dump"
  echo "Dump: $f => $FILE"
  # Only directories are repositories; plain files in the root are skipped
  test -d "$f" && "$SVNADMIN" dump "$f" > "$FILE"
done
Replace REPO_BASE with the directory that contains all repositories, TARGET
with your dump destination, and SVNADMIN with the absolute path you discovered
earlier. Keep the trailing slashes on both directory paths, because the script
appends the file name directly to TARGET.
Compress on the fly
If you want to save disk space, pipe the dump output into gzip. This version keeps the same structure; only the file extension changes and the compression command is added.
#!/bin/bash
REPO_BASE=/location/where/your/svn/root/folder/is/
TARGET=/location/of/storage/
SVNADMIN=/usr/bin/svnadmin

cd "$REPO_BASE" || exit 1
for f in *; do
  FILE="$TARGET$f.dump.gz"
  echo "Dump: $f => $FILE"
  # -9 trades CPU time for the best compression ratio
  test -d "$f" && "$SVNADMIN" dump "$f" | gzip -9 > "$FILE"
done
Run the script manually or schedule it via cron to keep nightly archives. You can also extend the loop to rotate old backups or copy them to another host.
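As a sketch of the scheduling and rotation ideas, the snippet below shows a sample crontab line and a simple age-based cleanup. The 02:30 schedule, the /root/multidump.sh path, and the 14-day retention window are assumptions; adjust them to your setup.

```shell
# Example crontab entry -- nightly run at 02:30 (script path is an assumption):
#   30 2 * * * /root/multidump.sh >> /var/log/multidump.log 2>&1

# Simple rotation: delete compressed dumps older than 14 days.
# TARGET must match the variable used in multidump.sh.
TARGET=/location/of/storage/
if [ -d "$TARGET" ]; then
  find "$TARGET" -name '*.dump.gz' -mtime +14 -delete
fi
```

Run the rotation either at the end of multidump.sh or as its own cron job, so old archives are pruned right after new ones are written.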