BackupPC

from Wikipedia, the free encyclopedia
Basic data

Developer: Craig Barratt
Initial release: 2001
Current version: 4.3.1 (July 14, 2019)
Operating system: Linux, Unix, Mac OS X
Programming language: Perl
Category: data backup
License: GPL (free software)
Website: backuppc.github.io/backuppc

BackupPC is a free disk-to-disk backup suite with a web-based front end. No special client software is necessary, as the server itself supports several transfer protocols.

In 2007 BackupPC was named, along with Amanda and Bacula, as one of the three best known open source backup utilities.

Data deduplication reduces the required storage space on the target server.

BackupPC is also able to back up network shares under Microsoft Windows via the SMB protocol.

Transmission types

Different types of transmission (to the server) are supported.

rsync

The data to be backed up is transferred using the rsync protocol.

The particular advantage of this transfer type is that less data needs to be sent: an important feature of rsync is that it can copy not only entire files but also parts of files. If a file has changed on the source data carrier, only the changed parts of the file are transferred to the target system (delta coding). This can save considerable transfer volume, especially with uncompressed file types, and significantly accelerate synchronization (see also: rsync protocol).
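The delta idea can be illustrated with a short Python sketch that compares two byte strings block by block and collects only the blocks that changed. This is a deliberately simplified model: real rsync uses a rolling checksum so it also recognizes data that has shifted position, which this fixed-offset comparison cannot.

```python
import hashlib

BLOCK_SIZE = 4  # tiny block size so the example is easy to follow

def blocks(data: bytes):
    """Split data into fixed-size blocks."""
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(old: bytes, new: bytes):
    """Return (index, block) pairs that differ between old and new.

    Simplified sketch: unlike real rsync, matching is done at fixed
    offsets only, without a rolling checksum.
    """
    old_hashes = [hashlib.md5(b).digest() for b in blocks(old)]
    delta = []
    for i, block in enumerate(blocks(new)):
        h = hashlib.md5(block).digest()
        if i >= len(old_hashes) or old_hashes[i] != h:
            delta.append((i, block))
    return delta

old = b"AAAABBBBCCCCDDDD"
new = b"AAAABBBBXXXXDDDD"  # only the third block changed
delta = changed_blocks(old, new)
print(delta)  # only block 2 needs to be transferred
```

Only the changed block (index 2) would cross the network; the unchanged blocks are reconstructed from the copy already on the target.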

Optionally, the transfer can also be encrypted by tunneling rsync through the SSH protocol.

smb

This type of transfer can be used to back up “native” Windows shares.

Storage usage

Deduplication

Thanks to deduplication, identical files from multiple clients are only ever stored once in the so-called "pool". Further identical copies are referenced internally (currently using hard links).
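The pooling mechanism can be sketched in a few lines of Python: files are keyed by the MD5 checksum of their contents, and a file whose checksum already exists in the pool is replaced by a hard link to the pooled copy. The flat pool layout and the function name are illustrative assumptions and do not reflect BackupPC's actual on-disk format.

```python
import hashlib
import os
import tempfile

def store_in_pool(path: str, pool_dir: str) -> str:
    """Store a file in the pool, hard-linking it if an identical copy exists."""
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    pooled = os.path.join(pool_dir, digest)
    if os.path.exists(pooled):
        os.remove(path)
        os.link(pooled, path)   # reference the existing pooled copy
    else:
        os.link(path, pooled)   # first copy becomes the pool entry
    return pooled

# Two clients back up an identical file; only one copy occupies disk space.
tmp = tempfile.mkdtemp()
pool = os.path.join(tmp, "pool")
os.mkdir(pool)
for client in ("client_a", "client_b"):
    p = os.path.join(tmp, client)
    with open(p, "wb") as f:
        f.write(b"same contents on both machines")
    store_in_pool(p, pool)

# pool entry + two client references = 3 hard links to one inode
print(os.stat(os.path.join(tmp, "client_a")).st_nlink)
```

Both client files and the pool entry point at the same inode, so the file's contents are stored exactly once.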

An example of disk usage: 95 laptops, with full backups averaging 3.6 GB and incremental backups averaging 0.3 GB each. Storing three weekly full backups and six incremental backups per laptop amounts to around 1,200 GB of raw data, but because of pooling and compression only about 150 GB is needed.

Files to be backed up are transferred to the server. After the transfer, the files are compared with existing ones using MD5 checksums; if a file already exists in the pool, it is referenced via a hard link. In the current stable version, however, copies of duplicate files are still transferred in full, because deduplication takes place only on the server.

The developer has announced that future versions will offer deduplication before transfer: the clients compute checksums first, and only files that cannot be referenced (new and changed files) are then transferred.

Optional compression

Optionally, the transferred files can additionally be compressed on the server with gzip or bzip2 to save further space.
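The effect of compression on redundant data can be demonstrated with Python's standard gzip module (bzip2 works analogously via the bz2 module); the file names below are arbitrary:

```python
import gzip
import os
import tempfile

# Highly redundant data, as in logs or uncompressed documents, compresses well.
data = b"backup this line again and again\n" * 1000

tmp = tempfile.mkdtemp()
raw_path = os.path.join(tmp, "file.raw")
gz_path = os.path.join(tmp, "file.gz")

with open(raw_path, "wb") as f:
    f.write(data)
with gzip.open(gz_path, "wb") as f:
    f.write(data)

# The compressed copy is a small fraction of the raw size.
print(os.path.getsize(raw_path), os.path.getsize(gz_path))
```

The savings depend heavily on the file type: already-compressed formats (JPEG, ZIP, video) gain almost nothing, which is why compression is optional.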

Performance

Significant performance gains can be achieved by using rsync with uncompressed data: contrary to the assumption that large amounts of data must be transferred again and again, rsync actually makes backups faster (see the example above).

References

  1. ^ W. Curtis Preston: Backup and Recovery. O'Reilly Media, 2007, ISBN 978-0-596-10246-3.
  2. http://backuppc.sourceforge.net/info.html

External links

Commons: BackupPC - collection of images, videos and audio files