Rclone as a helper for external backups
First Test
An initial test will show you whether everything has worked so far. Select a small local text file and copy it to the cloud repository with:
rclone copy mytext.txt gdrv:mytext.txt
This may take a while, because Rclone works very slowly without options. Without any parameters, my text file only reached a transfer speed of 58bps, which I discovered by specifying the -P parameter (Figure 3). The "Accelerating Rclone" box explains the parameters you need to pass in to speed up Rclone commands.
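To watch the transfer statistics yourself, simply repeat the copy command from above with the progress flag enabled:

rclone copy mytext.txt gdrv:mytext.txt -P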
Accelerating Rclone
You can specify the number of parallel file transfers via --transfers. For example, to back up a folder with 50 files, specify --transfers=25. As a rule, a value that is too high does not do any damage.
The --checkers parameter defines the number of checker processes running in parallel. These processes identify all the files to be uploaded and compare them with what is already in the cloud store; in this way, unchanged data is excluded from the transfer. Choose a value in line with the --transfers parameter.
Using --drive-chunk-size, you can specify how much memory Rclone uses when uploading. The developers recommend 16MB, which you specify as --drive-chunk-size=16384k. However, experimenting with higher values will not do any harm. If you also specify the --progress (-P) parameter, you can see the exact values for the upload's duration. Rclone provides even more information with the -v or -vv options.
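Put together, an accelerated backup call might look like the following sketch; the local image folder and the target folder on the gdrv: remote are placeholder names:

# Upload a local image folder with parallel transfers, checkers, and a larger chunk size
rclone copy ~/Pictures/images gdrv:images-backup --transfers=10 --checkers=10 --drive-chunk-size=16384k -P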
Step on the Gas
In order to investigate the effect of the acceleration options, I had Rclone back up a folder of 10 images, 40MB in size, to a newly created folder on the target (Listing 1). The average data rate during the backup fluctuated at around 500KBps in the test; doubling the chunk size raised this to about 620KBps.
Listing 1
Testing Acceleration Options
Then I added three files with a size of 12MB to the locally stored image folder and restarted the backup with the same command. Since Rclone now only saved the newly added data, the process completed in about 25 seconds. No parameters are required when restoring from Google Drive to the local machine. The command is otherwise the same as the copy command; you only have to swap the source and destination.
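Assuming the image backup from the example above, a restore therefore boils down to reversing the two arguments; the folder names are again placeholders:

rclone copy gdrv:images-backup ~/Pictures/images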
Copy and Sync
In addition to copying, Rclone also lets you synchronize files between a client and a server. The difference between copy and sync is that the latter compares the target with the source file by file and deletes everything on the target that is not in the source directory from which you issue the command.
If you call a sync command from the wrong directory, you risk losing important data. The sync option should therefore be tested with the --dry-run parameter first. copy does not delete any data locally; however, the process overwrites the backup in the cloud. If you want to keep a backup, you can bypass this process by using the backup directory mentioned above.
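A cautious first run, again with the placeholder folder names from above, could therefore look like this; nothing is transferred or deleted until you drop the --dry-run flag:

rclone sync ~/Pictures/images gdrv:images-backup --dry-run -P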
Another way to access the cloud storage with Rclone is to mount Google Drive as a filesystem in your home directory using Filesystem in Userspace (FUSE) [6]. In my example, I created the cloud/ folder there and then mounted the previously configured Google Drive account (Listing 2).
Listing 2
Mounting with FUSE
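In outline, the sequence looks like the following sketch, which assumes the cloud/ folder in your home directory and the gdrv: remote from this setup:

mkdir ~/cloud
rclone mount gdrv: ~/cloud &
# When you no longer need the mount, detach it again:
fusermount -u ~/cloud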