Copying a terabyte of data from a hard drive to Google Drive on Ubuntu - the story of how I came to love Rclone
Problem
I needed to sync/copy files from a hard drive to Google Drive. Unfortunately, the total size of these files was around 1 TB. What do you do?
Machine: Ubuntu 18.04 LTS with 2 CPUs, 8 GB of RAM and a 120/10 Mbit network connection.

Solution
Choose your weapon
There are actually a couple of ways to do this:
- Just open a web browser and upload your data through the Google Drive web page (https://www.google.com/drive/) - this solution is very simple, but it has limitations: file modification dates are lost, and you cannot automate it, e.g. to run once a day.
- Ubuntu 18.04 provides 'Online Accounts', which lets you log into your Google Account from GNOME and use Google Drive like an external drive - unfortunately I didn't try this one, because I couldn't access the server physically, and over xRDP I use XFCE4 as the graphical interface.
- Use google-drive-ocamlfuse (https://github.com/astrada/google-drive-ocamlfuse) - this solution mounts Google Drive into your local filesystem, so you can use standard Linux commands like cp, mv and rsync. Unfortunately, for me it was slow and required a lot of configuration.
- Use Rclone (https://rclone.org/) - I recommend this one for the job. It is very simple, fast and powerful. It can copy data from a hard drive to many clouds, and between clouds. It also lets you mount Google Drive at a local path (a short mount sketch follows this list).
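For completeness, here is a minimal mount sketch. The remote name 'googledrive' and the mount point ~/gdrive are my own placeholders - the remote has to be created first with 'rclone config':
mkdir -p ~/gdrive
rclone mount googledrive: ~/gdrive --daemon   # mount Google Drive in the background
# ... use cp, mv, rsync on ~/gdrive as on any local directory ...
fusermount -u ~/gdrive                        # unmount when done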
Rclone - how to install and use it
Installing Rclone (1.51.0) is very easy:
1. Execute this line in a terminal:
curl https://rclone.org/install.sh | sudo bash
2. Execute:
rclone config
3. Follow the instructions from: https://rclone.org/drive/.
The only part that could be difficult is obtaining your own client ID and secret from the Google Developer Console. The rest of the installation is easy.
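Once the remote is configured, it is worth checking that it works. Assuming the remote was named 'googledrive' during configuration (the name is just my choice), the following should list the top-level folders and show the quota usage:
rclone lsd googledrive:
rclone about googledrive: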
To copy the files I used the line below, because I wanted to copy everything without deleting anything:
rclone copy /some/source/path googledrive:/some/path/on/google/drive --create-empty-src-dirs --progress --bwlimit "07:00,0.3M 17:00,off"
As you can see, I used an option called 'bwlimit'. This is a great feature and I really recommend it, especially when you need to work remotely over the same connection :) This configuration limits the upload bandwidth to 0.3 MB/s (2.4 Mbit/s) between 7:00 and 17:00 and uploads without any limit for the rest of the day. Without this option my RDP connection to the workstation was so slow that it was impossible to work.
You can also use 'rclone sync' for the job, but be aware that you can lose data that is already on Google Drive, because sync makes the destination identical to the source. Try --dry-run first to see which files would be deleted.
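A sketch of how I would preview a sync and verify a finished copy - the paths are the same placeholders as above:
rclone sync /some/source/path googledrive:/some/path/on/google/drive --dry-run   # show what would change, delete nothing
rclone check /some/source/path googledrive:/some/path/on/google/drive            # compare source and destination by size/hash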
Story
Once I wanted to move some data from my hard drive to Google Drive. As you already know, there was around 1 TB of it. Because I plan to do this more than once, I didn't want to do it via a web browser, so I started to look for a solution that would let me do it headless.
First I read about google-drive-ocamlfuse and wanted to give it a try. After the installation, during which I ran into a couple of small problems, I mounted Google Drive and started copying data with rsync.
Unfortunately, after a couple of minutes I hit a "no disk space left" problem, but the copying did not stop. I stopped it manually to diagnose why this happened. The issue was the default configuration, which is not suited to transferring big files, so the local cache was growing fast. I changed it (most importantly setting 'stream_large_files=true'), but the problem still occurred - not after a couple of minutes, but after a couple of hours. The way to clear the cache was to run:
google-drive-ocamlfuse -cc
If free disk space does not come back, you need to kill the google-drive-ocamlfuse process and unmount the path if necessary.
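For reference, these settings live in google-drive-ocamlfuse's config file; on my setup that was ~/.gdfuse/default/config (the 'default' label depends on how the drive was mounted). The values below are only an illustration of the kind of change I made, not a recommended configuration:
stream_large_files=true
max_cache_size_mb=512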
The speed of rsync was also not as fast as I wanted (probably because rsync is single-threaded), so I ran more than one instance, each for a different directory that I wanted to copy.
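A rough sketch of what that looked like - the directory names and the ~/gdrive mount point are placeholders, not my real layout:
rsync -a --progress /data/photos/ ~/gdrive/photos/ &
rsync -a --progress /data/videos/ ~/gdrive/videos/ &
wait   # wait for both background copies to finish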
After three days of copying via google-drive-ocamlfuse I realized I had no guarantee that the files were being copied properly, so I decided to change the tool.
After a little research I found Rclone, and I really love it. It's a simple, fast, powerful and universal tool for copying files to, from and between clouds!
I don't often feel that a piece of software is made exactly for me and does much more than I need. With Rclone I had this feeling after 30 minutes of use, and the more I used it, the more I loved it.
I copied data for 7 days straight and nothing bad happened. Once I had a problem with Rclone using all the upload bandwidth exactly when I needed it, but within a couple of minutes I found the option to limit it - and it worked on the first try!
I think I will put this tool in my top 10 list of the greatest software I have worked with.
P.S. Special thanks to Nick Craig-Wood for making Rclone.