martinb wrote: First up, I have four different OneDrive accounts syncing. I set up all four OneDrive jobs in the web interface using the NAS
admin login. I don't let my family members near the NAS web interface - they just see their files on the network.
If you have multiple users who have each configured their own sync jobs, it will get more complicated, because each of them will have
a user_info.json file within their own home folder.
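A quick way to check whether other users have their own sync configs is to search the home folders for .datasync-center directories. The sketch below builds a scratch directory so it is runnable anywhere; on the NAS you would point find at /volume1/home instead (the admin/alice layout here is just made up for the demo):

```shell
#!/bin/sh
# Demo: find per-user Datasync Center config dirs.
# A scratch dir stands in for /volume1/home so the sketch runs anywhere.
home=$(mktemp -d)
mkdir -p "$home/admin/.datasync-center" "$home/alice/.datasync-center"
# On the NAS: find /volume1/home -maxdepth 2 -type d -name '.datasync-center'
find "$home" -maxdepth 2 -type d -name '.datasync-center'
rm -rf "$home"
```

If that turns up more than one directory, you would need to repeat the redirection below for each user's copy.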
I suggest installing 'Shell In a Box' (if you don't already have a preferred way to connect to the NAS console) and 'Midnight Commander' from App Central. Midnight Commander is a pretty neat old-school file manager - I'm really not a command-line type of guy; give me anything with a GUI to save me typing. And it's great for having a poke around when you're hunting for something.
Remember, I have done all this using the
admin login; if you use a different login, the directory names will need to be changed.
First up, let's take a copy of the datasync directory - we'll need it later:
Code: Select all
cp -R /volume1/home/admin/.datasync-center /volume1/home/admin/.datasync-center-save
You don't need to run the two commands below - they are just for explanation; the script does all the work.
The basic idea is to create a folder in the temp directory:
mkdir -p /tmp/.datasync-center/connections/
And then redirect the default .datasync-center folder to the temp directory with a symbolic link:
ln -sf /tmp/.datasync-center/connections/ /volume1/home/admin/.datasync-center
But when you shut down, all your .datasync-center data in the temp dir is lost.
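If you want to convince yourself the redirection works before touching the real config, the round trip is easy to test in a scratch directory - writes that go "through" the default location actually land on the tmp side (all paths below are made up for the demo):

```shell
#!/bin/sh
# Round-trip test of the symlink trick in a scratch dir.
sandbox=$(mktemp -d)
tmpside="$sandbox/tmp/.datasync-center/connections"   # stand-in for /tmp side
defside="$sandbox/home/.datasync-center"              # stand-in for the default dir
mkdir -p "$tmpside" "$sandbox/home"
ln -s "$tmpside" "$defside"
# Write via the default path...
echo "test" > "$defside/probe.conf"
# ...and the file shows up in the tmp-side directory
ls "$tmpside"
rm -rf "$sandbox"
```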
To make it all work we need a startup/shutdown script to:
1. recreate the folder in the temp directory
2. restore the data (from the backup you made earlier)
3. symlink the .datasync-center directory to temp
4. And importantly, save the data to backup on shutdown
In the event of a power outage you will still lose the temp data, so I also have a CRON job that does regular backups. But a regular backup would write to disk, preventing the disk from sleeping - which is the whole point. So the CRON job compares the temp data with a second copy, also held in temp, and only writes to the backup if the important files have changed.
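The change detection is just counting rsync's "files transferred" stat and skipping the write when it's zero. Here's a sandbox sketch of the idea, parsed a little more defensively than the script below (newer rsync versions say "regular files transferred", which shifts the awk field positions):

```shell
#!/bin/sh
# Demo: only write a backup when rsync reports something actually changed.
# Parses the number after the colon, which copes with both "files
# transferred:" and "regular files transferred:" wordings of --stats.
count() {
    rsync -r -c --stats "$1" "$2" \
        | awk -F: '/files transferred/ {gsub(/[^0-9]/, "", $2); print $2}'
}
live=$(mktemp -d); copy=$(mktemp -d)
echo "interval=300" > "$live/job1.conf"
first=$(count "$live/" "$copy/")     # 1 - job1.conf was copied
second=$(count "$live/" "$copy/")    # 0 - nothing changed, nothing written
echo "first=$first second=$second"
rm -rf "$live" "$copy"
```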
So here is the script
Code: Select all
#!/bin/sh -e
### BEGIN INIT INFO
# Provides: onedrive-daemon
# Required-Start: $local_fs $remote_fs $network
# Required-Stop: $local_fs $remote_fs $network
# Default-Start: 2 3 4 5
# Default-Stop: 0 1 6
# Short-Description: Start or stop the dropbox-daemon.
### END INIT INFO
## Datasync Center has a dir which contains configuration and log files for all the sync jobs.
## The logs get constantly updated meaning the HDDs never sleep
## The script symlinks the config/log dir to one created in memory /tmp
## It doesn't matter if all the log data is lost on reboot, but because the user edited
## .conf files are in the same dir we need to save and restore them from a backup on HDD
TMPDIR=/tmp/.datasync-center
BAKDIR=/volume1/home/admin/.datasync-center-save
DEFDIR=/volume1/home/admin/.datasync-center
case "$1" in
start)
echo "Redirecting Datasync Centre Logs"
### hack by MB to move log files to RAM to allow HDD sleep
#create in-memory directory under /tmp to hold the datasync-center logs
mkdir -p $TMPDIR
#copy datasync-center configurations to tmp from backup dir
cp -rf $BAKDIR/connections $TMPDIR/connections/
#create symlink to redirect default datasync-center config dir to the one we created in /tmp
#(-n so an existing symlink is replaced rather than followed into)
ln -sfn $TMPDIR/connections/ $DEFDIR
#use rsync to create a second copy of the .conf files in a separate /tmp dir
#will use a cron job to periodically compare the copy with the live files and
#if there are any changes we will save to the backup dir
rsync -r -c --stats --include="*.conf" --exclude="log.sqlite" --include="*.sqlite" --include="*/" --exclude="*" $TMPDIR/connections/ $TMPDIR/confs/
chown -R admin:root $TMPDIR/
###
;;
stop)
echo "Stopping"
#$0 backup
cp -rf $TMPDIR/connections $BAKDIR/
;;
restart)
echo "Restarting"
$0 stop
$0 start
;;
refresh)
#use rsync to compare the copied .conf files with the live files and
#if there are any changes we will save to the backup dir
echo "Checking Datasync Centre Logs"
#pipe the output of rsync --stats and search for the line about number of files transferred
transferred=$(rsync -r -c --stats --include="*.conf" --exclude="log.sqlite" --include="*.sqlite" --include="*/" --exclude="*" $TMPDIR/connections/ $TMPDIR/confs/ | awk '/files transferred/ {print $5}')
#echo "$transferred"
if [ "$transferred" != "0" ]
then
#echo "work"
#cp -rf /tmp/.datasync-center/connections $BAKDIR/
$0 backup
fi
;;
backup)
echo "Saving Datasync Centre Logs"
## hack by MB copy from RAM before shut down and when any changes detected
cp -rf $TMPDIR/connections $BAKDIR/
;;
*)
echo "usage: $0 {start|stop|restart|refresh|backup}"
exit 2
;;
esac
exit 0
I named the script move-config.sh and placed it in the backup directory /volume1/home/admin/.datasync-center-save
To get the script to run on startup and shutdown you need to create two symlinks to the move-config script in the /usr/local/etc/init.d/ directory:
Code: Select all
ln -s /volume1/home/admin/.datasync-center-save/move-config.sh /usr/local/etc/init.d/S49-datasync-move-config
ln -s /volume1/home/admin/.datasync-center-save/move-config.sh /usr/local/etc/init.d/K51-datasync-move-config
Scripts starting with S are run at startup in numeric order, and K scripts are run at shutdown. There was already an S50datasync-center script, so I used S49 to make sure the config was moved into place before Datasync Center started, and K51 to take the backup copy after it stopped.
Finally, to get the regular backups, I added a CRON job to run the move-config script with the refresh switch.
Add the following line to /var/spool/cron/crontabs/root:
3 * * * * /bin/sh /volume1/home/admin/.datasync-center-save/move-config.sh refresh
I use the built-in editor in Midnight Commander - type mc at the command line to launch it.
And that's it!
One thing the script doesn't do is remove redundant files from the backup directory. So if you delete a sync job you will need to manually remove the corresponding numbered subdirectory from the backup connections directory. Otherwise you will get phantom sync jobs reappearing after a reboot. I could probably improve that by using rsync instead of cp to copy the files.
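For anyone who wants to try that improvement, the trick is rsync's --delete flag: unlike cp -rf, it removes files from the destination that no longer exist in the source, so deleted jobs would be pruned from the backup automatically. A sandbox sketch (the "1" and "2" job directories are made up for the demo; on the NAS the paths would be $TMPDIR/connections/ and $BAKDIR/connections/):

```shell
#!/bin/sh
# Demo: rsync --delete prunes stale entries from the backup, cp -rf doesn't.
live=$(mktemp -d); bak=$(mktemp -d)
mkdir -p "$live/1" && echo "job" > "$live/1/user_info.json"
mkdir -p "$bak/1" "$bak/2"    # "2" is a deleted sync job still in the backup
rsync -r --delete "$live/" "$bak/"
ls "$bak"                     # only "1" remains - the phantom job is gone
rm -rf "$live" "$bak"
```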
Oh and it looks like there are some redundant comments at the start of the script - I must have copied another script.
Thanks for getting me to write this up - it will come in handy if I ever have to do it again.
Good luck, and if you make any improvements, let me know.