====== Backup Software ======

Recently I looked into backup solutions for my home zoo. After reviewing a number of alternatives ([[http://sourceforge.net/docman/display_doc.php?docid=5149&group_id=29282|afbackup]], [[http://packages.debian.org/lenny/backup-manager|backup-manager]], [[sourceforge>backup2l|backup2l]], [[http://backintime.le-web.org/screenshots/|backintime]], [[https://labs.riseup.net/code/wiki/backupninja|backupninja]], [[http://www.boxbackup.org/comparison.html|boxbackup]], cdbackup, [[http://cedar-backup.sourceforge.net/docs/cedar-backup2/manual/manual.html|cedar-backup2]], chiark-backup, dvbackup, dvdbackup, [[http://faubackup.sourceforge.net/man/faubackup.8.html|faubackup]], [[http://www.edwinh.org/flexbackup/|flexbackup]], jpilot-backup, [[http://rdiff-backup.nongnu.org/features.html|rdiff-backup]], slbackup, storebackup, vbackup) from [[wp>List of backup software|Wikipedia]], I settled on [[sourceforge>backuppc|BackupPC]]. I chose it for the following reasons:
  - Web interface. I know that is not essential for backup software, which can be a CLI utility, but on the other hand, if you already have an Apache server running, a web interface is a plus. You can configure BackupPC from the browser, force a backup, view the logs, browse the currently available backed-up versions of files in a tree viewer, and retrieve the needed version just by clicking on it.
  - It supports Linux and Windows boxes. Linux works via SSH((optional))+tar or SSH((optional))+rsync, and Windows via Samba+tar or SSH+rsync.
  - It supports full and incremental backups of different levels.
  - It performs per-file data compression before storing files in the pool.
  - The daemon periodically checks for host availability and performs a backup when a host comes online (a useful feature for notebooks and other mobile devices).
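The full/incremental distinction can be tried out by hand with GNU tar's ''--listed-incremental'' snapshot files; a minimal sketch (the paths are made up, and this only illustrates the concept, not BackupPC's internal storage):

```shell
#!/bin/sh
# Sketch: full (level 0) and incremental (level 1) backups using a
# GNU tar snapshot file, illustrating the full/incremental idea.
set -e
work=$(mktemp -d)
mkdir -p "$work/data"
echo one > "$work/data/a.txt"

# Full backup: tar records the state of every file in the snapshot file
tar -cf "$work/full.tar" -g "$work/snapshot" -C "$work" data

# Modify the tree, then take an incremental backup against the snapshot
echo two > "$work/data/b.txt"
tar -cf "$work/incr.tar" -g "$work/snapshot" -C "$work" data

# The incremental archive lists only the directory entry and the new file
tar -tf "$work/incr.tar"
```

Restoring means extracting the full archive first and then each incremental in order.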
The main idea of BackupPC is the following: it walks the data tree being backed up and retrieves only the files that are new or modified since the last backup. The files are fetched to the local filesystem and organized in a pool, so if several clients have the same file, it is stored only once. For home users this feature is not very beneficial, as usually all boxes hold different, unrelated information. This is the behaviour of most backup software (backintime, rdiff-backup, etc.). Another, more recent approach is the one offered by [[http://b3.crashplan.com/consumer/features.html|CrashPlan]], which provides peer-to-peer backup with your friends (who must also have this software installed). The drawbacks of this approach:
  * The overlap between the times when you and your peer are online may be small, so you risk missing, say, a daily backup. You should therefore have several peers to minimise the risk, paying for it with increased outbound traffic (multiplied by the number of peers).
  * If your peer is located far away from you, it might not be easy to get a full backup back (ask your friend to burn and mail you a DVD?).
  * You also "pay" for the traffic, both in- and outbound.

Further reading:
  * [[lifehacker>tag/backuputilities|Backup software articles]]
  * [[http://www.geekconnection.org/remastersys/index.html|Remastersys]] -- makes a full system backup, including personal data, to a live CD or DVD that you can install anywhere.
  * [[wp>List of online backup services]]
  * [[http://www.isyncbox.com/|Syncbox]] -- turns an existing PC into a dedicated server so you can start file synchronization and sharing services right away.
  * [[http://www.makeuseof.com/tag/5-ways-to-securely-encrypt-your-files-in-the-cloud/|5 Ways To Securely Encrypt Your Files In The Cloud]]
  * [[habrahabr>266293|Linux and free cloud storage services]]
  * [[sync#dropbox|Dropbox]]
  * [[wp>List of backup software]]
  * [[serverfault>15974|"Enterprise" backup software]]
  * [[https://www.greyhole.net/]] -- Samba-based replication solution
  * [[http://aulix.com/backup2dvd|Aulix Backup2DVD utility]] -- claims highly reliable backups to optical disks such as Blu-ray and DVD
  * [[http://unix.stackexchange.com/a/24488/36095|tar supports '--acl' only from Debian jessie]]
  * [[http://www.veeam.com/endpoint-backup-free.html|Veeam Endpoint Backup Free]]
  * [[habrahabr>267881|12 typical database backup mistakes]]

==== Universal utility to access cloud storage ====

Given that [[https://help.mail.ru/cloud_web/app/linux|Mail.Ru has abandoned their Linux client]], while Dropbox has [[https://help.dropbox.com/accounts-billing/settings-sign-in/computer-limit|limited the number of linked devices to 3]] and [[https://www.linuxuprising.com/2018/11/how-to-use-dropbox-on-non-ext4.html|restricted the local filesystem to ext4]], there is a need for a universal alternative:

^ Project ^ Supports MailRu? ^ Supports DropBox? ^ Notes ^
| [[https://rclone.org/|rclone]] | :YES: [[https://rclone.org/mailru/|here]], but [[github>rclone/rclone/issues/3637|does not work with 2FA enabled]] | :YES: [[https://rclone.org/dropbox/|here]] | :MINUS: No two-way synchronization, see [[https://forum.rclone.org/t/upback-two-way-synchronization-utility-based-on-rclone/4692/5]] |
| [[https://cloudcross.mastersoft24.ru/|CloudCross]] | :HELP: [[github>MasterSoft24/CloudCross/issues/71|affected by 2FA?]] | :YES: | :MINUS: Cannot synchronize part of the tree |

==== Utility to optimally distribute files into multiple fixed-size volumes ====

Implements [[wp>Bin_packing_problem#First-fit_algorithm|First Fit Decreasing (FFD)]]:
  * [[https://sourceforge.net/projects/fpart/|fpart]] -- sorts files and packs them into partitions ([[https://sourceforge.net/p/fpart/code/ci/master/tree/src/dispatch.c|code]])
  * [[https://sourceforge.net/projects/discfit/|DiscFit]] ([[https://sourceforge.net/p/discfit/code/HEAD/tree/branches/1/BinPacking1.1/BinPacking.cs|source]])

Implements [[wp>Bin_packing_problem#First-fit_algorithm|First Fit]]:
  * ''[[https://linux.die.net/man/1/dirsplit|dirsplit]]'' from [[wp>cdrkit]] -- makes a given number of random shuffles (500 by default) and chooses the best one ([[http://http.debian.net/debian/pool/main/c/cdrkit/cdrkit_1.1.11.orig.tar.gz|source]]).
    * :ADD: Takes into account that data in an ISO image occupies more space because of additional filesystem structures.

Closed source (hence the algorithm is unknown):
  * [[http://hcidesign.com/dvdspan/|DVD Span]]
    * :DEL: Has problems with Cyrillic when creating an ISO. Cannot create a [[wp>Universal Disk Format|UDF]] ISO. So the only way around these issues is to create an ''.irp'' filelist for [[http://infrarecorder.org/|InfraRecorder]].
    * :HELP: When using the two modes other than "In order" ("Re-order ..."), the result is different every time (looks like a randomized First Fit?).
  * [[http://lars.werner.no/?page_id=2|SizeMe]]

My comparison:
  * ''DVD Span'': 16176 MB left on the last volume
  * ''dirsplit'' (''-a 5000'' iterations): 18084 MB left on the last volume

See also:
  * [[ixbt>31:24267|A program for automatic optimal distribution of files across disks]]
  * [[superuser>85466|Utility to optimally distribute files onto multiple DVDs?]]
  * [[http://unix.stackexchange.com/questions/10158/|Splitting large directory tree into specified-size chunks?]]

The strategy I use with ''dirsplit'' for 50GB((Maximum ISO size 50048901120 B)) BD DL disks:
  - Create the filelists (the split size was found empirically):

  dirsplit -s47730M -a50000 -e1 -p /mnt/iso/video_ /mnt/video

  - Verify the resulting ISO sizes. The maximum ISO size should not exceed the size of the target media (otherwise adjust the split size and rerun the previous step):

  for file in /mnt/iso/video_*.list
  do
    echo -n "[$file]: "
    extents=`genisoimage -no-rr -allow-limited-size -graft-points -q -print-size -path-list $file`
    echo "$(($extents * 2048)) B = $(($extents / 512)) MB = $(($extents / 512 / 1024)) GB = $extents extents"
  done | sort -r -n -k 2,2

which prints something like this:

  [/mnt/iso/video_2.list]: 50048901120 B = 47730 MB = 46 GB = 24437940 extents
  [/mnt/iso/video_3.list]: 50048890880 B = 47730 MB = 46 GB = 24437935 extents
  [/mnt/iso/video_1.list]: 50048743424 B = 47730 MB = 46 GB = 24437863 extents
  [/mnt/iso/video_4.list]: 50048741376 B = 47730 MB = 46 GB = 24437862 extents
  [/mnt/iso/video_5.list]: 47073021952 B = 44892 MB = 43 GB = 22984874 extents

  - Generate the ISOs:

  for file in /mnt/iso/video_*.list
  do
    echo "[$file]"
    genisoimage -no-rr -allow-limited-size -graft-points -path-list $file -V "${file%.*}_`date +%F`" -o ${file%.*}.iso
  done

  - Burn the ISOs starting from the largest (the one that comes first in step 2), to be more confident that the remaining discs will burn safely.
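The First Fit Decreasing heuristic that fpart and DiscFit implement is easy to sketch: sort the sizes in descending order, then drop each item into the first volume that still has room, opening a new volume when none fits. A toy shell/awk sketch (the item sizes and the 700-unit capacity are made up):

```shell
#!/bin/sh
# First Fit Decreasing: sort item sizes descending, place each item
# into the first volume with enough remaining capacity.
cap=700
printf '%s\n' 500 400 300 200 100 | sort -rn | awk -v cap="$cap" '
{
    placed = 0
    for (i = 1; i <= n; i++)
        if (used[i] + $1 <= cap) { used[i] += $1; placed = 1; break }
    if (!placed) { used[++n] = $1; i = n }   # open a new volume
    print "item", $1, "-> volume", i
}
END { print n, "volumes" }'
```

With these numbers the items pack into three volumes (500+200, 400+300, 100); dirsplit's random-shuffle approach instead samples many orderings and keeps the best one.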
===== Recovery =====

  * [[habrahabr>253997|Extracting data from iOS devices with open source tools]]
  * [[habrahabr>256603|A test of free data recovery programs]] (and follow the links in the //Similar publications// section)
  * [[habrahabr>247421|Recovering deleted data with Scalpel]]
  * [[https://www.datahata.by/info/articles/besplanye-incstrumenty-vosstanovleniya-dannych.html|10 free tools for recovering lost data]]

===== [[http://backuppc.wiki.sourceforge.net/|BackupPC]] questions answered =====

  * [[http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_rename_a_host|Rename a host]]
  * [[http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Speedupbackups|How to maximize BackupPC performance]] (nice hard drive optimisation tips)
  * [[http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Workaround_BackupPC_Windows_2003_Hang|How to work around rsync stalls when backing up a Windows 2003 server]] (actually, it comes down to installing ''rsyncd'')
  * [[http://www.goodjobsucking.com/?p=62|Backing up open files on Windows with BackupPC (rsync) using Volume Shadow Copy]]
  * ''%%for host in host1 host2; do su backuppc -c "/usr/share/backuppc/bin/BackupPC_tarCreate -h $host -t -n -1 -s \* ." | 7z a -mx=9 -ms=on -mqs=on -si /mnt/mybook/backup.$host.`date +"%F"`.tar.7z; done%%''
  * [[http://akutz.wordpress.com/2008/12/23/creating-archives-with-backuppc-and-7zip/|Creating archives with BackupPC and 7-zip]]
  * [[http://www.tedcarnahan.com/2008/05/22/backuppc-only-when-mobile-device-laptop-is-plugged-in/|Prevent laptops from backing up when running on battery]]
  * For information on hiding the user ID that you use to access shared resources for backup, see the [[windows]] page.

=== Additional methods for resolving a host IP besides ''nmblookup'' or ''ping'' ===

While setting up my notebook I ran into the following problem: unfortunately, the NetBIOS name of the Windows box was not visible from the Linux box.
First I thought the [[http://www.samba.org/|Samba]] server had to be configured appropriately, but playing with the ''wins support = yes'' and ''domain master = yes'' configuration options did not give the desired result. I also checked the LanManager registry settings (see [[http://support.microsoft.com/kb/136712|1]], [[http://osr507doc.sco.com/en/ASUSystemG/asusystemT.asureg_lanmanserver.html|2]], [[http://www.windowsnetworking.com/kbase/WindowsTips/WindowsNT/RegistryTips/Network/TCPNetBIOSBroadcastPerformanceRegistrySettings.html|3]] and [[http://www.tech-faq.com/understanding-netbios-name-resolution.shtml|4]]), thinking it was running in "hidden" mode. Finally I traced the network traffic and found that broadcast UDP packets do not pass through the WiFi router in the direction of the Windows box, which I found strange. So in this configuration ''nmblookup'' does not reliably resolve the Windows host IP address, and ''ping'' does not work either because of a [[stackoverflow>2228839|domain search problem]]. I would suggest:
  * Trying ''ping host'' first, then ''ping host.'', and taking the result from whichever succeeds
  * Implementing one more check, ''nslookup host''

Also look for the latest updates on the [[sourceforgemailmessage>24538482|mailing list]].

=== How to test that a network share is accessible by a given user? ===

Use the following command: ''%%smbclient -U user%pass //host/share$ -c dir%%''

=== Problem when connecting to a share: "Connection to //[host]// failed (Error NT_STATUS_UNSUCCESSFUL)" ===

If you are using [[http://www.kaspersky.ru|KAV/KIS]], you need to set the network type to "Local network" for your WiFi connection in the //Firewall -> Settings -> Networks tab// dialog, as advised [[http://support.kaspersky.com/faq/?qid=208280576|here]].

=== When restoring files via the web interface, the downloaded ZIP archive is always broken ===

You are using the broken ''libarchive-zip-perl'' v1.30.
The solution is either to use compression level 0, or to [[sourceforgemailmessage>201005111944.08468.tyler%40tolaris.com|downgrade to v1.18]]. Also reported as [[https://rt.cpan.org/Public/Bug/Display.html?id=54827|issue #54827]].

=== [[http://unix.stackexchange.com/questions/22834|How to uncompress zlib data?]] ===

BackupPC compresses logs using ''zlib'', but they are not true ''.gz'' or ''.zip'' files. The trick is to prepend the gzip magic number and compression method:

  printf "\x1f\x8b\x08\x00\x00\x00\x00\x00" | cat - RestoreLOG.z | gunzip

For example:

  printf "\x1f\x8b\x08\x00\x00\x00\x00\x00" | cat - /var/lib/backuppc/pc/centurion/XferLOG.1.z | gunzip | egrep -v '^ (pool|create)' | less

{{tag>ISO Kaspersky}}