
Compressed backups

Sep 01, 2016


Bvckup 2 doesn't implement backup compression.
Nor do we have any plans for adding it in the future.

Why not?


Many modern file systems implement native support for file compression. This includes NTFS, which added compression back in 1995 (!) with the release of Windows NT 3.51.

While the contents of compressed files are stored, well, compressed, they otherwise look and feel like regular files: they can be read and written at will, and the OS transparently compresses and decompresses data on its way to and from these files.
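
This transparency is easy to observe programmatically: a compressed file reports its normal logical size, while its size on disk shrinks. A minimal Win32 sketch (the path below is just a placeholder):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Placeholder path - point this at any file on an NTFS volume */
        const wchar_t *path = L"D:\\Backups\\somefile.dat";

        WIN32_FILE_ATTRIBUTE_DATA info;
        if (!GetFileAttributesExW(path, GetFileExInfoStandard, &info)) {
            printf("lookup failed: %lu\n", GetLastError());
            return 1;
        }
        ULONGLONG logical =
            ((ULONGLONG)info.nFileSizeHigh << 32) | info.nFileSizeLow;

        /* Space actually occupied on disk - smaller once NTFS compresses it */
        DWORD hi = 0;
        DWORD lo = GetCompressedFileSizeW(path, &hi);
        if (lo == INVALID_FILE_SIZE && GetLastError() != NO_ERROR) {
            printf("size query failed: %lu\n", GetLastError());
            return 1;
        }
        ULONGLONG on_disk = ((ULONGLONG)hi << 32) | lo;

        printf("logical: %llu bytes, on disk: %llu bytes\n", logical, on_disk);
        return 0;
    }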

The NTFS compression algorithm is LZNT1, a variant of LZ77, so you will get compression rates comparable to those of gzip and zip.

The CPU usage overhead is generally negligible, except on very ancient machines or under extremely heavy disk load.

In other words, if you are looking at keeping your backups compressed, consider enabling NTFS compression for your backup folder or the entire backup disk rather than using application-level compression like zip or rar.

To enable NTFS compression for a folder


In Windows Explorer, right-click on the folder, select Properties, then:
https://bvckup2.com/support/data/folder-ntfs-compression.png
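
For reference, a rough sketch of what that checkbox boils down to under the hood - a single FSCTL_SET_COMPRESSION call (the built-in compact utility, e.g. compact /c /s, drives the same mechanism). The path is a placeholder:

    #include <windows.h>
    #include <winioctl.h>
    #include <stdio.h>

    int main(void)
    {
        /* FILE_FLAG_BACKUP_SEMANTICS is required to open a directory handle */
        HANDLE h = CreateFileW(L"D:\\Backups",
                               GENERIC_READ | GENERIC_WRITE,
                               FILE_SHARE_READ | FILE_SHARE_WRITE,
                               NULL, OPEN_EXISTING,
                               FILE_FLAG_BACKUP_SEMANTICS, NULL);
        if (h == INVALID_HANDLE_VALUE) {
            printf("open failed: %lu\n", GetLastError());
            return 1;
        }

        USHORT format = COMPRESSION_FORMAT_DEFAULT;   /* LZNT1 */
        DWORD  returned = 0;

        /* On a directory this only sets the inheritance bit, so newly
           created files get compressed; existing contents stay as they
           are (Explorer offers to recurse and recompress them itself).
           On a *file* handle the same call blocks until the whole file
           has been recompressed - see the discussion further down. */
        if (!DeviceIoControl(h, FSCTL_SET_COMPRESSION,
                             &format, sizeof format,
                             NULL, 0, &returned, NULL))
            printf("FSCTL_SET_COMPRESSION failed: %lu\n", GetLastError());

        CloseHandle(h);
        return 0;
    }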

To enable NTFS compression for a drive


In Windows Explorer, right-click on the drive, select Properties, then:
https://bvckup2.com/support/data/drive-ntfs-compression.png

Addendum


There is one scenario where compressing files during a backup may be needed, and that is backing up to a remote share *over a slow link*. In this case, compressing files locally before copying them reduces the amount of data transferred over the network.

HOWEVER, if you are ever in this situation, a cleaner solution is to use network-level compression instead, for example by enabling compression on an SSH tunnel connecting the two sites (ssh -C) or by enabling IPComp in your VPN setup.

Tim :

Jan 17, 2017

That's a real shame, because folder compression in Windows simply does not compress data very much:

Copy 300GB of data from one Windows folder to a 'compressed' one and hey presto, you still lose 300GB of used space on the 'compressed' folder.

Run a backup program and choose the default compression, and you will get significant disk space savings. I can't say how much, because it depends on the backup program, how much data you back up and what types of files you are backing up. (Some files compress better than others.)

However, with large backups and mixed file types, the space saving is really significant.

Alex Pankratov :

Jan 17, 2017



This really depends on the nature of the data and how the backup software does the compression. If it merely compresses on a per-file basis, then you should see about the same level of compression as with native NTFS compression.

But if it compresses *with deduplication*, then you will indeed get better compression rates, especially if the backup contains multiple copies of the same file(s). On the flip side, you lose the ability to access files in your backups directly, without going through the backup software.

PS. There is dedup support in Windows Server, so that's another option to consider.
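
For illustration only, here is a toy sketch of the dedup idea: split the input into fixed-size chunks, hash each chunk, and store a chunk only the first time its hash is seen. FNV-1a stands in for the content hashes a real dedup engine would use, and real engines use content-defined (variable) chunking rather than fixed chunks; this is not how Windows Server dedup or any particular backup product works.

    #include <stdio.h>
    #include <stdint.h>

    #define CHUNK      4096
    #define MAX_CHUNKS 65536

    /* FNV-1a, a stand-in for a proper cryptographic content hash */
    static uint64_t fnv1a(const unsigned char *p, size_t n)
    {
        uint64_t h = 1469598103934665603ULL;
        while (n--) { h ^= *p++; h *= 1099511628211ULL; }
        return h;
    }

    int main(int argc, char **argv)
    {
        static uint64_t seen[MAX_CHUNKS];    /* fixed toy-sized hash table */
        size_t nseen = 0, total = 0, stored = 0;
        unsigned char buf[CHUNK];

        for (int i = 1; i < argc; i++) {
            FILE *f = fopen(argv[i], "rb");
            if (!f) continue;
            size_t n;
            while ((n = fread(buf, 1, CHUNK, f)) > 0) {
                total += n;
                uint64_t h = fnv1a(buf, n);
                size_t j = 0;
                while (j < nseen && seen[j] != h) j++;
                if (j == nseen) {
                    /* first occurrence: the chunk itself would be stored */
                    if (nseen < MAX_CHUNKS) seen[nseen++] = h;
                    stored += n;
                }   /* else: duplicate chunk, store only a reference */
            }
            fclose(f);
        }

        printf("input: %zu bytes, unique: %zu bytes\n", total, stored);
        return 0;
    }

Run it over two copies of the same file and "unique" comes out at roughly half of "input" - a saving that per-file compression alone cannot produce.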

PatchW :

Jan 23, 2017

I understand that you don't wish to enable compression within Bvckup itself, but seeing as it also effectively ignores NTFS compression during sync, it would be useful if either of the following two options were available:

- Bvckup2 can effectively 'tick the box' for NTFS compression on a destination folder/file when it creates it, letting the destination OS handle the actual compression.

- If option 1 isn't viable, would it be possible to at least include a note in the log, e.g. 'compressed folders in source location C:\foo\bar not compressed on destination', so we can parse the log and deal with it elsewhere?

Alex Pankratov :

Jan 24, 2017

I understand that you don't wish to enable compression within Bvckup itself


I didn't say that. It's a perfectly fine option to have and I'm expecting it to make its way into the app at some point. The one thing that I don't like about it is the fact that enabling/disabling compression on an *existing* file is a fully blocking, non-cancelable operation. Meaning that if you have a 100 Gig file and you somehow end up instructing Bvckup 2 to turn the Compressed attribute on for it, then you must wait for the operation to complete.

So I'm not sure how to go about toggling Compression for existing files in a user-friendly way, hence the hesitation.

Bvckup2 can effectively 'tick the box' for NTFS compression


Yes, that's one of three options, basically: never set the Compression attribute, copy it from the source, or always set it. For now, it's #1 and that's it.
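
In the meantime, the Compressed attribute is trivial to read from outside the app, so a post-backup script could implement the check on its own. A minimal sketch, reusing the placeholder path from above:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Placeholder path - point this at a source file or folder */
        DWORD attrs = GetFileAttributesW(L"C:\\foo\\bar");

        if (attrs == INVALID_FILE_ATTRIBUTES) {
            printf("lookup failed: %lu\n", GetLastError());
            return 1;
        }

        printf("%s\n", (attrs & FILE_ATTRIBUTE_COMPRESSED)
                       ? "NTFS-compressed"
                       : "not compressed");
        return 0;
    }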

include a note in the log


Ok, will see about adding this.

Dariush :

Apr 03, 2017

I strongly disagree with the "you will get compression rates comparable to those of gzip and zip" part. Just now, I tested the various compressors on one of my most important files, the Firefox bookmarks database. The base SQLite file is 480 MB. NTFS compresses it to 185 MB, zip to 46 MB, RAR to 35 MB. This is corroborated by other people running comparisons, for example https://dmitrybond.wordpress.com/2015/04/09/ntfs-compression-vs-zip-compression-x3-6-to-x13-7/ and http://itknowledgeexchange.techtarget.com/sql-server/should-i-use-backup-compression-or-ntfs-compression-for-backups/. This is not what you'd call 'comparable'. In other places people are saying that "NTFS compression is processor-intensive and is tuned for speed rather than compression efficiency", which I subjectively feel is true, but cannot find any testing of.
So basically, NTFS compression was obviously not made for backups and I urge you to switch to an algorithm that _was_.

Alex Pankratov :

Apr 03, 2017

Noted.

It's not a matter of _switching_ to a different algorithm, but of introducing a new mode of operation whereby the app would produce backups in a format that is different from the originals. As such, this is very unlikely to be done in the scope of Bvckup _2_.

An algorithm that is made for backups would involve deduplication as well as compression. Merely rar'ing things is a half-measure if you are in fact after efficient storage.

All that said, can I ask why compress in the first place when 1TB of disk space now costs less than $50?

Fle533 :

Aug 25, 2017

With enterprise-class storage, $/TB increases quite significantly, as there's more than the cost of the HDDs to take into account. In such a scenario, compression does make sense.

Compression on the storage system itself often turns out to be not as effective as, for example, 7z or the like, provided the user is willing to invest the necessary CPU cycles on the client machine. As mentioned, this also depends on the file types. Deduplication is not a huge space saver if the data consists mostly of unique binary files.

A compression option in Bvckup would therefore be very welcome, ideally with the ability to use an arbitrary compression program with command-line parameters (e.g. 7-Zip) and a filter where the user can set which file types to compress. Obviously this extends beyond the pure scope of syncing and could be seen as an additional archiving mode.

vinid223 :

Nov 12, 2017

I just want to add my thoughts here.

I do feel the need for a compression option in a backup, since NTFS compression doesn't give me more than a 2% compression rate. Since I have more than 2 TB of data to back up and not a lot of money to put into HDDs, I do need this kind of compression.

I developed a compression system of sorts in the past, using the 7-Zip library so I didn't have to code the algorithm myself. I only mention it as a suggestion, since a license to use it in your app could cost something.

I know that this is not a priority for you guys, but I would really like to know if this is on your TODO list and if it will be implemented in Bvckup 2.

Thank you

Alex Pankratov :

Nov 12, 2017

@vinid223 - sorry to say, but this is not on the ToDo list, no.

I see where you are coming from and I agree that there are cases where better compression can come in quite handy. But this is not a feature that, in my opinion, belongs in Bvckup 2. It has less to do with implementation details and more with the fact that it would blur the overall focus of the program, which is as-is replication of A to B.
