The support forum

Detecting Changes to Specific Files

Doequer :

Mar 29, 2017

While trying the latest version of the program a few days ago, I noticed a particular issue, which is probably by-design behavior, but which might nonetheless be improved.

I set up a simple test backup job, with the following options:

Backup from: Desktop (system disk).
Backup to: K: (int. disk).
What to backup: "Example.txt" (an otherwise empty list containing only the "Example.txt" file and no rules).
When to backup: Continuously (real-time).
Detecting changes: Use destination snapshot.
Copying: Use delta copying.
Deleting: Delete....
More options: Left as default.

In my first tests, whenever that particular file changed, backups ran as expected, so nothing was wrong in that sense. However, after a few days of seeing the program's tray icon animate quite frequently, I started wondering why, since neither I nor any third-party program was updating the file after the initial tests. When I looked at what the program was doing, I found it stuck in the following loop:

"Monitoring for changes... > Changes detected, will run in 5 seconds... > Working..."

After some time I finally realized it was acting like that not because of changes to the example file, but because of changes to any other files. Every time an unrelated file is created, renamed, modified, or deleted in the "Desktop" location, the job is triggered. That means the job's last execution time is not reliable, and the log grows larger (depending on how many file manipulations happen in the source folder), filled mostly with false-positive change detections.

I suppose a logical and straightforward solution would simply be to use a dedicated folder for the files to back up, but I'm wondering: couldn't you instead make the program monitor for changes to exactly the targeted files, rather than scanning all files and folders in the source?
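A minimal sketch of the idea being asked for, assuming a simple polling approach (this is purely illustrative and not how Bvckup actually works): track only the watched files' (mtime, size) pairs and ignore everything else in the folder, so unrelated files cannot trigger a run.

```python
import os


def snapshot(paths):
    """Record (mtime_ns, size) for each watched file; None if the file is missing."""
    state = {}
    for p in paths:
        try:
            st = os.stat(p)
            state[p] = (st.st_mtime_ns, st.st_size)
        except FileNotFoundError:
            state[p] = None
    return state


def changed(paths, previous):
    """Return (list of watched files that changed since `previous`, new snapshot).

    Changes to any file NOT in `paths` are invisible here by construction.
    """
    current = snapshot(paths)
    return [p for p in paths if current[p] != previous.get(p)], current
```

A backup loop built on this would call `changed()` periodically and run the job only when the returned list is non-empty; edits to unrelated files in the same folder never show up.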

Doequer :

Apr 03, 2017

So, do you have any suggestions for avoiding the mandatory use of a dedicated folder for each backup job? Maybe some advanced options?

I think it would be useful if the program could ignore changes to any files or folders other than the selected/filtered ones. That would make it possible to use a shared/common path as the backup source without Bvckup looping on this unnecessary work.

Thanks.

Alex Pankratov :

Apr 03, 2017

Sorry for the delayed reply.

There's no simple answer. More discrete change tracking is doable. In fact, it's doable in two ways, but both come with footnotes. See my post from a while ago for some context -
https://www.bvckup2.com/support/forum/topic/405/2438

In the time that has passed since then, I've been leaning more strongly towards the NTFS journal option. It is not *that* hard to do and it should allow for impressively fast change detection. The caveat, of course, is that it needs to be a *local* *NTFS* volume. For remote locations I guess we can try and use the ReadDirectoryChanges() way, but as I mentioned in the post above, it may lead to some severe system-wide consequences.
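Whichever event source is used (the NTFS change journal or directory change notifications), the per-job filtering step would conceptually be the same: a change event triggers the job only if its path matches the job's watched entries. A hedged sketch of that matching step; the helper name and normalization rules here are hypothetical, not Bvckup's actual code:

```python
from fnmatch import fnmatch


def relevant(event_path, watched):
    """True if a change event's path matches one of the job's watched entries.

    `watched` may contain exact paths or glob patterns. Paths are normalized
    to forward slashes and lowercase, mimicking Windows' case-insensitivity.
    """
    norm = event_path.replace("\\", "/").lower()
    return any(fnmatch(norm, w.replace("\\", "/").lower()) for w in watched)
```

With a filter like this in front of the scheduler, events for unrelated files in a shared source folder would be discarded instead of triggering a run.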

Doequer :

Apr 04, 2017

I see.

So the most viable solution so far is to use an exclusive backup folder for each job, leaving shared locations out of the equation.

Thanks.

Alex Pankratov :

Apr 04, 2017

Yes, correct.

Alternatively, see if a short-interval scheduled job would work for your setup. This too will scan the source in full, but Windows generally does a good job of aggressively caching file indexes (indices?), so subsequent scans should complete fairly quickly.

PS. This will spam the backup log though.
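As a rough illustration of what such a scheduled full scan amounts to (a sketch only, with assumed names, not the program's actual implementation): index the source tree by (mtime, size) on each run and diff consecutive scans.

```python
import os


def scan(root):
    """Full scan: map relative path -> (mtime_ns, size) for every file under root."""
    index = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            st = os.stat(full)
            index[os.path.relpath(full, root)] = (st.st_mtime_ns, st.st_size)
    return index


def diff(old, new):
    """Classify changes between two consecutive scans."""
    added   = [p for p in new if p not in old]
    removed = [p for p in old if p not in new]
    updated = [p for p in new if p in old and new[p] != old[p]]
    return added, removed, updated
```

On a warm cache the second `scan()` mostly hits in-memory metadata, which is why short-interval rescans can stay cheap even though they touch every entry.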
