Same File Keeps Updating

User910590 :

Jul 21, 2017

Hello,

I have a few backups running, one of which copies over ~6000 video files (TV shows) and ~4000 subtitle files.

For some reason, every time I run the backup, a few specific files keep getting updated even though nothing in them has changed. The backup will successfully copy over a file dated/modified around 07/01/2017 to update the backup copy dated 07/19/2017 (that's when the backup was run).

If I re-run the backup today, it again copies over the same source file (which hasn't changed) to replace the backup copy dated 07/19/2017.

It looks like something is altering the backup copies' created/modified dates, specifically for ~200-300 files in this huge archive of ~10,000 files.

Would you have any idea what could potentially be causing this? Just FYI, the files on the backup location are occasionally read but never written to. The program accessing those files is "Plex".

Alex Pankratov :

Jul 21, 2017

It sounds like these files are in fact touched by some process between the backups. Given that your 'last modified' times change, these files are being opened for 'write' access, so there's that. I am not familiar with Plex, but there's a chance that it incorrectly opens files in read/write mode when it should be requesting just read access. Perhaps try looking at the timestamps right after a backup and then again after opening/closing these files with Plex?
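For example, a quick script along these lines can snapshot the timestamps and show exactly which files get touched (a rough sketch; the folder path is a placeholder you'd point at the backup location):

import os, time

WATCH_DIR = r"D:\Backup\TV"  # placeholder - point this at the backup location

def snapshot(root):
    # Map every file under 'root' to its last-modified time
    times = {}
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            times[path] = os.path.getmtime(path)
    return times

before = snapshot(WATCH_DIR)
input("Now open/close some of these files in Plex, then press Enter...")
after = snapshot(WATCH_DIR)

# Report files whose last-modified time changed in between
for path, mtime in sorted(after.items()):
    if path in before and mtime != before[path]:
        print(time.ctime(mtime), path)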

User910590 :

Jul 22, 2017

Hmm, good point. I find it odd that it's always happening to a specific set of subtitle (~1 kB) files, but it may be that Plex changes the modified and created dates every time.

Is there a setting that allows you to ignore the modified and created dates and instead compare solely at the block/bit level of the files?

Alex Pankratov :

Jul 22, 2017

No, this is not possible. Last-modified time is always compared.

User910590 :

Jul 24, 2017

So the only way to deal with these files continuously updating would be to exclude them?

Froggie :

Jul 24, 2017

Sure, you can exclude them, then they wouldn't get replicated, but allowing approx. 300 1 kB files to update doesn't seem to present any real system load issue, just a tiny bit of time for the backup task to complete.

Alex Pankratov :

Jul 24, 2017

How do you expect this to be handled? Do you just want to get a "no changes" summary after a backup run?

Even if these files were not *copied* on each run, it would still require reading every bit of both the source *and* destination files just to determine that nothing changed. For smaller files this takes about as much time and effort as copying the file outright (i.e. see what Froggie said).
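To make that concrete, a content-only comparison amounts to something like the sketch below (just an illustration, not how the program works internally), and note that it has to read both files end to end even when they turn out to be identical:

import hashlib

def file_digest(path, chunk_size=64 * 1024):
    # Hash a file's contents; this reads every byte of the file
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.digest()

def same_contents(src, dst):
    # Both files are read in full even when nothing changed, which
    # for a ~1 kB file costs about as much as just copying it
    return file_digest(src) == file_digest(dst)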

AJNiteOwl :

Jul 25, 2017

Suggest you run 2 backups - isolate the problem files, exclude them from your main backup, and then do them separately, perhaps manually and only when there is a need to (i.e. they have changed).
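If it helps, the "only when there is a need to" part is easy to script. A rough sketch of the idea, with placeholder paths, comparing file contents directly and ignoring timestamps:

import filecmp, shutil
from pathlib import Path

SRC = Path(r"D:\TV\subs")        # placeholder - the isolated problem files
DST = Path(r"E:\Backup\TV\subs") # placeholder - their backup location

for src in SRC.rglob("*"):
    if not src.is_file():
        continue
    dst = DST / src.relative_to(SRC)
    # shallow=False compares actual file contents, not timestamps
    if dst.exists() and filecmp.cmp(src, dst, shallow=False):
        continue  # contents identical - skip, leave the old copy alone
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)
    print("updated", dst)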
