What happens if you choose a different backup method? (Or is a different one not available? I'm thinking your host's 'upgrade' trashed more than just YaBB's functionality. Unfortunately, on my testbed I only have Archive Tar and Archive Zip available. I'm assuming nothing of note is showing up in the server error log.) BTW, overly small zip files were one of the problems with Advanced Backup on micro-sized YaBBs until I put the $max_process_time down to 5 seconds or less.
I didn't really test whether Archive::Tar was supported because of the timeout issue... I saw no reason to drop $max_process_time further, as it always failed on the big directory, not the smaller ones. On a run where it hadn't yet failed, it reported that the step took 7.3 seconds.
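The $max_process_time idea discussed here can be sketched as a time-budgeted step that archives as many directories as fit in the budget and hands the rest back for a follow-up request. This is a minimal sketch, not YaBB's actual Backup.pm code; the directory names and resume logic are illustrative.

```perl
#!/usr/bin/perl
# Sketch of a time-budgeted backup step, assuming a YaBB-style
# $max_process_time limit. The archiving itself is stubbed out.
use strict;
use warnings;
use Time::HiRes qw(time);

my $max_process_time = 5;    # seconds per request, as discussed above
my @pending = qw(Boards Members Messages Variables);

# Process as many directories as fit in the budget; return the rest
# so a follow-up request can resume where this one stopped.
sub backup_step {
    my ($budget, @dirs) = @_;
    my $start = time;
    my @done;
    while (@dirs) {
        last if time - $start > $budget;    # out of time: stop cleanly
        my $dir = shift @dirs;
        # ... archive $dir here (Archive::Tar / Archive::Zip) ...
        push @done, $dir;
    }
    return (\@done, \@dirs);    # completed this step, still pending
}

my ($done, $left) = backup_step($max_process_time, @pending);
print "Done: @$done\n";
print "Left: @$left\n";
```

A step that always fails on one big directory, as described above, is exactly the case this shape can't help with: the budget has to be large enough for the single largest unit of work.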
Posted by: Dandello Posted on: May 24th, 2016 at 2:43pm
What happens if you choose a different backup method? (Or is a different one not available? I'm thinking your host's 'upgrade' trashed more than just YaBB's functionality. Unfortunately, on my testbed I only have Archive Tar and Archive Zip available. I'm assuming nothing of note is showing up in the server error log.) BTW, overly small zip files were one of the problems with Advanced Backup on micro-sized YaBBs until I put the $max_process_time down to 5 seconds or less.
Posted by: Monni Posted on: May 24th, 2016 at 2:13pm
That's a problem I can't reproduce. HOWEVER, if it IS related to the problem I was having with Advanced Backup, put the if statements back and move
Code
automaintenance('off');
into 'sub print_BackupSettings' just above the return.
Hopefully that will fix the problem(s) with your Backup.pm, as I suspect the problems you were having with the if statements had to do with the $max_process_time being set longer than your allowed process time.
If I put the if statements back, I just get a backup file that's too small... it doesn't contain all of the directories. The problem is that when it fails, it doesn't actually write anything to the file, because the external zip command uses temporary files. When doing incremental backups, it just prints "Nothing to do." and the browser can't handle that because there are no HTTP headers. If I use the internal zip from Perl, I just get a croak due to file corruption.
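The "Nothing to do." failure above is a malformed CGI response: the browser chokes because the status text arrives with no headers in front of it. A minimal sketch of the fix is to emit the Content-type line before any status output; the sub name and message text here are illustrative, not YaBB's actual code.

```perl
#!/usr/bin/perl
# Sketch: a CGI response must start with headers. Printing the
# Content-type line before *any* status text keeps the response
# valid even when the backup produced nothing.
use strict;
use warnings;

sub respond {
    my ($message) = @_;
    print "Content-type: text/plain\n\n";    # headers first, always
    print "$message\n";
}

respond('Nothing to do.');
```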
Posted by: Dandello Posted on: May 24th, 2016 at 1:25pm
That's a problem I can't reproduce. HOWEVER, if it IS related to the problem I was having with Advanced Backup, put the if statements back and move
Code
automaintenance('off');
into 'sub print_BackupSettings' just above the return.
Hopefully that will fix the problem(s) with your Backup.pm, as I suspect the problems you were having with the if statements had to do with the $max_process_time being set longer than your allowed process time.
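The suggested move can be sketched like this, with stubs standing in for YaBB's real automaintenance and print_BackupSettings subs; only the placement of the call matters here, everything else is hypothetical.

```perl
#!/usr/bin/perl
# Sketch of the suggested change: call automaintenance('off') at the
# end of print_BackupSettings instead of inside the per-method if
# blocks. Both subs are stubs for YaBB's real ones.
use strict;
use warnings;

my $maintenance_state = 'on';

sub automaintenance {
    my ($mode) = @_;
    $maintenance_state = $mode;    # stub: the real sub toggles YaBB's lock
}

sub print_BackupSettings {
    # ... print the backup settings page here ...
    automaintenance('off');        # moved here, just above the return
    return;
}

print_BackupSettings();
print "maintenance: $maintenance_state\n";
```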
Posted by: Monni Posted on: May 24th, 2016 at 11:09am
There are large YaBB forums where the 'normal' backup can take 4+ hours. (Like YaBBForum itself.) But the backup issues for SQL based data is something that has to be seriously considered. It does no good to use mySQL for the data if we can't make good and fast backups of that data - and an easy way to restore the data when necessary.
Even if the backup isn't very large, just downloading it off-site takes a long time. Shared hosting plans have limited disk space and bandwidth; they can usually hold 4-5 backups on-site at most, and only if no system backup is in progress... Incremental backup currently doesn't work, so it can't be used to reduce the size of the individual backups...
Posted by: Dandello Posted on: May 23rd, 2016 at 2:17pm
There are large YaBB forums where the 'normal' backup can take 4+ hours. (Like YaBBForum itself.) But the backup issues for SQL based data is something that has to be seriously considered. It does no good to use mySQL for the data if we can't make good and fast backups of that data - and an easy way to restore the data when necessary.
Posted by: Monni Posted on: May 21st, 2016 at 10:44pm
If opening the file succeeds but writing doesn't, all data is lost because no copy of the overwritten data is kept.
Yet another reason to start seriously looking at mySQL for the data storage. In theory, it's less likely to wipe the data.
I've used SMF, which is basically a MySQL version of YaBB since it's based on YaBB SE, and it had database corruption issues instead... In YaBB we get one or two zero-length files, but when a MySQL database gets corrupted, it usually means everything is unreadable. I know MySQL has a repair command, but that doesn't always fix corruption successfully. With a forum that is over 1 GB in size, that kinda means we would have to shut down the forum for about 4 hours every week to take synchronized backups of both the attachment files and the data moved to the MySQL database, and copy them off-site.
Posted by: Dandello Posted on: May 21st, 2016 at 9:44pm
Ahh! a variation on the famous 0 length vars file problem.
It can basically happen with any file. I've seen it happen on the attachment list file, instant message mailboxes, etc... If opening the file succeeds but writing doesn't, all data is lost because no copy of the overwritten data is kept.
Posted by: Dandello Posted on: May 21st, 2016 at 9:00pm
Ahh! a variation on the famous 0 length vars file problem.
Posted by: Monni Posted on: May 21st, 2016 at 8:42pm
Most cases of file corruption happened because YaBB didn't notice it was running out of disk space while writing a file... Stupid Linux stops writing to files well before free space on a mounted filesystem hits zero. When that happens, the file size is reset to 0 and the file is truncated.
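The truncation failure described here can be avoided by never writing over the live file directly: write to a temporary file, check every write (a full disk shows up as a failed print or close), and rename over the original only on success, so the old data survives any failure. This is a general sketch, not YaBB's actual save routine; the file name and data are illustrative.

```perl
#!/usr/bin/perl
# Sketch of a safer save: write to a temp file, verify writes and
# the close (buffered data flushed at close can still hit ENOSPC),
# then rename over the original only on success.
use strict;
use warnings;

sub safe_write {
    my ($path, $data) = @_;
    my $tmp = "$path.tmp.$$";
    open my $fh, '>', $tmp or return 0;
    print {$fh} $data or do { close $fh; unlink $tmp; return 0 };
    close $fh          or do { unlink $tmp; return 0 };
    return rename $tmp, $path;    # atomic on the same filesystem
}

safe_write('vars.txt', "board=general\n") or die "write failed: $!";
open my $in, '<', 'vars.txt' or die $!;
print scalar <$in>;
close $in;
unlink 'vars.txt';
```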
Posted by: Dandello Posted on: May 21st, 2016 at 8:38pm
I'll look for that when I start working on that function. There are also a lot of spots where it croaks because the file it's trying to read from doesn't exist. (All those readline() on closed filehandle errors.)
Posted by: Monni Posted on: May 21st, 2016 at 8:24pm
Where it gets tricky is that you can't use that test if the variable can legitimately have a 0 value.
Well... While I was fixing the attachment handling code, there were cases where, under "perl -w", it failed because it assumed it could always read the attachment list and find the entry... When the attachment list file was corrupted, it croaked because the attachment count was undefined rather than zero. The same thing happened when editing attachments in a post... Adding the first attachment worked, but adding a second always failed.
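The undefined-count failure above is the classic case for the defined-or operator: default a missing entry to zero instead of letting undef leak into arithmetic. A minimal sketch, assuming a hypothetical hash layout rather than YaBB's actual attachment list format:

```perl
#!/usr/bin/perl
# Sketch: default an undefined count to 0 with the defined-or
# operator (Perl 5.10+), so a corrupted or missing attachment list
# yields 0 instead of an "uninitialized value" warning or croak.
use strict;
use warnings;

my %attach_count = ( 42 => 3 );    # post id => number of attachments

for my $post_id (42, 99) {         # 99 is missing from the list
    my $count = $attach_count{$post_id} // 0;    # undef becomes 0
    print "post $post_id has $count attachment(s)\n";
}
```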
Posted by: Dandello Posted on: May 21st, 2016 at 8:02pm
What I've found should work is replacing the
Code
if ( $myimportedvar eq q{} )
with
Code
if ( !$myimportedvar )
In most cases we're actually checking to make sure that the variable has a value, and (in theory) undef, 0, and q{} should all be false. Where it gets tricky is that you can't use that test if the variable can legitimately have a 0 value.
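The caveat can be shown in a few lines: !$var is true for undef, 0, '0', and the empty string alike, so when 0 is a legitimate value the check must use defined() (and length() to also reject a blank string). A quick demonstration:

```perl
#!/usr/bin/perl
# Demonstrates that !$var lumps undef, 0, '0' and '' together,
# while defined-and-length separates a real 0 from a missing value.
use strict;
use warnings;

for my $var (undef, 0, '0', q{}, 'yes') {
    my $shown   = defined $var ? "'$var'" : 'undef';
    my $falsy   = !$var                       ? 'yes' : 'no';
    my $has_val = defined $var && length $var ? 'yes' : 'no';
    print "$shown: !\$var=$falsy, defined-and-length=$has_val\n";
}
```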
Posted by: Monni Posted on: May 21st, 2016 at 7:35pm
The cases where I had to reverse the tests were mostly ones where a blank string was supposed to be equivalent to an undefined variable, variables that were loaded from files, for example.
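One way to avoid reversing those tests is to normalize after loading: treat undef as the empty string so an absent entry and a blank entry compare the same way. A minimal sketch, with a hypothetical %settings hash standing in for data read from a YaBB file:

```perl
#!/usr/bin/perl
# Sketch: normalize values loaded from a file so `eq q{}` matches
# both missing and blank entries. %settings is a hypothetical
# stand-in for parsed file data.
use strict;
use warnings;

my %settings = ( title => 'My Board', footer => undef );

# Treat undef as the empty string before any eq q{} comparisons.
for my $key (keys %settings) {
    $settings{$key} = q{} unless defined $settings{$key};
}

print "footer is blank\n" if $settings{footer} eq q{};
```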