Replying to Backing Up A Very Large Wp Database And Files
Posted 31 October 2012 - 02:44 PM
The condition you describe simply indicates that your host is offering insufficient resources (in this case, since your larger backups are timing out, the resource in question is time) for your backup to complete. This may coincide with slow server performance (low processor resources), which simply exacerbates the problem on your server. Unfortunately, as you may appreciate, this is not an issue that we can "fix", since it relates to your hosting account and the resources that your host makes available to you. BackupBuddy cannot, and should not, try to obtain from your server any level of resource beyond what your host is prepared to make available to you.
We are not asking you to renew your membership in order to keep on using BackupBuddy. Our freely available Codex pages, which you had access to during your membership and can still access now, can be of use both to those with active memberships and to those whose memberships have expired but who continue to use BackupBuddy. There you will find information on mitigating problems with larger sites and constrained server resources.
Of course, if you decide to renew your membership then you will have access to the support forum, where we will be able to advise you further based on additional evidence about your site and server capabilities. You will understand that we cannot enter into detailed support activities with BackupBuddy users without a current membership on this public pre-sales and general enquiries forum, as that would at the very least be unfair to those who have current paid memberships and rightly expect our attention on the members-only support forums.
There are currently a variety of membership discounts and there is always a returning/renewing member discount, all of which our Sales team will be able to advise you about.
We will of course understand if you decide not to renew your membership. You may continue to use BackupBuddy on your sites with sufficient server resources, and on your larger sites with a backup strategy adapted to fit within server constraints, or even utilise an alternative solution where appropriate. We would rather you make backups of your hard work by one means or another, with a solution you are comfortable with, than not make backups at all.
Posted 30 October 2012 - 04:02 PM
Posted 28 August 2012 - 10:12 PM
Please sign in with the username and password you use to access the membership area (from where you downloaded BackupBuddy after purchase), and post your support request/question in the BackupBuddy support forum: http://ithemes.com/forum/forum/74-backupbuddy/
Posted 28 August 2012 - 10:07 PM
The last command run from the importer was:
exec() command (password hidden) `PATH=$PATH:/usr/bin/:/usr/local/bin/:usr/local/bin:/usr/local/sbin/:/usr/sbin/:/sbin/:/usr/:/bin/; /usr/bin/mysql --host=localhost --user=database_user --password=*HIDDEN* --default_character_set utf8 database_name < /path/to/sql` (with path definition).
Any ideas what's going on?
Posted 28 August 2012 - 11:52 AM
How about a PHP script that splits the database process (if it fails) and continues in about X hours to import the rest of the tables? The process could be optimized by following said database optimization protocol, i.e. treating the tables in a way that the most important tables are updated first.
I get the divide-and-conquer concept, but automation is also key here, no? After all, the backup script is failing in some area. Whether it's a server issue or not, perhaps newer versions could avoid such issues.
No disrespect though. I believe it's a wonderful tool and the S3 migration is flawless.
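The idea the poster describes, a per-table import that records its progress and resumes after a failure, could be sketched roughly as follows. This is a hypothetical illustration in Python, not how BackupBuddy/ImportBuddy actually works; the state file name, table names, and the mysql invocation are all assumptions:

```python
import json
import os
import subprocess

STATE_FILE = "import_state.json"  # hypothetical progress file

def load_done():
    """Read the set of tables already imported on previous runs."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return set(json.load(f))
    return set()

def save_done(done):
    with open(STATE_FILE, "w") as f:
        json.dump(sorted(done), f)

def import_tables(tables, run=subprocess.run):
    """Import one .sql dump per table, most important tables first.
    Stops on the first failure; a later run resumes where it left off.
    `run` is injectable so the sketch can be exercised without MySQL."""
    done = load_done()
    for table in tables:  # caller pre-sorts tables by importance
        if table in done:
            continue  # finished on an earlier run
        result = run(["mysql", "database_name", "-e", f"source {table}.sql"])
        if result.returncode != 0:
            break  # give up for now; the next invocation retries this table
        done.add(table)
        save_done(done)
    return done
```

A scheduled job (cron, or a WordPress scheduled event) could then re-run this every few hours until the state file lists every table.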
Posted 06 August 2012 - 02:41 AM
With large sites such as you mention, your focus should be on whether your server supports using the native command-line utilities, via the PHP exec() function, for making a dump of the database and creating a zip archive file. If it does, PHP memory allocation isn't specifically an issue; whether database server memory is an issue is independent of BackupBuddy, and if the native command-line utilities fail when used through the shell then that is a server issue, not something any PHP application/plugin can do anything about. Furthermore, having a VPS is no particular indicator of capability in itself. In this case it would have to be a 64-bit machine with a native 64-bit installation of PHP, you would need a recent zip utility version with at least zip64 extensions, and your server would have to be configurable to the extent of giving the backup scripts enough time to execute the backup process. The database dump phase is usually quite fast (even with a large database) because there really isn't much computation involved, but creating a large zip archive is both processor-intensive and (to a lesser extent) disk-intensive.
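A rough pre-flight check along the lines of the paragraph above could look like the sketch below. The utility names are common defaults and the Zip64 detection relies on Info-ZIP's `zip -v` feature listing; your host may install different tools or none at all, and 64-bit PHP would need to be checked separately:

```python
import shutil
import struct
import subprocess

def backup_capabilities():
    """Report whether the native tools a large backup needs appear to exist."""
    caps = {
        # pointer size of this interpreter as a proxy for a 64-bit machine;
        # a 64-bit PHP build must be verified on its own
        "64bit": struct.calcsize("P") * 8 == 64,
        "mysqldump": shutil.which("mysqldump") is not None,
        "zip": shutil.which("zip") is not None,
        "zip64": False,
    }
    if caps["zip"]:
        # Info-ZIP's `zip -v` lists compiled-in features such as ZIP64_SUPPORT
        out = subprocess.run(["zip", "-v"], capture_output=True, text=True).stdout
        caps["zip64"] = "ZIP64" in out.upper()
    return caps
```

Anything reported missing here would have to be taken up with the host, not with a PHP plugin.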
As Sridihar describes, even with the basics in place there will still be areas that are challenged by the larger backup sizes; the ZipArchive one mentioned is a PHP library limitation which can be machine-dependent. Moving files around can also depend on service-provider limits, so you would need to investigate those aspects as well.
So in essence, if your server has the right resources and capabilities then BackupBuddy will be able to marshal them; if that is not the case then you are always going to struggle. One of the advantages of BackupBuddy is that it has a lot of flexibility and options to help manage different situations, and the support forum is always available to advise on how to make best use of it.
Looking at it from a slightly different perspective: when we see database and site sizes like those mentioned, the first questions we normally ask are why they are so large, whether they need to be that large, and whether you can take a divide-and-conquer approach to make things more manageable. We often see huge tables that really don't have to be backed up, or that should otherwise be limited in size, and also large volumes of effectively "static" data that doesn't have to be included in every backup. It's all about making the most efficient use of your resources and cutting your coat to fit your cloth :-)
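One concrete way to apply this divide-and-conquer idea outside of BackupBuddy's own exclusion settings is mysqldump's `--ignore-table` option, which skips the tables that don't need to be in every backup (logs, statistics, transients). A minimal sketch; the database and table names are invented examples:

```python
import subprocess  # only needed if you actually run the command

def mysqldump_command(database, skip_tables, user="database_user"):
    """Build a mysqldump invocation that leaves out the listed tables."""
    cmd = ["mysqldump", f"--user={user}", "--single-transaction"]
    for table in skip_tables:
        # --ignore-table needs the database-qualified table name
        cmd.append(f"--ignore-table={database}.{table}")
    cmd.append(database)
    return cmd

cmd = mysqldump_command("wp_db", ["wp_statpress", "wp_redirection_logs"])
# to actually dump: subprocess.run(cmd, stdout=open("partial_backup.sql", "wb"))
```

The excluded tables can then be dumped separately, on a slower schedule, or simply regenerated by their plugins after a restore.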
Posted 06 August 2012 - 01:21 AM
In the meantime I want to share my recent experience as a user.
I took a full backup of one of my client's sites and it turned out to be 9 GB. Even though no errors were reported during the backup process, the zip file was marked as bad. I was informed that with a backup zip of that size, the PHP ZipArchive library (which is used to handle backup zips for the Integrity Check) would not be able to handle it, and I realized that even though it was marked bad it was actually fine. I sent the file to the destination from within BackupBuddy, but only 1.9 GB of it arrived. Luckily the destination happened to be on the same server, in another directory, so I used the cPanel file manager to copy the file to the destination directory. I uploaded importbuddy.php and went through the steps, but it failed. Then I used the zip extraction option in the cPanel file manager and ran ImportBuddy again, this time selecting the option to not unzip the archive. In the end everything went just fine.
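For what it's worth, the "marked bad but actually fine" situation above can be cross-checked without PHP's ZipArchive. A small sketch using Python's zipfile module, which supports ZIP64 archives; the backup path is an example:

```python
import zipfile

def verify_backup(path):
    """Return True if the archive opens and every member's CRC checks out."""
    try:
        with zipfile.ZipFile(path) as zf:
            return zf.testzip() is None  # None means no bad member was found
    except zipfile.BadZipFile:
        return False

# example: verify_backup("/home/user/backups/backup-full.zip")
```

On the command line, `unzip -t backup.zip` performs a similar member-by-member test.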
Posted 05 August 2012 - 05:21 AM
My website http://www.reallysimpleseo.com/ runs on WordPress and I'm looking for an ongoing backup solution. My question relates to how BackupBuddy will work with very large files. My database is getting on for a GB, and all the other files amount to around 7 GB. Some other backup plugins are crashing the site because they overload the memory, even though it is running on a VPS with extra memory for the database.
Can you let me know how BackupBuddy will work in this situation, and what the best backup medium is?