I have a problem exporting my database from my hosting account at one.com. The download stops at exactly 1 GB every time: Google Chrome reports "download failed - network error", and Firefox just stops downloading and outputs a .php file.
My database is 3.3 GB. How can I export it in full? I tried to reach one.com support, but they couldn't help (which really disappointed me, because I expected more expertise from the staff). When I select gzip compression on export, I can download a 260 MB file, but it extracts to only about 2.7 GB, so it is still not the full database.
Do you have SSH access? If not, try to export using MySQLDumper: https://sourceforge.net/projects/mysqldumper/
I tried with SSH; the result is a 2.7 GB SQL file, still not 3.3 GB.
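For reference, an SSH dump can be compressed on the fly so the transfer stays small. This is a generic sketch, not anything one.com-specific - USER, HOST, DBUSER, DBNAME and the output file name are placeholders to substitute with your own values:

```shell
# Sketch: run mysqldump on the server over SSH and gzip the stream locally.
# USER, HOST, DBUSER and DBNAME are placeholders -- substitute your own.
ssh USER@HOST "mysqldump -u DBUSER -p --single-transaction DBNAME" \
  | gzip > forum_dump.sql.gz
```

The `--single-transaction` flag keeps the dump consistent for InnoDB tables without locking the forum while it runs.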
I cannot install MySQLDumper because I have no root access on my hosting space. I can only place the installer inside the forum folder, but that way I cannot access it when maintenance mode is on.
Are you using the gzip option on phpmyadmin?
Make sure you clear your smf_error_log
If I try phpMyAdmin with gzip, it downloads a 260 MB file that extracts to a 2.7 GB one. If I omit gzip and just download the plain SQL, the download gets stuck at 1 GB.
I managed to install MySQLDumper somehow. The problem is, it starts dumping, then after a while it reports "too many requests" and freezes.
I once had a host disallow the same thing, so...
Check this out: https://www.phpclasses.org/package/8904-PHP-Dump-MySQL-tables-in-chunks-of-limited-size.html
Pull it out in chunks. Each run starts where the last dump stopped and continues until it hits the timeout or the size/traffic limit. There are other PHP scripts for this out there too...
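The same chunking idea works with plain mysqldump if the big table has a numeric primary key. This is a hypothetical sketch - the table name, the `id` column, and the row ranges are assumptions about the schema, not something from this thread:

```shell
# Hypothetical: export one huge table in id ranges so no single request
# hits the host's size or time limits. Table/column names are assumed.
TABLE=smf_prettyurls_cache   # assumed name -- check your actual schema
STEP=500000
for start in 0 500000 1000000 1500000 2000000; do
  end=$((start + STEP))
  mysqldump -u DBUSER -p DBNAME "$TABLE" --no-create-info \
    --where="id >= $start AND id < $end" >> big_table.sql
done
```

`--where` and `--no-create-info` are standard mysqldump options; each iteration appends only the rows in its range, so a failed run can be resumed from the last completed range.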
Is it useful when a single table (the one used by Pretty URLs) is 3 GB?
Uninstall the Pretty URLs mod (if you can - good luck :P) and drop the table. If 90% of your database comes from a mod, you shouldn't be using it.
Maybe 2.7 GB is correct. The different SQL export options can result in pretty dramatic size differences - e.g., whether column names are included.
One quick check: the beginning and end of a dump usually save off and restore some system settings. If the end restores the settings saved at the beginning, the file is complete.
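That check is quick from a shell: phpMyAdmin and mysqldump normally emit `SET` statements at the top that save session variables into `@OLD_...` variables, and a matching block at the bottom that restores them, so looking at both ends of the file shows whether it was truncated. The file name here is a placeholder:

```shell
# Show both ends of the dump. A complete export ends with the
# SET ... = @OLD_... restore statements (often plus a "Dump completed" line);
# a truncated one ends mid-INSERT.
head -n 15 forum_dump.sql
tail -n 15 forum_dump.sql
```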
Have you tried restoring it?
I would have liked to, but problems arose.
1. At my current host, I am only offered one database. I don't want to import the SQL back into it, because I fear it will mess up the original tables.
2. I opened another account at Mochahost and tried to restore the database there. The problem is, FileZilla closes the FTP connection after every 1 GB of upload, so I cannot upload the big file to the server to restore it from SSH.
So at the moment I am stuck at verifying the SQL file.
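One common workaround for an FTP connection that dies at 1 GB is to split the dump into pieces, upload them separately, and reassemble them on the server. `split` and `cat` are standard tools; only the file names here are made up:

```shell
# Split the dump into 500 MB pieces (forum_dump.part_aa, part_ab, ...).
split -b 500M forum_dump.sql forum_dump.part_

# After uploading all the pieces, reassemble them on the server:
cat forum_dump.part_* > forum_dump.sql
```

Because `split` names the pieces in sorted order, the wildcard in `cat` restores them in the right sequence; comparing checksums of the original and the reassembled file confirms nothing was lost.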
By the way, what can be the reason that MySQLDumper didn't work for me? In the instruction video they put it in the directory that contains the forum's directory, but at my host I am not allowed to copy things there, so I copied the MySQLDumper directory inside the forum directory. Can that be the source of the problems?
MySQLDumper should run from a directory in the forum root. Some hosts block external scripts from dumping the database.
Quote from: Sono on December 04, 2018, 09:31:21 AM
I cannot install MySQLDumper because I have no root access on my hosting space. I can only place the installer inside the forum folder, but that way I cannot access it when maintenance mode is on.
Can you create additional folders within the forum folder? I would suggest trying to add a new folder for dumper, and running it from there.
Yeah, that is exactly what I do: create a folder in the forum directory and run it from there. But it has issues - it reports an "Undefined index" notice or something like that, and stops during export with "Too many requests". It doesn't find the database automatically either.
As I remember (it's been a while), you can change much of Dumper's behavior through its settings - try making it intentionally slower; the "too many requests" error suggests you are trying to get too much too fast.
The 2.7 GB file exported through SSH restores to the original 3.3 GB database when importing. So that way it works - no need to fragment the export.
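For anyone finding this thread later: the gzipped dump never needs to be extracted to its full 3.3 GB on disk first; it can be streamed straight into mysql. DBUSER, DBNAME and the file name are placeholders:

```shell
# Stream the compressed dump into MySQL without extracting it first.
gunzip < forum_dump.sql.gz | mysql -u DBUSER -p DBNAME
```

This also explains the size difference in the thread: gzip stores the SQL text compressed (260 MB here), while the decompressed stream is the same 2.7 GB dump either way.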