It should work for all users, at all experience levels, across all server platforms.
Something so comprehensive is totally impossible.

What I can say is this: I wrote some code that is not so different from the previous version; it "simply" splits the backup into chunks (well, not exactly, and not only that, but that's not the point here).
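This is not the mod's actual PHP code; just a minimal Python sketch of the chunking idea, using an in-memory sqlite3 database as a stand-in for the forum tables and a hypothetical `max_bytes` limit per chunk. (The `repr()`-based value escaping is for illustration only, not SQL-safe.)

```python
import sqlite3

def dump_in_chunks(con, table, max_bytes=1024):
    """Yield SQL dump chunks for `table`, starting a new chunk
    once the current one would exceed max_bytes."""
    chunk, size = [], 0
    for row in con.execute(f"SELECT * FROM {table}"):
        values = ", ".join(repr(v) for v in row)
        stmt = f"INSERT INTO {table} VALUES ({values});\n"
        if size + len(stmt) > max_bytes and chunk:
            yield "".join(chunk)       # flush the current chunk
            chunk, size = [], 0
        chunk.append(stmt)
        size += len(stmt)
    if chunk:
        yield "".join(chunk)           # flush whatever is left

# Demo: 50 rows split into several small chunks.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (a INTEGER)")
con.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(50)])
chunks = list(dump_in_chunks(con, "t", max_bytes=100))
```

Each yielded chunk could then be written to its own file (or appended to the dump between flushes), which is what keeps the script under the server's memory and execution-time limits.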
The limits are more or less the same as phpMyAdmin's (I still have to optimize it a bit): almost none with plain-text dumps, a few with compressed dumps. Because of the way phpMyAdmin handles the backup, I discovered compressed dumps are almost useless above a certain size: phpMyAdmin concatenates the gzencode/bzcompress outputs, but those streams cannot simply be concatenated, resulting in a file where only the first echoed block is readable, at least here on my box.
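For what it's worth, concatenating gzip streams does produce a file that is formally valid (the gzip format allows multiple "members"), but many decoders simply stop after the first member, which would match what I'm seeing. A small Python demonstration of both behaviours (the PHP `gzencode` situation should be analogous, though I haven't verified every decoder):

```python
import gzip, zlib

part1 = gzip.compress(b"INSERT INTO t VALUES (1);\n")
part2 = gzip.compress(b"INSERT INTO t VALUES (2);\n")
blob = part1 + part2              # two gzip members glued together

# A single-stream decoder stops at the end of the first member;
# the second member is left over in d.unused_data.
d = zlib.decompressobj(wbits=16 + zlib.MAX_WBITS)
first = d.decompress(blob)

# A multi-member-aware reader (like Python's gzip module, or
# the `gunzip` command line tool) recovers everything.
everything = gzip.decompress(blob)
```

So whether the concatenated dump is "readable" depends entirely on which tool the user decompresses it with.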
Additionally, it takes time. That's not about server load, or about having to sit in front of the computer waiting for the backup to finish; it's about data consistency. Think about it: if the backup takes 20 minutes, anything that happens during those 20 minutes (a post, a new topic, a ban, whatever) could be lost. Worse, since the backup scans the entire database from the first table to the last, it can create inconsistent data: if a new member registers while the backup is at {db_prefix}messages and then posts a message, the message may appear in the backup while the corresponding user does not (because {db_prefix}members comes before {db_prefix}messages).
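The race described above can be reproduced in a few lines. This is a Python simulation with sqlite3 standing in for the forum database and the table order reversed for brevity (members scanned first, messages later); the result is a dump containing an orphaned message whose author is missing:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE members  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE messages (id INTEGER PRIMARY KEY, member_id INTEGER, body TEXT);
INSERT INTO members VALUES (1, 'alice');
""")

backup = {}
# Step 1: the backup scans the members table.
backup["members"] = con.execute("SELECT * FROM members").fetchall()

# Step 2: meanwhile, a new user registers AND posts a message.
con.execute("INSERT INTO members VALUES (2, 'bob')")
con.execute("INSERT INTO messages VALUES (1, 2, 'hi')")

# Step 3: the backup only now reaches the messages table.
backup["messages"] = con.execute("SELECT * FROM messages").fetchall()

# The dump contains bob's message, but not bob himself.
known_ids = {m[0] for m in backup["members"]}
orphans = [m for m in backup["messages"] if m[1] not in known_ids]
```

Restoring such a dump would leave messages pointing at a member row that does not exist, which is exactly why freezing writes (maintenance mode) during the backup matters.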
So for "big databases", enabling maintenance mode would be mandatory in order to be sure the backup is "good enough".
There are a few things to take into consideration.
(which has never failed for me, mind you - I can't recreate the problem on 3 different servers/sites. LOL!)
Just because you don't use it?
