I'm attempting to use this converter with an IP.Board 2.3.6 forum. I was able to get it running after applying several integer-cast edits (to catch fields that contained NULL values), but now I'm stumped.
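For context, the integer-cast edits amounted to defending every numeric field against NULL before it hits an integer column. The converter itself is PHP, but in Python terms the fix I applied looks roughly like this:

```python
def to_int(value, default=0):
    """Coerce a possibly-NULL database field to an integer.

    NULL comes back from the driver as None (NULL in PHP), which the
    original converter code passed straight into integer columns.
    """
    if value is None:
        return default
    return int(value)
```

In the PHP script it was just a matter of sprinkling `(int)` casts over the offending fields.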
Every row in the smf_messages table has an empty body field, yet I can verify that the original ibf_posts table contains valid post bodies, although possibly with some control characters in them: when I SELECT them from the mysql command-line client, they emit control codes that affect the terminal.
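To inspect the bodies without letting control codes loose on the terminal, a hex dump works; something like this (pid/post are the column names as I see them in my 2.3.6 schema, adjust if yours differ):

```sql
-- Hex-dump the first bytes of a few bodies so control characters
-- can't touch the terminal.
SELECT pid, HEX(LEFT(post, 64)) FROM ibf_posts LIMIT 5;
```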
I already have a convenient compressed and encrypted copy of the database, which I can share via PM, but be warned: it's over 200MB compressed with xz -9, or 1.2GB uncompressed.
E: Disregard the duplicate-primary-key error I encountered; it was caused by resuming a step in the middle of operating on a given table. After making further edits to the script, I just had to delete everything already converted, from the specified record onward, and resume.
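For anyone hitting the same thing, the cleanup amounted to something like this, with 12345 standing in for the first duplicate record the converter reported:

```sql
-- Remove the partially converted rows at and after the record the
-- step died on, then rerun that step. 12345 is a placeholder.
DELETE FROM smf_messages WHERE id_msg >= 12345;
```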
Somehow the script in this topic expects camelCase field names, while the 2.0.11 schema I have uses underscore-separated field names.
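The mismatch is mechanical, so if anyone wants to patch the script's field references in bulk, the mapping is the usual camelCase-to-snake_case one. A quick sketch (Python rather than PHP, and assuming nothing fancier than the case style differs between the two schemas):

```python
import re

def camel_to_snake(name: str) -> str:
    """Translate a camelCase field name into the underscore_separated
    form, e.g. 'posterTime' -> 'poster_time'."""
    return re.sub(r'(?<=[a-z0-9])([A-Z])', r'_\1', name).lower()
```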
E2: Now I'm getting this error partway through the conversion when I enable debugging support, which would seem to indicate that the PHP session is expiring. I have no idea why it would, since I set the session expiration time to 24 hours for the duration of the conversion:
Notice: Undefined index: convert_script in /var/www/xx/convert.php on line 66
Notice: Function set_magic_quotes_runtime() is deprecated in /var/www/xx/convert.php on line 70
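For reference, the settings I raised to 24 hours were these (in php.ini, or via ini_set() before session_start(); raising gc_maxlifetime alone isn't enough if the session cookie itself expires first):

```ini
; Raised for the duration of the conversion. gc_maxlifetime controls
; server-side session garbage collection; cookie_lifetime controls
; when the browser drops the session cookie.
session.gc_maxlifetime = 86400
session.cookie_lifetime = 86400
```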
E3: I fixed the session problem by manually transplanting the session variables from the previous session into the "new" one that was spontaneously created when I added the debug variable to the GET parameters.
Now it's on to posts, and actually appears to be converting post bodies this time around.
E4: Yes, that duplicate-index error again. Perhaps it is a VERY BAD IDEA to cram multiple conversions into a single step of the convert page? Or at least the step counter should advance on each page load, so you're always starting a fresh step. That way a refresh doesn't produce errors like this one, where the target data already exists but the step thinks it still needs to finish the job.