How Puppy Linux Saved the Murga Forum
Once again a Puppy Linux stepped in to do some critical work restoring the murga forum (oldforum).
The sudden loss of the sites and the forced move to a new host meant recovering the databases intact and usable, which is the most important part of the forum systems. The murga forum's database is large and unwieldy, and since it is a conversion from a much older version of the software, it can be slightly unstable at times.
Usually it's not that drastic a procedure to create a new MySQL database and import a SQL dump file to populate it with all of the correct tables and the data in those tables. But doing this remotely, on a host provider's shared systems, is another story when the SQL files are several gigabytes in size and the dump may not have exactly the right formatting. The biggest hurdle is the dreaded "MySQL server has gone away" error, or the variety of service-unavailable pages, and a stopped/crashed import.
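For anyone hitting the same wall: that "gone away" error usually comes down to the server's packet size and timeout limits, which you can at least inspect from the mysql console, even though shared hosting rarely lets you raise them:
Code: Select all
-- the limits that most often kill a big import on shared hosting
SHOW VARIABLES LIKE 'max_allowed_packet';
SHOW VARIABLES LIKE 'wait_timeout';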
The problem is that iFastnet shut down our forums and did the database dumps without first clearing the caches and deleting the huge search indexes, really inflating the SQL dump sizes.
So a Puppy Linux Bionic64 comes into play.
I have one running Zoneminder, merrily chunking away streaming cameras, doing motion detection and other "stuff", on a decent Hiawatha, MariaDB, PHP 7.2 setup. I also do some phpBB forum work with this web server, since Bionic64 is loaded with tools and utilities that come into play when working on these types of projects, and they proved very useful in a situation like this one, in a big way.
So I take the SQL dump that iFastnet made and go to load it into the new host server, but there are too many problems, failure after failure. And these operations take lots of time when large amounts of data need to be transferred around; @Clarity, SAMBA is suddenly very important again. Some uploads take hours.
Using the command-line mysql console, I imported the murga database into the Bionic64 MySQL server. It takes about 2 hours all said and done. Loading something like this onto a remote machine at a host provider is practically impossible, or at best very tricky to accomplish. There are some open source solutions like bigdump.php, a little program that loads the data into a database in chunks with a small pause in between, made for exactly this purpose. Also on hand was a freeware SQL splitter, which splits very large SQL files into smaller ones; it worked well, but the pieces did not import cleanly. The reason is the format used when the SQL dump is formed. To make it work, what is needed is a conversion: loading the entire database, working on it, then exporting another SQL file with the right format.
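For the record, the local import itself is nothing exotic, roughly along these lines (the database and file names here are just placeholders for the real ones):
Code: Select all
# create an empty database, then feed it the whole dump in one go
mysql -u mysql -p -e "CREATE DATABASE murga_forum"
mysql -u mysql -p murga_forum < murga_forum_dump.sql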
The point of loading it into the Bionic64 server was to reduce the size of the database by truncating the search index tables manually with phpMyAdmin, and to break the long monolithic lines into individual commands delimited by ";" semi-colons. To do this I had to load the entire SQL file, search through the database tables, and truncate the caches and search index info (roughly the statements sketched just after the config line below). Then perform a mysqldump from the command line, because phpMyAdmin cannot do these large exports either. BUT this step is the key..... in the config file my.cnf
the most important line must be added:
Code: Select all
extended-insert=FALSE
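For reference, the truncation mentioned above targets the usual phpBB search tables, with statements along these lines run from phpMyAdmin's SQL tab or the mysql console (the phpbb_ prefix is the stock one; the real forum's prefix may differ):
Code: Select all
-- empty the search index and cached search results; phpBB can rebuild them later
TRUNCATE TABLE phpbb_search_wordmatch;
TRUNCATE TABLE phpbb_search_wordlist;
TRUNCATE TABLE phpbb_search_results;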
Then proceed with the export:
Code: Select all
mysqldump -u mysql -p large_database > large_database.sql
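Why that one config line is the key: with extended-insert=FALSE every row in the dump gets its own INSERT statement terminated by a semicolon, instead of one enormous multi-row INSERT per table, so a chunked importer like bigdump.php can stop and resume cleanly between statements. The dump ends up looking roughly like this (table and values purely illustrative):
Code: Select all
INSERT INTO `phpbb_posts` VALUES (1,'first post');
INSERT INTO `phpbb_posts` VALUES (2,'second post');
INSERT INTO `phpbb_posts` VALUES (3,'third post');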
Once this is done, the finished product is transferred by upload from Bionic64 to the web site server root, and bigdump.php is used to do a staggered import.
Long story short...it worked.
Also, Bionic64 easily handled the 30+ crashes I caused trying out different methods to fix all of this as the frustration levels grew, and bounced right back. This Bionic64 running MariaDB (installed with pkg-cli and the PPM) worked very well under the tremendous loads.
And somebody wanted to mess with this Kennel?