Or, There is a Use for GUI Tools!
Recently I was working on a problem for a customer. They are converting a server with two InnoDB data files to innodb_file_per_table:

$ ls -l ibdata*
-rwxr-xr-x 1 mysql mysql 2.0G Oct 14 13:10 ibdata1
-rwxr-xr-x 1 mysql mysql 350G Oct 14 13:10 ibdata2
I honestly don’t recall ever seeing (or hearing about) so large a data file.
The method chosen for conversion boils down to this:
- stop and start the server to enable
- alter all tables to MyISAM
- stop the server and delete the ibdata files
- restart the server
- convert the tables back to InnoDB
- add the foreign keys back
This post isn’t about how we did it, or about whether or not it was the best way, or anything like that. This post is really about the last two steps.
There were over 500 InnoDB tables and something like 375 foreign keys, so I needed to somehow create the ALTER TABLE statements for these 500 tables. To drop the foreign keys, I used the following SELECT statement:

SELECT CONCAT("mysql -u root -pPASS ", table_schema,
              " -e 'ALTER TABLE ", table_name,
              " DROP FOREIGN KEY ", constraint_name, ";' &") AS list_of_fks
  FROM information_schema.KEY_COLUMN_USAGE
 WHERE REFERENCED_TABLE_NAME IS NOT NULL
 ORDER BY TABLE_NAME, COLUMN_NAME;
This created a list of statements that I could then put in a batch file, edit a little, and have them run in parallel.
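The pattern here is simple: collect the generated one-liners into a script and let the trailing `&` on each line push the work into the background. A minimal sketch, with dummy echo commands standing in for the real mysql invocations:

```shell
# Sketch of the batch pattern; 'echo' stands in for the generated
# "mysql ... -e 'ALTER TABLE ... DROP FOREIGN KEY ...;'" commands.
cat > batch.sh <<'EOF'
echo "dropped fk_orders_customer" &
echo "dropped fk_items_order" &
wait   # block until every background job has finished
EOF
sh batch.sh > results.txt
cat results.txt
```

The `&` is what makes the statements run in parallel; the final `wait` keeps the script from exiting before the last background job finishes.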
I used the following SELECT statement to find all InnoDB tables and prepare a command to convert each one to MyISAM:

SELECT CONCAT("mysql -u root -pPASS ", TABLE_SCHEMA,
              " -e 'ALTER TABLE ", TABLE_NAME, " ENGINE=MYISAM;' &")
  FROM INFORMATION_SCHEMA.TABLES
 WHERE ENGINE='INNODB';
Again, the output can be put in a batch file, modified a little, and run in parallel.
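Each row of the result set is already a complete shell command. With a hypothetical schema mydb and table customers, a generated line would look like:

```
mysql -u root -pPASS mydb -e 'ALTER TABLE customers ENGINE=MYISAM;' &
```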
Once everything was converted to MyISAM, I shut down the database, removed the InnoDB log and data files, and modified the my.cnf so I had innodb_file_per_table and a much smaller InnoDB data file. That was the fairly straightforward part. Now the fun began: I needed to convert all the tables back to InnoDB and add the FKs back.
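The my.cnf change amounts to something like the following fragment (the option names are standard MySQL ones; the 10M starting size is an assumption, not the value actually used):

```ini
[mysqld]
innodb_file_per_table
innodb_data_file_path = ibdata1:10M:autoextend
```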
Now ordinarily, I’m a command-line kind of guy. However, having used
diff before for comparisons, I decided to go another route. Based on some previous experience, I chose SQLyog for the job. Before I began any conversion of the tables, I dumped the schema of the database and loaded it into a MySQL server running on my laptop.
Once all the conversions were done, I used a tool in SQLyog called the “Schema Synchronization Tool”. Essentially, you point the tool at a “source” database and a “target” database. (It is very important to keep track of which is which.) While choosing the source and target, you also choose the database that you want to compare. Once this is done, you simply click Compare and it checks both databases and then returns a nice list of SQL statements.
Since we had 500 tables to convert, I didn’t just copy the SQL into a file and import it. I created more files that contained lines similar to:

mysql -u root -pXXX -e "ALTER TABLE .... " &
This way, they ran in the background. At one point I had over 200 alter statements running at the same time.
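Backgrounding everything with `&` launches all the jobs at once. If 200 concurrent ALTERs is more than a server can comfortably take, one variation (an assumption on my part, not what we did) is to cap the concurrency with xargs -P. Dummy echo again stands in for the real mysql calls:

```shell
# Feed the work items to xargs and run at most 4 at a time,
# instead of backgrounding all of them simultaneously.
printf '%s\n' fk_a fk_b fk_c fk_d fk_e |
  xargs -P 4 -I{} echo "restored {}" > restored.txt
cat restored.txt
```

The -P flag is a GNU/BSD xargs extension, but it is available on any Linux box you are likely to run MySQL on.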
My one complaint about SQLyog is that the SQL it generated was sometimes inefficient. It would give me one SQL statement to
ALTER TABLE engine=Innodb and then three separate statements for three different FKs. Each of those statements has to copy the table, so this would take four times as long as combining them into one. It is perfectly feasible to do this:
ALTER TABLE X ADD FK X1, ADD FK X2, ADD FK X3, ENGINE=InnoDB;
Big time savings!
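Written out in full MySQL syntax, the shorthand above looks something like this (the table, column, and constraint names are made up for illustration):

```sql
-- One table rebuild instead of four: the engine change and all
-- the foreign keys go through in a single ALTER TABLE.
ALTER TABLE child
  ENGINE=InnoDB,
  ADD CONSTRAINT fk_child_parent FOREIGN KEY (parent_id) REFERENCES parent (id),
  ADD CONSTRAINT fk_child_owner  FOREIGN KEY (owner_id)  REFERENCES owner (id);
```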
Unfortunately, I did run across what amounts to a bug. Not in SQLyog, but in MySQL. You can run this:
ALTER TABLE X ADD FK X1, ADD FK X2, ADD FK X3;
and MySQL will happily accept the statement and run it. Hours later, when it finishes, you find out that the table is still MyISAM and the FKs were just turned into indexes. Stupid!
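Because the failure is silent, it is worth verifying afterwards. A query along these lines (the schema name mydb is a placeholder) flags any table the conversion left behind:

```sql
-- Any row returned here is a table that did not make it back to InnoDB.
SELECT TABLE_NAME, ENGINE
  FROM INFORMATION_SCHEMA.TABLES
 WHERE TABLE_SCHEMA = 'mydb'
   AND ENGINE <> 'InnoDB';
```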
Sheeri pointed out that the MySQL Workbench program does something very similar. I haven’t had a chance to look into it yet, but having worked with MySQL Workbench before, I am sure it does a schema sync very well. Either way, trust me on this: if you have a significant schema synchronization project, you need to look into one of these tools.