During the last few weeks I had to work with relatively big MySQL dumps. I had to find interesting rows in about 400 files, each of which took 40 minutes to import. To speed things up, I found a simple tool that let me extract only the tables I was interested in.
The tool is actually a single Perl script named
extract_sql.pl (available on GitHub).
It extracts a table with a simple command:
$ extract_sql.pl -t TABLE_NAME -r DUMP_FILE.sql
This command prints the dump to standard output, so you may want to redirect it to a file:
$ extract_sql.pl -t TABLE_NAME -r DUMP_FILE.sql > table_name.sql
extract_sql.pl can also read its input from
stdin, so it is easy to
extract and import a single table straight from a compressed dump file:
$ zcat DUMP_FILE.sql | extract_sql.pl -t TABLE_NAME \
| mysql dest_database -u username -p
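The trick behind a tool like this is that mysqldump output is easy to split: each table's section starts with a comment header such as `-- Table structure for table \`name\`` (and `-- Dumping data for table \`name\``). A minimal Python sketch of that idea — not the actual Perl implementation, just an illustration of how such a filter can work:

```python
import re

# Matches mysqldump's per-table comment headers and captures the table name.
HEADER = re.compile(
    r"^-- (?:Table structure for table|Dumping data for table) `(.+)`"
)

def extract_table(dump_lines, table):
    """Yield only the dump lines belonging to one table.

    Copy lines starting at the target table's header; stop copying
    when a header for a different table appears.
    """
    copying = False
    for line in dump_lines:
        m = HEADER.match(line)
        if m:
            copying = (m.group(1) == table)
        if copying:
            yield line

# A tiny dump with two tables, to demonstrate the filter.
dump = """\
-- Table structure for table `users`
DROP TABLE IF EXISTS `users`;
CREATE TABLE `users` (id INT);
-- Dumping data for table `users`
INSERT INTO `users` VALUES (1);
-- Table structure for table `orders`
DROP TABLE IF EXISTS `orders`;
CREATE TABLE `orders` (id INT);
"""

print("\n".join(extract_table(dump.splitlines(), "users")))
```

Because it scans line by line, the same approach streams through multi-gigabyte dumps without loading them into memory — which is exactly why piping from zcat works.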