Importing Large Amounts of Data

If you've used mysqldump to back up databases, importing a very large dump can use a lot of RAM and get the daemon killed.
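
mysqldump writes "-- Table structure for table ..." and "-- Dumping data for table ..." comment headers into the dump, which is what the csplit patterns below match. A minimal sketch of producing such a dump (the user and the database name mydb are placeholders):

mysqldump -u root -p mydb > backup.sql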

Try splitting the dump into per-table sections:

csplit backup.sql -f backup -b '%03d.sql' '/^-- Table structure for table /' '{*}'
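
To see which of the split files are still large, list them biggest first:

ls -lhS backup[0-9]*.sql | head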

If some of the per-table chunks are still sizable, try importing the structure but not the data:

csplit backup025.sql -f backup025_ -b '%03d.sql' '/^-- Dumping data for table /' '{*}'

csplit puts everything before the first match into the first output file, so the structure lands in backup025_000.sql and the data in backup025_001.sql. If the data file is still too big, split it so it can be imported one INSERT at a time:

csplit backup025_001.sql -f backup025_001_ -b '%03d.sql' '/^INSERT INTO /' '{*}'


Copy the files into the Docker container (with CONTAINER set to its name or ID) like this:

for i in backup*.sql; do
  docker cp "${i}" "${CONTAINER}":/var/lib/mysql-files/
done
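
The \. commands below are run from the mysql client inside the container; one way to get that prompt (the root user and the database name mydb are placeholders):

docker exec -it "${CONTAINER}" mysql -u root -p mydb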

Then import the files into the database one at a time from that prompt:

\. /var/lib/mysql-files/backup000.sql

Use FLUSH TABLES; periodically to close open tables and get buffered changes to disk.
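
For example, after sourcing a few chunks, flush before continuing:

\. /var/lib/mysql-files/backup001.sql
\. /var/lib/mysql-files/backup002.sql
FLUSH TABLES;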

Restarting the daemon can also help reduce memory usage.
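
With the container setup above, restarting the container restarts mysqld along with it (assuming mysqld is the container's main process, as in the official image):

docker restart "${CONTAINER}"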