Five MySQL tricks to save your time while migrating huge data

Hey there! When it comes to archiving data or migrating it out of MySQL to, say, CSV or any other format, things get really difficult once the amount of data is huge. Follow these tips if you are working with a large dataset. So let's see five MySQL tricks to save your time while migrating huge data.



Indices are useful, but too much indexing can cause a lot of problems.

Let's take a scenario where you have to delete a lot of data from a table that has an index on almost every field. Your searches will be fast because of the indices, but deleting the rows will take much longer.

That is because every deleted row must also be removed from each index, so every index adds extra work to every delete.

The same is the case with insertion: if you have many indices, inserts will be slow, because every index has to be updated for each new row.
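One common way to apply this is to drop a secondary index before a large delete and recreate it once afterwards, so the index is rebuilt in one pass instead of being maintained row by row. Here is a minimal sketch of that pattern using SQLite from the Python standard library as a self-contained stand-in for MySQL (the table and index names are made up for illustration; MySQL syntax differs slightly, e.g. `ALTER TABLE ... DROP INDEX`):

```python
import sqlite3

# In-memory SQLite database as a stand-in for a MySQL table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE logs (id INTEGER PRIMARY KEY, level TEXT, msg TEXT)")
cur.execute("CREATE INDEX idx_level ON logs (level)")
cur.executemany(
    "INSERT INTO logs (level, msg) VALUES (?, ?)",
    [("debug", "line %d" % i) for i in range(10_000)],
)
conn.commit()

# Drop the secondary index before the large delete so it is not
# maintained for every deleted row, then rebuild it once at the end.
cur.execute("DROP INDEX idx_level")
cur.execute("DELETE FROM logs WHERE level = 'debug'")
cur.execute("CREATE INDEX idx_level ON logs (level)")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM logs").fetchone()[0])  # prints 0
```

The trade-off: while the index is dropped, any query that relied on it falls back to a full scan, so this works best inside a maintenance window.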


Use bulk insertion and bulk deletion instead of doing it row by row.

Yes, bulk insertion and bulk deletion are far better, as they save a lot of time on index maintenance and round trips. Always use them and you will notice the difference very easily. Something like below:

INSERT INTO `company` (`companyID`, `CompanyName`, `Phone`) VALUES
(1, 'Speedy Express', '(503) 555-9831'),
(2, 'United Package', '(503) 555-3199'),
(3, 'Federal Shipping', '(503) 555-9931');
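From application code, the same idea is to hand the driver all rows at once instead of issuing one INSERT per row. A minimal sketch, using SQLite from the Python standard library as a stand-in for MySQL (with a MySQL driver such as mysql-connector the `executemany` call looks the same):

```python
import sqlite3

rows = [
    (1, "Speedy Express", "(503) 555-9831"),
    (2, "United Package", "(503) 555-3199"),
    (3, "Federal Shipping", "(503) 555-9931"),
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE company (companyID INTEGER PRIMARY KEY, CompanyName TEXT, Phone TEXT)"
)

# Bulk insertion: all rows go in one call, instead of a separate
# statement (and separate index update) for every single row.
cur.executemany(
    "INSERT INTO company (companyID, CompanyName, Phone) VALUES (?, ?, ?)", rows
)
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM company").fetchone()[0])  # prints 3
```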

N+1 query problem

dogs = all_dogs();
foreach (dogs as dog) {
  dog_house = load_house_for_dog(dog);
}

Let's say all_dogs() has an implementation like below:

SELECT * FROM dog WHERE constraint

and load_house_for_dog() has an implementation like below:

SELECT * FROM house WHERE dogID = id

Thus you will be issuing 1 query (for all_dogs()) plus n queries (one load_house_for_dog() call per dog), which is the n+1 query problem.

We can use MySQL's IN clause to solve this problem, something like below.



SELECT * FROM house WHERE dogID IN (1, 2, 3, 4, 5, ...)

Thus: one query for getting all the dogs and one more for getting the houses of these dogs. You save (n-1) queries. Congrats!
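The batched pattern above can be sketched end to end like this. SQLite from the Python standard library stands in for MySQL, and the dog/house tables and their contents are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dog (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE house (id INTEGER PRIMARY KEY, dogID INTEGER)")
cur.executemany("INSERT INTO dog VALUES (?, ?)", [(1, "Rex"), (2, "Fido"), (3, "Spot")])
cur.executemany("INSERT INTO house VALUES (?, ?)", [(10, 1), (20, 2), (30, 3)])

# Query 1: fetch all dogs at once.
dogs = cur.execute("SELECT id, name FROM dog").fetchall()
ids = [d[0] for d in dogs]

# Query 2: fetch every dog's house in a single IN query,
# instead of one load_house_for_dog() query per dog.
placeholders = ", ".join("?" for _ in ids)  # "?, ?, ?"
houses = cur.execute(
    "SELECT id, dogID FROM house WHERE dogID IN (%s)" % placeholders, ids
).fetchall()

print(len(houses))  # prints 3 -- all houses fetched with just 2 queries total
```

Note that the placeholders are generated and the ids are passed as parameters, so the IN list stays safe from SQL injection.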


While altering a table, use the INPLACE algorithm

By using the INPLACE algorithm you tell MySQL not to copy the table while altering it, saving yourself time and leaving room for other queries to execute.

ALTER TABLE tbl_name DROP FOREIGN KEY fk_symbol, ALGORITHM=INPLACE;


Try indexing the fields which are searched most, i.e. the ones present in WHERE clauses

Yes, this will save time: once a field is indexed, a search becomes a B-tree lookup, which works in O(log n) time, instead of a linear scan over all rows. This increases the performance of your MySQL queries.

SELECT * FROM table1 WHERE id = 123;

Here id should be indexed to make this query fast if the number of rows is huge.
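You can check that the index is actually used by asking the database for its query plan. A small sketch using SQLite from the Python standard library, whose EXPLAIN QUERY PLAN plays the role of MySQL's EXPLAIN here (table1, id and idx_id follow the example above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE table1 (id INTEGER, payload TEXT)")
cur.execute("CREATE INDEX idx_id ON table1 (id)")

# Ask the planner how it would execute the WHERE id = 123 lookup.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM table1 WHERE id = 123"
).fetchall()

# The plan's detail text mentions the index instead of a full table scan,
# e.g. "SEARCH table1 USING INDEX idx_id (id=?)".
print(plan[0][-1])
```

If you drop the index and run the same EXPLAIN, the plan falls back to scanning the whole table, which is exactly the linear search you want to avoid.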

This is the end of the article on five MySQL tricks to save your time while migrating huge data. Liked the article? Please share and subscribe.

Gaurav Yadav

Gaurav is a cloud infrastructure engineer and a full-stack web developer and blogger. A sportsperson at heart who loves football. Scale is something he loves to work on and he is always keen to learn new tech. Experienced with CI/CD, distributed cloud infrastructure, build systems and a lot of SRE stuff.

