Good day! Today is another post about MySQL. I would like to talk about the problem of loading data in large volumes. When a script puts the data into the database, all is well; when you do it "by hand", it is simply annoying. Today we will talk about speeding this process up. First we need to figure out what CSV is. CSV is a file format in which the values are separated by commas. "But how can CSV help us?" you ask.
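To make the definition concrete, here is a tiny sketch of reading such comma-separated data with Python's standard csv module (the sample rows are just placeholders):

```python
import csv
import io

# Three fields per line, separated by commas: that is all CSV is.
data = "1,Igor,Doe\n2,Max,Vasin\n"

# csv.reader turns each line into a list of field values.
reader = csv.reader(io.StringIO(data))
rows = list(reader)
print(rows)
```

Each line of the input becomes one record, split on the commas.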
You have probably exported a MySQL database via phpMyAdmin at least once in your life, so you have seen the formats it can export to. Normally we use SQL, but there is also an export option called "CSV for Excel". If you look closely, you can spot the pattern: if we can export data to CSV, it is logical that we can import it too. We just need to follow certain rules:

1. The file must be no larger than 2,048 KB (the default upload limit).
2. The number of values in each row must match the structure of the table.

An example. We have a table with the attributes id, name, surname. The CSV file will look like this:

'', 'Igor', 'Doe'
'', 'Max', 'Vasin'
'', 'Holy', 'Sidorov'

Usually id has the auto_increment property, and for that reason we leave that field empty.

That is it for the mechanics. Now let us consider how much this method can speed up the process of adding data. It all depends on the form in which the information comes to you. Let me give an example from practical experience. I was filling a database with information about Formula 1 from Wikipedia. The information was already laid out in a table, so putting a comma after each cell was not difficult :) In that case I filled the whole database in less than an hour. As you can see, it all depends on what form you get the information in. And what methods do you know?

Sincerely, cava!
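P.S. For those who prefer scripts: here is a minimal Python sketch that generates such a file for the id, name, surname table from the example above (the file name people.csv and the rows are just placeholders). The result can then be imported through phpMyAdmin's CSV import, where the field separator and enclosure characters are configurable:

```python
import csv

# Sample data matching the (id, name, surname) table structure.
rows = [
    ("Igor", "Doe"),
    ("Max", "Vasin"),
    ("Holy", "Sidorov"),
]

with open("people.csv", "w", newline="") as f:
    # Enclose every field in single quotes, as in the example above.
    writer = csv.writer(f, quotechar="'", quoting=csv.QUOTE_ALL)
    for name, surname in rows:
        # id is auto_increment, so the first field stays empty.
        writer.writerow(["", name, surname])
```

When importing, tell phpMyAdmin that the columns are separated with a comma and enclosed with a single quote, and the rows will land in the table with fresh auto_increment ids.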