I recently had to import a large CSV file into MySQL. Here's how I did it. Before you start, make sure you have an SSH account or other terminal access on the machine running the MySQL server.
1. Create the table you will store the items in.
Take a look at the top few lines of your CSV file (the head command makes this really easy). You might see something like this:
    Some Large Database ID,Name, Phone, Address
    1,James,505-234-2123,3456 Super St.
    2,Mark,254-342-6732,7351 Roger Rd.
    ...
Notice that the CSV file is set up with four columns. The first step is to create a table in a MySQL database with those four columns as fields. I'm not going to show the specifics, but if you don't know how to create a table in MySQL, a quick search will turn up plenty of guides. Remember, if your database is big, be sure to put the index/primary/unique keys on before you upload the data.
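For reference, a table matching the sample data above might look something like this. The table name, column names, types, and lengths are all assumptions here; adapt them to your own data:

```sql
-- Hypothetical table matching the four CSV columns above.
-- Adjust names, types, and lengths to fit your actual data.
CREATE TABLE TableName (
  ID      INT NOT NULL PRIMARY KEY,  -- add keys now, before the big import
  Name    VARCHAR(100),
  Phone   VARCHAR(20),
  Address VARCHAR(255)
);
```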
2. Use the mysqlimport utility
Now comes the magic. Here's the command to import your file:
    mysqlimport --ignore-lines=1 --fields-terminated-by=, --columns='ID,Name,Phone,Address' --local -u root -p Database /path/to/csvfile/TableName.csv
--ignore-lines=1 skips the first (header) line of the file. --fields-terminated-by tells the utility that commas separate the columns. --columns maps the order of the data in the CSV file onto the columns of the table. --local is very important: it tells mysqlimport to read the file from the client machine you're running the command on, rather than from the server's filesystem, so importing a local CSV file won't work without it. Database is the name of the database in which your table is stored. You must give the absolute path of the CSV file for the utility to find it. Finally, the file name "TableName.csv" has to match the name of the table in your MySQL database, because mysqlimport derives the table name from the file name.
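Under the hood, mysqlimport is essentially a command-line wrapper around MySQL's LOAD DATA statement, so if you'd rather work from the mysql prompt, roughly the same import can be expressed directly. The path, table name, and column names below mirror the hypothetical ones used in the command above:

```sql
-- Rough equivalent of the mysqlimport command above, run from the mysql client.
-- LOCAL reads the file from the client machine, as --local does.
LOAD DATA LOCAL INFILE '/path/to/csvfile/TableName.csv'
INTO TABLE TableName
FIELDS TERMINATED BY ','
IGNORE 1 LINES
(ID, Name, Phone, Address);
```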
For more information on the mysqlimport utility, take a look at its reference page.
Hopefully this makes someone’s life easier. Leave a comment below if it does.