
How to import CSV file into MySQL table?

I have a non-normalized event-log CSV from a client that I'm trying to load into a MySQL table so that it can be refactored into a sane format. I created a table called "CSVImport", with one field for every column of the CSV file. The CSV contains 99 columns, so this was a daunting enough task in itself:

CREATE TABLE `CSVImport` (id INT);
ALTER TABLE CSVImport ADD COLUMN Title VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN Company VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN NumTickets VARCHAR(256);
...
ALTER TABLE CSVImport ADD COLUMN Date49 VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN Date50 VARCHAR(256);

There are no constraints on the table, and all fields hold VARCHAR(256) values, except the counts (INT), the yes/no flags (BIT), the prices (DECIMAL), and the text descriptions (TEXT).
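
For illustration, the same kind of table could also be declared in a single CREATE statement; this is just an abridged sketch, and the column names are examples only, not the client's actual 99 columns:

-- Abridged illustration only; the real table has 99 mostly-VARCHAR(256) columns
CREATE TABLE CSVImport (
    id          INT,
    Title       VARCHAR(256),
    Company     VARCHAR(256),
    TicketCount INT,            -- a count column
    IsPaid      BIT(1),         -- a yes/no column
    Price       DECIMAL(10,2),  -- a price column
    Description TEXT            -- free text; may contain embedded newlines
);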

I tried loading the data into the table:

LOAD DATA INFILE '/home/paul/clientdata.csv' INTO TABLE CSVImport;
Query OK, 2023 rows affected, 65535 warnings (0.08 sec)
Records: 2023  Deleted: 0  Skipped: 0  Warnings: 198256
SELECT * FROM CSVImport;
| NULL             | NULL        | NULL           | NULL | NULL               | 
...

The entire table is filled with NULL.

I think the problem is that the free-text fields contain more than one line, and MySQL is parsing the file as if every newline starts a new database row. I can load the file into OpenOffice Calc without any problem.

The clientdata.csv file contains 2593 lines but only 570 records. The first line contains the column names. It appears to be comma delimited, and the text fields are enclosed in double quotes.
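
Assuming shell access on the server, a couple of quick checks on the raw file (same path as in the LOAD DATA above) confirm the line count, the header, the delimiter, and the quoting:

wc -l /home/paul/clientdata.csv        # counts physical lines, not CSV records
head -n 1 /home/paul/clientdata.csv    # header line: column names and delimiter
sed -n '2p' /home/paul/clientdata.csv  # first data row: shows the double-quoting of text fields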

Update:

If you get stuck on this, read the manual: http://dev.mysql.com/doc/refman/5.0/en/load-data.html

I added to the LOAD DATA statement some format details that OpenOffice was smart enough to infer on its own, and now it loads the correct number of records:

LOAD DATA INFILE "/home/paul/clientdata.csv"
INTO TABLE CSVImport
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

But there are still a lot of completely NULL records, and none of the loaded data seems to be in the right place.
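My next step (sketched below, not verified yet) is to look at the warnings MySQL raised during the load, and to try Windows-style line endings plus an explicit column list in case fields are landing in the wrong columns; the column names in the list are just examples standing in for the real 99-column order:

-- Run immediately after the LOAD DATA, in the same session, to see what MySQL complained about
SHOW WARNINGS LIMIT 20;

-- If the file was exported on Windows, lines may end in \r\n rather than \n;
-- the explicit column list maps CSV fields to table columns in order
LOAD DATA INFILE '/home/paul/clientdata.csv'
INTO TABLE CSVImport
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(Title, Company, NumTickets /* ..., remaining columns in CSV order */);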

P粉785957729 · 446 days ago

All replies (2)

  • P粉057869348 · 2023-10-10 10:07:31

    Use mysqlimport to load the table into the database:

    mysqlimport --ignore-lines=1 \
                --fields-terminated-by=, \
                --local -u root \
                -p Database \
                 TableName.csv
    

    I found it at http://chriseiffel.com/everything-linux/how-to-import-a-large-csv-file-to-mysql/

    To make the delimiter a tab, use --fields-terminated-by='\t'
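
    For the CSVImport table in the question, the call would look roughly like this (just a sketch: mysqlimport derives the target table name from the file name, so the file is copied to match, the quoting/line-ending options mirror the LOAD DATA clauses above, and the database name is a placeholder):

    # mysqlimport loads into the table named after the file, so name the file after the table
    cp /home/paul/clientdata.csv /tmp/CSVImport.csv
    mysqlimport --ignore-lines=1 \
                --fields-terminated-by=, \
                --fields-optionally-enclosed-by='"' \
                --lines-terminated-by='\n' \
                --local -u root -p \
                your_database /tmp/CSVImport.csv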

  • P粉210405394 · 2023-10-10 00:32:34

    The core of the problem seems to be matching columns in the CSV file to columns in the table.

    Many graphical MySQL clients have very nice import dialogs for this sort of thing.

    My favorite tool for the job is the Windows-based HeidiSQL. It gives you a graphical interface for building the LOAD DATA command; you can reuse it programmatically later.

    Screenshot: "Import Text File" dialog box

    To open the Import Text File dialog box, go to Tools > Import CSV file:
