hi,

I want to import a tab-delimited text file into MySQL. I have tried:
1) phpMyAdmin => exceeds the time limit, even after I extended it
2) opening the file in Excel and saving it as .csv => still couldn't save all the records
3) MS Access with an ODBC connection => almost got all the data, but not in sequence
4) the command line => the records don't seem to follow the original sequence, which should run from the top of the text file to the bottom

Now I'm stuck on how to get the records imported in the correct positions. Can anyone please help?


You can import CSV files with the mysql command-line client using the LOAD DATA statement (http://dev.mysql.com/doc/refman/5.1/en/load-data.html). Put the CSV file on the server itself so that you won't hit network timeouts.
What does it mean that the entries are not "in sequence"? How do you test the sequence? Please show the CREATE TABLE statement for the table into which you want to import.

I will try LOAD DATA, thanks for the suggestion.
"Not in sequence" means this:

the data in the text file is, for example,
row 1
row 2
row 3
.
.
.

but after importing, the database shows
row 2
row 1
row 3
.
.
.

The order of rows in the actual database is totally irrelevant (and that is part of relational database theory - there is NO sequence). You determine the order rows are displayed when you query the database, IF the display order matters.

How big is your huge text file?

Plus or minus 10 MB - 15 MB per table.
That's the problem: I'm supposed to run the report using the sequence of the data as given in the text file, and I couldn't get that sequence right using DESC or ASC.

Eliminate the ORDER BY clause from your SELECT statement.
Or add an auto_increment field to your table and ORDER BY this field.
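If the file itself must define the order, another option is to prepend a sequence number to every line before loading, and then ORDER BY that column. A minimal Python sketch of that preprocessing step (the tab-delimited layout matches this thread; the function name is my own):

```python
def add_sequence_column(lines):
    """Prepend a 1-based sequence number as the first tab-delimited
    field of every line, so the table can be ordered by file position."""
    return ["%d\t%s" % (i, line) for i, line in enumerate(lines, start=1)]

rows = ["alice\t100", "bob\t200", "carol\t300"]
numbered = add_sequence_column(rows)
print(numbered[0])  # 1	alice	100
```

You would then add a leading integer column to the table definition and `ORDER BY` it in the report query.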

I'm using
load data local infile 'c:\data1.txt'
into table data1
fields terminated by '\t'
lines terminated by '\r\n'

The result I get is

Query OK, 8448 rows affected, 17148 warnings (0.10 sec)
Records: 8572 Deleted: 0 Skipped: 124 Warnings: 17148

How do I find out which records were skipped?

The number of warnings strongly suggests that your text file is loaded with errors, which makes it skip rows until it gets back in sync. Usually it's one attribute more or fewer per line than in the table you're inserting into, or a date in the wrong format, or an extra comma in a text field.
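One quick way to spot such rows before loading is to count the tab-separated fields on every line and flag any line whose count doesn't match the table. A small Python sketch (the expected field count of 4 in the example is a hypothetical):

```python
def find_bad_lines(lines, expected_fields):
    """Return (line_number, field_count) pairs for lines whose
    tab-separated field count doesn't match the target table."""
    bad = []
    for lineno, line in enumerate(lines, start=1):
        n = len(line.rstrip("\r\n").split("\t"))
        if n != expected_fields:
            bad.append((lineno, n))
    return bad

sample = ["a\tb\tc\td", "a\tb\tc", "a\tb\tc\td\te"]
print(find_bad_lines(sample, 4))  # [(2, 3), (3, 5)]
```

Running this over the real file before LOAD DATA tells you exactly which line numbers to inspect.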

I regularly insert from text files of 1 - 2 MB, but using software that can connect remotely from my PC, not phpMyAdmin - I use HeidiSQL and MySQL Manager Lite from EMS. I found comma-delimited files failed very often (my data sometimes included text fields with a comma in them), so I changed to tab-delimited and that has worked well.

Also try

show errors;
show warnings;

immediately after the "LOAD DATA". You might see some useful debugging info.

Hey, thanks everyone. You are right; now I can see what's wrong with the entries:
"Out of range value for column"

If the date format in the text file is 30-01-2011 10:20:11, how do I change the format during LOAD DATA? I used the script below, but it shows an error:

load data infile 'e:/data1.txt'
into table data1
(date_format(uwrstmtdt, '%d-%m-%Y %h:%m:%s'),
uwrid, uwracc, uwrjid)
fields terminated by '\t'
lines terminated by '\r\n'

LOAD DATA cannot apply functions in the column list like that.
Create a temporary table with a text field instead of a date field. Import the data into the temporary table. Then add a date field and convert your text field contents into it.
The DATE_FORMAT function does not work on text fields, only on dates. To insert a value into a date field, use the format YYYY-MM-DD HH:MM:SS.

Or do the conversion first on the input data with a sed script (or another text-conversion tool) that rewrites your date field into the MySQL date format.
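That kind of conversion can also be done in a few lines of Python instead of sed. This sketch rewrites the dd-mm-YYYY timestamp from the posts above into MySQL's YYYY-MM-DD format (the function name is my own; the input format is taken from the example date in this thread):

```python
from datetime import datetime

def to_mysql_datetime(value):
    """Convert '30-01-2011 10:20:11' into MySQL's '2011-01-30 10:20:11'."""
    parsed = datetime.strptime(value, "%d-%m-%Y %H:%M:%S")
    return parsed.strftime("%Y-%m-%d %H:%M:%S")

print(to_mysql_datetime("30-01-2011 10:20:11"))  # 2011-01-30 10:20:11
```

You would apply this to the date column of every line, write the result to a new text file, and run LOAD DATA on that file.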

No wonder my attempts to mix and match the details kept failing.
The first option looks complicated,

so I used the second one: I pumped the data into an Excel file and changed the date formatting there.
As for the sorting, I added an auto-increment number to each row,
saved it back to a text file, and then ran LOAD DATA.

Thanks, guys, for the help and the ideas.
