
[T-SQL] How to use a pipe delimiter format in the OPENDATASOURCE command

 

Hello dear T-SQL community,

I am working on an exciting project, but my problem is not exciting and I have not been able to find a valid solution... (all the related fixes are around BCP or BULK INSERT).

The server where the DB is located does not allow the BCP / BULK INSERT commands... this is why I need to use the OPENDATASOURCE command. Also, the pipe symbol is something that is imposed by our IT department and I can't influence this...

My Problem:
How can I load a set of data (here in 5 columns) when each value is separated by a pipe symbol?

DETAILS
I am loading the data into a table whose name is held in @tablename; it is created before the command and has the format below:

        Col1 nvarChar(50),
        Col2 nvarChar(10),
        Col3 nvarChar(10),
        Col4 nvarChar(10),
        Col5 nvarChar(10)

The text file I need to import looks like this:

START|RF|
Z2R|255|001|004|AB7

where:

START|RF| is the header (discarded)
Z2R|255|001|004|AB7 is the data I want to load...

Here is the code :

SET @ls_SQL = 'INSERT INTO [dbo].[' + @tablename + '] SELECT * FROM OPENDATASOURCE(''Microsoft.ACE.OLEDB.12.0'',''Data Source=D:\star\DropBox\;Extended Properties=''''Text;HDR=No;Format=^|;'''''')...[MyFileName#txt]'
SELECT @ls_SQL
EXECUTE sp_executesql @ls_SQL

SQL is not able to load the data with the pipe and interprets the whole line as belonging to a single column...

If you have any idea how I can make this work... it will be appreciated :).

Kind regards,

Nicolas

 

Try this connection string:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=D:\star\DropBox\;Extended Properties="text;HDR=No;FMT=Delimited(|)"

More info here: http://www.connectionstrings.com/textfile

 

Hello Adam,

Thanks for your reply. I have just tried it and it is not working either :(. I am still getting the same error:
[Column name or number of supplied values does not match table definition].

I have also tried with:

SET @ls_SQL = 'INSERT INTO [dbo].[' + @tablename + '] SELECT * FROM OPENDATASOURCE(''Microsoft.ACE.OLEDB.12.0'',''Data Source=D:\star\DropBox\;Extended Properties=''''Text;HDR=no;Delimiter=|'''''')...[GPSC#txt]'

And it loads all the data of a row into the first column if Col1 is nvarchar(MAX).

It looks like I will have to either work on a schema.ini or find a way to rework and extract all the data I need from the first column...
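For reference, if I go the schema.ini route: my understanding is that the ACE text driver reads a custom delimiter from a schema.ini file placed in the same folder as the text file (here D:\star\DropBox\), with a section named after the file. Something like this (untested on my side; the column names are assumed from my table definition):

```ini
[GPSC.txt]
ColNameHeader=False
Format=Delimited(|)
Col1=Col1 Text Width 50
Col2=Col2 Text Width 10
Col3=Col3 Text Width 10
Col4=Col4 Text Width 10
Col5=Col5 Text Width 10
```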

Kind regards,

Nico

 

If it loads all your data in a single record, single column, then I'm guessing that your file doesn't terminate lines correctly (CRLF) and SQL can't figure out where the line ends, so it considers it one big line.
Try setting up a file manually and give it a go. If it gets inserted, then try to fix the source of your file, or you'll be forced to get it into a field and use a sproc to cut it into the appropriate fields.
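If you do end up with everything in one column, the "cut it into fields" part could look something like this (just a sketch; the staging table dbo.Tab_Staging and its RawLine column are made-up names, and it assumes the lines contain no XML-special characters such as & or <):

```sql
-- Split 'Z2R|255|001|004|AB7' from a one-column staging table into five columns
-- by turning each line into an XML fragment <r>Z2R</r><r>255</r>... and reading
-- the pieces back positionally.
INSERT INTO dbo.Tab_GPSC (Col1, Col2, Col3, Col4, Col5)
SELECT
    x.value('(/r)[1]', 'NVARCHAR(50)'),
    x.value('(/r)[2]', 'NVARCHAR(10)'),
    x.value('(/r)[3]', 'NVARCHAR(10)'),
    x.value('(/r)[4]', 'NVARCHAR(10)'),
    x.value('(/r)[5]', 'NVARCHAR(10)')
FROM dbo.Tab_Staging
CROSS APPLY (SELECT CAST('<r>' + REPLACE(RawLine, '|', '</r><r>') + '</r>' AS XML)) AS t(x)
WHERE RawLine NOT LIKE 'START|%';  -- discard the header line
```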

 

Hello Adam,

Thank you for the feedback. This is still not working with the OPENDATASOURCE command; however, I have been able to load the data with BULK INSERT by using a format file that looks like this (posted so other developers do not stay hungry):

10.0
5
1   SQLCHAR 0   30  "|" 1   col1    ""
2   SQLCHAR 0   30  "|" 2   col2    ""
3   SQLCHAR 0   30  "|" 3   col3    ""
4   SQLCHAR 0   30  "|" 4   col4    ""
5   SQLCHAR 0   30  "|" 5   col5    ""

Details on how to build the format file can be found in the SQL Server documentation on non-XML format files.

and the import code looks like this:

BULK INSERT Tab_GPSC 
   FROM 'D:\star\DropBox\GPSC.txt' 
   WITH (FORMATFILE = 'D:\star\DropBox\FormatGPSCvalues.fmt');
GO

This works on my computer, but once I go to the integration server... it says that BULK INSERT is not allowed. I guess that I have no choice but to have the bulk insert rights granted...

There is one last chance: converting the initial file that contains '|' into a file that contains ','... I will try that solution, and if it works out I will post in the same thread.

Thanks for your assistance; it is a pity it did not work out as planned.

Nico
