Can anyone help me with importing a text file into SQL Server using VB.NET?

Here is my code:

Dim ImportedFile As String
Dim ImportedLines() As String
Dim OneLine() As String
Dim strSQL As String
Dim i As Integer

If My.Computer.FileSystem.FileExists("D:\PPOS\Sales_report\inventory_255503221452_Cake102.CSV") Then
    ' Read the whole text file
    ImportedFile = My.Computer.FileSystem.ReadAllText("D:\PPOS\Sales_report\inventory_255503221452_Cake102.CSV")
    ' Separate lines
    ImportedLines = Strings.Split(ImportedFile, Environment.NewLine)
    For i = 0 To ImportedLines.GetUpperBound(0)
        ' Separate fields
        OneLine = Strings.Split(ImportedLines(i), ",")
        If OneLine.GetUpperBound(0) = 8 Then
            ' Build SQL insert statement
            strSQL = "INSERT INTO inventory(Inventory_id,item_id,Amount,AvgCost,CreateDate,CreateUser,Real_Qty,UnitID,BranchID) VALUES (" & _
                     OneLine(0) & ", " & OneLine(1) & ", " & OneLine(2) & ", " & OneLine(3) & ", '" & _
                     objgen.ChgDate(CDate(OneLine(4).ToString.Replace("'", "")), 0) & "', " & OneLine(5) & ", " & _
                     OneLine(6) & ", " & OneLine(7) & ", " & OneLine(8) & ")"
        Else
            ' Error: Wrong number of fields
            MessageBox.Show("Wrong number of fields", "Notification", MessageBoxButtons.OK)
        End If
    Next i
Else
    MessageBox.Show("No record found.", "Notification", MessageBoxButtons.OK)
End If
End Sub

What happens is that the For loop gets executed 3 more times than the actual number of lines in the text file.
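For what it's worth, this is usually caused by newline characters at the end of the file: splitting the text on the newline delimiter keeps one empty trailing entry per newline, and the loop then iterates over those empty strings too. A minimal illustration in Python (the behaviour of `str.split` here matches `Strings.Split` in VB.NET; the sample data is made up):

```python
# A file whose contents end with 3 newline characters
text = "9,15,4440\n10,16,4100\n\n\n"

lines = text.split("\n")
print(lines)       # ['9,15,4440', '10,16,4100', '', '', '']
print(len(lines))  # 5 -> 3 more entries than the 2 real records

# Filtering out blank entries avoids the extra loop iterations
records = [ln for ln in lines if ln.strip()]
print(len(records))  # 2
```

Checking for blank lines before processing each record (or trimming the file before splitting) would make the loop count match the real data.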

here is the data contains in the file

9,15,4440,60,'29/08/2553 11:49:10','Admin',72,21,'Cake101'
1 0,16,4100,100,'29/08/2553 11:49:47','Admin',41,21,'Cake101'

Please help me.
Every comment is appreciated.

All 4 Replies

You can do it this way:

Call this method:

Private Sub InsertingIntoDB()
    Dim fileName As String = "D:\PPOS\Sales_report\inventory_255503221452_Cake102.csv"
    Using conn As New SqlConnection("connString")
        conn.Open()
        Dim query As String = "INSERT INTO inventory (Inventory_id, item_id, Amount, AvgCost, CreateDate, CreateUser, Real_Qty, UnitID, BranchID) " & "VALUES (@a, @b, @c, @d, @e, @f, @g, @h, @i)"
        Using cmd As New SqlCommand(query, conn)
            For Each line As String In ReadingFile(fileName)
                Dim data As String() = line.Split(New Char() {","c}, StringSplitOptions.RemoveEmptyEntries)
                If data.Length <> 9 Then Continue For ' wrong number of fields; skip this line
                Try
                    cmd.Parameters.Clear() ' reset the parameters for each row
                    cmd.Parameters.AddWithValue("@a", Integer.Parse(data(0)))
                    cmd.Parameters.AddWithValue("@b", Integer.Parse(data(1)))
                    cmd.Parameters.AddWithValue("@c", Integer.Parse(data(2)))
                    cmd.Parameters.AddWithValue("@d", Integer.Parse(data(3)))
                    cmd.Parameters.AddWithValue("@e", data(4).Trim("'"c)) ' the date comes quoted in the file
                    cmd.Parameters.AddWithValue("@f", data(5).Trim("'"c))
                    cmd.Parameters.AddWithValue("@g", Integer.Parse(data(6)))
                    cmd.Parameters.AddWithValue("@h", Integer.Parse(data(7)))
                    cmd.Parameters.AddWithValue("@i", data(8).Trim("'"c))
                    cmd.ExecuteNonQuery() ' run the insert for this row
                Catch ex As Exception
                    ' Skip the row that failed and carry on with the next one
                End Try
            Next
        End Using
    End Using
End Sub

Private Iterator Function ReadingFile(path As String) As IEnumerable(Of String)
    ' Iterator functions need VB 2012 or newer; on older versions use File.ReadAllLines
    Using sr As New StreamReader(path)
        Dim line As String = sr.ReadLine()
        While line IsNot Nothing
            If line.Trim().Length > 0 Then
                Yield line ' skip blank lines, so empty trailing lines are ignored
            End If
            line = sr.ReadLine()
        End While
    End Using
End Function

I don't know the exact types of your columns, so check them out.
Hope it helps.

I just found out that the problem is not with the importing procedure; the actual problem is when I'm exporting the file.

But I still don't get why it always has 3 lines more than the actual number of records I have in the database.

You may be able to use the bulk insert capability. Here are several links to more information.

I used to do this when speed was a priority: I had to import 8000+ records every five minutes, and it all had to be done as one transaction. Doing it record by record in a loop was not an option. If the text file is not in the correct format, you can always massage it into a suitable form (and validate the data) prior to bulk inserting.
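As a bare-bones sketch of what that looks like (the table name and file path are taken from the post above; the terminators are assumptions you would match to the actual export, and the quoted strings and dates in the file would need massaging first, since BULK INSERT does not strip quotes):

```sql
-- Load the whole CSV in one server-side operation.
-- FIELDTERMINATOR / ROWTERMINATOR must match the exported file exactly.
BULK INSERT inventory
FROM 'D:\PPOS\Sales_report\inventory_255503221452_Cake102.CSV'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```

Because the whole file is loaded as one statement, it also fits naturally into a single transaction.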
