Load text file data into SQL Server table

Last time we saw how to manipulate text files from SQL Server code using the Scripting object.

This time we will look at the “Read File As Table” operation from a different angle: using the BULK INSERT statement.

There are pros and cons to this approach. Let's look at them.

Advantages:

  • Getting data in table format, with values properly mapped to columns
  • BULK INSERT is a faster operation when you know what you are importing
  • IDENTITY insert and other batch optimizations
  • Transaction support
  • It is the easiest way to copy data between servers

Disadvantages:

  • it is supported for variations of CSV-formatted files only
  • the structure of the file is strict, and a small formatting error or a hidden character can break the import
  • only members of the sysadmin and bulkadmin server roles can perform the operation
  • it can get messy if not properly managed
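On the permissions point: if a login needs to run bulk imports without being a full sysadmin, it can be added to the bulkadmin fixed server role. A minimal sketch, where ImportUser is a hypothetical login name:

```sql
-- Add an existing login to the bulkadmin fixed server role
-- (ImportUser is a hypothetical login name)
EXEC sp_addsrvrolemember 'ImportUser', 'bulkadmin'
```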

That said, you should always try it and see if it works for you before creating custom import procedures or using DTS functionality.

Let the fun begin.

We will be working with the following table:

CREATE TABLE StoreList (
  StoreID INT,
  StoreName VARCHAR(32),
  City VARCHAR(32),
  State VARCHAR(2),
  Country VARCHAR(32)
)

The sample data we will be using is stored in a CSV file called c:\stores.csv:

1111,Stone Age Books,Boston,MA,USA
2222,Harley & Davidson,Washington,DC,USA
3333,Infodata Algosystems,Berkeley,CA,USA

Once the table is created, the bulk insert can be performed using the following command:

BULK INSERT dbo.StoreList FROM 'c:\stores.csv' WITH (
   FIELDTERMINATOR = ',',
   ROWTERMINATOR = '\n'
)
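The transaction support listed among the advantages means the import can be wrapped in an explicit transaction and rolled back on failure. A minimal sketch:

```sql
BEGIN TRANSACTION

BULK INSERT dbo.StoreList FROM 'c:\stores.csv' WITH (
   FIELDTERMINATOR = ',',
   ROWTERMINATOR = '\n'
)

IF @@ERROR <> 0
   ROLLBACK TRANSACTION   -- discard partially imported rows
ELSE
   COMMIT TRANSACTION
```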

Notice that we are using a comma as the separator. That might not be appropriate in some situations, and it is easily addressed; for example, tab-separated values can be imported using

BULK INSERT dbo.StoreList FROM 'c:\stores.csv'
WITH (
   ROWTERMINATOR = '\n'
)

As you can see, FIELDTERMINATOR has been removed. That is because '\t', the value that would otherwise appear here, is the default for the parameter. You can adjust the field and row terminators to suit the format of the file in each case.
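Relying on the default works, but spelling it out makes the intent clearer to anyone reading the script later:

```sql
BULK INSERT dbo.StoreList FROM 'c:\stores.csv'
WITH (
   FIELDTERMINATOR = '\t',   -- the default, written out for clarity
   ROWTERMINATOR = '\n'
)
```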

Another thing to remember is that some files, especially ones exported from Excel, might include header rows, which can make your import fail with a conversion error:

Bulk insert data conversion error (type mismatch) for…

To address the issue, simply exclude the header rows from the import:

BULK INSERT dbo.StoreList FROM 'c:\stores.csv'
WITH (
    FIRSTROW = 2,
    ROWTERMINATOR = '\n'
)

There are other settings that can affect a bulk import:

  • turning insert triggers on while performing the import
  • keeping NULL values when nothing is specified between delimiters
  • specifying the LASTROW of the file to import
  • how many errors to ignore before the whole import is invalidated
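For illustration, here is a sketch combining several of these options (the option names are the ones documented for BULK INSERT); the row and error limits are made-up values:

```sql
BULK INSERT dbo.StoreList FROM 'c:\stores.csv'
WITH (
   FIRE_TRIGGERS,           -- run insert triggers during the import
   KEEPNULLS,               -- empty fields become NULL instead of column defaults
   FIRSTROW = 2,            -- skip the header row
   LASTROW  = 1000,         -- stop after row 1000 of the file
   MAXERRORS = 10,          -- tolerate up to 10 bad rows before failing the import
   FIELDTERMINATOR = ',',
   ROWTERMINATOR = '\n'
)
```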

Please refer to SQL Server Books Online for details about these settings.

This entry was posted in Fun stuff with SQL Server.