
6 ways to import data into SQL Server

2011-10-5 15:53 | Original author: Ted Krueger (onpnt)

I'm going to go over some methods for importing data from text files into SQL Server today. The particular file I went out and grabbed is comma delimited, with a few text qualifiers in it. It is a typical file you might receive with a request to import, or need to load for your own administrative tasks.

Below is the location of the field layout and the file that I grabbed off the net to play with. It is just a comma-separated text file of zip codes. I will attach the file to this blog as well.

http://spatialnews.geocomm.com/newsletter/2000/jan/zipcodes.html

Field 1 - State Fips Code
Field 2 - 5-digit Zipcode
Field 3 - State Abbreviation
Field 4 - Zipcode Name
Field 5 - Longitude in Decimal Degrees (West is assumed, no minus sign)
Field 6 - Latitude in Decimal Degrees (North is assumed, no plus sign)
Field 7 - 1990 Population (100%)
Field 8 - Allocation Factor (decimal portion of state within zipcode)
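
For reference, a destination table that mirrors this layout could look like the sketch below; the column names and types are my own illustration (the examples later in this post just use generic NVARCHAR(50) columns instead).

    CREATE TABLE dbo.zips_typed (      -- hypothetical table name, for illustration only
        StateFips     CHAR(2),         -- Field 1: state FIPS code
        ZipCode       CHAR(5),         -- Field 2: 5-digit zip code
        StateAbbr     CHAR(2),         -- Field 3: state abbreviation
        ZipName       NVARCHAR(50),    -- Field 4: zip code name
        Longitude     DECIMAL(9,6),    -- Field 5: longitude in decimal degrees (west assumed)
        Latitude      DECIMAL(9,6),    -- Field 6: latitude in decimal degrees (north assumed)
        Population90  INT,             -- Field 7: 1990 population (100%)
        AllocFactor   DECIMAL(9,6)     -- Field 8: allocation factor within the state
    );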

Example of file

Import Wizard

The first and most manual technique is the Import Wizard. It is great for ad-hoc, just-get-it-in-there tasks.

In SSMS right click the database you want to import into. Scroll to Tasks and select Import Data…

For the data source we want our zips.txt file. Browse for it and select it. You should notice the wizard tries to fill in the blanks for you. One key thing with the file I picked is that it has double-quote (") text qualifiers, so we need to make sure we enter " into the Text qualifier field. The wizard will not do this for you.

Go through the remaining pages to review everything; no further changes should be needed.

Hit Next after checking the pages out and select your destination, which in our case will be DBA.dbo.zips.

Following the destination step, go into the Edit Mappings section to make sure the column types and counts look right.

Hit Next and then Finish. Once completed, you will see the count of rows transferred and whether each step succeeded or failed.

Import wizard completed and you have the data!
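
A quick sanity check after any of these loads is a row count against the destination table; a minimal example, assuming the DBA.dbo.zips destination used above:

    SELECT COUNT(*) AS rows_loaded   -- should match the row count reported by the wizard
    FROM DBA.dbo.zips;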

bcp utility

Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx

This is probably going to win for speed on most occasions, but it is limited by the formatting of the file being imported. For this file it actually works well, with a small format file to describe the contents and the mappings to SQL Server.

To create a format file, all we really need for the most basic files is the data type and the count of columns. In our case the text qualifier makes it a bit more difficult, but there is a trick to ignoring it: add a field to the format file that references the qualifier but maps it to nothing, so it is skipped during the import.
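
If you would rather not write the file from scratch, bcp can also generate a skeleton format file from the destination table, which you can then edit by hand to add the dummy qualifier field; a sketch, assuming the zips table already exists in DBA:

    bcp DBA..zips format nul -c -f "c:\zip_format_file.txt" -S LKFW0133 -T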

Given that, our format file in this case would look like this:

    9.0
    9
    1       SQLCHAR       0       0       "\""         0     dummy1             ""
    2       SQLCHAR       0       50      "\",\""      1     Field1             ""
    3       SQLCHAR       0       50      "\",\""      2     Field2             ""
    4       SQLCHAR       0       50      "\",\""      3     Field3             ""
    5       SQLCHAR       0       50      "\","        4     Field4             ""
    6       SQLCHAR       0       50      ","          5     Field5             ""
    7       SQLCHAR       0       50      ","          6     Field6             ""
    8       SQLCHAR       0       50      ","          7     Field7             ""
    9       SQLCHAR       0       50      "\n"         8     Field8             ""

The bcp call would be as follows (-f points to the format file, -S names the server, and -T uses a trusted Windows authentication connection):

C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "c:\zip_format_file.txt" -S LKFW0133 -T

Given a successful run, you should see something like this in the command prompt after executing the statement:

    Starting copy...
    1000 rows sent to SQL Server. Total sent: 1000
    1000 rows sent to SQL Server. Total sent: 2000
    1000 rows sent to SQL Server. Total sent: 3000
    1000 rows sent to SQL Server. Total sent: 4000
    1000 rows sent to SQL Server. Total sent: 5000
    1000 rows sent to SQL Server. Total sent: 6000
    1000 rows sent to SQL Server. Total sent: 7000
    1000 rows sent to SQL Server. Total sent: 8000
    1000 rows sent to SQL Server. Total sent: 9000
    1000 rows sent to SQL Server. Total sent: 10000
    1000 rows sent to SQL Server. Total sent: 11000
    1000 rows sent to SQL Server. Total sent: 12000
    1000 rows sent to SQL Server. Total sent: 13000
    1000 rows sent to SQL Server. Total sent: 14000
    1000 rows sent to SQL Server. Total sent: 15000
    1000 rows sent to SQL Server. Total sent: 16000
    1000 rows sent to SQL Server. Total sent: 17000
    1000 rows sent to SQL Server. Total sent: 18000
    1000 rows sent to SQL Server. Total sent: 19000
    1000 rows sent to SQL Server. Total sent: 20000
    1000 rows sent to SQL Server. Total sent: 21000
    1000 rows sent to SQL Server. Total sent: 22000
    1000 rows sent to SQL Server. Total sent: 23000
    1000 rows sent to SQL Server. Total sent: 24000
    1000 rows sent to SQL Server. Total sent: 25000
    1000 rows sent to SQL Server. Total sent: 26000
    1000 rows sent to SQL Server. Total sent: 27000
    1000 rows sent to SQL Server. Total sent: 28000
    1000 rows sent to SQL Server. Total sent: 29000

bcp import completed!

BULK INSERT

Next, we have BULK INSERT, done here through OPENROWSET with the BULK option and the same format file we built for bcp:

    CREATE TABLE zips (
        Col1 NVARCHAR(50),
        Col2 NVARCHAR(50),
        Col3 NVARCHAR(50),
        Col4 NVARCHAR(50),
        Col5 NVARCHAR(50),
        Col6 NVARCHAR(50),
        Col7 NVARCHAR(50),
        Col8 NVARCHAR(50)
        );
    GO
    INSERT INTO zips
        SELECT *
          FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzuszipcodes\zips.txt',
               FORMATFILE = 'C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
               ) AS t1 ;
    GO

That was simple enough, given the work we already did on the format file. BULK INSERT isn't as fast as bcp, but it gives you some freedom from within T-SQL and SSMS to add functionality around the import.
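
For completeness, the BULK INSERT statement itself can load the same file with the same format file and no OPENROWSET; a minimal sketch, assuming the C:\ paths from the bcp example:

    BULK INSERT dbo.zips
    FROM 'C:\zips.txt'
    WITH (
        FORMATFILE = 'C:\zip_format_file.txt',   -- same format file used for bcp
        BATCHSIZE  = 1000,                       -- commit in batches of 1000 rows
        TABLOCK                                  -- table-level lock, often used for faster bulk loads
    );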

SSIS

Next is my favorite playground: SSIS.

There are many ways in SSIS to get data from point A to point B. I'll show the Data Flow Task and the SSIS version of BULK INSERT, the Bulk Insert Task.

First, create a new Integration Services project.

Create a new flat file connection by right-clicking the Connection Managers area. This will be used in both methods.

Bulk Insert Task

You can use the format file here as well, which makes it easy to move between methods; this essentially calls the same process, with the format file describing the layout. Drag over a Bulk Insert Task and double-click it to open the editor.

Fill in the information, starting with the connection. This will populate much as the wizard did.

Example of format file usage

Or specify your own details

Execute this and, again, we have some data.

Data Flow method

Bring over a data flow task and double click it to go into the data flow tab.

Bring over a Flat File Source and a SQL Server Destination. Edit the Flat File Source to use the flat file connection manager we already created, and connect the two once they are on the design surface.

Double-click the SQL Server Destination task to open the editor. Enter the connection manager information and select the table to import into.

Go into the mappings and connect the dots, so to speak.

A typical type-conversion issue here is Unicode to non-Unicode.

We fix this with a Data Conversion task or an explicit conversion in the editor; the Data Conversion task is usually the route I take. Drag over a Data Conversion task and place it on the path between the Flat File Source and the SQL Server Destination.

New look in the mappings

And after execution…

SqlBulkCopy Method

Since we're in an SSIS package, we can use that awesome Script Task to show SqlBulkCopy. It is not only fast, but also handy for those really "unique" file formats we receive so often.

Bring over a Script Task into the control flow.

Double-click the task and go to the Script page. Click Design Script to open the code behind.

Go ahead and put this code into the task.

    Imports System
    Imports System.Data
    Imports System.Math
    Imports System.Xml
    Imports System.Data.SqlClient
    Imports System.Data.OleDb
    Imports Microsoft.SqlServer.Dts.Runtime

    Public Class ScriptMain

        Public Sub Main()
            ' Connection to the destination database for the bulk copy
            Dim conn As New SqlConnection("Data Source=LKFW0133;Initial Catalog=DBA;Integrated Security=SSPI")

            Using bulk As New SqlBulkCopy(conn.ConnectionString)
                bulk.BatchSize = 1000                    ' commit every 1000 rows
                bulk.NotifyAfter = 1000                  ' raise SqlRowsCopied every 1000 rows
                bulk.DestinationTableName = "zips"
                AddHandler bulk.SqlRowsCopied, AddressOf OnSqlRowsCopied
                bulk.WriteToServer(LoadupDTFromTxt())    ' stream the DataTable into the zips table
            End Using

            Dts.TaskResult = Dts.Results.Success
        End Sub

        ' Progress callback fired every NotifyAfter rows
        Private Sub OnSqlRowsCopied(ByVal sender As Object, _
            ByVal args As SqlRowsCopiedEventArgs)
            Console.WriteLine("Copied {0} so far...", args.RowsCopied)
        End Sub

        ' Load the text file into a DataTable using the Jet text driver
        Private Function LoadupDTFromTxt() As DataTable
            Dim cn As New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0; Data Source=C:\;Extended Properties=""Text;HDR=No;FMT=Delimited""")
            Dim da As New OleDbDataAdapter()
            Dim ds As New DataSet
            Dim cd As New OleDbCommand("SELECT * FROM C:\zips.txt", cn)
            cn.Open()
            da.SelectCommand = cd
            ds.Clear()
            da.Fill(ds, "zips")
            cn.Close()                                   ' close the connection before returning the table
            Return ds.Tables(0)
        End Function
    End Class

Then execute the Script Task.

Again the same results as previous methods but with a new look.
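
One thing to be aware of: when no column mappings are supplied, SqlBulkCopy maps source to destination columns by ordinal position. If you want to map by name instead (the Jet text driver names headerless columns F1, F2, and so on), you can add explicit mappings before calling WriteToServer; a small sketch, assuming the generic Col1..Col8 zips table created earlier:

    ' Inside the Using block, before bulk.WriteToServer(...):
    bulk.ColumnMappings.Add("F1", "Col1")   ' source column name -> destination column name
    bulk.ColumnMappings.Add("F2", "Col2")
    bulk.ColumnMappings.Add("F3", "Col3")
    bulk.ColumnMappings.Add("F4", "Col4")
    bulk.ColumnMappings.Add("F5", "Col5")
    bulk.ColumnMappings.Add("F6", "Col6")
    bulk.ColumnMappings.Add("F7", "Col7")
    bulk.ColumnMappings.Add("F8", "Col8")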

All of these methods have a place, depending on the situation. Performance-wise, in my experience bcp typically wins on speed. It is not always a method you can use, though, which leads us to the other resources SQL Server and its services provide.

Of course, none of this would be completely finished unless we added some statistics on runtimes for these methods. I went ahead and created 3 additional zips.txt files to go with the original. In these files we have the following:

zips.txt - 29,471 rows at around 1.8MB
zips_halfmill.txt - 500,991 rows at around 31.4MB
zips_million.txt - 1,001,981 rows at around 62.8MB
zips_5mill.txt - 5,009,901 rows at around 314.3MB

I ran each of these through all the methods; my results are below. Mind you, the important thing to understand is that I write my blogs/articles off my personal test lab. In no way do I use monster servers that would be better suited for benchmarking. Not all hardware is created equal, and results will vary given that variable. Take these results, but keep in mind that available resources will move them up and down the chart; memory, I/O and CPU are big factors in speed.

Tests were completed by running each process 5 times, with all resources cleared before each execution. Results are shown as the average in milliseconds for each method. The Import Wizard was not tested, as it is basically a Data Flow Task behind the scenes and its results can be seen (minus the slow user clicking things) in the Data Flow Task results from SSIS.

Hope this helps as a good reference in your own imports.

