[dba-SQLServer] [dba-VB] How I'm approaching the problem

jwcolby jwcolby at colbyconsulting.com
Tue Jul 24 11:10:37 CDT 2007


One issue I am running into is that I really want to be able to log the
start / stop PKID of each batch of records imported.  I use an autoincrement
(IDENTITY) column in SQL Server, but I do not know how to get at the last
PKID "live" as the SqlBulkCopy runs.  Is it possible to pull the value of a
field back out of the SqlBulkCopy object?  Do I have to execute a SQL
statement to "get" the PKID of the last record before the import and of the
last record after the import?
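[One way to do this, sketched below, is exactly the second option: since
SqlBulkCopy does not surface the generated identity values, read the table's
current IDENTITY value before and after the load with IDENT_CURRENT.  The
table name dbo.ImportTarget and column PKID here are hypothetical, and the
range is only reliable if nothing else inserts into the table during the
load.]

```csharp
// Sketch: bracket a SqlBulkCopy load by reading the current IDENTITY value
// before and after.  Table "dbo.ImportTarget" and PK column "PKID" are
// hypothetical names; assumes no concurrent writers on the target table.
using System;
using System.Data;
using System.Data.SqlClient;

class BulkLoadLogger
{
    // IDENT_CURRENT('table') returns the last identity value generated for
    // that table, in any session.
    static long CurrentIdentity(SqlConnection cn, string table)
    {
        using (var cmd = new SqlCommand(
            "SELECT CAST(IDENT_CURRENT(@t) AS bigint)", cn))
        {
            cmd.Parameters.AddWithValue("@t", table);
            object o = cmd.ExecuteScalar();
            return (o == null || o == DBNull.Value) ? 0L : (long)o;
        }
    }

    static void Load(string connStr, DataTable rows)
    {
        using (var cn = new SqlConnection(connStr))
        {
            cn.Open();
            long before = CurrentIdentity(cn, "dbo.ImportTarget");

            using (var bcp = new SqlBulkCopy(cn))
            {
                bcp.DestinationTableName = "dbo.ImportTarget";
                bcp.BatchSize = 10000;
                bcp.WriteToServer(rows);
            }

            long after = CurrentIdentity(cn, "dbo.ImportTarget");
            Console.WriteLine("Imported PKID range: {0} - {1}",
                              before + 1, after);
        }
    }
}
```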


John W. Colby
Colby Consulting
www.ColbyConsulting.com 
-----Original Message-----
From: dba-vb-bounces at databaseadvisors.com
[mailto:dba-vb-bounces at databaseadvisors.com] On Behalf Of Shamil
Salakhetdinov
Sent: Monday, July 23, 2007 3:57 PM
To: dba-vb at databaseadvisors.com
Subject: Re: [dba-VB] How I'm approaching the problem

<<<
I am doing tables of 50 to 100 million records so the total process can
still run 60 to 90 minutes but it was taking 2 days.
>>>
Yes, that's speedy!

Do you use any special hardware?

Here are some results of using SQLDMO.BulkCopy under C#/VB.NET:

SQLDMO.BulkCopy
================

BCP export - 141 seconds for 1,216,138 reported rows

BCP import - 174 seconds for 1,256,819 reported rows

Extrapolating to 100 million records:

BCP Export: 141 * 100,000,000 / 1,216,138 = 11,594.17 seconds = 3.22 hours ~=
3h13m

BCP Import: 174 * 100,000,000 / 1,256,819 = 13,844.48 seconds = 3.85 hours ~=
3h51m

Well, performance will probably degrade at that scale, and the total time will
be higher...

These results are not as quick as SqlBulkCopy on your system, but this test PC
of mine is "pretty simple" for these days: a dual-core Pentium at 3 GHz with
2 GB RAM and an ordinary 7,400 rpm hard disk....


--
Shamil
 
-----Original Message-----
From: dba-vb-bounces at databaseadvisors.com
[mailto:dba-vb-bounces at databaseadvisors.com] On Behalf Of jwcolby
Sent: Monday, July 23, 2007 9:38 PM
To: dba-vb at databaseadvisors.com
Subject: Re: [dba-VB] How I'm approaching the problem

Shamil,

I am likewise using SqlBulkCopy to import data from CSV files, using a third
party library to handle the conversion of the file to a stream and vice
versa.  I am quite impressed.  Using my hand-written code for something
similar I was getting around 1,000 records / second loading "row by row",
whereas using SqlBulkCopy and this third party lib I am getting about 16,000
records / second, i.e. about 60 seconds for 1 million records.  That is quite
usable.

I am doing tables of 50 to 100 million records so the total process can
still run 60 to 90 minutes but it was taking 2 days.

John W. Colby
Colby Consulting
www.ColbyConsulting.com



