[AccessD] using a saved SSIS with VB.Net

JWColby jwcolby at colbyconsulting.com
Tue May 1 07:49:20 CDT 2007


Eric,

I have successfully generated the fmt file.  My only concern at this point
is that the import file has spaces at the end of the valid data, padding to
make up the full width of the field.  I do NOT want the spaces, and I
suspect that BCP is going to pull in all of the spaces.  Is it possible to
tell BCP to strip the spaces, or is it possible to use BCP to pull a CSV
file into an existing table, respecting the already established field types
/ sizes in the destination table?
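One common workaround for the trailing-pad-space problem: BULK INSERT the fixed-width file into a staging table of CHAR columns, then INSERT...SELECT with RTRIM into the real destination. A minimal T-SQL sketch; the table and column names here are placeholders for illustration, not from the thread:

```sql
-- Staging table mirrors the raw fixed-width layout; CHAR keeps the pad spaces.
CREATE TABLE #staging (
    suffix     CHAR(5),
    last_name  CHAR(30),
    first_name CHAR(20)
);

BULK INSERT #staging
FROM 'c:\temp\import.csv'
WITH (FORMATFILE = 'c:\temp\importCSV.fmt');

-- Strip the trailing pad spaces on the way into the destination table.
INSERT INTO dbo.Employees (suffix, last_name, first_name)
SELECT RTRIM(suffix), RTRIM(last_name), RTRIM(first_name)
FROM #staging;

DROP TABLE #staging;
```

If the destination columns are VARCHAR, only the trimmed values are stored, so the padding never reaches the final table.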

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Eric Barro
Sent: Monday, April 30, 2007 7:31 PM
To: 'Access Developers discussion and problem solving'
Subject: Re: [AccessD] using a saved SSIS with VB.Net

Have you looked into BULK INSERT in SQL? This is supposed to be a faster
data import method.


Using T-SQL you can do something like this...

CREATE TABLE #tmpEmployees (<fieldnames here>)

BULK INSERT #tmpEmployees
FROM 'c:\temp\import.csv'
WITH (FORMATFILE = 'c:\temp\importCSV.fmt')

importCSV.fmt would contain the file format...in this example it's fixed
width

8.0
3
1   SQLCHAR   0   5    ""   1   suffix       SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   30   ""   2   last_name    SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   20   ""   3   first_name   SQL_Latin1_General_CP1_CI_AS



-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of JWColby
Sent: Tuesday, April 24, 2007 9:07 AM
To: 'Access Developers discussion and problem solving'
Subject: Re: [AccessD] using a saved SSIS with VB.Net

The CSV file is on the same machine.  It appears that the clause that pulls
the source table (csv file) into memory is taking a ton of time.  These are
large files, the smallest are a little under 200 million bytes and the
largest are up in the 3 gigabyte range.  It appears that SQL Server does not
read a few records and append them, but rather reads the whole CSV and then
starts appending all of the assembled records.

If I were a SQL Server pro I could probably speed this up considerably.
Alas, I am not.
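One thing that can help with the single-giant-transaction behavior described above: BULK INSERT accepts a BATCHSIZE option that commits every N rows instead of holding the whole file in one transaction, and TABLOCK can enable minimally logged loads. A hedged sketch; the file path and table name are illustrative, not from the thread:

```sql
-- BATCHSIZE commits every 100,000 rows rather than one giant transaction;
-- TABLOCK takes a bulk-update lock, which can allow minimal logging.
BULK INSERT dbo.ImportTarget
FROM 'c:\temp\bigfile.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 100000,
    TABLOCK
);
```

Whether this speeds up a given load depends on the recovery model and indexes on the target table, so it is worth testing on one of the smaller files first.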

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Robert L. Stewart
Sent: Tuesday, April 24, 2007 11:53 AM
To: accessd at databaseadvisors.com
Subject: Re: [AccessD] using a saved SSIS with VB.Net

John,

One of the keys to getting it into SQL Server faster is to have the CSV file
on the server and not on a different machine. Network traffic can kill the
process and slow it down significantly.

Robert

At 10:22 AM 4/24/2007, you wrote:
>Date: Tue, 24 Apr 2007 10:32:44 -0400
>From: "JWColby" <jwcolby at colbyconsulting.com>
>Subject: Re: [AccessD] using a saved SSIS with VB.Net
>To: "'Access Developers discussion and problem solving'"
>         <accessd at databaseadvisors.com>
>Message-ID: <003101c7867d$6cce93a0$657aa8c0 at m6805>
>Content-Type: text/plain;       charset="us-ascii"
>
>Gustav,
>
>My bigger issue here is that there are 56 of these files to import into SQL 
>Server, supposedly ~100 million records.  I have done about 8 million 
>records so far.  I really must get this thing automated such that it 
>just chunks through these CSV files without my having to be around to 
>start the next one.  I am working now on setting up the append query 
>using that syntax below into a stored procedure so that I can then just 
>replace the file name.
>After that I will need to write something in VB.Net or whatever to 
>execute the stored procedure feeding in all of the file names from a 
>specific directory, deleting the file once the stored procedure 
>finishes the import for a given file.
>
>I have never written a stored procedure.
>Obviously, given the above, I have never called a stored procedure from 
>code.
>
>So much to learn, so little time.
>
>Once this is imported I have to turn right around and export a subset 
>of fields from the table back out as 1 - 2 million record chunks for 
>CAS / DPV / NCOA processing, then I have to import THOSE back in to a new 
>table.
>
>And when this set of data is finished, I have another set of about the 
>same size on the way, to which I have to perform the same processes.  I 
>soooooo need to get this process automated.
>
>John W. Colby
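For the automation John describes (one stored procedure, re-run per file), one wrinkle is that BULK INSERT will not accept a variable for the file path, so the usual pattern is dynamic SQL inside the procedure. A sketch under those assumptions; the procedure, table, and parameter names are invented for illustration:

```sql
CREATE PROCEDURE dbo.usp_ImportCsv
    @FilePath NVARCHAR(260)
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);
    -- BULK INSERT cannot take a variable for FROM, so build the statement.
    SET @sql = N'BULK INSERT dbo.ImportTarget FROM ''' + @FilePath + N''' '
             + N'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
    EXEC sp_executesql @sql;
END;
```

Calling code (VB.Net or anything that can open a connection) then just loops over the directory and runs, for each file: EXEC dbo.usp_ImportCsv @FilePath = 'c:\temp\file1.csv';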


--
AccessD mailing list
AccessD at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/accessd
Website: http://www.databaseadvisors.com





