jwcolby
jwcolby at colbyconsulting.com
Sat Jan 30 08:30:55 CST 2010
Thanks for the reply Mark. Whatever I do has to be executable from C#. Additionally, I have to be able to figure it out. That last is a pretty severe constraint. ;)

BCP is extremely fast, and I already automate that in C#. I have a couple of BCP input and output stored procedures which build dynamic TSQL to perform the BCP. I then execute those stored procedures using a Stored Procedure class I wrote which essentially wraps a command object and implements a set of standard SP parameters that I use all of the time.

So... if I can do it with BCP, that is preferred, simply because I have already struggled halfway up that learning curve. I am open to other tools, and I know that I really should learn SSIS (or whatever they are calling it now).
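Roughly, the shape of what I mean is below. This is only a stripped-down sketch, not my actual procedures: the database, table, data file and format file names are all placeholders, and it assumes xp_cmdshell is enabled so the dynamic bcp command line can be shelled out.

CREATE PROCEDURE dbo.BCPImportFile
    @DatabaseName sysname,
    @TableName    sysname,
    @DataFile     nvarchar(260),
    @FormatFile   nvarchar(260)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @cmd varchar(4000);

    -- Build the bcp command line: -f points bcp at a format file that
    -- describes the fixed-width layout, -T uses a trusted connection.
    SET @cmd = 'bcp ' + QUOTENAME(@DatabaseName) + '.dbo.' + QUOTENAME(@TableName)
             + ' in "' + @DataFile + '" -f "' + @FormatFile + '" -T';

    -- Shell out to run bcp (requires xp_cmdshell to be enabled).
    EXEC master.dbo.xp_cmdshell @cmd;
END;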
John W. Colby
www.ColbyConsulting.com

Mark Breen wrote:
> Hello John,
>
> Forgive me if you have already deep dived into SSIS (2005), which used to be
> DTS (2000 + SQL 7.0), which used to be BCP (SQL 6.5), which you taught me
> about in 1997.
>
> I just looked around my SQL 2008 installation here and could not find SSIS,
> but I do have Business Intelligence here, so that must be what they are
> calling it in SQL 2008.
>
> I only explored SSIS a little a few years ago, but it seemed to be a simple,
> almost script-based system that is specifically designed for this type of
> work. I believe that data warehouse people refer to this work as ETL:
> Extraction, Transforming and Loading. On one project I was working on, there
> was a whole ETL department.
>
> As we all know, you can easily code this by hand, but I guess the SQL team
> knew that data warehouses need a set of ETL tools. BCP was version 1.0, and
> now with SQL 2008 we have a more sophisticated set of tools.
>
> From my own, limited use of SSIS, I know that you can load a file, perform
> validation and modification on a line by line basis, push each row along to
> either a good-data or bad-data branch, and finally place the good data in a
> new table, in whatever form you wish.
>
> Would it be faster than loading a single record, or even 1000 records at a
> time? Who knows, but the SQL tools for ETL are designed for this type of work.
>
> From this link <http://www.microsoft.com/sqlserver/2008/en/us/integration.aspx>
> I have copied the following:
>
> Overview
>
> Microsoft SQL Server 2008 provides a scalable enterprise data integration
> platform with exceptional Extract, Transform, Load (ETL) and integration
> capabilities, enabling organizations to more easily manage data from a wide
> array of data sources.
>
> Top New Features
>
> - Create Script tasks by using Microsoft Visual C# and Microsoft Visual
>   Basic .NET.
> - Use ADO.NET for tasks as well as for source and destination components.
> - Improve scalability with thread pooling and enhanced lookup transformations.
> - Perform more functional and scalable data transfers with the improved SQL
>   Server Import and Export Wizard.
> - New SSIS connectors for SAP BW, Oracle and Teradata.
>
> I hope that is some help, and does not frustrate you to say "I know that
> stuff, but it is a hassle to take the deep dive".
>
> Mark
>
> On 30 January 2010 04:44, jwcolby <jwcolby at colbyconsulting.com> wrote:
>
>> I have a situation where I have to import into SQL Server millions of
>> records, fixed width, same format, different data, over and over again.
>>
>> The format is given to me in a spreadsheet looking like this:
>>
>> FROM  TO   LENGTH  MODE  DESCRIPTION
>> 1     40   40      C     FULL NAME (PARSED NAME IN POS 421-482)
>> 41    80   40      C     COMPANY
>> 81    120  40      C     ADDRESS 1
>> 121   160  40      C     ADDRESS 2
>> 161   190  30      C     CITY
>> 191   192  2       C     STATE
>> 193   197  5       C     ZIP
>> 198   198  1       C     ZIP 4 -
>>
>> MANY more columns.
>>
>> I need to import this into a table. I have done this once, brute force,
>> using the import wizard.
>>
>> The bottom line is that I now have an existing table in SQL Server that
>> "fits" the spec. The fields are all nvarchar() with the correct number of
>> characters in each field. One problem is that I do NOT want the trailing
>> spaces which pad the data to the fixed width. I do have a udf which strips
>> off padding, but it is going to be a PITA to apply that to all of the fields
>> for millions of records (though I am capable of doing that if necessary).
>>
>> Is there any way to tell SQL Server to import the data into this table?
>> Preferably in a stored procedure that I can execute from my existing C#
>> code base, which knows how to execute stored procedures.
>>
>> I have to do this many times in the future, and I do NOT want to do it
>> manually every time. Somebody mentioned one time doing something with the
>> wizard and then peeking at the code that the wizard generated. I don't know
>> how to do that, but if I could, I could then cut that into an SP and execute
>> the SP as needed.
>>
>> Something like that.
>>
>> Anybody have any ideas?
>>
>> TIA
>>
>> --
>> John W. Colby
>> www.ColbyConsulting.com
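PS: On the trailing-space padding mentioned in the quoted message above, one set-based alternative to running a udf field by field would be dynamic SQL that RTRIMs every nvarchar column in a single UPDATE. A minimal sketch only; the staging table name is a placeholder, not the real import table.

DECLARE @sql nvarchar(max);

-- Build "col = RTRIM(col)" for every nvarchar column of the staging table.
SELECT @sql = STUFF((
    SELECT ', ' + QUOTENAME(c.name) + ' = RTRIM(' + QUOTENAME(c.name) + ')'
    FROM sys.columns c
    WHERE c.object_id = OBJECT_ID('dbo.ImportStaging')
      AND TYPE_NAME(c.system_type_id) = 'nvarchar'
    FOR XML PATH('')
), 1, 2, '');

SET @sql = 'UPDATE dbo.ImportStaging SET ' + @sql + ';';

-- One pass over the table instead of a udf call per field per row.
EXEC sp_executesql @sql;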