jwcolby
jwcolby at colbyconsulting.com
Thu Jan 29 07:52:49 CST 2009
Fred, I guess I didn't explain the big picture well enough (or at all?). I do this SQL stuff for one particular client. I have DOZENS of databases; each database represents an order to export data and send it off to his client. Each database has queries, tables, SPs etc. In GENERAL, I copy a template database that has most of the required stuff, then modify existing views, populate tables in that database with the data set, and extract a small subset of that data for export. I ALWAYS modify the views to extract the data. I sometimes pull data into more than one table, in which case that is a new table which never existed before. Once my SPs work I rarely if ever modify them; they are in the template and I just use them.

When I am done, I have an "encapsulated" order database for a specific client of my client, an "order". My client can come back to me later and say "we need X more records from this order sent to the same client", at which time I simply open that database again and, looking into the result tables from the previous order, pull (and mark as pulled) X more records and export them.

All of this stuff requires pre-created SPs to do the heavy lifting inside of SQL Server. These orders pull data from source tables of anywhere from 5 million to well over 100 million records. I need GENERIC methods of calling into SQL Server and causing ANY of my stored procedures to execute. While I was developing this system I have been opening these databases and executing by hand. Now it is time to take the next step and get an external "control" system in place to allow me to do this stuff.

In the case of the SPECIFIC emails I have been sending in this flurry, I am running a process of:

1) Import CSV files into a table.
2) Export a subset of fields back out to CSV files.
3) Run an external program on those CSV files.
4) Pull the resulting CSV files back into SQL Server.

The steps are the same, the details change. Where the original CSV files come from changes.
The quantity of records in the source CSV files changes. The database and table that the source CSV files go into change. The destination that the export CSV files go to changes. Etc. But the PROCESS is identical. So...

1) I need to be able to execute stored procedures, passing in parameters such as directory paths, database and table names, etc.
2) I need to be able to execute stored procedures that get pieces of data from the database (such as the start / stop PKID of a table) and get it back out to an external control program.
3) These stored procedures use BULK INSERT and BCP, and they are a major PITA to get set up and functioning; getting them set up and functioning pretty much has to happen inside of SQL Server. Once I am done and it works, it is already inside of SQL Server, so just calling it and passing parameters is the logical thing to do.

Doing things this way I can write a program that uses metadata stored in tables to do the work, controlling existing stored procedures inside of the databases in SQL Server.

I do appreciate your code however. There are times where it will be useful to use that method.

John W. Colby
www.ColbyConsulting.com

Fred Hooper wrote:
> Hi John,
>
> If I remember how this discussion started, you just wanted to run some SQL
> Server code from Access. After you were able to do so you wanted to know how
> many rows were affected.
>
> If this is right, I don't see why you need stored procedures; why not
> execute the code directly? Just construct the SQL statement in code (no need
> for parameters this way) and then execute it.
>
> I wrote an Access subroutine that does this in my local copy of SQL Server
> 2008. The code is below. It inserts the records from one table into another
> and then deletes the new records (because truncate, while faster, doesn't
> return the number of records).
>
> Hope this helps,
> Fred
>
> Public Sub RunCommand()
>     Dim strCNN As String
>     Dim cnn As ADODB.Connection
>     Dim cmd As ADODB.Command
>     Dim lngRecords As Long   ' Long, not Integer: row counts over 32,767 overflow an Integer
>
>     ' Set up connection (using 2008; on another version you'll need to change the provider too)
>     strCNN = "Provider=SQLNCLI10" & _
>              ";Server=FRED-6998B25045" & _
>              ";Database=TrapoLocal" & _
>              ";Uid=Trapo" & _
>              ";Pwd=Trapo"
>     Set cnn = New ADODB.Connection
>     With cnn
>         .ConnectionString = strCNN
>         .ConnectionTimeout = 10
>         .CursorLocation = adUseClient
>         .Open
>     End With
>
>     Set cmd = New ADODB.Command
>     With cmd
>         .ActiveConnection = cnn
>         .CommandText = "insert into test select * from dbo.TempVsPSIG"
>         .Execute lngRecords
>         .CommandText = "delete from test"
>         .Execute lngRecords
>     End With
>
>     Set cmd = Nothing
>     cnn.Close
>     Set cnn = Nothing
> End Sub
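The generic "execute ANY stored procedure, passing parameters from metadata" control layer John describes could be sketched roughly as below. This is a minimal illustration in Python rather than VBA, and every name in it (the procedure, its parameters, the paths) is hypothetical, not taken from John's actual databases. The idea is simply to render a parameterized EXEC call from a metadata dictionary, so directory paths and table names travel as bound parameters instead of being concatenated into the SQL string:

```python
def build_exec_call(proc_name, params):
    """Render 'EXEC proc @a = ?, @b = ?' plus the ordered argument tuple.

    params is a dict of parameter-name -> value, as it might be read from
    a metadata table describing one step of an order's process.
    """
    placeholders = ", ".join(f"@{name} = ?" for name in params)
    sql = f"EXEC {proc_name} {placeholders}".rstrip()
    return sql, tuple(params.values())


# Example: step 1 of the process -- import CSV files into a table.
# Procedure and parameter names are made up for illustration only.
sql, args = build_exec_call(
    "dbo.usp_ImportCSV",
    {"SrcDir": r"C:\Orders\12345\In", "TargetTable": "tblOrderStage"},
)
print(sql)   # EXEC dbo.usp_ImportCSV @SrcDir = ?, @TargetTable = ?
print(args)  # ('C:\\Orders\\12345\\In', 'tblOrderStage')
```

From there the rendered call could be executed with something like pyodbc's `cursor.execute(sql, args)`, or from VBA with an ADODB.Command whose Parameters collection carries the values. A value the control program needs back out (the start / stop PKID case in point 2) can be returned as a one-row result set by ending the procedure with a SELECT, or as an OUTPUT parameter.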