[AccessD] pls advise: Performance problem => Staging + UpdateBatch?

Gustav Brock gustav at cactus.dk
Thu Oct 28 05:18:31 CDT 2004


Hi Sander

First, reading a 0.25 MB file off the network should be fast; there
is no need to copy the file first. But that is easy to test ... just
try with a local copy.
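
If you do want to test with a local copy, a one-off like this will
do (the paths are placeholders):

Dim strLocal As String
strLocal = Environ("TEMP") & "\import.txt"
' Pull the file to the local machine before timing the import.
FileCopy "\\server\share\import.txt", strLocal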

Second, you can read the file directly with a query, as you did for
exporting.
Then use this query as the source for append queries, one for each
target table. Further, you can wrap this step in a
BeginTrans/CommitTrans pair to give you the option of rolling back.
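
A minimal DAO sketch of that approach - assuming a delimited file (a
fixed-width file needs a Schema.ini), a two-character record-type
prefix, and placeholder table names:

Sub ImportFlatFile()
  Dim wrk As DAO.Workspace
  Dim dbs As DAO.Database
  Dim strSource As String

  ' Jet can query the text file in place: Database= names the
  ' folder, and the "table" is the file name with . changed to #.
  strSource = "[Text;HDR=No;Database=C:\Temp\].[import#txt]"

  Set wrk = DBEngine.Workspaces(0)
  Set dbs = wrk.Databases(0)

  On Error GoTo Err_Import
  wrk.BeginTrans
  ' One append query per target table, filtered on the record-type
  ' prefix. F1 is Jet's default name for the first column.
  dbs.Execute "INSERT INTO tblHeader SELECT * FROM " & strSource & _
      " WHERE Left(F1, 2) = '01'", dbFailOnError
  dbs.Execute "INSERT INTO tblDetail SELECT * FROM " & strSource & _
      " WHERE Left(F1, 2) = '02'", dbFailOnError
  ' ... repeat for the remaining tables ...
  wrk.CommitTrans

Exit_Import:
  Exit Sub

Err_Import:
  wrk.Rollback
  MsgBox Err.Description
  Resume Exit_Import
End Sub

dbFailOnError makes a failed append raise an error, so a problem in
any one of the appends rolls back the whole batch.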

/gustav


> Hi group,
 
> I've got an A2K app (that needs to migrate to XP) that needs to import a flat file.
> A 267 KB file takes about 25-28 HOURS(!!) to process. Yes, it's a slow network.
> I need to improve this.
> This is how it 'works' now (roughly the loop sketched after this list):
> 1 - read the file on the network to determine the number of lines
> 2 - start a for x=0 to NumberOfLines loop
> 3 - check the first x characters in the line
> 4 - create a new record in a table (the table depends on step 3; there are 5 different tables to store the records)
> 5 - execute an update
> 6 - go to the next record
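
Presumably that loop looks something like this reconstruction (table
and field names are guesses); the point is that every single line
costs its own OpenRecordset/Update round trip across the slow network:

Dim dbs As DAO.Database
Dim rst As DAO.Recordset
Dim strLine As String
Dim intFile As Integer

Set dbs = CurrentDb
intFile = FreeFile
Open "\\server\share\import.txt" For Input As #intFile
Do While Not EOF(intFile)
  Line Input #intFile, strLine
  ' Pick the target table from the record-type prefix ...
  Select Case Left(strLine, 2)
    Case "01": Set rst = dbs.OpenRecordset("tblHeader")
    Case "02": Set rst = dbs.OpenRecordset("tblDetail")
    ' ... three more cases ...
  End Select
  ' ... and write one record: one round trip per line.
  rst.AddNew
  rst!RawLine = strLine
  rst.Update
  rst.Close
Loop
Close #intFile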
 
> Here's how I want to do it:
> 1 - copy the file from the network to the desktop
> 2 - import the complete file into a staging table (see the sketch after this list)
> 3 - use an UpdateBatch mechanism to update the BE.
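
For step 2, a single TransferText call imports the whole file at
once. "ImportSpec" and "tblStaging" are assumed names; the import
specification is created once with the text import wizard:

' Import the local copy into a staging table in one shot.
DoCmd.TransferText acImportDelim, "ImportSpec", "tblStaging", _
    Environ("TEMP") & "\import.txt", False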
 
> Question:
> How can I implement this UpdateBatch? I mean, I have to store the records for each table,
> and after the complete file is processed I want to update all tables.
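
In ADO that means one client-side, batch-optimistic recordset per
target table (this needs a reference to the Microsoft ActiveX Data
Objects library; the back end path and names are placeholders):

Dim cnn As ADODB.Connection
Dim rstHeader As ADODB.Recordset

Set cnn = New ADODB.Connection
cnn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
    "Data Source=\\server\share\backend.mdb"

Set rstHeader = New ADODB.Recordset
rstHeader.CursorLocation = adUseClient         ' keep the rows local
rstHeader.Open "SELECT * FROM tblHeader WHERE 1 = 0", cnn, _
    adOpenStatic, adLockBatchOptimistic        ' enables UpdateBatch

' Add rows locally while processing the file ...
rstHeader.AddNew
rstHeader!RawLine = "example line"
' ... then send them all to the back end in one batch.
rstHeader.UpdateBatch

Open one such recordset for each of the five tables and call
UpdateBatch on each when the file has been processed.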
 
> Question:
> I need to implement a transaction. If the processing fails somewhere I need to roll back the changes (if I use an UpdateBatch this is probably not necessary, but it's good programming I think).
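
The connection from the sketch above can supply that transaction;
inside the same procedure, if any UpdateBatch fails, everything is
rolled back:

On Error GoTo Err_Batch
cnn.BeginTrans
rstHeader.UpdateBatch
' ... UpdateBatch on the remaining recordsets ...
cnn.CommitTrans
Exit Sub

Err_Batch:
cnn.RollbackTrans
MsgBox Err.Description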
 
> Any tips/advice is greatly appreciated!!
 
> Regards,
 
> Sander
