Shamil Salakhetdinov
shamil at smsconsulting.spb.ru
Sat Apr 26 18:47:37 CDT 2008
<<< I am definitely interested in doing strongly typed recordsets though. >>>

John,

Those strongly typed recordsets impose a big overhead - you can dig into the code VS generates for them to see it...

...as for the batch updates "automagically" generated by ADO.NET - you can use SQL Profiler to see what happens "under the hood"...

...I'd bet that if you generate batch updates with custom code into temp SPs, and use a DataReader to get the data to build those batches, this approach will give you much faster data processing, and you will have many more options to optimize it further...

...just my opinion, but ADO.NET typed recordsets are not recommended for ASP.NET apps either - have a look at e.g. "ASP.NET Website Programming: Problem - Design - Solution, C# Edition" by Marco Bellinaso...

--
Shamil

-----Original Message-----
From: dba-vb-bounces at databaseadvisors.com [mailto:dba-vb-bounces at databaseadvisors.com] On Behalf Of jwcolby
Sent: Sunday, April 27, 2008 3:32 AM
To: Discussion concerning Visual Basic and related programming issues.
Subject: Re: [dba-VB] ADO.Net

Thanks Shamil,

Unfortunately, unless you are dealing with 80-million-row, 200-field tables, any timing comparisons are going to be suspect.

I considered doing (and may still do) a record-by-record update using either a stored procedure or dynamic SQL right from VB.NET. Then I read a blog post, from a member of Microsoft's ADO.NET dev team, which essentially said that the batch update from an ADO dataset really does work as you would hope. He claimed that either of the other methods carries network overhead, because each update (when done singly) requires a round trip for the command plus a result coming back, whereas a batch groups the commands going out, and the results coming back, into big packages. Understand that I do not know enough to even comment on the validity of his argument, but he said to definitely try it: set the batch size and then let ADO just send batches of updates. Given that the coding effort is smaller, I figured I would at least get that running.

I am definitely interested in doing strongly typed recordsets though.

Shamil Salakhetdinov wrote:
> Hi John,
>
> Try to use SqlDataReader and custom classes as in the sample code below
> (sorry, C# is used - part of the real pleasure of C# is that you can copy
> and paste code and not worry about line wraps, because even code "screwed
> up by e-mail" should still compile)...
>
> ...using ADO.NET DataSets and batch update will in any case result in a
> series of updates on the SQL Server side, therefore custom on-the-fly
> built SQL updates or calls to update SPs should work as quickly as
> ADO.NET DataSets' batch update...
>
> ...etc...
>
> ...as you can see, my sample updated 25,000 records in ~10 seconds - and
> that is only for starters - if the approach works reasonably well for
> you, the next step could be to introduce multi-threading etc...
>
> ...note also how MS SQL 2005's paging feature is used to fetch batches of
> records...

--
John W. Colby
www.ColbyConsulting.com
_______________________________________________
dba-VB mailing list
dba-VB at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-vb

http://www.databaseadvisors.com
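
[Editor's note: a minimal C# sketch of the batched update John describes ("set the batch size and then let ADO just send batches of updates"), presumably via SqlDataAdapter.UpdateBatchSize. The connection string, table, and column names (dbo.Accounts, Id, Status) are illustrative assumptions only, not taken from the thread.]

// Batch update via SqlDataAdapter.UpdateBatchSize (sketch, assumed schema).
using System;
using System.Data;
using System.Data.SqlClient;

class BatchUpdateSketch
{
    static void Main()
    {
        string connStr = "Data Source=.;Initial Catalog=TestDb;Integrated Security=SSPI;";

        using (SqlConnection cn = new SqlConnection(connStr))
        using (SqlDataAdapter da = new SqlDataAdapter("SELECT Id, Status FROM dbo.Accounts", cn))
        {
            // Explicit update command; a SqlCommandBuilder could generate it instead.
            da.UpdateCommand = new SqlCommand(
                "UPDATE dbo.Accounts SET Status = @Status WHERE Id = @Id", cn);
            da.UpdateCommand.Parameters.Add("@Status", SqlDbType.NVarChar, 50, "Status");
            da.UpdateCommand.Parameters.Add("@Id", SqlDbType.Int).SourceColumn = "Id";
            da.UpdateCommand.UpdatedRowSource = UpdateRowSource.None; // required for batching

            da.UpdateBatchSize = 500;   // group this many updates per round trip

            DataTable dt = new DataTable();
            da.Fill(dt);

            foreach (DataRow row in dt.Rows)
                row["Status"] = "Processed";

            da.Update(dt);              // changed rows are sent in batches of 500
        }
    }
}

The point of UpdateBatchSize is exactly the one John quotes from the blog: fewer round trips, because many UPDATE commands travel in one network package.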
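
[Editor's note: a second sketch, of the alternative Shamil describes - a SqlDataReader over a page of rows fetched with SQL Server 2005's ROW_NUMBER() paging, with the updates sent back as one custom-built batch. For brevity this uses an inline SQL batch rather than the temp SPs Shamil mentions; table and column names are again illustrative assumptions.]

// DataReader + ROW_NUMBER() paging + one custom batched UPDATE per page (sketch).
using System;
using System.Data.SqlClient;
using System.Text;

class ReaderBatchSketch
{
    static void Main()
    {
        string connStr = "Data Source=.;Initial Catalog=TestDb;Integrated Security=SSPI;";
        const int pageSize = 1000;
        int pageNo = 1;   // first page; real code would loop over all pages

        string pageSql =
            "SELECT Id, Status FROM " +
            "  (SELECT Id, Status, ROW_NUMBER() OVER (ORDER BY Id) AS RowNo " +
            "   FROM dbo.Accounts) t " +
            "WHERE t.RowNo BETWEEN @first AND @last";

        using (SqlConnection cn = new SqlConnection(connStr))
        {
            cn.Open();

            StringBuilder batch = new StringBuilder();
            using (SqlCommand cmd = new SqlCommand(pageSql, cn))
            {
                cmd.Parameters.AddWithValue("@first", (pageNo - 1) * pageSize + 1);
                cmd.Parameters.AddWithValue("@last", pageNo * pageSize);
                using (SqlDataReader rdr = cmd.ExecuteReader())
                {
                    while (rdr.Read())
                    {
                        int id = rdr.GetInt32(0);
                        // one UPDATE per row; all are sent together below
                        batch.AppendFormat(
                            "UPDATE dbo.Accounts SET Status = 'Processed' WHERE Id = {0};", id);
                        batch.AppendLine();
                    }
                }
            }

            if (batch.Length > 0)
                using (SqlCommand upd = new SqlCommand(batch.ToString(), cn))
                    upd.ExecuteNonQuery();   // one round trip for the whole page
        }
    }
}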