[dba-VB] ADO.Net

jwcolby jwcolby at colbyconsulting.com
Sat Apr 26 19:45:47 CDT 2008


This is the blog I mentioned:

http://blogs.msdn.com/dataaccess/archive/2005/05/19/420065.aspx
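
In case it helps to see what I am talking about, here is roughly what 
that batch update looks like in code.  This is just a sketch - the 
connection string, table and column names (TheBigDB, dbo.tblNames, PK, 
FirstName, LastName, ParsedName) are all made up, and I have not timed 
any of it:

using System;
using System.Data;
using System.Data.SqlClient;

class DataSetBatchUpdate
{
    // Sketch only: connection string and table/column names are placeholders.
    const string ConnStr =
        "Data Source=.;Initial Catalog=TheBigDB;Integrated Security=SSPI";

    static void Main()
    {
        using (SqlConnection cn = new SqlConnection(ConnStr))
        using (SqlDataAdapter da = new SqlDataAdapter(
            "SELECT PK, FirstName, LastName, ParsedName FROM dbo.tblNames", cn))
        {
            cn.Open();

            DataTable tbl = new DataTable();
            da.Fill(tbl);

            // Do the work client side - this stands in for the real name parser.
            foreach (DataRow row in tbl.Rows)
            {
                row["ParsedName"] = (row["FirstName"] + " " + row["LastName"]).Trim();
            }

            // Let ADO.Net generate the UPDATE command, then tell it not to
            // expect anything back per row (required for batching).
            SqlCommandBuilder cb = new SqlCommandBuilder(da);
            da.UpdateCommand = cb.GetUpdateCommand();
            da.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;

            // The piece the blog talks about: instead of one round trip per
            // row, group the updates into batches of 500 statements.
            da.UpdateBatchSize = 500;
            da.Update(tbl);
        }
    }
}

UpdateBatchSize is the property the blog is describing - it controls how 
many statements ADO.Net groups into a single round trip to the server.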

There are many things to consider when I program in .Net, not the least 
of which is simplicity.  If I can get a simple method that gives me 
medium efficiency, it is better (for me) than a complex method that gives 
high efficiency, simply because I can actually implement the simple 
method, whereas the complex method might never get done.

I can always come back later and recode for higher efficiency after I 
have a method working and chugging away.  As long as the program does 
not take weeks to run, it is better to spend CPU time than John Colby time.

As I mentioned, the name parser is not super fast.  I am hoping to get 
the first database update batch happening in parallel with the second 
name parsing batch.  I also hope to run this on a completely different 
computer which has more cores (and memory) so that I can use threads 
effectively.
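
Something like this is roughly what I have in mind for overlapping the 
two - again just a sketch, with the name parser and the actual database 
update stubbed out as placeholders:

using System;
using System.Collections.Generic;
using System.Threading;

class PipelineSketch
{
    static void Main()
    {
        List<string> parsedBatch = ParseNextBatch();       // parse batch 1
        while (parsedBatch.Count > 0)
        {
            // Push the batch we just parsed to SQL Server on a worker thread...
            List<string> toUpdate = parsedBatch;
            Thread updater = new Thread(delegate() { UpdateDatabase(toUpdate); });
            updater.Start();

            // ...while this thread parses the next batch in parallel.
            parsedBatch = ParseNextBatch();

            // Wait for the previous update before starting the next one.
            updater.Join();
        }
    }

    // Placeholders for the real name parser and the real batch update code.
    static int batchNo = 0;

    static List<string> ParseNextBatch()
    {
        batchNo++;
        List<string> batch = new List<string>();
        if (batchNo <= 3)                      // pretend there are three batches
            for (int i = 0; i < 5; i++) batch.Add("name " + batchNo + "-" + i);
        return batch;
    }

    static void UpdateDatabase(List<string> batch)
    {
        Thread.Sleep(500);                     // stand-in for the real update
        Console.WriteLine("updated {0} rows", batch.Count);
    }
}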

Which is not to say I am not interested in the most efficient method, 
simply that I will likely be unable to code it very quickly.

Shamil Salakhetdinov wrote:
> 
> <<<
> I am definitely interested in doing strongly typed recordsets though.
> >>>
> John,
> 
> Those strongly typed recordsets impose a big overhead - you can dig into
> the code generated (by VS) for them...
> 
> ...as for batch updates "automagically" generated by ADO.Net - you can use
> SQL profiler to see what happens "under the hood"...
> 
> ...I'd bet that if you generate batch updates with custom code into temp SPs,
> and use a DataReader to get the data to build those batches, this approach
> will result in much faster data processing, and you will have many options
> to optimize it even more...
> 
> ...just my opinion, but as you can find, ADO.NET typed recordsets are not
> recommended for use in ASP.Net apps - have a look at e.g. "ASP.NET Website
> Programming: Problem - Design - Solution, C# Edition"
> by Marco Bellinaso....
> 
> --
> Shamil
> 
> -----Original Message-----
> From: dba-vb-bounces at databaseadvisors.com
> [mailto:dba-vb-bounces at databaseadvisors.com] On Behalf Of jwcolby
> Sent: Sunday, April 27, 2008 3:32 AM
> To: Discussion concerning Visual Basic and related programming issues.
> Subject: Re: [dba-VB] ADO.Net
> 
> Thanks Shamil,
> 
> Unfortunately, unless you are dealing with 80-million-row, 200-field 
> tables, any timing comparisons are going to be suspect.
> 
> I considered doing (and may still do) a record-by-record update using 
> either a stored procedure or dynamic SQL right from VB.Net.  Then I read 
> an article (blog) that essentially said that the batch update from an 
> ADO dataset really does work as you would hope it would, and this in 
> a blog from a member of Microsoft's ADO.Net dev team.
> 
> He claimed that either of the other methods incurs network round-trip 
> overhead, since each update (when done singly) requires sending the 
> command as well as getting a result back, whereas if you use a batch it 
> groups all of the outgoing commands, and then all of the results coming 
> back, into big packages.
> 
> Understand that I do not know enough to even comment on the validity of 
> his argument; however, he said to definitely try it: set the batch size 
> and then let ADO just send batches of updates.
> 
> Given that the coding effort to do that is smaller, I figured I would at 
> least get that running.
> 
> I am definitely interested in doing strongly typed recordsets though.
> 
> Shamil Salakhetdinov wrote:
>> Hi John,
>>
>> Try to use SqlDataReader and custom classes as in the sample code below
>> (sorry, C# is used - part of the real pleasure of using C# comes from the
>> fact that you can copy and paste code and not care about line wraps,
>> because even code "screwed up by e-mail" should still compile)....
>>
>> ...using ADO.NET datasets and bulk update will in any case result in a
>> series of updates on the SQL Server side; therefore, custom on-the-fly
>> built SQL updates or calls to update SPs should work as quickly as
>> ADO.NET DataSets' batch update...
>>
>> ...etc...
>>
>> ...as you can see, my sample updated 25,000 records in ~10 seconds - and
>> this is only for starters - if that approach works reasonably well for
>> you, then the next step could be to introduce multi-threading, etc...
>>
>> ...note also how MS SQL 2005's paging feature is used to get batches of
>> records...
> 
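
For my own notes, this is roughly how I read Shamil's suggestion: page 
through the table with SQL 2005's ROW_NUMBER(), pull each page with a 
SqlDataReader, and build the UPDATE statements myself so that a whole 
batch goes to the server in one round trip.  I have skipped the temp SP 
part and just concatenated the UPDATEs; the connection string, table and 
column names are invented, and none of this is tested:

using System;
using System.Data.SqlClient;
using System.Text;

class HandBuiltBatches
{
    // Sketch only: connection string, table and column names are invented.
    const string ConnStr =
        "Data Source=.;Initial Catalog=TheBigDB;Integrated Security=SSPI";
    const int BatchSize = 1000;

    static void Main()
    {
        using (SqlConnection cn = new SqlConnection(ConnStr))
        {
            cn.Open();
            int start = 1;
            while (true)
            {
                // SQL 2005 paging: ROW_NUMBER() lets us walk the table in
                // fixed-size chunks ordered by the primary key.
                string pageSql =
                    "SELECT PK, FirstName, LastName FROM " +
                    "(SELECT PK, FirstName, LastName, " +
                    "        ROW_NUMBER() OVER (ORDER BY PK) AS rn " +
                    "   FROM dbo.tblNames) t " +
                    "WHERE rn BETWEEN @start AND @end";

                StringBuilder batch = new StringBuilder();
                int rows = 0;

                using (SqlCommand cmd = new SqlCommand(pageSql, cn))
                {
                    cmd.Parameters.AddWithValue("@start", start);
                    cmd.Parameters.AddWithValue("@end", start + BatchSize - 1);
                    using (SqlDataReader rdr = cmd.ExecuteReader())
                    {
                        while (rdr.Read())
                        {
                            rows++;
                            // ParseName() stands in for the real name parser;
                            // append one UPDATE per row instead of running it now.
                            string parsed = ParseName(rdr.GetString(1), rdr.GetString(2));
                            batch.AppendFormat(
                                "UPDATE dbo.tblNames SET ParsedName = '{0}' WHERE PK = {1};\r\n",
                                parsed.Replace("'", "''"), rdr.GetInt32(0));
                        }
                    }
                }

                if (rows == 0) break;   // walked past the end of the table

                // One round trip carries the whole batch of UPDATE statements.
                using (SqlCommand upd = new SqlCommand(batch.ToString(), cn))
                {
                    upd.ExecuteNonQuery();
                }
                start += BatchSize;
            }
        }
    }

    static string ParseName(string first, string last)
    {
        return (first + " " + last).Trim();   // placeholder for the real parser
    }
}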

-- 
John W. Colby
www.ColbyConsulting.com


