Max Wanadoo
max.wanadoo at gmail.com
Sat Nov 14 05:30:41 CST 2009
Very interesting, Shamil. What happens with all the other stuff that memory is taken up by, though - the OS, programs, etc. - and the physical memory he has installed in the machine? Your limits below presumably say that his physical memory is loaded to... what?

Max

-----Original Message-----
From: dba-vb-bounces at databaseadvisors.com [mailto:dba-vb-bounces at databaseadvisors.com] On Behalf Of Shamil Salakhetdinov
Sent: 14 November 2009 10:41
To: 'Discussion concerning Visual Basic and related programming issues.'
Subject: Re: [dba-VB] What to do, what to do?

Hi Robert,

Yes, DataSets are mainly targeted at working with a small number of records in memory, so the following is more of a "theoretical" calculation:

- JC has a 64-bit PC, which allows him to load a "practically unlimited" volume of data into memory: a 64-bit logical (process) address space can be as large as 16 exbibytes = 16 * 1,152,921,504,606,846,976 bytes (http://en.wikipedia.org/wiki/Exbibyte).
- The data transfer speed of 64-bit Intel processors: "the 200 MHz McKinley bus transferred 6.4 GiB/s, and the 533 MHz Montecito bus transfers 17.056 GiB/s" (http://en.wikipedia.org/wiki/IA-64#Memory_architecture).

So "theoretically", with modern speedy hard disks, JC can load 50 million records into memory within minutes - within half a minute in the near future, when large enough flash-memory disks become available and relatively inexpensive? (A back-of-the-envelope sketch is appended at the end of this post.)

To load such a large data volume it would be better to use a SqlDataReader, as it keeps loaded data in a very compact form. And a 50-million-record SqlDataReader should probably be split into several chunks (see the reader sketch appended below)...

Again, this is just a "theoretical" consideration - in practice JC can process his 50-million-record data table in chunks, as he has a hash field (tblHashPK) which can be used to load related records into memory. I mean he can split the tblHashPK table holding the hash values into several groups, e.g.:

1 - 100,000
100,001 - 200,000
...

and process each group, joining its records to the source 50 million records (see the chunking sketch appended below).

--
Shamil

-----Original Message-----
From: dba-vb-bounces at databaseadvisors.com [mailto:dba-vb-bounces at databaseadvisors.com] On Behalf Of Robert Stewart
Sent: Saturday, November 14, 2009 7:46 AM
To: dba-vb at databaseadvisors.com
Subject: Re: [dba-VB] What to do, what to do?

John,

I don't think your machine, or any for that matter, has the memory to load 50 million records, extract 40 million, transform them, and then load them back to the 50 million and update the database from the recordset. DataSets are for working with a small number of records in memory, not millions.

Robert

<<< snip >>>
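
A back-of-the-envelope sketch of the "within minutes" estimate above, in VB.NET. The average record size and disk throughput are assumed figures for illustration, not numbers from the thread:

Module LoadTimeSketch
    Sub Main()
        ' All three figures below are assumptions for illustration only.
        Const recordCount As Long = 50000000                ' the 50 million records from the thread
        Const bytesPerRecord As Long = 200                  ' assumed average row size
        Const diskBytesPerSec As Long = 100 * 1024 * 1024   ' assumed ~100 MiB/s hard disk

        Dim totalBytes As Long = recordCount * bytesPerRecord
        Dim seconds As Double = totalBytes / diskBytesPerSec   ' "/" yields a Double in VB

        Console.WriteLine("Approx. data volume: {0:N0} bytes (~10 GB)", totalBytes)
        Console.WriteLine("Approx. sequential load time: {0:N0} seconds (~1.5 minutes)", seconds)
    End Sub
End Module

Under those assumptions the raw volume is about 10 GB and the sequential read takes roughly a minute and a half, which is consistent with "within minutes" on 2009-era hardware.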
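
A minimal sketch of the SqlDataReader approach Shamil describes, assuming a SQL Server source table tblSource with Id and HashValue columns; the connection string and names are placeholders, not JC's actual schema:

Imports System.Data.SqlClient

Module ReaderSketch
    Sub StreamRows()
        ' Placeholder connection string - adjust to the real server/database.
        Dim connStr As String = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True"

        Using cn As New SqlConnection(connStr)
            cn.Open()
            Using cmd As New SqlCommand("SELECT Id, HashValue FROM tblSource", cn)
                ' A SqlDataReader is forward-only and holds little more than the
                ' current row buffer client-side, which is what keeps it compact.
                Using rdr As SqlDataReader = cmd.ExecuteReader()
                    While rdr.Read()
                        Dim id As Integer = rdr.GetInt32(0)
                        Dim hash As String = rdr.GetString(1)
                        ' ... transform/compare the current row here ...
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module

Because the reader is connected and forward-only, only the current row sits in client memory, which is what makes it far more compact than filling a DataSet with all 50 million rows.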
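
And a minimal sketch of the chunked processing idea. tblHashPK comes from the thread; the HashID and HashValue columns, tblSource, and the connection string are illustrative assumptions:

Imports System.Data.SqlClient

Module ChunkSketch
    Sub ProcessInChunks()
        Dim connStr As String = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True"
        Const chunkSize As Integer = 100000          ' matches the 100,000-wide groups above
        Const maxHashId As Integer = 50000000        ' roughly one id per source record

        Using cn As New SqlConnection(connStr)
            cn.Open()
            For lowId As Integer = 1 To maxHashId Step chunkSize
                Dim highId As Integer = lowId + chunkSize - 1

                ' Let SQL Server do the join per hash range; only the joined
                ' rows of the current range are streamed to the client.
                Dim sql As String = _
                    "SELECT s.* FROM tblSource AS s " & _
                    "INNER JOIN tblHashPK AS h ON h.HashValue = s.HashValue " & _
                    "WHERE h.HashID BETWEEN @lo AND @hi"

                Using cmd As New SqlCommand(sql, cn)
                    cmd.Parameters.AddWithValue("@lo", lowId)
                    cmd.Parameters.AddWithValue("@hi", highId)
                    Using rdr As SqlDataReader = cmd.ExecuteReader()
                        While rdr.Read()
                            ' ... process the joined rows for this range ...
                        End While
                    End Using
                End Using
            Next
        End Using
    End Sub
End Module

The chunk width (100,000 here, to match the groups above) can be tuned so that each joined batch fits comfortably in memory.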