[AccessD] Every 100th record

Arthur Fuller artful at rogers.com
Tue Aug 31 18:32:55 CDT 2004


Just to put things in perspective, JC, the first client of the people
who developed MySQL had 60M rows in their principal table, and there
are lots of apps way bigger than that. I once had a client that was
adding 10M rows per month to the table of concern (an app recording
seismic activity from several hundred meters). I must caution you
against using the term VLDB (very large database) as loosely as you
have been. You don't know the meaning of VLDB -- not yet, at least --
though you're beginning to appreciate the turf. I once bid on a project
that had 100M rows, each containing a graphic file. Not that size is
everything, but IMO a VLDB comprises at least a TB, and often many
hundreds of TBs.

I just got a contract with a company using MySQL whose test database's
most important table comprises 100M rows, and they expect their clients
to have ten times as many. My job is to optimize the queries.
Fortunately, I can assume any hardware I deem necessary. They are after
sub-second retrievals against 1B rows, with maybe 1,000 users. Life's a
beach and then you drown. I don't know if I can deliver what they want,
but what I can deliver is benchmarks against the various DBs that I'm
comfortable with -- SQL 2000, Oracle, MySQL and DB/2. I figure that if
none of them can do it, I'm off the hook :)

The difficult part of this new assignment is that there's no way I can
duplicate the hardware required to emulate the production system, so I
have to assume that benchmarks on my local machine will hold up in a
load-balanced 100-server environment -- at least until I have something
worthy of installing, at which point I can test it in that environment.

I sympathize and empathize with your situation, JC. It's amazing how
many of our tried-and-true solutions go right out the window when you
escalate the number of rows to 100M -- and then factor in multiple
joins. Stuff that looks spectacular with only 1M rows suddenly sucks
big-time when applied to 100M rows.

Arthur

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of John W. Colby
Sent: Tuesday, August 31, 2004 7:13 AM
To: 'Access Developers discussion and problem solving'
Subject: RE: [AccessD] Every 100th record


Paul,

In fact I am trying to make this run on my home system, which is part
of the problem.  This week I am playing "stay-at-home dad" as my wife
starts the school year and has all those first-week teacher meetings
and training.

I have never come even close to a db of this size, and it has
definitely been a learning experience.  Here's hoping I survive.

John W. Colby
www.ColbyConsulting.com 

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Paul Rodgers
Sent: Tuesday, August 31, 2004 3:49 AM
To: 'Access Developers discussion and problem solving'
Subject: RE: [AccessD] Every 100th record


65 million! What an amazing world you work in. Is there ever time in
the week to pop home for an hour?
Cheers paul 

-----Original Message-----
From: John W. Colby [mailto:jwcolby at colbyconsulting.com]
Sent: 27 August 2004 16:39
To: 'Access Developers discussion and problem solving'
Subject: RE: [AccessD] Every 100th record


Gustav,

I am working on a SQL Server database of about 65 million records.  We
need to pull a subset of those for doing counts of data in specific
fields.  Trying to do that analysis on the entire 65 million records
just won't work, at least not in anything close to real time.  Thus we
literally want to pull every Nth record.  Pulling every 100th record
into a table would give a sampling of 650K records to run the analysis
on.  That still won't be lightning fast, but it is at least doable.
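
One way to pull every Nth row on SQL 2000 (which has no built-in
row-numbering function) is to number the rows into a work table with
IDENTITY() and then keep every 100th one.  A sketch only -- tblData and
ID below are stand-ins for the real table and key field:

    -- Number the rows into a sample table via SELECT ... INTO.
    SELECT IDENTITY(int, 1, 1) AS RowNum,
           ID                  -- plus whatever fields the counts need
    INTO   tblSample
    FROM   tblData
    ORDER  BY ID;              -- caveat: SQL 2000 does not guarantee that
                               -- IDENTITY values follow this ORDER BY

    -- Keep every 100th row and discard the rest.
    DELETE FROM tblSample
    WHERE  RowNum % 100 <> 0;

Out of 65M rows, roughly 650K would survive.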

John W. Colby
www.ColbyConsulting.com 

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Gustav Brock
Sent: Friday, August 27, 2004 11:22 AM
To: Access Developers discussion and problem solving
Subject: Re: [AccessD] Every 100th record


Hi John


> Does anyone have a strategy for pulling every Nth record?  My client
> wants to pull every 100th record into a dataset for analysis, to speed
> things up I am guessing.

To speed up what? Analysis on a sample only and not on the full set?

If so, you could select on a random number between 1 and 100 equalling
1 -- that keeps about 1 row in 100.
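
In T-SQL, that random-sample idea might look something like the sketch
below (tblData is a stand-in for the real table; NEWID() yields a fresh
value for each row, so roughly 1 row in 100 passes the filter):

    SELECT *
    INTO   tblRandomSample
    FROM   tblData
    WHERE  ABS(CHECKSUM(NEWID())) % 100 = 0;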

/gustav


-- 
_______________________________________________
AccessD mailing list
AccessD at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/accessd
Website: http://www.databaseadvisors.com



