[AccessD] Every 100th record

John W. Colby jwcolby at colbyconsulting.com
Tue Aug 31 19:25:07 CDT 2004


In fact the client has another database that has 125 million addresses,
another that has 14 million more people, and another that handles all their
sports mailings.  They would like to merge them all.  I just bought a 3GHz
Socket 754 Athlon 64 which I am loading with Win2K and SQL Server tonight.
I can only pray that this gives me SOMETHING in the way of a speedup over
my old AMD Athlon 2500.  I have to examine my options, down to splitting up
the database and having different machines process pieces.  I also have to
learn to tune SQL Server.  

Since I am starting from "know absolutely nothing" it shouldn't be too hard
to get better results over time.  ;-)
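
For anyone curious, one way to pull every 100th record would be something
along these lines -- table and column names here are made up, and it assumes
the table carries a dense sequential ID (an identity column with no gaps):

  SELECT AddressID, FirstName, LastName, Zip
  FROM dbo.tblAddresses       -- hypothetical 65M-row address table
  WHERE AddressID % 100 = 0   -- keep one row out of every hundred
  ORDER BY AddressID;

If the ID has gaps, the sample drifts away from a true 1-in-100, so on a
table this size it may be safer to SELECT INTO a work table with a fresh
identity column first and sample from that.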

John W. Colby
www.ColbyConsulting.com 

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller
Sent: Tuesday, August 31, 2004 7:33 PM
To: 'Access Developers discussion and problem solving'
Subject: RE: [AccessD] Every 100th record


Just to put things in perspective, JC, the first client of the people who
developed MySQL had 60M rows in their principal table. There are lots of
apps way bigger than that. I once had a client that was adding 10M rows per
month to the table of concern (this was an app recording seismic activity
from several hundred meters). I must caution you that you should not use the
term VLDB as loosely as you have been using it. You don't know the meaning
of VLDB -- not yet at least. You're beginning to appreciate the turf,
however. Once I bid on a project that had 100M rows each containing a
graphic file. Not to say that size is everything, but IMO VLDB comprises at
least a TB, and often many hundreds of TBs.

I just got a contract with a company using MySQL whose test database's most
important table comprises 100M rows. They expect their clients to have 10x
as many rows. My job is to optimize the queries. Fortunately, I can assume
any hardware I deem necessary to do it. They are after sub-second retrieves
against 1B rows, with maybe 1000 users. Life's a beach and then you drown. I
don't know if I can deliver what they want, but what I can deliver is
benchmarks against the various DBs that I'm comfortable with -- SQL 2000,
Oracle, MySQL and DB/2. I figure that if none of them can do it, I'm off the
hook :)

The difficult part of this new assignment is that there's no way I can
duplicate the hardware resources required to emulate the required system, so
I have to assume that the benchmarks on my local system will hold up in a
load-leveling 100-server environment -- at least until I have something
worthy of installing and can then test it in that environment.

I sympathize and empathize with your situation, JC. It's amazing how many of
our tried-and-true solutions go right out the window when you escalate the
number of rows to 100M -- and then factor in multiple joins. Stuff that
looks spectacular with only 1M rows suddenly sucks big-time when applied to
100M rows.
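
Just as a toy illustration (hypothetical table and column names, and only one
of many factors), the sort of join that flies at 1M rows will crawl at 100M
unless the join column is covered by an index:

  -- Without this index the join below forces a scan of the big table.
  CREATE INDEX IX_tblOrders_PersonID ON dbo.tblOrders (PersonID);

  SELECT p.PersonID, p.LastName, COUNT(*) AS OrderCount
  FROM dbo.tblPeople p
  INNER JOIN dbo.tblOrders o ON o.PersonID = p.PersonID
  GROUP BY p.PersonID, p.LastName;

At 1M rows the optimizer can get away with almost anything; at 100M rows the
plan lives or dies on having that index in place.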

Arthur

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of John W. Colby
Sent: Tuesday, August 31, 2004 7:13 AM
To: 'Access Developers discussion and problem solving'
Subject: RE: [AccessD] Every 100th record


Paul,

In fact I am trying to make this run on my home system, which is part of the
problem.  This week I am playing "stay-at-home dad" as my wife starts the
school year and has all those first-week teacher meetings and training.

I have never come even close to a db this size and it has definitely been a
learning experience.  Here's hoping I survive.

John W. Colby
www.ColbyConsulting.com 

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Paul Rodgers
Sent: Tuesday, August 31, 2004 3:49 AM
To: 'Access Developers discussion and problem solving'
Subject: RE: [AccessD] Every 100th record


65 million! What an amazing world you work in. Is there ever time in the
week to pop home for an hour?
Cheers paul 





