John W. Colby
jwcolby at colbyconsulting.com
Tue Aug 31 06:12:30 CDT 2004
Paul,

In fact I am trying to make this run on my home system, which is part of the problem. This week I am playing "stay-at-home dad" as my wife starts the school year this week and has all those first-week teacher meetings and training. I have never come even close to a db this size, and it has definitely been a learning experience. Here's hoping I survive.

John W. Colby
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Paul Rodgers
Sent: Tuesday, August 31, 2004 3:49 AM
To: 'Access Developers discussion and problem solving'
Subject: RE: [AccessD] Every 100th record

65 million! What an amazing world you work in. Is there ever time in the week to pop home for an hour?

Cheers
paul

-----Original Message-----
From: John W. Colby [mailto:jwcolby at colbyconsulting.com]
Sent: 27 August 2004 16:39
To: 'Access Developers discussion and problem solving'
Subject: RE: [AccessD] Every 100th record

Gustav,

I am working on a SQL Server database of about 65 million records. We need to pull a subset of those to do counts of data in specific fields. Trying to do that analysis on the entire 65 million records just won't work, at least in anything close to real time. Thus we literally want to pull every Nth record. If we pulled every 100th record into a table, that would give a sampling of 650K records to run the analysis on. That still won't be lightning fast, but it is at least doable.

John W. Colby
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com
[mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Gustav Brock
Sent: Friday, August 27, 2004 11:22 AM
To: Access Developers discussion and problem solving
Subject: Re: [AccessD] Every 100th record

Hi John

> Does anyone have a strategy for pulling every Nth record? My client
> wants to pull every 100th record into a dataset for analysis, to speed
> things up I am guessing.

To speed up what? Analysis on a sample only and not on the full set? If so, you could select by "Random Between 1 To 100" = 1.

/gustav
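
A literal every-Nth pull is straightforward if the table carries a sequential integer key. The sketch below is a minimal T-SQL example, not John's actual code; the table name tblNames and the identity column ID are hypothetical, and if the ID sequence has gaps the result is still roughly a 1-in-100 sample rather than exactly every 100th row.

    -- Copy roughly every 100th record into a sample table.
    -- Assumes a sequential integer column ID (e.g. IDENTITY) on the
    -- hypothetical source table tblNames.
    SELECT *
    INTO tblSample
    FROM tblNames
    WHERE ID % 100 = 0

Counts can then be run against the 650K-row tblSample instead of the full 65 million rows.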
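
Gustav's random-sample idea can also be done server-side. One wrinkle in SQL Server is that RAND() is evaluated once per statement, not once per row, so a per-row random value needs something like NEWID(). A minimal sketch, again against the hypothetical tblNames:

    -- Pull an approximately 1% random sample.
    -- CHECKSUM(NEWID()) yields a fresh pseudo-random integer per row;
    -- ABS() folds it to non-negative before taking the modulus.
    SELECT *
    INTO tblRandomSample
    FROM tblNames
    WHERE ABS(CHECKSUM(NEWID())) % 100 = 0

Unlike the every-Nth approach, this produces a different sample of roughly 650K rows on each run, which may or may not matter for the analysis.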