[dba-Tech] Any ideas

Francisco H Tapia my.lists at verizon.net
Tue Oct 21 11:20:01 CDT 2003


Martin,
   I concur: a database would be the ideal solution, preferably a 
record-locking .mdb (Access 2000 or later), MSDE, or SQL Server if it 
is available. The problem, I'm afraid, does arise from multiple PCs 
writing at the same time. It would also be much less of a problem if 
each PC wrote to a *new* file of its own.
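
As a rough sketch of the database approach (illustrative only -- sqlite3 
here just stands in for whatever record-locking back end is actually 
available, such as MSDE or SQL Server, and the table and column names 
are my own invention, not from Martin's system):

```python
import sqlite3
from datetime import datetime

def init_db(path):
    """Create the audit table if it does not already exist."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS audit ("
        "  ip TEXT, logged_at TEXT, user_id TEXT)"
    )
    conn.commit()
    return conn

def record_heartbeat(conn, ip, user_id=None):
    """One INSERT per heartbeat line. The database engine serializes
    concurrent writers, so records are not clobbered the way they can
    be when many PCs append to one shared flat file."""
    conn.execute(
        "INSERT INTO audit (ip, logged_at, user_id) VALUES (?, ?, ?)",
        (ip, datetime.now().isoformat(), user_id),
    )
    conn.commit()
```

The point is simply that each PC issues an INSERT rather than a file 
append, and the minute-by-minute "rename the file" step becomes a 
SELECT over a time range instead.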

Martin Reid wrote:

> LOL
> 
> The "I" in this case isn't me. It's our head of IT services.
> 
> Martin
> 
> 
> ----- Original Message ----- 
> From: "Mitsules, Mark" <Mark.Mitsules at ngc.com>
> To: "'Discussion of Hardware and Software issues'"
> <dba-tech at databaseadvisors.com>
> Sent: Tuesday, October 21, 2003 3:52 PM
> Subject: RE: [dba-Tech] Any ideas
> 
> 
> 
>>LOL...I thought I was the only one allowed to be "Captain Obvious":(
>>
>>
>>
>>Mark
>>
>>
>>-----Original Message-----
>>From: John Colby [mailto:jcolby at colbyconsulting.com]
>>Sent: Tuesday, October 21, 2003 10:49 AM
>>To: Discussion of Hardware and Software issues
>>Subject: RE: [dba-Tech] Any ideas
>>
>>
>>Uhhh... use a database?  ;-)
>>
>>John W. Colby
>>www.colbyconsulting.com
>>
>>-----Original Message-----
>>From: dba-tech-bounces at databaseadvisors.com
>>[mailto:dba-tech-bounces at databaseadvisors.com]On Behalf Of Martin Reid
>>Sent: Tuesday, October 21, 2003 10:32 AM
>>To: Discussion of Hardware and Software issues
>>Subject: [dba-Tech] Any ideas
>>
>>
>>Anyone have any ideas re the following?
>>
>>I have a system whereby each PC in the SCCs sends in one short line per
>>minute to a central server. Each line is of the form IP address, time,
>>date[, user id]. The central server is only a P450 with 256Mb memory but I
>>have used a P733 with the same results.
>>
>>When a user logs in to a PC, it writes a line to the same file on the
>>central server as all the other used PCs. Each PC writes at the same
>>second each minute, but the PCs determine their second to write by
>>chance, basically. Thus the incoming data for the file is reasonably
>>well spread across 60 seconds.
>>
>>On the minute, the software on the central server renames the input file,
>>thereby causing a new one to be created with the next record sent to it.
>>The central file is held on a share to which each PC has to authenticate.
>>
>>When enough PCs are active, and I have not been able to deduce if there
>>is a threshold figure for that number, some or most of a record may be
>>lost. That can be seen from the input files.
>>
>>During stress tests, when my PC was the only system communicating with
>>the server, my PC could send in about 630 lines per minute over a period
>>of, say, an hour, and none would be lost. However, when multiple PCs
>>send in lines, the data loss may arise with 50 PCs active. The
>>difference is the number of active network connections.
>>
>>As I don't believe the data is being lost on the network (I have monitored
>>this and have not seen losses so far), it is most likely being lost
>>through the networking code/file system combination, and probably the
>>former.
>>
>>I was wondering if anyone had a better method for collecting this
>>asynchronous auditing information, one which did not lose data.


-- 
-Francisco
