Martin Reid
mwp.reid at qub.ac.uk
Tue Oct 21 10:23:11 CDT 2003
LOL The "I" in this case isnt me. Its our head of IT services. Martin ----- Original Message ----- From: "Mitsules, Mark" <Mark.Mitsules at ngc.com> To: "'Discussion of Hardware and Software issues'" <dba-tech at databaseadvisors.com> Sent: Tuesday, October 21, 2003 3:52 PM Subject: RE: [dba-Tech] Any ideas > LOL...I thought I was the only one allowed to be "Captain Obvious":( > > > > Mark > > > -----Original Message----- > From: John Colby [mailto:jcolby at colbyconsulting.com] > Sent: Tuesday, October 21, 2003 10:49 AM > To: Discussion of Hardware and Software issues > Subject: RE: [dba-Tech] Any ideas > > > Uhhh... use a database? ;-) > > John W. Colby > www.colbyconsulting.com > > -----Original Message----- > From: dba-tech-bounces at databaseadvisors.com > [mailto:dba-tech-bounces at databaseadvisors.com]On Behalf Of Martin Reid > Sent: Tuesday, October 21, 2003 10:32 AM > To: Discussion of Hardware and Software issues > Subject: [dba-Tech] Any ideas > > > Anyone have any ideas re the following? > > I have a system whereby each PC in the SCCs sends in one short line per > minute to a central server. Each line is of the form IP address, time, > date[, user id]. The central server is only a P450 with 256Mb memory but I > have used a P733 with the same results. > > When a user logs in to a PC, it writes a line to the same file on the > central server as all the other used PCs. Each PC writes at the same second > each minute, but the PCs determine their second to write by chance, > basically. Thus the incoming data for the file is reasonably well spread > across 60 seconds. > > On the minute, the software on the central server renames the input file, > thereby causing a new one to be created with the next record sent to it. The > central file is held on a share to which each PC has to authenticate. > > When enough PCs are active, and I have not been able to deduce if there is a > threshold figure for that number, some or most of a record may be lost. That > can be seen from the input files. > > During stress tests, when my PC was the only system communicating with the > server, My PC could send in about 630 lines per minute and none would be > lost. And this over a period of say an hour. However, when multiple PCs send > in lines, the data loss may arise with 50 PCs active. The difference is the > number of active network connections. > > As I don't believe the data is being lost on the network (I have monitored > this and have not seen losses so far), it is most likely being lost through > the networking code/file system combination, and probably the former. > > I was wondering if anyone had a better method for collecting this > asynchronous auditing information, one which did not lose data. > _______________________________________________ > dba-Tech mailing list > dba-Tech at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-tech > Website: http://www.databaseadvisors.com > > > > _______________________________________________ > dba-Tech mailing list > dba-Tech at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-tech > Website: http://www.databaseadvisors.com > _______________________________________________ > dba-Tech mailing list > dba-Tech at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-tech > Website: http://www.databaseadvisors.com >