Jim Lawrence
accessd at shaw.ca
Fri Mar 29 17:30:44 CDT 2013
Anyone who is mildly interested: <irrelevant rant on>

I have been working on the most ancient Informix database and it has been an absolute bear. Memory issues are continually biting... Below are but a few of the issues:

It has taken a series of deleting and replacing common variable memory blocks to get some decent speed, because when a module is too complex (at least as far as the core application is concerned) or a table is loaded with too many keys, it either crashes or starts writing to disk... and the system slows to a halt.

Tables being loaded just disappear... I found that a table under load can simply lose its connection to the window it was attached to, for no apparent reason, so code had to be written to check for this. It scans the table header to see whether the window's dual bytes are appropriately filled and, if not, moves to the last known window of the table in question and goes directly to the table there, thereby recovering it.

It is significantly faster to just delete network tables and replace them with blank, empty tables than to try to run a full delete of the contents... of course this is all size dependent.

There is now a universal module in place which continually monitors performance. When it drops below a certain level, the system halts the process, clears the current working common block, unloads and reloads the affected tables, runs a reduced-set requery, and then continues where it left off (a rough sketch of this loop is in the P.S. below)... It also has to take into account, if it is running across the internet, whether the connection speed is the issue or not.

Some modules have to be loaded in pieces, into high memory, so as not to over-run or overwrite the system thanks to some odd memory leaks. It resolves these problems by continually refreshing the code and variable sets.

Some of the system's primitives are (even) more flaky, so care had to be taken when using them, or methods to circumnavigate them had to be constructed. If a system module was unavoidable, it had to be run in a freshly compiled module and then immediately removed from memory.

There are many more such issues, but this is just a taste. This is all outside the client's understanding, or care for that matter, as they just want it to work, and work reliably, every time and from everywhere.

This is not the worst environment I have dealt with; some of the old chips, when working in assembler, needed to be continuously monitored for timing and stability issues. But this rates in the top ten (top five?) of ugly, considering it is all built on top of a single-user package which now can support as many users as the system's and network's resources allow. Most of the coding effort is figuring out work-arounds and not just straight coding.

Would I have started the project if I had any inkling of the issues? ...Absolutely not. <irrelevant rant off>

But the day is beautiful, so I will go for a quick walk, have another coffee and get back to it.

Jim
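
P.S. For anyone mildly curious, the "monitor, halt, clear, reload, requery, resume" loop above boils down to something like the sketch below. This is only an illustration in Python; every name, threshold, and helper in it (process_rows, fetch_reduced_set, reload_tables, measure_link_latency) is made up for the example and is not from the actual system.

import time

# Made-up thresholds for the sketch -- not values from the real system.
THROUGHPUT_FLOOR = 50.0   # rows/second below which we consider things "stalled"
LATENCY_CEILING = 0.5     # seconds of round-trip time before we blame the link

def process_rows(rows, fetch_reduced_set, reload_tables, measure_link_latency):
    """Process rows in batches, recovering whenever throughput collapses."""
    working_block = {}    # stand-in for the shared "common block" of variables
    i = 0
    while i < len(rows):
        start = time.monotonic()
        batch = rows[i:i + 100]
        for row in batch:
            working_block[row["id"]] = row   # placeholder for the real work
        elapsed = time.monotonic() - start
        throughput = len(batch) / elapsed if elapsed else float("inf")

        if throughput < THROUGHPUT_FLOOR:
            # If the network link is the bottleneck, the data is fine:
            # just accept the slowness and carry on.
            if measure_link_latency() > LATENCY_CEILING:
                i += len(batch)
                continue
            # Otherwise: halt, clear the working block, reload the affected
            # tables, requery a reduced set, and resume where we left off.
            working_block.clear()
            reload_tables()
            rows = rows[:i] + fetch_reduced_set(offset=i)
            continue   # retry from the same offset

        i += len(batch)
    return working_block

A real version would also cap the number of recovery attempts so a permanently sick table or link cannot spin forever.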