Colby, John
JColby at dispec.com
Wed Aug 25 11:28:44 CDT 2004
Francisco,

Unfortunately I am not really up to speed on SQL Server, so this is very much "learn as you go". I have always wanted a project that demanded SQL Server so I could get up to speed on it while earning a living. I am quite willing to drop my rate (or bill fewer hours than actually spent) to reflect my "on the job" learning, but to just go learn something as complex as SQL Server when the knowledge would only grow stagnant has always seemed a waste. If this client is as big as it appears, I may some day be knowledgeable on SQL Server.

And no, I haven't tuned the log file to account for the bulk inserts. At the moment each 3-million-name source file is taking about 45 minutes to pull in, which gives me time to go read the manual and do other exploring while the insert is happening.

I am working on-site at the moment, but when I get home I hope to get one of my other desktops running the BCP and use my laptop to simultaneously start querying the db. I'm not really sure that is even possible, but I would hope so. Using the server itself to do the BCP causes the server to "lock up" by putting all the processor cycles and all the memory into the process. Perhaps by having another workstation do the BCP, the server can "just be a server", offloading some of the work to the workstation and allowing another workstation (my laptop) to also sneak in some queries and stuff.

Of course, I also have to get the db compacted so that it will fit on my 120 GB Seagate external (USB) hard disk, then try to get RAID 0 running to use the two hard disks as one big 400 GB drive (I'm gonna need it!), then copy the db file back to the RAID 0 drive and pick up on the BCPs where I left off. I have to stop the import here, or the db won't fit on the Seagate external drive and I'll be unable to get the RAID 0 array working.

This has definitely been an experience and a challenge. I do love a challenge, so it's been fun so far.
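For reference, the recovery-model change Francisco suggests below is a single statement on SQL Server 2000; a sketch, with `BigDB` standing in for the actual database name:

```sql
-- Switch the database to the BULK_LOGGED recovery model so that bcp /
-- BULK INSERT loads are minimally logged, keeping the transaction log
-- from ballooning during the rollup. BigDB is a placeholder name.
ALTER DATABASE BigDB SET RECOVERY BULK_LOGGED

-- Confirm the current setting afterwards (returns FULL, BULK_LOGGED, or SIMPLE):
SELECT DATABASEPROPERTYEX('BigDB', 'Recovery')
```

Note that bcp is a client-side utility, so a load started from another workstation (e.g. `bcp BigDB.dbo.Names in names1.txt -S ServerName -T -c -b 50000`, where the table, file, and server names are placeholders) moves the file reading and parsing off the server; the inserts themselves still execute on the server, though, so the offloading is only partial.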
JWC

-----Original Message-----
From: Francisco Tapia [mailto:fhtapia at gmail.com]
Sent: Wednesday, August 25, 2004 12:07 PM
To: dba-sqlserver at databaseadvisors.com
Subject: Re: [dba-SQLServer] Big db update

On Wed, 25 Aug 2004 07:19:07 -0400, John W. Colby
<jwcolby at colbyconsulting.com> wrote:
> I looked at Shrink which will reduce the current size by 30% according to
> EM. It appears that the extra space is there for future expansion and,
> seeing as I have a slew more files to import, there is no point in doing so
> YET.
>
> I will keep you in mind for those hard questions about largish DBs!

You are becoming quite the VLDB expert :). One question I had for you: on your log file settings, do you have your db set to either BULK_LOGGED or SIMPLE? This will reduce the number of records stored in the transaction log while you are doing this "massive" rollup.

--
-Francisco
_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com