Asger Blond
ab-mi at post3.tele.dk
Thu May 8 19:59:29 CDT 2014
Hi David,

I would postpone the log shrink until after all of your batches have finished. Since your database's recovery model is set to SIMPLE, don't worry about bloating the log file: the log will be truncated after each batch (truncation meaning that space is freed inside the physical file, not that the physical file itself gets smaller). In your scenario each shrink forces a physical reduction of the log file, and the next batch then demands a physical expansion again. This causes heavy I/O and heavy file fragmentation: just don't.

/ Asger

----- Original message -----
> From: David McAfee <davidmcafee at gmail.com>
> To: Discussion concerning MS SQL Server
> <dba-sqlserver at databaseadvisors.com>
> Date: Thu, 08 May 2014 18:10
> Subject: [dba-SQLServer] Any way to help speed things up?
>
> Hi all, I've been busy working at my new job on SQL-related tasks.
>
> My current project is to import a huge (120-column) table from 710
> Access databases spread around 14 network locations into SQL Server.
>
> I've created an SSIS package that loops through a recordset of
> connection strings and database paths, importing new and changed
> records into a master load table.
>
> The SSIS package normalizes the data a bit, places it in its
> respective locations, then truncates the load table and continues
> the loop.
>
> I've altered my database and set the recovery model to SIMPLE.
>
> I run the package in batches of 10-20 mdb imports just to check
> things out.
>
> I shrink the log file after every batch (which grows pretty big).
>
> Are there any tips that can save me some time and possibly space?
>
> Thanks in advance,
> David
> _______________________________________________
> dba-SQLServer mailing list
> dba-SQLServer at databaseadvisors.com
> http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
> http://www.databaseadvisors.com
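Asger's advice can be sketched in T-SQL. This is a hedged sketch, not David's actual script: `MyDb` and `MyDb_log` are hypothetical names standing in for the real database and log file, and 512 MB is an arbitrary example target size.

```sql
-- SIMPLE recovery lets SQL Server truncate (reuse) the log at each
-- checkpoint, so the physical file stops growing once it reaches the
-- size needed by the largest single batch.
ALTER DATABASE MyDb SET RECOVERY SIMPLE;

-- ... run all SSIS import batches here, with NO shrink in between ...

-- Shrink the physical log file ONCE, after the last batch, down to a
-- reasonable working size (target is given in MB).
USE MyDb;
DBCC SHRINKFILE (MyDb_log, 512);
```

Shrinking once at the end avoids the shrink/regrow cycle described above: every in-between shrink would just be undone by the next batch's log growth, at the cost of extra I/O and file fragmentation.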