jwcolby
jwcolby at colbyconsulting.com
Wed Oct 7 09:49:36 CDT 2009
Cool, thanks James.

John W. Colby
www.ColbyConsulting.com

James Barash wrote:
> John:
> Yes, it can be done. It can help to keep the transaction log files from
> getting too large. You just need a way to determine which records have
> already been updated. The following will update 1,000,000 records per
> transaction.
>
> James Barash
>
> Set rowcount 1000000
>
> Declare @rc int
> Set @rc = 1000000
>
> While @rc = 1000000
> Begin
>
>     Begin Transaction
>
>     Update MyTable
>     Set MyField = 0
>     From MyTable
>     Where MyField <> 0
>
>     Select @rc = @@rowcount
>
>     Commit Transaction
> End
>
> -----Original Message-----
> From: dba-sqlserver-bounces at databaseadvisors.com
> [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby
> Sent: Tuesday, October 06, 2009 5:35 PM
> To: Dba-Sqlserver
> Subject: [dba-SQLServer] Transactions
>
> Is it possible to tell SQL Server to use transactions inside of a single
> query?
>
> I routinely do things like update a single field for 50 million records.
> It might be beneficial to be able to tell SQL Server to commit the
> updates every N million records. I have no idea whether it is possible,
> or if it is what the results would be, but I thought it worth asking the
> question.
>
> Can it be done? Would it make any difference?
>
> --
> John W. Colby
> www.ColbyConsulting.com
> _______________________________________________
> dba-SQLServer mailing list
> dba-SQLServer at databaseadvisors.com
> http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
> http://www.databaseadvisors.com
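[Archive note: the batching idea above can also be written without SET ROWCOUNT,
which Microsoft has documented as deprecated for INSERT/UPDATE/DELETE statements
in later SQL Server releases; UPDATE TOP (n) is the recommended form. A sketch of
the same loop, reusing the hypothetical MyTable/MyField names from the thread:

    -- Commit in bounded batches so each transaction's log impact stays small.
    -- UPDATE TOP (@n) replaces SET ROWCOUNT; the variable form requires
    -- parentheses and works on SQL Server 2005 and later.
    Declare @BatchSize int
    Declare @rc int
    Set @BatchSize = 1000000
    Set @rc = @BatchSize

    While @rc = @BatchSize
    Begin
        Begin Transaction

        Update Top (@BatchSize) MyTable
        Set MyField = 0
        Where MyField <> 0

        -- Capture the UPDATE's row count before any other statement resets it
        Select @rc = @@rowcount

        Commit Transaction
    End

As with the original, the loop exits once an iteration touches fewer rows than
the batch size, i.e. when no unmodified rows remain.]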