From serbach at new.rr.com Tue Aug 3 12:48:44 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Tue, 3 Aug 2004 12:48:44 -0500 Subject: [dba-SQLServer] Builder.com article on SQL-DMO Message-ID: <20040803124844.1902786528.serbach@new.rr.com> Arthur, Your tip in today's Builder.com on SQL-DMO is the first I've heard of it, though I see from a quick search of SQL Server Central that there's quite a bit in the discussions there. I did a little browsing of SQL-DMO in BOL. Looks like functions you'd normally use Enterprise Manager for can be handled programmatically with SQL-DMO calls, yes? Regards, Steve Erbach Scientific Marketing Neenah, WI "I will stand up and struggle, as others have, to try to get that right balance between violence, and sex, and things." - John Kerry ( http://www.abcnews.go.com/sections/WNT/US/kerry_interview_040722-3.html ) From Susan.Klos at fldoe.org Wed Aug 4 07:31:58 2004 From: Susan.Klos at fldoe.org (Klos, Susan) Date: Wed, 4 Aug 2004 08:31:58 -0400 Subject: [dba-SQLServer] formatting a number to 2 decimal places. Message-ID: <01B619CB8F6C8C478EDAC39191AEC51EE737DD@DOESEFPEML02.EUS.FLDOE.INT> I am a newbie to SQL Server and have been pretty much applying my Access knowledge of SQL to SQL Server. I have created three views. The first one finds the number of students in each school by district and school number. The second one creates a weighted school gpa for each school and then sums that, together with school enrollment, by district. The last one finds a weighted gpa for the district. The result shows up as an integer. I would like to have the result show up rounded to 2 decimals. How and at what step do I do that? Here is the code for the views:

-- Create total school enrollment by district and school
CREATE VIEW dbo.SchoolEnrollmentNumbersXDistschl
AS
SELECT TOP 100 PERCENT DistEnrl, SchlEnrl,
    DistEnrl + SchlEnrl AS distschl,
    COUNT(SID) AS TotSchlEnrl
FROM dbo.Survey3
GROUP BY DistEnrl, SchlEnrl
ORDER BY distschl

-- Create weighted school gpa, and sum weighted school gpa
-- and total school enrollment by district
CREATE VIEW dbo.WTGPAPtsXDist
AS
SELECT arm_klos.GradedSchools.distenrl, arm_klos.GradedSchools.distname,
    SUM((arm_klos.GradedSchools.gr04 - 1) * dbo.SchoolEnrollmentNumbersXDistschl.TotSchlEnrl) AS wtgpa1,
    SUM(dbo.SchoolEnrollmentNumbersXDistschl.TotSchlEnrl) AS EnrlTot
FROM arm_klos.GradedSchools
INNER JOIN dbo.SchoolEnrollmentNumbersXDistschl
    ON arm_klos.GradedSchools.distenrl = dbo.SchoolEnrollmentNumbersXDistschl.DistEnrl
    AND arm_klos.GradedSchools.schlenrl = dbo.SchoolEnrollmentNumbersXDistschl.SchlEnrl
WHERE (arm_klos.GradedSchools.SchoolType = N'elem')
GROUP BY arm_klos.GradedSchools.distenrl, arm_klos.GradedSchools.distname

-- Create district weighted GPA
CREATE VIEW dbo.DistWGPA
AS
SELECT distenrl, distname, wtgpa1 / EnrlTot AS wgpa
FROM dbo.WTGPAPtsXDist

Susan Klos Senior Database Analyst Evaluation and Reporting Florida Department of Education 850-245-0708 sc 205-0708 From stuart at lexacorp.com.pg Wed Aug 4 10:00:35 2004 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Thu, 05 Aug 2004 01:00:35 +1000 Subject: [dba-SQLServer] formatting a number to 2 decimal places.
In-Reply-To: <01B619CB8F6C8C478EDAC39191AEC51EE737DD@DOESEFPEML02.EUS.FLDOE.INT> Message-ID: <411186B3.26805.1C5AE26@lexacorp.com.pg> On 4 Aug 2004 at 8:31, Klos, Susan wrote: > > SELECT distenrl, distname, wtgpa1 / EnrlTot AS wgpa > I presume that everything up to here has been integers, so you need to convert wtgpa1 to a float before doing the division, otherwise it will use integer arithmetic. To round it to two decimal places, you also need to round the result of the division:

SELECT distenrl, distname, ROUND(CAST(wtgpa1 AS FLOAT) / EnrlTot, 2) AS wgpa

-- Lexacorp Ltd http://www.lexacorp.com.pg Information Technology Consultancy, Software Development, System Support. From Susan.Klos at fldoe.org Wed Aug 4 10:53:52 2004 From: Susan.Klos at fldoe.org (Klos, Susan) Date: Wed, 4 Aug 2004 11:53:52 -0400 Subject: [dba-SQLServer] formatting a number to 2 decimal places. Message-ID: <01B619CB8F6C8C478EDAC39191AEC51EE737E0@DOESEFPEML02.EUS.FLDOE.INT> Thank you, Stuart. That was exactly what I was looking for. I thought I had to use CAST but I hadn't seen FLOAT anywhere in my research. Susan Klos Senior Database Analyst Evaluation and Reporting Florida Department of Education 850-245-0708 sc 205-0708 -----Original Message----- From: Stuart McLachlan [mailto:stuart at lexacorp.com.pg] Sent: Wednesday, August 04, 2004 11:01 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] formatting a number to 2 decimal places. On 4 Aug 2004 at 8:31, Klos, Susan wrote: > > SELECT distenrl, distname, wtgpa1 / EnrlTot AS wgpa > I presume that everything up to here has been integers, so you need to convert wtgpa1 to a float before doing the division, otherwise it will use integer arithmetic. To round it to two decimal places, you also need to round the result of the division: SELECT distenrl, distname, ROUND(CAST(wtgpa1 AS FLOAT) / EnrlTot, 2) AS wgpa -- Lexacorp Ltd http://www.lexacorp.com.pg Information Technology Consultancy, Software Development, System Support. _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Thu Aug 5 11:14:44 2004 From: artful at rogers.com (Arthur Fuller) Date: Thu, 5 Aug 2004 12:14:44 -0400 Subject: [dba-SQLServer] Builder.com article on SQL-DMO In-Reply-To: <20040803124844.1902786528.serbach@new.rr.com> Message-ID: <009901c47b07$523fd8c0$6601a8c0@rock> It's nice to know that some of the stuff actually gets read! And once in a while, maybe even proves useful to someone. You're quite right. As I mentioned briefly in the tip, I wrote an Access app that used the DMO library to accompany another app that shipped with MSDE as its back end. Therefore we could assume that few if any users would have EM installed. The nature of the app was such that users would have frequent occasion to install an old version of the database, look at it for a while, and perhaps uninstall it. Using DMO I put together a little app that presented the list of servers in a combobox, then the list of databases on a selected server, then let the user browse to the location of the backup of interest (typically on a CD), and then install it, giving it any desired name. There were a few bumps in the road, primarily because it was my first use of the DMO library. At the end of the day, however, the most surprising thing was the small number of lines of code required to do the job. It was almost shocking.
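A brief aside on the rounding thread above: a minimal, self-contained sketch of Stuart's point, using made-up numbers rather than Susan's tables, runnable as-is in Query Analyzer:

-- Integer division truncates the fraction: this returns 2
SELECT 8 / 3 AS integer_result

-- Casting one operand to FLOAT first preserves the fraction,
-- and ROUND(..., 2) then gives 2.67
SELECT ROUND(CAST(8 AS FLOAT) / 3, 2) AS rounded_result

The same thing happens with wtgpa1 / EnrlTot: while both columns are integers, the fractional part is discarded before ROUND ever sees it.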
One little thing that tripped me up for a while was the timeout factor. I had tested it on small databases but in the field the app would sometimes time out. Once I realized what the problem was, it was a simple fix. Good luck with your DMO experiments. Arthur -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steven W. Erbach Sent: Tuesday, August 03, 2004 1:49 PM To: 'dba-sqlserver at databaseadvisors.com' Subject: [dba-SQLServer] Builder.com article on SQL-DMO Arthur, Your tip in today's Builder.com on SQL-DMO is the first I've heard of it, though I see from a quick search of SQL Server Central that there's quite a bit in the discussions there. I did a little browsing of SQL-DMO in BOL. Looks like functions you'd normally use Enterprise Manager for can be handled programmatically with SQL-DMO calls, yes? Regards, Steve Erbach Scientific Marketing Neenah, WI From serbach at new.rr.com Thu Aug 5 16:47:16 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Thu, 5 Aug 2004 16:47:16 -0500 Subject: [dba-SQLServer] Builder.com article on SQL-DMO In-Reply-To: <009901c47b07$523fd8c0$6601a8c0@rock> References: <20040803124844.1902786528.serbach@new.rr.com> <009901c47b07$523fd8c0$6601a8c0@rock> Message-ID: <20040805164716.952471058.serbach@new.rr.com> Arthur, >> It's nice to know that some of the stuff actually gets read! << Well, it's a pleasure to see things you've written out where others can see them, too. I take back the ribbing I gave you about your MySQL book. What other columns have you written for SQL Server Central or Builder.com or whomever? Regards, Steve Erbach Scientific Marketing Neenah, WI 920-969-0504 Security and Virus information: http://www.swerbach.com/security From serbach at new.rr.com Thu Aug 5 16:56:07 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Thu, 5 Aug 2004 16:56:07 -0500 Subject: [dba-SQLServer] Spell checker Message-ID: <20040805165607.2145306462.serbach@new.rr.com> Dear Group, I had a meeting today with the client that's given me the go-ahead to convert an Access FE / BE application to a web-based database. I'll be constructing it with ASP.NET, VB.NET, ADO.NET, SQL Server.NET...I mean, just plain old SQL Server 2000. This .NET stuff gets a bit overwhelming. Anyway, my client admits to having a learning disability: he never learned to spell worth a darn. He was curious to know if a spell checking capability could be built into the web front end for a SQL Server database or if some add-on was available. I haven't the first clue about this partly because I'm a good speller. I rarely use spel chekkurs. Hah! Anybody know of any product or tool available for adding spell checking to a .NET interface for a SQL Server database? Regards, Steve Erbach Neenah, WI "IBM had every chance to end the Windows monopoly with OS/2 but shot itself in the foot, ankle, shin, knee, and hip, reloading each time, before giving up." - Jerry Pournelle From stuart at lexacorp.com.pg Thu Aug 5 17:39:05 2004 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Fri, 06 Aug 2004 08:39:05 +1000 Subject: [dba-SQLServer] Spell checker In-Reply-To: <20040805165607.2145306462.serbach@new.rr.com> Message-ID: <411343A9.6628.1B31A89@lexacorp.com.pg> On 5 Aug 2004 at 16:56, Steven W. Erbach wrote: > Dear Group, > > I had a meeting today with the client that's given me the go-ahead to > convert an Access FE / BE application to a web-based database. 
I'll be > constructing it with ASP.NET, VB.NET, ADO.NET, SQL Server.NET...I > mean, just plain old SQL Server 2000. This .NET stuff gets a bit > overwhelming. > > Anyway, my client admits to having a learning disability: he never > learned to spell worth a darn. He was curious to know if a spell > checking capability could be built into the web front end for a SQL > Server database or if some add-on was available. I haven't the first > clue about this partly because I'm a good speller. I rarely use spel > chekkurs. Hah! > > Anybody know of any product or tool available for adding spell > checking to a .NET interface for a SQL Server database? > If you are using the IE engine for the client side, take a look at http://www.iespell.com/ For a full blown solution, try http://www.hotlingo.com/ http://www.xde.net/products/product_spellchecker.htm http://www.spellchecker.net/ -- Lexacorp Ltd http://www.lexacorp.com.pg Information Technology Consultancy, Software Development,System Support. From serbach at new.rr.com Thu Aug 5 22:33:39 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Thu, 5 Aug 2004 22:33:39 -0500 Subject: [dba-SQLServer] Spell checker In-Reply-To: <411343A9.6628.1B31A89@lexacorp.com.pg> References: <20040805165607.2145306462.serbach@new.rr.com> <411343A9.6628.1B31A89@lexacorp.com.pg> Message-ID: <20040805223339.183968798.serbach@new.rr.com> Stuart, Now we're talking! This puts me on the right path. Thank you. Sincerely, Steve Erbach Scientific Marketing Neenah, WI 920-969-0504 From artful at rogers.com Fri Aug 6 16:13:44 2004 From: artful at rogers.com (Arthur Fuller) Date: Fri, 6 Aug 2004 17:13:44 -0400 Subject: [dba-SQLServer] Builder.com article on SQL-DMO In-Reply-To: <20040805164716.952471058.serbach@new.rr.com> Message-ID: <01e101c47bfa$41ee6740$6601a8c0@rock> I don't know whether you can view the back-columns, but if you visit builder.com and subscribe to the SQL list, you'll receive everything hence. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steven W. Erbach Sent: Thursday, August 05, 2004 5:47 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Builder.com article on SQL-DMO Arthur, >> It's nice to know that some of the stuff actually gets read! << Well, it's a pleasure to see things you've written out where others can see them, too. I take back the ribbing I gave you about your MySQL book. What other columns have you written for SQL Server Central or Builder.com or whomever? Regards, Steve Erbach Scientific Marketing Neenah, WI 920-969-0504 Security and Virus information: http://www.swerbach.com/security _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Sat Aug 7 14:37:42 2004 From: artful at rogers.com (Arthur Fuller) Date: Sat, 7 Aug 2004 15:37:42 -0400 Subject: [dba-SQLServer] Views With Parameters ? In-Reply-To: <10485429.1090412869654.JavaMail.www@wwinf3001> Message-ID: <02b701c47cb6$01c1d9b0$6601a8c0@rock> I feel that this answer is insufficient. Almost any view can be translated into a table-UDF, in which case it can accept as many parameters as you would in practice care to pass. The (IMO) huge advantage of turning such views into UDFs as opposed to sprocs is that you can join to them in various larger queries. 
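A minimal sketch of the pattern Arthur describes, borrowing the tblPersonnel and PayrollNo names from the original question quoted below (the function name and the varchar parameter type are guesses, and this needs SQL 2000 or later):

CREATE FUNCTION dbo.fnPersonnelByPayroll (@PayrollNo varchar(20))
RETURNS TABLE
AS
RETURN
(
    SELECT *
    FROM dbo.tblPersonnel
    WHERE PayrollNo = @PayrollNo
)

-- Unlike a sproc's result set, the function can sit in a join
-- exactly as a view would:
SELECT p.*
FROM dbo.fnPersonnelByPayroll('12345') AS p
INNER JOIN dbo.tblPersonnel AS q
    ON q.PayrollNo = p.PayrollNo

The trailing join is only there to show the syntax; in practice you would join the function to whatever larger query needs the filtered rows.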
Once I discovered this (hee hee, like I discovered it! What I mean is that once I read the documentation and ran some test cases!), I became a big fan of table UDFs. Just for the record, there actually is a way to return the results of a sproc for further use, but IMO it sucks, as compared to UDFs. (OpenQuery is the way.) Arthur P.S. Forgive me. I just scrolled down to the original message and realized that you specified version 7. UDFs are not supported in 7. So the information above is useful only to those facing similar problems in SQL 2k and above. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of paul.hartland at fsmail.net Sent: Wednesday, July 21, 2004 8:28 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: RE: [dba-SQLServer] Views With Parameters ? Thought as much, thank you. Message date : Jul 21 2004, 01:22 PM >From : "Mike & Doris Manning" To : dba-sqlserver at databaseadvisors.com Copy to : Subject : RE: [dba-SQLServer] Views With Parameters ? To my knowledge, views don't take parameters...only Stored Procedures do... Doris Manning Database Administrator Hargrove Inc. www.hargroveinc.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of paul.hartland at fsmail.net Sent: Wednesday, July 21, 2004 7:59 AM To: dba-sqlserver Subject: [dba-SQLServer] Views With Parameters ? To all, Is it possible to create a parameter type view in SQL Server 7.0, which I can then call from Visual Basic 6. Something like SELECT * FROM tblPersonnel WHERE tblPersonnel.PayrollNo = ? and then in VB something like rsPerson.Open("Query '" & Me.PayrollNo & "'"),OpenSQLConn etc etc Paul Hartland _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Sat Aug 7 14:42:16 2004 From: artful at rogers.com (Arthur Fuller) Date: Sat, 7 Aug 2004 15:42:16 -0400 Subject: [dba-SQLServer] Calling Views From another Database but applying them to THIS database In-Reply-To: <10485429.1090412869654.JavaMail.www@wwinf3001> Message-ID: <02b801c47cb6$a5024ce0$6601a8c0@rock> I have been reading a thread on this subject in one of the many SQL lists I subscribe to. I scanned the recent emails and cannot find the thread here, so it may have hit me from another list. Point is, I think that I have figured out how to do it. So if the question did in fact arise on this list, would the question-poser please reply so that I can send a test query to confirm or deny that my solution works?
TIA, Arthur From mikedorism at adelphia.net Sat Aug 7 14:51:08 2004 From: mikedorism at adelphia.net (Mike & Doris Manning) Date: Sat, 7 Aug 2004 15:51:08 -0400 Subject: [dba-SQLServer] Calling Views From another Database but applyingthem to THIS database In-Reply-To: <02b801c47cb6$a5024ce0$6601a8c0@rock> Message-ID: <000201c47cb7$e1fb2ad0$9ea51643@hargrove.internal> The syntax is database.dbo.object You can do same with a table, view, sproc, or function Doris Manning Database Administrator Hargrove Inc. www.hargroveinc.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Saturday, August 07, 2004 3:42 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Calling Views From another Database but applyingthem to THIS database I have been reading a thread on this subject in one of the many SQL lists I subscribe to. I scanned the recent emails and cannot find the thread here, so it may have hit me from another list. Point is, I think that I have figured out how to do it. So if the question did in fact arise on this list, would the question-poser please reply so that I can send a test query to confirm or deny that my solution works? TIA, Arthur _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Sat Aug 7 15:02:15 2004 From: artful at rogers.com (Arthur Fuller) Date: Sat, 7 Aug 2004 16:02:15 -0400 Subject: [dba-SQLServer] Calling Views From another Database butapplyingthem to THIS database In-Reply-To: <000201c47cb7$e1fb2ad0$9ea51643@hargrove.internal> Message-ID: <02c001c47cb9$700a8c70$6601a8c0@rock> Thanks for the reply. If you have a spare moment, please verify your answer. (I think that you're not quite right, but I am most eager to stand corrected.) Copy the Northwind database. Create a view that does something interesting. Delete a few rows from the tables involved in said view. Open the original Northwind database and execute the view just created, using the syntax you suggest. Will the result set show the copy's rows or the original Northwind's rows? My experiments to date show that the returned rows are from the copied database (i.e. fewer rows), which implies that the query is executing on the database it's in rather than the database I'm in. What I want is for said query to apply to the database I'm in, not the database that the query is in. Am I making myself clear? I'm trying only to state the problem, not to be pedantic or insulting or anything else. Thanks, Arthur -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Mike & Doris Manning Sent: Saturday, August 07, 2004 3:51 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Calling Views From another Database butapplyingthem to THIS database The syntax is database.dbo.object You can do same with a table, view, sproc, or function Doris Manning Database Administrator Hargrove Inc. www.hargroveinc.com From stuart at lexacorp.com.pg Sat Aug 7 18:29:33 2004 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Sun, 08 Aug 2004 09:29:33 +1000 Subject: [dba-SQLServer] Views With Parameters ? 
In-Reply-To: <02b701c47cb6$01c1d9b0$6601a8c0@rock> References: <10485429.1090412869654.JavaMail.www@wwinf3001> Message-ID: <4115F27D.8998.C2DF9E7@lexacorp.com.pg> On 7 Aug 2004 at 15:37, Arthur Fuller wrote: > P.S. > Forgive me. I just scrolled down to the original message and realized > that you specified version 7. That wouldn't happen if people trimmed appropriately and bottom posted -- Stuart From mikedorism at adelphia.net Sun Aug 8 18:29:02 2004 From: mikedorism at adelphia.net (Mike & Doris Manning) Date: Sun, 8 Aug 2004 19:29:02 -0400 Subject: [dba-SQLServer] Calling Views From another Databasebutapplyingthem to THIS database In-Reply-To: <02c001c47cb9$700a8c70$6601a8c0@rock> Message-ID: <000201c47d9f$7d61ccd0$870aa845@hargrove.internal> The object you are calling from the other database will always run/affect the other database -- never the database you are in. You cannot apply them to your current database unless you import them into the current database (thereby changing their database reference). Doris Manning Database Administrator Hargrove Inc. www.hargroveinc.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Saturday, August 07, 2004 4:02 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Calling Views From another Databasebutapplyingthem to THIS database Thanks for the reply. If you have a spare moment, please verify your answer. (I think that you're not quite right, but I am most eager to stand corrected.) Copy the Northwind database. Create a view that does something interesting. Delete a few rows from the tables involved in said view. Open the original Northwind database and execute the view just created, using the syntax you suggest. Will the result set show the copy's rows or the original Northwind's rows? My experiments to date show that the returned rows are from the copied database (i.e. fewer rows), which implies that the query is executing on the database it's in rather than the database I'm in. What I want is for said query to apply to the database I'm in, not the database that the query is in. Am I making myself clear? I'm trying only to state the problem, not to be pedantic or insulting or anything else. Thanks, Arthur -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Mike & Doris Manning Sent: Saturday, August 07, 2004 3:51 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Calling Views From another Database butapplyingthem to THIS database The syntax is database.dbo.object You can do same with a table, view, sproc, or function Doris Manning Database Administrator Hargrove Inc. www.hargroveinc.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Mon Aug 9 10:53:47 2004 From: artful at rogers.com (Arthur Fuller) Date: Mon, 9 Aug 2004 11:53:47 -0400 Subject: [dba-SQLServer] Calling Views From anotherDatabasebutapplyingthem to THIS database In-Reply-To: <000201c47d9f$7d61ccd0$870aa845@hargrove.internal> Message-ID: <013901c47e29$0edba0d0$6601a8c0@rock> I think that you are correct, but it saddens me. 
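Doris's rule is easy to see in action. A sketch, assuming a copy of Northwind saved as NorthwindCopy and a throwaway view (both names invented for the test):

USE NorthwindCopy
GO
CREATE VIEW dbo.vwCustomerCount
AS
SELECT COUNT(*) AS CustCount FROM dbo.Customers
GO
USE Northwind
GO
-- Counts NorthwindCopy's customers, not Northwind's: the view's
-- table references were resolved in the database that owns it.
SELECT * FROM NorthwindCopy.dbo.vwCustomerCount
GO

A view's unqualified table names bind in its home database, so the three-part name changes where you call the view from, never which tables it reads.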
I can think of situations where the same query would be required in several different databases, but I don't want to have to visit every such database to update the query when an update is required. I want to do it once and move on. There's got to be a way around this! Maybe OpenQuery? I'm off on another of my unprofitable mining expeditions :) Arthur -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Mike & Doris Manning Sent: Sunday, August 08, 2004 7:29 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Calling Views From anotherDatabasebutapplyingthem to THIS database The object you are calling from the other database will always run/affect the other database -- never the database you are in. You cannot apply them to your current database unless you import them into the current database (thereby changing their database reference). Doris Manning Database Administrator Hargrove Inc. www.hargroveinc.com From serbach at new.rr.com Mon Aug 9 20:35:00 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Mon, 9 Aug 2004 20:35:00 -0500 Subject: [dba-SQLServer] Connection problems Message-ID: <20040809203500.1249955044.serbach@new.rr.com> Dear Group, I have a client whose database is out on a shared host SQL Server. I can get at the data through an ODBC connection with Access 97, or through an Access 2000/XP ADP. My problem comes when I try to use Enterprise Manager. When I enter the user name and password I get this message: ~~~~~~~~~~~~~~~ A connection could not be established to 64.158... Reason: Cannot open user default database. Login failed. Please verify SQL Server is running and check your SQL Server registration properties. ~~~~~~~~~~~~~~~ Recently the default database for my client's application was, indeed, changed on this particular SQL Server...but there doesn't seem to be a way to change the default database in Enterprise Manager. What am I missing? Sincerely, Steve Erbach Scientific Marketing Neenah, WI 920-969-0504 From tuxedo_man at hotmail.com Mon Aug 9 22:06:56 2004 From: tuxedo_man at hotmail.com (Billy Pang) Date: Tue, 10 Aug 2004 03:06:56 +0000 Subject: [dba-SQLServer] Connection problems Message-ID: You can change it from EM. From EM, go to Security --> Logins and bring up the properties of the SQL Server login in question. On the General tab, at the bottom of the page, is the "default database". Also, the sp_defaultdb stored procedure will let you change a login's default database from Query Analyzer. HTH Billy >From: "Steven W. Erbach" >Reply-To: dba-sqlserver at databaseadvisors.com >To: dba-sqlserver at databaseadvisors.com >Subject: [dba-SQLServer] Connection problems >Date: Mon, 9 Aug 2004 20:35:00 -0500 > >Dear Group, > >I have a client whose database is out on a shared host SQL Server. I can >get at the data through an ODBC connection with Access 97, or through an >Access 2000/XP ADP. My problem comes when I try to use Enterprise Manager. >When I enter the user name and password I get this message: > >~~~~~~~~~~~~~~~ >A connection could not be established to 64.158... > >Reason: Cannot open user default database. Login failed. > >Please verify SQL Server is running and check your SQL Server registration >properties.
>~~~~~~~~~~~~~~~ > >Recently the default database for my client's application was, indeed, >changed on this particular SQL Server...but there doesn't seem to be a way >to change the default database in Enterprise Manager. > >What am I missing? > >Sincerely, > >Steve Erbach >Scientific Marketing >Neenah, WI >920-969-0504 > > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > From serbach at new.rr.com Tue Aug 10 16:50:25 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Tue, 10 Aug 2004 16:50:25 -0500 Subject: [dba-SQLServer] Connection problems Message-ID: <20040810165025.2063639640.serbach@new.rr.com> Dear Group, I have a client whose database is out on a GoreNet shared host SQL Server. I can get at the data through an ODBC connection with Access 97, or through an Access 2000/XP ADP. My problem comes when I try to use Enterprise Manager. When I enter the user name and password I get this message: ~~~~~~~~~~~~~~~ A connection could not be established to blah.blah.blah.blah. Reason: Cannot open user default database. Login failed. Please verify SQL Server is running and check your SQL Server registration properties. ~~~~~~~~~~~~~~~ Recently the default database for my client's application was, indeed, changed on this particular SQL Server...but there doesn't seem to be a way to change the default database in Enterprise Manager. What am I missing? Sincerely, Steve Erbach Scientific Marketing Neenah, WI 920-969-0504 From serbach at new.rr.com Tue Aug 10 17:09:58 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Tue, 10 Aug 2004 17:09:58 -0500 Subject: [dba-SQLServer] Connection problems In-Reply-To: <20040810170040.999239896.serbach@new.rr.com> References: <20040810170040.999239896.serbach@new.rr.com> Message-ID: <20040810170958.1077256039.serbach@new.rr.com> Billy, Thanks for responding. I sent another copy of my question just moments ago having not seen any responses. Oops! >> From EM, go to Security --> Logins << Well, that stops me immediately. In SQL Server Enterprise Manager there's a window for the Console Root of my SQL Server Group. The menu for the group shows Action, View, and Tools. None of those menus have a Security option. I'm at a loss here. Steve Erbach From tuxedo_man at hotmail.com Tue Aug 10 17:57:18 2004 From: tuxedo_man at hotmail.com (Billy Pang) Date: Tue, 10 Aug 2004 22:57:18 +0000 Subject: [dba-SQLServer] Connection problems Message-ID: The "security" mentioned in the previous email is referring to one of the folders in your SQL Server registration in Enterprise Manager. The order of the folders most likely resembles this on your machine:

Databases
Data Transformation Services
Management
Replication
Security <-- you want to be here
Support Services
Meta Data Services

The "security" folder is peer to the "databases" folder. The "databases" folder is where you access all the tables. So if you can access the tables from EM, just go up one level and you should see the "security" folder.
Within the "security" folder, look for the "logins" icon. HTH Billy >From: "Steven W. Erbach" >Reply-To: dba-sqlserver at databaseadvisors.com >To: dba-sqlserver at databaseadvisors.com >Subject: RE: [dba-SQLServer] Connection problems >Date: Tue, 10 Aug 2004 17:09:58 -0500 > >Billy, > >Thanks for responding. I sent another copy of my question just moments ago >having not seen any responses. Oops! > > >> From EM, go to Security --> Logins << > >Well, that stops me immediately. In SQL Server Enterprise Manager there's a >window for the Console Root of my SQL Server Group. The menu for the group >shows Action, View, and Tools. None of those menus have a Security option. >I'm at a loss here. > >Steve Erbach > > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > From serbach at new.rr.com Wed Aug 11 09:49:54 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Wed, 11 Aug 2004 09:49:54 -0500 Subject: [dba-SQLServer] Connection problems In-Reply-To: References: Message-ID: <20040811094954.417532211.serbach@new.rr.com> Billy, >> The "security" folder is peer to the "databases" folder. The "databases" folder is where you access all the tables. << I was afraid that's what you were going to say. I cannot "see" those folders because I can't log in to the server, and that's because of the error I get regarding the default database. This is very odd since I can get it just fine through ODBC or an Access ADP. It's just in EM that I'm having the trouble. I want to check EM to see if the rights the user has include the right to modify or create sprocs. Steve Erbach From rl_stewart at highstream.net Wed Aug 11 12:21:34 2004 From: rl_stewart at highstream.net (Robert L. Stewart) Date: Wed, 11 Aug 2004 12:21:34 -0500 Subject: [dba-SQLServer] Re: Connection problems In-Reply-To: <200408111701.i7BH1KQ03344@databaseadvisors.com> Message-ID: <5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net> Delete the registration and re-register it. At 12:01 PM 8/11/2004 -0500, you wrote: >From: "Steven W. Erbach" >Subject: RE: [dba-SQLServer] Connection problems >To: dba-sqlserver at databaseadvisors.com >Message-ID: <20040811094954.417532211.serbach at new.rr.com> >Content-Type: text/plain; charset=utf-8 > >Billy, > > >> The "security" folder is peer to the "databases" folder. The > "databases" folder is where you access all the tables. << > >I was afraid that's what you were going to say. I cannot "see" those >folders because I can't log in to the server, and that's because of the >error I get regarding the default database. > >This is very odd since I can get it just fine through ODBC or an Access >ADP. It's just in EM that I'm having the trouble. > >I want to check EM to see if the rights the user has include the right to >modify or create sprocs. > >Steve Erbach From serbach at new.rr.com Wed Aug 11 13:20:33 2004 From: serbach at new.rr.com (Steven W.
Erbach) Date: Wed, 11 Aug 2004 13:20:33 -0500 Subject: [dba-SQLServer] Re: Connection problems In-Reply-To: <5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net> References: <200408111701.i7BH1KQ03344@databaseadvisors.com> <5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net> Message-ID: <20040811132033.1842960002.serbach@new.rr.com> Robert, >> Delete the registration and re-register it. << I've done that a few times already. No go. Any other ideas? "IBM had every chance to end the Windows monopoly with OS/2 but shot itself in the foot, ankle, shin, knee, and hip, reloading each time, before giving up." - Jerry Pournelle From tuxedo_man at hotmail.com Wed Aug 11 13:22:36 2004 From: tuxedo_man at hotmail.com (Billy Pang) Date: Wed, 11 Aug 2004 18:22:36 +0000 Subject: [dba-SQLServer] Connection problems Message-ID: oh ok.. I thought that it was the other way around. can you logon as the sa user? sa user should not have problem with opening default database, by definition of course. or use another security account to access EM to change the security account that is giving the "cannot open default database" error. Billy >From: "Steven W. Erbach" >Reply-To: dba-sqlserver at databaseadvisors.com >To: dba-sqlserver at databaseadvisors.com >Subject: RE: [dba-SQLServer] Connection problems >Date: Wed, 11 Aug 2004 09:49:54 -0500 > >Billy, > > >> The "security" folder is peer to the "databases" folder. The >"databases" folder is where you access all the tables. << > >I was afraid that's what you were going to say. I cannot "see" those >folders because I can't log in to the server, and that's because of the >error I get regarding the default database. > >This is very odd since I can get it just fine through ODBC or an Access >ADP. It's just in EM that I'm having the trouble. > >I want to check EM to see if the rights the user has include the right to >modify or create sprocs. > >Steve Erbach > > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > From mwp.reid at qub.ac.uk Wed Aug 11 13:26:55 2004 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Wed, 11 Aug 2004 19:26:55 +0100 Subject: [dba-SQLServer] Re: Connection problems References: <200408111701.i7BH1KQ03344@databaseadvisors.com><5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net> <20040811132033.1842960002.serbach@new.rr.com> Message-ID: <000d01c47fd0$c94b7ff0$2702a8c0@Martin> Have you changed a Windows Account or anything to do with security?? Martin ----- Original Message ----- From: "Steven W. Erbach" To: Sent: Wednesday, August 11, 2004 7:20 PM Subject: Re: [dba-SQLServer] Re: Connection problems > Robert, > > >> Delete the registration and re-register it. << > > I've done that a few times already. No go. Any other ideas? > > "IBM had every chance to end the Windows monopoly with OS/2 but shot itself in the foot, ankle, shin, knee, and hip, reloading each time, before giving up."
- Jerry Pournelle > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From cfoust at infostatsystems.com Wed Aug 11 13:46:36 2004 From: cfoust at infostatsystems.com (Charlotte Foust) Date: Wed, 11 Aug 2004 11:46:36 -0700 Subject: [dba-SQLServer] Re: Connection problems Message-ID: I ran into something like this not long ago on my laptop. I had to remove and reinstall SQL Server 2000 (and restart the machine) *6* times before I could finally register a server and create a new database. When I had hard drive problems and moved to a new hard drive and WinXP instead of Win2k, I was prepared for the worst, but SQL Server installed and cooperated beautifully. Charlotte Foust -----Original Message----- From: Steven W. Erbach [mailto:serbach at new.rr.com] Sent: Wednesday, August 11, 2004 10:21 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Re: Connection problems Robert, >> Delete the registration and re-register it. << I've done that a few times already. No go. Any other ideas? "IBM had every chance to end the Windows monopoly with OS/2 but shot itself in the foot, ankle, shin, knee, and hip, reloading each time, before giving up." - Jerry Pournelle _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From serbach at new.rr.com Wed Aug 11 14:11:16 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Wed, 11 Aug 2004 14:11:16 -0500 Subject: [dba-SQLServer] Connection problems In-Reply-To: References: Message-ID: <20040811141116.305150145.serbach@new.rr.com> Billy, >> can you logon as the sa user? << Excellent question. No, I can't, because the database is on a shared SQL server and all we have are two user logins: one of them is read-only and the other allows data changes to all tables. Using an Access ADP to get at the data I found that I'm unable to create a new sproc or to modify an existing one. That's why I want to try EM. Steve Erbach From serbach at new.rr.com Wed Aug 11 14:14:20 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Wed, 11 Aug 2004 14:14:20 -0500 Subject: [dba-SQLServer] Re: Connection problems In-Reply-To: <000d01c47fd0$c94b7ff0$2702a8c0@Martin> References: <200408111701.i7BH1KQ03344@databaseadvisors.com> <5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net> <20040811132033.1842960002.serbach@new.rr.com> <000d01c47fd0$c94b7ff0$2702a8c0@Martin> Message-ID: <20040811141420.708619281.serbach@new.rr.com> Martin, >> Have you changed a Windows Account or anything to do with security?? << No. This SQL Server is a shared server at some web host's site that my client rents. It's the back end for an on-line ASP data-entry and reporting application. Using the user account that allows full editing of data in all tables, I found that I cannot modify nor create an sproc, at least not with an Access ADP. Steve Erbach From artful at rogers.com Wed Aug 11 14:13:17 2004 From: artful at rogers.com (Arthur Fuller) Date: Wed, 11 Aug 2004 15:13:17 -0400 Subject: [dba-SQLServer] SQL install f**ked up In-Reply-To: Message-ID: <01f301c47fd7$43341740$6601a8c0@rock> I have asked this before in several lists but so far received no useful response, so I'll try again.
Can anyone provide a recipe that is guaranteed to blow away all traces of MS-SQL on my development machine? Everything used to work fine, by which I mean the server, EM, QA, Yukon beta, etc. Then something happened, no idea what, and now nothing works. I can't uninstall successfully, I can't re-install... Nada! I even booted safe and renamed the directory, then reinstalled and even that didn't work. Fortunately I have other boxes that I can use, and of course all the data is backed up safely (and now resides on said server), but that's not the issue. I want to blow away every trace of SQL on this box and then successfully re-install. Any suggestions, short of a complete reformat? TIA, Arthur From serbach at new.rr.com Wed Aug 11 14:21:43 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Wed, 11 Aug 2004 14:21:43 -0500 Subject: [dba-SQLServer] Re: Connection problems In-Reply-To: References: Message-ID: <20040811142143.741681530.serbach@new.rr.com> Charlotte, >> I ran into something like this not long ago on my laptop. I had to remove and reinstall SQL Server 2000 (and restart the machine) *6* times before I could finally register a server and create a new database. When I had hard drive problems and moved to a new hard drive and WinXP instead of Win2k, I was prepared for the worst, but SQL Server installed and cooperated beautifully. << Hmmm. I installed EM as part of the Office XP Developer's edition. I have a SQL database on my own workstation that I can, of course, log into; and I have a SQL database on CrystalTech that I can get at with my EM installation. So I've got three servers registered. It's just this one that's giving me trouble. Here's a question: Since the error message says that the default database can't be found, does that mean that the SQL Server role that includes this user account as a member is exposing this default (now non-existent) database? I have a feeling that that must be a naive question. But since I can't get at the roles or the user definitions, it doesn't matter if I'm naive or not, I guess. I find it interesting that the connection string in the Access XP or Access 2000 ADP can include the name of the desired database, but that EM (apparently) cannot. Steve Erbach From serbach at new.rr.com Wed Aug 11 14:32:28 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Wed, 11 Aug 2004 14:32:28 -0500 Subject: [dba-SQLServer] SQL install f**ked up In-Reply-To: <01f301c47fd7$43341740$6601a8c0@rock> References: <01f301c47fd7$43341740$6601a8c0@rock> Message-ID: <20040811143228.1049610184.serbach@new.rr.com> Arthur, >> Can anyone provide a recipe that is guaranteed to blow away all traces of MS-SQL on my development machine? << No recipe, but I wonder if there's an install log someplace that details all the registry entries and file copies made by the SQL Server installation program. I used Norton's Clean Sweep for a short time and it recorded exactly those things. I don't use it any more, but I believe that at least some applications keep an install log. Maybe SQL Server does.
Steve Erbach From mwp.reid at qub.ac.uk Wed Aug 11 14:36:41 2004 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Wed, 11 Aug 2004 20:36:41 +0100 Subject: [dba-SQLServer] Re: Connection problems References: <200408111701.i7BH1KQ03344@databaseadvisors.com><5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net><20040811132033.1842960002.serbach@new.rr.com><000d01c47fd0$c94b7ff0$2702a8c0@Martin> <20040811141420.708619281.serbach@new.rr.com> Message-ID: <002001c47fda$8769ea40$2702a8c0@Martin> Steven Can you use ISQL to see the database?? Then create a dbo user, assign a password/username, and try to use that. Martin ----- Original Message ----- From: "Steven W. Erbach" To: Sent: Wednesday, August 11, 2004 8:14 PM Subject: Re: [dba-SQLServer] Re: Connection problems > Martin, > > >> Have you changed a Windows Account or anything to do with security?? << > > No. This SQL Server is a shared server at some web host's site that my client rents. It's the back end for an on-line ASP data-entry and reporting application. Using the user account that allows full editing of data in all tables, I found that I cannot modify nor create an sproc, at least not with an Access ADP. > > Steve Erbach > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From Rich_Lavsa at pghcorning.com Wed Aug 11 14:43:31 2004 From: Rich_Lavsa at pghcorning.com (Lavsa, Rich) Date: Wed, 11 Aug 2004 15:43:31 -0400 Subject: [dba-SQLServer] SQL install f**ked up Message-ID: <833956F5C117124A89417638FDB11290EBD267@goexchange.pghcorning.com> We have issues like this with other software. The way we remove it, if the uninstall does not work, is to DELETE all known folders where the application lived. Second is to run a registry clean utility. The reason to DELETE the application folders first is that if you don't, a good registry cleaner will try to find where the files have moved, in which case you are back to square 1. We use Norton, which shows you what it is going to fix before it fixes it. By following this process you remove all the folders and then clean the registry entries that link back to the folders\files, thereby breaking the application and all references to it. Reboot, making sure all the SQL services are not started, and try to do a fresh install. Like I said, we use this method for a different application when it goes haywire on us; hopefully it will work for you. I would back up your machine first as I have never tried this with SQL Server and don't know the implications of using this method. IF you try it, please try it at your own risk and PLEASE back everything up JUST IN CASE. rich -----Original Message----- From: Arthur Fuller [mailto:artful at rogers.com] Sent: Wednesday, August 11, 2004 3:13 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] SQL install f**ked up I have asked this before in several lists but so far received no useful response, so I'll try again. Can anyone provide a recipe that is guaranteed to blow away all traces of MS-SQL on my development machine? Everything used to work fine, by which I mean the server, EM, QA, Yukon beta, etc. Then something happened, no idea what, and now nothing works. I can't uninstall successfully, I can't re-install... Nada! I even booted safe and renamed the directory, then reinstalled and even that didn't work.
Fortunately I have other boxes that I can use, and of course all the data is backed up safely (and now resides on said server), but that's not the issue. I want to blow away every trace of SQL on this box and then successfully re-install. Any suggestions, short of a complete reformat? TIA, Arthur _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Wed Aug 11 14:44:39 2004 From: artful at rogers.com (Arthur Fuller) Date: Wed, 11 Aug 2004 15:44:39 -0400 Subject: [dba-SQLServer] 2 Questions -- one about terminology, one about design choices In-Reply-To: <01f301c47fd7$43341740$6601a8c0@rock> Message-ID: <020301c47fdb$a443ab00$6601a8c0@rock> I must have asked various experts this question a dozen times, but it seems that wherever my brain chooses to file the answer must have been afflicted by those joints I smoked as a youth :) Actually, I have two questions here, the second dependent to some degree upon the first. Assume an Orders table. Assume several types of products/services that can be placed upon said Orders table. I.e. instead of a single OrderDetails table, there might be several such tables. A simple example might help clarify what I'm getting at...

Order 123
  FlightDetails -- departs, arrives, price, etc.
  HotelDetails -- checkin, checkout, price, etc.
  CarRentalDetails -- fromdate, todate, price, added insurance stuff, etc.
  ConcertDetails -- 2 gold tickets to Madonna + Eminem, price, row/seat numbers etc.
  ExtraDetails -- leather jacket signed by Madonna and Eminem, price, size, colour, etc.

Question 1: assuming that you have 5 such "detail" tables connected to each order, and that you model this by creating one OrderDetails table, each of whose rows points to one of the 5 tables, what is the jargon name for the OrderDetails table? There IS such a name, and this is not an uncommon modeling problem, but for the life of me, I cannot remember the name for this.

Question 2: In particularly broad strokes, I can envision this in two ways.

1. Jam the OrderDetails table with every field that any DetailType might need; create queries that expose only the fields of interest to that DetailType; then create forms based on those queries. Advantage? Everything is in one table and certain fields are common to all DetailTypes (Description, Price, ExtendedAmount, etc.). This allows easy creation of the OrderDetails subform. To edit any detail, the user might double-click on the row. The code examines the detail type and loads the appropriate edit form.

2. Keep everything in separate tables, while the OrderDetails table serves merely as a pointer to which table to look in for each row's data. Advantage? No columns are irrelevant to any particular DetailType, which enables much better table-level validation. Disadvantage? You have to find some way to aggregate all the Details from 5 tables. Union could do it, perhaps.

Opinions, anyone?
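On Question 1: the jargon Arthur is reaching for is usually supertype/subtype (some modelers say subclass tables): OrderDetails is the supertype and the five detail tables are its subtypes. A minimal sketch of option 2 in those terms, with every name invented for illustration:

CREATE TABLE dbo.OrderDetails (
    OrderDetailID int IDENTITY PRIMARY KEY,
    OrderID int NOT NULL,          -- FK to the Orders table
    DetailType char(1) NOT NULL,   -- F=flight, H=hotel, C=car, T=ticket, X=extra
    Description varchar(100) NOT NULL,
    Price money NOT NULL
)

CREATE TABLE dbo.FlightDetails (
    OrderDetailID int PRIMARY KEY
        REFERENCES dbo.OrderDetails (OrderDetailID),
    Departs datetime NOT NULL,
    Arrives datetime NOT NULL
)

-- HotelDetails, CarRentalDetails, ConcertDetails and ExtraDetails follow
-- the same shape: one row per OrderDetails row, keyed on OrderDetailID,
-- holding only the columns peculiar to that detail type.

Because the columns common to every type (Description, Price, and so on) live in the supertype, the order subform can read OrderDetails alone with no UNION; the double-click edit form joins to the one subtype table named by DetailType.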
(In my years on this list, I have noticed no dearth of opinions :) TIA, Arthur From my.lists at verizon.net Wed Aug 11 14:48:55 2004 From: my.lists at verizon.net (Francisco H Tapia) Date: Wed, 11 Aug 2004 12:48:55 -0700 Subject: [dba-SQLServer] SQL install f**ked up In-Reply-To: <01f301c47fd7$43341740$6601a8c0@rock> References: <01f301c47fd7$43341740$6601a8c0@rock> Message-ID: <411A7827.4020507@verizon.net> Arthur Fuller wrote On 8/11/2004 12:13 PM: >I have asked this before in several lists but so far received no useful >response, so I'll try again. Can anyone provide a recipe that is >guaranteed to blow away all traces of MS-SQL on my development machine? > >Everything used to work fine, by which I mean the server, EM, QA, Yukon >beta, etc. Then something happened, no idea what, and now nothing works. >I can't uninstall successfully, I can't re-install... Nada! I even >booted safe and renamed the directory, then reinstalled and even that >didn't work. > >Fortunately I have other boxes that I can use, and of course all the >data is backed up safely (and now resides on said server), but that's >not the issue. I want to blow away every trace of SQL on this box and >then successfully re-install. Any suggestions, short of a complete >reformat? > >TIA, >Arthur > > Recently I was messing around w/ my home pc and installed a hefty 250gb 8mb cache drive. It is ideal for video capture off my Sony DV camera. In the process I re-organized drives for better cooling and also messed around w/ the registry. Lo and behold, I blew up my system registry hive by my actions :) and I didn't even use a backup before I started :D, way to live dangerously. I brought back my system hive from an old backup but that had obscure traces of a previous sql 7 install and some other junk. I downloaded Registry Mechanic and that cleaned up the raw traces in the registry, but there were still issues w/ files which were available on the system but the registry recognized incorrectly due to the old restore. I downloaded this little utility file "TARS.CMD" from Michael Espinola's personal website; it toggles off all services on your pc. You can then try a re-install of Sql Server 2000 + SP3 and get your windows system working again like new. I had to do this to get my pc from SP2 to SP4 again and reload SQL Server 2000 Developer w/ SP3a :) so much for fudging the registry :D Hope this solution works for ya. btw, I have a copy of the TARS.cmd file if you want it, contact me off the list. -- -Francisco From my.lists at verizon.net Wed Aug 11 14:50:10 2004 From: my.lists at verizon.net (Francisco H Tapia) Date: Wed, 11 Aug 2004 12:50:10 -0700 Subject: [dba-SQLServer] Connection problems In-Reply-To: <20040811141116.305150145.serbach@new.rr.com> References: <20040811141116.305150145.serbach@new.rr.com> Message-ID: <411A7872.7020501@verizon.net> Steven W. Erbach wrote On 8/11/2004 12:11 PM: >Billy, > > > >>>can you logon as the sa user? << >>> >>> > >Excellent question. No, I can't, because the database is on a shared SQL server and all we have are two user logins: one of them is read-only and the other allows data changes to all tables. Using an Access ADP to get at the data I found that I'm unable to create a new sproc or to modify an existing one. That's why I want to try EM. > >Steve Erbach > > > > Sounds to me like you have other problems... If you load QA can you create and delete objects w/ the Write Permissions login?
-- -Francisco From serbach at new.rr.com Wed Aug 11 15:18:49 2004 From: serbach at new.rr.com (Steven W. Erbach) Date: Wed, 11 Aug 2004 15:18:49 -0500 Subject: [dba-SQLServer] Re: Connection problems In-Reply-To: <002001c47fda$8769ea40$2702a8c0@Martin> References: <200408111701.i7BH1KQ03344@databaseadvisors.com> <5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net> <20040811132033.1842960002.serbach@new.rr.com> <000d01c47fd0$c94b7ff0$2702a8c0@Martin> <20040811141420.708619281.serbach@new.rr.com> <002001c47fda$8769ea40$2702a8c0@Martin> Message-ID: <20040811151849.1153768750.serbach@new.rr.com> Martin, >> Can you use ISQL to see the database?? << Being unable to locate a reference to ISQL in the highly-acclaimed Harkins & Reid book, I reluctantly turned to Books Online. Using an Access 2000 ADP to determine what the connection string was, I fired up ISQL and entered the appropriate parameters, including the Initial Catalog/default database. Interestingly enough, I got a message: Cannot open user default database. Using master database instead. I then see the "1>" prompt. I haven't gone any further. Back in Access 2000 I tried once again to go into design mode for an sproc. Lo and behold, I was able to modify it in Access 2000!? This is on my own workstation. At my client's workstation last week, I couldn't get anywhere. I kept getting a message like "You may not have sufficient rights." I even tried creating a new sproc and again, lo and behold, there's the template for a new sproc. I don't get it. Hmmm. I think I'll see if I can get somewhere with Access 2000/XP rather than mess with ISQL. Thanks for pointing me in that direction, though. I may be all right now. Thanks for the help. Steve Erbach From mwp.reid at qub.ac.uk Wed Aug 11 15:24:29 2004 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Wed, 11 Aug 2004 21:24:29 +0100 Subject: [dba-SQLServer] Re: Connection problems References: <200408111701.i7BH1KQ03344@databaseadvisors.com><5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net><20040811132033.1842960002.serbach@new.rr.com><000d01c47fd0$c94b7ff0$2702a8c0@Martin><20040811141420.708619281.serbach@new.rr.com><002001c47fda$8769ea40$2702a8c0@Martin> <20040811151849.1153768750.serbach@new.rr.com> Message-ID: <000b01c47fe1$34fa3b00$2702a8c0@Martin> This may be a silly point but did you check with the service provider. Is your DB actually still on line?? Martin ----- Original Message ----- From: "Steven W. Erbach" To: Sent: Wednesday, August 11, 2004 9:18 PM Subject: Re: [dba-SQLServer] Re: Connection problems > Martin, > > >> Can you use ISQL to see the database?? << > > Being unable to locate a reference to ISQL in the highly-acclaimed Harkins & Reid book, I reluctantly turned to Books Online. Using an Access 2000 ADP to determine what the connection string was, I fired up ISQL and entered the appropriate parameters, including the Initial Catalog/default database. Interestingly enough, I got a message: > > Cannot open user default database. Using master database instead. > > I then see the "1>" prompt. I haven't gone any further. > > Back in Access 2000 I tried once again to go into design mode for an sproc. Lo and behold, I was able to modify it in Access 2000!? This is on my own workstation. At my client's workstation last week, I couldn't get anywhere. I kept getting a message like "You may not have sufficient rights." > > I even tried creating a new sproc and again, lo and behold, there's the template for a new sproc. I don't get it. > > Hmmm. 
From mwp.reid at qub.ac.uk Wed Aug 11 15:24:29 2004
From: mwp.reid at qub.ac.uk (Martin Reid)
Date: Wed, 11 Aug 2004 21:24:29 +0100
Subject: [dba-SQLServer] Re: Connection problems
References: <200408111701.i7BH1KQ03344@databaseadvisors.com><5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net><20040811132033.1842960002.serbach@new.rr.com><000d01c47fd0$c94b7ff0$2702a8c0@Martin><20040811141420.708619281.serbach@new.rr.com><002001c47fda$8769ea40$2702a8c0@Martin> <20040811151849.1153768750.serbach@new.rr.com>
Message-ID: <000b01c47fe1$34fa3b00$2702a8c0@Martin>

This may be a silly point, but did you check with the service provider? Is your DB actually still on line?

Martin

----- Original Message -----
From: "Steven W. Erbach"
To:
Sent: Wednesday, August 11, 2004 9:18 PM
Subject: Re: [dba-SQLServer] Re: Connection problems

> Martin,
>
> >> Can you use ISQL to see the database?? <<
[snip]

From serbach at new.rr.com Wed Aug 11 15:27:07 2004
From: serbach at new.rr.com (Steven W. Erbach)
Date: Wed, 11 Aug 2004 15:27:07 -0500
Subject: [dba-SQLServer] Connection problems
In-Reply-To: <411A7872.7020501@verizon.net>
References: <20040811141116.305150145.serbach@new.rr.com> <411A7872.7020501@verizon.net>
Message-ID: <20040811152707.69850771.serbach@new.rr.com>

Francisco,

>> If you load QA can you create and delete objects w/ the Write Permissions login? <<

Thanks for replying. When I attempt to log in to the server I get:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Unable to connect to server blah.blah.blah.blah:
Server: Msg 4064, Level 16, State 1
[Microsoft][ODBC SQL Server Driver][SQL Server]Cannot open user default database. Login failed.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Since it appears that I can now edit sprocs in Access, I'll work in that environment. Not that I wouldn't like to know what the problem is here.

Steve Erbach

From MPorter at acsalaska.com Wed Aug 11 15:30:01 2004
From: MPorter at acsalaska.com (Porter, Mark)
Date: Wed, 11 Aug 2004 12:30:01 -0800
Subject: [dba-SQLServer] 2 Questions -- one about terminology, one about design choices
Message-ID: <635B80FE6C7D5A409586A6A110D97D170E4C00@ACSANCHOR.corp.acsalaska.com>

Could you abstract the data?

example structure like:

Product Type (Flight, Hotel, etc.)
Product FromDate
Product ToDate
Product Cost
Product Tax
etc.

> -----Original Message-----
> From: dba-sqlserver-bounces at databaseadvisors.com
> [mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of Arthur Fuller
> Sent: Wednesday, August 11, 2004 11:45 AM
> To: dba-sqlserver at databaseadvisors.com
> Subject: [dba-SQLServer] 2 Questions -- one about terminology, one about design choices
>
> I must have asked various experts this question a dozen times, but it
> seems that wherever my brain chooses to file the answer must have been
> afflicted by those joints I smoked as a youth :)
>
> Actually, I have two questions here, the second dependent to some degree
> upon the first.
>
> Assume an Orders table. Assume several types of products/services that
> can be placed upon said Orders table. I.e. instead of a single
> OrderDetails table, there might be several such tables. A simple example
> might help clarify what I'm getting at...
>
> Order 123
>
> FlightDetails -- departs, arrives, price, etc.
> HotelDetails -- checkin, checkout, price, etc.
> CarRentalDetails -- fromdate, todate, price, added insurance stuff, etc.
> ConcertDetails -- 2 gold tickets to Madonna + Eminem, price, row/seat numbers etc.
> ExtraDetails -- leather jacket signed by Madonna and Eminem, price, size, colour, etc.
>
> Question 1: assuming that you have 5 such "detail" tables connected to
> each order, and that you model this by creating one OrderDetails table
> each of whose rows point to one of the 5 tables, what is the jargon name
> for the OrderDetails table? There IS such a name, and this is not an
> uncommon modeling problem, but for the life of me, I cannot remember the
> name for this.
> Question 2: In particularly broad strokes, I can envision this in two
> ways.
>
> 1. Jam the OrderDetails table with every field that any DetailType might
> need; create queries that expose only the fields of interest to that
> DetailType; then create forms based on those queries. Advantage?
> Everything is in one table and certain fields are common to all
> DetailTypes (Description, Price, ExtendedAmount, etc.). This allows easy
> creation of the OrderDetails subform. To edit any detail, the user might
> double-click on the row. The code examines the detail type and loads the
> appropriate edit form.
>
> 2. Keep everything in separate tables, while the OrderDetails table
> serves merely as a pointer to which table to look in for each row's
> data. Advantage? No columns are irrelevant to any particular DetailType,
> which enables much better table-level validation. Disadvantage? You have
> to find some way to aggregate all the Details from 5 tables. Union could
> do it, perhaps.
>
> Opinions, anyone?
>
> (In my years on this list, I have noticed no dearth of opinions :)
>
> TIA,
> Arthur
>
> _______________________________________________
> dba-SQLServer mailing list
> dba-SQLServer at databaseadvisors.com
> http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
> http://www.databaseadvisors.com

***********************************************************************************
11/8/2004

This transmittal may contain confidential information intended solely for the addressee. If you are not the intended recipient, you are hereby notified that you have received this transmittal in error; any review, dissemination, distribution or copying of this transmittal is strictly prohibited. If you have received this communication in error, please notify us immediately by reply or by telephone (collect at 907-564-1000) and ask to speak with the message sender. In addition, please immediately delete this message and all attachments. Thank you. ACS

From tuxedo_man at hotmail.com Wed Aug 11 15:41:24 2004
From: tuxedo_man at hotmail.com (Billy Pang)
Date: Wed, 11 Aug 2004 20:41:24 +0000
Subject: [dba-SQLServer] Re: Connection problems
Message-ID:

Do you know if anyone removed any permissions for the security accounts you are using to access the target database? Sometimes when I get the "cannot open default database" error, it can mean one of two things:

* I no longer have access to the default database
* someone changed the default database to another database which I do not have access to

If it is the first reason, then look for a way to restore database access to my security account. If it is the second reason, then change the default database of my security account to what it was originally.

HTH
Billy
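For Billy's second case, the reset can be sketched in one line of SQL 2000 admin T-SQL; the login and database names here are hypothetical, and you need securityadmin rights (or better) to run it:

-- Point the login's default database back at one it can open.
EXEC sp_defaultdb @loginame = 'writeuser', @defdb = 'MyCatalog'
-- For the first case, re-grant access inside the target database:
EXEC MyCatalog..sp_grantdbaccess 'writeuser'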
>From: "Steven W. Erbach"
>Reply-To: dba-sqlserver at databaseadvisors.com
>To: dba-sqlserver at databaseadvisors.com
>Subject: Re: [dba-SQLServer] Re: Connection problems
>Date: Wed, 11 Aug 2004 15:18:49 -0500
>
>Martin,
>
> >> Can you use ISQL to see the database?? <<
[snip]

_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com

From serbach at new.rr.com Wed Aug 11 15:49:02 2004
From: serbach at new.rr.com (Steven W. Erbach)
Date: Wed, 11 Aug 2004 15:49:02 -0500
Subject: [dba-SQLServer] Re: Connection problems
In-Reply-To: <000b01c47fe1$34fa3b00$2702a8c0@Martin>
References: <200408111701.i7BH1KQ03344@databaseadvisors.com> <5.1.0.14.2.20040811122102.0139c140@pop3.highstream.net> <20040811132033.1842960002.serbach@new.rr.com> <000d01c47fd0$c94b7ff0$2702a8c0@Martin> <20040811141420.708619281.serbach@new.rr.com> <002001c47fda$8769ea40$2702a8c0@Martin> <20040811151849.1153768750.serbach@new.rr.com> <000b01c47fe1$34fa3b00$2702a8c0@Martin>
Message-ID: <20040811154902.1917703041.serbach@new.rr.com>

Martin,

>> This may be a silly point but did you check with the service provider. Is your DB actually still on line?? <<

Well, since I can get to it with an Access ADP or through an ODBC connection, yes. Since it's you, Martin, I wouldn't presume to term your questions silly.

Steve Erbach

From Susan.Klos at fldoe.org Wed Aug 11 16:12:44 2004
From: Susan.Klos at fldoe.org (Klos, Susan)
Date: Wed, 11 Aug 2004 17:12:44 -0400
Subject: [dba-SQLServer] crosstabs in sql
Message-ID: <01B619CB8F6C8C478EDAC39191AEC51EE73824@DOESEFPEML02.EUS.FLDOE.INT>

How do you create a crosstab query in SQL Server? I tried doing one in Access and copying the SQL into SQL Query Analyzer, but I can't seem to get it right.

Susan Klos
Senior Database Analyst
Evaluation and Reporting
Florida Department of Education
850-245-0708
sc 205-0708

From artful at rogers.com Wed Aug 11 16:43:15 2004
From: artful at rogers.com (Arthur Fuller)
Date: Wed, 11 Aug 2004 17:43:15 -0400
Subject: [dba-SQLServer] 2 Questions -- one about terminology, one about design choices
In-Reply-To: <635B80FE6C7D5A409586A6A110D97D170E4C00@ACSANCHOR.corp.acsalaska.com>
Message-ID: <021f01c47fec$35973d50$6601a8c0@rock>

Perhaps I mis-stated the issue, so I'll try again. SOME of the data is abstractable as you suggest. The problem is that each DetailType contains within it a bunch of columns that are irrelevant to other DetailTypes. For example, a HotelRoom detail involves CheckInDate, Duration (or CheckOutDate, whichever you prefer), Occupancy (Double, Single, Triple, Quad -- and if it's Single then there's a SingleOccupancyPremium to factor in).
A ConcertTicket might be priced in Platinum, Gold, Silver, etc. ranges, and have a row and seat number attached. A CarRental might have a Size field (compact, mid-size, SUV, whatever). And so on.

So back to the two (or more) models:

A) Jam all possible fields into a single OrderDetails table, mask the irrelevant ones using queries, and lose a lot of the built-in validation stuff that Access offers.

B) Place all the stuff common to all DetailTypes in a single OrderDetails table, which also contains a pointer to the N tables corresponding to the collection of DetailTypes. Show only the common fields in the subform; provide an Edit button or double-click or whatever that opens the form corresponding to those fields unique to each DetailType.

Does that help clarify the second question I was asking?

As to the first question, there is a term to describe a model such as this:

Orders
-= OrderDetails
=- DetailTypes
   Points to one of N tables that correspond to the collection of DetailTypes

In this model, there is a term that describes the OrderDetails table, whose basic function is to bridge the gap between Orders and the various tables corresponding to the DetailTypes. Anyone know what this term is?

TIA,
Arthur

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Porter, Mark
Sent: Wednesday, August 11, 2004 4:30 PM
To: dba-sqlserver at databaseadvisors.com
Subject: RE: [dba-SQLServer] 2 Questions -- one about terminology, one about design choices

Could you abstract the data?

example structure like:

Product Type (Flight, Hotel, etc.)
Product FromDate
Product ToDate
Product Cost
Product Tax
etc.

From artful at rogers.com Wed Aug 11 16:47:29 2004
From: artful at rogers.com (Arthur Fuller)
Date: Wed, 11 Aug 2004 17:47:29 -0400
Subject: [dba-SQLServer] crosstabs in sql
In-Reply-To: <01B619CB8F6C8C478EDAC39191AEC51EE73824@DOESEFPEML02.EUS.FLDOE.INT>
Message-ID: <022201c47fec$ccdb4cb0$6601a8c0@rock>

The basic scenario is this: for each column, use a CASE statement. Look up CASE in BOL and there are some useful examples. The basic idea is this:

CASE WHEN [somecondition] THEN [FieldOfInterest] ELSE 0 END

(that's a zero), and you add as many cases as you need. That is the thumbnail, to be sure, but that's the idea.
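To make that thumbnail concrete, here is a sketch with hypothetical table and column names; each CASE expression becomes one crosstab column, and the GROUP BY supplies the row headings:

-- Hypothetical: pivot sales so each region becomes a column.
SELECT SalesYear,
       SUM(CASE WHEN Region = 'East' THEN Amount ELSE 0 END) AS East,
       SUM(CASE WHEN Region = 'West' THEN Amount ELSE 0 END) AS West,
       SUM(Amount) AS Total
FROM dbo.Sales
GROUP BY SalesYear
ORDER BY SalesYear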
Unfortunately there is no wizard to help you do this (until SQL 2005 finally hits the streets).

Arthur

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Klos, Susan
Sent: Wednesday, August 11, 2004 5:13 PM
To: 'dba-sqlserver at databaseadvisors.com'
Subject: [dba-SQLServer] crosstabs in sql

How do you create a crosstab query in SQL Server? I tried doing one in Access and copying the SQL into SQL Query Analyzer, but I can't seem to get it right.

Susan Klos

From stuart at lexacorp.com.pg Wed Aug 11 17:30:38 2004
From: stuart at lexacorp.com.pg (Stuart McLachlan)
Date: Thu, 12 Aug 2004 08:30:38 +1000
Subject: [dba-SQLServer] 2 Questions -- one about terminology, one about design choices
In-Reply-To: <021f01c47fec$35973d50$6601a8c0@rock>
References: <635B80FE6C7D5A409586A6A110D97D170E4C00@ACSANCHOR.corp.acsalaska.com>
Message-ID: <411B2AAE.26912.20917567@lexacorp.com.pg>

On 11 Aug 2004 at 17:43, Arthur Fuller wrote:

> So back to the two (or more models):
>
> A) Jam all possible fields into a single OrderDetails table, mask the
> irrelevant ones using queries, and lose a lot of the built-in validation
> stuff that Access offers.

I don't like it. It wastes a lot of space, can run into maximum-fields-in-table problems, and worst of all -- what happens when you have one set of flight details, two sets of hotel details, and three sets of concert details all on the same order?

> B) Place all the stuff common to all DetailTypes in a single
> OrderDetails table, which also contains a pointer to the N tables
> corresponding to the collection of DetailTypes. Show only the common
> fields in the subform; provide an Edit button or double-click or
> whatever that opens the form corresponding to these fields unique to
> each DetailType.

That's the way I'd go, although I'd probably end up going for a series of details tabs rather than separate forms. Probably all I'd store in the link table is the OrderPK, Type and DetailTablesPK.

> In this model, there is a term that describes the OrderDetails table,
> whose basic function is to bridge the gap between Orders and the various
> tables corresponding to the DetailTypes. Anyone know what this term is?

I've always just called it a link table.

--
Stuart
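A sketch of the link table Stuart describes, with hypothetical names. The one relationship SQL Server cannot declare for you is the one from DetailPK to the detail tables, because its target table depends on DetailType; that check has to live in a trigger or in the application:

-- Hypothetical link (bridge) table between Orders and the detail tables.
CREATE TABLE dbo.OrderDetails (
    OrderDetailID int IDENTITY(1,1) PRIMARY KEY,
    OrderID       int NOT NULL REFERENCES dbo.Orders (OrderID),
    DetailType    char(1) NOT NULL,  -- 'F'light, 'H'otel, 'C'ar, ...
    DetailPK      int NOT NULL       -- PK of the row in the matching detail table
)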
From MPorter at acsalaska.com Wed Aug 11 17:41:50 2004
From: MPorter at acsalaska.com (Porter, Mark)
Date: Wed, 11 Aug 2004 14:41:50 -0800
Subject: [dba-SQLServer] 2 Questions -- one about terminology, one about design choices
Message-ID: <635B80FE6C7D5A409586A6A110D97D170E4C05@ACSANCHOR.corp.acsalaska.com>

Another method may be a Products table, a Parameters (Attributes) table and a Parameter Value table. i.e.

Product = Hotel Room
Parameter = Arrival Date, Departure Date
Parameter Value = Arrival Date = 1/1/04, Departure Date = 2/1/04

A master set of tables for your products, possible parameters and parameter value types, and a set for the customer order process. I see this getting dicey though, depending on the number of unique parameters or attributes per product.

Mark

> -----Original Message-----
> From: dba-sqlserver-bounces at databaseadvisors.com
> [mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of Arthur Fuller
> Sent: Wednesday, August 11, 2004 1:43 PM
> To: dba-sqlserver at databaseadvisors.com
> Subject: RE: [dba-SQLServer] 2 Questions -- one about terminology, one about design choices
>
> Perhaps I mis-stated the issue, so I'll try again. SOME of the data is
> abstractable as you suggest.
[snip]

***********************************************************************************
11/8/2004

This transmittal may contain confidential information intended solely for the addressee. If you are not the intended recipient, you are hereby notified that you have received this transmittal in error; any review, dissemination, distribution or copying of this transmittal is strictly prohibited. If you have received this communication in error, please notify us immediately by reply or by telephone (collect at 907-564-1000) and ask to speak with the message sender. In addition, please immediately delete this message and all attachments. Thank you. ACS

From jwcolby at colbyconsulting.com Thu Aug 12 22:10:54 2004
From: jwcolby at colbyconsulting.com (John W. Colby)
Date: Thu, 12 Aug 2004 23:10:54 -0400
Subject: [dba-SQLServer] Going over to the dark side
In-Reply-To: <001d01c471a1$a4739c90$1b02a8c0@MARTINREID>
Message-ID: <001301c480e3$2879a9d0$80b3fea9@ColbyM6805>

I have been contacted by what appears to be a startup marketing firm, who want to take a 64 million name database and pull it into SQL Server. They are getting the data in comma delimited format, apparently compressed, on two DVDs - totaling something like 60 gbytes of raw text data. They have never seen the data (just purchased it) but they think it is about 400-500 fields of some personal info but mostly demographic stuff. Things like "owns a boat, owns a motor home, has credit cards, ethnic, sex, income" etc. Their intention is to pull subsets of the data and sell it to other companies. This is all very vague since all I know so far is what I have gleaned from one of them in a handful of phone conversations.
They haven't seen the data, and don't know how to get it out of the DVDs, etc. I have no experience with trying to get something like that into SQL Server. I have a couple of questions.

First, if the data is comma delimited, my assumption is that the first line will be field names, followed by data. Is SQL Server (or some other big db) capable of exporting directly into a cab file or zip file? If this is two DVDs, both of which are compressed comma delimited files, how do I uncompress the data before importing it into SQL Server? They think it is 60gb. I have room for a 60gb database but not the uncompressed data as well as the database. I can of course just go buy a huge hard disk (200 gb), but I just spent a lot of time getting a mirror up, and especially for something of this nature I would want to get it on a mirrored drive. Plus they want to start looking at the data as soon as possible.

Second, is this a job for bcp? How do I get it in? Writing a VB function to parse it doesn't seem reasonable.

Third, how long is it going to take to get that much data into a SQL Server table? It apparently is a single flat file, which should translate to a single table of 400-500 fields. I assume that something like bcp would handle building the table given the field names in the first line, comma delimited? If not, how do I look at that first line so that I can go build the table "manually" in order to do the import?

Fourth, what tool would I use to query that? Access has query limits that appear to eliminate it as a tool for this, never mind the speed issues. On the other hand, if the actual number of fields exported out is small (the data exported doesn't have to contain the demographics data, just the personal data), then perhaps an ADP would allow a form with controls to select the demographics, then SQL Server directly dumps the data.

Fifth, it seems logical that I would want to index the demographics fields (several hundred fields) so that queries could efficiently pull subsets of the data. How "big" is the database going to get with indexes on 350 (or more) fields? IOW, the raw data is 65gb - turned into a properly indexed table, what does that translate into in terms of SQL database file size?

Sixth, given 350 demographic fields, wouldn't I need to pull subsets to maintain lists of all the valid values for a given field? IOW, if there is a field for ethnicity, with 15 choices, it seems I would need to run a "distinct" query against that field and build a table of all the possible values, rather than run a "distinct" on the fly to populate the combo that allows choices. Now multiply that by 350 fields. That looks like a lot of preprocessing just to get ready to start hashing the data.

Seventh, how much processor power / memory is needed to handle a database of this nature? Is this something that I could reasonably expect to buy / build?

These guys are examining their options, but three options they mentioned are:

1) Just hire me (or someone) to "take an order and produce a file, on a CD or tape". IOW, I own the server, I hold the db, and I take an order and FedEx out a CD.
2) Buy a server, stick it in their office and hire me to set it up and maintain it.
3) Hire a company out there on the internet somewhere to put the data on a server at a server farm. Then query that, somehow get the data onto a CD or a tape. Much bigger question there since I would no longer just figure it out and do it all myself.

Assuming a reasonable fee per data pull, my preferences would be ordered 1, 2, and 3.
Option 1 sets up a constant income stream, but with the issue of having to be available to build the CDs. Option 3 is just too much setup, at least initially.

Is anyone out there doing something of this nature? Any words of wisdom?

John W. Colby
www.ColbyConsulting.com

From stuart at lexacorp.com.pg Thu Aug 12 22:31:01 2004
From: stuart at lexacorp.com.pg (Stuart McLachlan)
Date: Fri, 13 Aug 2004 13:31:01 +1000
Subject: [dba-SQLServer] Going over to the dark side
In-Reply-To: <001301c480e3$2879a9d0$80b3fea9@ColbyM6805>
References: <001d01c471a1$a4739c90$1b02a8c0@MARTINREID>
Message-ID: <411CC294.18940.6AC168@lexacorp.com.pg>

On 12 Aug 2004 at 23:10, John W. Colby wrote:

> Is anyone out there doing something of this nature? Any words of wisdom?

I wouldn't touch it with a bargepole.

1. It smells like spam/scam, so I wouldn't on principle.
2. You are looking at *several* hundred gig of data and indexes, and huge processing power, to be able to pull selected data sets.

--
Stuart

From jwcolby at colbyconsulting.com Thu Aug 12 22:48:52 2004
From: jwcolby at colbyconsulting.com (John W. Colby)
Date: Thu, 12 Aug 2004 23:48:52 -0400
Subject: [dba-SQLServer] Going over to the dark side
In-Reply-To: <411CC294.18940.6AC168@lexacorp.com.pg>
Message-ID: <001401c480e8$76472ac0$80b3fea9@ColbyM6805>

I will certainly ask that question, but they seem to be selling data sets to marketing companies. The guy indicated that it was address data, not email or phone numbers. They want the data on CD/tape, FedExed to their clients. There are valid businesses doing exactly this kind of stuff, and valid plain old postal bulk mailings that would conceivably use something like this. From what I've read, spammers often don't even bother with demographics since it is just cheaper to blast out a million emails to everyone. I will not get involved with spam, but I would have to consider doing this for a marketing firm that does bulk mail. I will be asking them about "do not bother me" lists as a filter to see what they are up to.

John W. Colby
www.ColbyConsulting.com

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stuart McLachlan
Sent: Thursday, August 12, 2004 11:31 PM
To: dba-sqlserver at databaseadvisors.com
Subject: Re: [dba-SQLServer] Going over to the dark side

I wouldn't touch it with a bargepole.
[snip]

_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com

From mark at markkaren.com Thu Aug 12 22:56:51 2004
From: mark at markkaren.com (Mark Rider)
Date: Thu, 12 Aug 2004 22:56:51 -0500
Subject: [dba-SQLServer] Going over to the dark side
In-Reply-To: <001301c480e3$2879a9d0$80b3fea9@ColbyM6805>
Message-ID: <200408130357.i7D3vMQ17739@databaseadvisors.com>

John,

I cannot address all of your points, but I am working with compressed CSV files daily. They are about 100MB zipped and come out to around 20 million rows with 8 fields/row.
Without knowing exactly how the data is being given to you, I'll assume it is set up the same way as mine (bcp'd into the database on the fly, then zipped and made available to download); if so, there are some considerations you need to look at regarding the data.

All of my raw data is sent as a varchar 8000, so the DateTime field and the 5-character Name field are the same size. That can be greatly reduced by judicious use of appropriate field types in SQL Server. But it causes some interesting issues trying to get the raw data in there in the first place - there are often errors in the length of the fields and extra spaces that need to be dealt with. All of this can be done in the initial import using DTS and some careful monitoring. In one set of data I have to make 2 passes - one to get the fields into the database, and a second DTS after I filter out the crap that would cause import issues. I know the DateTime field should be a DateTime field, but I have to pull it in as a varchar to start, because it is not always date and time information in that row! Again, this should only be a one-time shot for you.

I would suggest looking at the DTS functions - you can import a CSV file and create the table(s) on the fly, including or excluding the columns that you want based on the table you are creating, so that would not be a major issue. Once the tables are created, you can go back and index them as necessary.

If you were to expect any real speed out of this, you will need a heavy-duty server. I run the data through a couple of iterations, and the most intensive one is a correlation process where I have to compare every number in a column to every other column's data. With around 800 columns holding 390 rows each, a dual processor (1.8 GHz each) 3GB system takes about 5 hours to crunch the data - and that is a dedicated SQL Server box. What you are looking at will not be as numerically intense as the calculations necessary for correlations, but the sheer size of the tables, queries and sub-queries to get what you want will be as intense in terms of processor and memory, if not more.

My suggestion is that until they can show you the data, so that you can see what the 350 demographic fields are and how they could be related (for example, if there is an "owns a boat" field, how many other boat-related demographic points are there, and can that be queried differently), you don't want to make any commitments to doing anything. A VB or Access form front end to a SQL back end would be possible, but trying to make a query that will show every 30-45 year old red-headed left-handed boat owner that does not smoke and has a MasterCard with less than 10% of their limit left on it, who is married to a blonde woman without enhancement surgery and still has all of her original teeth and likes the double-wide they live in on the shores of Lake Okeechobee when they are not vacationing at Dollywood or Branson, might be a difficult proposition. Especially when you have to include their kids in the mix.

HTH,

Mark
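Mark used DTS for this, but the same two-pass shape can be sketched in plain T-SQL; the file name, table names, and column names below are all hypothetical:

-- Pass 1: land everything in an all-varchar staging table
-- (dbo.StageRaw created beforehand, every column varchar(8000)).
BULK INSERT dbo.StageRaw
FROM 'D:\data\names.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)

-- Pass 2: filter the rows that would break the conversion,
-- then move the survivors into properly typed columns.
INSERT INTO dbo.Person (LastName, BirthDate)
SELECT LTRIM(RTRIM(LastName)), CONVERT(datetime, BirthDate)
FROM dbo.StageRaw
WHERE ISDATE(BirthDate) = 1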
-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby
Sent: Thursday, August 12, 2004 10:11 PM
To: dba-sqlserver at databaseadvisors.com
Subject: [dba-SQLServer] Going over to the dark side

I have been contacted by what appears to be a startup marketing firm, who want to take a 64 million name database and pull it into SQL Server.
[snip]
_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com

From jmoss111 at bellsouth.net Fri Aug 13 02:35:32 2004
From: jmoss111 at bellsouth.net (JMoss)
Date: Fri, 13 Aug 2004 02:35:32 -0500
Subject: [dba-SQLServer] Going over to the dark side
In-Reply-To: <001301c480e3$2879a9d0$80b3fea9@ColbyM6805>
Message-ID:

John,

I used to work for a database marketing company, and they used purchased lists only when necessary to add to a customer's small list, and usually only from reputable list sellers like Victoria's Secret, specialty magazines, mail order firms, etc. There are quite a few disreputable list vendors whose products are raw and messy, with no hygiene performed: no cleanup, names and addresses in several different formats, no state or zip info, fields transposed, commas in strings, no deduping, householding, or NCOA. These lists normally contained 20-30% dups, especially when considering householding. You might want to look at DoubleTake, Personator, RightFielder, Styleist, and Dirty Harry's Character Assassin from PeopleSmith Tools. By the way, these tools aren't cheap, and neither is having an NCOA done on the files.

If your marketer is going to sell phone lists, you are going to have to have a federal DNC and a state DNC list for each state that you sell in, and your list has to be updated quarterly, or run the risk of stiff penalties.

As far as I know, there is no direct export capability from SQL Server to a compressed file, unless you export onto a compressed volume or decompress a zip onto a compressed volume. We used DTS and/or DataJunction (now Pervasive Cosmos) to get the files into SQL 2000. You can look at the file using PFE, which should handle most any size file that you throw at it. And you had better hope the first line contains a header; mapping 500 columns without any data definition could put you in a belltower with an M16.

Problems with data could cost a lot of time. Dates, if incorrect (like 10/21/948), can kill an insert at 95% complete; other gremlins will raise their ugly little heads. For this, forget Access; use SQL Server Enterprise Manager/Query Analyzer.

We built a factor table that contained a factor key and a customer key to pull our lists. A factor is the basic info used to pull the list.
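A sketch of what that factor layout might look like; the names and types here are a hypothetical reconstruction from Jim's description and the query below, not his actual schema:

-- One row per customer per attribute that applies to them.
CREATE TABLE dbo.CustomerFactor (
    CustomerKey int NOT NULL,
    FactorKey   int NOT NULL,
    FactorDesc  varchar(100) NOT NULL,  -- e.g. 'Owns bugZapper'
    PRIMARY KEY (CustomerKey, FactorKey)
)

-- Address data kept separately, one or more rows per customer.
CREATE TABLE dbo.CustomerAddress (
    CustomerKey int NOT NULL,
    State       char(2) NOT NULL
    -- plus street, city, zip, etc.
)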
We would select like so:

select customerkey
from customerfactor
where factordesc = 'Owns bugZapper'
  and customerkey in (select customerkey
                      from customeraddress
                      where state = 'TN')

We charged $125 an hour for setup, factoring, producing data for a mailing or marketing campaign, or segmenting data. Most of the time, there was no hard product generated; the client had the ability to use the data via a browser-based interface.

Jim

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of John W. Colby
Sent: Thursday, August 12, 2004 10:11 PM
To: dba-sqlserver at databaseadvisors.com
Subject: [dba-SQLServer] Going over to the dark side

I have been contacted by what appears to be a startup marketing firm, who want to take a 64 million name database and pull it into SQL Server.
[snip]
_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com

From John.Maxwell2 at ntl.com Fri Aug 13 11:25:34 2004
From: John.Maxwell2 at ntl.com (John Maxwell @ London City)
Date: Fri, 13 Aug 2004 17:25:34 +0100
Subject: [dba-SQLServer] crosstabs in sql
Message-ID:

Hello Susan,

Afraid crosstabs are not so straightforward in SQL Server. They are reasonably straightforward to do if you are using fixed column headings, a little more involved if you want a 'dynamic crosstab'.

Example copied from the SQL Server 2000 Bible (I am a newbie with SQL Server and just happen to be reading up on crosstabs and recursive select variables, so anyone please jump in to point out any errors / inefficiencies / better ways to mimic Access crosstabs):

1) Fixed columns

Select Y,
    Sum(Case X When 'A' Then Data Else 0 End) AS A,
    Sum(Case X When 'B' Then Data Else 0 End) AS B,
    Sum(Case X When 'C' Then Data Else 0 End) AS C,
    Sum(Case X When 'D' Then Data Else 0 End) AS D,
    Sum(Data) AS Total
From RawData
Group By Y
Order By Y

2) Dynamic crosstab

Instead of 'hard coding' the columns of your crosstab, the script below determines them via a 'recursive select variable':

Use TempDB

Declare @XColumns NVarChar(1024)
Set @XColumns = ''

Select @XColumns = @XColumns + 'Sum(Case X
    When ''' + [a].[Column] + ''' Then Data
    Else 0
    End) AS ' + [a].[Column] + ', '
From (Select Distinct X AS [Column] From RawData) AS a

Set @XColumns = 'Select Y, ' + @XColumns + ' Sum(Data) AS Total From RawData Group By Y Order By Y'

EXEC sp_executesql @XColumns

I hope I have not confused things for you; if you email your Access SQL through, I'm happy to have a go at converting it if required.
Regards

john

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of Klos, Susan
Sent: 11 August 2004 22:13
To: 'dba-sqlserver at databaseadvisors.com'
Subject: [dba-SQLServer] crosstabs in sql

How do you create a crosstab query in SQL Server?
[snip]

The contents of this email and any attachments are sent for the personal attention of the addressee(s) only and may be confidential. If you are not the intended addressee, any use, disclosure or copying of this email and any attachments is unauthorised - please notify the sender by return and delete the message. Any representations or commitments expressed in this email are subject to contract.

ntl Group Limited

From artful at rogers.com Fri Aug 13 12:56:07 2004
From: artful at rogers.com (Arthur Fuller)
Date: Fri, 13 Aug 2004 13:56:07 -0400
Subject: [dba-SQLServer] Going over to the dark side
In-Reply-To: <001301c480e3$2879a9d0$80b3fea9@ColbyM6805>
Message-ID: <045501c4815e$cf816850$6601a8c0@rock>

There's a wonderful store nearby my home whose manager and employees have impressed me like few if any elsewhere. They specialize in very powerful equipment at excellent prices. No cheap crap anywhere in the store. Anyway, they have a LaCie 400GB external hard drive with a USB interface for $449 CDN and a FireWire interface for $479 CDN. (For quick conversion to USD, knock about 1/4 to 1/3 off the dollar figure.) LaCie makes similar drives going all the way to 1TB. See www.lacie.com for some awesome equipment. (I have no connection to this firm.) You could always send the client to me and I'll run out and grab one of these drives, and burn the clients' CDs til the cows come home :)

As to the file format on the DVDs, is the compression standard, such as ZIP or RAR? You could unzip one with space available, perhaps.

To read the first line, just go to a DOS window and type:

type filename.ext

Then hit Ctrl-C quickly. You should be able to read most or all of the first line that way. Experiment with how quickly you have to hit Ctrl-C.
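A slightly gentler variant of the same trick, assuming nothing beyond the stock command interpreter, is to page the file rather than race the scrolling; press Q or Ctrl-C after the first screenful:

REM Streams the file one screen at a time, however large it is.
more < filename.ext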
I would want to see a row of data too. Assuming enough space available, you could simply try to import the data into Access or Excel. Either method would give you a peek at the first row and some sample data too. You could then abort the import if you don't have sufficient space available. Some archive programs offer the ability to create N files of a specified size, so that's one possible approach.

But frankly I'd just go with the LaCie drive.

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby
Sent: Thursday, August 12, 2004 11:11 PM
To: dba-sqlserver at databaseadvisors.com
Subject: [dba-SQLServer] Going over to the dark side

I have been contacted by what appears to be a startup marketing firm, who want to take a 64 million name database and pull it into SQL Server.
[snip]

_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com
From jwcolby at colbyconsulting.com Fri Aug 13 18:39:17 2004
From: jwcolby at colbyconsulting.com (John W. Colby)
Date: Fri, 13 Aug 2004 19:39:17 -0400
Subject: [dba-SQLServer] Going over to the dark side
In-Reply-To: <045501c4815e$cf816850$6601a8c0@rock>
Message-ID: <000f01c4818e$bff4ff20$80b3fea9@ColbyM6805>

Arthur,

Thanks for the suggestions. I'll have to think about this. One of the problems is that there are hundreds of fields that will be used in a where clause to filter down the data. There are 64 million records of (they think) about 400 fields, of which a handful are name and address. The rest are these filter fields. Running that on a 2gbyte RAM machine is probably going to be slow; try pulling 5 million records. On the other hand, if it is possible, that would allow me to drop a CD burner in my laptop and still do this stuff while on the road.

John W. Colby
www.ColbyConsulting.com

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller
Sent: Friday, August 13, 2004 1:56 PM
To: dba-sqlserver at databaseadvisors.com
Subject: RE: [dba-SQLServer] Going over to the dark side

There's a wonderful store nearby my home whose manager and employees have impressed me like few if any elsewhere.
[snip]
IOW, if there is a field for ethnicity, with 15 choices, it seems I would need to run a "distinct" query against that field and build a table of all the possible values rather than run a "distinct" on the fly to populate the combo that allows choices. Now multiply that by 350 fields. That looks like a lot of preprocessing just to get ready to start hashing the data. Seventh, How much processor power / memory is needed to handle a database of this nature? Is this something that I could reasonably expect to buy / build? These guys are examining their options, but three options they mentioned are: 1) Just hire me (or someone) to "take an order and produce a file, on a CD or tape". IOW, I own the server, I hold the db, and I take an order and fed ex out a CD. 2) Buy a server, stick it in their office and hire me to set it up and maintain it. 3) Hire a company out there on the internet somewhere to put the data on a server at a server farm. Then query that, somehow get the data onto a cd or a tape. Much bigger question there since I would no longer just figure it out and do it all myself. Assuming a reasonable fee per data pull my preferences would be ordered 1, 2, and 3. Option 1 sets up a constant income stream but with the issue of having to be available to build the CDs. Option 3 is just too much setup, at least initially. Is anyone out there doing something of this nature? Any words of wisdom? John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From martyconnelly at shaw.ca Mon Aug 16 10:45:58 2004 From: martyconnelly at shaw.ca (MartyConnelly) Date: Mon, 16 Aug 2004 08:45:58 -0700 Subject: [dba-SQLServer] crosstabs in sql References: Message-ID: <4120D6B6.7040208@shaw.ca> SQL Server Express 2005 provides several new analytic functions RANK, DENSE_RANK, NTILE, ROW_NUMBER, to name a few, as well as quite a few other new language features, like PIVOT and UNPIVOT, OUTER APPLY and CROSS APPLY. These will probably help with crosstabs. Here is a book chapter excerpt that might explain. http://www.ftponline.com/books/chapters/default_pf.asp?isbn=0321180593 A First Look at SQL Server 2005 for Developers by Bob Beauchemin, Niels Berglund, Dan Sullivan Addison-Wesley Professional ISBN: 0321180593 John Maxwell @ London City wrote: >Hello Susan, > >afraid crosstabs not so straight forward in SQL server. 
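(Aside: the PIVOT syntax Marty mentions above shortens this considerably once SQL Server 2005 is available; a minimal sketch, not from the thread, written against the same RawData table used in the quoted examples that follow:

SELECT Y, [A], [B], [C], [D]
FROM (SELECT Y, X, Data FROM RawData) AS src
PIVOT (SUM(Data) FOR X IN ([A], [B], [C], [D])) AS pvt

The column list still has to be hard coded or built dynamically, exactly as in the examples below.)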
> >They are reasonably straight forward to do if you are using fixed column >headings, >a little more involved if you want a 'dynamic crosstab' > >Example copied from SQL Server 2000 Bible: >(I am a newbie with SQL Server and just happen to be reading up on Crosstabs >and recursive select variables, >so anyone please jump in to point out any errors / inefficiencies / better >ways to mimic Access Crosstabs) > >1)Fixed Column > >Select Y, >Sum(Case X When 'A' Then Data Else 0 End) AS A, >Sum(Case X When 'B' Then Data Else 0 End) AS B, >Sum(Case X When 'C' Then Data Else 0 End) AS C, >Sum(Case X When 'D' Then Data Else 0 End) AS D, >Sum(Data) AS Total >From RawData >Group By Y >Order By Y > >2)Dynamic Crosstab >Instead of 'hard coding' the columns of your crosstab the script below >determines them via a 'recursive select variable' > >Use TempDB > >Declare @XColumns NVarChar(1024) >Set @XColumns = '' >Select @XColumns = @XColumns + 'Sum(Case X > When ''' + [a].[Column] + ''' Then Data > Else 0 > End) AS ' > + [a].[Column] + ',' > From > (Select Distinct X As [Column] > From RawData) As a >Set @XColumns = 'Select Y, ' + @XColumns + ' Sum(Data) As Total From RawData >Group By Y Order By Y' > >EXEC sp_executesql @XColumns > > >I hope I have not confused things for you, but if you email your Access SQL >through I am happy to have a go at converting it if required. > >Regards > >john > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of Klos, >Susan >Sent: 11 August 2004 22:13 >To: 'dba-sqlserver at databaseadvisors.com' >Subject: [dba-SQLServer] crosstabs in sql > > >How do you create a crosstab query in SQL Server? I tried doing one in >Access and copying the SQL into SQL Query Analyzer but I can't seem to get >it right. > > > >Susan Klos > >Senior Database Analyst > >Evaluation and Reporting > >Florida Department of Education > >850-245-0708 > >sc 205-0708 > > > -- Marty Connelly Victoria, B.C. Canada From cfoust at infostatsystems.com Mon Aug 16 11:16:04 2004 From: cfoust at infostatsystems.com (Charlotte Foust) Date: Mon, 16 Aug 2004 09:16:04 -0700 Subject: [dba-SQLServer] Going over to the dark side Message-ID: How are your data warehousing skills these days, John? That's essentially what you're looking at if you want to query data out of this mess in any reasonable amount of time. You're still going to have to produce flat files for their clients, which will be direct marketing firms doing mailings for their own customers, but to filter the data, you'll probably need to break up the flat table into a main fact table and a bunch of dimension tables, even in SQL Server. Charlotte Foust -----Original Message----- From: John W. Colby [mailto:jwcolby at colbyconsulting.com] Sent: Thursday, August 12, 2004 7:49 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Going over to the dark side I will certainly ask that question but they seem to be selling data sets to marketing companies. The guy indicated that it was address data, not email or phone numbers. They want the data on CD/Tape, FedExed to their clients. There are valid businesses doing exactly this kind of stuff, and valid plain old postal bulk mailings that would conceivably use something like this. From what I've read about spammers, they often don't even bother with demographics since it is just cheaper to blast out a million emails to everyone.
I will not get involved with spam but I would have to consider doing this for a marketing firm that does bulk mail. I will be asking them about "do not bother me" lists as a filter to see what they are up to. John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stuart McLachlan Sent: Thursday, August 12, 2004 11:31 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Going over to the dark side On 12 Aug 2004 at 23:10, John W. Colby wrote: > > Is anyone out there doing something of this nature? Any words of > wisdom? > I wouldn't touch it with a bargepole. 1. It smells like spam/scam so I wouldn't on principle 2. You are looking at *several* hundred gig of data and indexes and huge processing power to be able to pull selected data sets. -- Stuart _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Mon Aug 16 12:01:50 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Mon, 16 Aug 2004 13:01:50 -0400 Subject: [dba-SQLServer] Import comma delimited from zipped files In-Reply-To: Message-ID: <001c01c483b2$ba0a48b0$80b3fea9@ColbyM6805> Is there a tool to allow me to import data into SQL Server from a comma delimited file contained in a zip file? John W. Colby www.ColbyConsulting.com From CMackin at Quiznos.com Mon Aug 16 12:07:04 2004 From: CMackin at Quiznos.com (Mackin, Christopher) Date: Mon, 16 Aug 2004 11:07:04 -0600 Subject: [dba-SQLServer] Import comma delimited from zipped files Message-ID: I've done this via the WinZip Comand Line add-ons http://www.winzip.com/daddons.htm Along with a DTS package in SQL Server. I actually created a batch file to pull the zip files from an FTP site, unzip them and have that batch file called from step 1 of the DTS package. -Chris Mackin -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of John W. Colby Sent: Monday, August 16, 2004 11:02 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Import comma delimited from zipped files Is there a tool to allow me to import data into SQL Server from a comma delimited file contained in a zip file? John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Mon Aug 16 13:13:37 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Mon, 16 Aug 2004 11:13:37 -0700 Subject: [dba-SQLServer] Import comma delimited from zipped files In-Reply-To: <001c01c483b2$ba0a48b0$80b3fea9@ColbyM6805> References: <001c01c483b2$ba0a48b0$80b3fea9@ColbyM6805> Message-ID: while contained in a zip file???, not that I know of.... however there are scripts to "UNZIP" a file and then you can bcp the file in.... Check out this site for a comprehensive list of scripts On Mon, 16 Aug 2004 13:01:50 -0400, John W. 
Colby wrote: > Is there a tool to allow me to import data into SQL Server from a comma > delimited file contained in a zip file? > > John W. Colby > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco From markamatte at hotmail.com Mon Aug 16 14:15:33 2004 From: markamatte at hotmail.com (Mark A Matte) Date: Mon, 16 Aug 2004 19:15:33 +0000 Subject: [dba-SQLServer] Going over to the dark side Message-ID: John, Not sure what the original post was...but the last part sounds like something a friend of mine does...imports millions(billions if necessary)all contact and demographic data...then uses his software to cut/slice/view/output the data in any conceivable way. The initial load may take 8 to 10 hours...but queries after the fact only take 1 to 2 seconds. Its quite impressive. If you are interested...let me know offline...and I'll send you his contact information. Thanks, Mark A. Matte >From: "Charlotte Foust" >Reply-To: dba-sqlserver at databaseadvisors.com >To: >Subject: RE: [dba-SQLServer] Going over to the dark side >Date: Mon, 16 Aug 2004 09:16:04 -0700 > >How are your data warehousing skills these days, John? That's >essentially what you're looking at if you want to query data out of this >mess in any reasonable amount of time. You're still going to have to >produce flat files for their clients, which will be direct marketing >firms doing mailings for their own customers, but to filter the data, >you'll probably need to break up the flat table into a main fact table >and a bunch of dimension tables, even in SQL Server. > >Charlotte Foust > >-----Original Message----- >From: John W. Colby [mailto:jwcolby at colbyconsulting.com] >Sent: Thursday, August 12, 2004 7:49 PM >To: dba-sqlserver at databaseadvisors.com >Subject: RE: [dba-SQLServer] Going over to the dark side > > >I will certainly ask that question but they seem to be selling data sets >to marketing companies. The guy indicated that it was address data, not >email or phone numbers. > >They want the data on CD/Tape, FedExed to their clients. There are >valid businesses doing exactly this kind of stuff, and valid plain old >postal bulk mailings that would conceivably use something like this. > >From what I've read about spammers is they often don't even bother with >demographics since it is just cheaper to blast out a million emails to >everyone. > >I will not get involved with spam but I would have to consider doing >this for a marketing firm that does bulk mail. I will be asking them >about "do not bother me" lists as a filter to see what they are up to. > >John W. Colby >www.ColbyConsulting.com > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stuart >McLachlan >Sent: Thursday, August 12, 2004 11:31 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Going over to the dark side > > >On 12 Aug 2004 at 23:10, John W. Colby wrote: > > > > Is anyone out there doing something of this nature? Any words of > > wisdom? > > > >I wouldn't touch it with a bargepole. > >1. It smells like spam/scam so I wouldn't on principle > >2. You are looking at *several* hundred gig of data and indexes and huge > >processing power to be able to pull selected data sets. 
> >-- >Stuart > > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > From jwcolby at colbyconsulting.com Tue Aug 17 07:03:23 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Tue, 17 Aug 2004 08:03:23 -0400 Subject: [dba-SQLServer] Where did my data go In-Reply-To: Message-ID: <000801c48452$32578e00$80b3fea9@ColbyM6805> I used DTS to extract 3 million records from a raw data comma delimited text file, inserting it into a SQL database. The DTS wizard asked me for the name of the database, the database was built, the wizard spent two hours pulling the data showing me the record count of extracted records... But there's no data table in the database that it created. In fact, looking back on it, I don't remember being asked for a table name. So where did the data go? John W. Colby www.ColbyConsulting.com From James at fcidms.com Tue Aug 17 11:45:20 2004 From: James at fcidms.com (James Barash) Date: Tue, 17 Aug 2004 12:45:20 -0400 Subject: [dba-SQLServer] Where did my data go In-Reply-To: <000801c48452$32578e00$80b3fea9@ColbyM6805> Message-ID: <200408171645.MAA21627@kittybird.bcentralhost.com> John, Usually with DTS, the table created will have the same name as the file you imported. If you are using EM to view the database, make sure you do a refresh to see the new table. James Barash -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Tuesday, August 17, 2004 8:03 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Where did my data go I used DTS to extract 3 million records from a raw data comma delimited text file, inserting it into a SQL database. The DTS wizard asked me for the name of the database, the database was built, the wizard spent two hours pulling the data showing me the record count of extracted records... But there's no data table in the database that it created. In fact, looking back on it, I don't remember being asked for a table name. So where did the data go? John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From JColby at dispec.com Tue Aug 17 12:02:46 2004 From: JColby at dispec.com (Colby, John) Date: Tue, 17 Aug 2004 13:02:46 -0400 Subject: [dba-SQLServer] Where did my data go Message-ID: <05C61C52D7CAD211A7830008C7DF6F1079BDC7@DISABILITYINS01> The database has nothing other than system tables in it.
JWC -----Original Message----- From: James Barash [mailto:James at fcidms.com] Sent: Tuesday, August 17, 2004 12:45 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Where did my data go John, Usually with DTS, the table created will have the same name as the file you imported. IF you are using EM to view the database, make sure you do a refresh to see the new table. James Barash -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Tuesday, August 17, 2004 8:03 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Where did my data go I used DTS to extract 3 million records from a raw data comma delimited text file, inserting it into a SQL database. The DTS wizard asked me for the name of the database, the database was built, the wizard spent two hours pulling the data showing me the record count of extracted records... But there's no data table in the database that it created. In fact, looking back on it, I don't remember being asked for a table name. So where did the data go? John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From rmoore at comtechpst.com Tue Aug 17 12:26:21 2004 From: rmoore at comtechpst.com (Ron Moore) Date: Tue, 17 Aug 2004 13:26:21 -0400 Subject: [dba-SQLServer] Where did my data go In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDC7@DISABILITYINS01> Message-ID: <003d01c4847f$53851480$8114a8c0@Comtech.Comtechpst.com> John, Did you save the DTS package? If so, it should tell you the 'import to' databasename.owner.tablename. Otherwise, you may want to check all the databases for a table with the name of the text file (or the name you supplied). Just for kicks, I would also check the master, model, msdb, pubs, etc. databases. Ron Moore Sr. Database Administrator Comtech PST Corp. Melville, NY www.comtechpst.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Colby, John Sent: Tuesday, August 17, 2004 1:03 PM To: 'dba-sqlserver at databaseadvisors.com' Subject: RE: [dba-SQLServer] Where did my data go The database has nothing other that system tables in it. JWC -----Original Message----- From: James Barash [mailto:James at fcidms.com] Sent: Tuesday, August 17, 2004 12:45 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Where did my data go John, Usually with DTS, the table created will have the same name as the file you imported. IF you are using EM to view the database, make sure you do a refresh to see the new table. James Barash -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Tuesday, August 17, 2004 8:03 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Where did my data go I used DTS to extract 3 million records from a raw data comma delimited text file, inserting it into a SQL database. 
The DTS wizard asked me for the name of the database, the database was built, the wizard spent two hours pulling the data showing me the record count of extracted records... But there's no data table in the database that it created. In fact, looking back on it, I don't remember being asked for a table name. So where did the data go? John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Tue Aug 17 12:32:23 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Tue, 17 Aug 2004 10:32:23 -0700 Subject: [dba-SQLServer] Where did my data go In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDC7@DISABILITYINS01> References: <05C61C52D7CAD211A7830008C7DF6F1079BDC7@DISABILITYINS01> Message-ID: On Tue, 17 Aug 2004 13:02:46 -0400, Colby, John wrote: > The database has nothing other that system tables in it. DTS will always ask you for a database to import to, however the table name is assumed to be the same as the source. Right click on your tables icon in EM and you should see a new table that matches the name of the source file (unless you changed it to something else). Additionally when you check the database properties it shoudl have increased dramatically to have the capacity for all the 3 million records. it is also possible that you may have encountered an error and thus nothing was imported, however the table name will still have been made. -- -Francisco From mark at markkaren.com Tue Aug 17 12:43:37 2004 From: mark at markkaren.com (Mark Rider) Date: Tue, 17 Aug 2004 12:43:37 -0500 Subject: [dba-SQLServer] Changing the default Port Settings In-Reply-To: <003d01c4847f$53851480$8114a8c0@Comtech.Comtechpst.com> Message-ID: <200408171743.i7HHhsQ14860@databaseadvisors.com> I have a database that is getting hit with a lot of connection attempts, according to the log files. It is listening on 1433, which I assume is where the login attempts are coming from. I can change the Firewall to allow a different port for the server, and I know how to change the port setting in the Network Configuration for the server. What do I need to do for the clients to connect to the database server once the port is changed? I tried changing a test database's port and could not figure out how to get to it from a client. I changed it back and all was fine. What am I missing? TIA, Mark Rider From fhtapia at gmail.com Tue Aug 17 14:45:43 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Tue, 17 Aug 2004 12:45:43 -0700 Subject: [dba-SQLServer] Changing the default Port Settings In-Reply-To: <200408171743.i7HHhsQ14860@databaseadvisors.com> References: <200408171743.i7HHhsQ14860@databaseadvisors.com> Message-ID: On Tue, 17 Aug 2004 12:43:37 -0500, Mark Rider wrote: > I have a database that is getting hit with a lot of connection attempts, > according to the log files. It is listening on 1433, which I assume is > where the login attempts are coming from. 
> > I can change the Firewall to allow a different port for the server, and I > know how to change the port setting in the Network Configuration for the > server. What do I need to do for the clients to connect to the database > server once the port is changed? I tried changing a test database's port > and could not figure out how to get to it from a client. I changed it back > and all was fine. > > What am I missing? On the connection string for your client, make sure you are including the same port you are changing on the server.... -- -Francisco From mwp.reid at qub.ac.uk Tue Aug 17 15:33:30 2004 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Tue, 17 Aug 2004 21:33:30 +0100 Subject: [dba-SQLServer] useful free download References: <05C61C52D7CAD211A7830008C7DF6F1079BDC7@DISABILITYINS01> Message-ID: <000601c4849a$3ff7c910$0100a8c0@Martin> http://www.red-gate.com/godownloadsqlservercentral.htm Martin From fhtapia at gmail.com Tue Aug 17 17:32:39 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Tue, 17 Aug 2004 15:32:39 -0700 Subject: [dba-SQLServer] useful free download In-Reply-To: <000601c4849a$3ff7c910$0100a8c0@Martin> References: <05C61C52D7CAD211A7830008C7DF6F1079BDC7@DISABILITYINS01> <000601c4849a$3ff7c910$0100a8c0@Martin> Message-ID: On Tue, 17 Aug 2004 21:33:30 +0100, Martin Reid wrote: > http://www.red-gate.com/godownloadsqlservercentral.htm Thanks for the heads up on this very cool doc. 354 pages :O wow in 5 megs :D -- -Francisco From jwcolby at colbyconsulting.com Tue Aug 17 18:28:07 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Tue, 17 Aug 2004 19:28:07 -0400 Subject: [dba-SQLServer] Multi-processor In-Reply-To: <000601c4849a$3ff7c910$0100a8c0@Martin> Message-ID: Will SQL Server directly utilize multiple processors? JWC -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of Martin Reid Sent: Tuesday, August 17, 2004 4:34 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] useful free download http://www.red-gate.com/godownloadsqlservercentral.htm Martin _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From andrew.haslett at ilc.gov.au Tue Aug 17 20:19:39 2004 From: andrew.haslett at ilc.gov.au (Haslett, Andrew) Date: Wed, 18 Aug 2004 10:49:39 +0930 Subject: [dba-SQLServer] useful free download Message-ID: <0A870603A2A816459078203FC07F4CD201E7E0@adl01s055.ilcorp.gov.au> Yeah - I got their email last night also. Are you a user of their 'compare' tools Martin? I find them really handy, especially during development of test systems etc. Not overly expensive either. A -----Original Message----- From: Martin Reid [mailto:mwp.reid at qub.ac.uk] Sent: Wednesday, 18 August 2004 6:04 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] useful free download http://www.red-gate.com/godownloadsqlservercentral.htm Martin _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com IMPORTANT - PLEASE READ ******************** This email and any files transmitted with it are confidential and may contain information protected by law from disclosure. 
If you have received this message in error, please notify the sender immediately and delete this email from your system. No warranty is given that this email or files, if attached to this email, are free from computer viruses or other defects. They are provided on the basis the user assumes all responsibility for loss, damage or consequence resulting directly or indirectly from their use, whether caused by the negligence of the sender or not. From mark at markkaren.com Tue Aug 17 23:51:36 2004 From: mark at markkaren.com (Mark Rider) Date: Tue, 17 Aug 2004 23:51:36 -0500 Subject: [dba-SQLServer] Changing the default Port Settings In-Reply-To: Message-ID: <200408180451.i7I4peQ01564@databaseadvisors.com> That helps, but one more question: Is that as simple as myserver:port? Or do I need to specify a port="portnumber"; in the string? Mark -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Tuesday, August 17, 2004 2:46 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Changing the default Port Settings On Tue, 17 Aug 2004 12:43:37 -0500, Mark Rider wrote: > I have a database that is getting hit with a lot of connection > attempts, according to the log files. It is listening on 1433, which > I assume is where the login attempts are coming from. > > I can change the Firewall to allow a different port for the server, > and I know how to change the port setting in the Network Configuration > for the server. What do I need to do for the clients to connect to > the database server once the port is changed? I tried changing a test > database's port and could not figure out how to get to it from a > client. I changed it back and all was fine. > > What am I missing? On the connection string for your client, make sure you are including the same port you are changing on the server.... -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Wed Aug 18 01:53:41 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Tue, 17 Aug 2004 23:53:41 -0700 Subject: [dba-SQLServer] Multi-processor In-Reply-To: References: Message-ID: Yes, Sql Server will use multiple processors From fhtapia at gmail.com Wed Aug 18 02:07:46 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 18 Aug 2004 00:07:46 -0700 Subject: [dba-SQLServer] Changing the default Port Settings In-Reply-To: <200408180451.i7I4peQ01564@databaseadvisors.com> References: <200408180451.i7I4peQ01564@databaseadvisors.com> Message-ID: On Tue, 17 Aug 2004 23:51:36 -0500, Mark Rider wrote: > That helps, but one more question: > > Is that as simple as myserver:port? Or do I need to specify a > port="portnumber"; in the string? I generally use the port=portnumber form to ensure there are no problems. I actually do it from the UDL screen (create a .txt file, then change the extension to .udl and double-click it; it's a GUI tool for finding the connection string). In the last tab you just specify the port number. -- -Francisco
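(To make the port syntax concrete, an editorial sketch with illustrative server, port, and database names: SQL Server takes the port after a comma rather than a colon, so a full OLE DB string for a nondefault port looks like

Provider=SQLOLEDB;Data Source=myserver,1435;Initial Catalog=mydb;Integrated Security=SSPI

which is the same thing the UDL screen builds for you.)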
From michael at ddisolutions.com.au Wed Aug 18 02:24:42 2004 From: michael at ddisolutions.com.au (Michael Maddison) Date: Wed, 18 Aug 2004 17:24:42 +1000 Subject: [dba-SQLServer] Multi-processor Message-ID: <59A61174B1F5B54B97FD4ADDE71E7D01011AEC@ddi-01.DDI.local> Depending on the version of SQL. IIRC you can always use 2, the higher end versions depend on OS and such but vary from 4 to 8. Xeons with hyperthreading show up as 2 CPUs in taskmgr which is cool but not fully utilised till Win 2003 server... check BOL cheers Michael M Yes, Sql Server will use multiple processors _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From mwp.reid at qub.ac.uk Wed Aug 18 02:57:20 2004 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Wed, 18 Aug 2004 08:57:20 +0100 Subject: [dba-SQLServer] useful free download References: <0A870603A2A816459078203FC07F4CD201E7E0@adl01s055.ilcorp.gov.au> Message-ID: <002501c484f8$fcd17dc0$9111758f@aine> Been using the Package software lately and I find that it is great. Martin ----- Original Message ----- From: "Haslett, Andrew" To: Sent: Wednesday, August 18, 2004 2:19 AM Subject: RE: [dba-SQLServer] useful free download > Yeah - I got their email last night also. > > Are you a user of their 'compare' tools Martin? I find them really handy, > especially during development of test systems etc. Not overly expensive > either. > > A > > -----Original Message----- > From: Martin Reid [mailto:mwp.reid at qub.ac.uk] > Sent: Wednesday, 18 August 2004 6:04 AM > To: dba-sqlserver at databaseadvisors.com > Subject: [dba-SQLServer] useful free download > > http://www.red-gate.com/godownloadsqlservercentral.htm > > > Martin > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From fhtapia at gmail.com Wed Aug 18 15:46:12 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 18 Aug 2004 13:46:12 -0700 Subject: [dba-SQLServer] Multi-processor In-Reply-To: <59A61174B1F5B54B97FD4ADDE71E7D01011AEC@ddi-01.DDI.local> References: <59A61174B1F5B54B97FD4ADDE71E7D01011AEC@ddi-01.DDI.local> Message-ID: On Wed, 18 Aug 2004 17:24:42 +1000, Michael Maddison wrote: > Depending on the version of SQL. IIRC you can always use 2, > the higher end versions depend on OS and such but vary from 4 to 8. > Xeons with hyperthreading show up as 2 CPUs in taskmgr which is cool > but not fully utilised till Win 2003 server... > > check BOL http://www.microsoft.com/sql/techinfo/administration/2000/scalabilityfaq.asp According to this article, Sql Server 2000 will scale up to 4 processors on a Sql Server Standard install; of course you'd want to check your available CAL/Per Processor Licenses.
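(A quick way to confirm which edition and version a given instance is before planning CPUs and RAM; a minimal sketch, not from the thread, using the built-in SERVERPROPERTY function:

SELECT SERVERPROPERTY('Edition') AS Edition,
       SERVERPROPERTY('ProductVersion') AS Version
)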
The maximum amount of ram you can access however is only 2gb... the link shows more info on scalability. -- -Francisco From jwcolby at colbyconsulting.com Wed Aug 18 15:56:21 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Wed, 18 Aug 2004 16:56:21 -0400 Subject: [dba-SQLServer] Using task manager to close SQL Server In-Reply-To: <59A61174B1F5B54B97FD4ADDE71E7D01011AEC@ddi-01.DDI.local> Message-ID: <001a01c48565$d416ed80$80b3fea9@ColbyM6805> I am trying to add a "PK" autoincrement to this huge db I'm working with. There are around 6 million records at the moment. I went in design view and added the field, then "saved" the table and it took off adding the new field and putting values in the table. The problem is it is taking FOREVER (running for many hours so far) and is not giving any status indicator to say when it will be done. I absolutely must get back to work doing some stuff in the db. If I "close" enterprise manager in the middle, will it damage my db? I am assuming not but don't want to take any chances. It appears that the reason it is taking so long is extensive use of the swap file. Enterprise manager is currently using 421 mbytes on a 512m machine.
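(Worth an aside here: the EM table designer makes this kind of change by building a new copy of the table and moving every row into it, which is why it runs for hours on 6 million rows. Adding the column from Query Analyzer avoids the extra copy; a minimal sketch, with an illustrative table name:

ALTER TABLE dbo.BigTable ADD ID INT IDENTITY(1,1) NOT NULL
-- Still touches every row to stamp the values, but skips the
-- designer's build, copy, and rename of the whole table.
)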
> I would go somewhere and buy some 512mb sticks if it will help (and it > appears it will). > wow only 512 on such a huge db?... I have on my work machine 512mb BUT, my test server has 1gb as does the production server... my home pc also has 1gb of ram to help w/ any high memory intensive applications :) I definitely recommend you buy as much as you want to spend (ie, if you can go for a full 1gb upgrade, do it; if you can take it to 2gb then do it :) -- -Francisco From fhtapia at gmail.com Wed Aug 18 16:31:58 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 18 Aug 2004 14:31:58 -0700 Subject: [dba-SQLServer] Using task manager to close SQL Server In-Reply-To: <001a01c48565$d416ed80$80b3fea9@ColbyM6805> References: <001a01c48565$d416ed80$80b3fea9@ColbyM6805> Message-ID: On Wed, 18 Aug 2004 16:56:21 -0400, John W. Colby wrote: > I am trying to add a "PK" autoincrement to this huge db I'm working with. > There are around 6 million records at the moment. I went in design view and > added the field, then "saved" the table and it took off adding the new field > and putting values in the table. The problem is it is taking FOREVER > (running for many hours so far) and is not giving any status indicator to > say when it will be done. I absolutely must get back to work doing some > stuff in the db. > > If I "close" enterprise manager in the middle, will it damage my db? I am > assuming not but don't want to take any chances. > > It appears that the reason it is taking so long is extensive use of the swap > file. Enterprise manager is currently using 421 mbytes on a 512m machine. > I would go somewhere and buy some 512mb sticks if it will help (and it > appears it will). You know what, I also forgot to mention that you ought to look into how you arrange the new hdds for your pc as well. Since you are doing some Sql Server development, it is to your advantage to have your secondary data hdd on a separate I/O card; this is important in a production environment, but also in a development one as well. What you'll find is that the system may become busy doing its updates, but your PC will continue running like a champ, because all the heavy I/O processes are offloaded to a separate card/channel. So if you don't have a second ATA or SATA card, go out and look for one when you are out shopping for your 1gb of ram. You'll notice that the performance of the database increases like nuts. IIRC you just recently bought a 200 or 250gb drive w/ 8mb of cache, did you not? Adding the I/O card will enhance the performance on that drive (or your secondary drives you have) -- -Francisco From jwcolby at colbyconsulting.com Wed Aug 18 17:22:28 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Wed, 18 Aug 2004 18:22:28 -0400 Subject: [dba-SQLServer] Using task manager to close SQL Server In-Reply-To: Message-ID: <001c01c48571$dbc97c30$80b3fea9@ColbyM6805> Francisco, This was until recently my desktop machine for Access dev. No need for more memory really. I "retired" it when I got my laptop which is a P64 with 512m ram (desktop replacement laptop). I will now be pulling it back into service as a (temporary) SQL Server machine. The db will hold a 60+ million record flat file of name/address/ demographics with 14+ million more to come immediately, and possibly merging in a couple of other dbs that are currently hosted elsewhere. My intention is to build a server since I am "bootstrapping" this operation (little cash).
The current machine is an AMD 2.5g Barton with 512 mb ram running Windows2K Pro and SQL Server 2K. Awhile back I bought a RAID card and a couple of 120gb hard disks for the main c: drive (Raid1) which is where the current db resides. I purchased a couple of 200gb Maxtors with 8m cache which I intended to mirror, then throw the db out there. We'll just have to wait and see how big this mutha gets. I can see however that an immediate memory upgrade would be good if I intend to use this thing for long. Long term I'm looking at building a dual Opteron. I have been looking at how to get lots of memory and processing power though reading the web page you provided brings to doubt my ability to use said memory. 8-( John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Wednesday, August 18, 2004 5:32 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Using task manager to close SQL Server On Wed, 18 Aug 2004 16:56:21 -0400, John W. Colby wrote: > I am trying to add a "PK" autoincrement to this huge db I'm working > with. There are around 6 million records at the moment. I went in > design view and added the field, then "saved" the table and it took > off adding the new field and putting values in the table. The problem > is it is taking FOREVER (running for many hours so far) and is not > giving any status indicator to say when it will be done. I absolutely > must "get back to work doing some stuff in the db. > > If I "close" enterprise manager in the middle, will it damage my db? > I am assuming not but don't want to take any chances. > > It appears that the reason it is taking so long is extensive use of > the swap file. Enterprise manager is currently using 421 mbytes on a > 512m machine. I would go somewhere and buy some 512mb sticks if it > will help (and it appears it will). You know what, I also forgot to mention that you ought ot look into how you arrange the new hdds for your pc as well, Since you are DOING some Sql Server development, it is to your advantage to have your 2ndary data hdd on a seperate I/O card, this is important in a production environment, but also in a development one as well. What you'll find is that the system may become busy doing it's updates, but your PC will continue running like a champ, because all the heavy I/O process are offloaded to a seperate card/ channel. So if you don't have a second ATA or SATA card go out and look for one when you are out shopping for your 1gb of ram. You'll notice that the performance of the database increases like nuts, IIRC you just recently bought a 200 or 250gb drive w/ 8mb of cache did you not? Adding the I/O card will enhance the performance on that drive (or your 2ndary drives you have) -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Wed Aug 18 18:16:52 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 18 Aug 2004 16:16:52 -0700 Subject: [dba-SQLServer] Using task manager to close SQL Server In-Reply-To: <001c01c48571$dbc97c30$80b3fea9@ColbyM6805> References: <001c01c48571$dbc97c30$80b3fea9@ColbyM6805> Message-ID: On Wed, 18 Aug 2004 18:22:28 -0400, John W. Colby wrote: > Long term I'm looking at building a dual Opteron. 
I have been looking at > how to get lots of memory and processing power though reading the web page > you provided brings to doubt my ability to use said memory. 8-( Why is that? I was reading their Choose Edition white paper and found out this nifty info on Developer Edition of Sql Server: http://www.microsoft.com/sql/techinfo/planning/sqlreskchooseed.asp btw, the Developer edition is only $50 bucks per Developer :) http://www.microsoft.com/sql/howtobuy/development.asp >QUOTE: SQL Server 2000 Developer Edition This edition allows developers to build any type of application on top of SQL Server. It includes all of the functionality of Enterprise Edition but with a special development and test end-user license agreement (EULA) that prohibits production deployment (for complete details, see the SQL Server 2000 Developer Edition EULA at http://www.microsoft.com/sql. For maximum flexibility during development, it will install to the aforementioned server operating systems as well as Windows 2000 Professional and Windows NT Workstation 4.0. SQL Server 2000 Developer Edition is the only edition of SQL Server 2000 that gives the licensee the right to download and install SQL Server 2000 Windows CE Edition (SQL Server CE) from http://www.microsoft.com/sql. The Developer Edition licensee also has the right to redistribute SQL Server CE-based applications to an unlimited number of devices at no additional cost beyond the purchase price of SQL Server 2000 Developer Edition. Devices running SQL Server CE that access or otherwise use the resources of a SQL Server must be properly licensed. For more information, see http://www.microsoft.com/sql. SQL Server 2000 Developer Edition is the ideal choice for Independent Software Vendors (ISVs), consultants, system integrators, solution providers, and corporate developers developing and testing applications because it is cost effective, runs on a variety of platforms, and can be upgraded for production use to SQL Server 2000 Enterprise Edition. >>END QUOTE -- -Francisco From michael at ddisolutions.com.au Wed Aug 18 23:22:09 2004 From: michael at ddisolutions.com.au (Michael Maddison) Date: Thu, 19 Aug 2004 14:22:09 +1000 Subject: [dba-SQLServer] Using task manager to close SQL Server Message-ID: <59A61174B1F5B54B97FD4ADDE71E7D01011AF8@ddi-01.DDI.local> John, The db wont be hurt if you stop the transaction. What I would do... YMMV... Change Recovery mode to simple (cuts logging down) Drop all indexes inc PK before attempting to add the identity column. Script them using QA so you can put them back after. Turn off all unnecessary services. Have nothing else running. Buy or upgrade to a bigger box ;-))) You need multiple drives and processes. regards Michael M Francisco, This was until recently my desktop machine for Access dev. No need for more memory really. I "retired" it when I got my laptop which is a P64 with 512m ram (desktop replacement laptop). I will now be pulling it back into service as a (temporary) SQL Server machine. The db will hold a 60+ million record flat file of name/address/ demographics with 14+ million more to come immediately, and possibly merging in a couple of other dbs that are currently hosted elsewhere. My intention is to build a server since I am "bootstrapping" this operation (little cash). The current machine is an AMD 2.5g Barton with 512 mb ram running Windows2K Pro and SQL Server 2K. Awhile back I bought a RAID card and a couple of 120gb hard disks for the main c: drive (Raid1) which is where the current db resides. 
I purchased a couple of 200gb Maxtors with 8m cache which I intended to mirror, then throw the db out there. We'll just have to wait and see how big this mutha gets. I can see however that an immediate memory upgrade would be good if I intend to use this thing for long. Long term I'm looking at building a dual Opteron. I have been looking at how to get lots of memory and processing power though reading the web page you provided brings to doubt my ability to use said memory. 8-( John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Wednesday, August 18, 2004 5:32 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Using task manager to close SQL Server On Wed, 18 Aug 2004 16:56:21 -0400, John W. Colby wrote: > I am trying to add a "PK" autoincrement to this huge db I'm working > with. There are around 6 million records at the moment. I went in > design view and added the field, then "saved" the table and it took > off adding the new field and putting values in the table. The problem > is it is taking FOREVER (running for many hours so far) and is not > giving any status indicator to say when it will be done. I absolutely > must "get back to work doing some stuff in the db. > > If I "close" enterprise manager in the middle, will it damage my db? > I am assuming not but don't want to take any chances. > > It appears that the reason it is taking so long is extensive use of > the swap file. Enterprise manager is currently using 421 mbytes on a > 512m machine. I would go somewhere and buy some 512mb sticks if it > will help (and it appears it will). You know what, I also forgot to mention that you ought ot look into how you arrange the new hdds for your pc as well, Since you are DOING some Sql Server development, it is to your advantage to have your 2ndary data hdd on a seperate I/O card, this is important in a production environment, but also in a development one as well. What you'll find is that the system may become busy doing it's updates, but your PC will continue running like a champ, because all the heavy I/O process are offloaded to a seperate card/ channel. So if you don't have a second ATA or SATA card go out and look for one when you are out shopping for your 1gb of ram. You'll notice that the performance of the database increases like nuts, IIRC you just recently bought a 200 or 250gb drive w/ 8mb of cache did you not? Adding the I/O card will enhance the performance on that drive (or your 2ndary drives you have) -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Thu Aug 19 12:38:54 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Thu, 19 Aug 2004 13:38:54 -0400 Subject: [dba-SQLServer] Timeout In-Reply-To: Message-ID: <005401c48613$6915bf30$80b3fea9@ColbyM6805> I am trying to run a simple count query on this big db I'm building. The SQL is SELECT COUNT(ID) AS Cnt FROM dbo.Conduit Which is built by the query builder in EM. The query times out. [Microsoft][ODBC SQL Server Driver]Timeout Expired. 
Anyone have a clue where I go to increase this value so I can get a count. ID BTW is the "autonumber" PK. John W. Colby www.ColbyConsulting.com From John.Maxwell2 at ntl.com Thu Aug 19 12:58:16 2004 From: John.Maxwell2 at ntl.com (John Maxwell @ London City) Date: Thu, 19 Aug 2004 18:58:16 +0100 Subject: [dba-SQLServer] Timeout Message-ID: I also get this problem using the query builder in EM but fine when use Query analyser If in rush for results may be worth switching while waiting for a proper solution to be posted. Regards john -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of John W. Colby Sent: 19 August 2004 18:39 To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Timeout I am trying to run a simple count query on this big db I'm building. The SQL is SELECT COUNT(ID) AS Cnt FROM dbo.Conduit Which is built by the query builder in EM. The query times out. [Microsoft][ODBC SQL Server Driver]Timeout Expired. Anyone have a clue where I go to increase this value so I can get a count. ID BTW is the "autonumber" PK. John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com The contents of this email and any attachments are sent for the personal attention of the addressee(s) only and may be confidential. If you are not the intended addressee, any use, disclosure or copying of this email and any attachments is unauthorised - please notify the sender by return and delete the message. Any representations or commitments expressed in this email are subject to contract. ntl Group Limited From jwcolby at colbyconsulting.com Thu Aug 19 14:57:27 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Thu, 19 Aug 2004 15:57:27 -0400 Subject: [dba-SQLServer] Out of Disk In-Reply-To: Message-ID: <005e01c48626$c5208ae0$80b3fea9@ColbyM6805> I imported more of the data into this big db I'm working on and now SQL Server shuts down after 5 seconds or so after startup. The error log says "not enough room on disk" for something it is trying to build (a temp db I think). I am down to 15gb on the disk that the SQL Server data resides on. The question is, can I simply copy the mdf and ldf files for this new database off to the new hard disk I installed to free up memory? I understand that SQL Server would now be missing the database, but can I then detach it and reattach it from the new drive? If not I will need to clean up enough space (amount unknown) to get SQL Server to stay up so that I can correctly detach the db and then move it. John W. Colby www.ColbyConsulting.com From mwp.reid at qub.ac.uk Thu Aug 19 15:03:35 2004 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Thu, 19 Aug 2004 21:03:35 +0100 Subject: [dba-SQLServer] Out of Disk References: <005e01c48626$c5208ae0$80b3fea9@ColbyM6805> Message-ID: <003401c48627$9c755c00$0100a8c0@Martin> John Can you just move the tempdb out to the bigger disk?? Martin ----- Original Message ----- From: "John W. Colby" To: Sent: Thursday, August 19, 2004 8:57 PM Subject: [dba-SQLServer] Out of Disk > I imported more of the data into this big db I'm working on and now SQL > Server shuts down after 5 seconds or so after startup. The error log says > "not enough room on disk" for something it is trying to build (a temp db I > think). 
I am down to 15gb on the disk that the SQL Server data resides on. > > > The question is, can I simply copy the mdf and ldf files for this new > database off to the new hard disk I installed to free up memory? I > understand that SQL Server would now be missing the database, but can I then > detach it and reattach it from the new drive? If not I will need to clean > up enough space (amount unknown) to get SQL Server to stay up so that I can > correctly detach the db and then move it. > > John W. Colby > www.ColbyConsulting.com > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From jwcolby at colbyconsulting.com Thu Aug 19 15:08:08 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Thu, 19 Aug 2004 16:08:08 -0400 Subject: [dba-SQLServer] Log file In-Reply-To: <005e01c48626$c5208ae0$80b3fea9@ColbyM6805> Message-ID: <006001c48628$429659e0$80b3fea9@ColbyM6805> Is the ldf the log file? If so I understand it is possible to place that file on a different drive from the data file? How do I limit the size of the log file and dictate the location? John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Thursday, August 19, 2004 3:57 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Out of Disk I imported more of the data into this big db I'm working on and now SQL Server shuts down after 5 seconds or so after startup. The error log says "not enough room on disk" for something it is trying to build (a temp db I think). I am down to 15gb on the disk that the SQL Server data resides on. The question is, can I simply copy the mdf and ldf files for this new database off to the new hard disk I installed to free up memory? I understand that SQL Server would now be missing the database, but can I then detach it and reattach it from the new drive? If not I will need to clean up enough space (amount unknown) to get SQL Server to stay up so that I can correctly detach the db and then move it. John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Thu Aug 19 15:09:20 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Thu, 19 Aug 2004 16:09:20 -0400 Subject: [dba-SQLServer] Out of Disk In-Reply-To: <003401c48627$9c755c00$0100a8c0@Martin> Message-ID: <006101c48628$6d404980$80b3fea9@ColbyM6805> Probably, but I fully intend to have a separate hard disk for this database so I would just as soon move it right now. John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Martin Reid Sent: Thursday, August 19, 2004 4:04 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Out of Disk John Can you just move the tempdb out to the bigger disk?? Martin ----- Original Message ----- From: "John W. 
Colby" To: Sent: Thursday, August 19, 2004 8:57 PM Subject: [dba-SQLServer] Out of Disk > I imported more of the data into this big db I'm working on and now > SQL Server shuts down after 5 seconds or so after startup. The error > log says "not enough room on disk" for something it is trying to build > (a temp db I think). I am down to 15gb on the disk that the SQL > Server data resides on. > > > The question is, can I simply copy the mdf and ldf files for this new > database off to the new hard disk I installed to free up memory? I > understand that SQL Server would now be missing the database, but can > I then > detach it and reattach it from the new drive? If not I will need to > clean up enough space (amount unknown) to get SQL Server to stay up so > that I can > correctly detach the db and then move it. > > John W. Colby > www.ColbyConsulting.com > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From mwp.reid at qub.ac.uk Thu Aug 19 15:19:46 2004 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Thu, 19 Aug 2004 21:19:46 +0100 Subject: [dba-SQLServer] Log file References: <006001c48628$429659e0$80b3fea9@ColbyM6805> Message-ID: <001701c48629$dfa37870$0100a8c0@Martin> John One of the things I did recently was to actually detach a databse and log file. I thenremoved the log file completely as it was HUGH. Put it away soemwhere safe. I then reattached the MDF file and had SQL automatically recreate a new empty log file. Not sure how professional this is but appeared to work fine for my needs. You can also move the log out somewhere else detach the database move the log to its new location reattach sp_attachdb Yourdb PATH LOGFile Path Should be OK Oh and back it all up first if you have the disc space. Martin ----- Original Message ----- From: "John W. Colby" To: Sent: Thursday, August 19, 2004 9:08 PM Subject: [dba-SQLServer] Log file > Is the ldf the log file? If so I understand it is possible to place that > file on a different drive from the data file? How do I limit the size of > the log file and dictate the location? > > John W. Colby > www.ColbyConsulting.com > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. > Colby > Sent: Thursday, August 19, 2004 3:57 PM > To: dba-sqlserver at databaseadvisors.com > Subject: [dba-SQLServer] Out of Disk > > > I imported more of the data into this big db I'm working on and now SQL > Server shuts down after 5 seconds or so after startup. The error log says > "not enough room on disk" for something it is trying to build (a temp db I > think). I am down to 15gb on the disk that the SQL Server data resides on. > > > The question is, can I simply copy the mdf and ldf files for this new > database off to the new hard disk I installed to free up memory? I > understand that SQL Server would now be missing the database, but can I then > detach it and reattach it from the new drive? If not I will need to clean > up enough space (amount unknown) to get SQL Server to stay up so that I can > correctly detach the db and then move it. > > John W. 
Colby > www.ColbyConsulting.com > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From mikedorism at adelphia.net Thu Aug 19 15:02:01 2004 From: mikedorism at adelphia.net (Mike & Doris Manning) Date: Thu, 19 Aug 2004 16:02:01 -0400 Subject: [dba-SQLServer] Out of Disk In-Reply-To: <005e01c48626$c5208ae0$80b3fea9@ColbyM6805> Message-ID: <000001c48627$6447f360$870aa845@hargrove.internal> Detach it... Move it... Then reattach it... Doris Manning Database Administrator Hargrove Inc. www.hargroveinc.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Thursday, August 19, 2004 3:57 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Out of Disk I imported more of the data into this big db I'm working on and now SQL Server shuts down after 5 seconds or so after startup. The error log says "not enough room on disk" for something it is trying to build (a temp db I think). I am down to 15gb on the disk that the SQL Server data resides on. The question is, can I simply copy the mdf and ldf files for this new database off to the new hard disk I installed to free up memory? I understand that SQL Server would now be missing the database, but can I then detach it and reattach it from the new drive? If not I will need to clean up enough space (amount unknown) to get SQL Server to stay up so that I can correctly detach the db and then move it. John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Thu Aug 19 15:01:05 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Thu, 19 Aug 2004 16:01:05 -0400 Subject: [dba-SQLServer] Block size for SQL Server disk In-Reply-To: Message-ID: <005f01c48627$46119360$80b3fea9@ColbyM6805> I have purchased new disks to build this new SQL Server database on. I believe I have read that SQL Server uses 8k block sizes internally. Is it useful to format the hard disk to 8K sectors so that the SQL blocks map to sectors directly? Does anyone know if this helps or hinders? John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John Maxwell @ London City Sent: Thursday, August 19, 2004 1:58 PM To: 'dba-sqlserver at databaseadvisors.com' Subject: RE: [dba-SQLServer] Timeout I also get this problem using the query builder in EM but fine when use Query analyser If in rush for results may be worth switching while waiting for a proper solution to be posted. Regards john -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of John W. Colby Sent: 19 August 2004 18:39 To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Timeout I am trying to run a simple count query on this big db I'm building. 
The SQL is SELECT COUNT(ID) AS Cnt FROM dbo.Conduit Which is built by the query builder in EM. The query times out. [Microsoft][ODBC SQL Server Driver]Timeout Expired. Anyone have a clue where I go to increase this value so I can get a count. ID BTW is the "autonumber" PK. John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com The contents of this email and any attachments are sent for the personal attention of the addressee(s) only and may be confidential. If you are not the intended addressee, any use, disclosure or copying of this email and any attachments is unauthorised - please notify the sender by return and delete the message. Any representations or commitments expressed in this email are subject to contract. ntl Group Limited _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From stuart at lexacorp.com.pg Thu Aug 19 17:14:43 2004 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Fri, 20 Aug 2004 08:14:43 +1000 Subject: [dba-SQLServer] Timeout In-Reply-To: <005401c48613$6915bf30$80b3fea9@ColbyM6805> References: Message-ID: <4125B2F3.3024.34ED260@lexacorp.com.pg> On 19 Aug 2004 at 13:38, John W. Colby wrote: > SELECT COUNT(ID) AS Cnt > FROM dbo.Conduit > > Which is built by the query builder in EM. The query times out. > [Microsoft][ODBC SQL Server Driver]Timeout Expired. Anyone have a clue > where I go to increase this value so I can get a count. Are you running this from the Query Analyzer or an Access (or whatever) FE using ODBC? If the latter, go into the ODBC Manager in your Control Panel. Also, try Count(*) rather than specifying a field. -- Stuart From fhtapia at gmail.com Thu Aug 19 17:16:29 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Thu, 19 Aug 2004 15:16:29 -0700 Subject: [dba-SQLServer] Log file In-Reply-To: <006001c48628$429659e0$80b3fea9@ColbyM6805> References: <006001c48628$429659e0$80b3fea9@ColbyM6805> Message-ID: On Thu, 19 Aug 2004 16:08:08 -0400, John W. Colby wrote: > Is the ldf the log file? If so I understand it is possible to place that > file on a different drive from the data file? How do I limit the size of > the log file and dictate the location? One way is to detach your current database from QA like this (remember to be in the "master" database when running these commands):
EXEC sp_detach_db 'MyDB', 'true'
then reattach it by running the sp_attach_db sproc:
EXEC sp_attach_db @dbname = N'MyDB', @filename1 = N'd:\SqlServer\data\MyDB.mdf', @filename2 = N'f:\FastLogDisk\MyDB_log.ldf'
To LIMIT your log file to a specific size, do this:
In EM: right click and go into the database properties, click on the Transaction Log tab, and deselect Automatically Grow File.
WARNING: you will NOW need to back up your transaction log more often in order to reuse some wasted space. One good way is to back up the transaction log when it reaches 60% of its utilization; you can do this by adding an Alert:
In EM: under the Management folder, under the SQL Server Agent icon, click on Alerts and create a new Alert.
Give your alert a meaningful name.
In the General tab:
Type: choose SQL Server performance condition alert (enabled)
Object: SQLServer:Databases
Counter: Percent Log Used
Instance: MyDb
Alert if counter: rises above
Value: 60
In the Response tab, check Execute Job and create a job (the three ... dots). Your job should have the following T-SQL step for the backup:
BACKUP LOG [MyDB] TO [LogBackupDeviceName] WITH INIT
Then OK to save all your settings... I hope this helps you out. -- -Francisco From fhtapia at gmail.com Thu Aug 19 17:43:54 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Thu, 19 Aug 2004 15:43:54 -0700 Subject: [dba-SQLServer] Block size for SQL Server disk In-Reply-To: <005f01c48627$46119360$80b3fea9@ColbyM6805> References: <005f01c48627$46119360$80b3fea9@ColbyM6805> Message-ID: On Thu, 19 Aug 2004 16:01:05 -0400, John W. Colby wrote: > I have purchased new disks to build this new SQL Server database on. I > believe I have read that SQL Server uses 8k block sizes internally. Is it > useful to format the hard disk to 8K sectors so that the SQL blocks map to > sectors directly? Does anyone know if this helps or hinders? Although Sql Server uses 8k block sizes internally I've never heard on any list that it's a good idea to format your hdd that way... so I'd say nope. -- -Francisco From michael at ddisolutions.com.au Thu Aug 19 18:44:20 2004 From: michael at ddisolutions.com.au (Michael Maddison) Date: Fri, 20 Aug 2004 09:44:20 +1000 Subject: [dba-SQLServer] Block size for SQL Server disk Message-ID: <59A61174B1F5B54B97FD4ADDE71E7D01011AFD@ddi-01.DDI.local> Most of the recommendations for sector size say bigger is better. I have a ppt presentation from UNISYS that looks at all the options for setting up a SQL server box. IIRC they recommend 64K because it allows the fastest file growth. While SQL uses 8k page sizes I'm not sure that relates directly to sector size! If anyone wants a copy of the ppt lemme know. cheers Michael M On Thu, 19 Aug 2004 16:01:05 -0400, John W. Colby wrote: > I have purchased new disks to build this new SQL Server database on. > I believe I have read that SQL Server uses 8k block sizes internally. > Is it useful to format the hard disk to 8K sectors so that the SQL > blocks map to sectors directly? Does anyone know if this helps or > hinders? Although Sql Server uses 8k block sizes internally I've never heard on any list that it's a good idea to format your hdd that way... so I'd say nope. -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Thu Aug 19 21:49:21 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Thu, 19 Aug 2004 22:49:21 -0400 Subject: [dba-SQLServer] External hard disk In-Reply-To: <59A61174B1F5B54B97FD4ADDE71E7D01011AFD@ddi-01.DDI.local> Message-ID: <000001c48660$4beaadb0$80b3fea9@ColbyM6805> I purchased a "mobile disk" external hard disk enclosure from CompUSA to put one of the 200gb hard disks in so I could take the db with me and work off my laptop. I found a diag program out there somewhere that tested the transfer speed. For USB 2.0 the transfer speed was ~25 mbyte / sec. For fire wire (4 pin) it was only 17.5 mbyte / sec. This with the Maxtor 200g 8mb cache. John W.
Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Michael Maddison Sent: Thursday, August 19, 2004 7:44 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Block size for SQL Server disk Most of the recommendations for sector size say bigger is better. I have a ppt presentation from UNISYS that looks at all the options for setting up a SQL server box. IIRC they recommend 64K because it allows the fastest file growth. While SQL uses 8k page sizes I'm not sure that relates directly to sector size! If anyone wants a copy of the ppt lemme know. cheers Michael M On Thu, 19 Aug 2004 16:01:05 -0400, John W. Colby wrote: > I have purchased new disks to build this new SQL Server database on. > I believe I have read that SQL Server uses 8k block sizes internally. > Is it useful to format the hard disk to 8K sectors so that the SQL > blocks map to sectors directly? Does anyone know if this helps or > hinders? Although Sql Server uses 8k block sizes internally I've never heard on any list that it's a good idea to format your hdd that way... so I'd say nope. -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Fri Aug 20 09:15:44 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Fri, 20 Aug 2004 10:15:44 -0400 Subject: [dba-SQLServer] Quick and dirty on varchar In-Reply-To: <000001c48660$4beaadb0$80b3fea9@ColbyM6805> Message-ID: <000001c486c0$32532890$80b3fea9@ColbyM6805> When I imported this mailing list into SQL Server every field came in as varchar 255. What does this mean in terms of actual disk space used? Does SQL Server "reserve" 255 bytes for that field or does it dynamically assign just enough space to hold the actual contents. Many if not most of the fields are a Y or N (actual character in the text coming in). Would it benefit me to change the data type for these columns in terms of the actual data storage size? If I did that would the DTS doing the import from the comma delimited field still function correctly (I assume yes to that since the data still fits). John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Thursday, August 19, 2004 10:49 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] External hard disk I purchased a "mobile disk" external hard disk enclosure from CompUSA to put one of the 200gb hard disks in so I could take the db with me and work off my laptop. I found a diag program out there somewhere that tested the transfer speed. For USB 2.0 the transfer speed was ~25 mbyte / sec. For fire wire (4 pin) it was only 17.5 mbyte / sec. This with the Maxtor 200g 8mb cache. John W. 
Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Michael Maddison Sent: Thursday, August 19, 2004 7:44 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Block size for SQL Server disk Most of the recommendations for sector size say bigger is better. I have a ppt presentation from UNISYS that looks at all the options for setting up a SQL server box. IIRC they recommend 64K because it allows the fastest file growth. While SQL uses 8k page sizes I'm not sure that relates directly to sector size! If anyone wants a copy of the ppt lemme know. cheers Michael M On Thu, 19 Aug 2004 16:01:05 -0400, John W. Colby wrote: > I have purchased new disks to build this new SQL Server database on. I > believe I have read that SQL Server uses 8k block sizes internally. > Is it useful to format the hard disk to 8K sectors so that the SQL > blocks map to sectors directly? Does anyone know if this helps or > hinders? Although Sql Server uses 8k block sizes internally I've never heard on any list that it's a good idea to format your hdd that way... so I'd say nope. -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Fri Aug 20 10:21:56 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Fri, 20 Aug 2004 08:21:56 -0700 Subject: [dba-SQLServer] External hard disk In-Reply-To: <000001c48660$4beaadb0$80b3fea9@ColbyM6805> References: <000001c48660$4beaadb0$80b3fea9@ColbyM6805> Message-ID: On Thu, 19 Aug 2004 22:49:21 -0400, John W. Colby wrote: > I purchased a "mobile disk" external hard disk enclosure from CompUSA to put > one of the 200gb hard disks in so I could take the db with me and work off > my laptop. I found a diag program out there somewhere that tested the > transfer speed. For USB 2.0 the transfer speed was ~25 mbyte / sec. For > fire wire (4 pin) it was only 17.5 mbyte / sec. This with the Maxtor 200g > 8mb cache. HDD enclosures vary by brand, mostly due to the onboard hdd controller in the enclosure. I'm curious as to which one you bought; I'm surprised to hear that you got such a huge gap in performance between the two connections. Generally the speed of USB 2.0 is 480Mb/s (megabits) and FireWire is 400Mb/s, though FireWire 800 is 800Mb/s. -- -Francisco From mikedorism at adelphia.net Fri Aug 20 10:31:33 2004 From: mikedorism at adelphia.net (Mike & Doris Manning) Date: Fri, 20 Aug 2004 11:31:33 -0400 Subject: [dba-SQLServer] Quick and dirty on varchar In-Reply-To: <000001c486c0$32532890$80b3fea9@ColbyM6805> Message-ID: <000001c486ca$c6ca9ee0$870aa845@hargrove.internal> From BOL... varchar[(n)] Variable-length non-Unicode character data with length of n bytes. n must be a value from 1 through 8,000. Storage size is the actual length in bytes of the data entered, not n bytes. The data entered can be 0 characters in length. Doris Manning Database Administrator Hargrove Inc.
www.hargroveinc.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Friday, August 20, 2004 10:16 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Quick and dirty on varchar When I imported this mailing list into SQL Server every field came in as varchar 255. What does this mean in terms of actual disk space used? Does SQL Server "reserve" 255 bytes for that field or does it dynamically assign just enough space to hold the actual contents. Many if not most of the fields are a Y or N (actual character in the text coming in). Would it benefit me to change the data type for these columns in terms of the actual data storage size? If I did that would the DTS doing the import from the comma delimited field still function correctly (I assume yes to that since the data still fits). John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Thursday, August 19, 2004 10:49 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] External hard disk I purchased a "mobile disk" external hard disk enclosure from CompUSA to put one of the 200gb hard disks in so I could take the db with me and work off my laptop. I found a diag program out there somewhere that tested the transfer speed. For USB 2.0 the transfer speed was ~25 mbyte / sec. For fire wire (4 pin) it was only 17.5 mbyte / sec. This with the Maxtor 200g 8mb cache. John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Michael Maddison Sent: Thursday, August 19, 2004 7:44 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Block size for SQL Server disk Most of the recommendations for sector size say bigger is better. I have a ppt presentation from UNISYS that looks at all the options for setting up a SQL server box. IIRC they recommend 64K because it allows the fastest file growth. While SQL uses 8k page sizes I'm not sure that relates directly to sector size! If anyone wants a copy of the ppt lemme know. cheers Michael M On Thu, 19 Aug 2004 16:01:05 -0400, John W. Colby wrote: > I have purchased new disks to build this new SQL Server database on. I > believe I have read that SQL Server uses 8k block sizes internally. > Is it useful to format the hard disk to 8K sectors so that the SQL > blocks map to sectors directly? Does anyone know if this helps or > hinders? Although Sql Server uses 8k block sizes internally I've never heard on any list that it's a good idea to format your hdd that way... so I'd say nope. 
-- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Fri Aug 20 11:58:35 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Fri, 20 Aug 2004 12:58:35 -0400 Subject: [dba-SQLServer] External hard disk In-Reply-To: Message-ID: <000401c486d6$ee8a9780$80b3fea9@ColbyM6805> There is another variable, which is the implementation of the usb and firewire in the computer. The box I bought is just called Mobile Disk External Data Storage and came from CompUSA. Price about $59 IIRC. John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Friday, August 20, 2004 11:22 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] External hard disk On Thu, 19 Aug 2004 22:49:21 -0400, John W. Colby wrote: > I purchased a "mobile disk" external hard disk enclosure from CompUSA > to put one of the 200gb hard disks in so I could take the db with me > and work off my laptop. I found a diag program out there somewhere > that tested the transfer speed. For USB 2.0 the transfer speed was > ~25 mbyte / sec. For fire wire (4 pin) it was only 17.5 mbyte / sec. > This with the Maxtor 200g 8mb cache. HDD enclosures vary by brand, mostly due to the onboard hdd controller on the enclosure, I'm curious as to which one you bought, I'm surprised to hear that you got such a huge gap in performance between both connection. Generally the speed of USB2 is 412mb/s (mega bits) and FireWire is 400mb/s tho FireWire2 is 800mb/s. -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Fri Aug 20 12:12:25 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Fri, 20 Aug 2004 13:12:25 -0400 Subject: [dba-SQLServer] Quick and dirty on varchar In-Reply-To: <000001c486ca$c6ca9ee0$870aa845@hargrove.internal> Message-ID: <000501c486d8$dd8c8bd0$80b3fea9@ColbyM6805> Good enough, thanks John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Mike & Doris Manning Sent: Friday, August 20, 2004 11:32 AM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Quick and dirty on varchar >From BOL... varchar[(n)] Variable-length non-Unicode character data with length of n bytes. n must be a value from 1 through 8,000. Storage size is the actual length in bytes of the data entered, not n bytes. 
The data entered can be 0 characters in length. Doris Manning Database Administrator Hargrove Inc. www.hargroveinc.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Friday, August 20, 2004 10:16 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Quick and dirty on varchar When I imported this mailing list into SQL Server every field came in as varchar 255. What does this mean in terms of actual disk space used? Does SQL Server "reserve" 255 bytes for that field or does it dynamically assign just enough space to hold the actual contents. Many if not most of the fields are a Y or N (actual character in the text coming in). Would it benefit me to change the data type for these columns in terms of the actual data storage size? If I did that would the DTS doing the import from the comma delimited field still function correctly (I assume yes to that since the data still fits). John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Thursday, August 19, 2004 10:49 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] External hard disk I purchased a "mobile disk" external hard disk enclosure from CompUSA to put one of the 200gb hard disks in so I could take the db with me and work off my laptop. I found a diag program out there somewhere that tested the transfer speed. For USB 2.0 the transfer speed was ~25 mbyte / sec. For fire wire (4 pin) it was only 17.5 mbyte / sec. This with the Maxtor 200g 8mb cache. John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Michael Maddison Sent: Thursday, August 19, 2004 7:44 PM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Block size for SQL Server disk Most of the recommendations for sector size say bigger is better. I have a ppt presentation from UNISYS that looks at all the options for setting up a SQL server box. IIRC they recommend 64K because it allows the fastest file growth. While SQL uses 8k page sizes I'm not sure that relates directly to sector size! If anyone wants a copy of the ppt lemme know. cheers Michael M On Thu, 19 Aug 2004 16:01:05 -0400, John W. Colby wrote: > I have purchased new disks to build this new SQL Server database on. I > believe I have read that SQL Server uses 8k block sizes internally. > Is it useful to format the hard disk to 8K sectors so that the SQL > blocks map to sectors directly? Does anyone know if this helps or > hinders? Although Sql Server uses 8k block sizes internally I've never heard on any list that it's a good idea to format your hdd that way... so I'd say nope. 
-- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Fri Aug 20 19:22:21 2004 From: artful at rogers.com (Arthur Fuller) Date: Fri, 20 Aug 2004 20:22:21 -0400 Subject: [dba-SQLServer] Out of Disk In-Reply-To: <005e01c48626$c5208ae0$80b3fea9@ColbyM6805> Message-ID: <01e201c48714$ecdfc480$6601a8c0@rock> Copy the files to the new hard disk into a dir of your choice. Then delete the database within EM. Then open QA and run sp_attach_single_file_db "JWCdata", pointing to the new location. Check BOL for precise syntax, but that's the idea and it's trivial. You'll have it done in a few minutes. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Thursday, August 19, 2004 3:57 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Out of Disk I imported more of the data into this big db I'm working on and now SQL Server shuts down after 5 seconds or so after startup. The error log says "not enough room on disk" for something it is trying to build (a temp db I think). I am down to 15gb on the disk that the SQL Server data resides on. The question is, can I simply copy the mdf and ldf files for this new database off to the new hard disk I installed to free up memory? I understand that SQL Server would now be missing the database, but can I then detach it and reattach it from the new drive? If not I will need to clean up enough space (amount unknown) to get SQL Server to stay up so that I can correctly detach the db and then move it. John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Fri Aug 20 19:24:18 2004 From: artful at rogers.com (Arthur Fuller) Date: Fri, 20 Aug 2004 20:24:18 -0400 Subject: [dba-SQLServer] Timeout In-Reply-To: <4125B2F3.3024.34ED260@lexacorp.com.pg> Message-ID: <01e301c48715$32c9de90$6601a8c0@rock> I can't verify the internals but I agree with your suggestion. Almost all SQL books suggest that Count(*) is highly optimized while Count(MyField) is not.
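A quick way to see why the two forms are not interchangeable, as a throwaway script (the temp table is hypothetical):
CREATE TABLE #t (ID INT NULL)
INSERT INTO #t VALUES (1)
INSERT INTO #t VALUES (NULL)
-- COUNT(*) counts all rows; COUNT(ID) counts only non-NULL values
SELECT COUNT(*) AS AllRows, COUNT(ID) AS NonNullIDs FROM #t -- returns 2 and 1
DROP TABLE #t
Since the ID being counted in this thread is the primary key (never NULL), both forms return the same number; the difference is the per-row NULL test the engine may have to perform.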
-----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stuart McLachlan Sent: Thursday, August 19, 2004 6:15 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Timeout On 19 Aug 2004 at 13:38, John W. Colby wrote: > SELECT COUNT(ID) AS Cnt > FROM dbo.Conduit > > Which is built by the query builder in EM. The query times out. > [Microsoft][ODBC SQL Server Driver]Timeout Expired. Anyone have a > clue where I go to increase this value so I can get a count. Are you running this from the Query Analyzer or an Access (or whatever) FE using ODBC? If the latter, go into the ODBC Manager in your Control Panel. Also, try Count(*) rather than specifying a field. -- Stuart _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From stuart at lexacorp.com.pg Fri Aug 20 20:18:04 2004 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Sat, 21 Aug 2004 11:18:04 +1000 Subject: [dba-SQLServer] Timeout In-Reply-To: <01e301c48715$32c9de90$6601a8c0@rock> References: <4125B2F3.3024.34ED260@lexacorp.com.pg> Message-ID: <41272F6C.8395.3FDDA1D@lexacorp.com.pg> On 20 Aug 2004 at 20:24, Arthur Fuller wrote: > I can't verify the internals but I agree with your suggestion. Almost > all SQL books suggest that Count(*) is highly optimized while > Count(MyField) is not. > Just came across another trick to do it much faster still: There is another way to determine the total row count in a table. You can use the sysindexes system table for this purpose. There is a ROWS column in the sysindexes table. This column contains the total row count for each table in your database. So, you can use the following select statement instead of the one above: SELECT rows FROM sysindexes WHERE id = OBJECT_ID('table_name') AND indid < 2 -- Stuart From stuart at lexacorp.com.pg Fri Aug 20 20:33:28 2004 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Sat, 21 Aug 2004 11:33:28 +1000 Subject: [dba-SQLServer] Timeout In-Reply-To: <01e301c48715$32c9de90$6601a8c0@rock> References: <4125B2F3.3024.34ED260@lexacorp.com.pg> Message-ID: <41273308.2068.40BF37D@lexacorp.com.pg> On 20 Aug 2004 at 20:24, Arthur Fuller wrote: > I can't verify the internals but I agree with your suggestion. Almost > all SQL books suggest that Count(*) is highly optimized while > Count(MyField) is not. > SQLBOL COUNT(*) returns the number of items in a group, including NULL values and duplicates. COUNT(ALL expression) evaluates expression for each row in a group and returns the number of nonnull values. Count(*) just does a row scan. Count(Field) needs to compare the value of Field to Null in every row. -- Stuart From sqlserver667 at yahoo.com Tue Aug 24 05:41:11 2004 From: sqlserver667 at yahoo.com (S D) Date: Tue, 24 Aug 2004 03:41:11 -0700 (PDT) Subject: [dba-SQLServer] Schudeled Items? Message-ID: <20040824104111.59766.qmail@web53305.mail.yahoo.com> Hi group, I (urgently) need to find out which DTS/SP object is scheduled at 16:00 hours. It is sending a lot of mail, and this is very annoying and can cause problems in due time. So, my question: how can I figure out which DTS/SP is scheduled? TIA Sander PS: We've got over 1000 DTS/SP objects in our development environment, so I hope I do not have to check them one by one :-) __________________________________ Do you Yahoo!? New and Improved Yahoo!
Mail - 100MB free storage! http://promotions.yahoo.com/new_mail From DElam at jenkens.com Tue Aug 24 09:28:48 2004 From: DElam at jenkens.com (Elam, Debbie) Date: Tue, 24 Aug 2004 09:28:48 -0500 Subject: [dba-SQLServer] Schudeled Items? Message-ID: <7B1961ED924D1A459E378C9B1BB22B4C02485383@natexch.jenkens.com> Why don't you look at the list of scheduled jobs? Find the ones that ran last at 16:00 hours and narrow the list from there. I know my SQL Enterprise Manager has a Last Run Status (Start Date) field that can be sorted. Debbie -----Original Message----- From: S D [mailto:sqlserver667 at yahoo.com] Sent: Tuesday, August 24, 2004 5:41 AM To: sqlserver667 Subject: [dba-SQLServer] Schudeled Items? Hi group, I (urgently) need to find out wich DTS/SP object is scheduled at 16:00 hours. It is sending a lot of mail and this is very annoying and can cause problems in due time. So, my question: how can I figure out wich DTS/SP is scheduled? TIA Sander PS: We've got over a 1000 DTS/SP objects in our development street so I do not hope that I have to check them 1-by-1 :-) __________________________________ Do you Yahoo!? New and Improved Yahoo! Mail - 100MB free storage! http://promotions.yahoo.com/new_mail _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com - JENKENS & GILCHRIST E-MAIL NOTICE - This transmission may be: (1) subject to the Attorney-Client Privilege, (2) an attorney work product, or (3) strictly confidential. If you are not the intended recipient of this message, you may not disclose, print, copy or disseminate this information. If you have received this in error, please reply and notify the sender (only) and delete the message. Unauthorized interception of this e-mail is a violation of federal criminal law. This communication does not reflect an intention by the sender or the sender's client or principal to conduct a transaction or make any agreement by electronic means. Nothing contained in this message or in any attachment shall satisfy the requirements for a writing, and nothing contained herein shall constitute a contract or electronic signature under the Electronic Signatures in Global and National Commerce Act, any version of the Uniform Electronic Transactions Act or any other statute governing electronic transactions. From jwcolby at colbyconsulting.com Wed Aug 25 00:10:36 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Wed, 25 Aug 2004 01:10:36 -0400 Subject: [dba-SQLServer] Big db update In-Reply-To: <41273308.2068.40BF37D@lexacorp.com.pg> Message-ID: <000101c48a61$db9b37e0$80b3fea9@ColbyM6805> As you know by now I am working on a largish database, 65 million names / addresses plus demographics, ~600 fields it turns out. Over the last weekend I tried working on it using my P64 laptop with 512mb RAM. I bought a pair of 200gb disks which I will end up putting in my server but for now I took one and put it in an external enclosure and hooked it to the laptop over a usb2.0 port. I then started trying to import the database using dts. The raw data is broken down into zip files (PK zip) each containing 3 million records. The imports worked but as I went along, the times to import kept getting longer and longer. Eventually file 5 failed to import. It turned out that the database file was fragmented into 179 THOUSAND fragments!!!!! Defrag wouldn't work. The database file by this time was up around 80 gbytes. 
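Pre-growing the data file once, instead of letting the default 10% autogrow fire over and over, avoids exactly this fragmentation. A minimal sketch, with the database name, logical file name, and target size as placeholders:
ALTER DATABASE BigDB
MODIFY FILE (NAME = BigDB_Data, SIZE = 150000MB) -- one large allocation up front
Erwin makes the same point later in the thread: set the initial size to the estimated final size, and shrink afterwards if it turns out too big.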
I went down to Wal-Mart and bought another (Seagate) 120gb external drive and hooked that up to another usb port. By copying the 80g file from the first drive to the new drive, it defragged. I then tried to import again. File 5 went in but file 6 failed. Again, the file was fragmented into ONLY about 140 THOUSAND fragments. No amount of coaxing on my part would allow the 6th file to import. I even figured out how to use BCP but that also failed. Tempdb read failure, which BTW was out on my C: drive. Anyway I just gave up at that point figuring that 512 mb of RAM was simply insufficient. I had ordered a pair of 1g sticks of ram, so tonight when I got home from the "vacation" I threw them in the server and went back to work. I had done the import of files 1 and 2 on my server and then moved them to the external enclosure to take to Syracuse on "vacation". Tonight I installed the ram in my desktop and used BCP to import the 3rd, 4th and (currently working on) 5th files. So far it is as smooth as silk. The database resizes as needed but so far the database file is at 91g and only 3 fragments, and that is due I believe to having to fit around existing files out on the disk. So far, MUCH better. The SQL Server piece grabs 1.9gb of ram for itself. There is no sign of swap file activity which was just a killer on the laptop. With only 512mb of ram the poor laptop was just beating up the swap file. I tried moving it out to the external disks but the laptop refused to boot that way so I had to keep it on the C: drive. With 2g RAM there is no swap file activity at all, even though all of the ram is used by SQL Server. I am doing the BCP directly on the server using Query Analyzer, with the 10g (decompressed) text "source" files on one drive and only the database on the 200g disk. SQL Server pegs the CPU needle, 100% usage and about 95% memory usage. I understand that I could set up my various other machines to each run BCP simultaneously against the SQL Server instance running on the server, but it turns out that the client really wants the file reconstructed in the order of the text files so that's a no go. It appears however that I may actually get my poor underpowered 2.5g AMD Barton with 2g RAM to work this file. We shall have to wait and see whether I get all 65 million records in 200gb of hard disk. It is looking like I won't since I am on the 5th file and the db is already up around 90g with 17 more files to go!!! At this rate, I am looking at MORE THAN 400gb. On my laptop I actually compressed the disk which worked very well actually (other than the extra overhead in compression). I seemed to be getting about 2:1 compression of the database file. This thing is getting out of hand I must say. Anyway, I just thought you folks might like a little update on the trials and tribulations I've been going through trying to get this working. John W. Colby www.ColbyConsulting.com From stuart at lexacorp.com.pg Wed Aug 25 00:45:49 2004 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Wed, 25 Aug 2004 15:45:49 +1000 Subject: [dba-SQLServer] Big db update In-Reply-To: <000101c48a61$db9b37e0$80b3fea9@ColbyM6805> References: <41273308.2068.40BF37D@lexacorp.com.pg> Message-ID: <412CB42D.16598.504489C@lexacorp.com.pg> On 25 Aug 2004 at 1:10, John W. Colby wrote: > is already up around 90g with 17 more files to go!!! At this rate, I am > looking at MORE THAN 400gb. > Don't say you weren't warned. On Fri 13th, I told you 2. 
You are looking at *several* hundred gig of data and indexes and huge processing power to be able to pull selected data sets. -- Stuart From sqlserver667 at yahoo.com Wed Aug 25 01:56:51 2004 From: sqlserver667 at yahoo.com (S D) Date: Tue, 24 Aug 2004 23:56:51 -0700 (PDT) Subject: [dba-SQLServer] Schudeled Items? In-Reply-To: <7B1961ED924D1A459E378C9B1BB22B4C02485383@natexch.jenkens.com> Message-ID: <20040825065651.89527.qmail@web53303.mail.yahoo.com> Thanks for your reply Debbie. I did some digging on MSDN and found several useful tables in the MSDB database: sysjobs sysjobsteps sysjobhistory sysjobschedules select * from sysjobschedules where freq_type = 4 With these I found out which jobs were scheduled (a join that also pulls the job name is sketched after the quoted thread below). The job that fires over and over is TestFromBill. However I do NOT see this job?!? Could it be that this one is deleted/corrupted somehow but the scheduling wasn't deleted?? Regards, Sander --- "Elam, Debbie" wrote: > Why don't you look at the list of scheduled jobs? > Find the ones that ran > last at 16:00 hours and narrow the list from there. > I know my SQL > Enterprise Manager has a Last Run Status (Start > Date) field that can be > sorted. > > Debbie > > -----Original Message----- > From: S D [mailto:sqlserver667 at yahoo.com] > Sent: Tuesday, August 24, 2004 5:41 AM > To: sqlserver667 > Subject: [dba-SQLServer] Schudeled Items? > > > Hi group, > > I (urgently) need to find out which DTS/SP object is > scheduled at 16:00 hours. > > It is sending a lot of mail and this is very > annoying > and can cause problems in due time. > > So, my question: > how can I figure out which DTS/SP is scheduled? > > TIA > > Sander > PS: We've got over 1000 DTS/SP objects in our > development environment, so I hope I do not have to > check them one by one :-) > > > > > __________________________________ > Do you Yahoo!? > New and Improved Yahoo! Mail - 100MB free storage! > http://promotions.yahoo.com/new_mail > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > >
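For reference, a join along these lines puts a name to each scheduled job. This is a sketch against the SQL Server 2000 msdb tables Sander lists; active_start_time stores the start time as an integer, so 160000 means 16:00:00:
SELECT j.name, s.active_start_time
FROM msdb.dbo.sysjobs j
INNER JOIN msdb.dbo.sysjobschedules s ON s.job_id = j.job_id
WHERE s.freq_type = 4 -- daily
AND s.active_start_time = 160000 -- 16:00:00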
From Erwin.Craps at ithelps.be Wed Aug 25 02:40:08 2004 From: Erwin.Craps at ithelps.be (Erwin Craps - IT Helps) Date: Wed, 25 Aug 2004 09:40:08 +0200 Subject: [dba-SQLServer] Big db update Message-ID: <46B976F2B698FF46A4FE7636509B22DF0ADB0A@stekelbes.ithelps.local> I'm not an SQL specialist but I would like to add my comment if I may? 1) Putting a database on an external disk (USB or FireWire) is not a smart thing to do. The speed of USB/FireWire is very slow compared to regular ATA, but as I read it this is an intermediate solution. I'm not sure if SATA external drives exist yet, but they are much faster and don't use CPU processing power like USB does. I noticed too that some tools do not work on external USB drives. Not sure about defrag, but anyway that would take ages due to the low speed. 2) From what I know you can set the initial size of a SQL database and log file to any size you want. So if you estimate you will have 3 GB of data, set the initial size of the DB and log to 3GB. The file will be created in one shot, so it will not fragment unless the size grows above 3 GB. Secondly, your additions will go faster. In your old situation SQL Server adds 10% of free space whenever the DB is too small. What you experienced is that SQL adds some records, expands the database by x%, adds some records, expands the DB by x%, adds some records... and so on. This creates the fragmentation. If the size is too big at the end of your additions, there is a way to shrink the file again. 3) Again I'm not an SQL specialist, but I wonder if the log file can be turned off? When adding so much data I suppose the log file (and certainly on your USB disk) will dramatically slow down the process. I wonder if the log file can be temporarily turned off until all data is in the DB. I suppose this must be possible. Of course you must not encounter any problem while you do that, like a power failure. Erwin
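On point 3: the log cannot be switched off entirely, but the bulk-logged recovery model in SQL Server 2000 comes close for a load like this. A minimal sketch, with the database name as a placeholder:
ALTER DATABASE BigDB SET RECOVERY BULK_LOGGED
-- run the BCP / BULK INSERT loads here; qualifying bulk operations are minimally logged
ALTER DATABASE BigDB SET RECOVERY FULL
-- take a fresh backup afterwards; point-in-time recovery is limited across bulk-logged work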
-----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Wednesday, August 25, 2004 7:11 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Big db update As you know by now I am working on a largish database, 65 million names / addresses plus demographics, ~600 fields it turns out. Over the last weekend I tried working on it using my P64 laptop with 512mb RAM. I bought a pair of 200gb disks which I will end up putting in my server but for now I took one and put it in an external enclosure and hooked it to the laptop over a usb2.0 port. I then started trying to import the database using dts. The raw data is broken down into zip files (PK zip) each containing 3 million records. The imports worked but as I went along, the times to import kept getting longer and longer. Eventually file 5 failed to import. It turned out that the database file was fragmented into 179 THOUSAND fragments!!!!! Defrag wouldn't work. The database file by this time was up around 80 gbytes. I went down to Wal-Mart and bought another (Seagate) 120gb external drive and hooked that up to another usb port. By copying the 80g file from the first drive to the new drive, it defragged. I then tried to import again. File 5 went in but file 6 failed. Again, the file was fragmented into ONLY about 140 THOUSAND fragments. No amount of coaxing on my part would allow the 6th file to import. I even figured out how to use BCP but that also failed. Tempdb read failure, which BTW was out on my C: drive. Anyway I just gave up at that point figuring that 512 mb of RAM was simply insufficient. I had ordered a pair of 1g sticks of ram, so tonight when I got home from the "vacation" I threw them in the server and went back to work. I had done the import of files 1 and 2 on my server and then moved them to the external enclosure to take to Syracuse on "vacation". Tonight I installed the ram in my desktop and used BCP to import the 3rd, 4th and (currently working on) 5th files. So far it is as smooth as silk. The database resizes as needed but so far the database file is at 91g and only 3 fragments, and that is due I believe to having to fit around existing files out on the disk. So far, MUCH better. The SQL Server piece grabs 1.9gb of ram for itself. There is no sign of swap file activity which was just a killer on the laptop. With only 512mb of ram the poor laptop was just beating up the swap file. I tried moving it out to the external disks but the laptop refused to boot that way so I had to keep it on the C: drive. With 2g RAM there is no swap file activity at all, even though all of the ram is used by SQL Server. I am doing the BCP directly on the server using Query Analyzer, with the 10g (decompressed) text "source" files on one drive and only the database on the 200g disk. SQL Server pegs the CPU needle, 100% usage and about 95% memory usage. I understand that I could set up my various other machines to each run BCP simultaneously against the SQL Server instance running on the server, but it turns out that the client really wants the file reconstructed in the order of the text files so that's a no go. It appears however that I may actually get my poor underpowered 2.5g AMD Barton with 2g RAM to work this file. We shall have to wait and see whether I get all 65 million records in 200gb of hard disk. It is looking like I won't since I am on the 5th file and the db is already up around 90g with 17 more files to go!!! At this rate, I am looking at MORE THAN 400gb. On my laptop I actually compressed the disk which worked very well actually (other than the extra overhead in compression). I seemed to be getting about 2:1 compression of the database file. This thing is getting out of hand I must say. Anyway, I just thought you folks might like a little update on the trials and tribulations I've been going through trying to get this working. John W. Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jmoss111 at bellsouth.net Wed Aug 25 05:38:33 2004 From: jmoss111 at bellsouth.net (JMoss) Date: Wed, 25 Aug 2004 05:38:33 -0500 Subject: [dba-SQLServer] Big db update In-Reply-To: <000101c48a61$db9b37e0$80b3fea9@ColbyM6805> Message-ID: John, Have you used Enterprise Manager's SHRINK on the DB and log file? Also, have you a dedup process for this monstrosity? I was heavily involved in building large databases a lot like you're currently doing, and about 10-15% of what I was loading was dups. Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Wednesday, August 25, 2004 12:11 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Big db update As you know by now I am working on a largish database, 65 million names / addresses plus demographics, ~600 fields it turns out.
Over the last weekend I tried working on it using my P64 laptop with 512mb RAM. I bought a pair of 200gb disks which I will end up putting in my server but for now I took one and put it in an external enclosure and hooked it to the laptop over a usb2.0 port. I then started trying to import the database using dts. The raw data is broken down into zip files (PK zip) each containing 3 million records. The imports worked but as I went along, the times to import kept getting longer and longer. Eventually file 5 failed to import. It turned out that the database file was fragmented into 179 THOUSAND fragments!!!!! Defrag wouldn't work. The database file by this time was up around 80 gbytes. I went down to Wal-Mart and bought another (Seagate) 120gb external drive and hooked that up to another usb port. By copying the 80g file from the first drive to the new drive, it defragged. I then tried to import again. File 5 went in but file 6 failed. Again, the file was fragmented into ONLY about 140 THOUSAND fragments. No amount of coaxing on my part would allow the 6th file to import. I even figured out how to use BCP but that also failed. Tempdb read failure, which BTW was out on my C: drive. Anyway I just gave up at that point figuring that 512 mb of RAM was simply insufficient. I had ordered a pair of 1g sticks of ram, so tonight when I got home from the "vacation" I threw them in the server and went back to work. I had done the import of files 1 and 2 on my server and then moved them to the external enclosure to take to Syracuse on "vacation". Tonight I installed the ram in my desktop and used BCP to import the 3rd, 4th and (currently working on) 5th files. So far it is as smooth as silk. The database resizes as needed but so far the database file is at 91g and only 3 fragments, and that is due I believe to having to fit around existing files out on the disk. So far, MUCH better. The SQL Server piece grabs 1.9gb of ram for itself. There is no sign of swap file activity which was just a killer on the laptop. With only 512mb of ram the poor laptop was just beating up the swap file. I tried moving it out to the external disks but the laptop refused to boot that way so I had to keep it on the C: drive. With 2g RAM there is no swap file activity at all, even though all of the ram is used by SQL Server. I am doing the BCP directly on the server using Query Analyzer, with the 10g (decompressed) text "source" files on one drive and only the database on the 200g disk. SQL Server pegs the CPU needle, 100% usage and about 95% memory usage. I understand that I could set up my various other machines to each run BCP simultaneously against the SQL Server instance running on the server, but it turns out that the client really wants the file reconstructed in the order of the text files so that's a no go. It appears however that I may actually get my poor underpowered 2.5g AMD Barton with 2g RAM to work this file. We shall have to wait and see whether I get all 65 million records in 200gb of hard disk. It is looking like I won't since I am on the 5th file and the db is already up around 90g with 17 more files to go!!! At this rate, I am looking at MORE THAN 400gb. On my laptop I actually compressed the disk which worked very well actually (other than the extra overhead in compression). I seemed to be getting about 2:1 compression of the database file. This thing is getting out of hand I must say. 
_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed Aug 25 06:19:07 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Wed, 25 Aug 2004 07:19:07 -0400 Subject: [dba-SQLServer] Big db update In-Reply-To: Message-ID: <000a01c48a95$59e66c90$80b3fea9@ColbyM6805> I looked at Shrink which will reduce the current size by 30% according to EM. It appears that the extra space is there for future expansion and seeing as I have a slew more files to import there is no point in doing so YET. I will keep you in mind for those hard questions about largish DBs! John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JMoss Sent: Wednesday, August 25, 2004 6:39 AM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Big db update John, Have you used Enterprise Manager's SHRINK on the DB and log file? <snip> _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From artful at rogers.com Wed Aug 25 07:50:02 2004 From: artful at rogers.com (Arthur Fuller) Date: Wed, 25 Aug 2004 08:50:02 -0400 Subject: [dba-SQLServer] Big db update In-Reply-To: <000a01c48a95$59e66c90$80b3fea9@ColbyM6805> Message-ID: <013d01c48aa2$09d1dca0$6601a8c0@rock> Since it's getting so large, I wish you had looked into the LaCie drives I mentioned earlier. In case you missed that post, visit www.lacie.com. Arthur -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Wednesday, August 25, 2004 7:19 AM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Big db update I looked at Shrink which will reduce the current size by 30% according to EM. It appears that the extra space is there for future expansion and seeing as I have a slew more files to import there is no point in doing so YET. I will keep you in mind for those hard questions about largish DBs! John W.
Colby www.ColbyConsulting.com From jwcolby at colbyconsulting.com Wed Aug 25 08:06:02 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Wed, 25 Aug 2004 09:06:02 -0400 Subject: [dba-SQLServer] Big db update In-Reply-To: <013d01c48aa2$09d1dca0$6601a8c0@rock> Message-ID: <000b01c48aa4$497db020$80b3fea9@ColbyM6805> I looked at them Arthur. I had already purchased 2 200g hard disks to put in my server to hold this, having no idea how big it would be but figuring that if push came to shove I could use a raid0 to get 400g. The 200g drives are the sweet spot on the price/performance curve right now. If I can get the raid0 to compress I can get about 800gb for somewhere in the neighborhood of $250, otherwise it will be 400g, which it appears will NOT be enough. 8-( Speaking of compression, does anyone know what the key is to getting a drive to compress? I looked it up in help and it says just that it has to be an NTFS drive. These are, but the "allow compression" check box is not available. I got one of my external disks to compress; on the other I couldn't see that check box. No idea why or what the pattern is. John W. Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Wednesday, August 25, 2004 8:50 AM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Big db update Since it's getting so large, I wish you had looked into the LaCie drives I mentioned earlier. In case you missed that post, visit www.lacie.com. Arthur -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby Sent: Wednesday, August 25, 2004 7:19 AM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Big db update I looked at Shrink which will reduce the current size by 30% according to EM. <snip> _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From JColby at dispec.com Wed Aug 25 09:23:24 2004 From: JColby at dispec.com (Colby, John) Date: Wed, 25 Aug 2004 10:23:24 -0400 Subject: [dba-SQLServer] Log file Message-ID: <05C61C52D7CAD211A7830008C7DF6F1079BDCC@DISABILITYINS01> Martin, When I tried this (I'm assuming the log file is the .LDF?) I couldn't get the SQL Server to open the database. -----Original Message----- From: Martin Reid [mailto:mwp.reid at qub.ac.uk] Sent: Thursday, August 19, 2004 4:20 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Log file John One of the things I did recently was to actually detach a database and log file. I then removed the log file completely as it was HUGE. Put it away somewhere safe. I then reattached the MDF file and had SQL automatically recreate a new empty log file. Not sure how professional this is but appeared to work fine for my needs. You can also move the log out somewhere else: detach the database, move the log to its new location, then reattach with sp_attach_db, giving it the MDF path and the log file's new path. Should be OK. Oh and back it all up first if you have the disc space.
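A minimal sketch of that sequence in T-SQL, run from the master database — the database name and the paths here are hypothetical:

-- Detach; the second argument skips UPDATE STATISTICS
EXEC sp_detach_db 'BigDB', 'true'
-- (move BigDB_log.ldf to its new drive here, outside SQL Server)
-- Reattach, pointing at the log file's new location
EXEC sp_attach_db @dbname = N'BigDB',
   @filename1 = N'C:\MSSQL\Data\BigDB.mdf',
   @filename2 = N'E:\Logs\BigDB_log.ldf'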
Martin ----- Original Message ----- From: "John W. Colby" To: Sent: Thursday, August 19, 2004 9:08 PM Subject: [dba-SQLServer] Log file > Is the ldf the log file? If so I understand it is possible to place that > file on a different drive from the data file? How do I limit the size of > the log file and dictate the location? > > John W. Colby > www.ColbyConsulting.com > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. > Colby > Sent: Thursday, August 19, 2004 3:57 PM > To: dba-sqlserver at databaseadvisors.com > Subject: [dba-SQLServer] Out of Disk > > > I imported more of the data into this big db I'm working on and now SQL > Server shuts down after 5 seconds or so after startup. The error log says > "not enough room on disk" for something it is trying to build (a temp db I > think). I am down to 15gb on the disk that the SQL Server data resides on. > > > The question is, can I simply copy the mdf and ldf files for this new > database off to the new hard disk I installed to free up memory? I > understand that SQL Server would now be missing the database, but can I then > detach it and reattach it from the new drive? If not I will need to clean > up enough space (amount unknown) to get SQL Server to stay up so that I can > correctly detach the db and then move it. > > John W. Colby > www.ColbyConsulting.com > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From JColby at dispec.com Wed Aug 25 09:24:30 2004 From: JColby at dispec.com (Colby, John) Date: Wed, 25 Aug 2004 10:24:30 -0400 Subject: [dba-SQLServer] Timeout Message-ID: <05C61C52D7CAD211A7830008C7DF6F1079BDCD@DISABILITYINS01> The query analyzer does indeed work fine where EM times out. Strange but true. JWC -----Original Message----- From: John Maxwell @ London City [mailto:John.Maxwell2 at ntl.com] Sent: Thursday, August 19, 2004 1:58 PM To: 'dba-sqlserver at databaseadvisors.com' Subject: RE: [dba-SQLServer] Timeout I also get this problem using the query builder in EM but fine when use Query analyser If in rush for results may be worth switching while waiting for a proper solution to be posted. Regards john -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com]On Behalf Of John W. Colby Sent: 19 August 2004 18:39 To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Timeout I am trying to run a simple count query on this big db I'm building. The SQL is SELECT COUNT(ID) AS Cnt FROM dbo.Conduit Which is built by the query builder in EM. The query times out. [Microsoft][ODBC SQL Server Driver]Timeout Expired. Anyone have a clue where I go to increase this value so I can get a count. ID BTW is the "autonumber" PK. John W. 
Colby www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From JColby at dispec.com Wed Aug 25 09:25:32 2004 From: JColby at dispec.com (Colby, John) Date: Wed, 25 Aug 2004 10:25:32 -0400 Subject: [dba-SQLServer] Using task manager to close SQL Server Message-ID: <05C61C52D7CAD211A7830008C7DF6F1079BDCE@DISABILITYINS01> More memory seemed to do the trick. I now have 2GB in my machine and it is at least smooth if not exactly speedy. -----Original Message----- From: Michael Maddison [mailto:michael at ddisolutions.com.au] Sent: Thursday, August 19, 2004 12:22 AM To: dba-sqlserver at databaseadvisors.com Subject: RE: [dba-SQLServer] Using task manager to close SQL Server John, The db won't be hurt if you stop the transaction. What I would do... YMMV... Change Recovery mode to simple (cuts logging down) Drop all indexes inc PK before attempting to add the identity column. Script them using QA so you can put them back after. Turn off all unnecessary services. Have nothing else running. Buy or upgrade to a bigger box ;-))) You need multiple drives and processes. regards Michael M Francisco, This was until recently my desktop machine for Access dev. No need for more memory really. I "retired" it when I got my laptop which is a P64 with 512m ram (desktop replacement laptop). I will now be pulling it back into service as a (temporary) SQL Server machine. The db will hold a 60+ million record flat file of name/address/demographics with 14+ million more to come immediately, and possibly merging in a couple of other dbs that are currently hosted elsewhere. My intention is to build a server since I am "bootstrapping" this operation (little cash). The current machine is an AMD 2.5g Barton with 512 mb ram running Windows2K Pro and SQL Server 2K. Awhile back I bought a RAID card and a couple of 120gb hard disks for the main c: drive (Raid1) which is where the current db resides. I purchased a couple of 200gb Maxtors with 8m cache which I intended to mirror, then throw the db out there. We'll just have to wait and see how big this mutha gets. I can see however that an immediate memory upgrade would be good if I intend to use this thing for long. Long term I'm looking at building a dual Opteron. I have been looking at how to get lots of memory and processing power though reading the web page you provided brings into doubt my ability to use said memory. 8-( John W.
Colby www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Wednesday, August 18, 2004 5:32 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Using task manager to close SQL Server On Wed, 18 Aug 2004 16:56:21 -0400, John W. Colby wrote: > I am trying to add a "PK" autoincrement to this huge db I'm working > with. There are around 6 million records at the moment. I went in > design view and added the field, then "saved" the table and it took > off adding the new field and putting values in the table. The problem > is it is taking FOREVER (running for many hours so far) and is not > giving any status indicator to say when it will be done. I absolutely > must get back to work doing some stuff in the db. > > If I "close" enterprise manager in the middle, will it damage my db? > I am assuming not but don't want to take any chances. > > It appears that the reason it is taking so long is extensive use of > the swap file. Enterprise manager is currently using 421 mbytes on a > 512m machine. I would go somewhere and buy some 512mb sticks if it > will help (and it appears it will). You know what, I also forgot to mention that you ought to look into how you arrange the new hdds for your pc as well. Since you are DOING some Sql Server development, it is to your advantage to have your 2ndary data hdd on a separate I/O card; this is important in a production environment, but also in a development one as well. What you'll find is that the system may become busy doing its updates, but your PC will continue running like a champ, because all the heavy I/O processes are offloaded to a separate card/channel. So if you don't have a second ATA or SATA card go out and look for one when you are out shopping for your 1gb of ram. You'll notice that the performance of the database increases like nuts. IIRC you just recently bought a 200 or 250gb drive w/ 8mb of cache did you not? Adding the I/O card will enhance the performance on that drive (or your 2ndary drives you have) -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Wed Aug 25 11:06:30 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 25 Aug 2004 09:06:30 -0700 Subject: [dba-SQLServer] Big db update In-Reply-To: <000a01c48a95$59e66c90$80b3fea9@ColbyM6805> References: <000a01c48a95$59e66c90$80b3fea9@ColbyM6805> Message-ID: On Wed, 25 Aug 2004 07:19:07 -0400, John W. Colby wrote: > I looked at Shrink which will reduce the current size by 30% according to > EM. It appears that the extra space is there for future expansion and > seeing as I have a slew more files to import there is no point in doing so > YET. > > I will keep you in mind for those hard questions about largish DBs!
You are becoming quite the VLDB expert :). One question I had for you: on your log file settings, do you have your db set to either BULK Logged or SIMPLE? This will reduce the amount of data written to the transaction log while you are doing this "Massive" rollup. -- -Francisco From alan.lawhon at us.army.mil Wed Aug 25 11:24:31 2004 From: alan.lawhon at us.army.mil (Lawhon, Alan C Contractor/Morgan Research) Date: Wed, 25 Aug 2004 11:24:31 -0500 Subject: [dba-SQLServer] Trouble Establishing a Connection to Remote SQL Server Tables Using DAO 3.6 Message-ID: <5D5043687CFCE44288407A73E4CC6E17449118@redstone819.ad.redstone.army.mil> Dear Experts: I am having a terrible time with a Visual Basic code module which accesses three (local) Access 2000 tables and three [remote] SQL Server 2000 backend tables. I am using the DAO 3.6 object model. (I know, I should have used the ADO object model instead, but the "dirty deed" has already been done ...) I have dimensioned three DAO recordset objects (i.e. rst1, rst2, and rst3) to hold the records from the three [local] Access tables. I am attempting to get the records from the three [remote] SQL Server tables into rst4, rst5, and rst6 DAO recordset objects. The problem I'm having is getting a "connection" established to the SQL Server back end so that I can use "Set" statement assignments to the rst4, rst5, and rst6 recordset objects. I would be very grateful if a sharp troubleshooter could take a look at the following code snippet and impart some wisdom as to why the [recordset] "Connect" method is not working. Thanks, Alan C. Lawhon ----------------------- Option Compare Database Sub MERGE_NEW_HMIS_RECORDS() ' ' NOTE1: Initially, until this code has been thoroughly debugged and checked out, ' data merge read/writes will be to (local) copies of the Production tables, ' and NOT to the actual production tables! Once it is determined that the ' code is working properly, we'll change the references from the local ' tables to the actual SQL Server remote back-end tables and add ' "connection strings" to the rst4, rst5, and rst6 recordset object variables. ' ' Dim dbs As DAO.Database ' Dim rst1 As DAO.Recordset ' Will hold "HMIS_NSN_MAIN" (HMIS NSN) records Dim rst2 As DAO.Recordset ' Will hold "HMIS_NSN_PRODUCTS" (HMIS Products) records Dim rst3 As DAO.Recordset ' Will hold "HMIS_NSN_CONSTITUENTS" (HMIS Constituents) records ' Dim rst4 As DAO.Recordset ' Will hold "EDS_NSN_MAIN_TABLE" (Production table) records Dim rst5 As DAO.Recordset ' Will hold "EDS_NSN_PRODUCTS_TABLE" (Production table) records Dim rst6 As DAO.Recordset ' Will hold "EDS_NSN_CONSTITUENTS" (Production table) records ' Dim Temp_PRODUCT_Rst As DAO.Recordset Dim Temp_CONSTITUENT_Rst As DAO.Recordset ' ' Dim NSN_Var As String Dim PRODUCT_NUM_Var As Long Dim MSDSSRNO_Var As String ' ' Set dbs = CurrentDb ' ' Set rst1 = dbs.OpenRecordset("HMIS_NSN_MAIN", dbOpenTable) Set rst2 = dbs.OpenRecordset("HMIS_NSN_PRODUCTS", dbOpenTable) Set rst3 = dbs.OpenRecordset("HMIS_NSN_CONSTITUENTS", dbOpenTable) ' MsgBox "Connection to [local] HMIS tables successfully established." ' ' Execution "blows up" (halts) on the following statement. (I'm attempting to access the remote SQL Server back end tables via a "Connect" method.)
' ' rst4.Connect = "ODBC;DRIVER=SQL Server;SERVER=ETT-SQL;UID=sysuser;PWD=;DATABASE=EDS_REVIEW;Address=ETT-SQL,1433" rst5.Connect = "ODBC;DRIVER=SQL Server;SERVER=ETT-SQL;UID=sysuser;PWD=;DATABASE=EDS_REVIEW;Address=ETT-SQL,1433" rst6.Connect = "ODBC;DRIVER=SQL Server;SERVER=ETT-SQL;UID=sysuser;PWD=;DATABASE=EDS_REVIEW;Address=ETT-SQL,1433" ' ' Set rst4 = dbs.OpenRecordset("EDS_NSN_MAIN_TABLE", dbOpenTable) Set rst5 = dbs.OpenRecordset("EDS_NSN_PRODUCTS_TABLE", dbOpenTable) Set rst6 = dbs.OpenRecordset("EDS_NSN_CONSTITUENTS", dbOpenTable) ' MsgBox "Connection to production tables successfully established." ' ' rst1.Close rst2.Close rst3.Close rst4.Close rst5.Close rst6.Close ' ' Exit Sub ' Temporarily exit subroutine until we get connection to production ' tables established. ' ' ' From JColby at dispec.com Wed Aug 25 11:28:44 2004 From: JColby at dispec.com (Colby, John) Date: Wed, 25 Aug 2004 12:28:44 -0400 Subject: [dba-SQLServer] Big db update Message-ID: <05C61C52D7CAD211A7830008C7DF6F1079BDD0@DISABILITYINS01> Francisco, Unfortunately I am not really up to speed on SQL Server so this is very much "learn as you go". I have always wanted a project that demanded SQL Server so I could get up to speed on it while earning a living. I am quite willing to drop my rate (or bill fewer hours than actually spent) to reflect my "on the job learning" but to just go learn something as complex as SQL Server when the knowledge would just grow stagnant has always seemed a waste. If this client is as big as it appears, I may some day be knowledgeable on SQL Server. And no, I haven't tuned the log file to account for the bulk inserts. At the moment, each 3 million name source file is taking about 45 minutes to pull in, which really gives me time to go read the manual and do other exploring while the insert is happening. I am working on-site ATM, but when I get home I hope to get one of my other desktops running the BCP and use my laptop to simultaneously start querying the db. Not really sure if that is even possible but I would hope so. Using the server itself to do the BCP causes the server to "lock up" by putting all the processor cycles and all the memory into the process. Perhaps by having another workstation do the BCP, the server can "just be a server", offloading some of the work to the workstation, thus allowing another workstation (my laptop) to also sneak in some queries and stuff. Of course I also have to get the db compacted so that it will fit on my 120g Seagate external (usb) hard disk, then try and get raid0 running to use the two hard disks as a big 400g drive (I'm gonna need it!) then copy the db file back to the raid0 drive and pick up on the BCPs where I left off. I have to stop the import here or the db won't fit on my Seagate external drive and I'll be unable to get the raid0 array working. This has definitely been an experience and a challenge. I do love a challenge so it's been fun so far. JWC -----Original Message----- From: Francisco Tapia [mailto:fhtapia at gmail.com] Sent: Wednesday, August 25, 2004 12:07 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Big db update On Wed, 25 Aug 2004 07:19:07 -0400, John W. Colby wrote: > I looked at Shrink which will reduce the current size by 30% according to > EM. It appears that the extra space is there for future expansion and > seeing as I have a slew more files to import there is no point in doing so > YET. > > I will keep you in mind for those hard questions about largish DBs!
You are becoming quite the VLDB expert :). One question I had for you: on your log file settings, do you have your db set to either BULK Logged or SIMPLE? This will reduce the amount of data written to the transaction log while you are doing this "Massive" rollup. -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Wed Aug 25 19:53:48 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 25 Aug 2004 17:53:48 -0700 Subject: [dba-SQLServer] Log file In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDCC@DISABILITYINS01> References: <05C61C52D7CAD211A7830008C7DF6F1079BDCC@DISABILITYINS01> Message-ID: John, Did you not get a copy of my message on a single file re-attach command from QA? On Wed, 25 Aug 2004 10:23:24 -0400, Colby, John wrote: > Martin, > > When I tried this (I'm assuming the log file is the .LDF?) I couldn't get > the SQL Server to open the database. -- -Francisco From fhtapia at gmail.com Wed Aug 25 20:02:54 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 25 Aug 2004 18:02:54 -0700 Subject: [dba-SQLServer] Gmail Invite Message-ID: Goes to the FIRST reply I get off the list who can provide a clean knock knock joke I haven't heard before :) -- -Francisco From JColby at dispec.com Thu Aug 26 09:13:08 2004 From: JColby at dispec.com (Colby, John) Date: Thu, 26 Aug 2004 10:13:08 -0400 Subject: [dba-SQLServer] Big db update Message-ID: <05C61C52D7CAD211A7830008C7DF6F1079BDD3@DISABILITYINS01> Francisco, Last night I tried to modify the log file size. It was already up to around 10 gb and refused to allow me to set the numbers down. Something about "can only expand the file". Where do I find these "bulk logged" or "SIMPLE" settings? JWC -----Original Message----- From: Francisco Tapia [mailto:fhtapia at gmail.com] Sent: Wednesday, August 25, 2004 12:07 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Big db update <snip> _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From JColby at dispec.com Thu Aug 26 09:14:01 2004 From: JColby at dispec.com (Colby, John) Date: Thu, 26 Aug 2004 10:14:01 -0400 Subject: [dba-SQLServer] Log file Message-ID: <05C61C52D7CAD211A7830008C7DF6F1079BDD4@DISABILITYINS01> Nope, I didn't get that. Every time I try it just shows the missing .ldf in red and refuses to attach saying that the db isn't valid.
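One possibility, not confirmed in this thread: when the .ldf is gone entirely, SQL Server 2000 can rebuild an empty log during attach if sp_attach_single_file_db is used instead of sp_attach_db — the database name and path here are hypothetical:

EXEC sp_detach_db 'BigDB', 'true'
-- With the log file deleted or unavailable, attach the .mdf alone;
-- SQL Server creates a new, empty log file for the database
EXEC sp_attach_single_file_db @dbname = N'BigDB',
   @physname = N'C:\MSSQL\Data\BigDB.mdf'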
JWC -----Original Message----- From: Francisco Tapia [mailto:fhtapia at gmail.com] Sent: Wednesday, August 25, 2004 8:54 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Log file John, Did you not get a copy of my message on a single file re-attach command from QA? On Wed, 25 Aug 2004 10:23:24 -0400, Colby, John wrote: > Martin, > > When I tried this (I'm assuming the log file is the .LDF?) I couldn't get > the SQL Server to open the database. -- -Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Thu Aug 26 10:42:56 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Thu, 26 Aug 2004 08:42:56 -0700 Subject: [dba-SQLServer] Log file In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDD4@DISABILITYINS01> References: <05C61C52D7CAD211A7830008C7DF6F1079BDD4@DISABILITYINS01> Message-ID: Here it is again... One way is to detach your current database from QA like this: (remember to be in the "master" database when running these commands) EXEC sp_detach_db 'MyDB', True then reATTACH it by running the sp_attach_db sproc, EXEC sp_attach_db @dbname = N'MyDB', @filename1 = N'd:\SqlServer\data\MyDB.mdf', @filename2 = N'f:\FastLogDisk\MyDB_log.ldf' To LIMIT your log file to a specific size, do this: IN EM: right click and go into the database properties Click on the Transaction Log TAB Deselect Automatically Grow File WARNING: you will NOW need to backup your transaction log more often in order to reuse some wasted space; one good way is to backup the transaction log when it reaches 60% of its utilization space. You can do this by adding an Alert: IN EM: Under the Management folder and under the SQL Server Agent icon click on Alerts and create a new Alert. Give your alert a meaningful name In the General tab: Type choose: Sql Server Performance condition alert (enabled) Object: SqlServer:Databases Counter: Percent Log Used Instance: MyDb Alert If Counter: rises above Value: 60 In the Response Tab (Check Execute Job) and create a job (the three ... dots) your job should have the following TSQL job for backup: BACKUP LOG [MyDB] TO [LogBackupDeviceName] WITH INIT Then OK to save all your settings... I hope this helps you out. On Thu, 26 Aug 2004 10:14:01 -0400, Colby, John wrote: > Nope, I didn't get that. Every time I try it just shows the missing .ldf > in red and refuses to attach saying that the db isn't valid. > > JWC -- -Francisco From fhtapia at gmail.com Thu Aug 26 11:54:44 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Thu, 26 Aug 2004 09:54:44 -0700 Subject: [dba-SQLServer] Log file In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDD6@DISABILITYINS01> References: <05C61C52D7CAD211A7830008C7DF6F1079BDD6@DISABILITYINS01> Message-ID: You want to run the EXEC statements from QA (Query Analyzer). btw, I forgot I had this script which is much nicer, in order to shrink your log file USE MyDB BACKUP LOG MyDB WITH TRUNCATE_ONLY DBCC SHRINKFILE (MyDB_log, 10) Where the Log will be shrunk down to 10megs...
the Backup doesn't actually back up any data at all; it simply checkpoints all open transactions back to the db so that the log file is clear to shrink. As far as books go, I've used "Mastering SQL Server 2000" as my main guide, but then BOL for all my other needs. I'm also subscribed to sswug.org where there are "real" gurus about this sort of stuff. On Thu, 26 Aug 2004 12:01:57 -0400, Colby, John wrote: > Wow. > > I already deselected the Automatically Grow File, but the file has already > grown from 1.x gig starting size to over 10g now. > > Do I run these EXEC statements from the query window? Remember, I am a > TOTAL neophyte here. In fact the only books I have are for SQL Server 7. I > will be going to the bookstore though. Any recommendations for good SQL > Server 2K books? > > ALSO... > > Last night I built a raid 0 array using 2 200gb drives, and it appears that > my 2.5g AMD Barton (overclocked to '3.0g') with 2gb RAM is barely sufficient > to handle this size db. I haven't managed to load the entire database yet > although it appears that I now have the ability to at least do that. I can > just see that a table scan on 65 million records pulling result sets on > Yes/No where clauses is going to take a LONG time. > > I am looking at putting up a multi-processor 64 bit server if this client > gives me enough business. From my readings, XP currently limits memory to 2 > gb and the 64 bit machines limit the max to 4g. Above that we revert back > to the good old days of EMS memory mappings crapola which slows things way > down. > > Apparently only Windows Server 2003 is currently 64 bit, although it appears > that SQL Server 2K can use a 64 bit machine. Does anyone know anything > about this 64 bit stuff? Can anyone provide insight on speed gains on > processing large databases using 64 bit (windows) OS and DB software? > > It appears that I can build a dual Opteron machine for well under $2K but > since I have no experience in this I am hesitant to go that route without > reasonable assurances of significant speed gains. > > JWC > > -----Original Message----- > From: Francisco Tapia [mailto:fhtapia at gmail.com] > Sent: Thursday, August 26, 2004 11:43 AM > To: dba-sqlserver at databaseadvisors.com; John Colby > Subject: Re: [dba-SQLServer] Log file > > Here it is again... > <snip> > -- > -Francisco -- -Francisco From rl_stewart at highstream.net Thu Aug 26 14:56:35 2004 From: rl_stewart at highstream.net (Robert L. Stewart) Date: Thu, 26 Aug 2004 14:56:35 -0500 Subject: [dba-SQLServer] Re: Gmail Invite In-Reply-To: <200408261713.i7QHDDQ15047@databaseadvisors.com> Message-ID: <5.1.0.14.2.20040826145518.013cfc98@pop3.highstream.net> Knock, Knock Who's there? Dwain. Dwain who? Dwain the bath tub I'm dwownin.
At 12:13 PM 8/26/2004 -0500, you wrote: >Date: Wed, 25 Aug 2004 18:02:54 -0700 >From: Francisco Tapia >Subject: [dba-SQLServer] Gmail Invite >To: dba-sqlserver at databaseadvisors.com >Message-ID: >Content-Type: text/plain; charset=US-ASCII > >Goes to the FIRST reply I get off the list who can provide a clean >knock knock joke I haven't heard before :) > >-- >-Francisco _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From shait at mindspring.com Thu Aug 26 15:02:49 2004 From: shait at mindspring.com (Stephen Hait) Date: Thu, 26 Aug 2004 16:02:49 -0400 Subject: [dba-SQLServer] Big db update In-Reply-To: <000101c48a61$db9b37e0$80b3fea9@ColbyM6805> References: <41273308.2068.40BF37D@lexacorp.com.pg> Message-ID: <412E09A9.23705.37453C87@localhost> > As you know by now I am working on a largish database, 65 million > names / addresses plus demographics, ~600 fields it turns out. Over You most likely are already aware of this, but it's much faster to BCP in when there are no indexes on the target table - you can just build the indexes after the import. Stephen From fhtapia at gmail.com Thu Aug 26 16:07:56 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Thu, 26 Aug 2004 14:07:56 -0700 Subject: [dba-SQLServer] Re: Gmail Invite In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDDA@DISABILITYINS01> References: <05C61C52D7CAD211A7830008C7DF6F1079BDDA@DISABILITYINS01> Message-ID: On Thu, 26 Aug 2004 16:06:36 -0400, Colby, John wrote: > LOL. Are you blonde? Is this simultaneously a blond joke? > > >>Goes to the FIRST reply I get OFF THE LIST who can provide a clean > just fast typing and didn't click check spelling :D -- -Francisco From jwcolby at colbyconsulting.com Fri Aug 27 09:50:33 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Fri, 27 Aug 2004 10:50:33 -0400 Subject: [dba-SQLServer] Every 100th record In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDD3@DISABILITYINS01> Message-ID: <000b01c48c45$382be790$80b3fea9@ColbyM6805> Does anyone have a strategy for pulling every Nth record? My client wants to pull every 100th record into a dataset for analysis, to speed things up I am guessing. John W. Colby www.ColbyConsulting.com From joconnell at indy.rr.com Fri Aug 27 11:33:09 2004 From: joconnell at indy.rr.com (Joseph O'Connell) Date: Fri, 27 Aug 2004 11:33:09 -0500 Subject: [dba-SQLServer] Table Definition Message-ID: <01cb01c48c53$8c833600$6701a8c0@joe> In an Access application, I can use the Documenter tool to generate a report of table definitions. Is there an equivalent tool in SQL Server that will easily create a report showing the Name, Data Type and Size of the fields in a selected table? Joe O'Connell From kens.programming at verizon.net Fri Aug 27 12:01:26 2004 From: kens.programming at verizon.net (Ken Stoker) Date: Fri, 27 Aug 2004 10:01:26 -0700 Subject: [dba-SQLServer] Table Definition In-Reply-To: <01cb01c48c53$8c833600$6701a8c0@joe> Message-ID: <20040827165927.BAMZ14580.out011.verizon.net@enterprise> Will this work for you? 
SELECT so.name AS TableName, sc.name AS FieldName, st.name AS Type, sc.Length, sc.Prec, sc.Scale FROM sysobjects so INNER JOIN syscolumns sc on so.id = sc.id INNER JOIN systypes st on sc.xtype = st.xusertype WHERE so.name = 'mytablename' Ken -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Joseph O'Connell Sent: Friday, August 27, 2004 9:33 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Table Definition In an Access application, I can use the Documenter tool to generate a report of table definitions. Is there an equivalent tool in SQL Server that will easily create a report showing the Name, Data Type and Size of the fields in a selected table? Joe O'Connell _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From joconnell at indy.rr.com Fri Aug 27 13:10:27 2004 From: joconnell at indy.rr.com (Joseph O'Connell) Date: Fri, 27 Aug 2004 13:10:27 -0500 Subject: [dba-SQLServer] Table Definition Message-ID: <021201c48c61$230282e0$6701a8c0@joe> Thank you Joe -----Original Message----- From: Ken Stoker To: 'Joseph O'Connell' ; dba-sqlserver at databaseadvisors.com Date: Friday, August 27, 2004 1:04 PM Subject: RE: [dba-SQLServer] Table Definition |I have always copied the results out to Excel and worked with it from there. |<snip>
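A variation on Ken's query that should return the same information on SQL Server 2000, using the INFORMATION_SCHEMA views rather than the system tables — the table name is a placeholder, and the results can be pulled into Excel the same way:

SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE,
   CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'mytablename'
ORDER BY ORDINAL_POSITION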
From kens.programming at verizon.net Fri Aug 27 13:05:53 2004 From: kens.programming at verizon.net (Ken Stoker) Date: Fri, 27 Aug 2004 11:05:53 -0700 Subject: [dba-SQLServer] Table Definition In-Reply-To: <01ef01c48c5d$7a667a40$6701a8c0@joe> Message-ID: <20040827180355.YRZZ28868.out004.verizon.net@enterprise> I have always copied the results out to Excel and worked with it from there. The column names aren't included in the copy/paste, so you will need to put those in if you need them. One other thing, the query I provided doesn't return fields with user defined types. I did some more playing around and the best way is to replace sc.xtype = st.xusertype with sc.xusertype = st.xusertype. Ken -----Original Message----- From: Joseph O'Connell [mailto:joconnell at indy.rr.com] Sent: Friday, August 27, 2004 10:43 AM To: dba-sqlserver at databaseadvisors.com; kens.programming at verizon.net Subject: Re: [dba-SQLServer] Table Definition Ken, Thank you for your prompt reply. This query does give me the information that I need. Is there an easy way to print the results of the query? Joe O'Connell -----Original Message----- From: Ken Stoker To: dba-sqlserver at databaseadvisors.com Date: Friday, August 27, 2004 12:10 PM Subject: RE: [dba-SQLServer] Table Definition |Will this work for you? |<snip> From joconnell at indy.rr.com Fri Aug 27 12:43:20 2004 From: joconnell at indy.rr.com (Joseph O'Connell) Date: Fri, 27 Aug 2004 12:43:20 -0500 Subject: [dba-SQLServer] Table Definition Message-ID: <01ef01c48c5d$7a667a40$6701a8c0@joe> Ken, Thank you for your prompt reply. This query does give me the information that I need. Is there an easy way to print the results of the query?
Joe O'Connell -----Original Message----- From: Ken Stoker To: dba-sqlserver at databaseadvisors.com Date: Friday, August 27, 2004 12:10 PM Subject: RE: [dba-SQLServer] Table Definition |Will this work for you? |<snip> From shamil at users.mns.ru Fri Aug 27 13:45:07 2004 From: shamil at users.mns.ru (Shamil Salakhetdinov) Date: Fri, 27 Aug 2004 22:45:07 +0400 Subject: [dba-SQLServer] Every 100th record References: <000b01c48c45$382be790$80b3fea9@ColbyM6805> Message-ID: <002d01c48c65$fc1cade0$0201a8c0@PARIS> John, Cursors and FETCH RELATIVE n is one of the answers. See BOL for more details. Another solution is possible if your source data table/view/UDF returns unique IDs: then you can populate a temp table with these IDs and a corresponding Identity field and then write a join ..... you got it I see... :) HTH, Shamil ----- Original Message ----- From: "John W. Colby" To: ; Sent: Friday, August 27, 2004 6:50 PM Subject: [dba-SQLServer] Every 100th record > Does anyone have a strategy for pulling every Nth record? My client wants > to pull every 100th record into a dataset for analysis, to speed things up I > am guessing. > > John W. Colby > www.ColbyConsulting.com > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > From fhtapia at gmail.com Fri Aug 27 16:57:38 2004 From: fhtapia at gmail.com (Francisco Tapia) Date: Fri, 27 Aug 2004 14:57:38 -0700 Subject: [dba-SQLServer] Table Definition In-Reply-To: <01ef01c48c5d$7a667a40$6701a8c0@joe> References: <01ef01c48c5d$7a667a40$6701a8c0@joe> Message-ID: An easy enough method is to open a new db query from Excel and paste your sql code there, it WILL give you the column names. :D On Fri, 27 Aug 2004 12:43:20 -0500, Joseph O'Connell wrote: > Ken, > > Thank you for your prompt reply. This query does give me the information > that I need. > > Is there an easy way to print the results of the query?
-- -Francisco From mmaddison at optusnet.com.au Fri Aug 27 23:05:38 2004 From: mmaddison at optusnet.com.au (Michael Maddison) Date: Sat, 28 Aug 2004 14:05:38 +1000 Subject: [dba-SQLServer] Big db update In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDD3@DISABILITYINS01> Message-ID: EM -> Database -> Properties -> the last tab IIRC. You probably need to backup the db/log file then shrink it. cheers Michael M Francisco, Last night I tried to modify the log file size. It was already up to around 10 gb and refused to allow me to set the numbers down. Something about "can only expand the file". Where do I find these "bulk logged" or "SIMPLE" settings? JWC -----Original Message----- From: Francisco Tapia [mailto:fhtapia at gmail.com] Sent: Wednesday, August 25, 2004 12:07 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Big db update <snip> _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Sat Aug 28 06:56:11 2004 From: jwcolby at colbyconsulting.com (John W. Colby) Date: Sat, 28 Aug 2004 07:56:11 -0400 Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only In-Reply-To: Message-ID: <000001c48cf6$06589a90$80b3fea9@ColbyM6805> I have been working on getting a rather large database into SQL Server. This thing is ~65 million names plus demographics info and will be used for bulk mailing analysis. I have been struggling for weeks to make this happen. Having no experience with a database this big, I had no idea how big the final db would be. I tried to get it in to a 160g drive but it rapidly became obvious that wouldn't hold it. I then purchased two 200g drives and used Raid0 to make a big 400 g drive. I thought I turned on compression but after getting well into the extraction process I discovered this wasn't the case. I then started trying to figure out how to get the drive compressed.
As I started investigating why the drive wasn't compressed it turns out that only sector sizes of 512 to 4K bytes allow compression. Anything larger causes the "compress drive" check box to gray out and the drive ends up uncompressed. By this time I had already spent several days extracting zipped files of data and BCPing them into SQL Server so I had a MDF file of over 300gb and no place to put it! Sigh. Out of desperation I decided to try zipping the database file. I started it PK zipping last night onto an empty 160g partition. This morning I had a 10gb zipped file that supposedly contains the MDF file! I then deleted the partition on the 400gb Raid array and started playing with the compression / block size which is when I discovered the >4K sector size gotcha. I set the sector size to 4K and quick formatted, then started unzipping the MDF file to the (compressed) 400gb raid array. We shall see. The unzip is not finished, in fact has several hours to go yet. If this works I will celebrate. This whole project has been a challenge. It looks like the database will be around 600g for JUST the data, never mind any indexes. I simply don't have the money to build a raid 5 array to up the uncompressed drive size. Even if I did, IDE drives larger than 250gb are expensive and AFAICT only available in 5200 RPM. Plus the overhead of Raid5 is "One Drive" which means I'd need (4) 300g drives to build a 900g usable space raid5 array. Raid1 (which I am currently using) tops out at 600g using (2) 300g drives (uncompressed). So far my (2) drive Raid1 array using 200g drives has cost me $240 plus a controller I already had. A Raid5 solution using 300g drives would cost about $1200 just for the new controller and 4 drives! With any luck, given the massive compression PKZip managed to attain, I will be able to shoehorn the 600g. Update 8-( As I write this I just got a "delayed write failed" message from Windows saying it lost data trying to write to the disk. I have tried to turn off write caching but can't seem to find the magic button to cause Windows to quit using "Delayed write". BIG sigh! If I can't get the db back out of the zip file I will be facing a weekend of "starting from scratch" on getting the data out of the raw text files and back in to SQL Server! And I thought this would be fairly easy. John W. Colby www.ColbyConsulting.com From subs1847 at solution-providers.ie Sat Aug 28 11:33:22 2004 From: subs1847 at solution-providers.ie (Mark L. Breen) Date: Sat, 28 Aug 2004 17:33:22 +0100 Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only References: <000001c48cf6$06589a90$80b3fea9@ColbyM6805> Message-ID: <00a501c48d1c$bf7e4260$0101a8c0@D8TZHN0J> Hello John, I have just downloaded my email for the list and there was in total 5500 or so emails, so I have just read through your series of problems with SQL. Firstly, I envy you, you lucky guy, having such a big db to work with ! Secondly, sorry to hear about the amount of hassle that you have had. I have recently been working on a 2gb database and when it is on the server, my laptop runs fine, but on my laptop with 1/2 gb ram, the laptop grinds to a halt, so I would be in favour of running the db off the development machine. I know that you have already looked at some of the obvious things such as ensuring that there are no indexes, in the tables, ensuring that the fields are no bigger than they need to be, use integers where you can etc. I have a few other comments to make, although I do not think that there are any revelations in them. 
DTS is probably the best way to get data in.

SQL 2000 is better at truncating / shrinking than SQL 7.

Have you considered just importing a few hundred records from each file and getting the thing up and running (a sample-load sketch follows below)? I am presuming that once you get it imported you want to start development. Wouldn't it be nicer to work with just 1 million records for the initial development period?

Have you considered building a little VB app that would import 10k records and then do a database integrity check and index rebuild (the work that the maintenance wizard does)? Then let the VB app just work away for a week or so; meanwhile you could continue to use the hardware that you have.

For the duration of this project, would you consider forgetting about redundancy and just using whatever disk space you have?

I also have a dislike (not based on fact, just emotion) for compression. I wonder how it will affect the performance of SQL Server if you ever get it fully loaded.

Keep in touch and let us know how the project goes.

Best of luck,

Mark

----- Original Message -----
From: "John W. Colby"
To: "'Access Developers discussion and problem solving'" ;
Sent: Saturday, August 28, 2004 12:56 PM
Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only

> [John's original message, quoted in full, snipped; it appears earlier in this digest]
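A rough T-SQL sketch of the sample-load idea, assuming SQL Server 2000 and invented table and file names (the actual layout of the source files is not shown in this thread):

-- Load only the first 10,000 rows of one source text file
BULK INSERT dbo.NameData
FROM 'C:\data\source001.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', LASTROW = 10000)

The LASTROW option caps the rows read, so each source file can contribute a small slice for development without importing the whole set.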
From jwcolby at colbyconsulting.com Sat Aug 28 12:11:24 2004
From: jwcolby at colbyconsulting.com (John W. Colby)
Date: Sat, 28 Aug 2004 13:11:24 -0400
Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only
In-Reply-To: <00a501c48d1c$bf7e4260$0101a8c0@D8TZHN0J>
Message-ID: <000901c48d22$0fd41be0$80b3fea9@ColbyM6805>

Mark,

Good to hear from you again. I have to assume that you only check your list email every few weeks?

This is an OLAP database (more or less) from what I can tell, rather than a transactional database. There is only a single table, completely (intentionally) denormalized for speed reasons. Not that it has to remain that way, I can do it any way I want in the end, but that is how the data comes to me. The whole idea is that there are 65 million names, of Americans ONLY, with about 600 fields that categorize them: what they like to eat, drink, smoke, wear, drive, play with, medical drugs used, etc. The data will be loaded once (MAYBE!!! If I can get that done), then read hundreds of times a month, but only updated every few months as new names come in (additions / updates to existing data) or new survey results come in. This is definitely NOT like a banking database with millions of new records every month, nor even a call center or order entry database. NO transactions ever. There is in fact some maintenance involved, such as checking that addresses are valid, "do not mail" lists and so forth, but in general this is a read-only database.

What I have been told is that, for example, 100 thousand records are pulled to build a mailing: by age, income, perhaps race or geographic locality if appropriate to the marketing effort. Then, if it is a mailing set destined for car buyers, it is perhaps narrowed down to those who like the brand being sold in this mass mailing, or even (intentionally) similar competitors. A result set comes back. Of those 100 thousand mailings, only 2% respond, so 2000 postcards come back asking for more info or whatever. The idea now is to look at the demographics of those 2000 and see what they have in common, and, just as important, who DIDN'T respond.
If the mailing was sent to ages 25 to 50, but only 2% of those from 25 to 30 responded, and 5% of 30 to 35, yet 40% of those from 35 to 50, then we know that the mailing "didn't work" for the 25-35 age group, and we perhaps give up on them, or perhaps target them differently. But it is safe(r) to mail a million out to ages 35 to 50 because many more of them will respond, making the responses per piece mailed more cost effective.

As you can see, this is just a TON of analysis of the demographics, but very little (in fact NO) updating of the original data. We will be adding a "business" database that holds client info (who is buying the address data sets), "filter" records of what was selected in a given mailing, and "response" records that give us info on who responded. But these will probably be a few to a few hundred (just off the top of my head) records in a handful of tables, to record this "who bought the data" / "what was mailed" / "who responded" information for each mailing data set created.

If I can in fact do this at all on the class of machines available to me, it should be fun. It has been a much bigger challenge than I expected, I can tell you that.

John W. Colby
www.ColbyConsulting.com

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Mark L. Breen
Sent: Saturday, August 28, 2004 12:33 PM
To: dba-sqlserver at databaseadvisors.com
Subject: Re: [dba-SQLServer] VLDBs, the saga - Ramblings only

[Mark's message and the original it quotes, snipped; both appear earlier in this digest]
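To make the selection step concrete, a pull against the denormalized table might look like the following sketch. Every column name here is invented purely for illustration; the actual 600 field names are not shown anywhere in this thread:

-- Pull a 100K mailing set by age, income, and one of the T/F taste flags
SELECT TOP 100000 SID, FirstName, LastName, Addr, Zip
FROM dbo.NameData
WHERE Age BETWEEN 25 AND 50
  AND Income >= 50000
  AND LikesBrandX = 'T'  -- the one-character T/F flags John describes

The response analysis then amounts to grouping the responders by the same columns and comparing the percentages per band.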
Colby" To: "'Access Developers discussion and problem solving'" ; Sent: Saturday, August 28, 2004 12:56 PM Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only > I have been working on getting a rather large database into SQL > Server. This thing is ~65 million names plus demographics info and > will be used for > bulk mailing analysis. I have been struggling for weeks to make this > happen. Having no experience with a database this big, I had no idea > how big the final db would be. I tried to get it in to a 160g drive > but it rapidly became obvious that wouldn't hold it. I then purchased > two 200g drives and used Raid0 to make a big 400 g drive. I thought I > turned on compression but after getting well into the extraction > process I discovered > this wasn't the case. I then started trying to figure out how to get > the drive compressed. > > Long story short, a NTFS drive can be compressed, even a raid array > such as > this, however... There is a control that allows you to select the > sector size. I had selected the "compress" check box but then > selected a sector size of 64K. As I started investigating why the > drive wasn't compressed it > turns out that only sector sizes of 512 to 4K bytes allow compression. > Anything larger causes the "compress drive" check box to gray out and > the drive ends up uncompressed. > > By this time I had already spent several days extracting zipped files > of data and BCPing them into SQL Server so I had a MDF file of over > 300gb and no place to put it! > > Sigh. > > Out of desperation I decided to try zipping the database file. I > started it > PK zipping last night onto an empty 160g partition. This morning I > had a 10gb zipped file that supposedly contains the MDF file! > > I then deleted the partition on the 400gb Raid array and started > playing with the compression / block size which is when I discovered > the >4K sector > size gotcha. I set the sector size to 4K and quick formatted, then started > unzipping the MDF file to the (compressed) 400gb raid array. > > We shall see. The unzip is not finished, in fact has several hours to > go yet. If this works I will celebrate. > > This whole project has been a challenge. It looks like the database > will be > around 600g for JUST the data, never mind any indexes. I simply don't have > the money to build a raid 5 array to up the uncompressed drive size. > Even if I did, IDE drives larger than 250gb are expensive and AFAICT > only available in 5200 RPM. Plus the overhead of Raid5 is "One Drive" > which means I'd need (4) 300g drives to build a 900g usable space > raid5 array. Raid1 (which I am currently using) tops out at 600g using > (2) 300g drives (uncompressed). So far my (2) drive Raid1 array using > 200g drives has cost > me $240 plus a controller I already had. A Raid5 solution using 300g drives > would cost about $1200 just for the new controller and 4 drives! > > With any luck, given the massive compression PKZip managed to attain, > I will > be able to shoehorn the 600g. > > Update 8-( > > As I write this I just got a "delayed write failed" message from > Windows saying it lost data trying to write to the disk. I have tried > to turn off write caching but can't seem to find the magic button to > cause Windows to quit using "Delayed write". > > BIG sigh! > > If I can't get the db back out of the zip file I will be facing a > weekend of > "starting from scratch" on getting the data out of the raw text files > and back in to SQL Server! > > And I thought this would be fairly easy. 
From tuxedo_man at hotmail.com Sat Aug 28 12:13:53 2004
From: tuxedo_man at hotmail.com (Billy Pang)
Date: Sat, 28 Aug 2004 17:13:53 +0000
Subject: [dba-SQLServer] Table Definition
Message-ID:

You can get the column definitions for all columns in all tables of the current user database by running the following query:

SELECT * FROM INFORMATION_SCHEMA.columns

>From: "Ken Stoker"
>Reply-To: dba-sqlserver at databaseadvisors.com
>To: "'Joseph O'Connell'" ,
>Subject: RE: [dba-SQLServer] Table Definition
>Date: Fri, 27 Aug 2004 11:05:53 -0700
>
>I have always copied the results out to Excel and worked with it from
>there. The column names aren't included in the copy/paste, so you will
>need to put those in if you need them.
>
>One other thing: the query I provided doesn't return fields with user
>defined types. I did some more playing around, and the best way is to
>replace sc.xtype = st.xusertype with sc.xusertype = st.xusertype.
>
>Ken
>
>-----Original Message-----
>From: Joseph O'Connell [mailto:joconnell at indy.rr.com]
>Sent: Friday, August 27, 2004 10:43 AM
>To: dba-sqlserver at databaseadvisors.com; kens.programming at verizon.net
>Subject: Re: [dba-SQLServer] Table Definition
>
>Ken,
>
>Thank you for your prompt reply. This query does give me the information
>that I need.
>
>Is there an easy way to print the results of the query?
>
>Joe O'Connell
>
>-----Original Message-----
>From: Ken Stoker
>To: dba-sqlserver at databaseadvisors.com
>Date: Friday, August 27, 2004 12:10 PM
>Subject: RE: [dba-SQLServer] Table Definition
>
>|Will this work for you?
>|
>|SELECT so.name AS TableName, sc.name AS FieldName, st.name AS Type,
>|    sc.Length, sc.Prec, sc.Scale
>|FROM sysobjects so INNER JOIN syscolumns sc ON so.id = sc.id
>|    INNER JOIN systypes st ON sc.xtype = st.xusertype
>|WHERE so.name = 'mytablename'
>|
>|Ken
>|
>|-----Original Message-----
>|From: dba-sqlserver-bounces at databaseadvisors.com
>|[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Joseph
>|O'Connell
>|Sent: Friday, August 27, 2004 9:33 AM
>|To: dba-sqlserver at databaseadvisors.com
>|Subject: [dba-SQLServer] Table Definition
>|
>|In an Access application, I can use the Documenter tool to generate a
>|report of table definitions.
>|
>|Is there an equivalent tool in SQL Server that will easily create a
>|report showing the Name, Data Type and Size of the fields in a selected
>|table?
>|
>|Joe O'Connell
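Building on Billy's tip, the metadata views can produce roughly the same report as Ken's sysobjects/syscolumns query without touching the system tables directly. 'mytablename' is a placeholder, as in Ken's version:

SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE,
       CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'mytablename'
ORDER BY ORDINAL_POSITION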
From artful at rogers.com Sun Aug 29 09:48:39 2004
From: artful at rogers.com (Arthur Fuller)
Date: Sun, 29 Aug 2004 10:48:39 -0400
Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only
In-Reply-To: <000901c48d22$0fd41be0$80b3fea9@ColbyM6805>
Message-ID: <008901c48dd7$45ed9660$6501a8c0@rock>

>> If I can in fact do this at all on the class of machines available to me
>> it should be fun. It has been a much bigger challenge than I expected I
>> can tell you that.

We tried to caution you :) Big databases demand big hardware. There are no shortcuts (i.e. compressed drives, etc.). For 65M rows I would want at least 800GB of space and as much RAM as the box can handle. There are thresholds involved here, though. Go past 8GB, for example, and you have to reconfigure everything so SQL Server can exploit the RAM. I don't have a URL handy concerning this, but it is a well-known issue. If you want to do some reading on this subject, I'll look for a URL for you.

A.

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of John W. Colby
Sent: Saturday, August 28, 2004 1:11 PM
To: dba-sqlserver at databaseadvisors.com
Subject: RE: [dba-SQLServer] VLDBs, the saga - Ramblings only

[John's reply and the earlier thread it quotes, snipped; they appear earlier in this digest]
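The reconfiguration Arthur alludes to is presumably AWE, which SQL Server 2000 Enterprise Edition needs in order to use memory beyond 4GB (it also requires the /PAE boot switch, and 'awe enabled' takes effect only after a service restart). A hedged sketch; the 6144 MB cap is purely illustrative:

sp_configure 'show advanced options', 1
RECONFIGURE
GO
sp_configure 'awe enabled', 1           -- takes effect after a restart
RECONFIGURE
GO
sp_configure 'max server memory', 6144  -- cap in MB so the OS keeps some RAM
RECONFIGURE
GO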
From jwcolby at colbyconsulting.com Sun Aug 29 13:32:57 2004
From: jwcolby at colbyconsulting.com (John W. Colby)
Date: Sun, 29 Aug 2004 14:32:57 -0400
Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only
In-Reply-To: <008901c48dd7$45ed9660$6501a8c0@rock>
Message-ID: <000401c48df6$9b42bd10$80b3fea9@ColbyM6805>

With the (2) 250g drives I just purchased configured Raid 0, I will now have a 500g array and a 400g array. I will be breaking the db horizontally as discussed in the URL you provided earlier. That will at least allow me to get it in place. I already have 2GB RAM, which is all the motherboard can handle, and which SQL Server uses quite well.

John W. Colby
www.ColbyConsulting.com

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller
Sent: Sunday, August 29, 2004 10:49 AM
To: dba-sqlserver at databaseadvisors.com
Subject: RE: [dba-SQLServer] VLDBs, the saga - Ramblings only

[Arthur's reply, quoted in full, snipped; it appears earlier in this digest]
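The horizontal split John mentions can be expressed in SQL 2000 as a local partitioned view: member tables with CHECK constraints on the partitioning column, glued together with UNION ALL. A minimal sketch with invented names and an arbitrary split point; the real table has ~600 columns:

CREATE TABLE dbo.NameData_1 (
    SID int NOT NULL PRIMARY KEY CHECK (SID BETWEEN 1 AND 32500000),
    LastName varchar(30),
    Flag001 char(1)  -- ...plus the remaining demographic fields
)
CREATE TABLE dbo.NameData_2 (
    SID int NOT NULL PRIMARY KEY CHECK (SID BETWEEN 32500001 AND 65000000),
    LastName varchar(30),
    Flag001 char(1)
)
GO
CREATE VIEW dbo.NameData AS
SELECT * FROM dbo.NameData_1
UNION ALL
SELECT * FROM dbo.NameData_2

With the member tables placed on files on different arrays, the optimizer can use the CHECK constraints to touch only the member that covers a given SID range.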
From shamil at users.mns.ru Mon Aug 30 00:41:49 2004
From: shamil at users.mns.ru (Shamil Salakhetdinov)
Date: Mon, 30 Aug 2004 09:41:49 +0400
Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only
References: <000001c48cf6$06589a90$80b3fea9@ColbyM6805>
Message-ID: <004201c48e54$1cb55760$0201a8c0@PARIS>

John,

May I ask: did you make any calculations in advance to see what to expect with your data loading? (Almost nobody I know does. But I have started doing that for some time now - it has happened to me that the "brute force" approach doesn't work well sometimes :) )

Why not make it normalized as a first step of your data loading adventure? I mean:

- You have 65 million records with 600 fields each. Let's assume that each field is 20 bytes (not Unicode). Then you get 65 million * 600 * 20 = 780GB. (Even if the average size of a record is less than that, it should for sure(?) be more than 4KB, and therefore you get ONE record loaded per page - MS SQL records can't be longer than 8KB - which is ~520GB without indexes...)

If as the first step you go through all your source data and normalize it, you get something like 65 million * 600 * 4 = 156GB. The latter looks manageable even with an ordinary modern IDE drive, even connected through USB2 - a cheap and quick enough solution for starters. (I assume that some references from the normalized table will be 4 bytes long, some two bytes long, others 1 byte long, and some, like First-, Middle- and Last-name, will be left intact - so all that will probably average out to 4 bytes per field. And four bytes are enough to reference even a 65,000,000-row data dictionary/reference table (65,000,000 = 03 DF D2 40 hexadecimal).)

So, as the first step - get all your source data unzipped onto one HDD (160GB)?

Second step - analyze the unzipped data and find the best way to normalize it. (A simple utility that reads the source data, computes a hash code of every field and counts the distinct hash codes per field would not be a bad idea - such a utility would be a quick and reliable way to get a good approximation of where you can go with your huge volume of source data, especially if you write it in C++/C#/VB.NET. I'm getting into this field now, so I can help here just for fun, to share this challenge of yours, but spare time is a problem here, so this help may not be as quick as you need it now...)

Third step - make a good (semi-)normalized data model (don't forget a clustered primary key - Identity - you like them, I know :) ) and calculate well what size it will reach when implemented in an MS SQL database...

Fourth step - load the normalized data, maybe in several steps... ... N-th step - get all your data loaded and manageable. This is the point where you can denormalize it again if you need to (I don't think that will be needed/possible with your/your customer's resources, and the OLAP tasks should work well on the (semi-)normalized database mentioned above), maybe as a set of federated databases, linked databases, partitioned views...

I'd also advise you to read the "SQL Server Architecture" chapter from BOL carefully now...

Of course it is easy to advise - and it's not that easy to go through the challenge you have...

I'm not here that often these days, but every working day I'm online on MS Messenger (~ 8:00 - 22:00 (MT+3), Shamil at Work) - so you can reach me there if you need some of my help...

HTH & I hope you'll get it up & running soon,
Shamil

----- Original Message -----
From: "John W. Colby"
To: "'Access Developers discussion and problem solving'" ;
Sent: Saturday, August 28, 2004 3:56 PM
Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only

> [John's original message, quoted in full, snipped; it appears earlier in this digest]
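A sketch of Shamil's lookup idea on a single field, with invented names, to make the 20-bytes-to-a-few-bytes arithmetic concrete:

-- One lookup table per repetitive field; a tinyint covers up to 255 distinct values
CREATE TABLE dbo.BrandList (
    BrandID tinyint NOT NULL PRIMARY KEY,
    BrandName varchar(20) NOT NULL
)
GO
-- The big table then stores the 1-byte BrandID in place of the 20-byte string;
-- the original value comes back with a join:
SELECT n.SID, b.BrandName
FROM dbo.NameData n
INNER JOIN dbo.BrandList b ON b.BrandID = n.BrandID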
From jwcolby at colbyconsulting.com Mon Aug 30 07:37:38 2004
From: jwcolby at colbyconsulting.com (John W. Colby)
Date: Mon, 30 Aug 2004 08:37:38 -0400
Subject: RE: [dba-SQLServer] VLDBs, the saga - Ramblings only
In-Reply-To: <004201c48e54$1cb55760$0201a8c0@PARIS>
Message-ID: <000501c48e8e$25ccde60$80b3fea9@ColbyM6805>
Shamil,

In fact the numbers don't look like that at all, plus I really didn't know what the fields looked like. Looking at the data in 600 fields is non-trivial all by itself.

However, I can tell you it isn't a straight calculation like that. Many, and probably most, of the fields are a simple true/false, which led me to expect a MUCH smaller db size. They just have a T or an F in them (the character).

Further, it is not helpful to normalize them. This is SO huge that putting them back together again in queries joining a dozen or more fields / tables would be a slow nightmare. Even normalizing 600 fields would be many, many hours (days? weeks?) of work. But in fact I think it may just be better left denormalized.

John W. Colby
www.ColbyConsulting.com

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Shamil Salakhetdinov
Sent: Monday, August 30, 2004 1:42 AM
To: dba-sqlserver at databaseadvisors.com
Subject: Re: [dba-SQLServer] VLDBs, the saga - Ramblings only

[Shamil's message, quoted in full, snipped; it appears earlier in this digest]
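If the flags really are just the characters T and F, one storage-only option, short of Shamil's full normalization, is the bit type: SQL Server packs up to 8 bit columns into each byte, so 600 one-byte char flags could drop to roughly 75 bytes per row. A hedged sketch with an invented column name; note that on a table this size the UPDATE itself is a massive logged operation:

ALTER TABLE dbo.NameData ADD Flag001_b bit
GO
UPDATE dbo.NameData
SET Flag001_b = CASE WHEN Flag001 = 'T' THEN 1 ELSE 0 END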
From shamil at users.mns.ru Mon Aug 30 08:48:44 2004
From: shamil at users.mns.ru (Shamil Salakhetdinov)
Date: Mon, 30 Aug 2004 17:48:44 +0400
Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only
References: <000501c48e8e$25ccde60$80b3fea9@ColbyM6805>
Message-ID: <000701c48e98$18c874e0$0201a8c0@PARIS>

John,

I didn't know they were simple T/F fields... Why is your database getting so huge, then? Do you have any explanations/calculations?

> Further it is not helpful to normalize them. This is SO huge that putting
> them back together again in queries joining a dozen or more fields / tables
> would be a slow nightmare.

This is incredibly quick in MS SQL... And I'm not talking about "brute force" joining back... Of course, all my first thoughts/advice probably do not make sense now that I know your source data are quite different from what I thought before...

> Even normalizing 600 fields would be many many
> hours (days? Weeks?) of work.

I don't think so - but so as not to get it wrong again, I would need to see some of your source data...

> But in fact I think it may just be better left denormalized.

Yes, very probably, if they are simple T/F fields. (Although then I can't see how they represent such different - I think - habits and tastes of Americans...)

Shamil

----- Original Message -----
From: "John W. Colby"
To:
Sent: Monday, August 30, 2004 4:37 PM
Subject: RE: [dba-SQLServer] VLDBs, the saga - Ramblings only

> [John's reply, quoted in full, snipped; it appears earlier in this digest]
From fhtapia at gmail.com Mon Aug 30 10:28:21 2004
From: fhtapia at gmail.com (Francisco Tapia)
Date: Mon, 30 Aug 2004 08:28:21 -0700
Subject: [dba-SQLServer] Big db update
In-Reply-To: <05C61C52D7CAD211A7830008C7DF6F1079BDD3@DISABILITYINS01>
References: <05C61C52D7CAD211A7830008C7DF6F1079BDD3@DISABILITYINS01>
Message-ID:

John, are you still having issues w/ this part? I remember replying, but I didn't see it in my archives...

On Thu, 26 Aug 2004 10:13:08 -0400, Colby, John wrote:
> Francisco,
>
> Last night I tried to modify the log file size. It was already up to around
> 10 gb and refused to allow me to set the numbers down. Something about "can
> only expand the file". Where do I find these "bulk logged" or "SIMPLE"
> settings?

--
-Francisco

From cfoust at infostatsystems.com Mon Aug 30 10:44:16 2004
From: cfoust at infostatsystems.com (Charlotte Foust)
Date: Mon, 30 Aug 2004 08:44:16 -0700
Subject: [dba-SQLServer] VLDBs, the saga - Ramblings only
Message-ID:

From what John has described, he has, in essence, a data warehouse. It could certainly be broken up into one or more fact tables with dimension tables to make the slicing and dicing easier, but you still wouldn't wind up with "normal" normalization, just 1NF tables. I've worked with this kind of data on a smaller scale (I used to work for a company that would have been a customer of that kind of list). I suspect the demographics they're using as selection criteria should probably be put into dimension tables, which will actually make the database bigger but will make the queries MUCH faster and easier.

Charlotte Foust

-----Original Message-----
From: Shamil Salakhetdinov [mailto:shamil at users.mns.ru]
Sent: Monday, August 30, 2004 6:49 AM
To: dba-sqlserver at databaseadvisors.com
Subject: Re: [dba-SQLServer] VLDBs, the saga - Ramblings only

[Shamil's message and the earlier thread it quotes, snipped; they appear earlier in this digest]
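A minimal shape of what Charlotte is suggesting, with every name invented; each demographic used for slicing becomes a small dimension keyed from the fact table:

CREATE TABLE dbo.DimAgeBand (
    AgeBandID tinyint NOT NULL PRIMARY KEY,
    AgeLow tinyint NOT NULL,
    AgeHigh tinyint NOT NULL
)
CREATE TABLE dbo.FactPerson (
    SID int NOT NULL PRIMARY KEY,
    AgeBandID tinyint NOT NULL REFERENCES dbo.DimAgeBand (AgeBandID)
    -- one small key per dimension used for slicing
)
GO
-- A typical slice: counts by age band
SELECT d.AgeLow, d.AgeHigh, COUNT(*) AS People
FROM dbo.FactPerson f
INNER JOIN dbo.DimAgeBand d ON d.AgeBandID = f.AgeBandID
GROUP BY d.AgeLow, d.AgeHigh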
Looking at the data in 600 fields is > non-trivial all by itself. > > However I can tell you it isn't a straight calc like that. Many and > probably most of the fields are a simple true false which led me to expect a > MUCH smaller db size. They just have a T or a F in them (the character). > > Further it is not helpful to normalize them. This is SO huge that putting > them back together again in queries joining a dozen or more fields / tables > would be a slow nightmare. Even normalizing 600 fields would be many many > hours (days? Weeks?) of work. But in fact I think it may just be better > left denormalized. > > John W. Colby > www.ColbyConsulting.com > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Shamil > Salakhetdinov > Sent: Monday, August 30, 2004 1:42 AM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] VLDBs, the saga - Ramblings only > > > John, > > May I ask you did you make any calculations in advance to see what to expect > with your data loading? (Almost nobody do I know - But I did start to do > that for some time now - it did happen to me that "brute force" approach > doesn't work well sometimes :) ) > > Why not make it normalized as a first step of your data loading adventure? > > I mean: > > - you have 65 million records with 600 fields each and let's assume that > each field is 20 bytes (not Unicode). Then you get 65millions*600*20 = 780GB > (even if the average size of a record is less than that - it for sure(?) > should be more than 4KB and therefore you get ONE record loaded on ONE page > - MS SQL records can't be longer than 8KB - this is ~520GB without > indexes...) > > If as the first step you go through all your source data and get them > normalized you get something like: > > 65millions*600*4 = 156GB - the latter looks manageable even with ordinary > modern IDE drive even connected through USB2 - that's a cheap and quick > enough solution for starters (I assume that some references from normalized > table will be 4 bytes long, some two bytes long, others 1 byte long, some > like First- , Middle- and Last- name will be left intact - so all that will > probably give 4 bytes long field in average. And four bytes are enough to > get referenced even 65,000,000 data dictionary/reference (65,000,000 = 03 > DF D2 40 hexadecimal). > > So as the first step - get all your source data unzipped on one HDD (160MB)? > > Second step - analyze the unzipped data and find what is the best way to get > it normalized (here is a simple utility reading source data, making hash > code of every field and calculating quantities of different hash codes for > every field should be not a bad idea - such a utility should be very quick > and reliable solution to get good approximation where you can get with your > huge volume of the source data, especially if you write in on C++/C#/VB.NET > - I'm getting in this field now - so I can help here just for fun to share > this your challenge but spare time is problem here so this help can be not > as quick as it maybe needed for you now... > > Third step - make good (semi-) normalized data model (don't forget a > clustered primary key - Identity - you like them I know :)), calculate well > what size it will get when it will be implemented in a MS SQL database... > > Fourth step - load normalized data, maybe in several steps.... .... 
N-th > step get all you data loaded and manageable - here is the point where you > can get it back denormalized if you will need that (I don't think it will be > needed/possible with your/your customer resources and the OLAP tasks should > work well on (semi-)normalized database metioned above), maybe as a set of > federated databases, linked databases, partitioned views ... > > I'd also advise you to read now carefully the "SQL Server Architecture" > chapter from BOL... > > Of course it easy to advice - and it's not that easy to go through the > challenge you have... > > I'm not that often here this days but I'm every working day online on MS > Messenger (~ 8:00 - 22:00 (MT+3) Shamil at Work) - so you can get me there if > you'll need some of my help... > > HTH & I hope you'll get it up&running soon, > Shamil > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com