From jwcolby at colbyconsulting.com Tue May 1 06:06:03 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 1 May 2007 07:06:03 -0400 Subject: [dba-SQLServer] Bulk copy spec - was RE: [AccessD] using a saved SSIS with VB.Net In-Reply-To: <0JHC007TC404LIX4@vms048.mailsrvcs.net> References: <004301c7868a$9e6cd590$657aa8c0@m6805> <0JHC007TC404LIX4@vms048.mailsrvcs.net> Message-ID: <004601c78be0$bae455d0$657aa8c0@m6805> Is there any way to have SQL Server export a spec for a table that bulk copy can use, at least what the fields look like etc? I have never used bulk copy and there are a LOT of fields in the table. Alternately is there somewhere that I can find the bulk copy spec, what the format file is supposed to look like? John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Monday, April 30, 2007 7:31 PM To: 'Access Developers discussion and problem solving' Subject: Re: [AccessD] using a saved SSIS with VB.Net Have you looked into BULK INSERT in SQL? This is supposed to be a faster data import method. Using T-SQL you can do something like this... 
CREATE TABLE #tmpEmployees ()

BULK INSERT #tmpEmployees FROM 'c:\temp\import.csv' WITH (FORMATFILE = 'c:\temp\importCSV.fmt')

importCSV.fmt would contain the file format...in this example it's fixed width

8.0
18
1 SQLCHAR 0 5  "" 1 suffix     SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 30 "" 2 last_name  SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 20 "" 3 first_name SQL_Latin1_General_CP1_CI_AS

From jwcolby at colbyconsulting.com Tue May 1 07:49:20 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Tue, 1 May 2007 08:49:20 -0400
Subject: [dba-SQLServer] [AccessD] using a saved SSIS with VB.Net
In-Reply-To: <0JHC007TC404LIX4@vms048.mailsrvcs.net>
References: <004301c7868a$9e6cd590$657aa8c0@m6805> <0JHC007TC404LIX4@vms048.mailsrvcs.net>
Message-ID: <004f01c78bef$264cf210$657aa8c0@m6805>

Eric,

I have successfully generated the fmt file. My only concern at this point is that the import file has spaces at the end of the valid data, padding out to the full width of the field. I do NOT want the spaces, and I suspect that BCP is going to pull in all of them. Is it possible to tell BCP to strip the spaces, or is it possible to use BCP to pull a CSV file into an existing table, respecting the already established field types / sizes in the destination table?

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Eric Barro
Sent: Monday, April 30, 2007 7:31 PM
To: 'Access Developers discussion and problem solving'
Subject: Re: [AccessD] using a saved SSIS with VB.Net

Have you looked into BULK INSERT in SQL? This is supposed to be a faster data import method. Using T-SQL you can do something like this...

CREATE TABLE #tmpEmployees ()

BULK INSERT #tmpEmployees FROM 'c:\temp\import.csv' WITH (FORMATFILE = 'c:\temp\importCSV.fmt')

importCSV.fmt would contain the file format...in this example it's fixed width

8.0
18
1 SQLCHAR 0 5  "" 1 suffix     SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 30 "" 2 last_name  SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 20 "" 3 first_name SQL_Latin1_General_CP1_CI_AS

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of JWColby
Sent: Tuesday, April 24, 2007 9:07 AM
To: 'Access Developers discussion and problem solving'
Subject: Re: [AccessD] using a saved SSIS with VB.Net

The CSV file is on the same machine. It appears that the clause that pulls the source table (csv file) into memory is taking a ton of time. These are large files; the smallest are a little under 200 million bytes and the largest are up in the 3 gigabyte range. It appears that SQL Server does not read a few records and append them, but rather reads the whole CSV and then starts appending all of the assembled records. If I were a SQL Server pro I could probably speed this up considerably. Alas, I am not.

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Robert L. Stewart
Sent: Tuesday, April 24, 2007 11:53 AM
To: accessd at databaseadvisors.com
Subject: Re: [AccessD] using a saved SSIS with VB.Net

John,

One of the keys to getting it into SQL Server faster is to have the CSV file on the server and not on a different machine. Network traffic can kill the process and slow it down significantly.
Robert

At 10:22 AM 4/24/2007, you wrote:
>Date: Tue, 24 Apr 2007 10:32:44 -0400
>From: "JWColby"
>Subject: Re: [AccessD] using a saved SSIS with VB.Net
>To: "'Access Developers discussion and problem solving'"
>
>Message-ID: <003101c7867d$6cce93a0$657aa8c0 at m6805>
>Content-Type: text/plain; charset="us-ascii"
>
>Gustav,
>
>My bigger issue here is that there are 56 of these files to import into SQL
>Server, supposedly ~100 million records. I have done about 8 million
>records so far. I really must get this thing automated such that it
>just chunks through these CSV files without my having to be around to
>start the next one. I am working now on setting up the append query
>using that syntax below into a stored procedure so that I can then just replace the file name.
>After that I will need to write something in VB.Net or whatever to
>execute the stored procedure, feeding in all of the file names from a
>specific directory and deleting the file once the stored procedure
>finishes the import for a given file.
>
>I have never written a stored procedure.
>Obviously, given the above, I have never called a stored procedure from
>code.
>
>So much to learn, so little time.
>
>Once this is imported I have to turn right around and export a subset
>of fields from the table back out as 1 - 2 million record chunks for
>CAS / DPV / NCOA processing, then I have to import THOSE back in to a
>new table.
>
>And when this set of data is finished, I have another set of about the
>same size on the way, to which I have to perform the same processes. I
>soooooo need to get this process automated.
>
>John W. Colby

--
AccessD mailing list
AccessD at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/accessd
Website: http://www.databaseadvisors.com

No virus found in this incoming message.
Checked by AVG Free Edition.
Version: 7.5.463 / Virus Database: 269.5.10/774 - Release Date: 4/23/2007 5:26 PM

From jwcolby at colbyconsulting.com Tue May 1 08:28:54 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Tue, 1 May 2007 09:28:54 -0400
Subject: [dba-SQLServer] Bulk insert
Message-ID: <005001c78bf4$acb80290$657aa8c0@m6805>

I am trying to use bulk insert to pull csv file data into a table. It appears that I have everything except for the fact that the data is enclosed in "" - "SomeData". This is common in CSV files in order to "wrap" commas which would otherwise be interpreted as a field delimiter. Bulk Copy has a Line Terminator and a Field Terminator as parameters, but I do not see anyplace to specify what is known as a "Text qualifier" (the " surrounding the data).

Is there ANY way to do this? CSV files are very common, and text qualifiers are so common that the import / export wizard allows you to specify them. Surely bulk insert must understand them.

John W. Colby
Colby Consulting
www.ColbyConsulting.com

From ebarro at verizon.net Tue May 1 09:06:36 2007
From: ebarro at verizon.net (Eric Barro)
Date: Tue, 01 May 2007 07:06:36 -0700
Subject: [dba-SQLServer] [AccessD] Bulk copy spec - was RE: using a saved SSIS with VB.Net
In-Reply-To: <004601c78be0$bae455d0$657aa8c0@m6805>
Message-ID: <0JHD003UK8HKBT22@vms042.mailsrvcs.net>

John,

This is the SQL query I use to get the field names...

DECLARE @TableName varchar(50)
SET @TableName = 'tmpemployees'

SELECT syscolumns.name AS DBFieldNames,
       syscolumns.type AS DataType,
       (syscolumns.length/2) AS DataLength
FROM sysobjects
INNER JOIN syscolumns ON sysobjects.id = syscolumns.id
WHERE sysobjects.name = @TableName
ORDER BY colorder

Here's what I came across when I googled it...
http://www.thescripts.com/forum/thread520822.html

8.0
5
1 SQLCHAR 0 0 ";\""   1 col1 ""
2 SQLCHAR 0 0 "\";"   2 col2 ""
3 SQLCHAR 0 0 ";"     3 col3 ""
4 SQLCHAR 0 0 ";"     4 col4 ""
5 SQLCHAR 0 0 "\r\n"  5 col5 ""

The first row is the version of the file format. Next is the number of fields in the file. The following lines describe one field each. The first column is the field number in the data file. The second column is the data type of the field in the file; for a text file this is always SQLCHAR (or always SQLNCHAR for a Unicode file) - other data types are only used with binary formats. The third column is the prefix length, used only for binary files. The fourth column is the length, used for fixed-length fields. The fifth column is the terminator, and it is here you specify the quotes. The sixth column is the database column number, with 1 denoting the first column; 0 means that the field is not to be imported. The seventh column is the column name, but it is informational only - BCP/BULK INSERT does not use it. The last column is the collation for the data in the file. Overall, keep in mind that BCP/BULK INSERT reads a binary file, and a row terminator is really only the terminator for the last field.

Eric

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of JWColby
Sent: Tuesday, May 01, 2007 4:06 AM
To: 'Access Developers discussion and problem solving'
Cc: dba-sqlserver at databaseadvisors.com
Subject: [AccessD] Bulk copy spec - was RE: using a saved SSIS with VB.Net

Is there any way to have SQL Server export a spec for a table that bulk copy can use, at least what the fields look like etc? I have never used bulk copy and there are a LOT of fields in the table. Alternatively, is there somewhere that I can find the bulk copy spec, i.e. what the format file is supposed to look like?

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Eric Barro
Sent: Monday, April 30, 2007 7:31 PM
To: 'Access Developers discussion and problem solving'
Subject: Re: [AccessD] using a saved SSIS with VB.Net

Have you looked into BULK INSERT in SQL? This is supposed to be a faster data import method. Using T-SQL you can do something like this...

CREATE TABLE #tmpEmployees ()

BULK INSERT #tmpEmployees FROM 'c:\temp\import.csv' WITH (FORMATFILE = 'c:\temp\importCSV.fmt')

importCSV.fmt would contain the file format...in this example it's fixed width

8.0
18
1 SQLCHAR 0 5  "" 1 suffix     SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 30 "" 2 last_name  SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 20 "" 3 first_name SQL_Latin1_General_CP1_CI_AS

--
AccessD mailing list
AccessD at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/accessd
Website: http://www.databaseadvisors.com

No virus found in this incoming message.
Checked by AVG Free Edition.
Version: 7.5.467 / Virus Database: 269.6.2/782 - Release Date: 5/1/2007 2:10 AM

From ebarro at verizon.net Tue May 1 09:16:05 2007
From: ebarro at verizon.net (Eric Barro)
Date: Tue, 01 May 2007 07:16:05 -0700
Subject: [dba-SQLServer] [AccessD] using a saved SSIS with VB.Net
In-Reply-To: <004f01c78bef$264cf210$657aa8c0@m6805>
Message-ID: <0JHD0096Z8XC1WT7@vms042.mailsrvcs.net>

John,

Based on what I have seen so far, it will not strip the spaces, so I usually create a TEMP table, BULK INSERT the data into that temp table, and then RTRIM the fields when I import to the real table. Here's the code I used to import padded records from Peoplesoft...
-- Create a table to hold the data
CREATE TABLE #tmpEmployees (
    suffix varchar(5),
    last_name varchar(30),
    first_name varchar(20),
    [....snipped the rest of the fields in between]
    Email varchar(50)
)

-- Read the text file into the temp table
BULK INSERT #tmpEmployees FROM 'ImportData.csv' WITH (FORMATFILE = 'ImportData.fmt')

-- Now read it into the Employees table (this assumes the same number of fields as the importdata file, otherwise we need to spell out each field)
INSERT INTO Employees
SELECT RTRIM(last_name) as last_name,
       RTRIM(first_name) as first_name,
       [....snipped the rest of the fields in between]
       RTRIM(email) as email
FROM #tmpEmployees

-- And then clean up
DROP TABLE #tmpEmployees

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of JWColby
Sent: Tuesday, May 01, 2007 5:49 AM
To: 'Access Developers discussion and problem solving'
Cc: dba-sqlserver at databaseadvisors.com
Subject: Re: [AccessD] using a saved SSIS with VB.Net

Eric,

I have successfully generated the fmt file. My only concern at this point is that the import file has spaces at the end of the valid data, padding out to the full width of the field. I do NOT want the spaces, and I suspect that BCP is going to pull in all of them. Is it possible to tell BCP to strip the spaces, or is it possible to use BCP to pull a CSV file into an existing table, respecting the already established field types / sizes in the destination table?

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Eric Barro
Sent: Monday, April 30, 2007 7:31 PM
To: 'Access Developers discussion and problem solving'
Subject: Re: [AccessD] using a saved SSIS with VB.Net

Have you looked into BULK INSERT in SQL? This is supposed to be a faster data import method. Using T-SQL you can do something like this...

CREATE TABLE #tmpEmployees ()

BULK INSERT #tmpEmployees FROM 'c:\temp\import.csv' WITH (FORMATFILE = 'c:\temp\importCSV.fmt')

importCSV.fmt would contain the file format...in this example it's fixed width

8.0
18
1 SQLCHAR 0 5  "" 1 suffix     SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 30 "" 2 last_name  SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 20 "" 3 first_name SQL_Latin1_General_CP1_CI_AS

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of JWColby
Sent: Tuesday, April 24, 2007 9:07 AM
To: 'Access Developers discussion and problem solving'
Subject: Re: [AccessD] using a saved SSIS with VB.Net

The CSV file is on the same machine. It appears that the clause that pulls the source table (csv file) into memory is taking a ton of time. These are large files; the smallest are a little under 200 million bytes and the largest are up in the 3 gigabyte range. It appears that SQL Server does not read a few records and append them, but rather reads the whole CSV and then starts appending all of the assembled records. If I were a SQL Server pro I could probably speed this up considerably. Alas, I am not.

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message-----
From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of Robert L. Stewart
Sent: Tuesday, April 24, 2007 11:53 AM
To: accessd at databaseadvisors.com
Subject: Re: [AccessD] using a saved SSIS with VB.Net

John,

One of the keys to getting it into SQL Server faster is to have the CSV file on the server and not on a different machine. Network traffic can kill the process and slow it down significantly.
Robert At 10:22 AM 4/24/2007, you wrote: >Date: Tue, 24 Apr 2007 10:32:44 -0400 >From: "JWColby" >Subject: Re: [AccessD] using a saved SSIS with VB.Net >To: "'Access Developers discussion and problem solving'" > >Message-ID: <003101c7867d$6cce93a0$657aa8c0 at m6805> >Content-Type: text/plain; charset="us-ascii" > >Gustav, > >My bigger issue here is that there 56 of these files to import into SQL >Server, supposedly ~100 million records. I have done about 8 million >records so far. I really must get this thing automated such that it >just chunks through these CSV files without my having to be around to >start the next one. I am working now on setting up the append query >using that syntax below into a stored procedure so that I can then just replace the file name. >After that I will need to write something in VB.Net or whatever to >execute the stored procedure feeding in all of the file names from a >specific directory, deleting the file once the stored procedure >finishes the import for a given file. > >I have never written a stored procedure. >Obviously, given the above, I have never called a stored procedure from >code. > >So much to learn, so little time. > >Once this is imported I have to turn right around and export a subset >of fields from the table back out as 1 - 2 million record chunks for >CAS / DPV / NCOA processing, then I have to import THOSE back in to a >new table. > >And when this set of data is finished, I have another set of about the >same size on the way, to which I have to perform the same processes. I >soooooo need to get this process automated. > >John W. Colby -- AccessD mailing list AccessD at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/accessd Website: http://www.databaseadvisors.com -- AccessD mailing list AccessD at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/accessd Website: http://www.databaseadvisors.com No virus found in this incoming message. Checked by AVG Free Edition. 
Version: 7.5.463 / Virus Database: 269.5.10/774 - Release Date: 4/23/2007 5:26 PM

--
AccessD mailing list
AccessD at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/accessd
Website: http://www.databaseadvisors.com

From ebarro at verizon.net Tue May 1 09:18:19 2007
From: ebarro at verizon.net (Eric Barro)
Date: Tue, 01 May 2007 07:18:19 -0700
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <005001c78bf4$acb80290$657aa8c0@m6805>
Message-ID: <0JHD00LUX912E0CB@vms044.mailsrvcs.net>

John,

Can you send a sample csv...maybe 3 records or so?

Eric

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby
Sent: Tuesday, May 01, 2007 6:29 AM
To: dba-sqlserver at databaseadvisors.com
Subject: [dba-SQLServer] Bulk insert

I am trying to use bulk insert to pull csv file data into a table. It appears that I have everything except for the fact that the data is enclosed in "" - "SomeData". This is common in CSV files in order to "wrap" commas which would otherwise be interpreted as a field delimiter. Bulk Copy has a Line Terminator and a Field Terminator as parameters, but I do not see anyplace to specify what is known as a "Text qualifier" (the " surrounding the data).

Is there ANY way to do this? CSV files are very common, and text qualifiers are so common that the import / export wizard allows you to specify them. Surely bulk insert must understand them.

John W.
Colby
Colby Consulting
www.ColbyConsulting.com

_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com

From jwcolby at colbyconsulting.com Tue May 1 09:20:32 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Tue, 1 May 2007 10:20:32 -0400
Subject: [dba-SQLServer] Importing large data files
Message-ID: <005801c78bfb$e13b23b0$657aa8c0@m6805>

I have been struggling to get large data files to import into SQL Server in a manner that won't take weeks to complete. I was trying to use CSV files to get around the trailing (and occasionally leading) spaces in the data fields caused by fixed width / space padded fields in the source text files. So I created a program in MS Access to read the source file, line by line, carve out each data element, strip the spaces, and then, when an entire line was assembled, write it back out to a new csv file.

The next issue I ran into is that SQL Server doesn't know how to deal with text qualifiers, which are commonly used in CSV files to encapsulate commas inside of data; IOW if a comma is the delimiter, then the data is enclosed in quotes: "51 Some Street, Ste2001", "Some City". In THIS case, since I control the intermediary file, I am able to change from a comma delimiter to a | delimiter and remove the quotes. Thus the previous example turns into 51 Some Street, Ste2001|Some City| and this can be imported into SQL Server without an issue, since I can specify the FIELDTERMINATOR parameter to the bulk insert in a query.

Using bulk insert pulled my import up from 300-400 records per second to almost one thousand records per second. Not blazing, but with 100 million records to import it is a significant improvement in total time to import.

John W.
Colby
Colby Consulting
www.ColbyConsulting.com

From ebarro at verizon.net Tue May 1 09:22:34 2007
From: ebarro at verizon.net (Eric Barro)
Date: Tue, 01 May 2007 07:22:34 -0700
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <005001c78bf4$acb80290$657aa8c0@m6805>
Message-ID: <0JHD00LSH985E7TB@vms044.mailsrvcs.net>

The link I sent in one of my email responses actually addressed the " issue...

http://www.thescripts.com/forum/thread520822.html

Specifically...

-------
9;Some unquoted data;12;9.234;2004-12-12
19;"Some quoted data";-12;31.4;2003-02-23

But if a text column is consistently quoted, you can handle this with a format file where you specify each field. A format file that fits the second row in the example above could look like:

8.0
5
1 SQLCHAR 0 0 ";\""   1 col1 ""
2 SQLCHAR 0 0 "\";"   2 col2 ""
3 SQLCHAR 0 0 ";"     3 col3 ""
4 SQLCHAR 0 0 ";"     4 col4 ""
5 SQLCHAR 0 0 "\r\n"  5 col5 ""

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby
Sent: Tuesday, May 01, 2007 6:29 AM
To: dba-sqlserver at databaseadvisors.com
Subject: [dba-SQLServer] Bulk insert

I am trying to use bulk insert to pull csv file data into a table. It appears that I have everything except for the fact that the data is enclosed in "" - "SomeData". This is common in CSV files in order to "wrap" commas which would otherwise be interpreted as a field delimiter. Bulk Copy has a Line Terminator and a Field Terminator as parameters, but I do not see anyplace to specify what is known as a "Text qualifier" (the " surrounding the data).

Is there ANY way to do this? CSV files are very common, and text qualifiers are so common that the import / export wizard allows you to specify them. Surely bulk insert must understand them.

John W.
Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com No virus found in this incoming message. Checked by AVG Free Edition. Version: 7.5.467 / Virus Database: 269.6.2/782 - Release Date: 5/1/2007 2:10 AM From shait at mindspring.com Tue May 1 09:22:44 2007 From: shait at mindspring.com (Stephen Hait) Date: Tue, 1 May 2007 10:22:44 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <005001c78bf4$acb80290$657aa8c0@m6805> References: <005001c78bf4$acb80290$657aa8c0@m6805> Message-ID: Could you possibly use DTS instead of BCP? It should allow you to specify your text qualifier. Stephen On 5/1/07, JWColby wrote: > I am trying to to use the bulk insert to pull csv file data in to a table. > It appears that I have everything except for the fact that the data is > enclosed in "" - "SomeData". This is common in CSV files in order to "wrap" > commas which would otherwise be interpreted as a field delimiter. Bulk Copy > has a Line Terminator and a Field terminator as a parameter but I do not see > anyplace to specify what is known as a "Text qualifier" (the " surrounding > the data). > > Is there ANY way to do this? CSV files are very common and text qualifiers > are so common that the import / export wizard allows you to specify them. > Surely bulk insert must understand them. > > John W. 
Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From ebarro at verizon.net Tue May 1 09:29:03 2007 From: ebarro at verizon.net (Eric Barro) Date: Tue, 01 May 2007 07:29:03 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: Message-ID: <0JHD00A0X9IZ7HB4@vms048.mailsrvcs.net> DTS may be more flexible but it is significantly slower than BULK INSERT. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stephen Hait Sent: Tuesday, May 01, 2007 7:23 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Could you possibly use DTS instead of BCP? It should allow you to specify your text qualifier. Stephen On 5/1/07, JWColby wrote: > I am trying to to use the bulk insert to pull csv file data in to a table. > It appears that I have everything except for the fact that the data is > enclosed in "" - "SomeData". This is common in CSV files in order to "wrap" > commas which would otherwise be interpreted as a field delimiter. > Bulk Copy has a Line Terminator and a Field terminator as a parameter > but I do not see anyplace to specify what is known as a "Text > qualifier" (the " surrounding the data). > > Is there ANY way to do this? CSV files are very common and text > qualifiers are so common that the import / export wizard allows you to specify them. > Surely bulk insert must understand them. > > John W. 
Colby
> Colby Consulting
> www.ColbyConsulting.com
>
> _______________________________________________
> dba-SQLServer mailing list
> dba-SQLServer at databaseadvisors.com
> http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
> http://www.databaseadvisors.com
>
>

From jwcolby at colbyconsulting.com Tue May 1 09:48:36 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Tue, 1 May 2007 10:48:36 -0400
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <0JHD00A0X9IZ7HB4@vms048.mailsrvcs.net>
References: <0JHD00A0X9IZ7HB4@vms048.mailsrvcs.net>
Message-ID: <005d01c78bff$cca0f390$657aa8c0@m6805>

There is a BATCHSIZE parameter to BULK INSERT. Does anyone know the effects (speed-wise) of using this parameter? What is a reasonable value for it? I did not use a BATCHSIZE on the first file imported, but I set BATCHSIZE = 10000 (10K) for the next file. These import files can be anywhere from a couple of hundred thousand records up to 4 million records.

John W. Colby
Colby Consulting
www.ColbyConsulting.com

From shait at mindspring.com Tue May 1 10:00:45 2007
From: shait at mindspring.com (Stephen Hait)
Date: Tue, 1 May 2007 11:00:45 -0400
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <0JHD00A0X9IZ7HB4@vms048.mailsrvcs.net>
References: <0JHD00A0X9IZ7HB4@vms048.mailsrvcs.net>
Message-ID: 

I agree that BULK INSERT is faster. If quoted values in your source file are causing problems when using BCP, one thing you might try is to import first into MS Access, which handles quoted values easily. Then export the data from Access as tab delimited. You can then specify the tab character as the field terminator in your BCP command with the switch -t"\t"

Good luck.

On 5/1/07, Eric Barro wrote:
> DTS may be more flexible but it is significantly slower than BULK INSERT.
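[Editor's note: the round trip Stephen describes - let something that understands text qualifiers parse the quoted CSV, then re-export it tab-delimited for BCP's -t"\t" switch - can also be done with a small script. The sketch below is illustrative only (Python rather than the Access/VBA the thread actually used; the function name and file paths are made up), and it also trims the pad spaces John complained about:]

```python
import csv

def csv_to_tab(src_path, dst_path):
    """Convert a quoted, comma-delimited CSV to tab-delimited text.

    The csv module parses the text qualifier (quotes around fields that
    contain embedded commas) that BCP / BULK INSERT cannot; trailing and
    leading pad spaces are stripped along the way. Assumes no field
    actually contains a tab character.
    """
    with open(src_path, newline="") as src, open(dst_path, "w") as dst:
        # skipinitialspace tolerates a space after the comma before a quote
        for row in csv.reader(src, skipinitialspace=True):
            dst.write("\t".join(field.strip() for field in row) + "\n")
```

The resulting file loads with BCP using -t"\t" as the field terminator, with no format file needed for the qualifier.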
>

From jwcolby at colbyconsulting.com Tue May 1 10:53:19 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Tue, 1 May 2007 11:53:19 -0400
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: 
References: <0JHD00A0X9IZ7HB4@vms048.mailsrvcs.net>
Message-ID: <006701c78c08$d7993ec0$657aa8c0@m6805>

And that is exactly what I am doing and it is soooooo KLUDGY!!!

I am running a program I wrote to read the data out of the original fixed width file because SQL Server won't handle fixed width files and strip off the spaces. How incredibly stupid is THAT? Is there ANYONE out there who WANTS those spaces? So I am already using a KLUDGE to get data into SQL Server. Now I export it out to a perfectly valid CSV file only to discover that SQL Server BCP and Bulk Insert don't even look at (understand) quotes around comma delimited fields. But ACCESS does. But Access is a TOY, remember? What exactly does that make SQL Server, that it needs a toy to feed it data?

This has been an exercise in discovering just how brain dead the data import processes are (or can be anyway) for SQL Server. This is NOT rocket science. I am able to write a utility to open / import / mangle / export it back out to another file in VBA. How tough can it be to do this import inside of SQL Server natively?

I have no idea how widespread this kind of file is, but I can tell you that that is all I EVER see in the industry I am dealing with: HUGE files, fixed width, space padded right. And I can tell you they have been a royal PITA to get into SQL Server. At least now I have my own utility that can get these input files into the format I need, even if it is in Access / VBA. My next step is to port this to VB.Net so that I can do it a little more "natively". Once I get a little more familiar with VB.Net I want to look at storing the data right into a recordset in ADO and then writing that back to SQL Server.
If that is too slow (I suspect that it will be) then I can still do what I do now and import / mangle / write to file and then run a stored procedure to do a Bulk Insert from the file I create. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stephen Hait Sent: Tuesday, May 01, 2007 11:01 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert I agree that BULK INSERT is faster. If quoted values in your source file are causing problems when using BCP, one thing you might try is to import first into MS Access which handles quoted values easily. Then export the data from Access as tab delmited. You can then specify the tab character as the field terminator in your BCP command with the switch, -t"\t" Good luck. On 5/1/07, Eric Barro wrote: > DTS may be more flexible but it is significantly slower than BULK INSERT. > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From DavidL at sierranevada.com Tue May 1 12:28:31 2007 From: DavidL at sierranevada.com (David Lewis) Date: Tue, 1 May 2007 10:28:31 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: References: Message-ID: <00101736F13D774F88C54058CB2663C8014D337A@celebration.sierranevada.corp> John: There are a few websites concerned with sql server, with active forums. I recommend you check them out. One is sqlcentral.com, the other is sswug. You are right, access is a 'toy' when compared to sql server. One problem you are having is that you don't have the luxury of a learning curve -- but that is not the fault of the tool. Hang in there. D And that is exactly what I am doing and it is soooooo KLUDGY!!! 
From jwcolby at colbyconsulting.com Tue May 1 12:51:29 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Tue, 1 May 2007 13:51:29 -0400
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <00101736F13D774F88C54058CB2663C8014D337A@celebration.sierranevada.corp>
Message-ID: <007001c78c19$592f94b0$657aa8c0@m6805>

ROTFL. One problem I have is a "big boy" database that requires a toy to feed it data.

And yes, that is the fault of the tool. Luckily I am an expert at programming the "toy" so I can get around the shortcomings of the ... Hmm... "big boy's database".

I do notice that after all my posts there is dead silence about getting SQL Server to do what I need in an expeditious manner. Having combed the web I also find that there are hundreds of posts about this shortcoming of SQL Server.

"Oh, well... You can import to a temp table and then use xxx to strip the spaces and then use YYY to get the data where it really should have been put in the first damned place by this "big boy's database". Yea, except that I have 60 of these files to do, and another 100 coming in a few weeks and another 50 shortly after that. It is waaaay faster to program my "toy" to feed it data.

I will be the first to say that SQL Server is a powerful database engine, that is why I use it. I am also however willing to admit that it sucks in some very visible ways.

Learning curve my hind ass.

John W. Colby
Colby Consulting
www.ColbyConsulting.com
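The "temp table plus strip" workaround can also be done outside the server. Since bcp and BULK INSERT (as of SQL 2005) do not understand quoted CSV, a small script can parse the quotes properly and re-write the file with a delimiter the bulk loader can be told about. A hedged sketch; the pipe delimiter (and the assumption that it never occurs in the data) are mine, not from the thread:

```python
import csv

def dequote_csv(in_path, out_path, delimiter="|"):
    """Parse a quoted CSV and re-write it with a plain delimiter that
    bcp / BULK INSERT can use (e.g. -t"|").  Assumes the chosen
    delimiter never occurs in the data, and fails loudly if it does."""
    with open(in_path, newline="") as src, open(out_path, "w") as dst:
        for row in csv.reader(src):   # csv handles "quoted, fields"
            for field in row:
                if delimiter in field:
                    raise ValueError("delimiter found in data: %r" % field)
            dst.write(delimiter.join(row) + "\n")
```

The guard matters: silently joining with a delimiter that appears in the data would shift every later column in that row.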
From accessd at shaw.ca Tue May 1 16:43:20 2007
From: accessd at shaw.ca (Jim Lawrence)
Date: Tue, 01 May 2007 14:43:20 -0700
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <007001c78c19$592f94b0$657aa8c0@m6805>
Message-ID: <0JHD00DHJTHA3210@l-daemon>

Hi John:

SQL Server 2005 was created as a database for manipulating data within it. It was not designed as an application for importing millions of rows on a steady basis. You are going to have to become a master of the bulk-loader or be patient with the limited speed of a pre-processor.

It might be worth investing in a good data conversion tool, even though developers are pre-conditioned into rolling their own. For example, here is a tool that might be worth looking into: http://www.utexas.edu/its/rc/tutorials/stat/spss/spss1/ or even Microsoft's BizTalk Server (there is a 120 day evaluation full copy at http://www.microsoft.com/technet/prodtechnol/biztalk/2006/downloads/default.mspx), which is designed for doing tricky conversion routines. You can add a whole program to manage the importation of a single field... and that sounds like what you need.

HTH
Jim
From ridermark at gmail.com Tue May 1 19:17:51 2007
From: ridermark at gmail.com (Mark Rider)
Date: Tue, 1 May 2007 19:17:51 -0500
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <007001c78c19$592f94b0$657aa8c0@m6805>

John,

You might want to check into SyncSort (www.syncsort.com). They specialize in creating the front end process for loading data and manipulating it as necessary before it is imported. I have worked with them in the past, but never found the money to make their solution worthwhile for what I am doing. Your situation is very different, and they may be able to provide a custom solution for your needs. I get no kickback for mentioning this, am not affiliated... etc. Just a tool that you might want to look into.

--
Mark Rider
http://dfwmdug.org
Don't anthropomorphize computers. They don't like it.

From fuller.artful at gmail.com Tue May 1 22:32:31 2007
From: fuller.artful at gmail.com (Arthur Fuller)
Date: Tue, 1 May 2007 23:32:31 -0400
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <007001c78c19$592f94b0$657aa8c0@m6805>
Message-ID: <29f585dd0705012032n22056cc6j73438aa575777e0a@mail.gmail.com>

To put it as gently as possible, you haven't even begun to explore what SSIS can do.
Several of your statements are so preposterous as to lie beneath rebuttal. I suggest that when you have a spare weekend, you investigate the SSIS documentation. There you will find abundant insight into how to handle CSV and fixed-width files, not to mention a whole lot more. I'm sorry that the wizards couldn't get you from here to there, but just because you couldn't figure out immediately how to do it does NOT mean it's the tool's problem. SSIS is a quantum leap beyond what DTS could do, and even DTS could handle your CSV and trailing-spaces problems without difficulty. I respectfully suggest it's time for input (read some documentation), not output. This is not to say that SSIS is problem-free, but the trivial issues you raise are solvable in a few minutes of reading.

Arthur
From jwcolby at colbyconsulting.com Wed May 2 06:55:50 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Wed, 2 May 2007 07:55:50 -0400
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <29f585dd0705012032n22056cc6j73438aa575777e0a@mail.gmail.com>
Message-ID: <002e01c78cb0$d6e93950$657aa8c0@m6805>

Arthur, knock it off. I haven't even begun to explore what SQL Server in general can do. I asked a whole STREAM of questions about using SSIS to do this and you and everyone else were strangely silent. This list is exactly about getting HELP, not "your statements are preposterous".
If you can't answer how to do it, that is fine (notice that you STILL aren't offering an answer), but if YOU DON'T KNOW (being the kung fu master of SQL Server) then exactly how am I supposed to figure it out?

Preposterous my statements may be (though you aren't showing me proof of that), but insulting your statements are, and not helpful to boot.

I came up with a solution to my problem that involved a TOY. YOU OTOH do NOT offer a solution but hint that someday when I too am a kung fu master I will be able to solve my problems masterfully. Which I do not doubt. I respectfully suggest that you put your money where your mouth is.

John W. Colby
Colby Consulting
www.ColbyConsulting.com
From jwcolby at colbyconsulting.com Wed May 2 07:49:12 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Wed, 2 May 2007 08:49:12 -0400
Subject: [dba-SQLServer] Speed of bulk insert
Message-ID: <004301c78cb8$4d611e70$657aa8c0@m6805>

Interestingly, the speed of import has either:
- jumped from ~1K records / second to about 11K records / second, OR
- I previously missed a decimal place (though the times required earlier on were still quite large, so I don't think that's it).

In any event ATM I am getting a speed consistently around 11K / second for a bulk insert operation. That is a significant event in my ability to expeditiously process 100 million records.

John W. Colby
Colby Consulting
www.ColbyConsulting.com

From jwcolby at colbyconsulting.com Wed May 2 08:56:26 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Wed, 2 May 2007 09:56:26 -0400
Subject: [dba-SQLServer] NVARCHAR turned into Text(255) in Access
Message-ID: <005001c78cc1$adf4d930$657aa8c0@m6805>

I uploaded my billing database to SQL Server the other day. The process worked perfectly AFAICT, however SQL Server turned my memo field into an nvarchar, which when linked back into ACCESS was now being converted to a TEXT(255). Needless to say, this was not my intention, nor is it acceptable.
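Back on the bulk-insert speed question: the difference the two quoted rates make on John's 100-million-row job is easy to check with a little arithmetic:

```python
rows = 100_000_000            # John's target row count
fast = rows / 11_000 / 3600   # hours at ~11K rows/second
slow = rows / 1_000 / 3600    # hours at the earlier ~1K rows/second
print(f"{fast:.1f} h vs {slow:.1f} h")   # roughly 2.5 h vs 27.8 h
```

An order of magnitude in load rate is the difference between an overnight job and one that fits in a morning, which is why the jump matters for this workload.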
Has anyone ever run into this? Known solutions?

John W. Colby
Colby Consulting
www.ColbyConsulting.com

From jwcolby at colbyconsulting.com Wed May 2 09:10:34 2007
From: jwcolby at colbyconsulting.com (JWColby)
Date: Wed, 2 May 2007 10:10:34 -0400
Subject: [dba-SQLServer] NVARCHAR turned into Text(255) in Access
In-Reply-To: <005001c78cc1$adf4d930$657aa8c0@m6805>
Message-ID: <005501c78cc3$a737c5b0$657aa8c0@m6805>

It turns out that simply changing it to VarChar(8000) works in this case. I tried VarChar(64000) and got a message from SQL Server that 8000 was the max for VarChar. This would have to be described as a bug in the converter, however, which could cause data loss (though AFAICT it did not in my case, only because all of my memo fields are shorter than 8K). It also indicates a possible problem with the ODBC driver / connection: since a Memo can be up to 64000 characters and VarChar can only be 8K, there is a very real possibility of data loss in some cases.

John W. Colby
Colby Consulting
www.ColbyConsulting.com
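One way to confirm that VarChar(8000) is actually safe before converting is to scan the source data for the longest value in each column. A sketch; the assumption is a CSV export with a header row, not John's actual file layout:

```python
import csv

def max_field_lengths(path):
    """Return the longest value seen per column of a CSV with a
    header row, to check whether VarChar(8000) can hold everything."""
    longest = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for col, val in row.items():
                longest[col] = max(longest.get(col, 0), len(val or ""))
    return longest
```

Any column that comes back over 8000 would need a text column (or, in SQL Server 2005, varchar(max)) rather than VarChar(8000) to avoid silent truncation.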
From fuller.artful at gmail.com Wed May 2 10:24:08 2007
From: fuller.artful at gmail.com (Arthur Fuller)
Date: Wed, 2 May 2007 11:24:08 -0400
Subject: [dba-SQLServer] Bulk insert
In-Reply-To: <002e01c78cb0$d6e93950$657aa8c0@m6805>
Message-ID: <29f585dd0705020824o4f4ea4e2q57a319735c214a@mail.gmail.com>

Good points, all, JC. I will upload an approach in an hour or two. Meanwhile I have one of my own alligators to wrestle. But before closing, I would like to apologize. My words read much more harshly than they were intended.

Arthur
From DavidL at sierranevada.com Wed May 2 10:20:23 2007
From: DavidL at sierranevada.com (David Lewis)
Date: Wed, 2 May 2007 08:20:23 -0700
Subject: [dba-SQLServer] Bulk insert (Arthur Fuller)
Message-ID: <00101736F13D774F88C54058CB2663C8014D36B6@celebration.sierranevada.corp>

Hi Guys: Keep cool, everybody.

John: The fact that you asked many questions that nobody here could answer is unfortunate, but again doesn't reflect badly on the tool or the users. The two sites I mentioned (sqlcentral.com and sswug) have very active forums specifically dedicated to such topics as SSIS in 2005 or DTS in sql2k, etc. When you have a very specific question such as some of the ones you posed here, I have found better results on forums such as those. This forum has some very capable people, but relatively few regular posters. SQL Server is such a broad product that by now it takes a number of people specialized in different aspects of it to solve many of the harder problems. Most of those specialists are solving problems that the wizards in sql server cannot begin to address. Good luck.

David
Arthur From ebarro at verizon.net Wed May 2 09:22:00 2007 From: ebarro at verizon.net (Eric Barro) Date: Wed, 02 May 2007 07:22:00 -0700 Subject: [dba-SQLServer] [AccessD] NVARCHAR turned into Text(255) in Access In-Reply-To: <005501c78cc3$a737c5b0$657aa8c0@m6805> Message-ID: <0JHF00LMC3XZEF5F@vms044.mailsrvcs.net> One caveat with 8000 character field lengths is that SQL Server will "spread" the record out to 2 pages when it retrieves the record. In other words, if you want retrieval of SQL data to be efficient you have to ensure that the total field length falls within 8060 bytes (equivalent to one page of data). http://www.thescripts.com/forum/thread82625.html -----Original Message----- From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Wednesday, May 02, 2007 7:11 AM To: dba-sqlserver at databaseadvisors.com; 'Access Developers discussion and problem solving' Subject: Re: [AccessD] [dba-SQLServer] NVARCHAR turned into Text(255) in Access It turns out that simply changing it to VarChar(8000) works in this case. I tried VarChar(64000) and got a message from SQL Server that 8000 was the max for VarChar. This would have to be described as a bug in the converter however, which could cause data loss (though AFAICT it did not in my case, only because all of my memo fields are shorter than 8K). It also indicates a possible problem with the ODBC driver / connection: since a Memo can be up to 64,000 characters and VarChar can only be 8K, there is a very real possibility of data loss in some cases. John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Wednesday, May 02, 2007 9:56 AM To: dba-sqlserver at databaseadvisors.com; 'Access Developers discussion and problem solving' Subject: [dba-SQLServer] NVARCHAR turned into Text(255) in Access I uploaded my billing database to SQL Server the other day. The process worked perfectly AFAICT, however SQL Server turned my memo field into a nvarchar, which when linked back into ACCESS, was now being converted to a TEXT(255). Needless to say, this was not my intention, nor is it acceptable. Has anyone ever run into this? Known solutions? John W. Colby Colby Consulting www.ColbyConsulting.com From jwcolby at colbyconsulting.com Wed May 2 10:59:57 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Wed, 2 May 2007 11:59:57 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <29f585dd0705020824o4f4ea4e2q57a319735c214a@mail.gmail.com> References: <00101736F13D774F88C54058CB2663C8014D337A@celebration.sierranevada.corp><007001c78c19$592f94b0$657aa8c0@m6805><29f585dd0705012032n22056cc6j73438aa575777e0a@mail.gmail.com><002e01c78cb0$d6e93950$657aa8c0@m6805> <29f585dd0705020824o4f4ea4e2q57a319735c214a@mail.gmail.com> Message-ID: <006801c78cd2$ef383ed0$657aa8c0@m6805> >I would like to apologize. As would I. Believe me I know all about alligators as I am currently wrestling one that would make a great movie, set in Maine (of all places).... Sometimes I get a little testy... John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Wednesday, May 02, 2007 11:24 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Good points, all, JC. I will upload an approach in an hour or two. 
Meanwhile I have one of my own alligators to wrestle. But before closing, I would like to apologize. My words read much more harshly than they were intended. Arthur On 5/2/07, JWColby wrote: > > Arthur knock it off. I haven't even begun to explore what SQL Server > in general can do. I asked a whole STREAM of questions about using > SSIS to do this and you and everyone else were strangely silent. This > list is exactly about getting HELP not "your statements are > preposterous". If you can't answer how to do it, that is fine (notice > that you STILL aren't offering an > answer) but if YOU DON'T KNOW (being the kung fu master of SQL Server) > then exactly how am I supposed to figure it out. > > Preposterous my statements may be (though you aren't showing me proof > of > that) but insulting your statements are, and not helpful to boot. > > I came up with a solution to my problem that involved a TOY. YOU OTOH > do NOT offer a solution but hint that someday when I too am a kung fu > master I will be able to solve my problems masterfully. Which I do > not doubt. I respectfully suggest that you put your money where your > mouth is. > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed May 2 11:29:55 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Wed, 2 May 2007 12:29:55 -0400 Subject: [dba-SQLServer] Bulk insert (Arthur Fuller) In-Reply-To: <00101736F13D774F88C54058CB2663C8014D36B6@celebration.sierranevada.corp> References: <00101736F13D774F88C54058CB2663C8014D36B6@celebration.sierranevada.corp> Message-ID: <006c01c78cd7$1ec0af30$657aa8c0@m6805> >Good luck. Thanks, I appreciate that. I have in fact already solved my problem, just not within the confines of SQL Server. 
My solution is FAR from optimal but it does work and I am able to use it to get the job done. I have no doubt that SQL Server does indeed have capabilities that I cannot even begin to grasp, never mind make use of. One of the unfortunate problems that we all have is that MS proselytizes about how great and easy their tools are to use. I see that in Access, and understand exactly how far that is from the truth when it comes to anything other than the trivial. OTOH wizards are intended for the novice and should IMHO offer more than trivial solutions to trivial problems. As an example they have an AWESOME wizard for getting data into SQL Server. It allows you to use the wizard to import CSV files quickly and easily, and it even allows you to import fixed width files, though certainly not so quickly and easily, particularly when there are LOTS of fields. Having spent so much (programming) time on this wizard, how much extra time would it take to allow the stuff you entered to be stored (it does that) but more importantly REUSED BY THE WIZARD? They store this fixed width (import) stuff out to an SSIS file and then the silly (if quite powerful initially) wizard can't even read that file. AND the file itself is useless for the next pass because it has embedded in it stuff about creating a table (which was already created by the wizard during the first pass). AND it defaults to NVARCHAR which I for one don't need (it doubles the storage size of my data - which is already HUGE). AND it offers no obvious way to say use VARCHAR instead. AND it (the import spec) is stored in XML which just SUCKS for trying to edit without a special tool. AND when clicked on (to edit it) it takes me to another tool entirely (VS 2005) where it displays it graphically as objects to be edited but does NOT allow you to edit the pieces needed easily. Do you sense my frustration? Yea, SQL Server is an awesome tool. I BELIEVE THAT. 
But I am earning below minimum wage here spending HOURS and HOURS struggling to do the simplest damned thing, and in the end (having REALLY AND TRULY spent hours and hours trying to get a handle on this) going back to my TOYS to get the job done. What the hell good is SQL Server 2005 Express going to do for (ME) Joe Blow developer (which MS is pushing it for) if what should be the simplest part is so damned complex that they (Microsoft) can't even build a wizard that will allow us to use it? Unfortunately, with terabyte sized databases to import I do not have the option of throwing up my hands and going back to my toys. I must and will cope with the inanities. And if I vent, well... I apologize in advance. Having been on the ACCESSD list for so long, and seeing the skill level there, I had high hopes that I could get answers to this (apparently not so simple) problem on the SQL Server forum. Not to worry, I will survive, although minimum wage makes my house payment tough to make. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of David Lewis Sent: Wednesday, May 02, 2007 11:20 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert (Arthur Fuller) Hi Guys: Keep cool, everybody. John: The fact that you asked many questions that nobody here could answer is unfortunate but again doesn't reflect badly on the tool or the users. The two sites I mentioned (sqlcentral.com and sswug) have very active forums specifically dedicated to such topics as SSIS in 2005 or DTS in sql2k, etc. When you have a very specific question such as some of the ones you posed here, I have found better results on forums such as those. This forum has some very capable people, but relatively few regular posters. 
SQL Server is such a broad product that by now it takes a number of people specialized in different aspects of it to solve many of the harder problems. Most of those specialists are solving problems that the wizards in sql server cannot begin to address. Good luck. David ------------------------------ Message: 6 Date: Wed, 2 May 2007 07:55:50 -0400 From: "JWColby" Subject: Re: [dba-SQLServer] Bulk insert To: Message-ID: <002e01c78cb0$d6e93950$657aa8c0 at m6805> Content-Type: text/plain; charset="us-ascii" Arthur knock it off. I haven't even begun to explore what SQL Server in general can do. I asked a whole STREAM of questions about using SSIS to do this and you and everyone else were strangely silent. This list is exactly about getting HELP not "your statements are preposterous". If you can't answer how to do it, that is fine (notice that you STILL aren't offering an answer) but if YOU DON'T KNOW (being the kung fu master of SQL Server) then exactly how am I supposed to figure it out. Preposterous my statements may be (though you aren't showing me proof of that) but insulting your statements are, and not helpful to boot. I came up with a solution to my problem that involved a TOY. YOU OTOH do NOT offer a solution but hint that someday when I too am a kung fu master I will be able to solve my problems masterfully. Which I do not doubt. I respectfully suggest that you put your money where your mouth is. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Tuesday, May 01, 2007 11:33 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert To put it as gently as possible, you haven't even begun to explore what SSIS can do. Several of your statements are so preposterous as to lie beneath rebuttal. 
I suggest that when you have a spare weekend, you investigate the SSIS documentation. There you will find abundant insight into how to handle CSV and fixed-width files, not to mention a whole lot more. I'm sorry that the wizards couldn't get you from here to there, but just because you couldn't figure out immediately how to do it does NOT mean it's the tool's problem. SSIS is a quantum leap beyond what DTS could do, and even it could handle your CSV and trailing spaces problems without difficulty. I respectfully suggest it's time for input (read some documentation) not output. This is not to say that SSIS is problem-free, but the trivial issues you raise are solvable in a few minutes of reading. Arthur _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Mon May 7 09:05:54 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Mon, 7 May 2007 10:05:54 -0400 Subject: [dba-SQLServer] Thanks for the help Message-ID: <001a01c790b0$d43b04f0$657aa8c0@m6805> I am making great progress on understanding how to create stored procedures and getting them functioning. The piece I am still missing for my particular application is how to get them to run from Access. As you probably know by now I am trying to do batch processing of data import / export. Last week I was working on getting a 100 million record data import happening, where the data came in from 56 different files of various size, one or more files per state, depending on the population of the state. I got a stored procedure built and, using a Bulk Insert SQL statement was able to up my import from a previous high less than 500 records / second to up above 12K records / second on average. What an improvement that has been! Again a million thanks to all those who so patiently talked me through this stuff. 
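Since every one of the 56 files goes through the same stored procedure, the per-file calls can be generated by a small driver script rather than typed one at a time. A rough sketch in Python (the procedure name usp_BulkInsertFile and the *.csv file layout are placeholders for illustration, not the real names):

```python
from pathlib import Path

def build_import_batch(data_dir, proc="usp_BulkInsertFile"):
    """Build one EXEC statement per data file so the whole batch can be
    fed to sqlcmd (or run through ADO) instead of keyed in by hand."""
    stmts = []
    for f in sorted(Path(data_dir).glob("*.csv")):
        path = str(f).replace("'", "''")  # escape quotes for the T-SQL string literal
        stmts.append(f"EXEC {proc} @FileName = N'{path}';")
    return stmts
```

The generated statements could be written to a single .sql file and run as one batch, with SET STATISTICS TIME ON replacing the manual stopwatch timing.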
In the end I simply opened a query window inside of SQL Server, and keyed in the name of the stored procedure and a file name, manually recorded the time it took SQL Server to perform the insert, modified the filename and did the next etc. 56 times and I was done. Not efficient but with the import times so radically improved at least I could get it done. My next step has to be getting such a stored procedure functioning when run from Access. ATM my application that does the data transformation from fixed width to csv is the driver for this entire process, and ATM it is written in Access / VBA. Remember that these stored procedures simply do a BULK INSERT, passing in a file name. Therefore these stored procedures do not yet return a recordset (or even a value), but I really do need to get them to return a value eventually. My strategy is to "baby step" this thing so that I can sort out where the inevitable problem lies and get it functioning one step at a time. So my next step is simply to get the stored procedure executing when called from VBA. If anyone has code that they are willing to share that executes a stored procedure in SQL Server, passing in a parameter, executed from VBA in Access I would be most appreciative. On another note entirely, does anyone know how to, in SQL, specify a specific quantity of records, from a specific place in a table, without depending on an autonumber PK to do it? IOW, I need to pull the first 2 million records, then the second 2 million records, then the third 2 million records etc. I will be exporting these out to a CSV file. The table has an autoincrement PK but some records have been deleted because their address was not deliverable. Thus I could simply say "WHERE PKID >0 and <=2,000,000" and for the next set say "WHERE PKID > 2,000,000 and <=4,000,000" and in fact I will use this approach if required. 
The problem is that the result set will not be 2 million records, but rather 2 million minus the deleted records in that range. I suppose I could create another autoincrement field so that I would have a field where the numbers are consecutive and then use the approach above, using that field. I am just trying to discover whether it is possible with SQL to do this without depending on an autoincrementing number field. Thanks, John W. Colby Colby Consulting www.ColbyConsulting.com From James at fcidms.com Mon May 7 09:57:11 2007 From: James at fcidms.com (James Barash) Date: Mon, 7 May 2007 10:57:11 -0400 Subject: [dba-SQLServer] Thanks for the help In-Reply-To: <001a01c790b0$d43b04f0$657aa8c0@m6805> Message-ID: <009b01c790b7$fe185190$800101df@fci.local> John: Here is a basic call to a stored procedure from VBA with two parameters:

Public Sub SetOrderStatus(ID As Long, Status As Long)
    Dim conn As ADODB.Connection
    Dim cmd As ADODB.Command
    Set conn = New ADODB.Connection
    conn.ConnectionString = "Insert connection string here"
    conn.Open
    Set cmd = New ADODB.Command
    With cmd
        Set .ActiveConnection = conn
        .CommandType = adCmdStoredProc
        .CommandText = "sp_SetOrderStatus"
        .Parameters.Append cmd.CreateParameter("@ID", adInteger, adParamInput, , ID)
        .Parameters.Append cmd.CreateParameter("@Status", adInteger, adParamInput, , Status)
        .Execute
    End With
    Set cmd = Nothing
    conn.Close
    Set conn = Nothing
End Sub

For your question on pulling records from SQL Server, if you are using SQL Server 2005, you can use:

With OrdersA as (
    select [Order Number], ROW_NUMBER() over (Order By [Order ID]) as 'ROWNUMBER'
    from Orders
)
Select * from OrdersA where ROWNUMBER between 100 and 200

Hope that helps. 
James Barash -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Monday, May 07, 2007 10:06 AM To: 'Access Developers discussion and problem solving'; dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Thanks for the help I am making great progress on understanding how to create stored procedures and getting them functioning. The piece I am still missing for my particular application is how to get them to run from Access. As you probably know by now I am trying to do batch processing of data import / export. Last week I was working on getting a 100 million record data import happening, where the data came in from 56 different files of various size, one or more files per state, depending on the population of the state. I got a stored procedure built and, using a Bulk Insert SQL statement was able to up my import from a previous high less than 500 records / second to up above 12K records / second on average. What an improvement that has been! Again a million thanks to all those who so patiently talked me through this stuff. In the end I simply opened a query window inside of SQL Server, and keyed in the name of the stored procedure and a file name, manually recorded the time it took SQL Server to perform the insert, modified the filename and did the next etc. 56 times and I was done. Not efficient but with the import times so radically improved at least I could get it done. My next step has to be getting such a stored procedure functioning when run from Access. ATM my application that does the data transformation from fixed width to csv is the driver for this entire process, and ATM it is written in Access / VBA. Remember that these stored procedures simply do a BULK INSERT, passing in a file name. therefore these stored procedures do not yet return a recordset (or even a value), but I really do need to get them to return a value eventually. 
My strategy is to "baby step" this thing so that I can sort out where the inevitable problem lies and get it functioning one step at a time. So my next step is simply to get the stored procedure executing when called from VBA. If anyone has code that they are willing to share that executes a stored procedure in SQL Server , passing in a parameter, executed from VBA out in Access I would be most appreciative. On another note entirely, does anyone know how to, in SQL, specify a specific quantity of records, from a specific place in a table, without depending on an autonumber PK to do it. IOW, I need to pull the first 2 million records, then the second 2 million records, then the third 2 million records etc. I will be exporting these out to a CSV file. The table has an autoincrement PK but some records have been deleted because their address was not deliverable. Thus I could simply say "WHERE PKID >0 and <=2,000,000" and for the next set say "WHERE PKID > 2,000,000 and <=4,000,000" and in fact I will use this approach if required. The problem is that the result set will not be 2 million records, but rather 2 million minus the deleted records in that range. I suppose I could create another autoincrement field so that I would have a field where the numbers are consecutive and then use the approach above, using that field. I am just trying to discover whether it is possible with SQL to do this without depending on an autoincrementing number field. Thanks, John W. 
Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Mon May 7 10:18:53 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Mon, 7 May 2007 11:18:53 -0400 Subject: [dba-SQLServer] Thanks for the help In-Reply-To: <009b01c790b7$fe185190$800101df@fci.local> References: <001a01c790b0$d43b04f0$657aa8c0@m6805> <009b01c790b7$fe185190$800101df@fci.local> Message-ID: <002801c790bb$06d93210$657aa8c0@m6805> James, Gracias on both accounts. I'll implement those tonight and let you knows how it goes. If this all works I will finally have the capabilities in place to automatically import / export these huge files for processing without costing me untold hours of manual labor doing so. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of James Barash Sent: Monday, May 07, 2007 10:57 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Thanks for the help John: Here is a basic call to a stored procedure from VBA with two parameter: Public Sub SetOrderStatus(ID as Long, Status as Long) Dim conn As ADODB.Connection Dim cmd As ADODB.command Set conn = New ADODB.Connection conn.ConnectionString = "Insert connection string here" conn.Open Set cmd = New ADODB.command With cmd Set .ActiveConnection = conn .CommandType = adCmdStoredProc .CommandText = "sp_SetOrderStatus" .Parameters.Append cmd.CreateParameter("@ID", adInteger, adParamInput, , ID) .Parameters.Append cmd.CreateParameter("Status", adInteger, adParamInput, , Status) .Execute End With Set cmd = Nothing conn.Close Set conn = Nothing End Sub For your question on pulling records from SQL Server, if you are using SQL Server 
2005, you can use: With OrdersA as (select [Order Number], ROW_NUMBER() over (Order By [Order ID]) as 'ROWNUMBER' from Orders) Select * from OrdersA where ROWNUMBER between 100 and 200 Hope that helps. James Barash -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Monday, May 07, 2007 10:06 AM To: 'Access Developers discussion and problem solving'; dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Thanks for the help I am making great progress on understanding how to create stored procedures and getting them functioning. The piece I am still missing for my particular application is how to get them to run from Access. As you probably know by now I am trying to do batch processing of data import / export. Last week I was working on getting a 100 million record data import happening, where the data came in from 56 different files of various size, one or more files per state, depending on the population of the state. I got a stored procedure built and, using a Bulk Insert SQL statement was able to up my import from a previous high less than 500 records / second to up above 12K records / second on average. What an improvement that has been! Again a million thanks to all those who so patiently talked me through this stuff. In the end I simply opened a query window inside of SQL Server, and keyed in the name of the stored procedure and a file name, manually recorded the time it took SQL Server to perform the insert, modified the filename and did the next etc. 56 times and I was done. Not efficient but with the import times so radically improved at least I could get it done. My next step has to be getting such a stored procedure functioning when run from Access. ATM my application that does the data transformation from fixed width to csv is the driver for this entire process, and ATM it is written in Access / VBA. 
Remember that these stored procedures simply do a BULK INSERT, passing in a file name. therefore these stored procedures do not yet return a recordset (or even a value), but I really do need to get them to return a value eventually. My strategy is to "baby step" this thing so that I can sort out where the inevitable problem lies and get it functioning one step at a time. So my next step is simply to get the stored procedure executing when called from VBA. If anyone has code that they are willing to share that executes a stored procedure in SQL Server , passing in a parameter, executed from VBA out in Access I would be most appreciative. On another note entirely, does anyone know how to, in SQL, specify a specific quantity of records, from a specific place in a table, without depending on an autonumber PK to do it. IOW, I need to pull the first 2 million records, then the second 2 million records, then the third 2 million records etc. I will be exporting these out to a CSV file. The table has an autoincrement PK but some records have been deleted because their address was not deliverable. Thus I could simply say "WHERE PKID >0 and <=2,000,000" and for the next set say "WHERE PKID > 2,000,000 and <=4,000,000" and in fact I will use this approach if required. The problem is that the result set will not be 2 million records, but rather 2 million minus the deleted records in that range. I suppose I could create another autoincrement field so that I would have a field where the numbers are consecutive and then use the approach above, using that field. I am just trying to discover whether it is possible with SQL to do this without depending on an autoincrementing number field. Thanks, John W. 
Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jlawrenc1 at shaw.ca Mon May 7 13:36:45 2007 From: jlawrenc1 at shaw.ca (Jim Lawrence) Date: Mon, 07 May 2007 11:36:45 -0700 Subject: [dba-SQLServer] Thanks for the help In-Reply-To: <001a01c790b0$d43b04f0$657aa8c0@m6805> Message-ID: <0JHO00ESFOU4UQQ0@l-daemon> Hi John: Refer to DBA article: http://www.databaseadvisors.com/newsletters/newsletter112003/0311UnboundReports.asp The section shows the method of parameterizing a request to MS SQL Server from Access. There is also a download attached to the article though in the demo it only connects to an Access DB BE. The modification required to allow it to work with a MS SQL Server only required the connection string to be changed. Here is a link to a short online video on the subject, from the server end: http://download.microsoft.com/download/b/3/8/b3847275-2bea-440a-8e2e-305b009bb261/sql_12.wmv HTH Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Monday, May 07, 2007 7:06 AM To: 'Access Developers discussion and problem solving'; dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Thanks for the help I am making great progress on understanding how to create stored procedures and getting them functioning. The piece I am still missing for my particular application is how to get them to run from Access. As you probably know by now I am trying to do batch processing of data import / export. 
Last week I was working on getting a 100 million record data import happening, where the data came in from 56 different files of various size, one or more files per state, depending on the population of the state. I got a stored procedure built and, using a Bulk Insert SQL statement was able to up my import from a previous high less than 500 records / second to up above 12K records / second on average. What an improvement that has been! Again a million thanks to all those who so patiently talked me through this stuff. In the end I simply opened a query window inside of SQL Server, and keyed in the name of the stored procedure and a file name, manually recorded the time it took SQL Server to perform the insert, modified the filename and did the next etc. 56 times and I was done. Not efficient but with the import times so radically improved at least I could get it done. My next step has to be getting such a stored procedure functioning when run from Access. ATM my application that does the data transformation from fixed width to csv is the driver for this entire process, and ATM it is written in Access / VBA. Remember that these stored procedures simply do a BULK INSERT, passing in a file name. therefore these stored procedures do not yet return a recordset (or even a value), but I really do need to get them to return a value eventually. My strategy is to "baby step" this thing so that I can sort out where the inevitable problem lies and get it functioning one step at a time. So my next step is simply to get the stored procedure executing when called from VBA. If anyone has code that they are willing to share that executes a stored procedure in SQL Server , passing in a parameter, executed from VBA out in Access I would be most appreciative. On another note entirely, does anyone know how to, in SQL, specify a specific quantity of records, from a specific place in a table, without depending on an autonumber PK to do it. 
IOW, I need to pull the first 2 million records, then the second 2 million records, then the third 2 million records etc. I will be exporting these out to a CSV file. The table has an autoincrement PK but some records have been deleted because their address was not deliverable. Thus I could simply say "WHERE PKID >0 and <=2,000,000" and for the next set say "WHERE PKID > 2,000,000 and <=4,000,000" and in fact I will use this approach if required. The problem is that the result set will not be 2 million records, but rather 2 million minus the deleted records in that range. I suppose I could create another autoincrement field so that I would have a field where the numbers are consecutive and then use the approach above, using that field. I am just trying to discover whether it is possible with SQL to do this without depending on an autoincrementing number field. Thanks, John W. Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From tuxedoman888 at gmail.com Tue May 8 00:10:34 2007 From: tuxedoman888 at gmail.com (Billy Pang) Date: Mon, 7 May 2007 22:10:34 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <00101736F13D774F88C54058CB2663C8014D337A@celebration.sierranevada.corp> References: <00101736F13D774F88C54058CB2663C8014D337A@celebration.sierranevada.corp> Message-ID: <7c8826480705072210h71eccf8aj81e209850d3a4f03@mail.gmail.com> John: This is a bit late so not sure if you resolved your issues with BCP yet. However, I would like to point out that bcp does indeed handle fixed width datatypes. 
See following illustration:

1) create a database called test

2) run the following script to create a table called table1

CREATE TABLE [Table1] (
    [asdf] [char] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    [asdf2] [int] NULL ,
    [asdf3] [char] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
    [asdf4] [int] NULL ,
    [asf] [int] IDENTITY (1, 1) NOT NULL ,
    CONSTRAINT [PK_Table1] PRIMARY KEY CLUSTERED ( [asf] ) ON [PRIMARY]
) ON [PRIMARY]

3) populate the table with 2 records

insert into table1 values('1',2,'3',4)
insert into table1 values('a',5,'b',6)

4) ok.. now use bcp to export the data using trusted connection

bcp "test..table1" out "c:\yo_exported_data.txt" -T -c

there are two records in the data file. note: fixed width preserved!

5) use bcp to create a format file

bcp "test..table1" format -T -c -f "c:\yo_format_file.txt"

6) now use bcp to import the records you just exported using the format file you just created

bcp "test..table1" in "c:\yo_exported_data.txt" -T -c -f "c:\yo_format_file.txt"

7) count the records in the table

select count(*) from table1

there are four records

8) view the records in the tables.

select * from table1

sometimes bcp is a bit of a PITA but it works like a dream once you get it working. hardest part is creating the format file but once you get bcp to do that for you, the rest pretty much writes itself. HTH Billy On 5/1/07, David Lewis wrote: > > > > John: > > There are a few websites concerned with sql server, with active forums. > I recommend you check them out. One is sqlcentral.com, the other is > sswug. > > You are right, access is a 'toy' when compared to sql server. One > problem you are having is that you don't have the luxury of a learning > curve -- but that is not the fault of the tool. Hang in there. D > > > And that is exactly what I am doing and it is soooooo KLUDGY!!! 
>
> I am running a program I wrote to read the data out of the original fixed
> width file because SQL Server won't handle fixed width files and strip off
> the spaces. How incredibly stupid is THAT? Is there ANYONE out there who
> WANTS those spaces? So I am already using KLUDGE to get data into SQL
> Server. Now I export it out to a perfectly valid CSV file only to discover
> that SQL Server BCP and Bulk Insert don't even look at (understand) quotes
> around comma delimited fields.
>
> But ACCESS does. But Access is a TOY remember? What exactly does that make
> SQL Server that it needs a toy to feed it data?
>
> This has been an exercise in discovering just how brain dead the data import
> processes are (or can be anyway) for SQL Server. This is NOT rocket
> science. I am able to write a utility to open / import / mangle / export it
> back out to another file in VBA. How tough can it be to do this import
> inside of SQL Server natively?
>
> I have no idea how widespread this kind of file is but I can tell you that
> that is all I see EVER in the industry I am dealing with. HUGE files, fixed
> width, space padded right. And I can tell you they have been a royal PITA
> to get into SQL Server.
>
> At least now I have my own utility that can get these input files into the
> format I need, even if it is in ACCESS / VBA. My next step is to port this
> to VB.Net so that I can do it a little more "natively". Once I get a little
> more familiar with VB.Net I want to look at storing the data right into a
> recordset in ADO and then write that back to SQL Server. If that is too
> slow (I suspect that it will be) then I can still do what I do now and
> import / mangle / write to file and then run a stored procedure to do a Bulk
> Insert from the file I create.
>
> John W.
Colby > Colby Consulting > www.ColbyConsulting.com > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- Billy Pang http://dbnotes.blogspot.com/ "Once the game is over, the King and the pawn go back in the same box." - Italian proverb From jwcolby at colbyconsulting.com Tue May 8 08:25:00 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 8 May 2007 09:25:00 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <7c8826480705072210h71eccf8aj81e209850d3a4f03@mail.gmail.com> References: <00101736F13D774F88C54058CB2663C8014D337A@celebration.sierranevada.corp> <7c8826480705072210h71eccf8aj81e209850d3a4f03@mail.gmail.com> Message-ID: <00a801c79174$4858ee10$657aa8c0@m6805> Billy, The issue is not "handling fixed width" but in stripping off the trailing spaces in the process. The environment that I work in isn't "import a file and use it", but rather "import 56 files containing a HUNDRED MILLION records of 150 (or SEVEN HUNDRED) fields into a single table and use it". The SPACES in this last file were more than 40% of the total volume of the file. I CANNOT import all the spaces and then go back and strip them off in situ. The extra space in the database and the extra time to do the strip makes that a non starter. And this is where (so far) the bulk insert method has failed me. Nobody has shown me how to strip the spaces on the way in, and still keep the time up. I have written an external application, in Access / VBA but headed towards VB.NET which opens these HUGE (often four GIGABYTE) files (remember there were 56 of these files in the last batch), and then reads each line, pulls each field out, trims off the spaces, assembles it into a string with a field delimiter, and writes each line back to a new file. 
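[The line-by-line strip-and-delimit pass John describes (slice each field out by position, trim the padding, write a delimited line to a new file) can be sketched as below. This is a minimal illustration, not his VBA utility: the FIELD_SPEC layout, the file paths, and the pipe delimiter are all hypothetical stand-ins for the start/width spec he loads from the client's spreadsheet.]

```python
# Hypothetical field layout: (name, 0-based start position, width). In John's
# setup this would be loaded from the client-supplied spec table, not hardcoded.
FIELD_SPEC = [
    ("first_name", 0, 20),
    ("last_name", 20, 30),
    ("suffix", 50, 5),
]

def convert_line(line, spec, delim="|"):
    """Slice each fixed-width field, strip the space padding, join with a delimiter."""
    return delim.join(line[start:start + width].strip() for _, start, width in spec)

def convert_file(src_path, dst_path, spec, delim="|"):
    """Stream line by line so a multi-gigabyte file is never held in memory."""
    count = 0
    with open(src_path, "r", encoding="ascii", errors="replace") as src, \
         open(dst_path, "w", encoding="ascii") as dst:
        for line in src:
            dst.write(convert_line(line.rstrip("\r\n"), spec, delim) + "\n")
            count += 1
    return count  # row count, for the progress/statistics log table
```

[The delimited output file can then be fed to BULK INSERT with a matching format file; a pipe delimiter sidesteps the quoted-comma problem John hit with CSV.]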
That new file imports cleanly into SQL Server using a BULK INSERT sql statement embedded in a sproc, which I am confident I will get working from my same external application to allow me to have that one app strip and import into SQL Server, logging all results. SQL Server may very well be able to do all this stuff in one fell swoop, who knows? Certainly not me.

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Billy Pang Sent: Tuesday, May 08, 2007 1:11 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert

_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com

From ebarro at verizon.net Tue May 8 08:33:15 2007 From: ebarro at verizon.net (Eric Barro) Date: Tue, 08 May 2007 06:33:15 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <00a801c79174$4858ee10$657aa8c0@m6805> Message-ID: <0JHQ00JYQ5NLGGW5@vms040.mailsrvcs.net>

John,

Why not try the method I proposed with the bulk insert process that involved importing the data (with spaces) into a temporary file and then inserting those records into the actual table? That would be the "one fell swoop" method you're looking for that SQL server handles really well. This is the same method I used to grab data from Peoplesoft to import into an Employees table in SQL server.
Eric

-----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Tuesday, May 08, 2007 6:25 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert

_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com

No virus found in this incoming message. Checked by AVG Free Edition. Version: 7.5.467 / Virus Database: 269.6.5/793 - Release Date: 5/7/2007 2:55 PM

From jwcolby at colbyconsulting.com Tue May 8 09:09:34 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 8 May 2007 10:09:34 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <0JHQ00JYQ5NLGGW5@vms040.mailsrvcs.net> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ00JYQ5NLGGW5@vms040.mailsrvcs.net> Message-ID: <00aa01c7917a$81f149a0$657aa8c0@m6805>

Eric,

Because... There are FIFTY SIX FILES, ONE HUNDRED AND FIFTY fields in this specific data set. There were SEVEN HUNDRED fields in the last data set. It sounds easy to "just insert into a temporary table and then move to the real table" but part of that process involves installing a strip() around each field.
I am not a SQL kind of guy, and no one here is offering to provide me with code (inside of SQL Server) that GENERICALLY goes out and gets the list of fields, builds up dynamically "in code" a sql statement that inserts that "strip()" around each field (appending to the destination table of course), and then executes that.

I am in the business of taking files and converting them. I am given the data files, but I am also given a spreadsheet (or occasionally just a printout) containing the field names, positions in the fixed width string and the width of each field. I have to get that data into a table, then use THAT data to parse the fixed width string. This is NOT a once-off thing. If I am going to do this, it has to work tomorrow, with a completely different data set, a different directory on the disk where the data is stored, a different table name, different field names, a different number of fields. I can do that in VBA; someday soon I will be able to do that in VB.Net. I cannot do this inside of SQL Server. Go listen to the "Agile programming" discussion on the AccessD list and see my position.

I am a little bit frustrated because I laid out the entire problem back in the beginning. This has to be a SYSTEM that processes, from start to finish, a SET of HUGE fixed width data files. "A bulk insert process that involved importing the data (with spaces) into a temporary file and then inserting those records into the actual table" is not a SYSTEM, it is one small part of the whole problem. That method would work just fine ONCE, on ONE FILE. It quickly turns into hundreds of manual edits to create the original table, create the temp table, create the SQL statement to move from the temp to the real table, to insert file names into the query that pulls the data from the text files into the temp table etc. AND THEN, DO IT ALL OVER AGAIN for the next set of files. I CANNOT DO THAT. Remember, I am OLD. I will die of old age before I can finish my job.
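[The generic "strip() around each field" statement John says nobody has shown him can at least be generated mechanically. The sketch below builds the staging-to-destination INSERT that Eric's temp-table method needs from nothing but a field list; the table and column names are hypothetical, and in practice the lists would come from John's spec table or from INFORMATION_SCHEMA.COLUMNS.]

```python
def build_strip_insert(dest_table, temp_table, char_fields, other_fields=()):
    """Build an INSERT ... SELECT that wraps RTRIM() around every character
    field, so trailing-space padding is dropped as rows move from the staging
    table into the destination table."""
    # RTRIM only the character fields; pass the rest through untouched.
    select_parts = ["RTRIM([%s])" % f for f in char_fields]
    select_parts += ["[%s]" % f for f in other_fields]
    columns = ", ".join("[%s]" % f for f in list(char_fields) + list(other_fields))
    return ("INSERT INTO [%s] (%s) SELECT %s FROM [%s]"
            % (dest_table, columns, ", ".join(select_parts), temp_table))
```

[With hypothetical names, `build_strip_insert("tblImport", "#tmpImport", ["first_name", "last_name"], ["PKID"])` yields an INSERT that RTRIMs the two char columns and copies PKID as-is; the string would then be handed to EXEC or executed from the external app, once per file.]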
Not to mention I have other clients as well expecting some small part of my time. AND I have kids who also want to see me.

It took me a few hours to write a set of classes to do the data conversion to strip the files. I have a table that holds the location of the source and converted data files as well as statistics about how long etc. The client gives me a "spec" which I get into a table to tell me where each field starts and its length and its name. Code picks it up from there. Open a file, strip the spaces, line by line, write to a different file in a different directory, log the progress. ATM it does NOT do so, but soon the code will then pick up the converted file and execute a Sproc that bulk inserts that converted file into the destination table. Once that is done I will have a SYSTEM where I build a pair of tables, from / to dir and the field defs for an import. Click a button and come back in a few hours to have 100 million records from 56 files stripped and inserted into a table in SQL Server, progress logged into a table: file name processed, start time to strip, finish time to strip, number of records in file, start time to bulk insert, end time to bulk insert (maybe), number of records bulk inserted. The next set of files... I place in a new directory, build records in the two tables, push a button and come back in a few hours to 150 million records stripped and inserted.

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Tuesday, May 08, 2007 9:33 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert

John,

Why not try the method I proposed with the bulk insert process that involved importing the data (with spaces) into a temporary file and then inserting those records into the actual table?
That would be the "one fell swoop" method you're looking for that SQL server handles really well. This is the same method I used to grab data from Peoplesoft to import into an Employees table in SQL server. Eric -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Tuesday, May 08, 2007 6:25 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Billy, The issue is not "handling fixed width" but in stripping off the trailing spaces in the process. The environment that I work in isn't "import a file and use it", but rather "import 56 files containing a HUNDRED MILLION records of 150 (or SEVEN HUNDRED) fields into a single table and use it". The SPACES in this last file were more than 40% of the total volume of the file. I CANNOT import all the spaces and then go back and strip them off in situ. The extra space in the database and the extra time to do the strip makes that a non starter. And this is where (so far) the bulk insert method has failed me. Nobody has shown me how to strip the spaces on the way in, and still keep the time up. I have written an external application, in Access / VBA but headed towards VB.NET which opens these HUGE (often four GIGABYTE) files (remember there were 56 of these files in the last batch), and then reads each line, pulls each field out, trims off the spaces, assembles it into a string with a field delimiter, and writes each line back to a new file. That new file imports cleanly into SQL Server using a BULK INSERT sql statement embedded in a sproc, which I am confident I will get working from my same external application to allow me to have that one app strip and import into SQL Server, logging all results. SQL Server may very well be able to do all this stuff in one fell swoop, who knows? Certainly not me. John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Billy Pang Sent: Tuesday, May 08, 2007 1:11 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert John: This is a bit late so not sure if you resolved your issues with BCP yet. However, I would like to point out that bcp does indeed handle fixed width datatypes. See following illustration: 1) create a database called test 2) run the following script to create a table called table1 CREATE TABLE [Table1] ( [asdf] [char] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL , [asdf2] [int] NULL , [asdf3] [char] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL , [asdf4] [int] NULL , [asf] [int] IDENTITY (1, 1) NOT NULL , CONSTRAINT [PK_Table1] PRIMARY KEY CLUSTERED ( [asf] ) ON [PRIMARY] ) ON [PRIMARY] 3) populate the table with 2 records insert into table1 values('1',2,'3',4) insert into table1 values('a',5,'b',6) 4) ok.. now use bcp to export the data using trusted connection bcp "test..table1" out "c:\yo_exported_data.txt" -T -c there are two records in the data file. note: fixed width preserved! 5) use bcp to create a format file bcp "test..table1" format -T -c -f "c:\yo_format_file.txt" 6) now use bcp to import the records you just exported using the format file you just created bcp "test..table1" in "c:\yo_exported_data.txt" -T -c -f "c:\yo_format_file.txt" 7) count the records in the table select count(*) from table1 there are four records 8) view the records in the tables. select * from table1 sometimes bcp is a bit of pita but it works like a dream once you get it working. hardest part is creating the format file but once you get bcp to do that for you, the rest pretty much writes itself. HTH Billy On 5/1/07, David Lewis wrote: > > > > John: > > There are a few websites concerned with sql server, with active forums. 
> I recommend you check them out. One is sqlcentral.com, the other is > sswug. > > You are right, access is a 'toy' when compared to sql server. One > problem you are having is that you don't have the luxury of a learning > curve -- but that is not the fault of the tool. Hang in there. D > > > And that is exactly what I am doing and it is soooooo KLUDGY!!! > > I am running a program I wrote to read the data out of the original > fixed width file because SQL Server won't handle fixed width files and > strip off the spaces. How incredibly stupid is THAT? Is there ANYONE > out there who WANTS those spaces? So I am already using KLUDGE to get > data into SQL Server. Now I export it out to a perfectly valid CSV > file only to discover that SQL Server BCP and Bulk Insert don't even > look at (understand) quotes around comma delimited fields. > > But ACCESS does. But Access is a TOY remember? What exactly does > that make SQL Server that it needs a toy to feed it data? > > This has been an exercise in discovering just how brain dead the data > import processes are (or can be anyway) for SQL Server. This is NOT > rocket science. I am able to write a utility to open / import / > mangle / export it back out to another file in VBA. How tough can it > be to do this import inside of SQL Server natively? > > I have no idea how widespread this kind of file is but I can tell you > that that is all I see EVER in the industry I am dealing with. HUGE > files, fixed width, space padded right. And I can tell you they have > been a royal PITA to get into SQL Server. > > At least now I have my own utility that can get I these input files > into the format I need, even if it is in ACCESS/ VBA. My next step is > to port this to VB.Net so that I can do it a little more "natively". > Once I get a little more familiar with VB.Net I want to look at > storing the data right into a recordset in ADO and then write that > back to SQL Server. 
If that is too slow (I suspect that it will be) > then I can still do what I do now and import / mangle / write to file > and then run a stored procedure to do a Bulk Insert from the file I > create. > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- Billy Pang http://dbnotes.blogspot.com/ "Once the game is over, the King and the pawn go back in the same box." - Italian proverb _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com No virus found in this incoming message. Checked by AVG Free Edition. 
_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com

From jwcolby at colbyconsulting.com Tue May 8 10:20:54 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 8 May 2007 11:20:54 -0400 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans Message-ID: <00b601c79184$79118390$657aa8c0@m6805>

In Access you can do something like:

MyAlias: [SomeField] <=10

Which translates into SQL that looks like:

[SomeField]<=10 as MyAlias

When I try to move that (cut and paste) from Access into SQL Server I get the infamous "syntax error somewhere near <".

Question 1: What is wrong with that in SQL Server?
Question 2: How do I accomplish this directly in the query builder in SQL Server?

John W. Colby
Colby Consulting
www.ColbyConsulting.com

From James at fcidms.com Tue May 8 11:27:24 2007 From: James at fcidms.com (James Barash) Date: Tue, 8 May 2007 12:27:24 -0400 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: <00b601c79184$79118390$657aa8c0@m6805> Message-ID: <008601c7918d$c2e5ba00$800101df@fci.local>

John:

If you are trying to return a boolean value you need something like:

(case when [SomeField]<=10 then 1 else 0 end) as MyAlias

James Barash

-----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Tuesday, May 08, 2007 11:21 AM To: 'Access Developers discussion and problem solving'; dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans

_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com

From jwcolby at colbyconsulting.com Tue May 8 12:04:36 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 8 May 2007 13:04:36 -0400 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: <008601c7918d$c2e5ba00$800101df@fci.local> References: <00b601c79184$79118390$657aa8c0@m6805> <008601c7918d$c2e5ba00$800101df@fci.local> Message-ID: <00ce01c79192$f57e0d00$657aa8c0@m6805>

Well James, you da man today! Right answer.

Thanks,

John W. Colby
Colby Consulting
www.ColbyConsulting.com

-----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of James Barash Sent: Tuesday, May 08, 2007 12:27 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into Booleans

_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com

From jlawrenc1 at shaw.ca Tue May 8 12:23:52 2007 From: jlawrenc1 at shaw.ca (Jim Lawrence) Date: Tue, 08 May 2007 10:23:52 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <00a801c79174$4858ee10$657aa8c0@m6805> Message-ID: <0JHQ005MPG4J9VP0@l-daemon>

Hi John:

As to your query: The issue is not "handling fixed width" but in stripping off the trailing spaces in the process. The environment that I work in isn't "import a file and use it", but rather "import 56 files containing a HUNDRED MILLION records of 150 (or SEVEN HUNDRED) fields into a single table and use it". The SPACES in this last file were more than 40% of the total volume of the file.

The only way to pre-handle the data is to build a DTS/SSIS application. I wrote a reply last week suggesting this pre-processing method. It will definitely work, as the data is being inserted. The speed is slower than bulk-insert. I am only really familiar with the old DTS but the new SQL 2005 has a legacy section that supports this functionality. Can send some sample code if you are interested... It's in basic VB 6 format so any amount of VBA logic can be added and then it can be compiled for extra performance.
Jim From fuller.artful at gmail.com Tue May 8 12:34:45 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Tue, 8 May 2007 13:34:45 -0400 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: <00b601c79184$79118390$657aa8c0@m6805> References: <00b601c79184$79118390$657aa8c0@m6805> Message-ID: <29f585dd0705081034q675ece25m7c31c0428a0d32bd@mail.gmail.com> Let's pick it apart tad by tad, beginning with your denigrating use of "infamous". This is a "famous" error, not an infamous error. For references to infamous errors, their numbers are 533 and 601. Now. Let's go step by step. AFAIK S2k5 has no issues with square brackets, in fact I use them frequently, but begin by removing them. Step 2: Why is there no space between the value and the operator? I shall assume that it's the fault of the translator. Step 3: lose the "AS" part and run the query and see what happens. You probably won't get this far, since Steps 1 and 2 ought to fix the problem, but JIC (just in case). Step 4: when none of the above works, re-do the query in Management Studio. Then compare the syntax. A, On 5/8/07, JWColby wrote: > > In Access you can do something like: > > MyAlias: [SomeField] <=10 > > Which translates into SQL that looks like: > > [SomeField]<=10 as MyAlias > > When I try move that (cut and paste) from Access into SQL Server I get a > the > infamous "syntax error somewhere near <" > > Question 1: What is wrong with that in SQL Server > Question 2: How do I accomplish this directly in the query builder in SQL > Server? > > John W.
Colby > Colby Consulting > www.ColbyConsulting.com > From jwcolby at colbyconsulting.com Tue May 8 12:40:59 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 8 May 2007 13:40:59 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <0JHQ005MPG4J9VP0@l-daemon> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon> Message-ID: <00d901c79198$0a703210$657aa8c0@m6805> Jim, At this point it is not necessary. I built a preprocessor in a few hours using my toy (Access). My toy application handles everything exactly as described. Someday (soon I hope) I will port that to VB.Net which I hope will be much quicker in the preprocessing department. Then I will be considered by some as being a real man, playing with real tools. ;-) Others will still consider me a child, playing with toys because I didn't take it straight to C#. SOMEDAY (far in the future) perhaps I will embed those pieces directly in CLR programming inside of SQL Server 2005. Then I will be able to look down my nose at those children still playing with toys. For now, it works and with the addition of driving the Sproc from the vba will be an integrated application like what I described. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Tuesday, May 08, 2007 1:24 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Hi John: As to your query: The issue is not "handling fixed width" but in stripping off the trailing spaces in the process. The environment that I work in isn't "import a file and use it", but rather "import 56 files containing a HUNDRED MILLION records of 150 (or SEVEN HUNDRED) fields into a single table and use it". The SPACES in this last file were more than 40% of the total volume of the file. 
The only way to pre-handle the data is to build a DTS/SSIS application. I wrote a reply last week suggesting this pre-processing method. It will definitely work, as the data is being inserted. The speed is slower that bulk-insert. I am only really familiar with the old DTS but the new SQL 2005 has a legacy section that supports this functionality. Can send some sample code if you are interested.... It's in basic VB 6 format so any amount of VBA logic can be added and then it can be compiled for extra performance. Jim _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Tue May 8 13:04:50 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 8 May 2007 14:04:50 -0400 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: <29f585dd0705081034q675ece25m7c31c0428a0d32bd@mail.gmail.com> References: <00b601c79184$79118390$657aa8c0@m6805> <29f585dd0705081034q675ece25m7c31c0428a0d32bd@mail.gmail.com> Message-ID: <00e001c7919b$5fa696e0$657aa8c0@m6805> Arthur, >Let's pick it apart tad by tad, beginning with your denigrating use of "infamous". This is a "famous" error, not an infamous error. For references to infamous errors, their numbers are 533 and 601. I have no way of knowing whether this is a famous error or not. However... Infamous - ill-famed: having an exceedingly bad reputation; "a notorious gangster"; "the tenderloin district was notorious for vice" How about "fuc**ng useless!!! Will that do? "ERROR SOMEWHERE in the vicinity of...." Oh yea, that fits the word infamous in my book. It also fits FUC**ING useless as far as I am concerned! EBKAC is equally helpful, and only a tad more insulting. >Step 2: Why is there no space between the value and the operator? I shall assume that it's the fault of the translator. 
Perhaps it is because that is the 47th attempt at putting things in and taking things out, NONE of which gave me any results other than the INFAMOUS "Error somewhere in the vicinity of Hudson NC". >Step 3: lose the "AS" part and run the query and see what happens. You probably won't get this far, since Steps 1 and 2 ought to fix the problem, but JIC (just in case). AS is the clause that defines the alias. You can't lose that. Even I know that. >Step 4: when none of the above works, re-do the query in Management Studio. Then compare the syntax. And this is my problem with you Arthur. If I told you "if that fails, just rebuild the space shuttle" what would you tell me? If I told you that in EVERY EMAIL what would you tell me? Such helpful suggestions are so useless that I would expect you to someday cease and desist from issuing them. Alas.... I tried using Management Studio. It is vastly different from Access's QBE window and requires more than a passing knowledge of the intricacies of SQL Server's brand of SQL. If you had been paying attention for the last few months you would understand that to be the root of my problem. I also tried poking and prodding, and Googling and looking up various phrases in my books, trying to solve the problem without assistance. Not to worry, James Barash actually solved my problem by providing the (or a) syntax needed to do comparisons in SQL Server. I have to guess that it took him all of three minutes to type it into an email, and it took me all of three minutes to type it in and verify that it works. Thanks James! John W.
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Tuesday, May 08, 2007 1:35 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into Booleans Let's pick it apart tad by tad, beginning with your denigrating use of "infamous". This is a "famous" error, not an infamous error. For references to infamous errors, their numbers are 533 and 601. Now. Let's go step by step.AFAIK S2k5 has no issues with square brackets, in fact I use them frequently, but begin by removing them. Step 2: Why is there no space between the value and the operator? I shall assume that it's the fault of the translator. Step 3: lose the "AS" part and run the query and see what happens. You probably won't get this far, since Steps 1 and 2 ought to fix the problem, but JIC (just in case). Step 4: when none of the above works, re-do the query in Management Studio. Then compare the syntax. A, From James at fcidms.com Tue May 8 13:57:27 2007 From: James at fcidms.com (James Barash) Date: Tue, 8 May 2007 14:57:27 -0400 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: <00ce01c79192$f57e0d00$657aa8c0@m6805> Message-ID: <00ce01c791a2$b94932a0$800101df@fci.local> de nada -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Tuesday, May 08, 2007 1:05 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into Booleans Well James, you da man today! Right answer. Thanks, John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of James Barash Sent: Tuesday, May 08, 2007 12:27 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into Booleans John: If you are trying to return a boolean value you need something like: (case when [SomeField]<=10 then 1 else 0 end) as MyAlias James Barash -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Tuesday, May 08, 2007 11:21 AM To: 'Access Developers discussion and problem solving'; dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In Access you can do something like: MyAlias: [SomeField] <=10 Which translates into SQL that looks like: [SomeField]<=10 as MyAlias When I try move that (cut and paste) from Access into SQL Server I get a the infamous "syntax error somewhere near <" Question 1: What is wrong with that in SQL Server Question 2: How do I accomplish this directly in the query builder in SQL Server? John W. 
Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Tue May 8 15:48:11 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 8 May 2007 16:48:11 -0400 Subject: [dba-SQLServer] Redeem your voucher Message-ID: <011e01c791b2$32acf370$657aa8c0@m6805> Well, I watched the two videos, filled out the required feedback report and now I have received my highly coveted voucher for my free* copy of Visual Studio 2005 Standard Edition. *Other than having to pay shipping and handling (which is fine) I also have to get the web page to ACCEPT my voucher (which ain't happening). The old endless loop comes right back to the same page and asks for the voucher again. I even tried using MS POS Explorer to no avail. I wonder if anyone at MS has noticed that no one is redeeming their vouchers. I wonder if anyone at MS is ASSIGNED to notice such things. Sigh. John W. Colby Colby Consulting www.ColbyConsulting.com From ebarro at verizon.net Tue May 8 16:17:05 2007 From: ebarro at verizon.net (Eric Barro) Date: Tue, 08 May 2007 14:17:05 -0700 Subject: [dba-SQLServer] [AccessD] Redeem your voucher In-Reply-To: <011e01c791b2$32acf370$657aa8c0@m6805> Message-ID: <0JHQ0068CR4GUVHA@vms046.mailsrvcs.net> Well John, maybe the M$ gods have heard your rants and decided to block your IP from their servers.
:) FYI...my officemate just got through their site and it accepted *his* voucher! LOL... -----Original Message----- From: accessd-bounces at databaseadvisors.com [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Tuesday, May 08, 2007 1:48 PM To: 'Access Developers discussion and problem solving'; dba-sqlserver at databaseadvisors.com; dba-vb at databaseadvisors.com Subject: [AccessD] Redeem your voucher Well, I watched the two videos, filled out the required feedback report and now I have received my highly coveted voucher for my free* copy of Visual Studio 2005 Standard Edition. *Other than having to pay shipping and handling (which is fine) I also have to get the web page to ACCEPT my voucher (which ain't happening). The old endless loop, comes right back to the same page and asks for the voucher again. I even tried using MS POS Explorer to no avail. I wonder if anyone at MS has noticed that no one is redeeming their vouchers. I wonder if anyone at MS is ASSIGNED to notice such things. Sigh. John W. Colby Colby Consulting www.ColbyConsulting.com -- AccessD mailing list AccessD at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/accessd Website: http://www.databaseadvisors.com From martyconnelly at shaw.ca Tue May 8 21:26:36 2007 From: martyconnelly at shaw.ca (MartyConnelly) Date: Tue, 08 May 2007 19:26:36 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <00d901c79198$0a703210$657aa8c0@m6805> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon> <00d901c79198$0a703210$657aa8c0@m6805> Message-ID: <4641315C.4040303@shaw.ca> Uhh, there is a one line fix to remove trailing blanks in SQL, different defaults for SQL Server versions and NChar and VChar. SET ANSI_PADDING OFF When a table is created with the setting turned on (the default), spaces are not trimmed when data is inserted into that table. When ANSI_PADDING is off, the spaces are trimmed. 
So if you SET ANSI_PADDING OFF, create your table, then set it back on again, when you bcp the data into the table, the excess trailing spaces will be eliminated. The only caveat here is if you have empty fields in your file, a single space is inserted instead of a null. If this is the case with your data file, you will need to do an update to set columns to null when len(yourcolumn) = 0. See BOL http://msdn2.microsoft.com/en-us/library/ms188340.aspx http://msdn2.microsoft.com/en-us/library/ms187403.aspx JWColby wrote: >Jim, > >At this point it is not necessary. I built a preprocessor in a few hours >using my toy (Access). My toy application handles everything exactly as >described. Someday (soon I hope) I will port that to VB.Net which I hope >will be much quicker in the preprocessing department. Then I will be >considered by some as being a real man, playing with real tools. ;-) >Others will still consider me a child, playing with toys because I didn't >take it straight to C#. SOMEDAY (far in the future) perhaps I will embed >those pieces directly in CLR programming inside of SQL Server 2005. Then I >will be able to look down my nose at those children still playing with toys. > >For now, it works and with the addition of driving the Sproc from the vba >will be an integrated application like what I described. > >John W. Colby >Colby Consulting >www.ColbyConsulting.com > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim >Lawrence >Sent: Tuesday, May 08, 2007 1:24 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Bulk insert > >Hi John: > >As to your query: > > >The issue is not "handling fixed width" but in stripping off the trailing >spaces in the process. 
The environment that I work in isn't "import a file >and use it", but rather "import 56 files containing a HUNDRED MILLION >records of 150 (or SEVEN HUNDRED) fields into a single table and use it". >The SPACES in this last file were more than 40% of the total volume of the >file. > > >The only way to pre-handle the data is to build a DTS/SSIS application. I >wrote a reply last week suggesting this pre-processing method. It will >definitely work, as the data is being inserted. The speed is slower that >bulk-insert. I am only really familiar with the old DTS but the new SQL 2005 >has a legacy section that supports this functionality. >Can send some sample code if you are interested.... It's in basic VB 6 >format so any amount of VBA logic can be added and then it can be compiled >for extra performance. > >Jim > >__ > -- Marty Connelly Victoria, B.C. Canada From jwcolby at colbyconsulting.com Tue May 8 21:53:18 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Tue, 8 May 2007 22:53:18 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <4641315C.4040303@shaw.ca> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon><00d901c79198$0a703210$657aa8c0@m6805> <4641315C.4040303@shaw.ca> Message-ID: <000401c791e5$33d51790$657aa8c0@m6805> >The only caveat here is if you have empty fields in your file, a single space is inserted instead of a null. What is it with all "the only caveat here" stuff? I am sure that there is a darned good reason. In the end it is just easier to roll your own rather than work around the issues that the built in stuff seems to have. I have 150 fields (in this data set). Somehow I have to do an update on all 150 fields. I suppose I could have my converter run 150 update queries to do each column. Or 700 update queries to do the next data set. Or just do the stripping of the spaces external to SQL Server and be done with it. Either way I still have to use my toy. 
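[Editor's note: the per-column cleanup discussed here — one UPDATE for each of the 150 (or 700) imported columns, to turn the single-space artifact back into NULL — does not have to be written by hand. A hypothetical Python sketch follows; the table and column names are invented for illustration, and the emitted strings are ordinary T-SQL that would still have to be executed against the server.]

```python
# Hypothetical sketch: generate the per-column cleanup UPDATEs from a
# column list instead of hand-writing 150 (or 700) of them.
# Table and column names below are invented for illustration.

def build_null_fix_statements(table, columns):
    """One UPDATE per column, turning the single-space artifact left by
    an ANSI_PADDING OFF import back into NULL. SQL Server's LEN()
    ignores trailing spaces, so LEN of a lone space is 0."""
    return [
        "UPDATE {t} SET [{c}] = NULL WHERE LEN([{c}]) = 0;".format(t=table, c=col)
        for col in columns
    ]

stmts = build_null_fix_statements("tblInfutor", ["suffix", "last_name", "first_name"])
for stmt in stmts:
    print(stmt)
```

The generated batch could then be run once per imported data set, with the column list read from the same spec table that drives the import.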
Once I move up to VB.Net I will be able to use threads to do the stripping and the BULK INSERT Sproc in parallel. BTW, I have to do something very similar all over again once I get the data in. I will need to export the entire table back out, 2 million record sets of data to delimited files for CASS / NCOA processing, dumping 100 million records out into ~50 files (just the address data this time). The CASS / NCOA process theoretically will process all files placed into an input directory (input to that program), dumping the processed files into an output directory (output from that program). At which point I have to pull all of the CASS / NCOAd files BACK out of that output directory into yet another table. And that is just the "pre-processing". You might be getting a clue by now why I do not want to be manually doing all the crapola involved with the solutions that do not involve an external control process. Someday fairly soon I will have a completely automated system for doing all this. I will be back to blowing bubbles and poking at Charlotte with a big stick. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of MartyConnelly Sent: Tuesday, May 08, 2007 10:27 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Uhh, there is a one line fix to remove trailing blanks in SQL, different defaults for SQL Server versions and NChar and VChar. SET ANSI_PADDING OFF When a table is created with the setting turned on (the default), spaces are not trimmed when data is inserted into that table. When ANSI_PADDING is off, the spaces are trimmed. So if you SET ANSI_PADDING OFF, create your table, then set it back on again, when you bcp the data into the table, the excess trailing spaces will be eliminated.
The only caveat here is if you have empty fields in your file, a single space is inserted instead of a null. If this is the case with your data file, you will need to do an update to set columns to null when len(yourcolumn) = 0. See BOL http://msdn2.microsoft.com/en-us/library/ms188340.aspx http://msdn2.microsoft.com/en-us/library/ms187403.aspx From tuxedoman888 at gmail.com Tue May 8 23:12:02 2007 From: tuxedoman888 at gmail.com (Billy Pang) Date: Tue, 8 May 2007 21:12:02 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <4641315C.4040303@shaw.ca> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon> <00d901c79198$0a703210$657aa8c0@m6805> <4641315C.4040303@shaw.ca> Message-ID: <7c8826480705082112m4ef0de47s749292bcfe954e1f@mail.gmail.com> thanks Marty for this information. Billy On 5/8/07, MartyConnelly wrote: > > Uhh, there is a one line fix to remove trailing blanks in SQL, different > defaults > for SQL Server versions and NChar and VChar. > > SET ANSI_PADDING OFF > > When a table is created with the setting turned on (the default), > spaces are > not trimmed when data is inserted into that table. When ANSI_PADDING > is off, the spaces are trimmed. > > So if you SET ANSI_PADDING OFF, create your table, then set it back on > again, when you bcp the data into the table, the excess trailing > spaces will be eliminated. The only caveat here is if you have empty > fields in your file, a single space is inserted instead of a null. If > this is the case with your data file, you will need to do an update to > set columns to null when len(yourcolumn) = 0. > > See BOL > http://msdn2.microsoft.com/en-us/library/ms188340.aspx > > http://msdn2.microsoft.com/en-us/library/ms187403.aspx > > > > -- > Billy Pang > http://dbnotes.blogspot.com/ > "Once the game is over, the King and the pawn go back in the same box." 
- > Italian proverb From martyconnelly at shaw.ca Wed May 9 01:22:41 2007 From: martyconnelly at shaw.ca (MartyConnelly) Date: Tue, 08 May 2007 23:22:41 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <000401c791e5$33d51790$657aa8c0@m6805> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon> <00d901c79198$0a703210$657aa8c0@m6805> <4641315C.4040303@shaw.ca> <000401c791e5$33d51790$657aa8c0@m6805> Message-ID: <464168B1.3060502@shaw.ca> Occam's Razor JWColby wrote: >>The only caveat here is if you have empty fields in your file, a single >> >> >space is inserted instead of a null. > >What is it with all "the only caveat here" stuff? I am sure that there is a >darned good reason. > >In the end it is just easier to roll your own rather than work around the >issues that the built in stuff seems to have. I have 150 fields (in this >data set). Somehow I have to do an update on all 150 fields. I suppose I >could have my converter run 150 update queries to do each column. Or 700 >update queries to do the next data set. Or just do the stripping of the >spaces external to SQL Server and be done with it. Either way I still have >to use my toy. > >Once I move up to VB.Net I will be able to use threads to do the stripping >and the BULK INSERT Sproc in parallel. > >BTW, I have to do something very similar all over again once I get the data >in. I will need to export the entire table back out, 2 million record sets >of data to delimited files for CAS / NCOA processing, dumping 100 million >records out into ~50 files (just the address data this time). The CAS / >NCOA process theoretically will process all files placed into an input >directory (input to that program), dumping the processed files into an >output directory (output from that program). At which point I have to pull >all of the CASS / NCOAd files BACK out of that output directory into to yet >another table. And that is just the "pre-processing". 
> >You might be getting a clue by now why I do not want to be manually doing >all the crapola involved with the solutions that do not involve an external >control process. Someday fairly soon I will have a completely automated >system for doing all this. I will be back to blowing bubbles and poking at >Charlotte with a big stick. > >John W. Colby >Colby Consulting >www.ColbyConsulting.com > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of >MartyConnelly >Sent: Tuesday, May 08, 2007 10:27 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Bulk insert > >Uhh, there is a one line fix to remove trailing blanks in SQL, different >defaults for SQL Server versions and NChar and VChar. > >SET ANSI_PADDING OFF > > When a table is created with the setting turned on (the default), spaces >are not trimmed when data is inserted into that table. When ANSI_PADDING is >off, the spaces are trimmed. > >So if you SET ANSI_PADDING OFF, create your table, then set it back on >again, when you bcp the data into the table, the excess trailing spaces will >be eliminated. The only caveat here is if you have empty fields in your >file, a single space is inserted instead of a null. If this is the case with >your data file, you will need to do an update to set columns to null when >len(yourcolumn) = 0. > >See BOL >http://msdn2.microsoft.com/en-us/library/ms188340.aspx > >http://msdn2.microsoft.com/en-us/library/ms187403.aspx > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > > > > > -- Marty Connelly Victoria, B.C. 
Canada From jwcolby at colbyconsulting.com Wed May 9 06:42:09 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Wed, 9 May 2007 07:42:09 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <464168B1.3060502@shaw.ca> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon><00d901c79198$0a703210$657aa8c0@m6805> <4641315C.4040303@shaw.ca><000401c791e5$33d51790$657aa8c0@m6805> <464168B1.3060502@shaw.ca> Message-ID: <000901c7922f$18772c50$657aa8c0@m6805> >Occam's Razor ROTFLMAO. Occam's razor FOR THEM!!! Pretty much useless to me. Of course not being an "SQL guru" I am going to be told that this is really "the best way", and "it's not the fault of the tool" and a bunch of other platitudes and excuses for why I would be given tools that I cannot use, and of course, not being an SQL Guru I will be in no position to protest. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of MartyConnelly Sent: Wednesday, May 09, 2007 2:23 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Occam's Razor JWColby wrote: >>The only caveat here is if you have empty fields in your file, a >>single >> >> >space is inserted instead of a null. > >What is it with all "the only caveat here" stuff? I am sure that there >is a darned good reason. > >In the end it is just easier to roll your own rather than work around >the issues that the built in stuff seems to have. I have 150 fields >(in this data set). Somehow I have to do an update on all 150 fields. >I suppose I could have my converter run 150 update queries to do each >column. Or 700 update queries to do the next data set. Or just do the >stripping of the spaces external to SQL Server and be done with it. >Either way I still have to use my toy. 
> >Once I move up to VB.Net I will be able to use threads to do the >stripping and the BULK INSERT Sproc in parallel. > >BTW, I have to do something very similar all over again once I get the >data in. I will need to export the entire table back out, 2 million >record sets of data to delimited files for CAS / NCOA processing, >dumping 100 million records out into ~50 files (just the address data >this time). The CAS / NCOA process theoretically will process all >files placed into an input directory (input to that program), dumping >the processed files into an output directory (output from that >program). At which point I have to pull all of the CASS / NCOAd files >BACK out of that output directory into to yet another table. And that is just the "pre-processing". > >You might be getting a clue by now why I do not want to be manually >doing all the crapola involved with the solutions that do not involve >an external control process. Someday fairly soon I will have a >completely automated system for doing all this. I will be back to >blowing bubbles and poking at Charlotte with a big stick. > >John W. Colby >Colby Consulting >www.ColbyConsulting.com > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of >MartyConnelly >Sent: Tuesday, May 08, 2007 10:27 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Bulk insert > >Uhh, there is a one line fix to remove trailing blanks in SQL, >different defaults for SQL Server versions and NChar and VChar. > >SET ANSI_PADDING OFF > > When a table is created with the setting turned on (the default), >spaces are not trimmed when data is inserted into that table. When >ANSI_PADDING is off, the spaces are trimmed. > >So if you SET ANSI_PADDING OFF, create your table, then set it back on >again, when you bcp the data into the table, the excess trailing spaces >will be eliminated. 
The only caveat here is if you have empty fields in >your file, a single space is inserted instead of a null. If this is the >case with your data file, you will need to do an update to set columns >to null when >len(yourcolumn) = 0. > >See BOL >http://msdn2.microsoft.com/en-us/library/ms188340.aspx > >http://msdn2.microsoft.com/en-us/library/ms187403.aspx > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > > > > > -- Marty Connelly Victoria, B.C. Canada _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From rl_stewart at highstream.net Wed May 9 08:31:00 2007 From: rl_stewart at highstream.net (Robert L. Stewart) Date: Wed, 09 May 2007 08:31:00 -0500 Subject: [dba-SQLServer] Bulk insert In-Reply-To: References: Message-ID: <200705091332.l49DW5Td023462@databaseadvisors.com> John, I am working on an example of doing this with SQL Server for you. But, since I have the same amount of free time as you do, it is going to take a week or so to complete. I am using the same concept as you did with a couple of table to hold the Import spec and the column definitions for the spec. I am going to only build it for one table to show how it can be done. The rest will be up to you if you want to expand it. Robert At 09:27 PM 5/8/2007, you wrote: >Date: Tue, 8 May 2007 13:40:59 -0400 >From: "JWColby" >Subject: Re: [dba-SQLServer] Bulk insert >To: >Message-ID: <00d901c79198$0a703210$657aa8c0 at m6805> >Content-Type: text/plain; charset="us-ascii" > >Jim, > >At this point it is not necessary. I built a preprocessor in a few hours >using my toy (Access). My toy application handles everything exactly as >described. 
Someday (soon I hope) I will port that to VB.Net which I hope >will be much quicker in the preprocessing department. Then I will be >considered by some as being a real man, playing with real tools. ;-) >Others will still consider me a child, playing with toys because I didn't >take it straight to C#. SOMEDAY (far in the future) perhaps I will embed >those pieces directly in CLR programming inside of SQL Server 2005. Then I >will be able to look down my nose at those children still playing with toys. > >For now, it works and with the addition of driving the Sproc from the vba >will be an integrated application like what I described. > >John W. Colby From ssharkins at setel.com Wed May 9 09:06:31 2007 From: ssharkins at setel.com (Susan Harkins) Date: Wed, 9 May 2007 10:06:31 -0400 Subject: [dba-SQLServer] Links to backup articles Message-ID: <000401c79243$3f6b4fd0$1ebc2ad1@SusanOne> http://searchsqlserver.techtarget.com/generic/0,295582,sid87_gci1244265,00.html?track=sy41 Thanks to Mike G.'s larkware.com blog Susan H. From jwcolby at colbyconsulting.com Wed May 9 10:01:56 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Wed, 9 May 2007 11:01:56 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <200705091332.l49DW5Td023462@databaseadvisors.com> References: <200705091332.l49DW5Td023462@databaseadvisors.com> Message-ID: <002501c7924b$01f69c10$657aa8c0@m6805> Robert, That is much appreciated. Why don't you work with me instead? I am all for doing it "in SQL Server" (I own and use SQL Server 2005) but it has to be something that I understand and can modify or I will not be able to use it in the end, which would make your efforts wasted. My high level spec: 1) Use a "from directory" which is monitored to pick up files from. 1a) The from directory changes for each import data set. 2) Each file in the "from directory" will contain identical formatted data, but the number of records in the file may vary.
3) Each file will be defined by an import spec table, which contains Field Name, Field Width and data type. If no data type is specified, then VarChar() is used. The size of the VarChar() field is determined by the spec table. This allows me to only have to spec data types (in advance) that I know is not VarChar(), but allows me to spec as many as I need in advance of the import. 4) Once the table is built and populated, a new field called PKID needs to be built. PKID will be INT (32 bit) PRIMARY KEY, IDENTITY. Understand that at this time I actually have a functioning system. It takes a pair of tables which specify the file info (from / to directories, field delimiter etc) and the field info (field name / start position in the fixed width data / field length). This program (written in VBA) does a conversion from fixed width to a pipe delimited "CSV" file, reading a line, stripping off the spaces, and writing the stripped data lines back out to a CSV file in the "TO Directory", complete with the first line containing field names. At the moment I have a hand constructed table in a hand constructed database, which is created by an initial use of the wizard from inside of SQL Server, pulling in the first CSV files created by my program. Once that table is created, I use a hand created BULK INSERT Sproc to import the CSV files. Once the table is fully populated with the contents of all the files, I hand build an ALTER TABLE query to build a PKID INT PRIMARY KEY, IDENTITY. As you can see, anywhere you see "hand created", that is an area that needs to be automated. My thoughts are that creating the table initially will be relatively easy, and in fact I know how, building up a make table query with the field names and widths taken from the spec table. I just did not do that because I did not have the time. The next issue is dynamically creating the Sproc that does the Bulk Insert. 
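[Editorial sketch] The table-building side of the spec above (items 3 and 4: default every unspecified column to VarChar of the spec'd width, then add the PKID identity key after the load) can be sketched as follows. This is Python standing in for the VBA/T-SQL described in the thread, not John's actual code; the table and column names are purely illustrative:

```python
def build_create_table(table_name, spec_rows):
    """Build a CREATE TABLE statement from import-spec rows.

    spec_rows: (field_name, width, data_type) tuples; a data_type of None
    falls back to VARCHAR(width), per the spec-table convention above.
    """
    cols = ", ".join(
        "[%s] %s" % (name, dtype if dtype else "VARCHAR(%d)" % width)
        for name, width, dtype in spec_rows
    )
    return "CREATE TABLE [%s] (%s)" % (table_name, cols)


def build_add_pkid(table_name):
    # Step 4: once the table is populated, add the INT identity primary key.
    return "ALTER TABLE [%s] ADD PKID INT IDENTITY(1,1) NOT NULL PRIMARY KEY" % table_name
```

The generated strings would then be handed to dynamic execution on the server (e.g. EXEC sp_executesql), which is the approach Robert recommends later in the thread.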
Now, as to whether the process of importing the data (inside of SQL Server) strips off the spaces is really somewhat irrelevant at this point since I have working code to do this. It is not blazing fast at about 1000 lines / second (for 150 fields) but it is "fast enough". If I port that to VB.Net I hope / expect to get a speed increase. The BULK INSERT SProc that I hand build is currently running about 12K records / sec (for 150 fields) In the end, this really needs to be an external application driving SQL Server functionality. I need a place to go to fill in the import spec table, set the from / to directories, set up the name of the table etc. My heartache to this point has been the inability to get the SQL Server built-in import stuff to import the data without the external strip / rebuild step, or at least without an elaborate dance to get around any limitations of SQL Server to do that stuff for me. The very next thing I have to do is start exporting just the name / address (and PK) of this 100 million record table back out for CASS / NCOA processing. This requires exporting 2 million records at a time, to a destination directory, with a unique file name (Infutor01.csv, Infutor02.csv...). Once my CASS program finishes processing I will have a new set of files in yet another directory that I need to pull back in to SQL Server. Those files will not require the space stripping piece since they will not be fixed width. I do appreciate all of the advice from all the folks out there that have contributed. I am slowly but surely learning the pieces and parts that I need to do this part of my job. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Robert L. 
Stewart Sent: Wednesday, May 09, 2007 9:31 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert John, I am working on an example of doing this with SQL Server for you. But, since I have the same amount of free time as you do, it is going to take a week or so to complete. I am using the same concept as you did with a couple of table to hold the Import spec and the column definitions for the spec. I am going to only build it for one table to show how it can be done. The rest will be up to you if you want to expand it. Robert At 09:27 PM 5/8/2007, you wrote: >Date: Tue, 8 May 2007 13:40:59 -0400 >From: "JWColby" >Subject: Re: [dba-SQLServer] Bulk insert >To: >Message-ID: <00d901c79198$0a703210$657aa8c0 at m6805> >Content-Type: text/plain; charset="us-ascii" > >Jim, > >At this point it is not necessary. I built a preprocessor in a few >hours using my toy (Access). My toy application handles everything >exactly as described. Someday (soon I hope) I will port that to VB.Net >which I hope will be much quicker in the preprocessing department. >Then I will be considered by some as being a real man, playing with >real tools. ;-) Others will still consider me a child, playing with >toys because I didn't take it straight to C#. SOMEDAY (far in the >future) perhaps I will embed those pieces directly in CLR programming >inside of SQL Server 2005. Then I will be able to look down my nose at those children still playing with toys. > >For now, it works and with the addition of driving the Sproc from the >vba will be an integrated application like what I described. > >John W. Colby _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From rl_stewart at highstream.net Wed May 9 11:40:58 2007 From: rl_stewart at highstream.net (Robert L. 
Stewart) Date: Wed, 09 May 2007 11:40:58 -0500 Subject: [dba-SQLServer] Bulk insert In-Reply-To: References: Message-ID: <200705091642.l49Gg11u011637@databaseadvisors.com> See answers/comments below... At 10:02 AM 5/9/2007, you wrote: >Date: Wed, 9 May 2007 11:01:56 -0400 >From: "JWColby" >Subject: Re: [dba-SQLServer] Bulk insert >To: >Message-ID: <002501c7924b$01f69c10$657aa8c0 at m6805> >Content-Type: text/plain; charset="us-ascii" > >Robert, > >That is much appreciated. Why don't you work with me instead? We talked about working together on some things before, but nothing ever came of it. The idea of the 'sample' was to give you an idea of how to do it so you could proceed on your own. I have a full time job as a SQL DBA and Developer now. Plus I am also doing 6 new .Net web sites and 2 new installs of my Social Services management (SSMS) software this month. And I am rewriting the SSMS system into .Net. And, finally, to top all that off, my fiance and her daughter are having their interviews for their visas to come here on June 18. So I have to get everything ready for them also. So, a sample is the best I can do for you. > I am all for >doing it "in SQL Server" (I own and use SQL Server 2005) but it has to be >something that I understand and can modify or I will not be able to use it >in the end, which would make your efforts wasted. I don't consider doing samples a wasted effort, unless they are ignored. > > >My high level spec: > >1) Use a "from directory" which is monitored to pick up files from. I was not planning on using the SQL Agent and setting up a job to do it, but it could be done. A SQL Agent job would have to be created for each 'client/job'. > >1a) The from directory changes for each import data set. That should be part of the table definition for the 'client job' setup. >2) Each file in the "from directory" will contain identical formatted data, >but the number of records in the file may vary. Would not matter if the format is the same.
>3) Each file will be defined by an import spec table, which contains Field >Name, Field Width and data type. If no data type is specified, then >VarChar() is used. The size of the VarChar() field is determined by the >spec table. This allows me to only have to spec data types (in advance) >that I know is not VarChar(), but allows me to spec as many as I need in >advance of the import. This should be each file type, i.e. the format of the data. You would build a SQL statement in code and use EXEC sp_ExecuteSql to execute the statement to build the table from the information in the import spec column definition table. >4) Once the table is built and populated, a new field called PKID needs to >be built. PKID will be INT (32 bit) PRIMARY KEY, IDENTITY. this is a simple ALTER TABLE statement. But, it must also be built as a string and executed as above. >Understand that at this time I actually have a functioning system. It takes >a pair of tables which specify the file info (from / to directories, field >delimiter etc) and the field info (field name / start position in the fixed >width data / field length). This program (written in VBA) does a conversion >from fixed width to a pipe delimited "CSV" file, reading a line, stripping >off the spaces, and writing the stripped data lines back out to a CSV file >in the "TO Directory", complete with the first line containing field names. >At the moment I have a hand constructed table in a hand constructed >database, which is created by an initial use of the wizard from inside of >SQL Server, pulling in the first CSV files created by my program. Once that >table is created, I use a hand created BULK INSERT Sproc to import the CSV >files. Once the table is fully populated with the contents of all the >files, I hand build an ALTER TABLE query to build a PKID INT PRIMARY KEY, >IDENTITY. > >As you can see, anywhere you see "hand created", that is an area that needs >to be automated. 
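[Editorial sketch] Robert's point about building the SQL as a string and executing it applies equally to the BULK INSERT itself, which is the piece John currently hand-edits per file. A hedged illustration (again Python standing in for the server-side string building; the table name, path, and the choice of options are assumptions based on the thread: FIELDTERMINATOR matches the pipe-delimited output and FIRSTROW = 2 skips the header line of field names):

```python
def build_bulk_insert(table_name, csv_path):
    """Build the per-file BULK INSERT command as a string, so a stored
    procedure could run it via sp_executesql instead of being hand-edited
    for every file."""
    return (
        "BULK INSERT [%s] FROM '%s' "
        "WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\\n', FIRSTROW = 2)"
        % (table_name, csv_path)
    )
```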
My thoughts are that creating the table initially will be >relatively easy, and in fact I know how, building up a make table query with >the field names and widths taken from the spec table. I just did not do >that because I did not have the time. The next issue is dynamically >creating the Sproc that does the Bulk Insert. > >Now, as to whether the process of importing the data (inside of SQL Server) >strips off the spaces is really somewhat irrelevant at this point since I >have working code to do this. It is not blazing fast at about 1000 lines / >second (for 150 fields) but it is "fast enough". If I port that to VB.Net I >hope / expect to get a speed increase. The BULK INSERT SProc that I hand >build is currently running about 12K records / sec (for 150 fields) > >In the end, this really needs to be an external application driving SQL >Server functionality. I need a place to go to fill in the import spec >table, set the from / to directories, set up the name of the table etc. My >heartache to this point has been the inability to get the SQL Server >built-in import stuff to import the data without the external strip / >rebuild step, or at least without an elaborate dance to get around any >limitations of SQL Server to do that stuff for me. Correct, you would still need to have a front end to pass the parameters into the SP to actually do the work. >The very next thing I have to do is start exporting just the name / address >(and PK) of this 100 million record table back out for CASS / NCOA >processing. This requires exporting 2 million records at a time, to a >destination directory, with a unique file name (Infutor01.csv, >Infutor02.csv...). Once my CASS program finishes processing I will have a >new set of files in yet another directory that I need to pull back in to SQL >Server. Those files will not require the space stripping piece since they >will not be fixed width. Writing them out will be relatively easy. 
Probably a record count per file that would be a parameter you pass in to the SP that does it. >I do appreciate all of the advice from all the folks out there that have >contributed. I am slowly but surely learning the pieces and parts that I >need to do this part of my job. > >John W. Colby >Colby Consulting >www.ColbyConsulting.com From jwcolby at colbyconsulting.com Wed May 9 12:01:01 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Wed, 9 May 2007 13:01:01 -0400 Subject: [dba-SQLServer] Infutor Statistics - was RE: Bulk insert In-Reply-To: <002501c7924b$01f69c10$657aa8c0@m6805> References: <200705091332.l49DW5Td023462@databaseadvisors.com> <002501c7924b$01f69c10$657aa8c0@m6805> Message-ID: <003a01c7925b$a000ba70$657aa8c0@m6805> Just an FYI. The table that I have been building this whole time contains 97.5 million records, exactly 149 (imported) fields and requires 62.6 Gigabytes of data space inside of SQL Server. It took 2 hours and 28 minutes just to build the auto increment PK field after the table was finished importing records. The index space for the table (with just this single index) is 101 Megabytes. There were 56 raw data files which required 75 gigabytes of disk space to hold. There were 56 CSV files created after stripping out the spaces, which required 40.8 Gigabytes of disk space to hold. Thus by my calculations, 35 gigs of disk space was required to hold JUST THE SPACES in the original fixed width file, with the real data occupying 40.8 GB. It is interesting to note that the raw data in the CSV file was 41gb while the data space required in SQL Server is 62 gb. As the process was built over time, I do not have accurate specs for each and every file, but the process of stripping the spaces off of the fields happened at about 1K records / second. Given 97.5 million records, this equates to 97.5 thousand seconds to do the space stripping, which is about 27.1 hours. That of course is done in a VBA application.
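[Editorial check] Recomputing the throughput figures quoted in these statistics (pure arithmetic, no assumptions beyond the numbers given: 97,500 seconds works out to roughly 27.1 hours, and the recorded BULK INSERT runs average a little over 15K records per second):

```python
# Throughput figures quoted above, recomputed.
records = 97_500_000
strip_rate = 1_000                       # rows/second through the VBA space-stripper
strip_seconds = records / strip_rate     # 97,500 seconds
strip_hours = strip_seconds / 3600       # ~27.1 hours

bulk_records = 71_200_000
bulk_seconds = 4674
bulk_rate = bulk_records / bulk_seconds  # ~15,233 records/second

print(round(strip_hours, 1), round(bulk_rate))
```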
Again I don't have accurate specs for all of the bulk inserts, however those that I recorded the times for summed to 71.2 million records, which took 4674 seconds (1.3 hours) to import using a BULK INSERT statement, which equates to approximately 15K records / second. Remember that this BULK INSERT is importing precleaned data with pipe delimiters. Also remember that the BULK INSERT itself took 1.3 hours but due to the lack of automation in feeding the Sproc file names, I had to manually edit the SPROC each time I wanted to import a new file so the actual import took much longer, since I wasn't necessarily watching the computer as the last SPROC run finished. So there you go, that is what I have been trying to accomplish this last few weeks. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of JWColby Sent: Wednesday, May 09, 2007 11:02 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Robert, That is much appreciated. Why don't you work with me instead? I am all for doing it "in SQL Server" (I own and use SQL Server 2005) but it has to be something that I understand and can modify or I will not be able to use it in the end, which would make your efforts wasted. My high level spec: 1) Use a "from directory" which is monitored to pick up files from. 1a) The from directory changes for each import data set. 2) Each file in the "from directory" will contain identical formatted data, but the number of records in the file may vary. 3) Each file will be defined by an import spec table, which contains Field Name, Field Width and data type. If no data type is specified, then VarChar() is used. The size of the VarChar() field is determined by the spec table. 
This allows me to only have to spec data types (in advance) that I know is not VarChar(), but allows me to spec as many as I need in advance of the import. 4) Once the table is built and populated, a new field called PKID needs to be built. PKID will be INT (32 bit) PRIMARY KEY, IDENTITY. Understand that at this time I actually have a functioning system. It takes a pair of tables which specify the file info (from / to directories, field delimiter etc) and the field info (field name / start position in the fixed width data / field length). This program (written in VBA) does a conversion from fixed width to a pipe delimited "CSV" file, reading a line, stripping off the spaces, and writing the stripped data lines back out to a CSV file in the "TO Directory", complete with the first line containing field names. At the moment I have a hand constructed table in a hand constructed database, which is created by an initial use of the wizard from inside of SQL Server, pulling in the first CSV files created by my program. Once that table is created, I use a hand created BULK INSERT Sproc to import the CSV files. Once the table is fully populated with the contents of all the files, I hand build an ALTER TABLE query to build a PKID INT PRIMARY KEY, IDENTITY. As you can see, anywhere you see "hand created", that is an area that needs to be automated. My thoughts are that creating the table initially will be relatively easy, and in fact I know how, building up a make table query with the field names and widths taken from the spec table. I just did not do that because I did not have the time. The next issue is dynamically creating the Sproc that does the Bulk Insert. Now, as to whether the process of importing the data (inside of SQL Server) strips off the spaces is really somewhat irrelevant at this point since I have working code to do this. It is not blazing fast at about 1000 lines / second (for 150 fields) but it is "fast enough". 
If I port that to VB.Net I hope / expect to get a speed increase. The BULK INSERT SProc that I hand build is currently running about 12K records / sec (for 150 fields) In the end, this really needs to be an external application driving SQL Server functionality. I need a place to go to fill in the import spec table, set the from / to directories, set up the name of the table etc. My heartache to this point has been the inability to get the SQL Server built-in import stuff to import the data without the external strip / rebuild step, or at least without an elaborate dance to get around any limitations of SQL Server to do that stuff for me. The very next thing I have to do is start exporting just the name / address (and PK) of this 100 million record table back out for CASS / NCOA processing. This requires exporting 2 million records at a time, to a destination directory, with a unique file name (Infutor01.csv, Infutor02.csv...). Once my CASS program finishes processing I will have a new set of files in yet another directory that I need to pull back in to SQL Server. Those files will not require the space stripping piece since they will not be fixed width. I do appreciate all of the advice from all the folks out there that have contributed. I am slowly but surely learning the pieces and parts that I need to do this part of my job. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Robert L. Stewart Sent: Wednesday, May 09, 2007 9:31 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert John, I am working on an example of doing this with SQL Server for you. But, since I have the same amount of free time as you do, it is going to take a week or so to complete. I am using the same concept as you did with a couple of table to hold the Import spec and the column definitions for the spec. 
I am going to only build it for one table to show how it can be done. The rest will be up to you if you want to expand it. Robert At 09:27 PM 5/8/2007, you wrote: >Date: Tue, 8 May 2007 13:40:59 -0400 >From: "JWColby" >Subject: Re: [dba-SQLServer] Bulk insert >To: >Message-ID: <00d901c79198$0a703210$657aa8c0 at m6805> >Content-Type: text/plain; charset="us-ascii" > >Jim, > >At this point it is not necessary. I built a preprocessor in a few >hours using my toy (Access). My toy application handles everything >exactly as described. Someday (soon I hope) I will port that to VB.Net >which I hope will be much quicker in the preprocessing department. >Then I will be considered by some as being a real man, playing with >real tools. ;-) Others will still consider me a child, playing with >toys because I didn't take it straight to C#. SOMEDAY (far in the >future) perhaps I will embed those pieces directly in CLR programming >inside of SQL Server 2005. Then I will be able to look down my nose at those children still playing with toys. > >For now, it works and with the addition of driving the Sproc from the >vba will be an integrated application like what I described. > >John W. 
Colby _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed May 9 12:05:54 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Wed, 9 May 2007 13:05:54 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <200705091642.l49Gg11u011637@databaseadvisors.com> References: <200705091642.l49Gg11u011637@databaseadvisors.com> Message-ID: <004a01c7925c$4e91e820$657aa8c0@m6805> Given all that is on your plate, I don't think that getting involved in this is a good idea. I appreciate the thought but get your fiance in and your work in order. Thanks, John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Robert L. Stewart Sent: Wednesday, May 09, 2007 12:41 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert See answers/comments below... At 10:02 AM 5/9/2007, you wrote: >Date: Wed, 9 May 2007 11:01:56 -0400 >From: "JWColby" >Subject: Re: [dba-SQLServer] Bulk insert >To: >Message-ID: <002501c7924b$01f69c10$657aa8c0 at m6805> >Content-Type: text/plain; charset="us-ascii" > >Robert, > >That is much appreciated. Why don't you work with me instead? I have a full time job as a SQL DBA and Developer now. Plus I am also doing 6 new .Net web sites and 2 new installs of my Social Services management (SSMS) software this month. And, I a rewriting the SSMS system into .Net. And, finally, to top all that off, my fiance and her daughter are having their interviews for their visas to come here on June 18. 
So I have to get everything ready for them also. So, a sample is the best I can do for you. From martyconnelly at shaw.ca Wed May 9 13:04:21 2007 From: martyconnelly at shaw.ca (MartyConnelly) Date: Wed, 09 May 2007 11:04:21 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <000401c791e5$33d51790$657aa8c0@m6805> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon> <00d901c79198$0a703210$657aa8c0@m6805> <4641315C.4040303@shaw.ca> <000401c791e5$33d51790$657aa8c0@m6805> Message-ID: <46420D25.4040808@shaw.ca> Another option is to use SqlBulkCopy, a class that comes with .Net Framework 2.0. There is a SqlBulkCopy example in the book below that uses a CSV import. Tricky to set up but it works. This new 2.0 class is designed to call the SQLSMO layer underneath the covers--that replaces SQL DMO. William Vaughn's Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition) Here is a C# example that loads a DataTable from a StreamReader and feeds it to SqlBulkCopy. This handles 40,000 rows a second, 8,000 if you apply indexing.

private void button1_Click(object sender, EventArgs e)
{
    Stopwatch sw = new Stopwatch();
    sw.Start();
    DataTable table = new DataTable();
    table.Columns.Add(new DataColumn("File", typeof(string)));
    table.Columns.Add(new DataColumn("IID", typeof(int)));
    table.Columns[1].AutoIncrement = true;
    table.Columns[1].AutoIncrementSeed = 1;
    table.Columns[1].AutoIncrementStep = 1;
    StreamReader sr = new StreamReader("c:\\filelist.txt");
    while (!sr.EndOfStream)
    {
        table.Rows.Add(sr.ReadLine());
    }
    sw.Stop();
    Debug.Write(sw.Elapsed.TotalSeconds + " seconds for " + table.Rows.Count + " = " + Convert.ToDouble(table.Rows.Count) / sw.Elapsed.TotalSeconds + " rows per second loaded to datatable");
    sw.Reset();   // reset so the second timing covers only the bulk copy
    sw.Start();
    SqlConnection sqlcon = new SqlConnection("data source=lon0371xns;initial catalog=SonarBackup;integrated security=sspi");
    SqlBulkCopy bc = new SqlBulkCopy(sqlcon);
    bc.DestinationTableName = "FileList";
    bc.NotifyAfter = 5000;
    // bc_SqlRowsCopied: progress event handler defined elsewhere on the form
    bc.SqlRowsCopied += new SqlRowsCopiedEventHandler(bc_SqlRowsCopied);
    bc.ColumnMappings.Add(new SqlBulkCopyColumnMapping("File", "File"));
    bc.ColumnMappings.Add(new SqlBulkCopyColumnMapping("IID", "IID"));
    sqlcon.Open();
    bc.BulkCopyTimeout = 500;
    bc.WriteToServer(table);
    sw.Stop();
    Debug.Write(sw.Elapsed.TotalSeconds + " seconds for " + table.Rows.Count + " = " + Convert.ToDouble(table.Rows.Count) / sw.Elapsed.TotalSeconds + " rows per second loaded to db");
}

JWColby wrote: >>The only caveat here is if you have empty fields in your file, a single >>space is inserted instead of a null. > >What is it with all "the only caveat here" stuff? I am sure that there is a >darned good reason. > >In the end it is just easier to roll your own rather than work around the >issues that the built in stuff seems to have. I have 150 fields (in this >data set). Somehow I have to do an update on all 150 fields. I suppose I >could have my converter run 150 update queries to do each column. Or 700 >update queries to do the next data set. Or just do the stripping of the >spaces external to SQL Server and be done with it. Either way I still have >to use my toy. > >Once I move up to VB.Net I will be able to use threads to do the stripping >and the BULK INSERT Sproc in parallel. > >BTW, I have to do something very similar all over again once I get the data >in. I will need to export the entire table back out, 2 million record sets >of data to delimited files for CASS / NCOA processing, dumping 100 million >records out into ~50 files (just the address data this time). The CASS / >NCOA process theoretically will process all files placed into an input >directory (input to that program), dumping the processed files into an >output directory (output from that program). At which point I have to pull >all of the CASS / NCOAd files BACK out of that output directory into yet >another table. And that is just the "pre-processing".
> >You might be getting a clue by now why I do not want to be manually doing >all the crapola involved with the solutions that do not involve an external >control process. Someday fairly soon I will have a completely automated >system for doing all this. I will be back to blowing bubbles and poking at >Charlotte with a big stick. > >John W. Colby >Colby Consulting >www.ColbyConsulting.com > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of >MartyConnelly >Sent: Tuesday, May 08, 2007 10:27 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Bulk insert > >Uhh, there is a one line fix to remove trailing blanks in SQL, different >defaults for SQL Server versions and NChar and VChar. > >SET ANSI_PADDING OFF > > When a table is created with the setting turned on (the default), spaces >are not trimmed when data is inserted into that table. When ANSI_PADDING is >off, the spaces are trimmed. > >So if you SET ANSI_PADDING OFF, create your table, then set it back on >again, when you bcp the data into the table, the excess trailing spaces will >be eliminated. The only caveat here is if you have empty fields in your >file, a single space is inserted instead of a null. If this is the case with >your data file, you will need to do an update to set columns to null when >len(yourcolumn) = 0. > >See BOL >http://msdn2.microsoft.com/en-us/library/ms188340.aspx > >http://msdn2.microsoft.com/en-us/library/ms187403.aspx > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > > > > > -- Marty Connelly Victoria, B.C. 
Canada From jwcolby at colbyconsulting.com Wed May 9 13:33:49 2007 From: jwcolby at colbyconsulting.com (JWColby) Date: Wed, 9 May 2007 14:33:49 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <46420D25.4040808@shaw.ca> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon><00d901c79198$0a703210$657aa8c0@m6805> <4641315C.4040303@shaw.ca><000401c791e5$33d51790$657aa8c0@m6805> <46420D25.4040808@shaw.ca> Message-ID: <006a01c79268$96932bf0$657aa8c0@m6805> Marty, My only question about this method is the loading of all the records at one go before processing. Remember that I am doing raw files that can be 4 gigabytes of text, up to (in this case) 4 million records and (in this case) 149 fields. These files are HUGE by desktop standards. My method uses a single line read / process / write and thus is pretty much guaranteed to handle any size file, any number of records, any number of fields. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of MartyConnelly Sent: Wednesday, May 09, 2007 2:04 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Another option is to use SqlBulkCopy a class that comes with Net Framework 2.0 There is a SqlBulkCopy example in the book below that uses a CSV import. Ttricky to setup but it works. This new 2.0 class is designed to call the SQLSMO layer underneath the covers--that replaces SQL DMO. 
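[Editorial sketch] The single-line read / process / write loop John contrasts with the load-everything approach is straightforward to illustrate. Here is a Python version (his actual tool is VBA; the field names and positions below are hypothetical, taken from nothing but the thread's description of the field-spec table):

```python
def convert_fixed_to_csv(src, dst, fields, delim="|"):
    """Stream a fixed-width file to a delimited one, one line at a time,
    so file size never matters.  fields: (name, start, length) tuples
    taken from the field-spec table."""
    # First output line carries the field names, as in John's converter.
    dst.write(delim.join(name for name, _, _ in fields) + "\n")
    for line in src:
        # Slice each fixed-width field and strip its padding spaces.
        values = [line[start:start + length].strip() for _, start, length in fields]
        dst.write(delim.join(values) + "\n")
```

Because it never holds more than one line in memory, this shape handles a 4 GB input file as easily as a 4 KB one.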
>
>SET ANSI_PADDING OFF
>
>When a table is created with the setting turned on (the default), spaces
>are not trimmed when data is inserted into that table. When ANSI_PADDING is
>off, the spaces are trimmed.
>
>So if you SET ANSI_PADDING OFF, create your table, then set it back on
>again, when you bcp the data into the table, the excess trailing spaces
>will be eliminated. The only caveat here is if you have empty fields in
>your file, a single space is inserted instead of a null. If this is the
>case with your data file, you will need to do an update to set columns to
>null when len(yourcolumn) = 0.
>
>See BOL
>http://msdn2.microsoft.com/en-us/library/ms188340.aspx
>
>http://msdn2.microsoft.com/en-us/library/ms187403.aspx
>
>_______________________________________________
>dba-SQLServer mailing list
>dba-SQLServer at databaseadvisors.com
>http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
>http://www.databaseadvisors.com
>
--
Marty Connelly
Victoria, B.C.
Canada

_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com

From rl_stewart at highstream.net Thu May 10 11:16:32 2007
From: rl_stewart at highstream.net (Robert L. Stewart)
Date: Thu, 10 May 2007 11:16:32 -0500
Subject: [dba-SQLServer] Bulk insert
Message-ID: <200705101618.l4AGIPRf025159@databaseadvisors.com>

John,

Here is the first installment. These scripts will create the 2 tables and
1 view that the stored procedure uses to create a table.
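[Editor's note: Marty's ANSI_PADDING sequence above can be sketched in
T-SQL as follows. This is a minimal illustration, not code from the thread;
the staging table and column names (dbo.ImportStage, Addr1) are
hypothetical.]

```sql
-- Create the table with padding off so trailing blanks are trimmed on
-- insert, then restore the default setting for later objects.
SET ANSI_PADDING OFF
GO
CREATE TABLE dbo.ImportStage (   -- hypothetical staging table
    Addr1 varchar(50) NULL
)
GO
SET ANSI_PADDING ON
GO
-- After bcp / BULK INSERT, empty source fields arrive as a single space,
-- which the trim reduces to ''. LEN() ignores trailing spaces, so this
-- converts the leftover empties to NULL:
UPDATE dbo.ImportStage SET Addr1 = NULL WHERE LEN(Addr1) = 0
```

[The same cleanup could also be expressed per-load as
NULLIF(RTRIM(Addr1), '') if the table was created with padding on.]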
More later

******** Watch for line wrap

CREATE DEFAULT dbo.Default_0 AS 0
GO
CREATE DEFAULT dbo.Default_Now AS GETDATE()
GO

/****** Object: Table [dbo].[tsysImportDefinitionColumns] Script Date: 05/10/2007 11:15:05 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[tsysImportDefinitionColumns](
    [ImportDefColumnID] [int] IDENTITY(1,1) NOT NULL,
    [ImportDefID] [int] NOT NULL,
    [ColumnName] [varchar](64) NOT NULL,
    [ColumnStart] [int] NOT NULL,
    [ColumnEnd] [int] NOT NULL,
    [ColumnDataType] [varchar](15) NOT NULL,
    [ColumnLength] [varchar](10) NULL,
    [ColumnPrecision] [tinyint] NULL,
    [ColumnScale] [tinyint] NULL,
    CONSTRAINT [PK_tsysImportDefinitionColumns] PRIMARY KEY CLUSTERED
    (
        [ImportDefColumnID] ASC
    ) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO

/****** Object: Table [dbo].[tsysImportDefinition] Script Date: 05/10/2007 11:15:12 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[tsysImportDefinition](
    [ImportDefID] [int] IDENTITY(1,1) NOT NULL,
    [ImportDefName] [varchar](50) NOT NULL,
    [ImportDefDesc] [varchar](500) NULL,
    [DBName] [varchar](64) NOT NULL,
    [DBTableName] [varchar](64) NOT NULL,
    [ArchivedFlag] [bit] NOT NULL,
    [AuditArchiveDate] [datetime] NULL,
    [AuditCreateDate] [datetime] NOT NULL,
    [AuditModifyDate] [datetime] NULL,
    CONSTRAINT [PK_tsysImportDefinition] PRIMARY KEY CLUSTERED
    (
        [ImportDefID] ASC
    ) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
EXEC sys.sp_bindefault @defname=N'[dbo].[Default_0]',
    @objname=N'[dbo].[tsysImportDefinition].[ArchivedFlag]',
    @futureonly='futureonly'
GO
EXEC sys.sp_bindefault @defname=N'[dbo].[Default_Now]',
    @objname=N'[dbo].[tsysImportDefinition].[AuditCreateDate]',
    @futureonly='futureonly'
GO

/****** Object: View [dbo].[vwImportDefColumns] Script Date: 05/10/2007 11:15:13 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE VIEW [dbo].[vwImportDefColumns]
AS
SELECT dbo.tsysImportDefinition.DBName,
       dbo.tsysImportDefinition.DBTableName,
       dbo.tsysImportDefinition.ArchivedFlag,
       dbo.tsysImportDefinitionColumns.ColumnName,
       dbo.tsysImportDefinitionColumns.ColumnStart,
       dbo.tsysImportDefinitionColumns.ColumnEnd,
       dbo.tsysImportDefinitionColumns.ColumnDataType,
       dbo.tsysImportDefinitionColumns.ColumnLength,
       dbo.tsysImportDefinitionColumns.ColumnPrecision,
       dbo.tsysImportDefinitionColumns.ColumnScale
FROM dbo.tsysImportDefinition
     INNER JOIN dbo.tsysImportDefinitionColumns
         ON dbo.tsysImportDefinition.ImportDefID =
            dbo.tsysImportDefinitionColumns.ImportDefID
WHERE (dbo.tsysImportDefinition.ArchivedFlag = 0)
GO

CREATE PROCEDURE [dbo].[pImportCountryFile]
AS
BEGIN
    SET NOCOUNT ON;

    -- declare the variables you will need for the country columns
    DECLARE @CountryCode VARCHAR(3),
            @CountryName VARCHAR(100),
            @ColumnName VARCHAR(64),
            @ColumnStart INT,
            @ColumnEnd INT,
            @ColumnDataType VARCHAR(15),
            @ColumnLength VARCHAR(10), -- column length defined as varchar to take MAX as a parameter
            @ColumnPrecision TINYINT,
            @ColumnScale TINYINT,
            @DBName VARCHAR(64),
            @DBTableName VARCHAR(64)

    -- declare the variable you will need for your dynamic SQL statements
    DECLARE @CreateTable NVARCHAR(4000)

    -- the following code is generic for creating the table.
    -- the only thing that would be changed is being able to
    -- pass into the proc the name of the table you want to
    -- do the build of. for your system, it would probably be
    -- the client job information
    DECLARE cColumns CURSOR FOR
        SELECT DBName, DBTableName, ColumnName, ColumnStart, ColumnEnd,
               ColumnDataType, ColumnLength, ColumnPrecision, ColumnScale
        FROM dbo.vwImportDefColumns
        WHERE DBTableName = 'tlkpCountry'

    OPEN cColumns

    FETCH NEXT FROM cColumns
    INTO @DBName, @DBTableName, @ColumnName, @ColumnStart, @ColumnEnd,
         @ColumnDataType, @ColumnLength, @ColumnPrecision, @ColumnScale

    -- build the create table sql
    SET @CreateTable = 'CREATE TABLE dbo.'
        + @DBTableName + ' ('
        + 'PK_ID int identity(1,1), '

    WHILE (@@FETCH_STATUS = 0) -- There are records
    BEGIN
        -- build the create table sql
        IF (CHARINDEX('varchar', @ColumnDataType, 1) > 0)
        BEGIN
            SET @CreateTable = @CreateTable + @ColumnName + ' '
                + @ColumnDataType + '(' + @ColumnLength + '), '
        END
        ELSE IF (CHARINDEX('int', @ColumnDataType, 1) > 0)
        BEGIN
            SET @CreateTable = @CreateTable + @ColumnName + ' '
                + @ColumnDataType + ', '
        END
        ELSE IF (CHARINDEX('date', @ColumnDataType, 1) > 0)
        BEGIN
            SET @CreateTable = @CreateTable + @ColumnName + ' '
                + @ColumnDataType + ', '
        END
        ELSE IF (CHARINDEX('text', @ColumnDataType, 1) > 0)
        BEGIN
            SET @CreateTable = @CreateTable + @ColumnName + ' '
                + @ColumnDataType + ', '
        END
        ELSE -- money, numeric or decimal
        BEGIN
            -- note the ',' between precision and scale,
            -- e.g. decimal(18,2) rather than decimal(182)
            SET @CreateTable = @CreateTable + @ColumnName + ' '
                + @ColumnDataType + '('
                + CAST(@ColumnPrecision AS varchar(10)) + ','
                + CAST(@ColumnScale AS varchar(10)) + '), '
        END

        FETCH NEXT FROM cColumns
        INTO @DBName, @DBTableName, @ColumnName, @ColumnStart, @ColumnEnd,
             @ColumnDataType, @ColumnLength, @ColumnPrecision, @ColumnScale
    END

    -- trim the final , from the end of the string
    SET @CreateTable = SUBSTRING(@CreateTable, 1, LEN(@CreateTable) - 1) + ')'

    PRINT @CreateTable

    -- Execute the create table statement
    -- EXEC sp_ExecuteSql @CreateTable

    CLOSE cColumns
    DEALLOCATE cColumns
END
GO

From fhtapia at gmail.com Fri May 11 10:54:07 2007
From: fhtapia at gmail.com (Francisco Tapia)
Date: Fri, 11 May 2007 08:54:07 -0700
Subject: [dba-SQLServer] [AccessD] Interesting for you kung fu masters?
In-Reply-To: <0JHR0060H07KFFE0@l-daemon>
References: <000201c791cf$f7351f20$657aa8c0@m6805>
	<0JHR0060H07KFFE0@l-daemon>
Message-ID:

Jim/John (and all), if you like that, you should really be checking out
red-gate's SQL Compare. It not only does the compare that you are seeing
here, but you can do it across servers. In this way you can compare the
source database on your dev server against the production server database.
It has come in really handy in our environment since I do not allow my
developers to create any code on the live servers. If I catch them, they
get a big lecture from me; on a second offense my boss wants me to bring
them into his office, but that has not occurred yet. The price is not
terrible either; we negotiated on their SQL Backup and SQL Compare products
and were able to get 2 years of maintenance support for free.

So I know this sounds like a commercial, but we also like to discuss
products we use to help make our lives easier on this list, right?... One
of the features I completely dig about this product (SQL Compare) is that
when my developer completes their code, we can go through the code
review... then if all is approved, we push it using this tool to the live
server. This product allows you to select everything you need from your dev
server against your production server, thus if you changed tables, data
types, procedures, views etc... you can select them all for
synchronization. You can also choose to just review the script that the
program will generate to make the necessary changes and then just run it
yourself via Query Analyzer. The sp_compare approach is what I used before,
but that will waste space if you're looking at really large dbs.

We also purchased SQL Data Compare, which allows you to look at two tables
and synchronize the data... so with our data warehouse, when we are running
changes I will bring over a subset of data, make the necessary changes,
then publish the schema back to production without affecting production or
causing unnecessary errors.

http://red-gate.com/products/SQL_Compare/index.htm

You can try the product for 14 days, and extend it for another 10 days
(iirc) by typing in "i need more time" in the serial box... I think you can
extend the products twice before they deactivate.
In addition if you need extra time to evaluate the product you can contact them and they will give you a serial for a slightly longer time-period. Just so you know they don't get a full glowing report from me :). On occasion when I have required support, they have not been the fastest. I've used products from quest software and idera, and both those companies were faster at getting back to me (same day) than red-gate. Though to be fair, I have not had any issues with red-gate products that required such immediate attention such as idera or quest. I hope this info helps you guys. -- Francisco On 5/8/07, Jim Lawrence wrote: > > This is similar to the Backend-Upgrader at the DBA site > (http://www.databaseadvisors.com/downloads.asp) but works for MS SQL > server > instead of MS Access. > > Jim > > -----Original Message----- > From: accessd-bounces at databaseadvisors.com > [mailto:accessd-bounces at databaseadvisors.com] On Behalf Of JWColby > Sent: Tuesday, May 08, 2007 5:21 PM > To: 'Access Developers discussion and problem solving'; > dba-vb at databaseadvisors.com > Subject: [AccessD] Interesting for you kung fu masters? > > www.sql-server-performance.com/vg_database_comparison_sp.asp > > > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > -- > AccessD mailing list > AccessD at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/accessd > Website: http://www.databaseadvisors.com > > -- > AccessD mailing list > AccessD at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/accessd > Website: http://www.databaseadvisors.com > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From fhtapia at gmail.com Fri May 11 10:56:02 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Fri, 11 May 2007 08:56:02 -0700 Subject: [dba-SQLServer] [dba-VB] Redeem your voucher In-Reply-To: <011e01c791b2$32acf370$657aa8c0@m6805> References: <011e01c791b2$32acf370$657aa8c0@m6805> Message-ID: heheheh... 
...}:-> oops did I say that out-loud? what browser are you using... On 5/8/07, JWColby wrote: > > Well, I watched the two videos, filled out the required feedback report > and > now I have received my highly coveted voucher for my free* copy of Visual > Studio 2005 Standard Edition. > > *Other than having to pay shipping and handling (which is fine) I also > have > to get the web page to ACCEPT my voucher (which ain't happening). The old > endless loop, comes right back to the same page and asks for the voucher > again. I even tried using MS POS Explorer to no avail. > > I wonder if anyone at MS has noticed that no one is redeeming their > vouchers. I wonder if anyone at MS is ASSIGNED to notice such things. > > Sigh. > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-VB mailing list > dba-VB at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-vb > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From fhtapia at gmail.com Fri May 11 11:10:59 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Fri, 11 May 2007 09:10:59 -0700 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: <00e001c7919b$5fa696e0$657aa8c0@m6805> References: <00b601c79184$79118390$657aa8c0@m6805> <29f585dd0705081034q675ece25m7c31c0428a0d32bd@mail.gmail.com> <00e001c7919b$5fa696e0$657aa8c0@m6805> Message-ID: On 5/8/07, JWColby wrote: > > Arthur, > > >Let's pick it apart tad by tad, beginning with your denigrating use of > "infamous". This is a "famous" error, not an infamous error. For > references > to infamous errors, their numbers are 533 and 601. > > I have no way of knowing whether this is a famous error or > not. However... > > Infamous - ill-famed: having an exceedingly bad reputation; "a notorious > gangster"; "the tenderloin district was notorious for vice" > > How about "f*g useless!!! Will that do? first thing... 
let's just cut it down... to simple syntax... no need to add the other letters if someone does not understand your profanity I'm sure you can handle them privately. "ERROR SOMEWHERE in the vicinity of...." > > Oh yea, that fits the word infamous in my book. It also fits F*G > useless as far as I am concerned! It's a syntactical error.. it tells you where it's at... you need to at this point re-read the expression you've typed in and re-evaluate what you want here. The system isn't going to turn around and error out with: "ERROR, John, you must use blah blah blah to fix this error" NOT even VBA does this... IF you happen to have a bad reference somewhere you will notice that sometimes it will even error out doing stupid things such as case changes or MIN(x) etc... dumb things that are built in to the language. all because a reference to a non-default library is missing... EBKAC is equally helpful, and only a tad more insulting. > > >Step 2: Why is there no space between the value and the operator? I shall > assume that it's the fault of the translator. > > Perhaps it is because that is the 47th attempt at putting things in and > taking things out, NONE of which gave me any results other than the > INFAMOUS > "Error somewhere in the vicinity of Hudson NC". > > >Step 3: lose the "AS" part and run the query and see what happens. You > probably won't get this far, since Steps 1 and 2 ought to fix the problem, > but JIC (just in case). > > AS is the clause that defines the alias. You can't lose that. Even I > know > that. you can, your AS is optional (ha!)... example SELECT 1 testfieldone, 2 AS testfieldtwo this generates a list of values 1 and 2 in columns testfieldone and testfieldtwo, this is acceptable syntax for sql 7, 2000 and 2005. Though adding the AS makes it more legible and easier to follow along (btw, good idea to get into the habit of capitalizing all they default keywords). >Step 4: when none of the above works, re-do the query in Management Studio. 
> Then compare the syntax. > > And this is my problem with you Arthur. If I told you "if that fails, > just > rebuild the space shuttle" what would you tell me? If I told you that in > EVERY EMAIL what would you tell me? Such helpful suggestions are so > useless > that I would expect you to someday cease and desisted in issuing them. > Alas.... > > I tried using Management Studio. It is vastly different from Access' qbd > window and requires more than a passing knowledge of the intricacies of > SQL > Server's brand of SQL. If you had been paying attention for the last few > months you would understand that to be the root of my problem. I also > tried > poking and prodding, and Googling and looking up various phrases in my > books, trying to solve the problem without assistance. > > Not to worry, James Barash actually solved my problem by providing the (or > a) syntax needed to do comparisons in SQL Server. I have to guess that it > took him all of three minutes to type it into an email, and it took me all > of three minutes to type it in and verify that it works. > > Thanks James! Glad to see your problem has been resolved. John W. Colby > Colby Consulting > www.ColbyConsulting.com > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur > Fuller > Sent: Tuesday, May 08, 2007 1:35 PM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into > Booleans > > Let's pick it apart tad by tad, beginning with your denigrating use of > "infamous". This is a "famous" error, not an infamous error. For > references > to infamous errors, their numbers are 533 and 601. > > Now. Let's go step by step.AFAIK S2k5 has no issues with square brackets, > in > fact I use them frequently, but begin by removing them. > > Step 2: Why is there no space between the value and the operator? 
I shall > assume that it's the fault of the translator. > > Step 3: lose the "AS" part and run the query and see what happens. You > probably won't get this far, since Steps 1 and 2 ought to fix the problem, > but JIC (just in case). > > Step 4: when none of the above works, re-do the query in Management > Studio. > Then compare the syntax. > > A, > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From jwcolby at colbyconsulting.com Fri May 11 11:44:01 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Fri, 11 May 2007 12:44:01 -0400 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: Message-ID: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> You know what Francisco, I work hard at learning this on my own. I have a half dozen books open in front of me, and I Google for an hour before I post a request for help. I just get tired of silly crap, both from SQL Server (or any other software package) and from list members. SQL Server's "help" was useless, as was Arthur's post. Every single point in the email was useless, and ended up condescending. James post OTOH was informative, to the point and best of all worked! I read every single reply to my posts trying to learn what I can, so when I get an entire email full of useless and even condescending crap it is annoying. It is REALLY annoying when it happens over and over. I try very hard to only post responses to questions with real, useful suggestions. NOW, go back and read Arthur's original response (at the bottom of this email). It is just useless, a waste of my time, and a waste of the list's bandwidth. James response is a an example of a succinct and useful reply to a request for help. 
Arthur had exactly the same information to work with that James did. LOOK at the difference in responses!!! Arthur is a bright guy, and knows a lot of stuff. If he would limit his responses to actually providing solutions to problems he would be immensely useful. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Friday, May 11, 2007 12:11 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into Booleans On 5/8/07, JWColby wrote: > > Arthur, > > >Let's pick it apart tad by tad, beginning with your denigrating use > >of > "infamous". This is a "famous" error, not an infamous error. For > references to infamous errors, their numbers are 533 and 601. > > I have no way of knowing whether this is a famous error or not. > However... > > Infamous - ill-famed: having an exceedingly bad reputation; "a > notorious gangster"; "the tenderloin district was notorious for vice" > > How about "f*g useless!!! Will that do? first thing... let's just cut it down... to simple syntax... no need to add the other letters if someone does not understand your profanity I'm sure you can handle them privately. "ERROR SOMEWHERE in the vicinity of...." > > Oh yea, that fits the word infamous in my book. It also fits F*G > useless as far as I am concerned! It's a syntactical error.. it tells you where it's at... you need to at this point re-read the expression you've typed in and re-evaluate what you want here. The system isn't going to turn around and error out with: "ERROR, John, you must use blah blah blah to fix this error" NOT even VBA does this... IF you happen to have a bad reference somewhere you will notice that sometimes it will even error out doing stupid things such as case changes or MIN(x) etc... dumb things that are built in to the language. 
all because a reference to a non-default library is missing... EBKAC is equally helpful, and only a tad more insulting. > > >Step 2: Why is there no space between the value and the operator? I > >shall > assume that it's the fault of the translator. > > Perhaps it is because that is the 47th attempt at putting things in > and taking things out, NONE of which gave me any results other than > the INFAMOUS "Error somewhere in the vicinity of Hudson NC". > > >Step 3: lose the "AS" part and run the query and see what happens. > >You > probably won't get this far, since Steps 1 and 2 ought to fix the > problem, but JIC (just in case). > > AS is the clause that defines the alias. You can't lose that. Even I > know that. you can, your AS is optional (ha!)... example SELECT 1 testfieldone, 2 AS testfieldtwo this generates a list of values 1 and 2 in columns testfieldone and testfieldtwo, this is acceptable syntax for sql 7, 2000 and 2005. Though adding the AS makes it more legible and easier to follow along (btw, good idea to get into the habit of capitalizing all they default keywords). >Step 4: when none of the above works, re-do the query in Management Studio. > Then compare the syntax. > > And this is my problem with you Arthur. If I told you "if that fails, > just > rebuild the space shuttle" what would you tell me? If I told you that in > EVERY EMAIL what would you tell me? Such helpful suggestions are so > useless > that I would expect you to someday cease and desisted in issuing them. > Alas.... > > I tried using Management Studio. It is vastly different from Access' qbd > window and requires more than a passing knowledge of the intricacies of > SQL > Server's brand of SQL. If you had been paying attention for the last few > months you would understand that to be the root of my problem. I also > tried > poking and prodding, and Googling and looking up various phrases in my > books, trying to solve the problem without assistance. 
> > Not to worry, James Barash actually solved my problem by providing the (or > a) syntax needed to do comparisons in SQL Server. I have to guess that it > took him all of three minutes to type it into an email, and it took me all > of three minutes to type it in and verify that it works. > > Thanks James! Glad to see your problem has been resolved. John W. Colby > Colby Consulting > www.ColbyConsulting.com > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur > Fuller > Sent: Tuesday, May 08, 2007 1:35 PM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into > Booleans > > Let's pick it apart tad by tad, beginning with your denigrating use of > "infamous". This is a "famous" error, not an infamous error. For > references > to infamous errors, their numbers are 533 and 601. > > Now. Let's go step by step.AFAIK S2k5 has no issues with square brackets, > in > fact I use them frequently, but begin by removing them. > > Step 2: Why is there no space between the value and the operator? I shall > assume that it's the fault of the translator. > > Step 3: lose the "AS" part and run the query and see what happens. You > probably won't get this far, since Steps 1 and 2 ought to fix the problem, > but JIC (just in case). > > Step 4: when none of the above works, re-do the query in Management > Studio. > Then compare the syntax. > > A, > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... 
_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From rl_stewart at highstream.net Fri May 11 12:55:41 2007 From: rl_stewart at highstream.net (Robert L. Stewart) Date: Fri, 11 May 2007 12:55:41 -0500 Subject: [dba-SQLServer] Reciprocity In-Reply-To: References: Message-ID: <200705111758.l4BHwJM0002530@databaseadvisors.com> So after coming up with the first section of code that you would need for your generic import processing, where is the same kind of courtesy that you are asking for from Arthur? Will it work for you or not? Should I bother with proceeding with the text file read? Robert At 12:00 PM 5/11/2007, you wrote: >Date: Fri, 11 May 2007 12:44:01 -0400 >From: "jwcolby" >Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into > Booleans >To: >Message-ID: <20070511164401.6BED3BC02 at smtp-auth.no-ip.com> >Content-Type: text/plain; charset="us-ascii" > >You know what Francisco, I work hard at learning this on my own. I have a >half dozen books open in front of me, and I Google for an hour before I post >a request for help. I just get tired of silly crap, both from SQL Server >(or any other software package) and from list members. SQL Server's "help" >was useless, as was Arthur's post. Every single point in the email was >useless, and ended up condescending. > >James post OTOH was informative, to the point and best of all worked! > >I read every single reply to my posts trying to learn what I can, so when I >get an entire email full of useless and even condescending crap it is >annoying. It is REALLY annoying when it happens over and over. I try very >hard to only post responses to questions with real, useful suggestions. > >NOW, go back and read Arthur's original response (at the bottom of this >email). It is just useless, a waste of my time, and a waste of the list's >bandwidth. 
James response is a an example of a succinct and useful reply to >a request for help. Arthur had exactly the same information to work with >that James did. LOOK at the difference in responses!!! > >Arthur is a bright guy, and knows a lot of stuff. If he would limit his >responses to actually providing solutions to problems he would be immensely >useful. > >John W. Colby From jwcolby at colbyconsulting.com Fri May 11 13:34:31 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Fri, 11 May 2007 14:34:31 -0400 Subject: [dba-SQLServer] Reciprocity In-Reply-To: <200705111758.l4BHwJM0002530@databaseadvisors.com> Message-ID: <20070511183431.06052BCF5@smtp-auth.no-ip.com> Robert, I do thank you for your assistance. The scripts are way cool from what I can understand of them. Please understand that I kind of work from what I need now to what I need eventually and this appears to be DEFINITELY what I need eventually. However I am, as we speak, migrating my VBA code to VB.NET. That has pretty much captured my entire attention. Perhaps you do not realize that I am VERY inexperienced in SQL Server, in fact have only ever written and am using exactly one Sproc. Thus I am completely unable to grasp the entirety of what you are trying to accomplish. I have not tested this stuff because I don't even know how. I understand everything down through the tsysImportDefinition (at least on a conceptual level), though I do not understand the EXEC sys.SP_BindDefault... No clue at all there. vwImportDefColumns not a clue. pImportCountryFile not a clue, though I assume that perhaps this is an "example" of using it? This appears to be the actual meat of the matter but it is so far over my head that "I'll have to get back to you" on that one. I have no doubt that whatever it does, it does well, and hopefully I will be able to use it soon. As for today.... I just don't have the SQL Server skills to apply it. As for "Should I bother with proceeding with the text file read?", I don't know. 
How can I answer that? You are apparently providing an all SQL solution to the problem which is really cool, and provides me with example code of how to do things, which is also really cool, but until I can absorb it I am not sure I can use it. If you have the time to explain this stuff, line by line or block by block, then perhaps yes, let's continue. I am not stupid, just ignorant. This looks like an education in progress. Of course I am still in first grade. ;-) I do think this would make a fascinating thread though if you do get down and explain it. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Robert L. Stewart Sent: Friday, May 11, 2007 1:56 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Reciprocity So after coming up with the first section of code that you would need for your generic import processing, where is the same kind of courtesy that you are asking for from Arthur? Will it work for you or not? Should I bother with proceeding with the text file read? Robert From fhtapia at gmail.com Fri May 11 15:56:36 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Fri, 11 May 2007 13:56:36 -0700 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> References: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> Message-ID: John, I'm not defending Arthur, however, I realize by your response here, that you are under some pressure to deliver some results quickly, if not by yourself for your business possibly the customers your are providing the reports for. This doesn't make Arthur's reply any less, James' reply was to the point and quick, Arthur just took a different route. 
This happens all the time on this or AccessD, sometimes a member of the list will provide enough information to begin troubleshooting the problem at your end, now I understand sometimes you don't need/want to learn to fish... sometimes the answer is enough. I like this list because it provides a place to think things out and provide different points of view when resolving issues or developing solutions. btw, You've found out the hard way that Access and Sql Server though both MS products have very distinct ways of handling SQL code. I am going through similar growing pains with a new system that we are getting into... and sometimes I wish our consultants would provide just a straight answer, but as all things in technology... the answers are usually "it depends". Patience ... On 5/11/07, jwcolby wrote: > > You know what Francisco, I work hard at learning this on my own. I have a > half dozen books open in front of me, and I Google for an hour before I > post > a request for help. I just get tired of silly crap, both from SQL Server > (or any other software package) and from list members. SQL Server's > "help" > was useless, as was Arthur's post. Every single point in the email was > useless, and ended up condescending. > > James post OTOH was informative, to the point and best of all worked! > > I read every single reply to my posts trying to learn what I can, so when > I > get an entire email full of useless and even condescending crap it is > annoying. It is REALLY annoying when it happens over and over. I try > very > hard to only post responses to questions with real, useful suggestions. > > NOW, go back and read Arthur's original response (at the bottom of this > email). It is just useless, a waste of my time, and a waste of the list's > bandwidth. James response is a an example of a succinct and useful reply > to > a request for help. Arthur had exactly the same information to work with > that James did. LOOK at the difference in responses!!! 
> > Arthur is a bright guy, and knows a lot of stuff. If he would limit his > responses to actually providing solutions to problems he would be > immensely > useful. > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco > Tapia > Sent: Friday, May 11, 2007 12:11 PM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into > Booleans > > On 5/8/07, JWColby wrote: > > > > Arthur, > > > > >Let's pick it apart tad by tad, beginning with your denigrating use > > >of > > "infamous". This is a "famous" error, not an infamous error. For > > references to infamous errors, their numbers are 533 and 601. > > > > I have no way of knowing whether this is a famous error or not. > > However... > > > > Infamous - ill-famed: having an exceedingly bad reputation; "a > > notorious gangster"; "the tenderloin district was notorious for vice" > > > > How about "f*g useless!!! Will that do? > > > first thing... let's just cut it down... to simple syntax... no need to > add > the other letters if someone does not understand your profanity I'm sure > you > can handle them privately. > > > "ERROR SOMEWHERE in the vicinity of...." > > > > Oh yea, that fits the word infamous in my book. It also fits F*G > > useless as far as I am concerned! > > > It's a syntactical error.. it tells you where it's at... you need to at > this > point re-read the expression you've typed in and re-evaluate what you want > here. The system isn't going to turn around and error out with: > > "ERROR, John, you must use blah blah blah to fix this error" NOT even VBA > does this... IF you happen to have a bad reference somewhere you will > notice > that sometimes it will even error out doing stupid things such as case > changes or MIN(x) etc... dumb things that are built in to the language. 
> all because a reference to a non-default library is missing... > > EBKAC is equally helpful, and only a tad more insulting. > > > > >Step 2: Why is there no space between the value and the operator? I > > >shall > > assume that it's the fault of the translator. > > > > Perhaps it is because that is the 47th attempt at putting things in > > and taking things out, NONE of which gave me any results other than > > the INFAMOUS "Error somewhere in the vicinity of Hudson NC". > > > > >Step 3: lose the "AS" part and run the query and see what happens. > > >You > > probably won't get this far, since Steps 1 and 2 ought to fix the > > problem, but JIC (just in case). > > > > AS is the clause that defines the alias. You can't lose that. Even I > > know that. > > > you can, your AS is optional (ha!)... example > > SELECT 1 testfieldone, 2 AS testfieldtwo > > this generates a list of values 1 and 2 in columns testfieldone and > testfieldtwo; this is acceptable syntax for SQL 7, 2000 and 2005. Though > adding the AS makes it more legible and easier to follow along (BTW, good > idea to get into the habit of capitalizing all the default keywords). > > > > >Step 4: when none of the above works, re-do the query in Management > Studio. > > Then compare the syntax. > > > > And this is my problem with you Arthur. If I told you "if that fails, > > just > > rebuild the space shuttle" what would you tell me? If I told you that > in > > EVERY EMAIL what would you tell me? Such helpful suggestions are so > > useless > > that I would expect you to someday cease and desist from issuing them. > > Alas.... > > > > I tried using Management Studio. It is vastly different from Access' > QBE > > window and requires more than a passing knowledge of the intricacies of > > SQL > > Server's brand of SQL. If you had been paying attention for the last > few > > months you would understand that to be the root of my problem. 
I also > > tried > > poking and prodding, and Googling and looking up various phrases in my > > books, trying to solve the problem without assistance. > > > > Not to worry, James Barash actually solved my problem by providing the > (or > > a) syntax needed to do comparisons in SQL Server. I have to guess that > it > > took him all of three minutes to type it into an email, and it took me > all > > of three minutes to type it in and verify that it works. > > > > Thanks James! > > > Glad to see your problem has been resolved. > > > John W. Colby > > Colby Consulting > > www.ColbyConsulting.com > > > > -----Original Message----- > > From: dba-sqlserver-bounces at databaseadvisors.com > > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur > > Fuller > > Sent: Tuesday, May 08, 2007 1:35 PM > > To: dba-sqlserver at databaseadvisors.com > > Subject: Re: [dba-SQLServer] SQL Server - Turning comparisons into > > Booleans > > > > Let's pick it apart tad by tad, beginning with your denigrating use of > > "infamous". This is a "famous" error, not an infamous error. For > > references > > to infamous errors, their numbers are 533 and 601. > > > > Now. Let's go step by step. AFAIK S2k5 has no issues with square > brackets, > > in > > fact I use them frequently, but begin by removing them. > > > > Step 2: Why is there no space between the value and the operator? I > shall > > assume that it's the fault of the translator. > > > > Step 3: lose the "AS" part and run the query and see what happens. You > > probably won't get this far, since Steps 1 and 2 ought to fix the > problem, > > but JIC (just in case). > > > > Step 4: when none of the above works, re-do the query in Management > > Studio. > > Then compare the syntax. 
> > > > A, > > > > _______________________________________________ > > dba-SQLServer mailing list > > dba-SQLServer at databaseadvisors.com > > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > > http://www.databaseadvisors.com > > > > > > > -- > -Francisco > http://sqlthis.blogspot.com | Tsql and More... > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From mwp.reid at qub.ac.uk Fri May 11 16:08:46 2007 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Fri, 11 May 2007 22:08:46 +0100 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans References: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> Message-ID: Yip up to my neck in Sharepoint and having similar problems etc with trying to find out how the guts of the thing work. Took me two hours this evening trying to find out how to add a value to the search drop down list in MOSS 2007. Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 From fhtapia at gmail.com Fri May 11 16:32:18 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Fri, 11 May 2007 14:32:18 -0700 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: References: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> Message-ID: :D, I'm up to my ears in SAP security, and their CUA (central user administration). Our basis guy is a very good basis guy but is not a security person, so many questions that I have, he simply cannot answer, because he's never come across it. 
:| -- Francisco On 5/11/07, Martin Reid wrote: > > Yip up to my neck in Sharepoint and having similar problems etc with > trying to find out how the guts of the thing work. Took me two hours this > evening trying to find out how to add a value to the search drop down list > in MOSS 2007. > > Martin > > Martin WP Reid > Training and Assessment Unit > Riddle Hall > Belfast > > tel: 02890 974465 > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From martyconnelly at shaw.ca Fri May 11 18:24:18 2007 From: martyconnelly at shaw.ca (MartyConnelly) Date: Fri, 11 May 2007 16:24:18 -0700 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <006a01c79268$96932bf0$657aa8c0@m6805> References: <00a801c79174$4858ee10$657aa8c0@m6805> <0JHQ005MPG4J9VP0@l-daemon> <00d901c79198$0a703210$657aa8c0@m6805> <4641315C.4040303@shaw.ca> <000401c791e5$33d51790$657aa8c0@m6805> <46420D25.4040808@shaw.ca> <006a01c79268$96932bf0$657aa8c0@m6805> Message-ID: <4644FB22.2070602@shaw.ca> Go with a big 64 bit CPU box and Windows Vista 64 bit Business Version that will give you access to 128 Gig of onboard RAM JWColby wrote: >Marty, > >My only question about this method is the loading of all the records at one >go before processing. Remember that I am doing raw files that can be 4 >gigabytes of text, up to (in this case) 4 million records and (in this case) >149 fields. These files are HUGE by desktop standards. > >My method uses a single line read / process / write and thus is pretty much >guaranteed to handle any size file, any number of records, any number of >fields. > > > >John W. 
Colby >Colby Consulting >www.ColbyConsulting.com > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of >MartyConnelly >Sent: Wednesday, May 09, 2007 2:04 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Bulk insert > >Another option is to use SqlBulkCopy, a class that comes with .Net Framework >2.0. There is a SqlBulkCopy example in the book below that uses a CSV import. >Tricky to set up, but it works. This new 2.0 class is designed to call the >SQLSMO layer underneath the covers--that replaces SQL DMO. >William Vaughn's Hitchhiker's Guide to Visual Studio and SQL Server (7th >Edition) > >Here is a C# code example of streaming data into >SqlBulkCopy. This handles 40,000 rows a second, 8000 if you apply indexing > >private void button1_Click(object sender, EventArgs e) { > Stopwatch sw = new Stopwatch(); > sw.Start(); > DataTable table = new DataTable(); > table.Columns.Add(new DataColumn("File",typeof(string))); > table.Columns.Add(new DataColumn("IID",typeof(int))); > table.Columns[1].AutoIncrement = true; > table.Columns[1].AutoIncrementSeed = 1; > table.Columns[1].AutoIncrementStep = 1; > > StreamReader sr = new StreamReader("c:\\filelist.txt"); > while (!sr.EndOfStream ){ > table.Rows.Add(sr.ReadLine()); > } > > sw.Stop(); > Debug.Write(sw.Elapsed.TotalSeconds + " seconds for " + >table.Rows.Count + " = " + Convert.ToDouble(table.Rows.Count) / >sw.Elapsed.TotalSeconds + " >rows >per second loaded to datatable"); > sw.Start(); > > SqlConnection sqlcon = new SqlConnection("data >source=lon0371xns;initial catalog=SonarBackup;integrated security=sspi"); > SqlBulkCopy bc = new SqlBulkCopy(sqlcon); > bc.DestinationTableName = "FileList"; > bc.NotifyAfter = 5000; > bc.SqlRowsCopied += new >SqlRowsCopiedEventHandler(bc_SqlRowsCopied); > bc.ColumnMappings.Add(new SqlBulkCopyColumnMapping("File", "File")); > bc.ColumnMappings.Add(new 
SqlBulkCopyColumnMapping("IID", "IID")); > sqlcon.Open(); > bc.BulkCopyTimeout = 500; > bc.WriteToServer(table); > sw.Stop(); > > Debug.Write(sw.Elapsed.TotalSeconds + " seconds for " + >table.Rows.Count + " = " + Convert.ToDouble(table.Rows.Count) / >sw.Elapsed.TotalSeconds + " >rows >per second loaded to db"); > > > >} > > > >JWColby wrote: > > > >>>The only caveat here is if you have empty fields in your file, a single >>> >>> >>> >>> >>space is inserted instead of a null. >> >>What is it with all "the only caveat here" stuff? I am sure that there is >> >> >a > > >>darned good reason. >> >>In the end it is just easier to roll your own rather than work around the >>issues that the built in stuff seems to have. I have 150 fields (in this >>data set). Somehow I have to do an update on all 150 fields. I suppose I >>could have my converter run 150 update queries to do each column. Or 700 >>update queries to do the next data set. Or just do the stripping of the >>spaces external to SQL Server and be done with it. Either way I still have >>to use my toy. >> >>Once I move up to VB.Net I will be able to use threads to do the stripping >>and the BULK INSERT Sproc in parallel. >> >>BTW, I have to do something very similar all over again once I get the data >>in. I will need to export the entire table back out, 2 million record sets >>of data to delimited files for CASS / NCOA processing, dumping 100 million >>records out into ~50 files (just the address data this time). The CASS / >>NCOA process theoretically will process all files placed into an input >>directory (input to that program), dumping the processed files into an >>output directory (output from that program). At which point I have to pull >>all of the CASS / NCOAd files BACK out of that output directory into yet >>another table. And that is just the "pre-processing". 
>> >>You might be getting a clue by now why I do not want to be manually doing >>all the crapola involved with the solutions that do not involve an external >>control process. Someday fairly soon I will have a completely automated >>system for doing all this. I will be back to blowing bubbles and poking at >>Charlotte with a big stick. >> >>John W. Colby >>Colby Consulting >>www.ColbyConsulting.com >> >>-----Original Message----- >>From: dba-sqlserver-bounces at databaseadvisors.com >>[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of >>MartyConnelly >>Sent: Tuesday, May 08, 2007 10:27 PM >>To: dba-sqlserver at databaseadvisors.com >>Subject: Re: [dba-SQLServer] Bulk insert >> >>Uhh, there is a one line fix to remove trailing blanks in SQL, different >>defaults for SQL Server versions and NChar and VChar. >> >>SET ANSI_PADDING OFF >> >>When a table is created with the setting turned on (the default), spaces >>are not trimmed when data is inserted into that table. When ANSI_PADDING is >>off, the spaces are trimmed. >> >>So if you SET ANSI_PADDING OFF, create your table, then set it back on >>again, when you bcp the data into the table, the excess trailing spaces >> >> >will > > >>be eliminated. The only caveat here is if you have empty fields in your >>file, a single space is inserted instead of a null. If this is the case >> >> >with > > >>your data file, you will need to do an update to set columns to null when >>len(yourcolumn) = 0. >> >>See BOL >>http://msdn2.microsoft.com/en-us/library/ms188340.aspx >> >>http://msdn2.microsoft.com/en-us/library/ms187403.aspx >> >> >> -- Marty Connelly Victoria, B.C. 
Canada From jwcolby at colbyconsulting.com Fri May 11 19:00:48 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Fri, 11 May 2007 20:00:48 -0400 Subject: [dba-SQLServer] Bulk insert In-Reply-To: <4644FB22.2070602@shaw.ca> Message-ID: <20070512000048.36A36BD74@smtp-auth.no-ip.com> >Go with a big 64 bit CPU box and Windows Vista 64 bit Business Version that will give you access to 128 Gig of onboard RAM Yep. And about $20K later I will have a honkin machine. And someday I will do that, but not until the money starts to roll in from doing all this stuff on the cheap. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of MartyConnelly Sent: Friday, May 11, 2007 7:24 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Bulk insert Go with a big 64 bit CPU box and Windows Vista 64 bit Business Version that will give you access to 128 Gig of onboard RAM JWColby wrote: >Marty, > >My only question about this method is the loading of all the records at >one go before processing. Remember that I am doing raw files that can >be 4 gigabytes of text, up to (in this case) 4 million records and (in >this case) >149 fields. These files are HUGE by desktop standards. > >My method uses a single line read / process / write and thus is pretty >much guaranteed to handle any size file, any number of records, any >number of fields. > > > >John W. 
Colby >Colby Consulting >www.ColbyConsulting.com > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of >MartyConnelly >Sent: Wednesday, May 09, 2007 2:04 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Bulk insert > >Another option is to use SqlBulkCopy a class that comes with Net >Framework 2.0 There is a SqlBulkCopy example in the book below that uses a CSV import. >Ttricky to setup but it works. This new 2.0 class is designed to call >the SQLSMO layer underneath the covers--that replaces SQL DMO. >William Vaughn's Hitchhiker's Guide to Visual Studio and SQL Server >(7th >Edition) > >Here is C## code example of calling using a datareader stream into >SQLBulkCopy This handles 40,000 rows a second, 8000 if you apply >indexing > >private void button1_Click(object sender, EventArgs e) { > Stopwatch sw = new Stopwatch(); > sw.Start(); > DataTable table = new DataTable(); > table.Columns.Add(new DataColumn("File",typeof(string))); > table.Columns.Add(new DataColumn("IID",typeof(int))); > table.Columns[1].AutoIncrement = true; > table.Columns[1].AutoIncrementSeed = 1; > table.Columns[1].AutoIncrementStep = 1; > > StreamReader sr = new StreamReader("c:\\filelist.txt"); > while (!sr.EndOfStream ){ > table.Rows.Add(sr.ReadLine()); > } > > sw.Stop(); > Debug.Write(sw.Elapsed.TotalSeconds + " seconds for " + >table.Rows.Count + " = " + Convert.ToDouble(table.Rows.Count) / >sw.Elapsed.TotalSeconds + " >rows >per second loaded to datatable"); > sw.Start(); > > SqlConnection sqlcon = new SqlConnection("data >source=lon0371xns;initial catalog=SonarBackup;integrated security=sspi"); > SqlBulkCopy bc = new SqlBulkCopy(sqlcon); > bc.DestinationTableName = "FileList"; > bc.NotifyAfter = 5000; > bc.SqlRowsCopied += new >SqlRowsCopiedEventHandler(bc_SqlRowsCopied); > bc.ColumnMappings.Add(new SqlBulkCopyColumnMapping("File", "File")); > 
bc.ColumnMappings.Add(new SqlBulkCopyColumnMapping("IID", "IID")); > sqlcon.Open(); > bc.BulkCopyTimeout = 500; > bc.WriteToServer(table); > sw.Stop(); > > Debug.Write(sw.Elapsed.TotalSeconds + " seconds for " + >table.Rows.Count + " = " + Convert.ToDouble(table.Rows.Count) / >sw.Elapsed.TotalSeconds + " >rows >per second loaded to db"); > > > >} > > > >JWColby wrote: > > > >>>The only caveat here is if you have empty fields in your file, a >>>single >>> >>> >>> >>> >>space is inserted instead of a null. >> >>What is it with all "the only caveat here" stuff? I am sure that >>there is >> >> >a > > >>darned good reason. >> >>In the end it is just easier to roll your own rather than work around >>the issues that the built in stuff seems to have. I have 150 fields >>(in this data set). Somehow I have to do an update on all 150 fields. >>I suppose I could have my converter run 150 update queries to do each >>column. Or 700 update queries to do the next data set. Or just do >>the stripping of the spaces external to SQL Server and be done with >>it. Either way I still have to use my toy. >> >>Once I move up to VB.Net I will be able to use threads to do the >>stripping and the BULK INSERT Sproc in parallel. >> >>BTW, I have to do something very similar all over again once I get the >>data in. I will need to export the entire table back out, 2 million >>record sets of data to delimited files for CAS / NCOA processing, >>dumping 100 million records out into ~50 files (just the address data >>this time). The CAS / NCOA process theoretically will process all >>files placed into an input directory (input to that program), dumping >>the processed files into an output directory (output from that >>program). At which point I have to pull all of the CASS / NCOAd files >>BACK out of that output directory into to yet another table. And that is just the "pre-processing". 
>> >>You might be getting a clue by now why I do not want to be manually >>doing all the crapola involved with the solutions that do not involve >>an external control process. Someday fairly soon I will have a >>completely automated system for doing all this. I will be back to >>blowing bubbles and poking at Charlotte with a big stick. >> >>John W. Colby >>Colby Consulting >>www.ColbyConsulting.com >> >>-----Original Message----- >>From: dba-sqlserver-bounces at databaseadvisors.com >>[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of >>MartyConnelly >>Sent: Tuesday, May 08, 2007 10:27 PM >>To: dba-sqlserver at databaseadvisors.com >>Subject: Re: [dba-SQLServer] Bulk insert >> >>Uhh, there is a one line fix to remove trailing blanks in SQL, >>different defaults for SQL Server versions and NChar and VChar. >> >>SET ANSI_PADDING OFF >> >>When a table is created with the setting turned on (the default), >>spaces are not trimmed when data is inserted into that table. When >>ANSI_PADDING is off, the spaces are trimmed. >> >>So if you SET ANSI_PADDING OFF, create your table, then set it back on >>again, when you bcp the data into the table, the excess trailing >>spaces >> >> >will > > >>be eliminated. The only caveat here is if you have empty fields in >>your file, a single space is inserted instead of a null. If this is >>the case >> >> >with > > >>your data file, you will need to do an update to set columns to null >>when >>len(yourcolumn) = 0. >> >>See BOL >>http://msdn2.microsoft.com/en-us/library/ms188340.aspx >> >>http://msdn2.microsoft.com/en-us/library/ms187403.aspx >> >> >> -- Marty Connelly Victoria, B.C. 
Canada _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Fri May 11 21:19:36 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Fri, 11 May 2007 22:19:36 -0400 Subject: [dba-SQLServer] VB.Net - Raw data file transform Message-ID: <20070512021936.5A940BDA4@smtp-auth.no-ip.com> Well guys, I have ported the application that transformed my raw data into a pipe delimited "csv" file - from VBA in Access to VB.NET. Preliminary results VERY crudely timed show about 10K records / second, up from about 1K records / second in VBA. I really need to find a timer class for timing code in order to get precise timings on this. As it stands now however, it looks like without any further optimizations, my raw data transform would be the bottleneck, running at 9.7K records / sec, with the SQL Server BULK INSERT running at 15K records / second. If I can get these two processes running in threads so that they process independently, I am now in a position to go pushbutton and import a 100 million record file in ~10K seconds / 166.6 minutes / 2.7 hours. Given that I will soon have the pieces to run unattended this is an acceptable rate for me. This process took days of handholding manual labor to make happen (learning stuff all the way, of course). Thanks to all who have helped me in getting the ADO happening out in VB.Net, as well as the Sproc happening in SQL Server. I still have a long way to go to get a complete app in VB.Net and SQL Server. The piece I just ported does the open / parse / strip / write to move the raw fixed width file to pipe delimited. Phase 2 begins immediately. The next piece will automate running the BULK INSERT Sproc from VB.Net, given a set of pipe delimited csv files in a directory and an existing destination table in SQL Server. 
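The open / parse / strip / write transform described above can be sketched as follows. This is purely an illustration of the logic (John's actual code is VB.NET, and the field widths here are invented, not taken from his real layout):

```python
# Illustrative sketch only: slice one fixed-width record into fields,
# strip the trailing-space padding, and join with the pipe delimiter.
# FIELD_WIDTHS is hypothetical -- the real files have ~150 fields.
FIELD_WIDTHS = [5, 30, 20]  # e.g. suffix, last_name, first_name

def fixed_to_pipe(line, widths=FIELD_WIDTHS):
    """Convert one fixed-width record to a pipe-delimited record."""
    fields, pos = [], 0
    for width in widths:
        fields.append(line[pos:pos + width].rstrip())  # drop the padding
        pos += width
    return "|".join(fields)

# A record padded out to the full field widths, as in the raw files.
record = "Jr.".ljust(5) + "Colby".ljust(30) + "John".ljust(20)
print(fixed_to_pipe(record))  # Jr.|Colby|John
```

Note that an empty field comes out as a genuinely empty string rather than a single space, which sidesteps the ANSI_PADDING caveat discussed earlier in the thread.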
Once that piece is running, I will need to learn how to run each piece in a separate thread. When both pieces are running simultaneously, I will need to reassess the speed of each piece. I am using a dual proc AMD XP 3800 with 4 gb RAM so hopefully each thread will run on a different proc, though I don't know that. From the little I know I assume there is a way to set the processor affinity of a thread. So much to learn, so little time. But this is just an awesome start and I am happy with the speed gain of the transform process achieved by moving to VB.Net. Again thanks to all who contributed. John W. Colby Colby Consulting www.ColbyConsulting.com From mwp.reid at qub.ac.uk Sun May 13 15:26:04 2007 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Sun, 13 May 2007 21:26:04 +0100 Subject: [dba-SQLServer] Here we go again!! References: <200705101618.l4AGIPRf025159@databaseadvisors.com> Message-ID: http://www.microsoft.com/sql/prodinfo/futureversion/default.mspx Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 From stuart at lexacorp.com.pg Sun May 13 16:47:48 2007 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Mon, 14 May 2007 07:47:48 +1000 Subject: [dba-SQLServer] Here we go again!! In-Reply-To: References: <200705101618.l4AGIPRf025159@databaseadvisors.com>, Message-ID: <46478784.28804.14B51005@stuart.lexacorp.com.pg> On 13 May 2007 at 21:26, Martin Reid wrote: > http://www.microsoft.com/sql/prodinfo/futureversion/default.mspx These days, whenever I see MS say "with advanced security technology" my blood runs cold and I think "How much of a PITA is *this* one going to be for developers to work with?" :-( -- Stuart From michael at ddisolutions.com.au Sun May 13 18:54:06 2007 From: michael at ddisolutions.com.au (Michael Maddison) Date: Mon, 14 May 2007 09:54:06 +1000 Subject: [dba-SQLServer] Here we go again!! 
References: <200705101618.l4AGIPRf025159@databaseadvisors.com>, <46478784.28804.14B51005@stuart.lexacorp.com.pg> Message-ID: <59A61174B1F5B54B97FD4ADDE71E7D01289740@ddi-01.DDI.local> So true, I just installed Vista for the 1st time... What a waste of time! Stoopid thing wouldn't even let me use regsvr32 when logged on as local Admin. Now it won't let me connect a remote desktop to my #1 client. I guess the good news is we won't see it for a few years yet... Sent from my working XP desktop ;-/ cheers Michael M Subject: Re: [dba-SQLServer] Here we go again!! On 13 May 2007 at 21:26, Martin Reid wrote: > http://www.microsoft.com/sql/prodinfo/futureversion/default.mspx These days, whenever I see MS say "with advanced security technology" my blood runs cold and I think "How much of a PITA is *this* one going to be for developers to work with?" :-( -- Stuart _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From Elizabeth.J.Doering at wellsfargo.com Sun May 13 21:14:42 2007 From: Elizabeth.J.Doering at wellsfargo.com (Elizabeth.J.Doering at wellsfargo.com) Date: Sun, 13 May 2007 21:14:42 -0500 Subject: [dba-SQLServer] What have I done? References: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> Message-ID: <1C2084FD2472124AB1812A5476EA3B7A016AB0FC@msgswbmnmsp04.wellsfargo.com> Help! This has been making my stomach upset all weekend. In a meeting on Friday afternoon, I apparently agreed to do the following: Plan is to have Elizabeth Doering to build the initial analytic data load then we move it into production in the new shared Oracle database. What the heck did I agree to? This won't go into production until early next year, so I have time to learn something about Oracle in the meantime. 
I have built what will be the production database, no problems there, thanks to the help of this list, but the fact of moving the data to Oracle is freaking me out just a bit. I can build a script that would move the data from one SQL Server 2005 database to another, but what changes when the destination is Oracle? Thanks, Liz Liz Doering elizabeth.j.doering at wellsfargo.com 612.667.2447 "This message may contain confidential and/or privileged information. If you are not the addressee or authorized to receive this for the addressee, you must not use, copy, disclose, or take any action based on this message or any information herein. If you have received this message in error, please advise the sender immediately by reply e-mail and delete this message. Thank you for your cooperation" From michael at ddisolutions.com.au Sun May 13 21:41:19 2007 From: michael at ddisolutions.com.au (Michael Maddison) Date: Mon, 14 May 2007 12:41:19 +1000 Subject: [dba-SQLServer] What have I done? References: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> <1C2084FD2472124AB1812A5476EA3B7A016AB0FC@msgswbmnmsp04.wellsfargo.com> Message-ID: <59A61174B1F5B54B97FD4ADDE71E7D0128974F@ddi-01.DDI.local> Hi Liz, Do you have to design the Oracle db or just move the data? Moving the data should not be a huge problem, designing an Oracle db is as complex as you need to make it ;-) I assume you have Oracle experts to handle the Oracle side? IIRC Oracle has a migration wizard which, depending on your reqs, is perhaps one option. I'm not sure about 2005 but in 2000 it is fairly straightforward (or as straightforward as anything to do with Oracle can be) to build a DTS package to transfer data to Oracle. I imagine 2005 would be comparable. You will need to test rigorously as the Oracle datatypes will be slightly different. 
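The "datatypes will be slightly different" point is worth making concrete. A rough sketch of common SQL Server-to-Oracle type choices (Python here purely as an illustration; these are typical mappings, not an authoritative migration spec, so verify each column against your Oracle DBA's standards):

```python
# Sketch of typical SQL Server -> Oracle type mappings. Not definitive --
# precision, scale, and length semantics still need per-column review.
SQLSERVER_TO_ORACLE = {
    "VARCHAR":  "VARCHAR2",
    "NVARCHAR": "NVARCHAR2",
    "DATETIME": "DATE",        # Oracle DATE also stores a time, but only to seconds
    "BIT":      "NUMBER(1)",   # Oracle has no native boolean column type
    "INT":      "NUMBER(10)",
    "MONEY":    "NUMBER(19,4)",
}

def map_type(sqlserver_type):
    """Return a plausible Oracle type for a SQL Server type, or echo it back."""
    return SQLSERVER_TO_ORACLE.get(sqlserver_type.upper(), sqlserver_type)

print(map_type("bit"))  # NUMBER(1)
```

Mismatches like DATETIME's sub-second precision versus Oracle DATE are exactly where "test rigorously" earns its keep.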
cheers Michael M -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Elizabeth.J.Doering at wellsfargo.com Sent: Monday, 14 May 2007 12:15 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] What have I done? Help! This has been making my stomach upset all weekend. In a meeting on Friday afternoon, I apparently agreed to do the following: Plan is to have Elizabeth Doering to build the initial analytic data load then we move it into production in the new shared Oracle database. What the heck did I agree to? This won't go into production until early next year, so I have time to learn something about Oracle in the meantime. I have built what will be the production database, no problems there, thanks to the help of this list, but the fact of moving the data to Oracle is freaking me out just a bit. I can build a script that would move the data from one SQL Server 2005 database to another, but what changes when the destination is Oracle? Thanks, Liz Liz Doering elizabeth.j.doering at wellsfargo.com 612.667.2447 "This message may contain confidential and/or privileged information. If you are not the addressee or authorized to receive this for the addressee, you must not use, copy, disclose, or take any action based on this message or any information herein. If you have received this message in error, please advise the sender immediately by reply e-mail and delete this message. Thank you for your cooperation" _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From garykjos at gmail.com Sun May 13 22:49:45 2007 From: garykjos at gmail.com (Gary Kjos) Date: Sun, 13 May 2007 22:49:45 -0500 Subject: [dba-SQLServer] What have I done? 
In-Reply-To: <1C2084FD2472124AB1812A5476EA3B7A016AB0FC@msgswbmnmsp04.wellsfargo.com> References: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> <1C2084FD2472124AB1812A5476EA3B7A016AB0FC@msgswbmnmsp04.wellsfargo.com> Message-ID: I'm hoping your company either already has or will be hiring one or more database administrators for this Oracle database? Or is that to be your baby too? If so, I would get yourself into Oracle DBA classes asap. Lots and lots of information is available at Oracle.com website. Lots of downloadable software and documentation in pdf format. If you do have DBA's you need to get to know them. They are your friend. Good luck. You can do it! GK On 5/13/07, Elizabeth.J.Doering at wellsfargo.com wrote: > Help! This has been making my stomach upset all weekend. > > In a meeting on Friday afternoon, I apparently agreed to do the > following: > > Plan is to have Elizabeth Doering to build the initial analytic > data load then we move it into production in the new shared Oracle > database. > > What the heck did I agree to? This won't go into production until early > next year, so I have time to learn something about Oracle in the > meantime. I have built what will be the production database, no > problems there, thanks to the help of this list, but the fact of moving > the data to Oracle is freaking me out just a bit. I can build a script > that would move the data from one SQL Server 2005 database to another, > but what changes when the destination is Oracle? > > > Thanks, > > > Liz > > > Liz Doering > elizabeth.j.doering at wellsfargo.com > 612.667.2447 > > > "This message may contain confidential and/or privileged information. If > you are not the addressee or authorized to receive this for the > addressee, you must not use, copy, disclose, or take any action based on > this message or any information herein. If you have received this > message in error, please advise the sender immediately by reply e-mail > and delete this message. 
Thank you for your cooperation" > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- Gary Kjos garykjos at gmail.com From jwcolby at colbyconsulting.com Mon May 14 08:29:01 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Mon, 14 May 2007 09:29:01 -0400 Subject: [dba-SQLServer] Thanks for the help In-Reply-To: <009b01c790b7$fe185190$800101df@fci.local> Message-ID: <20070514132900.447BBBE0D@smtp-auth.no-ip.com> James,

>For your question on pulling records from SQL Server, if you are using SQL Server 2005, you can use:

With [tblInf] as (select ROW_NUMBER() over (Order By [PKID]) as 'ROWNUMBER'
    ,[PKID]
    ,[FName]
    ,[MName]
    ,[LName]
    ,[Address]
    ,[Address2]
    ,[City]
    ,[State]
    ,[Zip]
    ,[Zip4] from [tblInfutor])
Select *
from [tblInf]
where ROWNUMBER between 10000 and 12000

***

I used this, which works as advertised. The only issue is that it displays the RowNumber column, which I need to suppress if at all possible. These are being exported out to my address validation package and it doesn't want the line number. Is it possible to suppress displaying the RowNumber field? John W.
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of James Barash Sent: Monday, May 07, 2007 10:57 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Thanks for the help John: Here is a basic call to a stored procedure from VBA with two parameters:

Public Sub SetOrderStatus(ID As Long, Status As Long)
    Dim conn As ADODB.Connection
    Dim cmd As ADODB.Command
    Set conn = New ADODB.Connection
    conn.ConnectionString = "Insert connection string here"
    conn.Open
    Set cmd = New ADODB.Command
    With cmd
        Set .ActiveConnection = conn
        .CommandType = adCmdStoredProc
        .CommandText = "sp_SetOrderStatus"
        .Parameters.Append cmd.CreateParameter("@ID", adInteger, adParamInput, , ID)
        .Parameters.Append cmd.CreateParameter("Status", adInteger, adParamInput, , Status)
        .Execute
    End With
    Set cmd = Nothing
    conn.Close
    Set conn = Nothing
End Sub

For your question on pulling records from SQL Server, if you are using SQL Server 2005, you can use:

With OrdersA as (select [Order Number], ROW_NUMBER() over (Order By [Order ID]) as 'ROWNUMBER' from Orders)
Select * from OrdersA where ROWNUMBER between 100 and 200

Hope that helps. James Barash From rl_stewart at highstream.net Mon May 14 08:37:32 2007 From: rl_stewart at highstream.net (Robert L. Stewart) Date: Mon, 14 May 2007 08:37:32 -0500 Subject: [dba-SQLServer] Reciprocity In-Reply-To: References: Message-ID: <200705141338.l4EDcsaE006603@databaseadvisors.com> John, As an education for you and probably some others here... The following creates a system-based default so that constraints at the column level do not have to be created.

CREATE DEFAULT dbo.Default_0 as 0
GO

The following binds a default to a column in a specific table.
EXEC sys.sp_bindefault @defname=N'[dbo].[Default_0]',
    @objname=N'[dbo].[tsysImportDefinition].[ArchivedFlag]'
GO

The code for vwImportDefColumns was just the SQL statement to create the view used in the stored procedure. The following is the stored procedure for creating the table from the definitions you stored in the 2 tables that would have been created in the code that ran before it. I have added additional comments, areas with -- in front of them, to try and make things clearer.

CREATE PROCEDURE [dbo].[pImportCountryFile]
AS
BEGIN
-- sets the record count off so records affected is not returned
SET NOCOUNT ON;
-- declare the variables you will need for the country columns
DECLARE @CountryCode VARCHAR(3),
    @CountryName VARCHAR(100),
    @ColumnName VARCHAR(64),
    @ColumnStart INT,
    @ColumnEnd INT,
    @ColumnDataType VARCHAR(15),
    @ColumnLength VARCHAR(10), -- column length defined as varchar to take MAX as a parameter
    @ColumnPrecision TINYINT,
    @ColumnScale TINYINT,
    @DBName VARCHAR(64),
    @DBTableName VARCHAR(64)
-- Declare the variable you will need for your dynamic SQL statements
DECLARE @CreateTable NVARCHAR(4000)
-- the following code is generic for creating the table
-- the only thing that would be changed is being able to
-- pass into the proc the name of the table you want to
-- do the build of. for your system, it would probably be
-- the client job information
-- create a cursor.
-- this works like a read-only forward scrolling recordset does in Access
DECLARE cColumns CURSOR FOR
    SELECT DBName, DBTableName, ColumnName, ColumnStart, ColumnEnd,
        ColumnDataType, ColumnLength, ColumnPrecision, ColumnScale
    FROM dbo.vwImportDefColumns
    WHERE DBTableName = 'tlkpCountry'
-- opens the cursor for use
OPEN cColumns
-- reads the first record of the select statement into the
-- variables listed
FETCH NEXT FROM cColumns INTO @DBName, @DBTableName, @ColumnName,
    @ColumnStart, @ColumnEnd, @ColumnDataType, @ColumnLength,
    @ColumnPrecision, @ColumnScale
-- build the create table sql
SET @CreateTable = 'CREATE TABLE dbo.' + @DBTableName + ' (' +
    'PK_ID int identity(1,1), '
-- the system variable @@FETCH_STATUS will return 0 as long as there are records
WHILE (@@FETCH_STATUS = 0) -- There are records
-- you must enclose things and group them inside of BEGIN...END
-- when there is more than one statement that you want to execute
BEGIN
    -- build the create table sql
    -- CHARINDEX checks for the existence of one string inside of another
    IF (CHARINDEX('varchar', @ColumnDataType, 1) > 0)
    BEGIN
        SET @CreateTable = @CreateTable + @ColumnName + ' ' +
            @ColumnDataType + '(' + @ColumnLength + '), '
    END
    ELSE IF (CHARINDEX('int', @ColumnDataType, 1) > 0)
    BEGIN
        SET @CreateTable = @CreateTable + @ColumnName + ' ' + @ColumnDataType + ', '
    END
    ELSE IF (CHARINDEX('date', @ColumnDataType, 1) > 0)
    BEGIN
        SET @CreateTable = @CreateTable + @ColumnName + ' ' + @ColumnDataType + ', '
    END
    ELSE IF (CHARINDEX('text', @ColumnDataType, 1) > 0)
    BEGIN
        SET @CreateTable = @CreateTable + @ColumnName + ' ' + @ColumnDataType + ', '
    END
    ELSE -- money, numeric or decimal
    BEGIN
        -- CAST converts between data types -- you can also use CONVERT
        SET @CreateTable = @CreateTable + @ColumnName + ' ' + @ColumnDataType + '(' +
            CAST(@ColumnPrecision AS varchar(10)) + ',' +
            CAST(@ColumnScale AS varchar(10)) + '), '
    END
    -- read the next record into the variables
    FETCH NEXT FROM cColumns INTO @DBName, @DBTableName,
        @ColumnName, @ColumnStart, @ColumnEnd, @ColumnDataType,
        @ColumnLength, @ColumnPrecision, @ColumnScale
END
-- trim the final , from the end of the string
-- substring is the same as the MID function in VBA
SET @CreateTable = substring(@CreateTable, 1, len(@CreateTable) - 1) + ')'
-- Prints out the SQL statement so we can see it
PRINT @CreateTable
-- Execute the create table statement
-- uncomment the following line to actually execute the SQL
-- statement that was built
-- EXEC sp_ExecuteSql @CreateTable
-- clean up the CURSOR by closing and deallocating it
CLOSE cColumns
DEALLOCATE cColumns
END

I hope that helps some with what it does. If anyone has questions about it, let me know. By the way John, with the CLR integration, SQL Server can run the VB.net code you came up with for the initial processing also. Robert At 06:25 PM 5/11/2007, you wrote: >Date: Fri, 11 May 2007 14:34:31 -0400 >From: "jwcolby" >Subject: Re: [dba-SQLServer] Reciprocity >To: >Message-ID: <20070511183431.06052BCF5 at smtp-auth.no-ip.com> >Content-Type: text/plain; charset="us-ascii" > >Robert, > >I do thank you for your assistance. The scripts are way cool from what I can >understand of them. Please understand that I kind of work from what I need >now to what I need eventually and this appears to be DEFINITELY what I need >eventually. However I am, as we speak, migrating my VBA code to VB.NET. >That has pretty much captured my entire attention. > >Perhaps you do not realize that I am VERY inexperienced in SQL Server, in >fact have only ever written and am using exactly one Sproc. Thus I am >completely unable to grasp the entirety of what you are trying to >accomplish. I have not tested this stuff because I don't even know how. I >understand everything down through the tsysImportDefinition (at least on a >conceptual level), though I do not understand the EXEC sys.SP_BindDefault... >No clue at all there. vwImportDefColumns not a clue.
pImportCountryFile >not a clue, though I assume that perhaps this is an "example" of using it? >This appears to be the actual meat of the matter but it is so far over my >head that "I'll have to get back to you" on that one. > >I have no doubt that whatever it does, it does well, and hopefully I will be >able to use it soon. As for today.... I just don't have the SQL Server >skills to apply it. > >As for "Should I bother with proceeding with the text file read?", I don't >know. How can I answer that? You are apparently providing an all SQL >solution to the problem which is really cool, and provides me with example >code of how to do things, which is also really cool, but until I can absorb >it I am not sure I can use it. > >If you have the time to explain this stuff, line by line or block by block, >then perhaps yes, let's continue. I am not stupid, just ignorant. This >looks like an education in progress. Of course I am still in first grade. >;-) > >I do think this would make a fascinating thread though if you do get down >and explain it. > >John W. Colby From fuller.artful at gmail.com Mon May 14 09:02:47 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Mon, 14 May 2007 10:02:47 -0400 Subject: [dba-SQLServer] What have I done? In-Reply-To: References: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> <1C2084FD2472124AB1812A5476EA3B7A016AB0FC@msgswbmnmsp04.wellsfargo.com> Message-ID: <29f585dd0705140702t6d11afa1w26fb7cd79956bbd@mail.gmail.com> I second and third that emotion about having Oracle DBAs. If your firm has not budgeted for this, they are in for a rude shock. It is much more necessary on Oracle to have a DBA than on SQL Server, where it is also necessary but less so in the sense of life-or-death. That aside, Oracle has a wizard that can migrate Access data to Oracle and it ought to handle almost every data-type transformation required. I have used it a few times, not tested it exhaustively with trick questions, but it worked just fine for me. 
Arthur On 5/13/07, Gary Kjos wrote: > > I'm hoping your company either already has or will be hiring one or > more database administrators for this Oracle database? Or is that to > be your baby too? If so, I would get yourself into Oracle DBA classes > asap. Lots and lots of information is available at Oracle.com website. > Lots of downloadable software and documentation in pdf format. If you > do have DBA's you need to get to know them. They are your friend. > > Good luck. You can do it! > > GK > > On 5/13/07, Elizabeth.J.Doering at wellsfargo.com > wrote: > > Help! This has been making my stomach upset all weekend. > > > > In a meeting on Friday afternoon, I apparently agreed to do the > > following: > > > > Plan is to have Elizabeth Doering to build the initial analytic > > data load then we move it into production in the new shared Oracle > > database. > > > > What the heck did I agree to? This won't go into production until early > > next year, so I have time to learn something about Oracle in the > > meantime. I have built what will be the production database, no > > problems there, thanks to the help of this list, but the fact of moving > > the data to Oracle is freaking me out just a bit. I can build a script > > that would move the data from one SQL Server 2005 database to another, > > but what changes when the destination is Oracle? > > > > > > Thanks, > > > > > > Liz > > > > > > Liz Doering > > elizabeth.j.doering at wellsfargo.com > > 612.667.2447 > > > > > > "This message may contain confidential and/or privileged information. If > > you are not the addressee or authorized to receive this for the > > addressee, you must not use, copy, disclose, or take any action based on > > this message or any information herein. If you have received this > > message in error, please advise the sender immediately by reply e-mail > > and delete this message. 
Thank you for your cooperation" > > > > > > _______________________________________________ > > dba-SQLServer mailing list > > dba-SQLServer at databaseadvisors.com > > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > > http://www.databaseadvisors.com > > > > > > > -- > Gary Kjos > garykjos at gmail.com > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From fuller.artful at gmail.com Mon May 14 09:19:29 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Mon, 14 May 2007 10:19:29 -0400 Subject: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested In-Reply-To: References: Message-ID: <29f585dd0705140719u5b8c149brb43c2fdc5ce78a1@mail.gmail.com> In my experience, the easiest solution was to use static functions in the first query, and then base the crosstab query on that result set. For the SQL Server folks on this list, there is an amazing stored procedure available at Simple-Talk (www.simple-talk.com) that dynamically generates cross-tabs as complex as you'd like. One of the big problems with cross-tabs is that the number of columns might change from run to run. This stored procedure gets around that. In the interests of transparency, let me admit that I write for Simple Talk, although I did not write this article and wish that I had. The stored procedure is amazing. Arthur On 5/14/07, Gustav Brock wrote: > > Hi Thomas > > Here's an article on using ADO and parameters: > > http://support.microsoft.com/kb/225897/en-us > > Also, look up in the archives subject "ADO code stopped working" from > early February this year. > > If it works now with DAO, I would leave it except, of course, if this is a > learning experience. > > /gustav > > >>> ewaldt at gdls.com 14-05-2007 13:10 >>> > I am running into a problem with a cross tab query. 
> > I have a report based on a parameter query, which is in turn based on a > cross tab query (which is based on the same parameter), and ADO doesn't > seem happy. On MSFT's site, they use DAO in their (very complex) example, > and they show how to specify a parameter (qdf = a QueryDef, and > qdf.Parameters(xxx) = yyy). I really prefer to use ADO (trying to learn > it), but I don't see how to specify a parameter's value in ADO. I'm > assuming that's the problem , because Access keeps saying that I'm not > specifiying required info. Also, when I went through and replaced all > instances of parameters in the queries (query based on queries based on > queries) with solid numbers, it worked. In the actual queries, the > parameter is: > > [Forms]![frmWeeklyData]![fraMonths] > > This simply refers to a frame containing option buttons so that I can > specify the month I'm interested in. The month's number is then used by > the queries. > > Running the queries without the report works just fine. However, since a > cross tab query is involved, and there can be varying numbers of columns, > I have to use dynamic columns in the report, and that's where complexity > rears its ugly head. The parameter query (that calls the cross tab query) > is necessary because I have information in addition to the cross tab query > itself which is needed in the report. > > Here's the code portion that Access highlights: > > rst.Open _ > Source:="qfrmWeeklyData", _ > ActiveConnection:=CurrentProject.Connection, _ > Options:=adCmdTable > > I'd greatly appreciate any help with this. > > > Thomas F. 
Ewald > Stryker Mass Properties > General Dynamics Land Systems > > > -- > AccessD mailing list > AccessD at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/accessd > Website: http://www.databaseadvisors.com > From fuller.artful at gmail.com Mon May 14 09:22:48 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Mon, 14 May 2007 10:22:48 -0400 Subject: [dba-SQLServer] Thanks for the help In-Reply-To: <20070514132900.447BBBE0D@smtp-auth.no-ip.com> References: <009b01c790b7$fe185190$800101df@fci.local> <20070514132900.447BBBE0D@smtp-auth.no-ip.com> Message-ID: <29f585dd0705140722u4c95e872vd6fd7e3169d39190@mail.gmail.com> Just replace your SELECT * part of the statement with an explicit list of the columns you want. Leave out the Row_Number() one and include all the others. hth, Arthur On 5/14/07, jwcolby wrote: > > James, > > >For your question on pulling records from SQL Server, if you are using > SQL > Server 2005, you can use: > > With [tblInf] as (select ROW_NUMBER() over (Order By [PKID]) as > 'ROWNUMBER' > ,[PKID] > ,[FName] > ,[MName] > ,[LName] > ,[Address] > ,[Address2] > ,[City] > ,[State] > ,[Zip] > ,[Zip4] from [tblInfutor]) > Select * > from [tblInf] > where ROWNUMBER between 10000 and 12000 > > *** > > I used this which works as advertised. The only issue is that it displays > the RowNumber column which I need to suppress if at all possible. These > are > being exported out to my address vaildation package and it doesn't want > the > line number. > > Is it possible to suppress displaying the RowNumber field? > > > John W. 
Colby > Colby Consulting > www.ColbyConsulting.com > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of James > Barash > Sent: Monday, May 07, 2007 10:57 AM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] Thanks for the help > > John: > > Here is a basic call to a stored procedure from VBA with two parameter: > > Public Sub SetOrderStatus(ID as Long, Status as Long) Dim conn As > ADODB.Connection Dim cmd As ADODB.command Set conn = New ADODB.Connection > conn.ConnectionString = "Insert connection string here" > conn.Open > Set cmd = New ADODB.command > With cmd > Set .ActiveConnection = conn > .CommandType = adCmdStoredProc > .CommandText = "sp_SetOrderStatus" > .Parameters.Append cmd.CreateParameter("@ID", adInteger, adParamInput, > , > ID) > .Parameters.Append cmd.CreateParameter("Status", adInteger, > adParamInput, , Status) > .Execute > End With > Set cmd = Nothing > conn.Close > Set conn = Nothing > End Sub > > For your question on pulling records from SQL Server, if you are using SQL > Server 2005, you can use: > > With OrdersA as (select [Order Number], ROW_NUMBER() over (Order By [Order > ID]) as 'ROWNUMBER' from Orders) > Select * from OrdersA where ROWNUMBER between 100 and 200 > > Hope that helps. > > James Barash > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From jwcolby at colbyconsulting.com Mon May 14 09:46:41 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Mon, 14 May 2007 10:46:41 -0400 Subject: [dba-SQLServer] Thanks for the help In-Reply-To: <29f585dd0705140722u4c95e872vd6fd7e3169d39190@mail.gmail.com> Message-ID: <20070514144639.F3460BC63@smtp-auth.no-ip.com> Thanks Arthur, worked like a champ. 
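[For reference, Arthur's fix applied to the query from this thread - naming the columns in the outer SELECT instead of using *, so the ROWNUMBER helper column is not returned - would look something like this. A sketch only; table and column names are assumed from the tblInfutor example quoted above.]

```sql
-- Explicit outer column list suppresses the ROWNUMBER column in the output.
With [tblInf] as
    (select ROW_NUMBER() over (Order By [PKID]) as 'ROWNUMBER'
        ,[PKID], [FName], [MName], [LName]
        ,[Address], [Address2], [City], [State], [Zip], [Zip4]
     from [tblInfutor])
Select [PKID], [FName], [MName], [LName],
       [Address], [Address2], [City], [State], [Zip], [Zip4]
from [tblInf]
where ROWNUMBER between 10000 and 12000
```

ROWNUMBER is still available to the WHERE clause for paging; it is simply left out of the final select list.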
There are times I feel (and appear, I am sure) so stupid. I was so focused on getting the WITH clause functioning that I completely missed the * down below. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Monday, May 14, 2007 10:23 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Thanks for the help Just replace your SELECT * part of the statement with an explicit list of the columns you want. Leave out the Row_Number() one and include all the others. hth, Arthur From accessd at shaw.ca Mon May 14 10:43:33 2007 From: accessd at shaw.ca (Jim Lawrence) Date: Mon, 14 May 2007 08:43:33 -0700 Subject: [dba-SQLServer] What have I done? In-Reply-To: <29f585dd0705140702t6d11afa1w26fb7cd79956bbd@mail.gmail.com> Message-ID: <0JI1002TZFGMTQV5@l-daemon> Just a note: There is a real demand for Oracle DBAs. If you have experience in Oracle as well, the amount that can be demanded for wage/contract work can go over the moon. A local Oracle instructor teaches 3 months a year and works on contract the rest. He contracts out, stateside, at $200.00 per hour plus bed, board and transportation. A good Oracle DBA can be an expensive investment.... but as I understand it well worth it. Just a comment Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Monday, May 14, 2007 7:03 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] What have I done? I second and third that emotion about having Oracle DBAs. If your firm has not budgeted for this, they are in for a rude shock. It is much more necessary on Oracle to have a DBA than on SQL Server, where it is also necessary but less so in the sense of life-or-death.
That aside, Oracle has a wizard that can migrate Access data to Oracle and it ought to handle almost every data-type transformation required. I have used it a few times, not tested it exhaustively with trick questions, but it worked just fine for me. Arthur On 5/13/07, Gary Kjos wrote: > > I'm hoping your company either already has or will be hiring one or > more database administrators for this Oracle database? Or is that to > be your baby too? If so, I would get yourself into Oracle DBA classes > asap. Lots and lots of information is available at Oracle.com website. > Lots of downloadable software and documentation in pdf format. If you > do have DBA's you need to get to know them. They are your friend. > > Good luck. You can do it! > > GK > > On 5/13/07, Elizabeth.J.Doering at wellsfargo.com > wrote: > > Help! This has been making my stomach upset all weekend. > > > > In a meeting on Friday afternoon, I apparently agreed to do the > > following: > > > > Plan is to have Elizabeth Doering to build the initial analytic > > data load then we move it into production in the new shared Oracle > > database. > > > > What the heck did I agree to? This won't go into production until early > > next year, so I have time to learn something about Oracle in the > > meantime. I have built what will be the production database, no > > problems there, thanks to the help of this list, but the fact of moving > > the data to Oracle is freaking me out just a bit. I can build a script > > that would move the data from one SQL Server 2005 database to another, > > but what changes when the destination is Oracle? > > > > > > Thanks, > > > > > > Liz > > > > > > Liz Doering > > elizabeth.j.doering at wellsfargo.com > > 612.667.2447 > > > > > > "This message may contain confidential and/or privileged information. 
If > > you are not the addressee or authorized to receive this for the > > addressee, you must not use, copy, disclose, or take any action based on > > this message or any information herein. If you have received this > > message in error, please advise the sender immediately by reply e-mail > > and delete this message. Thank you for your cooperation" > > > > > > _______________________________________________ > > dba-SQLServer mailing list > > dba-SQLServer at databaseadvisors.com > > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > > http://www.databaseadvisors.com > > > > > > > -- > Gary Kjos > garykjos at gmail.com > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Mon May 14 10:52:35 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Mon, 14 May 2007 11:52:35 -0400 Subject: [dba-SQLServer] What have I done? In-Reply-To: <0JI1002TZFGMTQV5@l-daemon> Message-ID: <20070514155234.E01E9BDBD@smtp-auth.no-ip.com> >That aside, Oracle has a wizard that can migrate Access data to Oracle and it ought to handle almost every data-type transformation required. I have used it a few times, not tested it exhaustively with trick questions, but it worked just fine for me. So there is a wizard available to gets you in over your head quickly and easily. ;-) I love it! John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Monday, May 14, 2007 11:44 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] What have I done? Just a note: There is a real demand for Oracle DBAs. If you have experience as well in Oracle the amount that can be demanded for wage/contract work can go over the moon. A local Oracle instructor teaches 3 months a years and works on contract the rest. He contracts out, states side, at $200.00 per hour plus bed, board and transportation. A good Oracle DBA can be an expensive investment.... but as I understand it well worth it. Just a comment Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Monday, May 14, 2007 7:03 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] What have I done? I second and third that emotion about having Oracle DBAs. If your firm has not budgeted for this, they are in for a rude shock. It is much more necessary on Oracle to have a DBA than on SQL Server, where it is also necessary but less so in the sense of life-or-death. That aside, Oracle has a wizard that can migrate Access data to Oracle and it ought to handle almost every data-type transformation required. I have used it a few times, not tested it exhaustively with trick questions, but it worked just fine for me. Arthur On 5/13/07, Gary Kjos wrote: > > I'm hoping your company either already has or will be hiring one or > more database administrators for this Oracle database? Or is that to > be your baby too? If so, I would get yourself into Oracle DBA classes > asap. Lots and lots of information is available at Oracle.com website. > Lots of downloadable software and documentation in pdf format. 
If you > do have DBA's you need to get to know them. They are your friend. > > Good luck. You can do it! > > GK > > On 5/13/07, Elizabeth.J.Doering at wellsfargo.com > wrote: > > Help! This has been making my stomach upset all weekend. > > > > In a meeting on Friday afternoon, I apparently agreed to do the > > following: > > > > Plan is to have Elizabeth Doering to build the initial > > analytic data load then we move it into production in the new shared > > Oracle database. > > > > What the heck did I agree to? This won't go into production until > > early next year, so I have time to learn something about Oracle in > > the meantime. I have built what will be the production database, no > > problems there, thanks to the help of this list, but the fact of > > moving the data to Oracle is freaking me out just a bit. I can > > build a script that would move the data from one SQL Server 2005 > > database to another, but what changes when the destination is Oracle? > > > > > > Thanks, > > > > > > Liz > > > > > > Liz Doering > > elizabeth.j.doering at wellsfargo.com > > 612.667.2447 > > > > > > "This message may contain confidential and/or privileged > > information. If you are not the addressee or authorized to receive > > this for the addressee, you must not use, copy, disclose, or take > > any action based on this message or any information herein. If you > > have received this message in error, please advise the sender > > immediately by reply e-mail and delete this message. 
Thank you for your cooperation" > > > > > > _______________________________________________ > > dba-SQLServer mailing list > > dba-SQLServer at databaseadvisors.com > > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > > http://www.databaseadvisors.com > > > > > > > -- > Gary Kjos > garykjos at gmail.com > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From ebarro at verizon.net Mon May 14 10:59:09 2007 From: ebarro at verizon.net (Eric Barro) Date: Mon, 14 May 2007 08:59:09 -0700 Subject: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested In-Reply-To: <29f585dd0705140719u5b8c149brb43c2fdc5ce78a1@mail.gmail.com> Message-ID: <0JI1009VWGDX8395@vms046.mailsrvcs.net> Arthur, Do you have the link to the article? Eric -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Monday, May 14, 2007 7:19 AM To: Access Developers discussion and problem solving Cc: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested In my experience, the easiest solution was to use static functions in the first query, and then base the crosstab query on that result set. 
For the SQL Server folks on this list, there is an amazing stored procedure available at Simple-Talk (www.simple-talk.com) that dynamically generates cross-tabs as complex as you'd like. One of the big problems with cross-tabs is that the number of columns might change from run to run. This stored procedure gets around that. In the interests of transparency, let me admit that I write for Simple Talk, although I did not write this article and wish that I had. The stored procedure is amazing. Arthur On 5/14/07, Gustav Brock wrote: > > Hi Thomas > > Here's an article on using ADO and parameters: > > http://support.microsoft.com/kb/225897/en-us > > Also, look up in the archives subject "ADO code stopped working" from > early February this year. > > If it works now with DAO, I would leave it except, of course, if this > is a learning experience. > > /gustav > > >>> ewaldt at gdls.com 14-05-2007 13:10 >>> > I am running into a problem with a cross tab query. > > I have a report based on a parameter query, which is in turn based on > a cross tab query (which is based on the same parameter), and ADO > doesn't seem happy. On MSFT's site, they use DAO in their (very > complex) example, and they show how to specify a parameter (qdf = a > QueryDef, and > qdf.Parameters(xxx) = yyy). I really prefer to use ADO (trying to > learn it), but I don't see how to specify a parameter's value in ADO. > I'm assuming that's the problem , because Access keeps saying that I'm > not specifiying required info. Also, when I went through and replaced > all instances of parameters in the queries (query based on queries > based on > queries) with solid numbers, it worked. In the actual queries, the > parameter is: > > [Forms]![frmWeeklyData]![fraMonths] > > This simply refers to a frame containing option buttons so that I can > specify the month I'm interested in. The month's number is then used > by the queries. > > Running the queries without the report works just fine. 
However, since > a cross tab query is involved, and there can be varying numbers of > columns, I have to use dynamic columns in the report, and that's where > complexity rears its ugly head. The parameter query (that calls the > cross tab query) is necessary because I have information in addition > to the cross tab query itself which is needed in the report. > > Here's the code portion that Access highlights: > > rst.Open _ > Source:="qfrmWeeklyData", _ > ActiveConnection:=CurrentProject.Connection, _ > Options:=adCmdTable > > I'd greatly appreciate any help with this. > > > Thomas F. Ewald > Stryker Mass Properties > General Dynamics Land Systems > > > -- > AccessD mailing list > AccessD at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/accessd > Website: http://www.databaseadvisors.com > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com No virus found in this incoming message. Checked by AVG Free Edition. Version: 7.5.467 / Virus Database: 269.7.0/803 - Release Date: 5/13/2007 12:17 PM From fuller.artful at gmail.com Mon May 14 11:07:32 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Mon, 14 May 2007 12:07:32 -0400 Subject: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested In-Reply-To: <0JI1009VWGDX8395@vms046.mailsrvcs.net> References: <29f585dd0705140719u5b8c149brb43c2fdc5ce78a1@mail.gmail.com> <0JI1009VWGDX8395@vms046.mailsrvcs.net> Message-ID: <29f585dd0705140907l6e3186c6s70ab49828005c963@mail.gmail.com> Yes: http://www.simple-talk.com/sql/t-sql-programming/creating-cross-tab-queries-and-pivot-tables-in-sql/. However, you may possibly need to join to get to that URL, I'm not sure. Joining the list is free, so no worries. Arthur On 5/14/07, Eric Barro wrote: > > Arthur, > > Do you have the link to the article? 
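[Editor's aside: for the ADO parameter question quoted above, ADO can supply the form-reference parameter through an ADODB.Command instead of opening the saved query with adCmdTable. This is an untested sketch: it assumes the Jet provider exposes the saved query as a procedure and populates the Parameters collection; the KB article Gustav links is the authoritative reference for the exact syntax.]

```vba
' Sketch (untested): open the parameterized saved query qfrmWeeklyData via ADO.
Dim cmd As ADODB.Command
Dim rst As ADODB.Recordset

Set cmd = New ADODB.Command
Set cmd.ActiveConnection = CurrentProject.Connection
cmd.CommandText = "qfrmWeeklyData"
cmd.CommandType = adCmdStoredProc   ' Jet exposes saved queries as procedures

' Supply the value Access would normally pull from the open form.
' Parameters(0) is assumed to be [Forms]![frmWeeklyData]![fraMonths].
cmd.Parameters(0).Value = Forms!frmWeeklyData!fraMonths

Set rst = cmd.Execute
```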
> > Eric > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur > Fuller > Sent: Monday, May 14, 2007 7:19 AM > To: Access Developers discussion and problem solving > Cc: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested > > In my experience, the easiest solution was to use static functions in the > first query, and then base the crosstab query on that result set. > > For the SQL Server folks on this list, there is an amazing stored > procedure > available at Simple-Talk (www.simple-talk.com) that dynamically generates > cross-tabs as complex as you'd like. One of the big problems with > cross-tabs > is that the number of columns might change from run to run. This stored > procedure gets around that. > > In the interests of transparency, let me admit that I write for Simple > Talk, > although I did not write this article and wish that I had. The stored > procedure is amazing. > > Arthur > From fuller.artful at gmail.com Mon May 14 11:09:27 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Mon, 14 May 2007 12:09:27 -0400 Subject: [dba-SQLServer] Thanks for the help In-Reply-To: <20070514144639.F3460BC63@smtp-auth.no-ip.com> References: <29f585dd0705140722u4c95e872vd6fd7e3169d39190@mail.gmail.com> <20070514144639.F3460BC63@smtp-auth.no-ip.com> Message-ID: <29f585dd0705140909g147afe14tcf21c87a1558b678@mail.gmail.com> Actually, I was going to digress for about 5 paragraphs before failing to provide the answer, but something came over me and I just replied. I can't explain it. On 5/14/07, jwcolby wrote: > > Thanks Arthur, worked like a champ. > > There are times I feel (and appear, I am sure) so stupid. I was so > focused > on getting the WITH clause functioning that I completely missed the * down > below. > > John W. 
Colby > Colby Consulting > www.ColbyConsulting.com > From ebarro at verizon.net Mon May 14 11:10:55 2007 From: ebarro at verizon.net (Eric Barro) Date: Mon, 14 May 2007 09:10:55 -0700 Subject: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested In-Reply-To: <29f585dd0705140907l6e3186c6s70ab49828005c963@mail.gmail.com> Message-ID: <0JI100HQDGXHHPD2@vms040.mailsrvcs.net> Thanks Arthur...I kinda figured that joining was a pre-requisite to viewing. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Monday, May 14, 2007 9:08 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested Yes: http://www.simple-talk.com/sql/t-sql-programming/creating-cross-tab-queries- and-pivot-tables-in-sql/. However, you may possibly need to join to get to that URL, I'm not sure. Joining the list is free, so no worries. Arthur On 5/14/07, Eric Barro wrote: > > Arthur, > > Do you have the link to the article? > > Eric > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of > Arthur Fuller > Sent: Monday, May 14, 2007 7:19 AM > To: Access Developers discussion and problem solving > Cc: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested > > In my experience, the easiest solution was to use static functions in > the first query, and then base the crosstab query on that result set. > > For the SQL Server folks on this list, there is an amazing stored > procedure available at Simple-Talk (www.simple-talk.com) that > dynamically generates cross-tabs as complex as you'd like. One of the > big problems with cross-tabs is that the number of columns might > change from run to run. This stored procedure gets around that. 
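[Editor's aside: the run-to-run column problem described above is what dynamic cross-tab code solves — build the column list from the data, then execute the assembled statement. This is not the Simple-Talk procedure, just a rough sketch of the general technique; dbo.Sales is a hypothetical table and the syntax requires SQL Server 2005 or later.]

```sql
DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- Build the column list from the data itself, so new regions appear automatically.
SELECT @cols = ISNULL(@cols + ',', '') + QUOTENAME(region)
FROM (SELECT DISTINCT region FROM dbo.Sales) AS r;

-- Assemble and run the PIVOT with the discovered columns.
SET @sql = N'SELECT product, ' + @cols + '
FROM (SELECT product, region, amount FROM dbo.Sales) AS s
PIVOT (SUM(amount) FOR region IN (' + @cols + ')) AS p;';

EXEC sp_executesql @sql;
```

The report side still has to cope with the varying columns, but the query itself no longer needs editing when the data changes.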
> > In the interests of transparency, let me admit that I write for Simple > Talk, although I did not write this article and wish that I had. The > stored procedure is amazing. > > Arthur > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Mon May 14 11:29:24 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Mon, 14 May 2007 12:29:24 -0400 Subject: [dba-SQLServer] Thanks for the help In-Reply-To: <29f585dd0705140909g147afe14tcf21c87a1558b678@mail.gmail.com> Message-ID: <20070514162923.3DB4ABC12@smtp-auth.no-ip.com> ROTFLMAOBTC. I needed that! In any event, thanks for the slap up side the head. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Monday, May 14, 2007 12:09 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Thanks for the help Actually, I was going to digress for about 5 paragraphs before failing to provide the answer, but something came over me and I just replied. I can't explain it. On 5/14/07, jwcolby wrote: > > Thanks Arthur, worked like a champ. > > There are times I feel (and appear, I am sure) so stupid. I was so > focused on getting the WITH clause functioning that I completely > missed the * down below. > > John W.
Colby > Colby Consulting > www.ColbyConsulting.com > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Mon May 14 12:28:46 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Mon, 14 May 2007 10:28:46 -0700 Subject: [dba-SQLServer] Here we go again!! In-Reply-To: <59A61174B1F5B54B97FD4ADDE71E7D01289740@ddi-01.DDI.local> References: <200705101618.l4AGIPRf025159@databaseadvisors.com> <46478784.28804.14B51005@stuart.lexacorp.com.pg> <59A61174B1F5B54B97FD4ADDE71E7D01289740@ddi-01.DDI.local> Message-ID: IME, Vista works best when purchased w/ new hardware, not for upgrading... sorry it's just the way I've seen it. Every install of Vista on existing hardware ('cept vm which is why i ventured to try it on physical systems) has been a disaster. My brother in law whom is 16 absolutely loves Vista, he had a chance to get his hands on it at school, of course.. w/ new hardware. I argued that, the os is not mature enough until SP2 at least but he did not think that was the case and wanted to buy it for home, until his teacher tried to upgrade one of the existing core2duo systems at school, it went belly up and their tech people had to go in and replace the machine w/ a working xp box :). Only the newer pc's that are purchased w/ Vista seem to work well, all others beware... -- my 2 cents On 5/13/07, Michael Maddison wrote: > > So true, > > I just install Vista for the 1st time... What a waste of time! > Stoopid thing wouldn't even let me use regsvr32 when logged on as local > Admin. > Now it won't let me connect a remote desktop to my #1 client. > > I guess the good news is we wont see it for a few years yet... > > Sent from my working XP desktop ;-/ > > cheers > > > Michael M > > Subject: Re: [dba-SQLServer] Here we go again!! 
> > On 13 May 2007 at 21:26, Martin Reid wrote: > > > http://www.microsoft.com/sql/prodinfo/futureversion/default.mspx > > > These days, whenever I see MS say "with advanced security technology" my > blood runs cold and I think "How much of a PITA is *this* one going to > be for developers to work with?" :-( > > > > > -- > Stuart > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From fhtapia at gmail.com Mon May 14 12:31:16 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Mon, 14 May 2007 10:31:16 -0700 Subject: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested In-Reply-To: <29f585dd0705140719u5b8c149brb43c2fdc5ce78a1@mail.gmail.com> References: <29f585dd0705140719u5b8c149brb43c2fdc5ce78a1@mail.gmail.com> Message-ID: Arthur, :), it would help if you provided a link to the SP or article about the SP :) On 5/14/07, Arthur Fuller wrote: > > In my experience, the easiest solution was to use static functions in the > first query, and then base the crosstab query on that result set. > > For the SQL Server folks on this list, there is an amazing stored > procedure > available at Simple-Talk (www.simple-talk.com) that dynamically generates > cross-tabs as complex as you'd like. One of the big problems with > cross-tabs > is that the number of columns might change from run to run. This stored > procedure gets around that. > > In the interests of transparency, let me admit that I write for Simple > Talk, > although I did not write this article and wish that I had. The stored > procedure is amazing. 
> > Arthur > > On 5/14/07, Gustav Brock wrote: > > > > Hi Thomas > > > > Here's an article on using ADO and parameters: > > > > http://support.microsoft.com/kb/225897/en-us > > > > Also, look up in the archives subject "ADO code stopped working" from > > early February this year. > > > > If it works now with DAO, I would leave it except, of course, if this is > a > > learning experience. > > > > /gustav > > > > >>> ewaldt at gdls.com 14-05-2007 13:10 >>> > > I am running into a problem with a cross tab query. > > > > I have a report based on a parameter query, which is in turn based on a > > cross tab query (which is based on the same parameter), and ADO doesn't > > seem happy. On MSFT's site, they use DAO in their (very complex) > example, > > and they show how to specify a parameter (qdf = a QueryDef, and > > qdf.Parameters(xxx) = yyy). I really prefer to use ADO (trying to learn > > it), but I don't see how to specify a parameter's value in ADO. I'm > > assuming that's the problem , because Access keeps saying that I'm not > > specifiying required info. Also, when I went through and replaced all > > instances of parameters in the queries (query based on queries based on > > queries) with solid numbers, it worked. In the actual queries, the > > parameter is: > > > > [Forms]![frmWeeklyData]![fraMonths] > > > > This simply refers to a frame containing option buttons so that I can > > specify the month I'm interested in. The month's number is then used by > > the queries. > > > > Running the queries without the report works just fine. However, since a > > cross tab query is involved, and there can be varying numbers of > columns, > > I have to use dynamic columns in the report, and that's where complexity > > rears its ugly head. The parameter query (that calls the cross tab > query) > > is necessary because I have information in addition to the cross tab > query > > itself which is needed in the report. 
> > > > Here's the code portion that Access highlights: > > > > rst.Open _ > > Source:="qfrmWeeklyData", _ > > ActiveConnection:=CurrentProject.Connection, _ > > Options:=adCmdTable > > > > I'd greatly appreciate any help with this. > > > > > > Thomas F. Ewald > > Stryker Mass Properties > > General Dynamics Land Systems > > > > > > -- > > AccessD mailing list > > AccessD at databaseadvisors.com > > http://databaseadvisors.com/mailman/listinfo/accessd > > Website: http://www.databaseadvisors.com > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From jwcolby at colbyconsulting.com Mon May 14 12:36:04 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Mon, 14 May 2007 13:36:04 -0400 Subject: [dba-SQLServer] Here we go again!! In-Reply-To: Message-ID: <20070514173603.82114BD33@smtp-auth.no-ip.com> Even there you have a huge learning curve trying to figure out the security implications in your network etc. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Monday, May 14, 2007 1:29 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Here we go again!! IME, Vista works best when purchased w/ new hardware, not for upgrading... sorry it's just the way I've seen it. Every install of Vista on existing hardware ('cept vm which is why i ventured to try it on physical systems) has been a disaster. My brother in law whom is 16 absolutely loves Vista, he had a chance to get his hands on it at school, of course.. w/ new hardware. 
I argued that, the os is not mature enough until SP2 at least but he did not think that was the case and wanted to buy it for home, until his teacher tried to upgrade one of the existing core2duo systems at school, it went belly up and their tech people had to go in and replace the machine w/ a working xp box :). Only the newer pc's that are purchased w/ Vista seem to work well, all others beware... -- my 2 cents From fhtapia at gmail.com Mon May 14 13:33:52 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Mon, 14 May 2007 11:33:52 -0700 Subject: [dba-SQLServer] [AccessD] Cross Tab Query Help Requested In-Reply-To: References: <29f585dd0705140719u5b8c149brb43c2fdc5ce78a1@mail.gmail.com> Message-ID: ok, found it http://www.simple-talk.com/sql/t-sql-programming/creating-cross-tab-queries-and-pivot-tables-in-sql/ On 5/14/07, Francisco Tapia wrote: > > Arthur, :), it would help if you provided a link to the SP or article > about the SP :) > > On 5/14/07, Arthur Fuller < fuller.artful at gmail.com> wrote: > > > > In my experience, the easiest solution was to use static functions in > > the > > first query, and then base the crosstab query on that result set. > > > > For the SQL Server folks on this list, there is an amazing stored > > procedure > > available at Simple-Talk (www.simple-talk.com ) that dynamically > > generates > > cross-tabs as complex as you'd like. One of the big problems with > > cross-tabs > > is that the number of columns might change from run to run. This stored > > procedure gets around that. > > > > In the interests of transparency, let me admit that I write for Simple > > Talk, > > although I did not write this article and wish that I had. The stored > > procedure is amazing. 
> > > > Arthur > > > > On 5/14/07, Gustav Brock < Gustav at cactus.dk> wrote: > > > > > > Hi Thomas > > > > > > Here's an article on using ADO and parameters: > > > > > > http://support.microsoft.com/kb/225897/en-us > > > > > > Also, look up in the archives subject "ADO code stopped working" from > > > early February this year. > > > > > > If it works now with DAO, I would leave it except, of course, if this > > is a > > > learning experience. > > > > > > /gustav > > > > > > >>> ewaldt at gdls.com 14-05-2007 13:10 >>> > > > I am running into a problem with a cross tab query. > > > > > > I have a report based on a parameter query, which is in turn based on > > a > > > cross tab query (which is based on the same parameter), and ADO > > doesn't > > > seem happy. On MSFT's site, they use DAO in their (very complex) > > example, > > > and they show how to specify a parameter (qdf = a QueryDef, and > > > qdf.Parameters(xxx) = yyy). I really prefer to use ADO (trying to > > learn > > > it), but I don't see how to specify a parameter's value in ADO. I'm > > > assuming that's the problem , because Access keeps saying that I'm not > > > specifiying required info. Also, when I went through and replaced all > > > instances of parameters in the queries (query based on queries based > > on > > > queries) with solid numbers, it worked. In the actual queries, the > > > parameter is: > > > > > > [Forms]![frmWeeklyData]![fraMonths] > > > > > > This simply refers to a frame containing option buttons so that I can > > > specify the month I'm interested in. The month's number is then used > > by > > > the queries. > > > > > > Running the queries without the report works just fine. However, since > > a > > > cross tab query is involved, and there can be varying numbers of > > columns, > > > I have to use dynamic columns in the report, and that's where > > complexity > > > rears its ugly head. 
The parameter query (that calls the cross tab > > query) > > > is necessary because I have information in addition to the cross tab > > query > > > itself which is needed in the report. > > > > > > Here's the code portion that Access highlights: > > > > > > rst.Open _ > > > Source:="qfrmWeeklyData", _ > > > ActiveConnection:=CurrentProject.Connection , _ > > > Options:=adCmdTable > > > > > > I'd greatly appreciate any help with this. > > > > > > > > > Thomas F. Ewald > > > Stryker Mass Properties > > > General Dynamics Land Systems > > > > > > > > > -- > > > AccessD mailing list > > > AccessD at databaseadvisors.com > > > http://databaseadvisors.com/mailman/listinfo/accessd > > > Website: http://www.databaseadvisors.com > > > > > _______________________________________________ > > dba-SQLServer mailing list > > dba-SQLServer at databaseadvisors.com > > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > > http://www.databaseadvisors.com > > > > > > > -- > -Francisco > http://sqlthis.blogspot.com | Tsql and More... -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From martyconnelly at shaw.ca Mon May 14 17:40:31 2007 From: martyconnelly at shaw.ca (MartyConnelly) Date: Mon, 14 May 2007 15:40:31 -0700 Subject: [dba-SQLServer] SQL Server - Turning comparisons into Booleans In-Reply-To: References: <20070511164401.6BED3BC02@smtp-auth.no-ip.com> Message-ID: <4648E55F.3090709@shaw.ca> Here are some sites on Sharepoint, you might want to look through SharePoint User Group UK http://suguk.org/ Meridio's site for documentation on Sharepoint. http://www.meridio.com Heck these guys designed the Sharepoint sites for UK MoD and their office HQ is just down the road from you in Belfast. Queen's Rd. Maybe you can sweet talk for advice. 
Microsoft and AIIM present Advanced Records Management and Information Lifecycle Management using Office SharePoint Server 2007 http://www.microsoft.com/ireland/technet/default.mspx Inside SharePoint Authorization http://mssharepoint.advisorguide.com/doc/18977 Martin Reid wrote: >Yip up to my neck in Sharepoint and having similar problems etc with trying to find out how the guts of the thing work. Took me two hours this evening trying to find out how to add a value to the search drop down list in MOSS 2007. > >Martin > >Martin WP Reid >Training and Assessment Unit >Riddle Hall >Belfast > >tel: 02890 974465 > > > -- Marty Connelly Victoria, B.C. Canada From jlawrenc1 at shaw.ca Mon May 14 20:19:35 2007 From: jlawrenc1 at shaw.ca (Jim Lawrence) Date: Mon, 14 May 2007 18:19:35 -0700 Subject: [dba-SQLServer] OT SharePoint In-Reply-To: <4648E55F.3090709@shaw.ca> Message-ID: <0JI200KPP64MYZ30@l-daemon> Hi All: This is going to seem OT but maybe not. I will ask a simple question... What is the primary reason for SharePoint? What is it used for? Why would someone use it as opposed to just creating a web page? (No I am not writing an article on it just trying to figure out whether it is worth installing.) Jim From szayko at secor.com Tue May 15 11:24:38 2007 From: szayko at secor.com (Steve Zayko) Date: Tue, 15 May 2007 09:24:38 -0700 Subject: [dba-SQLServer] OT SharePoint In-Reply-To: <0JI200KPP64MYZ30@l-daemon> Message-ID: <5D71EC0BA06F7C41B6088EF70CA4014EB3320E@exchangecolo.secor.com> We use Sharepoint internally at SECOR. It is good for a web-based document storage system. It helps with versioning because users may "check out" a document, make edits, and check it back in with changes saved. While it is checked out, no other users may view the document. It has some flexibility to make the interface look like web pages, but it is essentially a glorified document warehouse. -Z Stephen R. 
Zayko PE SECOR International Inc 2321 Club Meridian Drive Ste E Okemos, MI 48864 (517) 349-9499 ext 224 (517) 204-5136 (c) -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Monday, May 14, 2007 9:20 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] OT SharePoint Hi All: This is going to seem OT but maybe not. I will ask a simple question... What is the primary reason for SharePoint? What is it used for? Why would someone use it as opposed to just creating a web page? (No I am not writing an article on it just trying to figure out whether it is worth installing.) Jim _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From ebarro at verizon.net Tue May 15 11:56:11 2007 From: ebarro at verizon.net (Eric Barro) Date: Tue, 15 May 2007 09:56:11 -0700 Subject: [dba-SQLServer] OT SharePoint In-Reply-To: <5D71EC0BA06F7C41B6088EF70CA4014EB3320E@exchangecolo.secor.com> Message-ID: <0JI300E4VDPVA062@vms042.mailsrvcs.net> Sharepoint is more than a glorified document warehouse. It is more than a web-based document storage system. Collaboration, Portal Technology, Content Management, Searching and Indexing, Business Process Management and Business Intelligence are the main "pillars" of Sharepoint. It comes free as part of the Windows 2003 OS as Windows Sharepoint Services ver 3.0 (for MOSS2007) and if you need portal technology, advanced indexing and searching and connectivity to external datasources you will need to get the full MOSS 2007. Everything in Sharepoint is a list and is directly managed behind-the-scenes by SQL server 2005 (for MOSS2007).
M$ has tightly integrated it with their Office 2007 product and is also integrating it into their newer products with the Windows Workflow Foundation and Windows Presentation Foundation. And the interface *is* web-based. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve Zayko Sent: Tuesday, May 15, 2007 9:25 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint We use Sharepoint internally at SECOR. It is good for a web-based document storage system. It helps with versioning because users may "check out" a document, make edits, and check it back in with changes saved. While it is checked out, no other users may view the document. It has some flexibility to make the interface look like web pages, but it is essentially a glorified document warehouse. -Z Stephen R. Zayko PE SECOR International Inc 2321 Club Meridian Drive Ste E Okemos, MI 48864 (517) 349-9499 ext 224 (517) 204-5136 (c) -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Monday, May 14, 2007 9:20 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] OT SharePoint Hi All: This is going to seem OT but maybe not. I will ask a simple question... What is the primary reason for SharePoint? What is it used for? Why would someone use it as opposed to just creating a web page? (No I am not writing an article on it just trying to figure out whether it is worth installing.) 
Jim _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From mwp.reid at qub.ac.uk Tue May 15 12:04:40 2007 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Tue, 15 May 2007 18:04:40 +0100 Subject: [dba-SQLServer] OT SharePoint References: <0JI300E4VDPVA062@vms042.mailsrvcs.net> Message-ID: I have just moved job to oversee the deployment of a 29,000 user MOSS install. First 300 user site goes live July. Out of the box it provides great features. With some programming even more. I am lucky as I have been paired with one of the best programmers in the University for this. We are initially using it to drive almost all of our administrative functions. Document sharing, communications etc. It's early days and it does have issues once you get beyond the user interface but it is very good. The searching ability it has to search MOSS sites, Exchange and file shares etc is really useful to us. Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 From jlawrenc1 at shaw.ca Tue May 15 15:22:00 2007 From: jlawrenc1 at shaw.ca (Jim Lawrence) Date: Tue, 15 May 2007 13:22:00 -0700 Subject: [dba-SQLServer] OT SharePoint In-Reply-To: Message-ID: <0JI30081XN0KCD10@l-daemon> Thank you, Steve, Eric and Martin for the information... So is it just a document manager? I guess it would be great for a lawyers', accountants' or even a government office.
Is the app installed automatically, just has to be located and initialized or is it sitting on the Server 2003 installation disks? Is it fairly intuitive or does it require a great deal of preparation? Is the MOSS a client app for desktop stations? Can an ordinary browser use it? Is it fairly straightforward to set up clients? If I decide to play is there any 'gotchas' to look out for? TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Martin Reid Sent: Tuesday, May 15, 2007 10:05 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint I have just moved job to oversee the deployment of a 29,000 user MOSS install. First 300 user site goes live July. Out of the box it provides great features. With some programming even more. I am lucky as I have been paired with one of the best programmers in the University for this. We are initially using it to drive almost all of our administrative functions. Document sharing, communications etc. It's early days and it does have issues once you get beyond the user interface but it is very good. The searching ability it has to search MOSS sites, Exchange and file shares etc is really useful to us. Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 From mwp.reid at qub.ac.uk Tue May 15 15:39:41 2007 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Tue, 15 May 2007 21:39:41 +0100 Subject: [dba-SQLServer] OT SharePoint References: <0JI30081XN0KCD10@l-daemon> Message-ID: Jim You can sign up for a free account on the web. Saves you installing and lets you get a feel for it. http://www.freesharepoint2007.com/uddi/ There's a Try It Free site and then there is the actual free site, so make sure you pick the correct one. Easy to set up. Yes and no. We set up several single server installs. No problems. We are now having problems building a farm. Two MOSS servers, One SQL Server.
Should be part of your set up. I think the latest SPs for Win 2003 contain it. If not you can download it. In my case a lot of work will be required as we need to build very specific business functionality into this. But one thing it allows us to do out of the box is push information out to specific AD groups and users. MOSS is a server APP. Access via URL. Very extensive object model. All the client needs is IE and Office. Loads of gotchas, mostly to do with the install. For example I am sitting here at home at 9.36pm logged onto a MOSS server in work trying to configure the search service and sending emails threatening our network administrator (<: Any help I can give just ask but we too are on a huge learning curve with this. Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 ________________________________ From: dba-sqlserver-bounces at databaseadvisors.com on behalf of Jim Lawrence Sent: Tue 15/05/2007 21:22 To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Thank you, Steve, Eric and Martin for the information... So is it just a document manager? I guess it would be great for a lawyers', accountants' or even a government office. Is the app installed automatically, just has to be located and initialized or is it sitting on the Server 2003 installation disks? Is it fairly intuitive or does it require a great deal of preparation? Is the MOSS a client app for desktop stations? Can an ordinary browser use it? Is it fairly straightforward to set up clients? If I decide to play is there any 'gotchas' to look out for? TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Martin Reid Sent: Tuesday, May 15, 2007 10:05 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint I have just moved job to oversee the deployment of a 29,000 user MOSS install.
First 300 user site goes live July. Out of the box it provides great features. With some programming even more. I am lucky as I have been paired iwth one of the best programmers in the University for this. We are initially using it to drive almost all of our administrative functions. Document sharing, communications etc Its early days and it does have issues once you get beyond the user interface but it is very good. The searching ability it has to search MOSS sites, Exchange and file shares etc is really useful to us. Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From ebarro at verizon.net Tue May 15 15:56:57 2007 From: ebarro at verizon.net (Eric Barro) Date: Tue, 15 May 2007 13:56:57 -0700 Subject: [dba-SQLServer] OT SharePoint In-Reply-To: <0JI30081XN0KCD10@l-daemon> Message-ID: <0JI300IJZOUZA984@vms048.mailsrvcs.net> Jim, If you just want to have a feel for what Sharepoint looks and feels like you can download the following: 1. WSS 3.0 - Windows Sharepoint Services 2. .NET 2.0 is required - pre-requisite since WSS uses the .NET framework 3. SQL Server Express 2005 4. SQL Management Studio - to manage SQL Express 2005. You will also need to draw on the following skill set 1. Active Directory - SP is tightly integrated with AD for security. 2. IIS 6.0 - this serves the pages. 3. IIS application pools - recommend creating a super user for managing SP. When it installs it needs an AD user that has access privileges to SQL server and the application pools.. I usually create one called spsAdmin and use that whenever SP requires a user. It uses that to impersonate connections, etc...saves you a lot of heartache later. 4. SQL server - navigating and poking around to see where SP places the databases and tables. 
The config database in SQL is the key. Once the installation runs smoothly you should be able to configure and play. SP uses the concept of web parts. Think of web parts as mini-applications that can be plugged into a main web page. Thus you can have a web part that displays the weather in your location using RSS feeds from weather.com or some other site. You can have a web part that consumes RSS feeds from Wired.com or any site that has them. In other words you get a lot of functionality with little or no programming because each web part has been pre-programmed to do a specific thing. Web parts can also share information. One web part can accept data input from the user and send that off to another web part to display the results. Setting up users requires AD. Eric -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Tuesday, May 15, 2007 1:22 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Thank you, Steve, Eric and Martin for the information... So is it just a document manager? I guess it would be great for a lawyers', accountants' or even a government office. Is the app installed automatically, just has to be located and initialized, or is it sitting on the Server 2003 installation disks? Is it fairly intuitive or does it require a great deal of preparation? Is MOSS a client app for desktop stations? Can an ordinary browser use it? Is it fairly straightforward to set up clients? If I decide to play, are there any 'gotchas' to look out for? TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Martin Reid Sent: Tuesday, May 15, 2007 10:05 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint I have just moved job to oversee the deployment of a 29,000 user MOSS install. 
First 300 user site goes live July. Out of the box it provides great features. With some programming, even more. I am lucky as I have been paired with one of the best programmers in the University for this. We are initially using it to drive almost all of our administrative functions: document sharing, communications, etc. It's early days and it does have issues once you get beyond the user interface, but it is very good. The ability it has to search MOSS sites, Exchange, file shares, etc. is really useful to us. Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From accessd at shaw.ca Tue May 15 17:30:31 2007 From: accessd at shaw.ca (Jim Lawrence) Date: Tue, 15 May 2007 15:30:31 -0700 Subject: [dba-SQLServer] OT SharePoint In-Reply-To: <0JI300IJZOUZA984@vms048.mailsrvcs.net> Message-ID: <0JI3000FRSYR4730@l-daemon> Thank you so much, you guys, Martin and Eric, for the help. You have given me a lot to chew on, and if it is alright there will be more pointed questions later. Currently I have one of my servers set up with AD, IIS6, .Net Framework (not sure which version yet). Does it need SQL Express or is the full version of SQL 2005 OK? Have a good handle on Web pages, but have done nothing major with the App Pool or anything with web parts or RSS feeds... Is this MOSS server some kind of pre-configured SharePoint module or a total stand-alone proprietary application? Or is it a custom server? 
TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Tuesday, May 15, 2007 1:57 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Jim, If you just want to get a feel for what Sharepoint looks and feels like you can download the following: 1. WSS 3.0 - Windows Sharepoint Services 2. .NET 2.0 - a pre-requisite, since WSS uses the .NET framework 3. SQL Server Express 2005 4. SQL Management Studio - to manage SQL Express 2005. You will also need to draw on the following skill set: 1. Active Directory - SP is tightly integrated with AD for security. 2. IIS 6.0 - this serves the pages. 3. IIS application pools - I recommend creating a super user for managing SP. When it installs it needs an AD user that has access privileges to SQL Server and the application pools. I usually create one called spsAdmin and use that whenever SP requires a user. It uses that to impersonate connections, etc. - saves you a lot of heartache later. 4. SQL Server - navigating and poking around to see where SP places the databases and tables. The config database in SQL is the key. Once the installation runs smoothly you should be able to configure and play. SP uses the concept of web parts. Think of web parts as mini-applications that can be plugged into a main web page. Thus you can have a web part that displays the weather in your location using RSS feeds from weather.com or some other site. You can have a web part that consumes RSS feeds from Wired.com or any site that has them. In other words you get a lot of functionality with little or no programming because each web part has been pre-programmed to do a specific thing. Web parts can also share information. One web part can accept data input from the user and send that off to another web part to display the results. Setting up users requires AD. 
Eric -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Tuesday, May 15, 2007 1:22 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Thank you, Steve, Eric and Martin for the information... So is it just a document manager? I guess it would be great for a lawyers', accountants' or even a government office. Is the app installed automatically, just has to be located and initialized, or is it sitting on the Server 2003 installation disks? Is it fairly intuitive or does it require a great deal of preparation? Is MOSS a client app for desktop stations? Can an ordinary browser use it? Is it fairly straightforward to set up clients? If I decide to play, are there any 'gotchas' to look out for? TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Martin Reid Sent: Tuesday, May 15, 2007 10:05 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint I have just moved job to oversee the deployment of a 29,000 user MOSS install. First 300 user site goes live July. Out of the box it provides great features. With some programming, even more. I am lucky as I have been paired with one of the best programmers in the University for this. We are initially using it to drive almost all of our administrative functions: document sharing, communications, etc. It's early days and it does have issues once you get beyond the user interface, but it is very good. The ability it has to search MOSS sites, Exchange, file shares, etc. is really useful to us. 
Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Tue May 15 21:34:25 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Tue, 15 May 2007 22:34:25 -0400 Subject: [dba-SQLServer] IN() or NOT IN() Message-ID: <20070516023425.21E79BC0F@smtp-auth.no-ip.com> I am trying to process a query where an income field has a set of possible values, 1-9 and A-T. The client wants values 4-9 and A-M. Logically that would be more efficient if it was NOT in(1-3,n-t). Is it in fact more efficient? And can ranges like that be specified, or do I need to use comma-delimited lists 1,2,3,n,o,p...? John W. Colby Colby Consulting www.ColbyConsulting.com From jwcolby at colbyconsulting.com Tue May 15 21:41:23 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Tue, 15 May 2007 22:41:23 -0400 Subject: [dba-SQLServer] Processing diverse where clauses Message-ID: <20070516024123.690EEBD0A@smtp-auth.no-ip.com> I am processing a query where a set of N fields have any value, AND one specific field has an IN() clause, i.e. a ton of codes possible. Is it more efficient to build up a pair of queries, SELECT PKID from tblX WHERE FldA in('A','B'...), and another query where the "OR" fields are gathered? Or just use one big query? I hate to even press the button... I am having to test with a TOP 100 kind of thing as it is. John W. 
Colby Colby Consulting www.ColbyConsulting.com From kens.programming at verizon.net Tue May 15 21:44:14 2007 From: kens.programming at verizon.net (kens.programming) Date: Tue, 15 May 2007 19:44:14 -0700 Subject: [dba-SQLServer] IN() or NOT IN() In-Reply-To: <20070516023425.21E79BC0F@smtp-auth.no-ip.com> References: <20070516023425.21E79BC0F@smtp-auth.no-ip.com> Message-ID: <001401c79764$1ec020a0$6b01a8c0@Stoker.com> You shouldn't have to use a comma-delimited list, just square brackets to designate your sets. IN ([4-9], [A-M]) IN ([^1-3], [^N-T]) NOT IN ([1-3], [N-T]) NOT IN ([^4-9], [^A-M]) Ken -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Tuesday, May 15, 2007 7:34 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] IN() or NOT IN() I am trying to process a query where an income field has a set of possible values, 1-9 and A-T. The client wants values 4-9 and A-M. Logically that would be more efficient if it was NOT in(1-3,n-t). Is it in fact more efficient? And can ranges like that be specified, or do I need to use comma-delimited lists 1,2,3,n,o,p...? John W. Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From stuart at lexacorp.com.pg Tue May 15 21:46:49 2007 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Wed, 16 May 2007 12:46:49 +1000 Subject: [dba-SQLServer] IN() or NOT IN() In-Reply-To: <20070516023425.21E79BC0F@smtp-auth.no-ip.com> References: <20070516023425.21E79BC0F@smtp-auth.no-ip.com> Message-ID: <464AFD39.20836.424F65B8@stuart.lexacorp.com.pg> IN() needs a delimited list. It doesn't understand REGEX. I would assume that since the list would be shorter, NOT IN() would be slightly faster in this case - fewer comparisons to make on each record. On 15 May 2007 at 22:34, jwcolby wrote: > I am trying to process a query where an income field has a set of possible > values, 1-9 and A-T. The client wants values 4-9 and A-M. Logically that > would be more efficient if it was NOT in(1-3,n-t). Is it in fact more > efficient? And can ranges like that be specified, or do I need to use > comma-delimited lists 1,2,3,n,o,p...? > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > From jwcolby at colbyconsulting.com Tue May 15 21:50:43 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Tue, 15 May 2007 22:50:43 -0400 Subject: [dba-SQLServer] IN() or NOT IN() In-Reply-To: <001401c79764$1ec020a0$6b01a8c0@Stoker.com> Message-ID: <20070516025043.09115BCA7@smtp-auth.no-ip.com> Cool, thanks. What is the ^ for? John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of kens.programming Sent: Tuesday, May 15, 2007 10:44 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] IN() or NOT IN() You shouldn't have to use a comma-delimited list, just square brackets to designate your sets. IN ([4-9], [A-M]) IN ([^1-3], [^N-T]) NOT IN ([1-3], [N-T]) NOT IN ([^4-9], [^A-M]) Ken -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Tuesday, May 15, 2007 7:34 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] IN() or NOT IN() I am trying to process a query where an income field has a set of possible values, 1-9 and A-T. The client wants values 4-9 and A-M. Logically that would be more efficient if it was NOT in(1-3,n-t). Is it in fact more efficient? And can ranges like that be specified, or do I need to use comma-delimited lists 1,2,3,n,o,p...? John W. Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From stuart at lexacorp.com.pg Tue May 15 22:04:13 2007 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Wed, 16 May 2007 13:04:13 +1000 Subject: [dba-SQLServer] IN() or NOT IN() In-Reply-To: <001401c79764$1ec020a0$6b01a8c0@Stoker.com> References: <20070516023425.21E79BC0F@smtp-auth.no-ip.com>, <001401c79764$1ec020a0$6b01a8c0@Stoker.com> Message-ID: <464B014D.8582.425F5223@stuart.lexacorp.com.pg> Are you sure? I've never seen anything that says that IN() takes anything other than a Subquery or a List of expressions. The SQL Server 2000 that I have here returns: "Invalid column name 4-9" if I try that. On 15 May 2007 at 19:44, kens.programming wrote: > You shouldn't have to use a comma-delimited list, just square brackets to > designate your sets. > > IN ([4-9], [A-M]) > IN ([^1-3], [^N-T]) > NOT IN ([1-3], [N-T]) > NOT IN ([^4-9], [^A-M]) > > Ken > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby > Sent: Tuesday, May 15, 2007 7:34 PM > To: dba-sqlserver at databaseadvisors.com > Subject: [dba-SQLServer] IN() or NOT IN() > > I am trying to process a query where an income field has a set of possible > values, 1-9 and A-T. The client wants values 4-9 and A-M. Logically that > would be more efficient if it was NOT in(1-3,n-t). Is it in fact more > efficient? And can ranges like that be specified, or do I need to use > comma-delimited lists 1,2,3,n,o,p...? > > John W. 
Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > From mwp.reid at qub.ac.uk Wed May 16 03:16:23 2007 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Wed, 16 May 2007 09:16:23 +0100 Subject: [dba-SQLServer] OT SharePoint References: <0JI3000FRSYR4730@l-daemon> Message-ID: There are really two parts to it. WSS is the collaboration environment. MOSS is the portal environment that sits on top of it all. WSS is free, MOSS is BIG BUCKS. SQL Server 2005: you can install with Express if you need to - OK for testing and small deployments. We are using dedicated SQL 2005 servers for this one. Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 From fuller.artful at gmail.com Wed May 16 07:09:54 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Wed, 16 May 2007 08:09:54 -0400 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <20070516024123.690EEBD0A@smtp-auth.no-ip.com> References: <20070516024123.690EEBD0A@smtp-auth.no-ip.com> Message-ID: <29f585dd0705160509w13b1b9d1i231b4f8d78986d77@mail.gmail.com> INs are bad policy, typically, unless you have a very small number of targets. I wrote about this problem at TechRepublic a while ago. 
I may not have the article handy, but you can probably find it there. Go to www.techrepublic.com and search for "Arthur Fuller IN()". That might work. Else just search for my name and then browse. Anyway, the article concerns a better way to do the IN part. The problem with IN is that it almost always forces a table-scan, which in your case will give you enough time to move to another country and back before it's done. The article shows how to construct a temp-table from the IN clause and then do a join to the actual table using the temp table. Especially when your IN clause could have lots of values, this approach is way faster. I'll dig now for the article, but I'm fairly certain that it was one of the things wiped out when I lost a hard disk a month ago or so. If I find it I'll send it off-list. Arthur On 5/15/07, jwcolby wrote: > > I am processing a query where a set of N fields have any value, AND one > specific field has an IN() clause, i.e. a ton of codes possible. > > Is it more efficient to build up a pair of queries, SELECT PKID from tblX > WHERE FldA in('A','B'...), and another query where the "OR" fields are > gathered? Or just use one big query? I hate to even press the > button... I > am having to test with a TOP 100 kind of thing as it is. > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From jwcolby at colbyconsulting.com Wed May 16 07:45:27 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 08:45:27 -0400 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <29f585dd0705160509w13b1b9d1i231b4f8d78986d77@mail.gmail.com> Message-ID: <20070516124525.773BABCD5@smtp-auth.no-ip.com> Don't you hate it when you lose a hard drive!!! My new servers are being built with Raid6! 
So that I can lose TWO drives and soldier on. All of my critical stuff is now being placed out on volumes on those raid arrays. It is probably not something I would do without a client need, but the total cost is now under $1000. $500 for the dedicated RAID controller I selected: http://www.newegg.com/Product/Product.aspx?Item=N82E16816131004 And at least 4 hard drives - two for the parity and two for the data. This card will support 8 drives. In my first server I used the Seagate 320gb (300 real gb) drives: http://www.newegg.com/Product/Product.aspx?Item=N82E16822148140 In the second server I am using the 500gb drives: http://www.newegg.com/Product/Product.aspx?Item=N82E16822148136 With the selected controller you can start with raid 5 and, as you add drives later, just tell the controller to upgrade to raid 6. It "just works" and works well. And is blazing fast. As for the IN() clause... Interestingly, I ran a test of just the IN() clause part and I can pull counts from 65 million records in just a couple of seconds - Count(PK) as RecCnt from tblXXX where IncomeRange in('4','5'...). I did index this field since it had so many values. I also ended up using a NOT IN() because there were fewer values to enumerate. I did not test whether that was faster or not. I ended up breaking the query down into three queries: IncomeRange - 1 field in the where, using IN(); FemaleWithChildren - 8 fields in the where; RecsInZips - 1 join on Zip5. And then built up a query joining the PKs pulled from all of those separate queries. The entire count using all three subqueries runs in 23 minutes. I think it is having to do table scans for the FemaleWithChildren subquery simply because the fields only have a few values in them. John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Wednesday, May 16, 2007 8:10 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses INs are bad policy, typically, unless you have a very small number of targets. I wrote about this problem at TechRepublic a while ago. I may not have the article handy, but you can probably find it there. Go to www.techrepublic.com and search for "Arthur Fuller IN()". That might work. Else just search for my name and then browse. Anyway, the article concerns a better way to do the IN part. The problem with IN is that it almost always forces a table-scan, which in your case will give you enough time to move to another country and back before it's done. The article shows how to construct a temp-table from the IN clause and then do a join to the actual table using the temp table. Especially when your IN clause could have lots of values, this approach is way faster. I'll dig now for the article, but I'm fairly certain that it was one of the things wiped out when I lost a hard disk a month ago or so. If I find it I'll send it off-list. Arthur On 5/15/07, jwcolby wrote: > > I am processing a query where a set of N fields have any value, AND > one specific field has an IN() clause, i.e. a ton of codes possible. > > Is it more efficient to build up a pair of queries, SELECT PKID from > tblX WHERE FldA in('A','B'...), and another query where the "OR" > fields are gathered? Or just use one big query? I hate to even press > the button... I am having to test with a TOP 100 kind of thing as it > is. > > John W. 
Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From jwcolby at colbyconsulting.com Wed May 16 09:17:24 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 10:17:24 -0400 Subject: [dba-SQLServer] IN() or NOT IN() In-Reply-To: Message-ID: <20070516141723.00A28BE81@smtp-auth.no-ip.com> Yep, I tried it yesterday. A comma-delimited list was all I could make happen. John W. 
Logically that > > would be more efficient if it was NOT in(1-3,n-t). Is it in fact more > > efficient? And can ranges like that be specified, or do I need to use > > comma-delimited lists 1,2,3,n,o,p...? > > > > John W. Colby > > Colby Consulting > > www.ColbyConsulting.com > > > > _______________________________________________ > > dba-SQLServer mailing list > > dba-SQLServer at databaseadvisors.com > > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > > http://www.databaseadvisors.com > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From jwcolby at colbyconsulting.com Wed May 16 09:17:24 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 10:17:24 -0400 Subject: [dba-SQLServer] IN() or NOT IN() In-Reply-To: Message-ID: <20070516141723.00A28BE81@smtp-auth.no-ip.com> Yep, I tried it yesterday. A comma-delimited list was all I could make happen. John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Wednesday, May 16, 2007 9:22 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] IN() or NOT IN() It definitely (in SQL 2000) only takes a list or single-column subqueries. I'm not at my PC right now or I'd try that query on SQL Server 2005. On 5/15/07, Stuart McLachlan wrote: > Are you sure? I've never seen anything that says that IN() takes > anything other than a Subquery or a List of expressions. > > The SQL Server 2000 that I have here returns: > "Invalid column name 4-9" if I try that. > > On 15 May 2007 at 19:44, kens.programming wrote: > > > You shouldn't have to use a comma-delimited list, just square brackets > > to designate your sets. > > > > IN ([4-9], [A-M]) > > IN ([^1-3], [^N-T]) > > NOT IN ([1-3], [N-T]) > > NOT IN ([^4-9], [^A-M]) > > > > Ken From ebarro at verizon.net Wed May 16 09:23:04 2007 From: ebarro at verizon.net (Eric Barro) Date: Wed, 16 May 2007 07:23:04 -0700 Subject: [dba-SQLServer] OT SharePoint In-Reply-To: <0JI3000FRSYR4730@l-daemon> Message-ID: <0JI5001TZ1BFERK7@vms044.mailsrvcs.net> Jim, Knowledge of the app pool works in your favor if you have to debug. Otherwise you don't have to mess with it. Same goes for web parts. It's a plug-n-play proposition. I was just giving you the lay of the land, so to speak. Don't worry about RSS feeds either. MOSS 2007 is the portal technology that sits on top of WSS. WSS is the engine, MOSS is the super-charger or turbo-boost for the engine. Both products are server technologies. 
Eric -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Tuesday, May 15, 2007 3:31 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Thank you so much, you guys, Martin and Eric, for the help. You have given me a lot to chew on, and if it is alright there will be more pointed questions later. Currently I have one of my servers set up with AD, IIS6, .Net Framework (not sure which version yet). Does it need SQL Express or is the full version of SQL 2005 OK? Have a good handle on Web pages, but have done nothing major with the App Pool or anything with web parts or RSS feeds... Is this MOSS server some kind of pre-configured SharePoint module or a total stand-alone proprietary application? Or is it a custom server? TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Tuesday, May 15, 2007 1:57 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Jim, If you just want to get a feel for what Sharepoint looks and feels like you can download the following: 1. WSS 3.0 - Windows Sharepoint Services 2. .NET 2.0 - a pre-requisite, since WSS uses the .NET framework 3. SQL Server Express 2005 4. SQL Management Studio - to manage SQL Express 2005. You will also need to draw on the following skill set: 1. Active Directory - SP is tightly integrated with AD for security. 2. IIS 6.0 - this serves the pages. 3. IIS application pools - I recommend creating a super user for managing SP. When it installs it needs an AD user that has access privileges to SQL Server and the application pools. I usually create one called spsAdmin and use that whenever SP requires a user. It uses that to impersonate connections, etc. - saves you a lot of heartache later. 4. 
SQL Server - navigating and poking around to see where SP places the databases and tables. The config database in SQL is the key. Once the installation runs smoothly you should be able to configure and play. SP uses the concept of web parts. Think of web parts as mini-applications that can be plugged into a main web page. Thus you can have a web part that displays the weather in your location using RSS feeds from weather.com or some other site. You can have a web part that consumes RSS feeds from Wired.com or any site that has them. In other words you get a lot of functionality with little or no programming because each web part has been pre-programmed to do a specific thing. Web parts can also share information. One web part can accept data input from the user and send that off to another web part to display the results. Setting up users requires AD. Eric -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Tuesday, May 15, 2007 1:22 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Thank you, Steve, Eric and Martin for the information... So is it just a document manager? I guess it would be great for a lawyers', accountants' or even a government office. Is the app installed automatically, just has to be located and initialized, or is it sitting on the Server 2003 installation disks? Is it fairly intuitive or does it require a great deal of preparation? Is MOSS a client app for desktop stations? Can an ordinary browser use it? Is it fairly straightforward to set up clients? If I decide to play, are there any 'gotchas' to look out for? 
TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Martin Reid Sent: Tuesday, May 15, 2007 10:05 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint I have just moved jobs to oversee the deployment of a 29,000-user MOSS install. The first 300-user site goes live in July. Out of the box it provides great features; with some programming, even more. I am lucky, as I have been paired with one of the best programmers in the University for this. We are initially using it to drive almost all of our administrative functions: document sharing, communications, etc. It's early days, and it does have issues once you get beyond the user interface, but it is very good. Its ability to search MOSS sites, Exchange, file shares, etc. is really useful to us. Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jlawrenc1 at shaw.ca Wed May 16 09:29:24 2007 From: jlawrenc1 at shaw.ca (Jim Lawrence) Date: Wed, 16 May 2007 07:29:24 -0700 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <29f585dd0705160509w13b1b9d1i231b4f8d78986d77@mail.gmail.com> Message-ID: <0JI5005ZS1CSVZF0@l-daemon> Hi Arthur: The performance of the IN statement in this situation is not going to be an issue, as there is only one person on the network. Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Wednesday, May 16, 2007 5:10 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses INs are bad policy, typically, unless you have a very small number of targets. I wrote about this problem at TechRepublic a while ago. I may not have the article handy, but you can probably find it there. Go to www.techrepublic.com and search for "Arthur Fuller IN()". That might work. Else just search for my name and then browse. Anyway, the article concerns a better way to do the IN part. The problem with IN is that it almost always forces a table scan, which in your case will give you enough time to move to another country and back before it's done. The article shows how to construct a temp table from the IN clause and then do a join to the actual table using the temp table. Especially when your IN clause could have lots of values, this approach is way faster. I'll dig now for the article, but I'm fairly certain that it was one of the things wiped out when I lost a hard disk a month or so ago. If I find it I'll send it off-list. Arthur On 5/15/07, jwcolby wrote: > > I am processing a query where a set of N fields have any value, AND one > specific field has an IN() clause, i.e. a ton of codes possible.
> > Is it more efficient to build up a pair of queries, SELECT PKID from tblX > WHERE FldA in('A','B'...), and another query where the "OR" fields are > gathered? Or just use one big query? I hate to even press the > button... I > am having to test with a TOP 100 kind of thing as it is. > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed May 16 09:39:26 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 10:39:26 -0400 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <0JI5005ZS1CSVZF0@l-daemon> Message-ID: <20070516143925.39DB6BF08@smtp-auth.no-ip.com> I can tell you that at the very end it locked up my dual proc machine as it pulls the results all together. I have never seen that happen before. Aside from that, what does the network have to do with anything? This is running in SQL Server 2005, using tables / drives on the same machine. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Wednesday, May 16, 2007 10:29 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses Hi Arthur: The performance with the IN statement in this condition is not going to be an issue as there is only one person on the network. 
Jim From fuller.artful at gmail.com Wed May 16 09:58:30 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Wed, 16 May 2007 10:58:30 -0400 Subject: [dba-SQLServer] IN() or NOT IN() In-Reply-To: References: <20070516023425.21E79BC0F@smtp-auth.no-ip.com> <001401c79764$1ec020a0$6b01a8c0@Stoker.com> <464B014D.8582.425F5223@stuart.lexacorp.com.pg> Message-ID: <29f585dd0705160758o74955a69j8754e55d02da7699@mail.gmail.com> This is most certainly the case. IN() accepts only these arguments: a list of one or more items (comma-delimited if more than one), or a SELECT statement that returns exactly one column. That's it. Note that I avoided, as did Francisco, assuming that the sub-query is based on a table. It could be a UDF. But those are your choices: a SELECT statement that returns one column, or a comma-delimited list. The list might contain only one item, in which case the comma is not required, but why would anyone use IN() for a one-item list? AFAIK you can only use SQL Server's decidedly limited implementation of regular expressions in a LIKE clause, not an IN clause. Arthur On 5/16/07, Francisco Tapia wrote: > > It definitely (in SQL 2000) only takes a list or single-column subqueries. > I'm not at my PC right now or I'd try that query on SS2005 > > On 5/15/07, Stuart McLachlan wrote: > > Are you sure? I've never seen anything that says that IN() takes > > anything other than a Subquery or a List of expressions. > > > > The SQL Server 2000 that I have here returns: > > "Invalid column name 4-9" if I try that. > > > > On 15 May 2007 at 19:44, kens.programming wrote: > > > > > You shouldn't have to use a comma-delimited list, just square brackets to > > > designate your sets.
> > > IN ([4-9], [A-M]) > > > IN ([^1-3], [^N-T]) > > > NOT IN ([1-3], [N-T]) > > > NOT IN ([^4-9], [^A-M]) > > > Ken > > > -----Original Message----- > > > From: dba-sqlserver-bounces at databaseadvisors.com > > > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby > > > Sent: Tuesday, May 15, 2007 7:34 PM > > > To: dba-sqlserver at databaseadvisors.com > > > Subject: [dba-SQLServer] IN() or NOT IN() > > > I am trying to process a query where an income field has a set of possible > > > values, 1-9 and A-T. The client wants values 4-9 and A-M. Logically it would > > > be more efficient as NOT IN (1-3, N-T). Is it in fact more efficient? And can > > > ranges like that be specified, or do I need to use comma-delimited lists > > > 1,2,3,n,o,p...? > > > John W. Colby > > > Colby Consulting > > > www.ColbyConsulting.com > > > _______________________________________________ > > > dba-SQLServer mailing list > > > dba-SQLServer at databaseadvisors.com > > > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > > > http://www.databaseadvisors.com > > _______________________________________________ > > dba-SQLServer mailing list > > dba-SQLServer at databaseadvisors.com > > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > > http://www.databaseadvisors.com > -- > -Francisco > http://sqlthis.blogspot.com | Tsql and More... > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed May 16 10:09:53 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 11:09:53 -0400 Subject: [dba-SQLServer] Using Databases on separate machines - performance Message-ID: <20070516150951.B2F0ABDEB@smtp-auth.no-ip.com> I now have two "high powered" servers built up. Both run Windows 2003 and SQL Server 2005. ATM I am running my entire database on a single machine. The database consists of a couple of largish "raw data" tables (65 million recs / 700+ fields; 97 million recs / 149 fields), which contain the data pulled in from text files, with an autoincrement PK added for tracking purposes. Each raw data table then has address / PK fields pulled out and "sent out" for address validation. The results are then reimported back into SQL Server, into the same DB the raw data table sits in. I have created separate DBF files for each "database" (raw / validated). As I mentioned earlier, I am creating a new dbf file set for each "order" I receive from my customer, where I build up the views required to process that specific order.
That is working quite well BTW. I have a bunch of questions re performance. I have discovered that I can create queries / views that pull the data straight out of the desired db / table when I use that data in another db, simply by referencing the database / table. I think I can do the same thing if pieces are on another server instance. What I am considering doing is placing these huge raw / validated database files out on the StoneHenge server, leaving the Azul server to contain and process the orders. Stonehenge is the newer machine and has a single partition with 1.6 TB open, and I will be adding another 500 GB to that soon. Thus this seems like the logical home for the big databases. I have a Gbit switch between all the machines on my network. My question is, will there be any pros/cons to organizing things this way? I can get about 450 MB/sec of burst streaming data off of my RAID arrays, which is considerably above the 1 Gbit switch capacity, but it seems unlikely that SQL Server would actually process data at that speed anyway. So I want to place the big source databases on the new server and the order database on the original server. To give an example of a real order, I created a set of queries: 1) One query talks to the tblHSIDRaw table (75 million records / 700 fields), asking for "SELECT PKID WHERE ... " The WHERE clause encompasses about 9 different fields. All the fields are indexed, though how useful the indexes are (in all cases) is in doubt. 2) Another query uses a small table of 180 ZIPs provided by the client. That ZIP table is joined on the ZIP column of tblAZHSID, which is the table that has been processed for valid addresses. tblAZHSID is ~50 million records with about 20 fields. "SELECT PK from tblZIPS inner join tblAZHSID on ..." 3) A third query requests data from tblHSIDRaw for a WHERE on a single specific field.
"SELECT PK FROM tblHSIDRaw WHERE FieldX IN () I did that just because this has to be an AND with all the 9 fields in the first query. Perhaps not the most efficient but it works. 4) A query that inner joins all the PKs and returns a result set. So what happens if I place these tblHSIDRaw and tblAZHSID out on Stonehenge? I assume that processor is passed the select clauses and performs the processing required to return the PKs requested? Does Azul do any processing? Perhaps the join of the result sets in query 4 to create the final data set? Is the process faster or slower than if the whole shootin match ran on a single machine? John W. Colby Colby Consulting www.ColbyConsulting.com From accessd at shaw.ca Wed May 16 10:23:57 2007 From: accessd at shaw.ca (Jim Lawrence) Date: Wed, 16 May 2007 08:23:57 -0700 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <20070516124525.773BABCD5@smtp-auth.no-ip.com> Message-ID: <0JI5004693VPOEJ0@l-daemon> Hi John: The latest features of SQL 2005 that might help you are: You can use a SQL query or nested query as if you are using a table. I am not sure how many levels you can go but Oracle is 16. This could replace the IN (1,2,3...) Select o1.test1, o2.test2... >From (Select * from table2 Where ...) as o2 Where o1 = "test" (The processes from the inner most select out.) By default MS SQL 2005 has Optimization statistics on. This means that the first time you perform a possible BAD query it will run slow but eventually the same query will start running faster. This is because 'hints' internally tell the query routine which is the best method, whether fullscan or index and which indexes to use. Oracle had needed to use manual 'hints' but MS SQL automated this process since 2000. This means you can get away with bad queries, just do the same bad query enough time and remember to use 'variables' in the SPs Select o1.test1.... from table1 where o2.test = @test And not... Select o1.test1.... 
from table1 where o2.test = 'Filbert' HTH Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 5:45 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses Don't you hate it when you lose a hard drive!!! My new servers are being built with RAID 6, so that I can lose TWO drives and soldier on. All of my critical stuff is now being placed out on volumes on those RAID arrays. It is probably not something I would do without a client need, but the total cost is now under $1000: $500 for the dedicated RAID controller I selected: http://www.newegg.com/Product/Product.aspx?Item=N82E16816131004 And at least 4 hard drives - two for the parity and two for the data. This card will support 8 drives. In my first server I used the Seagate 320 GB (300 real GB) drives: http://www.newegg.com/Product/Product.aspx?Item=N82E16822148140 In the second server I am using the 500 GB drives: http://www.newegg.com/Product/Product.aspx?Item=N82E16822148136 With the selected controller you can start with RAID 5 and, as you add drives later, just tell the controller to upgrade to RAID 6. It "just works" and works well. And it is blazing fast. As for the IN() clause... Interestingly, I ran a test of just the IN() clause part, and I can pull counts from 65 million records in just a couple of seconds - SELECT Count(PK) AS RecCnt FROM tblXXX WHERE IncomeRange IN ('4','5'...). I did index this field since it has so many values. I also ended up using a NOT IN() because there were fewer values to enumerate. I did not test whether that was faster or not. I ended up breaking the query down into three queries: IncomeRange - 1 field in the WHERE, using IN() FemaleWithChildren - 8 fields in the WHERE RecsInZips - 1 join on Zip5 And then built up a query joining the PKs pulled from all of those separate queries.
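The three-subquery pattern described here (pull the matching PK set for each criterion, then join the sets on the key) can be sketched roughly as follows in T-SQL. All table, column, and value names are illustrative stand-ins for the actual schema, not the real order definition:

```sql
-- Illustrative sketch only: each derived table returns the PK set that
-- satisfies one criterion; the inner joins AND the three sets together.
SELECT q1.PKID
FROM (SELECT PKID FROM tblHSIDRaw
      WHERE IncomeRange NOT IN ('1','2','3','N','O','P')) AS q1
INNER JOIN
     (SELECT PKID FROM tblHSIDRaw
      WHERE Gender = 'F' AND ChildPresent = 'Y') AS q2
  ON q2.PKID = q1.PKID
INNER JOIN
     (SELECT a.PKID FROM tblAZHSID AS a
      INNER JOIN tblZIPS AS z ON z.Zip5 = a.Zip5) AS q3
  ON q3.PKID = q1.PKID;
```

Because each derived table returns only an indexed key column, the optimizer can intersect the three sets on the PK rather than evaluating the combined predicate against the wide raw table in one pass.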
The entire count using all three subqueries runs in 23 minutes. I think it is having to do table scans for the FemaleWithChildren subquery simply because the fields only have a few values in them. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller Sent: Wednesday, May 16, 2007 8:10 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses INs are bad policy, typically, unless you have a very small number of targets. I wrote about this problem at TechRepublic a while ago. I may not have the article handy, but you can probably find it there. Go to www.techrepublic.com and search for "Arthur Fuller IN()". That might work. Else just search for my name and then browse. Anyway, the article concerns a better way to do the IN part. The problem with IN is that it almost always forces a table-scan, which in your case will give you enough time to move to another country and back before it's done. The article shows how to construct a temp-table from the IN clause and then do a join to the actual table using the temp table. Especially when your IN clause could have lots of values, this approach is way faster. I'll dig now for the article, but I'm fairly certain that it was one of the things wiped out when I lost a hard disk a month ago or so. If I find it I'll send it off-list. Arthur On 5/15/07, jwcolby wrote: > > I am processing a query where a set of N fields have any value, AND > one specific field has an IN() clause, i.e. a ton of codes possible. > > Is it more efficient to build up a pair of queries, SELECT PKID from > tblX WHERE FldA in('A','B'...), and another query where the "OR" > fields are gathered? Or just use one big query? I hate to even press > the button... I am having to test with a TOP 100 kind of thing as it > is. > > John W. 
Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From accessd at shaw.ca Wed May 16 10:32:04 2007 From: accessd at shaw.ca (Jim Lawrence) Date: Wed, 16 May 2007 08:32:04 -0700 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <20070516143925.39DB6BF08@smtp-auth.no-ip.com> Message-ID: <0JI500F0J498DYA0@l-daemon> Hi John: This type of 'IN' statement may grind a network of users to a standstill, but you are the only user, so who cares... Maybe it was a time-out issue... but to lock it up?? Wow, I have never heard of that happening. Have you tried nested queries? Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 7:39 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses I can tell you that at the very end it locked up my dual proc machine as it pulled the results all together. I have never seen that happen before. Aside from that, what does the network have to do with anything? This is running in SQL Server 2005, using tables / drives on the same machine. John W.
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Wednesday, May 16, 2007 10:29 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses Hi Arthur: The performance with the IN statement in this condition is not going to be an issue as there is only one person on the network. Jim _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed May 16 10:47:28 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 11:47:28 -0400 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <0JI500F0J498DYA0@l-daemon> Message-ID: <20070516154726.A12BFBD61@smtp-auth.no-ip.com> I had never seen this machine "lock up". It is a dual-core 3.8 GHz with 4 GB of RAM, and even when the cores are pegged it will usually still switch tasks and such. And of course I can't absolutely 100% determine that SQL Server is the cause. The lockup happened late last night, at the very end of a query that pulled a count based on the four queries mentioned in other emails. It happened again this morning, again at the very end of the same query. When it happens, the Task Manager shows 100% CPU utilization (for both cores). Unfortunately I did not get a chance to check which task was using what of that 100%. It locked up so tight that I couldn't even switch between tasks. As soon as the query completed, the machine returned to normal, so I still assume that it was SQL Server. Again, though: what does the IN clause have to do with the network? You still have not explained that.
From my understanding, this is all happening on one machine and so I am curious what is being passed over the network, and to whom? John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Wednesday, May 16, 2007 11:32 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses Hi john: This type of statement 'IN' may grind a network of users to a stand still but you are the only user so who cares but... Maybe it was a time-out issue.... but to lock it up?? Wow, never heard of that happening. Have you tried nested queries? Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 7:39 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses I can tell you that at the very end it locked up my dual proc machine as it pulls the results all together. I have never seen that happen before. Aside from that, what does the network have to do with anything? This is running in SQL Server 2005, using tables / drives on the same machine. John W. Colby Colby Consulting www.ColbyConsulting.com From jwcolby at colbyconsulting.com Wed May 16 10:53:16 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 11:53:16 -0400 Subject: [dba-SQLServer] Export data sets to CSV files Message-ID: <20070516155314.71F6FBE11@smtp-auth.no-ip.com> I posted a question about breaking a table down into "small" chunks for export to a csv file. I got a response about building the SELECT query (which I can find). The next piece that I need assistance with is getting that exported out to the text file. 
For the data import, I simply used a stored procedure which used a BULK INSERT from a file into a table, using a specified field and record delimiter. Works GREAT!!! What is the reverse of a BULK INSERT? John W. Colby Colby Consulting www.ColbyConsulting.com From fhtapia at gmail.com Wed May 16 11:19:01 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 16 May 2007 09:19:01 -0700 Subject: [dba-SQLServer] Using Databases on separate machines - performance In-Reply-To: <20070516150951.B2F0ABDEB@smtp-auth.no-ip.com> References: <20070516150951.B2F0ABDEB@smtp-auth.no-ip.com> Message-ID: John, You're describing distributed queries, which you can write as SELECT * FROM ServerName.Database.Owner.Table These instructions pass the request to the OLEDB provider and pass the "work" to the linked server. If you were hooked into some other type of engine (i.e. JET/FoxPro/Excel), then the work is performed by SQL Server, as the OLEDB provider would have restrictions on the data source. More info on distributed queries is in the links below. I do use distributed queries in my environment, but in my situation I generally join them against local tables, which means SQL Server has to pull the data over to do the join locally. If instead I perform an entire query on the linked server, the processing occurs on the remote server, such as: SELECT * FROM ServerName.Database.Owner.Table Where Field1 IN ('a','b', 'c', 'd') hth White Paper on performance: http://citeseer.ist.psu.edu/rd/93453806%2C732761%2C1%2C0.25%2CDownload/http%3AqSqqSqhome.comcast.netqSq%7EevilconroyqSqICDE2005-MicrosoftSQLServerDistributedQuery.pdf Architecture: http://msdn2.microsoft.com/en-us/library/ms191277.aspx Distributed Queries http://msdn2.microsoft.com/en-us/library/ms188721.aspx On 5/16/07, jwcolby wrote: > > I now have two "high powered" servers built up. Both run Windows 2003 and > SQL Server 2005. > > ATM I am running my entire database on a single machine.
The database > consists of a couple of largish "raw data" tables (65 million recs / 700+ > fields; 97 million recs / 149 fields), which contain the data pulled in > from > text files, with a autoincrement PK added for tracking purposes. Each raw > data table then has address / PK fields pulled out and "sent out" for > address validation. The results are then reimported back in to SQL > Server, > into the same DB as the raw data table sits in. I have created separate > DBF > files for each "database" (raw / validated). > > As I mentioned earlier, I am creating a new dbf file set for each "order" > I > receive from my customer, where I build up the views required to process > that specific order. That is working quite well BTW. > > I have a bunch of questions re performance. I have discovered that I can > create queries / views that pull the data straight out of the desired db / > table when I use that data in another db, simply by just referencing the > database / table. I think I can do the same thing if pieces are on > another > server instance. What I am considering doing is placing these huge raw / > validated database files out on the StoneHenge server, leaving the Azul > server to contain and process the orders. Stonehenge is the newer machine > and has a single partition with 1.6 tbytes open, and I will be adding > another 500 gb to that soon. Thus this seems like the logical home for > the > big databases. > > I have a gbit switch between all the machines on my network. > > My question is, will there be any pros/cons to organizing things this way. > I can get about 450mb burst streaming data off of my raid arrays which is > considerably above the 1 gb switch capacity, but it seems unlikely that > SQL > Server would actually process data at that speed anyway. So I want to > place > the big source databases on the new server and the order database on the > original server. 
> > To give an example of a real order I created a set of queries: > > 1) One query talks to the tblHSIDRaw table (75 million records / 700 > fields), asking for > > "SELECT PKID WHERE ... " > > The Where clause encompasses about 9 different fields. All the fields are > indexed, though how useful the indexes are (in all cases) is in doubt. > > 2) Another query uses a small table of 180 ZIPS provided by the client. > That ZIP table is joined on the ZIP column of tblAZHSID which is the table > that has been processed for valid addresses. tblAZHSID is ~50 million > records with about 20 fields. > > "SELECT PK from tblZIPS inner join tblAZHSID on ..." > > 3) A third query requests data from tblHSIDRaw for a where on a single > specific field. > > "SELECT PK FROM tblHSIDRaw WHERE FieldX IN () > > I did that just because this has to be an AND with all the 9 fields in the > first query. Perhaps not the most efficient but it works. > > 4) A query that inner joins all the PKs and returns a result set. > > So what happens if I place these tblHSIDRaw and tblAZHSID out on > Stonehenge? > I assume that processor is passed the select clauses and performs the > processing required to return the PKs requested? > > Does Azul do any processing? Perhaps the join of the result sets in query > 4 > to create the final data set? > > Is the process faster or slower than if the whole shootin match ran on a > single machine? > > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... 
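Francisco's point about where the work happens can be made concrete with a short sketch. The linked-server name, database name, and predicate below are hypothetical; sp_addlinkedserver, four-part names, and OPENQUERY are standard SQL Server 2005 features:

```sql
-- Hypothetical: register the second machine as a linked server.
EXEC sp_addlinkedserver @server = N'STONEHENGE', @srvproduct = N'SQL Server';

-- Four-part name: the optimizer may push the WHERE clause to the remote
-- server, so only matching PKs cross the wire.
SELECT PKID
FROM STONEHENGE.HSID.dbo.tblHSIDRaw
WHERE IncomeRange IN ('4','5','6');

-- OPENQUERY ships the whole statement verbatim, guaranteeing that the
-- filtering runs on the remote server.
SELECT PKID
FROM OPENQUERY(STONEHENGE,
    'SELECT PKID FROM HSID.dbo.tblHSIDRaw WHERE IncomeRange IN (''4'',''5'',''6'')');
```

Joining a remote table against a local one, by contrast, can force the remote rows to be pulled across before the join is evaluated locally.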
From ebarro at verizon.net Wed May 16 11:20:00 2007 From: ebarro at verizon.net (Eric Barro) Date: Wed, 16 May 2007 09:20:00 -0700 Subject: [dba-SQLServer] Export data sets to CSV files In-Reply-To: <20070516155314.71F6FBE11@smtp-auth.no-ip.com> Message-ID: <0JI500IMK6PO9Y28@vms048.mailsrvcs.net> Here's an article that might help... http://www.sqlteam.com/item.asp?ItemID=4722 -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 8:53 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Export data sets to CSV files I posted a question about breaking a table down into "small" chunks for export to a csv file. I got a response about building the SELECT query (which I can find). The next piece that I need assistance with is getting that exported out to the text file. For the data import, I simply used a stored procedure which used a BULK INSERT from a file into a table, using a specified field and record delimiter. Works GREAT!!! What is the reverse of a BULK INSERT? John W. Colby Colby Consulting www.ColbyConsulting.com From fhtapia at gmail.com Wed May 16 11:21:18 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 16 May 2007 09:21:18 -0700 Subject: [dba-SQLServer] Using Databases on separate machines - performance In-Reply-To: References: <20070516150951.B2F0ABDEB@smtp-auth.no-ip.com> Message-ID: So, in short, the answer to your question really is "it depends". From the MS article: When possible, SQL Server pushes relational operations such as joins, restrictions, projections, sorts, and group-by operations to the OLE DB data source. SQL Server does not default to scanning the base table into SQL Server and performing the relational operations itself. SQL Server queries the OLE DB provider to determine the level of SQL grammar it supports and, based on that information, pushes as many relational operations as possible to the provider. For more information, see SQL Dialect Requirements for OLE DB Providers. On 5/16/07, Francisco Tapia wrote: > > John, > You're describing distributed queries, which you can write as > > SELECT * FROM ServerName.Database.Owner.Table > > These instructions pass the request to the OLEDB provider and pass > the "work" to the linked server. If you were hooked into some other type of > engine (i.e. JET/FoxPro/Excel), then the work is performed by SQL Server, as the > OLEDB provider would have restrictions on the data source. More info on > distributed queries is in the links below. I do use distributed queries in my > environment, but in my situation I generally join them against > local tables, which means SQL Server has to pull the data over to do the join locally. > If instead I perform an entire query on the linked server, the processing > occurs on the remote server, such as: > > SELECT * FROM ServerName.Database.Owner.Table Where Field1 IN ('a','b', > 'c', 'd') > > hth > > White Paper on performance: > http://citeseer.ist.psu.edu/rd/93453806%2C732761%2C1%2C0.25%2CDownload/http%3AqSqqSqhome.comcast.netqSq%7EevilconroyqSqICDE2005-MicrosoftSQLServerDistributedQuery.pdf > > Architecture: > http://msdn2.microsoft.com/en-us/library/ms191277.aspx > > Distributed Queries > http://msdn2.microsoft.com/en-us/library/ms188721.aspx -- -Francisco http://sqlthis.blogspot.com | Tsql and More...
From ebarro at verizon.net Wed May 16 11:22:09 2007 From: ebarro at verizon.net (Eric Barro) Date: Wed, 16 May 2007 09:22:09 -0700 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <20070516154726.A12BFBD61@smtp-auth.no-ip.com> Message-ID: <0JI500IN66T9A1Q7@vms048.mailsrvcs.net> I'm guessing that you are having record locking issues. I have seen SQL work CPU utilization up to 100% many times. The more memory and resources you allocate for it the hungrier it becomes. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 8:47 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses I had never seen this machine "lock up". It is a dual core 3.8G with 4 g of RAM and even when the cores are pegged it will usually switch tasks and stuff. And of course I can't absolutely 100% determine that SQL Server is the cause. The lockup happened last night late, at the very end of a query that pulled a count based on the four queries mentioned in other emails. It happened again this morning, again at the very end of the same query. When it happens, the task manager shows 100% cpu utilization (for both cores). Unfortunately I did not get a chance to go check which task was using what of that 100%. It locked it up so tight that I couldn't even switch between tasks. As soon as the query completed, the machine returned to normal so I still assume that it was SQL Server. Again though, what does the IN clause have to do with the network. You still are not explaining that. From my understanding, this is all happening on one machine and so I am curious what is being passed over the network, and to whom? John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Wednesday, May 16, 2007 11:32 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses Hi john: This type of statement 'IN' may grind a network of users to a stand still but you are the only user so who cares but... Maybe it was a time-out issue.... but to lock it up?? Wow, never heard of that happening. Have you tried nested queries? Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 7:39 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses I can tell you that at the very end it locked up my dual proc machine as it pulls the results all together. I have never seen that happen before. Aside from that, what does the network have to do with anything? This is running in SQL Server 2005, using tables / drives on the same machine. John W. Colby Colby Consulting www.ColbyConsulting.com From ebarro at verizon.net Wed May 16 11:24:41 2007 From: ebarro at verizon.net (Eric Barro) Date: Wed, 16 May 2007 09:24:41 -0700 Subject: [dba-SQLServer] Using Databases on separate machines - performance In-Reply-To: Message-ID: <0JI5001DY6XIECA8@vms044.mailsrvcs.net> You do give up the chance to use locking hints with linked servers. Ex: SELECT FROM nativeTable (nolock) This will not work with linked servers... SELECT FROM LinkedServerName.DatabaseName.dbo.linkedserverTableName (nolock) (nolock) helps speed up SELECT queries considerably. I use them all the time. 
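Eric's hint usage, sketched against the tables from this thread (the Zip value is made up for illustration). WITH (NOLOCK) is the fuller spelling of the same hint; it reads uncommitted data, which is only safe in a scenario like John's, where the big tables are never updated while queries run.

```sql
-- Local table: the NOLOCK hint is accepted and skips shared locks.
SELECT PK
FROM tblAZHSID WITH (NOLOCK)
WHERE Zip = '28036';

-- Linked-server table: table hints are rejected on a four-part
-- name, so the same effect has to be applied on the remote side
-- (for example inside an OPENQUERY pass-through statement).
```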
-----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Wednesday, May 16, 2007 9:19 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Using Databases on separate machines - performance John, You're describing distributed queries, which you can perform such as SELECT * FROM ServerName.Database.Owner.Table These instructions will pass the request to the OLEDB provider and pass the "work" to the linked server. If you were hooked into some other type of engine ie, JET/Foxpro excel, then the work is performed by sql server as the oledb provider would have restrictions on the datasource. More info on distributed queries in the links below. I do use distributed queries in my environment, but in my situation I generally cause them to join against local tables, which then it has to pull data over to do the join locally. If instead I perform an entire query on the linked server the processing occurs on the remote server. such as: SELECT * FROM ServerName.Database.Owner.Table Where Field1 IN ('a','b', 'c', 'd') hth White Paper on performance: http://citeseer.ist.psu.edu/rd/93453806%2C732761%2C1%2C0.25%2CDownload/http% 3AqSqqSqhome.comcast.netqSq%7EevilconroyqSqICDE2005-MicrosoftSQLServerDistri butedQuery.pdf Architecture: http://msdn2.microsoft.com/en-us/library/ms191277.aspx Distributed Queries http://msdn2.microsoft.com/en-us/library/ms188721.aspx On 5/16/07, jwcolby wrote: > > I now have two "high powered" servers built up. Both run Windows 2003 > and SQL Server 2005. > > ATM I am running my entire database on a single machine. The database > consists of a couple of largish "raw data" tables (65 million recs / > 700+ fields; 97 million recs / 149 fields), which contain the data > pulled in from text files, with a autoincrement PK added for tracking > purposes. 
Each raw data table then has address / PK fields pulled out > and "sent out" for address validation. The results are then > reimported back in to SQL Server, into the same DB as the raw data > table sits in. I have created separate DBF files for each "database" > (raw / validated). > > As I mentioned earlier, I am creating a new dbf file set for each "order" > I > receive from my customer, where I build up the views required to > process that specific order. That is working quite well BTW. > > I have a bunch of questions re performance. I have discovered that I > can create queries / views that pull the data straight out of the > desired db / table when I use that data in another db, simply by just > referencing the database / table. I think I can do the same thing if > pieces are on another server instance. What I am considering doing is > placing these huge raw / validated database files out on the > StoneHenge server, leaving the Azul server to contain and process the > orders. Stonehenge is the newer machine and has a single partition > with 1.6 tbytes open, and I will be adding another 500 gb to that > soon. Thus this seems like the logical home for the big databases. > > I have a gbit switch between all the machines on my network. > > My question is, will there be any pros/cons to organizing things this way. > I can get about 450mb burst streaming data off of my raid arrays which > is considerably above the 1 gb switch capacity, but it seems unlikely > that SQL Server would actually process data at that speed anyway. So > I want to place the big source databases on the new server and the > order database on the original server. > > To give an example of a real order I created a set of queries: > > 1) One query talks to the tblHSIDRaw table (75 million records / 700 > fields), asking for > > "SELECT PKID WHERE ... " > > The Where clause encompasses about 9 different fields. 
All the fields > are indexed, though how useful the indexes are (in all cases) is in doubt. > > 2) Another query uses a small table of 180 ZIPS provided by the client. > That ZIP table is joined on the ZIP column of tblAZHSID which is the > table that has been processed for valid addresses. tblAZHSID is ~50 > million records with about 20 fields. > > "SELECT PK from tblZIPS inner join tblAZHSID on ..." > > 3) A third query requests data from tblHSIDRaw for a where on a single > specific field. > > "SELECT PK FROM tblHSIDRaw WHERE FieldX IN () > > I did that just because this has to be an AND with all the 9 fields in > the first query. Perhaps not the most efficient but it works. > > 4) A query that inner joins all the PKs and returns a result set. > > So what happens if I place these tblHSIDRaw and tblAZHSID out on > Stonehenge? > I assume that processor is passed the select clauses and performs the > processing required to return the PKs requested? > > Does Azul do any processing? Perhaps the join of the result sets in > query > 4 > to create the final data set? > > Is the process faster or slower than if the whole shootin match ran on > a single machine? > > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com No virus found in this incoming message. Checked by AVG Free Edition. 
Version: 7.5.467 / Virus Database: 269.7.1/805 - Release Date: 5/15/2007 10:47 AM From jwcolby at colbyconsulting.com Wed May 16 11:55:57 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 12:55:57 -0400 Subject: [dba-SQLServer] Using Databases on separate machines- performance In-Reply-To: <0JI5001DY6XIECA8@vms044.mailsrvcs.net> Message-ID: <20070516165556.36A25BEF8@smtp-auth.no-ip.com> Hmmm... In my case I NEVER need locks. There are no updates on the existing tables. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Wednesday, May 16, 2007 12:25 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Using Databases on separate machines- performance You do give up the chance to use locking hints with linked servers. Ex: SELECT FROM nativeTable (nolock) This will not work with linked servers... SELECT FROM LinkedServerName.DatabaseName.dbo.linkedserverTableName (nolock) (nolock) helps speed up SELECT queries considerably. I use them all the time. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Wednesday, May 16, 2007 9:19 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Using Databases on separate machines - performance John, You're describing distributed queries, which you can perform such as SELECT * FROM ServerName.Database.Owner.Table These instructions will pass the request to the OLEDB provider and pass the "work" to the linked server. If you were hooked into some other type of engine ie, JET/Foxpro excel, then the work is performed by sql server as the oledb provider would have restrictions on the datasource. More info on distributed queries in the links below. 
I do use distributed queries in my environment, but in my situation I generally cause them to join against local tables, which then it has to pull data over to do the join locally. If instead I perform an entire query on the linked server the processing occurs on the remote server. such as: SELECT * FROM ServerName.Database.Owner.Table Where Field1 IN ('a','b', 'c', 'd') hth White Paper on performance: http://citeseer.ist.psu.edu/rd/93453806%2C732761%2C1%2C0.25%2CDownload/http% 3AqSqqSqhome.comcast.netqSq%7EevilconroyqSqICDE2005-MicrosoftSQLServerDistri butedQuery.pdf Architecture: http://msdn2.microsoft.com/en-us/library/ms191277.aspx Distributed Queries http://msdn2.microsoft.com/en-us/library/ms188721.aspx On 5/16/07, jwcolby wrote: > > I now have two "high powered" servers built up. Both run Windows 2003 > and SQL Server 2005. > > ATM I am running my entire database on a single machine. The database > consists of a couple of largish "raw data" tables (65 million recs / > 700+ fields; 97 million recs / 149 fields), which contain the data > pulled in from text files, with a autoincrement PK added for tracking > purposes. Each raw data table then has address / PK fields pulled out > and "sent out" for address validation. The results are then > reimported back in to SQL Server, into the same DB as the raw data > table sits in. I have created separate DBF files for each "database" > (raw / validated). > > As I mentioned earlier, I am creating a new dbf file set for each "order" > I > receive from my customer, where I build up the views required to > process that specific order. That is working quite well BTW. > > I have a bunch of questions re performance. I have discovered that I > can create queries / views that pull the data straight out of the > desired db / table when I use that data in another db, simply by just > referencing the database / table. I think I can do the same thing if > pieces are on another server instance. 
What I am considering doing is > placing these huge raw / validated database files out on the > StoneHenge server, leaving the Azul server to contain and process the > orders. Stonehenge is the newer machine and has a single partition > with 1.6 tbytes open, and I will be adding another 500 gb to that > soon. Thus this seems like the logical home for the big databases. > > I have a gbit switch between all the machines on my network. > > My question is, will there be any pros/cons to organizing things this way. > I can get about 450mb burst streaming data off of my raid arrays which > is considerably above the 1 gb switch capacity, but it seems unlikely > that SQL Server would actually process data at that speed anyway. So > I want to place the big source databases on the new server and the > order database on the original server. > > To give an example of a real order I created a set of queries: > > 1) One query talks to the tblHSIDRaw table (75 million records / 700 > fields), asking for > > "SELECT PKID WHERE ... " > > The Where clause encompasses about 9 different fields. All the fields > are indexed, though how useful the indexes are (in all cases) is in doubt. > > 2) Another query uses a small table of 180 ZIPS provided by the client. > That ZIP table is joined on the ZIP column of tblAZHSID which is the > table that has been processed for valid addresses. tblAZHSID is ~50 > million records with about 20 fields. > > "SELECT PK from tblZIPS inner join tblAZHSID on ..." > > 3) A third query requests data from tblHSIDRaw for a where on a single > specific field. > > "SELECT PK FROM tblHSIDRaw WHERE FieldX IN () > > I did that just because this has to be an AND with all the 9 fields in > the first query. Perhaps not the most efficient but it works. > > 4) A query that inner joins all the PKs and returns a result set. > > So what happens if I place these tblHSIDRaw and tblAZHSID out on > Stonehenge? 
> I assume that processor is passed the select clauses and performs the > processing required to return the PKs requested? > > Does Azul do any processing? Perhaps the join of the result sets in > query > 4 > to create the final data set? > > Is the process faster or slower than if the whole shootin match ran on > a single machine? > > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com No virus found in this incoming message. Checked by AVG Free Edition. Version: 7.5.467 / Virus Database: 269.7.1/805 - Release Date: 5/15/2007 10:47 AM _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From DavidL at sierranevada.com Wed May 16 11:50:50 2007 From: DavidL at sierranevada.com (David Lewis) Date: Wed, 16 May 2007 09:50:50 -0700 Subject: [dba-SQLServer] comma delimited list In-Reply-To: References: Message-ID: <00101736F13D774F88C54058CB2663C8015F7CA2@celebration.sierranevada.corp> Wow. Cool. I have never come across that before. I can't wait to try it. I have always resorted to comma delimited lists. Thx. D Message: 8 Date: Tue, 15 May 2007 19:44:14 -0700 From: "kens.programming" Subject: Re: [dba-SQLServer] IN() or NOT IN() To: Message-ID: <001401c79764$1ec020a0$6b01a8c0 at Stoker.com> Content-Type: text/plain; charset=windows-1250 You shouldn't have to use a comma limited list, just square brackets to designate your sets. 
IN ([4-9], [A-M]) IN ([^1-3], [^N-T]) NOT IN ([1-3], [N-T]) NOT IN ([^4-9], [^A-M]) Ken From accessd at shaw.ca Wed May 16 11:59:28 2007 From: accessd at shaw.ca (Jim Lawrence) Date: Wed, 16 May 2007 09:59:28 -0700 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <20070516154726.A12BFBD61@smtp-auth.no-ip.com> Message-ID: <0JI500DWK8AW1K00@l-daemon> Hi John: The IN statement usually is a pig on resources. Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 8:47 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses I had never seen this machine "lock up". It is a dual core 3.8G with 4 g of RAM and even when the cores are pegged it will usually switch tasks and stuff. And of course I can't absolutely 100% determine that SQL Server is the cause. The lockup happened last night late, at the very end of a query that pulled a count based on the four queries mentioned in other emails. It happened again this morning, again at the very end of the same query. When it happens, the task manager shows 100% cpu utilization (for both cores). Unfortunately I did not get a chance to go check which task was using what of that 100%. It locked it up so tight that I couldn't even switch between tasks. As soon as the query completed, the machine returned to normal so I still assume that it was SQL Server. Again though, what does the IN clause have to do with the network. You still are not explaining that. From my understanding, this is all happening on one machine and so I am curious what is being passed over the network, and to whom? John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Wednesday, May 16, 2007 11:32 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses Hi john: This type of statement 'IN' may grind a network of users to a stand still but you are the only user so who cares but... Maybe it was a time-out issue.... but to lock it up?? Wow, never heard of that happening. Have you tried nested queries? Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 7:39 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses I can tell you that at the very end it locked up my dual proc machine as it pulls the results all together. I have never seen that happen before. Aside from that, what does the network have to do with anything? This is running in SQL Server 2005, using tables / drives on the same machine. John W. Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed May 16 12:15:40 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 13:15:40 -0400 Subject: [dba-SQLServer] Processing diverse where clauses In-Reply-To: <0JI500DWK8AW1K00@l-daemon> Message-ID: <20070516171538.30D61BF0E@smtp-auth.no-ip.com> In the future I might try a subquery. John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Wednesday, May 16, 2007 12:59 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses Hi John: The IN statement usually is a pig on resources. Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 8:47 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses I had never seen this machine "lock up". It is a dual core 3.8G with 4 g of RAM and even when the cores are pegged it will usually switch tasks and stuff. And of course I can't absolutely 100% determine that SQL Server is the cause. The lockup happened last night late, at the very end of a query that pulled a count based on the four queries mentioned in other emails. It happened again this morning, again at the very end of the same query. When it happens, the task manager shows 100% cpu utilization (for both cores). Unfortunately I did not get a chance to go check which task was using what of that 100%. It locked it up so tight that I couldn't even switch between tasks. As soon as the query completed, the machine returned to normal so I still assume that it was SQL Server. Again though, what does the IN clause have to do with the network. You still are not explaining that. From my understanding, this is all happening on one machine and so I am curious what is being passed over the network, and to whom? John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Wednesday, May 16, 2007 11:32 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses Hi john: This type of statement 'IN' may grind a network of users to a stand still but you are the only user so who cares but... Maybe it was a time-out issue.... but to lock it up?? Wow, never heard of that happening. Have you tried nested queries? Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 7:39 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Processing diverse where clauses I can tell you that at the very end it locked up my dual proc machine as it pulls the results all together. I have never seen that happen before. Aside from that, what does the network have to do with anything? This is running in SQL Server 2005, using tables / drives on the same machine. John W. Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed May 16 12:15:40 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 13:15:40 -0400 Subject: [dba-SQLServer] Using Databases on separate machines -performance In-Reply-To: Message-ID: <20070516171538.95B82BF38@smtp-auth.no-ip.com> Thanks for the white paper links. 
It certainly sounds like I could get processing done on both servers by moving the big tables over to Stonehenge. I think I will try this and report back on some timings. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Wednesday, May 16, 2007 12:19 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Using Databases on separate machines -performance John, You're describing distributed queries, which you can perform such as SELECT * FROM ServerName.Database.Owner.Table These instructions will pass the request to the OLEDB provider and pass the "work" to the linked server. If you were hooked into some other type of engine ie, JET/Foxpro excel, then the work is performed by sql server as the oledb provider would have restrictions on the datasource. More info on distributed queries in the links below. I do use distributed queries in my environment, but in my situation I generally cause them to join against local tables, which then it has to pull data over to do the join locally. If instead I perform an entire query on the linked server the processing occurs on the remote server. such as: SELECT * FROM ServerName.Database.Owner.Table Where Field1 IN ('a','b', 'c', 'd') hth White Paper on performance: http://citeseer.ist.psu.edu/rd/93453806%2C732761%2C1%2C0.25%2CDownload/http% 3AqSqqSqhome.comcast.netqSq%7EevilconroyqSqICDE2005-MicrosoftSQLServerDistri butedQuery.pdf Architecture: http://msdn2.microsoft.com/en-us/library/ms191277.aspx Distributed Queries http://msdn2.microsoft.com/en-us/library/ms188721.aspx On 5/16/07, jwcolby wrote: > > I now have two "high powered" servers built up. Both run Windows 2003 > and SQL Server 2005. > > ATM I am running my entire database on a single machine. 
The database > consists of a couple of largish "raw data" tables (65 million recs / > 700+ fields; 97 million recs / 149 fields), which contain the data > pulled in from text files, with a autoincrement PK added for tracking > purposes. Each raw data table then has address / PK fields pulled out > and "sent out" for address validation. The results are then > reimported back in to SQL Server, into the same DB as the raw data > table sits in. I have created separate DBF files for each "database" > (raw / validated). > > As I mentioned earlier, I am creating a new dbf file set for each "order" > I > receive from my customer, where I build up the views required to > process that specific order. That is working quite well BTW. > > I have a bunch of questions re performance. I have discovered that I > can create queries / views that pull the data straight out of the > desired db / table when I use that data in another db, simply by just > referencing the database / table. I think I can do the same thing if > pieces are on another server instance. What I am considering doing is > placing these huge raw / validated database files out on the > StoneHenge server, leaving the Azul server to contain and process the > orders. Stonehenge is the newer machine and has a single partition > with 1.6 tbytes open, and I will be adding another 500 gb to that > soon. Thus this seems like the logical home for the big databases. > > I have a gbit switch between all the machines on my network. > > My question is, will there be any pros/cons to organizing things this way. > I can get about 450mb burst streaming data off of my raid arrays which > is considerably above the 1 gb switch capacity, but it seems unlikely > that SQL Server would actually process data at that speed anyway. So > I want to place the big source databases on the new server and the > order database on the original server. 
> > To give an example of a real order I created a set of queries: > > 1) One query talks to the tblHSIDRaw table (75 million records / 700 > fields), asking for > > "SELECT PKID WHERE ... " > > The Where clause encompasses about 9 different fields. All the fields > are indexed, though how useful the indexes are (in all cases) is in doubt. > > 2) Another query uses a small table of 180 ZIPS provided by the client. > That ZIP table is joined on the ZIP column of tblAZHSID which is the > table that has been processed for valid addresses. tblAZHSID is ~50 > million records with about 20 fields. > > "SELECT PK from tblZIPS inner join tblAZHSID on ..." > > 3) A third query requests data from tblHSIDRaw for a where on a single > specific field. > > "SELECT PK FROM tblHSIDRaw WHERE FieldX IN () > > I did that just because this has to be an AND with all the 9 fields in > the first query. Perhaps not the most efficient but it works. > > 4) A query that inner joins all the PKs and returns a result set. > > So what happens if I place these tblHSIDRaw and tblAZHSID out on > Stonehenge? > I assume that processor is passed the select clauses and performs the > processing required to return the PKs requested? > > Does Azul do any processing? Perhaps the join of the result sets in > query > 4 > to create the final data set? > > Is the process faster or slower than if the whole shootin match ran on > a single machine? > > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... 
_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From kens.programming at verizon.net Wed May 16 12:21:04 2007 From: kens.programming at verizon.net (kens.programming) Date: Wed, 16 May 2007 10:21:04 -0700 Subject: [dba-SQLServer] IN() or NOT IN() In-Reply-To: <464B014D.8582.425F5223@stuart.lexacorp.com.pg> References: <20070516023425.21E79BC0F@smtp-auth.no-ip.com>, <001401c79764$1ec020a0$6b01a8c0@Stoker.com> <464B014D.8582.425F5223@stuart.lexacorp.com.pg> Message-ID: <003e01c797de$9630c160$6b01a8c0@Stoker.com> You're right, I thought I had used it this way before, but it appears that I was incorrect. What I must have been thinking of was: LIKE ('[4-9, A-M]') LIKE ('[^1-3, N-T]') NOT LIKE ('[1-3, N-T]') NOT LIKE ('[^4-9, A-M]') John, the ^ negates the character set. Ken -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stuart McLachlan Sent: Tuesday, May 15, 2007 8:04 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] IN() or NOT IN() Are you sure? I've never seen anything that says that IN() takes anything other than a Subquery or a List of expressions. The SQL Server 2000 that I have here returns: "Invalid column name 4-9" if I try that. On 15 May 2007 at 19:44, kens.programming wrote: > You shouldn't have to use a comma limited list, just square brackets to > designate your sets. 
> > IN ([4-9], [A-M]) > IN ([^1-3], [^N-T]) > NOT IN ([1-3], [N-T]) > NOT IN ([^4-9], [^A-M]) > > Ken > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby > Sent: Tuesday, May 15, 2007 7:34 PM > To: dba-sqlserver at databaseadvisors.com > Subject: [dba-SQLServer] IN() or NOT IN() > > I am trying to process a query where an income field has a set of possible > values, 1-9 and A-T. The client wants values 4-9 and A-M. Logically that > would be more efficient if it was NOT IN(1-3,n-t). Is it in fact more > efficient? And can ranges like that be specified or do I need to use comma > delimited lists 1,2,3,n,o,p...? > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > No virus found in this incoming message. > Checked by AVG Free Edition. > Version: 7.5.467 / Virus Database: 269.7.1/805 - Release Date: 5/15/2007 > 10:47 AM > > > No virus found in this outgoing message. > Checked by AVG Free Edition. > Version: 7.5.467 / Virus Database: 269.7.1/805 - Release Date: 5/15/2007 > 10:47 AM > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From erbachs at gmail.com Wed May 16 12:56:12 2007 From: erbachs at gmail.com (Steve Erbach) Date: Wed, 16 May 2007 12:56:12 -0500 Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQL Server Message-ID: <39cb22f30705161056r36586c6cw3108419a9811fdf2@mail.gmail.com> Dear Group, I've been the chief designer for a vertical market application that was created with Microsoft Access 2003. The application has been installed in a handful of locations around the country, some single-user, some multi-user. The man who markets and owns the application would like to convert it to a centralized web-based application. To keep the cost of maintenance and upgrades to a minimum I have suggested that rather than creating a separate SQL Server database for each customer installation of the product, every customer currently using the Access product would append his data to a "master" set of tables in one database on one SQL Server. The upshot would be that each company would have its own ID and the transactions and products specific to each company would be tagged with that ID. (There are certain tables that could be shared in common...certain lists of items common to all customers). Each company would use Views, etc., that show just its own data. None of this data is proprietary or particularly sensitive (it's hazmat record-keeping). Just for a quick look at the current app: http://www.swerbach.com/EnviroPlus/ . The SQL Server capability would be rented from one of the commercial web hosts. The volume of data is actually quite small. We're talking maybe 5 MB in Access for a couple years' worth of information for each company. Of course, if the server goes down then everybody goes down. But the positives, I think, would be ease of upgrading, keeping everybody at the same revision level simultaneously, and low cost. Do you see any flies in the ointment here? 
I think it's very feasible, but I'd welcome any cautioning voices. Sincerely, Steven W. Erbach Neenah, WI http://thetowncrank.blogspot.com From carbonnb at gmail.com Wed May 16 13:06:28 2007 From: carbonnb at gmail.com (Bryan Carbonnell) Date: Wed, 16 May 2007 14:06:28 -0400 Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQL Server In-Reply-To: <39cb22f30705161056r36586c6cw3108419a9811fdf2@mail.gmail.com> References: <39cb22f30705161056r36586c6cw3108419a9811fdf2@mail.gmail.com> Message-ID: On 5/16/07, Steve Erbach wrote: > Of course, if the server goes down then everybody goes down. But the > positives, I think, would be ease of upgrading, keeping everybody at > the same revision level simultaneously, and low cost. > > Do you see any flies in the ointment here? I think it's very feasible, > but I'd welcome any cautioning voices. Does everyone automatically get updates or do they have to pay for them? What if someone doesn't want to or just doesn't pay for an upgrade? What if they don't want the data to travel on the 'net, but want it installed on their intranet? Those two spring to mind quickly. -- Bryan Carbonnell - carbonnb at gmail.com Life's journey is not to arrive at the grave safely in a well preserved body, but rather to skid in sideways, totally worn out, shouting "What a great ride!" From erbachs at gmail.com Wed May 16 14:59:55 2007 From: erbachs at gmail.com (Steve Erbach) Date: Wed, 16 May 2007 14:59:55 -0500 Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQL Server In-Reply-To: References: <39cb22f30705161056r36586c6cw3108419a9811fdf2@mail.gmail.com> Message-ID: <39cb22f30705161259n2e94a70dkfad4b44c9a70a4d5@mail.gmail.com> Bryan, Very good questions! This is a rather unique application. The owner of it developed it in Paradox for Windows when he had his own hazmat consulting company. He was hired by a local company to run its environmental/legal department. 
He allowed the company to use his software for its required environmental reporting (EPA and Wisconsin Dept. of National Racehorses). I upgraded the Paradox app and then recommended switching to Access. He worked out arrangements with a number of suppliers and sister companies to install the app. I support and install and upgrade all of them. IT'S A PAIN! But he's the one that figures out if somebody needs to pay anything or not. He seems to think that the all-in-one location of the data and the code will be an excellent idea. I can hardly disagree. As far as transmission over the Internet, as I said, the data is quite innocuous, though we could set up an SSL certificate, I suppose, couldn't we? Thanks for your input. I really appreciate it. Steve Erbach Neenah, WI http://TheTownCrank.blogspot.com On 5/16/07, Bryan Carbonnell wrote: > On 5/16/07, Steve Erbach wrote: > > > Of course, if the server goes down then everybody goes down. But the > > positives, I think, would be ease of upgrading, keeping everybody at > > the same revision level simultaneously, and low cost. > > > > Do you see any flies in the ointment here? I think it's very feasible, > > but I'd welcome any cautioning voices. > > Does everyone automatically get updates or do they have to pay for > them? What if someone doesn't want to or just doesn't pay for an > upgrade? > > What if they don't want the data to travel on the 'net, but what it > installed on their Intranet? > > Those 2 spring to mind quickly. > > -- > Bryan Carbonnell - carbonnb at gmail.com From jwcolby at colbyconsulting.com Wed May 16 15:17:55 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 16:17:55 -0400 Subject: [dba-SQLServer] Output wizard choked Message-ID: <20070516201754.34C1CBF44@smtp-auth.no-ip.com> I tried to set up an export using the export wizard. It choked because the data in the address1 field was larger than the allowed data field width that I entered. 
The output file is a fixed width file spec'd by the customer so what there is is what there is. I saw no place to edit the transform or tell it to go ahead and trim the data. I did save the transform info in a DTSX file, though the last time I did that it didn't end up very useful. What is the SQL Server syntax for returning the first N characters? Any suggestions? John W. Colby Colby Consulting www.ColbyConsulting.com From ebarro at verizon.net Wed May 16 15:48:34 2007 From: ebarro at verizon.net (Eric Barro) Date: Wed, 16 May 2007 13:48:34 -0700 Subject: [dba-SQLServer] Output wizard choked In-Reply-To: <20070516201754.34C1CBF44@smtp-auth.no-ip.com> Message-ID: <0JI500EJOJ68A048@vms042.mailsrvcs.net> SUBSTRING('abcdef', 2, 3) returns bcd -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 1:18 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Output wizard choked I tried to set up an export using the export wizard. It choked because the data in the address1 field was larger than the allowed data field width that I entered. The output file is a fixed width file spec'd by the customer so what there is is what there is. I saw no place to edit the transform or tell it to go ahead and trim the data. I did save the transform info in a DTSX file, though the last time I did that it didn't end up very useful. What is the SQL Server syntax for returning the first N characters? Any suggestions? John W. 
Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Wed May 16 15:49:56 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Wed, 16 May 2007 13:49:56 -0700 Subject: [dba-SQLServer] Output wizard choked In-Reply-To: <20070516201754.34C1CBF44@smtp-auth.no-ip.com> References: <20070516201754.34C1CBF44@smtp-auth.no-ip.com> Message-ID: are you asking in a select statement? it would just be the same thing like access right(field1, n) On 5/16/07, jwcolby wrote: > > I tried to set up an export using the export wizard. It choked because > the > data in the address1 field was larger than the allowed data field width > that > I entered. The output file is a fixed width file speced by the customer > so > what there is is what there is. I saw no place to edit the transform or > tell it to go ahead and trim the data. I did save the transform info in a > DTSX file, though the last time I did that it didn't end up very useful. > > What is the SQL Server syntax for returning the first N characters? > > Any suggestions. > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From jwcolby at colbyconsulting.com Wed May 16 15:55:50 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 16:55:50 -0400 Subject: [dba-SQLServer] Output wizard choked In-Reply-To: Message-ID: <20070516205549.0BD06BBF8@smtp-auth.no-ip.com> Yep. I am actually using Left(). Thanks. Now I can't get the output wizard to allow me to edit the field widths for a fixed width output. 
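[One way past the wizard's width complaint is to make the SELECT itself produce fixed-width values, truncating long ones and padding short ones, so every row already matches the customer's spec. An editor's sketch; the column names and widths are illustrative, only tblAZHSID comes from the thread.]

```sql
-- LEFT() truncates anything over the spec'd width; appending
-- REPLICATE(' ', n) first guarantees short values are padded out,
-- so each output column is always exactly n characters wide.
-- ISNULL keeps NULL addresses from nulling out the whole expression.
SELECT LEFT(ISNULL(Address1, '') + REPLICATE(' ', 30), 30) AS Address1Out,  -- 30 wide
       LEFT(ISNULL(City, '')     + REPLICATE(' ', 20), 20) AS CityOut       -- 20 wide
FROM dbo.tblAZHSID;
```

[Pointing the export wizard, or bcp with the queryout option, at a view or query shaped like this sidesteps the width-editing problem entirely.]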
I do so love this stuff. Time to get past the wizards I suppose. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Francisco Tapia Sent: Wednesday, May 16, 2007 4:50 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Output wizard choked are you asking in a select statement? it would just be the same thing like access right(field1, n) On 5/16/07, jwcolby wrote: > > I tried to set up an export using the export wizard. It choked > because the data in the address1 field was larger than the allowed > data field width that I entered. The output file is a fixed width > file speced by the customer so what there is is what there is. I saw > no place to edit the transform or tell it to go ahead and trim the > data. I did save the transform info in a DTSX file, though the last > time I did that it didn't end up very useful. > > What is the SQL Server syntax for returning the first N characters? > > Any suggestions. > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Wed May 16 15:59:24 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Wed, 16 May 2007 16:59:24 -0400 Subject: [dba-SQLServer] Output wizard choked In-Reply-To: <0JI500EJOJ68A048@vms042.mailsrvcs.net> Message-ID: <20070516205923.5927EBDC4@smtp-auth.no-ip.com> I gotta get a book for this stuff. John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Wednesday, May 16, 2007 4:49 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Output wizard choked SUBSTRING('abcdef', 2, 3) returns bcd -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 1:18 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Output wizard choked I tried to set up an export using the export wizard. It choked because the data in the address1 field was larger than the allowed data field width that I entered. The output file is a fixed width file speced by the customer so what there is is what there is. I saw no place to edit the transform or tell it to go ahead and trim the data. I did save the transform info in a DTSX file, though the last time I did that it didn't end up very useful. What is the SQL Server syntax for returning the first N characters? Any suggestions. John W. 
Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From ebarro at verizon.net Wed May 16 16:05:31 2007 From: ebarro at verizon.net (Eric Barro) Date: Wed, 16 May 2007 14:05:31 -0700 Subject: [dba-SQLServer] Output wizard choked In-Reply-To: <20070516205923.5927EBDC4@smtp-auth.no-ip.com> Message-ID: <0JI500M3JJYOPCA7@vms046.mailsrvcs.net> Just google it :) I've found articles on www.sqlteam.com quite helpful. A subscription to SQLMag.com has been quite helpful as well. I also have the Microsoft SQL Server 2000 Bible from Wiley Press. -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 1:59 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Output wizard choked I gotta get a book for this stuff. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Wednesday, May 16, 2007 4:49 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Output wizard choked SUBSTRING('abcdef', 2, 3) returns bcd -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby Sent: Wednesday, May 16, 2007 1:18 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Output wizard choked I tried to set up an export using the export wizard. 
It choked because the data in the address1 field was larger than the allowed data field width that I entered. The output file is a fixed width file spec'd by the customer so what there is is what there is. I saw no place to edit the transform or tell it to go ahead and trim the data. I did save the transform info in a DTSX file, though the last time I did that it didn't end up very useful. What is the SQL Server syntax for returning the first N characters? Any suggestions? John W. Colby Colby Consulting www.ColbyConsulting.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From accessd at shaw.ca Wed May 16 17:14:34 2007 From: accessd at shaw.ca (Jim Lawrence) Date: Wed, 16 May 2007 15:14:34 -0700 Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQL Server In-Reply-To: <0JI5001TZ1BFERK7@vms044.mailsrvcs.net> Message-ID: <0JI5000CCMW0Y900@l-daemon> Hi Steve: That sure is a pretty application. If going down is a concern some site can provide a fail-over scenario. When the DNS is set up, one zone would point to one host and the other zone to another host. Each would have a synchronized data set. The Zone managers are multiple sites as well. Sounds like a very workable system. 
Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve Erbach Sent: Wednesday, May 16, 2007 10:56 AM To: SQLList Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQL Server Dear Group, I've been the chief designer for a vertical market application that was created with Microsoft Access 2003. The application has been installed in a handful of locations around the country, some single-user, some multi-user. The man who markets and owns the application would like to convert it to a centralized web-based application. To keep the cost of maintenance and upgrades to a minimum I have suggested that rather than creating a separate SQL Server database for each customer installation of the product, every customer currently using the Access product would append his data to a "master" set of tables in one database on one SQL Server. The upshot would be that each company would have its own ID and the transactions and products specific to each company would be tagged with that ID. (There are certain tables that could be shared in common...certain lists of items common to all customers). Each company would use Views, etc., that show just its own data. None of this data is proprietary or particularly sensitive (it's hazmat record-keeping). Just for a quick look at the current app: http://www.swerbach.com/EnviroPlus/ . The SQL Server capability would be rented from one of the commercial web hosts. The volume of data is actually quite small. We're talking maybe 5 MB in Access for a couple years worth of information for each company. Of course, if the server goes down then everybody goes down. But the positives, I think, would be ease of upgrading, keeping everybody at the same revision level simultaneously, and low cost. Do you see any flies in the ointment here? I think it's very feasible, but I'd welcome any cautioning voices. Sincerely, Steven W. 
Erbach Neenah, WI http://thetowncrank.blogspot.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From erbachs at gmail.com Thu May 17 09:30:49 2007 From: erbachs at gmail.com (Steve Erbach) Date: Thu, 17 May 2007 09:30:49 -0500 Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQL Server In-Reply-To: <0JI5000CCMW0Y900@l-daemon> References: <39cb22f30705161056r36586c6cw3108419a9811fdf2@mail.gmail.com> <0JI5000CCMW0Y900@l-daemon> Message-ID: <39cb22f30705170730s35acae7bqd3061f2fa446e7ee@mail.gmail.com> Jim, > That sure is a pretty application. < Why, thank you, Jim! I'll pass that along to my wife, Janet; she was the graphic designer. I don't think that going down will be a concern. This application is mostly used once a month to record data for the previous month to report out to the various guvmint agencies. It's time critical to a degree, but everybody knows to get the info into the system on a timely basis for monthly reporting. I'm thinking that it's pretty workable, too. The only question mark with respect to the web host is what will the extra charge be for additional SQL Server logins. I think that would be the way to go instead of using one master login to save a few bucks and then having separate application logins. I don't want anybody getting the idea that they just have to have SQL Management Studio to get in to look at their data in the raw. Steve Erbach Neenah, WI http://TheTownCrank.blogspot.com On 5/16/07, Jim Lawrence wrote: > Hi Steve: > > That sure is a pretty application. > > If going down is a concern some site can provide a fail-over scenario. When > the DNS is setup one zone would point to one host and other zone point to > another host. Each would have a synchronized data set. The Zone managers are > multiple sites as well. > > Sounds like a very workable system. 
> > Jim > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve > Erbach > Sent: Wednesday, May 16, 2007 10:56 AM > To: SQLList > Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQL > Server > > Dear Group, > > I've been the chief designer for a vertical market application that > was created with Microsoft Access 2003. The application has been > installed in a handful of locations around the country, some > single-user, some multi-user. > From jlawrenc1 at shaw.ca Thu May 17 10:29:32 2007 From: jlawrenc1 at shaw.ca (Jim Lawrence) Date: Thu, 17 May 2007 08:29:32 -0700 Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQLServer In-Reply-To: <39cb22f30705170730s35acae7bqd3061f2fa446e7ee@mail.gmail.com> Message-ID: <0JI6005F0YSWR360@l-daemon> Hi Steve: I thought I recognized a woman's touch.... I do not mean this in a negative way as my wife is a professional portrait artist and my daughters are in the design field with one being a web/fashion designer and the other a computer animator. (http://www.mirandalawrence.com, http://www.corinnajasmine.com, http://www.kflamenco.com + http://www.karadesigns.ca ) If your host has software online you can always roll-your-own login page. Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve Erbach Sent: Thursday, May 17, 2007 7:31 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Upsizing and consolidating Access data into SQLServer Jim, > That sure is a pretty application. < Why, thank you, Jim! I'll pass that along to my wife, Janet; she was the graphic designer. I don't think that going down will be a concern. This application is mostly used once a month to record data for the previous month to report out to the various guvmint agencies. 
It's time critical to a degree, but everybody knows to get the info into the system on a timely basis for monthly reporting. I'm thinking that it's pretty workable, too. The only question mark with respect to the web host is what will the extra charge be for additional SQL Server logins. I think that would be the way to go instead of using one master login to save a few bucks and then having separate application logins. I don't want anybody getting the idea that they just have to have SQL Management Studio to get in to look at their data in the raw. Steve Erbach Neenah, WI http://TheTownCrank.blogspot.com On 5/16/07, Jim Lawrence wrote: > Hi Steve: > > That sure is a pretty application. > > If going down is a concern some site can provide a fail-over scenario. When > the DNS is setup one zone would point to one host and other zone point to > another host. Each would have a synchronized data set. The Zone managers are > multiple sites as well. > > Sounds like a very workable system. > > Jim > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve > Erbach > Sent: Wednesday, May 16, 2007 10:56 AM > To: SQLList > Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQL > Server > > Dear Group, > > I've been the chief designer for a vertical market application that > was created with Microsoft Access 2003. The application has been > installed in a handful of locations around the country, some > single-user, some multi-user. > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From DavidL at sierranevada.com Thu May 17 12:41:36 2007 From: DavidL at sierranevada.com (David Lewis) Date: Thu, 17 May 2007 10:41:36 -0700 Subject: [dba-SQLServer] Wizards, books, etc. 
In-Reply-To: References: Message-ID: <00101736F13D774F88C54058CB2663C8015F81B8@celebration.sierranevada.corp> Hi John: Re: "time to get past the wizards" -- very true. They are great, but as soon as one gets past 123 (which your project passed long ago) they are more trouble than they are worth. Re: "gotta get a book" -- Ken Henderson's books are great, Itzik Ben-Gan, Joe Celko, and of course the one at your fingertips (BOL!!)... In addition I highly recommend a sql server magazine subscription, sswug.org, and sqlservercentral.com. I started in Access ~10 years ago, then was compelled to move to sql server ~7. I was used to building queries in Access' gui, and absolutely hated (and was incapable of) writing raw sql. However, once I made the plunge and weaned myself from access, I found there were many, many possibilities that were simply not available in the gui. By that I mean there are many t-sql structures that the gui cannot display, and because I had limited myself to the gui I had never learned them. Sort of like playing the piano with two fingers, then suddenly discovering you have 8 more. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- ------------------------------ Message: 5 Date: Wed, 16 May 2007 16:59:24 -0400 From: "jwcolby" Subject: Re: [dba-SQLServer] Output wizard choked To: Message-ID: <20070516205923.5927EBDC4 at smtp-auth.no-ip.com> Content-Type: text/plain; charset="us-ascii" I gotta get a book for this stuff. John W. Colby Colby Consulting www.ColbyConsulting.com ********************************************* From jwcolby at colbyconsulting.com Thu May 17 13:40:48 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Thu, 17 May 2007 14:40:48 -0400 Subject: [dba-SQLServer] Wizards, books, etc. In-Reply-To: <00101736F13D774F88C54058CB2663C8015F81B8@celebration.sierranevada.corp> Message-ID: <20070517184046.DBA41BEF9@smtp-auth.no-ip.com> Thanks for the encouragement. One of the problems that many of us face is the sheer diversity of the job spec. I have to this point worked mostly in Access because my job was delivering database applications to small companies as a consultant. I do not "have a job" or "have a boss"; I have many different jobs and many different bosses. I happened to get a client that wanted a database done in SQL Server, importing a huge (to me, coming from Access) address list into SQL Server. It slowly over time morphed (as these things tend to do) and the complexity morphed with it. It is no longer "just a list" maintained in SQL Server but moving towards a full blown application intended to "merge" multiple lists into a single master name / address system, complete with time stamps for when a person lived at a specific location, and the ability to export / import all the data to a system for Address validation, and another system for building queries against the data. And so here I am struggling to learn a very complex tool (SQL Server) which I never really had a need for until this specific job came along. 
And of course VB.Net in order to take the user interface and utilities out of Access and into an environment where the application can breathe. I am essentially starting from scratch in both tools. And of course I am building servers, installing OSs and SQL Server and Visual Studio and... all the rest of the stuff needed... A lot to learn, quickly, and yet still supporting all of my existing Access clients too. That is one reason that I try to use the wizards where I can, to stretch out the learning curve a bit. I have always been a "Yes, I can do that" kind of guy, and I will, but it is stressful ATM. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of David Lewis Sent: Thursday, May 17, 2007 1:42 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Wizards, books, etc. Hi John: Re: "time to get past the wizards" -- very true. They are great, but as soon as one gets past 123 (which your project passed long ago) they are more trouble than they are worth. Re: "gotta get a book" -- Ken Henderson's books are great, Itzik Ben-Gan, Joe Celko, and of course the one at your fingertips (BOL!!)... In addition I highly recommend a sql server magazine subscription, sswug.org, and sqlservercentral.com. I started in Access ~10 years ago, then was compelled to move to sql server ~7. I was used to building queries in Access' gui, and absolutely hated (and was incapable of) writing raw sql. However, once I made the plunge and weaned myself from access, I found there were many, many possibilities that were simply not available in the gui. By that I mean there are many t-sql structures that the gui cannot display, and because I had limited myself to the gui I had never learned them. Sort of like playing the piano with two fingers, then suddenly discovering you have 8 more. 
It takes some practice to bring the other 8 into play, but once you do there is no going back. I realize it is hard to do all this while under the pressure of deadlines, but hang in there. D Message: 4 Date: Wed, 16 May 2007 16:55:50 -0400 From: "jwcolby" Subject: Re: [dba-SQLServer] Output wizard choked To: Message-ID: <20070516205549.0BD06BBF8 at smtp-auth.no-ip.com> Content-Type: text/plain; charset="us-ascii" Yep. I am actually using Left(). Thanks. Now I can't get the output wizard to allow me to edit the field widths for a fixed width output. I do so love this stuff. Time to get past the wizards I suppose. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From rl_stewart at highstream.net Thu May 17 15:05:25 2007 From: rl_stewart at highstream.net (Robert L. Stewart) Date: Thu, 17 May 2007 15:05:25 -0500 Subject: [dba-SQLServer] Upsizing and consolidating Access data In-Reply-To: References: Message-ID: <200705172006.l4HK6lme020086@databaseadvisors.com> Steve, Actually, if you set up the site to use the ASP.Net 2.0 login scripts, you can have as many logins as you want at the application level and not worry about it at the SQL Server level. You would just link the User info from the tables created by the setup program to your HazMat tables. I have done it for simple name and address dbs and it works quite well.
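As a sketch of what that link might look like in T-SQL (the aspnet_Users table and its UserId/UserName columns come from the standard ASP.NET 2.0 membership setup created by aspnet_regsql; the HazMat table and the login name here are hypothetical):

```sql
-- Hypothetical application table keyed to the ASP.NET membership user id.
CREATE TABLE dbo.HazMatReport (
    ReportID int IDENTITY(1,1) PRIMARY KEY,
    UserId uniqueidentifier NOT NULL
        REFERENCES dbo.aspnet_Users (UserId),
    ReportDate datetime NOT NULL,
    Notes varchar(255) NULL
)

-- "Who entered what" at the application level; no per-user SQL Server login needed.
SELECT u.UserName, r.ReportDate, r.Notes
FROM dbo.aspnet_Users u
JOIN dbo.HazMatReport r ON r.UserId = u.UserId
WHERE u.UserName = 'serbach'   -- hypothetical application login
```

One master SQL Server login serves the web app; the per-person identity lives entirely in the membership tables.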
Robert At 10:25 AM 5/17/2007, you wrote: >Date: Thu, 17 May 2007 09:30:49 -0500 >From: "Steve Erbach" >Subject: Re: [dba-SQLServer] Upsizing and consolidating Access data > into SQL Server >To: dba-sqlserver at databaseadvisors.com >Message-ID: > <39cb22f30705170730s35acae7bqd3061f2fa446e7ee at mail.gmail.com> >Content-Type: text/plain; charset=ISO-8859-1; format=flowed > >Jim, > >> That sure is a pretty application. < > >Why, thank you, Jim! I'll pass that along to my wife, Janet; she was >the graphic designer. > >I don't think that going down will be a concern. This application is >mostly used once a month to record data for the previous month to >report out to the various guvmint agencies. It's time critical to a >degree, but everybody knows to get the info into the system on a >timely basis for monthly reporting. > >I'm thinking that it's pretty workable, too. The only question mark >with respect to the web host is what will the extra charge be for >additional SQL Server logins. I think that would be the way to go >instead of using one master login to save a few bucks and then having >separate application logins. I don't want anybody getting the idea >that they just have to have SQL Management Studio to get in to look at >their data in the raw. > >Steve Erbach >Neenah, WI From accessd at shaw.ca Thu May 17 16:05:51 2007 From: accessd at shaw.ca (Jim Lawrence) Date: Thu, 17 May 2007 14:05:51 -0700 Subject: [dba-SQLServer] OT SharePoint In-Reply-To: <0JI5001TZ1BFERK7@vms044.mailsrvcs.net> Message-ID: <0JI7003QVEDDE680@l-daemon> Hi Martin and Eric: Thank you so much for your help. I will be looking into this further when the desktop gets a little more cleared of projects. If you do not mind I will have some more questions to ask by then.
Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Wednesday, May 16, 2007 7:23 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Jim, Knowledge of the app pool works in your favor if you have to debug. Otherwise you don't have to mess with it. Same goes for web parts. It's a plug-n-play proposition. I was just giving you the lay of the land so to speak. Don't worry about RSS feeds either. MOSS 2007 is the portal technology that sits on top of WSS. WSS is the engine, MOSS is the super-charger or turbo-boost for the engine. Both products are server technologies. Eric -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Tuesday, May 15, 2007 3:31 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Thank you so much, Martin and Eric, for the help. You have given me a lot to chew on, and if it is alright there will be more pointed questions later. Currently I have one of my servers set up with AD, IIS6, .Net FrameWork (not sure which version yet). Does it need SQL Express or is the full-version of SQL 2005 OK? I have a good handle on Web pages, but have done nothing major with the App Pool or anything with web parts or RSS feeds... Is this MOSS server some kind of pre-configured SharePoint module or a total stand-alone proprietary application? Or is it a custom server? TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Tuesday, May 15, 2007 1:57 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Jim, If you just want to have a feel for what Sharepoint looks and feels like you can download the following: 1.
WSS 3.0 - Windows Sharepoint Services 2. .NET 2.0 is required - pre-requisite since WSS uses the .NET framework 3. SQL Server Express 2005 4. SQL Management Studio - to manage SQL Express 2005. You will also need to draw on the following skill set: 1. Active Directory - SP is tightly integrated with AD for security. 2. IIS 6.0 - this serves the pages. 3. IIS application pools - recommend creating a super user for managing SP. When it installs it needs an AD user that has access privileges to SQL server and the application pools. I usually create one called spsAdmin and use that whenever SP requires a user. It uses that to impersonate connections, etc...saves you a lot of heartache later. 4. SQL server - navigating and poking around to see where SP places the databases and tables. The config database in SQL is the key. Once the installation runs smoothly you should be able to configure and play. SP uses the concept of web parts. Think of web parts as mini-applications that can be plugged into a main web page. Thus you can have a web part that displays the weather in your location using RSS feeds from weather.com or some other site. You can have a web part that consumes RSS feeds from Wired.com or any site that has them. In other words you get a lot of functionality with little or no programming because each web part has been pre-programmed to do a specific thing. Web parts can also share information. One web part can accept data input from the user and send that off to another web part to display the results. Setting up users requires AD. Eric -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Jim Lawrence Sent: Tuesday, May 15, 2007 1:22 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Thank you, Steve, Eric and Martin for the information... So is it just a document manager?
I guess it would be great for a lawyers', accountants' or even a government office. Is the app installed automatically and just has to be located and initialized, or is it sitting on the Server 2003 installation disks? Is it fairly intuitive or does it require a great deal of preparation? Is the MOSS a client app for desktop stations? Can an ordinary browser use it? Is it fairly straightforward to set up clients? If I decide to play, are there any 'gotchas' to look out for? TIA Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Martin Reid Sent: Tuesday, May 15, 2007 10:05 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint I have just moved job to oversee the deployment of a 29,000 user MOSS install. First 300 user site goes live July. Out of the box it provides great features. With some programming even more. I am lucky as I have been paired with one of the best programmers in the University for this. We are initially using it to drive almost all of our administrative functions. Document sharing, communications, etc. It's early days and it does have issues once you get beyond the user interface, but it is very good. The searching ability it has to search MOSS sites, Exchange and file shares etc is really useful to us. Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com No virus found in this incoming message. Checked by AVG Free Edition.
Version: 7.5.467 / Virus Database: 269.7.0/804 - Release Date: 5/14/2007 4:46 PM _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From mwp.reid at qub.ac.uk Thu May 17 16:10:07 2007 From: mwp.reid at qub.ac.uk (Martin Reid) Date: Thu, 17 May 2007 22:10:07 +0100 Subject: [dba-SQLServer] OT SharePoint References: <0JI7003QVEDDE680@l-daemon> Message-ID: Jim No problem. Have to admit it's a huge bit of software. We have 3 additional servers arriving this week. The main issue I am having, other than SharePoint directly, is that it touches everything else in the infrastructure. Email, Active Directory, Windows Server, IIS, SQL Server and in a few months Oracle!!! Spent 4 hours with a consultant today installing the farm. My major plus is we have a huge consultant budget to call on as we are moving to Microsoft across the enterprise and we have little or no skills in house other than three of us who have sort of been cobbled together for this project. It's actually getting fairly intense building up this project and trying to keep on top of all of it. I am beginning to know how JC must be feeling. Eventually we will have almost 30,000 users!
feel sick thinking about it (<: Martin Martin WP Reid Training and Assessment Unit Riddle Hall Belfast tel: 02890 974465 ________________________________ From: dba-sqlserver-bounces at databaseadvisors.com on behalf of Jim Lawrence Sent: Thu 17/05/2007 22:05 To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Hi Martin and Eric: Thank you so much for your help. I will be looking into this further when the desktop gets a little more cleared of projects. If you do not mind I will have some more questions to ask by then. Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro Sent: Wednesday, May 16, 2007 7:23 AM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] OT SharePoint Jim, From fuller.artful at gmail.com Fri May 18 06:26:58 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Fri, 18 May 2007 07:26:58 -0400 Subject: [dba-SQLServer] Output wizard choked In-Reply-To: <20070516201754.34C1CBF44@smtp-auth.no-ip.com> References: <20070516201754.34C1CBF44@smtp-auth.no-ip.com> Message-ID: <29f585dd0705180426k72bd430erb4d73af466772c07@mail.gmail.com> Same as Access: LEFT(column, n) On 5/16/07, jwcolby wrote: > > I tried to set up an export using the export wizard. It choked because > the > data in the address1 field was larger than the allowed data field width > that > I entered. The output file is a fixed width file speced by the customer > so > what there is is what there is. I saw no place to edit the transform or > tell it to go ahead and trim the data. I did save the transform info in a > DTSX file, though the last time I did that it didn't end up very useful. > > What is the SQL Server syntax for returning the first N characters? > > Any suggestions. > > John W.
Colby > Colby Consulting > www.ColbyConsulting.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From jwcolby at colbyconsulting.com Fri May 18 08:34:15 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Fri, 18 May 2007 09:34:15 -0400 Subject: [dba-SQLServer] OT: FYI-System Transfer timing Message-ID: <20070518133413.A28F8BF0A@smtp-auth.no-ip.com> I just thought you might be interested in some numbers, transferring a large file from system to system on a network. Two identical computers, 3.8g X2 AMD proc systems, running Windows 2003. Both systems run Comodo personal firewall (software firewall) with specific rules allowing transfers from/to any other computer within my internal network. Both systems use an Areca 1220 dedicated RAID controller, and both systems use Seagate 7200.10 drives in the arrays. The "From" system has a Raid6 Array, the "To" system has a Raid 5 array. There is a gigabit switch between the systems. I am transferring a 120 gbyte SQL Server database file (mdf). When the transfer started it "settled down" after a couple of seconds saying it would take 48 minutes to transfer the file, which indicates about 2.5 gigabytes / minute, 42 mb / second. Testing has shown the read speed to be about 450 mbyte / sec for these arrays, so that 42 mb / second is most likely the write speed of the Raid5 destination array. Write speed for these arrays is just slightly worse than the write speed of any single disk. Using task manager to simply view the network usage, the network seems to be using about 40% capacity on average. Again, using task manager, the CPU usage for the two cores shows core one swinging between 0 and 40% with a rough average around 20%. Core two is swinging between 60% and 80%.
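As a quick sanity check of those figures in decimal units (a rough back-of-the-envelope, nothing more):

```python
# Quoted figures: a 120 GB file, estimated at 48 minutes to transfer.
size_gb = 120
minutes = 48

gb_per_min = size_gb / minutes        # GB per minute
mb_per_sec = gb_per_min * 1000 / 60   # MB per second, taking 1 GB = 1000 MB

print(f"{gb_per_min} GB/min, ~{mb_per_sec:.0f} MB/s")  # 2.5 GB/min, ~42 MB/s
```

So the quoted 2.5 GB/minute and 42 MB/second are internally consistent, and 42 MB/s is indeed in single-disk write territory rather than anywhere near the 450 MB/s array read speed.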
When the work is steady (and there are places where both cores, but particularly core 2 varies wildly), the "average" is reported as around 40%, as displayed in the CPU Usage. All of this usage being on the transmitting system. The task reporting most usage time is system idle, then explorer. System two (the receiving system) shows almost no Core 1 usage and Core 2 swinging wildly, but again averaging around 40% or so usage, both cores combined, per the CPU Usage display. John W. Colby Colby Consulting www.ColbyConsulting.com From erbachs at gmail.com Fri May 18 16:41:15 2007 From: erbachs at gmail.com (Steve Erbach) Date: Fri, 18 May 2007 16:41:15 -0500 Subject: [dba-SQLServer] Upsizing and consolidating Access data into SQLServer In-Reply-To: <0JI6005F0YSWR360@l-daemon> References: <39cb22f30705170730s35acae7bqd3061f2fa446e7ee@mail.gmail.com> <0JI6005F0YSWR360@l-daemon> Message-ID: <39cb22f30705181441v61ee935fo4decc9cad6e5d8fc@mail.gmail.com> Jim, Whoa! There is some serious talent in your family! Beautiful stuff! > If your host has software online you can always roll-your-own login page. < Not sure what you mean by that. What kind of "software online"? Steve Erbach On 5/17/07, Jim Lawrence wrote: > Hi Steve: > > I thought I recognized a woman's touch.... I do not mean this in a negative > way as my wife is a professional portrait artist and my daughters are in the > design field with one being a web/fashion designer and the other a computer > animator. (http://www.mirandalawrence.com, http://www.corinnajasmine.com, > http://www.kflamenco.com + http://www.karadesigns.ca ) > > If your host has software online you can always roll-your-own login page.
> > Jim > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve > Erbach > Sent: Thursday, May 17, 2007 7:31 AM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] Upsizing and consolidating Access data into > SQLServer > > Jim, > > > That sure is a pretty application. < > > Why, thank you, Jim! I'll pass that along to my wife, Janet; she was > the graphic designer. > > I don't think that going down will be a concern. This application is > mostly used once a month to record data for the previous month to > report out to the various guvmint agencies. It's time critical to a > degree, but everybody knows to get the info into the system on a > timely basis for monthly reporting. > > I'm thinking that it's pretty workable, too. The only question mark > with respect to the web host is what will the extra charge be for > additional SQL Server logins. I think that would be the way to go > instead of using one master login to save a few bucks and then having > separate application logins. I don't want anybody getting the idea > that they just have to have SQL Management Studio to get in to look at > their data in the raw. > > Steve Erbach > Neenah, WI > http://TheTownCrank.blogspot.com > > > On 5/16/07, Jim Lawrence wrote: > > Hi Steve: > > > > That sure is a pretty application. From jlawrenc1 at shaw.ca Fri May 18 19:30:40 2007 From: jlawrenc1 at shaw.ca (Jim Lawrence) Date: Fri, 18 May 2007 17:30:40 -0700 Subject: [dba-SQLServer] Upsizing and consolidating Access data intoSQLServer In-Reply-To: <39cb22f30705181441v61ee935fo4decc9cad6e5d8fc@mail.gmail.com> Message-ID: <0JI9003ZAIK9SNF1@l-daemon> Hi Steve: Well, thank you for the compliment. Virtually any kind of web based software like PHP, Perl, Ruby, Python, Asp, Asp.Net, ColdFusion etc etc...
(I have functional skills with PHP, ASP and ColdFusion, marginal skills with ASP.Net and "Yes-I-have-seen-them" skills with Perl, Ruby and Python.) Most host companies support a few packages that you as a customer could use, with no additional charge. Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve Erbach Sent: Friday, May 18, 2007 2:41 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Upsizing and consolidating Access data intoSQLServer Jim, Whoa! There is some serious talent in your family! Beautiful stuff! > If your host has software online you can always roll-your-own login page. < Not sure what you mean by that. What kind of "software online"? Steve Erbach On 5/17/07, Jim Lawrence wrote: > Hi Steve: > > I thought I recognized a woman's touch.... I do not mean this in a negative > way as my wife is a professional portrait artist and my daughters are in the > design field with one being a web/fashion designer and the other a computer > animator. (http://www.mirandalawrence.com, http://www.corinnajasmine.com, > http://www.kflamenco.com + http://www.karadesigns.ca ) > > If your host has software online you can always roll-your-own login page. > > Jim > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve > Erbach > Sent: Thursday, May 17, 2007 7:31 AM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] Upsizing and consolidating Access data into > SQLServer > > Jim, > > > That sure is a pretty application. < > > Why, thank you, Jim! I'll pass that along to my wife, Janet; she was > the graphic designer. > > I don't think that going down will be a concern. This application is > mostly used once a month to record data for the previous month to > report out to the various guvmint agencies.
It's time critical to a > degree, but everybody knows to get the info into the system on a > timely basis for monthly reporting. > > I'm thinking that it's pretty workable, too. The only question mark > with respect to the web host is what will the extra charge be for > additional SQL Server logins. I think that would be the way to go > instead of using one master login to save a few bucks and then having > separate application logins. I don't want anybody getting the idea > that they just have to have SQL Management Studio to get in to look at > their data in the raw. > > Steve Erbach > Neenah, WI > http://TheTownCrank.blogspot.com > > > On 5/16/07, Jim Lawrence wrote: > > Hi Steve: > > > > That sure is a pretty application. _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From erbachs at gmail.com Sat May 19 12:49:11 2007 From: erbachs at gmail.com (Steve Erbach) Date: Sat, 19 May 2007 12:49:11 -0500 Subject: [dba-SQLServer] Upsizing and consolidating Access data intoSQLServer In-Reply-To: <0JI9003ZAIK9SNF1@l-daemon> References: <39cb22f30705181441v61ee935fo4decc9cad6e5d8fc@mail.gmail.com> <0JI9003ZAIK9SNF1@l-daemon> Message-ID: <39cb22f30705191049t3ac1e82eice1714e90344cd44@mail.gmail.com> Jim, So, you yourself have "rolled your own" SQL login page in, say, PHP or ASP? I've edited the code for an ASP login page but I'll have to dig it up to see what I did. I'm all ASP.NET now. Steve Erbach http://TheTownCrank.blogspot.com On 5/18/07, Jim Lawrence wrote: > Hi Steve: > > Well, thank you for the complment. > > Virtually any kind of web based software like, PHP Perl, Ruby, Python, Asp, > Asp.Net, ColdFusion etc etc... (I have functional skills with PHP, ASP, > ColdFUsion), marginal skills with ASP.Net and "Yes-I-have-seen-them" skills > with Perl, Ruby and Python.) 
Most host companies support a few packages that > you as a customer could use, with no additional charge. > > Jim > > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Steve > Erbach > Sent: Friday, May 18, 2007 2:41 PM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] Upsizing and consolidating Access data > intoSQLServer > > Jim, > > Whoa! There is some serious talent in your family! Beautiful stuff! > > > If your host has software online you can always roll-your-own login page. > < > > Not sure what you mean by that. What kind of "software online"? > > Steve Erbach From martyconnelly at shaw.ca Tue May 22 16:35:45 2007 From: martyconnelly at shaw.ca (MartyConnelly) Date: Tue, 22 May 2007 14:35:45 -0700 Subject: [dba-SQLServer] Output wizard choked In-Reply-To: <0JI500M3JJYOPCA7@vms046.mailsrvcs.net> References: <0JI500M3JJYOPCA7@vms046.mailsrvcs.net> Message-ID: <46536231.7040004@shaw.ca> Here is the start of a series of articles on performance tuning SQL One is a 30:1 increase of speed on INSERTs, another on NOLOCK use http://www.sqlservercentral.com/columnists/jSebastian/2944.asp Eric Barro wrote: >Just google it :) > > >I've found articles on www.sqlteam.com quite helpful. A subscription to >SQLMag.com has been quite helfpul as well. I also have the Microsoft SQL >Server 2000 Bible from Wiley Press. > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby >Sent: Wednesday, May 16, 2007 1:59 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Output wizard choked > >I gotta get a book for this stuff. > >John W. 
Colby >Colby Consulting >www.ColbyConsulting.com >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro >Sent: Wednesday, May 16, 2007 4:49 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Output wizard choked > >SUBSTRING('abcdef', 2, 3) returns bcd > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby >Sent: Wednesday, May 16, 2007 1:18 PM >To: dba-sqlserver at databaseadvisors.com >Subject: [dba-SQLServer] Output wizard choked > >I tried to set up an export using the export wizard. It choked because the >data in the address1 field was larger than the allowed data field width that >I entered. The output file is a fixed width file speced by the customer so >what there is is what there is. I saw no place to edit the transform or >tell it to go ahead and trim the data. I did save the transform info in a >DTSX file, though the last time I did that it didn't end up very useful. > >What is the SQL Server syntax for returning the first N characters? > >Any suggestions. > >John W. Colby >Colby Consulting >www.ColbyConsulting.com > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > > > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > >No virus found in this incoming message. >Checked by AVG Free Edition. 
>Version: 7.5.467 / Virus Database: 269.7.1/805 - Release Date: 5/15/2007 >10:47 AM > > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > > > > > -- Marty Connelly Victoria, B.C. Canada From jlawrenc1 at shaw.ca Tue May 22 18:39:56 2007 From: jlawrenc1 at shaw.ca (Jim Lawrence) Date: Tue, 22 May 2007 16:39:56 -0700 Subject: [dba-SQLServer] Output wizard choked In-Reply-To: <46536231.7040004@shaw.ca> Message-ID: <0JIG00LHBUTOI360@l-daemon> Hi Marty: Thanks for the heads up. I will post a link to the article. Jim -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of MartyConnelly Sent: Tuesday, May 22, 2007 2:36 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] Output wizard choked Here is the start of a series of articles on performance tuning SQL One is a 30:1 increase of speed on INSERTs, another on NOLOCK use http://www.sqlservercentral.com/columnists/jSebastian/2944.asp Eric Barro wrote: >Just google it :) > > >I've found articles on www.sqlteam.com quite helpful. A subscription to >SQLMag.com has been quite helfpul as well. I also have the Microsoft SQL >Server 2000 Bible from Wiley Press. > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby >Sent: Wednesday, May 16, 2007 1:59 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Output wizard choked > >I gotta get a book for this stuff. > >John W. 
Colby >Colby Consulting >www.ColbyConsulting.com >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Eric Barro >Sent: Wednesday, May 16, 2007 4:49 PM >To: dba-sqlserver at databaseadvisors.com >Subject: Re: [dba-SQLServer] Output wizard choked > >SUBSTRING('abcdef', 2, 3) returns bcd > >-----Original Message----- >From: dba-sqlserver-bounces at databaseadvisors.com >[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of jwcolby >Sent: Wednesday, May 16, 2007 1:18 PM >To: dba-sqlserver at databaseadvisors.com >Subject: [dba-SQLServer] Output wizard choked > >I tried to set up an export using the export wizard. It choked because the >data in the address1 field was larger than the allowed data field width that >I entered. The output file is a fixed width file speced by the customer so >what there is is what there is. I saw no place to edit the transform or >tell it to go ahead and trim the data. I did save the transform info in a >DTSX file, though the last time I did that it didn't end up very useful. > >What is the SQL Server syntax for returning the first N characters? > >Any suggestions. > >John W. Colby >Colby Consulting >www.ColbyConsulting.com > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > > > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > >No virus found in this incoming message. >Checked by AVG Free Edition. 
>Version: 7.5.467 / Virus Database: 269.7.1/805 - Release Date: 5/15/2007 >10:47 AM > > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > > > > > -- Marty Connelly Victoria, B.C. Canada _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fhtapia at gmail.com Tue May 22 19:26:19 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Tue, 22 May 2007 17:26:19 -0700 Subject: [dba-SQLServer] scheduling a vbscript Message-ID: I have the following script (see below) that I can run by double-clicking it on the SQL Server's host (ie, ProdServer). I get two pieces of information from this: first, it writes out a txt file to a webserver where we serve it up; when a server stops responding (ie, does not write out its file) the webserver alerts critical people (SysAdmin and DBA (me)). The second part collects all drive information and places it into a 3rd sql server for a graphical representation. This allows us to watch space growth, and we can view the online charts to see what the prediction path is (ie, 3 months before we run out of space, etc...). But this code will not run from a job in sql server; the only error message I can make sense of is "The command script does not destroy all the objects that it creates. Revise the command script. (Microsoft SQL Server, Error: 14277)" Any ideas?

'==========================================================================
'
' NAME: DrivemonClient.vbs
'
' AUTHOR: joe
' DATE : 9/26/2006
'
' COMMENT:
' This script reports the drive usage of all fixed drives on the system
' it is run. The report will be posted to Const URL
'
'==========================================================================

' Constants for drive types
Const Unknown = 0
Const Removable = 1
Const Fixed = 2
Const Remote = 3
Const CDROM = 4
Const RAMDisk = 5

dim svr

' general constants
'use blat here or on server
'Const MailServer = "127.0.0.1"
'Const MailServerPort = "25"
Const URL = "http://ws.PRODcnc.net/drivemon.asp?Drivedata="

'====================================================================================
' Begin main code
'====================================================================================
on error resume next
str = ""
set oFs = WScript.CreateObject("Scripting.FileSystemObject")
set oDrives = oFs.Drives
svr = "(PROD) " & GetCurrentComputerName ' get name only once for performance reasons

for each oDrive in oDrives
  Select case oDrive.DriveType
    Case Fixed
      str = str & svr & _
        "|" & oDrive.DriveLetter & _
        "|" & oDrive.TotalSize & _
        "|" & oDrive.FreeSpace
  End Select
  if err.number = 0 then postdata str
  str = ""
next

set oFs = Nothing
set oDrives = Nothing
set str = nothing

dim txtFile
dim mfile

'EDIT WHERE TO WRITE THE FILE AND Server Name
''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
mfile = "\\PRODinet\PRODinet_d_root\PRODinet\SqlCheckPROD\" & svr & ".txt"
''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
Dim fso
set fso = createobject("Scripting.FileSystemObject")
Set txtFile = fso.OpenTextFile(mfile, 2, True)
txtFile.WriteLine now() & " | " & "Server: " & svr
txtFile.Close
Set txtFile = Nothing
Set fso = Nothing

'if err.number = 0 then postdata str

''''''''''''''''''''''''''''''''''''''''
' post to a page that stores the data
''''''''''''''''''''''''''''''''''''''''
sub postData(DriveInfo)
  'msgbox DriveInfo
  Set WshShell = WScript.CreateObject("WScript.Shell")
  Set http = CreateObject("Microsoft.XmlHttp")
  http.open "GET", URL & driveinfo, FALSE
  http.send ""
  'msgbox http.responseText
  set WshShell = nothing
  set http = nothing
end sub

''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
' get current computer name (from system environment variables)
''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
Function GetCurrentComputerName
  set oWsh = WScript.CreateObject("WScript.Shell")
  set oWshSysEnv = oWsh.Environment("PROCESS")
  GetCurrentComputerName = oWshSysEnv("COMPUTERNAME")
  set oWsh = Nothing
  set oWshSysEnv = Nothing
End Function

-- -Francisco http://sqlthis.blogspot.com | Tsql and More... From martyconnelly at shaw.ca Tue May 22 20:55:49 2007 From: martyconnelly at shaw.ca (MartyConnelly) Date: Tue, 22 May 2007 18:55:49 -0700 Subject: [dba-SQLServer] scheduling a vbscript In-Reply-To: References: Message-ID: <46539F25.7010308@shaw.ca> Does DrivemonClient.vbs run normally when you double-click it in Windows Explorer? It could be that you haven't pre-declared Dim oFs, oDrives, oDrive. Oops -- you haven't Set oDrive to Nothing; it's a class too. Francisco Tapia wrote: >I have the following script (see below) that I can run by double-clicking it >on the SQL Server's Host (ie, ProdServer) I get two pieces of information >from this, first it writes out a txt file to a webserver where we serve it >up, when a server stops responding (ie, does not write out its file) then >the webserver alerts critical people (SysAdmin and Dba (Me)), the second >part collects all drive information and places it into a 3rd sql server for >a graphical representation. This allows us to watch space growth and can >view the online charts to see what the prediction path is. (ie, 3 months >before we run out of space, etc...).
> >but this code will not run from a job in sql server giving me the only error >message that I can understand of "The command script does not destroy all >the objects that it creates. Revise the command script. (Microsoft SQL >Server, Error: 14277)" > >any ideas? > > > > > > >'========================================================================== >' >' NAME: DrivemonClient.vbs >' >' AUTHOR: joe >' DATE : 9/26/2006 >' >' COMMENT: >' This script reports the drive usage of all fixed drives on the system ' it >is run. The report will be posted to Const URL ' >'========================================================================== >' Constants for drive types >Const Unknown = 0 >Const Removable = 1 >Const Fixed = 2 >Const Remote = 3 >Const CDROM = 4 >Const RAMDisk = 5 >dim svr > >' general constants >'use blat here or on server >'Const MailServer = "127.0.0.1" >'Const MailServerPort = "25" >Const URL = "http://ws.PRODcnc.net/drivemon.asp?Drivedata=" > >'==================================================================================== >' Begin main code >'==================================================================================== >on error resume next >str = "" >set oFs = WScript.CreateObject("Scripting.FileSystemObject") >set oDrives = oFs.Drives >svr = "(PROD) " & GetCurrentComputerName ' get name only once for >performance reasons for each oDrive in oDrives >for each oDrive in oDrives >Select case oDrive.DriveType >Case Fixed >str = str & svr & _ >"|" & oDrive.DriveLetter & _ >"|" & oDrive.TotalSize & _ >"|" & oDrive.FreeSpace >End Select >if err.number = 0 then postdata str >str="" >next >set oFs = Nothing >set oDrives = Nothing >set str = nothing > >dim txtFile >dim mfile > >'EDIT WHERE TO WRITE THE FILE AND Server Name >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' > 
>mfile = "\\PRODinet\PRODinet_d_root\PRODinet\SqlCheckPROD\" & svr & ".txt" >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' > >Dim fso >set fso = createobject("Scripting.FileSystemObject") >Set txtFile = fso.OpenTextFile(mfile, 2, True) >txtFile.WriteLine now() & " | " & "Server: " & svr >txtFile.Close >Set txtFile = Nothing >Set fso = Nothing > > >'if err.number = 0 then postdata str >'''''''''''''''''''''''''''''''''''''''' >' post to a page that stores the data >'''''''''''''''''''''''''''''''''''''''' >sub postData(DriveInfo) >'msgbox DriveInfo >Set WshShell = WScript.CreateObject("WScript.Shell") >Set http = CreateObject("Microsoft.XmlHttp") >http.open "GET", URL & driveinfo, FALSE >http.send "" >'msgbox http.responseText >set WshShell = nothing >set http = nothing >end sub >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' >' get current computer name (from system environment variables) >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' >Function GetCurrentComputerName >set oWsh = WScript.CreateObject("WScript.Shell") >set oWshSysEnv = oWsh.Environment("PROCESS") >GetCurrentComputerName = oWshSysEnv("COMPUTERNAME") >set oWsh = Nothing >set oWshSysEnv = Nothing >End Function > > > > -- Marty Connelly Victoria, B.C. Canada From fhtapia at gmail.com Tue May 22 21:17:40 2007 From: fhtapia at gmail.com (Francisco Tapia) Date: Tue, 22 May 2007 19:17:40 -0700 Subject: [dba-SQLServer] scheduling a vbscript In-Reply-To: <46539F25.7010308@shaw.ca> References: <46539F25.7010308@shaw.ca> Message-ID: Yes, it runs from the desktop just fine... 
in fact I just decided to schedule it via windows scheduler for now, but I broke it up so that the part that writes out the file goes through the agent, this way I know that sql server is online, and I still get my DB snapshot On 5/22/07, MartyConnelly wrote: > > Does DrivemonClient.vbs run normally when you double click in win > explorer? > It could be that you haven't pre declared > Dim oFs , oDrives, oDrive > > oops you haven't set ODrive to nothing, it's a class too > > > > > > Francisco Tapia wrote: > > >I have the following script (see below) that I can run by double-clicking > it > >on the SQL Server's Host (ie, ProdServer) I get two pieces of information > >from this, first it writes out a txt file to a webserver where we serve > it > >up, when a server stops responding (ie, does not write out it's file) > then > >the webserver alerts critical people (SysAdmin and Dba (Me)), the second > >part collects all drive information and places it into a 3rd sql server > for > >a graphical representation. This allows us to watch space growth and can > >view the online charts to see what the prediction path is. (ie, 3 months > >before we run out of space ,etc...). > > > >but this code will not run from a job in sql server giving me the only > error > >message that I can understand of "The command script does not destroy all > >the objects that it creates. Revise the command script. (Microsoft SQL > >Server, Error: 14277)" > > > >any ideas? > > > > > > > > > > > > > > >'========================================================================== > >' > >' NAME: DrivemonClient.vbs > >' > >' AUTHOR: joe > >' DATE : 9/26/2006 > >' > >' COMMENT: > >' This script reports the drive usage of all fixed drives on the system ' > it > >is run. 
The report will be posted to Const URL ' > > >'========================================================================== > >' Constants for drive types > >Const Unknown = 0 > >Const Removable = 1 > >Const Fixed = 2 > >Const Remote = 3 > >Const CDROM = 4 > >Const RAMDisk = 5 > >dim svr > > > >' general constants > >'use blat here or on server > >'Const MailServer = "127.0.0.1" > >'Const MailServerPort = "25" > >Const URL = "http://ws.PRODcnc.net/drivemon.asp?Drivedata=" > > > > >'==================================================================================== > >' Begin main code > > >'==================================================================================== > >on error resume next > >str = "" > >set oFs = WScript.CreateObject("Scripting.FileSystemObject") > >set oDrives = oFs.Drives > >svr = "(PROD) " & GetCurrentComputerName ' get name only once for > >performance reasons for each oDrive in oDrives > >for each oDrive in oDrives > >Select case oDrive.DriveType > >Case Fixed > >str = str & svr & _ > >"|" & oDrive.DriveLetter & _ > >"|" & oDrive.TotalSize & _ > >"|" & oDrive.FreeSpace > >End Select > >if err.number = 0 then postdata str > >str="" > >next > >set oFs = Nothing > >set oDrives = Nothing > >set str = nothing > > > >dim txtFile > >dim mfile > > > >'EDIT WHERE TO WRITE THE FILE AND Server Name > > >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' > > > >mfile = "\\PRODinet\PRODinet_d_root\PRODinet\SqlCheckPROD\" & svr & > ".txt" > > >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' > > > >Dim fso > >set fso = createobject("Scripting.FileSystemObject") > >Set txtFile = fso.OpenTextFile(mfile, 2, True) 
> >txtFile.WriteLine now() & " | " & "Server: " & svr > >txtFile.Close > >Set txtFile = Nothing > >Set fso = Nothing > > > > > >'if err.number = 0 then postdata str > >'''''''''''''''''''''''''''''''''''''''' > >' post to a page that stores the data > >'''''''''''''''''''''''''''''''''''''''' > >sub postData(DriveInfo) > >'msgbox DriveInfo > >Set WshShell = WScript.CreateObject("WScript.Shell") > >Set http = CreateObject("Microsoft.XmlHttp") > >http.open "GET", URL & driveinfo, FALSE > >http.send "" > >'msgbox http.responseText > >set WshShell = nothing > >set http = nothing > >end sub > >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' > >' get current computer name (from system environment variables) > >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' > >Function GetCurrentComputerName > >set oWsh = WScript.CreateObject("WScript.Shell") > >set oWshSysEnv = oWsh.Environment("PROCESS") > >GetCurrentComputerName = oWshSysEnv("COMPUTERNAME") > >set oWsh = Nothing > >set oWshSysEnv = Nothing > >End Function > > > > > > > > > > -- > Marty Connelly > Victoria, B.C. > Canada > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... From michael at ddisolutions.com.au Tue May 22 22:03:29 2007 From: michael at ddisolutions.com.au (Michael Maddison) Date: Wed, 23 May 2007 13:03:29 +1000 Subject: [dba-SQLServer] scheduling a vbscript References: <46539F25.7010308@shaw.ca> Message-ID: <59A61174B1F5B54B97FD4ADDE71E7D012897E3@ddi-01.DDI.local> Just a guess but does it have sufficient permissions when being run by the scheduler? You can set it to run under a specific account. cheers Michael M Subject: Re: [dba-SQLServer] scheduling a vbscript Yes, it runs from the desktop just fine... 
in fact I just decided to schedule it via windows scheduler for now, but I broke it up so that the part that writes out the file goes through the agent, this way I know that sql server is online, and I still get my DB snapshot On 5/22/07, MartyConnelly wrote: > > Does DrivemonClient.vbs run normally when you double click in win > explorer? > It could be that you haven't pre declared Dim oFs , oDrives, oDrive > > oops you haven't set ODrive to nothing, it's a class too > > > > > > Francisco Tapia wrote: > > >I have the following script (see below) that I can run by > >double-clicking > it > >on the SQL Server's Host (ie, ProdServer) I get two pieces of > >information from this, first it writes out a txt file to a webserver > >where we serve > it > >up, when a server stops responding (ie, does not write out it's file) > then > >the webserver alerts critical people (SysAdmin and Dba (Me)), the > >second part collects all drive information and places it into a 3rd > >sql server > for > >a graphical representation. This allows us to watch space growth and > >can view the online charts to see what the prediction path is. (ie, 3 > >months before we run out of space ,etc...). > > > >but this code will not run from a job in sql server giving me the > >only > error > >message that I can understand of "The command script does not destroy > >all the objects that it creates. Revise the command script. > >(Microsoft SQL Server, Error: 14277)" > > > >any ideas? > > > > > > > > > > > > > > >'==================================================================== > >====== > >' > >' NAME: DrivemonClient.vbs > >' > >' AUTHOR: joe > >' DATE : 9/26/2006 > >' > >' COMMENT: > >' This script reports the drive usage of all fixed drives on the system ' > it > >is run. 
The report will be posted to Const URL ' > > >'==================================================================== > >====== > >' Constants for drive types > >Const Unknown = 0 > >Const Removable = 1 > >Const Fixed = 2 > >Const Remote = 3 > >Const CDROM = 4 > >Const RAMDisk = 5 > >dim svr > > > >' general constants > >'use blat here or on server > >'Const MailServer = "127.0.0.1" > >'Const MailServerPort = "25" > >Const URL = "http://ws.PRODcnc.net/drivemon.asp?Drivedata=" > > > > >'==================================================================== > >================ > >' Begin main code > > >'==================================================================== > >================ > >on error resume next > >str = "" > >set oFs = WScript.CreateObject("Scripting.FileSystemObject") > >set oDrives = oFs.Drives > >svr = "(PROD) " & GetCurrentComputerName ' get name only once for > >performance reasons for each oDrive in oDrives for each oDrive in > >oDrives Select case oDrive.DriveType Case Fixed str = str & svr & _ > >"|" & oDrive.DriveLetter & _ "|" & oDrive.TotalSize & _ "|" & > >oDrive.FreeSpace End Select if err.number = 0 then postdata str > >str="" > >next > >set oFs = Nothing > >set oDrives = Nothing > >set str = nothing > > > >dim txtFile > >dim mfile > > > >'EDIT WHERE TO WRITE THE FILE AND Server Name > > >''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' '''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' '''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' ''''''' > > > >mfile = "\\PRODinet\PRODinet_d_root\PRODinet\SqlCheckPROD\" & svr & > ".txt" > > >''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' '''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' '''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' ''''''' > > > >Dim fso > >set fso = createobject("Scripting.FileSystemObject") > >Set txtFile = fso.OpenTextFile(mfile, 2, True) 
txtFile.WriteLine > >now() & " | " & "Server: " & svr txtFile.Close Set txtFile = Nothing > >Set fso = Nothing > > > > > >'if err.number = 0 then postdata str > >'''''''''''''''''''''''''''''''''''''''' > >' post to a page that stores the data > >'''''''''''''''''''''''''''''''''''''''' > >sub postData(DriveInfo) > >'msgbox DriveInfo > >Set WshShell = WScript.CreateObject("WScript.Shell") > >Set http = CreateObject("Microsoft.XmlHttp") > >http.open "GET", URL & driveinfo, FALSE > >http.send "" > >'msgbox http.responseText > >set WshShell = nothing > >set http = nothing > >end sub > >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' > >' get current computer name (from system environment variables) > >'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''' > >Function GetCurrentComputerName > >set oWsh = WScript.CreateObject("WScript.Shell") > >set oWshSysEnv = oWsh.Environment("PROCESS") > >GetCurrentComputerName = oWshSysEnv("COMPUTERNAME") > >set oWsh = Nothing > >set oWshSysEnv = Nothing > >End Function > > > > > > > > > > -- > Marty Connelly > Victoria, B.C. > Canada > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- -Francisco http://sqlthis.blogspot.com | Tsql and More... _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Thu May 24 20:23:27 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Thu, 24 May 2007 21:23:27 -0400 Subject: [dba-SQLServer] How much storage Message-ID: <20070525012330.424D1BD7B@smtp-auth.no-ip.com> How much storage is used for varbinary? 
I am looking at using the hash function of SQL Server (built in to SQL Server 2005 now), and it returns something like 120-180 "somethings", it is defined as varbinary(8000) maximum. AFAICT it is a fixed width that varies depending on the hash algorithm. Is it returning an array of characters with 120-180 elements? Is binary (or varbinary) defined in bits of a 32 bit word? Is each binary digit stored as a single position in a character? John W. Colby Colby Consulting www.ColbyConsulting.com From stuart at lexacorp.com.pg Thu May 24 20:41:55 2007 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Fri, 25 May 2007 11:41:55 +1000 Subject: [dba-SQLServer] How much storage In-Reply-To: <20070525012330.424D1BD7B@smtp-auth.no-ip.com> References: <20070525012330.424D1BD7B@smtp-auth.no-ip.com> Message-ID: <4656CB83.11219.3738798@stuart.lexacorp.com.pg> On 24 May 2007 at 21:23, jwcolby wrote: > How much storage is used for varbinary? I am looking at using the hash > function of SQL Server (built in to SQL Server 2005 now), and it returns > something like 120-180 "somethings", it is defined as varbinary(8000) > maximum. AFAICT it is a fixed width that varies depending on the hash > algorithm. Is it returning an array of characters with 120-180 elements? > Is binary (or varbinary) defined in bits of a 32 bit word? Is each binary > digit stored as a single position in a character? varBinary = Variable length Binary data. It's just a string of bytes and it uses as much as it needs to store whatever chunk of data is put in it. Are you talking about one of the CHECKSUM functions? From jwcolby at colbyconsulting.com Thu May 24 20:47:08 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Thu, 24 May 2007 21:47:08 -0400 Subject: [dba-SQLServer] How much storage In-Reply-To: <4656CB83.11219.3738798@stuart.lexacorp.com.pg> Message-ID: <20070525014710.DCAD9BDAD@smtp-auth.no-ip.com> >Are you talking about one of the CHECKSUM functions? hashbytes John W. 
Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stuart McLachlan Sent: Thursday, May 24, 2007 9:42 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] How much storage On 24 May 2007 at 21:23, jwcolby wrote: > How much storage is used for varbinary? I am looking at using the > hash function of SQL Server (built in to SQL Server 2005 now), and it > returns something like 120-180 "somethings", it is defined as > varbinary(8000) maximum. AFAICT it is a fixed width that varies > depending on the hash algorithm. Is it returning an array of characters with 120-180 elements? > Is binary (or varbinary) defined in bits of a 32 bit word? Is each > binary digit stored as a single position in a character? varBinary = Variable length Binary data. It's just a string of bytes and it uses as much as it needs to store whatever chunk of data is put in it. Are you talking about one of the CHECKSUM functions? _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From stuart at lexacorp.com.pg Thu May 24 21:35:25 2007 From: stuart at lexacorp.com.pg (Stuart McLachlan) Date: Fri, 25 May 2007 12:35:25 +1000 Subject: [dba-SQLServer] How much storage In-Reply-To: <20070525014710.DCAD9BDAD@smtp-auth.no-ip.com> References: <4656CB83.11219.3738798@stuart.lexacorp.com.pg>, <20070525014710.DCAD9BDAD@smtp-auth.no-ip.com> Message-ID: <4656D80D.8665.3A4845D@stuart.lexacorp.com.pg> In that case. 
If you use HashBytes('MD5', ...), you get back a 128-bit (16-byte) hash (aka message digest), which is usually represented as a string of 32 hex digits. If you use HashBytes('SHA1', ...), you get back a 160-bit (20-byte) hash, which is usually represented as 40 hex digits.

On 24 May 2007 at 21:47, jwcolby wrote:

> >Are you talking about one of the CHECKSUM functions?
>
> hashbytes
>
> John W. Colby
> Colby Consulting
> www.ColbyConsulting.com
> -----Original Message-----
> From: dba-sqlserver-bounces at databaseadvisors.com
> [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stuart McLachlan
> Sent: Thursday, May 24, 2007 9:42 PM
> To: dba-sqlserver at databaseadvisors.com
> Subject: Re: [dba-SQLServer] How much storage
>
> On 24 May 2007 at 21:23, jwcolby wrote:
>
> > How much storage is used for varbinary? I am looking at using the
> > hash function of SQL Server (built in to SQL Server 2005 now), and it
> > returns something like 120-180 "somethings", it is defined as
> > varbinary(8000) maximum. AFAICT it is a fixed width that varies
> > depending on the hash algorithm. Is it returning an array of characters
> > with 120-180 elements?
> > Is binary (or varbinary) defined in bits of a 32 bit word? Is each
> > binary digit stored as a single position in a character?
>
> varBinary = Variable length Binary data. It's just a string of bytes and it
> uses as much as it needs to store whatever chunk of data is put in it.
>
> Are you talking about one of the CHECKSUM functions?
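Those digest sizes are easy to verify against any MD5/SHA-1 implementation; the sketch below uses Python's hashlib purely for illustration (HashBytes itself runs inside SQL Server, but the algorithms produce the same fixed-width output):

```python
import hashlib

data = b"some row data"

md5 = hashlib.md5(data).digest()    # 128-bit digest
sha1 = hashlib.sha1(data).digest()  # 160-bit digest

print(len(md5), len(md5.hex()))     # 16 32  (16 bytes -> 32 hex digits)
print(len(sha1), len(sha1.hex()))   # 20 40  (20 bytes -> 40 hex digits)
```

So even though the column is declared varbinary(8000), an MD5 hash only ever occupies 16 bytes per row (plus SQL Server's small per-value length overhead), because varbinary stores only the actual data length.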
> > > > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > From jwcolby at colbyconsulting.com Sat May 26 06:58:50 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Sat, 26 May 2007 07:58:50 -0400 Subject: [dba-SQLServer] Work with SQL Server and Express edition Message-ID: <20070526115852.05722BE26@smtp-auth.no-ip.com> I want to work with express edition databases as the data source for some of my VB.Net projects, but full SQL Server databases for others. How do I go about doing this? I started with express edition on my new laptop simply because I installed Visual studio first and it installed Express edition. Then I went out and found / installed the Express edition Management Studio. When I tried to install SQL Server Standard Edition, it basically told me there was nothing to do, and did not install Standard Edition Management Studio, which I need. What do I need to do now? John W. Colby Colby Consulting www.ColbyConsulting.com From ssharkins at setel.com Sat May 26 12:58:01 2007 From: ssharkins at setel.com (Susan Harkins) Date: Sat, 26 May 2007 13:58:01 -0400 Subject: [dba-SQLServer] favor Message-ID: <000901c79fbf$68a25ef0$95bc2ad1@SusanOne> I need a favor from someone who's really familiar with using linked servers in SQL Server. This is strictly a favor -- no work or anything like that. You'll get my undying devotation, which is worth squat given my mostly limited expertise with SQL Server. Contact me privately at ssharkins at setel.com if you've got a few minutes -- that's all it will require. Susan H. 
From pcs at azizaz.com Sun May 27 01:41:19 2007
From: pcs at azizaz.com (pcs at azizaz.com)
Date: Sun, 27 May 2007 16:41:19 +1000 (EST)
Subject: [dba-SQLServer] Upsizing Access data into SQL2005 using SSMA
Message-ID: <20070527164119.CUZ15964@dommail.onthenet.com.au>

Hi Group,
I am doing my second upgrade of an Access Db to SQL2005.

In this second upgrade project I've used the SQL Server Migration Assistant (SSMA) to pull the tables and data across from an Access BE rather than using the Upsizing Wizard in Access 2003.

I am sitting with a few questions that I'd appreciate if someone in the group would give their answer / comments to.

1.) TimeStamp column
In some recommended settings I came across for using the Upsizing Wizard in Access 2003, it was suggested not to create the TimeStamp column. In our first project we didn't create the column, and the resulting Access application with a SQL2005 BE is functioning very well in a terminal server environment with 60 users.

The SSMA has created a TimeStamp column in every table migrated. My question is, what is the impact of this? Is the column necessary? There obviously is nothing in our existing code that makes use of the TimeStamp column. And it appears - from our experience with the first migration - that SQL2005 does not require the TimeStamp column to keep track of whether a row has been updated or not. How is the TimeStamp column utilized? Only by the developer through code, or does SQL Server make use of it internally if the column is available?

2. PK Autonumber - increment random
Some of the Access tables have a PK defined as a random AutoNumber. SQL does not like the negative numbers in the PK column, and as a consequence migrates the field as a unique PK but does not set the Identity property of the column. What is the best way of handling this? So far, I am thinking that I just have to create / generate new sequential PK autonumbers and deploy these as new FKs in all relevant tables - before migrating.
Do you agree? 3. Column Names Some column names starting with a number - f.ex. 96FEB is not liked by the SSMA; it throws the following message: "Column '96Feb' has a name that might cause problems for the Access application to function correctly against SQL Server". Do I need to be concerned re column names starting with a number?? Thanks, Regards Borge Hansen From jwcolby at colbyconsulting.com Sun May 27 06:52:27 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Sun, 27 May 2007 07:52:27 -0400 Subject: [dba-SQLServer] Upsizing Access data into SQL2005 using SSMA In-Reply-To: <20070527164119.CUZ15964@dommail.onthenet.com.au> Message-ID: <20070527115230.2F745BDE8@smtp-auth.no-ip.com> >So far, I am thinking that I just have to create / generate new PK sequential autonumbers and deploy these as new FK in all relevant tables - before migrating. Yes, however... You also have to modify the child table(s) as well. Anytime you mess with the PKs you have to mess with the corresponding FKs. >Do I need to be concerned re column names starting with a number?? This is one of those "best practices" kind of thing. Objects should not ever be names starting with a number or special character. Yes, it can be done and yes, it is a bad idea, for precisely this reason. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of pcs at azizaz.com Sent: Sunday, May 27, 2007 2:41 AM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] Upsizing Access data into SQL2005 using SSMA Hi Group, I am doing my second upgrade of an Access Db to 2005SQL. In this second upgrade project I've used the SQL Server Migration Assistant (SSMA) to pull the tables and data across from an Access BE rather than using the Upsizing Wizard in Access 2003. 
I am sitting with a few questions that I'd appreciate if someone in the group would give their answer / comments to. 1.) TimeStamp column In some recommended settings I came across for using the Upsizing Wizard in Access2003, it was suggested to not create the TimeStamp column. In our first project we didn't create the column, and the resulting access application with a SQL2005 BE is functioning very well in a terminal server environment with 60 users. The SSMA has created a TimeStamp column in every table migrated. My question is, what is the impact of this. Is the column necessary? There obviously is nothing in our existing code that make use of the TimeStamp column. And it appears - from our experience with the first migration - that SLQ2005 does not require the DateStamp column to keep track of whether a row has been updated or not. How is the DateStamp column utilized? Only by the developer through code, or does SQLServer make use of it internally if the column is available? 2. PK Autonumber - increment random Some of the Access tables have a PK defined as a random AutoNumber. SQL does not like the negative numbers in the PK column, and as a consequence migrates the field as a unique PK but does not set the Identity property of the Column. What is the best way of handling this? So far, I am thinking that I just have to create / generate new PK sequential autonumbers and deploy these as new FK in all relevant tables - before migrating. Do you agree? 3. Column Names Some column names starting with a number - f.ex. 96FEB is not liked by the SSMA; it throws the following message: "Column '96Feb' has a name that might cause problems for the Access application to function correctly against SQL Server". Do I need to be concerned re column names starting with a number?? 
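For what it's worth, the naming rule SSMA is warning about can be sketched as a simple check. This is a simplification of SQL Server's "regular identifier" rules (reserved words and embedded spaces are ignored here), shown in Python purely for illustration; a name that fails the check still works in T-SQL if you delimit it as [96Feb], but every query then has to remember the brackets:

```python
import re

# Simplified: a regular identifier must start with a letter, underscore,
# '@' or '#'; later characters may also include digits and '$'.
REGULAR_IDENT = re.compile(r'^[A-Za-z_@#][A-Za-z0-9_@#$]*$')

def needs_delimiting(name: str) -> bool:
    """True if the column name would need [brackets] in T-SQL."""
    return REGULAR_IDENT.match(name) is None

for col in ("96Feb", "Feb96", "_96Feb"):
    print(col, needs_delimiting(col))
# 96Feb True / Feb96 False / _96Feb False
```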
Thanks, Regards Borge Hansen _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From jwcolby at colbyconsulting.com Sun May 27 07:08:35 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Sun, 27 May 2007 08:08:35 -0400 Subject: [dba-SQLServer] How much storage In-Reply-To: <4656D80D.8665.3A4845D@stuart.lexacorp.com.pg> Message-ID: <20070527120837.BE731BDAE@smtp-auth.no-ip.com> Thanks Stuart. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Stuart McLachlan Sent: Thursday, May 24, 2007 10:35 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] How much storage In that case. If you use HashBytes('MD5'.........), you get back a 128 bit (16 byte) hash (aka Message Digest) which is usually represented as a string of 32 hex digits f you use HashBytes('SHA1'.........), you get back a 160 bit (20 byte) hash which is usually represented as 40 hex digits On 24 May 2007 at 21:47, jwcolby wrote: > >Are you talking about one of the CHECKSUM functions? > > hashbytes > > > John W. Colby > Colby Consulting > www.ColbyConsulting.com > -----Original Message----- > From: dba-sqlserver-bounces at databaseadvisors.com > [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of > Stuart McLachlan > Sent: Thursday, May 24, 2007 9:42 PM > To: dba-sqlserver at databaseadvisors.com > Subject: Re: [dba-SQLServer] How much storage > > On 24 May 2007 at 21:23, jwcolby wrote: > > > How much storage is used for varbinary? I am looking at using the > > hash function of SQL Server (built in to SQL Server 2005 now), and > > it returns something like 120-180 "somethings", it is defined as > > varbinary(8000) maximum. 
AFAICT it is a fixed width that varies > > depending on the hash algorithm. Is it returning an array of > > characters > with 120-180 elements? > > Is binary (or varbinary) defined in bits of a 32 bit word? Is each > > binary digit stored as a single position in a character? > > varBinary = Variable length Binary data. It's just a string of bytes > and it uses as much as it needs to store whatever chunk of data is put in it. > > Are you talking about one of the CHECKSUM functions? > > > > > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fuller.artful at gmail.com Sun May 27 07:33:46 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Sun, 27 May 2007 08:33:46 -0400 Subject: [dba-SQLServer] Upsizing Access data into SQL2005 using SSMA In-Reply-To: <20070527164119.CUZ15964@dommail.onthenet.com.au> References: <20070527164119.CUZ15964@dommail.onthenet.com.au> Message-ID: <29f585dd0705270533j28b21538j1948e4b7ffe6dc44@mail.gmail.com> Hi Borge, A timestamp column does not accurately reflect a meaningful date or time. It is simply an incrementing value that reflects relative time as seen from the viewpoint of the database, and has no relationship to a clock in the real world. Its purpose is to identify changed records. Every time a record is changed in any way, its timestamp value, if any, is changed. 
For this reason it is a lousy candidate for a PK, and in fact for any key, if the data changes fairly often. In general, timestamp columns are not especially useful except in the case of replicated databases. Should you ever wish to obtain the current timestamp value for the database, use SELECT @@DBTS. Regarding your randomly chosen PKs, this can be a little bit tricky to change. IMO, there is not much value in selecting this option when creating a table. However, since it's already done, your problem is how to fix it. This depends on how many related tables there are. I would approach it like this: First off, make a copy of the original database. Now, working with the copy... For Table A: 1. Copy the structure but not the data to a new table A1. 2. Modify the structure: 2a) Change the current PK to a LongInt column and remove the PK setting from this column. 2b) Add a new sequential autonumber column called newPK or whatever. 3. Append the data from the original Table A. For each table related to Table A (let's call it Table B): 1. Add a column called TableA_FK or whatever. 2. Update Table B by joining it to Table A on your original PK/FK, but SET TableB.TableA_FK to the value of TableA.NewPK. The join will keep the rows pointed at their parent rows in TableA. You now have the correct (new) keys from TableA in the new column in TableB. If there are any tables related to TableB, repeat the procedure. Satisfy yourself that the data still hangs together correctly. When you're satisfied, you can remove all the randomPK columns and declare all the NewPK autonumber columns as your new PKs. At this stage you can rename these columns to the names of your original PKs, so any code depending on their names will continue to run correctly. 1. Create a new table with two columns, OldPK and NewPK. Make the former an int and the latter an incrementing identity key. Then append to this table all the OldPKs from Table A. 
You will now have a two-column table that you can use to map the oldPK to its new equivalent. IMO it's a bad idea to name any column beginning with a number, and you've discovered why. I would suggest that the simplest solution might be to add a single character prefix to all such column names. Of course, you'll have to keep track of every one you rename, so that you can use Speed Ferret or Rick Fisher's Find and Replace to change all occurrences of said names in your code. With either product, you can create a list of all target names and their replacement names, and do it all at once. hth, Arthur On 5/27/07, pcs at azizaz.com wrote: > > Hi Group, > I am doing my second upgrade of an Access Db to 2005SQL. > > In this second upgrade project I've used the SQL Server > Migration Assistant (SSMA) to pull the tables and data > across from an Access BE rather than using the Upsizing > Wizard in Access 2003. > > I am sitting with a few questions that I'd appreciate if > someone in the group would give their answer / comments to. > > 1.) TimeStamp column > In some recommended settings I came across for using the > Upsizing Wizard in Access2003, it was suggested to not > create the TimeStamp column. In our first project we didn't > create the column, and the resulting access application with > a SQL2005 BE is functioning very well in a terminal server > environment with 60 users. > > The SSMA has created a TimeStamp column in every table > migrated. > My question is, what is the impact of this. > Is the column necessary? > There obviously is nothing in our existing code that make > use of the TimeStamp column. > And it appears - from our experience with the first > migration - that SQL2005 does not require the DateStamp > column to keep track of whether a row has been updated or > not. How is the DateStamp column utilized? Only by the > developer through code, or does SQLServer make use of it > internally if the column is available? > > 2. 
PK Autonumber - increment random > Some of the Access tables have a PK defined as a random > AutoNumber. SQL does not like the negative numbers in the PK > column, and as a consequence migrates the field as a unique > PK but does not set the Identity property of the Column. > > What is the best way of handling this? > > So far, I am thinking that I just have to create / generate > new PK sequential autonumbers and deploy these as new FK in > all relevant tables - before migrating. > Do you agree? > > 3. Column Names > Some column names starting with a number - f.ex. 96FEB is > not liked by the SSMA; it throws the following > message: "Column '96Feb' has a name that might cause > problems for the Access application to function correctly > against SQL Server". > Do I need to be concerned re column names starting with a > number?? > > Thanks, > Regards > Borge Hansen > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From pcs at azizaz.com Sun May 27 07:53:29 2007 From: pcs at azizaz.com (pcs at azizaz.com) Date: Sun, 27 May 2007 22:53:29 +1000 (EST) Subject: [dba-SQLServer] Upsizing Access data into SQL2005 using SSMA Message-ID: <20070527225329.CUZ69130@dommail.onthenet.com.au> Arthur, John Thanks for the comments on an early sunday am! Borge From jwcolby at colbyconsulting.com Sun May 27 12:55:13 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Sun, 27 May 2007 13:55:13 -0400 Subject: [dba-SQLServer] Merge records Message-ID: <20070527175513.E10D6BCED@smtp-auth.no-ip.com> I have a situation where I might have several data records in a table (the big 700 field table) that are about the same person / address but have different data in some of the other fields. 
These records represent answers to surveys and so if a person answered three surveys, the person could have three records in the database. I need to merge the data from the three records into a single record, and eventually delete the other two. I have developed a field that represents the SHA1 hash of the address, zip5, zip4, lastname and firstname. I am running some tests to see whether this hash is unique across 50 million records (unique to that name / address) but I suspect that it will be. Once I determine that it is, then I can use that single field as a single "person identifier" field. So I need pointers how to "merge" the data from one record into a second record, only merging fields where there is legitimate data, and not overwriting fields where there is already data. John W. Colby Colby Consulting www.ColbyConsulting.com From ssharkins at setel.com Mon May 28 11:56:13 2007 From: ssharkins at setel.com (Susan Harkins) Date: Mon, 28 May 2007 12:56:13 -0400 Subject: [dba-SQLServer] Question on DB2 provider string Message-ID: <002901c7a149$1da86c10$a7b82ad1@SusanOne> EXEC sp_addlinkedserver @server='DB2', @srvproduct='Microsoft OLE DB Provider for DB2', @catalog='DB2', @provider='DB2OLEDB', @provstr='Initial Catalog=databasename; Data Source=DB2; HostCCSID=1252; Network Address=XYZ; Network Port=50000; Package Collection=admin; Default Schema=admin;' ======I don't know a thing about DB2 -- can someone tell me if the HostCCSID value is a literal value or a variable, which in this case just happens to be 1252? Susan H. 
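[Editor's sketch] John's merge question above (folding several survey records for the same person into one, filling only empty fields) can be sketched in T-SQL with a self-join and COALESCE. All column names here (RecID, PersonHash, Income, AgeBand) are hypothetical stand-ins for fields in the real 700-field table; this is an illustration, not the list's posted solution.

```sql
-- Fill NULL fields in each row from a duplicate row with the same person
-- hash; COALESCE keeps any value the target row already has, so existing
-- data is never overwritten.
UPDATE tgt
SET    tgt.Income  = COALESCE(tgt.Income,  src.Income),
       tgt.AgeBand = COALESCE(tgt.AgeBand, src.AgeBand)
FROM   dbo.BigTable AS tgt
JOIN   dbo.BigTable AS src
  ON   src.PersonHash = tgt.PersonHash
 AND   src.RecID     <> tgt.RecID;

-- Afterwards, delete every duplicate except one survivor per person
-- (here, the row with the lowest RecID).
DELETE d
FROM   dbo.BigTable AS d
WHERE  EXISTS (SELECT 1
               FROM   dbo.BigTable AS k
               WHERE  k.PersonHash = d.PersonHash
               AND    k.RecID < d.RecID);
```

One caveat: when a person has three or more rows, SQL Server picks an arbitrary matching source row for each update, so the UPDATE may need to run more than once (until no further NULLs can be filled) before the DELETE.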
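[Editor's sketch] Arthur's old-to-new key mapping from the SSMA upsizing thread further up can likewise be sketched in T-SQL. OldPK, NewPK and TableA_FK are the names from his procedure; KeyMap, TableA, TableB, RandomPK and RandomFK are hypothetical names added for illustration.

```sql
-- Mapping table: one row per parent record, with IDENTITY assigning the
-- new sequential keys.
CREATE TABLE dbo.KeyMap (
    OldPK int NOT NULL,
    NewPK int IDENTITY(1,1) NOT NULL PRIMARY KEY
);
GO

INSERT INTO dbo.KeyMap (OldPK)
SELECT RandomPK FROM dbo.TableA;

-- New FK column in each child table (GO so the next batch sees it).
ALTER TABLE dbo.TableB ADD TableA_FK int NULL;
GO

-- Re-point the child rows through the map, as in step 2 of the procedure.
UPDATE b
SET    b.TableA_FK = m.NewPK
FROM   dbo.TableB AS b
JOIN   dbo.KeyMap AS m
  ON   m.OldPK = b.RandomFK;
```

Once every child table carries its new FK, the old random key columns can be dropped and NewPK declared the primary key, as Arthur describes.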
From martyconnelly at shaw.ca Mon May 28 13:00:54 2007 From: martyconnelly at shaw.ca (MartyConnelly) Date: Mon, 28 May 2007 11:00:54 -0700 Subject: [dba-SQLServer] Question on DB2 provider string In-Reply-To: <002901c7a149$1da86c10$a7b82ad1@SusanOne> References: <002901c7a149$1da86c10$a7b82ad1@SusanOne> Message-ID: <465B18D6.6050801@shaw.ca> 1252 sounds suspiciously like the US Windows CodePage number. May indicate the collation sequence to use or an Ascii to Unicode encoding method. Susan Harkins wrote: >EXEC sp_addlinkedserver @server='DB2', >@srvproduct='Microsoft OLE DB Provider for DB2', >@catalog='DB2', >@provider='DB2OLEDB', >@provstr='Initial Catalog=databasename; Data Source=DB2; HostCCSID=1252; >Network Address=XYZ; Network Port=50000; Package Collection=admin; >Default Schema=admin;' > >======I don't know a thing about DB2 -- can someone tell me if the HostCCSID >value is a literal value or a variable, which in this case just happens to >be 1252? > >Susan H. > >_______________________________________________ >dba-SQLServer mailing list >dba-SQLServer at databaseadvisors.com >http://databaseadvisors.com/mailman/listinfo/dba-sqlserver >http://www.databaseadvisors.com > > > > > -- Marty Connelly Victoria, B.C. Canada From Elizabeth.J.Doering at wellsfargo.com Thu May 31 14:25:15 2007 From: Elizabeth.J.Doering at wellsfargo.com (Elizabeth.J.Doering at wellsfargo.com) Date: Thu, 31 May 2007 14:25:15 -0500 Subject: [dba-SQLServer] SQL Server versus Oracle References: <20070514155234.E01E9BDBD@smtp-auth.no-ip.com> Message-ID: <1C2084FD2472124AB1812A5476EA3B7A0174047E@msgswbmnmsp04.wellsfargo.com> Come to find out, I am speaking in 30 minutes about the virtues of SQL Server 2005 versus those of Oracle. 
Given that my knowledge of Oracle could still dance comfortably on the head of a pin, I am frantically googling up details for my 'speech', and I would love to have your opinions. I can easily say that we have already SQL Server and that Oracle is going to cost us $$$$$ that we hadn't budgeted for. The thing I am most up against is a contention that 650 users are going to generate more data in a year or two than SQL Server can possibly hold. I'm of the opinion that with a normalized database in a call center environment, users generating 10 or 12 records per call can go for years without seeing much if any slowdown. Is this accurate? Oracle isn't being suggested for the production environment however. Oracle is being pushed for the REPORTING side of this system, for the 3 or 4 analysts who will be looking at the long term performance of the folks in the call center. Is this making sense? The production staff can live with SQL Server, but 3 or 4 analysts need the big bucks spent on Oracle for running their reports. Opinions, please? Thanks, Liz Liz Doering elizabeth.j.doering at wellsfargo.com 612.667.2447 This message may contain confidential and/or privileged information. If you are not the addressee or authorized to receive this for the addressee, you must not use, copy, disclose, or take any action based on this message or any information herein. If you have received this message in error, please advise the sender immediately by reply e-mail and delete this message. Thank you for your cooperation. From jwcolby at colbyconsulting.com Thu May 31 14:47:51 2007 From: jwcolby at colbyconsulting.com (jwcolby) Date: Thu, 31 May 2007 15:47:51 -0400 Subject: [dba-SQLServer] SQL Server versus Oracle In-Reply-To: <1C2084FD2472124AB1812A5476EA3B7A0174047E@msgswbmnmsp04.wellsfargo.com> Message-ID: <20070531194751.EAB33BE8A@smtp-auth.no-ip.com> SQL Server can easily handle terabyte files and giga records. 
I think you are up against analysts who WANT Oracle because that is what they are accustomed to, NOT "what is best, or what is easiest, or what is (insert your choice here)". It is darned tough to address WANTS, cause they will fight tooth and nail to dispute whatever you say. I think you should concentrate on what you have, that it will quite comfortably support your usage for many years and that you can easily hire analysts who can use SQL Server reporting. Force the ANALYSTS to prove that SQL Server cannot do what they need instead of VV. John W. Colby Colby Consulting www.ColbyConsulting.com -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Elizabeth.J.Doering at wellsfargo.com Sent: Thursday, May 31, 2007 3:25 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] SQL Server versus Oracle Come to find out, I am speaking in 30 minutes about the virtues of SQL Server 2005 versus those of Oracle. Given that my knowledge of Oracle could still dance comfortably on the head of a pin, I am frantically googling up details for my 'speech', and I would love to have your opinions I can easily say that we have already SQL Server and that Oracle is going to cost us $$$$$ that we hadn't budgeted for. The thing I am most up against is a contention that 650 users are going to generate more data in a year or two than SQL Server can possibly hold. I'm of the opinion that with a normalized database in a call center environment, users generating 10 or 12 records per call can go for years without seeing much if any slowdown. Is this accurate? Oracle isn't being suggested for the production environment however. Oracle is being pushed for the REPORTING side of this system, for the 3 or 4 analysts who will be looking at the long term performance of the folks in the call center. Is this making sense? 
The production staff can live with SQL Server, but 3 or 4 analysts need the big bucks spent on Oracle for running their reports. Opinions, please? Thanks, Liz Liz Doering elizabeth.j.doering at wellsfargo.com 612.667.2447 This message may contain confidential and/or privileged information. If you are not the addressee or authorized to receive this for the addressee, you must not use, copy, disclose, or take any action based on this message or any information herein. If you have received this message in error, please advise the sender immediately by reply e-mail and delete this message. Thank you for your cooperation. _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fahooper at trapo.com Thu May 31 14:53:46 2007 From: fahooper at trapo.com (Fred Hooper) Date: Thu, 31 May 2007 15:53:46 -0400 Subject: [dba-SQLServer] SQL Server versus Oracle In-Reply-To: <1C2084FD2472124AB1812A5476EA3B7A0174047E@msgswbmnmsp04.wellsfargo.com> Message-ID: <001801c7a3bd$6682c0e0$65cee044@fredxp> I work with Cognos BI tools a lot. For reporting, Cognos and other BI tools would work equally well with either Oracle or SQL Server; I've used Cognos with both. There's a lot of logic in putting your data warehouse on a box that isn't serving your transaction database, but there's no reason for the warehouse to be a different manufacturer's database. Therefore, I don't see any need for Oracle for you. Fred Hooper -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Elizabeth.J.Doering at wellsfargo.com Sent: Thursday, May 31, 2007 3:25 PM To: dba-sqlserver at databaseadvisors.com Subject: [dba-SQLServer] SQL Server versus Oracle Come to find out, I am speaking in 30 minutes about the virtues of SQL Server 2005 versus those of Oracle. 
Given that my knowledge of Oracle could still dance comfortably on the head of a pin, I am frantically googling up details for my 'speech', and I would love to have your opinions I can easily say that we have already SQL Server and that Oracle is going to cost us $$$$$ that we hadn't budgeted for. The thing I am most up against is a contention that 650 users are going to generate more data in a year or two than SQL Server can possibly hold. I'm of the opinion that with a normalized database in a call center environment, users generating 10 or 12 records per call can go for years without seeing much if any slowdown. Is this accurate? Oracle isn't being suggested for the production environment however. Oracle is being pushed for the REPORTING side of this system, for the 3 or 4 analysts who will be looking at the long term performance of the folks in the call center. Is this making sense? The production staff can live with SQL Server, but 3 or 4 analysts need the big bucks spent on Oracle for running their reports. Opinions, please? Thanks, Liz Liz Doering elizabeth.j.doering at wellsfargo.com 612.667.2447 This message may contain confidential and/or privileged information. If you are not the addressee or authorized to receive this for the addressee, you must not use, copy, disclose, or take any action based on this message or any information herein. If you have received this message in error, please advise the sender immediately by reply e-mail and delete this message. Thank you for your cooperation. 
_______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From garykjos at gmail.com Thu May 31 15:29:01 2007 From: garykjos at gmail.com (Gary Kjos) Date: Thu, 31 May 2007 15:29:01 -0500 Subject: [dba-SQLServer] SQL Server versus Oracle In-Reply-To: <1C2084FD2472124AB1812A5476EA3B7A0174047E@msgswbmnmsp04.wellsfargo.com> References: <20070514155234.E01E9BDBD@smtp-auth.no-ip.com> <1C2084FD2472124AB1812A5476EA3B7A0174047E@msgswbmnmsp04.wellsfargo.com> Message-ID: I work for a company that has both Oracle and SQL Server databases. Both can handle large databases. Most reporting things can work with either. Oracle generally runs in a UNIX environment although I think there are also Windows versions available too. SQL Server needs Windows based servers to run I think. Having both requires 2 sets of database administrators and if you have Oracle On UNIX and SQL Server on Windows server, then you need system administrator type people that can deal with both of those worlds. If you need to, you need to, but if you don't then you are going to be having two sets of geeks where you could get away with one. Or have a backup set for the one platform you could be using. People that work with one or the other usually think that the one they use is superior to the other and that the other is junk. That is how it is here. Our SQL Server DBA thinks everything else is bad and the Oracle DBA's think that SQL Server is terrible. GK On 5/31/07, Elizabeth.J.Doering at wellsfargo.com wrote: > > Come to find out, I am speaking in 30 minutes about the virtues of SQL > Server 2005 versus those of Oracle. 
Given that my knowledge of Oracle > could still dance comfortably on the head of a pin, I am frantically > googling up details for my 'speech', and I would love to have your > opinions > > I can easily say that we have already SQL Server and that Oracle is > going to cost us $$$$$ that we hadn't budgeted for. The thing I am most > up against is a contention that 650 users are going to generate more > data in a year or two than SQL Server can possibly hold. I'm of the > opinion that with a normalized database in a call center environment, > users generating 10 or 12 records per call can go for years without > seeing much if any slowdown. Is this accurate? > > Oracle isn't being suggested for the production environment however. > Oracle is being pushed for the REPORTING side of this system, for the 3 > or 4 analysts who will be looking at the long term performance of the > folks in the call center. > > Is this making sense? The production staff can live with SQL Server, > but 3 or 4 analysts need the big bucks spent on Oracle for running their > reports. > > Opinions, please? > > > Thanks, > > > Liz > > > Liz Doering > elizabeth.j.doering at wellsfargo.com > 612.667.2447 > > > This message may contain confidential and/or privileged information. If > you are not the addressee or authorized to receive this for the > addressee, you must not use, copy, disclose, or take any action based on > this message or any information herein. If you have received this > message in error, please advise the sender immediately by reply e-mail > and delete this message. Thank you for your cooperation. 
> > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- Gary Kjos garykjos at gmail.com From Elizabeth.J.Doering at wellsfargo.com Thu May 31 16:05:39 2007 From: Elizabeth.J.Doering at wellsfargo.com (Elizabeth.J.Doering at wellsfargo.com) Date: Thu, 31 May 2007 16:05:39 -0500 Subject: [dba-SQLServer] SQL Server versus Oracle References: <20070514155234.E01E9BDBD@smtp-auth.no-ip.com><1C2084FD2472124AB1812A5476EA3B7A0174047E@msgswbmnmsp04.wellsfargo.com> Message-ID: <1C2084FD2472124AB1812A5476EA3B7A0174050E@msgswbmnmsp04.wellsfargo.com> Thanks Gary, Fred and John! Fortunately for me, I was rescued by a very knowledgeable sounding Security-type who has a foot firmly in both worlds and was able to bring sense to the meeting. The reporting folks do win, in that they get their data into Oracle, but the project and I win, too, in that it seems clear that 1) An Oracle guru is not necessary to organize a push of data from SQL Server and 2) We don't need to purchase a new server just for the relatively small amount of data we are talking about here. So everyone is happy, especially me, as I am no longer being asked to act the roles of all the geeks Gary mentioned. BTW, at least some of the reporting folks are using Access as a front end on their other Oracle tables. :) Their big beef was that pulling from two sources would make Access run slower...... Liz -----Original Message----- From: dba-sqlserver-bounces at databaseadvisors.com [mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Gary Kjos Sent: Thursday, May 31, 2007 3:29 PM To: dba-sqlserver at databaseadvisors.com Subject: Re: [dba-SQLServer] SQL Server versus Oracle I work for a company that has both Oracle and SQL Server databases. Both can handle large databases. Most reporting things can work with either. 
Oracle generally runs in a UNIX environment although I think there are also Windows versions available too. SQL Server needs Windows based servers to run I think. Having both requires 2 sets of database administrators and if you have Oracle On UNIX and SQL Server on Windows server, then you need system administrator type people that can deal with both of those worlds. If you need to you need to but if you don't then you are going to be having two sets of geeks where you could get away with one. Or have a backup set for the one platform you could be using. People that work with one or the other usually think that the one they use is superior to the other and that the othe is junk. That is how it is here. Our SQL Server DBA thinks everything else is bad and the Oracle DBA's think that SQL Server is terrible. GK On 5/31/07, Elizabeth.J.Doering at wellsfargo.com wrote: > > Come to find out, I am speaking in 30 minutes about the virtues of SQL > Server 2005 versus those of Oracle. Given that my knowledge of Oracle > could still dance comfortably on the head of a pin, I am frantically > googling up details for my 'speech', and I would love to have your > opinions > > I can easily say that we have already SQL Server and that Oracle is > going to cost us $$$$$ that we hadn't budgeted for. The thing I am > most up against is a contention that 650 users are going to generate > more data in a year or two than SQL Server can possibly hold. I'm of > the opinion that with a normalized database in a call center > environment, users generating 10 or 12 records per call can go for > years without seeing much if any slowdown. Is this accurate? > > Oracle isn't being suggested for the production environment however. > Oracle is being pushed for the REPORTING side of this system, for the > 3 or 4 analysts who will be looking at the long term performance of > the folks in the call center. > > Is this making sense? 
The production staff can live with SQL Server, > but 3 or 4 analysts need the big bucks spent on Oracle for running > their reports. > > Opinions, please? > > > Thanks, > > > Liz > > > Liz Doering > elizabeth.j.doering at wellsfargo.com > 612.667.2447 > > > This message may contain confidential and/or privileged information. > If you are not the addressee or authorized to receive this for the > addressee, you must not use, copy, disclose, or take any action based > on this message or any information herein. If you have received this > message in error, please advise the sender immediately by reply e-mail > and delete this message. Thank you for your cooperation. > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > -- Gary Kjos garykjos at gmail.com _______________________________________________ dba-SQLServer mailing list dba-SQLServer at databaseadvisors.com http://databaseadvisors.com/mailman/listinfo/dba-sqlserver http://www.databaseadvisors.com From fuller.artful at gmail.com Thu May 31 17:00:40 2007 From: fuller.artful at gmail.com (Arthur Fuller) Date: Thu, 31 May 2007 18:00:40 -0400 Subject: [dba-SQLServer] SQL Server versus Oracle In-Reply-To: <1C2084FD2472124AB1812A5476EA3B7A0174047E@msgswbmnmsp04.wellsfargo.com> References: <20070514155234.E01E9BDBD@smtp-auth.no-ip.com> <1C2084FD2472124AB1812A5476EA3B7A0174047E@msgswbmnmsp04.wellsfargo.com> Message-ID: <29f585dd0705311500r5fb5b5e5v2c8663319aa0e82@mail.gmail.com> I have worked with MS databases in excess of 1 Tb and that is not your problem. To give Oracle its due, in this respect alone, in Oracle you can create namespaces that correspond to physical drives and that can also respect FKs that span databases (or more exactly, name spaces). MS-SQL cannot do this, sadly. More generally, I would approach the problem using an automobile analogy. 
85%+ of what most users require is encompassed in SQL Server and several other alternatives, for zero dollars to many more, depending on vagaries. There is no doubt about it, if what you want is security plus multi-TB capability plus plus plus, Oracle and DB2 are the only games in town. And I say this as a big fan of MS-SQL. There are things that MS-SQL cannot even pretend to do. If these things are on your necessary-list, then MS-SQL falls off your list of contenders. Let's take just a few examples: a) I want 8 databases to tie together, some of which are separated by firewalls etc. I want foreign keys to work across said databases. MS-SQL cannot enforce FKs across db boundaries. Big problem. Oracle can, so from that perspective this is an unimportant issue. Given that our client is wedded to MS-SQL, then what do we do? There are several alternatives, which I won't detail here, but suffice to say that all of them (that I know of) are less than beautiful. b) my principal db of concern is several PB (petabytes). c) my db of concern must install jobs that on schedule grab data from several other databases (for illustration, let's say said databases include one MySQL db, one PostGres db and one Oracle db). I need to grab the new data added to each of these since my last visit. I am well aware that this is not impossible, and in fact not particularly difficult. My point is that inhaling all this foreign data on schedule and guaranteeing the RI of the imports is non-trivial. It can be done, of course. I have done it. That is not the issue that I am attempting to present. The issue is that this is non-trivial. Many hours and many days might be expended making this sort of thing work. I am happy to bill for thousands of hours, but that is beside the point. The customer wants a solution not a description of the problem. That is my current mantra. 
Back to your topic: Oracle has some serious technology that might be important only in an organization with multi-TB databases, and several of these linked together in a WAN or whatever. But to take just one thing O can do that MS-SQL cannot, look at RI across databases. Expand that to view numerous dbs located who knows where, but which maintain RI. There is NO way to achieve that in MS-SQL. Oracle blazes the relational path, without doubt. Other implementations try to catch up. On the other hand, MS-SQL does 99% of what most customers want to do. Call MS the GM of software. Yes, you can buy an Aston-Martin and if that's what you want, then GM wishes you well, but if you're looking for an inexpensive solution then MS-SQL may be your ticket. (Or not. There remain alternatives such as MySQL and PostGres, which are freely downloadable and can be addressed using ODBC.) It is also true that the era of relational databases may be seeing its sunset. Implementations such as Cache raise the bar, and challenge the precepts that underlie Codd's theory and subsequent implementations. I have played with Cache only a little; basically ported a complex app to Cache and run it and discovered dramatic performance improvements. But OTOH there is a whole new suitcase of stuff to learn, and I'm old and I don't have all that many available brain cells left. I'd hate to think of myself as going down, clinging to his precepts. I'd like to pretend that I'm a better person than that, and that presented with a superior alternative, I am willing to junk what I've learned and take the new path. Obviously, there is reluctance, not all of which is bad... I've been led down a few garden paths before, to no avail. Late in life, IMO, Dr. Codd made some statements that I would call questionable. IMO, he didn't perceive the Object problem, nor the GIS problem, in their gory details. That's ok by me. It doesn't mean that he's right, and it doesn't diminish his stature if he's wrong. 
It happens that I deem him incorrect on these particular two issues, and also one other, which I won't bother to get into now). It is certainly possible to implement an O-O database within a classic relational paradigm, but that isn't really the question. The question, IMO, is whether a db such as Cache can deliver superior performance (along with the assumed reliability etc.) to a SQL-based implementation of same. To test this, we need a complex object model and at least a few million rows of the various objects. It happens that I have a very good model for this test, but it will take me some time to port the data to the Cache db. My particular model concerns the pulp and paper industry. I think I can sketch this one in my sleep, but I am interested to hear from developers who have devoted similar time to particular domains. A. On 5/31/07, Elizabeth.J.Doering at wellsfargo.com < Elizabeth.J.Doering at wellsfargo.com> wrote: > > > Come to find out, I am speaking in 30 minutes about the virtues of SQL > Server 2005 versus those of Oracle. Given that my knowledge of Oracle > could still dance comfortably on the head of a pin, I am frantically > googling up details for my 'speech', and I would love to have your > opinions > > I can easily say that we have already SQL Server and that Oracle is > going to cost us $$$$$ that we hadn't budgeted for. The thing I am most > up against is a contention that 650 users are going to generate more > data in a year or two than SQL Server can possibly hold. I'm of the > opinion that with a normalized database in a call center environment, > users generating 10 or 12 records per call can go for years without > seeing much if any slowdown. Is this accurate? > > Oracle isn't being suggested for the production environment however. > Oracle is being pushed for the REPORTING side of this system, for the 3 > or 4 analysts who will be looking at the long term performance of the > folks in the call center. > > Is this making sense? 
The production staff can live with SQL Server, > but 3 or 4 analysts need the big bucks spent on Oracle for running their > reports. > > Opinions, please? > > > Thanks, > > > Liz > > > Liz Doering > elizabeth.j.doering at wellsfargo.com > 612.667.2447 > > > This message may contain confidential and/or privileged information. If > you are not the addressee or authorized to receive this for the > addressee, you must not use, copy, disclose, or take any action based on > this message or any information herein. If you have received this > message in error, please advise the sender immediately by reply e-mail > and delete this message. Thank you for your cooperation. > > > _______________________________________________ > dba-SQLServer mailing list > dba-SQLServer at databaseadvisors.com > http://databaseadvisors.com/mailman/listinfo/dba-sqlserver > http://www.databaseadvisors.com > > From accessd at shaw.ca Thu May 31 17:47:05 2007 From: accessd at shaw.ca (Jim Lawrence) Date: Thu, 31 May 2007 15:47:05 -0700 Subject: [dba-SQLServer] SQL Server versus Oracle In-Reply-To: <29f585dd0705311500r5fb5b5e5v2c8663319aa0e82@mail.gmail.com> Message-ID: <0JIX00DF7GE1NT10@l-daemon> Hi Arthur: As I teeter on the same precipice, my feelings are that the relational database model has reached its maximum capabilities. A couple of good database friends, one who has studied Caché at length and the other that has been working with Spatial GIS data for years and myself would be very interested. Note: The GIS data is currently being housed in an Oracle warehouse but there is a complex front-end object modeler from ESRI (http://www.esri.com/) that actually does all the work. Our coffee klatch would be very interested in any O-O design project that you may be planning. 
Regards,
Jim

-----Original Message-----
From: dba-sqlserver-bounces at databaseadvisors.com
[mailto:dba-sqlserver-bounces at databaseadvisors.com] On Behalf Of Arthur Fuller
Sent: Thursday, May 31, 2007 3:01 PM
To: dba-sqlserver at databaseadvisors.com
Subject: Re: [dba-SQLServer] SQL Server versus Oracle

I have worked with MS databases in excess of 1 TB, and that is not your problem. To give Oracle its due in this respect alone: in Oracle you can create namespaces that correspond to physical drives, and it can also respect FKs that span databases (or more exactly, namespaces). MS-SQL cannot do this, sadly.

More generally, I would approach the problem using an automobile analogy. 85%+ of what most users require is encompassed in SQL Server and several other alternatives, at prices ranging from zero dollars on up, depending on vagaries. There is no doubt about it: if what you want is security plus multi-TB capability plus plus plus, Oracle and DB2 are the only games in town. And I say this as a big fan of MS-SQL. There are things that MS-SQL cannot even pretend to do. If these things are on your necessary-list, then MS-SQL falls off your list of contenders. Let's take just a few examples:

a) I want 8 databases to tie together, some of which are separated by firewalls etc. I want foreign keys to work across said databases. MS-SQL cannot enforce FKs across db boundaries. Big problem. Oracle can, so from that perspective this is a non-issue. Given that our client is wedded to MS-SQL, then what do we do? There are several alternatives, which I won't detail here, but suffice it to say that all of them (that I know of) are less than beautiful.

b) My principal db of concern is several PB (petabytes).

c) My db of concern must install jobs that on a schedule grab data from several other databases (for illustration, let's say said databases include one MySQL db, one PostgreSQL db, and one Oracle db). I need to grab the new data added to each of these since my last visit.
I am well aware that this is not impossible, and in fact not particularly difficult. My point is that inhaling all this foreign data on schedule and guaranteeing the RI of the imports is non-trivial. It can be done, of course. I have done it. That is not the issue I am attempting to present. The issue is that this is non-trivial. Many hours and many days might be expended making this sort of thing work. I am happy to bill for thousands of hours, but that is beside the point. The customer wants a solution, not a description of the problem. That is my current mantra.

Back to your topic: Oracle has some serious technology that might be important only in an organization with multi-TB databases, and several of these linked together in a WAN or whatever. But to take just one thing Oracle can do that MS-SQL cannot, look at RI across databases. Expand that to numerous dbs located who knows where, but which maintain RI. There is NO way to achieve that in MS-SQL. Oracle blazes the relational path, without doubt. Other implementations try to catch up. On the other hand, MS-SQL does 99% of what most customers want to do. Call MS the GM of software. Yes, you can buy an Aston Martin, and if that's what you want, then GM wishes you well; but if you're looking for an inexpensive solution, then MS-SQL may be your ticket. (Or not. There remain alternatives such as MySQL and PostgreSQL, which are freely downloadable and can be addressed using ODBC.)

It is also true that the era of relational databases may be seeing its sunset. Implementations such as Cache raise the bar and challenge the precepts that underlie Codd's theory and subsequent implementations. I have played with Cache only a little; I basically ported a complex app to Cache, ran it, and discovered dramatic performance improvements. But OTOH there is a whole new suitcase of stuff to learn, and I'm old and I don't have all that many available brain cells left.
I'd hate to think of myself as going down, clinging to Codd's precepts. I'd like to pretend that I'm a better person than that, and that presented with a superior alternative, I am willing to junk what I've learned and take the new path. Obviously, there is reluctance, not all of which is bad... I've been led down a few garden paths before, to no avail.

Late in life, IMO, Dr. Codd made some statements that I would call questionable. IMO, he didn't perceive the Object problem, nor the GIS problem, in their gory details. That's ok by me. It doesn't mean that he's right, and it doesn't diminish his stature if he's wrong. It happens that I deem him incorrect on these particular two issues, and also one other, which I won't bother to get into now. It is certainly possible to implement an O-O database within a classic relational paradigm, but that isn't really the question. The question, IMO, is whether a db such as Cache can deliver superior performance (along with the assumed reliability etc.) compared to a SQL-based implementation of the same model. To test this, we need a complex object model and at least a few million rows of the various objects. It happens that I have a very good model for this test, but it will take me some time to port the data to the Cache db. My particular model concerns the pulp and paper industry. I think I can sketch this one in my sleep, but I am interested to hear from developers who have devoted similar time to particular domains.

A.

On 5/31/07, Elizabeth.J.Doering at wellsfargo.com <Elizabeth.J.Doering at wellsfargo.com> wrote:
>
> Come to find out, I am speaking in 30 minutes about the virtues of SQL
> Server 2005 versus those of Oracle. Given that my knowledge of Oracle
> could still dance comfortably on the head of a pin, I am frantically
> googling up details for my 'speech', and I would love to have your
> opinions.
>
> I can easily say that we already have SQL Server and that Oracle is
> going to cost us $$$$$ that we hadn't budgeted for.
> The thing I am most
> up against is a contention that 650 users are going to generate more
> data in a year or two than SQL Server can possibly hold. I'm of the
> opinion that with a normalized database in a call center environment,
> users generating 10 or 12 records per call can go for years without
> seeing much if any slowdown. Is this accurate?
>
> Oracle isn't being suggested for the production environment, however.
> Oracle is being pushed for the REPORTING side of this system, for the 3
> or 4 analysts who will be looking at the long-term performance of the
> folks in the call center.
>
> Is this making sense? The production staff can live with SQL Server,
> but 3 or 4 analysts need the big bucks spent on Oracle for running their
> reports.
>
> Opinions, please?
>
> Thanks,
>
> Liz
>
> Liz Doering
> elizabeth.j.doering at wellsfargo.com
> 612.667.2447
>
> This message may contain confidential and/or privileged information. If
> you are not the addressee or authorized to receive this for the
> addressee, you must not use, copy, disclose, or take any action based on
> this message or any information herein. If you have received this
> message in error, please advise the sender immediately by reply e-mail
> and delete this message. Thank you for your cooperation.
>
> _______________________________________________
> dba-SQLServer mailing list
> dba-SQLServer at databaseadvisors.com
> http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
> http://www.databaseadvisors.com

_______________________________________________
dba-SQLServer mailing list
dba-SQLServer at databaseadvisors.com
http://databaseadvisors.com/mailman/listinfo/dba-sqlserver
http://www.databaseadvisors.com
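The contention in the thread above, that 650 call-center users would outgrow SQL Server within a year or two, is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch; the call volume, working days, and bytes-per-row figures below are assumptions for illustration (the thread itself gives only 650 users and 10-12 records per call):

```python
# Back-of-envelope row-count estimate for the call-center scenario.
# Assumed figures (not from the thread): 60 calls/user/day, 250 working
# days/year, 200 bytes/row. The thread supplies 650 users and 12 rows/call.
users = 650
calls_per_user_per_day = 60   # assumption
rows_per_call = 12            # upper bound quoted in the thread
working_days_per_year = 250   # assumption

rows_per_year = (users * calls_per_user_per_day
                 * rows_per_call * working_days_per_year)
print(f"{rows_per_year:,} rows/year")   # 117,000,000 rows/year

bytes_per_row = 200           # assumption, generous for normalized rows
gb_per_year = rows_per_year * bytes_per_row / 1024**3
print(f"~{gb_per_year:.0f} GB/year")    # ~22 GB/year
```

Even with these deliberately generous assumptions, the result is on the order of tens of GB per year, far below what SQL Server 2005 can hold, which supports the view in the thread that production volume is not the issue.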
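One of the "less than beautiful" alternatives Arthur alludes to for enforcing RI across database boundaries in MS-SQL is a trigger-based check. A minimal T-SQL sketch, assuming two hypothetical databases on the same instance (OrdersDb and CustomersDb); all object and column names here are illustrative, not from the thread:

```sql
-- Hypothetical: OrdersDb.dbo.Orders references CustomersDb.dbo.Customers.
-- SQL Server will not accept a FOREIGN KEY that crosses databases, so we
-- approximate one with a trigger that rejects orphaned inserts/updates.
USE OrdersDb;
GO
CREATE TRIGGER trg_Orders_CustomerFK
ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Reject the statement if any affected row lacks a matching customer.
    IF EXISTS (
        SELECT 1
        FROM inserted i
        WHERE NOT EXISTS (
            SELECT 1
            FROM CustomersDb.dbo.Customers c
            WHERE c.CustomerID = i.CustomerID
        )
    )
    BEGIN
        RAISERROR('CustomerID not found in CustomersDb.dbo.Customers.', 16, 1);
        ROLLBACK TRANSACTION;
    END
END;
GO
```

The usual caveats apply: this guards only one direction (a delete in CustomersDb can still orphan rows in OrdersDb unless a matching trigger is added on that side), and it gives the optimizer none of the benefits of a declared constraint, which is exactly why such workarounds rate as less than beautiful.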