André Krijnen

Tag: sql server

SQL Server 2008 R2 Service Pack 1 is released (SharePoint info)

by on Jul.30, 2011, under maintenance, Reporting Services, SharePoint 2010, SharePoint Foundation, sql server, Update

So after some time Microsoft has released the first Service Pack for SQL Server 2008 R2. This package contains a lot of improvements, in particular to the performance of Analysis Services.

You can download SP1 here: http://www.microsoft.com/download/en/details.aspx?id=20302
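
After installing, you can quickly check that the instance actually reports SP1. A minimal check (10.50.2500 is the SP1 build number; the exact last digits depend on any later cumulative updates):

-- Should return 'SP1' and a version starting with 10.50.2500 on a patched instance
SELECT SERVERPROPERTY('ProductLevel')   AS ProductLevel,
       SERVERPROPERTY('ProductVersion') AS ProductVersion;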

Also, when you're running SharePoint Server 2010 with Reporting Services integrated, you should update Reporting Services to the latest Service Pack as well. An important note: you should also download the SQL Server 2008 R2 SP1 Feature Pack to pick up some additional improvements.

You can download SP1 Feature Packs here: http://www.microsoft.com/download/en/details.aspx?id=26728

When you've downloaded the Feature Pack, you should also install the following components on all SharePoint servers:

  • rsSharePoint.msi
  • sqlncli.msi
  • SQLSERVER2008_ASADOMD10.msi

The first one is the SQL Server 2008 R2 Reporting Services Add-in for SharePoint Server 2010.

The second is the SQL Server 2008 R2 Native Client, which you also install as part of the prerequisites for SharePoint Server 2010.

The third is the Analysis Services ADOMD.NET client library, which is also part of the SharePoint Server 2010 prerequisites.

 

 


Unable to cast object of type ‘Microsoft.SqlServer.Management.Smo.SimpleObjectKey’

by on Feb.03, 2010, under Software, sql server

Today I was doing some work on a SQL Server 2005 instance with SP2. I ran into problems adding jobs and got the following error message:

Unable to cast object of type ‘Microsoft.SqlServer.Management.Smo.SimpleObjectKey’ to type ‘Microsoft.SqlServer.Management.Smo.Agent.JobObjectKey’.

The issue was that I was using client tools without SP2: you can't add jobs to a SQL Server 2005 SP2 instance with the SQL Server 2005 RTM client tools.
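
As a side note, and not what I did here: if you can't update the client tools right away, jobs can also be created server-side in plain T-SQL, which bypasses the client-side SMO version entirely. A minimal sketch (the job name, step and command below are made up for illustration):

-- Create the job directly on the server with the msdb procedures,
-- so the version of the client tools doesn't matter.
USE msdb;

EXEC dbo.sp_add_job
     @job_name = N'Nightly cleanup (example)';

EXEC dbo.sp_add_jobstep
     @job_name  = N'Nightly cleanup (example)',
     @step_name = N'Run cleanup',
     @subsystem = N'TSQL',
     @command   = N'EXEC dbo.usp_Cleanup;';   -- hypothetical procedure

EXEC dbo.sp_add_jobserver
     @job_name    = N'Nightly cleanup (example)',
     @server_name = N'(local)';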


Timestamp and validation

by on Jul.13, 2009, under Programming, sql server, SSIS

Timestamp in SQL Server:

1) The SQL Server timestamp data type has nothing to do with times or dates. SQL Server timestamps are binary numbers that indicate the relative sequence in which data modifications took place in a database. The timestamp data type was originally implemented to support the SQL Server recovery algorithms.

2) It further states: never use timestamp columns in keys, especially primary keys, because the timestamp value changes every time the row is modified.
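
A quick illustration of that sequencing behaviour (the actual values will differ on your server):

-- The timestamp/rowversion value changes on every modification of the row.
CREATE TABLE #t (Id int, Val varchar(10), Ts timestamp);
INSERT #t (Id, Val) VALUES (1, 'a');
SELECT Ts FROM #t;                    -- some binary(8) value
UPDATE #t SET Val = 'b' WHERE Id = 1;
SELECT Ts FROM #t;                    -- a new, higher value after the update
DROP TABLE #t;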

For a customer we made some changes, and the situation was as follows. We exported the data to Excel; multiple tables were exported. We changed the data in the Excel file and the customer validated it. We also exported the timestamp along with the data. Why? Because we had to import it back into the database, and if a timestamp had changed in the meantime, we should not update that row from the Excel file. That's why.

Problem situation:

Because a timestamp isn't a varchar or a numeric value but a binary(8) value, we had to convert it, since we import the data back into the database with SSIS. The data comes in from a flat text file, so every column is read as DT_STR by SSIS, while the parameter in the stored procedure had the timestamp data type. So effectively we were comparing a varchar against a timestamp, and that didn't work out well.

What we did first was change the parameter of the stored procedure to varchar, but SQL Server can't sensibly compare a varchar against a timestamp inside the stored procedure either. So we tried converting it to varbinary(8) or binary(8) and then validating it against the timestamp. That didn't work out well either:

declare @timestamp varbinary(8)
set @timestamp = CONVERT(varbinary(8), '0x0000000001F4AD88')
print @timestamp

returned: 0x3078303030303030

declare @timestamp varbinary(8)
set @timestamp = CONVERT(varbinary(8), 0x0000000001F4AD88)

print @timestamp

returned: 0x0000000001F4AD88

That comparison didn't work either, so I had to figure out another way to validate on the timestamp. Just as I was about to lose hope, I wrote a function in SQL Server:

create function [dbo].[sp_hexadecimal] ( @var varbinary(255) )
returns varchar(255)
as

begin
      declare @charval varchar(255)
      declare @i int
      declare @length int
      declare @hex char(16)

      select @charval = '0x'
      select @i = 1
      select @length = datalength(@var)
      select @hex = '0123456789abcdef'

      while (@i <= @length)
      begin
            declare @tempint int
            declare @firstint int
            declare @secondint int

            select @tempint = convert(int, substring(@var,@i,1))
            select @firstint = floor(@tempint/16)
            select @secondint = @tempint - (@firstint*16)
            select @charval = @charval +
            substring(@hex, @firstint+1, 1) +
            substring(@hex, @secondint+1, 1)

            select @i = @i + 1
      end

return ( @charval )
end

So that's what I did: instead of comparing the binary values, I converted the timestamp to a hexadecimal string with this function and validated the data that way. It did the trick...
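
For illustration, a minimal sketch of that kind of check (table and column names here are made up; depending on how the hex string was written to the flat file, you may also need UPPER or LOWER on one side to match case):

-- Only update rows whose timestamp has not changed since the export:
-- compare the converted rowversion against the string that came in via SSIS.
UPDATE t
SET    t.SomeValue = s.SomeValue
FROM   dbo.TargetTable   AS t
JOIN   dbo.ImportStaging AS s ON s.Id = t.Id
WHERE  dbo.sp_hexadecimal(t.RowTimestamp) = s.TimestampText;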


Stored procedure for Reindexing and Update Stats

by on May.09, 2009, under Programming, sql server

Well, I was searching for a stored procedure that you can use for reindexing, updating statistics, and so on.

I didn't find any nice procedure that gave me the solution I needed, so I read some posts found via Google and put together the following procedure 🙂

This procedure loops through all the tables and reindexes every index on them. When every index on every table has been reindexed, it updates all the statistics on the database.

Of course you can change the fill factor from 80 to any other valid percentage.

CREATE PROCEDURE spUtil_ReIndexDatabase_UpdateStats
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    DECLARE @MyTable VARCHAR(255)

    DECLARE myCursor CURSOR FOR
        SELECT table_name
        FROM information_schema.tables
        WHERE table_type = 'BASE TABLE'

    OPEN myCursor
    FETCH NEXT FROM myCursor INTO @MyTable

    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT 'Reindexing Table:  ' + @MyTable
        DBCC DBREINDEX(@MyTable, '', 80)

        FETCH NEXT FROM myCursor INTO @MyTable
    END

    CLOSE myCursor
    DEALLOCATE myCursor

    -- When all tables are done, refresh the statistics for the whole database.
    EXEC sp_updatestats
END
GO
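
Once it's created, running it is just a parameterless EXEC in the database you want to maintain:

EXEC spUtil_ReIndexDatabase_UpdateStats;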

SQL Server DB Log File truncating…

by on Jan.31, 2008, under maintenance, sql server

Well, a lot of database administrators forget to truncate their transaction log daily before backing the database up, and I don't understand why they forget it, because it's a simple step you can add before you back up the database. Today I received a database of 5 gigabytes with a log file of, believe it or not, 21 gigabytes. I was using the standard features of SQL Server 2000, but the normal truncate options of the database didn't do the job.

After a good search I found out that truncating the log by hand isn't that much fun. A search on Google quickly turned up the simplest script for this kind of database maintenance. I used this script and it really did work. You can see for yourself below:

SET NOCOUNT ON

DECLARE @LogicalFileName sysname, @MaxMinutes INT, @NewSize INT

-- *** MAKE SURE TO CHANGE THE NEXT 4 LINES WITH YOUR CRITERIA. ***
USE [DATABASE]                             -- This is the name of the database
                                           -- for which the log will be shrunk.
SELECT @LogicalFileName = 'LOGFILENAME',   -- Use sp_helpfile to identify the logical
                                           -- file name that you want to shrink.
       @MaxMinutes = 10,                   -- Limit on time allowed to wrap log.
       @NewSize = 10                       -- in MB

-- Setup / initialize
DECLARE @OriginalSize INT

SELECT @OriginalSize = size                -- in 8K pages
FROM sysfiles
WHERE name = @LogicalFileName

SELECT 'Original Size of ' + DB_NAME() + ' LOG is ' +
       CONVERT(VARCHAR(30), @OriginalSize) + ' 8K pages or ' +
       CONVERT(VARCHAR(30), (@OriginalSize * 8 / 1024)) + 'MB'
FROM sysfiles
WHERE name = @LogicalFileName

CREATE TABLE DummyTrans (DummyColumn CHAR(8000) NOT NULL)

-- Wrap log and truncate it.
DECLARE @Counter INT, @StartTime DATETIME, @TruncLog VARCHAR(255)

SELECT @StartTime = GETDATE(),
       @TruncLog = 'BACKUP LOG [' + DB_NAME() + '] WITH TRUNCATE_ONLY'

-- Try an initial shrink.
DBCC SHRINKFILE (@LogicalFileName, @NewSize)
EXEC (@TruncLog)

-- Wrap the log if necessary.
WHILE @MaxMinutes > DATEDIFF(mi, @StartTime, GETDATE())                          -- time has not expired
  AND @OriginalSize = (SELECT size FROM sysfiles WHERE name = @LogicalFileName)  -- the log has not shrunk
  AND (@OriginalSize * 8 / 1024) > @NewSize                                      -- the requested new size is smaller than the current size
BEGIN -- Outer loop.
    SELECT @Counter = 0

    WHILE ((@Counter < @OriginalSize / 16) AND (@Counter < 50000))
    BEGIN -- Update.
        INSERT DummyTrans VALUES ('Fill Log')   -- Because it is a char field it inserts 8000 bytes.
        DELETE DummyTrans
        SELECT @Counter = @Counter + 1
    END -- Update.

    EXEC (@TruncLog)                            -- See if a truncate of the log shrinks it.
END -- Outer loop.

SELECT 'Final Size of ' + DB_NAME() + ' LOG is ' +
       CONVERT(VARCHAR(30), size) + ' 8K pages or ' +
       CONVERT(VARCHAR(30), (size * 8 / 1024)) + 'MB'
FROM sysfiles
WHERE name = @LogicalFileName

DROP TABLE DummyTrans

PRINT '*** Perform a full database backup ***'
SET NOCOUNT OFF

Voilà, fill in the database name and the logical log file name and you're done. Run the script in Query Analyzer or another tool of your choice. I used this script with SQL Server 2000 and SQL Server 2005, and it can also be used with SQL Server 7, so it covers the major Microsoft database versions. Have fun!!
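
If you're not sure what the logical name of the log file is, sp_helpfile (also mentioned in the script's comments) will show it. A quick sketch, with a made-up database name:

USE YourDatabase   -- hypothetical name, replace with your own
EXEC sp_helpfile   -- the 'name' column of the .ldf row is the logical log file name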

