Tag: sql server
SQL Server 2008 R2 Service Pack 1 is released (SharePoint info)
by mysticslayer on Jul.30, 2011, under maintenance, Reporting Services, SharePoint 2010, SharePoint Foundation, sql server, Update
So after some time Microsoft has just released their first Service Pack for SQL Server 2008 R2. This package contains a lot of improvements, especially performance improvements for Analysis Services.
You can download SP1 here: http://www.microsoft.com/download/en/details.aspx?id=20302
Also, when you're running SharePoint Server 2010 with Reporting Services integrated, you should update Reporting Services to the latest Service Pack as well. An important note on this: you should also download the SQL Server 2008 R2 SP1 Feature Pack to pick up some additional improvements.
You can download SP1 Feature Packs here: http://www.microsoft.com/download/en/details.aspx?id=26728
When you've downloaded the Feature Pack, you should also install the following components on all SharePoint servers:
- rsSharePoint.msi
- sqlncli.msi
- SQLSERVER2008_ASADOMD10.msi
The first one is the SQL Server 2008 R2 Reporting Services Add-in for SharePoint Server 2010.
The second is the SQL Server 2008 Native Client, which is also installed as part of the prerequisites for SharePoint Server 2010.
The third is the SQL Server 2008 Analysis Services ADOMD.NET provider, another component that the SharePoint Server 2010 prerequisites install.
Unable to cast object of type ‘Microsoft.SqlServer.Management.Smo.SimpleObjectKey’
by mysticslayer on Feb.03, 2010, under Software, sql server
Today I was doing some tasks on SQL Server 2005 with SP2. I had some problems adding jobs and got the following error message:
Unable to cast object of type ‘Microsoft.SqlServer.Management.Smo.SimpleObjectKey’ to type ‘Microsoft.SqlServer.Management.Smo.Agent.JobObjectKey’.
The issue was using client tools without SP2: you can't add jobs to a SQL Server 2005 SP2 instance with the SQL Server 2005 RTM client tools.
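For reference, you can check the service pack level of the server itself with SERVERPROPERTY:
-- Returns for example 'SP2' and the corresponding build number of the instance
SELECT SERVERPROPERTY('ProductLevel')   AS ServicePack,
       SERVERPROPERTY('ProductVersion') AS Build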
Timestamp and validation
by mysticslayer on Jul.13, 2009, under Programming, sql server, SSIS
Timestamp in SQL Server:
1) The SQL Server timestamp data type has nothing to do with times or dates. SQL Server timestamps are binary numbers that indicate the relative sequence in which data modifications took place in a database. The timestamp data type was originally implemented to support the SQL Server recovery algorithms.
2) It further states: never use timestamp columns in keys, especially primary keys, because the timestamp value changes every time the row is modified.
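A quick demo of that behaviour (the table here is only an example):
-- The timestamp value is maintained by SQL Server and changes on every modification of the row
CREATE TABLE dbo.TimestampDemo
(
    Id   INT IDENTITY(1,1) PRIMARY KEY,
    Name VARCHAR(50),
    ts   TIMESTAMP                            -- stored as binary(8)
)

INSERT INTO dbo.TimestampDemo (Name) VALUES ('original')
SELECT Id, Name, ts FROM dbo.TimestampDemo    -- note the ts value

UPDATE dbo.TimestampDemo SET Name = 'changed' WHERE Id = 1
SELECT Id, Name, ts FROM dbo.TimestampDemo    -- ts now has a different (higher) value

DROP TABLE dbo.TimestampDemo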
For a customer we made some changes, and the situation was as follows. We exported the data to Excel, multiple tables in total. We changed the data in the Excel file and the customer validated it. We also exported the timestamp along with the data. Why? Because we had to import it back into the database, and if a timestamp had changed in the meantime, we should not overwrite that row with the data from the Excel file. That's why.
Problem situation:
Because a timestamp isn't a varchar or numeric value but a binary(8) value, we had to convert it. We import the data back into the database with SSIS, but the data comes from a flat text file, so every column is read as DT_STR by SSIS, while the parameter in the stored procedure had the timestamp data type. In other words, we were comparing a varchar against a timestamp, and that didn't work out well.
So we changed the parameter of the stored procedure to varchar. But SQL Server can't sensibly compare a varchar against a timestamp inside the stored procedure either. We then tried converting the value to a varbinary(8) or binary(8) and validating that against the timestamp. That didn't work out well either:
declare @timestamp varbinary(8)
set @timestamp = CONVERT(varbinary(8), '0x0000000001F4AD88')  -- string literal: the characters themselves are converted
print @timestamp
-- returned: 0x3078303030303030 (the ASCII bytes of '0x000000', not the original value)

declare @timestamp varbinary(8)
set @timestamp = CONVERT(varbinary(8), 0x0000000001F4AD88)    -- binary literal: converts as expected
print @timestamp
-- returned: 0x0000000001F4AD88
That comparison didn't work well either, so I had to find another way to validate on the timestamp. When I had almost lost hope, I wrote a function in SQL Server:
-- Converts a varbinary value to its hexadecimal string representation ('0x...')
create function [dbo].[sp_hexadecimal] ( @var varbinary(255) )
returns varchar(255)
as
begin
    declare @charval varchar(255)
    declare @i int
    declare @length int
    declare @hex char(16)

    select @charval = '0x'
    select @i = 1
    select @length = datalength(@var)
    select @hex = '0123456789abcdef'

    -- Walk through the binary value byte by byte
    while (@i <= @length)
    begin
        declare @tempint int
        declare @firstint int
        declare @secondint int

        select @tempint = convert(int, substring(@var, @i, 1))
        select @firstint = floor(@tempint / 16)          -- high nibble
        select @secondint = @tempint - (@firstint * 16)  -- low nibble

        -- Append the two hex characters for this byte
        select @charval = @charval +
               substring(@hex, @firstint + 1, 1) +
               substring(@hex, @secondint + 1, 1)

        select @i = @i + 1
    end

    return ( @charval )
end
So instead of comparing the binary values directly, I converted the timestamp to its hexadecimal string representation with this function and compared that against the varchar value from the flat file.
It did the trick...
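A minimal usage sketch of that comparison (the table and column names here are hypothetical, just to illustrate the idea):
declare @imported varchar(255)
set @imported = '0x0000000001f4ad88'                     -- the timestamp value as it was read from the flat file (DT_STR)

-- Compare the stored timestamp, converted to a hex string, against the imported varchar value
if exists (select 1
           from dbo.Customer                             -- hypothetical table
           where Id = 1
             and dbo.sp_hexadecimal(ts) = @imported)     -- ts is the timestamp column
    print 'Timestamp unchanged - safe to update this row'
else
    print 'Timestamp changed - skip this row'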
Stored procedure for Reindexing and Update Stats
by mysticslayer on May.09, 2009, under Programming, sql server
Well, I was searching for a stored procedure that you can use for reindexing, updating statistics, etc.
I didn't find any nice procedure that gave me the solution I needed, so I read some posts via Google and put together the following procedure 🙂
This procedure loops through all the tables and reindexes every index on them. When every index on every table has been reindexed, it updates all the statistics on the database.
Of course you can change the fill factor from 80 to any other valid percentage.
CREATE PROCEDURE spUtil_ReIndexDatabase_UpdateStats
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    DECLARE @MyTable VARCHAR(255)

    -- Loop through all user tables in the database
    DECLARE myCursor CURSOR FOR
        SELECT table_name
        FROM information_schema.tables
        WHERE table_type = 'BASE TABLE'

    OPEN myCursor
    FETCH NEXT FROM myCursor INTO @MyTable

    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT 'Reindexing Table: ' + @MyTable
        -- Rebuild all indexes on the table with a fill factor of 80
        DBCC DBREINDEX(@MyTable, '', 80)
        FETCH NEXT FROM myCursor INTO @MyTable
    END

    CLOSE myCursor
    DEALLOCATE myCursor

    -- When all tables are reindexed, update the statistics on the whole database
    EXEC sp_updatestats
END
GO
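Once the procedure is created, you can run it manually or, for example, schedule it as a nightly SQL Server Agent job:
-- Rebuild all indexes and update statistics for the current database
EXEC spUtil_ReIndexDatabase_UpdateStats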
SQL Server DB Log File truncating…
by mysticslayer on Jan.31, 2008, under maintenance, sql server
Well, a lot of database administrators forget to truncate their database log before backing the database up. I don't understand why they forget it, because it's a simple step you can add before you back up the database. Today I received a database of 5 gigabytes with a log file of, yes, 21 gigabytes. I was using the standard features of SQL Server 2000, but the normal truncate and shrink options of the database did not do the job here.
After a good search I found out it isn't that much fun to truncate the log by hand. After a search on Google it was easy to find a simple script for this kind of database maintenance. I used this script and it really did work. You can see for yourself below:
SET NOCOUNT ON

DECLARE @LogicalFileName sysname,
        @MaxMinutes INT,
        @NewSize INT

-- *** MAKE SURE TO CHANGE THE NEXT 4 LINES WITH YOUR CRITERIA. ***
USE [DATABASE]                            -- This is the name of the database for which the log will be shrunk.
SELECT @LogicalFileName = 'LOGFILENAME',  -- Use sp_helpfile to identify the logical file name that you want to shrink.
       @MaxMinutes = 10,                  -- Limit on time allowed to wrap log.
       @NewSize = 10                      -- in MB

-- Setup / initialize
DECLARE @OriginalSize INT

SELECT @OriginalSize = size               -- in 8K pages
FROM sysfiles
WHERE name = @LogicalFileName

SELECT 'Original Size of ' + DB_NAME() + ' LOG is ' +
       CONVERT(VARCHAR(30), @OriginalSize) + ' 8K pages or ' +
       CONVERT(VARCHAR(30), (@OriginalSize * 8 / 1024)) + 'MB'
FROM sysfiles
WHERE name = @LogicalFileName

CREATE TABLE DummyTrans (DummyColumn CHAR(8000) NOT NULL)

-- Wrap log and truncate it.
DECLARE @Counter   INT,
        @StartTime DATETIME,
        @TruncLog  VARCHAR(255)

SELECT @StartTime = GETDATE(),
       @TruncLog = 'BACKUP LOG [' + DB_NAME() + '] WITH TRUNCATE_ONLY'

-- Try an initial shrink.
DBCC SHRINKFILE (@LogicalFileName, @NewSize)
EXEC (@TruncLog)

-- Wrap the log if necessary.
WHILE @MaxMinutes > DATEDIFF(mi, @StartTime, GETDATE())                          -- time has not expired
  AND @OriginalSize = (SELECT size FROM sysfiles WHERE name = @LogicalFileName)  -- the log has not shrunk
  AND (@OriginalSize * 8 / 1024) > @NewSize                                      -- the new size is smaller than the current size
BEGIN -- Outer loop.
    SELECT @Counter = 0
    WHILE ((@Counter < @OriginalSize / 16) AND (@Counter < 50000))
    BEGIN -- update
        INSERT DummyTrans VALUES ('Fill Log')   -- Because it is a char field it inserts 8000 bytes.
        DELETE DummyTrans
        SELECT @Counter = @Counter + 1
    END -- update
    EXEC (@TruncLog)                            -- See if a trunc of the log shrinks it.
END -- outer loop

SELECT 'Final Size of ' + DB_NAME() + ' LOG is ' +
       CONVERT(VARCHAR(30), size) + ' 8K pages or ' +
       CONVERT(VARCHAR(30), (size * 8 / 1024)) + 'MB'
FROM sysfiles
WHERE name = @LogicalFileName

DROP TABLE DummyTrans

PRINT '*** Perform a full database backup ***'

SET NOCOUNT OFF
Voila, fill in the database name and the logical log file name and you're done. Run the script in Query Analyzer or another useful tool. I used this script with SQL Server 2000 and SQL Server 2005, and it can also be used with SQL Server 7. So you can use this script with all the major Microsoft database versions. Have fun!!
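To find the logical log file name you need to fill in, sp_helpfile (the same procedure mentioned in the script's comments) lists all the files of the current database:
USE MyDatabase      -- example name: the database whose log you want to shrink
EXEC sp_helpfile    -- the 'name' column of the log file row is the logical file name to use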