I have a script, found online and adapted to my database instance, that checks whether a specific table exists in every SQL Server database on the instance. If the table is found and is not empty, it lists the name of the database containing it. That all works; however, if a database is in the RESTORING state, my script gets stuck on that database. I tried different options such as STATE = 0 (which is ONLINE) or state_desc != 'RESTORING' in the WHERE clause, but it still fails. The databases in the RESTORING state aren't the worry here, as sometimes someone may genuinely be trying to restore; all I want is to not query a database if it isn't ONLINE. I wanted to somehow make use of SELECT DATABASEPROPERTYEX ('DatabaseName', 'Status') but can't figure out how. I am not a T-SQL master, so I'm really hoping someone can tweak this query to skip the databases in the RESTORING state. That would be much appreciated, thank you.
SET NOCOUNT ON;
IF OBJECT_ID (N'tempdb.dbo.#temptbl') IS NOT NULL DROP TABLE #temptbl
CREATE TABLE #temptbl ([COUNT] INT , SiteDBName VARCHAR(50) )
DECLARE @TableName NVARCHAR(50)
SELECT @TableName = '[dbo].[PRJMGTLocation]'
DECLARE @SQL NVARCHAR(MAX)
SELECT @SQL = STUFF(
( SELECT CHAR(13) + 'SELECT ''' + name + ''', COUNT(1) FROM [' + name + '].' + @TableName
FROM sys.databases
WHERE OBJECT_ID(name + '.' + @TableName) IS NOT NULL
-- and STATE = 0 and state_desc != 'RESTORING'
FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 1, '')
INSERT INTO #temptbl (SiteDBName, [COUNT])
EXEC sys.sp_executesql @SQL
SELECT * FROM #temptbl WHERE [COUNT] >= 1 ORDER BY SiteDBName ASC
Can you try the following slight tweak:
select @SQL = Stuff(
(select Char(13) + 'SELECT ''' + name + ''', COUNT(1) FROM ' + QuoteName(name) + '.' + @TableName
from (select top(1000) name from sys.databases where state = 0) d
where Object_Id(d.name + '.' + @TableName) is not null
for xml path(''), type).value('.', 'NVARCHAR(MAX)'), 1, 1, '')
The issue is that the filtering of rows from sys.databases happens after the OBJECT_ID() lookup; using a row goal in a sub-query can coerce SQL Server into doing it the other way round, so a restoring database is never probed.
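Putting it together, the whole script with the row-goal sub-query would look roughly like this (a sketch, assuming the same #temptbl layout and table name as in the question):

```sql
SET NOCOUNT ON;

IF OBJECT_ID(N'tempdb.dbo.#temptbl') IS NOT NULL DROP TABLE #temptbl;
CREATE TABLE #temptbl ([COUNT] INT, SiteDBName VARCHAR(50));

DECLARE @TableName NVARCHAR(50) = '[dbo].[PRJMGTLocation]';
DECLARE @SQL NVARCHAR(MAX);

SELECT @SQL = STUFF(
    (SELECT CHAR(13) + 'SELECT ''' + name + ''', COUNT(1) FROM ' + QUOTENAME(name) + '.' + @TableName
     -- the TOP (1000) row goal encourages SQL Server to apply the state filter
     -- BEFORE the OBJECT_ID() probe ever touches a restoring database
     FROM (SELECT TOP (1000) name FROM sys.databases WHERE state = 0) d
     WHERE OBJECT_ID(d.name + '.' + @TableName) IS NOT NULL
     FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 1, '');

INSERT INTO #temptbl (SiteDBName, [COUNT])
EXEC sys.sp_executesql @SQL;

SELECT * FROM #temptbl WHERE [COUNT] >= 1 ORDER BY SiteDBName;
```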
I'm looking to search for unwanted control characters in an MSSQL database.
I currently use a stored procedure that gets created against the database I need to search, but it only works when searching for a simple character or string of characters. See below for the procedure as it stands (it was originally taken from this site):
CREATE PROC SearchAllTables
(
@SearchStr nvarchar(100)
)
AS
BEGIN
-- Creates a Stored Procedure for the database
-- When running the procedure, set the @SearchStr parameter to the character you are searching for
CREATE TABLE #Results (ColumnName nvarchar(370), ColumnValue nvarchar(3630))
SET NOCOUNT ON
DECLARE @TableName nvarchar(256), @ColumnName nvarchar(128), @SearchStr2 nvarchar(110)
SET @TableName = ''
SET @SearchStr2 = QUOTENAME('%' + @SearchStr + '%','''')
WHILE @TableName IS NOT NULL
BEGIN
SET @ColumnName = ''
SET @TableName =
(
SELECT MIN(QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME))
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
AND QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME) > @TableName
AND OBJECTPROPERTY(
OBJECT_ID(
QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
), 'IsMSShipped'
) = 0
)
WHILE (@TableName IS NOT NULL) AND (@ColumnName IS NOT NULL)
BEGIN
SET @ColumnName =
(
SELECT MIN(QUOTENAME(COLUMN_NAME))
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = PARSENAME(@TableName, 2)
AND TABLE_NAME = PARSENAME(@TableName, 1)
AND DATA_TYPE IN ('char', 'varchar', 'nchar', 'nvarchar')
AND QUOTENAME(COLUMN_NAME) > @ColumnName
)
IF @ColumnName IS NOT NULL
BEGIN
INSERT INTO #Results
EXEC
(
'SELECT ''' + @TableName + '.' + @ColumnName + ''', LEFT(' + @ColumnName + ', 3630)
FROM ' + @TableName + ' (NOLOCK) ' +
' WHERE ' + @ColumnName + ' LIKE ' + @SearchStr2
)
END
END
END
SELECT ColumnName, ColumnValue FROM #Results
END
Now, I need to alter this to allow me to search for a list of control characters:
'%['
+ CHAR(0)+CHAR(1)+CHAR(2)+CHAR(3)+CHAR(4)
+ CHAR(5)+CHAR(6)+CHAR(7)+CHAR(8)+CHAR(9)
+ CHAR(10)+CHAR(11)+CHAR(12)+CHAR(13)+CHAR(14)
+ CHAR(15)+CHAR(16)+CHAR(17)+CHAR(18)+CHAR(19)
+ CHAR(20)+CHAR(21)+CHAR(22)+CHAR(23)+CHAR(24)
+ CHAR(25)+CHAR(26)+CHAR(27)+CHAR(28)+CHAR(29)
+ CHAR(30)+CHAR(31)+CHAR(127)
+ ']%',
The procedure as it stands won't allow me to use this as a search string, and it won't search correctly even with a single control character, e.g. CHAR(28):
USE [DBNAME]
GO
DECLARE @return_value int
EXEC @return_value = [dbo].[SearchAllTables]
@SearchStr = N'CHAR (28)'
SELECT 'Return Value' = @return_value
GO
Removing the N'' from the @SearchStr in the example above results in the error message:
Incorrect syntax near '28'
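That error is expected: T-SQL does not allow an expression such as CHAR(28) to be passed directly as a procedure parameter, only a constant or a variable. A minimal sketch of the workaround (the variable name @ctrl is made up for illustration):

```sql
-- Assign the expression to a variable first, then pass the variable;
-- passing CHAR(28) inline is what triggers "Incorrect syntax near '28'".
DECLARE @ctrl nvarchar(10) = CHAR(28);
EXEC dbo.SearchAllTables @SearchStr = @ctrl;
```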
Can anyone help with a way of adapting this procedure to allow the search of control characters?
I would opt for a dynamic CharIndex(). Consider the following
Declare @ColumnName varchar(25) = '[SomeField]'
Declare @SearchFor nvarchar(max) = 'CHAR(0),CHAR(1),CHAR(2),CHAR(3),CHAR(4),CHAR(5),CHAR(6),CHAR(7),CHAR(8),CHAR(9),CHAR(10),CHAR(11),CHAR(12),CHAR(13),CHAR(14),CHAR(15),CHAR(16),CHAR(17),CHAR(18),CHAR(19),CHAR(20),CHAR(21),CHAR(22),CHAR(23),CHAR(24),CHAR(25),CHAR(26),CHAR(27),CHAR(28),CHAR(29),CHAR(30),CHAR(31),CHAR(127)'
Set @SearchFor = 'CharIndex(' + Replace(@SearchFor, ',', ',' + @ColumnName + ')+CharIndex(') + ',' + @ColumnName + ')'
So your dynamic WHERE would look something like this:
' WHERE ' + @SearchFor + '>0'
Just for illustration, the @SearchFor string would look like this:
CharIndex(CHAR(0),[SomeField])+CharIndex(CHAR(1),[SomeField])+...+CharIndex(CHAR(31),[SomeField])+CharIndex(CHAR(127),[SomeField])
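As a quick sanity check of the idea, the summed CharIndex() expression can be tried against a literal (a standalone sketch; the test string and the two characters chosen are made up for illustration):

```sql
-- CHAR(9) (tab) is embedded in the test string, so at least one
-- CharIndex() term is non-zero and the sum is > 0.
SELECT CASE
         WHEN CharIndex(CHAR(9),  'ab' + CHAR(9) + 'cd')
            + CharIndex(CHAR(28), 'ab' + CHAR(9) + 'cd') > 0
         THEN 'contains control chars'
         ELSE 'clean'
       END AS Result;
```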
It looks like QUOTENAME is what is breaking things for you. When you pass it certain characters, such as CHAR(0), it returns NULL. Because of this, you are probably better off adding the single quotes yourself.
This means you would want to change this part:
INSERT INTO #Results
EXEC
(
'SELECT ''' + @TableName + '.' + @ColumnName + ''', LEFT(' + @ColumnName + ', 3630)
FROM ' + @TableName + ' (NOLOCK) ' +
' WHERE ' + @ColumnName + ' LIKE ' + @SearchStr2
)
to this:
INSERT INTO #Results
EXEC
(
'SELECT ''' + @TableName + '.' + @ColumnName + ''', LEFT(' + @ColumnName + ', 3630)
FROM ' + @TableName + ' (NOLOCK) ' +
' WHERE ' + @ColumnName + ' LIKE ''' + @SearchStr + ''' -- Note the use of @SearchStr (not @SearchStr2) and the additional quotes wrapping the search string.
)
Which should allow you to use your %[...]% pattern matching syntax.
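With that change in place, the call can pass the bracket pattern straight through (a sketch, assuming the modified procedure above; the variable name @pattern is made up for illustration):

```sql
-- Build the %[...]% pattern over all the control characters and pass it
-- as-is; the procedure now adds the surrounding quotes itself instead of
-- relying on QUOTENAME, which returns NULL for characters like CHAR(0).
DECLARE @pattern nvarchar(200) =
      '%[' + CHAR(0)  + CHAR(1)  + CHAR(2)  + CHAR(3)  + CHAR(4)
    + CHAR(5)  + CHAR(6)  + CHAR(7)  + CHAR(8)  + CHAR(9)
    + CHAR(10) + CHAR(11) + CHAR(12) + CHAR(13) + CHAR(14)
    + CHAR(15) + CHAR(16) + CHAR(17) + CHAR(18) + CHAR(19)
    + CHAR(20) + CHAR(21) + CHAR(22) + CHAR(23) + CHAR(24)
    + CHAR(25) + CHAR(26) + CHAR(27) + CHAR(28) + CHAR(29)
    + CHAR(30) + CHAR(31) + CHAR(127) + ']%';

EXEC dbo.SearchAllTables @SearchStr = @pattern;
```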
Concerns:
Performance
As you probably know, wildcards (%) at both the beginning and end of the argument prevent your SARG from using an index seek (even if the plan claims an INDEX SCAN), as SQL Server has no idea where the matching values will be. In the worst case it will read everything.
More grievous, the final EXEC statement you fire off makes SQL Server jump through hoops. Despite what you might think, SQL Server initializes variables at execution time; the optimizer is effectively running with its bed-clothes still on in the middle of executing the query plan, and the plan may end up changing several times.
An example of what might be unleashed occurred on one of my DBs a month ago, where a terrible new plugin ran a simple query looking for one row, with just two badly parameterized predicates, on a large table of 1 million rows. Yet the optimizer swallowed up trillions of IOs in a matter of seconds (the query came and went too fast for a governor) and sent 2 billion rows PER QUERY through the network.
Tragically, the issue was zombied that day, and with just 500 one-row result sets in my database running repeatedly, it brought down our server.
Isolation and Transactions
Guessing haphazardly, expect locking issues and swallowed-up resources. Major operations such as UPDATE, reindexing, and ALTER statements will either be forced to wait or will kick your query to the curb. Even READ UNCOMMITTED will not save you from some blocking issues.
A New Approach
All of the characters you listed are neither letters nor numbers but meaningless garbage (to SQL Server) that flows in from a front-end application. I noticed you excluded Microsoft system tables, so where does your data flow come from, and how is it disseminated throughout the database? Who is at fault? What roles do the system, the user, and the designer play in the mess?
Is this server OLTP or read-heavy? Does your org not have a capable SSIS/ETL system to prevent garbage from wreaking havoc on your server?
Database Constraints
Why does your application fail to pre-cleanse the data before sending it? And once the data reaches the database level, why can we not use both the DATA TYPE and TABLE CONSTRAINTS to our advantage? Simple steps, like using DATE instead of VARCHAR for storing dates, or normalizing instead of storing blobs so that read-heavy tables are isolated from write-heavy ones, can work wonders.
Admittedly, CHECK CONSTRAINTS can seriously degrade the performance of your INSERT statements, so you will need to weigh the larger impact.
Preventative vs Prescriptive
While I could ostensibly write a query that solves your current question (encapsulating EXEC statements in another stored procedure enables proper parameter sniffing), we need to ask more and write less code. Your procedure is terrible now and always will be, even if we window-dress it. It masks the real issue of how those control characters got there in the first place and forces expensive queries on your poor system.
How your tables relate, normalization, and cardinality should mean something to you, so that you can discriminate not only between types of tables but between the specific columns they possess. Your current trouble would be disastrous on many of my databases, which can reach 1.5+ terabytes in size.
The more you gather your requirements, the better your answer will be. Heck, even setting up a database entirely for ETL would be superior to your current solution. And even if you still end up running a similar query, at least you will have shortened your list of columns and tables to a small, understandable list instead of blindly inflicting pain on everyone in your company.
Best wishes!
I would like to use a variable in place of the schema name when creating tables or views.
E.g. instead of
create table dbo.TableName
I want
create table @schema.TableName
I would also like to write the statement below using the same approach.
IF OBJECT_ID (N'dbo.TableName', 'u') IS NOT NULL
DROP table TableName;
go
Is this possible at all, and if it is, which way is more efficient?
You can build your statement by concatenating object names, then use sp_executesql to submit it to the server:
Using sp_executesql
For example, creating:
DECLARE @SQLString NVARCHAR(500);
DECLARE @TableName NVARCHAR(100);
SET @TableName = 'dbo.TableName';
SET @SQLString = 'CREATE TABLE ' + @TableName + ' ...';
EXECUTE sp_executesql @SQLString;
and dropping:
SET @TableName = 'dbo.TableName';
SET @SQLString = 'IF OBJECT_ID (''' + @TableName + ''', ''u'') IS NOT NULL DROP table ' + @TableName;
EXECUTE sp_executesql @SQLString;
You'll have to mitigate the SQL injection risk if you are getting the object names from the user.
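One way to mitigate it (a sketch; @schemaName and @tableName stand in for whatever the user supplies) is to quote each identifier separately with QUOTENAME before concatenating:

```sql
DECLARE @schemaName sysname = 'dbo';        -- illustrative user input
DECLARE @tableName  sysname = 'TableName';  -- illustrative user input
DECLARE @SQLString  nvarchar(500);

-- QUOTENAME brackets each identifier, so a value such as
-- 'x]; DROP TABLE y; --' cannot break out of the object name.
SET @SQLString = 'CREATE TABLE '
    + QUOTENAME(@schemaName) + '.' + QUOTENAME(@tableName)
    + ' (Id int NOT NULL)';

EXECUTE sp_executesql @SQLString;
```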
You need to research "sqlcmd" mode.
http://msdn.microsoft.com/en-us/library/ms174187.aspx
Example. Note, in SSMS, Under "Query" there is a "SqlCmd Mode" menu item you need to enable.
:setvar MySchemaName "dbo"
IF EXISTS ( SELECT * FROM INFORMATION_SCHEMA.TABLES where TABLE_SCHEMA = N'$(MySchemaName)' and TABLE_NAME = N'Ticket' and TABLE_TYPE = N'BASE TABLE' )
BEGIN
DROP TABLE [$(MySchemaName)].[Ticket]
END
GO
CREATE TABLE [$(MySchemaName)].[Ticket] (
[TicketUUID] [uniqueidentifier] NOT NULL,
[CreateDate] [datetime] NOT NULL DEFAULT CURRENT_TIMESTAMP
)
GO
SELECT * FROM INFORMATION_SCHEMA.TABLES where TABLE_SCHEMA = N'$(MySchemaName)' and TABLE_NAME = N'Ticket' and TABLE_TYPE = N'BASE TABLE'
My database query had been running very fast until it recently became very slow. No changes have occurred in the database apart from normal data growth.
I have noticed that the database statistics have "never" been updated.
Is there an easy way that I can update these statistics across my entire database so I can see if that is the problem?
I am using SQL Server 2000 Sp4.
You can use this
CREATE PROC usp_UPDATE_STATISTICS
(@dbName sysname, @sample int)
AS
SET NOCOUNT ON
DECLARE @SQL nvarchar(4000)
DECLARE @ID int
DECLARE @TableName sysname
DECLARE @RowCnt int
CREATE TABLE ##Tables
(
TableID INT IDENTITY(1, 1) NOT NULL,
TableName SYSNAME NOT NULL
)
SET @SQL = ''
SET @SQL = @SQL + 'INSERT INTO ##Tables (TableName) '
SET @SQL = @SQL + 'SELECT [name] '
SET @SQL = @SQL + 'FROM ' + @dbName + '.dbo.sysobjects '
SET @SQL = @SQL + 'WHERE xtype = ''U'' AND [name] <> ''dtproperties'''
EXEC sp_executesql @statement = @SQL
SELECT TOP 1 @ID = TableID, @TableName = TableName
FROM ##Tables
ORDER BY TableID
SET @RowCnt = @@ROWCOUNT
WHILE @RowCnt <> 0
BEGIN
SET @SQL = 'UPDATE STATISTICS ' + @dbName + '.dbo.[' + @TableName + '] WITH SAMPLE ' + CONVERT(varchar(3), @sample) + ' PERCENT'
EXEC sp_executesql @statement = @SQL
SELECT TOP 1 @ID = TableID, @TableName = TableName
FROM ##Tables
WHERE TableID > @ID
ORDER BY TableID
SET @RowCnt = @@ROWCOUNT
END
DROP TABLE ##Tables
GO
This will update statistics on all the tables in the DB. You should also look at your indexes and rebuild / defrag as necessary.
Raj
Try here
This should speed things up by refreshing the key distribution of your indexes. Re-analyzing table statistics optimises SQL Server's choice of index for queries, especially on large datasets.
Definitely set up a weekly task that runs automatically to update the database's statistics.
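For such a job, the simplest built-in option is sp_updatestats, which walks every user table in the current database (a sketch; the database and table names are illustrative):

```sql
-- Run in the context of the database whose statistics are stale.
USE MyDatabase;          -- hypothetical database name
EXEC sp_updatestats;

-- Or, for a single table, sample every row instead of a percentage:
UPDATE STATISTICS dbo.SalesHistory WITH FULLSCAN;
```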
Normal data growth alone is reason enough to explain a slowdown of pretty much any unoptimized query.
Scalability issues related to database size won't manifest until the data volume grows.
Post your query plus a rough data volume and we'll help you see what's what.
We've had a very similar problem with MSSQL 2005 and suddenly slow running queries.
Here's how we solved it: we added (nolock) for every select statement in the query. For example:
select count(*) from SalesHistory with(nolock)
Note that nolock should also be added to nested select statements, as well as to joins. Here's an article giving more detail on how nolock can improve query performance: http://www.mollerus.net/tom/blog/2008/03/using_mssqls_nolock_for_faster_queries.html
Don't forget to keep a backup of your original query obviously. Please give it a try and let me know.
I am using a linked server to transfer data via MSDTC.
Alter Proc [dbo].[usp_Select_TransferingDatasFromServerCheckingforExample]
@RserverName varchar(100), ----- Server Name
@RUserid Varchar(100), ----- server user id
@RPass Varchar(100), ----- Server Password
@DbName varchar(100) ----- Server database
As
Set nocount on
Set Xact_abort on
Declare @user varchar(100)
Declare @userID varchar(100)
Declare @Db Varchar(100)
Declare @Lserver varchar(100)
Select @Lserver = @@servername
Select @userID = suser_name()
select @user = user
Exec('if exists(Select 1 From [Master].[' + @user + '].[sysservers] where srvname = ''' +
@RserverName + ''') begin Exec sp_droplinkedsrvlogin ''' + @RserverName + ''',''' + @userID +
''' exec sp_dropserver ''' + @RserverName + ''' end ')
declare @ColumnList varchar(max)
set @ColumnList = null
Select @ColumnList = case when @ColumnList is not null then @ColumnList + ',' + quotename(name) else quotename(name) end
From syscolumns where Id = object_id('Crnot') order by colid
Set identity_Insert Crnot On
exec ('Insert Into [' + @RserverName + '].' + @DbName + '.' + @user + '.Crnot (' + @ColumnList + ') Select ' + @ColumnList + ' from Crnot ')
Set identity_Insert Crnot Off
Exec sp_droplinkedsrvlogin @RserverName, @userID
Exec sp_dropserver @RserverName
When executing this query I get the error "No transaction Active".
Check your MS DTC configuration (cut and paste from a doc, not checked recently):
Start, Run, dcomcnfg.exe
In the Component Services window, expand Component Services... Computers...My Computer.
Right-click My Computer, Properties.
Click Security Configuration on the MSDTC tab.
Click to select the Network DTC Access check box.
Set both the Allow Inbound and Allow Outbound check boxes
Under the Transaction Manager Communication group, click to select the No Authentication Required option.
Verify that the DTC Logon Account name is set to NT AUTHORITY\NetworkService.
Click Ok etc
In your code, SET IDENTITY_INSERT Crnot ON only applies to local objects; it needs to be part of the dynamic SQL that performs the INSERT.
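An untested sketch of that change, keeping the same names as the procedure above (it assumes SET IDENTITY_INSERT accepts the same four-part name as the INSERT; if the remote server rejects it, the batch may need to be shipped to the remote side instead):

```sql
-- SET IDENTITY_INSERT is scoped to the batch in which it runs,
-- so it must travel inside the same dynamic batch as the INSERT.
exec ('Set identity_Insert [' + @RserverName + '].' + @DbName + '.' + @user + '.Crnot On '
    + 'Insert Into [' + @RserverName + '].' + @DbName + '.' + @user + '.Crnot (' + @ColumnList + ') '
    + 'Select ' + @ColumnList + ' from Crnot '
    + 'Set identity_Insert [' + @RserverName + '].' + @DbName + '.' + @user + '.Crnot Off')
```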