Query multiple SQL Servers in One query - sql-server

I am trying to set up a query that will grab the Windows version of each SQL Server I have and insert it into a table. I have the query that grabs the version, but I think there is a better way to get the information than connecting to each individual server one by one to run the query. I am not opposed to using xp_cmdshell; I am just wondering if there is a way to run one query that will grab the Windows OS version of each of my SQL Servers. Also, I do have a list of servers to use.
EDIT: I know I will have to touch each server in some way. I would just like a way to get around having to RDP to each server, open SQL Server, and query it, or having to connect to each server within SQL Server and run the query one by one.
All I have right now, code wise, is a simple INSERT statement, and I draw a blank on where to go next or even how to tackle the problem. The table below has two columns, ServerName and Win_Ver; ServerName is already populated with all the servers I have.
INSERT INTO mtTable (Win_Ver)
SELECT @@VERSION;

Given that:
there are "roughly 112 servers"
the servers being a "mixture between 2008 - 2012"
"There is table we are keeping with all of our DB server Statistics."
and "We periodically get asked to produce these statistics"
one option is to cycle through that table of servers using a cursor, and for each one, execute xp_cmdshell to call SQLCMD to run the query. You would use a table variable to capture the result set from SQLCMD as returned by xp_cmdshell. Something like:
DECLARE @ServerName sysname,
        @Command NVARCHAR(4000),
        @CommandTemplate NVARCHAR(4000);
DECLARE @Results TABLE ([ResultID] INT IDENTITY(1, 1) NOT NULL, [Result] NVARCHAR(4000));

SET @CommandTemplate = N'SQLCMD -S {{SERVER_NAME}} -E -h-1 -Q "PRINT @@VERSION;"';

DECLARE srvrs CURSOR LOCAL READ_ONLY FAST_FORWARD
FOR SELECT [ServerName]
    FROM ServerStats;

OPEN srvrs;

FETCH NEXT
FROM srvrs
INTO @ServerName;

WHILE (@@FETCH_STATUS = 0)
BEGIN
    SET @Command = REPLACE(@CommandTemplate, N'{{SERVER_NAME}}', @ServerName);

    INSERT INTO @Results ([Result])
        EXEC xp_cmdshell @Command;

    -- Get results via SELECT [Result] FROM @Results ORDER BY [ResultID];
    -- Do something with the data in @Results

    DELETE FROM @Results;

    FETCH NEXT
    FROM srvrs
    INTO @ServerName;
END;

CLOSE srvrs;
DEALLOCATE srvrs;
And it wouldn't hurt to throw in a TRY / CATCH in there :-).
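A minimal sketch of what that might look like inside the loop, reusing the @Results table variable above (the error text format is just an illustration), so one bad server doesn't stop the whole run:

BEGIN TRY
    SET @Command = REPLACE(@CommandTemplate, N'{{SERVER_NAME}}', @ServerName);

    INSERT INTO @Results ([Result])
        EXEC xp_cmdshell @Command;
END TRY
BEGIN CATCH
    -- Record the failure and move on to the next server.
    INSERT INTO @Results ([Result])
        VALUES (N'ERROR for ' + @ServerName + N': ' + ERROR_MESSAGE());
END CATCH;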
Even if it's not the most ideal of solutions, it at least doesn't require adding 112 linked servers, and it is dynamic, so it will adjust to servers being added and removed.

In SQL Server you can create a Linked Server, which lets you query another server from the one you are connected to.
On the server you wish to write the query on:
Open the Object Explorer
Go to Server Objects
Right-click Linked Servers and add a New Linked Server
Add the name of your networked server, select SQL Server, and make sure to define the security settings.
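If you'd rather script the setup described in the steps above, a rough T-SQL equivalent (a sketch only; 'REMOTESERVER' and the login mapping are placeholders) looks like this:

-- Sketch: register a linked server pointing at another SQL Server instance.
-- 'REMOTESERVER' must be the remote instance's network name.
EXEC sp_addlinkedserver
    @server = N'REMOTESERVER',
    @srvproduct = N'SQL Server';

-- Pass the current login's credentials through to the remote server.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'REMOTESERVER',
    @useself = N'TRUE';

-- Then query it, e.g. to pull the remote @@VERSION:
SELECT * FROM OPENQUERY([REMOTESERVER], 'SELECT @@VERSION AS Win_Ver');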

Related

Remote Call Compiles SPROC differently to local, causes error

I have a stored proc that returns details of SQL Agent Jobs on the local server. There is a master script that calls this proc, using OPENQUERY, against every SQL server in the ecosystem. In pseudocode, the master script looks something like this:
FOR EACH @LinkedServer in the list
    SET @SQL = 'INSERT #Results SELECT * FROM OPENQUERY(' + @LinkedServer + ',''EXEC ScriptToGetAgentJobInfo'')'
    EXEC sp_executesql @SQL
NEXT @LinkedServer
Some of the agent jobs are created from SSRS report subscriptions, so they have horrible looking names. In order to replace them with the name of the report that is the target of the subscription, I appeal to the ReportServer database on the @LinkedServer, as part of the ScriptToGetAgentJobInfo.
However, not every server contains a ReportServer database, so sometimes this appeal would fail. To get round that failure, I have the following lines of script:
DECLARE @Reports TABLE
    ( AgentJob SYSNAME
     ,Reportname NVARCHAR(128));

IF EXISTS(SELECT 1 FROM [master].[dbo].sysdatabases WHERE [name] = 'ReportServer')
BEGIN;
    INSERT @Reports(AgentJob, Reportname)
    SELECT Job.job_id, Report.[Name]
    FROM ReportServer.dbo.ReportSchedule AS ReportSched
    INNER JOIN dbo.sysjobs AS Job ON CONVERT(SYSNAME, ReportSched.ScheduleID) = Job.[name]
    INNER JOIN ReportServer.dbo.Subscriptions AS Subscription ON ReportSched.SubscriptionID = Subscription.SubscriptionID
    INNER JOIN ReportServer.dbo.[Catalog] AS Report ON Subscription.report_oid = Report.itemid;
END;
The idea is that if the ReportServer database doesn't exist, I can avoid any calls to it, which would error, but if it does, I can get data from it. I then join the @Reports table to my SQL Agent job query with a LEFT JOIN to show the name of the relevant report if there is one.
All this works fine when I run the script locally, but when it is called through the master procedure, I get an error saying Invalid object name 'ReportServer.dbo.ReportSchedule'.
I can get round this problem by making the ReportServer SELECT statement "dynamic" (although it is totally static) and calling it with another sp_executesql call, but I really hate doing this!
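For illustration, that workaround looks roughly like this (a sketch only; @ReportSql is just a name I'm using here, and the query is the same one shown above):

IF EXISTS(SELECT 1 FROM [master].[dbo].sysdatabases WHERE [name] = 'ReportServer')
BEGIN
    -- Hide the ReportServer reference inside dynamic SQL so it is only
    -- compiled if the database actually exists.
    DECLARE @ReportSql NVARCHAR(MAX) = N'
        SELECT Job.job_id, Report.[Name]
        FROM ReportServer.dbo.ReportSchedule AS ReportSched
        INNER JOIN dbo.sysjobs AS Job ON CONVERT(SYSNAME, ReportSched.ScheduleID) = Job.[name]
        INNER JOIN ReportServer.dbo.Subscriptions AS Subscription ON ReportSched.SubscriptionID = Subscription.SubscriptionID
        INNER JOIN ReportServer.dbo.[Catalog] AS Report ON Subscription.report_oid = Report.itemid;';

    INSERT @Reports (AgentJob, Reportname)
    EXEC sp_executesql @ReportSql;
END;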
So my question is this: Why does the error only occur when calling the script remotely and how can I avoid it without recourse to dynamic sql?
The master script is written and run in SQL Server 14.0, while the linked server that is causing the problem is only on SQL Server 10.50.

Error = [Microsoft][ODBC Driver 13 for SQL Server]Unable to open BCP host data-file

I have one table which contains a KEY and a file ref at the end of each record. I also have another table which has a lot of records, each with a KEY. I link both tables together on the KEY and wish to export the data, split by file ref. When I run it as a SELECT query it works fine, but when I run it to generate a file for each file ref it fails with the error in the title, preceded by the line 'SQLState = S1000, NativeError = 0'. I have access to the directory and I'm running the code on the server. Any guidance would be appreciated.
DECLARE @File_number INT
DECLARE @SQL VARCHAR(8000)

DECLARE file_num CURSOR READ_ONLY FAST_FORWARD LOCAL FOR
    SELECT DISTINCT File_number FROM map_sequence_tranid ORDER BY 1

OPEN file_num
FETCH NEXT FROM file_num INTO @File_number
WHILE @@FETCH_STATUS = 0 BEGIN
    SELECT @SQL = 'bcp "SELECT b.File_number, a.[Field1], b.[Field2] from Table1 a, Table2 b where a.[Key]=b.[key] and b.File_number=' + CAST(@File_number as varchar(10)) + ' order by a.[key]" queryout ''E:\Path\file' + CAST(@File_number AS VARCHAR(10)) + '.txt'' -c -T -t'','' -S ' + @@SERVERNAME
    EXEC master..xp_cmdshell @SQL
    FETCH NEXT FROM file_num INTO @File_number
END
CLOSE file_num
DEALLOCATE file_num
It is not your account that needs access to the share. Since you are running the BCP command through xp_cmdshell, the account that actually executes the bcp command is the account that runs the SQL Server service on the SQL Server box. When you use xp_cmdshell, you leave your session/authentication behind and hand control to a new session outside of SQL Server. That session runs under the account the SQL Server service runs as, and the command executes on the OS that SQL Server is running on. Most likely, you are not even able to log on to the OS underneath the SQL Server.
You must confirm that the SQL Server service account has access to the share.
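If you're not sure which account that is, one quick way to check from T-SQL (a sketch; this DMV is available on SQL Server 2008 R2 SP1 and later, otherwise use SQL Server Configuration Manager) is:

-- Shows which Windows account each SQL Server service runs under.
SELECT servicename, service_account, status_desc
FROM sys.dm_server_services;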
I'm not certain that this is your issue, but it may be. You can test other possibilities by:
Printing the contents of the @SQL command instead of executing it. Copy that value into a command window and try to run the command yourself. This will test that the command is valid and that things like paths are valid.
If you can, log on to the physical server (Windows?) that your SQL Server is running on. Log on as the SQL Server service account. Then try to execute the bcp command there in a command window. This is the most complete test, but usually we don't have permission to authenticate using the account that runs the SQL Server service.
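For the first test, that just means temporarily swapping the EXEC line in the cursor loop above:

-- Temporarily replace the execution with a PRINT to inspect the generated command:
-- EXEC master..xp_cmdshell @SQL
PRINT @SQL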

Get names of multiple servers SSMS is connected to

I have one instance of SSMS open and I am connected to one remote server as well as localhost. How can I get the names of all the servers that SSMS is currently connected to? (The remote and local servers show different icons in Object Explorer, which is how I tell them apart.)
Also, I would like to know whether there are any problems with connecting to multiple servers from one instance of SSMS, and how to switch between servers through a script, without clicking on a table name and doing something like "select top 1000 rows".
Okay, there are multiple issues at work here, as this is not always a simple answer. Depending on your environment and rights, you may belong to one or many permission groups that have access to one or many environments, which contain one or many servers, which in turn have one or many databases. However, if you do have permission and you have linked servers set up with data access, you can do something like the following to get a listing of what you have access to. You could run this similarly against different environments, making it into a procedure that you could call with ADO.NET or similar.
-- declare variables for dynamic SQL
DECLARE
    @SQL NVARCHAR(512)
    , @X INT

-- Create a table variable to catch linked servers
DECLARE @Servers TABLE
(
    Id INT IDENTITY
    , ServerName VARCHAR(128)
)

-- insert linked servers
INSERT INTO @Servers
SELECT name
FROM sys.servers

-- remove the temp table if it exists, as it should not be prepopulated
IF OBJECT_ID('tempdb..#Databases') IS NOT NULL
    DROP TABLE #Databases
;

-- Create temp table to catch the databases found on each server
CREATE TABLE #Databases
(
    ServerName VARCHAR(64)
    , DatabaseName VARCHAR(128)
)

SET @X = 1

-- Loop through the linked servers one at a time
WHILE @X <= (SELECT COUNT(*) FROM @Servers)
BEGIN
    DECLARE @DB VARCHAR(128);
    SELECT @DB = ServerName FROM @Servers WHERE Id = @X -- get the server name for the current iteration

    -- Set up dynamic SQL, excluding master and the other system databases
    SET @SQL = 'insert into #Databases select ''' + @DB + ''', name from ' + @DB + '.master.sys.databases
    where name not in (''master'',''tempdb'',''model'',''msdb'')'

    -- Execute the dynamic SQL to insert into the collection table
    EXEC sp_executesql @SQL

    -- increment for the next server
    SET @X = @X + 1
END
;

SELECT *
FROM #Databases
I'm not entirely sure what you are asking. If you are asking whether you can connect to multiple instances of SQL Server in a single query window, the answer is yes. I went into detail on how, and on some of the implications, here: Multiple instances, single query window
If, on the other hand, you are asking how to tell what instance you are connected to, you can use @@SERVERNAME.
SELECT @@SERVERNAME
It will return the name of the instance you are connected to.
Typically you would connect to one instance per query window and flip between the windows to affect the specific instance you are interested in.
If you want to write a command to send you to a specific instance you can set your query window to SQLCMD mode (Query menu -> SQLCMD Mode) and use the :CONNECT command.
:CONNECT InstanceName
SELECT @@SERVERNAME

Help with sp_msforeachdb-like queries

Where I'm at, we have a software package running on a mainframe system. The mainframe makes a nightly dump into SQL Server, such that each of our clients has its own database on the server. There are a few other databases in the server instance as well, plus some older client DBs with no data.
We often need to run reports or check data across all clients. I would like to be able to run queries using sp_msforeachdb or something similar, but I'm not sure how I can go about filtering unwanted dbs from the list. Any thoughts on how this could work?
We're still on SQL Server 2000, but should be moving to 2005 in a few months.
Update:
I think I did a poor job asking this question, so I'm going to clarify my goals and then post the solution I ended up using.
What I want to accomplish here is to make it easy for programmers working on queries for use in their programs to write the query using one client database, and then pretty much instantly run (test) code designed and built on one client's db on all 50 or so client dbs, with little to no modification.
With that in mind, here's my code as it currently sits in Management Studio (partially obfuscated):
use [master]

declare @sql varchar(3900)
set @sql = 'complicated sql command added here'
-----------------------------------
declare @cmd1 varchar(100)
declare @cmd2 varchar(4000)
declare @cmd3 varchar(100)

set @cmd1 = 'if ''?'' like ''commonprefix_%'' raiserror (''Starting ?'', 0, 1) with nowait'
set @cmd3 = 'if ''?'' like ''commonprefix_%'' print ''Finished ?'''
set @cmd2 =
replace('if ''?'' like ''commonprefix_%''
begin
use [?]
{0}
end', '{0}', @sql)

exec sp_msforeachdb @command1 = @cmd1, @command2 = @cmd2, @command3 = @cmd3
The nice thing about this is that all you have to do is set the @sql variable to your query text. It is very easy to turn into a stored procedure. It's dynamic SQL, but again: it's only used for development (famous last words ;) ). The downside is that you still need to escape single quotes used in the query, and much of the time you'll end up putting an extra ''?'' AS ClientDB column in the select list, but otherwise it works well enough.
Unless I get another really good idea today I want to turn this into a stored procedure and also put together a version as a table-valued function using a temp table to put all the results in one resultset (for select queries only).
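A minimal sketch of what that stored procedure version might look like, following the script above (the procedure name, prefix, and example table are placeholders):

-- Sketch only: wraps the sp_msforeachdb pattern above in a procedure.
create procedure dbo.usp_RunOnAllClientDbs
    @sql varchar(3900)
as
begin
    declare @cmd1 varchar(100), @cmd2 varchar(4000), @cmd3 varchar(100)

    set @cmd1 = 'if ''?'' like ''commonprefix_%'' raiserror (''Starting ?'', 0, 1) with nowait'
    set @cmd3 = 'if ''?'' like ''commonprefix_%'' print ''Finished ?'''
    set @cmd2 = replace('if ''?'' like ''commonprefix_%''
begin
use [?]
{0}
end', '{0}', @sql)

    exec sp_msforeachdb @command1 = @cmd1, @command2 = @cmd2, @command3 = @cmd3
end

-- Example usage (the table name is hypothetical):
-- exec dbo.usp_RunOnAllClientDbs 'select ''?'' as ClientDB, count(*) as RowCnt from dbo.SomeTable'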
Just wrap the statement you want to execute in an IF NOT IN:
EXEC sp_msforeachdb "
IF '?' NOT IN ('DBs','to','exclude') BEGIN
EXEC sp_whatever_you_want_to
END
"
Each of our database servers contains a "DBA" database that contains tables full of meta-data like this.
A "databases" table would keep a list of all databases on the server, and you could put flag columns to indicate database status (live, archive, system, etc).
Then the first thing your SCRIPT does is to go to your DBA database to get the list of all databases it should be running against.
We even have a nightly maintenance script that makes sure all databases physically on the server are also entered into our "DBA.databases" table, and alerts us if they are not. (Because adding a row to this table should be a manual process)
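As a rough sketch of that kind of meta-data table and the query a script would start with (the names and flag columns are illustrative, not our actual schema):

-- Sketch only: a simplified version of the kind of table described above.
USE DBA;

CREATE TABLE dbo.databases
(
    DatabaseName sysname NOT NULL PRIMARY KEY,
    IsLive       bit     NOT NULL DEFAULT 1,   -- live client database
    IsArchive    bit     NOT NULL DEFAULT 0,   -- old client DB with no data
    IsSystem     bit     NOT NULL DEFAULT 0    -- master, msdb, model, tempdb, etc.
);

-- The first thing a reporting script does is pull its list of target databases:
SELECT DatabaseName
FROM DBA.dbo.databases
WHERE IsLive = 1 AND IsSystem = 0 AND IsArchive = 0;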
How about taking the definition of sp_msforeachdb and tweaking it to fit your purpose? To get the definition, you can run this (hit Ctrl+T first to put the results pane into Text mode):
sp_helptext sp_msforeachdb
Obviously you would want to create your own version of this sproc rather than overwriting the original ;o)
Doing this type of thing is pretty simple in 2005 SSIS packages. Perhaps you could get an instance set up on a server somewhere.
We have multiple servers set up, so we have a table that denotes what servers will be surveyed. We then pull back, among other things, a list of all databases. This is used for backup scripts.
You could maintain this list of databases and add a few fields for your own purposes. You could have another package or step, depending on how you decide which databases to report on and if it could be done programmatically.
You can get code here for free: http://www.sqlmag.com/Articles/ArticleID/97840/97840.html?Ad=1
We based our system on this code.

SQL Server Full-Text Search: Hung processes with MSSEARCH wait type

We have a SQL Server 2005 SP2 machine running a large number of databases, all of which contain full-text catalogs. Whenever we try to drop one of these databases or rebuild a full-text index, the drop or rebuild process hangs indefinitely with a MSSEARCH wait type. The process can’t be killed, and a server reboot is required to get things running again. Based on a Microsoft forums post [1], it appears that the problem might be an improperly removed full-text catalog. Can anyone recommend a way to determine which catalog is causing the problem, without having to remove all of them?
[1] http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2681739&SiteID=1
“Yes we did have full text catalogues in the database, but since I had disabled full text search for the database, and disabled msftesql, I didn't suspect them. I got however an article from Microsoft support, showing me how I could test for catalogues not properly removed. So I discovered that there still existed an old catalogue, which I ,after and only after re-enabling full text search, were able to delete, since then my backup has worked”
Here's a suggestion. I don't have any corrupted databases but you can try this:
declare @t table (name nvarchar(128))
insert into @t select name from sys.databases --where is_fulltext_enabled
while exists(SELECT * FROM @t)
begin
    declare @name nvarchar(128)
    select @name = name from @t

    declare @SQL nvarchar(4000)
    set @SQL = 'IF EXISTS(SELECT * FROM ' + @name + '.sys.fulltext_catalogs) AND NOT EXISTS(SELECT * FROM sys.databases where is_fulltext_enabled=1 AND name=''' + @name + ''') PRINT ''' + @name + ' Could be the culprit'''

    print @SQL
    exec sp_sqlexec @SQL

    delete from @t where name = @name
end
If it doesn't work, remove the filter checking sys.databases.
Have you tried running Process Monitor when it hangs to see what the underlying error is? Using Process Monitor you should be able to tell which file or resource it is waiting for or erroring on.
I had a similar problem with invalid full text catalog locations.
The server wouldn't bring all databases online at start-up. It would process databases in dbid order and get half way through and stop. Only the older DBs were brought online and the remainder were inaccessible.
Looking at sysprocesses revealed a dozen or more processes with a waittype = 0x00CC , lastwaittype = MSSEARCH. MSSEARCH could not be stopped.
The problem was caused when we relocated the full-text catalogs but entered the wrong path for one of them when running the ALTER DATABASE ... MODIFY FILE command.
The solution was to disable MSSEARCH, reboot the server allowing all DBs to come online, find the offending database, take it offline, correct the file path using the ALTER DATABASE command, and bring the DB back online. Then start MSSEARCH and set it to automatic start-up.
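For reference, the path correction was along these lines (a sketch only; the database name, catalog name, and folder path are placeholders):

-- Sketch: correct a full-text catalog's path while the database is offline.
ALTER DATABASE [OffendingDB] SET OFFLINE WITH ROLLBACK IMMEDIATE;

-- NAME is the full-text catalog's logical name; FILENAME is the catalog folder.
ALTER DATABASE [OffendingDB]
MODIFY FILE (NAME = N'FTCatalogName', FILENAME = N'D:\FullTextCatalogs\FTCatalogName');

ALTER DATABASE [OffendingDB] SET ONLINE;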
