I have multiple databases on multiple servers (SQL Server 2008) with similar schemas. I want to execute a stored procedure on each of them. Right now I have to execute it one by one on every server via SQL Server Management Studio.
Is there any possibility/option in SQL Server Management Studio to execute the stored procedure just once against all databases?
You can use a group query to run a script against more than one server. Look here
Then use sp_MSForEachDB as mentioned by @Ram
There are two ways I can suggest if you want to avoid doing it programmatically.
1) Use Registered Servers in SSMS. Each target database can be created as a Registered Server within a Server Group. You can then right click on the Server Group and select "New Query". This query will execute against all Registered Servers in the Group. This is explained in detail on MSSQLTips.
2) SQL Multi Script is a dedicated tool we developed at Red Gate to satisfy this use case. However, this isn't integrated into SSMS.
Using the sp_MSForEachDB stored procedure you should be able to execute the procedure on multiple databases of the same server.
EXEC sp_msforeachdb " IF '?' NOT IN ('DBs','to','exclude')
BEGIN
EXEC sp_whatever_you_want_to
END "
Looking around I'm sure you could write a PowerShell or batch script to do this, but I do not have time to learn, build and test one.
So I'll do it in the language I'm happiest in: SQL and batch script
Paste the below query into SSMS and run it, substituting
Your Server List
Path to a file containing the script you want to run (i.e. replace YourSQLScript.SQL)
Path to a log file (i.e. replace YourOutputLog.TXT)
You might want to alter your script and add SELECT @@SERVERNAME to the start to log the server name to your output file
WITH ServerList As (
SELECT 'Server1' ServerName UNION ALL
SELECT 'Server2' UNION ALL
SELECT 'Server3' UNION ALL
SELECT 'Server4' UNION ALL
SELECT 'Server5'
)
SELECT
'SQLCMD -S ' + ServerName + ' -E ' + ' -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT'
From ServerList
UNION ALL
SELECT 'PAUSE'
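The result of running that query is one SQLCMD line per server plus a PAUSE, roughly:
SQLCMD -S Server1 -E  -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
SQLCMD -S Server2 -E  -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
SQLCMD -S Server3 -E  -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
SQLCMD -S Server4 -E  -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
SQLCMD -S Server5 -E  -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
PAUSE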
So in this example, the file C:\YourSqlScript.SQL should probably contain something like:
SELECT @@SERVERNAME
EXEC sp_msforeachdb 'USE [?]; SELECT ''?''; EXEC p_YourStoredProcedure;'
(Thanks to @Ram for providing this)
(You should definitely test this script in just one database first)
Copy the output and paste it into a text file. Save the text file as MyFirstBatchFile.CMD. Double-click this file
Check the output file (C:\YourOutputLog.TXT)
This is not going to work the first time - I just built it on the fly to show you how it can be done. If/when you get your first error, sit back, take a look and see if you can solve it yourself.
If you need to do this regularly then you can have a think about how you want to automate it. For example there is a way to automate getting a list of servers (hint: SQLCMD -L)
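For example, here is a rough (untested) batch sketch that feeds the output of SQLCMD -L back into the same command. Note that SQLCMD -L only lists servers it can discover on the network, so check the list before trusting it:
@echo off
REM Skip the "Servers:" header line; each remaining non-blank line is a server name
REM >> appends, so output from every server ends up in one log file
for /f "skip=1 tokens=1" %%S in ('SQLCMD -L') do (
    if not "%%S"=="" SQLCMD -S %%S -E -i C:\YourSqlScript.SQL >> C:\YourOutputLog.TXT
)
PAUSE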
If you are going to regularly administer multiple servers you should consider using Powershell.
Using SQL Server Management Studio ver 18.4 against SQL Server 2019 servers.
Is there an easier way to allow an end user with NO access to anything SQL related to fire off some SQL commands that:
1.) create and update a SQL table
2.) then create a file from that table (csv in my case) that they have access to in a folder share?
Currently I do this using xp_cmdshell with bcp commands in a cloud-hosted environment, hence I am not in control of ANY permissions or access, etc. For example:
declare @bcpCommandIH varchar(200)
set @bcpCommandIH = 'bcp "SELECT * from mydb.dbo.mysqltable order by 1 desc" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, '
exec master..xp_cmdshell @bcpCommandIH
So how I achieve this now is by allowing the end users to run a Crystal Report which fires a SQL STORED PROCEDURE that runs some code to create and update a SQL table and then creates a csv file the end user can access. Creating and updating the table is easy. Getting the table into the hands of the end user is nothing but trouble in this hosted environment.
We always end up with permission or other folder-share issues and it's a complete waste of time. The cloud service Admins tell me "this is a huge security issue and you need to start and stop xp_cmdshell with some commands every time you want to generate this file to be safe".
Well, this is nonsense to me. I don't want to have to touch any of this, and it needs to be AUTOMATED for the end user start to finish.
Is there some easier way to AUTOMATE a process for an END USER to create and update a SQL table and simply get the contents of that table exported to a CSV file without all the administration trouble?
Are there other, simpler options than xp_cmdshell and bcp to achieve this?
Thanks,
MP
Since the environment allows you to run a Crystal Report, you can use the report to create a table via ODBC Export. There are 3rd-party tools that allow that to happen even if the table already exists (giving you an option to replace or append records to an existing target table).
But it's not clear why you can't get the data directly into the Crystal report and simply export to csv.
There are free/inexpensive tools that allow you to automate/schedule the exporting/emailing/printing of a Crystal Report. See list here.
I have a SQL Server database set up that I manage using SQL Server Management Studio 17.
In that database, I have 27 tables that I maintain by running pretty simple OPENQUERY scripts every morning, something to the effect of:
DROP TABLE IF EXISTS [databasename].[dbo].[table27]
SELECT * INTO [databasename].[dbo].[table27] FROM OPENQUERY(OracleInstance, '
SELECT
table27.*
FROM
table27
INNER JOIN table26 ON table27.criteria = table26.criteria
WHERE
< filter >
< filter >
');
And this works great! But it is cumbersome to sign into SSMS every morning, right-click my database, hit "New Query", copy in 27 individual SQL scripts, and run them. I am looking for a way to automate that. My directory that holds these scripts looks like this:
I don't know if this is achievable in SSMS or in like a batch script. I would imagine for the latter, some pseudocode looking like:
connect to sql server instance
given instance:
for each sql_script in directory:
sql_script.execute
I have tried creating a script in SSMS, by following:
Tasks -> Script Database ->
But there is no option to execute a .sql file on the tables in question.
I have tried looking at the following resources on using T-SQL to schedule nightly jobs, but have not had any luck conceiving of how to do so:
https://learn.microsoft.com/en-us/sql/ssms/agent/schedule-a-job?view=sql-server-2017
Scheduled run of stored procedure on SQL server
The expected result would be the ability to automatically run the 27 sql queries in the directory above to update the tables in SQL Server, once a day, preferably at 6:00 AM EST. My primary issue is that I cannot access anything but SQL Server Management Studio; I can't access the configuration manager to use things like SQL Server Agent. So if I am scheduling a task, I need to do so through SSMS.
You actually can't access the SQL Server Agent via Object Explorer?
This is located below "Integration Services Catalog"
See highlighted below:
You describe not being able to access that in the question for some reason. If you can't access that then something is wrong with SQL Server or perhaps you don't have admin rights to do things like schedule jobs (a guess there).
In SSMS you would want to use the Execute T-SQL Statement Task and write your delete statement in the SQL Statement field in the General tab.
However, I would look at sqlcmd. Simply make a batch script and schedule it in Task Scheduler (if you're using windows). Or you could use
for %%G in (*.sql) do sqlcmd /S servername /d databaseName -E -i"%%G"
pause
From this post. Run all SQL files in a directory
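If you save that loop as a batch file (say RunAllScripts.cmd - the name and path below are just placeholders), scheduling it daily at 6 AM from Task Scheduler could look roughly like this:
schtasks /create /tn "Run nightly SQL scripts" /tr "C:\Scripts\RunAllScripts.cmd" /sc daily /st 06:00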
So basically you have to create a PowerShell script that calls and executes the SQL scripts.
After that you can add your PowerShell script to the Task Scheduler.
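A rough sketch of such a script, assuming the SqlServer PowerShell module is installed and that the 27 scripts live in a folder like C:\SqlScripts (the folder and server name below are placeholders):
# Run every .sql file in the folder against the target instance, in name order
Import-Module SqlServer
Get-ChildItem -Path 'C:\SqlScripts' -Filter '*.sql' | Sort-Object Name | ForEach-Object {
    Invoke-Sqlcmd -ServerInstance 'YourServer' -Database 'databasename' -InputFile $_.FullName
}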
I suggest you add these scripts as jobs for the SQL Server Agent.
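A minimal T-SQL sketch of what one such job could look like (the job name, database, and command are placeholders, and this needs rights on msdb, which the question suggests may not be available):
USE msdb;
GO
-- Create the job, one T-SQL step, a daily 6 AM schedule, and attach it to this server
EXEC dbo.sp_add_job @job_name = N'Refresh table27';
EXEC dbo.sp_add_jobstep @job_name = N'Refresh table27', @step_name = N'Run script',
    @subsystem = N'TSQL', @database_name = N'databasename',
    @command = N'DROP TABLE IF EXISTS dbo.table27; /* rest of the OPENQUERY script */';
EXEC dbo.sp_add_jobschedule @job_name = N'Refresh table27', @name = N'Daily 6 AM',
    @freq_type = 4, @freq_interval = 1, @active_start_time = 060000;
EXEC dbo.sp_add_jobserver @job_name = N'Refresh table27', @server_name = N'(LOCAL)';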
Example: I write a stored procedure, let's say dbo.GetAllColumns, that selects all of the distinct column names from INFORMATION_SCHEMA.COLUMNS. I install it in database DevOps.
Then I run DevOps.dbo.GetAllColumns from another database, TestProc. Will the output be all of the columns in TestProc or all of the columns in DevOps?
In the case in your question, it will return the results from the DevOps context.
There is a way that you can achieve what you want, but I don't recommend it. Best to just install the proc in all databases that need it.
The way that is similar to your desired behaviour is to create it in master, give it an sp_ prefix and (⚠️ undocumented/unsupported) mark it as a system object.
USE master
GO
CREATE PROC dbo.sp_GetAllColumns
AS
SELECT *
FROM INFORMATION_SCHEMA.COLUMNS
GO
EXEC sys.sp_MS_marksystemobject
'dbo.sp_GetAllColumns'
Now when called from any other database (with EXEC dbo.sp_GetAllColumns) it will run in that context.
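For example, using the databases from the question:
USE TestProc;
GO
-- Because sp_GetAllColumns is a system-marked sp_ proc in master, this reads
-- INFORMATION_SCHEMA.COLUMNS in TestProc, not in master
EXEC dbo.sp_GetAllColumns;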
I'm having an issue with a SQL Server stored procedure being executed as a scheduled task through Task Scheduler. I have a batch file containing the EXECUTE statement which is called by Task Scheduler. The platform is SQL Server 2008 R2 on Windows Server 2008 R2.
The batch file code
#echo off
SQLCmd -S lccc-cpc-sql -E -d NTSR -Q "Execute update_vw_NTSR_Base_AllRecords_Labels_new_proc"
This SP does the following:
Drops a table
Recreates it with updated data using a SELECT INTO statement
Problem: It's running the DROP statement, but failing on the SELECT INTO. Here's what's weird though:
If I execute the sp through SSMS (right click the sp, choose Execute) OR, via a query editor, run the code to drop the table and the SELECT INTO statement, it finishes correctly. It's a very large SELECT INTO statement - hundreds of columns and about 100 joins. The purpose is to join a lot of lookup tables to values so I have one place for my users to go for labeled data, plus some variables computed for user friendliness. It's messy, but it's what I have to work with.
Query timeout is set to 0 (no limit). This only started happening recently, as I added more columns and variables, but it seems like it would fail when called through any method, not just through the batch file. Any thoughts on how to make this work as-is (i.e. without breaking this up into multiple SELECT INTO statements)?
Thanks.
Does anyone know, please, if there's a way to specify "this machine" in a file path that SQL Server will like?
I have an Agent job I want to script onto all my servers. It outputs to a file on the local machine so I want to script the file name as
\\localhost\<shared-folder-name>\<file-name>
Another reason is I'm planning to use log shipping for disaster recovery and I want the jobs to work the same whichever server they're running on. The script sets the job up fine but when I run the job I get an error "Unable to open Step output file". I think I've eliminated share and folder permissions: it works fine with C:\<folder-name>\<file-name>.
Have you tried @@SERVERNAME?
SELECT @@SERVERNAME
Or you can use this; the second value is the one to use if you have multiple instances of SQL Server running:
SELECT SERVERPROPERTY('MachineName'), SERVERPROPERTY('InstanceName')
Put the folder in the same location on all machines.
c:\SharedSQLLogFiles.....
If you don't want them in the same location, create a Junction C:\SharedSQLLogFiles pointing at the actual location.
In this way you can simply script the file location as a local path, but still be able to access the file from a remote share.
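For example, from an elevated command prompt (the target path here is just a placeholder):
mklink /J C:\SharedSQLLogFiles D:\SomeOtherFolder\SQLLogFiles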
Cheers, Ben
When you script it out, you can utilize the @output_file_name parameter:
EXECUTE msdb.dbo.sp_add_jobstep @job_name = @JobName01, @step_name = @JobName01,
    @subsystem = 'CMDEXEC', @command = @JobCommand01, @output_file_name = @OutputFile
To set the output log name, we can do
SET @OutputFile = @LogDirectory + '\SQLAGENT_JOB_$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt'
To get the server name
$(ESCAPE_SQUOTE(SRVR))
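So, putting it together, an output file name that includes the server name could look something like this (the same pattern as the line above, with the SRVR token added):
-- Note: for a named instance, SRVR contains a backslash, so the name may need extra handling
SET @OutputFile = @LogDirectory + '\SQLAGENT_$(ESCAPE_SQUOTE(SRVR))_JOB_$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt'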