Does anyone know, please, if there's a way to specify "this machine" in a file path that SQL Server will like?
I have an Agent job I want to script onto all my servers. It outputs to a file on the local machine so I want to script the file name as
\\localhost\<shared-folder-name>\<file-name>
Another reason is I'm planning to use log shipping for disaster recovery and I want the jobs to work the same whichever server they're running on. The script sets the job up fine, but when I run the job I get the error "Unable to open Step output file". I think I've eliminated share and folder permissions as the cause: it works fine with C:\<folder-name>\<file-name>.
Have you tried @@SERVERNAME?
SELECT @@SERVERNAME
Or you can use the following; the second value is the one to use if you have multiple instances of SQL Server running:
SELECT SERVERPROPERTY('MachineName'), SERVERPROPERTY('InstanceName')
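For the original question, a minimal sketch (the share and file names are hypothetical placeholders) that builds a UNC path pointing back at the local machine:
-- MachineName avoids the instance suffix that @@SERVERNAME carries for named instances
DECLARE @UncPath nvarchar(260);
SET @UncPath = N'\\' + CAST(SERVERPROPERTY('MachineName') AS nvarchar(128))
             + N'\SharedFolder\JobOutput.txt';
SELECT @UncPath AS OutputFilePath;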
Put the folder in the same location on all machines.
c:\SharedSQLLogFiles.....
If you don't want them in the same location, create a Junction C:\SharedSQLLogFiles pointing at the actual location.
In this way you can simply script the file location as a local path, but still be able to access the file from a remote share.
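For example (a sketch; the target folder is a hypothetical placeholder), from a command prompt:
mklink /J C:\SharedSQLLogFiles D:\Actual\SQLLogFiles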
Cheers, Ben
When you script it out, you can utilize the @output_file_name parameter:
EXECUTE msdb.dbo.sp_add_jobstep @job_name = @JobName01, @step_name = @JobName01,
    @subsystem = 'CMDEXEC', @command = @JobCommand01, @output_file_name = @OutputFile
To set the output log name, we can do:
SET @OutputFile = @LogDirectory + '\SQLAGENT_JOB_$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt'
To get the server name
$(ESCAPE_SQUOTE(SRVR))
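Putting it together, a minimal sketch (job name, command, and log directory are hypothetical placeholders; the tokens are expanded by SQL Server Agent at run time):
DECLARE @JobName sysname = N'MyJob';
DECLARE @JobCommand nvarchar(4000) = N'ECHO running';
DECLARE @LogDirectory nvarchar(260) = N'C:\SQLLogs';
DECLARE @OutputFile nvarchar(512);
SET @OutputFile = @LogDirectory + N'\SQLAGENT_JOB_$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt';

EXEC msdb.dbo.sp_add_job @job_name = @JobName;
EXEC msdb.dbo.sp_add_jobstep @job_name = @JobName, @step_name = @JobName,
    @subsystem = 'CMDEXEC', @command = @JobCommand, @output_file_name = @OutputFile;
-- Register the job on the local server so Agent will run it
EXEC msdb.dbo.sp_add_jobserver @job_name = @JobName;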
I've been stuck on this for some time now. I have an SSIS package that's supposed to read a file and populate a database. I need to run it from a SQL Server Agent job, and the source files to read are located in a folder on another server that has been shared with this server.
The shared path to the folder looks like this: \\server\D\folder\folder
However, when I run the Agent job through a service account, it tells me: "File name property is not valid. Filename is a device or contains invalid characters".
The SQL Server Agent uses a service account to run this job. It runs just fine if the source path is located somewhere on the machine where the database instance lives; however, I can't get it to run from a shared folder. If I run it myself by right-clicking on the SSIS catalog, it runs just fine. I am aware that it is most likely a credentials issue, but all of these servers and accounts were not set up by me. Can someone explain how I should go about adding the appropriate permissions to the said service account so it can read the files successfully? Some examples/references would be greatly appreciated!
Things I've tried: going to the folder's Security tab and adding all permissions for Everyone on both the server the folder originally comes from and the server it is shared with. I can confirm Everyone has the permissions with the Windows PowerShell Get-Acl command.
Switching the owner of the job task in SQL Server Agent to my account (I don't think that's supposed to work to begin with) - this makes Agent complain about being "Unable to determine if user has server access". With the service account it does have server access; it just can't read the folder.
I saw a post where someone suggests to change the SQL Agent Job advanced step option to "execute as user" and change the user with appropriate credentials, but I don't even see that option in my MSSQL.
I stumbled upon this thread, which was never really solved it seems, but it looks like the three steps given should help me:
Assume that we need to write \\serv\share\dir1\...\dirN\targetDir\somefile.txt using SSIS through a SQL Agent job and a non-admin proxy account MyDomain\TestAccount
MyDomain\TestAccount needs read/write access to the share \\serv\share
MyDomain\TestAccount needs at least the FILE_READ_DATA permission for all folders (share, dir1, ..., dirN)
MyDomain\TestAccount needs CHANGE rights plus the FILE_DELETE_CHILD permission for the folder targetDir
However, being new to this, I have no idea how to properly check whether all three conditions are true, or whether they are even relevant to the problem.
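From what I've read, one way to check is to list the ACLs at each level of the path with icacls (the folder names here just mirror the hypothetical example above):
icacls \\serv\share
icacls \\serv\share\dir1
icacls \\serv\share\dir1\targetDir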
EDIT:
There is a project-level variable in SSIS that determines where to read from (in this case set to \\server\d\folder\folder)
This variable is passed into the Foreach File enumerator in a Foreach Loop container.
There is also a fileName variable used to check whether a file name was already loaded into the db, as I store them in a table. The check goes like this:
DECLARE @FileName VARCHAR(50)
SET @FileName = ''
IF EXISTS (SELECT 1 FROM FileLoadStatus WHERE fileName = @FileName)
BEGIN
    SELECT 1 AS FileExistsFlg
END
ELSE
BEGIN
    SELECT 0 AS FileExistsFlg
END
If the variables are at fault, I still don't know why it works when I execute it manually through the catalog myself, but SQL Server Agent is unable to execute it through the service account.
EDIT 2: The exact errors were attached as screenshots (not reproduced here).
EDIT 3: Now that I have set up a Windows scheduled task to execute the SSIS package instead of a SQL Server Agent job, it just tells me that the Foreach File enumerator is empty, basically meaning it can't find any files in the location to read, even though the files are there.
This might be a late response, but for anyone who comes here looking for an answer to this issue:
the main thing is to be sure that the SQL Server Agent service account has the authority to read from the shared folder:
1- Hold down the Windows key and press R on your keyboard to open the Run dialog.
2- Type services.msc and press Enter.
3- Search for SQL Server Agent.
4- On the Log On tab you will find which user the Agent service is using; be sure that this user has the authority to read from the shared folder,
or change the service to run as another user with the right credentials.
5- You can check the users of the shared folder by right-clicking on it and choosing Properties --> Security. From this window you can change the permissions of the users.
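Alternatively (on SQL Server 2008 R2 SP1 and later), you can check the service account from T-SQL instead of services.msc:
-- Lists each SQL Server service and the account it runs under
SELECT servicename, service_account
FROM sys.dm_server_services;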
I am working in SSMS.
I have an object that I want to edit on several servers/databases simultaneously.
I start with opening the object via Object explorer and editing/testing there.
Once I am done, I go to Registered servers, and copy+paste the object code to update it on all the locations.
Is there a faster way to do this? Right-clicking and choosing Change connection only works with one server at a time and does not allow choosing anything from the Database Engine.
Thank you!
SSMS has a SQLCMD mode.
By enabling it, changing the current connection can be part of the script:
:connect (local)
SELECT name from sys.databases
-- run some other script
:connect anotherServer
SELECT name from sys.databases
-- run some other script
Another approach is Multi-database Query:
In that case, the servers need to be pre-grouped into folders based on your criteria.
References:
https://www.mssqltips.com/sqlservertip/2855/sql-server-multi-database-query-with-registered-servers/
https://www.sqlshack.com/use-sqlcmd-commands-ssms-query-editor/
I'm trying to write a program to allow a non-privileged technically naive user to create a new SQL Server database and receive it in the form of a .mdf file. I understand that .mdf files are not really supposed to be treated like database backups, but I need to do it this way to maintain compatibility with existing commercial software that works like this.
I'm using Visual Studio 2013 and SQL Server 2014, though I would like the program to be able to work with versions of SQL Server going back at least to 2008.
What I find is that I can't copy the .mdf file created by
CREATE DATABASE XXXXX ON
(NAME = <name>, FILENAME = '<filename>')
because it ends up belonging (I think) to MSSQL$SQLEXPRESS. The reason I say 'I think' is that when I go to the file properties and accept Administrator privileges to view the owner, it tells me 'Unable to display current owner.' I can transfer ownership of the file, but only using Administrator privileges.
So it seems that I cannot copy the .MDF file without Administrator privileges, which seems fairly ridiculous given that I was able to create the file without those privileges. I've tried creating the file in a folder located under my user's App_Data folder and with full access for everyone to subfolders and files, but that didn't help.
Can anyone suggest what I can do (programmatically) to make this file available to a non-privileged user?
Many thanks for your help.
Try the following script before trying to copy the mdf file:
USE [master]
GO
EXEC master.dbo.sp_detach_db @dbname = N'DatabaseName';
GO
It will detach the database from SQL Server and you can treat its files as regular files within the system.
After copying you'd have to re-attach the database via "sp_attach_db".
Be aware that during that period database won't be visible by SQL Server.
USE [master]
GO
EXEC master.dbo.sp_attach_db @dbname = N'DatabaseName'
    , @filename1 = N'C:\Data\Datfile.mdf'
    , @filename2 = N'C:\Data\Logfile.ldf';
GO
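Note that sp_attach_db is deprecated; on current versions the documented equivalent is CREATE DATABASE ... FOR ATTACH. A sketch using the same hypothetical file paths:
USE [master]
GO
CREATE DATABASE [DatabaseName]
ON (FILENAME = N'C:\Data\Datfile.mdf'),
   (FILENAME = N'C:\Data\Logfile.ldf')
FOR ATTACH;
GO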
First, get the location of the data files into a variable, so that you can easily move them from there after you detach:
select a.filename from sys.sysfiles a inner join sys.master_files b on a.fileid=b.file_id
where b.database_id=db_id('DB_NAME')
Verify whether there are any open sessions against the database:
select spid from sysprocesses where dbid=db_id('DB_NAME')
Kill all the sessions using KILL <spid>.
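Alternatively (assuming it is acceptable to force users off the database), a single statement removes all other sessions:
-- Forces other sessions off and rolls back their open transactions
ALTER DATABASE [DB_NAME] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;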
Then Detach your database
USE [master]
GO
EXEC master.dbo.sp_detach_db 'DB_NAME';
Now you can move the database files from the source location held in the variable to the desired destination.
Now, if you want to attach the database on another server:
If you moved both the .ldf and .mdf to the same directory:
USE [master]
GO
EXEC master.dbo.sp_attach_db @dbname = 'DB_NAME', @filename1 = 'C:\Data\Datfile.mdf';
If you moved the .ldf and .mdf to separate directories,
use the script suggested by Mr. Slava Murygin here to attach the database.
Thanks
I have multiple databases on multiple servers (SQL Server 2008) with similar schemas. I want to execute a stored procedure on each of them. Right now I have to execute it one by one on every server via SQL Server Management Studio.
Is there any possibility/option in SQL Server Management Studio to execute the SP just once on all databases?
You can use a group query to run a script against more than one server. Look here.
Then use the sp_MSForEachDB mentioned by @Ram.
There are two ways I can suggest if you want to avoid doing it programmatically.
1) Use Registered Servers in SSMS. Each target database can be created as a Registered Server within a Server Group. You can then right click on the Server Group and select "New Query". This query will execute against all Registered Servers in the Group. This is explained in detail on MSSQLTips.
2) SQL Multi Script is a dedicated tool we developed at Red Gate to satisfy this use case. However, this isn't integrated into SSMS.
Using the sp_MSForEachDB stored procedure you should be able to execute against multiple databases on the same server.
EXEC sp_msforeachdb " IF '?' NOT IN ('DBs','to','exclude')
BEGIN
EXEC sp_whatever_you_want_to
END "
Looking around, I'm sure you could write a PowerShell or batch script to do this, but I don't have time to learn, build and test one.
So I'll do it in the languages I'm happiest in: SQL and batch script.
Paste the below query into SSMS and run it, substituting
Your Server List
Path to a file containing the script you want to run (i.e. replace YourSqlScript.SQL)
Path to a log file (i.e. replace YourOutputLog.TXT)
You might want to alter your script and add SELECT @@SERVERNAME at the start to log the server name to your output file.
WITH ServerList As (
SELECT 'Server1' ServerName UNION ALL
SELECT 'Server2' UNION ALL
SELECT 'Server3' UNION ALL
SELECT 'Server4' UNION ALL
SELECT 'Server5'
)
SELECT
    -- >> appends, so output from every server accumulates in one log; -o would overwrite it on each run
    'SQLCMD -S ' + ServerName + ' -E -i C:\YourSqlScript.SQL >> C:\YourOutputLog.TXT'
From ServerList
UNION ALL
SELECT 'PAUSE'
So in this example, the file C:\YourSqlScript.SQL should probably contain something like:
SELECT @@SERVERNAME
EXEC sp_msforeachdb 'USE [?]; SELECT ''?''; EXEC p_YourStoredProcedure;'
(Thanks to RAM for providing this)
(You should definitely test this script in just one database first)
Copy the output and paste it into a text file. Save the text file as MyFirstBatchFile.CMD. Double-click this file.
Check the output file (C:\YourOutputLog.TXT)
This is not going to work the first time - I just built it on the fly to show you how it can be done. If/when you get your first error, sit back, take a look, and see if you can solve it yourself.
If you need to do this regularly then you can have a think about how you want to automate it. For example there is a way to automate getting a list of servers (hint: SQLCMD -L)
If you are going to regularly administer multiple servers you should consider using PowerShell.
My query is regarding using non-hard-coded file locations to initialize the variables DefaultDataPath and DefaultLogPath. Before adopting Database Projects as our standard deployment and database management tool and migrating our existing scripts to Database Projects, we had been using a set of CREATE and INITIALIZE scripts for setting up a database. We use the following SQL query to CREATE the database with the file location:
DECLARE @data_path nvarchar(256), @mdb_file nvarchar(256), @ldf_file nvarchar(256), @cfdata sysname, @cflog sysname
SET @data_path = (SELECT SUBSTRING(filename, 1, CHARINDEX(N'master.mdf', LOWER(filename)) - 1)
FROM sys.sysaltfiles WHERE dbid = 1 AND fileid = 1);
set @mdb_file = @data_path + 'CF_DB.mdf'
set @cfdata = 'CF_DB_Data'
set @cflog = 'CF_DB_Log'
set @ldf_file = @data_path + 'CF_DB_log.ldf'
declare @sql nvarchar(500)
-- quotename(..., '''') wraps the paths in single quotes, since FILENAME expects a string literal
set @sql = 'CREATE DATABASE [CF_DB] ON (NAME = ' + quotename(@cfdata) + ', FILENAME = ' + quotename(@mdb_file, '''') + ', SIZE = 53, FILEGROWTH = 10%) LOG ON (NAME = ' + quotename(@cflog) + ', FILENAME = ' + quotename(@ldf_file, '''') + ', SIZE = 31, FILEGROWTH = 10%) COLLATE SQL_Latin1_General_CP1_CI_AS'
exec(@sql)
Here we are figuring out the location of the MDF file for the MASTER DB and using the same location to CREATE the DATABASE.
Problem: with the scripts generated (after the Deploy action), there are auto-generated SQLCMD variables, initialized either with some default (hardcoded) path or with empty strings (which fall back to the default data file path used by SQL Server 2008 or 2005).
:setvar DatabaseName "CF"
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
We need to make it work like our existing system: find the path of the MASTER DB data and log files and use that path to initialize DefaultDataPath and DefaultLogPath. We can't go with pre-deployment scripts because the database settings are applied by the Database Project generated script before the pre-deployment script is embedded in the final deploy scripts.
NEXT big thing: developers need to switch to SQLCMD mode in SQL Server Management Studio to run the scripts generated by the DB Project. Our implementation team's requirement is NOT to use SQLCMD mode to set up the DATABASE. To overcome this step, I need to modify the generated SQL file and use SQL variables instead of SQLCMD variables. Can we generate clean SQL statements while keeping the automated script generation intact? I know both of these issues are correlated, so the solution for one is going to fix the other.
Thanks for any suggestions or help on the above.
Regards
Sumeet
Not sure how best to handle your file path, though I suspect you will want to avoid the Default File Path setting and instead use a new file path that you can control through a variable.
It sounds to me like you're trying to have the developers update their local machines easily. My recommendation would be to build out some batch files that do the following:
Set the PATH to include the location for MSBuild.exe
Get the location for your master database
Pass that location in to a variable
Run the MSBuild command to set your path variables to the local master path
and publish the database/changes.
Overall, that sounds like more trouble than it's really worth. I'd recommend that you send out a SQL Script to all of the developers getting them to properly set up their Default Data/Log paths and just use the defaults.
I do recommend that you look into setting up some batch files to run the MSBuild commands. You'll have a lot easier time getting the database builds to your developers without them generating scripts and running them locally. Alternatively, you could have them change their SSMS defaults to set SQLCMD mode on for their connections. SSDT made this a little nicer because it won't run at all without SQLCMD mode turned on - eliminated a lot of the messiness from VS2008/VS2010 DBProjects.
I used something like the following to build and deploy when we were on DB Projects:
msbuild /m .\MyDB\MyDB.dbproj /t:build
msbuild /m .\MyDB\MyDB.dbproj /t:deploy /p:TargetConnectionString="Data Source=localhost;Integrated Security=True;Pooling=False;" /p:TargetDatabase="MyDB"
When SSDT/VS generates the SQL file, it actually runs some queries against the server you've told it to connect to. The settings you're getting here for example...
:setvar DatabaseName "CF"
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
These are obtained from server settings, via the target database connection you specified in your publish file/profile.
On the server that you are using to generate your scripts, open regedit.exe and search for the keys "DefaultLog" and "DefaultData" under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft - they should be in the same location. See if they match the settings your scripts are generating.
You can change these on the server/your PC (wherever you are pointing to) and they will be the locations written into your generated SQL scripts. Be cautious, naturally, around a server you do not own or one that is in use for production, as this changes a server-level setting that tells SQL Server where to place new databases. This seems to be a different setting from the one you enter in SQL Server properties -> Database Settings.
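If you'd rather read those values from T-SQL than from regedit, the widely used (though undocumented) xp_instance_regread pattern below should return the same keys; on SQL Server 2012 and later you can also simply query SERVERPROPERTY('InstanceDefaultDataPath') and SERVERPROPERTY('InstanceDefaultLogPath'):
-- Reads the instance's DefaultData/DefaultLog registry values
DECLARE @DefaultData nvarchar(512), @DefaultLog nvarchar(512);
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultData', @DefaultData OUTPUT;
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultLog', @DefaultLog OUTPUT;
SELECT @DefaultData AS DefaultData, @DefaultLog AS DefaultLog;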
Hope that helps!