How to switch from one server connection to database engine - sql-server

I am working in SSMS.
I have an object that I want to edit on several servers/databases simultaneously.
I start with opening the object via Object explorer and editing/testing there.
Once I am done, I go to Registered servers, and copy+paste the object code to update it on all the locations.
Is there a faster way to do this? Right-clicking and choosing Change Connection only works with one server at a time and does not let me pick anything from the Database Engine node.
Thank you!

SSMS has a SQLCMD mode.
With it enabled, changing the current connection can be part of the script:
:connect (local)
SELECT name from sys.databases
-- run some other script
:connect anotherServer
SELECT name from sys.databases
-- run some other script
Another approach is a multi-server query against Registered Servers.
In that case, the servers need to be pre-grouped into folders based on your criteria.
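As a sketch of how that works: right-click a server group under Registered Servers and choose New Query; the batch then runs on every server in the group and SSMS adds a "Server Name" column to the combined result set. The query below is just an example:
-- Example batch to run against a Registered Servers group.
-- SSMS prepends a "Server Name" column to the combined results.
SELECT name, create_date
FROM sys.databases
WHERE database_id > 4;  -- skip the system databases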
References:
https://www.mssqltips.com/sqlservertip/2855/sql-server-multi-database-query-with-registered-servers/
https://www.sqlshack.com/use-sqlcmd-commands-ssms-query-editor/

Related

Can't see backup folder in MS SQL Server 2019 [duplicate]

I am trying to run a classic ASP project in Visual Studio and therefore need to restore a database. I cannot find the .bak file when trying to restore it; it isn't visible under the C: drive. I have read that this has something to do with permissions. I tried granting permissions to 'Everyone' under Properties > Security on the .bak file, which did not work, and I have also tried NT Service\MSSQLSERVER.
I had the same issue and found out that it was a permissions issue.
Here are the steps to solve the problem:
Go to Control Panel > Administrative Tools > Services and double-click the SQL Server service.
Copy the Log On account name of the SQL Server service.
Go to your backup folder, right-click it, and select Properties.
Open the Security tab and click Edit; a new window appears.
Click Add.
Paste the Log On account name of the SQL Server service account that you copied earlier, then click Check Names.
Select MSSQLSERVER and click OK.
The SQL Server account now appears on the Security tab.
Select that account and, in the permissions box, tick Allow for the required permissions (I granted Full Control), then click Apply.
Now you will be able to select and restore your backup file.
You also have to grant permissions on the folder where the file is located: SSMS first needs to browse the folder, then read the file.
In your case, SQL Server tries to access the folder using the NT Service\MSSQLSERVER account, so permissions should be granted to that account; there is no need to grant permissions more broadly (e.g. to Everyone).
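To check whether the engine itself (rather than your own Windows login) can see the file, one quick test from SSMS is the long-standing, undocumented xp_fileexist extended procedure; the path here is a placeholder:
-- Undocumented but long-standing check; returns the columns
-- "File Exists", "File is a Directory" and "Parent Directory Exists".
EXEC master..xp_fileexist N'C:\Backups\MyDatabase.bak';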
Also try the following if the above doesn't help:
close and reopen SSMS
move the file from the root to some folder and grant permissions on that folder
use a T-SQL script to restore the database instead of SSMS (sketch below)
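A minimal sketch of the T-SQL route, with hypothetical database name, paths and logical file names (run RESTORE FILELISTONLY against the backup to get the real logical names):
-- Hypothetical names and paths; adjust to your environment.
RESTORE FILELISTONLY FROM DISK = N'C:\Backups\MyDatabase.bak';  -- shows the logical file names

RESTORE DATABASE MyDatabase
FROM DISK = N'C:\Backups\MyDatabase.bak'
WITH MOVE N'MyDatabase'     TO N'C:\Data\MyDatabase.mdf',
     MOVE N'MyDatabase_log' TO N'C:\Data\MyDatabase_log.ldf',
     STATS = 10;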
You can also copy the backup file into the SQL Server service account's own user folder, e.g. C:\Users\MSSQL$HCMIS\Desktop; it will then be accessible.
I'm facing a similar issue with a standalone SQL Server 2014 Enterprise Edition installed on my home system. I have tried every possibility suggested in various articles, but strangely the issue only affects my two named instances.
I can restore a database on my default instance both with a query and through the GUI. I tried changing the log-on account of the SQL Server service for the named instances to 'Network Service' and 'Local System' and giving them Full Control on the folder holding the backups.
In my case (and probably a typical/common one) I looked through the Windows permissions (and compared them with other .bak files) and saw nothing wrong.
However, this may be because the database was not restored from the Device option first. Even if you store the .bak in the right path, the restore dialog starts out on the Source > Database option, and the backup only becomes usable after you go through the steps to restore it via the Device option.
(In my case the second sample database, AdventureWorksDW2020-DAX-Docs, was not available until I went through the extra steps to restore it via the Device option first.)
SQL Server still needs to grant internal access rights to users, etc., even when all the required Windows permissions are in place.
Reference: https://learn.microsoft.com/en-us/sql/samples/adventureworks-install-configure?view=sql-server-ver16&tabs=ssms

End user initiating SQL commands to create a file from a SQL table?

Using SQL Server Management Studio 18.4 on SQL Server 2019 servers.
Is there an easier way to allow an end user with NO access to anything SQL related to fire off some SQL commands that:
1. create and update a SQL table
2. then create a file from that table (CSV in my case) that they have access to in a folder share?
Currently I do this using xp_cmdshell with bcp commands in a cloud-hosted environment, hence I am not in control of ANY permissions or access, etc. For example:
declare @bcpCommandIH varchar(200)
set @bcpCommandIH = 'bcp "SELECT * from mydb.dbo.mysqltable order by 1 desc" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, '
exec master..xp_cmdshell @bcpCommandIH
So the way I achieve this now is by letting the end users run a Crystal report which fires a SQL stored procedure; that procedure runs some code to create and update a SQL table and then creates a CSV file the end user can access. Creating and updating the table is easy; getting the file into the hands of the end user is nothing but trouble in this hosted environment.
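A rough sketch of what that stored procedure looks like, with hypothetical object names, simply wrapping the same bcp/xp_cmdshell call shown above:
-- Sketch only: names and paths are hypothetical, mirroring the bcp example above.
CREATE OR ALTER PROCEDURE dbo.usp_ExportMyTableToCsv
AS
BEGIN
    SET NOCOUNT ON;

    -- (code that creates/updates mydb.dbo.mysqltable would go here)

    DECLARE @bcpCommand varchar(400);
    SET @bcpCommand = 'bcp "SELECT * FROM mydb.dbo.mysqltable ORDER BY 1 DESC" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, ';
    EXEC master..xp_cmdshell @bcpCommand;
END;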
We always end up with permission or other folder-share issues and it's a complete waste of time. The cloud service admins tell me "this is a huge security issue and you need to enable and disable xp_cmdshell with some commands every time you want to generate this file, to be safe".
Well, this is nonsense to me. I don't want to have to touch any of this, and it needs to be AUTOMATED for the end user from start to finish.
Is there some easier way to AUTOMATE a process for an END USER to create and update a SQL table and simply get the contents of that table exported to a CSV file without all the administration trouble?
Are there other, simpler options than xp_cmdshell and bcp to achieve this?
Thanks,
MP
Since the environment allows you to run a Crystal Report, you can use the report to create a table via ODBC Export. There are 3rd-party tools that allow that to happen even if the table already exists (giving you an option to replace or append records to an existing target table).
But it's not clear why you can't get the data directly into the Crystal report and simply export to csv.
There are free/inexpensive tools that allow you to automate/schedule the exporting/emailing/printing of a Crystal Report. See list here.

CREATE DATABASE runs successfully but no DB created

I'm running the following T-SQL statement from SSMS
CREATE DATABASE SomeDB
GO
With a result
Commands completed successfully.
But no database is actually created. I've been researching and came across this post, which describes the same behavior. The solution there was to run the script under an account with rights to modify sys.databases.
However, the user I'm running the script under (and connecting to the DB as) is in the sysadmin role, which is more than enough to create a database.
Any ideas as to what's going on here?
EDIT 1
If I change the script (and this is the whole script, with a DB actually called SomeDB to test), to the following
CREATE DATABASE SomeDB
GO
USE SomeDB
I get the following in SSMS's Messages panel.
Msg 911, Level 16, State 1, Line 56
Database 'SomeDB' does not exist. Make sure that the name is entered correctly.
If I change this to
CREATE DATABASE SomeDB
GO
SELECT * from sys.databases
I see the following in the Messages panel
Commands completed successfully.
But there is no Results panel. This would imply that access to sys.databases is restricted but it's weird that there's no error message.
EDIT 2
Taking this further and trying to narrow down the issue, I've run the following via an unelevated command line;
sqlcmd -S .\SQLExpress2014 -Q "CREATE DATABASE SomeDB"
And this time the database does exist. This narrows down the issue to SSMS itself rather than SQL Server or a syntax quirk.
Solution: Run SSMS as Admin.
Despite CREATE DATABASE working fine via an unelevated command line, SSMS requires admin privileges to do the same. The silent failure is...a possible bug?
I'll do further research on this, but my working hypothesis is that when executing via the command line, the write to the %programfiles% folder happens under the SQL Server Windows service account (Network Service for older versions, NT Service\MSSQL$SQLEXPRESS for later versions; there's a strong whiff of a permissions issue here). SSMS, when connecting with a Windows account, uses the currently logged-in (unelevated) user, and without elevation there is no write access to %programfiles%.
Still though, even if this is the case (to be verified), there should still be an access error when executing CREATE DATABASE in this context.
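As a quick sanity check of where the engine will try to create the files, these server properties (available from SQL Server 2012 onwards) report the default data and log locations:
-- Default locations the engine uses for new data and log files.
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;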
Right-click the Databases folder in Object Explorer and refresh it, then check whether the database exists or not.
1. Run SSMS as admin and create the database.
2. After the command reports that it successfully created the database, try disconnecting Object Explorer and connecting again to see if it shows up.

Cannot bulk load. Operating system error code 5 (Access is denied.)

For some weird reason I'm having problems executing a bulk insert.
BULK INSERT customer_stg
FROM 'C:\Users\Michael\workspace\pydb\data\andrew.out.txt'
WITH
(
FIRSTROW=0,
FIELDTERMINATOR='\t',
ROWTERMINATOR='\n'
)
I'm confident after reading this that I've setup my user role correctly, as it states...
Members of the bulkadmin fixed server role can run the BULK INSERT statement.
I have set the Login Properties for the Windows Authentication login correctly, granting it the server-wide bulkadmin role.
And the command EXEC sp_helpsrvrolemember 'bulkadmin' tells me that the information above was successful, and the current user Michael-PC\Michael has bulkadmin permissions.
But even though I've set everything up correctly as far as I know, I'm still getting the error when executing the bulk insert directly from SQL Server Management Studio:
Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "C:\Users\Michael\workspace\pydb\data\andrew.out.txt" could not be opened. Operating system error code 5(Access is denied.).
This doesn't make sense, because apparently bulkadmin members can run the statement. Am I meant to reconfigure how bulkadmin works? (I'm so lost.) Any ideas on how to fix it?
This error appears when you are using SQL Server Authentication and SQL Server is not allowed to access the bulk load folder.
So giving SQL server access to the folder will solve the issue.
Here is how:
Go to the folder, right-click -> Properties -> Security tab -> Edit -> Add (in the new window) -> Advanced -> Find Now. In the users list in the search results, find something like SQLServerMSSQLUser$UserName$SQLExpress and click OK in all the open dialogs.
I don't think reinstalling SQL Server is going to fix this, it's just going to kill some time.
Confirm that your user account has read privileges to the folder in question.
Use a tool like Process Monitor to see what user is actually trying to access the file.
My guess is that it is not Michael-PC\Michael that is trying to access the file, but rather the SQL Server service account. If this is the case, then you have at least three options (but probably others):
a. Set the SQL Server service to run as you.
b. Grant the SQL Server service account explicit access to that folder.
c. Put the files somewhere more logical where SQL Server has access, or can be made to have access (e.g. C:\bulk\).
I suggest these things assuming that this is a contained, local workstation. There are definitely more serious security concerns around local filesystem access from SQL Server when we're talking about a production machine; of course, this can still be largely mitigated by using option c. above and only giving the service account access to the folders you want it to be able to touch.
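To confirm which account the SQL Server service (and Agent) actually runs as without leaving SSMS, one option is this DMV, available since SQL Server 2008 R2 SP1:
-- Shows the Windows account each SQL Server-related service runs under.
SELECT servicename, service_account, status_desc
FROM sys.dm_server_services;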
I had the same problem with SSIS 2012, and the solution was to use Windows Authentication; I had been using SQL authentication with the sa user.
Go to Start > Run > services.msc, find SQL Server (MSSQLSERVER) and stop the service.
Right-click SQL Server (MSSQLSERVER) > Properties > Log On tab > Local System Account > OK.
Restart the service and SQL Server Management Studio.
Try giving the folder(s) containing the CSV and Format File read permissions for ‘MSSQLSERVER’ user (or whatever user the SQL Server service is set to Log On As in Windows Services)
This is what worked for me:
Log on to SSIS with Windows authentication.
1. Open Services, find the MSSQL NT Service account name and copy it.
2. Open the folder SQL Server should read from, go to Security > Group or user names > Add, and paste the copied account.
You will probably get a "Multiple names found" prompt; just select the MSSQL user.
Your BULK INSERT query should run fine now.
If the problem persists, try adding the SQL Server Agent account to the folder permissions in the same way.
Make sure you restart the SQL Server service in Services after you are done.
The way I resolved this problem is quite simple:
Open SQL Server Management Studio.
Right-click the database you want to back up.
Select Properties.
Select Permissions.
Select your database role (local or cloud).
At the bottom you will see the explicit permissions table.
Find the "Backup database" permission and tick Grant.
Your problem is resolved.
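The T-SQL equivalent of that GUI grant would be something along these lines (database and principal names are assumptions):
-- Grant the database-level BACKUP DATABASE permission to a role or user.
USE MyDatabase;
GRANT BACKUP DATABASE TO [MyDatabaseRole];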
Sometimes this can be a bogus error message. Try opening the file with the same account that is running the process. I had the same issue in my environment, and when I opened the file (with the same credentials the process runs under) it said the file must be associated with a known program; after I did that, I was able to open it and run the process without any errors.
Make sure the file you're using ('C:\Users\Michael\workspace\pydb\data\andrew.out.txt') is on the SQL Server machine and not on the client machine running SSMS.
1) Open SQL Server.
2) In Task Manager, check which account is running SQL Server; it is probably not Michael-PC\Michael, as Jan wrote.
The account that runs SQL Server needs access to the shared folder.
I ran into a similar issue: the bulk insert works when I execute it in SSMS, but it fails with "Operating system error code 5" when the task is converted into a SQL Server Agent job.
After browsing lots of previously posted solutions, what solved my problem was granting the NT SERVICE\SQLSERVERAGENT account Full Control on the source folder.
Hope this brings some light to people still struggling with the error message.
In our case it ended up being a Kerberos issue. I followed the steps in this article to resolve the issue: https://techcommunity.microsoft.com/t5/SQL-Server-Support/Bulk-Insert-and-Kerberos/ba-p/317304.
It came down to configuring delegation on the machine account of the SQL Server where the BULK INSERT statement is running. The machine account needs to be able to delegate via the "cifs" service to the file server where the files are located. If you are using constrained delegation, make sure to specify "Use any authentication protocol".
If DFS is involved you can execute the following Powershell command to get the name of the file server:
Get-DfsnFolderTarget -Path "\\dfsnamespace\share"

Update SQL Server 2005 view with new database name?

I have approximately 100 SQL views that are a variation of this:
select * from RTC.dbo.MyTable
...now I find I need to change the RTC database name to something else. Rather than edit one view at a time, is there a way to script out all their drop/create statements to a text file so that I can do a global replacement?
In SSMS, right-click the database, go to Tasks and select 'Generate Scripts...'. Choose 'Views', select the views you want exported, and export.
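To double-check which views actually reference the old name (before or after the replacement), a quick catalog query, assuming the three-part name appears literally in the view definitions:
-- Views whose definition still contains the old database name.
SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
       OBJECT_NAME(object_id)        AS view_name
FROM sys.sql_modules
WHERE OBJECTPROPERTY(object_id, 'IsView') = 1
  AND definition LIKE '%RTC.dbo.%';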
I'd use PowerShell. If you're not using SQL 2008 Client Tools, install them. Then get the PowerShell client, add the registered snapins (plenty of information out there on how to do that), and then use the directory structure to get to the folder representing your Views.
Then script them using something like:
Get-ChildItem | % {$_.Script()}
Use ScriptOptions to tell it to use an Alter script.
And replace "RTC." with the new database name... and run them using sqlcmd.
PowerShell actually becomes a really nice deployment option too.