I want to deploy database changes to a SQL Server instance using an Azure DevOps release pipeline. There is a task called SQL Server Database Deploy, and I see the option to deploy using SQL scripts; however, everywhere I read, this is explained as a way to deploy an entire database.
My question is: if I have a directory of SQL scripts in my repository under /SQLScripts/*.sql, could I use the scripts contained in this folder to deploy changes to stored procedures without affecting the remainder of the database?
I.e. using something like the below script to update stored procedures...
USE [TESTDB]
GO
/****** Object: StoredProcedure [dbo].[TestSP] Script Date: 01/01/2023 12:00:00 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[TestSP]
AS
BEGIN
    -- Some new comment
    SELECT 1; -- placeholder; the real procedure body is omitted here
END
GO
I haven't attempted to deploy anything using this task because I don't want to risk altering the database in unintended ways. I would expect the task to run the scripts in order and to update only the objects whose scripts I pass through the pipeline.
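One idea I'm considering, to make each script safe to rerun, is switching the headers to CREATE OR ALTER (available from SQL Server 2016 SP1 onward). A minimal sketch, with a made-up procedure body:
USE [TESTDB]
GO
-- CREATE OR ALTER creates the procedure if it is missing, otherwise alters it
-- in place, so rerunning the same script is harmless.
CREATE OR ALTER PROCEDURE [dbo].[TestSP]
AS
BEGIN
    SET NOCOUNT ON;
    SELECT GETDATE() AS CurrentTime; -- placeholder body for illustration only
END
GO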
I have a linked server in SQL Server Management Studio. I can query it and get exactly the data I need, in the format I need, headers and all.
I need a way to automate this query on a schedule and export the results as a CSV automatically.
Currently my process is to open the query in SSMS, run it (which, mind you, takes 15 minutes), and then right-click in the results area and choose "Save Results As".
I would like to either automate it within SSMS or be able to script it. What's confusing to me is how to write a script against a linked server.
I noticed that when you right-click on the linked server there are a couple of options to script it, for example "CREATE To", "DROP To", and "DROP And CREATE To". Each of the scripts seems to be formatted in a similar way, like the following:
USE [master]
GO
/****** Object: LinkedServer [QREMOTE] Script Date: 2/24/2022 10:59:26 AM ******/
EXEC master.dbo.sp_dropserver @server=N'QREMOTE', @droplogins='droplogins'
GO
I'm not familiar with this scripting and I'm not sure how to use it. Any assistance would be appreciated!
Some background: this is a QuickBooks database connected through the QODBC driver, and I'm using SSMS to query specific fields from a table that need to be exported in a specific order so they can be imported into another system.
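For what it's worth, what I'm picturing is a plain T-SQL script I could save and schedule, something like this (untested, and the table and column names below are placeholders for my real QuickBooks query):
-- Query the linked server QREMOTE via OPENQUERY;
-- the query inside the quotes runs on the remote side.
SELECT *
FROM OPENQUERY(QREMOTE, '
    SELECT CustomerName, InvoiceDate, Amount  -- placeholder columns
    FROM Invoice                              -- placeholder table
    ORDER BY InvoiceDate
');
If that is the right shape, I assume something like sqlcmd with a comma separator (-s",") and an output file (-o) could then write the results to a .csv on a schedule.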
1. "WebApi": "Data Source=.;Initial Catalog=TaskDB; Integrated Security=true"
2. "WebApi": "Server=(localdb)\\mssqllocaldb;AttachDBFilename=%CONTENTROOTPATH%\\App_Data\\TaskDB.mdf;Trusted_Connection=true;MultipleActiveResultSets=true"
I am trying to move my database from the default Microsoft SQL Server data folder to the project's App_Data folder, but it does not work for some reason. I do not know why; maybe my connection string is wrong. Number 1 works fine, but the database sits in the default SQL Server folder; with number 2 something goes wrong.
The database files are held exclusively by SQL Server; you cannot move a database while it is online, and taking a database offline requires that no one be connected to it.
First take the database offline, then move the files. You can take the database offline using either SSMS or a query. The first statement below kills all active sessions, the next sets the database back to multi-user, and the database must be offline before anyone can reconnect, so all of these statements must be executed together:
ALTER DATABASE DatabaseName SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
ALTER DATABASE DatabaseName SET MULTI_USER
GO
USE master
GO
ALTER DATABASE DatabaseName SET OFFLINE
Be aware that after you move a file to another location, the database cannot be brought back ONLINE if you did not set the new filename before taking the database offline.
Run the following statement for every file that you want to move:
ALTER DATABASE TEST
MODIFY FILE (NAME = 'LOGICAL_NAME', FILENAME = 'New_Directory\Filename.mdf')
After you have moved the files, bring the database back online with this statement:
ALTER DATABASE DATABASE_NAME SET ONLINE
As you can see, this is not the kind of action that can be done without a plan, especially from application code: when you attempt it there, every session, including the application's own connection, is killed, and the application cannot continue the process.
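Putting the whole sequence together, here is a sketch for a hypothetical database named TestDB with one data file and one log file; the OS-level file move happens between the OFFLINE and ONLINE steps:
USE master
GO
-- Point SQL Server at the new file locations first (one MODIFY FILE per file).
ALTER DATABASE TestDB MODIFY FILE (NAME = 'TestDB',     FILENAME = 'C:\App_Data\TestDB.mdf');
ALTER DATABASE TestDB MODIFY FILE (NAME = 'TestDB_log', FILENAME = 'C:\App_Data\TestDB_log.ldf');
GO
-- Kick everyone off and take the database offline.
ALTER DATABASE TestDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
ALTER DATABASE TestDB SET OFFLINE;
GO
-- Now move the .mdf and .ldf files to the new location in the file system,
-- then bring the database back online and allow connections again.
ALTER DATABASE TestDB SET ONLINE;
ALTER DATABASE TestDB SET MULTI_USER;
GO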
I have two views in a database project which OPENQUERY an Oracle LinkedServer. When I publish to production the Oracle Linked server needs to be named "OracleBI". When I publish to test the Oracle linked server needs to be named "OracleTestBI". How do I accomplish this?
I have tried using SQLCMD variables and suppressing T-SQL warning SQL71501. The errors would not suppress.
I have tried creating a skeleton view and then altering the view with a post-deployment script, but the ALTER VIEW wasn't allowed: 'incorrect syntax near ALTER.' in the batch .....
I tried creating a view with a SELECT statement on a table function, creating a skeleton table function, and then altering the function in a post-deployment script, but the ALTER statement wasn't allowed: 'incorrect syntax near ALTER.' in the batch .....
I tried creating an additional database project for the linked server with both the test and prod linked server names, added it as a reference, and then used a SQLCMD variable to switch between linked server names: "...View: [compass].vwBIInvForecastBegVolume has an unresolved reference to object [$(OracleServer)]"
My post-deployment script calls other scripts; when I say I added an alter script to the post-deployment script, what I really did was add a reference to that script in it. My post-deployment script looks like this:
PRINT 'Create Environment Users'
------------------------------------------------------------
IF '$(TargetEnv)' = 'PROD'
BEGIN
    :r .\PostDeployment\CreateEnvironmentUsers.Prod.sql
END
ELSE IF '$(TargetEnv)' = 'TEST'
BEGIN
    :r .\PostDeployment\CreateEnvironmentUsers.Test.sql
END
ELSE
BEGIN
    :r .\PostDeployment\CreateEnvironmentUsers.Local.sql
END
Yes. Database projects don't support ALTER statements; you have to check in the script with a CREATE statement, and during deployment the DACPAC generates the ALTER statements at run time. For the post-deployment script, add a script that checks whether the view exists, then drops and recreates it.
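A rough sketch of such a post-deployment include, reusing the $(OracleServer) SQLCMD variable idea from the question (the SELECT inside OPENQUERY is a placeholder, not the real query):
IF OBJECT_ID(N'[compass].[vwBIInvForecastBegVolume]', N'V') IS NOT NULL
    DROP VIEW [compass].[vwBIInvForecastBegVolume];
GO
-- CREATE VIEW must be the first statement in its batch, hence the GO above.
CREATE VIEW [compass].[vwBIInvForecastBegVolume]
AS
SELECT *
FROM OPENQUERY([$(OracleServer)], 'SELECT 1 AS placeholder FROM dual'); -- placeholder remote query
GO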
I have a SQL Server database set up that I manage using SQL Server Management Studio 17.
In that database, I have 27 tables that I maintain by running pretty simple OPENQUERY scripts every morning, something to the effect of:
DROP TABLE IF EXISTS [databasename].[dbo].[table27]
SELECT * INTO [databasename].[dbo].[table27] FROM OPENQUERY(OracleInstance, '
SELECT
table27.*
FROM
table27
INNER JOIN table26 ON table27.criteria = table26.criteria
WHERE
< filter >
< filter >
');
And this works great! But it is cumbersome to sign into SSMS every morning, right-click on my database, hit "New Query", and copy in and run 27 individual SQL scripts. I am looking for a way to automate that. All of these scripts live in a single directory.
I don't know if this is achievable in SSMS or in something like a batch script. I would imagine, for the latter, some pseudocode looking like:
connect to sql server instance
given instance:
    for each sql_script in directory:
        sql_script.execute
I have tried creating a script in SSMS, by following:
Tasks -> Script Database ->
But there is no option to execute a .sql file on the tables in question.
I have tried looking at the following resources on using T-SQL to schedule nightly jobs, but have not had any luck conceiving of how to do so:
https://learn.microsoft.com/en-us/sql/ssms/agent/schedule-a-job?view=sql-server-2017
Scheduled run of stored procedure on SQL server
The expected result would be the ability to run the 27 SQL queries in that directory automatically, updating the tables in SQL Server once a day, preferably at 6:00 AM EST. My primary issue is that I cannot access anything but SQL Server Management Studio; I can't access the Configuration Manager to use things like SQL Server Agent. So if I am scheduling a task, I need to do so through SSMS.
You actually can't access the SQL Server Agent via Object Explorer? It is located below "Integration Services Catalogs" in the Object Explorer tree.
You describe not being able to access that in the question for some reason. If you can't access it, then something is wrong with SQL Server, or perhaps you don't have admin rights to do things like schedule jobs (a guess there).
In SSMS you would want to use the Execute T-SQL Statement Task and write your statement in the SQL Statement field on the General tab.
However, I would look at sqlcmd. Simply make a batch script and schedule it in Task Scheduler (if you're using Windows). Or you could use
for %%G in (*.sql) do sqlcmd /S servername /d databaseName -E -i"%%G"
pause
From this post: Run all SQL files in a directory
So basically you have to create a PowerShell script that calls and executes the SQL scripts.
After that you can add your PowerShell script to the Task Scheduler.
I suggest you add these scripts as jobs for the SQL Server Agent.
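If you can reach msdb from a query window, you do not need Configuration Manager at all; the job can be created entirely in T-SQL. A sketch, assuming you have sufficient Agent permissions, with placeholder names and a single step (repeat sp_add_jobstep, or concatenate the scripts, for all 27):
USE msdb;
GO
-- Create a job with one T-SQL step (paste the script body into @command).
EXEC dbo.sp_add_job @job_name = N'Refresh OPENQUERY tables';
EXEC dbo.sp_add_jobstep
    @job_name      = N'Refresh OPENQUERY tables',
    @step_name     = N'Run table27 refresh',
    @subsystem     = N'TSQL',
    @database_name = N'databasename',
    @command       = N'DROP TABLE IF EXISTS dbo.table27; /* ...rest of script... */';
-- Run every day at 6:00 AM server time.
EXEC dbo.sp_add_schedule
    @schedule_name     = N'Daily 6AM',
    @freq_type         = 4,      -- daily
    @freq_interval     = 1,
    @active_start_time = 060000; -- HHMMSS
EXEC dbo.sp_attach_schedule
    @job_name      = N'Refresh OPENQUERY tables',
    @schedule_name = N'Daily 6AM';
-- Target the local server so the Agent actually picks the job up.
EXEC dbo.sp_add_jobserver @job_name = N'Refresh OPENQUERY tables';
GO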
I have a small (few hundred MB) SQL Server database running on RDS. I've spent several hours trying to get a copy of it onto my local SQL Server 2014 instance. All of the following fail. Any ideas what might work?
Tasks -> Back Up fails because it doesn't give my admin account permission to back up to a local drive.
Copy Database fails during package creation with "While trying to find a folder on SQL an OLE DB error was encountered with error code 0x80040E4D".
From SSMS, while connected to the RDS server, running BACKUP DATABASE. This fails with the message "BACKUP DATABASE permission denied in database 'MyDB'", even after running EXEC sp_addrolemember 'db_backupoperator' for the connected user.
Generate Scripts produces a 700 MB .sql file. Running that with sqlcmd -i fails at some point after producing plausible .mdf and .ldf files that can't be mounted on the local server (probably because the sqlcmd run failed to complete and unlock them).
AWS has finally provided a reasonably easy means of doing this: It requires an S3 bucket.
After creating a bucket called rds-bak I ran the following stored procedure in the RDS instance:
exec msdb.dbo.rds_backup_database
    @source_db_name='MyDatabase',
    @s3_arn_to_backup_to='arn:aws:s3:::rds-bak/MyDatabase.bak',
    @overwrite_S3_backup_file=1;
The following stored procedure returns the status of the backup request:
exec msdb.dbo.rds_task_status @db_name='MyDatabase'
Once it finished I downloaded the .bak file from S3 and imported it into a local SQL Server instance using the SSMS Restore Database... wizard!
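If you prefer scripting the local restore over the wizard, a plain RESTORE works too; the paths below are placeholders, and the logical file names (shown by FILELISTONLY) will vary:
-- Inspect the logical file names inside the backup first.
RESTORE FILELISTONLY FROM DISK = N'C:\Backups\MyDatabase.bak';

-- Then restore, relocating the files to a local data directory.
RESTORE DATABASE MyDatabase
FROM DISK = N'C:\Backups\MyDatabase.bak'
WITH MOVE N'MyDatabase'     TO N'C:\Data\MyDatabase.mdf',
     MOVE N'MyDatabase_log' TO N'C:\Data\MyDatabase_log.ldf',
     REPLACE;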
The SSIS Import Export Wizard can generate a package to duplicate a whole set of tables. (It's not the sort of Copy Database function that relies on files - it makes a package with data flow components for each table.)
It's somewhat brittle but can be made to work :-)
The SSMS Generate Scripts feature can often fail with any large data set, as the script for all the data is just too large/verbose. This method never scripts out the data.
Check this out: https://github.com/andrebonna/RDSDump
It is a C#/.NET console application that searches for the latest origin database snapshot, restores it on a temporary RDS instance, generates a BACPAC file, uploads it to S3, and deletes the temporary RDS instance.
You can transform your RDS snapshot into a BACPAC file, that can be downloaded and imported onto your local SQL Server 2014 instance using the feature answered here (Azure SQL Database Bacpac Local Restore)
Redgate's SQL Compare and SQL Data Compare are invaluable for these types of things. They are not cheap (but worth every penny imo). But if this is a one-time thing, you could use the 14 day trial and see how it behaves for you.
http://www.red-gate.com/products/