I am currently working on an application that has different permissions/users for the local development and test environments. I want to be able to ignore the users and permissions when deploying to either environment. The .sqldeployment file seems to offer options only for ignoring permissions (IgnorePermissions) and role membership (IgnoreRoleMembership) for a user, but not for ignoring the users themselves. Is this possible?
Sorry, not currently possible in the 2008 version of Visual Studio.
There isn't an obvious way - lots of requests for this feature here. The command-line tool, vsdbcmd.exe, suffers from the same problem.
As a workaround, I use a PowerShell script to remove users and schema authorization from the deployment SQL script. This could be run as a post-deploy step (not tried), or a TeamCity build step etc.
$regexOptions = [System.Text.RegularExpressions.RegexOptions]::Singleline
$rxUser = New-Object System.Text.RegularExpressions.Regex("PRINT[^;]*;\s*GO\s*CREATE USER \[[^\]]*](\s*WITH DEFAULT_SCHEMA = \[[^\]]*])?;\s*GO\s*", $regexOptions)
$rxSchema = New-Object System.Text.RegularExpressions.Regex("PRINT[^;]*;\s*GO\s*CREATE SCHEMA \[[^\]]*]\s*AUTHORIZATION \[[^\]]*];\s*GO\s*", $regexOptions)
Get-ChildItem "*.sql" | ForEach-Object {
    # $input is an automatic variable in PowerShell, so use a different name
    $sql = [System.IO.File]::ReadAllText($_.FullName)
    $sql = $rxUser.Replace($sql, "")
    $sql = $rxSchema.Replace($sql, "")
    $writer = [System.IO.File]::CreateText($_.FullName)
    $writer.Write($sql)
    $writer.Close()
}
The regex could be modified to ignore DROP USER statements too.
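For instance, a third pattern along the same lines could be defined and applied inside the same loop. This is a sketch; it assumes the generated deployment script frames DROP USER statements with the same PRINT ... GO blocks as the CREATE statements above.

```powershell
# Hypothetical third pattern: strip DROP USER blocks as well, assuming the
# deployment script frames them with the same PRINT ... GO structure.
$rxDropUser = New-Object System.Text.RegularExpressions.Regex("PRINT[^;]*;\s*GO\s*DROP USER \[[^\]]*];\s*GO\s*", [System.Text.RegularExpressions.RegexOptions]::Singleline)
# ...then, inside the per-file loop, one more replacement:
# $text = $rxDropUser.Replace($text, "")
```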
I am working on an SSIS package. I added one more Data Flow Task to an existing SSIS package. After adding the new task, I rebuilt the package and it succeeded without any errors.
Do I need to deploy it to the development server?
Background
The 2012 SSIS Project Deployment model in Visual Studio contains a file for project parameters, project level connection managers, packages and anything else you've added to the project.
In the following picture, you can see that I have a Solution named Lifecycle. That solution has a project named Lifecycle. The Lifecycle project has a Project Level Connection Manager ERIADOR defined and two SSIS packages: Package00.dtsx and Package01.dtsx.
When you run a package, behind the scenes Visual Studio will first build/compile all the required project elements into a deployable quantum called an ispac (pronounced eye-ess-pack, not ice-pack). This will be found in the bin\Development subfolder for your project.
Lifecycle.ispac is a zip file with the following contents.
What's all this mean? The biggest difference is that instead of just deploying an updated package, you'll need to deploy the whole .ispac. Yes, you really have to redeploy everything even though you only changed one package. Such is life.
How do I deploy packages using the SSIS Project Deployment model?
You have a host of options available to you, but the 3 things you will need to know are:
where is my ispac
what server am I deploying to
what folder does this project deploy to
SSDT
This will probably be your most common option in the beginning. Within SQL Server Data Tools, SSDT, you have the ability to define at the Configuration Manager level what server and what folder things are deployed to. At my client, I have 3 configurations: Dev, Stage, Production. Once you define those values, they get saved into the .dtproj file and you can then right-click and deploy to your heart's content from Visual Studio.
ISDeploymentWizard - GUI flavor
SSDT is really just building the call to the ISDeploymentWizard.exe which comes in 32 and 64 bit flavors for some reason.
C:\Program Files\Microsoft SQL Server\110\DTS\Binn\ISDeploymentWizard.exe
C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\ISDeploymentWizard.exe
An .ispac extension is associated to the ISDeploymentWizard so double click and away you go. The first screen is new compared to using the SSDT interface but after that, it will be the same set of clicks to deploy.
ISDeploymentWizard - command line flavor
What they got right with the 2012 release, and what sucked with the package deployment model, is that the deployment can now be done in an automated fashion out of the box. I had a workaround for the manifest file, but it should have been a standard "thing" all along.
So look carefully at the Review tab from either the SSDT or GUI deploy. Isn't that a beauty?
Using the same executable, ISDeploymentWizard, we can have both an attended and unattended installer for our .ispac(s). Highlight the second line there, copy paste and now you can have continuous integration!
C:\Program Files\Microsoft SQL Server\110\DTS\Binn\ISDeploymentWizard.exe
/Silent
/SourcePath:"C:\Dropbox\presentations\SSISDB Lifecycle\Lifecycle\Lifecycle\bin\Development\Lifecycle.ispac"
/DestinationServer:"localhost\dev2012"
/DestinationPath:"/SSISDB/Folder/Lifecycle"
TSQL
You can deploy an ispac to SQL Server through SQL Server Management Studio, SSMS, or through the command line, sqlcmd.exe. While SQLCMD is not strictly required, it simplifies the script.
You must use a Windows account to perform this operation, though, otherwise you'll receive the following error message.
The operation cannot be started by an account that uses SQL Server Authentication. Start the operation with an account that uses Windows Authentication.
Furthermore, you'll need the ability to perform bulk operations (to serialize the .ispac) and ssis_admin/sa rights to the SSISDB database.
Here we use the OPENROWSET with the BULK option to read the ispac into a varbinary variable. We create a folder via catalog.create_folder if it doesn't already exist and then actually deploy the project with catalog.deploy_project. Once done, I like to check the operations messages table to verify things went as expected.
USE SSISDB;
GO

-- You must be in SQLCMD mode
-- :setvar isPacPath "C:\Dropbox\presentations\SSISDB Lifecycle\Lifecycle\Lifecycle\bin\Development\Lifecycle.ispac"
:setvar isPacPath "<isPacFilePath, nvarchar(4000), C:\Dropbox\presentations\SSISDB Lifecycle\Lifecycle\Lifecycle\bin\Development\Lifecycle.ispac>"

DECLARE
    @folder_name nvarchar(128) = 'TSQLDeploy'
,   @folder_id bigint = NULL
,   @project_name nvarchar(128) = 'TSQLDeploy'
,   @project_stream varbinary(max)
,   @operation_id bigint = NULL;

-- Read the zip (ispac) data in from the source file
SELECT
    @project_stream = T.stream
FROM
(
    SELECT *
    FROM OPENROWSET(BULK N'$(isPacPath)', SINGLE_BLOB) AS B
) AS T (stream);

-- Test for folder existence
IF NOT EXISTS
(
    SELECT CF.name
    FROM catalog.folders AS CF
    WHERE CF.name = @folder_name
)
BEGIN
    -- Create the folder for our project
    EXECUTE [catalog].[create_folder]
        @folder_name
    ,   @folder_id OUTPUT;
END

-- Actually deploy the project
EXECUTE [catalog].[deploy_project]
    @folder_name
,   @project_name
,   @project_stream
,   @operation_id OUTPUT;

-- Check to see if something went awry
SELECT OM.*
FROM catalog.operation_messages AS OM
WHERE OM.operation_id = @operation_id;
Your MOM
As in, your Managed Object Model provides a .NET interface for deploying packages. This is a PowerShell approach for deploying an ispac along with creating the folder as that is an option the ISDeploymentWizard does not support.
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null
#this allows the debug messages to be shown
$DebugPreference = "Continue"
# Retrieves a 2012 Integration Services CatalogFolder object
# Creates one if not found
Function Get-CatalogFolder
{
param
(
[string] $folderName
, [string] $folderDescription
, [string] $serverName = "localhost\dev2012"
)
$connectionString = [String]::Format("Data Source={0};Initial Catalog=msdb;Integrated Security=SSPI;", $serverName)
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$integrationServices = New-Object Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices($connection)
# The one, the only SSISDB catalog
$catalog = $integrationServices.Catalogs["SSISDB"]
$catalogFolder = $catalog.Folders[$folderName]
if (-not $catalogFolder)
{
Write-Debug([System.string]::Format("Creating folder {0}", $folderName))
$catalogFolder = New-Object Microsoft.SqlServer.Management.IntegrationServices.CatalogFolder($catalog, $folderName, $folderDescription)
$catalogFolder.Create()
}
return $catalogFolder
}
# Deploy an ispac file into the SSISDB catalog
Function Deploy-Project
{
param
(
[string] $projectPath
, [string] $projectName
, $catalogFolder
)
# test to ensure file exists
if (-not $projectPath -or -not (Test-Path $projectPath))
{
Write-Debug("File not found $projectPath")
return
}
Write-Debug($catalogFolder.Name)
Write-Debug("Deploying $projectPath")
# read the data into a byte array
[byte[]] $projectStream = [System.IO.File]::ReadAllBytes($projectPath)
# $ProjectName MUST match the value in the .ispac file
# else you will see
# Failed to deploy the project. Fix the problems and try again later.:The specified project name, test, does not match the project name in the deployment file.
$projectName = "Lifecycle"
$project = $catalogFolder.DeployProject($projectName, $projectStream)
}
$isPac = "C:\Dropbox\presentations\SSISDB Lifecycle\Lifecycle\Lifecycle\bin\Development\Lifecycle.ispac"
$projectName = "Lifecycle"
$folderName = "SSIS2012"
$folderDescription = "I am a description"
$serverName = "localhost\dev2012"
$catalogFolder = Get-CatalogFolder $folderName $folderDescription $serverName
Deploy-Project $isPac $projectName $catalogFolder
Here is an update on deploying a single package in SSIS 2016 (hope this can be useful).
With the release of SQL Server 2016 and SSDT 2015, the issue of single package deployment is now a thing of the past. There is a new Deploy Package option (VS 2015) for deploying individual packages within a project deployment model.
With this new feature, you can also deploy multiple packages, by clicking and holding down the control key (Ctrl) and then choosing the packages you want to deploy.
Besides the Deploy Package option in Visual Studio 2015, there are some other possibilities you may use to deploy packages, like launching ISDeploymentWizard application or doing Command Line Deployment (this one is necessary when SSIS build and deployment is automated or managed as part of Continuous Integration process). You can learn more by navigating to this article: http://www.sqlshack.com/single-package-deployment-in-sql-server-integration-services-2016/
If you are using the project deployment model in SSIS 2012, you have to deploy the whole project every time you make any change in a package.
What you can simply do is:
right-click on the project and choose Deploy
Has anyone else met a similar problem to the one described below?
I am having a problem deploying a SQL server 2012 dacpac database upgrade with Powershell. The details are as follows:
It's a dacpac file built for SQL Server 2012, and I'm trying to apply it to a SQL Server 2012 database via PowerShell run from the command line while logged in as administrator.
Exception calling "Deploy" with "4" argument(s): "Unable to determine the identity of domain."
At ... so.ps1:17 char:8
+ $d.Deploy($dp, $TargetDatabase,$true,$DeployOptions)
The redacted script (logging and literals changed) is as follows:
[System.Reflection.Assembly]::LoadFrom("C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll") | Out-Null
$d = new-object Microsoft.SqlServer.Dac.DacServices ("... Connection string ...")
$TargetDatabase = "databasename"
$fullDacPacPath = "c:\temp\...\databasename.dacpac"
# Load dacpac from file & deploy to database named pubsnew
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($fullDacPacPath)
$DeployOptions = new-object Microsoft.SqlServer.Dac.DacDeployOptions
$DeployOptions.IncludeCompositeObjects = $true
$DeployOptions.IgnoreFileSize = $false
$DeployOptions.IgnoreFilegroupPlacement = $false
$DeployOptions.IgnoreFileAndLogFilePath = $false
$DeployOptions.AllowIncompatiblePlatform = $true
$d.Deploy($dp, $TargetDatabase,$true,$DeployOptions)
Here is some supporting information:
Dac framework version is 11.1
The script throws the error when run on the command line:
ie. Powershell -File databaseupgrade.ps1
but not when run in the Powershell integrated script environment
Similar scripts work from the command line for other dacpacs.
Research on the web suggests that it might have something to do with the size of the dacpac. The ones that work are all smaller than the one that does not, and this link mentions a figure of 1.3 MB, which the file size of the failing dacpac just exceeds. If anyone can confirm that this is the problem, can you also suggest a solution?
Update
The following script exhibits the same behavior ie. works in PS Ide not from command line.
[Reflection.Assembly]::LoadWithPartialName("System.IO.IsolatedStorage")
$f = [System.IO.IsolatedStorage.IsolatedStorageFile]::GetMachineStoreForDomain();
Write-Host($f.AvailableFreeSpace);
I believe the issue here (at least in our case) actually occurs when the dacpac works with a database that uses multiple filegroups. When doing the comparison for deployment, my hypothesis is that it utilizes IsolatedStorage for the different files.
The link above was helpful, but it was not the entry as much as the last comment on that blog by Tim Lewis. I modified his code to work in native powershell. Putting this above the SMO assembly load should fix this issue:
$replacementEvidence = New-Object System.Security.Policy.Evidence
$replacementEvidence.AddHost((New-Object System.Security.Policy.Zone ([Security.SecurityZone]::MyComputer)))
$currentAppDomain = [System.Threading.Thread]::GetDomain()
$securityIdentityField = $currentAppDomain.GetType().GetField("_SecurityIdentity", ([System.Reflection.BindingFlags]::Instance -bOr [System.Reflection.BindingFlags]::NonPublic))
$securityIdentityField.SetValue($currentAppDomain,$replacementEvidence)
Edit - this answer is incorrect, see the link added in the original question for information about the real root cause.
It sounds like you're trying to connect with Windows Authentication and that's the cause of the failure (see this post as it seems to cover the error message you're getting). Change your connection string to use SQL Authentication or ensure that the user your powershell script is running as both has a domain-joined identity and has permissions to access the server. Basically, this is a SQL connection issue not a DAC issue.
It's been a few days now so I don't think a proper explanation will be forthcoming. I'll just post this as our workaround for anyone else who finds themselves in this situation.
There is a Microsoft command line program SqlPackage.exe that is fairly easy to get hold of. It will silently deploy a dacpac, can be executed in Powershell and has parameters that support all the options that we need.
If we use this instead of the Dac services assembly directly the domain problem does not arise.
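For reference, an invocation along these lines worked for us. This is a sketch: the SqlPackage.exe path is for DAC Framework 11.x, the server/database names are placeholders, and the /p: properties shown simply mirror the DacDeployOptions set in the original script.

```
"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe" ^
    /Action:Publish ^
    /SourceFile:"c:\temp\databasename.dacpac" ^
    /TargetServerName:"myserver" ^
    /TargetDatabaseName:"databasename" ^
    /p:IncludeCompositeObjects=true ^
    /p:AllowIncompatiblePlatform=true
```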
I managed to grasp the concept of foreach, programming against the SSMS registered servers store, and reading from a file through this question:
Adding Servers to SQL Management Studio
This is the code that is working for me now
Import-Csv C:\sl.csv | ForEach-Object { New-Item $(Encode-Sqlname $_.Name) -ItemType Registration -Value ("server=$($_.Name);integrated security=true") }
But I was having a problem with getting the user name and password to be auto-configured,
which means changing the above code to this
Import-Csv C:\sl.csv | ForEach-Object { New-Item $(Encode-Sqlname $_.Name) -ItemType Registration -Value ("server=$($_.Name);integrated security=false") }
but that is ok that is how the people I am delivering the script to prefer, for security purposes.(even though I would like to know how to get it done :))
Now for further enhancement, there are quite a number of mirrored servers, like
server1/instance2a
server2/instance2b
So the thing is, in the Registered Servers window
there are mirrored and principal servers, which means when the server is registered I want the name to appear like this: server1/instance2a (mirror). That way, when the user wants to log in, he easily knows which is the mirror and which is the principal server. To determine this, the SQL query is:
select mirroring_role_desc from sys.database_mirroring where database_id > 4 and mirroring_state is NOT NULL
the output of this will give me this
mirroring_role_desc
PRINCIPAL
PRINCIPAL
PRINCIPAL
PRINCIPAL
PRINCIPAL
This query will first determine whether it is a mirrored instance; if it is, it displays the number of principal databases in the instance. From here I want to take the output and set the registered server name according to the specification I mentioned above.
BUT when there is a failover, the registered server name still shows as server1/instance2a even though it is now the mirror. So, as you can understand by now, I am trying to make this script dynamic so the user can run it whenever he wants, or it is run every fortnight or so (not to worry about the schedule for now).
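One way to sketch the naming piece: run the mirroring query against each server and append a suffix before registering it. This is a hypothetical helper, not a full solution; it assumes Invoke-Sqlcmd is available and that the query from the question returns the role rows. Re-running the script after a failover would refresh the suffix.

```powershell
# Sketch: decide the display name for a server based on its mirroring role.
# Assumes Invoke-Sqlcmd is available; server name is a placeholder.
function Get-DisplayName {
    param([string] $serverName)
    $query = "select distinct mirroring_role_desc from sys.database_mirroring where database_id > 4 and mirroring_state is not null"
    $roles = Invoke-Sqlcmd -ServerInstance $serverName -Query $query
    if ($roles.mirroring_role_desc -contains "MIRROR") {
        return "$serverName (mirror)"
    }
    return $serverName
}
# The result could then be passed as the registration name in the
# Import-Csv / New-Item pipeline shown above.
```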
Given a detached SQL Server Primary Data File (.mdf) how can you get the list of data and log files that exist inside that file? The goal is to use the Server.AttachDatabase SMO method to attach the database. But the database may have multiple data and/or log files so I need to get the list in order to add them to the StringCollection parameter of the method.
What I need is the equivalent of select * from sys.files for a detached mdf. SQL Server Management Studio does this when you use it to manually attach a database so I know it can be done.
If it's a single MDF file, you can just attach it; if there are other data files, I don't think the MDF file can tell you that directly.
And you can always try to attach an MDF file without a log file by using the CREATE DATABASE ... FOR ATTACH_REBUILD_LOG option.
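For example (database name and file path are placeholders):

```sql
-- Attach the data file and let SQL Server rebuild a new log file
CREATE DATABASE MyDb
    ON (FILENAME = 'S:\DATA\mydb.mdf')
    FOR ATTACH_REBUILD_LOG;
```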
In SMO you can do this using the AttachDatabase method specifying RebuildLog for AttachOptions.
Of course this all assumes the .mdf file is healthy - it will only be usable if it was cleanly detached.
Management Studio probably has some proprietary way of reading the file headers, but these aren't documented and you're not going to be able to see what SSMS is doing using Profiler or the like.
If you are typically creating .mdf files for distribution of some kind, I really strongly recommend using backup/restore instead. You can learn a lot more about a .BAK file and the data/log files its database represents, using documented and public methods such as RESTORE FILELISTONLY.
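For example, against a backup file (the path is a placeholder):

```sql
-- Lists every data and log file belonging to the backed-up database
RESTORE FILELISTONLY FROM DISK = N'C:\backups\mydb.bak';
```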
Finally figured this one out. The undocumented command DBCC checkprimaryfile(N'blah.mdf',3) gives the info needed.
It took me a while to find it, but in SMO, you use EnumDetachedDatabaseFiles and EnumDetachedLogFiles.
# PowerShell
$servername = "sqlserver\instance"
$mdf = "S:\DATA\mydb.mdf"
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$filestructure = New-Object System.Collections.Specialized.StringCollection
$server = New-Object Microsoft.SqlServer.Management.Smo.Server $servername
# Here you can automatically determine the database name, or set it manually
$dbname = ($server.DetachedDatabaseInfo($mdf) | Where { $_.Property -eq "Database name" }).Value
foreach ($file in $server.EnumDetachedDatabaseFiles($mdf)) {
$null = $filestructure.add($file)
}
foreach ($file in $server.EnumDetachedLogFiles($mdf)) {
$null = $filestructure.add($file)
}
Enumerate detached database file structures
SMO Attach/Detach Recipes
I'm using SQL Server 2008.
I'm trying to script out all my stored procedures, and it's very easy to do in SSMS by right clicking my database, then going to Tasks -> Generate Scripts. I set my options and all is good.
I would like to get the actual T-SQL that the script wizard is executing so that I don't have to go through and select all of my options every single time I want to do it. I want to just open the script and hit run. Is there a way to copy the script that the wizard itself is executing? Or do I just have to do it manually every time?
You can do this with a pretty simple powershell script using the SMO Framework. You will need to have SQL Server Management Studio installed for the framework to get picked up. You should look into this further, but the basic framework will be:
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$srv = new-object "Microsoft.SqlServer.Management.SMO.Server" 'MyServer'
$db = $srv.Databases['MyDatabase']
$scr = New-Object "Microsoft.SqlServer.Management.Smo.Scripter"
$scr.Server = $srv
$scr.options.filename = 'C:\SomeFolder\MyExports.SQL'
$db.StoredProcedures | where-object {$_.IsSystemObject -eq $False} | %{$scr.Script($_)}
You may need to alter some additional options. MSDN has a pretty thorough overview of the framework here.
Essentially the above will script out all the stored procs in a database to whatever file you specify. SMO is the framework that SSMS uses so it should be identical.
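A few commonly tweaked Scripter settings, as a sketch to build on (these are real ScriptingOptions properties, but which ones you need depends on the options you pick in the wizard):

```powershell
# Optional: common Scripter settings (SMO ScriptingOptions properties)
$scr.Options.AppendToFile = $true      # append each proc to the one output file
$scr.Options.IncludeHeaders = $true    # emit the per-object header comment
$scr.Options.ScriptDrops = $false      # emit CREATE rather than DROP statements
```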
If all you're after is "script all procedures," and it's a one-time thing, you can open Object Explorer Details in Management Studio, highlight Stored Procedures in Object Explorer, then Ctrl + A, right-click, script as > ...
If you want a slightly more automated way, there are several schema comparison products on the market so that you don't have to care about the script that Management Studio uses to generate the script for a single object. As a bonus, it will be much easier to synchronize other, more complicated objects - just try to generate the script for a table yourself, and you will see it is no picnic. I go over many options in this blog post.