powershell attach sql server database that has been detached by management studio - sql-server

I am writing a PowerShell script that will automate a dev environment deployment and I've hit a problem with attaching the databases. I am adding the two snap-ins SqlServerCmdletSnapin100 and SqlServerProviderSnapin100 and using SQLSERVER:\SQL\localhost\SQLEXPRESS and the AttachDatabase method. This works well, and if I use the DetachDatabase method in the same way I can re-run the script continually. My problem arises when I detach from Management Studio and try to run the script again. No matter what I do here (permissions etc.) the script will continually fail from this point on with the error:
Exception calling "AttachDatabase" with "2" argument(s):
"Attach database failed for Server 'localhost\SQLEXPRESS'. "
If I change the name of the database I am attaching, the script will work again. Is there something in a system database that would be hanging onto the database or database files that I need to remove as well?
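For reference, this is roughly the shape of what I'm doing (the snap-in names and provider path are as above; the database name and file paths here are just placeholders):
# Load the two snap-ins mentioned above
Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100
# Navigate to the instance through the SQL Server provider and grab the SMO Server object
Set-Location SQLSERVER:\SQL\localhost\SQLEXPRESS
$server = Get-Item .
# Build the file list and attach (database name and paths are placeholders)
$files = New-Object System.Collections.Specialized.StringCollection
[void]$files.Add("C:\Data\MyDb.mdf")
[void]$files.Add("C:\Data\MyDb_log.ldf")
$server.AttachDatabase("MyDb", $files)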

SMO uses nested error objects, so I'm wondering what the base error message states. If you run this statement:
$error[0] | fl -force
what error message do you get?
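If the detail isn't obvious from that output, drilling down to the innermost exception usually surfaces the underlying SQL Server message; something along these lines should work:
# Show the full error record, then walk to the innermost exception for the real SQL Server message
$error[0] | Format-List * -Force
$error[0].Exception.GetBaseException().Message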
Update
Ran a quick test:
Detach database "hsg" from my local instance using SSMS and successfully attached with this script:
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $s = get-item .
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $s.AttachDatabase("hsg",$sc)
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $sc = new-object System.Collections.Specialized.StringCollection
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $sc.Add("C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL1\MSSQL\DATA\hsg.mdf")
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $sc.Add("C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL1\MSSQL\DATA\hsg_log.ldf)

I detached a database using SQL Server Management Studio and used the following code to attach the database back to the same instance. It worked OK except that the attached database did not get the original name, although the files were the same.
PS SQLSERVER:\sql\Hodentek8\RegencyPark\databases>
$files = new-object system.collections.specialized.stringcollection
$files.add("C:\Program Files\Microsoft SQL Server\MSSQL11.REGENCYPARK\MSSQL\DATA\Feb6.mdf")
$files.add("C:\Program Files\Microsoft SQL Server\MSSQL11.REGENCYPARK\MSSQL\DATA\Feb6_log.ldf")
$server = new-object Microsoft.SqlServer.Management.Smo.Server('Hodentek8\RegencyPark')
$dbname = "feb6"
# Pass the database name (not a Server object) so the database attaches under the intended name
$server.AttachDatabase($dbname, $files)
Some details are here:
http://hodentekmsss.blogspot.com/2015/04/attaching-detached-database-in-sql.html

I found this fix for some of my code which returned the same error: perhaps instead of localhost\SQLExpress you should use
$srv = new-object Microsoft.SqlServer.Management.Smo.Server 'Hodentek8\RegencyPark'
Replace 'Hodentek8\RegencyPark' with "(local)" for the default instance.
Example here:
http://hodentekmsss.blogspot.com/2015/04/counting-sql-server-configuration.html

Related

How to connect to local SQL server 2016 via PowerShell and execute a SQL stored in a .sql file?

I am trying to create a PowerShell script to automate a very simple process, however, I cannot get much (if anything) to work. The documentation is either not what I need, outdated or conflicting.
I've had a few variations of this:
$SQLConnection = New-Object System.Data.SQLClient.SQLConnection
$SQLConnection.ConnectionString = "Data Source=.\SQL2016;Initial Catalog=TEST;Trusted_Connection=true;"
$SQLConnection.Open()
$Cmd = new-object system.Data.SqlClient.SqlCommand($SQLConnection)
Invoke-Sqlcmd -InputFile "C:\dev\test\script.sql" | Out-File -filePath "C:\dev\test\output.sql"
$SQLConnection.Close()
I've not managed to connect to the database.
The idea being, script.sql spits out a bunch of SQL (this works fine) which we will put into source control. Once in source control, a Jenkins job will do something with it.
I'm trying to keep this as basic as possible; no flexibility is needed other than different connection strings. I want to avoid using PSSQL if possible: a user throws in their connection string (the database will be the name), runs the script, job done.
Can anyone point me in the right direction?
System.Data.SQLClient is only necessary if you don't have the SqlServer module installed or are doing something unusual. It's a much more verbose method.
You just need:
Import-Module SqlServer;
Invoke-Sqlcmd -ServerInstance '.\SQL2016' -Database 'TEST' -InputFile 'C:\dev\test\script.sql' |
Out-File -FilePath "C:\dev\test\output.sql"
However... your output file isn't really an .sql file unless the queries in script.sql are actually returning strings that should be executed as SQL. It should probably be a .txt file.
And depending on what exactly you're generating, you might want to consider Export-Csv -Path "C:\dev\test\output.csv" -NoTypeInformation instead of Out-File. I can't tell if you're trying to export data or just logging information.
Additionally, you'll need to make sure that you're not using the batch separator (GO) in script.sql, or relying on any "SQLCMD Mode" (as it's called in SQL Server Management Studio) or other sqlcmd.exe specific syntax. The Invoke-Sqlcmd doc outlines the differences between the two. If you don't know what that is, you're probably safe.
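If you want the user to be able to supply just a server and database and run it, a minimal parameterised wrapper could look something like this (parameter names and defaults are only illustrative):
param(
    # Instance and database supplied by the user (values here are examples)
    [string]$ServerInstance = '.\SQL2016',
    [string]$Database = 'TEST',
    [string]$InputFile = 'C:\dev\test\script.sql',
    [string]$OutputFile = 'C:\dev\test\output.txt'
)
Import-Module SqlServer
# Run the .sql file with Windows authentication and write whatever it returns to the output file
Invoke-Sqlcmd -ServerInstance $ServerInstance -Database $Database -InputFile $InputFile |
    Out-File -FilePath $OutputFile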

Powershell DB backup script passes only when executing from ISE

I've been using the following backup script for a while to create .bacpac files for MSSQL databases. The script works OK in general but fails for some databases. Those databases are not much different from the others, maybe a bit bigger. As this is used only for dev databases the average size is not big; the bacpac file size is ~200 MB.
The weird thing is that the script runs successfully when executed from PowerShell ISE but fails when run from the command line. Please note that the script only fails for some databases and works for others. The error message is not really helpful:
WARNING: Exception occurred: Exception calling "ExportBacpac" with "2" argument(s): "Could not export schema and data from database."
We are using MSSQL 2014 and the database can be exported to a backup from MSSQL Studio without problems.
The script:
Param(
    # Database name to backup e.g 'MYDB'
    $databaseName,
    # Database connection string "server=server ip;Integrated Security = True;User ID=user;Password=pass"
    $connectionString,
    # Path to the directory where backup file should be created
    $backupDirectory
)
add-type -path "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll"
try {
    $dacService = new-object Microsoft.SqlServer.Dac.DacServices $connectionString
    # build backup filename
    $backupFileName = $backupDirectory + "\" + $databaseName + [DateTime]::Now.ToString("yyyyMMdd-HHmmss") + ".bacpac"
    # perform backup
    $dacService.ExportBacpac($backupFileName, $databaseName)
    Write-Output "Database backup file created $backupFileName"
} catch {
    Write-Warning "Exception occurred: $_"
    throw "Database backup hasn't been created, execution aborted."
}
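For reference, from a plain (non-ISE) console the script is invoked roughly like this (the script file name here is just an example):
# Example invocation from a console session (script name is hypothetical)
powershell.exe -File .\Backup-Database.ps1 -databaseName "MYDB" -connectionString "server=localhost;Integrated Security=True" -backupDirectory "C:\backups"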
Has anyone come across this issue? I know PowerShell and the PowerShell ISE are a bit different, but I don't understand why script execution produces different results.
[EDIT]
Tried to add an event listener for DacServices and print the output to get more debug information:
register-objectevent -in $dacService -eventname Message -source "msg" -action { out-host -in $Event.SourceArgs[1].Message.Message } | Out-Null
The output is
Dac Assembly loaded.
Extracting schema (Start)
Gathering database options
Gathering users
WARNING: Exception occurred: Exception calling "ExportBacpac" with "2" argument(s): "Could not export schema and data
from database."
Gathering roles
Gathering application roles
Gathering role memberships
Gathering filegroups
Gathering full-text catalogs
Gathering assemblies
Gathering certificates
....
Processing Table '[dbo].[file_storage_entity]'. 99.74 % done.
Processing Table '[dbo].[file_storage_entity]'. 100.00 % done.
Exporting data (Failed)
[EDIT 2]
As a workaround, I've created a C# program to do the same. It runs fine in any environment. The code is virtually the same as in the PowerShell script.

Deploying a dacpac via Powershell caused error: "Unable to determine the identity of domain"

Has anyone else met a similar problem to the one described below?
I am having a problem deploying a SQL server 2012 dacpac database upgrade with Powershell. The details are as follows:
It's a dacpac file built for SQL Server 2012 and I'm trying to apply it to a SQL Server 2012 database via PowerShell run from the command line while logged in as administrator.
Exception calling "Deploy" with "4" argument(s): "Unable to determine the identity of domain."
At ... so.ps1:17 char:8
+ $d.Deploy($dp, $TargetDatabase,$true,$DeployOptions)
The redacted script (logging and literals changed) is as follows:
[System.Reflection.Assembly]::LoadFrom("C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll") | Out-Null
$d = new-object Microsoft.SqlServer.Dac.DacServices ("... Connection string ...")
$TargetDatabase = "databasename"
$fullDacPacPath = "c:\temp\...\databasename.dacpac"
# Load dacpac from file & deploy to database named pubsnew
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($fullDacPacPath)
$DeployOptions = new-object Microsoft.SqlServer.Dac.DacDeployOptions
$DeployOptions.IncludeCompositeObjects = $true
$DeployOptions.IgnoreFileSize = $false
$DeployOptions.IgnoreFilegroupPlacement = $false
$DeployOptions.IgnoreFileAndLogFilePath = $false
$DeployOptions.AllowIncompatiblePlatform = $true
$d.Deploy($dp, $TargetDatabase,$true,$DeployOptions)
Here is some supporting information:
Dac framework version is 11.1
The script throws the error when run from the command line:
i.e. Powershell -File databaseupgrade.ps1
but not when run in the PowerShell Integrated Scripting Environment (ISE).
Similar scripts work from the command line for other dacpacs.
Research on the web suggests that it might be something to do with the size of the dacpac. The ones that work are all smaller than the one that does not, and this link mentions a figure of 1.3 MB, which the file size of the failing dacpac just exceeds. If anyone can confirm that this is the problem, can you also suggest a solution?
Update
The following script exhibits the same behavior, i.e. it works in the PowerShell ISE but not from the command line.
[Reflection.Assembly]::LoadWithPartialName("System.IO.IsolatedStorage")
$f = [System.IO.IsolatedStorage.IsolatedStorageFile]::GetMachineStoreForDomain();
Write-Host($f.AvailableFreeSpace);
I believe the issue here (at least in our case) actually arises when the dacpac is working with a database that utilizes multiple filegroups. When doing the comparison for deployment, my hypothesis is that it's utilizing IsolatedStorage for the different files.
The link above was helpful, but it was not the entry so much as the last comment on that blog by Tim Lewis. I modified his code to work in native PowerShell. Putting this above the SMO assembly load should fix the issue:
$replacementEvidence = New-Object System.Security.Policy.Evidence
$replacementEvidence.AddHost((New-Object System.Security.Policy.Zone ([Security.SecurityZone]::MyComputer)))
$currentAppDomain = [System.Threading.Thread]::GetDomain()
$securityIdentityField = $currentAppDomain.GetType().GetField("_SecurityIdentity", ([System.Reflection.BindingFlags]::Instance -bOr [System.Reflection.BindingFlags]::NonPublic))
$securityIdentityField.SetValue($currentAppDomain,$replacementEvidence)
Edit - this answer is incorrect, see the link added in the original question for information about the real root cause.
It sounds like you're trying to connect with Windows Authentication and that's the cause of the failure (see this post as it seems to cover the error message you're getting). Change your connection string to use SQL Authentication or ensure that the user your powershell script is running as both has a domain-joined identity and has permissions to access the server. Basically, this is a SQL connection issue not a DAC issue.
It's been a few days now so I don't think a proper explanation will be forthcoming. I'll just post this as our workaround for anyone else who finds themselves in this situation.
There is a Microsoft command line program SqlPackage.exe that is fairly easy to get hold of. It will silently deploy a dacpac, can be executed in Powershell and has parameters that support all the options that we need.
If we use this instead of the Dac services assembly directly the domain problem does not arise.
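For illustration, the call looks something like this (the SqlPackage.exe path and the server/database names are placeholders):
# Publish the dacpac with SqlPackage.exe instead of calling DacServices.Deploy directly
& "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe" `
    /Action:Publish `
    /SourceFile:"c:\temp\databasename.dacpac" `
    /TargetServerName:"localhost" `
    /TargetDatabaseName:"databasename" `
    /p:AllowIncompatiblePlatform=True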

Powershell User Issue Through SQL Server Agent

I am currently trying to run a Powershell script through SQL Server Agent, which completes its task without error if I run it through Powershell ISE on the desktop. The simple script is below (it's only being used for testing):
$test = "G:\test.txt"
if (Test-Path $testFile)
{
Remove-Item $test
}
When I run this through SQL Server Agent, it produces a successful output with no errors whatsoever, but the job history log does show that it's being run as a different user, for instance domain\localmachine, whereas when I run the script through PowerShell ISE it shows domain\you.
As a note, I can't confirm this manually, because what I tried to do was run the script below both locally and through SQL Server Agent in a job to see the output, but the job failed (which is why I suspect it's a user issue). Therefore, I'm trusting SQL Server Agent that domain\localmachine is running the job (and that this is the reason it won't delete the file).
([Environment]::UserDomainName + "\" + [Environment]::UserName) | out-file pssaved.txt
"$env:userdomain\$env:username" | out-file -append pssaved.txt
[Security.Principal.WindowsIdentity]::GetCurrent().Name | out-file -append pssaved.txt
## Locally this produces domain\you
## On SQL Server Agent, I receive the error: The error information returned by PowerShell is: 'SQL Server PowerShell provider error: Path SQLSERVER:\pssaved.txt does not exist. Please specify a valid path.'
Is there a way, through SQL Server Agent to run a job as my domain user, for instance domain\you instead of the domain\localmachine (at least, this would eliminate this possibility of an error)?
You can use a proxy for this. Check it out.
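At a high level that means creating a credential for the Windows account you want, creating an Agent proxy on top of it, granting the proxy the PowerShell subsystem and then selecting that proxy as the 'Run as' account on the job step. A rough sketch (account, password and names are placeholders; the same T-SQL can be run from SSMS instead of Invoke-Sqlcmd):
# Sketch only: create a credential plus a SQL Agent proxy for the PowerShell subsystem
Invoke-Sqlcmd -ServerInstance 'localhost' -Query @"
CREATE CREDENTIAL [DomainUserCred]
    WITH IDENTITY = N'domain\you', SECRET = N'password-here';
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'PowerShellProxy',
    @credential_name = N'DomainUserCred',
    @enabled = 1;
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'PowerShellProxy',
    @subsystem_name = N'PowerShell';
"@
# Then pick 'PowerShellProxy' as the 'Run as' account on the job step.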

Get list of data and log files from detached SQL Server .mdf file

Given a detached SQL Server Primary Data File (.mdf) how can you get the list of data and log files that exist inside that file? The goal is to use the Server.AttachDatabase SMO method to attach the database. But the database may have multiple data and/or log files so I need to get the list in order to add them to the StringCollection parameter of the method.
What I need is the equivalent of select * from sys.files for a detached mdf. SQL Server Management Studio does this when you use it to manually attach a database so I know it can be done.
If it's a single MDF file that's one thing, but if there are other data files I don't think the MDF file can tell you about them directly.
And you can always try to attach an MDF file without a log file by using the CREATE DATABASE ... FOR ATTACH_REBUILD_LOG option.
In SMO you can do this using the AttachDatabase method specifying RebuildLog for AttachOptions.
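For example, a minimal sketch (server name and file path are placeholders, and this assumes the AttachDatabase(name, files, AttachOptions) overload):
# Attach the data file only and let SQL Server rebuild the missing log
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$server = New-Object Microsoft.SqlServer.Management.Smo.Server "sqlserver\instance"
$files = New-Object System.Collections.Specialized.StringCollection
[void]$files.Add("S:\DATA\mydb.mdf")
$server.AttachDatabase("mydb", $files, [Microsoft.SqlServer.Management.Smo.AttachOptions]::RebuildLog)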
Of course this all assumes the .mdf file is healthy - it will only be usable if it was cleanly detached.
Management Studio probably has some proprietary way of reading the file headers, but these aren't documented and you're not going to be able to see what SSMS is doing using Profiler or the like.
If you are typically creating .mdf files for distribution of some kind, I really strongly recommend using backup/restore instead. You can learn a lot more about a .BAK file and the data/log files its database represents, using documented and public methods such as RESTORE FILELISTONLY.
Finally figured this one out. The undocumented command DBCC checkprimaryfile(N'blah.mdf',3) gives the info needed.
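For example (the instance and file path are placeholders), passing option 3 returns a row for each data and log file recorded in the detached primary file:
# List the files recorded in the detached primary data file
Invoke-Sqlcmd -ServerInstance 'localhost\SQLEXPRESS' -Query "DBCC CHECKPRIMARYFILE (N'C:\Data\blah.mdf', 3)" | Format-Table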
It took me a while to find it, but in SMO you use EnumDetachedDatabaseFiles and EnumDetachedLogFiles.
# PowerShell
$servername = "sqlserver\instance"
$mdf = "S:\DATA\mydb.mdf"
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$filestructure = New-Object System.Collections.Specialized.StringCollection
$server = New-Object Microsoft.SqlServer.Management.Smo.Server $servername
# Here you can automatically determine the database name, or set it manually
$dbname = ($server.DetachedDatabaseInfo($mdf) | Where { $_.Property -eq "Database name" }).Value
# Collect the data files recorded in the detached .mdf
foreach ($file in $server.EnumDetachedDatabaseFiles($mdf)) {
    $null = $filestructure.add($file)
}
# Collect the associated log files
foreach ($file in $server.EnumDetachedLogFiles($mdf)) {
    $null = $filestructure.add($file)
}
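From there the discovered name and file list can be passed straight to AttachDatabase to finish the job:
# Attach using the file list and database name discovered above
$server.AttachDatabase($dbname, $filestructure)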
Enumerate detached database file structures
SMO Attach/Detach Recipes
