I've been using the following backup script for a while to create .bacpac files for MSSQL databases. The script works fine in general but fails for some databases. Those databases are not much different from the others, maybe a bit bigger. As this is used only for dev databases the average size is not big; the .bacpac file size is ~200 MB.
The weird thing is that the script runs successfully when executed from PowerShell ISE but fails when run from the command line. Note that the script only fails for some databases and works for others. The error message is not really helpful:
WARNING: Exception occurred: Exception calling "ExportBacpac" with "2" argument(s): "Could not export schema and data from database."
We are using MSSQL 2014, and the database can be exported to a backup from SQL Server Management Studio without problems.
The script:
Param(
# Database name to backup e.g 'MYDB'
$databaseName,
# Database connection string "server=server ip;Integrated Security = True;User ID=user;Password=pass"
$connectionString,
# Path to the directory where backup file should be created
$backupDirectory
)
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll"
try {
    $dacService = New-Object Microsoft.SqlServer.Dac.DacServices $connectionString
    # build the backup file name
    $backupFileName = $backupDirectory + "\" + $databaseName + [DateTime]::Now.ToString("yyyyMMdd-HHmmss") + ".bacpac"
    # perform the backup
    $dacService.ExportBacpac($backupFileName, $databaseName)
    Write-Output "Database backup file created $backupFileName"
} catch {
    Write-Warning "Exception occurred: $_"
    throw "Database backup hasn't been created, execution aborted."
}
Has anyone come across this issue? I know PowerShell and the PowerShell ISE are a bit different, but I don't understand why script execution produces different results.
[EDIT]
I tried to add an event listener for DacServices and print its output to get more debug information:
register-objectevent -in $dacService -eventname Message -source "msg" -action { out-host -in $Event.SourceArgs[1].Message.Message } | Out-Null
The output is
Dac Assembly loaded.
Extracting schema (Start)
Gathering database options
Gathering users
WARNING: Exception occurred: Exception calling "ExportBacpac" with "2" argument(s): "Could not export schema and data from database."
Gathering roles
Gathering application roles
Gathering role memberships
Gathering filegroups
Gathering full-text catalogs
Gathering assemblies
Gathering certificates
....
Processing Table '[dbo].[file_storage_entity]'. 99.74 % done.
Processing Table '[dbo].[file_storage_entity]'. 100.00 % done.
Exporting data (Failed)
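For reference, here is a slightly fuller version of the same listener registration that also removes the subscription once the export finishes (a sketch; only the try/finally cleanup is new compared to the one-liner above):
$dacService = New-Object Microsoft.SqlServer.Dac.DacServices $connectionString
# print every DacServices progress message as it arrives
Register-ObjectEvent -InputObject $dacService -EventName Message -SourceIdentifier "msg" -Action {
    Out-Host -InputObject $Event.SourceArgs[1].Message.Message
} | Out-Null
try {
    $dacService.ExportBacpac($backupFileName, $databaseName)
} finally {
    # unregister so repeated runs in the same session don't fail on a duplicate "msg" subscriber
    Unregister-Event -SourceIdentifier "msg"
}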
[EDIT 2]
As a workaround, I've created a C# program to do the same thing. It runs fine in any environment. The code is virtually the same as in the PowerShell script.
Related
The process: an Azure Pipelines agent that runs on a Windows 10 Pro 32-bit machine with SQL Server 2014 Express installed.
The pipeline is built and runs successfully with PowerShell scripts as follows:
Create blank database
Create tables needed
A C# application, executed via a PowerShell script, runs and populates the tables
Cross-reference tables to update the data needed
Build an SSIS package
After the SSIS package reports success, perform a backup
Command:
Backup-SqlDatabase -ServerInstance "$env:ComputerName" -Database "RealDB" `
    -BackupAction Database -BackupFile $Path -BlockSize 4096
This all works, with one exception: the actual backup I get is missing the data from the SSIS package run. If I log into the machine and restore the backup from $Path, it is indeed missing the data.
When I query the database after this process, the data is there in the database.
There is only one database, so it's not backing up a different one.
I can run this command in PowerShell on the machine myself, and that backup has the data that the backup taken by the agent's PowerShell command does not.
Also, interestingly enough, if I remove -BlockSize 4096 it works as I expect and the backup has the data in it. I am considering abandoning the PowerShell approach because of this, but thought I would ask to see if anyone has experienced this.
Any help or thoughts are appreciated.
Thank you
Thank you @user19702. I was so consumed looking at the backup command with the added -BlockSize that I completely ignored the fact that my data has increased (thanks, random process I had never heard of before today), and even though the PowerShell is written to start the backup AFTER the SSIS package, that process was not done yet. To find it I started Task Manager on the machine while the build was running and watched the process stay in memory for a few seconds after the backup started. I added a PowerShell command to make it wait a few seconds before performing the backup and it's working.
In case anyone is wondering, this is the command:
$result = $package.Execute("false", $null)
Write-Host "Package ID Result: " $result
Start-Sleep -Seconds 10
Backup-SqlDatabase -ServerInstance "$env:ComputerName" -Database "RealDB" -BackupAction Database -BlockSize 4096 -BackupFile $Path
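If a fixed sleep ever turns out to be too short, one hedged alternative is to poll until the lingering process has exited before starting the backup; the process name ISServerExec below is only a placeholder for whatever process you saw in Task Manager:
$result = $package.Execute("false", $null)
Write-Host "Package ID Result: " $result
# wait until the placeholder process is gone instead of sleeping a fixed number of seconds
while (Get-Process -Name "ISServerExec" -ErrorAction SilentlyContinue) {
    Start-Sleep -Seconds 1
}
Backup-SqlDatabase -ServerInstance "$env:ComputerName" -Database "RealDB" -BackupAction Database -BlockSize 4096 -BackupFile $Path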
Thank You!!
I'm trying to write a PowerShell script that executes T-SQL queries against a single remote server using Invoke-Sqlcmd. The T-SQL queries are simple ones like backup/restore a database, create a user, etc.
Below is an extract of it:
# 5 # Create clientdb database on secondary server by restoring the full backup for primary
Try {
    Invoke-Sqlcmd -ServerInstance 'REMOTESQLSRV' `
        -Username 'ts_sql' -Password 'somepassword' `
        -InputFile "$LScltid\__01_On_Secondary_CreateDB2_srv2.sql" `
        -ErrorAction Stop
    Write-Host " clt_$id is now restored to secondary server " `
        -ForegroundColor White -BackgroundColor Green
} Catch {
    Write-Host " Restore operation for clt_$id did not succeed. Check the error logs " -ForegroundColor Black -BackgroundColor Red
}
My script always breaks here. For some reason that I can't put my finger on, Invoke-Sqlcmd does not use the variable "$LScltid" to resolve the path where it will find the .sql script.
Every time I run it, it changes the current directory to the SQLSERVER:\ provider or some other location, causing the script to fail at this step.
Am I doing this the right way? If not, how should I adapt the command to make it behave as I expect?
UPDATE
Forgot to mention: if I run the script with the variable values hard-coded, I'm able to get the result I need (in this case restoring a database from a device).
Thanks for your feedback.
Odd thing: the command is now working. I don't really know what I did wrong previously, but the exact same command now works.
Just a heads-up for those facing the same issue:
If you have to use Invoke-Sqlcmd in your scripts, beware of the provider change, especially if the commands coming after Invoke-Sqlcmd are regular ones (Get, Set, Copy, New, etc.).
In my case, somewhere in my script between two Invoke-Sqlcmd commands I had to copy files from the local machine to the remote server. Every time, the command failed because the provider had changed. As a workaround I ran Set-Location before the Copy-Item command, and that seemed to do the trick (I don't know if it's recommended, though).
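For example, something along these lines (a sketch; the Invoke-Sqlcmd parameters are taken from the question above and the Copy-Item paths are placeholders):
Push-Location                 # remember the current filesystem location
Invoke-Sqlcmd -ServerInstance 'REMOTESQLSRV' -Username 'ts_sql' -Password 'somepassword' `
    -InputFile "$LScltid\__01_On_Secondary_CreateDB2_srv2.sql" -ErrorAction Stop
Pop-Location                  # leave the SQLSERVER:\ provider before doing file operations
Copy-Item -Path "C:\local\backups\clt.bak" -Destination "\\REMOTESQLSRV\backups\"   # placeholder paths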
Thanks Stackoverflow Team
I am currently trying to run a PowerShell script through SQL Server Agent, which completes its task without error if I run it through PowerShell ISE on the desktop. The simple script is below (it's only being used for testing):
$test = "G:\test.txt"
if (Test-Path $test)
{
    Remove-Item $test
}
When I run this through SQL Server Agent, it produces a successful output - no errors whatsoever - but the job history log does show that it's being run as a different user, for instance domain\localmachine, whereas when I run the script through PowerShell ISE, it shows domain\you.
As a note, I can't confirm this manually: I tried to run the script below both locally and through SQL Server Agent in a job to compare the output, but the job failed (which is why I suspect it's a user issue). Therefore, I'm trusting SQL Server Agent that domain\localmachine is running the job (and that this is the reason it won't delete the file).
([Environment]::UserDomainName + "\" + [Environment]::UserName) | out-file pssaved.txt
"$env:userdomain\$env:username" | out-file -append pssaved.txt
[Security.Principal.WindowsIdentity]::GetCurrent().Name | out-file -append pssaved.txt
## Locally this produces domain\you
## On SQL Server Agent, I receive the error: The error information returned by PowerShell is: 'SQL Server PowerShell provider error: Path SQLSERVER:\pssaved.txt does not exist. Please specify a valid path.'
Is there a way, through SQL Server Agent, to run a job as my domain user, for instance domain\you instead of domain\localmachine (at least this would eliminate that possible cause of the error)?
You can use a SQL Server Agent proxy for this.
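A rough sketch of what setting one up looks like (the credential name, proxy name, account, and password are all placeholders; it needs sufficient rights in msdb, and the job step's "Run as" then has to be switched to the proxy):
# Sketch: create a credential for the domain account, wrap it in an Agent proxy,
# and allow that proxy to run PowerShell job steps. All names below are placeholders.
$setupProxy = @"
CREATE CREDENTIAL [AgentPsCredential] WITH IDENTITY = N'domain\you', SECRET = N'YourPasswordHere';
EXEC msdb.dbo.sp_add_proxy @proxy_name = N'PsProxy', @credential_name = N'AgentPsCredential', @enabled = 1;
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'PsProxy', @subsystem_name = N'PowerShell';
"@
Invoke-Sqlcmd -ServerInstance $env:ComputerName -Query $setupProxy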
Given a detached SQL Server Primary Data File (.mdf) how can you get the list of data and log files that exist inside that file? The goal is to use the Server.AttachDatabase SMO method to attach the database. But the database may have multiple data and/or log files so I need to get the list in order to add them to the StringCollection parameter of the method.
What I need is the equivalent of select * from sys.files for a detached mdf. SQL Server Management Studio does this when you use it to manually attach a database so I know it can be done.
If it's a single MDF file, that part is easy; but if there are other data files, I don't think the MDF file can tell you that directly.
And you can always try to attach an MDF file without a log file by using the CREATE DATABASE ... FOR ATTACH_REBUILD_LOG option.
In SMO you can do this using the AttachDatabase method specifying RebuildLog for AttachOptions.
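Something like this, for instance (a sketch; the instance name, file path, and database name are placeholders, and it assumes the AttachDatabase overload that accepts an AttachOptions value):
# Sketch: attach an .mdf without its log file and let SQL Server rebuild the log
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$server = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost\SQLEXPRESS"   # placeholder instance
$files  = New-Object System.Collections.Specialized.StringCollection
$null   = $files.Add("S:\DATA\mydb.mdf")                                                 # placeholder path
$server.AttachDatabase("mydb", $files, [Microsoft.SqlServer.Management.Smo.AttachOptions]::RebuildLog)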
Of course this all assumes the .mdf file is healthy - it will only be usable if it was cleanly detached.
Management Studio probably has some proprietary way of reading the file headers, but these aren't documented and you're not going to be able to see what SSMS is doing using Profiler or the like.
If you are typically creating .mdf files for distribution of some kind, I really strongly recommend using backup/restore instead. You can learn a lot more about a .BAK file and the data/log files its database represents, using documented and public methods such as RESTORE FILELISTONLY.
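For example, a quick sketch of reading that file list from PowerShell (the instance and backup path are placeholders):
# Sketch: RESTORE FILELISTONLY lists every data/log file a .BAK contains,
# including logical names and original physical paths
$fileList = Invoke-Sqlcmd -ServerInstance "localhost\SQLEXPRESS" `
    -Query "RESTORE FILELISTONLY FROM DISK = N'S:\BACKUP\mydb.bak'"
$fileList | Select-Object LogicalName, PhysicalName, Type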
Finally figured this one out. The undocumented command DBCC checkprimaryfile(N'blah.mdf',3) gives the info needed.
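For anyone who wants to call it from PowerShell, a sketch (the instance and .mdf path are placeholders; since the command is undocumented, the output columns may differ between versions):
# Sketch: ask the detached .mdf which data/log files it references
$rows = Invoke-Sqlcmd -ServerInstance "localhost\SQLEXPRESS" `
    -Query "DBCC checkprimaryfile (N'S:\DATA\mydb.mdf', 3)"
$rows   # option 3 returns one row per file referenced by the primary data file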
It took me a while to find it, but in SMO you use EnumDetachedDatabaseFiles and EnumDetachedLogFiles.
# PowerShell
$servername = "sqlserver\instance"
$mdf = "S:\DATA\mydb.mdf"
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$filestructure = New-Object System.Collections.Specialized.StringCollection
$server = New-Object Microsoft.SqlServer.Management.Smo.Server $servername
# Here you can automatically determine the database name, or set it manually
$dbname = ($server.DetachedDatabaseInfo($mdf) | Where { $_.Property -eq "Database name" }).Value
foreach ($file in $server.EnumDetachedDatabaseFiles($mdf)) {
$null = $filestructure.add($file)
}
foreach ($file in $server.EnumDetachedLogFiles($mdf)) {
$null = $filestructure.add($file)
}
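From there the collected file structure can be handed straight to AttachDatabase to finish the original goal (a sketch reusing the variables built above):
# Sketch: attach the database using the file list gathered from the detached .mdf
$server.AttachDatabase($dbname, $filestructure)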
Enumerate detached database file structures
SMO Attach/Detach Recipes
I am writing a PowerShell script that will automate a dev environment deployment, and I've hit a problem with attaching the DBs. I am adding the two snap-ins SqlServerCmdletSnapin100 and SqlServerProviderSnapin100, and using the SQLSERVER:\SQL\localhost\SQLEXPRESS provider with the AttachDatabase method. This works well, and if I use the DetachDatabase method in the same way I can re-run the script continually. My problem arises when I detach the database from Management Studio and try to run the script again. No matter what I do here (permissions etc.) the script will continually fail from this point on with the error:
Exception calling "AttachDatabase" with "2" argument(s):
"Attach database failed for Server 'localhost\SQLEXPRESS'. "
If I change the name of the database I am attaching, the script works again. Is there something in a system DB that would be hanging onto the database or the database files that I need to remove as well?
SMO uses nested error objects, so I'm wondering what the base error message states. If you run this statement:
$error[0] | fl -force
What error message do you get?
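If that formatted output still hides the root cause, here is a small sketch for walking down to the innermost exception:
# Sketch: SMO wraps the real failure in nested InnerExceptions; walk to the bottom one
$ex = $error[0].Exception
while ($ex.InnerException) { $ex = $ex.InnerException }
$ex.Message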
Update
Ran a quick test:
Detached database "hsg" from my local instance using SSMS and successfully attached it with this script:
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $s = get-item .
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $sc = new-object System.Collections.Specialized.StringCollection
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $sc.Add("C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL1\MSSQL\DATA\hsg.mdf")
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $sc.Add("C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL1\MSSQL\DATA\hsg_log.ldf")
PS SQLSERVER:\SQL\WIN7BOOT\SQL1> $s.AttachDatabase("hsg",$sc)
I detached a database using SQL Server Management Studio and used the following code to attach the database back to the same instance. It worked OK, except that the attached database did not get the original name, although the files were the same.
PS SQLSERVER:\sql\Hodentek8\RegencyPark\databases>
$files = new-object system.collections.specialized.stringcollection
$files.add("C:\Program Files\Microsoft SQL Server\MSSQL11.REGENCYPARK\MSSQL\DATA\Feb6.mdf")
$files.add("C:\Program Files\Microsoft SQL Server\MSSQL11.REGENCYPARK\MSSQL\DATA\Feb6_log.ldf")
$server = new-object Microsoft.SqlServer.Management.Smo.Server('Hodentek8\RegencyPark')
$dbname = "feb6"
$server.AttachDatabase($dbname, $files)
Some details are here:
http://hodentekmsss.blogspot.com/2015/04/attaching-detached-database-in-sql.html
I found this fix for some of my code which returned the same error: perhaps instead of localhost\SQLExpress you should use
$srv = new-object Microsoft.SqlServer.Management.Smo.Server 'Hodentek8\RegencyPark'
Replace 'Hodentek8\RegencyPark' with "(local)" for the default instance.
Example here:
http://hodentekmsss.blogspot.com/2015/04/counting-sql-server-configuration.html