Download all SSRS reports - sql-server

I want to get a copy of all the .rdl files on one server.
I can download them manually, one report at a time, but that is time consuming, especially since this server has around 1500 reports.
Is there any way, or any tool, that allows me to download all the .rdl files and take a copy of them?

There is a simpler and more complete way to do this using PowerShell.
This code will export ALL report content in the exact same structure as the report server. Take a look at the GitHub wiki for other options and commands.
#------------------------------------------------------
#Prerequisites
#Install-Module -Name ReportingServicesTools
#------------------------------------------------------
#Let's export all catalog items from a single instance
#------------------------------------------------------
#Declare SSRS URI
$sourceRsUri = 'http://ReportServerURL/ReportServer/ReportService2010.asmx?wsdl'
#Declare the proxy so we don't need to connect with every command
$proxy = New-RsWebServiceProxy -ReportServerUri $sourceRsUri
#Output ALL Catalog items to file system
Out-RsFolderContent -Proxy $proxy -RsFolder / -Destination 'C:\SSRS_Out' -Recurse
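The same cmdlet works against a single folder too, if you don't need the whole server. A minimal sketch ('/Finance' is a placeholder folder name):
#Export just one folder and its children, preserving structure
Out-RsFolderContent -Proxy $proxy -RsFolder '/Finance' -Destination 'C:\SSRS_Out' -Recurse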

I've created this PowerShell script to copy them into a ZIP file. You have to provide the SQL Server database details.
Add-Type -AssemblyName "System.IO.Compression.Filesystem"
$dataSource = "SQLSERVER"
$user = "sa"
$pass = "sqlpassword"
$database = "ReportServer"
$connectionString = "Server=$dataSource;uid=$user; pwd=$pass;Database=$database;Integrated Security=False;"
$tempfolder = "$env:TEMP\Reports"
$zipfile = $PSScriptRoot + '\reports.zip'
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$allreports = $connection.CreateCommand()
$allreports.CommandText = "SELECT ItemID, Path, CASE WHEN Type = 2 THEN '.rdl' ELSE '.rds' END AS Ext FROM Catalog WHERE Type IN(2,5)"
$result = $allreports.ExecuteReader()
$reportable = new-object "System.Data.DataTable"
$reportable.Load($result)
[int]$objects = $reportable.Rows.Count
foreach ($report in $reportable) {
    $cmd = $connection.CreateCommand()
    $cmd.CommandText = "SELECT CAST(CAST(Content AS VARBINARY(MAX)) AS XML) FROM Catalog WHERE ItemID = '" + $report[0] + "'"
    $xmldata = [string]$cmd.ExecuteScalar()
    $filename = $tempfolder + $report["Path"].Replace('/', '\') + $report["Ext"]
    New-Item $filename -Force | Out-Null
    Set-Content -Path ($filename) -Value $xmldata -Force
    Write-Host "$($objects.ToString()).$($report["Path"])"
    $objects -= 1
}
Write-Host "Compressing to zip file..."
if (Test-Path $zipfile) {
    Remove-Item $zipfile
}
[IO.Compression.ZipFile]::CreateFromDirectory($tempfolder, $zipfile)
Write-Host "Removing temporary data"
Remove-Item -LiteralPath $tempfolder -Force -Recurse
Invoke-Item $zipfile
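On PowerShell 5 or later you could also swap the .NET ZipFile call for the built-in cmdlet; a one-line sketch:
Compress-Archive -Path "$tempfolder\*" -DestinationPath $zipfile -Force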

If you just need this for backup purposes or something similar, this might be useful: Where does a published RDL file sit?
The relevant query from that thread is:
select convert(varchar(max), convert(varbinary(max), content))
from catalog
where content is not null
The original answer was written against 2005, and I've used it on 2016, so I imagine it should work for 2008 and 2012 as well.
When I had to use this, I added in the Path to the query as well, so that I knew which report was which.
CAVEAT: prior to SSMS v18, Results to Grid is limited to 64KB per tuple and Results to Text is limited to 8,192 characters per tuple. If your report definition is larger than these limits, you will not be able to get the entire definition.
In SSMS v18, those limits have been increased to 2MB per tuple for both Results to Grid and Results to Text.
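If you do hit those limits, one workaround is to skip SSMS entirely and have PowerShell write each definition to disk. A rough sketch, assuming the SqlServer module, integrated security, and placeholder server and output names ('MyServer', C:\SSRS_Out):
# -MaxCharLength raises Invoke-Sqlcmd's default 4,000-character cap on returned strings
$rows = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'ReportServer' -MaxCharLength 10000000 -Query "SELECT [Path], CONVERT(varchar(max), CONVERT(varbinary(max), Content)) AS Rdl FROM dbo.Catalog WHERE Content IS NOT NULL AND Type = 2"
foreach ($row in $rows) {
    # Flatten the catalog path into a file name (assumes C:\SSRS_Out already exists)
    $name = ($row.Path.TrimStart('/') -replace '/', '_') + '.rdl'
    Set-Content -Path (Join-Path 'C:\SSRS_Out' $name) -Value $row.Rdl
}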

This is based on SQL2016/SSRS2016 but I think it should work for 2012.
SELECT 'http://mySQLServerName/reports/api/v1.0/catalogitems(' + cast(itemid as varchar(256))+ ')/Content/$value' AS url
FROM ReportServer.dbo.Catalog
This will give you a list of URLs, one for each report.
If the above does not work in SSRS 2012, go to Report Manager and act as if you were going to download the file from there. Check the URL on the download button and you'll probably see a URL with an item id embedded in it. Just adjust the above code to match that URL structure.
What you do with them after this is up to you.
Personally I would use the Chrome extension called 'Tab Save', available in the Chrome Web Store. You can simply copy and paste all the URLs created above into it and hit the download button...
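If you'd rather stay in PowerShell than use a browser extension, the same URL list can be fed to Invoke-WebRequest. A rough sketch, assuming integrated authentication and the query results saved to a urls.txt file (paths are placeholders):
Get-Content 'C:\temp\urls.txt' | ForEach-Object -Begin { $i = 0 } -Process {
    # -UseDefaultCredentials passes your Windows identity to the report server
    Invoke-WebRequest -Uri $_ -UseDefaultCredentials -OutFile ("C:\temp\report_{0}.rdl" -f $i)
    $i++
}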

Found and used this without any issues. Nothing to install; I just added my URL and pasted it into PowerShell.
https://microsoft-bitools.blogspot.com/2018/09/ssrs-snack-download-all-ssrs-reports.html
In case the link breaks, here's the code from the link:
###################################################################################
# Download Reports and DataSources from a SSRS server and create the same folder
# structure in the local download folder.
###################################################################################
# Parameters
###################################################################################
$downloadFolder = "c:\temp\ssrs\"
$ssrsServer = "http://myssrs.westeurope.cloudapp.azure.com"
###################################################################################
# If you can't use integrated security
#$secpasswd = ConvertTo-SecureString "MyPassword!" -AsPlainText -Force
#$mycreds = New-Object System.Management.Automation.PSCredential ("MyUser", $secpasswd)
#$ssrsProxy = New-WebServiceProxy -Uri "$($ssrsServer)/ReportServer/ReportService2010.asmx?WSDL" -Credential $mycreds
# SSRS Webserver call
$ssrsProxy = New-WebServiceProxy -Uri "$($ssrsServer)/ReportServer/ReportService2010.asmx?WSDL" -UseDefaultCredential
# List everything on the Report Server, recursively, but filter to keep Reports and DataSources
$ssrsItems = $ssrsProxy.ListChildren("/", $true) | Where-Object {$_.TypeName -eq "DataSource" -or $_.TypeName -eq "Report"}
# Loop through reports and data sources
Foreach($ssrsItem in $ssrsItems)
{
    # Determine extension for Reports and DataSources
    if ($ssrsItem.TypeName -eq "Report")
    {
        $extension = ".rdl"
    }
    else
    {
        $extension = ".rds"
    }
    # Write path to screen for debug purposes
    Write-Host "Downloading $($ssrsItem.Path)$($extension)";
    # Create download folder if it doesn't exist (concatenate: "c:\temp\ssrs\" and "/SSRSFolder/")
    $downloadFolderSub = $downloadFolder.Trim('\') + $ssrsItem.Path.Replace($ssrsItem.Name,"").Replace("/","\").Trim()
    New-Item -ItemType Directory -Path $downloadFolderSub -Force > $null
    # Get the SSRS item definition as a byte array
    $ssrsFile = New-Object System.Xml.XmlDocument
    [byte[]] $ssrsDefinition = $null
    $ssrsDefinition = $ssrsProxy.GetItemDefinition($ssrsItem.Path)
    # Load the bytes into an XML document and save it to disk
    [System.IO.MemoryStream] $memoryStream = New-Object System.IO.MemoryStream(@(,$ssrsDefinition))
    $ssrsFile.Load($memoryStream)
    $fullDataSourceFileName = $downloadFolderSub + "\" + $ssrsItem.Name + $extension;
    $ssrsFile.Save($fullDataSourceFileName);
}

I've tried several permutations of this script and keep getting the "can't create proxy connection" error. Here's the one that "should" work:
#------------------------------------------------------
#Prerequisites
#Install-Module -Name ReportingServicesTools
#------------------------------------------------------
#Let's export all catalog items from a single instance
#------------------------------------------------------
#Declare SSRS URI
$sourceRsUri = "http://hqmnbi:80/ReportServer_SQL08/ReportService2010.asmx?wsdl"
#Declare the proxy so we don't need to connect with every command
$proxy = New-RsWebServiceProxy -ReportServerUri $sourceRsUri
#Output ALL Catalog items to file system
Out-RsFolderContent -Proxy $proxy -RsFolder / -Destination 'C:\Users\arobinson\source\Workspaces\EDW\MAIN\SSRS\HQMNBI' -Recurse
This is the error I'm getting:
Failed to establish proxy connection to http://hqmnbi/ReportServer_SQL08/ReportService2010.asmx : The HTML document does not contain
Web service discovery information.
At C:\Program Files\WindowsPowerShell\Modules\ReportingServicesTools\0.0.6.6\Functions\Utilities\New-RsWebServiceProxy.ps1:136 char:9
throw (New-Object System.Exception("Failed to establish proxy ...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
CategoryInfo : OperationStopped: (:) [], Exception
FullyQualifiedErrorId : Failed to establish proxy connection to http://hqmnbi/ReportServer_SQL08/ReportService2010.asmx : The
HTML document does not contain Web service discovery information.
I've tried the URI with http:// and without, and I've tried including the port number, etc. Still can't get this to actually work. We have two other SSRS instances that I was able to run this against with no problem.
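One way to see why WSDL discovery fails (the error means the proxy received an HTML page rather than WSDL) is to fetch the endpoint directly and inspect the response; a diagnostic sketch:
$resp = Invoke-WebRequest -Uri 'http://hqmnbi/ReportServer_SQL08/ReportService2010.asmx?wsdl' -UseDefaultCredentials
# A healthy endpoint starts with <wsdl:definitions ...>; an HTML login or error page here explains the failure
$resp.Content.Substring(0, [Math]::Min(500, $resp.Content.Length))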

From this question: SQL Reporting Services - COPY reports to another folder
I found that this tool can both download and upload reports, and it lists folders and subfolders.
http://code.google.com/p/reportsync/

Related

Could Not Load Assembly MICROSOFT.SQLSERVER.BATCHPARSER on 32-BIT environment

I'm posting my problem here, hoping someone can help me figure out the issue.
For one of my clients I've developed a PS script that retrieves a table from a database and exports it as a CSV directly to Blob Storage. The script works fine in a 64-bit environment, but I cannot run it in a 32-bit environment, and I need to, because the scheduler the client uses is a 32-bit tool.
I've tried everything I've found around the net on this subject, with no luck.
I'm including a screenshot of both environments so you can see what I'm getting.
The green square is the expected result. The yellow one is the error I'm having.
The blue squares show both SqlServer modules I downloaded (x86 & x64).
I have the same behavior from a CMD shell.
So my questions are:
Is there any way to make this script work in a 32-bit environment?
Otherwise, is there any way to force a 32-bit CMD shell to open a 64-bit PowerShell session?
Here is the full PS script:
param (
    [String]$SourceServer = "",
    [String]$SourceDatabase = "",
    [String]$DestinationStorageAccountName = "",
    [String]$DestinationStorageAccountContainerName = "",
    [String]$DBUser = "",
    [String]$DBUserPWD = ""
)
FUNCTION Write-ToBlobStorage{
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)][String]$ResultString,
        [Parameter(Mandatory)][String]$DestinationStorageAccountName,
        [Parameter(Mandatory)][String]$DestinationStorageAccountContainerName,
        [Parameter(Mandatory)][String]$FileName
    )
    write-host "Clear existing identities to keep cache fresh"
    Clear-AzContext -Force
    write-host "Authenticate using the Managed identity"
    $account = Connect-AzAccount -Identity
    if(-not $account.Context.Subscription.Id)
    {
        write-error "Failed to authenticate with the Managed identity. Ensure the VM has a Managed identity enabled and is assigned the correct IAM roles"
        return
    }
    write-host "Get storage context"
    $context = New-AzStorageContext -StorageAccountName $DestinationStorageAccountName
    write-host "Get storage Container"
    $container = Get-AzStorageContainer -Name $DestinationStorageAccountContainerName -Context $context
    write-host "Writing Result to storage"
    $content = [System.Text.Encoding]::UTF8.GetBytes($ResultString)
    $container.CloudBlobContainer.GetBlockBlobReference("$FileName.csv").UploadFromByteArray($content, 0, $content.Length)
}
#Import-Module 'Az.KeyVault' -Force
#Import-Module -Name 'C:\Program Files\WindowsPowerShell\Modules\SqlServer' -Force
Import-Module -Name 'C:\Program Files (x86)\WindowsPowerShell\Modules\SqlServer' -Force
$TLS12Protocol = [System.Net.SecurityProtocolType] 'Ssl3 , Tls12'
[System.Net.ServicePointManager]::SecurityProtocol = $TLS12Protocol
$Query = "SELECT ##SERVERNAME"
$Result = Invoke-Sqlcmd -ServerInstance $SourceServer -Database $SourceDatabase -Query $Query | ConvertTo-Csv -Delimiter '|' -NoTypeInformation
$ResultString = $Result -join "`r`n"
Write-ToBlobStorage -ResultString $ResultString -DestinationStorageAccountName $DestinationStorageAccountName -DestinationStorageAccountContainerName $DestinationStorageAccountContainerName -FileName "TMP_Flux"
write-host "--- ALL DONE---"
And here is the error for the 32-bit run:
Invoke-Sqlcmd : Could not load file or assembly
'Microsoft.SqlServer.BatchParser, Version=15.100.0.0, Culture=neutral,
PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot
find the file specified.
At C:\temp\ExportToBlobScript\ExportToBlob.ps1:87 char:11
+ $Result = Invoke-Sqlcmd -ServerInstance $SourceServer -Database $Sour ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Invoke-Sqlcmd], FileNotFoundEx
ception
+ FullyQualifiedErrorId : System.IO.FileNotFoundException,Microsoft.SqlServ
er.Management.PowerShell.GetScriptCommand
Write-ToBlobStorage : Cannot bind argument to parameter 'ResultString' because
it is an empty string.
At C:\temp\ExportToBlobScript\ExportToBlob.ps1:91 char:35
+ Write-ToBlobStorage -ResultString $ResultString -DestinationStorageAc ...
+ ~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Write-ToBlobStorage], Parameter
BindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyStringNotAll
owed,Write-ToBlobStorage
--- ALL DONE---
And here is the result for the 64-bit run:
Clear existing identities to keep cache fresh
Authenticate using the Managed identity
Get storage context
Get storage Container
Writing Result to storage
--- ALL DONE---
Many thanks for all of your suggestions.
Not sure. But since you're only using Invoke-Sqlcmd to run a query, you can eliminate the dependency on the SqlServer PowerShell module by using ADO.NET directly from PowerShell. The SQL Server client libraries are part of the .NET Framework, so they will be available on any Windows box. So something like:
function Invoke-SqlCmd-Custom{
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)][String]$ServerInstance,
        [Parameter(Mandatory)][String]$Database,
        [Parameter(Mandatory)][String]$Query
    )
    $con = New-Object System.Data.SqlClient.SqlConnection
    $con.ConnectionString = "Server=$ServerInstance;Database=$Database;Integrated Security=true"
    try
    {
        $con.Open()
        $cmd = $con.CreateCommand()
        $cmd.CommandText = $Query
        $dt = New-Object System.Data.DataTable
        $rdr = $cmd.ExecuteReader()
        $dt.Load($rdr)
        return $dt.Rows
    }
    finally
    {
        $con.Close()
    }
}
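A hypothetical call, mirroring how the original script used Invoke-Sqlcmd:
$Result = Invoke-SqlCmd-Custom -ServerInstance $SourceServer -Database $SourceDatabase -Query "SELECT @@SERVERNAME AS ServerName"
$Result | ForEach-Object { $_.ServerName }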

Creating a CSV file to use for starting services on multiple servers

I have created a TSV file listing servers and services as follows:
Hostname Services
=========================
Server01 SP4AdminV4,SPTraceV4,SPWriterV4,WAS,W3SVC
Server02 SP4AdminV4,SPTraceV4,SPWriterV4,WAS,W3SVC,SPSearchHostController, OSearch16
PowerShell command
Import-csv C:\ServerServerList.tsv
$Services = $_.Services -Split ','
Start-Service -Computer 'Server01' -Name $Services
I then get the following error:
Start-Service: Cannot Bind Argument to parameter 'Name' because it is an empty string.
At Line:3 char:43
+ Start-Service -Computer $_.Hostname -Name $Services
+
+categoryinfo: InvalidData (:) [Start-Service], ParameterBindingValidationException
+FullyQualifiedErrorId: ParameterArgumentValidationErrorEmptyStringNotAllowed,
Microsoft.Powershell.Commands.StartServiceCommand
There are a couple of problems I can spot at a glance. First, you don't seem to be assigning $Services correctly: there's no loop, so $_ is empty. Second, Start-Service doesn't have a -ComputerName parameter.
To get around that, you can use Get-Service in conjunction with Start-Service, binding implicitly to the -InputObject parameter via the pipeline.
Import-csv C:\ServerServerList.tsv -Delimiter "`t" |
ForEach-Object{
    $Services = $_.Services -Split ','
    Get-Service -Computer $_.HostName -Name $Services
} |
Start-Service
I'm assuming this is a tab-separated file as you described, and that the services are listed in such a way that the split comes out right.
The loop sends [System.ServiceProcess.ServiceController] objects down the pipeline. Those are bound to the -InputObject parameter. Internally Start-Service uses the .MachineName property to make the change on the remote system.
Warning: Get/Set-Service doesn't always report errors properly in this kind of scenario. The primary obstacle I've encountered is misleading errors and/or silent failures in cases where the operator account doesn't have access to the remote system.
I'm sure there are easier ways to go about this but, here's what I got:
# Import the TSV and save it to a variable.
$CSV = Import-Csv -Path 'C:\ServerServerList.tsv' -Delimiter "`t"
# Use a foreach loop to iterate through the CSV
# one row at a time - assigning the current iteration to $Row.
foreach ($Row in $CSV) {
    # Split the services into an array.
    $Services = ($Row.Services -split ',').Trim()
    # Use Invoke-Command for the Start-Service call due to it not
    # having its own -ComputerName parameter to work with.
    Invoke-Command -ScriptBlock {
        # Start the services using the newly created array in $Services.
        # Note: we must use a *remote variable* to pass over our local variable
        # by specifying the $using: scope modifier.
        Start-Service -Name $using:Services -PassThru -WhatIf
    } -ComputerName $Row.Hostname
}
- This is untested -

Comparing ALL settings between two separate powershell windows in the ISE?

TL;DR - I have a window in the ISE where I somehow got my script to work. If I start a new window (aka session?) in the ISE and copy/paste the code, the script fails. I'm trying to figure out what is different between the sessions so that I can get it to reliably work. I'm hoping there's some setting or something that I set in one that's not in the other, but I'm clearing variables between runs (see below) so I just don't know.
Long version:
I have a PowerShell SMO/SQL Server script that works in one window in the ISE. It uses ScriptTransfer(), a method that comes with the SQLServer module.
When I create a new window (aka new session), copy/paste the code, and run it against one particular server/database, the ScriptTransfer fails. I have no idea how I managed to get it working, but I want to duplicate that. The code works on most of my databases but fails on one particular one, there's no easy way to figure out what's going wrong inside that method, and the error message I get back is:
Exception calling "ScriptTransfer" with "0" argument(s): "Script transfer failed. "
At line:68 char:1
+ $transfer.ScriptTransfer()
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : FailedOperationException
Here's the code. It works on most databases, but I've got one DB that's causing these errors. Including it here just to be complete.
Remove-Variable -Name * -ErrorAction SilentlyContinue #allows us to run repeatedly in same session for testing.
import-module sqlserver #yes, necessary, at least so that you know if the scripttransfer fails, you need to get a newer version.
$servername = "mydb"
$databasename = "myserver"
$workingdirectory ="c:\temp\backup"
"servername is $servername"
"databasename is $databasename"
"uploadtos3 is $uploadtoS3"
"accesskey is $accesskey"
"secretkey is $secretkey"
"workingdirectory is $workingdirectory"
"bucketname is $bucketname"
"regionname is $regionname"
"s3path is $s3path"
####################
# 0 - Declarations #
####################
$credentials = set-awscredentials -accesskey $accesskey -secretkey $secretkey
$sql_server = $servername
$DBToScript = $databasename
$path = $workingdirectory #I'd use a UNC path here
################################
# 1 - Scripting out the CREATE #
################################
$SMOServer = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList "$sql_server"
$db = $SMOServer.Databases[$DBToScript]
$create_database = $db.Script() -replace("\d+KB","100mb") -replace("\d+mb","100mb") -replace("\d+GB","100mb")
#while I couldn't get it to script out the GO terminators (ScriptBatchTerminator), it doesn't need it for the create.
$create_database |Out-File "$path\$($dbtoscript)_db_create.sql" -Force -Encoding ascii -Append
#using out-file since I can't figure out which setting uses filename.
#####################################
# 2 - Scripting out all the objects #
#####################################
#using transfer as it seems more straightforward.
$transfer = new-object -TypeName Microsoft.SqlServer.Management.Smo.Transfer #-ArgumentList $db
$transfer.Database = $db
$transfer.CopyAllObjects = $true
$transfer.CopySchema = $true
$transfer.CopyAllSchemas= $true
$transfer.Options.Indexes = $true
$transfer.Options.ScriptBatchTerminator = $true # this only goes to the file
$transfer.Options.filename = "$path\$($dbtoscript)_db_objects.sql"
$transfer.Options.ExtendedProperties= $true # yes, we want these
$transfer.Options.DRIAll= $true # and all the constraints
$transfer.Options.Indexes= $true # Yup, these would be nice
$transfer.Options.Triggers= $true # This should be included when scripting a database
$transfer.Options.ScriptBatchTerminator = $true # this only goes to the file
$transfer.Options.IncludeHeaders = $false #pshaw
$transfer.Options.ToFileOnly = $true #no need of string output as well
$transfer.Options.IncludeIfNotExists = $false # not necessary but it means the script can be more versatile
"test"
# This first ScriptTransfer MUST stay! Otherwise it leaves out the WITH EXECUTE AS CALLER on CLR SPs
$transfer.ScriptTransfer() |out-null
$transfer.ScriptTransfer()
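Not from the original post, but when ScriptTransfer throws this generic error, unwrapping the inner SMO exception will usually reveal the real reason; a small sketch:
try {
    $transfer.ScriptTransfer()
}
catch {
    # GetBaseException digs past the generic FailedOperationException wrapper
    Write-Host $_.Exception.GetBaseException().Message
}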

Using Invoke-Command, inside the script the methods and properties are not working

I have a file that contains records with the name of a server and the name of its SQL Server instance. For each record, I want to read the configured jobs and their status for the last run.
I set up a credential, then define the script block to gather the information, and finally call Invoke-Command. My problem is that where I expect to find the gathered information, I just get the names of the properties.
This is my script:
$a = Get-Credential
$scriptBlock={
    param($server,$instance) $OFS=","
    import-module sqlps
    cd SQLSERVER:
    cd SQL\$server\$instance\JobServer\Jobs
    foreach($CurJob in (ls)){
        write-output "$CurJob.DisplayName`n Last Run: $CurJob.LastRunDate`n Ultima Last Run Otcome:$CurJob.LastRunOutcome`n Next Run: $CurJob.NextRunDate"
    }
}
Import-Csv -Header Server, Instance -Delimiter "|" .\servers_with_jobs.txt | %{
    write-output "Inovoking with server: $_.Server/$_.Instance"
    $JobNumber=Invoke-Command -AsJob -ComputerName $_.Server -Credential $a -ScriptBlock $scriptBlock -ArgumentList $_.Server, $_.Instance
    Wait-Job $JobNumber
    $JobResult = Receive-Job $JobNumber
    $JobResult
}
And when I run it, I get the following:
Inovoking with server: #{Server=server01.company.net; Instance=INST1}.Server/#{Server=server01.company.net; Instance=INST1}.Instance
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
148 Job148 RemoteJob Completed True server01.e... ...
WARNING: The names of some imported commands from the module 'sqlps' include unapproved verbs that might make them less discoverable. To find the commands with
unapproved verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.
ReplicationJob1.DisplayName
Last Run: ReplicationJob1.LastRunDate
Ultima Last Run Otcome:ReplicationJob1.LastRunOutcome
Next Run: ReplicationJob1.NextRunDate
ReplicationJob2.DisplayName
Last Run: ReplicationJob2.LastRunDate
Ultima Last Run Otcome:ReplicationJob2.LastRunOutcome
Next Run: ReplicationJob2.NextRunDate
It seems as if the script resolves the name of the job via the ToString method. I even tried to print the type of the $CurJob object, but instead it shows something like the output of running Get-Member:
PSComputerName : server01.company.net
RunspaceId : 547127ac-ae58-4144-b769-507c6cf18b2e
Module : Microsoft.SqlServer.Smo.dll
Assembly : Microsoft.SqlServer.Smo, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
TypeHandle : System.RuntimeTypeHandle
BaseType : Microsoft.SqlServer.Management.Smo.Agent.AgentObjectBase
UnderlyingSystemType : Microsoft.SqlServer.Management.Smo.Agent.Job
FullName : Microsoft.SqlServer.Management.Smo.Agent.Job
AssemblyQualifiedName : Microsoft.SqlServer.Management.Smo.Agent.Job, Microsoft.SqlServer.Smo, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
Namespace : Microsoft.SqlServer.Management.Smo.Agent
GUID : c556994c-df26-35b1-8b70-78bce85a3e25
You need to wrap property accesses in a subexpression, $( ), when evaluating them inside a string. I didn't test this, but I think the example will do the trick.
Example
$a = Get-Credential
$scriptBlock={
    param($server,$instance) $OFS=","
    import-module sqlps
    cd SQLSERVER:
    cd SQL\$server\$instance\JobServer\Jobs
    foreach($CurJob in (ls)){
        write-output "$($CurJob.DisplayName)`n Last Run: $($CurJob.LastRunDate)`n Ultima Last Run Otcome:$($CurJob.LastRunOutcome)`n Next Run: $($CurJob.NextRunDate)"
    }
}
Import-Csv -Header Server, Instance -Delimiter "|" .\servers_with_jobs.txt | %{
    write-output "Inovoking with server: $($_.Server)/$($_.Instance)"
    $JobNumber=Invoke-Command -AsJob -ComputerName $_.Server -Credential $a -ScriptBlock $scriptBlock -ArgumentList $_.Server, $_.Instance
    Wait-Job $JobNumber
    $JobResult = Receive-Job $JobNumber
    $JobResult
}
A Better Way
Instead of embedding subexpressions in strings, use the -f operator to format a string.
Example
'Inovoking with server: {0}/{1}' -f $_.Server, $_.Instance

How to capture DacServices.Deploy output?

So I've managed to deploy our DACPAC schema via Octopus. I'm using a Deploy.ps1 script interacting with .NET objects, just like the article describes.
I'd like to make the deployment process more transparent by including the "standard output" you get from sqlcmd in our Octopus logs. I'm looking for the generated schema modification messages as well as any custom migration messages our developers have put into the pre/post scripts.
The only workaround I can think of is to first generate the script with the DAC services and then run it with sqlcmd.exe. Any ideas?
Found the solution; posting in case someone else runs across this. You simply need to subscribe to your DacServices object's Message event.
C# sample:
var services = new Microsoft.SqlServer.Dac.DacServices("data source=machinename;Database=ComicBookGuy;Trusted_connection=true");
var package = Microsoft.SqlServer.Dac.DacPackage.Load(@"C:\Database.dacpac");
var options = new Microsoft.SqlServer.Dac.DacDeployOptions();
options.DropObjectsNotInSource = true;
options.SqlCommandVariableValues.Add("LoginName", "SomeFakeLogin");
options.SqlCommandVariableValues.Add("LoginPassword", "foobar!");
services.Message += (object sender, Microsoft.SqlServer.Dac.DacMessageEventArgs eventArgs) => Console.WriteLine(eventArgs.Message.Message);
services.Deploy(package, "ComicBookGuy", true, options);
Powershell sample (executed by the Octopus Tentacle):
# This script is run by Octopus on the tentacle
$localDirectory = (Get-Location).Path
$targetServer = $OctopusParameters["SQL.TargetServer"]
$databaseName = "ComicBookGuy"
Add-Type -path "$localDirectory\lib\Microsoft.SqlServer.Dac.dll"
$dacServices = New-Object Microsoft.SqlServer.Dac.DacServices ("data source=" + $targetServer + ";Database=" + $databaseName + "; Trusted_connection=true")
$dacpacFile = "$localDirectory\Content\Unity.Quotes.Database.dacpac"
$dacPackage = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpacFile)
$options = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
$options.SqlCommandVariableValues.Add("LoginName", $OctopusParameters["SQL.LoginName"])
$options.SqlCommandVariableValues.Add("LoginPassword", $OctopusParameters["SQL.LoginPassword"])
$options.DropObjectsNotInSource = $true
Register-ObjectEvent -InputObject $dacServices -EventName "Message" -Action { Write-Host $EventArgs.Message.Message } | out-null
$dacServices.Deploy($dacPackage, $databaseName, $true, $options)
In the PowerShell version I couldn't get the handy "add_EventName" style of event subscription working, so I had to use the clunky cmdlet. Meh.
Use sqlpackage instead of sqlcmd to deploy the dacpac.
Get the latest version here: https://msdn.microsoft.com/en-us/mt186501
$sqlpackage = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\sqlpackage.exe"
It will automatically output errors to the console. We use a TFS build definition that calls PowerShell, and it is able to display errors that happened during a deploy.
Usage:
& $sqlpackage /Action:Publish /tsn:$dbServer /tdn:$database /sf:$mydacpac/pr:$dbProfile /variables:myVariable=1
This variation captures output but also lets you detect and react to deploy failures by catching the exception:
function Load-DacPacAssembly()
{
    $assemblyName = "Microsoft.SqlServer.Dac.dll"
    $packageFolder = <some custom code to find our package folder>
    $dacPacAssembly = "$packageFolder\lib\net46\$assemblyName"
    Write-Host "Loading assembly $assemblyName"
    Add-Type -Path "$dacPacAssembly"
}
function Publish-Dacpac($dacpac, $publishProfile){
    Load-DacPacAssembly
    Write-Host "Loading profile $publishProfile..."
    $dacProfile = [Microsoft.SqlServer.Dac.DacProfile]::Load($publishProfile)
    $dacService = New-Object Microsoft.SqlServer.Dac.DacServices ($dacProfile.TargetConnectionString)
    Write-Host "Loading dacpac $dacpac"
    $dacPackage = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpac)
    $event = Register-ObjectEvent -InputObject $dacService -EventName "Message" -Action {
        $message = $EventArgs.Message
        $colour = "DarkGray"
        # -match does a substring/regex test; -contains is a collection operator and would never fire here
        if ($message -match "Error SQL")
        {
            $colour = "Red"
        }
        Write-Host $message -ForegroundColor $colour
    }
    Write-Host "Publishing...."
    try {
        $dacService.Deploy($dacPackage, $dacProfile.TargetDatabaseName, $true, $dacProfile.DeployOptions)
    }
    catch [Microsoft.SqlServer.Dac.DacServicesException]
    {
        $message = $_.Exception.Message
        Write-Host "SQL Publish failed - $message" -ForegroundColor Red # Customise here for your build system to detect the error
        exit;
    }
    finally
    {
        Unregister-Event -SourceIdentifier $event.Name
    }
}
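A hypothetical call, assuming the publish profile sits next to the dacpac (paths are placeholders):
Publish-Dacpac -dacpac 'C:\drop\MyDatabase.dacpac' -publishProfile 'C:\drop\MyDatabase.publish.xml'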
