How to create an Azure SQL Managed Instance using the AzureRm.Sql PowerShell library?

I need to automate the creation of an Azure SQL Database Managed Instance using PowerShell scripts (AzureRm.Sql). What command should be used to create it?

If you have installed AzureRm.Sql and properly configured the network environment (VNet and subnet), you can use the following script, which deploys an 8-core "General Purpose" instance with 1024 GB of max storage:
Select-AzureRmSubscription -Subscription "60d9f1df-......"
$resourceGroup = "<resource group>"
$vNetName = "<VNet name"
$subnetName = "<subnet name>"
$instanceName = "<subnet name>"
$region = "South India"
$vNet = Get-AzureRmVirtualNetwork -Name $vNetName -ResourceGroupName $resourceGroup
$subnet = Get-AzureRmVirtualNetworkSubnetConfig -Name $subnetName -VirtualNetwork $vNet
$subnetId = $subnet.Id
New-AzureRmSqlManagedInstance -Name $instanceName `
-ResourceGroupName $resourceGroup -Location $region -SubnetId $subnetId `
-AdministratorCredential (Get-Credential) `
-StorageSizeInGB 1024 -VCore 8 -Edition "GeneralPurpose" `
-ComputeGeneration Gen5 -LicenseType BasePrice
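Once the deployment completes, a quick sanity check (a sketch using the same AzureRm.Sql module and the variables from the script above) is to read the instance back:
# Post-deployment check: list the managed instance that was just created
Get-AzureRmSqlManagedInstance -Name $instanceName -ResourceGroupName $resourceGroup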

Related

Is it possible to add Microsoft Graph delegated permissions to Azure AD app via Powershell?

I registered an application in Azure AD from PowerShell using the below script.
# To create new application
$myapp = New-AzureADApplication -DisplayName MyApp
$myappId=$myapp.AppId
# To set Application ID URI
Set-AzureADApplication -ApplicationId $myappId -IdentifierUris "api://$myappId"
# To retrieve details of new application
Get-AzureADApplication -Filter "DisplayName eq 'MyApp'"
Now I want to set delegated API permissions (Calendars.Read, Application.Read.All, Directory.Read.All) for this app.
From the Azure Portal, I know how to assign these, but is it possible to add these permissions via PowerShell? If yes, can anyone help me with the script or cmdlets?
Any help will be appreciated. Thank you.
Yes, it's possible to set delegated API permissions via PowerShell.
First, note the ObjectId of the new application, which can be retrieved with the cmdlet below:
Get-AzureADApplication -Filter "DisplayName eq 'MyApp'"
Check whether you have a service principal named "Microsoft Graph" using the cmdlet below:
Get-AzureADServicePrincipal -All $true | ? { $_.DisplayName -eq "Microsoft Graph" }
To assign API permissions via PowerShell, you need the GUIDs of those delegated permissions, which can be listed with the cmdlet below:
$MSGraph.Oauth2Permissions | FT ID, Value
Note the IDs of the required permissions (Calendars.Read, Application.Read.All and Directory.Read.All).
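For example, to list just the three GUIDs needed here (a sketch against the same $MSGraph object used in the full script):
$MSGraph.Oauth2Permissions |
    Where-Object { $_.Value -in 'Calendars.Read','Application.Read.All','Directory.Read.All' } |
    Format-Table Id, Value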
Please find the complete script below:
$myapp = New-AzureADApplication -DisplayName MyApp
$myappId=$myapp.ObjectId
Get-AzureADApplication -Filter "DisplayName eq 'MyApp'"
$MSGraph = Get-AzureADServicePrincipal -All $true | ? { $_.DisplayName -eq "Microsoft Graph" }
$MSGraph.Oauth2Permissions | FT ID, Value
# Create a Resource Access resource object and assign the service principal’s App ID to it.
$Graph = New-Object -TypeName "Microsoft.Open.AzureAD.Model.RequiredResourceAccess"
$Graph.ResourceAppId = $MSGraph.AppId
# Create a set of delegated permissions using noted IDs
$Per1 = New-Object -TypeName "Microsoft.Open.AzureAD.Model.ResourceAccess" -ArgumentList "c79f8feb-a9db-4090-85f9-90d820caa0eb","Scope"
$Per2 = New-Object -TypeName "Microsoft.Open.AzureAD.Model.ResourceAccess" -ArgumentList "465a38f9-76ea-45b9-9f34-9e8b0d4b0b42","Scope"
$Per3 = New-Object -TypeName "Microsoft.Open.AzureAD.Model.ResourceAccess" -ArgumentList "06da0dbc-49e2-44d2-8312-53f166ab848a","Scope"
$Graph.ResourceAccess = $Per1, $Per2, $Per3
# Set the above resource access object to your application ObjectId so permissions can be assigned.
Set-AzureADApplication -ObjectId $myappId -RequiredResourceAccess $Graph
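Optionally, you can read the assignment back to confirm it was applied (a sketch; note the permissions still require admin consent before they take effect):
(Get-AzureADApplication -ObjectId $myappId).RequiredResourceAccess | Format-List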
Reference:
How to assign Permissions to Azure AD App by using PowerShell?

Could Not Load Assembly MICROSOFT.SQLSERVER.BATCHPARSER on 32-BIT environment

I'm posting my problem here, hoping someone may help me figure out the issue.
For one of my clients I've developed a PS script that retrieves a table from a database and exports it as a CSV directly to a Blob Storage. My script works fine in a 64-bit environment. However, I cannot run it in a 32-bit environment, and I need to, because the scheduler used by the client is a 32-bit tool.
On my side, I've tried everything I've already found around the net on this subject, with no luck.
My problem, as I said above, is that I fail to run my script in a 32-bit environment. I'm including a screenshot of both environments so you can see what I'm getting.
The green square is the expected result. The yellow one is the error I'm having.
The blue squares show both SqlServer modules I downloaded (x86 & x64).
I have the same behavior from a CMD SHELL.
So my questions are:
Is there any way to make this script work in a 32-bit environment?
If not, is there any way to force a 32-bit CMD shell to open a 64-bit PowerShell session?
Here is the full PS script:
param (
[String]$SourceServer="" ,
[String]$SourceDatabase="" ,
[String]$DestinationStorageAccountName = "",
[String]$DestinationStorageAccountContainrerName= "",
[String]$DBUser = "",
[String]$DBUserPWD = ""
)
FUNCTION Write-ToBlobStorage{
[CmdletBinding()]
param (
[Parameter(Mandatory)][String]$ResultString,
[Parameter(Mandatory)][String]$DestinationStorageAccountName,
[Parameter(Mandatory)][String]$DestinationStorageAccountContainrerName,
[Parameter(Mandatory)][String]$FileName
)
write-host "Clear existing identies to keep cache fresh"
Clear-AzContext -force
write-host "Authenticate using the Managed identity"
$account = Connect-AzAccount -identity
if(-not $account.Context.Subscription.Id)
{
write-error "Failed to authenticate with the Managed identity. Ensure VM has a Managed identity enabled and is assigned the correct IAM roles"
return
}
write-host "Get storage context"
$context = New-AZStorageContext -StorageAccountName $DestinationStorageAccountName
write-host "Get storage Container"
$container=Get-AzStorageContainer -Name $DestinationStorageAccountContainrerName -Context $context
write-host "Writing Result to storage"
$content = [system.Text.Encoding]::UTF8.GetBytes($ResultString)
$container.CloudBlobContainer.GetBlockBlobReference("$FileName.csv").UploadFromByteArray($content,0,$content.Length)
}
#Import-Module 'Az.KeyVault' -Force
#Import-Module -Name 'C:\Program Files\WindowsPowerShell\Modules\SqlServer' -Force
Import-Module -Name 'C:\Program Files (x86)\WindowsPowerShell\Modules\SqlServer' -Force
$TLS12Protocol = [System.Net.SecurityProtocolType] 'Ssl3 , Tls12'
[System.Net.ServicePointManager]::SecurityProtocol = $TLS12Protocol
$Query = "SELECT ##SERVERNAME"
$Result = Invoke-Sqlcmd -ServerInstance $SourceServer -Database $SourceDatabase -Query $Query | ConvertTo-Csv -Delimiter '|' -NoTypeInformation
$ResultString = $Result -join "`r`n"
Write-ToBlobStorage -ResultString $ResultString -DestinationStorageAccountName $DestinationStorageAccountName -DestinationStorageAccountContainrerName $DestinationStorageAccountContainrerName -FileName "TMP_Flux"
write-host "--- ALL DONE---"
And Here is The error for the 32-Bit :
Invoke-Sqlcmd : Could not load file or assembly
'Microsoft.SqlServer.BatchParser, Version=15.100.0.0, Culture=neutral,
PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot
find the file specified.
At C:\temp\ExportToBlobScript\ExportToBlob.ps1:87 char:11
+ $Result = Invoke-Sqlcmd -ServerInstance $SourceServer -Database $Sour ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Invoke-Sqlcmd], FileNotFoundEx
ception
+ FullyQualifiedErrorId : System.IO.FileNotFoundException,Microsoft.SqlServ
er.Management.PowerShell.GetScriptCommand
Write-ToBlobStorage : Cannot bind argument to parameter 'ResultString' because
it is an empty string.
At C:\temp\ExportToBlobScript\ExportToBlob.ps1:91 char:35
+ Write-ToBlobStorage -ResultString $ResultString -DestinationStorageAc ...
+ ~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Write-ToBlobStorage], Parameter
BindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyStringNotAll
owed,Write-ToBlobStorage
--- ALL DONE---
And here is the result for the 64-bit run:
Clear existing identities to keep cache fresh
Authenticate using the Managed identity
Get storage context
Get storage Container
Writing Result to storage
--- ALL DONE---
Many thanks for all of your suggestions.
Not sure. But since you're only using Invoke-SqlCmd to run a query, you can eliminate the dependency on the SqlServer PowerShell module by using ADO.NET directly from PowerShell. The SQL Server client libraries are part of the .NET Framework, so they will be available on any Windows box. So something like:
function Invoke-SqlCmd-Custom{
[CmdletBinding()]
param (
[Parameter(Mandatory)][String]$ServerInstance,
[Parameter(Mandatory)][String]$Database,
[Parameter(Mandatory)][String]$Query
)
$con = new-object System.Data.SqlClient.SqlConnection
$con.ConnectionString = "Server=$ServerInstance;Database=$Database;Integrated Security=true"
try
{
$con.Open()
$cmd = $con.CreateCommand()
$cmd.CommandText = $Query
$dt = new-object System.Data.DataTable
$rdr = $cmd.ExecuteReader()
$dt.Load($rdr)
return $dt.Rows
}
finally
{
$con.Close()
}
}
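With that function defined at the top of the script, the original call could be swapped in roughly like this (a sketch reusing the parameters from your script; DataRow objects expose a few extra properties such as RowError and RowState, so you may want a Select-Object before converting):
$Result = Invoke-SqlCmd-Custom -ServerInstance $SourceServer -Database $SourceDatabase -Query $Query |
    ConvertTo-Csv -Delimiter '|' -NoTypeInformation
$ResultString = $Result -join "`r`n"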

powershell core 7.0.3 Az.Account PSADServicePrincipal appRoles and oauth2permissions

I have a Windows PowerShell 5.1 script block like the following that successfully retrieves appRoles and oauth2permissions.
import-module -name AzureAD
if ($null -eq $mycredentials) { $mycredentials = Get-Credential }
$azConnectionContext = Connect-AzureAD -Credential $mycredentials
# $svp = Get-AzureADServicePrincipal -Filter "DisplayName -eq 'Microsoft Graph'" # not working, why ???
$svp = Get-AzureADServicePrincipal -All $true | ? { $_.DisplayName -eq 'Microsoft Graph' }
$appRoles = $svp.AppRoles; $oauth2permissions = $svp.Oauth2Permissions
I'm trying to convert this to a PowerShell Core 7 script block like the following, and I'm unable to retrieve appRoles and oauth2permissions because the PSADServicePrincipal type doesn't expose those properties.
import-module -name Az.Accounts
$azConnectionContext = Connect-AzAccount
$svp = Get-AzADServicePrincipal -DisplayName 'Microsoft Graph'
$appRoles = $svp.AppRoles; $oauth2permissions = $svp.Oauth2Permissions # both of these fail
Question: does anyone know how I can get at Azure AD service principal appRoles and oauth2permissions using PowerShell Core 7.0.3 APIs and types?
Instead of:
$svp.AppRoles;
Use:
$svp.AppRole;
$svp.Oauth2Permissions should work. Type should be Microsoft.Azure.PowerShell.Cmdlets.Resources.Models.Api16.OAuth2Permission.
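Putting that together with the original Core 7 block, a corrected sketch (property names as described above; not verified against every Az version) looks like:
Import-Module Az.Accounts   # Get-AzADServicePrincipal itself ships in Az.Resources
$azConnectionContext = Connect-AzAccount
$svp = Get-AzADServicePrincipal -DisplayName 'Microsoft Graph'
$appRoles = $svp.AppRole                        # note: singular property name in Az
$oauth2permissions = $svp.Oauth2Permissions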
To your last question: Remove-Module only removes a module from the current session. You need to run Uninstall-Module instead.
There is an Uninstall-AzModule function here that might help you: https://learn.microsoft.com/en-us/powershell/azure/uninstall-az-ps?view=azps-4.6.1
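The difference in a quick sketch (the module name is only an example):
Remove-Module AzureAD                   # unloads the module from the current session only
Uninstall-Module AzureAD -AllVersions   # removes the installed module from the machine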

Microsoft Teams integration into Moodle LMS

Hi everybody! I have an error during the integration of Microsoft Teams into Moodle. One of the steps of the integration is running the Moodle-AzureAD-Script.ps1 script on a local machine. When I try running the script I get an error:
"./Moodle-AzureAD-Script.ps1: line 1: syntax error near unexpected token `newline'
./Moodle-AzureAD-Script.ps1: line 1: `<#'"
It doesn't depend on the OS, so can somebody tell me what I should do? Thanks!
The Moodle-AzureAD-Script.ps1 script is in the download from https://moodle.org/plugins/pluginversions.php?plugin=local_o365
<#
File Name : Moodle-AzureAD-Script.ps1
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the MIT License.
#>
# Allow for the script to be run
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope CurrentUser
# Install necessary modules
Install-Module AzureAD -AllowClobber -Scope CurrentUser
Install-Module AzureRM -AllowClobber -Scope CurrentUser
#Overarching requirement - log into Azure first!
Connect-AzureAD
<#
.DESCRIPTION
This function will be able to create an array of type RequiredResourceAccess which will be then passed to the New-AzureADApplication cmdlet
#>
function Get-Resources
{
[Microsoft.Open.AzureAD.Model.RequiredResourceAccess[]] $outputArray = @();
$localPath = Get-Location
$jsonPath = -Join($localPath,'\Json\permissions.json');
$jsonObj = (New-Object System.Net.WebClient).DownloadString($jsonPath) | ConvertFrom-Json;
# Output the number of objects to push into the array outputArray
Write-Host 'From the json path:'$jsonPath', we can find' $jsonObj.requiredResourceAccess.length'attributes to populate';
for ($i = 0; $i -lt $jsonObj.requiredResourceAccess.length; $i++) {
# Step A - Create a new object fo the type RequiredResourceAccess
$reqResourceAccess = New-Object -TypeName Microsoft.Open.AzureAD.Model.RequiredResourceAccess;
# Step B - Straightforward setting the ResourceAppId accordingly
$reqResourceAccess.ResourceAppId = $jsonObj.requiredResourceAccess[$i].resourceAppId;
# Step C - Having to set the ResourceAccess carefully
if ($jsonObj.requiredResourceAccess[$i].resourceAccess.length -gt 1)
{
$reqResourceAccess.ResourceAccess = $jsonObj.requiredResourceAccess[$i].resourceAccess;
}
else
{
$reqResourceAccess.ResourceAccess = $jsonObj.requiredResourceAccess[$i].resourceAccess[0];
}
# Step D - Add the element to the array
$outputArray += $reqResourceAccess;
}
$outputArray;
}
# Step 1 - Getting the necessary information
$displayName = Read-Host -Prompt "Enter the AAD app name (ex: Moodle plugin)"
$moodleDomain = Read-Host -Prompt "Enter the URL of your Moodle server (ex: https://www.moodleserver.com)"
if ($moodleDomain -notmatch '.+?\/$')
{
$moodleDomain += '/'
}
# Step 2 - Construct the reply URLs
$ssoEndUrl = $moodleDomain + 'local/o365/sso_end.php'
$ssoUrl = $moodleDomain + 'local/o365/sso.php'
$ssoLogoutUrl = $moodleDomain + 'local/o365/sso_logout.php'
$botFrameworkUrl = 'https://token.botframework.com/.auth/web/redirect'
$authUrl = $moodleDomain + 'auth/oidc/'
$replyUrls = ($ssoEndUrl, $ssoUrl, $botFrameworkUrl, $authUrl)
# Step 3 - Compile the Required Resource Access object
[Microsoft.Open.AzureAD.Model.RequiredResourceAccess[]] $requiredResourceAccess = Get-Resources
# Step 4 - Making sure to officially register the application
$appVars = New-AzureADApplication -DisplayName $displayName -ReplyUrls $replyUrls -RequiredResourceAccess $requiredResourceAccess -LogoutUrl $ssoLogoutUrl
# Step 5 - Taking the object id generated in Step 2, create a new Password
$pwdVars = New-AzureADApplicationPasswordCredential -ObjectId $appVars.ObjectId
# Step 5a - Updating the logo for the Azure AD app
$location = Get-Location
$imgLocation = -Join($location, '\Assets\moodle-logo.jpg')
Set-AzureADApplicationLogo -ObjectId $appVars.ObjectId -FilePath $imgLocation
# Step 6 - Write out the newly generated app Id and azure app password
Write-Host 'Your AD Application ID: '$appVars.AppId
Write-Host 'Your AD Application Secret: '$pwdVars.Value
In fact, the script does depend on the OS.
Please pay attention to the README.md file under the \o365\scripts path.
It states that:
Requirements
This script requires a Windows 7+ device. MacOS/Linux devices are NOT supported.
This script is only compatible with Windows PowerShell 5, which is pre-installed on each Windows 7+ device. PowerShell 6+ is NOT supported.
Make sure the OS and PowerShell version are OK, and then you will be able to follow the guide in this README.md file to finish the integration.
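A quick way to confirm both requirements before running the script (a sketch, not part of the original README):
$PSVersionTable.PSVersion           # should report major version 5 (Windows PowerShell)
[System.Environment]::OSVersion     # should be a Windows 7+ build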

Download all SSRS reports

I want to get a copy of all .rdl files in one server.
I can do the download manually one report at a time, but this is time consuming, especially since this server has around 1500 reports.
Is there any way or any tool that allows me to download all the .rdl files and take a copy of them?
There is a complete & simpler way to do this using PowerShell.
This code will export ALL report content in the exact same structure as the report server. Take a look at the GitHub wiki for other options & commands.
#------------------------------------------------------
#Prerequisites
#Install-Module -Name ReportingServicesTools
#------------------------------------------------------
#Lets get security on all folders in a single instance
#------------------------------------------------------
#Declare SSRS URI
$sourceRsUri = 'http://ReportServerURL/ReportServer/ReportService2010.asmx?wsdl'
#Declare Proxy so we don't need to connect with every command
$proxy = New-RsWebServiceProxy -ReportServerUri $sourceRsUri
#Output ALL Catalog items to file system
Out-RsFolderContent -Proxy $proxy -RsFolder / -Destination 'C:\SSRS_Out' -Recurse
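If you only need part of the tree, the same cmdlet can be pointed at a single folder (the folder name below is just a placeholder):
#Sketch: export only one folder instead of the whole server
Out-RsFolderContent -Proxy $proxy -RsFolder '/Finance' -Destination 'C:\SSRS_Out\Finance' -Recurse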
I've created this PowerShell script to copy them into a ZIP. You have to provide the SQL Server database details.
Add-Type -AssemblyName "System.IO.Compression.Filesystem"
$dataSource = "SQLSERVER"
$user = "sa"
$pass = "sqlpassword"
$database = "ReportServer"
$connectionString = "Server=$dataSource;uid=$user; pwd=$pass;Database=$database;Integrated Security=False;"
$tempfolder = "$env:TEMP\Reports"
$zipfile = $PSScriptRoot + '\reports.zip'
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$allreports = $connection.CreateCommand()
$allreports.CommandText = "SELECT ItemID, Path, CASE WHEN Type = 2 THEN '.rdl' ELSE '.rds' END AS Ext FROM Catalog WHERE Type IN(2,5)"
$result = $allreports.ExecuteReader()
$reportable = new-object "System.Data.DataTable"
$reportable.Load($result)
[int]$objects = $reportable.Rows.Count
foreach ($report in $reportable) {
$cmd = $connection.CreateCommand()
$cmd.CommandText = "SELECT CAST(CAST(Content AS VARBINARY(MAX)) AS XML) FROM Catalog WHERE ItemID = '" + $report[0] + "'"
$xmldata = [string]$cmd.ExecuteScalar()
$filename = $tempfolder + $report["Path"].Replace('/', '\') + $report["Ext"]
New-Item $filename -Force | Out-Null
Set-Content -Path ($filename) -Value $xmldata -Force
Write-Host "$($objects.ToString()).$($report["Path"])"
$objects -= 1
}
Write-Host "Compressing to zip file..."
if (Test-Path $zipfile) {
Remove-Item $zipfile
}
[IO.Compression.Zipfile]::CreateFromDirectory($tempfolder, $zipfile)
Write-Host "Removing temporarly data"
Remove-Item -LiteralPath $tempfolder -Force -Recurse
Invoke-Item $zipfile
If you just need this for backup purposes or something similar, this might be useful: Where does a published RDL file sit?
The relevant query from that thread is:
select convert(varchar(max), convert(varbinary(max), content))
from catalog
where content is not null
The original answer was using 2005, and I've used it on 2016, so I imagine it should work for 2008 and 2012.
When I had to use this, I added in the Path to the query as well, so that I knew which report was which.
CAVEAT: prior to SSMS v18, Results to Grid is limited to 64KB per tuple and Results to Text are limited to 8,192 characters per tuple. If your report definition is larger than these limits you will not be able to get the entire definition.
In SSMS v18, those limits have been increased to 2MB per tuple for both Results to Grid and Results to Text.
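If your definitions exceed those limits, one way around SSMS entirely is to run the same query from PowerShell and write each definition straight to disk (a sketch; the server, database and output path are placeholders, and it assumes the SqlServer module is installed):
# Sketch: export every report definition to .rdl files, bypassing SSMS output limits
Import-Module SqlServer
$query = "SELECT [Path], CONVERT(varchar(max), CONVERT(varbinary(max), Content)) AS Rdl
          FROM dbo.Catalog WHERE Content IS NOT NULL AND [Type] = 2"
$rows = Invoke-Sqlcmd -ServerInstance 'myReportSqlServer' -Database 'ReportServer' `
                      -Query $query -MaxCharLength 10000000
New-Item -ItemType Directory -Path 'C:\SSRS_Export' -Force | Out-Null
foreach ($row in $rows) {
    # Flatten the catalog path into a file name, e.g. /Sales/Report1 -> Sales_Report1.rdl
    $file = Join-Path 'C:\SSRS_Export' (($row.Path.Trim('/') -replace '/', '_') + '.rdl')
    Set-Content -Path $file -Value $row.Rdl
}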
This is based on SQL2016/SSRS2016 but I think it should work for 2012.
SELECT 'http://mySQLServerName/reports/api/v1.0/catalogitems(' + cast(itemid as varchar(256))+ ')/Content/$value' AS url
FROM ReportServer.dbo.Catalog
This will give you a list of URL's, one for each report.
If the above did not work in SSRS 2012, then go to Report Manager and act as if you were going to download the file from there. Check the URL on the download button and you'll probably see a URL with an item id embedded in it. Just adjust the above code to match that URL structure.
What you do with them after this is up to you.
Personally I would use the Chrome extension called 'Tab Save', available in the Chrome store. You can simply copy and paste all the URLs created above into it and hit the download button...
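If you'd rather stay in PowerShell than use a browser extension, the same URL list can be fed to Invoke-WebRequest (a sketch with the report Name added for the output file name; server names and output path are placeholders, and it assumes Windows authentication against the report server):
# Sketch: build the URL list from the Catalog table and download each report
$rows = Invoke-Sqlcmd -ServerInstance 'mySQLServerName' -Database 'ReportServer' -Query "
    SELECT Name,
           'http://mySQLServerName/reports/api/v1.0/catalogitems(' + CAST(ItemID AS varchar(256)) + ')/Content/`$value' AS url
    FROM dbo.Catalog WHERE [Type] = 2"
New-Item -ItemType Directory -Path 'C:\SSRS_Download' -Force | Out-Null
foreach ($r in $rows) {
    Invoke-WebRequest -Uri $r.url -UseDefaultCredentials -OutFile (Join-Path 'C:\SSRS_Download' ($r.Name + '.rdl'))
}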
Found and used this without any issues. Nothing to install; just added my URL and pasted it into PowerShell.
https://microsoft-bitools.blogspot.com/2018/09/ssrs-snack-download-all-ssrs-reports.html
In case the link breaks, here's the code from the link:
###################################################################################
# Download Reports and DataSources from a SSRS server and create the same folder
# structure in the local download folder.
###################################################################################
# Parameters
###################################################################################
$downloadFolder = "c:\temp\ssrs\"
$ssrsServer = "http://myssrs.westeurope.cloudapp.azure.com"
###################################################################################
# If you can't use integrated security
#$secpasswd = ConvertTo-SecureString "MyPassword!" -AsPlainText -Force
#$mycreds = New-Object System.Management.Automation.PSCredential ("MyUser", $secpasswd)
#$ssrsProxy = New-WebServiceProxy -Uri "$($ssrsServer)/ReportServer/ReportService2010.asmx?WSDL" -Credential $mycreds
# SSRS Webserver call
$ssrsProxy = New-WebServiceProxy -Uri "$($ssrsServer)/ReportServer/ReportService2010.asmx?WSDL" -UseDefaultCredential
# List everything on the Report Server, recursively, but filter to keep Reports and DataSources
$ssrsItems = $ssrsProxy.ListChildren("/", $true) | Where-Object {$_.TypeName -eq "DataSource" -or $_.TypeName -eq "Report"}
# Loop through reports and data sources
Foreach($ssrsItem in $ssrsItems)
{
# Determine extension for Reports and DataSources
if ($ssrsItem.TypeName -eq "Report")
{
$extension = ".rdl"
}
else
{
$extension = ".rds"
}
# Write path to screen for debug purposes
Write-Host "Downloading $($ssrsItem.Path)$($extension)";
# Create download folder if it doesn't exist (concatenate: "c:\temp\ssrs\" and "/SSRSFolder/")
$downloadFolderSub = $downloadFolder.Trim('\') + $ssrsItem.Path.Replace($ssrsItem.Name,"").Replace("/","\").Trim()
New-Item -ItemType Directory -Path $downloadFolderSub -Force > $null
# Get SSRS file bytes in a variable
$ssrsFile = New-Object System.Xml.XmlDocument
[byte[]] $ssrsDefinition = $null
$ssrsDefinition = $ssrsProxy.GetItemDefinition($ssrsItem.Path)
# Download the actual bytes
[System.IO.MemoryStream] $memoryStream = New-Object System.IO.MemoryStream(@(,$ssrsDefinition))
$ssrsFile.Load($memoryStream)
$fullDataSourceFileName = $downloadFolderSub + "\" + $ssrsItem.Name + $extension;
$ssrsFile.Save($fullDataSourceFileName);
}
I've tried several permutations of this script and keep getting the "can't create proxy connection" error. Here's the one that "should" work:
#------------------------------------------------------
#Prerequisites
#Install-Module -Name ReportingServicesTools
#------------------------------------------------------
#Lets get security on all folders in a single instance
#------------------------------------------------------
#Declare SSRS URI
$sourceRsUri = "http://hqmnbi:80/ReportServer_SQL08/ReportService2010.asmx?wsdl"
#Declare Proxy so we don't need to connect with every command
$proxy = New-RsWebServiceProxy -ReportServerUri $sourceRsUri
#Output ALL Catalog items to file system
Out-RsFolderContent -Proxy $proxy -RsFolder / -Destination 'C:\Users\arobinson\source\Workspaces\EDW\MAIN\SSRS\HQMNBI' -Recurse
This is the error I'm getting:
Failed to establish proxy connection to http://hqmnbi/ReportServer_SQL08/ReportService2010.asmx : The HTML document does not contain
Web service discovery information.
At C:\Program Files\WindowsPowerShell\Modules\ReportingServicesTools\0.0.6.6\Functions\Utilities\New-RsWebServiceProxy.ps1:136 char:9
throw (New-Object System.Exception("Failed to establish proxy ...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
CategoryInfo : OperationStopped: (:) [], Exception
FullyQualifiedErrorId : Failed to establish proxy connection to http://hqmnbi/ReportServer_SQL08/ReportService2010.asmx : The
HTML document does not contain Web service discovery information.
I've tried the URI with http:// and without, and I've tried including the port number, etc. I still can't get this to actually work. We have two other SSRS instances that I was able to run this against with no problem.
From this question: SQL Reporting Services - COPY reports to another folder
I found this tool can both download and upload reports. Plus it lists out folders and subfolders.
http://code.google.com/p/reportsync/
