How to back up multiple databases in a script - database

I need to back up several Azure SQL databases that are on the same server.
I currently have a PowerShell script that makes the backups, but only one at a time,
and I have to keep changing the backup names by hand.
This is my code:
Import-Module $PSScriptRoot\..\util\utilConnection.ps1;
Import-Module $PSScriptRoot\..\util\utilDate.ps1;
#Import-Module $PSScriptRoot\..\logging\Logging_Functions.ps1;
Import-Module AzureRM.sql
$TIMESTAMP = getTimeStamp;
#$LogPath = getPathLog;
#$logFileName = "prueba_jobDatabaseBackup.log";
#Log-Start -LogPath $LogPath -LogName $logFileName -ScriptVersion "1.5"
#return;
#Login-AzureRmAccount
loginRMAccount;
#Set subscription Azure
prueba;
Write-Output "";
#Create credential Source DB Server (QA)
#$myPasswordDB = ConvertTo-SecureString $SQL_ACCOUNT_PASSWORD_QA -AsPlainText -Force;
#$myCredentialDB = New-Object System.Management.Automation.PSCredential ($SQL_ACCOUNT_NAME_QA, $myPasswordDB);
#$sqlCredential = Get-Credential -Credential $myCredentialDB;
#Create credential Source DB Server (Prod)
$myPasswordDB = ConvertTo-SecureString $SQL_ACCOUNT_PASSWORD_QA -AsPlainText -Force;
$myCredentialDB = New-Object System.Management.Automation.PSCredential ($SQL_ACCOUNT_NAME_QA, $myPasswordDB);
$sqlCredential = Get-Credential -Credential $myCredentialDB;
$resourceGroup = "resGroupDB";
$serverName = "domserverqa";
$database = "prueba"; **// here I have to change the name of the backup file**
$primarykey = $STORAGE_ACCOUNT_BACKUP_KEY; #strdatabasebackup
$StorageUri = ("https://strdatabasebackup.blob.core.windows.net/strdatabasebackupblob/(2018-01-09-07:00)dbdom_buin.bacpac"); # here I also have to change the final name individually for each database
#$sqlCredential = getCredentialSQLServerQA; #SQL Server target
$SQL_SERVER_FULLNAME_QA = getSQLServerFullNameAzureQA;
$TIMEOUT = 300;
$importRequest = New-AzureRmSqlDatabaseImport -ResourceGroupName $resourceGroup -ServerName $serverName -DatabaseName $database -StorageKeytype StorageAccessKey -StorageKey $primarykey -StorageUri $StorageUri -AdministratorLogin $sqlCredential.UserName -AdministratorLoginPassword $sqlCredential.Password -Edition Basic -ServiceObjectiveName basic -DatabaseMaxSizeBytes 2147483648 # 2 GB = 2 * 1024 * 1024 * 1024 bytes
$importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink;
while ($importStatus.Status -eq "InProgress")
{
$importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink;
Write-Output ".";
[System.Threading.Thread]::Sleep(2000);
}
[System.Threading.Thread]::Sleep(4000);
How can I implement a foreach or an array so I can list all the databases and back them up one by one without having to do it manually?
If someone has any ideas, please help. Thanks.

Pass your DB names as a list to a ForEach loop or a function.
Just pass in a list of database names in a ForEach loop, passing the DB name in a variable to your code.
$AzureDBNames = 'AzureDB01','AzureDB02','AzureDB03'
ForEach ($AzureDBName in $AzureDBNames)
{
# Code begins here
"Backing up $AzureDBName"
}
Turn your code into a function with a parameter that accepts one or more db names.
Function New-AzureDBBackup
{
[CmdletBinding()]
[Alias('NABB')]
Param
(
[string[]]$AzureDBNames
)
# Code begins here
}
New-AzureDBBackup -AzureDBNames 'AzureDB01','AzureDB02','AzureDB03'
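As a concrete illustration, here is a minimal sketch that wraps the import call from the question in such a loop. The database names and storage container URL are placeholders, and the surrounding values ($resourceGroup, $serverName, $primarykey, $sqlCredential, $TIMESTAMP) are assumed to be set up exactly as in the original script. Note that New-AzureRmSqlDatabaseImport restores a .bacpac into a database; if the goal is to create .bacpac backup files, New-AzureRmSqlDatabaseExport is the corresponding export cmdlet and takes essentially the same parameters.
# Sketch: loop the existing code over several databases (names are placeholders)
$databases = 'prueba', 'dbdom_buin', 'otherdb'
$containerUrl = 'https://strdatabasebackup.blob.core.windows.net/strdatabasebackupblob'
foreach ($database in $databases)
{
    # Build a per-database blob name so each backup file is unique
    $StorageUri = "$containerUrl/($TIMESTAMP)$database.bacpac"
    Write-Output "Processing $database -> $StorageUri"
    $importRequest = New-AzureRmSqlDatabaseImport -ResourceGroupName $resourceGroup `
        -ServerName $serverName -DatabaseName $database `
        -StorageKeytype StorageAccessKey -StorageKey $primarykey -StorageUri $StorageUri `
        -AdministratorLogin $sqlCredential.UserName -AdministratorLoginPassword $sqlCredential.Password `
        -Edition Basic -ServiceObjectiveName basic -DatabaseMaxSizeBytes 2147483648
    # Wait for this database to finish before starting the next one
    $importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    while ($importStatus.Status -eq "InProgress")
    {
        Start-Sleep -Seconds 2
        $importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    }
}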
Read the online help on:
About_Functions
About_For
About_Loops
About_Variables

Related

How to remove encryption from all objects in SQL Server?

I have more than a hundred encrypted procedures and functions that I want to decrypt (I am trying a bacpac file export, but it fails because the procedures are encrypted). I tried using the dbForge SQL Decryptor decryption wizard for an in-place ALTER, but I get the error:
Definition is invalid. Can't find CREATE keyword.
When I try to see the DDL script of a stored procedure (using dbForge SQL Decryptor), I get the error:
A definition for the object dbo.pt_blocks cannot be shown because it is encrypted by a third party tool
I cannot find a resolution to this. Are there any solutions or other tools available for this?
Edit: I found this resource, which mentions:
take the source code and issue an ALTER command without the encryption option. Just take the source code and remove the WITH ENCRYPTION
How could I achieve this?
EDIT: I have enabled remote DAC. How can I decrypt everything? The accepted answer to this question has a broken link.
Edit: The problem has been solved by uninstalling a third party tool which was creating encrypted procedures.
Below is a PowerShell example that creates a script file of all encrypted objects, gleaned from Paul White's The Internals of WITH ENCRYPTION article. Change the data source and initial catalog in the 2 connection strings to the desired server and database as well as script file path.
A DAC connection is used to retrieve values from system tables so sysadmin server role membership is required. If run remotely, the SQL Server remote admin connections option must be enabled and TCP port 1434 allowed through the firewall.
The script can be run from the PowerShell ISE or from a command prompt after customization. Example command-line invocation, assuming script was saved to file "Decrypt-Objects.ps1".
powershell -ExecutionPolicy RemoteSigned -File C:\PowershellScripts\Decrypt-Objects.ps1
PowerShell script:
# PowerShell implementation of T-SQL code from https://sqlperformance.com/2016/05/sql-performance/the-internals-of-with-encryption
Function Get-DecryptedString($pwd, $data) {
$key = [System.Array]::CreateInstance([int], 256)
$box = [System.Array]::CreateInstance([int], 256)
$cipher = [System.Array]::CreateInstance([byte], $data.Length)
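# RC4 key-scheduling algorithm (KSA): seed the key array from the password and scramble the permutation box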
for ($i = 0; $i -lt 256; ++$i) {
$key[$i] = $pwd[$i % $pwd.Length]
$box[$i] = $i
}
for ($j = $i = 0; $i -lt 256; ++$i) {
$j = ($j + $box[$i] + $key[$i]) % 256
$tmp = $box[$i]
$box[$i] = $box[$j]
$box[$j] = $tmp
}
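# RC4 pseudo-random generation (PRGA): generate the keystream and XOR it with the encrypted bytes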
for ($a = $j = $i = 0; $i -lt $data.Length; ++$i) {
++$a
$a %= 256
$j += $box[$a]
$j %= 256
$tmp = $box[$a]
$box[$a] = $box[$j]
$box[$j] = $tmp
$k = $box[(($box[$a] + $box[$j]) % 256)]
$cipher[$i] = ($data[$i] -bxor $k)
}
$decryptedString = [System.Text.Encoding]::Unicode.GetString($cipher)
return $decryptedString
}
Function Get-ClearObjectText($connectionString, $objectName) {
$getRc4KeyQuery = #"
DECLARE
#objectid integer = OBJECT_ID(#ObjectName),
#family_guid binary(16),
#objid binary(4),
#subobjid binary(2);
-- Find the database family GUID
SELECT #family_guid = CONVERT(binary(16), DRS.family_guid)
FROM sys.database_recovery_status AS DRS
WHERE DRS.database_id = DB_ID();
-- Convert object ID to little-endian binary(4)
SET #objid = CONVERT(binary(4), REVERSE(CONVERT(binary(4), #objectid)));
SELECT
-- Read the encrypted value
#imageval = SOV.imageval,
-- Get the subobjid and convert to little-endian binary
#subobjid = CONVERT(binary(2), REVERSE(CONVERT(binary(2), SOV.subobjid)))
FROM sys.sysobjvalues AS SOV
WHERE
SOV.[objid] = #objectid
AND SOV.valclass = 1;
-- Compute the RC4 initialization key
SELECT #RC4key = HASHBYTES('SHA1', #family_guid + #objid + #subobjid);
"#
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$connection.Open()
$command = New-Object System.Data.SqlClient.SqlCommand($getRc4KeyQuery, $connection)
($command.Parameters.Add("#ObjectName", [System.Data.SqlDbType]::NVarChar, 261)).Value = $objectName
($command.Parameters.Add("#imageval", [System.Data.SqlDbType]::VarBinary, -1)).Direction = [System.Data.ParameterDirection]::Output
($command.Parameters.Add("#RC4key", [System.Data.SqlDbType]::Binary, 20)).Direction = [System.Data.ParameterDirection]::Output
[void]$command.ExecuteNonQuery()
$imageval = $command.Parameters["#imageval"].Value
$RC4key = $command.Parameters["#RC4key"].Value
$connection.Close()
$decryptedString = Get-DecryptedString -pwd $RC4key -data $imageval
Return $decryptedString
}
# ############
# ### MAIN ###
# ############
# DAC connection string for decryption
$dacConnectionString = "Data Source=admin:YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI"
# normal connection string for encrypted object list
$connectionString = "Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI"
# target file path for clear encrypted objects DDL
$scriptFilePath = "C:\Scripts\EncryptedObjects.sql"
[void](New-Item -Path $scriptFilePath -ItemType file -Force) # create directory (if needed) and empty script file
$EncryptedObjectQuery = #"
SELECT
QUOTENAME(OBJECT_SCHEMA_NAME(object_id)) + '.' + QUOTENAME(name) AS QualifiedObjectName
FROM sys.objects
WHERE OBJECTPROPERTY(object_id, 'IsEncrypted') = 1;
"#
try {
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$command = New-Object System.Data.SqlClient.SqlCommand($EncryptedObjectQuery, $connection)
$connection.Open()
$reader = $command.ExecuteReader()
while ($reader.Read()) {
$createObjectScript = Get-ClearObjectText -connectionString $dacConnectionString -objectName $reader["QualifiedObjectName"]
$createObjectScript | Out-File -FilePath $scriptFilePath -Append
"GO" | Out-File -FilePath $scriptFilePath -Append
}
$connection.Close()
}
catch {
throw
}

How to pass a CSV file as a query param in an API call using PowerShell

I have developed two simple PS scripts that work fine separately. Script1 connects to a DB, runs a SQL query, and saves the output (a single column containing a list of project names) as a CSV file. Script2 connects to an endpoint using API calls and prints the details of a project. I use Script1's output as Script2's input manually. I have tried a couple of different ways to automate this process, but I haven't been able to get it to work. Does anyone know how I can pass the CSV file as a query param in the API call?
Here is what I have so far:
This is Script1:
#SQL Connection variables
$Server = "my server"
$DBName = "db name"
$credential = Import-CliXml -Path "C:\Test\MyCredential.xml"
$User = $Credential.UserName
$PW = $credential.GetNetworkCredential().Password
$Connection = New-Object System.Data.SqlClient.SqlConnection
$Connection.ConnectionString = "Server = $Server; Database = $DBName; User ID = $User; Password = $PW;"
$Connection.Open()
#$Connection.State
$SqlQuery = "select from table example"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.Connection = $Connection
$SqlCmd.CommandText = $SqlQuery
$SqlCmd.CommandTimeout = 0
#Creating sql adapter
$SqlAdapter = New-Object System.Data.sqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
#Creating Dataset
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$DataSet.Tables[0] | Export-Csv -Path $OutputFile -NoTypeInformation
The output of Script1 is 11223344, so I use this project name as my input or query param in my second script.
And this is Script2:
$credential = Import-CliXml -Path "C:\Test\MyCredential2.xml"
$APIKEY = $credential.GetNetworkCredential().Password
$token = "APIKEY " + "$APIKEY"
$Params = @{
    uri         = 'https://myendpoint/search?name=11223344'
    Headers     = @{'Authorization' = "API KEY $token"}
    Method      = 'GET'
    ContentType = 'application/json'
}
$Response = Invoke-RestMethod #Params
I would really appreciate it if someone could help me with this.
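One way to wire the two scripts together is to import Script1's CSV output and loop over the rows, calling the endpoint once per project. This is a minimal sketch, assuming Script1 writes a column named ProjectName and that the endpoint accepts the name in the name query parameter; the file path and column name below are hypothetical:
# Sketch: read Script1's CSV output and call the API once per project
$projects = Import-Csv -Path 'C:\Test\projects.csv'   # hypothetical path
foreach ($row in $projects) {
    $name = $row.ProjectName                          # use the actual column header from Script1
    $Params = @{
        uri         = "https://myendpoint/search?name=$name"
        Headers     = @{'Authorization' = "API KEY $token"}
        Method      = 'GET'
        ContentType = 'application/json'
    }
    $Response = Invoke-RestMethod @Params
    $Response                                         # process or export the details here
}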

SqlBulkCopy excessive memory consumption even with EnableStreaming and low BatchSize

I am trying to bulk load data from Oracle to SQL Server through the PowerShell SqlServer module's SqlBulkCopy.
On small data sets everything works fine, but on big ones SqlBulkCopy takes all the available memory, even with BatchSize and EnableStreaming set... until an out-of-memory error.
Also, the notify callback never seems to fire, so I suspect that even with EnableStreaming = $true the process first loads everything into memory...
What did I miss?
$current = Get-Date
#copy table from Oracle table to SQL Server table
add-type -path "D:\oracle\product\12.1.0\client_1\odp.net\managed\common\Oracle.ManagedDataAccess.dll";
#define oracle connection string
$conn_str = "cstr"
# query for oracle table
$qry = "
SELECT
ID,CREATEDT,MODIFIEDDT
FROM MYTABLE
WHERE source.ISSYNTHETIC=0 AND source.VALIDFROM >= TO_Date('2019-01-01','yyyy-mm-dd')
";
# key (on the left side) is the source column while value (on the right side) is the target column
[hashtable] $mapping = @{'ID'='ID';'CREATEDT'='CREATEDT';'MODIFIEDDT'='MODIFIEDDT'};
$adapter = new-object Oracle.ManagedDataAccess.Client.OracleDataAdapter($qry, $conn_str);
#$info = new-object Oracle.ManagedDataAccess.Client;
#Write-Host ( $info | Format-Table | Out-String)
$dtbl = new-object System.Data.DataTable('MYTABLE');
#this Fill method will populate the $dtbl with the query $qry result
$adapter.Fill($dtbl);
#define sql server target instance
$sqlconn = "cstr";
$sqlbc = new-object system.data.sqlclient.Sqlbulkcopy($sqlconn)
$sqlbc.BatchSize = 1000;
$sqlbc.EnableStreaming = $true;
$sqlbc.NotifyAfter = 1000;
$sqlbc.DestinationTableName="DWHODS.MYTABLE";
#need to tell $sqlbc the column mapping info
foreach ($k in $mapping.keys)
{
$colMapping = new-object System.Data.SqlClient.SqlBulkCopyColumnMapping($k, $mapping[$k]);
$sqlbc.ColumnMappings.Add($colMapping) | out-null
}
$sqlbc.WriteToServer($dtbl);
$sqlbc.Close();
$end= Get-Date
$diff= New-TimeSpan -Start $current -End $end
Write-Output "import needed : $diff"
Thanks to Jeroen, I changed the code as shown below; it no longer exhausts memory:
$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($conn_str);
$oraConn.Open();
$command = $oraConn.CreateCommand();
$command.CommandText=$qry;
$reader = $command.ExecuteReader()
...
$sqlbc.WriteToServer($reader);
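For reference, here is a slightly fuller sketch of the reader-based approach (connection strings, query, mapping, and table names are placeholders carried over from the question). The key point is that WriteToServer receives an OracleDataReader, so rows stream through in batches instead of first being buffered in a DataTable:
# Sketch: stream rows from Oracle straight into SqlBulkCopy (no DataTable buffering)
$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($conn_str)
$oraConn.Open()
$command = $oraConn.CreateCommand()
$command.CommandText = $qry
$reader = $command.ExecuteReader()
$sqlbc = New-Object System.Data.SqlClient.SqlBulkCopy($sqlconn)
$sqlbc.BatchSize = 1000
$sqlbc.EnableStreaming = $true
$sqlbc.DestinationTableName = "DWHODS.MYTABLE"
foreach ($k in $mapping.Keys) {
    [void]$sqlbc.ColumnMappings.Add($k, $mapping[$k])
}
$sqlbc.WriteToServer($reader)   # pulls rows from the reader as they are sent
$sqlbc.Close()
$reader.Dispose()
$oraConn.Close()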

Livestream data from SQL to Power BI online service using PowerShell or other options

I have an assignment to set up a live Power BI online service tile using live data streamed from a SQL table. The data in the table updates every few seconds. Here is what I have so far using PowerShell.
But it does not appear to refresh the data set every few seconds. What am I missing?
DO
{
$SqlServer = 'ServerName';
$SqlDatabase = 'DBName';
$sleepDuration = 3
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
$SqlQuery = "SELECT * FROM MyTable;";
$SqlCommand = New-Object -TypeName System.Data.SqlClient.SqlCommand;
$SqlCommand.CommandText = $SqlQuery;
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
$SqlCommand.Connection = $SqlConnection;
$SqlConnection.Open();
$SqlDataReader = $SqlCommand.ExecuteReader();
##you would find your own endpoint in the Power BI service
$endpoint = "......My PowerBI Service Push URL.........."
#Fetch data and write out to files
while ($SqlDataReader.Read()) {
$payload =
@{
"Col1" =$SqlDataReader['Col1']
"Col2" =$SqlDataReader['Col2']
}
Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo-Json @($payload))
# Sleep for a second
Start-Sleep $sleepDuration
}
$SqlConnection.Close();
$SqlConnection.Dispose();
} While (1 -eq 1)
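In the meantime, a leaner variant of the same polling loop may help isolate the issue: it opens the connection once, batches each poll's rows into a single POST, and sleeps between polls rather than between rows. This is only a sketch (server, database, and the push URL are placeholders), and it assumes the target is a push/streaming dataset whose tile was added from the streaming dataset itself, since tiles pinned from ordinary reports do not auto-refresh this way.
# Sketch: open the connection once, batch rows into a single POST per poll.
# Server, database, and the push URL are placeholders.
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f 'ServerName', 'DBName'
$endpoint = "......My PowerBI Service Push URL.........."
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection($SqlConnectionString)
$SqlConnection.Open()
while ($true) {
    $SqlCommand = $SqlConnection.CreateCommand()
    $SqlCommand.CommandText = 'SELECT * FROM MyTable;'
    $SqlDataReader = $SqlCommand.ExecuteReader()
    $rows = @()
    while ($SqlDataReader.Read()) {
        $rows += @{ 'Col1' = $SqlDataReader['Col1']; 'Col2' = $SqlDataReader['Col2'] }
    }
    $SqlDataReader.Close()
    if ($rows.Count -gt 0) {
        # One POST with a JSON array of rows, instead of one POST per row
        Invoke-RestMethod -Method Post -Uri $endpoint -Body (ConvertTo-Json @($rows)) | Out-Null
    }
    Start-Sleep -Seconds 3
}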

Task Scheduler creates a corrupted version of a file generated by a script

I've been working on a little project in PowerShell.
My task was to create a script that collects all files from mail attachments, merges all .pdf files into one, and sends the generated file to my email address.
The script works completely fine in PowerShell ISE, but when I try to run it from Task Scheduler, the merged .pdf file is corrupted and contains no data.
Keep in mind I am new to this stuff.
This is my main code that does all the heavy work:
function getAttachments
{
#########################################################
##-----------------------------------------------------##
## GET ATTACHMENTS ##
##-----------------------------------------------------##
#########################################################
##PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
#test variable
$test = Test-Path $credpath
##TEST IF CREDENTIAL EXISTS
if(!$test){
## USER PROMPT PSW CREDENTIAL ##
$cred = Get-Credential
#save credential in documents
$cred | Export-CliXml -Path $credpath
}else{
##READ USER CREDENTIAL FROM FILE
$cred = Import-CliXml -Path $credpath
}
##url and date variables
$url = "https://outlook.office365.com/api/v1.0/me/messages"
$d = [DateTime]::Today.AddDays(-1)
$global:date = $d.ToString("yyyy-MM-dd")
## Get all messages that have attachments where received date is greater than $date
$messageQuery = "" + $url + "?`$select=Id&`$filter=HasAttachments eq true and DateTimeReceived ge " + $date
$messages = Invoke-RestMethod $messageQuery -Credential $cred
## Loop through each results
foreach ($message in $messages.value)
{
# get attachments and save to file system
$query = $url + "/" + $message.Id + "/attachments"
$attachments = Invoke-RestMethod $query -Credential $cred
# in case of multiple attachments in email
foreach ($attachment in $attachments.value)
{
Write-Host "Found File :- " $attachment.Name
$path = "c:\Attachments\" + $attachment.Name
$Content = [System.Convert]::FromBase64String($attachment.ContentBytes)
Set-Content -Path $path -Value $Content -Encoding Byte
}
}
}
function sendAttachments
{
#############################################################
##---------------------------------------------------------##
## SEND ATTACHMENTS AND DELETE FILES ##
##---------------------------------------------------------##
#############################################################
#Connection Details
#PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
$cred = Import-CliXml -Path $credpath
$smtpServer = "smtp.office365.com"
$msg = new-object Net.Mail.MailMessage
#Change port number for SSL to 587
$smtp = New-Object Net.Mail.SmtpClient($SmtpServer, 25)
#Uncomment Next line for SSL
$smtp.EnableSsl = $true
$smtp.Credentials = $cred
$msg.IsBodyHtml = $true
#From Address
$msg.From = $cred.UserName
#To Address, Copy the below line for multiple recipients
$msg.To.Add("email@gmail.com")
#Message Body
$msg.Body = "<h2>All the attachments together are in the attachment of this email</h2> <br/><br/>"
#Message Subject
$msg.Subject = "no-reply: Email with all attachments"
#your file location
$files = Get-ChildItem "C:\Attachments\"
#attach the right file
$file = $global:pname
Write-Host "Attaching File :- " $file
$attachment = New-Object System.Net.Mail.Attachment -ArgumentList C:\Attachments\$file
$msg.Attachments.Add($attachment)
#send email
$smtp.Send($msg)
$attachment.Dispose();
$msg.Dispose();
#delete the files from the folder
Get-ChildItem -Path C:\Attachments -Include * -File -Recurse | foreach { $_.Delete()}
}
function mergePDF
{
#############################################################
##---------------------------------------------------------##
## MERGE ALL PDF FILES ##
##---------------------------------------------------------##
#############################################################
$workingDirectory = "C:\Attachments"
$itspath = $PSScriptRoot
$global:pname = $global:date + "_pdfAttachments.pdf"
$pdfs = ls $workingDirectory -recurse | where {-not $_.PSIsContainer -and $_.Extension -imatch "^\.pdf$"};
[void] [System.Reflection.Assembly]::LoadFrom([System.IO.Path]::Combine($itspath, 'itextsharp.dll'));
$output = [System.IO.Path]::Combine($workingDirectory, $pname);
$fileStream = New-Object System.IO.FileStream($output, [System.IO.FileMode]::OpenOrCreate);
$document = New-Object iTextSharp.text.Document;
$pdfCopy = New-Object iTextSharp.text.pdf.PdfCopy($document, $fileStream);
$document.Open();
foreach ($pdf in $pdfs) {
$reader = New-Object iTextSharp.text.pdf.PdfReader($pdf.FullName);
[iTextSharp.text.pdf.PdfReader]::unethicalreading = $true
$pdfCopy.AddDocument($reader);
$reader.Dispose();
}
$document.Close()
$pdfCopy.Dispose();
$document.Dispose();
$fileStream.Dispose();
}
getAttachments
Start-Sleep -s 10
mergePDF
Start-Sleep -s 10
sendAttachments
In this piece of code that I run in another script file, I create a new task:
#############################################################
##---------------------------------------------------------##
## SCHEDULE SCRIPTS IN WINDOWS TASKS ##
##---------------------------------------------------------##
#############################################################
##PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
#test variable
$test = Test-Path $credpath
##TEST IF CREDENTIAL EXISTS
if(!$test){
## USER PROMPT PSW CREDENTIAL ##
$cred = Get-Credential
#save credential in documents
$cred | Export-CliXml -Path $credpath
}
$taskName = "ManageEmailAttachments"
$taskExists = Get-ScheduledTask | Where-Object {$_.TaskName -like $taskName }
if($taskExists)
{
Get-ScheduledJob ManageEmailAttachments
Unregister-ScheduledJob ManageEmailAttachments
$wshell = New-Object -ComObject Wscript.Shell
$wshell.Popup("Task successfully deleted, run the script again to schedule the task",0,"Done",0x0)
}
else
{
$tt = Get-Date
$tt = $tt.AddMinutes(1)
$testtime = $tt.ToString("HH:mm:ss")
#set trigger
$trigger = New-JobTrigger -Daily -At "1:00"
$testtrigger = New-JobTrigger -Daily -At $testtime
#path to the scripts
$scriptPath = $PSScriptRoot + "\wps_manage_pdf_attachments.ps1"
#options(optional)
$option = New-ScheduledJobOption -WakeToRun: $true
#create a new task
Register-ScheduledJob -Name ManageEmailAttachments -FilePath $scriptPath -Trigger $testtrigger -ScheduledJobOption $option
}
When run in PowerShell the script works great: it gets all the attachments from the mailbox, merges them into one .pdf file, and sends it to the requested email address. But when scheduled in Windows Task Scheduler it does the first step fine, yet the merged .pdf file is corrupted and has no content.
I couldn't figure out how to make it work so I posted a question on the forum.
Hope you guys find a way to figure it out.
Thanks in advance
Use the function below to get the script root directory.
Function Get-ScriptDirectory
{
$Invocation = (Get-Variable MyInvocation -scope 1).Value
Split-path $Invocation.MyCommand.Path
}
$scriptPath = Join-Path (Get-ScriptDirectory) 'wps_manage_pdf_attachments.ps1'
Apparently the problem was in the main code itself. I used:
Try{...}
Catch{$_ | Out-File C:\errors.txt}
in the mergePDF function to find out what the error was. It turned out that the path to my itextsharp.dll was incorrect: the $PSScriptRoot I used pointed to "C:\windows\windows32" instead of where the script actually is.
So what I did instead was add a line to my batch file to copy itextsharp.dll to %Temp%:
xcopy Scripts\itextsharp.dll %Temp% /D >NUL 2>NUL
and then read the file from there with:
$itsPath = [System.IO.Path]::GetTempPath()
And everything works as it should. I know this isn't the best way to do it, but I had this batch file before to make the script run by just double-clicking it, so adding one little line won't hurt.
I hope this helps anyone with the same problem.
