PowerShell using System.Data.SqlDbType VarBinary - sql-server

I'm trying to import a ZIP file from SQL Server. The column in the SQL table is defined as varbinary(max). I'm using a stored procedure with an output parameter that returns this ZIP file.
I defined a command parameter (see code) as (System.Data.SqlDbType::VarBinary, -1); the -1 is supposed to mean "max" length, but I get no records back ($rd.HasRows is null).
Thanks for your help.
Function Get-SQLData
{
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Server=XXXX\YYYY;Database=SQL_XXX;Integrated Security=no;User=SQLServer_XX;Password=xxYYYY"
$conn.Open() | out-null
$cmd = new-Object System.Data.SqlClient.SqlCommand #("deployment.getZIPFile", $conn)
# Stored procedure
$cmd.Connection = $conn
$cmd.CommandType = [System.Data.CommandType]::StoredProcedure
$cmd.CommandText = "deployment.getZIPFile"
#### Procedure parameters
$cmd.Parameters.Add("@file_typ",[system.data.SqlDbType]::VarChar,5) | out-Null
$cmd.Parameters['@file_typ'].Direction = [system.data.ParameterDirection]::Input
$cmd.Parameters['@file_typ'].value = 'PS'
#
$cmd.Parameters.Add("@domain",[system.data.SqlDbType]::VarChar,5) | out-Null
$cmd.Parameters['@domain'].Direction = [system.data.ParameterDirection]::Input
$cmd.Parameters['@domain'].value = ($env:USERDNSDOMAIN).Split('.')[0] # domain
#
$cmd.Parameters.Add("@serverName",[system.data.SqlDbType]::VarChar,50) | out-Null
$cmd.Parameters['@serverName'].Direction = [system.data.ParameterDirection]::Input
$cmd.Parameters['@serverName'].value = $env:COMPUTERNAME # local server
#
$cmd.Parameters.Add("@scriptVersion",[system.data.SqlDbType]::decimal) | out-Null
$cmd.Parameters['@scriptVersion'].Direction = [system.data.ParameterDirection]::Input
$cmd.Parameters['@scriptVersion'].Precision=18
$cmd.Parameters['@scriptVersion'].Scale=2
$cmd.Parameters['@scriptVersion'].value = $MyVersion
#
$cmd.Parameters.Add("@operatingSystem",[system.data.SqlDbType]::VarChar, 100) | out-Null
$cmd.Parameters['@operatingSystem'].Direction = [system.data.ParameterDirection]::Input
$cmd.Parameters['@operatingSystem'].value = (gwmi -Class win32_operatingsystem).caption
#
$cmd.Parameters.Add("@serverTyp",[system.data.SqlDbType]::VarChar, 50) | out-Null
$cmd.Parameters['@serverTyp'].Direction = [system.data.ParameterDirection]::Input
$cmd.Parameters['@serverTyp'].value = $serverTyp
#
$cmd.Parameters.Add("@serverSubTyp",[system.data.SqlDbType]::VarChar, 50) | out-Null
$cmd.Parameters['@serverSubTyp'].Direction = [system.data.ParameterDirection]::Input
$cmd.Parameters['@serverSubTyp'].value = $serverSubTyp
#
$cmd.Parameters.Add("@aktScriptVersion",[system.data.SqlDbType]::decimal) | out-Null
$cmd.Parameters['@aktScriptVersion'].Direction = [system.data.ParameterDirection]::Output
$cmd.Parameters['@aktScriptVersion'].Precision=18
$cmd.Parameters['@aktScriptVersion'].Scale=2
#
$cmd.Parameters.Add("@ZIPFile",[system.data.SqlDbType]::varbinary,-1) | out-Null
$cmd.Parameters['@ZIPFile'].Direction = [system.data.ParameterDirection]::Output
#### End of procedure parameters
#$cmd.ExecuteNonQuery() #| out-null ## original
$rd = $cmd.ExecuteReader() # are there any records?
#$rd = $cmd.ExecuteNonQuery()
if ($rd.HasRows) # are there any records?
{
$bufferSize = 8192
# Read the stream...
# Create a byte array for the stream.
$out = [array]::CreateInstance('Byte', $bufferSize)
# Looping through records
While ($rd.Read())
{
$fileLocation = "C:\PerfLogs\XXXX\ZIPImport.7z"
#Write-Output ("Exporting: {0}" -f $rd.GetInt32(0));
# New BinaryWriter
$fs = New-Object System.IO.FileStream $fileLocation,'Create','Write';
$bw = New-Object System.IO.BinaryWriter $fs;
$start = 0;
# Read first byte stream
$received = $rd.GetBytes(0, $start, $out, 0, $bufferSize - 1); ## 1
While ($received -gt 0)
{
$bw.Write($out, 0, $received);
$bw.Flush();
$start += $received;
# Read next byte stream
$received = $rd.GetBytes(0, $start, $out, 0, $bufferSize - 1); ## 1
}
$bw.Close();
$fs.Close();
}
# Closing & Disposing all objects
$fs.Dispose()
$rd.Close()
$cmd.Dispose()
$conn.Close()
Write-Output ("ZIP-Import Finished")
# 7z section
$unzip = 'C:\PerfLogs\xxxx\Modul'
& ${env:ProgramFiles}\7-Zip\7z.exe x $fileLocation "-o$($unzip)" -y
Write-Output ("UNZIP Finished")
}
$conn.Close()
$conn.Dispose()
}
}

As mentioned in the comments that solved your problem, the parameter definitions in your question are correct. The issue is with the code that executes the procedure and processes the result.
There is no need to use a SqlDataReader here, because the proc returns the value in an output parameter rather than in a result-set column. Also, since the entire byte array is returned at once, it is much easier to write the value directly to the file than to stream it in a read/write loop.
Below is one way to write the output parameter byte array to a file.
# execute the proc, returning the @ZIPFile output parameter
$cmd.ExecuteNonQuery() | out-null
if($cmd.Parameters["@ZIPFile"].Value -eq [System.DBNull]::Value) {
Write-Output "Zipfile not found in database"
}
else {
# write the byte array output value to a file
$fileLocation = "C:\PerfLogs\XXXX\ZIPImport.7z"
Write-Output "Exporting $($cmd.Parameters["@ZIPFile"].Value.Length) bytes to $fileLocation"
[System.IO.File]::WriteAllBytes($fileLocation, $cmd.Parameters["@ZIPFile"].Value)
}
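For contrast only, and not the situation in this question: if the proc returned the zip as a single-row, single-column result set instead of an output parameter, ExecuteScalar would be the simplest way to get the byte array. A minimal sketch, reusing $cmd and $fileLocation from above:
# Sketch only: applies when the zip comes back as a result-set column, not an output parameter.
$bytes = $cmd.ExecuteScalar()   # first column of the first row; a byte[] for a varbinary column
if ($bytes -is [byte[]]) {
    [System.IO.File]::WriteAllBytes($fileLocation, $bytes)
}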

Related

Importing CSV to SQL Server using bulkcopy

I'm trying to import my CSV files into SQL Server. I found this code and it works perfectly and very fast:
# Database variables
$sqlserver = "servername"
$database = "datebasename"
$table = "tablename"
# CSV variables
$csvfile = "F:\TestNA\fin_product4.csv"
$csvdelimiter = ";"
$firstRowColumnNames = $true
################### No need to modify anything below ###################
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
# 50k worked fastest and kept memory usage to a minimum
$batchsize = 50000
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable, and autogenerate the columns.
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($firstRowColumnNames -eq $true) { $null = $reader.readLine() }
#foreach ($column in $columns) {
# $null = $datatable.Columns.Add()
#}
$col1 = New-Object system.Data.DataColumn fin_product_rk,([datetime])
$col2 = New-Object system.Data.DataColumn fin_product_id,([datetime])
$datatable.columns.add($col1)
$datatable.columns.add($col2)
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {
$null = $datatable.Rows.Add($line.Split($csvdelimiter))
$i++; if (($i % $batchsize) -eq 0) {
$bulkcopy.WriteToServer($datatable)
Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
$datatable.Clear()
}
}
# Add in all the remaining rows since the last clear
if($datatable.Rows.Count -gt 0) {
$bulkcopy.WriteToServer($datatable)
$datatable.Clear()
}
# Clean Up
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
Write-Host "Script complete. $i rows have been inserted into the database."
Write-Host "Total Elapsed Time: $($elapsed.Elapsed.ToString())"
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
The problem is that it works for the standard Latin encoding, but I have CSVs in UTF-8 and Windows-1251 encodings.
What should I add, and where, to change the encoding in this code?
I don't know the programming language this code is written in, so I can't change it myself; I would be happy if someone could help!
Thank you!
Update: CSV example:
product;product_id;product_nm;dttm
220;text;некоторый текст;12JAN2021:18:03:41.000000
220;text;некоторый текст;1JAN2021:18:03:41.000000
564;text;некоторый текст;16JAN2021:18:03:41.000000
Here is a solution in T-SQL.
It is very concise, a single statement, compared with the PowerShell approach.
Notable points:
The BULK INSERT parameter CODEPAGE = '65001' specifies UTF-8.
The product_nm NVARCHAR(100) column holds Unicode characters from the file.
SQL
USE tempdb;
GO
DROP TABLE IF EXISTS dbo.tbl;
CREATE TABLE dbo.tbl (
product VARCHAR(10),
product_id VARCHAR(30),
product_nm NVARCHAR(100),
dttm VARCHAR(50)
);
BULK INSERT dbo.tbl
FROM 'e:\Temp\Faenno_2.csv'
WITH (FORMAT='CSV'
, DATAFILETYPE = 'char' -- { 'char' | 'native' | 'widechar' | 'widenative' }
, FIELDTERMINATOR = ';'
, ROWTERMINATOR = '\n'
, FIRSTROW = 2
, CODEPAGE = '65001');
-- test
SELECT * FROM dbo.tbl;
Output
+---------+------------+-----------------+---------------------------+
| product | product_id | product_nm | dttm |
+---------+------------+-----------------+---------------------------+
| 220 | text | некоторый текст | 12JAN2021:18:03:41.000000 |
| 220 | text | некоторый текст | 1JAN2021:18:03:41.000000 |
| 564 | text | некоторый текст | 16JAN2021:18:03:41.000000 |
+---------+------------+-----------------+---------------------------+
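For completeness, if you would rather keep the original PowerShell bulk-copy script, the encoding is set where the file is read. A minimal sketch, assuming the $csvfile and $csvdelimiter variables from the question:
# UTF-8 input: pass the encoding to the StreamReader that feeds the DataTable loop.
$reader = New-Object System.IO.StreamReader($csvfile, [System.Text.Encoding]::UTF8)
# Windows-1251 input: use the matching code page instead.
# $reader = New-Object System.IO.StreamReader($csvfile, [System.Text.Encoding]::GetEncoding(1251))
# The header line read with Get-Content needs the same treatment for UTF-8 files.
$columns = (Get-Content $csvfile -First 1 -Encoding UTF8).Split($csvdelimiter)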

Powershell CSV to SQL Query

I am continuing my ongoing learning of PowerShell and have hit the limits of my capabilities, so I'm wondering if anyone can point me in the right direction and/or tell me whether what I want to do is possible.
I have written a script which downloads a number of files from the web and then deletes all but the data that I wish to work with. These are CSVs.
I have added the PowerShell code that I have managed to cobble together so far (see below), which runs once the data is downloaded. A data extract from one of these files looks like this:
Column A contains a description
Column B contains an ISIN (this is the working information)
Column C contains a numerical figure
Column D is where I want the SQL query data to be returned to
Please find a link to the sample data: Data Example
What I then want to achieve is to run the following SQL command based on the data within Column B
Select *
From CL
Where CLISIN in ('GB0004835483',
'BE0003793107',
'GB00B7V2GY97',
'GB0000595859',
'GB00B1VCNQ84',
'GB0004992003',
'GB0002369352')
I believe that I may need to export this as another file? Using the final results that are exported, I then need to place them within Column D in this CSV file.
Hoping that I have made this clear; if not, please let me know and I will be as expansive as possible.
The long and the short of it: can I use PowerShell to automatically run a query for each item in column B and add the results of that matching query to the corresponding line in column D?
I cannot find the answer via Google or here...
#### DOWNLOAD LOCATIONS ####
$DownloadPTMLocation = "L:\Operations Database\TakeOverPanel\PTMDisclosureTable.xls"
$DownloadPTMCSVLocation = "L:\Operations Database\TakeOverPanel\PTMDisclosureTable.csv"
$DownloadIPTMLocation = "L:\Operations Database\TakeOverPanel\IPTMDisclosureTable.xls"
$DownloadIPTMCSVLocation = "L:\Operations Database\TakeOverPanel\IPTMDisclosureTable.csv"
#### WEB URLS ###
$PTMURL = "http://www.thetakeoverpanel.org.uk/new/disclosureTable/v3/disclosuretable.xls"
$PTMCSVURL = "http://www.thetakeoverpanel.org.uk/new/disclosureTable/v3/disclosuretable.csv"
$IPTMURL = "http://irishtakeoverpanel.ie/disclosure/disclosuretable.xls"
$IPTMCSVURL = "http://irishtakeoverpanel.ie/disclosure/disclosuretable.csv"
$Path = Get-Location
#### Load Web Service ####
$WebClient = New-Object System.Net.WebClient
#### Start download Process ####
Write-Host "Downloading PTM File 1 of 4" $Path -ForegroundColor Green
$Url = $PTMURL
$Path = $DownloadPTMLocation
$WebClient.DownloadFile($PTMURL, $DownloadPTMLocation)
Write-Host "Downloading IPTM File 2 of 3" $Path -ForegroundColor Blue
$Url = $IPTMURL
$Path = $DownloadIPTMLocation
$WebClient.DownloadFile($IPTMURL, $DownloadIPTMLocation)
Write-Host "Downloading PTM Csv File 3 of 4" $Path -ForegroundColor Gray
$Url = $PTMCSVURL
$Path = $DownloadPTMCSVLocation
$WebClient.DownloadFile($PTMCSVURL, $DownloadPTMCSVLocation)
Write-Host "Downloading IPTM File 4 of 4" $Path -ForegroundColor Red
$Url = $IPTMCSVURL
$Path = $DownloadIPTMCSVLocation
$WebClient.DownloadFile($IPTMCSVURL, $DownloadIPTMCSVLocation)
#####################
## PTM ##
#Customise Vars
$DownloadPTMCSVLocation = "L:\Operations Database\TakeOverPanel\PTMDisclosureTable.csv"
$OutputPTMCSVLocation = "L:\Operations Database\TakeOverPanel\PTMDisclosureTableb.csv"
$Match = "ISIN"
$Matchs = "NSI"
## Strips all lines that do not contain ISIN ##
(Get-Content $DownloadPTMCSVLocation) -match $Match | Out-File $OutputPTMCSVLocation
Remove-Item $DownloadPTMCSVLocation
Rename-Item $OutputPTMCSVLocation -NewName $DownloadPTMCSVLocation
(Get-Content $DownloadPTMCSVLocation) -match $Matchs | Out-File $OutputPTMCSVLocation
Remove-Item $DownloadPTMCSVLocation
Rename-Item $OutputPTMCSVLocation -NewName $DownloadPTMCSVLocation
Get-Content $DownloadPTMCSVLocation | % {
$_ -replace 'ISIN: ',''
} | % {
$_ -replace 'NSI: ',''
} | Set-Content $OutputPTMCSVLocation
Working on this further and having hit a wall (once more on returning the SQL query), here is the working code so far...
###########
## BEGIN ##
###########
#### DOWNLOAD LOCATIONS ####
$DownloadPTMCSVLocation = "L:\Operations Database\TakeOverPanel\PTMDisclosureTable.csv"
$OutputPTMCSVLocation = "L:\Operations Database\TakeOverPanel\PTMDisclosureTableb.csv"
#### WEB URLS ###
$PTMCSVURL = "http://www.thetakeoverpanel.org.uk/new/disclosureTable/v3/disclosuretable.csv"
$Path = Get-Location
#### Load Web Service ####
$WebClient = New-Object System.Net.WebClient
#### Start download Process ####
Write-Host "Downloading PTM Csv" $Path -ForegroundColor Gray
$Url = $PTMCSVURL
$Path = $DownloadPTMCSVLocation
$WebClient.DownloadFile($PTMCSVURL, $DownloadPTMCSVLocation)
#####################
## PTM ##
$Match = "ISIN"
$Matchs = "NSI"
Import-Csv $DownloadPTMCSVLocation -Header @("A", "ISIN", "NSI", "Output") | #Import the CSV
Where { $_.ISIN -like "ISIN: ????????????" -and $_.NSI -like "NSI:*" } | #Filter rows
Foreach-Object {
$_.ISIN = $_.ISIN.Replace("ISIN: ", "")
$_.NSI = $_.NSI.Replace("NSI: ", "")
$query = "select CLIALPHASORTCODE, vl2securitynum, sum(cast(vl2beneficial as float)) as beneficial
from t5vaultsl2 vl2 left outer join t5client cli
on vl2.VL2CLIENTNUM = cli.CLICODE
where vl2.vl2securitynum = '$($_.ISIN)'
group by CLIALPHASORTCODE, VL2SECURITYNUM"
##Credentials##
$MISA = 'xx.xx.x.xx'
$MISB = 'xx.xx.x.xx'
$userName = 'UN'
$PassWord='PW'
$DB = 'reporting'
## CREATE MIS CREDENTIALS ##
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Data Source=$MISA;Initial Catalog=$DB;
Initial Catalog=$DB;User ID=$userName;Password=$PassWord;"
## - Runs Script from Set Location
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand;
$SqlCMD.CommandText = $query;
$SqlCmd.Connection = $SqlConnection;
So script is now as follows...
#### DOWNLOAD LOCATIONS ####
$DownloadPTMCSVLocation = "L:\Operations Database\TakeOverPanel\PTMDisclosureTable.csv"
$OutputPTMCSVLocation = "L:\Operations Database\TakeOverPanel\PTMDisclosureTableb.csv"
$ExportLocation = "L:\Operations Database\TakeOverPanel\test.csv"
$ExportLocationb = "L:\Operations Database\TakeOverPanel\test.xml"
#### WEB URLS ###
$PTMCSVURL = "http://www.thetakeoverpanel.org.uk/new/disclosureTable/v3/disclosuretable.csv"
$Path = Get-Location
#### Load Web Service ####
$WebClient = New-Object System.Net.WebClient
#### Start download Process ####
Write-Host "Downloading PTM Csv" $Path -ForegroundColor Gray
$Url = $PTMCSVURL
$Path = $DownloadPTMCSVLocation
$WebClient.DownloadFile($PTMCSVURL, $DownloadPTMCSVLocation)
################
#### Query ####
################
$query = #"
select CLIALPHASORTCODE, vl2securitynum, sum(cast(vl2beneficial as float)) as beneficial
from t5vaultsl2 vl2 left outer join t5client cli
on vl2.VL2CLIENTNUM = cli.CLICODE
where vl2.vl2securitynum = '$($_.ISIN)'
group by CLIALPHASORTCODE, VL2SECURITYNUM
"#;
#####################################
$Match = "ISIN"
$Matchs = "NSI"
## Prepare SQL ##
$MISA = 'xx.xx.x.xx'
$MISB = 'xx.xx.x.xx'
$userName = 'UN'
$PassWord='PW'
$DB = 'reporting'
## CREATE SQL Connection ##
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Data Source=$MISA;Initial Catalog=$DB;User ID=$userName;Password=$PassWord;"
$SqlConnection.Open()
# Put everything in a Try block so if there is an error the SQL
# connection is still closed
try
{
$SqlCmd = $SqlConnection.CreateCommand()
## Process CSV ##
Import-Csv $DownloadPTMCSVLocation -Header @("A", "ISIN", "NSI", "Output") | #Import the CSV
Where { $_.ISIN -like "ISIN: ????????????" -and $_.NSI -like "NSI:*" } | #Filter rows
Foreach-Object {
$_.ISIN = $_.ISIN.Replace("ISIN: ", "")
$_.NSI = $_.NSI.Replace("NSI: ", "")
# Get data from SQL
$query = "select CLIALPHASORTCODE, vl2securitynum, sum(cast(vl2beneficial as float)) as beneficial from t5vaultsl2 vl2 left outer join t5client cli on vl2.VL2CLIENTNUM = cli.CLICODE where vl2.vl2securitynum = '$($_.ISIN)' group by CLIALPHASORTCODE, VL2SECURITYNUM"
$SqlCmd.CommandText = $query
$result = $SqlCmd.ExecuteReader()
$table = New-Object "System.Data.DataTable"
$table.Load($result)
$t = $table.Vl2Beneficial
# Pass row on through the pipeline
$_.Output = $table.VL2Beneficial
$table | Export-Clixml $ExportLocationb
Write-Output $_
} | Export-Csv $OutputPTMCSVLocation -NoTypeInformation
}
finally
{
# Always close SQL connection even if error is encountered.
$SqlConnection.Close()
}
## PTM ##
$Match = "ISIN"
$Matchs = "NSI"
## Prepare SQL ##
$MISA = 'xx.xx.x.xx'
$MISB = 'xx.xx.x.xx'
$userName = 'UN'
$PassWord='PW'
$DB = 'reporting'
## CREATE SQL Connection ##
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Data Source=$MISA;Initial Catalog=$DB;User ID=$userName;Password=$PassWord;"
$SqlConnection.Open()
# Put everything in a Try block so if there is an error the SQL
# connection is still closed
try
{
$SqlCmd = $SqlConnection.CreateCommand()
## Process CSV ##
Import-Csv $DownloadPTMCSVLocation -Header @("A", "ISIN", "NSI", "Output") | #Import the CSV
Where { $_.ISIN -like "ISIN: ????????????" -and $_.NSI -like "NSI:*" } | #Filter rows
Foreach-Object {
$_.ISIN = $_.ISIN.Replace("ISIN: ", "")
$_.NSI = $_.NSI.Replace("NSI: ", "")
# Get data from SQL
$query = "select CLIALPHASORTCODE, vl2securitynum, sum(cast(vl2beneficial as float)) as beneficial from t5vaultsl2 vl2 left outer join t5client cli on vl2.VL2CLIENTNUM = cli.CLICODE where vl2.vl2securitynum = '$($_.ISIN)' group by CLIALPHASORTCODE, VL2SECURITYNUM"
$SqlCmd.CommandText = $query
$result = $SqlCmd.ExecuteReader()
$table = New-Object "System.Data.DataTable"
$table.Load($result)
# Pass row on through the pipeline
$_.Output = $table.VL2Beneficial
Write-Output $_
} | Export-Csv $OutputPTMCSVLocation -NoTypeInformation
}
finally
{
# Always close SQL connection even if error is encountered.
$SqlConnection.Close()
}
Basically I threw out all the file renaming, as it's completely unnecessary (use variables for that sort of thing if you need to). Import-Csv returns an array of objects, one for each row, so I pipe them to the filter (where the ISIN is checked to be exactly 12 characters) and then loop through, sorting out the data.
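One refinement worth noting, not part of the original answer: rather than splicing the ISIN into the query string, the per-row value can be passed as a SqlParameter, which avoids quoting problems. A sketch only, using the same connection, query, and CSV handling as above:
# Prepare the command once, with a placeholder for the ISIN.
$SqlCmd = $SqlConnection.CreateCommand()
$SqlCmd.CommandText = "select CLIALPHASORTCODE, vl2securitynum, sum(cast(vl2beneficial as float)) as beneficial
from t5vaultsl2 vl2 left outer join t5client cli on vl2.VL2CLIENTNUM = cli.CLICODE
where vl2.vl2securitynum = @isin
group by CLIALPHASORTCODE, VL2SECURITYNUM"
$null = $SqlCmd.Parameters.Add("@isin", [System.Data.SqlDbType]::VarChar, 12)

Import-Csv $DownloadPTMCSVLocation -Header @("A", "ISIN", "NSI", "Output") |
    Where-Object { $_.ISIN -like "ISIN: ????????????" -and $_.NSI -like "NSI:*" } |
    ForEach-Object {
        $_.ISIN = $_.ISIN.Replace("ISIN: ", "")
        $_.NSI = $_.NSI.Replace("NSI: ", "")
        $SqlCmd.Parameters["@isin"].Value = $_.ISIN
        $reader = $SqlCmd.ExecuteReader()
        $table = New-Object System.Data.DataTable
        $table.Load($reader)
        $reader.Close()                      # free the connection for the next row
        $_.Output = $table.VL2Beneficial
        $_
    } | Export-Csv $OutputPTMCSVLocation -NoTypeInformation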

Powershell / SQL Server / Import-CSV

Working on another script for work, I'm attempting to read from a CSV containing only one column of data, then for each item find the corresponding ID by querying the SQL database, and then put the result (ID1, CSVID1) into an Excel file (I have this part working fine).
Now I have run into an issue: how to populate the dataset within a foreach loop.
$excelAssets = Import-Csv .\test.csv -Header assetId | Foreach-Object {
$assetId = $_.assetId
# SQL Query Variables
$query = "SELECT AssetId AS AssetID, BrandId AS BrandID FROM [AssetLibrary_BrandAsset] WHERE AssetId = $assetId"
$connection = New-SqlConnection -Server $dataSource -Database $dataBase
#Execute the SQL commands and place the results in dataset
if ($connection.State -eq 'Open')
{
$swLap = Start-Elapsed $sw "Executing SQL Query"
Write-Verbose "$query";
$dataSet += Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0'
$i++
$connection.Close();
End-Elapsed $sw $swLap
} ELSE {
Write-Error "$($(Format-Elapsed $swLap)) SQL Connection Not Open - Exiting...";
exit;
}
}
Now $dataSet += doesn't work and I have googled numerous times to try and find the answer to this problem. Any help is appreciated.
Using the $dataSet
$dataTable = new-object "System.Data.DataTable" "Results"
$dataTable = $dataSet.Tables[0]
$rowDT = $dataTable.Rows.Count;
$colDT = $dataTable.Columns.Count;
Write-Host -NoNewLine "$(Format-Elapsed $sw.Elapsed) Rows: ";
Write-Host -NoNewLine "$($rowDT+1)" -ForegroundColor "Green";
Write-Host -NoNewLine " Columns: "
Write-Host -NoNewLine "$($colDT+1)" -ForegroundColor "Green";
Write-Host -NoNewLine " Cells: "
Write-Host "$( ($colDT+1)*($rowDT+1) )" -ForegroundColor "Green";
#Create a 2D Array of the DataTable
# http://stackoverflow.com/questions/13184191/fastest-way-to-drop-a-dataset-into-a-worksheet
$tableArray = New-Object 'object[,]' $rowDT, $colDT;
$swLap = Start-Elapsed $sw "DataTable transformation"
# i = row and j = column
for ($i=0;$i -lt $rowDT; $i++)
{
#Write-Progress -Activity "Transforming DataTable" -status "Row $i" -percentComplete ($i / $rowDT*100)
for ($j=0;$j -lt $colDT; $j++)
{
$tableArray[$i,$j] = $dataTable.Rows[$i].Item($j).ToString();
}
}
End-Elapsed $sw $swLap
$rowOffset = 1; $colOffset = 1;# 1,1 = "A1"
# Write out the header column names
for ($j=0;$j -lt $colDT; $j++)
{
$ActiveWorksheet.cells.item($rowOffset, $j+1) = $dataTable.Columns[$j].ColumnName;
}
$headerRange = $ActiveWorksheet.Range($ActiveWorksheet.cells.item($rowOffset, $colOffset), $ActiveWorksheet.cells.item($rowOffset, $colDT+$colOffset-1));
$headerRange.Font.Bold = $false
$headerRange.Interior.Color = $headingColour
$headerRange.Font.Name = $headingFont
$headerRange.Font.Color = $headingFontColour
$rowOffset++;
# Extract the data to Excel
$tableRange = $ActiveWorksheet.Range($ActiveWorksheet.cells.item($rowOffset, $colOffset), $ActiveWorksheet.cells.item($rowDT+$rowOffset-1, $colDT+$colOffset-1));
$tableRange.Cells.Value2 = $tableArray;
# Resize the columns in Excel
$swLap = Start-Elapsed $sw "Resize Excel Worksheet"
$wholeRange = $ActiveWorksheet.UsedRange
$wholeRange.EntireColumn.AutoFit() | Out-Null
End-Elapsed $sw $swLap
# Save Excel workbook
$ActiveWorkbook.SaveAs("$OutputFile")
$ActiveWorkbook.Close()
After assigning to $dataSet the first time, its type is probably not an array, meaning that the += operator doesn't behave exactly as you expect.
You can either initialize $dataSet as an empty array before you start assigning to it:
Import-Csv .\test.csv -Header assetId | Foreach-Object -Begin {$dataSet = @()} -Process {
# rest of script in here
} -End {return $dataSet}
or you can cast it during assignment:
[array]$dataSet += Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0'
Finally, an alternative solution would be to ensure that the output from Invoke-SQLQuery is treated as an array before you assign it to $dataSet:
$dataSet += @(Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0')
Whatever suits your style of coding.
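A further option, sketched below with the same New-SqlConnection and Invoke-SQLQuery helpers and variables used in the question: skip += entirely and let the pipeline collect the results into $dataSet.
$dataSet = Import-Csv .\test.csv -Header assetId | Foreach-Object {
    $assetId = $_.assetId
    $query = "SELECT AssetId AS AssetID, BrandId AS BrandID FROM [AssetLibrary_BrandAsset] WHERE AssetId = $assetId"
    $connection = New-SqlConnection -Server $dataSource -Database $dataBase
    if ($connection.State -eq 'Open')
    {
        # whatever Invoke-SQLQuery emits becomes an element of $dataSet
        Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0'
        $connection.Close()
    }
}
PowerShell gathers everything the script block outputs, so $dataSet ends up as an array whenever more than one result is returned; wrap the whole pipeline in @(...) if you need a guaranteed array even for a single result.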

Powershell: Excel combine worksheets into a single worksheet

I have some web script that I've adapted to run 7 T-SQL queries and output the results into 1 Excel workbook, one worksheet per query. I've just been asked if I can combine all 7 worksheets into one.
Here's my sample code, which does copy a worksheet; however, the entire columns are selected instead of just the used range. Also, the first worksheet's data on the destination worksheet is replaced by the second worksheet's data.
Questions: would it be simpler to get PowerShell to output the 7 queries into one Excel worksheet separated by two blank rows, or to modify the existing PowerShell script to create the 7 worksheets and then combine them into one?
The code is not pretty! I have also been really lost using $Excel = New-Object -ComObject excel.application followed by $Excel | Get-Member to explore how to get PowerShell to work with Excel. References on MSDN are usually for VB or C languages and I can't translate that into PowerShell.
--Edit: added code that stores the 7 query results in an array and outputs them to the console. The data is correct, but I'm just unsure how to approach piping that data into a single Excel worksheet.
$docs = "C:\Temp\SQL\test.xlsx"
If (Test-Path $docs){Remove-Item $docs}
Function First-Query {
param([string[]]$queries)
$xlsObj = New-Object -ComObject Excel.Application
$xlsObj.DisplayAlerts = $false
## - Create new Workbook and Sheet (Visible = 1 / 0 not visible)
$xlsObj.Visible = 0
$xlsWb = $xlsobj.Workbooks.Add(1)
$xlsSh = $xlsWb.Worksheets.Add([System.Reflection.Missing]::Value, $xlsWb.Worksheets.Item($xlsWb.Worksheets.Count))
$xlsSh.Name = 'Test'
for ($i = 0; $i -lt $queries.Count; $i++){
$query = $queries[$i]
$SQLServer = 'Server'
$Database = 'DataBase'
## - Connect to SQL Server using non-SMO class 'System.Data':
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server = $SQLServer; Database = $Database; Integrated Security = True"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $query
$SqlCmd.Connection = $SqlConnection
## - Extract and build the SQL data object '$DataSetTable':
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd;
$tables = New-Object System.Data.DataSet;
$SqlAdapter.Fill($tables)
$TableArray = @($tables)
$SqlConnection.Close()
$DataSetTable = $TableArray.Tables[0]
}#End For Loop
## - Build the Excel column heading:
[Array] $getColumnNames = $DataSetTable.Columns | Select ColumnName;
## - Build column header:
[Int] $RowHeader = 1;
foreach ($ColH in $getColumnNames){
$xlsSh.Cells.item(1, $RowHeader).font.bold = $true;
$xlsSh.Cells.item(1, $RowHeader) = $ColH.ColumnName;
$RowHeader++;
}
## - Adding the data start in row 2 column 1:
[Int] $rowData = 2;
[Int] $colData = 1;
foreach ($rec in $DataSetTable.Rows){
foreach ($Coln in $getColumnNames){
## - Next line convert cell to be text only:
$xlsSh.Cells.NumberFormat = "#";
## - Populating columns:
$xlsSh.Cells.Item($rowData, $colData) = `
$rec.$($Coln.ColumnName).ToString()
$ColData++
}
$rowData++; $ColData = 1
}
## - Adjusting columns in the Excel sheet:
$xlsRng = $xlsSH.usedRange
$xlsRng.EntireColumn.AutoFit() | Out-Null
#End for loop.
#Delete unwanted Sheet1.
$xlsWb.Sheets.Item('Sheet1').Delete()
#Set Monday to Active Sheet upon opening Workbook.
$xlsWb.Sheets.Item('Monday').Activate()
## ---------- Saving file and Terminating Excel Application ---------- ##
$xlsFile = "C:\Temp\SQL\test.xlsx"
$xlsObj.ActiveWorkbook.SaveAs($xlsFile) | Out-Null
$xlsObj.Quit()
## - End of Script - ##
start-sleep 2
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsRng)) {'cleanup xlsRng'}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsSh)) {'cleanup xlsSh'}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsWb)) {'cleanup xlsWb'}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsObj)) {'cleanup xlsObj'}
[gc]::collect() | Out-Null
[gc]::WaitForPendingFinalizers() | Out-Null
}#End Function
$queries = @()
$queries += @'
SELECT DISTINCT
'@
First-Query -queries $queries
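As for the first question above (everything in one worksheet, separated by two blank rows), one possible shape is to keep a running row offset inside the query loop and advance it by the table size plus two after each result set. A sketch only, reusing $queries, $xlsSh and $DataSetTable from the function above:
$rowData = 1
foreach ($query in $queries) {
    # ... run the query exactly as above to fill $DataSetTable ...
    foreach ($rec in $DataSetTable.Rows) {
        $colData = 1
        foreach ($Coln in $DataSetTable.Columns) {
            $xlsSh.Cells.Item($rowData, $colData) = $rec.$($Coln.ColumnName).ToString()
            $colData++
        }
        $rowData++
    }
    $rowData += 2   # two blank rows between result sets
}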
Not sure I really understand what your problem is, but below is a "template" that might help you do what you want. It shows how you can create sheets and handle them. You'll have to fill in the blanks (see the commented section, where you have to call your query function).
param (
[string] $ExcelFile = (Read-Host "Enter full path for Excel file")
)
try
{
$Error.Clear()
# http://support.microsoft.com/kb/320369
[System.Threading.Thread]::CurrentThread.CurrentCulture = [System.Globalization.CultureInfo] "en-US"
Push-Location
$scriptPath = Split-Path -parent $MyInvocation.MyCommand.Path
Set-Location $scriptPath
$Excel = New-Object -comobject Excel.Application
$Excel.Visible = $True
$WorksheetCount = 7
$Workbook = $Excel.Workbooks.Add()
$Workbook.Title = 'My Workbook'
$weekdays = @("Monday","Tuesday","Wednesday","Thursday","Friday","Saturday","Sunday")
($WorksheetCount - 1)..0 | %{
$sheet = $Excel.Worksheets.Add()
$sheet.Name = $weekdays[$_]
# $dataTable = Execute-Your-Query-Here-Returning-A-Data-Table
# $x = 0
# $dataTable | %{
# $sheet.cells.item($x, 1) = ...
# $sheet.cells.item($x, 2) = ...
# $x++
# }
}
$excel.ActiveWorkbook.SaveAs("$ExcelFile")
}
catch
{
"$($MyInvocation.InvocationName): $Error"
}
finally
{
Pop-Location
$Excel.Quit()
$Excel = $null
[gc]::collect()
[gc]::WaitForPendingFinalizers()
}
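The commented section in the template might be filled in along these lines (a sketch; it assumes your query function returns a System.Data.DataTable, as the question's code builds):
# $dataTable = Execute-Your-Query-Here-Returning-A-Data-Table
$col = 1
foreach ($column in $dataTable.Columns) {
    $sheet.Cells.Item(1, $col) = $column.ColumnName   # header row
    $col++
}
$row = 2
foreach ($rec in $dataTable.Rows) {
    $col = 1
    foreach ($column in $dataTable.Columns) {
        $sheet.Cells.Item($row, $col) = $rec[$column.ColumnName].ToString()
        $col++
    }
    $row++
}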

Powershell function that accepts multidimensional arrays as parameters

I'm trying to take a web script and make it a function that accepts an array of strings as arguments. Currently I work around this by making two functions from the same script: I call My-Function1 -arr $1 -arr2 $2 and then My-Function2 -arr $1 -arr2 $2 to get around the array problem.
There has to be a much cleaner method for passing this type of data to a single function, but in my searching I have not seen anything. I also saw a question on using the Tables refresh method to run a new query, but with my limited scripting experience I wasn't sure how to use that with my script.
The reason for two functions: 1) the first query runs and the results are inserted into a new Excel workbook and worksheet; 2) the second function is called to run the same T-SQL query, but this time it opens the workbook from #1 and inserts a new named worksheet.
Original script from http://www.maxtblog.com/2014/06/powershell-extracting-sql-server-data-into-excel/
$docs = "D:\Scripts\Reboots2.xlsx"
If (Test-Path $docs){Remove-Item "D:\Scripts\Reboots2.xlsx"} Else {Continue}
Function First-Query {
param([string[]]$arr,$arr2)
### SQL query results sent to Excel
$SQLServer = 'SERVERNAME'
$Database = 'DATABASENAME'
## - Connect to SQL Server using non-SMO class 'System.Data':
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection;
$SqlConnection.ConnectionString = "Server = $SQLServer; Database = $Database; Integrated Security = True";
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand;
$SqlCmd.CommandText = $arr;
$SqlCmd.Connection = $SqlConnection;
## - Extract and build the SQL data object '$DataSetTable':
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter;
$SqlAdapter.SelectCommand = $SqlCmd;
$DataSet = New-Object System.Data.DataSet;
$SqlAdapter.Fill($DataSet);
$SqlConnection.Close()
$DataSetTable = $DataSet.Tables["Table"];
## - Create an Excel Application instance:
$xlsObj = New-Object -ComObject Excel.Application;
## - Create new Workbook and Sheet (Visible = 1 / 0 not visible)
$xlsObj.Visible = 0;
$xlsWb = $xlsobj.Workbooks.Add(1);
$xlsSh = $xlsWb.Worksheets.item(1);
$xlsSh.Name = $($arr2)
## - Build the Excel column heading:
[Array] $getColumnNames = $DataSetTable.Columns | Select ColumnName;
## - Build column header:
[Int] $RowHeader = 1;
foreach ($ColH in $getColumnNames)
{
$xlsSh.Cells.item(1, $RowHeader).font.bold = $true;
$xlsSh.Cells.item(1, $RowHeader) = $ColH.ColumnName;
$RowHeader++;
};
## - Adding the data start in row 2 column 1:
[Int] $rowData = 2;
[Int] $colData = 1;
foreach ($rec in $DataSetTable.Rows)
{
foreach ($Coln in $getColumnNames)
{
## - Next line convert cell to be text only:
$xlsSh.Cells.NumberFormat = "#";
## - Populating columns:
$xlsSh.Cells.Item($rowData, $colData) = $rec.$($Coln.ColumnName).ToString();
$ColData++;
};
$rowData++; $ColData = 1;
};
## - Adjusting columns in the Excel sheet:
$xlsRng = $xlsSH.usedRange;
$xlsRng.EntireColumn.AutoFit() | Out-Null
## ---------- Saving file and Terminating Excel Application ---------- ##
$xlsFile = "D:\Scripts\Reboots.xlsx"
$xlsObj.ActiveWorkbook.SaveAs($xlsFile) | Out-Null
$xlsObj.Quit()
## - End of Script - ##
While([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsObj)){Remove-Variable xlsObj}
start-sleep 1
}#End Function
Function Rest-Query {
param([string[]]$arr,$arr2)
### SQL query results sent to Excel
$SQLServer = 'SERVERNAME'
$Database = 'DATABASENAME'
## - Connect to SQL Server using non-SMO class 'System.Data':
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection;
$SqlConnection.ConnectionString = "Server = $SQLServer; Database = $Database; Integrated Security = True";
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand;
$SqlCmd.CommandText = $arr;
$SqlCmd.Connection = $SqlConnection;
## - Extract and build the SQL data object '$DataSetTable':
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter;
$SqlAdapter.SelectCommand = $SqlCmd;
$DataSet = New-Object System.Data.DataSet;
$SqlAdapter.Fill($DataSet);
$SqlConnection.Close()
$DataSetTable = $DataSet.Tables["Table"];
## - Create an Excel Application instance:
$xlsObj = New-Object -ComObject Excel.Application;
## - Create new Workbook and Sheet (Visible = 1 / 0 not visible)
$xlsObj.Visible = 0;
$xlsWb = $xlsObj.Workbooks.Open("D:\Scripts\Reboots.xlsx")
$xlsSh = $xlsWb.Worksheets.Add([System.Reflection.Missing]::Value, $xlsWb.Worksheets.Item($xlsWb.Worksheets.Count))
$xlsSh.Name = $($arr2)
$xlsObj.DisplayAlerts = $false
## - Build the Excel column heading:
[Array] $getColumnNames = $DataSetTable.Columns | Select ColumnName;
## - Build column header:
[Int] $RowHeader = 1;
foreach ($ColH in $getColumnNames)
{
$xlsSh.Cells.item(1, $RowHeader).font.bold = $true;
$xlsSh.Cells.item(1, $RowHeader) = $ColH.ColumnName;
$RowHeader++;
};
## - Adding the data start in row 2 column 1:
[Int] $rowData = 2;
[Int] $colData = 1;
foreach ($rec in $DataSetTable.Rows)
{
foreach ($Coln in $getColumnNames)
{
## - Next line convert cell to be text only:
$xlsSh.Cells.NumberFormat = "#";
## - Populating columns:
$xlsSh.Cells.Item($rowData, $colData) = $rec.$($Coln.ColumnName).ToString();
$ColData++;
};
$rowData++; $ColData = 1;
};
## - Adjusting columns in the Excel sheet:
$xlsRng = $xlsSH.usedRange;
$xlsRng.EntireColumn.AutoFit() | Out-Null
## ---------- Saving file and Terminating Excel Application ---------- ##
$xlsFile = "D:\Scripts\Reboots.xlsx"
$xlsObj.ActiveWorkbook.SaveAs($xlsFile) | Out-Null
$xlsObj.Quit()
$xlsObj.DisplayAlerts = $true
While([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsObj)){Remove-Variable xlsObj}
[gc]::collect()
[gc]::WaitForPendingFinalizers()
start-sleep 1
}#End Function2
$a = @'
SELECT DISTINCT
'@
$b = @'
SELECT DISTINCT
'@
$c = @'
SELECT DISTINCT
'@
$d = @'
SELECT DISTINCT
'@
$e = @'
SELECT DISTINCT
'@
$f = @'
SELECT DISTINCT
'@
$g = @'
SELECT DISTINCT
'@
First-Query -arr $a -arr2 Monday
Rest-Query -arr $b -arr2 Tuesday
Rest-Query -arr $c -arr2 Wednesday
Rest-Query -arr $d -arr2 Thursday
Rest-Query -arr $e -arr2 Friday
Rest-Query -arr $f -arr2 Saturday
Rest-Query -arr $g -arr2 Sunday
You can indeed write a single function for what you want to do; however, you need to rearrange your code and add some structure to it.
I have tried to give some structure to the example below, which you will need to adapt to your script.
Basically, from what I could understand of the code you posted, you are running a specific SQL query for each weekday and want to save the results in Excel.
In the example below I'm using a for loop to ensure the index used in each array is the same.
In the comments are the commands you need to add from your code.
Function First-Query {
param([string[]]$arr,[string[]]$arr2)
#initialize Excel
# do your Excel commands like opening the file, adding the workbook
#Now perform the query by using the appropriate element in each array (1 query / day)
for ($i = 0; $i -lt $arr.Count; $i++)
{
$day = $arr2[$i]
$query = $arr[$i]
# then add the worksheet
$xlsSh.Name = $day
# run the query
$SqlCmd.CommandText = $query;
# Save the results in the Excel columns
}
# Save and Quit
}#End Function
$arr = @()
$arr += "SELECT DISTINCT"
$arr += "ANOTHER QUERY SELECT DISTINCT"
# additional queries added with $arr +=
$weekdays = @("Monday","Tuesday","Wednesday","Thursday","Friday","Saturday","Sunday")
First-Query -arr $arr -arr2 $weekdays
