Unable to insert SQL table values from PowerShell into SQL Server

A script gets data from an API, and I'm trying to import that data into SQL Server using PowerShell.
$params = @{
ServerInstance = "SQLDB1"
Database = "Stage"
}
$InsertResults = @"
INSERT INTO [Stage].[dbo].[ImportTable]([roleID],[roleName])
VALUES ('$roleId','$rolename')
"@
foreach($r in $roles) {
[int]$roleId = $r.id
$rolename = $r.name
Invoke-Sqlcmd @params -Query $InsertResults }
Here, the API returns the objects in $roles; each one has an id (a number I convert to int) and a name (a string). The goal is to put them into a single table side by side, as [roleID] and [roleName].
Well, that's the goal. When checking the table in SQL Server, all I get is
|roleID|roleName|
-----------------
| 0 | |
That's if I set roleID as the primary key. If I don't, it repeats that same row as many times as there are lines of data from the API. If I don't include $rolename = $r.name, then the roleName column just says ".name" and that's that.
What I need looks like
|roleID|roleName|
-----------------
| 1 | role1 |
| 2 | role2 |
| 3 | role3 |
etc.

In your code I see a logic mistake: you define the query string, with uninitialized variables, outside the loop. A double-quoted here-string is interpolated once, when it is assigned, so your query string never updates inside the loop; what you see after running the cmdlet are the defaults, 0 for an int and an empty string.
So the corrected code is (note that the variables must be assigned before the here-string is built):
foreach($r in $roles) {
[int]$roleId = $r.id
$rolename = $r.name
$InsertResults = @"
INSERT INTO [Stage].[dbo].[ImportTable]([roleID],[roleName])
VALUES ('$roleId','$rolename')
"@
Invoke-Sqlcmd @params -Query $InsertResults }
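A side note on the ".name" symptom you mentioned: in a double-quoted string PowerShell expands $r but stops at the dot, so "$r.name" yields the object's string form followed by the literal text ".name". Use a subexpression, "$($r.name)", when interpolating a property directly:
$r = [pscustomobject]@{ id = 1; name = 'role1' }
"$r.name"    # -> @{id=1; name=role1}.name
"$($r.name)" # -> role1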
I created a test array:
$roles = @([pscustomobject]@{id = 5; name = "test2"},[pscustomobject]@{id = 6; name = "test"})
I tested your original code and saw this result:
|roleID|roleName|
-----------------
| 6 | test |
| 6 | test |
After moving the query-string initialization inside the loop, I saw:
|roleID|roleName|
-----------------
| 5 | test2 |
| 6 | test |
Your code can also be modified like this:
$InsertResults = #"
INSERT INTO [Stage].[dbo].[ImportTable]([roleID],[roleName])
VALUES (`$(roleId),`$(rolename))
"#
foreach($r in $roles) {
$variables = #(
"roleId=$($r.id)",
"rolename='$($r.name)'"
)
Invoke-sqlcmd #params -Query $InsertResults -Variable $variables}
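For clarity on this last variant: the backticks stop PowerShell from expanding $(roleId) and $(rolename) itself, so sqlcmd receives the literal placeholders and substitutes the values passed via -Variable at execution time. With the test array above, the first iteration effectively runs:
INSERT INTO [Stage].[dbo].[ImportTable]([roleID],[roleName])
VALUES (5,'test2')
Note that the quotes around the name travel inside the variable value ("rolename='test2'"), because sqlcmd substitutes the text verbatim.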

Related

Importing CSV to SQL Server using bulkcopy

I'm trying to import my CSV files into SQL Server. I found this code and it works perfectly and very fast:
# Database variables
$sqlserver = "servername"
$database = "datebasename"
$table = "tablename"
# CSV variables
$csvfile = "F:\TestNA\fin_product4.csv"
$csvdelimiter = ";"
$firstRowColumnNames = $true
################### No need to modify anything below ###################
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
# 50k worked fastest and kept memory usage to a minimum
$batchsize = 50000
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable, and autogenerate the columns.
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($firstRowColumnNames -eq $true) { $null = $reader.readLine() }
#foreach ($column in $columns) {
# $null = $datatable.Columns.Add()
#}
$col1 = New-Object system.Data.DataColumn fin_product_rk,([datetime])
$col2 = New-Object system.Data.DataColumn fin_product_id,([datetime])
$datatable.columns.add($col1)
$datatable.columns.add($col2)
# Read in the data, line by line
$i = 0 # row counter (initialize explicitly so re-runs in the same session count correctly)
while (($line = $reader.ReadLine()) -ne $null) {
$null = $datatable.Rows.Add($line.Split($csvdelimiter))
$i++; if (($i % $batchsize) -eq 0) {
$bulkcopy.WriteToServer($datatable)
Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
$datatable.Clear()
}
}
# Add in all the remaining rows since the last clear
if($datatable.Rows.Count -gt 0) {
$bulkcopy.WriteToServer($datatable)
$datatable.Clear()
}
# Clean Up
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
Write-Host "Script complete. $i rows have been inserted into the database."
Write-Host "Total Elapsed Time: $($elapsed.Elapsed.ToString())"
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
The problem is: it works for the standard Latin encoding, but I have CSVs in UTF-8 and Windows-1251 encodings.
What and where should I add to change the encoding in this code?
I don't know the programming language that was used to write this code, so I can't do it myself. I would be happy if someone can help!
Thank you!
Update: CSV example:
product;product_id;product_nm;dttm
220;text;некоторый текст;12JAN2021:18:03:41.000000
220;text;некоторый текст;1JAN2021:18:03:41.000000
564;text;некоторый текст;16JAN2021:18:03:41.000000
Here is a solution in T-SQL. It is very concise, a single statement, compared with the PowerShell script.
Notable points:
BULK INSERT parameter CODEPAGE = '65001' specifies UTF-8.
product_nm NVARCHAR(100) column holds UNICODE characters from the file.
SQL
USE tempdb;
GO
DROP TABLE IF EXISTS dbo.tbl;
CREATE TABLE dbo.tbl (
product VARCHAR(10),
product_id VARCHAR(30),
product_nm NVARCHAR(100),
dttm VARCHAR(50)
);
BULK INSERT dbo.tbl
FROM 'e:\Temp\Faenno_2.csv'
WITH (FORMAT='CSV'
, DATAFILETYPE = 'char' -- { 'char' | 'native' | 'widechar' | 'widenative' }
, FIELDTERMINATOR = ';'
, ROWTERMINATOR = '\n'
, FIRSTROW = 2
, CODEPAGE = '65001');
-- test
SELECT * FROM dbo.tbl;
Output
+---------+------------+-----------------+---------------------------+
| product | product_id | product_nm | dttm |
+---------+------------+-----------------+---------------------------+
| 220 | text | некоторый текст | 12JAN2021:18:03:41.000000 |
| 220 | text | некоторый текст | 1JAN2021:18:03:41.000000 |
| 564 | text | некоторый текст | 16JAN2021:18:03:41.000000 |
+---------+------------+-----------------+---------------------------+
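If you would rather keep the original PowerShell bulk-copy script, the place to change the encoding is the StreamReader, which defaults to UTF-8. A minimal sketch of the change (the second constructor argument is a standard System.Text.Encoding; pick the one that matches your file):
# For UTF-8 files (the default, shown explicitly):
$reader = New-Object System.IO.StreamReader($csvfile, [System.Text.Encoding]::UTF8)
# For Windows-1251 files:
$reader = New-Object System.IO.StreamReader($csvfile, [System.Text.Encoding]::GetEncoding(1251))
On the T-SQL side, CODEPAGE = '1251' should likewise handle Windows-1251 files in the BULK INSERT above.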

Can a PS Custom Object be created from a variable?

I have a 100-column table in SQL Server, and I want to make it so that not all of the columns need to be present in the file being loaded. I have stored the column names in a table and compare them against a hash table to find matching columns. From the matches I then generate the code for the array I want to use to insert the data from the file. The problem is, PowerShell doesn't like calling that one variable to create the custom object.
I store lines like the following in an array (up to 100 of these; a few are shown below as a sample; notice that sqlcolumn2 is skipped, for example).
sqlcolumn1 = if ([string]::IsNullOrEmpty($obj.P1) -eq $true) {$null} else {"$obj.P1"}
sqlcolumn3 = if ([string]::IsNullOrEmpty($obj.P2) -eq $true) {$null} else {"$obj.P2"}
sqlcolumn4 = if ([string]::IsNullOrEmpty($obj.P3) -eq $true) {$null} else {"$obj.P3"}
sqlcolumn5 = if ([string]::IsNullOrEmpty($obj.P4) -eq $true) {$null} else {"$obj.P4"}
Here is the loop that builds the array:
foreach($line in $Final)
{
$DataRow = "$($line."TableColumnName") = if ([string]::IsNullOrEmpty(`$obj.$($line."PName")) -eq `$true) {`$null} else {`"`$obj.$($line."PName")`"}"
$DataArray += $DataRow
}
I then try to add it to a final array, which I want looped through for each row of data, after which I would perform the insert from the array. Even though the string value in the array above would be correct if it were hand-coded, I can't get it to recognize the rows and run.
foreach ($obj in $data2)
{
$test = [PSCustomObject] @{
$DataArray = Invoke-Expression $DataArray
}
}
If I just type $DataArray, it doesn't like this, because it wants the = sign, which I already have built into the string.
Is what I am trying to do even possible?
I was attempting to template out the various different ways we receive this data, where some people send us 30 of the 100 columns, others more or less, with no one person using the exact same columns, to cut down on individual scripts for everything.
Adding more code:
Function ArrayCompare() {
[CmdletBinding()]
PARAM(
[Parameter(Mandatory=$True)]$Array1,
[Parameter(Mandatory=$True)]$A1Match,
[Parameter(Mandatory=$True)]$Array2,
[Parameter(Mandatory=$True)]$A2Match)
$Hash = @{}
foreach ($Data In $Array1) {
$Hash[$Data.$A1Match] += ,$Data
}
foreach ($Data In $Array2) {
$Hash[$Data.$A2Match] += ,$Data
}
foreach ($KeyValue In $Hash.GetEnumerator()){
$Match1, $Match2 = $KeyValue.Value.Where( {$_.$A1Match}, 'Split')
[PSCustomObject]@{
MatchValue = $KeyValue.Key
A1Matches = $Match1.Count
A2Matches = $Match2.Count
TablePosition = [int]$Match2.TablePosition
TableColumnName = $Match2.TableColumnName
# PName is the P(##) that is a generic ascending column value back to import-excel module. ColumnA = P1, ColumnB = P2 etc..until no data is detected. Allows flexibility and not having to know how many columns there are
PName = $Match1.Name}
}
}
$Server = 'ServerName'
$Catalog = 'DBName'
$DestinationTable = 'ImportIntoTableName'
$FileIdentifierID = 10
$FileName = 'Test.xlsx'
$FilePath = 'C:\'
$FullFilePath = $FilePath + $FileName
$data = Import-Excel -Path $FullFilePath -NoHeader -StartRow 1 # Import-Excel module for working with xlsx files
$data2 = Import-Excel -Path $FullFilePath -NoHeader -StartRow 2
$ExpectedHeaderArray = @()
$HeaderArray = @()
$DataArray = @()
$HeaderDetect = @()
$HeaderDetect = $data | Select-Object -First 1 # Header Row In File
$HeaderDetect |
ForEach-Object {
$ColumnValue = $_
$ColumnValue |
Get-Member -MemberType *Property |
Select-Object -ExpandProperty Name |
ForEach-Object {
$HeaderValues = [PSCustomObject]@{
Name = $_
Value = $ColumnValue.$_}
$HeaderArray += $HeaderValues
}
}
# Query below provides a list of all expected file headers and the table column name they map to
$Query = "SELECT TableColumnName, FileHeaderName, TablePosition FROM dbo.FileHeaders WHERE FileIdentifierID = $($FileIdentifierID)"
$ds = Invoke-Sqlcmd -ServerInstance $Server -Database $Catalog -Query $Query -OutputAs DataSet
$ExpectedHeaderArray = foreach($Row in $ds.Tables[0].Rows)
{
new-object psObject -Property @{
TableColumnName = "$($row.TableColumnName)"
FileHeaderName = "$($row.FileHeaderName)"
TablePosition = "$($row.TablePosition)"
}
}
#Use Function Above
#Bring it together so we know what P(##) goes with which header in file/mapped to table column name
$Result = ArrayCompare -Array1 $HeaderArray -A1Match Value -Array2 $ExpectedHeaderArray -A2Match FileHeaderName
$Final = $Result | sort TablePosition
foreach($Line in $Final)
{
$DataRow = "$($Line."TableColumnName") = if ([string]::IsNullOrEmpty(`$obj.$($Line."PName")) -eq `$true) {`$null} else {`"`$obj.$($Line."PName"))`"}"
$DataArray += $DataRow
}
# The output below is what the code inside the final array would be; it's what I would use for the import.
# The goal is to be dynamic and match headers in the file to the stored header value and import into a table (mapped from header column to table column name)
# The reason for this is before I was here, there were many different "versions" of a layout that was given out. In the end, it is all one in the same
# but some send all 100 columns, some only send a handful, some send 80 etc. I am trying to have everything flow through here vs. 60+ pieces of code/stored procedures/ssis packs
Write-Output $DataArray
# Output Sample -- Note how in the sample, P2 and subsequent skip SQLColumn2 because P2 maps to the header value of position 3 in the sql table and each after is one off.
# In this example, SqlColumn2 would not be populated
# SqlColumn1 = if ([string]::IsNullOrEmpty($obj.P1) -eq $true) {$null} else {"$obj.P1"}
# SqlColumn3 = if ([string]::IsNullOrEmpty($obj.P2) -eq $true) {$null} else {"$obj.P2"}
# SqlColumn4 = if ([string]::IsNullOrEmpty($obj.P3) -eq $true) {$null} else {"$obj.P3"}
# SqlColumn5 = if ([string]::IsNullOrEmpty($obj.P4) -eq $true) {$null} else {"$obj.P4"}
# I know this doesn't work. This is where I'm stuck, how to build an array now off of this output from above
foreach ($obj in $data2)
{
$test = [PSCustomObject] @{
$DataArray = Invoke-Expression $DataArray}
}
I'm going to re-state your question first, just to make sure I understand it properly (it's possible I don't!)...
You've got an excel file that looks something like this:
+---+---------+---------+---------+
| | A | B | C |
+---+---------+---------+---------+
| 1 | HeaderA | HeaderB | HeaderC |
+---+---------+---------+---------+
| 2 | Value P | Value Q | Value R |
+---+---------+---------+---------+
| 3 | Value S | Value T | Value U |
+---+---------+---------+---------+
You've also got a database table which looks like this:
+---------+---------+---------+---------+
| ColumnW | ColumnX | ColumnY | ColumnZ |
+---------+---------+---------+---------+
| ....... | ....... | ....... | ....... |
+---------+---------+---------+---------+
and a column mapping table like this (note, ColumnX isn't mapped in this example):
+-----------------+----------------+---------------+
| TableColumnName | FileHeaderName | TablePosition |
+-----------------+----------------+---------------+
| ColumnW | HeaderA | 1 |
+-----------------+----------------+---------------+
| ColumnY | HeaderB | 2 |
+-----------------+----------------+---------------+
| ColumnZ | HeaderC | 3 |
+-----------------+----------------+---------------+
You want to insert the values from the spreadsheet into the database table, using the data in your mapping table so you get this:
+---------+---------+---------+---------+
| ColumnW | ColumnX | ColumnY | ColumnZ |
+---------+---------+---------+---------+
| Value P | null    | Value Q | Value R |
+---------+---------+---------+---------+
| Value S | null    | Value T | Value U |
+---------+---------+---------+---------+
So let's load the spreadsheet (letting the header row generate meaningful property names this time):
$data = Import-Excel -Path ".\MySpreadsheet.xlsx";
write-host ($data | ft | out-string);
# HeaderA HeaderB HeaderC
# ------- ------- -------
# Value P Value Q Value R
# Value S Value T Value U
and get your column mapping data (I'm programmatically creating an in-memory dataset, but you obviously read yours from your database instead):
$mappings = new-object System.Data.DataTable;
$null = $mappings.Columns.Add("TableColumnName", [string]);
$null = $mappings.Columns.Add("FileHeaderName", [string]);
$null = $mappings.Columns.Add("TablePosition", [int]);
@(
@{ "TableColumnName"="ColumnW"; "FileHeaderName"="HeaderA"; "TablePosition"=1 },
@{ "TableColumnName"="ColumnY"; "FileHeaderName"="HeaderB"; "TablePosition"=2 },
@{ "TableColumnName"="ColumnZ"; "FileHeaderName"="HeaderC"; "TablePosition"=3 }
) | % {
$row = $mappings.NewRow();
$row.TableColumnName = $_.TableColumnName;
$row.FileHeaderName = $_.FileHeaderName;
$row.TablePosition = $_.TablePosition;
$mappings.Rows.Add($row);
}
$ds = new-object System.Data.DataSet;
$ds.Tables.Add($mappings);
write-host ($ds.Tables[0] | ft | out-string)
# TableColumnName FileHeaderName TablePosition
# --------------- -------------- -------------
# ColumnW HeaderA 1
# ColumnY HeaderB 2
# ColumnZ HeaderC 3
Now we can build the "mapped" objects:
$values = @();
foreach( $row in $data )
{
$properties = [ordered] @{};
foreach( $mapping in $mappings )
{
$properties.Add($mapping.TableColumnName, $row."$($mapping.FileHeaderName)");
}
$values += new-object PSCustomObject -Property $properties;
}
write-host ($values | ft | out-string)
# ColumnW ColumnY ColumnZ
# ------- ------- -------
# Value P Value Q Value R
# Value S Value T Value U
The tricksy bit is $properties.Add($mapping.TableColumnName, $row."$($mapping.FileHeaderName)"); - basically, you can access object properties in PowerShell using a dotted string literal or variable (I'm not sure of the exact feature name) - e.g.
PS> $myValue = new-object PSCustomObject -Property @{ "aaa"="bbb"; "ccc"="ddd" }
PS> $myValue."aaa"
bbb
PS> $myProperty = "aaa"
PS> $myValue.$myProperty
"bbb"
so $row."$($mapping.FileHeaderName)" is an expression that evaluates to the value of the property of $row named in $mapping.FileHeaderName.
And then finally you can insert the objects into your database using your existing process...
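For illustration only (this assumes Invoke-Sqlcmd from the SqlServer module, the $Server and $Catalog values from your script, and a hypothetical target table whose column names match the mapping; it is a sketch, not your actual process), that last step could look like:
$insert = @"
INSERT INTO dbo.[$DestinationTable] ([ColumnW],[ColumnY],[ColumnZ])
VALUES (`$(w),`$(y),`$(z))
"@
foreach( $value in $values )
{
# pass the mapped values as sqlcmd scripting variables
$vars = @(
"w='$($value.ColumnW)'",
"y='$($value.ColumnY)'",
"z='$($value.ColumnZ)'"
)
Invoke-Sqlcmd -ServerInstance $Server -Database $Catalog -Query $insert -Variable $vars
}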
Note that I couldn't quite work out what your ArrayCompare is actually doing so it's possible the above doesn't solve your problem 100%, but it's hopefully close enough that you can either work the difference out yourself, or leave a comment with where it differs from your desired solution.
Hope this helps.

Not all properties displayed

When we're trying to export data to other functions via the pipeline, we observe some strange behavior in PowerShell.
Example code:
$Array = @()
$Obj1 = [PSCustomObject]@{
Member1 = 'First'
Member2 = 'Second'
}
$Obj2 = [PSCustomObject]@{
Member1 = 'First'
Member2 = 'Second'
Member3 = 'Third'
}
$Array = $Obj1, $Obj2
$Array | Out-GridView -Title 'Not showing Member3'
$Array = $Obj2, $Obj1
$Array | Out-GridView -Title 'All members correctly displayed'
In the example above you can see that when the first object only contains 2 properties, the Out-GridView CmdLet (and others) only show 2 properties, even though the second object has 3 properties. However, when the first object in the array has 3 properties it does display them all correctly.
Is there a way around this? Because it's not possible to predict up front how many properties on an object there will be and if the object with the most properties will be the first one in the array.
I had the same experience once and created the following reusable 'Union' function:
# 2021-08-25 Removed Union function
Usage:
$Obj1, $Obj2 | Union | Out-GridView -Title 'Showing all members'
It is also supposed to work with complex objects. Some standard cmdlets output multiple object types at once and if you view them (e.g. Out-GridView) or dump them in a file (e.g. Export-Csv) you might miss a lot of properties. Take as another example:
Get-WmiObject -Namespace root/hp/instrumentedBIOS -Class hp_biosSetting | Union | Export-Csv ".\HPBIOS.csv"
Added 2014-09-19:
Maybe this is already between the lines in the comments: $Array | Select * | … will not resolve the issue, but specifically selecting the properties, $Array | Select Member1, Member2, Member3 | …, does.
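For example:
$Array | Select-Object Member1, Member2, Member3 | Out-GridView -Title 'Showing all members'
Select-Object adds any named property an object lacks (with an empty value), which is why listing the properties explicitly works.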
Besides, although in most cases the Union function will work, there are some exceptions, as it only aligns the first object with the rest.
Consider the following list:
$List = @(
New-Object PSObject -Property @{Id = 2}
New-Object PSObject -Property @{Id = 1}
New-Object PSObject -Property @{Id = 3; Name = "Test"}
)
If you Union this list, everything appears to be fine, and if you e.g. Export-Csv and work with the exported .csv file from then on, you will never have any issue.
$List | Union
Id Name
-- ----
2
1
3 Test
Still, there is a catch, as only the first object is aligned. If you e.g. sort the result on Id (Sort Id) or take just the last 2 entries (Select -Last 2), the Name is not listed, because the second object doesn't contain the Name property:
$List | Union | Sort Id
Id
--
1
2
3
Therefore I have rewritten the Union-Object (alias Union) function:
Union-Object
# 2021-08-25 Removed Union-Object function
Syntax:
$Array | Union | Out-GridView -Title 'All members correctly displayed'
Update 2021-08-25
Based on az1d's helpful feedback on an error caused by equal property names with different casing, I have created a new UnifyProperties function.
(I will no longer use the name UnionObject for this.)
function UnifyProperties {
$Names = [System.Collections.Generic.HashSet[string]]::new([StringComparer]::OrdinalIgnoreCase)
$InputCollected = @($Input)
$InputCollected.ForEach({
foreach ($Name in $_.psobject.Properties.Name) { $Null = $Names.Add($Name) }
})
$InputCollected | Select-Object @($Names)
}
Usage:
[pscustomobject] @{ one = 1; two = 2; three = 3 },
[pscustomobject] @{ ONE = 10; THREE = 30; FOUR = 4 } |
UnifyProperties
one two three FOUR
--- --- ----- ----
  1   2     3
 10        30    4
See also: #13906 Add -UnifyProperties parameter to Select-Object

PowerShell Emailing Arrays

I'm having issues with the way an array loses its formatting when it is emailed and viewed in Outlook 2013.
The formatted array looks like this in PowerShell:
vServer State Connection
------- ----- ----------
vServer-LB-1 UP 0
vServer-LB-2 DOWN 0
vServer-LB-3 UP 0
vServer-LB-4 UP 0
vServer-LB-5 UP 0
vServer-LB-6 UP 2
This is how I formatted the array (I have tried emailing the unformatted array, but it is still wrong):
$formatserver = @{Expression={$_.name};Label="vServer";width=48}, `
@{Expression={$_.state};Label="State";width=17}, `
@{Expression={$_.establishedconn};Label="Connection"}
$Array = $server | Format-Table $formatserver
However, when emailed it comes out like this (not exactly like this, but the point is it's not formatted correctly):
vServer State Connection
------- ----- ----------
vServer-LB-1 UP 0
vServer-LB-2 DOWN 0
vServer-LB-3 UP 0
vServer-LB-4 UP 0
vServer-LB-5 UP 0
vServer-LB-6 UP 2
Here is the code for emailing:
$from = 'Reporting <Support@Me.com>'
$to = 'me@me.com'
$subject = 'Report'
$body = $Array | Out-String
$smtpServer = 'mail.me.com'
$msg = New-Object Net.Mail.MailMessage($from, $to, $subject, $body)
#$msg.IsBodyHTML = $true
$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
$smtp.Send($msg)
Please note I have tried many combinations of | Out-String and $msg.IsBodyHTML = $true.
You can use the <pre> tag in HTML to keep the spacing in your emails.
$body = $Array | Out-String | %{"<pre>"+$_+"</pre>"}
Make sure that you set IsBodyHTML to $true.
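Putting it together, the mailing section becomes (same variables as in the question; the only changes are the <pre>-wrapped body and the IsBodyHTML flag):
$body = $Array | Out-String | %{"<pre>"+$_+"</pre>"}
$msg = New-Object Net.Mail.MailMessage($from, $to, $subject, $body)
$msg.IsBodyHTML = $true
$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
$smtp.Send($msg)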
Note: table output is limited to the buffer size of your PowerShell shell/console. So if your table width is greater than the buffer width of your shell/console, the table will not be shown in full.
To get around this you can set your shell buffer at the start of your script with the following:
$pshost = get-host
$pswindow = $pshost.ui.rawui
$newsize = $pswindow.buffersize
$newsize.height = 3000
$newsize.width = 1500
$pswindow.buffersize = $newsize
Or open the Properties dialog of your shell/console window and change the screen buffer size there.

PowerShell Adding multiple arrays of different columns to a main array

Code:
$arr1 = "" | select blabla,blabla2
$arr2 = "" | select blabla3,blabla4
$arrtotal = @()
$arrtotal += $arr1
$arrtotal += $arr2
$arrtotal
Printout:
blabla blabla2
However, when attempting to print both cells individually (not one after the other but simply selecting in PS ISE and hitting F8):
$arrtotal[0]
blabla blabla2
$arrtotal[1]
blabla3 blabla4
EDITED:
I would have expected both arrays' columns to be printed when printing $arrtotal, not just one of them. Furthermore, it's unclear to me why printing them individually works but one after the other, i.e. "$arrtotal[0];$arrtotal[1]", does not.
EDIT2:
This is my original code.
All it does is query Sparkpost's API in order to build a custom HTML report.
$test = (Invoke-WebRequest "https://api.sparkpost.com/api/v1/metrics/deliverability?metrics=count_injected,count_sent,count_bounce,count_accepted&from=2016-01-01T08:00&to=2016-04-25T08:00" -Headers @{"Authorization"="xxxxxxxxxxxxx";"Content-Type"= "application/json"}).content | ConvertFrom-Json
$fill1 = "" | select EmailsReceived,EmailsSent,EmailsBounced
$fill1.EmailsReceived = $test.results.count_injected
$fill1.EmailsSent = $test.results.count_accepted
$fill1.EmailsBounced = $test.results.count_bounce
$fill2 = "" | select DeliveredPrecentage,BouncesPrecentage
$fill2.DeliveredPrecentage = [math]::round($test.results.count_accepted/$test.results.count_injected*100,2)
$fill2.BouncesPrecentage = [math]::round(($test.results.count_bounce)/$test.results.count_accepted*100,2)
$arr = @()
$arr += , $fill1
$arr += , $fill2
My problem is that I can't simply convert $arr into an HTML file like I've done numerous times before.
$arr
EmailsReceived EmailsSent EmailsBounced
107 107 12
On the other hand
$arr | Format-List
EmailsReceived : 107
EmailsSent : 107
EmailsBounced : 12

DeliveredPrecentage : 100
BouncesPrecentage : 11.21
I'd like to make an HTML out of everything so I can send it via email later. How can I pipe it all?
It's because $arr1 and $arr2 are two different PSCustomObjects. You can print the whole array in a list using the Format-List cmdlet:
$arrtotal | Format-List
Output:
blabla :
blabla2 :
blabla :
blabla2 :
Answer to your edit:
This looks like a single record to me; try this:
$record = [PsCustomObject]@{
EmailsReceived = $test.results.count_injected
EmailsSent = $test.results.count_accepted
EmailsBounced = $test.results.count_bounce
DeliveredPrecentage = [math]::round($test.results.count_accepted/$test.results.count_injected*100,2)
BouncesPrecentage = [math]::round(($test.results.count_bounce)/$test.results.count_accepted*100,2)
}
$record | convertto-html | out-file file.html
Note: I also changed the object-creation logic to the more well-known approach of casting a hashtable to [PSCustomObject].
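If you do want two separate tables in one report (one object per table), one option is to render each object as an HTML fragment and stitch the pieces together yourself; a sketch using the objects above:
# -Fragment emits just the <table> markup, without the surrounding page
$fragments = $fill1, $fill2 | %{ $_ | ConvertTo-Html -Fragment | Out-String }
"<html><body>$($fragments -join '<br />')</body></html>" | Out-File report.html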
