Importing CSV to SQL Server using bulkcopy - sql-server

I'm trying to import my CSV files into SQL Server. I found this code and it works perfectly and very fast:
# Database variables
$sqlserver = "servername"
$database = "databasename"
$table = "tablename"
# CSV variables
$csvfile = "F:\TestNA\fin_product4.csv"
$csvdelimiter = ";"
$firstRowColumnNames = $true
################### No need to modify anything below ###################
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
# 50k worked fastest and kept memory usage to a minimum
$batchsize = 50000
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable, and autogenerate the columns.
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($firstRowColumnNames -eq $true) { $null = $reader.readLine() }
#foreach ($column in $columns) {
# $null = $datatable.Columns.Add()
#}
$col1 = New-Object system.Data.DataColumn fin_product_rk,([datetime])
$col2 = New-Object system.Data.DataColumn fin_product_id,([datetime])
$datatable.columns.add($col1)
$datatable.columns.add($col2)
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {
$null = $datatable.Rows.Add($line.Split($csvdelimiter))
$i++; if (($i % $batchsize) -eq 0) {
$bulkcopy.WriteToServer($datatable)
Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
$datatable.Clear()
}
}
# Add in all the remaining rows since the last clear
if($datatable.Rows.Count -gt 0) {
$bulkcopy.WriteToServer($datatable)
$datatable.Clear()
}
# Clean Up
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
Write-Host "Script complete. $i rows have been inserted into the database."
Write-Host "Total Elapsed Time: $($elapsed.Elapsed.ToString())"
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
The problem is that it works for the standard Latin encoding, but I have CSVs in UTF-8 and Windows-1251 encodings.
What should I add, and where, to change the encoding in this code?
I don't know the programming language this code is written in, so I can't do it myself; I would be happy if someone could help!
Thank you!
Update: CSV example:
product;product_id;product_nm;dttm
220;text;некоторый текст;12JAN2021:18:03:41.000000
220;text;некоторый текст;1JAN2021:18:03:41.000000
564;text;некоторый текст;16JAN2021:18:03:41.000000
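As background, the script above reads the data rows with System.IO.StreamReader, whose constructor also accepts an encoding. A minimal sketch of the kind of change being asked about, assuming the variables from that script ($csvfile, $csvdelimiter) and the two encodings named in the question:
# pick the encoding that matches the file
$encoding = [System.Text.Encoding]::UTF8                 # for the UTF-8 files
# $encoding = [System.Text.Encoding]::GetEncoding(1251)  # for the Windows-1251 files
# pass it to the StreamReader that feeds the DataTable
$reader = New-Object System.IO.StreamReader($csvfile, $encoding)
# Get-Content also accepts -Encoding (e.g. -Encoding UTF8) for the header line
$columns = (Get-Content $csvfile -TotalCount 1 -Encoding UTF8).Split($csvdelimiter)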

Here is a solution in T-SQL.
It is very concise, a single statement, compared with the PowerShell approach.
Notable points:
The BULK INSERT parameter CODEPAGE = '65001' specifies UTF-8.
The product_nm NVARCHAR(100) column holds Unicode characters from the file.
SQL
USE tempdb;
GO
DROP TABLE IF EXISTS dbo.tbl;
CREATE TABLE dbo.tbl (
product VARCHAR(10),
product_id VARCHAR(30),
product_nm NVARCHAR(100),
dttm VARCHAR(50)
);
BULK INSERT dbo.tbl
FROM 'e:\Temp\Faenno_2.csv'
WITH (FORMAT='CSV'
, DATAFILETYPE = 'char' -- { 'char' | 'native' | 'widechar' | 'widenative' }
, FIELDTERMINATOR = ';'
, ROWTERMINATOR = '\n'
, FIRSTROW = 2
, CODEPAGE = '65001');
-- test
SELECT * FROM dbo.tbl;
Output
+---------+------------+-----------------+---------------------------+
| product | product_id | product_nm      | dttm                      |
+---------+------------+-----------------+---------------------------+
| 220     | text       | некоторый текст | 12JAN2021:18:03:41.000000 |
| 220     | text       | некоторый текст | 1JAN2021:18:03:41.000000  |
| 564     | text       | некоторый текст | 16JAN2021:18:03:41.000000 |
+---------+------------+-----------------+---------------------------+

Related

Powershell and Excel - Datatable slow with too many rows

I'm trying to fill an Excel sheet using PowerShell:
0. Declaring Excel object
$Excel = New-Object -ComObject Excel.Application
$Excel.DisplayAlerts = $false
$Excel.ScreenUpdating = $false
$Excel.DisplayStatusBar = $false
$Excel.EnableEvents = $false
$Excel.Visible = $False
1. Reading data from a database table:
$dt1 = New-Object System.Data.Dataset
2. Getting the table:
$dt_table1 = $dt1.Tables[0]
3. Filling Excel file:
for ([Int]$m = 0; $m -lt $dt_table1.Rows.Count; $m++)
{
for ([Int]$r = 0; $r -lt $dt_table1.Columns.Count; $r++)
{
$incolumn = $r + 1;
$inrow = $inheaderlenght + 2 + $m;
if($incolumn -gt 2)
{
$Workbook.ActiveSheet.Cells.Item($inrow, $incolumn) = [System.Convert]::ToDecimal($dt_table1.Rows[$m].ItemArray[$r])
}
else
{
$Workbook.ActiveSheet.Cells.Item($inrow, $incolumn) = $dt_table1.Rows[$m].ItemArray[$r].ToString()
}
}
}
With a few hundred rows the sheet fills in seconds; the problem is that with thousands of rows it is very slow. For example, filling 21,500 rows takes at least 15 minutes.
I'm executing this code on my production server, with 32 GB of RAM and an Intel Xeon processor.
I would like to improve the performance; I need to fill an Excel file with 32 sheets, and only a few sheets have thousands of rows.
UPDATE: I wanted to fill the Excel sheet directly from an array:
$excelArray = New-Object 'object[,]' $dt_table1.Rows.Count, $dt_table1.Columns.Count
$excelArray = ForEach($Row in $dt1.Tables[0].Rows){
$Record = New-Object PSObject
ForEach($Col in $dt1.Tables[0].Columns.ColumnName){
Add-Member -InputObject $Record -NotePropertyName $Col -NotePropertyValue $Row.$Col
}
$Record
}
But now, the next line fails:
$range = $WorkSheet.Range('A1', ([char](64 + $dt_table1.Columns.Count)).ToString() + ($dt_table1.Rows.Count).ToString() )
$range.Value2 = $excelArray
Shamelessly using some of f6a4's answer, I think this could work:
# 3. Filling Excel file :
# convert to array of objects
$tableData = $dt_table1 | Select-Object * -ExcludeProperty ItemArray, Table, RowError, RowState, HasErrors
# select the range in the worksheet
$endRange = '{0}{1}' -f ([char](64 + $dt_table1.Columns.Count)), $dt_table1.Rows.Count
$range = $Workbook.ActiveSheet.Range('A1', $endRange)
# copy the array to excel range
$range.Value2 = $tableData
OK, I found a solution now.
$excelArray | ConvertTo-CSV -NoType -Del "`t" | Select -Skip 1 | Clip
[void]$WorkSheet.columns.Item(1).cells.Item(2).PasteSpecial()
From 15 minutes down to 2. This really helps me a lot; I have to produce 500+ Excel files.
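For reference, another common way to avoid the per-cell COM overhead is to build a true two-dimensional array and assign it to a range in a single call. A minimal sketch, assuming $Workbook and $dt_table1 from the code above and no more than 26 columns:
# copy the DataTable into a 2-D array first
$rows = $dt_table1.Rows.Count
$cols = $dt_table1.Columns.Count
$buffer = New-Object 'object[,]' $rows, $cols
for ($m = 0; $m -lt $rows; $m++) {
    for ($r = 0; $r -lt $cols; $r++) {
        $buffer[$m, $r] = $dt_table1.Rows[$m].ItemArray[$r]
    }
}
# write the whole block in one COM call (data starting at A2, headers assumed in row 1)
$endRange = '{0}{1}' -f [char](64 + $cols), ($rows + 1)
$range = $Workbook.ActiveSheet.Range('A2', $endRange)
$range.Value2 = $buffer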

Can a PS Custom Object be created from a variable?

I have a 100-column table in SQL Server, and I want to make it so that not all of the columns need to be present in the file being loaded. I have stored the column names in a table, and I compare the columns in a hash table to find the matching ones. I then build the code, based on the matches, for the array I want to use to insert the data from the file. The problem is that it doesn't like calling the one variable to create the custom object.
I store the following in an array (up to 100 of these; a few are shown below as a sample, and notice that sqlcolumn2 is skipped, for example):
sqlcolumn1 = if ([string]::IsNullOrEmpty($obj.P1) -eq $true) {$null} else {"$obj.P1"}
sqlcolumn3 = if ([string]::IsNullOrEmpty($obj.P2) -eq $true) {$null} else {"$obj.P2"}
sqlcolumn4 = if ([string]::IsNullOrEmpty($obj.P3) -eq $true) {$null} else {"$obj.P3"}
sqlcolumn5 = if ([string]::IsNullOrEmpty($obj.P4) -eq $true) {$null} else {"$obj.P4"}
Here is the array:
foreach($line in $Final)
{
$DataRow = "$($line."TableColumnName") = if ([string]::IsNullOrEmpty(`$obj.$($line."PName")) -eq `$true) {`$null} else {`"`$obj.$($line."PName")`"}"
$DataArray += $DataRow
}
I then try to add it to a final array where I would want this to be looped through for each row of data after which I would perform the insert from the array. Even though the "string" value in the array above is correct if it were hand coded, I can't get it to recognize the rows and run.
foreach ($obj in $data2)
{
$test = [PSCustomObject] @{
$DataArray = Invoke-Expression $DataArray
}
If I just type $DataArray, it doesn't like this because it wants the = sign, which I already have built into the string.
Is what I am trying to do even possible?
I was attempting to template out the various ways we receive this data, where some people send us 30 of the 100 columns, others more or less, and no one uses exactly the same columns, to cut down on individual scripts for everything.
Adding more code:
Function ArrayCompare() {
[CmdletBinding()]
PARAM(
[Parameter(Mandatory=$True)]$Array1,
[Parameter(Mandatory=$True)]$A1Match,
[Parameter(Mandatory=$True)]$Array2,
[Parameter(Mandatory=$True)]$A2Match)
$Hash = @{}
foreach ($Data In $Array1) {
$Hash[$Data.$A1Match] += ,$Data
}
foreach ($Data In $Array2) {
$Hash[$Data.$A2Match] += ,$Data
}
foreach ($KeyValue In $Hash.GetEnumerator()){
$Match1, $Match2 = $KeyValue.Value.Where( {$_.$A1Match}, 'Split')
[PSCustomObject]@{
MatchValue = $KeyValue.Key
A1Matches = $Match1.Count
A2Matches = $Match2.Count
TablePosition = [int]$Match2.TablePosition
TableColumnName = $Match2.TableColumnName
# PName is the generic ascending P(##) column name assigned by the Import-Excel module: ColumnA = P1, ColumnB = P2, etc., until no data is detected. This allows flexibility without having to know how many columns there are
PName = $Match1.Name}
}
}
$Server = 'ServerName'
$Catalog = 'DBName'
$DestinationTable = 'ImportIntoTableName'
$FileIdentifierID = 10
$FileName = 'Test.xlsx'
$FilePath = 'C:\'
$FullFilePath = $FilePath + $FileName
$data = Import-Excel -Path $FullFilePath -NoHeader -StartRow 1 # Import-Excel module for working with xlsx Excel files
$data2 = Import-Excel -Path $FullFilePath -NoHeader -StartRow 2 # Import-Excel module for working with xlsx Excel files
$ExpectedHeaderArray = @()
$HeaderArray = @()
$DataArray = @()
$HeaderDetect = @()
$HeaderDetect = $data | Select-Object -First 1 # Header Row In File
$HeaderDetect |
ForEach-Object {
$ColumnValue = $_
$ColumnValue |
Get-Member -MemberType *Property |
Select-Object -ExpandProperty Name |
ForEach-Object {
$HeaderValues = [PSCustomObject]@{
Name = $_
Value = $ColumnValue.$_}
$HeaderArray += $HeaderValues
}
}
# Query below provides a list of all expected file headers and the table column name they map to
$Query = "SELECT TableColumnName, FileHeaderName, TablePosition FROM dbo.FileHeaders WHERE FileIdentifierID = $($FileIdentifierID)"
$ds = Invoke-Sqlcmd -ServerInstance $Server -Database $Catalog -Query $Query -OutputAs DataSet
$ExpectedHeaderArray = foreach($Row in $ds.Tables[0].Rows)
{
new-object psObject -Property @{
TableColumnName = "$($row.TableColumnName)"
FileHeaderName = "$($row.FileHeaderName)"
TablePosition = "$($row.TablePosition)"
}
}
#Use Function Above
#Bring it together so we know what P(##) goes with which header in file/mapped to table column name
$Result = ArrayCompare -Array1 $HeaderArray -A1Match Value -Array2 $ExpectedHeaderArray -A2Match FileHeaderName
$Final = $Result | sort TablePosition
foreach($Line in $Final)
{
$DataRow = "$($Line."TableColumnName") = if ([string]::IsNullOrEmpty(`$obj.$($Line."PName")) -eq `$true) {`$null} else {`"`$obj.$($Line."PName")`"}"
$DataArray += $DataRow
}
# The output below is what the code inside the last array would be that I would use to import into excel.
# The goal is to be dynamic and match headers in the file to the stored header value and import into a table (mapped from header column to table column name)
# The reason for this is that before I was here, there were many different "versions" of the layout given out. In the end, it is all one and the same,
# but some send all 100 columns, some only send a handful, some send 80, etc. I am trying to have everything flow through here vs. 60+ pieces of code/stored procedures/SSIS packages
Write-Output $DataArray
# Output Sample -- Note how P2 and the subsequent columns skip SqlColumn2, because P2 maps to the header value at position 3 in the SQL table and each one after it is offset by one.
# In this example, SqlColumn2 would not be populated
# SqlColumn1 = if ([string]::IsNullOrEmpty($obj.P1) -eq $true) {$null} else {"$obj.P1"}
# SqlColumn3 = if ([string]::IsNullOrEmpty($obj.P2) -eq $true) {$null} else {"$obj.P2"}
# SqlColumn4 = if ([string]::IsNullOrEmpty($obj.P3) -eq $true) {$null} else {"$obj.P3"}
# SqlColumn5 = if ([string]::IsNullOrEmpty($obj.P4) -eq $true) {$null} else {"$obj.P4"}
# I know this doesn't work. This is where I'm stuck: how to build an array from the output above
foreach ($obj in $data2)
{
$test = [PSCustomObject] @{
$DataArray = Invoke-Expression $DataArray}
}
I'm going to re-state your question first, just to make sure I understand it properly (it's possible I don't!)...
You've got an excel file that looks something like this:
+---+---------+---------+---------+
|   | A       | B       | C       |
+---+---------+---------+---------+
| 1 | HeaderA | HeaderB | HeaderC |
+---+---------+---------+---------+
| 2 | Value P | Value Q | Value R |
+---+---------+---------+---------+
| 3 | Value S | Value T | Value U |
+---+---------+---------+---------+
You've also got a database table which looks like this:
+---------+---------+---------+---------+
| ColumnW | ColumnX | ColumnY | ColumnZ |
+---------+---------+---------+---------+
| ....... | ....... | ....... | ....... |
+---------+---------+---------+---------+
and a column mapping table like this (note, ColumnX isn't mapped in this example):
+-----------------+----------------+---------------+
| TableColumnName | FileHeaderName | TablePosition |
+-----------------+----------------+---------------+
| ColumnW | HeaderA | 1 |
+-----------------+----------------+---------------+
| ColumnY | HeaderB | 2 |
+-----------------+----------------+---------------+
| ColumnZ | HeaderC | 3 |
+-----------------+----------------+---------------+
You want to insert the values from the spreadsheet into the database table, using the data in your mapping table so you get this:
+---------+---------+---------+---------+
| ColumnW | ColumnX | ColumnY | ColumnZ |
+---------+---------+---------+---------+
| Value P | null    | Value Q | Value R |
+---------+---------+---------+---------+
| Value S | null    | Value T | Value U |
+---------+---------+---------+---------+
So let's load the spreadsheet (letting the header row generate meaningful property names this time):
$data = Import-Excel -Path ".\MySpreadsheet.xlsx";
write-host ($data | ft | out-string);
# HeaderA HeaderB HeaderC
# ------- ------- -------
# Value P Value Q Value R
# Value S Value T Value U
and get your column mapping data (I'm programmatically creating an in-memory dataset, but you obviously read yours from your database instead):
$mappings = new-object System.Data.DataTable;
$null = $mappings.Columns.Add("TableColumnName", [string]);
$null = $mappings.Columns.Add("FileHeaderName", [string]);
$null = $mappings.Columns.Add("TablePosition", [int]);
@(
@{ "TableColumnName"="ColumnW"; "FileHeaderName"="HeaderA"; "TablePosition"=1 },
@{ "TableColumnName"="ColumnY"; "FileHeaderName"="HeaderB"; "TablePosition"=2 },
@{ "TableColumnName"="ColumnZ"; "FileHeaderName"="HeaderC"; "TablePosition"=3 }
) | % {
$row = $mappings.NewRow();
$row.TableColumnName = $_.TableColumnName;
$row.FileHeaderName = $_.FileHeaderName;
$row.TablePosition = $_.TablePosition;
$mappings.Rows.Add($row);
}
$ds = new-object System.Data.DataSet;
$ds.Tables.Add($mappings);
write-host ($ds.Tables[0] | ft | out-string)
# TableColumnName FileHeaderName TablePosition
# --------------- -------------- -------------
# ColumnW HeaderA 1
# ColumnY HeaderB 2
# ColumnZ HeaderC 3
Now we can build the "mapped" objects:
$values = @();
foreach( $row in $data )
{
$properties = [ordered] @{};
foreach( $mapping in $mappings )
{
$properties.Add($mapping.TableColumnName, $row."$($mapping.FileHeaderName)");
}
$values += new-object PSCustomObject -Property $properties;
}
write-host ($values | ft | out-string)
# ColumnW ColumnY ColumnZ
# ------- ------- -------
# Value P Value Q Value R
# Value S Value T Value U
The tricksy bit is $properties.Add($mapping.TableColumnName, $row."$($mapping.FileHeaderName)"); - basically, you can access object properties in PowerShell using a dotted string literal or variable (I'm not sure of the exact feature name) - e.g.
PS> $myValue = new-object PSCustomObject -Property @{ "aaa"="bbb"; "ccc"="ddd" }
PS> $myValue."aaa"
bbb
PS> $myProperty = "aaa"
PS> $myValue.$myProperty
"bbb"
so $row."$($mapping.FileHeaderName)" is an expression that evaluates to the value of the property of $row named in $mapping.FileHeaderName.
And then finally you can insert the objects into your database using your existing process...
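One possible sketch of that last step, assuming the SqlBulkCopy approach from the first question above and placeholder connection details (ServerName, DBName and ImportIntoTableName are stand-ins, and every column is treated as a string):
# Sketch only: load $values into a DataTable and bulk-copy it to SQL Server.
$dt = New-Object System.Data.DataTable
foreach ($name in $values[0].PSObject.Properties.Name) {
    $null = $dt.Columns.Add($name, [string])
}
foreach ($v in $values) {
    $row = $dt.NewRow()
    foreach ($p in $v.PSObject.Properties) {
        # DataRow cells take DBNull rather than $null
        $row[$p.Name] = if ($null -ne $p.Value) { $p.Value } else { [DBNull]::Value }
    }
    $null = $dt.Rows.Add($row)
}
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy("Data Source=ServerName;Integrated Security=true;Initial Catalog=DBName;")
$bulk.DestinationTableName = 'ImportIntoTableName'
$bulk.WriteToServer($dt)
$bulk.Close()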
Note that I couldn't quite work out what your ArrayCompare is actually doing so it's possible the above doesn't solve your problem 100%, but it's hopefully close enough that you can either work the difference out yourself, or leave a comment with where it differs from your desired solution.
Hope this helps.

Extracting array headers and keeping their order

I wrote a function which creates an XLS file from an array. Everything works well, but I'm facing a problem when extracting the headers from my array.
My headers are organised alphabetically after extraction, but I want to keep them in the same order as in my input array.
function DoTheMagicExcel {
[cmdletbinding()]
Param(
<# Array to work on #>
[Parameter( Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[Array]$MyArray,
<# Its Excel Sheet name #>
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[alias('Sheet')]
[string]$MySheetName,
<# Excel file name #>
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[ValidatePattern("\.(csv|xls)$")]
[alias('XlsFile')]
[string]$MyExcelFile
)
Begin {
<# Init them all #>
$excel = new-object -comobject Excel.Application
$excel.visible = $False
$excel.DisplayAlerts = $False
$XlsAlreadyExist = $False
if (Test-Path $MyExcelFile) {
Write-Output "Excel file already exists"
$XlsAlreadyExist = $true
$workbook = $excel.Workbooks.open("$PSScriptRoot\$MyExcelFile")
$workbook.Worksheets.Add() | Out-Null
$workbook.WorkSheets.item(1).Name = $MySheetName
$MySheet = $workbook.Worksheets.Item($MySheetName)
}
else {
$workbook = $excel.Workbooks.Add()
$workbook.WorkSheets.item(1).Name = $MySheetName
$MySheet = $workbook.Worksheets.Item($MySheetName)
}
$MyHeadercolumn = 1
$StartRow = 2
}
Process {
$MyArrayHeader = $MyArray | Get-member -MemberType 'NoteProperty' | Select-Object -ExpandProperty 'Name'
foreach ($header in $MyArrayHeader) {
$MySheet.cells.item(1, $MyHeadercolumn).font.bold = $true
$MySheet.cells.item(1, $MyHeadercolumn) = $header
$MyHeadercolumn++
}
foreach ($ThisRow in $MyArray) { ......
Here is what I mean:
$AllBcksummary[1] |ft -AutoSize
Server RG Environment Availibility Vault Status Regle Dernier Backup Type
------- -- ----------- ------------ ------ ------ ----- -------------- ----
SEFRAPB0106 RG_AXL production NOT SET backup Healthy Policy14512-BDay-Prod 24/01/2018 19:22:15 AppConsistent
$MyArrayHeader = $AllBcksummary | Get-member -MemberType 'NoteProperty' | Select-Object -ExpandProperty 'Name'
$MyArrayHeader |ft -AutoSize
Availibility
Coffre
Dernier Backup
Environment
Regle
RG
Serveur
Status
Type
Then when I create my Excel file, I no longer keep the original order:
foreach ($header in $MyArrayHeader) {
$MySheet.cells.item(1, $MyHeadercolumn).font.bold = $true
$MySheet.cells.item(1, $MyHeadercolumn) = $header
$MyHeadercolumn++
}
If you have an idea... :)
Thanks
Almost :)
$AllBcksummary.psobject.properties | select name
Name
----
Capacity
Count
IsFixedSize
IsReadOnly
IsSynchronized
SyncRoot
but adding index 0 does the trick!
$AllBcksummary[0].psobject.properties | select name
Name
----
Serveur
RG
Environment
Availibility
Coffre
Status
Regle
Dernier Backup
Type
Thanks guys
This should do the trick :)
$AllBcksummary.psobject.properties | select Name
From How to get powershell object properties in the same order that format-list does? ;)
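Applied to the function above, the Get-Member line could then be replaced with something along these lines (a sketch, assuming $MyArray has at least one element):
# preserves the original property order instead of Get-Member's alphabetical sort
$MyArrayHeader = $MyArray[0].PSObject.Properties | Select-Object -ExpandProperty Name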

Find strings in one file in another and output certain columns

I have a file that contains CampaignNames and IDs. The two fields are separated by a pipe |, and the IDs are separated by a space. I want to find all rows in a data file (thorn þ delimited) that contain the IDs, and output those rows into separate files per name. This file is usually 4-7 GB, sometimes larger.
campaigns.txt:
Name|NameID
FirstName|123 212 445 39
SecondName|313 939
ThirdName|219
Data ID File:
DateþIDþCode
10-22-14þ123þAbc
10-24-16þ212þPow
09-18-15þ219
So I would want 3 files created. FirstName.txt contains 2 rows. SecondName.txt contains 0 rows. ThirdName.txt contains 1 row.
I cobbled together some code from various sources and came up with this. However, I'm wondering if there's a better way than having to read through the data file multiple times. Any thoughts out there?
$campaigns = Import-Csv "campaigns.txt" -Delimiter "|"
$datafile = "5282_10-19-2016"
$encoding = [Text.Encoding]::GetEncoding('iso-8859-1')
echo "Starting.."
Get-Date -Format g
foreach ($campaign in $campaigns) {
$campaignname = $campaign.CampaignName
$campaignids = $campaign.CampaignID.split(" ")
echo "Looking for $campaignname - $campaignids"
$writer = New-Object System.IO.StreamWriter($campaignname + "_filtered.txt")
foreach ($campaignid in $campaignids) {
$datareader = New-Object System.IO.StreamReader($datafile, $encoding)
while ($dataline = $datareader.ReadLine()) {
if ($dataline -match $campaignid) {
$data = $dataline.Split("þ")
$writer.WriteLine('{0}|{1}|{2}|{3}|{4}|{5}|{6}|{7}', $data[0], $data[3], $data[5], $data[8], $data[12], $data[14], $data[19], $data[20])
}
}
}
$writer.Close()
}
echo "Done!"
Get-Date -Format g
Process the huge data file just once.
Pick the campaign names from a hashtable built from campaigns.txt.
Assuming there are not many campaigns (say, fewer than 1000), write to that many StreamWriters.
$campaignByID = @{}
foreach ($c in (Import-Csv 'campaigns.txt' -Delimiter '|')) {
foreach ($id in ($c.CampaignID -split ' ')) {
$campaignByID[$id] = $c.CampaignName
}
}
$campaignWriters = @{}
$datareader = New-Object IO.StreamReader($datafile, $encoding)
while (!$datareader.EndOfStream) {
$data = $datareader.ReadLine().Split('þ')
$campaignName = $campaignByID[$data[1]]
if ($campaignName) {
$writer = $campaignWriters[$campaignName]
if (!$writer) {
$writer = $campaignWriters[$campaignName] =
New-Object IO.StreamWriter($campaignName + '_filtered.txt')
}
$writer.WriteLine(($data[0,3,5,8,12,14,19,20] -join '|'))
}
}
$datareader.Close()
foreach ($writer in $campaignWriters.Values) {
$writer.Close()
}
To display progress, use Write-Progress based on $datareader.BaseStream.Position / $datareader.BaseStream.Length * 100, but don't do it for every line of the data file because that will slow down the processing. Do it every second, for example, using a datetime variable: when a second has elapsed, update it and display the progress.
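A sketch of that throttling, assuming the check is placed inside the while loop above:
$lastReport = [datetime]::MinValue   # initialised before the loop
# ... inside the loop, after each line is processed:
if (([datetime]::Now - $lastReport).TotalSeconds -ge 1) {
    $lastReport = [datetime]::Now
    $pct = $datareader.BaseStream.Position / $datareader.BaseStream.Length * 100
    Write-Progress -Activity 'Splitting data file' -Status ('{0:N1}% read' -f $pct) -PercentComplete $pct
}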
try this ;)
$campaigns=import-csv C:\temp\campaigns.txt -Delimiter "|"
$datafile=import-csv C:\temp\5282_10-19-2016.txt -Delimiter "þ" -Encoding Default
$DirResult="C:\temp\root"
$campaigns | %{ foreach ($item in ($_.NameID.Split(" "))) {New-Object PSObject -Property @{ Name=$_.Name ; ValID=$item} } } | %{ $datafile | where id -eq $_.ValID | export-csv -Append -Delimiter "|" -Path ("$dirresult\" + $_.ValID + "_filtered.txt") -NoTypeInformation }

Adding column to SQL query on multiple instances from powershell

I have the following PowerShell script, which reads in a list of servers and runs a SQL command on each of them. The data is then exported to CSV and to Excel format.
I would like to be able to add the targeted server name from my server list as the first column, so the columns would look like this (server name added to the front):
Server Name | Name | CollectionSet ID | Collection Mode | Retention Period | Schedule
This is the current script I have:
Param
(
[string]$fServers = 'W:\Theo\Scripts\mdw_servers.csv'
)
$query = "SELECT a.name AS 'DC Name',
collection_set_id AS 'Collection_set ID',
CASE collection_mode
WHEN 1 THEN 'non-cached'
WHEN 0 THEN 'cached'
END AS 'Collection Type' ,
days_until_expiration AS 'Retention Period' ,
b.name AS 'Schedule Name'
FROM msdb.dbo.syscollector_collection_sets a ,
msdb.dbo.sysschedules b
WHERE a.schedule_uid = b.schedule_uid
AND is_running = 1;"
$csvFilePath = "W:\Theo\Scripts\queryresults.csv"
$excelFilePath = "W:\Theo\Scripts\queryresults.xls"
# Run Query against multiple servers, combine results
$allServers = Get-Content -Path $fServers
foreach ($Server in $allServers) {
write-host "Executing query against server: " $Server
$results += Invoke-Sqlcmd -Query $query -ServerInstance $Server;
}
# Output to CSV
write-host "Saving Query Results in CSV format..."
$results | export-csv $csvFilePath -NoTypeInformation
# Convert CSV file to Excel
write-host "Converting CSV output to Excel..."
$excel = New-Object -ComObject excel.application
$excel.visible = $False
$excel.displayalerts=$False
$workbook = $excel.Workbooks.Open($csvFilePath)
$workSheet = $workbook.worksheets.Item(1)
$resize = $workSheet.UsedRange
$resize.EntireColumn.AutoFit() | Out-Null
$xlExcel8 = 56
$workbook.SaveAs($excelFilePath,$xlExcel8)
$workbook.Close()
$excel.quit()
$excel = $null
write-host "Results are saved in Excel file: " $excelFilePath
Any input is appreciated!
Have you tried
SELECT @@SERVERNAME AS 'Server Name'
https://msdn.microsoft.com/en-us/library/ms187944.aspx
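Alternatively, the target instance is already known inside the PowerShell loop, so it could be prepended there with a calculated property. A sketch based on the loop above (an assumption, not the original poster's code):
foreach ($Server in $allServers) {
    write-host "Executing query against server: " $Server
    # prepend the instance name as a 'Server Name' column on each result row
    $results += Invoke-Sqlcmd -Query $query -ServerInstance $Server |
        Select-Object @{ Name = 'Server Name'; Expression = { $Server } }, *
}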
