Powershell / SQL Server / Import-CSV

Working on another script for work: I'm attempting to read from a CSV containing only one column of data, then for each item find the corresponding ID by querying the SQL database, and finally put the result (ID1, CSVID1) into an Excel file (I have this part working fine).
Now I have run into an issue with how to populate the dataset within a foreach loop.
$excelAssets = Import-Csv .\test.csv -Header assetId | Foreach-Object {
    $assetId = $_.assetId
    # SQL Query Variables
    $query = "SELECT AssetId AS AssetID, BrandId AS BrandID FROM [AssetLibrary_BrandAsset] WHERE AssetId = $assetId"
    $connection = New-SqlConnection -Server $dataSource -Database $dataBase
    # Execute the SQL commands and place the results in dataset
    if ($connection.State -eq 'Open')
    {
        $swLap = Start-Elapsed $sw "Executing SQL Query"
        Write-Verbose "$query";
        $dataSet += Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0'
        $i++
        $connection.Close();
        End-Elapsed $sw $swLap
    } ELSE {
        Write-Error "$($(Format-Elapsed $swLap)) SQL Connection Not Open - Exiting...";
        exit;
    }
}
Now $dataSet += doesn't work, and I have googled numerous times to try to find the answer to this problem. Any help is appreciated.
Using the $dataSet:
$dataTable = new-object "System.Data.DataTable" "Results"
$dataTable = $dataSet.Tables[0]
$rowDT = $dataTable.Rows.Count;
$colDT = $dataTable.Columns.Count;
Write-Host -NoNewLine "$(Format-Elapsed $sw.Elapsed) Rows: ";
Write-Host -NoNewLine "$($rowDT+1)" -ForegroundColor "Green";
Write-Host -NoNewLine " Columns: "
Write-Host -NoNewLine "$($colDT+1)" -ForegroundColor "Green";
Write-Host -NoNewLine " Cells: "
Write-Host "$( ($colDT+1)*($rowDT+1) )" -ForegroundColor "Green";
#Create a 2D Array of the DataTable
# http://stackoverflow.com/questions/13184191/fastest-way-to-drop-a-dataset-into-a-worksheet
$tableArray = New-Object 'object[,]' $rowDT, $colDT;
$swLap = Start-Elapsed $sw "DataTable transformation"
# i = row and j = column
for ($i=0; $i -lt $rowDT; $i++)
{
    #Write-Progress -Activity "Transforming DataTable" -status "Row $i" -percentComplete ($i / $rowDT*100)
    for ($j=0; $j -lt $colDT; $j++)
    {
        $tableArray[$i,$j] = $dataTable.Rows[$i].Item($j).ToString();
    }
}
End-Elapsed $sw $swLap
$rowOffset = 1; $colOffset = 1;# 1,1 = "A1"
# Write out the header column names
for ($j=0; $j -lt $colDT; $j++)
{
    $ActiveWorksheet.cells.item($rowOffset, $j+1) = $dataTable.Columns[$j].ColumnName;
}
$headerRange = $ActiveWorksheet.Range($ActiveWorksheet.cells.item($rowOffset, $colOffset), $ActiveWorksheet.cells.item($rowOffset, $colDT+$colOffset-1));
$headerRange.Font.Bold = $false
$headerRange.Interior.Color = $headingColour
$headerRange.Font.Name = $headingFont
$headerRange.Font.Color = $headingFontColour
$rowOffset++;
# Extract the data to Excel
$tableRange = $ActiveWorksheet.Range($ActiveWorksheet.cells.item($rowOffset, $colOffset), $ActiveWorksheet.cells.item($rowDT+$rowOffset-1, $colDT+$colOffset-1));
$tableRange.Cells.Value2 = $tableArray;
# Resize the columns in Excel
$swLap = Start-Elapsed $sw "Resize Excel Worksheet"
$wholeRange = $ActiveWorksheet.UsedRange
$wholeRange.EntireColumn.AutoFit() | Out-Null
End-Elapsed $sw $swLap
# Save Excel workbook
$ActiveWorkbook.SaveAs("$OutputFile")
$ActiveWorkbook.Close()

After you assign to $dataSet the first time, its type is probably not an array, meaning that the += operator doesn't behave exactly as you expect.
You can either initialize $dataSet as an empty array before you start assigning to it:
Import-Csv .\test.csv -Header assetId | Foreach-Object -Begin {$dataSet = @()} -Process {
    # rest of script in here
} -End {return $dataSet}
or you can cast it when assigning:
[array]$dataSet += Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0'
Finally, an alternative solution would be to ensure that the output from Invoke-SQLQuery is treated as an array before you assign it to $dataSet:
$dataSet += @(Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0')
Whatever suits your style of coding.
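As a self-contained sketch (no SQL involved, just illustrative objects) of why the accumulator's type matters, and showing that simply capturing the pipeline output is often the simplest option:
# Option 1: initialize the accumulator as an array before adding to it
$dataSet = @()
1..3 | ForEach-Object { $dataSet += [pscustomobject]@{ Id = $_ } }

# Option 2: let the pipeline collect the output for you
$dataSet = 1..3 | ForEach-Object { [pscustomobject]@{ Id = $_ } }

$dataSet.Count   # 3 in both cases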

Related

Powershell - proper way to execute SQL query with multiple select statements and result tables

I'm trying to execute a SQL query with a few select statements that returns multiple tables as a result. The problem is that I can't find a way to read and use the tables separately.
Expected results:
Actual results: (it is printed row by row)
Purpose: I've made a script that creates an empty Excel file with multiple sheets, and each sheet will be used to contain one result set of the query.
The only thing left is to put the needed text into the sheets. Here is my code for that part only:
$ConnectionString = "Data Source=...;Initial Catalog=...;User Id=...;Password=..."
$DBServerName = $ConnectionString.split('=')[1].split(';')[0]
$DBName = $ConnectionString.split('=')[2].split(';')[0]
$DBUser = $ConnectionString.split('=')[3].split(';')[0]
$DBPassword = $ConnectionString.split('=')[4].split(';')[0]
$CurrentFilePath = "C:\SQLqueryWithManyResultsets.sql"
$query = Get-Content -literalPath $CurrentFilePath | Out-String #getting the query string from file
$resultTables = Invoke-Sqlcmd -Query $query -ServerInstance $DBServerName -Database $DBName -DisableVariables -Password $DBPassword -Username $DBUser -ErrorAction Stop
foreach ($result in $resultTables) {
    $result | Format-Table #where the magic happens
}
I've done a lot of research, but I cannot find a proper way to store and read the tables the way I need.
Try this:
Clear-Host;
$objConnection = New-Object System.Data.SqlClient.SqlConnection;
$objConnection.ConnectionString = "...";
$ObjCmd = New-Object System.Data.SqlClient.SqlCommand;
$ObjCmd.CommandText = "...";
$ObjCmd.Connection = $objConnection;
$ObjCmd.CommandTimeout = 0;
$objAdapter = New-Object System.Data.SqlClient.SqlDataAdapter;
$objAdapter.SelectCommand = $ObjCmd;
$objDataSet = New-Object System.Data.DataSet;
$objAdapter.Fill($objDataSet) | Out-Null;
for ($i=0; $i -lt $objDataSet.Tables.Count; $i++) {
    Write-Host ($objDataSet.Tables[$i] | Format-Table | Out-String);
}
$query = $null;
$objDataSet = $null;
$objConnection.Close();
$objConnection = $null;
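Each result set ends up as its own DataTable inside the DataSet, so you can also address them individually rather than just printing them. A minimal sketch (cell access by ordinal or by a column name like "SomeColumn" is only illustrative):
$firstResult = $objDataSet.Tables[0]
$secondResult = $objDataSet.Tables[1]
foreach ($row in $firstResult.Rows) {
    # read individual cells by ordinal ($row[0]) or by column name ($row["SomeColumn"])
    Write-Host ($row.ItemArray -join "`t")
}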

How to create only specific delete statements using Scripter

I am using the Scripter class to give me a script for the data out of an existing database. I want to script a dataset that can be inserted into a production database. We are doing this to test whether an installation of our software is correct.
Unfortunately the dataset has to be removed later without any entries left behind, so that it does not interfere with our customers' data. So what I need are INSERT and DELETE statements. These are maintained manually at the moment, which is too much of a burden.
So I just went and executed the Scripter twice (once for INSERT, once for DELETE).
The problem is that when ScriptDrops is set to true, the output is in the form
DELETE FROM [dbo].[TableName]
What I would like is something of the form:
DELETE FROM [dbo].[TableName] WHERE ID = 'GUID'
Technically this should be possible, since there are primary keys on all the tables.
The Scripter class must know about this in some form, since it also gets the order of the DELETE statements (dependencies) correct via foreign keys.
Any help on this would be appreciated.
Following are the 2 PowerShell-scripts I am using to export the data:
ScriptRepositoryData.ps1
$scriptPath = $MyInvocation.MyCommand.Path
$scriptDirectory = Split-Path $scriptPath -Parent
. $scriptDirectory\DatabaseScripting.ps1
$filepath='c:\data.sql'
$database='ECMS_Repository'
$tablesToExclude = @(
    "SomeUnwantedTable"
)
$tablesListFromDatabase = GetTableList $database
$tablesArray = @()
$tablesListFromDatabase |% {
    if (-not $tablesToExclude.Contains($_.Name.ToString()))
    {
        $tablesArray += $_.Name
    }
}
ScriptInsert $database $tablesArray $filepath
DatabaseScripting.ps1
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMOExtended") | out-null
Function GetTableList ($database)
{
    Invoke-SqlCmd -Database $database -query "SELECT * FROM sys.tables"
}
Function ScriptInsert ($database, $tables, $destination)
{
    try {
        $serverMO = new-object ("Microsoft.SqlServer.Management.Smo.Server") "localhost"
        if ($serverMO.Version -eq $null) {Throw "Can't find the instance localhost"}
        $urnsToScript = New-Object Microsoft.SqlServer.Management.Smo.UrnCollection
        $databaseMO = $serverMO.Databases.Item("ECMS_Repository")
        if ($databaseMO.Name -ne $database) {Throw "Can't find the database $database"}
        $tables |% {
            $tableListMO = $databaseMO.Tables.Item($_, "dbo")
            $tableListMO |% {
                $urnsToScript.Add($_.Urn)
            }
        }
        $scripter = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') $serverMO
        $scripter.Options.ScriptSchema = $False;
        $scripter.Options.ScriptData = $true;
        $scripter.Options.ScriptDrops = $true;
        $scripter.Options.ScriptAlter = $true;
        $scripter.Options.NoCommandTerminator = $true;
        $scripter.Options.Filename = $destination;
        $scripter.Options.ToFileOnly = $true
        $scripter.Options.Encoding = [System.Text.Encoding]::UTF8
        $scripter.EnumScript($urnsToScript)
        Write-Host -ForegroundColor Green "Done"
    }
    catch {
        Write-Host
        Write-Host -ForegroundColor Red "Error occured"
        Write-Host
        Write-Host $_.Exception.ToString()
        Write-Host
    }
}
Unfortunately I did not find a way to do this using the SQL Management Objects.
Instead, I now take the output of the Scripter and select the IDs of each table. I then use the IDs to change every line that looks like
DELETE FROM [dbo].[tableName]
to this:
DELETE FROM [dbo].[tableName] WHERE ID IN ('guid1', 'guid2')
Here is how I did it:
$content = Get-Content $destination
Clear-Content $destination
$content |% {
    $line = $_
    $table = $line.Replace("DELETE FROM [dbo].[","").Replace("]","")
    # query the table name extracted from the DELETE line
    $query = "SELECT ID, ClassID FROM [dbo].[" + $table + "]"
    $idsAsQueryResult = Invoke-SqlCmd -Database $database -query $query
    $ids = $idsAsQueryResult | Select-Object -Expand ID
    if ($ids -ne $null) {
        $joinedIDs = [string]::Join("','",$ids)
        $newLine = $line + " WHERE ID IN ('" + $joinedIDs + "')"
        Add-Content $destination $newLine
    }
}
Here $destination is the script that was generated with the Scripter class, and $database is a string containing the database name.
I had to select a second column (ClassID, which is present on all tables due to our OR mapper re-store) because of some weird error in Select-Object which I do not fully understand.
This of course only works because all tables have primary keys, all primary keys are named ID, and none of them are composite keys.
You could of course achieve the same thing for other, more complicated database schemas by extracting the primary key information via SQL Management Objects.
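For example, here is a hedged sketch (the instance name is a placeholder, and it assumes the same SMO assembly used above) of how primary key columns could be read via SMO, so the WHERE clause would not need to assume a column named ID:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$smoServer = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost"   # placeholder instance
$smoDatabase = $smoServer.Databases[$database]
foreach ($smoTable in $smoDatabase.Tables) {
    # the primary key is exposed as an index with IndexKeyType DriPrimaryKey
    $pkIndex = $smoTable.Indexes | Where-Object { $_.IndexKeyType -eq 'DriPrimaryKey' }
    if ($pkIndex) {
        $pkColumns = $pkIndex.IndexedColumns | ForEach-Object { $_.Name }
        Write-Host "$($smoTable.Schema).$($smoTable.Name): $($pkColumns -join ', ')"
    }
}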

Powershell Write-Host showing only dataTable name instead of data

I'm trying to write a PowerShell script that executes a SQL query contained in a .sql file:
Function RunSQLScript ($connstring, $filePath)
{
    $query = get-content $filePath;
    $DTSet = New-Object System.Data.DataSet;
    $Conn = New-Object System.Data.SQLClient.SQLConnection $connstring;
    $Conn.Open();
    try
    {
        $DataCmd = New-Object System.Data.SqlClient.SqlCommand;
        $MyQuery = $query;
        $DataCmd.CommandText = $MyQuery;
        $DataCmd.Connection = $Conn;
        $DAadapter = New-Object System.Data.SqlClient.SqlDataAdapter;
        $DAadapter.SelectCommand = $DataCmd;
        $DAadapter.Fill($DTSet) | Out-Null;
        for ($i = 0; $i -lt $DTSet.Tables.Count; $i++) {
            Write-Host $DTSet.Tables[$i];
        }
    }
    finally
    {
        $Conn.Close();
        $Conn.Dispose();
    }
    return $DTSet;
}
The internal Write-Host is showing the DataTable name instead of the DataRows.
If I manually create a DataSet with a DataTable in Powershell Console, Write-Host shows me the data in the DataTable rows, so I can't really figure out why it is not doing that in the previous script.
Can you give me some clues on how to show the data contained in the datatables instead of the table names?
Thank you
This piece of code was quite helpful for me; I'm posting it here in case anybody needs it.
for ($i = 0; $i -lt $DTSet.Tables.Count; $i++) {
    $DTSet.Tables[$i] | format-table | out-host
}
That produces a nice table-like output on screen.
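For context, Write-Host ends up calling ToString() on the object, and DataTable.ToString() returns the table name, which is why only the name appeared. If you want raw row values instead of a formatted table, a minimal sketch:
foreach ($row in $DTSet.Tables[0].Rows) {
    Write-Host ($row.ItemArray -join "`t")   # print each row's cell values, tab-separated
}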

How to format output when exporting SQL query to CSV

I have a task to save the results of a SQL Server query to a .csv file. After some googling I decided to use PowerShell. I found a script and modified it a bit; it works and almost everything is OK.
$server = "server"
$database = "database"
$query = "SELECT * from et_thanks"
$tod = Get-Date;
$file = "{0:yyyyMMdd}_go.csv" -f $tod;
$extractFile = @"
\\info\export_files\$file
"@
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $query
$command.Connection = $connection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$DataSet.Tables[0] | Export-Csv -Force -Delimiter ";" $extractFile
But I have two problems which I can't solve:
When I open the .csv file I see the column headers and a commented string on the first line:
#TYPE System.Data.DataRow
"ob_no","c_name","c_visible","c_fp","e_from","e_to"
"436439","09.09.2013 11:29:08","0","","10937","260153"
How can I get rid of it?
All values are surrounded with quotes. Is it possible to modify the script so they aren't used when exporting? Auto-replacing them isn't a good idea, because the quote symbol could legitimately appear in the SQL data.
I tried to find answers in the documentation (http://ss64.com/ps/export-csv.html) but with no success.
You might run into trouble removing the quotes, but if that's what you really want, then the following should achieve it.
-NoTypeInformation will remove the additional type information you are seeing.
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter ";" -NoTypeInformation) -replace "`"", "" | `
Out-File -Force $extractFile
This uses ConvertTo-Csv to convert the data to a string representation of the CSV, replaces all instances of the quote character with nothing, and pipes the final string to Out-File.
...and, to get rid of the header record, first convert the data to CSV (ConvertTo-Csv), then pipe those results to Select to skip the first record:
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter "`t" -NoTypeInformation ) -Replace "`"","" | Select -skip 1 | Out-File blahblahblah...
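For reference, if the quotes and the header row are acceptable and only the #TYPE line is the problem, Export-Csv can drop it on its own (a one-line sketch using the variables from the question):
$DataSet.Tables[0] | Export-Csv -Force -Delimiter ";" -NoTypeInformation $extractFile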
Agreed, Export-Csv isn't the best tool for the job. I would use sqlcmd.exe or bcp.exe, provided the SQL Server command-line tools are installed. You could also build a simple routine to create a CSV from a DataTable:
$result = new-Object text.stringbuilder
$dt = $DataSet.Tables[0]
foreach ($dr in $dt.Rows) {
    for ($i = 0; $i -lt $dt.Columns.Count; $i++) {
        $null = $result.Append($($dr[$i]).ToString())
        $null = $result.Append($(if ($i -eq $dt.Columns.Count - 1) {"`n" } else { ","} ))
    }
}
$result.ToString()
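And a hedged sketch of the sqlcmd.exe route mentioned above (server, database, and output path are placeholders; -s sets the column separator and -W trims trailing padding):
sqlcmd -S "server" -d "database" -E -Q "SET NOCOUNT ON; SELECT * FROM et_thanks" -s ";" -W -o "C:\export\go.csv"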

Sql Server Script data: SMO.Scripter not working when output to file

I get this error message when I run the Powershell script at the bottom:
Exception calling "EnumScript" with "1" argument(s): "Script failed for Table 'dbo.Product'. "
At :line:48 char:35
+ foreach ($s in $scripter.EnumScript <<<< ($tbl)) { write-host $s }
However, when I comment out the output_file line
#$output_file="C:\Product.sql"
(which means the Scripter options won't be set to write to file), it works fine and outputs the INSERT statements to the console.
Here's the failing script; is there something I'm missing?
# Script INSERTs for given table
param
(
    [string] $server,
    [string] $database,
    [string] $schema,
    [string] $table,
    [string] $output_file
)
$server="devdidb02"
$database="EPCTrunk_EPC"
$schema="dbo"
$table="Product"
$output_file="C:\Product.sql"
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null
$srv = New-Object "Microsoft.SqlServer.Management.SMO.Server" $server
$db = New-Object ("Microsoft.SqlServer.Management.SMO.Database")
$tbl = New-Object ("Microsoft.SqlServer.Management.SMO.Table")
$scripter = New-Object ("Microsoft.SqlServer.Management.SMO.Scripter") ($server)
# Get the database and table objects
$db = $srv.Databases[$database]
$tbl = $db.tables | Where-Object {$_.schema -eq $schema -and $_.name -eq $table}
# Set scripter options to ensure only data is scripted
$scripter.Options.ScriptSchema = $false;
$scripter.Options.ScriptData = $true;
#Exclude GOs after every line
$scripter.Options.NoCommandTerminator = $true;
if ($output_file -gt "")
{
    $scripter.Options.FileName = $output_file
    $scripter.Options.ToFileOnly = $true
}
# Output the script
foreach ($s in $scripter.EnumScript($tbl)) { write-host $s }
I ran both your script and Keith's, and it looks like the issue is the path you are setting for the file. I was able to reproduce your error. You were using $output_file="C:\Product.sql"; when I changed the path to $output_file="$home\Product.sql", it ran just fine and gave me the file.
I am guessing the reason for this is that I don't have permission to write to C:\, which may be the problem you are having as well.
BTW, my home dir in this case is my user folder for my login, so I was able to find the file there.
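If you want to confirm it is a permissions issue, a quick (purely illustrative) check is to try writing a scratch file to the same folder and see whether it throws:
try {
    [IO.File]::WriteAllText('C:\__writetest.tmp', 'test')   # hypothetical scratch file
    Remove-Item 'C:\__writetest.tmp'
    Write-Host "Write access to C:\ looks fine"
} catch {
    Write-Host "No write access to C:\ : $($_.Exception.Message)"
}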
FWIW I'm not able to repro the error you see using the AdventureWorks DB. The following generates the foo.sql file without any errors:
Add-Type -AssemblyName ('Microsoft.SqlServer.Smo, Version=10.0.0.0, ' + `
'Culture=neutral, PublicKeyToken=89845dcd8080cc91')
$serverName = '.\SQLEXPRESS'
$smo = new-object Microsoft.SqlServer.Management.Smo.Server $serverName
$db = $smo.Databases['AdventureWorks']
$tbl = $db.Tables | Where {$_.Schema -eq 'Production' -and `
$_.Name -eq 'Product'}
$output_file = "$home\foo.sql"
$scripter = new-object Microsoft.SqlServer.Management.Smo.Scripter $serverName
$scripter.Options.ScriptSchema = $false;
$scripter.Options.ScriptData = $true;
$scripter.Options.NoCommandTerminator = $true;
if ($output_file -gt "")
{
    $scripter.Options.FileName = $output_file
    $scripter.Options.ToFileOnly = $true
}
# Output the script
foreach ($s in $scripter.EnumScript($tbl)) { write-host $s }
