Powershell single quote all array elements in parameters before SQL EXEC - sql-server

I have the below code snippet and all is working fine, except that none of the values are wrapped in single quotes, which the stored procedure complains about.
Is there a way to tell it to wrap each element in $p in single quotes before the SQL is executed?
$p contains 30 different elements which can be numeric or alpha-numeric.
Param (
    [Parameter(Mandatory=$true)][array]$p
)
Process {
    $conn = New-Object System.Data.SqlClient.SqlConnection
    $conn.ConnectionString = "Data Source=OurServer;Initial Catalog=OurDatabase;Integrated Security=true"
    $cmd = New-Object System.Data.SqlClient.SqlCommand
    $cmd.Connection = $conn
    $cmd.CommandTimeout = 0
    $cmd.CommandText = "EXEC sp_test $p"
}

There are a number of ways to wrap each array element in single quotes:
# Using -replace
$p = $p -replace '^.*$','''$&'''
# Using foreach-object and string format operator
$p = $p | Foreach-Object { "'{0}'" -f $_ }
# Using foreach method
$p = $p.foreach({"'$_'"})
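Note that when an array is expanded inside a double-quoted string, PowerShell joins the elements with spaces ($OFS), not commas. If sp_test expects a comma-separated argument list, you may also need to join the quoted elements yourself. A minimal sketch, assuming sp_test takes the values as positional parameters:
# Quote each element (doubling any embedded single quotes), then join with commas.
# Caution: this still builds dynamic SQL; SqlParameter objects are safer where possible.
$quoted = $p | ForEach-Object { "'{0}'" -f ($_ -replace "'","''") }
$cmd.CommandText = "EXEC sp_test $($quoted -join ',')"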

Related

Split values using Powershell

I have a file with multiple expressions like "$REGX('CareMedic.2_0','CustomerInformation','Customer Information')". The file can be an XML file, a text file, or any other type. If the file contains nine of those expressions, I'm trying to pull all nine and send the values to a database.
I've tried the code below:
$input_path = 'C:\Users\Administrator\Desktop\test2.xml'
$SQLServer = "WIN-17V7QT0IJVK"
$SQLDBName = "Test"
$uid ="WIN-17V7QT0IJVK\Administrator"
$pwd = "letmebackinplease"
$SqlQuery = "SELECT * from product_schema;"
$ConnectionString = "Server = $SQLServer; Database = $SQLDBName; Integrated Security = True;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $ConnectionString
$SqlConnection.open()
if($SqlConnection.state -eq "Open"){
    Write-Host "Test connection successful"
}
$regex = '()\(.*?\)'
$output = Select-String -Path $input_path -Pattern $regex -AllMatches | % { $_.Matches } | % { $_.Value } |
    ForEach-Object {
        ($_ -split "\(|\)")[1]
    }
foreach ($line in $output){
    $line = $line -replace "\(",""
    $line = $line -replace "\)",""
    $line = $line -replace "\'",""
    $col1,$col2,$col3 = $line -split ","
    [PSCustomObject]@{
        col1 = $col1
        col2 = $col2
        col3 = $col3
    } | select col1,col2,col3
    $insert_query = "INSERT INTO [$SQLDBName].[dbo].[product_schema]
        ([version]
        ,[field]
        ,[value])
    VALUES
        ($col1, $col2, $col3);"
    $execute_query = New-Object System.Data.SqlClient.SqlCommand
    $execute_query.connection = $SQLConnection
    $execute_query.commandtext = $insert_query
    $execute_query.ExecuteNonQuery()
}
$SqlConnection.close()
If the file has two of the below:
('Medic.2_0','AgeInformation','Age Information')
('Medic.2_0','TransactionID','Transaction ID')
My actual output should be:
'Medic.2_0' stored in Version Column
'AgeInformation' stored in the Field Column
'Age Information' stored in the value column
'Medic.2_0' stored in Version Column
'TransactionID' stored in the Field Column
'Transaction ID' stored in the value column
I have to take each of the values and store them in columns of a temp table set up on my SQL Server, like below:
Version      Field            Value
Medic.2_0    AgeInformation   Age Information
Medic.2_0    TransactionID    Transaction ID
Error Encountered:
Exception calling "ExecuteNonQuery" with "0" argument(s): "Incorrect syntax near '.2'."
At C:\Users\Administrator\Desktop\test.ps1:47 char:10
+ $execute_query.ExecuteNonQuery()
+ ~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
Can someone please recommend how I should change my code to solve this?
In answer to your original question before editing: Assuming your output looks like this and is saved in a variable named $output
('Medic.2_0','AgeInformation','Age Information')
('Medic.2_0','TransactionID','Transaction ID')
Try this:
foreach ($line in $output){
    $line = $line -replace "\(",""
    $line = $line -replace "\)",""
    $line = $line -replace "\'",""
    $col1,$col2,$col3 = $line -split ","
    [PSCustomObject]@{
        col1 = $col1
        col2 = $col2
        col3 = $col3
    } | select col1,col2,col3 | export-csv d:\test.csv -append -NoTypeInformation
}
We loop through $output line by line, removing the brackets and the single quotes, splitting the remaining text on the comma, and then assigning each of the three entries to the relevant variables. Once they are in variables we can easily create a PSCustomObject and use it to select what we need for Export-Csv.
Try adding this code:
$info = @() # to store your values
foreach($item in $output){
    $z = $item.split(',') # split into 3 strings
    $info += [PSCustomObject]@{ # create a custom object with named columns to store our values
        Version = $z[0]
        Field = $z[1]
        Value = $z[2]
    }
}
Write-Output $info # variable that stores all columns
Then run a foreach loop over each object in $info. You can run it like this:
foreach($data in $info){
    $data.Version # to access the Version field
    $data.Field   # to access the Field field
    $data.Value   # to access the Value field
    .......your SQL query......
}
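As for the SQL error itself: "Incorrect syntax near '.2'" occurs because values such as Medic.2_0 are interpolated into the INSERT statement without quotes. A safer approach (a sketch, assuming the product_schema table from the question, placed inside the foreach loop above) is to use parameter placeholders instead of string interpolation:
# Parameterized INSERT: values never need hand-quoting, and characters like . or ' cannot break the SQL.
$insert_query = "INSERT INTO [dbo].[product_schema] ([version],[field],[value]) VALUES (@version, @field, @value);"
$execute_query = New-Object System.Data.SqlClient.SqlCommand
$execute_query.Connection = $SqlConnection
$execute_query.CommandText = $insert_query
[void]$execute_query.Parameters.AddWithValue('@version', $data.Version)
[void]$execute_query.Parameters.AddWithValue('@field', $data.Field)
[void]$execute_query.Parameters.AddWithValue('@value', $data.Value)
$execute_query.ExecuteNonQuery()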

PowerShell pipe / SQL insert query data limits (and increasing them?)

Are there any limits to the size of a string you can assign to a variable in PowerShell, or to the size of the text sent within a SQL INSERT query?
I have a big CSV file coming in to PowerShell, and through string construction in a foreach loop I am generating SQL INSERT queries for each row. The resulting batch of statements (INSERT query; INSERT query; ...) comes to about 4MB.
The SQL server has a perfect schema to receive the data; however, when sending the 4MB collection of INSERT queries (each separated by ;) I get an error that looks to me like the long 4MB set of insert queries was truncated somehow. I guess I have hit some kind of limit.
Is there a way of getting around this (programmatically in PowerShell), or a way of increasing the size limit of an acceptable collection of SQL INSERT queries?
My code is using System.Data.SqlClient.SqlConnection and System.Data.SqlClient.SqlCommand.
Smaller datasets work OK, but the larger datasets give an error like the following example. Each dataset gives a different "Incorrect syntax near" indicator.
Exception calling "ExecuteNonQuery" with "0" argument(s): "Incorrect syntax
near '('."
At C:\Users\stuart\Desktop\git\ADStfL\WorkInProgress.ps1:211 char:3
+ $SQLCommand.executenonquery()
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
In my experience, the best-performing way to do this is to load the CSV into a DataTable and then use SqlBulkCopy.
$ErrorActionPreference = 'Stop';
$Csv = Import-Csv -Path $FileName;
$SqlServer = 'MyServer';
$SqlDatabase = 'MyDatabase';
$DestinationTableName = 'MyTable';
# Create Connection String
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
# Define your DataTable. The column order of the DataTable must either match the table in the database, or
# you must specify the column mapping in SqlBulkCopy.ColumnMapping. If you have an IDENTITY column, it's a
# bit more complicated
$DataTable = New-Object -TypeName System.Data.DataTable -ArgumentList $DestinationTableName;
$NewColumn = $DataTable.Columns.Add('Id',[System.Int32]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('IntegerField',[System.Int32]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('DecimalField',[System.Decimal]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('VarCharField',[System.String]);
$NewColumn.MaxLength = 50;
$NewColumn = $DataTable.Columns.Add('DateTimeField',[System.DateTime]);
$NewColumn.AllowDBNull = $false;
# Populate your datatable from the CSV file
# You may find that you need to type cast some of the fields.
$Csv | ForEach-Object {
    $NewRow = $DataTable.NewRow();
    $NewRow['Id'] = $_.Id;
    $NewRow['IntegerField'] = $_.IntegerField;
    $NewRow['DecimalField'] = $_.DecimalField;
    $NewRow['VarCharField'] = $_.StringField;
    $NewRow['DateTimeField'] = $_.DateTimeField;
    $DataTable.Rows.Add($NewRow);
}
# Create Connection
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
# Open Connection
$SqlConnection.Open();
# Start Transaction
$SqlTransaction = $SqlConnection.BeginTransaction();
# Double check the possible options at https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopyoptions(v=vs.110).aspx
# If you need multiple then -bor them together
$SqlBulkCopyOptions = [System.Data.SqlClient.SqlBulkCopyOptions]::CheckConstraints;
# Create SqlBulkCopy class
$SqlBulkCopy = New-Object -TypeName System.Data.SqlClient.SqlBulkCopy -ArgumentList $SqlConnection, $SqlBulkCopyOptions, $SqlTransaction;
# Specify destination table
$SqlBulkCopy.DestinationTableName = $DestinationTableName;
# Do the insert; rollback on error
try {
    $SqlBulkCopy.WriteToServer($DataTable);
    $SqlTransaction.Commit();
}
catch {
    # Roll back transaction and rethrow error
    $SqlTransaction.Rollback();
    throw ($_);
}
finally {
    $SqlConnection.Close();
    $SqlConnection.Dispose();
}
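If the load is very large, SqlBulkCopy can also be tuned before calling WriteToServer; a couple of optional settings (the values here are illustrative, adjust to taste) help with memory use and long-running loads:
# Optional tuning, set before WriteToServer.
$SqlBulkCopy.BatchSize = 5000;     # rows per round-trip; 0 sends everything in one batch
$SqlBulkCopy.BulkCopyTimeout = 0;  # seconds before timing out; 0 means no limit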
The other method is to use a SqlCommand and do it row by row:
$ErrorActionPreference = 'Stop';
$Csv = Import-Csv -Path $FileName;
$SqlServer = 'MyServer';
$SqlDatabase = 'MyDatabase';
# Create Connection String
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
# Create Connection
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
# Create Command
$InsertCommandText = 'INSERT INTO DestinationTable (Id, IntegerField, DecimalField, StringField, DateTimeField) VALUES (@Id, @IntegerField, @DecimalField, @StringField, @DateTimeField)';
$InsertCommand = New-Object -TypeName System.Data.SqlClient.SqlCommand -ArgumentList $InsertCommandText, $SqlConnection;
[void]$InsertCommand.Parameters.Add('@Id', [System.Data.SqlDbType]::Int);
[void]$InsertCommand.Parameters.Add('@IntegerField', [System.Data.SqlDbType]::Int);
[void]$InsertCommand.Parameters.Add('@DecimalField', [System.Data.SqlDbType]::Decimal);
[void]$InsertCommand.Parameters.Add('@StringField', [System.Data.SqlDbType]::VarChar, 50);
[void]$InsertCommand.Parameters.Add('@DateTimeField', [System.Data.SqlDbType]::DateTime);
# Open connection and start transaction
$SqlConnection.Open()
$SqlTransaction = $SqlConnection.BeginTransaction();
$InsertCommand.Transaction = $SqlTransaction;
$RowsInserted = 0;
try {
    $line = 0;
    $Csv | ForEach-Object {
        $line++;
        # Specify parameter values
        $InsertCommand.Parameters['@Id'].Value = $_.Id;
        $InsertCommand.Parameters['@IntegerField'].Value = $_.IntegerField;
        $InsertCommand.Parameters['@DecimalField'].Value = $_.DecimalField;
        $InsertCommand.Parameters['@StringField'].Value = $_.StringField;
        $InsertCommand.Parameters['@DateTimeField'].Value = $_.DateTimeField;
        $RowsInserted += $InsertCommand.ExecuteNonQuery();
        # Clear parameter values
        $InsertCommand.Parameters | ForEach-Object { $_.Value = $null };
    }
    $SqlTransaction.Commit();
    Write-Output "Rows affected: $RowsInserted";
}
catch {
    # Roll back transaction and rethrow error
    $SqlTransaction.Rollback();
    Write-Error "Error on line $line" -ErrorAction Continue;
    throw ($_);
}
finally {
    $SqlConnection.Close();
    $SqlConnection.Dispose();
}
Edit: Oh, I forgot one important point. If you need to set the value of a field to null in the database, you need to set its value to [System.DBNull]::Value, not $null.
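For example, a sketch of how that might look inside the ForEach-Object loop above, using the DateTimeField parameter:
# Empty CSV cells must become DBNull, not $null, or the insert will fail.
if ([string]::IsNullOrEmpty($_.DateTimeField)) {
    $InsertCommand.Parameters['@DateTimeField'].Value = [System.DBNull]::Value;
} else {
    $InsertCommand.Parameters['@DateTimeField'].Value = $_.DateTimeField;
}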

Powershell - proper way to execute SQL query with multiple select statements and result tables

I'm trying to execute a SQL query with a few SELECT statements that returns multiple tables as a result. The problem is that I can't find a way to read and use the tables separately.
Expected results: each SELECT's result set as its own table.
Actual results: everything is printed row by row.
Purpose: I've made a script that creates an empty Excel file with multiple sheets; each sheet will be used to contain one result set of the query.
The only thing left is to put the needed text into the sheets. Here is my code for that part only:
$ConnectionString = "Data Source=...;Initial Catalog=...;User Id=...;Password=..."
$DBServerName = $ConnectionString.split('=')[1].split(';')[0]
$DBName = $ConnectionString.split('=')[2].split(';')[0]
$DBUser = $ConnectionString.split('=')[3].split(';')[0]
$DBPassword = $ConnectionString.split('=')[4].split(';')[0]
$CurrentFilePath = "C:\SQLqueryWithManyResultsets.sql"
$query = Get-Content -literalPath $CurrentFilePath | Out-String #getting the query string from file
$resultTables = Invoke-Sqlcmd -Query $query -ServerInstance $DBServerName -Database $DBName -DisableVariables -Password $DBPassword -Username $DBUser -ErrorAction Stop
foreach ($result in $resultTables) {
    $result | Format-Table #where the magic happens
}
I've done a lot of research, but I cannot find a proper way to store and read the tables the way I need.
Try this:
Clear-Host;
$objConnection = New-Object System.Data.SqlClient.SqlConnection;
$objConnection.ConnectionString = "...";
$ObjCmd = New-Object System.Data.SqlClient.SqlCommand;
$ObjCmd.CommandText = "...";
$ObjCmd.Connection = $objConnection;
$ObjCmd.CommandTimeout = 0;
$objAdapter = New-Object System.Data.SqlClient.SqlDataAdapter;
$objAdapter.SelectCommand = $ObjCmd;
$objDataSet = New-Object System.Data.DataSet;
$objAdapter.Fill($objDataSet) | Out-Null;
for ($i = 0; $i -lt $objDataSet.Tables.Count; $i++) {
    Write-Host ($objDataSet.Tables[$i] | Format-Table | Out-String);
}
$query = $null;
$objDataSet = $null;
$objConnection.Close();
$objConnection = $null;
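Since every result set lands in its own DataTable inside the DataSet, you can also address them individually rather than printing them. As a sketch of the sheet-per-resultset idea (written here as one CSV per result set; the output path is illustrative):
# Each table in the DataSet corresponds to one SELECT in the query.
for ($i = 0; $i -lt $objDataSet.Tables.Count; $i++) {
    $objDataSet.Tables[$i] | Export-Csv -Path "C:\temp\ResultSet_$i.csv" -NoTypeInformation;
}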

How do I account for CSV column titles and null rows when trying to import a CSV to SQL Server using PowerShell

I have a large CSV file with 5 columns, the first row being the title of the columns. I'm trying to pass the values to a datatable using powershell that I then pass to a table-value parameter to load the data into my SQL Server instance. My SQL Server table has already been created with the 5 columns, but I am running into issues.
cls
#CSV variables
$csvfile = "C:\Students\test.csv"
$csvdelimiter = ","
$firstRowColumns = $true
#creating the datatable
$dt = New-Object System.Data.Datatable
$reader = New-Object System.IO.StreamReader $csvfile
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
foreach ($column in $columns)
{
    if ($firstRowColumns -eq $true)
    {
        [void]$dt.Columns.Add($column)
        $reader.ReadLine()
    } else { [void]$dt.Columns.Add() }
}
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null)
{
    [void]$dt.Rows.Add($line.Split($csvdelimiter))
}
function ExecSproc
{
    param ($Conn, $Sproc, $Parameters=@{})
    $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
    $SqlCmd.CommandType = [System.Data.CommandType]::StoredProcedure
    $SqlCmd.Connection = $Conn
    $SqlCmd.CommandText = $Sproc
    foreach($p in $Parameters.Keys){
        [Void] $SqlCmd.Parameters.AddWithValue("@$p",$Parameters[$p])
    }
    $SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter($SqlCmd)
    $DataSet = New-Object System.Data.DataSet
    [Void] $SqlAdapter.Fill($DataSet)
    $SqlConnection.Close()
    return $DataSet.Tables[0]
}
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "server='.';database='ActiveStudents';trusted_connection=true;"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
ExecSproc -Conn $SqlConnection -Sproc spInsertStudents -Parameters @{TVP = $dt}
These are the errors being thrown:
Exception calling "Add" with "1" argument(s): "Input array is longer than the number of columns in this table."
At C:\Scripts\ActiveStudentInsert.ps1:24 char:2
Exception calling "Fill" with "1" argument(s): "Conversion failed when converting the nvarchar value '' to data type int.
The data for table-valued parameter "@TVP" doesn't conform to the table type of the parameter. SQL Server error is: 245, state: 1
The statement has been terminated."
Chad Miller made a great script in the Script Gallery that outputs DataTables in a very similar way to the one you are looking to do.
If you used his functions, then it would look something like this:
$dt = Import-Csv $csvfile | Out-DataTable
If you have a really large amount of data, I would recommend you check out the SQL BCP command-line utility.
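For instance, an import of the sample file with bcp might look like this (a sketch; the table name is an assumption, -T uses integrated security, -c uses character mode, -t sets the field terminator, and -F 2 skips the header row):
bcp ActiveStudents.dbo.Students in "C:\Students\test.csv" -S . -T -c -t, -F 2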

How to format output when exporting SQL query to CSV

I have a task to save the results of a SQL Server query into a .csv file. After some googling I decided to use PowerShell. I found a script, modified it a bit, and it works; almost everything is OK.
$server = "server"
$database = "database"
$query = "SELECT * from et_thanks"
$tod = Get-Date;
$file = "{0:yyyyMMdd}_go.csv" -f $tod;
$extractFile = @"
\\info\export_files\$file
"@
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $query
$command.Connection = $connection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$DataSet.Tables[0] | Export-Csv -Force -Delimiter ";" $extractFile
But I have 2 problems which I can't solve:
When I open the .csv file I see column headers and a commented string on the first line:
#TYPE System.Data.DataRow
"ob_no","c_name","c_visible","c_fp","e_from","e_to"
"436439","09.09.2013 11:29:08","0","","10937","260153"
How can I get rid of it?
All values are surrounded by quotes. Is it possible to modify the script not to add them while exporting? Auto-replace isn't a good idea, because the quote symbol may legitimately appear in the SQL data.
I tried to find answers in documentation (http://ss64.com/ps/export-csv.html) but with no success.
You might run into trouble removing the quotes, but if that's what you really want, then the following should achieve it.
-NoTypeInformation will remove the additional type information you are seeing.
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter ";" -NoTypeInformation) -replace "`"", "" | `
Out-File -Force $extractFile
This uses ConvertTo-Csv to produce a string representation of the CSV, replaces all instances of " with nothing, and pipes the final string to Out-File.
...and, to get rid of the header record, if you first convert the data to CSV (ConvertTo-Csv), then pipe those results to Select to skip the first record:
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter "`t" -NoTypeInformation ) -Replace "`"","" | Select -skip 1 | Out-File blahblahblah...
Agreed, Export-Csv isn't the best tool for the job. I would use sqlcmd.exe or bcp.exe, provided the SQL Server command-line tools are installed. You could also build a simple routine to create a CSV from a datatable:
$result = New-Object Text.StringBuilder
$dt = $DataSet.Tables[0]
foreach ($dr in $dt.Rows) {
    for ($i = 0; $i -lt $dt.Columns.Count; $i++) {
        $null = $result.Append($($dr[$i]).ToString())
        $null = $result.Append($(if ($i -eq $dt.Columns.Count - 1) {"`n" } else { ","} ))
    }
}
$result.ToString()
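To send the built string to the export file rather than the console, you could, for example, finish with:
$result.ToString() | Out-File -FilePath $extractFile -Encoding UTF8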
