Getting data from SQL Server 2008 with PowerShell

I'm having some difficulty getting data from a simple stored procedure in SQL Server working. I have a PowerShell script that needs variables filled from 3 columns (the procedure returns just 1 row).
Here's what I have that isn't working; somehow I'm not referencing the column values correctly. I've tried various methods, but usually get the error "Cannot index into a null array". I don't want to iterate through the result set, I just want to directly set some variables from the one row returned.
$conn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
$conn.ConnectionString = `
"Server=ServerName;Database=ShopDB;Integrated Security=True"
$sqlQuery = new-object System.Collections.Specialized.StringCollection
$sqlQuery.Add("JobSettingsGet")
$Resultset = $conn.ExecuteWithResults($sqlQuery)
# $UserName = $table.Rows(0).Columns(0)
# error - Method invocation failed because [System.Data.DataTable] doesn't
# contain a method named 'Rows'.
# $UserName = $table.Rows[0].Columns[0]
# error - cannot index null array
# $UserName = $table.Row(0).Column(0)
# error - Cannot index into a null array, also Method invocation failed
# because [System.Data.DataTable] doesn't contain a method named 'Row'.
# $UserName = $table.Rows[0].Columns[1].Value
# error - cannot index null array
I'd like to use the column name if possible, too.
Any pointers?
thanks,
Sylvia

You could use System.Data.SqlClient.SqlConnection instead of the SqlServer.Management classes:
$conn = New-Object Data.SqlClient.SqlConnection`
("Data Source=DATABASE;Initial Catalog=master;Integrated Security=SSPI;")
$adapter = New-Object Data.SqlClient.SqlDataAdapter("exec sp_who2", $conn)
$set = New-Object Data.DataSet
$adapter.Fill($set)
$table = $set.Tables[0]
In order to print the whole table, just use format-table
$table | ft -AutoSize
To get a column by name, pipe to ForEach-Object (%) and access the note properties by name:
$table | % {$_.login}
Edit: Here is the same query using SqlServer.Management.
[void][Reflection.Assembly]::LoadWithPartialName`
("Microsoft.SqlServer.ConnectionInfo")
$conn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
$conn.ConnectionString = `
"Server=ServerName;Database=ShopDB;Integrated Security=True"
$sqlQuery = new-object System.Collections.Specialized.StringCollection
$sqlQuery.Add("exec sp_who2")
$resultset = $conn.ExecuteWithResults($sqlQuery)
Access the results by picking the zeroth result set, zeroth table, zeroth row, and the item by name:
$resultset[0].tables.Item(0).Rows[0].Item("Login")
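Applied back to the original JobSettingsGet call, a minimal sketch of filling variables straight from the single returned row could look like this (the column names UserName, Server, and Path are placeholders for whatever the procedure actually returns):
$resultset = $conn.ExecuteWithResults($sqlQuery)
# First result set, first table, first (and only) row
$row = $resultset[0].Tables[0].Rows[0]
# Assign variables by column name; these names are assumed
$UserName = $row["UserName"]
$Server = $row["Server"]
$Path = $row["Path"]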

Related

SqlBulkCopy Excessive Memory Consumption even with EnableStreaming and low BatchSize

I'm trying to bulk load data from Oracle to SQL Server through PowerShell with SqlBulkCopy.
On small data everything works fine, but on big datasets, even with BatchSize and streaming set, SqlBulkCopy takes all the available memory... until an out-of-memory error.
Also, the notify function seems to give no answer, so I guess that even with streaming=True, the process first loads everything into memory...
What did I miss?
$current = Get-Date
#copy table from Oracle table to SQL Server table
add-type -path "D:\oracle\product\12.1.0\client_1\odp.net\managed\common\Oracle.ManagedDataAccess.dll";
#define oracle connection string
$conn_str = "cstr"
# query for oracle table
$qry = "
SELECT
ID,CREATEDT,MODIFIEDDT
FROM MYTABLE source
WHERE source.ISSYNTHETIC=0 AND source.VALIDFROM >= TO_Date('2019-01-01','yyyy-mm-dd')
";
# key (on the left side) is the source column while value (on the right side) is the target column
[hashtable] $mapping = @{'ID'='ID';'CREATEDT'='CREATEDT';'MODIFIEDDT'='MODIFIEDDT'};
$adapter = new-object Oracle.ManagedDataAccess.Client.OracleDataAdapter($qry, $conn_str);
#$info = new-object Oracle.ManagedDataAccess.Client;
#Write-Host ( $info | Format-Table | Out-String)
$dtbl = new-object System.Data.DataTable('MYTABLE');
#this Fill method will populate the $dtbl with the query $qry result
$adapter.Fill($dtbl);
#define sql server target instance
$sqlconn = "cstr";
$sqlbc = new-object system.data.sqlclient.Sqlbulkcopy($sqlconn)
$sqlbc.BatchSize = 1000;
$sqlbc.EnableStreaming = $true;
$sqlbc.NotifyAfter = 1000;
$sqlbc.DestinationTableName="DWHODS.MYTABLE";
#need to tell $sqlbc the column mapping info
foreach ($k in $mapping.keys)
{
$colMapping = new-object System.Data.SqlClient.SqlBulkCopyColumnMapping($k, $mapping[$k]);
$sqlbc.ColumnMappings.Add($colMapping) | out-null
}
$sqlbc.WriteToServer($dtbl);
$sqlbc.Close();
$end= Get-Date
$diff= New-TimeSpan -Start $current -End $end
Write-Output "import needed : $diff"
Thanks to Jeroen, I changed the code like this, and now it no longer consumes all the memory:
$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($conn_str);
$oraConn.Open();
$command = $oraConn.CreateCommand();
$command.CommandText=$qry;
$reader = $command.ExecuteReader()
...
$sqlbc.WriteToServer($reader);
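For completeness, a reconstructed sketch of the full streaming version (reusing $conn_str, $qry, $sqlconn, and $mapping from above). Note that NotifyAfter only reports progress if a SqlRowsCopied handler is attached, which the original script never did:
$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($conn_str)
$oraConn.Open()
$command = $oraConn.CreateCommand()
$command.CommandText = $qry
$reader = $command.ExecuteReader()
$sqlbc = New-Object System.Data.SqlClient.SqlBulkCopy($sqlconn)
$sqlbc.BatchSize = 1000
$sqlbc.EnableStreaming = $true
$sqlbc.NotifyAfter = 1000
$sqlbc.DestinationTableName = "DWHODS.MYTABLE"
# Without a handler, NotifyAfter appears to do nothing
$sqlbc.Add_SqlRowsCopied({ param($sender, $e) Write-Host "$($e.RowsCopied) rows copied" })
foreach ($k in $mapping.Keys) {
    [void]$sqlbc.ColumnMappings.Add($k, $mapping[$k])
}
try {
    # Rows stream straight from the reader; no DataTable is materialized
    $sqlbc.WriteToServer($reader)
}
finally {
    $reader.Close()
    $oraConn.Close()
    $sqlbc.Close()
}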

Looping through a PSCustomObject dataset and adding to a SQL DB

I have a variable, $CiscoCMDB, which holds data for 2968 records like this (some fields are filled, some are not, as below):
SearchCode: D12345678911
Name1: 1212
Category: Office Phone
AssetTag:
Status: Stock
SerialNumber: FCH1549BBBB
Model: CISCO IP PHONE 7945G
MacAddress:
Location: SF
OwnerOrganization:
OwnerPerson:
I also have a SQL DB created, CiscoCMDB, with the same names for the columns that are present in the rows. I would like all records to be written to the SQL table. I've tried to use the Write-DataTable cmdlet, along with other cmdlets, but it gave me errors. I figured that if I am using SQL SELECT statements in my code, I should just use SQL INSERTs in my code. The only problem is, I'm not sure how to go about this, other than knowing I will need a foreach loop.
Can someone help me with this? I'm getting confused and frustrated, and I'm not exactly sure how to go about looping through correctly. The steps I need are:
Pull data from another SQL DB
Organize data further (picking the columns I want)
Manipulate data to strip ^C from searchcode (all our search codes are prefaced with C)
Manipulate data to strip ^DN from Name1 field (again all Name1s are prefaced with DN)
Fill in my data
$CMDBCiscoQuery = @()
foreach ($row in $table) {
    $hash = [ordered]@{
        'SearchCode'        = $row.SearchCode
        'Category'          = $row.Category
        'Status'            = $row.Status
        'Name1'             = $row.Name1
        'SerialNumber'      = $row.SerialNumber
        'Model'             = $row.Model
        'MacAddress'        = $row.MacAddress
        'Location'          = $row.Location
        'OwnerOrganization' = $row.OwnerOrganization
        'OwnerPerson'       = $row.OwnerPerson
    }
    $obj = New-Object -TypeName PSObject -Property $hash
    $CMDBCiscoQuery += $obj
}
$CMDBCisco = $CMDBCiscoQuery |
    Select-Object @{n='SearchCode';e={$_.SearchCode -replace "^C"}},
        @{n='Name1';e={$_.Name1 -replace "^DN"}}, Category, AssetTag,
        Status, SerialNumber, Model, MacAddress, Location,
        OwnerOrganization, OwnerPerson
#Filling CMDB table
Write-Verbose "Filling CMDBCisco Table"
$ConnectionTimeout = 30
$ServerInstance = "myserver"
$Database = "Audits"
$conn = New-Object System.Data.SqlClient.SQLConnection
$ConnectionString = "Server={0};Database={1};Connect Timeout={2};Integrated Security = True;" -f $ServerInstance,$Database,$ConnectionTimeout
$conn.ConnectionString = $ConnectionString
$conn.Open()
$cmd = $conn.CreateCommand()
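The script stops where the INSERT should begin. A minimal sketch of the missing loop with a parameterized command follows (the target table name dbo.CiscoCMDB and the NVarChar(255) sizing are assumptions):
$cmd.CommandText = @"
INSERT INTO dbo.CiscoCMDB (SearchCode, Name1, Category, AssetTag, Status,
    SerialNumber, Model, MacAddress, Location, OwnerOrganization, OwnerPerson)
VALUES (@SearchCode, @Name1, @Category, @AssetTag, @Status,
    @SerialNumber, @Model, @MacAddress, @Location, @OwnerOrganization, @OwnerPerson)
"@
$columns = 'SearchCode','Name1','Category','AssetTag','Status','SerialNumber',
           'Model','MacAddress','Location','OwnerOrganization','OwnerPerson'
# One parameter per column; values are filled in per row below
foreach ($c in $columns) {
    [void]$cmd.Parameters.Add("@$c", [System.Data.SqlDbType]::NVarChar, 255)
}
foreach ($item in $CMDBCisco) {
    foreach ($c in $columns) {
        # Empty fields must be sent as DBNull, not $null
        $value = $item.$c
        if ([string]::IsNullOrEmpty($value)) { $value = [System.DBNull]::Value }
        $cmd.Parameters["@$c"].Value = $value
    }
    [void]$cmd.ExecuteNonQuery()
}
$conn.Close()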

Specify insert into list with different SQL source and dest connection in PowerShell

I have this PowerShell script that would work if my DEST table had ONLY the columns listed in the select from my SOURCE server, but the DEST table has more. I haven't been able to find anything that gives examples of how to specify which columns of my dest table I want to insert into. Note that the SourceServer and DestServer are not linked servers.
Param (
#[parameter(Mandatory = $true)]
[string] $SrcServer = "SourceServer",
[parameter(Mandatory = $true)]
[string] $SrcDatabase = "SourceDb",
#[parameter(Mandatory = $true)]
[string] $SrcTable = "stage.InternalNotes",
#[parameter(Mandatory = $true)]
[string] $DestServer = "DestServer",
#[parameter(Mandatory = $true)]
[string] $DestDatabase = "DestDb",
[parameter(Mandatory = $true)]
[string] $DestTable = "dbo.InternalNotes",
)
Function ConnectionString([string] $ServerName, [string] $DbName)
{
"Data Source=$ServerName;Initial Catalog=$DbName;Integrated Security=True;User ID=$UID;Password=$PWD;"
}
$SrcConnStr = ConnectionString $SrcServer $SrcDatabase
$SrcConn = New-Object System.Data.SqlClient.SQLConnection($SrcConnStr)
$CmdText = "SELECT
ino.UserId
,ino.StoreId
,ino.PostedById
,ino.DatePosted
,ino.NoteSubject
,ino.NoteText
,ino.NoteType
,ino.Classify
,ino.CreatedBy
,ino.CreatedUtc
,IsReadOnly = 0
FROM
stage.InternalNotes AS ino
"
$SqlCommand = New-Object system.Data.SqlClient.SqlCommand($CmdText, $SrcConn)
$SrcConn.Open()
[System.Data.SqlClient.SqlDataReader] $SqlReader = $SqlCommand.ExecuteReader()
Try
{
$DestConnStr = ConnectionString $DestServer $DestDatabase
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy($DestConnStr, [System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity)
$bulkCopy.DestinationTableName = $DestTable
$bulkCopy.WriteToServer($sqlReader)
}
Catch [System.Exception]
{
$ex = $_.Exception
Write-Host $ex.Message
}
Finally
{
Write-Host "Table $SrcTable in $SrcDatabase database on $SrcServer has been copied to table $DestTable in $DestDatabase database on $DestServer"
$SqlReader.close()
$SrcConn.Close()
$SrcConn.Dispose()
$bulkCopy.Close()
}
Essentially, I need to be able to do this:
INSERT INTO dbo.InternalNotes --DEST Server table
(
userID
,StoreID
,PostedByID
,DatePosted
,NoteSubject
,NoteText
,NoteType
,Classify
,CreatedBy
,CreatedDateUTC
,IsReadOnly
)
SELECT
ino.UserId
,ino.StoreId
,ino.PostedById
,ino.DatePosted
,ino.NoteSubject
,ino.NoteText
,ino.NoteType
,ino.Classify
,ino.CreatedBy
,ino.CreatedUtc
,IsReadOnly = 0
FROM
stage.InternalNotes AS ino --SOURCE Server table
Edits after getting everything to work based on the accepted answer:
For some reason it didn't like the line:
$bulkCopy = New-Object -TypeName Data.SqlClient.SqlBulkCopy -ArgumentList $DestSqlConnection, [System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity, $DestSqlTransaction;
It gave the error:
Cannot convert argument "1", with value:
"[System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity", for
"SqlBulkCopy" to type "System.Data.SqlClient.SqlBulkCopyOptions":
"Cannot convert value
"[System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity" to type
"System.Data.SqlClient.SqlBulkCopyOptions". Error: "Unable to match
the identifier name
[System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity to a valid
enumerator name. Specify one of the following enumerator names and try
again: Default, KeepIdentity, CheckConstraints, TableLock, KeepNulls,
FireTriggers, UseInternalTransaction,
AllowEncryptedValueModifications""
So Instead I changed it to this, and everything worked:
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy($DestSqlConnection, [System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity,$DestSqlTransaction)
To do manual column mapping, you need to populate SqlBulkCopy.ColumnMappings. If you don't specify the mapping, then as far as I know SqlBulkCopy will assume the first column in the select list or DataRow goes into the first ordinal column of the destination table.
For example:
$bulkCopy.DestinationTableName = $DestTable;
$bulkCopy.ColumnMappings.Add('sourceColumn1','destinationColumn1');
$bulkCopy.ColumnMappings.Add('sourceColumn2','destinationColumn2');
$bulkCopy.ColumnMappings.Add('sourceColumn3','destinationColumn3');
$bulkCopy.ColumnMappings.Add('sourceColumn4','destinationColumn4');
$bulkCopy.ColumnMappings.Add('sourceColumn5','destinationColumn5');
However, there's a number of other issues with your script.
Your connection string authentication section is nonsense:
`Integrated Security=True; User ID=$UID; Password=$PWD;`
Integrated Security=True says, "Use passthrough Windows authentication with the currently logged-on user." User ID=$UID; Password=$PWD; says, "Use SQL authentication with the specified username and password." You can't do both.
You should specify only one or the other.
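For example, the ConnectionString function body should emit one or the other:
# Windows (passthrough) authentication:
"Data Source=$ServerName;Initial Catalog=$DbName;Integrated Security=True;"
# SQL authentication:
"Data Source=$ServerName;Initial Catalog=$DbName;User ID=$UID;Password=$PWD;"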
$SqlCommand = New-Object system.Data.SqlClient.SqlCommand($CmdText, $SrcConn)
[...]
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy($DestConnStr, [System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity)
I may be wrong, but I'm pretty sure you're trying to pass two variables as one argument here. Just like with your ConnectionString function, I don't think you want parentheses here. In any case it's syntactically confusing. Do this instead:
$SqlCommand = New-Object -TypeName System.Data.SqlClient.SqlCommand -ArgumentList $CmdText, $SrcConn
[...]
$bulkCopy = New-Object -TypeName Data.SqlClient.SqlBulkCopy -ArgumentList $DestConnStr, [System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity
Speaking of that last one, I have another issue with it. SqlBulkCopy is powerful, but you really have to hold its hand. By default, SqlBulkCopy doesn't run with any transaction benefits. That means that if it errors in the middle, well, too bad: your data has been partially updated. You can enable internal transactions, but then only the most recent batch of inserts will be rolled back. You really need to manage your own transaction to get an all-or-nothing result.
So you'll end up with something like this:
Try {
$DestConnStr = ConnectionString $DestServer $DestDatabase
# We have to open the connection before we can create the transaction
$DestSqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $DestConnStr;
$DestSqlConnection.Open();
$DestSqlTransaction = $DestSqlConnection.BeginTransaction();
$bulkCopy = New-Object -TypeName Data.SqlClient.SqlBulkCopy -ArgumentList $DestSqlConnection, [System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity, $DestSqlTransaction;
$bulkCopy.DestinationTableName = $DestTable
$bulkCopy.ColumnMappings.Add('sourceColumn1','destinationColumn1');
$bulkCopy.ColumnMappings.Add('sourceColumn2','destinationColumn2');
$bulkCopy.ColumnMappings.Add('sourceColumn3','destinationColumn3');
$bulkCopy.ColumnMappings.Add('sourceColumn4','destinationColumn4');
$bulkCopy.ColumnMappings.Add('sourceColumn5','destinationColumn5');
Try {
$bulkCopy.WriteToServer($sqlReader)
# Commit on success
$DestSqlTransaction.Commit();
}
Catch {
# Rollback on error
$DestSqlTransaction.Rollback();
# Rethrow the error to the outer catch block
throw ($_);
}
}
Catch [System.Exception] {
$ex = $_.Exception
Write-Host $ex.Message
}
Finally {
[...]
}
I'd probably rewrite the above more because I don't like nested try blocks, but for a quick and dirty rewrite this will work. I don't think you'll run into distributed transaction problems doing this, but I may be wrong. I tend to use SSIS or linked servers when I need this sort of data pump.

PowerShell pipe / SQL insert query data limits (and increasing them?)

Are there any limits to the size of a string you can assign to a variable in PowerShell, or any limits to the size of the text sent within an SQL INSERT query?
I have a big CSV file coming into PowerShell, and through string construction in a foreach loop I am generating SQL INSERT queries for each row. The resulting string of queries (INSERT query; INSERT query; ...) comes to about 4MB.
The SQL server has a perfect schema to receive the data; however, when sending the 4MB collection of INSERT queries (each separated by ;) I get an error that looks to me like the long set of insert queries was truncated somehow. I guess I have hit some kind of limit.
Is there a way of getting around this (programmatically in PowerShell), or a way of increasing the size limit of an acceptable collection of SQL INSERT queries?
My code is using System.Data.SqlClient.SqlConnection and System.Data.SqlClient.SqlCommand.
Smaller datasets work OK, but the larger datasets give an error like the following example. Each dataset gives a different "Incorrect syntax near" indicator.
Exception calling "ExecuteNonQuery" with "0" argument(s): "Incorrect syntax
near '('."
At C:\Users\stuart\Desktop\git\ADStfL\WorkInProgress.ps1:211 char:3
+ $SQLCommand.executenonquery()
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
In my experience, the best-performing way to do this is to load the CSV into a DataTable and then use SqlBulkCopy.
$ErrorActionPreference = 'Stop';
$Csv = Import-Csv -Path $FileName;
$SqlServer = 'MyServer';
$SqlDatabase = 'MyDatabase';
$DestinationTableName = 'MyTable';
# Create Connection String
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
# Define your DataTable. The column order of the DataTable must either match the table in the database, or
# you must specify the column mapping in SqlBulkCopy.ColumnMapping. If you have an IDENTITY column, it's a
# bit more complicated
$DataTable = New-Object -TypeName System.Data.DataTable -ArgumentList $DestinationTableName;
$NewColumn = $DataTable.Columns.Add('Id',[System.Int32]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('IntegerField',[System.Int32]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('DecimalField',[System.Decimal]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('VarCharField',[System.String]);
$NewColumn.MaxLength = 50;
$NewColumn = $DataTable.Columns.Add('DateTimeField',[System.DateTime]);
$NewColumn.AllowDBNull = $false;
# Populate your datatable from the CSV file
# You may find that you need to type cast some of the fields.
$Csv | ForEach-Object {
$NewRow = $DataTable.NewRow();
$NewRow['Id'] = $_.Id;
$NewRow['IntegerField'] = $_.IntegerField;
$NewRow['DecimalField'] = $_.DecimalField;
$NewRow['VarCharField'] = $_.StringField;
$NewRow['DateTimeField'] = $_.DateTimeField;
$DataTable.Rows.Add($NewRow);
}
# Create Connection
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
# Open Connection
$SqlConnection.Open();
# Start Transaction
$SqlTransaction = $SqlConnection.BeginTransaction();
# Double check the possible options at https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopyoptions(v=vs.110).aspx
# If you need multiple then -bor them together
$SqlBulkCopyOptions = [System.Data.SqlClient.SqlBulkCopyOptions]::CheckConstraints;
# Create SqlBulkCopy class
$SqlBulkCopy = New-Object -TypeName System.Data.SqlClient.SqlBulkCopy -ArgumentList $SqlConnection, $SqlBulkCopyOptions, $SqlTransaction;
# Specify destination table
$SqlBulkCopy.DestinationTableName = $DestinationTableName;
# Do the insert; rollback on error
try {
$SqlBulkCopy.WriteToServer($DataTable);
$SqlTransaction.Commit();
}
catch {
# Roll back transaction and rethrow error
$SqlTransaction.Rollback();
throw ($_);
}
finally {
$SqlConnection.Close();
$SqlConnection.Dispose();
}
The other method is to use an SQLCommand and do it row by row:
$ErrorActionPreference = 'Stop';
$Csv = Import-Csv -Path $FileName;
$SqlServer = 'MyServer';
$SqlDatabase = 'MyDatabase';
# Create Connection String
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
# Create Connection
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
# Create Command
$InsertCommandText = 'INSERT INTO DestinationTable (Id, IntegerField, DecimalField, StringField, DateTimeField) VALUES (@Id, @IntegerField, @DecimalField, @StringField, @DateTimeField)';
$InsertCommand = New-Object -TypeName System.Data.SqlClient.SqlCommand -ArgumentList $InsertCommandText, $SqlConnection;
[void]$InsertCommand.Parameters.Add('@Id', [System.Data.SqlDbType]::Int);
[void]$InsertCommand.Parameters.Add('@IntegerField', [System.Data.SqlDbType]::Int);
[void]$InsertCommand.Parameters.Add('@DecimalField', [System.Data.SqlDbType]::Decimal);
[void]$InsertCommand.Parameters.Add('@StringField', [System.Data.SqlDbType]::VarChar,50);
[void]$InsertCommand.Parameters.Add('@DateTimeField', [System.Data.SqlDbType]::DateTime);
# Open connection and start transaction
$SqlConnection.Open()
$SqlTransaction = $SqlConnection.BeginTransaction();
$InsertCommand.Transaction = $SqlTransaction;
$RowsInserted = 0;
try {
$line = 0;
$Csv | ForEach-Object {
$line++;
# Specify parameter values
$InsertCommand.Parameters['@Id'].Value = $_.Id;
$InsertCommand.Parameters['@IntegerField'].Value = $_.IntegerField;
$InsertCommand.Parameters['@DecimalField'].Value = $_.DecimalField;
$InsertCommand.Parameters['@StringField'].Value = $_.StringField;
$InsertCommand.Parameters['@DateTimeField'].Value = $_.DateTimeField;
$RowsInserted += $InsertCommand.ExecuteNonQuery();
# Clear parameter values
$InsertCommand.Parameters | ForEach-Object { $_.Value = $null };
}
$SqlTransaction.Commit();
Write-Output "Rows affected: $RowsInserted";
}
catch {
# Roll back transaction and rethrow error
$SqlTransaction.Rollback();
Write-Error "Error on line $line" -ErrorAction Continue;
throw ($_);
}
finally {
$SqlConnection.Close();
$SqlConnection.Dispose();
}
Edit: Oh, I forgot one important point. If you need to set the value of a field to null in the database, you need to set its value to [System.DBNull]::Value, not $null.
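For example, inside the row-by-row loop above, a nullable DecimalField might be handled like this (a sketch; the emptiness test is an assumption about the CSV contents):
# Assigning $null leaves the parameter effectively unsupplied;
# [System.DBNull]::Value sends a real SQL NULL
if ([string]::IsNullOrEmpty($_.DecimalField)) {
    $InsertCommand.Parameters['@DecimalField'].Value = [System.DBNull]::Value;
}
else {
    $InsertCommand.Parameters['@DecimalField'].Value = $_.DecimalField;
}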

How do I account for CSV column titles and null rows when trying to import a CSV to SQL Server using PowerShell

I have a large CSV file with 5 columns, the first row being the titles of the columns. I'm trying to pass the values to a DataTable using PowerShell, which I then pass to a table-valued parameter to load the data into my SQL Server instance. My SQL Server table has already been created with the 5 columns, but I am running into issues.
cls
#CSV variables
$csvfile = "C:\Students\test.csv"
$csvdelimiter = ","
$firstRowColumns = $true
#creating the datatable
$dt = New-Object System.Data.Datatable
$reader = New-Object System.IO.StreamReader $csvfile
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
foreach ($column in $columns)
{
if ($firstRowColumns -eq $true)
{
[void]$dt.Columns.Add($column)
$reader.ReadLine()
} else { [void]$dt.Columns.Add() }
}
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null)
{
[void]$dt.Rows.Add($line.Split($csvdelimiter))
}
function ExecSproc
{
param ($Conn, $Sproc, $Parameters=@{})
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandType = [System.Data.CommandType]::StoredProcedure
$SqlCmd.Connection = $Conn
$SqlCmd.CommandText = $Sproc
foreach($p in $Parameters.Keys){
[Void] $SqlCmd.Parameters.AddWithValue("@$p",$Parameters[$p])
}
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter($SqlCmd)
$DataSet = New-Object System.Data.DataSet
[Void] $SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
return $DataSet.Tables[0]
}
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "server='.';database='ActiveStudents';trusted_connection=true;"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
ExecSproc -Conn $SqlConnection -Sproc spInsertStudents -Parameters @{TVP = $dt}
These are the errors being thrown:
Exception calling "Add" with "1" argument(s): "Input array is longer than the number of columns in this table."
At C:\Scripts\ActiveStudentInsert.ps1:24 char:2
Exception calling "Fill" with "1" argument(s): "Conversion failed when converting the nvarchar value '' to data type int.
The data for table-valued parameter "@TVP" doesn't conform to the table type of the parameter. SQL Server error is: 245, state: 1
The statement has been terminated."
Chad Miller made a great script in the Script Gallery that outputs DataTables in a way very similar to what you are looking to do.
If you used his functions, then it would look something like this:
$dt = Import-Csv $csvfile | Out-DataTable
If you have a really large amount of data, I would recommend you check out the SQL Server bcp command-line utility.
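For example, a rough bcp invocation for a comma-delimited file with a header row (server, database, and table names are assumptions to adapt; note that bcp does not understand quoted CSV fields):
# -c character mode, -t, comma field terminator, -F 2 skip the header row,
# -S server, -T trusted (Windows) authentication
bcp ActiveStudents.dbo.Students in "C:\Students\test.csv" -S "." -T -c -t, -F 2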
