Array index out of bounds, "Interface Packets Dropped :String" (PowerShell arrays)

I wrote a script to create incidents (objects) in bulk in SCSM (Microsoft System Center Service Manager) using PowerShell. The script pulls the required data from a CSV file. I am running into the following error:
New-SCSMIncident : Index was outside the bounds of the array.
At C:\Users\portalservice\Desktop\scripts\Incidents\BULK_Create_Incidents.ps1:83 char:5 + New-SCSMIncident -Title $Title -Description $Description -AffectedUser $Affe ... + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (Interface Packets Dropped :String) [New-SCSMIncident], IndexOutOfRangeException
+ FullyQualifiedErrorId : NewIncident,SMLets.SCSMIncidentNew
When I echo the contents of the array I see normal data... it all looks correct. Here is my script:
# Script designed to create tickets in bulk, reading the information needed from a CSV file
# Tailored for incidents, can be modified to work with any work item
# SirLearnAlot 2015
# NOTES: Modify the $path variable to match the path of the data file being imported
# Import the smlets module
ipmo SMLets
Write-Host ""
Write-Host "Importing Smlets..."
# ---------------------------------------- Variable Declaration ----------------------------------------
Write-Host "Getting the classes needed..."
# Get the incident class
$IncidentClass = Get-SCSMClass -Name System.WorkItem.Incident$
# Get the CI Class
$CIClass = Get-SCSMClass -Name COMPANY.CI.Class$
# Get the Relationship we want to add (Affected Items)
$RelWIaboutCI = Get-SCSMRelationshipClass -Name System.WorkItemAboutConfigItem$
# Clear the error variable in case it contains anything already
$error.clear()
# --------------------------------------------- Data Import --------------------------------------------
# Import the CSV Data file containing list of incidents to create and their information
Write-Host "Importing CSV Data File..."
$path = "C:\Users\portalservice\Desktop\Script Data\Work Item Creation\Incidents\11.12.15\data.csv"
$data = ipcsv $path
if ($error -like "Could not find file *")
{
Write-Host "Encountered an error while importing the data file from '$path'. Please verify the path directories and file exist and run the script again."
exit
}
else
{
Write-Host "Successfully imported data file!"
Write-Host "Beginning ticket creation..."
#pause
}
# ---------------------------------------------- Execution ---------------------------------------------
# Begin going through file line by line
foreach ($case in $data)
{
# Clear the error variable
$error.clear()
# Visual Formatting
Write-Host ""
Write-Host "----------------------------------------------------------------------------------------"
Write-Host ""
# GETTING THE INFORMATION FROM DATA FILE AND STORING IN VARIABLES
# Get the Info for the Incident to create
$Title = $case.Title
Write-Host "Got case title: $Title"
$Description = $case.Description
Write-Host "Got case description: $Description"
$AffectedUser = $case.AffectedUser
Write-Host "Got case affected user: $AffectedUser"
$Classification = $case.Classification
Write-Host "Got case classification: $Classification"
$Impact = $case.Impact
Write-Host "Got case impact: $Impact"
$Urgency = $case.Urgency
Write-Host "Got case urgency: $Urgency"
$SupportGroup = $case.SupportGroup
Write-Host "Got case support group: $SupportGroup"
$PrimaryOwner = $case.PrimaryOwner
Write-Host "Got case owner: $PrimaryOwner"
$Vendor = $case.Vendor
Write-Host "Got case vendor: $Vendor"
$Customer = $case.Customer
Write-Host "Got case customer: $Customer"
$ReportedOn = $case.ReportedOn
Write-Host "Got date reported on: $ReportedOn"
# Get the name of the correct CI to add
$CIToAddNum = $case.CI
Write-Host "Got case CI: $CIToAddNum"
# Apply the non-OOB Values to the property hash table
$PropertyHashTable = @{"Vendor" = $Vendor; "Customer" = $Customer; "Reported_On" = $ReportedOn}
# INCIDENT CREATION
Write-Host "Attemping to create Incident from case data..."
New-SCSMIncident -Title $Title -Description $Description -AffectedUser $AffectedUser -Classification $Classification -Impact $Impact -Urgency $Urgency -SupportGroup $SupportGroup -Bulk
# Checks to see if there is an error, if so displays a warning message and exits
if (!$error)
{
Write-Host "Incident creation successful."
}
else
{
Write-Host "$error"
Write-Host "Error creating the incident. Check the incident creation code and run this script again."
exit
}
# INCIDENT RETRIEVAL (POST-CREATION)
Write-Host "Attempting to retrieve incident for modification (Adding Customer, Vendor, and Reported On date to ticket)"
$Incident = Get-SCSMObject -Class $IncidentClass -Filter "Description -like '%$Description%'"
# Checks to see if the retrieval failed, if so displays error message and exits script
if ($Incident = $null)
{
Write-Host "Incident retrieval failed. Check the incident retrieval code and run the script again."
exit
}
# Apply the property hash table to the retrieved incident
$Incident | Set-SCSMObject -PropertyHashtable $PropertyHashTable
# ADDING THE CI TO THE INCIDENT
# Get the CI from the CMDB
$CIToAdd = Get-SCSMObject -Class $CIClass -Filter "DisplayName -eq '$CIToAddNum'"
if ($CIToAdd = $null)
{
Write-Host "Cannot find $CIToAddNum in the database, this CI does not seem to exist or the code to retrieve the CI failed. Please import the CI into the SMDB or fix the retrieval code and run this script again"
exit
}
# Attempt to add the CI to the Incident (Create a new relationship)
# Note: Due to a bug the -Bulk parameter is required
Write-Host "Attempting to add $CIToAddNum to Incident..."
New-SCSMRelationshipObject -Relationship $RelWIaboutCI -Source $Incident -Target $CIToAdd -bulk
# Check if script throws error (cannot find the CI from the data file in our SMDB)
# If so provides error message and exits, otherwise provides success message
if ($error -like "Cannot bind argument to parameter 'Target' because it is null.")
{
Write-Host "Encountered an error while trying to add $CIToAddNum to $IncidentNum, this CI is not in the Service Manager Database or there was a problem retrieving it. Please import the CI into the SMDB or check the retrieval code and run this script again"
exit
}
else
{
Write-Host "Successfully Added!"
}
}
Write-Host ""
Write-Host "Success! All Incidents Corrected :)"
Not sure what the problem is here, any ideas?

I hit this same problem; my solution was to use New-SCSMObject instead.
Here is a simple example:
$irprops = @{Title = 'test'
Description = 'test description'
Impact = (get-scsmenumeration -name 'System.WorkItem.TroubleTicket.ImpactEnum.Low')
Urgency = (get-scsmenumeration -name 'System.WorkItem.TroubleTicket.UrgencyEnum.Medium')
ID = "IR{0}"
Status = (get-scsmenumeration -name 'IncidentStatusEnum.Active') # status enum, not the impact enum
Priority = 2
}
$ir=new-scsmobject -Class (get-scsmclass -name 'System.WorkItem.Incident$') -PropertyHashtable $irprops -PassThru
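For the bulk scenario in the question, here is a hedged sketch of wiring that same hashtable approach into the CSV loop. The Impact/Urgency/Status enumeration names are assumptions (stock trouble-ticket enums) and would need to match your management pack; $data is the CSV imported in the question:
foreach ($case in $data) {
    # Map the CSV text values onto SCSM enumerations (names assumed; adjust to your environment)
    $irprops = @{
        Title = $case.Title
        Description = $case.Description
        Impact = (Get-SCSMEnumeration -Name "System.WorkItem.TroubleTicket.ImpactEnum.$($case.Impact)")
        Urgency = (Get-SCSMEnumeration -Name "System.WorkItem.TroubleTicket.UrgencyEnum.$($case.Urgency)")
        ID = 'IR{0}'
        Status = (Get-SCSMEnumeration -Name 'IncidentStatusEnum.Active')
    }
    # -PassThru returns the created incident, so the fragile Description-based lookup is unnecessary
    $ir = New-SCSMObject -Class (Get-SCSMClass -Name 'System.WorkItem.Incident$') -PropertyHashtable $irprops -PassThru
}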

Related

Powershell variable doesn't contain all the objects

I got the following variable $listofusers which returns the below objects in two columns:
SourceUser DestinationUser
---------- ---------------
username1@legacy.company.corp username1@modern.company.corp
username2@legacy.company.corp username2@modern.company.corp
username3@legacy.company.corp username3@modern.company.corp
username4@legacy.company.corp username4@modern.company.corp
I now need to process this list of users in a foreach loop. I have tried so far the following but without luck yet:
$Results = ForEach ($User in $listofusers) {
Write-Host "Processing SourceUser $($User.SourceUser)"
Write-Host "Processing DestinationUser $($User.DestinationUser)"
#Assign the content to variables
$SourceUsers = $User.SourceUser
$DestinationUsers = $User.DestinationUser
}
It only returns the last of the objects:
$SourceUsers
returns only: username4@legacy.company.corp
$DestinationUsers
returns only: username4@modern.company.corp
How can I add all the objects in the variable $listofusers for further processing?
UPDATE:
I am trying to achieve the following; that's why I have broken the association in $listofusers:
$SourceUser = @()
$DestinationUser = @()
$Results = ForEach ($User in $listofusers)
{
Write-Host "Processing SourceUser $($User.SourceUser)"
Write-Host "Processing DestinationUser $($User.DestinationUser)"
#Assign the content to variables
$SourceUser += $User.SourceUser
$DestinationUser += $User.DestinationUser
#Cannot get that variables working yet
$sourceusername, $sourcedomain = $SourceUser -split ("@")
$DestinationUsername, $destinationDomain = $DestinationUser -split ("@")
$SourceAccount = Get-ADUser $sourceusername -server $sourcedomain -Properties objectSid
$TargetAccount = Get-ADUser $DestinationUsername -Server $destinationDomain
}
Is there any better way to achieve that and get those variables to that point?
NEW UPDATE:
The purpose of the script is to run the following cmdlets for processing AD objects:
#get the objectSid of the source account
$objectSid = $SourceAccount.objectSid
#copy source account objectSid to target account msExchMasterAccountSid
$TargetAccount | Set-ADUser -Replace @{"msExchMasterAccountSid"=$objectSid}
#enable targetaccount
$TargetAccount | Enable-ADAccount
#disable the source account
$SourceAccount | Disable-ADAccount
#move the migrated user into prod OU
$TargetAccount | Move-ADObject -TargetPath "ou=test,dc=contoso,dc=com"
Thanks
here is a demo of the concept i was trying to get across. [grin] it keeps the association of the objects in your CSV in the original object for as long as possible. the code has NOT been tested since i have no AD access.
what it does ...
fakes reading in a CSV file
when you are ready to use real data, replace the entire "region" with a call to Import-CSV.
iterates thru the list
builds a splat of the parameters for the AD calls
see Get-Help about_Splatting for more info on that wonderfully useful idea.
calls Get-AdUser with each to the Source/Target user data sets
stores the above
uses the stored account info to ...
== replace the .objectSid of the Target account
== enable the Target account
== disable the Source account
== Move the Target account to the desired OU
the hard coded OU could be set with a variable to make this a tad more flexible. however, this seems to be a one-off operation - so there is likely no benefit.
if you want to add logging, do so in the same loop.
there is no error handling, either. that likely should be added with a try/catch around each AD call & logging of both success and failure.
the code ...
#region >>> fake reading in a CSV file
# in real life, use Import-CSV
$UserList = @'
SourceUser, DestUser
ABravo@Old.com, ABravo@NewDomain.com
BCharlie@Old.com, BCharlie@NewDomain.com
CDelta@Old.com, CDelta@NewDomain.com
DEcho@Old.com, DEcho@NewDomain.com
EFoxtrot@Old.com, EFoxtrot@NewDomain.com
'@ | ConvertFrom-Csv
#endregion >>> fake reading in a CSV file
ForEach ($UL_Item in $UserList)
{
Write-Host 'Processing ...'
Write-Host (' SourceUser {0}' -f $UL_Item.SourceUser)
Write-Host (' DestinationUser {0}' -f $UL_Item.DestUser)
Write-Host '__ Source Account __'
$GADU_Params_1 = [ordered]@{
Identity = $UL_Item.SourceUser.Split('@')[0]
Server = $UL_Item.SourceUser.Split('@')[1]
Properties = 'objectSid'
}
$SourceAccount = Get-ADUser @GADU_Params_1
Write-Host '__ Target Account __'
$GADU_Params_2 = [ordered]@{
Identity = $UL_Item.DestUser.Split('@')[0]
Server = $UL_Item.DestUser.Split('@')[1]
}
$TargetAccount = Get-ADUser @GADU_Params_2
Write-Host 'Making changes ...'
# all these piped objects are slower than making _direct_ calls
# however, i don't have any way to test the code, so i can't use what likely is faster
# something like >>>
# Set-AdUser -Identity $TargetAccount -Replace @{
# 'msExchMasterAccountSid' = $objectSid
# }
# note that i also replaced the unneeded _double_ quotes with the safer _single_ quotes
$TargetAccount |
Set-AdUser -Replace @{
'msExchMasterAccountSid' = $SourceAccount.objectSid
}
$TargetAccount |
Enable-AdAccount
$SourceAccount |
Disable-AdAccount
$TargetAccount |
Move-AdObject -TargetPath 'ou=test,dc=contoso,dc=com'
Write-Host ('=' * 30)
Write-Host ''
}
no output shown since i can't actually run this AD stuff. [grin]
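to illustrate the try/catch idea mentioned above, here is a sketch of wrapping one of the AD calls (untested, same caveat as the rest; the same pattern applies to each call):
try
{
    # -ErrorAction Stop makes non-terminating errors catchable
    $SourceAccount = Get-ADUser @GADU_Params_1 -ErrorAction Stop
}
catch
{
    Write-Warning ('Failed to fetch source account {0} : {1}' -f $UL_Item.SourceUser, $_.Exception.Message)
    # skip this CSV row and move on to the next
    continue
}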
$SourceUsers and $DestinationUsers contain only the last ones because you are replacing the value on each foreach iteration.
If you want to separate the properties, try this:
$SourceUsers = $User | Select-Object -ExpandProperty SourceUser
$DestinationUsers = $User | Select-Object -ExpandProperty DestinationUser
That will create a collection of only those strings. You won't be able to access those values by property anymore; it is a simple String[] after the -ExpandProperty.
$SourceUsers = @()
$DestinationUsers = @()
$Results = ForEach ($User in $listofusers) {
Write-Host "Processing SourceUser $($User.SourceUser)"
Write-Host "Processing DestinationUser $($User.DestinationUser)"
#Assign the content to variables
$SourceUsers += $User.SourceUser
$DestinationUsers += $User.DestinationUser
}
$SourceUsers = @() and $DestinationUsers = @() create two empty arrays which we will use in the loop.
+= is an assignment operator which enables us to append more than one value to a variable. According to the documentation, it "increases the value of a variable by the specified value, or appends the specified value to the existing value."
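As an aside, += rebuilds the array on every iteration. Since PowerShell collects everything a loop emits, a leaner sketch of the same idea is:
# let the loop emit the values and capture the whole stream in one assignment
$SourceUsers = foreach ($User in $listofusers) { $User.SourceUser }
$DestinationUsers = foreach ($User in $listofusers) { $User.DestinationUser }
# or keep each pair together and split on '@' only when the AD calls need it
$Results = foreach ($User in $listofusers) {
    [pscustomobject]@{
        SourceUser = $User.SourceUser
        DestinationUser = $User.DestinationUser
    }
}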

How do I use the PSGSuite PS module to add Google groups?

OK, so I work for a small business, and they use a Google spreadsheet as the "Phone list"
for finding and contacting employees. I installed the PSGSuite PowerShell module and it seems to work very well, but I'm new to PowerShell and coding in general. The filter sheet I made alongside the phone list places the employees in their respective groups. Example data, then the code.
"phone list"
Name # Company code Ext. Department Job Title Email
Hayden 111-222-333 JOP IT Technician example@example.com
"filter 2sheet"
JOP SPD
hayden@.com lisa@.com
john@.com arron@.com
david@.com mike@.com
I want to add these emails to their respective Google groups.
## NOVA BEAZ ##
## add groups in google based on company title
###
####
# Import Modules
Import-Module PSGSuite
# Create Array of Groups
$Title = (Import-GSSheet -SpreadsheetId "1NtCT5ruoL4Kf4-ec55xe-L8esXcSY8orfd-zOFK4q4k" -SheetName "Filter" -Headers "None" -Range "A1:1")
$Title = $Title | % { $_ }
$Groups = (Get-GSgroup -Fields "Name" )
if($Title = $Groups)
#{add that users email to the group}
#else
{echo "there is now group that matches that"}
The main issue is that I just don't know how to correctly run through the arrays and select all the emails in a row to add to the Google groups. I think I need an array or object-list form of storing my emails; I want this to be dynamic.
Excerpts from my blog post on how to use the PSGSuite module.
Please check if the following answers your question. If not, let me know.
User Process
To begin, we need to import the module and then use the command Get-GSDriveFileList to find the Google Sheet where our data is stored.
Next, we use the command Import-GSSheet to import our user and group data.
Get Data from GSheet
# Import module
Import-Module -Name PSGSuite
# Discover spreadsheet Id in drive file list
$Spreadsheet = Get-GSDriveFileList -Filter "name = 'UserManagement'"
# Get data from each sheet from Google spreadsheet
$UserData = Import-GSSheet -SpreadsheetId $Spreadsheet.Id -SheetName 'Users'
$GroupData = Import-GSSheet -SpreadsheetId $Spreadsheet.Id -SheetName 'Groups'
Create Organization Units
We use Get-GSOrganizationalUnit to determine if the OU exists. And then we use New-GSOrganizationalUnit to create it if it does not.
foreach ($Group in $GroupData) {
$SplitPath = $Group.OrgUnitPath -Split '/'
$ParentPath = $SplitPath[0..($SplitPath.Count -2)] -join '/'
$OUPath = $SplitPath[-1]
$OrgUnit = Get-GSOrganizationalUnit -SearchBase $Group.OrgUnitPath -SearchScope Base -ErrorAction SilentlyContinue
if ($OrgUnit) {
"Org Unit {0} exists at {1}" -f $OrgUnit.OrgUnitPath,$OrgUnit.ParentOrgUnitPath
} else {
"Org Unit {0} does not exist; attempting to create in {1}" -f $Group.OrgUnitPath,$ParentPath
try {
$GSOrgUnit = New-GSOrganizationalUnit -Name $OUPath.ToLower() -ParentOrgUnitPath $ParentPath -Description $Group.Description
"Created {0} : {1}" -f $GSOrgUnit.OrgUnitPath,$GSOrgUnit.Description
}
catch {
"Unable to create {0}" -f $Group.OrgUnitPath
}
}
}
Create Groups
Using the command Get-GSGroup, we check if the group exists. If the group does not already exist, use New-GSGroup to create the group from the spreadsheet.
foreach ($Group in $GroupData) {
$GSGroup = Get-GSGroup -Group $Group.Name -ErrorAction SilentlyContinue
if ($GSGroup) {
"Group {0} exists" -f $Group.Name
} else {
"Group {0} does not exist; attempting to create" -f $Group.Name
try {
$NewGSGroup = New-GSGroup -Name $Group.Name -Email $Group.Email -Description $Group.Description
"Created {0} : {1}" -f $NewGSGroup.Name,$NewGSGroup.Description
}
catch {
"Unable to create {0}" -f $Group.Name
}
}
}
Create Users
Create the users listed in the spreadsheet.
First, determine the department based on the user type.
Using the department, set the variable for the org unit path.
Create the required hashtable for CustomSchemas to add the EmployeeType to the user.
Generate a random secure password.
Using the command New-GSUser, create the new user.
If the user is successfully created, use the command New-GSUserAlias for best effort to create an email alias based on the user’s full name.
foreach ($User in $UserData) {
$Domain = $User.Email.Split('@')[1]
switch ($User.UserType) {
'Faculty' { $Department = 'Academics'}
'Staff' { $Department = 'Business' }
}
# Set OU path
$OrgUnitPath = $GroupData.Where({$_.Name -eq $Department}).OrgUnitPath
# Set employee type custom schema
$CustomSchemas = @{
CustomUniversity = @{
EmployeeType = $User.UserType
}
}
# Set a random secure string
$Password = ConvertTo-SecureString -String (Get-RandomPassword) -AsPlainText -Force
$NewGSUserParams = @{
PrimaryEmail = $User.Email
FullName = $User.FullName
GivenName = $User.GivenName
FamilyName = $User.FamilyName
OrgUnitPath = $OrgUnitPath
CustomSchemas = $CustomSchemas
Password = $Password
}
$NewUser = New-GSUser @NewGSUserParams -ErrorAction SilentlyContinue
if ($NewUser) {
'Created user {0} with primary email {1}' -f $User.FullName,$User.Email
} else {
'Failed to create user {0}' -f $User.Email
}
New-GSUserAlias -User $NewUser.PrimaryEmail -Alias ( $NewUser.Name.FullName.Replace(' ',''),$Domain -join '@') -ErrorAction SilentlyContinue | Out-Null
}
The Get-RandomPassword function is a mock-up. You would need to provide your own password method.
You can omit CustomSchemas from the hashtable. The blog post shows how to manually create new attributes, if you are interested.
Assign Users to Groups
Next, we use Get-GSUserList to get a list of all users in the parent OU, and then add the user to the group with Add-GSGroupMember.
$UserToGroupList = Get-GSUserList -SearchBase '/test' -SearchScope Subtree
foreach ($User in $UserToGroupList) {
switch -regex ($User.OrgUnitPath) {
'academics' { $GroupName = 'Academics'}
'business' { $GroupName = 'Business'}
}
try {
Add-GSGroupMember -Identity $GroupName -Member $User.User -ErrorAction Stop | Out-Null
'Added {0} to group {1}' -f $User.User,$GroupName
}
catch {
'Failed to add {0} to group {1}' -f $User.User,$GroupName
}
}
Note: I manually created a /test organizational unit and blocked automatic assignment of a license since I’m using my personal G Suite account. I don’t want any surprises at the end of the month.
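For the filter-sheet layout in the original question (group names in row 1, member emails in the rows below), a hedged sketch of a dynamic loop might look like this. The spreadsheet ID comes from the question; whether Import-GSSheet adds any bookkeeping properties to each row object is an assumption worth verifying:
# Read the Filter sheet; the first row becomes the property (group) names
$FilterData = Import-GSSheet -SpreadsheetId '1NtCT5ruoL4Kf4-ec55xe-L8esXcSY8orfd-zOFK4q4k' -SheetName 'Filter'
$GroupNames = $FilterData[0].PSObject.Properties.Name
foreach ($Row in $FilterData) {
    foreach ($GroupName in $GroupNames) {
        $Email = $Row.$GroupName
        # skip the blank cells in ragged columns
        if (-not [string]::IsNullOrWhiteSpace($Email)) {
            try {
                Add-GSGroupMember -Identity $GroupName -Member $Email -ErrorAction Stop | Out-Null
                'Added {0} to group {1}' -f $Email,$GroupName
            }
            catch {
                'Failed to add {0} to group {1}' -f $Email,$GroupName
            }
        }
    }
}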

Powershell Looping through eventlog

I am trying to gather logon, disconnect, logoff, etc. data from event logs; this data will be stored in CSV format.
This is the script I am working with, which I got from Microsoft TechNet and have modified to meet my requirements. The script is working as it should, but there is some looping going on that I can't figure out how to stop.
$ServersToQuery = Get-Content "C:\Users\metho.HOME\Desktop\computernames.txt"
$cred = "home\Administrator"
$StartTime = "September 19, 2018"
#$Yesterday = (Get-Date) - (New-TimeSpan -Days 1)
foreach ($Server in $ServersToQuery) {
$LogFilter = @{
LogName = 'Microsoft-Windows-TerminalServices-LocalSessionManager/Operational'
ID = 21, 23, 24, 25
StartTime = (Get-Date).AddDays(-1)
}
$AllEntries = Get-WinEvent -FilterHashtable $LogFilter -ComputerName $Server -Credential $cred
$AllEntries | Foreach {
$entry = [xml]$_.ToXml()
$Output += New-Object PSObject -Property @{
TimeCreated = $_.TimeCreated
User = $entry.Event.UserData.EventXML.User
IPAddress = $entry.Event.UserData.EventXML.Address
EventID = $entry.Event.System.EventID
ServerName = $Server
}
}
}
$FilteredOutput += $Output | Select TimeCreated, User, ServerName, IPAddress, @{Name='Action';Expression={
if ($_.EventID -eq '21'){"logon"}
if ($_.EventID -eq '22'){"Shell start"}
if ($_.EventID -eq '23'){"logoff"}
if ($_.EventID -eq '24'){"disconnected"}
if ($_.EventID -eq '25'){"reconnection"}
}
}
$Date = (Get-Date -Format s) -replace ":", "-"
$FilePath = "$env:USERPROFILE\Desktop\$Date`_RDP_Report.csv"
$FilteredOutput | Sort TimeCreated | Export-Csv $FilePath -NoTypeInformation
Write-host "Writing File: $FilePath" -ForegroundColor Cyan
Write-host "Done!" -ForegroundColor Cyan
#End
The first time I run the script, it runs fine and I get the CSV output as it should be. When I run the script again, a new CSV is created (as it should be), but the same event log entries are created twice; run it again and three entries are created for the same event. This is very strange, as a new CSV is created each time and I do not have the -Append switch configured for Export-Csv.
$FilteredOutput = @()
$Output = @()
I did try adding these two lines to the above script, as I read somewhere that this is needed if I am mixing multiple variables into an array (I do not understand this, so apologies if I get it wrong).
Can someone please help me with this? More importantly, I need to understand it, as it is good to know for my future projects.
Thanks again.
mEtho
It sounds like the $Output and $FilteredOutput variables aren't getting cleared when you run the script subsequent times (nothing in the current script looks to do that), so the results are just getting appended to these variables each time.
As you've already said, you could add these to the top of your script:
$FilteredOutput = @()
$Output = @()
This will initialise them as empty arrays at the beginning, which ensures they start empty as well as making it possible for them to be appended to (which happens in the script via +=). Without this, the first run would likely have failed, so I assume you must have initialised them in your current session at some point for it to be working at all.
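Alternatively, here is a sketch that sidesteps the accumulation entirely: emit the objects and assign the loop output directly, so nothing carries over between runs ($LogFilter, $ServersToQuery and $cred as in your script):
$Output = foreach ($Server in $ServersToQuery) {
    foreach ($Event in (Get-WinEvent -FilterHashtable $LogFilter -ComputerName $Server -Credential $cred)) {
        $entry = [xml]$Event.ToXml()
        # emitting the object sends it to $Output; no += needed
        [pscustomobject]@{
            TimeCreated = $Event.TimeCreated
            User = $entry.Event.UserData.EventXML.User
            IPAddress = $entry.Event.UserData.EventXML.Address
            EventID = $entry.Event.System.EventID
            ServerName = $Server
        }
    }
}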

Using Powershell to Bulk Import Large CSV into SQL Server

I came across a post discussing how to use Powershell to bulk import massive data relatively fast. I have a typical csv file with about 5 million rows formatted in the usual way.
I keep getting the same error messages regardless of whether I import a .txt or .csv file. Playing around with the csvdelimiter/firstcolumnnames section also created its own issues.
I've spent hours trying to figure out how to get it to work with MY csv files, and I keep getting the same error messages no matter what I try. All field names accept NULL, and they are identical in every way between the table and the CSV file. I do not have a primary key for the database.
# Database variables
$sqlserver = "SERVERNAMEHERE"
$database = "autos"
$table = "AgedAutos"
# CSV variables
$csvfile = "C:\temp\aged.csv"
$csvdelimiter = "',"
$firstRowColumnNames = $true
################### No need to modify anything below ###################
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
# 50k worked fastest and kept memory usage to a minimum
$batchsize = 50000
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable, and autogenerate the columns.
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($firstRowColumnNames -eq $true) { $null = $reader.readLine() }
foreach ($column in $columns) {
$null = $datatable.Columns.Add()
}
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {
$null = $datatable.Rows.Add($line.Split($csvdelimiter))
$i++; if (($i % $batchsize) -eq 1) {
$bulkcopy.WriteToServer($datatable)
Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
$datatable.Clear()
}
}
# Add in all the remaining rows since the last clear
if($datatable.Rows.Count -gt 0) {
$bulkcopy.WriteToServer($datatable)
$datatable.Clear()
}
# Clean Up
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
Write-Host "Script complete. $i rows have been inserted into the database."
Write-Host "Total Elapsed Time: $($elapsed.Elapsed.ToString())"
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
Error message listed below.
Exception calling "WriteToServer" with "1" argument(s): "The given value of type String from the data source cannot be converted to
type date of the specified target column."
At C:\powershell_scripts\batch_csv_import-code1-working-test for auto table.ps1:43 char:3
+ $bulkcopy.WriteToServer($datatable)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : InvalidOperationException
340000 rows have been inserted in 00:00:03.5156162
I have no idea what that error means since I cannot find anything useful on Google. I'm thinking one of the columns might be listed incorrectly in SQL Server, but I could be wrong.
Please help me figure out the problem. Thanks.
You are getting all the data in the first column because your value for $csvdelimiter is incorrect.
You have: $csvdelimiter = "',
It should be: $csvdelimiter = ","
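A quick way to sanity-check the delimiter before kicking off the bulk copy is to split the header row and compare the column count against the destination table (a sketch using the question's variables):
$csvdelimiter = ","
# The number of parsed columns should match the AgedAutos table
$header = Get-Content $csvfile -First 1
$header.Split($csvdelimiter).Count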

Powershell restore script timing out

I have a Powershell script based on this example that I've been using to automate my database restores, which has worked without a problem for a while now.
Recently it started failing and throwing timeout errors.
To highlight the code block from that example:
############################################################################
# Restore db Script block
############################################################################
[ScriptBlock] $global:RestoreDBSMO = {
param([string] $newDBName, [string] $backupFilePath, [bool] $isNetworkPath = $true, [string] $newDataPath, [string] $newLogPath)
# http://www.codeproject.com/Articles/110908/SQL-DB-Restore-using-PowerShell
try
{
# Load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
# Create sql server object
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "(local)"
# Copy database locally if backup file is on a network share
if($isNetworkPath)
{
$fileName = [IO.Path]::GetFileName($backupFilePath)
$localPath = Join-Path -Path $server.DefaultFile -ChildPath $fileName
Copy-Item $backupFilePath $localPath
$backupFilePath = $localPath
}
# Create restore object and specify its settings
$smoRestore = new-object("Microsoft.SqlServer.Management.Smo.Restore")
$smoRestore.Database = $newDBName
$smoRestore.NoRecovery = $false;
$smoRestore.ReplaceDatabase = $true;
$smoRestore.Action = "Database"
# Create location to restore from
$backupDevice = New-Object("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($backupFilePath, "File")
$smoRestore.Devices.Add($backupDevice)
# Give empty string a nice name
$empty = ""
# Specify new data file (mdf)
$smoRestoreDataFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
if (($newDataPath -eq $null) -or ($newDataPath -eq $empty)) { # use existing path
$defaultData = $server.DefaultFile
if (($defaultData -eq $null) -or ($defaultData -eq $empty))
{
$defaultData = $server.MasterDBPath
}
} Else {$defaultData = $newDataPath}
$fullPath = Join-Path -Path $defaultData -ChildPath ($newDBName + "_Data.mdf")
$smoRestoreDataFile.PhysicalFileName = $fullPath
# Specify new log file (ldf)
$smoRestoreLogFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
if (($newLogPath -eq $null) -or ($newLogPath -eq $empty)) { # use existing path
$defaultLog = $server.DefaultLog
if (($defaultLog -eq $null) -or ($defaultLog -eq $empty))
{
$defaultLog = $server.MasterDBLogPath
}
} Else {$defaultLog = $newLogPath}
$fullPath = Join-Path -Path $defaultLog -ChildPath ($newDBName + "_Log.ldf")
$smoRestoreLogFile.PhysicalFileName = $fullPath
# Get the file list from backup file
$dbFileList = $smoRestore.ReadFileList($server)
# The logical file names should be the logical filename stored in the backup media
$smoRestoreDataFile.LogicalFileName = $dbFileList.Select("Type = 'D'")[0].LogicalName
$smoRestoreLogFile.LogicalFileName = $dbFileList.Select("Type = 'L'")[0].LogicalName
# Add the new data and log files to relocate to
$smoRestore.RelocateFiles.Add($smoRestoreDataFile)
$smoRestore.RelocateFiles.Add($smoRestoreLogFile)
# Restore the database
$smoRestore.SqlRestore($server)
"Database restore completed successfully"
}
catch [Exception]
{
"Database restore failed:`n`n " + $_.Exception
}
finally
{
# Clean up copied backup file after restore completes successfully
if($isNetworkPath)
{
Remove-Item $backupFilePath
}
}
}
When doing the actual restore, it throws the following error:
Restoring DB...
0
1
Database restore failed:
Microsoft.SqlServer.Management.Smo.FailedOperationException: Restore failed for Server 'SRV_MIC_DEV'. ---> Microsoft.SqlServer.Management.Common.ExecutionFailureException: An exception occurred while executing a Transact-SQL statement or batch. ---> System.
Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
The backup or restore was aborted.
10 percent processed.
20 percent processed.
30 percent processed.
40 percent processed.
50 percent processed.
60 percent processed.
at Microsoft.SqlServer.Management.Common.ConnectionManager.ExecuteTSql(ExecuteTSqlAction action, Object execObject, DataSet fillDataSet, Boolean catchException)
at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
--- End of inner exception stack trace ---
at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(StringCollection sqlCommands, ExecutionTypes executionType)
at Microsoft.SqlServer.Management.Smo.ExecutionManager.ExecuteNonQuery(StringCollection queries)
at Microsoft.SqlServer.Management.Smo.BackupRestoreBase.ExecuteSql(Server server, StringCollection queries)
at Microsoft.SqlServer.Management.Smo.Restore.SqlRestore(Server srv)
--- End of inner exception stack trace ---
at Microsoft.SqlServer.Management.Smo.Restore.SqlRestore(Server srv)
at SqlRestore(Object , Object[] )
at System.Management.Automation.DotNetAdapter.AuxiliaryMethodInvoke(Object target, Object[] arguments, MethodInformation methodInformation, Object[] originalArguments)
I've had a look through the PowerShell documentation and other links, and noticed that some websites mention that PowerShell cmdlets have a 600-second default timeout.
I'm trying to find a solution to increase this timeout from either PowerShell or SQL Server, but I'm not having any luck with documentation or regular web searches.
Is there any quick fix for my script to increase the timeout or a simple alternative?
Any help is much appreciated.
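One avenue worth trying (a hedged sketch, untested here): SMO issues the RESTORE through the server's ConnectionContext, whose StatementTimeout defaults to 600 seconds, which matches the symptom. Raising it just after the server object is created should lift the limit:
# Create sql server object, then lift the 600-second default statement timeout
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "(local)"
$server.ConnectionContext.StatementTimeout = 0   # 0 = wait indefinitely; or a value like 3600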
