I have a file with multiple expressions like "$REGX('CareMedic.2_0','CustomerInformation','Customer Information')". The file can be an XML file, a text file, or any other type. If the file contains nine of those expressions, I want to pull all nine and send the values to a database.
Here is the code I have tried:
$input_path = 'C:\Users\Administrator\Desktop\test2.xml'
$SQLServer = "WIN-17V7QT0IJVK"
$SQLDBName = "Test"
$uid ="WIN-17V7QT0IJVK\Administrator"
$pwd = "letmebackinplease"
$SqlQuery = "SELECT * from product_schema;"
$ConnectionString = "Server = $SQLServer; Database = $SQLDBName; Integrated Security = True;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $ConnectionString
$SqlConnection.open()
if($SqlConnection.state -eq "Open"){
Write-Host "Test connection successful"
}
$regex = '()\(.*?\)'
$output = select-string -Path $input_path -Pattern $regex -AllMatches | % { $_.Matches } | % { $_.Value } |
ForEach-Object {
($_ -split "\(|\)")[1]
}
foreach ($line in $output){
$line = $line -replace "\(",""
$line = $line -replace "\)",""
$line = $line -replace "\'",""
$col1,$col2,$col3 = $line -split ","
[PSCustomObject]@{
col1 = $col1
col2 = $col2
col3 = $col3
} | select col1,col2,col3
$insert_query = "INSERT INTO [$SQLDBName].[dbo].[product_schema]
([version]
,[field]
,[value])
VALUES
($col1, $col2, $col3);"
$execute_query = New-Object System.Data.SqlClient.SqlCommand
$execute_query.connection = $SQLConnection
$execute_query.commandtext = $insert_query
$execute_query.ExecuteNonQuery()
}
$SqlConnection.close()
If the file has two of the below:
('Medic.2_0','AgeInformation','Age Information')
('Medic.2_0','TransactionID','Transaction ID')
My output should be:
'Medic.2_0' stored in Version Column
'AgeInformation' stored in the Field Column
'Age Information' stored in the value column
'Medic.2_0' stored in Version Column
'TransactionID' stored in the Field Column
'Transaction ID' stored in the value column
I have to take each of the values and store it in a column of a temp table set up on the MySQL server, like below:
**Version**    **Field**           **Value**
Medic.2_0      AgeInformation      Age Information
Medic.2_0      TransactionID       Transaction ID
Error Encountered:
Exception calling "ExecuteNonQuery" with "0" argument(s): "Incorrect syntax near '.2'."
At C:\Users\Administrator\Desktop\test.ps1:47 char:10
+ $execute_query.ExecuteNonQuery()
+ ~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
Can someone please recommend how I should change my code to solve this?
In answer to your original question before editing: Assuming your output looks like this and is saved in a variable named $output
('Medic.2_0','AgeInformation','Age Information')
('Medic.2_0','TransactionID','Transaction ID')
Try this:
foreach ($line in $output){
$line = $line -replace "\(",""
$line = $line -replace "\)",""
$line = $line -replace "\'",""
$col1,$col2,$col3 = $line -split ","
[PSCustomObject]@{
col1 = $col1
col2 = $col2
col3 = $col3
} | select col1,col2,col3 | export-csv d:\test.csv -append -NoTypeInformation
}
We loop through $output line by line, removing the brackets and the single quotes, splitting the remaining text on the comma, and then assigning each of the three entries to its own variable. Once they are in variables, we can easily create a PSCustomObject and use it to select what we need for Export-Csv.
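For example, with the two sample lines above, the exported CSV would look roughly like this (the headers come from the PSCustomObject property names):
"col1","col2","col3"
"Medic.2_0","AgeInformation","Age Information"
"Medic.2_0","TransactionID","Transaction ID"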
Try adding this code:
$info = @() # to store your values
foreach($item in $output){
$z = $item.split(',') # split into 3 strings
$info += [PSCustomObject]@{ # create a custom object with named columns to store our values
Version = $z[0]
Field = $z[1]
Value = $z[2]
}
}
Write-Output $info # the variable that holds all the objects
Then run a foreach loop over each object in $info.
You can run it like this:
foreach($data in $info){
$data.Version #to access Version field
$data.Field #to access Field field
$data.Value #to access Value field
.......your SQL query......
}
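The original error ("Incorrect syntax near '.2'") comes from interpolating unquoted string values straight into the INSERT statement. A minimal sketch of the insert using SQL parameters instead of string interpolation (it assumes the $info objects built above and the open $SqlConnection from the question; the parameter names are illustrative):
foreach ($data in $info) {
    $insert_query = "INSERT INTO [dbo].[product_schema] ([version],[field],[value]) VALUES (@version, @field, @value);"
    $cmd = New-Object System.Data.SqlClient.SqlCommand($insert_query, $SqlConnection)
    # Parameters keep quoting and escaping out of the SQL text entirely
    $null = $cmd.Parameters.AddWithValue("@version", $data.Version)
    $null = $cmd.Parameters.AddWithValue("@field", $data.Field)
    $null = $cmd.Parameters.AddWithValue("@value", $data.Value)
    $null = $cmd.ExecuteNonQuery()
}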
I have the below code snippet and all is working fine, except that it looks like none of the values are wrapped in single quotes, which the stored procedure complains about.
Is there a way to tell it to wrap each element in $p in single quotes before the SQL is executed?
$p contains 30 different elements which can be numeric or alpha-numeric.
Param (
[Parameter(Mandatory=$true)][array]$p
)
Process {
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Data Source=OurServer;Initial Catalog=OurDatabase;Integrated Security=true"
$cmd = New-Object System.Data.SqlClient.SqlCommand
$cmd.Connection = $conn
$cmd.CommandTimeout = 0
$cmd.CommandText = "EXEC sp_test $p"
}
There are a number of options to update array elements:
# Using -replace
$p = $p -replace '^.*$','''$&'''
# Using foreach-object and string format operator
$p = $p | Foreach-Object { "'{0}'" -f $_ }
# Using foreach method
$p = $p.foreach({"'$_'"})
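If the goal is to build the final EXEC string, the quoted elements still have to be joined back together. A rough sketch (it assumes the $cmd from the question; doubling embedded quotes keeps the SQL valid):
$quoted = $p | ForEach-Object { "'{0}'" -f ($_ -replace "'", "''") }
$cmd.CommandText = "EXEC sp_test $($quoted -join ', ')"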
I'm trying to execute an SQL query with a few SELECT statements, which returns multiple tables as a result. The problem is that I can't find a way to read and use the tables separately.
Expected results:
Actual results: (it is printed row by row)
Purpose: I've made a script that creates an empty Excel file with multiple sheets; each sheet will be used to hold one result set of the query.
The only thing left is to put the needed text into the sheets. Here is my code for that part only:
$ConnectionString = "Data Source=...;Initial Catalog=...;User Id=...;Password=..."
$DBServerName = $ConnectionString.split('=')[1].split(';')[0]
$DBName = $ConnectionString.split('=')[2].split(';')[0]
$DBUser = $ConnectionString.split('=')[3].split(';')[0]
$DBPassword = $ConnectionString.split('=')[4].split(';')[0]
$CurrentFilePath = "C:\SQLqueryWithManyResultsets.sql"
$query = Get-Content -literalPath $CurrentFilePath | Out-String #getting the query string from file
$resultTables = Invoke-Sqlcmd -Query $query -ServerInstance $DBServerName -Database $DBName -DisableVariables -Password $DBPassword -Username $DBUser -ErrorAction Stop
foreach ($result in $resultTables) {
$result | Format-Table #where the magic happens
}
I've done a lot of research, but I cannot find a proper way to store and read the tables the way I need.
Try this:
Clear-Host;
$objConnection = New-Object System.Data.SqlClient.SqlConnection;
$objConnection.ConnectionString = "...";
$ObjCmd = New-Object System.Data.SqlClient.SqlCommand;
$ObjCmd.CommandText = "...";
$ObjCmd.Connection = $objConnection;
$ObjCmd.CommandTimeout = 0;
$objAdapter = New-Object System.Data.SqlClient.SqlDataAdapter;
$objAdapter.SelectCommand = $ObjCmd;
$objDataSet = New-Object System.Data.DataSet;
$objAdapter.Fill($objDataSet) | Out-Null;
for ($i=0; $i -lt $objDataSet.Tables.Count; $i++) {
Write-Host ($objDataSet.Tables[$i] | Format-Table | Out-String);
}
$query = $null;
$objDataSet = $null;
$objConnection.Close();
$objConnection = $null;
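Each result set then lives in its own DataTable, so it can be handled separately instead of being flattened row by row. A small sketch, assuming the dataset was filled as above (the CSV paths are only examples; writing each table to its own Excel sheet would follow the same indexing):
# Tables[0] holds the first SELECT's rows, Tables[1] the second, and so on
for ($i = 0; $i -lt $objDataSet.Tables.Count; $i++) {
    # DataRows carry a few extra properties (RowError, RowState, ...), so strip them before export
    $objDataSet.Tables[$i] |
        Select-Object * -ExcludeProperty RowError, RowState, Table, ItemArray, HasErrors |
        Export-Csv "C:\temp\resultset_$i.csv" -NoTypeInformation
}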
Working on another script for work. I'm attempting to read from a CSV containing only one column of data, then for each item find the corresponding ID by querying the SQL database, and then put the result (ID1, CSVID1) into an Excel file (I have this part working fine).
Now I have run into an issue: how do I populate the dataset within a foreach loop?
$excelAssets = Import-Csv .\test.csv -Header assetId | Foreach-Object {
$assetId = $_.assetId
# SQL Query Variables
$query = "SELECT AssetId AS AssetID, BrandId AS BrandID FROM [AssetLibrary_BrandAsset] WHERE AssetId = $assetId"
$connection = New-SqlConnection -Server $dataSource -Database $dataBase
#Execute the SQL commands and place the results in dataset
if ($connection.State -eq 'Open')
{
$swLap = Start-Elapsed $sw "Executing SQL Query"
Write-Verbose "$query";
$dataSet += Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0'
$i++
$connection.Close();
End-Elapsed $sw $swLap
} ELSE {
Write-Error "$($(Format-Elapsed $swLap)) SQL Connection Not Open - Exiting...";
exit;
}
}
Now $dataSet += doesn't work and I have googled numerous times to try and find the answer to this problem. Any help is appreciated.
Using the $dataSet
$dataTable = new-object "System.Data.DataTable" "Results"
$dataTable = $dataSet.Tables[0]
$rowDT = $dataTable.Rows.Count;
$colDT = $dataTable.Columns.Count;
Write-Host -NoNewLine "$(Format-Elapsed $sw.Elapsed) Rows: ";
Write-Host -NoNewLine "$($rowDT+1)" -ForegroundColor "Green";
Write-Host -NoNewLine " Columns: "
Write-Host -NoNewLine "$($colDT+1)" -ForegroundColor "Green";
Write-Host -NoNewLine " Cells: "
Write-Host "$( ($colDT+1)*($rowDT+1) )" -ForegroundColor "Green";
#Create a 2D Array of the DataTable
# http://stackoverflow.com/questions/13184191/fastest-way-to-drop-a-dataset-into-a-worksheet
$tableArray = New-Object 'object[,]' $rowDT, $colDT;
$swLap = Start-Elapsed $sw "DataTable transformation"
# i = row and j = column
for ($i=0;$i -lt $rowDT; $i++)
{
#Write-Progress -Activity "Transforming DataTable" -status "Row $i" -percentComplete ($i / $rowDT*100)
for ($j=0;$j -lt $colDT; $j++)
{
$tableArray[$i,$j] = $dataTable.Rows[$i].Item($j).ToString();
}
}
End-Elapsed $sw $swLap
$rowOffset = 1; $colOffset = 1;# 1,1 = "A1"
# Write out the header column names
for ($j=0;$j -lt $colDT; $j++)
{
$ActiveWorksheet.cells.item($rowOffset, $j+1) = $dataTable.Columns[$j].ColumnName;
}
$headerRange = $ActiveWorksheet.Range($ActiveWorksheet.cells.item($rowOffset, $colOffset), $ActiveWorksheet.cells.item($rowOffset, $colDT+$colOffset-1));
$headerRange.Font.Bold = $false
$headerRange.Interior.Color = $headingColour
$headerRange.Font.Name = $headingFont
$headerRange.Font.Color = $headingFontColour
$rowOffset++;
# Extract the data to Excel
$tableRange = $ActiveWorksheet.Range($ActiveWorksheet.cells.item($rowOffset, $colOffset), $ActiveWorksheet.cells.item($rowDT+$rowOffset-1, $colDT+$colOffset-1));
$tableRange.Cells.Value2 = $tableArray;
# Resize the columns in Excel
$swLap = Start-Elapsed $sw "Resize Excel Worksheet"
$wholeRange = $ActiveWorksheet.UsedRange
$wholeRange.EntireColumn.AutoFit() | Out-Null
End-Elapsed $sw $swLap
# Save Excel workbook
$ActiveWorkbook.SaveAs("$OutputFile")
$ActiveWorkbook.Close()
After assigning to $dataSet the first time, its type is probably not array, meaning that the += operator doesn't behave exactly as you expect.
You can either initialize $dataSet as an empty array before you start assigning to it:
Import-Csv .\test.csv -Header assetId | Foreach-Object -Begin {$dataSet = @()} -Process {
# rest of script in here
} -End {return $dataSet}
or you can cast it during assignment:
[array]$dataSet += Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0'
Finally, an alternative solution would be to ensure that the output from Invoke-SQLQuery is treated as an array before you assign it to $dataSet:
$dataSet += #(Invoke-SQLQuery -connection $connection -query $query -ExecutionTimeout '0')
Whatever suits your style of coding.
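For reference, a quick illustration of the difference in isolation (the values are made up); without the array initialisation or cast, += on a single result just concatenates:
$x = "first"
$x += "second"       # string concatenation -> "firstsecond"

[array]$y = "first"
$y += "second"       # array append -> first, second
$y.Count             # 2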
I am using the Scripter class to give me a script for the data out of an existing database. I want to script a dataset that can be inserted into a production database. We are doing this to test whether an installation of our software is correct.
Unfortunately, the dataset has to be removed later without leaving any entries behind, so that it does not interfere with our customers' data. So what I need are INSERT and DELETE statements. These are maintained manually at the moment, which is too much of a burden.
So I simply ran the Scripter twice (once for INSERT, once for DELETE).
The problem is that when setting ScriptDrops to true, the output is in the form
DELETE FROM [dbo].[TableName]
What I would like is something of the form:
DELETE FROM [dbo].[TableName] WHERE ID = 'GUID'
Technically this would be possible since there are Primary Keys on all the tables.
The Scripter class must also in some form know of that things since it also gets the order of the DELETE-statements (dependencies) correct via foreign keys.
Any help on this would be appreciated.
Following are the 2 PowerShell-scripts I am using to export the data:
ScriptRepositoryData.ps1
$scriptPath = $MyInvocation.MyCommand.Path
$scriptDirectory = Split-Path $scriptPath -Parent
. $scriptDirectory\DatabaseScripting.ps1
$filepath='c:\data.sql'
$database='ECMS_Repository'
$tablesToExclude = @(
"SomeUnwantedTable"
)
$tablesListFromDatabase = GetTableList $database
$tablesArray = @()
$tablesListFromDatabase |% {
if (-not $tablesToExclude.Contains($_.Name.ToString()))
{
$tablesArray += $_.Name
}
}
ScriptInsert $database $tablesArray $filepath
DatabaseScripting.ps1
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMOExtended") | out-null
Function GetTableList ($database)
{
Invoke-SqlCmd -Database $database -query "SELECT * FROM sys.tables"
}
Function ScriptInsert ($database, $tables, $destination)
{
try {
$serverMO = new-object ("Microsoft.SqlServer.Management.Smo.Server") "localhost"
if ($serverMO.Version -eq $null) {Throw "Can't find the instance localhost"}
$urnsToScript = New-Object Microsoft.SqlServer.Management.Smo.UrnCollection
$databaseMO = $serverMO.Databases.Item($database)
if ($databaseMO.Name -ne $database) {Throw "Can't find the database $database"}
$tables |% {
$tableListMO = $databaseMO.Tables.Item($_, "dbo")
$tableListMO |% {
$urnsToScript.Add($_.Urn)
}
}
$scripter = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') $serverMO
$scripter.Options.ScriptSchema = $False;
$scripter.Options.ScriptData = $true;
$scripter.Options.ScriptDrops = $true;
$scripter.Options.ScriptAlter = $true;
$scripter.Options.NoCommandTerminator = $true;
$scripter.Options.Filename = $destination;
$scripter.Options.ToFileOnly = $true
$scripter.Options.Encoding = [System.Text.Encoding]::UTF8
$scripter.EnumScript($urnsToScript)
Write-Host -ForegroundColor Green "Done"
}
catch {
Write-Host
Write-Host -ForegroundColor Red "Error occured"
Write-Host
Write-Host $_.Exception.ToString()
Write-Host
}
}
Unfortunately I did not find a way to do this using the Sql Management Objects.
Anyhow I now use the output of the Scripter and select the IDs of each table. I then use the IDs to change every line that looks like
DELETE FROM [dbo].[tableName]
to this:
DELETE FROM [dbo].[tableName] WHERE ID IN ('guid1', 'guid2')
Here is how I did it:
$content = Get-Content $destination
Clear-Content $destination
$content |% {
$line = $_
$table = $line.Replace("DELETE FROM [dbo].[","").Replace("]","")
$query = "SELECT ID, ClassID FROM" + $_
$idsAsQueryResult = Invoke-SqlCmd -Database $database -query $query
$ids = $idsAsQueryResult | Select-Object -Expand ID
if ($ids -ne $null) {
$joinedIDs = [string]::Join("','",$ids)
$newLine = $line + " WHERE ID IN ('" + $joinedIDs + "')"
Add-Content $destination $newLine
}
}
Where $destination is the script that has been generated with the Scripter class and $database is a string containing the database name.
I had to select a second column (ClassID which is there on all tables due to our OR mapper re-store) because of some weird error in Select-Object which I do not fully understand.
This of course only works because all tables have primary keys and all primary keys are named ID and are not combined primary keys or something.
You could of course achieve the same thing for other more complicated database schemas by extracting primary key information via SQL management objects.
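As a rough sketch of that last point, SMO can report each table's primary-key columns, so the hard-coded ID assumption could be replaced by a lookup (this assumes an SMO Database object like $databaseMO in DatabaseScripting.ps1; the table name is only an example):
$tableMO = $databaseMO.Tables.Item("tableName", "dbo")
$pkIndex = $tableMO.Indexes | Where-Object { $_.IndexKeyType -eq 'DriPrimaryKey' }
$pkColumns = $pkIndex.IndexedColumns | ForEach-Object { $_.Name }   # one or more key column names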
I am trying to import a CSV file while skipping any rows whose values are listed in an array declared above the import line. The array is made up of certain values pulled out of a database, and I want to import all rows of the CSV file whose txnID column values do not match the values in the array; however, I am having trouble trying to loop through my array.
I am new to using PowerShell and maybe I am not even implementing the array correctly, but I haven't been able to find anything about import-csv Filename | Where column -notmatch $array.
$Database = 'Database'
$Server = "Server"
$SqlQuery = 'SELECT DISTINCT WebOrderNumber FROM tbOrders
WHERE WebOrderNumber IS NOT NULL AND Len(WebOrderNumber)>8'
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Data Source=Datasource;Initial Catalog=Database;User ID=ID;Password=Pass;Integrated Security=False;"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $SqlQuery
$SqlCmd.Connection = $SqlConnection
$SqlConnection.Open()
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$Reader = $SqlCmd.ExecuteReader()
while ($Reader.Read()) {
#Write-Output($Reader.GetValue(0))
$Key = $Reader.GetValue(0)
$table += $Key
}
foreach ($Row in $table){
write-output($Row)
$CSVFile = (import-csv "C:\Users\Office-Admin\Documents\Complete Sales Orders.csv") |where {$_.txnID -ne $Row} | select txnID, FirstName, LastName, Cust_Name, mc_Shipping, Payment_Gross, address_street, Address_Zip, quantity, item_name, item_number, payer_email, address_city, address_state, address_country, address_name, Shipping_Method, mc_gross
}
$CSVFile | export-csv "C:\Users\Office-Admin\Documents\Sales Order Import List.csv" -notypeinformation
remove-item variable:table
#Send SMTP Message
$SqlConnection.Close()
I've updated my code slightly, but the problem still persists. I'm realizing that, with the code as it stands, every time I loop through and import, the previous condition in the where is forgotten, so the only value that is not imported in the end is the last $Row value. I need all of the values in $table to be excluded, and I don't know how I can do this.
Something like this should work. The main problem is that you are overwriting your CSV on every loop iteration.
$table = import-csv file1.csv | % {$_.ID} #gets an array of just the ID values
$CSVFile = Import-csv file2.csv | where{$table -notcontains $_.ID} | export-csv output.csv -notypeinformation
To show you how this works, I created two files as an example:
File 1: CSV with IDs:
ID,Stuff
123,alittlestuff
234,morestuff
345,evenmore
456,alotmore
567,somemore
678,notsomuch
789,tonesofstuff
File 2: csv with ID and stuff:
ID,stuff
123,hello
ghf,world
234,test
lkj,this
After running the code, the only rows that get output are:
ID,Stuff
ghf,world
lkj,this
So, to fit it into your code, use this:
$filter = $table | %{$_.txnID}
$CSVFile = (import-csv "C:\Users\Office-Admin\Documents\Complete Sales Orders.csv") | where{$filter -notcontains $_.txnID} | export-csv "C:\Users\Office-Admin\Documents\Sales Order Import List.csv" -notypeinformation
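Since $table in your question is already an array of plain WebOrderNumber strings built by the data reader, it can probably be used as the filter directly; a hedged variation on the same idea:
Import-Csv "C:\Users\Office-Admin\Documents\Complete Sales Orders.csv" |
    Where-Object { $table -notcontains $_.txnID } |
    Export-Csv "C:\Users\Office-Admin\Documents\Sales Order Import List.csv" -NoTypeInformation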