I'm trying to use WMI to get the start times of a computer's login sessions using:
$starttimes = Get-WmiObject Win32_LogonSession -ComputerName HM-ITS-KLP |
select starttime
This gives me the date formatted as:
20170120075444.819609+000 (yyyymmddhhmmss.??????+???)
Using the String.ToCharArray() method I managed to convert a plain string into an array so that I could restructure it, but on these objects the call is rejected with:
Method invocation failed because [Selected.System.Management.ManagementObject] doesn't contain a method named 'ToCharArray'.
The whole code is as follows:
$starttimes = Get-WmiObject Win32_LogonSession -ComputerName HM-ITS-KLP |
select StartTime
foreach ($line in $starttimes) {
$dateArray = $line.ToCharArray()
$time = $dateArray[8..9] + ":" + $dateArray[10..11] + ":" + $dateArray[12..13]
$date = $dateArray[6..7] + "/" + $dateArray[4..5] + "/" + $dateArray[0..3]
$LoginTimeAndDate1 = $time + " " + $date
$LoginTimeAndDate = $LoginTimeAndDate1 -join ""
}
You forgot to expand the "starttime" property. Try changing to this:
foreach ( $line in $starttimes){
$dateArray = $line.starttime.toCharArray()
or this:
foreach ( $line in $starttimes.starttime){
$dateArray = $line.toCharArray()
or this:
$starttimes = Get-WMIObject Win32_LogonSession | select -Expand starttime
foreach ( $line in $starttimes){
Try this:
Get-WmiObject Win32_LogonSession -ComputerName "HM-ITS-KLP" | select @{N='starttime';E={$_.ConvertToDateTime($_.starttime)}}
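If you need the conversion outside of a calculated property, the same DMTF timestamp format can also be handed to the .NET ManagementDateTimeConverter class (a small sketch, not part of the original answer):
$starttimes = Get-WmiObject Win32_LogonSession -ComputerName HM-ITS-KLP |
    Select-Object -ExpandProperty StartTime
foreach ($dmtf in $starttimes) {
    # ToDateTime understands the yyyymmddHHmmss.ffffff+UUU DMTF format directly
    [System.Management.ManagementDateTimeConverter]::ToDateTime($dmtf)
}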
I'm trying to write a PowerShell script that will add new parameters from updates into an INI file.
They are categorized with a [category] header.
It checks which parameters are missing, and then I want them to be added under the right category.
$inipath = "somepath"
$list = @([PSCustomObject]@{ type = "\[cat\]"; values = "par1=N","par2=N","par3=N" })
foreach ($entry in $list) {
    if (Select-String -Path $inipath -Pattern $entry.values -SimpleMatch -Quiet) {
    }
    else {
        $parameter += @([PSCustomObject]@{ type = $entry.type; values = $entry.values })
    }
}
$tlength = $parameter.type.length
for ($x = 0; $x -le $tlength; $x++) {
    $vlength = $parameter[$x].values.Length
    for ($y = 0; $y -le $vlength; $y++) {
        $par = $parameter[$x].values[$y]
        $par = $par -join '`n'
        $fileContent = Get-Content $inipath
        $linenumber = Get-Content $inipath | Select-String $parameter[$x].type
        $fileContent[$lineNumber.LineNumber + 1] += $par
        $fileContent | Set-Content $inipath
    }
}
Right now it recognizes the missing parameters and prints them, but it's printing them like this:
[cat]par1=Npar2=N=par3=N
The desired output would be:
[cat]
par1=N
par2=N
par3=N
In PowerShell, strings wrapped in single quotes are taken literally, while strings wrapped in double quotes are interpreted: variables and escape sequences such as `n are expanded.
You need to change:
$par = $par -join '`n'
to:
$par = $par -join "`n"
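A quick way to see the difference in a console (just an illustration, not from the original answer):
'par1=N','par2=N' -join '`n'   # -> par1=N`npar2=N   (backtick-n stays literal)
'par1=N','par2=N' -join "`n"   # -> par1=N
                               #    par2=N           (a real newline between them)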
In the end I fixed it by replacing that logic with a function that replaces the entire [category] line with [category] + "`n" + $param:
$tlength = $parameter.type.length
for ($x = 0; $x -le $tlength; $x++) {
    $vlength = $parameter[$x].values.Length
    for ($y = 0; $y -le $vlength; $y++) {
        if (!$vlength) {
            echo "no parameters"
        }
        elseif (!$parameter[$x].values[$y]) {
            echo "no parameters"
        }
        else {
            $par = $parameter.type[$x]
            $a = $parameter.type[$x] + "`n" + $parameter[$x].values[$y]
            $a = $a -replace '\\',''
            echo $a
            (Get-Content $inipath) -replace $par, $a | Set-Content $inipath
        }
    }
}
Now it prints:
[cat]
param1=n
param2=n
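For comparison, a sketch of the same idea without regex replacement, emitting the header and then each missing value on its own line (this reuses $inipath and $parameter from above and assumes a single literal [cat] header, as in the example):
$fileContent = Get-Content $inipath
$fileContent | ForEach-Object {
    $_                                   # keep every original line
    if ($_ -eq '[cat]') {                # assumed literal section header
        foreach ($entry in $parameter) {
            $entry.values                # each missing parameter becomes its own line
        }
    }
} | Set-Content $inipath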
I managed to create a PowerShell script to download some files that have pivoted data (each day of the series is a column holding that day's values) and load them into a SQL Server database.
The problem is that it takes a long time to execute: about 15 seconds to load each file, and each file has an average of 250 lines. I think the problem is the way I get the value of each field by its "index", because I couldn't find a clever way of doing that.
Is there a better way of inserting this kind of CSV data, which can change each day, into the database using PowerShell?
I don't know much about PowerShell scripting, but I managed to put together the script below.
The files I downloaded from here:
https://github.com/CSSEGISandData/COVID-19/tree/master/csse_covid_19_data/csse_covid_19_time_series
Script to process the files:
Clear-Host
$date = (Get-Date).Date.AddDays(-1).ToString('MM-dd-yyyy')
$path = 'C:\Dataset\'
$items = (Get-ChildItem -Path ($path + "time_series*.csv") | Select FullName) #| Select -First 1
$sql_instance_name = '.'
$db_name = 'COVID-19'
foreach ($item in $items)
{
$dt_start = Get-Date
$schema = "stg"
$table = (Split-Path -Path $item.FullName -Leaf).Split('.')[0]
Write-Host "File:"(Split-Path -Path $item.FullName -Leaf) -ForegroundColor Yellow
Write-Host "Schema: $schema" -ForegroundColor Yellow
Write-Host "Table: [$table]" -ForegroundColor Yellow
$header = (Get-Content $item.FullName | Select -First 1).replace(",", "|,|")
$i = 0; $new_header = @();
foreach ($column in $header.Replace('|', '').split(','))
{
$new_header += "Column_$i"
$i++
}
$drop_table = "if (object_id('stg.[$table]')) is not null drop table $schema.[$table];"
Invoke-Sqlcmd -Database $db_name -Query $drop_table -ServerInstance $sql_instance_name
$create_table = ("if (object_id('stg.[$table]')) is null
create table $schema.[$table] (" +
" id int identity constraint [pk_$table] primary key," +
" [" + $header + "] varchar(500),`n`tload_date datetime`n);").Replace('|,|', "] varchar(500), [")
Invoke-Sqlcmd -Database $db_name -Query $create_table -ServerInstance $sql_instance_name
$csv = Import-Csv -Path $item.FullName -Header $new_header | Select -Skip 1
$insert = $null
foreach ($row in $csv)
{
$query = "insert into stg.[" + (Split-Path -Path $item.FullName -Leaf).Split('.')[0] + "] values ("
foreach ($column in $new_header)
{
<# Perhaps this part is what slows the process down: the way the value
   for each column is retrieved (I couldn't find a way to simply
   reference a column by index, like $csv.column[$i], up to the last one).
#>
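<# Note (not in the original script): $row.PSObject.Properties.Value returns the
   row's values in column order, so they can be read positionally instead of going
   through Select and Get-Member for every column. #>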
$value = ($row | Select $column)
$query += "nullif('" + ($value | % { $_.$(( $value | gm | ? { $_.membertype -eq "noteproperty"} )[0].name) }).Replace("'", "''") + "',''),"
}
$query += " current_timestamp);"
$insert = $query
#Write-Host $row.Column_1
Invoke-Sqlcmd -Database $db_name -Query $insert -ServerInstance $sql_instance_name
}
Write-Host "Lines:"$csv.count -ForegroundColor Yellow
#Start-Sleep -Seconds 2
$dt_end = Get-Date
Write-Host "Elapsed time:"(New-TimeSpan -Start $dt_start -End $dt_end).TotalSeconds -ForegroundColor Red
Write-Host ("-" * 50)
}
Execution results:
File: time_series_covid19_confirmed_global.csv
Schema: stg
Table: [time_series_covid19_confirmed_global]
Lines: 264
Elapsed time: 14,3725288
--------------------------------------------------
File: time_series_covid19_deaths_global.csv
Schema: stg
Table: [time_series_covid19_deaths_global]
Lines: 264
Elapsed time: 14,1963788
--------------------------------------------------
File: time_series_covid19_recovered_global.csv
Schema: stg
Table: [time_series_covid19_recovered_global]
Lines: 250
Elapsed time: 13,5150064
--------------------------------------------------
If you are on SQL Server 2016+, you can download the JSON data and parse it directly in T-SQL.
If you are on a version earlier than 2016, I believe there is an XML option as well.
The download takes a couple of seconds, and parsing the JSON takes less than 4 seconds (depending on your hardware).
Example
exec master..xp_cmdshell 'powershell.exe Invoke-WebRequest "https://opendata.ecdc.europa.eu/covid19/casedistribution/json/" -OutFile "c:\working\covid.json"',no_output
Declare @json varchar(max);
Select @json = BulkColumn FROM OPENROWSET(BULK 'c:\working\covid.json', SINGLE_BLOB) x;
;with cte as (
Select [DataDate] = try_convert(date,DateRep,105)
,[CountryCd]= try_convert(varchar(50),countryterritoryCode)
,[Country] = try_convert(varchar(150),countriesAndTerritories)
,[Cases] = try_convert(int ,cases)
,[Deaths] = try_convert(int ,deaths)
,[Pop] = try_convert(int ,[popData2018])
,rtc = sum(try_convert(int ,cases)) over (partition by countryterritoryCode order by try_convert(date,DateRep,105))
,rtd = sum(try_convert(int ,deaths)) over (partition by countryterritoryCode order by try_convert(date,DateRep,105))
From (
Select Idx= B.[Key]
,C.*
From OpenJSON(@json) A
Cross Apply OpenJson(A.value) B
Cross Apply OpenJson(B.value) C
) src
Pivot (max(value) for [Key] in ( [dateRep],[cases],[deaths],[countriesAndTerritories],[geoId],[countryterritoryCode],[popData2018] ) ) pvt
)
Select DataDate
,CountryCd
,Country
,Cases = format(Cases,'#,###')
,Deaths= format(Deaths,'#,###')
,Pop = format(Pop/1000000.0,'#,##0.0')+'MM'
,RTC = format(RTC,'#,###')
,RTD = format(RTD,'#,###')
,Mort = format((rtd*100.0) / nullif(rtc,0),'##0.00')+'%'
,PctPop= format((cases*100.0) / nullif(Pop,0),'##0.0000')+'%'
From cte
Where DataDate='2020-04-11'
Order By try_convert(money,RTC) desc
Returns
I have a file with multiple expressions like "$REGX('CareMedic.2_0','CustomerInformation','Customer Information')". The file can be an XML file, a text file, or any other type. If the file contains 9 of those expressions, I'm trying to pull all nine and send the values to a database.
I've tried the code below:
$input_path = 'C:\Users\Administrator\Desktop\test2.xml'
$SQLServer = "WIN-17V7QT0IJVK"
$SQLDBName = "Test"
$uid ="WIN-17V7QT0IJVK\Administrator"
$pwd = "letmebackinplease"
$SqlQuery = "SELECT * from product_schema;"
$ConnectionString = "Server = $SQLServer; Database = $SQLDBName; Integrated Security = True;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $ConnectionString
$SqlConnection.open()
if($SqlConnection.state -eq "Open"){
Write-Host "Test connection successful"
}
$regex = '()\(.*?\)'
$output = select-string -Path $input_path -Pattern $regex -AllMatches | % { $_.Matches } | % { $_.Value } |
ForEach-Object {
($_ -split "\(|\)")[1]
}
foreach ($line in $output){
$line = $line -replace "\(",""
$line = $line -replace "\)",""
$line = $line -replace "\'",""
$col1,$col2,$col3 = $line -split ","
[PSCustomObject]@{
col1 = $col1
col2 = $col2
col3 = $col3
} | select col1,col2,col3
$insert_query = "INSERT INTO [$SQLDBName].[dbo].[product_schema]
([version]
,[field]
,[value])
VALUES
($col1, $col2, $col3);"
$execute_query = New-Object System.Data.SqlClient.SqlCommand
$execute_query.connection = $SQLConnection
$execute_query.commandtext = $insert_query
$execute_query.ExecuteNonQuery()
}
$SqlConnection.close()
If the file has two of the below:
('Medic.2_0','AgeInformation','Age Information')
('Medic.2_0','TransactionID','Transaction ID')
The desired output is:
'Medic.2_0' stored in Version Column
'AgeInformation' stored in the Field Column
'Age Information' stored in the value column
'Medic.2_0' stored in Version Column
'TransactionID' stored in the Field Column
'Transaction ID' stored in the value column
I have to take each of the values and store it in a column of a temp table set up on the SQL Server, like below:
Version      Field            Value
Medic.2_0    AgeInformation   Age Information
Medic.2_0    TransactionID    Transaction ID
Error Encountered:
Exception calling "ExecuteNonQuery" with "0" argument(s): "Incorrect syntax near '.2'."
At C:\Users\Administrator\Desktop\test.ps1:47 char:10
+ $execute_query.ExecuteNonQuery()
+ ~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
Can someone please recommend how shall I change my code to solve this?
In answer to your original question before editing: Assuming your output looks like this and is saved in a variable named $output
('Medic.2_0','AgeInformation','Age Information')
('Medic.2_0','TransactionID','Transaction ID')
Try this:
foreach ($line in $output){
$line = $line -replace "\(",""
$line = $line -replace "\)",""
$line = $line -replace "\'",""
$col1,$col2,$col3 = $line -split ","
[PSCustomObject]@{
col1 = $col1
col2 = $col2
col3 = $col3
} | select col1,col2,col3 | export-csv d:\test.csv -append -NoTypeInformation
}
We loop through $output line by line, removing the brackets and the single quotes, splitting the remaining text on the comma, and then assigning each of the three entries to its own variable. Once they are in variables, we can easily create a PSCustomObject and use it to select what we need for Export-Csv.
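To sanity-check the export, re-importing the file (same path as above) with the two sample lines should give:
Import-Csv d:\test.csv

col1      col2           col3
----      ----           ----
Medic.2_0 AgeInformation Age Information
Medic.2_0 TransactionID  Transaction ID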
Try to add this code:
$info = @()   # store your values here
foreach ($item in $output) {
    $z = $item.split(',')            # split into 3 strings
    $info += [PSCustomObject]@{      # create a custom object with named columns to hold the values
        Version = $z[0]
        Field   = $z[1]
        Value   = $z[2]
    }
}
Write-Output $info   # the variable that holds all rows
Then you run a foreach loop over each object in $info. You can do it like this:
foreach($data in $info){
$data.Version #to access Version field
$data.Field #to access Field field
$data.Value #to access Value field
.......your SQL query......
}
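As for the "Incorrect syntax near '.2'" error itself: the values are pasted into the INSERT statement without quotes, so Medic.2_0 is read as a malformed number. A minimal sketch of one way around that, using a parameterized SqlCommand (reusing $SqlConnection, $info and the table from the question; not the only possible fix):
foreach ($data in $info) {
    $cmd = $SqlConnection.CreateCommand()
    $cmd.CommandText = "INSERT INTO [$SQLDBName].[dbo].[product_schema] ([version],[field],[value])
                        VALUES (@version, @field, @value);"
    # parameters take care of quoting and escaping, so no string concatenation is needed
    [void]$cmd.Parameters.AddWithValue('@version', $data.Version)
    [void]$cmd.Parameters.AddWithValue('@field',   $data.Field)
    [void]$cmd.Parameters.AddWithValue('@value',   $data.Value)
    [void]$cmd.ExecuteNonQuery()
}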
I am trying to build a report file collecting data from various sources.
I have built a reporting structure like this:
$Data = import-csv "some CSV FILE"
<#
csv file must look like this
hostname,IP
server1,192.168.1.20
#>
Then I build an array of objects, prepopulated with "initial values", and attach it to my $Data variable:
$Ids = ('7.1.1.1','7.1.1.2')
$CheckObj = @()
foreach ($id in $IDs) {
$row = "" | Select-Object CheckID,CheckData,CheckDataRaw
$row.CheckID = $id
$row.CheckData = "NotChecked"
$CheckObj+= $row
}
$Data = $Data | Select *,CheckData
$data | % {$_.CheckData = $CheckObj}
The resulting object is:
hostname : server1
ip : 192.168.1.20
CheckData : {#{CheckID=7.1.1.1; CheckData=NotChecked; CheckDataRaw=},
#{CheckID=7.1.1.2; CheckData=NotChecked; CheckDataRaw=}}
All is well until I want to do this:
$FinalReport = $data | Select-Object -Property * -ExpandProperty Checkdata
I get all these errors, which let's say I can ignore...
Select-Object : The property cannot be processed because the property
"CheckData" already exists.
At line:1 char:24
+ ... lReport = $data | Select-Object -Property * -ExpandProperty Checkdata
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (#{hostname=serv...ystem.Objec
t[]}:PSObject) [Select-Object], PSArgumentException
+ FullyQualifiedErrorId : AlreadyExistingUserSpecifiedPropertyExpand,Micro
soft.PowerShell.Commands.SelectObjectCommand
BUT, an entire set of other variables gets altered as well, like:
$data | fl
hostname : server1
ip : 192.168.1.1
CheckData : {#{CheckID=7.1.1.1; CheckData=NotChecked; CheckDataRaw=;
hostname=server1; ip=192.168.1.1}, #{CheckID=7.1.1.2;
CheckData=NotChecked; CheckDataRaw=; hostname=server1;
ip=192.168.1.1}}
as well as the $CheckObj variable:
$CheckObj
CheckID : 7.1.1.1
CheckData : NotChecked
CheckDataRaw :
hostname : server1
ip : 192.168.1.1
CheckID : 7.1.1.2
CheckData : NotChecked
CheckDataRaw :
hostname : server1
ip : 192.168.1.1
This is totally unintended on my side...
Can someone clarify what I am doing wrong?
I am using PowerShell 5.0 on Windows 7.
All testing was done in the PowerShell ISE, and I didn't change any of the PowerShell defaults.
My expected result would be for the $FinalReport variable to contain the expanded content, not all the variables I used in the process...
After a bit more digging I understood, to some extent, why this is occurring.
I am using simple $b = $a assignments, which copy the reference rather than the data (a shallow copy of sorts), so any change made through $b also impacts the object in $a and vice versa.
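A minimal illustration of what that means (hypothetical object, not from my script):
$a = [PSCustomObject]@{ Name = 'original' }
$b = $a            # copies the reference, not the object
$b.Name = 'changed'
$a.Name            # also prints 'changed'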
For my purpose I need distinct copies of the data; it seems the solution is to do a deep copy, similar to the solution in this post:
PowerShell copy an array completely
So the working code, which gives me the desired result, would be:
Function Copy-Object ($Source,[switch]$DeepCopy) {
# Serialize and Deserialize data using BinaryFormatter
if ($DeepCopy) {
$ms = New-Object System.IO.MemoryStream
$bf = New-Object System.Runtime.Serialization.Formatters.Binary.BinaryFormatter
$bf.Serialize($ms, $Source)
$ms.Position = 0
#Deep copied data
$Target = $bf.Deserialize($ms)
$ms.Close()
Write-Output $Target
}
Else {
Write-Output $Source
}
}
$Data = "" | select hostname,IP
$data.Hostname = "server1"
$data.IP = "192.168.1.10"
$Ids = ('7.1.1.1','7.1.1.2')
$CheckObj = @()
foreach ($id in $IDs) {
$row = "" | Select-Object CheckID,CheckData,CheckDataRaw
$row.CheckID = $id
$row.CheckData = "NotChecked"
$CheckObj += $row
}
$Data = Copy-Object -source $Data -DeepCopy | Select *,CheckData2
$Data | % {$_.CheckData2 = Copy-Object -source $CheckObj -DeepCopy}
$FinalReport = Copy-Object -source $Data -DeepCopy | Select-Object -Property hostname,IP -ExpandProperty Checkdata2
$FinalReport | ft
output being:
CheckID CheckData CheckDataRaw hostname IP
------- --------- ------------ -------- --
7.1.1.1 NotChecked server1 192.168.1.10
7.1.1.2 NotChecked server1 192.168.1.10
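For what it's worth, a lighter-weight variant of the deep copy uses PowerShell's own CliXml serializer instead of BinaryFormatter (which is considered obsolete in newer .NET versions); a sketch under the same assumptions, with an arbitrarily chosen depth:
Function Copy-ObjectCliXml ($Source) {
    # round-trip through the CliXml serializer to get an independent copy
    $clixml = [System.Management.Automation.PSSerializer]::Serialize($Source, 5)
    [System.Management.Automation.PSSerializer]::Deserialize($clixml)
}
$DataCopy = Copy-ObjectCliXml $Data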
I have stored the results of a SQL Server query in a variable named $TESTER using the Invoke-Sqlcmd cmdlet.
I want to take the fields from each row (i.e. name, address, etc.) and concatenate them into a separate string variable named $STR,
something similar to the below:
$TESTER | Select-Object | ForEach-Object {$STR = $_.name + $_.address + $_.postcode}
Any ideas?
You can try concatenating your values, perhaps with a separator, and adding them to a list like so:
$data = @()
$TESTER | Select-Object | ForEach-Object { $data += $_.name, $_.address, $_.postcode -join "|" }
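Each element of $data is then one pipe-separated string per row, for example (field values hypothetical):
$data[0]          # -> "John Smith|1 High St|AB1 2CD"
$data -join "; "  # collapse everything into a single string, if one $STR is wanted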
You can use the ItemArray property of System.Data.DataRow and join the values with a separator:
$TESTER | foreach { $STR += $_.ItemArray -join "," }
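A small end-to-end sketch of that approach (server, database and query are placeholders, not from the question):
$TESTER = Invoke-Sqlcmd -ServerInstance '.' -Database 'Test' -Query 'SELECT name, address, postcode FROM customers'
# one "name,address,postcode" string per row, rather than one ever-growing $STR
$STR = $TESTER | ForEach-Object { $_.ItemArray -join ',' }
$STR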