Increment through array across daily scheduled jobs

Is it possible to have the arguments in a daily scheduled job increment each day? I have an array of values and I want to run one job per value, spread out so that the jobs occur once a day.
This is my code so far:
$dailyTrigger = New-JobTrigger -Daily -At "11:00 AM"
$option = New-ScheduledJobOption -StartIfOnBattery -StartIfIdle -WakeToRun -IdleTimeout "10:00:00"
Register-ScheduledJob -Name $JobName -FilePath $ScriptToRun -Trigger $dailyTrigger -ScheduledJobOption $option
I was planning on using Register-ScheduledJob's -ArgumentList parameter, but from what I've seen there would be no way to pass a different value to each daily instance of the job. Is there some way I could store which element of the array is next, so that each job run could read it?

One possible way to persist a counter between job runs is to store the value in the registry:
Set-ItemProperty -Path 'HKCU:\foo' -Name 'Job1' -Value ($counter + 1)
and read it on the next run:
$counter = (Get-ItemProperty -Path 'HKCU:\foo').Job1
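Putting the two together, a minimal sketch of what the scheduled script could do on each run (the key path 'HKCU:\Software\MyJobs' and the value list are placeholders for illustration):
# Read the persisted index, pick the day's value, then advance the index.
$keyPath = 'HKCU:\Software\MyJobs'   # hypothetical key; created on first run
if (-not (Test-Path $keyPath)) {
    New-Item -Path $keyPath -Force | Out-Null
    Set-ItemProperty -Path $keyPath -Name 'Job1' -Value 0
}
$values = @('alpha', 'beta', 'gamma')   # your array of per-day values
$counter = (Get-ItemProperty -Path $keyPath).Job1
$todaysValue = $values[$counter % $values.Count]
# ... do the day's work with $todaysValue ...
Set-ItemProperty -Path $keyPath -Name 'Job1' -Value ($counter + 1)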

Related

PowerShell How to loop a script based on script array count

I'm new to PowerShell. I'm trying to pull a user's name and place it in a file. I have two corresponding arrays, $title and $csvfile; $title[0] corresponds with $csvfile[0], $title[1] with $csvfile[1], and so on. Is it possible to loop this script while increasing the index for both at the same time, so that each index runs exactly once and the two stay in sync?
$title = @('jim' 'john' 'james')
$csvfile = @('jim.csv' 'john.csv' 'james.csv')
Get-ADUser -filter {(Title -like "$title") -and (Company -like "Location1")} | Export-Csv c:\temp\$csvfile
Problem 2
The $title array doesn't seem to be iterating one at a time. If I replace $title[$i] with any of the listed array items it works. Funny thing is that it DOES create all my .csv files from the $csvfile array, they are just empty. I've done some looking on the web; not sure if my array items are too long or the quotations are not parsing right. Any help would be much appreciated.
$title = @(
'Director of Nursing'
'Assistant Director of Nursing'
'Activities Director'
)
$csvfile = @(
'DON'
'ADON'
'ACTIVITIES'
)
for($i=0; $i -lt $title.Count; $i++)
{
#Get-ADUser -Filter { (Title -like "Director of Nursing") -and (Company -like "location1") } | Export-Csv "c:\temp\$($csvfile[$i]).csv"
Get-ADUser -filter { (Title -like "$($title[$i])") -and (Company -like "Location1") }| Export-Csv "c:\tempPath\$($csvfile[$i]).csv"
}
If I'm understanding this correctly, you'd like to append the user information from Get-ADUser to the csv named for the user you got it from? Such as, jim's info goes to jim.csv, and so on for each one?
Seems like you're looking for the Foreach loop.
$title = @('jim','john','james')
#$csvfile = @('jim.csv' 'john.csv' 'james.csv')
Foreach($user in $title){
Get-ADUser -filter {(Title -like "$user") -and (Company -like "Location1")} | Export-Csv "c:\temp\$user.csv"}
A Foreach loop goes through a list of objects and performs the same action for every object, ending when it's finished with the last one. The list of objects is typically an array. When you run a loop over a list of objects, we say you're iterating over the list.
The Foreach loop can be used in 3 different ways: Foreach Statement, Foreach-Object cmdlet, or as a foreach() method.
What we're using here is the Foreach statement, which is followed by parentheses that contain three elements, in order: a variable, the keyword in, and the object or array to iterate over. As it moves through the list (the $title array in this case), PowerShell copies the object it's currently looking at into the variable that represents each item in the list, $user.
Note: because the variable is just a copy, you cannot directly change the item in the original list.
Please note as well that the items in an array must be separated by commas (unless each item is on its own line). In the code above, we're also using the name you're iterating over to build the csv file name.
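For reference, here's a quick sketch of all three forms doing the same thing (with placeholder names):
$list = @('jim', 'john', 'james')
foreach ($name in $list) { Write-Host $name }   # Foreach statement
$list | ForEach-Object { Write-Host $_ }        # ForEach-Object cmdlet
$list.ForEach({ Write-Host $_ })                # foreach() method (PowerShell 4.0+)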
EDIT: Using for loop. . .
$title = @('jim','john','james')
$csvfile = @('CEO','CFO','CIO')
For($i=0; $i -lt $title.Count; $i++){
Get-ADUser -filter {(Title -like "$($title[$i])") -and (Company -like "Location1")} | Export-Csv "c:\temp\$($csvfile[$i]).csv"}
Matches the output like so:
jim - CEO.csv
john - CFO.csv
james - CIO.csv

insert large amount of AD data into SQL server using PowerShell

I have a PowerShell script that pulls in 1.4+ million rows of data and saves it to a HUGE CSV file that then gets imported into an SQL server. I thought there might be a way to have PowerShell insert the data into the SQL server directly but I am not sure how.
One of my concerns is that I don't want to buffer up the AD results in memory and then write them. I'd rather write them in batches of 1000 or so, so memory consumption stays down. Get 1000 records, save to the SQL server, and repeat...
I see articles about how to get PowerShell to write to an SQL server but they all seem to either do ALL data at one time or one record at a time -- both of which seem inefficient to me.
This is the PowerShell script I have to query AD.
# the attributes we want to load
$ATTRIBUTES_TO_GET = "name,distinguishedName"
# split into an array
$attributes = $ATTRIBUTES_TO_GET.split(",")
# create a select string to be used when we want to dump the information
$selectAttributes = $attributes | ForEach-Object {@{n="AD $_";e=$ExecutionContext.InvokeCommand.NewScriptBlock("`$_.$($_.toLower())")}}
# get a directory searcher to search the GC
[System.DirectoryServices.DirectoryEntry] $objRoot = New-Object System.DirectoryServices.DirectoryEntry("GC://dc=company,dc=com")
[System.DirectoryServices.DirectorySearcher] $objSearcher = New-Object System.DirectoryServices.DirectorySearcher($objRoot)
# set properties
$objSearcher.SearchScope = "Subtree"
$objSearcher.ReferralChasing = "All"
# need to set page size otherwise AD won't return everything
$objSearcher.PageSize = 1000
# load the data we want
$objSearcher.PropertiesToLoad.AddRange($attributes)
# set the filter
$objSearcher.Filter = "(&(objectClass=group)(|(name=a*)(name=b*)))"
# get the data and export to csv
$objSearcher.FindAll() | select -expandproperty properties | select $selectAttributes | export-csv -notypeinformation -force "out.csv"
I use Out-DataTable to convert my object array into a DataTable object type, then use Write-DataTable to bulk insert that into a database (Write-DataTable uses SqlBulkCopy to do this).
Caveats/gotchas for this (SqlBulkCopy can be a nuisance to troubleshoot):
Make sure your properties are the correct type (string for varchar/nvarchar, int for any integer values, dateTime can be string as long as the format is correct and SQL can parse it)
Make sure your properties are in order and line up with the table you're inserting into, including any fields that auto-fill (incrementing ID key, RunDt, etc).
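As a hedged illustration of the type caveat, you could coerce each property with calculated properties before converting (the property and column names below are hypothetical):
# Cast each property to the type the destination SQL column expects
$typed = $results | Select-Object @{n='Name'; e={[string]$_.name}},
    @{n='EmployeeId'; e={[int]$_.employeeId}},
    @{n='HireDate'; e={[string]$_.hireDate}}   # a date string works if SQL can parse it
Write-DataTable -ServerInstance sql-server-1 -Database org -TableName employees -Data ($typed | Out-DataTable)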
Out-DataTable: https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd
Write-DataTable: https://gallery.technet.microsoft.com/scriptcenter/2fdeaf8d-b164-411c-9483-99413d6053ae
Usage
If I were to continue on your example and skip the CSV, this is how I would do it... replace the last two lines with the code below (assuming that your object properties line up with the table perfectly, your SQL server name is sql-server-1, database name is org, and table name is employees):
try {
    Write-DataTable -ServerInstance sql-server-1 -Database org -TableName employees -Data $($objSearcher.FindAll() | Select-Object -ExpandProperty properties | Select-Object $selectAttributes | Out-DataTable -ErrorAction Stop) -ErrorAction Stop
}
catch {
    $_
}
Looking at your code, it looks like you come from .NET or some language based on .NET. Have you heard of the cmdlets Get-ADUser / Get-ADGroup? They would simplify things tremendously for you.
As far as the SQL connection goes, PowerShell doesn't have any native support for it. Microsoft has made cmdlets for it, though! You just have to have SQL Server installed in order to get them, which is kind of a bummer since SQL Server is heavy and not everyone wants to install it. It is still possible using .NET; it's just not very quick or pretty. I won't be giving advice on the cmdlets here, you can Google that. As far as .NET goes, I would start by reading some of the documentation on the System.Data.SqlClient namespace as well as some historical questions on the subject.
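If you do go the .NET route, here's a minimal sketch using System.Data.SqlClient.SqlBulkCopy against a DataTable (the connection string and table name are placeholders, and this assumes the DataTable columns line up with the destination table):
Add-Type -AssemblyName System.Data   # usually already loaded in Windows PowerShell
$connString = 'Server=sql-server-1;Database=org;Integrated Security=True'   # placeholder
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy($connString)
$bulk.DestinationTableName = 'dbo.employees'   # placeholder
$bulk.BatchSize = 1000   # rows per round trip, keeping memory and transaction size down
try {
    $bulk.WriteToServer($dataTable)   # $dataTable would come from e.g. Out-DataTable
}
finally {
    $bulk.Close()
}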
Finally, as you said it would be a good idea to try and avoid overloading your RAM. The big thing here is trying to keep your entire script down to one single AD query. This way you avoid the troubling scenario of data changing between one query and the next. I think the best way of doing this would be to save your results straight to a file. Once you have that you could use SqlBulkCopy to insert into the table straight from your file. The downside to this is that it doesn't allow for multiple AD Properties. At least I don't think SqlBulkCopy will allow for this?
Get-ADUser "SomeParamsHere" | Out-File ADOutput.txt
If you have to have multiple AD properties and still want to keep the RAM usage to a minimum...well I toyed around with a script that would work but makes a few calls that would read from the entire file, which defeats the whole purpose. Your best option might be to save each property to a separate file then do your whole write DB thing. Example:
New-Item Name.txt
New-Item DistinguishedName.txt
Get-ADUser "SomeParamsHere" -Properties Name,DistinguishedName | Foreach {
    Add-Content -Path "Name.txt" -Value "$($_.Name)"
    Add-Content -Path "DistinguishedName.txt" -Value "$($_.DistinguishedName)"
}
Store the results from your last line of code in a variable instead of exporting them to csv.
Then create groups of the size you want.
Using Out-DataTable and Write-DataTable, write to SQL - links are in nferrell's answer.
$res = $objSearcher.FindAll() | select -expandproperty properties | select $selectAttributes
$counter = [pscustomobject] @{ Value = 0 }
#create groups with 1000 entries each
$groups = $res | Group-Object -Property { [math]::Floor($counter.Value++ / 1000) }
foreach ($group in $groups){
#convert to data table
$dt = $group.group | Out-DataTable
$dt | Write-DataTable -Database DB -ServerInstance SERVER -TableName TABLE
}
You're making this unnecessarily complicated.
If I read your code correctly, you want all groups starting with 'a' or 'b'.
# the attributes we want to export
$attributes = 'name', 'distinguishedName'
Import-Module ActiveDirectory
Get-ADGroup -Filter {(name -like "a*") -or (name -like "b*")} -SearchBase 'dc=company,dc=com' |
select $attributes | Export-Csv -NoTypeInformation -Force "out.csv"
Instead of using Export-Csv at the end, just pipe the output to the command which creates the SQL rows. By piping objects (instead of assigning them to a variable) you give PowerShell the ability to handle them efficiently (it will start processing objects as they come in, not buffer everything).
Unfortunately I can't help you with the SQL part.
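To illustrate that streaming point, here's a minimal sketch of a pipeline function that collects incoming objects into batches of 1000 and flushes each batch as it fills, so the full result set is never buffered (Flush-Batch is a hypothetical placeholder for whatever writes a batch out, e.g. SQL inserts):
function Invoke-Batched {
    param([int]$BatchSize = 1000)
    begin { $buffer = New-Object System.Collections.Generic.List[object] }
    process {
        $buffer.Add($_)                   # accumulate objects as they stream in
        if ($buffer.Count -ge $BatchSize) {
            Flush-Batch $buffer           # hypothetical sink for one full batch
            $buffer.Clear()
        }
    }
    end { if ($buffer.Count -gt 0) { Flush-Batch $buffer } }   # flush the remainder
}
Piping the Get-ADGroup output through something like this keeps memory usage flat no matter how many rows AD returns.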

Comparing Arrays in Powershell

There is probably a simple way to do this, but I've been hitting my head against a wall for hours at this point. I'm trying to grab several user attributes out of AD, compare two of those attributes, and then modify them based on the differences. However, since some users have null values in their office or department fields, which causes Compare-Object to fail, I have those going into other arrays with a -replace to get rid of the nulls, so my variables look like this:
$UserData = Get-ADuser -filter * -properties physicaldeliveryofficename,department | select samaccountname,physicaldeliveryofficename,department
$Offices = $UserData.physicaldeliveryofficename -replace "^$","N/A"
$Departments = $UserData.department -replace "^$","N/A"
So far so good, but when I loop through to compare values, I start to run into trouble. Looping through the users like this seems to be comparing every element to every other element:
Foreach ($user in $UserData.samaccountname) {
Compare-Object $Offices $Departments -IncludeEqual}
Not using a loop and running Compare-Object by itself gives accurate results, but then I'd need a loop to check for matches anyway.
Assuming I just want to determine which users have matching office and department fields (and based off that do a pretty simple Set-ADUser command), how would I go about comparing the values without checking every element against every other element?
Your ForEach loop won't work properly because even though you are going through each user account, you are always comparing the same collections of offices and departments. I wrote the following, which might give you better results and saves the compare results as part of an object so you can see the user account as well.
Get-ADUser -Filter * -Properties physicaldeliveryofficename,department | ForEach {
    $Offices = $_.physicaldeliveryofficename -replace "^$","N/A"
    $Departments = $_.department -replace "^$","N/A"
    $Results = Compare-Object $Offices $Departments -IncludeEqual
    [pscustomobject]@{
        User = $_.samaccountname
        compare = $Results
    }
}

How to process SSAS cube structures concurrently from PowerShell script using Mircrosoft.AnalysisServices namespace

TL;DR: Is there a way for a PowerShell script calling Microsoft.AnalysisServices functions to process multiple cube structures concurrently?
I have a Microsoft SSAS cube that needs several measure groups processed before the rest of the cube is processed later in the job plan. I have created a PowerShell script that enumerates the measure groups to process and calls measureGroup.Process('ProcessFull') from the Microsoft.AnalysisServices namespace. This works to process the measure group, dimension, et al.
However, processing the measure groups in this manner doesn't allow SQL Server 2014 to parallelize the processing. A cube that takes on average 2 hours to fully process was running for 7 hours before we killed it.
Is there a way in PowerShell to batch the processes so that they are sent at the same time to the cube? If this could be done, it would allow the server to do concurrent processing instead of one object at a time.
I have taken a look through the documentation on the MSDN as well as consulted Google, but was unable to find an answer. Thanks!
You should check this method in PowerShell: http://ssas-info.com/analysis-services-scripts/1238-powershell-script-to-process-all-dimensions-and-cubes-in-one-db-limiting-workload
Have a look at powershell jobs:
Background Jobs
Here's a quick example that you could adapt to run your measuregroup processing:
$cmd = {
    param($a)
    Write-Host $a
}
$list = @("a","b","c")
$list | ForEach-Object {
    Start-Job -ScriptBlock $cmd -ArgumentList $_
}
It's quite simple: you define your script block in the $cmd variable; this is where you would put your logic for processing the measure group. The $list variable could contain a list of the measure groups to process.
You then start a job for each item in the list, which executes the code in the script block, passing through the item in the list as a parameter. In this example it simply prints out the parameter that you passed in. You can of course pass in as many parameters as you like.
To get the results, you can use the Get-Job cmdlet to check the status of the jobs and Receive-Job to get the output. Remove-Job can then be used to clear finished jobs from the queue.
The following command, run after the code above, will get the results of all the jobs (in this case just the a, b, c that we passed in) and then remove them from the queue:
Get-Job | ForEach-Object { Receive-Job $_.Id; Remove-Job $_.Id }
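Adapted to the question, a hedged sketch of kicking off one job per measure group (the assembly loading and the server/database/cube names are assumptions; AMO collection indexers look up by object ID, so adjust if your IDs differ from names):
$measureGroups = @('MeasureGroupA', 'MeasureGroupB')   # placeholder names
$cmd = {
    param($serverName, $dbName, $cubeName, $mgName)
    [void][Reflection.Assembly]::LoadWithPartialName('Microsoft.AnalysisServices')
    # each job opens its own connection and processes one measure group
    $server = New-Object Microsoft.AnalysisServices.Server
    $server.Connect($serverName)
    $server.Databases[$dbName].Cubes[$cubeName].MeasureGroups[$mgName].Process('ProcessFull')
    $server.Disconnect()
}
$measureGroups | ForEach-Object {
    Start-Job -ScriptBlock $cmd -ArgumentList 'SSAS-SERVER', 'MyDatabase', 'MyCube', $_
}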

Powershell: Using -notcontains to compare arrays doesn't find non-matching values

PS noob here (as will be obvious shortly) but trying hard to get better. In my Exchange 2010 environment I import and export huge numbers of .pst files. Many will randomly fail to queue up, and once they're not in the queue it's very tedious to sort through the source files to determine which ones need to be run again, so I'm trying to write a script to do it.
First I run a dir on the list of pst files and fill a variable with the associated aliases of the accounts:
$vInputlist = dir $vPath -Filter *.pst |%{ get-mailbox -Identity $_.basename| select alias}
Then I fill a variable with the aliases of all the files/accounts that successfully queued:
$vBatch = foreach ($a in (Get-MailboxImportRequest -BatchName $vBatchname)) {get-mailbox $a.mailbox | select alias}
Then I compare the two arrays to see which files I need to queue up again:
foreach($should in $vInputlist){if ($vBatch -notcontains $should){Write-Host $should ""}}
It seems simple enough, yet the values in the arrays never match, or never fail to match, as the case may be. I've tried both -contains and -notcontains. I have put in a few sanity checks along the way, like exporting the variables to the screen and/or to csv files, and the data looks fine.
For instance, when $vInputlist is first filled I send it to the screen and it looks like this:
Alias
MapiEnableTester1.psiloveyou.com
MapiEnableTester2.psiloveyou.com
MapiEnableTester3.psiloveyou.com
MapiEnableTester4.psiloveyou.com
Yet that last line of code I displayed above (the Write-Host $should "") will output this:
@{Alias=MapiEnableTester1.psiloveyou.com}
@{Alias=MapiEnableTester2.psiloveyou.com}
@{Alias=MapiEnableTester3.psiloveyou.com}
@{Alias=MapiEnableTester4.psiloveyou.com}
(those all display as a column, not sure why they won't show that way here)
I've tried declaring the arrays like this, $vInputlist = @()
I've tried, instead of searching for the alias, just cleaning .pst off of the $_.basename using .replace
I've searched on comparing arrays till I'm blue in the fingers, and I don't think my comparison is wrong. I believe that somehow, no matter how I fill these variables, I am corrupting or changing the data so that seemingly matching data simply doesn't match.
Any help would be greatly appreciated. TIA
Using -contains to compare objects isn't easy, because the objects are never identical even though they have the same property with the same value. When you use select alias you get an array of pscustomobjects with the property alias.
Try using the -ExpandProperty parameter (shortened to -expand) in select, like
select -expand alias
Using -expand will extract the value of the alias property, and your lists will be two arrays of strings instead, which can be compared using -contains and -notcontains.
UPDATE: I've added a sample to show you what happens with your code.
#I'm creating objects that are EQUAL to the ones you have in your code
#This will simulate the objects that get through the "$vbatch -notcontains $should" test
PS > $arr = @()
PS > $arr += New-Object psobject -Property @{ Alias="MapiEnableTester1.psiloveyou.com" }
PS > $arr += New-Object psobject -Property @{ Alias="MapiEnableTester2.psiloveyou.com" }
PS > $arr += New-Object psobject -Property @{ Alias="MapiEnableTester3.psiloveyou.com" }
PS > $arr | ForEach-Object { Write-Host $_ }
@{Alias=MapiEnableTester1.psiloveyou.com}
@{Alias=MapiEnableTester2.psiloveyou.com}
@{Alias=MapiEnableTester3.psiloveyou.com}
#Now this is what you will get if you use "... | select -expand alias" instead of "... | select alias"
PS > $arrWithExpand = $arr | select -expand alias
PS > $arrWithExpand | ForEach-Object { Write-Host $_ }
MapiEnableTester1.psiloveyou.com
MapiEnableTester2.psiloveyou.com
MapiEnableTester3.psiloveyou.com
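Applied to the code in the question, the fix is just to expand the alias in both places (a sketch; untested against a live Exchange environment):
$vInputlist = dir $vPath -Filter *.pst | % { Get-Mailbox -Identity $_.BaseName | select -expand alias }
$vBatch = foreach ($a in (Get-MailboxImportRequest -BatchName $vBatchname)) { Get-Mailbox $a.mailbox | select -expand alias }
foreach ($should in $vInputlist) { if ($vBatch -notcontains $should) { Write-Host $should } }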
