Get-ADUser into a .CSV file, but DistinguishedName in a single column (sql-server)

After hours of trying and reading almost every related post here, I could not find a solution to my problem.
I would like to retrieve AD users, but only specific fields. Something like this:
Area | Enabled | Name | userPrincipalName (email)
The problem is that the DistinguishedName field has everything I need, but its parts end up split across different columns when I export it to .CSV.
I could read that line and separate it into the columns I need, but then I hit another problem: some users have 2, 3, or 4 OUs.
Is there a way to read only one OU (the first one is the one I need, because it says IT, EXPEDITION, etc.), or at least to separate CN into one field, OU into another, and then name, email, and so on?
I created this code to run in PowerShell:
Get-ADUser -Filter * -Properties DistinguishedName,SamAccountName,DisplayName,EmailAddress,OfficePhone | Select-Object DistinguishedName,EmailAddress,OfficePhone,DisplayName,SamAccountName | export-csv -path path\to\adusers.csv -NoTypeInformation -Encoding "UTF8"
and then I import it to SQL Server.

The way I would handle this is to create a custom (calculated) property in the Select-Object part of your code, e.g.:
Get-ADUser -Filter * -Properties DistinguishedName,SamAccountName,DisplayName,EmailAddress,OfficePhone | Select-Object DistinguishedName,EmailAddress,OfficePhone,DisplayName,SamAccountName
would become
Get-ADUser -Filter * -Properties DistinguishedName,SamAccountName,DisplayName,EmailAddress,OfficePhone | Select-Object @{Label='DistinguishedName';Expression={$_.DistinguishedName.Split(',')[1]}},EmailAddress,OfficePhone,DisplayName,SamAccountName
@{Label = 'DistinguishedName'; Expression = {$_.DistinguishedName.Split(',')[1]}} is how you create a custom property within the pipeline. The Label is the name of the property, and the Expression is what we want its value to be. In this case we take the current DistinguishedName in the pipeline, $_.DistinguishedName, and split it on the comma, since that is what separates the parts of a DistinguishedName from each other; the .Split(',') method takes the character you want to split on. Based on your question you are only ever interested in the first OU in the DistinguishedName, so we select the second element of the array created by .Split(',') using [1]; the first element of the array, [0], will be the CN= entry.
This will give you output with just the first OU in the DistinguishedName column, which you can then export to CSV.
Link to further info on custom properties https://4sysops.com/archives/add-a-calculated-property-with-select-object-in-powershell/
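If you want the CN and the first OU as separate fields rather than one trimmed column, a sketch along these lines could work (property names CN and FirstOU are made up for illustration; it assumes DNs like CN=John Doe,OU=IT,OU=Dept,DC=company,DC=com):

```powershell
# Hypothetical sketch: split a DistinguishedName into separate CN and
# first-OU fields using calculated properties.
Get-ADUser -Filter * -Properties DistinguishedName,EmailAddress |
    Select-Object Name,
        @{Label = 'CN';      Expression = { ($_.DistinguishedName -split ',')[0] -replace '^CN=' }},
        @{Label = 'FirstOU'; Expression = { (($_.DistinguishedName -split ',') -match '^OU=')[0] -replace '^OU=' }},
        EmailAddress |
    Export-Csv -Path path\to\adusers.csv -NoTypeInformation -Encoding UTF8
```

One caveat: if a CN itself contains an escaped comma (\,), a plain split on ',' will break it apart, so this only works for names without embedded commas.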

Related

Removing data from Array once pulled from AD

I'm currently pulling user data from Ad by OU, then updating certain fields, which works fine.
I want to modify the script to only update certain users, but I am struggling to remove entries from the array, as it is a fixed size. I converted it to an ArrayList and can get a count of objects, query them individually, etc.
$users = Get-ADUser -Filter * -SearchBase "DN" -Properties GivenName, Surname,mail,UserPrincipalName,SAMAccountName,proxyAddresses | Select GivenName, Surname,mail,UserPrincipalName,SAMAccountName,proxyAddresses
$WorkingSet =[System.Collections.ArrayList]($users)
$WorkingSet.Count gives 47, with the last element being:
GivenName         : Laura
Surname           : Willox
mail              : WilloxL@domain
UserPrincipalName : Laura.Willox@domain
SAMAccountName    : Laura.Willox
proxyAddresses    : {smtp:laura.willox@domain, SMTP:WilloxL@domain}
but trying $WorkingSet.IndexOf('Laura.Willox') gives -1 instead of 46
So then I can't do something like $WorkingSet.RemoveAt($WorkingSet.IndexOf('Laura.Willox'))
Is there something about this data that I am not understanding, such that it can't be queried like this?
You absolutely do not need to wrap your data in an ArrayList, it'll only complicate your code unnecessarily.
Instead of trying to modify the output from Get-ADUser inline in a list, use PowerShell's Where-Object cmdlet to filter the data:
$users = Get-ADUser -Filter * -SearchBase "DN" -Properties GivenName, Surname,mail,UserPrincipalName,SAMAccountName,proxyAddresses | Select GivenName, Surname,mail,UserPrincipalName,SAMAccountName,proxyAddresses
# use `Where-Object` to filter the data based on individual property values
$usersSansLaura = $users | Where-Object SAMAccountName -ne 'Laura.Willox'
Here, we pipe any user objects contained in $users to Where-Object SAMAccountName -ne 'Laura.Willox'. The -ne operator is the "not equal" operator, so the output will be any input object that does not have a SAMAccountName property with the exact value Laura.Willox; those objects are then assigned to $usersSansLaura.
Mathias' helpful answer is worth considering:
In PowerShell, it is unusual to directly manipulate resizable collections.
Instead, collection processing in PowerShell usually involves creating new collections by filtering the original collection, using the Where-Object cmdlet in the pipeline or, for collections already in memory, the .Where() array method.
If you do need to deal with in-place resizing of a list data type, I suggest using System.Collections.Generic.List`1 instead, whose .FindIndex() method allows you to do what you wanted:
# Note: I'm using [object] as the type parameter for simplicity, but you
# could use [Microsoft.ActiveDirectory.Management.ADUser] for strict typing.
$WorkingSet = [System.Collections.Generic.List[object]]::new(
@(Get-ADUser -Filter * -SearchBase "DN" -Properties GivenName, Surname,mail,UserPrincipalName,SAMAccountName,proxyAddresses | Select GivenName, Surname,mail,UserPrincipalName,SAMAccountName,proxyAddresses)
)
# Find the index of the user with a given SAMAccountName:
$ndx = $WorkingSet.FindIndex({ $args[0].SAMAccountName -eq 'Laura.Willox' })
# If found, remove the user from the list
# (-1 indicates that no matching element was found)
if ($ndx -ne -1) {
    $WorkingSet.RemoveAt($ndx)
}
Generally, note that both System.Collections.ArrayList and System.Collections.Generic.List`1 have a .Remove() method that allows you to pass the object (element) to remove directly, irrespective of its index.
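As a sketch of that .Remove() overload (assuming $WorkingSet is the generic list built above), you can locate the object first and then remove it directly:

```powershell
# Sketch: remove an element by object rather than by index.
# .Find() returns the first matching element (or $null), and
# .Remove() returns $true if the element was found and removed.
$laura = $WorkingSet.Find({ $args[0].SAMAccountName -eq 'Laura.Willox' })
if ($null -ne $laura) {
    $removed = $WorkingSet.Remove($laura)
}
```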
As for what you tried:
Since your array list is composed of ADUser instances, the .IndexOf() method requires passing such an instance in order to locate it among the elements - you can't just pass a string referring to one of the properties among the elements.
Instead, you need a predicate (a Boolean test) that compares the string to the property of interest (.SamAccountName), which is what the .FindIndex() call above does.

PowerShell ForEach Loop to Add UserPrincipalName and object ID to a file

I have a file of user emails. I need to gather their UPNs and object IDs.
Get-AzureADUser -Filter "PrimarySMTPAddress eq '$user.EmailAddress'"
I'm using this line to query AzureAD and it works perfectly. However when I place it in a loop I can't seem to get any data.
import-csv .\Book4.csv | ForEach-Object{
Get-AzureADUser -Filter "PrimarySMTPAddress eq '$_.EmailAddress'" | Select-Object UserPrincipalName, ObjectID
} | Export-Csv -Path .\Book4.csv -NoTypeInformation
Can anyone tell me where I'm going wrong. I thought this would be something simple but I've been stuck for an hour. The CSV file has three column headers: EmailAddress, UserPrincipalName, ObjectID
"PrimarySMTPAddress eq '$_.EmailAddress'" doesn't work as intended, because the attempted property access on variable $_, .EmailAddress, isn't effective:
Inside "...", an expandable string, you need $(...), the subexpression operator, in order to access a variable's properties[1], call methods on it, or, more generally, embed entire statements.
Therefore, use the following string instead:
"PrimarySMTPAddress eq '$($_.EmailAddress)'"
Also, you're mistakenly trying to read from and write back to the same file (.\Book4.csv) in the same pipeline, which doesn't work (without additional effort), as discussed in Allen Wu's helpful answer.
[1] Without enclosing in $(...), "...$_.EmailAddress..." causes $_ to be expanded by itself, whereas .EmailAddress is used verbatim. Compare Write-Output "$HOME.Length" to Write-Output "$($HOME.Length)", for instance.
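A quick, self-contained illustration of the difference (the object and email value are made up):

```powershell
# A made-up object standing in for one row of the CSV.
$row = [pscustomobject]@{ EmailAddress = 'jane@example.com' }

# Without $(), only $row is expanded; ".EmailAddress" stays literal text.
"Filter: '$row.EmailAddress'"
# -> Filter: '@{EmailAddress=jane@example.com}.EmailAddress'

# With $(), the property access runs inside the string.
"Filter: '$($row.EmailAddress)'"
# -> Filter: 'jane@example.com'
```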
@mklement0 is correct.
But another important thing is that we can't read from and write back to the same file in the same pipeline, unless you make sure that the input file is read into memory, in full, before its lines are sent through the pipeline.
Use this:
$users = Import-Csv E:\temp\Book4.csv | ForEach-Object {
    Get-AzureADUser -Filter "PrimarySMTPAddress eq '$($_.EmailAddress)'" | Select-Object UserPrincipalName, ObjectID
}
$users | Export-Csv -Path E:\temp\Book4.csv -NoTypeInformation

Extracting first and last names from txt file and using it to get username from AD and import it into csv

As a test, and as a way to get specific data from AD, I am trying to get the data based on a txt file filled with users' first and last names. I worked out that using the ConvertFrom-String cmdlet allows you to split the names inside the txt file into two separate values, enabling you (in theory) to use Get-ADUser to find the user in AD along with its attributes.
The code I have been using so far is the following, with random changes here and there as I tried various options to make it work. I have been able to get the data I need by storing names in variables and then using the Get-ADUser cmdlet to pick them up from AD; I was even able to export that data into a CSV file. The issue is that I can't make it work when the text file is filled with several entries.
Get-Content C:\temp\users.txt -encoding UTF8 |
ConvertFrom-String |
ForEach-Object {Get-ADUser -Filter {(givenName -Like '$($_.P1)') -and (sn -Like '$($_.P2)')} -Properties *} |
Select-object givenName, Surname, SamAccountName | Export-CSV C:\Temp\usersexport.csv -NoTypeInformation
Any help would be very appreciated.
You are using a subexpression inside a non-expanding string. Check out PowerShell's about_Quoting_Rules help topic for more detailed information. The short story is that you need double quotes for something like $($_) to expand.
Something like:
Get-Content C:\temp\users.txt -encoding UTF8 |
ConvertFrom-String |
ForEach-Object {Get-ADUser -Filter "( givenName -like '$($_.P1)' ) -and (sn -like '$($_.P2)' )" -Properties *} |
Select-object givenName, Surname, SamAccountName | Export-CSV C:\Temp\usersexport.csv -NoTypeInformation
Notice that I quoted the entire filter. Technically Get-ADUser's -Filter parameter takes a string so this is acceptable and makes it a little easier to read.
Also note: you may need "*" wildcards to make this work correctly. Technically -like will work on exact matches, but it's more typically used when searching with wildcards. In your case, using '*$($_.P1)*' might help get you past the issue.
If you are confident in the values, especially considering you are also using the -and operator, you might think about using -eq. However, I'd be concerned, as there's always a chance of common name collisions. For example, in the US there could easily be two John Smiths in AD...
I'd also point out it's more expedient to use Import-Csv for this. When corrected your approach is working, but below is more readable and easier to write in the first place:
Example with -eq:
Import-Csv -Path C:\temp\users.txt -Header "givenName","sn" -Delimiter " " |
ForEach-Object {
    $GivenName = $_.givenName
    $LastName  = $_.sn
    Get-ADUser -Filter "(givenName -eq '$GivenName') -and (sn -eq '$LastName')" -Properties *
} |
Select-Object givenName, Surname, SamAccountName |
Export-CSV C:\Temp\usersexport.csv -NoTypeInformation
Example with -like and Wildcards:
Import-Csv -Path C:\temp\users.txt -Header "givenName","sn" -Delimiter " " |
ForEach-Object {
    $GivenName = $_.givenName
    $LastName  = $_.sn
    Get-ADUser -Filter "(givenName -like '*$GivenName*') -and (sn -like '*$LastName*')" -Properties *
} |
Select-Object givenName, Surname, SamAccountName |
Export-CSV C:\Temp\usersexport.csv -NoTypeInformation
You may also want to avoid -Properties * and instead define a more exact set of properties to retrieve. In bigger jobs that will make a performance difference.
Oh, and you usually don't need to specify the encoding for Get-Content; it's very good at figuring that out on its own.
In PowerShell Core the default value for the parameter is UTF8NoBOM. NoBOM just means there's no "Byte Order Mark" declaring the encoding, so it shouldn't have any issues with a UTF8 file. In Windows PowerShell 5.1 you may need to specify it as you have.
It looks like this is a learning exercise for you, let me know if this helps.

Is it possible to make IndexOf case-insensitive in PowerShell?

I've got a problem searching for an index in an array built from the query sessions command on a terminal server.
This is the problematic script:
# Array of logged users in terminal servers
$a=Get-RDUsersession -CollectionName "BLABLA" -ConnectionBroker BLABLA.BLA.BL
# Array of all users with two columns from active directory
$b=Get-ADUser -filter * -properties TelephoneNumber,SamAccountName
Now imagine logging in the terminal server using the account name TEST instead of test.
If I do:
$c = $b[$b.SamAccountName.indexof("test")].TelephoneNumber
then I don't get the telephone number.
I think that's because of the case sensitivity, isn't it? If I type TEST in the search command, I get the correct number.
Is there any simple way to solve this problem and make the search of the index case-insensitive?
I've read about using the [StringComparison]"CurrentCultureIgnoreCase" comparison, but it doesn't seem to work with arrays.
Thanks.
Since $b is an Object[] type, you would probably want to use Where-Object.
$b | Where-Object -FilterScript {$_.Samaccountname -like '*Smith*'} | Select-Object -ExpandProperty 'telephoneNumber'
That being said, an array in PowerShell can be searched by index case-insensitively if it is converted to a [Collections.Generic.List[Object]] type.
$b = [Collections.Generic.List[Object]]$b
$b.FindIndex( {$args[0].sAMAccountName -eq 'test'} )
Note that pulling every single user object from AD and filtering with Where-Object or index matching can be very slow. You can instead call Get-ADUser as needed, or pull all AD users with a filter that returns only the users present in $a.
If you insist on having all AD users in one spot with one pull, consider looping over the list once to build a hash lookup, so you can index it by key.
#Create account lookup hash
$accountlookup = @{}
foreach ($element in $accounts) {
    $accountlookup[$element.SamAccountName] = $element
}
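Hashtable keys in PowerShell are case-insensitive by default, so once the lookup is built, indexing into it works regardless of case (a hypothetical usage, assuming the hash was built from your $b data):

```powershell
# 'TEST' and 'test' hit the same key because PowerShell hashtable
# keys are case-insensitive by default.
$c = $accountlookup['TEST'].TelephoneNumber
```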
Hope that helps!

insert large amount of AD data into SQL server using PowerShell

I have a PowerShell script that pulls in 1.4+ million rows of data and saves it to a HUGE CSV file that then gets imported into an SQL server. I thought there might be a way to have PowerShell insert the data into the SQL server directly but I am not sure how.
One of my concerns is that I don't want to buffer up the AD result into memory and then write them. I'd rather write them in batches of 1000 or something so memory consumption stays down. Get 1000 records, save to SQL server, and repeat...
I see articles about how to get PowerShell to write to an SQL server but they all seem to either do ALL data at one time or one record at a time -- both of which seem inefficient to me.
This is the PowerShell script I have to query AD.
# the attributes we want to load
$ATTRIBUTES_TO_GET = "name,distinguishedName"
# split into an array
$attributes = $ATTRIBUTES_TO_GET.split(",")
# create a select string to be used when we want to dump the information
$selectAttributes = $attributes | ForEach-Object {@{n="AD $_";e=$ExecutionContext.InvokeCommand.NewScriptBlock("`$_.$($_.toLower())")}}
# get a directory searcher to search the GC
[System.DirectoryServices.DirectoryEntry] $objRoot = New-Object System.DirectoryServices.DirectoryEntry("GC://dc=company,dc=com")
[System.DirectoryServices.DirectorySearcher] $objSearcher = New-Object System.DirectoryServices.DirectorySearcher($objRoot)
# set properties
$objSearcher.SearchScope = "Subtree"
$objSearcher.ReferralChasing = "All"
# need to set page size otherwise AD won't return everything
$objSearcher.PageSize = 1000
# load the data we want
$objSearcher.PropertiesToLoad.AddRange($attributes)
# set the filter
$objSearcher.Filter = "(&(objectClass=group)(|(name=a*)(name=b*)))"
# get the data and export to csv
$objSearcher.FindAll() | select -expandproperty properties | select $selectAttributes | export-csv -notypeinformation -force "out.csv"
I use Out-DataTable to convert my object array into a DataTable object type, then use Write-DataTable to bulk insert that into a database (Write-DataTable uses SqlBulkCopy to do this).
Caveats/gotchas for this (SqlBulkCopy can be a nuisance to troubleshoot):
Make sure your properties are the correct type (string for varchar/nvarchar, int for any integer values, dateTime can be string as long as the format is correct and SQL can parse it)
Make sure your properties are in order and line up with the table you're inserting into, including any fields that auto-fill (incrementing ID key, RunDt, etc.).
Out-DataTable: https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd
Write-DataTable: https://gallery.technet.microsoft.com/scriptcenter/2fdeaf8d-b164-411c-9483-99413d6053ae
Usage
If I were to continue on your example and skip the CSV, this is how I would do it... replace the last two lines with the code below (assuming that your object properties line up with the table perfectly, your SQL server name is sql-server-1, database name is org, and table name is employees):
try {
    Write-DataTable -ServerInstance sql-server-1 -Database org -TableName employees -Data $($objSearcher.FindAll() | Select-Object -ExpandProperty properties | Select-Object $selectAttributes | Out-DataTable -ErrorAction Stop) -ErrorAction Stop
}
catch {
    $_
}
Looking at your code, it looks like you come from .NET or some language based on .NET. Have you heard of the cmdlets Get-ADUser / Get-ADGroup? They would simplify things tremendously for you.
As far as the SQL connection goes, PowerShell doesn't have any native support for it. Microsoft has made cmdlets for it, though! You just have to have SQL Server installed in order to get them... which is kind of a bummer, since SQL is so heavy and not everyone wants to install it. It is still possible using .NET; it's just not very quick or pretty. I won't be giving advice on the cmdlets here; you can Google that. As far as .NET goes, I would start by reading some of the documentation on the System.Data.SqlClient namespace, as well as some historical questions on the subject.
Finally, as you said, it would be a good idea to try to avoid overloading your RAM. The big thing here is keeping your entire script down to one single AD query; this way you avoid the troubling scenario of data changing between one query and the next. I think the best way of doing this would be to save your results straight to a file. Once you have that, you could use SqlBulkCopy to insert into the table straight from your file. The downside is that it doesn't allow for multiple AD properties; at least, I don't think SqlBulkCopy allows for this?
Get-ADUser "SomeParamsHere" | Out-File ADOutput.txt
If you have to have multiple AD properties and still want to keep the RAM usage to a minimum...well I toyed around with a script that would work but makes a few calls that would read from the entire file, which defeats the whole purpose. Your best option might be to save each property to a separate file then do your whole write DB thing. Example:
New-Item Name.txt
New-Item DistinguishedName.txt
Get-ADUser "SomeParamsHere" -Properties Name,DistinguishedName | ForEach-Object {
    Add-Content -Path "Name.txt" -Value "$($_.Name)"
    Add-Content -Path "DistinguishedName.txt" -Value "$($_.DistinguishedName)"
}
Store the results from your last line of code in a variable instead of exporting to CSV.
Then create groups of the size you want.
Using Out-DataTable and Write-DataTable, write to SQL (links in nferrell's answer).
$res = $objSearcher.FindAll() | select -expandproperty properties | select $selectAttributes
$counter = [pscustomobject] @{ Value = 0 }
#create groups with 1000 entries each
$groups = $res | Group-Object -Property { [math]::Floor($counter.Value++ / 1000) }
foreach ($group in $groups){
#convert to data table
$dt = $group.group | Out-DataTable
$dt | Write-DataTable -Database DB -ServerInstance SERVER -TableName TABLE
}
You're making this unnecessarily complicated.
If I read your code correctly, you want all groups starting with 'a' or 'b'.
# the attributes we want to export
$attributes = 'name', 'distinguishedName'
Import-Module ActiveDirectory
Get-ADGroup -Filter {(name -like "a*") -or (name -like "b*")} -SearchBase 'dc=company,dc=com' |
select $attributes | Export-Csv -NoTypeInformation -Force "out.csv"
Instead of using Export-Csv at the end, just pipe the output to the command which creates the SQL rows. By piping objects (instead of assigning them to a variable) you give PowerShell the ability to handle them efficiently (it will start processing objects as they come in, not buffer everything).
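To make the batching from the earlier answer streaming as well, a pipeline function with begin/process/end blocks can collect objects into batches as they arrive, so nothing is buffered beyond one batch. This is a sketch; Invoke-InBatches is a made-up helper, and the -Action script block is a placeholder for whatever performs the actual insert:

```powershell
# Sketch of a streaming batcher: collects pipeline input in batches of
# $BatchSize and hands each full batch to a caller-supplied script block.
function Invoke-InBatches {
    param(
        [int] $BatchSize = 1000,
        [scriptblock] $Action   # receives one batch array as its argument
    )
    begin   { $batch = [System.Collections.Generic.List[object]]::new() }
    process {
        $batch.Add($_)
        if ($batch.Count -ge $BatchSize) {
            & $Action $batch.ToArray()
            $batch.Clear()
        }
    }
    end     { if ($batch.Count) { & $Action $batch.ToArray() } }
}

# Hypothetical usage: replace Write-Host with a real bulk-insert call.
Get-ADGroup -Filter {(name -like "a*") -or (name -like "b*")} |
    Select-Object name, distinguishedName |
    Invoke-InBatches -BatchSize 1000 -Action { param($rows) Write-Host "Inserting $($rows.Count) rows" }
```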
Unfortunately I can't help you with the SQL part.
