Replace Database connection string using PowerShell - sql-server

I am working on a script that should display the current database a program is connected to, and also give the option to replace it with a new database (either by manually entering the database name, or preferably by listing all databases on the local server\instance and letting users make a "number selection" to pick the database they want to use).
The connection string is saved in a text file called server.exe.config. Below is an example of what the file contains; obviously it's not the only data in the config file:
<add key="persistency.connection" value="data source=MyDatabaseServer;initial catalog=MyDatabase1;Integrated Security=True" />
I can use Get-Content to see the entire file, and Where-Object {$_ -like "*initial catalog=*"} to see the one line that holds the database configuration.
But I think it will be difficult for users to work out which database is in use from that, so if possible I need a command that displays just the database name from the config file instead of the entire line, and saves that name so it can be replaced later when the user selects a new database to write into the config file.
Possible?

While plain-text processing, as shown in your own answer, works in simple cases, for robustness it is usually preferable to use XML parsing, such as via the Select-Xml cmdlet:
$file = './server.exe.config'
# Extract the XML element of interest, using an XPath query
$xpathQuery = '/configuration/appSettings/add[@key = "persistency.connection"]'
$el = (Select-Xml $xpathQuery $file).Node
# Replace the database name in the element's value...
$newDatabase = 'NewDatabase'
$el.value = $el.value -replace '(?<=\binitial catalog=)[^;]+', $newDatabase
# ... and save the modified document back to the file.
# IMPORTANT: Be sure to use a *full* path to the output file,
# because PowerShell's current directory (location)
# usually differs from .NET's; Convert-Path ensures that.
$el.OwnerDocument.Save((Convert-Path $file))
Note that this technique works in principle for any XML file, not just .exe.config files.

I think this is simpler and it works for me:
$ConnString = Get-Content -Path $filepath\server.exe.Config | Where-Object {$_ -like "*initial catalog=*"}
$ConnString -split '\s*;\s*' | ForEach-Object -Process {
    if ($_ -imatch '^initial catalog=(?<dbname>.+)$') {
        $ConnStringdb = $Matches.dbname
    }
}
I am then using the following simply to replace the text above back into the config file after a database name is entered manually. I would love to get it to work automatically, maybe by building an array or something.
Write-Host "Please Enter the database name you want to use for Production(Don't Leave Blank): " -NoNewline; $NewDatabaseConfig = Read-Host;
(Get-Content -Path $filepath\server.exe.Config -Raw) -Replace $ConnStringdb,$NewDatabaseConfig | Set-Content -Path $IPSServerProd\UBERbaseOfficeServer.exe.Config
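For the "number selection" part of the question, here is a minimal sketch, assuming the SqlServer module's Invoke-Sqlcmd is available and Windows authentication is used; the server name is taken from the sample connection string and is otherwise hypothetical:
$server    = 'MyDatabaseServer'   # server name from the sample connection string above
$databases = (Invoke-Sqlcmd -ServerInstance $server -Query 'SELECT name FROM sys.databases').name
# Show a numbered list and let the user pick by number.
for ($i = 0; $i -lt $databases.Count; $i++) {
    Write-Host ('{0}: {1}' -f ($i + 1), $databases[$i])
}
$choice = [int](Read-Host 'Enter the number of the database to use')
$NewDatabaseConfig = $databases[$choice - 1]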

Related

searching for multiple strings in multiple files in PowerShell

First of all, I've got a reliable search (thanks to some help on Stack Overflow) that checks for occurrences of different strings in a line across many log files.
I've now been tasked with including multiple searches, and since there are about 20 files and about a dozen search criteria, I don't want to have to access these files over 200 times. I believe the best way of doing this is with an array, but so far all methods I've tried have failed.
Each search criterion is made up of the date (which obviously changes every day), a fixed string (ERROR), and a unique Java class name. Here is what I have:
$dateStr = Get-Date -Format "yyyy-MM-dd"
$errword = 'ERROR'
$word01 = [regex]::Escape('java.util.exception')
$pattern01 = "${dateStr}.+${errword}.+${word01}"
$count01 = (Get-ChildItem -Filter $logdir -Recurse | Select-String -Pattern $pattern01 -AllMatches |ForEach-Object Matches |Measure-Object).Count
Add-Content $outfile "$dateStr,$word01,$count01"
The easy way to expand this is to have a separate three-command entry (set word, set pattern, then search) for each class I want to search against - which I've done and it works, but it's not elegant, and then we're performing over 200 file reads to run the search. I've tried to read the Java class names in from a simple text file with mixed results, but it's the only thing I've been able to get working to simplify the search for 12 different patterns.
iRon provided an important pointer: Select-String can accept an array of patterns to search for, and reports matches for lines that match any one of them.
You can then get away with a single Select-String call, combined with a Group-Object call that allows you to group all matching lines by which pattern matched:
# Create the input file with class names to search for.
@'
java.util.exception
java.util.exception2
'@ > classNames.txt
# Construct the array of search patterns,
# and add them to a map (hashtable) that maps each
# pattern to the original class name.
$dateStr = Get-Date -Format 'yyyy-MM-dd'
$patternMap = [ordered] @{}
Get-Content classNames.txt | ForEach-Object {
    $patternMap[('{0}.+{1}.+{2}' -f $dateStr, 'ERROR', [regex]::Escape($_))] = $_
}
# Search across all files, using multiple patterns.
Get-ChildItem -File -Recurse $logdir | Select-String @($patternMap.Keys) |
    # Group matches by the matching pattern.
    Group-Object Pattern |
    # Output the result; send to `Set-Content` as needed.
    ForEach-Object { '{0},{1},{2}' -f $dateStr, $patternMap[$_.Name], $_.Count }
Note:
$logDir, as the name suggests, is presumed to refer to a directory in which to (recursively) search for log files. Passing it to -Filter wouldn't work, so I've removed that (which then positionally binds $logDir to the -Path parameter). -File limits the results to files; if other kinds of items are also present, add a -Filter argument as needed, e.g. -Filter *.log.
Select-String's -AllMatches switch is generally not required - you only need it if any of the patterns can match multiple times per line and you want to capture all of those matches (see the one-liner after these notes).
Using @(...), the array-subexpression operator, around the collection of the hashtable's keys, $patternMap.Keys, i.e. the search patterns, is required purely for technical reasons: it forces the collection into a form that is convertible to an array of strings ([string[]]), which is how the -Pattern parameter is typed.
The need for @(...) is surprising, and may be indicative of a bug, as of PowerShell 7.2; see GitHub issue #16061.
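To illustrate the -AllMatches note, a standalone one-liner with a made-up input string, unrelated to the log files above:
# Returns 2 (both 'ERROR' occurrences on the line); without -AllMatches,
# only the first match per line is reported, giving 1.
('2021-09-01 ERROR foo ERROR bar' | Select-String -Pattern 'ERROR' -AllMatches).Matches.Count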

Is it possible to make IndexOf case-insensitive in PowerShell?

I've got a problem searching for an index in an array built from the query sessions command on a terminal server.
This is the problematic script:
# Array of logged-on users in terminal servers
$a = Get-RDUserSession -CollectionName "BLABLA" -ConnectionBroker BLABLA.BLA.BL
# Array of all users with two columns from Active Directory
$b = Get-ADUser -Filter * -Properties TelephoneNumber,SamAccountName
Now imagine logging in to the terminal server using the account name TEST instead of test.
If I do:
$c = $b[$b.SamAccountName.IndexOf("test")].TelephoneNumber
then I don't get the telephone number.
I think that's because of the case sensitivity, isn't it? If I type TEST in the search command, I get the correct number.
Is there any simple way to solve this problem and make the search of the index case-insensitive?
I've read about using the [StringComparison]"CurrentCultureIgnoreCase" comparison, but it doesn't seem to work with an array.
Thanks.
Since $b is of type Object[], you would probably want to use Where-Object:
$b | Where-Object -FilterScript {$_.Samaccountname -like '*Smith*'} | Select-Object -ExpandProperty 'telephoneNumber'
That being said, an array in PowerShell can be indexed case-insensitively if it is converted to the [Collections.Generic.List[Object]] type: its FindIndex method takes a script-block predicate, and PowerShell's -eq operator is case-insensitive by default.
$b = [Collections.Generic.List[Object]]$b
$b.FindIndex( {$args[0].sAMAccountName -eq 'test'} )
Note that pulling every single user object from AD and filtering with Where-Object or index matching can be very slow. You can instead call Get-ADUser as needed, or pull only the AD users that correspond to the sessions returned in $a, for example as sketched below.
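A rough sketch of that idea, assuming the session objects in $a expose a UserName property that matches the AD SamAccountName:
# Pull only the accounts that actually have a session (assumption: $_.UserName equals SamAccountName).
$b = $a | ForEach-Object {
    Get-ADUser -Identity $_.UserName -Properties TelephoneNumber
}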
If you insist on having all AD users in one spot from a single pull, consider looping over the list once to build a hash lookup so you can index into it directly.
# Create account lookup hash
$accountlookup = @{}
foreach ($element in $accounts) {
    $accountlookup[$element.SamAccountName] = $element
}
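Hashtable keys are case-insensitive by default, so once the hash is built (here from whatever collection $accounts holds, e.g. the $b results above), either casing finds the same entry:
$accountlookup['TEST'].TelephoneNumber   # same entry as...
$accountlookup['test'].TelephoneNumber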
Hope that helps!

PowerShell to delete files not matching an array of user names

I have a simple database query that returns an array of current users
e.g.
ABB016
ACQ002
AGU003
AHM007
I would like to delete all files in a particular folder whose name to the left of the extension (the BaseName) cannot be found within this array.
The code below is what I currently have; however, at the moment it would delete all the files in this directory:
# Find files that do not have the same basename as current users' name and delete
Get-ChildItem -Path $DirectoryWithFiles -Recurse *.* | Where-Object {
    $SQLQeryResultsSet -NotContains $_.BaseName } |
    Remove-Item -WhatIf
What if: Performing the operation "Remove File" on target "ABB016.pdf".
What if: Performing the operation "Remove File" on target "ACQ002.pdf".
What if: Performing the operation "Remove File" on target "AGU003.pdf".
What am I doing wrong?
As your comments suggest, $SQLQeryResultsSet is not an array of strings, but a DataTable.
You could extract the individual string values from the first column of each data row like this:
$ExcludedUsers = $SQLQeryResultsSet | ForEach-Object { '{0}' -f $_.Item(0) }
Then use this new variable in place of $SQLQeryResultsSet in the Where-Object filter scriptblock.
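Put together, a sketch of the full pipeline might look like this (same variable names as above):
$ExcludedUsers = $SQLQeryResultsSet | ForEach-Object { '{0}' -f $_.Item(0) }
Get-ChildItem -Path $DirectoryWithFiles -Recurse -File |
    Where-Object { $ExcludedUsers -notcontains $_.BaseName } |
    Remove-Item -WhatIf   # drop -WhatIf once the preview looks correct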

PowerShell: importing one CSV column into an array; the contains command doesn't work with this array

With a PowerShell script, I'm importing a CSV file into an array, and everything works fine.
$csv = Import-Csv "C:\test.csv" -delimiter ";"
But I'm not easily able to find a value in a field named PeopleID directly.
The best method I've found is to loop through every array line and check whether the item I'm looking for exists, like this:
foreach ($item in $csv)
{
    if ($csv | where {$item.PeopleID -eq 100263} | select *) {
        # Good, I found it!!!
    }
    $List.add($item.PeopleID) > $null
}
Instead, I decided to import only my PeopleID column directly into an array to make it faster:
$csvPeople = Import-Csv "C:\test.csv" -delimiter ";" | select PeopleID
As you can see above, I also create a $List array that adds every PeopleID in my loop.
So I have two arrays that are identical.
The problem comes when I use the CONTAINS command:
if($csvPeople -contains 100263)
the result is false
if($List -contains 100263)
the result is true
What can I do to get my $csvPeople array working with "contains"?
Importing a CSV column is faster than looping through the results and adding them to a new array, but it's only the looped array that works.
Am I missing something when I import my CSV column, so that I get a "working" array?
Thanks.
I think you are looking for -like, not -contains. -contains compares complete elements, and after select PeopleID your array holds objects with a PeopleID property rather than bare values, so -contains 100263 never matches. If you replace your if syntax with this:
if($csvPeople -like "*100263*")
you should be good to go. Note the wildcards on either side; I'm putting them there since I don't know exactly what your data looks like. You may be able to remove or change them.
Obligatory -like vs -contains article if you are interested: http://windowsitpro.com/blog/powershell-contains
Also, @AnsgarWiechers' comment above will work. I believe you will still need to wrap your number in quotes, though. I don't like doing that, as it requires an exact match: if you are working with the CSV in Excel or elsewhere and you have whitespace or oddball line-ending characters, you might not get a hit with -contains.
I just noticed this in your script above where you are doubling your efforts:
foreach ($item in $csv) # <-- Here $item is a record in the CSV
{
    if ($csv | where {$item.PeopleID -eq 100263} | select *) { # <-- Here you are re-parsing the CSV even though you already have the data in $item
    }
    $List.add($item.PeopleID) > $null
}
Since I don't have the full picture of what you are trying to do I'm providing a solution that allows you to match and take action per record.
foreach ($item in $csv)
{
    if ($item.PeopleID -eq 100263) {
        # Do something with $item here now that you matched it
    }
}
To address your 191 users, I would take a different approach. You should be able to load a raw list of PeopleIDs into a variable and then use the -in feature of Where-Object to do a list-to-list comparison.
First load your target PeopleIDs into a list. Assuming it is just a flat list, not a CSV, you could do your comparison like this:
$PeopleID_List = Get-Content YourList.txt
$csv | Where-Object { $_.PeopleID -in $PeopleID_List } | ForEach-Object {
    # Do something with the matched record here. Reference it with $_
}
This basically says: for each record in the CSV, check whether that record's PeopleID is in $PeopleID_List; if it is, take action on that record.

insert large amount of AD data into SQL server using PowerShell

I have a PowerShell script that pulls in 1.4+ million rows of data and saves it to a HUGE CSV file that then gets imported into an SQL server. I thought there might be a way to have PowerShell insert the data into the SQL server directly but I am not sure how.
One of my concerns is that I don't want to buffer all the AD results in memory and then write them. I'd rather write them in batches of 1000 or so, so memory consumption stays down: get 1000 records, save them to the SQL server, and repeat...
I see articles about how to get PowerShell to write to an SQL server, but they all seem to write either ALL the data at once or one record at a time -- both of which seem inefficient to me.
This is the PowerShell script I have to query AD.
# the attributes we want to load
$ATTRIBUTES_TO_GET = "name,distinguishedName"
# split into an array
$attributes = $ATTRIBUTES_TO_GET.split(",")
# create a select string to be used when we want to dump the information
$selectAttributes = $attributes | ForEach-Object {@{n="AD $_";e=$ExecutionContext.InvokeCommand.NewScriptBlock("`$_.$($_.toLower())")}}
# get a directory searcher to search the GC
[System.DirectoryServices.DirectoryEntry] $objRoot = New-Object System.DirectoryServices.DirectoryEntry("GC://dc=company,dc=com")
[System.DirectoryServices.DirectorySearcher] $objSearcher = New-Object System.DirectoryServices.DirectorySearcher($objRoot)
# set properties
$objSearcher.SearchScope = "Subtree"
$objSearcher.ReferralChasing = "All"
# need to set page size otherwise AD won't return everything
$objSearcher.PageSize = 1000
# load the data we want
$objSearcher.PropertiesToLoad.AddRange($attributes)
# set the filter
$objSearcher.Filter = "(&(objectClass=group)(|(name=a*)(name=b*)))"
# get the data and export to csv
$objSearcher.FindAll() | select -expandproperty properties | select $selectAttributes | export-csv -notypeinformation -force "out.csv"
I use Out-DataTable to convert my object array into a DataTable object type, then use Write-DataTable to bulk insert that into a database (Write-DataTable uses SqlBulkCopy to do this).
Caveats/gotchas for this (SqlBulkCopy can be a nuisance to troubleshoot):
Make sure your properties are the correct type (string for varchar/nvarchar, int for any integer values; dateTime can be a string as long as the format is correct and SQL can parse it) - see the small example after these caveats.
Make sure your properties are in order and line up with the table you're inserting into, including any fields that auto-fill (incrementing ID key, RunDt, etc.).
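For instance, a hypothetical way of forcing property types before piping to Out-DataTable; the property and column names below are made up:
# $results stands for the object array about to be bulk-inserted.
$typedRows = $results | Select-Object @{ n = 'Name';       e = { [string]$_.Name } },
                                      @{ n = 'EmployeeId'; e = { [int]$_.EmployeeId } }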
Out-DataTable: https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd
Write-DataTable: https://gallery.technet.microsoft.com/scriptcenter/2fdeaf8d-b164-411c-9483-99413d6053ae
Usage
If I were to continue with your example and skip the CSV, this is how I would do it: replace the last two lines with the code below (assuming that your object properties line up with the table perfectly, your SQL server name is sql-server-1, the database name is org, and the table name is employees):
try {
    Write-DataTable -ServerInstance sql-server-1 -Database org -TableName employees -Data $($objSearcher.FindAll() | Select-Object -ExpandProperty properties | Select-Object $selectAttributes | Out-DataTable -ErrorAction Stop) -ErrorAction Stop
}
catch {
    $_
}
Looking at your code, it looks like you come from .NET or some language based on .NET. Have you heard of the Get-ADUser / Get-ADGroup cmdlets? They would simplify things tremendously for you.
As far as the SQL connection goes, PowerShell doesn't have any native support for it. Microsoft has made cmdlets for it, though! You just have to have SQL Server installed in order to get them, which is kind of a bummer since SQL Server is heavy and not everyone wants to install it. It is still possible using .NET; it's just not very quick or pretty. I won't be giving advice on the cmdlets here; you can Google that. As far as .NET goes, I would start by reading some of the documentation on the System.Data.SqlClient namespace, as well as some historical questions on the subject.
Finally, as you said, it would be a good idea to try to avoid overloading your RAM. The big thing here is keeping your entire script down to one single AD query; this way you avoid the troubling scenario of data changing between one query and the next. I think the best way of doing this would be to save your results straight to a file. Once you have that, you could use SqlBulkCopy to insert into the table straight from your file. The downside is that it doesn't allow for multiple AD properties - at least I don't think SqlBulkCopy will allow for that? (A rough SqlBulkCopy sketch appears at the end of this answer.)
Get-ADUser "SomeParamsHere" | Out-File ADOutput.txt
If you have to have multiple AD properties and still want to keep RAM usage to a minimum... well, I toyed around with a script that would work, but it makes a few calls that read the entire file, which defeats the whole purpose. Your best option might be to save each property to a separate file and then do your whole write-to-DB thing. Example:
New-Item Name.txt
New-Item DistinguishedName.txt
Get-ADUser "SomeParamsHere" -Properties Name,DistinguishedName | ForEach-Object {
    Add-Content -Path "Name.txt" -Value "$($_.Name)"
    Add-Content -Path "DistinguishedName.txt" -Value "$($_.DistinguishedName)"
}
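As promised, a rough sketch of the plain System.Data.SqlClient / SqlBulkCopy route; the connection string, table name, and the $dt DataTable (e.g. produced by Out-DataTable) are hypothetical:
$connectionString = 'Server=sql-server-1;Database=org;Integrated Security=True'
$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($connectionString)
$bulkCopy.DestinationTableName = 'dbo.employees'   # hypothetical target table
$bulkCopy.BatchSize = 1000                         # write in batches of 1000 rows
$bulkCopy.WriteToServer($dt)                       # $dt is a System.Data.DataTable
$bulkCopy.Close()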
Store the results from the last line of your code in a variable instead of exporting them to CSV.
Then create groups of the size you want.
Using Out-DataTable and Write-DataTable, write to SQL (links are in nferrell's answer).
$res = $objSearcher.FindAll() | select -ExpandProperty properties | select $selectAttributes
$counter = [pscustomobject] @{ Value = 0 }
# create groups with 1000 entries each
$groups = $res | Group-Object -Property { [math]::Floor($counter.Value++ / 1000) }
foreach ($group in $groups) {
    # convert to data table
    $dt = $group.Group | Out-DataTable
    $dt | Write-DataTable -Database DB -ServerInstance SERVER -TableName TABLE
}
You're making this unnecessarily complicated.
If I read your code correctly, you want all groups starting with 'a' or 'b'.
# the attributes we want to export
$attributes = 'name', 'distinguishedName'
Import-Module ActiveDirectory
Get-ADGroup -Filter {(name -like "a*") -or (name -like "b*")} -SearchBase 'dc=company,dc=com' |
select $attributes | Export-Csv -NoTypeInformation -Force "out.csv"
Instead of using Export-Csv at the end, just pipe the output to the command which creates the SQL rows. By piping objects (instead of assigning them to a variable) you give PowerShell the ability to handle them efficiently (it will start processing objects as they come in, not buffer everything).
Unfortunately I can't help you with the SQL part.
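For completeness, one hypothetical way to wire that up is to reuse Out-DataTable / Write-DataTable from the earlier answer (same made-up server, database, and table names as there):
Import-Module ActiveDirectory
$attributes = 'name', 'distinguishedName'
# Note: Out-DataTable collects everything into one DataTable first; for true
# batching, see the Group-Object approach in the answer above.
Get-ADGroup -Filter {(name -like "a*") -or (name -like "b*")} -SearchBase 'dc=company,dc=com' |
    Select-Object $attributes |
    Out-DataTable |
    Write-DataTable -ServerInstance sql-server-1 -Database org -TableName employees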
