Create a PowerShell script to disable most SQL Server logins - sql-server

First question here, and a newbie one. Sorry in advance ;)
I'm trying to create a two-step PowerShell script to change all SQL Server logins on different servers.
My goal is to be able to run it in a few seconds without having to open Management Studio for these operations. This has to be done on about 1000 computers running different versions of SQL Express.
1st step: Change the sa password.
I've tested this one and it's OK. I'm using a txt file with the server name list and two strings for the password and username. Works perfectly.
2nd step: Disable all SQL logins (including Windows auth ones) that are not in a text file with the list of logins to keep.
Here's where I'm stuck.
I can use a list of users and disable them, but I would like to do the opposite: disable the login names that are not on the list but are enabled on the SQL Server.
I need it to work this way because I have no way of knowing the hostnames and usernames without accessing our clients' computers/servers, and I need it done really quickly, without manual steps.
This is what I've done to disable the ones on the list. It works fine, but... I need the opposite.
Thank you very much, and sorry if it's a stupid question.
Regards,
Luís
#DISABLE LOGINS ON THE LIST
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$Error.Clear()
cls
$servers = Get-Content C:\example\servers.txt
$logins = Get-Content C:\example\users.txt
foreach ($server in $servers)
{
    $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $server
    foreach ($login in $logins)
    {
        if ($srv.Logins.Contains($login))
        {
            $srv.Logins[$login].Disable();
        }
    }
}

Replace the inner foreach loop with a filter over the server's login collection, so that every login not on your keep-list gets disabled:
$srv.Logins |
    Where-Object Name -notin $logins |
    ForEach-Object Disable
Note that the lookup isn't efficient, because a linear search of the $logins array is performed for each login, but that probably won't matter in practice.
The Where-Object and ForEach-Object calls use the simplified syntax introduced in PowerShell 3.
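Put back into your outer server loop, a minimal sketch could look like this (same servers.txt / users.txt layout as in your script; make sure sa is on the keep-list so it doesn't get disabled too):
# Sketch: disable every enabled login that is NOT on the list (PowerShell 3+ for -notin).
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null

$servers = Get-Content C:\example\servers.txt
$logins  = Get-Content C:\example\users.txt

foreach ($server in $servers)
{
    $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $server

    # Filter the server's logins instead of looping over the keep-list.
    $srv.Logins |
        Where-Object { $_.Name -notin $logins -and -not $_.IsDisabled } |
        ForEach-Object { $_.Disable() }
}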

Related

How to get creation scripts on SQL Server - Automated (DDL Events, PowerShell)

I'm facing an issue getting the creation scripts for all objects in the database. We want to save them as an initial version (like the first version of everything), so that if we alter something or it goes wrong somehow, we have the previous objects' scripts and can roll back. We are also using DDL events to keep a history of modifications and so on.
I could get part of the solution with Object_Definition(), but unfortunately that is limited and I can't get any result for table definitions... And I need everything that is created with the table scripts (constraints, indexes and so on), exactly like when we script it manually in SQL Server Management Studio.
I found some PowerShell scripts and have been trying to piece together what I need:
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
$logFile = "C:\Users\User\Documents\SQL\Table Scripts\Scripts_DBTables.sql"
$DB = "DBTables"
# Create an SMO connection to the instance
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') "Server"
$dbs=$s.Databases
# Obj to use scripter()
$scrp = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($s)
# scripter() obj Options Property
$scrp.Options.AppendToFile = $True
$scrp.Options.ClusteredIndexes = $True
$scrp.Options.DriAll = $True
$scrp.Options.ScriptDrops = $false
$scrp.Options.IncludeHeaders = $True
$scrp.Options.ToFileOnly = $True
$scrp.Options.Indexes = $True
$scrp.Options.WithDependencies = $True
$scrp.Options.FileName = $logFile
#$scrp.Script($dbs["$DB"].Tables)
$tablearray = @()
foreach ($item in $dbs["$DB"].Tables)
{
    $tablearray += @($item)
}
$scrp.Script($tablearray)
With this I can get part of what I need, which is the table creation scripts, but I keep getting the same error:
Could not load file or assembly 'Microsoft.SqlServer.Dmf.Common,
Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or
one of its dependencies. The system cannot find the file specified.
I've tried to find the DLL that matches the version in the error, but I could only find Version=16.100.0.0 in the folder C:\Program Files (x86)\Microsoft SQL Server Management Studio 18\Common7\IDE.
Has anyone done this kind of work before, or does anyone have ideas on how to solve the issue?
Thanks a lot in advance.
The dbatools third-party module can generate scripts very easily. You can pass most of its objects to Export-DbaScript and you will get a script for that object.
Get-DbaDbTable -SqlInstance DEV01 -Database Test1 | Export-DbaScript
To install the module, use
Install-Module dbatools
or, for the current user only,
Install-Module dbatools -Scope CurrentUser
To monitor for changes, you may want to consider either SQL Audit, or some other third-party tool, for example Redgate SQL Compare.
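To write the generated script straight to a file rather than the console, something along these lines should work (a sketch; the file parameter name has changed between dbatools releases, so check Get-Help Export-DbaScript on your version):
# Sketch: script every table in the database into a single .sql file with dbatools.
Get-DbaDbTable -SqlInstance DEV01 -Database Test1 |
    Export-DbaScript -FilePath 'C:\Users\User\Documents\SQL\Table Scripts\Scripts_DBTables.sql'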

Replace Database connection string using PowerShell

I am working on a script that should display the current database a program is connected to, and also give the option to replace it with a new database (either by manually entering the database name or, preferably, by listing all databases on the local server\instance and giving the user a numbered selection to pick the database they want to use).
The connection string is saved in a file called server.exe.config. Below is an example of what the file contains; it's not the only data in the config file, obviously:
<add key="persistency.connection" value="data source=MyDatabaseServer;initial catalog=MyDatabase1;Integrated Security=True" />
I can use Get-Content to see the entire file and Where-Object {$_ -like "*initial catalog=*"} to see the only line which has the database configuration.
But I think this will be difficult for users to interpret, so if possible I need a command that displays just the database name from the config file instead of the entire line, and saves that name for later replacement when the user selects a new database to write into the config file.
Possible?
While plain-text processing, as shown in your own answer, works in simple cases, for robustness it is usually preferable to use XML parsing, such as via the Select-Xml cmdlet:
$file = './server.exe.config'
# Extract the XML element of interest, using an XPath query
$xpathQuery = '/configuration/appSettings/add[@key = "persistency.connection"]'
$el = (Select-Xml $xpathQuery $file).Node
# Replace the database name in the element's value...
$newDatabase = 'NewDatabase'
$el.value = $el.value -replace '(?<=\binitial catalog=)[^;]+', $newDatabase
# ... and save the modified document back to the file.
# IMPORTANT: Be sure to use a *full* path to the output file,
# because PowerShell's current directory (location)
# usually differs from .NET's; Convert-Path ensures that.
$el.OwnerDocument.Save((Convert-Path $file))
Note that this technique works in principle for any XML file, not just .exe.config files.
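As a usage example, the same XPath query can also be used to just display the currently configured database name, which was the first part of the question (a sketch reusing the element and regex from above):
# Sketch: read the config and show only the database name from the connection string.
$file = './server.exe.config'
$el = (Select-Xml '/configuration/appSettings/add[@key = "persistency.connection"]' $file).Node
if ($el.value -match '\binitial catalog=([^;]+)') {
    "Current database: $($Matches[1])"
}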
I think this is simpler and works for me
$ConnString = Get-Content -Path $filepath\server.exe.Config | Where-Object {$_ -like "*initial catalog=*"}
$ConnString -split '\s*;\s*' | ForEach-Object -Process {
    if ($_ -imatch '^initial catalog=(?<dbname>.+)$') {
        $ConnStringdb = $Matches.dbname
    }
}
I am then using the following to replace the database name in the config file after a new name is entered manually. I'd love to get it to work automatically, maybe by building an array or something.
Write-Host "Please Enter the database name you want to use for Production(Don't Leave Blank): " -NoNewline; $NewDatabaseConfig = Read-Host;
(Get-Content -Path $filepath\server.exe.Config -Raw) -Replace $ConnStringdb,$NewDatabaseConfig | Set-Content -Path $IPSServerProd\UBERbaseOfficeServer.exe.Config
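For the "number selection" part, one way to list the databases on the local instance and let the user pick one by number is a sketch like this (uses SMO; the instance name 'localhost' and the exclusion of system databases are assumptions):
# Sketch: enumerate user databases on the local instance and prompt for a numbered choice.
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'localhost'
$dbNames = $srv.Databases | Where-Object { -not $_.IsSystemObject } | Select-Object -ExpandProperty Name

for ($i = 0; $i -lt $dbNames.Count; $i++) {
    Write-Host ("{0}: {1}" -f ($i + 1), $dbNames[$i])
}
$choice = [int](Read-Host "Enter the number of the database to use")
$NewDatabaseConfig = $dbNames[$choice - 1]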

How to access a key in an array retrieved from AD with PowerShell

First the question; afterwards I'll share the idea I'm working on.
I'm receiving some information from AD, and one of the properties comes back as an array.
I want to extract this info so I can use it in the next part of my project.
The script:
Get-ADUser foo -Properties * | select name, department, manager
This returns a table; I'll simplify it for readability:
name -> foo
department -> bar
manager -> CN=foo, OU=bar, OU=fubar, OU=foobar
**Disclaimer: I'm from Brazil, so the data may look a bit different for you if you try to reproduce this.
I want to extract the value "foo" from the table above, though even "CN=foo" would be acceptable.
Finally, the bigger picture:
My idea is to create an automation that takes AD's data and, via the shell, puts it into MS Word.
There I have some fields with autocompletion, and afterwards I'll need to somehow pass the data through the shell.
The complete goal is:
run a script where the user says who they want (by ID), the script finds that person and
opens the Word document with everything already written, instead of having to keep
changing the same document every time someone needs this.
Thank you guys!
I managed to get two commands that do the job after some time:
(get-aduser (get-aduser foobar -Properties manager).manager).name
Get-ADUser -Identity foobar -Properties manager |
    Select-Object -Property @{label='Supervisor';expression={$_.manager -replace ',.*$'}}
Thanks for the help anyway
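If you want just foo rather than CN=foo, one more step on the first RDN does it - a small sketch:
# Sketch: take only the first RDN of the manager DN and strip the leading 'CN='.
$managerDN   = (Get-ADUser foobar -Properties manager).manager
$managerName = ($managerDN -split ',')[0] -replace '^CN='
$managerName   # -> foo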

Insert a large amount of AD data into SQL Server using PowerShell

I have a PowerShell script that pulls in 1.4+ million rows of data and saves it to a HUGE CSV file that then gets imported into an SQL server. I thought there might be a way to have PowerShell insert the data into the SQL server directly but I am not sure how.
One of my concerns is that I don't want to buffer up the AD result into memory and then write them. I'd rather write them in batches of 1000 or something so memory consumption stays down. Get 1000 records, save to SQL server, and repeat...
I see articles about how to get PowerShell to write to an SQL server but they all seem to either do ALL data at one time or one record at a time -- both of which seem inefficient to me.
This is the PowerShell script I have to query AD.
# the attributes we want to load
$ATTRIBUTES_TO_GET = "name,distinguishedName"
# split into an array
$attributes = $ATTRIBUTES_TO_GET.split(",")
# create a select string to be used when we want to dump the information
$selectAttributes = $attributes | ForEach-Object {@{n="AD $_";e=$ExecutionContext.InvokeCommand.NewScriptBlock("`$_.$($_.toLower())")}}
# get a directory searcher to search the GC
[System.DirectoryServices.DirectoryEntry] $objRoot = New-Object System.DirectoryServices.DirectoryEntry("GC://dc=company,dc=com")
[System.DirectoryServices.DirectorySearcher] $objSearcher = New-Object System.DirectoryServices.DirectorySearcher($objRoot)
# set properties
$objSearcher.SearchScope = "Subtree"
$objSearcher.ReferralChasing = "All"
# need to set page size otherwise AD won't return everything
$objSearcher.PageSize = 1000
# load the data we want
$objSearcher.PropertiesToLoad.AddRange($attributes)
# set the filter
$objSearcher.Filter = "(&(objectClass=group)(|(name=a*)(name=b*)))"
# get the data and export to csv
$objSearcher.FindAll() | select -expandproperty properties | select $selectAttributes | export-csv -notypeinformation -force "out.csv"
I use Out-DataTable to convert my object array into a DataTable object type, then use Write-DataTable to bulk insert that into a database (Write-DataTable uses SqlBulkCopy to do this).
Caveats/gotchas for this (SqlBulkCopy can be a nuisance to troubleshoot):
Make sure your properties are the correct type (string for varchar/nvarchar, int for any integer values; dateTime can be a string as long as the format is correct and SQL can parse it).
Make sure your properties are in order and line up with the table you're inserting into, including any fields that auto-fill (incrementing ID key, RunDt, etc.).
Out-DataTable: https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd
Write-DataTable: https://gallery.technet.microsoft.com/scriptcenter/2fdeaf8d-b164-411c-9483-99413d6053ae
Usage
If I were to continue on your example and skip the CSV, this is how I would do it... replace the last two lines with the code below (assuming that your object properties line up with the table perfectly, your SQL server name is sql-server-1, database name is org, and table name is employees):
try {
    Write-DataTable -ServerInstance sql-server-1 -Database org -TableName employees -Data $($objSearcher.FindAll() | Select-Object -ExpandProperty properties | Select-Object $selectAttributes | Out-DataTable -ErrorAction Stop) -ErrorAction Stop
}
catch {
    $_
}
Looking at your code it looks like you come from .NET or some language based on .NET. Have you heard of the cmdlets Get-ADUser / Get-ADGroup? This would simplify things tremendously for you.
As far as the SQL connection goes, PowerShell doesn't have any native support for it. Microsoft has made cmdlets for it though! You just have to have SQL Server installed in order to get them... which is kind of a bummer, since SQL is so heavy and not everyone wants to install it. It is still possible using .NET, it's just not very quick or pretty. I won't be giving advice on the cmdlets here; you can Google that. As far as .NET goes, I would start by reading some of the documentation on the System.Data.SqlClient namespace as well as some historical questions on the subject.
Finally, as you said it would be a good idea to try and avoid overloading your RAM. The big thing here is trying to keep your entire script down to one single AD query. This way you avoid the troubling scenario of data changing between one query and the next. I think the best way of doing this would be to save your results straight to a file. Once you have that you could use SqlBulkCopy to insert into the table straight from your file. The downside to this is that it doesn't allow for multiple AD Properties. At least I don't think SqlBulkCopy will allow for this?
Get-ADUser "SomeParamsHere" | Out-File ADOutput.txt
If you have to have multiple AD properties and still want to keep the RAM usage to a minimum...well I toyed around with a script that would work but makes a few calls that would read from the entire file, which defeats the whole purpose. Your best option might be to save each property to a separate file then do your whole write DB thing. Example:
New-Item Name.txt
New-Item DistinguishedName.txt
Get-ADUser "SomeParamsHere" -Properties Name,DistinguishedName | ForEach-Object {
    Add-Content -Path "Name.txt" -Value $_.Name
    Add-Content -Path "DistinguishedName.txt" -Value $_.DistinguishedName
}
Store the results from your last line of code in a variable instead of exporting them to CSV.
Then create groups of the size you want.
Use Out-DataTable and Write-DataTable to write to SQL - links are in nferrell's answer.
$res = $objSearcher.FindAll() | select -ExpandProperty properties | select $selectAttributes
$counter = [pscustomobject] @{ Value = 0 }
#create groups with 1000 entries each
$groups = $res | Group-Object -Property { [math]::Floor($counter.Value++ / 1000) }
foreach ($group in $groups) {
    #convert to data table
    $dt = $group.Group | Out-DataTable
    $dt | Write-DataTable -Database DB -ServerInstance SERVER -TableName TABLE
}
You're making this unnecessarily complicated.
If I read your code correctly, you want all groups starting with 'a' or 'b'.
# the attributes we want to export
$attributes = 'name', 'distinguishedName'
Import-Module ActiveDirectory
Get-ADGroup -Filter {(name -like "a*") -or (name -like "b*")} -SearchBase 'dc=company,dc=com' |
select $attributes | Export-Csv -NoTypeInformation -Force "out.csv"
Instead of using Export-Csv at the end, just pipe the output to the command which creates the SQL rows. By piping objects (instead of assigning them to a variable) you give PowerShell the ability to handle them efficiently (it will start processing objects as they come in, not buffer everything).
Unfortunately I can't help you with the SQL part.
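Combining this with the Out-DataTable / Write-DataTable approach from the earlier answer, a hedged sketch of the streaming, batched insert could look like this (server, database, and table names are placeholders; the two helper functions are assumed to be dot-sourced from the gallery scripts linked above):
# Sketch: stream groups from AD and flush them to SQL in batches of 1000.
$batch = New-Object System.Collections.Generic.List[object]

Get-ADGroup -Filter {(name -like "a*") -or (name -like "b*")} -SearchBase 'dc=company,dc=com' |
    Select-Object name, distinguishedName |
    ForEach-Object {
        $batch.Add($_)
        if ($batch.Count -ge 1000) {
            Write-DataTable -ServerInstance sql-server-1 -Database org -TableName groups -Data ($batch | Out-DataTable)
            $batch.Clear()
        }
    }

# Flush whatever is left after the pipeline finishes.
if ($batch.Count -gt 0) {
    Write-DataTable -ServerInstance sql-server-1 -Database org -TableName groups -Data ($batch | Out-DataTable)
}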

How to know the GUID of a Database?

I followed this step-by-step guide http://technet.microsoft.com/en-us/library/ff681014.aspx to reset the User Profile Synchronization Service, but I need the GUID of the synchronization database.
I searched a lot but didn't find anything. I also need to find the GUID of the service.
Thank you
You can use PowerShell to do this the way MS describes in the TechNet article, or the smart way:
$syncDBType = "Microsoft.Office.Server.Administration.SynchronizationDatabase"
$upaSAType = "User Profile Service Application"
$syncDB = Get-SPDatabase | where-object {$_.Type -eq $syncDBType}
$upa = Get-SPServiceApplication | where-object {$_.TypeName -eq $upaSAType}
$syncDB.Unprovision()
$syncDB.Status = "Offline"
$upa.ResetSynchronizationMachine()
$upa.ResetSynchronizationDatabase()
$syncDB.Provision()
restart-service SPTimerV4
So we don't actually look for the GUID; we look for the database whose type is the sync database type. You can find more troubleshooting gems like this on Harbar's site: "Stuck on Starting": Common Issues with SharePoint Server 2010 User Profile Synchronization
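If you do still need the actual GUID (for example to follow the TechNet steps literally), the database object found above exposes it as its Id property - a quick sketch:
# Sketch: print the GUID (and name) of the synchronization database found above.
$syncDB = Get-SPDatabase | Where-Object {$_.Type -eq "Microsoft.Office.Server.Administration.SynchronizationDatabase"}
$syncDB.Id    # the GUID the TechNet article asks for
$syncDB.Name  # the database name, for reference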
