Get Azure Active Directory export from GitHub Action - azure-active-directory

Background
I previously created a PowerShell script that accessed my company's Active Directory and exported the results as a CSV:
get-aduser -filter * -properties "whenCreated","DisplayName","Department","Enabled","mobile","MobilePhone","Name","Office","Title" | export-csv -path adexport.csv
To use this command, I had to install some cmdlets with:
Get-WindowsCapability -Name RSAT.ActiveDirectory* -Online | Add-WindowsCapability -Online
Use GitHub Action
I want to automate this script with a GitHub Action. I set up the connection between GitHub and Azure following this documentation. I created the following, simplified workflow:
name: AzureLoginSample
on: push
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Log in with Azure
        uses: azure/login@v1
        with:
          creds: '${{ secrets.AZURE_CREDENTIALS }}'
          enable-AzPSSession: true
      - name: Azure PowerShell Action
        uses: Azure/powershell@v1
        with:
          inlineScript: |
            Get-AzADUser | export-csv -path adexport.csv
          azPSVersion: 3.1.0
When the workflow runs, the first step (log in) works just fine, but the second step fails because:
Get-AzADUser: /home/runner/work/_temp/415ec269-1cff-4c50-8035-c1e5181e0412.ps1:2
Line |
2 | Get-AzADUser | export-csv -path adexport.csv
| ~~~~~~~~~~~~
| Insufficient privileges to complete the operation.
I feel like I have the necessary permissions on the Azure side of things; the Azure application has reader and contributor permissions. I know with my original PowerShell script, I had to run as an admin - is there a way to do this with my Azure PowerShell script?
Previous Attempt
I tried to copy and paste the original PowerShell command, but the cmdlet get-aduser could not be found. When I tried to create a separate step to install the cmdlets, I got another "cmdlet could not be found" error.
Thank you in advance and let me know if you need any clarifications.

Related

Xp_CmdShell Powershell Script No Valid Module Found

I'm trying to run a PS1 file in T-SQL using XP_CMDSHELL, like so:
exec xp_cmdshell 'powershell -ExecutionPolicy bypass -command "C:\Users\sleven\Documents\DimAcctImport.ps1"'
The powershell script is as follows:
import-module dbatools
Import-DbaCsv -SqlInstance 'MSSQL' -Database 'Test' -Table 'Account' -Path "R:\Data\Account.csv" -Delimiter ',' -Quote '"' -KeepNulls -NoProgress
This script uses the cmdlet Import-DbaCSV of module DbaTools to import the CSV to the target table.
Here is the error I receive in SSMS:
import-module : The specified module 'dbatools' was not loaded because no valid module file was found in any module
directory.
At C:\Users\sleven\Documents\DimAcctImport.ps1:1 char:1
+ import-module dbatools
+ ~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ResourceUnavailable: (dbatools:String) [Import-Module], FileNotFoundException
+ FullyQualifiedErrorId : Modules_ModuleNotFound,Microsoft.PowerShell.Commands.ImportModuleCommand
The module is installed and runs as expected if I copy the PowerShell script and run it in PS ISE.
I'm using SQL Developer Edition on the same PC as I'm using to run the sp - my local PC.
What am I missing?
EDIT: Adding output from Get-Module and $PSVersionTable
PS> (Get-Module -ListAvailable dbatools).Path
C:\Users\sleven\Documents\WindowsPowerShell\Modules\dbatools\0.9.834\dbatools.psd1
PS> $PSVersionTable

Name                           Value
----                           -----
PSVersion                      5.1.18362.145
PSEdition                      Desktop
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
BuildVersion                   10.0.18362.145
CLRVersion                     4.0.30319.42000
WSManStackVersion              3.0
PSRemotingProtocolVersion      2.3
SerializationVersion           1.1.0.1
Using the Import-Module cmdlet with a module name tells PowerShell to look for the module only in certain folders. You can check these folders by inspecting the PSModulePath environment variable (splitting added for better visibility):
$env:PSModulePath -split ';'
When PowerShell is started by xp_cmdshell, the folder containing dbatools is not included in PSModulePath (this can also happen when a different account is used, since your module currently lives in your profile folder). You have two options:
Reference the module by its path rather than its name. You can get the path using (Get-Module -ListAvailable dbatools).Path:
# Replace the path with the path you found with Get-Module
Import-Module 'C:\path\to\module\dbatools.psd1'
Modify your $env:PSModulePath. As that topic is broader, let me give you the link to the docs. Remember that you should set that variable for the account running xp_cmdshell; as RThomas mentioned, by default that is not your user account. See the linked answer for an explanation, and the sketch after the note below.
NOTE (credits go to @David Browne - Microsoft from his comments):
If you're going to set the environment variable, it should be a system environment variable, as you wouldn't want this process to break if you change the SQL Server service account. And remember, setting a system environment variable requires a reboot for services to see the change.
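A minimal sketch of that approach, reusing the module folder from the question's Get-Module output (the exact path is an assumption about your layout; adjust it, and remember the reboot note above):
# Append the per-user module folder to the machine-level PSModulePath so the
# account behind xp_cmdshell can resolve dbatools
$moduleRoot  = 'C:\Users\sleven\Documents\WindowsPowerShell\Modules'
$machinePath = [Environment]::GetEnvironmentVariable('PSModulePath', 'Machine')
if ($machinePath -notlike "*$moduleRoot*") {
    [Environment]::SetEnvironmentVariable('PSModulePath', "$machinePath;$moduleRoot", 'Machine')
}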
It's likely an account issue. Keep in mind that running xp_cmdshell by default runs everything as the service account behind the SQL Server instance. So you'll want to verify what account this is.
If it's a built in system account this can cause strange behavior when it comes to rights.
If it's a Windows or a domain account, then you'll want to test outside SQL Server by running the PS file not as yourself but as the same account SQL Server uses as its service account. You can do this easily by opening a cmd shell with Shift + right-click and specifying the other account.
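A rough sketch of that test, assuming a hypothetical 'DOMAIN\sqlsvc' service account and the script path from the question:
# Launch the script under the SQL Server service account to reproduce what xp_cmdshell sees
$svcCred = Get-Credential 'DOMAIN\sqlsvc'
Start-Process -FilePath 'powershell.exe' -Credential $svcCred `
    -ArgumentList '-ExecutionPolicy', 'Bypass', '-File', 'C:\Users\sleven\Documents\DimAcctImport.ps1'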
Your other option is to set up a proxy account for the xp_cmdshell call to use. Instructions on how to do this can be found in Microsoft documentation.
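For completeness, a hedged sketch of that option; the proxy login and password here are placeholders, and the proxy is only used for non-sysadmin callers of xp_cmdshell:
# Create the proxy credential (##xp_cmdshell_proxy_account##) on the instance named in the question
Invoke-Sqlcmd -ServerInstance 'MSSQL' -Query "EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\proxyuser', 'StrongPasswordHere';"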

TFS Run batch script on a remote server with admin permission

I currently have Server A, which is where my TFS instance and build agent are located. I then have Server B, which is where my source code sits. I am trying to set up a build definition that copies files from one location on Server B to another and then builds the solution.
However, when I run this batch file as part of a build definition, it does not create folders where they need to be. I believe this is due to the agent not having the correct permissions.
Is there a way to run the batch script with admin permissions from a build definition?
You can try the workarounds below:
Convert the batch script to a PowerShell script, then copy the PowerShell script to the target machine and use the PowerShell on Target Machines task to run it. You can enter your admin user and password in the task.
Add a PowerShell task and run the script below to call cmd.exe and run the batch script with an admin user account on the target machine (copy the batch script to the target machine first; in the sample below I copied it to C:\Scripts\Test.bat):
Param(
    [string]$computerName = "v-tinmo-12r2"
)
$Username = "Domain\user"
$Password = ConvertTo-SecureString "password-here" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($Username, $Password)
# Run the batch file remotely under the supplied admin credentials
Invoke-Command -ComputerName $computerName -Credential $cred -ErrorAction Stop -ScriptBlock {Invoke-Expression -Command:"cmd.exe /c 'C:\Scripts\Test.bat'"}

SQL installed but powershell not picking up sqlps commands (like invoke-sqlcmd)

I installed ms sql server with chocolatey:
choco install SQLServer2012DeveloperEditionWithSP1 -y -f -source 'http://choco.developers.tcpl.ca/chocolatey' -c "$env:WINDIR\temp"
SQL Server seems to be installed and working well outside of PowerShell, but it doesn't work inside PowerShell. I can see the sqlps module with:
Get-Module -listavailable
...

ModuleType Version    Name                                ExportedCommands
---------- -------    ----                                ----------------
Manifest   1.0        SQLASCMDLETS
Manifest   1.0        SQLPS
The commands seem to be missing, though; I don't have Invoke-Sqlcmd etc. In theory I should get access to them if I import the module, but when I try Import-Module SQLPS I get an error about not having a SQLSERVER drive:
PS C:\WINDOWS\system32> Import-Module SQLPS
Set-Location : Cannot find drive. A drive with the name 'SQLSERVER' does not exist.
At C:\Program Files (x86)\Microsoft SQL Server\110\Tools\PowerShell\Modules\SQLPS\SqlPsPostScript.ps1:1 char:1
+ Set-Location SQLSERVER:
+ ~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (SQLSERVER:String) [Set-Location], DriveNotFoundException
+ FullyQualifiedErrorId : DriveNotFound,Microsoft.PowerShell.Commands.SetLocationCommand
I know several people in my group who went through these steps and did get the correct sql ps setup working.
Any tips or ideas would be very helpful. Thanks.
Good day,
I am guessing that you are using SQL Server 2017, since this is a common issue in 2017, as I will explain below. I am not sure which version is in use, since this question is a bit old (it was asked on May 2 '17).
The error that you get includes the basic issue
Set-Location : Cannot find drive. A drive with the name 'SQLSERVER' does not exist.
It does not say that the module 'SQLPS' does not exist, but that the drive 'SQLSERVER' does not exist.
The explanation is that up to SQL Server 2016 the SQLPS module was included with the SQL Server installation, but the PowerShell module we use today is the 'SqlServer' module. The 'SqlServer' module was included with SQL Server Management Studio (SSMS) 16.x, but if you are using SSMS 17.x then the 'SqlServer' module must be installed from the PowerShell Gallery.
The procedure to install it is to execute the command:
Install-Module -Name SqlServer
If you get an error like PackageManagement\Install-Package : The following commands are already available on this system:...
Then you can force the installation using the parameters -Force and -AllowClobber.
Since I am not familiar with your system, I will NOT advise you on whether you should force the installation, but this is the solution I would probably use in most cases like this (according to the information I noticed in this thread):
Install-Module -Name SqlServer -Force -AllowClobber
In order to confirm that the module is installed, you can execute the following command:
Get-Module -Name SqlServer -listAvailable | select Name, ModuleType, Version
Check the version of your installation using the command above, and use it in the following command to import the newest version (at the time of writing this answer the version is 21.0.17279):
Import-Module SqlServer -Version 21.0.17279
That is all... If all went well then you should be able to use all the SQL Server PowerShell commands
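As an optional sanity check (not part of the original answer; the 'localhost' instance name is just an assumption), you can confirm the cmdlets are now visible:
# List a couple of cmdlets from the freshly imported module and run a trivial query
Get-Command -Module SqlServer -Name Invoke-Sqlcmd, Get-SqlDatabase
Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'SELECT @@VERSION AS SqlVersion'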
Just a side note for future readers: I was trying to create a SQL backup via PowerShell. The cmdlets ran as they should under an administrator account; however, running the script under a regular user account, I got the following error:
Cannot find a provider with the name 'SqlServer'
Googling that question brought me here, but the answer to my issue was in a forum post here:
https://social.technet.microsoft.com/Forums/windowsserver/en-US/626bb81a-00ba-4239-ad0a-fec32546350a/check-if-drive-exists-if-not-map?forum=winserverpowershell
I encountered a weird issue and hope that somebody may have a fix for this.
Microsoft SQL Server 2014 (SP2-GDR) (KB4505217) - 12.0.5223.6 (X64) (yeah I know...it's a dev server)
Windows 2012 R2
PSVersion 4.0
I load the SQL assemblies in the PS script
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
The following will fail if I run this immediately after loading the assemblies
Get-ChildItem SQLSERVER:\SQL\$env:COMPUTERNAME
Get-ChildItem : Cannot find drive. A drive with the name 'SQLSERVER' does not exist.
If I run the following first, I'm fine.
Invoke-Sqlcmd -Query "SELECT TOP 1 * FROM sys.sysobjects" | Out-Null
Get-ChildItem SQLSERVER:\SQL\$env:COMPUTERNAME
I just wanted to add that I only need to run
Invoke-Sqlcmd | Out-Null
This seems to then fully load the assemblies into memory and everything is OK after.
So I added this to my script:
# The SQL cmdlets below need some DLL imports from the system
# These sometimes do not get loaded when running under a non-admin account (Cannot find a provider with the name 'SqlServer')
# Running this dummy command seems to load all needed dlls
# Also see: https://www.sqlservercentral.com/forums/topic/unable-access-sql-provider-in-powershell-without-running-an-invoke-sqlcmd-first
Invoke-Sqlcmd | Out-Null
This seems like valuable information that shouldn't get lost, so I thought I'd post it in the highest-ranking SO question that comes up when googling for that particular error.
One could argue that installing the module rather than this dummy method would be cleaner.
As pointed out, here are the links that you should refer to now.
MSDN Link
Running SQL Server Powershell
Cannot find path 'SQLSERVER' Issue
There is an answer given by Jarret. Simply loading the module actually won't help; this set of commands has to run after that.
Push-Location
# $sqlpsPath is the folder containing the SQLPS snap-in files (see the linked answer for how to locate it)
cd $sqlpsPath
Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100
Update-TypeData -PrependPath SQLProvider.Types.ps1xml
Update-FormatData -PrependPath SQLProvider.Format.ps1xml
Pop-Location
Hope it helps.

Powershell running as administrator refuses to redirect STDOUT to file

As an exercise, I am trying to make a web control panel for Hyper-V. It will use Powershell to interact with Hyper-V.
The first thing I tried to do was use the Get-VM command to create a table with the status of all the VMs on the local machine. I ran Get-VM from PHP:
$output = `powershell Get-VM`;
$output became NULL. After a lot of troubleshooting, I realized that Get-VM doesn't give output unless Powershell is running as an administrator (I was running the Powershell instance that I was testing with as administrator, but the one that was created by PHP was not run as administrator).
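One way to confirm which of the two PowerShell processes is actually elevated (not from the original post, just the standard WindowsPrincipal check) is:
# Returns $true when the current PowerShell process is running elevated
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
$principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)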
Then, I went to the powershell.exe file, right-clicked on it, went to Properties -> Compatibility -> Change settings for all users, and checked "Run as administrator". This should make it run as administrator no matter where it is started from, and it appears that it does exactly that in this case as well. However, once Powershell is running as an administrator, it does not output to a file, even if told to do so by the > operator.
If I run powershell Get-VM > out.txt, while powershell is running as an administrator, it creates the file out.txt, but it is 0kb and contains nothing. powershell Get-Process > out.txt has the same result, a 0kb file.
Running powershell Get-Process > out.txt with a non-elevated powershell.exe yields the expected result, namely the output from Get-Process in out.txt. However, you can't do this with powershell Get-VM > out.txt, since Get-VM returns no data unless powershell is run as administrator.
To wrap it up, my question is why does Powershell refuse to redirect its STDOUT to a file if it is running as administrator, but not if it is running as a normal user?
My diligent father has solved the problem: instead of using redirection operators, pipe the output into PowerShell's own Out-File or Export-Csv, so that the command becomes powershell Get-VM | export-csv out.csv. I then parse this in PHP and convert it to a table. Works like a charm.
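For reference, a hedged version of that call with explicit quoting and -NoTypeInformation (those details are my assumptions, not part of the original answer):
# Invoked from PHP; writes the VM list to out.csv, which PHP then parses into a table
powershell.exe -NoProfile -Command "Get-VM | Export-Csv -Path out.csv -NoTypeInformation"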

Deploying database using TFS Deployer and SqlPackage

I'm trying to make a PowerShell script that uses SqlPackage to deploy a database through the TFS Deployer service. The script works if it is executed directly from the command line, but it fails when TFS Deployer tries to execute it; the same user account is used in both cases.
The service is running in test mode (TfsDeployer -d) as a console application, but it also fails when running as a Windows service.
This is the log file (captured output of SqlPackage and the exception caught in the PowerShell script):
11/18/2012 17:51:49 | Publishing to database 'databaseName' on server 'machineName'.
11/18/2012 17:51:56 | An error occurred while the batch was being executed.
11/18/2012 17:51:56 | System.Management.Automation.RemoteException: *** Could not deploy package.
This is the only information I was able to collect; the error code (HRESULT) was not present in the caught exception.
PowerShell script goes like this:
try {
    $cmd = Join-Path (Get-Item "Env:ProgramFiles(x86)").Value "Microsoft SQL Server\110\DAC\bin\SqlPackage.exe"
    $src = Join-Path $source "db.dacpac"
    $cfg = Join-Path $source "db.publish.xml"
    &$cmd /Action:Publish /SourceFile:$src /Profile:$cfg 2>&1 | ForEach-Object -Process {
        Write-Log $_
    }
}
catch [Exception] {
    Write-Log $_.Exception.ToString()
    if ($_.Exception.HResult) {
        Write-Log "Code: $($_.Exception.HResult.ToString('X'))"
    }
}
