Command Get-DbaOperatingSystem: can a timeout be set on this command? - dbatools

Hi DBATOOLS Community,
I'm using this great DBA PowerShell module to put in place a report script that gathers several metrics from all the SQL Servers I have in the company.
But sometimes the Get-DbaOperatingSystem command just gets stuck without any output for hours! (I just had to stop a run of my script after nearly 2 hours spent on this command alone!) So my questions are:
1. Do you have any idea why this command gets stuck?
2. Can a timeout be specified for this command, and how can it be set?
Many thanks for your answers.
$OsVersion = Get-DbaOperatingSystem -ComputerName $servername | Select-Object ComputerName, Version
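As far as I know the cmdlet itself does not expose a timeout parameter, but one way to keep a hung call from blocking the whole report is to run it in a background job and give up after a fixed wait. This is only a minimal sketch, assuming dbatools is installed on the machine running the script (so it auto-loads inside the job) and $servername is already defined; the 120-second limit is just an example:
# Sketch: run Get-DbaOperatingSystem in a background job so a hung remote call can be abandoned after a timeout.
$job = Start-Job -ScriptBlock {
    param($server)
    Get-DbaOperatingSystem -ComputerName $server | Select-Object ComputerName, Version
} -ArgumentList $servername
if (Wait-Job -Job $job -Timeout 120) {
    # The job finished within 120 seconds; collect its output.
    $OsVersion = Receive-Job -Job $job
} else {
    # Timed out: stop the job and move on to the next server.
    Stop-Job -Job $job
    Write-Warning "Get-DbaOperatingSystem timed out for $servername"
}
Remove-Job -Job $job -Force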

Related

Getting an error when trying to execute powershell code via a SQL Agent job

I've set up a SQL Server 2016 Agent job that has 2 steps, both executed by the same service account. Step 1 executes an SSIS package that writes 2 files out to a share. Step 2 executes embedded PowerShell code that combines these 2 files into a single file on the same share where the 2 files were created by SSIS. When I execute the job, Step 1 (the SSIS package) completes successfully and creates the 2 files on this share, but when Step 2 executes I get the error below. Just a point of info: when I run this script locally from my Windows 10 box, I have no issues.
I guess my question is: if SSIS can execute successfully with the same service account, could this be a permissions issue for PowerShell scripts? I'm not a server admin, so if this is a stupid question, please forgive my ignorance. Any help/direction anyone can provide would be appreciated.
Message
Executed as user: XXX\sqlsvc02. The job script encountered the following errors. These errors did not stop the script: A job step received an error at line 12 in a PowerShell script. The corresponding line is 'Get-Content $pathCounts, $pathDetails | Set-Content $outFile'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Cannot find path '\xxx\shared\MonthlyReports\BoundPoliciesAddedAtFaultEndorsementWithin10Days_Counts.csv' because it does not exist. ' A job step received an error at line 12 in a PowerShell script. The corresponding line is 'Get-Content $pathCounts, $pathDetails | Set-Content $outFile'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Cannot find path '\xxx\shared\MonthlyReports\BoundPoliciesAddedAtFaultEndorsementWithin10Days_Details.csv' because it does not exist. '. Process Exit Code 0. The step succeeded.
Here is my powershell script:
# Define variables
$path = '\\xxx\shared\MonthlyReports'
$pathCounts = Join-Path -Path $path -ChildPath 'BoundPoliciesAddedAtFaultEndorsementWithin10Days_Counts.csv'
$pathDetails = Join-Path -Path $path -ChildPath 'BoundPoliciesAddedAtFaultEndorsementWithin10Days_Details.csv'
$date = Get-Date
$dateStr = $date.ToString("yyyyMM")
# Define combined file name
$outFile = Join-Path -Path $path -ChildPath ("BoundPoliciesAddedAtFaultEndorsementsWithin10Days_" + $dateStr + ".csv")
# Execute combine files code
Get-Content $pathCounts, $pathDetails | Set-Content $outFile
Here is an image of what Step #2 looks like:
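Since the error says the two CSV paths cannot be found when the Agent step runs, a hedged diagnostic addition (reusing the variable names from the script above) is to log which identity the step runs under and whether the UNC paths are visible before attempting the combine:
# Diagnostic sketch: confirm the identity of the Agent step and the visibility of the input files.
Write-Output "Running as: $([Security.Principal.WindowsIdentity]::GetCurrent().Name)"
foreach ($p in @($pathCounts, $pathDetails)) {
    if (-not (Test-Path -LiteralPath $p)) {
        Write-Warning "Missing or inaccessible: $p"
    }
}
# Only combine when both inputs are reachable
if ((Test-Path -LiteralPath $pathCounts) -and (Test-Path -LiteralPath $pathDetails)) {
    Get-Content $pathCounts, $pathDetails | Set-Content $outFile
}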

read data for every 100 days until we get the complete data in Hive

I am copying data from prod to test for testing purposes in Hive using a bash script. When I do this for one table, I run into a memory heap issue.
To solve this, I am planning to read the data backwards from the run date (the day I execute the script) to the earliest available data, in chunks of 100 days, to avoid the issue. Can you please let me know how to achieve this using bash, and please let me know if there is any other approach besides increasing the memory.
You basically need to run a HiveQL (.hql) script from the shell.
Create a .hql script with your query that pulls only the last 100 days of data.
example.hql
select * from my_database.my_table
where insert_date BETWEEN '2018-07-01' AND '2018-10-01';
Now you can call this script from hive shell:
hive -f example.hql
Or you can create a shell script and execute your query in it.
run.sh
#!/bin/bash
# Run the query, write the result to a file, and capture hive's exit status.
hive -e "select * from my_database.my_table
where insert_date BETWEEN '2018-07-01' AND '2018-10-01'" > select.txt
result=$?
if [ $result -ne 0 ]; then
    echo "Error!!!!"
    echo "Hive error number is: $result"
    exit 1
else
    echo "no error, do your stuff"
fi
Then execute your shell script with sh run.sh.

Running Powershell Script from SQL Server Agent

How different does the programming need to be when you execute a PowerShell script from SQL Server Agent? I have been seeing very weird behavior.
Any object call fails.
Can't we use PowerShell functions in these scripts? The parameters come through empty if we call a function with an object parameter.
Some commands just don't print messages, even though I use a variable hard print or Write-Output.
I just want to know why this is so different. I have a big script that automated a big manual task; it works with no errors at all when run manually.
Please help me with this.
Object: $agobj = $PrimarySQLConnObj.AvailabilityGroups[$AGName]
Error from Agent History:
The corresponding line is ' Add-SqlAvailabilityDatabase -InputObject $agobj -Database "$db" -ErrorAction Stop '. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Cannot bind parameter 'InputObject'. Cannot convert the "[AG]" value of type "Microsoft.SqlServer.Management.Smo.AvailabilityGroup" to type "Microsoft.SqlServer.Management.Smo.AvailabilityGroup"

Net User in PowerShell

I am in the middle of moving to the cloud, migrating from SBS 2003 Active Directory to 2008 R2.
I configured a new user, and noticed that the user was unable to reset their password.
My Server Admin showed me how to use net user.
I noticed that I can obtain information from some accounts and not others. With over 100 accounts to process, I thought I'd try PowerShell.
In this post (Use powershell to look up 'net user' on other domains?) Lorenzo recommends using Get-ADUser (and this applies to polling from another domain). When I run Get-ADUser from my PowerShell prompt, I receive a message stating that the cmdlet is not recognized.
I am reading the user IDs from a text file, and sending the output to a log file so that I can send to the server admin for further analysis.
Here is my code so far (please note that I am completely new to PowerShell):
# Get our list of user names from the local staff.txt file
$users = get-content 'C:\Scripts\staff.txt'
# Create log file of output:
$LogTime = Get-Date -Format 'MM-dd-yyyy_hh-mm-ss'
$CompPath = "C:\Scripts\"
$CompLog = $CompPath + "NetUserInfo" + $LogTime + ".txt"
New-Item -path $CompLog -type File
foreach ($user in $users) {
    # Testing user:
    "Testing user: $user" | Out-File $CompLog -Append
    # Obtain user information using net user:
    net user $user /domain >> $CompLog
    # Pause to let the system gather information:
    Start-Sleep -Seconds 15
}
As the script runs currently, my log file will have two or three user names followed by the response "The request will be processed at a domain controller for domain (domain)".
If net user, from CMD, would return "System error 5 has occurred. Access is denied.", this is not logged in the output file. If net user, from CMD, would return user information, it is logged to the output file. I am currently receiving output for only a couple of users, but when I run the command from CMD, I am able to retrieve information for at least ten.
My first thought was that I needed to wait for the net user command to complete (hence the Start-Sleep command) but that has not had any effect on the output.
Any assistance would be greatly appreciated.
Sorry, I don't have enough reputation to add a comment.
When you run programs in PowerShell (such as net user ... or ping) they should run exactly the same as they would in the normal command prompt (cmd).
If I understand correctly, you're getting different results when you (effectively) run the same thing in PowerShell and in Command Prompt. Is that right?
Are you using the ISE to build your script? If so you can set Breakpoints that will pause the script and allow you to see what the variables are. It could be that the $user variable isn't holding what you think it should be, or that the value doesn't match the name for a domain user account.
EDIT:
What happens when you run the net user ... command interactively (i.e. not as a script, but manually) in PowerShell? Do you get an error? If so, what does the error say?
Additional related stuff:
You shouldn't need the Start-Sleep, as the commands are run in order and the next line shouldn't execute until the previous one has completed.
Also, which version of Powershell are you using?
(You can check by running $host.version.Major)
The Get-ADUser cmdlet requires version 3 (I believe) and also needs the Active Directory module to be imported.
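For reference, a minimal sketch of that Get-ADUser route, assuming the RSAT ActiveDirectory module is installed and reusing the staff.txt list from the question (the output file name here is only an example):
# Sketch: query the same user list with Get-ADUser instead of net user.
Import-Module ActiveDirectory
$users = Get-Content 'C:\Scripts\staff.txt'
foreach ($user in $users) {
    # -Properties adds non-default attributes such as the last password set time
    Get-ADUser -Identity $user -Properties PasswordLastSet, LockedOut |
        Select-Object SamAccountName, Enabled, PasswordLastSet, LockedOut |
        Out-File -FilePath 'C:\Scripts\ADUserInfo.txt' -Append
}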
The reason the error output is not being appended to the log file is that you are only redirecting STDOUT (the standard output stream). To also redirect STDERR (the standard error stream), change
net user $user /domain >> $CompLog
to
net user $user /domain >> $CompLog 2>&1
This explains it a bit more:
http://www.techotopia.com/index.php/Windows_PowerShell_1.0_Pipes_and_Redirection#Windows_PowerShell_Redirection_Operators

has anyone faced this error "Error: No valid counters" using typeperf?

Has anyone faced this error, Error: No valid counters, using the typeperf utility while writing to a SQL database? I have tried a variety of different things, but every time I try to write to the SQL database using counters in a file, it fails with the No valid counters error.
The command was executed in the following fashion:
C:\>typeperf -cf "E:\DBA\CounterCollector\counters_eg.txt" -si 15 -sc 10 -f SQL -o SQL:SQLServerDS!log5
The counters_eg.txt file contains:
"\\<computername>\PhysicalDisk(* *)\Avg. Disk Queue Length"
I am able to write to the SQL database by specifying the counters individually at the command prompt.
example:
C:\Windows\system32>typeperf -f SQL -o SQL:SQLServerDS!log4 "\\<computername>\PhysicalDisk(* *)\Avg. Disk Queue Length"
Note: I have replaced the server name with <computername>.
Include a double '%%', i.e.
typeperf "\\<remote-IP>\Process(*)\%% Processor Time" -sc 1
Figured it out:
After following the example from
https://www.simple-talk.com/sql/performance/collecting-performance-data-into-a-sql-server-table/
I kept on getting the same error message "Error: No valid counters". The counter.txt is exactly the same as the example provided by Feodor, but when I put the counter names on the command line individually, they are processed successfully. The problem only appeared when I tried to run the entire syntax.
Instead of using what Feodor used:
"TYPEPERF -f SQL -s ALF -cf "C:\CounterCollect\Counters.txt" -si 15 -o SQL:SQLServerDS!log1 -sc 4",
I tweaked it a little bit (after looking at the second example from http://technet.microsoft.com/en-us/library/cc753182.aspx) and finally it WORKED! It is a matter of switching the order of the parameters.
After following the demo by Feodor, I used the syntax below, and it worked for me. I am using SQL Server 2012 and here is the command:
TYPEPERF -cf "C:\PerfMonCollect\Counters.txt" -si 5 -sc 4 -f SQL -o SQL:SQLdatasource!log1
Your counters list may be damaged. Run the perfmon GUI utility and make sure that you are able to see the counters in there.
Make sure your file name is correct: counters.txt, NOT counters.txt.txt. Show file extensions, then check the file name. Also, you can paste the path to the text file into the Run command and see if it opens.
I had the same issue and it drove me crazy.
I hit this error and solved it by adding the user running typeperf to the local Administrators group on the servers that threw the error.
I was getting this error on a server (Windows Server 2012 R2) that I had admin rights on; I had to manually rebuild the performance counters and then it was sorted. Here's the link: https://support.microsoft.com/en-us/help/2554336/how-to-manually-rebuild-performance-counters-for-windows-server-2008-6
The problem is that the file should contain only the counter names, without " quote marks.
Removing all " characters from the counter list resolved the issue for me.
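If it helps anyone debug a counters file, here is a small PowerShell sketch (the file path is the one from the question; adjust as needed) that checks each counter path with Get-Counter before handing the file to typeperf:
# Sketch: validate each counter path in the counters file with Get-Counter.
$counterFile = "E:\DBA\CounterCollector\counters_eg.txt"
foreach ($line in Get-Content $counterFile) {
    $counter = $line.Trim().Trim('"')   # strip any surrounding quote marks
    if (-not $counter) { continue }     # skip blank lines
    try {
        Get-Counter -Counter $counter -MaxSamples 1 -ErrorAction Stop | Out-Null
        Write-Output "OK:      $counter"
    } catch {
        Write-Warning "Invalid: $counter"
    }
}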
