I've searched all over the web and am unable to find a solution/guide for my problem.
I'm using the bit of script below (it's part of a larger script that exports multiple SQL tables into CSVs). It fills a dataset with data from a SQL table.
The problem I have mainly relates to datetime settings. For example, if I export the SQL table into a CSV file using the export wizard, the date appears exactly as it does in SQL, e.g. "2014-05-23 07:00:00.0000000" or "2014-05-23".
However, when I use my script, the datetime format is changed to "23/05/2014 07:00:00" or "23/05/2014 00:00:00". I believe this has something to do with the culture settings of my machine/PowerShell session.
cls
# Declare variables for connection and table to export
$Server = 'server'
$Database = 'database'
$Folder = 'D:\Powershell Scripts\01_Export From SQL\Test Folder'
$Tablename1 = 'test'
$Tablename2 = ''
# Declare connection variables
$SQLconnection = New-Object System.Data.SqlClient.SqlConnection
$SQLconnection.ConnectionString = "Integrated Security=SSPI;server=$Server;Database=$Database"
# Declare SQL command variables
$SQLcommand = New-Object System.Data.SqlClient.SqlCommand
$SQLcommand.CommandText = "SELECT [name] from sys.tables where [name] like '%$Tablename1%' and [name] like '%$Tablename2%' order by [name]"
$SQLcommand.Connection = $SQLconnection
# Load up the Tables in a dataset
$SQLAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SQLAdapter.SelectCommand = $SQLcommand
$DataSet = New-Object System.Data.DataSet
$null = $SqlAdapter.Fill($DataSet)
$SQLconnection.Close()
"Time to Export`tRecords `tFile Name"
"--------------`t------- `t---------"
foreach ($Table in $DataSet.Tables[0])
{
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
$FileExtractUTF8 = "$Folder\FromPSUTF8_$($Table[0]).csv"
$SQLcommand.CommandText = "SELECT * FROM [$($Table[0])]"
$SQLAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SQLAdapter.SelectCommand = $SQLcommand
$DataSet = New-Object System.Data.DataSet
$Count = $SQLAdapter.Fill($DataSet)
$SQLconnection.Close()
$DataSet.Tables[0] | Export-Csv $FileExtractUTF8 -NoTypeInformation -Encoding UTF8
$stopwatch.stop()
$Time = "{0}" -f $stopwatch.Elapsed.ToString('mm\:ss\.fff')
"{0,-14}`t{1,-10}`t{2}" -f $Time,$Count,$Table.name
}
The main goal is to export the data from SQL into a flat file with the data appearing exactly as it would if I used the export wizard.
Changing default DateTime format within a script
Within your DataSet / DataTable, the date exists as a [DateTime] type. When you export to CSV it needs to be converted to a string so it can be written to file. As you have observed, the default string conversion gives an output such as:
[PS]> (get-date).ToString()
21/03/2017 17:52:26
The format of this string is specified (as you worked out) by your specific globalization "culture". It appears to be a concatenation of the ShortDatePattern and LongTimePattern properties of the DateTimeFormat.
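You can see where those two patterns come from by inspecting your current culture directly (a quick illustration; the output below assumes a British-style culture like the one producing "21/03/2017"):
[PS]> $fmt = (Get-Culture).DateTimeFormat
[PS]> $fmt.ShortDatePattern
dd/MM/yyyy
[PS]> $fmt.LongTimePattern
HH:mm:ss
[PS]> (Get-Date).ToString('G')   # "G" = general date/long time, i.e. ShortDatePattern + " " + LongTimePattern
21/03/2017 17:52:26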
There are plenty of posts from people who have tried and failed to change the culture of their current session (i.e. the running PS host)...
why-does-powershell-always-use-us-culture-when-casting-to-datetime
...but it may be possible to change the globalization culture within your script using a mechanism such as the ones described here:
powershell-changing-the-culture-of-current-session
Using-Culture function
I suspect you should be able to use the Using-Culture example to wrap the Export-Csv line in your script.
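For reference, here is a minimal sketch of such a Using-Culture wrapper, along the lines of the examples in those links (it swaps the thread culture, runs a script block, and restores the original culture even if the block throws):
function Using-Culture {
    param(
        [Parameter(Mandatory)]
        [System.Globalization.CultureInfo]
        $Culture,
        [Parameter(Mandatory)]
        [scriptblock]
        $Script
    )
    # Remember the current thread cultures so they can be restored afterwards
    $OldCulture   = [System.Threading.Thread]::CurrentThread.CurrentCulture
    $OldUICulture = [System.Threading.Thread]::CurrentThread.CurrentUICulture
    try {
        [System.Threading.Thread]::CurrentThread.CurrentCulture   = $Culture
        [System.Threading.Thread]::CurrentThread.CurrentUICulture = $Culture
        & $Script
    }
    finally {
        [System.Threading.Thread]::CurrentThread.CurrentCulture   = $OldCulture
        [System.Threading.Thread]::CurrentThread.CurrentUICulture = $OldUICulture
    }
}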
But what culture to use?
So, you might now be able to set a specific culture, but do any of the pre-existing cultures use ISO8601 (sortable) dates? Or more specific formats? It seems not, so you have to make your own!
In short, you need to clone an existing CultureInfo and update the DateTimeFormat, specifically (at least) the ShortDatePattern to be what you want. Here is an example that I hope puts you on the right path. There are two functions, one which clones an existing CultureInfo and one which runs a command (Get-Date) with the new culture set for that thread.
function New-CultureInfo {
[CmdletBinding()]
[OutputType([System.Globalization.CultureInfo])]
param(
[Parameter()]
[System.Globalization.CultureInfo]
$BaseCulture = [System.Globalization.CultureInfo]::InvariantCulture
)
$CultureInfo = ($BaseCulture).Clone()
$CultureInfo.DateTimeFormat.ShortDatePattern = "yyyy'-'MM'-'dd"
$CultureInfo.DateTimeFormat.LongTimePattern = "HH:mm:ss.fff"
return $CultureInfo
}
function Test-DateFormat {
[CmdletBinding()]
[OutputType([string])]
param(
[Parameter(Mandatory)]
[System.Globalization.CultureInfo]
$Culture
)
[System.Threading.Thread]::CurrentThread.CurrentUICulture = $Culture
[System.Threading.Thread]::CurrentThread.CurrentCulture = $Culture
(Get-Date).ToString()
}
Example use:
[PS]> $NewCulture = (New-CultureInfo)
[PS]> Test-DateFormat $NewCulture
2017-03-21 19:08:34.691
Now, I haven't been able to run this against an example close to the SQL problem in the OP, but I've had fun working this all out. ;-)
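Putting the pieces together, something like this (untested against the OP's data, per the caveat above; Using-Culture is the wrapper function sketched earlier) should cover the export line:
$IsoCulture = New-CultureInfo
Using-Culture -Culture $IsoCulture -Script {
    # DateTime columns are now stringified with the cloned culture's patterns
    $DataSet.Tables[0] | Export-Csv $FileExtractUTF8 -NoTypeInformation -Encoding UTF8
}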
Great description by Charlie. My problem is that I wanted to change the default ToString() to output an ISO datetime with a T separator between date and time instead of a space. TL;DR: it's not possible.
I'm more from the Java world than the MS one, but I had to write a script to export DB CSVs, so here's my investigation in case anyone else is interested in how the format is built. I dug into the source code to see how ToString() works on DateTime.
According to the DateTime class, it defers to DateTimeFormat:
https://referencesource.microsoft.com/#mscorlib/system/datetime.cs,0a4888bea7300518
public override String ToString() {
Contract.Ensures(Contract.Result<String>() != null);
return DateTimeFormat.Format(this, null, DateTimeFormatInfo.CurrentInfo);
}
This eventually calls into the Format method with a null String format, which causes the block below to execute; it effectively selects the "G" format for the date/time.
https://referencesource.microsoft.com/#mscorlib/system/globalization/datetimeformat.cs,386784dd90f395bd
if (offset == NullOffset) {
// Default DateTime.ToString case.
if (timeOnlySpecialCase) {
format = "s";
}
else {
format = "G";
}
}
This eventually makes a call to get the current culture's DateTimeFormatInfo object, which has an internal String pattern that cannot be overridden. The pattern is set lazily (so it doesn't have to be concatenated on every call), and there is no way to override it. As Charlie pointed out, it always concatenates ShortDatePattern and LongTimePattern with a space separator.
https://referencesource.microsoft.com/#mscorlib/system/globalization/datetimeformatinfo.cs,799bd6e7997c4e78
internal String GeneralLongTimePattern {
get {
if (generalLongTimePattern == null) {
generalLongTimePattern = ShortDatePattern + " " + LongTimePattern;
}
return (generalLongTimePattern);
}
}
So there you have it. I hope this helps the next person understand why and how, and that it's not possible to override the overall default format, only the date and time components individually.
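The practical workaround is to bypass the default ToString() and pass an explicit format string instead; the standard "s" (sortable) specifier gives exactly the T-separated ISO 8601 form, or you can spell out a custom pattern:
[PS]> (Get-Date).ToString('s')
2017-03-21T19:08:34
[PS]> (Get-Date).ToString("yyyy-MM-dd'T'HH:mm:ss.fff")
2017-03-21T19:08:34.691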
I have an application I made with PowerShell that obtains info from a SQL database for various controls. My 'View' will occasionally work perfectly, but if I close the application and reopen it, the same thing can yield different results. I've set PS to run as admin, but I get the same issue either way. The view should read 1971 results and look like:
#ASP.Net Client Ugrade
C# Client Upgrade
Another Basic Project Name
However, while this works correctly some of the time, the output I get when it decides not to load correctly is:
#ASP.Net Client Ugrade
1971
void Open(), void IDbConne
I'm not sure why it's adding property information instead of the values being requested. I'm also not sure why the results are sometimes right and sometimes wrong for the same behavior (just starting the program; the box is populated on load). Here is the relevant code for how I'm selecting and populating the data:
[string[]]$projectsAll = "" # I only do it this way because the SQL query is actually in a method, where I return the array and then, for each item in the array, add it to the project view
$query = "select Title from OversightProjectsFix where ID > 0 Order By Title"
if ($sqlcon.State -eq 'Closed'){$sqlcon.Open()}
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $query
$SqlCmd.Connection = $sqlcon
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$ds = New-Object System.Data.DataSet
$SqlAdapter.Fill($ds)|out-null
foreach ($Row in $ds.Tables[0].Rows)
{
$projectsAll += "$($Row.Title)"
}
$ProjectView.Items.Clear()
foreach ($p in $projectsAll)
{
$ProjectView.Items.Add($p)
}
As I mentioned before, if this same code is run, you should always get the same results, but this one returns different results and I'm not sure why. How can I make it always work?
It's hard to understand exactly what is wrong with this snippet of code, but I think it may be because the DataSet is empty, i.e. it isn't connecting and pulling any data.
Since the dataset is initialized with New-Object, if it didn't fill properly and you then iterate through it using quotes (instead of a straight assignment), it may unexpectedly return the data type instead of the "expected" null or empty string. You can see this when you try something like this:
$ds = New-Object System.Data.DataSet
#Returns Null
PS> $ds.Tables[0].Rows
PS>
#Returns data type
PS> "$ds.Tables[0].Rows"
System.Data.DataSet.Tables[0].Rows
So you may have to remove the quotes i.e.:
$projectsAll += $Row.Title
You should add a check to see if the dataset has something before trying to iterate and add anything.
Also, calling the $SqlAdapter.Fill($ds) method usually opens the connection by itself, without you needing to explicitly open the SQL connection. I wonder if your inconsistencies are due to the SQL connection opening/closing unexpectedly? A guard like the following (a sketch, reusing the names from the snippet above) would make that failure mode visible:
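$SqlAdapter.Fill($ds) | Out-Null
# Only iterate when the fill actually produced a table with rows
if ($ds.Tables.Count -gt 0 -and $ds.Tables[0].Rows.Count -gt 0)
{
    foreach ($Row in $ds.Tables[0].Rows)
    {
        $projectsAll += $Row.Title   # straight assignment, no quotes
    }
}
else
{
    Write-Warning "Query returned no rows; check the SQL connection state."
}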
I had prepared a script to pull some reports from SQL Server, with the output pushed to different CSV sheets. After the output is generated, all the CSVs are merged into a single Excel file with the help of a custom function, and that Excel file is sent to my email address.
While running through powershell_ise.exe it runs fine and I receive the email successfully. When I scheduled the same script, I receive the email but without the Excel attachment. I suspect the custom function is not being used, because I don't see any converted Excel files in the desired location.
I tried all possible ways, like dot sourcing and pasting the function into the script itself, but no luck.
I am a beginner in PowerShell; can someone please help me see if I am missing something?
Thanks,
Anil
Function Merge-CSVFiles
{
Param(
$CSVPath = "D:\Anil\Missing_Indexes", ## Soruce CSV Folder
$XLOutput="D:\Anil\Missing_Indexes.xls" ## Output file name
)
$csvFiles = Get-ChildItem ("$CSVPath\*") -Include *.csv
$Excel = New-Object -ComObject excel.application
$Excel.visible = $false
$Excel.sheetsInNewWorkbook = $csvFiles.Count
$workbooks = $excel.Workbooks.Add()
$CSVSheet = 1
Foreach ($CSV in $Csvfiles)
{
$worksheets = $workbooks.worksheets
$CSVFullPath = $CSV.FullName
$SheetName = ($CSV.name -split "\.")[0]
$worksheet = $worksheets.Item($CSVSheet)
$worksheet.Name = $SheetName
$TxtConnector = ("TEXT;" + $CSVFullPath)
$CellRef = $worksheet.Range("A1")
$Connector = $worksheet.QueryTables.add($TxtConnector,$CellRef)
$worksheet.QueryTables.item($Connector.name).TextFileCommaDelimiter = $True
$worksheet.QueryTables.item($Connector.name).TextFileParseType = 1
$worksheet.QueryTables.item($Connector.name).Refresh()
$worksheet.QueryTables.item($Connector.name).delete()
$worksheet.UsedRange.EntireColumn.AutoFit()
$CSVSheet++
}
$workbooks.SaveAs($XLOutput,51)
$workbooks.Saved = $true
$workbooks.Close()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($workbooks) | Out-Null
$excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel) | Out-Null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
}
While running through powershell_ise.exe it runs fine and I receive the email successfully
This is because the ISE on your box is able to load your custom function during runtime and generate the report.
But when this is scheduled through a job, the script will run on a different server than your box, hence the script will not be able to find the function and you don't get the report.
I have faced this kind of issue before while using custom functions.
The workaround that I can suggest is wrapping your custom functions in a separate module and importing the module in your main script (preferably save the module in the same location as your script, for easy troubleshooting).
Example:
Save your function in a .psm1 module file:
Function ScriptExample {
    Param ( [string] $script,
            [string] $jobname,
            [string] $jobcategory,
            [hashtable] $config,
            [string] $deletelogfilepath,
            [string] $servername)
    #your-function-body-here#
    Return $script;
}
Now call this module in your main script as follows,
$importmoduleroot = "C:\temp\SO_Example.psm1"
###### LOAD MODULES ######
# Import all related modules written to support this process
$modules = get-childitem -path $importmoduleroot -include SO*.psm1 -recurse;
You can then call your function and pass in the parameters within the main script,
ScriptExample -script $script `
-jobname $jobname `
-jobcategory $jobcategory `
-config $config `
-servername $ServerName `
-deletelogfilepath $deletelogfilepath
I want to generate an import script for an MSSQL DB via PowerShell (related to this question).
I tried doing this:
#Set-ExecutionPolicy RemoteSigned
$DB_NAME = "<<dbName>>"
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$srv = new-object "Microsoft.SqlServer.Management.SMO.Server" "<<server>>"
$conContext = $srv.ConnectionContext
$conContext.LoginSecure = $false
$conContext.Login = "<<user>>"
$conContext.Password = "<<password>>"
$srv = new-object Microsoft.SqlServer.Management.Smo.Server($conContext)
$srv.SetDefaultInitFields([Microsoft.SqlServer.Management.SMO.View], "IsSystemObject")
$db = $srv.databases[$DB_NAME]
$scripter = new-object "Microsoft.SqlServer.Management.Smo.Scripter" $srv
$scripter.Options.ScriptSchema = $false
$scripter.Options.ScriptData = $true
$scripter.Options.ScriptDrops = $false
$scripter.Script($db)
But executing this throws an error:
"This method does not support scripting data"
I also tried to set the output file option but this doesn't change anything.
Can you tell me what I did wrong?
Thanks!
Per the error, Scripter.Script does not support scripting data. This is documented. What isn't documented is what you're supposed to use instead, but it's EnumScript:
$scripter.EnumScript(@($db.Tables))
You must pass the tables, since simply scripting the database will yield nothing (technically, the database itself contains no data; its tables do).
(The @() forcibly converts the Tables collection to an array, since that's what EnumScript expects.)
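For example, to capture the generated INSERT statements in a file rather than on the console (a sketch; the output path is hypothetical, and ToFileOnly/FileName are standard ScriptingOptions properties):
$scripter.Options.ScriptSchema = $false
$scripter.Options.ScriptData   = $true
$scripter.Options.ToFileOnly   = $true
$scripter.Options.FileName     = "C:\temp\data-import.sql"   # hypothetical output path
$scripter.EnumScript(@($db.Tables))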
I have a report in SSRS which has a parameter in it. For each possibility in the parameter, I need an Excel file; this comes down to 50 Excel files. The only way I know to schedule a report is to go to the Reporting Services home page, go to my report, click Manage, click Subscriptions > New Subscription, and enter a file name, path, user name, password, schedule, and parameter, and finally press OK.
Is there a quicker way to do this, or is there a way which allows me to create the 50 reports more quickly, like copying a subscription or something like that?
Try creating an SSIS package and running the report for all values of the parameter. I had seen someone do this at my previous company.
Data-driven subscriptions are available only in the Enterprise and Developer editions; yours could be Standard.
You could also write a script in PowerShell or write an app in C#/VB. Here is an example done in PowerShell. Here is an example done in C#. Using either of these approaches, you could programmatically render the reports as you see fit. You can also create subscriptions this way as well.
PowerShell solution to the OP:
# Create a proxy to the SSRS server and give it the namespace of 'RS' to use for
# instantiating objects later. This class will also be used to create a report
# object.
$reportServerURI = "http://<SERVER>/ReportServer/ReportExecution2005.asmx?WSDL"
$RS = New-WebServiceProxy -Class 'RS' -NameSpace 'RS' -Uri $reportServerURI -UseDefaultCredential
$RS.Url = $reportServerURI
# Set up some variables to hold referenced results from Render
$deviceInfo = "<DeviceInfo><NoHeader>True</NoHeader></DeviceInfo>"
$extension = ""
$mimeType = ""
$encoding = ""
$warnings = $null
$streamIDs = $null
# Next we need to load the report. Since Powershell cannot pass a null string
# (it instead just passes ""), we have to use GetMethod / Invoke to call the
# function that returns the report object. This will load the report in the
# report server object, as well as create a report object that can be used to
# discover information about the report. It's not used in this code, but it can
# be used to discover information about what parameters are needed to execute
# the report.
$reportPath = "/PathTo/Report"
$Report = $RS.GetType().GetMethod("LoadReport").Invoke($RS, @($reportPath, $null))
# Report parameters are handled by creating an array of ParameterValue objects.
# $excelInput: either pass in as a parameter and run 50 times, or reset
# this value and run it each time with the updated excel file
$excelInput = "<ExcelFile>";
$parameters = @()
$parameters += New-Object RS.ParameterValue
$parameters[0].Name = "Excel Input File"
$parameters[0].Value = $excelInput
# Add the parameter array to the service. Note that this returns some
# information about the report that is about to be executed.
$RS.SetExecutionParameters($parameters, "en-us") > $null
# Render the report to a byte array. The first argument is the report format.
# The formats I've tested are: PDF, XML, CSV, WORD (.doc), EXCEL (.xls),
# IMAGE (.tif), MHTML (.mhtml).
$RenderOutput = $RS.Render('PDF',
$deviceInfo,
[ref] $extension,
[ref] $mimeType,
[ref] $encoding,
[ref] $warnings,
[ref] $streamIDs
)
# Convert array bytes to file and write
$OutputFile = $excelInput + ".pdf"
$Stream = New-Object System.IO.FileStream($OutputFile, [System.IO.FileMode]::Create, [System.IO.FileAccess]::Write)
$Stream.Write($RenderOutput, 0, $RenderOutput.Length)
$Stream.Close()
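To produce all 50 files, the parameter-setting, render, and write steps above can be wrapped in a loop over the parameter values (a sketch; $excelInputs is a hypothetical list of the 50 values):
$excelInputs = @('<ExcelFile1>', '<ExcelFile2>')   # ...all 50 parameter values
foreach ($excelInput in $excelInputs) {
    $parameters = @()
    $parameters += New-Object RS.ParameterValue
    $parameters[0].Name  = "Excel Input File"
    $parameters[0].Value = $excelInput
    $RS.SetExecutionParameters($parameters, "en-us") > $null
    $RenderOutput = $RS.Render('PDF', $deviceInfo, [ref] $extension, [ref] $mimeType,
        [ref] $encoding, [ref] $warnings, [ref] $streamIDs)
    # Write each rendered report to its own file, named after the parameter value
    $Stream = New-Object System.IO.FileStream("$excelInput.pdf", [System.IO.FileMode]::Create, [System.IO.FileAccess]::Write)
    $Stream.Write($RenderOutput, 0, $RenderOutput.Length)
    $Stream.Close()
}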
I have a CSV with columns that contain a user's first name and last name. I have figured out how to import the CSV data from each column using:
$firstname = Import-csv .\data.csv | select-object "Employee First Name"
$lastname = Import-csv .\data.csv | select-object "Employee Last Name"
I have also figured out that:
$username=$firstname.substring(0,1)+$lastname
works when using System.String objects. The problem is that I need to loop through all of the names to create a new column with the username, and I get the error:
[System.Object[]] doesn't contain a method named 'substring'
I thought I could convert to string using $firstname.ToString(), but I seem to have botched that. Any help is greatly appreciated as I am still trying to learn PowerShell.
The error is self-explanatory: $firstname is not a string. It is an array of objects produced by Import-Csv, each with an 'Employee First Name' property.
Try (for the first row only):
$username = ($firstname[0].'Employee First Name').substring(0,1) + $lastname[0].'Employee Last Name'
But you'd be better off using:
$employees = Import-csv .\data.csv
foreach ($employee in $employees)
{
$username=($employee.'Employee First Name').substring(0,1)+($employee.'Employee Last Name')
}
Looks like you are trying to make a user name, so I'm not sure why you want to import one column at a time.
But this might work for you.
$names = Import-Csv .\test.csv
$userNames = New-Object 'System.Collections.Generic.List[String]'
foreach ($name in $names)
{
$userNames.Add($($name.'Employee First Name'.Substring(0,1) + $name.'Employee Last Name'))
}
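If the end goal is the new username column from the question, here is a compact sketch using a calculated property (column names assumed from the question's CSV; the output file name is hypothetical):
Import-Csv .\data.csv |
    Select-Object *, @{ Name = 'UserName';
        Expression = { $_.'Employee First Name'.Substring(0,1) + $_.'Employee Last Name' } } |
    Export-Csv .\data-with-usernames.csv -NoTypeInformation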