I have some SQL deadlock reports I am trying to capture mediaName from. The report is XML, but the attribute I need is buried in XML, then SQL, then XML again. Here is an example.
The XPath for where the SQL starts is /deadlock/process-list/process/inputbuf, and the SQL is:
SET DEADLOCK_PRIORITY 8;
EXEC spM_Ext_InsertUpdateXML N'<mediaRecords><media
title="This Is the title" mediaType="0"
creationTime="2018-03-16T00:59:43" origSOM="01:00:00;00" notes="Air Date:
2018-03-18 
Air Window: 3 
" mediaName="This is what i need"
><mediaInstances><mediaInstance directory="here"
duration="00:28:40;11" version="1" position="00:00:00;00" mediaSetId="34"
creationStartTime="2018-03-16T00:59:43;25" creationEndTime="2018-03-
16T00:59:43;25"/></mediaInstances><properties><
classifications><classification category="HD" classification="Content
Resolution"/></classifications><markups><markup
name=""><Item duration="00:00:10;00" orderNo="1"
type="Dynamic" som="00:59:50;00" comment=""
name="Segment"/></markup><markup
name="Segment"><markupItem duration="00:08:41;10" orderNo="2"
type="Dynamic" som="01:00:00;00" comment="Main Title and Segment 1 |
ID:SEDC" name="Segment"/></markup><markup
name="Black"><markup
See how the raw report isn't using literal < and > for the inner elements but the escaped entities &lt; and &gt; (shown decoded above), which adds complexity.
I am trying to extract only mediaName from this report but can't get past the above-mentioned XPath with PowerShell. I was hoping someone might have an idea. I was using:
$xml = [xml](Get-Content "C:\Users\user\desktop\test.xml")
$xml.SelectNodes('/deadlock/process-list/process/inputbuf') | select mediaName
I have also tried piping Select-Xml to Where-Object, but I don't think I am using the right $_ property.
With the help of tomalak and the answer below, this is the fixed and working parsing script.
#report file location, edited by user when needed
$DeadlockReport = "C:\Users\User\Desktop\xml_report1.xml"
# Create object to load the XML from the deadlock report and find the SQL within
$xml = New-Object xml
$xml.Load($DeadlockReport)
$inputbuf = $xml.SelectNodes('//deadlock/process-list/process/inputbuf')
$value = $inputbuf.'#text'
#find the internal XML and replace bad values, SQL, and truncation with RE
$value = $value -replace "^[\s\S]*?N'","" -replace "';\s*$","" -replace "<markup.*$","</properties></media></mediaRecords>"
#append root elements to $value
$fix = "<root>" + $value + "</root>"
#Load the XML after it has been corrected
$payload = New-Object xml
$payload.LoadXml($fix)
#find the nodes in the xml for mediaName
$mediaName = $payload.SelectNodes('//root/mediaRecords/media/@mediaName')
#iterate through and output all media names.
foreach($i in $mediaName)
{
    $i.Value
}
What you have is:
an XML file,
which contains a string value,
which is SQL,
which contains another string value,
which is XML again.
So let's peel the onion.
First off, please never load XML files like this:
# this is bad code, don't use
$xml = [xml](Get-Content "C:\Users\user\desktop\test.xml")
XML has sophisticated file encoding detection, and you are short-circuiting that by letting PowerShell load the file. This can lead to data breaking silently, because PowerShell's Get-Content has no idea what the actual encoding of the XML file is. (Sometimes the above works, sometimes it doesn't. "It works for me" doesn't mean that you're doing it right; it means that you've been lucky.)
This is the correct way:
$xml = New-Object xml
$xml.Load("C:\Users\user\desktop\test.xml")
Here the XmlDocument object will take care of loading the file and transparently adapt to any encoding it might have. Nothing can break and you don't have to worry about file encodings.
Second, don't let the looks of the XML file in a text editor deceive you. As indicated, /deadlock/process-list/process/inputbuf contains a plain string as far as XML is concerned; the &lt; and &gt; and all the rest come back as < and > when you look at the actual text value of the element.
$inputbuf = $xml.SelectSingleNode('/deadlock/process-list/process/inputbuf')
$value = $inputbuf.'#text'
Write-Host $value
Would print something like this, which is SQL:
SET DEADLOCK_PRIORITY 8;
EXEC spM_Ext_InsertUpdateXML N'<mediaRecords><media
title="This Is the title" mediaType="0"
creationTime="2018-03-16T00:59:43" origSOM="01:00:00;00" notes="Air Date:
2018-03-18
Air Window: 3
" mediaName="This is what i need"
><mediaInstances><mediaInstance directory="here"
duration="00:28:40;11" version="1" position="00:00:00;00" mediaSetId="34"
creationStartTime="2018-03-16T00:59:43;25" creationEndTime="2018-03-
16T00:59:43;25"/></mediaInstances><properties><
classifications><classification category="HD" classification="Content
Resolution"/></classifications><markups><markup
name=""><Item duration="00:00:10;00" orderNo="1"
type="Dynamic" som="00:59:50;00" comment=""
name="Segment"/></markup><markup
name="Segment"><markupItem duration="00:08:41;10" orderNo="2"
type="Dynamic" som="01:00:00;00" comment="Main Title and Segment 1 |
ID:SEDC" name="Segment"/></markup><markup
name="Black"><markup ...
</mediaRecords>';
And the XML you are interested in is actually a string inside this SQL. If the SQL follows this pattern...
SET DEADLOCK_PRIORITY 8;
EXEC spM_Ext_InsertUpdateXML N'<...>';
...we need to do three things in order to get to the XML payload:
Remove the enclosing SQL statements
Replace any '' with ' (because the '' is the escaped quote in SQL strings)
Pray that the part in between does not contain any other SQL expressions
So
$value = $value -replace "^[\s\S]*?N'","" -replace "';\s*$","" -replace "''","'"
would remove everything up to and including N' and the '; at the end, as well as replace all the duplicated single quotes (if any) with normal single quotes.
Adapt the regular expressions as needed. Replacing the SQL parts with regex isn't exactly clean, but if the expected input is very limited, like in this case, it'll do.
Write-Host $value
Now we should have a string that is actually XML. Let's parse it. This time it's already in memory; there isn't any file encoding to pay attention to, so it's all right to cast it to XML directly:
$payload = [xml]$value
And now we can query it for the value you are interested in:
$mediaName = $payload.SelectSingleNode("/mediaRecords/media/@mediaName")
Write-Host $mediaName
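If the report contains more than one process (deadlock reports usually list at least two), the same approach works with SelectNodes. A minimal sketch, omitting the truncation repair from the fixed script above (keep that third -replace if your inputbuf is cut off like the sample):
$xml = New-Object xml
$xml.Load("C:\Users\user\desktop\test.xml")
foreach ($node in $xml.SelectNodes('/deadlock/process-list/process/inputbuf')) {
    # peel off the SQL wrapper, then parse the payload of each process
    $value = $node.'#text' -replace "^[\s\S]*?N'","" -replace "';\s*$","" -replace "''","'"
    $payload = [xml]$value
    foreach ($attr in $payload.SelectNodes('/mediaRecords/media/@mediaName')) {
        Write-Host $attr.Value
    }
}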
Related
I am trying to export a SQL query to an XML file with PowerShell using the below script:
$SQLResult = Invoke-Sqlcmd -MaxCharLength ([int]::MaxValue) -AbortOnError -EncryptConnection -ConnectionTimeout $TIMEOUT -Database $Database -ServerInstance $SQL_SERVER_FULLNAME_SOURCE -Username $SQL_ACCOUNT_NAME_SOURCE -Password $SQL_ACCOUNT_PASSWORD_SOURCE -Query $QUERY
$PropertyName = ($SQLResult | Get-Member -MemberType Property | Where {$_.Name -like "XML*"}).Name
$SQLResult.$PropertyName | Out-File -FilePath $OutputFile -Force
It should generate a file like below:
<vendors>
<vendor>
<administration>YIT</administration>
<FISCALCODE>804658055B01</FISCALCODE>
<accountNumber>10001</accountNumber>
<Offset_LedgerAccount></Offset_LedgerAccount>
<name>Comp Europe B.V.</name>
<shortName>Comp Europe B.V.</shortName>
<TAXEXEMPTNUMBER>NL801238055B01</TAXEXEMPTNUMBER>
<DefaultDescription></DefaultDescription>
<invoiceType>Non-PO CEE-FN-ACQ</invoiceType>
<IBAN>NL57MHCB0212303590</IBAN>
<CUSTOMERACCOUNT></CUSTOMERACCOUNT>
<TAXITEMGROUP></TAXITEMGROUP>
<IncludeOrderReference>Unknown</IncludeOrderReference>
<lineText></lineText>
<lines>Unknown</lines>
</vendor>
<vendor>
<administration>YIT</administration>
<FISCALCODE>03840123961</FISCALCODE>
<accountNumber>20001</accountNumber>
<Offset_LedgerAccount></Offset_LedgerAccount>
<name>4ABCD - For Marketing s.r.l.</name>
<shortName>4ABCD-FOR MARKETING</shortName>
<TAXEXEMPTNUMBER>03840123961</TAXEXEMPTNUMBER>
<DefaultDescription></DefaultDescription>
<invoiceType>Purchase invoice</invoiceType>
<IBAN>IT93M0306912330615256048252</IBAN>
<CUSTOMERACCOUNT></CUSTOMERACCOUNT>
<TAXITEMGROUP></TAXITEMGROUP>
<IncludeOrderReference>Unknown</IncludeOrderReference>
<lineText></lineText>
<lines>Unknown</lines>
</vendor>
</vendors>
In reality it can contain a few thousand records.
The problem with the various other options I used is that the output file is either truncated or contains double quotes. With the above script, extra spaces are suddenly added, repeated every certain block of output. If I run the query in SSMS it generates a perfectly clean result that I can save as an .xml file.
<vendor>
<administration>YIT</administration>
<FISCALCODE>0381230961</FISCALCODE>
<accountNumber>20001</accountNumber>
<Offset_LedgerAccount/>
<name>4ABCD - For Marketing s.r.l.</name>
<shortName>4ABCD-FOR MARKETING</shortName>
<TAXEXEMPTNUMBER>03840123961</TAXEXEMPTNUMBER>
<DefaultDescription/>
<invoiceType>Purchase invoice</invoiceType>
<IBAN>IT93M03069121238252</IBAN>
<CUST OMERACCOUNT>< CUSTOMERACCOUNT><TAXITEMGROUP>< TAXITEMGROUP><IncludeOrderReference>Unknown< IncludeOrderReference><lineText>< lineText>
As you can see above, it starts okay but then suddenly starts adding a single space to certain nodes, not to all. Then it continues okay for some records, and then again a line with a single space in some of the nodes.
I tried various solutions but none of them seem to work for me. Can somebody explain what I am doing wrong? The actual query I use is below.
SELECT
v.[DATAAREAID] AS [administration]
,v.[FISCALCODE]
,v.[VENDORACCOUNTNUMBER] AS [accountNumber]
,v.[DEFAULTOFFSETLEDGERACCOUNTDISPLAYVALUE] AS [Offset_LedgerAccount]
,v.[VENDORORGANIZATIONNAME] AS [name]
,v.[VENDORSEARCHNAME] AS [shortName]
,CASE WHEN LEFT(v.[TAXEXEMPTNUMBER],2)='IT' THEN RIGHT(v.[TAXEXEMPTNUMBER],LEN(v.[TAXEXEMPTNUMBER])-2) ELSE v.[TAXEXEMPTNUMBER] END AS [TAXEXEMPTNUMBER]
,ISNULL(v.[NOTES],'') AS [DefaultDescription]
,ISNULL(ipvp.[INVOICETYPENAME],'') AS [invoiceType]
,ISNULL(vba.[IBAN],'') AS [IBAN]
,ISNULL(c.[CUSTOMERACCOUNT],'') AS [CUSTOMERACCOUNT]
,ISNULL(itc.[TAXITEMGROUP],'') AS [TAXITEMGROUP]
,'Unknown' AS [IncludeOrderReference]
,v.[DESTINATIONCODE] AS [lineText]
,'Unknown' AS [lines]
FROM [dbo].[RetailVendVendorV3Staging] v
LEFT OUTER JOIN [dbo].[AXTip_ParametersVendTableStaging] ipvp ON v.[VENDORACCOUNTNUMBER]=ipvp.[VENDORACCOUNT] AND v.[DATAAREAID]=ipvp.[DATAAREAID]
LEFT OUTER JOIN [dbo].[VendVendorBankAccountStaging] vba ON vba.[DATAAREAID]=v.[DATAAREAID] AND vba.[VENDORACCOUNTNUMBER]=v.[VENDORACCOUNTNUMBER] AND vba.[VENDORBANKACCOUNTID]=v.[BANKACCOUNTID]
LEFT OUTER JOIN [dbo].[CustCustomerV3Staging] c ON c.[VENDORACCOUNT]=v.[VENDORACCOUNTNUMBER] AND v.[DATAAREAID]=c.[DATAAREAID]
LEFT OUTER JOIN [dbo].[AXTip_ImportTaxcodeStaging] itc ON itc.[ACCOUNTRELATION]=v.[VENDORACCOUNTNUMBER] AND itc.[DATAAREAID]=v.[DATAAREAID] AND itc.[EXTERNALTAXCODE]='I'
WHERE v.[DATAAREAID] ='YIT' AND v.[FISCALCODE]!=''
FOR XML PATH('vendor'), root('vendors')
I read about quite a lot of issues with the method I used in the opening post. The method described in the link below seems a more "PowerShell" way to generate an XML file from SQL data. I copied this method and it works well.
stuartsplace.com: powershell-and-sql-server-exporting-data-xml
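For reference, the core of that approach is letting ADO.NET stream the FOR XML result through an XmlReader instead of flattening it to strings, which avoids both the truncation and the stray spaces. A minimal sketch (whether it matches the linked article line for line is an assumption; server, database and variable names are placeholders):
$connectionString = 'Data Source=ServerName;Initial Catalog=DatabaseName;Integrated Security=SSPI'
$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$command = $connection.CreateCommand()
$command.CommandText = $QUERY   # the FOR XML PATH query from above
$connection.Open()
$reader = $command.ExecuteXmlReader()   # streams the XML instead of chunking it into rows
$xml = New-Object System.Xml.XmlDocument
$xml.Load($reader)
$xml.Save($OutputFile)
$reader.Close()
$connection.Close()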
I have a pipe-delimited CSV file whose data are to be inserted into SQL Server.
I opened that file in Excel, added commas to some column values, and saved it.
Here is the data in Notepad++ after adding the commas.
Then I bulk inserted that file into SQL Server:
BULK INSERT #temp1
FROM '\\as03\AppData\PipeDelimited.csv'
WITH (
FIRSTROW = 2
, FIELDTERMINATOR ='|'
, ROWTERMINATOR='\n'
, ERRORFILE ='\\as03\AppData\ErrorFiles\PipeDelimited.csv'
, CODEPAGE = '65001'
, MAXERRORS = 99999999
)
But I got double quotes (") in the first and last column values, and also two consecutive double quotes ("") where one double quote was already in the file.
Here is the data inserted in sql server.
Is there some way to insert the data into SQL Server while ignoring the double quotes that were added by Excel or Notepad?
This appears to be a non-PowerShell issue, as it actually works fine in PowerShell:
$Csv = @'
Name|Country|Address|Mobile
Tom|Nepal|Kathmandu,Nardevi|98456667365
Harry|India|Delhi"Patna,™ or R|9856524524
'@ -Split '[\r?\n]'
$Csv |ConvertFrom-Csv -Delimiter '|' # Import-Csv .\PipeDelimited.csv -Delimiter '|'
Yields:
Name Country Address Mobile
---- ------- ------- ------
Tom Nepal Kathmandu,Nardevi 98456667365
Harry India Delhi"Patna,™ or R 9856524524
In other words, you might simply convert your PipeDelimited.csv to a more common CommaDelimited.csv with text indicators like:
Import-Csv .\PipeDelimited.csv -Delimiter '|' |Export-Csv .\CommaDelimited.csv
Your file was corrupted by editing it in Excel and saving it as a CSV. The best solution is to not use Excel to edit such files, but rather use either scripting or a text editor (depending on the complexity; just adding a comma to one field feels easiest in a text editor).
However, if we're saying the damage is done and you need a script to fix the issues, you could run something like this: it reads in the data as plain text, applies a regex to remove the offending quotes, then spits the result out to a copy of the file (I've written to a copy rather than back to the original so you can roll back more easily if this wasn't what you wanted).
[string]$sourcePath = '.\PipeDelimited.csv'
[string]$outputPath = '.\PipeDelimitedCorrected.csv'
[System.Text.Encoding]$encoding = [System.Text.Encoding]::UTF8 # based on your code page
[string[]]$data = [System.IO.File]::ReadAllLines($sourcePath, $encoding)
$data = $data | ForEach-Object {((($_ -replace '^"', '') -replace '"$', '') -replace '""', '"')}
# replaces quotes at the start and end of the line with blanks,
# then replaces escaped double quotes with individual double quotes...
# it's a quick and dirty approach, but looking at your example should be sufficient
[System.IO.File]::WriteAllLines($outputPath, $data, $encoding)
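As a quick sanity check, here is the regex chain applied to a single line shaped like the sample data (the values are illustrative):
'"Harry|India|Delhi""Patna,™ or R|9856524524"' -replace '^"', '' -replace '"$', '' -replace '""', '"'
# yields: Harry|India|Delhi"Patna,™ or R|9856524524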
I was given a backup RayStation database, RS_Patients.bak, and am trying to extract and view the DICOM images that are stored in it. The trouble is two-fold: I don't know which one of the 2,000+ fields (or combinations of fields) refer to the images themselves, and even if I did know where the images were, I don't know how to extract them from the database into .dcm files.
From examining the schema, I found a few fields that are large varbinary fields (BLOBs) and I think they might be the fields I'm looking for. FileStream is enabled for the database and there is an FS directory. I've tried to download those fields into files using the bcp Utility, but that hasn't generated successful DICOMs.
Does anyone have any experience with this sort of database/image structure? Any other suggestions for pulling out and viewing the image? Do you think the image would be made up of a couple of fields instead of just one? There are fields next to what we believe is the image field, with headers for the DICOM image: in the table called ImageStack, next to a field called PixelData, there are fields called PixelSize, SlicePosition, NrPixels, etc.
Also, if you can think of another place to ask this, I would appreciate that too.
Edit: per @mcNets' suggestion, the bcp command:
DECLARE @Command Varchar(400)
SET @Command = 'bcp "SELECT TOP 1 PixelData FROM RayStationPatientDB.dbo.ImageStack" queryout "C:\Users\Administrator\Documents\test.dcm" -S WIN-123ABC\MSSQLSERVER01 -T -w'
EXEC xp_cmdshell @Command
Generally speaking, you're not going to be able to use SQL Server results to write image data directly. bcp.exe isn't going to help you, either. You need to either use something that understands that the binary string is raw file data, or, because this is a FILESTREAM, use something that will give you the path to the file on the SQL Server. I have limited experience with FILESTREAM, but here's what I would do.
I can't definitively answer which field to use. That will depend on the application. If we assume that the DICOM images are stored in a FILESTREAM, then you can find the available FILESTREAM columns with this:
select t.name TableName
,c.name ColumnName
from sys.tables t
join sys.columns c
on c.object_id = t.object_id
where c.is_filestream = 1
If we also assume that DICOM images are stored as raw image files -- i.e., as a complete binary version of what they would be if they were saved on a PACS disc -- then you can run this to determine the path for each file by the ID:
select TableName_Id
      ,FileData.PathName()
from TableName
The doc for the PathName() function of FILESTREAM columns is here.
If you instead want to pull the data through SQL Server in a traditional sense, then I would probably use a PowerShell script to do it. This has the advantage of letting you use arbitrary data from the server to name the files. This method also has the advantage that it will work on any binary or varbinary column. As a disadvantage, this method will be slower and uses more disk space, because the server has to read the data, send it to the client, and then the client writes the data to disk:
$SqlQuery = "select Name, FileData from TableName";
$OutputPath = 'C:\OutputPath';
$SqlServer = 'ServerName';
$SqlDatabase = 'DatabaseName';
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
$SqlCommand = New-Object -TypeName System.Data.SqlClient.SqlCommand;
$SqlCommand.CommandText = $SqlQuery;
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
$SqlCommand.Connection = $SqlConnection;
$SqlConnection.Open();
$SqlDataReader = $SqlCommand.ExecuteReader();
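# read rows one at a time, writing each binary payload to disk before fetching the next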
while ($SqlDataReader.Read()) {
$OutputFileName = Join-Path -Path $OutputPath -ChildPath "$($SqlDataReader['Name']).dcm"
[System.IO.File]::WriteAllBytes($OutputFileName,$SqlDataReader['FileData']);
}
$SqlConnection.Close();
$SqlConnection.Dispose();
It's also possible to use FILESTREAM functions to return Win32 API handles, but I have never done that.
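For completeness, that route goes through System.Data.SqlTypes.SqlFileStream and requires an explicit transaction. An untested sketch, reusing the placeholder names from above:
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
$SqlConnection.Open();
$SqlTransaction = $SqlConnection.BeginTransaction();
$SqlCommand = $SqlConnection.CreateCommand();
$SqlCommand.Transaction = $SqlTransaction;
$SqlCommand.CommandText = 'select Name, FileData.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() from TableName';
$SqlDataReader = $SqlCommand.ExecuteReader();
while ($SqlDataReader.Read()) {
    # SqlFileStream opens the file through the Win32 API instead of pulling bytes over the connection
    $SqlFileStream = New-Object -TypeName System.Data.SqlTypes.SqlFileStream -ArgumentList $SqlDataReader.GetString(1), ([byte[]]$SqlDataReader[2]), ([System.IO.FileAccess]::Read);
    $OutputFileName = Join-Path -Path 'C:\OutputPath' -ChildPath "$($SqlDataReader['Name']).dcm";
    $FileStream = [System.IO.File]::Create($OutputFileName);
    $SqlFileStream.CopyTo($FileStream);
    $FileStream.Dispose();
    $SqlFileStream.Dispose();
}
$SqlDataReader.Close();
$SqlTransaction.Commit();
$SqlConnection.Close();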
I'm looking for suggestions on either returning multiple datasets or keeping the session open with Invoke-Sqlcmd.
I have a series of SQL queries that use temporary tables to generate a few related views of the data that I am sending on (via Excel) to a manager. As we work on what is required from the datasets, I am getting a little tired of cutting and pasting samples into Excel.
I thought to use PowerShell to simply send the results to HTML files as output for the manager; however, I ran into a couple of problems:
If I put the final extracts into one SQL file, PowerShell appends all of the data into a single result set (sort of a union of the tables).
If I attempt to build the temporary tables and then extract each query individually, each Invoke-Sqlcmd is a separate session, meaning my temporary tables get dropped.
I'm looking for suggestions on either returning multiple datasets, or keeping the session open?
Invoke-Sqlcmd -InputFile .\GenerateTimecard.sql -Variable $params | Out-Null;
@{
'Summary' = 'select * from #WeeklyTimeSummary;'
'ByDay' = 'select * from #WeeklyTimeDaily order by postdate desc;'
'ByTask' = 'select * from #WeeklyTimeEvents order by HoursSpent desc;'
'Detail' = 'select * from #WeeklyTimeDetail order by postdate desc;'
}.GetEnumerator() | ForEach-Object {
Write-Output $_.Name;
$fname = $_.Name + '.html';
Invoke-Sqlcmd -Query $_.Value | ConvertTo-Html | Out-File -Encoding ascii $fname;
};
The Description section from Get-Help Invoke-Sqlcmd says it supports GO commands, so you could try running everything at once. Personally I'd use the -InputFile parameter and pipe the results to Out-File.
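A minimal sketch of that suggestion (file names are illustrative; note that all SELECTs end up in one combined output, which is the first problem described above):
Invoke-Sqlcmd -InputFile .\GenerateTimecard.sql -Variable $params |
    ConvertTo-Html | Out-File -Encoding ascii .\Timecard.html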
You can specify the ApplicationName parameter for Invoke-SqlCmd, which results in a different SQL connection.
Omitting ApplicationName will result in the temp tables getting removed the second time you call Invoke-SqlCmd.
Something like:
Invoke-SqlCmd -ApplicationName CreateTable -Query 'CREATE TABLE ##FooTable (FooKey INT)'
Invoke-SqlCmd -ApplicationName SelectTable -Query 'SELECT * FROM ##FooTable'
I have been twiddling with a fairly simple idea to export ReportServer reports from the underlying database and then find its dependent stored procedures and export them as well.
However, when testing initially I found that the XML data for the report itself is truncated in the standard way I export things to files, and I think I may be using an incorrect method.
The code is fairly simple at this point, and I am using a simplified report called "ChartReport":
Import-Module 'sqlps'
$saveloc = "$home\savedata\filename.txt"
$dataquery = #"
DECLARE #name NVARCHAR(MAX)='ChartReport',
#path NVARCHAR(MAX) = '/ChartReport'
SELECT CAST(CAST(c.Content AS VARBINARY(MAX)) AS XML) [ReportData], c.Name, c.Path
FROM ReportServer.dbo.Catalog c
WHERE c.Name = #name
AND c.Path LIKE #path+'%'
"#
Invoke-SQLCMD -Query $dataquery | select ReportData | Out-File $saveloc
I have verified the query returns XML (the underlying XML file itself is over 25,000 characters, and I would be happy to provide a link to it if anyone is interested); however, when I save the file I get something like:
Column1
<Report xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner" xmlns:cl="http://schemas.microsof...
I have attempted to use some of the ideas already posted on SO, such as:
redirecting with > $somefile (Powershell 2: Easy way to direct every bit of output to a file?)
Out-File and specifying a width (Powershell Add-Content Truncating Output)
using Format-Table with -AutoSize and -Wrap
Each of these fails at some point (though the Format-Table method gets pretty far before it truncates).
I would definitely consider some sort of XML-specific solution, but really I think I am just missing some information. As far as I am concerned, this is a file of "stuff" and I want to write said file to disk after it is loaded into the object.
Would iterating over some sort of line break and writing each line of the object to a file be the idiomatic answer?
Use the -MaxCharLength parameter of the Invoke-SQLCMD command. By default it is 4000.
See Invoke-SqlCmd doesn't return long string?
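Applied to the script above, that would look something like this, also expanding the property so the raw XML is written out instead of a formatted (and truncated) table column:
Invoke-SQLCMD -Query $dataquery -MaxCharLength ([int]::MaxValue) |
    Select-Object -ExpandProperty ReportData |
    Out-File $saveloc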