How to get a field as a suggestion in SQL Server using Logic Apps - azure-logic-apps

This is my Logic App workflow:
1. HTTP trigger that receives an EDI 850 message as input.
2. Decode the X12 message.
3. Transform the XML.
4. SQL Server connector to insert the data into the on-prem SQL database.
While inserting the data, it doesn't show the POLineNumber field as a suggestion; it shows things like GS01 and SE01 as input suggestions. I'm using the Transform XML output to insert the data into the table.
And my Transform XML sample output for a single field is 1.
I would like to insert data into the SQL table. When I select the table, I want to get that particular field as a suggestion in the SQL Server Insert Row action, but it shows the segments of the EDI 850 message like GS01, SE01, etc. I need POLineNumber.
For example, when I use Parse JSON in another Logic App workflow for a different business scenario, it shows a suggestion such as "account". For more clarification, see the image below:
[screenshot: Parse JSON field suggestions]
So for the XML output I get suggestions like this:
[screenshot: Decode X12 segment suggestions]
But I need it to work as shown in the example flow image.
Which component should I use, or how can I get that particular field as a suggestion in the SQL Server Insert Row action? Please help me fix this issue.
Thanks in advance

The suggested fields come from the Decode X12 component and are the headers of the full EDI 850 interchange (GS, ST, ...). The component also suggests the "GoodMessages" and "BadMessages" collections to iterate over.
Those fields (POLineNumber, ...) belong to each item of GoodMessages, not to the Decode component itself.
If you validate the XML of each item, you will then get those fields as suggestions.

Related

SQL Blob to Base 64 in Table for FileMaker

I have looked around and found some instances where something similar is being done for websites, etc.
I have a SQL table that I am accessing in FileMaker Pro (through ESS) via an ODBC connection to the SQL database, and I have everything I need except for one field (LNL_BLOB) in one table (duo.MMOBJS), which is an image "(image, null)" and cannot be accessed via the ODBC connection.
What I am hoping to accomplish is to find a way that, when an image is placed in that field, it is ALSO converted to Base64 in another field in the same table. Also, the database creator has a "View" (a foreign concept to us FileMaker developers) with this same data, called "dbo.VW_BLOB_IMAGES", if that is helpful.
If there is a field with Base64 text, I can decode it within FileMaker to get the image.
What thoughts do you all have? Is there an even better way?
NOTE: I am using many tables and lots of the data in the app that I have made; this image is not the only reason I created the ODBC connection.
[screenshots: Table and View]
Well, one way to get Base64 out of SQL would be to trick SQL Server's XML engine into converting your column to Base64, then strip out the XML:
-- FOR XML ... BINARY BASE64 renders the blob as <r B="...base64..."/>;
-- the SUBSTRING strips the 6-character prefix <r B=" and the 3-character
-- suffix "/> (hence start position 7 and length LEN - 9).
SELECT SUBSTRING(Q.Base64Data, 7, LEN(Q.Base64Data) - 9)
FROM (
    SELECT (
        SELECT LNL_BLOB AS B
        FROM duo.MMOBJS
        FOR XML RAW('r'), BINARY BASE64
    ) AS [Base64Data]
) AS [Q]
You'd probably want to add that to your SELECT statement or a view rather than add it to the table, but you could write a trigger that maintains the field using that definition.
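Here is a minimal sketch of what such a trigger could look like, assuming a hypothetical nvarchar(max) column LNL_BLOB_B64 to hold the text and a key column ID on duo.MMOBJS (both names are assumptions, not from the original schema):

-- Keeps a hypothetical LNL_BLOB_B64 column in sync with LNL_BLOB.
-- ID is an assumed primary key; substitute the table's real key.
CREATE TRIGGER duo.TR_MMOBJS_Base64
ON duo.MMOBJS
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE M
    SET LNL_BLOB_B64 = SUBSTRING(Q.Base64Data, 7, LEN(Q.Base64Data) - 9)
    FROM duo.MMOBJS AS M
    INNER JOIN inserted AS I
        ON I.ID = M.ID
    CROSS APPLY (
        -- Same XML trick as above, correlated to the current row
        SELECT (SELECT M.LNL_BLOB AS B
                FOR XML RAW('r'), BINARY BASE64) AS Base64Data
    ) AS Q;
END;

Note the trigger uses the per-row form of the XML trick, so each row gets its own Base64 string rather than one concatenated value for the whole table.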

SSIS: ETL Data Validation -> XML to SQL (xml data validation)

I want to create an SSIS ETL package. I'm new to SSIS, but I managed to learn the basics from the internet and started working on it.
Source = XML.
Destination = Microsoft SQL Server database, with 2 separate tables: one for good records and another for bad records.
My final result should look like this: each specific column with an error should end up in the table below, and if there are 20 bad fields in a single XML row, then 20 bad records should be written separately to the table below.
Bad record structure:
[Slno]
[LoanNumber] = this is the primary key in my source, so it needs to be inserted for every bad data column.
[ErrorField] = the name of the input field in the XML that has the error.
[ErrorFieldValue] = the value of that error column.
[ErrorMessage] = an error message based on the validation.
Input XML data = the XML has 5 rows of data, and each row has 100 data fields.
I need to validate each and every data field in the XML before putting it into the SQL database table.
I tried to validate based on data conversion, for example: if (Amount) from the input data source != float, redirect the error to the SQL error table destination. But while mapping, I either need to map all the good fields or can only select a specific column; if there are 20-30 error fields in the input data, I'm not able to validate and map the error values.
The validations I need are length, alphanumeric, and date: the length of a field should not be > 10, or the row should move to the error table; likewise, Amount should not be alphanumeric, and dates should be valid.
Please help me solve this.
What I would suggest is to first generate an XSD that captures all your business rules. Use this XSD to validate the XML, and then proceed to process the valid XML in the Data Flow.
Here is a blog that talks about how to validate XML data using a Script Task. You can further tweak that code to capture the errors you are looking for by adding the appropriate OleDbConnection and writing that data into your error tables.
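If you land the raw values in a staging table first, the per-field checks themselves can also be expressed in T-SQL. This is a minimal sketch, not the Script Task approach above: dbo.Staging, dbo.ErrorRecords, and the column names other than LoanNumber are assumptions, and TRY_CONVERT requires SQL Server 2012 or later.

-- Hypothetical tables: dbo.Staging(LoanNumber, BorrowerName, Amount, LoanDate)
-- with everything staged as text, and dbo.ErrorRecords(Slno identity,
-- LoanNumber, ErrorField, ErrorFieldValue, ErrorMessage).
INSERT INTO dbo.ErrorRecords (LoanNumber, ErrorField, ErrorFieldValue, ErrorMessage)
SELECT S.LoanNumber, V.ErrorField, V.ErrorFieldValue, V.ErrorMessage
FROM dbo.Staging AS S
CROSS APPLY (VALUES
    -- length validation: field should not exceed 10 characters
    ('BorrowerName', S.BorrowerName,
     CASE WHEN LEN(S.BorrowerName) > 10 THEN 'Length exceeds 10' END),
    -- numeric validation: Amount must not be alphanumeric
    ('Amount', S.Amount,
     CASE WHEN TRY_CONVERT(float, S.Amount) IS NULL THEN 'Amount is not numeric' END),
    -- date validation: LoanDate must be a real date
    ('LoanDate', S.LoanDate,
     CASE WHEN TRY_CONVERT(date, S.LoanDate) IS NULL THEN 'Invalid date' END)
) AS V (ErrorField, ErrorFieldValue, ErrorMessage)
WHERE V.ErrorMessage IS NOT NULL;

Because the checks are unpivoted with CROSS APPLY (VALUES ...), a single source row with 20 failing fields produces 20 error rows, which matches the required bad-record structure.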

SQL Server save brackets in nvarchar for Arabic language

I am using SQL Server 2008 R2 to store my data. I have a table named postMaster where I save all the posts in my organization. All posts have to be described in both English and Arabic. My problem is with the Arabic description of the post.
My table structure is as follows:
and a sample record looks as follows:
As you can see from the picture, the Arabic description mixes up the brackets, and when I display the data in a DataGridView it looks the same way it was saved!
Is there a way to make these brackets be saved properly, as they are in English?
It's because postDescAr is not being displayed in an RTL (right-to-left) format. The data is saved correctly but not displayed correctly; it depends on where you query the data from.
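One way to convince yourself the storage is fine is to dump the Unicode code points of a stored description. A quick hedged check (the TOP (1) with no WHERE clause just picks an arbitrary row for illustration):

-- Prints each character of one Arabic description with its code point.
-- A correctly stored '(' is code point 40 and ')' is 41, even if a
-- left-to-right viewer renders them oddly next to RTL text.
DECLARE @s nvarchar(4000), @i int = 1;
SELECT TOP (1) @s = postDescAr FROM postMaster;
WHILE @i <= LEN(@s)
BEGIN
    PRINT CAST(@i AS varchar(10)) + ': '
        + SUBSTRING(@s, @i, 1)
        + ' = ' + CAST(UNICODE(SUBSTRING(@s, @i, 1)) AS varchar(10));
    SET @i += 1;
END;

If the code points are right, the fix belongs in the display layer (for example, setting the DataGridView or its column to right-to-left), not in SQL Server.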

Manually Entered Data on Excel MS Query Is Misaligned After Refresh

I have done an MS SQL query in Excel.
I have added extra columns in the Excel sheet in which I want to enter manual data.
When I refresh the data, these manually inputted columns become misaligned with the imported data they refer to.
Is there any way around this happening?
I have tried to link the imported data sheet to a manual data sheet via VLOOKUP, but this isn't working as there are no unique fields to link together.
Please help!
Thanks
The Excel version is 2010.
The MS SQL version is 2005.
There is no unique data.
This is because Excel first looks like this:
and when we enter a new order into the database, Excel looks like this:
Try this: in the External Data Range Properties, select "Insert entire rows for new data".
Not sure, but worth a try. And keep us updated on the result!
Edit: and make sure you provide a consistent sort order.
There is no relationship between the spreadsheet's external data and the columns you are entering. When refreshing, the data is typically cleared and updated, though there are other options in the external data refresh menu you could play with to change what happens to the new data.
If you want your manually entered data to link to the data in the embedded dataset, you have to establish the lookup with a VLOOKUP or some other formula that finds the row's info and shows it.
Basically, you are assuming the SQL data on the spreadsheet is static, but it isn't, unless you never refresh it or you disconnect it from the database.
Note that Marcel Beug has given a full solution to this problem in a more recent post in this forum: Inserting text manually in a custom column and should be visible on refresh of the report.
He has even taken the time to record an example in a video: https://www.youtube.com/watch?v=duNYHfvP_8U&feature=youtu.be

Move Error data to another table

How do we redirect error/failed rows to another table in SQL Server during a data import in SSIS 2008?
In the particular data flow component, go to Configure Error Output and choose to redirect the row. You may need to add some derived columns after that, and then UNION ALL your errors from the different parts of your package together if you want one unified error output.
Cade's way will work for any errors.
If you have data that you know in advance you want to redirect (say, states that are not in the list of official states, or people with no address), then you can do a Conditional Split and redirect the rows that way; a staged-data sketch of the same idea follows below. I prefer to check for known problem values rather than relying on a failed insert, to avoid sending my database things that might actually fit in the field but are data I don't want. For instance, I got a file that had the phrase "Legislative restriction" in the last name field; this clearly wasn't a person, so I redirected the rows. The actual text would have fit in our last-name field, and the record would have been inserted if I had just relied on error output.
