How to use for-each in XQuery in OSB?

Hi, I want to copy all rows from one table to another table through Oracle Service Bus, so I am using XQuery for the transformation, but it transfers only one row. I need a for-each over all the records, but I don't know how to use it.
Please show me how to use for-each in XQuery.

The graphical XQuery editor might help. See samples here:
https://docs.oracle.com/cd/E29542_01/dev.1111/e15866/examples.htm#OSBDV749
(especially Example 7-6)
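The usual way to iterate in XQuery is a FLWOR `for` expression that loops over every repeating source element and emits one target element per row. A minimal sketch, where the element names (`rows`, `row`, `record`, `id`, `name`) are hypothetical placeholders for whatever your actual source and target schemas define:

```xquery
(: Iterate over every repeating <row> in the source and build one <record> each.
   Replace the element names with those from your real schemas. :)
<records>
  {
    for $row in $sourceDoc/rows/row
    return
      <record>
        <id>{ data($row/id) }</id>
        <name>{ data($row/name) }</name>
      </record>
  }
</records>
```

In the OSB graphical XQuery editor, dragging a repeating source node onto a repeating target node generates exactly this kind of `for … return` loop for you.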

Related

Performance-efficient JSON data masking in Snowflake

I am trying to perform data masking on JSON data, using a JavaScript UDF to update a list of nested JSON-path attributes, similar to what is done here:
https://www.snowflake.com/blog/masking-semi-structured-data-with-snowflake/
Additionally, I tried nested OBJECT_INSERT statements to mask a specific attribute, but with multiple attributes to mask I have to build a chain of subqueries, each applying OBJECT_INSERT to the previous subquery's result, which is complex.
Ex:
FROM (
  SELECT OBJECT_INSERT(
           VAR_COL, 'LVL1',
           OBJECT_INSERT(VAR_COL:LVL1, 'KEY1',
             OBJECT_INSERT(VAR_COL:LVL1.KEY1, 'KEY2', 'VALUE', TRUE),
             TRUE),
           TRUE) AS VAR_COL
  FROM TABLE
)
Another problem with OBJECT_INSERT, which prevents me from using it, is that if the JSON path doesn't exist for a specific row, OBJECT_INSERT adds that path, which I don't want.
I am working with millions of records, and on an XS warehouse a simple query using the JavaScript UDF takes 15 minutes.
Alternatively, I also tried a Snowpark UDF, but it shows only a very small improvement.
Any ideas on improving performance further?
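For reference, the per-path masking logic the question describes can be sketched in plain JavaScript; the same body would sit inside a Snowflake JavaScript UDF taking the VARIANT column as input. The path list and mask value below are illustrative assumptions, not taken from the question:

```javascript
// Sketch: mask a list of nested dot-paths in a parsed JSON object.
// Unlike nested OBJECT_INSERT, a path that is absent in a given row
// is simply skipped, so missing paths are never created.
function maskPaths(obj, paths, mask) {
  for (const path of paths) {
    const keys = path.split('.');
    let node = obj;
    // Walk down to the parent of the leaf key; stop early if absent.
    for (let i = 0; i < keys.length - 1 && node; i++) {
      node = typeof node === 'object' ? node[keys[i]] : undefined;
    }
    const leaf = keys[keys.length - 1];
    if (node && typeof node === 'object' && leaf in node) {
      node[leaf] = mask;
    }
  }
  return obj;
}

const doc = { LVL1: { KEY1: { KEY2: 'secret' }, other: 1 } };
maskPaths(doc, ['LVL1.KEY1.KEY2', 'LVL1.missing.x'], '***');
// doc.LVL1.KEY1.KEY2 is now '***'; the absent path is left untouched
```

On performance: a JavaScript UDF pays a per-row JSON parse/serialize cost, so for millions of rows the main levers are warehouse size and batching, not the masking logic itself.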

How to modify the projection of a dataset in an ADF Dataflow

I want to optimize my dataflow by reading just the data I really need.
I created a dataset that maps a view on my database. This dataset is used by different dataflows, so I need a generic projection.
Now I am creating a new dataflow and I want to read just a subset of the dataset.
Here is how I created the dataset:
And this is the generic projection:
Here is how I created the data flow. These are the source settings:
But now I want just a subset of my dataset:
It works, but I think I am doing it wrong:
I want to read data from my dataset (as you can see in the Source settings tab), but when I modify the projection I read from the underlying table (as you can see in Source options). It seems inconsistent. What is the correct way to manage this kind of customization?
Thank you
EDIT
The proposed solution does not solve my problem. If I go into Monitor and analyze the executions, this is what I see...
Before, I had applied the approach I wrote above and got this:
As you can see, I read just 8 columns from the database.
With the proposed solution, I get this:
And just after:
Just to be clear, the purpose of my question is:
How can I read only the data I really need, instead of reading all the data and filtering it afterwards?
I found a way (explained in my question), but there is an inconsistency in the dataflow configuration (I set a dataset as input, but in the options I write a query that reads from the database).
First, import the data as a source.
You can use the Select transformation in the Data Flow activity to select CustomerID from the imported dataset.
There you can remove unwanted columns.
Refer to https://learn.microsoft.com/en-us/azure/data-factory/data-flow-select
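In data flow script terms, the Select transformation the answer refers to looks roughly like this; the stream name and column names here are hypothetical placeholders:

```
source1 select(mapColumn(
        CustomerID,
        OrderDate
    ),
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> SelectColumns
```

Note that Select trims columns inside the data flow; to avoid reading the unwanted columns from the database in the first place, the projection or a source query still has to change at the source, which is the inconsistency the question is pointing at.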

User Defined Table Function with Procedural Logic

Our company is setting up a new Snowflake instance, and we are attempting to migrate some processing currently done in MS SQL Server. I need to migrate a table-valued SQL function into Snowflake. The source function has procedural logic in it, which to my knowledge is not allowed in Snowflake SQL UDTFs. I have been searching for a workaround with no success.
To be as specific as I can: I need a function that takes a string as input, decodes that string, and returns a table with the keys and their corresponding values. I cannot condense all of the logic to split the string and decode the keys into one SQL statement, so a Snowflake SQL UDTF will not work.
I looked into whether a UDTF can call a procedure so I could simply return a result, but it does not look like that will work. Please let me know if there is any way to work around this.
I think Javascript UDTF is what you're looking for in Snowflake:
https://docs.snowflake.com/en/sql-reference/udf-js-table-functions.html
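As a sketch, the procedural decode logic can be written in plain JavaScript; inside a Snowflake JavaScript UDTF the same loop would live in `processRow` and emit each row via the row writer instead of returning an array. The `;` and `=` delimiters here are assumptions about the encoded string format:

```javascript
// Sketch: decode an encoded string like "k1=v1;k2=v2" into key/value rows.
// In a Snowflake JavaScript UDTF this loop would sit inside processRow.
function decodeToRows(encoded) {
  const rows = [];
  for (const pair of encoded.split(';')) {
    if (!pair) continue;            // skip empty segments
    const idx = pair.indexOf('=');
    if (idx < 0) continue;          // skip malformed segments
    rows.push({ KEY: pair.slice(0, idx), VALUE: pair.slice(idx + 1) });
  }
  return rows;
}
```

In Snowflake this would be wrapped in a `CREATE FUNCTION ... RETURNS TABLE (KEY VARCHAR, VALUE VARCHAR) LANGUAGE JAVASCRIPT` definition, with the decode loop emitting one output row per key.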
Funny, I just stumbled onto this as I'm running into the same thing myself. I found there is a SPLIT_TO_TABLE function that may be able to accomplish this. As Greg suggested, nesting this together in the form of a CTE combined with a JOIN may allow you to accomplish what you need to do.

Passing Variable in SSIS

I need some help passing a row value to another task in an SSIS package. Here is my sample query: select distinct txnno from tbltxn. What I need is to get the distinct txnno values from this query and delete records from another table based on each txnno. I think we can pick up txnno in a variable inside a Foreach Loop container and pass that recordset value to the query used for the delete. But I have not done this before, so I need some clues and examples to solve this problem.
Here you go - the documentation includes a link to samples on Codeplex:
http://technet.microsoft.com/en-us/library/cc280492.aspx
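The usual pattern for this: an Execute SQL Task runs the distinct query with "Full result set" and stores it in an Object variable; a Foreach Loop container with the ADO enumerator iterates that recordset, mapping each row's txnno into a scalar variable; and an inner Execute SQL Task runs a parameterized delete. The target table name below is hypothetical:

```sql
DELETE FROM tblOther WHERE txnno = ?
```

The `?` placeholder is bound to the loop's txnno variable on the inner task's Parameter Mapping tab.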

ODBC iterate table without storing in memory

I need a way to iterate through a database table without storing it in memory. I want to essentially read through the rows like an input iterator.
I've tried using cursors with a select statement (select * from table_name), but this retrieves the entire table and then feeds it back to me one row at a time. So this solution is no good. Instead, I need it to fetch each row only as I ask for it.
Any suggestions are greatly appreciated.
Thanks!
You'll just want to use a forward-only cursor. Your DB will need to support this. For details, see MSDN's How to: Use Cursors.
If you're using SQL Server, you can use a Fast Forward-Only Cursor, which provides extra benefits.
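With the raw ODBC API, a forward-only, read-only cursor lets the driver fetch rows on demand instead of caching the whole result set. A sketch (it assumes an already-allocated statement handle `hstmt` on an open connection, and omits error handling; the table and column names are placeholders):

```c
/* Request a forward-only, read-only cursor BEFORE executing the statement. */
SQLSetStmtAttr(hstmt, SQL_ATTR_CURSOR_TYPE,
               (SQLPOINTER)SQL_CURSOR_FORWARD_ONLY, 0);
SQLSetStmtAttr(hstmt, SQL_ATTR_CONCURRENCY,
               (SQLPOINTER)SQL_CONCUR_READ_ONLY, 0);

SQLExecDirect(hstmt, (SQLCHAR *)"SELECT col1 FROM table_name", SQL_NTS);

SQLCHAR col1[256];
SQLLEN  len;
SQLBindCol(hstmt, 1, SQL_C_CHAR, col1, sizeof(col1), &len);

/* Each SQLFetch asks the driver for the next row only. */
while (SQL_SUCCEEDED(SQLFetch(hstmt))) {
    /* process col1 here */
}
```

Whether rows are truly streamed still depends on the driver and its rowset/caching settings, but forward-only read-only is the cursor mode that allows it.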
