How to call a database package in ODI? The package contains a list of functions, and out of those I want to call only one function in ODI

There's a database package containing a list of functions, and out of that list I want to call one function in ODI 12c. I tried to create an ODI procedure and, within it, called the db package using .(return variable);
However, when I execute this ODI procedure, it fails with the error " is not a procedure or is undefined".
Any help is appreciated.
Thanks

If your goal is to store the value returned by that function in an ODI variable, you can use something like this as the refresh query for that variable:
select <SCHEMA_NAME>.<PACKAGE_NAME>.<FUNCTION_NAME>(<PARAMETERS>) from DUAL;
You can then refresh that variable in a package or a load plan.
If you don't need to store the result of the function but just want to execute it, the easiest way is to use a PL/SQL block in your ODI procedure. Make sure you set the technology to Oracle for that procedure step, then use something like this:
BEGIN
<SCHEMA_NAME>.<PACKAGE_NAME>.<FUNCTION_NAME>(<PARAMETERS>);
END;
A nicer way to do it is to avoid hardcoding the schema name and get it from the topology instead. Since it can be a different schema in different contexts, we need to use the substitution API, which will replace it at runtime. Here is what it would look like for a variable:
select <%=odiRef.getSchemaName("<LOGICAL_SCHEMA_NAME>", "D")%>.<PACKAGE_NAME>.<FUNCTION_NAME>(<PARAMETERS>) from DUAL;
And for a procedure:
BEGIN
<%=odiRef.getSchemaName("<LOGICAL_SCHEMA_NAME>", "D")%>.<PACKAGE_NAME>.<FUNCTION_NAME>(<PARAMETERS>);
END;

As this is a call to a function, a variable has to be declared in the ODI procedure to receive the function's return value.
So I made a small change and it worked:
DECLARE
  var1 varchar2(1000);
  v_ret BOOLEAN;
BEGIN
  v_ret := <Function Call>;
END;

Related

Execute two stored procedures: what is the order they get executed? [duplicate]

I have two T-SQL EXEC statements:
EXECUTE (N'MyDynamicallyGeneratedStoredProcedure') -- return 0 on success
SELECT @errCode = @@ERROR;
IF (@errCode = 0)
BEGIN
EXEC 'A Sql Statement using ##temptable created from first', @returnValue
END
How do I make the two EXECs synchronous? Right now the second EXEC does not wait for the first EXECUTE to complete. I tried issuing a WAITFOR DELAY; it waits, but the second EXEC statement never returns.
Thanks.
Update: here is more info.
The first EXECUTE creates a global temp table and populates it from a complex SELECT query.
The second EXEC is a CLR stored procedure that generates a dynamic SP, based on the variables from the recently created and populated global temp table.
Now the second EXEC complains that the global temp table is not found.
Update 2: found the issue (and it's me!!)
gbn (and others) was point-blank on the answer: EXEC IS synchronous. The problem? My understanding of the problem itself. I had mentioned
EXECUTE (N'MyDynamicallyGeneratedStoredProcedure') -- return 0 on success
It should have been:
1(a) EXECUTE (N'CreateMyDynamicStoredProcedure') -- return 0 on success
1(b) EXECUTE (N'MyDynamicStoredProcedure') -- return 0 on success
I missed that 1(b) was actually executed somewhere else, and after step (2).
(I should go get a life!!)
EXECUTE is synchronous. The 2nd one runs after the 1st one. Always.
Do you have multiple connections running the same code? You are using a global temp table that will be visible to all connections, so it may look like async execution...
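To make the visibility point concrete, here is a minimal sketch (the session split and the `##temptable` name are illustrative): a `##` table created in one session is visible to every other session, so two connections running the same batch would be reading and writing the same object.

```sql
-- Session 1
CREATE TABLE ##temptable (id INT);
INSERT INTO ##temptable VALUES (1);

-- Session 2 (a completely different connection) sees the same table
SELECT COUNT(*) FROM ##temptable;

-- The table is dropped automatically once the creating session ends
-- and no other session is still referencing it.
```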
As gbn's answer has pointed out, EXECUTE is synchronous.
The problem might be that the SqlConnection object within your CLR stored procedure is not in the same context as your batch script. Your global temporary table would have been dropped after EXECUTE (N'MyDynamicallyGeneratedStoredProcedure') finished.
Make sure that you create your SqlConnection object by passing "context connection=true".
Here is a post where someone had a similar problem accessing a temporary table because the SqlConnection was not in the same connection context:
Accessing TSQL created #temp tables from CLR stored procedure. Is it possible?
If your second CLR stored procedure runs through a different connection, the CLR sproc will not be able to access the global temp table, since it will already have been dropped.
Refer to this post on the global temporary table life cycle (when the global temp table is dropped):
Deleting Global Temporary Tables (##tempTable) in SQL Server

How to get name of executing stored procedure in Snowflake?

Does Snowflake have a function that returns the name of the current stored procedure, like the following in SQL Server?
SELECT OBJECT_NAME(@@PROCID)
I am just trying to build a logging table to log all statements that are executed inside a stored procedure. This is for monitoring purposes, i.e. which statement within the stored procedure failed and how long queries take to run. If Snowflake has something out of the box OR a recommended way of doing it, please share.
Try this from within your stored procedure:
const procName = Object.keys(this)[0];
Also see this related post.
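As a hedged sketch of how that line could be used for the logging use case (the `LOG_DEMO` procedure and `PROC_LOG` table are made up for illustration):

```sql
CREATE OR REPLACE PROCEDURE LOG_DEMO()
RETURNS STRING
LANGUAGE JAVASCRIPT
AS
$$
  // Inside a Snowflake JavaScript procedure, `this` holds the
  // procedure object; its first key is the procedure's name.
  const procName = Object.keys(this)[0];

  // Hypothetical logging table -- adjust to your own schema.
  snowflake.execute({
    sqlText: "INSERT INTO PROC_LOG (PROC_NAME, LOGGED_AT) VALUES (?, CURRENT_TIMESTAMP())",
    binds: [procName]
  });

  return procName;
$$;
```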

How to solve ORA-01006 in procedure?

I need a procedure to calculate the count of something and insert it into another table, but I get the error
ORA-01006: bind variable does not exist.
Here is my code:
The INSERT part is not executed and it jumps to the exception handler instead.
Your dynamic SQL call is
EXECUTE IMMEDIATE v_sql USING v_result;
This is the syntax for passing a parameter into the dynamic statement. But your code doesn't take any parameters, because you have concatenated them in the string. Therefore, the code hurls ORA-01006.
What you need to do instead is provide a variable for the result to be returned into. So the call should be
EXECUTE IMMEDIATE v_sql INTO v_result;
The syntax for EXECUTE IMMEDIATE is comprehensively covered in the PL/SQL guide. You should bookmark the Oracle documentation for future reference.
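A minimal sketch of the difference between the two clauses (the table and column names here are made up for illustration):

```sql
DECLARE
  v_sql    VARCHAR2(200);
  v_result NUMBER;
BEGIN
  -- INTO receives the value the dynamic query returns:
  v_sql := 'SELECT COUNT(*) FROM employees';
  EXECUTE IMMEDIATE v_sql INTO v_result;

  -- USING supplies values for bind placeholders in the statement;
  -- it only works when the statement actually contains them:
  v_sql := 'SELECT COUNT(*) FROM employees WHERE dept_id = :1';
  EXECUTE IMMEDIATE v_sql INTO v_result USING 10;
END;
/
```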

Can a dynamic table be returned by a function or stored procedure in SQL Server?

I would like to call a stored procedure or user-defined function that returns a dynamic table that is created via a pivot expression. I don't know the number of columns up front.
Is this possible? (I am not interested in temporary tables)
You can do that via a stored procedure, as it can return any kind of table. The question is: what are you trying to achieve, and what will you do with data that you have no idea about?
This cannot be done with functions (as the returned table structure must be pre-defined), but it can be done with a stored procedure. Some pseudo-code:
CREATE PROCEDURE Foo
AS
DECLARE @Command VARCHAR(MAX)
SET @Command = 'SELECT * FROM MyTable'
-- For debugging, work in an optional PRINT @Command statement
EXECUTE (@Command)
RETURN 0
When you run stored procedure Foo, it builds your query as a string in @Command and then dynamically executes it without knowing anything about what is being queried or returned, and the result set returned by that EXECUTE statement is "passed back" to the process that called the procedure.
Build your query with care; this stuff can be really hard to debug. Depending on your implementation, it might be a source of SQL injection attacks (remember, the stored procedure really doesn't know what that dynamic query is going to do). For quick stuff, EXECUTE() works fine, but for safer and more useful (if elaborate) solutions, look into sp_executesql.
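For comparison, here is a sketch of the same idea with sp_executesql, which lets you keep parameters as bind variables instead of concatenating their values into the string (the table and parameter names are made up for illustration):

```sql
DECLARE @Command NVARCHAR(MAX);
SET @Command = N'SELECT * FROM MyTable WHERE Id = @Id';

-- Parameters are passed separately, so their values are never
-- spliced into the SQL text (which helps against injection).
EXEC sp_executesql
     @Command,
     N'@Id INT',   -- parameter definition list
     @Id = 42;     -- actual value
```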
Yes, you can do this from a stored procedure, but not from a user-defined function. It is worth looking into table-valued functions; I believe you can also return a dynamic table from there, but I have not used that myself.

How to use stored procedures within a DTS data transformation task?

I have a DTS package with a data transformation task (data pump). I’d like to source the data with the results of a stored procedure that takes parameters, but DTS won’t preview the result set and can’t define the columns in the data transformation task.
Has anyone gotten this to work?
Caveat: The stored procedure uses two temp tables (and cleans them up, of course)
Enter some valid values for the stored procedure parameters so it runs and returns some data (or even no data; you just need the columns). Then you should be able to do the mapping, etc. Then do a disconnected edit and change to the actual parameter values (I assume you are getting them from a global variable).
DECLARE @param1 DataType1
DECLARE @param2 DataType2
SET @param1 = global variable
SET @param2 = global variable (I forget exact syntax)
--EXEC procedure @param1, @param2
EXEC dbo.proc value1, value2
Basically you run it like this so the procedure returns results. Do the mapping, then in disconnected edit comment out the second EXEC and uncomment the first EXEC and it should work.
Basically you just need to make the procedure run and spit out results. Even if you get no rows back, it will still map the columns correctly. I don't have access to our production system (or even database) to create DTS packages, so I create them in a dummy database and replace the stored procedure with something that returns the same columns the production app would, but no rows of data. Then, after the mapping is done, I move it to the production box with the real procedure and it works. This works great if you keep track of the database via scripts: you can just run the script to build an empty shell procedure and, when done, run the script to put back the true procedure.
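The "empty shell procedure" trick can be sketched like this (the procedure, parameter, and column names are hypothetical; the real procedure would share the same signature and column list):

```sql
-- Shell version used while designing the DTS package:
CREATE PROCEDURE dbo.GetOrders
    @From DATETIME,
    @To   DATETIME
AS
-- Same column names and types as the real procedure, but zero rows,
-- so DTS can map the columns without touching production data:
SELECT OrderId   = CAST(NULL AS INT),
       OrderDate = CAST(NULL AS DATETIME)
WHERE 1 = 0
GO
```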
You would need to actually load the results into a table; then you can use a SQL task to move them from that table into the permanent location if you must make a translation.
However, I have found that if you are working with a stored procedure to source the data, it is almost just as fast and easy to move it to its destination at the same time!
Nope, I could only use stored procedures with DTS by having them save their state in scrap tables.
