I have a stored procedure in database A that contains multiple SQL statements (DELETE/INSERT/UPDATE), and part of it calls procedures in database B inside a try block. If I run it locally it works fine, but when I execute the procedure through an ETL tool in parallel with concurrency, 2 or 3 out of 6 calls fail randomly, saying an object of database B does not exist (databaseA.object). I'm not sure why this is happening.
Any idea how to resolve it? Could it be tied to the flow of the SQL statements? How can we ensure that one SQL statement runs after another? It only fails for the statements tied to database B. The ETL tool connects to database A, and since it isn't failing for all statements, I don't think it's an authorization issue.
Create procedure DB_A_PROC(id varchar)
--
---
as
try {
    sql1 execution on database B (calling another procedure of database B)
    sql2 execution on database A (delete)
    sql3 execution on database B (calling another procedure of database B)
    sql4 execution on database B (calling another procedure of database B)
    sql5 execution on database A (insert)
    sql6 execution on database B (calling another procedure of database B)
}
catch {
}
Try wrapping your statements in transactions to make sure they aren't getting executed out of order due to the process running multiple times in parallel.
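In SQL Server-style T-SQL that could look something like the sketch below. This is only a sketch: the database, procedure and table names are placeholders, and if your platform uses different syntax the equivalent transaction construct applies.

CREATE PROCEDURE dbo.DB_A_PROC (@id varchar(50))
AS
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;

        EXEC DatabaseB.dbo.ProcB1 @id;              -- sql1: call into database B
        DELETE FROM dbo.TableA WHERE Id = @id;      -- sql2: delete on database A
        EXEC DatabaseB.dbo.ProcB2 @id;              -- sql3: call into database B
        EXEC DatabaseB.dbo.ProcB3 @id;              -- sql4: call into database B
        INSERT INTO dbo.TableA2 (Id) VALUES (@id);  -- sql5: insert on database A
        EXEC DatabaseB.dbo.ProcB4 @id;              -- sql6: call into database B

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        THROW;   -- surface the error instead of silently swallowing it
    END CATCH
END;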
I am new to SSIS. I am trying to create an ETL pipeline to automate the updating and deleting process for a database.
I have created a data flow task which reads the Excel file and sends the data to respective staging tables in SQL Server.
For the data to be updated in the main database, it has to go through some transformation in the staging tables. I have created a stored procedure that will enforce these changes.
I want the stored procedure to get called right after data is loaded through the data flow task to the staging tables rather than me going to SSMS to manually execute the stored procedure.
I have tried adding an Execute SQL Task on the Control Flow tab, but I am not getting any results.
I would like to further add many more transformations in this whole process in future steps. Any ideas on how to make this whole process more convenient would also be appreciated.
[Data Flow Task] -> [Execute SQL Task]
Configure the Execute SQL Task with a Direct Input value of
EXECUTE dbo.MasterQuery;
Based on the image of your stored procedure, it would appear you have a logic error in there.
IF EXISTS(SELECT 1 FROM dbo.OutlookDataStg WHERE [Flag] = 'Outlook')
BEGIN
UPDATE dbo.OutlookDataStg
SET [Data Type] = 'Outlook'
WHERE [Flag] = 'Actual'
-- Cut off at this point
END
The logic provided is: if there is at least one row in the table dbo.OutlookDataStg where the Flag value is 'Outlook', then update the same table, setting Data Type to 'Outlook' for any rows with a Flag of 'Actual'.
Unless you have some unusual condition, it would seem you've mixed up your Flag and Data Type values.
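If the intent was the more usual one of stamping the rows that are actually flagged 'Outlook', the update would presumably read something like this (a guess at the intent, not a confirmed fix):

UPDATE dbo.OutlookDataStg
SET [Data Type] = 'Outlook'
WHERE [Flag] = 'Outlook';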
When I run a procedure from R, it stops in the middle of execution. But if I run it directly in SQL Server, it completes.
Here is the code (there is not a lot to show):
library(RODBC)  # provides odbcDriverConnect() and sqlQuery()
connection <- odbcDriverConnect("SERVER=server_name;DRIVER={SQL Server};DATABASE=DB;UID=RUser;PWD=****")
stringEXEC <- "EXEC [dbo].[LongProcedure]"
data <- sqlQuery(channel = connection, query = stringEXEC, errors = TRUE)
Some remarks:
the procedure calls 12 other procedures, and each of the 12 creates a specific table (the query is too long to print here in the question)
And there is no error.
Why is this happening?
I ran into a similar issue. Currently, you can only execute SELECT statements from R this way, not stored procedures.
If you prefer working in RStudio, I suggest writing the results of your stored procedure into a table in SQL Server first, then using that table from R. You'll still get the benefit of scalability with that compute context.
Passing T-SQL select statement to sp_execute_external_script
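For example, the first step of that workaround could look something like this on the SQL Server side (a sketch; dbo.LongProcedureResults is a placeholder table that must already exist with columns matching the procedure's result set):

-- Materialize the procedure's result set into a table that R can read with a plain SELECT.
-- dbo.LongProcedureResults is a placeholder name, not part of the original question.
INSERT INTO dbo.LongProcedureResults
EXEC [dbo].[LongProcedure];

R would then query dbo.LongProcedureResults with an ordinary SELECT through sqlQuery.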
I'm using Visual Studio 2015 and SSIS to run a set of SQL statements in an Execute SQL Task and then transfer data between tables in SQL Server by executing the package. When I run a series of SQL statements in SSMS, I get results such as the rows affected for every successful statement. Now I want to automate the process using SSIS to reduce the turnaround time, and I would like to get the rows affected for every SQL query (select, insert, delete) inside the Execute SQL Task. How can this be done in SSIS? I don't have db_owner permission for stored procedures, so I'm thinking SSIS would be a quick way. It is very important for me to log the rows affected to validate the data, as it is financial data. I have nearly 10 SQL statements in each SQL task (selects and deletes), but the output is only one table.
For example my sql task is like below
select * from dbo.table1;
select * from dbo.table2 where city = 'Chicago';
create table dbo.table3 (id int, name varchar(50));
insert into dbo.table3 values (1, 'a');
select * from dbo.table3;
If I execute this in SSMS I get the rows affected for each statement and the table is also created. If I execute the same statements through a package in SSIS, how will I get those messages for each of them?
I assume your data lies in SQL Server. For the selects, you could use Data Flow Tasks and Row Count transformations instead of Execute SQL Tasks.
For inserts and updates there are a few ways to get the affected row count, like this: https://stackoverflow.com/a/1834264/5605866
or like this: http://microsoft-ssis.blogspot.fi/2011/03/rowcount-for-execute-sql-statement.html
Basically the same thing but with a bit different syntax.
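One common way to do this in an Execute SQL Task is to return @@ROWCOUNT as a single-row result set and map it to an SSIS variable, roughly like this (a sketch reusing the example table above; set the task's ResultSet property to "Single row" and map RowsAffected to a variable):

-- Run the statement, then return how many rows it touched as a one-row result set.
insert into dbo.table3 values (1, 'a');
select @@ROWCOUNT as RowsAffected;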
You can use the Row Count transformation after the data source and save the count to a variable. You can refer to this to get the number of rows returned from the source that SHOULD be processed.
Hope this helps.
I have a model in QlikView with DB2 files, xls files and SQL views.
All my views return no errors, all with data.
I used a stored procedure for some data (a list of items with no data on some days; the SP uses inserts into a temp table, cursors and joins).
In SQL Server Management Studio it returns normal results.
Example results for stored procedure
In QlikView, I tried:
centrosCostosSinDatos:
SQL GRANT Execute ON SP_nameStoredProcedure to qlikviewReader;
This returns no data, no table in the table viewer, no dimensions, nothing, but I don't get an error.
With LOAD, I get the error Field not found - <codigoCentroCosto>:
centrosCostosSinDatos:
LOAD codigoCentroCosto,
fecha;
SQL GRANT Execute ON SP_nameStoredProcedure to qlikviewReader;
With LOAD *, I get the error: Error: File extdata.cpp. Line 2903.
Thanks in advance.
Today I tested again.
I changed my stored procedure to use a regular table instead of a temporary table; during execution the table has no rows.
I granted execute permissions to the user (qlikviewReader) on the database server, and changed the execute line:
centrosCostosSinDatos:
SQL GRANT Execute ON SP_nameStoredProcedure to qlikviewReader;
to
centrosCostosSinDatos:
LOAD codigoCentroCosto as codeCentroCosto,
fecha;
SQL execute SP_centrosCostosSinDatos;
and it works. LOAD * works too.
In my case, GRANT EXECUTE from the load script did not work; granting the user permissions on the stored procedure directly on the server did.
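For reference, the grant that worked was issued on the database server itself rather than from the QlikView script. In SQL Server it would look roughly like this (a sketch; the dbo schema is assumed):

-- Run on the database server, not in the QlikView load script. The dbo schema is assumed.
GRANT EXECUTE ON OBJECT::dbo.SP_centrosCostosSinDatos TO qlikviewReader;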
I am just doing some stat collection on multiple servers, and as a test I'm working with my machine (Machine A) and another machine (Machine B) on the local network.
My machine (A) is collecting all the information into a staging table from the other machine (B). I have a stored procedure that runs and dynamically builds something like this:
exec ('exec SprocName ?', 1000) at [Machine B]
The above pulls the information needed from Machine B in 1000-row batches. It is looped until all the data is retrieved.
The next time through, with a different sproc name, it doesn't actually make the call to Machine B; it sees the row count as 0 and moves on. Only the first sproc that reaches the statement above ever runs.
So the pseudocode:
while (each sproc)
{
    set @qry = exec ('exec SprocName ?', 1000) at [Machine B]
    while (@rowcount <> 0)
    {
        exec (@qry)
    }
}
I have tried this before as select * from openquery([Machine B], 'exec SprocName @batchsize'), but I was trying a different method this time around. Does anybody have a clue why exec () at [Machine B] only wants to work with one sproc name? It will loop through and pull all the rows for the first one, but moving to the second sproc name apparently does not even call Machine B.
I'm not going to use four-part naming (ServerName.Database.Schema.Sproc) for performance reasons.
Some stats:
Machine A - Windows 7, SQL Server 2008 SP1, no CUs installed
Machine B - Windows Server 2003, SQL Server 2005 SP3, no CUs installed
Both have most of the relevant MSDTC options enabled, except XA transactions.
Thanks in advance if anyone actually understood my problem and can help.
I need to step away from the code for a bit every once in a while... I came back and noticed a flaw in the looping logic. The pseudocode was mostly right; it just didn't reset the row count variable I was using.
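A minimal sketch of that fix, assuming SQL Server and an INSERT ... EXEC ... AT pattern into a local staging table; names such as dbo.SprocList, dbo.Staging and the variable names are placeholders, not the original code:

-- Reset the row-count variable before each procedure's inner loop, so the second
-- procedure actually gets called instead of being skipped while the count is still 0.
DECLARE @sprocName sysname,
        @qry       nvarchar(max),
        @rows      int;

DECLARE sprocs CURSOR LOCAL FAST_FORWARD FOR
    SELECT SprocName FROM dbo.SprocList;   -- hypothetical list of procedures to pull

OPEN sprocs;
FETCH NEXT FROM sprocs INTO @sprocName;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @rows = 1;   -- the missing reset

    WHILE @rows <> 0
    BEGIN
        SET @qry = N'INSERT INTO dbo.Staging '
                 + N'EXEC (''EXEC ' + QUOTENAME(@sprocName) + N' ?'', 1000) AT [Machine B]; '
                 + N'SET @rows = @@ROWCOUNT;';
        EXEC sp_executesql @qry, N'@rows int OUTPUT', @rows = @rows OUTPUT;  -- rows pulled in this batch
    END;

    FETCH NEXT FROM sprocs INTO @sprocName;
END;

CLOSE sprocs;
DEALLOCATE sprocs;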