How to execute a SQL query in Azure Data Factory - sql-server

I created a pipeline in ADF that performs a copy activity. My source database is an Azure SQL database and the sink is Azure Blob storage. I want to execute a SQL query in ADF to delete the data from the source once it has been copied to the blob. I am not allowed to use a Copy or Lookup activity to execute the query. Is there any custom way to do this? I need to create a view and then run some activity on it. Please help.

You can also use the built-in stored procedure sp_executesql, which allows you to pass an arbitrary SQL statement as a parameter.
That way you don't have to implement your own stored procedure.
See sp_executesql (Transact-SQL) for more information about this stored procedure.
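For example, a Stored Procedure activity could invoke sp_executesql with the delete statement as its argument; a minimal sketch, in which the table name, column name, and cutoff value are placeholders rather than anything from the question:

EXEC sp_executesql
    N'DELETE FROM dbo.SourceTable WHERE LoadDate <= @cutoff;',
    N'@cutoff datetime2',
    @cutoff = '2024-01-01';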

If you are using mapping data flows, there is a new activity to execute custom SQL scripts:
Azure Data Factory mapping data flows adds SQL scripts to sink transformation
In a regular pipeline, you will probably have to resort to the Stored Procedure activity:
Transform data by using the SQL Server Stored Procedure activity in Azure Data Factory
You would have to write the delete logic in the SP, and then invoke the SP from Data Factory.

You can write a stored procedure that deletes the data from the source table and call it from a "Stored procedure" activity placed after the copy activity.
Your data flow will look like:
COPY ACTIVITY -----> STORED PROCEDURE ACTIVITY
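A minimal sketch of such a procedure, assuming a hypothetical flag column that marks the rows the copy activity has already moved:

CREATE PROCEDURE dbo.usp_DeleteCopiedRows
AS
BEGIN
    SET NOCOUNT ON;
    -- remove rows that the preceding copy activity has written to blob storage
    DELETE FROM dbo.SourceTable
    WHERE IsCopied = 1;
END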

They have since rolled out the Script activity.
The Script activity can be used for the following purposes:
Truncate a table or view in preparation for inserting data.
Create, alter, and drop database objects such as tables and views.
Re-create fact and dimension tables before loading data into them.
Run stored procedures. If the SQL statement invokes a stored procedure that returns results from a temporary table, use the WITH RESULT SETS option to define metadata for the result set.
Save the rowset returned from a query as activity output for downstream consumption.
The Script activity is found under the General tab of Activities.
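For the scenario in the question, the script body could simply be the delete statement itself; a hedged sketch with placeholder object names:

DELETE FROM dbo.SourceTable
WHERE LoadDate <= '2024-01-01';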
Ref 1: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-script
Ref 2: https://techcommunity.microsoft.com/t5/azure-data-factory-blog/execute-sql-statements-using-the-new-script-activity-in-azure/ba-p/3239969

Related

Is it possible to use Azure Data Factory copy activity to copy the data result of a view into a table?

I have staging tables in my SQL Server database, views that transform and combine those tables, and final tables that I create from the result data of the views.
I could automate the process by creating a stored procedure that truncates the final table and inserts the data from the view.
I want to know if it's possible to do this operation with an Azure Data Factory copy activity, using the view as source and the table as sink.
Thank you for your help!
ADF supports SQL Server as both source and sink.
So there are two ways:
You can use a copy activity with the view as your source and the table as the destination.
You can use a Stored Procedure activity in which all the data ingestion/transformation logic lives inside the stored procedure, and simply call that stored procedure (sketched below).
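A minimal sketch of the second option, using hypothetical object names:

CREATE PROCEDURE dbo.usp_LoadFinalTable
AS
BEGIN
    SET NOCOUNT ON;
    -- rebuild the final table from the transforming view
    TRUNCATE TABLE dbo.FinalTable;
    INSERT INTO dbo.FinalTable (Col1, Col2)
    SELECT Col1, Col2
    FROM dbo.TransformView;
END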

ETL (Extract Transform Load) versus stored procedure for processing fact table data

When processing a fact table, do people commonly use a Data Flow Task (in SSIS) as their ETL (Extract Transform Load), or do they use stored procedures to process the data for the fact table?
Is it common to use a Data Flow Task with a Lookup transformation to filter data and insert it into the fact table? Or to write a stored procedure with SQL queries that processes the data into the fact table?
When processing with a Data Flow Task, SSIS seems to load the data into server memory, run its Lookup and Filter transformations there, and write the data back in batches to the SQL Server database.
On the other hand, when processing with a stored procedure, the data stays in SQL Server. Your feedback is appreciated. Thank you.

Is it possible to call functions and stored procedures of one database in another database - Azure SQL Server

I need to call a function from one database in another database within Azure SQL. That is,
there is a function in DB1, and I need that same function in DB2.
How do I call it from DB2?
Both databases are on Azure SQL Server.
Maybe you can see this documentation: Reporting across scaled-out cloud databases (preview).
Elastic query also introduces a stored procedure that provides direct access to the shards. The stored procedure is called sp_execute_remote and can be used to execute remote stored procedures or T-SQL code on the remote databases. It takes the following parameters:
Data source name (nvarchar): The name of the external data source of type RDBMS.
Query (nvarchar): The T-SQL query to be executed on each shard.
Parameter declaration (nvarchar) - optional: String with data type definitions for the parameters used in the Query parameter (like sp_executesql).
Parameter value list - optional: Comma-separated list of parameter values (like sp_executesql).
sp_execute_remote uses the external data source provided in the invocation parameters to execute the given T-SQL statement on the remote databases. It uses the credential of the external data source to connect to the shard map manager database and the remote databases.
Example:
EXEC sp_execute_remote
    N'MyExtSrc',
    N'select count(w_id) as foo from warehouse'
This means you can call the functions and stored procedures of one database from another; you just need to modify the SQL statement accordingly. For example:
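Invoking a stored procedure on the remote database might look like the following sketch, where the procedure name is a placeholder:

EXEC sp_execute_remote
    N'MyExtSrc',
    N'EXEC dbo.usp_RefreshStats';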
Hope this helps.

Run a SQL Server procedure from Excel

I'm using SQL Server 2008 Enterprise. I created a procedure in one database. The procedure is composed of several queries to different databases and the final combined result set is being displayed.
I tried to execute it via Excel, so that the results would appear automatically in an Excel sheet, but I'm getting the error:
The query did not run, or the database table could not be opened. Check the database server or contact your DBA. Make sure the external database is available and hasn't been moved or reorganized, then try the operation again.
I created a simpler procedure that queries only one database, and its results displayed in the Excel sheet with no issues.
Hence I suspect that the original procedure failed because I'm querying several databases in it, while the connection details of the "External Data Properties" mention only one database.
My question is - can it be solved? Can I use multiple databases in the procedure and see the results in Excel?
Thanks,
Roni
I transformed the procedure to use table variables instead of temporary tables, and I added "set nocount on" at the beginning of the procedure.
The second change solved the issue: the row-count messages that SQL Server otherwise returns for every statement confuse Excel's data provider, and "set nocount on" suppresses them.
The first change improved the response time of the procedure.
(Copying the key parts of @Roni's answer)
create procedure dbo.xxx
as
set nocount on -- suppress the row-count messages that confuse Excel's data provider
...

SSIS - Log to table other than SYSSSISLOG

SSIS seems to insist on logging to the system table SYSSSISLOG. Is there a way to make it use a different table?
I want each package to log to a different table.
The quick answer is the same as John Sansom's: when logging is used, SSIS creates a table and a stored procedure (the names vary between the 2005 and 2008 versions). The stored procedure can be modified to do whatever you want. If the stored procedure is removed it is re-created, but if it is present it is assumed to be OK and left alone. This allows you to modify the stored procedure to write to whatever table or tables you want.
Well, you can query that huge-ass log table with something like this:
--first, we identify the packages
;with DetectedPackages as (
    select source, s.executionid
    from dbo.sysssislog as s
    where event = 'PackageStart'
    group by source, s.executionid
)
--then we use those executionids to display results
select *
from dbo.sysssislog as s
join DetectedPackages as dp on s.executionid = dp.executionid
where dp.source = 'PackageName'
And if you want to encapsulate every package in a view, now you know how to do that.
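A hedged sketch of such a view, with a placeholder package name:

create view dbo.vw_MyPackageLog
as
with DetectedPackages as (
    select source, s.executionid
    from dbo.sysssislog as s
    where event = 'PackageStart'
    group by source, s.executionid
)
select s.*
from dbo.sysssislog as s
join DetectedPackages as dp on s.executionid = dp.executionid
where dp.source = 'MyPackage';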
Take a look at the following article over on SQL Server Central; you may need to register, but it's free to do so and you will find the site to be an excellent SQL Server resource.
The article details how to implement a custom log provider that redirects the SSIS log output to another table. Using this implementation as your framework, you could extend it to meet your requirements.
SSIS Custom Logging the Easy Way
The above is correct but not well written. When you configure logging in SSIS, you can log to a specific data provider, e.g. the SSIS log provider for SQL Server. When you point this at a specific database, it creates a [dbo].[sysssislog] table under the System Tables folder in that database. If you navigate in SSMS to your database and then Programmability -> Stored Procedures, there will be a procedure called [dbo].[sp_ssis_addlogentry] that inserts the log entries from SSIS. You can repoint this stored procedure at the table you want to log to instead of the one generated by SSIS, as sketched below.
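A minimal sketch of that redirection; the parameter list below matches the SQL Server 2008 version of the procedure, and the target table dbo.MyPackageLog is a placeholder, so verify both against your own instance:

alter procedure dbo.sp_ssis_addlogentry
    @event sysname,
    @computer nvarchar(128),
    @operator nvarchar(128),
    @source nvarchar(1024),
    @sourceid uniqueidentifier,
    @executionid uniqueidentifier,
    @starttime datetime,
    @endtime datetime,
    @datacode int,
    @databytes image,
    @message nvarchar(2048)
as
begin
    -- write SSIS log entries to a custom table instead of dbo.sysssislog
    insert into dbo.MyPackageLog (event, computer, operator, source, sourceid,
                                  executionid, starttime, endtime, datacode, databytes, message)
    values (@event, @computer, @operator, @source, @sourceid,
            @executionid, @starttime, @endtime, @datacode, @databytes, @message);
end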