How to get a notification from an Azure SQL database on insert - sql-server

I need to get a notification or call a web service whenever a row is inserted into a specific table in my Azure SQL database. I have been searching the web for a good solution, but I haven't found any.
I tried to call an Azure web app service directly, but this is not allowed from Azure SQL Database.
I looked at Azure Logic Apps, but the SQL Server connector has been removed.
How do I get notified when a row is inserted?

Although this is not natively supported in Azure SQL Database, there are a few different options you can consider.
1) Modify the calling code to insert a row into the table and also write a message to an Azure Storage queue. A separate process can drain messages from the queue and invoke the web service, so the two actions stay loosely coupled.
2) Enable change tracking on the specific table so that your app can discover the latest changes (i.e., inserts) to the table. This feature is well documented if you search the Azure SQL docs.
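Option 2 can be sketched in T-SQL. This is a minimal sketch, assuming placeholder database, table, and key names (`MyDb`, `dbo.Orders`, `OrderId`); you enable change tracking once, then poll for anything newer than the last version you synced:

```sql
-- One-time setup: enable change tracking at the database level.
ALTER DATABASE MyDb
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- Enable it on the table you want to watch.
ALTER TABLE dbo.Orders ENABLE CHANGE_TRACKING;

-- On each poll: fetch rows changed since the last version you processed.
DECLARE @last_sync_version BIGINT = 0;  -- persist this value between polls

SELECT ct.SYS_CHANGE_OPERATION,  -- 'I' = insert, 'U' = update, 'D' = delete
       o.*
FROM CHANGETABLE(CHANGES dbo.Orders, @last_sync_version) AS ct
LEFT JOIN dbo.Orders AS o ON o.OrderId = ct.OrderId;

-- Record the current version as the high-water mark for the next poll.
SELECT CHANGE_TRACKING_CURRENT_VERSION();
```

The polling process (for example, a timer-triggered Azure Function) would filter for `SYS_CHANGE_OPERATION = 'I'` and call the web service for each new row.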

Related

Azure SQL database table - archiving to different Azure SQL database

I want to move all data older than 90 days from one Azure SQL server to a different Azure SQL server, and after the move, delete the moved data from the first server.
I want to run these steps on a daily basis.
I am new to Azure but was able to do this with Azure Data Factory. Can you please suggest any other better-suited approach?
You are already using the best approach.
Azure Data Factory is easy to use when it comes to extracting and copying data between services. It also provides trigger scheduling, i.e., triggering the copy pipeline after a specific interval of time or on an event. Refer to Create a trigger that runs a pipeline on a schedule.
If the volume of data is large, you can reconfigure the Integration Runtime (IR) resources (compute type and core count) to overcome performance issues, if required.

Auto sync Azure SQL DB data to Azure search index

I want to sync any DML operations on an Azure SQL DB to an Azure Search index immediately.
I have gone through this question:
How does Auto-indexing/sync of Azure SQL DB with Azure Search Works?
It has no answer and was posted almost 5 years ago.
With the integrated change tracking policy in place, is there an auto-sync feature by any means now?
Function apps do not have a SQL trigger event attached.
I don't want to run a while-true loop or a timer, or call the indexer whenever data gets updated.
Please suggest the best approach or any built-in feature.
Azure Functions don't have a SQL trigger, but Logic Apps do: https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-sqlazure Logic Apps can also trigger functions and custom APIs, so you should be able to trigger an indexing operation from a SQL operation this way. Keep in mind, however, that the indexing process itself may be delayed or take time once triggered, so your index may not immediately reflect the changes, depending on payload.
There are two ways you can integrate Azure SQL data with Azure Search.
Built-in indexer. You have built-in indexer support for Azure SQL. It supports incremental updating of the index, but with a limited refresh rate: currently, you can run incremental indexing at most every 5 minutes. See Connect to and index Azure SQL content using an Azure Cognitive Search indexer.
Push API. To support immediate updates, you have to push data via the Push API. In this case you only create the index, not the indexer. The code that pushes content to Azure SQL is also responsible for pushing that content to Azure Search. Check out this example: Tutorial: Optimize indexing with the push API.

Call external API on change to Azure SQL database table

We have a table called Guest in an Azure SQL database. We also have a campaign management tool sitting behind an API in the provider's cloud.
When a record is created, updated or deleted in the Guest table, we would like to call the API in order to update the campaign management tool with the latest information about the Guest.
Our initial idea was to hook up a database trigger to a C# .NET Azure Function, however, it looks like this is only supported in Cosmos DB.
We would prefer not to have an application running on a scheduled task that periodically checks for changes in the database and sends these changes to the API.
We have also been reading about creating CLR stored procedures but it looks like these are not supported in Azure SQL databases.
Looking forward to hearing ideas & suggestions.
I can think of a few ways to accomplish this.
[Unfortunately, CLR is no longer supported in Azure SQL Database.]
One way is:
Turn Change Data Capture (CDC) on for your Guest table.
Create a serverless Azure Function with a timer trigger. On each tick, the function uses CDC to determine what changed in the table and calls your vendor API accordingly.
The serverless function is relatively lightweight compared to "an application running on a scheduled task".
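The CDC side of this approach can be sketched in T-SQL. This is a hedged sketch assuming the `dbo.Guest` table from the question; the timer-triggered function would run the change query on each tick and track the last LSN it processed:

```sql
-- One-time setup: enable CDC on the database, then on the Guest table.
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Guest',
    @role_name     = NULL;

-- On each timer tick: read all changes between two LSNs.
-- (A real implementation would persist the last processed LSN
--  and use it as @from_lsn instead of the minimum.)
DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn('dbo_Guest'),
        @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();

SELECT __$operation,  -- 1 = delete, 2 = insert, 3/4 = update (before/after images)
       *
FROM cdc.fn_cdc_get_all_changes_dbo_Guest(@from_lsn, @to_lsn, N'all');
```

The function maps each change row to the corresponding create/update/delete call against the vendor API. Note that CDC availability depends on your Azure SQL service tier, so check the current docs before committing to this design.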
You can also use Azure Logic Apps for this case.
There are predefined triggers that fire:
When an item is created
When an item is modified
Then use an action to call your API.
Refer here
This is the simplest way to achieve your use case.
Alternatively, you could migrate to Azure SQL Managed Instance, which supports CLR and Service Broker.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-transact-sql-information#clr

Allow Data Push into an Azure SQL Database?

I'm relatively new to Azure and am having trouble finding what options are out there for connecting to an existing SQL database to push data into it.
The situation is that we have an external client who needs to connect to our Azure SQL database to push data into it on an ongoing basis. We can't give them permission to get into our database, so we're looking at what we can do to allow data in. At this point the best option seems to be a web service deployed in Azure that validates the data and then pushes it into our database.
The question I have is: are there other options to do this in an easier way? Are there Azure services or processes that can be set up to automatically process a file and pull the data into a database? Any other go-between options when each side has its own database and, for security reasons, can't just open up access to it?
Azure Data Factory works great for basic ETL. If neither party can grant direct access, you can use an intermediate repository like Blob Storage to drop CSV/XML/JSON files for ingestion. If they'll grant you access to pull, you can set up a linked service that more or less functions the same as a linked server in MSSQL. As of the latest release, ADF supports Azure-hosted SSIS packages too.
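The Blob Storage hand-off can also be consumed from T-SQL directly: Azure SQL Database can bulk-load a file from a blob container through an external data source. The names, SAS credential, and file layout below are illustrative assumptions, not a prescription:

```sql
-- One-time setup: a credential and an external data source for the container.
-- The SAS token is a placeholder; generate one scoped to the container.
CREATE DATABASE SCOPED CREDENTIAL BlobCred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = '<SAS token, without the leading ?>';

CREATE EXTERNAL DATA SOURCE ClientDropZone
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://myaccount.blob.core.windows.net/incoming',
      CREDENTIAL = BlobCred);

-- Per dropped file: load into a staging table for validation,
-- then merge validated rows into the real table.
BULK INSERT dbo.StagingGuests
FROM 'guests-2024-01-15.csv'
WITH (DATA_SOURCE = 'ClientDropZone',
      FORMAT = 'CSV',
      FIRSTROW = 2);
```

Loading into a staging table first keeps the validation step inside your database while the client only ever touches the blob container.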
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you can schedule the SSIS job to execute on a schedule. SSIS does bulk inserts in small batches, so you shouldn't have transaction log issues, and it should be efficient (because of the bulk inserting). Before you do this insert, though, consider your performance tier so you don't get heavily throttled by Azure or hit timeouts.

Azure Search from existing database

I have an existing SQL Server database that uses Full-Text Search and semantic search for the UI's primary searching capability. The tables used in the search contain around 1 million rows of data.
I'm looking at using Azure Search to replace this; however, my database relies on the Full-Text-enabled tables for its core functionality. I'd like to use Azure Search for the "searching" but still keep my current table structure in place, to be able to edit records and display the detail record when something has been found.
My thoughts to implement this is to:
Create the Azure indexes
Push all of the searchable data from the Full-Text-enabled table in SQL Server to Azure Search
Have Azure Search return the IDs of documents that match the search criteria
Query the existing database to fetch the rows with those IDs to display on the front end
When some data in the existing database changes, schedule an update in Azure Search to keep the data in sync
Is this a good approach? How do hybrid implementations work where your existing data is in an on-prem database but you want to take advantage of Azure Search?
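One common way to implement the "keep it in sync" step in a plan like the one above is a rowversion column, which SQL Server bumps automatically on every insert and update. A minimal sketch, assuming placeholder table and column names (`dbo.Product`, `RowVer`):

```sql
-- Add a rowversion column; SQL Server updates it on every insert/update.
ALTER TABLE dbo.Product ADD RowVer ROWVERSION;

-- On each sync run: pick up rows changed since the last high-water mark.
DECLARE @last_sync BINARY(8) = 0x0;  -- persist this value between runs

SELECT ProductId, Name, Description, RowVer
FROM dbo.Product
WHERE RowVer > @last_sync
ORDER BY RowVer;

-- Push these rows to Azure Search, then store MAX(RowVer) as the new mark.
```

A rowversion column is also what the Azure Search SQL indexer's high-water-mark change detection keys off, so adding one keeps both the push and indexer routes open.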
Overall, your approach seems reasonable. A couple of pointers that might be useful:
Azure SQL now has support for Full-Text Search, so if moving to Azure SQL is an option for you and you still want to use Azure Search, you can use the Azure SQL indexer. Or you can run SQL Server on IaaS VMs and configure the indexer using the instructions here.
With on-prem SQL Server, you might be able to use Azure Data Factory sink for Azure Search to sync data.
I actually just went through this process, almost exactly. Instead of SQL Server, we are using a different backend data store.
Foremost, we wrote an application to sync all existing data. Pretty simple.
For new documents being added, we made the choice to sync to Azure Search synchronously rather than async. We made this choice because we measured excellent performance when adding to and updating the index. 50-200 ms response time and no failures over hundreds of thousands of records. We couldn't justify the additional cost of building and maintaining workers, durable queues, etc. Caveat: Our web service is located in the same Azure region as the Azure Search instance. If your SQL Server is on-prem, you could experience longer latencies.
We ended up storing about 80% of each record in Azure Search. Obviously, the more you store in Azure Search, the less likely you'll have to perform a worst-case serial "double query."
