Calling API from Azure SQL Database (as opposed to SQL Server) - sql-server

So I have an Azure SQL Database instance that I need to run a nightly data import on, and I was going to schedule a stored procedure to make a basic GET request against an API endpoint, but it seems the OLE Automation objects aren't available in the Azure version of SQL Server. Is there any other way to make an API call from Azure SQL Database, or do I need to set up something outside of the database to accomplish this?

There are several options. I do not know whether a PowerShell job, as suggested in the first comment on your question, can execute HTTP requests, but I do know of at least a couple of alternatives:
Azure Data Factory allows you to create scheduled pipelines to copy/transform data from a variety of sources (like HTTP endpoints) to a variety of destinations (like Azure SQL databases). This involves little or no scripting.
Azure Logic Apps allows you to do the same:
With Azure Logic Apps, you can integrate (cloud) data into (on-premises) data storage. For instance, a logic app can store HTTP request data in a SQL Server database.
Logic apps can be triggered on a schedule as well and involve little or no scripting.
You could also write an Azure Function that runs on a schedule, calls the HTTP endpoint, and writes the result to the database (a sketch follows below). Multiple languages are supported for writing functions, such as C# and PowerShell.
All of these options also allow you to force an execution outside the schedule.
In my opinion, Azure Data Factory (no coding) or an Azure Function (code only) are the best options given the need to parse a lot of JSON data. Do keep in mind that Azure Functions on a Consumption Plan have a maximum execution time of 10 minutes per invocation.
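To make the Azure Function option concrete, here is a minimal sketch of a timer-triggered C# function (in-process model) that fetches from a hypothetical endpoint and stores the raw payload in a staging table. The endpoint URL, app setting name, and table name are assumptions for illustration, not part of the original answer.

```csharp
// Minimal sketch of a scheduled Azure Function calling an HTTP endpoint and
// writing the result to Azure SQL. URL, setting, and table are assumed names.
using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Data.SqlClient;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class NightlyImport
{
    private static readonly HttpClient Http = new HttpClient();

    // Runs every night at 02:00 UTC.
    [FunctionName("NightlyImport")]
    public static async Task Run([TimerTrigger("0 0 2 * * *")] TimerInfo timer, ILogger log)
    {
        // Hypothetical API endpoint; replace with the real one.
        string json = await Http.GetStringAsync("https://api.example.com/export");

        var connectionString = Environment.GetEnvironmentVariable("SqlConnectionString");
        using (var conn = new SqlConnection(connectionString))
        {
            await conn.OpenAsync();
            // Store the raw payload; a stored procedure can parse it later with OPENJSON.
            using (var cmd = new SqlCommand(
                "INSERT INTO dbo.ApiImportStaging (Payload, ImportedAt) VALUES (@p, SYSUTCDATETIME())",
                conn))
            {
                cmd.Parameters.AddWithValue("@p", json);
                await cmd.ExecuteNonQueryAsync();
            }
        }

        log.LogInformation("Nightly import completed.");
    }
}
```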

Related

Azure SQL database table - archiving to different Azure SQL database

I want to move all data that is more than 90 days old from one Azure SQL server to a different Azure SQL server, and after moving it I need to delete the moved data from the first server.
I want to run these steps on a daily basis.
I am new to Azure and was able to do this with Azure Data Factory. Can you please suggest any other approach that would be better suited?
You are already using the best approach.
Azure Data Factory is easy to use when it comes to extracting and copying data between services. It also provides trigger scheduling, i.e., triggering the copy pipeline at a specific interval or on an event. Refer to Create a trigger that runs a pipeline on a schedule.
If the volume of data is large, you can reconfigure the Integration Runtime (IR) resources (compute type and core count) to address performance issues, if required. (The original answer referred to a screenshot of the IR settings.)

Emergency Contacts

My project is to nightly upload employee next-of-kin details somewhere offsite in case of emergency. Security needs to be locked down to just three named users.
The data is already available in a VM SQL Server 2014 view.
My first bash was to create a SQL Agent job to extract to CSV via BCP, then (step 2) to upload to an Azure file share via AzCopy.
I thought I'd nailed my first Azure project... but sadly this uses a shared access signature (appended to the URL) and not Azure AD, so I don't think this will do? (Not sure.)
Any ideas please?
If you don't need the data structured at the other end of your export, then a flat file as you suggested would work well and keep the complexity down. Shared access signatures will work well; you just have to renew the signature when it expires. You'd have the same issue with Azure AD, as the authentication token would also expire.
You could back up the full database to Azure Storage directly from SQL Server Management Studio, but this sounds like overkill for your requirements.
Another way to do it is with Azure Data Factory; however, this adds the cost of executing the data pipeline to move your data, and extra complexity for a simple task.
Personally, the simplest way would be to export the data to a CSV on the file system, then have a scheduled task run a PowerShell script to upload it to your blob container in Azure Storage.
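The answer suggests PowerShell for the upload step; as an alternative illustration, here is a minimal C# sketch of the same step using the Azure.Storage.Blobs SDK. The connection string setting, container name, and file path are assumptions.

```csharp
// Minimal sketch: upload an exported CSV to a blob container in Azure Storage.
// Connection string setting, container name, and file path are assumed values.
using System;
using Azure.Storage.Blobs;

class UploadExport
{
    static void Main()
    {
        var connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING");
        var container = new BlobContainerClient(connectionString, "nightly-exports");
        container.CreateIfNotExists();

        // Upload the BCP output, overwriting the previous night's file.
        var blob = container.GetBlobClient("next-of-kin.csv");
        blob.Upload(@"C:\exports\next-of-kin.csv", overwrite: true);
    }
}
```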

How do I replicate data from the Common Data Service to SQL Server on Azure?

I have data in Microsoft's Common Data Service (from Microsoft Dynamics for Talent). I can't use the Data Management Framework as the data in question is in entities that are not available through the DMF.
How do I replicate the data in CDS back to a SQL database?
What I've tried so far is to create a logic app (and flow, neither worked) that grabs data using the CDS connector and pushes it into an SQL database, but there are several problems with this:
It's a maintenance burden
It's extremely tedious and error-prone to add new tables, etc. I have written a somewhat horrendous stored proc that tries to create a table based on the JSON-ified data given to it from the flow, but this is very error prone.
It doesn't work at all, since the size of the data exceeds some kind of limitation in the SQL connector and I get spurious errors.
Rather than trying to push through with these issues, I'd rather ask whether there's a better way to achieve this. With the Data Management Framework in Dynamics it was simply a matter of scheduling these sync jobs, which worked pretty well. Is there something similar with CDS?
I've also tried looking at the Data Integration projects in Powerapps, but these only seem to allow me to get data into Powerapps/CDS, not back out...
Common Data Service for Apps provides access to the data using the user interfaces or API, there is no direct access to the underlying database. This architecture has certain limitations when it comes to processing large volumes of data, for example for the purposes of data warehousing, reporting, or using Azure machine learning and analytics tools. Replicating CDS data using Extract, Transform, Load (ETL) tools is possible but inherently complex to maintain.
Data Export Service is a service made available on Microsoft AppSource that adds the ability to replicate Dynamics 365 for Customer Engagement apps data to an Azure SQL Database store in a customer-owned Azure subscription.
Note: The Data Export Service requires a Dynamics 365 for Customer Engagement apps subscription; it is not available on Common Data Service for Apps plans.

Allow Data Push into an Azure SQL Database?

I'm relatively new to Azure and am having trouble finding what options are out there for connecting to an existing SQL database to push data into it.
The situation is that we have an external client who needs to connect to our Azure SQL database to push data into it on an ongoing basis. We can't give them permission to get into our database, so we're looking at what we can do to allow data in. At this point the best option seems to be to create a web service deployed in Azure that will validate the data and then push it into our database.
The question I have is, are there other options to do this in an easier way? Are there Azure services or processes that can be set up to automatically process a file and pull the data into a database? Any other go-between options when each side has their own database and for security reasons can't just open up access to it?
Azure Data Factory works great for basic ETL. If neither party can grant direct access, you can use an intermediate repository like Blob Storage to drop csv/xml/json files for ingestion (see the sketch below). If they'll grant you access to pull, you can set up a linked service that more or less functions the same as a linked server in MSSQL. As of the latest release, ADF also supports Azure-hosted SSIS packages.
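To illustrate the intermediate Blob Storage approach, here is a minimal C# sketch of the external client dropping a daily file into a container through a container-level SAS URL you provide. The SAS URL, blob name, and local file path are assumptions.

```csharp
// Minimal sketch: external client drops a CSV into an intermediate blob container
// using a container-level SAS URL supplied by the database owner.
// The SAS URL, blob name, and local file path are assumed values.
using System;
using Azure.Storage.Blobs;

class DropDailyFile
{
    static void Main()
    {
        // e.g. https://youraccount.blob.core.windows.net/incoming?sv=...&sig=...
        var sasUrl = Environment.GetEnvironmentVariable("INCOMING_CONTAINER_SAS");
        var container = new BlobContainerClient(new Uri(sasUrl));

        // Date-stamped name so ADF can pick up each day's drop for ingestion.
        var blob = container.GetBlobClient($"orders-{DateTime.UtcNow:yyyyMMdd}.csv");
        blob.Upload(@"C:\outbound\orders.csv", overwrite: true);
    }
}
```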
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you could schedule the SSIS job to execute on a schedule. SSIS does bulk inserts in small batches, so you shouldn't have transaction log issues and it should be efficient (because of the bulk inserting). Before you do this insert, though, you will probably want to consider your performance tier so you don't get major throttling by Azure and possible timeouts.
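If you end up coding the load yourself rather than using SSIS, the same batched bulk-insert behavior is available via SqlBulkCopy. A minimal sketch, in which the connection string setting, destination table, and sample columns are assumptions:

```csharp
// Minimal sketch: batched bulk insert into Azure SQL, similar in spirit to what
// SSIS does. Connection string setting, table, and columns are assumed values.
using System;
using System.Data;
using System.Data.SqlClient;

class BulkLoad
{
    static void Main()
    {
        var connectionString = Environment.GetEnvironmentVariable("SQL_CONNECTION_STRING");

        // In practice this table would be populated from the client's csv/xml/json file.
        var table = new DataTable();
        table.Columns.Add("OrderId", typeof(int));
        table.Columns.Add("Amount", typeof(decimal));
        table.Rows.Add(1, 19.99m);

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.IncomingOrders";
                bulk.BatchSize = 5000;        // small batches keep the transaction log manageable
                bulk.BulkCopyTimeout = 600;   // allow extra time on throttled lower tiers
                bulk.WriteToServer(table);
            }
        }
    }
}
```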

Exposing SQL Data to clients

We have an internal SQL Server 2008 R2 database that we'd like to expose (partially, only some tables) to our clients via the Internet so they can feed their Excel reports. What are our best options? How should we provide security (i.e., should we create another, staging DB server in the DMZ for this)? As for the quantity of data to transfer, it's very small (< 100 records).
Here is one simple way to start with if they need live, real-time access:
Create a custom SQL user account for web access, locked down with read-only access to the relevant tables or stored procedures.
Create a REST web service that connects to the database using the SQL Account above. Expose methods for each set of data that can be retrieved.
Make sure the web service runs over SSL (HTTPS) and requires username/password authentication - for example via BASIC auth with custom hard-coded account per client.
Then when the clients need to retrieve data, they can access a specific URL and receive data in CSV format or whatever is convenient for their reports. Also, REST web services are easily accessed via XMLHTTPObject if you have clients that are technically-savvy and can write VBA macros.
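As a rough illustration of that approach, here is a minimal sketch of such a read-only endpoint as an ASP.NET Core minimal API, returning CSV behind a hard-coded Basic-auth check. The connection string name, credentials, view, and columns are assumptions.

```csharp
// Minimal sketch of a read-only REST endpoint returning CSV over HTTPS with
// Basic auth. Connection string name, credentials, view, and columns are assumed.
using System.Text;
using Microsoft.Data.SqlClient;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/reports/orders", async (HttpContext ctx, IConfiguration config) =>
{
    // Very simple Basic-auth check against a hard-coded per-client account.
    var auth = ctx.Request.Headers["Authorization"].ToString();
    var expected = "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes("client1:s3cret"));
    if (auth != expected)
        return Results.Unauthorized();

    // Uses a read-only SQL account that can only see the relevant view.
    var csv = new StringBuilder("OrderId,Amount,OrderDate\n");
    using var conn = new SqlConnection(config.GetConnectionString("Reporting"));
    await conn.OpenAsync();
    using var cmd = new SqlCommand(
        "SELECT OrderId, Amount, OrderDate FROM dbo.ClientOrdersView", conn);
    using var reader = await cmd.ExecuteReaderAsync();
    while (await reader.ReadAsync())
        csv.AppendLine($"{reader["OrderId"]},{reader["Amount"]},{reader["OrderDate"]:yyyy-MM-dd}");

    return Results.Text(csv.ToString(), "text/csv");
});

app.Run(); // host behind HTTPS so the credentials are never sent in the clear
```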
If the data is not needed real-time - for instance, if once a day is often enough, you could probably just generate .csv output files and host them somewhere the clients can download manually through their web browser. For instance, host on an FTP site or simple IIS website with BASIC authentication.
If the data is not needed in real time, another alternative is to use SSIS or SSRS to export an Excel file and email it to your clients.