Azure Storage to Azure SQL Server using FunctionApp

I have a Storage Account on Azure that receives lots of JSON files from an external source. I am constantly running an exe on my computer that picks up each JSON file, reads the data, converts it into table rows, and uploads it to a SQL Server database on Azure - a synchronization process.
I have not worked with WebJobs or Function Apps before.
I believe this can be done with a Function App triggered on the blob, but I am not sure whether I can do the complete process in a Function App the way I do it locally.
Is a Function App the right choice for this purpose, or would you suggest a WebJob instead?

If you prefer to use Azure WebJobs, this should be as simple as uploading a .zip file of your project. You can follow this article to create the WebJob, and the WebJob should work the same way as what you run locally.
If you prefer to use Azure Functions, you can use a blob trigger to do the same thing, though you may need to make some changes to your code as needed. A minimal sketch is shown below.
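As a rough sketch of that blob-triggered approach, assuming an in-process C# function, a hypothetical incoming-json container, a simple flat JSON shape, and a hypothetical dbo.MyTable target table (the storage and SQL connection strings are read from app settings):

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text.Json;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class JsonToSqlFunction
{
    // Fires whenever a new blob lands in the (hypothetical) "incoming-json" container.
    [FunctionName("JsonToSql")]
    public static void Run(
        [BlobTrigger("incoming-json/{name}", Connection = "StorageConnection")] string content,
        string name,
        ILogger log)
    {
        log.LogInformation("Processing blob {Name}", name);

        // Assumes each file is a JSON array of flat objects, e.g. [{"Id":1,"Value":"abc"}, ...]
        var rows = JsonSerializer.Deserialize<List<MyRow>>(content);

        var connectionString = Environment.GetEnvironmentVariable("SqlConnection");
        using var conn = new SqlConnection(connectionString);
        conn.Open();

        foreach (var row in rows)
        {
            using var cmd = new SqlCommand(
                "INSERT INTO dbo.MyTable (Id, Value) VALUES (@id, @value)", conn);
            cmd.Parameters.AddWithValue("@id", row.Id);
            cmd.Parameters.AddWithValue("@value", row.Value);
            cmd.ExecuteNonQuery();
        }
    }

    public class MyRow
    {
        public int Id { get; set; }
        public string Value { get; set; }
    }
}
```

The parsing and insert logic is whatever your local exe does today; the main change is that the trigger and bindings replace the polling loop.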

Related

Calling API from Azure SQL Database (as opposed to SQL Server)

So I have an Azure SQL Database instance that I need to run a nightly data import on. I was going to schedule a stored procedure to make a basic GET request against an API endpoint, but it seems the OLE Automation objects aren't available in Azure SQL Database. Is there any other way to make an API call from Azure SQL Database, or do I need to put something in place outside of the database to accomplish this?
There are several options. I do not know whether a PowerShell job, as stated in the first comment on your question, can execute HTTP requests, but I do know of at least a couple of options that can:
Azure Data Factory allows you to create scheduled pipelines to copy/transform data from a variety of sources (like HTTP endpoints) to a variety of destinations (like Azure SQL databases). This involves little or no scripting.
Azure Logic Apps allows you to do the same:
With Azure Logic Apps, you can integrate (cloud) data into (on-premises) data storage. For instance, a logic app can store HTTP request data in a SQL Server database.
Logic Apps can be triggered on a schedule as well and involve little or no scripting.
You could also write an Azure Function that is executed on a schedule, calls the HTTP endpoint, and writes the result to the database. Multiple languages are supported for writing functions, C# and PowerShell for example (a minimal sketch follows this answer).
All of these options also let you force a run outside the schedule.
In my opinion, Azure Data Factory (no coding) or an Azure Function (code only) are the best options given the need to parse a lot of JSON data. But do keep in mind that Azure Functions on a Consumption plan are limited to a maximum execution time of 10 minutes per invocation.
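For the Azure Function route, a minimal sketch of a scheduled (timer-triggered) C# function might look like the following; the endpoint URL, table name, and JSON shape are placeholders, and the SQL connection string is assumed to live in the app settings:

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Timers;
using Microsoft.Extensions.Logging;

public static class NightlyImportFunction
{
    private static readonly HttpClient Http = new HttpClient();

    // Runs every night at 02:00 (CRON fields: second minute hour day month day-of-week).
    [FunctionName("NightlyImport")]
    public static async Task Run(
        [TimerTrigger("0 0 2 * * *")] TimerInfo timer,
        ILogger log)
    {
        // Hypothetical endpoint returning a JSON array of items.
        var json = await Http.GetStringAsync("https://api.example.com/items");
        var items = JsonSerializer.Deserialize<List<Item>>(json);

        var connectionString = Environment.GetEnvironmentVariable("SqlConnection");
        using var conn = new SqlConnection(connectionString);
        await conn.OpenAsync();

        foreach (var item in items)
        {
            using var cmd = new SqlCommand(
                "INSERT INTO dbo.ImportedItems (Id, Name) VALUES (@id, @name)", conn);
            cmd.Parameters.AddWithValue("@id", item.Id);
            cmd.Parameters.AddWithValue("@name", item.Name);
            await cmd.ExecuteNonQueryAsync();
        }

        log.LogInformation("Imported {Count} items", items.Count);
    }

    public class Item
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}
```

You can also run the function manually (for example from the portal) when you need an import outside the schedule, which covers the "force an execution" point above.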

Emergency Contacts

My project is to nightly upload employee next-of-kin details somewhere offsite in case of emergency. Security needs to be locked down to just three named users.
The data is already available in a VM SQL Server 2014 view.
My first bash was to create a SQL Agent job to extract to CSV via BCP, then (step 2) upload to an Azure file share via AzCopy.
I thought I'd nailed my first Azure project ... but sadly this uses a shared access signature (appended to the URL) and not Azure AD, so I don't think this will do? (not sure)
Any ideas please?
If you don't need the data structured at the other end of your export, then a flat file as you suggested would work well and keep the complexity down. Shared access signatures will work fine; you just have to renew the signature when it expires. You'd have the same issue with Azure AD, as the authentication token would also expire.
You could back up the full database to Azure Storage directly from SQL Server Management Studio, but this sounds like overkill for your requirements.
Another way to do it is with Azure Data Factory; however, this adds the cost of executing a data pipeline to move your data, plus complexity that a simple task doesn't need.
Personally, the simplest way would be to export the data to a CSV on the file system, then have a scheduled task run a PowerShell script to upload it to your blob container in Azure Storage.
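If the shared access signature really is a blocker, one alternative to the AzCopy step is a small upload program that authenticates with Azure AD. A minimal sketch, shown here in C# with the Azure.Storage.Blobs SDK and DefaultAzureCredential rather than PowerShell (the storage account, container, and file names are hypothetical):

```csharp
using System;
using Azure.Identity;
using Azure.Storage.Blobs;

class UploadNextOfKinExport
{
    static void Main()
    {
        // Authenticate with Azure AD (e.g. a service principal or managed identity)
        // instead of a shared access signature.
        var credential = new DefaultAzureCredential();

        // Hypothetical storage account and container names.
        var service = new BlobServiceClient(
            new Uri("https://mystorageaccount.blob.core.windows.net"), credential);
        var container = service.GetBlobContainerClient("next-of-kin-export");

        // Upload the nightly CSV produced by the BCP step, overwriting last night's copy.
        var blob = container.GetBlobClient("next-of-kin.csv");
        blob.Upload(@"C:\exports\next-of-kin.csv", overwrite: true);

        Console.WriteLine("Upload complete.");
    }
}
```

The identity running the upload needs a data-plane role such as Storage Blob Data Contributor on the container, and the same role assignments can be used to restrict read access to the three named users.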

How to deploy SSIS package to Azure?

Hi guys. I know it is a general question, so let me give you my scenario:
I have a client who sends me a bunch of Excel files, and I use my on-premises SSIS package to export them to a database located on Azure. My SSIS package calls stored procedures on the Azure SQL server to manipulate the data.
I want to move the whole process to the cloud, and I want to know the best way to achieve it. I was thinking maybe I can use blob storage, providing a cloud folder (container) on Azure and letting my client drop the files there. Then an app or service such as Data Factory can detect those files and run my SSIS package that is deployed on Azure "somehow".
Any ideas or sample code would be great.
Thanks!
You can try the manual approach below:
1. Copy all CSV files to ADLS (Azure Data Lake Storage). (For automation you may use a Copy activity, a ForEach loop, and a Lookup activity in Azure Data Factory.)
2. For any transformation of the data, use U-SQL jobs in Azure Data Lake Analytics (ADLA), whose output is also stored in ADLS. You can save the U-SQL scripts in blob storage (for ADF automation).
3. To transfer the data from the ADLS files to the Azure SQL database, use the Copy activity of Azure Data Factory, with CSV as the source format and SQL as the sink.

Migrate hundreds of databases to Azure elastic database pool

I'm aware of the various options in place for migrating a single database up to Azure. My problem is that these all seem to cater for only one database at a time, whereas I have a database-per-tenant model with over 2000 databases to migrate and not a lot of time to play with.
Can anyone point me towards the best (i.e. fastest) way of doing this?
In the end we accomplished this with PowerShell and the Azure API. Essentially batch creating bacpacs on the source server, uploading them to blob storage, then importing them into Azure SQL server pools.
If I was facing the same challenge now I'd take a look at the Azure Database Migration Service - https://azure.microsoft.com/en-gb/services/database-migration/
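The batching itself was done with PowerShell against the Azure APIs; purely as an illustration of the same idea, a rough C# sketch using the DacFx library (Microsoft.SqlServer.Dac) to export each tenant database to a bacpac and import it into the target Azure SQL logical server could look like this (the server names, credentials, and tenant list are placeholders, and moving each database into the elastic pool is left as a follow-up step):

```csharp
using System;
using System.IO;
using Microsoft.SqlServer.Dac;

class BatchMigrateTenants
{
    static void Main()
    {
        // Hypothetical source and target connection strings.
        var source = new DacServices("Server=onprem-sql;Integrated Security=true;");
        var target = new DacServices(
            "Server=myserver.database.windows.net;User ID=admin;Password=<secret>;");

        // Hypothetical tenant list; in practice read the ~2000 names from the source server.
        string[] tenantDatabases = { "Tenant001", "Tenant002" };

        foreach (var db in tenantDatabases)
        {
            var bacpacPath = Path.Combine(@"C:\bacpacs", db + ".bacpac");

            // Export the tenant database to a bacpac on the source side.
            source.ExportBacpac(bacpacPath, db);

            // Import the bacpac as a new database on the Azure SQL logical server.
            var package = BacPackage.Load(bacpacPath);
            target.ImportBacpac(package, db);

            // Moving the new database into the elastic pool can then be done with
            // ALTER DATABASE or the Azure PowerShell cmdlets.
            Console.WriteLine($"Migrated {db}");
        }
    }
}
```

In practice you would parallelise the exports and imports and stage the bacpacs in blob storage as described above, since running 2000 databases serially would take far too long.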
I am also facing this problem and am going down the route of using the Visual Studio data compare tool.
All my tenant databases have the same schema, so I made an empty template database in Azure and just use the CREATE DATABASE ... AS COPY OF command to make a new one each time, ready to receive the migration.
Then I ask Visual Studio to compare the empty database with the live database and automatically insert the data for me.
It seems to be working well so far; there are very few manual steps needed, and it doesn't involve using the Azure portal, blob storage, or creating databases outside of the elastic pool, which is great. But the overall time to migrate the data for all the databases will be slow.

Move data from one database to another in Azure

I'm in the process of migrating from dedicated servers to Azure. In my existing SQL Server, I have a few jobs that move data from the live database to archives.
From what I have read so far, in Azure you cannot use cross-database scripts. The other options I have seen include Azure SQL Data Sync, Azure Data Factory, and maybe SSIS. I should note that there is some logic to what data is archived, and I need the ability to specify this in the query.
Has anyone some experience and what would you recommend?
Thanx
You can now use the Copy activity inside Data Factory to do this directly in Azure.
Azure Data Factory
