Question: Data stored/generated in an on-premises SQL Server needs to be logged to Application Insights. This data includes both transient data that is passed to stored procedures and concrete data that is stored in tables. I am asking this question to find out whether there is any direct way to do this. I have come up with the following options so far; if there is no direct solution, please let me know which option is best suited for the task.
Options:
Log that data into a temporary table and have a Windows service / Azure WebJob pick up that data in batches and do the logging (a sketch of this approach appears after this list).
Use a SQL CLR stored procedure that logs the data directly to Application Insights using the Application Insights SDK DLL.
Use Azure Data Factory to export the data from the on-premises SQL Server to some Azure-based storage, and from there to Application Insights.
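If you go with option 1, a minimal sketch of the staging side could look like this (the table, columns, and procedure name are illustrative assumptions, not a prescribed design):

    -- Staging table that stored procedures write their log entries into.
    CREATE TABLE dbo.AppInsightsLog (
        LogId      BIGINT IDENTITY(1,1) PRIMARY KEY,
        EventName  NVARCHAR(128) NOT NULL,
        Payload    NVARCHAR(MAX) NULL,            -- serialized event properties
        CreatedUtc DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
    );
    GO

    -- Batch pickup for the Windows service / WebJob: claims and removes
    -- up to 500 rows per poll, returning them to the caller for forwarding.
    CREATE PROCEDURE dbo.DequeueLogBatch
    AS
    BEGIN
        DELETE TOP (500)
        FROM dbo.AppInsightsLog
        OUTPUT deleted.LogId, deleted.EventName, deleted.Payload, deleted.CreatedUtc;
    END

The service or WebJob would call the procedure on a timer and forward each returned row to Application Insights via the SDK.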
I am new to Azure and have no prior experience or knowledge of working with Azure data warehouse systems (now Azure Synapse Analytics).
I have access to a "read only" data warehouse (not in Azure) that looks like this:
I want to replicate this data warehouse as-is in the Azure cloud. Can anyone point me in the right direction (video tutorials or documentation) and outline the steps involved in this process? There are around 40 databases in this warehouse. And what if I wanted to replicate only specific ones?
You can't do that with only read-only permission. Whichever data warehouse you use, you need server admin or database owner permission to replicate a database.
You can confirm this in any of the documents about database backup/migration/replication, for example: https://learn.microsoft.com/en-us/sql/t-sql/statements/backup-transact-sql?view=sql-server-ver15#permissions
If you have sufficient permissions, then you can do that. But for Azure SQL Data Warehouse, which we now call dedicated SQL pool (formerly SQL DW), you can't replicate an on-premises data warehouse to Azure directly.
The official documentation provides a way to import the data into a dedicated SQL pool (formerly SQL DW):
Once your dedicated SQL pool is created, you can import big data with simple PolyBase T-SQL queries, and then use the power of the distributed query engine to run high-performance analytics.
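For illustration, a minimal PolyBase load could look like the sketch below; the storage account, credential secret, file layout, and table schema are all assumptions:

    -- Minimal PolyBase load sketch for a dedicated SQL pool.
    -- Storage account, key, file layout, and schema are illustrative assumptions.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>';

    CREATE DATABASE SCOPED CREDENTIAL BlobCredential
    WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

    CREATE EXTERNAL DATA SOURCE SourceBlob
    WITH (TYPE = HADOOP,
          LOCATION = 'wasbs://exports@mystorageaccount.blob.core.windows.net',
          CREDENTIAL = BlobCredential);

    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH (FORMAT_TYPE = DELIMITEDTEXT,
          FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2));

    -- External table over the exported files.
    CREATE EXTERNAL TABLE dbo.SalesExternal (
        SaleId INT,
        Amount DECIMAL(10, 2)
    )
    WITH (LOCATION = '/sales/', DATA_SOURCE = SourceBlob, FILE_FORMAT = CsvFormat);

    -- Land the data in a distributed internal table via CTAS.
    CREATE TABLE dbo.Sales
    WITH (DISTRIBUTION = HASH(SaleId))
    AS SELECT * FROM dbo.SalesExternal;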
You could also use another ETL tool to migrate the data from the on-premises data warehouse to Azure. For example, using Data Factory, combine these two tutorials:
Copy data to and from SQL Server by using Azure Data Factory
Copy and transform data in Azure Synapse Analytics by using Azure Data Factory
What's the backend database query of this Microsoft Dataverse Analytics dashboard?
I'm trying to work around Dataverse analytics by accessing the transactional database behind that dashboard. I'm interested in getting the Daily Active Users (DAU) shown above, but via a SQL query reading directly from the backend database.
It appears that the DB is this https://learn.microsoft.com/en-us/dynamics365/customer-engagement/web-api/entitytypes?view=dynamics-ce-odata-9 but I have not been able to comprehend the data model and I'm unable to find the tables to get DAU. Any thoughts?
Thanks
Basically, you have to do everything that MS is doing behind the scenes. CRM Online is a SaaS model, and we don't have direct access to the Azure SQL server. But what you can do is one of these options:
Use the “Data Export Service” to replicate the data to your own Azure SQL server, then build Power BI on your own from that data
Use the REST Web API to pull the data and visualize it (this may not be very flexible)
Depending on your need and urgency, you can wait for, or use the preview version of, the TDS endpoint for read-only direct SQL access (a hypothetical DAU query against it is sketched below). Read more
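Once the TDS endpoint is enabled, a DAU-style query could look like the sketch below. Whether the audit table is exposed over the endpoint, and which columns it carries, are assumptions; verify against your own environment:

    -- Hypothetical DAU query over the Dataverse audit log via the TDS endpoint.
    -- Assumes auditing is enabled and the audit table is readable; column
    -- names may differ per environment.
    SELECT CAST(a.createdon AS date) AS activity_date,
           COUNT(DISTINCT a.userid) AS daily_active_users
    FROM audit AS a
    GROUP BY CAST(a.createdon AS date)
    ORDER BY activity_date;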
I'm relatively new to Azure and am having trouble finding what options are out there for connecting to an existing SQL database to push data into it.
The situation is that we have an external client who needs to connect to our Azure SQL database to push data into it on an ongoing basis. We can't give them permission to get into our database, so we're looking at what we can do to allow data in. At this point the best option seems to be to create a web service deployed in Azure that will validate the data and then push it into our database.
The question I have is, are there other options to do this in an easier way? Are there Azure services or processes that can be set up to automatically process a file and pull the data into a database? Any other go-between options when each side has their own database and for security reasons can't just open up access to it?
Azure Data Factory works great for basic ETL. If neither party can grant direct access, you can use an intermediate repository like Blob Storage to drop csv/xml/json files for ingestion (a sketch of that ingestion appears below). If they'll grant you access to pull, you can set up a linked service that functions more or less like a linked server in MSSQL. As of the last release, ADF also supports Azure-hosted SSIS packages.
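If you go the Blob Storage route, ingestion on the Azure SQL side can be as simple as a BULK INSERT from an external data source. A minimal sketch, where the container, SAS token, file name, and target table are all assumptions:

    -- Ingestion sketch: BULK INSERT straight from Blob Storage into Azure SQL.
    -- Container, SAS token, file name, and target table are assumptions.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>';

    CREATE DATABASE SCOPED CREDENTIAL IngestCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET = '<sas-token-without-leading-question-mark>';

    CREATE EXTERNAL DATA SOURCE IngestBlob
    WITH (TYPE = BLOB_STORAGE,
          LOCATION = 'https://mystorageaccount.blob.core.windows.net/ingest',
          CREDENTIAL = IngestCredential);

    -- Small batches keep transaction log pressure down.
    BULK INSERT dbo.IncomingData
    FROM 'data.csv'
    WITH (DATA_SOURCE = 'IngestBlob',
          FORMAT = 'CSV',
          FIRSTROW = 2,
          BATCHSIZE = 5000);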
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you could schedule the SSIS job to execute on a schedule. SSIS will do bulk inserts in small batches, so you shouldn't have transaction log issues and it should be efficient (because of the bulk inserting). Before you do this insert, though, you will probably want to consider your performance tier so you don't get major throttling by Azure and possible timeouts.
I'm new to Azure development, and I'm having trouble finding examples of what I want to do.
I have an XML file in Azure file storage and I want to use a Logic App to get that XML data into a SQL database.
I guess I will need to create a "SQL Database" in Azure, before the Logic App can be written (correct?).
Assuming that I have some destination SQL database, are there Logic App connectors/triggers/whatever that I can use to: 1) recognize that a file has been uploaded to Azure, and 2) process that XML to go into a database?
If so, can such connectors/triggers/whatevers be configured/written so that any business rules I have, for massaging the data between the XML and the database, can be specified?
Thanks!
Yes, you are right: you need to create the database first and then write a Logic App to perform the necessary functionality.
There are lots of connectors with triggers, such as the Blob Storage connector, the SQL connector, etc.
You can perform your processing with the help of "Enterprise Connectors", or you can do custom processing using Azure Functions, which integrate with Logic Apps.
In order to perform CRUD operations on an Azure SQL Database, you can use the SQL Connector. Documentation on the connector can be found here:
Logic App SQL Connector
Adding SQL Connector to a Logic App
I've also written a blog post myself on how to use the SQL Connector to perform bulk operations using a stored procedure and OPENJSON: Bulk insert into SQL
This might help you in designing your Logic App if you choose to use a stored procedure.
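As a minimal sketch of that approach (the table, columns, and JSON shape are illustrative assumptions), the procedure the Logic App calls could look like:

    -- Bulk-insert procedure callable from the Logic App's SQL Connector.
    -- Table, columns, and JSON property paths are illustrative assumptions.
    CREATE PROCEDURE dbo.BulkInsertOrders
        @json NVARCHAR(MAX)
    AS
    BEGIN
        -- Business rules for massaging the data can be applied in this SELECT.
        INSERT INTO dbo.Orders (OrderId, CustomerName, Total)
        SELECT OrderId, CustomerName, Total
        FROM OPENJSON(@json)
        WITH (
            OrderId      INT            '$.orderId',
            CustomerName NVARCHAR(100)  '$.customerName',
            Total        DECIMAL(10, 2) '$.total'
        );
    END

The Logic App would shape the XML into a JSON array and pass it as the @json parameter in a single call, rather than inserting row by row.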
I'm in the process of migrating from dedicated servers to Azure. On my existing SQL Server, I have a few jobs that move data from the live database to archives.
From what I have read so far, in Azure you cannot use cross-database scripts. The other options I have seen include Azure SQL Data Sync, Azure Data Factory, and maybe SSIS. I should note that there's some logic that determines which data is archived, and I need the ability to specify this in the query.
Has anyone some experience and what would you recommend?
Thanx
You can now do this directly in Azure using the Copy activity inside Data Factory. The Copy activity accepts a custom source query, so your archiving logic can live there (see the sketch below).
Azure Data Factory
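As a minimal sketch (the table name and the 90-day retention cutoff are assumptions; substitute your own archiving rules), the Copy activity's source query could be:

    -- Illustrative source query for the Data Factory Copy activity.
    -- Table name and 90-day cutoff are assumptions; put your own rules here.
    SELECT *
    FROM dbo.Orders
    WHERE CreatedDate < DATEADD(DAY, -90, GETUTCDATE());

After a successful copy, a follow-up Stored Procedure activity (or a separate cleanup script) can delete the archived rows from the live database using the same predicate.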