What's the backend database query of this Microsoft Dataverse Analytics dashboard?
I'm trying to work around Dataverse Analytics by accessing the transactional database behind that dashboard. I'm interested in getting the Daily Active Users (DAU) shown above, but via a SQL query reading directly from the backend database.
It appears that the data model is documented here: https://learn.microsoft.com/en-us/dynamics365/customer-engagement/web-api/entitytypes?view=dynamics-ce-odata-9 but I have not been able to make sense of it, and I'm unable to find the tables needed to compute DAU. Any thoughts?
Thanks
Basically you have to do everything Microsoft is doing behind the scenes. CRM Online is a SaaS offering, so you don't get direct access to the underlying Azure SQL server. What you can do is one of these options:
Use "Data Export Service" to replicate the data to your own Azure SQL server, then build Power BI reports on your own from the data
You can use the REST Web API to pull the data and visualize it (this may not be very flexible)
Based on your need and urgency, you may wait for, or use, the preview version of the TDS endpoint for read-only direct SQL access (a minimal sketch follows below). Read more
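If you go the TDS endpoint route, something like the following could approximate DAU. This is a minimal sketch, assuming user-access auditing is enabled in your environment; the audit table and the action codes (64/65 for user access) are assumptions you should verify against your own metadata.

    -- Minimal sketch over the Dataverse TDS endpoint (read-only T-SQL).
    -- Assumes user-access auditing is enabled; action codes 64/65
    -- ("User Access via Web" / "via Web Services") are assumptions to verify.
    SELECT
        CAST(a.createdon AS date) AS activity_date,
        COUNT(DISTINCT a.userid)  AS daily_active_users
    FROM audit AS a
    WHERE a.action IN (64, 65)
      AND a.createdon >= DATEADD(day, -30, GETUTCDATE())
    GROUP BY CAST(a.createdon AS date)
    ORDER BY activity_date;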
I'm looking for a one-click system that doesn't require one to delete the Azure database, publish from the local server, and re-create the user info on the deployment.
What currently works:
Drop the existing Azure database.
MSDeploy the database to Azure.
Move the database to the app pool.
Configure the Azure database user/access.
I briefly looked into Azure sync, but that doesn't seem like something one can use "on request". Do correct me with an example if I'm wrong in this assumption.
The ideal solution would be a single button click in Azure Data Studio to push any and all changes from the (localdb) database to the live one.
Azure Data Studio doesn't provide any ready-made single-click data transfer feature from local to cloud or vice versa.
Azure Data Studio offers a modern editor experience with IntelliSense, code snippets, source control integration, and an integrated terminal. It's engineered with the data platform user in mind, with built-in charting of query result sets and customizable dashboards.
It doesn't provide a built-in data push feature. You either have to build the transfer yourself in a programming language, or use stored procedures for it (a minimal sketch follows).
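One stored-procedure-style route is a linked server defined on the local instance that points at the Azure SQL database, so a push becomes a single INSERT...SELECT. This is a minimal sketch, assuming the local server can reach the Azure endpoint on port 1433; all server, database, table, and login names here are hypothetical.

    -- Minimal sketch, run on the local SQL Server instance.
    -- Server, database, table, and login names are hypothetical.
    EXEC sp_addlinkedserver
        @server     = N'AzureTarget',
        @srvproduct = N'',
        @provider   = N'MSOLEDBSQL',
        @datasrc    = N'myserver.database.windows.net,1433',
        @catalog    = N'LiveDb';

    EXEC sp_addlinkedsrvlogin
        @rmtsrvname  = N'AzureTarget',
        @useself     = 'FALSE',
        @rmtuser     = N'deploy_user',
        @rmtpassword = N'<password>';

    -- Push local rows to the live database in one statement.
    INSERT INTO AzureTarget.LiveDb.dbo.Users (UserId, UserName)
    SELECT UserId, UserName
    FROM dbo.Users;

You could wrap the INSERT...SELECT statements in a stored procedure and keep it as a saved script in Azure Data Studio, which is about as close to "one click" as this gets.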
I am new to Azure and have no prior experience or knowledge of working with Azure data warehouse systems (now Azure Synapse Analytics).
I have access to a "read only" data warehouse (not in Azure) that looks like this:
I want to replicate this data warehouse as-is on the Azure cloud. Can anyone point me in the right direction (video tutorials or documentation) and outline the steps involved in this process? There are around 40 databases in this warehouse. And what if I wanted to replicate only specific ones?
You can't do that with only read-only permission. No matter which data warehouse it is, you need server admin or database owner permission to replicate a database.
You can confirm this in any of the documents about database backup/migration/replication, for example: https://learn.microsoft.com/en-us/sql/t-sql/statements/backup-transact-sql?view=sql-server-ver15#permissions
If you have enough permission, then you can do that. But for Azure SQL Data Warehouse, now called dedicated SQL pool (formerly SQL DW), you can't replicate an on-premises data warehouse to Azure directly.
The official documentation provides a way to import the data into a dedicated SQL pool (formerly SQL DW):
Once your dedicated SQL pool is created, you can import big data with simple PolyBase T-SQL queries, and then use the power of the distributed query engine to run high-performance analytics.
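The PolyBase path usually means exporting the source tables to delimited files in Azure Storage first, then defining external objects over them. This is a minimal sketch, assuming the files already sit in a blob container; the storage account, schema, and table definitions are hypothetical.

    -- Minimal PolyBase sketch for a dedicated SQL pool.
    -- Assumes an [ext] schema exists (CREATE SCHEMA ext) and that the
    -- container is public or a database scoped credential is attached
    -- via CREDENTIAL = ... on the data source. Names are hypothetical.
    CREATE EXTERNAL DATA SOURCE WarehouseExport
    WITH (
        TYPE = HADOOP,
        LOCATION = 'wasbs://export@mystorageaccount.blob.core.windows.net'
    );

    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
    );

    -- External table over the exported files.
    CREATE EXTERNAL TABLE ext.Sales (
        SaleId   INT,
        SaleDate DATE,
        Amount   DECIMAL(18, 2)
    )
    WITH (LOCATION = '/sales/', DATA_SOURCE = WarehouseExport, FILE_FORMAT = CsvFormat);

    -- Land the data in a distributed internal table with CTAS.
    CREATE TABLE dbo.Sales
    WITH (DISTRIBUTION = HASH(SaleId))
    AS SELECT * FROM ext.Sales;

Repeating this per table (or scripting it) is also how you would bring over only the specific databases you care about.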
You could also use another ETL tool to migrate the data from the on-premises data warehouse to Azure, for example Azure Data Factory; combine these two tutorials:
Copy data to and from SQL Server by using Azure Data Factory
Copy and transform data in Azure Synapse Analytics by using Azure Data Factory
How to update or delete data in Azure SQL DB using Azure Stream Analytics
Currently, Azure Stream Analytics (ASA) only supports inserting (appending) rows to SQL outputs (Azure SQL Database and Azure Synapse Analytics).
You should consider using workarounds to enable UPDATE, UPSERT, or MERGE on SQL databases, with Azure Functions as the intermediary layer.
You can find more information about such workarounds in this MS article.
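A common shape for that workaround: ASA writes its output to an Azure Function (or to a queue the Function drains), and the Function calls a stored procedure that performs the upsert. This is a minimal sketch of such a procedure, not the method from the article; all table and column names are hypothetical.

    -- Minimal sketch of an upsert procedure an Azure Function could call.
    -- Table and column names are hypothetical.
    CREATE PROCEDURE dbo.UpsertDeviceReading
        @DeviceId  INT,
        @Reading   FLOAT,
        @EventTime DATETIME2
    AS
    BEGIN
        SET NOCOUNT ON;

        MERGE dbo.DeviceReadings AS target
        USING (SELECT @DeviceId AS DeviceId) AS source
            ON target.DeviceId = source.DeviceId
        WHEN MATCHED THEN
            UPDATE SET Reading = @Reading, EventTime = @EventTime
        WHEN NOT MATCHED THEN
            INSERT (DeviceId, Reading, EventTime)
            VALUES (@DeviceId, @Reading, @EventTime);
    END;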
Firstly, we need to know what Azure Stream Analytics is.
An Azure Stream Analytics job consists of an input, a query, and an output. Stream Analytics ingests data from Azure Event Hubs, Azure IoT Hub, or Azure Blob Storage. The query, which is based on the SQL query language, can be used to easily filter, sort, aggregate, and join streaming data over a period of time. You can also extend this SQL language with JavaScript and C# user-defined functions (UDFs). You can easily adjust the event ordering options and duration of time windows when performing aggregation operations through simple language constructs and/or configurations.
Azure Stream Analytics now natively supports Azure SQL Database as a source of reference data input. Developers can author a query to extract the dataset from Azure SQL Database, and configure a refresh interval for scenarios that require slowly changing reference datasets.
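For example, an ASA query can join a stream to that SQL reference data and append the enriched rows to a SQL output; it just cannot update them. A hedged sketch, where the input/output aliases are hypothetical names you would define on the job:

    -- Hypothetical ASA query: enrich a stream with SQL reference data,
    -- then append the result to a SQL output. Alias names are assumptions.
    SELECT
        s.DeviceId,
        s.Reading,
        r.DeviceName
    INTO [sql-output]
    FROM [eventhub-input] s TIMESTAMP BY s.EventTime
    JOIN [sql-reference-input] r
        ON s.DeviceId = r.DeviceId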
That means that you cannot update or delete data in an Azure SQL DB using Azure Stream Analytics; you can only insert.
Azure Stream Analytics is not a database management tool.
Hope this helps.
I'm relatively new to Azure and am having trouble finding what options are out there for connecting to an existing SQL database to push data into it.
The situation is that we have an external client who needs to connect to our Azure SQL database to push data into it on an ongoing basis. We can't give them permission to get into our database, so we're looking at what we can do to allow data in. At this point the best option seems to be to create a web service deployed in Azure that will validate the data and then push it into our database.
The question I have is, are there other options to do this in an easier way? Are there Azure services or processes that can be set up to automatically process a file and pull the data into a database? Any other go-between options when each side has their own database and for security reasons can't just open up access to it?
Azure Data Factory works great for basic ETL. If neither party can grant direct access, you can use an intermediate repository like Blob Storage to drop csv/xml/json files for ingestion. If they'll grant you access to pull, you can set up a linked service that more or less functions the same as a linked server in MSSQL. As of the last release, ADF now supports Azure-hosted SSIS packages too.
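If the client drops CSVs into an agreed blob container, the Azure SQL database itself can also pull them into a staging table for validation, without ADF. A minimal sketch, assuming a database master key already exists; the storage account, container, SAS token, and table names are hypothetical.

    -- Minimal sketch, run inside the Azure SQL database.
    -- Assumes a database master key already exists; storage account,
    -- container, SAS token, and table names are hypothetical.
    CREATE DATABASE SCOPED CREDENTIAL BlobCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET   = '<sas-token-without-leading-?>';

    CREATE EXTERNAL DATA SOURCE ClientDropZone
    WITH (
        TYPE = BLOB_STORAGE,
        LOCATION = 'https://mystorageaccount.blob.core.windows.net/incoming',
        CREDENTIAL = BlobCredential
    );

    -- Ingest one dropped file into a staging table for validation.
    BULK INSERT dbo.StagingOrders
    FROM 'orders.csv'
    WITH (DATA_SOURCE = 'ClientDropZone', FORMAT = 'CSV', FIRSTROW = 2);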
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you could schedule the SSIS job to execute on a schedule. SSIS will do bulk inserts using small batches, so you shouldn't have transaction log issues and it should be efficient (because of the bulk inserting). Before you do this insert, though, you will probably want to consider your performance tier so you don't get major throttling by Azure and possible timeouts.
Question: Data stored/generated in an on-premises SQL Server needs to be logged to Application Insights. This data represents both transient data that is passed to stored procedures and concrete data that is stored in tables. I am asking this question to figure out whether there is a direct way to do this. I have come up with the following options so far; in case there is no direct solution, please let me know which option is best suited for the task.
Options:
Log that data into a temporary table and have a Windows service / Azure WebJob that picks up that data in batches and does the logging (see the staging-table sketch after this list).
Use a SQL CLR stored procedure that logs the data directly to App Insights using the Application Insights DLL.
Use Azure Data Factory to export the data from the on-premises SQL Server to some Azure-based storage and then on to App Insights.
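For option 1, the "temporary table" is typically a durable staging table that the procedures write to and the WebJob drains in batches. A minimal sketch, with hypothetical names:

    -- Minimal sketch for option 1: a staging table the procedures write to,
    -- drained in batches by a Windows service / WebJob. Names are hypothetical.
    CREATE TABLE dbo.TelemetryStaging (
        LogId     BIGINT        IDENTITY(1, 1) PRIMARY KEY,
        EventName NVARCHAR(128) NOT NULL,
        Payload   NVARCHAR(MAX) NULL,   -- e.g. JSON of the proc parameters
        LoggedAt  DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),
        Processed BIT           NOT NULL DEFAULT 0
    );

    -- The pickup job claims a batch and marks it processed in one statement,
    -- so concurrent pollers don't double-send rows to App Insights.
    UPDATE TOP (500) dbo.TelemetryStaging
    SET Processed = 1
    OUTPUT inserted.LogId, inserted.EventName, inserted.Payload, inserted.LoggedAt
    WHERE Processed = 0;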