Azure SQL monitoring high DTU - sql-server

We're getting DTU spikes from time to time but can't find what causes them.
How can I monitor Azure SQL DTU usage?
How can I find which queries are driving high DTU, live?

The following query returns the 100 most recent resource-usage records. As of now, a record is created every 15 seconds.
SELECT TOP 100 *
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC
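sys.dm_db_resource_stats tells you how much of each resource the database is consuming, but not which queries are responsible. For the "live" part of the question, a minimal sketch along these lines (assuming you can connect while a spike is happening) lists the currently executing requests and the batches behind them:

-- Sketch: currently executing requests, ordered by CPU, with their batch text.
SELECT
    r.session_id,
    r.status,
    r.cpu_time AS cpu_time_ms,
    r.logical_reads,
    r.total_elapsed_time AS elapsed_ms,
    t.text AS batch_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID
ORDER BY r.cpu_time DESC;

For history rather than the current moment, Query Store (discussed further down) is the better tool.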

Please check the links below, which discuss Azure SQL Database Throughput Units (DTUs).
http://azure.microsoft.com/blog/2014/09/11/azure-sql-database-introduces-new-near-real-time-performance-metrics/
Azure SQL Database "DTU percentage" metric
http://msdn.microsoft.com/en-us/library/azure/dn741336.aspx
Regards,
Mekh.

Related

On premise vs Azure SQL server performance

I have a query that creates 400 transactions, each transaction running an update on a table. Running this query on the on-premises development server takes around 100 ms, versus over 500 ms on Azure. I ran some statistics and RESERVED_MEMORY_ALLOCATION_EXT came out higher than on dev, with a wait count of 3523254 and 0.9 seconds of wait time. Any suggestions on how to troubleshoot further?
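One way to dig further is to compare wait statistics on both environments. A rough sketch: on the Azure SQL database, sys.dm_db_wait_stats holds the database-scoped waits (on the on-premises server you would query sys.dm_os_wait_stats instead); the numbers are cumulative, so capture them before and after the 400-transaction run and compare.

-- Sketch: top waits for the database, to compare against the on-premises box.
SELECT TOP 10
    wait_type,
    waiting_tasks_count,
    wait_time_ms,
    wait_time_ms - signal_wait_time_ms AS resource_wait_ms
FROM sys.dm_db_wait_stats
ORDER BY wait_time_ms DESC;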

Azure SQL database point in time restore taking over 24 hrs

I have been trying to do a PITR of a 2 GB S0 Azure SQL database. It has been running for over 24 hours; the restore progress has been sitting at 50% complete for 18 hours without any errors. Should I upgrade the server DTUs and size, or the actual service tier?
According to this post, on SQL Database the "horsepower" is measured in Database Throughput Units, or just "DTUs". This unit is an integer and ranges from 5 to 1750. Every database edition offers one or more "Service Objectives", which are directly related to the number of DTUs and the price to be paid.
In the following image, you can find the list of "Service Objectives" (S0, P3, Basic, P11, S3, etc.) per SQL Database edition and their respective prices. Notice that Microsoft is always updating its offer, so those prices and Service Objectives per edition may be outdated by the time you read this post:
A more conservative, responsible and dignified way to choose the number of DTUs is based on real data about your database activity: the DTU Calculator (http://dtucalculator.azurewebsites.net/), an online service that advises on the most appropriate Service Objective for a database. You just need to download a PowerShell script, available on the DTU Calculator website, and run it on the server where your database is located. As soon as you run the script, the following data is measured and recorded in a CSV file (a quick T-SQL cross-check of one of these counters is sketched after the list):
Processor – % Processor Time
Logical Disk – Disk Reads/sec
Logical Disk – Disk Writes/sec
Database – Log Bytes Flushed/sec
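As a side note, the log-flush counter in that list can also be spot-checked from T-SQL on the source server via sys.dm_os_performance_counters (the processor and disk counters are Windows counters, so they are not visible here, and the object_name prefix depends on the instance name):

-- Sketch: spot-check Log Bytes Flushed/sec on the source SQL Server.
-- cntr_value is cumulative for per-second counters, so sample twice and
-- divide the difference by the elapsed seconds to get an actual rate.
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Log Bytes Flushed/sec'
  AND object_name LIKE '%:Databases%';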
Once the collection is done, you just need to upload the file generated by the script and interpret the results. Here is a sample of one of the charts generated by the DTU Calculator, indicating that 89.83% of the database load would run well with the Service Objective S3, of the "Standard" SQL Database edition.
Here is a decision tree that will help you to reach the optimal point for your database.
So I think you can increase the DTU appropriately to speed up the process. :)
If you are on an S0, you are using Azure SQL Database, not a Managed Instance.
2 GB is quite small; the point-in-time restore should have completed in an hour or so.
Contact Microsoft Support.
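Before opening a ticket, you can at least confirm whether the restore is progressing at all. A minimal sketch, assuming you connect to the master database of the logical server (sys.dm_operation_status is only visible there, and the exact operation names vary):

-- Sketch: run in the master database of the logical server, not in a user database.
SELECT major_resource_id AS database_name,
       operation,
       state_desc,
       percent_complete,
       start_time,
       last_modify_time,
       error_desc
FROM sys.dm_operation_status
WHERE operation LIKE '%RESTORE%'   -- filter loosely; operation names vary
ORDER BY start_time DESC;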

Azure SQL query performance changed

I have been running a query on an Azure SQL database and the same query is now taking 10 times longer: from 4 minutes up to 40. The datasets are the same, and even after ramping up the DTUs the query time barely changes.
Is there anything I can do to reset the database to how it was performing, or any tips to help out?
Thanks
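Since the data hasn't changed, one common cause is a plan regression after a recompile or statistics update. A sketch using Query Store (on by default in Azure SQL Database) to compare the plans a query has used; the query_id and plan_id values below are hypothetical, taken from Query Performance Insight or sys.query_store_query:

-- Sketch: see which plans a given query has used and how they performed.
-- avg_duration is stored in microseconds.
SELECT p.query_id,
       p.plan_id,
       AVG(rs.avg_duration) / 1000.0 AS avg_duration_ms,
       SUM(rs.count_executions)      AS executions
FROM sys.query_store_plan AS p
JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
WHERE p.query_id = 42              -- hypothetical query_id
GROUP BY p.query_id, p.plan_id
ORDER BY avg_duration_ms;

-- If an earlier plan was clearly faster, force it (hypothetical ids):
-- EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 7;

Forcing the last known good plan usually restores the previous behaviour until the underlying cause is found.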

SQL Azure DTUs at 100% troubleshoot

I am performing load testing of my app using Visual Studio load tests.
My tests get data from about 40 tables in a SQL Azure database, with a separate API request for each table.
The load testing simulates around 200 users over 10 minutes and about 80-100 requests per second.
During the tests, I have seen my SQL Azure server's DTUs hit 100%, which is clearly not a good sign for production performance.
This stays the same even if I scale up my database with more DTUs.
How do I troubleshoot or address the issue? Is it a specific query I need to look at, or just a scaling issue to address?
Take a look at the Query Performance Insight blade on your Azure SQL Database. This will show you the top queries that are consuming resources and are good candidates for optimization.
Also check out the Performance Recommendations blade on your Azure SQL Database. This will show indexing recommendations after the Analyzer has enough history to see usage and determine if adding (or removing) indexes will deliver benefits.
If you're using an Azure SQL DB with a Web App, plug in Application Insights. This will give you a lot of visibility into what queries are long-running, and what queries are being executed a bunch of times that could benefit from minor optimization.
Take a look at the Query Store: it surfaces the top queries with high resource cost and long run times. See the Query Store documentation to learn more.
Run the following query to see which resources are consumed the most, then identify the top queries using those resources.
SELECT
AVG(avg_cpu_percent) AS 'Average CPU Utilization In Percent',
MAX(avg_cpu_percent) AS 'Maximum CPU Utilization In Percent',
AVG(avg_data_io_percent) AS 'Average Data IO In Percent',
MAX(avg_data_io_percent) AS 'Maximum Data IO In Percent',
AVG(avg_log_write_percent) AS 'Average Log Write Utilization In Percent',
MAX(avg_log_write_percent) AS 'Maximum Log Write Utilization In Percent',
AVG(avg_memory_usage_percent) AS 'Average Memory Usage In Percent',
MAX(avg_memory_usage_percent) AS 'Maximum Memory Usage In Percent'
FROM sys.dm_db_resource_stats;
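Once you know which dimension dominates, a sketch like the following (Query Store is enabled by default on Azure SQL Database) lists the statements behind it; order by a different aggregate, such as avg_logical_io_reads or avg_log_bytes_used, if data I/O or log write was the bottleneck:

-- Sketch: top 10 statements by total CPU over the period kept in Query Store.
-- avg_cpu_time is stored in microseconds.
SELECT TOP 10
    q.query_id,
    SUBSTRING(qt.query_sql_text, 1, 200)       AS query_text,
    SUM(rs.count_executions)                   AS executions,
    SUM(rs.count_executions * rs.avg_cpu_time) AS total_cpu_time_us
FROM sys.query_store_query AS q
JOIN sys.query_store_query_text AS qt ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
GROUP BY q.query_id, SUBSTRING(qt.query_sql_text, 1, 200)
ORDER BY total_cpu_time_us DESC;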
Hope this helps.

Are all available DTU used to exec a query?

I have a fairly complex query.
When I had 10 DTUs for my database, the query took about 17 seconds to execute.
I increased the level to 50 DTUs and now execution takes 3-4 seconds.
This ratio matches the documentation: more DTUs = faster execution.
But!
1. On my PC I can execute the query in 1 second.
2. In the portal statistics I see that I use only 12 DTUs (max DTU percentage = 25%).
In sys.dm_db_resource_stats I see that MAX(avg_cpu_percent) is about 25% and the other parameters are lower.
So the question is: why does my query take 3-4 seconds to execute?
It can be executed in 1 second, and the server does not use all of my DTUs.
How can I make the server use all available resources to execute queries faster?
DTU is a combined measurement of CPU, memory, data I/O and transaction log I/O.
This means that reaching a DTU bottleneck can mean any of those.
This question may help you to measure the different aspects: Azure SQL Database "DTU percentage" metric
And here's more info on DTU: https://learn.microsoft.com/en-us/azure/sql-database/sql-database-what-is-a-dtu
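The DTU percentage shown in the portal is commonly described as the highest of the CPU, data I/O and log write percentages in each interval. A sketch to reproduce that view per 15-second window from sys.dm_db_resource_stats (treat it as an approximation, since the exact formula is not published):

-- Sketch: per-interval "DTU %" approximated as the highest of the three dimensions.
SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       (SELECT MAX(v)
        FROM (VALUES (avg_cpu_percent),
                     (avg_data_io_percent),
                     (avg_log_write_percent)) AS d(v)) AS approx_dtu_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;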
On my PC I can execute the query in 1 sec
We should not compare on-premises computing power with DTUs.
A DTU is a blend of CPU, IO and memory that you get based on your performance tier, so the comparison is not valid.
How to make server use all available resources to exec queries faster?
This is simply not possible, since when SQL runs a query, memory is the only constraint that can prevent the query from even starting. The rest of the resources, like CPU and IO speed, can increase or decrease based on what the query does.
In summary, you will have to ensure queries are not constrained by a resource crunch; they can use up all the resources they need and release them when not needed.
You will also have to look at wait types and further fine-tune the query.
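To see what a specific run is actually waiting on, sys.dm_exec_session_wait_stats gives per-session waits in Azure SQL Database. A sketch, assuming you run the slow query in one session and inspect it from another (the session_id below is hypothetical):

-- Sketch: wait types accumulated by one session (session_id 53 is hypothetical).
SELECT wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms
FROM sys.dm_exec_session_wait_stats
WHERE session_id = 53
ORDER BY wait_time_ms DESC;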
As Bernard Vander Beken mentioned:
DTU is a combined measurement of CPU, memory, data I/O and transaction log I/O.
I'll also add that Microsoft does not share the formula used to calculate DTUs. You mentioned that you are not seeing DTUs peg at 100% during query execution. But since we do not know the formula, you may very well be pegging components of DTU, but not pegging DTU itself.
Azure SQL is a shared environment, and each tenant is throttled to ensure that the minimum SLA is met for all tenants.
What a DTU actually is remains quite fuzzy.
We have done an experiment where we ran a set of benchmarks on machines with the same number of DTUs in different data centers.
http://dbwatch.com/azure-database-performance-measured
It turns out that the actual performance varies by a factor of 5.
We have also seen instances where the performance of a repeated query on the same database varies drastically.
We provide our database performance benchmarks for free if you would like to compare the instance running on your PC with the instance in the Azure cloud.