TDengine 3.0.1.8: a downsampling query following the official documentation does not return the timestamp
select max(v)-min(v) as v from u14_1759 interval(1h) limit 24;
I want to add a time column to the query results
You can try the elapsed() function for time computation in TDengine:
select elapsed(ts) as v from u14_1759 interval(1h) limit 24;
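If the goal is simply to show each window's start time next to the aggregate, TDengine 3.0 also provides the _wstart pseudo-column in window queries. A minimal sketch, assuming the same table and column names as in the question:

```sql
-- _wstart returns the start timestamp of each 1-hour window
select _wstart as ts, max(v) - min(v) as v
from u14_1759
interval(1h)
limit 24;
```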
Related
I created a table "test_result" in a TDengine database. It stores IoT sensor data by time. I want to sort the data, but encountered an error:
taos> select * from test_result order by ts desc limit 5;
DB error: Result set too large to be sorted(0.292303s)
taos>
Is there any configuration to avoid this error?
You can increase maxNumOfOrderedRes to allow more results to be sorted.
The default value is 10000, and the maximum is 1000000.
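For example, as a sketch, assuming the parameter is set in the server configuration file taos.cfg (restart taosd after editing):

```
# taos.cfg — allow up to 1,000,000 rows in an ordered result set
maxNumOfOrderedRes 1000000
```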
I want to query Snowflake as part of a monitoring process: how much time has a user spent executing queries after a particular date? The purpose is to prevent users from running long queries.
Account usage history is what I want to look at. I'm very new to Snowflake.
Is there any way to query this from the metadata?
You can use the QUERY_HISTORY view for this requirement.
There are many columns in this view that you can use as appropriate for your requirement.
Example Query :
SELECT query_id,
       query_text,
       query_type,
       session_id,
       user_name,
       warehouse_name,
       start_time,
       end_time,
       total_elapsed_time,
       compilation_time,
       execution_time
FROM snowflake.account_usage.query_history
WHERE user_name = 'anyusername'
  AND CAST(start_time AS DATE) >= 'yourdate in yyyy-mm-dd format'
  AND total_elapsed_time > 600000 -- around 10 minutes in milliseconds, or any number you choose
  AND warehouse_name = 'your datawarehouse name'
ORDER BY execution_time DESC;
There is also a parameter called STATEMENT_TIMEOUT_IN_SECONDS to control long running queries. Set to the amount of time, in seconds, after which a running SQL statement (query, DDL, DML, etc.) is canceled by the system. Can be set for Account » User » Session; can also be set for individual warehouses. The default setting is 172800 (2 days).
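As a sketch of setting that parameter (the warehouse name here is a placeholder):

```sql
-- Cancel any statement on this warehouse that runs longer than 10 minutes
ALTER WAREHOUSE my_wh SET STATEMENT_TIMEOUT_IN_SECONDS = 600;

-- Or apply the limit account-wide
ALTER ACCOUNT SET STATEMENT_TIMEOUT_IN_SECONDS = 600;
```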
I am using the following query to retrieve query history from my Snowflake database.
SELECT *
FROM table(MY_DATABASE.information_schema.query_history(
    end_time_range_start => dateadd(HOUR, -4, current_timestamp()),
    end_time_range_end   => current_timestamp()
));
Oddly, if the warehouse (size: XS) I am using gets suspended after a period of inactivity, then the next time I attempt to retrieve query history, the history that existed prior to the warehouse's suspension is gone.
I could not find anything documented to explain this.
Has anyone run into this issue, or found related documentation that could explain it?
Thank you!
I can't explain the exact limitations of the INFORMATION_SCHEMA table function you are running (some of these functions return only about 10,000 rows, or, as you said, lose history once the warehouse suspends), but it is a limited view into the actual query history. You can use the SNOWFLAKE database for the full query history.
It's a massive table, so make sure you put filters on it. Here's an example query to access it:
USE DATABASE snowflake;
USE SCHEMA account_usage;
SELECT *
FROM query_history
WHERE start_time BETWEEN '2020-01-01 00:00' AND '2020-01-03 00:00'
AND DATABASE_NAME = 'DATABASE_NAME'
AND USER_NAME = 'USERNAME'
ORDER BY START_TIME DESC;
1: Your question says the history disappears "after a period of inactivity", but does not specify how long that period is.
"after a period of inactivity, the next time I attempt to retrieve query history- the history that was there prior to the warehouse's suspension is gone."
If it's beyond 7 days, then the data can be found in the ACCOUNT_USAGE views. Below is a link describing the differences between INFORMATION_SCHEMA and ACCOUNT_USAGE.
https://docs.snowflake.com/en/sql-reference/account-usage.html#differences-between-account-usage-and-information-schema
2: Your query does not specify USER_NAME or WAREHOUSE_NAME, so it could be that the queries run before the warehouse's suspension have moved beyond the 4-hour window in your predicate. Try increasing the time period and check whether the behaviour still occurs.
3: In general, it is not advisable to query INFORMATION_SCHEMA for query history unless your application requires data without any latency. If possible, use the ACCOUNT_USAGE views to get query history information.
Here is what I did.
1: Created an XS warehouse
2: Set auto_suspend to 5 minutes
3: Ran few queries
4: Ran your query (which does not specify user_name or warehouse_name), meaning it searches the history of all users:
SELECT *
FROM table(MY_DATABASE.information_schema.query_history(
    end_time_range_start => dateadd(HOUR, -4, current_timestamp()),
    end_time_range_end   => current_timestamp()
));
5: It returned a few hundred records.
6: Added a WHERE clause to check for the data of my user, which had run a few queries before the warehouse auto-suspended, and it returned a few records:
SELECT *
FROM table(MY_DATABASE.information_schema.query_history(
    end_time_range_start => dateadd(HOUR, -4, current_timestamp()),
    end_time_range_end   => current_timestamp()
))
WHERE USER_NAME = 'ADITYA';
7: Waited 10 minutes so that my warehouse was auto-suspended.
8: Repeated steps 5 and 6, and again records were returned as expected.
My InfluxDB query is:
select count(quantity) from test group by time(1d)
The output contains rows with a count of 0 for intervals that have no data. How do we drop the zeros?
From the InfluxDB docs:
fill() changes the value reported for time intervals that have no data.
To return no timestamp for intervals with no data, use fill(none):
select count(quantity) from test where time > now()-10d group by time(1d) fill(none)
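For reference, fill() also accepts other modes; a quick sketch of the common variants, assuming the same measurement as above:

```sql
-- Report 0 explicitly for empty intervals
select count(quantity) from test where time > now()-10d group by time(1d) fill(0)

-- Carry the previous interval's value forward into empty intervals
select count(quantity) from test where time > now()-10d group by time(1d) fill(previous)
```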
Is it possible to find out which query was executing or waiting in an Oracle DB when a JDBC timeout was issued for a session?
I checked the sessions and I can see P2TEXT = 'interrupt', P3TEXT = 'timeout', and wait_class = 'System I/O', but sql_id is blank and p2 is 0.
Thanks in advance.
Use the Active Session History (ASH) data to find out what was running and when.
ASH data is based on sampling. Don't expect to find the exact query at the exact millisecond. But if there is a performance problem it should stick out in a query like this:
select username, sample_time, sql_id
from gv$active_session_history
join dba_users
on gv$active_session_history.user_id = dba_users.user_id
where sample_time between timestamp '2017-05-20 09:00:00' and timestamp '2017-05-21 09:05:00'
order by sample_time desc;
That view generally only contains data for the past day, depending on how busy the system is. If you need to go back further in time you may be able to use DBA_HIST_ACTIVE_SESS_HISTORY instead.
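To see the statement text as well, the SQL_ID sampled by ASH can be joined to GV$SQL. A sketch, reusing the time window from the query above (ASH samples with no cursor will show a NULL sql_text):

```sql
-- Join ASH samples to the shared-pool cursor cache to recover the SQL text
select ash.sample_time, u.username, ash.sql_id, s.sql_text
from gv$active_session_history ash
join dba_users u
  on ash.user_id = u.user_id
left join gv$sql s
  on ash.sql_id = s.sql_id
 and ash.sql_child_number = s.child_number
where ash.sample_time between timestamp '2017-05-20 09:00:00'
                          and timestamp '2017-05-21 09:05:00'
order by ash.sample_time desc;
```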