How to increase the query execution timeout in IntelliJ IDEA? - database

I would like to ask how I can change/increase the time allowed for executing a DB statement in IntelliJ IDEA. I am working with PostgreSQL, and when I generate a large amount of data and try to insert it into a table, it takes longer than 20 seconds, while my IntelliJ timeout is set to just 20 seconds.
Here is an image of my problem.
When it hits zero, nothing happens.
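Independently of the IDE setting, it may be worth ruling out a server-side limit, since PostgreSQL can also cancel long statements on its own. A minimal check from the same console, assuming a plain PostgreSQL connection:

-- Rule out a server-side limit before blaming the IDE's timeout.
SHOW statement_timeout;      -- '0' means the server imposes no limit of its own
SET statement_timeout = 0;   -- disable a session-level limit, if one is set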

Related

OLE DB or ODBC error when directly importing a query

I have a query with around 160 million rows that takes me about 4 hours to import into Power BI.
Ever since my company moved their server to Azure, I have never been able to import this query successfully. It starts loading around 1 million rows, and after a minute or two this error always pops up.
I tried:
changing the command timeout to 200 minutes; it still errors out within a minute or two of loading, sometimes within 10 seconds
selecting only the top 1000 rows in my query, which completes without error, but when I switch back to the original query it always fails
Attaching the error message (screenshot). I have talked to the DE on my team and they don't seem to have a clue. Does anyone have any idea how to fix this?
We had the same problem and went for a delta lake in combination with option 1.
You have to ask yourself first why you are importing so much data. Always keep your model as small as possible; I can't imagine you are looking at every row. If you really need row-level detail, you could use a combination of DirectQuery for the details and an imported aggregate for your reporting. But load aggregates instead of everything.
Maybe you can load less history, for example only the last two years (see the sketch below).
You could look into loading incrementally, for example per partition.
You can try to increase the DTUs for your server.
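A hedged sketch of the "less history" idea pushed down to the source, assuming an Azure SQL source and a made-up FactSales table with an OrderDate column:

-- Import only the last two years at the source so the Power BI model stays small.
-- Table and column names are placeholders, not from the original question.
SELECT *
FROM   dbo.FactSales
WHERE  OrderDate >= DATEADD(YEAR, -2, CAST(GETDATE() AS date));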

Is it possible to execute DMV queries between certain time periods?

We have a DMV query that executes every 10 minutes and inserts usage statistics such as SESSION_CURRENT_DATABASE, SESSION_LAST_COMMAND_START_TIME, etc., and it has supposedly been running fine for the last 2 years.
Today we were notified by the data hyper-ingestion team that the last records shown were from 6/10, so we found out the job had been stuck for 14 days and had not recorded any new statistics since then. We immediately restarted the job and it has been executing successfully since this morning, but we have essentially lost the data for this 14-day period. Is there a way for us to execute this DMV query against $SYSTEM.DISCOVER_SESSIONS for the 6/10-6/24 window to recover these past 14 days of data?
Or is all hope lost?
DMV query:
SELECT [SESSION_ID]
,[SESSION_SPID]
,[SESSION_CONNECTION_ID]
,[SESSION_USER_NAME]
,[SESSION_CURRENT_DATABASE]
,[SESSION_USED_MEMORY]
,[SESSION_PROPERTIES]
,[SESSION_START_TIME]
,[SESSION_ELAPSED_TIME_MS]
,[SESSION_LAST_COMMAND_START_TIME]
,[SESSION_LAST_COMMAND_END_TIME]
,[SESSION_LAST_COMMAND_ELAPSED_TIME_MS]
,[SESSION_IDLE_TIME_MS]
,[SESSION_CPU_TIME_MS]
,[SESSION_LAST_COMMAND_CPU_TIME_MS]
,[SESSION_READS]
,[SESSION_WRITES]
,[SESSION_READ_KB]
,[SESSION_WRITE_KB]
,[SESSION_COMMAND_COUNT]
FROM $SYSTEM.DISCOVER_SESSIONS
I wouldn't say it's "gone" unless the instance has been restarted or the db has been detached. For example, the DMV for procedure usage should still have data in it, but you won't be able to specifically recreate what it looked like 10 days ago.
You can get a rough idea by looking back through the 2 years of data you already have and getting a sense of whether there are spikes or consistent usage. Then grab a snapshot of the DMV today and extrapolate it back 14 days to get a rough idea of what usage was like.

How to retrieve the progress of an Oracle SQL query?

I have a coding problem I can't solve.
The requirement is to show the progress of a SQL query. I am using Spring Boot and AngularJS, so the idea is to show the progress of the SQL in the UI; it does not need to be real time, but that would be preferred. Basically, the user clicks a button in the UI, which triggers an API call to retrieve data from the DB and then return the completed data to the UI.
We have a computation-intensive SQL query which takes a long time to finish. When we want to retrieve about 10 million rows, it takes roughly 15 minutes. We want to show the progress of the SQL so the user has an idea of how long it will take. So the idea is to check how many rows have been completed:
Say 1 million rows retrieved, then it should return 10%, and so on; the next 1 million, then 20%.
I have no idea how to approach this. I need some suggestions.
Thanks in advance.
Assuming this is one long SELECT call, you can run the following query to get the stats on a long-running statement:
select * from v$session_longops where time_remaining > 0 and sid = ???
The sid value should be replaced with the Oracle session ID of the session that is running the long query; you can determine that by looking in v$session. You will need to execute the above query in a separate thread (or whatever a concurrent unit of execution is in Spring).
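As a rough illustration of how a percentage could be derived from that view (standard v$session_longops columns; :sid is a bind placeholder for the session ID mentioned above):

-- Estimate completion for the long-running operation; SOFAR and TOTALWORK are
-- only populated for operations Oracle considers "long" (roughly > 6 seconds).
SELECT sid,
       opname,
       ROUND(sofar / NULLIF(totalwork, 0) * 100, 1) AS pct_done,
       time_remaining
FROM   v$session_longops
WHERE  totalwork > 0
  AND  sid = :sid;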
Thanks everyone. I tried to use OldProgrammer's solution but failed to do so: somehow I just can't retrieve the Hibernate session ID, and all the related posts I found were either too old or just manually create a new session, which I don't want.
I did find a workaround for this problem, though it is not a real solution.
For that complex SQL, I first do a total count and then split the work into 100 chunks. Each time a chunk finishes (pagination, as in the sketch below), I update the completion percentage. Meanwhile, the UI polls the progress of the SQL from a different API endpoint on a timer, currently every 2 seconds.
The outcome is like this: for a fairly small number of rows, the progress jumps from 2% to 6% on every API call; for a large amount, it only moves from 1% to 2% after a few progress checks.
This is not a clean solution, but it serves its purpose: it gives the user some idea of how long it will take to finish.
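A rough sketch of what one chunk's query could look like (Oracle 12c+ row-limiting syntax; table, column and bind names are placeholders, not from the original post):

-- One "page" of the expensive query; the service runs it 100 times with a moving
-- offset and bumps its progress counter after each page completes.
SELECT id, payload
FROM   big_table
ORDER  BY id
OFFSET :offset_rows ROWS
FETCH NEXT :page_size ROWS ONLY;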

Entity Framework just stops with a timeout during INSERT

I have a small C# application which uses Entity Framework 6 to parse text files into a database structure.
In general, the file content is parsed into 3 tables:
Table1 --(1-n)-- Table2 --(1-n)-- Table3
The application worked for months without any issues in the Dev, Stage and Production environments.
Last week it stopped working on Stage, and now I am trying to figure out why.
One file contains ~100 entries for Table1, ~2000 entries for Table2 and ~2000 entries for Table3.
.SaveChanges() is called after each file.
I get the following timeout exception:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. The statement has been terminated.
AutoDetectChangesEnabled is set to false.
Because there is a 4th table where I execute one update statement after each file, there was a transaction around the whole thing, so I removed the 4th table and the transaction handling, but the problem persists.
To test whether it is just a performance issue, I set Database.CommandTimeout = 120, but that did not help; it still runs into the timeout, just after 2 minutes instead.
(Before the issue, one file was stored in about 5 seconds, which is absolutely fine.)
If I look at the SQL Server using SQL Server Profiler, I can see the following after .SaveChanges() is called:
(SQL Server Profiler screenshot)
Only the first few INSERT statements for Table3 are shown (always the first 4-15 statements, all of them shortly after .SaveChanges()).
After that: no new entries until the timeout occurs.
I have absolutely no idea what to check, because there is no error or anything like that in the code.
If I look at the SQL Server, there is absolutely no reason for it to delay the queries (CPU, memory and disk space are all fine).
I would be glad for any comment on this; if you want more info, please let me know.
Best regards
Fixed it by rebuilding the fragmented indexes on Table1.
The following article was helpful for understanding how to take care of fragmented indexes:
https://solutioncenter.apexsql.com/why-when-and-how-to-rebuild-and-reorganize-sql-server-indexes/
(If some mod still thinks this is not a valid answer, an explanation would be great.)
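For reference, a minimal sketch of that kind of index maintenance, assuming SQL Server and taking dbo.Table1 as a placeholder name rather than the poster's actual schema:

-- Check how fragmented the table's indexes are.
SELECT i.name,
       ips.avg_fragmentation_in_percent
FROM   sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.Table1'), NULL, NULL, 'LIMITED') AS ips
JOIN   sys.indexes AS i
  ON   i.object_id = ips.object_id
 AND   i.index_id  = ips.index_id;

-- Rebuild (heavier) for high fragmentation, or reorganize (lighter) for moderate fragmentation.
ALTER INDEX ALL ON dbo.Table1 REBUILD;
-- ALTER INDEX ALL ON dbo.Table1 REORGANIZE;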

How to make a SQL query execute for a longer time

In one environment the database is slow and a query takes approximately 10 minutes to run, because of which other threads are waiting for the object and the entire JVM is hanging. In order to simulate the issue and to be sure that it is caused by the long-running query, I want to intentionally run the same query for 10 minutes (slowing down my query in my environment). We are using JDBC connectivity. Can anyone please suggest how to slow down my query so that it takes 10 to 15 minutes to execute? We are not using a query timeout.
What about using dbms_lock.sleep( number_of_seconds ) in your query to add a delay?
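One way to make that callable from plain SQL over JDBC is to wrap the procedure in a function. This is only a sketch: it assumes EXECUTE privilege on DBMS_LOCK, and the function and table names are made up for illustration.

-- Wrap the sleep procedure so it can be invoked once per row from a SELECT.
CREATE OR REPLACE FUNCTION slow_down(p_seconds IN NUMBER) RETURN NUMBER IS
BEGIN
  DBMS_LOCK.SLEEP(p_seconds);
  RETURN 1;
END;
/

-- Roughly 9000 rows x 0.1 s of sleep stretches the statement to about 15 minutes;
-- tune the sleep time and row count to hit the window you want.
SELECT t.*, slow_down(0.1) AS delay_marker
FROM   some_table t
WHERE  ROWNUM <= 9000;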
