How to retrieve the progress of an Oracle SQL query? - angularjs

I have a coding problem I can't solve.
The requirement is to show the progress of a SQL query. I am using Spring Boot and AngularJS, so the idea is to show the progress of the SQL on the UI; it does not need to be real time, but that would be preferred. Basically, the user clicks a button on the UI, which triggers an API call to retrieve data from the DB and then return the completed data to the UI.
We have a computationally expensive SQL query that takes a long time to finish. When the number of rows we want to retrieve is about 10 million, it takes roughly 15 minutes. We want to show the progress of the SQL so the user has an idea of how long it will take. So the idea is to check how many rows have been completed:
Say 1 million rows retrieved, then it should return 10%, and so on: the next 1 million, 20%.
I have no idea how to approach this and need some suggestions.
Thanks in advance.

Assuming this is one long select call, you can run the following query to get the stats on a long-running statement:
select * from v$session_longops where time_remaining > 0 and sid = ???
The sid value should be replaced with the Oracle session ID of the session that is running the long query; you can determine that by looking in v$session. You will need to execute the above query in a separate thread (or whatever the concurrent unit of execution is in Spring).
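A minimal sketch of both steps, assuming the application connects as a dedicated database user (MY_APP_USER is a placeholder) so its session is easy to spot:
-- 1) find the session running the long query
select sid, serial#, status, sql_id
from v$session
where username = 'MY_APP_USER';
-- 2) poll that session's progress; sofar/totalwork gives the percentage
select opname, sofar, totalwork,
       round(sofar / nullif(totalwork, 0) * 100, 1) as pct_done,
       time_remaining
from v$session_longops
where time_remaining > 0
  and sid = :sid;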

Thanks everyone. I tried to use OldProgrammer's solution but failed: somehow I just can't retrieve the Hibernate session ID, and all the related posts I found were either too old or manually created a new session, which I don't want.
I did a workaround for this problem, though it is not a real solution.
What I did for that complex SQL is: first run a total count, then split the work into 100 chunks. As each chunk finishes (pagination), I update the completion percentage. Meanwhile, the UI polls the progress of the SQL from a different API endpoint on a timer, currently set to every 2 seconds.
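A rough sketch of the chunked fetch, assuming Oracle 12c+ for the OFFSET/FETCH syntax (big_report_view, report_progress and run_id are placeholder names):
-- 1) total count up front, to size the 100 chunks
select count(*) from big_report_view where run_id = :runId;
-- 2) fetch one chunk; the application computes :offsetRows as (chunkNo - 1) * chunkSize
select *
from big_report_view
where run_id = :runId
order by row_key
offset :offsetRows rows fetch next :chunkSize rows only;
-- 3) after each chunk, record progress for the polling endpoint
--    (100 chunks, so the chunk number is the percentage)
update report_progress set pct_complete = :chunkNo where run_id = :runId;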
The outcome is like this: for a fairly small number of rows, the progress jumps, say from 2% to 6%, on every API call; for a large number of rows, it only moves from 1% to 2% after a few progress checks.
This is not a clean solution, but it serves its purpose: it gives the user some idea of how long the query will take to finish.

Related

OLEDB or ODBC error when direct-importing a query

I have a query that returns around 160 million rows, which used to take me 4 hours to import into Power BI.
Ever since my company moved their server to Azure, I can never import this query successfully. It starts loading about 1 million rows, and after a minute or two this error always pops up.
I tried:
changing the command timeout to 200 minutes; it still errors out within a minute or two of loading, sometimes within 10 seconds
selecting the top 1000 rows in my query, which completes without error; but when I switch back to the original query, it always fails
I am attaching the error message. I have talked to the data engineers on my team and they don't seem to have a clue. Does anyone have any idea how to fix this?
We had the same problem and went for a delta lake in combination with option 1.
You have to ask yourself first why you are importing so much data. Always keep your model as small as possible; I can't imagine you are looking at every row. If you do need row-level detail, you could use a combination of DirectQuery for the details and an imported aggregate for your reporting. But load aggregates instead of everything (see the sketch after this list).
Maybe you can load less history, such as only the last two years.
You could look into loading incrementally; you could load per partition.
You can try to increase the DTUs for your server.
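As a sketch of the aggregate idea: instead of importing 160 million raw rows, import something pre-aggregated on the server (fact_sales and its columns are made-up names):
select cast(order_date as date) as order_day,
       product_id,
       sum(quantity)  as total_qty,
       sum(net_value) as total_value
from fact_sales
group by cast(order_date as date), product_id;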

Is it possible to show the records from an ADOQuery whilst opening it?

I have an ADOQuery linked to a DBGrid by a DataSource.
The ADOQuery and the DataSource are in a DataModule and the connection is in another form.
Is there any way to make my application show rows while the query is fetching the records?
Like SQL Server Management Studio does.
The select takes about 7 minutes to finish executing.
I'm using Delphi 2007.
A difficult challenge. If I need to do massive queries, I normally break the query into chunks. I then create a stored procedure that takes parameters @ChunkNumber, @ChunkSize and @TotalChunks, and only runs the query for records from (@ChunkNumber - 1) * @ChunkSize + 1 to @ChunkNumber * @ChunkSize. In your Delphi code, simply run a loop like this (pseudo-code):
for Chunk := 1 to TotalChunks do
begin
  // fetch one slice of the result set
  DataTableResults := Exec('sp_SomeProcedure @ChunkNumber = :Chunk, @ChunkSize = :ChunkSize');
  RenderTableToClient(DataTableResults);
end;
This way, say you have 10,000 records and a chunk size of 100: you will have 100 SP calls, and you can render each chunk as it is received from the SP, so the user sees the table updating.
The limitation is when the query needs to process all records in one hit first, e.g. a GROUP BY. SQL Server supports OFFSET, which you can combine with this approach to get something useful; see the sketch below.
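A sketch of that chunking procedure for SQL Server 2012+, where OFFSET/FETCH is available (dbo.SourceTable and Id are placeholders; a stable ORDER BY key is required for the paging to be consistent):
CREATE PROCEDURE sp_SomeProcedure
    @ChunkNumber INT,
    @ChunkSize   INT
AS
BEGIN
    SELECT *
    FROM dbo.SourceTable
    ORDER BY Id
    OFFSET (@ChunkNumber - 1) * @ChunkSize ROWS
    FETCH NEXT @ChunkSize ROWS ONLY;
END;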
I have queries that run over about 800K records and take about 10 minutes, which I handle this way. What I do is chunk up the source tables and then run the queries per chunk; e.g., if a users table has 1M records and you want a query that shows the total pages accessed per hour, you can chunk up the users and run the query for each chunk only.
Sorry I don't have code examples specific to your tables, but I hope this suggestion leads you in a positive direction.

Deleting the same number of records from a SQL Server database takes either 0.2 sec or 30 sec

I am using SQL Server 2008 R2. I am deleting ~5000 records from a table. While testing performance with the same data, I found that the deletion takes either 1 second or 31 seconds.
The test database is confidential, so I cannot share it here.
I have already tried to split the load and delete only 1000 records at a time, but I still see the deviation.
How should I continue my investigation? What could be the reason for the performance difference?
The query is simple, something like: delete from PART where INVOICE_ID = 64225
The DELETE statement uses a lot of system and transaction log resources, which is why it can take a long time. If possible, try to TRUNCATE the table instead (note that TRUNCATE removes all rows, so it only applies if you are emptying the whole table).
When you ran the DELETE for the first time, perhaps the data wasn't in the buffer pool and had to be read from storage; the second time you run it, it will already be in memory. Another factor is whether the checkpoint process is writing your changed pages out to storage. Post your query plan:
set statistics xml on
-- run the statement you want the plan for; sys.objects is just an example:
select * from sys.objects
set statistics xml off
Argh! After having a closer look at the execution plan, I noticed that an index was missing. After adding the index, all executions run fast. Note that I then removed the index to test whether the problem came back, and the performance deviations reappeared.
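For reference, the shape of the fix, given the delete statement above (the index name is illustrative):
CREATE INDEX IX_PART_INVOICE_ID ON PART (INVOICE_ID);
Without an index on INVOICE_ID, each delete has to scan the whole table to find the matching rows, so the timing depends on how much of the table happens to be cached.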

Why does an SSRS 2005 report take a long time to execute when using a parameter?

I am trying to execute an SSRS 2005 report that takes one parameter.
If I don't use the parameter and write the value directly, it runs in 10 seconds, e.g.:
Select * from table1 where id = 122
If I use the parameter, it takes a long time, like 10 to 15 minutes:
Select * from table1 where id = @id
I don't know why this is happening.
Thanks in advance.
It's impossible to answer the question as asked: only you have the info to determine why things aren't performing well.
What we can do, however, is answer the question "How do I investigate SSRS performance issues?". One of the best tools I've found so far is the ExecutionLog2 view in the ReportServer catalog database. In your case, the important columns to look at are:
TimeDataRetrieval, for time spent connecting to the data source and retrieving data rows
TimeProcessing, for time spent turning the data rows into the report
TimeRendering, for time spent creating the final output (pdf, html, excel, etc)
This will give you a starting point for further investigation. Most likely (from your description) I'd guess the problem lies in the first part, data retrieval. A suitable follow-up step would be to analyze the query that SSRS executes, possibly using its execution plan.
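For example, a query along these lines against the ReportServer catalog database shows where the time goes for recent runs ('YourReport' is a placeholder):
SELECT TOP 20
    TimeStart,
    TimeDataRetrieval,  -- ms spent connecting to the source and retrieving rows
    TimeProcessing,     -- ms spent turning the rows into the report
    TimeRendering       -- ms spent producing the final output
FROM ExecutionLog2
WHERE ReportPath LIKE '%YourReport%'
ORDER BY TimeStart DESC;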
1) Try replacing your subqueries with join logic wherever possible. I know subqueries often feel more natural, because they let the problem flow as "[this result set] gets [that result set's output]" when you are thinking in a macro view; see the sketch below.
2) You can also add an index on the parameter column; since it's an int, lookups will be fast.
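To illustrate point 1, a made-up example (orders and customers are placeholder tables):
-- subquery form:
SELECT *
FROM orders
WHERE customer_id IN (SELECT customer_id FROM customers WHERE region = 'EU');
-- equivalent join form, which the optimizer often handles better:
SELECT o.*
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id
WHERE c.region = 'EU';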

How to combine multiple activity records like Facebook does with SQL Server?

I have searched but couldn't find a solution that works for me.
I just wonder how Facebook or LinkedIn manages to collapse activity of the same type into one sentence.
I mean, if you store every activity with a different ID in an Activity table, how can you list them as "Member_a and 15 more people changed their photos"?
I'm trying to build a social activity wall for my website. It's not that big, but I just want to understand the logic for this situation.
For example, when the page first loads, I make an Ajax call and list records 0-10, and if the user scrolls down, the page makes another Ajax call that lists records 11-20.
Now, if I try to combine activities of the same type after the SQL select using if/else logic, and these 10 records are all the same, the user will only see 1 item. I hope I could explain what I mean :)
So, I need a solution that does this combining in the SQL statement itself.
I'm not asking you to write a query for me; I just want to know the logic.
Here is a screenshot of what I want to achieve:
You see, these are actually different stored records, but they were combined and presented as a single network-update item.
By the way, I'm using C# and SQL Server 2008.
For example:
SELECT MIN(b.MemberName) AS SampleName, COUNT(*) AS Total
FROM Network_Feed a
JOIN Member b ON a.MemberID = b.MemberID
WHERE a.FeedType = 1
did I understand your question right?
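Extending the same idea to build the whole wall in one pass, rather than one query per activity type (a sketch; the "Member_a and 15 more people" text is then built in C# from SampleName and Total):
SELECT a.FeedType,
       MIN(b.MemberName) AS SampleName,
       COUNT(DISTINCT a.MemberID) AS Total  -- distinct, in case one member appears twice
FROM Network_Feed a
JOIN Member b ON a.MemberID = b.MemberID
GROUP BY a.FeedType;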
It's not easy to manage petabytes of data as one table, so big projects running on SQL Server use advanced scaling tricks (distributing data and load) such as Service Broker and replication.
You can check
http://www.microsoft.com/casestudies/Case_Study_Detail.aspx?CaseStudyID=4000004532 as a SQL Server example.
