I currently have a report that grabs certain orders (orders with discounts) and is emailed on a daily basis. Is there a way to make it email, or send out the subscription, only if there are orders with discounts?
Help would be immensely appreciated.
The workaround we use for this problem is kind of silly, but very effective.
Add a row count check at the beginning of your code like:
IF (SELECT COUNT(*) FROM YourTable) = 0  -- use the same filter your report uses for discounted orders
BEGIN
    -- Severity must be 11 or higher for the query to actually fail; lower severities are informational only
    RAISERROR ('No Rows to Report', 16, 1)
END
The error will halt execution of the subscription.
I am working on creating a user-level monitor of credit usage on a monthly basis, and have created a small query, as below.
SELECT user_name,
       SUM(credits_used_cloud_services) AS credits_used,
       DATE_TRUNC(month, CURRENT_DATE) AS month,
       warehouse_name
FROM query_history
WHERE start_time >= DATE_TRUNC(month, CURRENT_DATE)
GROUP BY 1, 3, 4
ORDER BY 2 DESC;
And it seems to be working; however, there is one point of confusion. The credit usage field is named
CREDITS_USED_CLOUD_SERVICES in Snowflake, which makes me think it is only giving me the credit usage of the cloud services layer and not the warehouse layer. If so, then this query is not good? And if my concern is right, can somebody please suggest or guide me toward the correct way to get the credit usage per user?
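For what it's worth, CREDITS_USED_CLOUD_SERVICES does cover only the cloud services layer; warehouse (compute) credits are metered per warehouse in WAREHOUSE_METERING_HISTORY, not per user. A common workaround is to apportion each warehouse's credits by each user's share of execution time on it. A rough sketch against the ACCOUNT_USAGE views (the apportioning logic is an approximation, not an official per-user metric):

WITH user_time AS (
    SELECT user_name,
           warehouse_name,
           SUM(execution_time) AS user_ms          -- milliseconds of query execution
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATE_TRUNC(month, CURRENT_DATE)
    GROUP BY 1, 2
),
wh_credits AS (
    SELECT warehouse_name,
           SUM(credits_used_compute) AS credits    -- compute-layer credits per warehouse
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATE_TRUNC(month, CURRENT_DATE)
    GROUP BY 1
)
SELECT u.user_name,
       u.warehouse_name,
       w.credits * u.user_ms
           / SUM(u.user_ms) OVER (PARTITION BY u.warehouse_name) AS approx_user_credits
FROM user_time u
JOIN wh_credits w USING (warehouse_name)
ORDER BY approx_user_credits DESC;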
I am trying to create an alert to be sent daily. The condition is to display all the Return Orders not completed.
I expect it to be sent later today, and if any of those items are still not completed by tomorrow, it should be sent out again.
Do I need to query for that condition? Let's say:
SELECT SalesOrderNo, SalesOrderDetailID, CustomerNo, ItemNo, DueDate
, Completed, Qty, UDCode, CustomerPO
FROM dbo._EventAlertReturns
WHERE GETDATE() is today?
What is the best way to do that?
At the moment your query is adequate on its own; you don't need that WHERE clause. But you cannot set up an automated process with a mere SELECT statement.
As Jon Vote said, to do something like this you need a SQL Server Agent scheduled job. With the scheduling agent you can create a scheduled package that executes in the evening and emails you the results of your query.
For additional granularity, you can look into SQL Server Data Tools to more easily create tasks and settings for your automated packages.
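If you go the Agent route, the job step can be plain T-SQL. A rough sketch, assuming Database Mail is already set up (the profile name, recipient, and the Completed = 0 filter are placeholders):

IF EXISTS (SELECT 1 FROM dbo._EventAlertReturns WHERE Completed = 0)
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'AlertsProfile',   -- assumed Database Mail profile
        @recipients   = 'you@example.com',
        @subject      = 'Return orders not completed',
        @query        = 'SELECT SalesOrderNo, SalesOrderDetailID, CustomerNo, ItemNo,
                                DueDate, Completed, Qty, UDCode, CustomerPO
                         FROM dbo._EventAlertReturns
                         WHERE Completed = 0';
END

The IF EXISTS guard also means nothing is sent on days when every return order is completed.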
I'm trying to query only opportunities that have tasks associated with the following query:
SELECT Id, (SELECT Id, Status FROM Tasks) FROM Opportunity WHERE Id IN (SELECT WhatId FROM Task)
The WHERE condition is not compiling; any ideas?
Thank you!
"Task is not supported for semi-join inner SELECTs"
You're welcome to upvote an idea, it's only 7 years old ;) https://success.salesforce.com/ideaView?id=08730000000J68oAAC
You can't do a rollup summary of activities either (to put a counter of Tasks on the Opportunity, for example). Messy...
You can just ignore it: SELECT all opps and filter them manually in Apex with if (!opp.Tasks.isEmpty()) { /* do my stuff */ }.
You can try splitting it into 2 queries: get a Set<Id> of Task.WhatId values and then bind that to a second, Opportunity-related query (see the sketch after this list).
You can put some helper field on the Opportunity that would help you identify them (and in future populate it with a cross-object workflow? Process Builder? a Task trigger?).
You can consider using a report with an "Opportunities with Tasks" cross filter and then fetch that report's results with the Analytics REST API. That counts as a callout, though, and it's limited to 2K rows, I think.
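A rough Apex sketch of the two-queries option above (purely illustrative; on a large org the unfiltered Task query would need limits or batching):

// Collect the Opportunity Ids that have at least one Task
Set<Id> oppIds = new Set<Id>();
for (Task t : [SELECT WhatId FROM Task WHERE What.Type = 'Opportunity']) {
    oppIds.add(t.WhatId);
}
// Bind the set into the Opportunity query, with the Tasks subquery intact
List<Opportunity> oppsWithTasks = [
    SELECT Id, Name, (SELECT Id, Status FROM Tasks)
    FROM Opportunity
    WHERE Id IN :oppIds
];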
We have started using BigQuery for event logging from our games.
We collect events from App Engine nodes and enqueue them in chunks now and then; the chunks are placed in a task queue.
A backend then processes this queue and uploads the events to BigQuery.
Today we store about 60 million daily events from one of our games and 6 million from another.
We have also made cron jobs that process these events to gather various gaming KPIs (i.e. second-day retention, active users, etc.).
Everything has gone quite smoothly, but we now face a tricky problem!
======== Question 1 ===============================================
For some reason, deletion of the queue tasks fails now and then. Not very often, but it happens, and often in bursts.
TransientFailureException is probably the cause... I say probably since we are deleting processed events in batch mode, i.e. ...
List<Boolean> Queue.deleteTask(List<TaskHandle> tasksToDelete)
...so we actually don't know why we failed to delete a task.
We have now added retry code that tries those failed deletions again.
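For reference, a minimal sketch of that retry pass (variable names are ours; deleteTask returns one Boolean per handle, in order, which is why the failure reason itself is lost):

// Assumes: Queue queue; List<TaskHandle> tasksToDelete; java.util imports.
List<Boolean> results = queue.deleteTask(tasksToDelete);
List<TaskHandle> failed = new ArrayList<TaskHandle>();
for (int i = 0; i < results.size(); i++) {
    if (!results.get(i)) {                 // false = this handle was not deleted
        failed.add(tasksToDelete.get(i));
    }
}
if (!failed.isEmpty()) {
    queue.deleteTask(failed);              // single retry pass; loop/backoff as needed
}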
Is there a best practice to deal with this kind of problem?
========= Question 2 =======================================================
Duplicate detection
The following SQL succeeds in finding duplicates for our smaller game, but exceeds resources for the bigger one.
SELECT DATE(ts) date, SUM(duplicates) - COUNT(duplicates) as duplicates
FROM (
SELECT ts, eventId, userId, count(*) duplicates
FROM [analytics_davincigameserver.events_app1_v2_201308]
GROUP EACH BY ts, eventId, userId
HAVING duplicates > 1
)
GROUP EACH BY date
Is there a way to detect duplicates even for our bigger game?
I.e. a query that BigQuery can run over our 60 million daily rows to locate the duplicates.
Thanks in advance!
For question #2 (I'd prefer these had been posted as separate questions, to avoid confusion):
Resources are exhausted on the inner query, or the outer query?
Does this work?
SELECT ts, eventId, userId, count(*) duplicates
FROM [analytics_davincigameserver.events_app1_v2_201308]
GROUP EACH BY ts, eventId, userId
HAVING duplicates > 1
What about reducing the cardinality? I'm guessing that since you are grouping by timestamp, there might be too many different buckets to group by. Does this work better?
SELECT ts, eventId, userId, count(*) duplicates
FROM [analytics_davincigameserver.events_app1_v2_201308]
WHERE ABS(HASH(ts) % 10) = 1
GROUP EACH BY ts, eventId, userId
HAVING duplicates > 1
I need to keep a daily statistic of the count of records in a table.
Is there a way to automate counting the records daily and writing the result into another table? Maybe using a SQL Agent Job or something like that?
I'm using SQL Server 2008.
Thank you!
Edit:
If I delete all the records from 1/1/2010 today, the statistic still needs to show that there were 500 records at the end of the day on 1/1/2010. So solely using GETDATE() and summing up doesn't work, as that method would give me 0 records for 1/1/2010.
Add a column to your table like so:
ALTER TABLE My_Table
ADD insert_date DATETIME NOT NULL DEFAULT GETDATE()
You can then query against that as SQL intended.
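With that column in place, the daily statistic becomes a plain aggregate, e.g.:

SELECT CAST(insert_date AS DATE) AS count_date, COUNT(*) AS records
FROM My_Table
GROUP BY CAST(insert_date AS DATE)
ORDER BY count_date;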
Insert trigger: increment the counting table's record for today (inserting it if not already created).
Delete trigger: decrement the counting table's record for today (inserting it if not already created).
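A rough sketch of the insert side, assuming a counting table like dbo.DailyCounts(count_date DATE, cnt INT); the delete trigger would mirror it with a decrement:

CREATE TRIGGER trg_My_Table_Count ON dbo.My_Table
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- Bump today's row, creating it on the first insert of the day
    UPDATE dbo.DailyCounts
    SET cnt = cnt + (SELECT COUNT(*) FROM inserted)
    WHERE count_date = CAST(GETDATE() AS DATE);
    IF @@ROWCOUNT = 0
        INSERT INTO dbo.DailyCounts (count_date, cnt)
        SELECT CAST(GETDATE() AS DATE), COUNT(*) FROM inserted;
END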
In my opinion you answered your own question with the best option: create a job that just calls a stored procedure to get the count and stamp it with the date.
The other option, mentioned by Tom H., is a better choice, but if you can't alter the table for whatever reason, the job is a good option.
Another option could be to place an insert trigger on that table to increment a count somewhere, but that could affect performance depending on how you implement it.
Setting up the job is simple through the SQL Server Management Studio interface, with a schedule for how often to run and which stored procedure to call. You can even write the command directly in the command window of the step instead of calling a stored procedure.
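The stored procedure itself can be tiny. A sketch (table and column names are placeholders):

CREATE PROCEDURE dbo.usp_SnapshotRowCount
AS
BEGIN
    SET NOCOUNT ON;
    -- Stamp today's total into the statistics table
    INSERT INTO dbo.DailyRowCounts (snapshot_date, row_count)
    SELECT CAST(GETDATE() AS DATE), COUNT(*)
    FROM dbo.My_Table;
END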
Tom's answer, with OMG_Ponies' addendum about tombstoning records instead of deleting them, is the best answer. If you are concerned with how many records were in the table on a certain day, there is a good possibility that someone will one day ask for information about those records on that day.
If that is a no-go, then as others have said, create a second table with a field for the PK of the last record counted and a field for the day's count. Then create a job that runs at the end of each day, counts all records with OriginalTable.PK > MAX(NewCountTable.Last_PK_Field), and adds that row (Last_PK_Field, Count) to NewCountTable.
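That end-of-day step might look roughly like this (all names are placeholders; guard against days with no new rows as needed):

INSERT INTO dbo.NewCountTable (Last_PK_Field, [Count])
SELECT MAX(o.PK), COUNT(*)                    -- new watermark and count since last run
FROM dbo.OriginalTable o
WHERE o.PK > (SELECT ISNULL(MAX(Last_PK_Field), 0) FROM dbo.NewCountTable);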
SQL Job is good -- yes.
Or you could add a date column to the table, defaulted to GETDATE(). This wouldn't work, though, if you don't want your daily counts to be affected by people deleting records after the fact.