Calculating Average Weekly Traffic in Data Studio - google-data-studio

I've got a table in Google Data Studio showing monthly traffic numbers and I would like to have another table that shows average weekly traffic based on the monthly numbers in another table on the same page.
Having some trouble figuring out the custom calculated field formula for this. Any help would be appreciated.

This seems to work for me.
SUM(Sales) / COUNT_DISTINCT(EXTRACT(ISOWEEK FROM DATE))

From your example, is it not as easy as dividing your monthly traffic numbers by 4.34 (roughly the average number of weeks in a month)?
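To make the COUNT_DISTINCT(EXTRACT(ISOWEEK ...)) approach concrete, here is a plain-Python sketch of the same computation over made-up daily numbers (field names and values are illustrative, not from the original report). One caveat it surfaces: partial weeks at the month's edges each count as a distinct week, which pulls the average down — the same applies to the Data Studio formula.

```python
from datetime import date, timedelta

# Hypothetical daily traffic for January 2023 (flat 100/day, purely illustrative).
daily = {date(2023, 1, 1) + timedelta(days=i): 100 for i in range(31)}

total = sum(daily.values())

# Mirror of COUNT_DISTINCT(EXTRACT(ISOWEEK FROM DATE)):
# count the distinct ISO weeks touched by the date range.
distinct_weeks = len({d.isocalendar()[1] for d in daily})

avg_weekly = total / distinct_weeks
```

Here January 2023 touches six ISO weeks (week 52 of 2022 plus weeks 1–5), so the "average weekly traffic" is total / 6 even though only four-and-a-bit calendar weeks elapsed.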

Depending on how you want to present this, there's a pretty easy, and decent solution - Reference Lines.
Using a reference line, you can chart weekly values (i.e. weekly sessions) on a bar chart, then plot a single line for the average of that period (all the bars currently present). Because it is native to the visualization, it recalculates as you filter!

Related

Using lags or monthly categorical features for recognizing the seasonality with DeepAR and TFT from pytorch-forecasting

I'm trying to forecast monthly sales with DeepAR and the Temporal Fusion Transformer from pytorch-forecasting. The data I use has monthly seasonality, and the seasonality is the same, or at least very similar, across countries.
While generating the TimeSeriesDataSet with pytorch-forecasting I could set the parameter lags for the target variable. The documentation says about it:
Lags can be useful to indicate seasonality to the models
I'm not sure whether this is a better option than using the month (or month and country combined) as a categorical feature to make the seasonality easier to recognize.
Does anyone have experience with this, or an explanation of which choice would be best?
Thanks in advance!
The DeepAR algorithm automatically generates features for time series. Read more here:
https://docs.aws.amazon.com/sagemaker/latest/dg/deepar_how-it-works.html
You can also add your own custom features (both categorical and continuous) for each time series, e.g. public holidays.
It works well when you have multiple time series with more than 300 data points each.
All time series should have the same frequency.
Benchmarking DeepAR against TFT is in your hands; my guess is that TFT will outperform.
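To make the two options in the question concrete, here is a minimal stdlib sketch (not the pytorch-forecasting API; the column names and the seasonal pattern are invented) of what a 12-month lag feature and a month categorical each encode:

```python
from datetime import date

# Hypothetical monthly sales with an exactly repeating yearly pattern.
months = [date(2020 + i // 12, i % 12 + 1, 1) for i in range(24)]
sales = [100 + 10 * (m.month % 3) for m in months]

rows = []
for i, m in enumerate(months):
    rows.append({
        "month_cat": m.month,                          # categorical: which month this is
        "lag_12": sales[i - 12] if i >= 12 else None,  # lag: the value one season ago
        "target": sales[i],
    })
```

With a perfectly repeating pattern, the lag-12 feature equals the target outright, while the month categorical only tells the model which seasonal bucket it is in and leaves the level to be learned. In pytorch-forecasting itself, my understanding is that the lag is requested via the `lags` argument of `TimeSeriesDataSet`, e.g. `lags={"sales": [12]}`.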

Does Apache Superset support Weighted Averages?

I'm trying to use Apache Superset to create a dashboard that will display the average rate of X/Y at different entities such that the time grain can be changed on the fly. However, all I have available as raw data is daily totals of X and Y for the entities in question.
It would be simple to do if I could just get a line chart that displayed sum(X)/sum(Y) as its own metric, where the sum range would change with the time grain, but that doesn't seem to be supported.
Creating a function in SQLAlchemy that calculates the daily rates and then uses that as the raw data is also an insufficient solution, since taking the average of that over different time ranges would not be properly weighted.
Is there a workaround I'm not seeing?
Is there a way to use Druid or some other tool to make displaying a quotient over a variable range possible?
My current best solution is to just set up different charts for each time grain size (day, month, quarter, year), but that's extremely inelegant and I'm hoping to do better.
There are multiple ways to do this. One is using the Metric editor; in this case the metric definition is stored as part of the chart.
Another is to define a metric in the datasource editor, where the metric is stored with the datasource definition and becomes reusable for any chart using that datasource.
Side note: depending on the database you use, you may have to CAST from, say, an integer to a numeric type, or multiply by 100, in order to get a useful result.
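The distinction the question hinges on can be shown in a small sketch with invented daily totals: averaging per-day rates weights every day equally, while sum(X)/sum(Y) weights each day by its Y, which is the properly weighted rate the asker wants.

```python
# Made-up daily totals of X and Y for one entity.
days = [
    {"x": 10, "y": 100},  # daily rate 0.10
    {"x": 90, "y": 300},  # daily rate 0.30
]

# Averaging the daily rates treats both days equally.
unweighted = sum(d["x"] / d["y"] for d in days) / len(days)

# sum(X)/sum(Y) weights each day by its Y, as the question requires.
weighted = sum(d["x"] for d in days) / sum(d["y"] for d in days)
```

Here the unweighted mean of rates is 0.20, but the volume-weighted rate is 100/400 = 0.25 — exactly the discrepancy that makes pre-computed daily rates "insufficient" as raw data.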

Calculated field for Percentage difference in data studio

I need the percentage difference between different time periods for the same raw data coming from Analytics. I'm unable to do so because the metrics from the data source don't include any time period, and in order to create a calculated field I'm supposed to use the metrics from the data source. How should I go about creating the percentage difference in this scenario?
Feel free to ask follow-up questions.
If I understand you correctly, you can't create a calculated metric to achieve what you're after. The only built-in option is "comparison to period", but I'm guessing from what you've described that it isn't possible either.
The only way to achieve this would be to organise the data in your data source to be something like
Date | metric value | metric value to compare to
But I appreciate this may not be as flexible as you'd like.
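With the data laid out that way, the percentage-difference calculated field becomes straightforward. A sketch with hypothetical column names and made-up values:

```python
# Hypothetical rows in the suggested layout:
# Date | metric value | metric value to compare to
rows = [
    {"date": "2023-01", "value": 120, "compare_to": 100},
    {"date": "2023-02", "value": 90,  "compare_to": 100},
]

# Percentage difference of the current value versus the comparison value.
for r in rows:
    r["pct_diff"] = (r["value"] - r["compare_to"]) / r["compare_to"] * 100
```

The first row comes out at +20% and the second at -10%; in Data Studio the equivalent calculated field would be (value - compare_to) / compare_to, formatted as a percentage.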

How to calculate a percentage using columns of data in a SSRS TABLIX that I have grouped

I have an SQL query that gives me a data set with 3 columns:
Contract Code
Volume
MonthRegistered
I want to present this data grouped on rows by Contract_Code and columns by MonthRegistered:
I then want to calculate a Percentage difference between the months:
I will only ever in this case have 2 months worth of data - Each 1 year apart.
I am trying to express the percentage variation from one year to the next for each row of data.
I tried this expression:
=(Fields!Volume.Value)/(Fields!Volume.Value)
but clearly it was not right: it doesn't address the two month columns independently.
I did format the TABLIX text box as a percentage so at least I figured that one out.
In this TechNet article, Calculating Totals and Other Aggregates (Reporting Services), it states: "You can also write your own expressions to calculate aggregate values for one scope relative to another scope." I couldn't find a reference for how to address the separate scopes.
I would appreciate any pointers on this one please!
Sorry for posting my examples as JPG rather than actual text but I needed to hide some of the data...
This only works because you will only ever have two months' worth of data to compare. Make sure your SQL has already ordered by MonthRegistered; if you do not order in your query, SSRS's own sorting determines which value is first and which is last.
=First(Fields!Volume.Value) / Last(Fields!Volume.Value)
Because you have performed the aggregation in SSRS, you may have to wrap each field in a Sum expression.
It would be advisable to perform the aggregation in SQL where possible, if you only plan on showing it in this way.
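The First/Last logic, restated as a small Python sketch with invented contract data (rows pre-sorted by MonthRegistered within each contract, as the answer requires):

```python
# Hypothetical (Contract_Code, MonthRegistered, Volume) rows,
# already ordered by MonthRegistered — the ordering the answer warns about.
rows = [
    ("C001", "2022-06", 200),
    ("C001", "2023-06", 250),
    ("C002", "2022-06", 500),
    ("C002", "2023-06", 400),
]

# Equivalent of =First(Fields!Volume.Value) / Last(Fields!Volume.Value)
# evaluated per Contract_Code group.
ratios = {}
for code in {r[0] for r in rows}:
    group = [v for c, m, v in rows if c == code]  # ordered volumes for this contract
    ratios[code] = group[0] / group[-1]           # First / Last
```

C001 comes out at 0.8 and C002 at 1.25 — note this expression gives the ratio of the earlier month to the later one, so a value below 1 means volume grew year over year.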

Lifelog API totals not matching up

I have been able to retrieve data from the Lifelog API and calculate all of the totals I need, but I am finding that the totals I calculate are different from the ones the Lifelog app shows. Any idea what I may be doing wrong? Here is an example query I am running to get total steps:
/me/activities?start_time=2015-01-20T00:00:00.391-20&end_time=2015-01-20T23:59:00.000Z&type=physical:walk
The full guide on how to calculate totals is now available here:
https://developer.sony.com/develop/services/lifelog-api/guides/how-to-calculate-daily-totals/
