I'm having an issue where the figures pulled from Google Analytics differ between a table and a time series chart.
As you can see in the image below, the 'Nutzer' (user) value in the table for Sep 2019 is 6692, but on the time series it is 7789. This then affects the calculated values for 'Umsatz pro Nutzer' (revenue per user).
Does anyone know why this happens and how to stop it?
I think the most likely answer is that there is a filter active on one or both of the objects that is altering the displayed data.
To test this out, create a copy of the sheet and delete the chart. Then make a copy of your table and change it to a time series using the chart options. I just did this process using a connection to the GA property for our mobile app and I see identical data for users for both the table and the time series.
I am creating a Data Studio time series chart with a month-over-month (MoM) comparison for clicks. I am using Google Search Console as the data source, and I want this time series to show all the data from the past 16 months (I have used an advanced time filter for this).
I created new fields in my data source by following the links below.
[Image: GDS Fields]
Please reference the below links for what the fields contain:
https://medium.com/compassred-data-blog/how-to-create-custom-formatted-date-comparisons-in-google-data-studio-9280348742a3 and https://medium.com/#brian_42021/first-off-great-work-patrick-really-impressive-workaround-that-im-finding-very-useful-bd1ee30abe57.
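For context, here is a rough sketch of the kind of shifted-date field those articles build (the field name and the one-month offset here are just illustrative; the articles have the full details):
Comparison Period:
DATETIME_ADD(Date, INTERVAL 1 MONTH)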
I then created a blend using the data source, with Comparison Period as the date and date as the join keys.
[Image: GDS Blend]
I was able to get the MoM comparison to show when using the blend as the data source in my time series chart, but the issue is that for January (the current month), it is not pulling the previous month's clicks (December). This affects February as well. Does anyone have any solutions to have Data Studio pull these metrics? Below is an image of the issue.
[Image: MoM Issue]
I created a simple report to track the funds raised for our primary school: just a pie chart that pulls data from a Google Sheet maintained by the treasurer.
There is now a request to add a timestamp on the screen (the report is shared by taking a screenshot and posting it on various social media platforms to report progress).
After some Googling I couldn't find a function that returns the current time, something like =now() in Sheets. However, it was recommended to try creating the time in the source data.
So I created a field that stores the value of the current time in Google Sheets using this function:
=(now()-date(1970,1,1))*86400
I also set the Sheet to refresh every minute. The Sheet works perfectly.
On the Studio side I added a calculated field to display the time in my preferred format:
TODATE(Amount,'SECONDS','%H:%M:%S on %d %b %Y')
Again, this conversion works perfectly.
The problem is that the time refresh doesn't work on the report side. No matter how many times I refresh the data, it still doesn't pick up the updated time from the source, even though the source sheet has the updated time.
As far as I can tell, the worksheet time update has to be triggered manually for the report to be updated.
That negates the whole purpose of the timestamp.
Sharing the report directly from Google Data Studio is not a practical option for now. Still, I have shared the report.
0) Summary
Use either:
#1 New Recommended Approach: Using Scorecards
#2 Original Suggestion: Using Tables
1) New Recommended Approach: Scorecards
It can be achieved using the CURRENT_DATETIME function (released in the 17 Sep 2020 Google Data Studio update to Dates and Times).
The below looks at three use cases using Scorecards aggregated by MAX or MIN (in this scenario either aggregation would display the same date time). The fields will automatically update based on the chosen Data Freshness settings (for example, the Google Sheet used in this Report is set to refresh every 15 minutes) and can also be updated manually if required (by clicking the refresh icon at the top of the report or using the shortcut keys Ctrl + Shift + E):
1.1) UTC
The default function would display a value based on UTC:
CURRENT_DATETIME()
1.2) Time Zone
A Time Zone could also be specified; for example, the below would display the time in the EST Time Zone:
CURRENT_DATETIME("EST")
1.3) Location
A location can also be specified, based on the TZ database name, for example, Colombo, Sri Lanka would be:
CURRENT_DATETIME("Asia/Colombo")
Added an Editable Google Data Studio Report and a GIF to elaborate:
2) Original Suggestion: Tables
The below looks at three use cases (outlined above) created using Tables.
Added an Editable Google Data Studio Report and a GIF to demonstrate:
There isn't a function to do that yet, but hopefully one will come soon (see https://issuetracker.google.com/issues/78200216, which is assigned). However, with a little careful design, you could achieve it using a date rather than a timestamp, utilising a date filter.
If you don't have a date field in your data, then you could simply set this to TODAY.
If you do have a date field, then use the advanced date settings to set the start date to your field's earliest date and the end date to TODAY.
You could then use some shapes / formatting to cover up what isn't needed.
Hardly ideal, but maybe a stopgap?
There is a variable TODAY() that doesn't seem to be documented in their function documentation, but it works in calculated fields and may help you.
I've been banging my head against this on and off for about a year, and I've just hit crunch time.
Business Issue: We use software called Compeat Advantage (a General Ledger system), and it comes with an Excel add-in that provides a function to retrieve data from the Microsoft SQL database. The problem is that it must make a call to the database for each cell containing that function. On average it takes about 0.2 seconds to make the call and retrieve the data. Not bad, except when a report has these in volume: our standard report built with it has ~1,000 calls, so by the math it takes just over 3 minutes to produce the report.
Again, in and of itself that is not a bad amount of time for a fully custom report. The issue is that this is one of the smaller reports we run, AND in some cases we have to produce 30 variants of the same report, one unique to each location.
The function's arguments are: Unit(s) [String], Account(s) [String], Start Date, End Date. Everything is retrieved in a SUM() so that a single [Double] is returned. In effect, each call runs:
SELECT SUM(acctvalue)
FROM acctingtbl
WHERE DATE BETWEEN startDate AND endDate AND storeCode = Unit(s) AND Acct = Account(s)
Solution Sought: For the standard report there are only three variations of the data retrieved (Current Year, Prior Year, and Budget), and if they were all retrieved in bulk, as detailed tables/arrays, the 3-minute report would take less than a second to produce.
I want to find a way to retrieve the data in line-item detail and store it locally, so it can be accessed without creating an ODBC call for every single function on the sheet. Something like:
SELECT Unit, Account, SUM(acctvalue)
FROM acctingtbl
WHERE date BETWEEN startDate AND endDate
GROUP BY Unit, Account
Help: I am failing to find a workable way to do this. The largest problem I have is the scope/persistence of the data. It is easy to call for all the data I need from the database, but keeping it around for use is killing me. Since these are spreadsheet functions, the data in the variables is released after each call, so I end up in the same spot: each function call on the sheet takes 0.2 seconds.
I have tried storing the data in a CSV file, but I continue to have data-handling issues in so far as moving it from the CSV into an array to search and sum the data. I don't want to manipulate the registry to store the info.
I am coming to the conclusion that if I want this to work I will need to call the database, store the data in a very hidden (xlSheetVeryHidden) tab, and then pull it forward from there.
Any thoughts on what process I should use would be much appreciated.
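For illustration, here is a rough sketch of that cache-and-lookup idea (the connection string, sheet name, dates, and column layout are all placeholders, not our actual setup):
Option Explicit

' Rough sketch: pull the bulk GROUP BY query once into a very-hidden sheet,
' then have the worksheet function read from that cache instead of hitting
' the database for every cell.
Public Sub RefreshAcctCache()
    Dim cn As Object, rs As Object, ws As Worksheet

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;Initial Catalog=Compeat;Integrated Security=SSPI;"

    Set rs = cn.Execute("SELECT Unit, Acct, SUM(acctvalue) AS Amt " & _
                        "FROM acctingtbl " & _
                        "WHERE [date] BETWEEN '2020-01-01' AND '2020-12-31' " & _
                        "GROUP BY Unit, Acct")

    Set ws = ThisWorkbook.Worksheets("AcctCache")
    ws.Visible = xlSheetVeryHidden          ' keep the cache tab out of sight
    ws.Cells.ClearContents
    ws.Range("A1").CopyFromRecordset rs     ' columns: Unit | Acct | Amt

    rs.Close
    cn.Close
End Sub

' Replacement for the add-in call: sum the cached rows for a unit/account pair,
' with no database round trip per cell.
Public Function GetAcctSum(unitCode As String, acctCode As String) As Double
    Dim ws As Worksheet, r As Long, lastRow As Long, total As Double

    Set ws = ThisWorkbook.Worksheets("AcctCache")
    lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row

    For r = 1 To lastRow
        If ws.Cells(r, 1).Value = unitCode And ws.Cells(r, 2).Value = acctCode Then
            total = total + ws.Cells(r, 3).Value
        End If
    Next r

    GetAcctSum = total
End Function
The start and end dates would of course come from the sheet rather than being hardcoded.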
Okay!
After some lucky Google-fu I found a passable workaround.
"VBA - Update Other Cells via User-Defined Function" provided many of the answers.
Beyond his code, I had to add code to make that sheet calculate every time the UDF was called, to check the trigger. I did that with a simple cell + cell formula that has a random number placed in it every time the workbook calculates.
I am expanding the code in the Workbook section now to fill in the holes.
Should solve the issue!
I am trying to plot a value over a time series in Power BI Report Builder. I am currently getting the data from a relational MSSQL database. This value (UnitCapacity) has a StartDate and an EndDate, so I created a date-time dimension inside Power BI using an M query to generate the days between one year and another. What I am trying to do is plot the unit capacities over a time series chart. I then created filters so that I can choose which Refinery unit to plot.
The way I tried to tackle it is by creating a relationship between IIROutagesDenormalised and DateTimeDim over the handle, where the handle is in this format: {YYYY}-{MM}-{DD}. Is this the right way to do this, please?
When I tried to create the DAX query to get the calendar date dimension, it gave me the error below:
You don't need to take care of the date format, because it should be handled by Power BI as long as the data type is correct. I'm not sure about the business logic, but there is a simpler way using DAX.
You can create a calendar table using DAX:
DateTimeDim = CALENDAR(MIN(IIROutagesDenormalised[OutageStartDate]), MAX(IIROutagesDenormalised[OutageEndDate]))
Which returns a table with column Date.
If you create a relationship between the Date column and OutageStartDate:
With a simple measure (depending on the business logic), like
Total = SUM(IIROutagesDenormalised[UnitCapacity])
You can plot something like the following:
Which also works with the filter:
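If the business logic instead requires the capacity to count on every date between OutageStartDate and OutageEndDate, a rough sketch of an overlap measure could look like the following (this assumes DateTimeDim is left disconnected from IIROutagesDenormalised, so the filter below does all the work; adjust to your model):
Capacity In Period =
CALCULATE(
    SUM(IIROutagesDenormalised[UnitCapacity]),
    FILTER(
        IIROutagesDenormalised,
        IIROutagesDenormalised[OutageStartDate] <= MAX(DateTimeDim[Date])
            && IIROutagesDenormalised[OutageEndDate] >= MIN(DateTimeDim[Date])
    )
)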
I have two different data sets using two different SQL queries. Essentially, one data set is day/caller stats rolled up; the other set is call data. So the call data rolls up to produce the day/caller data.
I needed to separate these two queries for performance, because I needed one extract and one parameterized custom query for the call data. So essentially I will always bring in this month's and last month's data for the day/caller data.
What I need to do is create one dashboard that has the caller and all of their stats aggregated for the time period. Then I need to be able to click a row to bring up all the call data in a different sheet on the same dashboard.
I am on the home stretch and need a way to connect these two sheets and update the call data. Right now I only have a parameter for the Unique ID of the callers, not time, so I bring in all the same days of calls even though that is really not needed. In a perfect world, I would click the caller in the report and my second query would update to the appropriate day range and Unique ID and produce only that caller's calls. My problem right now is that no matter what I do, I cannot get the one sheet to update the second call sheet. I have successfully created a manually functioning report, but I need the action to filter to a time period and the specific caller.
Let me know if you have any feedback. My two issues are the two separate queries: caller data (225k rows held in an extract) and call data (7 million rows if unfiltered), which needs to be a live connection so that when a sheet is clicked the parameters update and those calls populate. Anything would help!
The solution I can think of is to use an action filter; there is an option below to select the fields to map between the sheets. Choose "Selected Fields" instead of "All Fields" and map the ID and time between the two data sources.
Apart from this I don't really get what the issue is. If you need further clarification, please rephrase your question and provide examples and your data structure.