I have a large dataset with incremental refresh and scheduled refresh set up.
In Power BI Desktop the refresh works pretty fast and I can publish the dataset to the service, but the scheduled refresh throws an error:
The operation was throttled by the Power BI Premium because of insufficient memory. Please try again later.
I don't know if it's a problem with dataset size or if I implemented incremental refresh wrong.
I set up the RangeStart and RangeEnd parameters, store 3 years of data, and refresh the last 5 days.
Could it be a problem with the size of the database tables rather than with Power BI?
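For reference, the filter step I set up is shaped roughly like this (I'm showing a SQL Server source as an example; server, table, and column names are all placeholders):

    // Incremental refresh filter: RangeStart/RangeEnd are the datetime parameters
    // that Power BI substitutes at refresh time. All names here are placeholders.
    let
        Source = Sql.Database("server", "database"),
        Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
        // Compare a datetime column with >= on one boundary and < on the other,
        // otherwise rows on the partition edges get duplicated or dropped.
        Filtered = Table.SelectRows(Fact, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
    in
        Filtered

If this filter folds into the source query, the service only fetches the 5-day window; if it doesn't fold, each partition can pull the full table, which would explain the memory pressure.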
I've built an Excel document that connects to an on-prem SQL Server and pulls data from a SQL database into Excel using the standard M query connection of "=Sql.Database("server", "database")".
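Spelled out as a full query, it's roughly this (the table name is a placeholder):

    let
        // Same connection as described above; "server" and "database" are placeholders.
        Source = Sql.Database("server", "database"),
        // Navigate to a single table. The slow part for the affected users is
        // establishing this connection, not downloading the rows afterwards.
        Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data]
    in
        Orders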
However, when some users refresh the data they are waiting up to 5 minutes. For me, it takes 15-45 seconds. The delay is always on the "connecting to datasource" part. Once the connection has been made, downloading the data is usually fast.
This discrepancy in time is repeatable even if we refresh the same data at the same time. I have been able to demonstrate a small improvement in their refresh time by giving them a higher-spec laptop (i7 vs i5, 16GB vs 8GB RAM), but I'm not sure the laptop spec would even cause the difference. The volume of data is very small - a few thousand rows at most for some of the queries. The data refresh speed doesn't correlate with network speed at all.
I have ticked "fast data load" on all queries. The credentials supplied in the data source settings are the users' Windows accounts, and the privacy setting is "Organizational".
Why is the data load so slow for some users? What can I try to improve it for them?
Our setup is SQL Server + SSAS (multidimensional OLAP) + Power BI on top of it.
A recent issue we noticed is that when a user loads a Power BI report, from time to time the aforementioned event occurs. This leads to a really long wait until the report loads (the event takes up to 45-60 seconds).
Our cube has ~20 dimensions and ~50 measures, 2-3 million rows in 3 partitions, MOLAP storage.
What can we do about it? How can we debug it? We don't have SSAS experts on board, and googling this event didn't help much. Where can we look for the causes of this behavior?
It turns out that in our case it was probably caused by the cache being dropped each time the cube was processed.
Our solution would be to create an SSIS package that runs certain DMV queries to repopulate the cache every time we process our cubes, so end users can use the warm cache instead of generating it themselves.
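To illustrate the kind of DMV query we mean (shown outside SSIS as a quick probe via the M connector; server and database names are placeholders, and any client that can talk to SSAS works):

    let
        // Issue a DMV query against the SSAS instance via the M connector.
        // $System.DISCOVER_OBJECT_MEMORY_USAGE is one of the built-in DMVs.
        Source = AnalysisServices.Database(
            "ssas-server",
            "SalesCube",
            [Query = "SELECT * FROM $System.DISCOVER_OBJECT_MEMORY_USAGE"]
        )
    in
        Source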
Has anyone successfully used the PBI incremental refresh with Snowflake as a data source? A full refresh of my dataset (without incremental refresh) takes approximately 20 minutes, but with incremental refresh turned on, the data refresh times out because it takes longer than 120 minutes. When looking at the query history in Snowflake, it looks like a 'SELECT *' query is being done again and again until it times out.
I've seen some posts that say 'query folding' is not supported by Snowflake, while others say it's partially supported.
Any clarity would be appreciated!
We also tried out multiple options to check whether incremental refresh could be enabled for the Snowflake + Power BI combination. Two things we used to verify the details were:
The query history in Snowflake for the query that was sent from Power BI
The Query Diagnostics feature in Power BI Desktop, which shows whether a native source query was generated
Both of these indicated that query folding was not happening, and hence incremental refresh was not working either. Another option we explored was whether we could leverage Power BI Dataflows for incremental refresh, but this also was not supported directly.
We are also planning to try one more "long cut" which might help us implement incremental refresh (a connector sketch follows the list):
Bring in Azure ADLS Gen2 storage between Power BI and Snowflake
Land the data that needs to be incrementally loaded into ADLS
Leverage Power BI Dataflows to do the incremental refresh for the Power BI datasets from ADLS
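For the last step, the dataflow would read from the lake with something like this (storage account and filesystem names are placeholders):

    let
        // List the files landed from Snowflake in the ADLS Gen2 filesystem.
        Source = AzureStorage.DataLake("https://mystorageacct.dfs.core.windows.net/snowflake-extracts")
    in
        Source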
Not sure how much this will suit you. All the best
Thanks,
Prasanth
As of Aug 2020, incremental refresh with Snowflake works in both Datasets and Dataflows. Verified with the Query History in Snowflake.
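For anyone verifying the same thing, a filter that folds against Snowflake is shaped roughly like this (account, warehouse, and object names are placeholders; ORDER_DATE is assumed to be a datetime column):

    let
        Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WAREHOUSE"),
        DB = Source{[Name = "MY_DB", Kind = "Database"]}[Data],
        Sch = DB{[Name = "PUBLIC", Kind = "Schema"]}[Data],
        Fact = Sch{[Name = "FACT_SALES", Kind = "Table"]}[Data],
        // When this folds, Snowflake's query history shows a WHERE clause on
        // ORDER_DATE instead of the repeated bare 'SELECT *' scans described above.
        Filtered = Table.SelectRows(Fact, each [ORDER_DATE] >= RangeStart and [ORDER_DATE] < RangeEnd)
    in
        Filtered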
We are in the process of creating a suite of SQL Server 2016 Reporting Services Mobile Reports for our company's Cloud offering to customers; however, we keep running into a situation where all the datasets expire after a certain time.
We have found that all the datasets on the server seem to stop working 30 days after they have been created, and an error message ("The data set could not be processed. There was a problem getting data from the Report Server Web Service.") is displayed.
To resolve this, all the datasets need to be opened manually and re-saved onto the server. As you can imagine, this isn't really a suitable solution for us, as we have a large number of reports and datasets for each customer.
After a bit of investigation, we have managed to pinpoint a "SnapshotData" table in the report server database which has an "ExpirationDate" column that seems to be linked to the issue.
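For anyone who wants to look at the same table, a query along these lines shows the expiry values (shown in the same M style as the rest of this page, though any SQL client works; the server name is a placeholder, and the column names are the ones we found):

    let
        // "ReportServer" is the default SSRS catalog database name.
        Source = Sql.Database(
            "reportserver-host",
            "ReportServer",
            [Query = "SELECT SnapshotDataID, ExpirationDate FROM dbo.SnapshotData"]
        )
    in
        Source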
Has anyone else come across this before, and could you please advise a possible solution to the datasets expiring? Why would the datasets have an expiration date on them anyway?
A dataset should not expire once it has been created.
In your scenario, did you create a cache for those datasets? Did anything change in the datasets?
You said the mobile report prompted a "dataset could not be processed" error. Please go to the dataset property pane and check whether it returns data successfully by clicking Load Data. If not, change to another account and try again.
Also, please check whether the account used to connect to the data source expired after 30 days, which might have caused the data retrieval to fail.
I have a form in MS Access that takes too long to save. It's a multi-user environment, and the time to save the form keeps increasing. There was some improvement when I moved all record/row sources to be set at runtime. However, when multiple users access the form, there's a lag of 2-3 minutes or more. There are about 15-20 users accessing the application.
There are about 40 to 45 textboxes/comboboxes on the form. The backend is SQL Server.
I have also tried rebuilding one of the indexes, which was about 58% fragmented.
What can I do to improve the performance of the app?
Change the data source for the form so that it isn't bound to the tables in SQL Server. Populate the form using pass-through queries, or set up a view in the SQL Server database.