Update the Microsoft Power BI report automatically when updating the database - sql-server

I'm generating some simple BI reports for a dashboard of KPIs in an Angular app. My question is: can Power BI update the data of the report automatically whenever I update the database? For the DB I'm using SQL Server.

This is what you are looking for: Real-time streaming in Power BI.
There is one more way you can do it: Data refresh in Power BI.
I would put my two cents on data refresh.
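If you go the data refresh route, here is a minimal sketch of triggering a dataset refresh on demand through the Power BI REST API, assuming the report is already published to the service and you have an Azure AD access token for the API (e.g. obtained via MSAL); the dataset ID and token below are placeholders.

    import requests

    # Placeholders: supply your own dataset ID and an Azure AD access token
    # with dataset read/write permissions for the Power BI API.
    DATASET_ID = "<your-dataset-id>"
    ACCESS_TOKEN = "<your-access-token>"

    # Datasets - Refresh Dataset: queues a refresh of the published dataset.
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes"
    resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})

    # The service answers 202 Accepted once the refresh has been queued.
    resp.raise_for_status()
    print("Refresh queued:", resp.status_code)

You can call this from any scheduler, or from your app's backend right after you finish writing to SQL Server.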

You can use a Python script to automate this. I have used this project before and it works: https://github.com/dubravcik/pbixrefresher-python
You will need to convert the .py file into an .exe, run it with Task Scheduler, and set your preferred execution rate.
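As an illustration of the scheduling step, here is a sketch that registers the compiled .exe as a daily Windows scheduled task from Python; the task name and both paths are hypothetical, and the exact command-line arguments the refresher expects should be checked against the project's README.

    import subprocess

    # Hypothetical paths to the compiled refresher and the report to refresh.
    exe = r"C:\tools\pbixrefresher.exe"
    pbix = r"C:\reports\dashboard.pbix"

    # schtasks ships with Windows; this creates a task named RefreshDashboard
    # that runs the refresher against the report every day at 02:00.
    subprocess.run([
        "schtasks", "/Create",
        "/TN", "RefreshDashboard",   # task name
        "/TR", f'"{exe}" "{pbix}"',  # command the task runs
        "/SC", "DAILY",              # daily schedule
        "/ST", "02:00",              # start time
        "/F",                        # overwrite the task if it exists
    ], check=True)

You can of course create the same task by hand in the Task Scheduler UI instead.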


How can I get data in Power BI from other Power BI files

I created a large .pbix file on my local drive, trying to extract information from several hundred webpages and combine them. Like this:
myfile.pbix
    query_001
    query_002
    ...
    query_904
    combined_query
But it takes hours to open and refresh. So I grouped the queries in Power Query and put them in small individual .pbix files on the local drive, as below. These files contain only data, with no models or reports.
myfile1.pbix
    query_001
    query_002
myfile2.pbix
    query_003
    query_004
...
myfile44.pbix
    query_903
    query_904
How can I get them together again? I cannot find a function like "import data from .pbix files". Do I have to publish them first? Thanks.
I was in your shoes once, and I know how slow Power Query refresh can be. Power Query is awesome, but it may not be suitable for complex applications. However, you can try the following fixes and see how they work in your case.
A. If you use Power BI Service, you can create a dataflow using Power Query and have it refreshed at a scheduled time. You can then query those dataflows from Power BI Desktop, create the model, and solve the business problem. By using dataflows in a subsequent, separate .pbix, you are dividing and distributing the transformation and modelling. Dataflows are backed by Azure Data Lake, and if the gateways are configured correctly and memory is allocated appropriately, refreshes are much faster than in Power BI Desktop.
B. The second option is to use SSIS, which has a Power Query connector (you can use something else inside SSIS too, not necessarily Power Query). Whatever Power Query you have written to query the data source, you can wrap inside an SSIS package, schedule it, and have the ETL result pushed to a SQL Server. You can then build the .pbix model by querying only the SQL tables. The benefit is that if you need to apply further transformations, you can write native SQL queries, which are blazing fast compared to Power Query. In this way you turn ELT-plus-modelling into ETL followed by modelling: the ETL and the modelling run separately rather than in the same package, hence much faster.
C. If you use Power BI Service, you can publish each of the data sources to the service and then simply query the Power BI dataset for modelling. If you want, you can also use Power BI's awesome new composite modelling.

How to change/update dataset source file?

I'm searching for a solution to change a dataset's source data in Power BI. I want a scriptable way, using C#, to update datasets and make them use another specific local file via the Power BI REST API. Any help?
Don't do this! The Power BI REST API works with Power BI Service, which is in the cloud, and it will be a challenging task to give it access to files on your local drive!
Your problem (as asked in your other question) is that you are skipping layers from the recommended data warehouse architecture and trying to combine them all in Power BI alone. Just put these files somewhere in the cloud (Azure Data Lake, Azure SQL Database, OneDrive for Business, etc.) and use them as a data source for your reports. You can also put them in an on-premises database and use the Power BI Gateway to connect to it from Power BI Service. See Get data from Comma Separated Value (.CSV) files. This will give you the flexibility to push data the way you want, and even to clean up historical data that you no longer need.
If you need to be able to programmatically switch the data source for your report, you can define it with parameters (e.g. ServerName and DatabaseName) and change their values using the REST API. See Power BI Introduction: Working with Parameters in Power BI Desktop - Part 4 for more details on how to implement connection-specific parameters.
To answer this: there is indeed a way, and it's by setting up the file source as a parameter in Power Query.
All I do afterwards is use the API to update this parameter, as in the sketch below.
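As a sketch of that approach (in Python here; the same two REST calls are just as easy from C# with HttpClient), assuming the dataset defines a Power Query parameter, hypothetically named SourceFilePath, whose value you want to switch; the dataset ID, token, and new value are placeholders.

    import requests

    # Placeholders: your dataset ID and an Azure AD access token.
    DATASET_ID = "<your-dataset-id>"
    ACCESS_TOKEN = "<your-access-token>"
    HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    BASE = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}"

    # Datasets - Update Parameters: point the hypothetical SourceFilePath
    # parameter at the new source file.
    requests.post(
        f"{BASE}/Default.UpdateParameters",
        json={"updateDetails": [
            {"name": "SourceFilePath",
             "newValue": "https://contoso.example.com/data/new.csv"},
        ]},
        headers=HEADERS,
    ).raise_for_status()

    # Queue a refresh so the report actually loads from the new source.
    requests.post(f"{BASE}/refreshes", headers=HEADERS).raise_for_status()

Note that the parameter must already exist in the published dataset, and (as the other answer points out) the file itself must be reachable from the service or through a gateway.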

How many times does an on-premises Data Gateway used in multiple reports connect to the same data source?

So, I am pretty new at using Power BI, and I'm having some issues when building reports that use the on-premises Data Gateway. My goal is to build different reports in Power BI (let's say 4 reports, as an example), all of them getting information from the same web database. The problem is that every time someone accesses this online database, my company has to pay a small amount of money.
The reports refresh once per day, exactly at 2 AM, using the on-premises Data Gateway. My point is: will the on-premises Data Gateway access this database once per day, at 2 AM, and refresh all the reports, or will it access it once for every report (meaning 4 accesses at 2 AM)?
If something is unclear, just let me know and I'll try to provide more information. Thanks!
It will follow the "4 accesses at 2 AM" scenario.
I would design one master dataset report that extracts all the required data from the source database. Once it is published (to your workspace or a group workspace), I would use that dataset as the source for the other 3 reports: from Power BI Desktop, go to Get Data / Power BI service.
Then you only have 1 dataset to refresh via the gateway. Another benefit is that you only have one set of queries & modelling to maintain.
It depends on how you have created the reports. If the dataset is exactly the same, you should publish only one report and create the other 3 desktop files connecting to Power BI Service as the dataset. That way you have only one dataset for four reports, and it will refresh only once a day.
Regards

Catching Power BI queries

How can I catch Power BI queries in Import mode? For DirectQuery I can just connect Profiler to the server, but for Import I have only the model.
I need it to check how some complicated reports work.
You could still connect (SQL Server?) Profiler to your server, but manually trigger the refresh - either using Power BI Desktop or from the dataset's menu in app.powerbi.com (if using a gateway).
Note it might not give you the full picture, as the Edit Queries capability can rival the most complex ETL processes - most of that won't hit your database. Probably the real answer to your second paragraph is that you need to open the source .pbix file using Power BI Desktop.

Power BI (cloud) + SSAS Cubes

1) Can Power BI (online/cloud-based) use our local SSAS cube directly as a data source?
2) If no, and I assume it is no, then can we upload our SSAS cube(s) to be used as a data source, and how do we do that, preferably incrementally (if it is possible to do that incrementally)?
3) If SSAS cubes cannot be used, then I assume that we have to use data built into the SSAS Tabular Model, and use DAX to query it?
4) If this is true, then how do we send data there? Do we have to define the tabular model locally and ship the stored results (since the tabular model is in-memory, I'm not sure that even makes sense), or do we send the constituent tables to the cloud and build the tabular model structures there?
5) If I build this in an SSIS package (which I gather I do), is it an SSIS package that is built and maintained locally (meaning on our existing database, running MSSQL 2012 w/ Analysis Services, the way our existing SSIS packages are), or is it built and maintained in the Power BI Online environment in the cloud?
We're looking at using the PowerBI Preview to deploy dashboards and scorecards based on data that we collect on-premises. I'm assuming that we'd use that OData plugin to make data available in the cloud, for starters...?
edit: thanks for reading!
Regarding 1): Power BI for O365 cannot as yet connect to an SSAS tabular or multidimensional model directly as a source. However, the new Power BI Preview (released December 14) does allow a direct connection to an SSAS tabular model with the new connector. So it is very likely that Microsoft will soon release a similar connector for SSAS multidimensional, as they did for Power View on SharePoint, first releasing tabular and then multidimensional.
2): There is no direct connection to user-maintained SSAS multidimensional models, no matter where they are stored. There is a direct connection to tabular models with the Power BI Preview, whether the tabular model is on-premises or in Azure.
3), 4) & 5): If the purpose is to "deploy dashboards and scorecards based on data that we collect on-premises" then take a close look at Power Query and Power Pivot and at the Power BI Designer. Also take a look at the developer tools that are available here: https://msdn.microsoft.com/powerbi/
This is an old question, and things have changed between then and now. In today's context, using the Power BI Gateway and Power BI Desktop, you can connect your on-premises SSAS cubes to the Power BI cloud and schedule your cubes to be refreshed automatically. Earlier only the tabular model was supported, but as of now both tabular and multidimensional SSAS cubes are. The only thing not yet available is a live connection to a multidimensional cube, but you can schedule it to be auto-refreshed. With SSAS Tabular, however, you can live connect to Power BI.
We do this for a number of customers... you first need to consider the volume of data in your models, as this will dictate your requirement for using Analysis Services (Tabular - as quite rightly pointed out, Multidimensional is not an option, since the in-memory model of SSAS Tabular is what suits Power BI).
Then you need to think about how the dashboard will be updated and how automated this process can be made using Power Query + Power Pivot; using your OneDrive for Business could be an option to make life simpler, should you not be able to use the SSAS Connector in the preview.
Finally, depending on your source, OData and a few other connections can be refreshed automatically on a schedule (CRM, for example)...
Hope that helps.
