I'm searching for a way to change a dataset's source data in Power BI. I want a scriptable way, using C#, to update datasets and point them at another specific local file via the Power BI REST API. Any help?
Don't do this! The Power BI REST API works with the Power BI Service, which is in the cloud, and it will be a challenge to give it access to files on your local drive!
Your problem (as asked in your other question) is that you are skipping layers of the recommended data warehouse architecture and trying to combine them all in Power BI alone. Just put these files somewhere in the cloud (Azure Data Lake, Azure SQL Database, OneDrive for Business, etc.) and use them as a data source for your reports. You can also put them in an on-premises database and use the Power BI Gateway to connect to it from the Power BI Service. See Get data from Comma Separated Value (.CSV) files. This will give you the flexibility to push data the way you want, even to clean up historical data that you don't need anymore.
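For example, if you go the Azure route, pushing the local file into cloud storage can itself be scripted. Here is a minimal sketch in Python using the azure-storage-blob package (my choice for brevity; the connection string, container, blob name and local path are placeholders you would replace with your own):

```python
# Minimal sketch: upload a local CSV to Azure Blob Storage so the Power BI
# Service can reach it without needing access to your local drive.
# The connection string, container, blob name and file path are placeholders.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
blob = service.get_blob_client(container="powerbi-sources", blob="sales/latest.csv")

with open("C:/data/sales_latest.csv", "rb") as f:
    blob.upload_blob(f, overwrite=True)  # replaces the previous extract
```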
If you need to be able to programmatically switch the data source for your report, then you can define it with parameters (e.g. ServerName and DatabaseName) and change their values using the REST API. See Power BI Introduction: Working with Parameters in Power BI Desktop - Part 4 for more details on how to implement connection-specific parameters.
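As a rough illustration of that parameter-based approach, here is a sketch using Python and the requests library (the question mentions C#, but the same two REST calls map directly onto HttpClient; the workspace ID, dataset ID, parameter names and the Azure AD access token are placeholders you must supply yourself, e.g. a token acquired with MSAL):

```python
# Sketch: update dataset parameters and trigger a refresh via the Power BI REST API.
# All IDs, parameter names/values and the access token are placeholders.
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"   # placeholder
GROUP_ID = "<workspace-id>"                # placeholder
DATASET_ID = "<dataset-id>"                # placeholder

base = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1. Point the connection parameters at the new source.
payload = {
    "updateDetails": [
        {"name": "ServerName", "newValue": "myserver.database.windows.net"},
        {"name": "DatabaseName", "newValue": "SalesDW"},
    ]
}
r = requests.post(f"{base}/Default.UpdateParameters", json=payload, headers=headers)
r.raise_for_status()

# 2. Refresh the dataset so the reports pick up the new source.
r = requests.post(f"{base}/refreshes", json={"notifyOption": "NoNotification"}, headers=headers)
r.raise_for_status()
```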
To answer this: there is indeed a way, and it's by setting up the file source as a parameter in the Power Query.
All I do afterwards is use the API to update this parameter.
I am trying to connect my Power BI report to my organization's SQL database. Currently we extract from the database via FTP, put the data into Excel, clean it, and then format it to the users' needs. You can imagine how tiresome this can be.
I just want to know if it will assist me in providing business reports to the business on time?
The answer is yes.
To get a better outcome, you need to research the business logic, the requirements, the KPIs, and how the visualizations should be presented. Once you finalize those things you can start working on the implementation.
Note: Try to identify visualizations that reflect the stakeholders' requirements, because Power BI dashboards are mainly for upper management; they need to make business decisions just by looking at the graphs.
They can export data as an Excel sheet, PowerPoint slide, or PDF as needed, so you also need to format those Excel exports as part of the implementation.
I have been tasked with creating an application, or using an existing one (Access, Excel, Power Apps), that allows users to read Snowflake data and also allows update, insert, and delete operations. I am pretty sure Excel, Access, and PowerApps are read-only. PowerApps would also run 10 bucks a month for an app that currently only needs to be used once a quarter.
I was hoping I could use ODBC, but it looks like that only reads, with no writeback. I do have the ability to use a SQL Server as a middleman. I thought I would use ADF to mirror the data being modified with truncate-and-loads to Snowflake, but if I could skip that link in the chain it would be preferable.
Thoughts?
There are a couple of tools that can help you and business users read and write back to Snowflake. Many users then use Streams and Tasks on the table that is updated to automate further processing on Snowflake.
Some examples:
Open-source Excelerator - Excel plug-in to write to Snowflake
Sigma Computing - a cloud-native, serverless Excel / BI tool
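If you end up scripting part of this yourself, here is a hedged sketch of what write-back looks like with the snowflake-connector-python package, which is roughly what such tools do under the hood (the account, credentials, table name and values are made up for illustration):

```python
# Sketch: programmatic write-back to Snowflake from Python.
# Account, credentials, warehouse/database/schema and table are placeholders;
# adapt to your own security setup (key-pair auth, SSO, etc.).
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",   # placeholder
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Insert the quarterly adjustments entered by business users
    # (autocommit is on by default in the Snowflake connector).
    cur.executemany(
        "INSERT INTO quarterly_adjustments (region, amount) VALUES (%s, %s)",
        [("EMEA", 1200.50), ("APAC", -300.00)],
    )
finally:
    conn.close()
```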
I've done some research without finding useful information on my question.
I'm working on a data warehouse project, and one of my customer's requirements is to use Power BI Pro for data visualisation.
What is not clear to me is whether Power BI, while loading data into its data model, would benefit from the indexing structures developed in SQL Server.
Thank you in advance for any recommendations/tips on this subject.
It somewhat depends on whether you are using a live connection.
Existing indexes may speed up data loading when using Power BI in import mode where the data source is a view, query, or stored procedure.
They will also be used in live mode when connecting to the above sources, and might be used when connecting directly to multiple tables.
As the comments state, if you are bringing entire tables into Power BI with import mode, then the existing indexes will not benefit you, and the internal SSAS instance that Power BI uses is a whole different kettle of fish.
One caveat is that columnstore indexes can be used to get around some of the data size limitations when dealing with the gateway as described here: https://community.powerbi.com/t5/Power-Query/Using-SQL-Server-with-Nonclustered-Columnstore-Index/td-p/563787, but that's not directly related to your question.
Indexes help with retrieval speed on the server end. How much they will help depends on the specifics of your situation. If you are doing a lot of data transformation and mashup in the Power BI query editor, indexes will only help where there is a step that selects rows from the SQL Server. They won't help with steps where the processing is being done on the Power BI end (such as merging with data from an Excel file, adding custom columns, or some forms of substituting values). However, since you mention a data warehouse rather than a simple database, I'm going to assume you're barely doing any transformation on the Power BI end, relying instead on the server end to do the heavy lifting. In that case indexes will definitely help speed things up, if they're done strategically.
There are some differences between Import mode and Connect Live mode.
Import mode:
Data import can be used against any data source type, and it can combine data from different sources. The current Power BI Service limitation on published file size is 1 GB.
When using import, the data is stored in the Power BI file/service. Therefore, there is no need to set up permissions on the data source side (a service account for the load is enough), and you can share data publicly or with people outside your organization. On the other hand, all the data is stored in Power BI. Full DAX expressions and full Power Query transformations are supported.
Connect live mode:
There are more limitations in place for a live connection. It doesn't work against all data sources (the current list can be seen here), and it cannot combine data from multiple sources.
You are also limited to just the one data source/database you selected; you can't combine data from multiple data sources anymore. If you are connected to a SQL Database, you can still create logical relationships between objects from that database, as well as measures and calculated columns. When you are connected to SQL Server Analysis Services, you are limited to just the report layout and can't even create calculated columns; currently you can only create measures. When using a live connection, users have to have access to the underlying data source, which means you can't share outside of your organization or publicly. Full DAX expressions are not supported, only report-level measures (to learn more about report-level measures, watch this great video from Patrick), and there are no Power Query transformations.
You can learn more: directquery-live-connection-or-import-data-tough-decision
I'm an entry-level data analyst for a medium-sized franchisor overseeing ~100+ franchise locations.
I'm looking to implement a BI tool (such as Power BI); however, we do not have direct access to any of our source data (no APIs or DB access), i.e. we must download our data in the form of reports from each of our 5+ IT platforms.
I'm currently using Excel and Power Query to convert these reports to a usable format for ad-hoc data analysis; however, this is not an ideal solution for future BI requirements and historical data analysis work. I'm not sure of the best method or tools to use to essentially create a new database from these reports (which are not always in a flat format).
Considering my situation, does anyone have any recommendations on any database platforms (i.e. MS Access, AWS Redshift, Azure, etc.) and/or ETL solutions so I could simply download the files and have them automatically "cleansed" and uploaded to a database?
Thank you
So, I am pretty new to using Power BI, and I'm having some issues when building reports that use the on-premises data gateway. My goal is to build different reports in Power BI (let's say 4 reports, as an example), all of them getting information from the same web database. The problem is that every time someone accesses this online database, my company has to pay a small amount of money.
The reports refresh once per day, at exactly 2 AM, using the on-premises data gateway. My point is: will the on-premises data gateway access this database once per day, at 2 AM, and refresh all the reports, or will it access it once for every report (meaning 4 accesses at 2 AM)?
If something is unclear, just let me know and I'll try to provide more information. Thanks!
It will follow the "4 accesses at 2 AM" scenario.
I would design one master dataset/report that extracts all the required data from the source database. Once published (to your workspace or a group workspace), I would use that dataset as the source for the other 3 reports. From Power BI Desktop, go to Get Data / Power BI service.
Then you only have 1 dataset to refresh via the gateway. Another benefit is that you only have one set of queries & modelling to maintain.
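If you want to verify how often that single shared dataset actually refreshes (and therefore how often the source is hit), one option is the REST API's refresh history endpoint. A minimal sketch in Python (the workspace ID, dataset ID and access token are placeholders):

```python
# Sketch: list the recent refreshes of the shared dataset to confirm it only
# refreshes once per night. Token and IDs are placeholders.
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"   # placeholder
GROUP_ID = "<workspace-id>"                # placeholder
DATASET_ID = "<shared-dataset-id>"         # placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes?$top=10")
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

for refresh in resp.json()["value"]:
    print(refresh["startTime"], refresh["status"])
```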
It depends on how you have created the reports. If the dataset is exactly the same, you should publish only one report and create the other 3 Desktop files connecting to the Power BI Service dataset. That way you have only one dataset for all four reports, and it will refresh only once a day.
Regards