How can I capture the queries Power BI runs in Import mode? For DirectQuery I can just connect Profiler to the server, but for Import I only have the model.
I need it to check how some complicated reports work.
You could still connect (SQL Server?) Profiler to your server, but manually trigger the Refresh - either using Power BI Desktop or from the Dataset's menu in app.powerbi.com (if using a gateway).
Note it might not give you the full picture, as the Edit Queries capability can rival the most complex ETL processes - most of that won't hit your database. Probably the real answer to your second paragraph is that you need to open the source .pbix file using Power BI Desktop.
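One more thing that can help with poking around: a .pbix file is itself a ZIP archive, and the Power Query (M) definitions typically live inside its DataMashup part. A minimal sketch, using only the standard library, that lists the internal parts so you can see what the file contains (the archive layout can vary between Power BI Desktop versions):

```python
import zipfile

def list_pbix_parts(path: str) -> list:
    """Return the internal part names of a .pbix archive.

    A .pbix is an Open Packaging (ZIP) container; the M query
    definitions are usually embedded in the DataMashup part.
    """
    with zipfile.ZipFile(path) as z:
        return z.namelist()
```

The DataMashup part is a binary wrapper around yet another archive, so for serious inspection it is still easier to open the file in Power BI Desktop and read the queries in the Advanced Editor.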
I pull data down from an Azure-based SQL Server every now and then using the SQL Server Import and Export Wizard (which I already know isn't the best way to go about that... in the queue to change, but not for now). I was wondering if there is a way to add ApplicationIntent=ReadOnly somehow, like you would in a standard SQL Server connection string, so I hit one of the read-only secondary replicas instead of the heavily trafficked primary. I'm going to assume the answer is "no there is not, you are doing it wrong", but I thought I'd ask first.
Thanks.
Don't beat yourself up for using the Import/Export Wizard - for quick-and-dirty, one-off ETL work, it's a great tool. Under the hood, it uses SSIS. But by design it is a wizard, so it is not going to give you all of the knobs to turn.
But all is not lost! To answer your question about using read-only application intent, at the point right before you'd actually run the ETL through the wizard, it gives you the option to save the SSIS package rather than run it. Save it off and then you have the option of changing the connection however you'd like. This Q/A over on the DBA sibling site shows you how to get a read-only connection in SSIS.
If that all seems roundabout, the suggestion from @Stu in the comment above - just determining the name of the secondary replica and connecting to it directly - is a great one given the one-off nature of Import/Export Wizard work.
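For reference, whichever route you take, ApplicationIntent is just another keyword in the connection string. A hedged sketch of what that looks like if you ever script the pull instead (the driver name, server, and database below are placeholders; the commented line assumes pyodbc is installed):

```python
def readonly_connection_string(server: str, database: str) -> str:
    """Build an ODBC connection string that routes reads to a
    readable secondary via ApplicationIntent=ReadOnly."""
    return (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={server};DATABASE={database};"
        "Trusted_Connection=yes;"
        "ApplicationIntent=ReadOnly;"     # target a readable secondary
        "MultiSubnetFailover=yes;"        # recommended with AG listeners
    )

# e.g. pyodbc.connect(readonly_connection_string("myserver", "mydb"))
```

Note that read-only routing only kicks in when you connect through the availability group listener and routing is configured on the server side.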
I created a large .pbix file on my local drive, trying to extract information from several hundred webpages and combine them. Like this:
myfile.pbix
    query_001
    query_002
    ...
    query_904
    combined_query
But it takes hours to open and refresh. So I grouped the queries in Power Query and put them in small individual .pbix files on the local drive, as below. They contain only data, with no models or reports.
myfile1.pbix
    query_001
    query_002
myfile2.pbix
    query_003
    query_004
...
myfile44.pbix
    query_903
    query_904
How can I get them together again? I cannot find a function like "import data from .pbix files". Do I have to publish them first? Thanks.
I was in your shoes once and I know how slow Power Query refresh can be. Power Query is awesome, but it may not be suitable for complex applications. However, you can apply the following fixes and see how they work in your case.
A. If you use the Power BI service, you can create dataflows using Power Query and have them refreshed at a scheduled time. You can then query those dataflows from Power BI Desktop, create the model, and solve the business problem. By using dataflows in a subsequent, separate .pbix, you are dividing and distributing the transformation and the modelling. Dataflows are backed by Azure Data Lake, and if the gateways are configured correctly and memory allocation is done appropriately, refreshes are much faster than in Power BI Desktop.
B. The second option is to use SSIS. SSIS has a Power Query connector (you can use something else inside SSIS too, not necessarily Power Query). So whatever Power Query you have written against the data source, you can wrap inside an SSIS package, schedule it, and have the ETL result pushed to a SQL Server. You can schedule the SSIS job and then build the .pbix model by querying only the SQL tables. The benefit is that if you need to apply further transformations, you can write native SQL queries, which are blazing fast compared to Power Query. This way you move from ELT plus modelling in a single package to ETL with subsequent modelling; since ETL and modelling run separately rather than in the same package, the whole thing is much faster.
C. If you use the Power BI service, you can publish each of the data sources to the service and then simply query the Power BI datasets for modelling. If you want, you can also use Power BI's awesome new composite modelling.
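To illustrate the spirit of option B outside of SSIS: land the transformed rows in a SQL table once, then let the model query that table instead of re-running 900 web queries. A minimal sketch using the standard library, with SQLite standing in for the SQL Server target (the table and column names are made up for the example):

```python
import sqlite3

def land_etl_result(conn, table, rows):
    """Replace `table` with the ETL output rows; return the row count."""
    cur = conn.cursor()
    cur.execute(f"DROP TABLE IF EXISTS {table}")
    cur.execute(f"CREATE TABLE {table} (page_id INTEGER, value REAL)")
    cur.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)
    conn.commit()
    return cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

conn = sqlite3.connect(":memory:")  # stand-in for the SQL Server target
count = land_etl_result(conn, "staging_combined", [(1, 10.0), (2, 20.0)])
```

The point of the pattern is that the expensive extraction runs once on a schedule, and the report only ever reads the landed table.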
I'm generating some simple BI reports for a dashboard of KPIs in an Angular app. My question is: can Power BI update the data of the report automatically whenever I update the database? For the DB I'm using SQL Server.
This is what you are looking for: Real-time streaming in Power BI.
There is one more way you can do it: Data refresh in Power BI.
I would put my 2 cents on Data refresh.
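For the real-time streaming route, the mechanics are simple: you create a streaming dataset in the Power BI service and POST JSON rows to its push URL. A hedged sketch using only the standard library - the URL is a placeholder you'd copy from the dataset's API-info dialog, and the row shape is whatever fields you defined on the streaming dataset:

```python
import json
import urllib.request

# Placeholder - copy the real push URL from your streaming dataset.
PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<id>/rows?key=<key>"

def build_rows_payload(rows):
    """Serialise KPI rows into the JSON array the push endpoint expects."""
    return json.dumps(rows).encode("utf-8")

def push_rows(rows, url=PUSH_URL):
    """POST a batch of rows to the streaming dataset's push URL."""
    req = urllib.request.Request(
        url,
        data=build_rows_payload(rows),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)
```

With scheduled Data refresh, by contrast, you do nothing in code: Power BI re-imports from SQL Server on a schedule (via a gateway if the server isn't cloud-reachable), which is why it's usually the lower-effort choice.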
You can use a Python script to automate this. I have used this project before and it works: https://github.com/dubravcik/pbixrefresher-python
You will need to convert the .py file into an .exe file, run it from Task Scheduler, and set your preferred execution rate.
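A tiny wrapper like the one below is all Task Scheduler needs to invoke. This is a sketch, not the project's documented interface - the command name and arguments assume the pbixrefresher CLI is installed and on PATH, so check that repo's README for the exact invocation; the .pbix path is a placeholder:

```python
import subprocess
import sys

PBIX_PATH = r"C:\reports\myfile.pbix"  # placeholder path

def build_refresh_command(pbix_path):
    """Command line for the pbixrefresher CLI (assumed to be on PATH;
    see the project's README for the real argument list)."""
    return ["pbixrefresher", pbix_path]

if __name__ == "__main__":
    # Exit with the tool's return code so Task Scheduler records failures.
    sys.exit(subprocess.call(build_refresh_command(PBIX_PATH)))
```

Packaging it as an .exe (e.g. with PyInstaller) just saves you from needing a Python install on the scheduling machine.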
Can Lightswitch Be Used To Create A Web-Based Real-Time SQL Server Database Monitoring Application?
In other words, if I have one or more queries I run in SQL Server Management Studio's query tool to get various pieces of information, can I use Lightswitch to create an IE-based version that will execute the same queries against the same SQL database and re-execute them on some timed interval, so that I effectively have a real-time monitoring application, or live report, that shows the info I choose?
SQL Server Management Studio has a great tool called the Activity Monitor that, on a fixed interval (a value that can be changed by the user), re-queries a number of system views and other code so as to provide the user with a monitoring-style interface that is effectively a live report. It's live because it continually re-queries the data source without the user having to do anything.
For a long time I've been using pre-defined queries in SSMS's query tool to continually check on data I've defined (as opposed to system views created by someone at Microsoft), and I would love a way to do this without having to use SSMS, in a way that auto-executes the queries on a specific interval so I don't have to continually press F5.
If there is another solution aside from Lightswitch that can do this and doesn't cost an arm and a leg, I'd love to hear about it.
Thanks
You would need to attach the database to Lightswitch and recreate the queries there. Then create a screen to display the relevant data. But yes, Lightswitch can do what you want. You just need to implement a timer to refresh the screen on the interval you define. I do something similar in my Lightswitch app. I followed this guide:
http://lightswitchspecial.blogspot.in/2012/02/autorefresh-lightswitch-screen.html
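Outside of Lightswitch, that timer idea is only a few lines in any scripting language: run the query, show the result, sleep, repeat. A minimal sketch of the loop - `run_query` here is a stand-in callable you would replace with a real database call, and `iterations` is bounded only so the example terminates (a real monitor would loop forever):

```python
import time

def poll(run_query, interval_seconds, iterations):
    """Re-run `run_query` on a fixed interval, Activity Monitor-style.

    `run_query` is any zero-argument callable that fetches (and displays)
    the data; returns the collected results for inspection.
    """
    results = []
    for _ in range(iterations):
        results.append(run_query())
        time.sleep(interval_seconds)
    return results
```

The same shape works whether the display is a console print, a web page, or a dashboard tile; only the `run_query` body changes.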
I'm trying to pull data from an ODBC app into a SQL Server 2005 (Developer Edition) database on an hourly basis. When I run SSIS, the option to import all tables and views is grayed out, forcing you to write a query. How would I go about setting up an SSIS integration to update all 250-some tables on an hourly basis?
What kind of database is your ODBC data source pointing to? SSIS might not give you a GUI for selecting tables/views for all DB types.
Perhaps you could rephrase your question a little; I am not 100% sure what you are asking here. Are you trying to get data into SQL Server from an application via SSIS, with a Data Flow task using an ODBC connection to the application?
Anyhoo, the simple answer to the MS Access part of your question is "hell no" MS Access is never, ever the answer to anything ;-)
I would be inclined to find out why the tables and views are greyed out and fix that issue. (There is not enough info in this question to determine why they are greyed out.)
You might be better off using the Import and Export Wizard. Go into SQL Server Management Studio, right click on the Database you want to import the data into, and select Tasks -> Import Data. It will launch the wizard which will walk you through defining the import process.
At the end of the wizard you can choose to execute the import, and even save it as an SSIS package which you can tweak later.
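If you do end up scripting the 250-table copy instead, note that most ODBC drivers expose the table catalogue through the standard metadata calls, so you can enumerate the tables yourself even when the wizard's list is greyed out. A hedged sketch - the commented lines assume pyodbc and a working DSN, neither of which is guaranteed for every ODBC source:

```python
def user_table_names(table_rows):
    """Filter ODBC table-metadata rows down to plain user-table names.

    `table_rows` is the shape pyodbc's cursor.tables() yields: rows with
    a `table_name` and a `table_type` such as 'TABLE', 'VIEW', or
    'SYSTEM TABLE'.
    """
    return [r.table_name for r in table_rows if r.table_type == "TABLE"]

# import pyodbc
# cnxn = pyodbc.connect("DSN=MyOdbcApp")           # placeholder DSN
# for name in user_table_names(cnxn.cursor().tables()):
#     print(name)  # feed these into per-table SELECT * copies
```

From there, a loop issuing `SELECT * FROM <table>` per name gives you the same coverage the greyed-out table list would have.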