- CRUD operation UI (create/update/delete)
- Workflow Management
- Dashboard for analysis
- Reports for CRUD and crucial data
- Reports in formats such as CSV, PDF, etc.
- Roles and Permissions
I am looking to pull site usage data from SharePoint, such as daily users, click-through rate, which parts of the site are used the most, which links are used the most, and which documents are opened the most. Is there a way to do this through Excel or some other program? I have been looking at Power BI, Excel, Power Query, etc., but I haven't found a way to pull the data from SharePoint analytics itself.
I am looking to pull data from the SharePoint site and display it as a chart, for example a Pareto chart.
1. What the site usage analytics page can currently export:
Site owners can export the 90-day site usage data to an Excel file via the download button in the upper-right corner of the site usage page. The report covers unique viewers, site visits, popular platforms, and site traffic. For popular content on the site (news posts, documents, and pages), the report covers only the last 7 days.
2. I've also tried getting data from the web in Excel, but it doesn't work. There is currently only one connector between site usage and Power BI.
I tracked down a published post on UserVoice, "Export to Excel on Site usage". You can vote and comment on it there.
3. You might try using the Office 365 Management Activity API to retrieve the data, store it in a database, and then report on it with Power BI. This requires registering an application in Azure AD and granting it permissions to the API.
Reference: Office 365 Management Activity API reference
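For illustration, here is a minimal Python sketch of that flow, assuming an Azure AD app registration that has already been granted the ActivityFeed.Read permission; the tenant/client values are placeholders, and printing the events stands in for storing them in your database:

```python
# Minimal sketch (not production code): pull SharePoint audit events from
# the Office 365 Management Activity API using an app-only Azure AD token.
# TENANT_ID, CLIENT_ID and CLIENT_SECRET are placeholders you must supply.
import requests

TENANT_ID = "your-tenant-guid"
CLIENT_ID = "your-app-client-id"
CLIENT_SECRET = "your-app-client-secret"

# 1. Acquire an app-only token for the manage.office.com resource.
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": "https://manage.office.com",
    },
).json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}

base = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"

# 2. One-time setup: subscribe to the SharePoint audit content type.
requests.post(f"{base}/subscriptions/start?contentType=Audit.SharePoint",
              headers=headers)

# 3. List the available content blobs, then fetch the events in each blob.
blobs = requests.get(
    f"{base}/subscriptions/content?contentType=Audit.SharePoint",
    headers=headers,
).json()
for blob in blobs:
    for event in requests.get(blob["contentUri"], headers=headers).json():
        # Store these rows in your database instead of printing them,
        # then point Power BI at that database.
        print(event["CreationTime"], event["Operation"], event.get("ObjectId"))
```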
We are a SaaS product and we would like to have per-user data exports that can be used with various analytical (BI) tools like Tableau or Power BI. Instead of managing all those exports manually, we thought of using a cloud database such as AWS Redshift (which would be part of our service). But then it is not clear how users would access those databases naturally, unless we do some kind of SSO integration with AWS.
So, what is the best practice for exporting data for analytics use in SaaS products?
In this case you can build your security into your backend API layer.
First, set up processes to load your data into Redshift, then make sure that only your backend API server/cluster has access to Redshift (e.g. through a VPC with no external IP access to Redshift).
Now that you have your data, you can validate the user as usual through your backend service. When a user requests a download through the backend API, the backend can create a query that extracts from Redshift only the correct data, based on the user's security role. To make this possible you may need to build some kind of security column into your Redshift data model, as in the sketch below.
I am assuming getting data into Redshift is not a problem.
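As a rough sketch of that last point, the backend endpoint might look something like this; Flask and psycopg2 are just illustrative choices, and the `sales` table with its `tenant_id` security column is an invented example:

```python
# Illustrative sketch only: a backend endpoint that exports one user's
# slice of Redshift data as CSV. `tenant_id` plays the role of the
# "security column"; only this API server can reach Redshift (VPC-only).
import csv
import io

import psycopg2
from flask import Flask, Response

app = Flask(__name__)

def current_user():
    # Placeholder: validate the session/JWT as you normally would and
    # return the caller's identity, including their tenant/role.
    return {"id": 42, "tenant_id": 7}

@app.route("/export/sales")
def export_sales():
    user = current_user()
    conn = psycopg2.connect(host="redshift.internal", port=5439,
                            dbname="analytics", user="api_svc",
                            password="...")
    with conn, conn.cursor() as cur:
        # The filter is applied server-side; the client never controls it.
        cur.execute("SELECT order_id, amount, created_at FROM sales"
                    " WHERE tenant_id = %s", (user["tenant_id"],))
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["order_id", "amount", "created_at"])
        writer.writerows(cur.fetchall())
    return Response(buf.getvalue(), mimetype="text/csv")
```

The BI tool (Tableau, Power BI, etc.) then consumes this authenticated export endpoint instead of connecting to Redshift directly.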
What you are looking for, if I understand correctly, is an OEM solution.
The problem is how to mimic the security model you have in place for your SaaS offering.
That depends on how complex your security model is.
If it is as simple as authenticating the user, and the user then has access to all tenant data, or the data can easily be filtered per user, things are simple for you: trusted authentication will let you authenticate the user, and user filtering will let you show him only what he has access to.
But here is the kicker: if your security is really complex, then it can become really difficult to mimic it within these products.
For integrating Tableau, this link will help:
https://tableau.github.io/embedding-playbook/#
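To make the trusted-authentication idea concrete, here is a rough Python sketch of requesting a trusted ticket from Tableau Server. The server name and view path are placeholders, and it assumes your backend's IP address has been added to Tableau Server's trusted hosts list:

```python
# Rough sketch of Tableau Server trusted authentication: the backend
# requests a one-time ticket for the signed-in user, then builds the
# embed URL. Server name and workbook/view path are placeholders.
import requests

TABLEAU = "https://tableau.example.com"

def embed_url_for(username: str,
                  view_path: str = "SalesWorkbook/Overview") -> str:
    resp = requests.post(f"{TABLEAU}/trusted", data={"username": username})
    ticket = resp.text.strip()
    if ticket == "-1":
        # Tableau returns -1 when the host is not trusted or the user
        # does not exist on the server.
        raise PermissionError(f"Tableau refused a ticket for {username}")
    # The ticket is single-use and short-lived; redirect or embed this URL.
    return f"{TABLEAU}/trusted/{ticket}/views/{view_path}"
```

Combined with user filters (or row-level security) defined in the workbook, each user then sees only their own slice of the data.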
As for Power BI, I am not a fan of this product. I tried to embed a view in one of my applications, and data refresh was a big issue.
It's almost like they want you to be an Azure shop for real-time reporting. (I like GCP more.)
If you create the APIs and populate datasets yourself, they have crazy restrictions like 1 MB/sec, etc.
In other cases, datasets can be refreshed only 8 times a day.
I gave up on them.
Very recently I got a call from Sisense, and they seemed promising from an OEM perspective as well. You might want to try them.
I am in the process of creating a few dashboards for my organization. We currently use a few different cloud-based applications where we authenticate with our Google accounts.
Does Google Data Studio have the ability to secure access and/or filter content based on your Google account?
For example, if I create a dashboard with data for user(s)@domain.com, will I be able to set a filter in the data source or dashboard so that User A sees only the sales data they generated? My user population is over 5k, so individual reports are not an option.
Not natively. I'm not sure about your data sources, but in Google BigQuery you can create views which serve the same purpose:
row permissions in BigQuery
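As a sketch of that approach (the project, dataset, table, and `sales_rep_email` column names here are all invented), you can create a view that filters rows to the signed-in viewer with SESSION_USER(); for example, from Python:

```python
# Hypothetical sketch: create a BigQuery view that only returns the rows
# belonging to whoever is querying it, by filtering on SESSION_USER().
# All project/dataset/table/column names below are invented examples.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

view = bigquery.Table("my-project.reporting.sales_by_viewer")
view.view_query = """
    SELECT *
    FROM `my-project.raw.sales`
    WHERE sales_rep_email = SESSION_USER()
"""
client.create_table(view)
```

For the filter to apply per viewer in Data Studio, the data source has to be set to use the viewer's credentials rather than the owner's; otherwise every report viewer sees the owner's rows.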
The AUT is a web application. There are different types of users, like admin, consumer, etc.
The application is a utility-domain application. There is hardly any data that a user/human inputs into the system.
The data comes from electronic meters and gets stored in the database.
Our application only renders the data to consumers, admins, and other users, with different calculations applied, in the form of charts and graphs.
There are about 9-10 pages in the application.
My question is: how can I proceed with automation in this scenario?
I have got an OLAP cube on SSAS which holds all the operational history data.
I need a front-end tool which can show not only data from the cube but also data from sources like web crawls, Scribe, blogs, Facebook, etc.
Can I leverage any open-source software for building the required reports, so that they can be run in real time?
We have got OBIEE, but we need a platform which can help devise such reports through a search screen on a website, with parameters entered by the user.