Export CSV as portal user - Salesforce

I have a Salesforce report that I export to CSV using a URL like this: https://tapp0.salesforce.com/00OT00000014APi?export=1&enc=UTF-8&xf=csv, following this blog post.
This works fine and is very fast when I run it as a fully licensed user.
However, when I try to call the same report export as a Gold Partner portal user, I get an "Insufficient Privileges" error.
I have marked the report as deployed.
I have given all users access to the report folder.
The user does have the correct sharing rules and profile set up to view the data in the report.
Going to the report URL by itself works: https://tapp0.salesforce.com/00OT00000014APi
It only fails when I try to export to CSV.
I do realise I am using an unsupported internal API call, but I was wondering: is there any way portal users can export reports to CSV?

Not sure about through the UI, but this can be done through a SOQL query and portal users do have limited access to the API. Most, but not all, reports can be converted to SOQL queries to produce the same output.
For running your query, the easiest is probably to build a Visualforce page that is backed by an Apex controller that runs the query and outputs a CSV. Take a look at the contentType attribute on the apex:page tag. You can set things like application/vnd.ms-excel#contacts.xls to automatically export a data table to Excel. I didn't try, but it probably works with CSV too -- worst case is that you open in Excel first and save as CSV.
Also, if you don't mind portal users having to leave Salesforce to get their CSV, you might want to try Workbench, which is an app I built that allows for portal user login and helps you build the SOQL query for CSV export both through the SOAP and Bulk APIs.
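To make that concrete, here is a minimal sketch (not from the original answer) of the "equivalent SOQL query to CSV" idea, written in TypeScript against the REST query endpoint rather than Visualforce. It assumes you already have an access token and instance URL for the portal user; the API version, query, and field names are placeholders:
// Minimal sketch: run a SOQL query equivalent to the report and build the CSV yourself.
// Assumes an already-obtained access token and instance URL for the portal user;
// the API version, query, and field names below are placeholders.
interface QueryResult {
  totalSize: number;
  done: boolean;
  records: Array<Record<string, unknown>>;
}

async function exportSoqlAsCsv(
  instanceUrl: string,
  accessToken: string,
  soql: string
): Promise<string> {
  const res = await fetch(
    `${instanceUrl}/services/data/v52.0/query?q=${encodeURIComponent(soql)}`,
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  if (!res.ok) throw new Error(`Query failed: ${res.status}`);
  const result = (await res.json()) as QueryResult;

  // Drop the per-record "attributes" metadata Salesforce adds; keep the field values.
  const rows = result.records.map((rec) => {
    const copy: Record<string, unknown> = { ...rec };
    delete copy.attributes;
    return copy;
  });
  if (rows.length === 0) return "";

  // Build a simple CSV: header row from the field names, then one line per record.
  const headers = Object.keys(rows[0]);
  const escape = (v: unknown) => `"${String(v ?? "").replace(/"/g, '""')}"`;
  return [
    headers.join(","),
    ...rows.map((row) => headers.map((h) => escape(row[h])).join(",")),
  ].join("\n");
}

// Hypothetical usage:
// exportSoqlAsCsv(url, token, "SELECT Name, Email FROM Contact").then(console.log);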

OK, I found the problem.
You need to go to Salesforce > Setup > Manage Users > Profiles.
Then click Edit on the RS_PortalUser profile.
Check the boxes next to “Run Reports” and “Export Reports”.
Click Save.


Getting information from specific runs of Logic Apps

We have Logic Apps running in Azure.
We can query some details of past runs in Azure Log Analytics.
Log Analytics does not seem to contain any of the output from each task in the Logic App, even though I can see this in the Logic App run history.
Is there a way to query the data/payloads/output from each task in the Logic App?
Yes, you can get logs of Logic Apps. I followed the process below:
Firstly, open your Log Analytics workspace.
In the General tab, open Workspace summary.
Then click on Add.
Then click on Logic Apps Management (Preview).
Then click on Create.
Then fill in the details and click on Create.
Then open the solution you created.
Then I created a workflow in the Logic App and opened its Diagnostic settings.
Then fill in the required details, sending the diagnostic logs to your Log Analytics workspace.
Open your Log Analytics workspace again.
In the General tab, open Workspace summary; you will see the Logic Apps Management output there.
When you click on the chart you will get run information, and clicking a row of the Logic App table shows the details of that run.
Alternatively, in the workspace summary in Log Analytics, click on Logs and use the KQL query below to get the logs; you can export the results using the Export option:
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.LOGIC"
| where OperationName has "workflowRunCompleted"
You can also send an email of the payload (see this example SO thread) to check your runs, or you can create an alert on each run in the Logic App.
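If you would rather pull those run records programmatically instead of exporting from the portal, below is a small sketch using the @azure/monitor-query SDK to run the same KQL query against the workspace. The workspace ID, the 7-day lookback, and the credential setup are assumptions, not something from the steps above:
// Sketch: run the same KQL query against the workspace from code, using the
// @azure/monitor-query and @azure/identity packages. The workspace ID and the
// 7-day lookback are placeholders.
import { DefaultAzureCredential } from "@azure/identity";
import { LogsQueryClient, LogsQueryResultStatus } from "@azure/monitor-query";

const workspaceId = "<your-log-analytics-workspace-id>"; // placeholder

async function getLogicAppRuns(): Promise<void> {
  const client = new LogsQueryClient(new DefaultAzureCredential());

  const kql = `
    AzureDiagnostics
    | where ResourceProvider == "MICROSOFT.LOGIC"
    | where OperationName has "workflowRunCompleted"
  `;

  // ISO-8601 duration: look back over the last 7 days.
  const result = await client.queryWorkspace(workspaceId, kql, { duration: "P7D" });

  if (result.status === LogsQueryResultStatus.Success) {
    for (const table of result.tables) {
      console.log(table.columnDescriptors.map((c) => c.name).join(", "));
      for (const row of table.rows) {
        console.log(row.join(", "));
      }
    }
  } else {
    console.error("Query only partially succeeded:", result.partialError);
  }
}

getLogicAppRuns().catch(console.error);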

How to integrate a React application with FileMaker Server as the backend

My job has tasked me with creating a login page with username, password, reset password, sign in and sign up. When you log in you are sent to a landing page. All of this should be done with React, but I have to use FileMaker Server as a backend that stores usernames and passwords in a database. I've never used FileMaker before and I don't know where to start. Any advice will be appreciated.
Use the FileMaker Data API:
https://help.claris.com/en/data-api-guide/
Make a login call to get the token.
Make a POST request to
{yourServerUrl}/fmi/data/version/databases/{database-name}/layouts/{layout-name}/records
to write the new record into the database.
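As a concrete illustration of those two calls, here is a minimal TypeScript fetch sketch. The server URL, database, layout, account, and field names are placeholders, and the version segment of the URL (v1 or vLatest) depends on your server:
// Minimal sketch of the login + create-record calls from a React/TypeScript app.
// The server URL, database, layout, account, and field names are placeholders;
// adjust the version segment (v1 / vLatest) to match your FileMaker Server.
const serverUrl = "https://your-filemaker-server.example.com"; // placeholder
const database = "YourDatabase";                               // placeholder
const layout = "Users";                                        // placeholder

// 1. Log in to the Data API to get a session token (Basic auth with a FileMaker account).
async function login(account: string, password: string): Promise<string> {
  const res = await fetch(`${serverUrl}/fmi/data/vLatest/databases/${database}/sessions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Basic ${btoa(`${account}:${password}`)}`,
    },
    body: "{}",
  });
  if (!res.ok) throw new Error(`Login failed: ${res.status}`);
  const json = await res.json();
  return json.response.token;
}

// 2. Write a new record into the database through a layout (Bearer auth with the token).
async function createRecord(token: string, fieldData: Record<string, string>): Promise<void> {
  const res = await fetch(
    `${serverUrl}/fmi/data/vLatest/databases/${database}/layouts/${layout}/records`,
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${token}`,
      },
      body: JSON.stringify({ fieldData }),
    }
  );
  if (!res.ok) throw new Error(`Create failed: ${res.status}`);
}

// Hypothetical usage (field names are illustrative):
// const token = await login("webAccount", "secret");
// await createRecord(token, { Username: "alice", PasswordHash: "..." });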
Assuming your company is on a recent version of FileMaker (version 16 or later, I believe), FileMaker Server has a feature called the Data API that publishes all FileMaker data in a more standard manner than going through the trials of ODBC or the PHP API.
The requirements are essentially just enabling the checkbox in the Server Admin control panel.
That feature has tiered pricing associated with it, but it includes a base amount of requests as part of having a server license, and most likely you will find it to be the best avenue. If, for some reason, you can't use the Data API, going in via the PHP API, which produces a more verbose XML output and is tied to fields visible on layouts, is another avenue.

Is it possible to create private report filtering in a Data Studio embedded report?

I created a report in Data Studio and embedded it on my website. I activated the option "anyone with the link can view" so the report is visible to my website users.
But I need to show my website users different data depending on their user IDs, and more importantly, I don't want users to be able to see other users' data. If I used URL filtering, users could tamper with the URL and substitute another user ID to see that user's data.
Does anyone have a solution for this scenario?
In the Google documentation I saw an option to limit the report to users in my domain. I assume this would solve the issue, but I can't find how to restrict other domains.
Users are logged onto Google
If users of your website are already logged onto Google, use the Filter by email address guide from the Data Studio help center. This requires you to set up FILTER BY EMAIL and then have a field in your data that can be used directly as an email filter.
Users are not logged on to Google
If you want a solution where the users don't have to be logged onto Google, you will need to:
Create a Community Connector to pass the filtered data to your users. The connector should accept a short lived token as part of the config.
Create a dashboard with your connector and pass unique short-lived tokens for each user.
You should have an endpoint that returns the current user's data based on the token provided. Alternatively, the endpoint can return only the user's identity, and you can query a secondary data source with a service account, filtering for the user's identity (a sketch of such an endpoint follows after this answer).
Your connector should call your endpoint to fetch data only for the user/for the user's identity.
This official guide demonstrates how to implement this in more details.
Disclaimer: I work in the Data Studio team and wrote the above guide.
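As a rough sketch of the endpoint piece of that flow (the connector itself lives in Apps Script and simply forwards the token it received in its config), here is what a token-validating data endpoint could look like. Express, the in-memory token store, and the data shape are all assumptions for illustration:
// Sketch of a token-validating data endpoint, assuming an Express backend and an
// in-memory token store. The route, token format, and data shape are illustrative.
import express from "express";

const app = express();

// Short-lived tokens issued when a user opens the embedded report: token -> user + expiry.
const tokens = new Map<string, { userId: string; expiresAt: number }>();

// Stand-in for whatever query actually produces the per-user data.
function getDataForUser(userId: string): Array<{ date: string; amount: number }> {
  return [{ date: "2023-01-01", amount: 10 }];
}

app.get("/connector-data", (req, res) => {
  const token = req.query.token as string | undefined;
  const entry = token ? tokens.get(token) : undefined;

  // Reject missing, unknown, or expired tokens, so users cannot simply guess
  // another user's ID the way they could with plain URL filtering.
  if (!entry || entry.expiresAt < Date.now()) {
    res.status(401).json({ error: "invalid or expired token" });
    return;
  }

  res.json({ userId: entry.userId, rows: getDataForUser(entry.userId) });
});

app.listen(3000);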
The first option is to add 2 extra fields to your data source:
User_ID
Password
For example:
Data, User_ID, Password
$10,Daniel,123
$20,Alex,456
In your dashboard, you need to create two parameters:
User_ID_Parameter
Password_Parameter
Both parameters can have their default value set to null and should accept any values.
Then create a new calculated field:
CASE
WHEN REGEXP_MATCH(User_ID, User_ID_Parameter) AND REGEXP_MATCH(Password, Password_Parameter) THEN 1
ELSE 0
END
Then create a new filter on the chart that you want to hide:
Include rows where the above calculated field is equal to 1.
The second option is to use Data Studio's default row-level security.
The only caveat is that users need to sign in before they can view the report.

How do I access reports programmatically in Salesforce using Apex?

I'm trying to write an app on the Salesforce platform that can pull a list of contacts from a report and send them to a web service (say, to send them an email or SMS).
The only way I can seem to find to do this is to add the report results to a newly created campaign and then access that campaign. This seems like the long way around.
Every post I read online says you can't access reports through Apex; however, most or all of these posts were written before version 20 of the API was released last month, which introduced a new Report object. I can now programmatically access info about a report (such as the date it was last run), but I still can't seem to find a way to access the result data contained in that report.
Does anyone know if there's a way to do that?
After much research into it, I've discovered that the only way to do this at the moment is indeed to scrape the CSV document. I would guess that Conga etc. are using exactly this method.
We've been doing this for a while now, and it works. The only caveats are:
The Salesforce username / password / security token has to be shared with the connecting app. If the password changes (and by default it is changed every 30 days or so), the token also changes and must be re-entered.
You have to know the host of the account, which can be difficult to get right. For instance, while most European accounts would use emea.salesforce.com to access the CSV, our account uses na7 (North America 7) even though we're located in Ireland. I'm currently sending the page host to the app and parsing it to calculate the correct subdomain to use, but I think there has to be a better way to do this.
Salesforce really needs to sort this out by supplying an API call which allows custom report results to be exported on the fly and allowing us to use OAuth to connect to it. But of course, this is unlikely to happen.
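For what it's worth, here is a rough TypeScript sketch of what the CSV scrape looks like in practice, assuming you already have a valid session ID and the correct instance host (which, as noted above, is the hard part). Passing the session as a sid cookie is how this trick is commonly done, not something taken from this answer:
// Rough sketch of the CSV scrape, assuming you already have a valid session ID and
// the correct instance host (e.g. "na7.salesforce.com"). Passing the session as a
// "Cookie: sid=..." header is an assumption about how this unsupported export works.
async function fetchReportCsv(
  instanceHost: string, // e.g. "na7.salesforce.com" -- the hard-to-guess part
  sessionId: string,
  reportId: string      // 15/18-character report ID
): Promise<string> {
  const url = `https://${instanceHost}/${reportId}?export=1&enc=UTF-8&xf=csv`;
  const res = await fetch(url, { headers: { Cookie: `sid=${sessionId}` } });
  if (!res.ok) throw new Error(`Export failed: ${res.status}`);
  return res.text(); // the raw CSV body
}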
In the Salesforce Spring '11 update, it seems you can obtain more information about reports:
As stated in the API for Report and ReportType, you can access via Apex the fields used in the query by the report, by reading the "columns" field, as well as the field used to represent the filters, called "filter".
Iterating through these objects should allow you to build a string representing the same query as the report. After building that string you can run a dynamic query with a Database.query(...) call.
It seems to be a little messy, but it should work (not tested yet!).
As the header states, this works only with custom reports!
Just to clarify for fellow rookies who find this: when the question was asked, you could access your report data programmatically, but you had to use some hacky, error-prone methods.
This is all fixed; you can now access your reports via the API as of Winter '14.
Documentation here: http://www.salesforce.com/us/developer/docs/api_analytics/index.htm
Go to town on those custom dashboards etc. Cross-posted from the Salesforce Stack Exchange: https://salesforce.stackexchange.com/questions/337/can-report-data-be-accessed-programatically/
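To make the Winter '14 path concrete, here is a small TypeScript sketch that runs a report synchronously through the Analytics REST API. The access token, instance URL, and API version are assumed, and the factMap handling shown is for a simple tabular report:
// Sketch: run a report synchronously via the Analytics REST API (Winter '14+).
// Assumes an existing access token and instance URL; the API version is a placeholder
// and the factMap handling below is for a simple tabular report.
async function runReport(
  instanceUrl: string,
  accessToken: string,
  reportId: string
): Promise<string[][]> {
  const res = await fetch(
    `${instanceUrl}/services/data/v29.0/analytics/reports/${reportId}?includeDetails=true`,
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  if (!res.ok) throw new Error(`Report run failed: ${res.status}`);
  const report = await res.json();

  // For a tabular report the detail rows live under the "T!T" fact map key;
  // each row is a list of data cells with display labels and raw values.
  const rows = report.factMap["T!T"].rows as Array<{ dataCells: Array<{ label: string }> }>;
  return rows.map((row) => row.dataCells.map((cell) => cell.label));
}

// Hypothetical usage:
// runReport(url, token, "00OT00000014APi").then(console.table);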
But Conga (AppExtremes) do this in their QuickMerge product, where the user specifies the report ID, and the Apex script on the page runs the report to extract the results for a mail merge operation.
The v20.0 API added metadata about reports, but no way to actually run a report and obtain its results. If this is a standard report, or a report you've defined, you can work out the equivalent SOQL query for your report and run that, but if it's an end-user-defined report, there's no way to do this.

PerformancePoint dashboard permissions problem in MOSS

I have a PerformancePoint dashboard running in MOSS 2007 portal. The dashboard consists of one SSRS 2005 report, running in SharePoint Integrated mode.
NT Authority\Authenticated Users have read permissions to the report library containing the SSRS report, the dashboard, and the report library containing the dashboard.
Users that attempt to access the dashboard receive the following error message:
The permissions granted to user 'DOMAIN\firstname.lastname' are insufficient for performing this operation. (rsAccessDenied)
Users that then click on the direct link to the report in MOSS will see the report with no problem. Subsequent visits to the dashboard show the report with no problem.
The report is using a data source that is located one folder up from the report location. The report has been updated to point to the correct shared data source after deployment. Both the report and the data source have been published. The data source is using stored credentials, with a domain service account that has been set to Use as Windows credentials. This service account is serving other reports in other areas with no problem.
Edit:
OK, I've gotten a lot more information on this problem. The request is never actually being made to the data source. The user comes in to the dashboard and requests a report for the first time, using their Kerberos token to identify themselves. The report looks in the Report Server database, finds that they are not listed in the Users table, and generates this rsAccessDenied error. Once they view the report directly, their name is in this table and they never have the problem again.
Unfortunately, removing the user from the Users table in the RS database doesn't actually cause this error to happen again.
Everything I've read says that when you run a Report Server in MOSS integrated mode, all your permissions are handled at the MOSS report library level, and all authenticated users have permissions to the report library, as stated earlier. Any ideas?
It sounds like the dashboard page is not passing credentials to the report server, because you stated that if the user hits the report directly it works, and that going through the dashboard works afterwards. So does it stop working after a certain period, indicating a cached session?
I would look carefully at the properties on the dashboard in PerformancePoint to see if it is passing credentials or trying to use anonymous. I hope this isn't a case of double-hop and Kerberos :(
I'm not sure if this is actually the same issue, but you could double check that the data sources are set to "Approved", not "Pending". It's a stretch, but it's worth a shot.
I implemented a work-around:
created a fake report
on the page with my report, created a content editor web part consisting of the following:
<iframe style="display:none;" src="https://link/to/my/report.rdl"></iframe>
I tested with another user that was experiencing the error, and they are no longer experiencing the error with my new and improved page.
I know this is kludgy and might even be dependent on the loading order of the page. Therefore, I would really like to find out what's causing this issue so I can fix it for good.
Edit:
I don't want to accept my own answer, since it's just a work-around. If anyone can post anything relevant to the user name placed into the Report Server DB and how that relates to SharePoint Integrated mode, you'll get the bounty.
