I'm trying to write an app on the Salesforce platform that can pull a list of contacts from a report and send them to a web service (say, to send them an email or SMS).
The only way I can find to do this is to add the report results to a newly created campaign and then access that campaign. This seems like the long way around.
Every post I read online says you can't access reports through Apex; however, most or all of these posts were written before version 20 of the API was released last month, which introduced a new Report object. I can now programmatically access info about a report (such as the date it was last run), but I still can't find a way to access the result data contained in that report.
Does anyone know if there's a way to do that?
After much research, I've discovered that the only way to do this at the moment is indeed to scrape the CSV document. I would guess that Conga etc. are using exactly this method.
We've been doing this for a while now, and it works. The only caveats are:
- The Salesforce username / password / security token has to be shared with the connecting app. If the password changes (and by default it is changed every 30 days or so), the token also changes and must be re-entered.
- You have to know the host of the account, which can be difficult to get right. For instance, while most European accounts would use emea.salesforce.com to access the CSV, our account uses na7 (North America 7) even though we're located in Ireland. I'm currently sending the page host to the app and parsing it to calculate the correct subdomain to use, but I think there has to be a better way.
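For what it's worth, the scrape itself is short. Here's a minimal sketch in Apex (roughly what the Conga-style tools appear to do), assuming the classic report-export servlet parameters (export=1&enc=UTF-8&xf=csv); the report Id is a hypothetical placeholder:

String reportId = '00O20000001AbCd'; // hypothetical report Id
PageReference exportPage = new PageReference(
    '/' + reportId + '?export=1&enc=UTF-8&xf=csv');
Blob raw = exportPage.getContent(); // runs the report and returns raw CSV
String csv = raw.toString();
// Parsing is up to you: split on newlines, handle quoted commas, and drop
// the footer rows Salesforce appends. Brittle, hence "scrape".
for (String line : csv.split('\n')) {
    System.debug(line);
}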
Salesforce really needs to sort this out by supplying an API call that allows custom report results to be exported on the fly, with OAuth to connect to it. But of course, this is unlikely to happen.
In the Salesforce Spring '11 update, it seems you can obtain more information about reports:
As stated in the API docs for Report and ReportType, you can access via Apex the fields used in the report's query by reading the field "columns", as well as the field used to represent the filters, called "filter".
Iterating through these objects should allow you to build a string representing the same query as the report. After building that string, you can run a dynamic query with a Database.query(...) call.
It seems a little messy, but it should work (NOT TESTED YET!).
As the header states, this works only with Custom Reports!
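A rough sketch of the idea, assuming you've already read the column and filter metadata as described above (the column list and filter string below are hypothetical placeholders):

// Hypothetical inputs, read from the Report / ReportType metadata as above.
List<String> columns = new List<String>{ 'Name', 'Email', 'Phone' };
String filter = 'MailingCountry = \'Ireland\'';

// Rebuild an equivalent SOQL query and run it dynamically.
String soql = 'SELECT ' + String.join(columns, ', ') + ' FROM Contact';
if (String.isNotBlank(filter)) {
    soql += ' WHERE ' + filter;
}
List<SObject> rows = Database.query(soql);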
Just to clarify for fellow rookies who will find this: when the question was asked, you could access your report data programmatically, but you had to use some hacky, error-prone methods.
This is all fixed: you can now access your reports via the API as of Winter '14.
Documentation here - http://www.salesforce.com/us/developer/docs/api_analytics/index.htm
Go to town on those custom dashboards etc. Cross posted from the Salesforce Stack Exchange - https://salesforce.stackexchange.com/questions/337/can-report-data-be-accessed-programatically/
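For example, a minimal Apex sketch against the new endpoint (the report Id is a hypothetical placeholder, and calling your own instance from Apex needs a Remote Site Setting):

// Run a report synchronously via the Analytics REST API (v29.0 / Winter '14).
String reportId = '00O20000001AbCd'; // hypothetical
HttpRequest req = new HttpRequest();
req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm()
    + '/services/data/v29.0/analytics/reports/' + reportId);
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
HttpResponse res = new Http().send(req);
// The JSON response's factMap holds the actual row and aggregate data.
Map<String, Object> report =
    (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
System.debug(report.get('factMap'));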
But Conga (AppExtremes) do this in their QuickMerge product, where the user specifies the report Id and the Apex script on the page runs the report to extract the results for a mail-merge operation.
The v20.0 API added metadata about reports, but no way to actually run a report and obtain the results. If it's a standard report, or a report you've defined, you can work out the equivalent SOQL query and run that; but if it's an end-user-defined report, there's no way to do this.
As you know, Salesforce is enforcing Enhanced Domains. I found the following in Salesforce Help:
- Custom components in your org must be evaluated in order to check whether they use domain names/static URLs
- Some embedded content stored in Salesforce might no longer appear
- Third-party applications can lose access to your data
- Single sign-on integrations can fail
However, I'm struggling to find out which particular Salesforce elements/configurations should be checked in order to detect potential gaps. Do you know exactly which areas can be affected and should be evaluated (like Apex code, email templates and so on)? Is there any guide on that?
Your biggest concern should be inbound integrations: things that log in over the REST/SOAP API, get a response with a session id back, ignore the "URL to use for all subsequent requests" it comes with, and just use a hardcoded URL, whether it's prod or a sandbox.
Look at this guy, a victim of either the enhanced domain change or the "disable API versions < 30" change: The requested resource no longer exists with rest PHP. Look at these guys, who had hardcoded URLs: how to solve python code error (TooManyRedirects: Exceeded 30 redirects), Salesforce API via postman error INVALID_SESSION_ID.
As for stuff inside Salesforce itself, the best approach would be to download the whole project with sfdx and run a text search for your domain name (and the site/community name, if you have these). Email templates that use merge fields for forgot-password links etc. should be fine, and merge fields with record links should be fine... But if you manually craft an email body in Apex, it might be a problem. A lot depends on how creative the developer was. If you find URL.getSalesforceBaseUrl().toExternalForm(), it should still work. If the URL is hardcoded / read from a custom setting / custom label / custom metadata, it might be more fun.
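To illustrate the difference (the record Id is a hypothetical placeholder):

Id recordId = '0012000000AbCdE'; // hypothetical
// Survives the domain change: resolves the org's base URL at runtime.
String safeLink = URL.getSalesforceBaseUrl().toExternalForm() + '/' + recordId;
// Breaks when the domain changes: hardcoded instance URL.
String fragileLink = 'https://na7.salesforce.com/' + recordId;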
If you have external apps that display pieces of Salesforce (embedded live chat? an iframe with an FAQ? CMS Connect?), the domain change might mean they need to be updated, both in terms of updating the URL and changing security rules (CSP, for example).
While looking at our snowflake.account_usage.login_history in order to identify users with outdated client drivers (using reported_client_type + reported_client_version), I came across this user_name that I did not recognize: WORKSHEETS_APP_USER.
It's not one of our users, so I'm wondering where it's coming from.
The client driver it's using is OTHER 1.1.5.
It's using OAUTH_ACCESS_TOKEN to authenticate (which is not an authentication method we use for Snowflake).
And it's using a ton of different IPs in the 10.4.* range.
It has a lot more logins during the week than during the weekend -- so probably a human(s).
I'm thinking it's probably related to the worksheets UI (either in Snowsight or in the old console).
If so, would there be any way to know who was the original user(s) behind this activity?
The first time Snowsight is accessed in an account, Snowflake creates an internal WORKSHEETS_APP_USER user to support the web interface. This user is used to cache query results in an internal stage in your account. For more information, see Getting Started With Snowsight.
https://docs.snowflake.com/en/sql-reference/account-usage/users.html#usage-notes
I am executing a search operation for people search using the Microsoft Graph endpoint - https://graph.microsoft.com/V1.0/users.
The question I have is: I am able to get all the textual data I need, but is there a way to get the photo for each returned user in a single call? If there are 10 users returned by the previous search, executing 10 different operations to get the photos based on each user's id would be a challenge.
It isn't possible to fetch both the user's data and their photo in a single call, since they are different content types (application/json vs image/jpeg).
Marc is spot on here. However, you should also check out the new batching feature (note this is still in /beta), which would allow you to get up to 5 photos in one request round-trip. See https://developer.microsoft.com/en-us/graph/docs/concepts/json_batching. We'd love to get your feedback on this.
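A quick sketch of the batch round-trip (the user ids and OAuth token are placeholders; any HTTP client can send the same JSON payload):

// Fetch several users' photos in one Graph round-trip via JSON batching
// ($batch was still on the /beta endpoint when this was written).
String accessToken = 'graph-oauth-token'; // placeholder
List<String> userIds = new List<String>{ 'id-1', 'id-2', 'id-3' }; // placeholders
List<Object> requests = new List<Object>();
for (Integer i = 0; i < userIds.size(); i++) {
    requests.add(new Map<String, Object>{
        'id' => String.valueOf(i + 1),
        'method' => 'GET',
        'url' => '/users/' + userIds[i] + '/photo/$value'
    });
}
HttpRequest req = new HttpRequest();
req.setEndpoint('https://graph.microsoft.com/beta/$batch');
req.setMethod('POST');
req.setHeader('Authorization', 'Bearer ' + accessToken);
req.setHeader('Content-Type', 'application/json');
req.setBody(JSON.serialize(new Map<String, Object>{ 'requests' => requests }));
HttpResponse res = new Http().send(req);
// Each entry in the returned "responses" array carries one photo,
// base64-encoded in its body.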
I'd like to create a Visualforce page that inserts a record into the Salesforce Account object. However, I expect some of the page's users won't have Salesforce user accounts. Can they still access it? If not, what alternatives to a Visualforce page can be used in this case? (Please don't consider Web-to-Lead forms.)
Thanks,
Yes, it's possible. Go read about Salesforce Sites. For a start:
http://wiki.developerforce.com/page/Websites
http://wiki.developerforce.com/page/An_Introduction_to_Force.com_Sites
(Of course it's also possible to write that page in, say, Java/.NET/PHP and integrate via SOAP or REST to talk to Salesforce... but these 2 main links will keep the whole solution within SF, so there's no need to learn a new language, take on extra maintenance effort, etc.)
Sites are VF pages that expose a bit of your company's data without requiring a login. You can use them to input data too; just remember that in theory anybody could learn the link and spam you (not too different from Web-to-Lead, inbound email handlers, etc.). You specify security in a way similar to profiles, and the records will have "Created By = {site name} Guest User".
I don't think there's anything out of the box to restrict visibility; Sites pages are open to the whole world. So if you want something similar to login IP ranges (so only sales reps from your office's network can enter data), you might have to write some logic in the controller, as in the sketch below.
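Something like this in the page controller, for instance (the header name should be verified for your org, and the IP range and AccessDenied page are made-up examples):

// Sketch of a controller-side IP check for a Sites page.
public PageReference checkAccess() {
    // On Force.com Sites the visitor's source IP is typically exposed in a
    // request header; verify the exact header name in your org.
    String ip = ApexPages.currentPage().getHeaders().get('X-Salesforce-SIP');
    if (ip == null || !ip.startsWith('195.46.')) { // hypothetical office range
        return Page.AccessDenied; // hypothetical "sorry" Visualforce page
    }
    return null; // stay on the page
}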
I'd like to set up a ColdFusion page that will pull the status updates from my own Facebook and Twitter accounts and put them in a SQL database along with their timestamps. Whenever I run this page, it should only grab information posted after the most recent timestamp already in the database.
I'm hoping this won't be too bad because all I'm interested in is status updates and their timestamps. Eventually I'd like to pull other things like images and such, but for a first test just status updates is fine. Does anyone have sample code and/or pointers that could assist me in this endeavor?
I'd prefer any information to relate to the current versions of the APIs (Twitter with OAuth and the Facebook Open Graph), if that matters. Some solutions I've seen involve creating a Twitter application and a Facebook application to interact with the APIs; is that necessary if all I want to do is access a subset of my own account information? Thanks in advance!
I would read the max(insertDate) from the database and, if the API allows it, only request updates since that date. Then insert those updates. The next time you run, you'll just need to get the max() of the last batch of updates before calling for the next batch.
You could run it every 5 minutes using a ColdFusion scheduled task.
You usually communicate with the API using <cfhttp />. One thing I always do is log every request and response, either in a text file or in a database. That can be invaluable when troubleshooting.
Hope that helps.
Use the cffeed tag to pull the RSS feeds from Twitter and Facebook. Retain the date of the last feed scan somewhere (an application variable or the database) and loop over the feed entries. Any entry older than the last scan is ignored; everything else gets committed. Make sure to wrap cffeed in a try/catch, as it will throw errors if the service is down (ahem, Twitter). As mentioned in other answers, set it up as a scheduled task.
<cffeed action="read" properties="feedMetadata" query="feedQuery"
source="http://search.twitter.com/search.atom?q=+from:mytwitteraccount" />
Different approach than what you're suggesting, but it worked for us. We had two live events where we asked people to post to a bespoke Facebook fan page, or to Twitter with a hashtag we endorsed for the event, in real time. Then we just fetched and parsed the RSS feeds of the FB page and the Twitter search results, extracting what was new, on a short interval... I think it was approximately every three minutes. CFFEED was a little error-prone and wonky; just doing a CFHTTP get of the RSS feeds and then processing the CFHTTP.filecontent struct item as XML worked fine.
.LAG