Access a Visualforce page without a Salesforce account

I'd like to create a Visualforce page that inserts a record into the Salesforce Account object. However, I expect some of the page's users won't have Salesforce accounts. Can they still access it? If not, what alternatives to a Visualforce page could be used in this case? (Please don't consider Web-to-Lead forms.)

Yes, it's possible. Go read about Salesforce Sites. For a start:
http://wiki.developerforce.com/page/Websites
http://wiki.developerforce.com/page/An_Introduction_to_Force.com_Sites
(Of course it's also possible to write that page in, say, Java/.NET/PHP and use SOAP or REST integration to talk to Salesforce... but these two main links will keep the whole solution within SF, so there's no need to learn a new language, take on extra maintenance effort, etc.)
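If you did go the external-integration route, a minimal sketch of creating an Account record via the Salesforce REST API might look like this (assuming an OAuth access token has already been obtained; the instance URL, API version, and token below are placeholders):

    // Minimal sketch: insert an Account record through the Salesforce REST API
    // from an external app. The instance URL, API version and token are
    // placeholders -- obtaining the OAuth token is out of scope here.
    const instanceUrl = "https://yourInstance.my.salesforce.com"; // placeholder
    const accessToken = "<OAUTH_ACCESS_TOKEN>";                   // placeholder

    async function createAccount(name: string): Promise<string> {
      const res = await fetch(`${instanceUrl}/services/data/v52.0/sobjects/Account`, {
        method: "POST",
        headers: {
          Authorization: `Bearer ${accessToken}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ Name: name }),
      });
      if (!res.ok) {
        throw new Error(`Salesforce returned ${res.status}: ${await res.text()}`);
      }
      const result = await res.json();
      return result.id; // Id of the newly created record
    }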
Sites are VF pages that expose a bit of your company's data without the need to log in. You can use them to input data too; just remember that in theory anybody could learn the link and spam you (not too different from Web-to-Lead, inbound email handlers, etc.). You specify security in a way similar to Profiles, and the records will have "Created By = {site name} Guest User".
I don't think there's anything out of the box to restrict visibility; the pages are open to the whole world. So if you want something similar to login IP ranges (so that only sales reps from your office's network can enter data), you might have to write some logic in the controller.
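The shape of that logic is simple enough to sketch. This illustration just checks a caller's IPv4 address against allowed ranges; the ranges are hypothetical, and in a real Sites page the check would live in the Apex controller, which is also where you'd read the caller's IP:

    // Illustrative only: check whether a requester's IPv4 address falls inside
    // one of a set of allowed office ranges. The ranges are hypothetical.
    function ipToNumber(ip: string): number {
      return ip.split(".").reduce((acc, octet) => acc * 256 + Number(octet), 0);
    }

    const allowedRanges: Array<[string, string]> = [
      ["203.0.113.0", "203.0.113.255"], // hypothetical office network
    ];

    function isAllowedIp(clientIp: string): boolean {
      const n = ipToNumber(clientIp);
      return allowedRanges.some(
        ([start, end]) => n >= ipToNumber(start) && n <= ipToNumber(end)
      );
    }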

Related

When and how should I check what active organisation a user has?

I am building a hybrid mobile app using AngularJS and Ionic as the front end.
Each user belongs to an organisation. But it is possible to change which organisation a user belongs to on the server and in a different web application.
The user can do some things in the web app:
Get data about the organisation
Post, put and delete data about the organisation
Each of these requires an API call to get the relevant information.
Now my question is, when and how should I check which organisation the user belongs to?
Should I send an API call before every get, post, put and delete to check which organisation the user belongs to?
If yes, then what is a nice way to organize this organisation checking without having it tangle up all my other code?
It sounds like what you're really after is permissions: the user should only be able to read, edit, etc. the organization while they belong to it. That should be enforced server-side, for the following reasons (a minimal sketch follows the list):
It keeps the access control coupled to the operation, so the server can prevent disallowed reads/changes even if there's a bug in the client.
It stops malicious users from bypassing the membership check altogether, which they can do if the client is all that's enforcing the rules.
It avoids the API calls you're worried about that constantly need to recheck the user's membership, as well as the race conditions if membership changes between the check and the next call.
It handles both your Ionic client and your other web client, and lets you expand to more clients in the future, without each having to duplicate the checking logic.
Similarly, it lets you modify your permissioning logic in one place, for example if you wanted to differentiate users who can read the organization from admins who can edit it.
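To make that concrete, a server-side guard could look roughly like the following, assuming an Express-style backend. The route, the auth middleware, and the isMember lookup are hypothetical names, not something your stack necessarily provides:

    // Minimal sketch of a server-side guard. isMember is a placeholder for a
    // real membership lookup against your database.
    import express from "express";

    const app = express();
    app.use(express.json());

    // Placeholder: in reality this would query your memberships table.
    async function isMember(userId: string, orgId: string): Promise<boolean> {
      return true;
    }

    app.put("/organisations/:orgId", async (req, res) => {
      const userId = (req as any).userId; // assumed to be set by your auth middleware
      if (!(await isMember(userId, req.params.orgId))) {
        return res.status(403).json({ error: "Not a member of this organisation" });
      }
      // ...apply the update only after the membership check passes
      res.json({ ok: true });
    });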
Once the server is solid, there are only a few places you'll need to sync the user's memberships:
At app startup, unless you keep a cache from the last use and that's good enough.
On some schedule as they use the app, if memberships change frequently enough that you want to sync quickly. Perhaps whenever they visit their list of organizations.
When the user does something in the app to invalidate the cache, like join or leave an organization.
When an API call about an organization fails, because the user may no longer be a member (see the sketch below).
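On the client, that last point might be handled with a small wrapper like this (a sketch only; the endpoints, the cache variable, and fetchMemberships are hypothetical names):

    // Sketch: if an organisation call fails with 403, assume the membership may
    // have changed, drop the cached list and refetch it.
    let cachedMemberships: string[] | null = null;

    async function fetchMemberships(): Promise<string[]> {
      const res = await fetch("/api/me/organisations");
      return res.json();
    }

    async function callOrgApi(path: string, init?: RequestInit): Promise<Response> {
      const res = await fetch(path, init);
      if (res.status === 403) {
        cachedMemberships = await fetchMemberships(); // refresh the stale cache
      }
      return res;
    }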

Chatter posts in Sites pages?

Can Chatter posts be accessed in Sites pages? Both reading Chatter posts and adding comments from a Sites page that is using the guest site user account?
If you make an Apex controller that is declared without sharing, then yes, you can expose Chatter posts on a Sites page (although you might have to recreate the user experience; I'm not sure whether the default Visualforce components would work that way).
Adding comments is another matter though -- I doubt you'd succeed in getting the guest user to add a comment (although this is pure conjecture -- I haven't actually tried it myself, so who knows, it might work). In this case you'd probably have to write the controller such that it adds any comments as some existing regular user in the system.
Also, it's worth noting the existence of Chatter Answers, which is a way to expose a limited Chatter feed on Sites. If this maps to what you're trying to do, it might save you some time.

Evernote users in the application database

What's the best practice or the common way of keeping (or not keeping) Evernote users in your application's database?
Should I create my own membership system and create a connection to Evernote accounts?
Should I store Evernote user data (or only part of it) in my own app and let the user log in only with Evernote?
Summary: you must protect their data but how you protect it is up to you. Use the integer edam_userId to identify data.
I think the API License agreement covers protection in the terms:
you agree that when using the API you will not, directly or indirectly, take or enable another to take any of the following actions:...
1.8.4 circumvent or modify any Keys or other security mechanism employed by Evernote or the API;
If you cache people's data and your server-based app lacks the security to prevent people from looking at others' data, then I think you're pretty clearly violating that clause. I think it's quite elegantly written!
Couple that with the responsibility clause 1.2
You are fully responsible for all activities that occur using your Keys, regardless of whether such activities are undertaken by you or a third party.
So if you don't protect someone's cached data and another user is able to get at it, you're explicitly liable.
Having cleared up the question of your obligations to (as you'd expect) protect people's data, the next question is how you store it.
Clause 4.3 covers identifiers pretty directly, although it's a bit out of date now that we are all forced to use OAuth - there are no passwords ever entered into anything other than a web view. However, mobile or desktop client apps must provide a mechanism for the user to log out, which must completely remove the username and password from your application and its persistent storage.
For a web app, you can't even save the username: "If your Application runs as an Internet service on a multi-user server, you must not ask for, view, store or cache the sign-in name or password of Evernote user accounts."
The good news is that you can rely on the edam_userId value which comes back to you in the OAuth token credentials response, as discussed here.
When you look at the Data Model, you can see the unique id under the User, and going into the User struct, see the reassuring definition: "The unique numeric identifier for the account, which will not change for the lifetime of the account."
Thinking about the consequences: as you can't get the user id until you have logged into the service, if you want to provide a local login for people you will have to link your local credentials to the user id. That may irk some people if they have to enter a username twice, but it can't be helped.
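As an illustration of wiring that up: the token credentials response is a form-encoded string containing edam_userId, so a rough sketch of parsing it and keying a local record on that id could look like this (linkLocalAccount and the storage behind it are hypothetical):

    // Sketch: parse edam_userId out of Evernote's OAuth token credentials
    // response and key the local user record on the numeric id, never on the
    // Evernote username.
    function parseTokenResponse(body: string): { token: string; edamUserId: number } {
      const params = new URLSearchParams(body);
      return {
        token: params.get("oauth_token") ?? "",
        edamUserId: Number(params.get("edam_userId")),
      };
    }

    // Hypothetical persistence call: upsert a link between your local user and
    // the stable Evernote account id.
    async function linkLocalAccount(localUserId: string, edamUserId: number): Promise<void> {
      // e.g. INSERT INTO evernote_links (local_user_id, edam_user_id) ... ON CONFLICT ...
    }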
You can allow users to log in via OAuth. Here's a guide on how that process works.
But you'll probably also want to store a minimal amount of user data, at least a unique identifier, in your database so you can do things like create relationships between the user and their notebooks and tags. Refer to the Evernote data model for those relationships. If you're using Rails, this will also help you take advantage of Rails conventions.

Pointers for Custom Sitecore Analytics

I am trying to do some very basic analytics on an existing Sitecore site. All I need to find is basic behavior (page views, time on page) for logged-in users. For instance, I need to be able to see which pages a particular logged-in user has viewed, and how long he/she stayed on each page.
I am using Sitecore 6.4; is this possible?
If so, what is the preferred way to go about doing this?
The way that Sitecore's analytics work isn't quite what you might imagine. It's really not designed to report on specific users without some configuration on your part. Basically, you have to set up their username to be captured, which is probably easiest to do by assigning a tag to their session. From there, there are things like the session reports, and you can pull up all the sessions for that tag (username). This will probably take some time and some trial and error, but it is possible; it's just not an out-of-the-box report.

How best to screen scrape a password-protected site on behalf of a third party?

I want to write a program that analyzes your fantasy baseball team and notifies you of recommended actions, possibly multiple times per day. The problem is, you aren't playing fantasy baseball on my site; you're playing on Yahoo, or CBS, or ESPN, etc.
On the majority of these sites, fantasy teams and leagues are not public, so you must be logged in and a member of the league to see the teams in the league.
All that I need is the plain html for the team page on each of those sites to be sent to my server, where I can then parse and analyze the file and send user notifications.
The problem is that I need username/password combinations to easily get this data to my server when I need it, and I think there will be a lot of people who wouldn't want to entrust their Yahoo/ESPN/CBS password to me.
I have come up with several possible ways to solve this problem:
1. The most obvious way is to ask for their credentials for the site on which their team is hosted. Then I could just programmatically log in and request the data I need. I'm guessing a number of people would be comfortable giving me their credentials, and a number of them not so much.
2. Write a desktop client, which the user then downloads. The client would require their credentials, but it could then basically do exactly the same thing that the server-based version would do: log in, request the page, and send the page back to my server. The difference being that their password would never need to leave their desktop. Their computer would need to be on, and this program running, for this method to work.
3. Write browser add-ons that navigate to the page I need, use the cookie that is saved from a previous login to log in to the site, and send the page back to my server. This doesn't require my software to ever ask for their password, but if the cookie expires I am hosed, and I don't know much about browser add-ons besides.
I'm sure there are other options, but these are what I've come up with so far.
I have two questions:
1. What are the other possibilities for this type of task?
2. Am I over-estimating people's reluctance to give me their yahoo (for example) password? Is option (1) above the obvious choice?
It was suggested in the comments that I try Yahoo Pipes, and that looked like a promising suggestion, so I explored it a bit. Having now looked at it, I don't think that is an option. So, it looks like I'll be going with option 1.
This is a problem I grappled with a couple of years ago when I wanted to do the same thing. Our site is http://benchcoach.com and the options we were considering were the following:
Originally we considered getting the user's credentials and login details. We would then log in and scrape their league and team info. The problem there is that, after reading several of the various terms of service, this would definitely be violating the terms of service. On top of this, Yahoo! was definitely one of the sites we were considering, and their users have email (where we could get access to sensitive data) and Yahoo! Wallet. In addition, it would be pretty trivial for Yahoo/ESPN/CBS to block our programmatic logins by IP address.
The solution we settled on (not 100% happy, but it does seem to work) was asking our users to install a bookmarklet (like Delicious, Digg or Reddit use) which would post the current HTML page to our servers, where we could parse the data and load our database. If they were still logged into their Yahoo/ESPN/CBS account, we would direct them straight to the pages; otherwise, those sites would prompt for authentication. Clicking the bookmarklet once more would post the page to our servers.
The pros of this approach were that we never collected anyone's credentials, so any security concerns were alleviated. Secondly, it made it impossible for Yahoo/ESPN/CBS to block access to our service, since we never connected directly to their servers; rather, the user's browser posted the contents of the page to our server.
The problem with this is that it takes two clicks to post a page to our site. For head-to-head leagues, we needed 3-4 pages, so it would take our user 6-8 clicks to sync their league to our servers. We're still looking at options for this.
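For reference, the core of such a bookmarklet is tiny -- roughly the following, minified into a javascript: URL (the endpoint is hypothetical, and the receiving server has to accept the cross-origin post, or you'd submit via a dynamically created form instead):

    // Sketch of the bookmarklet body: post the current page's HTML to your own
    // server for parsing. The endpoint is hypothetical.
    (function () {
      fetch("https://example.com/api/import-page", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          url: location.href,
          html: document.documentElement.outerHTML,
        }),
      });
    })();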
One important note is that I ran into the product manager of the Yahoo Fantasy Football site at a conference a year ago. We talked about how we were getting the Yahoo data, and he confirmed that getting credentials would violate their TOS and they may stop us. While I don't think they would have, it would have made it hard to invest time and energy developing this only to have them block our site and piss off users by closing their accounts.
A potentially more complicated answer could be done with (for example) Yahoo Pipes.
Hypothetically, you create a pipe which prompts the user for their credentials and provides them with a URL which contains their scraped data. They enter this URL on your site, and never have to provide their credentials to you directly. Even better, for the security-conscious, it would be possible to examine what the pipe is actually doing before entering any information.
The downside would be increased complexity (as well as having to write and maintain the pipe). Having said that, you could provide a link directly to the published pipe from your site, to make things as easy as possible.
Option 1 is the obvious choice. People who trust your site will provide their details. There is no other way you can log in to another site while screen scraping.
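If you do go with option 1, the mechanics boil down to: log in with the user's credentials, keep the session cookie, then fetch the team page's HTML for parsing. A rough sketch only; the URLs and form field names are made up, every provider differs, and as the earlier answer notes this may violate their terms of service:

    // Rough sketch of option 1 using a runtime with built-in fetch (Node 18+).
    // The login URL, form fields and team-page URL are hypothetical.
    async function fetchTeamPageHtml(username: string, password: string): Promise<string> {
      const login = await fetch("https://fantasy.example.com/login", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: new URLSearchParams({ username, password }),
        redirect: "manual", // keep the Set-Cookie from the login response
      });
      const sessionCookie = login.headers.get("set-cookie") ?? "";

      const page = await fetch("https://fantasy.example.com/team/12345", {
        headers: { Cookie: sessionCookie },
      });
      return page.text();
    }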
