I have put my Elasticsearch server behind an Apache reverse proxy that provides basic authentication.
Authenticating against Apache directly from the browser works fine. However, when I use Kibana 3 to access the server, I receive authentication errors.
This is obviously because no auth headers are sent along with Kibana's Ajax calls.
I added the line below to elastic-angular-client.js in the Kibana vendor directory as a quick-and-dirty way to implement authentication, but for some reason it does not work.
$http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
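For reference, Base64Encode is not a built-in; assume a minimal helper along these lines (it just wraps window.btoa, which is enough for plain ASCII credentials):
// Hypothetical helper assumed by the snippets in this question:
// window.btoa Base64-encodes an ASCII string such as "user:Password".
function Base64Encode(str) {
    return window.btoa(str);
}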
What is the best approach and place to implement basic authentication in Kibana?
/*! elastic.js - v1.1.1 - 2013-05-24
* https://github.com/fullscale/elastic.js
* Copyright (c) 2013 FullScale Labs, LLC; Licensed MIT */
/*jshint browser:true */
/*global angular:true */
'use strict';
/*
Angular.js service wrapping the elastic.js API. This module can simply
be injected into your angular controllers.
*/
angular.module('elasticjs.service', [])
.factory('ejsResource', ['$http', function ($http) {
return function (config) {
var
// use existing ejs object if it exists
ejs = window.ejs || {},
/* results are returned as a promise */
promiseThen = function (httpPromise, successcb, errorcb) {
return httpPromise.then(function (response) {
(successcb || angular.noop)(response.data);
return response.data;
}, function (response) {
(errorcb || angular.noop)(response.data);
return response.data;
});
};
// check if we have a config object
// if not, we have the server url so
// we convert it to a config object
if (config !== Object(config)) {
config = {server: config};
}
// set url to empty string if it was not specified
if (config.server == null) {
config.server = '';
}
/* implement the elastic.js client interface for angular */
ejs.client = {
server: function (s) {
if (s == null) {
return config.server;
}
config.server = s;
return this;
},
post: function (path, data, successcb, errorcb) {
$http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
console.log($http.defaults.headers);
path = config.server + path;
var reqConfig = {url: path, data: data, method: 'POST'};
return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb);
},
get: function (path, data, successcb, errorcb) {
$http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
path = config.server + path;
// no body on get request, data will be request params
var reqConfig = {url: path, params: data, method: 'GET'};
return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb);
},
put: function (path, data, successcb, errorcb) {
$http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
path = config.server + path;
var reqConfig = {url: path, data: data, method: 'PUT'};
return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb);
},
del: function (path, data, successcb, errorcb) {
$http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
path = config.server + path;
var reqConfig = {url: path, data: data, method: 'DELETE'};
return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb);
},
head: function (path, data, successcb, errorcb) {
$http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
path = config.server + path;
// no body on HEAD request, data will be request params
var reqConfig = {url: path, params: data, method: 'HEAD'};
return $http(angular.extend(reqConfig, config))
.then(function (response) {
(successcb || angular.noop)(response.headers());
return response.headers();
}, function (response) {
(errorcb || angular.noop)(undefined);
return undefined;
});
}
};
return ejs;
};
}]);
UPDATE 1: I implemented Matt's suggestion. However, the server returns a strange response; it seems that the Authorization header is not being accepted. Could it have to do with the fact that I am running Kibana on port 81 and Elasticsearch on 8181?
OPTIONS /solar_vendor/_search HTTP/1.1
Host: 46.252.46.173:8181
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Origin: http://46.252.46.173:81
Access-Control-Request-Method: POST
Access-Control-Request-Headers: authorization,content-type
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
This is the response
HTTP/1.1 401 Authorization Required
Date: Fri, 08 Nov 2013 23:47:02 GMT
WWW-Authenticate: Basic realm="Username/Password"
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 346
Connection: close
Content-Type: text/html; charset=iso-8859-1
UPDATE 2: I updated all instances with the modified headers in these Kibana files:
root@localhost:/var/www/kibana# grep -r 'ejsResource(' .
./src/app/controllers/dash.js: $scope.ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}});
./src/app/services/querySrv.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}});
./src/app/services/filterSrv.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}});
./src/app/services/dashboard.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}});
And I modified my vhost conf for the reverse proxy like this:
<VirtualHost *:8181>
ProxyRequests Off
ProxyPass / http://127.0.0.1:9200/
ProxyPassReverse / http://127.0.0.1:9200/
<Location />
Order deny,allow
Allow from all
AuthType Basic
AuthName "Username/Password"
AuthUserFile /var/www/cake2.2.4/.htpasswd
Require valid-user
Header always set Access-Control-Allow-Methods "GET, POST, DELETE, OPTIONS, PUT"
Header always set Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization"
Header always set Access-Control-Allow-Credentials "true"
Header always set Cache-Control "max-age=0"
Header always set Access-Control-Allow-Origin *
</Location>
ErrorLog ${APACHE_LOG_DIR}/error.log
</VirtualHost>
Apache sends back the new response headers but the request header still seems to be wrong somewhere. Authentication just doesn't work.
Request Headers
OPTIONS /solar_vendor/_search HTTP/1.1
Host: 46.252.26.173:8181
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Origin: http://46.252.26.173:81
Access-Control-Request-Method: POST
Access-Control-Request-Headers: authorization,content-type
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Response Headers
HTTP/1.1 401 Authorization Required
Date: Sat, 09 Nov 2013 08:48:48 GMT
Access-Control-Allow-Methods: GET, POST, DELETE, OPTIONS, PUT
Access-Control-Allow-Headers: Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization
Access-Control-Allow-Credentials: true
Cache-Control: max-age=0
Access-Control-Allow-Origin: *
WWW-Authenticate: Basic realm="Username/Password"
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 346
Connection: close
Content-Type: text/html; charset=iso-8859-1
SOLUTION:
After doing some more research, I found out that this is definitely a configuration issue with regard to CORS. There are quite a few posts available on that topic, but it appears that in order to solve my problem I would have to make some very granular configuration changes in Apache and also make sure that the right things are sent from the browser.
So I reconsidered the strategy and found a much simpler solution: just modify the vhost reverse proxy config to serve the Elasticsearch server AND Kibana on the same HTTP port. This also adds better security for Kibana.
This is what I did:
<VirtualHost *:8181>
ProxyRequests Off
ProxyPass /bigdatadesk/ http://127.0.0.1:81/bigdatadesk/src/
ProxyPassReverse /bigdatadesk/ http://127.0.0.1:81/bigdatadesk/src/
ProxyPass / http://127.0.0.1:9200/
ProxyPassReverse / http://127.0.0.1:9200/
<Location />
Order deny,allow
Allow from all
AuthType Basic
AuthName "Username/Password"
AuthUserFile /var/www/.htpasswd
Require valid-user
</Location>
ErrorLog ${APACHE_LOG_DIR}/error.log
</VirtualHost>
Here is a perfect solution:
https://github.com/fangli/kibana-authentication-proxy
It supports not only a basic-auth Elasticsearch backend, but also Google OAuth and basic auth for the client.
If it works for you, please give the project a star, thanks.
In Kibana, replace the existing elastic-angular-client.js with the latest version, which can be found here. Then, in the Kibana code, replace all instances of:
$scope.ejs = ejsResource(config.elasticsearch);
with
$scope.ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'accept, origin, authorization', 'Authorization': 'Basic ' + Base64Encode('user:Password')}});
That should be all you need.
Update:
Is Apache configured for CORS? See this.
Header always set Access-Control-Allow-Methods "GET, POST, DELETE, OPTIONS, PUT"
Header always set Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization"
Header always set Access-Control-Allow-Credentials "true"
Header always set Cache-Control "max-age=0"
Header always set Access-Control-Allow-Origin *
You are correct in that it's a CORS issue. Kibana 3 uses CORS to communicate with ElasticSearch.
In order to enable HTTP Authentication Headers and Cookies to be sent with the Kibana CORS requests you need to do two things:
ONE: In your Kibana config.js file, find the setting where your ElasticSearch server is defined:
elasticsearch: "http://localhost:9200",
This needs to be changed to:
elasticsearch: {server: "http://localhost:9200", withCredentials: true},
This will tell Kibana to send the authentication headers and cookies IF the server is capable of receiving them.
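For illustration, this setting boils down to turning on withCredentials for the underlying AngularJS $http/XHR calls that Kibana makes, roughly like this (a sketch, not Kibana's actual code; the URL is just an example):
// Sketch only: what withCredentials means for a single request.
$http({
    method: 'GET',
    url: 'http://localhost:9200/_nodes',
    withCredentials: true  // include cookies / HTTP auth credentials in the CORS request
});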
TWO: Next you need to go into your ElasticSearch config file (elasticsearch.yml on the host server; mine was located at /etc/elasticsearch/elasticsearch.yml on a CentOS7 server). In this file you will find a "Network And HTTP" section. You will need to find the line that says:
#http.port: 9200
Uncomment this line and change the port to the port that you want ElasticSearch to run on. I chose 19200. Then do the same for the #transport.tcp.port: 9300 setting. Again I chose 19300.
Lastly, at the end of this section (just for organizational sake, you could also simply append the following to the end of the file) add in:
http.cors.allow-origin: http://localhost:8080
http.cors.allow-credentials: true
http.cors.enabled: true
You can change the above origin address to wherever your web server is serving Kibana from. Alternatively, you could simply put /.*/ to match all origins, but this is not advisable.
Now save the elasticsearch.yml file and restart the elasticsearch server. Your reverse proxy should be configured to run on port 9200 and point to 19200 if the request authenticates.
A word of warning: if you are using cookies to authenticate requests, make sure to whitelist the HTTP OPTIONS method in your reverse proxy configuration, as only GET, PUT, POST and DELETE requests include the cookies. I haven't tested whether OPTIONS requests include the Authorization header as well, but it may be the same situation as with the cookies. Kibana will not function correctly if the OPTIONS requests cannot get through.
It is also a good idea to configure your reverse proxy to blacklist any request that ends in _shutdown, as this command shouldn't be needed via external requests in most cases.
Related
I am using a React frontend to log into a nodejs server running express-session. Frontend is running on localhost:3000, server is on localhost:5000.
Everything works properly when using Postman from localhost (the session cookie is sent by the server when the user is properly authenticated and is received/stored by Postman; a subsequent Postman API request to a different path on the server uses the session cookie and correctly retrieves the data it should based on the session contents). I can also log in using the browser directly against the server (http://localhost:5000/api/authenticate). The server generates the session, sends the cookie to the browser, and the browser stores the cookie locally.
What doesn't work is when I make the API request from within the React app. The server returns the session cookie but the browser does not store it. After researching this for the last few days (there are a lot of questions on this general subject), it seems to be an issue with cross-site requests, but I can't find the right set of app and server settings to get it working properly. The cookie is being sent by the server, but the browser won't store it when the request comes from the app.
*** After some additional troubleshooting and research, I've made some updates. My initial XHR request requires a pre-flight, and the request and response headers appear to be correct now, but still no cookie is stored in the browser. More details below the setup. ***
Server Setup
var corsOptions = {
origin: 'http://localhost:3000',
credentials: true
};
app.options('*', cors(corsOptions)) // for pre-flight
app.use(cors(corsOptions));
app.use(session({
genid: (req) => {
console.log('Inside the session middleware');
console.log(req.sessionID);
return uuidv4();
},
store: new FileStore(),
secret: 'abc987',
resave: false,
saveUninitialized: true,
cookie: { httpOnly: false, sameSite: 'Lax', hostOnly: false }
}));
app.use( bodyParser.json() );
app.use(bodyParser.urlencoded({
extended: true
}));
app.use(function(req, res, next) {
res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept, withCredentials, credentials');
next();
});
app.post('/api/authenticate', function(req, res) {
const usernameLower = req.body.username.toLowerCase();
const passwordHash = md5(req.body.password);
connection.query('select USERID from USERS where LOWER(USERNAME)=? && PASSWORD=? ', [usernameLower, passwordHash], function (error, results, fields) {
if (error) {
console.log(error);
req.session.destroy();
res.status(500)
.json({
error: 'Internal error please try again'
});
} else if (results[0]) {
const userId = results[0].USERID;
// setup session data
mySession = req.session;
mySession.user = {};
mySession.user.userId = userId;
res.json(mySession.user);
} else {
console.log('auth failed');
req.session.destroy();
res.status(401)
.json({
error: 'Incorrect email or password'
});
}
});
});
Client setup -- the request is triggered by clicking a submit button in a form
handleSubmit(event) {
event.preventDefault();
axios.defaults.withCreditials = true;
axios.defaults.credentials = 'include';
axios({
credentials: 'include',
method: 'post',
url: 'http://localhost:5000/api/authenticate/',
headers: {'Content-Type': 'application/json' },
data: {
username: this.state.username,
password: this.state.password
}
})
.then((response) => {
if (response.status === 200) {
this.props.setLoggedIn(true);
console.log('userId: '+response.data.userId);
} else {
console.log("login error");
}
})
.catch(error => console.log(error))
}
Below is the response cookie sent to the browser but the browser is not storing it.
{"connect.sid":{"path":"/","samesite":"Lax","value":"s:447935ac-fc08-47c6-9b66-4fa30b355021.Yo5H3XVz3Ux3GjTPVhy8i2ZPJm2RM2RzUnznxU9wBvo"}}
Request headers from XHR request (pre-flight):
OPTIONS /api/authenticate/ HTTP/1.1
Host: localhost:5000
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:78.0) Gecko/20100101 Firefox/78.0
Accept: */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Access-Control-Request-Method: POST
Access-Control-Request-Headers: content-type
Referer: http://localhost:3000/
Origin: http://localhost:3000
DNT: 1
Connection: keep-alive
Pre-flight server response headers
HTTP/1.1 204 No Content
X-Powered-By: Express
Access-Control-Allow-Origin: http://localhost:3000
Vary: Origin, Access-Control-Request-Headers
Access-Control-Allow-Credentials: true
Access-Control-Allow-Methods: GET,HEAD,PUT,PATCH,POST,DELETE
Access-Control-Allow-Headers: content-type
Content-Length: 0
Date: Fri, 10 Jul 2020 21:35:05 GMT
Connection: keep-alive
POST request header
POST /api/authenticate/ HTTP/1.1
Host: localhost:5000
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:78.0) Gecko/20100101 Firefox/78.0
Accept: application/json, text/plain, */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Content-Type: application/json
Content-Length: 45
Origin: http://localhost:3000
DNT: 1
Connection: keep-alive
Referer: http://localhost:3000/
Server response headers
HTTP/1.1 200 OK
X-Powered-By: Express
Access-Control-Allow-Origin: http://localhost:3000
Vary: Origin
Access-Control-Allow-Credentials: true
Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, Accept
Content-Type: application/json; charset=utf-8
Content-Length: 95
ETag: W/"5f-Iu5VYnDYPKfn7WPrRi2d2Q168ds"
Set-Cookie: connect.sid=s%3A447935ac-fc08-47c6-9b66-4fa30b355021.Yo5H3XVz3Ux3GjTPVhy8i2ZPJm2RM2RzUnznxU9wBvo; Path=/; SameSite=Lax
Date: Fri, 10 Jul 2020 21:35:05 GMT
Connection: keep-alive
I used the "Will it CORS" tool at https://httptoolkit.tech/will-it-cors/ and my request/response headers all seem to be correct but still no cookie stored.
Pre-flight request contains the correct origin
Pre-flight response contains the correct allow-origin and allow-credentials
POST request contains the correct origin and allow-credentials
POST response contains the correct allow-origin and allow-credentials
Appreciate any help to unravel this....
I solved my issues and wanted to post the solution in case others come across this.
To recap, the backend server is nodejs using express. The following setup allows the front-end to accept the cookies which were created on the nodejs server.
app.use(function (req, res, next) {
res.header("Access-Control-Allow-Origin", "https://frontendserverdomain.com:3000"); // update to match the domain you will make the request from
res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
res.header("Access-Control-Allow-Credentials", true); // allows cookie to be sent
res.header("Access-Control-Allow-Methods", "GET, POST, PUT, HEAD, DELETE"); // you must specify the methods used with credentials. "*" will not work.
next();
});
The front-end app is based on React and uses axios to make HTTP requests. It is hosted at "https://frontendserverdomain.com:3000", which is added to the "Access-Control-Allow-Origin" header in the nodejs setup (see above).
On the front-end, axios needs the "withCredentials" setting applied.
axios.defaults.withCredentials = true;
With these settings, your app will be able to exchange cookies with the back-end server.
One gotcha for me in getting CORS working was making sure the front-end host is properly added to the back-end server's "Access-Control-Allow-Origin" header. This includes the port number if it's specified in the URL used to access the front-end.
In terms of cookie exchange, the "Access-Control-Allow-Credentials" and "Access-Control-Allow-Methods" headers must be set correctly as shown above. Using a wildcard for "Access-Control-Allow-Methods" will not work.
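For example, with the cors middleware already used in the question, a configuration along these lines covers both points (a sketch; use whatever origin actually serves your front-end):
var cors = require('cors');
app.use(cors({
    origin: 'https://frontendserverdomain.com:3000',   // exact origin, including the port
    credentials: true,                                 // emits Access-Control-Allow-Credentials: true
    methods: ['GET', 'POST', 'PUT', 'HEAD', 'DELETE']  // explicit list; a wildcard will not work with credentials
}));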
This does not look right:
axios.defaults.headers.common = {
credentials: "include",
withCredentials: true
}
There are no such request headers; instead, credential handling is controlled on the XHR request itself.
Use this instead to make sure your client accepts cookies:
axios.defaults.withCredentials = true;
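Equivalently, credentials can be enabled per request instead of globally, e.g. (a sketch using the same endpoint as in the question):
axios.post('http://localhost:5000/api/authenticate/', {
    username: this.state.username,
    password: this.state.password
}, {
    withCredentials: true  // send and accept cookies for this request only
}).then(function (response) {
    console.log(response.data);
});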
I am making an authentication system based on tokens. When a user logs in, a token is sent back and is then submitted with each call to the server.
Assigning a token
.factory('AuthenticationService', function($rootScope, $http, authService, $httpBackend) {
var service = {
login: function(user) {
$http.post('http://192.168.100.100/myApp/login', { user: user }, { ignoreAuthModule: true })
.success(function (data, status, headers, config) {
$http.defaults.headers.common.Authorization = data.authorizationToken;
console.log("token:" + data.authorizationToken);
authService.loginConfirmed(data, function(config) {
config.headers.Authorization = data.authorizationToken;
return config;
});
})
After this is executed, calls are sent as OPTIONS rather than POST. The problem is that I am sending to a RESTful server and OPTIONS isn't, ahem, an option; i.e. the server expects POST, GET, etc.
Chrome shows my headers as:
General
**Remote Address:** 192.168.100.100:80
**Request URL:** http://192.168.100.100/myapp/login
**Request Method:** OPTIONS
**Status Code:** 404 Not Found
Response Headers
**Access-Control-Allow-Origin:** *
**Cache-Control:** no-cache, must-revalidate
**Connection:** Keep-Alive
**Content-Encoding:** gzip
**Content-Length:** 563
**Content-Type:** text/plain
**Date:** Tue, 04 Aug 2015 04:29:14 GMT
**Expires:** 0
**Keep-Alive:** timeout=5, max=100
**Server:** Apache/2.2.22 (Debian)
**Vary:** Accept-Encoding
**X-Powered-By:** PHP/5.4.41-0+deb7u1
Request Headers
OPTIONS /myapp/login HTTP/1.1
**Host:** 192.168.100.100
**Connection:** keep-alive
**Access-Control-Request-Method:** POST
**Origin:** null
**User-Agent:** Mozilla/5.0 (Linux; U; Android 4.0; en-us; GT-I9300 Build/IMM76D) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 Mobile Safari/534.30
**Access-Control-Request-Headers:** authorization, content-type
**Accept:** */*
**Accept-Encoding:** gzip, deflate, sdch
**Accept-Language:** en-US,en;q=0.8
Will it always be OPTIONS, and do I have to alter my RESTful server to accommodate this? And shouldn't I be able to see the token in the headers?
This is a preflight request to check whether CORS is enabled.
During the preflight request, you should see two headers: Access-Control-Request-Method and Access-Control-Request-Headers. These request headers ask the server for permission to make the actual request. Your preflight response needs to acknowledge these headers in order for the actual request to work.
In short, you need to enable these headers on your server for the actual request to work.
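Concretely, for the request shown above, the preflight response would need to include something like this (values are illustrative):
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: POST, GET, OPTIONS
Access-Control-Allow-Headers: authorization, content-type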
Can you please check whether CORS is enabled? If yes, please try to handle the OPTIONS request like this:
if (req.method === 'OPTIONS') {
console.log('!OPTIONS');
var headers = {};
// IE8 does not allow domains to be specified, just the *
// headers["Access-Control-Allow-Origin"] = req.headers.origin;
headers["Access-Control-Allow-Origin"] = "*";
headers["Access-Control-Allow-Methods"] = "POST, GET, PUT, DELETE, OPTIONS";
headers["Access-Control-Allow-Credentials"] = false;
headers["Access-Control-Max-Age"] = '86400'; // 24 hours
headers["Access-Control-Allow-Headers"] = "X-Requested-With, X-HTTP-Method-Override, Content-Type, Accept";
res.writeHead(200, headers);
res.end();
}
The current route of the request originates on localhost:3001, goes through a proxy running on that same localhost at localhost:3001/proxy, and the request is then routed to the Salesforce instance. The proxy is made using ExpressJS and the client-side app is made using AngularJS. Note: I did remember to append my security token to the end of my password (a Salesforce API requirement), although when using cURL this doesn't seem to be necessary. Here is a series of HTTP traces that will hopefully provide some clues:
HTTP Request Log from Angular.JS App:
POST /proxy HTTP/1.1
Host: localhost:3001
Connection: keep-alive
Content-Length: 234
Pragma: no-cache
Cache-Control: no-cache
X-User-Agent: salesforce-toolkit-rest-javascript/v29.0
Origin: http://localhost:3001
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.107 Safari/537.36
Content-Type: application/x-www-form-urlencoded
Accept: application/json, text/plain, */*
SalesforceProxy-Endpoint: https://uniquename.salesforce.com/services/oauth2/token
Referer: http://localhost:3001/
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.8
Cookie: liveagent_oref=; liveagent_ptid=3c69c2f9-139d-4439-ba6c-fd8d9dcae101; liveagent_vc=5
grant_type=password&client_id=3MVGxyzxyzxyzxyz&client_secret=99889988&username=first.last%40email.com&password=pswdwACYodaYfHs
400 Bad Request
Object {error_description: "grant type not supported", error: "unsupported_grant_type"}
Relevant Express.JS code used for proxy routing:
app.all('/proxy', function(req, res) {
var url = req.header('SalesforceProxy-Endpoint');
console.log(req.body); //prints all form data variables in JSON format
console.log(res.body); //undefined
request({url: url}).pipe(res).on('error', function(error){
//I think I may need to pipe more information using request?
console.log(error);
});
});
Request details using cURL:
curl -v https://uniquename.salesforce.com/services/oauth2/token
-d "grant_type=password" -d "client_id=3MVGxyzxyzxyzxyz"
-d "client_secret=99889988" -d "username=jfirst.last#email.com" -d "password=pswd"
> POST /services/oauth2/token HTTP/1.1
> User-Agent: curl/7.41.0
> Host: uniquename.salesforce.com
> Accept: */*
> Content-Length: 207
> Content-Type: application/x-www-form-urlencoded
>
* upload completely sent off: 207 out of 207 bytes
< HTTP/1.1 200 OK
< Date: Wed, 29 Jul 2015 06:04:55 GMT
< Set-Cookie: BrowserId=auu1mgvHSMS1EedDEduz8Q;Path=/;Domain=.salesforce.com;Exp
ires=Sun, 27-Sep-2015 06:04:55 GMT
< Expires: Thu, 01 Jan 1970 00:00:00 GMT
< Pragma: no-cache
< Cache-Control: no-cache, no-store
< Content-Type: application/json;charset=UTF-8
< Transfer-Encoding: chunked
<
{
"id":"https://test.salesforce.com/id/05390530530",
"issued_at":"1438132525896197",
"token_type":"Bearer",
"instance_url":"https://uniquename.salesforce.com",
"signature":"blahblah",
"access_token":"XXXXXXXXXXXXXXXXX"
}
* Connection #0 to
host uniquename.salesforce.com left intact
As you can see, I get back a valid response from the cURL request. I suspect something is wrong with the proxy, as it may not be forwarding all the form data to Salesforce, but I'm not sure how to debug that in Express.JS. The reason I suspect this is that if I run a bare curl https://uniquename.salesforce.com/services/oauth2/token with no form data, it returns the same unsupported_grant_type error.
I finally got this working by switching to use the cors-anywhere proxy. I deploy my AngularJS app on port 8080, and the proxy on port 3001. My packages are managed using npm and grunt. Here is the code for the proxy:
var host = 'localhost';
var port = 3001;
var cors_proxy = require('cors-anywhere');
cors_proxy.createServer().listen(port, host, function() {
console.log('CORS proxy running on ' + host + ':' + port);
});
And here is how I'm making the HTTP request in AngularJS (you have to fill in your own credentials in the data object):
var login = {
method: 'POST',
url: 'http://localhost:3001/https://login.salesforce.com/services/oauth2/token',
data: 'grant_type=password&client_id='+encodeURIComponent(CLIENT_ID)+'&client_secret='+encodeURIComponent(CLIENT_SECRET)+'&username='+encodeURIComponent(EMAIL)+'&password='+encodeURIComponent(PASSWORD+USER_SECURITY_TOKEN),
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
'Accept': '*/*'
}
};
$http(login).success(function(response){
console.log(response);
})
.error( function(response, status){
console.log(response);
console.log("Status: " + status);
});
You can run the proxy with the node server.js command, and run the AngularJS app with grunt. I hope this helps someone out, this was a tough problem to solve.
I don't think you're proxying everything in the HTTP request. I've included some code below that will pass the request method and the request headers on to your endpoint. Also try using the request-debug library so you can compare the proxy's HTTP request with your cURL request and look for differences.
var express = require('express');
var app = express();
var request = require('request');
require('request-debug')(request);
app.all('/proxy', function(req, res) {
var url = req.header('SalesforceProxy-Endpoint');
var options = {
url: url,
method: req.method,
headers: req.headers
}
// The 'error' event handler only receives the error object, so use the
// callback form of request() to inspect the upstream response instead,
// and pipe the response body back to the client as before.
request(options, function (error, response, body) {
    if (!error && response.statusCode == 200) {
        //TODO: do something useful with the response
    } else {
        console.log('ERROR: ' + error);
    }
}).pipe(res);
});
I tried AngularJS and started with a web UI for a RESTful API (using HTTP basic auth). It is really nice and all, except that the authorization doesn't work.
Up to now I have been using $http.defaults.headers.common.Authorization to set the password, but mostly the browser just opens its default login form for HTTP basic auth. Another strange behaviour is that the Angular request does not contain an Authorization header (neither the OPTIONS nor the GET request). I also tried to set this header on each request with the header config, but that didn't work either.
Are there some special headers I have to set? Or do I have to set $http.defaults.headers.common.Authorization in a special context?
tools.factory('someFactory', function ($http, Base64) {
//...
factory.checkAuth = function (username, password) {
storeCredentials(username, password);
factory.initConnection();
return factory.getData();
};
factory.initConnection = function(){
var credentials = loadCredentials();
factory.authHeader = 'Basic ' + Base64.encode(credentials.username + ':' + credentials.password);
$http.defaults.headers.common.Authorization = factory.authHeader;
};
factory.getData = function () {
return $http({
method: 'GET',
url: urlBase + '/happening',
headers: {
// 'Authorization': factory.authHeader
}
});
};
//...
return factory;
});
Request header:
OPTIONS /v8/happening HTTP/1.1
Host: api.gospry.com
Connection: keep-alive
Access-Control-Request-Method: GET
Origin: http://web.gospry.com
User-Agent: [..]
Access-Control-Request-Headers: accept, authorization
Accept: */*
Referer: http://web.gospry.com/
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-US,en;q=0.8
The backend was slightly modified to support CORS (Access-Control headers and preflight request support).
Response header:
HTTP/1.1 401 Unauthorized
Server: Apache-Coyote/1.1
Access-Control-Allow-Origin: http://web.gospry.com
Access-Control-Allow-Methods: POST, GET, PUT, PUSH, OPTIONS, DELETE
Access-Control-Max-Age: 3600
Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, Authorization
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Pragma: no-cache
Expires: 0
Strict-Transport-Security: max-age=31536000 ; includeSubDomains
X-Frame-Options: DENY
Set-Cookie: JSESSIONID=2DDC92A1B0DC57C221CDC3B7A5DC1314; Path=/v8/; Secure; HttpOnly
WWW-Authenticate: Basic realm="Realm"
X-Content-Type-Options: nosniff
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Pragma: no-cache
Expires: 0
X-Frame-Options: DENY
Allow: GET, HEAD, POST, PUT, DELETE, TRACE, OPTIONS, PATCH
Content-Length: 0
Date: Wed, 08 Apr 2015 18:04:04 GMT
I use an interceptor. I can't spot the issue in your code but here is what I am using:
authModule.factory('CeradAuthInterceptor',
['CeradAuthManager', function ( authManager)
{
return {
request: function (config)
{
config.headers = config.headers || {};
if (authManager.authToken)
{
config.headers.Authorization = 'Bearer ' + authManager.authToken;
}
return config;
}
};
}]);
appModule.config(['$httpProvider', function ($httpProvider) {
$httpProvider.interceptors.push('CeradAuthInterceptor');
}]);
The issue you describe with the login form is explained in AngularJS Tips and Tricks Using ngResource, along with a workaround. See the Basic Authentication section:
A workaround, if appropriate, is to tell your web server to return something other than 401 on an authentication failure, and go from there. In AngularJS, a 500 (for example) will cause the $http promise to be rejected and you can handle it however you’d like. This is not recommended if you actually ever need the login prompt to occur!
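For illustration, a minimal sketch of that approach, assuming the backend is changed to return e.g. 403 instead of 401 on failed authentication (module and handler names here are placeholders):
appModule.factory('AuthFailureInterceptor', ['$q', function ($q) {
    return {
        responseError: function (rejection) {
            // With the backend returning 403 instead of 401, the browser's native
            // basic-auth prompt never appears and the app can react itself,
            // e.g. by showing its own login form.
            if (rejection.status === 403) {
                // showLoginForm();  // hypothetical app-specific handler
            }
            return $q.reject(rejection);
        }
    };
}]);
appModule.config(['$httpProvider', function ($httpProvider) {
    $httpProvider.interceptors.push('AuthFailureInterceptor');
}]);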
I'm working on an app using the Ionic Framework.
On the backend I wrote a Flask application for the API, which looks like this:
@API.route("/saverez", methods=["POST", "OPTIONS"])
@crossdomain(origin='*', headers="*", methods="*")
@render_api
def saver():
.....
I got errors while posting JSON to the API:
var headers = {
'Access-Control-Allow-Origin' : '*',
'Access-Control-Allow-Methods' : 'POST, GET, OPTIONS',
'Accept': 'application/json'
};
$http({
method: "POST",
headers: headers,
url: url+ '/api/saverez',
data: $scope.form
}).success(function (result) {
console.log(result);
}).error(function (data, status, headers, config) {
console.log(data);
console.log(status);
console.log(headers);
console.log(config);
});
So this gives me the error:
XMLHttpRequest cannot load http://myurl/api/saverez. Request header field Access-Control-Allow-Origin is not allowed by Access-Control-Allow-Headers.
I googled it and then I found this snippet:
http://flask.pocoo.org/snippets/56/
I also added headers to my nginx conf like below:
location ~* \.(eot|ttf|woff)$ {
add_header Access-Control-Allow-Origin *;
}
I tried everything in that documentation and also everything I found on Google, but sadly it didn't do any good.
How can I set the right headers for all origins? I also use Google PageSpeed; could that be causing this issue?
Thanks in advance.
--- EDIT ---
Chrome network output
Remote Address:myip
Request URL:http://myurl/api/saverez
Request Method:OPTIONS
Status Code:200 OK
Request Headers
Accept:*/*
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8
Access-Control-Request-Headers:access-control-allow-origin, accept, access-control-allow-methods, content-type
Access-Control-Request-Method:POST
Cache-Control:no-cache
Connection:keep-alive
Host:myurl
Origin:http://192.168.1.46:8100
Pragma:no-cache
Referer:http://192.168.1.46:8100/
User-Agent:Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X; en-us) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A465 Safari/9537.53
Response Headers
Access-Control-Allow-Credentials:true
Access-Control-Allow-Headers:*
Access-Control-Allow-Methods:*
Access-Control-Allow-Origin:*
Access-Control-Max-Age:21600
Allow:POST, OPTIONS
Content-Length:0
Content-Type:text/html; charset=utf-8
Date:Thu, 28 Aug 2014 13:26:11 GMT
Server:nginx/1.6.0
In my app module config section I have the following:
angular.module('starterapp', ['ionic'])
.config(function ($stateProvider, $httpProvider, $urlRouterProvider) {
// We need to setup some parameters for http requests
// These three lines are all you need for CORS support
$httpProvider.defaults.useXDomain = true;
$httpProvider.defaults.withCredentials = true;
delete $httpProvider.defaults.headers.common['X-Requested-With'];
});
That is all you need to make all the HTTP requests work with CORS. This of course assumes you have set up your backend accordingly.
Adding those additional headers from the client would not be allowed according to the W3C specification for XMLHttpRequest, as they may only be set by the browser itself.
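In other words, the client call from the question should drop those Access-Control-Allow-* headers entirely; a minimal sketch of the corrected request:
// The Access-Control-Allow-* headers belong on the server's response;
// the browser sets the CORS request headers itself, so the client only needs:
$http({
    method: 'POST',
    url: url + '/api/saverez',
    headers: { 'Accept': 'application/json' },
    data: $scope.form
}).success(function (result) {
    console.log(result);
});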
PaulT's answer got me very close to solving this problem for myself, but I also had to explicitly add the Content-Type for my post operations.
$httpProvider.defaults.headers.post['Content-Type'] = 'application/x-www-form-urlencoded; charset=UTF-8';
Hope this helps.
I got it working by just adding JSONP as the method
$resource(
'http://maps.google.com/maps/api/geocode/json?address=:address&sensor=false',
{},
{
get: {
method: 'JSONP',
}
});