Unable to see uploaded images on Azure Blob - angularjs

I am new to Azure Blob Storage and am trying to upload an image to it. I am using Angular on the client side and upload the image with the following headers:
'Content-type'
'x-ms-blob-type'
'Content-Length'
The blob is saved and I can see it in the Azure Portal, but I cannot view the images and I don't understand why.
The link to my client is http://educms.azurewebsites.net/#/pages/results
There is no upload button; as soon as you select an image file, it is uploaded. You can see the uploaded file at https://hobcity.blob.core.windows.net/images2/filename.extension
AngularJS upload code:
$scope.uploadFile = function(files) {
    var fd = new FormData();
    // Take the first selected file
    fd.append("file", files[0]);
    var size = files[0].size;
    var name = files[0].name;
    var type = files[0].type;
    var postData = { "name": name };
    postData.containerName = 'images2';
    DataService.save('/tables/results/', postData).then(function(data) {
        var header = {
            'Access-Control-Allow-Origin': '*',
            'Content-type': type,
            'x-ms-blob-type': 'BlockBlob',
            'Content-Length': size
        };
        var url = data.imageUri;
        var queryString = data.sasQueryString;
        var uploadUrl = url + '/' + name + '/?' + queryString;
        $http.put(uploadUrl, fd, {
            headers: header,
            transformRequest: angular.identity
        }).success(function(data) {
            console.log(data);
        }).error(function(err) {
            console.error(err);
        });
    });
}
HTML: the code can be seen live at http://educms.azurewebsites.net/scripts/controllers/results.js
Does anyone know what's wrong?

So I uploaded a simple text file and traced the request through Fiddler. Here's what I saw:
PUT http://hobcity.blob.core.windows.net/images2/simpletextfile.txt/?se=2014-08-11T17%3A13%3A52Z&sr=c&sp=w&sig=SlY7wURwfSjM72Hw22507OHpnaCC1Ky6POk6hhR6fbU%3D HTTP/1.1
Accept: application/json, text/plain, */*
Access-Control-Allow-Origin: *
Content-Type: text/plain, multipart/form-data; boundary=---------------------------7de26921205a0
x-ms-blob-type: BlockBlob
Referer: http://educms.azurewebsites.net/#/pages/results
Accept-Language: en-US
Origin: http://educms.azurewebsites.net
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko
Host: hobcity.blob.core.windows.net
Content-Length: 254
DNT: 1
Connection: Keep-Alive
Pragma: no-cache
-----------------------------7de26921205a0
Content-Disposition: form-data; name="file"; filename="simpletextfile.txt"
Content-Type: text/plain
https://hobcity.blob.core.windows.net/images2/Add-Item.png
-----------------------------7de26921205a0--
I believe you're running into this issue because you're uploading the file as-is (note that your Content-Type is multipart/form-data), which corrupts the data: the multipart boundary and headers end up stored as part of the blob. What you need to do is read the file contents into a byte array and upload that instead. If you search for the HTML5 File API, you will find examples of how to read a file with JavaScript. I also wrote a blog post about uploading files to Azure Blob Storage using JavaScript which you may find useful: http://gauravmantri.com/2013/12/01/windows-azure-storage-and-cors-lets-have-some-fun/ (it uses jQuery instead of Angular, but it should give you some idea).
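As a rough illustration of that approach (a sketch, not the exact code from the blog post), the snippet below reads the selected file with the HTML5 FileReader API and PUTs the raw bytes to the SAS URL; uploadUrl is assumed to be built the same way as in the question (imageUri + '/' + name + '?' + sasQueryString):

$scope.uploadFile = function(files) {
    var file = files[0];
    var reader = new FileReader();
    reader.onloadend = function(evt) {
        if (evt.target.readyState !== FileReader.DONE) { return; }
        // evt.target.result is an ArrayBuffer with the raw file bytes,
        // so the blob is stored without any multipart boundaries.
        $http.put(uploadUrl, evt.target.result, {
            headers: {
                'x-ms-blob-type': 'BlockBlob',
                'Content-Type': file.type
                // Content-Length is set by the browser automatically.
            },
            transformRequest: angular.identity
        }).success(function(data) {
            console.log(data);
        }).error(function(err) {
            console.error(err);
        });
    };
    // Read the file into memory as raw bytes instead of wrapping it in FormData.
    reader.readAsArrayBuffer(file);
};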

Related

NodeJS proxy not working as intended

The request originates on localhost:3001, goes through a proxy running on that same host at localhost:3001/proxy, and is then routed to the Salesforce instance. The proxy is built with ExpressJS and the client-side app with AngularJS. Note: I did remember to append my security token to the end of my password (a Salesforce API requirement), although this doesn't seem to be necessary when using cURL. Here is a series of HTTP traces that will hopefully provide some clues:
HTTP Request Log from Angular.JS App:
POST /proxy HTTP/1.1
Host: localhost:3001
Connection: keep-alive
Content-Length: 234
Pragma: no-cache
Cache-Control: no-cache
X-User-Agent: salesforce-toolkit-rest-javascript/v29.0
Origin: http://localhost:3001
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.107 Safari/537.36
Content-Type: application/x-www-form-urlencoded
Accept: application/json, text/plain, */*
SalesforceProxy-Endpoint: https://uniquename.salesforce.com/services/oauth2/token
Referer: http://localhost:3001/
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.8
Cookie: liveagent_oref=; liveagent_ptid=3c69c2f9-139d-4439-ba6c-fd8d9dcae101; liveagent_vc=5
grant_type=password&client_id=3MVGxyzxyzxyzxyz&client_secret=99889988&username=first.last%40email.com&password=pswdwACYodaYfHs
400 Bad Request
Object {error_description: "grant type not supported", error: "unsupported_grant_type"}
Relevant Express.JS code used for proxy routing:
app.all('/proxy', function(req, res) {
    var url = req.header('SalesforceProxy-Endpoint');
    console.log(req.body); // prints all form data variables in JSON format
    console.log(res.body); // undefined
    request({url: url}).pipe(res).on('error', function(error) {
        // I think I may need to pipe more information using request?
        console.log(error);
    });
});
Request details using cURL:
curl -v https://uniquename.salesforce.com/services/oauth2/token
-d "grant_type=password" -d "client_id=3MVGxyzxyzxyzxyz"
-d "client_secret=99889988" -d "username=jfirst.last#email.com" -d "password=pswd"
> POST /services/oauth2/token HTTP/1.1
> User-Agent: curl/7.41.0
> Host: uniquename.salesforce.com
> Accept: */*
> Content-Length: 207
> Content-Type: application/x-www-form-urlencoded
>
* upload completely sent off: 207 out of 207 bytes
< HTTP/1.1 200 OK
< Date: Wed, 29 Jul 2015 06:04:55 GMT
< Set-Cookie: BrowserId=auu1mgvHSMS1EedDEduz8Q;Path=/;Domain=.salesforce.com;Expires=Sun, 27-Sep-2015 06:04:55 GMT
< Expires: Thu, 01 Jan 1970 00:00:00 GMT
< Pragma: no-cache
< Cache-Control: no-cache, no-store
< Content-Type: application/json;charset=UTF-8
< Transfer-Encoding: chunked
<
{
"id":"https://test.salesforce.com/id/05390530530",
"issued_at":"1438132525896197",
"token_type":"Bearer",
"instance_url":"https://uniquename.salesforce.com",
"signature":"blahblah",
"access_token":"XXXXXXXXXXXXXXXXX"
}
* Connection #0 to host uniquename.salesforce.com left intact
As you can see, I get back a valid response from the cURL request. I suspect something is wrong with the proxy, as it may not be forwarding all the form data to Salesforce, but I'm not sure how to debug that in Express.JS. The reason I suspect this is that a bare curl https://uniquename.salesforce.com/services/oauth2/token (with no form data) returns the same unsupported_grant_type error.
I finally got this working by switching to the cors-anywhere proxy. I deploy my AngularJS app on port 8080 and the proxy on port 3001. My packages are managed with npm and grunt. Here is the code for the proxy:
var host = 'localhost';
var port = 3001;
var cors_proxy = require('cors-anywhere');

cors_proxy.createServer().listen(port, host, function() {
    console.log('CORS proxy running on ' + host + ':' + port);
});
And here is how I'm making the HTTP request in AngularJS (you have to fill in your own credentials in the data object):
var login = {
    method: 'POST',
    url: 'http://localhost:3001/https://login.salesforce.com/services/oauth2/token',
    data: 'grant_type=password&client_id=' + encodeURIComponent(CLIENT_ID) +
          '&client_secret=' + encodeURIComponent(CLIENT_SECRET) +
          '&username=' + encodeURIComponent(EMAIL) +
          '&password=' + encodeURIComponent(PASSWORD + USER_SECURITY_TOKEN),
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Accept': '*/*'
    }
};

$http(login).success(function(response) {
    console.log(response);
}).error(function(response, status) {
    console.log(response);
    console.log("Status: " + status);
});
You can run the proxy with the node server.js command and the AngularJS app with grunt. I hope this helps someone out; this was a tough problem to solve.
I don't think you're proxying everything in the HTTP request. I've included some code below that passes the request method and request headers on to your endpoint. Also try the 'request-debug' library so you can compare the proxy's HTTP request against your curl request and look for differences.
var express = require('express');
var app = express();
var request = require('request');
require('request-debug')(request);

app.all('/proxy', function(req, res) {
    var url = req.header('SalesforceProxy-Endpoint');
    var options = {
        url: url,
        method: req.method,
        headers: req.headers
    };
    // Callback form gives access to the response; piping still streams the body back to the client.
    request(options, function (error, response, body) {
        if (!error && response.statusCode == 200) {
            //TODO: do something useful with the response
        } else {
            console.log('ERROR: ' + error);
        }
    }).pipe(res);
});
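One other gap worth checking, offered as a hedged suggestion rather than a confirmed fix: neither version forwards the incoming request body, which matches the symptom of Salesforce not seeing the grant_type field. A minimal sketch of a proxy route that streams the original body through (assuming no body-parsing middleware is mounted on this route, since a body parser would already have consumed the stream) might look like this:

var express = require('express');
var request = require('request');
var app = express();

// No body parser here, so req is still a readable stream of the raw POST body.
app.all('/proxy', function(req, res) {
    var url = req.header('SalesforceProxy-Endpoint');
    req.pipe(request({
        url: url,
        method: req.method,
        headers: {
            // Forward only the headers the endpoint needs; copying req.headers verbatim
            // would also copy Host, which can confuse the remote server.
            'Content-Type': req.header('Content-Type')
        }
    })).pipe(res);
});

app.listen(3001);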

AngularJS WebApi Authorization header does not appear to be getting passed

I am trying to make a call from AngularJS v1.3.15 into ASP.NET Web API (latest from NuGet) to get a list of customers. I can successfully authenticate and get back a token. I add the token to an Authorization header, but when I make the call it gets rejected saying the Authorization header is missing.
The Angular call after I get the token looks like this
$scope.baseUrl = "http://localhost:7800/";
$http({
    method: 'GET',
    url: $scope.baseUrl + 'customer',
    headers: {
        'Authorization': $scope.token
    }
});
I have also tried using the AngularJS $resource service:
return $resource($scope.baseUrl + 'customer', { }, { 'get': { method: 'GET', isArray: true, headers: { 'Authorization': $scope.token } } });
In the WebApiConfig Register method I have the following setup
var cors = new EnableCorsAttribute("*","*","*");
config.EnableCors(cors);
In a DelegatingHandler I check for the Authorization header like so:
var authHeader = request.Headers.Authorization;
From the Angular app it is always null, but if I test from Fiddler or Postman the Authorization header comes through just fine.
When I press F12 in Chrome and look at the request headers, these are the results:
OPTIONS /customer HTTP/1.1
Host: localhost:7800
Connection: keep-alive
Access-Control-Request-Method: GET
Origin: http://localhost:63342
User-Agent: Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2376.0 Safari/537.36
Access-Control-Request-Headers: accept, authorization
Accept: */*
Referer: http://localhost:63342/Test/index.html
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-US,en;q=0.8
Fixed my issue: after some searching I found that OPTIONS does not seem to be supported out of the box. Adding the NuGet package Microsoft.Owin.Cors and then adding
app.UseCors(Microsoft.Owin.Cors.CorsOptions.AllowAll);
to the Startup.cs file resolved it.

AngularJs Service, File Upload, Laravel

I'm having trouble with an XHR request; for some reason my server is not receiving my files.
Here is the update function from my Angular service:
var update = function(id, name, file) {
    var formData = new FormData();
    formData.append('name', name);
    formData.append('img', file);
    return $http({
        method: 'PUT',
        url: '/albums/' + id,
        data: formData,
        headers: {'Content-Type': undefined},
        transformRequest: angular.identity
    });
};
On my laravel controller I just have:
public function update($id) {
return Response::json(Input::hasFile('img'));
}
The file is obviously there, so why can't I retrieve it in my backend?
This is my request info:
Remote Address:[::1]:8000
Request URL:http://localhost:8000/albums/1
Request Method:PUT
Status Code:200 OK
Request Headers
Accept:application/json, text/plain, */*
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8,es;q=0.6
Cache-Control:no-cache
Connection:keep-alive
Content-Length:13811
Content-Type:multipart/form-data; boundary=----WebKitFormBoundaryJ46EVSBw57RaVu7x
Cookie:_token=eyJpdiI6IkkzSXVmdnhubFFlVnlzSnZNWVFzVWk3ZlVKSFRDNFFlNndJWUVsVGNVU2c9IiwidmFsdWUiOiI5OG5PamUrVGZkZGx0ajZONklWajJ2OTM3MWlRd2tGZ2g5S2Jja1RhVjJ4Q1wvYk9xQTB4TlRKUWxkWmdvRm1EcHlzTGRjSEdzN2U5TWNPYWxEYVExVUE9PSIsIm1hYyI6IjA4NTY0ZTlmMjAyNTk3NGQxMmFhODIxMTU3NGNiYjQ4ZDA3OTgxMTA3Yzk1MmVkNmJkMGNkYjUyMmNhMzZkNzQifQ%3D%3D; laravel_session=eyJpdiI6IjRISElnWjd3ZlwvY2k1Z1pvOERWOGxyVHlaQzEwRmlqY1FiV0tNNzZEbEs4PSIsInZhbHVlIjoiYnp4UzVqOFoxMm5MMXhQdzJhVFphSkgrRGh2b2plYXhjdXpTamJ0UjVYdGdxS0puQmpPVXhObEtyb1I3XC9HQnRFdnBMWXV0MzRmWXAybGRySGRvXC9vUT09IiwibWFjIjoiMGQ1NzUyYTBjZmU3NzQ3ZDBkYjg5ZWViOGZmYzg3ZDY1ODg0N2JmNDg1NmQyNmMwZDcxMDE5NzcxZjIxM2MxMiJ9
Host:localhost:8000
Origin:http://localhost:8000
Pragma:no-cache
Referer:http://localhost:8000/Admin/Client
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.104 Safari/537.36
Request Payload
------WebKitFormBoundaryJ46EVSBw57RaVu7x
Content-Disposition: form-data; name="name"
Some Weird Album
------WebKitFormBoundaryJ46EVSBw57RaVu7x
Content-Disposition: form-data; name="img"; filename="derpino.jpg"
Content-Type: image/jpeg
------WebKitFormBoundaryJ46EVSBw57RaVu7x--
Response Headers
Cache-Control:no-cache
Connection:close
Content-Type:application/json
Date:Wed, 29 Oct 2014 12:47:20 GMT
Host:localhost:8000
Set-Cookie:laravel_session=eyJpdiI6IlQ4WlFOaG1keVhXVlA1dlluNWFZZGlMcmRQNGM3bThCRjZ6cnh4ZlorcWs9IiwidmFsdWUiOiJOZkJXNXBJQTVSTGZzWHJ4alg1SXBoN0Q2ekR6UVpnWThKQ0c4MXZOQlc1RUhNMUUraUZSTlpPYTlPTFdLQXpiYTJONkRvb29WN1djVlZkSGdaWStjQT09IiwibWFjIjoiOWI5MzEwODE2YTZlM2EzODMwZDE1YzI4YmE4M2NiYWJjMTRjMDEzOGI3YjA4NmRlMGU5NDBlZWEyMzI4MGQ3MCJ9; path=/; httponly
X-Frame-Options:SAMEORIGIN
X-Powered-By:PHP/5.5.11
I found the error! Apparently I cannot send a file with the PUT method. I changed the method to POST, in both the service and the Laravel route, and it works:
var update = function(id, name, file) {
    var formData = new FormData();
    formData.append('name', name);
    formData.append('img', file);
    return $http({
        method: 'POST',
        url: '/albums/' + id,
        data: formData,
        headers: {'Content-Type': undefined},
        transformRequest: angular.identity
    });
};
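A hedged side note on why this happens: PHP only populates its uploaded-files array for POST requests, so a multipart PUT never shows up in Input::hasFile(). If you want to keep the route defined as PUT, Laravel's method spoofing can be used instead: POST the form and include a _method field. A minimal sketch of that variant (same route and fields as above) could look like this:

var update = function(id, name, file) {
    var formData = new FormData();
    formData.append('name', name);
    formData.append('img', file);
    // Ask Laravel to treat this POST as a PUT so an existing Route::put() still matches.
    formData.append('_method', 'PUT');
    return $http({
        method: 'POST',
        url: '/albums/' + id,
        data: formData,
        headers: {'Content-Type': undefined},
        transformRequest: angular.identity
    });
};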

Angular POST to Web API doesn't pass data

I've been writing code against ASP.NET Web API for a while now with jQuery and I'm starting something new in Angular (writing against the same Web API backend.)
I'm POSTing to a method that will return some search results for an entity in the system. It looks like this:
public IEnumerable<dynamic> Post(string entity, FormDataCollection parameters)
{
    // entity is passed on the route.
    // parameters contains all the stuff I'm trying to get here.
}
If I call the method using jQuery.post:
$.post('api/search/child', {where : 'ParentID = 1'}, function(d){ foo = d });
it works just right and returns what I would expect.
I've made a service in my angular application that makes a similar call:
$http.post('api/search/child', { where: 'parentID = ' + parent.ID })
.success(function (data, status, headers, config) {
// etc.
})
But when it hits my "Post" method on the server, "parameters" is null.
After some googling, I've tried adding a Content-Type header to ensure it's passed as JSON, and JSON.stringify-ing and $.param()-ing the "data" argument, but that didn't do anything (and from what I've read it shouldn't be necessary). What am I doing wrong here? Thanks for your help!
UPDATE:
Here's the raw request from the (working) jQuery example:
POST http://localhost:51383/api/search/child HTTP/1.1
Accept: */*
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
X-Requested-With: XMLHttpRequest
Referer: http://localhost:51383/mypage.aspx
Accept-Language: en-US
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Host: localhost:51383
Content-Length: 23
DNT: 1
Connection: Keep-Alive
Pragma: no-cache
Cookie: (etc)
Authorization: (etc)
where=ParentID+%3D+1
And the raw request from the (failing) Angular sample:
POST http://localhost:51383/api/search/parent HTTP/1.1
Content-Type: application/json;charset=utf-8
Accept: application/json, text/plain, */*
Referer: http://localhost:51383/mypage.aspx
Accept-Language: en-US
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Connection: Keep-Alive
Content-Length: 27
DNT: 1
Host: localhost:51383
Pragma: no-cache
Cookie: (etc)
{"where":"ScorecardID = 1"}
Very weird. Even when I add the 'json' data type parameter to the end of the jQuery call, it still creates the x-www-form-urlencoded request, and that's the one that works. My Web API application is already set up for JSON (but thank you, Dalorzo).
Check whether you have included the JSON formatter in your configuration. It should be something like:
System.Web.Http.GlobalConfiguration.Configuration.Formatters.XmlFormatter.SupportedMediaTypes.Clear();
config.Formatters.Insert(0, new System.Net.Http.Formatting.JsonMediaTypeFormatter());
The Content-Type=application/json only will work if you set the proper formatter.
You can also try using [FromBody] next to your parameter type.
Solved! I discovered this question:
AngularJs $http.post() does not send data
which points to this lovely article:
http://victorblog.com/2012/12/20/make-angularjs-http-service-behave-like-jquery-ajax/
It turns out Angular doesn't post data the same way jQuery does, but you can override that with some tweaking.
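Roughly, the tweak described there is to change $http's defaults so that request bodies are serialized as form-urlencoded (the way jQuery does) instead of JSON. A minimal sketch of that idea is below; the module name is a placeholder and the serializer is a simplified stand-in for the article's code, which also handles nested objects:

angular.module('myApp', []).config(['$httpProvider', function($httpProvider) {
    // Send POST bodies as form-urlencoded, which is what the working jQuery request used.
    $httpProvider.defaults.headers.post['Content-Type'] =
        'application/x-www-form-urlencoded;charset=utf-8';

    // Simplified serializer: turn a flat object into key=value pairs.
    $httpProvider.defaults.transformRequest = [function(data) {
        if (!angular.isObject(data)) { return data; }
        var pairs = [];
        angular.forEach(data, function(value, key) {
            pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(value));
        });
        return pairs.join('&');
    }];
}]);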
I solved this with the code below:
Client Side:
$http({
    url: me.serverPath,
    method: 'POST',
    data: data,
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' }
}).success(function (serverData) {
    console.log("ServerData:", serverData);
    ......
Notice that data is an object.
On the server (ASP.NET MVC):
[AllowCrossSiteJson]
public string Api()
{
    var data = JsonConvert.DeserializeObject<AgentRequest>(Request.Form[0]);
    if (data == null) return "Null Request";
    var bl = Page.Bl = new Core(this);
    return data.methodName;
}
and 'AllowCrossSiteJsonAttribute' is needed for cross-domain requests:
public class AllowCrossSiteJsonAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        filterContext.RequestContext.HttpContext.Response.AddHeader("Access-Control-Allow-Origin", "*");
        base.OnActionExecuting(filterContext);
    }
}

How and where to implement basic authentication in Kibana 3

I have put my Elasticsearch server behind an Apache reverse proxy that provides basic authentication.
Authenticating against Apache directly from the browser works fine. However, when I use Kibana 3 to access the server, I receive authentication errors, obviously because no auth headers are sent along with Kibana's Ajax calls.
As a quick-and-dirty fix, I added the line below to elastic-angular-client.js in the Kibana vendor directory, but for some reason it does not work.
$http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
What is the best approach and place to implement basic authentication in Kibana?
/*! elastic.js - v1.1.1 - 2013-05-24
 * https://github.com/fullscale/elastic.js
 * Copyright (c) 2013 FullScale Labs, LLC; Licensed MIT */
/*jshint browser:true */
/*global angular:true */
'use strict';

/*
Angular.js service wrapping the elastic.js API. This module can simply
be injected into your angular controllers.
*/
angular.module('elasticjs.service', [])
  .factory('ejsResource', ['$http', function ($http) {

    return function (config) {

      var
        // use existing ejs object if it exists
        ejs = window.ejs || {},

        /* results are returned as a promise */
        promiseThen = function (httpPromise, successcb, errorcb) {
          return httpPromise.then(function (response) {
            (successcb || angular.noop)(response.data);
            return response.data;
          }, function (response) {
            (errorcb || angular.noop)(response.data);
            return response.data;
          });
        };

      // check if we have a config object
      // if not, we have the server url so
      // we convert it to a config object
      if (config !== Object(config)) {
        config = {server: config};
      }

      // set url to empty string if it was not specified
      if (config.server == null) {
        config.server = '';
      }

      /* implement the elastic.js client interface for angular */
      ejs.client = {
        server: function (s) {
          if (s == null) {
            return config.server;
          }
          config.server = s;
          return this;
        },
        post: function (path, data, successcb, errorcb) {
          $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
          console.log($http.defaults.headers);
          path = config.server + path;
          var reqConfig = {url: path, data: data, method: 'POST'};
          return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb);
        },
        get: function (path, data, successcb, errorcb) {
          $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
          path = config.server + path;
          // no body on get request, data will be request params
          var reqConfig = {url: path, params: data, method: 'GET'};
          return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb);
        },
        put: function (path, data, successcb, errorcb) {
          $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
          path = config.server + path;
          var reqConfig = {url: path, data: data, method: 'PUT'};
          return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb);
        },
        del: function (path, data, successcb, errorcb) {
          $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
          path = config.server + path;
          var reqConfig = {url: path, data: data, method: 'DELETE'};
          return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb);
        },
        head: function (path, data, successcb, errorcb) {
          $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password');
          path = config.server + path;
          // no body on HEAD request, data will be request params
          var reqConfig = {url: path, params: data, method: 'HEAD'};
          return $http(angular.extend(reqConfig, config))
            .then(function (response) {
              (successcb || angular.noop)(response.headers());
              return response.headers();
            }, function (response) {
              (errorcb || angular.noop)(undefined);
              return undefined;
            });
        }
      };

      return ejs;
    };
  }]);
UPDATE 1: I implemented Matt's suggestion. However, the server returns a strange response; it seems the authorization header is not working. Could it have to do with the fact that I am running Kibana on port 81 and Elasticsearch on 8181?
OPTIONS /solar_vendor/_search HTTP/1.1
Host: 46.252.46.173:8181
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Origin: http://46.252.46.173:81
Access-Control-Request-Method: POST
Access-Control-Request-Headers: authorization,content-type
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
This is the response
HTTP/1.1 401 Authorization Required
Date: Fri, 08 Nov 2013 23:47:02 GMT
WWW-Authenticate: Basic realm="Username/Password"
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 346
Connection: close
Content-Type: text/html; charset=iso-8859-1
UPDATE 2: I updated all instances with the modified headers in these Kibana files:
root#localhost:/var/www/kibana# grep -r 'ejsResource(' .
./src/app/controllers/dash.js: $scope.ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}});
./src/app/services/querySrv.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}});
./src/app/services/filterSrv.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}});
./src/app/services/dashboard.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}});
and modified my vhost config for the reverse proxy like this:
<VirtualHost *:8181>
    ProxyRequests Off
    ProxyPass / http://127.0.0.1:9200/
    ProxyPassReverse / https://127.0.0.1:9200/
    <Location />
        Order deny,allow
        Allow from all
        AuthType Basic
        AuthName "Username/Password"
        AuthUserFile /var/www/cake2.2.4/.htpasswd
        Require valid-user
        Header always set Access-Control-Allow-Methods "GET, POST, DELETE, OPTIONS, PUT"
        Header always set Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization"
        Header always set Access-Control-Allow-Credentials "true"
        Header always set Cache-Control "max-age=0"
        Header always set Access-Control-Allow-Origin *
    </Location>
    ErrorLog ${APACHE_LOG_DIR}/error.log
</VirtualHost>
Apache sends back the new response headers but the request header still seems to be wrong somewhere. Authentication just doesn't work.
Request Headers
OPTIONS /solar_vendor/_search HTTP/1.1
Host: 46.252.26.173:8181
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Origin: http://46.252.26.173:81
Access-Control-Request-Method: POST
Access-Control-Request-Headers: authorization,content-type
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Response Headers
HTTP/1.1 401 Authorization Required
Date: Sat, 09 Nov 2013 08:48:48 GMT
Access-Control-Allow-Methods: GET, POST, DELETE, OPTIONS, PUT
Access-Control-Allow-Headers: Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization
Access-Control-Allow-Credentials: true
Cache-Control: max-age=0
Access-Control-Allow-Origin: *
WWW-Authenticate: Basic realm="Username/Password"
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 346
Connection: close
Content-Type: text/html; charset=iso-8859-1
SOLUTION:
After doing some more research, I found that this is definitely a CORS configuration issue. There are quite a few posts on the topic, but it appears that solving my problem that way would require some very granular Apache configuration and making sure the right things are sent from the browser.
So I reconsidered the strategy and found a much simpler solution: just modify the vhost reverse proxy config to serve the Elasticsearch server AND Kibana from the same HTTP port. This also adds even better security to Kibana.
This is what I did:
<VirtualHost *:8181>
    ProxyRequests Off
    ProxyPass /bigdatadesk/ http://127.0.0.1:81/bigdatadesk/src/
    ProxyPassReverse /bigdatadesk/ http://127.0.0.1:81/bigdatadesk/src/
    ProxyPass / http://127.0.0.1:9200/
    ProxyPassReverse / https://127.0.0.1:9200/
    <Location />
        Order deny,allow
        Allow from all
        AuthType Basic
        AuthName "Username/Password"
        AuthUserFile /var/www/.htpasswd
        Require valid-user
    </Location>
    ErrorLog ${APACHE_LOG_DIR}/error.log
</VirtualHost>
Here is a ready-made solution:
https://github.com/fangli/kibana-authentication-proxy
It supports not only a basic-auth backend, but also Google OAuth and basic auth for the client.
If it works for you, please give it a star, thanks.
In Kibana, replace the existing elastic-angular-client.js with the latest version, which can be found here. Then, in the Kibana code, replace all instances of:
$scope.ejs = ejsResource(config.elasticsearch);
with
$scope.ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'accept, origin, authorization', 'Authorization': 'Basic ' + Base64Encode('user:Password')}});
That should be all you need.
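One small caveat: Base64Encode is not a built-in browser function, so the snippet above assumes some Base64 helper is already loaded on the page. In current browsers the same header value can be produced with the native btoa function, for example:

// Hypothetical helper: build a Basic auth header value from plain credentials.
function basicAuthHeader(user, password) {
    // btoa base64-encodes the "user:password" pair, as HTTP Basic authentication expects.
    return 'Basic ' + btoa(user + ':' + password);
}

$scope.ejs = ejsResource({
    server: config.elasticsearch,
    headers: {
        'Access-Control-Request-Headers': 'accept, origin, authorization',
        'Authorization': basicAuthHeader('user', 'Password')
    }
});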
Update:
Is Apache configured for CORS? See this.
Header always set Access-Control-Allow-Methods "GET, POST, DELETE, OPTIONS, PUT"
Header always set Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization"
Header always set Access-Control-Allow-Credentials "true"
Header always set Cache-Control "max-age=0"
Header always set Access-Control-Allow-Origin *
You are correct in that it's a CORS issue. Kibana 3 uses CORS to communicate with ElasticSearch.
In order to enable HTTP Authentication Headers and Cookies to be sent with the Kibana CORS requests you need to do two things:
ONE: In your Kibana config.js file, find the setting where your ElasticSearch server is defined:
elasticsearch: "http://localhost:9200",
This needs to be changed to:
elasticsearch: {server: "http://localhost:9200", withCredentials: true},
This will tell Kibana to send the authentication headers and cookies IF the server is capable of receiving them.
TWO: Next you need to go into your ElasticSearch config file (elasticsearch.yml on the host server; mine was located at /etc/elasticsearch/elasticsearch.yml on a CentOS7 server). In this file you will find a "Network And HTTP" section. You will need to find the line that says:
#http.port: 9200
Uncomment this line and change the port to the port that you want ElasticSearch to run on. I chose 19200. Then do the same for the #transport.tcp.port: 9300 setting. Again I chose 19300.
Lastly, at the end of this section (just for organization's sake; you could also simply append the following to the end of the file) add in:
http.cors.allow-origin: http://localhost:8080
http.cors.allow-credentials: true
http.cors.enabled: true
You can change the above origin address to wherever your web server is serving Kibana from. Alternatively, you could simply put /.*/ to match all origins, but this is not advisable.
Now save the elasticsearch.yml file and restart the elasticsearch server. Your reverse proxy should be configured to run on port 9200 and point to 19200 if the request authenticates.
A word of warning: if you are using cookies to authenticate requests, make sure to whitelist the HTTP OPTIONS method in your reverse proxy configuration, as only GET, PUT, POST and DELETE requests include the cookies. I haven't tested whether OPTIONS requests include the Authorization header as well, but it may be the same situation as with the cookies. Kibana will not function correctly if the OPTIONS requests cannot get through.
It is also a good idea to configure your reverse proxy to blacklist any request that ends in _shutdown, as this command shouldn't be needed via external requests in most cases.
