Is it possible to upload files to S3 from the browser in IE8?

I currently have this code in JavaScript:
var file_object = $('#PHOTO').get(0).files[0];
var the_form = new FormData();
the_form.append("AWSAccessKeyId", "TESTING");
the_form.append("acl", "authenticated-read");
the_form.append("policy", policy);
the_form.append("signature", signature);
the_form.append("Content-Type", "image/jpeg");
the_form.append("key", "test.jpg");
the_form.append("file", file_object);
$.ajax({
    url: "http://S3BUCKET.s3.amazonaws.com",
    type: "POST",
    data: the_form,
    processData: false,
    contentType: false
});
It works nicely in Chrome and Firefox, but not in IE6, 7, 8, or 9.
The reason is that the File API is not supported until IE10!
https://developer.mozilla.org/en-US/docs/Web/API/File
Is there any workaround for browsers before IE10?
PS: A code example would be nice!

Without Flash, many things are definitely a no-go. I believe the lib you reference has some Flash fallbacks, but I'm unclear as to whether they can handle all the issues involved. This is something I'm currently dealing with myself, and here are the issues in brief:
Content-Type header in the response. IE (without a Flash intermediary) will try to download a JSON content type; no way around this that I know of without a proxy middleman to fudge the headers.
Hostname mapping. If you don't map to the origin hostname, the IE iframe (which is the non-Flash fallback) will not allow you to read its contents from the containing window. Fire-and-forget may be possible, but consuming the response/detecting errors from S3 may not be. A sketch of the iframe approach follows below.
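For the iframe fallback specifically, the usual trick is a plain multipart form that posts directly to S3 and targets a hidden iframe, since old IE has neither the File API nor XHR2. A minimal sketch reusing the fields from the question; the response-reading caveats above still apply, and old IE only honors an iframe's name if it is set at creation time:

// Hidden iframe to receive the response (fire and forget in old IE).
var iframe;
try {
    iframe = document.createElement('<iframe name="s3_target">'); // IE<9 quirk
} catch (e) {
    iframe = document.createElement('iframe');
    iframe.name = 's3_target';
}
iframe.style.display = 'none';
document.body.appendChild(iframe);

// Plain multipart form posting straight to the bucket.
var form = document.createElement('form');
form.action = 'http://S3BUCKET.s3.amazonaws.com';
form.method = 'POST';
form.target = 's3_target';
form.enctype = 'multipart/form-data';
form.encoding = 'multipart/form-data'; // old IE spells it this way

var fields = {
    AWSAccessKeyId: 'TESTING',
    acl: 'authenticated-read',
    policy: policy,
    signature: signature,
    'Content-Type': 'image/jpeg',
    key: 'test.jpg'
};
for (var name in fields) {
    var input = document.createElement('input');
    input.type = 'hidden';
    input.name = name;
    input.value = fields[name];
    form.appendChild(input);
}

// Move the real file input into the form; S3 expects it to be named
// "file" (set name="file" in the markup) and to come after the policy fields.
form.appendChild(document.getElementById('PHOTO'));
document.body.appendChild(form);
form.submit();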
I will update this answer as I uncover more in the coming days. This is a large project, so we have some pretty significant requirements, and I imagine I'll learn a lot in the next week or so.
This is covered in a lot more detail here (not my company/project/post): http://blog.fineuploader.com/2013/08/16/fine-uploader-s3-upload-directly-to-amazon-s3-from-your-browser/


How to initiate a file download in Extjs 7.0 Modern framework

In the ExtJS Classic framework we use Ext.form.Panel to initiate a file download. In the Modern framework I have been unable to get Ext.form.Panel to do the same. So how can I perform a simple file download (of a potentially large file) in the Modern framework? I would rather not have to change the server-side code.
This is the code we use in Classic:
var params = {};
params.doc_path = rec.get('server_path');
params.doc_link_name = rec.get('name');
var form = Ext.create('Ext.form.Panel', {
    standardSubmit: true,
    renderTo: Ext.getBody(),
    url: '/document/download',
    method: 'POST',
    timeout: 120
});
// Call the submit to begin the file download.
// Note that neither Success nor Failure are ever called.
form.submit({
    params: params
});
This is the server-side code in our Ruby server:
def download
  # A small helper to download the file passed in doc_path
  send_file params["doc_path"], type: 'application/octet-stream',
            disposition: 'attachment', filename: params["doc_link_name"]
end
If we try that in Modern, our server does not receive the correct URL (we get a routing error).
File upload using a form in Modern works just the same as in Classic, so why doesn't the file download work the same?
Does anyone have some sample code on how to use Ext.exporter.file to download a file from a server? I have read the docs and just got lost. And in any case, when I put in a require for Ext.exporter.file, I get a 404 not found error, so that is out of the question.
I just need click and forget so no need to track success or failure.
Thanks to Imagine-breaker for his answer in this post, I implemented this function:
downloadURI: function(uri, name) {
    var link = document.createElement("a");
    if (!name) name = "";
    link.setAttribute('download', name);
    link.href = uri;
    document.body.appendChild(link);
    link.click();
    link.remove();
}
and call it like this:
lfg.downloadURI('/document/download?doc_path=' + encodeURIComponent(rec.get('server_path')), rec.get('name'));
Server-side code remains exactly as is. I might implement this in our Classic version of the app as well; it seems so much simpler.
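If your route really does need the parameters in a POST body (as the Classic standardSubmit code sent them), a plain dynamically-built form is a framework-free equivalent; a sketch under that assumption, using the same route and parameter names (downloadViaPost is a hypothetical name):

downloadViaPost: function(url, params) {
    var form = document.createElement('form');
    form.method = 'POST';
    form.action = url;
    for (var name in params) {
        var input = document.createElement('input');
        input.type = 'hidden';
        input.name = name;
        input.value = params[name];
        form.appendChild(input);
    }
    document.body.appendChild(form);
    form.submit(); // the browser handles the attachment; click and forget
    form.remove();
}

called like the GET version above:
lfg.downloadViaPost('/document/download', { doc_path: rec.get('server_path'), doc_link_name: rec.get('name') });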

CSRF Validation Failed in Drupal 7

I've been searching and searching, including the many topics here, for a solution to my problem. I've had no luck thus far.
A bit of a backstory: I'm writing an AngularJS app with Drupal 7 as a backend. I'm able to log in without problems, save the session name and session ID, and put them together for a Cookie header (I had to use this "hack"). Furthermore, if I make a login call in the Postman app and then try to update the node, it works. It makes me think that there's a problem with session authentication, but I still can't figure it out.
That being said, I'm at a roadblock. Whenever I try to PUT to update a node, I get the following error:
401 (Unauthorized : CSRF validation failed)
Now, my ajax call looks like this:
$http({
    method: 'PUT',
    url: CONSTANTS.SITE_URL + "/update/node/" + target_nid,
    headers: {
        'Content-Type': CONSTANTS.CONTENT_TYPE,
        'Authentication': CONSTANTS.SESS_NAME + "=" + CONSTANTS.SESS_ID,
        'X-CSRF-Token': CONSTANTS.TOKEN
    },
    data: {
        // (JSON stuff)
    }
});
The CONTENT_TYPE is "application/json", the "Authentication" header is the band-aid for the Cookie header problem, and the "X-CSRF-Token" is what is (presumably) giving me the problem. SESS_NAME, SESS_ID, and TOKEN are all gathered from the response at login. I can pull lists made by users on the website, and I can pull the list of all the nodes of a certain type as well. I only run into a problem when I attempt the PUT to update the node.
If I missed any information, let me know and I'll add it!
EDIT: I'm using AngularJS version 1.5.3.
After trying everything else, I followed one of the comments in the thread I linked at the beginning of my original post. They had to comment out a line in Services.module:
if ($non_safe_method_called && !drupal_valid_token($csrf_token, 'services')) {
    //return t('CSRF validation failed');
}
It's around line 590, plus or minus a few depending on how much you've messed with the file. I don't like doing it this way, but I can't for the life of me figure out why the token's not working right. It's a temporary fix, for sure, but if someone runs across this with the same problem in the future, it'll hopefully help you out!
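One more thing worth trying before patching the module: request a fresh token immediately before the PUT instead of reusing the one captured at login; a stale token is a common cause of this error. A minimal sketch, assuming the Services module's default token endpoint at /services/session/token:

// Fetch a fresh CSRF token, then fire the PUT from the success handler.
$http.get(CONSTANTS.SITE_URL + "/services/session/token")
    .then(function(response) {
        CONSTANTS.TOKEN = response.data; // the endpoint returns plain text
        // ...now issue the $http PUT shown above with the fresh token.
    });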
Instead of removing the line, you could also pass TRUE as the third argument to drupal_valid_token:
if ($non_safe_method_called && !drupal_valid_token($csrf_token, 'services', TRUE)) {
    return t('CSRF validation failed');
}

Provide a callback URL in Google Cloud Storage signed URL

When uploading to GCS (Google Cloud Storage) using the BlobStore's createUploadURL function, I can provide a callback together with header data that will be POSTed to the callback URL.
There doesn't seem to be a way to do that with GCS's signed URLs.
I know there is Object Change Notification, but that won't allow the user to provide upload-specific information in the header of a POST, the way it is possible with createUploadURL's callback.
My feeling is, if createUploadURL can do it, there must be a way to do it with signed URLs, but I can't find any documentation on it. I was wondering if anyone may know how createUploadURL achieves that callback-calling behavior.
PS: I'm trying to move away from createUploadURL because of the __BlobInfo__ entities it creates, which for my specific use case I do not need, and which somehow seem to be indelible and are wasting storage space.
Update: It worked! Here is how:
Short Answer: It cannot be done with PUT, but can be done with POST
Long Answer:
If you look at the signed-URL page, in front of HTTP_Verb, under Description, there is a subtle note that this page is only relevant to GET, HEAD, PUT, and DELETE; POST is a completely different game. I had missed this, but it turned out to be very important.
There is a whole page of HTTP headers that does not list an important header that can be used with POST: that header is success_action_redirect, as voscausa correctly answered.
On the POST page, Google "strongly recommends" using PUT unless you are dealing with form data. However, POST has a few nice features that PUT does not have. They may worry that POST gives us too many strings to hang ourselves with.
But I'd say it is totally worth dropping createUploadURL, and writing your own code to redirect to a callback. Here is how:
Code:
If you are working in Python, voscausa's code is very helpful.
I'm using apejs to write JavaScript in a Java app, so my code looks like this:
var json = {}; // collects the fields that will be sent to the client

var exp = new Date();
exp.setTime(exp.getTime() + 1000 * 60 * 100); // 100 minutes

json['GoogleAccessId'] = String(appIdentity.getServiceAccountName());
json['key'] = keyGenerator();
json['bucket'] = bucket;
json['Expires'] = exp.toISOString();
json['success_action_redirect'] = "https://" + request.getServerName() + "/test2/";
json['uri'] = 'https://' + bucket + '.storage.googleapis.com/';

var policy = {
    'expiration': json.Expires,
    'conditions': [
        ["starts-with", "$key", json.key],
        {'Expires': json.Expires},
        {'bucket': json.bucket},
        {"success_action_redirect": json.success_action_redirect}
    ]
};

var plain = StringToBytes(JSON.stringify(policy));
json['policy'] = String(Base64.encodeBase64String(plain));
var result = appIdentity.signForApp(Base64.encodeBase64(plain, false));
json['signature'] = String(Base64.encodeBase64String(result.getSignature()));
The code above first provides the relevant fields.
Then it creates a policy object, stringifies it, and converts it into a byte array (you can use .getBytes in Java; I had to write a function for JavaScript, shown at the end of this answer).
A base64-encoded version of this array populates the policy field.
Then it is signed using the appIdentity package. Finally, the signature is base64 encoded, and we are done.
On the client side, all members of the json object will be added to the form, except uri, which is the form's address.
var formData = new FormData(document.forms.namedItem('upload'));
var blob = new Blob([thedata], {type: 'application/json'});
var keys = ['GoogleAccessId', 'key', 'bucket', 'Expires',
            'success_action_redirect', 'policy', 'signature'];
for (var i = 0; i < keys.length; i++) {
    formData.append(keys[i], url[keys[i]]);
}
formData.append('file', blob);

var rest = new XMLHttpRequest();
rest.open('POST', url.uri);
rest.onload = callback_function;
rest.send(formData);
If you do not provide a redirect, the response status will be 204 on success; if you do redirect, it will be 200. If you get a 403 or 400, something about the signature or policy is probably wrong; look at the responseText, it is often helpful.
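For completeness, the callback_function used above could branch on exactly those status codes; a small sketch:

function callback_function() {
    if (rest.status === 200 || rest.status === 204) {
        // 200 when success_action_redirect is set, 204 otherwise.
        console.log('upload succeeded');
    } else {
        // 400/403 usually indicate a policy or signature problem.
        console.log('upload failed: ' + rest.status, rest.responseText);
    }
}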
A few things to note:
Both POST and PUT have a signature field, but they mean slightly different things: in the case of POST, it is a signature of the policy.
PUT has a base URL which contains the key (object name), but the URL used for POST may only include the bucket name.
PUT requires the expiration as seconds from the UNIX epoch, but POST wants it as an ISO string.
A PUT signature should be URL encoded (in Java: by wrapping it with a URLEncoder.encode call). But for POST, Base64 encoding suffices.
By extension, for POST use Base64.encodeBase64String(result.getSignature()), and do not use the Base64.encodeBase64URLSafeString function.
You cannot pass extra headers with the POST; only those listed on the POST page are allowed.
If you provide a URL for success_action_redirect, it will receive a GET with the key, bucket, and eTag.
The other benefit of using POST is that you can provide size limits (see the sketch below). With PUT, if a file breaches your size restriction, you can only delete it after it has been fully uploaded, even if it is multiple terabytes.
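The size limit mentioned in the last note is expressed as a content-length-range condition in the same policy document; a sketch extending the policy object built earlier (the 10 MB cap is an arbitrary example):

var policy = {
    'expiration': json.Expires,
    'conditions': [
        ["starts-with", "$key", json.key],
        {'Expires': json.Expires},
        {'bucket': json.bucket},
        {"success_action_redirect": json.success_action_redirect},
        // Reject uploads outside 1 byte .. 10 MB before any bytes are stored.
        ["content-length-range", 1, 10485760]
    ]
};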
What is wrong with createUploadURL?
The method above is a manual createUploadURL.
But:
You don't get those __BlobInfo__ objects, which create many indexes and are indelible. This irritates me as it wastes a lot of space (which reminds me of a separate issue: issue 4231. Please go give it a star).
You can provide your own object name, which helps create folders in your bucket (see the sketch after this list).
You can provide different expiration dates for each link.
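For the object-name point, a slash in the key is all it takes to get "folders"; a sketch replacing the keyGenerator() line above (userId and fileName are hypothetical):

// "Folders" in GCS are just prefixes in the object name.
json['key'] = 'uploads/' + userId + '/' + fileName;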
For the very, very few JavaScript App Engine developers:
function StringToBytes(sz) {
    var map = function(x) { return x.charCodeAt(0); };
    return sz.split('').map(map);
}
You can include success_action_redirect in a policy document when you use the GCS POST Object API.
Docs here: https://cloud.google.com/storage/docs/xml-api/post-object
Python example here: https://github.com/voscausa/appengine-gcs-upload
Example callback result:
def ok(self):
    """ GCS upload success callback """
    logging.debug('GCS upload result : %s' % self.request.query_string)
    bucket = self.request.get('bucket', default_value='')
    key = self.request.get('key', default_value='')
    key_parts = key.rsplit('/', 1)
    folder = key_parts[0] if len(key_parts) > 1 else None
A solution I am using is to turn on Object Change Notifications. Any time an object is added, a POST is sent to a URL; in my case, a servlet in my project.
In the doPost() I get all the info about the object added to GCS, and from there I can do whatever I need.
This worked great in my App Engine project.

How can CakePHP return a Mimetype header of JPG?

Found this:
http://stackoverflow.com/questions/7198124/setting-the-header-to-be-content-image-in-cakephp
but I'm either not understanding it, or it's not working for me.
Basically, I want to record 'opens' for emails. Currently it "works", but Gmail shows that an image is not being displayed, so I want to return an actual image header. I've tried doing:
$this->layout=false;
$this->response->type('jpg');
in the controller for Opens, but that is not working.
Web Sniffer (http://web-sniffer.net/) shows a JPEG response, but I still get a blank "no file found" image. How can I fix this?
[edit]
Thinking this:
http://stackoverflow.com/questions/900207/return-a-php-page-as-an-image
and this:
http://book.cakephp.org/2.0/en/controllers/request-response.html
might be the solution
Serve a real image
If you only send the headers for an image but don't send the image content, it will be considered a broken image. To send a file, refer to the documentation for whichever version of CakePHP you are using. For example, in 2.3+:
public function opened() {
    ...
    $this->response->file('/path/to/1x1.gif');
    return $this->response;
}
Pretty sure this worked:
$name = './img/open.jpg';
$fp = fopen($name, 'rb');
$this->response->header("Content-Type: image/jpg");
$this->response->header("Content-Length: " . filesize($name));
fpassthru($fp);
Where open.jpg is a real 1x1 pixel image in CakePHP's /img directory.
Would love it if someone else could confirm?
Getting:
(raw JPEG bytes, starting with the JFIF header)
when going to the URL manually, so I'm guessing that's a "real" image file? No longer getting Gmail's no-image icon.
Never mind; Web Sniffer says this is an HTML response. Will update when I figure it out.
[edit]
I think this might be the correct way:
$this->response->type('jpg');
$this->response->file('./img/open.jpg');
return $this->response; // as in the answer above, the action must return the response
I just tested, and I'm definitely getting a 1x1 pixel download. No gibberish like above.
Gmails caching proxy
Normally, serving a real image as suggested by AD7Six would be the way to go; however, with Gmail's caching proxy in place you may run into problems when just serving an image.
http://www.emailmarketingtipps.de/2013/12/07/gmails-image-caching-affects-email-marketing-heal-opens-tracking/
The problem is/was that the image had been cached and the proxy wouldn't request it a second time, making tracking of opens unreliable.
Content length to the rescue
Until recently, the workaround for this had been to respond with a content length of 0 (and private cache control, which seems to be necessary for some other webmail providers):
// ...
$this->response->header(array(
    'Cache-Control' => 'private',
    'Content-Length' => 0
));
$this->response->type('gif');
return $this->response;
This would return a response with no content body, which is treated as broken; however, not all clients actually showed a broken image symbol. Still, it was recommended to hide the image using styles on the img tag.
Return of cache control
However, it has been reported that Gmail made some changes recently, so that sending a no-cache header is now respected again.
http://blog.movableink.com/real-time-content-and-re-open-tracking-return-to-gmail/
So in addition to AD7Six's example, an appropriate Cache-Control header might now do the trick:
// ...
$this->response->header(array(
    'Cache-Control' => 'no-cache, max-age=0'
));
$this->response->file('/path/to/1x1.gif');
return $this->response;

DOM Exception 18 in Awesomium

I have a webpage loaded from the local file system and rendered using Awesomium; the page uses AngularJS to render part of it. However, part of my AngularJS controller generates a DOM Exception 18:
angular-1.0.0rc10.js # line 5349 Error: SECURITY_ERR: DOM Exception 18
It seems this exception is caused by the presence of this code at the end of my AngularJS controller:
$http({method: 'GET', url: 'http://placeholdermetaserver.appspot.com/list?format=json&game=Heist'})
    .success(function(data)
    {
        // Do stuff with data
    });
Oddly enough, everything is just fine if I use a straight XMLHttpRequest instead of the AngularJS $http object:
var request = new XMLHttpRequest();
request.onload = function() {
    // Do stuff with data
};
request.open("GET", "http://placeholdermetaserver.appspot.com/list?format=json&game=Heist", true);
request.send();
This exception is not generated when I simply load this page in Chrome (off the local file system, same as Awesomium).
What could cause this, and how can I fix it?
The $http service includes some cross-site request forgery (XSRF) countermeasures. I'm not especially familiar with Awesomium, so I'm not sure what security features it implements, but I'd consult the documentation (both Awesomium's and AngularJS's) for more info:
http://docs.angularjs.org/api/angular.module.ng.$http
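For reference, AngularJS's documented countermeasure is cookie-to-header: the server sets an XSRF-TOKEN cookie, and $http echoes it back as an X-XSRF-TOKEN header on same-domain requests. Since this touches document.cookie, it may be why $http trips a SECURITY_ERR from a file:// page where a bare XMLHttpRequest does not. A rough illustration of the mechanism with a plain XHR (the endpoint is hypothetical):

// Read the XSRF-TOKEN cookie the server set.
var token = document.cookie.replace(/(?:(?:^|.*;\s*)XSRF-TOKEN\s*\=\s*([^;]*).*$)|^.*$/, "$1");

var xhr = new XMLHttpRequest();
xhr.open('GET', '/list?format=json', true);
// This mirrors what $http does automatically on same-domain requests.
xhr.setRequestHeader('X-XSRF-TOKEN', token);
xhr.send();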
From the perspective of your server, this is prone to the textbook XSRF img-tag attack if you ever send a GET request like:
"http://myapp.com/doSomething/somegame/12345"
From the perspective of your client, let's say you make a request like:
"http://myapp.com/doSomething/somegame/" + someId
A clever hacker might coax someId to be:
"#123.45.56.689/myEvilFakeJson.json"
in which case the request isn't made to your server, but instead to some other one. If you hard-code the location of the request or are careful with sanitizing input, it probably won't be that much of a risk.
