I use the GAE/J blobstore to upload images. In most cases it works just fine, but sometimes getting the image size gives me the wrong value:
...
BlobInfo info = blobInfoFactory.loadBlobInfo(blobKey);
long size = info.getSize();
Say I have a JPG image whose size is 141 KB with dimensions 236 x 1580. After uploading it to the blobstore, the size is only about 60 KB (the dimensions are wrong too), while other images bigger than 141 KB upload fine and report the correct size on the server side.
I'm not sure whether this is a GAE problem or not. Could someone shed some light on this? Thanks.
BTW, I use jquery-file-upload on the client side, with the maximum uploaded image size set to 5 MB.
I am aiming to take a file a user attaches through a Lightning Component and create a document object containing the data.
So far I have overcome the request size limits by chunking the uploaded data into 1 MB chunks. When the Apex Aura method receives one of these chunks, it either creates a new document (if it is the first chunk) or retrieves the existing document and appends the new chunk to the end.
Data is received Base64 encoded, and then decoded server-side.
As the document data is stored as a Blob, the original file contents are read as a String and the received chunk is appended to them. The new contents are then converted back into a Blob to be stored in the ContentVersion object.
The problem I'm having is that strings in Apex have a maximum length of 6,000,000 characters or so. Whenever the file size exceeds 6 MB, this limit is hit during the concatenation and the file upload halts.
I have attempted to avoid this limit by converting the Blob to a String only when necessary for the concatenation (as suggested here: https://developer.salesforce.com/forums/?id=906F00000008w9hIAA), but this hasn't worked. I'm guessing it was patched, because it still technically allocates a string larger than the limit.
The appending code is really simple so far:
ContentVersion originalDocument = [SELECT Id, VersionData FROM ContentVersion WHERE Id =: <existing_file_id> LIMIT 1];
Blob originalData = originalDocument.VersionData;
Blob appendedData = EncodingUtil.base64Decode(<base_64_data_input>);
Blob newData = Blob.valueOf(originalData.toString() + appendedData.toString());
originalDocument.VersionData = newData;
You will have a hard time with it.
You could try offloading the concatenation to an asynchronous process (@future/Queueable/Schedulable/Batchable); those get a 12 MB heap instead of 6 MB. That could buy you some time.
You could try cheating by embedding an iframe (Visualforce or a lightning:container tag? Or maybe a "canvas app") that grabs your file and does some manual JavaScript magic calling the normal REST API for document upload: https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_sobject_insert_update_blob.htm (the last code snippet is about multiple documents). Maybe jsforce?
Could you upload the file somewhere else (SharePoint? Heroku?) and have that system call into Salesforce to push it (no Apex = no heap size limit)? Or even look up "Files Connect".
Could you send an email with attachments? It's crude, but if you write a custom Email-to-Case handler class you'll have a 36 MB heap.
You wrote "we needed multiple files to be uploaded and the multi-file-upload component provided doesn't support all extensions". That may be caused by these:
In Experience Builder sites, the file size limits and types allowed follow the settings determined by site file moderation.
lightning-file-upload doesn't support uploading multiple files at once on Android devices.
If the Don't allow HTML uploads as attachments or document records security setting is enabled for your organization, the file uploader cannot be used to upload files with the following file extensions: .htm, .html, .htt, .htx, .mhtm, .mhtml, .shtm, .shtml, .acgi, .svg.
I am unable to open a base64 string as a PDF in an Angular application if the file size is more than, say, 10 MB. The browser crashes with a page saying "Aw, Snap!". The base64 string, i.e. "base64content", already has the metadata at the start and looks something like this:
data:application/pdf;base64,JVBERi0xLjQK...DUKJSVFT0YK
I am using the following to open the base64 string as a PDF in a new tab:
$window.open(base64content);
Try with
window.open('data:application/pdf;base64,' + base64content);
This happened because the base64 string was too big for the browser to interpret: the PDF was heavy (more than 2 MB, roughly the browser's limit). The solution was to use a Blob-based approach.
I recently started using Bottle and the GAE blobstore, and while I can upload files to the blobstore, I cannot seem to find a way to download them from the store.
I followed the examples from the documentation but was only successful with the uploading part. I cannot integrate the example into my app since I'm using a different framework than webapp/webapp2.
How would I go about creating an upload handler and a download handler so that I can get the key of the uploaded blob, store it in my data model, and use it later in the download handler?
I tried using BlobInfo.all() to query the blobstore, but I'm not able to get the key name field value of the entity.
This is my first interaction with the blobstore so I wouldn't mind advice on a better approach to the problem.
For serving a blob, I would recommend looking at the source code of BlobstoreDownloadHandler. It should be easy to port to Bottle, since there's nothing very framework-specific about it.
Here is an example of how to use BlobInfo.all():
for info in blobstore.BlobInfo.all():
    self.response.out.write('Name:%s Key: %s Size:%s Creation:%s ContentType:%s<br>' %
                            (info.filename, info.key(), info.size, info.creation, info.content_type))
For downloads, you only really need to generate a response that includes the header "X-AppEngine-BlobKey: [your blob_key]", along with anything else you need, such as a Content-Disposition header if desired. If it's an image, you should probably just use the high-performance Image Serving API instead: generate a URL and redirect to it, and you're done.
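For example, a minimal sketch of such a download route in Bottle (the route path and 404 handling are my own assumptions, not part of the original handler):

from bottle import route, response
from google.appengine.ext import blobstore

@route('/serve/<blob_key>')  # hypothetical route
def serve_blob(blob_key):
    info = blobstore.BlobInfo.get(blobstore.BlobKey(blob_key))
    if info is None:
        response.status = 404
        return 'No such blob'
    # App Engine's serving infrastructure replaces the (empty) body
    # with the blob identified by this header.
    response.set_header('X-AppEngine-BlobKey', str(info.key()))
    response.content_type = info.content_type
    return ''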
For uploads, besides writing a handler for App Engine to call once the upload is safely in the blobstore (that's in the docs), you need a way to find the blob info in the incoming request. I have no idea what the request looks like in Bottle. BlobstoreUploadHandler has a get_uploads method, and there's really no reason it needs to be an instance method as far as I can tell. So here's an example generic implementation of it that expects a WebOb request; for Bottle you would need to write something similar that is compatible with Bottle's request object.
import cgi

from google.appengine.ext import blobstore


def get_uploads(request, field_name=None):
    """Get uploads for this request.

    Args:
        request: A WebOb-style request object.
        field_name: Only select uploads that were sent as a specific field.

    Returns:
        A list of BlobInfo records corresponding to each upload, or an
        empty list if there are no blob-info records for field_name.

    Stolen from the SDK since they only provide a way to get to this
    crap through their crappy webapp framework.
    """
    if not getattr(request, "__uploads", None):
        # Cache the parsed uploads on the request the first time through.
        request.__uploads = {}
        for key, value in request.params.items():
            if isinstance(value, cgi.FieldStorage):
                if 'blob-key' in value.type_options:
                    request.__uploads.setdefault(key, []).append(
                        blobstore.parse_blob_info(value))
    if field_name:
        try:
            return list(request.__uploads[field_name])
        except KeyError:
            return []
    else:
        results = []
        for uploads in request.__uploads.itervalues():
            results += uploads
        return results
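For instance, a hedged sketch of calling it from a plain WSGI upload callback using WebOb (the field name and redirect target are assumptions):

from webob import Request

def upload_callback(environ, start_response):
    # Hypothetical WSGI handler registered as the create_upload_url() target.
    request = Request(environ)
    blob_infos = get_uploads(request, field_name='file')  # 'file' is an assumed field name
    if blob_infos:
        blob_key = blob_infos[0].key()
        # ... store str(blob_key) in your data model here ...
    start_response('302 Found', [('Location', '/')])
    return ['']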
For anyone looking for this answer in the future: to do this you need Bottle (d'oh!) and defnull's multipart module.
Since creating upload URLs is simple enough and covered in the GAE docs, I'll just cover the upload handler.
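(For reference, the URL-creation side that I'm glossing over looks roughly like this; the routes and field name are assumptions.)

from bottle import route
from google.appengine.ext import blobstore

@route('/upload-form')  # hypothetical route that renders the form
def upload_form():
    # App Engine stores the POSTed blob, then forwards the request to '/upload'.
    upload_url = blobstore.create_upload_url('/upload')
    return ('<form action="%s" method="post" enctype="multipart/form-data">'
            '<input type="file" name="file"><input type="submit" value="Upload">'
            '</form>' % upload_url)

The upload handler itself: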
from bottle import request
from multipart import parse_options_header
from google.appengine.ext.blobstore import BlobInfo


def get_blob_info(field_name):
    try:
        field = request.files[field_name]
    except KeyError:
        # Maybe the form isn't multipart, the file wasn't uploaded, or some such error
        return None
    blob_data = parse_options_header(field.content_type)[1]
    try:
        return BlobInfo.get(blob_data['blob-key'])
    except KeyError:
        # Malformed request? Wrong field name?
        return None
Sorry if there are any errors in the code, it's off the top of my head.
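To round it out, a hedged sketch of wiring this helper into the upload handler route (the route path, field name, and model are assumptions):

from bottle import post, redirect
from google.appengine.ext import ndb

class UploadedFile(ndb.Model):  # hypothetical model for storing the key
    blob_key = ndb.StringProperty()

@post('/upload')  # must match the path passed to blobstore.create_upload_url()
def upload_handler():
    info = get_blob_info('file')  # 'file' is the assumed form field name
    if info is None:
        return 'Upload failed'
    UploadedFile(blob_key=str(info.key())).put()
    redirect('/')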
I have an image in the blobstore which is uploaded by users (their profile pic). I want to make a copy of it and resize the copy so that it can be displayed as a thumbnail. I want to keep a separate copy instead of using the ImagesService each time, because the thumbnail would be used more often than the full profile image.
What I am doing here is this:
reader = profile_image.open()  # get binary data from blob
data = reader.read()
file_name = files.blobstore.create(mime_type=profile_image.content_type)  # file to write to
with files.open(file_name, 'a') as f:
    f.write(data)
files.finalize(file_name)
blob_key = files.blobstore.get_blob_key(file_name)

image = images.Image(blob_key=blob_key)
image.resize(width=32, height=32)

entity.small_profile_pic = <MyImageModel>(caption=<caption given by user>,
                                          picture=str(blob_key))
This is giving me the error:
BadValueError: Image instance must have a complete key before it can be stored as a reference.
I think this is because the blob is not saved (put()) into the datastore, but how do I do that? Does files.blobstore.get_blob_key(file_name) not do it?
I would also like to ask: does the blobstore also cache the dynamically transformed images served using get_serving_url()?
I would use the get_serving_url() method. The docs state that:
The get_serving_url() method allows you to generate a stable, dedicated URL for serving web-suitable image thumbnails. You simply store a single copy of your original image in Blobstore, and then request a high-performance per-image URL. This special URL can serve that image resized and/or cropped automatically, and serving from this URL does not incur any CPU or dynamic serving load on your application (though bandwidth is still charged as usual). Images are served with low latency from a highly optimized, cookieless infrastructure.
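A minimal sketch of that approach, assuming you already have the original image's blob key (variable names are placeholders):

from google.appengine.api import images

# One stable URL; resizing/cropping is done by the image-serving
# infrastructure rather than by your app code.
thumb_url = images.get_serving_url(original_image_key, size=32, crop=True)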
Also, the code you posted doesn't seem to follow the example in the docs. I would use something like this:
img = images.Image(blob_key=original_image_key)
img.resize(width=32, height=32)
thumbnail = img.execute_transforms(output_encoding=images.JPEG)

file_name = files.blobstore.create(mime_type='image/jpeg')  # file to write to
with files.open(file_name, 'a') as f:
    f.write(thumbnail)
files.finalize(file_name)
blob_key = files.blobstore.get_blob_key(file_name)
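As for the BadValueError in the question: it usually means the referenced model instance hasn't been saved yet, so it has no complete key. A hedged sketch, reusing the placeholder names from the question (MyImageModel, user_caption, and entity are stand-ins, not real names):

small_pic = MyImageModel(caption=user_caption, picture=str(blob_key))
small_pic.put()  # give it a complete key before assigning it as a reference
entity.small_profile_pic = small_pic
entity.put()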
I'm uploading some images to GAE's Blobstore. The upload succeeds, and when I view the images in the GAE console I can see that they're stored at the expected size and resolution.
According to this site, to get a URL for the uploaded image I can use this:
BlobKey blobKey = (BlobKey) blobs.get("image");
ImagesService imagesService = ImagesServiceFactory.getImagesService();
String imageUrl = imagesService.getServingUrl(blobKey);
The resulting URL points to a smaller version of the uploaded image (in both size and quality). How can I get a URL to the full-size image stored in the Blobstore?
You can append the =s0 parameter to the URL to get the full-sized image (e.g. imageUrl + "=s0").