Is there a size limit to upload file using webapp2? - google-app-engine

I am using App Engine and writing my server code in Python using webapp2. I am trying to upload video files from the browser and save them to Google Cloud Storage. I use a form element in my HTML and a webapp2 handler on the server side to upload the file from the browser. It works for smaller files, but when I try to upload a video file greater than 100MB, the browser throws the below error
This webpage is not available
ERR_CONNECTION_RESET
I am unable to debug this on the server side as it never reaches the post method.
Is there a config parameter in webapp2 that can be modified to upload files of greater size?
Any input is greatly appreciated.

App Engine has a limit of 32MB on all requests. You should upload your files directly to Google Cloud Storage, not through your server. This will also save you a lot of instance time.
EDIT: As Alex mentioned, signed URLs are a great way to let users upload and download files directly from GCS.
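For reference, here is a minimal sketch of generating such a signed URL server-side with the standalone google-cloud-storage client (not the webapp2-era API); the bucket name, object name, and content type below are placeholders:

import datetime
from google.cloud import storage

def make_upload_url():
    # The browser PUTs the file straight to GCS with this URL, so the
    # request never passes through App Engine and the 32MB limit
    # does not apply.
    client = storage.Client()
    blob = client.bucket("my-bucket").blob("videos/upload.mp4")
    return blob.generate_signed_url(
        version="v4",
        expiration=datetime.timedelta(minutes=15),
        method="PUT",
        content_type="video/mp4",
    )

The client must then send its PUT with the same Content-Type header that was used to sign the URL.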

Related

How can I upload a large media file to S3 from the client side?

I have a form where a user submits an image from the client side. Currently I am using AWS Amplify to upload images to S3. I have not faced an issue so far because most of the time users selected images smaller than 5MB. But right now I need to raise this limit and also optimize performance, since the size limit is increasing. While researching performance I learned about transfer acceleration and its compatibility with the AWS SDK, but not with AWS Amplify, so I had to move my upload logic to the SDK. My backend architecture is serverless, built on Lambdas. Say the user submits his form and the request contains a 10MB image. Lambda will not process this request, because the Lambda request-response cycle is limited to 6MB payloads. How can I improve performance while also increasing the size limit? And can this be achieved with AWS Amplify itself?
Your use case is best served with S3 pre-signed URLs. In this case your Lambda function generates a URL for an S3 upload using the aws-sdk and returns only that URL to your frontend. You can then upload directly to this URL from the frontend without AWS Lambda being invoked again. All the data transfer occurs between your frontend and S3.
You can find more details about this solution here.
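In Python terms, a minimal sketch of that Lambda function using boto3 (the bucket name, key, and expiry below are placeholders):

import boto3

def handler(event, context):
    # Return a pre-signed PUT URL; only this small response goes
    # through Lambda, while the 10MB image goes straight to S3.
    s3 = boto3.client("s3")
    url = s3.generate_presigned_url(
        "put_object",
        Params={
            "Bucket": "my-upload-bucket",
            "Key": "images/photo.jpg",
            "ContentType": "image/jpeg",
        },
        ExpiresIn=300,  # seconds the URL stays valid
    )
    return {"statusCode": 200, "body": url}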

How to upload image to Google Cloud Storage with GAE/Cloud Endpoints API method

Does anyone have an example of a GAE/Cloud Endpoints API method (in Java) that can take in an image from an Android app and upload it to Google Cloud Storage?
I cannot seem to find any samples on how to do this but it is possible from what I understand.
EDIT:
The tutorial here shows how to add a dependency to Google App Engine in Eclipse and upload/download an image to Google Cloud Storage. Is it possible to do this with Cloud Endpoints somehow? After all, they are both Google App Engine.
I want to offload as much of the upload/download code into my Cloud Endpoints API method(s), rather than coding everything inside of Android. This would allow me to reuse my Cloud Endpoints API on other clients.
More info I found: https://developers.google.com/api-client-library/java/apis/storage/v1#sample
It looks like this is the Gradle dependency for the Cloud Endpoints backend:
dependencies {
    compile 'com.google.apis:google-api-services-storage:v1-rev66-1.21.0'
}
EDIT:
You should use this dependency inside Cloud Endpoints:
compile 'com.google.appengine.tools:appengine-gcs-client:0.5'
You can upload files to Google Cloud Storage using the JSON API.
You may or may not want to store file metadata in the Datastore through Endpoints.
You may or may not want to authenticate your users through Endpoints before giving them the ability to store files in Storage.
What I want to say is that Storage / Endpoints / Datastore are three different things, and you are not required to use them all together.
Useful link: https://github.com/pliablematter/simple-cloud-storage
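As a rough illustration of the JSON API route (sketched in Python rather than Java; the upload endpoint is the JSON API's media-upload URL, while the bucket, object name, and token acquisition are placeholder assumptions):

import requests

def upload_media(data, access_token):
    # Simple (non-resumable) media upload to the GCS JSON API.
    # access_token must be an OAuth 2.0 token with a storage scope.
    url = ("https://storage.googleapis.com/upload/storage/v1"
           "/b/my-bucket/o?uploadType=media&name=photos/image.jpg")
    resp = requests.post(url, data=data, headers={
        "Authorization": "Bearer " + access_token,
        "Content-Type": "image/jpeg",
    })
    resp.raise_for_status()
    return resp.json()  # metadata of the newly created object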
You cannot directly upload (large) files to an Endpoints API method; instead you need to receive them using the blobstore (or GCS) (https://cloud.google.com/appengine/docs/python/blobstore/). This requires the following:
Set up a blobstore upload handler on your server (this is just a regular webapp2 handler).
Expose an Endpoints method that calls blobstore.create_upload_url(), and then returns the upload URL to your App.
Within the app, upload the picture to that upload URL; the file will then be accessible within your upload handler, where you can move it to GCS, Datastore, or somewhere else. A sketch of this flow follows below.
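A minimal sketch of steps 1 and 2 (the /upload path and the bucket name are placeholders; the create_upload_url call would sit inside your Endpoints method):

import webapp2
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers

# Step 1: a regular webapp2 blobstore upload handler.
class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # get_uploads() returns the BlobInfo records for the files
        # that were posted to the upload URL.
        upload = self.get_uploads()[0]
        self.response.write('Uploaded blob key: %s' % upload.key())

# Step 2: create the one-shot upload URL to return to the app.
def make_upload_url():
    # Passing gs_bucket_name stores the upload in GCS directly.
    return blobstore.create_upload_url('/upload',
                                       gs_bucket_name='my-bucket')

app = webapp2.WSGIApplication([('/upload', UploadHandler)])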
Solution : https://github.com/thorrism/GoogleCloudExample
Enable Google Cloud Storage : => https://console.developers.google.com/apis
Generate and download a key P12 : => https://console.developers.google.com/iam-admin/iam
Create a folder named "assets" and place your key there:
=> app/src/main/assets/"yourKey.P12"
So that everyone can read your uploaded files, do not forget to add the appropriate permissions on your bucket.

Is it possible for users to upload to google cloud storage?

I'd like to create an object, give my users an upload url, and let them upload data. The resulting object must be public-readable. Is this possible with google cloud storage? If so, is it possible through google app engine, and where can I find documentation and/or examples for doing it?
To have a user upload directly to Google Cloud Storage, you can use the Signed URLs feature. This allows you to grant a single user permission to issue a PUT request to an object.
If you're using Python, there is a python example demonstrating signed URLs.
You can create an upload url using the blobstore service. See the create_upload_url function.
To make the object publicly accessible you may need to play with the ACLs of the bucket.
See also the Cloud Storage Overview.
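For example, here is a short sketch of making a single uploaded object public-readable with the standalone google-cloud-storage client (bucket and object names are placeholders, and this assumes the bucket does not use uniform bucket-level access):

from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("uploads/photo.jpg")
blob.make_public()  # grants allUsers read access to this object
print(blob.public_url)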
Another option to upload directly to Google Cloud Storage is Resumable URLs.
If your object is big, such as a video, you can upload it in chunks this way. If the upload fails (e.g. the client loses its internet connection), you can resume from where you left off rather than making the user start over. Plus you save some money by not having to restart the upload.
However if your media is small, just use Signed URLs.
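A minimal sketch of starting such a resumable session server-side with the google-cloud-storage client (bucket, object, and origin are placeholders); the client then PUTs chunks to the returned session URL:

from google.cloud import storage

def start_resumable_session():
    client = storage.Client()
    blob = client.bucket("my-bucket").blob("videos/big.mp4")
    # Returns a session URL the client uploads chunks to; an
    # interrupted upload can be resumed against the same URL.
    return blob.create_resumable_upload_session(
        content_type="video/mp4",
        origin="https://myapp.example.com",  # CORS origin for browser uploads
    )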

Allowing an authenticated user to download a big object stored on Google Storage

I have some big files stored on Google Storage. I would like users to be able to download them only when they are authenticated to my GAE application. The user would use a link of my GAE such as http://myapp.appspot.com/files/hugefile.bin
My first try works for files smaller than 32MB. Using the experimental Google Storage API, I could read the file first and then serve it to the user. It required my GAE application to be a team member of the project on which Google Storage was enabled. Unfortunately this doesn't work for large files, and it hogs bandwidth by first downloading the file to GAE and then serving it to the user.
Does anyone have an idea of how to accomplish this?
You can store files up to 5GB in size using the Blobstore API: http://code.google.com/appengine/docs/python/blobstore/overview.html
Here's the Stackoverflow thread on this: Upload file bigger than 40MB to Google App Engine?
One thing to note is that the blobstore can only be read in 32MB increments, but the API provides ways to access portions of the file for reads: http://code.google.com/appengine/docs/python/blobstore/overview.html#Serving_a_Blob
FYI, in the upcoming 1.6.4 release of App Engine we've added the ability to pass a Google Storage object name to blobstore.send_blob(), letting you send Google Storage files of any size from your App Engine application.
Here is the pre-release announcement for 1.6.4.
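Putting that together, here is a sketch of an authenticated download handler for the original question (the /files route and bucket name are placeholders):

import webapp2
from google.appengine.api import users
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers

class FileHandler(blobstore_handlers.BlobstoreDownloadHandler):
    def get(self, filename):
        # Only serve the file to users signed in to the app.
        if not users.get_current_user():
            return self.redirect(users.create_login_url(self.request.uri))
        # Map the GCS object to a blob key and let App Engine stream
        # it; send_blob can serve files of any size.
        gs_key = blobstore.create_gs_key('/gs/my-bucket/' + filename)
        self.send_blob(gs_key)

app = webapp2.WSGIApplication([(r'/files/(.+)', FileHandler)])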

Bulk file upload Appengine

I have around 1500 images which are dynamically generated on my local server. I want to upload these to App Engine's datastore. How can I do this? Any help, any ideas?
BlobStore
You can use the blobstore if you have a billing account.
So I would use the blobstore API to upload the images in chunks with a client tool over HTTP (see the sketch below).
Before every upload, you can ask the blobstore to give you a unique upload URL to send MIME multipart POSTs to.
There is a size limit for requests I think.
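A rough sketch of such a client tool, assuming your app exposes a hypothetical /get_upload_url endpoint that calls blobstore.create_upload_url() and returns a fresh URL per file:

import glob
import requests

APP = "http://myapp.appspot.com"

for path in glob.glob("images/*.png"):
    # Each blobstore upload URL is single-use, so fetch a new one
    # per image, then post the file as a MIME multipart form.
    upload_url = requests.get(APP + "/get_upload_url").text.strip()
    with open(path, "rb") as f:
        resp = requests.post(upload_url, files={"file": f})
    resp.raise_for_status()
    print("uploaded", path)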
Simple DataStore
If you store your images directly in the datastore, you can write a Python tool to synchronize them.
