Google App Engine BlobStore as image host? - google-app-engine

If I were to make a project with Google App Engine (using Python), and this project contained small user-generated images (which, once uploaded, will be accessed a lot but won't be changed or altered dynamically anymore), would the Google App Engine BlobStore make sense to use (in terms of costs, speed etc.)? Or would it make more sense for GAE or the client to connect to Amazon S3 and store the images there, since these files will end up being static?
For what it's worth, the generated image files are all considered to be public, not user-private, and it would be perfectly fine for them to be on another subdomain. All files will be fixed-palette 16-color PNGs of exactly 19x19 pixels. Their URL/ID would be referenced in the GAE datastore, along with a couple more attributes (like creatorId), for handling/showing them in the web app.
Thanks!
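For illustration, a minimal ndb sketch of the kind of entity described above (the property names are assumptions, not a fixed design):

    from google.appengine.ext import ndb

    class UserImage(ndb.Model):
        # URL (or blob key / GCS object name) under which the 19x19 PNG is served
        serving_url = ndb.StringProperty(required=True)
        creator_id = ndb.StringProperty()
        created = ndb.DateTimeProperty(auto_now_add=True)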

If you are concerned about speed and cost, by far the best way is to store them in the Blobstore and use get_serving_url(). These images are served by Google's high-performance image servers, will never cost you instance hours (just bandwidth), and you don't have to worry about memcache.
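A minimal sketch of that approach on the legacy Python runtime (the upload field name 'file' and the redirect target are assumptions):

    import urllib
    import webapp2
    from google.appengine.api import images
    from google.appengine.ext.webapp import blobstore_handlers

    class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
        def post(self):
            blob_info = self.get_uploads('file')[0]  # the uploaded PNG
            # One-time call; returns a permanent URL served by Google's
            # high-performance image serving infrastructure.
            serving_url = images.get_serving_url(blob_info.key())
            # Persist serving_url (plus creatorId etc.) in the datastore here,
            # then reference it from the web app.
            self.redirect('/uploaded?url=' + urllib.quote_plus(serving_url))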

I asked a similar question a few days ago.
I'm sticking with storing the images in the Datastore as blobs (not in the BlobStore) and setting a Cache-Control header to ensure they aren't requested too many times.
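A rough sketch of that pattern with webapp2/ndb (the Icon model and route are hypothetical):

    import webapp2
    from google.appengine.ext import ndb

    class Icon(ndb.Model):
        data = ndb.BlobProperty(required=True)  # raw PNG bytes, well under 1 MB

    class IconHandler(webapp2.RequestHandler):
        def get(self, icon_id):
            icon = Icon.get_by_id(int(icon_id))
            if icon is None:
                self.abort(404)
            self.response.headers['Content-Type'] = 'image/png'
            # The images never change once uploaded, so let clients and
            # proxies cache them aggressively instead of re-requesting.
            self.response.headers['Cache-Control'] = 'public, max-age=31536000'
            self.response.out.write(icon.data)

    app = webapp2.WSGIApplication([(r'/icon/(\d+)', IconHandler)])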

For such small images, you can simply use the Datastore. Even within the 1 GB of space the free quota gives you, you should be able to store quite a few 19x19 pixel images easily. Using the BlobStore is slightly more complicated, as the APIs are more complex and the actual storage procedure involves more steps than just storing binary data in the Datastore. I do recommend, however, that you implement memcache for the retrieval of these images, since you say they will not be modified afterwards. You don't want to query the same 19*19*4 bytes out of the database for each image over and over.
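A hedged sketch of that memcache layer (model and key names are illustrative only):

    from google.appengine.api import memcache
    from google.appengine.ext import ndb

    class Icon(ndb.Model):
        data = ndb.BlobProperty(required=True)  # the small PNG payload

    def get_icon_bytes(icon_id):
        # Return PNG bytes for an icon, hitting the Datastore only on a cache miss.
        cache_key = 'icon:%d' % icon_id
        data = memcache.get(cache_key)
        if data is None:
            icon = Icon.get_by_id(icon_id)
            if icon is None:
                return None
            data = icon.data
            # The images are never modified, so no expiry is needed.
            memcache.set(cache_key, data)
        return data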

Related

Where to store large deposit of images for a react application?

I am developing a React web application that will require a large number of images to be loaded dynamically based on user interaction (e.g. the user clicks on a grid and loads a specific image).
The issue is that the application will have a large number of images (about 10,000 in total) and some of them are quite large files (around 20 MB - 40 MB). We are expecting a total size of around 200 GB - 400 GB for all the images.
Where would be the best place to store these images? One option would be some sort of database (e.g. AWS), but I'm not sure how well that scales with such a large number of images.
Alternatively, would storing the images on the filesystem be a viable solution? I am currently not sure what the options are for deploying and hosting a site with such a large number of images.
Thanks for the help.
I would personally suggest going with cloud storage. Storing in filesystems is not easily transferable and is also expensive. We also need to go over the security aspects, which makes cloud platforms the better choice. They are also scalable, depending on the service and provider you decide to go with. For me, Dropbox is the best choice as it is cheaper and also covers almost all of my required services.

How to host large files on Google-App-Engine

I would like my server that runs on Google App Engine to host large files such as audio scripts and images. Is it possible to store them as a column in a database? If not, what mechanisms may I use?
You have two options:
Blobstore (currently available in Java, Python and Go).
Google Cloud Storage (currently available in Java, Python and PHP).
Blobstore and GCS are most likely what you are looking for.
Neither service is covered by the GAE SLA, however. If you need that kind of reliability promise, you're stuck with the GAE datastore.
You can put your files in a BLOB property of a datastore entity and serve them from there. Datastore entities have a size limit of 1 MB, however.
To circumvent that, you must split and re-assemble your files using multiple entities. There is also a size limit on any GAE response, which is 32 MB.
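A rough sketch of that split/re-assemble approach (entity and property names are assumptions; the chunk size stays under the 1 MB entity limit):

    from google.appengine.ext import ndb

    CHUNK_SIZE = 900 * 1024  # leave headroom below the ~1 MB entity limit

    class FileChunk(ndb.Model):
        file_name = ndb.StringProperty(required=True)
        index = ndb.IntegerProperty(required=True)
        data = ndb.BlobProperty(required=True)

    def store_file(file_name, payload):
        chunks = [FileChunk(file_name=file_name, index=i // CHUNK_SIZE,
                            data=payload[i:i + CHUNK_SIZE])
                  for i in range(0, len(payload), CHUNK_SIZE)]
        ndb.put_multi(chunks)

    def load_file(file_name):
        query = FileChunk.query(FileChunk.file_name == file_name).order(FileChunk.index)
        # Remember the 32 MB response limit when serving the re-assembled file.
        return b''.join(chunk.data for chunk in query)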

Is it better to store a physical Image into datastore or a link to it?

I need to store images and I have 2 options:
store the image in the GAE datastore.
store the image somewhere else (maybe on Dropbox or another website) and store its link in the GAE datastore.
What's the best practice when we need to store an image in the DB, under the hypothesis that each image is bijectively linked to a specific element of the datastore?
I think it depends heavily on the use case.
I have a small company website running on App Engine, and the content images are all stored in the datastore; for that application it works well (they are all relatively small images).
If you have a high-traffic site, you may find that storing them in GCS, or some other mechanism that supports a more cost-effective CDN, will be more appropriate.
If the images are large (more than 1 MB) then the datastore isn't a practical solution.
There will be no hard and fast rule. Understand your use cases, your cost structure, how complex the solution will be to manage, and then choose the most appropriate solution.
Neither of the above. Google's cloud platform includes a service specifically for storing files, Google Cloud Storage, which is well integrated into GAE. You should use that.
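A minimal sketch of writing a file to GCS from GAE with the cloudstorage client library and getting a serving URL for it (the object name and the use of the default bucket are assumptions):

    import cloudstorage as gcs
    from google.appengine.api import app_identity, images
    from google.appengine.ext import blobstore

    def store_and_serve(object_name, png_bytes):
        bucket = app_identity.get_default_gcs_bucket_name()
        filename = '/%s/%s' % (bucket, object_name)
        with gcs.open(filename, 'w', content_type='image/png') as f:
            f.write(png_bytes)
        # Optionally serve it through the high-performance image service.
        blob_key = blobstore.create_gs_key('/gs' + filename)
        return images.get_serving_url(blob_key)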

Storing back-end of mobile app on amazon web services - how?

I have an app which needs a backend which consists of 3 parts:
Database
Front-end PHP pages to handle requests from the app, access the DB and return JSON
File-system storage for pictures
Now I know that many people go with Amazon AWS today, and I recently had a quick look around at their services. For the database, it seems DynamoDB would suit me fine! But my question is, which product should I use (and how) to store the static PHP pages and the basic filesystem (which can potentially get pretty big)? For the static PHP pages, I really need something as simple as services like GoDaddy.com, but it has to be fast and be able to respond to many requests.
For the image file-system storage, this could either be done in the same place as the PHP files, or anywhere else as long as I can access it with PHP. What do you recommend?
I would really like to hear which products you think will suit this back-end, as this is a pretty common setup. If you think of something better than Amazon, I am open to suggestions; just keep in mind that the top priorities (in order) are stability, scalability and ease of use, and it needs to be VERY scalable.
Thanks for any replies!
You could use S3 for storage, as it is simple, easy to use and scalable. But it's slow to retrieve files from S3. To solve this, retrieve the files from CloudFront instead of S3. An S3 bucket can act as the origin server of your CloudFront distribution. This has two advantages -
- Retrieval will be very fast, especially for more popular pages/pictures
- It doesn't matter in which part of the globe your app is being used; CloudFront will select the nearest edge cache to serve your content.
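A minimal Python sketch of that S3 + CloudFront setup (boto3 is assumed; the bucket name and distribution domain are placeholders):

    import boto3

    S3_BUCKET = 'my-app-images'                    # placeholder bucket
    CDN_DOMAIN = 'dxxxxxxxxxxxx.cloudfront.net'    # your CloudFront distribution

    s3 = boto3.client('s3')

    def upload_picture(key, image_bytes):
        s3.put_object(Bucket=S3_BUCKET, Key=key, Body=image_bytes,
                      ContentType='image/jpeg',
                      CacheControl='public, max-age=86400')
        # Serve through the CloudFront edge cache, not directly from S3.
        return 'https://%s/%s' % (CDN_DOMAIN, key)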

Storing data in a Google App Engine App

I'm reading up on Google App Engine and I'm thinking of using it as a CDN for a project I'm working on. As far as I can tell, there are two ways to store data. I could use a datastore or I could put files in a directory.
I was brought up believing it's a bad idea to store large binary data in a database, but according to Google, the datastore isn't an RDBMS, it just acts like one.
So my gut is telling me to upload files to a directory. However, I thought I'd best canvas an opinion on here before making my mind up.
Has anyone used GAE for stuff like this? And if so, what method did you choose for storing files, and why?
You cannot write to the file system in App Engine. You need to use the Datastore to store any data.
Note that if your "large binary files" are actually large, you're going to run into the 1 MB limit on all API calls. An API for storing larger blobs is on the roadmap, but there's no way of knowing when it will be released. At present, you need to split blobs larger than 1 MB into multiple datastore entities.
The Blobstore API lets you store files up to 50 MB, though it's an experimental API and requires billing to be enabled. Also, it's different from Bigtable.
http://code.google.com/appengine/docs/java/blobstore/
Nowadays Google Cloud Storage is the way to go for large files.
