I have an app that needs a backend consisting of 3 parts:
Database
Front PHP pages to handle requests from the app, access the DB and return JSON
File-system storage for pictures
Now I know that many people go with Amazon AWS these days, and I recently had a quick look around at their services. For the database, it seems DynamoDB would suit me fine! But my question is: which product should I use (and how) to store the static PHP pages and the basic filesystem (which can potentially get pretty big)? For the static PHP pages, I really need something as simple as services like GoDaddy.com, but it has to be fast and able to respond to many requests.
For storing the image files, this could either be done in the same place as the PHP files, or anywhere else as long as I can access it with PHP. What do you recommend?
I would really like to hear which products you think will suit this backend, as this is a pretty common setup. If you think of something better than Amazon, I am open to suggestions; just keep in mind that the top priorities (in order) are stability, scalability and ease of use, and it needs to be VERY scalable.
Thanks for any replies!
You could use S3 for storage, as it is simple, easy to use and scalable. But retrieving files directly from S3 can be slow. To solve this, retrieve the files from CloudFront instead of S3: an S3 bucket can act as the origin server of your CloudFront distribution (a quick sketch follows the list). This has two advantages:
- Retrieval will be very fast, especially for the more popular pages/pictures.
- It doesn't matter in which part of the globe your app is being used; CloudFront will select the nearest edge cache to serve your content.
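To make this concrete, here is a minimal sketch of the upload side. It is shown in Python with boto3 purely for brevity (the AWS SDK for PHP exposes the same operations), and the bucket name and CloudFront domain are placeholders you would replace with your own:

```
# Minimal sketch of the upload side, assuming a bucket named "myapp-media"
# fronted by a CloudFront distribution at "d1234example.cloudfront.net"
# (both placeholders).
import boto3

s3 = boto3.client("s3")

BUCKET = "myapp-media"                      # hypothetical bucket (CloudFront origin)
CDN_DOMAIN = "d1234example.cloudfront.net"  # hypothetical distribution domain

def upload_picture(local_path, key):
    """Upload a picture to S3 and return the CloudFront URL to hand back to the app."""
    s3.upload_file(local_path, BUCKET, key,
                   ExtraArgs={"ContentType": "image/jpeg",
                              "CacheControl": "max-age=86400"})
    # Clients fetch through CloudFront, which pulls from the S3 origin on a cache miss.
    return "https://%s/%s" % (CDN_DOMAIN, key)

print(upload_picture("photo.jpg", "users/42/photo.jpg"))
```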
I am working on a webapp for a client that has a cPanel virtual server, and it appears that I can only use MySQL, but I want to store the data using a JSON-like structure so that I can more easily use Angular.js on the frontend.
I've looked into installing a NoSQL database and I can't find anything viable (if you know of a way to do that, that would be my best solution), so I'm thinking of storing the data as JSON strings in a series of text files on the server that I would write to with PHP.
I'd like to hear some opinions, and whether there are any better solutions I'm not thinking of.
Go look at Firebase and thank me afterwards.
In short, Firebase is a cloud-hosted real-time JSON data store. Everything on the backend is done for you, and all you need to build is the front-end. Their servers are globally distributed, CDN-style, which means it works well if you're looking to serve the entire world. All you need to do is configure your data structure and use it!
It also provides sockets, which is great for real-time data (used for games, chat, etc.).
There is a free tier. The only downside is that it gets a little expensive if you want to scale; nevertheless, if your app really gets to that stage, I'm sure you'll have the money to hire people to build a similar backend yourself.
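For a taste of how little backend code this leaves you with, here is a rough sketch against Firebase's REST interface, where any path can be read or written as JSON by appending `.json`. The project URL below is a placeholder, and a real app would also pass an auth token unless your security rules allow public access:

```
# Rough sketch of Firebase's REST interface ("my-app.firebaseio.com" is a
# placeholder project URL).
import requests

BASE = "https://my-app.firebaseio.com"

# Write a JSON object under /users/alice
requests.put(BASE + "/users/alice.json",
             json={"name": "Alice", "friends": ["bob", "carol"]})

# Read it back
print(requests.get(BASE + "/users/alice.json").json())
```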
I need to store images and I have 2 options:
store the image in the GAE datastore.
store the image somewhere else (maybe Dropbox or another website) and store its link in the GAE datastore.
What's the best practice when we need to store an image in the DB, assuming that each image is bijectively linked to a specific element of the datastore?
I think it depends heavily on the use case.
I have a small company website running on App Engine where the content images are all stored in the datastore, and for that application it works well (they are all relatively small images).
If you have a high-traffic site, you may find that storing them in GCS, or some other mechanism that supports a more cost-effective CDN, is more appropriate.
If the images are large (more than 1MB) then the datastore isn't a practical solution.
There will be no hard and fast rule. Understand your use cases, your cost structure, how complex the solution will be to manage, and then choose the most appropriate solution.
Neither of the above. Google's Cloud Platform includes a service specifically for storing files, Google Cloud Storage, which is well integrated with GAE. You should use that.
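If you go the GCS route, writing from a GAE app only takes a few lines with the App Engine GCS client library. A minimal sketch, with a placeholder bucket name:

```
# Minimal sketch of writing an uploaded image to Google Cloud Storage from a
# GAE app, using the App Engine GCS client library. "/my-app-images" is a
# placeholder bucket name.
import cloudstorage as gcs

BUCKET = "/my-app-images"

def save_image(name, image_bytes):
    """Write the image bytes to GCS and return the object path to keep in the datastore."""
    path = BUCKET + "/" + name
    with gcs.open(path, "w", content_type="image/png") as f:
        f.write(image_bytes)
    return path
```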
I am writing a web application that requires a database which will have entities like users, friends etc. Since the Cloud SQL service is not free, I am looking for alternatives. Amazon RDS is one option, since it has a free tier which would suit my needs in the short term, but before I get into it I would like to know more about blobstores.
Is the blobstore suitable for storing this kind of information?
There are questions like:
How will the read/write latency compare to a traditional DB?
If I start with the blobstore and later want to move to a relational DB, what problems could I face?
Most important of all: is the blobstore suitable in my scenario?
After looking at the documentation on the Google developer site, I found that blobstores are used to store medium and large files like images and videos.
You can't and shouldn't try to use the blobstore for structured data. That's what the datastore is for. Blobstore is for unstructured data such as files.
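A sketch of that split, using ndb on GAE (model and field names are purely illustrative): structured fields live in the datastore, while the file itself lives in the blobstore and is only referenced by key:

```
# Illustrative only: structured fields go in the datastore (ndb), while the
# file itself lives in the blobstore (or GCS) and is referenced by key.
from google.appengine.ext import ndb

class UserProfile(ndb.Model):
    name = ndb.StringProperty(required=True)
    friends = ndb.KeyProperty(kind="UserProfile", repeated=True)
    avatar = ndb.BlobKeyProperty()  # points at a blobstore object, not the bytes themselves
```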
We are developing an application which requires the client (mobile device) to send files of size 5MB or more to the server component for processing and we would like some advice on the following:
Is there any way to combine a Backend-as-a-Service (BaaS) platform with our own data storage (hosted, in our particular case, in AWS)? We would essentially prefer that the files from the client be sent directly to our own database in the cloud rather than stored on the BaaS's servers.
In other words, we need a BaaS platform or a solution that allows unbundling/bypassing its data-storage feature so that we can use the BaaS only for the rest of its facilities (such as the client authentication, the REST API etc).
We have our own servers in EC2 which are needed for the main processing part of the files and only need the BaaS platform for conveniences that will kick-start our application in a short amount of time. Pulling the files from the BaaS platform's own data-storage to the EC2 servers would induce overall latency overhead as well as extra bandwidth cost in most cases.
I faced a similar dilemma while building my app. In my case, I had to upload and store photos uploaded by users somewhere AND I didn't want to build a backend myself. So I decided to use Amazon S3 to store the photos uploaded by the user, and used SimpleDB because it offered me greater flexibility and ease of use than a MySQL backend. Now, obviously, SimpleDB is not a Backend-as-a-Service platform, but I was looking for the same convenience as you are.
So what I'm suggesting is that you use a Backend-as-a-Service platform like Parse (which has an excellent freemium model), CloudMine (another great service, but with tight limitations on the freemium tier, i.e. only 500 free users/month) or Kinvey (which markets itself as the first BaaS platform; I don't have much information about it, but it's definitely worth a look), and use S3 for your data storage. This way you can use the BaaS for client authentication, the REST API etc. as you mentioned, and keep using S3.
All you need to do is create an appropriate naming scheme for your S3 buckets and objects so that you can easily identify which object belongs to which user. This can be done with a prefix-based naming scheme (since S3 doesn't offer real sub-folders in buckets). Whenever you need to pull some client information, you make a call to your BaaS with the authenticated client details; whenever you need your data storage, you make a call to S3 using the Android SDK provided by AWS to retrieve the objects that belong to that particular user.
Since you plan on using EC2 to process those files, transferring them from S3 to EC2 should not cost you any extra bandwidth (I might be wrong here because I haven't looked into it, but as far as I can remember transferring data within AWS is free).
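To illustrate the prefix-based naming idea, here is a rough sketch in Python with boto3; the bucket name and user id are placeholders, and in the app itself you would make the equivalent calls through the AWS Android SDK:

```
# Rough sketch of a prefix-based naming scheme with boto3.
import boto3

s3 = boto3.client("s3")
BUCKET = "myapp-user-files"  # hypothetical bucket

def key_for(user_id, filename):
    # "users/<id>/<file>" behaves like a folder even though S3 has no real folders
    return "users/%s/%s" % (user_id, filename)

def list_user_files(user_id):
    """Return all object keys belonging to one user via a prefix listing."""
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix="users/%s/" % user_id)
    return [obj["Key"] for obj in resp.get("Contents", [])]
```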
Do let me know if you have additional questions.
If I were to make a project with Google App Engine (using Python), and this project contained small user-generated images (which, once uploaded, will be accessed a lot but won't change or be altered anymore), would the Google App Engine Blobstore make sense to use (in terms of cost, speed etc.)? Or would it make more sense for GAE or the client to connect to Amazon S3 and store the images there, as these files will end up being static?
For what it's worth, the generated image files are all considered public, not user-private, and it would be perfectly fine for them to be on another subdomain. All files will be fixed-palette 16-color PNGs of exactly 19x19 pixels. Their URL/ID would be referenced in the GAE datastore, along with a couple more attributes (like creatorId), for handling/showing them in the web app.
Thanks!
If you are concerned about speed and cost, by far the best way is to store them in the blobstore and use get_serving_url() (link). These images are served by Google's high-performance image servers, will never cost you instance hours (just bandwidth), and you don't have to worry about memcache.
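A minimal sketch of that flow: generate the serving URL once after upload and store it in the datastore next to the image's other attributes (blob_key here is whatever your upload handler produced):

```
# Minimal sketch: generate the serving URL once and persist it with the image's
# datastore entry; Google's image servers handle every subsequent request.
from google.appengine.api import images

def serving_url_for(blob_key):
    # size=19 asks the image service for the image at its native 19x19 size
    return images.get_serving_url(blob_key, size=19, secure_url=True)
```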
I asked a similar question a few days ago.
I'm sticking with storing the images in the datastore as blobs (not in the Blobstore) and setting a Cache-Control header to make sure they aren't requested too many times.
For such small images, you can simply use the datastore. Even with the 1GB of space it gives you in the free quota, you should be able to store plenty of 19x19 pixel images easily. Using the Blobstore is slightly more complicated, as the APIs are more complex and the actual storage procedure involves more steps than just storing binary data in the datastore. I do recommend, however, that you implement memcache for the retrieval of these images, since you say they will not be modified afterwards. You don't want to query the same 19*19*4 bytes out of the database for each image over and over.
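For what it's worth, a rough sketch of that approach (datastore blob + memcache + Cache-Control header) using webapp2 and ndb; the model and route names are illustrative:

```
# Rough sketch: serve a small datastore-stored PNG through memcache with a
# long-lived Cache-Control header.
from google.appengine.api import memcache
from google.appengine.ext import ndb
import webapp2

class Icon(ndb.Model):
    creatorId = ndb.StringProperty()
    data = ndb.BlobProperty(required=True)  # the 19x19 PNG bytes

class IconHandler(webapp2.RequestHandler):
    def get(self, icon_id):
        png = memcache.get("icon:" + icon_id)
        if png is None:
            png = Icon.get_by_id(int(icon_id)).data
            memcache.set("icon:" + icon_id, png)
        self.response.headers["Content-Type"] = "image/png"
        # long max-age so browsers and proxies don't keep re-requesting unchanged images
        self.response.headers["Cache-Control"] = "public, max-age=31536000"
        self.response.write(png)

app = webapp2.WSGIApplication([(r"/icon/(\d+)", IconHandler)])
```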