New to Gatsby here.
I've followed the gatsby-image guide: I store my photos in an images folder in my project and pull them in via GraphQL. It works great locally, but I guess that kind of storage isn't meant for actual production? When deployed, I get a 'failed to load resource' error from the public HTML and blurry images. From browsing Stack Overflow and GitHub it looks like a ton of people run into this.
So where does everyone store their photos? Just curious for some advice, thanks!
You have plenty of choices:

Cloud-based solutions:
- AWS S3 (S3, S3 Glacier, etc.)
- Google Cloud Storage
- Azure Blob Storage

Others:
- Your Node framework's public folder (most of the time it's served statically)
- Serving them directly from Nginx or Apache

The list isn't exhaustive.
I recommend using cloud storage because it's cheap and offers 99.99% availability. It's also easier to integrate with other cloud services, such as permission systems (AWS IAM, for example).
And for your GraphQL use case, there's AWS AppSync.
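To make the cloud-storage option concrete, here is a minimal sketch (not from the answer above): uploading one asset to S3 with boto3, with a long Cache-Control header so an edge cache or CDN in front of the bucket can hold it. The bucket and key names are placeholders; boto3 and AWS credentials in the environment are assumed.

```python
import mimetypes


def guess_content_type(path):
    """Guess a Content-Type from the file extension (fall back to binary)."""
    ctype, _ = mimetypes.guess_type(path)
    return ctype or "application/octet-stream"


def upload_asset(path, bucket, key):
    """Upload one file to S3 with aggressive caching headers.

    boto3 is imported lazily so the pure helper above stays usable
    without the AWS SDK installed.
    """
    import boto3  # assumed dependency; credentials come from the AWS chain

    s3 = boto3.client("s3")
    s3.upload_file(
        path, bucket, key,
        ExtraArgs={
            "ContentType": guess_content_type(path),
            "CacheControl": "public, max-age=31536000",  # cache for a year
        },
    )


# Example call (placeholder names):
# upload_asset("images/hero.jpg", "my-site-assets", "images/hero.jpg")
```

With immutable, long-cached object names like these, you can later point any CDN at the bucket without changing the upload path.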
Related
I have been using Google Cloud Storage to save photos that users upload from a mobile app (built with Flutter and Firebase). Recently I needed to resize/transform images, and I wanted to explore whether it's possible to do that directly from Google Cloud Storage.
I found this project https://github.com/albertcht/python-gcs-image that you have to deploy on Google App Engine; if you call it with a bucket and an image, it returns a URL to what I think is a Google CDN (something like http://lh3.googleusercontent.com/*).
I looked at the code in the repository, and the only thing it does is return the result of google.appengine.api.images.get_serving_url. I don't understand why I cannot get this serving_url directly from my Dart code.
What is the difference between Google Cloud Storage and lh3.googleusercontent.com? Can I do the same image processing directly from Cloud Storage?
It seems odd that I have to run an App Engine app that just returns a URL.
What am I missing?
The library uses this API. It's written in Python 2.7, which reaches end of life on 2020-01-01.
Moreover, the Images API is only available on App Engine 1st generation (Python 2.7); it is not available on the 2nd generation (Python 3).
For all of these reasons, I don't recommend using it.
The best design today is to perform the resize/crop when the file is uploaded and store the result in Cloud Storage. Examples here and here.
Then, you only have to serve the resized/cropped images from Cloud Storage.
Look at Firebase Extensions; there is already such an extension provided there.
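As a hedged sketch of that upload-time design (none of these names come from the linked examples): the aspect-ratio math is plain Python, and the pixel work is delegated to Pillow, which is an assumed dependency. In practice this would run in a Cloud Function triggered by the upload, writing the result back to a bucket.

```python
import io


def fit_within(width, height, max_w, max_h):
    """Return (width, height) scaled to fit inside max_w x max_h,
    preserving the aspect ratio and never upscaling."""
    scale = min(max_w / width, max_h / height, 1.0)
    return max(1, round(width * scale)), max(1, round(height * scale))


def make_thumbnail(data, max_w=512, max_h=512):
    """Resize raw image bytes and return JPEG bytes.

    Pillow (an assumed dependency) is imported lazily; in a Cloud
    Function you would call this from the upload trigger and write
    the result back to Cloud Storage.
    """
    from PIL import Image

    img = Image.open(io.BytesIO(data))
    img.thumbnail((max_w, max_h))  # in-place, keeps aspect ratio
    out = io.BytesIO()
    img.convert("RGB").save(out, format="JPEG", quality=85)
    return out.getvalue()
```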
What I found is:

"The Java, Python, and Go standard environments for Google App Engine include the GAE Images API (Java, Python), which can resize, rotate, flip, and crop an image, as well as return an image serving URL which allows for client-side transformations, similar to Cloudinary and Imgix."
This matches my previous understanding and experience as well. serving_url is really convenient for image manipulations.
Having said that, as you correctly pointed out, it's first and foremost an App Engine feature and will require you to use App Engine in one way or another.
If that's not what you want, you can create a service that crops your images and deploy it serverless. It's a lot less burdensome than having an App Engine service running 24/7. What's more, AWS has several pre-baked templates to do just that (crop images) that can be deployed in a couple of clicks.
If you are, like myself, interested in a Google Cloud solution, I can offer a similar function that I wrote. It can be deployed to Cloud Run as-is; see details in my other answer.
With it you can not only resize the images for mobile, but also map your own domain to the Cloud Run service and put it behind any CDN you like, which can potentially be faster than serving from Google Storage. You can find plenty of info on the Internet about why a full-fledged CDN is better than bare Google Storage.
I have many websites and also websites made by clients which I would like to optimize. I am currently using different CDN providers but I would like to simplify my workflow and hopefully also lower the costs.
I would like to have a CDN with a Pull Zone, and that CDN would also optimize the images (while not modifying the other static resources).
Ideally, I would also have access to statistics for each Pull Zone (since I would like to charge my clients for this service instead of guessing).
What are the different ways to do this with Google Cloud? Is there a way to do this using only Cloud Functions, Cloud CDN, and Google Storage? Of course, I guess a little Node.js app running to optimize the images would be needed as well. I just wonder about the general architecture and whether it is even possible (I know it is with Azure and AWS, but I am already running a few things on Google Cloud).
Thanks a lot :)
In GCP, a pull zone can be created by associating an HTTP(S) Load Balancer with a Cloud Storage bucket and enabling Cloud CDN.
Having a different bucket for every client will break down the logs on your project, but not the billing.
To separate billing, you can export the logs to BigQuery and use them to break down costs per client based on their usage.
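A sketch of that billing breakdown, assuming the GCS usage logs have been loaded into a BigQuery table. The table name is a placeholder, `cs_bucket` and `sc_bytes` are field names from the published usage-log schema, and `google-cloud-bigquery` is an assumed dependency.

```python
def bytes_served_per_bucket(table):
    """Build a query that totals bytes served per bucket from GCS
    usage logs loaded into BigQuery (table name is a placeholder)."""
    return f"""
        SELECT cs_bucket, SUM(sc_bytes) AS bytes_served
        FROM `{table}`
        GROUP BY cs_bucket
        ORDER BY bytes_served DESC
    """


def run_breakdown(table):
    """Run the breakdown; the client is imported lazily since it needs
    GCP credentials at call time."""
    from google.cloud import bigquery  # assumed dependency

    client = bigquery.Client()
    return list(client.query(bytes_served_per_bucket(table)).result())
```

With one bucket per client, the `cs_bucket` grouping maps straight onto per-client invoices.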
Regarding optimization of the images: neither Cloud CDN nor a GCS bucket will perform any such operation.
The only operation available in this direction is serving gzip-compressed files.
I suggest dedicating an instance to prepare the images before storing them, or to add/replace optimized versions of the images already in the bucket.
I've inherited a project which uses Google App Engine Blobstore to store files, and I need to download all the images so they can be migrated to a new system.
I was able to get all the Google Datastore data out with https://github.com/jodeleeuw/export-google-datastore and was hoping there was something similarly easy with Blobstore. Or at least an example of how to read all the files in the blobstore and download them.
The Datastore solution you mentioned relies on the Datastore API being available for apps not running on GAE. But AFAIK the Blobstore API is not available outside GAE, so a similar solution is likely impossible.
I see a couple of options:
enhance the GAE app code you inherited by adding the capability to get the data out of the Blobstore (since it normally should be able to) and export it, either directly to where you want the data moved or to an intermediate place (say, GCS), from where you can ship the data to its final destination more easily. Moving directly to the final location would be preferable IMHO if you need to keep the app working with the new location: you can build a nice, hitless migration story
use the developer console's Blobstore browser/viewer, which also allows downloading and deleting blobs, either manually or via a GUI automation tool (like Selenium, for example) for a programmatic approach.
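A rough sketch of the first option, heavily hedged: it has to run inside the first-generation GAE app itself, since the Blobstore API is unavailable elsewhere, and it assumes the legacy `cloudstorage` client library (GoogleAppEngineCloudStorageClient). The filename-sanitizing helper is pure Python; the bucket name is a placeholder.

```python
import re


def safe_filename(name, fallback="blob"):
    """Turn a blob's stored filename into a safe GCS object name part."""
    name = (name or fallback).strip().replace("\\", "/").split("/")[-1]
    return re.sub(r"[^A-Za-z0-9._-]", "_", name) or fallback


def export_blobs_to_gcs(bucket):
    """Copy every Blobstore blob into a GCS bucket.

    Imports are deferred because these APIs only exist inside a
    first-generation App Engine runtime.
    """
    from google.appengine.ext import blobstore
    import cloudstorage  # legacy GAE GCS client, assumed available

    for info in blobstore.BlobInfo.all():
        object_name = "/%s/%s_%s" % (
            bucket, info.key(), safe_filename(info.filename))
        src = blobstore.BlobReader(info.key())
        with cloudstorage.open(object_name, "w",
                               content_type=info.content_type) as dst:
            dst.write(src.read())
```

Once the blobs are in GCS, gsutil or the Cloud Storage client can move them anywhere.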
I'm building a static site which deploys to Google App Engine. Are there any advantages to storing the assets (JS, CSS, images) in Cloud Storage? All of the assets are going to be under 32 MB (if that's a limit).
Based on this slide deck (slides 24-28), it sounds like requests for static assets of a GAE app use Google's special infrastructure designed for serving static assets. However, it's not clear how its performance compares to Google Cloud Storage.
Any clarifications on this would be much appreciated. Similar questions on Stack Overflow exist, but they are fairly dated (2010), and Google's Cloud products have changed since then.
Setting your cache headers will result in similar performance AFAIK.
Both GAE and GCS use Google Edge cache.
GCS is probably easier for managing your static resources, whereas it takes a redeploy to add/remove static resources on GAE.
Another important note: GCS as a CDN does not work over HTTPS with custom domains. GAE does support HTTPS with custom domains.
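For the GAE side, a hypothetical app.yaml fragment (values are placeholders) shows how the two points above fit together: files under a static handler are served by GAE's dedicated static-serving infrastructure, and the expiration settings become the Cache-Control max-age that lets Google's edge cache hold them.

```yaml
# Hypothetical app.yaml fragment; values are placeholders.
default_expiration: "1d"

handlers:
- url: /assets
  static_dir: assets
  expiration: "7d"   # becomes Cache-Control max-age for these files
```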
I want to upload files into my Google App Engine project.
I've been reading a while on this issue, and there are a lot of answers arguing that Blobstore is the best option.
But if I understood this correctly, those are database objects. I would like to upload the files as filesystem files, to fit into some caching scheme or to ease a possible future migration to another CDN.
Imagine I want to save several files for each user, a couple of images, text files, maybe some video, etc...
If some day I want to move these static files to another CDN, shouldn't they live outside the database?
Is that a good idea? Is there a solution?
I would recommend using the GAE datastore to store the file references (for videos, images, etc.) and uploading the content to Amazon S3. You may even allow clients to upload content directly to Amazon S3, without processing it through GAE, using Amazon HTML POST forms (Browser Uploads to S3 using HTML POST Forms). Amazon AWS also offers a CDN (CloudFront) with tight integration with S3.
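The direct-to-S3 browser upload can be sketched with boto3's `generate_presigned_post`, the programmatic counterpart of those HTML POST forms. The names here are placeholders, and boto3 plus AWS credentials are assumed.

```python
def object_key_for(user_id, filename):
    """Namespace each user's uploads under their own prefix,
    matching the 'several files per user' layout described above."""
    return "uploads/%s/%s" % (user_id, filename)


def browser_upload_form(bucket, key, max_bytes=10 * 1024 * 1024):
    """Return the URL and form fields a browser needs to POST a file
    straight to S3, bypassing the app server entirely.

    boto3 is imported lazily; signing requires AWS credentials.
    """
    import boto3  # assumed dependency

    s3 = boto3.client("s3")
    return s3.generate_presigned_post(
        bucket, key,
        Conditions=[["content-length-range", 1, max_bytes]],
        ExpiresIn=3600,  # form stays valid for one hour
    )
```

The returned dict's `url` and `fields` entries go into the HTML form; the size condition keeps clients from uploading arbitrarily large files.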
If you need to store only small pictures, you may also consider the Google Images service, as it offers a "free" CDN and some cool out-of-the-box transformations.
You can't write to the filesystem on App Engine, so your options are to store data in the datastore or the blobstore. Of the two, the blobstore is generally the better choice for storing file uploads.
Regarding porting to a CDN, no matter how you do this it's going to require changes. App Engine is not a CDN, no two CDNs are exactly alike, and it's unlikely the CDN would expose an interface that you can interact with using standard filesystem operations anyway.