Mount a Google Cloud Storage bucket using Goofys

How do I install Goofys on Linux and mount a Google Cloud Storage bucket with it?
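A minimal sketch of one way to do this, wrapped in Python purely for illustration. Installation is typically just downloading the pre-built goofys Linux binary from its GitHub releases page and marking it executable; the sketch assumes that binary is already on PATH, that your goofys release accepts gs:// bucket URLs together with GOOGLE_APPLICATION_CREDENTIALS (older releases mount GCS through its S3-compatible endpoint with HMAC keys instead), and that the bucket name, mount point, and key path below are placeholders.

import os
import subprocess

BUCKET = "my-example-bucket"                 # placeholder bucket name
MOUNT_POINT = "/mnt/gcs"                     # placeholder mount point
KEY_FILE = "/etc/gcs/service-account.json"   # placeholder service-account key

os.makedirs(MOUNT_POINT, exist_ok=True)

# Recent goofys releases read the service-account key from
# GOOGLE_APPLICATION_CREDENTIALS and accept a gs:// bucket URL.
env = dict(os.environ, GOOGLE_APPLICATION_CREDENTIALS=KEY_FILE)
subprocess.run(["goofys", f"gs://{BUCKET}", MOUNT_POINT], env=env, check=True)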

Related

Mount Google Cloud Storage in a Custom Flex GAE

I have a web application running in a Custom Runtime Flex App Engine, deployed from a Docker image, and it works fine.
The app is connected to a database in Google Cloud SQL for data persistence.
Now I am trying to persist file data in a Cloud Storage bucket, but I have not found a way to mount that storage as a file system.
Gcsfuse does not work in this case because it needs extra privileges to run inside Docker.
In app.yaml you can specify a volume, but you cannot associate it with Cloud Storage.
How can I persist file data with Cloud Storage in a Custom Flex GAE app?
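One common alternative, sketched here rather than taken from the question, is to skip the mount entirely and read and write objects through the Cloud Storage client library. A minimal Python sketch, assuming the google-cloud-storage package is installed and using placeholder bucket and object names:

from google.cloud import storage

client = storage.Client()                      # uses the app's service account
bucket = client.bucket("my-flex-app-bucket")   # placeholder bucket name
blob = bucket.blob("uploads/report.txt")       # object name inside the bucket

blob.upload_from_string("file contents go here")   # persist the data
print(blob.download_as_bytes())                    # read it back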

Google Cloud App Engine - Default Storage Buckets

When I create a new application on Google Cloud App Engine, these buckets in Google Storage show up as well (where yyy is my app's name):
eu.artifacts.yyy.appspot.com
staging.yyy.appspot.com
yyy.appspot.com
What exactly do they do?
From the official documentation, Using Cloud Storage for App Engine:
When you create an app, App Engine creates a default bucket that provides the first 5GB of storage for free. The default bucket also includes a free quota for Cloud Storage I/O operations. See Pricing, quotas, and limits for details. You will be charged for storage over the 5GB limit.
The name of the default bucket is in the following format:
project-id.appspot.com
App Engine also creates a bucket that it uses for temporary storage when it deploys new versions of your app. This bucket, named staging.project-id.appspot.com, is for use by App Engine only. Apps can't interact with this bucket.
eu.artifacts.yyy.appspot.com is your Container Registry bucket.
Your Container Registry bucket URL will be listed as gs://artifacts.[PROJECT-ID].appspot.com or gs://[STORAGE-REGION].artifacts.[PROJECT-ID].appspot.com, where:
[PROJECT-ID] is your Google Cloud Console project ID. Domain-scoped projects will have the domain name as part of the project ID.
[STORAGE-REGION] is the location of the storage bucket: us for registries in the host us.gcr.io, eu for registries in the host eu.gcr.io, asia for registries in the host asia.gcr.io.
Each of these buckets is used for App Engine build and temporary artifacts.
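As a quick way to see these buckets for yourself, here is a minimal sketch, assuming the google-cloud-storage client library and using the question's placeholder project name yyy; the three App Engine buckets discussed above should appear in the listing.

from google.cloud import storage

client = storage.Client(project="yyy")  # "yyy" stands in for your project ID
for bucket in client.list_buckets():
    print(bucket.name)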

Why does writing to GCS bucket result in local dev blob store entries instead?

I use the Google App Engine Standard environment to develop my Python app using Development SDK 1.9.61.
I'm trying to learn to use Google Cloud Storage in my app by following these instructions. I verified that my default and staging buckets do exist via the cloud console, and manually uploaded a sample file to each bucket using my browser.
Next, I programmatically uploaded some files to a bucket (so I thought) via my local development app instance per Google's instructions.
However, when I checked my cloud storage buckets via my GCP Console in my browser, I could not find the files. After searching my local development SDK console, I eventually found the files located in the local "Blobstore Viewer".
I'm confused, based on Google's instructions I expected to find the files in my project's cloud storage bucket.
I searched the App Engine Python Release Notes for some potential SDK version changes to explain this behavior, but couldn't find anything relevant.
Is this the way it's supposed to work? Are Google's instructions in error?
If you upload files to a local development server, those exist in-memory on your machine. The GCP Console doesn't interact with your local development server, it interacts with the public (production) Google Cloud Storage API.
So in essence, the files on your local dev server are in a completely different namespace. If you want to interact with the production version of Google Cloud Storage and see the results in the GCP console, you'll need to use a non-dev-server deployment of your application.
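For reference, here is a sketch of the kind of write described above, using the first-generation App Engine cloudstorage library (the GoogleAppEngineCloudStorageClient package that Google's instructions cover). Run against the local development server it ends up in the local Blobstore stub; only a deployed app writes to the real bucket. The bucket name is a placeholder.

import cloudstorage

# Filenames take the form "/<bucket>/<object>"; the bucket below is a placeholder.
filename = "/my-app.appspot.com/demo.txt"
with cloudstorage.open(filename, "w", content_type="text/plain") as gcs_file:
    gcs_file.write("hello from the dev server")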

Ssh to Google Cloud Storage from AppEngine

I am interested in knowing whether it is possible to SSH to Google Cloud Storage from App Engine to read a file from there, process it, and insert the data into BigQuery. Thanks
Neither Cloud Storage nor App Engine Standard has any concept of an "instance" that you could SSH into (unless you're using App Engine Flexible, but even then you still can't SSH to Cloud Storage).
However, you can talk to Cloud Storage from App Engine, or just load your GCS bucket directly into BigQuery.
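As an illustration of the second option, here is a minimal sketch, assuming the google-cloud-bigquery client library; the bucket, dataset, and table names are placeholders.

from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,  # let BigQuery infer the schema
)
load_job = client.load_table_from_uri(
    "gs://my-example-bucket/data.csv",  # placeholder GCS object
    "my-project.my_dataset.my_table",   # placeholder destination table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish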

Uploading Storing and Linking to Images on Google Container Engine

I am developing an app where users can upload images. The app has a Node.js backend and an Angular frontend, with Redis and Neo4j, all dockerized and run by Kubernetes. Now I would like to store images, but there are so many services that could do the job that I don't know what to choose... Can I use my Google Drive account and the Drive SDK to upload my users' images? Should I look into Google Cloud Storage? What about the persistent storage options in Kubernetes? Or can I use my Flickr account??? Could someone point me in the right direction... Thanks
For uploading and storing static files such as images in the cloud on GCP, you should probably be using Cloud Storage.
While both Google Drive and Google Cloud Storage provide an API to upload files, Cloud Storage is better suited to your use case. I took this excerpt from here:
Cloud Storage is intended to be accessed primarily through its API and provides all the functionality necessary for developers to use it as a backing store for their own applications.
and
Cloud Storage enables developers to store their application data in the Google cloud (and they’re responsible for the storage their app consumes), whereas in Drive, users allow apps to interact with the user’s private storage and content.
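As a concrete illustration of that recommendation, here is a minimal sketch using the google-cloud-storage Python client (the Node.js @google-cloud/storage client exposes an equivalent upload call); the bucket and file names are placeholders.

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-user-images")    # placeholder bucket for user uploads
blob = bucket.blob("users/42/avatar.png")   # object path for this image
blob.upload_from_filename("avatar.png", content_type="image/png")
print(f"uploaded to gs://{bucket.name}/{blob.name}")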
