I have some configuration files in my application. To avoid needing a deployment each time the configuration files change, I wanted a simple process that syncs these files from a store, such as Google Cloud Storage, into a folder on the machine(s) that lives in RAM.
I need the data in RAM for speed.
At this stage I do not want to add complexity with DBs, pub/sub, etc.
I'm running a Node.js app on the flexible GAE environment.
In essence I am looking for a simple solution, at this stage.
You can use Cloud Storage FUSE, which is an adapter for mounting Cloud Storage buckets on your local machine or GCE instances as a file system.
You can find the installation instructions here.
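If you would rather script the sync yourself than mount a FUSE filesystem, a simple polling loop gets the same result. Below is a minimal sketch (not Cloud Storage FUSE itself) using the official google-cloud-storage Python client; the bucket name, poll interval and the /dev/shm target (a RAM-backed tmpfs on Linux) are all illustrative assumptions, and the same pattern translates directly to Node.js.

```python
# A minimal polling sync from a bucket into a RAM-backed folder.
# BUCKET_NAME and paths are placeholders; /dev/shm is tmpfs on Linux.
import time
from pathlib import Path

from google.cloud import storage

BUCKET_NAME = "my-config-bucket"      # hypothetical bucket name
LOCAL_DIR = Path("/dev/shm/config")   # folder backed by RAM
POLL_SECONDS = 60

def sync_once(client: storage.Client) -> None:
    for blob in client.list_blobs(BUCKET_NAME):
        target = LOCAL_DIR / blob.name
        target.parent.mkdir(parents=True, exist_ok=True)
        blob.download_to_filename(str(target))

if __name__ == "__main__":
    client = storage.Client()
    while True:
        sync_once(client)
        time.sleep(POLL_SECONDS)
```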
I'm trying to deploy Errbot in App Engine. Errbot needs the data directory to be writable but an App Engine instance's local filesystem is not writable. Is there any way to work around this?
From the documentation:
You are correct: the GAE local filesystem that your application is deployed to is not writable. This behavior ensures the security and scalability of your application.
However, if your use case involves storing data in a persistent manner, you may consider using Cloud Storage to read and write files during runtime. App Engine creates a default bucket when you create an application.
This bucket provides the first 5GB of storage for free and includes a free quota for Cloud Storage I/O operations. You can create other Cloud Storage buckets, but only the default bucket includes the first 5GB of storage for free.
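As a rough illustration of that approach, here is what reading and writing the default bucket at runtime might look like with the google-cloud-storage Python client; my-project.appspot.com is a placeholder (the default bucket is normally named <project-id>.appspot.com), and the object path is hypothetical.

```python
# Sketch: persist data in Cloud Storage instead of the local filesystem.
# "my-project.appspot.com" stands in for your app's default bucket.
from google.cloud import storage

bucket = storage.Client().bucket("my-project.appspot.com")

# Write application state as an object.
blob = bucket.blob("errbot/data/state.json")
blob.upload_from_string('{"plugins": {}}', content_type="application/json")

# Read it back later, possibly from another instance.
print(blob.download_as_bytes().decode("utf-8"))
```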
I have a REST, .NET Core application running in Google App Engine Flexible.
It reads binary files from Cloud Storage (several MB in size, rarely hundreds of MB).
To make it run faster, I'm caching these files in the local file system (/tmp). But this approach doesn't work when the app is scaled and multiple instances are running simultaneously.
What are my options for fast file cache which is shared between app instances?
Cloud Filestore looks great but is not available for App Engine
Cloud Memorystore - I'm not sure it is suitable for me
You can use a Redis database to cache up to 30 MB of data for free from your App Engine app, and it is an easy solution to implement.
For this you only need to create a Redis Cloud database and modify the appsettings.json file of your app.
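The question is about .NET Core, where the connection details would go into appsettings.json, but the cache-aside pattern is the same in any language. Here is a sketch in Python using redis-py; the host, port, password and TTL are placeholders for the values your Redis Cloud dashboard provides. Keep in mind the 30 MB free tier mentioned above will not hold the files that run to hundreds of MB.

```python
# Cache-aside sketch: try the shared Redis cache first, fall back to
# Cloud Storage on a miss. Connection details are placeholders.
import redis
from google.cloud import storage

cache = redis.Redis(host="redis-12345.example.redislabs.com",
                    port=12345, password="...")

def get_file(bucket_name: str, blob_name: str) -> bytes:
    cached = cache.get(blob_name)
    if cached is not None:
        return cached  # hit: shared across all app instances
    data = storage.Client().bucket(bucket_name).blob(blob_name).download_as_bytes()
    cache.set(blob_name, data, ex=3600)  # expire after an hour
    return data
```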
I use the Google App Engine Standard environment to develop my Python app using Development SDK 1.9.61.
I'm trying to learn to use Google Cloud Storage in my app by following these instructions. I verified that my default and staging buckets do exist via the cloud console, and manually uploaded a sample file to each bucket using my browser.
Next, I programmatically uploaded some files to a bucket (so I thought) via my local development app instance per Google's instructions.
However, when I checked my cloud storage buckets via my GCP Console in my browser, I could not find the files. After searching my local development SDK console, I eventually found the files located in the local "Blobstore Viewer".
I'm confused; based on Google's instructions, I expected to find the files in my project's Cloud Storage bucket.
I searched the App Engine Python Release Notes for some potential SDK version changes to explain this behavior, but couldn't find anything relevant.
Is this the way it's supposed to work? Are Google's instructions in error?
If you upload files to a local development server, those exist in-memory on your machine. The GCP Console doesn't interact with your local development server, it interacts with the public (production) Google Cloud Storage API.
So in essence, the files on your local dev server are in a completely different namespace. If you want to interact with the production version of Google Cloud Storage and see the results in the GCP console, you'll need to use a non-dev-server deployment of your application.
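For context, the write pattern from the instructions the question follows looks roughly like this, using the first-generation App Engine cloudstorage library (GoogleAppEngineCloudStorageClient); the bucket name is a placeholder. Run under the dev server, the object lands in the local Blobstore viewer; deployed, it lands in the real bucket.

```python
# Sketch of a write with the first-gen App Engine cloudstorage library.
# On the local dev server this ends up in the Blobstore emulation.
import cloudstorage

filename = "/my-default-bucket/sample.txt"  # placeholder bucket name

with cloudstorage.open(filename, "w", content_type="text/plain") as f:
    f.write("uploaded via the GCS client library")
```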
I have a machine learning project, and for it I have to get data from a website every 15 minutes. I decided to use Google Cloud Platform to do this. I've written a Python script that does the work (gets the data from the website and writes it to a CSV file), and when I run this script on my computer it works well. I need to run this script for a couple of weeks, so it should run on Google Cloud's machines and keep running when I close my computer. How can I do this?
I can also use another cloud service if required, but Google Cloud would be better.
Disclaimer: I'm with Google Cloud Platform Support
Google Compute Engine is defined as Infrastructure as a Service. It basically provides access to Virtual Machines (VMs), disks and networking functionality. By using this product, you are able to configure your resources from scratch, defining one or multiple VM instances, configuring your work environment, etc. It might require more configuration and boilerplate than needed, but it offers the most control. You can always use some resources for free, but in my opinion it is a lot to build from scratch.
Google App Engine is defined as a Platform as a Service. It is basically a managed app platform, and the management can be automated to varying degrees. It is built on top of Compute Engine, in the sense that it provides a platform and functionality on top of the infrastructure defined by Compute Engine VMs. You can thus deploy your Python script in an App Engine Flexible Python environment. You can define your whole application as a collection of interrelated microservices, e.g. one service gets the data from a website, another writes CSV files, and another might trigger ML jobs.
App Engine also provides the possibility to schedule jobs as cron jobs, so if your application needs to run jobs periodically or at a specific time, this is the tool to use. App Engine pricing correlates with the resources used, but you can estimate budgets using the Google Cloud Platform Pricing Calculator.
You can store the CSV files in Google Cloud Storage as objects in buckets, or as data in Datastore, Cloud SQL or BigQuery. Components of Google Cloud Platform can communicate with each other via service accounts. This allows your App Engine deployment, for example, to perform CRUD operations on your Cloud SQL instance programmatically. Or... to trigger a Cloud Machine Learning job.
Your question is very broad and can be addressed in multiple ways. I would initially deploy the Python script in App Engine Flexible, set up a cron job to fetch data every 15 minutes, and upload the CSV files to Google Cloud Storage buckets, as sketched below. I would then use the Cloud Machine Learning Python client to trigger Machine Learning jobs programmatically.
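As a sketch of that approach (the URL, bucket and handler path below are all placeholders): a tiny Flask app deployed to App Engine Flexible, whose /fetch endpoint is invoked by an App Engine cron job defined in cron.yaml with schedule: every 15 minutes.

```python
# Sketch: a cron-triggered handler that fetches data and writes a
# timestamped CSV object to Cloud Storage. Names are placeholders.
import csv
import io
from datetime import datetime, timezone

import requests
from flask import Flask
from google.cloud import storage

app = Flask(__name__)

@app.route("/fetch")
def fetch():
    # Hypothetical source returning a JSON list of rows.
    rows = requests.get("https://example.com/data").json()
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    blob = storage.Client().bucket("my-ml-data").blob(f"raw/{stamp}.csv")
    blob.upload_from_string(buf.getvalue(), content_type="text/csv")
    return "ok"
```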
There are other products that might interest you:
Cloud Dataflow - configure stream/batch data processing
Cloud Dataprep - transform/clean raw data
Cloud Pub/Sub - global real-time messaging.
All these products and components can communicate with each other, and processes can easily be automated in the Cloud. So the whole project can run in Google's Cloud infrastructure even after you close your computer. But, of course, you have to configure it beforehand in your Google Cloud Platform project(s).
I am aware that I met your broad question with a broad answer. For any specific issues along your path of implementing the project in the Cloud, the community will be here to provide support.
Good luck!
I'm developing a Google App Engine application that uses Cloud Storage. I want to have a base set of files in Cloud Storage that are shared by each user of the application. I know I can use gsutil to copy these files to the production server.
But I would like to test my application on my local development server, so I need these files in the dev Cloud Storage as well. I can't find any way to copy the files. Is there a way to use gsutil to copy files to the development server's Cloud Storage simulation?
We don't currently support the full GCS API in the local dev server.
Your best bet for now is probably just to write to a different bucket when running locally.
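One way to implement that, sketched in Python (the bucket names are placeholders; the SERVER_SOFTWARE check is the usual idiom for detecting the first-generation Python dev server):

```python
# Sketch: pick a different bucket when running under the local dev server.
import os

def data_bucket() -> str:
    if os.environ.get("SERVER_SOFTWARE", "").startswith("Development"):
        return "my-app-dev-bucket"    # placeholder: local testing bucket
    return "my-app-shared-bucket"     # placeholder: production bucket
```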