I'm developing a Google App Engine application that uses Cloud Storage. I want to have a base set of files in Cloud Storage that are shared by each user of the application. I know I can use gsutil to copy these files to production.
But I would like to test my application on my local development server, so I need these files in the dev Cloud Storage as well. I can't find any way to copy the files. Is there a way to use gsutil to copy files to the development server's Cloud Storage simulation?
We don't currently support the full GCS API in the local dev server.
Your best bet for now is probably just to write to a different bucket when running locally.
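For illustration, here's a minimal sketch of that approach in Python (the bucket names and the SERVER_SOFTWARE check are my assumptions, not part of the answer above):

```python
import os

import cloudstorage  # GoogleAppEngineCloudStorageClient library

def base_files_bucket():
    # The dev server sets SERVER_SOFTWARE to a value starting with
    # 'Development'; in production it starts with 'Google App Engine'.
    if os.environ.get('SERVER_SOFTWARE', '').startswith('Development'):
        return 'my-local-test-bucket'       # hypothetical local bucket
    return 'my-shared-base-files-bucket'    # hypothetical production bucket

def read_base_file(name):
    # cloudstorage object names take the form /<bucket>/<object>.
    with cloudstorage.open('/%s/%s' % (base_files_bucket(), name)) as f:
        return f.read()
```

You would then seed the local bucket once through the app itself (the dev server keeps its GCS emulation separate from production), while gsutil continues to manage the real bucket.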
I have some configuration files in my application. To avoid needing a deployment each time the configuration files change, I want a simple process that syncs these files from a store, such as Google Cloud Storage, into a folder on the machine(s), i.e. into RAM.
I need the data in RAM for speed.
I do not want at this stage to add complexity by adding DBs and pub/sub etc.
I'm running a Node.js app on the flexible GAE environment.
In essence I am looking for a simple solution, at this stage.
You can use Cloud Storage FUSE, which is an adapter for mounting Cloud Storage buckets on your local machine or GCE instances as a file system.
The installation instructions are here.
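As a quick illustration (the bucket name and mount point here are placeholders, not from the answer), mounting a bucket with gcsfuse is a one-liner, after which the app reads the configuration as ordinary files:

```
gcsfuse my-config-bucket /app/config
```

Note that reads still go through to Cloud Storage under the hood, so this is a convenience layer rather than a true in-RAM copy; gcsfuse does some caching, but if you strictly need the data in RAM you would still copy the files into memory yourself.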
I use the Google App Engine Standard environment to develop my Python app using Development SDK 1.9.61.
I'm trying to learn to use Google Cloud Storage in my app by following these instructions. I verified that my default and staging buckets do exist via the cloud console, and manually uploaded a sample file to each bucket using my browser.
Next, I programmatically uploaded some files to a bucket (so I thought) via my local development app instance per Google's instructions.
However, when I checked my cloud storage buckets via my GCP Console in my browser, I could not find the files. After searching my local development SDK console, I eventually found the files located in the local "Blobstore Viewer".
I'm confused; based on Google's instructions, I expected to find the files in my project's Cloud Storage bucket.
I searched the App Engine Python Release Notes for some potential SDK version changes to explain this behavior, but couldn't find anything relevant.
Is this the way it's supposed to work? Are Google's instructions in error?
If you upload files to a local development server, those exist in-memory on your machine. The GCP Console doesn't interact with your local development server, it interacts with the public (production) Google Cloud Storage API.
So in essence, the files on your local dev server are in a completely different namespace. If you want to interact with the production version of Google Cloud Storage and see the results in the GCP console, you'll need to use a non-dev-server deployment of your application.
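To make the split concrete, here's a hedged sketch of the kind of write Google's instructions describe (the object name and content are placeholders): run under dev_appserver it lands in the local emulation (hence the Blobstore Viewer), while the same code deployed writes to the real bucket visible in the GCP Console.

```python
import cloudstorage  # GoogleAppEngineCloudStorageClient library
from google.appengine.api import app_identity

def write_sample_file():
    # Locally: stored by the dev server's GCS stub, on your machine only.
    # Deployed: stored in the app's default Cloud Storage bucket.
    bucket = app_identity.get_default_gcs_bucket_name()
    with cloudstorage.open('/%s/sample.txt' % bucket, 'w',
                           content_type='text/plain') as f:
        f.write('hello from GAE')
```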
I am migrating my Google App Engine project to the Windows Azure platform. Please help me migrate all the Google App Engine Blobstore files to Azure Blob storage.
I have one solution in Python, but I am not very familiar with Python. Please help me if it is possible with JavaScript, Java, or any tool.
A simple way to do this (if you are not working with a huge amount of data) would be to download the google app engine blobstore files to a local drive, and then upload them using a tool like Azure Storage Explorer (http://storageexplorer.com/). Azure Storage Explorer lets you upload a directory through the user interface. Or you can use the AzCopy command-line interface to upload local files to Blob storage (https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/).
For downloading google blobstore files, you may be able to use a tool or interface like the ones described here: How can I get to the blobstore in the new Google Cloud Console?
If you have a very large amount of data, then you may need a different option. For uploading very large amounts of data to Azure Blob storage, you can use the Import/Export service, and send Microsoft a drive containing your data (https://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/).
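For example, with the classic AzCopy command line from the linked article, a recursive upload of the downloaded directory would look roughly like this (the account name, container, local path, and key are placeholders):

```
AzCopy /Source:C:\gae-blobs /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:<storage-key> /S
```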
I have a GAE-based application that pulls a file from Cloud Storage and then does some processing on that file. To run the application in the remote App Engine environment, I first upload a file to Cloud Storage using the browser console, and then make requests to the application, which pulls the file I uploaded from Cloud Storage. I'd like to be able to do development locally; however, there is no comparable browser console for the local implementation of GCS, as discussed here: Local storage browser for Google Cloud Storage and dev_appserver.py.
I'm wondering if it's possible to use gsutil. It seems the local GCS implementation is accessible through a localhost endpoint, mentioned here: Google Cloud Storage on Appengine Dev Server.
Right now, all I want to do is load a file into my local GCS instance. I could do this by writing a little utility, but it seems much better to use gsutil if I can get it to connect to my local instance.
Thank you,
Ben
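Not an answer to the gsutil question, but as a stopgap for the "little utility" route: a dev-only handler inside the app can write posted bytes into the local GCS instance via the cloudstorage client library. This is just a sketch; the route and bucket name are made up.

```python
import cloudstorage  # GoogleAppEngineCloudStorageClient library
import webapp2

BUCKET = 'my-local-test-bucket'  # hypothetical bucket name

class DevUploadHandler(webapp2.RequestHandler):
    """Writes the raw POST body into the (local) GCS implementation."""
    def post(self):
        name = self.request.get('name', 'upload.bin')
        filename = '/%s/%s' % (BUCKET, name)
        with cloudstorage.open(filename, 'w') as f:
            f.write(self.request.body)
        self.response.write('wrote %s\n' % filename)

app = webapp2.WSGIApplication([('/dev/upload', DevUploadHandler)])
```

Then something like `curl --data-binary @myfile 'http://localhost:8080/dev/upload?name=myfile'` loads a file into the local instance.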
I'm using the Eclipse Plugin for App Engine, and I have my application running fine locally (able to read/write to the local Cloud Datastore).
However when I deployed to App Engine, the server copy does not seem to have any Cloud Datastore information. Do I need to upload this separately, and if so how do I do this?
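In case it helps frame an answer (this is my assumption about the setup, and the command below is the Python SDK's bulk loader, which historically could also be pointed at a Java app once the remote API servlet was enabled): the local Cloud Datastore lives only in a file on your machine and is never uploaded with a deployment, so production starts empty. Data has to be re-created there or pushed up separately, along the lines of:

```
appcfg.py upload_data --url=http://your-app-id.appspot.com/remote_api --filename=entities.csv --kind=MyKind --config_file=bulkloader.yaml
```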