In App Engine I could download the data from appspot to my local machine using appcfg download_data, but it does not seem to download the blobs stored through the Blobstore API. Is there any way I could download the blobs as well as the data, keeping their keys synchronized?
So the final outcome is: I download the data from the server (including blobs), go to another PC, and upload all the data and blobs, so that when I run the app there it looks the same as it does on appspot. Thanks
Related
I am migrating my Google App Engine project to the Windows Azure platform. Please help me migrate all of the Google App Engine Blobstore files to Azure Blob storage.
I have one solution in Python, but I am not very familiar with Python. Please help me if it is possible with JavaScript, Java, or any tool.
A simple way to do this (if you are not working with a huge amount of data) would be to download the google app engine blobstore files to a local drive, and then upload them using a tool like Azure Storage Explorer (http://storageexplorer.com/). Azure Storage Explorer lets you upload a directory through the user interface. Or you can use the AzCopy command-line interface to upload local files to Blob storage (https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/).
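For example, a single AzCopy invocation (classic Windows AzCopy syntax; the source folder, account, container, and key below are placeholders) can recursively upload a downloaded directory:

    AzCopy /Source:C:\blobstore_export /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:<storage-account-key> /S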
For downloading google blobstore files, you may be able to use a tool or interface like the ones described here: How can I get to the blobstore in the new Google Cloud Console?
If you have a very large amount of data, then you may need a different option. For uploading very large amounts of data to Azure Blob storage, you can use the Import/Export service, and send Microsoft a drive containing your data (https://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/).
I've been able to get jQuery-File-Upload from https://github.com/blueimp/jQuery-File-Upload (Python version) to allow my website users to upload images to a Google App Engine app; however, it saves the images (for a limited time) as a blob.
It would be preferable to have the images saved in a Google Cloud Storage bucket, but I'm unable to figure out how to get the files to save in a bucket rather than as a blob.
Can anyone recommend a similar way for my website visitors to upload images to my Google bucket, or explain whether jQuery-File-Upload can be configured to do so?
I created an App Engine Python signed URL example here:
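The gist of that approach is roughly the following sketch, which signs a V2-style GET URL with the app's default service account (make_signed_url and the bucket/object parameters are illustrative names, not from an official sample):

    import base64
    import time
    import urllib

    from google.appengine.api import app_identity

    def make_signed_url(bucket, object_name, expires_in=300):
        # Unix timestamp at which the URL stops working.
        expires = int(time.time()) + expires_in
        resource = '/%s/%s' % (bucket, object_name)
        # V2 string-to-sign for a plain GET: verb, Content-MD5, Content-Type,
        # expiry, canonicalized resource (MD5 and Content-Type left empty).
        string_to_sign = '\n'.join(['GET', '', '', str(expires), resource])
        _, signature = app_identity.sign_blob(string_to_sign)
        params = urllib.urlencode({
            'GoogleAccessId': app_identity.get_service_account_name(),
            'Expires': str(expires),
            'Signature': base64.b64encode(signature),
        })
        return 'https://storage.googleapis.com%s?%s' % (resource, params)

The same idea works for uploads by signing a PUT instead of a GET.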
There is a Cloud API to directly upload files to a Google Cloud Storage bucket via JavaScript; however, implementing security can make things a bit sticky.
In general, it should be fairly easy to use the Google Cloud services found here
use the storage objects.insert API call
Now, if you would like to give clients access to upload data on your application's behalf, you may have to build a signed URL, signed with your key, and then upload directly to Cloud Storage; read more here
I find it easy to do server-side operations, if the upload size is fairly small, by using the Blobstore API. You can use the Blobstore API to store blobs in Cloud Storage instead of storing them in Blobstore. Read all about it here.
This method might be the easiest to implement in your case.
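As a rough sketch of that Blobstore-to-GCS route (assuming webapp2; the bucket name and handler paths are placeholders):

    import webapp2
    from google.appengine.ext import blobstore
    from google.appengine.ext.webapp import blobstore_handlers

    class UploadFormHandler(webapp2.RequestHandler):
        def get(self):
            # gs_bucket_name routes the upload into GCS instead of Blobstore.
            upload_url = blobstore.create_upload_url(
                '/upload_complete', gs_bucket_name='my-bucket')
            self.response.write(
                '<form action="%s" method="POST" enctype="multipart/form-data">'
                '<input type="file" name="file"><input type="submit"></form>'
                % upload_url)

    class UploadCompleteHandler(blobstore_handlers.BlobstoreUploadHandler):
        def post(self):
            # For uploads sent to a bucket, FileInfo carries the GCS object name.
            file_info = self.get_file_infos('file')[0]
            self.redirect('/?uploaded=%s' % file_info.gs_object_name)

    app = webapp2.WSGIApplication([
        ('/', UploadFormHandler),
        ('/upload_complete', UploadCompleteHandler),
    ])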
I have a GAE based application that pulls a file from cloud storage and then does some processing on that file. To run the application in the remote appengine environment, I first upload a file to cloud storage using the browser console, and then make requests to the application, which pulls the file I uploaded from cloud storage. I'd like to be able to do development locally, however there is not a sweet browser console for the local implementation of gcs, as discussed here: Local storage browser for Google Cloud Storage and dev_appserver.py.
I'm wondering if it's possible to use gsutil. It seems the local GCS implementation is accessible through a localhost endpoint, as mentioned here: Google Cloud Storage on Appengine Dev Server.
Right now, what I want to do is just load a file into my local GCS instance. I could do this by writing a little utility, but it seems much better to use gsutil if I can get it to connect to my local instance.
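For what it's worth, such a utility can be very small. Here is a hypothetical seeding handler, assuming the GoogleAppEngineCloudStorageClient library is available; the '/seed' path and 'my-local-bucket' are placeholders:

    import cloudstorage as gcs
    import webapp2

    class SeedHandler(webapp2.RequestHandler):
        def post(self):
            # Write the raw request body into the dev server's local GCS stub.
            name = self.request.get('name', 'seeded-file')
            path = '/my-local-bucket/%s' % name
            with gcs.open(path, 'w') as dest:
                dest.write(self.request.body)
            self.response.write('wrote %s' % path)

    app = webapp2.WSGIApplication([('/seed', SeedHandler)])

You could then seed the local instance with something like curl -X POST --data-binary @test.csv "http://localhost:8080/seed?name=test.csv".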
Thank you,
Ben
I'm using Cloud Storage (a bucket) as a CDN server for my application. I do not store user-uploaded files on my app server; instead they are saved to Google Cloud. The current implementation is as follows: the user uploads a file from the website (HTML page) to the API server, and the API server then saves the file to Google Cloud Storage (CDN/bucket), which is a two-step process.
But now I want to upload files directly from the Angular HTML page, i.e. from the website, to Google Cloud Storage. I know we can upload a file from the API server (Python, Java, or many others) to Cloud Storage, but I don't want to do that. The reason is that I want to eliminate the double transfer time: first uploading the file to our server, and then having the server send that file to Google Cloud. If the file size is small, the above workflow is acceptable, but what if I'm uploading large files to the server?
Is it possible to create some authentication header values on the server side that the client (HTML/website) can use to upload a file directly to a Google Cloud Storage bucket? After the response from the Google Cloud API, the client would call the API server to save the respective changes.
I did lots of R&D on it but didn't find any useful solution.
I also checked the link below, but no luck.
Upload file to Google Cloud Storage using AngularJS multipart form data
Create a signed URL to upload a file directly to GCS, without the client needing to authenticate.
https://cloud.google.com/storage/docs/access-control#Signed-URLs
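As a rough sketch, the server-side piece can be a few lines with the google-cloud-storage Python client (the bucket and object names are placeholders, and this assumes the server runs with service-account credentials):

    from datetime import timedelta

    from google.cloud import storage

    def make_upload_url(bucket_name, object_name):
        client = storage.Client()
        blob = client.bucket(bucket_name).blob(object_name)
        # A time-limited URL the browser can PUT the file bytes to directly.
        return blob.generate_signed_url(
            expiration=timedelta(minutes=15),
            method='PUT',
            content_type='application/octet-stream',
        )

The Angular client then PUTs the file to the returned URL with the same Content-Type, and on success calls your API server to record the change.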
In my application on Google App Engine, I have files/documents uploaded to Blob Store. The size of Blob Store has grown to more than 100 GB.
I am in search of a mechanism by which I can back up my Blobstore data, perhaps to some other server location, Google Cloud Storage, or some other safe place.
AFAIK there isn't a simple tool that you can use to backup your Blobstore data. There are at least a few approaches that you could take to write your own tool:
Use cron.xml and a servlet (or some similar scheduling) to copy batches of your entity blobs to Google Cloud Storage, and then download them from there (see below for tools; a rough sketch follows this list).
Use the remote API, or provide some other API (REST etc.), so that a remote tool you write can query your entities and then download the blobs to your local machine over a serving URL (which you probably already have).
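A rough sketch of the first approach (assuming the cloudstorage client library and a hypothetical backup bucket; a real implementation would also persist a cursor so each cron run resumes where the last one stopped):

    import cloudstorage as gcs
    import webapp2
    from google.appengine.ext import blobstore

    class BackupHandler(webapp2.RequestHandler):
        def get(self):
            # Copy a small batch per request; cron re-invokes this handler.
            for info in blobstore.BlobInfo.all().fetch(10):
                dest_path = '/my-backup-bucket/%s' % info.key()
                with gcs.open(dest_path, 'w',
                              content_type=info.content_type) as dest:
                    reader = blobstore.BlobReader(info.key())
                    chunk = reader.read(1024 * 1024)
                    while chunk:
                        dest.write(chunk)
                        chunk = reader.read(1024 * 1024)

    app = webapp2.WSGIApplication([('/backup', BackupHandler)])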
Those approaches are not very friendly. There are, however, tools such as gsutil that would be helpful if you migrated your data to Google Cloud Storage (GCS). I've seen a few people migrate their data now that upload and serving with GCS are supported by the Blobstore API.
It's important to also note that the Files API for Blobstore is now deprecated (since 1.8.1?) and that the new preferred way is through the App Engine GCS client.
Additionally, if you check the posts about the 1.8.1 pre-release SDKs being available, you'll see that there is some indication from Google that blobs might be auto-migrated to Cloud Storage (free of charge) before the end of the year (presumably close to when Blobstore will be deprecated). Depending on the urgency of your needs, it may be feasible to just wait.
You can use Backup/Restore functionality of GAE Admin console.