I got a message yesterday from Google saying that the Files API will be disabled on July 28th and that it is recommended to migrate to Google Cloud Storage.
Currently I use the Files API in the following way: once an email is received, I save its attachments (images only) to the Blobstore:
from google.appengine.api import files

bs_file = files.blobstore.create(mime_type=ctype,
                                 _blobinfo_uploaded_filename='screenshot_' + image_file_name)
with files.open(bs_file, 'a') as f:
    f.write(image_file)
files.finalize(bs_file)
blob_key = files.blobstore.get_blob_key(bs_file)
Later on, I read the same images back from the Blobstore and attach them to another mail I send:
attachments = []
for at_blob_key in message.attachments:
    blob_reader = blobstore.BlobReader(at_blob_key)
    blob_info = blobstore.BlobInfo.get(at_blob_key)
    if blob_reader and blob_info:
        filename = blob_info.filename
        attachments.append((filename, blob_reader.read()))
if len(attachments) > 0:
    email.attachments = attachments
email.send()
Now I am supposed to use Google Cloud Storage instead of the Blobstore. Google Cloud Storage is not free, so I have to enable billing. My Blobstore stored data is currently 0.27 GB, which is small, so it looks like I would not have to pay much. But I am afraid to enable billing, since some other part of my code could result in a huge bill (and there seems to be no way to enable billing for Google Cloud Storage only).
So, is there any way to keep using the Blobstore for file storage in my case? What else could I use for free instead of Google Cloud Storage (what about Google Drive)?
The example below uses the GCS default bucket to store your screenshots. The default bucket has a free quota.
import datetime

from google.appengine.api import app_identity
from google.appengine.ext import blobstore
import cloudstorage as gcs

default_bucket = app_identity.get_default_gcs_bucket_name()

# GCS filenames should be unique, so prefix the name with a timestamp
image_file_name = datetime.datetime.utcnow().strftime('%Y%m%d%H%M%S') + '_' + image_file_name
gcs_filename = '/%s/screenshot_%s' % (default_bucket, image_file_name)

with gcs.open(gcs_filename, 'w', content_type=ctype) as f:
    f.write(image_file)

blob_key = blobstore.create_gs_key('/gs' + gcs_filename)
blob_key = blobstore.BlobKey(blob_key)  # if it should be stored in NDB
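If you also want to read the screenshot back through GCS when building the outgoing mail, here is a minimal sketch, assuming you store gcs_filename (for example in NDB) alongside or instead of the blob key:
import cloudstorage as gcs

# Minimal read-back sketch; gcs_filename and image_file_name are assumed
# to have been stored when the screenshot was written.
attachments = []
with gcs.open(gcs_filename) as f:
    attachments.append(('screenshot_' + image_file_name, f.read()))
if attachments:
    email.attachments = attachments
    email.send()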
I need to access a JSON file in GCS (Google Cloud Storage) and change its content from Google App Engine (Python 3 standard runtime). Though I can read the content of the file in GCS with:
from google.cloud import storage
storage_client = storage.Client()
bucket = storage_client.bucket('bucket_name')
blob = bucket.blob('file_name')
file_content = blob.download_as_string().decode('utf-8')
I don't know how to write directly to the GCS file. It would also be okay if I could delete the GCS file and re-upload one created in some temp folder, but it looks like the GAE file system is read-only and I cannot create files on the server side. Could you help me? Thank you very much!
You can use the following code to upload a file:
from google.cloud import storage
from google.cloud.storage import Blob

client = storage.Client(project="my-project")
bucket = client.get_bucket("my-bucket")
blob = Blob("secure-data", bucket)
with open("my-file", "rb") as my_file:
    blob.upload_from_file(my_file)
I finally used Blob.upload_from_string() to avoid creating temp files.
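For reference, a minimal sketch of that approach (bucket and object names are placeholders): modify the JSON in memory and write it straight back, with no temp file needed:
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.bucket('bucket_name')
blob = bucket.blob('file_name')

# Read the current content, change it in memory, and write it straight back.
content = blob.download_as_string().decode('utf-8')
new_content = content.replace('"old_value"', '"new_value"')  # hypothetical edit
blob.upload_from_string(new_content, content_type='application/json')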
You need to use the client library that currently supports Python 2 and 3. An example of how to upload a file is below:
from gcloud import storage
from oauth2client.service_account import ServiceAccountCredentials
import os

credentials_dict = {
    'type': 'service_account',
    'client_id': os.environ['BACKUP_CLIENT_ID'],
    'client_email': os.environ['BACKUP_CLIENT_EMAIL'],
    'private_key_id': os.environ['BACKUP_PRIVATE_KEY_ID'],
    'private_key': os.environ['BACKUP_PRIVATE_KEY'],
}
credentials = ServiceAccountCredentials.from_json_keyfile_dict(
    credentials_dict
)
client = storage.Client(credentials=credentials, project='myproject')
bucket = client.get_bucket('mybucket')
blob = bucket.blob('myfile')
blob.upload_from_filename('myfile')
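Note that the gcloud and oauth2client packages are deprecated. A rough equivalent with the current google-cloud-storage and google-auth libraries, as a sketch assuming credentials_dict holds a complete service-account key (including token_uri):
from google.cloud import storage
from google.oauth2 import service_account

# A complete service-account key dict (including token_uri) is assumed here.
credentials = service_account.Credentials.from_service_account_info(credentials_dict)
client = storage.Client(credentials=credentials, project='myproject')
bucket = client.get_bucket('mybucket')
bucket.blob('myfile').upload_from_filename('myfile')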
In my development server, I am able to use the blob key to download a CSV object. The problem is that in production, the blob key does not download anything (it returns a 404), presumably because the blob key is inaccurate. I think this is because of Google's deprecation of the Blobstore, which no longer uses blob keys. This means I need to try to download from a Google Storage bucket. I am not sure how to do this; in the development server, I would go to the endpoint /data?key=<blob_key> to download the blob object.
I can also download the CSV object if I navigate to the bucket and to the item and then click download. Are there some minor adjustments I can make to get the download to occur? I would appreciate it if someone could point me in a particular direction.
To download objects from your Cloud Storage buckets, depending on your preferences, you can check the following code sample (Python):
from google.cloud import storage

def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob from the bucket."""
    # bucket_name = "your-bucket-name"
    # source_blob_name = "storage-object-name"
    # destination_file_name = "local/path/to/file"
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
    print(
        "Blob {} downloaded to {}.".format(
            source_blob_name, destination_file_name
        )
    )
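As a usage sketch (bucket and object names are placeholders; on App Engine the local filesystem is read-only except for /tmp), you can also pull the CSV into memory and return it from your handler instead of going through a blob key:
from google.cloud import storage

# Placeholder bucket/object names.
storage_client = storage.Client()
blob = storage_client.bucket("my-bucket").blob("exports/report.csv")
csv_bytes = blob.download_as_bytes()
# Return csv_bytes from your request handler with a text/csv content type,
# or write the bytes to /tmp/report.csv if a local file is required.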
Be sure that you are no longer using Python 2.7, since it is deprecated and no longer supported. If you are still on Python 2.7, please upgrade to Python 3.7.
Running a Dart server in the App Engine flexible environment, there seems to be a limit that prevents serving files larger than 32 MB.
There are a few requirements for the files I want to serve:
the file size can be larger than 32 MB
they cannot be publicly accessible (authorization is done on the server)
At the moment I try to read the file from the bucket using the gcloud library and then pipe it into the request.response. This fails because of the limit, e.g.: HTTP response was too large: 33554744. The limit is: 33554432.
Is there a way to serve larger files from storage? The documentation on this topic is quite confusing (I don't think there is Dart-specific documentation at all). I keep reading something about the Blobstore, but I am not sure if that solution is applicable to Dart.
As @filiph suggests, you can use signed URLs from Google Cloud Storage.
Server side, I have this code in Python:
import time
import base64

from oauth2client.service_account import ServiceAccountCredentials

def create_signed_url(file_location):
    # Get credentials
    creds = ServiceAccountCredentials.from_json_keyfile_name('filename_of_private_key.json')
    client_id = creds.service_account_email
    # Set time limit to two hours from now
    timestamp = time.time() + 2 * 3600
    # Generate signature string
    signature_string = "GET\n\n\n%d\n%s" % (timestamp, file_location)
    signature = creds.sign_blob(signature_string)[1]
    encoded_signature = base64.b64encode(signature)
    encoded_signature = encoded_signature.replace("+", "%2B").replace("/", "%2F")
    # Generate URL
    return "https://storage.googleapis.com%s?GoogleAccessId=%s&Expires=%d&Signature=%s" % (
        file_location, client_id, timestamp, encoded_signature)
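A usage sketch (the bucket and object path are placeholders): hand the signed URL to the already-authorized client instead of streaming the bytes through App Engine, which sidesteps the 32 MB response limit:
# Placeholder for the real bucket/object path.
signed_url = create_signed_url('/my-bucket/videos/large-file.mp4')
# In your handler, redirect the user to signed_url (e.g. with a 302) so the
# download is served directly by Cloud Storage.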
The appengine/image package works fine with images stored in Blobstore. However, what would be a good approach to resize images stored in Google Cloud Storage?
You can use the same Images Service with Google Cloud Storage, especially if you use the Blobstore API for uploading images.
A sample code in Java:
String fullName = "/gs/" + bucketName + "/" + objectName;
Image picture = ImagesServiceFactory.makeImageFromFilename(fullName);
Transform resize = ImagesServiceFactory.makeResize(maxWidth, maxHeight);
picture = imagesService.applyTransform(resize, picture);
In Go, you can use the BlobKeyForFile function:
BlobKeyForFile returns a BlobKey for a Google Storage file. The
filename should be of the form "/gs/bucket_name/object_name".
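If you are on the Python runtime, a rough sketch of the same idea (bucket and object names are placeholders): create a Blobstore-compatible key for the GCS object and let the Images Service work on it:
from google.appengine.ext import blobstore
from google.appengine.api import images

# Placeholder GCS object path.
blob_key = blobstore.create_gs_key('/gs/my-bucket/photos/original.png')

# Either resize the image data directly...
img = images.Image(blob_key=blob_key)
img.resize(width=800, height=600)
resized_png = img.execute_transforms(output_encoding=images.PNG)

# ...or get a serving URL that supports on-the-fly resizing (=s800 and similar).
serving_url = images.get_serving_url(blob_key)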
Image cropping functionality is fairly easy to implement these days. You can then do whatever you want with the image: store it back to Google Storage or return it immediately to the client.
What's more, you can easily deploy that functionality to any cloud-based serverless solution. I was using Cloud Run because it's Docker-based and hence can potentially be ported anywhere.
I have a service that we use for image cropping, based on Node.js/sharp and deployed to Google Cloud Run. You can use it as-is. There's nothing project-specific hardcoded in it.
From the App Engine MapReduce console (myappid.appspot.com/mapreduce/status), I have a MapReduce defined with input_reader: mapreduce.input_readers.BlobstoreLineInputReader that I have used successfully with a regular Blobstore file, but it doesn't work with a blob key created from Cloud Storage with create_gs_key. When I run it, I get the error "BadReaderParamsError: Could not find blobinfo for key THEKEY". The input reader checks for the existence of a BlobInfo. Is there any workaround for this? Shouldn't BlobInfo.get(BLOBKEY FROM CS) return a BlobInfo?
To get a blob_key from a Google Cloud Storage file, I run this:
from google.appengine.ext import blobstore
READ_PATH = '/gs/mybucket/myfile.json'
blob_key = blobstore.create_gs_key(READ_PATH)
print blob_key
A community member created a LineInputReader for Cloud Storage as an issue on the appengine-mapreduce library: http://code.google.com/p/appengine-mapreduce/issues/detail?id=140
We've posted our modifications here: https://github.com/thinkjson/CloudStorageLineInputReader
We're using this to do MapReduce over about 4TB of data, and have been happy with it so far.
Cloud Storage and the Blobstore are two different storage systems; you can't pass a Cloud Storage key as a Blobstore key.
You will need to implement your own line reader over the Cloud Storage file.
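A minimal sketch of such a reader on the Python 2.7 runtime with the cloudstorage library (the object path and the process() helper are hypothetical); a real MapReduce input reader would also need to track byte offsets so shards can be split and resumed:
import cloudstorage as gcs

# Hypothetical object path; process() stands in for your map function.
with gcs.open('/mybucket/myfile.json') as f:
    line = f.readline()
    while line:
        process(line)
        line = f.readline()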