I have been pulling my hair out figuring out how to get an image from my App Engine bucket (or any private bucket) and display it on my Flask website running on GCP's App Engine.
I'm trying variations of the following:
@app.route('/get_object/<object>')
def get_object(object):
    client = storage.Client()
    bucket = client.get_bucket('bucket-id')
    blob = bucket.get_blob(object)
    return blob
and my HTML looks like so:
<img src="{{ url_for('get_object', object='test.jpg') }}">
And in my App Engine default bucket I have a photo called test.jpg sitting there, but nothing seems to work. My bucket must be private; is there any way to serve private files?
I did this:
My image src uses a Python function to pass the bucket and object:
<img src="{{ url_for('get_file', bucket='testBucket', object='image.png') }}">
Here's the function:
@app.route('/get_file/<testBucket>/<object>')
@requires_auth
def get_file(testBucket, object):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(testBucket)
    blob = bucket.blob(object)
    with tempfile.TemporaryDirectory() as tmpdirname:
        fullpath = os.path.join(tmpdirname, object)
        blob.download_to_filename(fullpath)
        return send_from_directory(tmpdirname, object)
This saves the file locally to the App Engine application in a temporary directory that gets deleted once the function exits on the return, so the user gets the file but the temp location no longer exists. I think...
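If you'd rather not touch the filesystem at all, the same idea works from memory. This is only a sketch under my assumptions (a recent google-cloud-storage client, your existing Flask app object, and a hypothetical route name); it streams the blob back with Flask's send_file:

import io

from flask import send_file
from google.cloud import storage


@app.route('/get_object_in_memory/<bucket_name>/<object_name>')
def get_object_in_memory(bucket_name, object_name):
    # Download the blob into an in-memory buffer instead of a temp directory
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    data = io.BytesIO(blob.download_as_bytes())  # download_as_string() on older clients
    # Stream the bytes back; the mimetype is assumed to be JPEG here
    return send_file(data, mimetype='image/jpeg')

The object never leaves the server side except through your route, so the bucket itself can stay private; only the service account running the app needs read access.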
I am accessing an object in Google Cloud Storage from my App Engine Standard Python 3 application, downloading the object to the /tmp folder, editing it, and then uploading the edited object back to Google Cloud Storage:
from google.cloud import storage


def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob from the bucket."""
    # bucket_name = "your-bucket-name"
    # source_blob_name = "storage-object-name"
    # destination_file_name = "local/path/to/file"
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)


def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    # bucket_name = "your-bucket-name"
    # source_file_name = "local/path/to/file"
    # destination_blob_name = "storage-object-name"
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)


def edit_file(request):
    ... skipped for brevity ...
    download_blob(bucket_name, source_blob_name, destination_file_name)
    with open(source_file_name, "a") as f:
        f.write(f"{x}: {y}: {z}<br>")
    upload_blob(bucket_name, source_file_name, destination_blob_name)
Is there a way to edit the object directly without downloading it to the /tmp folder? I could not find a method for this.
Objects in Cloud Storage are immutable, so you need to download the object somewhere, modify it to create the new object, and then upload it.
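You can at least avoid the /tmp round trip by doing the download/modify/upload in memory. A minimal sketch, assuming a text object and a recent google-cloud-storage client (the function name is made up for illustration):

from google.cloud import storage


def append_line(bucket_name, blob_name, line):
    # Still a download plus re-upload, but no local file is written
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    text = blob.download_as_text()        # read the current contents
    blob.upload_from_string(text + line)  # overwrite the object with the new contents

The object is still replaced wholesale; Cloud Storage has no partial in-place edit.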
I have a Flask app where a user can upload an image and the image is saved to a static folder on the filesystem.
Currently, I'm using Google App Engine for hosting and found that it's not possible to save to the static folder in the standard environment. Here is the code:
def save_picture(form_picture, name):
    picture_fn = name + '.jpg'
    picture_path = os.path.join(app.instance_path, 'static/image/' + picture_fn)
    output_size = (1000, 1000)
    i = Image.open(form_picture)
    i.thumbnail(output_size)
    i.save(picture_path)
    return picture_path


@app.route('/image/add', methods=['GET', 'POST'])
def addimage():
    form = Form()
    if form.validate_on_submit():
        name = 'randomname'
        try:
            picture_file = save_picture(form.image.data, name)
            return redirect(url_for('addimage'))
        except:
            flash("unsuccess")
            return redirect(url_for('addimage'))
My question is: if I change from the standard to the flexible environment, would it be possible to save to a static folder? If not, what other hosting options should I consider? Do you have any suggestions?
Thanks in advance.
Following your advice, I'm changing to use Cloud Storage. I'm wondering which I should use: upload_from_file(), upload_from_filename() or upload_from_string(). The source_file takes data from form.photo.data from flask-wtform. I'm not successfully saving to Cloud Storage yet. This is my code:
def upload_blob(bucket_name, source_file, destination_blob_name):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file)
    return destination_blob_name


@app.route('/image/add', methods=['GET', 'POST'])
def addimage():
    form = Form()
    if form.validate_on_submit():
        name = 'randomname'
        try:
            filename = 'foldername/' + name + '.jpg'
            picture_file = upload_blob('mybucketname', form.photo.data, filename)
            return redirect(url_for('addimage'))
        except:
            flash("unsuccess")
            return redirect(url_for('addimage'))
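For what it's worth, the three methods differ only in what they accept: upload_from_filename() wants a path on disk, upload_from_string() wants bytes or str, and upload_from_file() wants a file-like object, which is what the werkzeug FileStorage in form.photo.data is. A sketch of that last variant, assuming the same bucket and field names as above:

from google.cloud import storage


def upload_blob(bucket_name, file_storage, destination_blob_name):
    # file_storage is the werkzeug FileStorage object from form.photo.data
    storage_client = storage.Client()
    blob = storage_client.bucket(bucket_name).blob(destination_blob_name)
    blob.upload_from_file(file_storage, content_type=file_storage.content_type)
    return destination_blob_name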
I have successfully been able to save the file on Google Cloud Storage by changing the save_picture function, just in case anyone has trouble with this in the future:
app.config['BUCKET'] = 'yourbucket'
app.config['UPLOAD_FOLDER'] = '/tmp'


def save_picture(form_picture, name):
    picture_fn = secure_filename(name + '.jpg')
    picture_path = os.path.join(app.config['UPLOAD_FOLDER'], picture_fn)
    output_size = (1000, 1000)
    i = Image.open(form_picture)
    i.thumbnail(output_size)
    i.save(picture_path)
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(app.config['BUCKET'])
    blob = bucket.blob('static/image/' + picture_fn)
    blob.upload_from_filename(picture_path)
    return picture_path
The problem with storing it to some folder is that it would live on that one instance and other instances would not be able to access it. Furthermore, instances in GAE come and go, so you would lose the image eventually.
You should use Google Cloud Storage for this:
from google.cloud import storage
client = storage.Client()
bucket = client.get_bucket('bucket-id-here')
blob = bucket.get_blob('remote/path/to/file.txt')
blob.upload_from_string('New contents!')
https://googleapis.dev/python/storage/latest/index.html
With Flask and App Engine, Python 3.7, I save files to a bucket in the following way, because I want to loop over many files:
import uuid

for key, upload in request.files.items():
    file_storage = upload
    content_type = None
    identity = str(uuid.uuid4())  # or uuid.uuid4().hex
    try:
        upload_blob("f00b4r42.appspot.com", request.files[key], identity,
                    content_type=upload.content_type)
    except Exception:
        pass  # error handling omitted in the original snippet
The helper function:
from google.cloud import storage


def upload_blob(bucket_name, source_file_name, destination_blob_name, content_type="application/octet-stream"):
    """Uploads a file to the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_file(source_file_name, content_type=content_type)
    blob.make_public()
    print('File {} uploaded to {}.'.format(
        source_file_name,
        destination_blob_name))
Changing from the Google App Engine Standard Environment to the Google App Engine Flexible Environment will allow you to write to disk, as well as to choose a Compute Engine machine type with more memory for your specific application [1]. If you are interested in following this path, find all the relevant documentation on migrating a Python app here.
Nonetheless, as user @Alex explained in his answer, instances are created (the number of instances is scaled up) or deleted (the number of instances is scaled down) according to your load, so the better option in your particular case would be to use Cloud Storage. Find an example of uploading objects to Cloud Storage with Python here.
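To tie this back to the original save_picture(): you can keep the Pillow resize and skip the local filesystem entirely by uploading from an in-memory buffer. This is a sketch only; the bucket name and object path are placeholders, and it assumes a JPEG-compatible image:

import io

from PIL import Image
from google.cloud import storage


def save_picture_to_gcs(form_picture, name, bucket_name='yourbucket'):
    # Resize with Pillow, then upload straight from memory
    image = Image.open(form_picture)
    image.thumbnail((1000, 1000))
    buffer = io.BytesIO()
    image.save(buffer, format='JPEG')
    buffer.seek(0)

    client = storage.Client()
    blob = client.bucket(bucket_name).blob('static/image/' + name + '.jpg')
    blob.upload_from_file(buffer, content_type='image/jpeg')
    return blob.name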
So... This is my code for uploading an image with the Ferris2 framework. Yay me, it works. However, see how I had to comment out gcs.open(...)? I don't want that commented out. I'd really love to just upload straight to Cloud Storage using that call, without having to use anything related to blobs. What's the easiest way to accomplish this, given that I'm stuck with using AClassForm and the Ferris framework?
class AClassForm(forms.model_form(AClass, exclude=('image_url',))):
    image = FileField(u'Image File')


class AClasses(Controller):
    class Meta:
        Model = AClass
        prefixes = ('admin',)
        components = (scaffold.Scaffolding, Upload)
        Form = AClassForm

    admin_list = scaffold.list
    admin_view = scaffold.view
    admin_edit = scaffold.edit
    admin_delete = scaffold.delete

    def admin_add(self):
        self.scaffold.ModelForm = AClassForm
        self.scaffold.form_encoding = "multipart/form-data"

        def before_save_callback(controller, container, item):
            image = self.request.params["image"]
            object_name = blobstore.parse_file_info(image).gs_object_name.split('/')[-1]
            upload_settings = settings.get("upload")
            url = upload_settings["url"]
            bucket = upload_settings["bucket"]
            # send to the cloud
            # write a task to execute this?
            item.image_url = url % (bucket, object_name)
            # gcs_file = gcs.open("/".join(["", bucket, object_name]),
            #                     'w', content_type="image/jpeg",
            #                     options={'x-goog-acl': 'public-read'})
            # gcs_file.write(item.image)  # .file.encode('utf-8')
            # gcs_file.close()
            return

        self.events.scaffold_before_save += before_save_callback
        return scaffold.add(self)
I am not sure how Ferris works internally but you can use cloudstorage directly.
My image storage wrapper provides resizing and returns a public URL to the upload for serving directly from storage.
import urlparse

from google.appengine.api import app_identity, blobstore, images
import cloudstorage


class ImageStorage(object):

    def __init__(self, base_uri):
        self.base_uri = "/{}/{}".format(app_identity.get_default_gcs_bucket_name(), base_uri.lstrip('/'))

    def put(self, image, name, mime=None, width=None, height=None):
        """Puts an image into Google Cloud Storage"""
        if width or height:
            image = images.resize(image, width=width or height, height=height or width)
            mime = 'image/png'  # resize defaults to output_encoding=PNG
        options = {'x-goog-acl': 'public-read'}
        with cloudstorage.open(self.make_key(name), 'w', content_type=mime, options=options) as fp:
            fp.write(image)
        return self.get_url(name)

    def get_url(self, name):
        """Gets the url for an image from Google Cloud Storage"""
        key = self.make_key(name)
        # must be prefixed with /gs for the blob store to know it is from gcs
        # https://cloud.google.com/appengine/docs/python/blobstore/#Python_Using_the_Blobstore_API_with_Google_Cloud_Storage
        url = images.get_serving_url(blobstore.create_gs_key('/gs' + key))
        # s/https/http/ if running under dev_appserver.py
        # if not config.isLocal:
        #     parts = urlparse.urlparse(url)
        #     secure_parts = ('https',) + parts[1:]
        #     url = urlparse.urlunparse(secure_parts)
        return url

    def make_key(self, name):
        """Makes an item name key for Google Cloud Storage"""
        return '%s/%s' % (self.base_uri, name)
Usage inside a subclass of webapp2.RequestHandler. This stores the file uploaded under the name "image" in your app's default Cloud Storage bucket, at the path /some/bucket/path/my-image-name.
thumb = self.request.POST["image"]
if hasattr(thumb, 'value'):
    gs_base_uri = '/some/bucket/path'
    image_storage = ImageStorage(gs_base_uri)
    thumb_fn = 'my-image-name'
    session_thumb_url = image_storage.put(image=thumb.value,
                                          name=thumb_fn,
                                          mime=thumb.type,
                                          width=300, height=300)
    return session_thumb_url
As I understand it, if you're using the Upload component of Ferris you can't escape the Blobstore, but the following comes pretty close. You don't have to use the Form class if you don't want to; I rarely use it myself. So imagine the following Controller:
from ferris import Controller, route
from ferris.components.upload import Upload
import cloudstorage as gcs
from google.appengine.ext import blobstore
from google.appengine.api import images
import logging


class ImageManager(Controller):

    class Meta:
        components = (Upload,)

    @route
    def list(self):
        # This just passes the upload URL to use in the form
        self.context['upload_url'] = self.components.upload.generate_upload_url(uri=self.uri('image_manager:image_uploader_action'))

    @route
    def image_uploader_action(self):
        # This gets all of the uploads passed in from the form
        uploads = self.components.upload.get_uploads()
        # This is the raw google cloud object reference. 'myfile' is the name of the upload field in the html form
        file_gcs_obj_name = uploads['myfile'][0].cloud_storage.gs_object_name
        # This is the blobstore key just for giggles
        file_blobstore_key = uploads['myfile'][0].key()
        # This will get rid of the preceding junk you don't need i.e. "/gs/yadda/yadda"
        clean_file_name = file_gcs_obj_name[3:]
        # This is the name of the file as it was uploaded by the end-user
        file_name_friendly = uploads['myfile'][0].filename
        # This is the actual file, with this you can do whatever you want
        the_actual_image = gcs.open(clean_file_name, 'r')
        # The file name by default is long and ugly, lets make a copy of the file with a more friendly name
        new_filename = '/mydomain.appspot.com/' + file_name_friendly
        gcs.copy2(clean_file_name, new_filename)
        # We can generate a serving URL by using the blobstore API
        valid_blob_reference = blobstore.create_gs_key('/gs' + new_filename)
        file_serving_url = images.get_serving_url(valid_blob_reference)
        logging.info('the serving url is: %s' % file_serving_url)
        # delete the original image from cloud storage
        gcs.delete(clean_file_name)
        # Delete the original image from blobstore
        blobstore.delete(file_blobstore_key)
        # Close the file
        the_actual_image.close()
        return 'Done. go check the cloud storage browser'
Now all you need is the HTML form. You can use something like this:
{% extends "layouts/default.html" %}
{% block layout_content %}
<form name="myform" action="{{upload_url}}" method="POST" enctype="multipart/form-data">
<input type="file" name="myfile" id="fileToUpload">
<input type="submit" value="Upload File" name="submit">
</form>
{% endblock %}
Ferris is still going to place a file in the Blobstore, but you can delete it after you've used the cloudstorage.copy2() function. That function is fairly new, so remember to update your cloudstorage package; you can download the latest copy from Google or PyPI (https://pypi.python.org/pypi/GoogleAppEngineCloudStorageClient/1.9.22.1).
Hope this helps.
I am trying to upload a JPG photo from a PhoneGap app (JavaScript) to Google App Engine (PHP), store the parameters in a database, and store the photo in Google Cloud Storage. Everything works except the photo file transfer.
The upload function I'm using is typical of PhoneGap's file transfer example http://docs.phonegap.com/en/edge/cordova_file_file.md.html#FileTransfer.
function uploadPhoto(imageURI) {
    // imageURI is photo local url file:///Users/me/Library/...etc ... .jpg
    // from navigator.camera.getPicture function

    // prepare post variables
    var options = new FileUploadOptions();
    options.fileKey = "file";
    options.fileName = imageURI.substr(imageURI.lastIndexOf('/') + 1);
    options.mimeType = "image/jpeg";
    var params = new Object();
    params.foo = "foo";
    options.params = params;
    options.chunkedMode = false;

    // upload image and options
    var ft = new FileTransfer();
    ft.upload(imageURI, 'http://myapphere.appspot.com/php/myphoto.php', function(r) {
        // do stuff with r.response
    }, function(error) {
        // error
    }, options, true);
}
On the Google App Engine server, the parameter variables pass fine; what doesn't seem to transfer is the $_FILES file.
/* php/myphoto.php */
// option variables are passed - this works
$foo = $_POST["foo"];
// but it seems the $_FILES is empty?
$gs_name = $_FILES["file"]["tmp_name"]; // <-- not transferring?
$fileName = 'test.jpg';
$moveResult = move_uploaded_file($gs_name, "gs://mybucket/".$fileName);
A file test.jpg is stored in mybucket (a small blank binary/octet-stream). As a test, I created an HTML form on GAE to upload an image file and move_uploaded_file() the $_FILES entry to mybucket, and it works. It's the transfer of $_FILES from the JavaScript app that I can't figure out. (I'm aware of the Google Cloud Storage JSON API objects.insert, but here I'd like to go from the PhoneGap HTML/JavaScript to a Google App Engine PHP page to process the passed data.)
I'm wondering if it's possible to download a file from Google Cloud Storage with a different name than the one it has in the bucket.
For example, in Google Cloud Storage I have stored a file named 123-file.txt but when I download it I would like choose a different name, let's say file.txt
I've noticed that the link to download it looks like:
https://storage.cloud.google.com/bucket_name%2F123-file.txt?response-content-disposition=attachment;%20filename=123-file.txt
So I've tried to change it to:
https://storage.cloud.google.com/bucket_name%2F123-file.txt?response-content-disposition=attachment;%20filename=file.txt
But it still keeps downloading as 123-file.txt instead of file.txt
The response-content-disposition parameter can only be used by authorized requests. Anonymous links don't work with it. You have a few options:
The content-disposition of a particular object is part of its metadata and can be permanently set. If you always want a specific file to be downloaded with a specific name, you can just permanently set the content-disposition metadata for the object.
You can also generate signed URLs that include the response-content-disposition query parameter. Then the users will be making authorized requests to download the resource.
An example of the first option (from Brandon Yarbrough's answer) with the JavaScript client library:
const { Storage } = require('@google-cloud/storage')

const storage = new Storage()
const fileBucket = storage.bucket('myBucket')
const file = fileBucket.file('MyOriginalFile.txt')
const newName = "NewName.txt"

// `content` holds the file contents; run this inside an async function
await file.save(content, {
    metadata: {
        contentDisposition: `inline; filename="${newName}"`
    }
})
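For the second option (signed URLs), the Python client's generate_signed_url() accepts a response_disposition argument that overrides the download name per request. A minimal sketch, assuming credentials that can sign URLs (the function name is made up for illustration):

import datetime

from google.cloud import storage


def signed_download_url(bucket_name, blob_name, download_name):
    # V4 signed URL that tells the browser to save the object under download_name
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    return blob.generate_signed_url(
        version='v4',
        expiration=datetime.timedelta(minutes=15),
        response_disposition=f'attachment; filename="{download_name}"',
    )

For example, signed_download_url('bucket_name', '123-file.txt', 'file.txt') returns a link that downloads as file.txt.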
The following is part of a Python script I've used to remove the forward slashes (added by Google Cloud buckets to represent directories) from the download names of multiple objects. It's based on this blog post; please keep in mind the double quotes around the content-disposition "file name".
import pathlib


def update_blob_download_name(bucket_name):
    """Update the download name of blobs and remove the path.

    :returns: None
    :rtype: None
    """
    # Storage client, not added to the code for brevity
    client = initialize_google_storage_client()
    bucket = client.bucket(bucket_name)
    for blob in bucket.list_blobs():
        if "/" in blob.name:
            remove_path = blob.name[blob.name.rfind("/") + 1:]  # rfind gives the last occurrence of the char
            ext = pathlib.Path(remove_path).suffix
            remove_id = remove_path[:remove_path.rfind("_id_")]
            new_name = remove_id + ext
            blob.content_disposition = f'attachment; filename="{new_name}"'
            blob.patch()