Saving image file on filesystem on Google App Engine with Python Flask - google-app-engine

I have a Flask app where a user can upload an image, and the image is saved to a static folder on the filesystem.
Currently I'm using Google App Engine for hosting and found that it's not possible to save to the static folder in the standard environment. Here is the code:
def save_picture(form_picture, name):
    picture_fn = name + '.jpg'
    picture_path = os.path.join(app.instance_path, 'static/image/' + picture_fn)
    output_size = (1000, 1000)
    i = Image.open(form_picture)
    i.thumbnail(output_size)
    i.save(picture_path)
    return picture_path
@app.route('/image/add', methods=['GET', 'POST'])
def addimage():
    form = Form()
    if form.validate_on_submit():
        name = 'randomname'
        try:
            picture_file = save_picture(form.image.data, name)
            return redirect(url_for('addimage'))
        except:
            flash("unsuccess")
            return redirect(url_for('addimage'))
My question is: if I change from the standard to the flexible environment, would it be possible to save to a static folder? If not, what other hosting options should I consider? Do you have any suggestions?
Thanks in advance.
Following your advice, I'm changing the app to use Cloud Storage. I'm wondering which of upload_from_file(), upload_from_filename(), or upload_from_string() I should use. The source_file takes its data from form.photo.data of a Flask-WTForms form. I haven't managed to save to Cloud Storage successfully yet. This is my code:
def upload_blob(bucket_name, source_file, destination_blob_name):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file)
    return destination_blob_name
@app.route('/image/add', methods=['GET', 'POST'])
def addimage():
    form = Form()
    if form.validate_on_submit():
        name = 'randomname'
        try:
            filename = 'foldername/' + name + '.jpg'
            picture_file = upload_blob('mybucketname', form.photo.data, filename)
            return redirect(url_for('addimage'))
        except:
            flash("unsuccess")
            return redirect(url_for('addimage'))
I have since been able to save the file to Google Cloud Storage by changing the save_picture function as follows, in case anyone has trouble with this in the future:
import os

from PIL import Image
from google.cloud import storage
from werkzeug.utils import secure_filename

app.config['BUCKET'] = 'yourbucket'
app.config['UPLOAD_FOLDER'] = '/tmp'

def save_picture(form_picture, name):
    picture_fn = secure_filename(name + '.jpg')
    picture_path = os.path.join(app.config['UPLOAD_FOLDER'], picture_fn)
    output_size = (1000, 1000)
    i = Image.open(form_picture)
    i.thumbnail(output_size)
    i.save(picture_path)
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(app.config['BUCKET'])
    blob = bucket.blob('static/image/' + picture_fn)
    blob.upload_from_filename(picture_path)
    return picture_path
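If you'd rather avoid the /tmp round trip entirely, upload_from_file() accepts any file-like object, so the resized image can be written to an in-memory buffer and uploaded directly. A minimal sketch of that variation (the function name save_picture_in_memory is just illustrative; it assumes the same imports and app.config['BUCKET'] setting as above):
import io

def save_picture_in_memory(form_picture, name):
    picture_fn = secure_filename(name + '.jpg')
    # Resize the uploaded image with Pillow, as before.
    img = Image.open(form_picture)
    img.thumbnail((1000, 1000))
    # Write the JPEG into an in-memory buffer instead of a file on disk.
    buf = io.BytesIO()
    img.save(buf, format='JPEG')
    buf.seek(0)
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(app.config['BUCKET'])
    blob = bucket.blob('static/image/' + picture_fn)
    # upload_from_file() reads from any file-like object.
    blob.upload_from_file(buf, content_type='image/jpeg')
    return picture_fn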

The problem with storing it in a folder is that it would live on that one instance and other instances would not be able to access it. Furthermore, instances in GAE come and go, so you would lose the image eventually.
You should use Google Cloud Storage for this:
from google.cloud import storage
client = storage.Client()
bucket = client.get_bucket('bucket-id-here')
blob = bucket.get_blob('remote/path/to/file.txt')
blob.upload_from_string('New contents!')
https://googleapis.dev/python/storage/latest/index.html

With Flask on App Engine (Python 3.7), I save files to a bucket in the following way, because I want to loop over many files:
for key, upload in request.files.items():
    file_storage = upload
    content_type = None
    identity = str(uuid.uuid4())  # or uuid.uuid4().hex
    try:
        upload_blob("f00b4r42.appspot.com", request.files[key], identity,
                    content_type=upload.content_type)
    except Exception:
        # handle/log a failed upload here
        pass
The helper function:
from google.cloud import storage

def upload_blob(bucket_name, source_file_name, destination_blob_name, content_type="application/octet-stream"):
    """Uploads a file to the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_file(source_file_name, content_type=content_type)
    blob.make_public()
    print('File {} uploaded to {}.'.format(
        source_file_name,
        destination_blob_name))
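Because the helper calls make_public(), each uploaded object becomes world-readable at a stable URL that you can hand back to the client or store alongside its metadata. A hedged sketch (public_url is derived purely from the bucket and object name; no extra API call is made):
from google.cloud import storage

def public_url_for(bucket_name, blob_name):
    # e.g. https://storage.googleapis.com/<bucket>/<object>
    blob = storage.Client().bucket(bucket_name).blob(blob_name)
    return blob.public_url

# e.g. inside the upload loop above:
# urls.append(public_url_for("f00b4r42.appspot.com", identity))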

Changing from the Google App Engine standard environment to the flexible environment will allow you to write to disk, as well as to choose a Compute Engine machine type with more memory for your specific application [1]. If you are interested in following this path, you can find all the relevant documentation on migrating a Python app here.
Nonetheless, as explained in the answer provided by user @Alex, instances are created (the number of instances is scaled up) or deleted (the number of instances is scaled down) according to your load, so the better option in your particular case would be to use Cloud Storage. You can find an example of uploading objects to Cloud Storage with Python here.

Related

Google Cloud Storage Object Read/Write Operation from App Engine

I am accessing an object in Google Cloud Storage from my App Engine standard Python 3 application, downloading the object to the /tmp folder, editing it, and then uploading the edited object back to Cloud Storage:
from google.cloud import storage

def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob from the bucket."""
    # bucket_name = "your-bucket-name"
    # source_blob_name = "storage-object-name"
    # destination_file_name = "local/path/to/file"
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)

def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    # bucket_name = "your-bucket-name"
    # source_file_name = "local/path/to/file"
    # destination_blob_name = "storage-object-name"
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)

def edit_file(request):
    # ... skipped for brevity ...
    download_blob(bucket_name, source_blob_name, destination_file_name)
    with open(source_file_name, "a") as f:
        f.write(f"{x}: {y}: {z}<br>")
    upload_blob(bucket_name, source_file_name, destination_blob_name)
Is there a way to edit the object directly, without downloading it to the /tmp folder? I could not find a method for this.
Objects in Cloud Storage are immutable, so you need to download the object somewhere, modify it to create a new object, and then upload it again.
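That "somewhere" doesn't have to be the /tmp folder, though: the client library can download the contents into memory and re-upload them, which avoids the local file entirely. A minimal sketch under that assumption (download_as_bytes() is called download_as_string() in older versions of google-cloud-storage):
from google.cloud import storage

def append_to_blob(bucket_name, blob_name, extra_text):
    """Download the object into memory, append text, and write the result
    back as a new generation of the same object (no /tmp file involved)."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    contents = blob.download_as_bytes().decode("utf-8")
    blob.upload_from_string(contents + extra_text, content_type="text/html")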

How to serve images/files from private bucket on Google Cloud

I have been pulling my hair out figuring out how to get an image from my App Engine bucket (or any private bucket) and display it on my Flask website running on GCP's App Engine.
I'm trying variations of the following:
@app.route('/get_object/<object>')
def get_object(object):
    client = storage.Client()
    bucket = client.get_bucket('bucket-id')
    blob = bucket.get_blob(object)
    return blob
and my HTML looks like so:
<img src={{ url_for('get_object', object='test.jpg') }}>
And in my App Engine default bucket there is a photo sitting there called test.jpg, but nothing seems to work. My bucket must stay private; is there any way to serve private files?
I did this:
My image src uses a Python function to pass the bucket and object:
<img src="{{ url_for('get_file', bucket='testBucket', object='image.png') }}">
Here's the function:
@app.route('/get_file/<testBucket>/<object>')
@requires_auth
def get_file(testBucket, object):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(testBucket)
    blob = bucket.blob(object)
    with tempfile.TemporaryDirectory() as tmpdirname:
        fullpath = os.path.join(tmpdirname, object)
        blob.download_to_filename(fullpath)
        return send_from_directory(tmpdirname, object)
This saves the file locally to the App Engine application in a temporary directory that gets deleted once the function exits on the return, so the user gets the file but the temporary location no longer exists. I think...
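If you'd rather not depend on the temporary directory's lifetime at all, another option is to download the object into memory and hand the bytes to Flask's send_file. A hedged sketch of that variation (same route and auth decorator assumed; the content type is guessed from the object name):
import io
import mimetypes

from flask import send_file
from google.cloud import storage

@app.route('/get_file/<testBucket>/<object>')
@requires_auth
def get_file(testBucket, object):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(testBucket)
    blob = bucket.blob(object)
    # Pull the object into memory instead of writing it to disk.
    data = blob.download_as_bytes()
    mime = mimetypes.guess_type(object)[0] or 'application/octet-stream'
    return send_file(io.BytesIO(data), mimetype=mime)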

Getting the duration of mp3 files stored in Google App Engine Cloud Storage

I have mp3 files stored in Google App Engine Cloud Storage and I want to get their durations.
I made this code with help from someone here, but unfortunately the AudioSystem class doesn't work with Google App Engine Cloud Storage.
Does someone know a way to do it?
ListResult lr = gcsService.list(mybucketname, ListOptions.DEFAULT);
while (lr.hasNext() && playlistLength > 0) {
    ListItem li = lr.next();
    String filename = li.getName();
    GcsService gcsService =
        GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());
    GcsInputChannel readChannel = gcsService.openPrefetchingReadChannel(
        new GcsFilename(mybucketName, fileName), 0, 1024 * 1024);
    AudioInputStream audioInputStream;
    try (InputStream in = Channels.newInputStream(readChannel)) {
        audioInputStream = AudioSystem.getAudioInputStream(in);
    }
    long frames = audioInputStream.getFrameLength();
    double durationInSeconds = (frames + 0.0) / format.getFrameRate();
    playlistLength -= (int) (durationInSeconds) / 60;
}
Here is the error returned:
Error for /hello java.lang.NoClassDefFoundError: javax.sound.sampled.AudioSystem is a restricted class.
Please see the Google App Engine developer's guide for more details.
at com.google.apphosting.runtime.security.shared.stub.javax.sound.sampled.AudioSystem.<clinit>(AudioSystem.java)
Looking into the issue and the docs here, there's a good chance your issue is not solvable with this library, as it is currently restricted: it makes some kind of system call that the platform won't let you do.
You have multiple solutions available to you. I would suggest, when you upload the file, also writing an entity to the Datastore containing the metadata (such as the duration) and retrieving that instead.
You can also parse the file yourself and read the duration from it (mp3 being a pretty simple format, as Igor Artamonov points out).

Google Cloud Storage: download a file with a different name

I'm wondering if it's possible to download a file from Google Cloud Storage with a different name than the one it has in the bucket.
For example, in Google Cloud Storage I have stored a file named 123-file.txt, but when I download it I would like to choose a different name, let's say file.txt.
I've noticed that the download link looks like:
https://storage.cloud.google.com/bucket_name%2F123-file.txt?response-content-disposition=attachment;%20filename=123-file.txt
So I've tried to change it to:
https://storage.cloud.google.com/bucket_name%2F123-file.txt?response-content-disposition=attachment;%20filename=file.txt
But it still keeps downloading as 123-file.txt instead of file.txt.
The response-content-disposition parameter can only be used by authorized requests. Anonymous links don't work with it. You have a few options:
The content-disposition of a particular object is part of its metadata and can be permanently set. If you always want a specific file to be downloaded with a specific name, you can just permanently set the content-disposition metadata for the object.
You can also generate signed URLs that include the response-content-disposition query parameter. Then the users will be making authorized requests to download the resource.
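For the second option, the Python client library can bake the desired filename into a signed URL via the response_disposition argument of generate_signed_url(). A minimal sketch, assuming credentials that are able to sign (for example a service account key):
from datetime import timedelta
from google.cloud import storage

client = storage.Client()
blob = client.bucket('bucket_name').blob('123-file.txt')
# Anyone following this URL downloads the object as file.txt for the next hour.
url = blob.generate_signed_url(
    version='v4',
    expiration=timedelta(hours=1),
    response_disposition='attachment; filename="file.txt"',
)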
Example (Brandon Yarbrough's first option) with the JavaScript library:
const storage = new Storage()
const fileBucket = storage.bucket('myBucket')
const file = fileBucket.file('MyOriginalFile.txt')
const newName = "NewName.txt"

await file.save(content, {
    metadata: {
        contentDisposition: `inline; filename="${newName}"`
    }
})
The following is part of a Python script I've used to remove the forward slashes (added by Google Cloud Storage buckets to represent directories) from multiple object names. It's based on this blog post; please keep in mind the double quotes around the content-disposition file name.
def update_blob_download_name(bucket_name):
    """Update the download name of blobs and remove
    the path.

    :returns: None
    :rtype: None
    """
    # Storage client, not added to the code for brevity
    client = initialize_google_storage_client()
    bucket = client.bucket(bucket_name)
    for blob in bucket.list_blobs():
        if "/" in blob.name:
            remove_path = blob.name[blob.name.rfind("/") + 1:]  # rfind gives the last occurrence of the char
            ext = pathlib.Path(remove_path).suffix
            remove_id = remove_path[:remove_path.rfind("_id_")]
            new_name = remove_id + ext
            blob.content_disposition = f'attachment; filename="{new_name}"'
            blob.patch()

serving blobs from GAE blobstore in flask

I'm trying to serve big files saved in the blobstore using Flask.
For smaller files I can simply do:
def download_blob(blob_key):
    blob_info = blobstore.get(blob_key)
    response = make_response(blob_info.open().read())
    response.headers['Content-Type'] = blob_info.content_type
    response.headers['Content-Disposition'] = 'attachment; filename="%s"' % blob_info.filename
    return response
but it fails for larger files. How can I incorporate BlobstoreDownloadHandler into my Flask app without resorting back to webapp2?
If you don't care about range requests, then you can just set a header of 'X-AppEngine-BlobKey' (or blobstore.BLOB_KEY_HEADER, to be safe) with the string version of your blob key, along with the content type and disposition as you have them.
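A minimal sketch of that approach, assuming a runtime where the bundled google.appengine.ext.blobstore API is available; App Engine's frontend replaces the response body with the blob contents when it sees the X-AppEngine-BlobKey header, so large files never pass through your code:
from flask import make_response
from google.appengine.ext import blobstore

def download_blob(blob_key):
    blob_info = blobstore.get(blob_key)
    # Send an empty body; the App Engine frontend streams the blob
    # when it sees BLOB_KEY_HEADER ('X-AppEngine-BlobKey').
    response = make_response('')
    response.headers[blobstore.BLOB_KEY_HEADER] = str(blob_key)
    response.headers['Content-Type'] = blob_info.content_type
    response.headers['Content-Disposition'] = (
        'attachment; filename="%s"' % blob_info.filename)
    return response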
