I am saving a user's image as a BlobProperty by doing:
user.image = urlfetch.fetch(image_url).content
Then I'm rendering that image using a url such as:
/image/user_id
The image must be saving, because when I do len(user.image) I get a number in the thousands, and on the local instance the image renders OK. On the deployed app, I get the following error, and when I go to the image URL nothing shows in the browser:
Traceback (most recent call last):
File "/base/python27_runtime/python27_dist/lib/python2.7/wsgiref/handlers.py", line 86, in run
self.finish_response()
File "/base/python27_runtime/python27_dist/lib/python2.7/wsgiref/handlers.py", line 127, in finish_response
self.write(data)
File "/base/python27_runtime/python27_dist/lib/python2.7/wsgiref/handlers.py", line 202, in write
assert type(data) is StringType,"write() argument must be string"
AssertionError: write() argument must be string
Also, here's the handler that serves the image:
class ImageHandler(webapp2.RequestHandler):
    """Returns an image based on id."""

    def get(self, *args, **kwargs):
        user = db.get(
            db.Key.from_path('User', models.User.get_key_name(kwargs.get('id'))))
        if user.image:
            self.response.headers['Content-Type'] = "image/jpeg"
            self.response.out.write(user.image)
        else:
            self.response.out.write("No image")
Just to clarify: I tried setting the Content-Type to both jpeg and png, and things work fine on the local server. Any help would be appreciated. Thanks!
Why not write the image to the blobstore and then use the send_blob() mechanism?
http://code.google.com/appengine/docs/python/blobstore/overview.html#Serving_a_Blob
Answering my own question: this fixes everything:
self.response.out.write(str(user.image))
It's confusing because the example in the docs does not cast the BlobProperty to a string.
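The cast is likely needed because db.Blob subclasses str, while wsgiref's assert type(data) is StringType is an exact type check that any str subclass fails. A minimal sketch of the same pitfall in plain Python (shown here in Python 3 syntax):

```python
class Blob(str):
    """Stand-in for google.appengine.ext.db.Blob, which subclasses str."""
    pass

data = Blob("fake image bytes")

print(type(data) is str)       # False: the exact type check rejects the subclass
print(isinstance(data, str))   # True: it still behaves like a string
print(type(str(data)) is str)  # True: casting makes the exact check pass
```

So str(user.image) produces a plain str that satisfies wsgiref's assertion, while the raw Blob does not.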
I've been using endpoints_proto_datastore library to build my endpoints APIs and I am trying to figure out how to return a list of records result that I've retrieved from the search API.
The @query_method seems to require a Query type to return, and it does the fetch call internally. How would I go about implementing an endpoint method that handles full-text search? Do I just define a custom protorpc request Message and response Message and skip the endpoints_proto_datastore library altogether?
This is what I tried, and I got an error saying that list has no ToMessage attribute.
Encountered unexpected error from ProtoRPC method implementation: AttributeError ('list' object has no attribute 'ToMessage')
Traceback (most recent call last):
File "google_appengine/lib/protorpc-1.0/protorpc/wsgi/service.py", line 181, in protorpc_service_app
response = method(instance, request)
File "google_appengine/lib/endpoints-1.0/endpoints/api_config.py", line 1329, in invoke_remote
return remote_method(service_instance, request)
File "google_appengine/lib/protorpc-1.0/protorpc/remote.py", line 412, in invoke_remote_method
response = method(service_instance, request)
File "third_party/py/endpoints_proto_datastore/ndb/model.py", line 1416, in EntityToRequestMethod
response = response.ToMessage(fields=response_fields)
AttributeError: 'list' object has no attribute 'ToMessage'
Here's a general view of the code:
class MyModel(EndpointsModel):
    SearchSchema = MessageFieldsSchema(('q',))

    _query_string = None

    def QueryStringSet_(self, value):
        self._query_string = value

    @EndpointsAliasProperty(name='q', setter=QueryStringSet_)
    def query_string(self):
        return self._query_string


class MyServices(...):

    @MyModel.method(
        request_fields=MyModel.SearchSchema,
        name='search', path='mymodel/search')
    def SearchMyModel(self, request):
        return MyModel.Search(request.q)
If you were using Java, then the answer would be to use
import com.google.api.server.spi.response.CollectionResponse;
In Python, you need to create response Message classes.
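The shape of such a response: a wrapper message holding a repeated field of item messages, the Python analogue of Java's CollectionResponse. Sketched here with plain dataclasses so it stands alone (in a real service these would be protorpc messages.Message classes, and the field names are illustrative):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class MyModelMessage:
    # One search-result record (field names here are illustrative)
    name: str = ''
    score: float = 0.0


@dataclass
class MyModelCollection:
    # Wrapper holding the repeated results field, analogous to
    # Java's CollectionResponse
    items: List[MyModelMessage] = field(default_factory=list)


results = MyModelCollection(items=[MyModelMessage(name='a'),
                                   MyModelMessage(name='b')])
print(len(results.items))  # 2
```

Your search method then builds and returns the wrapper instead of a bare list, which is what triggered the 'list' object has no attribute 'ToMessage' error.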
I have a pipeline that creates a blob in the blobstore and places the resulting blob_key in one of its named outputs. When I run the pipeline through the web interface I have built around it, everything works wonderfully. Now I want to create a small test case that will execute this pipeline, read the blob out from the blobstore, and store it to a temporary location somewhere else on disk so that I can inspect it. (Since testbed.init_files_stub() only stores the blob in memory for the life of the test).
The pipeline within the test case seems to work fine, and results in what looks like a valid blob_key, but when I pass that blob_key to the blobstore.BlobReader class, it cannot find the blob for some reason. From the traceback, it seems like the BlobReader is trying to access the real blobstore, while the writer (inside the pipeline) is writing to the stubbed blobstore. I have --blobstore_path setup on dev_appserver.py, and I do not see any blobs written to disk by the test case, but when I run it from the web interface, the blobs do show up there.
Here is the traceback:
Traceback (most recent call last):
File "/Users/mattfaus/dev/webapp/coach_resources/student_use_data_report_test.py", line 138, in test_serial_pipeline
self.write_out_blob(stage.outputs.xlsx_blob_key)
File "/Users/mattfaus/dev/webapp/coach_resources/student_use_data_report_test.py", line 125, in write_out_blob
writer.write(reader.read())
File "/Users/mattfaus/Desktop/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/blobstore/blobstore.py", line 837, in read
self.__fill_buffer(size)
File "/Users/mattfaus/Desktop/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/blobstore/blobstore.py", line 809, in __fill_buffer
self.__position + read_size - 1)
File "/Users/mattfaus/Desktop/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/blobstore/blobstore.py", line 657, in fetch_data
return rpc.get_result()
File "/Users/mattfaus/Desktop/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/api/apiproxy_stub_map.py", line 604, in get_result
return self.__get_result_hook(self)
File "/Users/mattfaus/Desktop/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/api/blobstore/blobstore.py", line 232, in _get_result_hook
raise _ToBlobstoreError(err)
BlobNotFoundError
Here is my test code:
def write_out_blob(self, blob_key, save_path='/tmp/blob.xlsx'):
    """Reads a blob from the blobstore and writes it out to the file."""
    print str(blob_key)
    # blob_info = blobstore.BlobInfo.get(str(blob_key))  # Returns None
    # reader = blob_info.open()  # Returns None
    reader = blobstore.BlobReader(str(blob_key))
    writer = open(save_path, 'w')
    writer.write(reader.read())
    print blob_key, 'written to', save_path

def test_serial_pipeline(self):
    stage = student_use_data_report.StudentUseDataReportSerialPipeline(
        self.query_config)
    stage.start_test()
    self.assertIsNotNone(stage.outputs.xlsx_blob_key)
    self.write_out_blob(stage.outputs.xlsx_blob_key)
It might be useful if you showed how you finalized the blobstore file, or if you could try that finalization code separately. It sounds like the Files API didn't finalize the file correctly on the dev appserver.
Turns out that I was simply missing the .value property, here:
self.assertIsNotNone(stage.outputs.xlsx_blob_key)
self.write_out_blob(stage.outputs.xlsx_blob_key.value) # Don't forget .value!!
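The reason the dereference matters: a pipeline's named outputs are slot objects, not raw values, so str(blob_key) on the slot yields its repr rather than the blob key, and BlobReader then looks up a key that doesn't exist. A toy illustration (Slot here is a stand-in for the pipeline library's slot class):

```python
class Slot(object):
    """Stand-in for a pipeline output slot: wraps the filled-in value."""
    def __init__(self, value):
        self.value = value


xlsx_blob_key = Slot('fake-blob-key')

print(str(xlsx_blob_key))    # the Slot's repr, not the key -- BlobReader fails
print(xlsx_blob_key.value)   # fake-blob-key
```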
[UPDATE]
The SDK dashboard also exposes a list of all blobs in your blobstore, conveniently sorted by creation date. It is available at http://127.0.0.1:8000/blobstore.
I recently started using bottle and GAE blobstore and while I can upload the files to the blobstore I cannot seem to find a way to download them from the store.
I followed the examples from the documentation but was only successful on the uploading part. I cannot integrate the example in my app since I'm using a different framework from webapp/2.
How would I go about creating an upload handler and download handler so that I can get the key of the uploaded blob and store it in my data model and use it later in the download handler?
I tried using the BlobInfo.all() to create a query the blobstore but I'm not able to get the key name field value of the entity.
This is my first interaction with the blobstore so I wouldn't mind advice on a better approach to the problem.
For serving a blob I would recommend you to look at the source code of the BlobstoreDownloadHandler. It should be easy to port it to bottle, since there's nothing very specific about the framework.
Here is an example on how to use BlobInfo.all():
for info in blobstore.BlobInfo.all():
    self.response.out.write(
        'Name:%s Key: %s Size:%s Creation:%s ContentType:%s<br>' %
        (info.filename, info.key(), info.size, info.creation, info.content_type))
For downloads, you only really need to generate a response that includes the header X-AppEngine-BlobKey: [your blob_key], along with anything else you need, such as a Content-Disposition header if desired. Or, if it's an image, you should probably just use the high-performance image serving API: generate a URL and redirect to it. Done.
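A framework-agnostic sketch of building those headers (blob_response_headers and its parameters are my own illustrative names, not an App Engine API; in bottle you would copy them onto the response object):

```python
def blob_response_headers(blob_key, filename=None, content_type=None):
    """Headers telling App Engine's front end to serve the blob itself."""
    headers = {'X-AppEngine-BlobKey': str(blob_key)}
    if content_type:
        headers['Content-Type'] = content_type
    if filename:
        # Prompt a download under the given name instead of inline display
        headers['Content-Disposition'] = (
            'attachment; filename="%s"' % filename)
    return headers


print(blob_response_headers('fake-key', filename='report.pdf',
                            content_type='application/pdf'))
```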
For uploads, besides writing a handler for App Engine to call once the upload is safely in the blobstore (that's in the docs), you need a way to find the blob info in the incoming request. I have no idea what the request looks like in bottle. The BlobstoreUploadHandler has a get_uploads method, and there's really no reason it needs to be an instance method as far as I can tell. So here's a generic implementation of it that expects a WebOb request; for bottle you would need to write something similar that is compatible with bottle's request object.
def get_uploads(request, field_name=None):
    """Get uploads for this request.

    Args:
        field_name: Only select uploads that were sent as a specific field.

    Returns:
        A list of BlobInfo records corresponding to each upload, or an
        empty list if there are no blob-info records for field_name.

    Stolen from the SDK, since they only provide a way to get at this
    through their webapp framework.
    """
    if not getattr(request, '__uploads', None):
        request.__uploads = {}
        for key, value in request.params.items():
            if isinstance(value, cgi.FieldStorage):
                if 'blob-key' in value.type_options:
                    request.__uploads.setdefault(key, []).append(
                        blobstore.parse_blob_info(value))

    if field_name:
        try:
            return list(request.__uploads[field_name])
        except KeyError:
            return []
    else:
        results = []
        for uploads in request.__uploads.itervalues():
            results += uploads
        return results
For anyone looking for this answer in the future: to do this you need bottle (d'oh!) and defnull's multipart module.
Since creating upload URLs is simple enough (see the GAE docs), I'll just cover the upload handler.
from bottle import request
from multipart import parse_options_header
from google.appengine.ext.blobstore import BlobInfo

def get_blob_info(field_name):
    try:
        field = request.files[field_name]
    except KeyError:
        # Maybe the form isn't multipart, or the file wasn't uploaded
        return None

    blob_data = parse_options_header(field.content_type)[1]
    try:
        return BlobInfo.get(blob_data['blob-key'])
    except KeyError:
        # Malformed request? Wrong field name?
        return None
Sorry if there are any errors in the code, it's off the top of my head.
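The key step above is parse_options_header pulling the blob-key parameter out of the upload field's rewritten content type. The same parsing can be sketched with the standard library (the header value below is illustrative, not a real blob key):

```python
from email.message import Message

# App Engine rewrites each uploaded field's content type to something like:
header = 'message/external-body; blob-key="fake-blob-key-123"'

# email.message.Message can parse parameters out of a Content-Type value
msg = Message()
msg['Content-Type'] = header
print(msg.get_param('blob-key'))  # fake-blob-key-123
```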
I have successfully uploaded a file to blobstore using this code.
But I am unable to download it.
What I am doing is:
class PartnerFileDownloadHandler(blobstore_handlers.BlobstoreDownloadHandler):
    def get(self, blob_key):
        resource = str(urllib.unquote(blob_key))
        logging.info('I am here.')  # This gets printed successfully.
        blob_info = blobstore.BlobInfo.get(blob_key)
        logging.info(blob_info)  # This gets logged too.
        self.send_blob(blob_info)
I have also tried:
blobstore.BlobReader(blob_key).read()
and I get the file data as a string, but I cannot write it to a file, since the local file system cannot be accessed from within a handler, I guess.
The way I am uploading a file is the only way in my project, so I cannot use the usual way specified in Google's official tutorial. Also, the file I am uploading to the blobstore is not present on my local file system; I fetch it from a URL, and perhaps this is why I am not able to download the file.
Any suggestions?
Thanks
Perhaps you should use resource instead of blob_key from your code sample?
class PartnerFileDownloadHandler(blobstore_handlers.BlobstoreDownloadHandler):
    def get(self, blob_key):
        resource = str(urllib.unquote(blob_key))
        self.send_blob(resource)
You can use a download handler like this:
from mimetypes import guess_type

def mime_type(filename):
    return guess_type(filename)[0]

class Thumbnailer(blobstore_handlers.BlobstoreDownloadHandler):
    def get(self, blob_key):
        if blob_key:
            blob_info = blobstore.BlobInfo.get(blob_key)
            if blob_info:
                save_as = blob_info.filename
                content_type = mime_type(blob_info.filename)
                self.send_blob(blob_info, content_type=content_type,
                               save_as=save_as)
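guess_type maps a filename's extension to a MIME type, returning None when it has no guess, which is why the stored filename can drive the Content-Type here:

```python
from mimetypes import guess_type

# guess_type returns a (type, encoding) tuple; the first element
# is the MIME type, or None for an unknown extension
print(guess_type('photo.jpg')[0])     # image/jpeg
print(guess_type('report.pdf')[0])    # application/pdf
print(guess_type('file.fakeext')[0])  # None
```

So for files with unusual extensions you may want a fallback such as application/octet-stream instead of passing None to send_blob.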
I'm using webapp with Google App Engine.
I recently added a call to request.get('variable_name'). This worked fine, but completely changed the contents of request.body.
Upon closer examination, it looks like if I do not make a call to request.get(), then request.body yields text without any URL formatting. But after a call to request.get(), request.body contains text that includes URL formatting (a lot of '%' signs, etc...).
Am I using webapp wrong? should I not be mixing and matching these two methods for information retrieval?
Here is some sample code:
class profiles_resource(webapp.RequestHandler):
    def post(self):
        # Value of request.body in debugger: 'str: {"query":"SELECT...'
        token = self.request.get('token')
        # Value of request.body in debugger: '%7B%22query%22%3A%22SELECT...'
request.get looks for request parameters in both the query string and the body of the request, assuming the body is form-encoded. If you intend to read the body directly, do not use self.request.get or self.request.POST.
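The '%' sequences are ordinary percent-encoding: the body seen after the get() call is just the URL-encoded form of the original JSON. A quick illustration with the standard library (Python 3 names shown; in the Python 2 runtime these live in urllib):

```python
from urllib.parse import quote, unquote

body = '{"query":"SELECT * FROM profiles"}'

encoded = quote(body)
print(encoded)                   # starts with %7B%22query%22 ('{' -> %7B, '"' -> %22)
print(unquote(encoded) == body)  # True: decoding recovers the original JSON
```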