Uploaded images not publicly readable - google-app-engine

I'm unable to see images at the URL returned by getPublicUrl(). The images are uploaded directly to Google Cloud Storage via HTTP POST, and I can see them in the Datastore/Blobstore Viewer at http://localhost:8000.
use google\appengine\api\cloud_storage\CloudStorageTools;

try {
    echo CloudStorageTools::getPublicUrl("gs://qbucket_first/sample.jpeg", true);
} catch (CloudStorageException $e) {
    echo 'There was an exception creating the Image Public URL, details ' . $e->getMessage();
}
What I get is this: http://localhost:8080/_ah/gcs/qbucket_first/sample.jpeg, but I receive a 404 error when entering that URL. How can I fix this problem? I am using Chrome on Mac OS X with GAE Launcher v1.8.9 (PHP).
Edit 1
If I run this code:
file_put_contents("gs://qbucket_first/hello.txt", "Hello");
echo CloudStorageTools::getPublicUrl("gs://qbucket_first/hello.txt", true);
I get
http://localhost:8080/_ah/gcs/qbucket_first/hello.txt
And when entering this URL, I can download the file and read its content. So it works with text files, which means the datastore is working on the local dev server.

The same thing happens in the Python version of Google App Engine.
dev_appserver.py creates a fake Cloud Storage for local development: when you write something to Google Cloud Storage in the local environment, it is actually written into that fake storage, and getPublicUrl() in the local environment only returns URLs for objects in the local fake Cloud Storage.
In the first case you uploaded the file to the real Google Cloud Storage, so in the local environment you get a 404.
In the second case you saved the file from your local runtime, so it was actually written into the local fake GCS environment, which is why it works.
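For the Python runtime the answer mentions, here is a minimal sketch of the workaround (bucket, object name, and image_bytes are placeholders, not from the original question): write the object into the local fake GCS from your dev-server code, and the locally generated URL will then resolve.

import cloudstorage as gcs

def save_image_locally(image_bytes):
    # Writing through the GCS client library on the dev server goes into the
    # fake local GCS, so the /_ah/gcs/... URL for this object will then work.
    with gcs.open('/qbucket_first/sample.jpeg', 'w',
                  content_type='image/jpeg') as gcs_file:
        gcs_file.write(image_bytes)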

You are using the dev server, which seems to be the problem. Run it in production instead.

Does the image filename have spaces?
If yes, remove the spaces or replace them with underscores (_).
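A trivial Python sketch of that suggestion (bucket and object names are placeholders):

# Replace spaces in the object name before building the gs:// path.
object_name = "my sample image.jpeg".replace(" ", "_")
gcs_path = "gs://qbucket_first/" + object_name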

Related

Access local Google Cloud Storage files in browser with filename instead of key

In Google App Engine (GAE), files that get stored to the local Cloud Storage show up in the admin console with a path. Example:
/gs/myapp.appspot.com.somefile.jpg
This one seems to get closer:
http://localhost:8080/_ah/img/encoded_gs_file:somefile.jpg
But that generates an error:
Error 404 ApplicationError: 6: Could not read blob.
This one works but it requires I know the key:
http://localhost:8080/_ah/img/encoded_gs_key:some_key
Is there a way to use the local url but use the filename instead of a key?
I think you should go through the details of this GitHub code about how to read and write blobs. The code confirms that for image files you always need the key.
For images, you need the key: http://localhost:8080/_ah/img/encoded_gs_file:[key]
For other files: http://localhost:8080/_ah/gcs/default_bucket/file_name
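For the Python runtime, a hedged sketch of that distinction (the object path is a placeholder): turn the GCS object path into a blobstore key and ask the Images service for a serving URL, which on the dev server comes back in the /_ah/img/encoded_gs_file:... form shown in the question.

from google.appengine.ext import blobstore
from google.appengine.api import images

# '/gs/<bucket>/<object>' is the blobstore-style path to the GCS object.
gs_key = blobstore.create_gs_key('/gs/default_bucket/somefile.jpg')
serving_url = images.get_serving_url(gs_key)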

App Engine: Copy live Datastore to local dev Datastore (that still works)

This used to be possible by downloading with the bulkloader and uploading to the local dev server. However, the bulkloader download has been non-functional for several months now, due to not supporting oauth2.
A few places recommend downloading from a Cloud Storage backup and uploading to the local datastore, either through the bulkloader or by directly parsing the backup. However, neither of these appears functional anymore. The bulkloader method throws:
OperationalError: unable to open database file
And the RecordsReader class, which is used to read the backup files, reaches end of file when trying to read the first record, resulting in no records being read.
Does there exist a current, functional, method for copying the live datastore to the local dev datastore?
RecordsReader works perfectly on Unix. I tried this gist https://gist.github.com/jehna/3b258f5287fcc181aacf a day ago and it worked great.
You should add your kinds implementation to the imports and run it in the datastore interactive shell,
for example:
from myproject.kinds_implementations import MyKind
I've removed this block:
for pp in dir(a):
    try:
        ppp = getattr(a, "_" + pp)
        if isinstance(ppp, db.Key):
            ppp._Key__reference.set_app(appname)
            ppp
    except AttributeError:
        """ It's okay """
And it worked well. In my case the backup was downloaded into multiple directories, so I modified the directory access to something like this:
for directory in listdir(mypath):
    full_directory_path = join(mypath, directory)
    for sub_dir in listdir(full_directory_path):
        full_sub_dir_path = join(full_directory_path, sub_dir)
        onlyfiles = [f for f in listdir(full_sub_dir_path) if isfile(join(full_sub_dir_path, f))]
        for file in onlyfiles:
If you're working on Windows, you're welcome to follow my question about RecordsReader on Windows; hopefully someone will answer there: Google datastore backup to local dev_appserver
Edit:
It works great on Windows if you change the file open mode from 'r' to 'rb'.
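A minimal sketch of that fix (the backup filename is a placeholder), assuming the backup is read with the SDK's records module:

from google.appengine.api.files import records

# One file from the downloaded backup (placeholder name).
backup_file_path = 'output-0'

# Open in binary mode: on Windows, text mode ('r') mangles the record stream
# and RecordsReader reaches end of file after reading zero records.
with open(backup_file_path, 'rb') as backup_file:
    for record in records.RecordsReader(backup_file):
        pass  # each record is one serialized entity from the backup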
The bulkloader is still functional on Python with OAuth2, albeit with some caveats. When downloading from the live app, there is an issue with refreshing the OAuth2 token, so the total download time is limited to 3600 seconds, or 3600+3600 seconds if you manually use a refresh token with --oauth2_refresh_token.
When uploading to the development server app, OAuth2 will fail with a 401, so it's necessary to edit google.appengine.ext.remote_api.handler and stub out 'CheckIsAdmin' to always return True as a workaround:
def CheckIsAdmin(self):
    return True
    user_is_authorized = False
    ...
I upvoted the above answer, however, as it looks like a more robust solution at this point.

403 Forbidden on local Google Cloud Storage call

I am using the default bucket name, but whenever I try to write a file, I get a 403 Forbidden. It tries to write to a bucket named: app_default_bucket.
This is the default bucket retrieved by file.DefaultBucketName(ctx).
Local file permissions also seem to be okay.
In production everything works as expected.
It's trying to write to your remote Google Cloud Storage account. This seems like a current bug. For now, you might have to create/reconfigure the default bucket on your account.
Using the client library with the dev server not working in Go

Google Cloud Storage return status 404 Not Found even though object exists?

I have a script that uploads a photo to Google Compute Engine to be processed, saves it in Google Cloud Storage, and then responds with the path, which I send to an App Engine app to be read. Testing with the gsutil cp command shows that the picture is saved correctly to GCS, as the cp command always finds it.
However, App Engine often has problems finding the photo when I send the path, returning:
NotFoundError: Expect status [200] from Google Storage. But got status 404
Any thoughts?
Thanks to Voscausa: you can solve the issue by passing appropriate retry parameters (RetryParams):
https://developers.google.com/appengine/docs/python/googlecloudstorageclient/retryparams_class
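A hedged Python sketch of that fix (bucket/object names and the numbers are illustrative, not recommendations): loosen the GCS client library's retry parameters so a freshly written object that is briefly not readable is retried instead of immediately raising NotFoundError.

import cloudstorage as gcs

retry_params = gcs.RetryParams(initial_delay=0.2,
                               backoff_factor=2,
                               max_delay=5.0,
                               max_retry_period=15)
gcs.set_default_retry_params(retry_params)  # applies to subsequent calls

with gcs.open('/my-bucket/photo.jpg', retry_params=retry_params) as gcs_file:
    data = gcs_file.read()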

Location of GS File in Local/Dev AppEngine

I'm trying to troubleshoot some issues I'm having with an export task I have created. I'm attempting to export CSV data using Google Cloud Storage, and I seem to be unable to export all my data. I'm assuming it has something to do with the (FAR TOO LOW) 30-second file limit when I attempt to restart the task.
I need to troubleshoot, but I can't seem to find where my local/development server is writing the files out. I see numerous entries in the GsFileInfo table, so I assume something is going on, but I can't seem to find the actual output file.
Can someone point me to the location of the Google Cloud Storage files in the local AppEngine development environment?
Thanks!
Looking at the dev_appserver code, it looks like you can specify a path, or it will calculate a default based on the OS you are using.
blobstore_path = options.blobstore_path or os.path.join(storage_path,
                                                         'blobs')
It then passes this path to blobstore_stub (GCS storage is backed by the blobstore stub), which seems to shard files by their blobstore key.
def _FileForBlob(self, blob_key):
    """Calculate full filename to store blob contents in.

    This method does not check to see if the file actually exists.

    Args:
      blob_key: Blob key of blob to calculate file for.

    Returns:
      Complete path for file used for storing blob.
    """
    blob_key = self._BlobKey(blob_key)
    return os.path.join(self._DirectoryForBlob(blob_key), str(blob_key)[1:])
For example, I'm using Ubuntu and started with dev_appserver.py --storage_path=~/tmp; I was then able to find files under ~/tmp/blobs and the datastore under ~/tmp/datastore.db. Alternatively, you can go to the local admin console; the Blobstore Viewer link will also display GCS files.
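A small sketch for finding those files on disk (the storage path here is an assumption; substitute whatever you passed to --storage_path or the OS-specific default):

import os

storage_path = os.path.expanduser('~/tmp')       # e.g. dev_appserver.py --storage_path=~/tmp
blobs_dir = os.path.join(storage_path, 'blobs')  # fake-GCS/blobstore files live here

for dirpath, _, filenames in os.walk(blobs_dir):
    for name in filenames:
        print(os.path.join(dirpath, name))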
As tkaitchuck mentions above, you can use the included LocalRawGcsService to pull the data out of local.db. This is the only way to get the files, as they are stored in the local DB using the blobstore. Here's the original answer:
which are the files uri on GAE java emulating cloud storage with GCS client library?
