Google Cloud App Engine PHP 7.2 implement Storage - google-app-engine

I am starting to use GCloud and have a simple question about working with App Engine and Cloud Storage.
My PHP application runs on a framework that needs a tmp directory. I tried deploying with a tmp folder and, in composer.json -> scripts, tried to create one and set its permissions, but I get read-only file system errors. I also tried creating a symbolic link to /tmp; nothing works.
I understand that I'm missing some concept. Maybe there is a way to map a file system folder to a storage instance (bucket)?
Can anyone explain this?

You are telling us that you tried to implement something, but you are not showing us what you actually tried.
Did you try to write files into the /tmp directory (there is no need to deploy a /tmp folder with your code; App Engine provides one with a read/write policy by default), as stated in the public documentation?
Also, Google Cloud Storage is not based on an instance (VM) architecture, so asking about a "storage instance" is inaccurate.
If writing to the /tmp directory doesn't work for you, sharing the code that you tried would be useful.
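For reference, here is a minimal sketch of writing a scratch file to /tmp from an App Engine PHP app (the file name is only an example, not something from your code):

// /tmp is the only writable path on the App Engine standard runtime; it is an
// in-memory filesystem, so anything written here is temporary and uses instance RAM.
$tmpFile = '/tmp/scratch-example.txt';
file_put_contents($tmpFile, 'temporary data');
echo file_get_contents($tmpFile);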
You can always write your files to a Cloud Storage Bucket by using the sample code provided in the public documentation:
use Google\Cloud\Storage\StorageClient;

function upload_object($bucketName, $objectName, $source)
{
    // Create the Cloud Storage client and open the local source file for reading.
    $storage = new StorageClient();
    $file = fopen($source, 'r');
    $bucket = $storage->bucket($bucketName);
    // Upload the stream to the bucket under the given object name.
    $object = $bucket->upload($file, [
        'name' => $objectName
    ]);
    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source), $bucketName, $objectName);
}
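A hypothetical call to the function above would then look like this (bucket and object names are placeholders; the library is installed with composer require google/cloud-storage):

// Upload a file that was first staged in the writable /tmp directory.
upload_object('my-example-bucket', 'uploads/report.csv', '/tmp/report.csv');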

Related

Google App boto file stored in inappropriate directory

I installed the Google Cloud SDK and it dumped a .boto file into the My Documents folder (e.g. C:\Users\John), which is a wildly inappropriate location. I do see many references to the boto file in the Python files, a couple of dozen instances/examples:
return os.path.join(self.LegacyCredentialsDir(account), '.boto')
os.path.expanduser(os.path.join('~', '.boto')),
Where do I go to change the path to something appropriate? An appropriate path would be something such as C:\Users\John\AppData\Roaming\gcloud\.boto, for example.
At the top of the file:
This file contains credentials and other configuration information needed
by the boto library, used by gsutil. You can edit this file (e.g., to add
credentials) but be careful not to mis-edit any of the variable names (like
"gs_access_key_id") or remove important markers (like the "[Credentials]" and
"[Boto]" section delimiters).
[Credentials]
Google OAuth2 credentials are managed by the Cloud SDK and
do not need to be present in this file.
To add HMAC google credentials for "gs://" URIs, edit and uncomment the
following two lines:
The latest versions of Boto don't seem to be a great fit for App Engine. I ran into this issue about a year ago, and I don't remember all of the details, but I avoided Boto3 and stuck with Boto 2.47 and that worked well for me.
For my use case, I only needed help with SES. If you need many other AWS services then YMMV.

App Engine: Copy live Datastore to local dev Datastore (that still works)

This used to be possible by downloading with the bulkloader and uploading to the local dev server. However, the bulkloader download has been non-functional for several months now, due to not supporting oauth2.
A few places recommend downloading from a cloud storage backup, and uploading to the local datastore through either bulkloader or by directly parsing the backup. However, neither of these appear functional anymore. The bulkloader method throws:
OperationalError: unable to open database file
And the RecordsReader class, which is used to read the backup files, reaches end of file when trying to read the first record, resulting in no records being read.
Does there exist a current, functional, method for copying the live datastore to the local dev datastore?
RecordsReader works perfectly on Unix. I tried this https://gist.github.com/jehna/3b258f5287fcc181aacf a day ago and it worked great.
You should add your Kinds implementation to the imports and run the script in the datastore interactive shell.
For example:
from myproject.kinds_implementations import MyKind
I removed the following block:
for pp in dir(a):
    try:
        ppp = getattr(a, "_" + pp)
        if isinstance(ppp, db.Key):
            ppp._Key__reference.set_app(appname)
            ppp
    except AttributeError:
        """ It's okay """
And it worked well. In my case the backup was downloaded into multiple directories, so I modified the directory traversal to something like this:
# mypath is the backup root; walk its subdirectories and collect the backup files.
for directory in listdir(mypath):
    full_directory_path = join(mypath, directory)
    for sub_dir in listdir(full_directory_path):
        full_sub_dir_path = join(full_directory_path, sub_dir)
        onlyfiles = [f for f in listdir(full_sub_dir_path) if isfile(join(full_sub_dir_path, f))]
        for file in onlyfiles:
            pass  # read each backup file here (body omitted in the original answer)
If you're working on Windows, you're welcome to follow my question about RecordsReader on Windows; hopefully someone will answer there: Google datastore backup to local dev_appserver
Edit:
Works great on Windows if you change the file open mode from 'r' to 'rb'.
The bulkloader is still functional on Python with OAuth2, albeit with some caveats. In downloading from the live app, there is an issue with refreshing of the OAuth2 token so the total download time is limited to 3600 seconds, or 3600+3600 if you manually use a refresh token with --oauth2_refresh_token.
When uploading to the development server app, OAuth2 will fail with a 401, so it's necessary to edit google.appengine.ext.remote_api.handler and stub out 'CheckIsAdmin' to always return True as a workaround:
def CheckIsAdmin(self):
    return True
    user_is_authorized = False
    ...
However, I upvoted the above answer, as it looks like a more robust solution at this point.

File is not creating on heroku using cakephp

I tried to create a file on Heroku using PHP code:
$fh = fopen("../DownloadFiles/".$filename,'a');
fwrite($fh,$s);
but the file is not created and no error is shown. Please help.
This should work just fine, but are you aware that if you're running multiple dynos, that file will exist only on the dyno that served that one request, and not on all the others?
Also, Dynos restart every 24 hours, and their state is reset every time you push a change to Heroku, so you cannot store persistent information on them; that's called an ephemeral filesystem.
You could, for instance, store uploaded files on Amazon S3, like described in the docs: https://devcenter.heroku.com/articles/s3-upload-php
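As a rough sketch (not taken from that article; the bucket name, region, and data variables are placeholders/assumptions), an upload with the AWS SDK for PHP v3 could look like this:

require 'vendor/autoload.php'; // composer require aws/aws-sdk-php

use Aws\S3\S3Client;

// Credentials are normally picked up from the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY
// config vars set on the Heroku app.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // placeholder region
]);

$s3->putObject([
    'Bucket' => 'my-example-bucket',          // placeholder bucket
    'Key'    => 'DownloadFiles/' . $filename, // same name the question uses
    'Body'   => $s,                           // the data the question writes with fwrite()
]);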
Two remarks about your original issue:
you're probably running an old version of CakePHP that mangles all internal PHP error handling and writes it out to a log file (if you're lucky), so you can't see anything in heroku logs, and it can't be configured otherwise; upgrade to a more recent version that lets you log to streams, and then use php://stderr as the destination
in general, if you want to write to a file in PHP, you can just do file_put_contents($filename, $contents)...
Does the DownloadFiles folder exist in the deployment? Node's fs gives an error if the directory is not found. You can add a snippet to check whether the directory exists and, if not, create it, using fs.exists and fs.mkdir.
For more info http://nodejs.org/api/fs.html
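Since the question is about PHP rather than Node, a rough PHP equivalent of that check (using the path from the question) would be:

$dir = '../DownloadFiles';
// Create the directory (recursively) before opening a file inside it.
if (!is_dir($dir)) {
    mkdir($dir, 0777, true);
}
$fh = fopen($dir . '/' . $filename, 'a');
if ($fh === false) {
    error_log('Could not open ' . $dir . '/' . $filename);
} else {
    fwrite($fh, $s);
    fclose($fh);
}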

Laravel: Making storage work on App Engine

I followed Getting started with Laravel on PHP for App Engine, and I'm getting an error when I change the storage path to Google Cloud Storage in local development.
ex.
const BUCKET_NAME = "bucket-name";
$storage_path = "gs://" . BUCKET_NAME . "/storage";
Here is the ErrorException:
file_put_contents(/meta/services.json): failed to open stream: No such file or directory
App Engine doesn't allow you to write to the local filesystem for security and scalability reasons. Fortunately though you can read and write to Google Cloud Storage easily using commands like file_put_contents(). This facility is also emulated in the local dev_appserver.
Take a look at https://github.com/ajessup/laravel for a version of Laravel that's been tweaked to run well on Google App Engine, including writing /meta/services.json to GCS.
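For illustration, a minimal sketch of such a write (the bucket name is a placeholder; on the legacy PHP runtime the gs:// wrapper is built in, while with the google/cloud-storage library it has to be registered first, which is assumed here):

use Google\Cloud\Storage\StorageClient;

// Register the gs:// stream wrapper so file_put_contents() can target a bucket.
$storage = new StorageClient();
$storage->registerStreamWrapper();

file_put_contents('gs://your-bucket-name/meta/services.json', $contents);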
This might help.
http://forumsarchive.laravel.io/viewtopic.php?id=9341
"By defining the manifest path in app/config/app.php to point to a Cloud Storage path instead, like this: 'manifest' => 'gs://yourbucketname' .'/meta',"
It looks like a storage_path() issue when the config is 'manifest' => storage_path().'/meta',
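In other words, the change in app/config/app.php is roughly this (the bucket name is a placeholder):

// Default: fails on App Engine because storage_path() points at the read-only local filesystem.
// 'manifest' => storage_path().'/meta',

// Point the manifest at a Cloud Storage bucket instead:
'manifest' => 'gs://your-bucket-name/meta',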

Location of GS File in Local/Dev AppEngine

I'm trying to troubleshoot some issues I'm having with an export task I created. I'm attempting to export CSV data using Google Cloud Storage, and I seem to be unable to export all my data. I'm assuming it has something to do with the (FAR TOO LOW) 30-second file limit when I attempt to restart the task.
I need to troubleshoot, but I can't seem to find where my local/development server is writing the files. I see numerous entries in the GsFileInfo table, so I assume something is going on, but I can't find the actual output file.
Can someone point me to the location of the Google Cloud Storage files in the local AppEngine development environment?
Thanks!
Looking at the dev_appserver code, it looks like you can specify a path, or it will calculate a default based on the OS you are using.
blobstore_path = options.blobstore_path or os.path.join(storage_path, 'blobs')
It then passes this path to blobstore_stub (GCS storage is backed by the blobstore stub), which shards files by their blobstore key.
def _FileForBlob(self, blob_key):
    """Calculate full filename to store blob contents in.

    This method does not check to see if the file actually exists.

    Args:
      blob_key: Blob key of blob to calculate file for.

    Returns:
      Complete path for file used for storing blob.
    """
    blob_key = self._BlobKey(blob_key)
    return os.path.join(self._DirectoryForBlob(blob_key), str(blob_key)[1:])
For example, I'm using Ubuntu and started with dev_appserver.py --storage_path=~/tmp; I was then able to find files under ~/tmp/blobs and the datastore under ~/tmp/datastore.db. Alternatively, you can go to the local admin console; the blobstore viewer link will also display GCS files.
As tkaitchuck mentions above, you can use the included LocalRawGcsService to pull the data out of the local.db. This is the only way to get the file, as they are stored in the Local DB using the blobstore. Here's the original answer:
which are the files uri on GAE java emulating cloud storage with GCS client library?
