Reading wordnet folder in Google Cloud Application - google-app-engine

I have a Scala web application running in GAE. I need to use a Java library, JWI, which requires me to pass the root folder of a WordNet installation to the edu.mit.jwi.Dictionary constructor.
I thought about putting all the WordNet files into Google Cloud Storage, but it has no concept of a folder at all. So, my question: is there any way to do what I want with Google Cloud Storage, or should I use something else?

You were right when you stated "there is no API in the Google Cloud Java library for folder manipulation": as of today, the Java client library offers no folder manipulation. You can check the library here.

You can still use Google Cloud Storage (GCS): even though gsutil handles subdirectories in a different way (the GCS namespace is flat and "/" is simply part of the object name), they behave like regular folders and use the same path notation.
I am not sure exactly how your application works, but if I am guessing correctly:
1. Load the JWI library into your Cloud Shell.
2. Import the library in your Scala application on App Engine flexible. Find an example here of how to call a Java class from Scala.
3. Deploy the application. Following the previous steps, the deployed image will contain the JWI library you need.
4. Load the WordNet semantic dictionary into a bucket and pass the root folder of WordNet, in this case a GCS "folder", using the Java client library for the Google Cloud Storage API. The dictionary files must be downloaded (using a get call) and stored locally while you are using them; see the sketch after this list.
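As a minimal sketch of step 4, assuming the WordNet dict files sit under a hypothetical wordnet/dict/ prefix in a hypothetical bucket my-wordnet-bucket, you could download them to a temporary directory and hand that directory to JWI:
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import edu.mit.jwi.Dictionary;
import edu.mit.jwi.IDictionary;
import java.nio.file.Files;
import java.nio.file.Path;

public class WordnetFromGcs {
    public static void main(String... args) throws Exception {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        // Hypothetical bucket and prefix holding the WordNet "dict" folder
        String bucketName = "my-wordnet-bucket";
        String prefix = "wordnet/dict/";
        // Download every object under the prefix into a local temp directory,
        // recreating the layout JWI expects
        Path localDict = Files.createTempDirectory("wordnet-dict");
        for (Blob blob : storage.list(bucketName, Storage.BlobListOption.prefix(prefix)).iterateAll()) {
            if (blob.getName().equals(prefix)) {
                continue; // skip the "folder" placeholder object, if any
            }
            Path target = localDict.resolve(blob.getName().substring(prefix.length()));
            Files.createDirectories(target.getParent());
            blob.downloadTo(target);
        }
        // Pass the local copy to JWI as the WordNet root folder
        IDictionary dict = new Dictionary(localDict.toFile());
        dict.open();
        System.out.println("WordNet dictionary opened from " + localDict);
        dict.close();
    }
}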
Find the Java client library documentation for Cloud Storage here. You might need more functions than the ones I have written below for you, which create a bucket, upload a file and fetch it back.
package com.example.storage;

// Imports the Google Cloud client library
import com.google.cloud.storage.Acl;
import com.google.cloud.storage.Acl.Role;
import com.google.cloud.storage.Acl.User;
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Bucket;
import com.google.cloud.storage.BucketInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;

public class QuickstartSample {
    public static void main(String... args) throws Exception {
        // Instantiates a client
        Storage storage = StorageOptions.getDefaultInstance().getService();

        // The name for the new bucket, taken from the command line
        String bucketName = args[0]; // "my-new-bucket";

        // Creates the new bucket
        Bucket bucket = storage.create(BucketInfo.of(bucketName));
        System.out.printf("Bucket %s created.%n", bucket.getName());

        // [START uploadFile]
        // Object name
        String fileName = "filename.ext";

        // Create the object inside the bucket
        BlobInfo blobInfo =
            storage.create(
                BlobInfo
                    .newBuilder(bucketName, fileName)
                    // Modify access list to allow all users with the link to read the file
                    .setAcl(new ArrayList<>(Arrays.asList(Acl.of(User.ofAllUsers(), Role.READER))))
                    .build()
                // other options required
            );
        // Print the public download link
        System.out.println(blobInfo.getMediaLink());
        // [END uploadFile]

        // Fetch the object back from the bucket and download it locally
        String blobName = "filename.ext";
        BlobId blobId = BlobId.of(bucketName, blobName);
        Blob blob = storage.get(blobId);
        blob.downloadTo(Paths.get(blobName));
    }
}
Finally, here is how to compile the code and run it:
mvn clean package -DskipTests
mvn exec:java -Dexec.mainClass=com.example.storage.QuickstartSample -Dexec.args="bucketName"

Related

React i18next loading translation files both from frontend (e.g. localhost:3000) and backend (e.g. localhost:5000)

I am making a React app which fetches data from a backend and displays it in the browser. Changing the language for static names (like those in the header and footer) works if I put a locales folder inside the public folder and create the JSON files for every required language code.
Now, I want to load translation files from the backend as well, because the data fetched from the backend is always random, and the backend will send the corresponding translation files.
I am quite confused about how to achieve that.
I have been through lots of Stack Overflow solutions, which suggest using a custom backend plugin, but I am confused about how to create such a plugin.
Here is the relevant part of my i18next.js configuration:
import i18n from 'i18next';
import { initReactI18next } from 'react-i18next';
import Backend from 'i18next-http-backend';
i18n
  .use(initReactI18next)
  .use(Backend)
  .init({
    ns: ['common', 'translation'],
    defaultNS: 'common',
    // still loads the translation files if I do not define the line below
    // (because it is in the public folder of the React app)
    backend: { loadPath: "/locales/{{lng}}/{{ns}}.json" }
  });
With the above code, I am only able to load the translation files that are in the frontend.
If I change the "backend" line to:
backend: { loadPath: "http://localhost:5000/locales/{{lng}}/{{ns}}.json" }
it will load the translation files from the backend server at localhost:5000, but the translation files located in the public folder are no longer loaded.
Can anyone help with an example of how to configure both paths so that the translation files from both the frontend and the backend work?
When you have both resources defined in your local project and translations served from a backend, you can use the i18next-chained-backend library to support both. Refer to the short example here.
Reference: https://www.i18next.com/how-to/add-or-load-translations

I want to serve PDF files stored in Google Cloud Storage

I need to serve PDF files stored in Google Cloud Storage.
I tried:
from google.appengine.api import blobstore
from google.appengine.api import images
bkey = blobstore.create_gs_key('/gs' + filename)
url = images.get_serving_url(bkey)
Error:
get_serving_url_hook\n raise _ToImagesError(e, readable_blob_key)\n', 'TransformationError\n']
You are treating the PDF file as if it were an image. You cannot use the Images API with a PDF file. There are several ways of storing and serving static files, which you can find at this link [1].
[1] https://cloud.google.com/appengine/docs/standard/python3/serving-static-files

Google App Engine does not recognize @Named parameter

I am using the Google Plugin for Eclipse, and I am writing an App Engine app as a Dynamic Web Module in Eclipse WTP.
I have defined the following Java class to serve as a Cloud Endpoint API:
package mypackage;

import static mypackage.OfyService.ofy;

import java.util.List;
import java.util.logging.Logger;

import mypackage.models.ProbeEntry;
import mypackage.models.ProbeSet;

import com.google.api.server.spi.config.Api;
import com.google.api.server.spi.config.ApiMethod;
import com.google.api.server.spi.config.ApiNamespace;
import com.google.api.server.spi.config.Named;
import com.googlecode.objectify.ObjectifyService;

@Api(name = "analysisEndpoint",
    version = "v1",
    namespace = @ApiNamespace(
        ownerDomain = "myorg",
        ownerName = "myorg",
        packagePath = "analysis")
)
public class AnalysisEndpoint {
    private static final Logger logger = Logger.getLogger(AnalysisEndpoint.class.getName());

    @ApiMethod(name = "getMyProbeEntries", httpMethod = ApiMethod.HttpMethod.GET)
    public ProbeSet getMyProbeEntries(@Named("amount") int amount) {
        ObjectifyService.begin();
        List<ProbeEntry> probeList = ofy().load().type(ProbeEntry.class).limit(amount).list();
        return new ProbeSet(probeList);
    }
}
I attempt to deploy to the Google App Engine by right-clicking the project -> Google App Engine WTP -> Deploy Project to Remote Server. I see in my console that the project is compiling and uploading, but eventually errors out with:
99% Endpoints configuration not updated. The app returned an error when the Google Cloud Endpoints server attempted to communicate with it.
The error log on the app engine shows the following:
18:31:58.119
javax.servlet.ServletContext log: unavailable
com.google.api.server.spi.config.validation.MissingParameterNameException: analysisEndpoint.myorg.analysis.AnalysisEndpoint.getMyProbeEntries parameter (type int): Missing parameter name. Parameter type (int) is not an entity type and thus should be annotated with @Named.
at
com.google.api.server.spi.config.validation.ApiConfigValidator.validateApiParameter(ApiConfigValidator.java:214)
...
As can be seen in the code, I do have @Named("amount") before the offending parameter. What is going wrong here? Side note: If I simply remove the amount parameter, the project deploys to App Engine without a problem.
Any help would be greatly appreciated.

Google Cloud Storage on Appengine Dev Server

There's a similar question that was recently responded to on Stackoverflow here: Google Cloud Storage Client not working on dev appserver
The solution was to either upgrade the SDK to 1.8.8 or use the previous revision of the GCS client library, which didn't have the bug.
I'm currently using 1.8.8 and have tried downloading multiple revisions, and /_ah/gcs doesn't load for me. After using up a significant number of my backend instances trying to understand how GCS and App Engine work together, it'd be great if I could just test it on my local server instead!
When I visit localhost:port/_ah/gcs I get a 404 not found error.
Just a heads up, to install the library all I did was drag and drop the code into my app folder. I'm wondering if maybe I skipped a setup step? I wasn't able to find the answer in the documentation!
thanks!!
Note
To clarify, this is my first week using GCS, so it is my first time trying to use the dev server with it.
I was able to find the Google Cloud Storage files I wrote to a bucket locally at:
localhost:port/_ah/gcs/bucket_name/file_suffix
Where port is by default 8080, and the file was written to: /bucket_name/file_suffix
For those trying to understand the full process of setting up a simple python GAE app and testing local writes to google cloud storage:
1. Follow the Google App Engine "quickstart":
https://cloud.google.com/appengine/docs/standard/python/quickstart
2. Run a local dev server with:
dev_appserver.py app.yaml
3. If using python, follow "App Engine and Google Cloud Storage Sample":
https://cloud.google.com/appengine/docs/standard/python/googlecloudstorageclient/app-engine-cloud-storage-sample
If you run into "ImportError: No module named cloudstorage", you need to create a file named appengine_config.py:
touch appengine_config.py
and add to it:
from google.appengine.ext import vendor
vendor.add('lib')
GAE runs this script automatically when starting your local dev server with dev_appserver.py app.yaml; it is needed for GAE to find the cloudstorage library in your lib/ folder.
4. "Writing a file to cloud storage" from the same tutorial:
def create_file(self, filename):
    """Create a file."""
    self.response.write('Creating file {}\n'.format(filename))

    # The retry_params specified in the open call will override the default
    # retry params for this particular file handle.
    write_retry_params = cloudstorage.RetryParams(backoff_factor=1.1)
    with cloudstorage.open(
        filename, 'w', content_type='text/plain', options={
            'x-goog-meta-foo': 'foo', 'x-goog-meta-bar': 'bar'},
        retry_params=write_retry_params) as cloudstorage_file:
        cloudstorage_file.write('abcde\n')
        cloudstorage_file.write('f'*1024*4 + '\n')
    self.tmp_filenames_to_clean_up.append(filename)
Where filename is /bucket_name/file_suffix
5. After calling create_file via a route in your WSGI app, your file will be available at:
localhost:port/_ah/gcs/bucket_name/file_suffix
Where port is by default 8080, and the file was written to: /bucket_name/file_suffix
Postscript
Unfortunately, I did not find either 3) or 4) in their docs, so I hope this helps someone get set up more easily in the future.
To access GCS objects on dev_appserver, you must specify the bucket and object name, i.e. /_ah/gcs/[bucket]/[object].
The storage simulator for the local server works in later versions of the SDK. For Java, you can follow the dedicated tutorial "App Engine and Google Cloud Storage Sample".
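For illustration, here is a minimal Java sketch of the same create_file write, using the appengine-gcs-client library from that tutorial and the same bucket_name/file_suffix naming as above (the wrapper class name is just for the example):
import com.google.appengine.tools.cloudstorage.GcsFileOptions;
import com.google.appengine.tools.cloudstorage.GcsFilename;
import com.google.appengine.tools.cloudstorage.GcsOutputChannel;
import com.google.appengine.tools.cloudstorage.GcsService;
import com.google.appengine.tools.cloudstorage.GcsServiceFactory;
import com.google.appengine.tools.cloudstorage.RetryParams;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class CreateFileSample {
    public static void createFile() throws IOException {
        GcsService gcsService =
            GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());

        // Same bucket/object naming as in the Python sample above
        GcsFilename filename = new GcsFilename("bucket_name", "file_suffix");

        // Write the object; on dev_appserver it will then be visible at
        // localhost:port/_ah/gcs/bucket_name/file_suffix
        GcsOutputChannel outputChannel =
            gcsService.createOrReplace(filename, GcsFileOptions.getDefaultInstance());
        outputChannel.write(ByteBuffer.wrap("abcde\n".getBytes(StandardCharsets.UTF_8)));
        outputChannel.close();
    }
}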

Access Denied exception when using google-api-java-client

I am trying to use google-api-java-client with OAuth 2.0 to create a simple third-party app that accesses an OAuth 2.0-based web service.
The program breaks when I try to initialize:
private static final HttpTransport HTTP_TRANSPORT = new ApacheHttpTransport();
They are imported as:
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.apache.ApacheHttpTransport;
It is a simple Web Application Project using Google App Engine plugin inside Eclipse.
Caused by: java.security.AccessControlException: access denied (java.net.NetPermission getProxySelector)
at java.security.AccessControlContext.checkPermission(AccessControlContext.java:376)
at java.security.AccessController.checkPermission(AccessController.java:549)
at java.lang.SecurityManager.checkPermission(SecurityManager.java:532)
at com.google.appengine.tools.development.DevAppServerFactory$CustomSecurityManager.checkPermission(DevAppServerFactory.java:383)
at java.net.ProxySelector.getDefault(ProxySelector.java:73)
at com.google.api.client.http.apache.ApacheHttpTransport.newDefaultHttpClient(ApacheHttpTransport.java:157)
at com.google.api.client.http.apache.ApacheHttpTransport.<init>(ApacheHttpTransport.java:100)
at com.mytest.demo.TestApiDemoServlet.<clinit>(TestApiDemoServlet.java:17)
I am using App Engine 1.8.4 and google-api-java-client 1.16.0-rc
Any help would be greatly appreciated!
You can't use ApacheHttpTransport in the GAE environment, as it is restricted; this is why it is failing. You need to use UrlFetchTransport instead and it should work. To use it, get the corresponding jars from the link below and drop them into your libs or jar folder:
https://code.google.com/p/google-http-java-client/wiki/Setup#google-http-client-appengine
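As a minimal sketch of the swap, assuming the google-http-client-appengine jar from the link above is on the classpath:
import com.google.api.client.extensions.appengine.http.UrlFetchTransport;
import com.google.api.client.http.HttpTransport;
import javax.servlet.http.HttpServlet;

public class TestApiDemoServlet extends HttpServlet {
    // Use the App Engine URL Fetch service instead of Apache HttpClient,
    // which the GAE sandbox does not allow
    private static final HttpTransport HTTP_TRANSPORT = UrlFetchTransport.getDefaultInstance();
}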
