Unable to get Google BigQuery and Google App Engine to work - google-app-engine

Found the answer to my question; for those having the same problem, see the accepted answer below.
I am testing out Google App Engine with BigQuery.
I am able to run BigQuery fine in Eclipse when I run it as an app; however, when I run it as an HttpServlet I keep getting the following error:
java.lang.NoClassDefFoundError: com/google/api/client/json/JsonFactory
Below is the exact code I am using.
package com.hw3.test;
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.BigqueryScopes;
import com.google.api.services.bigquery.model.GetQueryResultsResponse;
import com.google.api.services.bigquery.model.QueryRequest;
import com.google.api.services.bigquery.model.QueryResponse;
import com.google.api.services.bigquery.model.TableCell;
import com.google.api.services.bigquery.model.TableRow;
import java.io.IOException;
import javax.servlet.http.*;
import java.util.List;
import java.util.Scanner;
@SuppressWarnings("serial")
public class HelloWord3Servlet extends HttpServlet {

    public void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        Bigquery bigquery = createAuthorizedClient(); // If I comment this out I get the text below; otherwise I get the error from the title.
        resp.setContentType("text/plain");
        resp.getWriter().println("\nQuery Results:\n------------\n");
    }

    private static List<TableRow> executeQuery(String querySql, Bigquery bigquery, String projectId)
            throws IOException {
        // Execute the query
        QueryResponse query = bigquery.jobs().query(projectId, new QueryRequest().setQuery(querySql)).execute();
        // Fetch the results of the query job
        GetQueryResultsResponse queryResult = bigquery.jobs()
                .getQueryResults(query.getJobReference().getProjectId(), query.getJobReference().getJobId()).execute();
        return queryResult.getRows();
    }

    public static Bigquery createAuthorizedClient() throws IOException {
        // Create the credential
        HttpTransport transport = new NetHttpTransport();
        JsonFactory jsonFactory = new JacksonFactory();
        GoogleCredential credential = GoogleCredential.getApplicationDefault(transport, jsonFactory);
        // Depending on the environment that provides the default credentials
        // (e.g. Compute Engine, App Engine), the credentials may require us to
        // specify the scopes we need explicitly.
        // Check for this case, and inject the BigQuery scope if required.
        if (credential.createScopedRequired()) {
            credential = credential.createScoped(BigqueryScopes.all());
        }
        return new Bigquery.Builder(transport, jsonFactory, credential).setApplicationName("Bigquery Samples").build();
    }

    public static void main(String[] args) throws IOException {
        Scanner sc;
        if (args.length == 0) {
            // Prompt the user to enter the ID of the project to run the queries under
            System.out.print("Enter the project ID: ");
            sc = new Scanner(System.in);
        } else {
            sc = new Scanner(args[0]);
        }
        String projectId = sc.nextLine();
        // Create a new Bigquery client authorized via Application Default Credentials.
        Bigquery bigquery = createAuthorizedClient();
        List<TableRow> rows = executeQuery(
                "SELECT TOP(corpus, 10) as title, COUNT(*) as unique_words FROM [publicdata:samples.shakespeare]",
                bigquery, projectId);
        printResults(rows);
    }

    private static void printResults(List<TableRow> rows) {
        System.out.print("\nQuery Results:\n------------\n");
        for (TableRow row : rows) {
            for (TableCell field : row.getF()) {
                System.out.printf("%-50s", field.getV());
            }
            System.out.println();
        }
    }
}
I got this code directly from the Google website, although I modified it slightly so that I could test out App Engine. However, it will not work when using App Engine.
Any help is greatly appreciated!

It sounds like dependencies aren't configured correctly when you are running as an HttpServlet. How do you tell your app which dependencies to use? What version are you trying to load? Is that version available in Google App Engine?
Note that the specific version of the Jackson libraries you require changes depending on what environment you are running in. See https://developers.google.com/api-client-library/java/google-http-java-client/setup for a list of the dependencies you need in various environments.

ANSWER: When working with HTTP servlets, I needed to have the JARs inside the WEB-INF/lib directory; otherwise I could just keep them on the Java build path (Libraries). So, in Eclipse, right-click on lib, then Add Google APIs, and select BigQuery.
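If you manage dependencies with Maven rather than through the Eclipse build path, the same fix falls out automatically: with war packaging, Maven copies every compile-scope dependency into WEB-INF/lib at build time. A minimal sketch of the relevant pom.xml fragment; the version is a placeholder you should replace with the revision matching your SDK:
<dependencies>
  <!-- BigQuery client; transitively pulls in google-api-client and the Jackson-based JSON factory -->
  <dependency>
    <groupId>com.google.apis</groupId>
    <artifactId>google-api-services-bigquery</artifactId>
    <version>YOUR-REVISION-HERE</version>
  </dependency>
</dependencies>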

Related

PubSub Emulator - (support for protocol buffer publish/receive messages)

I am developing a solution that uses a common protocol buffer library to send and receive messages, serializing directly to a ByteString and deserializing from a ByteString directly into the same protocol buffer class. So far my solution is not working; it only works when I use the real Pub/Sub service.
Based on the doc Testing apps locally with the emulator, and more specifically the section on known limitations:
Emulator doesn't provide Schema support for protocol buffers.
However, I am not using any schema definition on the Topic/Subscription, just a common protocol buffer library programmatically. I'm afraid there is a Pub/Sub emulator limitation, and that for this reason my solution doesn't work with the emulator.
Below is my test class; any clarification will be very welcome.
package com.example.pubsubgcpspringapplications;
import java.io.IOException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import com.alpian.common.pubsub.messages.OnfidoVerificationEvent;
import com.example.pubsubgcpspringapplications.config.PubSubTestConfig;
import com.example.pubsubgcpspringapplications.services.MessageRealGcpService;
import com.example.pubsubgcpspringapplications.util.DataGenerationUtils;
import com.google.api.core.ApiFuture;
import com.google.cloud.pubsub.v1.AckReplyConsumer;
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Publisher;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.protobuf.ByteString;
import com.google.protobuf.InvalidProtocolBufferException;
import com.google.protobuf.util.JsonFormat;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.PubsubMessage;
import lombok.SneakyThrows;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.ActiveProfiles;
//@ActiveProfiles("test")
public class EmulatorPubSubWithSpringTest {

    @BeforeAll
    static void startUpTests() throws IOException {
        PubSubTestConfig.setupPubSubEmulator();
    }

    @SneakyThrows
    @Test
    void successfulTest() throws InterruptedException {
        var status = DataGenerationUtils.STATUS_COMPLETE;
        var result = DataGenerationUtils.RESULT_CLEAR;
        var subResult = DataGenerationUtils.SUB_RESULT_CLEAR;
        var documentReport = DataGenerationUtils.generateOnfidoDocumentReport(status, result, subResult);
        var facialSimilarityReport = DataGenerationUtils
                .generateOnfidoFacialSimiliratyVideoReport(status, result, subResult);
        OnfidoVerificationEvent.Builder builder = OnfidoVerificationEvent.newBuilder();
        builder.setCheckId(DataGenerationUtils.FAKE_CHECK_ID);
        builder.setApplicantId(DataGenerationUtils.FAKE_APPLICANT_ID);
        builder.setDocument(documentReport);
        builder.setFacialSimilarityVideo(facialSimilarityReport);
        OnfidoVerificationEvent onfidoVerificationEvent = builder.build();
        publishProtoMessageTest(onfidoVerificationEvent);
        MessageReceiver receiver =
                (PubsubMessage message, AckReplyConsumer consumer) -> {
                    ByteString data = message.getData();
                    // Get the schema encoding type.
                    String encoding = message.getAttributesMap().get("googclient_schemaencoding");
                    try {
                        switch (encoding) {
                            case "BINARY":
                                // Obtain an object of the generated proto class.
                                OnfidoVerificationEvent state = OnfidoVerificationEvent.parseFrom(data);
                                System.out.println("Received a BINARY-formatted message: " + state);
                                break;
                            case "JSON":
                                OnfidoVerificationEvent.Builder stateBuilder = OnfidoVerificationEvent.newBuilder();
                                JsonFormat.parser().merge(data.toStringUtf8(), stateBuilder);
                                System.out.println("Received a JSON-formatted message:" + stateBuilder.build());
                                break;
                            default:
                                break;
                        }
                    } catch (InvalidProtocolBufferException e) {
                        e.printStackTrace();
                    }
                    consumer.ack();
                    System.out.println("Ack'ed the message");
                };
        ProjectSubscriptionName subscriptionName =
                ProjectSubscriptionName.of(PubSubTestConfig.PROJECT_ID, PubSubTestConfig.SUBSCRIPTION_NAME);
        // Create subscriber client.
        Subscriber subscriber = Subscriber.newBuilder(subscriptionName, receiver).build();
        try {
            subscriber.startAsync().awaitRunning();
            System.out.printf("Listening for messages on %s:\n", subscriptionName);
            subscriber.awaitTerminated(30, TimeUnit.SECONDS);
        } catch (TimeoutException timeoutException) {
            subscriber.stopAsync();
        }
        Thread.sleep(15000);
    }

    public static void publishProtoMessageTest(OnfidoVerificationEvent onfidoVerificationEvent)
            throws IOException, ExecutionException, InterruptedException {
        Publisher publisher = null;
        try {
            publisher = Publisher.newBuilder("projects/my-project-id/topics/topic-one").build();
            PubsubMessage.Builder message = PubsubMessage.newBuilder();
            // Prepare an appropriately formatted message based on topic encoding.
            message.setData(onfidoVerificationEvent.toByteString());
            System.out.println("Publishing a BINARY-formatted message:\n" + message);
            // Publish the message.
            ApiFuture<String> future = publisher.publish(message.build());
            //System.out.println("Published message ID: " + future.get());
        } finally {
            if (publisher != null) {
                publisher.shutdown();
                publisher.awaitTermination(1, TimeUnit.MINUTES);
            }
        }
    }
}
Note: I just copied some code snippets from a Google tutorial and modified them. I don't want to use JSON; I just want to publish and receive messages using proto files.
Many thanks in advance!
EDIT: Better clarification about the emulator in the comments and in another posted answer.
As you pointed out, the Pub/Sub emulator does not currently support protocol buffer schema messages, which is what you are using in your code (snippets from Publish / Receive messages of protobuf schema type). You can try using the Avro schema type, or open a feature request on the Google issue tracker for protocol buffer schema support in the Pub/Sub emulator.
The "resource not found" issue would not have anything to do with Pub/Sub the emulator not supporting Protocol Buffer schemas. If you tried to use Protocol Buffers in an unsupported way (which would be creating a Schema object that uses PROTCOL_BUFFER as its type), then you'd get back an error specifically about the lack of support for Protocol Buffer schemas in the emulator.
Your issue looks more like one of the following:
The name of the subscription does not match the name of the subscription you created.
You did not actually create the subscription in the emulator, but instead created it in the actual Pub/Sub service.
You did not point your subscriber to the emulator by setting the PUBSUB_EMULATOR_HOST environment variable (see the sketch at the end of this answer).
You should verify that the subscription exists in the emulator. You can do this by running the gcloud tool against it. Let's assume you started up your emulator with the following command:
gcloud beta emulators pubsub start --project=my-test-project
If this starts up your emulator on port 8085, you can check that your subscription exists by running:
> CLOUDSDK_API_ENDPOINT_OVERRIDES_PUBSUB=http://localhost:8085/ gcloud --project my-test-project pubsub subscriptions list
If your subscription does not exist when you run that command, then it means you likely didn't create the subscription in the emulator, but instead created it in the actual service. If you do see it, then it likely means your subscriber isn't sending requests to the emulator, but is actually sending requests to the Pub/Sub service itself.
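For completeness, here is a minimal sketch of pointing the Java subscriber explicitly at the emulator instead of relying on the PUBSUB_EMULATOR_HOST variable. The localhost:8085 endpoint and the my-test-project/my-subscription names are assumptions based on the commands above, and receiver is the MessageReceiver from the test class:
import com.google.api.gax.core.NoCredentialsProvider;
import com.google.api.gax.grpc.GrpcTransportChannel;
import com.google.api.gax.rpc.FixedTransportChannelProvider;
import com.google.api.gax.rpc.TransportChannelProvider;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

// Plaintext gRPC channel to the local emulator.
ManagedChannel channel = ManagedChannelBuilder.forTarget("localhost:8085").usePlaintext().build();
TransportChannelProvider channelProvider =
        FixedTransportChannelProvider.create(GrpcTransportChannel.create(channel));

// The emulator performs no authentication, so no credentials are supplied.
Subscriber subscriber = Subscriber.newBuilder(
                ProjectSubscriptionName.of("my-test-project", "my-subscription"), receiver)
        .setChannelProvider(channelProvider)
        .setCredentialsProvider(NoCredentialsProvider.create())
        .build();
subscriber.startAsync().awaitRunning();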

Unexpected character (m~) in $skiptoken

I am using the MS Graph API for Java. At the beginning of the skiptoken I received inside @odata.nextLink, there is an unexpected character sequence (m~) before the actual skiptoken string (as can be seen below). The skiptoken works fine after I get rid of m~.
But I am confused as to why this happened. Can other unexpected characters affect the skiptoken in the future, and what can I do to prevent that?
I am using msgraph Java SDK version 2.4.0.
https://graph.microsoft.com/v1.0/users?$select=givenName%2csurname%2cuserPrincipalName%2cbusinessPhones%2cassignedPlans&$count=true&$orderby=displayName&$filter=&$top=2&$skiptoken=m~X%270100B7013B3B33303030343530303330303033323030333030303435303033313030343130303330303034353030333230303331303033303030343530303334303033383030333030303435303033323030333130303330303033373030333030303332303033303030343530303431303033323030333030303435303033303030333230303330303034353030333730303330303033303030343530303330303034313030333030303435303033323030333130303B313B303B%27
I'm not clear on how you got the m~ in the skiptoken, but I can fetch pages of users successfully with the Microsoft Graph API SDK for Java using the code below:
package com.graph;

import java.util.List;
import com.azure.identity.ClientSecretCredential;
import com.azure.identity.ClientSecretCredentialBuilder;
import com.microsoft.graph.authentication.TokenCredentialAuthProvider;
import com.microsoft.graph.models.User;
import com.microsoft.graph.requests.GraphServiceClient;
import com.microsoft.graph.requests.UserCollectionPage;

public class Testgraph {
    public static void main(String[] args) {
        final ClientSecretCredential clientSecretCredential = new ClientSecretCredentialBuilder()
                .clientId("clientId")
                .clientSecret("clientSecret")
                .tenantId("tenantId")
                .build();
        final TokenCredentialAuthProvider tokenCredentialAuthProvider =
                new TokenCredentialAuthProvider(clientSecretCredential);
        final GraphServiceClient graphClient = GraphServiceClient
                .builder()
                .authenticationProvider(tokenCredentialAuthProvider)
                .buildClient();

        // You can use the code below to get the current page of users
        UserCollectionPage users = graphClient.users()
                .buildRequest()
                .get();
        List<User> userList = users.getCurrentPage();
        for (User user : userList) {
            System.out.println(user.displayName);
        }

        // If you want to get the next page, you can use the code below
        // (getNextPage() returns null when there are no more pages)
        if (users.getNextPage() != null) {
            UserCollectionPage users1 = users.getNextPage().buildRequest().get();
            List<User> userList1 = users1.getCurrentPage();
            for (User user : userList1) {
                System.out.println(user.displayName);
            }
        }
    }
}
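The SDK handles the skiptoken for you by following @odata.nextLink, so you normally never need to parse it yourself. A minimal sketch of walking all pages with the same graphClient as in the answer above:
// Iterate every page; the SDK follows @odata.nextLink (including its skiptoken) internally.
UserCollectionPage page = graphClient.users().buildRequest().get();
while (page != null) {
    for (User user : page.getCurrentPage()) {
        System.out.println(user.displayName);
    }
    // getNextPage() returns null once the last page has been reached.
    page = (page.getNextPage() == null) ? null : page.getNextPage().buildRequest().get();
}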

Deleting a ServingUrl created from a GcsFilename instead of a BlobKey doesn't seem to be supported?

I am porting my Google App Engine app from the Blobstore to Google Cloud Storage.
I found that in GAE SDK 1.9.7 they deprecated all the .getServingUrl() methods that took a BlobKey and replaced them with one that takes a ServingUrlOptions object as configuration.
This makes sense and seems to work, but there doesn't seem to be any matching .deleteServingUrl() that takes a GcsFilename?
I found the following in the SDK release notes, but it doesn't clarify how you actually do this:
Version 1.7.0 - June 26, 2012
You can now use get_serving_url() and delete_serving_url() for Google Cloud Storage buckets.
There is nothing in the ImagesService Javadoc that appears to do the job.
How do you delete a serving URL that was created with a GcsFilename?
Solution
After way too much digging through the Javadocs, I discovered:
BlobKey createGsBlobKey(java.lang.String filename)
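In short, the pattern is to wrap the GCS object path in a BlobKey and delete the serving URL through that key. A minimal sketch; the bucket and object names are placeholders:
// The path format expected by createGsBlobKey is "/gs/{bucket}/{object}".
BlobKey key = BlobstoreServiceFactory.getBlobstoreService()
        .createGsBlobKey("/gs/my-bucket/my-image.jpg");
ImagesServiceFactory.getImagesService().deleteServingUrl(key);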
Here is the complete solution I ended up with.
Imports:
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
import com.google.appengine.api.images.ImagesService;
import com.google.appengine.api.images.ImagesServiceFactory;
import com.googlecode.objectify.Work;
import com.vertigrated.gae.codex.service.datastore.entity.ImageMetadata;
import javax.annotation.Nonnull;
import java.util.UUID;
import static com.googlecode.objectify.ObjectifyService.ofy;
And the code:
private static final BlobstoreService BLOBSTORE_SERVICE;
private static final ImagesService IMAGES_SERVICE;

static
{
    BLOBSTORE_SERVICE = BlobstoreServiceFactory.getBlobstoreService();
    IMAGES_SERVICE = ImagesServiceFactory.getImagesService();
}

@Override
public boolean delete(@Nonnull final UUID uuid)
{
    return ofy().transact(new Work<Boolean>()
    {
        @Override
        public Boolean run()
        {
            final ImageMetadata im = ofy().load().type(ImageMetadata.class).id(uuid.toString()).now();
            final BlobKey bk = BLOBSTORE_SERVICE.createGsBlobKey(im.getFilename().toString());
            IMAGES_SERVICE.deleteServingUrl(bk);
            ofy().delete().entity(im);
            return ImageMetadataEntityService.this.delete(uuid);
        }
    });
}

NoClassDefFoundError: javax.naming.directory.InitialDirContext is a restricted class. Using CCS (GCM) in Google App Engine

I'm trying to implement Google's Cloud Connection Server with Google App Engine, following this tutorial:
Implementing an XMPP-based App Server. I copied the latest Smack JARs from http://www.igniterealtime.org/projects/smack/ (smack.jar and smackx.jar), put them in WEB-INF/lib, and added them to the classpath (I'm using Eclipse).
In the code sample in the first link I posted, the XMPPConnection is initiated in a 'main' method. Since this is not really suitable for GAE, I created a ServletContextListener and added it to web.xml.
public class GCMContextListener implements ServletContextListener {

    private static final String GCM_SENDER_ID = "*GCM_SENDER_ID*";
    private static final String API_KEY = "*API_KEY*";

    private SmackCcsClient ccsClient;

    public GCMContextListener() {
    }

    @Override
    public void contextInitialized(ServletContextEvent arg0) {
        final String userName = GCM_SENDER_ID + "@gcm.googleapis.com";
        final String password = API_KEY;
        ccsClient = new SmackCcsClient();
        try {
            ccsClient.connect(userName, password);
        } catch (XMPPException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void contextDestroyed(ServletContextEvent arg0) {
        try {
            ccsClient.disconnect();
        } catch (XMPPException e) {
            e.printStackTrace();
        }
    }
}
web.xml
<web-app>
    <listener>
        <listener-class>com.myserver.bootstrap.GCMContextListener</listener-class>
    </listener>
</web-app>
Now, when I start the GAE server I get the following exception:
java.lang.NoClassDefFoundError: javax.naming.directory.InitialDirContext is a restricted class. Please see the Google App Engine developer's guide for more details.
I searched the Google App Engine developer's guide but couldn't find anything about this. Can you please help me?
Google App Engine restricts access to certain JRE classes. In fact, they published a whitelist that shows you which classes are usable. It seems to me that the Smack library might require some reference to a directory context (maybe to create the XMPP messages?), and that is why your servlet causes this exception. The javax.naming.directory package is not in the whitelist.
I'm currently working on setting up a GCM server as well. It seems to me that you need to read through the example and see what that main method is doing. What I see is a connection to the GCM server:
try {
    ccsClient.connect(userName, password);
} catch (XMPPException e) {
    e.printStackTrace();
}
Then a downstream message being sent to a device:
// Send a sample hello downstream message to a device.
String toRegId = "RegistrationIdOfTheTargetDevice";
String messageId = ccsClient.getRandomMessageId();
Map<String, String> payload = new HashMap<String, String>();
payload.put("Hello", "World");
payload.put("CCS", "Dummy Message");
payload.put("EmbeddedMessageId", messageId);
String collapseKey = "sample";
Long timeToLive = 10000L;
Boolean delayWhileIdle = true;
ccsClient.send(createJsonMessage(toRegId, messageId, payload, collapseKey,
        timeToLive, delayWhileIdle));
These operations would be completed at some point during your application's lifecycle, so your servlet should support them by providing the methods the example implements, such as the connect method that appears in the first piece of code I pasted here. Its implementation is in the example at line 235, if I'm not mistaken.
As the documentation says, the third-party application server, which is what you're trying to implement using GAE, should be:
Able to communicate with your client.
Able to fire off properly formatted requests to the GCM server.
Able to handle requests and resend them as needed, using exponential back-off.
Able to store the API key and client registration IDs. The API key is included in the header of POST requests that send messages.
Able to generate message IDs to uniquely identify each message it sends.
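To make this concrete, here is a minimal sketch of a servlet wrapping the sample's send path. It assumes the SmackCcsClient class from Google's sample (with the connect, getRandomMessageId, and static createJsonMessage members shown above) is in the same package, and that the restricted-class issue has been resolved; the regId request parameter is a placeholder name:
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.jivesoftware.smack.XMPPException;

public class GcmSendServlet extends HttpServlet {

    private static final String GCM_SENDER_ID = "*GCM_SENDER_ID*";
    private static final String API_KEY = "*API_KEY*";

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Connect per request for simplicity; a long-lived server would reuse the
        // connection held by something like the GCMContextListener above.
        SmackCcsClient ccsClient = new SmackCcsClient();
        try {
            ccsClient.connect(GCM_SENDER_ID + "@gcm.googleapis.com", API_KEY);
            String toRegId = req.getParameter("regId"); // registration ID posted by the client
            String messageId = ccsClient.getRandomMessageId();
            Map<String, String> payload = new HashMap<String, String>();
            payload.put("Hello", "World");
            ccsClient.send(SmackCcsClient.createJsonMessage(toRegId, messageId, payload,
                    "sample", 10000L, true));
        } catch (XMPPException e) {
            throw new ServletException(e);
        }
    }
}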

Which are the file URIs on GAE Java when emulating Cloud Storage with the GCS client library?

I'm developing a web application using Google App Engine for Java.
I will use Google Cloud Storage and, according to the documentation, I'm using the GCS client library to emulate Cloud Storage on local disk.
I have no problem saving the files; I can see them from Eclipse under the war folder (under the path WEB-INF/appengine-generated), and I can see them from the web admin panel accessible from the URL
localhost:8888/_ah/admin
as indicated in this question.
My question is the following: which are the file URIs under localhost to access them with GCS emulation?
Example of one of the uploaded files on localhost:
file key is aglub19hcHBfaWRyJwsSF19haF9GYWtlQ2xvdWRTdG9yYWdlX18xIgpxcmNvZGUuanBnDA
ID/name is encoded_gs_key:L2dzLzEvcXJjb2RlLmpwZw
filename is /gs/1/qrcode.jpg
Thanks in advance.
You can see how this is done here:
https://code.google.com/p/appengine-gcs-client/source/browse/trunk/java/src/main/java/com/google/appengine/tools/cloudstorage/dev/LocalRawGcsService.java
As of today this mapping is maintained using the local datastore. This may change in the future, but you should be able to simply call into this class, or one of the higher-level classes provided with the GCS client, to get at the data.
Using getServingUrl()
The local GCS file is saved in blob format.
When saving it, I can use a location like your filename "/gs/1/qrcode.jpg".
Yet when accessing it, this fake location does not work.
I found a way. It may not be the best, but it works for me:
BlobKey bk = BlobstoreServiceFactory.getBlobstoreService().createGsBlobKey(location);
String url = ImagesServiceFactory.getImagesService().getServingUrl(bk);
The URL will be like:
http://127.0.0.1:8080/_ah/img/encoded_gs_key:yourkey
(I could hardly find any direct solution via Google search; I hope this answer can help others in need.)
Resources: ImagesServiceFactory, ImagesService, FileServiceFactory
For those who wish to serve the local GCS files that have been created by the GAE GCS library, one solution is to expose a Java Servlet like this:
package my.applicaion.servlet;
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
public final class GoogleCloudStorageServlet
        extends HttpServlet
{
    @Override
    protected void doGet(final HttpServletRequest request, final HttpServletResponse response)
            throws ServletException, IOException
    {
        final BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
        final String fileName = "/gs" + request.getPathInfo();
        final BlobKey blobKey = blobstoreService.createGsBlobKey(fileName);
        blobstoreService.serve(blobKey, response);
    }
}
and in your web.xml:
<servlet>
    <servlet-name>GoogleCloudStorage</servlet-name>
    <servlet-class>my.applicaion.servlet.GoogleCloudStorageServlet</servlet-class>
</servlet>
<servlet-mapping>
    <servlet-name>GoogleCloudStorage</servlet-name>
    <url-pattern>/gcs/*</url-pattern>
</servlet-mapping>
If you host this servlet in your GAE application, the URL for accessing a GCS file in bucket bucket-name with name fileName is http://localhost:8181/gcs/bucket-name/fileName, the local GAE development server port number being 8181.
This works at least as of GAE v1.9.50.
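For example, the file /gs/1/qrcode.jpg from the question above (bucket 1, object qrcode.jpg) would then be served at http://localhost:8181/gcs/1/qrcode.jpg.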
And if you intend to have the local GCS server working in a unit test with Jetty, here is a workaround, hopefully with the right comments:
final int localGcsPortNumber = 8081;
final Server localGcsServer = new Server(localGcsPortNumber);
final ServletContextHandler context = new ServletContextHandler(ServletContextHandler.NO_SESSIONS);
final String allPathSpec = "/*";
context.addServlet(new ServletHolder(new HttpServlet()
{
    @Override
    protected void service(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException
    {
        final BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
        final String fileName = "/gs" + request.getRequestURI();
        final BlobKey blobKey = blobstoreService.createGsBlobKey(fileName);
        if (blobKey != null)
        {
            // This is a work-around over the "ServeBlobFilter", which does not take the
            // "Content-Type" from the "blobInfo" but attempts to retrieve it from the "blobKey"
            final BlobInfo blobInfo = BlobStorageFactory.getBlobInfoStorage().loadGsFileInfo(blobKey);
            if (blobInfo != null)
            {
                final String contentType = blobInfo.getContentType();
                if (contentType != null)
                {
                    response.addHeader(HttpHeaders.CONTENT_TYPE, contentType);
                }
            }
        }
        blobstoreService.serve(blobKey, response);
    }
}), allPathSpec);
// The filter is responsible for taking the "blobKey" from the HTTP header and for fulfilling
// the response with the corresponding GCS content
context.addFilter(ServeBlobFilter.class, allPathSpec, EnumSet.of(DispatcherType.REQUEST));
// This attribute must be set, otherwise a "NullPointerException" is thrown
context.getServletContext().setAttribute("com.google.appengine.devappserver.ApiProxyLocal", LocalServiceTestHelper.getApiProxyLocal());
localGcsServer.setHandler(context);
localGcsServer.start();
