How to log as jsonPayload to Stackdriver from Google App Engine using logback? - google-app-engine

My Spring Boot app uses logback to log messages in JSON format. The app is configured to use a console appender (stdout). When the logs appear in Stackdriver, they appear as textPayload instead of jsonPayload. Is it possible to write the message to the jsonPayload field in Stackdriver using logback? If not, what are my options for logging in JSON format?

Based on this GitHub link, it seems the issue is that all log entries are seen as textPayload. It has been added as a feature request, but there is no ETA on when it will be available.
I'm not entirely sure an alternative exists, as logback seems to give extensive log information, but if you are able to use the Stackdriver Logging client instead, you could format the entry yourself in order to get your object as a jsonPayload, although you will have to specify most of the log fields yourself, which can be an extra amount of work.
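For illustration, a minimal sketch of writing a jsonPayload entry directly with the google-cloud-logging client instead of logback; the log name, resource type and JSON fields below are placeholders to adapt, not values from the question:
import com.google.cloud.MonitoredResource;
import com.google.cloud.logging.LogEntry;
import com.google.cloud.logging.Logging;
import com.google.cloud.logging.LoggingOptions;
import com.google.cloud.logging.Payload.JsonPayload;
import com.google.cloud.logging.Severity;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class JsonPayloadExample {
  public static void main(String[] args) throws Exception {
    try (Logging logging = LoggingOptions.getDefaultInstance().getService()) {
      // The map becomes the jsonPayload of the entry.
      Map<String, Object> json = new HashMap<>();
      json.put("message", "order processed");
      json.put("orderId", 1000);

      LogEntry entry = LogEntry.newBuilder(JsonPayload.of(json))
          .setSeverity(Severity.INFO)
          .setLogName("application-log")  // placeholder log name
          .setResource(MonitoredResource.newBuilder("gae_app").build())
          .build();

      logging.write(Collections.singleton(entry));
    }
  }
}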

The easy way to do this is to implement the transformation of the textPayload (in JSON format) into a jsonPayload in a LoggingEnhancer.
Check this answer: How to use Stackdriver Structured Logging in App Engine Flex Java environment
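A minimal sketch of that idea, assuming the formatted log line is itself a JSON string (for example, produced by a JSON encoder in logback). The class name JsonPayloadEnhancer and the use of Gson are my own choices, not part of the linked answer; the interface shown is com.google.cloud.logging.logback.LoggingEventEnhancer:
import ch.qos.logback.classic.spi.ILoggingEvent;
import com.google.cloud.logging.LogEntry;
import com.google.cloud.logging.Payload;
import com.google.cloud.logging.logback.LoggingEventEnhancer;
import com.google.gson.Gson;
import com.google.gson.JsonSyntaxException;
import java.util.Map;

public class JsonPayloadEnhancer implements LoggingEventEnhancer {
  private static final Gson GSON = new Gson();

  @Override
  public void enhanceLogEntry(LogEntry.Builder builder, ILoggingEvent event) {
    Payload<?> payload = builder.build().getPayload();
    if (!(payload instanceof Payload.StringPayload)) {
      return; // already structured, nothing to do
    }
    String text = ((Payload.StringPayload) payload).getData();
    try {
      @SuppressWarnings("unchecked")
      Map<String, Object> json = GSON.fromJson(text, Map.class);
      // Replace the text payload with a JSON payload built from the parsed message.
      builder.setPayload(Payload.JsonPayload.of(json));
    } catch (JsonSyntaxException e) {
      // Not valid JSON: keep the original text payload.
    }
  }
}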

It is possible via google-cloud-logging-logback library.
However, please note the following (from https://cloud.google.com/logging/docs/structured-logging):
Note: message is saved as textPayload if it is the only field remaining
after the Logging agent moves the other special-purpose fields and
detect_json wasn't enabled; otherwise message remains in jsonPayload.
detect_json is not applicable to managed logging environments like
Google Kubernetes Engine.
To add more data to the JSON, add an enhancer. For example:
import ch.qos.logback.classic.spi.ILoggingEvent;
import com.google.cloud.logging.LogEntry;
import com.google.cloud.logging.Payload;
import com.google.cloud.logging.logback.LoggingEventEnhancer;
import java.util.HashMap;

public class EventEnhancer implements LoggingEventEnhancer {
  @Override
  public void enhanceLogEntry(LogEntry.Builder logEntry, ILoggingEvent e) {
    HashMap<String, Object> map = new HashMap<>();
    map.put("thread", e.getThreadName());
    map.put("context", e.getLoggerContextVO().getName());
    map.put("logger", e.getLoggerName());
    Payload.JsonPayload payload = logEntry.build().getPayload();
    map.putAll(payload.getDataAsMap());
    logEntry.setPayload(Payload.JsonPayload.of(map));
  }
}
Configuration:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE configuration>
<configuration scan="true">
  <appender name="CLOUD" class="com.google.cloud.logging.logback.LoggingAppender">
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
      <level>INFO</level>
    </filter>
    <log>application.log</log>
    <redirectToStdout>true</redirectToStdout>
    <resourceType>gae_app</resourceType>
    <loggingEventEnhancer>EventEnhancer</loggingEventEnhancer>
    <flushLevel>INFO</flushLevel>
  </appender>
  <root level="INFO">
    <appender-ref ref="CLOUD"/>
  </root>
</configuration>
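As a quick sanity check, here is a minimal sketch of a log call (the class name Example is just a placeholder): with the appender and enhancer above configured, an ordinary SLF4J call should arrive in Cloud Logging with the thread, context and logger fields under jsonPayload.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Example {
  private static final Logger LOG = LoggerFactory.getLogger(Example.class);

  public static void main(String[] args) {
    // Routed through the CLOUD appender; EventEnhancer adds thread/context/logger
    // to the entry before it is written to Cloud Logging.
    LOG.info("request handled");
  }
}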

Related

Unable to push vespa metrics to cloudwatch

Basically, I need to monitor Vespa metrics, and for that I am trying to implement a way to push metrics to CloudWatch.
This is the document that I am referring to: https://docs.vespa.ai/documentation/monitoring.html
I have added the credentials file and the putMetricData permission in the attached IAM role. The services.xml file that I am using in my code looks like this:
<admin version="2.0">
  <adminserver hostalias="admin0"/>
  <configservers>
    <configserver hostalias="admin0"/>
  </configservers>
  <monitoring>
  </monitoring>
  <metrics>
    <consumer id="my-cloudwatch">
      <metric-set id="vespa" />
      <cloudwatch region="ap-south-1" namespace="vespa">
        <shared-credentials file="~/.aws/credentials" profile="default" />
      </cloudwatch>
    </consumer>
  </metrics>
</admin>
I have deployed the code using vespa-deploy prepare application.zip && vespa-deploy activate, but I am still not seeing any metrics updated in my CloudWatch.
Also, I have tried to add:
<monitoring>
  <interval>1</interval>
  <systemname>vespa</systemname>
</monitoring>
But getting this error when deploying:
Request failed. HTTP status code: 400
Invalid application package: default.default: Error loading model: XML error in services.xml: element "interval" not allowed here; expected the element end-tag [9:16], input:
How can I fix this issue, or at least debug what I am facing?
I suggest using an absolute path to the credentials file, as ~ may not resolve to the directory you intended at runtime.
A couple more things:
I recommend using the default metric set, as vespa contains a lot of metrics, which will drive your CloudWatch cost higher. If you need additional metrics, you can add them with the metric tag inside consumer (see the sketch below).
The monitoring element doesn't do anything useful in this context, so you should just drop it.
If you still don't see any metrics, check for warnings or errors in the Vespa log file (use vespa-logfmt) and in the Telegraf log file: /opt/vespa/logs/telegraf/telegraf.log. (Vespa uses Telegraf internally to emit metrics to CloudWatch.)
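For illustration, a minimal services.xml sketch with those suggestions applied; the absolute credentials path and the extra metric id are placeholders, not values taken from the question:
<metrics>
  <consumer id="my-cloudwatch">
    <!-- the default set is much smaller (and cheaper) than the full vespa set -->
    <metric-set id="default" />
    <!-- hypothetical extra metric; replace the id with the one you actually need -->
    <metric id="some.metric.id" />
    <cloudwatch region="ap-south-1" namespace="vespa">
      <!-- absolute path instead of ~ -->
      <shared-credentials file="/home/vespa/.aws/credentials" profile="default" />
    </cloudwatch>
  </consumer>
</metrics>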

Google AppEngine application log assigned to the wrong request log

When I look at the logs in the Google Log Viewer for my GAE project, I see that often the logs that I write myself in the code are assigned to the wrong request. Most of the time the log is assigned to the request directly after the request that produced the log entry.
As the root of every application log in GAE must be a request, this means that the wrong request is sometimes marked as error, because another request before produced an error, but the log is somehow assigned to the request after that.
I don't really do anything special; I use Ktor as my servlet framework and have an interceptor that creates a log entry when an exception occurs, before returning status 500.
I use Java logging via SLF4J with the Google Cloud Logging handler, but before that I used logback via SLF4J and had the same problem.
The content of the logs itself is also correct: the returned status of the request, the level of the log entry, the message, everything is ok.
I thought that it may be because I use Kotlin and switch coroutine contexts during a single request, but in some cases the point where I write the log and where I send the response are directly next to each other, so I'm not sure Kotlin has anything to do with it.
My logging.properties:
# To use this configuration, add to system properties : -Djava.util.logging.config.file="/path/to/file"
#
.level = INFO
# it is recommended that io.grpc and sun.net logging level is kept at INFO level,
# as both these packages are used by Stackdriver internals and can result in verbose / initialization problems.
io.grpc.netty.level=INFO
sun.net.level=INFO
handlers=com.google.cloud.logging.LoggingHandler
# default : java.log
com.google.cloud.logging.LoggingHandler.log=custom_log
# default : INFO
com.google.cloud.logging.LoggingHandler.level=INFO
# default : ERROR
com.google.cloud.logging.LoggingHandler.flushLevel=WARNING
# default : auto-detected, fallback "global"
#com.google.cloud.logging.LoggingHandler.resourceType=container
# custom formatter
com.google.cloud.logging.LoggingHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.SimpleFormatter.format=%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS %4$-6s %2$s %5$s%6$s%n
#optional enhancers (to add additional fields, labels)
#com.google.cloud.logging.LoggingHandler.enhancers=com.example.logging.jul.enhancers.ExampleEnhancer
My logging relevant dependencies:
implementation "org.slf4j:slf4j-jdk14:1.7.30"
implementation "com.google.cloud:google-cloud-logging:1.100.0"
An example logging call:
exception<Throwable> { e ->
    logger().error("Error", e)
    call.respondText(e.message ?: "", ContentType.Text.Plain, HttpStatusCode.InternalServerError)
}
with logger() being:
import org.slf4j.Logger
import org.slf4j.LoggerFactory
inline fun <reified T : Any> T.logger(): Logger = LoggerFactory.getLogger(T::class.java)
Edit:
An example of the log in Google cloud. The first request has the query parameter GAID=cdda802e-fb9c-47ad-0794d394c913, but as you can see the error log for that request is in the one below, marked in red.

Service Bus Connector and "raw" Text (Content-Type)

Ok.
I'm writing a POC (Proof of Concept) Logic App.
The logic app has a Service Bus connector wired to a queue.
I'm using peek/complete/abandon.
I wrote a client app (dotnet c# console app) that writes messages to the queue (nothing really to do with the logic app part).
I'm setting the Content Type to "text/plain".
string payLoad = @"{ ""myid"": ""1000"", ""mymessage"": ""1000 is great"" , ""myboolean"" : ""true"" }";
QueueClient queueClient = /* not seen here */;
brokeredMsg = new BrokeredMessage(payLoad) { ContentType = System.Net.Mime.MediaTypeNames.Text.Plain };
queueClient.Send(brokeredMsg);
Now I use Service Bus Explorer (4.0.104), and I see the message in the queue.
The issue is that when my Logic App runs, it's not seeing plain text / JSON.
You can see it picked up the content type.
But it garbles the content itself.
Is there a way to get the raw text with this trigger?
Note, the documentation says:
let's look at the two Content-Types that don't require conversion or
casting that you can use in a logic app: application/json and
text/plain.
from :
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-content-type
https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-servicebus
My C# console app packages.config (nothing to do with Logic Apps, but including for completeness)
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.WindowsAzure.ConfigurationManager" version="2.0.1.0" targetFramework="net45" />
  <package id="WindowsAzure.ServiceBus" version="2.1.4.0" targetFramework="net45" />
</packages>
APPEND:
I needed to do two things to get it to work.
One:
I had to change the "sender" code slightly.
QueueClient queueClient = /* not seen here */;
string ct = System.Net.Mime.MediaTypeNames.Text.Plain;
/* see https://social.msdn.microsoft.com/Forums/en-US/8fbf2391-8440-46db-bb47-648daccf46fd/servicebus-output-json-is-being-wrapped-in-a-xml-header-in-logic-app?forum=azurelogicapps and https://abhishekrlal.com/2012/03/30/formatting-the-content-for-service-bus-messages/ */
string payLoad = @"{ ""myid"": ""1000"", ""mymessage"": ""1000 is great"" , ""myboolean"" : ""true"" }";
brokeredMsg = new BrokeredMessage(new System.IO.MemoryStream(System.Text.Encoding.UTF8.GetBytes(Convert.ToString(payLoad))), true) { ContentType = ct };
queueClient.Send(brokeredMsg);
And I used the hint Derek Li gave me.
I've accepted his answer as the answer, but PLEASE NOTE I had to do slightly more than he suggested. The code above has the URLs explaining why I changed the sender code.
In a nutshell, the constructor for BrokeredMessage that I was using was choosing a specific serializer for me.
BrokeredMessage(Object)
Initializes a new instance of the BrokeredMessage class from a given object by
using DataContractSerializer with a binary XmlDictionaryWriter.
https://learn.microsoft.com/en-us/dotnet/api/microsoft.servicebus.messaging.brokeredmessage.-ctor?view=azureservicebus-4.1.1#Microsoft_ServiceBus_Messaging_BrokeredMessage__ctor
And after I figured out the answer, I found this Stack Overflow answer:
Azure Service Bus Serialization Type
You can use the expression @base64ToString(triggerBody()?['ContentData']) to convert it to a string.

GWT Upload fails to App Engine

I want to provide a file upload on Google App Engine with "GWT Upload" (https://code.google.com/p/gwtupload/). During the upload I get an error. As the UploadAction servlet I use the built-in gwtupload.server.gae.AppEngineUploadAction.
The servlet is configured in the web.xml in the following way:
<context-param>
  <!-- max size of the upload request -->
  <param-name>maxSize</param-name>
  <param-value>3145728</param-value>
</context-param>
<context-param>
  <!-- Useful in development mode to slow down the uploads in fast networks.
       Put the number of milliseconds to sleep in each block received in the server.
       false or 0, means don't use slow uploads -->
  <param-name>slowUploads</param-name>
  <param-value>200</param-value>
</context-param>
<servlet>
  <servlet-name>uploadServlet</servlet-name>
  <!-- This is the default servlet, it puts files in session -->
  <servlet-class>gwtupload.server.gae.AppEngineUploadAction</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>uploadServlet</servlet-name>
  <url-pattern>*.gupld</url-pattern>
</servlet-mapping>
During the upload the progress bar advances a few percent and then shows the following error:
But there are no more details in the logs.
The error message shows the class gwtupload.server.gae.MemCacheFileItemFactory$CacheableFileItem with the method setHeader(). That's strange because I can't find the method in that class. What's happening here?
Edit:
This is basically all the custom code I use. On the server side I use the built-in gwtupload.server.gae.AppEngineUploadAction servlet.
package com.uploadtest.client;

import gwtupload.client.IUploadStatus.Status;
import gwtupload.client.IUploader;
import gwtupload.client.IUploader.UploadedInfo;
import gwtupload.client.MultiUploader;
import gwtupload.client.PreloadedImage;
import gwtupload.client.PreloadedImage.OnLoadPreloadedImageHandler;

import com.google.gwt.core.client.EntryPoint;
import com.google.gwt.user.client.ui.FlowPanel;
import com.google.gwt.user.client.ui.RootPanel;

/**
 * Entry point classes define <code>onModuleLoad()</code>.
 */
public class GWTUploadTest2 implements EntryPoint {

  // A panel where the thumbnails of uploaded images will be shown
  private FlowPanel panelImages = new FlowPanel();

  public void onModuleLoad() {
    // Attach the image viewer to the document
    RootPanel.get("thumbnails").add(panelImages);
    // Create a new uploader panel and attach it to the document
    MultiUploader defaultUploader = new MultiUploader();
    RootPanel.get("default").add(defaultUploader);
    // Add a finish handler which will load the image once the upload finishes
    defaultUploader.addOnFinishUploadHandler(onFinishUploaderHandler);
  }

  // Load the image in the document and in the case of success attach it to the viewer
  private IUploader.OnFinishUploaderHandler onFinishUploaderHandler = new IUploader.OnFinishUploaderHandler() {
    public void onFinish(IUploader uploader) {
      if (uploader.getStatus() == Status.SUCCESS) {
        new PreloadedImage(uploader.fileUrl(), showImage);
        // The server sends useful information to the client by default
        UploadedInfo info = uploader.getServerInfo();
        System.out.println("File name " + info.name);
        System.out.println("File content-type " + info.ctype);
        System.out.println("File size " + info.size);
        // You can send any customized message and parse it
        System.out.println("Server message " + info.message);
      }
    }
  };

  // Attach an image to the pictures viewer
  private OnLoadPreloadedImageHandler showImage = new OnLoadPreloadedImageHandler() {
    public void onLoad(PreloadedImage image) {
      image.setWidth("75px");
      panelImages.add(image);
    }
  };
}
In addition to that, I added the following jars to my class path:
log4j-1.2.17.jar
gwtupload-gae-0.6.6.jar
gwtupload-0.6.6.jar
commons-fileupload-1.3.jar
commons-io-2.4.jar
I also zipped my whole sample project and uploaded it here:
https://skydrive.live.com/redir?resid=60B826E451F52B4D!118&authkey=!ALa1n2mL2sRR0wU
Edit 2:
As Manolo pointed out: I was using "commons-fileupload-1.3.jar" instead of "commons-fileupload-1.2.1.jar". Changing the jar fixed my problem!
The problem is in the version of commons-fileupload you are using; change it to version 1.2.1, which is the one pointed to in the gwtupload documentation.
It should work with 1.2.2 as well, but using 1.3 requires new methods (setHeaders) which are not in the UploadListeners provided with gwtupload.
You should also change the target Java version (JDK compliance) of your project to 1.6 to avoid problems, since it is the last one supported by GWT, although it does run on 1.7.

Generate and download file with jboss seam

I need to add an 'export' function to an existing web app using Seam. The purpose is to export search results to a CSV file. I have no problem generating the CSV, but I do not know how to send it back to the user.
I do not want to store the CSV on the server because that would be wasted storage space. How can I achieve this in JBoss Seam?
Use the Document Store Servlet provided by Seam.
Almost copying and pasting from the reference doc, declare the servlet in web.xml like this:
<servlet>
  <servlet-name>Document Store Servlet</servlet-name>
  <servlet-class>org.jboss.seam.document.DocumentStoreServlet</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>Document Store Servlet</servlet-name>
  <url-pattern>/seam/docstore/*</url-pattern>
</servlet-mapping>
Then create an export.xhtml file with only the <s:resource> tag:
<s:resource xmlns="http://www.w3.org/1999/xhtml"
            xmlns:s="http://jboss.com/products/seam/taglib"
            data="#{myComponent.csvData}"
            contentType="application/vnd.ms-excel"
            fileName="#{myComponent.csvFileName}"/>
Generate a link for downloading the file in your page with <s:download>:
<s:download src="/csv/export.xhtml">
  <h:outputText value="Download CSV"/>
  <f:param name="param1" value="somevalue"/>
  <f:param name="param2" value="someOtherValue"/>
</s:download>
Finally, implement getCsvData() and getCsvFileName() methods in your component:
// could be byte[], File or InputStream
public InputStream getCsvData() {
// generate data to be downloaded
}
public String getCsvFileName() {
return "myfile.csv";
}
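For illustration, a minimal sketch of such a component, assuming the search results are already held in memory as a list of rows; the component name myComponent matches the EL expressions above, but the CsvExporter class, the searchResults field and the CSV layout are hypothetical:
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.util.List;
import org.jboss.seam.ScopeType;
import org.jboss.seam.annotations.Name;
import org.jboss.seam.annotations.Scope;

@Name("myComponent")
@Scope(ScopeType.CONVERSATION)
public class CsvExporter {

    // Populated by your search logic; each array is one row of the result set.
    private List<String[]> searchResults;

    // Builds the CSV entirely in memory, so nothing is stored on the server.
    public InputStream getCsvData() {
        StringBuilder csv = new StringBuilder("column1,column2,column3\n");
        for (String[] row : searchResults) {
            for (int i = 0; i < row.length; i++) {
                if (i > 0) {
                    csv.append(',');
                }
                csv.append(row[i]);
            }
            csv.append('\n');
        }
        return new ByteArrayInputStream(csv.toString().getBytes());
    }

    public String getCsvFileName() {
        return "myfile.csv";
    }
}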
Note that <s:download> propagates the conversation (unless you set propagation=none). If you propagate the conversation context, you probably won't need to pass any parameters. For a large data set it may be preferable not to propagate the conversation and instead pass parameters to select the data in a request-scoped component.
There are a couple of ways:
1) Check the Seam docs for info on using Seam-Excel to programmatically generate your file and then write it out using a MIME type set for CSV; this is all detailed in the docs.
However, I could not get this to work in the latest version of Seam, as it requires a response object, which used to be available from the Seam context but now only returns null.
2) Code the CSV file you want as an Excel xhtml template (see the Seam docs and example projects) and simply render it as normal using a tag.
I do this regularly and it works well, bar the restriction that you cannot supply a filename.
HTH.
