How to define correct MediaTypes for ResponseBuilder in ExceptionMapper - cxf

I'm facing the following problem:
I'm using CXF for REST services. For exception handling I'm using javax.ws.rs.ext.ExceptionMapper, and in public Response toResponse(T ex) I want to return some object, for example:
class MyObject {
    String detail;
}
An example implementation of the method looks like this:
public Response toResponse(T ex) {
    MyObject o = new MyObject();
    o.detail = "...";
    return Response.status(400).entity(o).build();
}
but I'm getting the following warning:
org.apache.cxf.jaxrs.interceptor.JAXRSOutInterceptor writeResponseErrorMessage
WARNING: No message body writer has been found for response class RequestError.
Somehow I found that when I specify the MediaType
return Response.status(400).entity(o).type("application/json").build();
everything is OK, but I do not know which type the client accepts...
Of course I could store somewhere which types the client accepts and later use the correct one, but this smells. I'd like something nicer.
For example, in my CXF endpoint I can specify, using @Produces, which MediaTypes my controller method produces, and CXF/Spring selects the correct one. I tried it in my ExceptionMapper too, but it doesn't work.

You can do it like this:
@Context HttpHeaders headers;

public Response toResponse(Exception e) {
    ExceptionEntity ee = new ExceptionEntity(e);
    ResponseBuilder rb = Response.status(Response.Status.INTERNAL_SERVER_ERROR);
    rb.type(headers.getMediaType());
    rb.entity(ee);
    Response r = rb.build();
    return r;
}
I'm using cxf-rs 2.7.5.
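If you want to honour the client's Accept header rather than the request's Content-Type, a fuller mapper might look like the following. This is only a minimal sketch, assuming a standard JAX-RS setup; MyExceptionMapper is an illustrative name and MyObject is the class from the question:
import java.util.List;

import javax.ws.rs.core.Context;
import javax.ws.rs.core.HttpHeaders;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;

@Provider
public class MyExceptionMapper implements ExceptionMapper<RuntimeException> {

    @Context
    private HttpHeaders headers;

    @Override
    public Response toResponse(RuntimeException ex) {
        MyObject o = new MyObject();
        o.detail = ex.getMessage();

        // Take the client's most preferred type from the Accept header; fall back to JSON.
        // A wildcard such as */* may still need mapping to a concrete type.
        List<MediaType> accepted = headers.getAcceptableMediaTypes();
        MediaType type = (accepted.isEmpty() || accepted.get(0).isWildcardType())
                ? MediaType.APPLICATION_JSON_TYPE : accepted.get(0);

        return Response.status(400).entity(o).type(type).build();
    }
}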

Related

@AfterReturning from ExceptionHandler not working

I have a GlobalExceptionHandler class which contains multiple methods annotated with @ExceptionHandler:
@ExceptionHandler({ AccessDeniedException.class })
public final ResponseEntity<Object> handleAccessDeniedException(
        Exception ex, WebRequest request) {
    return new ResponseEntity<Object>(
            "Access denied message here", new HttpHeaders(), HttpStatus.FORBIDDEN);
}
I have an AOP advice which is supposed to be triggered after the exception handler returns a response.
@AfterReturning(value = "@annotation(exceptionHandler)", returning = "response")
public void afterReturningAdvice(JoinPoint joinPoint, Object response) {
    // do something
}
But the @AfterReturning is not triggered after the handler returns a valid response.
I also tried the fully qualified name, but it's not working either:
@AfterReturning(value = "@annotation(org.springframework.web.bind.annotation.ExceptionHandler)", returning = "response")
public void afterReturningAdvice(JoinPoint joinPoint, Object response) {
    // do something
}
Please go through the documentation to understand the proxying mechanisms in the Spring Framework.
Assuming the @ExceptionHandler code was written in the following format:
@ControllerAdvice
public class TestControllerAdvice {
    @ExceptionHandler({ AccessDeniedException.class })
    final public ResponseEntity<Object> handleAccessDeniedException(
            Exception ex, WebRequest request) {
        return new ResponseEntity<Object>(
                "Access denied message here", new HttpHeaders(), HttpStatus.FORBIDDEN);
    }
}
The key points from the documentation pertaining to the question are:
Spring AOP uses either JDK dynamic proxies or CGLIB to create the proxy for a given target object. If the target object to be proxied implements at least one interface, a JDK dynamic proxy is used and all of the interfaces implemented by the target type are proxied. If the target object does not implement any interfaces, a CGLIB proxy is created.
With CGLIB, final methods cannot be advised, as they cannot be overridden in runtime-generated subclasses.
The OP identified the issue based on the comments and hints; this answer is for future reference.
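For those future readers, here is a minimal sketch of the kind of fix the quoted points imply, assuming the culprit is the final modifier on the handler (and assuming AccessDeniedException is Spring Security's): dropping final lets the CGLIB-generated subclass override, and therefore advise, the method.
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.security.access.AccessDeniedException;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.context.request.WebRequest;

@ControllerAdvice
public class TestControllerAdvice {

    // No "final" modifier, so the CGLIB subclass can override this method
    // and the @AfterReturning advice can fire.
    @ExceptionHandler({ AccessDeniedException.class })
    public ResponseEntity<Object> handleAccessDeniedException(
            Exception ex, WebRequest request) {
        return new ResponseEntity<Object>(
                "Access denied message here", new HttpHeaders(), HttpStatus.FORBIDDEN);
    }
}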

How to understand Window mechanism in Apache Flink

I'm learning how to use Flink to process streaming data.
As I understand it, I can use the map function to do all kinds of transformations, as many times as I like.
Say the data source keeps sending Strings to Flink, and all of the Strings are JSON-formatted data like below:
{"name":"titi","age":18}
{"name":"toto","age":20}
...
Here is my code:
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

FlinkPravegaReader<String> source = FlinkPravegaReader.<String>builder()
        .withPravegaConfig(pravegaConfig)
        .forStream(stream)
        .withDeserializationSchema(new PravegaDeserializationSchema<>(String.class, new JavaSerializer<>()))
        .build();

// Convert String to Json Object
// MyJson is a POJO class, defined by me
DataStream<MyJson> jsonStream = env.addSource(source).name("Pravega Stream")
        .map(new MapFunction<String, MyJson>() {
            @Override
            public MyJson map(String s) throws Exception {
                MyJson myJson = JSON.parseObject(s, MyJson.class);
                return myJson;
            }
        });

// Convert MyJson Object to String and extract what I need
DataStream<String> valueInJson = jsonStream
        .map(new MapFunction<MyJson, String>() {
            @Override
            public String map(MyJson myJson) throws Exception {
                return myJson.getName().toString();
            }
        });

valueInJson.print();
env.execute("StreamingJob");
As you can see, my example is quite simple:
get and deserialize the data ---> transform the String into a JSON object ---> transform the JSON object into a String and extract what I need (I just need name here).
For now, it seems that everything works fine. I did get the expected output in the log file.
However, I know that Flink provides a powerful feature: windows.
I want to know how to use this mechanism in my example.
For example, if I want to split the data stream into 2-second windows, how do I code this?
I've tried this:
DataStream<String> valueInJson = jsonStream
        .timeWindow(Time.seconds(2))
        .map(new MapFunction<MyJson, String>() {
            @Override
            public String map(MyJson myJson) throws Exception {
                return myJson.toString();
            }
        });
valueInJson.print();
However, I got an error:
cannot find symbol
  symbol:   method timeWindow(org.apache.flink.streaming.api.windowing.time.Time)
  location: variable jsonStream of type org.apache.flink.streaming.api.datastream.DataStream
But, I have imported:
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.streaming.api.windowing.time.Time;
Why did I get this error? Did I use windows wrongly? Did I misunderstand something about Flink?
You get the error because the timeWindow() function is defined on KeyedStream, not on DataStream, since it is a key-based operation. In your case it should be enough to change timeWindow() into timeWindowAll().
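To sketch what that might look like (a minimal sketch, assuming the Flink 1.x DataStream API and processing-time windows): after timeWindowAll() you have an AllWindowedStream, so the per-window results are emitted with a window function such as apply() rather than map().
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.windowing.AllWindowFunction;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

DataStream<String> valueInJson = jsonStream
        .timeWindowAll(Time.seconds(2))
        .apply(new AllWindowFunction<MyJson, String, TimeWindow>() {
            @Override
            public void apply(TimeWindow window, Iterable<MyJson> values, Collector<String> out) {
                // Emit the name of every element that fell into this 2-second window
                for (MyJson myJson : values) {
                    out.collect(myJson.getName());
                }
            }
        });
valueInJson.print();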

Hystrix Javanica: call always returning result from fallback method (Java web app without Spring)

I am trying to integrate Hystrix Javanica into my existing Java EJB web application and am facing two issues running it:
1. When I invoke the following service, it always returns the response from the fallback method, and I see that the Throwable object in the fallback method is a "com.netflix.hystrix.exception.HystrixTimeoutException".
2. Each time this service is triggered, the HystrixCommand and fallback methods are called multiple times, around 50 times.
Can anyone suggest any inputs? Am I missing any configuration?
I am including the following libraries in my project: [image: project libraries]
I have set up my aspect file as follows:
<aspectj>
    <weaver options="-verbose -showWeaveInfo"></weaver>
    <aspects>
        <aspect name="com.netflix.hystrix.contrib.javanica.aop.aspectj.HystrixCommandAspect"/>
    </aspects>
</aspectj>
Here is my config.properties file in META-INF/config.properties
hystrix.command.default.execution.timeout.enabled=false
Here is my REST service file:
@Path("/hystrix")
public class HystrixService {

    @GET
    @Path("clusterName")
    @Produces({ MediaType.APPLICATION_JSON })
    public Response getClusterName(@QueryParam("id") int id) {
        ClusterCmdBean clusterCmdBean = new ClusterCmdBean();
        String result = clusterCmdBean.getClusterNameForId(id);
        return Response.ok(result).build();
    }
}
Here is my bean class:
public class ClusterCmdBean {

    @HystrixCommand(groupKey = "ClusterCmdBeanGroup", commandKey = "getClusterNameForId", fallbackMethod = "defaultClusterName")
    public String getClusterNameForId(int id) {
        if (id > 0) {
            return "cluster" + id;
        } else {
            throw new RuntimeException("command failed");
        }
    }

    public String defaultClusterName(int id, Throwable e) {
        return "No cluster - returned from fallback:" + e.getMessage();
    }
}
Thanks for the help.
If you want to ensure you are setting the property, you can do that explicitly in the circuit annotation itself:
@HystrixCommand(commandProperties = {
        @HystrixProperty(name = "execution.timeout.enabled", value = "false")
})
I would only recommend this for debugging purposes though.
Something that jumps out at me is that Javanica uses AspectJ AOP, which I have never seen work with new MyBean() before; I've always had to use @Autowired with Spring or similar to allow proxying. This could well just be something that is new to me, though.
If you set a breakpoint inside getClusterNameForId, can you see in the stack trace that it's being called via reflection (which it should be, AFAIK)?
Note that you can remove commandKey, as it defaults to the method name. Personally, I would also remove groupKey and let it default to the class name.
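Putting those two suggestions together, here is a minimal sketch of the simplified bean, keeping the timeout property only for debugging as noted above:
import com.netflix.hystrix.contrib.javanica.annotation.HystrixCommand;
import com.netflix.hystrix.contrib.javanica.annotation.HystrixProperty;

public class ClusterCmdBean {

    // commandKey defaults to "getClusterNameForId" and groupKey to "ClusterCmdBean"
    @HystrixCommand(fallbackMethod = "defaultClusterName",
            commandProperties = {
                    // debugging only: disables the execution timeout for this command
                    @HystrixProperty(name = "execution.timeout.enabled", value = "false")
            })
    public String getClusterNameForId(int id) {
        if (id > 0) {
            return "cluster" + id;
        }
        throw new RuntimeException("command failed");
    }

    public String defaultClusterName(int id, Throwable e) {
        return "No cluster - returned from fallback:" + e.getMessage();
    }
}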

Is it possible to return a text/plain from Google Cloud Endpoints?

I want to return just a simple blob of text from Google Cloud Endpoints that would be interpreted by the client as nothing but a text file. Is this possible?
I know it is not possible to return primitives, but can I return an HttpServletResponse and set the content myself or something?
Disclaimer: Not tested, just a braindump.
Cloud Endpoints uses ProtoRPC as the underlying transport, which encodes messages as JSON over the wire. You can't change this behavior. The simplest way to return a text file is to just define a simple message class with one String member for the text file:
public class TextFile {
    private String text;
    // getText, setText methods ...
}
Then your Endpoints method would look something like this:
@Api(name = "my_api", ...)
public class MyAPI {

    @ApiMethod(name = "myapi.returntext", httpMethod = "get")
    public TextFile returnText() {
        TextFile response = new TextFile();
        response.setText(read_text_from_some_source());
        return response;
    }
}
You'll get a trivial JSON response from this method which should be easy enough to parse the text data out of:
{ "text": "<contents_of_text_dump>" }
The response may have some extra fields such as 'kind' and 'etag' which you can ignore.
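For instance, a client-side sketch of pulling the text back out (assuming the Gson library and the TextFile class above; any JSON parser would do):
import com.google.gson.Gson;

public class TextFileClient {

    // Parses a {"text": "..."} response body; extra fields like "kind" and "etag" are simply ignored by Gson.
    public static String extractText(String responseBody) {
        return new Gson().fromJson(responseBody, TextFile.class).getText();
    }
}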
Of course the simplest method if you just want to dump out some text is to forget about Endpoints altogether and just set up a GET handler:
public class ReturnText extends HttpServlet {

    public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
        response.setContentType("text/plain");
        response.getWriter().write(read_text_from_some_source());
    }
}
You can then map this to whatever endpoint url you wish in your web.xml.

Apache Camel @Produce method with Object argument instead of String

I am using Camel's POJO producing, e.g.:
public interface MyListener {
    String sayHello(String name);
}

public class MyBean {

    @Produce(uri = "activemq:foo")
    protected MyListener producer;

    public void doSomething() {
        // lets send a message
        String response = producer.sayHello("James");
    }
}
The interface's sayHello method takes a String, which is used as the message body in Camel. However, if I try to use any other Object here, I get an exception from Camel saying no TypeConverter was found to convert the BeanInvocation to java.io.InputStream.
I know that if arbitrary objects were allowed it would have been mentioned somewhere, but I want to understand why it has been done like that and whether there's a way to work around this.
I haven't really used POJO messaging yet, so maybe an experienced user can help you better with this.
But from what I understand, it should be able to support any kind of object, not just String.
The error that you're talking of seems to arise from a mismatch further down the route; I'm guessing there is some kind of issue with the consumption.
Can you please post the exact error stack trace and the consumer method?
Thanks!
I'm struggling with the same problem right now. The only obvious workaround so far is to use @EndpointInject instead of @Produce - then you get a ProducerTemplate and can publish any object:
@EndpointInject(uri = "seda:report-send")
ProducerTemplate reportSender;
Now you can do
Object myObject = new Object();
reportSender.sendBody(myObject);
Or even
Object myObject = new Object();
Map<String, Object> headers = new HashMap<String, Object>();
headers.put("Subject", "Mail subject");
headers.put("contentType", "text/plain");
reportSender.sendBodyAndHeaders(myObject, headers);
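Pulling that together, here is a self-contained sketch of the workaround (a sketch only, assuming Camel 2.x and its uri attribute; ReportService is an illustrative name):
import java.util.HashMap;
import java.util.Map;

import org.apache.camel.EndpointInject;
import org.apache.camel.ProducerTemplate;

public class ReportService {

    // The injected template's default endpoint is "seda:report-send",
    // so sendBody()/sendBodyAndHeaders() publish straight to it.
    @EndpointInject(uri = "seda:report-send")
    ProducerTemplate reportSender;

    public void send(Object report) {
        Map<String, Object> headers = new HashMap<String, Object>();
        headers.put("Subject", "Mail subject");
        headers.put("contentType", "text/plain");
        // Any object can be sent as the body; no BeanInvocation conversion is involved.
        reportSender.sendBodyAndHeaders(report, headers);
    }
}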
