I am trying to use MultiValuedMap as the ResponseBody of a rest service but the response I get in the browser is:
{"empty" : false}
This was working fine with MultiValueMap as the ResponseBody, but after upgrading the org.apache.commons libraries, MultiValueMap is deprecated, with instructions to use MultiValuedMap instead.
Here are the relevant parts of my code:
import org.apache.commons.collections4.MultiValuedMap;

@RestController("DatabaseDefinitionRestController")
public class DatabaseDefinitionRestController {

    @RequestMapping(value = "/database/{id}/definitions", method = RequestMethod.GET)
    public MultiValuedMap<Long, DatabaseDefinition> mapDatabaseDefinitions(@PathVariable Long id) {
        return databaseDefinitionService.loadDatabaseDefinition(id);
    }
}
I also tried:
import org.apache.commons.collections4.multimap.ArrayListValuedHashMap;

@RestController("DatabaseDefinitionRestController")
public class DatabaseDefinitionRestController {

    @RequestMapping(value = "/database/{id}/definitions", method = RequestMethod.GET)
    public ArrayListValuedHashMap<Long, DatabaseDefinition> mapDatabaseDefinitions(@PathVariable Long id) {
        return databaseDefinitionService.loadDatabaseDefinition(id);
    }
}
Any help would be appreciated.
Whatever you're using for serialization probably has special case handling for all core Java collections, including Map. The deprecated MultiValueMap implements Map and thus benefits from that special handling. The new MultiValuedMap, for whatever reason, does not. This makes it fall back on default general handling, which depends on the internal implementation of whichever concrete class is used.
Call asMap() on the MultiValuedMap to get a view of it that implements Map, and put that in your ResponseBody to get the special case Map-based serialization.
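For example, a minimal sketch of the controller above with that change (assuming the service still returns a MultiValuedMap):
import java.util.Collection;
import java.util.Map;

import org.apache.commons.collections4.MultiValuedMap;

@RestController("DatabaseDefinitionRestController")
public class DatabaseDefinitionRestController {

    @RequestMapping(value = "/database/{id}/definitions", method = RequestMethod.GET)
    public Map<Long, Collection<DatabaseDefinition>> mapDatabaseDefinitions(@PathVariable Long id) {
        MultiValuedMap<Long, DatabaseDefinition> definitions =
                databaseDefinitionService.loadDatabaseDefinition(id);
        // asMap() exposes a Map<K, Collection<V>> view of the multimap,
        // which the serializer handles with its regular Map support
        return definitions.asMap();
    }
}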
I'm working on a Quarkus extension that provides an interceptor (and its annotation) to add some retry logic around business methods the extension offers. Nothing new there, and this works when I annotate a public method of a bean in an application that uses the extension.
But the extension also provides some @ApplicationScoped beans that are annotated the same way, and the interceptor is not intercepting any of them.
It seems like the interceptor is not applied to beans defined in the extension itself.
I would like to know whether this is intended behavior or an issue in my extension setup, and if so, how to fix it. I could not find anything about this in the documentation, but there is so much of it that I may have missed something.
Any ideas?
I finally found a way to make this work.
I was using the producer pattern to produce my bean as an @ApplicationScoped bean inside the extension:
@ApplicationScoped
public class ProxyProducer {

    @Produces
    @ApplicationScoped
    public BeanA setUpBean(ExtensionConfig config) {
        return new BeanA(new InternalBean(config.prop1, config.prop2));
    }
}
with the following BeanA class (just an example)
public class BeanA {

    private final InternalBean innerBean;

    public BeanA(final InternalBean innerBean) {
        this.innerBean = innerBean;
    }

    @MyInterceptedAnnotation
    public void doSomething() {
    }
}
With this setup, the bean is not considered by the interceptor (I guess because it is only produced the first time it is used or injected somewhere else).
Removing the producer pattern and annotating BeanA directly fixed the issue.
Example:
@ApplicationScoped
public class BeanA {

    private final InternalBean innerBean;

    public BeanA(final ExtensionConfig config) {
        this.innerBean = new InternalBean(config.prop1, config.prop2);
    }

    @MyInterceptedAnnotation
    public void doSomething() {
    }
}
along with, of course, the following lines to register the bean directly in the extension processor:
@BuildStep
AdditionalBeanBuildItem proxyProducer() {
    return AdditionalBeanBuildItem.unremovableOf(BeanA.class);
}
In conclusion: changing the bean implementation to avoid the producer-based bean pattern solved my issue (please refer to Ladicek's comment below).
Edit:
As Ladicek explained, Quarkus doesn't support interception on producer-based beans.
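For context, the binding annotation and interceptor in my extension look roughly like this (a simplified sketch; MyRetryInterceptor and the retry details are placeholders, and each type would live in its own file):
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import javax.annotation.Priority;
import javax.interceptor.AroundInvoke;
import javax.interceptor.Interceptor;
import javax.interceptor.InterceptorBinding;
import javax.interceptor.InvocationContext;

// The interceptor binding provided by the extension
@InterceptorBinding
@Target({ ElementType.TYPE, ElementType.METHOD })
@Retention(RetentionPolicy.RUNTIME)
public @interface MyInterceptedAnnotation {
}

// The interceptor bound to that annotation; as explained above, it is
// applied to regular class-based beans but not to producer-based beans
@MyInterceptedAnnotation
@Interceptor
@Priority(Interceptor.Priority.APPLICATION)
public class MyRetryInterceptor {

    @AroundInvoke
    Object aroundInvoke(InvocationContext ctx) throws Exception {
        // retry logic would wrap ctx.proceed() here
        return ctx.proceed();
    }
}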
I tried to inject a repository into a changelog with the @Autowired annotation, but it doesn't get injected.
The config uses the Spring application context:
@Bean
public SpringBootMongock mongock(ApplicationContext springContext, MongoClient mongoClient) {
    return new SpringBootMongockBuilder(mongoClient, "yourDbName", "com.package.to.be.scanned.for.changesets")
        .setApplicationContext(springContext)
        .setLockQuickConfig()
        .build();
}
And the changelog:
@ChangeLog(order = "001")
public class MyMigration {

    @Autowired
    private MyRepository repo;

    @ChangeSet(order = "001", id = "someChangeId", author = "testAuthor")
    public void importantWorkToDo(DB db) {
        repo.findAll(); // null pointer here
    }
}
Firstly, notice that if you are using repositories in your changelogs, it's bad practice to use them for writes, as writes won't be covered by the lock mechanism (this feature is coming soon); use them only for reads.
To inject your repository (or any other dependency), you simply need to declare it in your changeSet method signature, like this:
@ChangeLog(order = "001")
public class MyMigration {

    @ChangeSet(order = "001", id = "someChangeId", author = "testAuthor")
    public void importantWorkToDo(MongoTemplate template, MyRepository repo) {
        repo.findAll(); // this should work
    }
}
Notice that you should use the latest version (at this moment 3.2.4), and the DB class is not supported anymore. Please use MongoDatabase or MongoTemplate (preferred).
See the Mongock documentation.
Update: we have recently released version 4.0.7.alpha, which among other things allows you to use Spring repositories (and any other custom bean you wish) in your changeSets with no problem. You can insert, update, delete, and read, and it will all be safely covered by the lock.
The only restriction is that it needs to be an interface, which should be the common case for Spring repositories.
Please take a look at this example.
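For instance, a plain Spring Data interface along these lines (MyEntity is a placeholder document class) satisfies that restriction and can be injected into a changeSet:
import org.springframework.data.mongodb.repository.MongoRepository;

// Interface-based repository: eligible for injection into changeSet methods
public interface MyRepository extends MongoRepository<MyEntity, String> {
}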
I am trying to integrate Hystrix Javanica into my existing Java EE EJB web application and am facing two issues when running it:
1. When I invoke the following service, it always returns the response from the fallback method, and the Throwable object in the fallback method holds a com.netflix.hystrix.exception.HystrixTimeoutException.
2. Each time this service is triggered, the HystrixCommand and fallback methods are called multiple times, around 50 times.
Can anyone suggest what might be wrong? Am I missing any configuration?
I am including the following libraries in my project: [screenshot of project libraries]
I have set up my aspect file as follows:
<aspectj>
    <weaver options="-verbose -showWeaveInfo"></weaver>
    <aspects>
        <aspect name="com.netflix.hystrix.contrib.javanica.aop.aspectj.HystrixCommandAspect"/>
    </aspects>
</aspectj>
Here is my config.properties file in META-INF/config.properties
hystrix.command.default.execution.timeout.enabled=false
Here is my REST service class:
@Path("/hystrix")
public class HystrixService {

    @GET
    @Path("clusterName")
    @Produces({ MediaType.APPLICATION_JSON })
    public Response getClusterName(@QueryParam("id") int id) {
        ClusterCmdBean clusterCmdBean = new ClusterCmdBean();
        String result = clusterCmdBean.getClusterNameForId(id);
        return Response.ok(result).build();
    }
}
Here is my bean class:
public class ClusterCmdBean {

    @HystrixCommand(groupKey = "ClusterCmdBeanGroup", commandKey = "getClusterNameForId", fallbackMethod = "defaultClusterName")
    public String getClusterNameForId(int id) {
        if (id > 0) {
            return "cluster" + id;
        } else {
            throw new RuntimeException("command failed");
        }
    }

    public String defaultClusterName(int id, Throwable e) {
        return "No cluster - returned from fallback: " + e.getMessage();
    }
}
Thanks for the help.
If you want to ensure you are setting the property, you can do so explicitly in the command annotation itself:
@HystrixCommand(commandProperties = {
    @HystrixProperty(name = "execution.timeout.enabled", value = "false")
})
I would only recommend this for debugging purposes though.
Something that jumps out at me is that Javanica uses AspectJ AOP, which I have never seen work with new MyBean() before; I've always had to use @Autowired with Spring or similar to allow proxying. This could well just be something that is new to me, though.
If you set a breakpoint inside getClusterNameForId, can you see in the stack trace that it's being called via reflection (which it should be, AFAIK)?
Note you can remove commandKey as this will default to the method name. Personally I would also remove groupKey and let it default to the class name.
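In other words, the command from the question could be trimmed down to roughly this (a sketch; behavior is unchanged, the keys just take their defaults):
@HystrixCommand(fallbackMethod = "defaultClusterName")
public String getClusterNameForId(int id) {
    if (id > 0) {
        return "cluster" + id;
    } else {
        throw new RuntimeException("command failed");
    }
}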
I'd like to have one class responsible for my logging. The way I understand it, that's what interceptors can be used for. Here's my interceptor:
package logging;

import java.io.Serializable;

import javax.interceptor.AroundInvoke;
import javax.interceptor.InvocationContext;

public class LoggingInterceptor implements Serializable {

    private static final long serialVersionUID = -2095670799863528243L;

    @AroundInvoke
    public Object intercept(InvocationContext context) throws Exception {
        System.out.println("before calling method: " + context.getMethod().getName());
        for (Object param : context.getParameters()) {
            System.out.println("PARAM " + param.toString());
        }
        Object result = context.proceed();
        System.out.println("after calling method: " + context.getMethod().getName());
        return result;
    }
}
I annotated various methods with the @Interceptors annotation.
My question is: How do I differentiate between the methods that are called? Depending on the method, I want to log a different message and maybe some parameters like folder names.
Right now, the only thing I can think of is a big if-elseif-else or switch statement to check the name of the method.
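Roughly something like this inside intercept() (the method names here are made-up examples):
String methodName = context.getMethod().getName();
if ("uploadFolder".equals(methodName)) {
    System.out.println("Uploading folder: " + context.getParameters()[0]);
} else if ("deleteFolder".equals(methodName)) {
    System.out.println("Deleting folder: " + context.getParameters()[0]);
} else {
    System.out.println("before calling method: " + methodName);
}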
But this seems to be poor design. Am I using the interceptor for the right purpose? And if so, how would I go about implementing logging in a clean way?
I'd like to have one class responsible for logging because I want to display the messages on the user interface as well as log them to a file. Also, I'd like to use some built-in Java EE 7 mechanism for such a cross-cutting concern.
I have a Spring controller defined like this, with two request mappings, one using localDAO and the other using dependencyDAO. The LocalDAO classes exist in my project; the DependencyDAO classes are imported via a Maven dependency:
@RestController
@PreAuthorize("hasRole('USER')")
public class MyController {

    @Autowired
    private LocalDAO localDAO; // DAO classes exist in my project

    @Autowired
    private DependencyDAO dependencyDAO; // DAO classes imported via Maven dependency

    ...

    @RequestMapping("/getUsingLocalDAO")
    private String getUsingLocalDAO(@JsonProperty("param") String param) {
        localDAO.get(param) ... // <== this is never null
    }

    @RequestMapping("/getUsingDependencyDAO")
    private String getUsingDependencyDAO(@JsonProperty("param") String param) {
        dependencyDAO.get(param) ... // <== this is always null
    }

    ...
My DAO beans are defined in another class:
@Configuration
public class DaoBeans {

    @Bean
    public LocalDAO localDAO() throws Exception {
        return new LocalDAOImpl();
    }

    @Bean
    public DependencyDAO dependencyDAO() throws Exception {
        return new DependencyDAOImpl();
    }
    ...
I am doing an $http.post from Angular like this:
$http.post('getUsingLocalDAO', $.param($scope.parameters), {
headers : {
"content-type" : "application/x-www-form-urlencoded"
}
}).success(function(data) {
...
}).error(function(data) {
...
$http.post('getUsingDependencyDAO', $.param($scope.parameters), {
headers : {
"content-type" : "application/x-www-form-urlencoded"
}
}).success(function(data) {
...
}).error(function(data) {
...
Both posts are identical except for the method they execute.
When stepping through the debugger I can see all the dao beans being created.
When I call getUsingLocalDAO everything works as expected.
But when I call getUsingDependencyDAO, every @Autowired object is null.
I believe I am getting different instances of MyController: one managed by Spring and one not, or at least not instantiated properly.
I make these calls in succession. It doesn't matter what order they are in.
I tried injecting the servlet context via @Autowired to get the bean manually, but it is always null in getUsingDependencyDAO as well.
I tried using ApplicationContextAware, and although I see the context setter being called in the debugger, the context is always null in getUsingDependencyDAO.
If I wrap the two calls in a third request mapping like so everything works well (no null objects).
#RequestMapping("/getUsingBothDAO")
private String getUsingBothDAO(
#JsonProperty("param") String param) {
getLocalDAO(param);
getDependencyDAO(param);
...
}
I am using Spring-Boot 4.1.5. My project is 100% annotation-driven and has no XML configuration. The only difference between the two request mappings is that one uses a bean from a dependency and one does not.
I have been searching for an answer to this problem for 3 days and have not found anything close to what I am experiencing.
Can anyone shed some light as to what I am doing wrong? Any help would be greatly appreciated.
Thanks.
OK, I solved the problem. My example code above is not entirely accurate: the request method that was giving me nulls was defined as a private method, while the one that worked was defined as public, as it's supposed to be. Originally the private method was not a request method, and that modifier remained after the change. I changed it to public and everything is working.
It was just a coincidence that the private method was the one from the imported project. It's curious that Spring did not throw an error that the request mapping was on a private method, or something to that effect.
Thanks to anyone who looked at this and was trying to figure it out.