How to use a mongoTemplate with a qualifier in a MongoRepository? - spring-data-mongodb

We have an in-house configuration mechanism which creates Spring beans outside of the regular XML file or Java Configuration class.
Inside this configuration, we have a mongoTemplate bean which is created with a specific qualifier, "appMongoTemplate".
We can autowire this mongoTemplate into our services like any regular Spring bean:
@Autowired
@Qualifier("appMongoTemplate")
protected MongoTemplate mongoTemplate;
Now I am trying to use a MongoRepository.
I just declare a regular interface:
public interface MyRepository extends MongoRepository<MyDocument, String>
But when my application starts, I get an exception during repository creation because it cannot find the mongoTemplate bean.
I guess I need to specify the name of the mongoTemplate to use, but I don't know how.
Thanks

Both the XML namespace and the @EnableMongoRepositories annotation have an attribute to explicitly wire the MongoTemplate bean to be used for the repositories:
<mongo:repositories mongo-template-ref="appMongoTemplate" />
or
@EnableMongoRepositories(mongoTemplateRef = "appMongoTemplate")
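In Java configuration this could look like the following sketch (the configuration class name and the base package are assumptions; mongoTemplateRef points at your qualified bean):
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

@Configuration
@EnableMongoRepositories(
        basePackages = "com.example.repositories", // assumed package containing MyRepository
        mongoTemplateRef = "appMongoTemplate")     // the name of your qualified MongoTemplate bean
public class MongoRepositoryConfig {
}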

Related

@EnableJpaRepositories annotation disables data.sql initialization script

My Spring Boot (v2.3.4) based application uses my custom library containing core entities and business logic. To use entities and repositories from this library I had to use the @EnableJpaRepositories and @EntityScan annotations with the proper packages provided.
I also wanted to initialize the database with some required data (let's say the configuration) during application startup. I found that Spring Boot allows the use of data.sql or data-${platform}.sql files to achieve that.
Long story short, when using the @EnableJpaRepositories annotation the data.sql script is not executed.
I did some digging in the code and found that when @EnableJpaRepositories is not used, the entityManagerFactory bean is of type org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean. This bean uses the org.springframework.boot.autoconfigure.orm.jpa.DataSourceInitializedPublisher bean post processor, which fires the org.springframework.boot.autoconfigure.jdbc.DataSourceSchemaCreatedEvent event indicating the schema has been created. The class that listens for this event is org.springframework.boot.autoconfigure.jdbc.DataSourceInitializerInvoker. This listener invokes the initSchema() method of org.springframework.boot.autoconfigure.jdbc.DataSourceInitializer, which is responsible for the whole initialization using the data.sql script.
It looks like setting the @EnableJpaRepositories annotation creates an instance of a different class for the entityManagerFactory bean, one which does not support this simple initialization.
My basic question is then how to make it all work with the @EnableJpaRepositories annotation. I can always use Hibernate's import.sql file (which works fine), but I'm also trying to understand what exactly is going on under the hood and how I can control it.
UPDATE 1 28.09.2021
I did further investigation and the @EnableJpaRepositories annotation does not change the instance type of entityManagerFactory, but it causes a silent exception (?) when creating the org.springframework.scheduling.annotation.ProxyAsyncConfiguration bean (during creation of the org.springframework.context.annotation.internalAsyncAnnotationProcessor bean). It looks like everything is related to the @EnableAsync annotation, which I'm also using but didn't know might be related. But it is - removing it makes the initialization work even with @EnableJpaRepositories.
UPDATE 2 28.09.2021
I've found the full explanation for my issue. There are four conditions which must be met to reproduce it:
@EnableJpaRepositories annotation in the application configuration
@EnableAsync annotation in the application configuration
The configuration class implements the AsyncConfigurer interface
Any JpaRepository (or any other bean which injects a repository) is autowired into the configuration
Enabling asynchronous execution and implementing AsyncConfigurer causes the whole configuration class to be instantiated before the regular beans. Because Spring has to inject the repository, it needs to instantiate the entityManagerFactory bean too. Spring then prints INFO-level logs like the one below:
Bean 'entityManagerFactoryBuilder' of type [org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
One of the ineligible BeanPostProcessors is DataSourceInitializedPublisher, which is responsible for firing the DataSourceSchemaCreatedEvent event. Without that event, the data-${platform}.sql script won't be processed at all.
I'm not sure what the role of @EnableJpaRepositories is in that process, but without it the problem does not occur.
Example
Minimal code to reproduce the issue (data.sql located in src/main/resources):
@Entity
public class FileStore {
...
}
public interface FileStoreRepository extends JpaRepository<FileStore, Long> {
}
@Configuration
@EnableAsync
@EnableJpaRepositories
public class Configuration implements AsyncConfigurer {
@Autowired
private FileStoreRepository fileStoreRepository;
...
}
Solutions
There are two solutions I'm aware of:
Move the AsyncConfigurer implementation, along with its overridden methods and the @EnableAsync annotation, to a separate configuration class (see the sketch after this list)
Use the @Lazy annotation on the autowired bean, like below:
@Lazy
@Autowired
private FileStoreRepository fileStoreRepository;
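A minimal sketch of the first solution, reusing the classes from the example above (the class names AppConfig and AsyncConfig are my own):
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.scheduling.annotation.AsyncConfigurer;
import org.springframework.scheduling.annotation.EnableAsync;

// keeps the repository wiring where it was...
@Configuration
@EnableJpaRepositories
class AppConfig {
    @Autowired
    private FileStoreRepository fileStoreRepository;
    // ...
}

// ...and isolates the async setup so it no longer forces early instantiation
@Configuration
@EnableAsync
class AsyncConfig implements AsyncConfigurer {
    // overridden AsyncConfigurer methods (e.g. getAsyncExecutor()) go here
}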
A similar problem was pointed out by Allen D. Ball and can be checked there.
This behavior has changed; take a look at the how-to guide.
Add:
spring.jpa.defer-datasource-initialization: true

Spring AOP PointCut only working when Bean defined in ApplicationContext

I am wondering why my pointcuts in Spring AOP only work if I explicitly declare the beans containing the join points in the application context XML.
Normally in my project, all Spring beans are defined via annotations:
@Service
@Component
configured with
<context:component-scan base-package="my.package.base" scoped-proxy="interfaces" />
<context:annotation-config />
The beans get created and are usable throughout my application, but the pointcut is not triggered.
When I specify the bean manually in my application context with
<bean class="..." />
The pointcut is matched and the corresponding advice is executed.
@Pointcut("execution(* my.package.base..*.update*(..))")
public void updateDataPointcut() {}
AOP is configured in the application context with
<aop:aspectj-autoproxy />
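For reference, the aspect described above would typically be declared roughly like this (the class name and the @Before advice are my assumptions; the pointcut expression is the one from the question):
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.aspectj.lang.annotation.Pointcut;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class UpdateDataAspect {

    // matches any update* method in my.package.base and its subpackages
    @Pointcut("execution(* my.package.base..*.update*(..))")
    public void updateDataPointcut() {}

    // assumed advice: log every matched call
    @Before("updateDataPointcut()")
    public void beforeUpdate(JoinPoint joinPoint) {
        System.out.println("About to execute " + joinPoint.getSignature());
    }
}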
I have also created a pointcut for Spring Data JPA CrudRepository which works fine.
What is the difference?
Is there a pitfall in the "component-scan" configuration?

camel-cdi how to not auto start CamelContext and not auto discover RouteBuilder

In previous projects I often used Guice, also in conjunction with Camel. My approach was to extend Camel's Main class and inject my preconfigured context there.
I needed to control the start of the context. Before starting the context I did some preparation (e.g. starting hawtio and other setup stuff).
I did the same with RouteBuilder. One central RouteBuilder set up things like onException, added RoutePolicies, configured autostart on other routes, and of course added all the other routes.
In the meantime I have learned to love CDI, and Camel's CDI support in 2.17 (and Fuse 6.3) seems to be complete.
So what would be a good approach with camel-cdi to control the start of the Camel context (deployed as an OSGi bundle on Fuse)?
How to disable or control autodiscovery of RouteBuilder (and/or other stuff)?
So what would be a good approach with camel-cdi to control the start of the Camel context (deployed as an OSGi bundle on Fuse)?
Camel CDI always starts the auto-configured Camel contexts. That being said, it is possible to customise these so that routes are not started, for example by declaring a @PostConstruct lifecycle callback:
@ApplicationScoped
class CustomCamelContext extends DefaultCamelContext {
@PostConstruct
void customize() {
setAutoStartup(false);
}
}
In that example, the routes added to that Camel context won't be started along with the context.
This respects the Camel principle of starting the context, with all the validation that is done at that stage, yet without starting the routing.
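When the routes should eventually be started, the plain CamelContext API can be used; a minimal sketch, assuming a route with the id "myRoute" was added to the context above:
import javax.inject.Inject;
import org.apache.camel.CamelContext;

class RouteStarter {

    @Inject
    CamelContext context;

    void startRoutingWhenReady() throws Exception {
        // the routes were added but, because of setAutoStartup(false), not started;
        // start them individually by id once the preparation work is done
        context.startRoute("myRoute"); // "myRoute" is an assumed route id
    }
}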
How to disable or control autodiscovery of RouteBuilder (and/or other stuff)?
The RoutesBuilder beans qualified with @ContextName are automatically added to the corresponding CamelContext beans by Camel CDI. If no such CamelContext bean exists, it gets automatically created. On the other hand, RoutesBuilder beans qualified with user-defined qualifiers do not trigger the automatic creation of any CamelContext beans. That can be used for Camel routes that may be required to be added later during the application execution. For example with:
@DoNotDiscover
class MyRouteBuilder extends RouteBuilder {
// ...
}
If no Camel context bean qualified with @DoNotDiscover is explicitly declared, the MyRouteBuilder bean won't be auto-discovered. Still it can be used later during the application execution, e.g.:
@Inject
@DoNotDiscover
Instance<RouteBuilder> routes;
@Inject
CamelContext context;
// add the deferred routes to the context when needed
for (RouteBuilder route : routes)
    context.addRoutes(route);

Spring MVC annotation with XML performance

I had previously used Spring MVC and Hibernate annotations in my Google web application project. It takes some time to start the application after deployment.
For that reason, I am switching to a Spring MVC XML-based approach for the controller only. However, for the service and DAO classes, the @Service and @Repository annotations remain as is.
In my Spring XML I am doing the following (there is no bean tag defined for the service and DAO classes):
<bean class="org.springframework.web.servlet.mvc.support.ControllerClassNameHandlerMapping" />
<bean class="org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter" />
<bean class="com.my.controller.UserController">
<property name="domainManager" ref="domainManager"/>
<property name="userProfileDao" ref="userProfileDao"/>
</bean>
Inside UserController, I am not using any @Autowired annotation. I am using a combination of annotations and XML. Are there any drawbacks to this approach? Am I going about this the wrong way?
The difference is not between using annotations or XML; it's between autowiring and "manually injecting beans".
EDIT: @Autowired and XML component scan are doing the same thing.
You can "manually inject" beans with both XML and full Java @Configuration; the equivalent of your example would be:
@Configuration
public class WebAppConfig {
@Bean
public UserDao userDao() {
return new UserDao();
}
@Bean
public UserController userController() {
UserController ctrl = new UserController();
ctrl.setUserDao(userDao());
return ctrl;
}
}
The question is quite relevant, because the App Engine team itself has revealed that the App Engine runtime is bad at classpath scanning (which autowiring does to find matches by class).
The performance loss at instance startup time would occur if you were doing :
public class UserController {
@Autowired
private UserDao userDao;
// ...
}
See this video, especially the question from the Pivotal (Spring framework) contributor: http://www.youtube.com/watch?v=lFarE1hH0ss
Few people know about this issue. Using Spring AOP can even totally crash on the production runtime. See: Using Spring AOP on App Engine causes StackOverflowError
So about your use of XML, there is no "right or wrong". Personally I don't like writing XML, since I feel it's more error prone, but some people like to clearly separate their configuration from their code. I still use autowiring in production, since the startup time is not an issue for me. Do what you and your team feel comfortable with, just keep in mind the GAE limitations.

factory and dao to support some database with spring

Currently we have an application that uses Spring and supports MySQL.
Some people prefer to use Oracle.
So I am looking for a way, with Spring, to have an abstract factory with a factory for every database, each one having its own DAO.
How do I put the glue between all these components?
How does a component know which DataSource needs to be used?
Is there a good practice with Spring for doing this?
It's not clear what exactly your problem is, but Spring profiles are the answer to all of these questions. First you need to define two DataSource beans, one for each supported database:
<bean id="oracleDataSource" class="..." profile="oracle">
<!-- -->
</bean>
<bean id="mysqlDataSource" class="..." profile="mysql">
<!-- -->
</bean>
Note the profile attribute. Actually, you could probably get away with simply parametrizing one data source to use a different JDBC URL and driver, but that doesn't matter here.
Now you define two versions of each DAO: one for Oracle and one for MySQL:
interface MonkeyDao {
//...
}
@Repository
@Profile("oracle")
class OracleMonkeyDao implements MonkeyDao {
//...
}
@Repository
@Profile("mysql")
class MySqlMonkeyDao implements MonkeyDao {
//...
}
As you can see, you have two beans defined implementing the same interface. If you do it without profiles and then autowire them:
@Resource
private MonkeyDao monkeyDao;
Spring startup will fail due to an ambiguous dependency (two matching beans). But if you enable one of the profiles (either mysql or oracle), Spring will only instantiate and create the bean for the matching profile.
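To actually select the database at startup, one of the profiles has to be activated; a minimal sketch, assuming the beans above are defined in app-context.xml:
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class Main {
    public static void main(String[] args) {
        // load the XML configuration without refreshing yet,
        // activate the desired profile, then refresh the context
        ClassPathXmlApplicationContext ctx =
                new ClassPathXmlApplicationContext(new String[] {"app-context.xml"}, false);
        ctx.getEnvironment().setActiveProfiles("oracle"); // or "mysql"
        ctx.refresh();
        // alternatively, start the JVM with -Dspring.profiles.active=oracle
    }
}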
