I'm deploying a Spring Batch job triggered by a Camel route. Here is the Spring Batch config:
@Configuration
@EnableBatchProcessing
public class JobConfig
{
...
@Bean(name = "personJob")
public Job personJob(JobCompletionNotificationListener personListener, Step personStep)
{
return jobBuilderFactory
.get(...)
.incrementer(new RunIdIncrementer())
.listener(...)
.flow(...)
.end()
.build();
}
...
The Camel route looks like this:
@ApplicationScoped
public class MyRouteBuilder extends RouteBuilder
{
@Override
public void configure() throws Exception
{
from("file://...")
...
.to("spring-batch:personJob?jobLauncherRef=jobLauncher");
}
}
Running the route above raises the following exception:
[ERROR] Caused by: org.apache.camel.ResolveEndpointFailedException: Failed to resolve endpoint: spring-batch://personJob?jobLauncherRef=jobLauncher due to: No JobLauncher named jobLauncher found in the registry.
[ERROR] Caused by: java.lang.IllegalStateException: No JobLauncher named jobLauncher found in the registry.
However, the documentation clearly states:
The @EnableBatchProcessing works similarly to the other @Enable*
annotations in the Spring family. In this case, @EnableBatchProcessing
provides a base configuration for building batch jobs. Within this
base configuration, an instance of StepScope is created in addition to
a number of beans made available to be autowired:
JobRepository: bean name "jobRepository"
JobLauncher: bean name "jobLauncher"
...
So, there should be a bean named "jobLauncher" of the type JobLauncher. Why isn't it found in the registry?
Many thanks in advance,
Seymour
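For reference, the error shows the spring-batch endpoint resolving jobLauncherRef against the Camel registry rather than directly against the Spring ApplicationContext, so visibility depends on how the CamelContext is wired to the surrounding container (note the route builder above is @ApplicationScoped, i.e. CDI-managed). A rough sketch of binding the launcher by hand, assuming Camel 3.x (where Registry.bind() is available) and assuming the launcher can be injected into the route builder:
public class MyRouteBuilder extends RouteBuilder
{
    @Inject // assumption: the Spring-created JobLauncher is injectable here
    JobLauncher jobLauncher;

    @Override
    public void configure() throws Exception
    {
        // make the launcher visible under the name the endpoint looks up
        getContext().getRegistry().bind("jobLauncher", jobLauncher);

        from("file://...")
            .to("spring-batch:personJob?jobLauncherRef=jobLauncher");
    }
}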
Related
I'm trying to deploy to App Engine, but I'm seeing this error in the logs:
Uncaught exception from servlet
com.google.inject.ProvisionException: Unable to provision, see the following errors:
1) Error in custom provider, com.google.inject.OutOfScopeException: Cannot access scoped [sc.analysis.metrics.Metrics]. Either we are not currently inside an HTTP Servlet request, or you may have forgotten to apply com.google.inject.servlet.GuiceFilter as a servlet filter for this request.
at sc.analysis.metrics.MetricsModule.configure(MetricsModule.java:13)
while locating sc.analysis.metrics.Metrics
at sc.geo.management.geo.api.GeoAdminAPIv2.<init>(GeoAdminAPIv2.java:124)
while locating sc.geo.management.geo.api.GeoAdminAPIv2
1 error
at com.google.inject.internal.InternalProvisionException.toProvisionException(InternalProvisionException.java:226)
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1053)
at com.google.inject.spi.ProviderLookup$1.get(ProviderLookup.java:111)
at com.google.api.server.spi.guice.ServiceMap.get(ServiceMap.java:68)
at com.google.api.server.spi.guice.GuiceEndpointsServlet.createService(GuiceEndpointsServlet.java:36)
at com.google.api.server.spi.EndpointsServlet.createSystemService(EndpointsServlet.java:136)
at com.google.api.server.spi.EndpointsServlet.init(EndpointsServlet.java:57)
at com.google.inject.servlet.ServletDefinition.init(ServletDefinition.java:121)
at com.google.inject.servlet.ManagedServletPipeline.init(ManagedServletPipeline.java:82)
at com.google.inject.servlet.ManagedFilterPipeline.initPipeline(ManagedFilterPipeline.java:103)
at com.google.inject.servlet.GuiceFilter.init(GuiceFilter.java:220)
at org.eclipse.jetty.servlet.FilterHolder.initialize(FilterHolder.java:140)
at org.eclipse.jetty.servlet.ServletHandler.lambda$initialize$0(ServletHandler.java:731)
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
at java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:742)
at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:755)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:379)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1449)
at com.google.apphosting.runtime.jetty94.AppEngineWebAppContext.startWebapp(AppEngineWebAppContext.java:274)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1414)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:916)
at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:288)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:524)
at com.google.apphosting.runtime.jetty94.AppEngineWebAppContext.doStart(AppEngineWebAppContext.java:218)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
at com.google.apphosting.runtime.jetty94.AppVersionHandlerFactory.doCreateHandler(AppVersionHandlerFactory.java:178)
at com.google.apphosting.runtime.jetty94.AppVersionHandlerFactory.createHandler(AppVersionHandlerFactory.java:112)
at com.google.apphosting.runtime.jetty94.AppVersionHandlerMap.getHandler(AppVersionHandlerMap.java:82)
at com.google.apphosting.runtime.jetty94.JettyServletEngineAdapter.serviceRequest(JettyServletEngineAdapter.java:167)
at com.google.apphosting.runtime.RequestRunner.dispatchServletRequest(RequestRunner.java:264)
at com.google.apphosting.runtime.RequestRunner.dispatchRequest(RequestRunner.java:229)
at com.google.apphosting.runtime.RequestRunner.run(RequestRunner.java:194)
at com.google.apphosting.runtime.ThreadGroupPool$PoolEntry.run(ThreadGroupPool.java:273)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.inject.OutOfScopeException: Cannot access scoped [sc.analysis.metrics.Metrics]. Either we are not currently inside an HTTP Servlet request, or you may have forgotten to apply com.google.inject.servlet.GuiceFilter as a servlet filter for this request.
at com.google.inject.servlet.GuiceFilter.getContext(GuiceFilter.java:165)
at com.google.inject.servlet.GuiceFilter.getOriginalRequest(GuiceFilter.java:147)
at com.google.inject.servlet.ServletScopes$1$1.get(ServletScopes.java:107)
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:39)
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1050)
at sc.analysis.metrics.StaticMetricsHolder.get(StaticMetricsHolder.java:27)
at sc.analysis.metrics.StaticMetricsHolder.get(StaticMetricsHolder.java:19)
at sc.util.ScDatastore.findEntity(ScDatastore.java:765)
at sc.util.ScDatastore.findEntity(ScDatastore.java:747)
at sc.util.Datastore.findEntity(Datastore.java:289)
at picaboo.entity.util.RegistryEntities.findRegistryEntity(RegistryEntities.java:98)
at picaboo.entity.util.RegistryEntities.findRegistryEntity(RegistryEntities.java:94)
at picaboo.entity.util.RegistryEntities.findOrCreateRegistryEntity(RegistryEntities.java:116)
at sc.registry.RegistrySetting.getEntity(RegistrySetting.java:125)
at sc.registry.RegistrySetting.getUncachedValue(RegistrySetting.java:207)
at sc.registry.RegistrySetting.fetchLatestValue(RegistrySetting.java:186)
at sc.registry.RegistrySetting.updateIfNecessary(RegistrySetting.java:151)
at sc.registry.RegistrySetting.getValue(RegistrySetting.java:196)
at sc.registry.RegistrySetting.getValue(RegistrySetting.java:35)
at sc.registry.ConvertedSetting.reloadIfNecessary(ConvertedSetting.java:62)
at sc.registry.ConvertedSetting.getValue(ConvertedSetting.java:34)
at sc.geo.management.geo.api.AdminApiIngestion.<init>(AdminApiIngestion.java:62)
at sc.geo.management.geo.api.AdminApiIngestion.<init>(AdminApiIngestion.java:47)
at sc.geo.management.geo.api.GeoAdminAPI.<init>(GeoAdminAPI.java:327)
at sc.geo.management.geo.api.GeoAdminAPIv2.<init>(GeoAdminAPIv2.java:124)
at sc.geo.management.geo.api.GeoAdminAPIv2$$FastClassByGuice$$855da3d.newInstance(<generated>)
at com.google.inject.internal.DefaultConstructionProxyFactory$FastClassProxy.newInstance(DefaultConstructionProxyFactory.java:89)
at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:114)
at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:91)
at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:306)
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1050)
... 33 more
This error is thrown when the servlet starts.
It's always the Metrics module, but the module's bindings look correct:
package sc.analysis.metrics;
import com.google.inject.AbstractModule;
import com.google.inject.Singleton;
import com.google.inject.servlet.RequestScoped;
public class MetricsModule extends AbstractModule {
@Override
protected void configure() {
bind(Metrics.class).to(MetricsImpl.class).in(RequestScoped.class);
bind(GlobalMetrics.class).to(GlobalMetricsImpl.class).in(Singleton.class);
requestStaticInjection(StaticMetricsHolder.class);
requestStaticInjection(StaticGlobalMetricsHolder.class);
requestStaticInjection(ScopeSafeMetricsHolder.class);
}
}
None of the other injections seem to have issues; is there something I'm missing? I don't know much about Guice, to be honest, but the code that calls the Metrics (GeoAdminAPI) uses a provider:
public class GeoAdminAPIv2 extends GeoAdminAPI {
@Inject
GeoAdminAPIv2(...,
final Provider<Metrics> metrics,
...)
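The stack trace above shows the request-scoped Metrics actually being resolved during construction: AdminApiIngestion's constructor reaches StaticMetricsHolder.get() while the servlet is still initializing, i.e. outside any HTTP request. A Provider only helps if get() is deferred until a request is active; a minimal sketch of the distinction (the class and the handle() method are simplified stand-ins, not the real API):
import com.google.inject.Inject;
import com.google.inject.Provider;
import sc.analysis.metrics.Metrics;

public class GeoAdminExample {
    private final Provider<Metrics> metrics;

    @Inject
    GeoAdminExample(final Provider<Metrics> metrics) {
        this.metrics = metrics;       // fine: nothing request-scoped is resolved yet
        // metrics.get() here would throw OutOfScopeException during servlet init
    }

    public void handle() {
        Metrics m = metrics.get();    // fine when called while an HTTP request is active
        // ... use m ...
    }
}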
I am having a number of type conversion issues using the Java DSL with Camel 3.14.3. For a simple example I have a route that uses a direct endpoint to trigger a pollEnrich for a file endpoint.
public class BasicRoute extends RouteBuilder {
@Override
public void configure() {
from("direct:test")
.pollEnrich("file://watchDirectory", 10000)
.to("mock:result");
}
}
When the route starts I get the following exception...
Exception in thread "main" org.apache.camel.FailedToCreateRouteException: Failed to create route route1 at: >>> PollEnrich[constant{file://watchDirectory}] <<< in route: Route(route1)[From[direct:test] -> [PollEnrich[constant{file... because of Error parsing [10000] as a java.time.Duration.
...
Caused by: org.apache.camel.NoTypeConversionAvailableException: No type converter available to convert from type: java.lang.String to the required type: java.time.Duration with value 10000
I am running this within a simple plain Java app, so I am sure I am missing something in the context initialization, but I cannot find it.
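For reference, a minimal standalone bootstrap for such a route looks roughly like this (a sketch assuming camel-core is on the classpath; the class name and sleep duration are arbitrary). It does not by itself explain the Duration conversion failure, but it is the baseline context initialization for a plain Java app:
import org.apache.camel.CamelContext;
import org.apache.camel.impl.DefaultCamelContext;

public class Main {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addRoutes(new BasicRoute());   // the route from the question
        context.start();
        Thread.sleep(60_000);                  // keep the JVM alive so the route can run
        context.stop();
    }
}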
I am creating an application using Apache Camel to transfer messages from AMQP to Kafka. Code can also be seen here - https://github.com/prashantbhardwaj/qpid-to-kafka-using-camel
I thought of creating it as a standalone Spring Boot app using the Spring, AMQP, and Kafka starters. I created a route like:
@Component
public class QpidToKafkaRoute extends RouteBuilder {
public void configure() throws Exception {
from("amqp:queue:destinationName")
.to("kafka:topic");
}
}
And SpringBoot application configuration is
@SpringBootApplication
public class CamelSpringJmsKafkaApplication {
public static void main(String[] args) {
SpringApplication.run(CamelSpringJmsKafkaApplication.class, args);
}
@Bean
public JmsConnectionFactory jmsConnectionFactory(@Value("${qpidUser}") String qpidUser, @Value("${qpidPassword}") String qpidPassword, @Value("${qpidBrokerUrl}") String qpidBrokerUrl) {
JmsConnectionFactory jmsConnectionFactory = new JmsConnectionFactory(qpidUser, qpidPassword, qpidBrokerUrl);
return jmsConnectionFactory;
}
@Bean
@Primary
public CachingConnectionFactory jmsCachingConnectionFactory(JmsConnectionFactory jmsConnectionFactory) {
CachingConnectionFactory cachingConnectionFactory = new CachingConnectionFactory(jmsConnectionFactory);
return cachingConnectionFactory;
}
}
The jmsConnectionFactory bean created with the Spring @Bean annotation should be picked up by the AMQP starter and injected into the route, but that is not happening. When I start the application, I get the following exception:
org.apache.camel.FailedToStartRouteException: Failed to start route route1 because of Route(route1)[From[amqp:queue:destinationName] -> [To[kafka:.
Caused by: java.lang.IllegalArgumentException: connectionFactory must be specified
If I am not wrong, the connectionFactory should be created automatically if I pass the right properties in the application.properties file.
My application.properties file looks like :
camel.springboot.main-run-controller = true
camel.component.amqp.enabled = true
camel.component.amqp.connection-factory = jmsCachingConnectionFactory
camel.component.amqp.async-consumer = true
camel.component.amqp.concurrent-consumers = 1
camel.component.amqp.map-jms-message = true
camel.component.amqp.test-connection-on-startup = true
camel.component.kafka.brokers = localhost:9092
qpidBrokerUrl = amqp://localhost:5672?jms.username=guest&jms.password=guest&jms.clientID=clientid2&amqp.vhost=default
qpidUser = guest
qpidPassword = guest
Could you please suggest why the connectionFactory bean is not being used during auto-configuration? When I debug the code, I can clearly see that the connectionFactory bean is being created.
I can also see one more log line:
CamelContext has only been running for less than a second. If you intend to run Camel for a longer time then you can set the property camel.springboot.main-run-controller=true in application.properties or add spring-boot-starter-web JAR to the classpath.
However, as you can see in my application.properties file above, the required property is present on the very first line.
There is one more log line I can see at the beginning of application startup:
[main] trationDelegate$BeanPostProcessorChecker : Bean 'org.apache.camel.spring.boot.CamelAutoConfiguration' of type [org.apache.camel.spring.boot.CamelAutoConfiguration] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
Is this log line suggesting anything?
Note: interestingly, exactly the same code was running fine last night; I only restarted my desktop, not a single word has changed, and now it throws this exception.
This just refers to an interface
camel.component.amqp.connection-factory = javax.jms.ConnectionFactory
Instead it should refer to an existing factory instance, such as
camel.component.amqp.connection-factory = #myFactory
which you can set up via the Spring Boot @Bean annotation style.
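Applied to the configuration in the question, where the factory bean method is named jmsCachingConnectionFactory, that would look something like this (the # prefix tells Camel to look up the bean by name in the registry):
camel.component.amqp.connection-factory = #jmsCachingConnectionFactory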
I am trying to add a custom SFTP component in Apache Camel to wrap the username, host, port and password in a configuration object to be passed to an SftpComponent.
Below is the code that I have tried:
@Configuration
class SftpConfig {
#Bean("sourceSftp")
public SftpComponent getSourceSftpComponent(
#Qualifier("sftpConfig")
SftpConfiguration sftpConfig) throws Exception{
SftpComponent sftpComponent = new SftpComponent();
// not getting way to set the configuration
return sftpComponent;
}
#Bean("sftpConfig")
public SftpConfiguration getSftpConfig(
#Value("${host}") String host,
#Value("${port}") int port,
#Value("${applicationUserName}") String applicationUserName,
#Value("${password}") String password) {
SftpConfiguration sftpConfiguration = new SftpConfiguration();
sftpConfiguration.setHost(host);
sftpConfiguration.setPort(port);
sftpConfiguration.setUsername(applicationUserName);
sftpConfiguration.setPassword(password);
return sftpConfiguration;
}
}
//In other class
from("sourceSftp:<path of directory>") ---custom component
A similar approach with JmsComponent works fine, where I have created a bean for sourcejms, but I am not able to do it for SFTP because SftpComponent doesn't have a setter for SftpConfiguration.
The Camel maintainers seem to be moving away from providing individual components with a "setXXXConfiguration" method to configure their properties. The "approved" method of providing properties -- which works with the SFTP component -- is to specify them on the connection URL:
from ("sftp://host:port/foo?username=foo&password=bar")
.to (....)
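If hard-coding credentials in the URI is a concern, Camel property placeholders (the same {{...}} syntax used in a later answer here) can pull them from a properties file; the property names below are only examples:
from("sftp://{{sftp.host}}:{{sftp.port}}/foo?username={{sftp.user}}&password={{sftp.password}}")
    .to("log:foo")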
An alternative approach is to instantiate an endpoint and set its properties, and then use a reference to the endpoint in the from() call. There are a gazillion ways of configuring Camel -- this works for me for XML-based configuration:
<endpoint id="fred" uri="sftp://acme.net/test/">
<property key="username" value="xxxxxxx"/>
<property key="password" value="yyyyyyy"/>
</endpoint>
<route>
<from uri="fred"/>
<to uri="log:foo"/>
</route>
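A rough Java DSL equivalent of that XML endpoint definition, assuming it runs inside a RouteBuilder's configure() and that camel-ftp's SftpEndpoint is on the classpath:
SftpEndpoint fred = getContext().getEndpoint("sftp://acme.net/test/", SftpEndpoint.class);
SftpConfiguration cfg = (SftpConfiguration) fred.getConfiguration();
cfg.setUsername("xxxxxxx");
cfg.setPassword("yyyyyyy");

from(fred).to("log:foo");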
You can customize it by extending the SftpComponent. This allows you to define multiple endpoints without providing the username/password for each endpoint definition.
Step 1: Extend SftpComponent and give your component a custom name, e.g. customSftp
#Component("customSftp")
public class CustomSftpComponent extends SftpComponent {
private static final Logger LOG = LoggerFactory.getLogger(CustomSftpComponent.class);
#Value("${sftp.username}")
private String username;
#Value("${sftp.password}")
private String password;
@SuppressWarnings("rawtypes")
protected void afterPropertiesSet(GenericFileEndpoint<SftpRemoteFile> endpoint) throws Exception {
SftpConfiguration config = (SftpConfiguration) endpoint.getConfiguration();
config.setUsername(username);
config.setPassword(password);
}
}
Step 2: Create a camel route to poll 2 different folders using your custom component name.
@Component
public class PollSftpRoute extends RouteBuilder {
@Override
public void configure() throws Exception {
from("{{sftp.endpoint1}}").routeId("pollSftpRoute1")
.log(LoggingLevel.INFO, "Downloaded file from input folder 1.")
.to("file:data/out1");
from("{{sftp.endpoint2}}").routeId("pollSftpRoute2")
.log(LoggingLevel.INFO, "Downloaded file from input folder 2.")
.to("file:data/out2");
}
}
Step 3: Place this in application.properties
camel.springboot.main-run-controller=true
sftp.endpoint1=customSftp://localhost.net/input/1?delay=30s
sftp.endpoint2=customSftp://localhost.net/input/2?delay=30s
sftp.username=sftp_user1_l
sftp.password=xxxxxxxxxxxx
With this you don't have to repeat the username/password for each endpoint.
Note: With this approach you won't be able to set the username/password in the URI endpoint configuration. Anything you set in the URI will be replaced in afterPropertiesSet.
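If you do want values set in the URI to take precedence, a small variation (an untested sketch) only falls back to the injected properties when the URI left them empty:
@SuppressWarnings("rawtypes")
protected void afterPropertiesSet(GenericFileEndpoint<SftpRemoteFile> endpoint) throws Exception {
    SftpConfiguration config = (SftpConfiguration) endpoint.getConfiguration();
    if (config.getUsername() == null) {
        config.setUsername(username);   // only fill in when the URI did not provide one
    }
    if (config.getPassword() == null) {
        config.setPassword(password);
    }
}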
In a Java project, I am using Spring Boot 1.5.3.RELEASE. It connects to two databases, MongoDB and Microsoft SQL Server. When I run it with the spring-boot:run goal, it works fine. However, when I try to run it with the package goal, the error below is reported by the test cases, even though those test cases do not connect to the SQL Server database:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoMatchingBeanFound(DefaultListableBeanFactory.java:1486)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1104)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1066)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:835)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:741)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:467)
.....
.....
MediationTest.java (Java class containing test cases generating above error)
@RunWith(SpringRunner.class)
@DataMongoTest(excludeAutoConfiguration = EmbeddedMongoAutoConfiguration.class)
@SpringBootTest(classes = { Application.class })
public class MediationTest {
@Autowired
private SwiftFormat swiftFormat;
......................
......................
MsqlDbConfig.java
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(entityManagerFactoryRef = "msqlEntityManagerFactory", transactionManagerRef = "msqlTransactionManager", basePackages = { "com.msql.data" })
public class MsqlDbConfig {
#Bean(name = "msqlDataSource")
#ConfigurationProperties(prefix = "msql.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "msqlEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean msqlEntityManagerFactory(
EntityManagerFactoryBuilder builder,
#Qualifier("msqlDataSource") DataSource dataSource) {
return builder.dataSource(dataSource)
.packages("com.utils.msql.info")
.persistenceUnit("msql").build();
}
#Bean(name = "msqlTransactionManager")
public PlatformTransactionManager msqlTransactionManager(
#Qualifier("msqlEntityManagerFactory") EntityManagerFactory msqlEntityManagerFactory) {
return new JpaTransactionManager(msqlEntityManagerFactory);
}
}
application.properties
spring.data.mongodb.uri=mongodb://dev-abc-123:27017/db
msql.datasource.url=jdbc:sqlserver://ABC-SQL14-WXX;databaseName=dev
msql.datasource.username=dev
msql.datasource.password=*****
msql.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
msql.jpa.hibernate.dialect=org.hibernate.dialect.SQLServer2012Dialect
spring.jpa.hibernate.naming_strategy=org.hibernate.cfg.EJB3NamingStrategy
spring.jpa.show-sql=true
The spring-boot:run goal is defined by the Mojo included within the spring-boot-maven-plugin project. You can find it here: https://github.com/spring-projects/spring-boot/blob/8e3baf3130220a331d540cb07e1aca263b721b38/spring-boot-tools/spring-boot-maven-plugin/src/main/java/org/springframework/boot/maven/RunMojo.java
The requiresDependencyResolution scope is set to Test. This will include the dependencies from each phase on the classpath. Take a look at the specification here: https://maven.apache.org/developers/mojo-api-specification.html
The package goal provided by Maven wouldn't include these additional dependencies on the classpath, and I believe that is the cause of your issues.
Spring Boot provides a repackage goal which is what should be used for building out executable spring-boot applications.
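For reference, the repackage goal is typically wired into the build through the spring-boot-maven-plugin in pom.xml; a generic (not project-specific) sketch:
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>repackage</goal>
            </goals>
        </execution>
    </executions>
</plugin>
When the project inherits from spring-boot-starter-parent, declaring the plugin is enough; the repackage execution is preconfigured there.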
However, to get more to the point: I think that if you update your test to exclude an additional auto-configuration class, it might fix your problem.
@DataMongoTest(excludeAutoConfiguration = {EmbeddedMongoAutoConfiguration.class, HibernateJpaAutoConfiguration.class})