File component (Apache Camel) delete=true parameter not working on Windows - apache-camel

The Apache Camel file component is not working properly on Windows 7, whereas on Linux it works without any problem.
My Requirement:
After processing, the files must be deleted from the directory. On Windows, the files are not deleted cleanly because of the .camelLock file: only after multiple attempts can Apache Camel delete a file from the directory, and if an attempt to delete the file fails, it throws an exception.
If there is only one file in the directory it works without any problem, but with multiple files it throws an exception.
Application Environment:
I deployed the Apache Camel application on a Tomcat server.
Apache Camel version: 2.17.1
Apache Camel Route:
<from uri="file:///var/opt/irs/message?delete=true" />
<to uri="direct:file.storage.original" />
On Windows I am receiving the following error:
1|2017-11-14 17:56:34,828|11-01-41|default|WARN
|yes||o.a.c.c.f.GenericFileOnCompletion|file.analysis.input|Error
during commit. Exchange[ID-51741-1510678404569-9-22]. Caused by:
[org.apache.camel.component.file.GenericFileOperationFailedException -
Cannot delete file: GenericFile[C:\var\opt\irs\message\661.zip]]
org.apache.camel.component.file.GenericFileOperationFailedException:
Cannot delete file: GenericFile[C:\var\opt\irs\message\661.zip] at
org.apache.camel.component.file.strategy.GenericFileDeleteProcessStrategy.commit(GenericFileDeleteProcessStrategy.java:89)
at
org.apache.camel.component.file.GenericFileOnCompletion.processStrategyCommit(GenericFileOnCompletion.java:127)
Similar Problems:
Camel 2.15 file locks,
Camel 2.14.2 not deleting files on Windows,
deleting moving files

In this case I forgot to close the file input stream, which is why I had the problem on Windows. I was using an InputStream unnecessarily for the file component; after removing it, everything works fine. The offending line was:
InputStream input = CamelContextHelper.convertTo(context, InputStream.class, body);
If anybody has the same problem, close the file InputStream properly, or cross-check your code and try to replace the stream with a proper alternative.
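If a stream is genuinely needed, closing it deterministically releases the file handle that blocks deletion on Windows. A minimal sketch using try-with-resources in a route processor (the class name and body handling are illustrative, not the original code):

import java.io.InputStream;
import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.util.CamelContextHelper;

// Hypothetical processor: reads the file body as a stream and guarantees it is closed.
public class FileContentProcessor implements Processor {
    @Override
    public void process(Exchange exchange) throws Exception {
        Object body = exchange.getIn().getBody();
        // try-with-resources closes the stream (and the underlying file handle) even on exceptions
        try (InputStream input = CamelContextHelper.convertTo(
                exchange.getContext(), InputStream.class, body)) {
            // ... consume input here ...
        }
    }
}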
Alternative Solution:
In case you are unable to find the open streams in your code, apply the parameters below to the Apache Camel route; performance should improve.
readLockCheckInterval=1&readLockTimeout=3
In my case performance was much better on Windows.
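For illustration, the route from the question with those options appended (the values are the ones above, in milliseconds; note that in Spring XML the & between options must be escaped as &amp;):

<from uri="file:///var/opt/irs/message?delete=true&amp;readLockCheckInterval=1&amp;readLockTimeout=3" />
<to uri="direct:file.storage.original" />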

Related

Flink logging - Using Log4j2

We are running a Flink (1.9.1) application on AWS EMR (5.29) using YARN. We use a common logging adaptor throughout all the components in our project (including the Flink application), and it uses Log4j2.
From the documentation, I see that there are three configuration files:
log4j.properties
log4j-yarn-session.properties
log4j-cli.properties
I understand that I will have to modify log4j.properties for the job manager and task manager logs, and log4j-cli.properties for code that is not part of the cluster execution.
Given this situation:
How do I pass my log4j2.properties?
Do we replace the logging jars in the lib folder with log4j2 jars?
Not a solid solution, but a workaround: if the log4j.properties file in the /conf folder is deleted, the Log4j2 configuration file inside the JAR on the classpath is used instead. Be careful, though, when multiple JARs on the classpath carry a log4j2 properties file.
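As a sketch of that workaround, a minimal log4j2.properties bundled in the job JAR (all names below are illustrative, not from the original setup) might look like:

rootLogger.level = INFO
rootLogger.appenderRef.console.ref = ConsoleAppender

appender.console.type = Console
appender.console.name = ConsoleAppender
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{HH:mm:ss,SSS} %-5p %-60c %x - %m%n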

chmod option from Apache Camel SFTP component doesn't work when deploying the route on OpenShift

I have an Apache Camel route that polls an SFTP server to process files. I need to change the permissions of a file on the SFTP server, so I added the chmod option from the Apache Camel SFTP component documentation, but it did not change the file's permissions, and the last step of the processing fails no matter whether the earlier steps finished OK. All of this works locally but does not work when deployed on OpenShift.
I am using the chmod option as follows:
sftp://AAAA#BBBB/PATH/TO/procesado?password=XXXX&maxMessagesPerPoll=15&delay=30000&fileName=${header.CamelFileName}&chmod=755
Can someone tell me why it doesn't work on OpenShift, or whether there is a way to make it work?

How to read a property file from different jar file resource in Apache Camel

Is there any technique to read a property file from a different JAR file's resource folder in Apache Camel?
I have two projects: one is ESB-tracker and the other is ESB-common. queue-config.properties is placed under ESB-common/pk/com/herman/common/resources/.
ESB-common is a simple Java project and ESB-tracker is a Fuse integration project in which I'm trying to read this property file using the line below.
<propertyPlaceholder id="properties" location="pk/com/herman/common/resources/queue-config.properties"/>
I have deployed both JAR files in JBoss Fuse but got the exception below:
Caused by: java.io.FileNotFoundException: Properties file pk/com/herman/common/resources/queue-config.properties not found in classpath
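Not an answer from this thread, but for reference: Camel's propertyPlaceholder resolves locations from the classpath by default and also accepts an explicit classpath: prefix, so a variant worth trying (assuming ESB-common really does export that package on the bundle's classpath) would be:

<propertyPlaceholder id="properties" location="classpath:pk/com/herman/common/resources/queue-config.properties"/>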

Flink Error: java.lang.ClassNotFoundException: org.apache.flink.shaded.calcite.com.google.common.base.Throwables

I am using Flink to stream data from a CSV file, and I want to put it into table format with a certain schema. For this purpose I am using flink-table_2.10-1.1.3.jar (the Table API), but I got these errors:
log4j:WARN No appenders could be found for logger (org.apache.flink.api.java.typeutils.TypeExtractor).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/shaded/calcite/com/google/common/base/Throwables
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:450)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:460)
at org.apache.calcite.rel.metadata.RelMetadataQuery.revise(RelMetadataQuery.java:186)
at org.apache.calcite.rel.metadata.RelMetadataQuery.collations(RelMetadataQuery.java:484)
at org.apache.calcite.rel.metadata.RelMdCollation.project(RelMdCollation.java:207)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:122)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:120)
at org.apache.calcite.plan.RelTraitSet.replaceIfs(RelTraitSet.java:238)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:116)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:108)
at org.apache.flink.api.table.plan.logical.Project.construct(operators.scala:90)
at org.apache.flink.api.table.plan.logical.Project.construct(operators.scala:85)
at org.apache.flink.api.table.plan.logical.LogicalNode.toRelNode(LogicalNode.scala:78)
at org.apache.flink.api.table.Table.getRelNode(table.scala:66)
at org.apache.flink.api.table.StreamTableEnvironment.translate(StreamTableEnvironment.scala:243)
at org.apache.flink.api.java.table.StreamTableEnvironment.toDataStream(StreamTableEnvironment.scala:147)
at table_streaming_test.main(table_streaming_test.java:90)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.shaded.calcite.com.google.common.base.Throwables
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 17 more
When I explore the corresponding JAR, the class in question is present there. Can you please tell me why this is happening?
Also, can I get the Maven source so that I can build the flink-table JAR myself?
I had the same problem with the CEP library: I added it to my pom file but kept getting ClassNotFoundException, and even packaging it into my JAR file via IntelliJ didn't work.
If you're using the flink-quickstart archetype, I think there are some other things to change in the pom file to make it work. When I created a clean project and added the Flink dependencies myself, I didn't get that exception anymore. You can try and see if this approach works.
You can also add the flink-table JAR file to the lib folder of your Flink installation; this also fixed my problem with the CEP library. The JAR is available on the Maven repository website; download the version you want.
According to the Table and SQL documentation on the Flink website:
Note: The Table API is currently not part of the binary distribution.
See linking with it for cluster execution here.
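Linking it in means declaring the module in your own build; a sketch of the Maven dependency for the version named in the question (flink-table for Scala 2.10, 1.1.3):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table_2.10</artifactId>
    <version>1.1.3</version>
</dependency>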
I was also facing the same problem with the Table API in Flink v1.4.2.
I added the flink-table_2.11-1.4.2.jar file from the opt folder to the lib folder and restarted Flink.
This works for me. Hopefully it works for you too :)

NoClassDefFoundError MimeTypeException with PDF extraction

I am getting an exception trying to use update/extract with PDF files.
My setup is:
Ubuntu Server 11.10
Tomcat 6
Solr 3.5.0.2011.11.22.15.54.38
I can browse to solr/admin OK
I have put all the contrib/extraction libraries and apache-solr-cell-3.5.0.jar into the Tomcat folder webapps/solr/WEB-INF/lib.
I am calling extract using:
curl "http://localhost:8080/solr/update/extract?uprefix=attr_&fmap.content=attr_content&commit=true" -F "file=@/path/to/my.pdf"
The error is:
java.lang.NoClassDefFoundError: org/apache/tika/mime/MimeTypeException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:383)
at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:425)
at org.apache.solr.core.SolrCore.createRequestHandler(SolrCore.java:461)
at org.apache.solr.core.RequestHandlers$LazyRequestHandlerWrapper.getWrappedHandler(RequestHandlers.java:248)
at org.apache.solr.core.RequestHandlers$LazyRequestHandlerWrapper.handleRequest(RequestHandlers.java:239)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1372)
I would appreciate any pointers; the only time this error seems to come up elsewhere is with Nutch and cached results.
I have tried sending the MIME type in the query string and also a *.doc file, but got the same error.
According to the error message, it is not a MimeTypeException you are getting: the problem is a NoClassDefFoundError, because Solr cannot load the MimeTypeException class.
Normally this class lives in tika-core.jar.
Make sure you actually have that file, and also check that you have a lib statement in your solrconfig.xml pointing to the right directory.
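For example, the lib statements for the extraction contrib in a stock solrconfig.xml look roughly like this (paths are relative to the core's instance directory; adjust them to your layout):

<lib dir="../../contrib/extraction/lib" regex=".*\.jar" />
<lib dir="../../dist/" regex="apache-solr-cell-\d.*\.jar" />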
This was due to the basic error of copying the necessary Tika libraries (to tomcat6/webapps/solr/WEB-INF/lib) but leaving ownership of the JAR files as root instead of chown-ing them to the tomcat6 user. After setting the right permissions and restarting Tomcat, it started working OK.
Found the solution to this problem. I was using SolrJ to update my PDF indexing; after deploying Solr to Tomcat, I didn't include the extraction libraries in the Tomcat webapp and got all the lazy-loading problems, etc. I even tried to get Apache Tika separately, until I did this:
shut down Tomcat
copy the libraries from \apache-solr-3.5.0\contrib\extraction to \apache-tomcat-7.0.26\webapps\solr\WEB-INF\lib
start up Tomcat
Cheers.
