I have to read a file from an SFTP server, but while reading it Camel throws com.jcraft.jsch.SftpException: Permission denied. I have read and write permissions on the folder and the files in it, and moving the file to the .done folder with FileZilla works.
Route:
sftp://USERNAME@IP_ADDRESS:PORT/path?password=MY_PASSWORD&move=.done&binary=true&disconnect=true&include=FILENAME_(2\\d{3})(\\d{2})(\\d{2}).txt
Stack trace:
o.a.c.c.file.remote.SftpConsumer : Error processing file RemoteFile[FILENAME_20170802.txt] due to Cannot retrieve file: path/FILENAME_20170802.txt. Caused by: [org.apache.camel.component.file.GenericFileOperationFailedException - Cannot retrieve file: path/FILENAME_20170802.txt]
org.apache.camel.component.file.GenericFileOperationFailedException: Cannot retrieve file: path/FILENAME_20170802.txt
at org.apache.camel.component.file.remote.SftpOperations.retrieveFileToStreamInBody(SftpOperations.java:704) ~[camel-ftp-2.19.1.jar:2.19.1]
at org.apache.camel.component.file.remote.SftpOperations.retrieveFile(SftpOperations.java:648) ~[camel-ftp-2.19.1.jar:2.19.1]
at org.apache.camel.component.file.GenericFileConsumer.processExchange(GenericFileConsumer.java:408) [camel-core-2.19.1.jar:2.19.1]
at org.apache.camel.component.file.remote.RemoteFileConsumer.processExchange(RemoteFileConsumer.java:137) [camel-ftp-2.19.1.jar:2.19.1]
at org.apache.camel.component.file.GenericFileConsumer.processBatch(GenericFileConsumer.java:218) [camel-core-2.19.1.jar:2.19.1]
at org.apache.camel.component.file.GenericFileConsumer.poll(GenericFileConsumer.java:182) [camel-core-2.19.1.jar:2.19.1]
at org.apache.camel.impl.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:174) [camel-core-2.19.1.jar:2.19.1]
at org.apache.camel.impl.ScheduledPollConsumer.run(ScheduledPollConsumer.java:101) [camel-core-2.19.1.jar:2.19.1]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_131]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_131]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_131]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
Caused by: com.jcraft.jsch.SftpException: Permission denied
at com.jcraft.jsch.ChannelSftp.throwStatusError(ChannelSftp.java:2873) ~[jsch-0.1.54.jar:na]
at com.jcraft.jsch.ChannelSftp._stat(ChannelSftp.java:2225) ~[jsch-0.1.54.jar:na]
at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:1318) ~[jsch-0.1.54.jar:na]
at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:1290) ~[jsch-0.1.54.jar:na]
at org.apache.camel.component.file.remote.SftpOperations.retrieveFileToStreamInBody(SftpOperations.java:689) ~[camel-ftp-2.19.1.jar:2.19.1]
... 14 common frames omitted
The directory path that you have provided is relative, not absolute; just confirming whether that's the issue.
By default, Camel starts in the /home/{username}/ directory, not the "/" root directory.
Hope it helps.
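For illustration, here is a minimal sketch of the same route in Camel's Java DSL pointing at an absolute directory instead of one relative to the user's home. The host, credentials, and /absolute/path are placeholders; the double slash after the port is the camel-ftp convention for absolute paths:

import org.apache.camel.builder.RouteBuilder;

public class SftpDownloadRoute extends RouteBuilder {
    @Override
    public void configure() {
        // "//absolute/path" (double slash) is resolved from the server's root;
        // a single slash would be resolved relative to the user's home directory.
        from("sftp://USERNAME@IP_ADDRESS:PORT//absolute/path"
                + "?password=MY_PASSWORD"
                + "&move=.done"
                + "&binary=true"
                + "&disconnect=true"
                + "&include=FILENAME_(2\\d{3})(\\d{2})(\\d{2}).txt")
            .to("file:target/inbound"); // hypothetical local destination
    }
}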
I'm deploying the Google Search Appliance connector for databases (SQL). I'm filling in the GSA hostname and the other required variables, and I have specified the full class name of my JDBC driver as:
com.microsoft.sqlserver.jdbc.SQLServerDriver
which is right; I have browsed my JDBC driver library to check that. But after the installation, I am getting this error in the logs:
main .Application.realDaemonStart() INFO: about to init com.google.enterprise.adaptor.database.DatabaseAdaptor
11-04 14:05:27.314 main .Application.realDaemonStart() WARNING: Failed to initialize adaptor
java.lang.ClassNotFoundException: com.microsoft.sqlserver.jdbc.SQLServerDriver
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at com.google.enterprise.adaptor.database.DatabaseAdaptor.init(DatabaseAdaptor.java:133)
at com.google.enterprise.adaptor.Application.realDaemonStart(Application.java:186)
at com.google.enterprise.adaptor.Application.daemonStart(Application.java:155)
at com.google.enterprise.adaptor.Application.start(Application.java:110)
at com.google.enterprise.adaptor.Application.main(Application.java:553)
at com.google.enterprise.adaptor.AbstractAdaptor.main(AbstractAdaptor.java:64)
at com.google.enterprise.adaptor.database.DatabaseAdaptor.main(DatabaseAdaptor.java:315)
What does this mean? I have specified the right class name for the JDBC driver. Any help is appreciated.
When you run the adaptor, is your JDBC driver JAR file in the same directory as your database adaptor JAR file?
If you are using the Google-provided command line, be sure to modify the classpath to reference the JAR file for the JDBC driver, e.g.
-classpath "msqljdbc.jar;adaptor-database-4.1.1-withlib.jar"
or
-cp "msqljdbc.jar;adaptor-database-4.1.1-withlib.jar"
(the classpath separator is ; on Windows and : on Unix-like systems).
I have created a Gatling simulation by recording a ".har" file from Google Chrome, and I have changed the input parameters and am trying to fetch data from a ".csv" file.
Now when I run the simulation, how can I check whether Gatling has fetched the data fields from the ".csv" file?
Temporarily lower the logging level to DEBUG in conf/logback.xml; you'll see the requests that are being generated.
Gatling has quite good error handling and logging connected to feeders.
When you provide a wrong file name or path for your feeder file, you will get the following error (java.lang.IllegalArgumentException: Could not locate feeder file), and note that the tests will not run:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.gatling.mojo.MainWithArgsInFile.runMain(MainWithArgsInFile.java:50)
at io.gatling.mojo.MainWithArgsInFile.main(MainWithArgsInFile.java:33)
Caused by: java.lang.IllegalArgumentException: Could not locate feeder file: file gatling/src/test/resources/data/wrongFile.csv doesn't exist
at io.gatling.core.feeder.FeederSupport$class.feederBuilder(FeederSupport.scala:53)
at io.gatling.core.Predef$.feederBuilder(Predef.scala:22)
at io.gatling.core.feeder.FeederSupport$class.separatedValues(FeederSupport.scala:44)
at io.gatling.core.Predef$.separatedValues(Predef.scala:22)
at io.gatling.core.feeder.FeederSupport$class.separatedValues(FeederSupport.scala:41)
at io.gatling.core.Predef$.separatedValues(Predef.scala:22)
at io.gatling.core.feeder.FeederSupport$class.csv(FeederSupport.scala:34)
at io.gatling.core.Predef$.csv(Predef.scala:22)
at com.scenario.Scenario.<init>(Scenario.scala:10)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at io.gatling.app.Gatling$.io$gatling$app$Gatling$$$anonfun$1(Gatling.scala:41)
at io.gatling.app.Gatling.run(Gatling.scala:92)
at io.gatling.app.Gatling.runIfNecessary(Gatling.scala:75)
at io.gatling.app.Gatling.start(Gatling.scala:65)
at io.gatling.app.Gatling$.start(Gatling.scala:57)
at io.gatling.app.Gatling$.fromArgs(Gatling.scala:49)
at io.gatling.app.Gatling$.main(Gatling.scala:43)
at io.gatling.app.Gatling.main(Gatling.scala)
When your attribute does not match a column name from the feeder, you will get the following error, and all tests will end with an error:
00:10:07.937 [ERROR] i.g.h.a.s.HttpRequestAction - 'httpRequest-1' failed to execute: No attribute named 'wrongAttributeName' is defined
When your feeder data file is empty, you will get the following error:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.gatling.mojo.MainWithArgsInFile.runMain(MainWithArgsInFile.java:50)
at io.gatling.mojo.MainWithArgsInFile.main(MainWithArgsInFile.java:33)
Caused by: java.lang.IllegalStateException: Feeder is now empty, stopping engine
at io.gatling.core.action.SingletonFeed$$anonfun$receive$1.applyOrElse(SingletonFeed.scala:61)
at akka.actor.Actor$class.aroundReceive(Actor.scala:482)
at io.gatling.core.akka.BaseActor.aroundReceive(BaseActor.scala:23)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526)
at akka.actor.ActorCell.invoke(ActorCell.scala:495)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257)
at akka.dispatch.Mailbox.run(Mailbox.scala:224)
at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
When none of these appear, you can be almost sure that everything is OK. But to verify the parameters actually sent by Gatling, you can check two things.
First, Gatling's own logs: change the logging level to DEBUG in logback.xml:
<root level="DEBUG">
<appender-ref ref="CONSOLE" />
</root>
And then you will see a log line similar to:
00:13:53.233 [DEBUG] i.g.h.a.s.HttpTx$ - Sending request=Action name uri=http://your-rest-service?param1=value1&param2=value2: scenario=com.scenario.Scenario Scenario name, userId=1
Second, the access logs on your server. For Tomcat, for example, you can enable them as shown in the sketch below.
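For reference, here is a sketch of how access logging is typically enabled in Tomcat's server.xml, inside the <Host> element; the directory and pattern shown are just the common defaults, so adjust them to your setup:

<Valve className="org.apache.catalina.valves.AccessLogValve"
       directory="logs"
       prefix="localhost_access_log" suffix=".txt"
       pattern="%h %l %u %t &quot;%r&quot; %s %b" />

Each request Gatling sends, including its query parameters, will then appear as a line in the access log.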
I am using Apache Solr on my Drupal website.
Tomcat 6 is installed, and I have replaced the schema.xml, solrconfig.xml, and protwords.txt files with the new files present in the module installation directory.
When I browse to localhost:8983, I get this error:
Log4j (org.slf4j.impl.Log4jLoggerFactory)
2528 [coreLoadExecutor-3-thread-1] ERROR org.apache.solr.core.CoreContainer – Failed to load file /opt/solr-4.5.1/example/solr/collection1/conf/solrconfig.xml
2529 [coreLoadExecutor-3-thread-1] ERROR org.apache.solr.core.CoreContainer – Unable to create core: egitraining-dev.esc.rl.ac.uk
org.apache.solr.common.SolrException: Could not load config file /opt/solr-4.5.1/example/solr/collection1/conf/solrconfig.xml
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:490)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:557)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:247)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:239)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.IOException: Can't find resource 'solrconfig.xml' in classpath or 'solr/collection1/conf/conf/', cwd=/opt/solr-4.5.1/example
at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:322)
at org.apache.solr.core.SolrResourceLoader.openConfig(SolrResourceLoader.java:287)
at org.apache.solr.core.Config.<init>(Config.java:116)
at org.apache.solr.core.Config.<init>(Config.java:86)
at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:129)
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:487)
... 11 more
2531 [coreLoadExecutor-3-thread-1] ERROR org.apache.solr.core.CoreContainer – null:org.apache.solr.common.SolrException: Unable to create core: egitraining-dev.esc.rl.ac.uk
at org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:934)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:566)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:247)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:239)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: org.apache.solr.common.SolrException: Could not load config file /opt/solr-4.5.1/example/solr/collection1/conf/solrconfig.xml
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:490)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:557)
... 10 more
Caused by: java.io.IOException: Can't find resource 'solrconfig.xml' in classpath or 'solr/collection1/conf/conf/', cwd=/opt/solr-4.5.1/example
at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:322)
at org.apache.solr.core.SolrResourceLoader.openConfig(SolrResourceLoader.java:287)
at org.apache.solr.core.Config.<init>(Config.java:116)
at org.apache.solr.core.Config.<init>(Config.java:86)
at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:129)
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:487)
... 11 more
2533 [main] INFO org.apache.solr.servlet.SolrDispatchFilter – user.dir=/opt/solr-4.5.1/example
2533 [main] INFO org.apache.solr.servlet.SolrDispatchFilter – SolrDispatchFilter.init() done
2576 [main] INFO org.eclipse.jetty.server.AbstractConnector – Started SocketConnector@0.0.0.0:8983
Can anyone help, please? Thanks.
This might have something to do with the default Solr config files provided by the Search API Solr module. Try removing the following lines from solrconfig.xml:
<useCompoundFile>false</useCompoundFile>
<ramBufferSizeMB>32</ramBufferSizeMB>
<mergeFactor>10</mergeFactor>
Patch found at https://drupal.org/comment/7945999#comment-7945999.
I have integrated Nutch with Hadoop and ran a crawl with it. But when I try to create the Solr index for the crawled data, the job fails. This is my command:
$ bin/nutch solrindex http://namenode:8983/solr/ crawl/crawldb -linkdb crawl/linkdb crawl/segments/*
The directory 'crawl' stores the data and is located in HDFS. I just used the example setup from the Solr release, and the command works fine in single-node mode (not in distributed mode). Here is a fragment of the output; forgive me for posting it in such a messy way:
12/04/28 13:51:22 INFO mapred.JobClient: map 100% reduce 23%
12/04/28 13:51:30 INFO mapred.JobClient: Task Id : attempt_201204212112_0076_r_000000_0, Status : FAILED
java.io.IOException
at org.apache.nutch.indexer.solr.SolrWriter.makeIOException(SolrWriter.java:103)
at org.apache.nutch.indexer.solr.SolrWriter.close(SolrWriter.java:98)
at org.apache.nutch.indexer.IndexerOutputFormat$1.close(IndexerOutputFormat.java:48)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.close(ReduceTask.java:466)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:530)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:420)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: org.apache.solr.client.solrj.SolrServerException: java.net.ConnectException: Connection refused
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:478)
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:244)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:49)
at org.apache.nutch.indexer.solr.SolrWriter.close(SolrWriter.java:93)
... 9 more
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(Unknown Source)
at java.net.PlainSocketImpl.connectToAddress(Unknown Source)
at java.net.PlainSocketImpl.connect(Unknown Source)
at java.net.SocksSocketImpl.connect(Unknown Source)
at java.net.Socket.connect(Unknown Source)
at java.net.Socket.connect(Unknown Source)
at java.net.Socket.<init>(Unknown Source)
at java.net.Socket.<init>(Unknown Source)
at org.apache.commons.httpclient.protocol.DefaultProtocolSocketFactory.createSocket(DefaultProtocolSocketFactory.java:79)
at org.apache.commons.httpclient.protocol.DefaultProtocolSocketFactory.createSocket(DefaultProtocolSocketFactory.java:121)
at org.apache.commons.httpclient.HttpConnection.open(HttpConnection.java:706)
at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:386)
at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:170)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:396)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:324)
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:422)
... 13 more