Camel consuming large file encoded with base64 - file

I'm trying to read a large file with Apache Camel. The content of the file is base64 encoded and I want to decode it. Reading a large plain ASCII text file is no problem, but this one is.
In this case I cannot split it at the end of a line, because the base64 line end is not the line end of my data (the data is CSV). Later I want to manipulate the data.
The following Camel route works, but only if the file already exists and is not locked when the route starts; otherwise I get an IllegalArgumentException. The problem is that I don't know when the file will exist.
EDIT: I forgot to mention: camel-version: 2.14.1 | java: 1.7
I need a solution for my problem, can someone help me?
from("stream:file?fileName=/path/to/file&scanStream=true")
.routeId(ROUTE_ID_BASE64)
.unmarshal().base64()
.setHeader("foo", constant("foo"))
.aggregate(header("foo"), new StringBodyAggregator())
.completionSize(1000)
.completionTimeout(1500)
.to("file:" + this.tempPfad + "?fileName=" + this.tempDateiname + "&charset=utf-8&fileExist=Append");
I also tried:
from("file:" + this.eingabePfad + "?filter=#dateinamenFilter&maxDepth=1&readLock=changed&noop=true&charset=utf-8")
.routeId(ROUTE_ID_BASE64)
.unmarshal().base64()
.setHeader("foo", constant("foo"))
.aggregate(header("foo"), new StringBodyAggregator())
.completionSize(1000)
.completionTimeout(1500)
.to("file:" + this.tempPfad + "?fileName=" + this.tempDateiname + "&charset=utf-8&fileExist=Append");
But then I get the following error:
ERROR Failed delivery for (MessageId: ID-smith-52872-1421933301634-0-1 on ExchangeId: ID-smith-52872-1421933301634-0-3). Exhausted after delivery attempt: 2 caught: org.apache.camel.component.file.GenericFileOperationFailedException: Cannot store file: /path/to/temp/tempDatei.csv. Processed by failure processor: FatalFallbackErrorHandler[Channel[DelegateSync[yy.xxxxx.myproject.routes.DateiPolling$1#5734ea]]]
Message History
---------------------------------------------------------------------------------------------------------------------------------------
RouteId ProcessorId Processor Elapsed (ms)
[routeBase64 ] [routeBase64 ] [file:///path/to/?charset=utf-8&filter=%23datein] [ 1998]
[routeBase64 ] [unmarshal2 ] [unmarshal[org.apache.camel.model.dataformat.Base64DataFormat#153097e] ] [ 5]
[routeBase64 ] [setHeader1 ] [setHeader[foo] ] [ 0]
[routeBase64 ] [aggregate1 ] [aggregate[header(foo)] ] [ 0]
[routeBase64 ] [to1 ] [file:/path/to/temp/?fileName=tempDatei.csv&char] [ 24]
[ ] [process1 ] [yy.xxxxx.myproject.routes.DateiPolling$1#5734ea ] [ 15]
Exchange
---------------------------------------------------------------------------------------------------------------------------------------
Exchange[
Id ID-smith-52872-1421933301634-0-3
ExchangePattern InOnly
Headers {breadcrumbId=ID-smith-52872-1421933301634-0-1, CamelFileAbsolute=true, CamelFileAbsolutePath=/path/to/base64File.txt, CamelFileLastModified=1417777182000, CamelFileLength=267439971, CamelFileName=base64File.txt, CamelFileNameConsumed=base64File.txt, CamelFileNameOnly=base64File.txt, CamelFileParent=/path/to, CamelFilePath=/path/to/base64File.txt, CamelFileRelativePath=base64File.txt, CamelRedelivered=true, CamelRedeliveryCounter=1, CamelRedeliveryMaxCounter=1, foo=foo}
BodyType org.apache.commons.codec.binary.Base64InputStream
Body [Body is instance of java.io.InputStream]
]
Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
org.apache.camel.component.file.GenericFileOperationFailedException: Cannot store file: /path/to/temp/tempDatei.csv
at org.apache.camel.component.file.FileOperations.storeFile(FileOperations.java:276)
at org.apache.camel.component.file.GenericFileProducer.writeFile(GenericFileProducer.java:277)
at org.apache.camel.component.file.GenericFileProducer.processExchange(GenericFileProducer.java:165)
at org.apache.camel.component.file.GenericFileProducer.process(GenericFileProducer.java:79)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:120)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:416)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:191)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:191)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:105)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:87)
at org.apache.camel.processor.aggregate.AggregateProcessor$1.run(AggregateProcessor.java:548)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.apache.camel.util.concurrent.SynchronousExecutorService.execute(SynchronousExecutorService.java:62)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
at org.apache.camel.processor.aggregate.AggregateProcessor.onSubmitCompletion(AggregateProcessor.java:540)
at org.apache.camel.processor.aggregate.AggregateProcessor.access$900(AggregateProcessor.java:82)
at org.apache.camel.processor.aggregate.AggregateProcessor$AggregationTimeoutMap.onEviction(AggregateProcessor.java:853)
at org.apache.camel.processor.aggregate.AggregateProcessor$AggregationTimeoutMap.onEviction(AggregateProcessor.java:814)
at org.apache.camel.support.DefaultTimeoutMap.purge(DefaultTimeoutMap.java:212)
at org.apache.camel.processor.aggregate.AggregateProcessor$AggregationTimeoutMap.purge(AggregateProcessor.java:826)
at org.apache.camel.support.DefaultTimeoutMap.run(DefaultTimeoutMap.java:162)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Stream closed
at java.io.BufferedReader.ensureOpen(BufferedReader.java:115)
at java.io.BufferedReader.read(BufferedReader.java:172)
at org.apache.camel.converter.IOConverter$1.read(IOConverter.java:83)
at java.io.InputStream.read(InputStream.java:170)
at java.io.InputStream.read(InputStream.java:101)
at org.apache.commons.codec.binary.BaseNCodecInputStream.read(BaseNCodecInputStream.java:158)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:283)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:325)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:177)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.read1(BufferedReader.java:203)
at java.io.BufferedReader.read(BufferedReader.java:279)
at java.io.Reader.read(Reader.java:140)
at org.apache.camel.util.IOHelper.copy(IOHelper.java:224)
at org.apache.camel.component.file.FileOperations.writeFileByReaderWithCharset(FileOperations.java:402)
at org.apache.camel.component.file.FileOperations.storeFile(FileOperations.java:266)
... 29 more
2015-01-22 14:28:26,113 [eTimeoutChecker] AggregateProcessor WARN Error processing aggregated exchange. Exchange[base64File.txt]. Caused by: [org.apache.camel.component.file.GenericFileOperationFailedException - Cannot store file: /path/to/temp/tempDatei.csv]
org.apache.camel.component.file.GenericFileOperationFailedException: Cannot store file: /path/to/temp/tempDatei.csv
at org.apache.camel.component.file.FileOperations.storeFile(FileOperations.java:276)
at org.apache.camel.component.file.GenericFileProducer.writeFile(GenericFileProducer.java:277)
at org.apache.camel.component.file.GenericFileProducer.processExchange(GenericFileProducer.java:165)
at org.apache.camel.component.file.GenericFileProducer.process(GenericFileProducer.java:79)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:120)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:416)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:191)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:191)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:105)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:87)
at org.apache.camel.processor.aggregate.AggregateProcessor$1.run(AggregateProcessor.java:548)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.apache.camel.util.concurrent.SynchronousExecutorService.execute(SynchronousExecutorService.java:62)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
at org.apache.camel.processor.aggregate.AggregateProcessor.onSubmitCompletion(AggregateProcessor.java:540)
at org.apache.camel.processor.aggregate.AggregateProcessor.access$900(AggregateProcessor.java:82)
at org.apache.camel.processor.aggregate.AggregateProcessor$AggregationTimeoutMap.onEviction(AggregateProcessor.java:853)
at org.apache.camel.processor.aggregate.AggregateProcessor$AggregationTimeoutMap.onEviction(AggregateProcessor.java:814)
at org.apache.camel.support.DefaultTimeoutMap.purge(DefaultTimeoutMap.java:212)
at org.apache.camel.processor.aggregate.AggregateProcessor$AggregationTimeoutMap.purge(AggregateProcessor.java:826)
at org.apache.camel.support.DefaultTimeoutMap.run(DefaultTimeoutMap.java:162)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Stream closed
at java.io.BufferedReader.ensureOpen(BufferedReader.java:115)
at java.io.BufferedReader.read(BufferedReader.java:172)
at org.apache.camel.converter.IOConverter$1.read(IOConverter.java:83)
at java.io.InputStream.read(InputStream.java:170)
at java.io.InputStream.read(InputStream.java:101)
at org.apache.commons.codec.binary.BaseNCodecInputStream.read(BaseNCodecInputStream.java:158)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:283)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:325)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:177)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.read1(BufferedReader.java:203)
at java.io.BufferedReader.read(BufferedReader.java:279)
at java.io.Reader.read(Reader.java:140)
at org.apache.camel.util.IOHelper.copy(IOHelper.java:224)
at org.apache.camel.component.file.FileOperations.writeFileByReaderWithCharset(FileOperations.java:402)
at org.apache.camel.component.file.FileOperations.storeFile(FileOperations.java:266)
... 29 more

After adding the line .split().tokenize(LINE_SEPARATOR).streaming() it works now. It is important that the .end() comes at the end of the route, and not directly after the unmarshalling.
I also changed the aggregation strategy from String to byte[], to prevent errors with the UTF-8 encoding.
The complete route:
from("file:" + this.eingabePfad + "?filter=#dateinamenFilter&maxDepth=1&readLock=changed&noop=true")
.routeId(ROUTE_ID_BASE64)
.split().tokenize(LINE_SEPARATOR).streaming()
.unmarshal().base64()
.setHeader("foo", constant("foo"))
.aggregate(header("foo"), new ByteArrayBodyAggregator())
.completionSize(1000)
.completionTimeout(1500)
.to("file:" + this.tempPfad + "?fileName=" + this.tempDateiname + "&charset=utf-8&fileExist=Append")
.end();
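For reference, here is a minimal sketch of what the ByteArrayBodyAggregator used above might look like; the post does not show this class, so apart from the name everything below is an assumption. It simply concatenates the byte[] bodies of the aggregated exchanges (Camel 2.x AggregationStrategy).

import org.apache.camel.Exchange;
import org.apache.camel.processor.aggregate.AggregationStrategy;

// Hypothetical sketch: concatenates decoded byte[] chunks into one body.
public class ByteArrayBodyAggregator implements AggregationStrategy {

    @Override
    public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
        if (oldExchange == null) {
            // first exchange of the batch, nothing to merge yet
            return newExchange;
        }
        byte[] oldBody = oldExchange.getIn().getBody(byte[].class);
        byte[] newBody = newExchange.getIn().getBody(byte[].class);
        byte[] merged = new byte[oldBody.length + newBody.length];
        System.arraycopy(oldBody, 0, merged, 0, oldBody.length);
        System.arraycopy(newBody, 0, merged, oldBody.length, newBody.length);
        oldExchange.getIn().setBody(merged);
        return oldExchange;
    }
}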

Related

Converting pandas table to Table API to DataStream using PyFlink

Hello, I am new to PyFlink. I am trying to convert a pandas table containing strings to a DataStream object in Flink.
Take the following as example code:
from pyflink.datastream import *
from pyflink.table import *
import pandas as pd
import numpy as np
env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)
env.set_parallelism(1)
# Create a Pandas DataFrame
#pdf = pd.DataFrame(np.random.rand(1000, 5))
pdf = pd.DataFrame(["abc", "def"])
# Create a PyFlink Table from a Pandas DataFrame
table = t_env.from_pandas(pdf)
table.execute().print()
# interpret the insert-only Table as a DataStream again
res_ds = t_env.to_data_stream(table)
# add a printing sink and execute in DataStream API
res_ds.print()
env.execute()
I get the following error:
Caused by: java.lang.ClassCastException: class org.apache.flink.table.data.binary.BinaryStringData cannot be cast to class java.lang.String (org.apache.flink.table.data.binary.BinaryStringData is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')
The problem seems to be the conversion from the Table to the DataStream object; the first print call executes fine.
Output:
$ python PandasConv.py
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker$1 (file:/home/user/Flink/Python_Projects/flinkenv/lib/python3.8/site-packages/pyflink/opt/flink-python-1.16.0.jar) to method java.time.ZoneRegion.getId()
WARNING: Please consider reporting this to the maintainers of org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
+----+--------------------------------+
| op | 0 |
+----+--------------------------------+
| +I | abc |
| +I | def |
+----+--------------------------------+
2 rows in set
Traceback (most recent call last):
File "PandasConv.py", line 25, in <module>
env.execute()
File "/home/user/Flink/Python_Projects/flinkenv/lib/python3.8/site-packages/pyflink/datastream/stream_execution_environment.py", line 764, in execute
return JobExecutionResult(self._j_stream_execution_environment.execute(j_stream_graph))
File "/home/user/Flink/Python_Projects/flinkenv/lib/python3.8/site-packages/py4j/java_gateway.py", line 1321, in __call__
return_value = get_return_value(
File "/home/user/Flink/Python_Projects/flinkenv/lib/python3.8/site-packages/pyflink/util/exceptions.py", line 146, in deco
return f(*a, **kw)
File "/home/user/Flink/Python_Projects/flinkenv/lib/python3.8/site-packages/py4j/protocol.py", line 326, in get_return_value
raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o46.execute.
: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.runtime.minicluster.MiniClusterJobClient.lambda$getJobExecutionResult$3(MiniClusterJobClient.java:141)
at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$1(AkkaInvocationHandler.java:268)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.util.concurrent.FutureUtils.doForward(FutureUtils.java:1277)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$null$1(ClassLoadingUtils.java:93)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$guardCompletionWithContextClassLoader$2(ClassLoadingUtils.java:92)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$1.onComplete(AkkaFutureUtils.java:47)
at akka.dispatch.OnComplete.internal(Future.scala:300)
at akka.dispatch.OnComplete.internal(Future.scala:297)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:224)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:221)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$DirectExecutionContext.execute(AkkaFutureUtils.java:65)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:284)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:621)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:24)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:23)
at scala.concurrent.Future.$anonfun$andThen$1(Future.scala:532)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:63)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:100)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:100)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:139)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:83)
at org.apache.flink.runtime.scheduler.DefaultScheduler.recordTaskFailure(DefaultScheduler.java:256)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:247)
at org.apache.flink.runtime.scheduler.DefaultScheduler.onTaskFailed(DefaultScheduler.java:240)
at org.apache.flink.runtime.scheduler.SchedulerBase.onTaskExecutionStateUpdate(SchedulerBase.java:738)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:715)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:78)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:477)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:309)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:307)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:222)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:84)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:168)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
... 5 more
Caused by: java.lang.ClassCastException: class org.apache.flink.table.data.binary.BinaryStringData cannot be cast to class java.lang.String (org.apache.flink.table.data.binary.BinaryStringData is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')
at org.apache.flink.table.runtime.typeutils.serializers.python.StringSerializer.serialize(StringSerializer.java:41)
at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serializePositionBased(RowSerializer.java:306)
at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serialize(RowSerializer.java:280)
at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serialize(RowSerializer.java:72)
at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serializePositionBased(RowSerializer.java:306)
at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serialize(RowSerializer.java:280)
at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serialize(RowSerializer.java:72)
at org.apache.flink.streaming.api.operators.python.process.AbstractExternalOneInputPythonFunctionOperator.processElement(AbstractExternalOneInputPythonFunctionOperator.java:142)
at org.apache.flink.streaming.api.operators.python.process.ExternalPythonProcessOperator.processElement(ExternalPythonProcessOperator.java:111)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
at org.apache.flink.table.runtime.operators.sink.OutputConversionOperator.processElement(OutputConversionOperator.java:105)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:418)
at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:513)
at org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:103)
at org.apache.flink.table.runtime.arrow.sources.ArrowSourceFunction.run(ArrowSourceFunction.java:200)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:333)
I tried to cast explicitly, e.g. with "dtype='String'" or, in Flink, "DataType.STRING()". That did not solve the problem.
I have read the following article https://nightlies.apache.org/flink/flink-docs-release-1.13/docs/ops/debugging/debugging_classloading/#x-cannot-be-cast-to-x-exceptions
But I do not know where to find the config file.

Apache Camel Bindy: Unexpected / unmapped characters found at the end of the fixed-length record at line : 2

Getting the following exception in Camel:
Exchange[
Id ID-Dell-PC-51429-1581618665098-0-4
ExchangePattern InOnly
Headers {breadcrumbId=ID-Dell-PC-51429-1581618665098-0-3, CamelFileAbsolute=false, CamelFileAbsolutePath=C:\Users\Dell\eclipse-workspace\camelHelloWorld\input\TIL.txt, CamelFileContentType=text/plain, CamelFileLastModified=1581618006722, CamelFileLength=12050, CamelFileName=TIL.txt, CamelFileNameConsumed=TIL.txt, CamelFileNameOnly=TIL.txt, CamelFileParent=input, CamelFilePath=input\TIL.txt, CamelFileRelativePath=TIL.txt, CamelRedelivered=false, CamelRedeliveryCounter=0}
BodyType org.apache.camel.component.file.GenericFile
Body [Body is file based: GenericFile[TIL.txt]]
]
Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
java.lang.IllegalArgumentException: Unexpected / unmapped characters found at the end of the fixed-length record at line : 2
at org.apache.camel.dataformat.bindy.BindyFixedLengthFactory.bind(BindyFixedLengthFactory.java:281)
at org.apache.camel.dataformat.bindy.fixed.BindyFixedLengthDataFormat.createModel(BindyFixedLengthDataFormat.java:262)
at org.apache.camel.dataformat.bindy.fixed.BindyFixedLengthDataFormat.unmarshal(BindyFixedLengthDataFormat.java:196)
at org.apache.camel.processor.UnmarshalProcessor.process(UnmarshalProcessor.java:67)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:448)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:191)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:118)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:80)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:191)
at org.apache.camel.component.file.GenericFileConsumer.processExchange(GenericFileConsumer.java:433)
at org.apache.camel.component.file.GenericFileConsumer.processBatch(GenericFileConsumer.java:211)
at org.apache.camel.component.file.GenericFileConsumer.poll(GenericFileConsumer.java:175)
at org.apache.camel.impl.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:174)
at org.apache.camel.impl.ScheduledPollConsumer.run(ScheduledPollConsumer.java:101)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.runAndReset(Unknown Source)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown Source)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Trying to convert a fixed-length file format using the following:
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "", propOrder = {
    "id",
    "name",
    .
    .
})
@XmlRootElement(name = "Til")
@FixedLengthRecord(length = 239)
public class Til {
    @XmlElement(name = "ID", required = true)
    @DataField(pos = 1, length = 4)
    protected String id;
    @XmlElement(name = "NAME", required = true)
    @DataField(pos = 5, length = 15)
    protected String name;
    .
    .
}
The total characters are 239 in the file.
What could be the reason for this error? Is any kind of trim or padding required to fix it?
This can be overcome by applying the following annotation to your POJO class. It basically tells Bindy to ignore any characters that come after the last mapped fixed-length field in the POJO.
@FixedLengthRecord(ignoreTrailingChars = true)
You can perhaps open your file in Notepad++ after enabling View -> Show Symbol -> Show All Characters and see whether any characters are actually present at the end. Such characters are commonly seen when dealing with files generated on a different OS. You can also double-check the mapping to ensure that all the fields are mapped with the correct lengths, which can easily go unnoticed.
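For illustration, applied to the Til class from the question the record annotation would look roughly like this; it is only a sketch, with the remaining fields and XML annotations assumed to stay exactly as posted above:

// Sketch only: ignoreTrailingChars added to the existing record definition.
@FixedLengthRecord(length = 239, ignoreTrailingChars = true)
public class Til {
    @DataField(pos = 1, length = 4)
    protected String id;
    @DataField(pos = 5, length = 15)
    protected String name;
    // ... remaining fields unchanged
}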

JavaMail Error. Transport.send(message) throws java.lang.NoClassDefFoundError: sun/security/ssl/EllipticCurvesExtension

I am trying to send an email using Gmail SMTP via TLS. The code below seems correct, but I get an error at Transport.send(message). I have tried all the solutions I could find on the internet, including this one, to no avail.
Here is my dependency:
<dependency>
<groupId>com.sun.mail</groupId>
<artifactId>javax.mail</artifactId>
<version>1.6.2</version>
</dependency>
And the email-sending code:
private static void sendEmail(String to, String title, String html) throws MessagingException {
System.out.println("Sending email to " + to);
senderEmail = "someEmail";
senderPassword = "senderPassword";
// create properties
Properties props = new Properties();
props.put("mail.smtp.auth", "true");
props.put("mail.smtp.starttls.enable", "true");
props.put("mail.smtp.host", "smtp.gmail.com");
props.put("mail.smtp.port", "587");
// create session using properties
Session session = Session.getInstance(props, new javax.mail.Authenticator() {
@Override
protected PasswordAuthentication getPasswordAuthentication() {
return new PasswordAuthentication(senderEmail, senderPassword);
}
});
//create message using session
Message message = new MimeMessage(session);
// prepare, then send email
message.setContent(html, "text/html; charset=utf-8");
message.setFrom(new InternetAddress(senderEmail));
message.setRecipients(Message.RecipientType.TO, InternetAddress.parse(to));
message.setSubject(title);
//send message
Transport.send(message);
System.out.println("Done!");
}
I am using:
Ubuntu 18.04 LTS
Netbeans 8.2
Payara Server 5.183
The stack trace:
Warning: #{homeController.registerNewVoter()}: java.lang.NoClassDefFoundError: sun/security/ssl/EllipticCurvesExtension
javax.faces.FacesException: #{homeController.registerNewVoter()}: java.lang.NoClassDefFoundError: sun/security/ssl/EllipticCurvesExtension
at com.sun.faces.application.ActionListenerImpl.getNavigationOutcome(ActionListenerImpl.java:120)
at com.sun.faces.application.ActionListenerImpl.processAction(ActionListenerImpl.java:95)
at javax.faces.component.UICommand.broadcast(UICommand.java:246)
at javax.faces.component.UIViewRoot.broadcastEvents(UIViewRoot.java:871)
at javax.faces.component.UIViewRoot.processDecodes(UIViewRoot.java:1035)
at com.sun.faces.lifecycle.ApplyRequestValuesPhase.execute(ApplyRequestValuesPhase.java:79)
at com.sun.faces.lifecycle.Phase.doPhase(Phase.java:100)
at com.sun.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:201)
at javax.faces.webapp.FacesServlet.executeLifecyle(FacesServlet.java:731)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:475)
at org.apache.catalina.core.StandardWrapper.service(StandardWrapper.java:1628)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:339)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:209)
at org.glassfish.tyrus.servlet.TyrusServletFilter.doFilter(TyrusServletFilter.java:305)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:251)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:209)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:256)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:160)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:755)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:575)
at com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:99)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:159)
at org.apache.catalina.connector.CoyoteAdapter.doService(CoyoteAdapter.java:371)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:238)
at com.sun.enterprise.v3.services.impl.ContainerMapper$HttpHandlerCallable.call(ContainerMapper.java:516)
at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:213)
at org.glassfish.grizzly.http.server.HttpHandler.runService(HttpHandler.java:182)
at org.glassfish.grizzly.http.server.HttpHandler.doHandle(HttpHandler.java:156)
at org.glassfish.grizzly.http.server.HttpServerFilter.handleRead(HttpServerFilter.java:218)
at org.glassfish.grizzly.filterchain.ExecutorResolver$9.execute(ExecutorResolver.java:95)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeFilter(DefaultFilterChain.java:260)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeChainPart(DefaultFilterChain.java:177)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.execute(DefaultFilterChain.java:109)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.process(DefaultFilterChain.java:88)
at org.glassfish.grizzly.ProcessorExecutor.execute(ProcessorExecutor.java:53)
at org.glassfish.grizzly.nio.transport.TCPNIOTransport.fireIOEvent(TCPNIOTransport.java:524)
at org.glassfish.grizzly.strategies.AbstractIOStrategy.fireIOEvent(AbstractIOStrategy.java:89)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.run0(WorkerThreadIOStrategy.java:94)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.access$100(WorkerThreadIOStrategy.java:33)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy$WorkerThreadRunnable.run(WorkerThreadIOStrategy.java:114)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:569)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.run(AbstractThreadPool.java:549)
at java.lang.Thread.run(Thread.java:745)
Caused by: javax.faces.el.EvaluationException: java.lang.NoClassDefFoundError: sun/security/ssl/EllipticCurvesExtension
at com.sun.faces.application.MethodBindingMethodExpressionAdapter.invoke(MethodBindingMethodExpressionAdapter.java:100)
at com.sun.faces.application.ActionListenerImpl.getNavigationOutcome(ActionListenerImpl.java:106)
... 42 more
Caused by: java.lang.NoClassDefFoundError: sun/security/ssl/EllipticCurvesExtension
at sun.security.ssl.Handshaker.getActiveProtocols(Handshaker.java:793)
at sun.security.ssl.Handshaker.activate(Handshaker.java:549)
at sun.security.ssl.SSLSocketImpl.kickstartHandshake(SSLSocketImpl.java:1492)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1361)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
at com.sun.mail.util.SocketFetcher.configureSSLSocket(SocketFetcher.java:620)
at com.sun.mail.util.SocketFetcher.startTLS(SocketFetcher.java:547)
at com.sun.mail.smtp.SMTPTransport.startTLS(SMTPTransport.java:2150)
at com.sun.mail.smtp.SMTPTransport.protocolConnect(SMTPTransport.java:752)
at javax.mail.Service.connect(Service.java:388)
at javax.mail.Service.connect(Service.java:246)
at javax.mail.Service.connect(Service.java:195)
at javax.mail.Transport.send0(Transport.java:254)
at javax.mail.Transport.send(Transport.java:124)
at com.electixxx.home.HomeController.emailElectionCode(HomeController.java:199)
at com.electixxx.home.HomeController.registerNewVoter(HomeController.java:152)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.el.ELUtil.invokeMethod(ELUtil.java:332)
at javax.el.BeanELResolver.invoke(BeanELResolver.java:537)
at javax.el.CompositeELResolver.invoke(CompositeELResolver.java:256)
at com.sun.el.parser.AstValue.invoke(AstValue.java:283)
at com.sun.el.MethodExpressionImpl.invoke(MethodExpressionImpl.java:304)
at org.jboss.weld.module.web.util.el.ForwardingMethodExpression.invoke(ForwardingMethodExpression.java:40)
at org.jboss.weld.module.web.el.WeldMethodExpression.invoke(WeldMethodExpression.java:50)
at com.sun.faces.facelets.el.TagMethodExpression.invoke(TagMethodExpression.java:89)
at com.sun.faces.application.MethodBindingMethodExpressionAdapter.invoke(MethodBindingMethodExpressionAdapter.java:90)
... 43 more
Warning: StandardWrapperValve[Faces Servlet]: Servlet.service() for servlet Faces Servlet threw exception
java.lang.NoClassDefFoundError: sun/security/ssl/EllipticCurvesExtension
at sun.security.ssl.Handshaker.getActiveProtocols(Handshaker.java:793)
at sun.security.ssl.Handshaker.activate(Handshaker.java:549)
at sun.security.ssl.SSLSocketImpl.kickstartHandshake(SSLSocketImpl.java:1492)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1361)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
at com.sun.mail.util.SocketFetcher.configureSSLSocket(SocketFetcher.java:620)
at com.sun.mail.util.SocketFetcher.startTLS(SocketFetcher.java:547)
at com.sun.mail.smtp.SMTPTransport.startTLS(SMTPTransport.java:2150)
at com.sun.mail.smtp.SMTPTransport.protocolConnect(SMTPTransport.java:752)
at javax.mail.Service.connect(Service.java:388)
at javax.mail.Service.connect(Service.java:246)
at javax.mail.Service.connect(Service.java:195)
at javax.mail.Transport.send0(Transport.java:254)
at javax.mail.Transport.send(Transport.java:124)
at com.electixxx.home.HomeController.emailElectionCode(HomeController.java:199)
at com.electixxx.home.HomeController.registerNewVoter(HomeController.java:152)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.el.ELUtil.invokeMethod(ELUtil.java:332)
at javax.el.BeanELResolver.invoke(BeanELResolver.java:537)
at javax.el.CompositeELResolver.invoke(CompositeELResolver.java:256)
at com.sun.el.parser.AstValue.invoke(AstValue.java:283)
at com.sun.el.MethodExpressionImpl.invoke(MethodExpressionImpl.java:304)
at org.jboss.weld.module.web.util.el.ForwardingMethodExpression.invoke(ForwardingMethodExpression.java:40)
at org.jboss.weld.module.web.el.WeldMethodExpression.invoke(WeldMethodExpression.java:50)
at com.sun.faces.facelets.el.TagMethodExpression.invoke(TagMethodExpression.java:89)
at com.sun.faces.application.MethodBindingMethodExpressionAdapter.invoke(MethodBindingMethodExpressionAdapter.java:90)
at com.sun.faces.application.ActionListenerImpl.getNavigationOutcome(ActionListenerImpl.java:106)
at com.sun.faces.application.ActionListenerImpl.processAction(ActionListenerImpl.java:95)
at javax.faces.component.UICommand.broadcast(UICommand.java:246)
at javax.faces.component.UIViewRoot.broadcastEvents(UIViewRoot.java:871)
at javax.faces.component.UIViewRoot.processDecodes(UIViewRoot.java:1035)
at com.sun.faces.lifecycle.ApplyRequestValuesPhase.execute(ApplyRequestValuesPhase.java:79)
at com.sun.faces.lifecycle.Phase.doPhase(Phase.java:100)
at com.sun.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:201)
at javax.faces.webapp.FacesServlet.executeLifecyle(FacesServlet.java:731)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:475)
at org.apache.catalina.core.StandardWrapper.service(StandardWrapper.java:1628)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:339)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:209)
at org.glassfish.tyrus.servlet.TyrusServletFilter.doFilter(TyrusServletFilter.java:305)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:251)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:209)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:256)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:160)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:755)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:575)
at com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:99)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:159)
at org.apache.catalina.connector.CoyoteAdapter.doService(CoyoteAdapter.java:371)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:238)
at com.sun.enterprise.v3.services.impl.ContainerMapper$HttpHandlerCallable.call(ContainerMapper.java:516)
at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:213)
at org.glassfish.grizzly.http.server.HttpHandler.runService(HttpHandler.java:182)
at org.glassfish.grizzly.http.server.HttpHandler.doHandle(HttpHandler.java:156)
at org.glassfish.grizzly.http.server.HttpServerFilter.handleRead(HttpServerFilter.java:218)
at org.glassfish.grizzly.filterchain.ExecutorResolver$9.execute(ExecutorResolver.java:95)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeFilter(DefaultFilterChain.java:260)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeChainPart(DefaultFilterChain.java:177)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.execute(DefaultFilterChain.java:109)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.process(DefaultFilterChain.java:88)
at org.glassfish.grizzly.ProcessorExecutor.execute(ProcessorExecutor.java:53)
at org.glassfish.grizzly.nio.transport.TCPNIOTransport.fireIOEvent(TCPNIOTransport.java:524)
at org.glassfish.grizzly.strategies.AbstractIOStrategy.fireIOEvent(AbstractIOStrategy.java:89)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.run0(WorkerThreadIOStrategy.java:94)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.access$100(WorkerThreadIOStrategy.java:33)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy$WorkerThreadRunnable.run(WorkerThreadIOStrategy.java:114)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:569)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.run(AbstractThreadPool.java:549)
at java.lang.Thread.run(Thread.java:745)

How to set RocksDBStateBackend parameter in Flink?

I use the following code to set the RocksDBStateBackend and its options. It runs correctly locally, but cannot be submitted to the cluster.
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
RocksDBStateBackend rocksDBBackEnd = new RocksDBStateBackend("file:///Users/zsh/tmp/rocksdb");
rocksDBBackEnd.setOptions(new OptionsFactory() {
@Override
public DBOptions createDBOptions(DBOptions currentOptions) {
return currentOptions;
}
@Override
public ColumnFamilyOptions createColumnOptions(ColumnFamilyOptions currentOptions) {
final long blockCacheSize = 8 * 1024 * 1024;
final long blockSize = 4 * 1024;
final long targetFileSize = 2 * 1024 * 1024;
final long writeBufferSize = 64 * 1024 * 1024;
final int writeBufferNum = 1; //default 2
final int minBufferToMerge = 1; //default 2
return currentOptions
.setCompactionStyle(CompactionStyle.LEVEL)
.setTargetFileSizeBase(targetFileSize)
.setWriteBufferSize(writeBufferSize)
.setMaxWriteBufferNumber(writeBufferNum)
.setMinWriteBufferNumberToMerge(minBufferToMerge)
.setTableFormatConfig(
new BlockBasedTableConfig()
.setBlockCacheSize(blockCacheSize)
.setBlockSize(blockSize)
);
}
});
env.setStateBackend(rocksDBBackEnd);
....
env.execute();
When I submit my job this way:
flink run -d -c gerryzhou.metricTest target/gerryzhou.flink-1.0-SNAPSHOT.jar
it throws the exception below:
org.apache.flink.client.program.ProgramInvocationException: The program execution failed: JobManager did not respond within 60000 milliseconds
at org.apache.flink.client.program.ClusterClient.runDetached(ClusterClient.java:505)
at org.apache.flink.client.program.StandaloneClusterClient.submitJob(StandaloneClusterClient.java:103)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:442)
at org.apache.flink.client.program.DetachedEnvironment.finalizeExecute(DetachedEnvironment.java:76)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:387)
at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:838)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:259)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1086)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1133)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1130)
at org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1129)
And the jobmanager.log looks like this:
2017-06-29 15:37:16,651 WARN akka.remote.ReliableDeliverySupervisor - Association with remote system [akka.tcp://flink@10.242.98.255:51891] has failed, address is now gated for [5000] ms. Reason: [gerryzhou.metricTest$1]
2017-06-29 15:37:16,651 ERROR Remoting - gerryzhou.metricTest$1
java.lang.ClassNotFoundException: gerryzhou.metricTest$1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:677)
I have since changed my code and implemented the OptionsFactory as a separate class file, MRocksDBFactory, and use it like this: rocksDBBackEnd.setOptions(new MRocksDBFactory()); (a sketch of such a class is shown at the end of this question). The error info in jobManager.log then becomes this:
2017-06-29 16:29:27,162 WARN akka.remote.ReliableDeliverySupervisor - Association with remote system [akka.tcp://flink@10.242.98.255:52638] has failed, address is now gated for [5000] ms. Reason: [gerryzhou.MRocksDBFactory]
2017-06-29 16:29:27,163 ERROR Remoting - gerryzhou.MRocksDBFactory
java.lang.ClassNotFoundException: gerryzhou.MRocksDBFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:677)
at akka.util.ClassLoaderObjectInputStream.resolveClass(ClassLoaderObjectInputStream.scala:19)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1819)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1986)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
at scala.util.Try$.apply(Try.scala:192)
at akka.serialization.Serialization.deserialize(Serialization.scala:98)
at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)
at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)
at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)
at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:967)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:437)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Can anybody help me?
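For reference, here is a minimal sketch of what such a separate MRocksDBFactory class might look like, assuming it simply carries over the options from the anonymous class shown earlier; the package and class name are taken from the log above, everything else is illustrative:

package gerryzhou;

import org.apache.flink.contrib.streaming.state.OptionsFactory;
import org.rocksdb.BlockBasedTableConfig;
import org.rocksdb.ColumnFamilyOptions;
import org.rocksdb.CompactionStyle;
import org.rocksdb.DBOptions;

// Sketch: same tuning values as the anonymous OptionsFactory in the question.
public class MRocksDBFactory implements OptionsFactory {

    @Override
    public DBOptions createDBOptions(DBOptions currentOptions) {
        return currentOptions;
    }

    @Override
    public ColumnFamilyOptions createColumnOptions(ColumnFamilyOptions currentOptions) {
        return currentOptions
            .setCompactionStyle(CompactionStyle.LEVEL)
            .setTargetFileSizeBase(2 * 1024 * 1024)
            .setWriteBufferSize(64 * 1024 * 1024)
            .setMaxWriteBufferNumber(1)
            .setMinWriteBufferNumberToMerge(1)
            .setTableFormatConfig(
                new BlockBasedTableConfig()
                    .setBlockCacheSize(8 * 1024 * 1024)
                    .setBlockSize(4 * 1024));
    }
}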

Mongo timed out on save

I'm running into a problem using spring-data-mongo (1.5.2) and mongo-java-driver 2.9.2.
This project lives in a Spring container and is used as a jar in another application.
Functional tests work fine (saving, retrieving, everything), but when I include this jar in my other application and obtain the needed DAO, I can retrieve objects but not store them.
The error I get is
com.mongodb.MongoTimeoutException: Timed out while waiting for a server that matches {serverSelectors=[ReadPreferenceServerSelector{readPreference=primary}, LatencyMinimizingServerSelector{acceptableLatencyDifference=15 ms}]} after 10000 ms
What I'm starting to think is that it has something to do with the transaction being lost while handing the DAO/repository over to the (non-Spring) application.
EDIT: No, it's not lost, as I tried including the jar in another application and it worked... Something must be wrong within the first application.
This is the result of rs.status() executed on the primary:
{
"set" : "mongodb.vms",
"date" : ISODate("2014-08-29T06:54:04Z"),
"myState" : 1,
"members" : [
{
"_id" : 0,
"name" : "server1-webtec-vm01:27017",
"health" : 1,
"state" : 1,
"stateStr" : "PRIMARY",
"uptime" : 1191318,
"optime" : Timestamp(1408545246, 2),
"optimeDate" : ISODate("2014-08-20T14:34:06Z"),
"electionTime" : Timestamp(1408104842, 1),
"electionDate" : ISODate("2014-08-15T12:14:02Z"),
"self" : true
},
{
"_id" : 1,
"name" : "server2-webtec-vm02.ch.mycompany.net:27017",
"health" : 1,
"state" : 2,
"stateStr" : "SECONDARY",
"uptime" : 1191317,
"optime" : Timestamp(1408545246, 2),
"optimeDate" : ISODate("2014-08-20T14:34:06Z"),
"lastHeartbeat" : ISODate("2014-08-29T06:54:03Z"),
"lastHeartbeatRecv" : ISODate("2014-08-29T06:54:03Z"),
"pingMs" : 0,
"syncingTo" : "server1-webtec-vm01:27017"
}
],
"ok" : 1
}
Here is the whole exception:
Not able to start the 'ch.MYCOMPANY.escenic.engine.service.MongoCountryService' service
org.springframework.dao.DataAccessResourceFailureException: Timed out while waiting for a server that matches {serverSelectors=[ReadPreferenceServerSelector{readPreference=primary}, LatencyMinimizingServerSelector{acceptableLatencyDifference=15 ms}]} after 10000 ms; nested exception is com.mongodb.MongoTimeoutException: Timed out while waiting for a server that matches {serverSelectors=[ReadPreferenceServerSelector{readPreference=primary}, LatencyMinimizingServerSelector{acceptableLatencyDifference=15 ms}]} after 10000 ms
at org.springframework.data.mongodb.core.MongoExceptionTranslator.translateExceptionIfPossible(MongoExceptionTranslator.java:71)
at org.springframework.data.mongodb.core.MongoTemplate.potentiallyConvertRuntimeException(MongoTemplate.java:1918)
at org.springframework.data.mongodb.core.MongoTemplate.execute(MongoTemplate.java:412)
at org.springframework.data.mongodb.core.MongoTemplate.saveDBObject(MongoTemplate.java:945)
at org.springframework.data.mongodb.core.MongoTemplate.doSave(MongoTemplate.java:885)
at org.springframework.data.mongodb.core.MongoTemplate.save(MongoTemplate.java:833)
at org.springframework.data.mongodb.repository.support.SimpleMongoRepository.save(SimpleMongoRepository.java:72)
at org.springframework.data.mongodb.repository.support.SimpleMongoRepository.save(SimpleMongoRepository.java:87)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.executeMethodOn(RepositoryFactorySupport.java:405)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:390)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:344)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at com.sun.proxy.$Proxy30.save(Unknown Source)
at ch.MYCOMPANY.escenic.engine.service.MongoCountryService.checkCountries(MongoCountryService.java:133)
at ch.MYCOMPANY.escenic.engine.service.MongoCountryService.startService(MongoCountryService.java:35)
at neo.nursery.AbstractNurseryService.doStartService(AbstractNurseryService.java:245)
at sun.reflect.GeneratedMethodAccessor62.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.escenic.phoenix.admin.servlet.ECEBrowserHelper.displayMethod(ECEBrowserHelper.java:343)
at org.apache.jsp.browser_jsp._jspService(browser_jsp.java:160)
at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:432)
at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:390)
at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at com.escenic.servlet.TopFilter.doFilterImpl(TopFilter.java:135)
at com.twelvemonkeys.servlet.GenericFilter.doFilter(GenericFilter.java:208)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
at com.googlecode.psiprobe.Tomcat70AgentValve.invoke(Tomcat70AgentValve.java:38)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1040)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:607)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:315)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: com.mongodb.MongoTimeoutException: Timed out while waiting for a server that matches {serverSelectors=[ReadPreferenceServerSelector{readPreference=primary}, LatencyMinimizingServerSelector{acceptableLatencyDifference=15 ms}]} after 10000 ms
at com.mongodb.BaseCluster.getServer(BaseCluster.java:87)
at com.mongodb.DBTCPConnector.getServer(DBTCPConnector.java:654)
at com.mongodb.DBTCPConnector.access$300(DBTCPConnector.java:39)
at com.mongodb.DBTCPConnector$MyPort.getConnection(DBTCPConnector.java:503)
at com.mongodb.DBTCPConnector$MyPort.get(DBTCPConnector.java:451)
at com.mongodb.DBTCPConnector.getPrimaryPort(DBTCPConnector.java:409)
at com.mongodb.DBCollectionImpl.update(DBCollectionImpl.java:263)
at com.mongodb.DBCollection.update(DBCollection.java:191)
at com.mongodb.DBCollection.save(DBCollection.java:975)
at com.mongodb.DBCollection.save(DBCollection.java:934)
at org.springframework.data.mongodb.core.MongoTemplate$10.doInCollection(MongoTemplate.java:950)
at org.springframework.data.mongodb.core.MongoTemplate.execute(MongoTemplate.java:410)
... 53 more
