I would like to write to S3 from Flink 1.4.2 using the Presto interface and BucketingSink. I followed the instructions: I added s3.access-key and s3.secret-key to flink-conf.yaml and put flink-s3-fs-presto-1.4.2.jar in the lib folder.
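The keys are set like this in flink-conf.yaml (actual values redacted):
s3.access-key: <my-access-key>
s3.secret-key: <my-secret-key>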
If the job is executed in an AWS environment, I hope that I don't need to set up keys at all. Is this assumption correct? Below is the error that is produced:
java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively).
at org.apache.hadoop.fs.s3.S3Credentials.initialize(S3Credentials.java:70)
at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:93)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy17.initialize(Unknown Source)
at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:91)
at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.createHadoopFileSystem(BucketingSink.java:1206)
at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.initFileSystem(BucketingSink.java:411)
at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.initializeState(BucketingSink.java:355)
at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.tryRestoreFunction(StreamingFunctionUtils.java:178)
at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.restoreFunctionState(StreamingFunctionUtils.java:160)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.initializeState(AbstractUdfStreamOperator.java:96)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:258)
at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeOperators(StreamTask.java:694)
at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:682)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
at java.lang.Thread.run(Thread.java:748)
The application seems not to be using the flink-s3-fs-presto at all, but Hadoop's deprecated old S3 File System. The stack trace you pasted indicates that the flink-s3-fs-presto is not picked up for the file system scheme 's3://'.
Please make sure that the flink-s3-fs-presto JAR file is really in the lib folder of the TaskManagers that execute the job, not only on the client.
When you use YARN or Mesos to deploy Flink jobs, that should automatically happen.
When you deploy Flink via containers, make sure that the JAR file is in the lib folder of your container image.
When you run Flink TaskManagers standalone or manually, make sure all TaskManagers in the cluster have the JAR file in the lib folder before being started.
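On a standalone setup, for example, that could look like this on every machine (a sketch, assuming the JAR is available in the opt folder of the Flink distribution; adjust paths to your install):
cp ./opt/flink-s3-fs-presto-1.4.2.jar ./lib/
./bin/stop-cluster.sh && ./bin/start-cluster.sh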
Related
I have an Apache Beam application (using Beam version 2.23.0) that I am trying to deploy on AWS EMR (emr-5.30.1) with Flink (1.10.0) preinstalled.
The application runs with no issues when I deploy it on my local Docker Flink cluster. But when I run
flink run -m yarn-cluster -c my_class my_jar.jar
on the master node of the EMR cluster,
I get:
java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1054)
at org.apache.beam.sdk.options.PipelineOptionsFactory.<clinit>(PipelineOptionsFactory.java:471)
at org.myapp.main(MainApp.java:78)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:321)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:664)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:895)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:968)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:968)
It seems like the issue is with org.apache.beam.sdk.options.PipelineOptionsFactory.<clinit>(PipelineOptionsFactory.java:471), but I am not clear on what is causing this behaviour.
Can someone please advise what may cause this?
Thank you in advance!
That is probably a classloading issue.
On the EMR EC2 instances, some JARs are already present, and these libraries are loaded before your own dependencies. So the version used at runtime is the one provided by EMR, not the one you declare as a dependency in your own pom.xml.
There are multiple solutions:
in your pom.xml, use the same versions as the ones provided by EMR
on the EC2 instances, replace the EMR versions with yours
change the order of library loading (see the config sketch below)
Whatever the solution, you need to ship all the required dependencies to Flink, not only the JAR that contains your own code.
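For the third option: a "not a subtype" ServiceConfigurationError typically means the same Jackson classes were loaded by two different classloaders. Flink can force such packages to be resolved consistently from the parent classloader via flink-conf.yaml; a sketch (verify the exact key against the docs of your Flink version):
classloader.parent-first-patterns.additional: com.fasterxml.jackson.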
I tried to deploy my application to Flink on YARN with the CLI. Unfortunately, it fails with the exception below:
java.lang.NoClassDefFoundError: Lredis/clients/jedis/JedisCluster;
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
at java.lang.Class.getDeclaredFields(Class.java:1916)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:72)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:1548)
at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:183)
at org.apache.flink.streaming.api.datastream.DataStream.flatMap(DataStream.java:551)
at org.apache.flink.streaming.api.scala.DataStream.flatMap(DataStream.scala:594)
at com.hypers.hwt.realtime.top.HwtRealTimeTopRunner.executeLateStream(HwtRealTimeTop.scala:138)
at com.hypers.hwt.realtime.top.HwtRealTimeTopRunner.run(HwtRealTimeTop.scala:72)
at com.hypers.hwt.realtime.top.HwtRealTimeTop$.main(HwtRealTimeTop.scala:265)
at com.hypers.hwt.realtime.top.HwtRealTimeTop.main(HwtRealTimeTop.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:381)
at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:838)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:259)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1086)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1133)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1130)
at org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1130)
I already use the -yt parameter to distribute my external jars, but it still fails.
Actually, Flink submits a job in 3 steps:
the client wraps the code and builds the job graph
the client submits the job to the JobManager
the JobManager distributes the job to the TaskManagers
Problem
After extensive testing, I found that this exception happens in step 1, and step 1 runs locally via YarnClusterClient. I know this problem can be solved by adding my external jars to $FLINK_HOME/lib, but that would cause conflicts with other applications.
Expectation
So I want to know: is there any way to add external jars to the classpath locally?
Addition
import org.apache.commons.pool2.impl.GenericObjectPoolConfig
import org.apache.flink.api.common.functions.RichFlatMapFunction
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.connectors.redis.common.config.FlinkJedisClusterConfig
import redis.clients.jedis.JedisCluster

// PvAccBean and UvAccBean are the application's own types, defined elsewhere.
class LateFlatMap(conf: FlinkJedisClusterConfig) extends RichFlatMapFunction[(PvAccBean, UvAccBean), Iterable[(String, Array[Byte])]] {

  var jedisCluster: JedisCluster = null

  // Connects to Redis when the operator is opened on the TaskManager.
  override def open(properties: Configuration): Unit = {
    val genericObjectPoolConfig = new GenericObjectPoolConfig()
    genericObjectPoolConfig.setMaxIdle(conf.getMaxIdle())
    genericObjectPoolConfig.setMaxTotal(conf.getMaxTotal())
    genericObjectPoolConfig.setMinIdle(conf.getMinIdle())
    jedisCluster = new JedisCluster(conf.getNodes(), conf.getConnectionTimeout(),
      conf.getMaxRedirections(), genericObjectPoolConfig)
  }

  override def close(): Unit = {
    jedisCluster.close()
  }

  ...
}
Basically I see two possibilities:
Add your 3rd-party libs to your job's JAR by building a fat JAR. Every major build system can do this (e.g. the Maven Assembly Plugin or the SBT Assembly Plugin; see the sketch after this list). This would be my preferred solution.
If you want to use your 3rd-party libs for all of your Flink jobs, you can add them to Flink's lib directory before you start your cluster. This will also work but offers you less flexibility.
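For the fat-JAR route, a minimal sbt-assembly sketch (plugin and dependency versions are placeholders; Flink itself is marked provided so it is not bundled):
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
// build.sbt -- Flink stays provided; 3rd-party libs like jedis go into the fat JAR
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-streaming-scala" % "<your-flink-version>" % "provided",
  "redis.clients" % "jedis" % "2.9.0"
)
Running sbt assembly then produces a single JAR that you can submit with flink run.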
Hope that helps
I tried all the options combined, -C and -yt, with the external jars added to the classpaths and to yarn.ship.directories, but it fails when initializing the MQ connection factory. The same setup works when the jars are placed in Flink's lib folder.
I wonder why this still doesn't work at the end of 2020.
Try using
bin/start-scala-shell.sh local -a <full_external_jar_path>
I am trying to configure the latest version of WSO2 AM 2.1.0 to use a Microsoft SQL Server database on a Windows server.
The databases have been created with user accounts, and I am trying to start the application with the startup flag that auto-creates the tables.
The command I am running is: wso2server.bat -Dsetup
Some of the tables are created for the product, but I receive the following error in the console window during startup. I have checked, and the folder is missing from the distribution. Any help on a way forward?
[2017-08-09 11:24:47,614] ERROR - IdentityCoreServiceComponent Error occurred while populating identity configuration properties
org.wso2.carbon.identity.base.IdentityRuntimeException: java.io.FileNotFoundException: D:\Software\2.1\wso2am-2.1.0\bin\..\dbscripts\identity\mssql.sql (The system cannot find the path specified)
at org.wso2.carbon.identity.base.IdentityRuntimeException.error(IdentityRuntimeException.java:71)
at org.wso2.carbon.identity.core.persistence.IdentityDBInitializer.executeSQLScript(IdentityDBInitializer.java:273)
at org.wso2.carbon.identity.core.persistence.IdentityDBInitializer.createIdentityDatabase(IdentityDBInitializer.java:141)
at org.wso2.carbon.identity.core.persistence.JDBCPersistenceManager.initializeDatabase(JDBCPersistenceManager.java:112)
at org.wso2.carbon.identity.core.internal.IdentityCoreServiceComponent.activate(IdentityCoreServiceComponent.java:130)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.equinox.internal.ds.model.ServiceComponent.activate(ServiceComponent.java:260)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.activate(ServiceComponentProp.java:146)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.build(ServiceComponentProp.java:345)
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponent(InstanceProcess.java:620)
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponents(InstanceProcess.java:197)
at org.eclipse.equinox.internal.ds.Resolver.getEligible(Resolver.java:343)
at org.eclipse.equinox.internal.ds.SCRManager.serviceChanged(SCRManager.java:222)
at org.eclipse.osgi.internal.serviceregistry.FilteredServiceListener.serviceChanged(FilteredServiceListener.java:107)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.dispatchEvent(BundleContextImpl.java:861)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:148)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEventPrivileged(ServiceRegistry.java:819)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEvent(ServiceRegistry.java:771)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistrationImpl.register(ServiceRegistrationImpl.java:130)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.registerService(ServiceRegistry.java:214)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.registerService(BundleContextImpl.java:433)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.registerService(BundleContextImpl.java:451)
at org.wso2.carbon.core.init.CarbonServerManager.initializeCarbon(CarbonServerManager.java:514)
at org.wso2.carbon.core.init.CarbonServerManager.removePendingItem(CarbonServerManager.java:290)
at org.wso2.carbon.core.init.PreAxis2ConfigItemListener.bundleChanged(PreAxis2ConfigItemListener.java:118)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.dispatchEvent(BundleContextImpl.java:847)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:340)
Caused by: java.io.FileNotFoundException: D:\Software\2.1\wso2am-2.1.0\bin\..\dbscripts\identity\mssql.sql (The system cannot find the path specified)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:93)
at org.wso2.carbon.identity.core.persistence.IdentityDBInitializer.executeSQLScript(IdentityDBInitializer.java:235)
... 30 more
Thanks,
Gary
You have 2 options.
1) Run the database scripts manually on the created databases, and start the server without -Dsetup.
2) Go to <APIM_HOME>/dbscripts/. Make a copy of apimgt and name it identity. Then start the server with -Dsetup.
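For option 2, using the install path from your log, the copy would look roughly like this on Windows (a sketch; adjust the path to your environment):
cd /d D:\Software\2.1\wso2am-2.1.0\dbscripts
xcopy /E /I apimgt identity
..\bin\wso2server.bat -Dsetup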
I am using Flink to stream data from a CSV file. I want to put it into table format with a certain schema. For this purpose I am using flink-table_2.10-1.1.3.jar (the Table API), but I get these errors:
log4j:WARN No appenders could be found for logger (org.apache.flink.api.java.typeutils.TypeExtractor).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/shaded/calcite/com/google/common/base/Throwables
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:450)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:460)
at org.apache.calcite.rel.metadata.RelMetadataQuery.revise(RelMetadataQuery.java:186)
at org.apache.calcite.rel.metadata.RelMetadataQuery.collations(RelMetadataQuery.java:484)
at org.apache.calcite.rel.metadata.RelMdCollation.project(RelMdCollation.java:207)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:122)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:120)
at org.apache.calcite.plan.RelTraitSet.replaceIfs(RelTraitSet.java:238)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:116)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:108)
at org.apache.flink.api.table.plan.logical.Project.construct(operators.scala:90)
at org.apache.flink.api.table.plan.logical.Project.construct(operators.scala:85)
at org.apache.flink.api.table.plan.logical.LogicalNode.toRelNode(LogicalNode.scala:78)
at org.apache.flink.api.table.Table.getRelNode(table.scala:66)
at org.apache.flink.api.table.StreamTableEnvironment.translate(StreamTableEnvironment.scala:243)
at org.apache.flink.api.java.table.StreamTableEnvironment.toDataStream(StreamTableEnvironment.scala:147)
at table_streaming_test.main(table_streaming_test.java:90)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.shaded.calcite.com.google.common.base.Throwables
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 17 more
When I explore the corresponding JAR, the respective class is present there. Can you please tell me why this is happening?
Also, can I get the Maven source so that I can build the flink-table JAR myself?
I had the same problem with the CEP library. I added it to my pom file but kept getting ClassNotFoundException. I even packaged it with my JAR file via IntelliJ, but that didn't work.
If you're using their flink-quickstart archetype, I think there are some other things to change in the pom file to make it work. When I created a clean project and added the Flink dependencies myself, I didn't get that exception anymore. You can try and see if this approach works.
You can also add the flink-table JAR file to the lib folder of your Flink installation; this also fixed my problem with the CEP library. The JAR file is available on the Maven repository website; download the version you want.
According to the Table and SQL documentation on the Flink website:
Note: The Table API is currently not part of the binary distribution.
See linking with it for cluster execution here.
I was also facing the same problem with the Table API in Flink v1.4.2.
I added the flink-table_2.11-1.4.2.jar file from the opt folder to the lib folder and restarted Flink.
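Concretely (paths relative to the Flink home; adjust the Scala and Flink versions to your distribution):
cp opt/flink-table_2.11-1.4.2.jar lib/
bin/stop-cluster.sh && bin/start-cluster.sh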
This works for me. Hopefully works for you too :)
I have a program in Apache Flink. I tested and ran it on the local machine and everything works fine. To run the program on a remote cluster, I made the necessary changes as described on the Apache Flink official website.
I made the following changes:
Replaced the local execution environment
ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
with a remote one:
ExecutionEnvironment env = ExecutionEnvironment.createRemoteEnvironment("taskManagerName", portNo, parallelismNo);
Fixed the necessary paths to read input files and write outputs.
Generated a thin JAR of the program and put the necessary JAR libraries into a folder next to my project JAR file, myproj.jar.
Copied the data, the JAR libraries, and myproj.jar to the cluster and ran the following command remotely on the cluster:
java -cp pathToJarLib \\* -jar myproj.jar
But I get the error below and I don't have any clue how to fix it. There are no relevant log files that could aid me in fixing this issue.
Error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/api/common/functions/MapFunction
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2570)
at java.lang.Class.getMethod0(Class.java:2813)
at java.lang.Class.getMethod(Class.java:1663)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.api.common.functions.MapFunction
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 6 more
Your classpath is obviously not complete. Try to submit via bin/flink run myproj.jar; this sets up the classpath correctly.
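Note also that java ignores the -cp option whenever -jar is used: with java -cp pathToJarLib \\* -jar myproj.jar, the classpath is taken solely from the manifest of myproj.jar, so the Flink classes next to it are never seen. If you insisted on launching with plain java, it would have to look more like this (the main class name is a placeholder):
java -cp 'pathToJarLib/*:myproj.jar' my.package.MainClass
But bin/flink run is the intended way and takes care of the classpath for you.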