I am using an ivy.xml file to manage the dependencies I need. To pull in a dependency, I write, for example:
<dependency org="org.hibernate" name="hibernate-entitymanager" rev="3.6.6.Final">
    <artifact name="hibernate-entitymanager" type="jar" />
</dependency>
When searching the Maven Central Repository for CXF, I found the following Ivy dependency information:
<dependency org="org.apache.cxf" name="cxf" rev="2.5.2">
    <artifact name="cxf" type="pom" />
</dependency>
That dependency has pom instead of jar as its type. How can I get the jars for CXF with Ivy?
Apache CXF is split into several JARs. I don't know Ivy well, but you probably need to import them separately, for example:
<dependency org="org.apache.cxf" name="cxf" rev="2.5.2"/>
<dependency org="org.apache.cxf" name="cxf-rt-frontend-jaxws" rev="2.5.2"/>
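If Ivy still resolves only the pom with that short form, you can request the jar artifacts explicitly, in the same form as your hibernate example. This is only a sketch: cxf-rt-transports-http is one commonly needed module, but which cxf-rt-* modules you need depends on the features you use.
<!-- explicit jar artifacts; the module list is an assumption, adjust to your needs -->
<dependency org="org.apache.cxf" name="cxf-rt-frontend-jaxws" rev="2.5.2">
    <artifact name="cxf-rt-frontend-jaxws" type="jar" />
</dependency>
<dependency org="org.apache.cxf" name="cxf-rt-transports-http" rev="2.5.2">
    <artifact name="cxf-rt-transports-http" type="jar" />
</dependency>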
I need to save the table result as ORC on S3, and this is how I do it:
tEnv.createTemporaryTable("my_output_table",
    TableDescriptor.forConnector("filesystem")
        .schema(outputSchema)
        .option("path", s3OutputPath)
        .format(FormatDescriptor.forFormat("orc").build())
        .build());
finalResultToInsert.executeInsert("my_output_table");
However, at runtime it throws this error:
Caused by: org.apache.flink.table.api.ValidationException: Could not find any format factory for identifier 'orc' in the classpath.
at org.apache.flink.table.filesystem.FileSystemTableSink.<init>(FileSystemTableSink.java:128) ~[flink-table_2.12-1.14.2.jar:1.14.2]
at org.apache.flink.table.filesystem.FileSystemTableFactory.createDynamicTableSink(FileSystemTableFactory.java:87) ~[flink-table_2.12-1.14.2.jar:1.14.2]
at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:179) ~[flink-table_2.12-1.14.2.jar:1.14.2]
at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:394) ~[flink-table_2.12-1.14.2.jar:1.14.2]
......
I have already included the relevant dependency:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-orc_2.11</artifactId>
    <version>1.14.2</version>
</dependency>
And in the generated jar file I can see the Flink ORC classes:
org/apache/flink/orc/
org/apache/flink/orc/AbstractOrcFileInputFormat.class
org/apache/flink/orc/OrcFileFormatFactory$1.class
org/apache/flink/orc/OrcFilters$LessThanEquals.class
org/apache/flink/orc/AbstractOrcFileInputFormat$OrcVectorizedReader.class
org/apache/flink/orc/OrcFilters$Not.class
org/apache/flink/orc/OrcFileFormatFactory.class
org/apache/flink/orc/OrcColumnarRowSplitReader.class
org/apache/flink/orc/OrcColumnarRowSplitReader$ColumnBatchGenerator.class
org/apache/flink/orc/AbstractOrcFileInputFormat$OrcReaderBatch.class
org/apache/flink/orc/OrcFilters$In.class
......
So I really don't understand why it still cannot find it in the classpath.
As a side note, the project also includes the flink-avro dependency, and if I change the output format from orc to avro it works fine.
Also, I am running the job on AWS EMR. The EMR release is 6.6.0, which ships Flink 1.14.2: https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-660-release.html
Could anyone help with that? Thanks a lot!
If I'm not mistaken, ORC requires Hadoop. So you should also add org.apache.hadoop:hadoop-common and potentially other Hadoop dependencies too.
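The answer names org.apache.hadoop:hadoop-common but gives no coordinates. A minimal sketch of the Maven entry, assuming you match the Hadoop version your cluster runs (the version and scope below are assumptions to verify):
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <!-- assumption: EMR 6.6.0 ships Hadoop 3.2.1; verify on your cluster -->
    <version>3.2.1</version>
    <!-- often provided on EMR, since the cluster already has Hadoop on the classpath -->
    <scope>provided</scope>
</dependency>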
OK, it looks like I resolved the problem by placing
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-orc_2.11</artifactId>
    <version>1.14.2</version>
</dependency>
at the top of the dependencies section in the Maven pom, and it worked.
I think it is just due to the order of class loading: putting it in front resolves the conflicts during class loading for the flink-orc dependency, since it will be loaded first.
java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
at org.apache.flink.core.execution.DefaultExecutorServiceLoader.getExecutorFactory(DefaultExecutorServiceLoader.java:84)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:1803)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1713)
at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:74)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1699)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1681)
at com.cep.StaticAlarmGenerationEntryTest.main(StaticAlarmGenerationEntryTest.java:149)
This is the error I met after I upgraded Flink from 1.10 to 1.11; my IDE is Eclipse.
I tried adding the artifact flink-clients_${scala.binary.version}, but it still failed. If anybody has already met and solved this issue, please tell me. Thanks a lot.
See the 1.11 release notes, where you now have to add an explicit dependency on flink-clients.
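For reference, a minimal sketch of that dependency, using the 1.11.0 version from the question (adjust to your Flink version):
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_${scala.binary.version}</artifactId>
    <version>1.11.0</version>
</dependency>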
I solved my problem this way:
1. Use Java 8; apparently Flink has some sort of problem with Java 11 or 15.
2. Change all the scopes to "compile". You can change scopes under Project Structure → Modules → Dependencies, where one of the table's columns is named Scope. A pom-level sketch of the same change follows below.
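The path above is the IDE's UI. As a sketch of the pom-level equivalent: the Flink quickstart poms mark the core Flink dependencies as provided so they are kept out of the fat jar, which can leave them off the runtime classpath when running from the IDE. Setting the scope to compile mirrors the IDE change (artifact and version below are assumptions matching the question's 1.11 setup):
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.12</artifactId>
    <version>1.11.0</version>
    <!-- quickstart poms use provided; compile lets the IDE run the job locally -->
    <scope>compile</scope>
</dependency>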
I found the reason why the error happened even though I added the flink-clients dependency. I upgraded Flink from 1.10 to 1.11 but only edited the Flink version, not the Scala version. The Scala version should also be changed to 2.12: the project was generated from the 1.10 archetype with Scala 2.11, so every build used the 2.11 environment.
So the fast way to solve this issue is:
1. Generate a new project with: mvn archetype:generate -DarchetypeGroupId=org.apache.flink -DarchetypeArtifactId=flink-quickstart-java -DarchetypeVersion=1.11.0
2. Copy all your old code into the new project. You will find that flink-clients is already added in the pom.xml.
I had this problem when I was packaging the Flink job into a shaded jar. When shading, if multiple jars contain a file with the same name, the file is overwritten as each jar is unzipped into the new shaded jar.
Flink uses the file META-INF/services/org.apache.flink.core.execution.PipelineExecutorFactory to discover different executor factories and this file is present in multiple jars, each with different contents.
To fix this, I had to tell the maven-shade-plugin to combine these files as it came across them, and this solved the problem for me.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.0</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <shadedClassifierName>job</shadedClassifierName>
                <transformers>
                    <!-- add this to combine the PipelineExecutorFactory files into one -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/services/org.apache.flink.core.execution.PipelineExecutorFactory</resource>
                    </transformer>
                </transformers>
                ...
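As an aside, not part of the original answer: the maven-shade-plugin also ships a ServicesResourceTransformer that merges every META-INF/services file automatically, so you do not have to list each resource by hand:
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>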
I get a compile error from Maven when building a project under Java 9 that uses the Oracle JavaMail IMAP provider:
... cannot access com.sun.mail.util.ReadableMime
[ERROR] class file for com.sun.mail.util.ReadableMime not found
(ReadableMime is an interface implemented by IMAPMessage)
It works under Java 8.
The dependencies are:
<dependency>
    <groupId>javax.mail</groupId>
    <artifactId>javax.mail-api</artifactId>
    <version>1.6.0</version>
</dependency>
<dependency>
    <groupId>com.sun.mail</groupId>
    <artifactId>imap</artifactId>
    <version>1.6.0</version>
</dependency>
Neither javax.mail-api nor imap includes a com.sun.mail.util package, and it seems it is no longer part of the JRE (if it ever officially was).
So, I guess this is a bug in the Oracle imap provider that pops up now in the presence of Jigsaw, or am I missing something?
Interestingly, the Eclipse compiler doesn't complain.
Thanks to EJP:
com.sun.mail:imap works with com.sun.mail:javax.mail, but not with javax.mail:javax.mail-api.
The latter does not include the com.sun.mail.util package.
com.sun.mail:imap does not declare any Maven dependencies, but de facto it has a compile dependency on com.sun.mail:javax.mail.
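In pom terms, the fix is to depend on the reference-implementation jar rather than the API-only jar. A sketch using the 1.6.0 version from the question:
<dependency>
    <groupId>com.sun.mail</groupId>
    <artifactId>javax.mail</artifactId>
    <version>1.6.0</version>
</dependency>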
I'm trying to package my WPF application with the NuGet pack command. So far I found out that adding -IncludeReferencedProjects resolves the problem of referenced projects not getting packed.
./nuget.exe pack {path}.csproj -NonInteractive -OutputDirectory C:\test -Properties Configuration=release -version 0.1.0 -Verbosity Detailed -IncludeReferencedProjects
The problem I face is that the project's dependencies are not getting packed, even though they do show up in the log, as you can see below: Dependencies: EntityFramework.
But the dependencies are never added to the package; even when I manually inspect or deploy the package, only Data.dll and {name}.exe are deployed.
What I've already tried (by searching Google/SO):
Adding a NuSpec file for the csproj file
NuGet.config with a reference to the correct packages folder (in case NuGet did not find it)
EDIT: added generated .nuspec
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
    <metadata>
        <id>{path}.UpgradeDatabase</id>
        <version>0.1.4</version>
        <title>{path}.UpgradeDatabase</title>
        <authors>stephanbisschop</authors>
        <owners>stephanbisschop</owners>
        <requireLicenseAcceptance>false</requireLicenseAcceptance>
        <description>Description</description>
        <copyright>Copyright © 2016</copyright>
        <dependencies>
            <dependency id="EntityFramework" version="6.1.3" />
        </dependencies>
    </metadata>
</package>
Thank you in advance,
Stephan
Problem solved.
It wasn't my NuGet package that was missing the dependencies; it was Octopus Deploy that wasn't restoring the packages. We fixed it by adding entries in the appropriate .nuspec files. Now all the .dll, .config, etc. files get packaged with the package.
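The answer does not show the exact entries that were added. Purely as an illustration, a .nuspec can list the payload explicitly in a <files> section; the paths below are hypothetical and would need to match your build output:
<files>
    <file src="bin\Release\EntityFramework.dll" target="lib\net45" />
    <file src="bin\Release\{name}.exe.config" target="lib\net45" />
</files>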
I am using Apache Camel with WebSphere. We have had some classloader issues.
I understand that Camel provides a WebSphere classloader, but I cannot find an example of how to use it.
I tried putting this in the applicationContext.xml file:
<bean id="WebsphereResolver" class="org.apache.camel.impl.WebSpherePackageScanClassResolver" />
but I got the error:
Caused by: java.lang.NoSuchMethodException: org.apache.camel.impl.WebSpherePackageScanClassResolver.<init>()
What is the correct format?
It turns out the problem was that we were using an old version of fasterxml, as defined in our pom.xml. Once we updated the pom.xml to use a newer version of fasterxml, the problem went away. The Apache Camel XML bean uses fasterxml if it is available, but it assumes you are using a recent version.
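The answer does not name the exact fasterxml artifact that was updated. As an illustration only, assuming it refers to the common Jackson artifacts under the com.fasterxml.jackson groupId, the pom update might look like this (artifact and version are assumptions; pick a current release):
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <!-- assumption: the answer only says "a new version of fasterxml" -->
    <version>2.13.0</version>
</dependency>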