I want to call Java interfaces packaged in a jar file from a PyFlink job. I could not find a solution in the official documentation.
It looks to me like support for this was not included in Flink 1.9, but is ongoing work. See FLIP-58. FLIP-78 and FLIP-88 may also be of interest. Note that most of these improvements will be included in the upcoming Flink 1.10 release.
You can use the Python Table API to register a Java user-defined function, if that satisfies your need. The method is register_java_function on the table environment.
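For illustration, here is a minimal sketch of what that can look like on the Python side. The class name com.example.HashCode and the jar path are made-up placeholders, and it assumes Flink 1.10+, where register_java_function and the pipeline.jars option are available:

    from pyflink.datastream import StreamExecutionEnvironment
    from pyflink.table import StreamTableEnvironment

    env = StreamExecutionEnvironment.get_execution_environment()
    table_env = StreamTableEnvironment.create(env)

    # Ship the jar containing the Java UDF with the job
    # (the jar path is a placeholder).
    table_env.get_config().get_configuration().set_string(
        "pipeline.jars", "file:///path/to/your-udfs.jar")

    # Register the Java ScalarFunction under a name callable from SQL / Table API.
    table_env.register_java_function("hash_code", "com.example.HashCode")

    # result = table_env.sql_query("SELECT hash_code(name) FROM MyTable")

The Java class itself just needs to extend one of Flink's UDF base classes (e.g. ScalarFunction) and be on the job's classpath.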
We are running a Flink job on v1.13.2 and have set up and configured logging using log4j (classic, pre-Log4j2). We want to upgrade to and use Log4j2 instead and could not find any way to do that. Wondering if there are any teams who went down this path to try to upgrade Log4j. Thanks.
Log4j2 has been the default logger since Flink 1.11. If you are still using log4j v1, there must be some configuration in place that needs to be removed or updated. See the documentation for details.
Although Flink's log configuration file is named log4j.properties, it actually uses Log4j2, as David said.
Is it possible to use PyFlink with Python machine learning libraries such as LightGBM for a streaming application? Is there any good example of this?
There is no complete example, but you can take a look at Getting Started with Flink Python and then look at how Python UDFs can be used: UDFs in the Table API.
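To give a flavour of the idea, here is a minimal, hypothetical sketch of wrapping a pre-trained LightGBM model in a Python UDF. The model path, the column layout and the PyFlink 1.12+ decorator style are my assumptions, not something from the docs:

    from pyflink.table import DataTypes
    from pyflink.table.udf import udf

    _booster = None  # cache the model so it is only loaded once per Python worker

    def _get_booster():
        global _booster
        if _booster is None:
            import lightgbm as lgb
            _booster = lgb.Booster(model_file="/path/to/model.txt")  # placeholder path
        return _booster

    @udf(result_type=DataTypes.DOUBLE())
    def predict(f1, f2):
        import numpy as np
        # LightGBM expects a 2-D array; score a single row and return its value.
        return float(_get_booster().predict(np.array([[f1, f2]]))[0])

    # table_env.create_temporary_function("predict", predict)
    # table_env.sql_query("SELECT predict(f1, f2) FROM sensor_stream")

Keep in mind that lightgbm and numpy then have to be installed in the Python environment of the workers, e.g. via table_env.set_python_requirements.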
Is there a way to get an Apache Camel component's "metadata" using Java code, such as the list of options and other parameters and their types? I think an automatic help builder was mentioned somewhere that might be of use for this task without resorting to reflection.
A way to get the registered components of all types (including data formats and languages) with Java code is also sought. Thanks.
Yeah, take a look at the camel-catalog JAR, which includes all such details. This JAR is what the tooling uses, such as some of the Maven tooling itself, or the IDE plugins for IntelliJ or Eclipse etc. The JAR has both a Java API and metadata files embedded in it that you can load.
At runtime you can also access this catalog via RuntimeCamelCatalog, which you can get hold of via CamelContext. The runtime catalog is a little bit more limited than CamelCatalog, as it only has a view of what is actually available at runtime in the current Camel application.
Also, I cover this in my book Camel in Action 2nd edition, where there is a full chapter devoted to Camel tooling and how to build custom tooling etc.
This is what I've found so far:
http://camel.apache.org/componentconfiguration.html
When packaging a Flink job which uses, for example, some Flink connectors and some third-party libraries (for processing), which dependencies should end up in the job's jar so that it can be launched on a Flink cluster using "flink run [jarfile]"?
Is making a fat-jar the desired approach?
If writing a job in Scala, do you include the Scala standard library in the jar?
I didn't find any documentation on how to package a job for Flink once it is written.
Yes, a fat-jar is the standard way to package a Flink job. Everything that is already contained in the Flink distribution must not be included (i.e., the Java and Scala standard libraries, Flink core, ...). Only the Flink libraries that are not part of the distribution (plus user-defined external dependencies) must be included in the fat-jar.
You can follow this guideline from the Flink documentation: https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/cluster_execution.html#linking-with-modules-not-contained-in-the-binary-distribution
This might also be helpful: https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/common/index.html#program-packaging-and-distributed-execution
I am new to ServiceMix; I downloaded ServiceMix 4.5.1 a couple of days ago.
When I tried to install ODE in ServiceMix using the command
features:install ode
it told me this:
Error executing command: No feature named 'ode' with version '0.0.0' available
I searched a lot on Google and Baidu, and the bad news I found is that:
"Fuse ESB 4.4 does not support Apache ODE. The latest version of ODE is not compatible with Fuse ESB."
which comes from
http://fusesource.com/forums/thread.jspa?messageID=11209
Fuse ESB - ODE installation
So if ServiceMix 4.4 no longer supports ODE, what is the alternative way to do web service orchestration in ServiceMix? I have tried using Camel to do this work, but that's not easy.
How about bpel-g (http://code.google.com/p/bpel-g/)? Is it a good choice, or is there any other choice?
Any help will be really appreciated.
I like Activiti for processes and orchestration.
I've never run it inside Karaf/SMX/Fuse ESB, but it should be possible, if nothing else by following this instruction.
It also has a nice web explorer for human tasks etc. if you need it, and a BPMN modeller for rapid design and visualization.
I would recommend trying bpel-g. A colleague and I have been doing some BPEL conformance benchmarking lately (FYI: the benchmarking tool is available on GitHub), and bpel-g turned out to have the highest degree of support for the BPEL spec, along with the older ActiveBPEL engine from which bpel-g is a fork. ODE ranked third.
Another nice feature of bpel-g is that it is indeed actively maintained. I don't know how well it integrates into the infrastructure of Fuse ESB, but since it's deployable as a war, this shouldn't be much of a problem.
UPDATE: Just looked it up: bpel-g seems to integrate with Camel and provides a custom handler to invoke Camel components. So, basically, the solution outlined in Petter's answer also applies to bpel-g and, in contrast to Activiti, it has a message correlation framework. Finally, the barrier to using it should be smaller, as you already know BPEL. As a consequence, bpel-g might be a more suitable solution here.