kafka flink connection error shows NoSuchMethodError - apache-flink

A new error appeared when I changed from FlinkKafkaConsumer09 to FlinkKafkaConsumer.
Flink code:
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
import java.util.Properties;
@SuppressWarnings("deprecation")
public class ReadFromKafka {
public static void main(String[] args) throws Exception {
// create execution environment
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "localhost:9092");
properties.setProperty("group.id", "test-consumer-group");
DataStream<String> stream = env
.addSource(new FlinkKafkaConsumer<String>("test4", new SimpleStringSchema(), properties));
stream.map(new MapFunction<String, String>() {
private static final long serialVersionUID = -6867736771747690202L;
@Override
public String map(String value) throws Exception {
return "Stream Value: " + value;
}
}).print();
env.execute();
}
}
ERROR:
log4j:WARN No appenders could be found for logger (org.apache.flink.api.java.ClosureCleaner).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:626)
at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:117)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1507)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1489)
at ReadFromKafka.main(ReadFromKafka.java:33)
Caused by: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.dataartisans</groupId>
<artifactId>kafka-example</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>kafkaex</name>
<description>this is flink kafka example</description>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.9.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.12</artifactId>
<version>1.9.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.12</artifactId>
<version>1.9.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_2.12</artifactId>
<version>1.9.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.9.1</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
</dependency>
</dependencies>
</project>

flink-connector-kafka_2.12 isn't compatible with FlinkKafkaConsumer09.
flink-connector-kafka_2.12 is a "universal" kafka connector, compiled for use with Scala 2.12. This universal connector can be used with any version of Kafka from 0.11.0 onward.
FlinkKafkaConsumer09 is for use with Kafka 0.9.x. If your Kafka broker is running Kafka 0.9.x, then you will need flink-connector-kafka-0.9_2.11 or flink-connector-kafka-0.9_2.12, depending on which version of Scala you want.
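If you do need the 0.9 connector, the dependency swap would look roughly like this (a sketch; the 1.9.1 version is assumed to match the rest of the pom above):
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.9_2.12</artifactId>
  <version>1.9.1</version>
</dependency>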
On the other hand, if your Kafka broker is running a recent version of Kafka (0.11.0 or newer), then stick with flink-connector-kafka_2.12 and use FlinkKafkaConsumer instead of FlinkKafkaConsumer09.
See the documentation for more info.

Related

Apache Flink: java.lang.NoClassDefFoundError for FlinkKafkaConsumer

I am trying to test a simple Flink Kafka example.
mvn package works fine, and then I ran ../../flink-1.14.3/bin/flink run -c com.company.flinktest.App target/flinktest-1.0-SNAPSHOT.jar and got the following error:
java.lang.NoClassDefFoundError: org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumer
at com.company.flinktest.App.main(App.java:38)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
My pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.company.flinktest</groupId>
<artifactId>flinktest</artifactId>
<version>1.0-SNAPSHOT</version>
<name>flinktest</name>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.14.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.14.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.12</artifactId>
<version>1.14.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.12</artifactId>
<version>1.14.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_2.12</artifactId>
<version>1.14.3</version>
<scope>Compile</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<version>3.0.2</version>
<configuration>
<archive>
<manifest>
<mainClass>com.company.flinktest.App</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</build>
</project>
Folder structure: src/main/java/com/company/flinktest/App.java
Editor: VS Code
App.java Code:
package com.company.flinktest;
import java.util.Properties;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import java.util.InvalidPropertiesFormatException;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
public class App{
public static void main(String[] args) throws Exception{
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "kafkaserver:9092");
properties.setProperty("group.id", "testgroup");
FlinkKafkaConsumer<String> myConsumer = new FlinkKafkaConsumer<String>(
"mytopic",
new SimpleStringSchema(),
properties);
DataStream<String> stream = env.addSource(myConsumer);
stream.print();
env.execute("flink from kafka");
}
}
You can refer to my answer in one of the posts. Check the Scala language version though.
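In this case the usual cause is that maven-jar-plugin builds a thin jar: flink-connector-kafka is not bundled with the Flink distribution, so the connector classes have to be packaged into the jar you submit. (Also, <scope>Compile</scope> is not a valid Maven scope; scopes are lowercase, and compile is the default anyway.) A minimal sketch of a maven-shade-plugin configuration that builds a fat jar (the plugin version here is an assumption):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- sets Main-Class in the shaded jar's manifest -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.company.flinktest.App</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
After mvn package, submit the shaded jar with the same flink run command.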

NoSuchMethodError in apache camel java

I ran the following code:
package com.dinesh.example4;
import javax.jms.ConnectionFactory;
//import org.apache.camel.support.HeaderFilterStrategyComponent;
import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.camel.CamelContext;
import org.apache.camel.Component;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.jms.JmsComponent;
import org.apache.camel.impl.DefaultCamelContext;
//import org.apache.camel.impl.*;
public class FileToActiveMQ {
public static void main(String[] args) throws Exception {
CamelContext context = new DefaultCamelContext();
ConnectionFactory connectionFactory = new ActiveMQConnectionFactory();
context.addComponent("jms",JmsComponent.jmsComponentAutoAcknowledge(connectionFactory));
context.addRoutes(new RouteBuilder() {
@Override
public void configure() throws Exception {
from("file:input_box?noop=true")
.to("activemq:queue:my_queue");
}
});
while(true)
context.start();
}
}
for transforming data from the input_box folder to ActiveMQ.
I am getting the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.camel.impl.HeaderFilterStrategyComponent.<init>(Ljava/lang/Class;)V
at org.apache.camel.component.jms.JmsComponent.<init>(JmsComponent.java:71)
at org.apache.camel.component.jms.JmsComponent.<init>(JmsComponent.java:87)
at org.apache.camel.component.jms.JmsComponent.jmsComponent(JmsComponent.java:102)
at org.apache.camel.component.jms.JmsComponent.jmsComponentAutoAcknowledge(JmsComponent.java:127)
at com.dinesh.example4.FileToActiveMQ.main(FileToActiveMQ.java:18)
Here is the 18th line in the code above:
context.addRoutes(new RouteBuilder() {
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.dinesh</groupId>
<artifactId>camel-application1</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.apache.camel/camel-core -->
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>2.14.4</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-jms</artifactId>
<version>2.24.0</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-activemq</artifactId>
<version>3.0.0</version>
</dependency>
</dependencies>
</project>
Please help.
According to your POM, you mix 3 different Camel versions: 2.14.4, 2.24.0, and 3.0.0.
You have to use the same version for all Camel components, as @claus-ibsen already commented.
For example, do it like the snippet below (with properties for the framework versions, used in all the dependencies).
However, as Sneharghya already answered, Camel 2.x has no camel-activemq; use ActiveMQ's activemq-camel dependency instead.
The POM should therefore look like this (the exact Camel version can vary):
<properties>
<amq.version>5.15.4</amq.version>
<camel.version>2.19.5</camel.version>
</properties>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>${camel.version}</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-jms</artifactId>
<version>${camel.version}</version>
</dependency>
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-camel</artifactId>
<version>${amq.version}</version>
</dependency>
You can also use Maven dependencyManagement.
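A minimal sketch of that variant (illustrative only; versions are pinned once via the same properties, so the actual dependency entries can omit <version>):
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.camel</groupId>
      <artifactId>camel-core</artifactId>
      <version>${camel.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.camel</groupId>
      <artifactId>camel-jms</artifactId>
      <version>${camel.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.activemq</groupId>
      <artifactId>activemq-camel</artifactId>
      <version>${amq.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>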
camel-activemq is not available for Camel versions older than 3.0.
If you want to keep using Camel 2.24.x, then remove the camel-activemq dependency from your pom file and add:
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-camel</artifactId>
<version>5.15.13</version>
</dependency>
1. Update the class
There are two issues.
First, you registered the component under the name jms but sent messages to a different component, activemq; the names must match.
Second, the while loop wrapped context.start(), so the context was started over and over; start the context once and use the loop only to keep the program alive.
public class FileToActiveMQ {
public static void main(String[] args) throws Exception {
CamelContext context = new DefaultCamelContext();
ConnectionFactory connectionFactory = new ActiveMQConnectionFactory();
context.addComponent("activemq",
JmsComponent.jmsComponentAutoAcknowledge(connectionFactory));
context.addRoutes(new RouteBuilder() {
@Override
public void configure() throws Exception {
from("file:input_box?noop=true")
.to("activemq:queue:my_queue");
}
});
context.start();
while(true) {
Thread.sleep(10000);
}
}
}
2. Replace the dependencies in the pom as follows:
<dependencies>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>2.25.1</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-jms</artifactId>
<version>2.25.1</version>
</dependency>
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-client</artifactId>
<version>5.15.7</version>
</dependency>
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-camel</artifactId>
<version>5.15.7</version>
</dependency>
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-pool</artifactId>
<version>5.15.7</version>
</dependency>
</dependencies>

How to write directly to Druid from Flink?

I want to write data from Flink to Druid.
I know Tranquility enables this scenario, but I cannot find a working project written in Java.
If someone has achieved this, I would really appreciate guidance on how to resolve my issue.
Thanks.
Assuming that Apache Flink and Druid are running locally,
all you need to do is enable the Tranquility server for Druid as described at https://druid.apache.org/docs/latest/tutorials/tutorial-tranquility.html
If your Tranquility server configuration for Druid is set correctly, the following Spring Boot Java code with Flink should let you push data into Druid via HTTP POST.
FlinkDruidApp.java
import java.util.Arrays;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;
@SpringBootApplication
public class FlinkDruidApp {
private static String url = "http://localhost:8200/v1/post/wikipedia";
private static RestTemplate template;
private static HttpHeaders headers;
FlinkDruidApp() {
template = new RestTemplate();
headers = new HttpHeaders();
headers.setAccept(Arrays.asList(MediaType.APPLICATION_JSON));
headers.setContentType(MediaType.APPLICATION_JSON);
}
public static void main(String[] args) throws Exception {
SpringApplication.run(FlinkDruidApp.class, args);
// Creating Flink Execution Environment
ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
//Define data source
DataSet<String> data = env.readTextFile("/wikiticker-2015-09-12-sampled.json");
// Transformation on the data
data.map(x -> {
return httpsPost(x).toString();
}).print();
}
// http post method to post data in Druid
private static ResponseEntity<String> httpsPost(String json) {
HttpEntity<String> requestEntity =
new HttpEntity<>(json, headers);
ResponseEntity<String> response =
template.exchange(url, HttpMethod.POST, requestEntity,
String.class);
return response;
}
@Bean
public RestTemplate restTemplate() {
return new RestTemplate();
}
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://maven.apache.org/POM/4.0.0"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.1.8.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.flinkdruid</groupId>
<artifactId>FlinkDruid</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>FlinkDruid</name>
<description>Flink Druid Connection</description>
<properties>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.9.0</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.12</artifactId>
<version>1.9.0</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Output on console
<200,{"result":{"received":1,"sent":1}},[Date:"Mon, 23 Sep 2019 13:29:39 GMT", Content-Type:"application/json; charset=UTF-8", Content-Length:"34", Server:"Jetty(9.2.5.v20141112)"]>

Issue with Batch Table API in Flink 1.5 - complains of needing the Streaming API

I'm trying to create a batch-oriented Flink job with Flink 1.5.0 and wish to use the Table and SQL APIs to process the data. My problem is that when trying to create the BatchTableEnvironment I get a compile error:
BatchJob.java:[46,73] cannot access org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
caused at
final BatchTableEnvironment bTableEnv = TableEnvironment.getTableEnvironment(bEnv);
As far as I know I have no dependency on the streaming environment.
My code is as the snippet below.
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.BatchTableEnvironment;
import org.apache.flink.table.sources.CsvTableSource;
import org.apache.flink.table.sources.TableSource;
import java.util.Date;
public class BatchJob {
public static void main(String[] args) throws Exception {
final ExecutionEnvironment bEnv = ExecutionEnvironment.getExecutionEnvironment();
// create a TableEnvironment for batch queries
final BatchTableEnvironment bTableEnv = TableEnvironment.getTableEnvironment(bEnv);
... do stuff
// execute program
bEnv.execute("MY Batch Jon");
}
My pom dependencies are as below
<dependencies>
<!-- Apache Flink dependencies -->
<!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-scala_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- Add connector dependencies here. They must be in the default scope (compile). -->
<!-- Example:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.10_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
-->
<!-- Add logging framework, to produce console output when running in the IDE. -->
<!-- These dependencies are excluded from the application JAR by default. -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.7</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
<scope>runtime</scope>
</dependency>
</dependencies>
Please can someone help me understand what the dependency on the Streaming API is and why I need it for a batch job?
Thanks very much in advance for your help.
Oliver
Flink's Table API and SQL support are unified APIs for batch and stream processing. Many internal classes are shared between batch and stream execution, and between the Scala/Java Table API and SQL, and hence link against Flink's batch and streaming dependencies.
Because of these common classes, batch queries also require the flink-streaming dependencies.
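In practice that means adding a flink-streaming dependency alongside the others; a sketch, using the same ${flink.version} property and provided scope as the pom above:
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_2.11</artifactId>
  <version>${flink.version}</version>
  <scope>provided</scope>
</dependency>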

Apache Flink- Class file for org.apache.flink.streaming.api.scala.DataStream not found

Using Apache Flink version 1.3.2 and Cassandra 3.11, I wrote simple code to write data into Cassandra using the Apache Flink Cassandra connector. The following is the code:
final Collection<String> collection = new ArrayList<>(50);
for (int i = 1; i <= 50; ++i) {
collection.add("element " + i);
}
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<Tuple2<UUID, String>> dataStream = env
.fromCollection(collection)
.map(new MapFunction<String, Tuple2<UUID, String>>() {
final String mapped = " mapped ";
String[] splitted;
@Override
public Tuple2<UUID, String> map(String s) throws Exception {
splitted = s.split("\\s+");
return new Tuple2(
UUID.randomUUID(),
splitted[0] + mapped + splitted[1]
);
}
});
dataStream.print();
CassandraSink.addSink(dataStream)
.setQuery("INSERT INTO test.phases (id, text) values (?, ?);")
.setHost("127.0.0.1")
.build();
env.execute();
Trying to run the same code using Apache Flink 1.4.2 (1.4.x), I got the error:
Error:(36, 22) java: cannot access org.apache.flink.streaming.api.scala.DataStream
class file for org.apache.flink.streaming.api.scala.DataStream not found
on the line
CassandraSink.addSink(dataStream)
.setQuery("INSERT INTO test.phases (id, text) values (?, ?);")
.setHost("127.0.0.1")
.build();
I think some dependency changes in Apache Flink 1.4.2 cause the problem.
I use the following dependencies imported in the code:
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.cassandra.CassandraSink;
How can I solve the error in Apache Flink version 1.4.2?
Update:
In Flink 1.3.2, the class org.apache.flink.streaming.api.scala.DataStream<T> is in the Javadocs, but in version 1.4.2 there is no such class (see here).
I tried the code example in the Flink 1.4.2 documentation for the Cassandra connector and got the same error, but the example worked with Flink 1.3.2 dependencies!
Besides all the other dependencies, make sure you have the Flink Scala dependency:
Maven
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_2.11</artifactId>
<version>1.4.2</version>
</dependency>
Gradle
dependencies {
compile group: 'org.apache.flink', name: 'flink-streaming-scala_2.11', version: '1.4.2'
..
}
I managed to get your example working with the following dependencies:
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.cassandra.CassandraSink;
Maven
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.4.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>1.4.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_2.11</artifactId>
<version>1.4.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>1.4.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-cassandra_2.11</artifactId>
<version>1.4.2</version>
</dependency>
</dependencies>
