Quarkus + Panache + RestEasy Native Image build fails - resteasy

I have the following simplified setup:
1)
import javax.ws.rs.GET;
import javax.ws.rs.Path;

@Path("/api")
public class MyResource {

    public MyResource() {
    }

    @GET
    @Path("/myPath/")
    public void get() {
    }
}
2)
import io.quarkus.hibernate.orm.panache.PanacheEntity;
import javax.persistence.Entity;

@Entity
public class MyEntity extends PanacheEntity {

    public String hello;

    public MyEntity() {
        // For Panache only
    }
}
3) pom.xml:
[...]
<properties>
<compiler-plugin.version>3.8.1</compiler-plugin.version>
<maven.compiler.source>11</maven.compiler.source>
<maven.compiler.target>11</maven.compiler.target>
<quarkus-plugin.version>1.5.0.Final</quarkus-plugin.version>
<quarkus.platform.artifact-id>quarkus-universe-bom</quarkus.platform.artifact-id>
<quarkus.platform.group-id>io.quarkus</quarkus.platform.group-id>
<quarkus.platform.version>1.5.0.Final</quarkus.platform.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>${quarkus.platform.group-id}</groupId>
<artifactId>${quarkus.platform.artifact-id}</artifactId>
<version>${quarkus.platform.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-hibernate-orm-panache</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-jdbc-mariadb</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-maven-plugin</artifactId>
<version>${quarkus-plugin.version}</version>
<executions>
<execution>
<goals>
<goal>build</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<id>native</id>
<activation>
<property>
<name>native</name>
</property>
</activation>
<build/>
<properties>
<quarkus.package.type>native</quarkus.package.type>
</properties>
</profile>
</profiles>
</project>
4) application.properties
quarkus.datasource.db-kind = mariadb
quarkus.datasource.username = admin
quarkus.datasource.password = admin
quarkus.datasource.jdbc.url = jdbc:mariadb://localhost:5432/mydatabase
quarkus.hibernate-orm.database.generation = drop-and-create
When I run this with the native maven profile (mvn clean package -Pnative) I get:
Fatal error: com.oracle.graal.pointsto.util.AnalysisError$ParsingError: Error encountered while parsing com.oracle.svm.reflect.Class_getNestHost_d0409f1154f6242e625526eadd05fbcd60e7d7e9.invoke(java.lang.Object, java.lang.Object[])
Parsing context:
parsing java.lang.reflect.Method.invoke(Method.java:566)
parsing javax.enterprise.util.AnnotationLiteral.invoke(AnnotationLiteral.java:288)
parsing javax.enterprise.util.AnnotationLiteral.getMemberValue(AnnotationLiteral.java:276)
parsing javax.enterprise.util.AnnotationLiteral.hashCode(AnnotationLiteral.java:246)
parsing org.graalvm.collections.EconomicMapImpl.getHashIndex(EconomicMapImpl.java:414)
[...]
Caused by: com.oracle.svm.hosted.substitute.DeletedElementException: Unsupported method java.lang.Class.getNestHost() is reachable: The declaring class of this element has been substituted, but this element is not present in the substitution class
[...]
To diagnose the issue, you can add the option --report-unsupported-elements-at-runtime.
Running it with --report-unsupported-elements-at-runtime didn't help much either.
When I delete the MyEntity class, it compiles successfully to a native executable on my Mac with GraalVM CE Java 11 20.0.0.
Any idea what's wrong here?

I updated to 20.0.0 and everything works perfectly. The only way I found to hit that error is by not setting up GraalVM properly. The environment variables on my Mac are:
export GRAALVM_HOME=/Library/Java/JavaVirtualMachines/graalvm-ce-java11-20.0.0/Contents/Home
export JAVA_HOME=${GRAALVM_HOME}
export PATH=${GRAALVM_HOME}/bin:$PATH
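If the error persists, it may also be worth double-checking that Maven really picks up the GraalVM JDK and that the native-image tool is installed (in GraalVM CE it ships as a separate component). A quick sanity check, assuming the variables above are set:
mvn -version                                # the Java line should point at the GraalVM JDK
${GRAALVM_HOME}/bin/native-image --version  # fails if the component is missing
${GRAALVM_HOME}/bin/gu install native-image # installs it via the GraalVM updater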
Let me know if that works for you.

Related

An error occurs when I query the oracle database using flink sql cdc

My environment is as follows:
linux: centos 7 (A Virtual machine on VMware Workstation Pro)
oracle: 11.2.0.4 (archivelog mode already started)
flink: 1.13.6
flink cdc connector: 2.2.0
java: 1.8
I tried to query data from an Oracle database using Flink SQL CDC in IDEA; however, I encountered some errors, which are as follows:
My Java code is as follows:
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.TableResult;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
public class FlinkSQL_CDC_JDBC_Oracle {
public static StreamTableEnvironment getTableEnvironment(){
EnvironmentSettings settings = EnvironmentSettings
.newInstance()
.useBlinkPlanner()
.inStreamingMode()
.build();
StreamExecutionEnvironment sEnv = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnvironment = StreamTableEnvironment.create(sEnv,settings);
return tableEnvironment;
}
public static String buildSourceTable() {
String sql = "CREATE TABLE `a` (\n" +
" ID BIGINT,\n" +
" NAME VARCHAR,\n" +
" PRIMARY KEY(NAME) NOT ENFORCED )\n" +
" WITH (\n" +
" 'connector' = 'oracle-cdc',\n" +
// Change this to the actual IP address of the Oracle host
" 'hostname' = 'hadoop104',\n" +
" 'port' = '1521',\n" +
" 'username' = 'oracle',\n" +
" 'password' = 'oracle',\n" +
" 'database-name' = 'ORCL',\n" +
" 'schema-name' = 'oracle',\n" +
" 'table-name' = 'A',\n" +
//"'scan.startup.mode'='latest-offset'," +
"'debezium.log.mining.strategy'='online_catalog'," +
"'debezium.database.tablename.case.insensitive' = 'false'," +
"'debezium.log.mining.continuous.mine'='true'" +
")";
return sql;
}
public static void test(){
StreamTableEnvironment tableEnvironment = getTableEnvironment();
tableEnvironment.executeSql(buildSourceTable());
tableEnvironment.executeSql("select * from a").print();
}
public static void main(String[] args) {
test();
}
}
My pom file is:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>Flink_Oracle</artifactId>
<version>1.0</version>
<properties>
<maven.compiler.source>8</maven.compiler.source>
<maven.compiler.target>8</maven.compiler.target>
<flink.version>1.13.6</flink.version>
<scala.binary.version>2.11</scala.binary.version>
<!-- <mysql.version>5.7.16</mysql.version>-->
<gson.version>2.8.6</gson.version>
</properties>
<dependencies>
<!-- <dependency>-->
<!-- <groupId>org.apache.flink</groupId>-->
<!-- <artifactId>flink-shaded-guava</artifactId>-->
<!-- <version>18.0-7.0</version>-->
<!-- </dependency>-->
<!-- <dependency>-->
<!-- <groupId>com.ververica</groupId>-->
<!-- <artifactId>flink-cdc-base</artifactId>-->
<!-- <version>2.3.0</version>-->
<!-- </dependency>-->
<dependency>
<groupId>com.ververica</groupId>
<artifactId>flink-connector-oracle-cdc</artifactId>
<version>2.2.0</version>
</dependency>
<!-- <dependency>-->
<!-- <groupId>org.apache.flink</groupId>-->
<!-- <artifactId>flink-jdbc_${scala.binary.version}</artifactId>-->
<!-- <version>${flink.version}</version>-->
<!-- <scope>system</scope>-->
<!-- <systemPath>${project.basedir}/src/main/resources/libs/flink-connector-jdbc_2.11-1.13.6.jar</systemPath>-->
<!-- </dependency>-->
<!-- <dependency>-->
<!-- <groupId>com.oracle.database.jdbc</groupId>-->
<!-- <artifactId>ojdbc6</artifactId>-->
<!-- <version>11.2.0.4</version>-->
<!-- </dependency>-->
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-core -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-common</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<type>test-jar</type>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>${gson.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<!-- If you package the job, change this to the corresponding main class -->
<mainClass>com.flink.cdc.demo.MysqlCdcMysql</mainClass>
</transformer>
<transformer
implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
</transformers>
<filters>
<filter>
<artifact>*:*:*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
I searched the Internet for a solution to error code ORA-00604; the answer was that the SYSTEM tablespace was insufficient. However, the problem persisted after I extended the SYSTEM tablespace.
I also searched for a solution to error code ORA-12705; the answer was that the NLS_LANG variable was configured incorrectly. However, my NLS_LANG variable is configured correctly (NLS_LANG=AMERICAN_AMERICA.AL32UTF8), and the NLS_LANG value in the registry is the same as the one on the Oracle database. Moreover, everything is fine when I use Flink SQL JDBC to query the same Oracle database.
In the end, I really don't know what to do to solve this problem.
I hope someone can help me, if possible via remote desktop; I am willing to pay a reward in return.

Flink TableEnvironment.create throws NoSuchMethodError

I am testing the Flink Hive connector, following the instructions here: https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/table/hive/overview/.
The final code is as follows. I tried to run it in the IntelliJ IDE. Unfortunately, it doesn't work: TableEnvironment.create throws a NoSuchMethodError.
public static void main(String[] args) throws Exception {
EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
TableEnvironment tableEnv = TableEnvironment.create(settings); // throws NoSuchMethodError
String name = "myhive";
String defaultDatabase = "default";
String hiveConfDir = "/Users/gaoxiahong/apache-hive-3.1.2-bin/conf";
HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
tableEnv.registerCatalog(name, hive);
tableEnv.useCatalog(name);
System.out.println(tableEnv.executeSql("show tables"));
}
Exception message is as follows:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.calcite.sql.parser.SqlParser.config()Lorg/apache/calcite/sql/parser/SqlParser$Config;
at org.apache.flink.table.planner.delegation.PlannerContext.lambda$getSqlParserConfig$1(PlannerContext.java:263)
at java.util.Optional.orElseGet(Optional.java:267)
at org.apache.flink.table.planner.delegation.PlannerContext.getSqlParserConfig(PlannerContext.java:257)
at org.apache.flink.table.planner.delegation.PlannerContext.createFrameworkConfig(PlannerContext.java:148)
at org.apache.flink.table.planner.delegation.PlannerContext.<init>(PlannerContext.java:130)
at org.apache.flink.table.planner.delegation.PlannerBase.<init>(PlannerBase.scala:116)
at org.apache.flink.table.planner.delegation.StreamPlanner.<init>(StreamPlanner.scala:62)
at org.apache.flink.table.planner.delegation.DefaultPlannerFactory.create(DefaultPlannerFactory.java:64)
at org.apache.flink.table.factories.PlannerFactoryUtil.createPlanner(PlannerFactoryUtil.java:52)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.create(TableEnvironmentImpl.java:302)
at org.apache.flink.table.api.TableEnvironment.create(TableEnvironment.java:93)
at com.yqg.flinkhive.Test.main(Test.java:18)
My flink version is 1.15.2 and hive version is 3.1.2. The pom.xml file looks like:
<properties>
<flink.version>1.15.2</flink.version>
<hive.version>3.1.2</hive.version>
<scala.version>2.12</scala.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-hive_${scala.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>${hive.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner_${scala.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
Can anyone help me figure out the issue here? Thanks in advance!
Per https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/configuration/overview/ you most likely need to swap flink-table-api-java-bridge to flink-table-api-scala-bridge_2.12.
See also https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/table/hive/overview/#program-maven
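For reference, the swap suggested above would look roughly like this in the pom.xml (a sketch based on the linked 1.15 docs; the artifact replaces flink-table-api-java-bridge):
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-scala-bridge_2.12</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>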

Where is the problem in my code (YML file) or Azure database? Spring Boot and Azure

First I started the project and created the Azure database.
After that I linked the DB to my project, and it ran.
But now it does not, and it reports this run error:
22:40:19.125 [main] ERROR org.springframework.boot.SpringApplication - Application run failed
org.yaml.snakeyaml.scanner.ScannerException: mapping values are not allowed here
in 'reader', line 4, column 13:
username: javatechi
Where is the problem: in my code (YML file) or the Azure database?
application.yml
spring:
  datasource:
    url:jdbc:jdbc:sqlserver://xxxx.database.windows.net:1433;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;
    username: xxxx
    password: xxxxxxxx
  jpa:
    show-sql: true
    hibernate:
      ddl-auto: update
    dialect: org.hibernate.dialect.SQLServer2012Dialect
server:
  port: 9191
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.7.1</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.javatechie</groupId>
<artifactId>springboot-azuresql</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>springboot-azure-sql</name>
<description>Demo project for Spring Boot</description>
<properties>
<java.version>17</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-jpa</artifactId>
<version>2.7.0</version>
</dependency>
<dependency>
<groupId>com.microsoft.sqlserver</groupId>
<artifactId>mssql-jdbc</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.webjars.npm</groupId>
<artifactId>table</artifactId>
<version>5.4.6</version>
</dependency>
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>persistence-api</artifactId>
<version>1.0.2</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Employee.java
package com.javatechie.azuresql;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Table;
@Entity
@Table
@Data
@AllArgsConstructor
@NoArgsConstructor
public class Employee {

    @Id
    @GeneratedValue
    private int id;
    private String name;
    private String dept;
    private long salary;
}
EmployeeRepository.java
package com.javatechie.azuresql;
import org.springframework.data.jpa.repository.JpaRepository;
public interface EmployeeRepository extends JpaRepository<Employee,Integer> {
}
SpringbootAzuresqlApplication.java (Main Class)
package com.javatechie.azuresql;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;
import java.util.List;
@SpringBootApplication
@RestController
public class SpringbootAzuresqlApplication {

    @Autowired
    private EmployeeRepository repository;

    @PostMapping("/product")
    public Employee addEmployee(@RequestBody Employee employee) {
        return repository.save(employee);
    }

    @GetMapping("/products")
    public List<Employee> getEmployees() {
        return repository.findAll();
    }

    public static void main(String[] args) {
        SpringApplication.run(SpringbootAzuresqlApplication.class, args);
    }
}
The indicated error:
21:55:28.055 [main] ERROR org.springframework.boot.SpringApplication - Application run failed
org.yaml.snakeyaml.scanner.ScannerException: mapping values are not allowed here
in 'reader', line 4, column 13:
username: javatechi
^
The error coming from the YAML parser is misleading - it is not actually username: javatechi that is incorrect.
Please have a read through the YAML spec, 2.1 Collections and its examples, which is introduced with:
2.1. Collections
YAML’s block collections use indentation for scope and begin each entry on its own line. Block sequences indicate each entry with a dash and space (“- ”). Mappings use a colon and space (“: ”) to mark each key/value pair. Comments begin with an octothorpe (also called a “hash”, “sharp”, “pound” or “number sign” - “#”).
In other words it is actually the previous line that is incorrect:
url:jdbc:jdbc:sqlserver://xxxx.database.windows.net:1433;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;
You can fix it by adding a space after the url: key to separate it from its value:
url: jdbc:jdbc:sqlserver://xxxx.database.windows.net:1433;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;
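As an aside, the value itself also appears to carry a duplicated scheme prefix: the JDBC URL for SQL Server starts with jdbc:sqlserver://, not jdbc:jdbc:sqlserver://. With the space added and the duplicate prefix dropped, the datasource block would look like this (indentation restored to two spaces per level):
spring:
  datasource:
    url: jdbc:sqlserver://xxxx.database.windows.net:1433;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;
    username: xxxx
    password: xxxxxxxx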

Unable to load OWL File using OWL API

Please help me out with this:
I have used Maven and tried loading an ontology file using the OWL API.
I am getting errors while running the file:
1st Error:
No implementation for java.util.Set was bound, while locating java.util.Set for parameter 0
at uk.ac.manchester.cs.owl.owlapi.OWLOntologyManagerImpl.setOntologyStorers(OWLOntologyManagerImpl.java:1279)
at uk.ac.manchester.cs.owl.owlapi.OWLOntologyManagerImpl.setOntologyStorers(OWLOntologyManagerImpl.java:1279)
at uk.ac.manchester.cs.owl.owlapi.OWLAPIImplModule.configure(Unknown Source)
2nd Error:
An exception was caught and reported. Message: org.semanticweb.owlapi.manchestersyntax.parser.ManchesterOWLSyntaxOntologyParserFactory cannot be cast to javax.inject.Provider
at org.semanticweb.owlapi.OWLAPIServiceLoaderModule.configure(Unknown Source)
My code looks like:
File selectedFile = new File("E:\\Pallavi\\Ontology\\Food.owl");
OWLOntologyManager m = OWLManager.createOWLOntologyManager();
IRI inputDocumentIRI = IRI.create(selectedFile);
/* Load an ontology from a document IRI */
OWLOntology ontology = m.loadOntologyFromOntologyDocument(inputDocumentIRI);
/* Report information about the ontology */
System.out.println("Ontology Loaded...");
System.out.println("Document IRI: " + inputDocumentIRI);
System.out.println("Logical IRI : " + ontology.getOntologyID());
System.out.println("Format : " + m.getOntologyFormat(ontology));
m.removeOntology(ontology);
System.out.println("Done");
My pom.xml looks like:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mycompany</groupId>
<artifactId>TestOWL</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.5.3</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Implementation-Title>${project.name}</Implementation-Title>
<Implementation-Vendor>${project.organization.name}</Implementation-Vendor>
<Implementation-Version>${project.version}.${maven.build.timestamp}</Implementation-Version>
<Bundle-SymbolicName>org.semanticweb.owl.owlapi</Bundle-SymbolicName>
<Bundle-Version>${project.version}</Bundle-Version>
<excludeDependencies>groupId=com.google.guava;scope=compile|runtime|provided,
groupId=com.google.inject*;scope=compile|runtime|provided,
groupId=org.slf4j*;scope=compile|runtime|provided</excludeDependencies>
</instructions>
</configuration>
</plugin>
<plugin>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<phase>package</phase>
<configuration>
<artifactSet>
<excludes>
<exclude>org.apache.felix:org.osgi.core</exclude>
<exclude>org.openrdf.sesame:*</exclude>
<exclude>com.fasterxml.jackson.core:*</exclude>
<exclude>com.github.jsonld-java:*</exclude>
<exclude>com.fasterxml.jackson.core:*</exclude>
<exclude>org.apache.httpcomponents:*</exclude>
<exclude>commons-codec:commons-codec:*</exclude>
<exclude>org.slf4j:*</exclude>
<exclude>org.semarglproject:*</exclude>
<exclude>com.google.guava:*</exclude>
<exclude>com.google.inject:*</exclude>
<exclude>javax.inject:*</exclude>
<exclude>aopalliance:*</exclude>
<exclude>com.google.inject.extensions:*</exclude>
<exclude>com.google.code.findbugs:*</exclude>
<exclude>org.slf4j:slf4j-api</exclude>
<exclude>commons-io:*</exclude>
<exclude>org.tukaani:*</exclude>
<exclude>net.sf.trove4j:*</exclude>
</excludes>
</artifactSet>
<transformers>
<transformer/>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.github.ansell.owlapi</groupId>
<artifactId>owlapi-api</artifactId>
<version>3.4.6.2-ansell</version>
</dependency>
<dependency>
<groupId>net.sourceforge.owlapi</groupId>
<artifactId>owlapi-apibinding</artifactId>
<version>5.0.5</version>
</dependency>
<dependency>
<groupId>net.sourceforge.owlapi</groupId>
<artifactId>owlapi-osgidistribution</artifactId>
<version>5.0.5</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
</project>
Please help me get rid of these errors.
You are excluding dependencies that owlapi needs, which explains all the injection-related errors.
On top of that, you're using owlapi 5 and the Ansell fork of owlapi 3. These will conflict in many areas.
If you are not using OSGi (it seems you're not), drop all dependencies except owlapi-apibinding 5.0.5 and remove all exclusions. If that does not solve the problem, update the question with the new state of affairs.
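Under that advice, the whole dependencies section of the pom would shrink to a single entry (a sketch; owlapi-apibinding is expected to pull in the API and default implementation transitively):
<dependencies>
    <dependency>
        <groupId>net.sourceforge.owlapi</groupId>
        <artifactId>owlapi-apibinding</artifactId>
        <version>5.0.5</version>
    </dependency>
</dependencies>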

Is it possible to do a java cron job in order to export a BigQuery table?

I want to upload a Java cron job that runs some queries and exports a table from BigQuery to Google Storage once a week. To do this, I've used the Google Plugin for Eclipse to upload the cron job to App Engine.
The problem is that my Java cron job calls a Java class that uses Google Maven dependencies to access BigQuery, but when the cron job is uploaded to App Engine, the error below appears:
Error for /cron/gaejcronjob
java.lang.NoClassDefFoundError: com/google/api/client/json/JsonFactory
I've read the question java.lang.ClassNotFoundException: com.google.api.client.json.JsonFactory, but its answer doesn't solve the problem.
Edit:
(added pom.xml, GAEJCronServlet.java and BigQuery.java code)
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>XTG.Cron.Jobs</groupId>
<artifactId>BigQuery</artifactId>
<version>0.0.1-SNAPSHOT</version>
<build>
<sourceDirectory>src</sourceDirectory>
<resources>
<resource>
<directory>src</directory>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.google.cloud.dataflow</groupId>
<artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
<version>LATEST</version>
</dependency>
</dependencies>
</project>
GAEJCronServlet.java:
package com.gaejexperiments.cron;
import java.io.IOException;
import java.util.logging.Logger;
import javax.servlet.ServletException;
import javax.servlet.http.*;
import com.gaejexperiments.cron.BigQuery;
@SuppressWarnings("serial")
public class GAEJCronServlet extends HttpServlet {
private static final Logger _logger = Logger.getLogger(GAEJCronServlet.class.getName());
public void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
try {
_logger.info("Cron Job has been executed");
BigQuery bigquery = new BigQuery();
bigquery.exportTable();
} catch (Exception ex) {
//_logger.info(ex);
}
}
@Override
public void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
doGet(req, resp);
}
}
BigQuery.java:
package com.gaejexperiments.cron;
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson.JacksonFactory;
import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.model.ErrorProto;
import com.google.api.services.bigquery.model.Job;
import com.google.api.services.bigquery.model.JobConfiguration;
import com.google.api.services.bigquery.model.JobConfigurationExtract;
import com.google.api.services.bigquery.model.JobReference;
import com.google.api.services.bigquery.model.TableReference;
public class BigQuery {
private final String PROJECT_ID = projectId;
private final String DATASET_ID = "bigquerytest";
private final String TABLE_ID = "test";
private Bigquery service = null;
public void main(String[] args) {
try {
HttpTransport httpTransport = new NetHttpTransport();
JsonFactory jsonFactory = new JacksonFactory();
GoogleCredential credential = GoogleCredential.getApplicationDefault(httpTransport, jsonFactory);
Bigquery.Builder serviceBuilder =
new Bigquery.Builder(httpTransport, jsonFactory, credential)
.setApplicationName("Bigquery ");
service = serviceBuilder.build();
if (service == null || service.jobs() == null) {
throw new Exception("Service is null");
}
}
catch (Exception ex) {
System.out.println("Caught exception: " + ex + "\n");
ex.printStackTrace();
System.exit(1);
}
System.exit(0);
}
public void exportTable() throws Exception{
//Export
TableReference sourceTable = new TableReference();
sourceTable.setProjectId(PROJECT_ID);
sourceTable.setDatasetId(DATASET_ID);
sourceTable.setTableId(TABLE_ID);
JobConfigurationExtract jobExtract = new JobConfigurationExtract();
jobExtract.setDestinationFormat("CSV");
jobExtract.setDestinationUri("gs://xtg-bigquery/test1.csv");
jobExtract.setSourceTable(sourceTable);
JobConfiguration jobConfig = new JobConfiguration();
jobConfig.setExtract(jobExtract);
JobReference jobRef = new JobReference();
jobRef.setProjectId(PROJECT_ID);
Job outputJob = new Job();
outputJob.setConfiguration(jobConfig);
outputJob.setJobReference(jobRef);
Job job = service.jobs().insert(PROJECT_ID,
outputJob).execute();
if (job == null) {
throw new Exception("Job is null");
}
while (true) {
String status = job.getStatus().getState();
if (status != null || ("DONE").equalsIgnoreCase(status)) {
break;
}
Thread.sleep(1000);
}
ErrorProto errorResult = job.getStatus().getErrorResult();
if (errorResult != null) {
throw new Exception("Error running job: " + errorResult);
}
}
}
You're missing a couple of App Engine-specific pom settings. The recommended approach is to create the pom.xml from the App Engine archetype like this (as described here):
mvn archetype:generate -Dappengine-version=1.9.30 -Dapplication-id=your-app-id -Dfilter=com.google.appengine.archetypes:appengine-skeleton-archetype
Alternatively you can add the build plugins into your existing pom.xml, the build section should then look something like this (that's basically what the archetype will create for you):
<build>
<!-- for hot reload of the web application-->
<outputDirectory>${project.build.directory}/${project.build.finalName}/WEB-INF/classes</outputDirectory>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<version>3.1</version>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<configuration>
<archiveClasses>true</archiveClasses>
<webResources>
<!-- in order to interpolate version from pom into appengine-web.xml -->
<resource>
<directory>${basedir}/src/main/webapp/WEB-INF</directory>
<filtering>true</filtering>
<targetPath>WEB-INF</targetPath>
</resource>
</webResources>
</configuration>
</plugin>
<plugin>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-maven-plugin</artifactId>
<version>${appengine.version}</version>
<configuration>
<enableJarClasses>false</enableJarClasses>
<version>${app.version}</version>
<!-- Comment in the below snippet to bind to all IPs instead of just localhost -->
<address>0.0.0.0</address>
<port>8080</port>
<!-- Comment in the below snippet to enable local debugging with a remote debugger
like those included with Eclipse or IntelliJ -->
<jvmFlags>
<jvmFlag>-agentlib:jdwp=transport=dt_socket,address=8000,server=y,suspend=n</jvmFlag>
</jvmFlags>
</configuration>
</plugin>
<plugin>
<groupId>com.google.appengine</groupId>
<artifactId>gcloud-maven-plugin</artifactId>
<version>${gcloud.plugin.version}</version>
<configuration>
<set_default>true</set_default>
</configuration>
</plugin>
</plugins>
</build>
You should also add the appengine sdk to your dependencies, mine usually looks like this:
<!-- Compile/runtime dependencies -->
<dependency>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-api-1.0-sdk</artifactId>
<version>${appengine.version}</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>jstl</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
Last but not least, the packaging of appengine projects is usually set to WAR
<packaging>war</packaging>
Having set up all that (and having an appengine-web.xml present in WEB-INF), you can deploy your App Engine application with
mvn appengine:update
I recommend you create a project with the archetype and copy your content in the new project. That's much easier than adding all those configurations to your existing project.
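As for the weekly trigger itself: on the App Engine Java runtime the schedule lives in WEB-INF/cron.xml, next to appengine-web.xml. A minimal sketch for the /cron/gaejcronjob servlet from the question (adjust the day and time as needed):
<?xml version="1.0" encoding="UTF-8"?>
<cronentries>
  <cron>
    <url>/cron/gaejcronjob</url>
    <description>Weekly BigQuery table export to Cloud Storage</description>
    <schedule>every monday 09:00</schedule>
  </cron>
</cronentries>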
