Unable to load an OWL file using the OWL API

Please help me out with this:
I have used Maven and tried loading an ontology file using the OWL API.
I get the following errors when running the code:
1st error:
No implementation for java.util.Set was bound.
  while locating java.util.Set
    for parameter 0 at uk.ac.manchester.cs.owl.owlapi.OWLOntologyManagerImpl.setOntologyStorers(OWLOntologyManagerImpl.java:1279)
  at uk.ac.manchester.cs.owl.owlapi.OWLOntologyManagerImpl.setOntologyStorers(OWLOntologyManagerImpl.java:1279)
  at uk.ac.manchester.cs.owl.owlapi.OWLAPIImplModule.configure(Unknown Source)
2nd error:
An exception was caught and reported. Message:
org.semanticweb.owlapi.manchestersyntax.parser.ManchesterOWLSyntaxOntologyParserFactory cannot be cast to javax.inject.Provider
  at org.semanticweb.owlapi.OWLAPIServiceLoaderModule.configure(Unknown Source)
My code looks like:
File selectedFile = new File("E:\\Pallavi\\Ontology\\Food.owl");
OWLOntologyManager m = OWLManager.createOWLOntologyManager();
IRI inputDocumentIRI = IRI.create(selectedFile);
/* Load an ontology from a document IRI */
OWLOntology ontology = m.loadOntologyFromOntologyDocument(inputDocumentIRI);
/* Report information about the ontology */
System.out.println("Ontology Loaded...");
System.out.println("Document IRI: " + inputDocumentIRI);
System.out.println("Logical IRI : " + ontology.getOntologyID());
System.out.println("Format : " + m.getOntologyFormat(ontology));
m.removeOntology(ontology);
System.out.println("Done");
My pom.xml looks like:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mycompany</groupId>
<artifactId>TestOWL</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.5.3</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Implementation-Title>${project.name}</Implementation-Title>
<Implementation-Vendor>${project.organization.name}</Implementation-Vendor>
<Implementation-Version>${project.version}.${maven.build.timestamp}</Implementation-Version>
<Bundle-SymbolicName>org.semanticweb.owl.owlapi</Bundle-SymbolicName>
<Bundle-Version>${project.version}</Bundle-Version>
<excludeDependencies>groupId=com.google.guava;scope=compile|runtime|provided,
groupId=com.google.inject*;scope=compile|runtime|provided,
groupId=org.slf4j*;scope=compile|runtime|provided</excludeDependencies>
</instructions>
</configuration>
</plugin>
<plugin>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<phase>package</phase>
<configuration>
<artifactSet>
<excludes>
<exclude>org.apache.felix:org.osgi.core</exclude>
<exclude>org.openrdf.sesame:*</exclude>
<exclude>com.fasterxml.jackson.core:*</exclude>
<exclude>com.github.jsonld-java:*</exclude>
<exclude>com.fasterxml.jackson.core:*</exclude>
<exclude>org.apache.httpcomponents:*</exclude>
<exclude>commons-codec:commons-codec:*</exclude>
<exclude>org.slf4j:*</exclude>
<exclude>org.semarglproject:*</exclude>
<exclude>com.google.guava:*</exclude>
<exclude>com.google.inject:*</exclude>
<exclude>javax.inject:*</exclude>
<exclude>aopalliance:*</exclude>
<exclude>com.google.inject.extensions:*</exclude>
<exclude>com.google.code.findbugs:*</exclude>
<exclude>org.slf4j:slf4j-api</exclude>
<exclude>commons-io:*</exclude>
<exclude>org.tukaani:*</exclude>
<exclude>net.sf.trove4j:*</exclude>
</excludes>
</artifactSet>
<transformers>
<transformer/>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.github.ansell.owlapi</groupId>
<artifactId>owlapi-api</artifactId>
<version>3.4.6.2-ansell</version>
</dependency>
<dependency>
<groupId>net.sourceforge.owlapi</groupId>
<artifactId>owlapi-apibinding</artifactId>
<version>5.0.5</version>
</dependency>
<dependency>
<groupId>net.sourceforge.owlapi</groupId>
<artifactId>owlapi-osgidistribution</artifactId>
<version>5.0.5</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
</project>
Please help me get rid of these errors.

You are excluding dependencies that the OWL API needs, which explains all the injection-related errors.
On top of that, you are mixing OWL API 5 with the Ansell fork of OWL API 3. These will conflict in many areas.
If you are not using OSGi (it seems you are not), drop all dependencies except owlapi-apibinding 5.0.5 and remove all exclusions. If that does not solve the problem, update the question with the new state of affairs.
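For reference, a minimal sketch of the cleaned-up pom (assuming plain, non-OSGi usage): keep a single OWL API dependency and drop the bundle/shade plugin configuration entirely:
<dependencies>
    <dependency>
        <groupId>net.sourceforge.owlapi</groupId>
        <artifactId>owlapi-apibinding</artifactId>
        <version>5.0.5</version>
    </dependency>
</dependencies>
owlapi-apibinding pulls in the API, implementation, and parser modules transitively, which is what the Guice injection errors indicate was missing at runtime.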

Related

An error occurs when I query an Oracle database using Flink SQL CDC

My environment is as follows:
Linux: CentOS 7 (a virtual machine on VMware Workstation Pro)
Oracle: 11.2.0.4 (archive log mode already enabled)
Flink: 1.13.6
Flink CDC connector: 2.2.0
Java: 1.8
I tried to query data from an Oracle database using Flink SQL CDC in IDEA, but I encountered some errors (ORA-00604 and ORA-12705, discussed below).
My Java code is as follows:
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.TableResult;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
public class FlinkSQL_CDC_JDBC_Oracle {
public static StreamTableEnvironment getTableEnvironment(){
EnvironmentSettings settings = EnvironmentSettings
.newInstance()
.useBlinkPlanner()
.inStreamingMode()
.build();
StreamExecutionEnvironment sEnv = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnvironment = StreamTableEnvironment.create(sEnv,settings);
return tableEnvironment;
}
public static String buildSourceTable() {
String sql = "CREATE TABLE `a` (\n" +
" ID BIGINT,\n" +
" NAME VARCHAR,\n" +
" PRIMARY KEY(NAME) NOT ENFORCED )\n" +
" WITH (\n" +
" 'connector' = 'oracle-cdc',\n" +
// Change this to the actual IP address of the Oracle host
" 'hostname' = 'hadoop104',\n" +
" 'port' = '1521',\n" +
" 'username' = 'oracle',\n" +
" 'password' = 'oracle',\n" +
" 'database-name' = 'ORCL',\n" +
" 'schema-name' = 'oracle',\n" +
" 'table-name' = 'A',\n" +
//"'scan.startup.mode'='latest-offset'," +
"'debezium.log.mining.strategy'='online_catalog'," +
"'debezium.database.tablename.case.insensitive' = 'false'," +
"'debezium.log.mining.continuous.mine'='true'" +
")";
return sql;
}
public static void test(){
StreamTableEnvironment tableEnvironment = getTableEnvironment();
tableEnvironment.executeSql(buildSourceTable());
tableEnvironment.executeSql("select * from a").print();
}
public static void main(String[] args) {
test();
}
}
My pom file is:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>Flink_Oracle</artifactId>
<version>1.0</version>
<properties>
<maven.compiler.source>8</maven.compiler.source>
<maven.compiler.target>8</maven.compiler.target>
<flink.version>1.13.6</flink.version>
<scala.binary.version>2.11</scala.binary.version>
<!-- <mysql.version>5.7.16</mysql.version>-->
<gson.version>2.8.6</gson.version>
</properties>
<dependencies>
<!-- <dependency>-->
<!-- <groupId>org.apache.flink</groupId>-->
<!-- <artifactId>flink-shaded-guava</artifactId>-->
<!-- <version>18.0-7.0</version>-->
<!-- </dependency>-->
<!-- <dependency>-->
<!-- <groupId>com.ververica</groupId>-->
<!-- <artifactId>flink-cdc-base</artifactId>-->
<!-- <version>2.3.0</version>-->
<!-- </dependency>-->
<dependency>
<groupId>com.ververica</groupId>
<artifactId>flink-connector-oracle-cdc</artifactId>
<version>2.2.0</version>
</dependency>
<!-- <dependency>-->
<!-- <groupId>org.apache.flink</groupId>-->
<!-- <artifactId>flink-jdbc_${scala.binary.version}</artifactId>-->
<!-- <version>${flink.version}</version>-->
<!-- <scope>system</scope>-->
<!-- <systemPath>${project.basedir}/src/main/resources/libs/flink-connector-jdbc_2.11-1.13.6.jar</systemPath>-->
<!-- </dependency>-->
<!-- <dependency>-->
<!-- <groupId>com.oracle.database.jdbc</groupId>-->
<!-- <artifactId>ojdbc6</artifactId>-->
<!-- <version>11.2.0.4</version>-->
<!-- </dependency>-->
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-core -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-common</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<type>test-jar</type>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>${gson.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<!-- If you package the job, replace this with your actual main class -->
<mainClass>com.flink.cdc.demo.MysqlCdcMysql</mainClass>
</transformer>
<transformer
implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
</transformers>
<filters>
<filter>
<artifact>*:*:*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
I searched the Internet for a solution to error code ORA-00604; the answer was that the SYSTEM tablespace was insufficient, but the problem persisted after I enlarged the SYSTEM tablespace.
I searched the Internet for a solution to error code ORA-12705; the answer was that the NLS_LANG variable was configured incorrectly, but my NLS_LANG variable is configured correctly (NLS_LANG=AMERICAN_AMERICA.AL32UTF8, and the NLS_LANG value in the registry matches the one on the Oracle database). Moreover, everything works when I use Flink SQL JDBC to query the same Oracle database.
Finally, I really don't know how to solve this problem.
I hope someone can help me, if possible via remote desktop; I am willing to pay a reward in return.

Liquibase: 'dropForeignKeyConstrant' is not a valid element of a ChangeSet

I get this error:
Error setting up or running Liquibase: ChangeSet '02': 'dropForeignKeyConstrant' is not a valid element of a ChangeSet
pom.xml plugin (the properties file only contains credentials for the DB):
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.10.2</version>
<dependencies>
<dependency>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-groovy-dsl</artifactId>
<version>2.1.2</version>
</dependency>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.2.15</version>
</dependency>
</dependencies>
<configuration>
<propertyFile>src/main/resources/liquibase.properties</propertyFile>
</configuration>
</plugin>
ChangeSet (.groovy):
rollback {
dropUniqueConstraint(schemaName: schema_name, tableName: 'bucket', constraintName: 'u_bucket_name')
dropUniqueConstraint(schemaName: schema_name, tableName: 'version', constraintName: 'u_bucket_key')
dropForeignKeyConstrant(baseTableSchemaName: schema_name, baseTableName: 'file', constraintName: 'file_type_fk')
dropForeignKeyConstrant(baseTableSchemaName: schema_name, baseTableName: 'license', constraintName: 'file_fk')
}
As you can see, dropUniqueConstraint worked, but dropForeignKeyConstrant didn't.
Why do I get this error, and how do I fix it?
dropForeignKeyConstrant was written instead of dropForeignKeyConstraint (the 'i' in 'Constraint' is missing). Thanks for your attention, @andi
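With the typo fixed, the rollback block reads (same attributes, only the method name corrected):
rollback {
    dropUniqueConstraint(schemaName: schema_name, tableName: 'bucket', constraintName: 'u_bucket_name')
    dropUniqueConstraint(schemaName: schema_name, tableName: 'version', constraintName: 'u_bucket_key')
    dropForeignKeyConstraint(baseTableSchemaName: schema_name, baseTableName: 'file', constraintName: 'file_type_fk')
    dropForeignKeyConstraint(baseTableSchemaName: schema_name, baseTableName: 'license', constraintName: 'file_fk')
}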

Implement user behavior in Gatling via sequential http requests

Is there a way to implement sequential HTTP requests in one scenario?
What I want, clearly: a load test of a REST API that simulates user behavior.
1.1. User goes to /a
1.2. Gets some info
2.1. Then goes to /b with some previously taken info
2.2. ... and so on
And this scenario must be executed by some number of virtual users (VUs) at the same time.
But what I see in the graphs is all VUs doing request 1 simultaneously multiple times, then all doing request 2, and so on:
requests to /a go on for 10 s,
then requests to /b go on for 10 s.
But each route responds in about 20-30 ms, so it's not being delayed by waiting for a response.
That's not how users behave, and not what I want from my test.
What I am doing in my project:
pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>a</groupId>
<artifactId>b</artifactId>
<version>1.0.0</version> <!-- build_version -->
<properties>
<java.version>1.8</java.version>
<scala.version>2.12</scala.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<scala.maven.plugin.version>4.4.0</scala.maven.plugin.version>
<gatling.maven.plugin.version>3.0.5</gatling.maven.plugin.version>
<gatling.version>3.3.1</gatling.version>
</properties>
<dependencies>
<dependency>
<groupId>io.gatling.highcharts</groupId>
<artifactId>gatling-charts-highcharts</artifactId>
<version>${gatling.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
<repositories>
<repository>
<id>sonatype</id>
<url>https://oss.sonatype.org/content/repositories/releases/</url>
</repository>
</repositories>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<plugins>
<plugin>
<groupId>io.gatling</groupId>
<artifactId>gatling-maven-plugin</artifactId>
<version>${gatling.maven.plugin.version}</version>
<executions>
<execution>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>${scala.maven.plugin.version}</version>
<configuration>
<jvmArgs>
<jvmArg>-Xss100M</jvmArg>
</jvmArgs>
<args>
<arg>-target:jvm-${java.version}</arg>
<arg>-deprecation</arg>
<arg>-feature</arg>
<arg>-unchecked</arg>
<arg>-language:implicitConversions</arg>
<arg>-language:postfixOps</arg>
</args>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
.scala file:
import io.gatling.core.Predef._
import io.gatling.core.structure.ScenarioBuilder
import io.gatling.http.Predef._
import io.gatling.http.protocol.HttpProtocolBuilder
import scala.concurrent.duration._
class SimpleScenario extends Simulation {
val users = 200
val maxRPS = 200
val rampUp: FiniteDuration = 1 minutes
val duration: FiniteDuration = 1 minutes
val httpProtocol: HttpProtocolBuilder = http
.baseUrl("http://some.site")
val scn0: ScenarioBuilder = scenario("ASimulation")
.exec(
Seq(
exec(
http("a")
.post("/a")
.body(StringBody("""{"username": "a", "password": "b"}"""))
.check(status.is(200))
)
.exec(
http("b")
.get("/b")
.check(status.is(200))
)
// ... and so on
)
)
setUp(
scn0
.inject(
constantConcurrentUsers(users) during(rampUp + duration)
)
.throttle(
reachRps(maxRPS) in (rampUp),
holdFor(duration)
)
).protocols(httpProtocol)
}
The command I run to execute the test:
mvn gatling:test
And I have already tried:
.repeat(1)
.exec without Seq
a chain of .exec's outside of one
"so it's not being delayed by waiting for a response"
Yes, it is. Each virtual user will wait for the response to "a" before sending request "b".
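For the "go to /b with some previously taken info" part, the usual Gatling pattern is to save a value from the first response into the virtual user's session with a check and reference it in the next request. A minimal sketch that drops into the Simulation above (the $.token response field and the Authorization header are assumptions about your API, not something from the question):
val scn0: ScenarioBuilder = scenario("ASimulation")
  .exec(
    http("a")
      .post("/a")
      .body(StringBody("""{"username": "a", "password": "b"}"""))
      .check(status.is(200))
      // assumed response field, saved into this virtual user's session
      .check(jsonPath("$.token").saveAs("token"))
  )
  .exec(
    http("b")
      .get("/b")
      // reuse the previously taken info via Gatling EL
      .header("Authorization", "Bearer ${token}")
      .check(status.is(200))
  )
Each .exec only fires after the previous request of the same virtual user has completed, so the flow is already sequential per user.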

Quarkus + Panache + RestEasy Native Image build fails

I have the following simplified setup:
1)
import javax.ws.rs.GET;
import javax.ws.rs.Path;
#Path("/api")
public class MyResource {
public MyResource() {
}
@GET
@Path("/myPath/")
public void get() {
}
}
2)
import io.quarkus.hibernate.orm.panache.PanacheEntity;
import javax.persistence.Entity;
@Entity
public class MyEntity extends PanacheEntity {
public String hello;
public MyEntity() {
//For Panache only
}
}
3) pom.xml:
[...]
<properties>
<compiler-plugin.version>3.8.1</compiler-plugin.version>
<maven.compiler.source>11</maven.compiler.source>
<maven.compiler.target>11</maven.compiler.target>
<quarkus-plugin.version>1.5.0.Final</quarkus-plugin.version>
<quarkus.platform.artifact-id>quarkus-universe-bom</quarkus.platform.artifact-id>
<quarkus.platform.group-id>io.quarkus</quarkus.platform.group-id>
<quarkus.platform.version>1.5.0.Final</quarkus.platform.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>${quarkus.platform.group-id}</groupId>
<artifactId>${quarkus.platform.artifact-id}</artifactId>
<version>${quarkus.platform.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-hibernate-orm-panache</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-jdbc-mariadb</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-maven-plugin</artifactId>
<version>${quarkus-plugin.version}</version>
<executions>
<execution>
<goals>
<goal>build</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<id>native</id>
<activation>
<property>
<name>native</name>
</property>
</activation>
<build/>
<properties>
<quarkus.package.type>native</quarkus.package.type>
</properties>
</profile>
</profiles>
</project>
4) application.properties
quarkus.datasource.db-kind = mariadb
quarkus.datasource.username = admin
quarkus.datasource.password = admin
quarkus.datasource.jdbc.url = jdbc:mariadb://localhost:5432/mydatabase
quarkus.hibernate-orm.database.generation = drop-and-create
When I run this with the native maven profile (mvn clean package -Pnative) I get:
Fatal error: com.oracle.graal.pointsto.util.AnalysisError$ParsingError: Error encountered while parsing com.oracle.svm.reflect.Class_getNestHost_d0409f1154f6242e625526eadd05fbcd60e7d7e9.invoke(java.lang.Object, java.lang.Object[])
Parsing context:
parsing java.lang.reflect.Method.invoke(Method.java:566)
parsing javax.enterprise.util.AnnotationLiteral.invoke(AnnotationLiteral.java:288)
parsing javax.enterprise.util.AnnotationLiteral.getMemberValue(AnnotationLiteral.java:276)
parsing javax.enterprise.util.AnnotationLiteral.hashCode(AnnotationLiteral.java:246)
parsing org.graalvm.collections.EconomicMapImpl.getHashIndex(EconomicMapImpl.java:414)
[...]
Caused by: com.oracle.svm.hosted.substitute.DeletedElementException: Unsupported method java.lang.Class.getNestHost() is reachable: The declaring class of this element has been substituted, but this element is not present in the substitution class
[...]
To diagnose the issue, you can add the option --report-unsupported-elements-at-runtime.
Running it with --report-unsupported-elements-at-runtime didn't help much either.
When I delete the MyEntity class, it compiles successfully as a native executable on my Mac with graalvm-ce-java11-20.0.0.
Any idea what's wrong here?
I updated to 20.0.0 and everything works perfectly. The only way I found to hit that error is when GraalVM is not set up properly. The environment variables on my Mac are:
export GRAALVM_HOME=/Library/Java/JavaVirtualMachines/graalvm-ce-java11-20.0.0/Contents/Home
export JAVA_HOME=${GRAALVM_HOME}
export PATH=${GRAALVM_HOME}/bin:$PATH
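A quick way to confirm the toolchain is picked up (assuming GraalVM CE 20.0.0 with the native-image component installed, e.g. via gu install native-image) is to check the version and then rebuild with the command from the question:
native-image --version
mvn clean package -Pnative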
Let me know if that works for you.

Is it possible to run a Java cron job to export a BigQuery table?

I want to deploy a Java cron job that runs some queries and exports a table from BigQuery to Google Cloud Storage once a week. To do this, I've used the Google Plugin for Eclipse to upload the cron job to App Engine.
The problem is that my cron job calls a Java class that uses Google Maven dependencies to access BigQuery, and when the cron job is uploaded to App Engine, the error below appears:
Error for /cron/gaejcronjob
java.lang.NoClassDefFoundError: com/google/api/client/json/JsonFactory
I've read the question java.lang.ClassNotFoundException: com.google.api.client.json.JsonFactory, but its answers don't solve the problem.
Edit:
(added pom.xml, GAEJCronServlet.java and BigQuery.java code)
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>XTG.Cron.Jobs</groupId>
<artifactId>BigQuery</artifactId>
<version>0.0.1-SNAPSHOT</version>
<build>
<sourceDirectory>src</sourceDirectory>
<resources>
<resource>
<directory>src</directory>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.google.cloud.dataflow</groupId>
<artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
<version>LATEST</version>
</dependency>
</dependencies>
</project>
GAEJCronServlet.java:
package com.gaejexperiments.cron;
import java.io.IOException;
import java.util.logging.Logger;
import javax.servlet.ServletException;
import javax.servlet.http.*;
import com.gaejexperiments.cron.BigQuery;
@SuppressWarnings("serial")
public class GAEJCronServlet extends HttpServlet {
private static final Logger _logger = Logger.getLogger(GAEJCronServlet.class.getName());
public void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
try {
_logger.info("Cron Job has been executed");
BigQuery bigquery = new BigQuery();
bigquery.exportTable();
} catch (Exception ex) {
//_logger.info(ex);
}
}
@Override
public void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
doGet(req, resp);
}
}
BigQuery.java:
package com.gaejexperiments.cron;
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson.JacksonFactory;
import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.model.ErrorProto;
import com.google.api.services.bigquery.model.Job;
import com.google.api.services.bigquery.model.JobConfiguration;
import com.google.api.services.bigquery.model.JobConfigurationExtract;
import com.google.api.services.bigquery.model.JobReference;
import com.google.api.services.bigquery.model.TableReference;
public class BigQuery {
private final String PROJECT_ID = projectId;
private final String DATASET_ID = "bigquerytest";
private final String TABLE_ID = "test";
private Bigquery service = null;
public void main(String[] args) {
try {
HttpTransport httpTransport = new NetHttpTransport();
JsonFactory jsonFactory = new JacksonFactory();
GoogleCredential credential = GoogleCredential.getApplicationDefault(httpTransport, jsonFactory);
Bigquery.Builder serviceBuilder =
new Bigquery.Builder(httpTransport, jsonFactory, credential)
.setApplicationName("Bigquery ");
service = serviceBuilder.build();
if (service == null || service.jobs() == null) {
throw new Exception("Service is null");
}
}
catch (Exception ex) {
System.out.println("Caught exception: " + ex + "\n");
ex.printStackTrace();
System.exit(1);
}
System.exit(0);
}
public void exportTable() throws Exception{
//Export
TableReference sourceTable = new TableReference();
sourceTable.setProjectId(PROJECT_ID);
sourceTable.setDatasetId(DATASET_ID);
sourceTable.setTableId(TABLE_ID);
JobConfigurationExtract jobExtract = new JobConfigurationExtract();
jobExtract.setDestinationFormat("CSV");
jobExtract.setDestinationUri("gs://xtg-bigquery/test1.csv");
jobExtract.setSourceTable(sourceTable);
JobConfiguration jobConfig = new JobConfiguration();
jobConfig.setExtract(jobExtract);
JobReference jobRef = new JobReference();
jobRef.setProjectId(PROJECT_ID);
Job outputJob = new Job();
outputJob.setConfiguration(jobConfig);
outputJob.setJobReference(jobRef);
Job job = service.jobs().insert(PROJECT_ID,
outputJob).execute();
if (job == null) {
throw new Exception("Job is null");
}
while (true) {
// Poll the job until it is DONE; re-fetch it each time so the status can change
Job polledJob = service.jobs().get(PROJECT_ID, job.getJobReference().getJobId()).execute();
String status = polledJob.getStatus().getState();
if ("DONE".equalsIgnoreCase(status)) {
job = polledJob;
break;
}
Thread.sleep(1000);
}
ErrorProto errorResult = job.getStatus().getErrorResult();
if (errorResult != null) {
throw new Exception("Error running job: " + errorResult);
}
}
}
You're missing a couple of App Engine specific pom settings. The recommended approach is to create the pom.xml from the App Engine archetype like this (as described here):
mvn archetype:generate -Dappengine-version=1.9.30 -Dapplication-id=your-app-id -Dfilter=com.google.appengine.archetypes:appengine-skeleton-archetype
Alternatively, you can add the build plugins to your existing pom.xml; the build section should then look something like this (that's basically what the archetype will create for you):
<build>
<!-- for hot reload of the web application-->
<outputDirectory>${project.build.directory}/${project.build.finalName}/WEB-INF/classes</outputDirectory>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<version>3.1</version>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<configuration>
<archiveClasses>true</archiveClasses>
<webResources>
<!-- in order to interpolate version from pom into appengine-web.xml -->
<resource>
<directory>${basedir}/src/main/webapp/WEB-INF</directory>
<filtering>true</filtering>
<targetPath>WEB-INF</targetPath>
</resource>
</webResources>
</configuration>
</plugin>
<plugin>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-maven-plugin</artifactId>
<version>${appengine.version}</version>
<configuration>
<enableJarClasses>false</enableJarClasses>
<version>${app.version}</version>
<!-- Comment in the below snippet to bind to all IPs instead of just localhost -->
<address>0.0.0.0</address>
<port>8080</port>
<!-- Comment in the below snippet to enable local debugging with a remote debugger
like those included with Eclipse or IntelliJ -->
<jvmFlags>
<jvmFlag>-agentlib:jdwp=transport=dt_socket,address=8000,server=y,suspend=n</jvmFlag>
</jvmFlags>
</configuration>
</plugin>
<plugin>
<groupId>com.google.appengine</groupId>
<artifactId>gcloud-maven-plugin</artifactId>
<version>${gcloud.plugin.version}</version>
<configuration>
<set_default>true</set_default>
</configuration>
</plugin>
</plugins>
</build>
You should also add the App Engine SDK to your dependencies; mine usually looks like this:
<!-- Compile/runtime dependencies -->
<dependency>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-api-1.0-sdk</artifactId>
<version>${appengine.version}</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>jstl</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
Last but not least, the packaging of App Engine projects is usually set to WAR:
<packaging>war</packaging>
Having set all that up (and having an appengine-web.xml present in WEB-INF), you can deploy your App Engine application with:
mvn appengine:update
I recommend creating a new project with the archetype and copying your content into it. That's much easier than adding all those configurations to your existing project.
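For reference, the weekly trigger itself is configured in WEB-INF/cron.xml; here is a sketch matching the servlet path from your error message (the schedule value is an assumption, adjust as needed):
<cronentries>
    <cron>
        <url>/cron/gaejcronjob</url>
        <description>Weekly BigQuery table export</description>
        <schedule>every monday 09:00</schedule>
    </cron>
</cronentries>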
