How to call a remote EJB 3.1 with its client on OpenLiberty - ejb-3.1

I'm new to Open Liberty. I need to call an EJB 3.1 deployed on WAS 8.5, but I can't find any documentation or tutorial on how to do it.
Is this feature present in Open Liberty? Which features should I enable in my server.xml? And what is the code to call the server?
UPDATE
After reading some documentation, I've tried to make the call with the following server.xml
<server description="Sample Liberty server">
    <featureManager>
        <feature>webProfile-7.0</feature>
    </featureManager>

    <variable name="default.http.port" defaultValue="9080" />
    <variable name="default.https.port" defaultValue="9443" />
    <variable name="app.context.root" defaultValue="ServletSample" />

    <webApplication location="open-liberty-template.war" contextRoot="/test" />

    <keyStore id="defaultKeyStore" password="keypassword"/>

    <!-- EXAMPLE TRACE: when you visit the /health endpoint, additional traces
         are logged in the trace.log file. -->
    <!-- <logging traceSpecification="com.ibm.ws.microprofile.health.*=all"></logging> -->

    <httpEndpoint host="*" httpPort="${default.http.port}"
                  httpsPort="${default.https.port}" id="defaultHttpEndpoint" />
    <httpDispatcher enableWelcomePage="false" />
</server>
and the following servlet:
import java.io.IOException;
import java.util.Properties;

import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/RemoteTxAttrServlet")
public class TestServlet extends HttpServlet {

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        Properties p = new Properties();
        // p.put(Context.PROVIDER_URL, "iiop://10.64.2.93:2821");
        Object found = null;
        try {
            Context c = new InitialContext();
            found = c.lookup(
                "corbaname::ip:port#ejb/global/ear-module/ejb-module/BeanName!package.Interface");
        } catch (NamingException e) {
            e.printStackTrace();
        }
        System.out.println(found);
    }
}
but I obtain the following Exception:
[INFO] [err] javax.naming.NameNotFoundException: Intermediate context does not exist: corbaname::10.64.2.93:2821#ejb/global
[INFO] [err] at com.ibm.ws.jndi.internal.ContextNode.getTargetNode(ContextNode.java:125)
[INFO] [err] at [internal classes]
[INFO] [err] at java.naming/javax.naming.InitialContext.lookup(InitialContext.java:409)
[INFO] [err] at it.aci.grpc.services.TestServlet.doGet(TestServlet.java:27)
[INFO] [err] at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
[INFO] [err] at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
[INFO] [err] at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1258)
[INFO] [err] at [internal classes]
[INFO] [err] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[INFO] [err] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[INFO] [err] at java.base/java.lang.Thread.run(Thread.java:834)
Furthermore, I cannot find javax.rmi.PortableRemoteObject on the classpath to narrow the Object reference to my target remote interface.
How can I fix it?
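(Side note, not part of the original post: the `webProfile-7.0` feature alone does not provide an IIOP client, which would explain both the failed `corbaname` lookup and the missing `javax.rmi.PortableRemoteObject`. Assuming Liberty's `ejbRemote-3.2` feature is the right enabler for remote EJB/ORB support — treat this as a hedged sketch to verify against the Liberty docs — the featureManager would look like:)

```xml
<featureManager>
    <feature>webProfile-7.0</feature>
    <!-- Assumed: ejbRemote-3.2 pulls in the IIOP/ORB client classes
         (including javax.rmi) needed for corbaname: lookups. -->
    <feature>ejbRemote-3.2</feature>
</featureManager>
```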

Related

Failed to execute Dataflow job from GAE standard environment?

I'm trying to start a Dataflow pipeline job from a servlet in the GAE standard environment. The job reads from a Cloud SQL database and writes to BigQuery, both in the same project. As far as I can tell, the Cloud Dataflow service account has all the needed permissions (Storage Object Admin, Cloud SQL Client, BigQuery Admin, BigQuery Data Editor). I get the following exception:
java.lang.RuntimeException: Interrupted while staging packages
at org.apache.beam.runners.dataflow.util.PackageUtil.stageClasspathElements(PackageUtil.java:401)
at org.apache.beam.runners.dataflow.util.PackageUtil.stageClasspathElements(PackageUtil.java:273)
at org.apache.beam.runners.dataflow.util.GcsStager.stageFiles(GcsStager.java:82)
at org.apache.beam.runners.dataflow.util.GcsStager.stageDefaultFiles(GcsStager.java:70)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:741)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:186)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
at com.mycompany.prj.servers.dataflow.PipelineExecutor.run(PipelineExecutor.java:138)
at com.mycompany.prj.servers.dataflow.CloudSqlToBigQueryExporter.run(CloudSqlToBigQueryExporter.java:226)
at com.mycompany.prj.servers.dataflow.CloudSqlToBigQueryExporter.doTest(CloudSqlToBigQueryExporter.java:281)
at com.mycompany.prj.server.web.CronServlet.doGet(Unknown Source)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1780)
at com.mycompany.prj.server.web.TxFilter.doFilter(Unknown Source)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at com.googlecode.objectify.ObjectifyFilter.doFilter(ObjectifyFilter.java:48)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at com.mycompany.prj.servers.web.HttpCrossOriginFilter.doFilter(HttpCrossOriginFilter.java:23)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at com.mycompany.prj.servers.web.LogSchemeFilter.doFilter(LogSchemeFilter.java:17)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at com.google.apphosting.utils.servlet.JdbcMySqlConnectionCleanupFilter.doFilter(JdbcMySqlConnectionCleanupFilter.java:60)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:513)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at com.google.apphosting.runtime.jetty9.ParseBlobUploadHandler.handle(ParseBlobUploadHandler.java:111)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1182)
at com.google.apphosting.runtime.jetty9.AppEngineWebAppContext.doHandle(AppEngineWebAppContext.java:187)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:513)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at com.google.apphosting.runtime.jetty9.AppVersionHandlerMap.handle(AppVersionHandlerMap.java:293)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.eclipse.jetty.server.Server.handle(Server.java:539)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333)
at com.google.apphosting.runtime.jetty9.RpcConnection.handle(RpcConnection.java:216)
at com.google.apphosting.runtime.jetty9.RpcConnector.serviceRequest(RpcConnector.java:81)
at com.google.apphosting.runtime.jetty9.JettyServletEngineAdapter.serviceRequest(JettyServletEngineAdapter.java:134)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.dispatchServletRequest(JavaRuntime.java:772)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.dispatchRequest(JavaRuntime.java:735)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.run(JavaRuntime.java:705)
at com.google.apphosting.runtime.ThreadGroupPool$PoolEntry.run(ThreadGroupPool.java:261)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:347)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1915)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:68)
at org.apache.beam.runners.dataflow.util.PackageUtil.stageClasspathElements(PackageUtil.java:384)
... 50 more
I managed to run it on GAE standard. The problem was that I had never scheduled it as a cron job, only run it through a URL trigger. So for one hour I scheduled a dozen requests, and most of them passed. But some failed with this exception:
Failed java.lang.RuntimeException: Error while staging packages
java.lang.RuntimeException: Error while staging packages
at org.apache.beam.runners.dataflow.util.PackageUtil.stageClasspathElements(PackageUtil.java:403)
at org.apache.beam.runners.dataflow.util.PackageUtil.stageClasspathElements(PackageUtil.java:273)
at org.apache.beam.runners.dataflow.util.GcsStager.stageFiles(GcsStager.java:82)
at org.apache.beam.runners.dataflow.util.GcsStager.stageDefaultFiles(GcsStager.java:70)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:741)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:186)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
at com.mycompany.prj.servers.dataflow.PipelineExecutor.run(PipelineExecutor.java:138)
at com.mycompany.prj.servers.dataflow.CloudSqlToBigQueryExporter.run(CloudSqlToBigQueryExporter.java:245)
at com.mycompany.prj.servers.dataflow.CloudSqlToBigQueryExporter.doTestOnExp(CloudSqlToBigQueryExporter.java:304)
at com.mycompany.prj.servers.dataflow.CloudSqlToBigQueryExporter.doTestOnExp(CloudSqlToBigQueryExporter.java:285)
at com.mycompany.prj.server.web.CronServlet.doGet(Unknown Source)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1780)
at com.mycompany.prj.server.web.TxFilter.doFilter(Unknown Source)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at com.googlecode.objectify.ObjectifyFilter.doFilter(ObjectifyFilter.java:48)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at com.mycompany.prj.servers.web.HttpCrossOriginFilter.doFilter(HttpCrossOriginFilter.java:23)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at com.mycompany.prj.servers.web.LogSchemeFilter.doFilter(LogSchemeFilter.java:17)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at com.google.apphosting.utils.servlet.JdbcMySqlConnectionCleanupFilter.doFilter(JdbcMySqlConnectionCleanupFilter.java:60)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1767)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:524)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at com.google.apphosting.runtime.jetty9.ParseBlobUploadHandler.handle(ParseBlobUploadHandler.java:111)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1182)
at com.google.apphosting.runtime.jetty9.AppEngineWebAppContext.doHandle(AppEngineWebAppContext.java:187)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:513)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at com.google.apphosting.runtime.jetty9.AppVersionHandlerMap.handle(AppVersionHandlerMap.java:293)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.eclipse.jetty.server.Server.handle(Server.java:539)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333)
at com.google.apphosting.runtime.jetty9.RpcConnection.handle(RpcConnection.java:216)
at com.google.apphosting.runtime.jetty9.RpcConnector.serviceRequest(RpcConnector.java:81)
at com.google.apphosting.runtime.jetty9.JettyServletEngineAdapter.serviceRequest(JettyServletEngineAdapter.java:134)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.dispatchServletRequest(JavaRuntime.java:772)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.dispatchRequest(JavaRuntime.java:735)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.run(JavaRuntime.java:705)
at com.google.apphosting.runtime.ThreadGroupPool$PoolEntry.run(ThreadGroupPool.java:261)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException: Operation not allowed in a thread that is neither the original request thread nor a thread created by ThreadManager
at com.google.apphosting.runtime.ApiProxyImpl$CurrentRequestThreadFactory.newThread(ApiProxyImpl.java:1310)
at java.util.concurrent.ThreadPoolExecutor$Worker.(ThreadPoolExecutor.java:619)
at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:932)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1367)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.MoreExecutors$ListeningDecorator.execute(MoreExecutors.java:537)
at java.util.concurrent.CompletableFuture.asyncRunStage(CompletableFuture.java:1640)
at java.util.concurrent.CompletableFuture.runAsync(CompletableFuture.java:1858)
at org.apache.beam.sdk.util.MoreFutures.supplyAsync(MoreFutures.java:101)
at org.apache.beam.runners.dataflow.util.PackageUtil.stagePackage(PackageUtil.java:172)
at org.apache.beam.runners.dataflow.util.PackageUtil.lambda$stageClasspathElements$2(PackageUtil.java:362)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:952)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:443)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
My appengine-web.xml looks like this:
<?xml version="1.0" encoding="utf-8"?>
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
    <application>my-app-name</application>
    <version>1</version>
    <threadsafe>true</threadsafe>
    <sessions-enabled>true</sessions-enabled>
    <runtime>java8</runtime>
    <system-properties>
        <property name="java.util.logging.config.file" value="WEB-INF/logging.properties"/>
        <property name="file.encoding" value="UTF-8" />
        <property name="DEFAULT_ENCODING" value="UTF-8" />
    </system-properties>
    <instance-class>F1</instance-class>
    <automatic-scaling>
        <min-idle-instances>0</min-idle-instances>
        <max-idle-instances>1</max-idle-instances>
    </automatic-scaling>
</appengine-web-app>
The cron servlet is pretty simple:
public class CronServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
        String command = req.getParameter("command");
        if (command == null) {
            return;
        }
        if (command.equals("dataflow-job-test")) {
            CloudSqlToBigQueryExporter.doTestOnExp();
            //return;
        }
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) {
        doGet(req, resp);
    }
}
and the cron.yaml is like this:
cron:
- url: '/cron?command=dataflow-job-test'
  schedule: 'every 5 minutes'
  description: 'Dataflow job execution test'
The issue you are getting here, as described in the error message:
Operation not allowed in a thread that is neither the original request thread nor a thread created by ThreadManager
is that your servlet (indirectly, through the Dataflow runner) creates new threads, which is not allowed on App Engine standard unless you use GAE's ThreadManager instead of traditional thread creation.
So, in order to fix it, you need to change your code so that threads are created via currentRequestThreadFactory().
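The ThreadFactory-based pattern can be sketched in isolation as follows. This is only an illustration of the mechanism, not the actual Beam wiring: on App Engine you would pass `com.google.appengine.api.ThreadManager.currentRequestThreadFactory()` where this sketch uses `Executors.defaultThreadFactory()` so that it can run locally.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadFactory;

public class RequestThreadExample {

    // Run async work on an executor backed by a caller-supplied ThreadFactory.
    // On GAE standard, supplying ThreadManager.currentRequestThreadFactory()
    // ensures every thread is one the runtime allows.
    public static int runOnFactory(ThreadFactory factory) throws Exception {
        ExecutorService exec = Executors.newSingleThreadExecutor(factory);
        try {
            return CompletableFuture.supplyAsync(() -> 21 * 2, exec).get();
        } finally {
            exec.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        // Locally we stand in a plain factory; on App Engine use
        // ThreadManager.currentRequestThreadFactory() (hypothetical wiring).
        System.out.println(runOnFactory(Executors.defaultThreadFactory()));
    }
}
```

The key design point is that the staging code in Beam fails only because `CompletableFuture.runAsync` defaults to the common ForkJoinPool, whose worker threads GAE rejects; routing the async work through an executor built from an approved factory avoids that.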

error while running apache beam word count example on standalone flink cluster (Apache flink 1.9.0)

Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Class org.apache.beam.model.pipeline.v1.RunnerApi$StandardPTransforms$Primitives does not implement the requested interface org.apache.beam.vendor.grpc.v1p26p0.com.google.protobuf.ProtocolMessageEnum
at org.apache.beam.runners.core.construction.BeamUrns.getUrn (BeamUrns.java:27)
at org.apache.beam.runners.core.construction.PTransformTranslation.<clinit> (PTransformTranslation.java:135)
at org.apache.beam.runners.core.construction.PTransformMatchers$6.matches (PTransformMatchers.java:263)
at org.apache.beam.sdk.Pipeline$2.enterCompositeTransform (Pipeline.java:270)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit (TransformHierarchy.java:653)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit (TransformHierarchy.java:657)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600 (TransformHierarchy.java:317)
at org.apache.beam.sdk.runners.TransformHierarchy.visit (TransformHierarchy.java:251)
at org.apache.beam.sdk.Pipeline.traverseTopologically (Pipeline.java:460)
at org.apache.beam.sdk.Pipeline.replace (Pipeline.java:260)
at org.apache.beam.sdk.Pipeline.replaceAll (Pipeline.java:210)
at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate (FlinkPipelineExecutionEnvironment.java:114)
at org.apache.beam.runners.flink.FlinkRunner.run (FlinkRunner.java:107)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:315)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:301)
at org.apache.beam.examples.WordCount.runWordCount (WordCount.java:208)
at org.apache.beam.examples.WordCount.main (WordCount.java:250)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:748)
<dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-runners-flink-1.9</artifactId>
    <version>2.17.0</version>
</dependency>
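(A `ProtocolMessageEnum` interface mismatch like the one above is typically a symptom of mixed Beam artifact versions on the classpath, e.g. `beam-sdks-java-core` resolved from a different release than `beam-runners-flink-1.9`. As a hedged sketch, assuming all Beam modules should be pinned to 2.17.0, a single version property keeps them aligned:)

```xml
<properties>
    <beam.version>2.17.0</beam.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.beam</groupId>
        <artifactId>beam-sdks-java-core</artifactId>
        <version>${beam.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.beam</groupId>
        <artifactId>beam-runners-flink-1.9</artifactId>
        <version>${beam.version}</version>
    </dependency>
</dependencies>
```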

Getting NoClassDefFoundError in Configjar execution when building OSB project using ANT

I am changing my OSB 11g project build process to use the configjar tool instead of ANT argument lines, in order to build at the individual resource level rather than the project level.
I have created a settings file and a build.xml for this and am trying to build the project. I have also set the environment variables using the osb_env.sh file on the server classpath.
Still, I get an error about a logger class, even though I have not changed anything related to it in my files.
Please see error below:
clean:
copylibs:
createSbConfigJar:
[echo] OSB project config creation started
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] de:osb-services ......................... SUCCESS [1.000s]
[INFO] de.osb-services:base64 .................. SUCCESS [3.459s]
[INFO] de.osb-services:logging-common .......... SUCCESS [0.699s]
[INFO] de.osb-services:logging ................. SUCCESS [2.884s]
[INFO] de.osb-services:loggingOutbound ......... SUCCESS [1.396s]
[INFO] de.osb-services:monitoring .............. SUCCESS [1.038s]
[INFO] de.osb-services:build ................... FAILURE [0.572s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11.546s
[INFO] Finished at: Wed Jun 12 12:42:49 CEST 2019
[INFO] Final Memory: 16M/249M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.3:run (deploy) on project build: An Ant BuildException has occured: The following error occurred while executing this line:
[ERROR] /pkg/vddi9/home/oradi9/jenkins/jenkins-data/workspace/OSB-R19.9_dev_test_7/build/build.xml:38: java.lang.NoClassDefFoundError: weblogic/i18n/logging/NonCatalogLogger: weblogic.i18n.logging.NonCatalogLogger
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.3:run (deploy) on project build: An Ant BuildException has occured: The following error occurred while executing this line:
/pkg/vddi9/home/oradi9/jenkins/jenkins-data/workspace/OSB-R19.9_dev_test_7/build/build.xml:38: java.lang.NoClassDefFoundError: weblogic/i18n/logging/NonCatalogLogger
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: The following error occurred while executing this line:
/pkg/vddi9/home/oradi9/jenkins/jenkins-data/workspace/OSB-R19.9_dev_test_7/build/build.xml:38: java.lang.NoClassDefFoundError: weblogic/i18n/logging/NonCatalogLogger
at org.apache.maven.plugin.antrun.AbstractAntMojo.executeTasks(AbstractAntMojo.java:131)
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:98)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
Can you please tell me what the issue might be here?
Please see below the existing (working) build.xml and the changed (non-working) build.xml code.
(Old) build.xml:
<target name="buildOsbProject">
    <java dir="${osb.home}/lib" jar="${eclipse.home}/plugins/${equinox.launcher.jar}" fork="true" failonerror="true" maxmemory="768m" >
        <jvmarg line="-XX:MaxPermSize=256m" />
        <arg line="-data ${workspace.dir}" />
        <arg line="-application com.bea.alsb.core.ConfigExport" />
        <arg line="-configProject OSBInboundServicesConfig" />
        <arg line="-configJar ${osbProject.configJar}" />
        <arg line="-configSubProjects OSBInboundServices" />
        <arg line="-includeDependencies true" />
        <arg line="--launcher.library plugins/${equinox.launcher.lib}" />
        <sysproperty key="weblogic.home" value="${weblogic.home}" />
        <sysproperty key="osb.home" value="${osb.home}" />
        <sysproperty key="osgi.bundlefile.limit" value="750" />
        <sysproperty key="harvester.home" value="${osb.home}/harvester" />
        <sysproperty key="osgi.nl" value="en_US" />
        <sysproperty key="sun.lang.ClassLoader.allowArraySyntax" value="true" />
        <sysproperty key="middleware.home" value="${middleware.home}" />
        <sysproperty key="com.bea.plateng.common.util.logging.useJDKlogger" value="true"/>
        <sysproperty key="java.io.tmpdir" value="/tmp/salosb_tmp"/>
    </java>
</target>
(New) build.xml
<taskdef resource="net/sf/antcontrib/antlib.xml">
    <classpath>
        <pathelement location="${workspace.dir}/build/lib/ant-contrib.jar"/>
    </classpath>
</taskdef>

<!-- Task Definition -->
<taskdef name="configjar" classname="com.bea.alsb.tools.configjar.ant.ConfigJarTask">
    <classpath>
        <pathelement location="${osb.home}/tools/configjar/configjar.jar"/>
    </classpath>
</taskdef>

<!-- This is the classpath for the java runtime -->
<path id="classpath.java">
    <fileset dir="lib">
        <include name="*.jar"/>
    </fileset>
</path>

<target name="createSbConfigJar">
    <echo>OSB project config creation started</echo>
    <property name="settingsFileFull" value="${workspace.dir}/build/settings-file.xml"/>
    <property name="targetDirectory" value="${workspace.dir}/build"/>
    <configjar debug="false" failonerror="true" errorProperty="" settingsFile="${settingsFileFull}"/>
</target>

<target name="buildDeployExportDelete" depends="clean,copylibs,createSbConfigJar,deleteFromOSB,prepareCustomization,importToOSB,setAuthorizationInformation,exportFromOSB">
    <antcall target="deleteFromOSB">
    </antcall>
</target>
settings-file.xml
<configjarSettings xmlns="http://www.bea.com/alsb/tools/configjar/config">
    <source>
        <project dir="${workspace.dir}/OSBInboundServices"/>
        <system dir="${workspace.dir}/OSBInboundServicesConfig"/>
    </source>
    <configjar jar="${workspace.dir}/sbconfig.jar" overwrite="true">
        <projectLevel includeSystem="true"/>
    </configjar>
</configjarSettings>
OSB_Env.sh
#!/bin/sh
export MW_HOME=/pkg/home/nonarch/Oracle/Middleware
export OSB_HOME=/pkg/home/nonarch/Oracle/Middleware/Oracle_OSB1
export JAVA_HOME=/opt/java6
export BEA_HOME=$MW_HOME
export WL_HOME=$MW_HOME/wlserver_10.3
export WLS_VER=10.3
export COMMON_COMPONENTS_HOME=$MW_HOME/oracle_common
export MODULES_DIR=$MW_HOME/modules
export PATH=$WL_HOME/server/bin:$JAVA_HOME/jre/bin:$JAVA_HOME/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin
export MEM_ARGS="-Xms32m -Xmx200m -XX:MaxPermSize=128m"
export CONFIGJAR_HOME="$OSB_HOME/tools/configjar"
export JAVA_OPTS="-Dosb.home=$OSB_HOME -Dweblogic.home=$WL_HOME"
export JAVA_OPTIONS=" -Xverify:none"
CLASSPATH=$MW_HOME/modules/features/weblogic.server.modules_10.3.6.0.jar
CLASSPATH=$CLASSPATH:$WL_HOME/server/lib/weblogic.jar
CLASSPATH=$CLASSPATH:$MW_HOME/oracle_common/modules/oracle.http_client_11.1.1.jar:$MW_HOME/oracle_common/modules/oracle.xdk_11.1.0/xmlparserv2.jar:$MW_HOME/oracle_common/modules/oracle.webservices_11.1.1/orawsdl.jar:$MW_HOME/oracle_common/modules/oracle.wsm.common_11.1.1/wsm-dependencies.jar
CLASSPATH=$CLASSPATH:$OSB_HOME/modules/features/osb.server.modules_11.1.1.7.jar:$OSB_HOME/soa/modules/oracle.soa.common.adapters_11.1.1/oracle.soa.common.adapters.jar:$OSB_HOME/lib/external/log4j_1.2.8.jar:$OSB_HOME/lib/alsb.jar
CLASSPATH=$CLASSPATH:$CONFIGJAR_HOME/configjar.jar:$CONFIGJAR_HOME/L10N
export CLASSPATH
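(One hedged observation: an Ant `<taskdef>` resolves classes only from its own `<classpath>`; the `CLASSPATH` exported in OSB_Env.sh is not automatically visible to the task. Since `weblogic.i18n.logging.NonCatalogLogger` lives in weblogic.jar, a sketch of a widened taskdef classpath, reusing the paths from the environment script above, might look like:)

```xml
<taskdef name="configjar" classname="com.bea.alsb.tools.configjar.ant.ConfigJarTask">
    <classpath>
        <pathelement location="${osb.home}/tools/configjar/configjar.jar"/>
        <!-- Assumed locations: weblogic.jar provides
             weblogic.i18n.logging.NonCatalogLogger -->
        <pathelement location="${weblogic.home}/server/lib/weblogic.jar"/>
        <pathelement location="${osb.home}/lib/alsb.jar"/>
    </classpath>
</taskdef>
```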

org.objectweb.jndi.DataSourceFactory Class not found exception when Tomcat 7 server deploy

When I start my Tomcat 7 server, I get the following warning:
WARNING: Failed to register in JMX: [javax.naming.NamingException:
Could not load resource factory class [Root exception is
java.lang.ClassNotFoundException:
org.objectweb.jndi.DataSourceFactory]]
Since this is only a warning, I skip it and continue. Then, once I try to create a database connection, I get the following exception:
javax.naming.NamingException: Could not load resource factory class [Root exception is java.lang.ClassNotFoundException: org.objectweb.jndi.DataSourceFactory]
at org.apache.naming.factory.ResourceFactory.getObjectInstance(ResourceFactory.java:82)
at javax.naming.spi.NamingManager.getObjectInstance(NamingManager.java:321)
at org.apache.naming.NamingContext.lookup(NamingContext.java:848)
at org.apache.naming.NamingContext.lookup(NamingContext.java:159)
at org.apache.naming.NamingContext.lookup(NamingContext.java:836)
at org.apache.naming.NamingContext.lookup(NamingContext.java:159)
at org.apache.naming.NamingContext.lookup(NamingContext.java:836)
at org.apache.naming.NamingContext.lookup(NamingContext.java:159)
at org.apache.naming.NamingContext.lookup(NamingContext.java:836)
at org.apache.naming.NamingContext.lookup(NamingContext.java:173)
at org.apache.naming.SelectorContext.lookup(SelectorContext.java:156)
at javax.naming.InitialContext.lookup(InitialContext.java:417)
at se.cambiosys.spider.FHIRServermodule.connection.DBConnectionToolkit.getConnection(DBConnectionToolkit.java:38)
at se.cambiosys.spider.FHIRServermodule.internalToolkit.WSUserInternalToolkit.isValidUser(WSUserInternalToolkit.java:187)
at se.cambiosys.spider.FHIRServermodule.toolkit.WSUserToolkit.isValidUser(WSUserToolkit.java:23)
at se.cambiosys.spider.FHIRServermodule.servlets.AccessTokenServlet.validateCredentials(AccessTokenServlet.java:315)
at se.cambiosys.spider.FHIRServermodule.servlets.AccessTokenServlet.handlePasswordGrant(AccessTokenServlet.java:139)
at se.cambiosys.spider.FHIRServermodule.servlets.AccessTokenServlet.handleGrants(AccessTokenServlet.java:86)
at se.cambiosys.spider.FHIRServermodule.servlets.AccessTokenServlet.doPost(AccessTokenServlet.java:57)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:650)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:494)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:445)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1137)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:637)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.objectweb.jndi.DataSourceFactory
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1928)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1771)
at org.apache.naming.factory.ResourceFactory.getObjectInstance(ResourceFactory.java:80)
Am I using the wrong jar file in the Tomcat server, or is there some other problem?
InitialContext ic = new InitialContext();
DataSource da = (DataSource) ic.lookup("java:comp/env/jdbc/fhirDB"); // error occurs here
return da.getConnection();
Here is my context.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<Context reloadable="true">
<!-- path="/CosmicFHIRService" docBase="CosmicFHIRService.war" reloadable="true" crossContext="true" -->
<Resource name="jdbc/fhirDB" auth="Container"
type="javax.sql.DataSource" username="spider3" password="spider3"
factory="org.objectweb.jndi.DataSourceFactory"
driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver"
url="jdbc:sqlserver://CSLK-DKDB-81-1:1433;DatabaseName=FAROE_SPIDERIII;SelectMethod=cursor;"
maxActive="8" />
<Transaction factory="org.objectweb.jotm.UserTransactionFactory" jotm.timeout="60" />
</Context>
The class org.objectweb.jndi.DataSourceFactory is not on your Tomcat's classpath. You have to place the .jar file containing it into Tomcat's lib folder: $CATALINA_HOME/lib/
Edit:
As per your comment, it was not your intention to actually use this class at all. It is your context.xml file that refers to the non-existent class:
factory="org.objectweb.jndi.DataSourceFactory"
and the next reference is likely to fail as well:
<Transaction factory="org.objectweb.jotm.UserTransactionFactory".../>
You can simply switch to alternative implementations, updating further attributes as necessary, as described in the Apache Tomcat documentation.
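As a sketch, if you only need a plain connection pool, you can drop the factory attribute entirely so Tomcat falls back to its built-in DBCP pooling factory (names and credentials copied from your snippet; note that in Tomcat 8+ the DBCP 2 pool uses maxTotal instead of maxActive):

```xml
<Resource name="jdbc/fhirDB" auth="Container"
          type="javax.sql.DataSource"
          username="spider3" password="spider3"
          driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver"
          url="jdbc:sqlserver://CSLK-DKDB-81-1:1433;DatabaseName=FAROE_SPIDERIII;SelectMethod=cursor;"
          maxTotal="8" />
```

With no factory attribute, Tomcat instantiates the pool itself and no extra jar is required beyond the JDBC driver in $CATALINA_HOME/lib.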

Facing an error while running the training example "writing to Apache Kafka"

I copied the sample code from the hands-on examples of Apache Flink and tried to get it to run. The code is as follows:
public class RideCleansing {
private static final String LOCAL_KAFKA_BROKER = "localhost:9092";
public static final String CLEANSED_RIDES_TOPIC = "mytopic";
public static void main(String[] args) throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<TaxiRide> rides = env.addSource(new TaxiRideGenerator("C://data/nycTaxiRides.gz", 1000.0f));
DataStream<TaxiRide> filteredRides = rides.filter(new NYCFilter());
filteredRides.addSink(new FlinkKafkaProducer<>(LOCAL_KAFKA_BROKER,
CLEANSED_RIDES_TOPIC,
new TaxiRideSchema()));
env.execute("Taxi Ride Cleansing");
}
}
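For reference, NYCFilter is not shown above; in the Flink training exercises it is a FilterFunction<TaxiRide> that keeps only rides starting and ending inside New York City (the real class delegates to a GeoUtils helper). A minimal self-contained sketch of the underlying check, with approximate bounding-box values that are illustrative assumptions rather than the training's exact constants:

```java
// Hypothetical reconstruction of the geo check behind NYCFilter.
// In the exercise, NYCFilter implements FilterFunction<TaxiRide> and its
// filter() method applies this test to both the start and end coordinates.
public class NYCFilter {
    // Approximate NYC bounding box (illustrative values, not the training's).
    static final double LON_WEST = -74.05;
    static final double LON_EAST = -73.70;
    static final double LAT_SOUTH = 40.50;
    static final double LAT_NORTH = 41.00;

    static boolean isInNYC(double lon, double lat) {
        // A point is "in NYC" if it falls inside the bounding box.
        return lon >= LON_WEST && lon <= LON_EAST
            && lat >= LAT_SOUTH && lat <= LAT_NORTH;
    }
}
```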
I have Apache Kafka installed and running locally. I get the following error!
Any help is appreciated!
18:43:15,734 INFO org.apache.flink.api.java.typeutils.TypeExtractor - class org.joda.time.DateTime is not a valid POJO type
Exception in thread "main" java.lang.NoClassDefFoundError: kafka/producer/Partitioner
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.dataArtisans.flinkTraining.exercises.dataStreamJava.rideCleansing.RideCleansing.main(RideCleansing.java:51)
Caused by: java.lang.ClassNotFoundException: kafka.producer.Partitioner
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 13 more
I used either of the following two dependencies for the Kafka connector.
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.9_2.11</artifactId>
<version>1.0.0</version>
</dependency>
or
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.10.2</artifactId>
<version>0.10.2</version>
</dependency>
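The missing class kafka/producer/Partitioner comes from the old Kafka 0.8 Scala client, which suggests the unversioned FlinkKafkaProducer (the 0.8 variant) is being resolved rather than the 0.9 one. With the flink-connector-kafka-0.9 dependency, the matching sink class is FlinkKafkaProducer09, and the connector version should match your Flink version. A sketch, assuming Flink 1.0.0 as in the first dependency above:

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.9_2.11</artifactId>
  <!-- keep this in sync with your flink-java / flink-streaming version -->
  <version>1.0.0</version>
</dependency>
```

Then construct the sink with the versioned class, e.g. new FlinkKafkaProducer09<>(LOCAL_KAFKA_BROKER, CLEANSED_RIDES_TOPIC, new TaxiRideSchema()).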
