Sonar Maven plugin fails to download the file: http://localhost:9000/deploy/jdbc-driver.jar - maven-plugin

I downloaded and installed Sonar and then tried to run an analysis with the Maven Sonar plugin.
I first tried Sonar version 2.14, and now 2.13.1, but I get the same error over and over again.
It seems it is unable to download the JDBC driver from localhost (the Sonar server), because I get this error message: "Fail to download the file: http://localhost:9000/deploy/jdbc-driver.jar (no proxy): Read timed out -> [Help 1]"
But if I go to http://localhost:9000/deploy/jdbc-driver.jar with my browser, it downloads the file without problems, so it seems that the Sonar server is working correctly and that the Maven Sonar plugin is doing something wrong that I cannot figure out.
I have also tried altering Sonar's context root to /sonar, but I get the same error.
I have configured Sonar to use PostgreSQL instead of the embedded Derby database, but still no luck. Accessing Sonar from a browser works like a charm, but I cannot run mvn sonar:sonar, so at this point my Sonar installation is pretty useless... :(
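A quick way to compare what the browser does with what a plain HTTP client sees on the same machine is to fetch the URL from the command line; this is only a diagnostic sketch, and assumes curl is available on the box:
# fetch the driver the same way a bare JVM client would, bypassing any browser settings
curl -v -o jdbc-driver.jar http://localhost:9000/deploy/jdbc-driver.jar
If this also stalls, the timeout is happening on the server or network side rather than inside the Sonar Maven plugin.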
Here is the stack trace from running maven with the -e option:
[ERROR] Failed to execute goal org.codehaus.mojo:sonar-maven-plugin:2.0:sonar (default-cli) on project betterfarmer-engine: Can not execute Sonar: Fail to download the file: http://localhost:9000/deploy/jdbc-driver.jar (no proxy): Read timed out -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:sonar-maven-plugin:2.0:sonar (default-cli) on project betterfarmer-engine: Can not execute Sonar
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: Can not execute Sonar
at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:118)
at org.codehaus.mojo.sonar.Bootstraper.start(Bootstraper.java:65)
at org.codehaus.mojo.sonar.SonarMojo.execute(SonarMojo.java:90)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
... 19 more
Caused by: org.sonar.api.utils.SonarException: Fail to download the file: http://localhost:9000/deploy/jdbc-driver.jar (no proxy)
at org.sonar.api.utils.HttpDownloader.download(HttpDownloader.java:130)
at org.sonar.batch.bootstrap.ArtifactDownloader.downloadJdbcDriver(ArtifactDownloader.java:58)
at org.sonar.batch.bootstrap.JdbcDriverHolder.<init>(JdbcDriverHolder.java:43)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.picocontainer.injectors.AbstractInjector.newInstance(AbstractInjector.java:147)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:319)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:274)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:341)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:689)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:638)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:663)
at org.sonar.api.platform.ComponentContainer.getComponentByType(ComponentContainer.java:113)
at org.sonar.batch.bootstrap.Module.getComponentByType(Module.java:131)
at org.sonar.batch.bootstrap.BootstrapModule.configure(BootstrapModule.java:63)
at org.sonar.batch.bootstrap.Module.init(Module.java:49)
at org.sonar.batch.bootstrap.Module.init(Module.java:41)
at org.sonar.batch.Batch.<init>(Batch.java:37)
at org.sonar.maven3.SonarMojo.executeBatch(SonarMojo.java:143)
at org.sonar.maven3.SonarMojo.execute(SonarMojo.java:136)
at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:113)
... 23 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:150)
at java.net.SocketInputStream.read(SocketInputStream.java:121)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
at sun.net.www.MeteredStream.read(MeteredStream.java:134)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(HttpURLConnection.java:2959)
at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(HttpURLConnection.java:2953)
at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1262)
at org.apache.commons.io.IOUtils.copy(IOUtils.java:1236)
at org.sonar.api.utils.HttpDownloader.download(HttpDownloader.java:125)
... 48 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

Your Sonar is probably not correctly installed.
Do you have errors in the server log?
Are you able to browse the web application at http://localhost:9000?
I'd suggest starting with a fresh install (in a different folder from the previous one) and checking that you don't have environment properties pointing to the previous install location before starting the new Sonar.
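For reference, a minimal sketch of the kind of Maven settings.xml profile a Sonar 2.x analysis typically relies on; the host, database URL and credentials below are placeholders to adapt, and the property names assume the standard setup of that era rather than anything confirmed from this particular installation:
<!-- fragment to place under <profiles> in ~/.m2/settings.xml -->
<profile>
    <id>sonar</id>
    <properties>
        <!-- the server the Maven plugin talks to; should match the URL that works in the browser -->
        <sonar.host.url>http://localhost:9000</sonar.host.url>
        <!-- only needed because the embedded Derby database was replaced by PostgreSQL -->
        <sonar.jdbc.url>jdbc:postgresql://localhost/sonar</sonar.jdbc.url>
        <sonar.jdbc.username>sonar</sonar.jdbc.username>
        <sonar.jdbc.password>sonar</sonar.jdbc.password>
    </properties>
</profile>
Running mvn sonar:sonar with this profile active (for example via an <activeProfiles> entry) makes it easy to confirm the analysis points at the same server and database the browser reaches.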

Related

Flink 1.12.1 example application failing on a single-node YARN cluster

I am trying out the Flink example as explained in the Flink docs on a single-node YARN cluster.
As mentioned in this discussion, HADOOP_CONF_DIR is also set as below before executing the yarn command.
export HADOOP_CONF_DIR=/etc/hadoop/conf
On executing the below command:
ubuntu@vrni-platform:~/build-target/flink$ ./bin/flink run-application -t yarn-application ./examples/streaming/TopSpeedWindowing.jar
it fails with the below errors:
The program finished with the following exception:
org.apache.flink.client.deployment.ClusterDeploymentException: Couldn't deploy Yarn Application Cluster
at org.apache.flink.yarn.YarnClusterDescriptor.deployApplicationCluster(YarnClusterDescriptor.java:465)
at org.apache.flink.client.deployment.application.cli.ApplicationClusterDeployer.run(ApplicationClusterDeployer.java:67)
at org.apache.flink.client.cli.CliFrontend.runApplication(CliFrontend.java:213)
at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1061)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1136)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1136)
Caused by: org.apache.flink.yarn.YarnClusterDescriptor$YarnDeploymentException: The YARN application unexpectedly switched to state FAILED during deployment.
Diagnostics from YARN: Application application_1614159836384_0045 failed 1 times (global limit =2; local limit is =1) due to AM Container for appattempt_1614159836384_0045_000001 exited with exitCode: -1000
Failing this attempt.Diagnostics: [2021-02-24 16:19:39.409]File file:/home/ubuntu/.flink/application_1614159836384_0045/flink-dist_2.12-1.12.1.jar does not exist
java.io.FileNotFoundException: File file:/home/ubuntu/.flink/application_1614159836384_0045/flink-dist_2.12-1.12.1.jar does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:867)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:442)
at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:269)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:67)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:414)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:411)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:411)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:242)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:235)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I set the log level to DEBUG, and I do see that flink-dist_2.12-1.12.1.jar is being copied to /home/ubuntu/.flink/application_1614159836384_0045.
2021-02-24 16:19:37,768 DEBUG org.apache.flink.yarn.YarnApplicationFileUploader [] - Got modification time 1614183577000 from remote path file:/home/ubuntu/.flink/application_1614159836384_0045/TopSpeedWindowing.jar
2021-02-24 16:19:37,769 DEBUG org.apache.flink.yarn.YarnApplicationFileUploader [] - Copying from file:/home/ubuntu/build-target/flink/lib/flink-dist_2.12-1.12.1.jar to file:/home/ubuntu/.flink/application_1614159836384_0045/flink-dist_2.12-1.12.1.jar with replication factor 1
I have placed the entire DEBUG logs here.
The NodeManager logs have warnings like the ones below:
2021-02-24 16:36:34,219 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServices: Got event CONTAINER_INIT for appId application_1614159836384_0047
2021-02-24 16:36:34,220 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: Created localizer for container_1614159836384_0047_01_000001
2021-02-24 16:36:34,222 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: Writing credentials to the nmPrivate file /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/nmPrivate/container_1614159836384_0047_01_000001.tokens
2021-02-24 16:36:34,222 INFO org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Initializing user ubuntu
2021-02-24 16:36:34,224 INFO org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Copying from /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/nmPrivate/container_1614159836384_0047_01_000001.tokens to /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/ubuntu/appcache/application_1614159836384_0047/container_1614159836384_0047_01_000001.tokens
2021-02-24 16:36:34,224 INFO org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Localizer CWD set to /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/ubuntu/appcache/application_1614159836384_0047 = file:/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/ubuntu/appcache/application_1614159836384_0047
2021-02-24 16:36:34,247 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer: Disk Validator: yarn.nodemanager.disk-validator is loaded.
2021-02-24 16:36:34,268 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: { file:/home/ubuntu/.flink/application_1614159836384_0047/flink-dist_2.12-1.12.1.jar, 1614184593000, FILE, null } failed: File file:/home/ubuntu/.flink/application_1614159836384_0047/flink-dist_2.12-1.12.1.jar does not exist
java.io.FileNotFoundException: File file:/home/ubuntu/.flink/application_1614159836384_0047/flink-dist_2.12-1.12.1.jar does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:867)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:442)
at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:269)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:67)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:414)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:411)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:411)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:242)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:235)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
The entire NodeManager logs are here.
Can someone let me know what is going wrong? Does Flink not support a single-node YARN cluster for development?
Flink Version 1.12.1
There was a configuration issue in my setup: hadoop-yarn-nodemanager is running as the yarn user.
ubuntu@vrni-platform:/tmp/flink$ ps -ef | grep nodemanager
yarn 4953 1 2 05:53 ? 00:11:26 /usr/lib/jvm/java-8-openjdk/bin/java -Dproc_nodemanager -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/lib/heap-dumps/yarn -XX:+ExitOnOutOfMemoryError -Dyarn.log.dir=/var/log/hadoop-yarn -Dyarn.log.file=hadoop-yarn-nodemanager-vrni-platform.log -Dyarn.home.dir=/usr/lib/hadoop-yarn -Dyarn.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Xmx512m -Dhadoop.log.dir=/var/log/hadoop-yarn -Dhadoop.log.file=hadoop-yarn-nodemanager-vrni-platform.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str=yarn -Dhadoop.root.logger=INFO,RFA -Dhadoop.policy.file=hadoop-policy.xml -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.yarn.server.nodemanager.NodeManager
I was executing the ./bin/flink command as the ubuntu user, and the yarn user does not have permission to write to ubuntu's home folder in my setup.
ubuntu@vrni-platform:/tmp/flink$ echo ~ubuntu
/home/ubuntu
ubuntu@vrni-platform:/tmp/flink$ echo ~yarn
/var/lib/hadoop-yarn
It appears Flink needs permission to write to the submitting user's home directory to create a .flink folder, even when the job is submitted to YARN. It works fine for me if I run Flink as the yarn user in my setup.
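A minimal sketch of that workaround, assuming the yarn user can read the Flink distribution and the example jar (paths as in the question):
# submit as the yarn user so the .flink staging folder is created in a home directory that user can write to
sudo -u yarn ./bin/flink run-application -t yarn-application ./examples/streaming/TopSpeedWindowing.jar
Depending on the Flink release there may also be a configuration option to move the YARN staging directory away from the submitting user's home (for example yarn.staging-directory); check the documentation for your version before relying on it.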

Unable to build Flink from sources due to MapR artifacts problems

Summary: the MapR dependency could not be resolved, and thus the Flink build on the master branch fails:
Failed to execute goal on project flink-mapr-fs: Could not resolve dependencies for project org.apache.flink:flink-mapr-fs:jar:1.10-SNAPSHOT: Failed to collect dependencies at com.mapr.hadoop:maprfs:jar:5.2.1-mapr
MapR has been in shutdown talks as a company, and I'm unclear as to whether they're even still operating. What should I do here? I'd like to simply skip/ignore the MapR connector, but I'm not sure how to do so.
Details:
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal on project flink-mapr-fs: Could not resolve dependencies for project org.apache.flink:flink-mapr-fs:jar:1.10-SNAPSHOT: Failed to collect dependencies at com.mapr.hadoop:maprfs:jar:5.2.1-mapr
at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.getDependencies(LifecycleDependencyResolver.java:221)
at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.resolveProjectDependencies(LifecycleDependencyResolver.java:127)
at org.apache.maven.lifecycle.internal.MojoExecutor.ensureDependenciesAreResolved(MojoExecutor.java:245)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:199)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.project.DependencyResolutionException: Could not resolve dependencies for project org.apache.flink:flink-mapr-fs:jar:1.10-SNAPSHOT: Failed to collect dependencies at com.mapr.hadoop:maprfs:jar:5.2.1-mapr
at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:180)
at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.getDependencies(LifecycleDependencyResolver.java:195)
... 23 more
Caused by: org.eclipse.aether.collection.DependencyCollectionException: Failed to collect dependencies at com.mapr.hadoop:maprfs:jar:5.2.1-mapr
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:291)
at org.eclipse.aether.internal.impl.DefaultRepositorySystem.collectDependencies(DefaultRepositorySystem.java:316)
at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:172)
... 24 more
Caused by: org.eclipse.aether.resolution.ArtifactDescriptorException: Failed to read artifact descriptor for com.mapr.hadoop:maprfs:jar:5.2.1-mapr
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:282)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.readArtifactDescriptor(DefaultArtifactDescriptorReader.java:198)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.resolveCachedArtifactDescriptor(DefaultDependencyCollector.java:535)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.getArtifactDescriptorResult(DefaultDependencyCollector.java:519)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.processDependency(DefaultDependencyCollector.java:409)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.processDependency(DefaultDependencyCollector.java:363)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.process(DefaultDependencyCollector.java:351)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:254)
... 26 more
Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: Could not transfer artifact com.mapr.hadoop:maprfs:pom:5.2.1-mapr from/to mapr-releases (https://repository.mapr.com/maven/): sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:444)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:246)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:223)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:267)
... 33 more
Caused by: org.eclipse.aether.transfer.ArtifactTransferException: Could not transfer artifact com.mapr.hadoop:maprfs:pom:5.2.1-mapr from/to mapr-releases (https://repository.mapr.com/maven/): sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at org.eclipse.aether.connector.basic.ArtifactTransportListener.transferFailed(ArtifactTransportListener.java:43)
at org.eclipse.aether.connector.basic.BasicRepositoryConnector$TaskRunner.run(BasicRepositoryConnector.java:355)
at org.eclipse.aether.util.concurrency.RunnableErrorForwarder$1.run(RunnableErrorForwarder.java:67)
at org.eclipse.aether.connector.basic.BasicRepositoryConnector$DirectExecutor.execute(BasicRepositoryConnector.java:581)
at org.eclipse.aether.connector.basic.BasicRepositoryConnector.get(BasicRepositoryConnector.java:249)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.performDownloads(DefaultArtifactResolver.java:520)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:421)
... 36 more
Caused by: org.apache.maven.wagon.TransferFailedException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Just verify that you have the below entry in "flink/flink-filesystems/flink-mapr-fs/pom.xml":
<profile>
    <id>unsafe-mapr-repo</id>
    <activation>
        <property>
            <name>unsafe-mapr-repo</name>
        </property>
    </activation>
    <repositories>
        <!-- MapR -->
        <repository>
            <id>mapr-releases</id>
            <url>http://repository.mapr.com/maven/</url>
            <snapshots><enabled>false</enabled></snapshots>
            <releases><enabled>true</enabled></releases>
        </repository>
    </repositories>
</profile>
Once that is verified, run the below command from "flink/flink-filesystems/flink-mapr-fs" to make sure this module builds successfully:
mvn clean package -DskipTests -Punsafe-mapr-repo
Once this succeeds, run the same Maven package command from "flink/flink-filesystems", and after that you should be good to build from the Flink project root. This ensures that you end up with a successful build of the Flink project.
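If the goal is just to skip the MapR connector rather than build it, another option is Maven's reactor exclusion; this is only a sketch and assumes a Maven version (3.2.1+) that supports excluding projects with -pl:
# from the Flink source root: build everything except the flink-mapr-fs module
mvn clean install -DskipTests -pl '!flink-filesystems/flink-mapr-fs'
Depending on the Flink version, other modules may still reference flink-mapr-fs, in which case the unsafe-mapr-repo profile above is the safer route.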

com.google.cloud.sql.mysql.SocketFactory ClassNotFoundException

Hi, I am new to web development and am trying to get ActiveJDBC to connect to a Cloud SQL instance. I am getting the following stack trace:
[ERROR] Failed to execute goal org.javalite:db-migrator-maven-plugin:1.4.11:migrate (dev_migrations) on project activeweb-simple: Execution dev_migrations of goal org.javalite:db-migrator-maven-plugin:1.4.11:migrate failed: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create socket factory 'com.google.cloud.sql.mysql.SocketFactory' due to underlying exception: -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.javalite:db-migrator-maven-plugin:1.4.11:migrate (dev_migrations) on project activeweb-simple: Execution dev_migrations of goal org.javalite:db-migrator-maven-plugin:1.4.11:migrate failed: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create socket factory 'com.google.cloud.sql.mysql.SocketFactory' due to underlying exception:
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:213)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:309)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:194)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:107)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:993)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:345)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:191)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution dev_migrations of goal org.javalite:db-migrator-maven-plugin:1.4.11:migrate failed: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create socket factory 'com.google.cloud.sql.mysql.SocketFactory' due to underlying exception:
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:145)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 20 more
Caused by: org.javalite.db_migrator.MigrationException: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create socket factory 'com.google.cloud.sql.mysql.SocketFactory' due to underlying exception:
at org.javalite.db_migrator.DbUtils.openConnection(DbUtils.java:183)
at org.javalite.db_migrator.maven.AbstractDbMigrationMojo.openConnection(AbstractDbMigrationMojo.java:178)
at org.javalite.db_migrator.maven.AbstractDbMigrationMojo.openConnection(AbstractDbMigrationMojo.java:170)
at org.javalite.db_migrator.maven.MigrateMojo.executeMojo(MigrateMojo.java:21)
at org.javalite.db_migrator.maven.AbstractDbMigrationMojo.executeCurrentConfiguration(AbstractDbMigrationMojo.java:96)
at org.javalite.db_migrator.maven.AbstractDbMigrationMojo.execute(AbstractDbMigrationMojo.java:80)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
... 21 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create socket factory 'com.google.cloud.sql.mysql.SocketFactory' due to underlying exception:
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
at com.mysql.jdbc.Util.getInstance(Util.java:360)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:935)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:924)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:870)
at com.mysql.jdbc.MysqlIO.createSocketFactory(MysqlIO.java:3194)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:293)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2232)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:790)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.javalite.db_migrator.DbUtils.openConnection(DbUtils.java:181)
... 27 more
Caused by: java.lang.ClassNotFoundException: com.google.cloud.sql.mysql.SocketFactory
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:271)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:247)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:239)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.mysql.jdbc.MysqlIO.createSocketFactory(MysqlIO.java:3192)
... 43 more
My connection string in database.properties looks like this:
development.url=jdbc:mysql://google/database?cloudSqlInstance=project:location:instance&socketFactory=com.google.cloud.sql.mysql.SocketFactory&user=dbuser&password=dbuserpassword&useSSL=false
I have used the same connection string with a Spring/Hibernate configuration without a problem, but would like to try activeweb-activejdbc. Why can the class not be found? I have included the Google Cloud SQL dependency in pom.xml:
<dependency>
    <groupId>com.google.cloud.sql</groupId>
    <artifactId>mysql-socket-factory-connector-j-6</artifactId>
    <version>1.0.2</version>
</dependency>
First, you need to include a full stack trace. Second, if you get a ClassNotFoundException, it means you do not have that class on the classpath. You need to know for a fact that the class is there. Use the Maven dependency plugin, or expand all your dependent jar files and ensure you have the class. No magic here :)
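For example, a couple of ways to do that check with the Maven dependency plugin (the -Dincludes filter below just reuses the groupId from the question):
# show where (or whether) the Cloud SQL socket factory appears in the resolved dependency tree
mvn dependency:tree -Dincludes=com.google.cloud.sql
# or copy every resolved jar somewhere you can inspect it by hand
mvn dependency:copy-dependencies -DoutputDirectory=target/deps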
UPDATE:
Now that you have provided the stack trace, I can see that the exception happens when you run the migrator. If you look at the migrator configuration, you will see that it requires its own classpath. For instance, the example here: https://github.com/javalite/activeweb-simple/blob/master/pom.xml#L39 lists the MySQL JDBC driver specifically for the migrator, even if you already have it configured for the project runtime. Just copy that config, and replace the MySQL driver with your driver.
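A sketch of what that plugin section might end up looking like, reusing the coordinates already shown in the question and the stack trace; the versions are placeholders and the existing plugin configuration is assumed to stay as it is:
<plugin>
    <groupId>org.javalite</groupId>
    <artifactId>db-migrator-maven-plugin</artifactId>
    <version>1.4.11</version>
    <!-- existing <configuration> and <executions> unchanged -->
    <dependencies>
        <!-- the migrator example also lists the plain MySQL driver on the plugin classpath -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.42</version>
        </dependency>
        <!-- makes com.google.cloud.sql.mysql.SocketFactory visible to the migrator -->
        <dependency>
            <groupId>com.google.cloud.sql</groupId>
            <artifactId>mysql-socket-factory-connector-j-6</artifactId>
            <version>1.0.2</version>
        </dependency>
    </dependencies>
</plugin>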

Atlassian Bamboo installation failing to start [closed]

I've installed Bamboo on my machine and I'm trying to run it. Every time I call bin\start-bamboo.bat from the command line it fails to start and I can't figure out why. Here is what I have in my catalina.log:
23-May-2017 17:34:41.938 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin [SetPropertiesRule]{Server/Service/Engine/Valve} Setting property 'resolveHosts' to 'false' did not find a matching property.
23-May-2017 17:34:41.943 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: C:\Program Files (x86)\Java\jdk1.7.0_55\bin;C:\WINDOWS\Sun\Java\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\Program Files\Microsoft SQL Server\120\Tools\Binn\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files (x86)\Windows Kits\10\Windows Performance Toolkit\;C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\;C:\Program Files\Microsoft SQL Server\120\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\ManagementStudio\;C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\;C:\Users\matth.dnx\bin;C:\Program Files\Microsoft DNX\Dnvm\;C:\Program Files (x86)\Microsoft Emulator Manager\1.0\;C:\Program Files (x86)\nodejs\;C:\Program Files\Microsoft\Web Platform Installer\;C:\Program Files\TortoiseGit\bin;C:\Program Files\Git\cmd;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files (x86)\Skype\Phone\;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Users\matth\AppData\Roaming\npm;C:\Program Files (x86)\MSBuild\12.0\Bin;C:\Users\matth\Downloads\mysql-connector-java-5.1.42.tar\mysql-connector-java-5.1.42;.
23-May-2017 17:34:41.989 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["http-nio-8085"]
23-May-2017 17:34:42.003 INFO [main] org.apache.tomcat.util.net.NioSelectorPool.getSharedSelector Using a shared selector for servlet write/read
23-May-2017 17:34:42.005 INFO [main] org.apache.catalina.startup.Catalina.load Initialization processed in 299 ms
23-May-2017 17:34:42.014 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service Catalina
23-May-2017 17:34:42.014 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet Engine: Apache Tomcat/8.0.36
23-May-2017 17:34:53.384 INFO [localhost-startStop-1] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
23-May-2017 17:34:53.404 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
23-May-2017 17:34:53.585 INFO [localhost-startStop-1] org.apache.catalina.util.SessionIdGeneratorBase.createSecureRandom Creation of SecureRandom instance for session ID generation using [SHA1PRNG] took [179] milliseconds.
23-May-2017 17:34:53.585 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal Context [] startup failed due to previous errors
23-May-2017 17:34:53.643 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["http-nio-8085"]
23-May-2017 17:34:53.649 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 11644 ms
And here's what I have in my localhost log:
23-May-2017 17:34:53.401 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log No Spring WebApplicationInitializer types detected on classpath
23-May-2017 17:34:53.402 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Error configuring application listener of class com.atlassian.bamboo.setup.BootstrapLoaderListener
java.lang.UnsupportedClassVersionError: com/atlassian/bamboo/setup/BootstrapLoaderListener : Unsupported major.minor version 52.0 (unable to load class com.atlassian.bamboo.setup.BootstrapLoaderListener)
at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2544)
at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:858)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1301)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:518)
at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:499)
at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:118)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4764)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
23-May-2017 17:34:53.403 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Error configuring application listener of class com.atlassian.bamboo.ww2.actions.setup.BambooContextLoaderListener
java.lang.UnsupportedClassVersionError: com/atlassian/bamboo/ww2/actions/setup/BambooContextLoaderListener : Unsupported major.minor version 52.0 (unable to load class com.atlassian.bamboo.ww2.actions.setup.BambooContextLoaderListener)
at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2544)
at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:858)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1301)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:518)
at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:499)
at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:118)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4764)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
23-May-2017 17:34:53.404 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Error configuring application listener of class com.atlassian.bamboo.upgrade.UpgradeLauncher
java.lang.UnsupportedClassVersionError: com/atlassian/bamboo/upgrade/UpgradeLauncher : Unsupported major.minor version 52.0 (unable to load class com.atlassian.bamboo.upgrade.UpgradeLauncher)
at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2544)
at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:858)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1301)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:518)
at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:499)
at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:118)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4764)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
23-May-2017 17:34:53.404 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Error configuring application listener of class com.atlassian.bamboo.session.SeraphHttpSessionDestroyedListener
java.lang.UnsupportedClassVersionError: com/atlassian/bamboo/session/SeraphHttpSessionDestroyedListener : Unsupported major.minor version 52.0 (unable to load class com.atlassian.bamboo.session.SeraphHttpSessionDestroyedListener)
at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2544)
at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:858)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1301)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:518)
at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:499)
at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:118)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4764)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
23-May-2017 17:34:53.404 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Skipped installing application listeners due to previous error(s)
If anyone could help that'd be most appreciated.
Thanks,
Matt
The hint is within the stack trace:
Unsupported major.minor version 52.0
Check the Java versions - 52.0 is Java 8: https://en.wikipedia.org/wiki/Java_class_file#General_layout
Latest versions of Bamboo require JDK 1.8 (Java 8)
https://confluence.atlassian.com/bamboo/supported-platforms-289276764.html
But Bamboo is starting with Java 7:
...C:\Program Files (x86)\Java\jdk1.7.0_55\bin...
So you need to upgrade your current JDK, or install another JDK and point Bamboo to it.
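For example, one common way to point a Tomcat-based application like Bamboo at a Java 8 JDK is to set JAVA_HOME before starting it; the install path below is a placeholder for wherever JDK 8 actually lives on the machine:
rem placeholder path - adjust to the actual JDK 8 install location
set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_131"
set "PATH=%JAVA_HOME%\bin;%PATH%"
bin\start-bamboo.bat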

mvn gcloud:deploy [Error gcloud app command exit code is: 2]

I am getting the following exception while trying to deploy an app on App Engine:
mvn gcloud:deploy [Error gcloud app command exit code is: 2]
What does exit code 2 mean here?
Stack trace
[ERROR] Failed to execute goal com.google.appengine:gcloud-maven-plugin:2.0.9.121.v20160815:deploy (default-cli) on project userprofilev2: Error: gcloud app command exit code is: 2 -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal com.google.appengine:gcloud-maven-plugin:2.0.9.121.v20160815:deploy (default-cli) on project userprofilev2: Error: gcloud app command exit code is: 2
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: Error: gcloud app command exit code is: 2
at com.google.appengine.gcloudapp.AbstractGcloudMojo.startCommand(AbstractGcloudMojo.java:463)
at com.google.appengine.gcloudapp.GCloudAppDeploy.execute(GCloudAppDeploy.java:43)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
... 20 more
Upgrading the gcloud version using the 'gcloud components update' command fixed my problem.
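For reference, the sequence that corresponds to that fix (both commands are the ones already mentioned above and in the question):
# update the Cloud SDK components, then retry the deployment
gcloud components update
mvn gcloud:deploy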
