I've installed Bamboo on my machine and I'm trying to run it. Every time I call bin\start-bamboo.bat from the command line it fails to start, and I can't figure out why. Here's what I have in my catalina.log:
23-May-2017 17:34:41.938 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin [SetPropertiesRule]{Server/Service/Engine/Valve} Setting property 'resolveHosts' to 'false' did not find a matching property.
23-May-2017 17:34:41.943 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: C:\Program Files (x86)\Java\jdk1.7.0_55\bin;C:\WINDOWS\Sun\Java\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\Program Files\Microsoft SQL Server\120\Tools\Binn\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files (x86)\Windows Kits\10\Windows Performance Toolkit\;C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\;C:\Program Files\Microsoft SQL Server\120\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\ManagementStudio\;C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\;C:\Users\matth.dnx\bin;C:\Program Files\Microsoft DNX\Dnvm\;C:\Program Files (x86)\Microsoft Emulator Manager\1.0\;C:\Program Files (x86)\nodejs\;C:\Program Files\Microsoft\Web Platform Installer\;C:\Program Files\TortoiseGit\bin;C:\Program Files\Git\cmd;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files (x86)\Skype\Phone\;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Users\matth\AppData\Roaming\npm;C:\Program Files (x86)\MSBuild\12.0\Bin;C:\Users\matth\Downloads\mysql-connector-java-5.1.42.tar\mysql-connector-java-5.1.42;.
23-May-2017 17:34:41.989 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["http-nio-8085"]
23-May-2017 17:34:42.003 INFO [main] org.apache.tomcat.util.net.NioSelectorPool.getSharedSelector Using a shared selector for servlet write/read
23-May-2017 17:34:42.005 INFO [main] org.apache.catalina.startup.Catalina.load Initialization processed in 299 ms
23-May-2017 17:34:42.014 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service Catalina
23-May-2017 17:34:42.014 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet Engine: Apache Tomcat/8.0.36
23-May-2017 17:34:53.384 INFO [localhost-startStop-1] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
23-May-2017 17:34:53.404 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
23-May-2017 17:34:53.585 INFO [localhost-startStop-1] org.apache.catalina.util.SessionIdGeneratorBase.createSecureRandom Creation of SecureRandom instance for session ID generation using [SHA1PRNG] took [179] milliseconds.
23-May-2017 17:34:53.585 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal Context [] startup failed due to previous errors
23-May-2017 17:34:53.643 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["http-nio-8085"]
23-May-2017 17:34:53.649 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 11644 ms
And here's what I have in my localhost log:
23-May-2017 17:34:53.401 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log No Spring WebApplicationInitializer types detected on classpath
23-May-2017 17:34:53.402 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Error configuring application listener of class com.atlassian.bamboo.setup.BootstrapLoaderListener
java.lang.UnsupportedClassVersionError: com/atlassian/bamboo/setup/BootstrapLoaderListener : Unsupported major.minor version 52.0 (unable to load class com.atlassian.bamboo.setup.BootstrapLoaderListener)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2544)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:858)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1301)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
    at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:518)
    at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:499)
    at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:118)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4764)
    at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
23-May-2017 17:34:53.403 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Error configuring application listener of class com.atlassian.bamboo.ww2.actions.setup.BambooContextLoaderListener
java.lang.UnsupportedClassVersionError: com/atlassian/bamboo/ww2/actions/setup/BambooContextLoaderListener : Unsupported major.minor version 52.0 (unable to load class com.atlassian.bamboo.ww2.actions.setup.BambooContextLoaderListener)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2544)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:858)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1301)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
    at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:518)
    at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:499)
    at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:118)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4764)
    at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
23-May-2017 17:34:53.404 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Error configuring application listener of class com.atlassian.bamboo.upgrade.UpgradeLauncher
java.lang.UnsupportedClassVersionError: com/atlassian/bamboo/upgrade/UpgradeLauncher : Unsupported major.minor version 52.0 (unable to load class com.atlassian.bamboo.upgrade.UpgradeLauncher)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2544)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:858)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1301)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
    at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:518)
    at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:499)
    at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:118)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4764)
    at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
23-May-2017 17:34:53.404 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Error configuring application listener of class com.atlassian.bamboo.session.SeraphHttpSessionDestroyedListener
java.lang.UnsupportedClassVersionError: com/atlassian/bamboo/session/SeraphHttpSessionDestroyedListener : Unsupported major.minor version 52.0 (unable to load class com.atlassian.bamboo.session.SeraphHttpSessionDestroyedListener)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2544)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:858)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1301)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
    at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:518)
    at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:499)
    at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:118)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4764)
    at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
23-May-2017 17:34:53.404 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.listenerStart Skipped installing application listeners due to previous error(s)
If anyone could help that'd be most appreciated.
Thanks,
Matt
The hint is within the stack trace:
Unsupported major.minor version 52.0
Check the Java versions - 52.0 is Java 8: https://en.wikipedia.org/wiki/Java_class_file#General_layout
The latest versions of Bamboo require JDK 1.8 (Java 8):
https://confluence.atlassian.com/bamboo/supported-platforms-289276764.html
But Bamboo is starting with Java 7:
...C:\Program Files (x86)\Java\jdk1.7.0_55\bin...
So you need to upgrade your current JDK, or install another JDK and point Bamboo to it.
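For example, a minimal way to point Bamboo at a JDK 8 install on Windows is to set JAVA_HOME (which the Tomcat-based start script typically picks up) before launching it. The JDK path below is only a placeholder for wherever your Java 8 is actually installed:
rem Hypothetical JDK 8 location - adjust to your actual install directory
set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_131"
set "PATH=%JAVA_HOME%\bin;%PATH%"
bin\start-bamboo.bat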
Related
I am new to the GlassFish server and I am trying to deploy a Spring Boot application on GlassFish 5.1.
My Spring Boot application communicates with the database via JPA and Hibernate.
When I compile and deploy with MySQL, deployment goes fine.
However, when I change the database to Microsoft SQL Server (MSSQL), deployment fails with the error below:
java.lang.Exception: java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: org.apache.catalina.LifecycleException: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Invocation of init method failed; nested exception is java.lang.NoClassDefFoundError: Could not initialize class sun.security.ssl.SSLExtension
at com.sun.enterprise.web.WebApplication.start(WebApplication.java:136)
at org.glassfish.internal.data.EngineRef.start(EngineRef.java:122)
at org.glassfish.internal.data.ModuleInfo.start(ModuleInfo.java:291)
at org.glassfish.internal.data.ApplicationInfo.start(ApplicationInfo.java:352)
at com.sun.enterprise.v3.server.ApplicationLifecycle.deploy(ApplicationLifecycle.java:500)
at com.sun.enterprise.v3.server.ApplicationLifecycle.deploy(ApplicationLifecycle.java:219)
at org.glassfish.deployment.admin.DeployCommand.execute(DeployCommand.java:491)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2$1.run(CommandRunnerImpl.java:540)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2$1.run(CommandRunnerImpl.java:536)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2.execute(CommandRunnerImpl.java:535)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$3.run(CommandRunnerImpl.java:566)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$3.run(CommandRunnerImpl.java:558)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:557)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:1465)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.access$1300(CommandRunnerImpl.java:110)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1847)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1723)
at org.glassfish.admin.rest.resources.admin.CommandResource.executeCommand(CommandResource.java:408)
at org.glassfish.admin.rest.resources.admin.CommandResource.execCommandSimpInMultOut(CommandResource.java:235)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:76)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:148)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:191)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:200)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:103)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:493)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:415)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:104)
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:277)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:272)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:268)
at org.glassfish.jersey.internal.Errors.process(Errors.java:316)
at org.glassfish.jersey.internal.Errors.process(Errors.java:298)
at org.glassfish.jersey.internal.Errors.process(Errors.java:268)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:289)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:256)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:703)
at org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpContainer.service(GrizzlyHttpContainer.java:377)
at org.glassfish.admin.rest.adapter.JerseyContainerCommandService$3.service(JerseyContainerCommandService.java:174)
at org.glassfish.admin.rest.adapter.RestAdapter.service(RestAdapter.java:179)
at com.sun.enterprise.v3.services.impl.ContainerMapper$HttpHandlerCallable.call(ContainerMapper.java:463)
at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:168)
at org.glassfish.grizzly.http.server.HttpHandler.runService(HttpHandler.java:206)
at org.glassfish.grizzly.http.server.HttpHandler.doHandle(HttpHandler.java:180)
at org.glassfish.grizzly.http.server.HttpServerFilter.handleRead(HttpServerFilter.java:242)
at org.glassfish.grizzly.filterchain.ExecutorResolver$9.execute(ExecutorResolver.java:119)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeFilter(DefaultFilterChain.java:284)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeChainPart(DefaultFilterChain.java:201)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.execute(DefaultFilterChain.java:133)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.process(DefaultFilterChain.java:112)
at org.glassfish.grizzly.ProcessorExecutor.execute(ProcessorExecutor.java:77)
at org.glassfish.grizzly.nio.transport.TCPNIOTransport.fireIOEvent(TCPNIOTransport.java:539)
at org.glassfish.grizzly.strategies.AbstractIOStrategy.fireIOEvent(AbstractIOStrategy.java:112)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.run0(WorkerThreadIOStrategy.java:117)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.access$100(WorkerThreadIOStrategy.java:56)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy$WorkerThreadRunnable.run(WorkerThreadIOStrategy.java:137)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:593)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.run(AbstractThreadPool.java:573)
at java.lang.Thread.run(Thread.java:750)
]]
Server: GlassFish 5.1
Database used: MSSQL
Dependency: runtimeOnly 'com.microsoft.sqlserver:mssql-jdbc'
Java: jdk8
Properties file Config:
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.url=jdbc:sqlserver://127.0.0.1;databaseName=test_db
spring.datasource.username= SA
spring.datasource.password= Somepassword
Kindly assist.
Thanks to the explanation in Stackoverflow-glassfish5, the issue is now resolved.
There is a problem with GlassFish 5's dependency on the Java 8 sun package, which causes a clash during deployment.
To fix the issue, we needed to delete the sun folder from grizzly-npn-bootstrap.jar.
The location is: {{glassfish5_home}}/glassfish5/glassfish/modules/endorsed/grizzly-npn-bootstrap.jar.
The procedure is to open the jar with a zip application (WinRAR), delete the sun folder, and save.
Restart GlassFish 5 and then you are good.
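If you would rather do this from the command line than with WinRAR, a rough sketch of the same fix (assuming Info-ZIP's zip tool is available and the module path below matches your install) looks like this:
cd {{glassfish5_home}}/glassfish5/glassfish/modules/endorsed
cp grizzly-npn-bootstrap.jar grizzly-npn-bootstrap.jar.bak   # keep a backup of the original jar
zip -d grizzly-npn-bootstrap.jar "sun/*"                     # remove the sun folder entries from the jar
Then restart GlassFish 5 as described above.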
I am trying out the Flink example as explained in the Flink docs on a single-node YARN cluster.
As mentioned in this discussion, HADOOP_CONF_DIR is also set as below before executing the yarn command.
export HADOOP_CONF_DIR=/etc/hadoop/conf
On executing the below command
ubuntu@vrni-platform:~/build-target/flink$ ./bin/flink run-application -t yarn-application ./examples/streaming/TopSpeedWindowing.jar
It is failing with the below errors
The program finished with the following exception:
org.apache.flink.client.deployment.ClusterDeploymentException: Couldn't deploy Yarn Application Cluster
at org.apache.flink.yarn.YarnClusterDescriptor.deployApplicationCluster(YarnClusterDescriptor.java:465)
at org.apache.flink.client.deployment.application.cli.ApplicationClusterDeployer.run(ApplicationClusterDeployer.java:67)
at org.apache.flink.client.cli.CliFrontend.runApplication(CliFrontend.java:213)
at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1061)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1136)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1136)
Caused by: org.apache.flink.yarn.YarnClusterDescriptor$YarnDeploymentException: The YARN application unexpectedly switched to state FAILED during deployment.
Diagnostics from YARN: Application application_1614159836384_0045 failed 1 times (global limit =2; local limit is =1) due to AM Container for appattempt_1614159836384_0045_000001 exited with exitCode: -1000
Failing this attempt.Diagnostics: [2021-02-24 16:19:39.409]File file:/home/ubuntu/.flink/application_1614159836384_0045/flink-dist_2.12-1.12.1.jar does not exist
java.io.FileNotFoundException: File file:/home/ubuntu/.flink/application_1614159836384_0045/flink-dist_2.12-1.12.1.jar does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:867)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:442)
at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:269)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:67)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:414)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:411)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:411)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:242)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:235)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I have set the log level to DEBUG and I do see that flink-dist_2.12-1.12.1.jar is getting copied to /home/ubuntu/.flink/application_1614159836384_0045.
2021-02-24 16:19:37,768 DEBUG org.apache.flink.yarn.YarnApplicationFileUploader [] - Got modification time 1614183577000 from remote path file:/home/ubuntu/.flink/application_1614159836384_0045/TopSpeedWindowing.jar
2021-02-24 16:19:37,769 DEBUG org.apache.flink.yarn.YarnApplicationFileUploader [] - Copying from file:/home/ubuntu/build-target/flink/lib/flink-dist_2.12-1.12.1.jar to file:/home/ubuntu/.flink/application_1614159836384_0045/flink-dist_2.12-1.12.1.jar with replication factor 1
I have placed the entire DEBUG logs here.
The nodemanager logs have warnings like the one below:
2021-02-24 16:36:34,219 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServices: Got event CONTAINER_INIT for appId application_1614159836384_0047
2021-02-24 16:36:34,220 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: Created localizer for container_1614159836384_0047_01_000001
2021-02-24 16:36:34,222 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: Writing credentials to the nmPrivate file /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/nmPrivate/container_1614159836384_0047_01_000001.tokens
2021-02-24 16:36:34,222 INFO org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Initializing user ubuntu
2021-02-24 16:36:34,224 INFO org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Copying from /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/nmPrivate/container_1614159836384_0047_01_000001.tokens to /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/ubuntu/appcache/application_1614159836384_0047/container_1614159836384_0047_01_000001.tokens
2021-02-24 16:36:34,224 INFO org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Localizer CWD set to /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/ubuntu/appcache/application_1614159836384_0047 = file:/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/ubuntu/appcache/application_1614159836384_0047
2021-02-24 16:36:34,247 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer: Disk Validator: yarn.nodemanager.disk-validator is loaded.
2021-02-24 16:36:34,268 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: { file:/home/ubuntu/.flink/application_1614159836384_0047/flink-dist_2.12-1.12.1.jar, 1614184593000, FILE, null } failed: File file:/home/ubuntu/.flink/application_1614159836384_0047/flink-dist_2.12-1.12.1.jar does not exist
java.io.FileNotFoundException: File file:/home/ubuntu/.flink/application_1614159836384_0047/flink-dist_2.12-1.12.1.jar does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:867)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:442)
at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:269)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:67)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:414)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:411)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:411)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:242)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:235)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
The entire nodemanager logs are here.
Can someone let me know what is going wrong? Does Flink not support a single-node YARN cluster for development?
Flink Version 1.12.1
There was a configuration issue in my setup: hadoop-yarn-nodemanager runs as the yarn user.
ubuntu@vrni-platform:/tmp/flink$ ps -ef | grep nodemanager
yarn 4953 1 2 05:53 ? 00:11:26 /usr/lib/jvm/java-8-openjdk/bin/java -Dproc_nodemanager -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/lib/heap-dumps/yarn -XX:+ExitOnOutOfMemoryError -Dyarn.log.dir=/var/log/hadoop-yarn -Dyarn.log.file=hadoop-yarn-nodemanager-vrni-platform.log -Dyarn.home.dir=/usr/lib/hadoop-yarn -Dyarn.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Xmx512m -Dhadoop.log.dir=/var/log/hadoop-yarn -Dhadoop.log.file=hadoop-yarn-nodemanager-vrni-platform.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str=yarn -Dhadoop.root.logger=INFO,RFA -Dhadoop.policy.file=hadoop-policy.xml -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.yarn.server.nodemanager.NodeManager
I was executing the ./bin/flink command as the ubuntu user, and the yarn user does not have permission to write to ubuntu's home folder in my setup.
ubuntu@vrni-platform:/tmp/flink$ echo ~ubuntu
/home/ubuntu
ubuntu@vrni-platform:/tmp/flink$ echo ~yarn
/var/lib/hadoop-yarn
It appears Flink needs permission to write to the user's home directory to create a .flink folder even when the job is submitted to YARN. It works fine for me if I run Flink as the yarn user in my setup.
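As a rough sketch of that workaround (assuming you have sudo rights, the yarn user can read the Flink install directory, and the paths are the same as above), you can submit the job as the yarn user so the .flink staging folder is created under a home directory that user can write to:
# run the Flink client as the yarn user instead of ubuntu
sudo -u yarn bash -c 'export HADOOP_CONF_DIR=/etc/hadoop/conf; cd /home/ubuntu/build-target/flink && ./bin/flink run-application -t yarn-application ./examples/streaming/TopSpeedWindowing.jar'
Alternatively, giving the yarn user write access to the submitting user's home directory achieves the same thing.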
I'm currently setting up Apache Jackrabbit standalone for a future Symfony project. However, I get a malformed URL exception in the jackrabbit.log_IS_UNDEFINED file. Apart from the log files, as far as I can tell, no other files are created.
There is a 503 error on port 8080 with the message that the service is unavailable.
I use Java 8, and the error message appears on both Ubuntu and macOS Catalina, as well as in Jackrabbit versions 2.16.6 and 2.20.1.
Does anyone have an idea how to fix that? Unfortunately, I only have a very basic understanding of this.
2020-08-22 09:33:09.879 WARN [main] Resource.java:191 Bad Resource: jar:jar:file:/Users/kristiandubek/Development/programs/jackrabbit-standalone-2.16.6.jar!/WEB-INF/lib/FastInfoset-1.2.16.jar!/META-INF/resources
2020-08-22 09:33:09.880 WARN [main] WebAppContext.java:514 Failed startup of context o.e.j.w.WebAppContext#5700d6b1{/,jar:file:/Users/kristiandubek/Development/programs/jackrabbit-standalone-2.16.6.jar!/,null}{/Users/kristiandubek/Development/programs/jackrabbit-standalone-2.16.6.jar}
java.net.MalformedURLException: Nested JAR URLs are not supported
at java.net.URL.<init>(URL.java:645)
at java.net.URL.<init>(URL.java:508)
at java.net.URL.<init>(URL.java:457)
at org.eclipse.jetty.util.resource.Resource.newResource(Resource.java:166)
at org.eclipse.jetty.util.resource.Resource.newResource(Resource.java:149)
at org.eclipse.jetty.webapp.MetaInfConfiguration.scanForResources(MetaInfConfiguration.java:176)
at org.eclipse.jetty.webapp.MetaInfConfiguration.scanJars(MetaInfConfiguration.java:133)
at org.eclipse.jetty.webapp.MetaInfConfiguration.preConfigure(MetaInfConfiguration.java:86)
at org.eclipse.jetty.webapp.WebAppContext.preConfigure(WebAppContext.java:468)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:504)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.RequestLogHandler.doStart(RequestLogHandler.java:140)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.server.Server.start(Server.java:387)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:354)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.jackrabbit.standalone.Main.run(Main.java:172)
at org.apache.jackrabbit.standalone.Main.main(Main.java:59)
Caused by: java.lang.NullPointerException: Nested JAR URLs are not supported
at sun.net.www.protocol.jar.Handler.parseURL(Handler.java:160)
at java.net.URL.<init>(URL.java:640)
This is likely https://issues.apache.org/jira/browse/JCR-4537. You can try the latest stable release (2.20.2).
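If it helps, the standalone jar is launched the same way for any release, so trying 2.20.2 is just a matter of downloading that version's jar and running it in place of the old one (filename shown only as an example):
java -jar jackrabbit-standalone-2.20.2.jar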
I am using Apache Solr on my Drupal website.
Tomcat 6 is installed, and I have replaced the schema.xml, solr-config.xml and protwords.txt files with the new files that were present in the module installation directory.
When I go to localhost:8983, I get this error.
Log4j (org.slf4j.impl.Log4jLoggerFactory)
2528 [coreLoadExecutor-3-thread-1] ERROR org.apache.solr.core.CoreContainer – Failed to load file /opt/solr-4.5.1/example/solr/collection1/conf/solrconfig.xml
2529 [coreLoadExecutor-3-thread-1] ERROR org.apache.solr.core.CoreContainer – Unable to create core: egitraining-dev.esc.rl.ac.uk
org.apache.solr.common.SolrException: Could not load config file /opt/solr-4.5.1/example/solr/collection1/conf/solrconfig.xml
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:490)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:557)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:247)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:239)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.IOException: Can't find resource 'solrconfig.xml' in classpath or 'solr/collection1/conf/conf/', cwd=/opt/solr-4.5.1/example
at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:322)
at org.apache.solr.core.SolrResourceLoader.openConfig(SolrResourceLoader.java:287)
at org.apache.solr.core.Config.<init>(Config.java:116)
at org.apache.solr.core.Config.<init>(Config.java:86)
at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:129)
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:487)
... 11 more
2531 [coreLoadExecutor-3-thread-1] ERROR org.apache.solr.core.CoreContainer – null:org.apache.solr.common.SolrException: Unable to create core: egitraining-dev.esc.rl.ac.uk
at org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:934)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:566)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:247)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:239)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: org.apache.solr.common.SolrException: Could not load config file /opt/solr-4.5.1/example/solr/collection1/conf/solrconfig.xml
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:490)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:557)
... 10 more
Caused by: java.io.IOException: Can't find resource 'solrconfig.xml' in classpath or 'solr/collection1/conf/conf/', cwd=/opt/solr-4.5.1/example
at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:322)
at org.apache.solr.core.SolrResourceLoader.openConfig(SolrResourceLoader.java:287)
at org.apache.solr.core.Config.<init>(Config.java:116)
at org.apache.solr.core.Config.<init>(Config.java:86)
at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:129)
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:487)
... 11 more
2533 [main] INFO org.apache.solr.servlet.SolrDispatchFilter – user.dir=/opt/solr-4.5.1/example
2533 [main] INFO org.apache.solr.servlet.SolrDispatchFilter – SolrDispatchFilter.init() done
2576 [main] INFO org.eclipse.jetty.server.AbstractConnector – Started SocketConnector#0.0.0.0:8983
Can anyone help, please?
Thanks
This might have something to do with the default Solr config files provided by the Search API Solr module. Try removing the following lines from solrconfig.xml:
<useCompoundFile>false</useCompoundFile>
<ramBufferSizeMB>32</ramBufferSizeMB>
<mergeFactor>10</mergeFactor>
Patch found at https://drupal.org/comment/7945999#comment-7945999.
I tried to download and install Sonar and then run an analysis with the Maven Sonar plugin.
I first tried Sonar version 2.14, and now 2.13.1, but I get the same error over and over again.
It seems it is unable to download the JDBC driver from localhost (the Sonar server), because I get this error message: "Fail to download the file: http://localhost:9000/deploy/jdbc-driver.jar (no proxy): Read timed out -> [Help 1]"
But if I go to http://localhost:9000/deploy/jdbc-driver.jar with my browser, it downloads the file without problems, so it seems that the Sonar server is working correctly and that the Maven Sonar plugin is doing something wrong that I cannot figure out.
I have also tried to change Sonar's context root to /sonar, but I get the same error.
I have configured Sonar to use PostgreSQL instead of the embedded Derby database, but still no luck. Accessing Sonar from a browser works like a charm, but I cannot run mvn sonar:sonar, so at this point my Sonar installation is pretty useless... :(
Here is the stack trace from running Maven with the -e option:
[ERROR] Failed to execute goal org.codehaus.mojo:sonar-maven-plugin:2.0:sonar (default-cli) on project betterfarmer-engine: Can not execute Sonar: Fail to download the file: http://localhost:9000/deploy/jdbc-driver.jar (no proxy): Read timed out -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:sonar-maven-plugin:2.0:sonar (default-cli) on project betterfarmer-engine: Can not execute Sonar
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: Can not execute Sonar
at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:118)
at org.codehaus.mojo.sonar.Bootstraper.start(Bootstraper.java:65)
at org.codehaus.mojo.sonar.SonarMojo.execute(SonarMojo.java:90)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
... 19 more
Caused by: org.sonar.api.utils.SonarException: Fail to download the file: http://localhost:9000/deploy/jdbc-driver.jar (no proxy)
at org.sonar.api.utils.HttpDownloader.download(HttpDownloader.java:130)
at org.sonar.batch.bootstrap.ArtifactDownloader.downloadJdbcDriver(ArtifactDownloader.java:58)
at org.sonar.batch.bootstrap.JdbcDriverHolder.<init>(JdbcDriverHolder.java:43)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.picocontainer.injectors.AbstractInjector.newInstance(AbstractInjector.java:147)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:319)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:274)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:341)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:689)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:638)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:663)
at org.sonar.api.platform.ComponentContainer.getComponentByType(ComponentContainer.java:113)
at org.sonar.batch.bootstrap.Module.getComponentByType(Module.java:131)
at org.sonar.batch.bootstrap.BootstrapModule.configure(BootstrapModule.java:63)
at org.sonar.batch.bootstrap.Module.init(Module.java:49)
at org.sonar.batch.bootstrap.Module.init(Module.java:41)
at org.sonar.batch.Batch.<init>(Batch.java:37)
at org.sonar.maven3.SonarMojo.executeBatch(SonarMojo.java:143)
at org.sonar.maven3.SonarMojo.execute(SonarMojo.java:136)
at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:113)
... 23 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:150)
at java.net.SocketInputStream.read(SocketInputStream.java:121)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
at sun.net.www.MeteredStream.read(MeteredStream.java:134)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(HttpURLConnection.java:2959)
at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(HttpURLConnection.java:2953)
at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1262)
at org.apache.commons.io.IOUtils.copy(IOUtils.java:1236)
at org.sonar.api.utils.HttpDownloader.download(HttpDownloader.java:125)
... 48 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Your Sonar is probably not correctly installed.
Do you have errors in the server log?
Are you able to browse the web application at http://localhost:9000?
I'd suggest starting with a fresh install (in a different folder from the previous one) and checking that you don't have environment properties pointing to the previous install location before starting the new Sonar.
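As a quick sketch of that check (the variable name below is just a common suspect, not a definitive list), you can look for leftover Sonar-related environment settings and confirm from the shell that the driver actually downloads before re-running mvn sonar:sonar:
env | grep -i sonar                                                             # e.g. a stale SONAR_HOME pointing at the old install
curl -v -o /tmp/jdbc-driver.jar http://localhost:9000/deploy/jdbc-driver.jar    # should complete without a read timeout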