rexster.xml configuration with Titan 0.5.4 and BerkeleyJE

I'm new to graph databases and have been stuck for a few days. I'm trying to run Rexster against Titan 0.5.4 with the BerkeleyJE storage backend, but I'm not sure my rexster.xml configuration is correct.
If I follow the Rexster docs, the Dog House shows an empty database with no vertices or edges, even though the graph does contain data (screenshot). The relevant section of rexster.xml:
<graphs>
  <graph>
    <graph-name>bio4j</graph-name>
    <graph-type>com.tinkerpop.rexster.config.TinkerGraphGraphConfiguration</graph-type>
    <graph-location>/Users/Phoenix/Dropbox/Graph4Bio/Bio4j/bio4j/importGOTitanProperties.properties</graph-location>
    <graph-read-only>false</graph-read-only>
    <extensions>
      <allows>
        <allow>tp:frames</allow>
      </allows>
    </extensions>
  </graph>
  ...
</graphs>
And if I follow the answer here instead:
<graphs>
  <graph>
    <graph-name>bio4j</graph-name>
    <graph-type>com.thinkaurelius.titan.tinkerpop.rexster.TitanGraphConfiguration</graph-type>
    <graph-location>/Users/Phoenix/Dropbox/Graph4Bio/Bio4j/bio4j/importGOTitanProperties.properties</graph-location>
    <graph-read-only>false</graph-read-only>
    <properties>
      <storage.backend>local</storage.backend>
      <storage.directory>/Users/Phoenix/Dropbox/Graph4Bio/Bio4j/bio4j</storage.directory>
      <buffer-size>100</buffer-size>
    </properties>
    <extensions>
      <allows>
        <allow>tp:frames</allow>
      </allows>
    </extensions>
  </graph>
  ...
</graphs>
I got these errors:
$ ../../bin/rexster.sh --start
0 [main] INFO com.tinkerpop.rexster.Application - .:Welcome to Rexster:.
90 [main] INFO com.tinkerpop.rexster.server.RexsterProperties - Using [/Users/Phoenix/Dropbox/Graph4Bio/Titan/rexhome/config/rexster.xml] as configuration source.
97 [main] INFO com.tinkerpop.rexster.Application - Rexster is watching [/Users/Phoenix/Dropbox/Graph4Bio/Titan/rexhome/config/rexster.xml] for change.
232 [main] WARN com.tinkerpop.rexster.config.GraphConfigurationContainer - Could not load graph bio4j. Please check the XML configuration.
233 [main] WARN com.tinkerpop.rexster.config.GraphConfigurationContainer - GraphConfiguration could not be found or otherwise instantiated: [com.thinkaurelius.titan.tinkerpop.rexster.TitanGraphConfiguration]. Ensure that it is in Rexster's path.
com.tinkerpop.rexster.config.GraphConfigurationException: GraphConfiguration could not be found or otherwise instantiated: [com.thinkaurelius.titan.tinkerpop.rexster.TitanGraphConfiguration]. Ensure that it is in Rexster's path.
at com.tinkerpop.rexster.config.GraphConfigurationContainer.getGraphFromConfiguration(GraphConfigurationContainer.java:142)
at com.tinkerpop.rexster.config.GraphConfigurationContainer.<init>(GraphConfigurationContainer.java:54)
at com.tinkerpop.rexster.server.XmlRexsterApplication.reconfigure(XmlRexsterApplication.java:99)
at com.tinkerpop.rexster.server.XmlRexsterApplication.<init>(XmlRexsterApplication.java:47)
at com.tinkerpop.rexster.Application.<init>(Application.java:97)
at com.tinkerpop.rexster.Application.main(Application.java:189)
Caused by: java.lang.IllegalArgumentException: Could not find implementation class: local
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:47)
at com.thinkaurelius.titan.diskstorage.Backend.getImplementationClass(Backend.java:421)
at com.thinkaurelius.titan.diskstorage.Backend.getStorageManager(Backend.java:361)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1275)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:93)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:73)
at com.thinkaurelius.titan.tinkerpop.rexster.TitanGraphConfiguration.configureGraphInstance(TitanGraphConfiguration.java:33)
at com.tinkerpop.rexster.config.GraphConfigurationContainer.getGraphFromConfiguration(GraphConfigurationContainer.java:124)
... 5 more
Caused by: java.lang.ClassNotFoundException: local
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:42)
... 12 more
235 [main] WARN com.tinkerpop.rexster.config.GraphConfigurationContainer - Could not find implementation class: local
java.lang.IllegalArgumentException: Could not find implementation class: local
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:47)
at com.thinkaurelius.titan.diskstorage.Backend.getImplementationClass(Backend.java:421)
at com.thinkaurelius.titan.diskstorage.Backend.getStorageManager(Backend.java:361)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1275)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:93)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:73)
at com.thinkaurelius.titan.tinkerpop.rexster.TitanGraphConfiguration.configureGraphInstance(TitanGraphConfiguration.java:33)
at com.tinkerpop.rexster.config.GraphConfigurationContainer.getGraphFromConfiguration(GraphConfigurationContainer.java:124)
at com.tinkerpop.rexster.config.GraphConfigurationContainer.<init>(GraphConfigurationContainer.java:54)
at com.tinkerpop.rexster.server.XmlRexsterApplication.reconfigure(XmlRexsterApplication.java:99)
at com.tinkerpop.rexster.server.XmlRexsterApplication.<init>(XmlRexsterApplication.java:47)
at com.tinkerpop.rexster.Application.<init>(Application.java:97)
at com.tinkerpop.rexster.Application.main(Application.java:189)
Caused by: java.lang.ClassNotFoundException: local
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:42)
... 12 more
244 [main] INFO com.tinkerpop.rexster.server.metrics.HttpReporterConfig - Configured HTTP Metric Reporter.
245 [main] INFO com.tinkerpop.rexster.server.metrics.ConsoleReporterConfig - Configured Console Metric Reporter.
1312 [main] INFO com.tinkerpop.rexster.server.HttpRexsterServer - HTTP/REST thread pool configuration: kernal[4 / 4] worker[8 / 8]
1313 [main] INFO com.tinkerpop.rexster.server.HttpRexsterServer - Using org.glassfish.grizzly.strategies.LeaderFollowerNIOStrategy IOStrategy for HTTP/REST.
1399 [main] INFO com.tinkerpop.rexster.server.HttpRexsterServer - Rexster Server running on: [http://localhost:8182]
1399 [main] INFO com.tinkerpop.rexster.server.RexProRexsterServer - Using org.glassfish.grizzly.strategies.LeaderFollowerNIOStrategy IOStrategy for RexPro.
1399 [main] INFO com.tinkerpop.rexster.server.RexProRexsterServer - RexPro thread pool configuration: kernal[4 / 4] worker[8 / 8]
1402 [main] INFO com.tinkerpop.rexster.server.RexProRexsterServer - Rexster configured with [DefaultSecurity].
1403 [main] INFO com.tinkerpop.rexster.server.RexProRexsterServer - RexPro Server bound to [0.0.0.0:8184]
1412 [main] INFO com.tinkerpop.rexster.server.ShutdownManager - Bound shutdown socket to /127.0.0.1:8183. Starting listener thread for shutdown requests.
And here is the database .properties file:
$ nano importGOTitanProperties.properties
#ImportGOTitan properties
storage.directory=bio4j
storage.backend=berkeleyje
#index.search.backend=elasticsearch
#index.search.directory=bio4j/es
storage.batch-loading=false
storage.transactions=true
query.fast-property=false
schema.default=none
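Incidentally, one way to double-check that the BerkeleyJE store really does contain data is to open it directly from the Titan Gremlin console (with Rexster stopped, since BerkeleyJE only allows one process to hold the store open). A sketch using the paths above and the Gremlin 2 syntax shipped with Titan 0.5:
gremlin> g = TitanFactory.open('/Users/Phoenix/Dropbox/Graph4Bio/Bio4j/bio4j/importGOTitanProperties.properties')
gremlin> g.V.count()
gremlin> g.E.count()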

After reading this, I figured out how to configure rexster.xml: the <properties> element has to mirror the keys in the Titan .properties file, so storage.backend must be berkeleyje rather than local, and buffer-size becomes storage.buffer-size.
rexster.xml:
<?xml version="1.0" encoding="UTF-8"?>
<rexster>
  ...
  <graphs>
    <graph>
      <graph-name>bio4j</graph-name>
      <graph-type>com.thinkaurelius.titan.tinkerpop.rexster.TitanGraphConfiguration</graph-type>
      <graph-read-only>false</graph-read-only>
      <properties>
        <storage.backend>berkeleyje</storage.backend>
        <storage.directory>/Users/Phoenix/Dropbox/Graph4Bio/Bio4j/bio4j</storage.directory>
        <storage.buffer-size>100</storage.buffer-size>
      </properties>
      <extensions>
        <allows>
          <allow>tp:gremlin</allow>
        </allows>
      </extensions>
    </graph>
  </graphs>
</rexster>
starting the server:
$ ../../bin/rexster.sh --start
0 [main] INFO com.tinkerpop.rexster.Application - .:Welcome to Rexster:.
115 [main] INFO com.tinkerpop.rexster.server.RexsterProperties - Using [/Users/Phoenix/Dropbox/Graph4Bio/Titan/rexhome/config/rexster.xml] as configuration source.
129 [main] INFO com.tinkerpop.rexster.Application - Rexster is watching [/Users/Phoenix/Dropbox/Graph4Bio/Titan/rexhome/config/rexster.xml] for change.
942 [main] INFO com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration - Generated unique-instance-id=0a2581025086-AngryMac-local1
1049 [main] INFO com.thinkaurelius.titan.diskstorage.Backend - Initiated backend operations thread pool of size 8
1161 [main] INFO com.thinkaurelius.titan.diskstorage.log.kcvs.KCVSLog - Loaded unidentified ReadMarker start time Timepoint[1454913673816000 μs] into com.thinkaurelius.titan.diskstorage.log.kcvs.KCVSLog$MessagePuller#302c971f
1165 [main] INFO com.tinkerpop.rexster.RexsterApplicationGraph - Graph [bio4j] - configured with allowable namespace [tp:gremlin]
1205 [main] INFO com.tinkerpop.rexster.config.GraphConfigurationContainer - Graph bio4j - titangraph[berkeleyje:/Users/Phoenix/Dropbox/Graph4Bio/Bio4j/bio4j] loaded
2502 [main] INFO com.tinkerpop.rexster.server.HttpRexsterServer - HTTP/REST thread pool configuration: kernal[4 / 4] worker[8 / 8]
2504 [main] INFO com.tinkerpop.rexster.server.HttpRexsterServer - Using org.glassfish.grizzly.strategies.LeaderFollowerNIOStrategy IOStrategy for HTTP/REST.
2636 [main] INFO com.tinkerpop.rexster.server.HttpRexsterServer - Rexster Server running on: [http://localhost:8182]
2636 [main] INFO com.tinkerpop.rexster.server.RexProRexsterServer - Using org.glassfish.grizzly.strategies.LeaderFollowerNIOStrategy IOStrategy for RexPro.
2636 [main] INFO com.tinkerpop.rexster.server.RexProRexsterServer - RexPro thread pool configuration: kernal[4 / 4] worker[8 / 8]
2640 [main] INFO com.tinkerpop.rexster.server.RexProRexsterServer - Rexster configured with no security.
2641 [main] INFO com.tinkerpop.rexster.server.RexProRexsterServer - RexPro Server bound to [0.0.0.0:8184]
2653 [main] INFO com.tinkerpop.rexster.server.ShutdownManager - Bound shutdown socket to /127.0.0.1:8183. Starting listener thread for shutdown requests.
And the Dog House now shows the graph data (screenshot). :)
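Beyond the Dog House UI, a quick sanity check is to hit Rexster's REST API once the server is up. A sketch assuming the default host/port and the bio4j graph name configured above (the offset parameters follow the Rexster REST API conventions; adjust as needed):
# list the graphs Rexster has loaded
curl http://localhost:8182/graphs
# metadata for the bio4j graph
curl http://localhost:8182/graphs/bio4j
# sample a few vertices to confirm the data is visible
curl "http://localhost:8182/graphs/bio4j/vertices?rexster.offset.start=0&rexster.offset.end=10"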

Related

Google PubSub with Ktor

I have been using Ktor for a while and I want to integrate Google Pub/Sub into it, but it is not working. When the app starts, Pub/Sub is able to run, but the Ktor server fails to start.
I would appreciate it if anyone who has successfully integrated Pub/Sub with Ktor could share how they did it.
Thanks in advance.
Error log (after this, Pub/Sub runs well but Ktor dies off):
java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
at io.grpc.netty.shaded.io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31)
at io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0$4.run(PlatformDependent0.java:238)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:312)
at io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:232)
at io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:289)
at io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:92)
at io.grpc.netty.shaded.io.netty.util.AsciiString.<init>(AsciiString.java:223)
at io.grpc.netty.shaded.io.netty.util.AsciiString.<init>(AsciiString.java:210)
at io.grpc.netty.shaded.io.netty.util.AsciiString.cached(AsciiString.java:1401)
at io.grpc.netty.shaded.io.netty.util.AsciiString.<clinit>(AsciiString.java:48)
at io.grpc.netty.shaded.io.grpc.netty.Utils.<clinit>(Utils.java:74)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.<clinit>(NettyChannelBuilder.java:81)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider.builderForAddress(NettyChannelProvider.java:38)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider.builderForAddress(NettyChannelProvider.java:24)
at io.grpc.ManagedChannelBuilder.forAddress(ManagedChannelBuilder.java:39)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:306)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1600(InstantiatingGrpcChannelProvider.java:73)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:214)
at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:221)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:204)
at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
at com.google.cloud.pubsub.v1.stub.GrpcSubscriberStub.create(GrpcSubscriberStub.java:272)
at com.google.cloud.pubsub.v1.Subscriber.doStart(Subscriber.java:276)
at com.google.api.core.AbstractApiService$InnerService.doStart(AbstractApiService.java:148)
at com.google.common.util.concurrent.AbstractService.startAsync(AbstractService.java:248)
at com.google.api.core.AbstractApiService.startAsync(AbstractApiService.java:120)
at com.google.cloud.pubsub.v1.Subscriber.startAsync(Subscriber.java:268)
at com.fbistech.ApplicationKt.listenToSub(Application.kt:314)
at com.fbistech.ApplicationKt.access$listenToSub(Application.kt:1)
at com.fbistech.ApplicationKt$main$started$1$1.invokeSuspend(Application.kt:96)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:277)
at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:86)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:61)
at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt)
at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
at com.fbistech.ApplicationKt$main$started$1.invoke(Application.kt:96)
at com.fbistech.ApplicationKt$main$started$1.invoke(Application.kt)
at io.ktor.application.ApplicationEvents.raise(ApplicationEvents.kt:46)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.safeRiseEvent(ApplicationEngineEnvironmentReloading.kt:192)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.instantiateAndConfigureApplication(ApplicationEngineEnvironmentReloading.kt:311)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.createApplication(ApplicationEngineEnvironmentReloading.kt:136)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.start(ApplicationEngineEnvironmentReloading.kt:268)
at io.ktor.server.netty.NettyApplicationEngine.start(NettyApplicationEngine.kt:174)
at io.ktor.server.netty.EngineMain.main(EngineMain.kt:26)
at com.fbistech.ApplicationKt.main(Application.kt:65)
2021-05-10 05:22:39.918 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent0 - java.nio.Bits.unaligned: available, true
2021-05-10 05:22:39.919 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent0 - jdk.internal.misc.Unsafe.allocateUninitializedArray(int): available
2021-05-10 05:22:39.919 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent0 - java.nio.DirectByteBuffer.<init>(long, int): unavailable
2021-05-10 05:22:39.919 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent - sun.misc.Unsafe: available
2021-05-10 05:22:39.930 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent - maxDirectMemory: 4294967296 bytes (maybe)
2021-05-10 05:22:39.930 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent - -Dio.netty.tmpdir: /var/folders/g1/4bwcs52n38b3bcjc3rdv2zvm0000gn/T (java.io.tmpdir)
2021-05-10 05:22:39.930 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)
2021-05-10 05:22:39.931 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent - -Dio.netty.maxDirectMemory: -1 bytes
2021-05-10 05:22:39.931 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent - -Dio.netty.uninitializedArrayAllocationThreshold: 1024
2021-05-10 05:22:39.931 [main] DEBUG i.g.n.s.i.n.u.internal.CleanerJava9 - java.nio.ByteBuffer.cleaner(): available
2021-05-10 05:22:39.931 [main] DEBUG i.g.n.s.i.n.u.i.PlatformDependent - -Dio.netty.noPreferDirect: false
2021-05-10 05:22:39.971 [main] DEBUG i.g.n.s.i.n.u.i.NativeLibraryLoader - -Dio.netty.native.workdir: /var/folders/g1/4bwcs52n38b3bcjc3rdv2zvm0000gn/T (io.netty.tmpdir)
2021-05-10 05:22:39.971 [main] DEBUG i.g.n.s.i.n.u.i.NativeLibraryLoader - -Dio.netty.native.deleteLibAfterLoading: true
2021-05-10 05:22:39.971 [main] DEBUG i.g.n.s.i.n.u.i.NativeLibraryLoader - -Dio.netty.native.tryPatchShadedId: true
2021-05-10 05:22:39.972 [main] DEBUG i.g.n.s.i.n.u.i.NativeLibraryLoader - Unable to load the library 'io_grpc_netty_shaded_netty_tcnative_osx_x86_64', trying other loading mechanism.
java.lang.UnsatisfiedLinkError: no io_grpc_netty_shaded_netty_tcnative_osx_x86_64 in java.library.path: /Users/fbistech-d6m/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
at java.base/java.lang.ClassLoader.loadLibrary(ClassLoader.java:2447)
at java.base/java.lang.Runtime.loadLibrary0(Runtime.java:809)
at java.base/java.lang.System.loadLibrary(System.java:1893)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader$1.run(NativeLibraryLoader.java:371)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:312)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.loadLibraryByHelper(NativeLibraryLoader.java:363)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:341)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:136)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.loadFirstAvailable(NativeLibraryLoader.java:96)
at io.grpc.netty.shaded.io.netty.handler.ssl.OpenSsl.loadTcNative(OpenSsl.java:590)
at io.grpc.netty.shaded.io.netty.handler.ssl.OpenSsl.<clinit>(OpenSsl.java:136)
at io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.defaultSslProvider(GrpcSslContexts.java:228)
at io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:145)
at io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.forClient(GrpcSslContexts.java:94)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder$DefaultProtocolNegotiator.newNegotiator(NettyChannelBuilder.java:594)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.buildTransportFactory(NettyChannelBuilder.java:500)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder$NettyChannelTransportFactoryBuilder.buildClientTransportFactory(NettyChannelBuilder.java:171)
at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:608)
at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:340)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1600(InstantiatingGrpcChannelProvider.java:73)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:214)
at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:221)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:204)
at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
at com.google.cloud.pubsub.v1.stub.GrpcSubscriberStub.create(GrpcSubscriberStub.java:272)
at com.google.cloud.pubsub.v1.Subscriber.doStart(Subscriber.java:276)
at com.google.api.core.AbstractApiService$InnerService.doStart(AbstractApiService.java:148)
at com.google.common.util.concurrent.AbstractService.startAsync(AbstractService.java:248)
at com.google.api.core.AbstractApiService.startAsync(AbstractApiService.java:120)
at com.google.cloud.pubsub.v1.Subscriber.startAsync(Subscriber.java:268)
at com.fbistech.ApplicationKt.listenToSub(Application.kt:314)
at com.fbistech.ApplicationKt.access$listenToSub(Application.kt:1)
at com.fbistech.ApplicationKt$main$started$1$1.invokeSuspend(Application.kt:96)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:277)
at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:86)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:61)
at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt)
at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
at com.fbistech.ApplicationKt$main$started$1.invoke(Application.kt:96)
at com.fbistech.ApplicationKt$main$started$1.invoke(Application.kt)
at io.ktor.application.ApplicationEvents.raise(ApplicationEvents.kt:46)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.safeRiseEvent(ApplicationEngineEnvironmentReloading.kt:192)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.instantiateAndConfigureApplication(ApplicationEngineEnvironmentReloading.kt:311)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.createApplication(ApplicationEngineEnvironmentReloading.kt:136)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.start(ApplicationEngineEnvironmentReloading.kt:268)
at io.ktor.server.netty.NettyApplicationEngine.start(NettyApplicationEngine.kt:174)
at io.ktor.server.netty.EngineMain.main(EngineMain.kt:26)
at com.fbistech.ApplicationKt.main(Application.kt:65)
2021-05-10 05:22:39.973 [main] DEBUG i.g.n.s.i.n.u.i.NativeLibraryLoader - io_grpc_netty_shaded_netty_tcnative_osx_x86_64 cannot be loaded from java.library.path, now trying export to -Dio.netty.native.workdir: /var/folders/g1/4bwcs52n38b3bcjc3rdv2zvm0000gn/T
java.lang.UnsatisfiedLinkError: no io_grpc_netty_shaded_netty_tcnative_osx_x86_64 in java.library.path: /Users/fbistech-d6m/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
at java.base/java.lang.ClassLoader.loadLibrary(ClassLoader.java:2447)
at java.base/java.lang.Runtime.loadLibrary0(Runtime.java:809)
at java.base/java.lang.System.loadLibrary(System.java:1893)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:351)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:136)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.loadFirstAvailable(NativeLibraryLoader.java:96)
at io.grpc.netty.shaded.io.netty.handler.ssl.OpenSsl.loadTcNative(OpenSsl.java:590)
at io.grpc.netty.shaded.io.netty.handler.ssl.OpenSsl.<clinit>(OpenSsl.java:136)
at io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.defaultSslProvider(GrpcSslContexts.java:228)
at io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:145)
at io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.forClient(GrpcSslContexts.java:94)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder$DefaultProtocolNegotiator.newNegotiator(NettyChannelBuilder.java:594)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.buildTransportFactory(NettyChannelBuilder.java:500)
at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder$NettyChannelTransportFactoryBuilder.buildClientTransportFactory(NettyChannelBuilder.java:171)
at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:608)
at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:340)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1600(InstantiatingGrpcChannelProvider.java:73)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:214)
at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:221)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:204)
at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
at com.google.cloud.pubsub.v1.stub.GrpcSubscriberStub.create(GrpcSubscriberStub.java:272)
at com.google.cloud.pubsub.v1.Subscriber.doStart(Subscriber.java:276)
at com.google.api.core.AbstractApiService$InnerService.doStart(AbstractApiService.java:148)
at com.google.common.util.concurrent.AbstractService.startAsync(AbstractService.java:248)
at com.google.api.core.AbstractApiService.startAsync(AbstractApiService.java:120)
at com.google.cloud.pubsub.v1.Subscriber.startAsync(Subscriber.java:268)
at com.fbistech.ApplicationKt.listenToSub(Application.kt:314)
at com.fbistech.ApplicationKt.access$listenToSub(Application.kt:1)
at com.fbistech.ApplicationKt$main$started$1$1.invokeSuspend(Application.kt:96)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:277)
at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:86)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:61)
at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt)
at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
at com.fbistech.ApplicationKt$main$started$1.invoke(Application.kt:96)
at com.fbistech.ApplicationKt$main$started$1.invoke(Application.kt)
at io.ktor.application.ApplicationEvents.raise(ApplicationEvents.kt:46)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.safeRiseEvent(ApplicationEngineEnvironmentReloading.kt:192)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.instantiateAndConfigureApplication(ApplicationEngineEnvironmentReloading.kt:311)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.createApplication(ApplicationEngineEnvironmentReloading.kt:136)
at io.ktor.server.engine.ApplicationEngineEnvironmentReloading.start(ApplicationEngineEnvironmentReloading.kt:268)
at io.ktor.server.netty.NettyApplicationEngine.start(NettyApplicationEngine.kt:174)
at io.ktor.server.netty.EngineMain.main(EngineMain.kt:26)
at com.fbistech.ApplicationKt.main(Application.kt:65)
Suppressed: java.lang.UnsatisfiedLinkError: no io_grpc_netty_shaded_netty_tcnative_osx_x86_64 in java.library.path: /Users/fbistech-d6m/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
at java.base/java.lang.ClassLoader.loadLibrary(ClassLoader.java:2447)
at java.base/java.lang.Runtime.loadLibrary0(Runtime.java:809)
at java.base/java.lang.System.loadLibrary(System.java:1893)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader$1.run(NativeLibraryLoader.java:371)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:312)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.loadLibraryByHelper(NativeLibraryLoader.java:363)
at io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:341)
... 46 common frames omitted

Golang gocql cannot connect to Cassandra (using Docker)

I am trying to set up and connect to a single-node Cassandra instance using Docker and Go, and it is not working.
The closest information I could find on connection issues between the Go gocql package and Cassandra is here: Cassandra cqlsh - connection refused. However, there are many different upvoted answers with no clear indication of which is preferred, and it is a protected question (no "me toos"), so a lot of community members seem to be having trouble with this.
My problem should be slightly different, as I am using Docker and have tried most (if not all) of the solutions linked above. Here is my docker-compose.yml:
version: "3"
services:
cassandra00:
restart: always
image: cassandra:latest
volumes:
- ./db/casdata:/var/lib/cassandra
ports:
- 7000:7000
- 7001:7001
- 7199:7199
- 9042:9042
- 9160:9160
environment:
- CASSANDRA_RPC_ADDRESS=127.0.0.1
- CASSANDRA_BROADCAST_ADDRESS=127.0.0.1
- CASSANDRA_LISTEN_ADDRESS=127.0.0.1
- CASSANDRA_START_RPC=true
db:
restart: always
build: ./db
environment:
POSTGRES_USER: patientplatypus
POSTGRES_PASSWORD: SUPERSECRETFAKEPASSD00T
POSTGRES_DB: zennify
expose:
- "5432"
ports:
- 5432:5432
volumes:
- ./db/pgdata:/var/lib/postgresql/data
app:
restart: always
build:
context: .
dockerfile: Dockerfile
command: bash -c 'while !</dev/tcp/db/5432; do sleep 10; done; realize start --run'
# command: bash -c 'while !</dev/tcp/db/5432; do sleep 10; done; go run main.go'
ports:
- 8000:8000
depends_on:
- db
- cassandra00
links:
- db
- cassandra00
volumes:
- ./:/go/src/github.com/patientplatypus/webserver/
Admittedly, I am a little shaky on what listening addresses I should pass to Cassandra in the environment section, so I just passed the loopback address:
- CASSANDRA_RPC_ADDRESS=127.0.0.1
- CASSANDRA_BROADCAST_ADDRESS=127.0.0.1
- CASSANDRA_LISTEN_ADDRESS=127.0.0.1
If you try and pass 0.0.0.0 you get the following error:
cassandra00_1 | Exception (org.apache.cassandra.exceptions.ConfigurationException) encountered during startup: listen_address cannot be a wildcard address (0.0.0.0)!
cassandra00_1 | listen_address cannot be a wildcard address (0.0.0.0)!
cassandra00_1 | ERROR [main] 2018-09-10 21:50:44,530 CassandraDaemon.java:708 - Exception encountered during startup: listen_address cannot be a wildcard address (0.0.0.0)!
Overall, however, the startup procedure for Cassandra looks correct (afaict), because the terminal shows Cassandra starting up normally and listening on the appropriate ports:
cassandra00_1 | INFO [main] 2018-09-10 22:06:28,920 StorageService.java:1446 - JOINING: Finish joining ring
cassandra00_1 | INFO [main] 2018-09-10 22:06:29,179 StorageService.java:2289 - Node /127.0.0.1 state jump to NORMAL
cassandra00_1 | INFO [main] 2018-09-10 22:06:29,607 NativeTransportService.java:70 - Netty using native Epoll event loop
cassandra00_1 | INFO [main] 2018-09-10 22:06:29,750 Server.java:155 - Using Netty Version: [netty-buffer=netty-buffer-4.0.44.Final.452812a, netty-codec=netty-codec-4.0.44.Final.452812a, netty-codec-haproxy=netty-codec-haproxy-4.0.44.Final.452812a, netty-codec-http=netty-codec-http-4.0.44.Final.452812a, netty-codec-socks=netty-codec-socks-4.0.44.Final.452812a, netty-common=netty-common-4.0.44.Final.452812a, netty-handler=netty-handler-4.0.44.Final.452812a, netty-tcnative=netty-tcnative-1.1.33.Fork26.142ecbb, netty-transport=netty-transport-4.0.44.Final.452812a, netty-transport-native-epoll=netty-transport-native-epoll-4.0.44.Final.452812a, netty-transport-rxtx=netty-transport-rxtx-4.0.44.Final.452812a, netty-transport-sctp=netty-transport-sctp-4.0.44.Final.452812a, netty-transport-udt=netty-transport-udt-4.0.44.Final.452812a]
cassandra00_1 | INFO [main] 2018-09-10 22:06:29,754 Server.java:156 - Starting listening for CQL clients on /127.0.0.1:9042 (unencrypted)...
cassandra00_1 | INFO [main] 2018-09-10 22:06:29,990 ThriftServer.java:116 - Binding thrift service to /127.0.0.1:9160
In my Go code, the following package is being called (simplified to show the relevant section):
package data

import (
    "fmt"

    "github.com/gocql/gocql"
)

func create_userinfo_table() {
    <...>
    fmt.Println("replicating table in cassandra")
    cluster := gocql.NewCluster("localhost") // <--- error here!
    cluster.ProtoVersion = 4
    <...>
}
Which results in the following error in my terminal:
app_1 | [21:52:38][WEBSERVER] : 2018/09/10
21:52:38 gocql: unable to dial control conn 127.0.0.1:
dial tcp 127.0.0.1:9042: connect: connection refused
app_1 | [21:52:38][WEBSERVER] : 2018/09/10
21:52:38 gocql: unable to dial control conn ::1:
dial tcp [::1]:9042: connect: cannot assign requested address
app_1 | [21:52:38][WEBSERVER] : 2018/09/10
21:52:38 Could not connect to cassandra cluster: gocql:
unable to create session: control: unable to connect to initial hosts:
dial tcp [::1]:9042: connect: cannot assign requested address
I have tried several variations on the connection address
cluster := gocql.NewCluster("localhost")
cluster := gocql.NewCluster("127.0.0.1")
cluster := gocql.NewCluster("127.0.0.1:9042")
cluster := gocql.NewCluster("127.0.0.1:9160")
These seemed like likely candidates, but no luck.
Does anyone have any idea what I am doing wrong?
Use the service name cassandra00 as the hostname, per the docker-compose documentation (https://docs.docker.com/compose/compose-file/#links). Inside the app container, localhost refers to the app container itself, not to Cassandra:
"Containers for the linked service are reachable at a hostname identical to the alias, or the service name if no alias was specified."
Also, leave the CASSANDRA_LISTEN_ADDRESS environment variable unset (or pass auto), per https://docs.docker.com/samples/library/cassandra/:
"The default value is auto, which will set the listen_address option in cassandra.yaml to the IP address of the container as it starts. This default should work in most use cases."
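Putting that together, here is a minimal sketch of the gocql connection under those assumptions. The cassandra00 host name and port 9042 come from the compose file above; the keyspace name is hypothetical and would need to match your schema:
package data

import (
    "log"

    "github.com/gocql/gocql"
)

func newCassandraSession() (*gocql.Session, error) {
    // Use the compose service name: inside the app container, Docker's DNS
    // resolves "cassandra00" to the Cassandra container, while "localhost"
    // points back at the app container itself.
    cluster := gocql.NewCluster("cassandra00")
    cluster.Port = 9042
    cluster.ProtoVersion = 4
    cluster.Keyspace = "userinfo" // hypothetical keyspace; adjust to your schema

    session, err := cluster.CreateSession()
    if err != nil {
        log.Printf("could not connect to cassandra cluster: %v", err)
        return nil, err
    }
    return session, nil
}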

Error integrating Solr 5.5.0 with Nutch 1.13: 'Connection pool shut down'

I ran into a problem when I tried to integrate Solr with Nutch:
Nutch version: 1.13
Solr version: 5.5.0 (as recommended by the official documentation, https://wiki.apache.org/nutch/NutchTutorial#Verify_your_Nutch_installation)
The error is:
Active IndexWriters :
SOLRIndexWriter
solr.server.url : URL of the SOLR instance
solr.zookeeper.hosts : URL of the Zookeeper quorum
solr.commit.size : buffer size when sending to SOLR (default 1000)
solr.mapping.file : name of the mapping file for fields (default solrindex-mapping.xml)
solr.auth : use authentication (default false)
solr.auth.username : username for authentication
solr.auth.password : password for authentication
Indexer: number of documents indexed, deleted, or skipped:
Indexer: finished at 2017-11-30 01:34:49, elapsed: 00:00:01
Cleaning up index if possible
apache-nutch-1.13/bin/nutch clean -Dsolr.server.url=http://localhost:8983/solr/nutch crawling_dir/crawldb
SolrIndexer: deleting 1/1 documents
ERROR CleaningJob: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:865)
at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:174)
at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:197)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:208)
Error running:
apache-nutch-1.13/bin/nutch clean -Dsolr.server.url=http://localhost:8983/solr/nutch crawling_dir/crawldb
Failed with exit value 255.
In the log file:
2017-11-30 01:34:50,851 WARN output.FileOutputCommitter - Output Path is null in cleanupJob()
2017-11-30 01:34:50,851 WARN mapred.LocalJobRunner - job_local531807742_0001
java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.IllegalStateException: Connection pool shut down
at org.apache.http.util.Asserts.check(Asserts.java:34)
at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:169)
at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:202)
at org.apache.http.impl.conn.PoolingClientConnectionManager.requestConnection(PoolingClientConnectionManager.java:184)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:415)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:481)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:149)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:482)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:463)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.commit(SolrIndexWriter.java:191)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:179)
at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:117)
at org.apache.nutch.indexer.CleaningJob$DeleterReducer.close(CleaningJob.java:122)
at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:244)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2017-11-30 01:34:51,458 ERROR indexer.CleaningJob - CleaningJob: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:865)
at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:174)
at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:197)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:208)
Does anyone have any idea what is going wrong?
I had the same problem, and yours is probably due to the same cause:
https://issues.apache.org/jira/browse/NUTCH-2269
Try applying the patch and the error should be gone.
From my findings, it appears to be a bug. Here's a blog post that explains it well: https://reformatcode.com/code/apache-configuration/apache-nutch-112-with-apache-solr-621-give-an-error

NUTCH 1.13 fetch of url failed with: org.apache.nutch.protocol.ProtocolNotFound: protocol not found for url=http

fetch of httpurl failed with: org.apache.nutch.protocol.ProtocolNotFound: protocol not found for url=http
at org.apache.nutch.protocol.ProtocolFactory.getProtocol(ProtocolFactory.java:85)
at org.apache.nutch.fetcher.FetcherThread.run(FetcherThread.java:285)
Using queue mode : byHost
fetch of httpsurl failed with: org.apache.nutch.protocol.ProtocolNotFound: protocol not found for url=https
at org.apache.nutch.protocol.ProtocolFactory.getProtocol(ProtocolFactory.java:85)
at org.apache.nutch.fetcher.FetcherThread.run(FetcherThread.java:285)
I get the above result while running Nutch 1.13 with Solr 6.6.0. The command I used is:
bin/crawl -i -D solr.server.url=http://myip/solr/nutch/ urls/ crawl 2
Below is the plugin section in my nutch-site.xml:
<name>plugin.includes</name>
<value>
protocol-(http|httpclient)|urlfilter-regex|parse-(html)|index-(basic|anchor)|indexer-solr|query-(basic|site|url)|response-(json|xml)|summary-basic|scoring-opic|urlnormalizer-(pass|regex|basic)
</value>
Below are the contents of my plugins directory:
[root@localhost apache-nutch-1.13]# ls plugins
creativecommons index-more nutch-extensionpoints protocol-file scoring-similarity urlnormalizer-ajax
feed index-replace parse-ext protocol-ftp subcollection urlnormalizer-basic
headings index-static parsefilter-naivebayes protocol-htmlunit tld urlnormalizer-host
index-anchor language-identifier parsefilter-regex protocol-http urlfilter-automaton urlnormalizer-pass
index-basic lib-htmlunit parse-html protocol-httpclient urlfilter-domain urlnormalizer-protocol
indexer-cloudsearch lib-http parse-js protocol-interactiveselenium urlfilter-domainblacklist urlnormalizer-querystring
indexer-dummy lib-nekohtml parse-metatags protocol-selenium urlfilter-ignoreexempt urlnormalizer-regex
indexer-elastic lib-regex-filter parse-replace publish-rabbitmq urlfilter-prefix urlnormalizer-slash
indexer-solr lib-selenium parse-swf publish-rabitmq urlfilter-regex
index-geoip lib-xml parse-tika scoring-depth urlfilter-suffix
index-links microformats-reltag parse-zip scoring-link urlfilter-validator
index-metadata mimetype-filter plugin scoring-opic urlmeta
I'm stuck on this issue. As you can see, I have included both protocol-(http|httpclient), but fetching the URL still fails. Thanks in advance.
NEWER ISSUE: hadoop.log
2017-09-01 14:35:07,172 INFO solr.SolrIndexWriter - SolrIndexer: deleting 1/1 documents
2017-09-01 14:35:07,321 WARN output.FileOutputCommitter - Output Path is null in cleanupJob()
2017-09-01 14:35:07,323 WARN mapred.LocalJobRunner - job_local1176811933_0001
java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.IllegalStateException: Connection pool shut down
at org.apache.http.util.Asserts.check(Asserts.java:34)
at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:169)
at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:202)
at org.apache.http.impl.conn.PoolingClientConnectionManager.requestConnection(PoolingClientConnectionManager.java:184)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:415)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:481)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:149)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:482)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:463)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.commit(SolrIndexWriter.java:191)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:179)
at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:117)
at org.apache.nutch.indexer.CleaningJob$DeleterReducer.close(CleaningJob.java:122)
at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:244)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2017-09-01 14:35:07,679 ERROR indexer.CleaningJob - CleaningJob: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:865)
at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:174)
at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:197)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:208)
I somehow solved the issue. I think the whitespace inside the plugin.includes value in nutch-site.xml was causing the problem. Here is the new plugin.includes section, for others coming here:
<name>plugin.includes</name>
<value>protocol-http|protocol-httpclient|urlfilter-regex|parse-(html)|index-(basic|anchor)|indexer-solr|query-(basic|site|url)|response-(json|xml)|summary-basic|scoring-opic|urlnormalizer-(pass|regex|basic)</value>
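For anyone editing nutch-site.xml directly, the section above would sit inside a full property entry roughly like the following sketch (the description text is mine, not from the original post); the important part is keeping the value on one line with no embedded whitespace:
<property>
  <name>plugin.includes</name>
  <value>protocol-http|protocol-httpclient|urlfilter-regex|parse-(html)|index-(basic|anchor)|indexer-solr|query-(basic|site|url)|response-(json|xml)|summary-basic|scoring-opic|urlnormalizer-(pass|regex|basic)</value>
  <description>Regular expression naming the plugins to include; keep it on a single line.</description>
</property>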

Lily-Solr-HBase Integration

I'm trying to integrate Lily, Solr, and HBase.
I'm using the Hortonworks distribution (version 2.4), Solr 5.2.1, and Lily 2.0.
I'm following the Lily documentation from the official link: http://docs.ngdata.com/lily-docs-2_0/414-lily/432-lily.html
But while running the documented steps to start the Lily server, I get the exception below.
[root@BAN-SDU-OVS2-VM7 lily-2.0]# bin/lily-server
[INFO ][11:32:41,606][main ] org.kauriproject.runtime.info - Starting the Kauri Runtime.
[INFO ][11:32:42,163][main ] org.kauriproject.runtime.info - Reading module configurations of 12 modules.
[INFO ][11:32:43,152][main ] org.kauriproject.runtime.info - Starting the modules.
[INFO ][11:32:43,162][main ] org.kauriproject.runtime.info - Starting module pluginregistry - /root/lily-2.0/lily-2.0/lily-2.0/lib/org/lilyproject/lily-pluginregistry-impl/2.0/lily-pluginregistry-impl-2.0.jar
[INFO ][11:32:44,203][main ] org.kauriproject.runtime.info - Starting module general - /root/lily-2.0/lily-2.0/lily-2.0/lib/org/lilyproject/lily-general-module/2.0/lily-general-module-2.0.jar
org.kauriproject.runtime.KauriRTException: Error constructing module defined at /root/lily-2.0/lily-2.0/lily-2.0/lib/org/lilyproject/lily-general-module/2.0/lily-general-module-2.0.jar
at org.kauriproject.runtime.module.build.ModuleBuilder.buildInt(ModuleBuilder.java:152)
at org.kauriproject.runtime.module.build.ModuleBuilder.build(ModuleBuilder.java:55)
at org.kauriproject.runtime.KauriRuntime.start(KauriRuntime.java:240)
at org.kauriproject.runtime.cli.KauriRuntimeCli.run(KauriRuntimeCli.java:292)
at org.kauriproject.runtime.cli.KauriRuntimeCli.main(KauriRuntimeCli.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.kauriproject.launcher.RuntimeCliLauncher.run(RuntimeCliLauncher.java:79)
at org.kauriproject.launcher.RuntimeCliLauncher.launch(RuntimeCliLauncher.java:58)
at org.kauriproject.launcher.RuntimeCliLauncher.main(RuntimeCliLauncher.java:54)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'hbaseConfiguration' defined in KAURI-INF/spring/services.xml in /root/lily-2.0/lily-2.0/lily-2.0/lib/org/lilyproject/lily-general-module/2.0/lily-general-module-2.0.jar: Instantiation of bean failed; nested exception is org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.lilyproject.server.modules.general.HadoopConfigurationFactoryImpl]: Constructor threw exception; nested exception is org.apache.hadoop.hbase.client.NoServerForRegionException: Unable to find region for after 1 tries.
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:254)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:925)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:835)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:440)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory$1.run(AbstractAutowireCapableBeanFactory.java:409)
at java.security.AccessController.doPrivileged(Native Method)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:380)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:264)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:261)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:185)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:164)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:429)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:728)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:380)
at org.kauriproject.runtime.module.build.ModuleBuilder.buildInt(ModuleBuilder.java:89)
... 11 more
Caused by: org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.lilyproject.server.modules.general.HadoopConfigurationFactoryImpl]: Constructor threw exception; nested exception is org.apache.hadoop.hbase.client.NoServerForRegionException: Unable to find region for after 1 tries.
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:115)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:87)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:248)
... 26 more
Caused by: org.apache.hadoop.hbase.client.NoServerForRegionException: Unable to find region for after 1 tries.
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:914)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:820)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:788)
at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:249)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:213)
at org.lilyproject.server.modules.general.HadoopConfigurationFactoryImpl.waitOnHBase(HadoopConfigurationFactoryImpl.java:67)
at org.lilyproject.server.modules.general.HadoopConfigurationFactoryImpl.<init>(HadoopConfigurationFactoryImpl.java:53)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:100)
... 28 more
Startup failed. Will try to shutdown and exit.
[INFO ][11:33:45,632][main ] org.kauriproject.runtime.info - Shutting down the modules.
[INFO ][11:33:45,633][main ] org.kauriproject.runtime.info - Stopping the restservice manager.
Can anyone help with how to fix this so I can proceed?
Thanks in advance.
