Renaming package causes crash with java.lang.UnsatisfiedLinkError: No implementation found

I tried to rename my package and now the app crashes at startup. This is the error message; can someone please help and tell me how I can fix it?
FATAL EXCEPTION: SDLThread
Process: com.MYAPP.demo, PID: 3967
java.lang.UnsatisfiedLinkError: No implementation found for void com.MYAPP.demo.MainActivity.initEnv() (tried Java_com_MYAPP_demo_MainActivity_initEnv and Java_com_MYAPP_demo_MainActivity_initEnv__)
at com.MYAPP.demo.MainActivity.initEnv(Native Method)
at com.MYAPP.demo.MainActivity._initEnv(MainActivity.java:71)
at org.libsdl.app.SDLMain.run(SDLActivity.java:1679)
at java.lang.Thread.run(Thread.java:919)

Kylin Build Cube sometimes fails at "#19 Step Name: Hive Cleanup" with java.lang.RuntimeException: Failed to read kylin_hive_conf.xml

The error occurs intermittently, and after restarting Kylin (kylin.sh stop and then kylin.sh start), it finds the conf dir location and passes this step.
I am using Kylin version "2.6.2", and KYLIN_CONF="/opt/kylin/conf" is already set correctly.
The error messages vary; I have encountered the following:
1.
java.lang.RuntimeException: Failed to read kylin_hive_conf.xml at '/opt/apache-kylin-2.6.2-bin-hadoop3/bin/meta/kylin_hive_conf.xml'
at org.apache.kylin.common.util.SourceConfigurationUtil.loadXmlConfiguration(SourceConfigurationUtil.java:88)
at org.apache.kylin.common.util.SourceConfigurationUtil.loadHiveConfiguration(SourceConfigurationUtil.java:61)
at org.apache.kylin.common.util.HiveCmdBuilder.<init>(HiveCmdBuilder.java:48)
at org.apache.kylin.source.hive.GarbageCollectionStep.cleanUpIntermediateFlatTable(GarbageCollectionStep.java:63)
at org.apache.kylin.source.hive.GarbageCollectionStep.doWork(GarbageCollectionStep.java:49)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:71)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:114)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2.
java.lang.RuntimeException: Failed to read kylin_hive_conf.xml at '/opt/apache-kylin-2.6.2-bin-hadoop3/bin/meta/kylin_hive_conf.xml'
3.
java.lang.RuntimeException: Failed to read kylin_hive_conf.xml at '/opt/apache-kylin-2.6.2-bin-hadoop3/conf/meta/kylin_hive_conf.xml'
Can anyone kindly help me find the root cause and fix this problem?
Thanks in advance.
I hope you have already solved the issue. I encountered the same problem and investigated it.
Refer to https://github.com/apache/kylin/blob/kylin-2.6.2/engine-mr/src/main/java/org/apache/kylin/engine/mr/common/AbstractHadoopJob.java#L481
When we use MapReduce, KYLIN_CONF will be set to a different folder:
System.setProperty(KylinConfig.KYLIN_CONF, metaDir.getAbsolutePath());
I think that to work around it, we have to create symlinks for all the XML configuration files.
Try checking your Kylin log:
cat YOUR_PATH/apache-kylin-2.6.3-bin-hbase1x/logs/kylin.log | grep "The absolute path"
You will probably see a result like:
2019-10-14 23:47:04,438 INFO [LocalJobRunner Map Task Executor #0] common.AbstractHadoopJob:482 : The absolute path for meta dir is /SOME_FOLDER/meta
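A sketch of that symlink workaround, with the meta locations taken from the error messages above. The mktemp-based default and the touch stand-in file are only there so the commands can be tried safely end-to-end: point KYLIN_HOME at your real install and drop the touch line.

```shell
# Default to a scratch dir so this sketch is safe to run as-is; override
# KYLIN_HOME with your real installation path.
KYLIN_HOME="${KYLIN_HOME:-$(mktemp -d)/apache-kylin-2.6.2-bin-hadoop3}"
mkdir -p "$KYLIN_HOME/conf" "$KYLIN_HOME/bin/meta" "$KYLIN_HOME/conf/meta"
# Stand-in for the real config file (remove this line on a real install):
[ -f "$KYLIN_HOME/conf/kylin_hive_conf.xml" ] || touch "$KYLIN_HOME/conf/kylin_hive_conf.xml"
# Link the real config into both meta locations the failed steps tried to read:
ln -sf "$KYLIN_HOME/conf/kylin_hive_conf.xml" "$KYLIN_HOME/bin/meta/kylin_hive_conf.xml"
ln -sf "$KYLIN_HOME/conf/kylin_hive_conf.xml" "$KYLIN_HOME/conf/meta/kylin_hive_conf.xml"
```

Because the meta dir that MapReduce resolves can vary (see the "The absolute path for meta dir" log line above), check your own log first and link into whatever path it reports.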

solr.DirectUpdateHandler2 failed to instantiate

I have 4 SOLR indexes running in a single-node Hadoop environment.
The VM ran out of space, so I freed some space on the OS, but after I restarted the VM the indexes would not start.
The VM where I am running SOLR has 64 GB of RAM.
I am getting a solr.DirectUpdateHandler2 failed to instantiate error.
I need help with this, as my system is no longer working due to this issue and the backup I have is old.
This is the error:
null:org.apache.solr.common.SolrException: SolrCore 'EMAIL_DOMAINS_shard1_replica1' is not available due to init failure: Error Instantiating Update Handler, solr.DirectUpdateHandler2 failed to instantiate org.apache.solr.update.UpdateHandler
at org.apache.solr.core.CoreContainer.getCore(CoreContainer.java:752)
at org.apache.solr.servlet.SolrDispatchFilter.checkProps(SolrDispatchFilter.java:768)
at org.apache.solr.servlet.SolrDispatchFilter.getCoreByCollection(SolrDispatchFilter.java:742)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:325)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:211)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.solr.servlet.SolrHadoopAuthenticationFilter$2.doFilter(SolrHadoopAuthenticationFilter.java:394)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:589)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:291)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:552)
at org.apache.solr.servlet.SolrHadoopAuthenticationFilter.doFilter(SolrHadoopAuthenticationFilter.java:399)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.solr.servlet.HostnameFilter.doFilter(HostnameFilter.java:86)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:620)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.solr.common.SolrException: Error Instantiating Update Handler, solr.DirectUpdateHandler2 failed to instantiate org.apache.solr.update.UpdateHandler
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:893)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:663)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:498)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:262)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:256)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
... 1 more
Caused by: org.apache.solr.common.SolrException: Error Instantiating Update Handler, solr.DirectUpdateHandler2 failed to instantiate org.apache.solr.update.UpdateHandler
at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:581)
at org.apache.solr.core.SolrCore.createUpdateHandler(SolrCore.java:637)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:855)
... 8 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:567)
... 10 more
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.hadoop.hdfs.DFSOutputStream.start(DFSOutputStream.java:2359)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForAppend(DFSOutputStream.java:1934)
at org.apache.hadoop.hdfs.DFSClient.callAppend(DFSClient.java:1853)
at org.apache.hadoop.hdfs.DFSClient.append(DFSClient.java:1878)
at org.apache.hadoop.hdfs.DFSClient.append(DFSClient.java:1871)
at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:329)
at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:325)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.append(DistributedFileSystem.java:325)
at org.apache.hadoop.fs.FileSystem.append(FileSystem.java:1172)
at org.apache.solr.update.HdfsTransactionLog.<init>(HdfsTransactionLog.java:93)
at org.apache.solr.update.HdfsUpdateLog.init(HdfsUpdateLog.java:200)
at org.apache.solr.update.UpdateHandler.<init>(UpdateHandler.java:136)
at org.apache.solr.update.UpdateHandler.<init>(UpdateHandler.java:94)
at org.apache.solr.update.DirectUpdateHandler2.<init>(DirectUpdateHandler2.java:100)
... 15 more
The exception clearly states that you have an Out Of Memory (OOM) problem:
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
Given that you already ran out of disk space, please double-check that you have freed enough of it.
After this, even though there are many possible causes for an OOM, the common reason is that your JVM doesn't have enough memory, so double-check your heap size configuration (i.e. -Xms and -Xmx).
If freeing more disk space and allocating more heap doesn't resolve your problem, I also suggest that you:
check, and if needed increase, the user limits (ulimit -a): max user threads, max open processes and max open files;
check the system-wide thread settings and, likewise, raise the limit if you need to.
On the other hand, it rarely happens that you have allocated so much heap or so many resources that the operating system has no room left to run your application.
In conclusion, just try to find the right balance between the allocated resources and your application's needs.

Unable to set password protection for PDF file using Java (iText jar used)

PdfWriter writer = PdfWriter.getInstance(document,
        new FileOutputStream("C:\\Documents and Settings\\abc\\Desktop\\Test.pdf"));
writer.setEncryption("123".getBytes(), "123".getBytes(),
        PdfWriter.ALLOW_PRINTING, PdfWriter.ENCRYPTION_AES_128);
I am using itextpdf-5.4.4.jar.
When executing the setEncryption() method I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/bouncycastle/asn1/ASN1Primitive
Please suggest some solutions.
If I use itextpdf-5.2.1.jar, the above code works without any exceptions.
itextpdf 5.2.1 depends on the BouncyCastle library bctsp-jdk15 1.46, while itextpdf 5.4.4 depends on two BouncyCastle libraries: bcpkix-jdk15on 1.49
and bcprov-jdk15on 1.49. ASN1Primitive was only introduced to bcprov-jdk15on in version 1.47.
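Since the missing class lives in the BouncyCastle provider jar, the fix is to put the matching BouncyCastle jars on the classpath alongside iText. A sketch of the Maven coordinates consistent with the versions named above (the group/artifact ids are the standard BouncyCastle and iText ones; verify the exact versions against the iText 5.4.4 POM):

```xml
<dependency>
  <groupId>com.itextpdf</groupId>
  <artifactId>itextpdf</artifactId>
  <version>5.4.4</version>
</dependency>
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcprov-jdk15on</artifactId>
  <version>1.49</version>
</dependency>
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcpkix-jdk15on</artifactId>
  <version>1.49</version>
</dependency>
```

If you manage jars by hand instead of Maven, download bcprov-jdk15on-1.49.jar and bcpkix-jdk15on-1.49.jar and add them next to itextpdf-5.4.4.jar.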

Taskqueue error while testing locally

I've been developing with App Engine for the past 6 months now and haven't had too many issues, but I just switched my box from an ArchLinux setup to a Linux Mint install and am now having a weird issue I haven't seen before. I'm using the 1.8.0 Python SDK. I'm trying to dispatch tasks onto a named queue and am getting the following traceback:
ERROR 2013-06-05 14:53:33,762 taskqueue_stub.py:1892] Failed to dispatch task
Traceback (most recent call last):
File "/opt/google_appengine/google/appengine/api/taskqueue/taskqueue_stub.py", line 1890, in ExecuteTask
'0.1.0.2')
File "/opt/google_appengine/google/appengine/tools/devappserver2/dispatcher.py", line 532, in add_request
headers_dict['Host'], urlparse.urlsplit(relative_url).path)
File "/opt/google_appengine/google/appengine/tools/devappserver2/dispatcher.py", line 580, in _resolve_target
raise request_info.ServerDoesNotExistError(prefix)
ServerDoesNotExistError: 15.bqdownloader-test
ERROR 2013-06-05 14:53:33,762 taskqueue_stub.py:1965] An error occured while sending the task "task16" (Url: "/_ah/warmup") in queue "download-queue". Treating as a task error.
My backends.py contains the following section that pertains:
- name: bqdownloader-test
class: B1
instances: 20
options: dynamic
I've looked around and have found one mention of this error, but no fix. Has anyone encountered this? I don't really know what to do to address this issue. Thanks in advance.

Getting sonar-runner to work with C code

I am trying to use Sonar. The server is up and running (using the default embedded Derby database), serving a web page with no projects. I was then planning to use sonar-runner to analyze the source code, but I cannot get it to work even with the simplest hello-world program. Any hints on what is missing or what I am doing wrong?
(hlovdal) localhost:/work/sonar>wget -q http://repository.codehaus.org/org/codehaus/sonar-plugins/sonar-runner/1.1/sonar-runner-1.1.zip
(hlovdal) localhost:/work/sonar>unzip sonar-runner-1.1.zip
Archive: sonar-runner-1.1.zip
creating: sonar-runner-1.1/
...
inflating: sonar-runner-1.1/lib/sonar-runner.jar
(hlovdal) localhost:/work/sonar>export SONAR_RUNNER_HOME=/work/sonar/sonar-runner-1.1
(hlovdal) localhost:/work/sonar>$SONAR_RUNNER_HOME/bin/sonar-runner -h
usage: sonar-runner [options]
Options:
-h,--help Display help information
-X,--debug Produce execution debug output
-D,--define <arg> Define property
(hlovdal) localhost:/work/sonar>
But trying to run sonar-runner fails:
(hlovdal) localhost:/work/sonar>cd helloworld/
(hlovdal) localhost:/work/sonar/helloworld>cat main.c
#include <stdio.h>
#include <stdlib.h>
int main(int argc, char *argv[])
{
printf("hello world\n");
return EXIT_SUCCESS;
}
(hlovdal) localhost:/work/sonar/helloworld>grep -v -E '^#|^$' sonar-runner.properties
sonar.projectKey=helloworld
sonar.projectName=Hello world
sonar.projectVersion=1.0
sonar.language=c
sources=/work/sonar/helloworld
sonar.sourceEncoding=UTF-8
(hlovdal) localhost:/work/sonar/helloworld>$SONAR_RUNNER_HOME/bin/sonar-runner
Runner settings: /work/sonar/sonar-runner-1.1/conf/sonar-runner.properties
Runner version: 1.1
Server: http://localhost:9000
Work directory: /work/sonar/helloworld/.sonar
[INFO] Database dialect class org.sonar.jpa.dialect.Derby
[INFO] Initializing Hibernate
Exception in thread "main" org.sonar.batch.bootstrapper.BootstrapException: org.picocontainer.PicoLifecycleException: PicoLifecycleException: method 'public void org.sonar.batch.index.DefaultIndex.start()', instance 'org.sonar.batch.index.DefaultIndex#3d57211f, java.lang.RuntimeException: wrapper
at org.sonar.runner.Runner.delegateExecution(Runner.java:155)
at org.sonar.runner.Runner.execute(Runner.java:58)
at org.sonar.runner.Main.main(Main.java:52)
Caused by: org.picocontainer.PicoLifecycleException: PicoLifecycleException: method 'public void org.sonar.batch.index.DefaultIndex.start()', instance 'org.sonar.batch.index.DefaultIndex#3d57211f, java.lang.RuntimeException: wrapper
at org.picocontainer.monitors.NullComponentMonitor.lifecycleInvocationFailed(NullComponentMonitor.java:77)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.monitorAndThrowReflectionLifecycleException(ReflectionLifecycleStrategy.java:132)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:115)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.start(ReflectionLifecycleStrategy.java:89)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.start(AbstractInjectionFactory.java:84)
at org.picocontainer.behaviors.AbstractBehavior.start(AbstractBehavior.java:169)
at org.picocontainer.behaviors.Stored$RealComponentLifecycle.start(Stored.java:132)
at org.picocontainer.behaviors.Stored.start(Stored.java:110)
at org.picocontainer.DefaultPicoContainer.potentiallyStartAdapter(DefaultPicoContainer.java:996)
at org.picocontainer.DefaultPicoContainer.startAdapters(DefaultPicoContainer.java:989)
at org.picocontainer.DefaultPicoContainer.start(DefaultPicoContainer.java:746)
at org.sonar.batch.bootstrap.Module.start(Module.java:88)
at org.sonar.batch.bootstrap.BootstrapModule.doStart(BootstrapModule.java:96)
at org.sonar.batch.bootstrap.Module.start(Module.java:89)
at org.sonar.batch.Batch.execute(Batch.java:74)
at org.sonar.runner.Launcher.executeBatch(Launcher.java:60)
at org.sonar.runner.Launcher.execute(Launcher.java:53)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.sonar.runner.Runner.delegateExecution(Runner.java:152)
... 2 more
Caused by: java.lang.RuntimeException: wrapper
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.monitorAndThrowReflectionLifecycleException(ReflectionLifecycleStrategy.java:130)
... 22 more
Caused by: java.lang.NullPointerException
at org.sonar.api.resources.Resource.hashCode(Resource.java:242)
at java.util.HashMap.put(HashMap.java:389)
at org.sonar.batch.index.DefaultIndex.doStart(DefaultIndex.java:98)
at org.sonar.batch.index.DefaultIndex.start(DefaultIndex.java:93)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110)
... 21 more
(hlovdal) localhost:/work/sonar/helloworld>
The projectKey does not look correct; it should be something like xxx:yyy.
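Under that assumption, a sketch of the corrected sonar-runner.properties follows. The org.example group prefix is made up for illustration; the bare sources property matches the Sonar Runner 1.1 shown in the question, while later versions use sonar.sources instead.

```properties
# Hypothetical corrected properties; "org.example" is an illustrative group id.
sonar.projectKey=org.example:helloworld
sonar.projectName=Hello world
sonar.projectVersion=1.0
sonar.language=c
sources=/work/sonar/helloworld
sonar.sourceEncoding=UTF-8
```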
