I am getting an error while installing features:
error executing command: error restarting bundles
Initially, it worked fine for some of the features but then it suddenly started throwing this error.
Any suggestions on this would be appreciated.
Cleaning Karaf can resolve the issue. You can clean Karaf by running the following command:
./karaf clean
When OpenDaylight moved to Karaf 4 there were problems identified with installing features in the Karaf shell one after the other. I think you are hitting that problem.
You can try listing all the features you want in the featuresBoot variable of the etc/org.apache.karaf.features.cfg file.
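For example, featuresBoot is a comma-separated list; the entries before the added feature below are only illustrative, so keep whatever your distribution already ships and append the feature you want to boot with:
featuresBoot = config,standard,ssh,management,odl-l2switch-switch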
You may also have some success installing on the Karaf shell with the --no-auto-refresh flag, like this:
feature:install --no-auto-refresh odl-l2switch-switch
Also, as sridhar reddy noted, if you use "karaf clean" to start karaf it will
wipe the data/ folder (and more) so that old loaded features will not come
back in on startup and you will start "clean".
I am using JBoss Fuse 6.3, and one of my features throws a TimeoutException when I try to uninstall it (details below). This feature has one bundle which communicates with Service Cloud using the Camel streaming API.
I wanted to see what Karaf is uninstalling when running features:uninstall -v <feature>, but Karaf shows only
"Uninstalling feature <feature>". Any suggestion on how we can see what Karaf is trying to uninstall and where it gets stuck before the timeout exception?
Any idea or suggestion will be highly appreciated. Thank you so much in advance.
java.util.concurrent.TimeoutException
at java.util.concurrent.AbstractExecutorService.doInvokeAny(AbstractExecutorService.java:184)[:1.8.0_144]
at java.util.concurrent.AbstractExecutorService.invokeAny(AbstractExecutorService.java:225)[:1.8.0_144]
at org.apache.aries.blueprint.utils.threading.ScheduledExecutorServiceWrapper$4.call(ScheduledExecutorServiceWrapper.java:184)
at org.apache.aries.blueprint.utils.threading.ScheduledExecutorServiceWrapper$15.call(ScheduledExecutorServiceWrapper.java:452)
at org.apache.aries.blueprint.utils.threading.RWLock.runReadOperation(RWLock.java:35)
at org.apache.aries.blueprint.utils.threading.ScheduledExecutorServiceWrapper.runUnlessShutdown(ScheduledExecutorServiceWrapper.java:447)
at org.apache.aries.blueprint.utils.threading.ScheduledExecutorServiceWrapper.invokeAny(ScheduledExecutorServiceWrapper.java:178)
at org.apache.aries.blueprint.container.BlueprintEventDispatcher.callListener(BlueprintEventDispatcher.java:199)
at org.apache.aries.blueprint.container.BlueprintEventDispatcher.callListeners(BlueprintEventDispatcher.java:189)
at org.apache.aries.blueprint.container.BlueprintEventDispatcher.blueprintEvent(BlueprintEventDispatcher.java:140)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.destroy(BlueprintContainerImpl.java:897)
at org.apache.aries.blueprint.container.BlueprintExtender$3.run(BlueprintExtender.java:325)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)[:1.8.0_144]
at java.util.concurrent.FutureTask.run(FutureTask.java:266)[:1.8.0_144]
I'm getting a strange error while building a project in Xcode 9.4:
Build system information - unexpected service error: The Xcode build system has crashed. Please close and reopen your workspace.
I tried quitting and reopening Xcode but that didn't work. Any solution?
Please clear the derived data folder (located at ~/Library/Developer/Xcode/DerivedData) and restart Xcode.
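If you prefer the terminal, the following one-liner should do the same; close Xcode first (the path is the one mentioned above):
rm -rf ~/Library/Developer/Xcode/DerivedData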
This error usually happens between Xcode major versions. Apple usually claims its new build system is many times faster than the previous one. If you see this error (I saw it when moving from Xcode 9 to the Xcode 10 beta), you can always switch to the legacy build system. Here is how you can do this:
Open 'Workspace Settings' (now called 'Project Settings' if you are using Xcode 10 or later) in the File menu
Change the build system to 'Legacy Build System'
Update for Xcode 13.4.1:
I had this infamous bug today as well. I tried a lot, including cleaning the project, deleting derived data, restarting the Mac, etc.
What finally did the fix is similar to kakaiikaka's answer: I set the workspace settings to "Legacy Build System (Deprecated)" for both the shared and per-user workspace. I tried to build with this, but got an error because I had packages which are not supported.
Restarted Xcode, then changed back the build system. Restarted Xcode again.
Now the crash doesn't happen any more.
Looks like something internal was spoilt and cycling the build system fixed it.
I moved a lot of files all at once between folders, including nested folders, and this error started happening. Nothing I did regarding cleaning, purging derived data, or undoing the move operation helped.
What helped was restoring the previous version of the project file from source control and then re-adding all the applicable new files to it. It was project-file related. Deleting user data inside the project container did not help in my case. So as long as you use source control and can roll back the .xcodeproj, this may be an option.
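If you happen to use git, restoring just the project file can look roughly like this (MyApp.xcodeproj and the commit reference are placeholders for your own):
git checkout <last-good-commit> -- MyApp.xcodeproj/project.pbxproj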
It happened to me when I changed build configuration names. Deleting the Pods folder and Podfile.lock and then running "pod install" fixed the issue.
For Xcode 10.2: delete the Podfile, Podfile.lock, and .xcworkspace; open a terminal and cd into the project directory; run pod init; add the pods you want to the Podfile; run pod install; and open the .xcworkspace. Everything will index, and then you can build.
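A rough sketch of those terminal steps (names and paths are placeholders, and since this recreates the Podfile, note down your pods first):
cd /path/to/YourProject
rm -f Podfile Podfile.lock
rm -rf YourProject.xcworkspace
pod init
# edit the Podfile and re-add the pods you need
pod install
open YourProject.xcworkspace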
I'm trying z.load in Apache Zeppelin as follows:
%dep
z.load("/zeppelin-0.5.6-incubating-bin-all/lplibs/hive/csv-serde-1.0.5-jar-with-dependencies.jar")
I get an ERROR and it says (not sure this is the error):
Must be used before SparkInterpreter (%spark) initialized
Hint: put this paragraph before any Spark code and restart Zeppelin/Interpreter
This Zeppelin paragraph is the first one in my notebook, so I'm not sure what it's complaining about.
Right now I can't check your problem, but you should restart the interpreter (by pushing the restart button) before loading the dependency jar file.
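For example, after restarting the interpreter, run the %dep paragraph first, before any %spark paragraph (z.reset() is optional here; it just clears previously loaded artifacts):
%dep
z.reset()
z.load("/zeppelin-0.5.6-incubating-bin-all/lplibs/hive/csv-serde-1.0.5-jar-with-dependencies.jar")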
There is a chance that the SparkContext has already been started by another notebook, so, as Kangrok mentioned, just restart the Spark interpreter.
But apart from that, why don't you use the latest Zeppelin, in which you don't need to use %dep to load your dependencies? Instead, it can load them from the Interpreter screen.
More details can be found here: https://zeppelin.incubator.apache.org/docs/0.6.0-incubating-SNAPSHOT/manual/dependencymanagement.html
I upgraded today to GAE 1.7.4, and on trying to deploy I see the following error:
Preparing to deploy:
Created staging directory at: 'C:\Users\VSKUMA~1.ST-\AppData\Local\Temp\appcfg4811921061542689032.tmp'
Scanning for jsp files.
Compiling jsp files.
java.lang.RuntimeException: Cannot get the System Java Compiler. Please use a JDK, not a JRE.
I already have the JDK on the build path, and this all was working fine until GAE 1.7.3.
I cannot uninstall the existing JREs for some reasons.
For me at least, forcing Eclipse itself to use a different VM worked. Add, for example:
-vm
C:\Program Files\Java\<jdk1.6.0_38>\bin\javaw.exe
to the first two lines* of the eclipse.ini file and restart Eclipse.
*Thanks to Andre
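For reference, eclipse.ini is order-sensitive: the -vm flag and its path go on two separate lines and must appear before the -vmargs section (everything after -vmargs is passed to the JVM). With an illustrative JDK path it looks roughly like this:
-vm
C:\Program Files\Java\jdk1.6.0_38\bin\javaw.exe
-vmargs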
Finally I uninstalled the JRE manually, and only this helped me to get through this.
Yes @Vik, I had this problem when updating to GAE 1.7.3, and the solution is to reinstall the JRE... but if you are working with Eclipse and this doesn't work either, try reinstalling the whole IDE. (I had to do so, and it worked for me ¬¬)
Greetings,
I am trying to start a Scala/Liftweb project for deployment on Google App Engine. To do this, I need to package it up as a .war using Maven.
However, whenever I run the 'mvn' command, I am met with:
Error opening zip file or JAR manifest missing : /Applications/JRebel/jrebel.jar
Error occurred during initialization of VM
agent library failed to init: instrument
Is there something wrong with my Maven, or do I need JRebel? I see JRebel is not free, which is why I am so surprised.
Thanks!
No, JRebel is definitely not required to run Maven.
As Matt mentioned, JRebel is not required to run Maven. However, ZeroTurnaround does offer a free version that works with Scala. You can get it here:
http://sales.zeroturnaround.com/
As for your error - it indicates you are trying to start the JVM as though you are using JRebel. What is the full Maven command you are running? What is in your MAVEN_OPTS environment variable? If either of them contains something like -noverify -javaagent:/Applications/JRebel/jrebel.jar, then that's your problem.
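A quick way to check (the variable may also be set in ~/.mavenrc or your shell profile):
echo $MAVEN_OPTS
# if it prints a -javaagent:/Applications/JRebel/jrebel.jar entry, clear it for the current shell:
unset MAVEN_OPTS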
One of the reasons for this problem is a blank in the path to jrebel.jar.
Make sure that there is no blank in the path, as in "Program Files".
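For example, on Windows you could point the agent at a copy of the jar in a path without spaces (all paths here are illustrative):
mkdir C:\JRebel
copy "C:\Program Files\JRebel\jrebel.jar" C:\JRebel\jrebel.jar
set MAVEN_OPTS=-noverify -javaagent:C:\JRebel\jrebel.jar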