We recently got a Pepper robot and are actively trying to develop apps for it. Lately we have been encountering the same error more and more often - error 720 (indicated by the shoulder LEDs blinking yellow).
When we push the button we hear the following description of the error:
Description: Some of my motors are getting hot in my Neck. I will need to rest soon.
Solution based on the documentation: Robot motors are getting hot or are already too hot to be able to move. Put the robot in crouch, unstiffened, and wait a few tens of minutes to let its motors cool down before you can use it again.
Is it normal that we encounter this error several times a day? When we push the button behind the tablet, the error goes away and we can continue our work.
The room temperature is about 22-24 °C.
Is there something we can do to prevent this error from occurring?
Best Regards.
Also check that you don't have objects near Pepper (within about 1 m to 1.5 m around the robot). If you do, Pepper might consider those objects as potential kids and constantly try to look down to see them. Over time this heats up the neck, because the head's weight creates torque on the neck.
Constant movement heats up the motors. If you do not need the robot to be alive or moving around while you develop, just put it in the resting position.
http://doc.aldebaran.com/2-5/naoqi/motion/control-stiffness-api.html?highlight=relax#ALMotionProxy::rest
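For example, here is a minimal Python sketch of putting the robot to rest from a script (the IP address is a placeholder; inside a Choregraphe box you can create the ALMotion proxy without it):

from naoqi import ALProxy

motion = ALProxy("ALMotion", "<pepper-ip>", 9559)  # replace with your robot's address
motion.rest()     # go to a safe resting posture and remove stiffness so the motors can cool
# ... later, when you need the robot to move again:
motion.wakeUp()   # stand back up and restore stiffness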
If you encounter this error too often, the neck motor might be damaged.
You can contact support using this form: https://www.ald.softbankrobotics.com/en/about-us/contact . They will tell you if your Pepper needs repair.
We are trying to move Pepper around on a floor using the ALNavigation SLAM APIs. We have created a map using the ALNavigation explore() method. The application works most of the time, but sometimes Pepper stops and the application crashes partway through because of some safeguard feature.
We are using ALNavigation:navigateToInMap to move Pepper around the map.
Here are some logs:
[W] 15:01:26 ALMotion.OmniWheelFollowPath: Stitch failed. Stopping path:
["Circle", [0.436987877, 11.431554794], [2.869375944, 11.368105888],
-0.046996359]
[W] 15:01:26 ALTouch.TouchManager: My Base is touched. Reasons: Wheel.
[W] 15:01:26 AutonomousLife: Robot was moved!
[W] 15:01:26 AutonomousLife: Robot moved, must enter safeguard state. Will
immediately re-enter solitary state.
Is there any way to fix this issue, or is it a hardware issue with Pepper's wheels, or something wrong in the code? I am simply calling navigateToInMap after localizing the robot; this works most of the time, but the issue is getting more and more frequent.
Thanks
Pepper has a system to detect whether she has been pushed, and (with current versions) there are sometimes false positives - especially on a floor with irregularities, or when Pepper is moving quickly or accelerating abruptly.
Some solutions:
Make Pepper move / accelerate a bit more slowly
Have a system to relaunch the application as soon as it exits safeguard (for example with a ShouldBeExploring or shouldBeNavigating trigger condition) - in my experience, when this false positive happens the robot is in safeguard for a very short time, maybe less than a second.
I recommend the second solution, because that is usually what you want to do anyway when a safeguard that is not a false positive is triggered - when someone bumps into Pepper, shakes her, etc.
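As a rough illustration of the second option, here is a minimal Python sketch that simply retries navigateToInMap after the robot drops out of safeguard (the target pose and the retry count are placeholders, and whether navigateToInMap reports success through its return value should be checked against your NAOqi version's documentation):

import time
from naoqi import ALProxy

navigation = ALProxy("ALNavigation", "<pepper-ip>", 9559)  # replace with your robot's address
target = [2.0, 1.0]            # [x, y] goal in the map frame, placeholder values

for attempt in range(5):
    try:
        reached = navigation.navigateToInMap(target)
    except RuntimeError:
        reached = False        # the call can be interrupted when the safeguard triggers
    if reached:
        break
    time.sleep(1.0)            # in practice the safeguard episode lasts well under a second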
I'm trying to use the whole body balancer made by Aldebaran to make my NAO dance more steadily and be less dependent on how level the surface is, i.e. to tolerate a small tilt.
I've succeeded in requesting NAO to go to balance, but enabling the balance constraint gives me nothing. For testing, I designed an ill-balanced timeline that makes the robot fall down when the body balancer is disabled and that should keep the robot stable as long as it is enabled - that is what Aldebaran declares as a use case. However, the robot still falls down (I keep him vertical with my hand) and then goes to balance due to ALMotionProxy::wbGoToBalance. It is strange, though, that he reaches balance in one rapid move, rather than over the 3.0 seconds that I requested.
My guess now is: the whole body balancer needs some resources (joints) that are actually used by my timeline (it uses all the joints). Is that correct? Can anyone confirm or deny this?
The code I use is essentially this:
from naoqi import ALProxy  # not needed inside a Choregraphe box, where ALProxy is already available
self.proxy = ALProxy("ALMotion")
self.proxy.wbEnable(True)                           # enable whole body motion
self.proxy.wbFootState("Fixed", "LLeg")             # left foot is the fixed support foot
self.proxy.wbFootState("Free", "RLeg")              # right foot is free to move
self.proxy.wbEnableBalanceConstraint(True, "LLeg")  # constrain balance to the left leg's support polygon
I use this code inside a box in Choregraphe 1.14 and it is definitely called (it leaves log messages that I stripped out here). It definitely raises no exceptions; I check for and log them.
Yes, I think that you must remove some joints from your timeline.
The test is easy: disable, for instance, the ankles in your timeline and see the results.
Disabling some joints is easy:
open the timeline
click the small pen beside the "Motion" caption on the left
then uncheck some circles (for instance the LAnkleRoll circle): those joints' animation will be disabled
retest
I've got a little side project going on using SDL2/SDL_mixer and a couple of other sound libraries. I've been trying for a while now to synchronize my audio and video but haven't been able to get anywhere near success. I'm new to this stuff, so forgive the poor man's logic and coding. At first I thought to call SDL_Delay(30) after every frame, and then tried a few other numbers in that range. Not quite right. Then I tried doing it with ticks, where I would get the difference between current_ticks and last_ticks and, if the delta was <= 30, set the delay to 30 - delta. Still not quite right (by far). I'm hoping someone on here with more experience might guide me in the right direction. As for the video, it's a visualizer, of course - seems like a popular beginner's project.
The basic way you synchronize audio and video is that you choose one to use as a timer source and present the other according to that timer. The easiest is generally audio, but because it's generally buffered ahead, you need some method of measuring what time in the audio stream is actually coming out of the speakers. Once you get that, it's just a matter of waiting until the audio reaches the right time for the next video frame and displaying it.
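The same idea in code: below is a minimal Python sketch of just the pacing logic (your project is C/SDL, so treat this as an illustration only; the audio clock here is stubbed with a wall clock so the sketch runs on its own, whereas in a real player audio_clock() would return "samples actually played / sample rate", with anything still sitting in the output buffers subtracted).

import time

START = time.monotonic()

def audio_clock():
    # Placeholder: replace with the playback position reported by your audio library.
    return time.monotonic() - START

def present_frames(frame_times, render):
    # Show each frame when the audio clock reaches its timestamp.
    for t in frame_times:
        delay = t - audio_clock()
        if delay > 0:
            time.sleep(delay)        # wait until the audio catches up to this frame
        elif delay < -0.05:
            continue                 # more than 50 ms late: drop the frame instead
        render(t)

fps = 30
timestamps = [i / fps for i in range(90)]     # 3 seconds worth of frame times
present_frames(timestamps, lambda t: None)    # replace the lambda with real drawing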
So far I have been able to create an application where the Kinect sensor is in one place. I have used speech recognition, EmguCV (OpenCV) and AForge.NET to help me process an image, and learn and recognize objects. It all works fine, but there is always scope for improvement, and I have some problems: [ignore the first three; I want the answer to the fourth]
The frame rate is horrible. It's like 5 fps even though it should be around 30 fps. (This is WITHOUT all the processing.) My application runs fine, it gets color as well as depth frames from the camera and displays them, yet the frame rate is still bad. The samples run great, around 25 fps, and even though I ran the exact same code from the samples it just won't budge. :-( [There is no need for code, please tell me the possible problems.]
I would like to create a little robot on which the Kinect and my laptop will be mounted. I tried using the Mindstorms kit, but the low-torque motors don't do the trick. Please tell me how I can achieve this.
How do I supply power on board? I know that the Kinect uses 12 volts for the motor, but it gets that from an AC adapter. [I would not like to cut my cable and replace it with a 12-volt battery.]
The biggest question: how in this world will it navigate? I have done A* and flood-fill algorithms. I have read this paper like a thousand times and got nothing. I have the navigation algorithm in my mind, but how on earth will the robot localize itself? [It should not use GPS or any other kind of sensor, just its eyes, i.e. the Kinect.]
Any help would be awesome. I am a newbie, so please don't expect me to know everything. I have been searching the internet for 2 weeks with no luck.
Thanks A lot!
Localisation is a tricky task, as it depends on having prior knowledge of the environment in which your robot will be placed (i.e. a map of your house). While algorithms exist for simultaneous localisation and mapping, they tend to be domain-specific and as such not applicable to the general case of placing a robot in an arbitrary location and having it map its environment autonomously.
However, if your robot does have a rough (probabilistic) idea of what its environment looks like, Monte Carlo localisation is a good choice. On a high level, it goes something like:
Firstly, the robot should make a large number of random guesses (called particles) as to where it could possibly be within its known environment.
With each update from the sensor (i.e. after the robot has moved a short distance), it adjusts the probability that each of its random guesses is correct using a statistical model of its current sensor data. This can work especially well if the robot takes 360° sensor measurements, but this is not completely necessary.
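To make the loop concrete, here is a tiny 1-D Python sketch of the idea (the corridor length, landmark positions and noise values are all made up for illustration); it also shows the resampling step that a full Monte Carlo localiser performs after reweighting:

import math
import random

WORLD = 10.0             # corridor length in metres (made-up map)
LANDMARKS = [2.0, 7.5]   # assumed known landmark positions

def sense(x):
    # Distance to the nearest landmark, with a little sensor noise.
    return min(abs(x - l) for l in LANDMARKS) + random.gauss(0, 0.1)

def likelihood(particle, measurement, sigma=0.2):
    # How well does this guess explain the measurement?
    expected = min(abs(particle - l) for l in LANDMARKS)
    return math.exp(-((measurement - expected) ** 2) / (2 * sigma ** 2))

def mcl_step(particles, moved, measurement):
    # 1. Motion update: shift every guess by the odometry, plus noise.
    particles = [p + moved + random.gauss(0, 0.05) for p in particles]
    # 2. Reweight each guess with the sensor model.
    weights = [likelihood(p, measurement) for p in particles]
    # 3. Resample: keep likely guesses, drop unlikely ones.
    return random.choices(particles, weights=weights, k=len(particles))

particles = [random.uniform(0, WORLD) for _ in range(500)]   # initial random guesses
true_x = 1.0
for _ in range(20):
    true_x += 0.3                                   # the robot drives 0.3 m
    particles = mcl_step(particles, 0.3, sense(true_x))
print("estimated:", sum(particles) / len(particles), "actual:", true_x)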
This lecture by Andrew Davison at Imperial College London gives a good overview of the mathematics involved. (The rest of the course will most likely be very interesting to you as well, given what you are trying to create). Good luck!
I recently wrote a program to display data on a set of LCD TVs. The data is for the most part static, with the exception of a refresh from the database every 60 seconds. I know screen burn isn't as big an issue with LCDs as with plasma TVs; however, I would like to try to minimize the risk. These screens will be running for 8 hours a day.
I programmed a small square that bounces around the screens on top of all the data. The square constantly changes colors as it goes. I did verify that it hits every pixel on the screen. It completes a "cycle" every couple of minutes.
Is that sufficient to mitigate the risk of burn in? Or do I need to make something more complicated?
Discard all that effort altogether; LCDs do not suffer from that problem at all.
And that square is probably annoying, and even if it were to do any good, it would have to stay on the screen for a longer period of time.
And I wouldn't worry; 8 hours per day is normal. If you are paranoid, you can move the window or reposition the text every so often.
That is not exactly true. While LCDs don't suffer from burn-in in the strict sense, they do have a similar problem (image persistence), especially when used as a computer screen or left on a TV guide. An image will stick if left on the screen long enough; it usually goes away, but it can be permanent.
The program you are describing sounds like it would work just fine.