I can't for the life of me work out why this is happening. I have a class derived from CCLayer, and I am scheduling a method call like so when initialising the class:
//create an update method for keeping track of how long it's been since an animation has played
[self schedule:@selector(playIdleAnimation:)];
And the method is:
//an update method that will play an idle animation after a random period of idleness
-(void) playIdleAnimation:(ccTime) dt {
    //if the user isn't playing an animation, increment the time since the last animation
    if ([bodySprite numberOfRunningActions] == 0) {
        timeSinceLastAnimation += (float)dt;
        //now check to see if we have surpassed the time set to cause an idle animation
        if (timeSinceLastAnimation > (arc4random() % 14) + 8) {
            //reset the cooldown timer
            timeSinceLastAnimation = 0;
            [bodySprite stopAllActions];
            //play the idle animation
            //[bodySprite runAction:[CCAnimate actionWithAnimation:waitAnimation restoreOriginalFrame:NO]];
            NSLog(@"PLAYING IDLE ANIMATION");
        }
    }
    //the player is currently playing an animation, so reset the time since the last animation
    else
        timeSinceLastAnimation = 0;
}
And yet, when I run the program, the console statements show the condition is being passed twice each cooldown:
2012-06-29 09:52:57.667 Test Game[5193:707] PLAYING IDLE ANIMATION
2012-06-29 09:52:57.701 Test Game[5193:707] PLAYING IDLE ANIMATION
2012-06-29 09:53:05.750 Test Game[5193:707] PLAYING IDLE ANIMATION
2012-06-29 09:53:05.851 Test Game[5193:707] PLAYING IDLE ANIMATION
I am trying to fix a bug where the game crashes when I finish playing the idle animation, and I'm certain this has something to do with it.
I don't see where you are unscheduling the selector. I bet that the normal behavior of being called every frame kicks in, and you see it being triggered twice because it takes a frame for the layer to be deallocated.
If you want a one-time method call, do this:
-(void) playIdleAnimation:(ccTime) dt {
    [self unschedule:_cmd];
    // rest of the code here
}
Cocos2d 2.0 has a scheduleOnce method that you can use instead.
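For example, a rough sketch reusing the question's selector (the delay value here is only illustrative):
//cocos2d 2.0: run the selector once after a delay, no manual unschedule needed
[self scheduleOnce:@selector(playIdleAnimation:) delay:(arc4random() % 14) + 8];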
How can I set up a timer when playing a log file? The timer should start when the log file starts, and at a certain event the timer value should be printed to the write window.
There are some built-in functions in CAPL; do you know how they work?
For example timeToElapse.
Thanks
First insert a replay block in your measurement configuration. In the replay block select your log file and uncheck "Start replay on measurement start" if you want to start the replay from CAPL code.
In the following example I bound the procedure to two on key events:
on key 'a' {
    replayStart("ReplayBlockName");
    setTimer(mytimer, mytime);
}
on timer mytimer {
    // on timer event needed so that setTimer function works properly
}
on key 's' {
    write("time to elapse = %d", timeToElapse(mytimer));
}
So basically, hit the key 'a' during the measurement and afterwards key 's' to see how much time is left. Keep in mind that the output depends on how your timer is declared: for a regular timer, timeToElapse returns whole seconds; for an msTimer, it returns milliseconds.
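For example (a sketch with assumed names; only the two declarations matter here):
variables {
    timer   secTimer;    // second-resolution timer
    msTimer msecTimer;   // millisecond-resolution timer
}
on key 'd' {
    setTimer(secTimer, 5);      // elapses after 5 s
    setTimer(msecTimer, 500);   // elapses after 500 ms
    write("to elapse: %d s / %d ms", timeToElapse(secTimer), timeToElapse(msecTimer));
}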
This may or may not be a bug, but I would like some help understanding the behavior of Timer.
Here is a test program that sets up Timer.periodic with a duration of 1000 microseconds (1 millisecond). The callback that fires increments a count. Once the count reaches 1000 intervals, the program prints the time elapsed and exits. The point being to get close to 1 second in execution time. Consider the following:
import 'dart:async';

main() {
  int count = 0;
  var stopwatch = new Stopwatch();
  stopwatch.start();
  new Timer.periodic(new Duration(microseconds: 1000), (Timer t) {
    count++;
    if (count == 1000) {
      print(stopwatch.elapsed);
      stopwatch.stop();
      t.cancel(); // stop the timer so the program can exit
    }
  });
}
The result is:
0:00:01.002953
That is, just over a second (assuming the remainder comes from the start time of the stopwatch).
However, if you change the resolution to be anything under 1 millisecond e.g. 500 microseconds, the Timer seems to ignore the duration entirely and executes as quickly as possible.
Result being:
0:00:00.008911
I would have expected this to be closer to half a second. Is this an issue with the granularity of the Timer? The same issue can also be observed when applying a similar scenario to Future.delayed.
The minimal resolution of the timer is 1 ms. When you ask for a duration below 1 ms (such as 500 microseconds), it is rounded down to 0 ms, i.e. as fast as possible.
The code is:
int milliseconds = duration.inMilliseconds;
if (milliseconds < 0) milliseconds = 0;
return _TimerFactory._factory(milliseconds, callback, true);
Maybe it should take 1 ms as a minimum, if that is its actual minimum, or it should handle microseconds internally, even if it only triggers every 10-15 milliseconds and runs whatever events are pending so far.
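Until then, one workaround (just a sketch; periodicAtLeast1ms is a hypothetical helper, not part of dart:async) is to clamp the requested duration to the 1 ms floor yourself:
import 'dart:async';

Timer periodicAtLeast1ms(Duration requested, void callback(Timer t)) {
  // Clamp to the effective 1 ms floor so the duration is not rounded down to 0.
  var effective = requested.inMilliseconds < 1
      ? new Duration(milliseconds: 1)
      : requested;
  return new Timer.periodic(effective, callback);
}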
If you are in VM it looks like a bug. Please file an issue.
If you are in JS side see the following note on the documentation of the Timer class:
Note: If Dart code using Timer is compiled to JavaScript, the finest granularity available in the browser is 4 milliseconds.
Hey everyone, so I have an array, private var aFishArray:Array;, that is set up with the timer inside my constructor:
tFishTimer = new Timer(800);
//Listen for timer intervals/ticks
tFishTimer.addEventListener(TimerEvent.TIMER, addMainFish,false,0,true);
//Start timer object
tFishTimer.start();
Then in the end game condition I remove the timer:
tFishTimer.removeEventListener(TimerEvent.TIMER, addMainFish);
tFishTimer.stop();
Now this works perfectly, but the problem is when I make a new timer in the same variable inside a separate function, like so:
private function checkFishPowerHitBucket():void
{
    for (var j:int = 0; j < aFishPowerUpArray.length; j++)
    {
        //get current fish in j loop
        var currentfPower:mcMoreFishPowerUp = aFishPowerUpArray[j];
        //test if current fish is hitting bucket
        if (currentfPower.hitTestObject(bucket))
        {
            //If we want the timer to only run a certain amount of times then new Timer(1000, ??)
            tFishTimer = new Timer(100, 30);
            //Listen for timer intervals/ticks
            tFishTimer.addEventListener(TimerEvent.TIMER, addMainFish, false, 0, true);
            //Start timer object
            tFishTimer.start();
        }
    }
}
Then in my end game condition, when I try to remove the timer and stop the movie clips from entering the screen, it no longer works; the fish just keep appearing on the screen. Is there anything I can do to remove all instances of these timers when the game is over? I'm thinking that by creating a new timer in the same variable, it cancels the command to stop it when the new timer starts? Any help would be appreciated, thanks.
Also, here is the addMainFish() function:
private function addMainFish(e:Event):void
{
    //Create new fish object
    var newFish:mcMainFish = new mcMainFish();
    //Add fish object to stage
    stage.addChild(newFish);
    //Add fish to fish Array
    aFishArray.push(newFish);
    //trace(aFishArray.length);
}
It doesn't really 'delete' the old timer; what has happened is that when you create the new timer and store it in the same variable, you've lost the reference to the old one, so it just resides somewhere in memory, forever running (or until it hits its limit, if it has one).
A way to solve this particular problem is to keep a reference to the old timer(s), such as in an array: when you make a new timer, move the old one into a Timer array first so that you still have access to it.
I do not recommend this.
Having multiple timers like this will cost you in performance and possibly a fair amount of unpredictability. Also, it can lead to some tough debugging down the road as you may have trouble tracing which timer caused which thing to fail, strange instances of objects trying to access the same data, having to deal with so many objects in itself, etc.
What you could do is have a single dedicated timer, and within that timer do all of the processing on each of your objects that needs it. You could also avoid a timer altogether and just use your game loop to do this logic.
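For example, here is a rough sketch of that idea (names like masterTimer, burstTicksLeft, startSpawning and stopSpawning are made up for illustration; only addMainFish comes from your code):
// one 100 ms tick drives everything
private var masterTimer:Timer = new Timer(100);
private var burstTicksLeft:int = 0;   // >0 while a power-up burst is running

private function startSpawning():void   // call once, e.g. in the constructor
{
    masterTimer.addEventListener(TimerEvent.TIMER, onTick, false, 0, true);
    masterTimer.start();
}

private function onTick(e:TimerEvent):void
{
    // regular fish every 8th tick (~800 ms, like the original tFishTimer)
    if (masterTimer.currentCount % 8 == 0)
        addMainFish(e);

    // extra fish on every tick while a power-up burst is active
    // (set burstTicksLeft = 30 in checkFishPowerHitBucket instead of creating a new Timer)
    if (burstTicksLeft > 0)
    {
        addMainFish(e);
        burstTicksLeft--;
    }
}

private function stopSpawning():void   // call this in the end game condition
{
    masterTimer.stop();
    masterTimer.removeEventListener(TimerEvent.TIMER, onTick);
}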
This post illustrates a similar problem and the answer there is similar to what I've mentioned here.
I'm working on a project in C, where I'm going to do some heavy physics calculations, and I want the ability to see the results when I'm finished. The way it works now is that I run GLUT on the main thread, and use a separate thread (pthread) to do input (from the terminal) and calculations. I currently use glutTimerFunc to do the animation, but the problem is that that function will fire every given time interval no matter what. I can stop the animation by using an if statement in the animation function, stopping the variables from being updated, but this uses a lot of unnecessary resources (I think).
To fix this problem I was thinking that I could use an extra thread with a custom timer function that I could control myself (without glutMainLoop messing things up). Currently this is my test function to check whether this would work (meaning that the function itself is in no way finished). It runs in a separate thread created just before glutMainLoop:
void *threadAnimation(void *arg) {
    while (1) {
        if (animationRun) {
            rotate = rotate + 0.00001;
            if (rotate > 360) {
                rotate = rotate - 360;
            }
            glutSetWindow(window);
            glutPostRedisplay();
        }
    }
}
The specific problem I have is that the animation just runs for a couple of seconds and then stops. Does anybody know how I can fix this? I am planning to use timers and so on later, but what I'm looking for is a way to ensure that glutPostRedisplay will be sent to the right place. I thought glutSetWindow(window) was the solution, but apparently not. If I remove glutSetWindow(window) the animation still works, just not for as long, but it runs much faster (so maybe glutSetWindow(window) takes a lot of resources?).
By the way, the variable "window" is created like this:
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
glutInitWindowSize(854, 540);
glutInitWindowPosition(100, 100);
window = glutCreateWindow("Animation View");
init();
glutDisplayFunc(display);
glutReshapeFunc(reshape);
timerInt = pthread_create(&timerThread, NULL, &threadAnimation, NULL);
glutMainLoop();
I don't actually know if this is correct, but it compiles just fine. Any help is greatly appreciated!
Here's a little idea: create a class that will contain all the dynamic settings:
class DynamicInfo {
public:
    int vertexCount;
    float *vertexes;
    ...
    DynamicInfo &operator=(const DynamicInfo &origin);
};
And then the main application will contain those:
DynamicInfo buffers[2];
int activeBuffer = 0;
In the animation thread, just draw (maybe use a lock for the one shared variable):
DynamicInfo *current = buffers + activeBuffer; // Or rather use reference
In the calculation thread:
// Store the currently active buffer as current (for future manipulation)
DynamicInfo *current = buffers + activeBuffer;
// We finished calculations on the other buffer, so flip it to become the one the application draws
activeBuffer = (activeBuffer + 1) % 2;
// Copy the fresh data into the now-inactive buffer so the next calculation starts from it
(*current) = buffers[activeBuffer];
Again, only one variable needs locking.
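A minimal sketch of that locking, assuming a pthread mutex is added next to the buffers and activeBuffer from the snippet above:
#include <pthread.h>

static pthread_mutex_t bufferLock = PTHREAD_MUTEX_INITIALIZER;

/* calculation thread: publish the freshly computed buffer */
static void publishBuffer(void) {
    pthread_mutex_lock(&bufferLock);
    activeBuffer = (activeBuffer + 1) % 2;
    pthread_mutex_unlock(&bufferLock);
}

/* animation thread: read a stable buffer index before drawing */
static int currentBufferIndex(void) {
    pthread_mutex_lock(&bufferLock);
    int index = activeBuffer;
    pthread_mutex_unlock(&bufferLock);
    return index;
}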
I have a WinForms app that starts a WPF process using Process.Start. I would like to know when the WPF process has finished loading so that I can access the process.MainWindowHandle property (it's 0 before it's completely loaded).
I tried polling, but the handle is always 0. However, if I debug and wait (after Process.Start) for the WPF app to load, I then get the correct handle.
Does not work:
int maxCount = 100000;
int count = 0;
do
{
    wpfProcess.WaitForInputIdle();
    _hWnd = net4ReconProcess.MainWindowHandle;
    count++;
} while (_hWnd.ToInt32() == 0 || count > maxCount);
Add process.Refresh(); to the while loop.
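With that change, the polling loop could look something like this (just a sketch; the Sleep call is an addition so the loop doesn't spin at full speed):
IntPtr hWnd = IntPtr.Zero;
int count = 0, maxCount = 100000;
do
{
    net4ReconProcess.Refresh();                 // discard cached process info
    hWnd = net4ReconProcess.MainWindowHandle;   // re-read the (possibly new) handle
    System.Threading.Thread.Sleep(50);          // give the process time to create its window
    count++;
} while (hWnd == IntPtr.Zero && count < maxCount);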
Using a while loop around WaitForInputIdle makes no sense, because the call blocks the current thread until the other process has finished its initialization; after that, it always returns immediately. Please read the post "WaitForInputIdle should really be called WaitForProcessStartupComplete" on The Old New Thing.
As Raymond says, it should really be called WaitForProcessStartupComplete.
You should use this code:
if (!wpfProcess.WaitForInputIdle(10000)) // 10 s timeout
    throw new ApplicationException("Process takes too much time to start");
_hWnd = net4ReconProcess.MainWindowHandle;