How to check how many times a timer has ticked? - wpf

I have a timer that ticks once every second. I would like to detect when it has ticked 60 times (i.e. a minute has passed) and have it do something.

Assuming C#, this should do the job:
private int m_Time = 0;

private void Timer_Tick(...)
{
    m_Time++;
    if (m_Time == 60)
    {
        m_Time = 0;
        // it's been 60 seconds, do whatever
    }
    // do your "every 1 second" code here
}
Essentially you make a private field that counts the number of seconds that have ticked by, then check if it's 60. If it is, a minute has passed and you can perform your logic. Then set the counter back to 0 and carry on.

Create an int field, increment it every tick, and in an if(field == 60) block you can do "something".

Related

C programming: I want to subtract a weight per second

I am new on Stack Overflow and I am sorry if I make mistakes.
I am a beginner in C and I have a project that needs to subtract a percentage of a weight every second. For example:
Weight: 50 kg
Subtract per second: 4%
I found this code:
while(waitFor(1))
{
    *weight = (*weight) - (*weight) * 0.4;
}

void waitFor(int secs)
{
    int retTime;
    retTime = time(0) + secs;   // Get finishing time.
    while (time(0) < retTime);  // Loop until it arrives.
}
But I don't want to wait x seconds for it to finish; I want a faster solution. Any ideas?
**Note: I want to know how many seconds it takes for the weight to reach 0.
The sleep command is not working on my computer.**
To clean and disinfect a pool, a solid chemical body is dropped into the water.
On contact with the water it immediately begins to dissolve, losing 4% of its mass per second.
Assuming the dissolution rate stays constant, implement a program that accepts
the weight of the solid body in grams and displays how long it takes to dissolve completely.
The time is displayed as "hours:minutes:seconds"; for example, if the dissolution
time is 3,740 seconds, display 01:02:20.
To calculate the time you need to implement a function which accepts the grams
and returns the three time parameters, i.e. hours, minutes and seconds.
Note that the time is printed in the main function.
You can use the sleep(int) function in a loop; it suspends the process for the given number of seconds.
while (*weight > 0)
{
    sleep(1);
    *weight = (*weight) - (*weight) * 0.04; /* subtract 4% */
}
It will wait for 1 second, then the subtraction is made; this keeps running until the weight reaches 0.
Edit:
To find out the number of seconds required for the weight to reach 0:
unsigned seconds = 0;
while (*weight != 0) {
    *weight -= *weight * 0.04;
    seconds++;
    // only if you have the patience to wait in real time:
    sleep(1); // sleep() takes seconds, so this waits 1 second
}
Please note that weight is considered to be a pointer to an integer type.

Using timers in ARM embedded C programming

I'm writing a pong-type game in C, which will run on an ARM board over an LCD screen. Part of the requirements for the game is something called 'magic time'.
A "magic time" period occurs at random intervals between 5 and 10 seconds - i.e, between 5 and 10 seconds after the last "magic time" period, and lasts for a random duration of 2 to 10 seconds.
I don't really understand your question (do you execute this code every second via a timer interrupt, or something else?), but there are some errors that I can see at first sight:
while (magicTime == true) {
    magicTimeLength++;
    magicTime == magicTimeLength;
}
The last line (magicTime == magicTimeLength;) doesn't do anything - it simply evaluates whether magicTime is equal to magicTimeLength and throws the result away, so you end up in an endless loop.
I think that you want to do this:
1. Init magicTimeOccurence with a random value between 5 and 10.
2. Init magicTimeLength with a random value between 2 and 10.
3. Every second, if magicTimeOccurence is greater than 0, decrease its value by one.
4. Once magicTimeOccurence hits 0, check whether magicTimeLength is greater than 0. If it is, you are in a magic time period (so set the magicTime flag to true) and decrement magicTimeLength.
5. Once magicTimeLength reaches 0, set magicTime to false and go to step 1.
You should initialize your timer0 interrupt with a period of 1 s. I think that you accomplished it with
/* Set timer 0 period */
T0PR = 0;
T0MR0 = SYS_GetFpclk(TIMER0_PCLK_OFFSET)/(TIMER0_TICK_PER_SEC);
but make sure that it is actually triggered every second.
Here is sample code, it should show you what I mean.
/* In void InitTimer0Interrupt() */
...
    T0TCR_bit.CE = 1; /* Counting Enable */
    magicTimeOccurence = 5 + (rand() % 5);
    magicTimeLength = 2 + (rand() % 8);
    magicTime = false;
    __enable_interrupt();
}

/* In void Timer0IntrHandler (void) */
void Timer0IntrHandler (void) {
    /* clear interrupt */
    T0IR_bit.MR0INT = 1;
    VICADDRESS = 0;

    if (magicTimeOccurence > 0)
    {
        magicTimeOccurence--;
    }
    else if (magicTimeLength > 0) {
        magicTime = true;
        magicTimeLength--;
    }
    else {
        magicTime = false;
        magicTimeOccurence = 5 + (rand() % 5);
        magicTimeLength = 2 + (rand() % 8);
    }

    /* take action on timer interrupt */
}

Creating a timeout using time and difftime

gcc (GCC) 4.6.0 20110419 (Red Hat 4.6.0-5)
I am trying to record the start and end times and get the difference between them.
The function I am writing is part of an API for our existing hardware.
The API function wait_events takes one argument, a timeout in milliseconds. So I am trying to get the start time before the while loop, using time() to get the number of seconds, and then after each iteration of the loop get the time difference and compare that difference with the timeout.
Many thanks for any suggestions,
/* Wait for an event up to a specified time out.
 * If an event occurs before the time out return 0
 * If an event timeouts out before an event return -1 */
int wait_events(int timeout_ms)
{
    time_t start = 0;
    time_t end = 0;
    double time_diff = 0;

    /* convert to seconds */
    int timeout = timeout_ms / 100;

    /* Get the initial time */
    start = time(NULL);

    while(TRUE) {
        if(open_device_flag == TRUE) {
            device_evt.event_id = EVENT_DEV_OPEN;
            return TRUE;
        }

        /* Get the end time after each iteration */
        end = time(NULL);

        /* Get the difference between times */
        time_diff = difftime(start, end);
        if(time_diff > timeout) {
            /* timed out before getting an event */
            return FALSE;
        }
    }
}
The function that will call it will look like this.
int main(void)
{
    #define TIMEOUT 500 /* 1/2 sec */

    while(TRUE) {
        if(wait_events(TIMEOUT) != 0) {
            /* Process incoming event */
            printf("Event fired\n");
        }
        else {
            printf("Event timed out\n");
        }
    }

    return 0;
}
=============== EDIT with updated results ==================
1) With no sleep -> 99.7% - 100% CPU
2) Setting usleep(10) -> 25% CPU
3) Setting usleep(100) -> 13% CPU
4) Setting usleep(1000) -> 2.6% CPU
5) Setting usleep(10000) -> 0.3 - 0.7% CPU
You're overcomplicating it - simplified:
time_t start = time(NULL);
for (;;) {
    // try something
    if (time(NULL) > start + 5) {
        printf("5s timeout!\n");
        break;
    }
}
time_t is in general just an int or long int (depending on your platform) counting the number of seconds since January 1st 1970.
Side note:
int timeout = timeout_ms / 1000;
One second consists of 1000 milliseconds.
Edit - another note:
You'll most likely have to ensure that the other thread(s) and/or event handling can happen, so include some kind of thread inactivity (using sleep(), nanosleep() or whatever).
Without calling a Sleep() function, this is a really bad design: your loop will use 100% of the CPU. Even if you are using threads, your other threads won't get much time to run, as this thread will use many CPU cycles.
You should design something like that:
while(true) {
    Sleep(100); // let's say you want a precision of 100 ms
    // Do the compare time stuff here
}
If you need precise timing and are using different threads/processes, use mutexes (semaphores with an increment/decrement of 1) or critical sections to make sure the time comparison in your function is not interrupted by another process/thread of your own.
I believe your Red Hat is a System V-style system, so you can synchronize using IPC.

How can I cap the framerate at 60fps?

Alright, so I'm trying to cap my framerate at 60 frames per second, but the method I'm using is slowing it down to like 40.
#define TICK_INTERVAL 30

Uint32 TimeLeft(void){
    static Uint32 next_time = 0;
    Uint32 now;

    now = SDL_GetTicks();
    if ( next_time <= now ) {
        next_time = now + TICK_INTERVAL;
        return(0);
    }
    return(next_time - now);
}
Then I call it like this: SDL_Delay(TimeLeft());
How can I cap my framerate without going over it, or having it cap it too soon?
You need to record the time before drawing the current frame, and then delay the appropriate amount from then.
For example, some pseudo code to do it would be
markedTime = currentTime();
drawFrame();
delayFrom(markedTime, 1/60);
markedTime is the time recorded before drawFrame() was called. delayFrom() is a function that delays from a given time instead of "now". 1/60 is the amount of time to delay from the first argument, in seconds.

UTC time stamp on Windows

I have a buffer with a UTC time stamp in C, and I broadcast that buffer every ten seconds. The problem is that the time difference between two packets is not consistent: after 5 to 10 iterations the difference becomes 9, then 11, and then 10 again. Kindly help me sort out this problem.
I am using <time.h> for UTC time.
If your time stamp has only 1 second resolution then there will always be +/- 1 uncertainty in the least significant digit (i.e. +/- 1 second in this case).
Clarification: if you only have a resolution of 1 second then your time values are quantized. The real time, t, represented by such a quantized value has a range of t..t+0.9999. If you take the difference of two such times, t0 and t1, then the maximum error in t1-t0 is -0.999..+0.999, which when quantized is +/-1 second. So in your case you would expect to see difference values in the range 9..11 seconds.
A thread that sleeps for X milliseconds is not guaranteed to sleep for precisely that many milliseconds. I am assuming that you have a statement that goes something like:
while(1) {
    ...
    sleep(10); // Sleep for 10 seconds.
    // fetch timestamp and send
}
You will get a more accurate gauge of time if you sleep for shorter periods (say 20 milliseconds) in a loop checking until the time has expired. When you sleep for 10 seconds, your thread gets moved further out of the immediate scheduling priority of the underlying OS.
You might also take into account that the time taken to send the timestamps may vary, depending on network conditions, etc, if you do a sleep(10) -> send ->sleep(10) type of loop, the time taken to send will be added onto the next sleep(10) in real terms.
Try something like this (forgive me, my C is a little rusty):
bool expired = false;
double last, current;
double t1, t2;
double difference = 0;

while(1) {
    ...
    last = (double)clock();
    while(!expired) {
        usleep(20000); // sleep for 20 milliseconds (usleep takes microseconds)
        current = (double)clock();
        if(((current - last) / (double)CLOCKS_PER_SEC) >= (10.0 - difference))
            expired = true;
    }

    t1 = (double)clock();
    // Set and send the timestamp.
    t2 = (double)clock();

    //
    // Calculate how long it took to send the stamps
    // and take that away from the next sleep cycle.
    //
    difference = (t2 - t1) / (double)CLOCKS_PER_SEC;
    expired = false;
}
If you are not tied to the standard C library, you could look at the high-resolution timer functionality of Windows, such as the QueryPerformanceFrequency/QueryPerformanceCounter functions.
LARGE_INTEGER freq;
LARGE_INTEGER t2, t1;

//
// Get the resolution of the timer.
//
QueryPerformanceFrequency(&freq);

// Start Task.
QueryPerformanceCounter(&t1);
... Do something ....
QueryPerformanceCounter(&t2);

// Very accurate duration in seconds.
double duration = (double)(t2.QuadPart - t1.QuadPart) / (double)freq.QuadPart;
