Tasmota watering timer

I'm trying to make a watering timer that turns on for 20 minutes and off for 30, repeating the cycle during the daytime. At night the cycle should change to 20 minutes ON and 1 hour OFF.
I'm using these timers as a base, but I'm missing some rules to complete it.
Timer1
{"Enable":1,"Mode":1,"Time":"00:00","Window":0,"Days":"1111111","Repeat":1,"Output":1,"Action":1}
Timer2
{"Enable":1,"Mode":1,"Time":"00:20","Window":0,"Days":"1111111","Repeat":1,"Output":1,"Action":0}
Timer3
{"Enable":1,"Mode":2,"Time":"00:00","Window":0,"Days":"1111111","Repeat":1,"Output":1,"Action":1}
Timer4
{"Enable":1,"Mode":2,"Time":"00:20","Window":0,"Days":"1111111","Repeat":1,"Output":1,"Action":0}
Working on a Sonoff Basic with Tasmota 12.0.x
Any thoughts?
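The four Timers above each fire only once per day, so on their own they cannot repeat a 20-minute/30-minute cycle. One way to complete the setup (a sketch, untested on a real device) is to let Rules drive the cycle with RuleTimer (values in seconds) and keep just two Timers, switched to Action 3 (Rule), to mark sunrise and sunset; Var1 holds the current OFF time:

```
Rule1
  ON Clock#Timer=1 DO Backlog Var1 1800; Power1 1 ENDON
  ON Clock#Timer=3 DO Var1 3600 ENDON
  ON Power1#State=1 DO RuleTimer1 1200 ENDON
  ON Rules#Timer=1 DO Power1 0 ENDON
  ON Power1#State=0 DO RuleTimer2 %Var1% ENDON
  ON Rules#Timer=2 DO Power1 1 ENDON
Rule1 1
```

Here Timer1 (sunrise, Mode 1) and Timer3 (sunset, Mode 2) would use Action 3 so they fire Clock#Timer rule events instead of switching the output directly; 1200, 1800 and 3600 are the 20-minute ON, 30-minute OFF and 1-hour OFF periods in seconds. Note that in the Tasmota console the whole Rule1 definition goes on a single line.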

Related

Issue with timer Input Capture mode in STM32

I would like to trigger a timer with an external signal that arrives every 1 ms. Then the timer has to count up to 90 µs at every rising edge of the external signal. The question is: can I do that using a general-purpose timer configured for input capture? I don't understand which callback to use for this purpose.
I'm using the HAL library and the TIM2 peripheral of an STM32F446 microcontroller.
This is how I configured my timer peripheral:
void TIMER2_Config(void)
{
    TIM_IC_InitTypeDef timer2IC_Config = {0};

    htimer2.Instance = TIM2;
    htimer2.Init.CounterMode = TIM_COUNTERMODE_UP;
    htimer2.Init.Prescaler = 49;               // 50 MHz / (49 + 1) = 1 MHz tick
    htimer2.Init.Period = 89;                  // 90 ticks at 1 MHz = 90 us
    if (HAL_TIM_IC_Init(&htimer2) != HAL_OK)
        Error_handler();

    timer2IC_Config.ICPolarity = TIM_ICPOLARITY_RISING;
    timer2IC_Config.ICSelection = TIM_ICSELECTION_DIRECTTI;
    timer2IC_Config.ICPrescaler = TIM_ICPSC_DIV1;
    timer2IC_Config.ICFilter = 0;
    if (HAL_TIM_IC_ConfigChannel(&htimer2, &timer2IC_Config, TIM_CHANNEL_1) != HAL_OK)
        Error_handler();
}
What you are asking for is well within the features of this peripheral, but you must remember that the HAL library does not expose the chip's full feature set. Sometimes you have to access the registers directly (the LL library is another way to do this).
To have the external signal start the timer you need trigger mode, not input capture. Input capture means recording the value of a timer that is already running. You need to set the field TIMx_CCMRx_CCxS to 0b11 (3) to make the input a trigger, then set the field TIMx_SMCR_TS to select the channel you are using, and TIMx_SMCR_SMS to 0b110 (6) to select trigger mode (start on trigger).
Next, set up the prescaler and auto-reload register to count out the 90 microsecond delay that you want, and set TIMx_CR1_OPM to 1 (one-pulse mode) so the counter stops instead of wrapping when it reaches the limit.
Next, set TIMx_CR2_MMS to 0b010 to output a trigger (TRGO) on the update event.
Finally, you can set the ADCx_CR2_EXTSEL bits to 0b00110 so the ADC triggers on the TIM2_TRGO output.
This is all a bit complicated, but the reference manual is very thorough; read the whole chapter and check every field in the register description section. I would recommend not mixing the HAL library with direct register access, as the two will probably interfere with each other.
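The register sequence above can be sketched as follows. This is an illustrative variant that uses TI1FP1 as the trigger input; the 50 MHz timer clock is taken from the question, the CMSIS device header is assumed to be available, and every field value should be verified against the RM0390 reference manual:

```c
#include "stm32f446xx.h"   /* CMSIS device header (assumed available) */

/* Sketch: start TIM2 on a rising edge of TI1, count a one-shot 90 us,
   and pulse TRGO on the update event. */
void tim2_trigger_start(void)
{
    RCC->APB1ENR |= RCC_APB1ENR_TIM2EN;          /* enable TIM2 clock        */

    TIM2->PSC  = 49;                             /* 50 MHz / 50 = 1 us tick  */
    TIM2->ARR  = 89;                             /* update event after 90 us */

    TIM2->SMCR = (0x5U << TIM_SMCR_TS_Pos)       /* TS  = 101: TI1FP1        */
               | (0x6U << TIM_SMCR_SMS_Pos);     /* SMS = 110: trigger mode  */

    TIM2->CR1 |= TIM_CR1_OPM;                    /* stop at the update event */
    TIM2->CR2 |= (0x2U << TIM_CR2_MMS_Pos);      /* MMS = 010: TRGO = update */
    /* the counter now starts on the next rising edge of TI1 */
}
```

With this configuration no interrupt callback is needed at all: the edge starts the counter in hardware, and the update event 90 µs later is exported on TRGO.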

C# .NET system timer hiccup

private System.Timers.Timer timerSys = new System.Timers.Timer();
The ticks are set to fire at the start of each multiple of 5 seconds:
entry exit
10:5:00.146 10:5:00.205
10:5:05.129 10:5:05.177
10:5:10.136 10:5:10.192
10:5:15.140 10:5:15.189
10:5:20.144 10:5:20.204
and then a delay of 28 seconds.
Note that Windows 10 compensates for the missed ticks
by firing them at close intervals:
10:5:48.612 10:5:48.692
10:5:48.695 10:5:48.745
10:5:48.748 10:5:48.789
10:5:48.792 10:5:49.90
10:5:43.93 10:5:49.131
and another delay of 27 seconds.
Again Windows 10 crams the ticks together to compensate,
but this time there is an event inside the second tick
that lasts about 28 seconds and makes that tick very long:
10:6:16.639 10:6:16.878
this one is very long
10:6:16.883 10:6:42.980
10:6:42.984 10:6:43.236
10:6:43.241 10:6:43.321
10:6:43.326 10:6:43.479
The PC is running just two applications that I wrote.
They communicate via files and also via SQL tables.
This event happens maybe once every two months.
Questions:
What could be happening?
Is there a way to create a log file of all processes over time?
My applications keep tabs on the time down to the millisecond, so if there were a way of logging processes, I could match them up.
Alternatively, is there a way for my app to know what the OS is doing?

How do I implement a timer which turns a signal on and off (PWM) every few seconds on an ATmega324A microcontroller?

I'm programming an ATmega324A microcontroller and I'm trying to implement a timer (in this case Timer1) which is supposed to make a second LED connected to my board blink.
I also need to know how to identify which pin the LED is attached to.
I've found the data sheet:
http://ww1.microchip.com/downloads/en/DeviceDoc/ATmega164A_PA-324A_PA-644A_PA-1284_P_Data-Sheet-40002070A.pdf
but the details are too technical for me to understand, and I don't know where to start looking and, most importantly, how to get to the result, which is the code itself.
Also, What does the ISR function do?
Below is the current init_timer function for Timer0. Is it possible for me to enable both timers at the same time?
static void init_timer(void)
{
// Configure Timer0 for CTC mode, 64x prescaler for 1 ms interval
TCCR0A = _BV(WGM01);
TCCR0B = _BV(CS01) | _BV(CS00);
OCR0A = 124;
TIMSK0 = _BV(OCIE0A);
}
int main(void){
MCUSR = 0;
wdt_disable();
init_pins(); // Reset all pins to default state
init_timer(); // Initialize 1 msec timer interrupt
configure_as_output(LOAD_ON);
configure_as_output(LED1);
configure_as_output(LED2);
sei();
.
.
.
}
ISR(TIMER0_COMPA_vect)
{
static uint16_t ms_count = 0;
ms_count++; // milliseconds counter
if (ms_count == TMP107_POLL_PERIOD)
{
tmp107_command(); // send command to temperature sensor
toggle(LED1); // blink status led
ms_count = 0;
}
}
First of all: Stack Overflow is a site for asking questions about source code; it is not a service that delivers solutions. Please take the tour, it will help you to get satisfactory answers.
But never mind, since you're new:
For example, you can implement a timer for a pulse width generator in these steps:
Learn to read data sheets. Nobody can relieve you of this burden.
Learn how to use output pins.
Write some tests to make sure you understand output pins.
Select a timer to measure the clock cycles. Apparently you did that already.
Learn to use this timer. Some timers can generate PWM (pulse width modulated) signals in hardware. However, the output pin is likely to be in a fixed location and the range of possible periods may not meet your requirements.
Write some tests to make sure you understand timers and interrupts.
If the required pulse period is too long for the timer, you can add an extra variable to scale down, for example.
Implement the rest of it.
Also, What does the ISR function do?
This function is called "magically" by the hardware when the conditions for the interrupt are met. In the case shown, tmp107_command() and toggle(LED1) are called only once every TMP107_POLL_PERIOD milliseconds.
Is it possible for me to enable both timers at the same time?
Sure.
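Both timers run independently, so Timer1 can be set up right next to the existing Timer0 code. A hedged sketch, assuming an 8 MHz system clock and the toggle() helper and LED2 pin from the question's code:

```c
#include <avr/io.h>
#include <avr/interrupt.h>

/* Timer1 (16-bit) in CTC mode, independent of Timer0. With an 8 MHz
   clock and a /256 prescaler, a compare value of 15624 gives a 2 Hz
   compare rate, i.e. the LED toggles twice per second (1 Hz blink). */
static void init_timer1(void)
{
    TCCR1B = _BV(WGM12) | _BV(CS12);   /* CTC mode, clk/256            */
    OCR1A  = 15624;                    /* 8e6 / 256 / 15625 = 2 Hz     */
    TIMSK1 = _BV(OCIE1A);              /* enable compare-A interrupt   */
}

ISR(TIMER1_COMPA_vect)
{
    toggle(LED2);                      /* toggle() and LED2 as in the question */
}
```

Call init_timer1() next to init_timer() in main(); the single sei() then enables both interrupts at once.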

Change timer period while running application STM32F4 [C]

I want to change my timer period while the program is running.
I make different measures requiring different timer periods.
After initialization:
TIM_TimeBaseInitStructure.TIM_Period = period - 1;
TIM_TimeBaseInitStructure.TIM_Prescaler = 8399+1;
TIM_TimeBaseInitStructure.TIM_ClockDivision = TIM_CKD_DIV1;
TIM_TimeBaseInitStructure.TIM_CounterMode = TIM_CounterMode_Up;
TIM_TimeBaseInit(TIM3, &TIM_TimeBaseInitStructure);
In main function I set: period = 10000;
Then, I receive new value via UART and try to set another value:
arr3[0] = received_str[11];
arr3[1] = received_str[12];
arr3[2] = received_str[13];
arr3[3] = received_str[14];
arr3[4] = received_str[15];
arr3[5] = '\0';
per = atoi(arr3);
period = per;
But the timer period doesn't change. How can I do it?
This is the problem with HAL libraries: people who use them often have no idea what is behind them.
What is the timer period?
It is the combination of the PSC (prescaler) and ARR (auto-reload register) values. The period is calculated as (ARR + 1) * (PSC + 1) / TimerClockFreq.
When you try to change the period while the timer is running, you need to make sure it is done at a safe moment to prevent glitches. The safest moment is when the update (UG) event happens.
You have two ways of achieving it:
UG interrupt. In the interrupt routine, if the ARR or PSC has changed, update the register. Bear in mind that the change may only take effect in the next cycle if the registers are shadowed.
Using the timer's DMA burst mode. It is more complicated to configure, but the hardware takes care of the register update on the selected event. The change is instant, and register shadowing does not affect it. For more details, read the RM chapter about the timer DMA burst mode.
If you want to use the more advanced hardware features, forget about the HAL and program the bare registers directly, keeping full control.
At run time we can change the timer period by updating the auto-reload register. I have done this in practice:
TIM5->ARR = Value; //This is for timer5
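To make such a run-time change glitch-free, ARR preload can be enabled so the new value latches only at the next update event. A sketch in the question's SPL style, for its TIM3 (function name from the Standard Peripheral Library):

```c
/* Enable ARR preload once at init time; after that, writes to ARR are
   shadowed and take effect at the next update event instead of mid-cycle. */
TIM_ARRPreloadConfig(TIM3, ENABLE);

/* later, e.g. after receiving the new value over UART: */
TIM3->ARR = period - 1;   /* applied at the next update event */
```

Without preload, a write that lands while the counter is already past the new ARR value can make the timer run a full 2^16 extra counts before wrapping.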

Dart: Timer.periodic not honoring granularity of duration in VM

This may or may not be a bug, but I would like some help understanding the behavior of Timer.
Here is a test program that sets up Timer.periodic with a duration of 1000 microseconds (1 millisecond). The callback that fires increments a count. Once the count reaches 1000 intervals, the program prints the time elapsed and exits. The point being to get close to 1 second in execution time. Consider the following:
import 'dart:async';

main() {
  int count = 0;
  var stopwatch = new Stopwatch();
  stopwatch.start();
  new Timer.periodic(new Duration(microseconds: 1000), (Timer t) {
    count++;
    if (count == 1000) {
      print(stopwatch.elapsed);
      stopwatch.stop();
    }
  });
}
The result is:
0:00:01.002953
That is, just over a second (assuming the remainder comes from the start-up time of the stopwatch).
However, if you change the resolution to be anything under 1 millisecond e.g. 500 microseconds, the Timer seems to ignore the duration entirely and executes as quickly as possible.
Result being:
0:00:00.008911
I would have expected this to be closer to half a second. Is this an issue with the granularity of the Timer? The same issue can be observed when applying a similar scenario to Future.delayed.
The minimal resolution of the timer is 1 ms. When asking for a 500 µs duration, it is rounded down to 0 ms, i.e. as fast as possible.
The code is:
int milliseconds = duration.inMilliseconds;
if (milliseconds < 0) milliseconds = 0;
return _TimerFactory._factory(milliseconds, callback, true);
Maybe it should take 1 ms as a minimum, if that is its actual minimum, or it should handle microseconds internally, even if it only triggers every 10-15 milliseconds and then runs the events pending so far.
If you are on the VM, this looks like a bug. Please file an issue.
If you are on the JS side, see the following note in the documentation of the Timer class:
Note: If Dart code using Timer is compiled to JavaScript, the finest granularity available in the browser is 4 milliseconds.
