Implementing a delay using timers in STM32

Simply put, I want to implement a delay function using STM32 timers, like the "Normal mode" of AVR microcontrollers. Can anybody help? I just can't find that in the STM32 datasheet! It only describes PWM, input capture, output compare and one-pulse mode output.
N.B.: I forgot to mention that I'm using an STM32F401 microcontroller.

You have a special timer for exactly this purpose, called SysTick. Set it to overflow every 1 ms and increment a counter in its handler:
static volatile uint32_t counter;

void SysTick_Handler(void)
{
    counter++;                                 // one tick per millisecond
}

inline uint32_t __attribute__((always_inline)) GetCounter(void)
{
    return counter;
}

void Delay(uint32_t ms)
{
    uint32_t tickstart = GetCounter();
    while ((GetCounter() - tickstart) < ms);   // unsigned subtraction also handles counter wrap-around
}

If you want to be able to delay by periods shorter than milliseconds,
you can set up a timer to auto-reload and use the internal clock.
The clock frequency fed to each timer is shown in the CubeMX clock tab.
On my NUCLEO-F446RE most timers get 84 MHz or 42 MHz; Timer 3 gets 42 MHz by default.
So if I set the prescaler so that the counter ticks at 1 MHz (the hardware divides by PSC + 1,
so the prescaler register value is 41 for a 42 MHz timer clock) and set the count period to
maximum (0xFFFF for a 16-bit timer), I get a free-running counter that increments every microsecond.
I can then, using the wrap-around property of two's-complement/unsigned arithmetic, simply subtract
the old count from the new one to get the elapsed period.
void wait_us(uint16_t delay)
{
    uint16_t t1 = (uint16_t)htim3.Instance->CNT;            // counter ticks at 1 MHz
    /* unsigned 16-bit subtraction gives the elapsed ticks even if CNT wraps */
    while ((uint16_t)(htim3.Instance->CNT - t1) < delay) {
        asm("nop");
    }
}
This is an old trick from PIC18 C coding
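For completeness, a usage sketch under the assumption that CubeMX generated the htim3 handle as described; the counter has to be started once (the stock HAL call for that is HAL_TIM_Base_Start()) before wait_us() is used:
/* Once, during init, after MX_TIM3_Init(): start the free-running 1 MHz counter */
HAL_TIM_Base_Start(&htim3);

/* ... later ... */
wait_us(20);   /* busy-wait roughly 20 microseconds */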

How to get millisecond resolution from DS3231 RTC

How to get accurate milliseconds?
I need to calculate the delay of sending data from Arduino A to Arduino B. I tried to use DS3231 but I cannot get milliseconds. What should I do to get accurate milliseconds from DS3231?
Using millis() would work, but relying on it when you have a dedicated real-time clock makes no sense; here are better instructions.
The first thing in any hardware-interfacing project is a close reading of the datasheet. The DS3231 datasheet reveals that there are five possible frequencies of sub-second outputs (see page 13):
32 kHz
1 kHz
1.024 kHz
4.096 kHz
8.192 kHz
These last four options are achieved by various combinations of the RS1 and RS2 control bits.
So, for example, to get exact milliseconds, you'd target option 2, 1 kHz. You set RS1 = 0 and RS2 = 0 (see page 13 of the datasheet you provided) and INTCN = 0 (page 9). Then you'd need an ISR to handle the interrupts coming from the !INT/SQW pin of the device into a digital input pin on your Arduino.
volatile uint16_t milliseconds; // volatile is important here since we change this variable inside an interrupt service routine

ISR(INT0_vect) // or whatever pin/interrupt you choose
{
    ++milliseconds;
    if (milliseconds == 999) // roll over to zero
        milliseconds = 0;
}
OR:
const int RTCpin = 3; // use any digital pin you need

void setup()
{
    pinMode(RTCpin, INPUT);
    // Enable the INT0 external interrupt
    GICR |= (1 << INT0);
    // Any logical change on INT0 triggers the interrupt (ISC01:ISC00 = 0:1)
    MCUCR |= (1 << ISC00);
    MCUCR &= ~(1 << ISC01);
}
If these commands in setup() don't work on your Arduino, google 'Arduino external interrupt INT0' (on ATmega328-based boards the equivalent registers are EIMSK and EICRA). I've shown you two ways, one with Arduino code and one in C.
Once you have this ISR working and pin 3 (!INT/SQW) of the DS3231 connected to a digital input pin of your choosing, that pin will be driven at 1 kHz, i.e. once every millisecond. Perfect!
// Down in the main program you now have access to milliseconds; you might want to start off by setting:
// When the 1-second RTC seconds value changes:
milliseconds = 0; // so you can measure milliseconds since the last second
That's all there is to it. All you need to learn now is how to set the command register using I2C commands and you're all set.
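For reference, a minimal sketch of that I2C write using the Arduino Wire library (not part of the original answer): the DS3231 responds at I2C address 0x68, its control register is 0x0E, and writing 0x00 clears INTCN, RS2 and RS1 as described above. Call Wire.begin() in setup() before using it.
#include <Wire.h>

void enableSquareWave()
{
    Wire.beginTransmission(0x68); // DS3231 I2C address
    Wire.write(0x0E);             // control register
    Wire.write(0x00);             // INTCN = 0, RS2 = 0, RS1 = 0
    Wire.endTransmission();
}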
The C ISR example above gains 1 ms every second: it increments before testing for 999, so the count rolls over after only 999 ticks instead of 1000. It should be:
{
    if (milliseconds == 999) // roll over to zero
        milliseconds = 0;
    else
        ++milliseconds;
}

Understanding How DelayMS Function Works With PIC18F Microcontroller

I am trying to use a PIC18F45K22 to flash an LED on for a second, then off for a second, repeating indefinitely. The code incorporates a delay function that keeps the PIC busy for one second, but I am not sure how it works (it was arrived at through trial and error). To achieve a 1 second delay, tWait should equal 125,000. I am using an 8 MHz MCU clock frequency with the PLL disabled. I know that 8 MHz corresponds to a 125 ns period, but after that I am stuck. Can anyone please explain to me how it works? I've been at it for a few hours and it's driving me bonkers.
#define SYSTEM_FREQUENCY (8000000) // 8 MHz

// Function prototype
void delayS(unsigned int sec);

void main()
{
    // Set RA0 to output...
    TRISA.RA0 = 0;
    do
    {
        // Turn RA0 off...
        LATA.RA0 = 0;
        delayS(1);
        // Turn RA0 on...
        LATA.RA0 = 1;
        delayS(1);
    } while(1); // Repeat...
}

// My attempt at writing the function...
void delayS(unsigned int sec)
{
    unsigned long tWait, tStart;
    /*
       To achieve a 1 second on / 1 second off cycle, tWait should be 125,000.
       How? Why? 64 is arbitrary to achieve 125,000.
    */
    tWait = sec * (SYSTEM_FREQUENCY / 64);
    for(tStart = 0; tStart < tWait; tStart++);
}
The clock frequency is 8 MHz, so there are 2×10^6 instruction cycles per second, since one instruction cycle takes 4 clock ticks. Any assembly instruction of the microcontroller takes at least one cycle. With tWait = 125,000 loop iterations per second, there is still a factor of 16 to explain (2×10^6 / 125,000 = 16).
That factor of 16 must correspond to one pass of the loop. Since unsigned long variables are used in this loop, it is not surprising that one iteration takes quite a few instructions. You would need to look at the generated assembly code to understand exactly what is happening and get a precise count.
If you are working with MPLAB and the XC8 compiler, there are ready-made delay routines you can use, such as Delay10KTCYx(unsigned char) or the __delay_ms() macro.
With such busy-wait delays the microcontroller spends 99.99% of its time just counting, which is not critical if you are only flashing LEDs. But to use a PIC efficiently you will want to avoid these functions and use timers and interrupts instead, as in the sketch below. Have fun!
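To illustrate the timer-based approach, here is a minimal sketch (not from the original answer) using Timer0 of the PIC18F45K22 with XC8. It assumes Fosc = 8 MHz, so the instruction clock is 2 MHz, and a 1:32 prescaler then gives 62,500 timer ticks per second:
#include <xc.h>

// Blocking 1-second-per-count delay using Timer0 in 16-bit mode.
void delayS_timer0(unsigned int sec)
{
    while (sec--)
    {
        T0CON = 0x04;               // 16-bit mode, internal clock, prescaler 1:32, timer stopped
        TMR0H = 0x0B;               // preload 65536 - 62500 = 3036 = 0x0BDC...
        TMR0L = 0xDC;               // ...writing TMR0L also latches the buffered TMR0H value
        INTCONbits.TMR0IF = 0;      // clear the overflow flag
        T0CONbits.TMR0ON = 1;       // start the timer
        while (!INTCONbits.TMR0IF)
            ;                       // wait for overflow: 62,500 ticks = 1 second
        T0CONbits.TMR0ON = 0;       // stop the timer
    }
}
The same overflow flag can instead be made to raise an interrupt, so the CPU is free to do other work while the second elapses.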

Count down timer for Arduino sketch

I wrote an Arduino sketch a little while ago, and I am trying to add functionality to the sketch. Basically I want a count down timer that closes a solenoid cut off valve after 30 seconds has passed.
You can do that using timers and interrupts, but some more information is needed (which board, which processor).
Note: F_CPU is already defined if you are using the Arduino libraries; it is set to your board's clock frequency (e.g. 16000000UL on a 16 MHz board).
Note 2: You may want to use a timer other than Timer0, since that one is used by the Arduino core to track time (millis()/delay()).
#define GMilliSecondPeriod (F_CPU / 1000)

unsigned int gNextOCR = 0;
volatile unsigned long gMillis = 0;
volatile bool valveOpened = false;   // set this to true (and gMillis to 0) when you open the valve

// This interrupt fires every 1 ms
ISR(TIMER2_COMPA_vect)
{
    if (valveOpened) {
        gMillis++;
        if (gMillis >= 30000) {      // 30 seconds elapsed
            close_valve();           // your own function
            valveOpened = false;
            gMillis = 0;
        }
    }
    // Advance the compare point by F_CPU/1000 "sub-ticks"; the low 8 bits carry the fractional
    // part, so on average the interrupts land exactly 1 ms apart.
    gNextOCR += GMilliSecondPeriod;
    OCR2A = gNextOCR >> 8;
}

// Just call this function from your setup()
void setupTime() {
    TCCR2B |= _BV(CS22) | _BV(CS21); // Timer2 prescaler /256, so one timer tick = 256 CPU cycles
    TIMSK2 |= _BV(OCIE2A);           // enable the compare-match A interrupt
    sei();                           // enable interrupts globally
}
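If millisecond-exact timing is not required, a simpler sketch (not part of the answer above) achieves the same with millis() and no direct timer configuration; open_valve() and close_valve() stand in for your own I/O code:
unsigned long valveOpenedAt = 0;
bool valveIsOpen = false;

void startValveTimer() {          // call this whenever you open the valve
    open_valve();                 // hypothetical user function
    valveOpenedAt = millis();
    valveIsOpen = true;
}

void loop() {
    if (valveIsOpen && (millis() - valveOpenedAt >= 30000UL)) {
        close_valve();            // hypothetical user function
        valveIsOpen = false;
    }
    // ... rest of your loop ...
}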

Arduino Nano Timers

I want to know more about Arduino Nano timers.
What timers are there?
Do they produce interrupts?
What code would attach an interrupt handler to them?
How are delay() and delayMicroseconds() implemented?
Do they use timer interrupts? (If so, how can I have other code execute during this?)
Or do they repeatedly poll until a timer reaches a certain value?
Or do they increment a value X number of times?
Or do they do it another way?
The best way to think about the Arduino Nano timers is to think about the timers in the underlying chip: the ATmega328. It has three timers:
Timer 0: 8-bit, PWM on chip pins 11 and 12
Timer 1: 16-bit, PWM on chip pins 15 and 16
Timer 2: 8-bit, PWM on chip pins 17 and 5
All of these timers can produce two kinds of interrupts:
The "value matched" interrupt occurs when the timer value, which is added to every tick of the timer reaches a comparison value in the timer register.
The timer overflow interrupt occurs when the timer value reaches its maximum value
Unfortunately, there is no Arduino function to attach interrupts to timers. To use timer interrupts you will need to write slightly more low-level code. Basically, you will need to declare an interrupt routine something like this:
ISR(TIMER1_OVF_vect) {
...
}
This will declare a function to service timer1 overflow interrupt. Then you will need to enable the timer overflow interrupt using the TIMSK1 register. In the above example case this might look like this:
TIMSK1 |= (1<<TOIE1);
or
TIMSK1 |= _BV(TOIE1);
This sets the TOIE1 (generate timer1 overflow interrupt, please) flag in the TIMSK1 register. Assuming that your interrupts are enabled, your ISR(TIMER1_OVF_vect) will get called every time timer1 overflows.
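Putting the pieces together, a minimal sketch (names assumed, 16 MHz Nano) that also starts the timer, which the fragments above leave out:
#include <avr/interrupt.h>

volatile unsigned long overflowCount = 0;

ISR(TIMER1_OVF_vect) {
    overflowCount++;              // with a /256 prescaler this fires about every 1.05 s
}

void setup() {
    noInterrupts();
    TCCR1A = 0;                   // normal mode, no PWM output
    TCCR1B = _BV(CS12);           // start Timer1 with a /256 prescaler (16 MHz / 256 = 62.5 kHz)
    TIMSK1 |= _BV(TOIE1);         // enable the Timer1 overflow interrupt
    interrupts();
}

void loop() {
    // overflowCount is updated in the background by the ISR
}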
The Arduino delay() function looks as follows in the source code (wiring.c):
void delay(unsigned long ms)
{
uint16_t start = (uint16_t)micros();
while (ms > 0) {
if (((uint16_t)micros() - start) >= 1000) {
ms--;
start += 1000;
}
}
}
So internally it uses the micros() function, which relies on the timer0 count. The Arduino framework uses timer0 to keep track of time: the timer0 overflow count is where the millis() function gets its value.
The delayMicroseconds() function, on the other hand, builds the delay from well-timed processor operations; the exact code depends on the processor and the clock speed, the most common building block being the nop instruction (no operation), which takes exactly one clock cycle. The Arduino Nano uses a 16 MHz clock, and here's what the source code looks like for that case:
// For a one-microsecond delay, simply return. The overhead
// of the function call yields a delay of approximately 1 1/8 µs.
if (--us == 0)
return;
// The following loop takes a quarter of a microsecond (4 cycles)
// per iteration, so execute it four times for each microsecond of
// delay requested.
us <<= 2;
// Account for the time taken in the proceeding commands.
us -= 2;
What we learn from this:
A 1 µs delay does nothing extra (the function-call overhead itself provides the delay).
Longer delays are produced by a busy-wait loop that takes four cycles (a quarter of a microsecond at 16 MHz) per iteration; the left shift converts the requested microseconds into the required number of loop iterations.

Generating nanosecond delay in C on STM32

I am using an STM32F2 controller and I am interfacing with an ST7036 LCD display via an 8-bit parallel interface.
The datasheet says there should be a 20 nanosecond delay between the address hold and setup times.
How do I generate a 20 nanosecond delay in C?
Use stopwatch_delay(4) below to accomplish approximately 24ns of delay. It uses the STM32's DWT_CYCCNT register, which is specifically designed to count actual clock ticks, located at address 0xE0001004.
To verify the delay accuracy (see main), you can call STOPWATCH_START, run stopwatch_delay(ticks), then call STOPWATCH_STOP and verify with CalcNanosecondsFromStopwatch(m_nStart, m_nStop). Adjust ticks as needed.
uint32_t m_nStart; //DEBUG Stopwatch start cycle counter value
uint32_t m_nStop; //DEBUG Stopwatch stop cycle counter value
#define DEMCR_TRCENA 0x01000000
/* Core Debug registers */
#define DEMCR (*((volatile uint32_t *)0xE000EDFC))
#define DWT_CTRL (*(volatile uint32_t *)0xe0001000)
#define CYCCNTENA (1<<0)
#define DWT_CYCCNT ((volatile uint32_t *)0xE0001004)
#define CPU_CYCLES *DWT_CYCCNT
#define CLK_SPEED 168000000 // EXAMPLE for CortexM4, EDIT as needed
#define STOPWATCH_START { m_nStart = *((volatile unsigned int *)0xE0001004);}
#define STOPWATCH_STOP { m_nStop = *((volatile unsigned int *)0xE0001004);}
static inline void stopwatch_reset(void)
{
/* Enable DWT */
DEMCR |= DEMCR_TRCENA;
*DWT_CYCCNT = 0;
/* Enable CPU cycle counter */
DWT_CTRL |= CYCCNTENA;
}
static inline uint32_t stopwatch_getticks()
{
return CPU_CYCLES;
}
static inline void stopwatch_delay(uint32_t ticks)
{
uint32_t end_ticks = ticks + stopwatch_getticks();
while(1)
{
if (stopwatch_getticks() >= end_ticks)
break;
}
}
uint32_t CalcNanosecondsFromStopwatch(uint32_t nStart, uint32_t nStop)
{
uint32_t nDiffTicks;
uint32_t nSystemCoreTicksPerMicrosec;
// Convert (clk speed per sec) to (clk speed per microsec)
nSystemCoreTicksPerMicrosec = CLK_SPEED / 1000000;
// Elapsed ticks
nDiffTicks = nStop - nStart;
// Elapsed nanosec = 1000 * (ticks-elapsed / clock-ticks in a microsec)
return 1000 * nDiffTicks / nSystemCoreTicksPerMicrosec;
}
void main(void)
{
int timeDiff = 0;
stopwatch_reset();
// =============================================
// Example: use a delay, and measure how long it took
STOPWATCH_START;
stopwatch_delay(168000); // 168k ticks is 1ms for 168MHz core
STOPWATCH_STOP;
timeDiff = CalcNanosecondsFromStopwatch(m_nStart, m_nStop);
printf("My delay measured to be %d nanoseconds\n", timeDiff);
// =============================================
// Example: measure function duration in nanosec
STOPWATCH_START;
// run_my_function() => do something here
STOPWATCH_STOP;
timeDiff = CalcNanosecondsFromStopwatch(m_nStart, m_nStop);
printf("My function took %d nanoseconds\n", timeDiff);
}
The first STM32F2 specification I found assumes a clock frequency of 120 MHz, which is about 8 ns per clock cycle, so you would need about three single-cycle instructions between successive write or read/write operations. In C, a few increments of a volatile variable (or explicit NOPs, as in the sketch below) will probably do.
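A minimal sketch of that idea using the CMSIS __NOP() intrinsic (the header name is an assumption and may differ in your project); at 120 MHz each NOP is roughly 8.3 ns, so three give about 25 ns, ignoring pipeline and flash wait-state effects:
#include "stm32f2xx.h"   /* or whichever CMSIS device header your project uses */

static inline void delay_approx_20ns(void)
{
    __NOP();   /* ~8.3 ns per NOP at 120 MHz; exact timing depends on pipeline and caches */
    __NOP();
    __NOP();
}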
You should look into the FSMC peripheral available in your chip. While the configuration might be complicated, especially if you're not dropping in a memory part that it was designed for, you might find that your parallel interfaced device maps pretty well to one of the memory interface modes.
These sorts of external memory controllers must have a bunch of configurable timing options to support the range of different memory chips out there so you'll be able to guarantee the timings required by your datasheet.
The nice benefit of being able to do this is your LCD will then seem like any old memory mapped peripheral, abstracting away the lower level interfacing details.
