Issue with calculating time within FreeRTOS taskENTER_CRITICAL and taskEXIT_CRITICAL - arm

I am running FreeRTOS on an ARM Cortex-M7 with the following configuration:
#define configPRIO_BITS 3
#define configLIBRARY_LOWEST_INTERRUPT_PRIORITY 0xf
#define configLIBRARY_MAX_SYSCALL_INTERRUPT_PRIORITY 5
#define configKERNEL_INTERRUPT_PRIORITY ( configLIBRARY_LOWEST_INTERRUPT_PRIORITY << (8 - configPRIO_BITS) )
#define configMAX_SYSCALL_INTERRUPT_PRIORITY ( configLIBRARY_MAX_SYSCALL_INTERRUPT_PRIORITY << (8 - configPRIO_BITS) )
When I measure the time taken by a data transfer inside a task critical section, I get a wrong value (~300us):
taskENTER_CRITICAL();
{
    startTime = xTaskGetCurrentTimeInUsFromISR();

    // Read data from card
    doDATAtransfer();

    endTime = xTaskGetCurrentTimeInUsFromISR();
    netTime = endTime - startTime; // ~300us
}
taskEXIT_CRITICAL();
If I don't use the critical section, I get the correct and expected value of ~8000us.
It seems that measuring the time with all interrupts disabled gives the wrong result.
Can anyone please help me understand what the issue could be?

I found the answer! This morning I went jogging and it suddenly clicked. Here it is:
taskENTER_CRITICAL() masks all interrupts at or below the priority set by configMAX_SYSCALL_INTERRUPT_PRIORITY, and since the SysTick interrupt runs at the lowest priority it is masked inside the critical section. The tick count therefore stops advancing, so any tick-based timestamp taken inside the critical section gives the wrong value.
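For what it's worth, here is a minimal sketch (mine, not the asker's code) of taking the same measurement with the Cortex-M DWT cycle counter, which keeps counting even while interrupts are masked. It assumes CMSIS device headers (stm32f7xx.h is a placeholder for whichever part is in use), that SystemCoreClock holds the core clock in Hz, and that doDATAtransfer() is the asker's existing routine; dwt_init() is called once at startup.
#include "FreeRTOS.h"
#include "task.h"
#include "stm32f7xx.h"   /* CMSIS device header - placeholder, adjust for your part */

extern void doDATAtransfer(void);   /* the asker's transfer routine */

static void dwt_init(void)
{
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;  /* enable the DWT/trace block */
    /* Note: some Cortex-M7 parts also require unlocking the DWT lock access register. */
    DWT->CYCCNT = 0;                                 /* reset the cycle counter    */
    DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;            /* start counting CPU cycles  */
}

static uint32_t timed_transfer_us(void)
{
    uint32_t start, cycles;

    taskENTER_CRITICAL();
    {
        start  = DWT->CYCCNT;
        doDATAtransfer();                            /* read data from card        */
        cycles = DWT->CYCCNT - start;                /* wrap-safe unsigned subtraction */
    }
    taskEXIT_CRITICAL();

    return cycles / (SystemCoreClock / 1000000U);    /* cycles -> microseconds     */
}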


How to generate a note without a library?

I'm trying to generate a note, for example Do; Do's frequency is 523 Hz.
I wrote some code, but it did not work.
SysTick is at 8 MHz.
void note1(void){ // Note Do
    for (int i = 0; i < 523; i++){
        GPIOE->ODR = 0x4000;
        delay_ms(1);
        GPIOE->ODR = 0x0000;
        delay_ms(1);
    }
}
How can we solve this problem?
The board is an EasyMx Pro v7.
I'm calling the function like this:
void button_handler(void)
{
    note1();

    // Clear pending bit depending on which one is pending
    if (EXTI->PR & (1 << 0)){
        EXTI->PR = (1 << 0);
    }
    else if (EXTI->PR & (1 << 1)){
        EXTI->PR = (1 << 1);
    }
}
Sending 1 and 0 523 times, with delay_ms(1) = 1 ms each; 1000 ms = 1 sec.
On an STM32 (which, as I can see, you have) there are timers which can be configured as PWM outputs.
So use a timer: set the prescaler and period values according to the frequency you need, and set the duty cycle on the channel to 50%.
If you need a 523 Hz PWM output, set your timer's PWM frequency to 523 Hz using the prescaler and period values:
timer_overflow_frequency = timer_input_clock /
                           (prescaler_value + 1) /
                           (period_value + 1);
Then, for your output channel, set the compare value to half of the timer period value.
For the Standard Peripheral Library, a tutorial is available here:
https://stm32f4-discovery.net/2014/05/stm32f4-stm32f429-discovery-pwm-tutorial/
Link from unwind for Cube: https://electronics.stackexchange.com/questions/179546/getting-pwm-to-work-on-stm32f4-using-sts-hal-libraries
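As a rough illustration only (my sketch, not part of the original answer), here is how that might look with the Standard Peripheral Library, assuming TIM4 channel 1 and an 8 MHz timer clock (the question mentions 8 MHz). The numbers follow the formula above, 8 MHz / (7 + 1) / (1911 + 1) ≈ 523 Hz, and the GPIO alternate-function setup is omitted:
#include "stm32f4xx.h"  /* CMSIS/SPL device header - placeholder, adjust for your part */

void tone_pwm_init(void)
{
    TIM_TimeBaseInitTypeDef tb;
    TIM_OCInitTypeDef oc;

    RCC_APB1PeriphClockCmd(RCC_APB1Periph_TIM4, ENABLE);

    /* 8 MHz / (7 + 1) = 1 MHz counter clock, 1 MHz / (1911 + 1) ~= 523 Hz */
    TIM_TimeBaseStructInit(&tb);
    tb.TIM_Prescaler   = 7;
    tb.TIM_Period      = 1911;
    tb.TIM_CounterMode = TIM_CounterMode_Up;
    TIM_TimeBaseInit(TIM4, &tb);

    /* 50% duty cycle: compare value is half the period */
    TIM_OCStructInit(&oc);
    oc.TIM_OCMode      = TIM_OCMode_PWM1;
    oc.TIM_OutputState = TIM_OutputState_Enable;
    oc.TIM_Pulse       = 956;
    TIM_OC1Init(TIM4, &oc);

    TIM_Cmd(TIM4, ENABLE);
}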
You appear to have a fundamental misunderstanding. In your code note1(), the value 523 will affect only the duration of the note, not its frequency. With 1 ms high, 1 ms low repeated 523 times, you will generate a tone of approximately 500 Hz for approximately 1.05 seconds. I say "approximately" because there will be some small overhead in the loop other than the time delays.
A time delay resolution of 1 ms is insufficient to generate an accurate tone in that manner. To do it in the manner you have, each delay would need to be 1/2f seconds, so for 523 Hz approximately 956 µs. The loop iteration count would need to be f·t, so for, say, 0.25 seconds, about 131 iterations.
However, if button_handler() is, as it appears to be, an interrupt handler, you really should not be spending over a second in an interrupt handler!
In any event, this is an extraordinarily laborious, CPU-intensive and inaccurate method of generating a specific frequency. The STM32 on your board is well endowed with hardware timers with direct GPIO output that will generate the frequency you need accurately with zero software overhead. Even if none of the timers maps to the GPIO output you need to use, you can still get one to generate an interrupt every 1/2f seconds and toggle the pin in the interrupt handler. Either way, that will leave the processor free to do useful stuff while the tone is being output.
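To illustrate that interrupt-driven alternative (again my sketch, with the same Standard Peripheral Library assumptions as the earlier one): run a timer update interrupt at 2f ≈ 1046 Hz and toggle the pin on each interrupt, giving a 523 Hz square wave on the question's GPIOE pin. NVIC priority setup and the GPIOE/RCC configuration are omitted, and the prescaler assumes the timer is clocked at SystemCoreClock:
#include "stm32f4xx.h"  /* CMSIS/SPL device header - placeholder, adjust for your part */

void tone_irq_init(void)
{
    TIM_TimeBaseInitTypeDef tb;

    RCC_APB1PeriphClockCmd(RCC_APB1Periph_TIM3, ENABLE);

    /* Update rate = 1 MHz / (955 + 1) ~= 1046 Hz = 2 * 523 Hz (one toggle per half period) */
    TIM_TimeBaseStructInit(&tb);
    tb.TIM_Prescaler = (uint16_t)(SystemCoreClock / 1000000UL) - 1;  /* 1 MHz counter clock */
    tb.TIM_Period    = 955;
    TIM_TimeBaseInit(TIM3, &tb);

    TIM_ITConfig(TIM3, TIM_IT_Update, ENABLE);
    NVIC_EnableIRQ(TIM3_IRQn);
    TIM_Cmd(TIM3, ENABLE);
}

void TIM3_IRQHandler(void)
{
    if (TIM_GetITStatus(TIM3, TIM_IT_Update) != RESET)
    {
        TIM_ClearITPendingBit(TIM3, TIM_IT_Update);
        GPIOE->ODR ^= 0x4000;   /* toggle the speaker pin used in the question */
    }
}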

Understanding How DelayMS Function Works With PIC18F Microcontroller

I am trying to use a PIC18F45K22 to flash an LED on for a second, then off for a second, repeating indefinitely. The code incorporates a delay function that keeps the PIC busy for one second, but I am not sure how it works (it was arrived at through trial and error). To achieve a 1-second delay, tWait should equal 125,000. I am using an 8 MHz MCU clock frequency with the PLL disabled. I know that 8 MHz corresponds to a 125 ns period, but after that I am stuck. Can anyone please explain to me how it works? I've been at it for a few hours and it's driving me bonkers.
#define SYSTEM_FREQUENCY (8000000) // 8MHz

// Function prototype
void delayS(unsigned int sec);

void main()
{
    // Sets RA0 to output...
    TRISA.RA0 = 0;

    do
    {
        // Turns RA0 off...
        LATA.RA0 = 0;
        delayS(1);

        // Turns RA0 on...
        LATA.RA0 = 1;
        delayS(1);
    } while(1); // Repeat...
}

// My attempt at writing the function...
void delayS(unsigned int sec)
{
    unsigned long tWait, tStart;

    /*
      To achieve a 1 Second On 1 Second Off cycle, tWait should be 125,000.
      How? Why? 64 is arbitrary to achieve 125,000.
    */
    tWait = sec*(SYSTEM_FREQUENCY/64);

    for(tStart = 0; tStart < tWait; tStart++);
}
The clock frequency is 8 MHz, so there are 2·10^6 instruction cycles per second, since one instruction cycle takes 4 clock ticks. Any assembly instruction of the microcontroller takes at least one cycle. There is still a factor of 16 to explain.
The factor of 16 must correspond to one pass of the loop. Since unsigned long variables are used in this loop, it is not surprising that one pass takes a few instructions. You would need to look at the assembly code to understand what is happening and get a precise count, like here.
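Spelling the arithmetic out (my working, under the assumption that the compiled loop body really does take about 16 instruction cycles per pass):
/*
 * Fosc = 8 MHz; a PIC18 instruction cycle is Fosc / 4:
 *     8,000,000 / 4 = 2,000,000 instruction cycles per second
 *
 * tWait for sec = 1:
 *     1 * (8,000,000 / 64) = 125,000 loop iterations
 *
 * Instruction cycles available per iteration for a 1 s delay:
 *     2,000,000 / 125,000 = 16
 *
 * So the "arbitrary" divide-by-64 works only because the compiled
 * 32-bit increment-and-compare loop happens to take about 16 cycles.
 */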
If you are working with MPLAB and the XC8 compiler, there are library functions you can use, such as Delay10KTCYx(unsigned char);.
The microcontroller spends 99.99% of its time counting, which is not critical if you are just flashing LEDs. But to use a PIC efficiently, you will want to avoid such busy-wait functions and use timers and interrupts instead. Have fun!
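For illustration only (not from the original answer), here is one way a hardware timer can take over the timekeeping on the PIC18F45K22, assuming the XC8 compiler and the same 8 MHz clock (Fcy = 2 MHz). With the 1:256 prescaler, 2 MHz / 256 ≈ 7813 counts per second, so the 16-bit Timer0 is preloaded to overflow after roughly one second; an interrupt-driven version would free the CPU completely instead of polling the flag:
#include <xc.h>

/* Timer0-based 1 s delay: the timing now comes from the hardware timer
 * rather than from counting instructions. Assumes Fosc = 8 MHz, so
 * Fcy = 2 MHz; with prescaler 1:256 that is ~7813 counts per second.  */
#define T0_PRELOAD (65536UL - 7813UL)

void delayS_timer(unsigned int sec)
{
    while (sec--)
    {
        T0CON = 0x07;                               /* Timer0 off, 16-bit, internal clock, prescaler 1:256 */
        TMR0H = (unsigned char)(T0_PRELOAD >> 8);   /* write the high byte first                           */
        TMR0L = (unsigned char)(T0_PRELOAD & 0xFF);
        INTCONbits.TMR0IF = 0;                      /* clear the overflow flag                             */
        T0CONbits.TMR0ON = 1;                       /* start the timer                                     */
        while (!INTCONbits.TMR0IF)                  /* wait ~1 s for the overflow                          */
            ;
        T0CONbits.TMR0ON = 0;                       /* stop the timer                                      */
    }
}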

USB Interrupt Masks not loading STM32L151CC

I'm currently encountering a strange issue with the STM USB libraries. I am able to successfully load firmware onto the STM32L152D-EVAL board (which uses an STM32L152ZD); however, I am unable to get the same code, modified accordingly, to work on my form-factor board, which uses the aforementioned STM32L151CC.
After stepping through the code with the debugger (a ULINK2, using the Keil uVision4 IDE), I noticed that the code would crash while setting the interrupt mask in the function USB_SIL_Init():
uint32_t USB_SIL_Init(void)
{
    /* USB interrupts initialization */
    /* clear pending interrupts */
    _SetISTR(0);
    wInterrupt_Mask = IMR_MSK;
    /* set interrupts mask */
    _SetCNTR(wInterrupt_Mask);
    return 0;
}
More specifically, _SetCNTR(wInterrupt_Mask); is what gives me the error. I haven't changed the value of IMR_MSK between the two boards. Its value is given as
#define IMR_MSK (CNTR_CTRM | CNTR_WKUPM | CNTR_SUSPM | CNTR_ERRM | CNTR_SOFM \
| CNTR_ESOFM | CNTR_RESETM )
which is 0xBF00
_SetCNTR is defined as follows
#define _SetCNTR(wRegValue) (*CNTR = (uint16_t)wRegValue)
With CNTR being defined as
/* Control register */
#define CNTR ((__IO unsigned *)(RegBase + 0x40))
And RegBase is
#define RegBase (0x40005C00L) /* USB_IP Peripheral Registers base address */
I'm currently looking through ST's documentation on this, but I can't seem to find anything specifically relating to the default states of the two different chips. I'm guessing it has something to do with the base address; however, the datasheet shows that this is the correct address.
Can anyone help me out on this?
Thanks!

FRDM-KL46Z Internal Reference clock to 48 MHz?

I am working with the IAR compiler on the FRDM-KL46Z platform.
I want to use the internal clock and set it to 48 MHz (or as high as possible).
So far I have made the following changes in the provided example sysinit.c file, in the function sysinit():
#define NO_PLL_INIT
#if defined(NO_PLL_INIT)
    mcg_clk_hz = 48000000; // It only works at 21000000 Hz; otherwise I get garbage prints on UART0.
    SIM_SOPT2 &= ~SIM_SOPT2_PLLFLLSEL_MASK;
    uart0_clk_khz = (mcg_clk_hz) / 1000;
#else
....
Starting from FEI mode, if I switch to FBI or BLPI mode I get a much lower MCU clock.
I want the MCU clock to be as high as possible while using the internal clock. (According to the datasheet I think this is supported, but I don't know how.)
Can anyone please explain, or point me to a code reference? Much obliged.
Fixed it by doing this:
#define NO_PLL_INIT
#if defined(NO_PLL_INIT)
    MCG_C4 |= (MCG_C4_DRST_DRS(1) | MCG_C4_DMX32_MASK);
    mcg_clk_hz = 48000000;
    SIM_SOPT2 &= ~SIM_SOPT2_PLLFLLSEL_MASK;
    uart0_clk_khz = (mcg_clk_hz) / 1000;
#else
....
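For context, my own note on why this lands near 48 MHz, assuming FEI mode with the 32.768 kHz slow internal reference feeding the FLL:
/*
 * Default FEI:  DMX32 = 0, DRST_DRS = 0  ->  FLL factor 640   ->  32768 Hz * 640  ~= 20.97 MHz
 * After change: DMX32 = 1, DRST_DRS = 1  ->  FLL factor 1464  ->  32768 Hz * 1464 ~= 47.97 MHz
 *
 * The ~21 MHz figure matches the value that worked before the change.
 */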

Setting up multiple timers with AVR

I am trying to set up two timer interrupt routines on a Teensy 2.0 microcontroller (based on the ATMEGA32U4, an 8-bit AVR running at 16 MHz) for independent control of two servo motors.
After much trial and error I was able to set one up on pin 7 of port C, but:
How do I set up the second ISR so it is initialized and called independently of the first?
Do I need to set up a second timer and, if so, what would such code look like?
Here is the setup code:
int main(void)
{
    DDRE = 0xFF;

    TCCR1A |= 1 << WGM12; // Configure timer 1 for CTC mode
    TCCR1B = (1<<WGM12) | (1<<CS11);
    OCR1A = 1000; // initial
    TIMSK1 |= 1 << OCIE1A; // Output Compare A Match Interrupt Enable
    sei(); // enable interrupts

    // ...code that sets pulseWidth based on app logic variable.
    // Not showing as it's not important
}

ISR(TIMER1_COMPA_vect)
{
    if (0 == pulseWidth)
    {
        return;
    }

    static uint8_t state = 0;
    int dutyTotal = 20*1000;

    if (0 == state)
    {
        PORTC |= 0b10000000;
        OCR1A = pulseWidth;
        state = 1;
    }
    else if (1 == state)
    {
        PORTC &= 0b01111111;
        OCR1A = dutyTotal - pulseWidth;
        state = 0;
    }
}
While it's difficult to give a definitive answer without knowing more about your application (e.g. what kind of servo/motor - I'm guessing model RC type with a 1-2 ms pulse?), there are two approaches to solving the problem:
Firstly, in your code you seem to be manually generating a PWM signal by toggling PC7. You could add another output by increasing your number of states - you need one more state than the number of servos, to give the gap which sets the pulse repetition frequency. This is a common technique when you need to drive a lot of servos: since most RC servos don't care about pulse phasing or frequency (within limits), only the pulse width, you can generate a bunch of different pulses one after the other on different outputs while using only one timer, like this (in a sort of pseudo-code state diagram):
State 0:
    Turn on output 1
    Set timer TOP to pulse duration 1.
    Go to state 1.
State 1:
    Turn off output 1
    Turn on output 2
    Set timer TOP to pulse duration 2.
    Go to state 2.
State 2:
    Turn off output 2
    Set timer TOP to pulse duration 3.
    Go to state 0.
"Pulse duration 3" sets the PRF (pulse repetition frequency). If you want get fancy,
you can set this to 1/f-pd1-pd2, to give a constant frequency.
[ "TOP" is AVR-speak for the thing that sets the wrap (overflow) rate of the timer. See data sheet. ]
Secondly, there is a much easier way if you're only using two servos: use the hardware PWM functionality of the timer. The AVR timers have a built-in PWM function to do the pin-toggling for you. Timer1 on the ATmega32U4 has PWM output pins OC1A and OC1B (among others), which could work great for your two servos, and you then don't (necessarily) need an interrupt handler at all.
This is also the right solution if you are PWM-driving motors directly (e.g. through an H-bridge).
To do this, you need to put the timer into PWM mode and enable the OC1A and OC1B output pins, e.g.:
/*
* Set fast PWM mode on OC1A and OC1B with ICR1 as TOP
* (Mode 14)
*/
TCCR1A = (1 << WGM11) | (1 << COM1B1) | (1 << COM1A1);
TCCR1B = (3 << WGM12);
/*
* Clock source internal, pre-scale by 8
* (i.e. count rate = 2MHz for 16MHz crystal)
*/
TCCR1B |= (1 << CS11);
/*
* Set counter TOP value to set pulse repetition frequency.
* E.g. 50Hz (good for RC servos):
* 2e6/50 = 40000. N.B. This must be less than 65535.
* The timer takes TOP+1 counts per period, so subtract 1 for the true frequency.
*/
ICR1 = 40000-1;
/* Enable OC1A and OC1B PWM output */
DDRB |= (1 << PB5) | (1 << PB6);
/* Uncomment to enable TIMER1_OVF_vect interrupts at 50Hz */
/* TIMSK1 = (1 << TOV1); */
/*
* Set both servos to centre (1.5ms pulse).
* Value for OCR1x is 2000 per ms then subtract one.
*/
OCR1A = 3000-1;
OCR1B = 3000-1;
Disclaimer - this code fragment compiles but I have not checked it on an actual device so you may need to double-check the register values. See the full datasheet at http://www.atmel.com/Images/doc7766.pdf
Also, you perhaps have some typos in your code: bit WGM12 doesn't exist in TCCR1A (you've actually set bit 3, which is FOC1A - "force output compare"; see the datasheet). And you are writing DDRE to enable outputs on port E but toggling a pin on port C.
Halzephron, thank you so much for your answer. I don't have high enough reputation to mark your answer; hopefully someone else will.
Details below:
You are absolutely right about being able to use a single ISR to control a number of servos - embarrassing that I did not think of that - but given how well your simpler solution worked, I'll just use that.
... Also, you are writing DDRE to enable outputs on port E but toggling a pin on port C.
Thanks, I commented out that line and it works the same (so I don't need to enable the output at all?).
Bit WGM12 doesn't exist in TCCR1A (you've actually set bit 3, which is FOC1A - "force output compare"; see the datasheet).
I removed that too, leaving the rest of my code unchanged - it results in the servos moving slower, with far less torque, and jittering as they move, even after arriving at the desired position. The servo makes a weird "shaky" noise (~10-20 Hz, I'd say) and the arm trembles, so for reasons I don't understand, setting this bit seems necessary.
I suspected that a timer per motor would be highly inelegant, so I really like your second approach (built-in, timer-generated PWM).
... This is also the right solution if you are PWM driving motors directly (e.g. through an H-Bridge.)
Very curious - why? I.e. what's the difference between using this method vs., say, ISR-generated PWM?
In your code, I had to change the line below to get 50 Hz; otherwise I was getting 25 Hz (verified with a scope):
ICR1 = 20000-1; // was 40000 - 1;
One other thing I noticed with the scope was that my ISR-generated PWM code produces a less "rectangular" shape than the PWM code snippet you attached. It takes about 0.5 ms for the signal to drop to 0 with my code, and it's absolutely instantaneous with yours (which is great). This solved another problem I had been bashing my head against: I could get analog servos to work fine with my code (ISR-generated PWM), but any digital servo I tried just did not move, as if it were broken. I guess the shape of the signal is critical for digital servos; I never read that anywhere. Or maybe it's something else I don't know.
As a side note, another weirdness I spent a bunch of time on: I uncommented the TIMSK1 = (1 << TOV1); line, thinking I always needed it, but then my main function would freeze, blocked forever on the first call to delay_ms(...). I'm not sure what causes it, but keeping it commented out unblocks my main loop (where I read the servo position values from the USB HID using Teensy's sample code).
Again, many thanks for the help.
