Difference between GPIOx_BSRR and GPIOx_BRR on STM32F1xx? - arm

I'm reading the reference manual of the STM32F1xx family.
Pages 173 & 174 describe the BSRR and BRR registers of the GPIO peripheral.
I don't understand the difference between the reset part of the BSRR register and the BRR register. They both reset the bits written as 1 and leave the others untouched, right? What's the purpose of BRR then?
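For reference, here is the behaviour being compared, sketched in CMSIS-style C (PC13 is just an example pin; names as in the STM32F1 device header). The practical difference is that BSRR can both set and reset pins, even in the same write, while BRR can only reset:

/* Both of these atomically reset PC13; no read-modify-write involved: */
GPIOC->BSRR = (1u << (13 + 16));   /* upper half of BSRR resets the pin */
GPIOC->BRR  = (1u << 13);          /* BRR resets the pin directly */

/* Only BSRR can also set, and it can set and reset in one write: */
GPIOC->BSRR = (1u << 5) | (1u << (13 + 16));   /* set PC5, reset PC13 together */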

Related

How is an interrupt enabled and disabled in the IRQ on the processor side? [STM32]

From what I understand, the interrupts are enabled using the approach below:
Steps 1) through 9) are what is followed in general EXTI programming.
There are two sides to enabling and clearing the interrupt:
Peripheral side: enabled via the SYSCFG and EXTI blocks, and also cleared in the pending register EXTI_PR once an interrupt has occurred.
Processor side: NVIC->ISER to enable. But although NVIC->ICPR is there, I don't see it being used to clear the pending state on the processor side. Why?
Are there also peripheral-side interrupt sources that don't have a pending register that needs to be cleared?
Any document that explains these would also be greatly appreciated.
It is the same with all other peripheral registers. The clear path is not important from the programmer's point of view; what matters is the information on how to clear the bit.
To clear the interrupt, you need to clear the corresponding bit in the EXTI_PR register.
Clearing sometimes happens indirectly (for example, by reading the data register of some communication peripherals). Some peripherals have special registers only for clearing the pending bits. Example: DMA.
Everything is in the Reference Manual.
Clearing the flag in the peripheral register deasserts the interrupt line going from the peripheral to the NVIC.
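A minimal sketch of that pattern, assuming an STM32F1 with CMSIS-style register names (EXTI0_IRQHandler is the standard vector name for EXTI line 0):

#include "stm32f1xx.h"   /* assumed device header */

void EXTI0_IRQHandler(void)
{
    if (EXTI->PR & EXTI_PR_PR0)   /* line 0 pending? */
    {
        EXTI->PR = EXTI_PR_PR0;   /* write 1 to clear; deasserts the line to the NVIC */
        /* ... handle the event ... */
    }
}

The NVIC pending bit for the IRQ is cleared automatically when the handler is entered, so there is normally no need to touch NVIC->ICPR; you only have to clear the peripheral-side flag so the line does not immediately pend again.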

STM32 Bare Metal C - Can't get LED to work

So I'm currently following a course on the STM32 microcontroller; however, I seem to fail at even the most basic thing: turning an LED on. The complete code is at the bottom of this post.
Important:
The hardware is functioning properly.
I am using a STM32L432KC.
First of all, we have to figure out which pin the built-in LED is on. According to the manufacturer's manual, the LED should be on pin D13 (PB3).
Okay, so we're looking for PB3. According to the datasheet of the STM32L432KC, PB3 is on port B and is therefore connected to the high-performance bus, as the datasheet's bus diagram suggests.
Cool. So our bus is AHB2 and we're working with GPIOB. Now we have to enable the clock on that bus using the RCC_AHB2ENR register. Now, this is the part where I'm probably going to make mistakes (as this post otherwise wouldn't exist), so please pay close attention. If I understand correctly, I want bit 1 set to 1, as this sets 'GPIOBEN' to 'IO port B clock enabled'.
This leads me to believe that I should set the bus register as follows:
RCC->AHB2ENR |= 0x2;
Next up I have to set the mode of the GPIO pin to output. According to the course and my documentation, this is done using GPIOx_MODER.
This leads me to believe that I should set the GPIO mode as follows:
GPIOB->MODER |= 0x40;
And last but not least, to turn the actual LED on, we have to set the output data register, which is GPIOx_ODR.
This leads me to believe that I should set the data as follows:
GPIOB->ODR = 0x8;
I'm not sure where I'm going wrong but this is the first time I'm working with registers on such a deep level. I must be overlooking something but I've tried multiple examples and had no success. All help is appreciated.
This is the complete code:
// PB3 - User LED
// RCC->AHB2ENR
// GPIOx_MODER
// GPIOx_ODR
#include "stm32l4xx.h"

int main(void)
{
    RCC->AHB2ENR |= 0x2;
    GPIOB->MODER |= 0x40;

    while (1)
    {
        GPIOB->ODR = 0x8;
    }
}
Your mode register is not configured correctly. Your line of code
GPIOB->MODER |= 0x40;
can only set bits; it cannot clear them. And you have too many bits set, as the documented reset value of each pair is 11 and the entire register resets to 0xFFFFFFFF for ports C-E and 0xFFFFFEBF for port B.
You should use
GPIOB->MODER = (GPIOB->MODER & 0xFFFFFF3F) | 0x00000040;
although, if the pair really were at its documented reset state of 11, this would also work:
GPIOB->MODER &= 0xFFFFFF7F; // equivalently, &= ~0x0080
However, the documentation's note that 11 (analog mode) is the reset state is not accurate for all pins. Several are in 10 (alternate function) mode at reset, including PB3, so clearing bit 7 alone would leave PB3 as an input (00). You need to both clear one bit and set one.
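Putting that together, a minimal corrected sketch, assuming the CMSIS device header (raw bit masks are used here instead of the header's bit-definition macros; the read-back of AHB2ENR is a commonly recommended way to make sure the clock is up before touching the port):

#include "stm32l4xx.h"

int main(void)
{
    RCC->AHB2ENR |= (1u << 1);   /* GPIOBEN: clock for port B */
    (void)RCC->AHB2ENR;          /* read back so the enable settles */

    /* MODER3 (bits 7:6) goes from 10 (alternate function) to 01 (output) */
    GPIOB->MODER = (GPIOB->MODER & ~(3u << 6)) | (1u << 6);

    GPIOB->ODR |= (1u << 3);     /* drive PB3 high: LED on */

    while (1)
    {
    }
}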

Can I use PCINT0 and PCINT1 for all pin interrupts on atmega328pb?

I am trying to make it so that when PINB7 is pressed (it is the button's pin), an LED lights up.
PINB7 is PCINT8 on the board.
So I set:
PCICR |= (1<<1);       // enable interrupts for pins 14-8
sei();
PCMSK1 |= (1<<PCINT8); // mask for pin 8
I don't get what vector I should use in the ISR. From what I saw, I should just use PCINT8_vect, but that vector doesn't get highlighted like when I use "TIMER2_COMPB_vect". So does the PCINT8 vector not exist, or is there a way to use PCINT0 and PCINT1 for this?
Apparently, on the mega328PB, there are no vectors for individual pin interrupts, but there are vectors for the groups PCIE0, 1, 2, 3:
0 - PCINT0:7
1 - PCINT8:14
2 - PCINT16:23
3 - PCINT24:27
So if you want to use an interrupt for pin B7:
PB7 is PCINT7 (this can be seen in the chapter about I/O ports).
So I'd have to enable PCIE0 in the PCICR register, set the corresponding mask bit in PCMSK0, and use ISR(PCINT0_vect).
But if you have interrupts on both PCINT6 and PCINT7, you need an if/else in the ISR to determine which of the two pins triggered the interrupt, as in the sketch below.
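A minimal sketch of that setup, assuming avr-gcc/avr-libc and, purely for illustration, an LED on PB0 (the button pin PB7 uses the internal pull-up, so a press reads low):

#include <avr/io.h>
#include <avr/interrupt.h>

ISR(PCINT0_vect)                 /* one vector for the whole PCINT7:0 group */
{
    if (!(PINB & (1 << PINB7)))  /* PB7 low = button pressed */
        PORTB |= (1 << PORTB0);  /* hypothetical LED on PB0 */
}

int main(void)
{
    DDRB  = (1 << DDB0);         /* PB0 output (LED), PB7 input (button) */
    PORTB = (1 << PORTB7);       /* pull-up on PB7 */

    PCICR  |= (1 << PCIE0);      /* enable pin-change group 0 (PCINT7:0) */
    PCMSK0 |= (1 << PCINT7);     /* unmask PCINT7 = PB7 */
    sei();

    while (1)
    {
    }
}

Note that a pin-change interrupt fires on both edges, so the ISR reads PINB to check whether the pin is actually low.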

ATMega128 Output Flicker on Startup

I'm using an ATMega128 micro and have all of my pins initialised as outputs and set low at the start of my main code:
PORTB=0x00;
DDRB=0xFF;
However, on startup, the output associated with PORTB.0 flicks high for a split second (I've caught it on the scope), and it seems the other outputs are doing the same. It looks like it goes LOW-HIGH-LOW. I've read that it could be caused by the tri-state-to-output switch during startup, so I've set the PUD register to 1 before the pin inits and back to 0 after, and still no luck. Does anyone have any other ideas to keep that output off during startup? It doesn't always occur either, which is what has me stumped.
The fundamental problem is a hardware issue - lack of a pull-down resistor on the GPIO so that it is floating when in the reset-default high-impedance input state.
The best you can do in software is to initialise the GPIO at the earliest opportunity immediately after the reset. To do this in CodeVisionAVR you need to use a customised startup.asm in your project, as described in section 4.18 of the CodeVisionAVR compiler manual:
...
Where I suggest you initialise PORTB and DDRB as follows:
LDI R16, 0x00
OUT PORTB, R16
LDI R16, 0xFF
OUT DDRB, R16
immediately before step 2, i.e. as the first four instructions. The amount of time the GPIO is left floating will then possibly be too small for the relay to react, if it is a mechanical relay. You may still have a problem with a solid-state relay. The length of any pulse may also depend on the power-supply rise time; if it is slow, you may get a longer pulse.
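For comparison, if you are on avr-gcc/avr-libc rather than CodeVisionAVR, the equivalent early initialisation can be hooked into the startup code with an .init section; a minimal sketch (this mechanism is avr-gcc specific and not part of the answer above):

#include <avr/io.h>

/* .init3 runs after the stack is set up but before main() and static
   initialisation; "naked" omits the function prologue/epilogue. */
__attribute__((naked, used, section(".init3")))
static void early_port_init(void)
{
    PORTB = 0x00;   /* outputs low */
    DDRB  = 0xFF;   /* all of port B as outputs */
}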

ATmega16 programming - register mismatches

I am programming an ATmega16 and I chose that controller in the device manager (Atmel Studio 6.2), but the registers don't match the registers in the datasheet for the ATmega16. I am using an ICE 3 and I tried with the simulator, but the result is the same. The UCSRC register is different, as if from some other controller, and I can't write to it, even though I set the MSB to one.
According to the datasheet of the ATmega16, it is not required to set the MSB of UCSRC, as its initial value is 1.
When you are working with the UART you need to make sure that:
You are setting UBRRL and UBRRH properly, according to the controller clock, using the formula below:
UBRR = FOSC / 16 / BAUD - 1
You enable RXEN and TXEN in UCSRB.
You set the proper bits in UCSRC according to your requirements for the stop bit and parity bit.
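On the ATmega16 this matters because UCSRC shares its I/O address with UBRRH: a write reaches UCSRC only when the URSEL bit (the MSB) is set in the value being written. A minimal init sketch along the lines above, assuming an 8 MHz clock and 9600 baud 8N1 (both assumptions, adjust for your board):

#include <avr/io.h>
#include <stdint.h>

#define F_CPU 8000000UL   /* assumed clock */
#define BAUD  9600UL      /* assumed baud rate */

static void uart_init(void)
{
    uint16_t ubrr = F_CPU / 16 / BAUD - 1;

    UBRRH = (uint8_t)(ubrr >> 8);   /* URSEL = 0: this write goes to UBRRH */
    UBRRL = (uint8_t)ubrr;
    UCSRB = (1 << RXEN) | (1 << TXEN);                  /* enable RX and TX */
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0); /* URSEL = 1: UCSRC, 8N1 */
}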
