ATmega16 programming - register mismatch - C

I am programming an ATmega16 and I chose that controller in the device manager (Atmel Studio 6.2), but the registers don't match the registers in the ATmega16 datasheet. I am using an ICE 3 and I tried with the simulator, but the result is the same. The UCSRC register is different, as if from some other controller, and I can't write to it, even though I set the MSB to one.

According to the ATmega16 datasheet, it is not required to set the MSB of UCSRC, as its initial value is 1.
When you are working with the UART, you need to make sure that:
You set UBRRL and UBRRH properly for your controller clock, using the formula UBRR = FOSC / (16 × BAUD) − 1
You enable RXEN and TXEN in UCSRB
You set the proper bits in UCSRC according to your stop-bit and parity requirements (a minimal sketch of all three steps follows)
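Putting those steps together, a minimal sketch, assuming an ATmega16 running at 8 MHz with an 8N1 frame (F_CPU and BAUD are placeholder values; adjust them to your hardware):

#include <avr/io.h>

#define F_CPU 8000000UL                 /* assumed clock; adjust to your board */
#define BAUD  9600UL                    /* assumed baud rate                   */

void uart_init(void)
{
    uint16_t ubrr = (F_CPU / (16UL * BAUD)) - 1;   /* UBRR = FOSC/(16*BAUD) - 1 */

    UBRRH = (uint8_t)(ubrr >> 8);       /* baud rate, high byte                */
    UBRRL = (uint8_t)ubrr;              /* baud rate, low byte                 */
    UCSRB = (1 << RXEN) | (1 << TXEN);  /* enable receiver and transmitter     */
    /* URSEL set to 1 addresses UCSRC, which shares its I/O location with UBRRH */
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0);  /* 8 data, no parity, 1 stop */
}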

Related

Cannot change value of LCD_CR register

I am currently playing with the L152C Discovery board, trying to make a simple clock that would use the RTC built into the STM32 and the onboard glass LCD, with the LCD HAL library configured via CubeMX.
But I am currently facing a problem I can't get my head around:
CubeMX does not have an option to enable segment mux in the LCD_CR register. I would like to enable it, because it would make the segment mapping easier.
So I thought, fine, I will manipulate the register directly and enable the mux (bit 7 in LCD_CR).
I used the command LCD->CR |= LCD_CR_MUX_SEG; but even after executing it, the MUX_SEG bit is still zero (I checked in a debug session with command stepping and the SFR memory map).
Is there something I am doing wrong? Or is there another way to change init parameters that CubeMX configured but offers no graphical option for?
The application uses FreeRTOS, and I executed LCD->CR |= LCD_CR_MUX_SEG; after HAL_LCD_Init(&hlcd);, so I suppose the LCD peripheral clock is running (and the segments are updating).
I recorded a short video showing this problem:
https://youtu.be/0X6Zu5EPudU
To be honest, I am not skilled at direct register manipulation, so I am probably doing something wrong.
Any help would be appreciated!😇
As @KIIV said:
RM0038, Liquid crystal display controller (LCD): Note: The VSEL, MUX_SEG, BIAS and DUTY bits are write-protected when the LCD is enabled (ENS bit in LCD_SR set to 1).
The LCD must be disabled when making changes to the bits above.
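In code, that means something like the following sketch (bit names from the STM32L1 device header; the wait loop is my assumption about how to confirm the controller is actually off before writing):

LCD->CR &= ~LCD_CR_LCDEN;           /* disable the LCD controller              */
while (LCD->SR & LCD_SR_ENS) { }    /* wait until the ENS status bit clears    */
LCD->CR |= LCD_CR_MUX_SEG;          /* the write-protected bit can be set now  */
LCD->CR |= LCD_CR_LCDEN;            /* re-enable the LCD                       */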

STM32 Bare Metal C - Can't get LED to work

So I'm currently following a course on the STM32 microcontroller; however, I seem to fail at even the most basic thing: turning an LED on. The complete code is at the bottom of this post.
Important:
The hardware is functioning properly.
I am using an STM32L432KC.
First of all, we have to figure out which pin the built-in LED is on. According to the manufacturer's manual, the LED should be on pin D13 (PB3).
Okay, so we're looking for PB3. According to the STM32L432KC datasheet, PB3 is on port B and is therefore connected to the high-performance bus, AHB2.
Cool. So our bus is AHB2 and we're working with GPIOB. Now we have to enable the clock on that bus using the RCC_AHB2ENR register. Now, this is the part where I'm probably making mistakes (as this post otherwise wouldn't exist), so please pay close attention. If I understand correctly, I want bit 1 to be set to 1, as this sets 'GPIOBEN' to 'IO port B clock enabled'.
This leads me to believe that I should set the bus register as follows:
RCC->AHB2ENR |= 0x2;
Next up, I have to set the mode of the GPIO pin to output. According to the course and my documentation, this is done using GPIOx_MODER.
This leads me to believe that I should set the GPIO mode as follows:
GPIOB->MODER |= 0x40;
And last but not least, to turn the actual LED on, we have to set the output data register, which is GPIOx_ODR.
This leads me to believe that I should set the data as follows:
GPIOB->ODR = 0x8;
I'm not sure where I'm going wrong, but this is the first time I'm working with registers at such a low level. I must be overlooking something, but I've tried multiple examples and had no success. All help is appreciated.
This is the complete code:
// PB3 - User LED
// RCC->AHB2ENR
// GPIOx_MODER
// GPIOx_ODR
#include "stm32l4xx.h"
int main(void)
{
    RCC->AHB2ENR |= 0x2;
    GPIOB->MODER |= 0x40;

    while (1)
    {
        GPIOB->ODR = 0x8;
    }
}
Your mode register is not configured correctly. Your line of code
GPIOB->MODER |= 0x40;
can only set bits; it cannot clear them. And you end up with too many bits set: the reset value of each two-bit field is 11 for most pins, and the whole register resets to 0xFFFFFFFF for ports C-E but 0xFFFFFEBF for port B.
You should use
GPIOB->MODER = (GPIOB->MODER & 0xFFFFFF3F) | 0x00000040;
although, because the reset state is guaranteed, keeping your |= line and following it with this will also work:
GPIOB->MODER &= 0xFFFFFF7F; // equivalently, &= ~0x0080
The note in the documentation that 11 (analog mode) is the reset state is not accurate for all pins. Several pins are in 10 (alternate function) mode at reset, PB3 among them. So you need to both clear one bit and set one to end up with 01 (general-purpose output).
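Putting that together, a corrected sketch of the complete program might look like this (using the device-header bit names instead of magic numbers; the read-back of AHB2ENR is a common precaution, not something from the original post):

#include "stm32l4xx.h"

int main(void)
{
    RCC->AHB2ENR |= RCC_AHB2ENR_GPIOBEN;    /* enable the GPIOB clock           */
    (void)RCC->AHB2ENR;                     /* dummy read so the clock settles  */

    /* PB3: MODER bits 7:6 go from 10 (alternate function) to 01 (output) */
    GPIOB->MODER = (GPIOB->MODER & ~GPIO_MODER_MODE3) | GPIO_MODER_MODE3_0;

    GPIOB->ODR |= GPIO_ODR_OD3;             /* drive PB3 high: LED on           */

    while (1)
    {
    }
}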

VREF Output on STM32L0

I have an STM32L051 and want to drive an external DAC (SPI).
For that I would like to use the feature, mentioned in the manual, to output the internal reference voltage to the PB1 pin of the STM32.
I use the STM32Cube HAL as a basis. However, the examples of using VREFINT are limited to internal use by the ADCs and comparators.
If I understand correctly, I can use the CFGR3 register both to enable VREFINT and to connect it to PB1. Using the Cube drivers, I can call the HAL_SYSCFG_VREFINT_OutputSelect(SYSCFG_VREFINT_OUT_PB1) function, but to enable it I should use either HAL_ADCEx_EnableVREFINT() or HAL_COMPEx_EnableVREFINT(). The manual's information on SEL_VREF_OUT indicates that ENBUF_VREFINT_ADC must be set.
Furthermore, no mention is made of the configuration of the pin itself. Should I simply declare it as a DAC pin? An ADC pin?
Answer
It is as simple as
if (HAL_ADCEx_EnableVREFINT() != HAL_OK)
{
    Error_Handling();
}
HAL_SYSCFG_VREFINT_OutputSelect(SYSCFG_VREFINT_OUT_PB1);
And I can see the 1.22 V on the PB1 output.
It does not require further pin (GPIO) configuration.
Complications and justification for the question (can be skipped)
I had some issues with the board from our electronics department and thus switched to the STM32L053 Discovery board. The above solution did not work there, and I kept seeing 0 V on PB1 (or PB0).
I assumed that was due to some missing configuration. However, after some further tests, I found that on that Discovery board both PB1 and PB0 are reserved for a sensor. By closing the SB23 solder bridge, I could route PB1 back to the GPIO and thus see the reference voltage on the pin.

STM32F1 - Using master SPI on bare metal

I've been trying to port some of my AVR code that drives a simple SPI LCD to ARM as a learning exercise (I'm very new to ARM in general). For this I just need to use SPI in master mode.
I looked in the datasheet for my device (STM32F103C8) and found that the SPI1 pins I need, SCK and MOSI, are mapped as alternate functions of PA5 and PA7, respectively, along with other peripherals (pg. 29). My understanding is that in order to use the SPI function on these pins, I need to make sure that anything else mapped to the same pins is disabled. Looking at the defaults of the peripheral clock control register, however, it seems the other features are already disabled.
I looked at the SPI section in the reference manual, including section 25.3.3, 'Configuring the SPI in master mode'. First I enabled the SPI1 master clock in APB2ENR and followed the steps in this section to configure SPI1 to my needs. I also changed the settings for PA5/PA7, setting their mode to 'Alternate Function Output push-pull' (9.1.4). Finally, I enabled SPI1 by setting CR1_SPE.
From my reading, I had thought that loading a value into the SPI1 data register after configuring SPI as above would shift the data out. However, after writing the data, the TXE flag in the SPI status register never becomes set, indicating that the data I wrote is just sitting there.
At this point, I'm assuming there is something else I've failed to configure correctly. For example, I'm not 100% sure what to do with the PA5/PA7 pins. I've tried to understand what I can from the datasheets, but I'm not getting anywhere. Is there anything else that needs to be done before it'll work?
I'm almost certain that you did not set the SSM and SSI bits in the SPIx->CR1 register. SPI in these chips is pretty simple: for polled transfers you need to set SSM, SSI, SPE, MSTR, the correct format (LSBFIRST, CPOL, CPHA) and a proper baud rate (BR) in SPIx->CR1 and you're good to go.
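A minimal polled-transmit sketch along those lines, assuming the GPIO and clock setup from the question is already done (bit names from the STM32F1 device header; the baud-rate divider is an arbitrary example):

#include "stm32f1xx.h"

void spi1_init_master(void)
{
    SPI1->CR1 = SPI_CR1_SSM | SPI_CR1_SSI   /* software NSS, held high           */
              | SPI_CR1_MSTR                /* master mode                       */
              | SPI_CR1_BR_1;               /* baud rate fPCLK/8, as an example  */
    SPI1->CR1 |= SPI_CR1_SPE;               /* enable the peripheral             */
}

void spi1_write(uint8_t byte)
{
    while (!(SPI1->SR & SPI_SR_TXE)) { }    /* wait for an empty TX buffer       */
    SPI1->DR = byte;                        /* data starts shifting out on SCK   */
    while (SPI1->SR & SPI_SR_BSY) { }       /* wait for the transfer to finish   */
}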

AVR/Arduino: reading timer toggled port pin

I've configured timer 2 in CTC mode to toggle the port pin on compare match (TCCR2A=0x42, TCCR2B=0x02, OCR2A=0x20) and have set the corresponding DDRB bit to output. Hence, according to the ATmega328P documentation (pages 158-163), OC2A (aka PB3) should toggle on each compare match. Unfortunately, I can't read the pin state at PORTB. Is this expected? I assumed that even if a port is configured as output, I can read back the value that was set.
There were two problems:
In AVR Studio 4.18 I must not use Simulator 1, because it has a bug affecting timer 2 and hence can't toggle the port pin correctly. I needed to use Simulator 2 or AVR Studio 5.
I needed to read PINB instead of PORTB (even though the toggling is an output operation); see the sketch below.
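A minimal sketch of that working setup on the ATmega328P (the polling loop is just an illustration of reading the pin back through PINB):

#include <avr/io.h>

int main(void)
{
    DDRB  |= (1 << DDB3);                   /* OC2A (PB3) as output              */
    TCCR2A = (1 << COM2A0) | (1 << WGM21);  /* toggle OC2A on match, CTC mode    */
    TCCR2B = (1 << CS21);                   /* clk/8 prescaler                   */
    OCR2A  = 0x20;                          /* compare value                     */

    for (;;)
    {
        /* PINB reflects the actual pin state; PORTB only holds the output latch */
        uint8_t pb3 = PINB & (1 << PINB3);
        (void)pb3;
    }
}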
I don't know about that specific microcontroller, but on some architectures you need at least a NOP between changing the port pin and the latch being updated (so you can read the change back).
There is also a maximum frequency at which a pin can be toggled (many times slower than the CPU clock). Be sure not to exceed that frequency.
