STM32F103C8 not receiving UART data from PC? - c

Right now I am working with the STM32F103C8 Blue Pill board, using the UART protocol. I am able to send data to PuTTY and that works, but I cannot receive data sent from PuTTY to the microcontroller. What is the problem? As far as I can tell everything is okay, yet I can't receive data.
Code:
#include "stm32f10x.h" // Device header
volatile char data;
int main(void){
//configure HSI SYSTEM clock .
RCC->CR = RCC_CR_HSION;
RCC->CFGR = RCC_CFGR_SW_HSI;
//Enable Alternate function clock, PORTA and USART1.
RCC->APB2ENR = RCC_APB2ENR_AFIOEN|RCC_APB2ENR_USART1EN|RCC_APB2ENR_IOPAEN;
//Alternate function mode PA9
GPIOA->CRH = GPIO_CRH_MODE9_1|GPIO_CRH_CNF9_1;
//ALternate function mode PA10.
GPIOA->CRH |= GPIO_CRH_MODE10_1|GPIO_CRH_CNF10_1;
//USART1 Enable.
USART1->CR1 = USART_CR1_UE;
//8-bit word length.
USART1->CR1 |= ~(USART_CR1_M);
//baudrate 8MHz,9600 baudrate.
USART1->BRR = 0x341;
//Enable TE,RE bits
USART1->CR1 |= USART_CR1_RE|USART_CR1_TE;
while(1){
//wait for receive data
while((USART1->SR &USART_SR_RXNE )==0){}
data = USART1->DR;
while((USART1->SR & USART_SR_TXE)== 0){}
USART1->DR = data;
while((USART1->SR &USART_SR_TC) == 0){}
}
}
This code is supposed to receive data from the PC and send it back to the PC.

I found the solution to the above problem.
1. Don't configure the RX pin as an alternate-function output while you are receiving data.
//Alternate function mode PA10.
GPIOA->CRH |= GPIO_CRH_MODE10_1|GPIO_CRH_CNF10_1;
Here the RX pin is set to alternate-function output push-pull mode -- that is wrong.
2. Configure this pin as a floating input, or as an input with pull-up/pull-down -- that is correct when receiving data from the PC (according to the reference manual).
3. Don't use the HSI clock for this; if you use the HSI clock on this board you will get wrong data, so the HSI clock is not a good idea here. Enable the HSE clock as the system clock and you will get proper output when receiving data from the PC.
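As a minimal sketch of the RX-pin fix above (written in the same register-level style as the question), PA10 can be put into floating-input mode like this:
//PA10 (USART1 RX) as floating input: MODE10 = 00, CNF10 = 01.
GPIOA->CRH &= ~(GPIO_CRH_MODE10 | GPIO_CRH_CNF10); //clear the PA10 configuration bits first
GPIOA->CRH |= GPIO_CRH_CNF10_0; //CNF10 = 01: floating input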

Related

How do I read nRF24L01 (slave) register data using an STM32F103 (master) through the MISO pin?

I'm trying to make a custom library for the nRF24L01 for an STM32F103 target device, and I am writing code for a primary TX device.
Here I'm trying to read the register contents of the nRF by sending the R_REGISTER command along with the address I am looking for, but I'm not able to figure out how to read the data back after the R_REGISTER command has been transmitted.
I'm using the standard stm32f10x.h header file which comes along with the startup files in Keil uVision5.
Here are the configurations,
clock setup
RCC->CR |= RCC_CR_HSION; //HSI on
while( !(RCC_CR_HSIRDY & (RCC->CR)) ); //wait till its ready
//clocks for peripherals
RCC->APB2ENR |= RCC_APB2ENR_IOPAEN; //enable clock for port A
RCC->APB2ENR |= RCC_APB2ENR_AFIOEN; //enable clock for alternate functions
RCC->APB2ENR |= RCC_APB2ENR_SPI1EN; //enable clock for SPI1
GPIO setup
These are my custom-defined functions; they work fine.
//GPIO pin setup as alternate function
pin_mode(IOPA, GPIOA, 7, op_50MHz, op_afpp); //MOSI pin as alternate-function push-pull output, up to 50MHz
pin_mode(IOPA, GPIOA, 6, ip, ip_pupd); //MISO pin as input with pull-up/pull-down
pin_mode(IOPA, GPIOA, 5, op_50MHz, op_afpp); //SCK pin as alternate-function push-pull output, up to 50MHz
pin_mode(IOPA, GPIOA, 4, op_50MHz, op_gppp); //CS pin as general-purpose push-pull output, up to 50MHz
SPI setup
SPI1->CR1 |= SPI_CR1_MSTR; //master mode
SPI1->CR1 |= SPI_CR1_BR_0 | SPI_CR1_BR_1 | SPI_CR1_BR_2; //at 571Kbps, max 31Mbps
SPI1->CR1 |= SPI_CR1_SSI; //Software slave management enabled
SPI1->CR2 |= SPI_CR2_SSOE; //SS o/p enable
SPI1->CR1 |= SPI_CR1_SPE; //turn on the SPI
I'm stuck here
uint8_t SPI_read_uint8_t(uint8_t addr){
uint8_t reg_val;
//sending the read command first along with address where we are reading from
delay_us(50);
digital_writepin(GPIOA, 4, LOW);
SPI1->DR = (R_REGISTER | addr); //sending the R_REGISTER command along with address
while( (SPI1->SR) & (SPI_SR_BSY) );
//please help here, how do I read the Register data from MISO pin
uint8_t spi_read_write(uint8_t data)
{
while(SPI1->SR & SPI_SR_RXNE) (void)*(volatile uint8_t *)&SPI1->DR; //drain any stale bytes from the receive buffer
*(volatile uint8_t *)&SPI1->DR = data; //write the byte to transmit
while(!(SPI1->SR & SPI_SR_RXNE)); //wait until the reply byte has been received
return *(volatile uint8_t *)&SPI1->DR; //read (and clear) the received byte
}
uint8_t youroperation(uint8_t command, uint16_t *data)
{
uint8_t status;
setCSLine();
status = spi_read_write(command);
*data = spi_read_write(0);
*data |= ((uint16_t)spi_read_write(0)) << 8;
clearCSLine();
return status;
}
youroperation will return the Sn status register.
The command parameter carries the Cn command bits.
data will contain the Dx data bits.
To read data from an SPI slave you need to send dummy bytes, because the master provides the clock for the slave.
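Putting that together, here is a minimal sketch of a register read using the spi_read_write() helper above and the pin helpers from the question (R_REGISTER and digital_writepin come from the question's code; the 0xFF dummy byte and the HIGH constant are assumptions):
uint8_t nrf_read_register(uint8_t addr)
{
uint8_t value;
digital_writepin(GPIOA, 4, LOW); //pull CSN low to start the transaction
(void)spi_read_write(R_REGISTER | addr); //command byte; the byte clocked back is the STATUS register
value = spi_read_write(0xFF); //send a dummy byte to clock the register contents out on MISO
digital_writepin(GPIOA, 4, HIGH); //raise CSN to end the transaction
return value;
}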

STM32L011K4 with DMA using I2C Start Condition not Occurring

I'm attempting to use the STM32L011K4's DMA controller to communicate with slave devices over I2C. Currently, I have no slave devices and am just trying to get the microcontroller to send the start condition out onto the I2C bus, but that is not happening.
When I run this code in debugging mode through the STM32CubeIDE, I notice that the start bit is set, but it never clears even though the reference manual says it should be cleared by hardware once the start condition occurs (page 656 for I2C_CR2).
Monitoring the SDA and SCL lines on my oscilloscope also shows that they are at a logical 1. Note: I'm using the NUCLEO-L011K4 on a breadboard, so the IO pins are tied to Vref through 1 kΩ resistors. All configuration registers appear to contain the desired values while the code is stuck sending the start condition, so I don't believe they are getting clobbered by a random line of code.
I'm not sure what's preventing the start condition from being sent, so any help would be greatly appreciated.
STM32L011K4 Datasheet:
https://www.st.com/content/ccc/resource/technical/document/datasheet/42/c0/ab/e5/71/7a/47/0b/DM00206508.pdf/files/DM00206508.pdf/jcr:content/translations/en.DM00206508.pdf
STM32L011K4 Reference Manual: https://www.st.com/resource/en/reference_manual/dm00108282-ultralowpower-stm32l0x1-advanced-armbased-32bit-mcus-stmicroelectronics.pdf
Initialization code:
void Init_I2C1_DMA() {
/* Basic I2C Initialization for 100 kHz I2C, 24 MHz SYSCLK, /1 APB1 scaler */
RCC->APB1ENR |= RCC_APB1ENR_I2C1EN; // Enable peripheral clock for I2C1
I2C1->CR1 &= ~(I2C_CR1_PE); // Disable I2C1
I2C1->CR1 = 0; // Reset CR1
I2C1->TIMINGR = 0; // Reset timer settings
/* APB1 clock (I2C1 clock) is set in RCC_CFGR reg -- keep at divide by 1 -- 24 MHz SYSCLK
* Refer to table 103 for timing value source. t_presc was found to be 250 ns for 100 kHz I2C, so PRESC was set to match that for SYSCLK = 24 MHz
* All subsequent settings are copied from table 103 from STM32L011K4 reference manual
*/
I2C1->TIMINGR |= (0x5 << 28)|(0x4 << 20)|(0x2 << 16)|(0x0F << 8)|(0x13 << 0);
/* Desired settings:
* RXDMAEN enable, ANF enable.
*/
I2C1->CR1 |= (0x8 << I2C_CR1_DNF_Pos)|I2C_CR1_ERRIE;
I2C1->CR2 = 0; // Reset contents (ACKs are enabled by default)
NVIC_EnableIRQ(I2C1_IRQn);
NVIC_SetPriority(I2C1_IRQn, 0);
/* DMA initialization */
/* Since this is peripheral to memory, we use I2C1_RX, which is available on DMA channels 3,7. We used channel 3, but 7 would work the same. */
RCC->AHBENR |= RCC_AHBENR_DMA1EN; // Enable peripheral clock for DMA1
DMA1_Channel3->CCR &= ~(0x00000001); // Disable Channel 3 DMA
// Configure DMA channel mapping
DMA1_CSELR->CSELR &= ~0x00000F00; // Channel 3 re-mapping mask
DMA1_CSELR->CSELR |= 0x00000600; // Channel 3 re-mapped to I2C1_RX
/* Configure NVIC for DMA */
NVIC_EnableIRQ(DMA1_Channel2_3_IRQn);
NVIC_SetPriority(DMA1_Channel2_3_IRQn, 0);
return;
}
void I2C1_DMA_Start_Read(uint8_t SlaveAddress, uint8_t RegisterAddress, int* MemoryBaseAddress, int BufferSize) {
// We need to put the device address out on the serial line before we can hand it over to the DMA
I2C1->CR1 &= ~(I2C_CR1_PE); // Disable I2C1
DMA1_Channel3->CCR &= ~DMA_CCR_EN; // Disable Channel 3 DMA
DMA1_Channel3->CCR &= ~(0x00007FFF); // Channel 3 DMA mask
// Configure DMA Channel 3 for 16 bit memory and peripheral, and other aliased settings (reference manual page 249, 10.4.3)
DMA1_Channel3->CCR |= (0b01 << 10)|(0b01 << 8)|DMA_CCR_MINC|DMA_CCR_TEIE|DMA_CCR_TCIE;
DMA1_Channel3->CPAR = (uint32_t) RegisterAddress;
DMA1_Channel3->CMAR = (uint32_t) MemoryBaseAddress;
DMA1_Channel3->CNDTR = (uint16_t) BufferSize;
I2C1->CR1 &= (~I2C_CR1_TXDMAEN); // Disable TX DMA for I2C1
I2C1->CR1 |= I2C_CR1_RXDMAEN; // Enable RX DMA for I2C1
// I2C1->CR2 |= ((uint8_t) (SlaveAddress << 1)); // Set up the slave address for DMA read
while(!(I2C1->ISR & I2C_ISR_TXE));
I2C1->TXDR |= ((uint8_t) (SlaveAddress << 1)); // Set up the slave address for DMA read
I2C1->CR2 |= I2C_CR2_RD_WRN;
DMA1_Channel3->CCR |= DMA_CCR_EN; // Activate DMA channel 3
I2C1->CR1 |= I2C_CR1_PE; // Enable I2C1
I2C1->CR2 |= I2C_CR2_START; // Generate start condition
while(I2C1->CR2 & I2C_CR2_START); // Wait until hardware clears the start bit
// ???
return;
}
According to the reference manual (p. 604), you need to uncomment I2C1->CR2 |= ((uint8_t) (SlaveAddress << 1)); and comment out I2C1->TXDR |= ((uint8_t) (SlaveAddress << 1)); to set the slave address, and you also need to set the number of bytes to be transferred.
I can't see your initialisation of the GPIO. Check whether the GPIO settings are right (alternate function and open-drain mode).
The reference manual also says:
PE must be kept low during at least 3 APB clock cycles in order to perform the software reset. This is ensured by writing the following software sequence: - Write PE=0 - Check PE=0 - Write PE=1.
I think you should try to do that.
I also advise using 4.7 kΩ resistors for pulling up to VDD.
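As a rough sketch of what that CR2 setup could look like for a plain N-byte receive (field names from the STM32L0 CMSIS headers; the register-address write phase would still need its own write transfer first, and AUTOEND here is an assumption about how you want the STOP generated):
I2C1->CR2 = ((uint32_t)(SlaveAddress << 1) << I2C_CR2_SADD_Pos) //7-bit address goes in SADD[7:1]
| ((uint32_t)BufferSize << I2C_CR2_NBYTES_Pos) //number of bytes to transfer
| I2C_CR2_RD_WRN //read transfer
| I2C_CR2_AUTOEND; //generate STOP automatically after NBYTES
I2C1->CR1 |= I2C_CR1_PE; //enable the peripheral before requesting the start
I2C1->CR2 |= I2C_CR2_START; //generate the start condition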

MSP430 I2C reading a SDP610 differential pressure sensor issue

I am trying to read an SDP610 Sensirion differential pressure sensor with a Texas Instruments MSP430.
The issue is that the sensor does not acknowledge the command and thus does not send back the pressure value itself. Note I have confirmed that the sensor works by hooking it up to an Arduino via an open-source library, and I can see the data that way. My IDE is Code Composer. My chip is the MSP430FR2311 (a LaunchPad breakout board).
My hardware setup is 4 wires: Vcc (3.3V), ground (0V), SDA and SCL. The SDA and SCL lines are pulled to VCC with 4.7 kΩ resistors as per the specification.
I have the following code for my MSP430, see below.
However, I do not see the response of the sensor on a logic analyser. Here is my capture (you will have to click the link; the top line is the clock and the bottom is the data):
MSP430 output.
The logic flow for reading the sensor, from the datasheet and from the Arduino code, is as follows:
1. Write the address of the device to the I2C line (8-bit 0x81)
2. Wait for slave acknowledge
3. Write the command for reading (8-bit 0xF1)
4. Wait for slave acknowledge
5. Slave holds the master
6. Slave outputs 3 bytes (2 data bytes, MSB then LSB, then a checksum)
7. Acknowledge
This is the datasheet for the sensor.
Any tips as to why the sensor is not responding?
CODE
void Read_Diff_pressure(void)
{
int rx_byte;
UCB0CTL1 |= UCTXSTT+ UCTR; // Generating START + I2C transmit (write)
UCB0I2CSA = SDP610Address; // SDP610 7 bit address 0x40
UCB0TXBUF = SDP610Read; // sending the read command 0x78
while(!(UCB0IFG & UCTXIFG)); //wait until reg address got sent
while( UCB0CTL1 & UCTXSTT); //wait till START condition is cleared
UCB0CTL1 |= UCTXSTT; //generate RE-START
UCB0I2CSA = SDP610Address; // SDP610 7 bit address 0x40
UCB0CTL1 &=~ UCTR; //receive mode
while( UCB0CTL1 & UCTXSTT); //wait till START condition is cleared
rx_byte = UCB0RXBUF; //read byte
//while(!(UCB0IFG & UCRXIFG)); //wait while the Byte is being read
UCB0CTL1 |= UCTXNACK; //generate a NACK
UCB0CTL1 |= UCTXSTP; //generate stop condition
while(UCB0CTL1 & UCTXSTP); //wait till stop condition got sent
Pressure_result = rx_byte;
}
void InitI2C_diff(void)
{
PAOUT |= I2C_SCL_PIN|I2C_SDA_PIN;//P1.2(SDA) - P1.3(SCL) as per silk screen defined in a header
PADIR |= I2C_SCL_PIN|I2C_SDA_PIN;
PASEL0 |= (I2C_SCL_PIN|I2C_SDA_PIN); // configure I2C pins (device specific)
UCB0CTLW0 |= UCSWRST; // put eUSCI_B in reset state
UCB0CTLW0 |= UCMODE_3 | UCSYNC | UCMST; // I2C master mode, SMCL
UCB0CTL1 = UCSSEL_2 + UCSWRST; //use SMCLK + still reset
UCB0BR0 = 10; // default SMCLK 1M/10 = 100KHz
UCB0BR1 = 0; //
UCB0I2CSA = SDP610Address; //The address of the device
UCB0CTLW0 &= ~UCSWRST; // eUSCI_B in operational state
//UCB0BRW = 64; // baudrate = SMCLK / 64
}
int main(void)
{
InitI2C_diff();//Init the i2c
while (1) { // Mainloop
Read_Diff_pressure();
delay(1000);//1 Second delay before re looping
}
}
A few parts seem to be missing compared to an old project implementation of mine (VCNL3020 + MSP430).
For example:
setting 7-bit addressing mode, single-master environment, I2C master, synchronous mode, ... but maybe I have overlooked it.
Does the sensor itself need an init?
The init part of my I2C only looked like this:
void I2CInit( void )
{
P1SEL |= BIT6 + BIT7; // Assign I2C pins to USCI_B0
P1SEL2|= BIT6 + BIT7;
UCB0CTL1 |= UCSWRST; // Enable SW reset
UCB0CTL0 = UCMST + UCMODE_3 + UCSYNC; // 7-bit addressing, single-master environment, I2C Master, synchronous mode
UCB0CTL1 = UCSSEL_2 + UCSWRST; // Use SMCLK, keep SW reset
UCB0BR0 = 16; // fSCL = SMCLK/UCB0BR1
UCB0BR1 = 0;
UCB0I2CIE |= UCNACKIE; // Enable not-acknowledge interrupt
UCB0I2CSA=slave_adress;
UCB0CTL1 &= ~UCSWRST; // Clear SW reset, resume operation
IE2 |= UCB0TXIE + UCB0RXIE; // Enable TX&RX interrupts
}
To not make it unnecessarily complicated, you could check my implementation on GitHub and see if it helps: Github Link I2C MSP430 Main
I hope this helps a bit - have fun!
I'm not sure what your hardware looks like, but your I2C pull-ups sound too large. I know a lot of app notes talk about 4.7 kΩ, but I'd look at the rise time of the lines with an oscilloscope. If you don't have access to a scope, I'd use 1 kΩ or 2 kΩ and see what happens.
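Independent of the pull-ups, here is a minimal sketch of the single-byte receive phase with the flag waits that the question's code comments out (register names as used in the question; whether a single byte is enough for the SDP610's 3-byte reply is a separate matter):
UCB0CTL1 &= ~UCTR; //receiver mode
UCB0CTL1 |= UCTXSTT; //generate (re)START
while(UCB0CTL1 & UCTXSTT); //wait until the address has been sent and acknowledged
UCB0CTL1 |= UCTXSTP; //for a single byte, schedule STOP as soon as START clears
while(!(UCB0IFG & UCRXIFG)); //wait until a byte has actually been received
rx_byte = UCB0RXBUF; //now it is safe to read the receive buffer
while(UCB0CTL1 & UCTXSTP); //wait until the STOP condition has gone out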

How to properly configure the USART_BRR register in STM32L476RG uC?

I'm trying to write my own driver for USART TX on an STM32L476RG Nucleo board.
Here are the datasheet and the reference manual.
I'm using Keil uVision 5, and in the Manage dialog I set:
CMSIS > Core
Device > Startup
Xtal=16MHz
I want to create a single-character transmitter. Following the manual's instructions in Sec. 40, p. 1332, I wrote this code:
// APB1 connects USART2
// The USART2 EN bit on APB1ENR1 is the 17th
// See alternate functions pins and label for USART2_TX! PA2 is the pin and AF7 (AFRL register) is the function to be set
#include "stm32l4xx.h" // Device header
#define MASK(x) ((uint32_t) (1<<(x)));
void USART2_Init(void);
void USART2_Wr(int ch);
void delayMs(int delay);
int main(void){
USART2_Init();
while(1){
USART2_Wr('A');
delayMs(100);
}
}
void USART2_Init(void){
RCC->APB1ENR1 |= MASK(17); // Enable USART2 on APB1
// we know that the pin that permits the USART2_TX is the PA2, so...
RCC->AHB2ENR |= MASK(0); // enable GPIOA
// Now, in GPIOA 2 put the AF7, which can be set by placing AF7=0111 in AFSEL2 (pin2 selected)
// AFR[0] refers to GPIOA_AFRL register
// Remember: each pin asks for 4 bits to define the alternate functions. see pg. 87
// of the datasheet
GPIOA->AFR[0] |= 0x700;
GPIOA->MODER &= ~MASK(4);// now ... we set the PA2 directly with moder as alternate function "10"
// USART Features -----------
//USART2->CR1 |=MASK(15); //OVER8=1
USART2->BRR = 0x683; //USARTDIV=16Mhz/9600?
//USART2->BRR = 0x1A1; //This one works!!!
USART2->CR1 |=MASK(0); //UE
USART2->CR1 |=MASK(3); //TE
}
void USART2_Wr(int ch){
//wait when TX buffer is empty
while(!(USART2->ISR & 0x80)) {} //when data is transfered in the register the ISR goes 0x80.
//then we lock the procedure in a while loop until it happens
USART2->TDR =(ch & 0xFF);
}
void delayMs(int delay){
int i;
for (; delay>0; delay--){
for (i=0; i<3195; i++);
}
}
Now, the problem:
The system works, but not properly. I mean: if I use RealTerm at a 9600 baud rate, as configured by 0x683 in the USART_BRR register, it shows me the wrong characters, but if I set 2400 as the baud rate in RealTerm it works!
To arrive at 0x683 for the USART_BRR register I referred to Sec. 40.5.4, USART baud rate generation, which says that if OVER8=0 then USARTDIV=BRR. In my case, USARTDIV = 16MHz/9600 = 1667d = 0x683.
I think that the problem lies in this code row:
USART2->BRR = 0x683; //USARTDIV=16Mhz/9600?
because if I replace it with
USART2->BRR = 0x1A1;
the system works at 9600 baud.
What's wrong in my code or in my understanding of the USARTDIV computation?
Thank you in advance for your support.
Sincerely,
GM
The default clock source for the USART is PCLK1 (figure 15). PCLK1 is SYSCLK / AHB_PRESC / APB1_PRESC. If 0x1A1 results in a baud rate of 9600, that suggests PCLK1 = 4MHz.
4MHz happens to be the default frequency of your processor (and PCLK1) at start-up, when running from the internal MSI RC oscillator. So the most likely explanation is that you have not configured the clock tree, and are not running from the 16MHz HSE as you believe.
Either configure your clock tree to use the 16MHz source, or perform your calculations with the MSI frequency. The MSI precision is just about good enough over the normal temperature range to maintain a sufficiently accurate baud rate, but it is not ideal.
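As a small sketch of that, the BRR value can be derived from whatever the peripheral clock actually is (OVER8=0, so BRR holds the divisor directly); the pclk1_hz value below is an assumption you would replace with your real clock configuration:
uint32_t pclk1_hz = 4000000u; //assumption: default MSI-derived PCLK1 at reset
uint32_t baud = 9600u;
USART2->BRR = (pclk1_hz + baud/2u) / baud; //rounded divisor: 4 MHz / 9600 ≈ 417 = 0x1A1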

STM32F4 I2C without library

I have to write code for an STM32F4 Discovery and a PCF8574 over I2C.
I can't use any library functions. I tried something but couldn't get it to work. What I have written so far is the init code below.
My init code
RCC->APB1ENR|=RCC_APB1ENR_I2C1EN ; // enable APB1 peripheral clock for I2C1
RCC->AHB1ENR|=RCC_AHB1ENR_GPIOBEN; // enable clock for SCL and SDA pins
//SCL on PB6 and SDA on PB7
GPIOB->MODER|=GPIO_MODER_MODER6; // set pin to alternate function
GPIOB->MODER|=GPIO_MODER_MODER7; // set pin to alternate function
GPIOB->OSPEEDR |=GPIO_OSPEEDER_OSPEEDR6; //set GPIO speed
GPIOB->OSPEEDR |=GPIO_OSPEEDER_OSPEEDR7; //set GPIO speed
GPIOB-> OTYPER |= GPIO_OTYPER_OT_6; // set output to open drain --> the line has to be only pulled low, not driven high
GPIOB-> OTYPER |= GPIO_OTYPER_OT_7; // set output to open drain --> the line has to be only pulled low, not driven high
GPIOB-> PUPDR |=GPIO_PUPDR_PUPDR6_0; // enable pull up resistors
GPIOB-> PUPDR |=GPIO_PUPDR_PUPDR7_0; // enable pull up resistors
GPIOB-> AFR[1] = 0x44000000; // Connect I2C1 pins to AF (af4)
// configure I2C1
I2C1-> CR2 = 8; // set peripheral clock to 8mhz
I2C1-> CR2 = 40; // 100khz i2c clock
I2C1-> CR2 |= ~(I2C_CR1_SMBUS); // I2C mode
I2C1-> OAR2 = 0x00; // address not important
I2C1-> CR2 |= 1; // i2c enable;
I2C1-> CR2 |= ~(I2C_CR1_SMBUS); // I2C mode
This line does something different than you think. If the idea was to clear that bit, it should be
I2C1-> CR2 &= ~(I2C_CR1_SMBUS); // I2C mode
Otherwise you set all the bits in the CR2 register except I2C_CR1_SMBUS, which is left unchanged.
Another problem is that you try to set CR2 using CR1 bit definitions.
The same goes for the enable bit - wrong register.
Also, the I2C peripheral should be reset before first use on many STM32 micros.
Hi, I am having the same problem getting an STM32F4 DISCO board to go.
I noticed that you are setting
I2C1-> CR2 = 8; // set peripheral clock to 8mhz
I2C1-> CR2 = 40; // 100khz i2c clock
so you are writing twice to the same register.
For the I2C clock you need to set I2C1->CCR to the calculated value.
I hope this helps.
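Pulling those corrections together, here is a rough sketch of the timing/enable part of the init (values assume an 8 MHz APB1 clock and 100 kHz standard mode, as in the question's comments; this is a sketch under those assumptions, not a drop-in fix):
I2C1->CR1 &= ~I2C_CR1_PE; //peripheral must be disabled while configuring the timings
I2C1->CR2 = 8; //FREQ field = APB1 clock in MHz (8 MHz assumed)
I2C1->CCR = 40; //standard mode: CCR = 8 MHz / (2 * 100 kHz) = 40
I2C1->TRISE = 9; //FREQ + 1 for standard mode, per the reference manual
I2C1->CR1 &= ~I2C_CR1_SMBUS; //I2C mode: clear the bit instead of OR-ing its complement
I2C1->CR1 |= I2C_CR1_PE; //finally enable the peripheral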
