So I've been trying to learn CRC and tried to implement it on my STM32G070 Nucleo board. After setting up all the required configuration, I get an unexpected value. The website I used to verify my values is http://www.sunshine2k.de/coding/javascript/crc/crc_js.html
My initial configuration was:
Poly: 0x31
Init: 0xFF
XOR: 0x00
No reflection on input or output.
My configuration on the STM32 board is as follows:
uint32_t crcCheck(uint32_t data)
{
    RCC->AHB1ENR |= RCC_AHB1ENR_CRCEN;  // enable the CRC clock
    CRC->CR |= CRC_CR_RESET;            // reset the CRC unit
    CRC->CR |= CRC_CR_POLYSIZE_1;       // 8-bit polynomial
    CRC->POL = 0x31;
    CRC->INIT = 0xFF;
    CRC->DR = data;
    return CRC->DR;
}
For a trial I provided 0x32 as input and got 0x0B from the online calculator but 0x70 from my program, which do not match. However, I noticed something strange: when I set the initial value to 0x00 instead, both gave me 0xA7 as output. I've tried for a while but could not come to a conclusion about this behavior.
Related
I am using the STM32F0 with register-level coding and am having problems with the CRC module. Basically I can't get the results to agree with online calculators.
I've stripped it right back to as simple as possible.
If I just reset the CRC and then read the data register out, I get 0xFFFFFFFF, which I would expect as that's the initial value. But even if I write zero in and read the result, it does not agree with other tools.
The STM outputs 0xC704DD7B and the online tools give 0xF4DBDF21.
As far as I can see all the parameters are the same (I have not tried to hand calculate it!).
My bare-bones code is (and I am reading the result in the debugger from the register)...
// Reset the CRC.
SET_BIT(CRC->CR, CRC_CR_RESET);
// Write 0.
CRC->DR = 0;
I am not really sure, but maybe this helps:
I once had the same problem and tried to figure out how to get the "correct" CRC32. Unfortunately there is not just one way a CRC32 can be calculated, but several. See https://crccalc.com/
I always leave the settings of the CRC peripheral at their defaults:
Default Polynomial State -> Enable
Default Init Value State -> Enable
Input Data Inversion Mode -> None
Output Data Inversion Mode -> Disable
The exception is "Input Data Format", which I set to "Words".
When sending data to the peripheral, I byte-reverse each word, and the result is byte-reversed again. This leads to a CRC32 that can be verified as CRC32/MPEG-2.
I have a function that tests whether I have configured the CRC peripheral correctly, so that I get "the correct" CRC32/MPEG-2:
uint8_t CRC32Test(void) {
    // #brief  Test if the CRC32 module is configured correctly.
    // #params none, void
    // #return u8 status: 1 = OK, 0 = NOK (not configured correctly)
    // If YES, this data must return the CRC32 0x07 D4 12 72 (big endian)
    // or 0x72 12 D4 07 (little endian).
    uint8_t retval = 0;
    uint8_t testdata[] = {0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0x09, 0x0a, 0x00, 0x00};
    uint32_t CRCdataU32[3] = {0,};
    uint32_t* pCRCdata = (uint32_t*)testdata;
    uint32_t dataSz = 3;

    CRCdataU32[0] = __REV(*pCRCdata++);
    CRCdataU32[1] = __REV(*pCRCdata++);
    CRCdataU32[2] = __REV(*pCRCdata++);

    uint32_t testCRC32 = HAL_CRC_Calculate(&hcrc, CRCdataU32, dataSz);
    testCRC32 = __REV(testCRC32);

    if (testCRC32 == 0x7212d407) retval = 1;
    return retval;
}
I verified this using crccalc.com.
This is probably not the most elegant code, but it works for me. I use it for data transfer between an MCU and a PC over RS232/RS485. I don't care much which particular CRC32 variant I use; I just need the sender and the receiver to produce the same result, and I achieve that with this code.
I am trying to interface the poly-phase energy metering IC ADE7758 with an STM32F411VET6. My SPI is running with a prescaler of 16, a baud rate of 6.25 Mbit/s and mode 2, i.e. CPOL = 1 and CPHA = 0. Here is the snapshot of the settings.
My connections are like this.
STM32 - ADE7758
PE11(NSS) - Pin 21(CS)
PE12(SCK) - Pin 23(SCLK)
PE13(MISO) - Pin 24(DOUT)
PE14(MOSI) - Pin 23(DIN)
Here are the global variables and defines:
uint8_t aTxBuff[1] = {0};
uint8_t aRxBuff[1] = {0};
#define enableChip HAL_GPIO_WritePin(SPI1_NSS_GPIO_Port,SPI1_NSS_Pin,GPIO_PIN_RESET)
#define disableChip HAL_GPIO_WritePin(SPI1_NSS_GPIO_Port,SPI1_NSS_Pin,GPIO_PIN_SET)
I am trying to read the OPMODE (0x13) register.
First I write the OPMODE register with its default value of 0x04.
Here is a snapshot of the waveform.
My register address is 0x13 and I am writing, so I have to logically OR 0x13 with 0x80, i.e. my waveform should show 0x93.
The default value of the OPMODE register is 0x04.
Here is the code I used for writing to the ADE7758:
void ADE7758_write8(char reg, unsigned char data)
{
    enableChip;
    reg |= 0x80;
    aTxBuff[0] = (unsigned char)reg;
    while (HAL_SPI_GetState(&hspi4) == HAL_SPI_STATE_BUSY_TX);
    HAL_SPI_Transmit(&hspi4, (uint8_t*)aTxBuff, 1, 1000);
    while (HAL_SPI_GetState(&hspi4) == HAL_SPI_STATE_BUSY_TX);
    aTxBuff[0] = (unsigned char)data;
    HAL_SPI_Transmit(&hspi4, (uint8_t*)aTxBuff, 1, 1000);
    disableChip;
}
Writing to the ADE7758 over SPI looks correct, but the problem occurs when I read the register back.
Here is the code for reading over SPI:
unsigned char ADE7758_read8(char reg)
{
    enableChip;
    aTxBuff[0] = (unsigned char)reg;
    HAL_SPI_TransmitReceive(&hspi4, (uint8_t*)aTxBuff, (uint8_t*)aRxBuff, 1, 1000);
    DWT_Delay_us(5);
    aTxBuff[0] = 0x00;
    HAL_SPI_TransmitReceive(&hspi4, (uint8_t*)aTxBuff, (uint8_t*)aRxBuff, 1, 1000);
    disableChip;
    return (unsigned char)aRxBuff[0];
}
I have tried to debug the code and constantly monitored the value of aRxBuff[0]; the value is arbitrary (like 0xFF, 0xFC, 0xDF etc.).
I don't know whether it's a fault of the read timing, but here is the snapshot of the timing characteristics of the ADE7758.
Please suggest where I am going wrong while reading from the ADE7758 over SPI. Is it the way I am reading SPI using HAL, or is it a timing fault?
Any suggestions will be appreciated.
I'm a beginner in this field. My goal is to change the output of 8 LEDs (which are connected to PORTA) according to a potentiometer. I have connected the middle pin of the potentiometer to PF0, which is ADC0, and the other two pins to 5 V and ground.
I know there's no problem with the chip or the connections, because the LEDs work just fine.
But no matter how I change the code below (by which I mean slightly changing the ADMUX and ADCSRA registers), no output is shown!
I am using an ATmega128 with a 16 MHz clock. Below is the code I'm trying to fix.
#include <asf.h>
#include <avr/io.h>
#define F_CPU 16000000L

int init_board(void)
{
    DDRA = 0xff;
    PORTA = 0x01;
}

int ADC_init(void)
{
    //ADCSRA
    ADCSRA = 0b10000111;
    //ADMUX
    ADMUX = 0b01100000; // middle line connected to ADC0
}

int main(void)
{
    init_board();
    ADC_init();
    ADCSRA |= (ADSC >> 1);
    while (1)
    {
        if (ADSC == 0)
        {
            uint8_t disp_value = ADCL;
            PORTA = disp_value;
            delay_ms(200);
            ADCSRA |= (ADSC >> 1);
        }
    }
}
I have no idea why the code doesn't work.
I suppose it's because I didn't set my registers correctly, but I've followed all the instructions in the ATmega128 datasheet.
The first issue is your bit shifting: it should be ADCSRA |= (1 << ADSC).
The next issue is reading the result. You set the fifth bit of ADMUX to 1, so ADLAR = 1, and in that mode the result is left adjusted, so you should read ADCH.
Moreover when you switch to 10-bit resolution, i.e. you start working with multi-byte results, be aware that reading only ADCL is not enough, see datasheet 23.3 for explanation: "Once ADCL is read, ADC access to data registers is blocked. This means that if ADCL has been read, and a conversion completes before ADCH is read, neither register is updated and the result from the conversion is lost. When ADCH is read, ADC access to the ADCH and ADCL Registers is re-enabled."
Lastly, using hardcoded delays for reading is not good practice, especially when you later change the code to read the ADC as fast as possible. In that case, after starting a conversion you should check whether the ADIF flag is set, or react with an interrupt when ADIE is set. Refer to the datasheet for details.
I have some code which should read the values of a couple of ADC pins, each time around the commutator loop.
static uint16_t adc0;
static uint16_t adc1;

void init(void) {
    ...
    hw_configure_adcs();
    ...
}

void loop(void) {
    ...
    adc0 = hw_read_adc(0);
    adc1 = hw_read_adc(1);
    ...
}

void hw_configure_adcs(void) {
    ADCSRA = (1<<ADEN) | (1<<ADPS2) | (1<<ADPS0);
}

uint16_t hw_read_adc(uint8_t n) {
    ADMUX = (1<<REFS0) | (n & 0x07);
    ADCSRA |= (1<<ADSC);                             // start conversion
    uint16_t count;
    for (count = 0; !(ADCSRA & (1<<ADIF)); count++); // wait for conversion to complete
    // ADCSRA |= (1<<ADIF);                          // tried with and without this
    return (ADCH << 8) | ADCL;                       // return ADC value
}
What I see is odd: The values of adc0 and adc1 are set to the same value and never change, until the AVR chip is restarted/reflashed.
(The value is 0x00d1 at 0.71V and 0x0128 at 1.00V, which seems reasonable.)
I have tried:
Turning the voltage down: adc0 and adc1 stay constant and only go down when the AVR code is reflashed (and hence the chip restarted).
Turning the voltage up: adc0 and adc1 stay constant and only go up when the AVR code is reflashed (and hence the chip restarted).
Returning count from hw_read_adc() instead of the ADC value: This returns varying numbers between 0x34 and 0x38 which are different for the two ADCs and continuously vary over time.
From these tests, I infer that the ADCs are being read, but that I am missing some "clear ADCH and ADCL and get them ready to accept the new reading" step.
I have re-read section 23 of http://www.atmel.com/images/Atmel-8272-8-bit-AVR-microcontroller-ATmega164A_PA-324A_PA-644A_PA-1284_P_datasheet.pdf many times, but have obviously overlooked something vital.
After much googling, I found: http://www.avrfreaks.net/forum/adc-only-happens-once-reset
The problem was that return (ADCH << 8) | ADCL; was compiled so that it read the high register first (as you might expect).
Page 252 of the datasheet says: "Otherwise, ADCL must be read first, then ADCH".
Changing my code to return ADC fixed the problem.
My guess as to what was happening is:
The read from ADCH occurred.
The read from ADCL had the effect of locking the ADC-result, to prevent tearing.
The next ADC read had nowhere to write its result, as the ADC-result was locked.
Repeat...
The problem with your code is that you read ADCH first.
ADCL must be read first, then ADCH, to ensure that the content of the Data Registers belongs to the same conversion. Once ADCL is read, ADC access to Data Registers is blocked. This means that if ADCL has been read, and a conversion completes before ADCH is read, neither register is updated and the result from the conversion is lost. When ADCH is read, ADC access to the ADCH and ADCL Registers is re-enabled.
So the correct code should be:
return ADCL | (ADCH << 8);
I am trying to read values from ADC0 on the ATmega328P. The expected values are between 0 and 5 V, since ADC0 is connected to a potentiometer on the 5 V output of an Xplained Mini. I usually get either 0 V or 5 V, with no variation when the potentiometer is turned. I have looked at multiple ADC examples and tutorials online but can't find the error in my code.
void adc_initialise() {
    // set Vref to AVcc; channel selection is initially ADC0, as wanted
    ADMUX |= (1<<6);
    // set ADC enable, set ADC clock prescaler to 64
    ADCSRA |= (1<<7)|(1<<2)|(1<<1);
}

uint16_t adc_read() {
    ADCSRA |= (1<<6);            // start conversion
    while (ADCSRA & (1<<ADSC));  // wait until conversion is complete
    return ADCW;
}

float adc_calculation(uint16_t adcValue) {
    float stepSize = (5.0/1024.0);
    float voltageIn = adcValue * stepSize;
    return voltageIn;
}
Then in my main I have:
while (1) {
    adc_initialise();
    uint16_t adcValue = adc_read();
    float voltageIn = adc_calculation(adcValue);
    adcConverterToUART(voltageIn); // I know this part works, as I have hardcoded many test values and all transmitted correctly
}
As mentioned above, I know the error is not in my UART code but somewhere in the ADC code above. Cheers in advance for any help.
I can mention a few things; you could try them and see if they help.
You should call adc_initialise() once, before the while(1) loop; at the moment you initialize the ADC again and again.
As for while( ADCSRA & (1<<ADSC) );, you could add a NOP in the loop body so the compiler doesn't optimize it out of the code (although ADCSRA is a volatile register, so a conforming compiler should keep the loop anyway).
The rest looks good to me.
Do you get any value from the conversion at all?
Regards
EDIT1:
I looked in one of my old files; there we did it like this to get the value from the ADC:
// Get count value
adValue = ADCL;
adValue |= (UI_16_t)(ADCH << 8);
ADCL is the low byte of the value and ADCH the high byte. We shift the high byte in front of the low byte to get the full 16-bit value.