AVR ATmega328P ADC channel selection issue (C)

I'm tinkering with an ATmega328P and want to read an analogue value from a pin through the ADC and simply output the value on 4 LEDs. Really simple:
#define F_CPU 20000000UL
#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/delay.h>
#define BRIGHTNESS_PIN 2
#define ADC_SAMPLES 5
void init_adc()
{
//set ADC VRef to AVCC
ADMUX |= (1 << REFS0);
//enable ADC and set pre-scaler to 128
ADCSRA = (1 << ADPS0) | (1 << ADPS1) | (1 << ADPS2) | (1 << ADEN);
}
uint16_t read_adc(unsigned char channel)
{
//clear lower 4 bits of ADMUX and select ADC channel according to argument
ADMUX &= (0xF0);
ADMUX |= (channel & 0x0F); //set channel, limit channel selection to lower 4 bits
//start ADC conversion
ADCSRA |= (1 << ADSC);
//wait for conversion to finish
while(!(ADCSRA & (1 << ADIF)));
ADCSRA |= (1 << ADIF); //reset as required
return ADC;
}
int main(void)
{
uint32_t brightness_total;
uint16_t brightness = 0;
uint32_t i = 0;
init_adc();
sei();
while (1)
{
//reset LED pins
PORTB &= ~(1 << PINB0);
PORTD &= ~(1 << PIND7);
PORTD &= ~(1 << PIND6);
PORTD &= ~(1 << PIND5);
PORTB |= (1 << PINB1); //just blink
read_adc(BRIGHTNESS_PIN); //first throw-away read
//read n sample values from the ADC and average them out
brightness_total = 0;
for(i = 0; i < ADC_SAMPLES; ++i)
{
brightness_total += read_adc(BRIGHTNESS_PIN);
}
brightness = brightness_total / ADC_SAMPLES;
//set pins for LEDs depending on read value.
if(brightness > 768)
{
PORTB |= (1 << PINB0);
PORTD |= (1 << PIND7);
PORTD |= (1 << PIND6);
PORTD |= (1 << PIND5);
}
else if (brightness <= 768 && brightness > 512)
{
PORTB |= (1 << PINB0);
PORTD |= (1 << PIND7);
PORTD |= (1 << PIND6);
}
else if (brightness <= 512 && brightness > 256)
{
PORTB |= (1 << PINB0);
PORTD |= (1 << PIND7);
}
else if (brightness <= 256 && brightness >=64)
{
PORTB |= (1 << PINB0);
}
_delay_ms(500);
PORTB &= ~(1 << PINB1); //just blink
_delay_ms(500);
}
}
This works mostly fine, except for the channel selection. Whatever channel I select reads correctly, but channel 0 always reads and converts as well. What I mean is: if I plug the signal into the selected channel's pin, the values are read correctly; if I plug it into any other channel's pin they obviously aren't, except for ADC0. No matter which channel I set, not only does that channel read, but so does ADC0.
Why is that and how do I fix that?
I already checked my PCB for solder bridges; there are none, and I would expect slightly different behaviour if there were.
Also, ADC4 and ADC5 don't seem to convert properly either. Any idea why that is? The only clue I found in the datasheet is that those two use digital power, while all the other ADC inputs use analogue power. What's the difference, why does it matter, and why does it not correctly convert my analogue signal?
Both ARef and AVCC are connected according to the datasheet, except that the inductor for ARef is missing.

I think what is happening is that
ADMUX &= (0xF0);
is setting the channel to 0, and
ADMUX |= (channel & 0x0F);
is then setting the channel to the one you want. You then take a reading and throw the result away, which should mean that the channel initially being set to 0 doesn't matter.
However, when you then try to take an actual reading, you set the channel again, because you use read_adc to take the reading. So you never actually throw a reading away.
What I would do is replace your ADMUX setting commands with:
ADMUX = (ADMUX & 0xF0) | (channel & 0x0F);
Then move this into a separate function called something like set_adc_channel(uint8_t channel). Include a throw-away read in that function, then remove the ADMUX setting from your read_adc function, so it just starts a conversion and returns the result.
Also note that since you only ever use one channel, you could move the channel setting part to init_adc(). I assume it's in a separate function so you could later read more than one channel.
I hope that's clear. Let me know if not.

EDIT: As you stated, ADIF is indeed cleared by writing a logic 1 to it.
I've just tested your read_adc function and it is working for me (if you don't mind the Arduino mixture):
uint16_t read_adc(unsigned char channel)
{
//clear lower 4 bits of ADMUX and select ADC channel according to argument
ADMUX &= (0xF0);
ADMUX |= (channel & 0x0F); //set channel, limit channel selection to lower 4 bits
//start ADC conversion
ADCSRA |= (1 << ADSC);
//wait for conversion to finish
while(!(ADCSRA & (1 << ADIF)));
ADCSRA |= (1 << ADIF); //reset as required
return ADC;
}
void setup() {
Serial.begin(57600);
//set ADC VRef to AVCC
ADMUX |= (1 << REFS0);
//enable ADC and set pre-scaler to 128
ADCSRA = (1 << ADPS0) | (1 << ADPS1) | (1 << ADPS2) | (1 << ADEN);
pinMode(A0, INPUT_PULLUP);
pinMode(A1, INPUT_PULLUP);
pinMode(A2, INPUT_PULLUP);
pinMode(A3, INPUT_PULLUP);
}
void loop() {
Serial.println(read_adc(0));
Serial.println(read_adc(1));
Serial.println(read_adc(2));
Serial.println(read_adc(3));
delay(1000);
}
I just connect one of the channels to the 3.3 V pin and it reads about 713 on it. The other channels are pulled up to readings of about 1017.


STM32F031 - Code runs in line-by-line debugging but not otherwise. Some functions only run if defined as macros

I am writing code for the STM32F031K6T6 MCU using Keil µVision. I started a new project, selected the chip, configured the run-time environment, and set the C/C++ options for the target (IDE and configuration screenshots omitted).
I initialized the clock and configured the Flash registers for the appropriate latency. I tested the frequency using MCO and it seems correct. I also initialized some GPIOs, UART, and the SysTick. The peripheral registers are modified as expected, as seen in the System View for the respective peripheral in debugging mode.
The problem is that some functions, such as those for sending and receiving data via UART and some that use GPIO ports, only work in debugging mode when I step through the code line by line. If I click the run button, the code gets stuck and the chip stops responding, although I still see the VAL and CURRENT registers of the SysTick updating.
This is an example of a function that works:
void System_Clock_init(void){
FLASH->ACR &= ~FLASH_ACR_LATENCY;
FLASH->ACR |= FLASH_ACR_LATENCY | 0x01;
RCC->CR |= RCC_CR_HSION;
while((RCC->CR & RCC_CR_HSIRDY) == 0);
RCC->CR &= ~RCC_CR_HSITRIM;
RCC->CR |= 16UL << 3;
RCC->CR &= ~RCC_CR_PLLON;
while((RCC->CR & RCC_CR_PLLRDY) == RCC_CR_PLLRDY);
RCC->CFGR &= ~RCC_CFGR_PLLSRC;
RCC->CFGR |= 10UL << 18;
RCC->CFGR &= ~RCC_CFGR_HPRE;
RCC->CFGR &= ~RCC_CFGR_PPRE;
RCC->CR |= RCC_CR_PLLON;
while((RCC->CR & RCC_CR_PLLRDY) == 0);
RCC->CFGR &= ~RCC_CFGR_SW;
RCC->CFGR |= RCC_CFGR_SW_PLL;
while((RCC->CFGR & RCC_CFGR_SWS) != RCC_CFGR_SWS_PLL);
}
This is an example of a function that doesn’t work:
void UV_LED_Driver(uint32_t d){
for(uint32_t i = 0; i<16; i++){
if(d&(((uint32_t)0x8000)>>i)){
SDI2_ON;
}
else {
SDI2_OFF;
}
CLK2
}
LATCH2
}
The macros used in the function above are defined as below:
// CLK2 -> PA5
// LE2 -> PA4
// SDI2 -> PA6
#define CLK2_OFF GPIOA->ODR |= (1UL << 5)
#define CLK2_ON GPIOA->ODR &= ~(1UL << 5)
#define LE2_OFF GPIOA->ODR |= (1UL << 4)
#define LE2_ON GPIOA->ODR &= ~(1UL << 4)
#define SDI2_ON GPIOA->ODR &= ~(1UL << 6)
#define SDI2_OFF GPIOA->ODR |= (1UL << 6)
#define CLK2 {CLK2_ON; us_Delay(1); CLK2_OFF;}
#define LATCH2 {LE2_ON; us_Delay(1); LE2_OFF;}
The GPIO pins used in the function above are initialized as follows:
// CLK2 -> PA5
// LE2 -> PA4
// SDI2 -> PA6
void UV_LED_Driver_Init(void){
RCC->AHBENR |= RCC_AHBENR_GPIOAEN;
GPIOA->MODER &= ~((3UL << 8) | (3UL << 10) | (3UL << 12));
GPIOA->MODER |= ((1UL << 8) | (1UL << 10) | (1UL << 12));
GPIOA->OTYPER &= ~(0x70UL);
GPIOA->PUPDR &= ~((1UL << 8) | (1UL << 10) | (1UL << 12));
GPIOA->OSPEEDR &= ~((3UL << 8) | (3UL << 10) | (3UL << 12));
GPIOA->OSPEEDR |= ((1UL << 8) | (1UL << 10) | (1UL << 12));
GPIOA->ODR |= (0x70UL);
}
And the us_Delay() function is based on SysTick. These are defined as:
static uint32_t usDelay = 0;
void SysTick_init(uint32_t ticks){
SysTick->CTRL = 0;
SysTick->LOAD = ticks - 1;
NVIC_SetPriority(SysTick_IRQn, (1<<__NVIC_PRIO_BITS) - 1);
SysTick->VAL = 0;
SysTick->CTRL |= SysTick_CTRL_CLKSOURCE_Msk;
SysTick->CTRL |= SysTick_CTRL_TICKINT_Msk;
SysTick->CTRL |= SysTick_CTRL_ENABLE_Msk;
}
void SysTick_Handler(void){
if(usDelay > 0){
usDelay--;
}
}
void us_Delay(uint32_t us){
usDelay = us;
while(usDelay != 0);
}
Now, this is the same UV_LED_Driver(uint32_t d) function defined as a macro (runs as expected):
#define UV_LED_DRIVER(d) {for(int i = 0; i<16; i++){if(d&(0x000F>>i)){SDI2_ON;}else {SDI2_OFF;}CLK2}LATCH2}
This is the main():
#include <stm32f031x6.h>
#include "clock.h"
#include "LED_Driver.h"
#include "UART.h"
int main(void){
System_Clock_init();
Color_LED_Driver_Init();
UV_LED_Driver_Init();
Nucleo_Green_LED_Init();
UART_init();
SysTick_init(47);
//MCO_Init(); // Check PIN 18 (PA8) for the frequency of the MCO using an Oscilloscope
while(1){
UV_LED_DRIVER(~(0x0000)) // This runs well
//UV_LED_Driver((uint32_t)~(0x0000)); // If I run this line
//the debugger gets stuck here. It works if I run line-by-line
ms_Delay(100);
UV_LED_DRIVER(~(0xFFFF)) // This runs well
//UV_LED_Driver((uint32_t)~(0xFFFF)); // If I run this line
//the debugger gets stuck here. It works if I run line-by-line
ms_Delay(100);
}
}
Interestingly, if I define the functions as macros, they behave as desired. I also tested the code on an STM32F429ZIT chip and it worked well, given the needed modifications to the initialization of the main clock and the GPIO.
Has anyone ever experienced anything similar, or happens to know what could be causing this issue? I know I could work around it using CubeMX, but I would like to find out what is causing the problem.
Thank you.
I asked the same question on the ST Community forum and the user waclawek.jan answered it. The problem is that I was triggering the SysTick interrupt too often, leaving no time for main() to run. To fix the code, I just called SysTick_init() with 479 as the argument instead of 47.
Thank you!

How to light up specific LEDs and also shift every other LED with a 74HC595 shift register?

I am currently building a drum machine with a microcontroller and am trying to figure out the logic of the sequencer. I have 16 LEDs which indicate which 16th note the drum machine is currently playing.
So for instance, let's say the beats per minute (BPM) is 120, then the led should shift twice every second.
So for the shifting part I have written code such that if the step number is 0, we shift in a 1. If the step number is > 0, we shift in a 0. The method is called every (60/BPM) seconds.
PD5 is the serial input,
PD4 is the latch pin, PD3 is the clock pin.
void update_led(void) {
if (step_number == 0){
PORTD |= (1 << PD5); //Send a 1
PORTD |= (1 << PD4); //Read the input to 1st led
PORTD &= ~(1 << PD5);
PORTD &= ~(1 << PD4);
PORTD |= (1 << PD3); //Shift to next led
_delay_ms(40); //Apparently I need a delay here to light up LED
PORTD &= ~(1 << PD3);
}else{
PORTD |= (1 << PD4);
PORTD &= ~(1 << PD4);
PORTD |= (1 << PD3);
_delay_ms(40);
PORTD &= ~(1 << PD3);
}
}
But I also want an LED to be lit statically for each step where the user has programmed a sound. So for instance, if the user presses buttons 1, 5, 9 and 13 with the kick drum instrument, LEDs 1, 5, 9 and 13 should be statically lit while the LEDs also shift as in the code above. Does anyone have tips on how to implement this efficiently? Is there a smart way to make specific LEDs stay lit while shifting the other LEDs?

How to refer to a specific GPIO pin in an Arduino Uno using C code?

I am trying to design a 4-bit adder. I've set my input and output ports but am not sure where to go from there.
My "B" input port is Port D, my "A" input port as well as my "Cin" is Port B, and my "S" output port as well as my "Cout" is Port C. I am having trouble figuring out how to isolate the individual ports (such as the carry ripple) and am pretty much out of ideas aside from nested if-statements.
My code is currently as follows:
#include <avr/io.h>//library used to access the pin addresses
int main () {
DDRD &= ~(0b00111100);//B inputs
DDRB &= ~(0b00011111);//Carry-in + A inputs
DDRC |= 0b00011111;//Carry-out + S outputs
while (1) {
//PORTC |= PIND + PINB;
//PORTC &= ~(PIND + PINB);
if ((PIND & 0b00000000)&&(PINB & 0b00000000)) {
PORTC |= 0b00000000;
PORTC &= ~(0b00000000);
}
else if ((((PIND & 0b00000100)||(PIND & 0b00001000)||(PIND & 0b00010000)||(PIND & 0b00100000))&&(PINB & 0b00000000))||(((PINB & 0b00000100)||(PINB & 0b00001000)||(PINB & 0b00010000)||(PINB & 0b00100000))&&(PIND & 0b00000000)))
PORTC |= 0b00000001;
PORTC &= ~(0b11111110);
}
}
return 0;
}
If you want to refer to a specific pin/pins then it's relatively easy (assuming the registers are set as outputs):
For example, to isolate pin 4 on PORTD you do:
PORTD |= (1 << PIND4);
This will set pin 4 in PORTD to HIGH.
PORTD |= (1 << PIND4) | (1 << PIND5);
This will set pins 4 and 5 in PORTD to HIGH.
PORTD &= ~(1 << PIND4);
This will set pin 4 in PORTD to LOW.
PORTD &= ~(1 << PIND4) & ~(1 << PIND5);
This will set pins 4 and 5 in PORTD to LOW.
You can also use a macro for the (1 << n) logic; avr-libc already provides one:
#define _BV(n) (1 << (n))
These tutorials explain it all pretty well: http://maxembedded.com/2011/06/port-operations-in-avr/ and https://efundies.com/avr-bitwise-operations-in-c/.
If you go through the bitwise logic step by step on a sheet of paper it will become clearer!

How to stop timer on ATmega328

I wrote some simple code using timer1 on my Arduino Uno. The problem is I can't stop the timer, no matter what I try.
I am using this program to count and display the number of external interrupts on pin 2 while measuring time. But when I press the fin button, I want to stop generating the interrupts that increase the time variable called cas. Can you help somehow, please?
My code:
#include <OLED_I2C.h>
#define alarm_output 10
#define fin 13
int suma=0;
int alarm=0;
float cas=0;
OLED myOLED(SDA, SCL, 8);
extern uint8_t MediumNumbers[];
extern uint8_t BigNumbers[];
extern uint8_t SmallFont[];
void setup(void) {
pinMode(alarm_output,OUTPUT);
digitalWrite(alarm_output,LOW);
pinMode(fin,INPUT_PULLUP);
pinMode(9,INPUT_PULLUP);
//interrupt
interrupts();
attachInterrupt(digitalPinToInterrupt(2), displej, CHANGE);
//first screen
myOLED.begin();
myOLED.setFont(SmallFont);
myOLED.print("TIME:", 0, 30);
myOLED.print("INTERRUPT:", 0, 56);
myOLED.print("Laser game", CENTER, 0);
myOLED.setFont(MediumNumbers);
myOLED.printNumF(cas,1,RIGHT,20);
myOLED.setFont(BigNumbers);
myOLED.printNumI(suma, RIGHT, 40);
myOLED.update();
//start loop
up:;
if(digitalRead(9)==1)
goto up;
// TIMER 1 for interrupt frequency 10 Hz:
cli(); // stop interrupts
TCCR1A = 0; // set entire TCCR1A register to 0
TCCR1B = 0; // same for TCCR1B
TCNT1 = 0; // initialize counter value to 0
// set compare match register for 10 Hz increments
OCR1A = 24999; // = 16000000 / (64 * 10) - 1 (must be <65536)
// turn on CTC mode
TCCR1B |= (1 << WGM12);
// Set CS12, CS11 and CS10 bits for 64 prescaler
TCCR1B |= (0 << CS12) | (1 << CS11) | (1 << CS10);
// enable timer compare interrupt
TIMSK1 |= (1 << OCIE1A);
sei(); // allow interrupts
}
void displej(){
suma++;
alarm=3;
}
ISR(TIMER1_COMPA_vect){
cas=cas+0.1;
if(alarm>0)
alarm--;
}
void loop(void) {
myOLED.setFont(MediumNumbers);
myOLED.printNumF(cas,1,RIGHT,20);
myOLED.setFont(BigNumbers);
myOLED.printNumI(suma, RIGHT, 40);
myOLED.update();
if(digitalRead(fin)==0){
cli();
TCCR1B |= (0 << CS12) | (0 << CS11) | (0 << CS10); //this does not work
detachInterrupt(digitalPinToInterrupt(2));
sei();
}
if(alarm>0)
digitalWrite(alarm_output,HIGH);
else
digitalWrite(alarm_output,LOW);
delay(10);
}
I've tested all three methods for "turning off the timer." Just uncomment the one you prefer in the code below to see it demonstrated. All three are effective at getting the Arduino's LED to stop blinking.
void setup(void) {
pinMode(13,OUTPUT);
digitalWrite(13,LOW);
interrupts();
// TIMER 1 for interrupt frequency 10 Hz:
cli(); // stop interrupts
TCCR1A = 0; // set entire TCCR1A register to 0
TCCR1B = 0; // same for TCCR1B
TCNT1 = 0; // initialize counter value to 0
// set compare match register for 10 Hz increments
OCR1A = 24999; // 200 millisecond cycle
// turn on CTC mode
TCCR1B |= (1 << WGM12);
// Set CS12, CS11 and CS10 bits for 64 prescaler
TCCR1B |= (0 << CS12) | (1 << CS11) | (1 << CS10);
// enable timer compare interrupt
TIMSK1 |= (1 << OCIE1A);
sei(); // allow interrupts
}
volatile uint8_t count = 0;
volatile uint8_t timer_flip = 0;
ISR(TIMER1_COMPA_vect){
if (timer_flip == 0)
timer_flip = 1;
else
timer_flip = 0;
if (timer_flip == 1)
digitalWrite(13, HIGH);
else
digitalWrite(13, LOW);
count++;
}
void loop(void)
{
if (count > 100) // runs for a few seconds
{
//cli(); // One way to disable the timer, and all interrupts
//TCCR1B &= ~(1<< CS12); // turn off the clock altogether
//TCCR1B &= ~(1<< CS11);
//TCCR1B &= ~(1<< CS10);
//TIMSK1 &= ~(1 << OCIE1A); // turn off the timer interrupt
}
}
This exact code is running on an Uno right beside me now. I've tested all three "turn off" mechanisms above and they all work.
To make a minimal, verifiable example I stripped out all the OLED stuff. I changed pin 13 to an output and set it to blink the LED while it could (and when it stops the timers and/or interrupts are clearly disabled).
A couple of learning points:
You should not use pin 13 for your "fin" button without additional circuitry -- see this Arduino official reference.
You should declare as volatile any variable that is written to in your interrupt service routines.
Your choice of method depends on your goal. You can disable just the timer interrupt, disable all interrupts, or simply turn the clock source off. The choice is yours.
Sure thing! There are two ways you can do it. You can simply stop the timer interrupt but leave the timer running using:
TIMSK1 &= ~(1 << OCIE1A);
Or, you can stop the timer altogether by altering the clock source to "none" like:
TCCR1B &= ~((1 << CS12) | (1 << CS11) | (1 << CS10));
which effectively undoes what you did to select the clock source in the first place. After this the CS12-CS11-CS10 bits will be 0-0-0 and the clock source will be stopped. See p. 134 of the datasheet.

ATmega328P varying frequency

I am trying to generate a 16 kHz PWM. This is the code I am working with right now:
int main(void){
DDRD |= (1 << DDD6);
// PD6 is now an output
OCR0A = 128;
// set PWM for 50% duty cycle
TCCR0A |= (1 << COM0A1);
// set none-inverting mode
TCCR0A |= (1 << WGM01) | (1 << WGM00);
// set fast PWM Mode
TCCR0B |= (1 << CS01);
// set prescaler to 8 and starts PWM
while (1)
{
// we have a working fast PWM
}
}
The default frequency comes out at about 64 kHz. Is there any way I can change it? Changing the prescalers does not get me to a frequency of 16 kHz.
If you don't need to vary the pulse-width ratio, you can use CTC mode instead of fast PWM to accomplish that:
DDRD |= (1 << DDD6);
// PD6 (OC0A) is now an output
OCR0A = 62;
// toggle at F_CPU / (2 * 8 * (62 + 1)) ≈ 15.87 kHz with a 16 MHz clock
TCCR0A |= (1 << COM0A0);
// toggle OC0A on compare match (duty cycle is always 50 % in this mode)
TCCR0A |= (1 << WGM01);
// set CTC mode
TCCR0B |= (1 << CS01);
// set prescaler to 8 and start the timer
