I'm trying to execute the following piece of code on ATMEGA8 but the ADC doesn't seem to be working.
#include <avr/io.h>
#include "LCD.h"
int main()
{
int val=0;
ADCSRA=0x87; // ADC enabled and prescaler set to fosc/128
ADMUX= 0xC0; // REFS1 and REFS0 set, using the internal 2.56 V reference
DDRC=0x00;// as input for the adc
PORTC=0x00;
DDRB=0xff;
while (1)
{
ADCSRA |=(1<<ADSC);
while(!(ADCSRA&(1<<ADIF)));
lcd_string("Done Conversion");
val=ADCL;
PORTB=ADCL;
ADCSRA |=(1<<ADIF); // clear ADIF by writing 1 to it
lcd_print(2,1,val,3);
}
return 0;
}
You have not read ADCH. The datasheet says:
When ADCL is read, the ADC Data Register is not updated until ADCH is
read. Consequently, if the result is left adjusted and no more than
8-bit precision is required, it is sufficient to read ADCH. Otherwise,
ADCL must be read first, then ADCH.
val = ADCL;
val = ((ADCH<<8) | val) & 0x3FF;
You are writing the result to an 8-bit port. If you want an 8-bit conversion, set the ADLAR bit in ADMUX. The 10-bit conversion will then be left-shifted by 6 bits and you can ignore the two least-significant bits in ADCL.
ADMUX = 0xE0;
...
val = ADCH;
BTW, read-modify-write of ADCSRA is not recommended. To clear bit 4, ADIF, the ADC Interrupt Flag, you could try
ADCSRA = 0x97; // rewrite config and clear ADIF
That is your original configuration with the ADIF bit set, which clears the flag. Alternatively, you could poll bit 6, ADSC, which stays high until the conversion is complete and needs no action to clear it. Since you have not enabled the ADC interrupt, there is no need to clear the ADIF flag at all.
while (ADCSRA & (1<<ADSC)); // wait for conversion to complete
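Putting those pieces together, a minimal sketch of the corrected loop might look like this (using the ADLAR/8-bit approach; lcd_print() is assumed to be the same helper from the question's LCD.h):
#include <avr/io.h>
#include <stdint.h>
#include "LCD.h"

int main(void)
{
    ADCSRA = 0x87;                      // ADC enabled, prescaler fosc/128
    ADMUX  = 0xE0;                      // internal 2.56 V reference, ADLAR set, channel ADC0
    DDRC   = 0x00;                      // ADC pins as inputs
    PORTC  = 0x00;
    DDRB   = 0xFF;                      // output port for the result

    while (1)
    {
        ADCSRA |= (1 << ADSC);          // start a conversion
        while (ADCSRA & (1 << ADSC))    // ADSC clears itself when the conversion completes
            ;
        uint8_t val = ADCH;             // top 8 bits of the left-adjusted result
        PORTB = val;
        lcd_print(2, 1, val, 3);        // same LCD call as in the question
    }
    return 0;
}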
On the ATtiny402 the ADC value stops increasing once the input reaches +2.5 V. It gets stuck at the maximum ADC value corresponding to +2.5 V; there is no change even if I adjust the trimpot above 2.5 V. Here is the code.
#include <avr/io.h>
#include <util/delay.h>

uint16_t volatile adcVal;

void ADC0_init(void);
uint16_t ADC0_read(void);

void ADC0_init(void)
{
/* Disable digital input buffer */
PORTA.PIN6CTRL &= ~PORT_ISC_gm;
PORTA.PIN6CTRL |= PORT_ISC_INPUT_DISABLE_gc;
/* Disable pull-up resistor */
PORTA.PIN6CTRL &= ~PORT_PULLUPEN_bm;
ADC0.CTRLB = VREF_ADC0REFEN_bm ;
VREF.CTRLA = VREF_ADC0REFSEL_1_bm;
ADC0.CTRLC |= ADC_PRESC_DIV4_gc /* CLK_PER divided by 4 */
| ADC_REFSEL_INTREF_gc; /* Internal reference */
ADC0.CTRLA |= ADC_ENABLE_bm /* ADC Enable: enabled */
| ADC_RESSEL_10BIT_gc; /* 10-bit mode */
/* Select ADC channel */
ADC0.MUXPOS = ADC_MUXPOS_AIN6_gc;
}
uint16_t ADC0_read(void)
{
/* Start ADC conversion */
ADC0.COMMAND = ADC_STCONV_bm;
/* Wait until ADC conversion done */
while ( !(ADC0.INTFLAGS & ADC_RESRDY_bm) )
{
;
}
/* Clear the interrupt flag by writing 1: */
ADC0.INTFLAGS = ADC_RESRDY_bm;
return ADC0.RES;
}
This writes a mask with bit ADC0REFSEL_1 set, which selects the 2.5 V reference voltage.
VREF.CTRLA = VREF_ADC0REFSEL_1_bm;
ADC can't measure voltages above the reference.
Try a different reference value, or select VDD as the reference in ADC0.CTRLC.
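For example (a sketch only; the group-configuration names below are taken from the tiny0/1-series device headers and should be checked against iotn402.h):
/* Option 1: select a higher internal reference, e.g. 4.34 V */
VREF.CTRLA = VREF_ADC0REFSEL_4V34_gc;

/* Option 2: use VDD as the ADC reference instead of the internal one */
ADC0.CTRLC = ADC_PRESC_DIV4_gc | ADC_REFSEL_VDDREF_gc;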
Regarding the second question in the comment.
I got a decimal value of 4092 without this (ADC0.RES >> 2) … and why does it only work after shifting by 2 bits?
The answer is too long to put into a comment.
You have a mistake here: ADC0.CTRLB is used instead of VREF.CTRLB.
ADC0.CTRLB = VREF_ADC0REFEN_bm;
So VREF_ADC0REFEN_bm, which equals 2, is written into the SAMPNUM field of the ADC control register, and ACC4 mode is enabled. This means that on every conversion request the ADC performs four conversions by itself and places the sum in the result register. So you should divide the result by 4 to get the value you expected.
VREF works anyway, because the (absent) line
VREF.CTRLB = VREF_ADC0REFEN_bm;
just enables the ADC0REF output permanently. Without this line the output is enabled automatically on each ADC request, which merely needs a little more setup time.
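A sketch of the corrected initialisation under that reading, keeping the question's 2.5 V reference and single 10-bit conversions (ADC_SAMPNUM_ACC1_gc should be the no-accumulation setting in the tiny0-series headers; alternatively, keep ACC4 and divide the result by 4 with ADC0.RES >> 2):
VREF.CTRLA  = VREF_ADC0REFSEL_1_bm;     /* 2.5 V internal reference, as in the question */
VREF.CTRLB  = VREF_ADC0REFEN_bm;        /* optional: keep ADC0REF permanently enabled */
ADC0.CTRLB  = ADC_SAMPNUM_ACC1_gc;      /* one sample per conversion, no accumulation */
ADC0.CTRLC  = ADC_PRESC_DIV4_gc | ADC_REFSEL_INTREF_gc;
ADC0.CTRLA  = ADC_ENABLE_bm | ADC_RESSEL_10BIT_gc;
ADC0.MUXPOS = ADC_MUXPOS_AIN6_gc;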
I am trying to implement a 16-bit timer overflow interrupt on the ATMEGA168. The idea is to write a message to the UART I/O register when the timer overflows.
I've tested the UART separately and it works fine via RealTerm (baudrate of 9600 bits/s).
I created a base project from https://start.atmel.com/#dashboard where I had to set the input clock frequency to 16 MHz to be compatible with the debugger (see page 5). So I would expect to see a 0x1 on my serial terminal every (16×10^6 / 1024)^-1 × 2^16 ≈ 4.194 seconds.
However, I'm not seeing anything on the terminal regardless of the prescaler I select. Can anyone please advise what could be going wrong?
I have attached the ISR and the main() below:
#include <atmel_start.h>
#include <stdio.h>
#include <usart_basic.h>
#include <atomic.h>
#include <avr/interrupt.h>
#include <avr/io.h>
// timer1 overflow
ISR(TIMER1_OVF_vect) {
// Send 0x1 over UART
UDR0 = 0x1;
}
int main(void) {
atmel_start_init();
// enable timer overflow interrupt for Timer1
TIMSK1 = (1<<TOIE1); // also tried |=
// start 16-bit counter with /1024 prescaler
TCCR1B = (1 << CS10) | (1 << CS12); // also tried |=
TCCR1A = 0x0;
// enable interrupts
sei();
while(true) {
// more code here...
}
}
I have tried to isolate the problem by not writing to UART in the ISR, but just incrementing a counter (declared with the volatile qualifier) and then printing its value to the screen via UART in the while(true) loop. But the counter doesn't increment either and remains stuck at 0.
You have no USART initialisation code. Specifically you don't enable the transmitter or set the baud rate. I accept that you have tried it with a counter, but that is not the code shown so we can come to no conclusion about its correctness or otherwise.
Without initialisation, the transmitter will not run, and the baud rate will be 1 Mbps. You need at least:
// Set baud rate 9600
uint16_t ubrr = (FOSC / 16 / 9600) - 1;
UBRR0H = (uint8_t)(ubrr >> 8);
UBRR0L = (uint8_t)ubrr;
// Enable transmitter
UCSR0B = (1<<TXEN0);
// Note reset state frame is N,8,1
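With the 16 MHz clock from the question, that formula works out to UBRR0 = 103 for 9600 baud.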
I am not convinced that it matters, but your timer initialisation order is not idiomatic. You would normally enable the interrupt after setting the prescaler and any other configuration, and, to ensure the first period is a complete one, reset the counter to zero immediately before enabling interrupts.
// set up timer with prescaler = 1024
TCCR1B = (1 << CS12) | (1 << CS10);
// initialise counter
TCNT1 = 0;
// enable overflow interrupt
TIMSK1 = (1 << TOIE1);
// enable global interrupts
sei();
As I said, I am not sure that will fix your problem but the elided part:
while(true) {
// more code here...
}
may well be the code that is breaking it. You would do well to discount that possibility by disabling or removing any code there temporarily.
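For what it's worth, here is a minimal sketch that drops the START framework and configures both peripherals by hand; it assumes a 16 MHz clock and sends 0x1 roughly every 4.19 seconds:
#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

ISR(TIMER1_OVF_vect)
{
    while (!(UCSR0A & (1 << UDRE0)))
        ;                               // wait for the transmit buffer to empty
    UDR0 = 0x1;                         // send the marker byte
}

int main(void)
{
    // USART0: 9600 baud at 16 MHz, transmitter enabled, default N,8,1 frame
    uint16_t ubrr = (16000000UL / 16 / 9600) - 1;   // = 103
    UBRR0H = (uint8_t)(ubrr >> 8);
    UBRR0L = (uint8_t)ubrr;
    UCSR0B = (1 << TXEN0);

    // Timer1: normal mode, prescaler 1024, overflow interrupt
    TCCR1A = 0;
    TCCR1B = (1 << CS12) | (1 << CS10);
    TCNT1  = 0;
    TIMSK1 = (1 << TOIE1);

    sei();

    while (1)
    {
        // nothing to do; the ISR sends a byte on every overflow
    }
}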
I'm a beginner in this field. My goal is to change the output of 8 LEDs (which are connected to PORTA) according to the potentiometer. I have connected the middle line of the potentiometer to PF0, which is ADC0. I also connected the other two lines to the 5V and ground.
I know there's no problem with the chip or connection because the LEDs are working just fine.
But no matter how I change the code below (by "changing" I mean slightly altering the ADMUX and ADCSRA registers), no output is shown!
I am using an ATmega128 with a 16 MHz clock. Below is the code I'm trying to fix.
#include <asf.h>
#include <avr/io.h>
#define F_CPU 16000000L
int init_board(void)
{
DDRA=0xff;
PORTA=0x01;
}
int ADC_init(void)
{
//ADCSRA
ADCSRA = 0b10000111;
//ADMUX
ADMUX = 0b01100000; // middle line connected to ADC0
}
int main (void)
{
init_board();
ADC_init();
ADCSRA |= (ADSC >> 1);
while(1)
{
if(ADSC == 0)
{
uint8_t disp_value = ADCL;
PORTA = disp_value;
delay_ms(200);
ADCSRA |= (ADSC >> 1);
}
}
}
I have no idea why the code doesn't work.
I suppose it's because I didn't set my registers correctly, but I've followed all the instructions in the ATmega128 datasheet.
First issue is your bit shifting, it should be ADCSRA |= (1 << ADSC).
The next issue is reading the result. You set bit 5 of ADMUX to 1, so ADLAR = 1; in that mode the result is left-adjusted, so you should read ADCH.
Moreover, when you switch to 10-bit resolution, i.e. you start working with multi-byte results, be aware that reading only ADCL is not enough; see section 23.3 of the datasheet for the explanation: "Once ADCL is read, ADC access to data registers is blocked. This means that if ADCL has been read, and a conversion completes before ADCH is read, neither register is updated and the result from the conversion is lost. When ADCH is read, ADC access to the ADCH and ADCL Registers is re-enabled."
Lastly, using hardcoded delays for reading is not good practice, especially if you later change the code to read the ADC as fast as possible. In that case, after starting a conversion you should check whether the ADIF flag is set, or react via an interrupt when ADIE is set. Refer to the datasheet for details.
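A minimal sketch of the corrected loop under those points, keeping the question's wiring (pot on ADC0, LEDs on PORTA) and the 8-bit/ADLAR reading; it uses avr-libc's _delay_ms() rather than ASF's delay_ms():
#define F_CPU 16000000UL
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRA   = 0xFF;                                   // LEDs as outputs
    ADMUX  = (1 << REFS0) | (1 << ADLAR);            // AVcc reference, left-adjusted, channel ADC0
    ADCSRA = (1 << ADEN) | (1 << ADPS2)
           | (1 << ADPS1) | (1 << ADPS0);            // enable ADC, prescaler /128

    while (1)
    {
        ADCSRA |= (1 << ADSC);                       // start a conversion
        while (ADCSRA & (1 << ADSC))                 // wait until it completes
            ;
        PORTA = ADCH;                                // top 8 bits of the left-adjusted result
        _delay_ms(200);
    }
}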
I have some code which should read the values of a couple of ADC pins, each time around the commutator loop.
static uint16_t adc0;
static uint16_t adc1;
void init(void) {
...
hw_configure_adcs();
...
}
void loop(void) {
...
adc0 = hw_read_adc(0);
adc1 = hw_read_adc(1);
...
}
void hw_configure_adcs(void) {
ADCSRA = (1<<ADEN) | (1<<ADPS2) | (1<<ADPS0);
}
uint16_t hw_read_adc(uint8_t n) {
ADMUX = (1<<REFS0) | (n & 0x07);
ADCSRA |= (1<<ADSC); // start conversion
uint16_t count;
for (count = 0; !(ADCSRA & (1<<ADIF)); count++); // wait for conversion to complete
// ADCSRA |= (1<<ADIF); // tried with and without this
return (ADCH << 8) | ADCL; // return ADC value
}
What I see is odd: The values of adc0 and adc1 are set to the same value and never change, until the AVR chip is restarted/reflashed.
(The value is 0x00d1 at 0.71V and 0x0128 at 1.00V, which seems reasonable.)
I have tried:
Turning the voltage down: adc0 and adc1 stay constant and only go down when the AVR code is reflashed (and hence the chip restarted).
Turning the voltage up: adc0 and adc1 stay constant and only go up when the AVR code is reflashed (and hence the chip restarted).
Returning count from hw_read_adc() instead of the ADC value: This returns varying numbers between 0x34 and 0x38 which are different for the two ADCs and continuously vary over time.
From these tests, I infer that the ADCs are being read, but that I am missing some "clear ADCH and ADCL and get them ready to accept the new reading" step.
I have re-read section 23 of http://www.atmel.com/images/Atmel-8272-8-bit-AVR-microcontroller-ATmega164A_PA-324A_PA-644A_PA-1284_P_datasheet.pdf many times, but have obviously overlooked something vital.
After much googling, I found: http://www.avrfreaks.net/forum/adc-only-happens-once-reset
The problem was that return (ADCH << 8) | ADCL; was compiled so that it read the high register first (as you might expect).
Page 252 of the datasheet says: "Otherwise, ADCL must be read first, then ADCH".
Changing my code to return ADC fixed the problem.
My guess as to what was happening is:
The read from ADCH occurred.
The read from ADCL had the effect of locking the ADC-result, to prevent tearing.
The next ADC read had nowhere to write its result, as the ADC-result was locked.
Repeat...
The problem with your code is that you read ADCH first.
ADCL must be read first, then ADCH, to ensure that the content of the Data Registers belongs to the same conversion. Once ADCL is read, ADC access to Data Registers is blocked. This means that if ADCL has been read, and a conversion completes before ADCH is read, neither register is updated and the result from the conversion is lost. When ADCH is read, ADC access to the ADCH and ADCL Registers is re-enabled.
So the correct code should be:
return ADCL | (ADCH << 8) ;
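Note that the two operands of | have no guaranteed evaluation order in C either, so the safest pattern is to read the registers into temporaries (or simply use the combined ADC/ADCW register avr-libc provides, as the other answer's return ADC does). A sketch, with a hypothetical helper name:
uint16_t read_adc_result(void)            // hypothetical name, not from the question
{
    uint8_t low  = ADCL;                  // must be read first
    uint8_t high = ADCH;                  // reading ADCH re-enables register updates
    return ((uint16_t)high << 8) | low;   // assemble the 10-bit result
}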
I am trying to read values from ADC0 on the ATmega328P. The expected values are between 0 and 5 V, because ADC0 is connected to a potentiometer fed from the 5 V output of an Xplained Mini. I usually get either 0 V or 5 V, with no variation when the potentiometer is changed. I have looked at multiple ADC examples and tutorials online but can't find the error in my code.
void adc_initialise (){
//set vref to AVcc, channel selection is initially ADC0 as wanted
ADMUX |= (1<<6);
//set ADC enable, set ADC clock prescaler to 64
ADCSRA |= (1<<7)|(1<<2)|(1<<1);
}
uint16_t adc_read (){
ADCSRA |= (1<<6); // start conversion
while( ADCSRA & (1<<ADSC) ); //wait until conversion is complete
return ADCW;
}
float adc_calculation(uint16_t adcValue){
float stepSize = (5.0/1024.0);
float voltageIn = adcValue*stepSize;
return voltageIn;
}
Then in my main() I have:
while(1){
adc_initialise();
uint16_t adcValue = adc_read();
float voltageIn = adc_calculation(adcValue);
adcConverterToUART(voltageIn);//I know that this part of the code is working as I have hardcoded many test values and all have transmitted correctly.
}
And as mentioned above I know the error is not in my UART code but somewhere in the above ADC code. Cheers in advance for any help.
I can mention a few things; you could try them and see if they help.
You should call adc_initialise() before the while(1); at the moment you initialise it again and again.
In while( ADCSRA & (1<<ADSC) ); you could perhaps add NOPs so the compiler doesn't optimise the wait loop out of the code.
The rest looks good to me.
Do you get any value from the conversion?
Regards
EDIT1:
I looked in one of my old files.
There we did it like this to get the value from the ADC.
// Get count value
adValue = ADCL;
adValue |= (UI_16_t)(ADCH << 8);
ADCL is the low byte of the value and ADCH the high byte.
We shift the high byte up in front of the low byte to get the full value.
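Putting the first point into the question's main(), a minimal sketch (assuming adc_read(), adc_calculation() and adcConverterToUART() are unchanged from the question; note adc_read() already returns the combined result via ADCW):
int main(void)
{
    adc_initialise();                        // configure the ADC once, outside the loop

    while (1)
    {
        uint16_t adcValue = adc_read();      // ADCW combines ADCL and ADCH for us
        float voltageIn = adc_calculation(adcValue);
        adcConverterToUART(voltageIn);
    }
}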