AVR photocell not working with Arduino UNO [closed]

Closed last year. This question was caused by a typo or a problem that can no longer be reproduced, and it is not accepting answers.
I was following a tutorial I found to get a photocell to light an LED, as a first step towards building a Morse coder/decoder. I only have an Arduino UNO available to work with rather than a bare ATmega328P chip. I have connected the photocell to pin A0 and the LED to pin ~9 on the UNO. I tried to rewrite the code for my current setup, but it comes up with three errors and four warnings that I cannot figure out how to solve. Any help or advice would be greatly appreciated. I'm using Atmel Studio 7 (GCC, C).
#ifndef F_CPU
#define F_CPU 16000000UL
#endif
#include <avr/io.h>
#include <util/delay.h>
#define PHOTO1 0 //sets photocell to PORTC pinout A0
#define LED1 9 //sets LED to PORTB pinout 9
void init_ports_mcu()
{
DDRB = 0xFFu; //set all pins at PORTB as output
PORTB = 0x00u; // sets pins at PORTB as low - LED off
DDRC = 0xFFu; //sets all pins at PORTC as output
DDRC &= ~(1<<0); //Makes first pin at PORTC input
PORTC = 0x00u; //sets all pins at PORTC as low-turning off Photocell
}
int map(int x, int in_min, int in_max, int out_min, int out_max)
{
return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min;
}
void ADC_init()
{
//Enable ADC sampling freq to set prescaler to max value
ADCSRA |= (1<<ADEN) | (1<<ADPS2) | (1<<ADPS1) | (1<<ADPS0);
ADMUX = (1<<REFS0); //select required channel passed as input
}
uint16_t get_lightLevel()
{
_delay_ms 10; //wait for line in channel to get selected
ADCSRA |= (1<<ADSC); //start ADC conversion
while (ADCSRA & (1<<ADSC)); //wait for conversion to complete
_delay_ms 10;
return (ADC);
}
int main(void)
{
init_ports_mcu(); //setup microcontroller i/o ports
ADC_init(); //initialize ADC
while (1)
{
switch(map(get_lightLevel(), 0, 1023, 0, 3)){ //read and map ADC values
case 0: //high light intensity - LED is off
PORTB &= ~(1<<LED1);
PORTC &= ~(1<<PHOTO1);
break;
case 1: //middle light intensity - LED is on
PORTB = (1<<LED1);
PORTC &= ~(1<<PHOTO1);
break;
case 2: //low light intensity - LED is on
PORTB = (1<<LED1);
PORTC = (1<<PHOTO1);
break;
}
}
return (0);
}
Errors:
Recipe for target 'main.o' failed - Line 76
expected ';' before numeric constant - Line 44
expected ';' before numeric constant - Line 49
Warnings:
statement with no effect [-Wunused-value] - Line 44
statement with no effect [-Wunused-value] - Line 49
large integer implicitly truncated to unsigned type [-Woverflow] - Line 69
large integer implicitly truncated to unsigned type [-Woverflow] - Line 73

The formatting of your delays is incorrect. According to the <util/delay.h> documentation (Convenience functions for busy-wait delay loops), two functions are available:
void _delay_ms (double __ms)
void _delay_us (double __us)
You should treat these as function calls, so you need to write:
_delay_ms(10);
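Applied to the code in the question, get_lightLevel() would then look roughly like this (only the two delay lines change):

uint16_t get_lightLevel()
{
    _delay_ms(10);                  //wait for line in channel to get selected
    ADCSRA |= (1<<ADSC);            //start ADC conversion
    while (ADCSRA & (1<<ADSC));     //wait for conversion to complete
    _delay_ms(10);
    return (ADC);
}

The same parenthesised form applies to every other _delay_ms / _delay_us call.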

Related

ISR_INT0_PD2 won't work but main function is infinitely working ATMEGA32

Below is code to enable the INT0_vect interrupt when PD2 is pressed. The code never executes the ISR, but always runs the counter loop from 0 to 9 on the 7-segment display on PORTC in the main function. I also tried sei(); instead of setting the I-bit in SREG directly. Any ideas?
#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/delay.h>
#define ISR_INT0_PD2 INT0_vect
ISR( ISR_INT0_PD2 ){
PORTC = 0x00;
_delay_ms(100);
}
int main(void)
{
int i=0;
DDRC=0xff; //portc is o/p
PORTC=0x00; // all pins on portc is 0 volt
MCUCR |= (1<<1); // falling edge
GICR |=(1<<6); // enable INT0 set pin6
SREG |=(1<<7); // set GIE pin7
while(1)
{
for(i=0;i<10;i++)
{
PORTC=i;
_delay_ms(1000);
}
}
}
[Below is the screenshot from the simulator I've been using]
For the interrupt to execute you need to call sei() defined in <avr/interrupt.h>.
https://www.nongnu.org/avr-libc/user-manual/group__avr__interrupts.html#gaad5ebd34cb344c26ac87594f79b06b73
EDIT: I was mistaken when I removed the line SREG |= (1 << 7); according to my link, that is equivalent to sei(). After I wrote the example below I realized the registers are named differently on the ATmega32, so unfortunately the code below won't run as-is.
Based on the datasheet for the ATmega32 your code should work. Have you tried removing the for loop and driving PORTC to logic high (e.g. PORTC = 255)? When I was writing the code for the ATmega168 I noticed that the LED I was using was very dim with the code in your while loop. Also check that the INT0 pin is connected via a pull-up/pull-down resistor.
This is the code that works on my ATmega168; if I swap the register names to the ones used on the ATmega32, I end up with your code:
#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/delay.h>
#define ISR_INT0_PD2 INT0_vect
ISR( ISR_INT0_PD2 ){
// If there is a logic change on any pin hold the pins attached to PORTC low for 100ms.
PORTC = 0;
_delay_ms(100);
// Release PORTC to logic high.
PORTC = 255;
}
int main(void)
{
DDRC = 255; // Set all pins on PORTC to be outputs.
PORTC= 255; // Set all pins on PORTC to be logic high.
EIMSK = 0b00000001; // Enable the external interrupt request for INT0.
EICRA = 0b00000001; // Set the external interrupt control register A so that
// any logical change on INT0 generates an interrupt request.
sei(); // Set global interrupt enable.
while(1)
{
PORTC=255; // Blink the entire PORTC bank.
_delay_ms(20);
PORTC=0;
_delay_ms(20);
}
}
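One more thing on the pull-up/pull-down remark above: if there is no external resistor on the INT0 pin, the AVR's internal pull-up can be enabled in main() instead (a sketch, assuming the button switches PD2 to ground):

DDRD  &= ~(1 << PD2);   // PD2 (INT0) as input
PORTD |=  (1 << PD2);   // enable the internal pull-up so the pin idles high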

Handling multiple interrupts in AVR

I am new to AVR programming, so sorry if the question is trivial.
Using:
OS: Windows 7
IDE: Atmel Studio
uC: ATmega328P (m328p)
Pins:
ADC signal - ADC0/PC0
LED_values - (PB0 - PB7)
LED_START - PD1
LED_LIGHT - PD0
BUTTON - PD2
Goal: when the button is pressed, it should turn on LED_START and start a conversion.
The AVR gets the interrupt and starts the ADC conversion. Basically the program has two interrupts. I know that the INT0 interrupt has the highest priority.
I don't know how to deal with them.
I have tried several things, like adding a global variable "start" and changing it. Also, when I only set LED_START, it turns on and stays in that state until LED_values reach a certain value, then LED_START turns off by itself.
So please can you show me how to handle the two interrupts so that the stated goal is fulfilled, and explain what I'm doing wrong?
#include <avr/io.h>
#include <util/delay.h>
#include <avr/interrupt.h>
#define F_CPU 1000000UL
#define BIT_IS_SET(byte, bit) (byte & (1 << bit))
#define BIT_IS_CLEAR(byte, bit) (!(byte & (1 << bit)))
typedef enum{false, true} bool;
bool previousState = false;
bool start = false;
char num;
void setup();
void loop();
void ADC_init();
void EI_init(); // External Interrupt
int main(void)
{
setup();
loop();
}
void setup(){
DDRC &= ~(0x1); // LDR Input
DDRB = 0xFF; //LEDs value Output
DDRD |= 0x3; //LED light LED start Output
DDRD &= ~(1 << PIND2); //Button Input
}
void loop(){
PORTD |= (1 << PIND2);
EI_init();
ADC_init();
sei();
if(start){
ADCSRA |= (1 << ADSC);
}
while(1){}
}
void ADC_init(){
ADMUX = 0x60;
ADCSRA = 0x8B;
ADCSRB = 0x0;
ADCH = 0x0;
}
ISR(ADC_vect) {
PORTB = ADCH; // assign contents of ADC high register to Port D pins
int b = (int)ADCH;
if(b > 180) { //100
PORTD = 0x1;
}else{
PORTD &= ~(0x1);
}
_delay_ms(100);
ADCSRA |= (1 << ADSC); // start next ADC
}
void EI_init(){
EIMSK |= (1 << INT0); // Interrupt enabled
EICRA |= (1 << ISC00); // any state change
}
ISR(INT0_vect){
if(BIT_IS_CLEAR(PORTD,PIND2)){
start = true;
}else{
start = false;
}
}
Here is the schematic: scheme
First of all, you should make start volatile, since it is used by both the main loop and the interrupt. The volatile keyword tells the compiler that the variable might be modified by things outside of its control, so it cannot optimize away any reads or writes to the variable:
volatile bool start = false;
Secondly, you probably want to remove this line you wrote at the end of loop:
while(1){}
That line is bad because it causes your program to go into an infinite loop where it does nothing. I think you actually want the code you wrote above it in the loop function to run multiple times.
Thirdly, after you detect that the start flag has been set, you probably need to set it back to false, or else it will stay true forever.
Fourth, setting start to false in the INT0 ISR might be a bad idea, because it might get set to false before your main loop has a chance to observe it being true and handle the event. I guess it really depends on exactly what you are trying to do. You could try adding details to your question about exactly what problem you are trying to solve using the AVR. See What is the XY problem?.
There are probably other issues with your code that need to be debugged. Can you think of any ways to make this simpler? Maybe you can reduce the number of interrupts you are using. To debug, you can try blinking some LEDs to figure out what parts of your program are executing.
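Putting those points together, here is a sketch of what loop() could look like with the flag polled (and cleared) inside the while loop. It assumes start is the volatile flag set by ISR(INT0_vect) and that LED_START is on PD1 as listed in the question:

void loop(){
    PORTD |= (1 << PIND2);          // keep the pull-up on the button pin
    EI_init();
    ADC_init();
    sei();
    while(1){
        if(start){                  // set by ISR(INT0_vect) on a button press
            start = false;          // consume the event
            PORTD |= (1 << PIND1);  // turn on LED_START
            ADCSRA |= (1 << ADSC);  // kick off the first ADC conversion
        }
    }
}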

AVR timer overflow interrupt not working

Hello good people of Stack Overflow. My problem is an interrupt service routine (ISR) that seemingly never executes! Here's some info on my setup:
I am flashing an AVR ATtiny85. I have the bare bones of a project set up so far, with simply a main.c and two modules: timer and hardwareInit. In the timer module, I have a timer0_init function that I am using to set up timer0 in CTC mode to overflow every 1 ms. Here is the function:
void timer0_init( void )
{
cli();
TCCR0B |= 3; //clock select is divided by 64.
TCCR0A |= 2; //sets mode to CTC
OCR0A = 0x7C; //sets TOP to 124 so the timer will overflow every 1 ms.
TIMSK |= 2; //Enable overflow interrupt
sei(); //enable global interrupts
}
With the timer set up, I added an ISR to increment ticks every time the counter overflows, so I can keep track of how much time has elapsed, etc.
ISR(TIMER0_OVF_vect)
{
cli();
//ticks ++;
PORTB |= ( 1 << PORTB0 );
sei();
}
As you can see, I commented out the ticks++ because it wasn't working, and replaced it with PORTB |= ( 1 << PORTB0 );, which simply turns on an LED, so if the interrupt is ever executed I will know by the LED being on.
Unfortunately, I can't get it to turn on and can't see what I'm missing. (To prove that I (1) have the LED set up on the right pin and (2) am manipulating the correct bit in the correct register, I put just the statement PORTB |= ( 1 << PORTB0 ); in my infinite loop and confirmed the LED came on.)
For further explanation, here is my main.c:
/*================================= main.c =================================*/
#define F_CPU 8000000UL
#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/delay.h>
#include "timer.h"
#include "hardwareInit.h"
int main(){
//Initialize hardware HERE
DDRB |= ( 1 << PORTB0 ); //set this pin as an output for an LED
SetClockPrescale(1); //internal clock divided by 1 = 8 MHz, from hardwareInit
timer0_init(); //set up timer0 for 1 ms overflow
while(1)
{
/* if( getTicks() > 0 )
{
PORTB |= ( 1 << PORTB0 );
_delay_ms(1000);
PORTB &= ~( 1 << PORTB0 );
_delay_ms(1000);
} */
}
return 0;
}
So, what you see in the infinite loop is what I tried first. After that didn't work, I tried something simpler: just an empty loop (previous attempt commented out), waiting for the interrupt to be triggered, which would turn on the LED.
Any help you could give would be really appreciated. I'm quite puzzled why this hasn't been working.
You are using the wrong ISR, as @andars has correctly pointed out. In CTC ("Clear Timer on Compare") mode the timer will never overflow, because it is cleared on every compare match.
So you have enabled the wrong timer interrupt as well. Bit 1 of the TIMSK register enables the timer overflow interrupt on timer0, which will never be triggered for the reason above (taken from the datasheet).
As you are using OCR0A to set the compare value, you have to enable Bit 4 – OCIE0A: Timer/Counter0 Output Compare Match A Interrupt Enable.
Back to the ISR: you need ISR(TIMER0_COMPA_vect) or ISR(TIMER0_COMPB_vect), depending on which bit you set in TIMSK. Note that the compare value should be written into the matching register as well, OCR0A or OCR0B.
Note that you can use the bit names in your code just like the register names; in my opinion it makes the code more transparent.
Your code should be changed as follows to enable the corresponding interrupt:
void timer0_init( void )
{
cli();
TCCR0B |= (1<<CS01) | (1<<CS00); //clock select is divided by 64.
TCCR0A |= (1<<WGM01); //sets mode to CTC
OCR0A = 0x7C; //sets TOP to 124 so compare match A fires every 1 ms (8 MHz / 64 / 125)
TIMSK |= (1<<OCIE0A); //Output Compare Match A Interrupt Enable
sei(); //enable global interrupts
}
The ISR:
ISR(TIMER0_COMPA_vect)
{
cli();
//ticks ++;
PORTB |= ( 1 << PORTB0 );
sei();
}
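If the original goal of counting milliseconds in ticks is kept, here is a sketch of how the counter could be shared safely with main(); ticks and getTicks() are assumed from the question, and the atomic block guards the multi-byte read:

#include <util/atomic.h>

volatile uint32_t ticks = 0;           // written by the ISR, read by main()

ISR(TIMER0_COMPA_vect)
{
    ticks++;                           // one compare match = 1 ms
}

uint32_t getTicks(void)
{
    uint32_t copy;
    ATOMIC_BLOCK(ATOMIC_RESTORESTATE)  // a 32-bit read is not atomic on an 8-bit AVR
    {
        copy = ticks;
    }
    return copy;
}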

LED and switches program failure on a Tiva C board

I'm trying to run this code on a Tiva C board. sw2 is connected to PF0, sw1 to PF4, and the RGB LED to PF1, PF2 and PF3.
When I press sw2 it should turn the LED blue, if sw1 is pressed it should turn the LED green, and otherwise it should be red.
The code doesn't function properly. I hope you can point out what I did wrong.
/*************************
PORT F Addresses
*************************/
#define RCGCGPIO (*((volatile unsigned long*)0x400FE608)) //CLOCK
#define PORTFDATA (*((volatile unsigned long*)0x400253FC)) //DATA
#define PORTFDIR (*((volatile unsigned long*)0x40025400)) //DIRECTION
#define PORTFDEN (*((volatile unsigned long*)0x4002551C)) //ENABLE
#define PORTFLOCK (*((volatile unsigned long*)0x40025520)) //LOCK (lock or unlocks PF0)
#define PORTFCR (*((volatile unsigned long*)0x40025524)) //COMMIT (uncommit PF0)
#define PORTFPUR (*((volatile unsigned long*)0x40025510)) // PULL UP resistor
#define PORTFPDR (*((volatile unsigned long*)0x40025514)) // PULL Down resistor
/*************************/
int sw1;
int sw2;
int delay;
int main (void)
{
RCGCGPIO = 0x20; //Enable clock for PORT F
delay = RCGCGPIO;
PORTFLOCK = 0x4C4F434B; // unlock commit reg
PORTFCR = 0x01; // unlock PF0
PORTFDEN = 0x1F; //Enable pins 0 to 4
PORTFDIR = 0x0E; // pins 0 and 4 input - pins 1,2,3 output
PORTFPUR = 0x11;
while (1)
{
sw2 = PORTFDATA & 0x00000001;
sw1 = PORTFDATA & 0x00000010;
if (sw1 == 1)
PORTFDATA |= 0x00000002;
else if (sw2 == 1)
PORTFDATA |= 0x00000004;
else
PORTFDATA |= 0x00000008;
}
}
Here are two obvious problems with your code. There are probably more...
You set sw1 = PORTFDATA & 0x00000010, so the only possible values sw1 can have are 0x10 or 0x00. Then you test if (sw1 == 1). But this test will never be true because sw1 can never equal 1.
You use the |= operator to set the bits of PORTFDATA. But nowhere do you ever clear the bits of PORTFDATA. So your LEDs may turn on but they will never turn off.
Have you tried the sample code that comes with CCS to make sure the hardware is functional?
The sample code uses the TI library instead of plain register reads and writes; however, you can dig out the actual registers with "go to definition". Anyway, I'm a bit curious why you use plain registers in the first place instead of the TI library. An ARM MCU is not an 8051 anymore.
A couple of things to keep in mind:
The LaunchPad buttons are negative logic, so if your switch bit reads 1, it is NOT being pressed.
Try something like:
if (sw1)                   // nonzero -> switch not pressed
    PORTFDATA &= ~(0x0E);  // LEDs off
else
    PORTFDATA ^= 0x02;     // toggle red
I am also learning ARM and C purely by self-study, so I feel your pain. It took many hours of toggling single LEDs with bitwise operators just to fathom what is going on. Stick with it!
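Putting the earlier points together (mask values that match what is actually read, clearing the LED bits on every pass, and treating the buttons as active-low), the loop might look roughly like this; the LaunchPad's usual PF1 = red, PF2 = blue, PF3 = green mapping is assumed:

while (1)
{
    sw2 = PORTFDATA & 0x01;      // PF0: 0x01 when released, 0 when pressed
    sw1 = PORTFDATA & 0x10;      // PF4: 0x10 when released, 0 when pressed
    PORTFDATA &= ~0x0E;          // clear PF1-PF3 before choosing a new colour
    if (sw1 == 0)                // sw1 pressed
        PORTFDATA |= 0x08;       // green (PF3)
    else if (sw2 == 0)           // sw2 pressed
        PORTFDATA |= 0x04;       // blue (PF2)
    else
        PORTFDATA |= 0x02;       // red (PF1)
}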

Understanding UART under an ATMEGA168A

I am trying to create a C program which receives a char via UART, "prints" the corresponding binary value by turning on 8 LEDs on my breadboard, and sends the char back to the transmitter.
Here is the code I am using:
//CPU clock
#define F_CPU 1000000UL
//Baud
#define BAUD 9600
//Baud rate
#define BAUDRATE ((F_CPU)/(BAUD*16UL)-1)
#include <avr/io.h>
#include <util/delay.h>
#include <util/setbaud.h>
#include <avr/interrupt.h>
#include <stdint.h>
//Communication Parameters:
//8 bits of data
//1 bit stop
//No parity
void uart_init(void){
//Bit 7 - RXCIEn: RX complete interrupt enable
//Bit 6 - TXCIEn: TX complete interrupt enable
//Bit 5 - UDRIE: USART data register empty interrupt enable
//Bit 4 - RXENn: Receiver enable
//Bit 3 - TXENn: Transmitter enable
UCSR0B = 0b10011000;
//Bit 7 - RXCn: USART receive complete.
//Bit 6 - TXCn: USART transmit complete
//Bit 5 - UDREn: USART data register empty
UCSR0A = 0b00000000;
//Bit 11:0 – UBRR11:0: USART baud rate register
//Whereas H are the higher bits and L the lower bits
//It comes from the setbaud.h
UBRR0H = UBRRH_VALUE;
UBRR0L = UBRRL_VALUE;
//Bit 7:6 - UMSELn1:0: USART mode select
//00 Asynchronous USART
//01 Synchronous USART
//11 Master SPI
//Bit 5:3 - Reserved bits in MSPI mode
//Bit 2 - UDORDn: Data order
//Bit 1 - UCPHAn: Clock phase
//Bit 0 - UCPOLn: Clock polarity
UCSR0C = 0b10000110;
}
// function to send data
void uart_transmit (uint8_t data)
{
while (!( UCSR0A & (1<<UDRE0))); // wait until the data register is free
UDR0 = data; // load data in the register
}
int main (void)
{
//Starts UART
uart_init();
//All led GPIOs as output
DDRB = 0xFF;
DDRC = 0x01;
//Enabling interrupts
sei();
while(1)
{
;
}
return 0;
}
ISR(USART_RX_vect)
{
//Variable to hold the incoming char
uint8_t received_bit = UDR0;
PORTC ^= 0x01;
PORTB = 0x00;
PORTB = received_bit;
uart_transmit(received_bit);
}
When I flash it to the chip and start using it, I get a weird behaviour.
I am sending a "U" which is a nice binary 01010101 to compare with.
However, I am getting weird answers back from my chip.
My questions regarding UART on the ATmega168A are the following:
When setting F_CPU, am I supposed to stay with the 1 MHz used by the ATmega168A, or do I have to use that of my transmitter (an Intel i7)? Could that be the problem?
When does UDR0 get "updated"? Whenever I hit enter to send the character to the chip via the terminal?
What could be generating this issue?
In the function uart_init() you set bits 7:6 of UCSR0C to 10, which is a reserved state according to the ATmega168A manual. To get the desired asynchronous UART functionality, set them to 00:
UCSR0C = 0b00000110;
The other reason why your example was not working was the baudrate settings, as explained in my comment below.
You already included the <util/setbaud.h> header file, which contains macros to make UART setup easier. Look here for the documentation. These macros take the input provided by you in F_CPU and BAUDRATE and calculate the settings for the UART configuration registers (UBRRH_VALUE and UBRRL_VALUE).
You used it almost correctly; however, to take advantage of the UART baud rate doubling feature of the ATmega, add the following code after setting the UBRR0H/L values:
#if USE_2X
UCSR0A |= (1 << U2X0);
#else
UCSR0A &= ~(1 << U2X0);
#endif
This sets or clears the U2X0 bit dependent on the calculations of the setbaud macros.
Also, I believe you can remove the line
#define BAUDRATE ((F_CPU)/(BAUD*16UL)-1)
because that's exactly what setbaud.h does.
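Putting all of this together, uart_init() might end up looking roughly like this (a sketch, assuming the same 8N1 asynchronous setup with the RX-complete interrupt that the question uses):

void uart_init(void){
    // Receiver and transmitter enabled, RX-complete interrupt enabled.
    UCSR0B = (1 << RXCIE0) | (1 << RXEN0) | (1 << TXEN0);
    // Asynchronous USART, no parity, 1 stop bit, 8 data bits.
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00);
    // Baud rate registers calculated by <util/setbaud.h>.
    UBRR0H = UBRRH_VALUE;
    UBRR0L = UBRRL_VALUE;
#if USE_2X
    UCSR0A |= (1 << U2X0);
#else
    UCSR0A &= ~(1 << U2X0);
#endif
}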
