Variable value stays "0" even after assigning it a value - C

I'm trying to write a function that produces a delay on an ATmega32 using Timer0, but I can't get the delay right. When I debugged it, I found that the variable T_tick never changes from 0, and since other calculations depend on its value, nothing works correctly. I don't know what's wrong with this variable and I've been stuck here for a while, so please help.
My code is as follows, all in one file:
#include <math.h>
#include "registers.h"
#define CPU_frequency 1000000
#define set_bit(x,Bit_num) x|=(1<<Bit_num)
#define clr_bit(x,Bit_num) x&=~(1<<Bit_num)
#define tolge(x,Bit_num) x^=(1<<Bit_num)
//timer configuration
#define Normal 'N'
#define PWM_paseCorrect 'O'
#define PWM_fast 'p'
#define CTC 'Q'
double T_tick = 0,T_maxDelay = 0;
uint32_t overflowsNumber = 0, T_initValue = 0, overflowCounter = 0;
// set the timer mode
void timer0_init(uint8_t timerMood)
{
    switch(timerMood)
    {
        case Normal:
            TCCR0 = 0x00;
            break;
        case PWM_paseCorrect:
            TCCR0 = 0x40;
            break;
        case CTC:
            TCCR0 = 0x08;
            break;
        case PWM_fast:
            TCCR0 = 0x48;
            break;
    }
}
void delayT0(double delay)
{
    //convert delay from ms to seconds
    delay = delay/1000;
    //calculate tick time
    T_tick = 1/CPU_frequency;
    //calculate max delay time
    T_maxDelay = 256*T_tick;
    //calculate the number of overflows needed
    overflowsNumber = ceil(delay/T_maxDelay);
    //calculate the timer initial value
    T_initValue = 256 - (delay/T_tick)/overflowsNumber;
    //set timer initial value
    TCNT0 = T_initValue;
    //start timer with no prescaling
    set_bit(TCCR0,0);
    //loop to count the overflows
    overflowCounter = 0;
    while (overflowCounter < overflowsNumber)
    {
        //wait until overflow flag = 1
        while ((TIFR & (1<<0)) == 0)
            //clear overflow flag
            set_bit(TIFR,0);
        overflowCounter++;
    }
    overflowCounter = 0;
    TCCR0 = 0x00;
}
void LED_init(uint8_t pinNumber)
{
    set_bit(DDRA,pinNumber);
}
void LED_TOGLE(uint8_t pinNumber)
{
    tolge(PORTA,pinNumber);
}
int main(void)
{
    LED_init(0);
    timer0_init(Normal);
    while (1)
    {
        LED_TOGLE(0);
        delayT0(512);
    }
}
The delay is supposed to be 512 ms, but it's only about 1 ms, and I think that's because of the variable T_tick, whose problem I can't figure out.
If anyone can help, please do; I've been stuck on this for too long.

Your problem is simple: in the line T_tick = 1 / CPU_frequency;, the macro expands after preprocessing to the equivalent of T_tick = 1 / 1000000;, and the right-hand side is integer division, which always results in zero.
My compiler even gives a warning on this line:
Clang-Tidy: Result of integer division used in a floating point context; possible loss of precision
Since 1 / 1000000 == 0 in integer arithmetic, T_tick is assigned zero every time.
So write either T_tick = 1.0 / CPU_frequency; or T_tick = (double)1 / CPU_frequency; to force floating-point division.
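A quick standalone illustration of the difference (a minimal sketch, not part of the original code):

#include <stdio.h>

int main(void)
{
    double as_int  = 1 / 1000000;       /* integer division: evaluates to 0 before the assignment */
    double as_real = 1.0 / 1000000;     /* floating-point division: 1e-06 */
    printf("%g %g\n", as_int, as_real); /* prints: 0 1e-06 */
    return 0;
}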
Also, don't forget to turn on all your compiler warnings via compiler flags (for example -Wall -Wextra on GCC/Clang); it really helps.

Related

dsPIC33 ADC1 gives incorrect converted values

Good morning,
I am using the ADC1 of a dsPIC33EP512GM604 and getting incorrect converted values.
To check this, I ran a loop of 10 consecutive sample/convert cycles.
The first value is always quite different from the rest, but it is the nearest to the "right" value.
Here is the relevant code:
/* Setup ADC1 for measuring R */
ANSELBbits.ANSB3 = 1;   // ensure AN5 is analog
TRISBbits.TRISB3 = 1;   // ensure AN5 is input
AD1CON1 = 0;
AD1CON1bits.ADSIDL = 1;
AD1CON1bits.AD12B = 1;
AD1CON1bits.SSRC = 7;
AD1CON2 = 0;
AD1CON2bits.VCFG = 0b001;
AD1CON2bits.SMPI = 0;
AD1CON3 = 0;
AD1CON3bits.SAMC = 0b11111;
AD1CON3bits.ADCS = 0;
AD1CON4 = 0;            // no DMA
AD1CHS0bits.CH0NA = 0;
AD1CHS0bits.CH0SA = 5;
IFS0bits.AD1IF = 0;     // Clear the A/D interrupt flag bit
IEC0bits.AD1IE = 0;     // Do not enable A/D interrupt
/* Read voltage value */
AD1CON1bits.ADON = 1;   // Enable the A/D converter
__delay_us(25);
for (N=0; N<10; N++) {
    AD1CON1bits.SAMP = 1;
    __delay_us(5);                 // Wait for sampling time (min 3 TAD)
    AD1CON1bits.SAMP = 0;          // Start the conversion
    while (!AD1CON1bits.DONE);     // wait for conversion to finish
    res[N] = (double) ADC1BUF0;
    /* --- just for test ---*/
    sprintf(deb,"ADC1BUF0 = %.0f\r\n", res[N]);
    WriteStringUART1(deb);
    /* ---- end of test ----*/
}
And here are the results, for a fixed input voltage corresponding to a value of 215:
ADC1BUF0 = 222
ADC1BUF0 = 301
ADC1BUF0 = 296
ADC1BUF0 = 295
ADC1BUF0 = 295
ADC1BUF0 = 296
ADC1BUF0 = 296
ADC1BUF0 = 296
ADC1BUF0 = 296
ADC1BUF0 = 295
The first value, 222, is acceptably close to the expected 215 for my purposes; the other values are not.
What am I doing wrong?
I've used a dsPIC33FJ64MC802 and was able to use its ADC.
I'm not sure why your readings behave that way, but the code below worked for me. However, I can't say for sure that it will work properly for you.
void initADC() {
    AD1CON1 = 0;
    AD1CON1bits.AD12B = 1;
    AD1CON2 = 0;
    AD1CON3 = 0;
    AD1CON3bits.ADCS = 2;
    AD1CHS0 = 0;
    AD1CON1bits.ADON = 1;
    delayMs(1);
}
int readADC(char pin, unsigned samplingCycles) {
    AD1CHS0bits.CH0SA = pin;
    AD1CON1bits.SAMP = 1;
    __delay32(samplingCycles);
    AD1CON1bits.SAMP = 0;
    while(!AD1CON1bits.DONE);
    return ADC1BUF0;
}
Thanks to everybody for the contributions.
I finally found the trick, and I'm posting the answer to my own question in case it helps someone.
Here it is:
The dsPIC can use different sources for the ADC1 VREFH, i.e. the internal Vdd or an external voltage on a pin.
My hardware feeds an external 2.5 V reference to a dsPIC pin to be used as VREFH, and I had set the ADC accordingly.
The problem is that the dsPIC specs state that an external VREFH must be greater than 2.7 V, so 2.5 V was simply not enough for it to work well. Foolish of me, but that was it!
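In code terms the workaround is either to supply an external reference above 2.7 V or to fall back to the internal one, for example (a sketch only; the VCFG encoding matches the comment in the answer below):

AD1CON2bits.VCFG = 0b000; // VREFH = AVdd, VREFL = AVss (internal references)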
Here is an ADC code example I made before, for AN2 with a dsPIC33EV.
#define FCY 3685000UL // __delay_ms() and __delay_us() depend on FCY.
#include <xc.h>       // p33EV256GM102.h
#include <libpic30.h>

void pins_init()
{
    // ANSELx has a default value of 0xFFFF
    // 1 = Pin is configured as an analog input
    // 0 = Pin is configured as a digital I/O pin
    ANSELA = ANSELB = 0x0000; // In most cases, it is better to initially set them to 0.
}

unsigned short adc1_raw = 0;

void adc1_init()
{
    // See 70621c.pdf, Example 16-3: Code Sequence to Set Up ADC Inputs
    ANSELBbits.ANSB0 = 1;  // pin 4 - AN2/RPI32/RB0. 1 = Pin is configured as an analog input
    AD1CHS0bits.CH0SA = 2; // Channel 0 positive input is AN2. See AD1CON2bits.ALTS
    AD1CHS0bits.CH0NA = 0; // 0 = Channel 0 negative input is VREFL
    AD1CON1bits.FORM = 0;  // 00 = unsigned integer VREFL:0..VREFH:1023
    AD1CON1bits.ASAM = 0;  // 0 = Sampling begins when SAMP bit is set
    AD1CON1bits.AD12B = 0; // 0 = 10-bit, 4-channel ADC operation
    AD1CON2bits.VCFG = 0;  // AVdd for VREFH and AVss for VREFL
    AD1CON2bits.CHPS = 0;  // 00 = Converts CH0. Select 1-channel mode.
    AD1CON2bits.CSCNA = 0; // 0 = Does not scan inputs
    AD1CON2bits.ALTS = 0;  // 0 = Always uses channel input selects for Sample MUX A
    AD1CON3bits.ADRC = 0;  // 0 = clocked from the instruction cycle clock (TCY)
    AD1CON3bits.ADCS = 15; // Conversion Clock Select bits. Tad=Tcy*(ADCS+1)=(1/Fcy)*(ADCS+1)
                           // Tad = (1/3685000)*(15+1) = 4342 ns
                           // (10-bit) Tconv = 12 * Tad = 52104 ns = 52.1 us.
    IFS0bits.AD1IF = 0;    // Clear the A/D interrupt flag bit
    IEC0bits.AD1IE = 0;    // Do not enable A/D interrupt
    AD1CON1bits.ADON = 1;  // turn on ADC
    __delay_us(20);        // ADC stabilization delay.
}

unsigned short adc1_read()
{
    AD1CON1bits.SAMP = 1;  // start sampling
    __delay_us(100);       // Wait for (enough) sampling time
    AD1CON1bits.SAMP = 0;  // start converting. See AD1CON1bits.ASAM
    while (!AD1CON1bits.DONE)
    {
        ; // Wait for the conversion to complete.
          // In bare-metal programming, most infinite-loop issues
          // are handled by watchdog reset.
    }
    return ADC1BUF0;
}

int main(void)
{
    pins_init();
    adc1_init();
    while (1)
    {
        ClrWdt();
        adc1_raw = adc1_read();
    }
    return 0;
}

"Call of function without prototype" error message with Delay1TCYx(1) (C language)

We have a project that we're running using C and a PICkit 3. Our issue is that we keep getting the same error message and aren't sure what the problem is. Any help/advice would be great.
void main (void)
{
    ANSEL = 0;           // turn off all other analog inputs
    ANSELH = 0;
    ANSELbits.ANS0 = 1;  // turn on RA0 analog
    ADCON0 = 1;          // justify right
    ADC_Init();          // Call ADC_Init function
    ADC_Convert();
    Delay1TCYx(1);       // delay 1 x 1 cycles
    int reading0 = ADRESL + (ADRESH * 256); // convert result to 10 bits
    if (reading0 = 0b1000000011)
    {
        readingsamples();
    }
    while(1);
}

The ADC buffer does not hold the full value

I am trying to read the ADC value from the potentiometer on the PIC24F Curiosity Development Board (PIC24FJ128GA204) and then turn on the LED if the value is more than 1000 (I configured the ADC as 10-bit).
However, the maximum value that ends up in the buffer is around 500. The following code shows the problem.
Please advise.
#include <xc.h>

#define Pot_TriState    _TRISC0
#define Pot_AnalogState _ANSC0

void ADC_Config(void); // prototype so main() can call it

int main(void) {
    Pot_TriState = 1;
    Pot_AnalogState = 1;
    _TRISC5 = 0;
    _LATC5 = 0;
    ADC_Config();
    while (1) {
        if (ADC1BUF10 >= 1000) {
            _LATC5 = 0; // Never executed
        }
        if (ADC1BUF10 >= 300) {
            _LATC5 = 1;
        }
    }
    return 0;
}

void ADC_Config() {
    AD1CON1bits.ADON = 0;       // ADC must be off when changing configuration
    // start conversion automatically after sampling; configure ADC as either 10- or 12-bit
    AD1CON1bits.SSRC = 7;
    AD1CON1bits.MODE12 = 0;
    AD1CON2bits.PVCFG = 0;      // A/D Converter Positive Voltage Reference Configuration bits
    AD1CON2bits.NVCFG0 = 0;     // A/D Converter Negative Voltage Reference Configuration bit
    AD1CHSbits.CH0SB = 0b01010; // 01010 = AN10
    AD1CHSbits.CH0SA = 0b01010; // added
    AD1CON3bits.ADRC = 1;       // 1 = RC clock --- ADC's internal RC clock
    AD1CON3bits.SAMC = 0b11111; // Auto-Sample Time Select bits: 11111 = 31 TAD
    AD1CON2bits.BUFREGEN = 1;   // Conversion result is loaded into the buffer location determined by the converted channel
    AD1CON1bits.ASAM = 1;
    AD1CON1bits.ADON = 1;
}

Trouble getting Uno32 to run timer 1 interrupt at expected frequency

I have been trying to set up Timer 1 on my PIC32MX320F128H-based development board (Uno32 by Digilent). The clock speed is supposed to be 80 MHz, which should mean each clock cycle takes 12.5 ns. One period of a 500 Hz signal takes 2,000,000 ns, which by that logic is 160,000 clock cycles. I decided to use a prescaler of 8 so that the 16-bit Timer 1 does not roll over. This should mean the timer counts to 20,000 for a 500 Hz frequency.
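For reference, that arithmetic as a minimal compile-time sketch (the macro names are illustrative, not from plib):

#define PBCLK_HZ  80000000UL  /* 80 MHz peripheral clock */
#define PRESCALE  8UL         /* Timer 1 prescaler 1:8 */
#define TARGET_HZ 500UL       /* desired interrupt rate */

/* The timer counts 0..PR1 inclusive, so subtract 1:
   80,000,000 / (8 * 500) - 1 = 20,000 - 1 = 19,999 */
#define PR1_VALUE (PBCLK_HZ / (PRESCALE * TARGET_HZ) - 1UL)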
However, on my oscilloscope I measure a much lower frequency of only 1.57 Hz.
If I change the prescaler to 1, the speed increases as expected to 1.57 × 8 = 12.56 Hz. However, if I then halve PR1 to 9999, the speed does not exactly double, resulting in only 25.2 Hz.
I must be missing something here; can anyone guide me on what I might try?
I have tried adjusting the priority levels and using non-multi-vectored interrupts, to no avail. I have also tried using Timer 2, with similar speed issues.
#include <stdio.h>
#include <stdlib.h>
#include <xc.h>
#include <plib.h>

#define TICKS_PER_SECOND 80000000 // 80MHz

#define PIN2_TRIS  TRISDbits.TRISD8
#define PIN2_BIT   PORTDbits.RD8
#define PIN2_LAT   LATDbits.LATD8
#define PIN3_TRIS  TRISDbits.TRISD0
#define PIN3_BIT   PORTDbits.RD0
#define PIN3_LAT   LATDbits.LATD0
#define PINA0_TRIS TRISBbits.TRISB2
#define PINA0_BIT  PORTBbits.RB2
#define PINA0_LAT  LATBbits.LATB2

int8_t pwmActive = 0;

void __ISR(_TIMER_1_VECTOR, IPL5SOFT) Timer1ISR(void){
    PIN3_LAT = ~PIN3_LAT;
    IFS0bits.T1IF = 0;   // set timer 1 int flag back to off
}

int main(int argc, char** argv) {
//    PIN2_TRIS = 1;
//    PINA0_TRIS = 1;
//
//    PIN3_LAT = 0;
    INTDisableInterrupts();
    T1CONbits.TCKPS = 1; // prescale 8
    T1CONbits.TCS = 0;   // 80MHz internal source
    PR1 = 19999;         // 500Hz period
    TMR1 = 0;            // start timer at 0
    T1CONbits.ON = 1;    // start the timer
    IPC1bits.T1IP = 5;   // int priority 5
    IPC1bits.T1IS = 0;   // sub int priority 0
    IFS0bits.T1IF = 0;   // set timer 1 int flag off
    IEC0bits.T1IE = 1;   // enable interrupt for timer 1
    INTEnableSystemMultiVectoredInt();
    PIN3_TRIS = 0;
    while(1){
    }
    return (EXIT_SUCCESS);
}
I expect to get a 500 Hz square wave toggled on pin 3, but instead the frequency is much lower, at 1.57 Hz.

Simple compression algorithm in C is not working... any suggestions?

I am creating a project on a Nordic micro that reads a value from an analog input terminal and outputs it over UART. I am now trying to compress the data using GE Proficy Historian compression so that only changed data are output over the UART, but my code is not working: the output data are sometimes still redundant.
The idea of the program is to generate an interrupt every certain amount of time, read the ADC value and, if it's different from the previous value, output it to the UART port.
The algorithm is explained here:
http://www.evsystems.net/files/GE_Historian_Compression_Overview.ppt
The main portion of the code that handles the interrupt is shown below.
void ADC_IRQHandler(void)
{
    /* Clear data-ready event */
    NRF_ADC->EVENTS_END = 0;
    // write ADC value to UART port
    // Compression algorithm should occur here
    uint8_t current_Val = NRF_ADC->RESULT;
    //simple_uart_put(current_evaluation);
    // Construct error bands around the held point
    float U_tolerance = current_Val + error_tolerance;
    float L_tolerance = current_Val - error_tolerance; // a signed type is needed since the lower tolerance could be negative
    if( first_Run == false)
    {
        float slope = ((current_Val - Archived_Val) / (held_Time - Archived_Time));
        if (slope > U_Slope || slope > L_Slope)
        {
            Archived_Val = current_Val; // update the archived value
            held_Time = 1;              // reset held time
            simple_uart_put(current_Val);
            first_Run = true;
            Archived_Val = current_Val;
        }
        else
        {
            float Upper_Slope = (U_tolerance - Archived_Val) / (held_Time - Archived_Time);
            float Lower_Slope = (L_tolerance - Archived_Val) / (held_Time - Archived_Time);
            if(Upper_Slope < U_Slope) // lowest upper slope is always taken as a blanket boundary
            {
                U_Slope = Upper_Slope;
            }
            if(Lower_Slope < L_Slope)
            {
                L_Slope = Lower_Slope;
            }
            held_Time += time_increment;
        }
    }
    if (first_Run == true) // first held point is always outputted
    {
        // calculate the slopes of the two lines
        float Upper_Slope = (U_tolerance - Start_Up_Val) / (held_Time - Archived_Time);
        float Lower_Slope = (L_tolerance - Start_Up_Val) / (held_Time - Archived_Time);
        // Update max and min slopes
        if(Upper_Slope < U_Slope) // lowest upper slope is always taken as a blanket boundary
        {
            U_Slope = Upper_Slope;
        }
        if(Lower_Slope < L_Slope)
        {
            L_Slope = Lower_Slope;
        }
        held_Time += time_increment;
        first_Run = false;
        Archived_Val = current_Val;
    }
}
The variables are defined as follows:
uint32_t error_tolerance = 50; // error tolerance value for the swinging door algorithm
uint8_t Start_Up_Val = 100;
float held_Time = 1;
int Archived_Time = 0;
float U_Slope = 2500;
float L_Slope = 0;
//float slope;
uint8_t Archived_Val;
bool GE_Comp(uint8_t, uint8_t, uint8_t, int);
bool first_Run = true;
float time_increment = 0.1;
Thank you all for your contributions, and mostly to @Weather Vane. It was exactly as you and others suggested: the interrupt handler was executing too much code, which prevented it from working properly. I have now fixed the problem by moving those parts of the code into the main function, as suggested.
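For anyone interested, a minimal sketch of that pattern (names are illustrative; the ISR only latches the raw value and a flag, and all processing happens in main):

#include <stdbool.h>
#include <stdint.h>
#include "nrf.h" // NRF_ADC, as used in the original code

static volatile bool    sample_ready = false; // set by the ISR, cleared by the main loop
static volatile uint8_t sample_value = 0;     // latest raw ADC reading

void ADC_IRQHandler(void)
{
    NRF_ADC->EVENTS_END = 0;        // clear data-ready event
    sample_value = NRF_ADC->RESULT; // grab the raw value only
    sample_ready = true;            // defer all processing to main
}

int main(void)
{
    for (;;)
    {
        if (sample_ready)
        {
            sample_ready = false;
            // run the swinging-door compression and UART output here,
            // outside the interrupt context
        }
    }
}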
Regards
