Accuracy of Timer1 as real time clock with PIC Interrupts on 16F* - c

I'm using C with the BoostC compiler. I'm worried about how accurate my code is. The configuration below ticks at more or less 1 Hz, tested with an LED to the naked eye. (It uses an external 32 kHz watch crystal for Timer1 on the 16F74.)
I'm hoping someone can tell me...
Does my code below need any adjustments? What would be the easiest way of measuring the accuracy to the nearest CPU clock period? Will I need to get dirty with assembly to reliably ensure accuracy of the 1Hz signal?
I'm hoping the time taken to execute the timer handler (and others) doesn't even come into the picture, since the timer is always counting. So long as the handlers never take longer than one timer tick (1/32768 s) to execute, will the 1Hz signal have essentially the accuracy of the 32kHz crystal?
Thanks
#define T1H_DEFAULT 0x80
#define T1L_DEFAULT 0

volatile char T1H = T1H_DEFAULT;
volatile char T1L = T1L_DEFAULT;

void main(void){
    // setup
    tmr1h = T1H;
    tmr1l = T1L;
    t1con = 0b00001111; // — — T1CKPS1 T1CKPS0 T1OSCEN NOT_T1SYNC TMR1CS TMR1ON
    // ...
    // do nothing repeatedly while no interrupt
    while(1){}
}
void interrupt(void){
    // Handle Timer1
    if (test_bit(pir1, TMR1IF) && test_bit(pie1, TMR1IE)){
        // reload the timer's 2x8 bit value
        tmr1h = T1H;
        tmr1l = T1L;
        // do things triggered by this time tick
        // reset T1 interrupt flag
        clear_bit(pir1, TMR1IF);
    } else {
        // ... handle other interrupts
    }
}

I can see some improvements...
Your timer reload inside the interrupt isn't accurate.
When you set the timer counter in the interrupt...
tmr1h = T1H;
tmr1l = T1L;
... you overwrite the current value, which is bad for accuracy: any ticks counted since the overflow are thrown away. Instead, leave tmr1l alone and write only the high byte:
tmr1h = T1H; // tmr1h must still be 0 at this point!
Or even better, just set bit 7 of the tmr1h register.
The compiler must turn this statement into a single asm instruction like...
bsf tmr1h, 7
...to avoid losing counts in the tmr1 register: if it compiles to more than one instruction (a read-modify-write sequence), the hardware can increment the counter between those instructions.
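A minimal sketch of the ISR with that change (BoostC syntax; it assumes set_bit() on a constant bit position compiles to a single bsf, which is worth confirming in the generated listing):

void interrupt(void){
    if (test_bit(pir1, TMR1IF) && test_bit(pie1, TMR1IE)){
        // Reload by setting only bit 7: tmr1l keeps counting, so any ticks
        // accumulated since the overflow are preserved and the period stays
        // locked to the 32.768 kHz crystal.
        set_bit(tmr1h, 7); // should assemble to a single: bsf tmr1h, 7
        // do things triggered by this time tick
        clear_bit(pir1, TMR1IF);
    }
}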

Related

PIC16F877A timer1 interrupt time is not as expected

I implemented an interrupt on TIMER1 of a PIC16F877A MCU on a PIC-DIP40 development board. I configured the timer prescaler to 1 and the preload value to 55536 so that the interrupt period is 0.01 s. Fosc is 4 MHz. So my calculation is:
interrupt time = (4 / Fosc) * (65536 - 55536) = (4 / 4000000) * (65536 - 55536) = 0.01 s
A counter of 100 then builds the 1 s interval.
Currently I have no oscilloscope to test the actual 1 s interval, so I am blinking one LED (LED2) from the timer interrupt and another LED (LED1) at the same nominal 1 s interval using the __delay_ms(1000); function.
As expected, the two LEDs blink synchronously at first (they turn on and off at the same time). But after some iterations there is a clear difference between their blink times, and after several minutes the difference is almost 1 s. So the timer interrupt is not working as expected.
Is my calculation of the interrupt time wrong, or am I missing something in the Timer1 configuration?
The overall goal is to generate a 1 s time interval and test its validity without using an oscilloscope.
Here is my code:
// CONFIG
#pragma config FOSC = HS // Oscillator Selection bits (HS oscillator)
#pragma config WDTE = OFF // Watchdog Timer Enable bit (WDT disabled)
#pragma config PWRTE = OFF // Power-up Timer Enable bit (PWRT disabled)
#pragma config BOREN = OFF // Brown-out Reset Enable bit (BOR disabled)
#pragma config LVP = OFF // Low-Voltage (Single-Supply) In-Circuit Serial Programming Enable bit (RB3 is digital I/O, HV on MCLR must be used for programming)
#pragma config CPD = OFF // Data EEPROM Memory Code Protection bit (Data EEPROM code protection off)
#pragma config WRT = OFF // Flash Program Memory Write Enable bits (Write protection off; all program memory may be written to by EECON control)
#pragma config CP = OFF // Flash Program Memory Code Protection bit (Code protection off)
#include <xc.h>
#include <stdint.h>

#define _XTAL_FREQ 4000000

#define LED1_ON PORTDbits.RD7 = 0
#define LED1_OFF PORTDbits.RD7 = 1
#define LED2_ON PORTDbits.RD6 = 0
#define LED2_OFF PORTDbits.RD6 = 1
#define LED2_TOGGLE PORTDbits.RD6 = ~PORTDbits.RD6

uint16_t preloadValue = 55536;
uint16_t counter = 0;
uint16_t secCounter1 = 100;

void io_config() {
    TRISD &= ~((1 << _PORTD_RD7_POSITION) | (1 << _PORTD_RD6_POSITION)); // RD7 and RD6 are output LEDs
}

void timer1_init() {
    TMR1 = preloadValue; // loading the preload value
    T1CON &= ~((1 << _T1CON_T1CKPS1_POSN) | (1 << _T1CON_T1CKPS0_POSN) | (1 << _T1CON_TMR1CS_POSN)); // prescaler 1:1, clock is Fosc/4
    T1CONbits.TMR1ON = 1; // timer 1 is ON
    LED2_ON;
}

void interrupt_en_configure() {
    INTCON |= (1 << _INTCON_GIE_POSITION) | (1 << _INTCON_PEIE_POSITION); // global and peripheral interrupts on
    PIE1 |= _PIE1_TMR1IE_MASK; // timer 1 interrupt enable
    TMR1IF = 0; // clearing interrupt flag
}

void __interrupt() ISR() {
    if (TMR1IF) {
        counter++;
        if (counter == secCounter1) {
            counter = 0;
            LED2_TOGGLE;
        }
        TMR1 = preloadValue;
        TMR1IF = 0;
    }
}

void main(void) {
    io_config();
    interrupt_en_configure();
    timer1_init();
    while (1) {
        LED1_ON;
        __delay_ms(1000);
        LED1_OFF;
        __delay_ms(1000);
    }
}
You should not expect them to stay synchronous, for the following reasons:
First, you do not know how __delay_ms() is implemented or what "promises" of precision it makes. It is certainly not using TIMER1, because you are controlling that. In fact the documentation gives some implementation details, and you really cannot expect precision.
Secondly, even if __delay_ms() were both accurate and synchronous, you are invoking it in a loop, with the software overhead of the loop, the function call, and whatever you do to toggle the LED. That adds a few cycles on every iteration to the software-timed LED, while the interrupt interval is locked to the hardware and independent of software timing.
The issue of the precision of __delay_ms() is in fact addressed in this Microchip support article, where it starts:
If an accurate delay is required, or if there are other tasks that can be performed during the delay, then using a timer to generate an interrupt is the best way to proceed.
In this case you should trust your code over the library-provided delay, which is intentionally crude (because it does not use up a valuable H/W timer resource).
__delay_ms() delays by running an empty loop, but it commonly cannot be exact. You would need to look at the actual machine code to calculate the real delay. BTW, this is not rocket science and a great learning task. (Been there, done that.)
The rest of your loop (LED switching, looping) adds to this. Therefore, your purely software-driven blinker is not exact.
However, your interrupt-driven blinker isn't exact either. You reset the timer at the end of the ISR, after several clock cycles have passed, and you need to take this into account; don't forget the interrupt latency, either. Even worse, depending on the conditional statement, the reset happens at different times after the timer overflow.
Producing exact timing is difficult, especially with such a simple device.
The solution is to avoid software entirely for the timer reset. Please read chapter 8 of the data sheet and use the capture/compare/PWM module to reset the timer on the appropriate value.
The worst that could still happen is some jitter, because the ISR might have different latencies, but the timer itself runs as exactly as your system's crystal. On average your LED will blink correctly.
Anyway, if your timing requirements are not that hard, consider living with some inaccuracy. Then use the simplest solution you like best.
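For reference, here is a minimal sketch of the special event trigger approach mentioned above (XC8 register names assumed): CCP1 in compare mode with CCP1M = 1011 resets Timer1 in hardware on every match, so the 10 ms period does not depend on ISR latency. The values assume Fosc = 4 MHz as in the question; treat it as an outline rather than a drop-in replacement:

#include <xc.h>

volatile unsigned char ticks = 0;

void timer1_ccp_init(void){
    T1CON = 0x00;              // prescaler 1:1, internal clock (Fosc/4), timer off
    TMR1H = 0;
    TMR1L = 0;
    CCPR1H = (10000 - 1) >> 8; // match every 10000 ticks = 10 ms at Fosc/4 = 1 MHz
    CCPR1L = (10000 - 1) & 0xFF;
    CCP1CON = 0x0B;            // compare mode, special event trigger: hardware resets TMR1
    PIR1bits.CCP1IF = 0;
    PIE1bits.CCP1IE = 1;       // interrupt on every compare match
    INTCONbits.PEIE = 1;
    INTCONbits.GIE = 1;
    T1CONbits.TMR1ON = 1;
}

void __interrupt() ISR(void){
    if (PIR1bits.CCP1IF){
        PIR1bits.CCP1IF = 0;
        if (++ticks == 100){   // 100 x 10 ms = 1 s, unaffected by ISR latency
            ticks = 0;
            // toggle the LED here
        }
    }
}

void main(void){
    timer1_ccp_init();
    while (1) {}
}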

Erroneous Pulsetrain Timing

I am having some trouble with a timer on my Arduino atmega328p-pu with 16MHz clock.
I have a really simple program with only one timer, two ISRs, and one pin.
The program does the following:
It iterates through the bits of 'sequence' and sets pin 4 high or low respectively. However, it doesn't hold the pin high for the entire bit period, only 1/12 of it. What you see below is a single timer that counts up from 0 to 340: ISRB fires at 28, ISRA fires at 340, and then the count loops (that is what CTC mode does, it restarts after ISRA). ISRB always turns the pin off, and ISRA decides whether the pin should go high.
Now the problem. All the timing works for each bit, but for some reason the loop-over event causes the pulse spacing to SHORTEN. Yes, shorten, not widen (which, if anything, is what I would expect because of extra clock cycles spent executing the loop-over code).
It makes waveforms that look like this:
_|_|_|_|_ _ _ _ _|_|_|_||_|_|_|_ _ _ _
You can see that the problem resides in the junction between two packets, but the rest of the timing is good. I can't seem to track down why.
#include <stdint.h>

uint32_t sequence = 0b111100001111; // example data sequence
uint8_t packetlength = 12;
uint8_t index = 0;

void setup(){
    DDRD = 0xFF;      // all of port D as output
    bitSet(DDRD, 4);  // board pin 4 output
    bitSet(PORTD, 4); // start high

    // initialize timer1
    TCCR1A = 0;       // zero the timer control registers
    TCCR1B = 0;
    TCNT1 = 0;        // set timer counter to 0
    OCR1A = 340;      // compare match register: 340*62.5ns = 21.25us
    OCR1B = 28;       // 28*62.5ns = 1.75us
    TIMSK1 = 0;
    TCCR1B |= (1 << WGM12);  // CTC mode
    TCCR1B |= (1 << CS10);   // CS10: no prescaler (use CS12 for a /256 prescaler when testing)
    TIMSK1 |= (1 << OCIE1A); // enable timer compare A interrupt
    TIMSK1 |= (1 << OCIE1B); // enable timer compare B interrupt
}

ISR(TIMER1_COMPA_vect){ // controls bit repeat rate
    if (bitRead(sequence, index) == 1){
        bitSet(PORTD, 4); // set high
    }
    index++;
    if (index == packetlength){ // loop over when end reached
        index = 0;
    }
}

ISR(TIMER1_COMPB_vect){ // controls duty cycle
    bitClear(PORTD, 4); // set low
}

void loop(){
    // nothing
}
Edit: April 5. Scope photos demonstrating the inter-pulsetrain period shortening.
The important measurement value is BX-AX.
Normal: 340 counts + 6 clock cycles of calculation (best estimate from the scope).
Bad: the timer only counts 284 cycles before the interrupt fires.
Also bad, but not a huge problem: this pulse is far too wide to be reasonably explained by the clock cycles needed to set the bit low. It appears to take 17 cycles, where I would expect 3.
I do not see why you should expect precise timing at the bit output. Interrupts begin after a delay once requested, and that delay varies with the execution time of whatever instruction is being interrupted. I suspect (without seeing evidence in your report of the problem) that the variation you see matches instruction execution time variation.
If you want precise hardware output timing, you must either use never-interrupted programmed I/O or use the various flip-bit-upon-timer-compare features of the uP's hardware peripheral set. The ISR can be used to set things up for the next compare, but not to directly flip output bits.
Once you've figured out how to set up the action performed by the hardware upon comparator matches, it will be simpler to do it all in a single ISR. That service routine can arrange for both the conditional bit set and the following unconditional bit clear. You probably want the ISR to run during the lengthier part of the cycle so that latency in the actual running of your [a] ISR code does not cause the setup to be too late.
[a. In addition to your ISR code, the programming environment wraps what you wrote in some context save and restore. This can add unexpected execution cycles. Auto-generated context save/restore is often extravagant about tucking away state so that naive programmers are not puzzled by strange foreground-background interactions.]
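As a concrete starting point, a minimal sketch (AVR-GCC / Arduino, ATmega328P assumed) of letting Timer1's compare hardware drive the pin, as suggested above. The output has to move to a compare-output pin (OC1B = PB2, board pin 10), since PD4 cannot be driven by Timer1. This generates the question's 340-count period with a 28-count pulse continuously and with no interrupt-latency jitter on the edges; the per-bit gating of 'sequence' would be layered on top, e.g. from a COMPA ISR that adjusts the compare settings one bit period ahead:

#include <avr/io.h>

void setup(){
    DDRB |= _BV(DDB2); // OC1B (PB2, board pin 10) as output
    TCNT1 = 0;
    OCR1A = 340;       // TOP: bit period = OCR1A + 1 timer ticks
    OCR1B = 28;        // pulse width: set at BOTTOM, cleared at match, all in hardware
    // Fast PWM with TOP = OCR1A (WGM13:0 = 1111), OC1B non-inverting, no prescaler.
    TCCR1A = _BV(COM1B1) | _BV(WGM11) | _BV(WGM10);
    TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS10);
}

void loop(){
    // nothing: the waveform is generated entirely by the timer hardware
}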

CCS PICC CCP settings

(I'll gladly post code if someone can point out how to paste it in here without using the 4 space indentation system that doesn't work)
Hello folks
After ~9 hours racking my brains, I can't find an answer or see where my calculations are going wrong... but they are.
I have a circuit built around a Microchip 18F2550 microcontroller.
I am using this circuit to measure the delay between 2 signals, using the 2 CCP registers in capture mode.
This all works, and the result is sent to the PC (over USB serial) all dandy, but the results are wrong.
I have to apply a gain of ~16000 to any result to get somewhere near the delay presented to the pins.
The setup is as follows:
Timer1 is set as an internal timer
Timer3 is disabled
relevant interrupts are enabled
and the main routine runs continuously.
When I get a rising edge detection on the CCP1 pin, the interrupt is configured to reset Timer1 to zero, as well as the overflow counter:
#INT_CCP1
void ccp1_isr() // Captures the rising edge of the CCP1 pin.
{
    if(timing==FALSE){ // only do this on the first edge; any bouncing will reset timers etc.
        set_timer1(0);
        T1_Overflow = 0;
        Pulse_Time = 0;
        timing = 1; // Set flag to indicate timing.
        output_high(BLUE_LED);
    }
}
The timing flag ensures the timers cannot be reset by another pulse on the CCP1 pin.
Timer1 is then reset and counts as normal. Every time it rolls over past 65535 (it is a 16-bit timer), another interrupt fires and the overflow count is incremented.
#INT_TIMER1
void isr()
{
    T1_Overflow++;
}
Finally, when the input pin on CCP2 goes high, the CCP2 interrupt is triggered. This captures the value of the CCP register (which is the value of Timer1 at the moment of the capture) and the overflow count.
#INT_CCP2
void ccp2_isr()
{
    if(timing == TRUE){ // only output this when preceded by CCP1
        if(Count_Done == FALSE) // do this once only
        {
            Count_Done = TRUE; // also flags the main routine to output data to the terminal
            Pulse_Time = CCP_2;
            Pulse_Overflow = T1_Overflow;
            measureCount++; // increment the number of measures
        }
        output_low(BLUE_LED);
        timing = FALSE;
    }
}
CCP1 can now start responding to the inputs again.
The idea of this is that every time I get a pulse on CCP1 followed by one on CCP2, a string is sent to the terminal with a counter, the number of overflows, and the captured timer value.
while(TRUE) // do forever while connected
{
    usb_task(); // keep usb alive
    if(Count_Done == TRUE)
    {
        printf(usb_cdc_putc, "%lu , %lu , %lu \r\n", measureCount, Pulse_Time, Pulse_Overflow);
        Count_Done = FALSE;
    }
}
So I should get an output on the terminal of something like "1,61553,35", which by my calculation corresponds to a ~12 ms delay between CCP1 and CCP2.
The problem is that these are the results I am getting for a 200 ms pulse provided to the circuit (verified twice).
So where am I going wrong?
I have a 48MHZ Clock with no prescaler which implies a cycle every 20ns.
Divide by for 4 instructions per cycle for the clock which implies 5.2ns every cycle
16 bit timer which implies rollover every 65535*5.2ns = 341us per rollover.
when you do the calculations (0.000341*pulse_overflow)+pulse_time*(5.2*(10^-9))
then the above data gives 0012.27ms and not the 200ms provided.
Can anyone point out where I am going wrong with these calculations???
Your error is in "Divide by for 4 instructions per cycle for the clock which implies 5.2ns every cycle".
The counter ticks once every 4 clock cycles, not 4 times per cycle. So the correct figures are:
2.08333E-08 s per oscillator cycle
8.33333E-08 s per timer tick
0.005461333 s per rollover
You are off by a factor of 16.
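As a sanity check, plugging the sample reading "1,61553,35" into the corrected tick period recovers roughly the 200 ms pulse. A small standalone C program (assuming the reading means measure #1, 61553 ticks, 35 overflows):

#include <stdio.h>

int main(void)
{
    const double t_tick = 4.0 / 48e6; /* 83.33 ns per Timer1 tick at 48 MHz, no prescaler */
    unsigned long pulse_time = 61553, pulse_overflow = 35;

    /* total ticks = full rollovers plus the captured remainder */
    double seconds = (pulse_overflow * 65536.0 + pulse_time) * t_tick;
    printf("%.4f s\n", seconds); /* prints 0.1963, i.e. ~200 ms */
    return 0;
}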

Understanding How DelayMS Function Works With PIC18F Microcontroller

I am trying to use a PIC18F45K22 to flash an LED on for a second, then off for a second, repeating indefinitely. The code incorporates a delay function that keeps the PIC busy for one second, but I am not sure how it works (it was arrived at through trial and error). To achieve a 1 second delay, tWait should equal 125,000. I am using an 8 MHz MCU clock with the PLL disabled. I know that 8 MHz corresponds to a 125 ns period, but after that I am stuck. Can anyone explain to me how this works? I've been at it for a few hours and it's driving me bonkers.
#define SYSTEM_FREQUENCY (8000000) // 8MHz

// Function prototype
void delayS(unsigned int sec);

void main()
{
    // Set RA0 to output...
    TRISA.RA0 = 0;
    do
    {
        // Turn RA0 off...
        LATA.RA0 = 0;
        delayS(1);
        // Turn RA0 on...
        LATA.RA0 = 1;
        delayS(1);
    } while(1); // Repeat...
}

// My attempt at writing the function...
void delayS(unsigned int sec)
{
    unsigned long tWait, tStart;
    /*
    To achieve a 1 Second On 1 Second Off cycle, tWait should be 125,000.
    How? Why? 64 is arbitrary to achieve 125,000.
    */
    tWait = sec*(SYSTEM_FREQUENCY/64);
    for(tStart = 0; tStart < tWait; tStart++);
}
The clock frequency is 8 MHz, so there are 2·10^6 instruction cycles per second, since one instruction cycle takes 4 clock ticks. Any assembly operation of the microcontroller takes at least one cycle. There is still a factor of 16 to explain.
The factor of 16 must correspond to one pass of the loop. Since unsigned long variables are used in the loop, it is not surprising that one pass takes a number of instructions. You would need to look at the assembly code to understand what is happening and get a precise count, like here.
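Putting the numbers together (taking the estimated 16 instruction cycles per loop pass): one pass takes 16 × (4 / 8 MHz) = 8 µs, so the 125,000 passes take 125,000 × 8 µs = 1 s, which matches tWait = SYSTEM_FREQUENCY / 64 = 125,000 in the code.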
If you are working with MPLAB and the xc8 compiler, there are library functions you can use, such as Delay10KTCYx(unsigned char);.
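For example, a one-second busy wait built from that routine might look like this (a sketch assuming the legacy plib delays.h header; Delay10KTCYx(n) waits n × 10,000 instruction cycles):

#include <delays.h> // legacy C18/XC8 peripheral library (an assumption)

void delay_one_second(void)
{
    // At Fosc = 8 MHz one instruction cycle is 4/Fosc = 0.5 us,
    // so 1 s = 2,000,000 cycles = 200 * 10,000 cycles.
    Delay10KTCYx(200);
}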
The microcontroller spends 99.99% of its time counting, which is not critical if you are just flashing LEDs. But to use a PIC efficiently, you will want to avoid such busy-wait functions and use timers and interrupts instead. Have fun!

Arduino Nano Timers

I want to know more about Arduino Nano timers.
What timers are there?
Do they produce interrupts?
What code would attach an interrupt handler to them?
How are delay() and delayMicroseconds() implemented...
Do they use timer interrupts? (If so, how can I have other code execute during this?)
Or do they repeatedly poll until a timer reaches a certain value?
Or do they increment a value X number of times?
Or do they do it another way?
The best way to think about the Arduino Nano timers is to think about the timers in the underlying chip: the ATmega328. It has three timers:
Timer 0: 8-bit, PWM on chip pins 11 and 12
Timer 1: 16-bit, PWM on chip pins 15 and 16
Timer 2: 8-bit, PWM on chip pins 17 and 5
All of these timers can produce two kinds of interrupts:
The "value matched" interrupt occurs when the timer value, which is incremented on every timer tick, reaches a comparison value in the output-compare register.
The timer overflow interrupt occurs when the timer value reaches its maximum value.
Unfortunately, there is no Arduino function to attach interrupts to timers. To use timer interrupts you will need to write slightly more low-level code. Basically, you will need to declare an interrupt routine something like this:
ISR(TIMER1_OVF_vect) {
...
}
This declares a function to service the timer1 overflow interrupt. Then you need to enable the timer overflow interrupt using the TIMSK1 register. In the above case this might look like:
TIMSK1 |= (1<<TOIE1);
or
TIMSK1 |= _BV(TOIE1);
This sets the TOIE1 ("generate timer1 overflow interrupt, please") flag in the TIMSK1 register. Assuming that your interrupts are enabled, your ISR(TIMER1_OVF_vect) will get called every time timer1 overflows.
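For instance, a minimal sketch pulling those pieces together (the /256 prescaler and the LED pin choice are illustrative assumptions, not requirements):

#include <avr/io.h>
#include <avr/interrupt.h>

ISR(TIMER1_OVF_vect) {
    PORTB ^= _BV(PB5);   // toggle the Nano's onboard LED (PB5, board pin 13)
}

void setup() {
    DDRB |= _BV(PB5);    // LED pin as output
    TCCR1A = 0;          // normal mode: count 0..65535, then overflow
    TCCR1B = _BV(CS12);  // clock/256: overflow every ~1.05 s at 16 MHz
    TIMSK1 = _BV(TOIE1); // enable the timer1 overflow interrupt
    sei();               // make sure global interrupts are enabled
}

void loop() {}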
The Arduino delay() function looks as follows in the source code (wiring.c):
void delay(unsigned long ms)
{
    uint16_t start = (uint16_t)micros();

    while (ms > 0) {
        if (((uint16_t)micros() - start) >= 1000) {
            ms--;
            start += 1000;
        }
    }
}
So internally it uses the micros() function, which indeed relies on the timer0 count. The Arduino framework uses timer0 to count milliseconds; the timer0 count is where the millis() function gets its value.
The delayMicroseconds() function, on the other hand, uses well-timed processor operations to create the delay; which code path is taken depends on the processor and the clock speed. The most common building block is a nop() (no operation), which takes exactly one clock cycle. The Arduino Nano uses a 16 MHz clock, and here's what the source code looks like for that case:
// For a one-microsecond delay, simply return. The overhead
// of the function call yields a delay of approximately 1 1/8 µs.
if (--us == 0)
return;
// The following loop takes a quarter of a microsecond (4 cycles)
// per iteration, so execute it four times for each microsecond of
// delay requested.
us <<= 2;
// Account for the time taken in the proceeding commands.
us -= 2;
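The computed count is then burned in the calibrated busy-wait loop those comments refer to (quoted from the same source, give or take core-version differences):
// busy wait
__asm__ __volatile__ (
    "1: sbiw %0, 1" "\n\t"           // 2 cycles
    "brne 1b" : "=w" (us) : "0" (us) // branch: 2 cycles
);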
What we learn from this:
A 1 µs delay does nothing (the function call overhead itself is the delay).
Longer delays use the left shift to convert the microsecond count into iterations of a four-cycle busy-wait loop.
