PIC12 TRIS Register not setting - c

I'm trying to program a PIC12C508A for a simple LED learning circuit. I've read some examples, the Microchip datasheet, pic12c508a.h and pic12c508a.inc. I've tried to set the TRIS register from both a C program and an ASM program, but it does not take. Using MPLAB X, the XC8 compiler, and the built-in simulator to check the SFR registers, I can see that TRIS is not updating even when WREG holds the correct value. If anyone has experience with this, please check out my code and see if I am doing something wrong.
#include <xc.h>

// -- CONFIG
#pragma config MCLRE = ON   // GP3/MCLR pin function is MCLR
#pragma config WDT = OFF    // Turn Watchdog Timer off
#pragma config CP = OFF     // Flash Program Memory Code Protection bit (code protection off)
#pragma config OSC = IntRC  // Internal RC oscillator

// -- Internal Frequency
#define _XTAL_FREQ 400000

int main()
{
    TRIS = 0b111010;        // 0x3A
                            // ---0-0 Set GP0 and GP2 as outputs
    GPIO = 0b000100;        // 0x04
                            // ---1-0 Set GP2 HIGH and GP0 LOW
    for (int countdown = 10; countdown > 0; --countdown) {
        __delay_ms(60000);  // Delay 1 minute.
    }
    GPIO = 0b000001;        // 0x01
                            // ---0-1 Set GP2 LOW and GP0 HIGH
    while (1)
        NOP();
}
I also tried it in assembly, which is pretty much identical to the Gooligum tutorials for the baseline PIC models.
        list    p=12c508a
        #include <p12c508a.inc>

        __CONFIG _MCLRE_ON & _CP_OFF & _WDT_OFF & _IntRC_OSC

RCCAL   CODE    0x0FF           ; Processor Reset Vector
        res     1               ; Hold internal RC cal value, as a movlw k

RESET   CODE    0x000           ; RESET VECTOR
        movwf   OSCCAL          ; Factory Calibration

start
        movlw   b'111010'       ; Configure GP0/GP2 as outputs
        tris    GPIO            ;
        movlw   b'000100'       ; Set GP2 HIGH - GREEN LED
        movwf   GPIO

        goto    $               ; loop forever

        END
This all seems pretty straightforward, but when I use breakpoints and examine the SFR registers in the simulator, I can see that the GPIO and TRIS registers are never changed even though WREG holds the correct values. I've examined the ASM output that the XC8 compiler generates and, as far as setting the registers goes, it is almost identical to the ASM I wrote.
I've also tried using HEX values and straight integer values and the results are the same.

The answer is that the oscillator frequency defined at the top of the program did not match the device's actual internal oscillator frequency:
#define _XTAL_FREQ 400000 // that's 400 kHz INTOSC, impossible
Instead it should be
#define _XTAL_FREQ 4000000 // that's 4 MHz INTOSC
@Justin pointed this out in a comment below the original post.
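For completeness, a minimal sketch of the fix (assuming the 12C508A's nominal 4 MHz internal RC). Very large arguments to __delay_ms() can also run into the compiler's inline-delay limit, so the one-minute wait is built from shorter chunks here as a precaution:
#define _XTAL_FREQ 4000000          // 4 MHz INTOSC, the real oscillator frequency

// One-minute wait assembled from 1 s chunks (hedged: keeps each __delay_ms()
// argument comfortably small at this clock rate).
for (unsigned char s = 0; s < 60; ++s) {
    __delay_ms(1000);
}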

First, in order to use GP2 as an output, don't you need to clear the T0CS bit in the OPTION register? (See the sketch at the end of this answer.)
Second, I observe this in the manual:
Note: A read of the ports reads the pins, not the output data latches.
That is, if an output driver on a pin is enabled and driven high, but
the external system is holding it low, a read of the port will
indicate that the pin is low.
but I guess the simulator will assume the external system is not holding down the pin.
Third, the BCF and BSF instructions look like a better way of toggling GP2 and GP0 independently of whatever else is going on in GPIO.
I'm sorry, but other than that I don't know what to suggest.
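As a rough illustration of the first point, a hedged sketch, assuming XC8 maps a write to OPTION onto the baseline option instruction the same way it maps TRIS = ... onto tris:
// Clear T0CS (bit 5) so GP2 is driven by the GPIO latch instead of acting as
// the TMR0 clock input; the other OPTION bits keep their reset value of 1.
OPTION = 0b11011111;    // 0xDF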

You can try a different GPIO pin, because according to the documentation GP2 may be controlled by the OPTION register.

Related

PIC16F877A timer1 interrupt time is not as expected

I implemented an interrupt on TIMER1 of a PIC16F877A MCU on a PIC-DIP40 development board. I configured the timer prescaler to 1 and the preload value to 55536 so that the interrupt period is 0.01 s, and I use a counter of 100 to count a 1 s interval. Fosc is 4 MHz, so my calculation is:
interrupt time = (4 / Fosc) * (65536 - 55536) = (4/4000000) * (65536 - 55536) = 0.01 s
I then use that counter of 100 to generate the 1 s interval.
Currently I have no oscilloscope to test the actual 1 s interval, so I am blinking one LED (LED2) from the timer interrupt and another LED (LED1) at the same 1 s interval using the __delay_ms(1000) function.
As expected, the two LEDs should blink synchronously (turn ON and OFF at the same time). For the first few iterations they do, but after a while there is a clear difference between their blink times, and after several minutes the difference is almost 1 s. So the timer interrupt is not working as expected.
Is my calculation for the interrupt time wrong, or am I missing something in the Timer1 configuration?
The overall goal is to generate a 1 s time interval and test its validity without an oscilloscope.
Here is my code :
// CONFIG
#pragma config FOSC = HS // Oscillator Selection bits (HS oscillator)
#pragma config WDTE = OFF // Watchdog Timer Enable bit (WDT disabled)
#pragma config PWRTE = OFF // Power-up Timer Enable bit (PWRT disabled)
#pragma config BOREN = OFF // Brown-out Reset Enable bit (BOR disabled)
#pragma config LVP = OFF // Low-Voltage (Single-Supply) In-Circuit Serial Programming Enable bit (RB3 is digital I/O, HV on MCLR must be used for programming)
#pragma config CPD = OFF // Data EEPROM Memory Code Protection bit (Data EEPROM code protection off)
#pragma config WRT = OFF // Flash Program Memory Write Enable bits (Write protection off; all program memory may be written to by EECON control)
#pragma config CP = OFF // Flash Program Memory Code Protection bit (Code protection off)
#include <xc.h>
#include <pic16f877a.h>
#include <stdint.h> // for uint16_t used below
#define _XTAL_FREQ 4000000
#define LED1_ON PORTDbits.RD7 = 0
#define LED1_OFF PORTDbits.RD7 = 1
#define LED2_ON PORTDbits.RD6 = 0
#define LED2_OFF PORTDbits.RD6 = 1
#define LED2_TOGGLE PORTDbits.RD6 = ~PORTDbits.RD6
uint16_t preloadValue = 55536 ;
uint16_t counter = 0 ;
uint16_t secCounter1 = 100 ;
void io_config() {
    TRISD &= ~((1 << _PORTD_RD7_POSITION) | (1 << _PORTD_RD6_POSITION)); // RD7 and RD6 are output LEDs
}

void timer1_init() {
    TMR1 = preloadValue;            // loading the preload value
    T1CON &= ~((1 << _T1CON_T1CKPS1_POSN) | (1 << _T1CON_T1CKPS0_POSN) | (1 << _T1CON_TMR1CS_POSN)); // prescaler 1, clock source internal (Fosc/4)
    T1CONbits.TMR1ON = 1;           // timer 1 is ON
    LED2_ON;
}

void interrupt_en_configure() {
    INTCON |= (1 << _INTCON_GIE_POSITION) | (1 << _INTCON_PEIE_POSITION); // global and peripheral interrupts on
    PIE1 |= _PIE1_TMR1IE_MASK;      // timer 1 interrupt enable
    TMR1IF = 0;                     // clearing interrupt flag
}

void __interrupt() ISR() {
    if (TMR1IF) {
        counter++;
        if (counter == secCounter1) {
            counter = 0;
            LED2_TOGGLE;
        }
        TMR1 = preloadValue;
        TMR1IF = 0;
    }
}

void main(void) {
    io_config();
    interrupt_en_configure();
    timer1_init();
    while (1) {
        LED1_ON;
        __delay_ms(1000);
        LED1_OFF;
        __delay_ms(1000);
    }
}
You should not expect them to operate synchronously, for the following reasons:
First, you do not know how __delay_ms() is implemented or what "promises" of precision it makes; it is certainly not using TIMER1, because you are controlling that. In fact the documentation gives some implementation details, and you really cannot expect precision.
Secondly, even if __delay_ms() were both accurate and synchronous, you are invoking it in a loop with the software overhead of the loop, the function call and whatever you are doing to toggle the LED. That adds a few cycles on every iteration, none of which affect the interrupt interval, which is locked to the hardware and independent of the software timing.
The issue of precision of __delay_ms() is in fact addressed in this Microchip support article where it starts:
If an accurate delay is required, or if there are other tasks that can be performed during the delay, then using a timer to generate an interrupt is the best way to proceed.
In this case you should trust your code over the library provided delay which is intentionally crude (because it does not use up a valuable H/W timer resource).
__delay_ms() delays by running an empty loop, but it commonly cannot be exact. You would need to look into the actual machine code that is run to calculate the real delay. BTW, this is not rocket science and a great learning task. (Been there, done that.)
Now the rest of your loop (LED switching, looping) adds to this. Therefore, your purely software-driven blinker is not exact.
However, your interrupt-driven blinker isn't exact either. You reset the timer at the end of the ISR, after several clock cycles have already passed. You need to take this into account, and don't forget the interrupt latency. Even worse, depending on the conditional statement, the reload happens at different times after the timer overflow.
Producing exact timing is difficult, especially with such a simple device.
The solution is to avoid software altogether for resetting the timer. Please read chapter 8 of the data sheet and use the capture/compare/PWM module to reset the timer at the appropriate value (see the sketch at the end of this answer).
The worst thing that could still happen is some jitter, just because the ISR might have different latencies. But the timer runs as exactly as your system's crystal. On average your LED will blink correctly.
Anyway, if your timing requirements are not that hard, consider living with some inaccuracy. Then use the simplest solution you like best.
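A hedged sketch of that suggestion (untested; register and bit names as in the XC8 header for the PIC16F877A): CCP1 in Compare mode with the Special Event Trigger resets Timer1 in hardware every 10 000 ticks, which at Fosc = 4 MHz (Timer1 clocked from Fosc/4 = 1 MHz) is 10 ms, so no software reload is needed:
void timer1_ccp_init(void)
{
    T1CON   = 0x00;               // prescaler 1:1, clock = Fosc/4, timer off
    TMR1H   = 0;
    TMR1L   = 0;
    CCPR1H  = (10000 - 1) >> 8;   // compare match every 10 000 ticks = 10 ms
    CCPR1L  = (10000 - 1) & 0xFF;
    CCP1CON = 0x0B;               // Compare mode, Special Event Trigger (resets TMR1)
    CCP1IF  = 0;
    CCP1IE  = 1;                  // interrupt on every compare match
    PEIE    = 1;
    GIE     = 1;
    T1CONbits.TMR1ON = 1;         // start Timer1
}

void __interrupt() isr(void)
{
    if (CCP1IF) {
        CCP1IF = 0;
        // count 100 matches here for the 1 s tick, then toggle the LED;
        // no TMR1 reload is needed, the compare hardware does the reset
    }
}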

ADC giving strange output for PIC16F15325

I was trying to read an analog voltage on pin RC3 of a PIC16F15325. I have 3.23 V across a potentiometer and its output is nearly 1.65 V, which goes to pin RC3 of the PIC microcontroller. For the configuration and libraries I used MPLAB Code Configurator. The code is as follows:
#include "mcc_generated_files/mcc.h"
void main(void)
{
// initialize the device
SYSTEM_Initialize();
EUSART1_Initialize();
ADC_Initialize();
adc_result_t val1 = 0;
// When using interrupts, you need to set the Global and Peripheral Interrupt Enable bits
// Use the following macros to:
// Enable the Global Interrupts
//INTERRUPT_GlobalInterruptEnable();
// Enable the Peripheral Interrupts
//INTERRUPT_PeripheralInterruptEnable();
// Disable the Global Interrupts
//INTERRUPT_GlobalInterruptDisable();
// Disable the Peripheral Interrupts
//INTERRUPT_PeripheralInterruptDisable();
while (1)
{
// Add your application code
val1 = ADC_GetConversion(19); // selected channel RC3
printf("Value - %hu \n",val1);
DELAY_milliseconds(1000);
}
}
For the voltage I mentioned above, I expected a value near 511. Anything beyond 1023 [i.e. (2^10) - 1] is strange, as this PIC has a 10-bit ADC. However, the output I get is well above 1023.
Kindly help me in solving this issue.
Actually, the output was correct but only looked strange because of its format. I looked into the datasheet, which describes two ways of formatting the ADC result (PIC16(L)F15325/45 datasheet, page 228). Inside ADC_Initialize(), ADCON1 = 0x10, which means the result is left-justified (ADFM = 0). I just did val1 = val1 >> 6; after ADC_GetConversion(19); and it worked as expected.
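In other words, a minimal sketch (assuming ADC_GetConversion() returns the raw ADRESH:ADRESL pair as configured by MCC):
adc_result_t raw = ADC_GetConversion(19);   // channel number from the code above
uint16_t value = (uint16_t)raw >> 6;        // undo the left justification: 10-bit result, 0..1023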

Trying to bit-bang TPIC6C595 shift register but no output

The code is for an AVR ATmega168 Xplained Mini board, with an ATmega168PB MCU. The shift register I am using is a Texas Instruments TPIC6C595. I have the drain outputs of the shift register connected to the anodes of 8 LEDs. The OE(G) pin of the shift register is tied to GND, and CLR is tied to 5 V. There is a 100 nF ceramic capacitor between the shift register's VCC and GND. SER OUT is left unconnected since I am trying to bit-bang just this one device before I move up to chaining shift registers.
What happens is that I get no output from the shift register; all drain outputs are low (tested with a multimeter). When I disconnect SER IN, SRCK and RCK from the microcontroller I get some flickering on only one of the LEDs, which I guess is a result of those pins floating and being in an undefined state. I would have expected at least some kind of garbage output even if the code was wrong, but I get more of an output with the microcontroller completely disconnected. I know the MCU is outputting a signal because I can connect it to the LEDs without the shift register and see them lit up at various levels of intensity, but I do not have an oscilloscope to actually look at the signals.
This is the code, with the defines for the output port at the top of the file included so it's clear what's being done:
#include <avr/io.h>     // DDRD/PORTD registers and _BV()
#include <stdint.h>     // uint8_t

#define DDR_SREG  DDRD
#define PORT_SREG PORTD
#define SRCK _BV(PORTD0)
#define RCK  _BV(PORTD1)
#define SER  _BV(PORTD2)

void display_write(uint8_t data)
{
    char i;

    PORT_SREG &= ~RCK;          // latch low
    for (i = 0; i < 8; ++i) {
        PORT_SREG &= ~SRCK;     // clock low
        if (data & 1)           // serial out
            PORT_SREG |= SER;
        else
            PORT_SREG &= ~SER;
        PORT_SREG |= SRCK;      // clock high
        data >>= 1;             // shift data
    }
    PORT_SREG |= RCK;           // latch high
}
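One thing the excerpt does not show is the data-direction setup; a hedged usage sketch (assuming the DDR bits are not already set elsewhere in the file):
DDR_SREG |= SRCK | RCK | SER;   // drive PD0..PD2 as outputs before bit-banging
display_write(0b10101010);      // shift out a test pattern and latch it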
Solved it. After doing some more research, it was apparent that this shift register has open-drain outputs and cannot source current (it can only sink current). Adjusting the wiring accordingly, I was able to get the shift register working to my satisfaction.

FRDM-KL46Z Internal Reference clock to 48 MHz?

I am working with the IAR compiler for the FRDM-KL46Z platform.
I want to use the internal clock and set it to 48 MHz (or as high as possible).
So far I have done the following in the example sysinit.c file and the sysinit() function provided.
#define NO_PLL_INIT
#if defined(NO_PLL_INIT)
mcg_clk_hz = 48000000; // It only works at 21000000 Hz, otherwise I get garbage prints on UART0.
SIM_SOPT2 &= ~SIM_SOPT_PLLFLLSEL_MASK;
uart0_clk_khz = (mcg_clk_hz) / 1000;
#else
....
In FEI mode, and if I switch to FBI or BLPI mode, I get a much lower MCU clock.
I want the MCU clock to be as high as possible using the internal clock. (According to the datasheet I think this is supported, but I don't know how.)
Can anyone please explain, or point me to any code reference? Much obliged.
Fixed it by doing this
#define NO_PLL_INIT
#if defined(NO_PLL_INIT)
MCG_C4 |= (MCG_C4_DRST_DRS(1) | MCG_C4_DMX32_MASK);
mcg_clk_hz = 48000000;
SIM_SOPT2 &= ~SIM_SOPT_PLLFLLSEL_MASK;
uart0_clk_khz = (mcg_clk_hz) / 1000;
#else
....
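For what it's worth, a hedged note on why that works (based on my reading of the KL46 MCG chapter): with DMX32 set and DRST_DRS = 1, the FLL multiplies the 32.768 kHz slow internal reference by 1464:
// Hedged arithmetic behind MCG_C4_DRST_DRS(1) | MCG_C4_DMX32_MASK:
// 32.768 kHz slow IRC * 1464 (DMX32 = 1, DRS = 1) ~= 48 MHz FLL output.
unsigned long mcg_fll_hz = 32768UL * 1464UL;    // 47 972 352 Hz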

Why won't this PIC code light up my LEDs?

The following code won't set any of the pins high on my PIC18F14K50, yet it couldn't be simpler!
#include <pic18.h>
#include <htc.h>

void main(void)
{
    // Set ALL pins to output:
    TRISA = 0;
    TRISB = 0;
    TRISC = 0;

    // Set ALL pins to high:
    LATA = 0b11111111;
    LATB = 0b11111111;
    LATC = 0b11111111;

    // Leave pins high and wait forever:
    while (1);
}
I'm using MPLAB v8.43 and the Hi-Tech ANSI C Compiler.
A logic probe shows none of the pins high except the VUSB and the MCLR.
Any ideas?
At least some of the pins may be configured as Analog Inputs.
From the datasheet for this device:
The operation of pin RA4 as analog is selected by setting the ANS3
bit in the ANSEL register which is the default setting after a
Power-on Reset.
If you do not clear the corresponding bits in the ANSEL register, the pin cannot be used as an output because it is configured as an analog input.
This applies to all the pins that can be A/D inputs, which does not cover all the pins you have.
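A hedged sketch of that fix (register names as I recall them for the PIC18F14K50; check your compiler's device header):
// Make every analog-capable pin digital so the port latches drive the pins.
ANSEL  = 0x00;   // AN0..AN7  -> digital
ANSELH = 0x00;   // AN8..AN11 -> digital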
Then again, I do not see any configuration bit setup in your code. That device, for example, has two different instruction sets, and you have to at the very least specify which instruction set you are using in the configuration bits.
You may try adding this to the top of your code, just after the includes:
// Configuration BITS setup
__CONFIG(1, FOSC_INTIO2 & XINST_OFF);
__CONFIG(2, WDTEN_OFF & PWRTEN_ON);
__CONFIG(3, MCLRE_OFF);
I suppose that you didn't configure the MCU oscillator; try to define:
; Oscillator:
config FOSC = INTIO2 ;Internal RC oscillator
;
; PLL x4 Enable bit:
config PLLCFG = OFF
and
; Define oscillator frequency
;{
        movlw   b'01100000'
        movwf   OSCCON
        movlw   b'01000000'
        movwf   OSCTUNE
;};
These directives are for the MPLAB assembler and not for Hi-Tech C, but the file registers should have the same names.
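A hedged C translation of that snippet for the Hi-Tech compiler (the values are carried over from the assembly above; whether they suit your clock setup is a separate question, so check the OSCCON/OSCTUNE descriptions in the datasheet):
OSCCON  = 0b01100000;   // select the internal oscillator block frequency (IRCF bits)
OSCTUNE = 0b01000000;   // same value as in the assembly snippet above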
