I2C bit banging programming using C

I am trying to write C code for I2C using bit banging. I started from the code on Wikipedia (http://en.wikipedia.org/wiki/I%C2%B2C), but I cannot get it to work. As far as I understand, the code in the wiki is not correct. I made many changes; the major ones, where I believe the wiki gets it wrong, are tagged with /*TBN*/. My code is below:
// Hardware-specific support functions that MUST be customized:
#define I2CSPEED 135
#define SCL P0_0
#define SDA P0_1
void I2C_delay() { volatile int v; int i; for (i=0; i < I2CSPEED/2; i++) v; }
bool read_SCL(void); // Set SCL as input and return current level of line, 0 or 1
bool read_SDA(void); // Set SDA as input and return current level of line, 0 or 1
void clear_SCL(void); // Actively drive SCL signal low
void clear_SDA(void); // Actively drive SDA signal low
void Set_SDA(void); // Actively drive SDA signal high
void Set_SCL(void); // Actively drive SCL signal High
void Set_SCL(void)
{
    // make P0_0 an output port
    SCL = 1;
}
void Set_SDA(void)
{
    // make P0_1 an output port
    SDA = 1;
}
void clear_SCL(void)
{
    // make P0_0 an output port
    SCL = 0;
}
void clear_SDA(void)
{
    // make P0_1 an output port
    SDA = 0;
}
bool read_SCL(void)
{
    // make P0_0 an input port
    return SCL;
}
bool read_SDA(void)
{
    // make P0_1 an input port
    return SDA;
}
void i2c_start_cond(void) {
    // set SDA to 1
    Set_SDA(); /*TBN*/
    Set_SCL();
    // SCL is high, set SDA from 1 to 0.
    I2C_delay();
    clear_SDA();
    I2C_delay();
    I2C_delay();
    clear_SCL(); // make SCL low for data transmission
    started = true;
}
void i2c_stop_cond(void){
    // set SDA to 0
    clear_SDA();
    I2C_delay();
    // SCL is high, set SDA from 0 to 1
    Set_SCL(); /*TBN*/
    I2C_delay();
    Set_SDA();
}
// Write a bit to I2C bus
void i2c_write_bit(bool bit) {
    if (bit) {
        Set_SDA(); /*TBN*/
    } else {
        clear_SDA();
    }
    I2C_delay();
    clear_SCL();
}
// Read a bit from I2C bus
bool i2c_read_bit(void) {
    bool bit;
    // Let the slave drive data
    read_SDA();
    I2C_delay();
    // SCL is high, now data is valid
    bit = read_SDA();
    I2C_delay();
    clear_SCL();
    return bit;
}
// Write a byte to I2C bus. Return 0 if ack by the slave.
bool i2c_write_byte(bool send_start, bool send_stop, unsigned char byte) {
    unsigned bit;
    bool nack;
    if (send_start) {
        i2c_start_cond();
    }
    for (bit = 0; bit < 8; bit++) {
        i2c_write_bit((byte & 0x80) != 0);
        byte <<= 1;
    }
    nack = i2c_read_bit();
    if (send_stop) {
        i2c_stop_cond();
    }
    return nack;
}
// Read a byte from I2C bus
unsigned char i2c_read_byte(bool nack, bool send_stop) {
    unsigned char byte = 0;
    unsigned bit;
    for (bit = 0; bit < 8; bit++) {
        byte = (byte << 1) | i2c_read_bit();
    }
    i2c_write_bit(nack);
    if (send_stop) {
        i2c_stop_cond();
    }
    return byte;
}
This code is for a single master on the bus. I would appreciate it if experts could review my code and point out my mistakes.

There's a lot of flat-out wrong information in the comments.
First of all, your read_bit() function never toggles the clock. That's probably your problem, along with #user3629249's comment that the master sends an ACK bit after every 8 bits from the slave. You'll have to address this in your read_byte() function.
Second: I2C does not care about clock jitter; it defines only that the data must be stable when the clock falls. There will be some nanoseconds on either side of the clock edge where the data must not change, but that is not the issue here. I2C slaves don't "lock on" to the clock. In fact, the I2C specification doesn't define a lower limit to high or low SCL times. You could conceivably wait days or years between clock ticks. SMBus does define a timeout, but that's not your problem here either.
Lastly: oversampling doesn't really apply to I2C. You can read the bit a few times to make sure it hasn't changed but a properly functioning I2C slave will not be changing the data until sometime after the rising edge of the SDA signal, which is many thousands of nanoseconds away from the critical falling edge. I2C is not asynchronous like your usual serial port/UART.
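To illustrate the first point, here is a sketch of a read-bit routine that does toggle the clock, reusing the helper names from the question (clock stretching is ignored, as it is in the rest of the question's code):
// Read a bit from the I2C bus: release SDA, raise SCL, sample, lower SCL.
bool i2c_read_bit(void) {
    bool bit;
    Set_SDA();          // release SDA so the slave can drive it
    I2C_delay();
    Set_SCL();          // raise the clock
    I2C_delay();
    bit = read_SDA();   // data is valid while SCL is high
    I2C_delay();
    clear_SCL();        // complete the clock pulse
    return bit;
}
The same observation applies to i2c_write_bit(): it needs a Set_SCL() and a delay before the final clear_SCL(), otherwise SCL never goes high there either.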

First, bit-banging I2C is considerably more complicated than bit-banging SPI.
So I would first take an SPI EEPROM like the 25AA1024 and practice my bit-banging skills on that.
Then try a simple Adafruit I2C device like this one:
https://www.adafruit.com/products/1855
It only requires I2C output, and you can easily port the Arduino code to C or C#.
I know that the USB device Nusbio implements I2C and SPI bit banging written in C#.
See their source code at https://github.com/madeintheusb/Nusbio.Samples

You need to implement P0_0 and P0_1 somehow. The calls you marked ultimately involve them, and I do not see them implemented anywhere in your code.
If you are developing this on real hardware, then in order to affect the corresponding pins you need to implement the P0_0 and P0_1 macros with code that accesses the particular control registers controlling the logic levels on these two lines.
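For example, if this is an 8051-family part built with Keil C51, P0_0 and P0_1 would normally be bit definitions on the port register (a sketch, assuming quasi-bidirectional pins with external I2C pull-ups; on an MCU with separate data-direction registers you would instead implement the Set_/clear_/read_ helpers by switching the pin between output-low and input):
#include <reg51.h>   /* declares the P0 special function register */

sbit P0_0 = P0^0;    /* SCL line */
sbit P0_1 = P0^1;    /* SDA line */
With this mapping, writing SCL = 1; releases the quasi-bidirectional pin so the pull-up can raise the line, and reading SDA returns the current pin state, which is what the helper functions in the question rely on.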

I stripped down the Wikipedia code for my special-purpose application, but I only got it working after I fixed the stop condition. It never raises SCL, so I added the following to that function:
// Stop bit setup time, minimum 4us
Set_SCL(); // added this line
I2C_delay();
I am verifying the correctness now, and if so I'll update Wikipedia myself.
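Put together, a sketch of the corrected stop condition (using the same helper names as the question; the clock-stretching check from the Wikipedia version is still omitted) would be:
void i2c_stop_cond(void) {
    // set SDA to 0
    clear_SDA();
    I2C_delay();
    // Stop bit setup time, minimum 4us: raise SCL first...
    Set_SCL();
    I2C_delay();
    // ...then let SDA go from 0 to 1 while SCL is high
    Set_SDA();
    I2C_delay();
}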

Related

SPI communication failed, nothing works

I have been trying for many days to accomplish communication via SPI (STM32 to nRF24) and it does not work. Despite having read a lot of resources, I can't get it right and I don't know why this is happening; I really need help. I have a NUCLEO-L053R8 and an nRF24L01 and I want to achieve communication between them. When I try to debug the SPI, for example to write something to DR and then read it back, nothing works! It seems that nothing is written to DR.
My configuration for this communication is as follows:
CPOL=CPHA=0
FULL DUPLEX MODE
MSB FIRST
MASTER MODE ON
8bit DATA FRAME
SSM=SSI=1
Here are my functions for reading and writing.
void spi1_transmit(uint8_t *data, uint32_t size){
    uint32_t i = 0;
    uint8_t temp;
    while (i < size){
        while (!READ_BIT(SPI1->SR, SPI_SR_TXE));
        SPI1->DR = data[i];
        i++;
    }
    while (!READ_BIT(SPI1->SR, SPI_SR_TXE));
    while (READ_BIT(SPI1->SR, SPI_SR_BSY));
    temp = SPI1->DR;
    temp = SPI1->SR;
}
void spi1_receive(uint8_t *data, uint32_t size){
    while (size){
        SPI1->DR = 0;
        while (!READ_BIT(SPI1->SR, SPI_SR_RXNE));
        *data++ = (SPI1->DR);
        size--;
    }
}
void spi1_gpio_init(){
    /* Enable clock access to GPIOA */
    SET_BIT(RCC->IOPENR, IOPAEN);
    /* Set the proper pins to their alternate functions. */
    CLEAR_BIT(GPIOA->MODER, (1U<<10)); // PA5
    SET_BIT(GPIOA->MODER, (1U<<11));
    CLEAR_BIT(GPIOA->OTYPER, (1U<<5));
    SET_BIT(GPIOA->OSPEEDR, (1U<<10));
    SET_BIT(GPIOA->OSPEEDR, (1U<<11));
    // Setting AF type AF0.
    CLEAR_BIT(GPIOA->AFR[0], (1U<<20));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<21));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<22));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<23));
    CLEAR_BIT(GPIOA->MODER, (1U<<12)); // PA6
    SET_BIT(GPIOA->MODER, (1U<<13));
    CLEAR_BIT(GPIOA->OTYPER, (1U<<6));
    SET_BIT(GPIOA->OSPEEDR, (1U<<12));
    SET_BIT(GPIOA->OSPEEDR, (1U<<13));
    // Setting AF type AF0.
    CLEAR_BIT(GPIOA->AFR[0], (1U<<24));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<25));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<26));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<27));
    CLEAR_BIT(GPIOA->MODER, (1U<<14)); // PA7
    SET_BIT(GPIOA->MODER, (1U<<15));
    CLEAR_BIT(GPIOA->OTYPER, (1U<<7));
    SET_BIT(GPIOA->OSPEEDR, (1U<<14));
    SET_BIT(GPIOA->OSPEEDR, (1U<<15));
    // Setting AF type AF0.
    CLEAR_BIT(GPIOA->AFR[0], (1U<<28));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<29));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<30));
    CLEAR_BIT(GPIOA->AFR[0], (1U<<31));
    SET_BIT(GPIOA->MODER, (1U<<18)); // PA9
    CLEAR_BIT(GPIOA->MODER, (1U<<19));
    CLEAR_BIT(GPIOA->OTYPER, (1U<<9));
    SET_BIT(GPIOA->OSPEEDR, (1U<<18));
    SET_BIT(GPIOA->OSPEEDR, (1U<<19));
    SET_BIT(GPIOA->MODER, (1U<<16)); // PA8
    CLEAR_BIT(GPIOA->MODER, (1U<<17));
    CLEAR_BIT(GPIOA->OTYPER, (1U<<8));
    SET_BIT(GPIOA->OSPEEDR, (1U<<16));
    SET_BIT(GPIOA->OSPEEDR, (1U<<17));
}
void spi1_config(){
    /*Enable clock access to SPI1.*/
    SET_BIT(RCC->APB2ENR, SPI1_EN);
    /*Setting up BAUDRATE. FPCLK/32 (100) */
    SET_BIT(SPI1->CR1, (1U<<5));   // 1.
    CLEAR_BIT(SPI1->CR1, (1U<<4)); // 0.
    CLEAR_BIT(SPI1->CR1, (1U<<3)); // 0.
    /*Setting up CPOL=0 and CPHA=0.*/
    CLEAR_BIT(SPI1->CR1, (1U<<0)); // CPHA=0.
    CLEAR_BIT(SPI1->CR1, (1U<<1)); // CPOL=0.
    /*Enable full-duplex mode.*/
    CLEAR_BIT(SPI1->CR1, (1U<<10));
    /*Set MSB first.*/
    CLEAR_BIT(SPI1->CR1, (1U<<7));
    /*Enable Master mode.*/
    SET_BIT(SPI1->CR1, (1U<<2));
    /*Setting up 8-bit data frame.*/
    CLEAR_BIT(SPI1->CR1, (1U<<11));
    /*Enable software slave management. SSM=1 and SSI=1.*/
    SET_BIT(SPI1->CR1, (1U<<9)); // SSM=1.
    SET_BIT(SPI1->CR1, (1U<<8)); // SSI=1.
    cs_disable();
    ce_disable();
    /*Enable SPI peripheral.*/
    SET_BIT(SPI1->CR1, (1U<<6));
}
and this is how I am testing it:
int main(void){
    uint8_t tx_b[3], rx_b[3];
    init_rcc();
    spi1_gpio_init();
    spi1_config();
    tx_b[0] = 0b00111111;
    tx_b[1] = 0b00001000;
    tx_b[2] = 0b00101000;
    spi1_transmit(tx_b, 3);
    while(1){
        spi1_receive(rx_b, 3);
    }
}
You need to enable the SPI clock.
You need to enable the SPI peripheral.
Many STM32 SPIs have a FIFO, and you need to force the compiler to generate store instructions of the correct size.
For example
*(volatile uint8_t *)&SPI1->DR = data[i];
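Applied to the transmit routine from the question, a sketch of that byte-wide access would look like this (everything else unchanged; whether your particular SPI has a FIFO depends on the STM32 family, so treat this as an assumption to verify against the reference manual):
void spi1_transmit(uint8_t *data, uint32_t size){
    uint32_t i = 0;
    uint8_t temp;
    while (i < size){
        while (!READ_BIT(SPI1->SR, SPI_SR_TXE));
        *(volatile uint8_t *)&SPI1->DR = data[i];  // byte-wide store into the data register
        i++;
    }
    while (!READ_BIT(SPI1->SR, SPI_SR_TXE));
    while (READ_BIT(SPI1->SR, SPI_SR_BSY));
    temp = *(volatile uint8_t *)&SPI1->DR;  // read DR to clear RXNE
    temp = SPI1->SR;                        // then SR to clear a possible overrun
    (void)temp;
}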

Using Interrupt to Transmit via USART on AVR MCU

I believe I understand how to use interrupts to receive serial data on the UART of an ATmega328P, but I don't understand the mechanics of how to transmit data.
Here is a basic program that I want to use to transmit the character string "hello" using interrupts to drive transmission. I understand that the character 'o' will likely be transmitted twice, and I am ok with that.
#include <avr/io.h>
#include <avr/interrupt.h>
#define F_CPU 16000000UL
#define BAUD 19200
#define DOUBLE_SPEED 1
void initUART(unsigned int baud, unsigned int speed);
volatile uint8_t charIndex = 0;
volatile unsigned char command[5] = "hello";
int main(void)
{
    //initialize UART
    initUART(BAUD, DOUBLE_SPEED);
    sei();
    //What do I put here to initiate transmission of character string command?
    //Is this even correct?
    UDR0 = command[0];
    while(1)
    {
    }
}
ISR(USART_TX_vect)
{
    // Transmit complete interrupt triggered
    if (charIndex >= 4)
    {
        //Reached the end of command, end transmission
        return;
    }
    //transmit the next char or byte
    UDR0 = command[charIndex];
    //Step to the next place of the command
    charIndex++;
}
void initUART(unsigned int baud, unsigned int speed)
{
    unsigned int ubrr;
    if(speed)
    {
        //double rate mode
        ubrr = F_CPU/8/baud-1;
        //set double speed mode
        UCSR0A = (speed << U2X0);
    }
    else
    {
        //normal rate mode
        ubrr = F_CPU/16/baud-1;
    }
    //set the baud rate
    UBRR0H = (unsigned char)(ubrr >> 8);
    UBRR0L = (unsigned char)(ubrr);
    //enable Tx and Rx pins on MCU
    UCSR0B = (1 << RXEN0) | (1 << TXEN0);
    //enable transmit interrupt
    UCSR0B = (1 << TXCIE0);
    //set control bits, 8 bit char, 0 stop, no parity
    UCSR0C = (1 << UCSZ00) | (1 << UCSZ01);
}
My understanding is that if I wrote the first character to UDR0 (as I did in main()), this would then trigger a Transmit Complete Interrupt, and then the next byte would be transmitted via the ISR. This does not seem to work.
The code shown here compiles using gcc. Can someone offer an explanation?
The key thing to understand is that the USART has 2 separate hardware registers that are used in the data transmission: UDRn and the Transmit Shift Register, which I'll just call TSR from now on.
When you write data to UDRn, assuming no tx is in progress, it'll get moved to the TSR immediately and the UDRE irq fires to tell you that the UDRn register is "empty". Note that at this point the transmission has just started, but the point is that you can already write the next byte to UDRn.
When the byte has been fully transmitted, the next byte is moved from UDRn to TSR and UDRE fires again. So, you can write the next byte to UDRn and so on.
You must only write data to the UDRn when it is "empty", otherwise you'll overwrite the byte it's currently storing and pending transmission.
In practice, you usually don't care about the TXC irq; you want to work with UDRE to feed more data to the USART module.
The TXC irq, however, is useful if you need to perform some operation when the transmission has actually completed. A common example when dealing with RS485 is to disable the transmitter once you're done sending data and possibly re-enable the receiver that you could have disabled to avoid echo.
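As a sketch of that UDRE-driven approach (assuming the UART has already been initialised as in initUART() and sei() has been called; uart_start_tx and tx_msg are my own names, not part of the question):
#include <avr/io.h>
#include <avr/interrupt.h>

static volatile uint8_t tx_index = 0;
static const char tx_msg[] = "hello";      // nul-terminated, unlike the 5-byte array above

// Kick off a transmission: UDR0 starts out empty, so enabling the
// data-register-empty interrupt makes it fire straight away.
static void uart_start_tx(void)
{
    tx_index = 0;
    UCSR0B |= (1 << UDRIE0);
}

ISR(USART_UDRE_vect)
{
    if (tx_msg[tx_index] != '\0') {
        UDR0 = tx_msg[tx_index++];         // feed the next byte to the USART
    } else {
        UCSR0B &= ~(1 << UDRIE0);          // nothing left to send: stop the interrupt
    }
}
main() would then call uart_start_tx() instead of writing UDR0 directly.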
Regarding your code
Your main issue is that you're setting UCSR0B twice in initUART(), and the second write clears the bits you just set, so it disables the transmitter. You want to set all the bits in one go, or use |= on the second statement.
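For example, the two writes could be combined into a single assignment (or the second one changed to |=):
//enable Tx, Rx and the transmit-complete interrupt in one write
UCSR0B = (1 << RXEN0) | (1 << TXEN0) | (1 << TXCIE0);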

PIC16LF1824 SPI Slave Interface

I am trying to receive multiple bytes over SPI. The aim is that when the master starts the SPI transfer, the slave MCU is interrupted, reads the data via SPI and stores it in an array, which will then be used by my application for other operations such as determining the device ID and the contents of the packet.
void interrupt __high_priority my_isr_high(void) {
    if (PIR1bits.SSP1IF) {                // Interrupt from SPI?
        rx[buffer_pointer] = SSP1BUF;     // Get data from MSSP and store in RX buffer
        buffer_pointer++;                 // Next data
        if (buffer_pointer < FRAME_SIZE)  // Ended?
            SSP1BUF = tx[buffer_pointer]; // Send next byte to SPI
        else
            buffer_pointer = FRAME_SIZE;
        PIR1bits.SSP1IF = 0;              // Clear interrupt flag
    }
}
However, I am not receiving the 3 bytes correctly. I am sending the following from the master:
dataPacket[0] = 0x43; // Decimal 67
dataPacket[1] = 0x42; //66
dataPacket[2] = 0x41; //65
Whereas this is what I receive in the ISR:
rx[0]: 67
rx[1]: 65
rx[2]: 67
Am I missing something or handling the SPI incorrectly?
Solving this will really resolve the issue I am stuck with, and maybe it will also help others who want to receive multiple bytes.
I am sharing my code here so that it helps to find the solution quickly. Also included is a .zip file for compiling. Check the code here
So far the above code did not work properly for me. Therefore, after a bit of digging online and in other forums, I found the following way to read multiple bytes:
uint8_t SPI_ExchangeHandler(uint8_t byte){
    static uint8_t i = 0;
    for(i = 0; i < 3; i++){
        SSP1BUF = 0x00;
        while(!SSP1STATbits.BF);
        rx_buff[i] = SSP1BUF;
    }
    State = SEND;
    return byte;
}
Although the above code gives me what I expected (i.e., correct data packets in the right order), it misses two SPI interrupts every time before capturing the correct data. Hence, two sets of data are always lost and then the third one is received correctly.
Is something configured incorrectly or missing?
Finally, I managed to receive all 3 bytes correctly. Sharing the code below.
This is my interrupt service routine, which fires when the SPI master has data to send.
void interrupt INTERRUPT_InterruptManager (void){
    if(PIE1bits.SSP1IE == 1 && PIR1bits.SSP1IF == 1)
    {
        SPI_ISR();
    }
}
The SPI_ISR code was autogenerated by the MCC GUI.
void SPI_ISR(void)
{
    SSP1BUF = SPI_xchgHandler(SSP1BUF);
}
void SPI_setExchangeHandler(uint8_t (* InterruptHandler)(uint8_t))
{
    SPI_xchgHandler = InterruptHandler;
}
Then I handle the SPI via a custom function using SPI_setExchangeHandler() as follows:
#define FRAME_SIZE 10                        // Frame fixed size
volatile static uint8_t rx_buff[FRAME_SIZE]; // RX buffer
uint8_t SPI_ExchangeHandler(uint8_t byte)
{
    static uint8_t i = 0;
    rx_buff[i] = SSP1BUF;
    i++;
    if(i <= 2){
        rx_buff[i] = SSP1BUF;
        SSP1BUF = 0x00;
        while(!SSP1STATbits.BF){};
        i++;
    }
    else{
        i = 2;
    }
    PIR1bits.SSP1IF = 0; // Clear the SPI interrupt flag
    State = SEND;
    return byte;
}
And I print out the values as follows for debugging:
printf("CMD:%d \n", rx_buff[0]);
printf("BUF1: %d \n", rx_buff[1]);
printf("BUF2: %d \n\n", rx_buff[2]);
However, I am pretty sure this is not the best or most optimized way to handle multiple bytes from SPI, so if there is an alternative, please share...
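One alternative, sketched under the assumption that the master clocks out exactly FRAME_SIZE bytes per frame and that the MCC exchange-handler hook is used as above (frame_ready is my own flag, not from the original code), is to store exactly one byte per interrupt and never busy-wait on BF inside the handler:
#define FRAME_SIZE 10
static volatile uint8_t rx_buff[FRAME_SIZE];
static volatile uint8_t rx_count = 0;      // bytes received in the current frame
static volatile uint8_t frame_ready = 0;   // set when a complete frame has arrived

// Called once per SPI interrupt with the byte just received;
// the returned value is loaded into SSP1BUF for the next exchange.
uint8_t SPI_ExchangeHandler(uint8_t byte)
{
    rx_buff[rx_count++] = byte;            // one byte per interrupt, no polling
    if (rx_count >= FRAME_SIZE) {
        rx_count = 0;
        frame_ready = 1;                   // let the main loop process the frame
    }
    return 0x00;                           // dummy byte to clock back to the master
}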

STM8 interrupt serial receive

I am new to STM8 and am trying to use an STM8S103F3 with IAR Embedded Workbench.
Using C, I like to use the registers directly.
I need serial at 14400 baud, 8N2. Getting the UART transmit working is easy, as there are numerous good tutorials and examples on the net.
The UART receive, however, has to be interrupt driven; nothing else will do.
That is the problem.
According to iostm8s103f3.h (IAR) there are 5 interrupts on the 0x14 vector:
UART1_R_IDLE, UART1_R_LBDF, UART1_R_OR, UART1_R_PE, UART1_R_RXNE
According to Silverlight Developer: Registers on the STM8,
Vector 19 (0x13) = UART_RX
According to STMicroelectronics' STM8S.h:
#define UART1_BaseAddress 0x5230
#define UART1_SR_RXNE ((u8)0x20) /*!< Read Data Register Not Empty mask */
#if defined(STM8S208) ||defined(STM8S207) ||defined(STM8S103) ||defined(STM8S903)
#define UART1 ((UART1_TypeDef *) UART1_BaseAddress)
#endif /* (STM8S208) ||(STM8S207) || (STM8S103) || (STM8S903) */
According to STM8S Reference manual RM0016
The RXNE flag (Rx buffer not empty) is set on the last sampling clock edge,
when the data is transferred from the shift register to the Rx buffer.
It indicates that a data is ready to be read from the SPI_DR register.
Rx buffer not empty (RXNE)
When set, this flag indicates that there is a valid received data in the Rx buffer.
This flag is reset when SPI_DR is read.
Then I wrote:
#pragma vector = UART1_R_RXNE_vector // as iostm8s103f3.h is used, that means 0x14
__interrupt void UART1_IRQHandler(void)
{
    unsigned char recd;
    recd = UART1_SR;
    if(1 == UART1_SR_RXNE) recd = UART1_DR;
    etc.
No good: I continually get interrupts, UART1_SR_RXNE is set, but UART1_DR is empty and no UART receive has happened. I have disabled all the other interrupts I can see that could vector to this, and still no good.
The SPI also sets this flag; presumably the UART and SPI cannot be used together.
I sorely need to get this serial receive interrupt going. Please help.
Thank you
The problem was one bit incorrectly set in the UART1 setup.
The complete setup for the UART1 in the STM8S103F3 is now (IAR):
void InitialiseUART()
{
    unsigned char tmp = UART1_SR;
    tmp = UART1_DR;
    // Reset the UART registers to the reset values.
    UART1_CR1 = 0;
    UART1_CR2 = 0;
    UART1_CR4 = 0;
    UART1_CR3 = 0;
    UART1_CR5 = 0;
    UART1_GTR = 0;
    UART1_PSCR = 0;
    // Set up the port to 14400,n,8,2.
    UART1_CR1_M = 0;       // 8 data bits.
    UART1_CR1_PCEN = 0;    // Disable parity.
    UART1_CR3 = 0x20;      // 2 stop bits
    UART1_BRR2 = 0x07;     // Set the baud rate registers to 14400
    UART1_BRR1 = 0x45;     // based upon a 16 MHz system clock.
    // Disable the transmitter and receiver.
    UART1_CR2_TEN = 0;     // Disable transmit.
    UART1_CR2_REN = 0;     // Disable receive.
    // Set the clock polarity, clock phase and last bit clock pulse.
    UART1_CR3_CPOL = 0;
    UART1_CR3_CPHA = 0;
    UART1_CR3_LBCL = 0;
    // Set the receive interrupt. RM0016 p358,362
    UART1_CR2_TIEN = 0;    // Transmitter interrupt enable
    UART1_CR2_TCIEN = 0;   // Transmission complete interrupt enable
    UART1_CR2_RIEN = 1;    // Receiver interrupt enable
    UART1_CR2_ILIEN = 0;   // IDLE line interrupt enable
    // Turn on the UART transmit, receive and the UART clock.
    UART1_CR2_TEN = 1;
    UART1_CR2_REN = 1;
    UART1_CR1_PIEN = 0;
    UART1_CR4_LBDIEN = 0;
}
//-----------------------------
#pragma vector = UART1_R_RXNE_vector
__interrupt void UART1_IRQHandler(void)
{
    byte recd;
    recd = UART1_DR;
    // send the byte to the circular buffer;
}
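A minimal sketch of the circular buffer mentioned in that comment (size and names are my own placeholders):
#define RX_BUF_SIZE 64                      /* power of two so the index mask works */
static volatile unsigned char rx_buf[RX_BUF_SIZE];
static volatile unsigned char rx_head = 0;  /* written by the ISR    */
static volatile unsigned char rx_tail = 0;  /* read by the main loop */

/* Called from the ISR with the received byte. */
static void rx_put(unsigned char c)
{
    unsigned char next = (unsigned char)((rx_head + 1) & (RX_BUF_SIZE - 1));
    if (next != rx_tail) {                  /* drop the byte if the buffer is full */
        rx_buf[rx_head] = c;
        rx_head = next;
    }
}

/* Called from the main loop; returns 1 and stores a byte if one is available. */
static int rx_get(unsigned char *c)
{
    if (rx_head == rx_tail)
        return 0;
    *c = rx_buf[rx_tail];
    rx_tail = (unsigned char)((rx_tail + 1) & (RX_BUF_SIZE - 1));
    return 1;
}
The ISR above would then just call rx_put(recd) and the main loop would drain the buffer with rx_get().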
You forgot to enable the global interrupt flag:
asm("rim"); // Enable global interrupts
This happens with non-isolated connections whenever you connect your board's ground to another source's ground (a USB-to-TTL converter connected to a PC, etc.). In this case the microcontroller picks up noise, e.g. through the high-value Y capacitor of the SMPS.
Simply connect your RX and TX lines via 1K series resistors and put 1nF capacitors (which can be decreased for higher speeds) from these lines to ground (on the microcontroller side) to suppress the noise.

Problems with PIC A/D conversion

I am trying to read an analog signal for a sort of mouse with a PIC18F14K50 controller. Here is the simple circuit: http://dl.dropbox.com/u/14663091/schematiconew.pdf . I have to read the analog signal from port AN9. The main function reads from the port and blinks an LED 30 times if the threshold is reached:
void main(void) {
    InitializeSystem();
#if defined(USB_INTERRUPT)
    USBDeviceAttach();
#endif
    while(1) {
        if((USBDeviceState < CONFIGURED_STATE) || (USBSuspendControl == 1)) continue;
        if(!HIDTxHandleBusy(lastTransmission))
        {
            int readed = myReadADC2(); // Here I tried both myReadADC2() and myReadADC()
            if(readed > 40) { // If the reading exceeds the threshold of 40, blink the LED 30 times
                int i;
                for(i = 0; i < 30; i++) {
                    Delay1KTCYx(0);
                    mLED_1_On();
                    Delay1KTCYx(0);
                    mLED_1_Off();
                }
            }
            lastTransmission = HIDTxPacket(HID_EP, (BYTE*)hid_report_in, 0x03);
        }
    } //end while
} //end main
I used two methods to read from the AN9 port: myReadADC(), which uses the OpenADC() API:
int myReadADC(void) {
    #define ADC_REF_VDD_VDD_X 0b11110011
    OpenADC(ADC_FOSC_RC & ADC_RIGHT_JUST & ADC_12_TAD, ADC_CH9 & ADC_INT_OFF, ADC_REF_VDD_VDD_X & ADC_REF_VDD_VSS, 0b00000010); // channel 9
    SetChanADC(ADC_CH9);
    ConvertADC();       // Start conversion
    while(BusyADC());   // Wait for completion
    return ReadADC();   // Read result
}
and myReadADC2(), which implements a manual read of the port:
int myReadADC2() {
    int iRet;
    OSCCON = 0x70;          // Select 16 MHz internal clock
    ANSEL = 0b00000010;     // Set PORT AN9 to analog input
    ANSELH = 0;             // Set other PORTS as Digital I/O
    /* Init ADC */
    ADCON0 = 0b00100101;    // ADC port channel 9 (AN9), Enable ADC
    ADCON1 = 0b00000000;    // Use Internal Voltage Reference (Vdd and Vss)
    ADCON2 = 0b10101011;    // Right justify result, 12 TAD, Select the FRC for 16 MHz
    iRet = 100;
    ADCON0bits.GO = 1;
    while (ADCON0bits.GO);  // Wait conversion done
    iRet = ADRESL;          // Get the 8 bit LSB result
    iRet += (ADRESH << 8);  // Get the 2 bit MSB result
    return iRet;
}
Neither case works. I touch port AN9 (sending an analog signal), but with a high threshold (~50) the LED doesn't blink, while with a low threshold (~0) it blinks immediately when I power the PIC. Maybe I'm using the wrong port? Am I actually selecting AN9 as the input? Or maybe the threshold is wrong? How can I find the right value? Thank you.
Here are the MPLAB C18 APIs: http://dl.dropbox.com/u/14663091/API%20microchip%20C18.pdf
Regarding function myReadADC2(): you need to swap your ANSEL and ANSELH settings, as RC7/AN9 is configured by bit 1 of ANSELH. Also, call me paranoid, but for the line
iRet += (ADRESH << 8);
I always like to either save ADRESH to a temporary variable first or cast the value explicitly before shifting it up:
iRet += (((UINT) ADRESH) << 8);
That way I know for sure the bits won't get lost when shifting up, which has bitten me before.
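Going back to the first point, the two configuration lines in myReadADC2() would therefore become (a sketch of just that change; the rest of the function stays the same):
ANSEL  = 0;             // AN0..AN7 pins digital
ANSELH = 0b00000010;    // bit 1 = ANS9, so RC7/AN9 is the analog input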
Regarding function myReadADC():
OpenADC() only takes two parameters. I presume that the bitfield in the third parameter field is for the analog enable (ADRESH/ADRES). I'm assuming that's handled by SetChanADC(), but you may have to set ADRESH/ADRES manually. It may help to set a breakpoint in the debugger and stop after configuration is complete to make sure your registers are set appropriately.
