I am new to embedded programming and am trying to get my first I2C project working. I am using the PIC32MX795F512L and am pretty much following the Microchip datasheet for I2C on PIC32. The problem I'm having is that the S bit in the I2C1STAT register is never cleared. It is actually unclear to me whether I have to clear it myself or if it is done automatically at the conclusion of the Start event, but right now I'm trying to clear it manually. However, nothing that I do seems to have an effect. If more information is needed to make it easier to understand what is happening, let me know. I am using a PICkit 3, so I can get debugging information as well. I know that the master interrupt occurs and the S bit gets set; I exit the interrupt code and then hang on the while statement checking I2C1STATbits.S.
Edit: I'm editing this post to have my new code instead of the old code. I am now using a 20 MHz peripheral clock, just one of the many things I tried today that did not work. delay() is just a 256 ms delay. Super long, I know, but it was quick to do.
main()
{
    //Setup I2C1CON
    I2C1CONbits.SIDL = 0;       //Continue to run while in Idle
    I2C1CONbits.SCLREL = 1;     //Release the clock (unsure of this)
    I2C1CONbits.A10M = 0;       //Using a 7-bit slave address
    I2C1CONbits.DISSLW = 1;     //Slew rate control disabled because running at 100 kHz
    I2C1ADD = 0x1E;             //Slave address without read or write bit
    I2C1BRG = 0x060;            //Set the BRG clock rate - based on page 24-19
    I2C1CONbits.ON = 1;         //Turn on the I2C module
    delay();
    I2C1CONbits.SEN = 1;        //Initiate a Start event
    while(I2C1CONbits.SEN == 1);        //Wait until the Start event is done
    I2C1TRN = 0x3C;             //Load the address into the transmit register
    while(I2C1STATbits.TRSTAT == 1);
    while(I2C1STATbits.ACKSTAT == 0);   //Wait for an ACK from the device
    I2C1TRN = 0x00;
    while(I2C1STATbits.TRSTAT == 1);
    while(I2C1STATbits.ACKSTAT == 0);
    I2C1TRN = 0x70;
    while(I2C1STATbits.TRSTAT == 1);
    while(I2C1STATbits.ACKSTAT == 0);
    while(1);
}
Thanks for any help.
I'm also just beginning on the PIC32MZ family, setting up the I2C to talk to various memory chips.
I used your code and modified it so that it would work properly. Since I am using the PIC32MZ family, I believe the I2C registers should be much the same.
I2C configuration:
I2C1CONbits.SIDL = 0; // Stop in Idle Mode bit -> Continue module operation when the device enters Idle mode
I2C1CONbits.A10M = 0; // 10-bit Slave Address Flag bit -> I2CxADD register is a 7-bit slave address
I2C1CONbits.DISSLW = 1; // Slew Rate Control Disable bit -> Slew rate control disabled for Standard Speed mode (100 kHz)
I2C1CONbits.ACKDT = 0; // Acknowledge Data bit -> ~ACK is sent
I2C1BRG = 0x0F3; // Baud Rate Generator set to provide 100 kHz for SCL with 50 MHz xtal.
I followed the transmission steps provided in the I2C datasheet, so it should be easy to follow along with the PDF using my comments in the code.
I2C Data Transmission:
// 1. Turn on the I2C module by setting the ON bit (I2CxCON<15>) to ‘1’.
I2C1CONbits.ON = 1; // I2C Enable bit -> Enables the I2C module and configures the SDAx and SCLx pins as serial port pins
//------------- WRITE begins here ------------
// 2. Assert a Start condition on SDAx and SCLx.
I2C1CONbits.PEN = 0; // Stop Condition Enable Bit -> Stop Condition Idle
I2C1CONbits.SEN = 1; // Start Condition Enable bit -> Initiate Start condition on SDAx and SCLx pins; cleared by module
while(I2C1CONbits.SEN == 1); // SEN is to be cleared when I2C Start procedure has been completed
// 3. Load the Data on the bus
I2C1TRN = 0b10100000 ; // Write the slave address to the transmit register for I2C WRITE
while(I2C1STATbits.TRSTAT == 1); // MASTER Transmit still in progress
while(I2C1STATbits.ACKSTAT == 1); // Master should receive the ACK from Slave, which will clear the I2C1STAT<ACKSTAT> bit.
I2C1TRN = 0xCE; // Register Address
while(I2C1STATbits.TRSTAT == 1);
while(I2C1STATbits.ACKSTAT == 1);
I2C1TRN = 0xCF; // Register Value
while(I2C1STATbits.TRSTAT == 1);
while(I2C1STATbits.ACKSTAT == 1);
I2C1CONbits.PEN = 1; // Stop Condition Enable Bit -> Initiate Stop condition on SDAx and SCLx pins; cleared by module
//-------------- WRITE ends here -------------
This code works well; I toggled a couple of LEDs to indicate a successful write procedure.
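One small addition, hedged since the write above already works: after setting PEN you can also wait for the module to clear it before starting the next transfer, in the same style as the SEN wait:
// Wait until the Stop condition has completed; PEN is cleared by the module
while(I2C1CONbits.PEN == 1);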
The while(!(I2C1STATbits.ACKSTAT == 0)); is the right way, I guess. I2C1ADD is used to set the slave address of the current PIC microcontroller if you are using its I2C module as a slave. If any master issues a Start with that slave address, the PIC will respond to it as a slave.
Read this PDF; it explains early on that the PIC I2C module is configured as both master and slave, and that an incoming master request is checked by the same PIC against the slave address in the I2C1ADD register.
http://ww1.microchip.com/downloads/en/DeviceDoc/61116F.pdf
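Put differently: ACKSTAT reads 0 when the slave ACKed and 1 on a NACK, so while(I2C1STATbits.ACKSTAT == 0) in the question waits on the wrong level. A minimal sketch with the polarity the datasheet describes (same registers as the question, untested):
I2C1TRN = 0x3C;                         // slave address + write bit
while (I2C1STATbits.TRSTAT == 1);       // wait for the 8 data bits + ACK clock to finish
if (I2C1STATbits.ACKSTAT == 0)
{
    // slave ACKed, load the next byte into I2C1TRN
}
else
{
    // slave NACKed, abort and assert a Stop (PEN = 1)
}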
I have a Tiva C microcontroller (the TM4C123GXL) and I have been trying for a while now to use the I2C module on the board with a digital accelerometer, with no result. I have been trying to set the MDR register with a certain value to send, but it stays at 0.
Here is the code I am using, from initialization up to the point where I set the MDR register. I am using step-by-step debugging, and I initially run the code up to the assignment I2C3_MDR_R = 0x2D;.
void PortDInit(void)
{
    volatile unsigned long delay=0;
    SYSCTL_RCGCI2C_R |= 0x8;            //1-set clock of I2C of module 3
    delay = SYSCTL_RCGC2_R;             //2-delay to allow clock to stabilize
    SYSCTL_RCGC2_R |= 0x00000008;       //3-port D clock
    delay = SYSCTL_RCGC2_R;             //4-delay to allow clock to stabilize
    GPIO_PORTD_AFSEL_R |= 0x03;         //5-alternate function set for I2C mode
    GPIO_PORTD_DEN_R |= 0x03;           //6-enable digital functionality for PA6 and PA7
    GPIO_PORTD_ODR_R |= 0x02;           //7-enable open drain mode for I2CSDA register of port A
    GPIO_PORTD_PCTL_R = 0x00000033;     //8-set PCTL to I2C mode
    I2C3_MCR_R = 0x00000010;            //9-initialize the i2c master
    I2C3_MTPR_R = 0x00000007;           //10-number of system clock cycles in 1 scl period
    I2C3_MSA_R = 0x3A;                  //set slave address and read write bit
    I2C3_MDR_R = 0x2D;                  //data to be sent BREAK POINT HERE using single step here yields MDR with same value = 0
    I2C3_MCS_R = 0x00000003;            //follow transmit condition
    while(I2C3_MCS_R &= 0x40 == 1);     //wait bus is busy sending data
    if(I2C3_MCS_R &= 0x04 == 1)
    {
        //handle error in communication
    }
    else
    {
        //success in transmission
    }
}
What I have done to reach this code:
- Carefully understood the I2C protocol, how it works, etc.
- Checked the datasheet and followed the initialization steps mentioned there step by step, which got me to this code.
- I know I should use the TivaWare library, which would be easier, but using the registers helps me understand more of how everything works; I'm still a student.
- At first I didn't have the digital enable line, as it wasn't mentioned as needing to be activated for I2C, but it's only logical that it should be there since we are using digital values; I tried with both, and it yielded the same output (MDR = 0).
- I am using Keil 4 as my IDE and I'm viewing the values of the I2C module 3 registers to know whether data is placed in MDR or not.
Hope anyone can help, thanks.
This is a long shot, but here goes:
in your comments, step 6 says
//6-enable digital functionality for PA6 and PA7
but it appears you are working on GPIO_PORTD...
maybe it's a comment typo (you meant PD6 and PD7)
but just double check you are looking at the right pins...
Good luck!
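For what it's worth, on the TM4C123GH6PM the I2C3 signals are on PD0 (I2C3SCL) and PD1 (I2C3SDA), which is what the Port D writes above actually touch; a relabelled sketch of those lines (double-check against the pin mux tables in the datasheet):
GPIO_PORTD_AFSEL_R |= 0x03;         // PD0/PD1 to their alternate function
GPIO_PORTD_DEN_R   |= 0x03;         // digital enable on PD0/PD1
GPIO_PORTD_ODR_R   |= 0x02;         // open drain on PD1 (I2C3SDA) only
GPIO_PORTD_PCTL_R  |= 0x00000033;   // PCTL function 3 = I2C3 on PD0 and PD1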
I am new to STM8, and am trying to use an STM8S103F3 with IAR Embedded Workbench.
Using C, I like to use the registers directly.
I need serial at 14400 baud, 8N2, and getting the UART transmit working is easy, as there are numerous good tutorials and examples on the net.
Then the need is to have the UART receive on interrupt; nothing else will do.
That is the problem.
According to iostm8s103f3.h (IAR) there are 5 interrupts on 0x14 vector
UART1_R_IDLE, UART1_R_LBDF, UART1_R_OR, UART1_R_PE, UART1_R_RXNE
According to Silverlight Developer: Registers on the STM8,
Vector 19 (0x13) = UART_RX
According to ST Microelectronics STM8S.h
#define UART1_BaseAddress 0x5230
#define UART1_SR_RXNE ((u8)0x20) /*!< Read Data Register Not Empty mask */
#if defined(STM8S208) ||defined(STM8S207) ||defined(STM8S103) ||defined(STM8S903)
#define UART1 ((UART1_TypeDef *) UART1_BaseAddress)
#endif /* (STM8S208) ||(STM8S207) || (STM8S103) || (STM8S903) */
According to STM8S Reference manual RM0016
The RXNE flag (Rx buffer not empty) is set on the last sampling clock edge,
when the data is transferred from the shift register to the Rx buffer.
It indicates that a data is ready to be read from the SPI_DR register.
Rx buffer not empty (RXNE)
When set, this flag indicates that there is a valid received data in the Rx buffer.
This flag is reset when SPI_DR is read.
Then I wrote:
#pragma vector = UART1_R_RXNE_vector //as iostm8s103f3 is used, that means 0x14
__interrupt void UART1_IRQHandler(void)
{
    unsigned char recd;
    recd = UART1_SR;
    if(1 == UART1_SR_RXNE) recd = UART1_DR;
    etc.
No good; I continually get interrupts, UART1_SR_RXNE is set, but UART1_DR is empty, and no UART receive has happened. I have disabled all other interrupts I can see that could vector to this, and still no good.
The SPI also sets this flag, so presumably the UART and SPI cannot be used together.
I sorely need to get this serial receive interrupt going. Please help.
Thank you
The problem was one bit incorrectly set in the UART1 setup.
The complete setup for the UART1 in the STM8S103F3 is now (IAR):
void InitialiseUART()
{
    unsigned char tmp = UART1_SR;
    tmp = UART1_DR;

    // Reset the UART registers to the reset values.
    UART1_CR1 = 0;
    UART1_CR2 = 0;
    UART1_CR4 = 0;
    UART1_CR3 = 0;
    UART1_CR5 = 0;
    UART1_GTR = 0;
    UART1_PSCR = 0;

    // Set up the port to 14400,n,8,2.
    UART1_CR1_M = 0;        // 8 data bits.
    UART1_CR1_PCEN = 0;     // Disable parity.
    UART1_CR3 = 0x20;       // 2 stop bits.
    UART1_BRR2 = 0x07;      // Set the baud rate registers to 14400
    UART1_BRR1 = 0x45;      // based upon a 16 MHz system clock.

    // Disable the transmitter and receiver.
    UART1_CR2_TEN = 0;      // Disable transmit.
    UART1_CR2_REN = 0;      // Disable receive.

    // Set the clock polarity, clock phase and last bit clock pulse.
    UART1_CR3_CPOL = 0;
    UART1_CR3_CPHA = 0;
    UART1_CR3_LBCL = 0;

    // Set the receive interrupt: RM0016 p358, p362.
    UART1_CR2_TIEN = 0;     // Transmitter interrupt enable
    UART1_CR2_TCIEN = 0;    // Transmission complete interrupt enable
    UART1_CR2_RIEN = 1;     // Receiver interrupt enable
    UART1_CR2_ILIEN = 0;    // IDLE line interrupt enable

    // Turn on the UART transmit, receive and the UART clock.
    UART1_CR2_TEN = 1;
    UART1_CR2_REN = 1;
    UART1_CR1_PIEN = 0;
    UART1_CR4_LBDIEN = 0;
}
//-----------------------------
#pragma vector = UART1_R_RXNE_vector
__interrupt void UART1_IRQHandler(void)
{
    byte recd;
    recd = UART1_DR;
    //send the byte to the circular buffer;
}
You forgot to enable the global interrupt flag:
asm("rim") ; //Enable global interrupt
This happens with non-isolated connections whenever you connect your board's ground to another source's ground (a USB<->TTL converter connected to a PC, etc.). In this case the microcontroller picks up noise, for example from the high-value Y capacitor of the SMPS.
Simply connect your RX and TX lines via 1K resistors and put 1 nF capacitors (the value can be decreased for high speed) from these lines to ground (on the microcontroller side) to suppress the noise.
I'm struggling with, probably, a very simple problem.
I have a Cypress CY8 controller acting as SPI master, which should communicate with a PIC32MX in slave mode to exchange data packets.
However, I cannot even get simple transmission of multiple bytes from the master to the slave working. I've set up the Cypress to transmit a char of increasing value (0-255) with a pause (and slave select toggle) in between. The PIC should read the incoming byte and then print it over UART to my PC (the UART connection works).
But the PIC only prints the first character it received, over and over, instead of the updated values.
If I check my logic sniffer, the Cypress does send incrementing values and the PIC relays them back over the MISO line (it looks like the shift buffer isn't cleared).
What could this be?
The Cypress without the PIC attached gives proper output:
https://dl.dropboxusercontent.com/u/3264324/Schermafdruk%202015-07-28%2015.43.28.png
With the PIC attached, it relays the data over MISO:
https://dl.dropboxusercontent.com/u/3264324/Schermafdruk%202015-07-28%2015.43.45.png
And this is my (now) extremely basic code to test it:
TRISBbits.TRISB2 = 1;               // make RB2 pin input (SDI)
TRISBbits.TRISB5 = 0;               // make RB5 pin output (SDO)
TRISBbits.TRISB15 = 1;              // make RB15 input (SCK)
ANSELA = 0;                         // all ports digital
ANSELB = 0;                         // all ports digital

SYSKEY = 0x00000000;
SYSKEY = 0xAA996655;
SYSKEY = 0x556699AA;
CFGCONbits.IOLOCK = 0;              // unlock configuration
CFGCONbits.PMDLOCK = 0;
SDI2R = 0b0100;                     // SDI2 on pin RB2
SS2R = 0b0011;                      // SS2 on pin RB10
RPB5R = 0b0100;                     // SDO2 on pin RB5
// SCLK is connected to pin RB14 (SCK) by default
SYSKEY = 0x00000000;

SPI2CON = 0;                        // stops and resets SPI2
rData = SPI2BUF;                    // clears the receive buffer
SPI2BRG = 207;                      // use FPB/4 clock frequency <-- not important in slave mode, right?
SPI2STATCLR = 0x40;                 // clear the overflow
SPI2CON = 0x8180;

unsigned char t;
while(1){
    t = SpiChnReadC(2);
    //t = SPI2BUF;                  // <== I've tried this also
    sendData(t);                    // <== UART routine
}
As I do receive a character and the SPI data is constantly relayed back to the Cypress, I think something goes wrong with reading/clearing the SPI data structure in the PIC, but I can't figure out why.
As I read in the datasheet, reading from SPI2BUF gives me the received data and clears the receive flag so new data can be received, but it looks like that doesn't happen...
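To be explicit, the polled form I understand from the datasheet would be roughly this (a sketch, assuming SPI2STATbits.SPIRBF is the receive-buffer-full flag on this part):
while (!SPI2STATbits.SPIRBF);       // wait until the receive buffer holds new data
t = SPI2BUF;                        // reading SPI2BUF should clear SPIRBF for the next byte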
Can someone shine a light on this for me?
Thanks in advance
Timberleek
You should try making your SPI handler ISR-driven to keep you from constantly polling; it can also help the debugging, since you'll only get notifications when the SPI is actually transacting.
NOTE: I'm bringing this from my FreeRTOS impl, so my ISR definition is not XC32 exactly...
/* Open SPI */
SPI1CON = 0;
spi_flags = SPICON_MODE32 | SPICON_ON;
SpiChnOpen(1,spi_flags,BRG_VAL);
SpiChnGetRov(1,TRUE);
mSPI1ClearAllIntFlags();
mSPI1SetIntPriority(priority + 1);
mSPI1SetIntSubPriority(0);
mSPI1RXIntEnable(1);
void vSPI1InterruptHandler(void)
{
    unsigned long data;

    if (IFS0bits.SPI1EIF == 1)
    {
        mSPI1EClearIntFlag();
    }
    if (IFS0bits.SPI1RXIF == 1)
    {
        data = SPI1BUF;
        //sendData(data);
    }
    mSPI1RXClearIntFlag();
}
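If you are on plain XC32 rather than my FreeRTOS wrapper, the handler declaration would look roughly like this (a sketch, assuming sys/attribs.h and the SPI1 vector name; for your SPI2 setup, swap in the SPI2 vector and flag bits, and match the IPL to the priority you configure):
#include <sys/attribs.h>

void __ISR(_SPI_1_VECTOR, IPL4SOFT) SPI1InterruptHandler(void)
{
    unsigned long data;

    if (IFS0bits.SPI1EIF == 1)          // error event, e.g. receive overflow
    {
        IFS0bits.SPI1EIF = 0;
    }
    if (IFS0bits.SPI1RXIF == 1)         // receive buffer not empty
    {
        data = SPI1BUF;                 // reading SPI1BUF services the receive condition
        IFS0bits.SPI1RXIF = 0;
    }
}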
I'm trying to recreate a project of writing to an SD card (using FatFS) for a dsPIC33FJ128GP802 microcontroller.
Currently, to collect the data from the SPI, I have a do/while that loops 512 times and writes a dummy value to the SPI buffer, waits for the SPI flag, then reads the SPI value, like so:
int c = 512;
do {
    SPI1BUF = 0xFF;
    while (!_SPIRBF);
    *p++ = SPI1BUF;
} while (--c);
I'm trying to recreate this using the DMA interrupts, but it's not working like I had hoped. I'm using one DMA channel; the SPI is in 8-bit mode for the time being, so the DMA is in byte mode. It's also in null write mode, and continuous without ping-pong. My buffers are only one-member arrays and the DMA count is matched.
DMA2CONbits.CHEN = 0; //Disable DMA
DMA2CONbits.SIZE = 1; //Receive bytes (8 bits)
DMA2CONbits.DIR = 0; //Receive from SPI to DMA
DMA2CONbits.HALF = 0; //Receive full blocks
DMA2CONbits.NULLW = 1; //null write mode on
DMA2CONbits.AMODE = 0; //Register indirect with post-increment
DMA2CONbits.MODE = 0; //continuous mode without ping-pong
DMA2REQbits.IRQSEL = 10; //Transfer done (SPI)
DMA2STA = __builtin_dmaoffset(SPIBuffA); //receive buffer
DMA2PAD = (volatile unsigned int) &SPI1BUF;
DMA2CNT = 0; //transfer count = 1
IFS1bits.DMA2IF = 0; //Clear DMA interrupt
IEC1bits.DMA2IE = 1; //Enable DMA interrupt
From what I understand of null write mode, the DMA will write a null value every time a read is performed. However, the DMA won't start until an initial write is performed by the CPU, so I've used the manual/force method to start the DMA.
DMA1CONbits.CHEN = 1; //Enable DMA
DMA1REQbits.FORCE = 1; //Manual write
The interrupt now starts and runs without error. However, the code later shows that the collection was incorrect.
My interrupt is simple in that all I'm doing is placing the collected data (which I assume is placed in my DMA's buffer as allocated above) into a pointer which is used throughout my program.
void __attribute__((interrupt, no_auto_psv)) _DMA2Interrupt(void) {
if (RxDmaBuffer == 513) {
DMA2CONbits.CHEN = 0;
rxFlag = 1;
} else {
buffer[RxDmaBuffer] = SPI1BUF;
RxDmaBuffer++;
}
IFS1bits.DMA2IF = 0; // Clear the DMA0 Interrupt Flag
}
When the interrupt has run 512 times, I stop the DMA and throw a flag.
What am I missing? How is this not the same as the non-DMA method? Is it perhaps the lack of the while loop that waits for the completion of the SPI transmission (while (!_SPIRBF);)? Unfortunately, with null write mode automatically sending and receiving the SPI data, I can't manually put any sort of wait in.
I've also tried using two DMA channels, one to write and one to read, but this also didn't work (plus I need that channel later for when I come to proper writing to the SD card).
Any help would be great!
I know that this topic (DMA & SPI) has already been discussed in numerous threads on the Microchip forum; I've actually read all 15 pages of results from a search on the keyword "dma" and read all the topics about DMA & SPI.
And I am still stuck with my problem, so I hope someone can help me :)
Here is the problem.
My chip is a PIC32MX775F512H.
I am trying to receive (only receive) data using SPI via DMA.
Since you cannot "just" receive in SPI, and that the SPI core starts toggling the SPI clock only if you write into the SPIBUF (SPI1ABUF for me) I am trying to receive my data using 2 DMA channels.
DMA_CHANNEL1 for the transmitting part.
DMA_CHANNEL2 for the receiving part.
I copy pasted the code from http://www.microchip.com/forums/tm.aspx?tree=true&high=&m=562453&mpage=1#
I tried to make it work, without any luck. It only receives several bytes (5 or 6).
I've set the event enable flags to DMA_EV_BLOCK_DONE for both DMA channels; no interrupt occurs.
Do you have any idea ?
Here is the code I am using :
int Spi_recv_via_DMA(SPI_simple_master_class* SPI_Port, int8u *in_bytes, int16u num_bytes2)
{
    DmaChannel dmaTxChn = DMA_CHANNEL1;
    DmaChannel dmaRxChn = DMA_CHANNEL2;
    SpiChannel spiTxChn = SPI_Port->channel;
    int8u dummy_input;

    DmaChnOpen(dmaTxChn, DMA_CHN_PRI3, DMA_OPEN_DEFAULT);
    DmaChnOpen(dmaRxChn, DMA_CHN_PRI3, DMA_OPEN_DEFAULT);

    DmaChnSetEventControl(dmaTxChn, DMA_EV_START_IRQ_EN | DMA_EV_START_IRQ(_SPI1A_RX_IRQ));
    DmaChnSetEventControl(dmaRxChn, DMA_EV_START_IRQ_EN | DMA_EV_START_IRQ(_SPI1A_RX_IRQ));

    DmaChnClrEvFlags(dmaTxChn, DMA_EV_ALL_EVNTS);
    DmaChnClrEvFlags(dmaRxChn, DMA_EV_ALL_EVNTS);
    DmaChnSetEvEnableFlags(dmaRxChn, DMA_EV_BLOCK_DONE);
    DmaChnSetEvEnableFlags(dmaTxChn, DMA_EV_BLOCK_DONE);

    //SpiChnClrTxIntFlag(spiTxChn);
    //SpiChnClrRxIntFlag(spiTxChn);

    DmaChnSetTxfer(dmaTxChn, tx_dummy_buffer, (void *)&SPI1ABUF, num_bytes2, 1, 1);
    DmaChnSetTxfer(dmaRxChn, (void *)&SPI1ABUF, in_bytes, 1, num_bytes2, 1);

    while ( (SPI1ASTAT & SPIRBE) == 0)
        dummy_input = SPI1ABUF;
    SPI1ASTAT &= ~SPIROV;

    DmaRxIntFlag = 0;

    DmaChnEnable(dmaRxChn);
    DmaChnStartTxfer(dmaTxChn, DMA_WAIT_NOT, 0);

    while(!DmaRxIntFlag);

    return 1;
}
with those two interrupt handlers :
// handler for the DMA channel 1 interrupt
void __ISR(_DMA1_VECTOR, ipl5) DmaHandler1(void)
{
    int evFlags;                                    // event flags when getting the interrupt

    //LED_On(LED_CFG);
    INTClearFlag(INT_SOURCE_DMA(DMA_CHANNEL1));     // acknowledge the INT controller, we're servicing int

    evFlags = DmaChnGetEvFlags(DMA_CHANNEL1);       // get the event flags

    if(evFlags & DMA_EV_BLOCK_DONE)
    {   // just a sanity check. we enabled just the DMA_EV_BLOCK_DONE transfer done interrupt
        DmaTxIntFlag = 1;
        DmaChnClrEvFlags(DMA_CHANNEL1, DMA_EV_BLOCK_DONE);
    }
    //LED_Off(LED_CFG);
}

void __ISR(_DMA2_VECTOR, ipl5) DmaHandler2(void)
{
    int evFlags;                                    // event flags when getting the interrupt

    INTClearFlag(INT_SOURCE_DMA(DMA_CHANNEL2));     // acknowledge the INT controller, we're servicing int

    evFlags = DmaChnGetEvFlags(DMA_CHANNEL2);       // get the event flags

    if(evFlags & DMA_EV_BLOCK_DONE)
    {   // just a sanity check. we enabled just the DMA_EV_BLOCK_DONE transfer done interrupt
        DmaRxIntFlag = 1;
        DmaChnClrEvFlags(DMA_CHANNEL2, DMA_EV_BLOCK_DONE);
    }
}
So I end up waiting forever at the line : while(!DmaRxIntFlag);
I have put breakpoints in the interrupt vectors, they are never called.
This is the state of several registers during the everlasting wait:
DMACON 0x0000C800
DMASTAT 0x00000001
I am using SPI1A port, so SPI1ABUF and _SPI1A_RX_IRQ
DCH1SPTR 0x5
DCH1SSIZ 0x2B
DCH2DPTR 0x6
DCH2DSIZ 0x2B
DCH2CON 0x00008083
DCH2ECON 0x1B10
DCH2INT 0x00800C4
DCH2SSA 0x1F805820
DCH2DSA 0x00000620
Channel 1 is used to transmit
Channel 2 is used to receive
You are missing these:
INTEnable(INT_SOURCE_DMA(dmaTxChn), INT_ENABLED); // Tx
INTEnable(INT_SOURCE_DMA(dmaRxChn), INT_ENABLED); // Rx
right before
DmaRxIntFlag = 0;
DmaChnEnable(dmaRxChn);
DmaChnStartTxfer(dmaTxChn, DMA_WAIT_NOT, 0);
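Since the handlers are declared at ipl5, the DMA vector priorities presumably need to match as well; in the same plib style, something like this (hedged, verify the macro names against your plib version):
INTSetVectorPriority(INT_VECTOR_DMA(dmaTxChn), INT_PRIORITY_LEVEL_5);
INTSetVectorPriority(INT_VECTOR_DMA(dmaRxChn), INT_PRIORITY_LEVEL_5);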
Good luck!
Are you using the SPI in slave mode? Or are you in master mode, trying to read a response to a command?
Have you checked the silicon errata for this chip? The dsPIC33FJ family had an issue where SPI slave mode simply didn't work.
Other than that, I don't think it is a good idea to busy-wait for the DmaRxIntFlag change. You should configure the DMA transfer and continue with your main loop; the DMA will trigger the interrupt handler when it's done.
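If it helps, the non-blocking shape would be roughly this (a sketch built on the flag variable and plib calls already in your code):
// kick off the transfer once, then return to the main loop
DmaRxIntFlag = 0;
DmaChnEnable(dmaRxChn);
DmaChnStartTxfer(dmaTxChn, DMA_WAIT_NOT, 0);

// elsewhere, in the main loop:
if (DmaRxIntFlag)
{
    DmaRxIntFlag = 0;
    // process the received block in in_bytes here
}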
Hope this helps.