I'm struggling with what is probably a very simple problem.
I have a Cypress CY8 controller acting as SPI master, which should communicate with a PIC32MX in slave mode to exchange data packets.
However, I can't even get a simple transmission of multiple bytes from the master to the slave working. I've set up the Cypress to transmit a char of increasing value (0-255) with a pause (and a slave-select toggle) in between. The PIC should read the incoming byte and then print it over UART to my PC (the UART connection works).
But the PIC keeps printing the first character it received instead of the updated values.
If I check my logic sniffer, the Cypress does send incrementing values, and the PIC relays them back over the MISO line (it looks like the shift buffer isn't being cleared).
What could this be?
The cypress without the pic attached gives proper output:
https://dl.dropboxusercontent.com/u/3264324/Schermafdruk%202015-07-28%2015.43.28.png
With the pic attached it relays the data over MISO:
https://dl.dropboxusercontent.com/u/3264324/Schermafdruk%202015-07-28%2015.43.45.png
And this is my (now) extremely basic code to test it:
TRISBbits.TRISB2 = 1; // make RB2 an input (SDI2)
TRISBbits.TRISB5 = 0; // make RB5 an output (SDO2)
TRISBbits.TRISB15 = 1; // make RB15 an input (SCK2; the clock is an input in slave mode)
ANSELA = 0; // all ports digital
ANSELB = 0; // all ports digital
SYSKEY = 0x00000000;
SYSKEY = 0xAA996655;
SYSKEY = 0x556699AA;
CFGCONbits.IOLOCK=0; // unlock configuration
CFGCONbits.PMDLOCK=0;
SDI2R = 0b0100; //SDI2 on pin RB2
SS2R = 0b0011; //SS2 on pin RB10
RPB5R = 0b0100; //SDO2 on pin RB5
// SCLK is connected to pin RB14 (SCK) by default
SYSKEY = 0x00000000;
SPI2CON = 0; // Stops and resets SPI2.
rData=SPI2BUF; // clears the receive buffer
SPI2BRG=207; // use FPB/4 clock frequency <-- not important in slave mode right?
SPI2STATCLR=0x40; // clear the overflow flag
SPI2CON=0x8180; // ON=1, CKE=1, SSEN=1: 8-bit slave mode with slave select
unsigned char t;
while(1){
t = SpiChnReadC(2);
//t = SPI2BUF; // I've tried this also
sendData(t); // UART routine
}
As I do receive a character, and the SPI data is constantly relayed back to the Cypress, I think something goes wrong with reading/clearing the SPI data structure in the PIC, but I can't figure out why.
As I read it in the datasheet, reading from SPI2BUF gives me the received data and clears the read flag so new data can be received, but it looks like that doesn't happen...
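For what it's worth, this is the polling pattern I understand the datasheet to describe (an untested sketch; SPIRBF is the receive-buffer-full flag in SPI2STAT, and writing SPI2BUF preloads what the slave shifts out on the next transfer):
unsigned char t;
while(1){
    while(!SPI2STATbits.SPIRBF); // wait until a new byte has actually arrived
    t = SPI2BUF; // reading SPI2BUF clears SPIRBF
    SPI2BUF = 0x00; // preload the next outgoing byte, otherwise MISO repeats the old one
    sendData(t); // UART routine
}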
Can someone shine a light on this for me?
Thanks in advance
Timberleek
You should try making your SPI handler ISR-driven to keep from constantly polling; it can also help debugging, since you'll only get notifications when the SPI is actually transacting.
NOTE: I'm bringing this from my FreeRTOS impl, so my ISR definition is not XC32 exactly...
/* Open SPI */
SPI1CON = 0;
spi_flags = SPICON_MODE32 | SPICON_ON;
SpiChnOpen(1,spi_flags,BRG_VAL);
SpiChnGetRov(1,TRUE);
mSPI1ClearAllIntFlags();
mSPI1SetIntPriority(priority + 1);
mSPI1SetIntSubPriority(0);
mSPI1RXIntEnable(1);
void vSPI1InterruptHandler(void)
{
unsigned long data;
if (IFS0bits.SPI1EIF == 1)
{
mSPI1EClearIntFlag();
}
if (IFS0bits.SPI1RXIF == 1)
{
data = SPI1BUF;
//sendData(data);
}
mSPI1RXClearIntFlag();
}
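In plain XC32 (without the FreeRTOS wrapper) the same idea would look roughly like this. This is an untested sketch, and the vector/flag names (_SPI_2_VECTOR, IFS1bits.SPI2EIF/SPI2RXIF) differ between PIC32MX subfamilies, so check your processor header:
#include <xc.h>
#include <sys/attribs.h>

void __ISR(_SPI_2_VECTOR, IPL3SOFT) Spi2InterruptHandler(void)
{
    unsigned int data;
    if (IFS1bits.SPI2EIF) // error interrupt, e.g. receive overflow
    {
        SPI2STATCLR = 0x40; // clear SPIROV
        IFS1bits.SPI2EIF = 0;
    }
    if (IFS1bits.SPI2RXIF) // receive buffer not empty
    {
        data = SPI2BUF;
        //sendData(data);
        IFS1bits.SPI2RXIF = 0;
    }
}
Remember to set the SPI2 interrupt priority in the matching IPC register to agree with IPL3SOFT, and to enable multi-vectored interrupts, before setting the IEC enable bit.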
I have a PIC24FJ256GA702 communicating with a AD5724RBREZ quad DAC with a SPI link.
The DAC works fine, writing is no problem, but I am stuck on reading back the control register from it.
I get the correct waveform on the PIC pin that I am expecting, and the read routine runs, but it always returns zeros. The data is clearly not zero: the waveform on the scope is correct for the four channels and the internal reference being enabled.
Scope view of waveforms - blue = clock, yellow = data input from DAC at PIC pin
(The excessive ringing on the scope image is probably caused by a long ground connection; in practice these chips are about 25 mm apart.)
I suspected the input pin was configured as analogue, but it was correctly set as a digital input.
I connected it to a counter based on Timer1, and that counter does count if I try to read the DAC. This suggests that the PPS is working, the pin is not bust, and the input signal is clean enough to use.
I think it may be a problem with the code or the decode timing of the SPI module, but as shown in the image the data is stable during the clock cycle so I cannot see what is wrong.
I have searched the forums, and it seems most with this problem trace it to analogue functions being enabled but that is not the case here.
Would anyone like to suggest something else to try, or post some working SPI read code if my code is not looking correct?
The code follows-
void AOUT_init(void)
{
//assume PPS is unlocked (it is on reset) - see note below
//setup SPI1 itself
SPI1CON1L = 0x0160; //TX work read 0
//SPI1CON1L = 0x0060; //TX not work
//SPI1CON1L = 0x0120; //TX work, read 0
//SPI1CON1L = 0x0020; //not tried
SPI1CON1H = 0x3000;
SPI1CON2L = 0x0017; //word length (0x17 = 23, i.e. 24-bit frames)
//BRG
SPI1BRGL = 0x0000; //default = no divisor
//PPS - assume unlocked at this point
ANSBbits.ANSB13 = 0;
TRISBbits.TRISB13 = TRIS_INPUT;
//##########################################################################
RPINR20bits.SDI1R = 13; //set SDI1 data input to PIC to RB13
//##########################################################################
TRISBbits.TRISB15 = TRIS_OUTPUT;
RPOR7bits.RP15R = 8; //RB15 to SCK1, clock out from PIC
TRISBbits.TRISB14 = TRIS_OUTPUT;
RPOR7bits.RP14R = 7; //RB14 to SDO1, data out from PIC
//AD5724R has additional lines - not all used in practice
//setup and set initial level here
//AOUT-/LDAC
TRISBbits.TRISB6 = TRIS_OUTPUT;
LATBbits.LATB6 = 1;
//AOUT-/SYNC
TRISBbits.TRISB7 = TRIS_OUTPUT;
LATBbits.LATB7 = 1;
//AOUT-/CLR
TRISBbits.TRISB12 = TRIS_OUTPUT;
LATBbits.LATB12 = 1;
//turn SPI on
SPI1CON1Lbits.SPIEN = 1;
SPI1CON1Lbits.MSTEN = 1; //included in definition above
//now setup the AD chip
//output range set
AOUT_TX(0x0C00,0x0100); //all channels to 10V
//control
AOUT_TX(0x1900,0x0500);
//power control - enable DACs
AOUT_TX(0x1000,0x1F00);
}
The comms routine below is included for completeness- it just controls the other DAC lines. The /SYNC line is doing a chip select function.
static void AOUT_Comms(bool bSync, bool bLDAC, bool bClr)
{
//AOUT-/LDAC
LATBbits.LATB6 = bLDAC;
//AOUT-/SYNC
LATBbits.LATB7 = bSync;
//AOUT-/CLR
LATBbits.LATB12 = bClr;
}
This is the write routine, which works fine.
void AOUT_TX(uint16_t dataH, uint16_t dataL)
{
//AOUT uses 24 bit data
//this routine handles /SYNC line
//relies on AD chip having much faster response than chip cycle time
//AD chip limits at about 38MHz so much quicker than PIC.
AOUT_Comms(0,1,1);
SPI1BUFL = dataL;
SPI1BUFH = dataH;
while(SPI1STATLbits.SPIBUSY) //wait until sent
{
;
}
AOUT_Comms(1,1,1);
//
}
This is the read routine, which uses the routines above.
void AOUT_read_control(uint16_t ReadH, uint16_t ReadL)
{
uint16_t temp;
//to read, transmit the register to be read, then transmit a dummy command "NOP" to clock the data out.
//send register
AOUT_TX(0x9000,0x0000);
//read out- this is similar to write but reads the received buffer at the end.
//clear buffer
while (SPI1STATLbits.SPIRBF)
{
temp = SPI1BUFL;
temp = SPI1BUFH;
}
AOUT_Comms(0,1,1);
SPI1BUFL = 0;
SPI1BUFH = 0x1800; //nop operation via control register
while(SPI1STATLbits.SPIBUSY) //wait until sent
{
;
}
while (!SPI1STATLbits.SPIRBF)
{
; //hold until something received
}
ReadH = SPI1BUFH;
ReadL = SPI1BUFL; //these read zero
AOUT_Comms(1,1,1);
//
//dummy so can check counter also connected to same pin
temp = TMR1;
temp = TMR1;
}
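One detail worth flagging in the routine above: ReadH and ReadL are passed by value, so even a successful read never makes it back to the caller. A pointer-based variant (same sequence, just sketched out) would be:
void AOUT_read_control(uint16_t *ReadH, uint16_t *ReadL)
{
    uint16_t temp;
    //send the register to be read
    AOUT_TX(0x9000,0x0000);
    //drain anything left in the receive buffer
    while (SPI1STATLbits.SPIRBF)
    {
        temp = SPI1BUFL;
        temp = SPI1BUFH;
    }
    AOUT_Comms(0,1,1);
    SPI1BUFL = 0;       //NOP frame to clock the data out
    SPI1BUFH = 0x1800;
    while (SPI1STATLbits.SPIBUSY)
    {
        ;
    }
    while (!SPI1STATLbits.SPIRBF)
    {
        ;
    }
    *ReadH = SPI1BUFH;  //results now reach the caller
    *ReadL = SPI1BUFL;
    AOUT_Comms(1,1,1);
    (void)temp;
}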
Update-
I checked the SPI read decode by sending the received words directly to a display as suggested by the comments. I get 0000 for both words.
Also as suggested I connected a logic analyser at the PIC pins, and it decodes both read and write correctly. Logic analyser view: Ch1/blue = MOSI writing 0x180000, Ch2/green = MISO reading 0x00001F; both correct.
I'm more of a high-level software guy, but I have been working on some embedded projects lately, so I'm sure there's something obvious I'm missing here, though I have spent over a week trying to debug this, and every 'MSP'-related link in Google is purple at this point...
I currently have an MSP430F5529 set up as an I2C slave device whose only responsibility is to receive packets from a master device. The master uses industry-grade I2C, has been heavily tested, and has been ruled out as the source of my problem here. I'm using Code Composer as my IDE with the TI v15.12.3.LTS compiler.
What currently happens is that the master queries how many packets (of size 62 bytes) the slave can hold, then sends over a few packets, which the MSP just discards for now. This happens every 100 ms on the master side, and for the minimal example below the MSP will always just send back 63 when asked how many packets it can hold. I have tested the master with a Total Phase Aardvark and everything works fine with that, so I'm sure it's a problem on the MSP side. The problem is as follows:
The program will work for 15-20 minutes, sending over tens of thousands of packets. At some point the slave starts to hold the clock line low, and when paused in debug mode it is shown to be stuck in the start interrupt. The same sequence of events happens every single time to cause this:
1) Master queries how many packets the MSP can hold.
2) A packet is sent successfully
3) Another packet is attempted but < 62 bytes are received by the MSP (counted by logging how many Rx interrupts I receive). No stop condition is sent so master times out.
4) Another packet is attempted. A single byte is sent before the stop condition is sent.
5) Another packet is attempted to be sent. A start interrupt, then a Tx interrupt happens and the device hangs.
Ignoring the fact that I'm not handling the timeout errors on the master side, something very strange is happening to cause that sequence of events, but that's what happens every single time.
Below is the minimal working example which reproduces the problem. My particular concern is with the SetUpRx and SetUpTx functions. The examples that the Code Composer Resource Explorer gives only cover Rx or Tx on their own, and I'm not sure if I'm combining them in the right way. I also tried removing SetUpRx completely, putting the device into transmit mode and replacing all calls to SetUpTx/Rx with mode = TX_MODE/RX_MODE; that did work, but it still eventually holds the clock line low. Ultimately I'm not 100% sure how to set this up to handle both Rx and Tx requests.
#include "driverlib.h"
#define SLAVE_ADDRESS (0x48)
// During main loop, set mode to either RX_MODE or TX_MODE
// When I2C is finished, OR mode with I2C_DONE; hence on exit mode will be one of I2C_RX_DONE or I2C_TX_DONE
#define RX_MODE (0x01)
#define TX_MODE (0x02)
#define I2C_DONE (0x04)
#define I2C_RX_DONE (RX_MODE | I2C_DONE)
#define I2C_TX_DONE (TX_MODE | I2C_DONE)
/**
* I2C message ids
*/
#define MESSAGE_ADD_PACKET (3)
#define MESSAGE_GET_NUM_SLOTS (5)
static volatile uint8_t mode = RX_MODE; // current mode, TX or RX
static volatile uint8_t rx_buff[64] = {0}; // where to write rx data
static volatile uint8_t* rx_data = rx_buff; // used in rx interrupt
static volatile uint8_t tx_len = 0; // number of bytes to reply with
static inline void SetUpRx(void) {
// Specify receive mode
USCI_B_I2C_setMode(USCI_B0_BASE, USCI_B_I2C_RECEIVE_MODE);
// Enable I2C Module to start operations
USCI_B_I2C_enable(USCI_B0_BASE);
// Enable interrupts
USCI_B_I2C_clearInterrupt(USCI_B0_BASE, USCI_B_I2C_TRANSMIT_INTERRUPT);
USCI_B_I2C_enableInterrupt(USCI_B0_BASE, USCI_B_I2C_START_INTERRUPT + USCI_B_I2C_RECEIVE_INTERRUPT + USCI_B_I2C_STOP_INTERRUPT);
mode = RX_MODE;
}
static inline void SetUpTx(void) {
//Set in transmit mode
USCI_B_I2C_setMode(USCI_B0_BASE, USCI_B_I2C_TRANSMIT_MODE);
//Enable I2C Module to start operations
USCI_B_I2C_enable(USCI_B0_BASE);
//Enable transmit interrupt
USCI_B_I2C_clearInterrupt(USCI_B0_BASE, USCI_B_I2C_RECEIVE_INTERRUPT);
USCI_B_I2C_enableInterrupt(USCI_B0_BASE, USCI_B_I2C_START_INTERRUPT + USCI_B_I2C_TRANSMIT_INTERRUPT + USCI_B_I2C_STOP_INTERRUPT);
mode = TX_MODE;
}
/**
* Parse the incoming message and set up the tx_data pointer and tx_len for I2C reply
*
* In most cases, tx_buff is filled with data as the replies that require it either aren't used frequently or use few bytes.
* Straight pointer assignment is likely better but that means everything will have to be volatile which seems overkill for this
*/
static void DecodeRx(void) {
static uint8_t message_id = 0;
message_id = (*rx_buff);
rx_data = rx_buff;
switch (message_id) {
case MESSAGE_ADD_PACKET: // Add some data...
// do nothing for now
tx_len = 0;
break;
case MESSAGE_GET_NUM_SLOTS: // How many packets can we send to device
tx_len = 1;
break;
default:
tx_len = 0;
break;
}
}
void main(void) {
//Stop WDT
WDT_A_hold(WDT_A_BASE);
//Assign I2C pins to USCI_B0
GPIO_setAsPeripheralModuleFunctionInputPin(GPIO_PORT_P3, GPIO_PIN0 + GPIO_PIN1);
//Initialize I2C as a slave device
USCI_B_I2C_initSlave(USCI_B0_BASE, SLAVE_ADDRESS);
// go into listening mode
SetUpRx();
while(1) {
__bis_SR_register(LPM4_bits + GIE);
// Message received over I2C, check if we have anything to transmit
switch (mode) {
case I2C_RX_DONE:
DecodeRx();
if (tx_len > 0) {
// start a reply
SetUpTx();
} else {
// nothing to do, back to listening
mode = RX_MODE;
}
break;
case I2C_TX_DONE:
// go back to listening
SetUpRx();
break;
default:
break;
}
}
}
/**
* I2C interrupt routine
*/
#pragma vector=USCI_B0_VECTOR
__interrupt void USCI_B0_ISR(void) {
switch(__even_in_range(UCB0IV,12)) {
case USCI_I2C_UCSTTIFG:
break;
case USCI_I2C_UCRXIFG:
*rx_data = USCI_B_I2C_slaveGetData(USCI_B0_BASE);
++rx_data;
break;
case USCI_I2C_UCTXIFG:
if (tx_len > 0) {
USCI_B_I2C_slavePutData(USCI_B0_BASE, 63);
--tx_len;
}
break;
case USCI_I2C_UCSTPIFG:
// OR'ing mode will let it be flagged in the main loop
mode |= I2C_DONE;
__bic_SR_register_on_exit(LPM4_bits);
break;
}
}
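For what it's worth, one variation I've considered (but haven't proven yet) is re-arming the receive pointer on every start condition, so a truncated packet can't leave rx_data stranded mid-buffer. The ISR would change to something like:
#pragma vector=USCI_B0_VECTOR
__interrupt void USCI_B0_ISR(void) {
    switch(__even_in_range(UCB0IV,12)) {
    case USCI_I2C_UCSTTIFG:
        rx_data = rx_buff; // reset on every start, not only after a clean stop
        break;
    case USCI_I2C_UCRXIFG:
        *rx_data = USCI_B_I2C_slaveGetData(USCI_B0_BASE);
        ++rx_data;
        break;
    case USCI_I2C_UCTXIFG:
        if (tx_len > 0) {
            USCI_B_I2C_slavePutData(USCI_B0_BASE, 63);
            --tx_len;
        }
        break;
    case USCI_I2C_UCSTPIFG:
        mode |= I2C_DONE;
        __bic_SR_register_on_exit(LPM4_bits);
        break;
    }
}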
Any help on this would be much appreciated!
Thank you!
I am new to STM8, and trying to use a STM8S103F3, using IAR Embedded Workbench.
Using C, I like to use the registers directly.
I need serial on 14400 baud, 8N2, and getting the UART transmit is easy, as there are numerous good tutorials and examples on the net.
Then I need the UART receive to work on interrupt; nothing else will do.
That is the problem.
According to iostm8s103f3.h (IAR) there are 5 interrupts on 0x14 vector
UART1_R_IDLE, UART1_R_LBDF, UART1_R_OR, UART1_R_PE, UART1_R_RXNE
According to Silverlight Developer: Registers on the STM8,
Vector 19 (0x13) = UART_RX
According to ST Microelectronics STM8S.h
#define UART1_BaseAddress 0x5230
#define UART1_SR_RXNE ((u8)0x20) /*!< Read Data Register Not Empty mask */
#if defined(STM8S208) ||defined(STM8S207) ||defined(STM8S103) ||defined(STM8S903)
#define UART1 ((UART1_TypeDef *) UART1_BaseAddress)
#endif /* (STM8S208) ||(STM8S207) || (STM8S103) || (STM8S903) */
According to STM8S Reference manual RM0016
The RXNE flag (Rx buffer not empty) is set on the last sampling clock edge,
when the data is transferred from the shift register to the Rx buffer.
It indicates that a data is ready to be read from the SPI_DR register.
Rx buffer not empty (RXNE)
When set, this flag indicates that there is a valid received data in the Rx buffer.
This flag is reset when SPI_DR is read.
Then I wrote:
#pragma vector = UART1_R_RXNE_vector //as iostm8s103f3 is used, that means 0x14
__interrupt void UART1_IRQHandler(void)
{ unsigned char recd;
recd = UART1_SR;
if(1 == UART1_SR_RXNE) recd = UART1_DR;
etc.
No good: I continually get interrupts, UART1_SR_RXNE is set, but UART1_DR
is empty, and no UART receive has happened. I have disabled all other interrupts
I can see that could vector to this, and still no good.
The SPI also sets this flag, so presumably the UART and SPI cannot be used
together.
I sorely need to get this serial receive interrupt going. Please help.
Thank you
The problem was one bit incorrectly set in the UART1 setup.
The complete setup for the UART1 on the STM8S103F3 is now (IAR):
void InitialiseUART()
{
unsigned char tmp = UART1_SR; // reading SR then DR clears any pending flags
tmp = UART1_DR;
// Reset the UART registers to the reset values.
UART1_CR1 = 0;
UART1_CR2 = 0;
UART1_CR4 = 0;
UART1_CR3 = 0;
UART1_CR5 = 0;
UART1_GTR = 0;
UART1_PSCR = 0;
// Set up the port to 14400,n,8,2.
UART1_CR1_M = 0; // 8 Data bits.
UART1_CR1_PCEN = 0; // Disable parity.
UART1_CR3 = 0x20; // 2 stop bits
UART1_BRR2 = 0x07; // Set the baud rate registers to 14400
UART1_BRR1 = 0x45; // based upon a 16 MHz system clock.
// Disable the transmitter and receiver.
UART1_CR2_TEN = 0; // Disable transmit.
UART1_CR2_REN = 0; // Disable receive.
// Set the clock polarity, clock phase and last bit clock pulse.
UART1_CR3_CPOL = 0;
UART1_CR3_CPHA = 0;
UART1_CR3_LBCL = 0;
// Set the Receive Interrupt RM0016 p358,362
UART1_CR2_TIEN = 0; // Transmitter interrupt enable
UART1_CR2_TCIEN = 0; // Transmission complete interrupt enable
UART1_CR2_RIEN = 1; // Receiver interrupt enable
UART1_CR2_ILIEN = 0; // IDLE Line interrupt enable
// Turn on the UART transmit, receive and the UART clock.
UART1_CR2_TEN = 1;
UART1_CR2_REN = 1;
UART1_CR1_PIEN = 0;
UART1_CR4_LBDIEN = 0;
}
//-----------------------------
#pragma vector = UART1_R_RXNE_vector
__interrupt void UART1_IRQHandler(void)
{
unsigned char recd;
recd = UART1_DR;
//send the byte to circular buffer;
}
You forgot to enable the global interrupt flag:
asm("rim") ; //Enable global interrupt
This happens on non-isolated connections, whenever you connect your board's ground to another source's ground (a USB<->TTL converter connected to a PC, etc.). In this case the microcontroller picks up noise, for example through the high-value Y capacitor of an SMPS.
Simply connect your RX and TX lines via 1 k resistors, and put 1 nF capacitors (the value can be decreased for higher speeds) from these lines to ground (on the microcontroller side) to suppress the noise.
I am new to embedded programming and am trying to get my first I2C project working. I am using the PIC32MX795F512L. I am pretty much following the microchip datasheet for I2C on PIC32. The problem I'm having is the S bit is never cleared from the I2C1STAT register. It is actually unclear to me whether I have to do this or if it is done automatically at the conclusion of the Start event, but right now I'm trying to manually clear it. However, nothing that I do seems to have an effect. If more information is needed to make it easier to understand what is happening let me know. I am using a PICKIT3 so I can get debugging information as well. I know that the Master interrupt occurs, the S bit gets set, I exit the interrupt code and hang on the while statement checking the I2C1STATbits.S.
Edit: I'm editing this post to show my new code instead of the old code. I am now using a 20 MHz peripheral clock; just one of the many things I tried today that did not work. delay() is just a 256 ms delay. Super long, I know, but it was quick.
main()
{
//Setup I2C1CON
I2C1CONbits.SIDL = 0; //Continue to run while in Idle
I2C1CONbits.SCLREL = 1; //Release the clock (Unsure of this)
I2C1CONbits.A10M = 0; //Using a 7 bit slave address
I2C1CONbits.DISSLW = 1; //Slew rate control disabled because running at 100 kHz
I2C1ADD = 0x1E; //Slave address without read or write bit
I2C1BRG = 0x060; //Set the BRG clock rate - Based on Page 24-19
I2C1CONbits.ON = 1; //Turn on the I2C module
delay();
I2C1CONbits.SEN = 1; //Initiate a start event
while(I2C1CONbits.SEN == 1); //Wait until Start event is done
I2C1TRN = 0x3C; //Load the address into the Transmit register
while(I2C1STATbits.TRSTAT == 1);
while(I2C1STATbits.ACKSTAT == 0); //Wait for an ACK from the device
I2C1TRN = 0x00;
while(I2C1STATbits.TRSTAT == 1);
while(I2C1STATbits.ACKSTAT == 0);
I2C1TRN = 0x70;
while(I2C1STATbits.TRSTAT == 1);
while(I2C1STATbits.ACKSTAT == 0);
while(1);
}
Thanks for any help.
I'm also just beginning with the PIC32MZ family, setting up the I2C to talk to various memory chips.
I used your code and modified it so that it would work properly. Since I am using the PIC32MZ family, I believe the I2C registers should be much the same.
I2C configuration:
I2C1CONbits.SIDL = 0; // Stop in Idle Mode bit -> Continue module operation when the device enters Idle mode
I2C1CONbits.A10M = 0; // 10-bit Slave Address Flag bit -> I2CxADD register is a 7-bit slave address
I2C1CONbits.DISSLW = 1; // Slew Rate Control Disable bit -> Slew rate control disabled for Standard Speed mode (100 kHz)
I2C1CONbits.ACKDT = 0; // Acknowledge Data bit -> ~ACK is sent
I2C1BRG = 0x0F3; // Baud Rate Generator set to provide 100 kHz for SCL with a 50 MHz xtal.
I followed the transmission steps provided in the I2C datasheet, so it is easy to follow along with the PDF and my comments in the code.
I2C Data Transmission:
// 1. Turn on the I2C module by setting the ON bit (I2CxCON<15>) to ‘1’.
I2C1CONbits.ON = 1; // I2C Enable bit -> Enables the I2C module and configures the SDAx and SCLx pins as serial port pins
//------------- WRITE begins here ------------
// 2. Assert a Start condition on SDAx and SCLx.
I2C1CONbits.PEN = 0; // Stop Condition Enable Bit -> Stop Condition Idle
I2C1CONbits.SEN = 1; // Start Condition Enable bit -> Initiate Start condition on SDAx and SCLx pins; cleared by module
while(I2C1CONbits.SEN == 1); // SEN is to be cleared when I2C Start procedure has been completed
// 3. Load the Data on the bus
I2C1TRN = 0b10100000 ; // Write the slave address to the transmit register for I2C WRITE
while(I2C1STATbits.TRSTAT == 1); // MASTER Transmit still in progress
while(I2C1STATbits.ACKSTAT == 1); // Master should receive the ACK from Slave, which will clear the I2C1STAT<ACKSTAT> bit.
I2C1TRN = 0xCE; // Register Address
while(I2C1STATbits.TRSTAT == 1);
while(I2C1STATbits.ACKSTAT == 1);
I2C1TRN = 0xCF; // Register Value
while(I2C1STATbits.TRSTAT == 1);
while(I2C1STATbits.ACKSTAT == 1);
I2C1CONbits.PEN = 1; // Stop Condition Enable Bit -> Initiate Stop condition on SDAx and SCLx pins; cleared by module
//-------------- WRITE ends here -------------
This code works well as I used a couple of LEDs to toggle indicating a successful write procedure.
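For the read direction I haven't verified this on my hardware yet, but following the same datasheet steps, a sketch would look like the following (it assumes the start and register-address write have already been done without a Stop, then issues a repeated start with R/W = 1):
//------------- READ sketch begins here ------------
unsigned char value;
I2C1CONbits.RSEN = 1; // Repeated Start condition; cleared by module
while(I2C1CONbits.RSEN == 1);
I2C1TRN = 0b10100001; // Slave address with R/W = 1 (read)
while(I2C1STATbits.TRSTAT == 1);
while(I2C1STATbits.ACKSTAT == 1); // Wait for the slave ACK
I2C1CONbits.RCEN = 1; // Receive Enable; cleared by module when a byte arrives
while(!I2C1STATbits.RBF); // Receive Buffer Full
value = I2C1RCV; // Reading I2C1RCV clears RBF
I2C1CONbits.ACKDT = 1; // NACK the last byte...
I2C1CONbits.ACKEN = 1; // ...and start the acknowledge sequence; cleared by module
while(I2C1CONbits.ACKEN == 1);
I2C1CONbits.PEN = 1; // Stop
while(I2C1CONbits.PEN == 1);
//-------------- READ ends here --------------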
The while(!(I2C1STATbits.ACKSTAT == 0)); is the right way, I guess. I2C1ADD is used to set the slave address of the current PIC micro if you are using your I2C module as a slave. If any master issues a start with that slave address, the PIC will respond as a slave.
Read this PDF; early on it explains that the PIC's I2C module can be configured as both master and slave, and that the master request is checked by the same PIC against the slave address in the I2C1ADD register.
http://ww1.microchip.com/downloads/en/DeviceDoc/61116F.pdf
I'm trying to recreate a project of writing to an SD card (using FatFS) for a dsPIC33FJ128GP802 microcontroller.
Currently, to collect the data from the SPI, I have a do/while that loops 512 times: it writes a dummy value to the SPI buffer, waits for the SPI flag, then reads the SPI value, like so:
int c = 512;
do {
SPI1BUF = 0xFF;
while (!_SPIRBF);
*p++ = SPI1BUF;
} while (--c);
I'm trying to recreate this using the DMA interrupts, but it's not working like I had hoped. I'm using one DMA channel; SPI is in 8-bit mode for the time being, so the DMA is in byte mode; it's also in 'null write' mode, and continuous without ping-pong. My buffers are single-element arrays, and DMA2CNT matches (0, i.e. one transfer per block).
DMA2CONbits.CHEN = 0; //Disable DMA
DMA2CONbits.SIZE = 1; //Receive bytes (8 bits)
DMA2CONbits.DIR = 0; //Receive from SPI to DMA
DMA2CONbits.HALF = 0; //Receive full blocks
DMA2CONbits.NULLW = 1; //null write mode on
DMA2CONbits.AMODE = 0; //Register indirect with post-increment
DMA2CONbits.MODE = 0; //continuous mode without ping-pong
DMA2REQbits.IRQSEL = 10; //Transfer done (SPI)
DMA2STA = __builtin_dmaoffset(SPIBuffA); //receive buffer
DMA2PAD = (volatile unsigned int) &SPI1BUF;
DMA2CNT = 0; //transfer count = 1
IFS1bits.DMA2IF = 0; //Clear DMA interrupt
IEC1bits.DMA2IE = 1; //Enable DMA interrupt
From what I understand of null write mode, the DMA will write a null value every time a read is performed. However, the DMA won't start until an initial write is performed by the CPU, so I've used the manual/force method to start the DMA.
DMA1CONbits.CHEN = 1; //Enable DMA
DMA1REQbits.FORCE = 1; //Manual write
The interrupt now fires and runs without error. However, the code later shows that the collected data is incorrect.
My interrupt is simple: all I'm doing is placing the collected data (which I assume has been placed in my DMA buffer as allocated above) into a buffer which is used throughout my program.
void __attribute__((interrupt, no_auto_psv)) _DMA2Interrupt(void) {
if (RxDmaBuffer == 513) {
DMA2CONbits.CHEN = 0;
rxFlag = 1;
} else {
buffer[RxDmaBuffer] = SPI1BUF;
RxDmaBuffer++;
}
IFS1bits.DMA2IF = 0; // Clear the DMA0 Interrupt Flag
}
When the interrupt has run 512 times, I stop the DMA and throw a flag.
What am I missing? How is this not the same as the non-DMA method? Is it perhaps the lack of the while loop which waits for the completion of the SPI transmission (while (!_SPIRBF);)? Unfortunately, with null write mode automatically sending and receiving the SPI data, I can't manually put any sort of wait in.
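For reference, this is roughly the single-block setup I expected to need (my guess from the DMA section of the manual, untested; note it also fixes the fact that my force-start above pokes DMA1 rather than DMA2): DMA2CNT = 511 so the channel moves all 512 bytes into a proper DMA buffer and interrupts once per block instead of once per byte.
unsigned char SPIBuffA[512] __attribute__((space(dma))); // full-block DMA buffer

void StartBlockRead(void) {
    DMA2CONbits.CHEN = 0;
    DMA2CNT = 511; // N-1, so 512 transfers per block
    DMA2STA = __builtin_dmaoffset(SPIBuffA); // data lands here, post-incremented
    DMA2PAD = (volatile unsigned int) &SPI1BUF;
    IFS1bits.DMA2IF = 0;
    IEC1bits.DMA2IE = 1;
    DMA2CONbits.CHEN = 1; // enable the same channel we configured...
    DMA2REQbits.FORCE = 1; // ...and kick off the first null write
}

void __attribute__((interrupt, no_auto_psv)) _DMA2Interrupt(void) {
    rxFlag = 1; // the full 512-byte block is now in SPIBuffA
    IFS1bits.DMA2IF = 0;
}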
I've also tried using two DMA channels, one to write and one to read, but this also didn't work (plus I need that channel later for when I come to proper writing to the SD card).
Any help would be great!