STM32 I2C set SDA to low - c

Is there any way to drive the SDA and SCL pins of the STM32's I2C1 peripheral low or high?
I am using a security chip and have to send a wake condition, which is specified as follows:
if SDA is held low for a period of greater than 60us, the device will exit low power mode and
after a delay of 1500us, it will be ready to receive I2C commands.
I've already tried toggling the pin directly with HAL_GPIO_TogglePin(GPIOB, GPIO_PIN_9);, but that isn't working.
I've configured my project with STM32CubeMX.
Thanks for your help.

In I2C, the START condition requires a high-to-low transition on SDA. If you then send a dummy address 0, a NACK will be generated (or rather, the lack of any response will be interpreted as a NACK). In a normal transaction the software would respond to the NACK by generating a repeated START or a STOP condition; since this is done in software, all you have to do is nothing for 1.5 ms. Thereafter you can generate a START with the device's actual address, and if the device is awake it will generate an ACK.
I am not familiar with the HAL library driver, and frankly the documentation is abysmal, but it is possible that it does not give you the necessary control, in which case you will have to access the I2C peripheral at the register level for at least this procedure. You might try a zero-length I2C_MasterRequestWrite() call to address zero followed by a delay. An oscilloscope would be useful here to verify that the expected signal timing is actually being generated.
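For illustration, here is a minimal sketch of that idea on top of HAL. The handle name hi2c1 and the 100 kHz bus clock are assumptions from a typical CubeMX project, not from the question:

/* Sketch: send a dummy write to address 0x00. At 100 kHz the all-zero
 * address byte alone holds SDA low for roughly 80-90 us, which satisfies
 * the >60 us wake pulse. The device will not ACK, so the call returns an
 * error, which we deliberately ignore. */
#include "stm32f4xx_hal.h"      /* adjust to your device family */

extern I2C_HandleTypeDef hi2c1; /* handle generated by CubeMX */

void security_chip_wake(void)
{
    uint8_t dummy = 0x00;
    (void)HAL_I2C_Master_Transmit(&hi2c1, 0x00, &dummy, 1, 10);
    HAL_Delay(2);               /* wait out the 1500 us wake delay */
    /* the device should now ACK its real address */
}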

When you initialize I2C, the GPIO pins are switched to alternate-function mode, so HAL GPIO write commands no longer have any effect on them.
The normal HAL calls won't help you here: either reconfigure the pin as a plain GPIO for the duration of the wake pulse, or drive the I2C peripheral yourself at the register level.
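As a sketch of the first option (assuming the PB9 = I2C1 SDA pinout from the question and an F4-family HAL; adjust the family header and alternate-function constant to your part):

/* Temporarily reclaim SDA as an open-drain GPIO, hold it low for the
 * wake pulse, then hand the pin back to the I2C peripheral. */
#include "stm32f4xx_hal.h"

void wake_via_gpio(void)
{
    GPIO_InitTypeDef gpio = {0};

    /* Take SDA away from the peripheral: open-drain output, driven low. */
    gpio.Pin   = GPIO_PIN_9;
    gpio.Mode  = GPIO_MODE_OUTPUT_OD;
    gpio.Pull  = GPIO_NOPULL;
    gpio.Speed = GPIO_SPEED_FREQ_LOW;
    HAL_GPIO_Init(GPIOB, &gpio);
    HAL_GPIO_WritePin(GPIOB, GPIO_PIN_9, GPIO_PIN_RESET);

    HAL_Delay(1);                   /* >= 60 us low (1 ms is plenty) */

    /* Release the line and give the pin back to I2C1. */
    HAL_GPIO_WritePin(GPIOB, GPIO_PIN_9, GPIO_PIN_SET);
    gpio.Mode      = GPIO_MODE_AF_OD;
    gpio.Alternate = GPIO_AF4_I2C1;
    HAL_GPIO_Init(GPIOB, &gpio);

    HAL_Delay(2);                   /* 1500 us wake delay before commands */
}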

I recommend setting the own address of the slave device on the I2C channel in use, as in the code below:
I2C_InitStructure.I2C_OwnAddress1 = 0x30; // the unique slave address of the device
because otherwise the master could be performing a broadcast (general call) operation rather than addressing the device uniquely.
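For context, a sketch of where this line sits in a full StdPeriph-style initialization (all other values here are illustrative assumptions):

#include "stm32f10x_i2c.h"   /* adjust to your device family */

void i2c1_slave_init(void)
{
    I2C_InitTypeDef I2C_InitStructure;

    I2C_StructInit(&I2C_InitStructure);
    I2C_InitStructure.I2C_Mode                = I2C_Mode_I2C;
    I2C_InitStructure.I2C_DutyCycle           = I2C_DutyCycle_2;
    I2C_InitStructure.I2C_ClockSpeed          = 100000;
    I2C_InitStructure.I2C_OwnAddress1         = 0x30;  /* unique slave address */
    I2C_InitStructure.I2C_Ack                 = I2C_Ack_Enable;
    I2C_InitStructure.I2C_AcknowledgedAddress = I2C_AcknowledgedAddress_7bit;

    I2C_Init(I2C1, &I2C_InitStructure);
    I2C_Cmd(I2C1, ENABLE);
}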

Related

Raspberry Pi - UART - disable TX and enable RX in program C

I need to connect my Raspberry Pi 4 Model B to a servo via UART, but only a single wire can be used. That means I must connect the TX and RX pins together, and to do so I need a way to manually disable only TX or RX from my C program.
I can easily disable RX thanks to the termios.h library, but I haven't found any way to disable TX.
I tried to disable it with
tcflow(fd_myUART, TCOOFF); // it should suspend output
But that didn't work, so I thought that changing the TX pin to an input might switch it from UART to GPIO, but that didn't work either.
Is there a way to do this?
First of all, just "randomly" connecting both wires is a bad idea.
The image below shows how to do this better for a prototype.
Slave devices are able to pull the IO line low during a read bit or a reset while the TX signal is high.
When used in this configuration, you need to disable neither RX nor TX; you can use "normal" UART operation.
More information can be found in Maxim Integrated tutorial 214, "Using a UART to Implement a 1-Wire Bus Master".
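A rough sketch of that technique on Linux (assuming the port is already open in raw mode with blocking reads; device path and error handling omitted):

#include <stdint.h>
#include <unistd.h>
#include <termios.h>

static void set_baud(int fd, speed_t baud)
{
    struct termios tio;
    tcgetattr(fd, &tio);
    cfsetispeed(&tio, baud);
    cfsetospeed(&tio, baud);
    tcsetattr(fd, TCSADRAIN, &tio);
}

/* 1-Wire reset: at 9600 baud, one 0xF0 byte forms the reset pulse.
 * Because TX and RX are tied together, RX echoes what TX sent; a
 * present slave pulls extra bits low, so the echo differs from 0xF0. */
int onewire_reset(int fd)
{
    uint8_t tx = 0xF0, rx = 0;
    set_baud(fd, B9600);
    write(fd, &tx, 1);
    read(fd, &rx, 1);
    set_baud(fd, B115200);   /* data bits are then done at 115200 */
    return rx != 0xF0;       /* 1 = presence pulse detected */
}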
Since you will have a lot of connected slaves, you should consider using a dedicated chip:
I use a DS2482S-100 over I2C.

Implementing an SSI slave interface on STM32 Board

I am trying to implement an SSI slave protocol on an STM32 board. Since the STM32 boards don't have an SSI interface, I used the SPI interface in slave (transmit-only) mode. The SSI master sends 24 clock signals and the slave reacts by sending its data (3 bytes) over the MISO pin. The problem I am facing is that the data is shifted to the left by one bit with every transmission from the master. For example, assuming I am constantly sending 0x010101 from the slave:
At the first transmission the master receives 0x010101
At the second transmission the master receives 0x020202
At the third transmission the master receives 0x040404
Can someone please give me some hints on how to solve this problem?
The data-shift with each transmission can happen when the SPI slave recognizes an (unexpected) additional clock pulse. Looking at the SSI protocol description on Wikipedia this actually makes sense:
In order to transmit N bits of data the master emits N clock cycles, followed by another clock pulse to signal the end of the transfer (so-called "Monoflop Time" - referring to the original hardware implementation of the SSI interface). Since the SPI protocol / SPI slave does not know about this additional clock pulse, it begins to output the first bit of the next data byte, which is in turn not recognized by the SSI master. As a result this leads to a shift in the data bits recognized by the SSI master on the next SSI frame.
Unfortunately, it is not easy to handle the monoflop time correctly with the SPI slave. In order to deal with the additional clock pulse, we could try to set the SPI frame size to 25 bits on the slave side. Since the STM32 hardware only supports SPI frame sizes between 4 and 16 bits, the only choice is to set it to 5 bits. This is not very convenient, since we need to convert the 3-byte (24-bit) output data into 5 blocks of 5 bits (24 bits of output data + 1 dummy bit), but it should work for a "normal" transfer.
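The repacking step itself is simple; a small sketch in plain C (MSB-first, matching the order SSI shifts data out):

#include <stdint.h>

/* Split a 24-bit value plus one trailing dummy bit into five 5-bit
 * SPI frames, most significant group first (25 bits in total). */
void pack_25bit(uint32_t value24, uint16_t frames[5])
{
    uint32_t shifted = value24 << 1;   /* 24 data bits + 1 pad bit */
    for (int i = 0; i < 5; i++)
        frames[i] = (shifted >> (5 * (4 - i))) & 0x1Fu;
}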
Things get more complicated, though, if we also want to handle the cases "multiple transmissions" and "interrupted transmission" correctly. We need to monitor the clock signal to be able to detect the monoflop timeout. This can be done using an STM32 hardware timer with an external trigger. When the timer expires, we need to reset the SPI unit (in order to recover from an interrupted transmission) and update the output value. This "simple" task can be quite challenging, since it takes a fair number of instructions - and hence a fast MCU, depending on the SSI clock frequency.
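As a bare-bones sketch of the reset step (generic STM32 register names; configuring TIM2 so that SSI clock edges keep restarting it is assumed to happen elsewhere):

#include "stm32f4xx.h"   /* adjust to your device family */

/* Fires when no SSI clock edge has arrived for the monoflop time:
 * force-reset SPI1 via RCC so an interrupted frame cannot leave the
 * shift register misaligned, then prepare the next frame. */
void TIM2_IRQHandler(void)
{
    if (TIM2->SR & TIM_SR_UIF) {
        TIM2->SR = ~TIM_SR_UIF;                   /* acknowledge update event */
        RCC->APB2RSTR |=  RCC_APB2RSTR_SPI1RST;   /* hold SPI1 in reset */
        RCC->APB2RSTR &= ~RCC_APB2RSTR_SPI1RST;   /* release it again */
        /* re-initialize SPI1 and preload the next 25-bit frame here */
    }
}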
Alternatively the SSI protocol can be implemented using a software-only "bit banging" solution. But this requires a fast MCU as well in order to handle a fast SSI clock correctly.
IMHO the best solution is to use a small (inexpensive) FPGA to implement the SSI slave and let the MCU feed it with data over a traditional SPI interface.

Changing hardware flow control pins on STM32

I have been reading up on handshaking and hardware flow control for serial communication and I have a question that I can't seem to find an answer to.
If you set up hardware flow control for a serial port in CubeMX, it will configure the required pins. I know you can use the alternate pins as well, and this can also be done through CubeMX.
My question is, could you set up hardware flow control manually by using different pins or do you strictly have to use the implemented pins?
I am using an STM32F207ZETx with both USB and serial - however, the USB peripheral blocks the hardware flow control pins for USART1, and I need hardware flow control for my project! The alternate pins for hardware flow control are also already used, so I'm in a bit of a pickle.
My question is, could you set up hardware flow control manually by using different pins or do you strictly have to use the implemented pins?
You can do hardware flow control yourself in software, and in fact it is quite simple to do.
USART1_RTS is an output pin. It is driven high when USART1 is ready to receive data. USART1_CTS is an input pin: the other end sets it high when it is ready to receive data, and low when it is not.
Let's say that you send and receive one character at a time, and use two GPIO pins for USART1_RTS and USART1_CTS instead of the hardware support.
Normally, you keep USART1_RTS high. When receiving data, if you are running out of receive buffer, you set USART1_RTS low. When you make more room in the receive buffer, you set USART1_RTS high. (If you have a buffering scheme that cannot run out of receive buffer, you can tie the RTS line high.)
Before sending each character, you check if USART1_CTS is high. If it is low, you don't send the data, but wait for USART1_CTS to become high before you do.
That's it.
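A minimal sketch of this scheme on top of HAL (the two GPIO assignments and the handle name are hypothetical; pick any two free pins):

#include "stm32f2xx_hal.h"

extern UART_HandleTypeDef huart1;

#define SOFT_RTS_PORT GPIOA          /* output: we are ready to receive */
#define SOFT_RTS_PIN  GPIO_PIN_1
#define SOFT_CTS_PORT GPIOA          /* input: peer is ready to receive */
#define SOFT_CTS_PIN  GPIO_PIN_2

/* Call as the receive buffer fills up / drains. */
void soft_flow_set_ready(int ready)
{
    HAL_GPIO_WritePin(SOFT_RTS_PORT, SOFT_RTS_PIN,
                      ready ? GPIO_PIN_SET : GPIO_PIN_RESET);
}

/* Send one byte, honouring the peer's flow control line. */
void soft_flow_send(uint8_t b)
{
    while (HAL_GPIO_ReadPin(SOFT_CTS_PORT, SOFT_CTS_PIN) == GPIO_PIN_RESET)
        ;                            /* wait until the peer raises its RTS */
    HAL_UART_Transmit(&huart1, &b, 1, HAL_MAX_DELAY);
}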

Create virtual uart on stm32 microcontroller

I need to create a virtual UART port on an STM32 microcontroller. The pins are fixed, but they can be changed to timer input channels. The received signal is going to be modulated in current and voltage, and I need to detect both. These two pins cannot be assigned to a UART. Does somebody have a tutorial or something that can lead me in the right direction? I just started programming microcontrollers and I am still struggling with all the timer, interrupt, and detail stuff.
If we are talking about small baud rates (such as 9600), then you can achieve this with a timer and EXTI.
Configure the EXTI pin for both rising and falling edges, and check the timer value between consecutive interrupts.
If the value is longer than a start or stop condition allows, the frame has failed; otherwise, use the time between EXTI interrupts to work out whether you received 10101010, 11001100, or any other combination.
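A rough sketch of the RX side (the timer handle, tick rate, and pin are placeholders for a CubeMX setup): the time since the previous edge, divided by the bit period, says how many identical bits just ended on the line.

#include <stdint.h>
#include "stm32f4xx_hal.h"

extern TIM_HandleTypeDef htim2;   /* free-running timer */
#define TICKS_PER_BIT 208u        /* e.g. 2 MHz timer clock / 9600 baud */

static uint32_t last_edge;
static uint16_t shifter;          /* start + 8 data + stop, LSB first */
static uint8_t  nbits;

void HAL_GPIO_EXTI_Callback(uint16_t GPIO_Pin)
{
    uint32_t now = __HAL_TIM_GET_COUNTER(&htim2);
    uint32_t n   = (now - last_edge + TICKS_PER_BIT / 2u) / TICKS_PER_BIT;
    last_edge = now;

    /* The level that just ended is the inverse of the current level. */
    uint16_t level = HAL_GPIO_ReadPin(GPIOA, GPIO_Pin) ? 0u : 1u;

    while (n-- && nbits < 10u) {  /* shift the run of bits in, LSB first */
        shifter = (uint16_t)((shifter >> 1) | (level << 9));
        nbits++;
    }
    if (nbits == 10u) {           /* complete frame received */
        uint8_t byte = (uint8_t)((shifter >> 1) & 0xFFu);
        nbits = 0u;
        (void)byte;               /* hand the byte to the application */
    }
}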
For TX mode, use a timer to generate precise interrupts at each bit period, and create a state machine that shifts the UART output data out bit by bit.
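A matching sketch of the TX state machine (pin and timer wiring again hypothetical; the tick function is meant to be called from the timer's period-elapsed interrupt):

#include <stdint.h>
#include "stm32f4xx_hal.h"

static volatile uint16_t tx_frame;   /* start(0) + 8 data bits + stop(1) */
static volatile uint8_t  tx_bits;

void soft_uart_send(uint8_t byte)
{
    while (tx_bits)                  /* wait for the previous frame */
        ;
    tx_frame = (uint16_t)(0x200u | ((uint16_t)byte << 1)); /* LSB = start bit */
    tx_bits  = 10;
}

/* Call once per bit period from the timer interrupt. */
void soft_uart_tx_tick(void)
{
    if (tx_bits) {
        HAL_GPIO_WritePin(GPIOA, GPIO_PIN_0,
                          (tx_frame & 1u) ? GPIO_PIN_SET : GPIO_PIN_RESET);
        tx_frame >>= 1;
        tx_bits--;
    }
}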
Another option is to use SPI as a virtual UART.

i2c transfer from gpio int handler fails on imx6sx cortex m4 side

I'm experiencing something that has bugged me for days. I am working on the i.MX6SX Cortex-M4 side, and I have a sensor connected to one of the I2C buses. The sensor is set up with data-ready on INT1, which is connected to one of the GPIOs of the MCU. After boot, I configure the sensor so that it outputs a data-ready interrupt. Note that the I2C also works in interrupt mode, so if I try to read the sensor when the data-ready line is asserted, I have to wait in the GPIO interrupt handler until the I2C transfer is complete in order to get another data-ready interrupt, and so on.
My problem is that I don't want to wait in the GPIO interrupt handler until the I2C transfer is complete - that's why I put the I2C on interrupts too. But if I don't wait in the GPIO handler, something happens to the I2C: the sensor does not ACK the transfer, so I am not getting any further data-ready interrupts.
Please help if you have any idea what could be wrong. The I2C bus interrupt has a higher priority than the GPIO interrupt, and unfortunately I can't use a debugger for this, only the old-fashioned way: printfs on the console.
Thanks
You could use INT1 to trigger a lower-priority software interrupt that handles the I2C, then exit, freeing the GPIO interrupt.
Consider using a RTOS to manage this for you.
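A minimal CMSIS-style sketch of that idea (the device header, the spare IRQ slot, and its handler name are placeholders; use whatever unused vector your i.MX6SX M4 port provides):

#include "MCIMX6X_M4.h"   /* placeholder: device header with the CMSIS NVIC API */

#define SENSOR_SW_IRQn  Reserved88_IRQn   /* placeholder: any unused IRQ slot */

void sensor_deferred_init(void)
{
    NVIC_SetPriority(SENSOR_SW_IRQn, 3);  /* lower priority than the I2C IRQ */
    NVIC_EnableIRQ(SENSOR_SW_IRQn);
}

void gpio_int1_handler(void)              /* sensor data-ready interrupt */
{
    /* ...clear the GPIO interrupt flag here... */
    NVIC_SetPendingIRQ(SENSOR_SW_IRQn);   /* defer the real work and return */
}

/* Must match the vector-table entry of the chosen IRQ slot. */
void Reserved88_IRQHandler(void)          /* runs once the GPIO IRQ has exited */
{
    /* start the non-blocking, interrupt-driven I2C read here; the
       higher-priority I2C interrupts can now preempt this handler freely */
}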
