RTC not acknowledging slave address - timer

I'm writing a driver for the M41T83 RTC IC on an AT91SAM7X (ARM7-based) MCU.
I'm trying to start communication with it over TWI (fast mode), but I get no response from the RTC (always a NACK).
I don't know what might cause the problem. Attached are some oscilloscope captures of the clock and data lines: I send the RTC address 0x68 with the write bit after the START condition, but receive a NACK.
I tried ST's forum, but in vain.

Related

Implementing an SSI slave interface on an STM32 board

I am trying to implement an SSI slave protocol on an STM32 board. Since STM32 boards don't have an SSI interface, I used the SPI interface in slave (transmit-only) mode. The SSI master sends 24 clock pulses and the slave reacts by sending its data (3 bytes) over the MISO pin. The problem I am facing is that the data is shifted one bit to the left on every transmission. For example, assuming I am constantly sending 0x010101 from the slave:
At the first transmission the master receives 0x010101
At the second transmission the master receives 0x020202
At the third transmission the master receives 0x040404
Can someone please give me some hints on how to solve this problem?
The data-shift with each transmission can happen when the SPI slave recognizes an (unexpected) additional clock pulse. Looking at the SSI protocol description on Wikipedia this actually makes sense:
In order to transmit N bits of data the master emits N clock cycles, followed by another clock pulse to signal the end of the transfer (so-called "Monoflop Time" - referring to the original hardware implementation of the SSI interface). Since the SPI protocol / SPI slave does not know about this additional clock pulse, it begins to output the first bit of the next data byte, which is in turn not recognized by the SSI master. As a result this leads to a shift in the data bits recognized by the SSI master on the next SSI frame.
Unfortunately, it is not easy to handle the monoflop time correctly with an SPI slave. In order to deal with the additional clock pulse, we could try to set the SPI frame size to 25 bits on the slave side. Since the STM32 hardware only supports SPI frame sizes between 4 and 16 bits, the only choice is to set it to 5 bits. This is not very convenient, since we need to convert the 3-byte (24-bit) output into 5 blocks of 5 bits (24 data bits + 1 dummy bit), but it should work for a "normal" transfer.
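For illustration, a minimal sketch of that 5-bit packing, assuming a Cube-HAL STM32 whose SPI supports 4- to 16-bit frames (SPI_DATASIZE_5BIT exists e.g. in the F0/F3/L4 HALs, with hspi1.Init.DataSize set to it at init); hspi1 and ssi_load are hypothetical names, not a definitive implementation:

#include "stm32l4xx_hal.h"        /* pick the HAL header for your device */

extern SPI_HandleTypeDef hspi1;   /* slave, TX-only, SPI_DATASIZE_5BIT   */

void ssi_load(uint32_t value24)
{
    /* Buffer must outlive the interrupt-driven transfer, hence static. */
    static uint8_t frames[5];

    /* 25 bits total: 24 data bits plus one trailing dummy bit that
       absorbs the SSI "monoflop" clock pulse. Bit 0 is the dummy. */
    uint32_t shifted = value24 << 1;

    for (int i = 0; i < 5; i++) {
        /* MSB first: frame 0 carries bits 24..20, frame 4 bits 4..0 */
        frames[i] = (uint8_t)((shifted >> (20 - 5 * i)) & 0x1Fu);
    }

    /* Transmit-only slave: the master's clock shifts the frames out */
    HAL_SPI_Transmit_IT(&hspi1, frames, 5);
}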
Things get more complicated, though, if we also want to handle the cases of multiple transmissions and an interrupted transmission correctly. We need to monitor the clock signal to be able to detect the monoflop timeout. This can be done using an STM32 hardware timer with an external trigger. When the timer expires, we need to reset the SPI unit (in order to recover from an interrupted transmission) and update the output value. This "simple" task can be quite challenging, since it takes a fair number of instructions and therefore a fast MCU, depending on the SSI clock frequency.
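A rough sketch of that timer-expiry handling, assuming a Cube-HAL target, a hypothetical htim2 configured to expire when the SSI clock goes idle, and the ssi_load()/next_value names alongside the sketch above:

extern TIM_HandleTypeDef htim2;     /* hypothetical: expires on clock-idle */
extern SPI_HandleTypeDef hspi1;
extern volatile uint32_t next_value;
void ssi_load(uint32_t value24);

void TIM2_IRQHandler(void)
{
    if (__HAL_TIM_GET_FLAG(&htim2, TIM_FLAG_UPDATE) != RESET) {
        __HAL_TIM_CLEAR_FLAG(&htim2, TIM_FLAG_UPDATE);

        /* Hard-reset SPI1 so an interrupted frame cannot leave the
           shift register misaligned, then restore the 5-bit config. */
        __HAL_RCC_SPI1_FORCE_RESET();
        __HAL_RCC_SPI1_RELEASE_RESET();
        HAL_SPI_Init(&hspi1);

        ssi_load(next_value);       /* queue fresh data for the next frame */
    }
}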
Alternatively the SSI protocol can be implemented using a software-only "bit banging" solution. But this requires a fast MCU as well in order to handle a fast SSI clock correctly.
IMHO the best solution is to use a small (inexpensive) FPGA to implement the SSI slave and let the MCU feed it with data over a traditional SPI interface.

STM32 I2C set SDA to low

Is there any way to drive the SDA and SCL pins of the I2C1 peripheral on the STM32 low or high?
I use a security chip and I have to send a wake condition, specified as follows:
if SDA is held low for a period of greater than 60us, the device will exit low power mode and
after a delay of 1500us, it will be ready to receive I2C commands.
I've already tried to toggle the actual pin with HAL_GPIO_TogglePin(GPIOB, GPIO_PIN_9);, but this isn't working.
I've configured my project with STM32CubeMX.
Thanks for your help.
In I2C, the START condition requires a high-to-low transition on SDA while SCL is high. If you then send a dummy address 0, a NACK will be generated (or rather, the lack of any response will be interpreted as a NACK). In a normal transaction the software would respond to the NACK by generating a repeated START or a STOP condition; since this is done in software, all you have to do instead is nothing for 1.5 ms. Thereafter you can generate a START with the device's actual address, and if the device is awake it will generate an ACK.
I am not familiar with the HAL library driver, and frankly the documentation is abysmal, but it is possible that it does not give you the necessary control, and you will have to access the I2C peripheral at the register level for at least this procedure. You might try a zero-length I2C_MasterRequestWrite() call to address zero followed by a delay. An oscilloscope would be useful here to ensure the expected signal timing is being generated.
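For illustration, one shape that procedure might take with the Cube HAL, using HAL_I2C_IsDeviceReady() to generate the START plus dummy address and tolerating the NACK; hi2c1 and DEV_ADDR are assumptions, not from the question:

#include "stm32f4xx_hal.h"      /* pick the header for your device */

extern I2C_HandleTypeDef hi2c1;
#define DEV_ADDR (0x60u << 1)   /* hypothetical 7-bit address, shifted */

void i2c_wake_pulse(void)
{
    /* START + address 0x00: nothing ACKs, but at 100 kHz the eight zero
       address bits hold SDA low for roughly 80 us -- the wake pulse.
       The returned error is expected and ignored. */
    (void)HAL_I2C_IsDeviceReady(&hi2c1, 0x00, 1, 2);

    HAL_Delay(2);               /* >= 1500 us before the chip is ready */

    /* The real transaction should now be ACKed */
    (void)HAL_I2C_IsDeviceReady(&hi2c1, DEV_ADDR, 1, 2);
}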
When you initialize I2C, the GPIO pins are switched to alternate-function mode, so HAL GPIO writes no longer affect them.
The plain HAL GPIO calls won't help you here; you have to take the pin back and drive it yourself at the GPIO/register level, for example as sketched below.
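A sketch of that idea (not a drop-in solution): temporarily reclaim the SDA pin as a plain GPIO, hold it low, then hand it back to the peripheral. PB9 comes from the question; everything else is assumed Cube-HAL boilerplate, and HAL_Delay is coarse (ms resolution), so a microsecond timer would give tighter timing:

#include "stm32f4xx_hal.h"      /* pick the header for your device */

extern I2C_HandleTypeDef hi2c1;

void gpio_wake_pulse(void)
{
    GPIO_InitTypeDef gpio = {0};

    /* Reclaim PB9 (SDA in the question) from the I2C alternate
       function and drive it directly as an open-drain output. */
    gpio.Pin   = GPIO_PIN_9;
    gpio.Mode  = GPIO_MODE_OUTPUT_OD;
    gpio.Pull  = GPIO_NOPULL;
    gpio.Speed = GPIO_SPEED_FREQ_LOW;
    HAL_GPIO_Init(GPIOB, &gpio);

    HAL_GPIO_WritePin(GPIOB, GPIO_PIN_9, GPIO_PIN_RESET);  /* SDA low */
    HAL_Delay(1);                           /* coarse, >= 60 us       */
    HAL_GPIO_WritePin(GPIOB, GPIO_PIN_9, GPIO_PIN_SET);    /* release */
    HAL_Delay(2);                           /* >= 1500 us wake delay  */

    /* Re-init I2C1; its MspInit gives the pin back to the peripheral */
    HAL_I2C_DeInit(&hi2c1);
    HAL_I2C_Init(&hi2c1);
}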
I recommend setting the slave's own address for the I2C channel the device uses, as in the code below,
I2C_InitStructure.I2C_OwnAddress1 = 0x30; // the unique slave address of the device
because the master could be sending a broadcast (general-call) operation rather than addressing the device's unique address.

Create virtual uart on stm32 microcontroller

I need to create a virtual UART port on an STM32 microcontroller. The pins are fixed but can be remapped to timer input channels. The received signal is going to be modulated in current and voltage, and I need to detect both. Those two pins cannot be assigned to a UART. Does somebody have a tutorial or something that can lead me in the right direction? I just started programming microcontrollers and I am still struggling with all the timer, interrupt, and configuration details.
If we are talking about small baud rates (e.g. 9600), then you can achieve this with a timer and EXTI.
Configure the EXTI pin to trigger on both rising and falling edges, and check the timer value between successive IRQs.
If the value is greater than the start/stop-condition time, the frame failed; otherwise, from the time elapsed between EXTI interrupts you can calculate whether you received 10101010 or 11001100 or any other combination.
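A rough sketch of that edge-timing idea, assuming a Cube-HAL STM32 with a free-running (ideally 32-bit) timer htim2 and a hypothetical BIT_TICKS ticks-per-bit constant:

#include "stm32f4xx_hal.h"      /* pick the header for your device */

#define BIT_TICKS 100u          /* assumed timer ticks per UART bit */

extern TIM_HandleTypeDef htim2; /* hypothetical free-running timer  */
static uint32_t last_edge;

void HAL_GPIO_EXTI_Callback(uint16_t pin)
{
    uint32_t now   = __HAL_TIM_GET_COUNTER(&htim2);
    uint32_t ticks = now - last_edge;   /* unsigned wrap is fine on a
                                           32-bit timer such as TIM2  */
    last_edge = now;

    /* Round to whole bit periods: that many copies of the previous
       line level belong to the byte currently being assembled. */
    uint32_t nbits = (ticks + BIT_TICKS / 2u) / BIT_TICKS;

    /* Feed nbits into a framing state machine (start bit, 8 data
       bits, stop bit) -- omitted here. */
    (void)nbits;
    (void)pin;
}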
For TX mode, use a timer generating precise interrupts at the bit period for the UART output, and create a state machine that shifts the data out bit by bit.
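And a matching TX sketch (8N1 framing), assuming the bit-rate timer's interrupt calls softuart_tx_tick() and the TX line is on a hypothetical PA0:

#include "stm32f4xx_hal.h"          /* pick the header for your device */

static volatile uint16_t tx_frame;  /* start + 8 data + stop bits */
static volatile int      tx_bits;   /* bits left to send          */

void softuart_send(uint8_t byte)
{
    /* 8N1 frame: bit0 = start (0), bits 1..8 = data LSB first,
       bit9 = stop (1). The line idles high between frames. */
    tx_frame = (uint16_t)((uint16_t)byte << 1) | (1u << 9);
    tx_bits  = 10;
}

/* Call this from the bit-rate timer's period-elapsed interrupt */
void softuart_tx_tick(void)
{
    if (tx_bits > 0) {
        HAL_GPIO_WritePin(GPIOA, GPIO_PIN_0,   /* TX pin: assumed PA0 */
                          (tx_frame & 1u) ? GPIO_PIN_SET : GPIO_PIN_RESET);
        tx_frame >>= 1;
        tx_bits--;
    }
}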
Another option is to use SPI as a virtual UART.

i2c transfer from gpio int handler fails on imx6sx cortex m4 side

I'm experiencing something that has bugged me for days. I am working on the i.MX6SX Cortex-M4 side; I have a sensor connected to one of the I2C buses, set up with data-ready on INT1, which is connected to one of the MCU's GPIOs. After boot I configure the sensor so that it outputs a data-ready interrupt. Note that the I2C also works in interrupt mode, so if I read the sensor when the data-ready line is asserted, I have to wait in the GPIO interrupt handler until the I2C transfer is complete in order to get the next data-ready interrupt, and so on.
My problem is that I don't want to wait in the GPIO interrupt handler until the I2C transfer is complete; that's why I made the I2C interrupt-driven too. But if I don't wait in the GPIO handler, something happens to the I2C: the sensor does not ACK the transfer, so I'm not getting further data-ready interrupts.
Please help if you have any idea what could be wrong. The I2C bus interrupt already has a higher priority than the GPIO interrupt, and unfortunately I can't use a debugger, only the old-fashioned way: printfs to the console.
Thanks
You could use INT1 to trigger a lower-priority software interrupt that handles the I2C, then exit, freeing the GPIO interrupt.
Consider using an RTOS to manage this for you.
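A sketch of the software-interrupt approach, assuming CMSIS on the Cortex-M4; SW_IRQn stands for whatever interrupt channel your design leaves unused, and all names here are hypothetical:

#include "MCIMX6X_M4.h"           /* assumed NXP device header (CMSIS) */

#define SW_IRQn ((IRQn_Type)42)   /* hypothetical: pick a channel that
                                     is unused on your design          */

void deferred_irq_init(void)
{
    NVIC_SetPriority(SW_IRQn, 10);  /* numerically higher = lower prio
                                       than the GPIO and I2C lines     */
    NVIC_EnableIRQ(SW_IRQn);
}

void gpio_int1_handler(void)        /* the data-ready (INT1) ISR */
{
    /* ... clear the GPIO interrupt flag here ... */
    NVIC_SetPendingIRQ(SW_IRQn);    /* defer the I2C work and return */
}

void sw_irq_handler(void)           /* hook at SW_IRQn's vector slot */
{
    /* start the interrupt-driven I2C read of the sensor here */
}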

configuring UART as FIQ

I am working on an LPC2468 and using UART0 of the controller for communication with a SIM300 GPRS module. Sometimes when I send a command to read the SIM's signal strength, the response I receive is not correct. Looking into the problem, I found that sometimes, while the UART is receiving data, the timer interrupt fires and the software jumps to the timer handler; during that time some bytes sent by the module are missed. To prevent this I want to configure UART0 as FIQ, i.e. the interrupt with the highest priority. Can I configure UART0 as FIQ? If yes, how?
From the LPC2468 data sheet:
The ARM processor core has two interrupt inputs called Interrupt ReQuest (IRQ) and Fast Interrupt ReQuest (FIQ). The VIC takes 32 interrupt request inputs which can be programmed as FIQ or vectored IRQ types. The programmable assignment scheme means that priorities of interrupts from the various peripherals can be dynamically assigned and adjusted.
So you need to find the programmable registers of the interrupt controller (VIC) and change the interrupt type of UART0 to FIQ, as sketched below.
If you have simulation support, use it to verify how the interrupt types and priorities change.
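For a rough idea of what that looks like, assuming the usual NXP register names for the LPC24xx family and UART0 on VIC channel 6 (do verify the channel number against the LPC2468 user manual):

#include "lpc24xx.h"              /* assumed NXP/Keil register header */

#define UART0_VIC_CHANNEL 6       /* verify against the LPC2468 user
                                     manual's VIC channel table       */

void uart0_as_fiq(void)
{
    VICIntSelect |= (1u << UART0_VIC_CHANNEL);  /* 1 = FIQ, 0 = IRQ */
    VICIntEnable  = (1u << UART0_VIC_CHANNEL);  /* enable channel   */
    /* FIQ sources are not vectored through the VIC: the FIQ handler
       in your startup code must read U0IIR/U0RBR itself. */
}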
