I have visited some links and looked at some example programs for I2C programming. I want to write my own code for the I2C protocol. Suppose a DS1307 RTC and an LCD are connected to an 8051. I am using Keil to write a C program. It's very difficult for me to write the whole I2C program at once, so I tried to break the program into small parts:
Module 1: define and set pins for LCD and DS1307 RTC
Module 2: write C code for DS1307 (make functions for DS1307 such as read, write)
Module 3: write C code for LCD (data, command, initialize, etc.)
Module 4: main function
I understand module 1, but I am looking for help to understand module 2. So again I want to break module 2 into small parts.
How should I break module 2 into small parts for easy understanding? How many functions should be in module 2?
Module 2 is essentially an I2C driver that bit-bangs two 8051 port pins. The I2C protocol follows a fixed sequence: every transfer begins with a START condition and ends with a STOP condition, communication is always initiated by the master, and each slave has an address. You can write one small function per step of that sequence. So in module 2 you will write all of the functions used below.
For example, the I2C read sequence will be the following:
I2C_Start();                    // generate START condition
I2C_Send(slave_address | 1);    // send slave address in read mode (R/W bit = 1)
I2C_read_ACK();                 // check that the slave acknowledged its address
while (number_of_bytes > 1) {
    I2C_Read();                 // read one data byte
    I2C_send_ACK();             // master ACKs: more bytes will follow
    number_of_bytes--;
}
I2C_Read();                     // read the last byte
I2C_NAK();                      // master NAKs so the slave does not prepare the next byte
I2C_Stop();                     // stop communication, release the bus
Again, a write to the slave will have the steps below:
I2C_Start();                    // generate START condition
I2C_Send(slave_address);        // send slave address in write mode (R/W bit = 0)
I2C_read_ACK();                 // check that the slave acknowledged its address
while (number_of_bytes) {
    I2C_Write();                // write one data byte
    I2C_read_ACK();             // check that the slave acknowledged the byte
    number_of_bytes--;
}
I2C_Stop();                     // stop communication, release the bus
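For reference, here is a minimal sketch of what those primitives might look like when bit-banged in Keil C51. The pin assignment (P1.0/P1.1) and the delay loop are assumptions for illustration only, not the only possible decomposition; I2C_Write for data bytes can simply reuse I2C_Send.

#include <reg51.h>

/* Hypothetical pin assignment; change to match your wiring. */
sbit SDA = P1^0;
sbit SCL = P1^1;

static void i2c_delay(void)            /* crude few-microsecond delay; tune for your crystal */
{
    unsigned char i;
    for (i = 0; i < 5; i++);
}

void I2C_Start(void)                   /* START: SDA falls while SCL is high */
{
    SDA = 1; SCL = 1; i2c_delay();
    SDA = 0; i2c_delay();
    SCL = 0;
}

void I2C_Stop(void)                    /* STOP: SDA rises while SCL is high */
{
    SDA = 0; SCL = 1; i2c_delay();
    SDA = 1; i2c_delay();
}

void I2C_Send(unsigned char dat)       /* shift one byte out, MSB first (also usable as I2C_Write) */
{
    unsigned char i;
    for (i = 0; i < 8; i++) {
        SDA = (dat & 0x80) ? 1 : 0;
        dat <<= 1;
        SCL = 1; i2c_delay();
        SCL = 0;
    }
}

bit I2C_read_ACK(void)                 /* clock the 9th bit; returns 0 if the slave pulled SDA low (ACK) */
{
    bit nack;
    SDA = 1;                           /* release SDA so the slave can drive it */
    SCL = 1; i2c_delay();
    nack = SDA;
    SCL = 0;
    return nack;
}

unsigned char I2C_Read(void)           /* shift one byte in, MSB first */
{
    unsigned char i, dat = 0;
    SDA = 1;                           /* release SDA for the slave */
    for (i = 0; i < 8; i++) {
        SCL = 1; i2c_delay();
        dat = (dat << 1) | SDA;
        SCL = 0; i2c_delay();
    }
    return dat;
}

void I2C_send_ACK(void)                /* master ACK: more bytes will follow */
{
    SDA = 0; SCL = 1; i2c_delay();
    SCL = 0; SDA = 1;
}

void I2C_NAK(void)                     /* master NACK: last byte has been read */
{
    SDA = 1; SCL = 1; i2c_delay();
    SCL = 0;
}

Using these, a DS1307 register read (write the register pointer, repeated START, then read one byte; 0xD0/0xD1 are the DS1307 write/read addresses) could look like:

unsigned char DS1307_ReadRegister(unsigned char reg)
{
    unsigned char val;
    I2C_Start();
    I2C_Send(0xD0);        /* slave address, write mode */
    I2C_read_ACK();
    I2C_Send(reg);         /* register pointer */
    I2C_read_ACK();
    I2C_Start();           /* repeated START */
    I2C_Send(0xD1);        /* slave address, read mode */
    I2C_read_ACK();
    val = I2C_Read();
    I2C_NAK();             /* last byte: NACK so the slave stops */
    I2C_Stop();
    return val;
}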
I also see a driver at
https://circuitdigest.com/microcontroller-projects/digital-clock-using-8051-microcontroller
The official I2C-bus specification (NXP UM10204) is here:
https://www.nxp.com/documents/user_manual/UM10204.pdf
Related
I need to connect my Raspberry Pi 4 Model B to a servo via UART, but the servo only supports a single-wire connection. That means I must connect the TX and RX pins together. In order to do so, I need a way to manually disable only TX or only RX in my C program.
I can easily disable RX thanks to the termios.h library, but I couldn't find any way to disable TX.
I tried to disable it with this:
tcflow(fd_myUART, TCOOFF); // it should suspend output
But that didn't work, so I thought that maybe changing the TX pin to an input would switch it from UART to GPIO, but that didn't work either.
Is there a way to do that?
First of all, just "randomly" connecting both wires is a bad idea.
The image below shows how to do it better for a prototype.
Slave devices are able to pull the IO line low during a read bit or a reset while the TX signal is high.
When used in this configuration, you should not disable RX or TX; you can use "normal" UART operation.
More information can be found in Maxim Integrated tutorial 214, "Using a UART to Implement a 1-Wire Bus Master".
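For reference, here is a minimal sketch of the 1-Wire reset/presence-detect sequence done through the UART in that wiring, following the tutorial 214 approach; the device path /dev/serial0 and the timing values are assumptions for a Raspberry Pi, so adjust them for your setup.

#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

/* Issue a 1-Wire reset and check for a presence pulse.
   Returns 1 if a slave answered, 0 if not, -1 on error. */
static int onewire_reset(int fd)
{
    struct termios tio;
    unsigned char tx = 0xF0, rx = 0;

    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B9600);       /* at 9600 baud the 0xF0 byte stretches to a ~500 us low pulse */
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    tcflush(fd, TCIFLUSH);          /* drop any stale echoed bytes */
    write(fd, &tx, 1);
    usleep(2000);                   /* give the echoed byte time to arrive */
    if (read(fd, &rx, 1) != 1)
        return -1;

    /* With TX and RX joined, we always read back our own byte; if a slave issued a
       presence pulse it pulled extra bits low, so the echo differs from 0xF0. */
    return (rx != 0xF0) ? 1 : 0;
}

int main(void)
{
    int fd = open("/dev/serial0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }
    printf("presence: %d\n", onewire_reset(fd));
    close(fd);
    return 0;
}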
Since you will have a lot of connected slaves, you should consider using a dedicated chip:
I use a DS2482S-100 over I2C.
I am using two STM32H743 boards connected via RS232. Both modules are connected to the same power supply.
They use UART with DMA. When I turn on both modules at the same time, UART and DMA start correctly.
But when I reset one of the modules while the other stays running, the reset module's UART and DMA do not start, so the two boards cannot communicate with each other.
This problem also happened before with the STM32F4 series: the MCU was connected to an FPGA and they communicated via UART. When the FPGA started before the MCU, DMA and UART did not start properly. What could cause this problem?
Do I need to put the pins into a high-Z or floating state before starting the UART?
After lots of debugging hours, I finally found the cause and solution.
When the first bytes reach the UART peripheral, a clock mismatch triggers a frame error, which then stops the DMA. This happens more often when the UART data rate is very high. I had added the ErrorCallback function to handle the interrupt, but unfortunately I misused it.
My use:
void HAL_UART_ErrorCallback(UART_HandleTypeDef *huart)
{
    HAL_UART_MspDeInit(huart);
    HAL_UART_Receive_DMA(...);
}
HAL_UART_MspDeInit does not clear the handle structure and initialization state, so HAL_UART_Receive_DMA cannot start the peripheral again and my communication stops.
Correct use:
void HAL_UART_ErrorCallback(UART_HandleTypeDef *huart)
{
    HAL_UART_DeInit(huart);
    HAL_UART_Receive_DMA(...);
}
Three typos in my code cost me a lot of debugging time, but the problem is finally resolved.
The UART and DMA peripherals usually have error detection, with corresponding flags in their status registers. When an error happens, the STM32 HAL stops any ongoing transfer and waits until you handle the failure. You can inspect the HAL status registers with the debugger to troubleshoot the problem, and then add the corresponding error handling to your code. As a first step, you could reset the peripheral that reported the error by calling its DeInit() routine followed immediately by Init(), and also reset any other code that consumes data from that peripheral, e.g. state machines and the like.
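As an illustration, that recovery path might look like the sketch below, assuming a CubeMX-generated project with a UART handle huart1 and a DMA receive buffer rx_buf (both names are placeholders for whatever your project defines):

/* In a source file that already includes the CubeMX HAL headers (e.g. via "main.h"). */
extern UART_HandleTypeDef huart1;     /* CubeMX-generated handle (assumed name) */
extern uint8_t rx_buf[64];            /* DMA receive buffer (assumed name and size) */

void HAL_UART_ErrorCallback(UART_HandleTypeDef *huart)
{
    if (huart == &huart1)
    {
        HAL_UART_DeInit(huart);                               /* fully reset the handle and MSP */
        HAL_UART_Init(huart);                                 /* re-apply the configuration stored in huart->Init */
        HAL_UART_Receive_DMA(huart, rx_buf, sizeof(rx_buf));  /* restart DMA reception */
    }
}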
Is there any way to manually drive the SDA and SCL pins of the STM32's I2C1 peripheral low or high?
I use a security chip and I have to send a wake condition, described as follows:
if SDA is held low for a period greater than 60 µs, the device will exit low-power mode and,
after a delay of 1500 µs, it will be ready to receive I2C commands.
I've already tried to toggle the actual pin with HAL_GPIO_TogglePin(GPIOB, GPIO_PIN_9);, but this isn't working.
I've configured my project with STM32CubeMX.
Thanks for your help.
In I2C, the START condition is a high-to-low transition on SDA while SCL is high. If you then send a dummy address 0, a NACK will be generated (or rather, the lack of any response will be interpreted as a NACK). In a normal transaction the software would respond to the NACK by generating a repeated START or a STOP condition; since this is under software control, all you have to do is nothing for 1.5 ms. Thereafter you can generate a START with the device's actual address, and if the device is awake it will generate an ACK.
I am not familiar with the HAL library driver, and frankly the documentation is abysmal, but it is possible that it does not give you the necessary control, and you will have to access the I2C peripheral at the register level for at least this procedure. You might try a zero-length I2C_MasterRequestWrite() call to address zero followed by a delay. An oscilloscope would be useful here to ensure the expected signal timing is being generated.
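For example, something along these lines might work (a sketch, not tested), assuming a CubeMX-generated handle hi2c1 and a 100 kHz bus clock, so that the address phase of a write to address 0x00 holds SDA low for roughly 90 µs:

uint8_t dummy = 0x00;

/* Addressing 0x00 produces a START plus eight low data bits on SDA; at 100 kHz this
   holds SDA low well beyond the 60 us wake threshold. The call is expected to fail
   with a NACK, which is fine here. */
HAL_I2C_Master_Transmit(&hi2c1, 0x00, &dummy, 1, 10);
HAL_Delay(2);                     /* wait longer than the 1500 us wake delay */
/* the device should now ACK at its real I2C address */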
When you initialize I2C, the GPIO pins are put into alternate-function mode, so HAL GPIO write calls won't have any effect on them.
The normal HAL libraries won't help you with this; you have to drive the pins and the I2C protocol yourself using the STM32 registers.
I recommend setting the own (slave) address of the I2C channel used by the device, like in the code below:
I2C_InitStructure.I2C_OwnAddress1 = 0x30; // the unique slave address of the device
because the master could send a broadcast (general call) operation instead of addressing the device by its unique address.
I'm working on reading flow rates from a Sensirion LG16 flow sensor using an FTDI USB-to-I2C dongle. I have all the wiring done, and I can write to the sensor and verify the writes with an oscilloscope and an I2C analyzer. My problem is that when I try to read from the sensor, even when reading a register right after writing to it, I get missing ACKs on the I2C bus.
Here is my code to read the register I want to see at the moment:
bytesToTransfer = 0;
bytesTransfered = 0;
buffer[bytesToTransfer++] = 0xE5;
//buffer[bytesToTransfer++] = 0x7f;
status = I2C_DeviceWrite(ftHandle, slaveAddress, bytesToTransfer, buffer, &bytesTransfered,
                         I2C_TRANSFER_OPTIONS_START_BIT |
                         I2C_TRANSFER_OPTIONS_STOP_BIT |
                         I2C_TRANSFER_OPTIONS_FAST_TRANSFER_BYTES);
printf("bytestxd=%d\n", bytesTransfered);
printf("status=%d\n", status);
APP_CHECK_STATUS(status);
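For comparison, a read back from the sensor with the same libMPSSE-I2C API would look roughly like the sketch below; the 3-byte length (two data bytes plus CRC) is an assumption based on typical Sensirion sensors, so check the LG16 datasheet for the actual reply length:

uint8 rxBuffer[3];                          /* 2 data bytes + CRC (assumed length) */
uint32 bytesToRead = sizeof(rxBuffer);
uint32 bytesRead = 0;

status = I2C_DeviceRead(ftHandle, slaveAddress, bytesToRead, rxBuffer, &bytesRead,
                        I2C_TRANSFER_OPTIONS_START_BIT |
                        I2C_TRANSFER_OPTIONS_STOP_BIT |
                        I2C_TRANSFER_OPTIONS_NACK_LAST_BYTE);
printf("bytesrxd=%d status=%d\n", (int)bytesRead, (int)status);
APP_CHECK_STATUS(status);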
On ISE 14.7, what do I need to do in order to have the AXI stream (which has an IP core that loops a value) give me output via the UART?
I have set up the project, added a UART, and set up the MHS and UCF files and the rest of the bonanza; however, I have no idea what I need to do to get output from the AXI stream via the UART.
Any ideas?
You will need to write a UART module in RTL, or use an existing IP core which accepts AXI-Stream transactions and converts them to UART messages.
However, due to performance differences, it is more common to find IP cores with an AXI4-Lite interface to a UART, for example the AXI UART 16550 (https://www.xilinx.com/support/documentation/ip_documentation/axi_uart16550/v2_0/pg143-axi-uart16550.pdf).
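If you take the AXI4-Lite route with a soft processor (e.g. MicroBlaze), sending the looped value out is then done from software. Below is a rough sketch using the standalone xuartns550 driver; XPAR_AXI_UART16550_0_BASEADDR and XPAR_XUARTNS550_CLOCK_HZ are placeholders for whatever names xparameters.h generates for your design, and get_stream_value() stands in for however you read your looping core.

#include "xparameters.h"
#include "xuartns550_l.h"

extern unsigned char get_stream_value(void);   /* hypothetical: read one byte from your looping core */

int main(void)
{
    /* Configure the 16550 for 115200 baud, 8 data bits. */
    XUartNs550_SetBaud(XPAR_AXI_UART16550_0_BASEADDR, XPAR_XUARTNS550_CLOCK_HZ, 115200);
    XUartNs550_SetLineControlReg(XPAR_AXI_UART16550_0_BASEADDR, XUN_LCR_8_DATA_BITS);

    while (1) {
        /* Blocking write of one byte out of the UART. */
        XUartNs550_SendByte(XPAR_AXI_UART16550_0_BASEADDR, get_stream_value());
    }
    return 0;
}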