STM32 Nucleo-WL55JC1 UART reading wrong (C)
I'm trying to read a GNSS module with an STM32 Nucleo-WL55JC1. Here's my main loop code:
while (1)
{
    uint8_t buff[500];
    HAL_UART_Receive(&huart1, buff, strlen((char*)buff), HAL_MAX_DELAY);
    HAL_UART_Transmit(&huart2, buff, strlen((char*)buff), HAL_MAX_DELAY);
    /* USER CODE END WHILE */
    /* USER CODE BEGIN 3 */
}
When I run the code, my serial monitor prints only a few messages and then freezes:
$PSTMVER,GNSSLIB_8.4.18.25_CP_ARM*07
$GPTXT,DEFAULT LIV3FL CONFIGURATION*12
$PSTMVER,OS20LIB_4.4.0_ARM*40
$PSTMVER,GPSAPP_2.11.0_CP_LIV3FL_RC9_ARM*20
$PSTMVER,BINIMG_4.6.15_CP_LIV3FL_RC9_ARM*27
Then I changed the receive timeout to 1000 and it started outputting some NMEA data, but as you can see it's mixed up with other messages:
$PSTMCPU,20.86,-1,98*4F
$GPRMC,060732.000,V,0745.75046,S,11023.31916,E,,,071222,,,N*64 $GPGGA,060732.000,0745.75046,S,11023.31916,E,0,00,99.0,172.57,M,0.0,M11023.31916,E,060731.000,V,N*54
Then I unplugged and replugged the MCU, and now it just loops on these messages:
$PSTMVER,GPSAPP_2.11.0_CP_LIV3FL_RC9_ARM*20
$PSTMVPSTMVER,OS20LIB_4.4.0_ARM*40
$PSTMVER,GPSAPP_2.11.0_CP_LIV3FL_RC9_ARM*20
$PSTMVPSTMVER,OS20LIB_4.4.0_ARM*40
$PSTMVER,GPSAPP_2.11.0_CP_LIV3FL_RC9_ARM*20
$PSTMVPSTMVER,OS20LIB_4.4.0_ARM*40
I've tried using the same module on an ESP and it prints the messages correctly:
$PSTMCPU,21.69,-1,98*4F
$GPRMC,062153.000,V,0745.76371,S,11023.30606,E,,,071222,,,N*6C
$GPGGA,062153.000,0745.76371,S,11023.30606,E,0,02,99.0,189.71,M,0.0,M,,*77
$GPVTG,,T,,M,,N,,K,N*2C
$GNGSA,A,1,26,27,,,,,,,,,,,99.0,99.0,99.0*1F
$GNGSA,A,1,,,,,,,,,,,,,99.0,99.0,99.0*1E
$GPGSV,2,1,05,26,53,342,32,27,39,189,27,22,35,019,,23,14,149,*75
$GPGSV,2,2,05,21,07,251,,,,,,,,,,,,,*4E
$GLGSV,2,1,06,70,70,053,,74,33,309,,71,29,352,,69,26,142,27*6F
$GLGSV,2,2,06,80,15,158,,85,11,205,,,,,,,,,*69
$GPGLL,0745.76371,S,11023.30606,E,062153.000,V,N*5F
This is my first time using the STM32 IDE and its HAL (I've only used the Arduino IDE before), so I'm kind of lost as to why my output is so different from the ESP's.
Edit 1
As chux - Reinstate Monica suggests, I've modified the code to the following:
uint8_t buff[500];
ret = HAL_UART_Receive(&huart1, buff, sizeof buff, HAL_MAX_DELAY);
while (ret != HAL_OK); // Continue when Receive returns HAL_OK
HAL_UART_Transmit(&huart2, buff, sizeof buff, HAL_MAX_DELAY);
The stuck problem is still there, but the output now changes to
$PSTMVER,GNSSLIB_8.4.18.25_CP_ARM*07
$GPTXT,DEFAULT LIV3FL CONFIGURATION*12
$PSTMVER,OS20LIB_4.4.0_ARM*40
$PSTMVER,GPSAPP_2.11.0_CP_LIV3FL_RC9_ARM*20
$PSTMVER,BINIMG_4.6.15_CP_LIV3FL_RC9_ARM*27
$PSTMVER,SWCFG_8306532d*33
$PSTMVER,STAGPSLIB_6.0.0_ARM*5A
$PSTMVER,STA8090_822bc043*61
$GPTXT,(C)2000-2018 ST Microelectronics*29
$GPTXT,DEFAULT LIV3FL CONFIGURATION*12
$PSTMSWCONFIG,1,0,12,000205f105070a0a0e0d0c0b0a090608070203630e110c04180c0155030110500f00000f071402050a40fffffffffffffffffffff (stuck here)
Reflashing without unplugging and replugging the MCU:
$GNGSA,A,3,03,16,32,27,26,22,,,,,,,1.9,1.0,1.6*26
$GNGSA,A,3,,,,,,,,,,,,,1.9,1.0,1.6*22
$GPGSV,3,1,10,16,58,305,23,27,53,179,23,32,47,086,23,22,46,034,22*72
$GPGSV,3,2,10,26,39,354,20,10,30,157,14,08,25,206,13,03,14,310,27*7B
$GPGSV,3,3,10,31,12,021,,21,10,237,,,,,,,,,*7E
$GLGSV,2,1,08,70,72,122,,71,48,350,,84,23,127,,74,22,326,*6C
$GLGSV,2,2,08,75,19,269,19,85,17,189,,69,11,151,18,80,11,144,19*6B
$GPGLL,0745.76098,S,11023.30836,E,065605.000,A,A*4D
$PSTMCPU,36.35,-1,98*40
$G (stuck here)
strlen((char*)buff) is invalid, as the contents of buff are indeterminate.
Consider HAL_UART_Receive(&huart1, buff, sizeof buff, HAL_MAX_DELAY);
.. and use the return value of HAL_UART_Receive() to determine the success level.
HAL_UART_Transmit() should only attempt to transmit the number of characters actually received, not strlen((char*)buff).
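Building on that advice: since the GNSS output is line-oriented NMEA, one common alternative is to receive one byte at a time, accumulate bytes until a newline, and then transmit exactly that many bytes. Below is a hardware-independent sketch of just the framing logic; `get_byte` is a hypothetical stand-in for a one-byte `HAL_UART_Receive(&huart1, &c, 1, HAL_MAX_DELAY)` call that returns 0 on success:

```c
#include <stddef.h>

/* Accumulate bytes into line[] until '\n' arrives or the buffer is full.
 * get_byte() is a hypothetical stand-in for a one-byte UART receive;
 * it returns 0 on success and nonzero on failure/timeout.
 * Returns the number of bytes stored (the '\n' is kept). */
size_t collect_line(int (*get_byte)(unsigned char *), char *line, size_t cap)
{
    size_t n = 0;
    unsigned char c;
    while (n < cap && get_byte(&c) == 0) {
        line[n++] = (char)c;
        if (c == '\n')
            break;
    }
    return n;
}
```

In the main loop you would then call `HAL_UART_Transmit(&huart2, (uint8_t*)line, n, HAL_MAX_DELAY)` with the returned count, never `strlen()` of a buffer that is not NUL-terminated.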
Related
USART not reading all modem responses
I send commands to the modem via USART1 and copy them to USART2. I copy the modem response to USART2. USART2 displays the modem's response to the first AT command (AT\r\n): "AT OK". The response to the second command (AT+CSCS?\r\n) is not displayed by USART2, or USART1 for some reason does not read the second response from the modem. Why? The code:

{
    char Test[] = "AT\r\n";
    char Reply[] = "";
    char Text[] = "Module Connected";
    char Text1[] = "SMS with GSM Module";
    char Text2[] = "Checking Module...";
    char Text3[] = "Module Unconnected";
    char NewLine[] = "\r\n";
    char Line[] = "--------------------------------------------";
    uint8_t flag = 1;
    uint8_t flag1 = 1;
    char STM32Req[] = "STM32 request: ";
    char MdmAnswr[] = "Modem Answer: ";
    char GsmTest[6] = "AT\r\n";
    char SmsEncoding[] = "AT+CSCS?\r\n";

    /* Infinite loop */
    for (;;)
    {
        HAL_UART_Transmit(&huart2, (uint8_t*)Text1, strlen(Text1), HAL_MAX_DELAY);
        HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
        osDelay(500);
        HAL_UART_Transmit(&huart2, (uint8_t*)Text2, strlen(Text2), HAL_MAX_DELAY);
        HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
        osDelay(500);
        while (flag == 1)
        {
            HAL_UART_Transmit(&huart2, (uint8_t*)STM32Req, strlen(STM32Req), HAL_MAX_DELAY);
            HAL_UART_Transmit(&huart2, (uint8_t*)Test, strlen(Test), HAL_MAX_DELAY);
            HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
            HAL_UART_Transmit(&huart1, (uint8_t*)Test, strlen(Test), HAL_MAX_DELAY);
            HAL_UART_Receive(&huart1, (uint8_t*)Reply, 10, 100);
            osDelay(1000);
            HAL_UART_Transmit(&huart2, (uint8_t*)MdmAnswr, strlen(MdmAnswr), HAL_MAX_DELAY);
            HAL_UART_Transmit(&huart2, (uint8_t*)Reply, 10, HAL_MAX_DELAY);
            HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
            osDelay(500);
            if (strstr(Reply, "OK\r\n"))
            {
                HAL_UART_Transmit(&huart2, (uint8_t*)Text, strlen(Text), HAL_MAX_DELAY);
                HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
                osDelay(500);
                HAL_UART_Transmit(&huart2, (uint8_t*)Line, strlen(Line), HAL_MAX_DELAY);
                HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
                HAL_UART_Transmit(&huart2, (uint8_t*)Line, strlen(Line), HAL_MAX_DELAY);
                HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
                osDelay(500);
                flag = 0;
            }
        }
    }
    while (flag1 == 1)
    {
        HAL_UART_Transmit(&huart2, (uint8_t*)STM32Req, strlen(STM32Req), HAL_MAX_DELAY);
        HAL_UART_Transmit(&huart2, (uint8_t*)SmsEncoding, strlen(SmsEncoding), HAL_MAX_DELAY);
        HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
        HAL_UART_Transmit(&huart1, (uint8_t*)SmsEncoding, strlen(SmsEncoding), HAL_MAX_DELAY);
        HAL_UART_Receive(&huart1, (uint8_t*)Reply, 500, 100);
        osDelay(1000);
        HAL_UART_Transmit(&huart2, (uint8_t*)MdmAnswr, strlen(MdmAnswr), HAL_MAX_DELAY);
        HAL_UART_Transmit(&huart2, (uint8_t*)Reply, strlen(Reply), HAL_MAX_DELAY);
        HAL_UART_Transmit(&huart2, (uint8_t*)NewLine, strlen(NewLine), HAL_MAX_DELAY);
    }
You have some things that are correct and some things that need improvement or are wrong. The correct thing is to wait for a final result code like OK, as you do; however, you cannot wait for OK after only some of the commands. You need to wait for a final result code from the modem every single time you send an AT command line. No exceptions! You also need to check for more than just OK; there are several other final result codes, like ERROR and more.
Also, regarding how to receive data from the modem: your attempt to read up to 10 bytes with HAL_UART_Receive into the Reply buffer of only size 1 will cause memory corruption when more than one byte is read. That bug by itself makes all bets off with regard to behaviour, since it is of the type undefined behaviour (that phrase has a very specific meaning in the C standard and is something that demands attention to avoid invoking). But regardless, reading 10 bytes at a time is the wrong approach. The serial connection is just a stream of bytes with no inherent structure. I assume HAL_UART_Receive uses/implements a small (say 16-byte) FIFO buffer, and exactly how bytes received on the wire are grouped and forwarded to your program code you do not know and cannot depend on. If the modem sends the four bytes O, K, \r and \n, calling HAL_UART_Receive might return "OK\r\n" if you are lucky; however, it might also first return just "O" and then on the next call return "K\r\n". This is perfectly valid behaviour by the USART, and your code must be able to handle it. The simplest way to do this properly is to read one character at a time and copy each character into a temporary buffer until you have received a complete line of data, because all responses (final and intermediate) end with "\r\n", and then process that response line (where checking whether the line is a final result code is the first thing that should be done).
This problem is known in data communication protocols as framing. See this answer for some more information. Related to what I wrote above about ALWAYS reading and parsing responses from the modem for every single command line sent: your calls to osDelay must be thrown out and permanently banished, never to return. Your lack of waiting for a final result code from the modem before sending the next command line will likely trigger abort behaviour. Abortion of AT commands is defined in chapter "5.6.1 Aborting commands" in V.250. So the answer to your "why" question is probably AT command abortion, but you have several issues to fix before you can have any hope of reliable behaviour (and please do fix them before asking a new question with the same unfixed code); without those fixes, guesses at the causes of problems are of little value.
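As a rough illustration of the "check for a final result code first" step described above, here is a minimal, hardware-independent sketch. The set of result codes below is only a partial assumption based on V.250 (OK, ERROR, and a few common ones); consult your modem's documentation for the authoritative list:

```c
#include <string.h>

/* Return 1 if a complete response line (without its trailing "\r\n")
 * is a final result code, else 0. The set below is a minimal
 * assumption based on V.250; check your modem's manual for the
 * full list it actually emits. */
int is_final_result_code(const char *line)
{
    static const char *finals[] = {
        "OK", "ERROR", "BUSY", "NO ANSWER", "NO CARRIER", "NO DIALTONE"
    };
    for (size_t i = 0; i < sizeof finals / sizeof finals[0]; i++)
        if (strcmp(line, finals[i]) == 0)
            return 1;
    /* "+CME ERROR: ..." / "+CMS ERROR: ..." are also final */
    if (strncmp(line, "+CME ERROR:", 11) == 0 ||
        strncmp(line, "+CMS ERROR:", 11) == 0)
        return 1;
    return 0;
}
```

After sending a command line, you would keep collecting complete "\r\n"-terminated lines and processing them until this function returns 1, and only then send the next command.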
Linux on RPi debian, hidraw write() to USB device outputs a few junk characters to /dev/hidraw0 which if not cleared jam the device
We have a set of USB devices which we monitor using an RPi. The monitoring code polls the devices via the hidraw direct interface about once a second. The protocol uses 64-byte packets to send commands and receive data, and all responses are at most 64 bytes long. The same scheme works fine under Windows using the Windows HID driver. On Linux, however, we use hidraw and find that the device interface gets jammed after a short time, resulting in unsuccessful write()s to the device. After a lot of investigation I came across a recommendation to follow the communication between a host and a hidraw device using this in a terminal:

sudo cat /dev/hidraw0

As it turns out, running this command outputs 4-8 bytes of unreadable characters to the terminal on every write(), and unexpectedly it also clears the jam for hidraw0. All subsequent write()s and read()s to that device work flawlessly. If that device is disconnected and then reconnected, the jam condition returns shortly thereafter. I have single-stepped the code and verified that the "junk" is output during the execution of the write(). I tried to add fsync() calls before and after the write() in the hope of clearing the buffers and avoiding this issue, but that did not help. The code for the write() and subsequent read() is standard, as follows:

#define USB_PACKET  64
#define USB_WRDELAY 10 // ms

FILE* fd;
int errno, res;
char packet[USB_PACKET];

fd = 0;

/* Open the Device with non-blocking reads. */
fd = open("/dev/hidraw0", O_RDWR|O_NONBLOCK);
if (fd < 0)
{
    perror("Unable to open device");
    return 0; // failure
}

memset(packet, 0x0, sizeof(packet));
packet[0] = 0x34; // command code - request for USB device status bytes

fsync();
res = write(fd, &packet, sizeof(packet));
fsync();
if (res < 0)
{
    printf("Error: %d in USB write()\n", errno);
    close(fd);
    return 0; // failure
}
else
{
    usleep(1000*USB_WRDELAY); // delay gives OS and device time to respond
    res = read(fd, &packet, sizeof(packet));
    if (res < 0)
    {
        printf("Error: %d in USB read()\n", errno);
        close(fd);
        return 0; // failure
    }
    else
    {
        // good read, packet holds the response data
        // process the device data
        close(fd);
        return 1; // OK
    }
}
return 0; // failure

This is a sample of the gibberish we read on the terminal running the cat command for each executed write():

4n��#/5 �

I am not understanding where this junk comes from and how to get rid of it. I tried several things that did not work out, such as adding a read() with a timeout before the write, hoping it was some data left from a previous incomplete read(). I also tried to write a smaller buffer, as I only need to send a 2-byte command, as well as adding a delay between the open() and the write(). Unfortunately, using the cat in the terminal interferes with the hot plug/unplug detection of the USB devices, so it is not a solution we can use in deployment. I'll appreciate any words of wisdom on this.
File transfer with arduino
My final goal is to send a 30 KB file over XBee to another Arduino. But for now I am just trying to duplicate a 4 KB file on the SD card connected to the first Arduino. First I tried to send the data one byte at a time; it worked and the file duplicated successfully. But I have to have a buffer and then send data in 64-byte packets to the XBee, so I should be able to read and write the file in 64-byte packets. This is what I have done:

#include <SD.h>
#include <SPI.h>

void setup() {
  Serial.begin(115200);
  while (!Serial) {
    ; // wait for serial port to connect. Needed for native USB port only
  }
  if (!SD.begin(4)) {
    Serial.println("begin failed");
    return;
  }
  File file = SD.open("student.jpg", FILE_READ);
  File endFile = SD.open("cop.jpg", FILE_WRITE);
  Serial.flush();
  char buf[64];
  if (file) {
    while (file.position() < file.size()) {
      while (file.read(buf, sizeof(buf)) == sizeof(buf)) // read chunk of 64 bytes
      {
        Serial.println(((float)file.position()/(float)file.size())*100); // progress %
        endFile.write(buf); // Send to xbee via serial
        delay(50);
      }
    }
    file.close();
  }
}

void loop() {
}

It successfully finishes its progress until 100%, but when I open the SD card on my laptop the file is created but shown as a 0 KB file. What's the problem?
You are not telling .write the length of your buffer, so it will think it's a null-terminated string (which it isn't). Plus, the inner loop appears to be not only unnecessary but even harmful, because it would skip the last chunk if it's less than 64 bytes. Check this out:

while (file.position() < file.size()) {
  // The docs tell me this should be file.readBytes... but then I wonder why file.read even compiled for you?
  // So if readBytes doesn't work, go back to "read".
  int bytesRead = file.readBytes(buf, sizeof(buf));
  Serial.println(((float)file.position()/(float)file.size())*100); // progress %
  // We have to specify the length! Otherwise it will stop when encountering a null byte...
  endFile.write(buf, bytesRead); // Send to xbee via serial
  delay(50);
}
Named pipe messages corrupted (Win32,C)
I have some pipe communication code where the received bytes don't match the sent bytes. There is a loop where CallNamedPipe is called to send messages to the server. Only the 1st message is received intact; all the rest are received partially filled with the 0xCD byte. It seems that when I free the memory after sending, it is still being read by the server thread. MSDN says that CallNamedPipe() is a complete message sequence: open the pipe, send bytes, and close the pipe. So this seems strange to me. I must mention that this code is built with VC++ 6.0, a very old compiler. The code runs on Windows 7; maybe I need to use compatibility mode? Both client and server executables run on the same physical system, not remotely. The client uses CreateProcess() on startup to start the server. The messages are sent much later on, so race conditions should not matter, I hope. Thanks for any advice.

============ Client side (pseudocode): ============

for (iPiece = 0; iPiece < nPieces; ++iPiece)
{
    buffer = malloc(2048);
    // copy some data bytes into buffer (1..2048 bytes)
    // log 1st 32 DWORDs from message about to be sent
    if (!CallNamedPipe(name, buffer, nBytes, ..., 1000))
    {
        // diagnostics: call to GetLastError(), etc.
    }
    free(buffer);
}

============ Server side (pseudocode): ============

DWORD __stdcall ServerThreadProc (PVOID p)
{
    UINT cbMaxMsg = 0x10000; // 64K for a pipe message
    PVOID buffer = malloc(cbMaxMsg);
    HANDLE hPipe;
    BOOL fAbort = 0;

    hPipe = CreateNamedPipe(name, PIPE_ACCESS_DUPLEX,
                            PIPE_WAIT | PIPE_TYPE_MESSAGE | PIPE_READMODE_MESSAGE,
                            PIPE_UNLIMITED_INSTANCES, cbMaxMsg, cbMaxMsg, 1000, NULL);

    while (fAbort == 0) // one of the pipe messages sets fAbort=1, so the thread can return
    {
        if (ConnectNamedPipe(hPipe, NULL))
        {
            DWORD bytesLoaded = 0;
            ReadFile(hPipe, buffer, cbMaxMsg, &bytesLoaded, NULL);
            if (bytesLoaded)
            {
                // log 1st 32 DWORDs from received message
                // process the pipe message (switch/case)
                // data may be written back to client after processing
                FlushFileBuffers(hPipe);
            }
            DisconnectNamedPipe(hPipe);
        }
        else
        {
            // diagnostics, GLE(), etc.
        }
    }
    free(buffer);
    CloseHandle(hPipe);
    return 0;
}
Using PIPE_TYPE_MESSAGE makes the pipe a message pipe. It will give message-like behavior to the server: if you try to read more than the standard message size, it will give you complete messages. 0xCD is a standard debug fill pattern (the MSVC debug heap fills newly allocated memory with it). So far, you are reading 64K of data and writing 2K of data. It looks like CallNamedPipe doesn't return until the data is accepted (as there is a timeout, set to 1000 ms). The behavior of these systems is that once the kernel has the data buffer, the client code is not allowed to change the memory. I would say the most likely case is that the buffer is not being filled correctly, and that the amount of data in the server is consistent with the messages in the pipe. You have not provided enough data to verify this.
Serial communication buffer data out of order
I'm trying to implement a simple SLAM project with an Arduino and C on Linux Mint 15. The Arduino project sends data to a notebook via Bluetooth (serial). The data is read by a C program. In the Arduino serial monitor the data is shown correctly, but on the notebook the received data is wrong. (In the image, white is the Arduino data; the terminal shows the 'received' data.) I'm sending d080x096y099z035 (for example) and receiving 99z0356y0999z035 (out of order?). So, I have some questions:

1. What can I do to make the read() command in C read the data in the correct order and length? (order: d000x000y000z000, length = 16)
2. In the Arduino sending function, are there length differences between using Serial.print(char buffer[]) and Serial.println(char buffer[])? (Like adding a '\n' or something else at the end of the buffer?)
3. Should I use the delay() function in the Arduino code or in the C code?

In Arduino:

...
int buffer_size = 17;
char buffer[17];

//void setup()

void loop() {
    //create the string resp = "d000x111y222z333"
    ...
    resp.toCharArray(buffer, buffersize);
    bluetooth.print(buffer);
    delay(200);
}

In C program:

...
int fd = open("/dev/rfcomm4", O_RDONLY | O_NOCTTY | O_NDELAY);
printf("fd code %d\n", fd);
if (fd == -1) {
    gchar *msg = "open_port: Unable to open /dev/rfcomm4";
    gtk_label_set_text(GTK_LABEL(label), msg);
    perror("error: ");
}
char buffer[17];
int n;
printf("entering in loop...\n");
while (1) {
    n = read(fd, buffer, sizeof(buffer));
    printf("%s\n", buffer);
}
Sorry, I'm not an expert, but here are a few ideas you might check concerning your questions: to 1) I guess it might be a problem with encoding, as Python AFAIK expects files to be Unicode. So try open(..., encoding='ascii') or whatever encoding you use. Please also pay attention that you might block the GTK main thread, which causes heavy delays in your UI. So I recommend creating your own thread for reading the serial port and filling an internal buffer that gets rendered by the GTK main thread when you send an update request: http://www.pardon-sleeuwaegen.be/antoon/python/page0.html
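For what it's worth, the out-of-order symptom in the question is the same framing issue discussed above: read() returns however many bytes happen to be available, not one whole 16-byte frame, so the frames must be reassembled in your own code. Here is a minimal sketch of reassembling the fixed-length d000x000y000z000 frames, assuming 'd' only ever appears as the start marker (an assumption that would need checking against the real data):

```c
#include <stddef.h>

#define FRAME_LEN 16  /* "d000x000y000z000" */

/* Incrementally reassemble fixed-length frames from a byte stream.
 * Feed bytes one at a time, regardless of how read() chunked them;
 * returns 1 when frame[] holds a complete 16-byte frame starting
 * with 'd', else 0. Bytes before a 'd' start marker are discarded. */
int frame_feed(char c, char *frame, size_t *fill)
{
    if (*fill == 0 && c != 'd')
        return 0;              /* wait for the start marker */
    frame[(*fill)++] = c;
    if (*fill == FRAME_LEN) {
        *fill = 0;             /* ready for the next frame */
        return 1;
    }
    return 0;
}
```

The C reader would loop over each byte returned by read() and call frame_feed; only when it returns 1 is a full, correctly aligned frame available to print or parse.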