I am a beginner at C programming. I researched this problem but couldn't find an answer, so I am asking here. My problem is:
I want to convert a hex array to a string. For example:
This is my input hex array: uint8_t hex_in[4]={0x10,0x01,0x00,0x11};
and I want the string output to be: "10010011"
I tried some solutions, but they give me "101011" because the leading zeros get dropped.
How can I obtain an 8-digit string?
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main(){
    char dene[2];
    uint8_t hex_in[4]={0x10,0x01,0x00,0x11};
    //sprintf(dene, "%x%*x%x%x", dev[0],dev[1],2,dev[2],dev[3]);
    //sprintf(dene, "%02x",hex_in[1]);
    printf("dene %s\n",dene);
}
In order to store the output in a string, the string must be large enough. In this case that means holding 8 digits + the null terminator, not 2 = 1 digit + the null terminator.
Then you can print each number with %02x or %02X to get 2 digits per byte, padded with a leading zero. Lower-case x gives lower-case abcdef, upper-case X gives ABCDEF - otherwise they are equivalent.
Corrected code:
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    char str[9]; /* 8 hex digits + null terminator */
    uint8_t hex_in[4]={0x10,0x01,0x00,0x11};

    sprintf(str,"%02x%02x%02x%02x", hex_in[0],hex_in[1],hex_in[2],hex_in[3]);
    puts(str); /* puts appends the newline */
}
Though pedantically, you should always print uint8_t and other types from stdint.h using the PRIx8 etc. specifiers from inttypes.h:

#include <inttypes.h>

sprintf(str,"%02"PRIx8"%02"PRIx8"%02"PRIx8"%02"PRIx8,
        hex_in[0],hex_in[1],hex_in[2],hex_in[3]);
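If the array length may vary, the same conversion is often written as a loop so the number of conversions and the buffer size scale together. A minimal sketch, assuming a hypothetical helper hex_to_string() and a caller-supplied buffer of at least 2*len + 1 bytes:

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: writes 2*len hex digits plus a null terminator into buf. */
static void hex_to_string(const uint8_t *in, size_t len, char *buf)
{
    for (size_t i = 0; i < len; i++)
        sprintf(&buf[2 * i], "%02x", (unsigned)in[i]); /* each byte becomes exactly 2 digits */
}

int main(void)
{
    uint8_t hex_in[4] = {0x10, 0x01, 0x00, 0x11};
    char str[2 * sizeof hex_in + 1];          /* 8 digits + null terminator */

    hex_to_string(hex_in, sizeof hex_in, str);
    puts(str);                                /* prints 10010011 */
}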
Related
I want the result to be the value of ac[0], but it is showing something confusing. The second line works.
#include <stdio.h>

int main(void){
    char a='a';
    char b='b';
    char ac[]={-24,1,2,3,4,5,6,7,8,9,};
    char *p=ac;
    printf("*p=%c ",*p); //this is the part that produces problem
    printf("*p=%d ",*p); //this works
    //! *(p+n)<--->ac[n]
}
%c prints the character encoded by the integer value held in the char. The most popular 7-bit character set is ASCII; values above 0x7F (and negative values) are not standard ASCII. Since char is signed on your platform, ac[0] holds -24, which is not a valid ASCII code, so %c prints an unpredictable character, while %d simply prints the integer value -24.
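To see what %c actually receives here, a small sketch of my own (not part of the original answer) that prints the same value in the three relevant ways:

#include <stdio.h>

int main(void)
{
    char c = -24;                      /* same value as ac[0] in the question */

    printf("%d\n", c);                 /* prints -24: the signed integer value */
    printf("%d\n", (unsigned char)c);  /* prints 232 (0xE8): the raw byte value */
    printf("%c\n", c);                 /* 0xE8 is not 7-bit ASCII, so the output
                                          depends on the terminal's encoding */
}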
I want to print the copyright symbol, but putchar() just cuts off the most significant byte of the character, which results in an unprintable character.
I am using Ubuntu MATE and the encoding I am using is en_US.UTF-8.
What I know is that the UTF-8 encoding of © is 0xc2a9. When I try putchar('©' - 0x70), it prints 9, which has the hex value 0x39; add 0x70 to it and you get 0xa9, which is the least significant byte of 0xc2a9.
#include <stdio.h>

int main(void)
{
    printf("©\n");
    putchar('©');
    putchar('\n');
}
I expect the output to be:
©
©
rather than:
©
�
The putchar function takes an int argument and converts it to an unsigned char before printing it, so you can't pass it a multibyte character.
You need to call putchar twice, once for each byte of the character's UTF-8 encoding:
putchar(0xc2);
putchar(0xa9);
You could try the wide version: putwchar
Edit: That was actually more difficult than I thought. Here's what I needed to make it work:
#include <locale.h>
#include <wchar.h>
#include <stdio.h>
int main() {
    setlocale(LC_ALL, "");
    putwchar(L'©');
    return 0;
}
I need to convert a uint16_t value to a string. I want the string to be a decimal representation of the number.
Example: uint16_t i=256 string: "256"
I tried itoa(i, string, 10), but when the value of i increases, it starts printing negative values.
I send the string via the serial port (UART).
Is there some alternative?
Use sprintf with the %u format for unsigned values:

char str[6]; /* at most 5 digits for a uint16_t, plus the null terminator */
uint16_t i = 33000;
sprintf(str, "%u", i);
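On an embedded target it can be safer to use snprintf so the conversion can never overrun the buffer. A minimal sketch; uart_send_string() here is a hypothetical stand-in for whatever send routine your UART driver provides:

#include <stdio.h>
#include <stdint.h>

/* Hypothetical stand-in for a UART driver's send routine; on a real target this
   would write to the serial peripheral instead of stdout. */
static void uart_send_string(const char *s)
{
    fputs(s, stdout);
}

static void send_u16(uint16_t value)
{
    char str[6];  /* "65535" + null terminator */

    snprintf(str, sizeof str, "%u", (unsigned)value);
    uart_send_string(str);
}

int main(void)
{
    send_u16(33000);  /* prints 33000 instead of a negative number */
    return 0;
}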
You can use sprintf() for common "to string" conversions. For example:
#include <stdio.h>
#include <stdint.h>

int main() {
    uint16_t i = 256;
    char str[80];
    int str_len = sprintf(str, "%u", i); /* str_len holds the number of characters written */

    return 0;
}
My computer made a beep sound even though I did not add \a to my code. Why?
Program:
#include <stdio.h>
#include <limits.h>
#include <float.h>
#include <stdlib.h>
#define START_CHAR ' '
#define END_CHAR 'DEL'
int main(void)
{   /* This code prints characters on keyboard. */
    /* declaration */
    int char_code;

    for (char_code=(int)START_CHAR; char_code<=(int)END_CHAR; char_code=char_code+1)
        printf("%c", (char)char_code);
    printf("\n");
    return(0);
}
'DEL' is not a valid character constant; it is a multi-character constant whose value here ends up being 4474188 (0x44454C). And since you have char_code defined as an int, the loop goes from 32 (the ASCII code for a space) up to 4474188, so it loops through the full character set multiple times. That range includes the control characters, among them the bell character 0x07, which is what makes your computer beep.
You should be using 0x7F instead.
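For reference, here is a corrected sketch of the program (my adaptation, not the answerer's code); it stops just before DEL so only printable ASCII is written and the bell character is never reached:

#include <stdio.h>

#define START_CHAR ' '   /* 0x20, the first printable ASCII character */
#define END_CHAR   0x7F  /* DEL, one past the last printable character */

int main(void)
{
    /* Prints the printable ASCII characters 0x20..0x7E exactly once. */
    for (int char_code = START_CHAR; char_code < END_CHAR; char_code++)
        printf("%c", (char)char_code);
    printf("\n");
    return 0;
}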
I have been given this school project. I have to sort a list of items alphabetically by Czech rules. Before I dig deeper, I decided to test it on a 16 by 16 matrix, so I did this:
typedef struct {
    wint_t **field;
} LIST;
...
setlocale(LC_CTYPE,NULL);
....
list->field = (wint_t **)malloc(16*sizeof(wint_t *));
for(int i=0;i<16;i++)
    list->field[i] = (wint_t *)malloc(16*sizeof(wint_t));
In another function I am trying to assign a character, like this:
sorted->field[15][15] = L'C';
wprintf(L"%c\n",sorted->field[15][15]);
Everything is fine and the character is printed. But when I try to change it to
sorted->field[15][15] = L'Č';
it says "Extraneous characters in wide character constant ignored." (Xcode) and the printing part is skipped. The main.c file is saved in UTF-8. If I try to print this:
printf("ěščřžýááíé\n");
it prints it out as written. I am not sure if I should allocate the memory using wint_t or wchar_t, or if I am doing it right at all. I tested it with both but neither of them works.
clang seems to support entering arbitrary byte sequences into wide strings with the \x notation:
wchar_t c = L'\x2126';
This compiles without notice.
Edit: Adapting what I found on Wikipedia about wide characters, the following works for me:
#include <stdio.h>
#include <wchar.h>
#include <stdlib.h>
#include <locale.h>
int main(void)
{
    setlocale(LC_ALL,"");
    wchar_t myChar1 = L'\x2126';
    wchar_t myChar2 = 0x2126; // 0x2126 is the code point of Ω (OHM SIGN)
    wprintf(L"This is char: %lc \n",myChar1);
    wprintf(L"This is char: %lc \n",myChar2);
}
and prints nice Ω characters in my terminal. Make sure that your terminal is able to interpret UTF-8 characters.