Please check the following program:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main()
{
    char *Date = NULL;
    unsigned short y = 2013;

    Date = malloc(3);
    sprintf((char *)Date, "%hu", y);
    printf("%d %d %d %d %d \n", Date[0], Date[1], Date[2], Date[3], Date[4]);
    printf("%s %d %d", Date, strlen(Date), sizeof(y));
}
output:
50 48 49 51 0
2013 4 2
How am I getting a string length of 4 instead of 2? I am putting a short integer value into the memory, so it should occupy 2 bytes of memory. But how is it taking 4 bytes?
How does each byte get 2, 0, 1, 3 from the input, instead of 20 in one byte and 13 in another byte?
I want to put 20 into one byte and 13 into another byte. How do I do that?
As indicated by its name, the sprintf function writes a formatted string. So your number 2013 is converted to the text "2013" (four characters plus the terminating NUL, five bytes in all).
You are invoking undefined behaviour.
You have allocated only 3 bytes for Date but are storing 5 bytes: four bytes for "2013" plus one NUL byte. So you should allocate at least 5 bytes if you want to store "2013".
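A minimal corrected sketch (my illustration, not the asker's code): allocate 6 bytes, enough for any unsigned short of up to 5 digits plus the NUL, and use snprintf so the buffer cannot be overrun:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    unsigned short y = 2013;
    char *Date = malloc(6);        /* 5 digits max for an unsigned short, plus NUL */
    if (Date == NULL)
        return 1;
    snprintf(Date, 6, "%hu", y);   /* snprintf never writes past the given size */
    printf("%s %zu %zu\n", Date, strlen(Date), sizeof y);  /* prints: 2013 4 2 */
    free(Date);
    return 0;
}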
If you want to transfer a stream of raw bytes instead, I suggest you do it the following way:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int main()
{
    unsigned char *Date = NULL;
    unsigned short y = 2013;
    unsigned char *p;

    p = (unsigned char *)&y;   /* view y as raw bytes */
    Date = malloc(3);
    Date[0] = *p;              /* low byte on a little-endian machine */
    Date[1] = *(p + 1);        /* high byte */
    Date[2] = 0;               /* NUL-terminate so Date is a valid string */
    printf("%s %zu %zu", Date, strlen((char *)Date), sizeof(y));
    return 0;
}
This outputs:
� 2 2
The strange character appears because raw byte values are being interpreted as a string. Plain char may be signed or unsigned depending on your implementation, so use unsigned char to avoid incorrect interpretation of the bytes.
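If the goal really is decimal 20 in one byte and decimal 13 in another, a sketch (my own suggestion, not part of the answer above) is to split the year arithmetically rather than copying its raw little-endian bytes (0xDD and 0x07 for 2013):

#include <stdio.h>

int main(void)
{
    unsigned short y = 2013;
    unsigned char Date[2];

    Date[0] = y / 100;                   /* 20 */
    Date[1] = y % 100;                   /* 13 */
    printf("%d %d\n", Date[0], Date[1]); /* prints: 20 13 */
    return 0;
}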
#include <stdio.h>
#include <wchar.h>

int main(void)
{
    wchar_t wc = L' 459';
    printf("%d", wc); // result: 32
}
I know that 'space' is decimal 32 in the ASCII code table.
What I don't understand is this: as far as I know, if there is not enough space in a variable to store a value, the variable keeps the 'last digits' (the low-order bits) of the original value.
For example, if I put the binary value 1100 1001 0011 0110 into a single-byte variable, it holds 0011 0110, the last byte of the original value.
But the code above keeps the first character of the original value.
I'd like to know what happens at the memory level when I execute the code above.
__int64 x = 0x0041'0042'0043'0044ULL;
printf("%016llx\n", x);  // prints 0041004200430044

wchar_t wc;
wc = x;                  // integer too large: truncated to the low-order 2 bytes
printf("%04X\n", wc);    // prints 0044 as you expect

wc = L'\x0041\x0042\x0043\x0044';  // multi-character literal: uses the first character
printf("%04X\n", wc);    // prints 0041
If you assign an integer value that is too large, it is truncated to the low-order 2 bytes, 0x0044, which fit.
If you try to assign several characters to one element, the compiler takes the first element, 0x0041, which fits. L'x' is meant to be a single wide character.
VS2019 will issue a warning for wchar_t wc = L' 459' unless the warning level is set below 3, but lowering it is not recommended; use warning level 3 or higher.
In C++, wchar_t is a distinct built-in type, not a typedef for unsigned short, but both are 2 bytes on Windows (wchar_t is 4 bytes on Linux).
Note that 'abcd' is 4 bytes. The L prefix indicates 2 bytes per element (on Windows), so L'abcd' is 8 bytes.
To see what is inside wc, let's look at the Unicode character L'X', which has the UTF-16 encoding 0x0058 (UTF-16 matches ASCII for code points below 128):
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <wchar.h>

int main(void)
{
    wchar_t wc = L'X';
    printf("%lc\n", wc);   /* avoid mixing wprintf and printf on the same stream */

    char buf[256];
    memcpy(buf, &wc, 2);   /* copy the 2 bytes of a Windows wchar_t */
    for (int i = 0; i < 2; i++)
        printf("%02X ", buf[i] & 0xff);
    printf("\n");
    return 0;
}
The output will be 58 00. It is not 00 58 because Windows runs on little-endian systems, where the low-order byte is stored first.
Another subtlety is that UTF-16 uses 4 bytes (a surrogate pair) for some code points. So you will get a warning for this line:
wchar_t wc = L'😀';
Instead, you want to use a string:
wchar_t *wstr = L"😀";
::MessageBoxW(0, wstr, 0, 0); //console may not display this correctly
This string will be 6 bytes: two surrogate elements plus the null terminator, three wchar_t elements in all.
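A quick sketch (assuming Windows' 16-bit wchar_t; on Linux, where wchar_t is 32 bits, the literal below behaves differently) that prints the two surrogate elements:

#include <stdio.h>
#include <wchar.h>

int main(void)
{
    /* U+1F600 encodes in UTF-16 as the surrogate pair 0xD83D 0xDE00 */
    const wchar_t *wstr = L"\xD83D\xDE00";  /* same elements as L"😀" on Windows */
    for (size_t i = 0; wstr[i] != L'\0'; i++)
        printf("%04X ", (unsigned)wstr[i]);
    printf("\n");                           /* prints: D83D DE00 */
    return 0;
}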
In the program below, why is a character's value printed properly when it is assigned to a normal int variable, but not when it is read through an int pointer?
Case 1
#include <stdio.h>
int main()
{
int *i = NULL;
char s = 'A';
i = (int *)&s; // storing
printf("i - %d\n",*i);
return 0;
}
Output :
i - 1837016897
Why 65 value not printed here?
Case 2
#include <stdio.h>
int main()
{
int *i = NULL;
char s = 'A';
i = (int *)&s; // storing
printf("i - %c\n",*i); // if we display character stored here
// then it is printed properly
return 0;
}
Output:
i - A
Case 3
#include <stdio.h>
int main()
{
int i = 0;
char s = 'A';
i = s;
printf("i - %d\n",i); // in this case data is properly printing
return 0;
}
Output:
i - 65
int is typically 4 bytes long and char is 1 byte. This means you cannot convert like this: you are taking the address of a 1-byte variable and telling the program to interpret it as a 4-byte variable, so whatever happens to sit next to it in memory is read as well.
1837016897 in hexadecimal is 0x6D7EA741, and the low byte (0x41) is decimal 65, so the character does show up in your result. (If you are wondering why it is the last byte rather than the first, that is endianness; you can read up on that yourself if you like.)
Your programs 1 and 2 exhibit undefined behaviour because they access an object of type char through an lvalue of type int. This is not allowed by section 6.5 paragraph 7 of the C standard (the strict aliasing rule).
Your program 3 is OK because the char value is implicitly converted to int on assignment, which is perfectly normal and well-defined.
i = (int *)&s;
This makes i point to the address of s. But since s is a char and i is an int *, dereferencing i with *i makes the compiler read a whole int (4 bytes, assuming a 32-bit int) starting at &s instead of 1 byte, unless you cast back to the narrower type, as in *(char *)i.
An int typically requires 4 bytes of storage while a char requires only 1 byte. So char s = 'A' stores a single byte with value 65 at memory address &s.
In case 1, you try to print the 4 bytes starting at &s as an integer. The memory adjacent to &s holds garbage values, hence you get 1837016897 instead of 65.
In case 2, you print with %c, which converts the value to a character and prints only that one byte.
In case 3, i = s stores the value of s, i.e. 65, into i, hence you get 65 as the output.
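For completeness, a small sketch (mine, not from the question) of the well-defined ways to get the character's value into an int:

#include <stdio.h>

int main(void)
{
    char s = 'A';
    int i;

    i = s;                    /* ordinary value conversion, as in case 3 */
    printf("i - %d\n", i);    /* prints: i - 65 */

    char *p = &s;             /* a pointer of the matching type is fine */
    printf("*p - %d\n", *p);  /* *p promotes to int; prints: *p - 65 */
    return 0;
}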
#include <stdio.h>
int main()
{
char * name = "bob";
int x = sizeof(name);
printf("%s is %d characters\n",name,x);
}
I have the above code. I want to print the number of characters in this string. It keeps printing 8 instead of 3. Why?
sizeof() returns a size in bytes. Specifically, it gives the number of bytes required to store an object of the operand's type. In this case sizeof() is returning the size of a pointer to the string, which on your computer is 8 bytes (64 bits).
strlen() is what you are looking for:
#include <stdio.h>
#include <string.h> // include string.h header to use strlen()
int main()
{
char * name = "bob";
int x = strlen(name); // use strlen() here
printf("%s is %d characters\n",name,x);
}
Use strlen() to find the length of a string.
Each character is at least 1 byte wide. It prints 8 because sizeof is measuring the pointer to "bob", and on your machine a pointer is 8 bytes wide.
strlen gives you the number of characters in a string. sizeof gives you the size of an object in bytes. On your system, an object of type char * is evidently 8 bytes wide.
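To make the difference concrete, a short sketch (my own example) comparing sizeof on a pointer, sizeof on an array, and strlen:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char *ptr = "bob";   /* pointer to a string literal */
    char arr[] = "bob";  /* array holding 'b','o','b','\0' */

    printf("sizeof ptr = %zu, sizeof arr = %zu, strlen = %zu\n",
           sizeof ptr, sizeof arr, strlen(ptr));
    /* on a typical 64-bit system: sizeof ptr = 8, sizeof arr = 4, strlen = 3 */
    return 0;
}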
I have two strings as below:
Str1: "1234\099"
Str2: "123499"
If I calculate the MD5 of these strings using unsigned char *MD5(const unsigned char *d, unsigned long n, unsigned char *md);
will the function MD5 calculate the hash over only the first 4 bytes of Str1, or will it take the complete string "1234\099"?
Will the MD5 of Str1 and Str2 be the same or different?
That all depends on what value you pass for the n parameter.
If you use strlen() to calculate n, then the MD5() function will process data up to, but not including, the first NUL byte (because you told it to, in effect).
If you pass the correct length for n (that is 7 for Str1), then MD5() will include all the data in the hash, including the NUL byte and the data past it.
The hashes for Str1 and Str2 will then be different (assuming you pass something larger than 4 for n; up to 4 bytes, both strings begin with the same prefix "1234").
The parameter const unsigned char *d points to the first element of an array of bytes, not to a C-style string.
So if you specify the proper length (7 bytes) of (const unsigned char *)"1234\099", you will obtain its MD5 in md.
Yes, Michael is right. I executed the program below, and it confirms that the MD5 calculation covers exactly the number of bytes given in the second argument, irrespective of any NUL bytes inside the string.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <openssl/md5.h>

int main() {
    char *p = NULL;
    char *c = NULL;

    p = malloc(10 * sizeof(char));
    strncpy(p, "ABCDEDFGO", 9);
    c = malloc(200 * sizeof(char));
    memset(c, 0, 200);
    p[3] = '\0';                          /* embed a NUL inside the data */
    MD5((unsigned char *)p, 9, (unsigned char *)c);
    printf("\n For 9 bytes hash %x", *c); /* *c is a plain char; sign extension adds the ffffff prefix */
    MD5((unsigned char *)p, 4, (unsigned char *)c);
    printf("\n For 4 bytes hash %x", *c);
    MD5((unsigned char *)p, 3, (unsigned char *)c);
    printf("\n For 3 bytes hash %x", *c);
    free(p);
    free(c);
    return 0;
}
For 9 bytes hash ffffffc9
For 4 bytes hash ffffffe5
For 3 bytes hash ffffff90
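For reference, a sketch (my addition, using OpenSSL's one-shot MD5() as above) that prints the whole 16-byte digest with %02x, so no bytes are sign-extended or dropped:

#include <stdio.h>
#include <openssl/md5.h>

int main(void)
{
    const unsigned char str1[] = "1234\099";  /* 7 data bytes, NUL in the middle */
    unsigned char md[MD5_DIGEST_LENGTH];

    MD5(str1, 7, md);                         /* hash all 7 bytes, embedded NUL included */
    for (int i = 0; i < MD5_DIGEST_LENGTH; i++)
        printf("%02x", md[i]);
    printf("\n");
    return 0;
}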
I think my understanding of byte arrays and char arrays is causing me some issues. Here is my problem:
I have an application that pulls messages from WebSphere MQ and sends them on to a target system.
An MQ message has an MQBYTE24 (essentially a 24-byte array) that represents the MSGID of the message. My goal is to convert this to a hexadecimal string.
In the WMQ Explorer on my Linux box, message 1 in the queue has a message identifier of "AMQ QM01" (at least that is what it looks like), and its bytes as displayed in the explorer are below:
00000 41 4D 51 20 51 4D 30 31--20 20 20 20 20 20 20 20 |AMQ QM01 |
00010 BD F4 A8 4E A2 A3 06 20-- |...N... |
Now when my code runs, I pick up that same message id and try to convert it to a hex string.
The exact message id while debugging is:
AMQ QM01 \275\364\250N\242\243\006
And after running it through my conversion (code below) I get:
414D5120514D30312020202020202020FFFFFF4EFFFF6
As you can see, it is slightly different from the one the WMQ Explorer shows. Any idea what I am doing wrong here?
I assume the conversion from the MQBYTE24 to char is where something is going wrong.
Below is a small sample program that reproduces the wrong result. I assume I must use a byte array instead of char?
The output for the following is:
Result: 414D5120514D30312020202020202020FFFFFF4EFFFF6
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int convertStrToHex(char *buffer, char str[]);

int main(){
    char name[41] = "AMQ QM01        \275\364\250N\242\243\006"; /* 8 spaces, per the hex dump above */
    char buffer[82] = "";
    char *pbuffer = buffer;

    int rc = convertStrToHex(buffer, name);
    printf("Result: %s\n", pbuffer);
    return 0;
}

int convertStrToHex(char *buffer, char str[]){
    int len = strlen(str);
    int i;
    for (i = 0; i < len; i++){
        sprintf(buffer, "%X", str[i]);  /* the bug discussed in the answers below */
        buffer += 2;
    }
    return 0;
}
Thanks for the help :-)
Lynton
Depending on the compiler and platform, char is signed or unsigned; with a signed char, negative byte values are sign-extended when promoted to int, and printf's output differs accordingly.
Just cast str[i] to unsigned char (or change the type of str in the function's prototype) and it will work. For example (prototype changed):
int convertStrToHex(char *buffer, unsigned char str[]){
    int len = strlen((const char *)str);  /* strlen expects char *, so cast */
    int i;
    for (i = 0; i < len; i++){
        sprintf(buffer, "%X", str[i]);
        buffer += 2;
    }
    return 0;
}
BTW: it is considered unsafe to pass a buffer without its allocated length and to use sprintf. You should use snprintf with the real length of the buffer, or at least enforce the size limit yourself inside the loop, in case strlen(str) * 2 is larger than the buffer's size.
As several other answers already point out, you need to cast the characters to unsigned char to keep them from being sign-extended with FF bytes when promoted to a 32-bit int. But there is actually another issue: that lone 6 at the end prints as only one character in the output. You want each byte to take up exactly two positions, so you need a zero-padded field width. Putting it all together, you get
sprintf(buffer, "%02X", (unsigned char)str[i]);
Try
sprintf(buffer, "%X", (unsigned char)str[i]);
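Putting the answers together, a corrected sketch of the converter (the name and the explicit-length interface are my choices; an MQBYTE24 is 24 bytes and may contain embedded NULs, so strlen() is not a reliable way to measure it):

#include <stdio.h>
#include <stddef.h>

int convertBytesToHex(char *buffer, size_t bufsize,
                      const unsigned char *bytes, size_t len)
{
    if (bufsize < len * 2 + 1)
        return -1;                                  /* not enough room for hex + NUL */
    for (size_t i = 0; i < len; i++)
        sprintf(buffer + i * 2, "%02X", bytes[i]);  /* two zero-padded digits per byte */
    return 0;
}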