Convert byte array / char array to hexadecimal string in C - c

I think my understanding of byte arrays and char arrays is causing me some issues. Here is my problem:
I have an application that pulls messages from Websphere MQ and sends them onto a target system.
An MQ message has a MQBYTE24 (essentially a 24-byte array) that represents the MSGID of the message. My goal is to convert this to a hexadecimal string.
In the WMQ explorer on my Linux box, message 1 in the queue has a message identifier of "AMQ QM01" (at least that is what it looks like), and the bytes are below as displayed in the explorer:
00000 41 4D 51 20 51 4D 30 31--20 20 20 20 20 20 20 20 |AMQ QM01 |
00010 BD F4 A8 4E A2 A3 06 20-- |...N... |
Now when my code runs I pick up that same message id and try to convert it to a hex string.
The exact message id while debugging is:
AMQ QM01 \275\364\250N\242\243\006
And after running it through my conversion (code below) I get:
414D5120514D30312020202020202020FFFFFF4EFFFF6
As you can see it is slightly different from the one the WMQ Explorer shows; any idea what I am doing wrong here?
I assume it is my conversion from the MQBYTE24 to char... something is going wrong there.
Below is a small sample program that produces the "wrong result". I assume I must use a byte array instead of char?
The output for the following is:
Result: 414D5120514D30312020202020202020FFFFFF4EFFFF6
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
int convertStrToHex(char *buffer, char str[10]);

int main(){
    char name[41] = "AMQ QM01        \275\364\250N\242\243\006"; /* eight spaces, as in the hex dump */
    char buffer[82] = "";
    char *pbuffer = buffer;
    int rc = convertStrToHex(buffer, name);
    printf( "Result: %s\n", pbuffer );
    return 0;
}
int convertStrToHex(char *buffer, char str[10]){
    int len = strlen(str);
    int i;
    for( i = 0; i < len; i++ ){
        sprintf(buffer, "%X", str[i]);
        buffer += 2;
    }
    return len;
}
Thanks for the help :-)
Lynton

Depending on the compiler and platform, char is signed or unsigned, and printf's behaviour differs accordingly.
Just cast str[i] to unsigned char (or change the type of str in the function's prototype) and it will work. For example (prototype changed):
int convertStrToHex(char *buffer, unsigned char str[10]){
    int len = strlen((const char *)str); /* cast: strlen expects char * */
    int i;
    for( i = 0; i < len; i++ ){
        sprintf(buffer, "%X", str[i]);
        buffer += 2;
    }
    return len;
}
BTW: it is considered unsafe to pass a buffer without its allocated length and to use sprintf. You should use snprintf with the real length of the buffer, or at least enforce the size limit yourself inside the loop, in case strlen(str) is larger than half the buffer's size.

As several other answers already point out, you need to cast the characters to unsigned char to keep them from being sign-extended to a full int's width (which is where the runs of FF come from). But there's actually another issue: that lone number 6 at the end will only print as one character in the output. You want each character to take up exactly two positions, so you need a zero-padded field specifier. Putting it all together, you get
sprintf(buffer, "%02X", (unsigned char)str[i]);

Try
sprintf(buffer, "%X", (unsigned char)str[i]);

Related

Sliced string and char buffer are not equal

I want to extract the packet number from the received message and compare it with the originally sent packet_count. I am trying the following code:
char string[] = "Hello client 0000!";
char new[4];
char bufNew[4];
int i=0;
int count=0;
int num=0;
int packet_counter=0;
for (i=13;i<17;i++)
{
new[count] = string[i];
count = count+1;
}
new[4]='\0';
printf("Sliced no is: %s\n",new);
sprintf(bufNew, "%04d", packet_counter);
bufNew[4]='\0';
printf("packet counter is: %s\n",bufNew);
printf("String compare result: %d\n",strcmp(new,bufNew));
Although the outputs look the same, the strings are different.
Output:
Sliced no is: 0000
packet counter is: 0000
String compare result: 48
Please guide how to make both these strings equal.
It is likely that sprintf(bufNew, "%04d", packet_counter); overwrites the terminating 0 of new (since, as others said, new is written out of its bounds). Try moving the printf("Sliced no is: %s\n",new); down below bufNew[4]='\0'; to see it for yourself.
To fix it, new and bufNew need to be declared with size 5:
char new[5];
char bufNew[5];

sscanf returning -1 while scanning short int from a memory pointer in C?

I am trying to scan a short int (16 bits/2 bytes) from a memory pointer using sscanf as shown below. But it is showing some weird behaviour.
#include <stdio.h>
#include <errno.h>
int main()
{
unsigned char p[] = {0x00, 0x1A};
short int s = 0;
int ret = sscanf(p,"%hu",&s);
printf("Actual data %02x %02x\n",p[0],p[1]);
printf("s = %hu ret = %d errno = %d\n",s,ret,errno);
return 0;
}
Output:
Actual data 00 1a
s = 0 ret = -1 errno = 0
Please help me to understand what I am doing wrong !
p is pointing at a 0x00 byte, so it looks like an empty string, hence there is nothing for sscanf to parse.
It looks like you just want to copy those two bytes verbatim into your short, not convert a C string to a short. If so, then instead of the sscanf call you could just use memcpy:
memcpy(&s, p, sizeof(s));
Note that this assumes that your buffer and the destination short have the same endianness. If not then you will also need to take care of the byte order.

How much space to allocate for printing long int value in string?

I want to store a long value (LONG_MAX in my test program) in a dynamically allocated string, but I'm confused how much memory I need to allocate for the number to be displayed in the string.
My first attempt:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <limits.h>
int main(void)
{
char *format = "Room %lu somedata\n";
char *description = malloc(sizeof(char) * strlen(format) + 1);
sprintf(description, format, LONG_MAX);
puts(description);
return 0;
}
Compiled with
gcc test.c
And then running it (and piping it into hexdump):
./a.out | hd
Returns
00000000 52 6f 6f 6d 20 39 32 32 33 33 37 32 30 33 36 38 |Room 92233720368|
00000010 35 34 37 37 35 38 30 37 20 62 6c 61 62 6c 61 0a |54775807 blabla.|
00000020 0a |.|
00000021
Looking at the output, it seems my memory allocation of sizeof(char) * strlen(format) + 1 is wrong (too little memory allocated) and it only works by accident?
What is the correct amount to allocate then?
My next idea was (pseudo-code):
sizeof(char) * strlen(format) + strlen(LONG_MAX) + 1
This seems too complicated and pretty non-idiomatic. Or am I doing something totally wrong?
You are doing it totally wrong. LONG_MAX is an integer, so you can't call strlen() on it. And it's not the value that gives the longest result anyway: LONG_MIN is, because it prints a minus sign as well.
A nice method is to write a function
char* mallocprintf (...)
which has the same arguments as printf and returns a string allocated using malloc with the exactly right length. How you do this: First figure out what a va_list is and how to use it. Then figure out how to use vsnprintf to find out how long the result of printf would be without actually printing. Then you call malloc, and call vsnprintf again to produce the string.
This has the big advantage that it works when you print strings using %s, or strings using %s with some large field length. Guess how many characters %999999d prints.
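A sketch of that mallocprintf idea (this exact function is not standard; the measuring call vsnprintf(NULL, 0, ...) is well-defined since C99):

```c
#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>

/* Measure first with vsnprintf(NULL, 0, ...), then allocate exactly
   enough and format for real. Returns NULL on failure. */
char *mallocprintf(const char *fmt, ...)
{
    va_list ap, ap2;
    va_start(ap, fmt);
    va_copy(ap2, ap);                      /* vsnprintf consumes a va_list */
    int len = vsnprintf(NULL, 0, fmt, ap); /* length without writing */
    va_end(ap);

    char *s = NULL;
    if (len >= 0 && (s = malloc((size_t)len + 1)) != NULL)
        vsnprintf(s, (size_t)len + 1, fmt, ap2);
    va_end(ap2);
    return s;
}
```

Usage would then be char *description = mallocprintf("Room %ld somedata\n", LONG_MAX); followed by free(description) when done.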
You can use snprintf() to figure out the length without worrying about the size of LONG_MAX.
When you call snprintf with a NULL string, it returns the number of bytes that would have been required to write the output, so you know exactly how many bytes are needed.
char *format = "Room %lu somedata\n";
int len = snprintf(0, 0, format, LONG_MAX); // returns the number of bytes
                                            // that would have been written
char *description = malloc(len + 1);
if(!description)
{
    /* error handling */
}
snprintf(description, len + 1, format, LONG_MAX);
Convert the predefined constant numeric value to a string, using macro expansion as explained in convert digital to string in macro:
#define STRINGIZER_(exp) #exp
#define STRINGIZER(exp) STRINGIZER_(exp)
(code courtesy of Whozcraig). Then you can use
int max_digit = strlen(STRINGIZER(LONG_MAX))+1;
or
int max_digit = strlen(STRINGIZER(LONG_MIN));
for signed values, and
int max_digit = strlen(STRINGIZER(ULONG_MAX));
for unsigned values.
Since the value of LONG_MAX is a compile-time, not a run-time value, you are ensured this writes the correct constant for your compiler into the executable.
To allocate enough room, consider worst case
// Over approximate log10(pow(2,bit_width))
#define MAX_STR_INT(type) (sizeof(type)*CHAR_BIT/3 + 3)
char *format = "Room %lu somedata\n";
size_t n = strlen(format) + MAX_STR_INT(unsigned long) + 1;
char *description = malloc(n);
sprintf(description, format, LONG_MAX);
Pedantic code would consider potential other locales
snprintf(description, n, format, LONG_MAX);
Yet in the end, recommend a 2x buffer
char *description = malloc(n*2);
sprintf(description, format, LONG_MAX);
Note: printing with the specifier "%lu", meant for unsigned long, while passing a long (LONG_MAX) is undefined behavior. Suggest ULONG_MAX:
sprintf(description, format, ULONG_MAX);
With credit to the answer by #Jongware, I believe the ultimate way to do this is the following:
#define STRINGIZER_(exp) #exp
#define STRINGIZER(exp) STRINGIZER_(exp)
const size_t LENGTH = sizeof(STRINGIZER(LONG_MAX)) - 1;
The string conversion turns it into a string literal, which includes a terminating null character; hence the -1.
And note that since everything is a compile-time constant, you could just as well declare the string as
const char *format = "Room " STRINGIZER(LONG_MAX) " somedata\n";
You cannot use the format string's length alone. You need to observe:
LONG_MAX = 2147483647 = 10 characters
"Room somedata\n" = 15 characters
Add the null terminator = 26 characters
so
malloc(26)
should suffice.
You have to allocate a number of chars equal to the number of digits of LONG_MAX, which is 2147483647: that is 10 digits more.
In your format string you have:
Room = 4 chars
somedata\n = 9 chars
spaces = 2 chars
null termination = 1 char
Then you have to malloc 26 chars.
If you want to determine at runtime how many digits your number has, count them digit by digit:
int count = (n == 0) ? 1 : 0; /* 0 still prints as one digit */
while (n != 0)
{
    n /= 10; /* n = n/10 */
    ++count;
}
Another way is to sprintf into a temporary local buffer first, and then malloc strlen(tempStr)+1 chars.
Usually this is done by formatting into a "known" large-enough buffer on the stack and then dynamically allocating whatever is needed to fit the formatted string, i.e.:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <limits.h>
int main(void)
{
    char buffer[1024];
    sprintf(buffer, "Room %lu somedata\n", LONG_MAX);
    char *description = malloc( strlen( buffer ) + 1 );
    strcpy( description, buffer );
    puts(description);
    return 0;
}
Here, your use of strlen(format) to allocate memory is a bit problematic: it allocates space for the three characters of %lu themselves, not for the width of the value that %lu will actually print.
You should consider the max possible value for unsigned long:
ULONG_MAX 4294967295
which is 10 chars when printed.
So, you have to allocate space for:
The actual string (the literal characters), plus
10 chars (at max) for the printed value of %lu, plus
1 char to represent a - sign, in case the value is negative, plus
1 null terminator.
Well, if long is a 32-bit on your machine, then LONG_MAX should be 2147483647, which is 10 characters long. You need to account for that, the rest of your string, and the null character.
Keep in mind that long is a signed value with maximum value of LONG_MAX, and you are using %lu (which is supposed to print an unsigned long value). If you can pass a signed value to this function, then add an additional character for the minus sign, otherwise you might use ULONG_MAX to make it clearer what your limits are.
If you are unsure which architecture you are running on, you might use something like:
// this should work for signed 32 or 64 bit values
#define NUM_CHARS ((sizeof(long) == 8) ? 21 : 11)
Or, play safe and simply use 21. :)
Use the following code to calculate the number of characters necessary to hold the decimal representation of any positive integer:
#include <math.h>
...
size_t characters_needed_decimal(long long unsigned int llu)
{
    size_t s = 1;
    if (0LL != llu)
    {
        s += log10(llu);
    }
    return s;
}
Mind to add 1 when using a C-"string" to store the number, as C-"string"s are 0-terminated.
Use it like this:
#include <limits.h>
#include <stdlib.h>
#include <stdio.h>
size_t characters_needed_decimal(long long unsigned int);
int main(void)
{
    size_t s = characters_needed_decimal(LONG_MAX);
    ++s; /* 1+ as we want a sign */
    char *p = malloc(s + 1); /* add one for the 0-termination */
    if (NULL == p)
    {
        perror("malloc() failed");
        exit(EXIT_FAILURE);
    }
    sprintf(p, "%ld", LONG_MAX);
    printf("LONG_MAX = %s\n", p);
    sprintf(p, "%ld", LONG_MIN);
    printf("LONG_MIN = %s\n", p);
    free(p);
    return EXIT_SUCCESS;
}
Safest:
Rather than predicting the allocation needed, use asprintf(). This function allocates memory as needed.
char *description = NULL;
asprintf(&description, "Room %lu somedata\n", LONG_MAX);
asprintf() is not standard C, but it is common on *nix systems, and its source code is available to port to other systems.

is there a way to get full array from base address in c?

I am trying to pass an "unsigned char *" by address to another program via the "execl" command.
Here is the first program:
unsigned char myString;
...
unsigned char * myarr = malloc(80*sizeof(char));
...
//myarr is filled with some encrypted data
...
printf("\nresult:\t");
for(i=0;*(myarr+i)!=0x00;i++)
printf("%X ",*(myarr+i));
...
myString = malloc(80*sizeof(char));
myString = *myarr;
...
execl(".../Child", "Child", &myString, NULL);
On the second program;
unsigned char *myString;
...
myString = (unsigned char *)argv[1];
...
unsigned char * mynewarr = malloc(80*sizeof(char));
mynewarr = myString;
...
printf("\nresult:\t");
for(i=0;*(mynewarr+i)!=0x00;i++)
printf("%X ",*(mynewarr+i));
Here are the results I got.
First program:
result: 20 DD 3E 99 2 94 7E C6 D DD 4 A 36 85 5B DA
Second program:
result: 20
Why are the results different? What am I doing wrong? Please help me.
I am using Eclipse, and I am coding on Ubuntu 13.10.
I apologize. I completely misunderstood your question and jumped the gun.
The arguments to execl have to be null terminated strings and you are just passing myString incorrectly. That said there are caveats to what you are doing. You can't have embedded nulls in the encrypted data or they will be interpreted (prematurely) as the end of the string. As Zan Lynx notes, you can text encode your strings if they are binary. You also can't pass a string of unbounded length. There are system limits on how big the combined size of the argument list and environment can be and if exceeded execl will fail with E2BIG. (My initial misunderstanding was thinking you were trying to get around this limitation.)
Here is minimal working example of your programs (some liberties taken).
First program:
int main(int argc, char *argv[])
{
    unsigned char *myString = malloc(80 * sizeof(char));
    strcpy(myString, "filled with secret sauce");
    printf("\nresult:\t");
    for (int i = 0; *(myString + i) != '\0'; i++)
        printf("%X ", *(myString + i));
    printf("\n");
    execl("./execpgm", "execpgm", myString, (char *) NULL);
    perror("execl");
    exit(1);
}
Second program:
int main(int argc, char *argv[])
{
    unsigned char *mynewarr = malloc(80 * sizeof(unsigned char));
    strncpy(mynewarr, argv[1], 80);
    printf("\nresult:\t");
    for (int i = 0; *(mynewarr + i) != '\0'; i++)
        printf("%X ", *(mynewarr + i));
    printf("\n");
    exit(1);
}
Again, my apologies for answering without first understanding.
You can fix this by writing the complete string into the command line of the new process or by writing into a temporary file and passing that filename.
You said it was encrypted data. Which means that it probably contains zeros. Which means that it won't work to just write a string into the command line. Although you could write it in an escaped format like '\0' for null and then '\\' for backslash, or you could use BASE64 encoding.
But writing a pointer will not work because the memory space is wiped out when exec runs.

Char isn't converting to int

For some reason my C program is refusing to convert elements of argv into ints, and I can't figure out why.
int main(int argc, char *argv[])
{
fprintf(stdout, "%s\n", argv[1]);
//Make conversions to int
int bufferquesize = (int)argv[1] - '0';
fprintf(stdout, "%d\n", bufferquesize);
}
And this is the output when running ./test 50:
50
-1076276207
I have tried removing the (int), and throwing both a * and an & between (int) and argv[1]; the former gave me 5 but not 50, while the latter gave output similar to the one above. Removing the - '0' operation doesn't help much. I also tried making a char first = argv[1] and using first for the conversion instead, which weirdly enough gave me 17 regardless of input.
I'm extremely confused. What is going on?
Try using atoi(argv[1]) ("ascii to int").
argv[1] is a char *, not a char; you can't meaningfully convert a char * to an int. If you want to convert the first character in argv[1] to an int you can do:
int i = (int)(argv[1][0] - '0');
I just wrote this
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    printf("%s\n", argv[1]);
    int i = (int)(argv[1][0] - '0');
    printf("%d\n", i);
    return 0;
}
and ran it like this
./testargv 1243
and got
1243
1
You are just trying to convert a char * to an int, which of course doesn't make much sense. You probably need to do it like this:
int bufferquesize = 0;
for (int i = 0; argv[1][i] != '\0'; ++i) {
    bufferquesize *= 10;
    bufferquesize += argv[1][i] - '0';
}
This assumes that argv[1] ends with '\0' — which the C standard in fact guarantees for the argv strings.
(type) exists to cast types: to change the way the program interprets a piece of memory. A char * is a pointer to an array of chars, and chars are one-byte integers. argv[1] points to the first character; check here for a quick explanation of pointers in C. So your "string" is represented in memory as:
['5']['0']['\0']
When you cast
int i = (int) *argv[1]
you are only converting the first element to an int, which is why you only got the first digit.
The function you're looking for is either atoi(), as mentioned by Scott Hunter, or strtol(), which I prefer because of its error-detecting behaviour.