I've just recently been required to work with C—I normally work with Python and a bit of Java—and I've been running into some issues.
I created a function that converts a base-10 unsigned int into a character array that represents the equivalent hex. Now I need to be able to set a variable with type uint32_t to this 'hex'; what can I do to make sure this char[] is treated as an actual hex value?
The code is below:
int DecToHex(long int conversion, char regParams[])
{
    int hold[8];
    for (int index = 0; conversion > 0; index++)
    {
        hold[index] = conversion % 16;
        conversion = conversion / 16;
    }
    int j = 0;
    for (int i = 7; i > -1; i--)
    {
        if (hold[i] < 10 && hold[i] >= 0)
        {
            regParams[j] = '0' + hold[i];
        }
        else if (hold[i] > 9 && hold[i] < 16)
        {
            regParams[j] = '7' + hold[i];
        }
        else
        {
            j--;
        }
        j++;
    }
    return 0;
}
You should just use snprintf:
int x = 0xDA8F;
char buf[9];
snprintf(buf, sizeof(buf), "%X", x); // Use %x for lowercase hex digits
To convert a hex representation of a number to an int, use strtol (the third argument to it lets you specify the base), or, since you want to assign it to an unsigned data type, strtoul.
The code would look something like this:
char* HexStr = "8ADF4B";
uint32_t Num = strtoul(HexStr, NULL, 16);
printf("%X\n", Num); // Outputs 8ADF4B
unsigned long long int power(int base, unsigned int exponent)
{
    if (exponent == 0)
        return 1;
    else
        return base * power(base, exponent - 1);
}
I am working on a program where I need to take in a string of 8 characters (e.g. "I want t") then convert this into a long long int in the pack function. I have the pack function working fine.
unsigned long long int pack(char unpack[])
{
    /* converting string to long long int here;
       didn't post the code because it's large */
}
After I enter "I want t" I get "Value in Decimal = 5269342824372117620" and then I send the decimal to the unpack function. So I need to convert 5269342824372117620 back into "I want t". I tried bit manipulation, which was unsuccessful; any help would be greatly appreciated.
void unpack(long long int pack)
{
    long long int bin;
    char convert[100];
    for (int i = 63, j = 0, k = 0; i >= 0; i--, j++)
    {
        if ((pack & (1 << i)) != 0)
            bin += power(2, j);
        if (j % 8 == 0)
        {
            convert[k] = (char)bin;
            bin = 0;
            k++;
            j = -1;
        }
    }
    printf("String: %s\n", convert);
}
A simple solution for your problem is to consider the characters in the string to be digits in a large base that encompasses all possible values. For example base64 encoding can convert strings of 8 characters to 48-bit numbers, but you can only use a subset of at most 64 different characters in the source string.
To convert any 8 byte string into a number, you must use a base of at least 256.
Given your extra input ("After I enter 'I want t' I get 'Value in Decimal = 5269342824372117620'"), and since 5269342824372117620 == 0x492077616e742074, you do indeed use base 256, big-endian order and ASCII encoding for the characters.
Here is a simple portable pack function for this method:
unsigned long long pack(const char *s) {
    unsigned long long x = 0;
    int i;
    for (i = 0; i < 8; i++) {
        x = x * 256 + (unsigned char)s[i];
    }
    return x;
}
The unpack function is easy to derive: compute the remainders of divisions in the reverse order:
char *unpack(char *s, unsigned long long x) {
    /* s is assumed to have a length of at least 9 */
    int i;
    for (i = 8; i-- > 0; ) {
        s[i] = x % 256;
        x = x / 256;
    }
    s[8] = '\0';  /* set the null terminator */
    return s;
}
For a potentially faster but less portable solution, you could use this, but you would get a different conversion on little-endian systems such as current Macs and PCs:
#include <string.h>
unsigned long long pack(const char *s) {
    unsigned long long x;
    memcpy(&x, s, 8);
    return x;
}

char *unpack(char *s, unsigned long long x) {
    memcpy(s, &x, 8);
    s[8] = '\0';
    return s;
}
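A quick round-trip check might look like this (my own example; the decimal value shown assumes the portable, big-endian pack above and an ASCII machine):

#include <stdio.h>

int main(void) {
    char buf[9];
    unsigned long long v = pack("I want t");

    printf("%llu\n", v);             /* 5269342824372117620 with the portable pack */
    printf("%s\n", unpack(buf, v));  /* I want t */
    return 0;
}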
Right now I am trying to convert an int to a char in C programming. After doing research, I found that I should be able to do it like this:
int value = 10;
char result = (char) value;
What I would like is for this to return 'A' (and for 0-9 to return '0'-'9') but this returns a new line character I think.
My whole function looks like this:
char int2char(int radix, int value) {
    if (value < 0 || value >= radix) {
        return '?';
    }
    char result = (char) value;
    return result;
}
To convert an int to a char you do not have to do anything:
char x;
int y;
/* do something */
x = y;
If you only want to convert one int value to the char holding the printable (usually ASCII) digit, like in your example:
const char digits[] = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
int inttochar(int val, int base)
{
    return digits[val % base];
}
If you want to convert to a string (char *), then you need to use one of the standard functions like sprintf, itoa, ltoa, utoa, ultoa ... or write one yourself:
char *reverse(char *str);
const char digits[] = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
char *convert(int number, char *buff, int base)
{
    char *result = (buff == NULL || base > strlen(digits) || base < 2) ? NULL : buff;
    char sign = 0;

    if (number < 0)
    {
        sign = '-';
    }
    if (result != NULL)
    {
        do
        {
            *buff++ = digits[abs(number % base)];
            number /= base;
        } while (number);
        if (sign) *buff++ = sign;
        if (!*result) *buff++ = '0';
        *buff = 0;
        reverse(result);
    }
    return result;
}
A portable way of doing this would be to define a
const char* foo = "0123456789ABC...";
where ... are the rest of the characters that you want to consider.
Then foo[value] will evaluate to a particular char. For example, foo[0] will be '0', and foo[10] will be 'A'.
If you assume a particular encoding (such as the common but by no means ubiquitous ASCII) then your code is not strictly portable.
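For instance, a minimal self-contained sketch of this table-lookup approach (the name digit_for and the bounds check are my additions, not part of the original answer):

#include <stdio.h>

/* Maps 0..15 to '0'..'9' and 'A'..'F' by indexing into a table. */
static char digit_for(int value) {
    static const char *table = "0123456789ABCDEF";
    if (value < 0 || value > 15)
        return '?';          /* out of range for this 16-entry table */
    return table[value];
}

int main(void) {
    printf("%c %c %c\n", digit_for(0), digit_for(10), digit_for(15)); /* prints: 0 A F */
    return 0;
}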
Characters use an encoding (typically ASCII) to map numbers to a particular character. The codes for the characters '0' to '9' are consecutive, so for values less than 10 you add the value to the character constant '0'. For values 10 or more, you add the value minus 10 to the character constant 'A':
char result;

if (value >= 10) {
    result = 'A' + value - 10;
} else {
    result = '0' + value;
}
Converting Int to Char
I take it that OP wants more than just a 1-digit conversion, as radix was supplied.
To convert an int into a string (not just 1 char), there is the sprintf(buf, "%d", value) approach.
To do so in any radix, string management becomes an issue, as well as dealing with the corner case of INT_MIN.
The following C99 solution returns a char* whose lifetime is valid to the end of the block. It does so by providing a compound literal via the macro.
#include <stdio.h>
#include <stdlib.h>
#include <limits.h>
// Maximum buffer size needed
#define ITOA_BASE_N (sizeof(unsigned)*CHAR_BIT + 2)
char *itoa_base(char *s, int x, int base) {
    s += ITOA_BASE_N - 1;
    *s = '\0';
    if (base >= 2 && base <= 36) {
        int x0 = x;
        do {
            *(--s) = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"[abs(x % base)];
            x /= base;
        } while (x);
        if (x0 < 0) {
            *(--s) = '-';
        }
    }
    return s;
}
#define TO_BASE(x,b) itoa_base((char [ITOA_BASE_N]){0} , (x), (b))
Sample usage and tests
void test(int x) {
    printf("base10:% 11d base2:%35s base36:%7s ", x, TO_BASE(x, 2), TO_BASE(x, 36));
    printf("%ld\n", strtol(TO_BASE(x, 36), NULL, 36));
}

int main(void) {
    test(0);
    test(-1);
    test(42);
    test(INT_MAX);
    test(-INT_MAX);
    test(INT_MIN);
}
Output
base10: 0 base2: 0 base36: 0 0
base10: -1 base2: -1 base36: -1 -1
base10: 42 base2: 101010 base36: 16 42
base10: 2147483647 base2: 1111111111111111111111111111111 base36: ZIK0ZJ 2147483647
base10:-2147483647 base2: -1111111111111111111111111111111 base36:-ZIK0ZJ -2147483647
base10:-2147483648 base2: -10000000000000000000000000000000 base36:-ZIK0ZK -2147483648
Ref How to use compound literals to fprintf() multiple formatted numbers with arbitrary bases?
Check out the ASCII table.
The values stored in a char are interpreted as the characters corresponding to that table. The value 10 is a newline character.
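A tiny demonstration of that point (my own example, assuming an ASCII-compatible system):

#include <stdio.h>

int main(void) {
    printf("[%c]\n", 10);       /* value 10 is a newline, so [ and ] end up on separate lines */
    printf("[%c]\n", '0' + 9);  /* prints [9] */
    return 0;
}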
So characters in C are based on ASCII (or UTF-8, which is backwards-compatible with ASCII codes). This means that under the hood, 'A' is actually the number 65 (except in binary rather than decimal). All a char is in C is an integer with enough bytes to represent every ASCII character. If you want to convert an int to a char, you'll need to instruct the computer to interpret the bytes of an int as ASCII values - and it's been a while since I've done C, but I believe the compiler will complain since char holds fewer bytes than int. This means we need a function, as you've written. Thus,
if(value < 10) return '0'+value;
return 'A'+value-10;
will be what you want to return from your function. Keep your bounds checks with radix as you've done; imho that is good practice in C.
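Put together with the question's signature, a minimal sketch might look like this (my own version, not the exact code from the answer):

#include <stdio.h>

/* Maps value in [0, radix) to '0'..'9' then 'A'..'Z'; returns '?' when out of range. */
char int2char(int radix, int value) {
    if (value < 0 || value >= radix)
        return '?';
    if (value < 10)
        return '0' + value;
    return 'A' + value - 10;
}

int main(void) {
    printf("%c %c %c\n", int2char(16, 9), int2char(16, 10), int2char(16, 15)); /* 9 A F */
    return 0;
}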
1. Converting int to char by type casting
Source File charConvertByCasting.c
#include <stdio.h>

int main(){
    int i = 66;              // ~~Type Casting Syntax~~
    printf("%c", (char) i);  // (type_name) expression
    return 0;
}
Executable charConvertByCasting.exe command line output:
C:\Users\boqsc\Desktop\tcc>tcc -run charconvert.c
B
Additional resources:
https://www.tutorialspoint.com/cprogramming/c_type_casting.htm
https://www.tutorialspoint.com/cprogramming/c_data_types.htm
2. Convert int to char by assignment
Source File charConvertByAssignment.c
#include <stdio.h>

int main(){
    int i = 66;
    char c = i;
    printf("%c", c);
    return 0;
}
Executable charConvertByAssignment.exe command line output:
C:\Users\boqsc\Desktop\tcc>tcc -run charconvert.c
B
You can do
char a;
a = '0' + 5;
You will get the character representation of that number.
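For example (a quick check of my own):

#include <stdio.h>

int main(void) {
    char a = '0' + 5;
    printf("%c\n", a);  /* prints 5 */
    return 0;
}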
Borrowing the idea from the existing answers, i.e. making use of an array index.
Here is a "just works" simple demo for "integer to char[]" conversion in base 10, without any of <stdio.h>'s printf family interfaces.
Test:
$ cc -o testint2str testint2str.c && ./testint2str
Result: 234789
Code:
#include <stdio.h>
#include <string.h>

static char digits[] = "0123456789";

void int2str (char *buf, size_t sz, int num);

/*
  Test:
      cc -o testint2str testint2str.c && ./testint2str
*/
int
main ()
{
    int num = 234789;
    char buf[1024] = { 0 };
    int2str (buf, sizeof buf, num);
    printf ("Result: %s\n", buf);
}

void
int2str (char *buf, size_t sz, int num)
{
    /*
      Convert integer type to char*, in base-10 form.
    */
    char *bufp = buf;
    int i = 0;

    // NOTE-1
    void __reverse (char *__buf, int __start, int __end)
    {
        char __bufclone[__end - __start];
        int i = 0;
        int __nchars = sizeof __bufclone;
        for (i = 0; i < __nchars; i++)
        {
            __bufclone[i] = __buf[__end - 1 - i];
        }
        memmove (__buf, __bufclone, __nchars);
    }

    while (num > 0)
    {
        bufp[i++] = digits[num % 10];  // NOTE-2
        num /= 10;
    }

    __reverse (buf, 0, i);

    // NOTE-3
    bufp[i] = '\0';
}

// NOTE-1:
//   "Nested function" is a GNU C extension. Put it outside if not
//   compiled by GCC.
// NOTE-2:
//   10 can be replaced by any radix, like 16 for hexadecimal output.
// NOTE-3:
//   Make sure to insert the trailing null terminator after everything
//   is done.
I am converting an unsigned char / int to hex; the trouble is that it does not translate correctly according to the ASCII table:
original: 0D, converted: AD
code:
char int_to_hex(int d) {
    if (d < 0) return -1;
    if (d > 16) return -1;
    if (d <= 9) return '0' + d;
    d -= 10;
    return (char)('A' + d);
}

void uint_to_hex(unsigned char in, char **out, int *olen) {
    int i = 0;
    int remain[2];
    int result = (int)in;

    while (result) {
        remain[i++] = result % 16;
        result /= (int)16;
    }
    for (i = 1; i >= 0; --i) {
        char c = int_to_hex(remain[i]);
        if ((int)c == -1) { continue; }
        *((*out) + (*olen)) = (char)c; (*olen)++;
    }
}
Where is it incorrect?
First of all, one hex digit covers values from 0 (0) to 15 (F), not from 0 to 16 (as you seem to assume in the int_to_hex function).
Also, you don't really need the cast to char in the last return of that function.
In fact, all the casts here are rather unnecessary,
and casting the literal 16 is purely absurd...
And how does the sample input for uint_to_hex look? Are you aware that you can only pass a single character as the input (in)?
I think a plain char pointer would suffice for out; you want to return an array, right?
So instead of ** there should be only a single *.
And I don't understand why olen has to be a pointer (and a function argument, too).
You should also try to write more concise and readable code; it will help you understand what you are doing, too.
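To illustrate those points, here is one possible cleaned-up sketch (my own code, with a renamed uchar_to_hex and a plain char * output buffer of at least 3 bytes; not the reviewer's implementation):

#include <stdio.h>

/* One hex digit covers 0..15; callers only pass nibbles, so no error case is needed. */
static char int_to_hex(int d) {
    return d <= 9 ? '0' + d : 'A' + (d - 10);
}

/* Writes exactly two hex digits plus a null terminator into out. */
static void uchar_to_hex(unsigned char in, char *out) {
    out[0] = int_to_hex(in / 16);  /* high nibble */
    out[1] = int_to_hex(in % 16);  /* low nibble */
    out[2] = '\0';
}

int main(void) {
    char buf[3];
    uchar_to_hex(0x0D, buf);
    printf("%s\n", buf);  /* prints 0D */
    return 0;
}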
I'm trying to convert a string of binary characters to an integer value.
For example, given "100101101001", I would split it into four-character segments using a for loop, then store them in array[4]. However, whenever I use the function atoi(), I encounter a problem where it does not convert the character string properly if the string starts with "0".
An example would be "1001" = 1001, but if it is 0110 it would be converted to 110; also, 0001 would become only 1.
Here is the code that I made:
for (i = 0; i < strlen(store); i++)
{
    bits[counter] = store[i];
    counter++;
    if (counter == 4)
    {
        sscanf(bits, "%d", &testing);
        printf("%d\n", testing);
        counter = 0;
    }
}
The atoi() function only converts decimal numbers, in base 10.
You can use strtoul() to convert binary numbers, by specifying a base argument of 2. There is no need to "split" the string, and leading zeroes won't matter of course (as they shouldn't: 00010 in base 2 is equal to 10 in base 2):
const char *binary = "00010";
unsigned long value;
char *endp = NULL;
value = strtoul(binary, &endp, 2);
if(endp != NULL && *endp == '\0')
printf("converted binary '%s' to integer %lu\n", binary, value);
atoi() converts a char array to an int assuming base 10, not binary.
You can use the following function:
#include <string.h>

int chartobin(char *s, unsigned int *x) {
    int len = strlen(s), bit;

    *x = 0;
    if (len > 32 || len < 1) return -1;
    while (*s) {
        bit = (*s++ - '0');
        if ((bit & (~1U)) != 0) return -1;   /* reject anything but '0' and '1' */
        if (bit) *x += (1U << (len - 1));    /* 1U avoids signed overflow when len == 32 */
        len--;
    }
    return 0;
}
Tested and it works
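A quick usage check might look like this (my own example, using the chartobin above):

#include <stdio.h>

int main(void) {
    unsigned int x;

    if (chartobin("0110", &x) == 0)
        printf("%u\n", x);  /* prints 6 */
    return 0;
}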
I have a string (unsigned char) and I want to fill it with only hex characters.
My code is:
unsigned char str[STR_LEN] = {0};

for (i = 0; i < STR_LEN; i++) {
    sprintf(str[i], "%x", rand() % 16);
}
Of course, when running this I get a segfault.
A string is an array of chars, not unsigned chars.
You are using str[i] (which is of type unsigned char) as the 1st argument to sprintf, but it requires a char * (pointer).
This should be a little better:
char str[STR_LEN + 1];

for (i = 0; i < STR_LEN; i++) {
    sprintf(str + i, "%x", rand() % 16);
}
The first argument to sprintf() should be a char*, but str[i] is a single character, not a pointer: this is the cause of the segmentation fault. The compiler should have emitted a warning about this. gcc main.c, without specifying a high warning level, emitted the following:
warning: passing argument 1 of sprintf makes pointer from integer without a cast
A hex representation of a character can be 1 or 2 characters (9 or AB for example). For formatting, set the field width to 2 and the fill character to 0 ("%02X"). You also need to add one character for the terminating null to str, and set the step of the for loop to 2 instead of 1 (to prevent overwriting the previous value):
unsigned char str[STR_LEN + 1] = {0};
int i;

for (i = 0; i < STR_LEN; i += 2)
{
    sprintf((char *)&str[i], "%02X", rand() % 16);
}
You could try something like this:
#include <stdio.h>
#include <stdlib.h>

#define STR_LEN 20

int main(void)
{
    unsigned char str[STR_LEN + 1] = {0};
    const char *hex_digits = "0123456789ABCDEF";
    int i;

    for (i = 0; i < STR_LEN; i++) {
        str[i] = hex_digits[rand() % 16];
    }

    printf("%s\n", str);

    return 0;
}
There are several unclarities and problems in your code. I interpret "hex character" to mean "hex digit", i.e. a symbol from {0,1,2,3,4,5,6,7,8,9,a,b,c,d,e,f}, not "the hexadecimal value of an ascii character's code point". This might or might not be what you meant.
This should do it:
#include <stdlib.h>  /* rand, size_t */

void hex_fill(char *buf, size_t max)
{
    static const char hexdigit[16] = "0123456789abcdef";
    size_t i;

    if (max < 1)
        return;
    --max;
    for (i = 0; i < max; ++i)
        buf[i] = hexdigit[rand() % sizeof hexdigit];
    buf[max] = '\0';
}
The above will always 0-terminate the string, so there's no requirement that you do so in advance. It will properly handle all buffer sizes.
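A possible usage example (my own sketch; the buffer size and the srand seeding are arbitrary choices):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    char buf[17];                /* room for 16 hex digits plus the terminator */

    srand((unsigned)time(NULL));
    hex_fill(buf, sizeof buf);
    printf("%s\n", buf);         /* prints 16 random lowercase hex digits */
    return 0;
}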
My variation on some of the other answers; note the time-seeded random generator, and that instead of a char array of constant size I use a vector, which is then converted to a string.
Boost variate generator docs
#include <string>
#include <vector>
#include <ctime>
#include <boost/random/mersenne_twister.hpp>
#include <boost/random/uniform_int.hpp>
#include <boost/random/variate_generator.hpp>

std::string GetRandomHexString(unsigned int count)
{
    std::vector<char> charVect = std::vector<char>(count);
    //Rand generator
    typedef boost::random::mt19937 RNGType;
    RNGType rng(std::time(nullptr) + (unsigned int)std::clock()); //seeding rng
    boost::uniform_int<> range(0, 15); //Setting min max
    boost::variate_generator<RNGType, boost::uniform_int<> > generate(rng, range); //Creating our generator
    //Explicit chars to sample from
    const char hexChars[16] = { '0','1','2','3','4','5','6','7','8','9','A','B','C','D','E','F' };
    //
    for (unsigned int i = 0; i < count; i++)
    {
        charVect[i] = hexChars[generate()];
    }
    //
    return std::string(charVect.begin(), charVect.end());
}
Examples (count = 32):
1B62C49C416A623398B89A55EBD3E9AC
26CFD2D1C14B9F475BF99E4D537E2283
B8709C1E87F673957927A7F752D0B82A
DFED20E9C957C4EEBF4661E7F7A58460
4F86A631AE5A05467BA416C4854609F8