I need to know how to put bits into a character array.
for example,
I want to put 0001 bits into a character array using C or C++.
Need your help guys. Thanks.
Maybe this more generic code will give you the idea:
void setBitAt( char* buf, int bufByteSize, int bitPosition, bool value )
{
    if (bitPosition < sizeof(char)*8*bufByteSize)
    {
        int byteOffset = bitPosition/8;
        int bitOffset  = bitPosition - byteOffset*8;
        if (value)
        {
            buf[byteOffset] |= (1 << bitOffset);
        }
        else
        {
            buf[byteOffset] &= ~(1 << bitOffset);
        }
    }
}
//use it as follows:
char chArray[16];
setBitAt(chArray, 16*sizeof(char), 5, true); //to set bit at pos 5 to 1
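A natural companion (a sketch, not part of the original answer) is a getter that reads a bit back using the same byte/bit-offset arithmetic:

```c
#include <stdbool.h>

/* Hypothetical companion to setBitAt: reads a bit back.
   Returns false for out-of-range positions. */
bool getBitAt(const char *buf, int bufByteSize, int bitPosition)
{
    if (bitPosition < 0 || bitPosition >= 8 * bufByteSize)
        return false;
    int byteOffset = bitPosition / 8;           /* which byte holds the bit */
    int bitOffset  = bitPosition % 8;           /* which bit within that byte */
    return (buf[byteOffset] >> bitOffset) & 1;  /* shift it down and mask */
}
```

With this pair you can round-trip: setBitAt(chArray, 16, 5, true) followed by getBitAt(chArray, 16, 5) yields true.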
Is that really all?
char buf[1];
buf[0] = char(1);
If you want bit masking then it would be something like
enum Enum
{
    MASK_01 = 0x1,
    MASK_02 = 0x2,
    MASK_03 = 0x4,
    MASK_04 = 0x8,
};
char buf[4];
buf[0] = MASK_01;
buf[1] = MASK_02;
buf[2] = MASK_03;
buf[3] = MASK_04;
If you provide information on what you are actually trying to do, we may be able to help you more.
EDIT: Thanks for the extra information. Does this help:
enum Enum
{
    BIT_0000000000000001 = 0x0001,
    BIT_0000000000000010 = 0x0002,
    BIT_0000000000000100 = 0x0004,
    BIT_0000000000001000 = 0x0008,
    BIT_0000000000010000 = 0x0010,
    BIT_0000000000100000 = 0x0020,
    BIT_0000000001000000 = 0x0040,
    BIT_0000000010000000 = 0x0080,
    BIT_0000000100000000 = 0x0100,
    BIT_0000001000000000 = 0x0200,
    BIT_0000010000000000 = 0x0400,
    BIT_0000100000000000 = 0x0800,
    BIT_0001000000000000 = 0x1000,
    BIT_0010000000000000 = 0x2000,
    BIT_0100000000000000 = 0x4000,
    BIT_1000000000000000 = 0x8000,
};
#include <string.h>

int main( int argc, char* argv[] )
{
    char someArray[8];
    memset( someArray, 0, 8 );
    // create an int with the bits you want set
    int combinedBits = BIT_0000000000000001 |
                       BIT_0000000000000010 |
                       BIT_1000000000000000;
    // clear first two bytes
    memset( someArray, 0, 2 );
    // set the first two bytes in the array
    *(int*)someArray |= combinedBits;
    // retrieve the bytes
    int retrievedBytes = *(int*)someArray;
    // test if a bit is set
    if ( retrievedBytes & BIT_0000000000000001 )
    {
        //do something
    }
}
Now, the naming of the enums is intentionally verbose for clarity. Also, you may notice that there are only 16 bits in the enum instead of the possible 32 for an int. This is because you mentioned the first two bytes: using this method, only the first two bytes of the array will be changed. I'm not sure whether this code would be affected by endianness, so you will have to test on your own machines. HTH.
You put bits in a character array using C or C++ the way you put anything into anything else -- they're all bits anyway.
Since sizeof(char) == 1 by definition, each element of the array holds exactly CHAR_BIT bits, which is 8 on virtually every modern system.
If you need help with how to twiddle bits, that's an entirely different issue and has nothing to do with chars and arrays.
C doesn't support binary literals (before C23's 0b prefix), so you'll have to represent the value as hex.
char buf[2];
char *p = buf;
*p++ = 0x10;
*p++ = 0xFE;
Take a look at the functions htons() and htonl() for converting multi-byte values to network byte order.
I'm new to C and I created some code that doesn't work...
In initLetterLib() I get a warning: integer conversion resulted in truncation.
I try to memcpy my libraryLetters entry into my outputLED, but it doesn't work:
I just get 0x00 into my outputLED.
I tried to copy something else into outputLED, and that worked fine.
But I don't get why there is a problem with my libraryLetters...
#define LETTER_WIDTH 6
typedef unsigned char letter[LETTER_WIDTH];
letter libraryLetters[128];
void initLetterLib(){
    *libraryLetters[0x20] = 0x000000000000; // Blank
    *libraryLetters['A'] = 0xFE909090FE00;
    *libraryLetters['H'] = 0xFE101010FE00;
    *libraryLetters['L'] = 0xFE0202020200;
    *libraryLetters['O'] = 0xFE828282FE00;
    *libraryLetters['U'] = 0xFE020202FE00;
    *libraryLetters['R'] = 0xFE9894946200;
    *libraryLetters['Z'] = 0x868A92A2C200;
    *libraryLetters['I'] = 0x0000FE000000;
    *libraryLetters['F'] = 0xFE9090808000;
}
// takes a string and generates the output sequence for the LEDs
unsigned char * stringToLEDText(char* textString)
{
    static unsigned char outputLED[LED_STEPS];
    unsigned char i = 0; // index
    // check length of string text
    unsigned short length = strlen(textString);
    // if more than 10 letters are used return error
    if (length > LETTERS_LED_OUTPUT)
    {
        printf("Error: Too many letters. Only 10 letters are allowed\n");
        return 0;
    }
    // through complete string
    for (i = 0; i < length; i++)
    {
        memcpy(&outputLED[i * LETTER_WIDTH], &(libraryLetters[textString[i]]),
               LETTER_WIDTH);
    }
    // fills rest with 0
    for (i = length * LETTER_WIDTH; i < LED_STEPS; i++)
    {
        outputLED[i] = 0x00;
    }
    return outputLED;
}
Any ideas?
Thanks
Fabian
Your code doesn't make much sense. First of all, hiding an array behind a typedef is not a good idea. Get rid of that.
Using the default "primitive data types" of C is not a good idea either, since these are non-portable and of varied length. Instead use the stdint.h types. This is pretty much mandatory practice in embedded systems programming.
As for the actual problem, you can't assign an array like this
*libraryLetters[0x20] = 0x000000000000;
This doesn't make any sense. You are telling the compiler to store a 64 bit integer in the first byte of your 6 byte array. What you probably meant to do is this:
const uint8_t letters [128][LETTER_WIDTH] =
{
    [0x20] = {0x00, 0x00, 0x00, 0x00, 0x00, 0x00},
    ['A']  = {0xFE, 0x90, 0x90, 0x90, 0xFE, 0x00},
    ...
};
Assuming this is a symbol table for some display. If so it should be const and allocated in flash.
*libraryLetters[x] is of type unsigned char and you are trying to assign a number to it outside the range of an unsigned char.
It looks like you are trying to assign a sequence of 6 bytes to *libraryLetters[x]. One way to do that is using memcpy, for example:
memcpy(libraryLetters['A'], "\xFE\x90\x90\x90\xFE\x00", 6);
You define your letter type as an array of unsigned char, where each element holds only a single byte, but then you attempt to store a 6-byte integer into the first element. So you only get the lowest byte, which is zero in all your letters. Defining it this way makes sense if you want an arbitrary-length letter array; otherwise it would be much easier to use a 64-bit type, as suggested in the comments.
Instead, you should add the letters as
libraryLetters['H'][0] = 0xFE;
libraryLetters['H'][1] = 0x90;
...
Or you could use memcpy(libraryLetters['A'], letter_number, LETTER_WIDTH) as suggested by Ian Abbott.
I can't seem to wrap my head around what is happening here:
I have a function that calculates the difference between two dates in seconds. Then it gets bitshifted around to produce some output:
void GetUTC(unsigned char buffer[5])
{
    struct tm str_time;
    time_t start;
    time_t current = time(NULL);
    str_time.tm_year = 2010 - 1900;
    str_time.tm_mon = 1;
    str_time.tm_mday = 1;
    str_time.tm_hour = 0;
    str_time.tm_min = 0;
    str_time.tm_sec = 0;
    str_time.tm_isdst = 0;
    start = mktime(&str_time);
    printf("%s\n", ctime(&start));
    printf("%s\n\n", ctime(&current));
    uint32_t someInt = difftime(current, start);
    buffer[0] = 2;
    buffer[1] = (someInt & 0xff);
    someInt = someInt >> 8;
    buffer[2] = (someInt & 0xff);
    someInt = someInt >> 8;
    buffer[3] = someInt;
    someInt = someInt >> 8;
    buffer[4] = (10 & 0xff);
}
I feed it this array:
unsigned char toWrite[5];
Then call it
GetUTC(toWrite);
All goes well. Now I have a function I am trying to feed the new array into, which takes these parameters:
void gatt_write_attribute(GDBusProxy *proxy, const char *arg)
And I call it with:
gatt_write_attribute(someProxy, toWrite);
However, the array I passed to gatt_write_attribute shows up as jumbled garbage: $2\20023\n23 instead of my expected value of toWrite (numbers differ since they depend on time):
[0]: 2
[1]: 23
[2]: 54
[3]: 128
[4]: 10
I tried adding a terminating \0 to the end of toWrite, but it did not change anything. I tried casting it to a const char pointer and that didn't work either.
I feel like I'm missing a very simple detail. Could someone explain why I can't pass this char array to the gatt_write_attribute function?
As noted by Ian Abbott and Chux in the comments to my original question, I was indeed being silly and was not realizing that my debugger was displaying the thing being pointed to as an array of characters instead of numbers. This is what caused the garbled values.
I should have probably allowed myself a bit more sleep in-between work hours.
You need to pass the array to the first function as a pointer. Otherwise it just gets copied to the stack and the function writes to the new copy. The garbage is probably there because it has not been initialised.
I am trying to convert a char array to unsigned short, but it's not working as it should.
char szASCbuf[64] = "123456789123456789123456789";
int StoreToFlash(char szASCbuf[], int StartAddress)
{
int iCtr;
int ErrorCode = 0;
int address = StartAddress;
unsigned short *us_Buf = (unsigned short*)szASCbuf;
// Write to flash
for(iCtr=0;iCtr<28;iCtr++)
{
ErrorCode = Flash_Write(address++, us_Buf[iCtr]);
if((ErrorCode &0x45)!= 0)
{
Flash_ClearError();
}
}
return ErrorCode;
}
When I look at the conversion, us_Buf[0] has the value 12594, us_Buf[1] = 13108, and so on, but I have values only up to us_Buf[5]; after that it is 0 for all remaining addresses.
I have tried to declare char array like this also
char szASCbuf[64] = {'1','2','3','4','5','6','7','8','9','1',.....'\0'};
I am passing the parameters to function like this
StoreToFlash(szASCbuf, FlashPointer); //Flashpointe=0
I am using IAR Embedded Workbench for ARM, big-endian, 32-bit.
Any suggestions on where I'm going wrong?
Thanks in advance.
Reinterpreting the char array szASCbuf as an array of short is not safe because of alignment issues. The char type has the least strict alignment requirement and short is usually stricter; this means szASCbuf might start at address 13, whereas a short would have to start at either 12 or 14.
It also violates the strict aliasing rule, since szASCbuf and us_Buf point at the same location through different pointer types. The compiler might perform optimisations that don't take this into account, and this can manifest as some very nasty bugs.
The correct way to write this code is to iterate over the original szASCbuf with a step of 2 and do some bit-twiddling to produce a 2-byte value out of it:
for (size_t i = 0; i < sizeof(szASCbuf); i += 2) {
    uint16_t value = (szASCbuf[i] << 8) | szASCbuf[i + 1];
    ErrorCode = Flash_Write(address++, value);
    if (ErrorCode & 0x45) {
        Flash_ClearError();
    }
}
If you really intended to treat the digit characters as their numeric values, this will do it:
uint16_t value = (szASCbuf[i] - '0') + (szASCbuf[i + 1] - '0');
In case you just want the numeric value of each character in a 2-byte value (1, 2, 3, 4, ...), iterate over the array with a step of 1 and fetch it this way:
uint16_t value = szASCbuf[i] - '0';
That's normal!
Your char array is "123456789123456789123456789", i.e. {'1','2','3','4','5','6','7','8','9','1',.....'\0'}.
But in ASCII '1' is 0x31, so when you read the array as a short * on a big-endian architecture, it gives:
{ 0x3132, 0x3334, ... }
or, in decimal:
{ 12594, 13108, ... }
I am sending byte arrays between a TCP socket server & client in C. The information that I am sending is a series of integers.
I have it working, but because I am not too conversant with C, I was wondering if anyone could suggest a better solution, or at least to look and tell me that I'm not being too crazy or using outdated code with what I'm doing.
First, I generate a random decimal value, let's say "350". I need to transmit this over the socket connection as a hex byte array. It is decoded back to its decimal value at the other end.
So far, I convert it to hex this way:
unsigned char hexstr[4];
sprintf(hexstr, "%02X", numToConvert); // where numToConvert is a decimal integer value like 350
At this point, I have a string in hexstr that's something like "15E" (again, using the hex value of 350 for an example).
Now, I need to store this in a byte array so that it looks something like: myArray = {0X00, 0X00, 0X01, 0X5E};
Obviously I can't just write: myArray = {0X00, 0X00, 0X01, 0X5E} because the values will be different every time, since a new random number is generated every time.
Currently, I do it like this (pseudocode because the string manipulation part is irrelevant but long):
lastTwoChars = getLastTwoCharsFromString(hexstr); // so lastTwoChars would now contain "5E"
Then (actual code):
sscanf(lastTwoChars, "%0X", &res); // now the variable res contains the byte representation of lastTwoChars, is my understanding
Then finally:
myArray[3] = res;
Then, I take the next two rightmost chars from hexstr (again, using the sample value of "15E", this would be "01" -- if there's only 1 more character, as in this case "1" was the only character left after taking out "5E" from "15E", I add 0s to the left to pad) and convert that the same way using sscanf, then insert into myArray[2]. Repeat for myArray[1] and myArray[0].
Then I send the array using write().
So, after hours of plugging away at it, this all does work... but because I don't use C very much, I have a nagging suspicion that there's something I am missing in all this. Can anyone comment if what I'm doing seems OK, or there's something obvious I'm using improperly or neglecting to use?
#include <stdio.h>
#include <limits.h>

int main(void)
{
    unsigned num = 0x15E; // num = 350
    int i, size = sizeof(unsigned);
    unsigned char myArray[size];
    for (i = size - 1; i >= 0; --i, num >>= CHAR_BIT) {
        myArray[i] = num & 0xFF;
    }
    for (i = 0; i < size; ++i) {
        printf("0X%02hhX ", myArray[i]);
    }
    printf("\n");
    return 0;
}
On the transmit side, convert a 32-bit number to a four byte array with this code
void ConvertValueToArray( uint32_t value, uint8_t array[] )
{
    int i;
    for ( i = 3; i >= 0; i-- )
    {
        array[i] = value & 0xff;
        value >>= 8;
    }
}
On the receive side, convert the byte array back into a number with this code
uint32_t ConvertArrayToValue( uint8_t array[] )
{
    int i;
    uint32_t value = 0;
    for ( i = 0; i < 4; i++ )
    {
        value <<= 8;
        value |= array[i];
    }
    return( value );
}
Note that it's important not to use generic types like int when writing this kind of code, since an int can be different sizes on different systems. The fixed-sized types are defined in <stdint.h>.
Here's a simple test that demonstrates the conversions (without actually sending the byte arrays over the network).
#include <stdio.h>
#include <stdint.h>
int main( void )
{
    uint32_t input, output;
    uint8_t byte_array[4];
    input = 350;
    ConvertValueToArray( input, byte_array );
    output = ConvertArrayToValue( byte_array );
    printf( "%u\n", output );
}
If your array is 4-byte aligned (and even if it isn't on machines that support unaligned access), you can use the htonl function to convert a 32-bit integer from host to network byte order and store the whole thing at once:
#include <arpa/inet.h> // or <netinet/in.h>
...
*(uint32_t*)myArray = htonl(num);
I'm making a program that builds a binary message. I'm using char strings to hold the binary values, so I've initialized a bunch of char strings with the default values. Then I combine them by running a for loop that reads them into a large string (aismsg/ais_packet). Everything worked fine until I added msg14Text[]; now the string I'm building (aismsg/ais_packet) gets shortened as shown below, even though I'm not using the variable. It seems like adding msg14Text[] changes the value of one of the other strings. Is this maybe a memory allocation problem?
Part of the code:
char ais_packet[257]; // Allocate array for the AIS data packet.
char aismsg[175];     // Allocate array for the message.
int burst_nr = 1;     // Indicates which burst is transmitting (1-7).
char ramp_up[] = "00000000";   // Ramp-up buffer.
char train_seq[] = "010101010101010101010101"; // Training sequence, 24 bits of alternating 0-1s.
char hdlc_flag[] = "01111110"; // HDLC start and end flag.
char buffer[] = "000000000000000000000000"; // Data packet buffer.
char msgID1[] = "000001";  // msg. 1.
char msgID14[] = "010100"; // msg. 14.
char repeat[] = "00";      // Repeated 0 times.
char mmsi[] = "000111010110111100110100010101"; // Gives 123456789 as the MMSI.
char nav_stat[] = "1111";  // Gives 15 = AIS-SART test; change to 14 (1110) for active AIS-SART.
char rot[] = "10000000";   // Rate of Turn; -128 means not available.
char sogBin[] = "1111111111"; // Corresponds to 1023 = not available = default.
char pos_acc[] = "0";      // Position accuracy over 10 m; 1 = under 10 m.
char lonBin[] = "0110011110010001101011000000"; // Corresponds to 181 degrees, the default value for longitude.
char latBin[] = "011010000010010000101000000";  // Corresponds to 91 degrees, the default value for latitude.
char cogBin[] = "111000010000"; // Corresponds to 3600 = not available = default.
char headingBin[] = "111111111"; // 511 = not available = default.
char timestamp[] = "111100"; // Time since the message was generated; 60 = default = ts not available.
char spec_man[] = "01";    // Special manoeuvre; 0 = default, 1 = not engaged in special manoeuvre.
char spare[] = "000";
char spareMSG14[] = "00";  // Reserved.
char raim[] = "0";         // RAIM 0 = not in use.
char comm_state[] = "00011100000000000000"; // First 2 bits: sync state; 3 = no UTC sync = default, 0 = UTC sync.
char msg14Text[] = "100100101101111011111100"; // CAUSING TROUBLE!!! For AIS message 14, spells "Test" in 6-bit ASCII encoding.
The entire code for the function can be found at pastebin.com/wj0RxyLX
Output of ais packet with msg14Text[]:
00000000
Output of ais packet without msg14Text[]:
0000000001010101010101010101010101111110000001000001110101101111001101000101011111100000000011010000000000000110100011000101111000000101100100000100001100101110000100000000110011111000100000011100000000000000001000100110100101111110000000000000000000000000
aispacket should consist of the following variables:
ramp_up[] + train_seq[] + hdlc_flag[] + Datapacket(168bit) + crc(16bit) + hdlc_flag[] + buffer[] + '\0'
"Is this maybe a memory allocation problem?"
You don't explicitly allocate any memory in your code. Note that char repeat[] = "00"; is a statically allocated array whose size is that of 3 chars and whose content is initialized by the string literal "00".
The problem is most likely in the copying of these strings into ais_packet, since you do it in a nonstandard way (character by character), which makes your code hard to read and easy to get wrong:
for(int k=0; k<256; k++)
{
    ...
    if(k==256) // are you sure that value of k will reach 256?
I recommend using the C library functions created for this purpose: create ais_packet by copying the first string into it with strcpy, then keep extending ais_packet by appending the other strings with strcat.
This question will also help you: Using strcat in C
At the end of the ugly for (k=0; k < 168; k++) { if ... else if ... } loop:
else if(k==168)
{
    aismsg[k] = '\0';
    k=0;
}
With k <= 168 this makes the loop run forever; with k < 168 the branch is never executed. (There are more instances of this pattern.)
BTW another way to do the same (also faster) would be
....
unsigned dst=0;
memcpy (array+dst, src1, 123);
dst += 123;
memcpy(array+dst, src2, 234);
dst += 234;
...
array[dst] = 0;
Just a thought, but if you are building binary messages, why not use actual binary instead of char arrays? Here's a way to use structs inside a union to bit pack binary data.
// declaration
typedef union
{
    uint32_t packed;
    struct {
        uint16_t sample1 : 12; // 12 bits long
        uint16_t sample2 : 14;
        uint16_t         : 6;  // unused bits
    } data;
} u1;
// instantiation
u1 pack1;
// setting
pack1.data.sample1 = 1234;
// getting
uint16_t newval = pack1.data.sample2;
// setting bit 6 in sample1
pack1.data.sample1 |= (1 << 6);
// masking sample1 with 0b11110101 (binary literals are a GCC extension before C23)
pack1.data.sample1 &= 0b11110101;
// getting the whole packed value
uint32_t binmsg = pack1.packed;