Encrypted string is not the same as test vector - c

So I am trying to encrypt some data with the EVP API from OpenSSL, but I am not receiving the same result as the test vectors.
This is the main function:
#include <stdio.h>
#include <windows.h>
#include <openssl\aes.h>
#include <openssl\evp.h>

int main()
{
    unsigned char *to = (unsigned char*)malloc(2056);
    ZeroMemory(to, 2056);
    int *tosize;
    unsigned char* key = (unsigned char*)"0000000000000000000000000000000000000000000000000000000000000000";
    unsigned char* iv = (unsigned char*)"00000000000000000000000000000000";
    unsigned char* plain = (unsigned char*)"00000000000000000000000000000000";
    to = AESEncrypt(key, iv, plain, strlen((const char*)plain));
    if (to != 0)
    {
        for (int i = 0; i < strlen((const char*)to); i++)
        {
            printf("%x02", (int*)UCHAR(to[i]));
        }
    }
}
And this is the function I am trying to call. No errors are received; every call succeeds.
unsigned char* AESEncrypt(unsigned char* key, unsigned char* iv, unsigned char* plain, size_t plainsize)
{
    EVP_CIPHER_CTX *x = (EVP_CIPHER_CTX*) malloc(sizeof(EVP_CIPHER_CTX));
    EVP_CIPHER_CTX_init(x);
    if (EVP_EncryptInit(x, EVP_aes_256_cbc(), key, iv))
    {
        unsigned char* to = (unsigned char*) malloc(plainsize + EVP_CIPHER_CTX_block_size(x));
        int tosize = 0;
        if (EVP_EncryptUpdate(x, to, &tosize, plain, plainsize))
        {
            if (EVP_EncryptFinal(x, to, &tosize))
            {
                return to;
            }
        }
    }
    return 0;
}
This is the test vector:
KEY = 0000000000000000000000000000000000000000000000000000000000000000
IV = 00000000000000000000000000000000
PLAINTEXT = 80000000000000000000000000000000
CIPHERTEXT = ddc6bf790c15760d8d9aeb6f9a75fd4e
This is what I am receiving:
CIPHERTEXT = 5a0215028e.... and it goes on. As you can see, it is not correct.
What could I be doing wrong?

CAVS test vectors follow a fairly straightforward pattern. From test to test they vary to certain degrees, but one thing is eminently consistent:
Hex Strings are BYTE representations; NOT character data
Hence your "test" is entirely wrong. You're using a key, IV, and plain text filled with the character '0', not the byte value 0, so it is no wonder you're getting a different result.
For that specific test your arrays should be:
unsigned char key[32] = {0};
unsigned char iv[16] = {0};
unsigned char plain[16] = {0};
Furthermore, the size you send to your encryption function should be the byte count of your plain text. Finally, your encryption function should ideally take a target buffer AND a modifiable size as output parameters.
int AESEncrypt256(
    unsigned char *key,      // key must be 32 bytes wide
    unsigned char *iv,       // IV must be 16 bytes wide
    unsigned char *src,      // source buffer to encrypt
    unsigned int src_len,    // length of source buffer in bytes
    unsigned char *dst,      // target buffer to write to
    unsigned int *dst_len);  // in: size of dst, out: bytes written to dst
And code the function to match those parameters. You'll be glad you did.
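A minimal sketch of such a function, assuming OpenSSL's EVP API and a 32-byte AES-256 key (error checking kept to the bare minimum):

#include <openssl/evp.h>

// Sketch only. Returns 1 on success, 0 on failure.
// On input *dst_len is the capacity of dst; on success it becomes the bytes written.
int AESEncrypt256(unsigned char *key, unsigned char *iv,
                  unsigned char *src, unsigned int src_len,
                  unsigned char *dst, unsigned int *dst_len)
{
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    int len = 0, total = 0, ok = 0;

    if (ctx == NULL || *dst_len < src_len + 16) // room for up to one block of padding
    {
        EVP_CIPHER_CTX_free(ctx);               // freeing NULL is a no-op
        return 0;
    }

    if (EVP_EncryptInit_ex(ctx, EVP_aes_256_cbc(), NULL, key, iv) &&
        EVP_EncryptUpdate(ctx, dst, &len, src, (int)src_len))
    {
        total = len;
        // The final (padding) block goes *after* the update output,
        // not back at the start of the buffer.
        if (EVP_EncryptFinal_ex(ctx, dst + total, &len))
        {
            total += len;
            *dst_len = (unsigned int)total;
            ok = 1;
        }
    }
    EVP_CIPHER_CTX_free(ctx);
    return ok;
}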
CAVS test vectors are not "text". They are bytes and should be treated as such. You need to get a handle on that now, as the Monte Carlo tests are likely going to cost you some hair if you don't.
Do yourself an enormous favor and write some code now that translates a string of hex digits into an unsigned char byte array. You'll need the same to translate back for result strings. And make these routines solid as you will be using them a lot when writing these tests.
Spoiler Alert
Test the string for an odd number of chars first. If it is odd, the first byte in your translated buffer should be based on 0c, where c is the first char in the input string. From then on (or if the number of chars is even, then from the very start) grab them two-at-a-time when converting the rest of the byte string into real bytes. This means this
123456
results in a byte array of
{ 0x12, 0x34, 0x56 }
while this:
89AB1
should be:
{ 0x08, 0x9A, 0xB1 }
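For instance, a minimal sketch of such a translation routine, assuming the input holds only valid ASCII hex digits (no validation):

#include <stdio.h>
#include <string.h>

/* Sketch only: converts one ASCII hex digit to its value. */
static unsigned char hex_nibble(char c)
{
    if (c >= '0' && c <= '9') return (unsigned char)(c - '0');
    if (c >= 'a' && c <= 'f') return (unsigned char)(c - 'a' + 10);
    return (unsigned char)(c - 'A' + 10);
}

/* Translates a hex string to bytes; an odd-length string is treated as if it
   had a leading '0'. Returns the number of bytes written to out. */
static size_t hex_to_bytes(const char *hex, unsigned char *out)
{
    size_t len = strlen(hex), i = 0, n = 0;

    if (len % 2 != 0)                  /* odd count: first byte is 0c */
        out[n++] = hex_nibble(hex[i++]);

    for (; i < len; i += 2)            /* then two chars per byte */
        out[n++] = (unsigned char)((hex_nibble(hex[i]) << 4) | hex_nibble(hex[i + 1]));

    return n;
}

int main(void)
{
    unsigned char buf[8];
    size_t n = hex_to_bytes("89AB1", buf);
    for (size_t i = 0; i < n; i++)
        printf("0x%02X ", buf[i]);     /* prints 0x08 0x9A 0xB1 */
    printf("\n");
    return 0;
}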

Related

Converting a struct to a hex string in C

I have the following struct:
typedef struct __attribute__((packed)) word {
    uint16_t value;
    uint8_t flag;
} Word;
I want to convert it to a hex string. For example, if value = 0xabcd and flag = 0x01, I want to get the string 01abcd.
If I do some pointer juggling:
Word word;
word.value = 0xabcd;
word.flag = 0x01;
printf("word: %X\n", *(int *)&word);
I get the output that I want (word: 1ABCD), but this doesn't seem safe.
And when I tried this after looking at some of the answers here:
char ptr[3];
memcpy(ptr, &word, 3);
printf("word: %02X%02X%02X\n", ptr[2], ptr[1], ptr[0]);
I got word: 01FFFFFFABFFFFFFCD; for some reason the first two bytes are being extended to a full int.
There's no real gain from messing around with pointers or type-punning, if all you want is to output the values of the two structure members. Just print them individually:
printf("word: %02x%04x\n", (unsigned int)word.flag, (unsigned int)word.value);
Use a simple sprintf to convert to a string:
int main(void)
{
    Word v = { 0xabcd, 0x01 };
    char s[10];
    sprintf(s, "%02x%04x", v.flag, v.value);
    puts(s);
}
I want to get the string 01abcd
So you want to print the binary representation of the struct on a little endian machine backwards. There's no obvious advantage of using sprintf for this apart from it being quick & dirty to implement.
If you want something with performance, then hacking this out manually isn't rocket science - simply iterate over the struct byte by byte and convert each nibble to the corresponding hex character:
void stringify (const uint8_t* data, size_t size, char* dst)
{
    for (size_t i = 0; i < size; i++)
    {
        uint8_t byte = data[size-i-1];                   // read starting from the end of the data
        dst[i*2]   = "0123456789ABCDEF"[ byte >> 4 ];    // ms nibble
        dst[i*2+1] = "0123456789ABCDEF"[ byte & 0xF ];   // ls nibble
    }
    dst[size*2] = '\0';
}
This will give "01ABCD" on a little endian machine and "01CDAB" on a big endian machine.
(Optionally add restrict to the data and dst parameters, but I doubt it will improve performance since the const correctness already blocks aliasing to some extent.)
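A short usage sketch, assuming the packed Word typedef from the question and the stringify routine above are in scope:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    Word w;
    w.value = 0xabcd;
    w.flag  = 0x01;

    char s[2 * sizeof w + 1];                    /* two hex chars per byte plus '\0' */
    stringify((const uint8_t *)&w, sizeof w, s);
    puts(s);                                     /* "01ABCD" on a little endian machine */
    return 0;
}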
You could join them mathematically first:
printf("word: %06lx\n", word.flag * 0x10000L + word.value);
Using long in case we are on a 16-bit int machine.

Same short encryption result using AES with different keys

I have a single plain text, which is:
unsigned char plaintext[] = "Hi, this is trial number one";
For the keys, instead of using something like:
unsigned char key[16] = "azertyuiopqsdfg";
I decided to use tons of them like "dog", "azkier", "jfieifdragon", ...
My code so far looks like this:
unsigned char *aes_encrypt(unsigned char *plaintext, unsigned char *key)
{
    EVP_CIPHER_CTX *ctx;
    ctx = EVP_CIPHER_CTX_new();
    unsigned char iv[16] = "0000000000000000";
    int c_len = strlen(plaintext) + AES_BLOCK_SIZE;
    int f_len = 0;
    unsigned char *ciphertext = malloc(c_len);
    EVP_EncryptInit_ex(ctx, EVP_aes_128_cbc(), NULL, key, iv);
    EVP_EncryptUpdate(ctx, ciphertext, &c_len, plaintext, strlen(plaintext));
    EVP_EncryptFinal_ex(ctx, ciphertext+c_len, &f_len);
    EVP_CIPHER_CTX_free(ctx);
    return ciphertext;
}
When I compile and run, the output looks something like this:
the key: dog
the plain: Hi, this is trial number one
ciphertext: 157a320
the key: azkier
the plain: Hi, this is trial number one
ciphertext: 157a320
.....
My questions are:
Why do I always get the same ciphertext even though I'm using different keys?
Also, why is the ciphertext so short? My plaintext is pretty long.
Thanks.
Update: the way I call aes_encrypt is like this:
unsigned char plaintext[] = "Hi, this is trial number one";
unsigned char *cipher;
cipher = aes_encrypt(plaintext, "dog");
printf("The cipher is: %x\n", cipher);
free(cipher);
unsigned char *cipher;
cipher = aes_encrypt(plaintext, "azkier");
printf("The cipher is: %x\n", cipher);
free(cipher);
In your test code:
printf("The cipher is: %x\n", cipher);
Well, of course that doesn't work -- %x prints the address of cipher as hexadecimal, not its contents. If you want a dump of the contents of cipher, you'll need to loop over each byte yourself.
Additionally, the key parameter to EVP_EncryptInit_ex is a fixed-length buffer, whose size is set based on the cipher you're using. It is not a string. Passing a short string may cause unpredictable behavior, as whatever data happens to be stored after the string ends may be used as part of the key.
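For illustration, a rough sketch of both fixes, with the key widened to exactly 16 bytes for AES-128 and the function changed to report the ciphertext length through an extra out parameter (the out_len parameter is my addition, not part of the original code):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <openssl/evp.h>

/* Sketch of aes_encrypt reworked to report the ciphertext length. */
unsigned char *aes_encrypt(unsigned char *plaintext, unsigned char *key, int *out_len)
{
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    unsigned char iv[16] = {0};                      /* all-zero IV, for the sketch only */
    int p_len = (int)strlen((char *)plaintext);
    int c_len = 0, f_len = 0;
    unsigned char *ciphertext = malloc(p_len + EVP_MAX_BLOCK_LENGTH);

    EVP_EncryptInit_ex(ctx, EVP_aes_128_cbc(), NULL, key, iv);
    EVP_EncryptUpdate(ctx, ciphertext, &c_len, plaintext, p_len);
    EVP_EncryptFinal_ex(ctx, ciphertext + c_len, &f_len);
    EVP_CIPHER_CTX_free(ctx);

    *out_len = c_len + f_len;
    return ciphertext;
}

int main(void)
{
    unsigned char plaintext[] = "Hi, this is trial number one";
    unsigned char key[16] = "azertyuiopqsdfg";       /* exactly 16 key bytes for AES-128 */
    int len = 0;

    unsigned char *cipher = aes_encrypt(plaintext, key, &len);
    for (int i = 0; i < len; i++)
        printf("%02x", cipher[i]);                   /* dump each byte, not the pointer */
    printf("\n");
    free(cipher);
    return 0;
}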

AES encryption using openssl

I'm trying to encrypt using the OpenSSL AES encryption functions.
Here's my code:
#include <stdio.h>
#include <openssl/aes.h>
#include <string.h>

const static unsigned char aes_key[] = {"passwordpasswor"}; // 15 characters + \0
void print_data(const char *tittle, const void* data, int len);

int main() {
    unsigned char aes_input[] = "#!/bin/bash\necho hello world";
    unsigned char iv[AES_BLOCK_SIZE];
    memset(iv, 0x00, AES_BLOCK_SIZE);
    unsigned char enc_out[sizeof(aes_input)];
    unsigned char dec_out[sizeof(aes_input)];
    AES_KEY enc_key, dec_key;
    AES_set_encrypt_key(aes_key, sizeof(aes_key)*8, &enc_key);
    AES_cbc_encrypt(aes_input, enc_out, sizeof(aes_input), &enc_key, iv, AES_ENCRYPT);
    // decryption
    memset(iv, 0x00, AES_BLOCK_SIZE);
    AES_set_decrypt_key(aes_key, sizeof(aes_key)*8, &dec_key);
    AES_cbc_encrypt(enc_out, dec_out, sizeof(aes_input), &dec_key, iv, AES_DECRYPT);
    // verify
    printf("original %s\n", aes_input);
    printf("encrypted %s\n", enc_out);
    printf("decrypted %s\n", dec_out);
    return 0;
}
the code produces the following output (with an extra newline between each for clarity):
original #!/bin/bash
echo hello world
encrypted ���jv�.)��$I���b�:dmPvTQޜ�#!/bin/bash
echo hello world
decrypted #!/bin/bash
echo hello world
I've tried other messages; it seems that the encrypted message will show the original message if printed with printf.
You're trying to dump non-printable data to a terminal device, and specifically doing so using a library call that expects null termination. The output of an AES encryption can contain bytes of any value (including embedded nullchar values).
You need the following:
Properly size your output buffer. AES_cbc_encrypt rounds the data up to a whole number of AES blocks, so allow for up to one full additional block beyond your input.
Dump your output using an alternative mechanism, such as a trivial hexdump routine.
Both of the above are done below:
#include <stdio.h>
#include <openssl/aes.h>
#include <string.h>

static void hex_print(const void *pv, size_t len)
{
    static const char alpha[] = "0123456789abcdef";
    const unsigned char *beg = pv, *end = beg + len;
    for (; beg != end; ++beg)
    {
        putc(alpha[(*beg >> 4) & 0xF], stdout);
        putc(alpha[*beg & 0xF], stdout);
    }
    putc('\n', stdout);
}

const static unsigned char aes_key[] = {"passwordpasswor"}; // 15 characters + \0
void print_data(const char *tittle, const void* data, int len);

int main() {
    unsigned char aes_input[] = "#!/bin/bash\necho hello world";
    unsigned char enc_out[AES_BLOCK_SIZE * ((sizeof(aes_input) + AES_BLOCK_SIZE)/AES_BLOCK_SIZE)];
    unsigned char dec_out[sizeof(aes_input)];
    unsigned char iv[AES_BLOCK_SIZE] = {0};
    AES_KEY enc_key, dec_key;
    AES_set_encrypt_key(aes_key, sizeof(aes_key)*8, &enc_key);
    AES_cbc_encrypt(aes_input, enc_out, sizeof(aes_input), &enc_key, iv, AES_ENCRYPT);
    // decryption
    memset(iv, 0x00, AES_BLOCK_SIZE);
    AES_set_decrypt_key(aes_key, sizeof(aes_key)*8, &dec_key);
    AES_cbc_encrypt(enc_out, dec_out, sizeof(aes_input), &dec_key, iv, AES_DECRYPT);
    // verify
    printf("original %s\n", aes_input);
    hex_print(enc_out, sizeof enc_out);
    printf("decrypted %s\n", dec_out);
    return 0;
}
Output
original #!/bin/bash
echo hello world
e389c96a76d708b42e29b4b4052449f1ffc762db3a646d1650765451de9c1dd0
decrypted #!/bin/bash
echo hello world
Note in particular the last byte of the encryption. It isn't 00, which means the printf call you were incorrectly using was marching beyond that buffer and into the land of undefined behavior. In fact, there are no nullchar bytes in that output (it is a different, yet closely related problem when there is an embedded 00 in the middle of your data, in which case printf would have stopped prematurely).
In this case, I can speculate (with little value; such is the nature of undefined behavior) that the march took printf into the next automatic variable on the stack, which was the decrypted array.
Modifying the output sequence to use all hex output will demonstrate the difference between plaintext and encrypted data. For example, changing the last three functional lines of your program to:
//verify
hex_print(aes_input, sizeof(aes_input));
hex_print(enc_out, sizeof enc_out);
hex_print(dec_out, sizeof(dec_out));
will deliver the following output:
23212f62696e2f626173680a6563686f2068656c6c6f20776f726c6400
e389c96a76d708b42e29b4b4052449f1ffc762db3a646d1650765451de9c1dd0
23212f62696e2f626173680a6563686f2068656c6c6f20776f726c6400
which makes sense. If you walk the bytes (two hex digits per byte) in the original and decrypted strings, you can see they're (a) equal, (b) not equal to the cipher text, and (c) a little time with an ASCII table will show you they are indeed the original text message.
Best of luck.
You have encrypted the terminating nul of the string by using the argument
sizeof(aes_input)
You are consistent, so it gets decrypted too. Unfortunately the encrypted string no longer has a nul terminator, since it was encrypted too. So I recommend
strlen(aes_input)
for the arguments (but not for the string allocation). You must also terminate the two strings enc_out[] and dec_out[].

Unsigned char to char* and int in C?

I'm working on some encryption that requires unsigned chars in the functions, but I want to convert to a char * for use after the data has been decrypted. So, I have:
unsigned char plaintext[16];
char *plainchar;
int plainint;
... Code now populates plaintext with data that happens to all be plain text
Now at this point, let's say plaintext is actually a data string of "0123456789". How can I get the value of plaintext into plainchar as "0123456789", and at the same time get plainint as 123456789?
-- Edit --
Doing this when plaintext is equal to "AAAAAAAAAA105450":
unsigned char plaintext[16];
char *plainchar;
int plainint;
... Code now populates plaintext with data that happens to all be plain text
plainchar = (char*)plaintext;
Makes plainchar equal to "AAAAAAAAAA105450╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠┤∙7" with a sizeof = 51. The encryption code is the rijndael example code, so it should be working fine.
Thanks,
Ben
Your plain text string is not terminated. All strings must have an extra character that marks the end of the string. This character is '\0'.
Declare the plaintext variable as
unsigned char plaintext[17];
and after you are done with the decryption add this
plaintext[last_pos] = '\0';
Change last_pos to the last position of the decrypted text, default to 16 (last index of the array).
I think it's simply:
plainchar = (char*)plaintext;
sscanf( plainchar, "%d", &plainint );
For unsigned char* to char*:
plainchar = (char*)plaintext;
and for the digit string to an int:
sscanf( plainchar, "%d", &plainint );
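Putting the two answers together, a minimal self-contained sketch (the buffer here is simply initialized with the example digits instead of being filled by a decryption routine):

#include <stdio.h>

int main(void)
{
    unsigned char plaintext[17] = "0123456789";   /* one spare byte for the '\0' */
    char *plainchar;
    int plainint = 0;

    plaintext[10] = '\0';                /* terminate after the decrypted text */
    plainchar = (char *)plaintext;       /* unsigned char* to char* is just a cast */
    sscanf(plainchar, "%d", &plainint);  /* parse the digit string into an int */

    printf("%s -> %d\n", plainchar, plainint);   /* 0123456789 -> 123456789 */
    return 0;
}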

How to convert an Unsigned Character array into a hexadecimal string in C

Is it possible to represent an unsigned character array as a string?
When I searched for it, I found out that only memset() was able to do this (but character by character).
Assuming that is not the correct way, is there a way to do the conversion?
Context: I am trying to store the output of a cryptographic hash function which happens to be an array of unsigned characters.
eg:
unsigned char data[N]; ...
for(i=0;i<N;i++) printf("%x",data[i]);
My goal is to represent the data as a string (%s) rather than access it element by element, since I need the output of the hash as a string for further processing.
Thanks!
So, based on your update, are you talking about trying to convert an unsigned char buffer into a hexadecimal string, something like this:
#include <stdio.h>

#define bufferSize 10

int main() {
    unsigned char buffer[bufferSize] = {1,2,3,4,5,6,7,8,9,10};
    char converted[bufferSize*2 + 1];
    int i;
    for (i = 0; i < bufferSize; i++) {
        sprintf(&converted[i*2], "%02X", buffer[i]);

        /* equivalent using snprintf, notice the len field keeps reducing
           with each pass, to prevent overruns
        snprintf(&converted[i*2], sizeof(converted)-(i*2), "%02X", buffer[i]);
        */
    }
    printf("%s\n", converted);
    return 0;
}
Which outputs:
0102030405060708090A
In C, a string is an array of char, terminated with a character whose value is 0.
Whether or not char is a signed or unsigned type is not specified by the language, you have to be explicit and use unsigned char or signed char if you really care.
It's not clear what you mean by "representing" an unsigned character array as a string. It's easy enough to cast away the sign, if you want to do something like:
const unsigned char abc[] = { 65, 66, 67, 0 }; // ASCII values for 'A', 'B', 'C'.
printf("The English alphabet starts out with '%s'\n", (const char *) abc);
This will work; to printf() there isn't much difference: it will see a pointer to an array of characters and interpret them as a string.
Of course, if you're on a system that doesn't use ASCII, cases might creep in where doing this won't work. Again, your question isn't very clear.
Well, a string in C is nothing more than a few chars one after another. Whether they are unsigned or signed is not much of a problem; you can easily cast them.
So to get a string out of an unsigned char array, all you have to do is make sure the last byte is a terminating '\0' and then cast the array to char * (or copy it into an array of char).
I successfully use this to convert an unsigned char array to std::string:
// requires <sstream>, <iomanip>, <string>
unsigned char array[128];
std::stringstream buffer;
for (int i = 0; i < 128; i++)
{
    buffer << std::hex << std::setfill('0');
    buffer << std::setw(2) << static_cast<unsigned>(array[i]);
}
std::string hexString = buffer.str();
An example as you've asked:
unsigned char arr [SIZE];

Resources