Encrypting and decrypting a message with Blowfish in C

Here is some basic code for encrypting and decrypting a message:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <openssl/blowfish.h>
//gcc cryptage.c -o cryptage -lcrypto
int main(){
    BF_KEY *key = malloc(sizeof(BF_KEY));
    unsigned char *crypt_key = (unsigned char *)"Key of encryption";
    const unsigned char *in = (const unsigned char *)"Message to encrypt";
    int len = strlen((const char *)crypt_key);
    unsigned char *out = malloc(sizeof(char)*len);
    unsigned char *result = malloc(sizeof(char)*len);
    //Defining encryption key
    BF_set_key(key, len, crypt_key);
    //Encryption
    BF_ecb_encrypt(in, out, key, BF_ENCRYPT);
    //Decryption
    BF_ecb_encrypt(out, result, key, BF_DECRYPT);
    fprintf(stdout,"Result: %s\n",result);
    return 0;
}
My problem is the result I get: it's always a string of 8 characters, no more.
Can you please help me encrypt and decrypt the full message?
Thank you!

As @WhozCraig says, do things 8 bytes at a time.
The data to encrypt should be viewed as a byte array and not a C string.
So take the string to encrypt, including its terminating \0, and pad it with random data to form a byte array whose length is a multiple of 8.
Call encrypt multiple times, encrypting 8 bytes per iteration.
To decrypt, call the decryption routine for the same number of iterations. Note that the result buffer may need to be rounded up to a multiple of 8.
const unsigned char *in = (const unsigned char *)"Message to encrypt";
size_t InSize = strlen((const char *)in) + 1;
int KeyLen = strlen((const char *)crypt_key);
size_t OutSize = (InSize + 7) & ~(size_t)7;
unsigned char *out = malloc(OutSize);
unsigned char *outnext = out;
//Defining encryption key
BF_set_key(key, KeyLen, crypt_key);
//Encryption
while (InSize >= 8) {
    BF_ecb_encrypt(in, outnext, key, BF_ENCRYPT);
    in += 8;
    outnext += 8;
    InSize -= 8;
}
if (InSize > 0) { // Cope with a length that is not a multiple of 8
    unsigned char buf8[8];
    size_t i;
    memcpy(buf8, in, InSize);
    for (i = InSize; i < 8; i++) {
        buf8[i] = rand();
    }
    BF_ecb_encrypt(buf8, outnext, key, BF_ENCRYPT);
}
//Decryption
unsigned char *result = malloc(OutSize);
unsigned char *resultNext = result;
while (OutSize) {
    BF_ecb_encrypt(out, resultNext, key, BF_DECRYPT);
    out += 8;
    resultNext += 8;
    OutSize -= 8;
}
fprintf(stdout,"Result: %s\n",result);
// No need to print the random bytes that were generated.
return 0;
}
I'm not quite comfortable having a known byte (\0) encrypted in the last block. A different length indication may be prudent.
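For example (a sketch, not part of the original answer), PKCS#7-style padding is one common alternative: it removes the need for the \0 marker and the random filler, and the receiver can strip the pad unambiguously after decrypting.
/* Sketch: PKCS#7-style padding for an 8-byte block cipher like Blowfish.
   Every padding byte holds the pad count (1..8), so the decrypting side
   just reads the last byte to know how much to discard. */
size_t msglen = strlen((const char *)in);       /* payload length, no \0 needed */
size_t pad    = 8 - (msglen % 8);               /* always 1..8 padding bytes */
size_t total  = msglen + pad;                   /* multiple of 8 */
unsigned char *padded = malloc(total);
memcpy(padded, in, msglen);
memset(padded + msglen, (int)pad, pad);         /* each pad byte == pad count */
/* encrypt 'padded' 8 bytes at a time as above; after decryption, the value
   of the last byte tells you how many trailing bytes to drop. */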

Related

Generate HMAC SHA256 in C

I am creating an HMAC digest in a Java application and want to verify it in a C program. I have a hardcoded secret key in hex format.
I'm getting a segmentation fault while trying to calculate HmacSHA256 in C, and I can't figure out what I am messing up.
The Java program:
byte[] decodedKey = hexStringToByteArray("d44d4435c5eea8791456f2e20d7e176a");
SecretKey key = new SecretKeySpec(decodedKey, 0, decodedKey.length, "AES");
try {
    Mac mac = Mac.getInstance("HmacSHA256"); // Creating a Mac object
    mac.init(key);                           // Initializing the Mac object
    byte[] bytes = challenge.getBytes();
    byte[] macResult = mac.doFinal(bytes);
    return macResult;
} catch (NoSuchAlgorithmException e) {
    System.out.println("Not valid algorithm" + e);
} catch (InvalidKeyException e) {
    System.out.println("Invalid key" + e);
}
The C program:
const char* key = hexstr_to_char("d44d4435c5eea8791456f2e20d7e176a");
unsigned char *result;
unsigned int* resultlen;
hmac_sha256(key, strlen(key),
            challenge, strlen("d44d4435c5eea8791456f2e20d7e176a"),
            result, resultlen);

unsigned char* hmac_sha256(const void *key, int keylen,
                           const unsigned char *data, int datalen,
                           unsigned char *result, unsigned int* resultlen)
{
    return HMAC(EVP_sha256(), key, keylen, data, datalen, result, resultlen);
}

unsigned char* hexstr_to_char(const char* hexstr)
{
    size_t len = strlen(hexstr);
    if (len % 2 != 0)
        return NULL;
    size_t final_len = len / 2;
    unsigned char* chrs = (unsigned char*)malloc((final_len+1) * sizeof(*chrs));
    for (size_t i=0, j=0; j<final_len; i+=2, j++)
        chrs[j] = (hexstr[i] % 32 + 9) % 25 * 16 + (hexstr[i+1] % 32 + 9) % 25;
    chrs[final_len] = '\0';
    return chrs;
}
If you read the documentation for HMAC() more carefully, you'll see there are a few ways to call it, depending on how you handle the output buffer.
You can pass NULL for the result on a first call to get the required length, then allocate the result buffer and call again.
Or you can pass NULL for the result buffer, and it will return a static array buffer (which you don't own); however, that is documented as not thread-safe.
The easiest way may be to rely on EVP_MAX_MD_SIZE to declare your buffer statically. You should also declare the length variable as an unsigned int, not a pointer, and pass its address with the & operator; this is very basic C.
Try this:
unsigned int resultlen = 0;
unsigned char resultbuf[EVP_MAX_MD_SIZE];
hmac_sha256(key, strlen(key), challenge, strlen("d44d4435c5eea8791456f2e20d7e176a"),
resultbuf, &resultlen);
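For reference, here is a minimal self-contained sketch along those lines (the key bytes are the decoded hex key from the question; the challenge string is just a placeholder, and error checking is omitted):
// gcc hmac_example.c -o hmac_example -lcrypto
#include <stdio.h>
#include <string.h>
#include <openssl/hmac.h>

int main(void)
{
    /* Decoded form of "d44d4435c5eea8791456f2e20d7e176a" */
    unsigned char key[] = { 0xd4, 0x4d, 0x44, 0x35, 0xc5, 0xee, 0xa8, 0x79,
                            0x14, 0x56, 0xf2, 0xe2, 0x0d, 0x7e, 0x17, 0x6a };
    const unsigned char challenge[] = "placeholder challenge";  /* made up for the example */
    unsigned char resultbuf[EVP_MAX_MD_SIZE];
    unsigned int resultlen = 0;

    HMAC(EVP_sha256(), key, (int)sizeof key,
         challenge, strlen((const char *)challenge),
         resultbuf, &resultlen);

    for (unsigned int i = 0; i < resultlen; i++)
        printf("%02x", resultbuf[i]);
    printf("\n");
    return 0;
}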
Documentation:
https://www.openssl.org/docs/man1.1.1/man3/HMAC.html
There are many C language tutorials out there.

Implementing AES-128 CBC by passing array as input data

I am working on AES-128/192/256. Basically, I am getting data from a broker as a string; I just need to encrypt that data and then verify it.
I have already come across these links: https://github.com/empyreanx/tiny-AES128-C and https://github.com/kokke/tiny-AES-c.
My code is:
static void test_encrypt_cbc(void)
{
    unsigned char input[] =
        "So_letmeknowRuinterested/towork#thiscompany.comElsewilllookother"; // 64 bytes
    unsigned char cipher[sizeof input];
    printf("size of in:%lu\n", strlen(input));
    unsigned char key[] = "Gns7AauH3dnaod=="; // 16 bytes
    unsigned char iv[] = "vhdNaleuTHenaOlL"; // 16 bytes
    AES128_CBC_encrypt_buffer(cipher, input, 64, key, iv);
    if(0 == memcmp((char*) cipher, (char*) input, 64))
    {
        printf("SUCCESS!\n");
    }
    else
    {
        printf("FAILURE!\n");
    }
}
I also printed the cipher text after encryption, and it prints some undefined characters.
When I compare "cipher" with "input", it ends in FAILURE.
Can anyone please tell me where I am going wrong?
Thanks in advance.
It is logical for the cipher text to be different from the input, isn't it?
To check that encryption worked, you should decrypt the encrypted message and check that the result equals the original input:
static void test_encrypt_cbc(void)
{
    /* 64 bytes, or 512 bits */
    unsigned char input[] =
        "So_letmeknowRuinterested/towork#thiscompany.comElsewilllookother";
    unsigned char cipher[sizeof input];
    unsigned char output[sizeof input];
    printf("size of in:%lu\n", strlen(input));
    /* 16 bytes or 128 bits */
    unsigned char key[] = "Gns7AauH3dnaod==";
    unsigned char iv[] = "vhdNaleuTHenaOlL";
    /* input --> cipher */
    AES128_CBC_encrypt_buffer(cipher, input, 64, key, iv);
    /* cipher --> output */
    AES128_CBC_decrypt_buffer(output, cipher, 64, key, iv);
    if (0 == memcmp((char*) output, (char*) input, 64))
    {
        printf("SUCCESS!\n");
    }
    else
    {
        int i;
        printf("FAILURE!\nInput and output are different:\n");
        for (i = 0; i < sizeof input; ++i)
        {
            printf("%02x - %02x\n", input[i], output[i]);
        }
    }
}
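As a side note (not part of the original answer): newer revisions of the kokke/tiny-AES-c library linked in the question expose an in-place, context-based API instead of AES128_CBC_encrypt_buffer. If your copy of aes.h declares AES_init_ctx_iv and AES_CBC_encrypt_buffer taking an AES_ctx argument, the same round trip looks roughly like this sketch:
#include <stdint.h>
#include "aes.h"   /* kokke/tiny-AES-c header; make sure CBC and AES128 are enabled */

/* Sketch: round trip with the context-based API; the buffer is transformed
   in place, so keep a copy of the plaintext if you want to compare later. */
static void roundtrip(uint8_t *buf, size_t len,
                      const uint8_t *key, const uint8_t *iv)
{
    struct AES_ctx ctx;

    AES_init_ctx_iv(&ctx, key, iv);
    AES_CBC_encrypt_buffer(&ctx, buf, len);   /* buf now holds the ciphertext */

    AES_init_ctx_iv(&ctx, key, iv);           /* reset the IV before decrypting */
    AES_CBC_decrypt_buffer(&ctx, buf, len);   /* buf holds the plaintext again */
}
Because the buffer is modified in place, memcmp against the original only makes sense if you saved a separate copy of the input first.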

Validation system for openssl hmac code using nist vector values as input

I have written a C program that takes a key and a message as input, makes calls to the OpenSSL HMAC functions, and generates the resulting MAC.
The input values are taken from the NIST Test Vectors.
#define KEY_SIZE 11  // in bytes
#define MSG_SIZE 129 // in bytes
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <openssl/hmac.h>
void str2hex(char *, char*, int);
int main() {
    char *key, *msg;
    unsigned char keyy[KEY_SIZE], msgt[MSG_SIZE], temp[4];
    unsigned char* result;
    unsigned int i, len = 20, Tlen = 10;
    key = "";//values specified below
    msg = "";//values specified below
    /*CONVERT STRING TO HEX DIGITS - KEY*/
    str2hex(key, keyy, KEY_SIZE);
    /*CONVERT STRING TO HEX DIGITS - MSG*/
    str2hex(msg, msgt, MSG_SIZE);
    result = (unsigned char*)malloc(sizeof(char) * len);
    HMAC_CTX ctx;
    HMAC_CTX_init(&ctx);
    HMAC_Init_ex(&ctx, keyy, strlen(keyy), EVP_sha1(), NULL);
    HMAC_Update(&ctx, (unsigned char*)&msgt, strlen(msgt));
    HMAC_Final(&ctx, result, &len);
    HMAC_CTX_cleanup(&ctx);
    printf("HMAC digest: ");
    for (i = 0; i < Tlen; i++)
        printf("%02x", result[i]);
    printf("\n");
    free(result);
    return 0;
}
//===================== string to hex conversion ================================//
void str2hex(char *str, char *hex, int len) {
    int tt, ss;
    unsigned char temp[4];
    for (tt = 0, ss = 0; tt < len, ss < 2 * len; tt++, ss += 2) {
        temp[0] = '0';
        temp[1] = 'x';
        temp[2] = str[ss];
        temp[3] = str[ss + 1];
        hex[tt] = (int) strtol(temp, NULL, 0);
    }
}
//---------------------------------------------------------------------------------//
The first input given:
Key = 82f3b69a1bff4de15c33
Msg = fcd6d98bef45ed6850806e96f255fa0c8114b72873abe8f43c10bea7c1df706f10458e6d4e1c9201f057b8492fa10fe4b541d0fc9d41ef839acff1bc76e3fdfebf2235b5bd0347a9a6303e83152f9f8db941b1b94a8a1ce5c273b55dc94d99a171377969234134e7dad1ab4c8e46d18df4dc016764cf95a11ac4b491a2646be1
Output generated:
HMAC digest: 1ba0e66cf72efc349207
Nist_Mac = 1ba0e66cf72efc349207
It matches, so it's a success.
But for the second input:
Key = 4766e6fe5dffc98a5c50
Msg = d68b828a153f5198c005ee36c0af2ff92e84907517f01d9b7c7993469df5c21078fa356a8c9715ece2414be94e10e547f32cbb8d0582523ed3bb0066046e51722094aa44533d2c876e82db402fbb00a6c2f2cc3487973dfc1674463e81e42a39d9402941f39b5e126bafe864ea1648c0a5be0a912697a87e4f8eabf79cbf130e
Output generated:
HMAC digest: ca96f112a79882074b63
Nist_Mac = 007e4504041a12f9e345
It's failing. If anyone could check my code and let me know what I am doing wrong, it would be really helpful.
You have two issues here.
The first is that you're using strlen on an array of bytes that may contain a null byte. Since this function counts the number of bytes until it finds a null byte, you won't get what you expect if your array contains a null byte (as is the case for your second example).
Instead of using strlen on the byte array to determine the length, use the actual length of the data. Since you're converting a string containing hex digits to bytes, the length of the byte array is half the length of the input string.
HMAC_Init_ex(&ctx, keyy, strlen(key)/2, EVP_sha1(), NULL);
HMAC_Update(&ctx, msgt, strlen(msg)/2);
Note also that you should pass msgt to HMAC_Update, not &msgt, as the latter is a pointer to an array.
The second issue is in your str2hex function. When you construct temp, you don't have enough space for a terminating null byte. This causes strtol, which expects a null-terminated string, to read past the end of the array. This invokes undefined behavior.
In this particular case you're "lucky" that it works, as the byte in memory that follows temp happens to contain either a null byte or a non-digit. You can't however depend on this behavior. Fix this by making temp one byte longer and explicitly setting that byte to 0. And while you're at it, you should also fix the signed / unsigned mismatch in your function arguments and change the type of temp to an unsigned char array.
void str2hex(char *, unsigned char*, int);
...
void str2hex(char *str, unsigned char *hex, int len) {
    int tt, ss;
    char temp[5];
    for (tt = 0, ss = 0; tt < len, ss < 2 * len; tt++, ss += 2) {
        temp[0] = '0';
        temp[1] = 'x';
        temp[2] = str[ss];
        temp[3] = str[ss + 1];
        temp[4] = 0;
        hex[tt] = strtol(temp, NULL, 0);
    }
}
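As an aside (a sketch, not part of the original answer), the temporary "0x.." string can be avoided entirely by letting sscanf parse two hex digits at a time:
#include <stdio.h>

/* Sketch: convert 2*len hex characters into len bytes using sscanf.
   Assumes the input contains only valid hex digits. */
void str2hex(const char *str, unsigned char *hex, int len) {
    int i;
    for (i = 0; i < len; i++) {
        unsigned int byte;
        sscanf(str + 2 * i, "%2x", &byte);  /* read exactly two hex digits */
        hex[i] = (unsigned char)byte;
    }
}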
At byte position 58 in the message, you have a 0x00 (null) byte. Since you're doing strlen(msgt), this results in a length of 58 instead of 128. Excerpt from the documentation (emphasis mine):
The C library function size_t strlen(const char *str) computes the length of the string str up to, but not including the terminating null character.
Just use the proper length of the message and don't use string operations on char arrays that do not contain printable bytes.

Printing out byte array as formatted text gives different outputs - C

I am trying to print out a byte array one byte at a time in hexadecimal format within a for loop, like this:
int my_function(void *data)
{
    str *obuf = (str*)data;
    int i;
    for (i = 0; i < obuf->len; i++)
    {
        printf("%02X:", obuf->s[i]);
    }
    return 0;
}
str in this case is the structure from Kamailio; see http://www.asipto.com/pub/kamailio-devel-guide/#c05str
The expected output:
80:70:0F:80:00:00:96:00:1D:54:7D:7C:36:9D:1B:9A:20:BF:F9:68:E8:E8:E8:F8:68:98:E8:EE:E8:B4:7C:3C:34:74:74:64:74:69:2C:5A:3A:3A:3A:3A:3A:3A:32:24:43:AD:19:1D:1D:1D:1D:13:1D:1B:3B:60:AB:AB:AB:AB:AB:0A:BA:BA:BA:BA:B0:AB:AB:AB:AB:AB:0A:BA:BA:BA:BA:B9:3B:61:88:43:
What I am getting:
FFFFFF80:70:0F:FFFFFF80:00:00:FFFFFF96:00:1D:54:7D:7C:36:FFFFFF9D:1B:FFFFFF9A:20:FFFFFFBF:FFFFFFF9:68:FFFFFFE8:FFFFFFE8:FFFFFFE8:FFFFFFF8:68:FFFFFF98:FFFFFFE8:FFFFFFEE:FFFFFFE8:FFFFFFB4:7C:3C:34:74:74:64:74:69:2C:5A:3A:3A:3A:3A:3A:3A:32:24:43:FFFFFFAD:19:1D:1D:1D:1D:13:1D:1B:3B:60:FFFFFFAB:FFFFFFAB:FFFFFFAB:FFFFFFAB:FFFFFFAB:0A:FFFFFFBA:FFFFFFBA:FFFFFFBA:FFFFFFBA:FFFFFFB0:FFFFFFAB:FFFFFFAB:FFFFFFAB:FFFFFFAB:FFFFFFAB:0A:FFFFFFBA:FFFFFFBA:FFFFFFBA:FFFFFFBA:FFFFFFB9:3B:61:FFFFFF88:43:
Could someone please help me understand why some of the bytes are prefixed with FFFFFF and others aren't?
Thanks in advance
It looks like obuf->s[i] is a signed value. When it is passed to printf it is promoted to int, and bytes with the most significant bit set are sign-extended, which produces the FFFFFF.. prefix.
You need to cast it to an unsigned value to get rid of it:
printf("%02X:", (unsigned char)(obuf->s[i]));
The problem appears with chars that have the most significant bit set (outside the pure ASCII range 0-127). The key point is to treat the chars as unsigned.
printf("%02X:", (unsigned char)(obuf->s[i]));
See this simple compilable repro C code:
#include <stdio.h>
#include <string.h>
struct _str {
char* s; /* pointer to the beginning of string (char array) */
int len; /* string length */
};
typedef struct _str str;
int my_function(void *data)
{
str* obuf;
int i;
obuf = (str*)data;
for (i = 0; i < obuf->len; i++) {
printf("%02X:", (unsigned char)(obuf->s[i]));
}
return 0;
}
int main(void)
{
char buf[2];
str s;
/* Test with ordinary ASCII string */
s.s = "Hello";
s.len = strlen(s.s);
my_function(&s);
printf("\n");
/* Test with char values with most significant bit set */
buf[0] = 0xF1;
buf[1] = 0x00;
s.s = buf;
s.len = 1;
my_function(&s);
return 0;
}
With MSVC, I get this output:
48:65:6C:6C:6F:
F1:
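As a small aside (not from the original answers), C99's hh length modifier lets printf do the conversion itself, so the cast can be folded into the format string:
/* Equivalent with the C99 "hh" length modifier: the promoted int argument
   is converted back to unsigned char before being formatted. */
printf("%02hhX:", obuf->s[i]);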

AES128/CBC test not working

This is my code (error checking was deliberately omitted for readability):
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <gcrypt.h>

#define GCRYPT_VERSION "1.5.0"
#define GCRY_CIPHER GCRY_CIPHER_AES128

int main(void){
    if(!gcry_check_version(GCRYPT_VERSION)){
        fputs("libgcrypt version mismatch\n", stderr);
        exit(2);
    }
    gcry_control(GCRYCTL_SUSPEND_SECMEM_WARN);
    gcry_control(GCRYCTL_INIT_SECMEM, 16384, 0);
    gcry_control(GCRYCTL_RESUME_SECMEM_WARN);
    gcry_control(GCRYCTL_INITIALIZATION_FINISHED, 0);

    int algo = -1;
    size_t i;
    const char *name = "aes128";
    char plain_text[16] = {0x80};
    char key[16] = {0};
    char iniVector[16] = {0};
    size_t txtLenght = strlen(plain_text);
    char *encBuffer = malloc(txtLenght);
    gcry_cipher_hd_t hd;

    algo = gcry_cipher_map_name(name);
    size_t blkLength = gcry_cipher_get_algo_blklen(GCRY_CIPHER);
    size_t keyLength = gcry_cipher_get_algo_keylen(GCRY_CIPHER);

    gcry_cipher_open(&hd, algo, GCRY_CIPHER_MODE_CBC, 0);
    gcry_cipher_setkey(hd, key, keyLength);
    gcry_cipher_setiv(hd, iniVector, blkLength);
    gcry_cipher_encrypt(hd, encBuffer, txtLenght, plain_text, txtLenght);

    printf("encBuffer = ");
    for(i = 0; i < txtLenght; i++){
        printf("%02x", (unsigned char) encBuffer[i]);
    }
    printf("\n");

    gcry_cipher_close(hd);
    free(encBuffer);
    return 0;
}
Expected result:
KEY = 00000000000000000000000000000000
IV = 00000000000000000000000000000000
PLAINTEXT = 80000000000000000000000000000000
CIPHERTEXT = 3ad78e726c1ec02b7ebfe92b23d9ec34
My result:
KEY = 00000000000000000000000000000000
IV = 00000000000000000000000000000000
PLAINTEXT = 80000000000000000000000000000000
CIPHERTEXT = 42
Why do I get this output? What am I doing wrong?
You are using ASCII values, but the AES test vectors are given as hexadecimal values.
Try with hex values instead (the remaining elements of the arrays are initialized to 0 in C):
char plain_text[16] = {0x80};
char key[16] = {0};
char iniVector[16] = {0};
size_t txtLenght = sizeof plain_text;
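As a side note (a sketch, not part of the original answer), since error checking was deliberately omitted: gcry_cipher_encrypt returns a gcry_error_t, and checking it would surface problems such as a bad buffer length or key size instead of silently printing garbage:
/* Sketch: check the return value of the encryption call. */
gcry_error_t err = gcry_cipher_encrypt(hd, encBuffer, txtLenght,
                                       plain_text, txtLenght);
if (err) {
    fprintf(stderr, "gcry_cipher_encrypt failed: %s/%s\n",
            gcry_strsource(err), gcry_strerror(err));
}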
