I'm trying to create a SHA256 hash with a key using OpenSSL's HMAC functions. My stack keeps getting corrupted (every value set to 0) after I call HMAC_Init_ex. I'm using Xcode and running OS X 10.8.5. Running "openssl version" in my terminal outputs "OpenSSL 0.9.8y 5 Feb 2013".
Here is my function and all my #includes:
#include <stdio.h>
#include <openssl/hmac.h>
char* hash(char *str, char* key){
    int inputLen = strlen(str);
    int keyLen = strlen(key);

    HMAC_CTX ctx;
    HMAC_CTX_init(&ctx);
    HMAC_Init_ex(&ctx, key, keyLen, EVP_sha256(), NULL); // Everything is fine up to here.
    HMAC_Update(&ctx, str, inputLen); // By the time this line runs, str and key are NULL, and inputLen and keyLen are 0.

    char* ret = malloc(65*sizeof(char));
    HMAC_Final(&ctx, ret, 65);
    HMAC_CTX_cleanup(&ctx);
    ret[65] = '\0';
    return ret;
}
This code should be working. It probably has something to do with my libraries, but I don't know what. Did I do something wrong when importing the libraries?
Update:
I found an example from here that uses the fully encapsulated HMAC function; it is said to be essentially the same as what I was doing before, and strangely, it works. So I have circumvented my problem, but you could still answer in case it helps somebody else. It's probably some weird, specific issue with my libraries. The working function:
char* hash(char *str, char* key){
    int inputLen = strlen(str);
    int keyLen = strlen(key);
    unsigned int retLen = 65;

    char* ret = emalloc(retLen*sizeof(char));
    ret = HMAC(EVP_sha256(), key, keyLen, (unsigned char*)str, inputLen, NULL, NULL);
    return ret;
}
You missed OpenSSL initialization. The ENGINE (or the library's configuration) provides you with the cipher methods, digest methods, etc.
#include <openssl/engine.h>
#include <openssl/hmac.h>

HMAC_CTX ctx;
/* key, data and result_len are assumed to be declared elsewhere;
   result_len should start out as the digest size (32 for SHA-256). */
result = (unsigned char*) malloc(sizeof(char) * result_len);

/* Load and register the built-in engines so the digest implementations are available. */
ENGINE_load_builtin_engines();
ENGINE_register_all_complete();

HMAC_CTX_init(&ctx);
HMAC_Init_ex(&ctx, key, 16, EVP_sha256(), NULL);  /* 16-byte key */
HMAC_Update(&ctx, data, 8);                       /* 8 bytes of input */
HMAC_Final(&ctx, result, &result_len);
HMAC_CTX_cleanup(&ctx);
Use "openssl version" to get the OpenSSL version; when the version is older than 1.0.1e, you should use HMAC_Init... instead of HMAC_Init_ex...
In version 1.0.1e, HMAC() is implemented like this:
unsigned char *HMAC(const EVP_MD *evp_md, const void *key, int key_len,
                    const unsigned char *d, size_t n, unsigned char *md,
                    unsigned int *md_len)
{
    HMAC_CTX c;
    static unsigned char m[EVP_MAX_MD_SIZE];

    if (md == NULL)
        md = m;
    HMAC_CTX_init(&c);
    if (!HMAC_Init(&c, key, key_len, evp_md))
        goto err;
    if (!HMAC_Update(&c, d, n))
        goto err;
    if (!HMAC_Final(&c, md, md_len))
        goto err;
    HMAC_CTX_cleanup(&c);
    return md;
err:
    return NULL;
}
You can compare this with the HMAC() implementation in later versions of OpenSSL.
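For reference, here is a minimal sketch of calling the one-shot HMAC() with SHA-256 and printing the result as hex; the key and message strings are made up for illustration and are not from the question.

#include <stdio.h>
#include <string.h>
#include <openssl/hmac.h>

int main(void) {
    const char *key = "secret-key";          /* example key, not from the question */
    const char *msg = "message to sign";     /* example message */
    unsigned char md[EVP_MAX_MD_SIZE];       /* large enough for any digest */
    unsigned int md_len = 0;
    unsigned int i;

    /* One-shot HMAC: writes the raw 32-byte SHA-256 MAC into md and its length into md_len. */
    if (HMAC(EVP_sha256(), key, (int)strlen(key),
             (const unsigned char *)msg, strlen(msg), md, &md_len) == NULL)
        return 1;

    for (i = 0; i < md_len; i++)
        printf("%02x", md[i]);
    printf("\n");
    return 0;
}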
Related
I am creating an HMAC digest in a Java application and want to verify it in a C program. I have a hardcoded secret key in hex format.
I'm getting a segmentation fault while trying to calculate HmacSHA256 in C. I couldn't figure out what I am messing up.
The Java program:
byte[] decodedKey = hexStringToByteArray("d44d4435c5eea8791456f2e20d7e176a");
SecretKey key = new SecretKeySpec(decodedKey, 0, decodedKey.length, "AES");
try {
    Mac mac = Mac.getInstance("HmacSHA256"); // Creating a Mac object
    mac.init(key);                           // Initializing the Mac object
    byte[] bytes = challenge.getBytes();
    byte[] macResult = mac.doFinal(bytes);
    return macResult;
} catch (NoSuchAlgorithmException e) {
    System.out.println("Not valid algorithm" + e);
} catch (InvalidKeyException e) {
    System.out.println("Invalid key" + e);
}
The C program:
const char* key = hexstr_to_char("d44d4435c5eea8791456f2e20d7e176a");
unsigned char *result;
unsigned int* resultlen;

hmac_sha256(key, strlen(key),
            challenge, strlen("d44d4435c5eea8791456f2e20d7e176a"),
            result, resultlen);

unsigned char* hmac_sha256(const void *key, int keylen,
                           const unsigned char *data, int datalen,
                           unsigned char *result, unsigned int* resultlen)
{
    return HMAC(EVP_sha256(), key, keylen, data, datalen, result, resultlen);
}
unsigned char* hexstr_to_char(const char* hexstr)
{
    size_t len = strlen(hexstr);
    if (len % 2 != 0)
        return NULL;

    size_t final_len = len / 2;
    unsigned char* chrs = (unsigned char*)malloc((final_len+1) * sizeof(*chrs));
    for (size_t i=0, j=0; j<final_len; i+=2, j++)
        chrs[j] = (hexstr[i] % 32 + 9) % 25 * 16 + (hexstr[i+1] % 32 + 9) % 25;
    chrs[final_len] = '\0';
    return chrs;
}
If you read the documentation for HMAC() more carefully you'll see that there are a few ways to handle the output buffer.
You can allocate the result buffer yourself: for SHA-256 the digest is always 32 bytes (EVP_MD_size(EVP_sha256())), and HMAC() writes the actual length into the variable whose address you pass as the last argument.
Or you can pass NULL for the result buffer, and it will return a static array buffer (which you don't own), however that is documented as not thread-safe.
The easiest way may be to rely on EVP_MAX_MD_SIZE to statically declare your buffer. You should also declare the length variable as an unsigned int, not a pointer, and pass its address with the & operator - this is very basic C.
Try this:
unsigned int resultlen = 0;
unsigned char resultbuf[EVP_MAX_MD_SIZE];
hmac_sha256(key, strlen(key), challenge, strlen("d44d4435c5eea8791456f2e20d7e176a"),
resultbuf, &resultlen);
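For completeness, here is a sketch of the static-buffer variant mentioned above (single-threaded use only); it assumes key points at the 16 decoded key bytes and challenge is a NUL-terminated string, as in the question.

unsigned int maclen = 0;
unsigned char *mac = HMAC(EVP_sha256(), key, 16,
                          (const unsigned char *)challenge, strlen(challenge),
                          NULL, &maclen);
/* mac points into an internal static buffer owned by OpenSSL:
   copy it out if you need to keep it past the next HMAC() call. */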
Documentation:
https://www.openssl.org/docs/man1.1.1/man3/HMAC.html
There are many C language tutorials out there.
I want to encode an X509 structure into DER bytes. Following the source code example from the OpenSSL (version > 0.9.7) man page, I need to do this (if I want i2d_X509 to allocate memory on its own):
int len;
unsigned char *buf;
buf = NULL;
len = i2d_X509(x, &buf);
if (len < 0)
/* error */
However, it is not completely clear from the documentation what the right way is to free the memory once I am done with buf (I assume OPENSSL_free needs to be called).
What is the correct way to free buf?
Short answer: OPENSSL_free must be used to free buf.
Long answer:
The IMPLEMENT_ASN1_FUNCTIONS macro expands to the definition of the i2d_X509 function. The example below demonstrates that; put the following source code into source.c:
#include <openssl/asn1t.h>
IMPLEMENT_ASN1_FUNCTIONS(X509)
After running gcc -E source.c, the macro expands to:
X509 *d2i_X509(X509 **a, const unsigned char **in, long len) { return (X509 *)ASN1_item_d2i((ASN1_VALUE **)a, in, len, (&(X509_it))); }
int i2d_X509(X509 *a, unsigned char **out) { return ASN1_item_i2d((ASN1_VALUE *)a, out, (&(X509_it))); }
X509 *X509_new(void) { return (X509 *)ASN1_item_new((&(X509_it))); }
void X509_free(X509 *a) { ASN1_item_free((ASN1_VALUE *)a, (&(X509_it))); }
The point of interest is the definition of i2d_X509, which in turn calls ASN1_item_i2d. In the OpenSSL source, ASN1_item_i2d lives in the tasn_enc.c file and delegates to asn1_item_flags_i2d:
static int asn1_item_flags_i2d(ASN1_VALUE *val, unsigned char **out,
                               const ASN1_ITEM *it, int flags)
{
    if (out && !*out) {
        unsigned char *p, *buf;
        int len;

        len = ASN1_item_ex_i2d(&val, NULL, it, -1, flags);
        if (len <= 0)
            return len;
        buf = OPENSSL_malloc(len);
        if (buf == NULL)
            return -1;
        p = buf;
        ASN1_item_ex_i2d(&val, &p, it, -1, flags);
        *out = buf;
        return len;
    }

    return ASN1_item_ex_i2d(&val, out, it, -1, flags);
}
The if (out && !*out) branch handles the case described in the original question (buf is NULL). So, internally, OpenSSL allocates memory for buf using OPENSSL_malloc, and as a consequence OPENSSL_free must be used to deallocate it.
Note: I looked at the OpenSSL source code currently available on GitHub.
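Putting it together, a minimal usage sketch (assuming x is a valid X509 pointer, as in the question; error handling abbreviated):

unsigned char *buf = NULL;
int len = i2d_X509(x, &buf);    /* i2d_X509 allocates buf with OPENSSL_malloc */
if (len < 0) {
    /* handle the error */
} else {
    /* ... use the DER bytes in buf[0..len-1] ... */
    OPENSSL_free(buf);          /* release with OPENSSL_free, not plain free() */
}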
This is a continuation of my previous question: Intermittent decryption failures in EVP_DecryptFinal_ex when using AES-128/CBC.
I am trying to encrypt and decrypt using the C OpenSSL EVP library. After I received an answer to my question above, I updated my code accordingly:
This variable:
int len = outlen1 + outlen2;
Stores the number of bytes encrypted in the encrypt function. I then pass that variable to the decrypt function (the passing is not shown in the code below), which then uses it as the actual ciphertext length.
However on some input strings, I get segmentation faults at the EVP_DecryptFinal_ex() function.
Something is obviously wrong with the number of bytes encrypted/decrypted or padding. I just don't know what that is.
char* encrypt(char *key, char *s) {
    unsigned char iv[16] = {[0 ... 15 ] = 0};
    unsigned char outbuf[1024] = {[0 ... 1023] = 0};
    int outlen1, outlen2;

    EVP_CIPHER_CTX ctx;
    EVP_CIPHER_CTX_init(&ctx);
    EVP_EncryptInit_ex(&ctx, EVP_aes_128_cbc(), NULL, key, iv);
    if (EVP_EncryptUpdate(&ctx, outbuf, &outlen1, s, strlen(s)) == 1) {
        if (EVP_EncryptFinal_ex(&ctx, outbuf + outlen1, &outlen2) == 1) {
            EVP_CIPHER_CTX_cleanup(&ctx);
            len = outlen1 + outlen2;
            return strdup(outbuf);
        }
    }
    EVP_CIPHER_CTX_cleanup(&ctx);
    return NULL;
}

char* decrypt(char *key, char *s, int len) {
    unsigned char iv[16] = {[0 ... 15 ] = 0};
    unsigned char outbuf[1024] = {[0 ... 1023] = 0};
    int outlen1, outlen2;

    printf("len: %d\n", len);
    printf("strlen(s): %d\n", strlen(s));

    EVP_CIPHER_CTX ctx;
    EVP_CIPHER_CTX_init(&ctx);
    EVP_DecryptInit_ex(&ctx, EVP_aes_128_cbc(), NULL, key, iv);
    if (EVP_DecryptUpdate(&ctx, outbuf, &outlen1, s, len) == 1) {
        printf("After update\n");
        if (EVP_DecryptFinal_ex(&ctx, outbuf + outlen1, &outlen2) == 1) {
            printf("After final\n");
            EVP_CIPHER_CTX_cleanup(&ctx);
            return strdup(outbuf);
        }
    }
    EVP_CIPHER_CTX_cleanup(&ctx);
    return NULL;
}
NOTE:
I was able to fix the problems I previously had where decrypt final would fail to decrypt certain strings. Those strings can be decrypted fine now. However, some other strings still fail, except this time I am getting seg faults.
You cannot use string functions on binary data. This is especially the case if that binary data is indistinguishable from random. Random binary data may contain null characters anywhere, or not at all. strdup uses strcpy internally, which relies on the null character to be present.
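One way to fix it, sketched as a small helper that copies raw bytes by length instead of relying on strdup; the helper name is mine, not from the question.

#include <stdlib.h>
#include <string.h>

/* Return a heap copy of n bytes of binary data (may contain 0x00). */
unsigned char *copy_bytes(const unsigned char *buf, int n) {
    unsigned char *out = malloc(n);
    if (out != NULL)
        memcpy(out, buf, n);      /* copy by length, not by string functions */
    return out;
}

/* In encrypt(): len = outlen1 + outlen2; return (char *)copy_bytes(outbuf, len); */

In decrypt(), the same idea applies in reverse: either terminate the recovered plaintext explicitly (outbuf[outlen1 + outlen2] = '\0';) before treating it as a string, or return it together with its length.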
Following the OpenSSL docs, I /think/ that what I'm doing is correct... but apparently it's not. Compiling the file (with gcc -g -Wall -Wextra -lssl sign.c) yields no errors or warnings. EVP_VerifyFinal() always returns 0 (meaning the check failed). What is causing that?
static const EVP_MD * type;

unsigned char * sha(char * input)
{
    EVP_MD_CTX c;
    unsigned char *md;
    unsigned int md_len;

    md = malloc(EVP_MAX_MD_SIZE);
    EVP_MD_CTX_init(&c);
    EVP_DigestInit_ex(&c, type, NULL);
    EVP_DigestUpdate(&c, input, strlen(input));
    EVP_DigestFinal_ex(&c, md, &md_len);
    EVP_MD_CTX_cleanup(&c);
    return md;
}
unsigned char * sign(EVP_PKEY * key, unsigned char * data)
{
    EVP_MD_CTX c;
    unsigned char *sig;
    unsigned int len;

    EVP_MD_CTX_init(&c);
    sig = malloc(EVP_PKEY_size(key));
    EVP_SignInit(&c, type);
    EVP_SignUpdate(&c, data, strlen((char *)data));
    EVP_SignFinal(&c, sig, &len, key);
    EVP_MD_CTX_cleanup(&c);
    return sig;
}
int verify(EVP_PKEY * key, unsigned char * data, unsigned char * original)
{
    EVP_MD_CTX c;
    int ret;

    EVP_MD_CTX_init(&c);
    EVP_VerifyInit(&c, type);
    EVP_VerifyUpdate(&c, data, (unsigned int)sizeof(data));
    ret = EVP_VerifyFinal(&c, original, (unsigned int)strlen((char *)original), key);
    return ret;
}
int main(void)
{
    EVP_PKEY *sk, *pk;
    FILE *sfd, *pfd;
    unsigned char *hash, *sig;
    unsigned int i;

    sfd = fopen("secret.pem", "r");
    pfd = fopen("public.pem", "r");
    sk = PEM_read_PrivateKey(sfd, NULL, NULL, NULL);
    pk = PEM_read_PUBKEY(pfd, NULL, NULL, NULL);
    fclose(sfd);
    fclose(pfd);

    OpenSSL_add_all_digests();
    type = EVP_get_digestbyname("SHA1");

    hash = sha("moo");
    for(i = 0; i < sizeof(hash); i++)
        printf("%02x", hash[i]);
    printf("\n");

    sig = sign(sk, hash);
    switch( verify(pk, sig, hash) )
    {
        case 0:
            printf("Check failed.\n");
            break;
        case 1:
            printf("Check succeeded!\n");
            break;
        default:
            printf("Oh look, an error: %d", ERR_get_error());
            break;
    }
    return 0;
}
You are not passing the correct size:
EVP_VerifyUpdate(&c, data, (unsigned int)sizeof(data));
Since data is defined as unsigned char *, sizeof(data) is probably 4 or 8 (number of bytes required to hold a pointer).
Try passing the actual number of bytes that you've allocated, i.e. EVP_PKEY_size(key). You will have to pass that to your verify() function.
It's possible that something else is also wrong, but this one caught my eye.
I know this is an old question, but one bug could confuse users.
In the verify function, original and data should be swapped: original passed to EVP_VerifyUpdate and data to EVP_VerifyFinal. String length should not be used, because a byte array can contain 0x00 as a valid value, which would be treated as a terminating "\0".
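Putting both answers together, a corrected verify() might look like the sketch below; the explicit datalen and siglen parameters are additions not in the original code (siglen being the length EVP_SignFinal reported), and type is the file-scope digest from the question.

int verify(EVP_PKEY *key, unsigned char *sig, unsigned int siglen,
           unsigned char *data, unsigned int datalen)
{
    EVP_MD_CTX c;
    int ret;

    EVP_MD_CTX_init(&c);
    EVP_VerifyInit(&c, type);
    EVP_VerifyUpdate(&c, data, datalen);          /* the data that was signed, with its real length */
    ret = EVP_VerifyFinal(&c, sig, siglen, key);  /* the signature and its length */
    EVP_MD_CTX_cleanup(&c);
    return ret;
}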
I've found some md5 code that consists of the following prototypes...
I've been trying to find out where I have to put the string I want to hash, what functions I need to call, and where to find the string once it has been hashed. I'm also confused about what the uint32 buf[4] and uint32 bits[2] members of the struct are.
struct MD5Context {
    uint32 buf[4];
    uint32 bits[2];
    unsigned char in[64];
};

/*
 * Start MD5 accumulation. Set bit count to 0 and buffer to mysterious
 * initialization constants.
 */
void MD5Init(struct MD5Context *context);

/*
 * Update context to reflect the concatenation of another buffer full
 * of bytes.
 */
void MD5Update(struct MD5Context *context, unsigned char const *buf, unsigned len);

/*
 * Final wrapup - pad to 64-byte boundary with the bit pattern
 * 1 0* (64-bit count of bits processed, MSB-first)
 */
void MD5Final(unsigned char digest[16], struct MD5Context *context);

/*
 * The core of the MD5 algorithm, this alters an existing MD5 hash to
 * reflect the addition of 16 longwords of new data. MD5Update blocks
 * the data and converts bytes into longwords for this routine.
 */
void MD5Transform(uint32 buf[4], uint32 const in[16]);
I don't know this particular library, but I've used very similar calls. So this is my best guess:
unsigned char digest[16];
const char* string = "Hello World";
struct MD5Context context;
MD5Init(&context);
MD5Update(&context, string, strlen(string));
MD5Final(digest, &context);
This will give you back the raw 16-byte binary digest. You can then turn it into a hex representation if you want to pass it around as a string.
char md5string[33];
for(int i = 0; i < 16; ++i)
    sprintf(&md5string[i*2], "%02x", (unsigned int)digest[i]);
Here's a complete example:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#if defined(__APPLE__)
#  define COMMON_DIGEST_FOR_OPENSSL
#  include <CommonCrypto/CommonDigest.h>
#  define SHA1 CC_SHA1
#else
#  include <openssl/md5.h>
#endif

char *str2md5(const char *str, int length) {
    int n;
    MD5_CTX c;
    unsigned char digest[16];
    char *out = (char*)malloc(33);

    MD5_Init(&c);

    while (length > 0) {
        if (length > 512) {
            MD5_Update(&c, str, 512);
        } else {
            MD5_Update(&c, str, length);
        }
        length -= 512;
        str += 512;
    }

    MD5_Final(digest, &c);

    for (n = 0; n < 16; ++n) {
        snprintf(&(out[n*2]), 16*2, "%02x", (unsigned int)digest[n]);
    }

    return out;
}

int main(int argc, char **argv) {
    char *output = str2md5("hello", strlen("hello"));
    printf("%s\n", output);
    free(output);
    return 0;
}
As other answers have mentioned, the following calls will compute the hash:
struct MD5Context md5;
MD5Init(&md5);
MD5Update(&md5, data, datalen);
MD5Final(digest, &md5);
The purpose of splitting it up into that many functions is to let you stream large datasets.
For example, if you're hashing a 10GB file and it doesn't fit into ram, here's how you would go about doing it. You would read the file in smaller chunks and call MD5Update on them.
struct MD5Context md5;
MD5Init(&md5);

fread(/* Read a block into data. */)
MD5Update(&md5, data, datalen);

fread(/* Read the next block into data. */)
MD5Update(&md5, data, datalen);

fread(/* Read the next block into data. */)
MD5Update(&md5, data, datalen);

...

// Now finish to get the final hash value.
MD5Final(digest, &md5);
To be honest, the comments accompanying the prototypes seem clear enough. Something like this should do the trick:
void compute_md5(char *str, unsigned char digest[16]) {
    struct MD5Context ctx;
    MD5Init(&ctx);
    MD5Update(&ctx, (unsigned char *)str, strlen(str));
    MD5Final(digest, &ctx);
}
where str is a C string you want the hash of, and digest is the resulting MD5 digest.
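A possible way to call it and print the digest as hex (purely illustrative):

unsigned char digest[16];
char hex[33];
int i;

compute_md5("hello world", digest);
for (i = 0; i < 16; i++)
    sprintf(&hex[i * 2], "%02x", (unsigned int)digest[i]);   /* last sprintf also terminates the string */
printf("%s\n", hex);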
It would appear that you should
Create a struct MD5Context and pass it to MD5Init to get it into a proper starting condition
Call MD5Update with the context and your data
Call MD5Final to get the resulting hash
These three functions and the structure definition make a nice abstract interface to the hash algorithm. I'm not sure why you were shown the core transform function in that header as you probably shouldn't interact with it directly.
The author could have done a little more implementation hiding by making the structure an abstract type, but then you would have been forced to allocate the structure on the heap every time (as opposed to now where you can put it on the stack if you so desire).
All of the existing answers use the deprecated MD5Init(), MD5Update(), and MD5Final().
Instead, use EVP_DigestInit_ex(), EVP_DigestUpdate(), and EVP_DigestFinal_ex(), e.g.
// example.c
//
// gcc example.c -lssl -lcrypto -o example
#include <openssl/evp.h>
#include <stdio.h>
#include <string.h>
void bytes2md5(const char *data, int len, char *md5buf) {
    // Based on https://www.openssl.org/docs/manmaster/man3/EVP_DigestUpdate.html
    EVP_MD_CTX *mdctx = EVP_MD_CTX_new();
    const EVP_MD *md = EVP_md5();
    unsigned char md_value[EVP_MAX_MD_SIZE];
    unsigned int md_len, i;

    EVP_DigestInit_ex(mdctx, md, NULL);
    EVP_DigestUpdate(mdctx, data, len);
    EVP_DigestFinal_ex(mdctx, md_value, &md_len);
    EVP_MD_CTX_free(mdctx);

    for (i = 0; i < md_len; i++) {
        snprintf(&(md5buf[i * 2]), 16 * 2, "%02x", md_value[i]);
    }
}

int main(void) {
    const char *hello = "hello";
    char md5[33]; // 32 characters + null terminator

    bytes2md5(hello, strlen(hello), md5);
    printf("%s\n", md5);
}