void MD5Convert(unsigned char* BufferMD5, unsigned char* DstMD5, int size)
{
    unsigned char digest[16];
    char buf[32];
    int tmp_i;
    int counter_DstMD5 = 0;
    int i = 0;
    // 16 + 16 = 32
    char tmp_c[2];
    for (i; i < size; i++)
    {
        tmp_i = BufferMD5[i];
        //itoa(tmp_i, tmp_c, 16);
        sprintf_s(tmp_c, "%02X", BufferMD5[i]);
        DstMD5[counter_DstMD5] = tmp_c[0];
        counter_DstMD5++;
        DstMD5[counter_DstMD5] = tmp_c[1];
        counter_DstMD5++;
    }
}
Visual Studio gives me the following message:
"Run-Time Check Failure #2 - Stack around the variable 'tmp_c' was corrupted."
The code above doesn't work with Visual Studio + C; I've tried everything, but it ends up overflowing somewhere.
Sorry if the question is poorly worded, but I couldn't find a possible alternative.
See this line of your code:
sprintf_s(tmp_c, "%02X", BufferMD5[i]);
You need 3 characters to store a two-character string, because you must also accommodate the null terminating character. tmp_c is too short, so you write outside its bounds. Declare it as:
char tmp_c[3];
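With that change the write stays in bounds. For reference, here is a minimal sketch of the corrected function (I've swapped in the portable snprintf, which takes the buffer size explicitly; it assumes DstMD5 has room for 2 * size bytes):

#include <stdio.h>

void MD5Convert(unsigned char* BufferMD5, unsigned char* DstMD5, int size)
{
    char tmp_c[3]; /* two hex digits plus the null terminator */
    int i;
    for (i = 0; i < size; i++)
    {
        snprintf(tmp_c, sizeof(tmp_c), "%02X", BufferMD5[i]);
        DstMD5[2 * i]     = (unsigned char)tmp_c[0];
        DstMD5[2 * i + 1] = (unsigned char)tmp_c[1];
    }
}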
I'm trying to learn the XOR algorithm using C, and I found a great example on Kyle Banks's GitHub:
#include <stdio.h>
#include <string.h>

void encryptDecrypt(char *input, char *output) {
    char key[] = {'K', 'C', 'Q'}; // can be any chars, and any size array
    int i;
    for (i = 0; i < strlen(input); i++) {
        output[i] = input[i] ^ key[i % (sizeof(key)/sizeof(char))];
    }
}

int main(int argc, char *argv[]) {
    char baseStr[] = "kylewbanks.com";

    char encrypted[strlen(baseStr)];
    encryptDecrypt(baseStr, encrypted);
    printf("Encrypted:%s\n", encrypted);

    char decrypted[strlen(baseStr)];
    encryptDecrypt(encrypted, decrypted);
    printf("Decrypted:%s\n", decrypted);
}
The above works well under Linux and gcc.
However, it does not compile in Visual Studio under Windows.
I am using build tools included in Visual Studio 2017.
What am I doing wrong?
Microsoft's compiler does not support C99 VLAs (see the note here); array sizes must be a constant expression. The code is also broken because it fails to accommodate and place a nul terminator in the output.
In this case, decrypted and encrypted might be declared thus:
char encrypted[sizeof(baseStr)];
...
char decrypted[sizeof(baseStr)];
And encryptDecrypt() modified thus:
void encryptDecrypt(char *input, char *output) {
    ...
    output[i] = 0;
}
Finally, the signed/unsigned mismatch warning can be cleaned up by declaring i with type size_t.
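Putting those pieces together, a complete version that should compile with Microsoft's compiler might look like this (a sketch along those lines, nothing beyond the fixes above):

#include <stdio.h>
#include <string.h>

void encryptDecrypt(const char *input, char *output) {
    char key[] = {'K', 'C', 'Q'};
    size_t len = strlen(input);
    size_t i;
    for (i = 0; i < len; i++) {
        output[i] = input[i] ^ key[i % sizeof(key)];
    }
    output[len] = 0; /* place the nul terminator */
}

int main(void) {
    char baseStr[] = "kylewbanks.com";
    char encrypted[sizeof(baseStr)]; /* constant expression, includes terminator space */
    char decrypted[sizeof(baseStr)];
    encryptDecrypt(baseStr, encrypted);
    printf("Encrypted:%s\n", encrypted);
    encryptDecrypt(encrypted, decrypted);
    printf("Decrypted:%s\n", decrypted);
    return 0;
}

Bear in mind that XOR can in general produce an embedded 0 byte, in which case treating the output as a C string would truncate it; that happens not to occur for this particular input and key.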
On Windows of course you could always use MinGW/GCC if you want more modern C support. Or you could use C++ and std::string or std::vector containers if you want to stick with Microsoft's compiler.
Use malloc for dynamic memory allocation. It requires #include <stdlib.h>.
char baseStr[] = "123";
char *encrypted = malloc(strlen(baseStr) + 1);
...
free(encrypted);
As mentioned before, you have to add 1 for the null terminator at the end.
A char* pointer is only one piece of information: it tells you where the string begins. But where does it end? strlen and the other C string functions have no idea, so they walk through the characters until a '\0' is encountered.
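A tiny illustration of that scanning behavior:

#include <stdio.h>
#include <string.h>

int main(void) {
    char buf[] = {'a', 'b', '\0', 'c', 'd', '\0'};
    /* strlen stops at the first '\0', so it reports 2, not 5 */
    printf("%zu\n", strlen(buf));
    return 0;
}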
For efficiency, take strlen(input) out of the loop and calculate it only once:
void encryptDecrypt(char *input, char *output)
{
    char key[] = { 'K', 'C', 'Q' };
    int keysize = sizeof(key);
    size_t i;
    size_t len = strlen(input);
    for (i = 0; i < len; i++)
        output[i] = input[i] ^ key[i % keysize];
    output[len] = 0; // same as output[i] = 0;
}
The function main should also return zero. Note that this method cannot be described as "encryption" by modern standards; "obfuscation" is more accurate.
I'm working on an assignment for class, so I would prefer that you assist me or point me in the right direction rather than just give me an answer. That said, my program encrypts a string with a Caesar cipher (this is not by any means secure; it is merely a simple program). Encoding and decoding work fine and never have any issues as long as the number of passes (the number of times I encrypt) is less than 100. When the encode and decode are over 100 passes, I get a different string than expected as the output of my encode, and my decode only outputs a single character (the first).
This is the code; note that it is intentional that the output is not within the normal ASCII range, as that is what the assignment asks for.
enum CODE_METHOD {ENCODE, DECODE};

int mystrlen(const unsigned char *string)
{
    int length = 0;
    int index;
    for (index = 0; *(string + index) != 0; index++)
        length++;
    return length;
}

void jumble(unsigned char *string,
            const unsigned char *password,
            enum CODE_METHOD method,
            int passes)
{
    int index, index_2;
    int len = mystrlen(password);
    for (index_2 = 0; index_2 < passes; index_2++)
        for (index = 0; *(string + index) != '\0'; ++index)
            *(string + index) += (method == ENCODE) ? *(password + (index % len))
                                                    : -*(password + (index % len));
}
Also, here is an example driver that calls it:
#include <stdio.h>  /* printf, putchar */
#include "jumble.h" /* mystrlen, jumble, ENCODE, DECODE */

void test_stress(int passes)
{
    unsigned char phrase[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    unsigned char *password = (unsigned char*) "rumpelstiltskin";
    size_t length = sizeof(phrase);
    size_t i;

    printf("\nStress ======================================\n");
    printf("Original phrase:\n");
    printf("%s\n", phrase);

    jumble(phrase, password, ENCODE, passes);
    printf("\nEncoded phrase:\n");
    for (i = 0; i < length; i++)
        putchar(phrase[i]);

    jumble(phrase, password, DECODE, passes);
    printf("\nDecoded back:\n");
    printf("%s\n", phrase);
}

int main(void)
{
    test_stress(100);
    return 0;
}
Here is the expected output of the above driver:
Stress ======================================
Original phrase:
ABCDEFGHIJKLMNOPQRSTUVWXYZ
Encoded phrase:
Éö×¹v3˜Mz›8RGØæÈ…B§\‰ª
Decoded back:
ABCDEFGHIJKLMNOPQRSTUVWXYZ
Received output:
Stress ======================================
Original phrase:
ABCDEFGHIJKLMNOPQRSTUVWXYZ
Encoded phrase:
É Ñä£Îù ¿ÒþÏÄãPD ³ ÇÈ
Decoded back:
A
Again, I would just like to be pointed in the right direction here.
Thanks!
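One observation that may point you in the right direction: the inner loop in jumble stops at the first 0 byte, and repeated additions can wrap an unsigned char around to exactly 0 during an intermediate pass. Here is a standalone experiment (my own illustration, separate from the assignment code) using phrase[1] ('B') and its matching password byte ('u'):

#include <stdio.h>

int main(void) {
    unsigned char c = 'B'; /* phrase[1] */
    int pass;
    for (pass = 1; pass <= 100; pass++) {
        c += 'u'; /* password[1 % 15] == 'u' */
        if (c == 0)
            printf("byte wrapped to 0 after pass %d\n", pass);
    }
    return 0;
}

This prints "byte wrapped to 0 after pass 6". Once that happens, think about which indices the later ENCODE passes, and then the DECODE passes, can still reach.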
I have written a C program that takes a key and a message as input, calls the OpenSSL HMAC functions, and prints the resulting MAC.
The input values are taken from the NIST test vectors.
#define KEY_SIZE 11  // in bytes
#define MSG_SIZE 129 // in bytes

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <openssl/hmac.h>

void str2hex(char *, char *, int);

int main() {
    char *key, *msg;
    unsigned char keyy[KEY_SIZE], msgt[MSG_SIZE], temp[4];
    unsigned char *result;
    unsigned int i, len = 20, Tlen = 10;

    key = ""; // values specified below
    msg = ""; // values specified below

    /* CONVERT STRING TO HEX DIGITS - KEY */
    str2hex(key, keyy, KEY_SIZE);
    /* CONVERT STRING TO HEX DIGITS - MSG */
    str2hex(msg, msgt, MSG_SIZE);

    result = (unsigned char*)malloc(sizeof(char) * len);

    HMAC_CTX ctx;
    HMAC_CTX_init(&ctx);
    HMAC_Init_ex(&ctx, keyy, strlen(keyy), EVP_sha1(), NULL);
    HMAC_Update(&ctx, (unsigned char*)&msgt, strlen(msgt));
    HMAC_Final(&ctx, result, &len);
    HMAC_CTX_cleanup(&ctx);

    printf("HMAC digest: ");
    for (i = 0; i < Tlen; i++)
        printf("%02x", result[i]);
    printf("\n");

    free(result);
    return 0;
}

//===================== string to hex conversion ================================//
void str2hex(char *str, char *hex, int len) {
    int tt, ss;
    unsigned char temp[4];
    for (tt = 0, ss = 0; tt < len, ss < 2 * len; tt++, ss += 2) {
        temp[0] = '0';
        temp[1] = 'x';
        temp[2] = str[ss];
        temp[3] = str[ss + 1];
        hex[tt] = (int) strtol(temp, NULL, 0);
    }
}
//---------------------------------------------------------------------------------//
The first input given:
Key = 82f3b69a1bff4de15c33
Msg = fcd6d98bef45ed6850806e96f255fa0c8114b72873abe8f43c10bea7c1df706f10458e6d4e1c9201f057b8492fa10fe4b541d0fc9d41ef839acff1bc76e3fdfebf2235b5bd0347a9a6303e83152f9f8db941b1b94a8a1ce5c273b55dc94d99a171377969234134e7dad1ab4c8e46d18df4dc016764cf95a11ac4b491a2646be1
Output generated:
HMAC digest: 1ba0e66cf72efc349207
Nist_Mac = 1ba0e66cf72efc349207
It matches, so that's a success.
But for the second input:
Key = 4766e6fe5dffc98a5c50
Msg = d68b828a153f5198c005ee36c0af2ff92e84907517f01d9b7c7993469df5c21078fa356a8c9715ece2414be94e10e547f32cbb8d0582523ed3bb0066046e51722094aa44533d2c876e82db402fbb00a6c2f2cc3487973dfc1674463e81e42a39d9402941f39b5e126bafe864ea1648c0a5be0a912697a87e4f8eabf79cbf130e
Output generated:
HMAC digest: ca96f112a79882074b63
Nist_Mac = 007e4504041a12f9e345
It's failing. If anyone could check my code and let me know what I am doing wrong, it would be really helpful.
You have two issues here.
The first is that you're using strlen on an array of characters that may contain a null byte. Since strlen counts the number of bytes until it finds a null byte, you won't get what you expect if your array contains one (as is the case in your second example).
Instead of using strlen on the byte array to determine the length, use the actual length of the data. Since you're converting a string containing hex digits to bytes, the length of the byte array is half the length of the input string.
HMAC_Init_ex(&ctx, keyy, strlen(key)/2, EVP_sha1(), NULL);
HMAC_Update(&ctx, msgt, strlen(msg)/2);
Note also that you should pass msgt to HMAC_Update, not &msgt, as the latter is a pointer to an array.
The second issue is in your str2hex function. When you construct temp, you don't have enough space for a terminating null byte. This causes strtol, which expects a null-terminated string, to read past the end of the array. This invokes undefined behavior.
In this particular case you're "lucky" that it works: the byte in memory that follows temp happens to contain either a null byte or a non-digit. You can't depend on this behavior, however. Fix it by making temp one byte longer and explicitly setting that byte to 0. And while you're at it, you should also fix the signed/unsigned mismatch in the function arguments by making the hex output parameter an unsigned char pointer.
void str2hex(char *, unsigned char*, int);
...
void str2hex(char *str, unsigned char *hex, int len) {
    int tt, ss;
    char temp[5];
    for (tt = 0, ss = 0; tt < len && ss < 2 * len; tt++, ss += 2) {
        temp[0] = '0';
        temp[1] = 'x';
        temp[2] = str[ss];
        temp[3] = str[ss + 1];
        temp[4] = 0;
        hex[tt] = strtol(temp, NULL, 0);
    }
}
At byte position 58 in the message, you have a 0x00 byte (null). Since you're doing an strlen(msgt), this results in 58 instead of 128. Excerpt from the documentation (emphasis mine):
The C library function size_t strlen(const char *str) computes the length of the string str up to, but not including the terminating null character.
Just use the proper length of the message and don't use string operations on char arrays that do not contain printable bytes.
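For completeness, OpenSSL's one-shot HMAC() helper makes the explicit-length approach hard to get wrong. Here is a sketch with the first vector's key decoded by hand; the message bytes are left as a zeroed placeholder to fill in from the vector, so the printed digest will differ until you do:

#include <stdio.h>
#include <openssl/hmac.h>

int main(void) {
    /* Key = 82f3b69a1bff4de15c33, decoded to 10 raw bytes */
    unsigned char key_bytes[10] = {
        0x82, 0xf3, 0xb6, 0x9a, 0x1b, 0xff, 0x4d, 0xe1, 0x5c, 0x33
    };
    unsigned char msg_bytes[128] = {0}; /* placeholder: decode Msg into this */
    unsigned char mac[EVP_MAX_MD_SIZE];
    unsigned int mac_len = 0, i, Tlen = 10;

    /* lengths are passed explicitly; no string functions on binary data */
    HMAC(EVP_sha1(), key_bytes, (int)sizeof(key_bytes),
         msg_bytes, sizeof(msg_bytes), mac, &mac_len);

    printf("HMAC digest: ");
    for (i = 0; i < Tlen; i++) /* the NIST vector truncates to 10 bytes */
        printf("%02x", mac[i]);
    printf("\n");
    return 0;
}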
This is probably a really stupid question, but
I have an array of structs outside of int main
typedef struct {
    char c;
    int k;
} factor_t;
and I declared
factor_t *factors = malloc(INIT*sizeof(*factors));
where INIT is 10
After running my function, I have an array of structs, each of which holds a char c and an integer k. For example, factors[5].c could hold 'b' or 'd' or 'e', and factors[5].k could hold 3 or 33 or 333.
I need to somehow insert these into a string, but I can't seem to do
strcat(destination, c or k);
since both give me pointer-to-integer errors (destination is a char*).
How would I go about putting these into a string? I'm aiming to get a string that looks like
ck
ck
ck
that is, a pattern of "ck\n" per struct, where c = char and k = integer
I use strcat(destination, "\n"); for the \n and it works, but I can't do the same with c and k
Calculate the current length of the string and print at that offset:
#include <stdio.h>
#include <string.h>

typedef struct {
    char c;
    int k;
} factor_t;

void struct_cat(char *str, factor_t f) {
    sprintf(str + strlen(str), "%c%d", f.c, f.k);
}

int main(void) {
    factor_t fac = {'b', 33};
    char buf[100] = "hogehoge";
    struct_cat(buf, fac);
    puts(buf);
    return 0;
}
strcat appends a copy of the source string to the destination string. It expects c to be a null-terminated string, not a single char.
If you want to add a single char to an array that is larger than n, where the null terminating char is at index n:
destination[n] = c;
destination[n+1] = '\0';
You have to be certain that destination is large enough.
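Wrapped up as a small helper (append_char is a name I made up for illustration, not a standard function):

#include <string.h>

/* Append one char to a null-terminated string; cap is the total size
   of dst. Returns 0 on success, -1 if there is no room left. */
int append_char(char *dst, size_t cap, char c) {
    size_t n = strlen(dst);
    if (n + 1 >= cap)
        return -1; /* no space for c plus the terminator */
    dst[n] = c;
    dst[n + 1] = '\0';
    return 0;
}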
If you want to format-print additional data to the destination string, again make sure destination is large enough, and do:
sprintf(destination + n, "%c%d\n", c, k);
Or, if you know that destination has m chars left:
snprintf(destination + n, m, "%c%d\n", c, k);
With this, if you attempt to print more than m chars, the extra ones will be discarded.
You can use sprintf to do so:
size_t len=strlen(destination); // calculate before concatenation
sprintf(&destination[len], "%c%d\n", factors[5].c,factors[5].k); // string with newline
destination should be of type char *.
If you need to separate this feature out, use a function (like @MikeCAT did). Using snprintf() and strncat() ensures you do not go beyond the array bounds:
void strncat_struct(char *buffer, size_t buffer_size, factor_t f)
{
    char tmp_buf[32];
    snprintf(tmp_buf, sizeof(tmp_buf), "%c, %d\n", f.c, f.k);
    /* the third argument is the space remaining, not the total buffer size */
    strncat(buffer, tmp_buf, buffer_size - strlen(buffer) - 1);
}

int main(int argc, char **argv)
{
    //...
    char buffer[256] = {0};
    for (i = 0; i < INIT; i++) {
        strncat_struct(buffer, sizeof(buffer), factors[i]);
    }
    //...
}
Without using an additional function. This is theoretically faster because there is no need to recalculate the string length each time:
int main(int argc, char **argv)
{
    //...
    char buffer[256];
    char *buf_ptr = buffer;
    size_t buf_size = sizeof(buffer);
    for (i = 0; i < INIT; i++) {
        int printed;
        printed = snprintf(buf_ptr, buf_size, "%c, %d\n", factors[i].c, factors[i].k);
        /* snprintf returns the length it wanted to write; stop if the
           buffer is full (or on error) so the pointer never runs past it */
        if (printed < 0 || (size_t)printed >= buf_size)
            break;
        buf_ptr += printed;
        buf_size -= printed;
    }
    //...
}
I am trying to print out a byte array one byte at a time in hexadecimal format, within a for loop like this:
int my_function(void *data)
{
    str *obuf = (str*)data;
    int i;
    for (i = 0; i < obuf->len; i++)
    {
        printf("%02X:", obuf->s[i]);
    }
    return 0;
}
str in this case is the structure from Kamailio - see http://www.asipto.com/pub/kamailio-devel-guide/#c05str
The expected output:
80:70:0F:80:00:00:96:00:1D:54:7D:7C:36:9D:1B:9A:20:BF:F9:68:E8:E8:E8:F8:68:98:E8:EE:E8:B4:7C:3C:34:74:74:64:74:69:2C:5A:3A:3A:3A:3A:3A:3A:32:24:43:AD:19:1D:1D:1D:1D:13:1D:1B:3B:60:AB:AB:AB:AB:AB:0A:BA:BA:BA:BA:B0:AB:AB:AB:AB:AB:0A:BA:BA:BA:BA:B9:3B:61:88:43:
What I am getting:
FFFFFF80:70:0F:FFFFFF80:00:00:FFFFFF96:00:1D:54:7D:7C:36:FFFFFF9D:1B:FFFFFF9A:20:FFFFFFBF:FFFFFFF9:68:FFFFFFE8:FFFFFFE8:FFFFFFE8:FFFFFFF8:68:FFFFFF98:FFFFFFE8:FFFFFFEE:FFFFFFE8:FFFFFFB4:7C:3C:34:74:74:64:74:69:2C:5A:3A:3A:3A:3A:3A:3A:32:24:43:FFFFFFAD:19:1D:1D:1D:1D:13:1D:1B:3B:60:FFFFFFAB:FFFFFFAB:FFFFFFAB:FFFFFFAB:FFFFFFAB:0A:FFFFFFBA:FFFFFFBA:FFFFFFBA:FFFFFFBA:FFFFFFB0:FFFFFFAB:FFFFFFAB:FFFFFFAB:FFFFFFAB:FFFFFFAB:0A:FFFFFFBA:FFFFFFBA:FFFFFFBA:FFFFFFBA:FFFFFFB9:3B:61:FFFFFF88:43:
Could someone please help me understand why some of the bytes are prefixed with FFFFFF and others aren't?
Thanks in advance.
It looks like obuf->s[i] is a signed value. In a variadic call like printf, the char is promoted to int, so values with the most significant bit set are sign-extended to a negative int, which %X then prints as FFFFFF80 and so on.
You need to cast it to an unsigned value to get rid of the FFF... at the start:
printf("%02X:", (unsigned char)(obuf->s[i]));
The problem appears with chars that have the most significant bit set (those outside the pure ASCII range 0-127). The key point is to treat the chars as unsigned:
printf("%02X:", (unsigned char)(obuf->s[i]));
See this simple compilable repro C code:
#include <stdio.h>
#include <string.h>

struct _str {
    char *s;  /* pointer to the beginning of the string (char array) */
    int len;  /* string length */
};
typedef struct _str str;

int my_function(void *data)
{
    str *obuf;
    int i;
    obuf = (str*)data;
    for (i = 0; i < obuf->len; i++) {
        printf("%02X:", (unsigned char)(obuf->s[i]));
    }
    return 0;
}

int main(void)
{
    char buf[2];
    str s;

    /* Test with an ordinary ASCII string */
    s.s = "Hello";
    s.len = strlen(s.s);
    my_function(&s);
    printf("\n");

    /* Test with a char value that has the most significant bit set */
    buf[0] = 0xF1;
    buf[1] = 0x00;
    s.s = buf;
    s.len = 1;
    my_function(&s);
    return 0;
}
With MSVC, I get this output:
48:65:6C:6C:6F:
F1: