Encode UTF8String to ASN.1 with CryptoAPI in C

I use this code to encode a UTF8String in ASN.1:
const char *charExtensionValue = "test value тест тест with some cyrillic symbols";
CERT_NAME_VALUE myNameValue;
myNameValue.dwValueType = CERT_RDN_UTF8_STRING;
myNameValue.Value.cbData = (DWORD)(strlen(charExtensionValue)+1)*2;
myNameValue.Value.pbData = (LPBYTE)charExtensionValue;
CERT_BLOB encodedBlob;
bool checkASN1Encoding = CryptEncodeObjectEx(X509_ASN_ENCODING | PKCS_7_ASN_ENCODING, X509_ANY_STRING, &myNameValue, CRYPT_ENCODE_ALLOC_FLAG, NULL, &encodedBlob.pbData, &encodedBlob.cbData);
CryptEncodeObjectEx succeeds without any errors, but the result is not what I expected:
OCTET STRING, encapsulates {
UTF8String "ø§³û¦© Ґѐô´
What am I doing wrong?

The docs say CERT_RDN_UTF8_STRING means the Value member must be "An array of 16 bit Unicode characters UTF8 encoded on the wire as a sequence of one, two, or three, eight-bit characters", but charExtensionValue points to an array of 8-bit characters. Also, you are calculating the length as if the value were a UTF-16 string, which it is not. – Stuart
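A minimal sketch of the fix that comment points at (not tested here): keep the same CryptEncodeObjectEx call, but hand it UTF-16 data and a byte count computed from wide characters, and let the encoder produce the UTF-8 wire form itself. Names mirror the question; error handling and the eventual LocalFree of the allocated blob are omitted.

#include <windows.h>
#include <wincrypt.h>
#include <wchar.h>
// link with crypt32.lib

BOOL EncodeUtf8StringValue(CERT_BLOB *encodedBlob)
{
    // Per the quoted docs, CERT_RDN_UTF8_STRING expects an array of 16-bit
    // Unicode characters, so use a wide string and size it in bytes.
    const wchar_t *wideExtensionValue = L"test value тест тест with some cyrillic symbols";

    CERT_NAME_VALUE myNameValue;
    myNameValue.dwValueType = CERT_RDN_UTF8_STRING;
    myNameValue.Value.pbData = (BYTE *)wideExtensionValue;
    myNameValue.Value.cbData = (DWORD)(wcslen(wideExtensionValue) * sizeof(wchar_t));

    return CryptEncodeObjectEx(X509_ASN_ENCODING | PKCS_7_ASN_ENCODING,
                               X509_ANY_STRING, &myNameValue,
                               CRYPT_ENCODE_ALLOC_FLAG, NULL,
                               &encodedBlob->pbData, &encodedBlob->cbData);
}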

Related

Which type of Charset is suitable for image encoding? [Kotlin]

I tried to convert a byte array to a string with the UTF-8 charset, but it's not working. Can someone guide me, please?
Here is how I convert a Bitmap to a ByteArray:
private fun BitmapToByteArray(): ByteArray
{
    val stream = ByteArrayOutputStream()
    btm1!!.compress(Bitmap.CompressFormat.PNG, 100, stream)
    val bitmapdata: ByteArray = stream.toByteArray()
    return bitmapdata
}
Here is how I encrypt the data
private fun encrypting_data(bitmapdata: ByteArray): String {
    val key = secretkey!!.text.toString()
    val btm1 = bitmapdata.toString(Charsets.UTF_8)
    val s = btm1
    //generating key from given secret key
    val skey: Key = SecretKeySpec(key.toByteArray(), "AES")
    print(skey.toString())
    val c: Cipher = Cipher.getInstance("AES")
    c.init(Cipher.ENCRYPT_MODE, skey)
    //encrypting text string
    val re = c.doFinal(s.toByteArray())
    //converting encrypted string to base64
    val re_base64 = Base64.encodeToString(re, Base64.NO_WRAP or Base64.NO_PADDING)
    Log.e("aaAA", re_base64.toString())
    //converting each chr of base64 string to binary and combining it
    for (i in re_base64) {
        var single_b_string = Integer.toBinaryString((i.toInt()))
        //if binary str is less than 8 bit then making it 8 bit by adding 0's
        if (single_b_string.length < 8) {
            for (j in 1..(8 - single_b_string.length)) {
                single_b_string = "0" + single_b_string
            }
        }
        //final binary string to hide in image
        b_string = b_string + single_b_string
    }
    Log.e("barraylength", b_string.toString())
    Log.e("barray", b_string!!.length.toString())
    return b_string.toString()
}
please guide me, thank you
Short answer: none.
Charsets are used to map characters to bytes and vice versa. It doesn't make sense to decode the bytes of an image into a string using a character encoding. There is even a chance that some of the byte sequences are not valid in the character encoding you choose, so they will not be converted to characters correctly.
Sometimes it's necessary to use text to represent binary data (e.g. when using text-only transports/media to store it).
In these cases, you can use other kinds of encodings, for instance Base64, but I guess you already know that, since you're already sort of using Base64 here as well.
Note that, in your current code, you are converting a ByteArray (bitmapdata) into a String (btm1/s) only to convert it back into a ByteArray (s.toByteArray()). Why do you even need to do so?
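If the goal is just to get a Base64 string to hide in the image, a minimal sketch (hypothetical helper name, same AES and Base64 calls as in the question) that skips the charset round-trip entirely could look like this:

import android.util.Base64
import javax.crypto.Cipher
import javax.crypto.spec.SecretKeySpec

// Encrypt the PNG bytes directly and Base64-encode the ciphertext;
// no ByteArray -> String -> ByteArray round-trip.
// Note: SecretKeySpec needs an AES key of exactly 16, 24 or 32 bytes.
fun encryptBitmapBytes(bitmapData: ByteArray, secretKey: String): String {
    val keySpec = SecretKeySpec(secretKey.toByteArray(Charsets.UTF_8), "AES")
    val cipher = Cipher.getInstance("AES") // provider-default mode/padding, as in the question
    cipher.init(Cipher.ENCRYPT_MODE, keySpec)
    val cipherText = cipher.doFinal(bitmapData) // operate on the raw bytes
    return Base64.encodeToString(cipherText, Base64.NO_WRAP)
}

The bit-string loop from the question can then run over the returned Base64 string unchanged.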

Waves: Get 15 mnemonic words from a hex seed

Is there a way to get the 15 mnemonic words from a given hex seed?
Any method in the waves JS crypto library to do this?
If by hex seed you mean a byte-encoded or base58-encoded seed, then yes:
const {libs} = require('@waves/waves-transactions')
// convert base58 string to bytes
const encoded = 'xrv7ffrv2A9g5pKSxt7gHGrPYJgRnsEMDyc4G7srbia6PhXYLDKVsDxnqsEqhAVbbko7N1tDyaSrWCZBoMyvdwaFNjWNPjKdcoZTKbKr2Vw9vu53Uf4dYpyWCyvfPbRskHfgt9q'
const bytes = libs.crypto.base58decode(encoded)
// if string is hex encoded use this function instead
// const bytes = libs.crypto.hexStringToByteArray(encoded)
const decoded = libs.marshall.parsePrimitives.P_STRING_FIXED(bytes.length)(bytes)
console.log(decoded)
Otherwise it is not possible, since a hash function is irreversible.

Converting a C implementation of OpenSSL to Ruby code

In C I can do the following:
bignum = BN_new();
BN_bin2bn(my_message, 32, bignum);
group = EC_GROUP_new_by_curve_name(NID_X9_62_prime256v1);
ecp = EC_POINT_new(group);
check = EC_POINT_set_compressed_coordinates_GFp(group, ecp, bignum, 0, NULL);
key = EC_KEY_new_by_curve_name(NID_X9_62_prime256v1);
check = EC_KEY_set_public_key(key, ecp);
check = EVP_PKEY_set1_EC_KEY(public_key, key);
In Ruby, I thought this would do the same thing, but I get an error*
bignum = OpenSSL::BN.new(my_message, 2)
group = OpenSSL::PKey::EC::Group.new('prime256v1')
group.point_conversion_form = :compressed
public_key = OpenSSL::PKey::EC::Point.new(group, bignum)
In both instances I can log bignum and see that it is the same, and I'm pretty positive prime256v1 is the correct group.
In both cases C and Ruby are using the same version of OpenSSL (OpenSSL 1.0.2p 14 Aug 2018)
Any advice on what I'm doing wrong here would be massively appreciated.
*The error message I get is invalid encoding (OpenSSL::PKey::EC::Point::Error)
The EC_POINT_set_compressed_coordinates_GFp function in C expects you to pass in the x-coordinate of the point and separately a value to specify which of the two possible points it could be (you are passing in a literal 0, in reality you should determine the actual value).
In Ruby, the Point initializer is expecting the point encoded as a string that includes information about both coordinates (I don’t know if this format has a name, but it’s pretty common and is documented by the SECG). In the case of compressed coordinates this string is basically the same 32 bytes as in the C code, but with an extra byte at the start, either 0x02 or 0x03, which correspond to passing 0 or 1 as the y-bit to EC_POINT_set_compressed_coordinates_GFp.
If the string doesn’t start with 0x02 or 0x03 (or 0x04 for uncompressed points) or is the wrong length, then you will get the invalid encoding error.
It doesn’t look like the Ruby OpenSSL bindings provide a way to specify a point using separate x and y coordinates. The simplest way would be to add the 0x02 or 0x03 prefix to the string before passing it to Point.new.
If you already have this string you can use it in C to create a point using EC_POINT_oct2point. Ruby itself calls EC_POINT_oct2point if you pass a string to Point.new.
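A minimal sketch of that prefix approach, assuming my_message holds the 32-byte x-coordinate and y_bit is the same 0/1 value the C code passes to EC_POINT_set_compressed_coordinates_GFp:

require 'openssl'

# 0 -> "\x02" prefix, 1 -> "\x03" prefix (compressed point encoding)
prefix        = (y_bit.zero? ? "\x02" : "\x03").b
encoded_point = OpenSSL::BN.new(prefix + my_message, 2)

group = OpenSSL::PKey::EC::Group.new('prime256v1')
point = OpenSSL::PKey::EC::Point.new(group, encoded_point)

Wrapping the prefixed bytes in an OpenSSL::BN keeps the call shape identical to the question; on newer Ruby OpenSSL versions you can pass the string itself, since Point.new then calls EC_POINT_oct2point as described above.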

Inserting integer array with postgresql in C (libpq)

I'm trying to insert an integer array into my PostgreSQL database. I'm aware that I could format everything as a string and send that string as one SQL command, but I believe the PQexecParams function should be the better tool here. However, I'm kind of lost as to how to use it.
//we need to convert the number into network byte order
int val1 = 131;
int val2 = 2342;
int val3[5] = { 0, 7, 15, 31, 63 };
//set the values to use
const char *values[3] = { (char *) &val1, (char *) &val2, (char *) val3 };
//calculate the lengths of each of the values
int lengths[3] = { sizeof(val1), sizeof(val2), sizeof(val3) * 5 };
//state which parameters are binary
int binary[3] = { 1, 1, 1 };
PGresult *res = PQexecParams(conn,
    "INSERT INTO family VALUES($1::int4, $2::int4, $3::INTEGER[])",
    3,       //number of parameters
    NULL,    //ignore the Oid field
    values,  //values to substitute $1, $2 and $3
    lengths, //the lengths, in bytes, of each of the parameter values
    binary,  //whether the values are binary or not
    0);      //we want the result in text format
Yes this is copied from some tutorial.
However, this returns:
ERROR: invalid array flags
Using a conventional method does work:
PQexec(conn, "INSERT INTO family VALUES (2432, 31, '{0,1,2,3,4,5}')");
Inserts data just fine, and I can read it out fine as well.
Any help would be greatly appreciated! :)
libpq's PQexecParams can accept values in text or binary form.
For text values, you must sprintf the integer into a buffer that you put in your char** values array. This is usually how it's done. You can use text format with query parameters, there is no particular reason to fall back to interpolating the parameters into the SQL string yourself.
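For example, here is a hedged sketch of that text-format route for the INSERT from the question (assuming the same family table): the two integers are printed into buffers and the array is passed as PostgreSQL's text array literal.

#include <stdio.h>
#include <libpq-fe.h>

static int insert_family_text(PGconn *conn)
{
    char p1[12], p2[12];
    snprintf(p1, sizeof p1, "%d", 131);
    snprintf(p2, sizeof p2, "%d", 2342);
    const char *p3 = "{0,7,15,31,63}";      /* text array literal */

    const char *values[3] = { p1, p2, p3 };

    PGresult *res = PQexecParams(conn,
        "INSERT INTO family VALUES($1::int4, $2::int4, $3::integer[])",
        3,      /* number of parameters */
        NULL,   /* let the server infer the parameter types */
        values,
        NULL,   /* lengths are ignored for text-format parameters */
        NULL,   /* NULL here means: every parameter is in text format */
        0);     /* result in text format */

    int ok = (PQresultStatus(res) == PGRES_COMMAND_OK);
    if (!ok)
        fprintf(stderr, "INSERT failed: %s", PQerrorMessage(conn));
    PQclear(res);
    return ok;
}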
If you want to use binary mode transfers, you must instead ensure the integer is the correct size for the target field, is in network byte order, and that you have specified the type OID. Use htonl (for uint32_t) or htons (for uint16_t) for that. It's fine to cast away signedness since you're just re-ordering the bytes.
So:
You cannot ignore the OID field if you're planning to use binary transfer
Use htonl, don't brew your own byte-order conversion
Your values array construction is wrong. You're putting char**s into an array of char* and casting away the wrong type. You want &val1[0] or (equivalent in most/all real-world C implementations, but not technically the same per the spec) just val1, instead of (char*)&val1
You cannot assume that the on-wire format of integer[] is the same as C's int32_t[]. You must pass the type OID INT4ARRAYOID (see include/catalog/pg_type.h, or select oid from pg_type where typname = '_int4' - the internal type name of an array is its base type prefixed with _) and must construct a PostgreSQL array value compatible with that type's typreceive function in pg_type (which is array_recv) if you intend to send in binary mode. In particular, binary-format arrays have a header; you cannot just leave it out (see the sketch after this list).
In other words, the code is broken in multiple exciting ways and cannot possibly work as written.
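To make the binary route concrete, here is a hedged sketch of what the same INSERT would have to look like in fully binary form: explicit OIDs, htonl for every integer, and an array parameter built in the wire format array_recv expects (ndim, flags, element OID, then per-dimension and per-element fields). INT4OID = 23 and INT4ARRAYOID = 1007 are the values from pg_type, and "invalid array flags" is what array_recv reports when the flags word of this header is not 0 or 1, which is what happens when raw int data is sent without a header.

#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>      /* htonl */
#include <libpq-fe.h>

#define INT4OID      23
#define INT4ARRAYOID 1007

static void put_int32(char **p, uint32_t v)
{
    uint32_t be = htonl(v);
    memcpy(*p, &be, 4);
    *p += 4;
}

static int insert_family_binary(PGconn *conn)
{
    /* scalar parameters: 4 bytes each, network byte order */
    uint32_t val1 = htonl(131);
    uint32_t val2 = htonl(2342);

    /* array parameter: header followed by 5 (length, value) pairs */
    int32_t elems[5] = { 0, 7, 15, 31, 63 };
    char arraybuf[20 + 5 * 8];
    char *p = arraybuf;
    put_int32(&p, 1);                       /* number of dimensions */
    put_int32(&p, 0);                       /* flags: 0 = no NULLs */
    put_int32(&p, INT4OID);                 /* element type OID */
    put_int32(&p, 5);                       /* dimension 1: size */
    put_int32(&p, 1);                       /* dimension 1: lower bound */
    for (int i = 0; i < 5; i++) {
        put_int32(&p, 4);                   /* element length */
        put_int32(&p, (uint32_t)elems[i]);  /* element value */
    }

    Oid         types[3]   = { INT4OID, INT4OID, INT4ARRAYOID };
    const char *values[3]  = { (char *)&val1, (char *)&val2, arraybuf };
    int         lengths[3] = { (int)sizeof val1, (int)sizeof val2, (int)(p - arraybuf) };
    int         formats[3] = { 1, 1, 1 };   /* every parameter is binary */

    PGresult *res = PQexecParams(conn,
        "INSERT INTO family VALUES($1, $2, $3)",
        3, types, values, lengths, formats, 0);
    int ok = (PQresultStatus(res) == PGRES_COMMAND_OK);
    PQclear(res);
    return ok;
}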
Really, there is rarely any benefit in sending integers in binary mode. Sending in text-mode is often actually faster because it's often more compact on the wire (small values). If you're going to use binary mode, you will need to understand how C represents integers, how network vs host byte order works, etc.
Especially when working with arrays, text format is easier.
libpq could make this a lot easier than it presently does by offering good array construct / deconstruct functions for both text and binary arrays. Patches are, as always, welcome. Right now, 3rd party libraries like libpqtypes largely fill this role.

How to Sort UTF-8 Strings Which Contain Digits & Characters?

I am working on a program (in C) which requires sorting.
One of the requirements is digit sorting:
Digit sorting shall be done from the least significant digit (i.e. the rightmost digit) to the most significant digit (i.e. the leftmost digit), such that the numbers 21, 2, and 11 are sorted as follows: 2, 11, 21.
The given string is in UTF-8 and may contain special characters, digits, Latin letters, Cyrillic letters, Hiragana/Katakana, etc.
It should give the following sorting order:
1
1a
1b
2
11
110
110a
Henry7
Henry24
You might want to consider using the ICU library (International Components for Unicode), which includes a collation (sorting) API.
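As a hedged sketch of that route (ICU 50 or newer, default locale, minimal error handling): turning on the UCOL_NUMERIC_COLLATION attribute is what makes runs of digits compare by numeric value, and ucol_strcollUTF8 takes the UTF-8 strings directly.

/* build with something like: cc sort.c -licui18n -licuuc */
#include <stdio.h>
#include <stdlib.h>
#include <unicode/ucol.h>

static UCollator *coll;     /* opened once, used by the qsort comparator */

static int compare_utf8(const void *pa, const void *pb)
{
    UErrorCode status = U_ZERO_ERROR;
    const char *a = *(const char *const *)pa;
    const char *b = *(const char *const *)pb;
    UCollationResult r = ucol_strcollUTF8(coll, a, -1, b, -1, &status);
    return (r == UCOL_LESS) ? -1 : (r == UCOL_GREATER) ? 1 : 0;
}

int main(void)
{
    const char *items[] = { "Henry24", "110a", "2", "1b", "11", "1", "Henry7", "110", "1a" };
    size_t n = sizeof items / sizeof items[0];

    UErrorCode status = U_ZERO_ERROR;
    coll = ucol_open(NULL, &status);    /* NULL = default locale */
    ucol_setAttribute(coll, UCOL_NUMERIC_COLLATION, UCOL_ON, &status);
    if (U_FAILURE(status))
        return 1;

    qsort(items, n, sizeof items[0], compare_utf8);
    for (size_t i = 0; i < n; i++)
        puts(items[i]);

    ucol_close(coll);
    return 0;
}

With numeric collation enabled, this should print the order the question asks for: 1, 1a, 1b, 2, 11, 110, 110a, Henry7, Henry24.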
I think you mean "sort numerical characters in text strings as numbers." You may try using Qt's QString::localeAwareCompare(), which makes use of locale and platform settings to compare strings. At least on OS X, this should mean it respects the user-selected locale, which includes the behavior you want.
Or you can convert the strings to UTF-16 and sort by code point value if you don't care about locale.
Use std::sort with a custom comparator that compares strings via QString::localeAwareCompare().
Comparator function:
void sortLocaleAware(QStringList &sList)
{
    std::sort(sList.begin(), sList.end(), [](const QString &s1, const QString &s2) {
        return s1.localeAwareCompare(s2) < 0;
    });
}
Usage:
QStringList myList = { "4a", "3b", "52a", "13ş", "34İ" };
sortLocaleAware(myList);
