File format for EC public/private keys?

If I wanted to store both a private and a public key in a single file, what would be the easiest format to use? Especially if I'm planning to use the BouncyCastle library for Java?

From a theoretical point of view, the public key can be recomputed from the private key (the computational cost is slightly lower than producing a single ECDSA signature, or doing half of an ECDH, so it is fast). Therefore, conceptually, you only have to store the private key, and the standard format for that is PKCS#8, which is supported by Java with java.security.spec.PKCS8EncodedKeySpec. Moreover, the PKCS#8 format includes provisions for optionally encoding the public key alongside the private key in the same blob, so this really looks like what you are looking for.
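To make that concrete, here is a minimal sketch (assuming the BouncyCastle provider; whether the optional public key actually ends up in the encoding depends on the provider, as discussed below) of generating an EC key pair and writing the private key's PKCS#8 encoding to a single file:
import java.io.FileOutputStream;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Security;
import org.bouncycastle.jce.provider.BouncyCastleProvider;
// Register the provider and generate an EC key pair.
Security.addProvider(new BouncyCastleProvider());
KeyPairGenerator kpg = KeyPairGenerator.getInstance("EC", "BC");
kpg.initialize(256); // e.g. a 256-bit curve such as P-256
KeyPair kp = kpg.generateKeyPair();
// getEncoded() on the private key yields its PKCS#8 encoding.
FileOutputStream out = new FileOutputStream("ec-key.p8");
out.write(kp.getPrivate().getEncoded());
out.close();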
The tricky thing, however, is to convince the cryptographic provider (e.g. BouncyCastle) to extract the public key as such and/or recompute it. Apparently, if you create a PKCS8EncodedKeySpec from a PKCS#8-encoded EC private key which also contains the public key, BouncyCastle will be kind enough to internally keep a copy of the encoded public key and write it back if you decide to reencode the private key in PKCS#8 format. However, it does nothing else with it; it handles it as an opaque blob.
Hence you must recompute the public key. Wading through the JCE and BouncyCastle API and unimplemented bits, I found the following, which appears to work (JDK 1.6.0_24, BouncyCastle 1.46):
import java.security.KeyFactory;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Provider;
import java.security.spec.PKCS8EncodedKeySpec;
import org.bouncycastle.jce.provider.BouncyCastleProvider;
import org.bouncycastle.jce.provider.JCEECPrivateKey;
import org.bouncycastle.jce.provider.JCEECPublicKey;
import org.bouncycastle.jce.spec.ECParameterSpec;
import org.bouncycastle.jce.spec.ECPublicKeySpec;
// Create the provider and an appropriate key factory.
Provider pp = new BouncyCastleProvider();
KeyFactory kf = KeyFactory.getInstance("EC", pp);
// Decode the private key (read as a byte[] called 'buf').
PKCS8EncodedKeySpec ks = new PKCS8EncodedKeySpec(buf);
PrivateKey sk = kf.generatePrivate(ks);
// Recompute public key.
JCEECPrivateKey priv = (JCEECPrivateKey)sk;
ECParameterSpec params = priv.getParameters();
ECPublicKeySpec pubKS = new ECPublicKeySpec(
params.getG().multiply(priv.getD()), params);
PublicKey pk = kf.generatePublic(pubKS);
// To reencode the private key.
buf = kf.getKeySpec(sk, PKCS8EncodedKeySpec.class).getEncoded();
Conceptually, I should use kf.getKeySpec() with org.bouncycastle.jce.spec.ECPrivateKeySpec instead of ruthlessly casting the private key to the JCEECPrivateKey class, but the clean method appears not to be implemented yet in BouncyCastle.

Try this (BouncyCastle v1.47, using JDK 1.7.* but I assume JDK 1.6.* will be fine too):
// Recreate the private key from its PKCS#8 encoding.
final KeyFactory kf = KeyFactory.getInstance("EC", "BC");
final PKCS8EncodedKeySpec encPrivKeySpec = new PKCS8EncodedKeySpec(rawPrivKey);
final PrivateKey privKey = kf.generatePrivate(encPrivKeySpec);
final byte[] reencodedPrivKey = privKey.getEncoded();
// Recreate the public key from its X.509 (SubjectPublicKeyInfo) encoding.
final X509EncodedKeySpec pubKeySpec = new X509EncodedKeySpec(rawPubKey);
final PublicKey pubKey = kf.generatePublic(pubKeySpec);
final byte[] reencodedPubKey = pubKey.getEncoded();
where rawPrivKey and rawPubKey are byte arrays holding the stored key encodings (getEncoded() gives them back to you when you want to write the keys out again).
I suggest you encrypt the encoded private key with a block cipher (e.g. AES); otherwise, anyone who gets hold of the file has your key, and you are exposed indefinitely.
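As a rough sketch of that suggestion (assuming a JRE that provides PBKDF2WithHmacSHA256 and AES/GCM; the passphrase handling and iteration count are illustrative, not a vetted design), you could derive an AES key from a passphrase and encrypt the PKCS#8 blob before writing it out:
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;
// Fresh random salt for the KDF and nonce for GCM.
byte[] salt = new byte[16];
byte[] nonce = new byte[12];
SecureRandom rng = new SecureRandom();
rng.nextBytes(salt);
rng.nextBytes(nonce);
// Derive a 256-bit AES key from a passphrase (a char[] obtained from the user).
PBEKeySpec pbeSpec = new PBEKeySpec(passphrase, salt, 100000, 256);
byte[] aesKey = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
    .generateSecret(pbeSpec).getEncoded();
// Encrypt the PKCS#8 blob ('buf' from the snippet above).
Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(aesKey, "AES"),
    new GCMParameterSpec(128, nonce));
byte[] encryptedKey = cipher.doFinal(buf);
// Store salt, nonce and encryptedKey together; all three are needed to decrypt.
The standard alternative is an encrypted PKCS#8 container (EncryptedPrivateKeyInfo, as produced by openssl pkcs8), which achieves the same protection in an interoperable format.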


How to do OpenSSL 1.1.1 ECDH with 25519

I need to implement ECDH with 25519 using OpenSSL.
Using:
key = EC_KEY_new_by_curve_name(NID_X25519)
fails.
Using this:
EVP_PKEY *pkey = NULL;
EVP_PKEY_CTX *pctx = EVP_PKEY_CTX_new_id(NID_X25519, NULL);
EVP_PKEY_keygen_init(pctx);
EVP_PKEY_keygen(pctx, &pkey);
seems to work, but I have no idea how to export the public key in uncompressed binary format, or how to import the other side's public key.
Any help?
Importing the other side's public key from raw binary format can be done with the EVP_PKEY_new_raw_public_key() function. Man page here:
https://www.openssl.org/docs/man1.1.1/man3/EVP_PKEY_new_raw_public_key.html
Exporting the public key in raw binary format is a little more tricky since there is no function to do it. You can do it in SubjectPublicKeyInfo format using i2d_PUBKEY() described here:
https://www.openssl.org/docs/man1.1.1/man3/i2d_PUBKEY.html
Fortunately, the SubjectPublicKeyInfo format has the raw public key as the last 32 bytes of its output, so you can use i2d_PUBKEY() and just take the last 32 bytes.

Is there an equivalent to Kafka's KTable in Apache Flink?

Apache Kafka has a concept of a KTable, where
each data record represents an update
Essentially, I can consume a Kafka topic and keep only the latest message per key.
Is there a similar concept available in Apache Flink? I have read about Flink's Table API, but it does not seem to solve the same problem.
Some help comparing and contrasting the two frameworks would be helpful. I am not looking for which is better or worse, but rather just how they differ. Which one is right would then depend on my requirements.
You are right. Flink's Table API and its Table class do not correspond to Kafka's KTable. The Table API is a relational language-embedded API (think of SQL integrated in Java and Scala).
Flink's DataStream API does not have a built-in concept that corresponds to a KTable. Instead, Flink offers sophisticated state management and a KTable would be a regular operator with keyed state.
For example, a stateful operator with two inputs that stores the latest value observed from the first input and joins it with values from the second input can be implemented with a CoFlatMapFunction as follows:
DataStream<Tuple2<Long, String>> first = ...
DataStream<Tuple2<Long, String>> second = ...

DataStream<Tuple2<String, String>> result = first
  // connect first and second stream
  .connect(second)
  // key both streams on the first (Long) attribute
  .keyBy(0, 0)
  // join them
  .flatMap(new TableLookup());

// ------

public static class TableLookup
    extends RichCoFlatMapFunction<Tuple2<Long, String>, Tuple2<Long, String>, Tuple2<String, String>> {

  // keyed state
  private ValueState<String> lastVal;

  @Override
  public void open(Configuration conf) {
    ValueStateDescriptor<String> valueDesc =
      new ValueStateDescriptor<String>("table", Types.STRING);
    lastVal = getRuntimeContext().getState(valueDesc);
  }

  @Override
  public void flatMap1(Tuple2<Long, String> value, Collector<Tuple2<String, String>> out) throws Exception {
    // update the value for the current Long key with the String value
    lastVal.update(value.f1);
  }

  @Override
  public void flatMap2(Tuple2<Long, String> value, Collector<Tuple2<String, String>> out) throws Exception {
    // look up the latest String for the current Long key
    String lookup = lastVal.value();
    // emit the current String and the looked-up String
    out.collect(Tuple2.of(value.f1, lookup));
  }
}
In general, state can be used very flexibly in Flink and lets you implement a wide range of use cases. There are also more state types, such as ListState and MapState, and with a ProcessFunction you have fine-grained control over time, for example to remove the state of a key if it has not been updated for a certain amount of time (KTables have a configuration for that, as far as I know).
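As a hedged sketch of that last point (the class and state names here are illustrative; it assumes Flink's KeyedProcessFunction and processing-time timers), keyed state can be dropped when a key has received no update for a given period:
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

public class ExpiringTable extends KeyedProcessFunction<Long, Tuple2<Long, String>, Tuple2<Long, String>> {

  private static final long TTL_MS = 60 * 60 * 1000; // expire keys after one hour

  private ValueState<String> lastVal;
  private ValueState<Long> lastUpdate;

  @Override
  public void open(Configuration conf) {
    lastVal = getRuntimeContext().getState(new ValueStateDescriptor<>("table", Types.STRING));
    lastUpdate = getRuntimeContext().getState(new ValueStateDescriptor<>("lastUpdate", Types.LONG));
  }

  @Override
  public void processElement(Tuple2<Long, String> value, Context ctx, Collector<Tuple2<Long, String>> out) throws Exception {
    long now = ctx.timerService().currentProcessingTime();
    // remember the latest value and when it arrived
    lastVal.update(value.f1);
    lastUpdate.update(now);
    // schedule an expiration check for this key
    ctx.timerService().registerProcessingTimeTimer(now + TTL_MS);
    out.collect(value);
  }

  @Override
  public void onTimer(long timestamp, OnTimerContext ctx, Collector<Tuple2<Long, String>> out) throws Exception {
    Long last = lastUpdate.value();
    // only clear if no newer update arrived after this timer was registered
    if (last != null && timestamp >= last + TTL_MS) {
      lastVal.clear();
      lastUpdate.clear();
    }
  }
}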

How to receive Map parameters in Resteasy?

I would like to receive these HTTP parameters (POST) in my Resteasy service:
customFields[my_key]=some_value
customFields[my_key2]=some_value2
Something like this doesn't work:
@Form(prefix="customFields")
Map<String, String> customFields;
... what happens here is that on the server the Map is initialized and the key for the Map entry is set (i.e. "my_key"), but the value is not.
Does anyone know how to handle a case like mine, where I need to receive an unknown number of fields (within a Map), each of them properly structured (HTTP map/dictionary notation)?
This is a known bug. The workaround is to use your own string wrapper as the map value type. For example:
public class StringWrapper implements Serializable {

  private static final long serialVersionUID = 1L;

  @FormParam("value")
  public String value;
}
Redefine your map as:
@Form(prefix="customFields")
Map<String, StringWrapper> customFields;
And then pass the values to it as customFields[my_key].value=some_value
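For context, a minimal sketch of how the pieces could fit together (the resource path and class names here are made up for illustration; it assumes RESTEasy's org.jboss.resteasy.annotations.Form):
import java.util.Map;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import org.jboss.resteasy.annotations.Form;

public class CustomFieldsForm {
  // RESTEasy populates one map entry per customFields[...] parameter.
  @Form(prefix = "customFields")
  public Map<String, StringWrapper> customFields;
}

@Path("/records")
public class RecordsResource {
  @POST
  @Consumes("application/x-www-form-urlencoded")
  public void create(@Form CustomFieldsForm form) {
    // With a body of customFields[my_key].value=some_value,
    // form.customFields.get("my_key").value is "some_value".
  }
}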

GAE Datastore with GWT, making more friendly/smaller keys

I am currently working with GWT and GAE, using JPA as my ORM. I have an issue where the keys GAE generates are too large to reasonably be used on a mobile device with RequestFactory. The amount of data in a small list is overwhelming due to the size of the ID/key when converted to a String.
I am using String for my keys so that I can handle inheritance.
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Extension(vendorName = "datanucleus", key = "gae.encoded-pk", value = "true")
protected String key;
This creates a key that is very long, for example "agxzbWFydGJhcnNpdGVyFAsSDUVzdGFibGlzaG1lbnQYuAIM", and it gets even larger because the object type and parent are stored in the key.
I need a way to create a smaller unique id but still have the ability to handle inheritance in GAE. I tried Long as the @Id/key but was not able to use a @OneToMany relationship on my objects due to the relationship that is built into the String/Key key.
The other option is to create a sequence for each class and use a Long property for that id. There is an example below, but I am not sure how to handle a generated Long sequence in App Engine.
@GeneratedValue
private Long friendlyClassSpecificKey;
Any advice would be appreciated. If there is an option other than using a sequence for each class type, I am interested; if not, is there an example of creating a sequence (that is not the @Id) for a specific class?
I came up with a good solution for smaller keys. I think the best way to do this cleanly is to use JPA/JDO 2 for App Engine with an unowned relationship. This way you can fetch entities by their (Long) id using just their type, without having to go through the parent relationship.
This is the base datastore object; notice that I am using the App Engine Key.
@Entity
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public class DatastoreObject {

  @Id
  @GeneratedValue(strategy = GenerationType.IDENTITY)
  private Key key;

  public Long getId() {
    return key.getId();
  }
}
This class uses the @Unowned annotation supported in JPA 2, so that the inventory's key does not contain the parent establishment's key. Otherwise you would have to pass in the parent id as well and resolve it to a key based on type, because in an owned relationship the child key also contains the parent key.
@Entity
public class Establishment extends DatastoreObject {

  @Unowned
  @OneToOne(cascade = CascadeType.ALL)
  private Inventory inventory;
}
Then in my DAO base class I use the entity class to rebuild keys from the plain Long id:
public class DaoBase<T extends DatastoreObject> {

  protected Class<T> clazz;

  @SuppressWarnings("unchecked")
  public DaoBase() {
    clazz = (Class<T>) ((ParameterizedType) getClass()
        .getGenericSuperclass()).getActualTypeArguments()[0];
  }

  /**
   * Find an object by its shortened Long id.
   * @param id
   * @return
   * @throws EntityNotFoundException
   */
  public T find(Long id) {
    if (id == null) {
      return null;
    }
    EntityManager em = ThreadLocalPersistenceManager.getEntityManager();
    Key key = getKey(id);
    T obj = em.find(clazz, key);
    return obj;
  }

  protected Key getKey(Long id) {
    return KeyFactory.createKey(clazz.getSimpleName(), id);
  }
}
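As a usage sketch (the concrete DAO and the 42L id are just illustrations of the pattern above):
// A concrete DAO only needs to name the entity type; the generic parameter
// supplies the datastore kind used by getKey().
public class EstablishmentDao extends DaoBase<Establishment> {
}

// Somewhere with an active persistence context:
EstablishmentDao dao = new EstablishmentDao();
Establishment e = dao.find(42L); // resolves KeyFactory.createKey("Establishment", 42L)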

Displaying Mutable PostgreSQL Arrays in the NetBeans Master/Detail Sample Form using JPA 1.0

Some Background
I have a game database with a table called Games that has multiple attributes and one called Genres. The Genres attribute is defined as an integer[] in PostgreSQL. For the sake of simplicity, I'm not using any foreign key constraints, but essentially each integer in this array is a foreign key referencing the id attribute in the Genres table. This is my first time working with the NetBeans Master/Detail Sample Form and Java persistence, and it has been working great so far except for one thing: I get this error when the program tries to display a column that holds a one-dimensional integer array. In this example, the value is {1, 11}.
Exception Description: The object [{1,11}], of class [class org.postgresql.jdbc3.Jdbc3Array], from mapping [oracle.toplink.essentials.mappings.DirectToFieldMapping[genres-->final.public.games.genres]] with descriptor [RelationalDescriptor(finalproject.Games --> [DatabaseTable(final.public.games)])], could not be converted to [class [B].
Exception [TOPLINK-3002] (Oracle TopLink Essentials - 2.0.1 (Build b09d-fcs (12/06/2007))): oracle.toplink.essentials.exceptions.ConversionException
My Research
From what I've been able to read, it looks like PostgreSQL arrays need something special done to them before you can display and edit them in this template. By default, the sample form uses TopLink Essentials (JPA 1.0) as its persistence library, but I can also use Hibernate (JPA 1.0).
Here is the code that needs to be changed in some way. From the Games.java file:
@Entity
@Table(name = "games", catalog = "final", schema = "public")
@NamedQueries({
  // omitting named queries
  @NamedQuery(name = "Games.findByGenres", query = "SELECT g FROM Games g WHERE g.genres = :genres")
})
public class Games implements Serializable {

  @Transient
  private PropertyChangeSupport changeSupport = new PropertyChangeSupport(this);

  private static final long serialVersionUID = 1L;

  // omitting other attributes

  @Column(name = "genres")
  private Serializable genres;

  // omitting constructors and other getters/setters

  public Serializable getGenres() {
    return genres;
  }

  public void setGenres(Serializable genres) {
    Serializable oldGenres = this.genres;
    this.genres = genres;
    changeSupport.firePropertyChange("genres", oldGenres, genres);
  }
} // end class Games
Here are also some of the sites that might have the solution that I'm just not understanding:
https://forum.hibernate.org/viewtopic.php?t=946973
http://blog.xebia.com/2009/11/09/understanding-and-writing-hibernate-user-types/
// omitted hyperlink due to user restriction
Attempted Solutions
I'm able to get the data to display if I change the type of genres to String, but then it is immutable and I cannot edit it. Here is what I changed to do this:
@Column(name = "genres")
private String genres;

public String getGenres() {
  return genres;
}

public void setGenres(String genres) {
  String oldGenres = this.genres;
  this.genres = genres;
  changeSupport.firePropertyChange("genres", oldGenres, genres);
}
I also attempted to create a UserType file for use with Hibernate (JPA 1.0), but had no idea what was going wrong there.
I also attempted to use @OneToMany and other annotations, but these aren't working, probably because I'm not using them properly.
What I'm Looking For
There has to be a simple way to get this data to display and make it editable, but since I'm completely new to persistence, I have no idea what to do.
The effort put into your question shows. Unfortunately, JPA does not currently support PostgreSQL arrays. The fundamental problem is that arrays are not widely used in other databases, so heavy reliance on them is somewhat PostgreSQL-specific. You can therefore expect general cross-database persistence APIs not to support them well, if at all; JPA is no exception.
I have been looking at writing my own persistence API in Java that would support arrays, but it hasn't happened yet, would be PostgreSQL-only when written, and would be based on a very different principle than JPA and friends.
