
encodings: do we need binary at all?

There is an interesting argument being brought up here -- are binary
encodings of any sort strictly needed?

Recently, a colleague and I were playing with a strawman certificate
format, for which we developed a very trivial ad hoc binary encoding;
it was not too onerous to use.

On the other hand, it has occurred to me that given the general
overhead of public key systems, binary formats might not be needed at
all. They probably don't provide a performance advantage. The only
advantages they provide are some increase in compactness and some ease
of producing a canonical encoding for signature purposes. Of course,
one can think of ways of achieving the latter without binary
encodings, and the former isn't necessarily such an important feature.
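
To make that concrete, here is a rough sketch of getting a canonical
encoding by convention alone: fix the field order, the separator, and
the number base, and sign the resulting text. It's modern Python, and
the field names and values are entirely made up, so treat it as an
illustration rather than a proposal:

import hashlib

def canonical_text(fields):
    # One "name=value" pair per line, field names sorted, big
    # integers rendered in lowercase hex -- all fixed by convention,
    # so every implementation produces the identical byte string.
    lines = []
    for name in sorted(fields):
        value = fields[name]
        if isinstance(value, int):
            value = format(value, 'x')
        lines.append("%s=%s" % (name, value))
    return "\n".join(lines) + "\n"

cert = {"subject": "perry@jekyll.piermont.com",
        "modulus": 0xB41C35F,   # stand-in; a real modulus is ~1024 bits
        "exponent": 37}

digest = hashlib.sha256(canonical_text(cert).encode("ascii")).hexdigest()
print(digest)   # identical fields always give the identical digest

Sign the digest and you have a signature over an ASCII certificate,
with no binary encoding rules anywhere in sight.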

To give a really extreme example, this is an ASCII representation of a
public key from the SSH system. It was originally just a single line,
but I've broken it up with backslash/newlines to make it more readable:

1024 37 85799595668887144041234497858720965191494394274243010276995452\
7289397549856435807651462408205171548319480002414448720219295632456503\
7288575015538644624806921929027414093682073278113060408377555754422808\
5989367574906462469630862751348976715281503899082491996857682292734325\
009646270999199065958155229155773963 perry@jekyll.piermont.com

Now, this isn't a certificate -- it's just a key (the last field
containing the name@host is just a comment) -- but one wonders a bit
why representing these things in ASCII is such a bad thing.
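
For what it's worth, the ASCII form needs almost no machinery to
parse. A little Python sketch -- assuming the layout shown above
(size in bits, public exponent, and modulus, all in decimal, followed
by a free-form comment) and a hypothetical identity.pub file -- looks
like this:

def parse_ssh1_pubkey(line):
    # Split into at most four whitespace-separated fields; the
    # comment may itself contain spaces, so it stays as one piece.
    bits, exponent, modulus, comment = line.split(None, 3)
    return int(bits), int(exponent), int(modulus), comment

with open("identity.pub") as f:          # hypothetical file name
    bits, e, n, comment = parse_ssh1_pubkey(f.read().strip())

print(bits, e, comment)

No ASN.1 compiler, no tag-length-value walking, just ordinary text
tools and arbitrary-precision integers.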

The biggest problem in my mind is that ASCII makes it harder to fit
certificates into small spaces like UDP datagrams. But again, using
bases other than 10 for the numbers alleviates that somewhat, and it
is also not clear how important this compactness is in the first
place.
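
To put rough numbers on that, a 1024-bit modulus runs about 308
characters in decimal, 256 in hex, and about 172 in base64 of the raw
bytes -- smaller, but not dramatically so, and any of them fits
comfortably in a single UDP datagram. A quick sketch (the stand-in
modulus here is an arbitrary number, not a real key):

import base64

n = (1 << 1023) | 12345                  # stand-in 1024-bit modulus

decimal = str(n)                         # base 10
hexadecimal = format(n, 'x')             # base 16
raw = n.to_bytes((n.bit_length() + 7) // 8, 'big')
b64 = base64.b64encode(raw).decode('ascii')   # base 64 of the raw bytes

print(len(decimal), len(hexadecimal), len(b64))   # 308 256 172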

Comments?

Perry
