
Re: Bignums




> When a byte string is treated as a bignum (for crypto use), should the
> first byte given be considered the most-significant byte of the number,
> or the least-significant byte?

Time for me to stop lurking on this list...

I'd go for least significant first, for efficiency reasons, despite it
being harder for humans to read the raw data.

It's *much* easier to get storage alignment correct if you know where
the word boundaries are.  On a byte-wide machine it doesn't matter
which way you read the number; on anything else it can make an enormous
difference.  With the least significant byte first, each byte's place
in word-aligned storage depends only on its own offset, so it can be
stored as it is read.  With big-endian numbers, a byte's significance
depends on the total length, so the string has to be read in and
parsed completely before it can be copied or moved to aligned storage.
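
A minimal sketch of the difference, assuming 32-bit limbs; the function
names and limb layout are made up for illustration, not taken from any
particular bignum library:

    #include <stddef.h>
    #include <stdint.h>

    /* Least significant byte first: byte i always goes in limb i/4,
     * shifted by (i%4)*8, so bytes can be stored as they are read.
     * Assumes nlimbs*4 >= len. */
    void load_le(uint32_t *limb, size_t nlimbs, const uint8_t *buf, size_t len)
    {
        for (size_t i = 0; i < nlimbs; i++)
            limb[i] = 0;
        for (size_t i = 0; i < len; i++)
            limb[i / 4] |= (uint32_t)buf[i] << ((i % 4) * 8);
    }

    /* Most significant byte first: a byte's position depends on len,
     * so the whole string must be read before any limb can be filled. */
    void load_be(uint32_t *limb, size_t nlimbs, const uint8_t *buf, size_t len)
    {
        for (size_t i = 0; i < nlimbs; i++)
            limb[i] = 0;
        for (size_t i = 0; i < len; i++) {
            size_t j = len - 1 - i;   /* significance known only via len */
            limb[j / 4] |= (uint32_t)buf[i] << ((j % 4) * 8);
        }
    }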

The assumption I'm making is that almost all data is never read by
people.  On the odd occasion that someone wants to read it, they should
either learn to read backwards, or use a tool to convert for them.  I
spend quite a bit of my time playing with bignums, but hardly ever do I
read them to understand what's happening.  When I do, it's rare that I
can comprehend much about a number more than a very few bytes long
without firing up bc(1) or the like to poke around in its structure and
its companion numbers.


Paul

