
Re: Re-keying Issues Document



Dan,

Thank you; I very much appreciate your mention of keying on
the number of bytes transmitted.  The important point, of
course, is that you found a human input that provides some
true randomness.  But this won't hold in the general case:
a transaction system, for example, might always transmit
at a constant or quasi-constant rate.

The other question that comes to mind is that the variation
in the number of bytes transmitted could often carry quite
little entropy compared to what we usually demand of our
random numbers.
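To put a rough number on that concern, here is a back-of-the-envelope sketch (the 1 MB uncertainty window is a hypothetical figure, not a measurement from any real system):

```python
import math

def counter_entropy_bits(distinct_values):
    """Upper bound on the entropy (in bits) of a byte counter that
    an attacker has narrowed down to `distinct_values` equally
    likely values -- the best case for the defender."""
    return math.log2(distinct_values)

# Hypothetical: an attacker can bound the counter to within ~1 MB.
bits = counter_entropy_bits(2**20)
print(bits)  # 20.0 -- far short of even a 128-bit key
```

Even under the generous assumption that every value in the window is equally likely, the counter contributes only a handful of bits.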

In any case, I think the point is that bigger keys are better
than re-keying, in the absence of HW RNGs.


On 01-Oct-98 Dan McDonald wrote:
>Mitch,
>
>These are good points, and provide strong arguments for real HW random number
>generators.  One small nit, though...
>
>> For any software key generator, any new key will be predictable
>> given a knowledge of the algorithm and its inputs.  An example
>> would be,
>> 
>>     K-new  =  func( seed, T ),
>> 
>> where T is any information that varies from one invocation to the
>> next.  T could be the previous result or a clock, or what-have-you.
>> In any case T is also predictable (else we're talking about random
>> number hardware).
>
>Is T really predictable?  I ask this because if I frequently rekey based on
>the number of bytes I transmit, T will vary based on a human's input.  Human
>input is not very predictable.  If T is something along the lines of a
>nanosecond timer, the human input differences amplify.
>
>I'm not saying we don't need better randomness.  Maybe I'm arguing that
>byte-based lifetimes provide better security, because of the unpredictability
>of humans using those bytes.
>
>Dan
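To make the predictability argument above concrete, here is a minimal sketch of K-new = func( seed, T ).  The thread leaves func unspecified; HMAC-SHA256 is an assumption chosen purely for illustration, as are the seed and T values.  The point is that whoever knows the seed and can enumerate plausible T values recovers the key:

```python
import hmac
import hashlib

def rekey(seed: bytes, t: bytes) -> bytes:
    # func(seed, T): any deterministic mixing function will do;
    # HMAC-SHA256 is an illustrative stand-in, not the thread's choice.
    return hmac.new(seed, t, hashlib.sha256).digest()

seed = b"shared-or-leaked-seed"
k_new = rekey(seed, b"T=bytes_sent:1048576")

# An attacker who knows the seed need only guess T, e.g. by
# enumerating nearby byte counts:
for guess in (b"T=bytes_sent:1048575", b"T=bytes_sent:1048576"):
    if rekey(seed, guess) == k_new:
        print("key recovered with T =", guess.decode())
```

Since every input to func is deterministic or guessable, the derived key inherits no more unpredictability than T itself carries -- which is the crux of the argument for hardware RNGs.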

