

I'd like to insert a few words of caution about the timing of crypto routines.
The speed of eating an apple, orange, or banana differs depending on
whether you start with fruit that is peeled, partially digested, or juiced.
Similarly, the following factors must be normalized when timing software:

	1. Block size and number of blocks.  Working with the same
	   short piece of data several hundred thousand times
	   can be misleading due to data cache effects.
	   Blocks that are too large can cause swapping and TLB
	   misses that make the timings overly pessimistic.
	2. Data dependencies.  Some algorithms have different data
	   usage patterns depending on the input.  Timing only a
	   block of all 0's, for example, can obscure this effect.
	3. Endianness.  For protocols, the time to rearrange the
	   data to/from network byte order should be counted.
	   This transformation is sometimes embedded in the
	   algorithm itself.
	4. The compiler and its switches.  Try every compiler
	   available for the machine, and try all the optimization
	   levels.  Make sure the routine still produces correct
	   results, and choose the fastest.

Code developers usually distribute a timing test with their routines.
This is often useful to the developer for determining whether a change
helps or hurts, but it can be useless for comparisons with other
implementations or other algorithms due to the factors listed above.
