Geek moment of the day: Entropy Key documentation
I was reading the Entropy Key documentation and it contains a rather nice "so that's how it works" writeup.
The Entropy Key uses P-N semiconductor junctions reverse-biased to near, but not beyond, breakdown in order to generate noise. In other words, it has a pair of devices that are wired up in such a way that when a high potential is applied across them (a direction in which electrons do not normally flow and would be blocked), the high voltage compresses the semiconduction gap sufficiently that the occasional stray electron will quantum-tunnel through the P-N junction. (This is sometimes referred to as avalanche noise.) When this happens is unpredictable, and the occurrence of these events is what the Entropy Key measures.
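(If you want to play along at home, here's a tiny Python stand-in for one of those generators. A real junction's event rate depends on the device and bias voltage; a Bernoulli coin-flip per sample period is just a toy model to produce raw bit streams for the sketches further down.)

```python
import random

def sample_generator(n_bits, tunnel_probability=0.5):
    """Toy model of one noise generator: treat each sample period as an
    independent chance of observing a tunnelling event, recording a 1
    when one occurs and a 0 otherwise.  This is purely an illustrative
    stand-in for the real hardware."""
    return [1 if random.random() < tunnel_probability else 0
            for _ in range(n_bits)]
```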
These noise generators are then coupled to a 72MHz ARM Cortex-M3 CPU on the device. This processor samples the generators at a high frequency, forming a stream of random bytes. These streams of bytes are then analysed using Ueli Maurer's universal test for random bit generators, whereby the amount of entropy in the streams is estimated rather conservatively. The streams are also exclusive-ORed together and that third stream's entropy is estimated in the same manner. If either raw stream appears to have severely reduced entropy, it indicates a fault in that generator; if the third stream has low entropy, it indicates that the generators have become correlated and are not independently gathering entropy. Any of these three conditions is considered a failure mode and will result in the eKey locking itself out of the host, returning only an error code instead of generating entropy packets.
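For a rough feel of what that health-checking could look like in code, here's a Python sketch of Maurer's universal-test statistic plus a three-way check over the two raw streams and their XOR. The parameters (the block length L, the number of initialisation blocks Q, and the low-water threshold) are illustrative choices of mine, not values from the Entropy Key firmware.

```python
import math

def maurer_statistic(bits, L=6, Q=640):
    """Simplified Maurer universal-test statistic for a list of 0/1 ints.

    Split the input into non-overlapping L-bit blocks, use the first Q
    blocks to prime a "last seen at" table, then for each later block
    accumulate log2 of the distance back to its previous occurrence.
    For a good source the per-block statistic approaches a known
    constant (roughly 5.22 for L=6); values well below that suggest
    reduced entropy."""
    nblocks = len(bits) // L
    K = nblocks - Q
    if K <= 0:
        raise ValueError("not enough data for the chosen L and Q")

    def block(i):                        # value of the i-th L-bit block (1-based)
        v = 0
        for b in bits[(i - 1) * L : i * L]:
            v = (v << 1) | b
        return v

    last_seen = {}                       # block value -> index of last occurrence
    for i in range(1, Q + 1):            # initialisation segment
        last_seen[block(i)] = i

    total = 0.0
    for i in range(Q + 1, Q + K + 1):    # test segment
        v = block(i)
        total += math.log2(i - last_seen.get(v, 0))
        last_seen[v] = i
    return total / K

def health_check(stream_a, stream_b, low_water=5.0):
    """Check both raw streams and their XOR; if any of the three looks
    short of entropy, report a failure (the real device locks out)."""
    xored = [a ^ b for a, b in zip(stream_a, stream_b)]
    stats = [maurer_statistic(s) for s in (stream_a, stream_b, xored)]
    return all(s >= low_water for s in stats), stats
```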
The two raw streams are then processed further by a de-biasing process invented by John von Neumann, and their entropy is estimated again once the de-biasing has been performed. As before, if the estimated entropy in the streams varies too wildly at this stage, the Entropy Key will lock itself out. The processed streams are then mixed into a pool built with a secure hashing function. Once at least 50% more (estimated) entropy has been mixed into the pool than it could possibly hold, it is finalised and another pool is initialised. Once enough pools have been processed to fill 20000 bits, the totality is subjected to the tests stipulated in FIPS 140-2. These tests produce a PASS/FAIL indication for the block. On its own this is not very useful, since a perfectly random block could quite plausibly fail the tests, so the Entropy Key keeps running statistics on the FIPS 140-2 results and will lock itself out if the ratio of failed blocks to passed blocks rises above a conservative estimate of the statistical likelihood of failure.
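The de-biasing and pooling steps are easy to sketch too. The pairing rule below is the standard von Neumann construction; SHA-256 stands in for the unnamed "secure hashing function", and I simply count every de-biased bit as a full bit of entropy, whereas the real device mixes in its conservative estimates. Treat this as an illustration of the 50% over-fill rule rather than a description of the firmware.

```python
import hashlib

def von_neumann_debias(bits):
    """Classic von Neumann de-biasing: read bits in non-overlapping
    pairs, emit 1 for a (1, 0) pair, 0 for a (0, 1) pair, and discard
    (0, 0) and (1, 1) pairs.  For independent bits this removes bias,
    at the cost of throwing away at least three quarters of the input
    on average."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

def build_pool(debiased_bits, pool_bits=256, overfill=1.5):
    """Toy pool construction: mix de-biased bits into a hash context
    until at least `overfill` times the pool size has gone in, then
    finalise and return the 256-bit pool."""
    h = hashlib.sha256()
    target = int(pool_bits * overfill)
    byte, nbits, mixed = 0, 0, 0
    for bit in debiased_bits:
        byte = (byte << 1) | bit
        nbits += 1
        mixed += 1
        if nbits == 8:                   # mix one packed byte at a time
            h.update(bytes([byte]))
            byte, nbits = 0, 0
        if mixed >= target:
            return h.digest()            # a finalised 256-bit pool
    raise ValueError("not enough de-biased input to finalise the pool")
```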
Once the block has been analysed, regardless of its PASS/FAIL indication, it is chopped up into 32-byte packets and these are handed off to the protocol handler in the device. Through this process, therefore, each 256-bit block of data handed to the host was formed from somewhere in the region of 3000 to 5000 bits read from the generators.
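One way to see how a figure like that can arise is a quick back-of-envelope calculation. The loss factors below are assumptions of mine, picked only to show the shape of the accounting, not numbers taken from the documentation.

```python
def raw_bits_per_output_block(pool_bits=256, overfill=1.5,
                              entropy_per_debiased_bit=0.5,
                              raw_bits_per_debiased_bit=4.5):
    """Illustrative accounting for the raw-bits-per-block figure:

    * the pool is only finalised after 1.5x its size in estimated
      entropy has been mixed in (256 * 1.5 = 384 estimated bits);
    * the conservative estimator credits each de-biased bit with well
      under a full bit of entropy (0.5 assumed here);
    * von Neumann de-biasing keeps roughly one bit in four or fewer of
      what was read from the generators (4.5 raw bits per kept bit
      assumed here)."""
    estimated_entropy_needed = pool_bits * overfill
    debiased_bits_needed = estimated_entropy_needed / entropy_per_debiased_bit
    return debiased_bits_needed * raw_bits_per_debiased_bit

# With these made-up factors: 256 * 1.5 / 0.5 * 4.5 = 3456 raw bits,
# comfortably inside the quoted range.
```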
This has been your geeky moment of the day. (Or if you're lucky, one of many. ;-) Carry on.