Given a hash function, hash the input strings "0", "1", "2", ... -- the natural numbers written out in some base -- producing a sequence of hashes, one per number. The concatenation of those hashes is a stream of random bits. How good is its randomness? It is probably not periodic -- which is good -- because the inputs never repeat: the string representation of each number grows as the numbers get larger, so use arbitrary-precision integers when necessary.
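A minimal sketch of the idea, assuming Python's hashlib and SHA-256 as the hash (any hash function would do):

```python
import hashlib

def hash_stream(n_blocks, hash_name="sha256"):
    """Yield the hashes of "0", "1", "2", ... as blocks of a bit stream.

    Python integers are arbitrary precision, so the counter never
    rolls over and the inputs never repeat.
    """
    for i in range(n_blocks):
        yield hashlib.new(hash_name, str(i).encode("ascii")).digest()

# Concatenate the first few blocks into one byte stream.
stream = b"".join(hash_stream(4))
```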
We can also consider constructions in which the strings being hashed have the form "PREFIX SEPARATOR NUMBER", where PREFIX is an arbitrarily long initialization vector supplied by the user. This construction needs care to avoid attacks such as length extension. We might instead use a keyed hash function and derive the key from the user-supplied initialization vector; we leave the details unspecified because they also require care.
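One way to sketch the keyed variant -- an illustration, not the post's specified design: the key derivation (SHA-256 of the IV) is an assumption, and BLAKE2b's built-in keyed mode stands in for "a keyed hash function":

```python
import hashlib

def keyed_stream(iv: bytes, n_blocks: int):
    """Yield keyed hashes of "0", "1", ... under a key derived from iv."""
    # Hypothetical key derivation: compress the arbitrarily long IV
    # down to a fixed-length key with SHA-256.
    key = hashlib.sha256(iv).digest()
    for i in range(n_blocks):
        # BLAKE2b's keyed mode sidesteps the length-extension pitfalls
        # of naively hashing "PREFIX SEPARATOR NUMBER" with a
        # Merkle-Damgard hash.
        yield hashlib.blake2b(str(i).encode("ascii"), key=key).digest()
```

Different IVs yield independent-looking streams, and the same IV reproduces the same stream.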
Write the numbers big-endian (most significant digit first) to cache partial computation: the big end of consecutive stringified numbers changes slowly, so the hash state after absorbing the leading digits can be reused across many counters.
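A minimal sketch of that caching, assuming hashlib (whose hash objects expose copy() for exactly this kind of partial-state reuse): every run of up to ten consecutive numbers shares its leading digits, so we absorb those digits once and finish each digest with a cheap copy() plus a one-digit update():

```python
import hashlib

def cached_digests(n_blocks):
    """Hash "0" .. str(n_blocks - 1), reusing shared-prefix hash state.

    Because the big end (leading digits) of consecutive counters rarely
    changes, we hash the all-but-last digits once per run of ten numbers
    and complete each digest from a copy of that cached state.
    """
    out = []
    i = 0
    while i < n_blocks:
        prefix = str(i)[:-1]  # leading digits, shared by up to 10 numbers
        base = hashlib.sha256(prefix.encode("ascii"))
        for d in range(i % 10, 10):
            if i >= n_blocks:
                break
            h = base.copy()  # reuse the cached partial state
            h.update(str(d).encode("ascii"))
            out.append(h.digest())
            i += 1
    return out
```

The same trick generalizes to longer shared prefixes (cache one hash object per prefix length) and pays off most when the hash's per-byte cost dominates.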
Unlike traditional counter mode, which is periodic because of its fixed counter width (typically one block), we unashamedly make the counter an arbitrary-precision integer so it never rolls over, at some cost in performance.
All this is Roll Your Own Cryptography, so probably not a good idea if you need security.