A randomness extractor, often simply called an "extractor", is a function which, when applied to output from a weakly random entropy source together with a short,
uniformly random seed, generates a highly random output that appears independent
from the source and uniformly distributed.[1] Examples of weakly random sources
include radioactive decay or thermal noise; the only restriction on possible
sources is that there is no way they can be fully controlled, calculated or
predicted, and that a lower bound on their entropy rate can be established. For a
given source, a randomness extractor can even be considered to be a true random
number generator (TRNG); but there is no single extractor that has been proven to
produce truly random output from any type of weakly random source.
Sometimes the term "bias" is used to denote a weakly random source's departure from
uniformity, and in older literature, some extractors are called unbiasing
algorithms,[2] as they take the randomness from a so-called "biased" source and
output a distribution that appears unbiased. The weakly random source will always
be longer than the extractor's output, but an efficient extractor is one that
lowers this ratio of lengths as much as possible, while simultaneously keeping the
seed length low. Intuitively, this means that as much randomness as possible has
been "extracted" from the source.
Formal definition of extractors
The min-entropy of a distribution $X$ (denoted $H_{\infty}(X)$) is the largest real number $k$ such that $\Pr[X=x]\leq 2^{-k}$ for every $x$ in the range of $X$. In essence, this measures how likely $X$ is to take its most likely value, giving a worst-case bound on how random $X$ appears. Letting $U_{\ell}$ denote the uniform distribution over $\{0,1\}^{\ell}$, clearly $H_{\infty}(U_{\ell})=\ell$.
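As a quick worked example, take a single bit that equals 1 with probability 0.9:
$$\Pr[X=1]=0.9,\quad \Pr[X=0]=0.1 \quad\Longrightarrow\quad H_{\infty}(X)=-\log_{2}\max_{x}\Pr[X=x]=-\log_{2}0.9\approx 0.15,$$
well below the $1$ bit of min-entropy of a uniformly random bit, so such a source is only weakly random.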
Intuitively, an extractor takes a weakly random $n$-bit input and a short, uniformly random $d$-bit seed and produces an $m$-bit output that looks uniformly random. The aim is to have a low $d$ (i.e. to use as little uniform randomness as possible) and as high an $m$ as possible (i.e. to get out as many close-to-random bits of output as we can).
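For concreteness, the standard formal definition used in the extractor literature can be stated as follows: a function $\mathrm{Ext}:\{0,1\}^{n}\times\{0,1\}^{d}\rightarrow\{0,1\}^{m}$ is a $(k,\epsilon)$-extractor if, for every distribution $X$ on $\{0,1\}^{n}$ with $H_{\infty}(X)\geq k$, the distribution $\mathrm{Ext}(X,U_{d})$ is $\epsilon$-close (in statistical distance) to $U_{m}$.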
Strong extractors
An extractor is strong if concatenating the seed with the extractor's output yields
a distribution that is still close to uniform.
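In the same notation, the standard formulation is: $\mathrm{Ext}$ is a strong $(k,\epsilon)$-extractor if, for every distribution $X$ on $\{0,1\}^{n}$ with $H_{\infty}(X)\geq k$, the joint distribution $(U_{d},\mathrm{Ext}(X,U_{d}))$, where the same seed value appears in both components, is $\epsilon$-close to the uniform distribution $U_{d+m}$.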
Explicit extractors
Using the probabilistic method, it can be shown that there exists a $(k,\epsilon)$-extractor, i.e. that the construction is possible. However, it is usually not enough merely to show that an extractor exists. An explicit construction is needed, which is given as follows:
Definition (Explicit Extractor): For functions $k(n)$, $\epsilon(n)$, $d(n)$, $m(n)$, a family $\mathrm{Ext}=\{\mathrm{Ext}_{n}\}$ of functions $\mathrm{Ext}_{n}:\{0,1\}^{n}\times\{0,1\}^{d(n)}\rightarrow\{0,1\}^{m(n)}$ is an explicit $(k,\epsilon)$-extractor if $\mathrm{Ext}_{n}(x,y)$ can be computed in polynomial time (in its input length) and, for every $n$, $\mathrm{Ext}_{n}$ is a $(k(n),\epsilon(n))$-extractor.
By the probabilistic method, it can be shown that there exists a $(k,\epsilon)$-extractor with seed length $d=\log(n-k)+2\log(1/\epsilon)+O(1)$ and output length $m=k+d-2\log(1/\epsilon)-O(1)$.
Randomness extractors in cryptography
The following paragraphs define and establish an important relationship between two kinds of exposure-resilient function (ERF), the k-ERF and the k-APRF, which are useful in exposure-resilient cryptography.
The goal is to construct an adaptive ERF whose output is highly random and uniformly distributed. A stronger condition is often needed, however, in which every output value occurs with almost uniform probability; for this purpose, almost-perfect resilient functions (APRF) are used.
Kamp and Zuckerman prove a lemma stating, roughly, that an extractor whose error is at most $2^{-m}\epsilon(n)$ is also a k-APRF.[7] The lemma is proved by bounding the output's distance from the uniform distribution, which for a $2^{-m}\epsilon(n)$-extractor is by definition at most $2^{-m}\epsilon(n)$; this satisfies the APRF condition.
The lemma leads to the following theorem, stating that a k-APRF function as described does in fact exist:
Theorem (existence): For any positive constant $\gamma \leq \frac{1}{2}$, there exists an explicit k-APRF $f:\{0,1\}^{n}\rightarrow \{0,1\}^{m}$, computable in a linear number of arithmetic operations on $m$-bit strings, with $m=\Omega(n^{2\gamma})$ and $k=n^{\frac{1}{2}+\gamma}$.
That this extractor fulfills the criteria of the lemma is trivially true, as $\epsilon = 2^{-cm}$ is a negligible function.
Examples
Von Neumann extractor
Further information: Bernoulli sequence
Perhaps the earliest example is due to John von Neumann. His extractor took
successive pairs of consecutive bits (non-overlapping) from the input stream. If
the two bits matched, no output was generated. If the bits differed, the value of
the first bit was output. The Von Neumann extractor can be shown to produce a
uniform output even if the distribution of input bits is not uniform so long as
each bit has the same probability of being one and there is no correlation between
successive bits.[8]
Thus, it takes as input a Bernoulli sequence with $p$ not necessarily equal to $1/2$, and outputs a Bernoulli sequence with $p=1/2$. More generally, it applies to any exchangeable sequence; it relies only on the fact that for any pair, 01 and 10 are equally likely: for independent trials these have probabilities $p\cdot(1-p)=(1-p)\cdot p$, while for an exchangeable sequence the probability may be more complicated, but both outcomes remain equally likely.
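The procedure just described is straightforward to implement; the following is a minimal Python sketch, given as an illustration rather than code from any particular library:

import random

def von_neumann_extractor(bits):
    """Von Neumann extractor: read non-overlapping pairs of input bits,
    discard equal pairs, and output the first bit of each unequal pair
    (01 -> 0, 10 -> 1), which are equally likely for independent bits."""
    output = []
    for i in range(0, len(bits) - 1, 2):
        first, second = bits[i], bits[i + 1]
        if first != second:
            output.append(first)
    return output

# Example: a biased but independent source with Pr[1] = 0.9.
biased = [1 if random.random() < 0.9 else 0 for _ in range(100_000)]
unbiased = von_neumann_extractor(biased)
print(len(unbiased), sum(unbiased) / len(unbiased))  # about 9,000 bits, mean close to 0.5

Note the cost: with this bias each input pair yields an output bit only with probability $2p(1-p)=0.18$, so only about 9% of the input bits survive, which reflects the efficiency concern discussed in the introduction.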
Chaos machine
Another approach is to use the output of a chaos machine applied to the input
stream. This approach generally relies on properties of chaotic systems. Input bits
are pushed to the machine, evolving orbits and trajectories in multiple dynamical
systems. Thus, small differences in the input produce very different outputs. Such
a machine has a uniform output even if the distribution of input bits is not
uniform or has serious flaws, and can therefore use weak entropy sources.
Additionally, this scheme allows for increased complexity, quality, and security of
the output stream, controlled by specifying three parameters: time cost, memory
required, and secret key.
Applications
Randomness extractors are used widely in cryptographic applications, where a cryptographic hash function is applied to a high-entropy but non-uniform source, such as disk drive timing information or keyboard delays, to yield a uniformly random result.
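A minimal Python sketch of this pattern follows; the choice of SHA-256, the 256-byte sample size, and the use of os.urandom as a stand-in entropy source are illustrative assumptions, and the construction is only sound if the raw samples genuinely carry enough min-entropy:

import hashlib
import os

def extract_with_hash(raw_samples: bytes, out_len: int = 32) -> bytes:
    """Condense high-entropy but non-uniform bytes into a short,
    close-to-uniform string by hashing them with SHA-256."""
    if out_len > hashlib.sha256().digest_size:
        raise ValueError("cannot request more bytes than one digest provides")
    return hashlib.sha256(raw_samples).digest()[:out_len]

# Stand-in for real measurements (timing jitter, keyboard delays, ...).
raw = os.urandom(256)
key = extract_with_hash(raw)  # 32 bytes intended for use as near-uniform key material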
Randomness extraction is also used to convert data into a simple random sample whose elements are uniformly distributed and independent, as is desired in statistics.
See also