[Fig. 1: 6-input, single-output functions. Number of literals vs. entropy H(P1); plotted are the average, the minimum, and the line 47.6·H(P1).]

[Fig. 3: 6-input single-output functions with don't cares.]
Figure 2 shows the average literal count versus entropy data for 6-, 7-, 8- and 9-input functions. Each point represents an average over 200 randomly generated functions. A near-linear relationship between L and H(P1) is evident. Since these functions are fully specified, d = 0, and according to Eq. (5) the slope should be K(n). The measured slope for each n is given in Table 1. Also, we notice that K(n)/K(n-1) is close to 2. This confirms the previous observations [9]:

    K(n) = k·2^n    (6)

for some constant k.

[Fig. 4: 7-input single-output functions with don't cares.]

Table 2 - Ratio of literal count to entropy (L/H)

    Don't cares        n = 6                n = 7
    (100×d %)      L/H   Normalized     L/H   Normalized
       0%          47.6     1.00        93.1     1.00
      20%          38.1     0.80        73.6     0.79
      40%          27.0     0.57        53.0     0.57
      60%          16.8     0.35        35.5     0.38
      80%           7.7     0.16        15.8     0.17
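The quantities behind Table 2 can be reproduced in outline: the entropy of a single-output function follows from the fraction of input vectors mapped to 1, and the predicted literal count is (1-d)·K(n)·H(P1) per Eqs. (5) and (6). The sketch below is only an illustration of that arithmetic; the helper names and the random truth table are ours, while the constant 47.6 is the measured K(6) from Table 2.

```python
import math
import random

def output_entropy(truth_table):
    """Entropy H(P1) of a single-output function, given its truth table
    as a list of 0/1 values over all 2^n input vectors."""
    p1 = sum(truth_table) / len(truth_table)
    if p1 in (0.0, 1.0):
        return 0.0
    return -p1 * math.log2(p1) - (1 - p1) * math.log2(1 - p1)

def predicted_literals(n, d, entropy, k6=47.6):
    """Predicted literal count L = (1-d)*K(n)*H (Eq. (5)), using the
    measured slope K(6) = 47.6 and the doubling rule K(n) = k*2^n (Eq. (6))."""
    K = k6 * 2 ** (n - 6)
    return (1 - d) * K * entropy

random.seed(0)
tt = [random.randint(0, 1) for _ in range(2 ** 6)]   # a random 6-input function
H = output_entropy(tt)
print(round(H, 3), round(predicted_literals(6, 0.0, H), 1))
```

A random fully specified function has H(P1) near 1, so the prediction lands near the K(6) slope itself, matching the first row of Table 2.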
Multioutput Functions. We again consider two cases. In the first case, we assume that the functions are fully specified; in the second case, the outputs are specified only for a subset of the input vectors.
[Fig. 2: fully specified n-input functions.]

Table 1 - Measured slope K(n)

    n    K(n)     K(n)/K(n-1)
    6    47.59        -
    7    93.05       1.96

Fully specified function: Figure 5 shows the relationship between the literal count and the entropy as obtained from 1,200 randomly generated 7-input 2-output fully specified functions. The entropy, H(P), for the functions was computed from Eq. (2). The functions were minimized by MIS using a standard script. Figure 6 shows similar data obtained from 1,500 randomly generated 7-input 3-output functions. Figures 7 and 8 summarize the data obtained from randomly generated 8-input, 4-output functions and 9-input, 7-output functions, respectively.
Partially specified function: Don't cares in a combinational network specification are known to reduce the amount of logic. Figure 3 gives the results for 6-input single-output functions. The five sets of data correspond to the functions with 0%, 20%, 40%, 60% and 80% don't cares.

An interesting inference from these results is that, irrespective of the number of inputs and outputs, the average amount of hardware (literal count) is always given by the computational work formula. For multi-output functions, from Eqs. (5) and (6), we get

    L(n, d, P) = (1-d)·k·2^n·H(P)    (7)
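Eq. (7) uses the multi-output entropy H(P) of Eq. (2), taken over the distribution of output patterns rather than a single output bit. A minimal sketch, assuming the function is represented as a list of m-bit output tuples over all input vectors (the representation and names are our illustrative choices):

```python
import math
from collections import Counter

def multi_output_entropy(outputs):
    """H(P) = -sum_i p_i * log2(p_i) over distinct output patterns (Eq. (2)),
    where p_i is the fraction of input vectors producing pattern i."""
    counts = Counter(outputs)
    total = len(outputs)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# A 2-output function uniform over its four possible output patterns
# carries the maximum of m = 2 bits of entropy.
outs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(multi_output_entropy(outs))  # → 2.0
```

An m-output function thus has entropy between 0 and m bits, which is why the multi-output plots extend past H = 1.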
Paper 17.4
303
[...] observe about 640 literals in Fig. 8.

A closer examination of these results shows that the number-of-literals to entropy relation deviates slightly from linearity. That is, the L/H ratio decreases slightly when the entropy H increases. The average L/H ratio for different ranges of H is listed in Table 3. Since the average L/H ratio on the range of H between 0 and 1 (third column) is very close to the values of K(n) for single-output functions shown in Table 1, we conclude that K(n) is independent of the number of outputs and is simply a function of the number of inputs. The reduced slope for higher entropy may be due to greater sharing between the multiple output functions.
[Fig. 6: 7-input 3-output random functions (1,500 samples).]

Partially specified function: Figure 9 illustrates the effect of don't cares on 7-input 3-output functions. Five sets of data are shown in this figure. These correspond to fully specified functions (shown by • in Fig. 9), and the functions having 20% (■), 40% (▲), 60% (○) and 80% (+) don't cares. Each set consists of 650 random samples. All functions were minimized by MIS. The average L/H ratios are listed in Table 4. These results justify that the L/H ratio is proportional to 1-d, where d is the fraction of don't cares (Eq. (7)).
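The proportionality of L/H to 1-d can be checked directly from the numbers already reported for the single-output case: normalizing each L/H value in Table 2 by the fully specified one should reproduce 1-d. The snippet below only re-derives that normalized column (all values copied from Table 2 in the text):

```python
# L/H values from Table 2 (single-output functions with don't cares).
# Dividing by the fully specified row (d = 0) should reproduce 1 - d.
table2 = {  # d -> (L/H for n = 6, L/H for n = 7)
    0.0: (47.6, 93.1),
    0.2: (38.1, 73.6),
    0.4: (27.0, 53.0),
    0.6: (16.8, 35.5),
    0.8: (7.7, 15.8),
}
for d, (lh6, lh7) in table2.items():
    print(f"d={d:.1f}  1-d={1 - d:.2f}  "
          f"normalized L/H: n=6 -> {lh6 / 47.6:.2f}, n=7 -> {lh7 / 93.1:.2f}")
```

The normalized values track 1-d to within a few percent, as the text claims.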
[Fig. 7: 8-input 4-output random functions (750 samples).]

[Fig. 8: 9-input 7-output random functions.]
Table 5 - FSM examples [10]: literal counts for Random and Mustang state assignments

    FSM        inputs/   Don't cares   K(n)   Entropy   Literals
    name       outputs   (%) 100×d            H(P)      L(n,d,P)/Random/Mustang
    dk15        5/7         0.0         23     3.97     92/96/92
    dk16         ?         15.6         80     6.03     400/383/307
    dk17        5/6         0.0         23     3.81     89/83/84
    dk21        4/6        12.5         12     3.18     33/29/29
    dk512       5/7         6.3         23     4.34     93/82/84
    bbtas       5/5        25.0         23     2.99     51/35/36
    beecount    6/7        20.3         41     2.35     88/56/50
    red         7/6        10.3         84     2.39     60/60/65
    modulo12    5/5        25.0         23     3.59     62/42/43
    malati      9/9        91.6        338     4.25     120/193/181
    SE          7/4        84.4         84     2.26     30/?/?
    lion9       6/5        60.9         41     3.31     62/49/22

4. TIMING

Figure 10 shows the relationship between the delay of the implemented single-output Boolean functions and entropy. The delay is estimated using the unit-fanout delay model in MIS, i.e., each gate is assumed to have one unit of delay and each fanout adds 0.2 unit. Every point in Fig. 10 represents an average of 30 random samples. The observed linear relationship between the square of delay (D^2) and entropy (H) can be expressed as follows:

    D^2 = R(n)·H

Table 6 summarizes the slopes R(n) of the straight lines fitted to the data of Fig. 10. Since R(n)/R(n-1) is approximately 2, we assume that R(n) is proportional to 2^n. Thus

    R(n) = kD·2^n

where kD is a constant of proportionality.

Table 6 - Slopes of the D^2 vs. H lines (Fig. 10)

    n    R(n)    R(n)/R(n-1)
    6     35.5       -
    7     72.0      2.03
    8    130.2      1.81
    9    242.5      1.86

5. CONCLUSION

We have presented a statistical measure of the complexity of Boolean functions. We measure computational work based on the information-theoretic entropy of the output signals. Computational work is shown to have a direct relation to the hardware needed to implement the function. Circuit delay is also related to the computational work. The computational work is reduced in direct proportion to the don't cares in the specification of a function. Partially specified functions are known to require less hardware compared to the fully specified functions with the same number of input variables. Thus, it is possible to estimate the area and delay of a circuit from its functional description. Statistical measures have not been used in modern logic synthesis; we believe there are relevant applications.

References

1. L. Hellerman, "A Measure of Computational Work," IEEE Trans. Computers, vol. C-21, pp. 439-446, May 1972.
2. R. W. Cook and M. J. Flynn, "Logical Network Cost and Entropy," IEEE Trans. Computers, vol. C-22, pp. 823-826, September 1973.
3. E. Kellerman, "A Formula for Logical Network Cost," IEEE Trans. Computers, vol. C-17, pp. 881-884, September 1968.
4. N. Pippenger, "Information Theory and the Complexity of Boolean Functions," Math. Syst. Theory, vol. 10, pp. 129-167, 1977.
5. V. D. Agrawal, "An Information Theoretic Approach to Digital Fault Testing," IEEE Trans. Computers, vol. C-30, pp. 582-587, August 1981.
6. P. K. Lala, "An Algorithm for the State Assignment of Asynchronous Sequential Circuits," Electronics Letters, vol. 14, pp. 199-201, 1978.
7. C. R. Edwards and E. Eris, "State Assignment and Entropy," Electronics Letters, vol. 14, pp. 390-391, 1978.
8. R. K. Brayton et al., "MIS: A Multiple-Level Logic Optimization System," IEEE Trans. CAD, vol. CAD-6, pp. 1062-1081, November 1987.
9. K. Mase, "Comments on 'A Measure of Computational Work' and 'Logical Network Cost and Entropy'," IEEE Trans. Computers, vol. C-27, pp. 94-95, January 1978.
10. R. Lisanke, "Finite State Machine Benchmark Set," preliminary benchmark collection, September 1987.
11. S. Devadas et al., "MUSTANG: State Assignment of Finite State Machines Targeting Multi-Level Logic Implementations," IEEE Trans. CAD, vol. CAD-7, pp. 1290-1300, December 1988.
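For completeness, the unit-fanout delay model of Section 4 (one unit per gate plus 0.2 unit per fanout) can be sketched as a longest-path computation. The netlist representation (a dict from gate to its fanin list) and the traversal below are our illustrative assumptions, not the MIS implementation; only the delay model itself is taken from the text.

```python
def unit_fanout_delay(netlist, outputs):
    """Longest input-to-output delay under the unit-fanout model:
    each gate costs 1 unit plus 0.2 unit per gate it drives.
    netlist maps gate -> list of fanins; primary inputs appear only as fanins."""
    fanouts = {}
    for gate, fanins in netlist.items():
        for f in fanins:
            fanouts[f] = fanouts.get(f, 0) + 1

    memo = {}
    def arrival(g):
        if g not in netlist:               # primary input: arrives at time 0
            return 0.0
        if g not in memo:
            gate_delay = 1.0 + 0.2 * fanouts.get(g, 0)
            memo[g] = gate_delay + max(arrival(f) for f in netlist[g])
        return memo[g]

    return max(arrival(o) for o in outputs)

# g1 = f(a, b) drives both g2 and g3, so it pays 2 * 0.2 extra units of delay.
net = {"g1": ["a", "b"], "g2": ["g1", "c"], "g3": ["g1", "d"]}
print(unit_fanout_delay(net, ["g2", "g3"]))  # → 2.4
```

Averaging such delays over random functions of fixed n is what produces the per-point data of Fig. 10, to which the lines D^2 = R(n)·H are fitted.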