Jon Sporring
(sporring@diku.dk)
Department of Computer Science / University of Copenhagen
Universitetsparken 1 / DK-2100 Copenhagen East
DENMARK
It can be seen that the spatial quantization has only a constant effect on the information change and will hereafter be ignored.

Using the image as a distribution, $p_t = \frac{1}{c} I_t = \frac{1}{c}\, G_t \otimes I$, where $c = \int I(x)\,dx$, and $\frac{\partial I}{\partial t} = \frac{1}{2}\nabla^2 I$ (the Heat Equation), where $\nabla^2$ is the Laplacian operator, thus yields the change of entropy with scale.

The higher-order moments of the entropy change function do not grow monotonically with increased inner scale. The main reason for this is that different structures dominate at different inner scales. But comparing the entropy change functions in figure 1 with those in figure 2, it is seen that the difference in the finer structure results in a difference of the
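The resulting expression can be made explicit; as a hedged sketch (the exact form used here may differ), the classical de Bruijn identity gives the entropy growth of a density $p_t$ evolving under the heat equation $\frac{\partial p_t}{\partial t} = \frac{1}{2}\nabla^2 p_t$:

$$\frac{dS}{dt} = -\frac{d}{dt}\int p_t(x)\,\log p_t(x)\,dx = \frac{1}{2}\int \frac{\|\nabla p_t(x)\|^2}{p_t(x)}\,dx \;\geq\; 0,$$

i.e. one half of the Fisher information of $p_t$, so the entropy can only increase with inner scale.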
[Figure 1 panels: for each texture — Fabric_0012 (2a–c), Fabric_0008 (3a–c), Fabric_0007 (4a–c) — panel (a) shows the image (x, y in pixels), panel (b) the entropy (bits) versus scale (log(pixels^2)), and panel (c) the entropy change (bits/log(pixel^2)) versus scale, annotated with its maximum and its mean, variance, skew, and kurtosis.]
Figure 1. Examples of the entropy-scale function. The a's are the image/distribution, the b's the entropy as a function of logarithmic scale, and the c's the entropy change with respect to the logarithmic scale.
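The entropy-scale function of figure 1 can be sketched numerically: blur the image with Gaussians of increasing scale, renormalize to a distribution, and measure the Shannon entropy. A minimal sketch, where the random test image and the scale grid are illustrative assumptions rather than the paper's data:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def entropy_bits(p):
    """Shannon entropy in bits of a nonnegative array that sums to 1."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def entropy_scale_function(image, sigmas):
    """Entropy of the image-as-distribution at each Gaussian inner scale."""
    out = []
    for s in sigmas:
        blurred = gaussian_filter(image.astype(float), sigma=s)
        p = blurred / blurred.sum()          # p_t = I_t / c
        out.append(entropy_bits(p))
    return np.array(out)

rng = np.random.default_rng(0)
img = rng.random((64, 64))                   # stand-in for a texture image
sigmas = np.exp(np.linspace(0.0, 2.0, 10))   # logarithmically spaced scales
S = entropy_scale_function(img, sigmas)
# Entropy grows toward the uniform bound log2(64 * 64) = 12 bits as the
# blur destroys structure
```

The curve of `S` against the logarithm of scale plays the role of the b-panels in figure 1; its numerical derivative gives the c-panels.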
[Figure 2 panels: (a) level006.b and (b) level015.b (x, y in pixels); (c) the scale of maximum entropy change versus mean blob size (pixels).]
Figure 2. The images in (a) and (b) are the first and the last from a sequence of images of a fronto-parallel plane with circular texture taken with different zoom values. The graph in (c) is the point of maximum entropy change plotted against the estimated 'blob' expansion.
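The quantity plotted in figure 2(c) — the scale at which the entropy changes fastest — can be estimated by finite differences of the entropy-scale curve. A sketch under assumed data (a synthetic blob image; the scale parameterization $t = \log \sigma^2$ is an assumption of this sketch):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def entropy_bits(image, sigma):
    """Entropy of the Gaussian-blurred image treated as a distribution."""
    p = gaussian_filter(image.astype(float), sigma=sigma)
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def scale_of_max_entropy_change(image, log_scales):
    """Argmax of dS/d(log scale), estimated by finite differences."""
    S = np.array([entropy_bits(image, np.exp(0.5 * t)) for t in log_scales])
    dS = np.gradient(S, log_scales)          # entropy change per log-scale
    return log_scales[int(np.argmax(dS))]

rng = np.random.default_rng(2)
# A blob pattern: blurring past the blob size destroys most structure
blob = rng.random((64, 64)) < 0.02
img = gaussian_filter(blob.astype(float), sigma=2.0)
t_star = scale_of_max_entropy_change(img, np.linspace(-2.0, 4.0, 25))
```

For the zoom sequence of figure 2, `t_star` would be expected to track the estimated blob expansion.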
where $p$ and $q$ are two distributions defined on identical $N$-dimensional domains to which $x$ belongs. Historically, the Kullback divergence is a measure of the waste of bandwidth when coding with an incorrect distribution.

In the case of the spatial distributions in Scale-Space, $p$ and $q$ belong to a continuous one-parameter family of distributions, and hence in the limit of infinitesimal change the change in Kullback is,

$$K[p_a; p_{a+\varepsilon}] = \int p_a(x)\,\log\frac{p_a(x)}{p_{a+\varepsilon}(x)}\,dx = \int p_a(x)\,\bigl(\log p_a(x) - \log p_{a+\varepsilon}(x)\bigr)\,dx \approx -\varepsilon \int \frac{\partial p_t(x)}{\partial t}\,dx$$

One application of the entropy is scale selection: the scales of maximal information loss are the scales at which the dominating image content deteriorates fastest. It might even be possible to globally distinguish several dominating scales. Another application is quantization: using the entropy change, a spatial down-sampling (a pyramid) can be calculated in such a way that the information loss is constant. This refines the a priori natural logarithmic scale by image content.

A natural extension of this paper would be local entropies, e.g. for texture segmentation. It is of course straightforward to window the images, but this would not be a 'true' local method. Another important matter is that the presented

7. Acknowledgements
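The infinitesimal Kullback change can be probed numerically by comparing the image-as-distribution at nearby scales. A sketch under assumed data (random test image, illustrative scale steps):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def kl_bits(p, q):
    """Kullback divergence K[p; q] in bits between discrete distributions."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def image_distribution(image, sigma):
    """Blur at inner scale sigma and normalize to a spatial distribution."""
    blurred = gaussian_filter(image, sigma=sigma)
    return blurred / blurred.sum()

rng = np.random.default_rng(1)
img = rng.random((64, 64))                     # stand-in texture image
p_a = image_distribution(img, sigma=2.0)
p_near = image_distribution(img, sigma=2.2)    # nearby scale a + eps
p_far = image_distribution(img, sigma=3.0)     # larger scale step
d_near = kl_bits(p_a, p_near)
d_far = kl_bits(p_a, p_far)
# K is nonnegative, zero only for identical distributions, and small
# (second order in the step) between nearby scales
```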
One of the basic assumptions of statistical mechanics is the indifference of thermodynamics to the position of the system under study. The number of different arrangements of a fixed number of subsystems is signified by the multiplicity function,

$$g(\{n_i\}) = \frac{N!}{\prod_i n_i!}$$

where $n_i$ is the number of particles in subsystem $i$ and $N = \sum_i n_i$ is the total number of particles in the thermodynamically closed system under study. The thermodynamic (dimensionless) entropy is given as,

$$S = \log g$$

which can be simplified using Stirling's approximation,

$$x! \simeq \sqrt{2\pi}\, x^{x+\frac{1}{2}} \exp(-x + \cdots)$$

where the higher order terms can be neglected for $x$ greater than about 10. The thermodynamic entropy is thus approximated as,

$$\begin{aligned}
S = \log g &= \log N! - \sum_i \log n_i! \\
&= \frac{1-k}{2}\log 2\pi + \left(N + \tfrac{1}{2}\right)\log N - N - \sum_{i=1}^{k}\left(n_i + \tfrac{1}{2}\right)\log n_i + \sum_{i=1}^{k} n_i \\
&= \frac{1-k}{2}\log 2\pi + \left(N + \tfrac{1}{2}\right)\log N - \sum_{i=1}^{k}\left(n_i + \tfrac{1}{2}\right)\log\frac{n_i}{N} - \left(N + \tfrac{k}{2}\right)\log N \\
&= \frac{1-k}{2}\log 2\pi N - \sum_{i=1}^{k}\left(n_i + \tfrac{1}{2}\right)\log\frac{n_i}{N} \\
&= c + N\,\langle S(N)\rangle - \frac{1}{2}\sum_{i=1}^{k}\log\frac{n_i}{N}
\end{aligned}$$

where $c = \frac{1-k}{2}\log 2\pi N$ and $\langle S(N)\rangle = -\sum_{i=1}^{k} \frac{n_i}{N}\log\frac{n_i}{N}$ is the Shannon entropy of the occupation frequencies.
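The approximation chain can be sanity-checked numerically: the exact $\log g$ computed from factorials should agree with the Stirling form to within the neglected terms once every $n_i$ exceeds about 10. A small sketch (the particle counts are arbitrary illustrative values):

```python
import math

def log_multiplicity(counts):
    """Exact log g = log(N!) - sum_i log(n_i!) via the log-gamma function."""
    N = sum(counts)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

def stirling_entropy(counts):
    """Approximation S = (1-k)/2 log(2 pi N) - sum (n_i + 1/2) log(n_i/N)."""
    N = sum(counts)
    k = len(counts)
    c = 0.5 * (1 - k) * math.log(2 * math.pi * N)
    return c - sum((n + 0.5) * math.log(n / N) for n in counts)

counts = [120, 340, 75, 265]          # illustrative subsystem occupations
exact = log_multiplicity(counts)
approx = stirling_entropy(counts)
# The two values agree to well under 0.01 nats for these counts
```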