
Physics 212: Statistical mechanics II, Fall 2006 Lecture XXVII-XXVIII

Many physical systems have a unique ground state in the limit of zero temperature, and hence zero entropy. However, there are a number of systems where, either because of a degeneracy of ground states or because of dynamical considerations, a nonzero residual entropy is measured down to zero temperature (typically by measuring the specific heat over a range of temperatures up to some well-understood high-temperature limit, then integrating). We first discuss ice models, where a residual entropy results from a geometric degeneracy of configurations, and then glasses, in which a unique ground state may exist. Reference: Ziman, Models of Disorder.

Water ice is a surprisingly complicated material. At atmospheric pressure, there is a range of temperatures where the Ice I structure appears: this is a wurtzite structure in which each oxygen atom is tetrahedrally coordinated with four other oxygen atoms. There is exactly one hydrogen atom on each oxygen-oxygen bond, but the hydrogen atom sits closer to one oxygen atom than the other, in such a way that each oxygen has two hydrogen atoms close to it. This Pauling model of ice has an entropy resulting from the different ways in which the hydrogen atoms can be arranged. To estimate the entropy, we can start by assuming that each bond has 2 configurations, depending on whether the hydrogen atom is closer to one oxygen or the other (giving 2^{2N} configurations, where N is the number of oxygen atoms and 2N the number of bonds), then noting that for each oxygen atom, only 6 of the 16 possible configurations of its four bonds will have exactly 2 hydrogens next to the oxygen. This leads to the estimate

    S/N = (k_B/N) log[2^{2N} (6/16)^N] = k_B log(3/2),    (1)

which is within about 1 percent of the correct answer. Note that we ignored some correlations between the bond configurations in this argument, so this agreement is a bit surprising.

Similar geometric degeneracies appear in many magnetic and other models when interactions are frustrated: not every interaction energy can be simultaneously minimized. An example is the Ising antiferromagnet on the triangular lattice, which has a macroscopic entropy of ground states.
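The counting behind the Pauling estimate can be made concrete by brute force. The sketch below (an added illustration in Python, not part of the notes) enumerates hydrogen arrangements obeying the ice rule on a small periodic square lattice, a 2D stand-in for the tetrahedral ice network; the lattice sizes are kept tiny so exact enumeration is possible, and on such tiny lattices the entropy per site still carries sizable finite-size corrections, so only the counting procedure, not the 1 percent agreement, should be read off.

    import itertools
    from math import log

    def count_ice_states(L):
        """Exact count of hydrogen arrangements obeying the ice rule on an
        L x L square lattice with periodic boundaries (every oxygen 4-coordinated)."""
        sites = [(i, j) for i in range(L) for j in range(L)]
        index = {s: k for k, s in enumerate(sites)}
        bonds = []
        for (i, j) in sites:
            bonds.append((index[(i, j)], index[((i + 1) % L, j)]))  # bond to the right
            bonds.append((index[(i, j)], index[(i, (j + 1) % L)]))  # bond upward
        n_valid = 0
        for sigma in itertools.product((0, 1), repeat=len(bonds)):
            counts = [0] * len(sites)
            for (a, b), s in zip(bonds, sigma):
                counts[a if s == 0 else b] += 1   # hydrogen sits near oxygen a or b
            if all(c == 2 for c in counts):       # ice rule: two hydrogens per oxygen
                n_valid += 1
        return n_valid

    for L in (2, 3):
        W = count_ice_states(L)
        print(f"L={L}: W = {W}, S/N = {log(W) / L**2:.3f} k_B  (Pauling: log(3/2) = {log(1.5):.3f})")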

Glasses violate both periodicity (the simplifying assumption of solid-state physics) and ergodicity (the simplifying assumption of statistical mechanics). The simplest example of a glass, but a very hard one to study theoretically, is the glass of hard spheres, or configurational glass. Other types include covalently bonded glasses (random bond networks) and disorder-driven glasses, which turn out to be simpler theoretically. The radial distribution function of a configurational glass is liquid-like, but the glass has a shear modulus like a solid: an instantaneous snapshot looks like a liquid, but a time-domain study reveals no long-distance flow of a tracer particle.

To get some understanding, consider the phenomenon known as random close packing of hard spheres in 3D. It is believed that the most efficient packing of spheres in 3D is the hcp or fcc structure, with about 74% of space occupied by spheres (actually sphere packing is one of many mathematical problems with an upper critical dimension: sphere packing becomes simple above 24 dimensions!). However, if you shake ball bearings in a can and then push down on the lid, you will find that the spheres do not go into this periodic structure, but instead form an apparently random structure in which each sphere is rigidly held by its neighbors, but the total volume fraction is only about 63% (the detailed number seems to depend slightly on the precise way the structure is created).

Now suppose that our hard-sphere system is at finite temperature and just below the random close-packed volume fraction, and that we track the motion of one particular hard sphere (a tracer particle). One can picture the tracer particle as spending most of its time vibrating in a cage of nearest-neighbor spheres, and only occasionally finding enough room to squeeze out of the cage and move to a different set of nearest neighbors. This gives some justification for the phenomenological free-volume theory of configurational glasses: dynamical properties such as the tracer-particle diffusion constant scale as

    D ∝ exp(−C/(V − V0)),    (2)

where C and V0 are model-dependent constants, and V is the volume per particle. The picture is that V0 is something like the cage volume in the random close-packed structure: when the volume per particle is close to the cage volume, the diffusion constant becomes exponentially small compared to its behavior at low packing.

Definition of random systems

We will discuss a first example of physics in random systems that at first glance looks quite different from the statistical mechanics problems discussed so far: Anderson localization of a single electron in a random potential. Although this appears to be a single-particle property, it is a famous example of how a simple RG argument can be used to get a qualitative picture for a complicated system where an exact treatment is extremely difficult. Consider electronic wavefunctions (solutions of the nonrelativistic time-independent Schrödinger equation) in a random potential. A simple guess is that at low energy there are trapped states that decay exponentially at large distances, while at high energy there are scattering states, in which on long times the electron executes a random walk and the wavefunction extends to spatial infinity. In three dimensions this picture is correct: in fact there is a special energy, known as the mobility edge, that separates extended from localized states. An argument due to Mott for the existence of the mobility edge is that having extended and localized states coexist at the same energy is only possible for rare disorder potentials, since a small perturbation will mix the two states (the energy denominator being zero) and give rise to two extended states.

In one and two dimensions, it is now believed that all states are localized by a random potential, although the localization length (the length scale on which the localized wavefunctions decay exponentially) becomes extremely long at high energies, especially in 2D. How is this even possible? You can convince yourself that a random walk in 1D or 2D returns to the same points over and over again, while in 3D and higher the mean number of returns to a given point is finite. As a result of this repetition, even a small bump in the random potential is amplified by being visited a large number of times: the result of constructive quantum interference is to lead to localization by even weak disorder.
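The recurrence claim is easy to check numerically. Here is a minimal sketch (an added illustration, not part of the notes; the numbers of walks and steps are arbitrary choices) that simulates simple random walks in d = 1, 2, 3 and counts returns to the origin.

    import numpy as np

    def mean_returns(d, n_steps, n_walks=200, seed=0):
        """Average number of returns to the origin for a simple random walk in d dimensions."""
        rng = np.random.default_rng(seed)
        total = 0
        for _ in range(n_walks):
            axes = rng.integers(0, d, size=n_steps)        # which axis each step moves along
            signs = rng.choice([-1, 1], size=n_steps)      # step direction along that axis
            steps = np.zeros((n_steps, d), dtype=int)
            steps[np.arange(n_steps), axes] = signs
            pos = np.cumsum(steps, axis=0)
            total += np.count_nonzero(np.all(pos == 0, axis=1))
        return total / n_walks

    for d in (1, 2, 3):
        r1 = mean_returns(d, n_steps=20000)
        r2 = mean_returns(d, n_steps=80000)
        print(f"d={d}: mean returns ~ {r1:.2f} after 2e4 steps, {r2:.2f} after 8e4 steps")
    # In d = 1, 2 the count keeps growing with the number of steps (recurrent walks);
    # in d = 3 it saturates at a finite value (transient walk), as used in the argument above.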

We can give an RG argument that supports this localization result and is much simpler than a serious calculation. Our assumption is going to be that there is a renormalization-group flow in a single parameter, which we take to be the average conductance g. When the states are extended and a scattering picture is correct, the conductance is large; when states are localized, the conductance becomes extremely small. In the extended regime, we assume that Ohm's law applies:

    d ln g / d ln L = d − 2.    (3)

To understand this, note that it describes just power-counting scaling: Ohm's law gives g = σ L^{d−2}, where the conductivity σ is scale-independent and (in natural units) is spatially dimensionless in 2D, has dimensions of inverse length in 3D, and dimensions of length in 1D. In the localized regime, when g is small, there is a localization length ξ beyond which g goes to zero exponentially: g(L) ∼ exp(−L/ξ). Then dg/dL = −(1/ξ) g, or

    d ln g / d ln L = −L/ξ.    (4)

Now make a plot of how g evolves with L. In 3D there are two regimes separated by an unstable fixed point, while in 1D and 2D the only stable fixed point is at g = 0.
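To see the two regimes in a single flow, here is a small numerical sketch (an added illustration; the interpolating beta function is a toy choice, not taken from the notes, picked only because it reproduces the two limits above) that integrates d ln g / d ln L from several starting conductances.

    import numpy as np

    def beta(g, d):
        """Toy beta function with the right limits: beta -> d - 2 at large g (Ohmic),
        beta -> ln g at small g (exponential localization)."""
        return (d - 2) - np.log1p(1.0 / g)

    def flow(g0, d, total_lnL=10.0, dlnL=0.05):
        """Integrate d ln g / d ln L = beta(g); stop once g is negligibly small."""
        lng = np.log(g0)
        for _ in range(int(total_lnL / dlnL)):
            if lng < -30:        # conductance has effectively flowed to zero
                break
            lng += beta(np.exp(lng), d) * dlnL
        return np.exp(lng)

    for d in (1, 2, 3):
        for g0 in (0.3, 0.7, 2.0):
            print(f"d={d}, g0={g0}: g after 10 e-foldings of L -> {flow(g0, d):.3g}")
    # Expected picture: in d = 1, 2 every starting point flows to g -> 0 (localized);
    # in d = 3 an unstable fixed point (near g* ~ 0.6 for this toy beta function)
    # separates flow to g -> 0 from flow to large g (metallic).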

Harris criterion

Now let's return to classical statistical mechanics models of many interacting degrees of freedom. Suppose that a magnetic or other material is made in an imperfect way: it may be logical to model the system as having either random values of J or random values of h, or both. The challenge is to compute physical quantities, such as the free energy, and only then average over the randomness: that is, the experimentally relevant free energy is

    F = [−kT log Z]_av,    (5)

where the brackets [...]_av denote the average over disorder, distinct from the sum over thermally weighted configurations that makes up Z. Much effort (the replica and supersymmetry approaches) has been devoted to finding effective ways to carry out such nonthermal averages.

Now suppose that we are given a model like the Ising model, and asked whether a small amount of randomness is relevant at the ordinary (clean) phase transition that we have studied in the past. It turns out that the most common type of disorder, namely disorder that couples to a term appearing in the energy density (such as random J in the Ising model, referred to as bond disorder), is simple to analyze in this way. Consider the random-bond Ising model: the energy function is still (for the ferromagnet)

    E = −Σ_{⟨ij⟩} J_{ij} s_i s_j,    (6)
but now J_{ij} is an independent random variable on each bond (say, drawn from a Gaussian, or from a log-normal distribution to ensure that J remains positive if that is physically important). We are now going to determine whether a small amount of disorder in the J distribution, which couples to the local energy density, is relevant or irrelevant at the phase transition, using a rescaling argument. As in the Ginzburg criterion, we ask whether the randomness becomes more or less significant as we near the transition. If the system is away from the phase transition by a dimensionless distance t = |T − Tc|/Tc in the clean case, its correlation length is ξ ∼ t^{−ν}. To get an estimate of how strong the overall disorder of J is over a correlation volume, recall that J is an independent random variable on each bond, so that the relative fluctuations of the average J fall off as the inverse square root of the correlation volume:

    ΔJ/J ∼ ξ^{−d/2} ∼ t^{νd/2}.    (7)

Now, in the spirit of the Ginzburg criterion, we want to compare these fluctuations in J to something. Note that since it is the dimensionless coupling J/T that determines the location of the phase transition, a fluctuation in J can be thought of as a local fluctuation in temperature. So to see whether the fluctuations in temperature are larger or smaller than the original distance in temperature from the critical point, we need to compare t^{νd/2} with t as t → 0. The fluctuations will eventually dominate if

    νd/2 < 1, i.e., 2 − νd > 0, i.e., α > 0,    (8)

where α is our old friend the specific-heat exponent and we have used the scaling relation α = 2 − νd. To reiterate, if α is positive then the critical point is unstable to fluctuations, because the relative importance of fluctuations grows as we move toward the transition; if α is negative, then it is at least possible that the critical point is stable. This result is known as the Harris criterion. The random-field Ising model, where the disorder is in the coefficient h that couples to the order parameter, is somewhat more difficult to analyze, but also very interesting (cf. Cardy).
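As a concrete check of the criterion, the following small snippet (an added illustration; the exponent values are approximate literature values, not taken from the notes) evaluates α = 2 − νd for a few clean universality classes in d = 3.

    # Harris criterion: bond disorder is relevant at a clean critical point when
    # alpha = 2 - nu*d > 0, i.e., local-Tc fluctuations outgrow the distance t.
    examples = {
        "3D Ising":      0.630,   # correlation-length exponent nu (approximate)
        "3D XY":         0.672,
        "3D Heisenberg": 0.711,
    }
    d = 3
    for name, nu in examples.items():
        alpha = 2.0 - nu * d
        verdict = "relevant" if alpha > 0 else "irrelevant"
        print(f"{name}: nu = {nu:.3f}, alpha = 2 - nu*d = {alpha:+.3f} -> weak bond disorder is {verdict}")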

I would like to give one example of how renormalization-group methods have been used to understand spin glasses. The term spin glass refers to a magnetic model with random interactions that lead to glassy dynamics: experimentally these systems have been actively studied since the 1970s. Suppose that disorder is relevant at a clean critical point: then even a small amount of disorder will flow to a large degree of disorder, and we need to find a way to treat the latter. In principle this is a very hard problem: we have an initial probability distribution of random couplings, and would like to find the evolution of this probability distribution. Since the probability distribution p(J) is a function, not just a finite number of parameters, we are now doing a functional RG: the flow equations will be equations for a function, i.e., partial differential equations.

In one dimension, there is an intuitive real-space renormalization group (RSRG) method to study this problem. It even works well for quantum systems, so let me explain how it works for one of those (this will follow closely work of D. S. Fisher). Suppose that we have a distribution of couplings J_i in the Heisenberg antiferromagnet for a spin-half chain,

    H = Σ_i J_i s_i · s_{i+1}.    (9)
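To make equation (9) concrete, here is a minimal numerical sketch (an added illustration; the chain length, the uniform coupling distribution, and open boundaries are assumptions made only for the example) that builds the Hamiltonian of a short chain with random couplings and diagonalizes it exactly.

    import numpy as np

    # spin-1/2 operators
    sx = np.array([[0, 1], [1, 0]]) / 2
    sy = np.array([[0, -1j], [1j, 0]]) / 2
    sz = np.array([[1, 0], [0, -1]]) / 2
    I2 = np.eye(2)

    def site_op(op, i, n):
        """Operator acting as `op` on site i of an n-site chain (identity elsewhere)."""
        mats = [I2] * n
        mats[i] = op
        out = mats[0]
        for m in mats[1:]:
            out = np.kron(out, m)
        return out

    def chain_hamiltonian(J):
        """H = sum_i J_i s_i . s_{i+1} for an open chain with the given couplings."""
        n = len(J) + 1
        H = np.zeros((2**n, 2**n), dtype=complex)
        for i, Ji in enumerate(J):
            for op in (sx, sy, sz):
                H += Ji * site_op(op, i, n) @ site_op(op, i + 1, n)
        return H

    rng = np.random.default_rng(0)
    J = rng.uniform(0.1, 1.0, size=7)   # random antiferromagnetic couplings, 8 sites
    evals = np.linalg.eigvalsh(chain_hamiltonian(J))
    print("ground-state energy:", evals[0].round(4), " first gap:", (evals[1] - evals[0]).round(4))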

We would like to develop a rescaling procedure to understand the (highly nontrivial) low-energy physics of this random chain. In an ordinary momentum-shell RG, we integrate out the highest energy scales to get a new problem. Here, we can look along the chain for the largest coupling J_i. If this coupling were infinitely larger than its nearest neighbors, then the two spins it joins would form a singlet with a large energy gap J_i, and we could assume in continuing the process that this singlet was a good description of the low-energy physics. Since J_i will not actually be infinite, let's keep the assumption that the singlet forms between spins i and i + 1, but now think about how the rest of the couplings in the chain are modified. We can do perturbation theory if J_i is much larger than all the other couplings: in second-order perturbation theory, spins i − 1 and i + 2 will no longer be totally independent of each other but will be coupled by a new effective coupling J_eff. To avoid doing the perturbation-theory calculation (a straightforward exercise in quantum mechanics), note that the new coupling will have an energy denominator J_i, because spins i − 1 and i + 2 can only be coupled by an excited intermediate state
of the bond from i to i + 1, and the singlet-triplet splitting is just J_i in units where ħ = 1. In order to get an effective coupling, I need two powers of energy in the numerator, and I need J_{i−1} and J_{i+1} to appear, since if either one is zero then J_eff is zero. Hence we arrive at

    J_eff = J_{i−1} J_{i+1} / J_i,    (10)
which is correct up to a possible numerical factor. The remainder of the RSRG approach lies in converting the equation above into a partial differential equation for the distribution p(J), and then looking for asymptotic solutions. Although this becomes a bit technical, the final results are worth the trouble: the asymptotic distribution of couplings is very broad, which post hoc justifies the second-order perturbation theory used above. The renormalization-group flows can be used to show that the disorder-averaged correlations are critical (power-law) in space, but with modifications from the clean case.
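The broadening of the coupling distribution can be seen directly by iterating the decimation rule (10) numerically. The sketch below (an added illustration; the initial coupling distribution, chain length, and dropping the order-one numerical prefactor are assumptions for the example) repeatedly removes the strongest bond and joins its neighbors with J_eff = J_{i−1} J_{i+1} / J_i, tracking the spread of log J as the chain shrinks.

    import numpy as np

    def rsrg_decimate(J):
        """One strong-disorder RG step: remove the largest coupling J_i and join its
        neighbors with J_eff = J_{i-1} J_{i+1} / J_i (order-one prefactor dropped)."""
        i = int(np.argmax(J))
        if i == 0 or i == len(J) - 1:
            return np.delete(J, i)    # an end bond has only one neighbor: just drop it
        j_eff = J[i - 1] * J[i + 1] / J[i]
        return np.concatenate([J[:i - 1], [j_eff], J[i + 2:]])

    rng = np.random.default_rng(1)
    J = rng.uniform(0.5, 1.5, size=20000)      # initially narrow coupling distribution
    checkpoints = [10000, 1000, 100, 10]
    while len(J) > 10:
        J = rsrg_decimate(J)
        if checkpoints and len(J) <= checkpoints[0]:
            print(f"{len(J):6d} bonds left: std of log J = {np.log(J).std():.2f}")
            checkpoints.pop(0)
    # The spread of log J grows as the chain is decimated: the renormalized coupling
    # distribution becomes very broad, which is what justifies the perturbative
    # decimation step post hoc.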
