Cluster updates
Critical slowing down
The Metropolis Monte Carlo method works very well for simulating the properties of the 2D Ising model. However, close to the Curie temperature $T_c$, the simulations suffer from critical slowing down, and more efficient algorithms are needed. Cluster update algorithms are the most successful global update methods in use. These methods update the variables globally, in one step, whereas the standard local methods operate on one variable at a time. Cluster algorithms are useful in many applications involving graphs and lattices.
A global update can reduce the autocorrelation time of the update and thus greatly reduce the statistical errors.
The Ising model does not have dynamics built into it: there is no kinetic energy term associated with the spins $s_i$. The Metropolis Monte Carlo method generates successive configurations of spins, but this does not represent the real time evolution of a system of spins. In a real system, the dynamical variables are functions of time. An interesting quantity is the relaxation time $\tau$, which is the time scale over which the system approaches equilibrium. If $A(t)$ is a quantity which relaxes towards its equilibrium value $\langle A\rangle$, then the relaxation time can be defined through
$$A(t) - \langle A\rangle \sim e^{-t/\tau}.$$
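As a small numerical illustration (with synthetic data, not from any simulation), the relaxation time can be read off from the slope of the logarithm of the decaying quantity:

```python
import numpy as np

# Synthetic relaxation data: A(t) decays toward its equilibrium value <A> = 0
# with a known relaxation time tau = 20 (illustrative values only).
tau_true = 20.0
t = np.arange(100)
A = np.exp(-t / tau_true)

# Since A(t) - <A> ~ exp(-t/tau), a straight-line fit to log(A(t) - <A>)
# recovers the relaxation time from the slope.
slope = np.polyfit(t, np.log(A), 1)[0]
tau_est = -1.0 / slope
print(tau_est)  # ~20.0
```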
Near the critical temperature, the relaxation time becomes very large, and it can be shown to diverge for an infinite system.
Autocorrelation time in Metropolis simulations
In a Metropolis simulation, the successive spin configurations also exhibit a type of critical slowing down near the phase transition temperature of the finite lattice. This is not the same as relaxation in a real system. However, it is useful to measure a relaxation time for the Metropolis “dynamics” because it helps to determine how many steps to skip in order to generate statistically independent configurations. Recall that one Monte Carlo step per spin is conventionally taken to be $N$ Metropolis steps, where $N$ is the number of spins. If the correlation time is of the order of a single Monte Carlo step, then every configuration can be used in measuring averages. But if the correlation time $\tau$ is longer, then approximately $\tau$ Monte Carlo steps should be discarded between successive data points. The time autocorrelation function is defined as
$$C(t) = \frac{\langle A(t_0)\,A(t_0+t)\rangle - \langle A\rangle^2}{\langle A^2\rangle - \langle A\rangle^2},$$
where $A(t_0)$ is the measured value of the observable in iteration $t_0$. If Monte Carlo steps separated in time by $t$ intermediate steps are truly uncorrelated, then $C(t)$ should be zero to within statistical errors (i.e., of order $1/\sqrt{N_{\mathrm{meas}}}$, where $N_{\mathrm{meas}}$ is the number of steps used in computing the averages).
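A minimal sketch of how the normalized autocorrelation function can be estimated from a series of measurements (the function and variable names are my own choices, not from the text):

```python
import numpy as np

def autocorrelation(a, tmax):
    """Estimate the normalized time autocorrelation C(t), t = 0..tmax-1,
    from a series of measurements a[0], ..., a[n-1] of one observable."""
    a = np.asarray(a, dtype=float)
    mean, var = a.mean(), a.var()
    n = len(a)
    c = np.empty(tmax)
    for t in range(tmax):
        # <A(s) A(s+t)> - <A>^2, normalized so that C(0) = 1
        c[t] = np.mean((a[:n - t] - mean) * (a[t:] - mean)) / var
    return c

# Uncorrelated data: C(0) = 1, and C(t > 0) vanishes within ~1/sqrt(n)
rng = np.random.default_rng(1)
c = autocorrelation(rng.normal(size=100_000), 10)
```

For Metropolis data near $T_c$ the decay of `c` is slow, and the number of Monte Carlo steps to discard can be read off from where it drops toward zero.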
Swendsen-Wang cluster update
[Swendsen and Wang, PRL 58 (1987) 86]
Beginning with an arbitrary spin configuration $\{s_i\}$, one SW cluster update cycle consists of:
Inspect all nearest-neighbour pairs $\langle i,j\rangle$. If $s_i = s_j$, create a bond between the sites with probability $p = 1 - e^{-2\beta J}$ (otherwise, no bond).
Construct clusters = sets of points connected by bonds.
Set each cluster to a random value $\pm 1$.

Figure illustrating how clusters are built for a global update.
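The cycle above can be sketched as follows for the 2D Ising model with $J = 1$ and periodic boundaries (a sketch with my own helper names; union-find is one of several ways to build the clusters):

```python
import numpy as np

def sw_update(spins, beta, rng):
    """One Swendsen-Wang cluster update on an L x L Ising lattice
    (J = 1, periodic boundary conditions). Modifies spins in place."""
    L = spins.shape[0]
    p = 1.0 - np.exp(-2.0 * beta)  # bond probability for equal neighbours

    # Union-find over the L*L sites to track connected clusters
    parent = np.arange(L * L)
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    # Step 1: lay bonds between equal nearest neighbours with probability p
    for x in range(L):
        for y in range(L):
            for nx, ny in ((x, (y + 1) % L), ((x + 1) % L, y)):
                if spins[x, y] == spins[nx, ny] and rng.random() < p:
                    parent[find(x * L + y)] = find(nx * L + ny)

    # Steps 2-3: each cluster (union-find root) gets a random value +-1
    new_value = {}
    for x in range(L):
        for y in range(L):
            root = find(x * L + y)
            if root not in new_value:
                new_value[root] = rng.choice((-1, 1))
            spins[x, y] = new_value[root]
```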
Wolff single cluster update
[U. Wolff, PRL 62 (1989) 361]
Principle: do the cluster decomposition as in SW, but invert (‘flip’) only one randomly chosen cluster! In practice:
Choose a random site $i$.
Study the neighbouring sites $j$. If $s_j = s_i$, join site $j$ to the cluster with probability $p = 1 - e^{-2\beta J}$.
Repeat step 2 for each site $j$ that was joined to the cluster. Keep doing this as long as the cluster grows.
When the cluster is finished, invert the spins which belong to it.
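The steps above can be sketched with an explicit stack in place of recursion (a sketch assuming $J = 1$ and periodic boundaries; the names are my own):

```python
import numpy as np

def wolff_update(spins, beta, rng):
    """One Wolff single-cluster update for the 2D Ising model
    (J = 1, periodic boundaries). Returns the flipped cluster's size."""
    L = spins.shape[0]
    p = 1.0 - np.exp(-2.0 * beta)              # joining probability
    x0, y0 = rng.integers(L), rng.integers(L)  # step 1: random seed site
    s0 = spins[x0, y0]
    cluster = {(x0, y0)}
    stack = [(x0, y0)]
    while stack:                               # steps 2-3: grow the cluster
        x, y = stack.pop()
        for nx, ny in (((x + 1) % L, y), ((x - 1) % L, y),
                       (x, (y + 1) % L), (x, (y - 1) % L)):
            if (nx, ny) not in cluster and spins[nx, ny] == s0 \
                    and rng.random() < p:
                cluster.add((nx, ny))
                stack.append((nx, ny))
    for x, y in cluster:                       # step 4: invert the cluster
        spins[x, y] = -s0
    return len(cluster)
```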
Properties
Usually slightly more effective than SW.
The minimum cluster size = 1, maximum = volume.
Nicely recursive.
Satisfies ergodicity and detailed balance.
Performance is worse than local single-spin updates away from the transition.
In which models do clusters work, and in which do they fail? Clusters (usually) fail if
there are frustrated couplings (spin glasses, gauge theories . . . )
one cannot construct a symmetric reflection operation
spins are ‘frozen’ in place by external fields etc.
Cluster updates are (normally) usable only in the proximity of a 2nd order phase transition: large correlation lengths mean large clusters. Nevertheless, they are sometimes useful when correlation lengths are finite but still large.
In such cases the system is "stuck" and does not relax.
Cluster updates cover much more phase space than local updates close to a phase transition!
Challenge 11.3:
Measure the autocorrelation function and compare local updates with cluster updates.