A Revolution: Belief Propagation in Graphs with Cycles

Part of Advances in Neural Information Processing Systems 10 (NIPS 1997)

Authors

Brendan J. Frey*, David MacKay

Abstract

Until recently, artificial intelligence researchers have frowned upon the application of probability propagation in Bayesian belief networks that have cycles. The probability propagation algorithm is only exact in networks that are cycle-free. However, it has recently been discovered that the two best error-correcting decoding algorithms are actually performing probability propagation in belief networks with cycles.

1 Communicating over a noisy channel

Our increasingly wired world demands efficient methods for communicating bits of information over physical channels that introduce errors. Examples of real-world channels include twisted-pair telephone wires, shielded cable-TV wire, fiber-optic cable, deep-space radio, terrestrial radio, and indoor radio. Engineers attempt to correct the errors introduced by the noise in these channels through the use of channel coding, which adds protection to the information source so that some channel errors can be corrected. A popular model of a physical channel is shown in Fig. 1. A vector of $K$ information bits $u = (u_1, \ldots, u_K)$, $u_k \in \{0, 1\}$, is encoded, and a vector of $N$ codeword bits $x = (x_1, \ldots, x_N)$ is transmitted into the channel. Independent Gaussian noise with variance $\sigma^2$ is then added to each codeword bit, producing the real-valued channel output vector $y = (y_1, \ldots, y_N)$.

*Brendan Frey is currently a Beckman Fellow at the Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign.


[Figure 1 block diagram: $u \rightarrow$ Encoder $\rightarrow x \rightarrow$ (+ Gaussian noise with variance $\sigma^2$) $\rightarrow y \rightarrow$ Decoder $\rightarrow \hat{u}$]

Figure 1: A communication system with a channel that adds Gaussian noise to the transmitted discrete-time sequence.

The decoder must then use this received vector to make a guess $\hat{u}$ at the original information vector. The probability $P_b(e)$ of bit error is minimized by choosing the $\hat{u}_k$ that maximizes $P(u_k \mid y)$ for $k = 1, \ldots, K$. The rate $K/N$ of a code is the number of information bits communicated per codeword bit. We will consider rate 1/2 systems in this paper, where $N = 2K$. The simplest rate 1/2 encoder duplicates each information bit: $x_{2k-1} = x_{2k} = u_k$, $k = 1, \ldots, K$. The optimal decoder for this repetition code simply averages together pairs of noisy channel outputs and then applies a threshold.
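With codeword bits taking values in $\{0, 1\}$, the natural threshold is the midpoint $1/2$, so a plausible form of this decoder (the displayed equation is an assumption here, not quoted from the excerpt) is

$$\hat{u}_k = \begin{cases} 1, & \tfrac{1}{2}\,(y_{2k-1} + y_{2k}) > \tfrac{1}{2}, \\ 0, & \text{otherwise}, \end{cases} \qquad k = 1, \ldots, K.$$

The short NumPy sketch below ties the pieces together: a rate-1/2 repetition encoder, the Gaussian channel of Fig. 1, and the averaging-and-thresholding decoder. The block length $K$, noise level $\sigma$, and random seed are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 10_000      # number of information bits (illustrative choice)
sigma = 0.5     # channel noise standard deviation, so sigma^2 = 0.25

# Rate-1/2 repetition encoder: x_{2k-1} = x_{2k} = u_k.
u = rng.integers(0, 2, size=K)
x = np.repeat(u, 2)                      # codeword of length N = 2K

# Channel of Fig. 1: add independent Gaussian noise to each codeword bit.
y = x + rng.normal(0.0, sigma, size=2 * K)

# Decoder: average each pair of channel outputs, then threshold at the
# midpoint 1/2 between the two signal levels 0 and 1 (assumed threshold).
u_hat = (y.reshape(K, 2).mean(axis=1) > 0.5).astype(int)

print("empirical bit-error rate:", np.mean(u_hat != u))
```

Averaging each pair reduces the effective noise variance from $\sigma^2$ to $\sigma^2/2$ before the threshold is applied, which is exactly the protection the repetition code buys at the cost of halving the rate.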