The last two symbols in the ordered list are assigned codewords that have the same length and are alike except for their final bit. Given a source with alphabet $S$ consisting of $N$ symbols $s_k$ with probabilities $p_k = P(s_k)$, $0 \le k \le N-1$, a Huffman code corresponding to source $S$ can be constructed by iteratively building a binary tree as follows:

1. Arrange the symbols of $S$ such that the probabilities $p_k$ are in decreasing order, $p_0 \ge p_1 \ge \cdots \ge p_{N-1}$, and consider the ordered symbols $s_k$, $0 \le k \le N-1$, as the leaf nodes of a tree. Let $T$ be the set of leaf nodes corresponding to the ordered symbols of $S$.

2. Take the two nodes in $T$ with the smallest probabilities and merge them into a new node whose probability is the sum of their probabilities. Make this new node the parent of the two least probable nodes of $T$ by connecting it to each of them. Each connection between two nodes forms a branch of the tree, so two new branches are generated; assign the value 1 to one branch and 0 to the other.

3. Update $T$ by replacing the two least probable nodes with their parent node, and reorder the nodes (together with their subtrees) if needed. If $T$ contains more than one node, repeat from Step 2; otherwise, the remaining node in $T$ is the root node of the tree.

4. The codeword of a symbol $s_k \in S$, $0 \le k \le N-1$, is obtained by traversing the path of the tree from the root node to the leaf node corresponding to $s_k$, while reading sequentially the bit values assigned to the branches along the traversed path.

The Huffman code construction procedure is illustrated by the example shown in Fig. for the source alphabet $S = \{s_0, s_1, s_2, s_3\}$ with symbol probabilities as given in Table . The resulting symbol codewords are listed in the third column of Table . For this example, the source entropy is $H(S)$ and the resulting average bit rate is $B_H = \sum_{k=0}^{N-1} p_k l_k$ bits per symbol, where $l_k$ is the length of the codeword assigned to $s_k$.
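To make the four construction steps concrete, the following minimal Python sketch carries out the same bottom-up procedure: it repeatedly merges the two least probable nodes using a priority queue and then reads each codeword off the root-to-leaf path. The function name `huffman_code` and the probabilities in the usage snippet are illustrative choices, not values taken from the chapter's figure or table.

```python
import heapq
import itertools

def huffman_code(probabilities):
    """Build a Huffman code for symbols s_0 ... s_{N-1} with the given probabilities.

    Returns a dict mapping each symbol index k to its binary codeword string.
    """
    # Step 1: start with one leaf node per symbol. Each heap entry is
    # (probability, tie-breaker, node); a node is either a symbol index (leaf)
    # or a pair of child nodes (internal node).
    counter = itertools.count()
    heap = [(p, next(counter), k) for k, p in enumerate(probabilities)]
    heapq.heapify(heap)

    # Steps 2-3: repeatedly merge the two least probable nodes into a parent
    # whose probability is the sum of its children's probabilities, until a
    # single node (the root) remains.
    while len(heap) > 1:
        p_a, _, a = heapq.heappop(heap)
        p_b, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (p_a + p_b, next(counter), (a, b)))

    # Step 4: traverse from the root to every leaf, reading off the bit
    # assigned to each branch (0 for one child, 1 for the other).
    codewords = {}

    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: descend both branches
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf node: record the codeword
            codewords[node] = prefix or "0"  # degenerate single-symbol source

    _, _, root = heap[0]
    walk(root, "")
    return codewords

# Illustrative example (probabilities chosen here, not from the chapter's table):
probs = [0.5, 0.25, 0.125, 0.125]
codes = huffman_code(probs)
print(codes)  # {0: '0', 1: '10', 2: '110', 3: '111'} with this tie-breaking
print(sum(p * len(codes[k]) for k, p in enumerate(probs)))  # average rate: 1.75 bits/symbol
```

For this dyadic distribution the average bit rate equals the entropy; in general, the Huffman rate satisfies $H(S) \le B_H < H(S) + 1$.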