# Independent Component Analysis, Chapter 5: Information Theory

*From: Independent Component Analysis, by Aapo Hyvärinen, Juha Karhunen, and Erkki Oja. Copyright 2001, John Wiley & Sons, Inc. ISBNs: 0-471-40540-X (hardback), 0-471-22131-7 (electronic).*

## Information Theory

Estimation theory gives one approach to characterizing random variables: it is based on building parametric models and describing the data by the parameters. An alternative approach is given by information theory. Here the emphasis is on coding. We want to code the observations, so that they can be stored in the memory of a computer or transmitted over a communications channel, for example. Finding a suitable code depends on the statistical properties of the data. In independent component analysis (ICA), estimation theory and information theory offer the two principal theoretical approaches. In this chapter, the basic concepts of information theory are introduced. The latter half of the chapter deals with a more specialized topic: approximation of entropy. These concepts are needed in the ICA methods of Part II.

### Entropy

#### Definition of entropy

Entropy is the basic concept of information theory. The entropy $H$ of a discrete-valued random variable $X$ is defined as

$$H(X) = -\sum_i P(X = a_i) \log P(X = a_i)$$

where the $a_i$ are the possible values of $X$. Depending on the base of the logarithm, different units of entropy are obtained. Usually the logarithm with base 2 is used, in which case the unit is called a bit. In what follows, the base is not important, since it only changes the measurement scale, so it is not explicitly mentioned. A small numeric sketch of this definition is given below.

Let us define the function $f$ as

$$f(p) = -p \log p, \quad 0 \le p \le 1$$

This is a nonnegative function that is zero for $p = 0$ and for $p = 1$, and positive for values in between.

*(Figure: the function $f$ plotted on the interval $[0, 1]$.)*

Using this function, entropy can be written as

$$H(X) = \sum_i f(P(X = a_i))$$

Considering the shape of $f$, we see that the entropy is small if the probabilities $P(X = a_i)$ are close to 0 or 1, and large if the probabilities lie in between those extremes; the second sketch below illustrates this.
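As a minimal sketch of the definition above, the following Python snippet computes the entropy of a discrete random variable directly from its probability mass function. The helper name `entropy`, its `base` parameter, and the sample distributions are illustrative choices made here, not taken from the book; the base-2 logarithm gives the result in bits, and switching to the natural logarithm merely rescales the value, as the text notes.

```python
import math

def entropy(pmf, base=2.0):
    """H(X) = -sum_i P(X = a_i) * log P(X = a_i).

    Zero-probability terms are skipped, since -p log p -> 0 as p -> 0.
    """
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

# A fair coin: the most unpredictable two-valued variable, H = 1 bit.
print(entropy([0.5, 0.5]))               # 1.0

# Changing the base of the logarithm only rescales the measurement:
# the same distribution expressed in nats instead of bits.
print(entropy([0.5, 0.5], base=math.e))  # ~0.6931 (= log 2)
```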

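To illustrate the rewriting of entropy via $f(p) = -p \log p$, here is a second small sketch under the same caveats (the function name `f` and the example distributions are hypothetical, chosen only to demonstrate the point). It checks that $f$ vanishes at the endpoints and peaks at $p = 1/e$, and that probabilities close to 0 or 1 yield small entropy while spread-out probabilities yield large entropy.

```python
import math

def f(p):
    """f(p) = -p * log(p) on [0, 1], with f(0) = f(1) = 0."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log(p)

# f is zero at the endpoints and peaks at p = 1/e, matching the plotted shape.
print(max(f(i / 1000) for i in range(1001)))   # ~0.3679 (= 1/e)

# Entropy as H(X) = sum_i f(P(X = a_i)): probabilities near 0 or 1
# give small entropy; spread-out probabilities give large entropy.
print(sum(f(p) for p in [0.98, 0.01, 0.01]))   # ~0.112 nats (nearly deterministic)
print(sum(f(p) for p in [1/3, 1/3, 1/3]))      # ~1.0986 nats (uniform, = log 3)
```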