Independent Component Analysis, Chapter 5

Independent Component Analysis. Aapo Hyvärinen, Juha Karhunen, Erkki Oja. Copyright 2001 John Wiley & Sons, Inc. ISBNs: 0-471-40540-X (Hardback); 0-471-22131-7 (Electronic)

5 Information Theory

Estimation theory gives one approach to characterizing random variables. It is based on building parametric models and describing the data by the parameters. An alternative approach is given by information theory. Here the emphasis is on coding: we want to code the observations. The observations can then be stored in the memory of a computer, or transmitted by a communications channel, for example. Finding a suitable code depends on the statistical properties of the data. In independent component analysis (ICA), estimation theory and information theory offer the two principal theoretical approaches. In this chapter, the basic concepts of information theory are introduced. The latter half of the chapter deals with a more specialized topic: approximation of entropy. These concepts are needed in the ICA methods of Part II.

ENTROPY

Definition of entropy

Entropy is the basic concept of information theory. The entropy H of a discrete-valued random variable X is defined as

$$H(X) = -\sum_i P(X = a_i) \log P(X = a_i)$$

where the a_i are the possible values of X. Depending on what the base of the logarithm is, different units of entropy are obtained.
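As an illustrative aside (not part of the original text), the definition of entropy can be computed directly from a list of probabilities; the function name and the example distributions below are chosen here only for illustration:

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy H(X) = -sum_i P(X = a_i) * log P(X = a_i).

    Zero-probability outcomes are skipped, following the usual
    convention 0 * log 0 = 0. With base=2 the result is in bits.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0.0)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))   # prints 1.0
# A fair eight-sided die: log2(8) = 3 bits.
print(entropy([1/8] * 8))
```

Changing the `base` argument only rescales the result, in line with the remark above that the base merely changes the measurement scale.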
Usually the logarithm with base 2 is used, in which case the unit is called a bit. In the following, the base is not important, since it only changes the measurement scale, so it is not explicitly mentioned.

Let us define the function f as

$$f(p) = -p \log p, \quad 0 \le p \le 1$$

This is a nonnegative function that is zero for p = 0 and for p = 1, and positive for values in between; it is plotted in the figure.

[Figure: the function f plotted on the interval [0, 1].]

Using this function, entropy can be written as

$$H(X) = \sum_i f(P(X = a_i))$$

Considering the shape of f, we see that the entropy is small if the probabilities P(X = a_i) are close to 0 or 1, and large if the probabilities take intermediate values.
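This alternative formulation, and the claim about the shape of f, can be checked numerically. The sketch below (an illustrative aside; the function names and test distributions are not from the book) sums f over the probabilities of two distributions, one concentrated near 0 and 1 and one with intermediate values:

```python
import math

def f(p, base=2.0):
    """f(p) = -p * log p on [0, 1]; zero at both endpoints."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log(p, base)

def entropy_via_f(probs):
    """H(X) = sum_i f(P(X = a_i))."""
    return sum(f(p) for p in probs)

# Probabilities close to 0 or 1 give small entropy...
print(entropy_via_f([0.99, 0.01]))   # roughly 0.08 bits
# ...while intermediate (uniform) probabilities give large entropy.
print(entropy_via_f([0.25] * 4))     # prints 2.0
```

The near-deterministic distribution yields an entropy close to zero, while the uniform distribution over four values attains the maximum of log2(4) = 2 bits, matching the discussion of the shape of f above.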
