Modeling Of Data part 6

Nonlinear Models

We now consider fitting when the model depends nonlinearly on the set of M unknown parameters a_k, k = 1, 2, ..., M. We use the same approach as in previous sections, namely to define a χ² merit function and determine best-fit parameters by its minimization. With nonlinear dependences, however, the minimization must proceed iteratively. Given trial values for the parameters, we develop a procedure that improves the trial solution. The procedure is then repeated until χ² stops (or effectively stops) decreasing.

How is this problem different from the general nonlinear function minimization problem already dealt with in Chapter 10? Superficially, not at all: sufficiently close to the minimum, we expect the χ² function to be well approximated by a quadratic form, which we can write as

    \chi^2(\mathbf{a}) \approx \gamma - \mathbf{d} \cdot \mathbf{a} + \tfrac{1}{2}\, \mathbf{a} \cdot \mathbf{D} \cdot \mathbf{a}    (15.5.1)

where d is an M-vector and D is an M × M matrix. (Compare equation 10.6.1.) If the approximation is a good one, we know how to jump from the current trial parameters a_cur to the minimizing ones a_min in a single leap, namely

    \mathbf{a}_{\min} = \mathbf{a}_{\mathrm{cur}} + \mathbf{D}^{-1} \cdot \bigl[ -\nabla \chi^2(\mathbf{a}_{\mathrm{cur}}) \bigr]    (15.5.2)

On the other hand, (15.5.1) might be a poor local approximation to the shape of the function that we are trying to minimize at a_cur. In that case, about all we can do is take a step down the gradient, as in the steepest descent method (§10.6). In other words,

    \mathbf{a}_{\mathrm{next}} = \mathbf{a}_{\mathrm{cur}} - \mathrm{constant} \times \nabla \chi^2(\mathbf{a}_{\mathrm{cur}})    (15.5.3)

where the constant is small enough not to exhaust the downhill direction.

To use (15.5.2) or (15.5.3), we must be able to compute the gradient of the χ² function at any set of parameters a. To use (15.5.2) we also need the matrix D, the second-derivative (Hessian) matrix of the χ² merit function, at any a.
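To make the two update rules concrete, here is a minimal sketch in C, built on assumptions that are not taken from the text: a hypothetical two-parameter model y(x; a) = a1 exp(-a2 x), made-up data points and errors, and the common Gauss-Newton approximation to D that keeps only products of first derivatives of the model. It evaluates χ², its gradient, and the approximate D at a trial point, then prints the parameters produced by one full quadratic step, as in (15.5.2), and by one small steepest-descent step, as in (15.5.3).

#include <stdio.h>
#include <math.h>

#define NDATA 5
#define MA 2

static const double xdat[NDATA] = {0.0, 1.0, 2.0, 3.0, 4.0};   /* made-up data */
static const double ydat[NDATA] = {2.00, 1.21, 0.74, 0.45, 0.27};
static const double sig[NDATA]  = {0.05, 0.05, 0.05, 0.05, 0.05};

/* Hypothetical model y(x;a) = a[0]*exp(-a[1]*x) and its parameter derivatives. */
static double model(double x, const double a[MA], double dyda[MA])
{
    double e = exp(-a[1] * x);
    dyda[0] = e;
    dyda[1] = -a[0] * x * e;
    return a[0] * e;
}

/* chi-square, its gradient, and a Gauss-Newton approximation to the
   second-derivative matrix D (second derivatives of y are dropped). */
static double chi2_grad_D(const double a[MA], double grad[MA], double D[MA][MA])
{
    double chi2 = 0.0, dyda[MA];
    for (int k = 0; k < MA; k++) {
        grad[k] = 0.0;
        for (int l = 0; l < MA; l++) D[k][l] = 0.0;
    }
    for (int i = 0; i < NDATA; i++) {
        double w = 1.0 / (sig[i] * sig[i]);
        double r = ydat[i] - model(xdat[i], a, dyda);
        chi2 += w * r * r;
        for (int k = 0; k < MA; k++) {
            grad[k] -= 2.0 * w * r * dyda[k];           /* d(chi2)/d(a_k) */
            for (int l = 0; l < MA; l++)
                D[k][l] += 2.0 * w * dyda[k] * dyda[l]; /* approximate Hessian */
        }
    }
    return chi2;
}

int main(void)
{
    double a[MA] = {1.5, 0.3};                  /* trial parameters a_cur */
    double grad[MA], D[MA][MA];

    printf("chi2(a_cur) = %g\n", chi2_grad_D(a, grad, D));

    /* Quadratic step, eq. (15.5.2): solve D * step = -grad (2x2 Cramer's rule). */
    double det = D[0][0] * D[1][1] - D[0][1] * D[1][0];
    double s0 = (-grad[0] * D[1][1] + grad[1] * D[0][1]) / det;
    double s1 = ( grad[0] * D[1][0] - grad[1] * D[0][0]) / det;
    printf("quadratic step:        a = (%g, %g)\n", a[0] + s0, a[1] + s1);

    /* Steepest-descent step, eq. (15.5.3), with a small fixed constant. */
    double c = 1.0e-4;
    printf("steepest-descent step: a = (%g, %g)\n", a[0] - c * grad[0], a[1] - c * grad[1]);
    return 0;
}

The fixed constant in the gradient step is arbitrary in this sketch; how to choose it, and when the full quadratic step can be trusted, is exactly the question the iterative procedure developed in this section has to settle.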
