Editor's Recommendation
Intended readers: senior undergraduates in mathematics, and graduate students in operations research, applied mathematics, and related fields. This book is essential reading for senior undergraduates and graduate students in operations research and computational mathematics, and is a classic of numerical optimization.
About the Book
The author is a professor at Northwestern University and serves as editor-in-chief or associate editor of several international journals. Drawing on his experience in teaching, research, and consulting, he has written a book well suited to both students and practitioners. It offers a comprehensive, up-to-date treatment of most of the effective methods in continuous optimization. Each chapter begins with basic concepts and builds up to the techniques in current use.
The book emphasizes practical methods and contains numerous figures and exercises. It is suitable for a wide readership: as a graduate textbook in engineering, operations research, mathematics, computer science, and business, and as a handbook for researchers and practitioners in the field.
Throughout, the author strives for an exposition that is readable, rich in content, and rigorous, and that brings out the practical value of numerical methods.
About the Author
The author is a professor at Northwestern University and serves as editor-in-chief or associate editor of several international journals; the book draws on his experience in teaching, research, and consulting.
Table of Contents
Preface
1 Introduction
Mathematical Formulation
Example: A Transportation Problem
Continuous versus Discrete Optimization
Constrained and Unconstrained Optimization
Global and Local Optimization
Stochastic and Deterministic Optimization
Optimization Algorithms
Convexity
Notes and References
2 Fundamentals of Unconstrained Optimization
2.1 What Is a Solution?
Recognizing a Local Minimum
Nonsmooth Problems
2.2 Overview of Algorithms
Two Strategies: Line Search and Trust Region
Search Directions for Line Search Methods
Models for Trust-Region Methods
Scaling
Rates of Convergence
R-Rates of Convergence
Notes and References
Exercises
3 Line Search Methods
3.1 Step Length
The Wolfe Conditions
The Goldstein Conditions
Sufficient Decrease and Backtracking
3.2 Convergence of Line Search Methods
3.3 Rate of Convergence
Convergence Rate of Steepest Descent
Quasi-Newton Methods
Newton's Method
Coordinate Descent Methods
3.4 Step-Length Selection Algorithms
Interpolation
The Initial Step Length
A Line Search Algorithm for the Wolfe Conditions
Notes and References
Exercises
4 Trust-Region Methods
Outline of the Algorithm
4.1 The Cauchy Point and Related Algorithms
The Cauchy Point
Improving on the Cauchy Point
The Dogleg Method
Two-Dimensional Subspace Minimization
Steihaug's Approach
4.2 Using Nearly Exact Solutions to the Subproblem
Characterizing Exact Solutions
Calculating Nearly Exact Solutions
The Hard Case
Proof of Theorem 4.3
4.3 Global Convergence
Reduction Obtained by the Cauchy Point
Convergence to Stationary Points
Convergence of Algorithms Based on Nearly Exact Solutions
4.4 Other Enhancements
Scaling
Non-Euclidean Trust Regions
Notes and References
Exercises
5 Conjugate Gradient Methods
5.1 The Linear Conjugate Gradient Method
Conjugate Direction Methods
Basic Properties of the Conjugate Gradient Method
A Practical Form of the Conjugate Gradient Method
Rate of Convergence
Preconditioning
Practical Preconditioners
5.2 Nonlinear Conjugate Gradient Methods
The Fletcher-Reeves Method
The Polak-Ribière Method
Quadratic Termination and Restarts
Numerical Performance
Behavior of the Fletcher-Reeves Method
Global Convergence
Notes and References
Exercises
6 Practical Newton Methods
6.1 Inexact Newton Steps
6.2 Line Search Newton Methods
Line Search Newton-CG Method
Modified Newton's Method
6.3 Hessian Modifications
Eigenvalue Modification
Adding a Multiple of the Identity
Modified Cholesky Factorization
Gershgorin Modification
Modified Symmetric Indefinite Factorization
6.4 Trust-Region Newton Methods
Newton-Dogleg and Subspace-Minimization Methods
Accurate Solution of the Trust-Region Problem
Trust-Region Newton-CG Method
Preconditioning the Newton-CG Method
Local Convergence of Trust-Region Newton Methods
Notes and References
Exercises
7 Calculating Derivatives
7.1 Finite-Difference Derivative Approximations
Approximating the Gradient
Approximating a Sparse Jacobian
Approximating the Hessian
Approximating a Sparse Hessian
7.2 Automatic Differentiation
An Example
The Forward Mode
The Reverse Mode
Vector Functions and Partial Separability
Calculating Jacobians of Vector Functions
Calculating Hessians: Forward Mode
Calculating Hessians: Reverse Mode
Current Limitations
Notes and References
Exercises
8 Quasi-Newton Methods
8.1 The BFGS Method
Properties of the BFGS Method
Implementation
8.2 The SR1 Method
Properties of SR1 Updating
8.3 The Broyden Class
Properties of the Broyden Class
8.4 Convergence Analysis
Global Convergence of the BFGS Method
Superlinear Convergence of BFGS
Convergence Analysis of the SR1 Method
Notes and References
Exercises
9 Large-Scale Quasi-Newton and Partially Separable Optimization
9.1 Limited-Memory BFGS
Relationship with Conjugate Gradient Methods
9.2 General Limited-Memory Updating
Compact Representation of BFGS Updating
SR1 Matrices
Unrolling the Update
9.3 Sparse Quasi—Newton Updates
9.4 Partially Separable Functions
A Simple Example
Internal Variables
9.5 Invariant Subspaces and Partial Separability
Sparsity vs. Partial Separability
Group Partial Separability
9.6 Algorithms for Partially Separable Functions
Exploiting Partial Separability in Newton's Method
Quasi-Newton Methods for Partially Separable Functions
Notes and References
Exercises
……
10 Nonlinear Least-Squares Problems
11 Nonlinear Equations
12 Theory of Constrained Optimization
13 Linear Programming: The Simplex Method
14 Linear Programming: Interior-Point Methods
15 Fundamentals of Algorithms for Nonlinear Constrained Optimization
16 Quadratic Programming
17 Penalty, Barrier, and Augmented Lagrangian Methods
18 Sequential Quadratic Programming
A Background Material
References
Index
Preface / Foreword
For the mathematical enterprise of our country to flourish, mathematicians must set aside fame and fortune and work still harder. At the same time, we must also create more favorable external conditions for them to advance mathematics, chiefly by strengthening support for and investment in the field so that mathematicians enjoy better working and living conditions; this includes improving and strengthening mathematical publishing.
By photo-reprinting a selection of good new books, Science Press enables mathematicians across the country to purchase them at a lower price, and in particular makes them widely available to mathematicians working in remote regions. This is undoubtedly of great benefit to mathematical research and teaching in our country.
On this occasion Science Press has purchased the rights and reprinted, in one batch, 23 mathematics books published by Springer, which is a fine undertaking and one worth continuing. Roughly classified, the 23 books comprise 5 in pure mathematics, 6 in applied mathematics, and 12 in computational mathematics, some of them interdisciplinary in nature. All are quite recent: the great majority, 16 in all, were published after 2000, and the rest after 1990. They allow readers to quickly grasp the frontier of some area of mathematics. For example, the three pure-mathematics volumes on number theory, algebra, and topology are all installments of an encyclopedia of mathematics compiled by leading mathematicians in those fields, and are very helpful to researchers who wish to survey the frontier and overall landscape of those areas. In keeping with the character of each discipline, the pure-mathematics selections emphasize the "classic", while the applied and computational selections emphasize the "frontier". Most of the authors are internationally renowned mathematicians; for example, the author of the topology volume, Novikov, is a member of the Russian Academy of Sciences and a winner of the Fields Medal and the Wolf Prize in Mathematics. The works of such eminent mathematicians will undoubtedly provide excellent guidance to researchers in our country.
Title: Foreign Mathematical Classics Series 6 (Photo-Reprint Edition): Numerical Optimization