Product Details
- ISBN: 9787030166753
- Binding: offset paper
- Volumes: N/A
- Weight: N/A
- Format: 16开
- Pages: 636
- Publication date: 2016-07-09
- Barcode: 9787030166753; 978-7-03-016675-3
Highlights
Intended readers: senior undergraduates in mathematics and graduate students in operations research, applied mathematics, and related fields. This book is essential reading for senior undergraduates and graduate students in operations research and computational mathematics, and it is a classic of numerical optimization.
Description
The author is a professor at Northwestern University in the United States and serves as editor-in-chief or associate editor of several international journals. Drawing on his experience in teaching, research, and consulting, he has written this book for both students and practitioners. It provides a comprehensive, up-to-date treatment of most of the effective methods in continuous optimization. Each chapter begins with basic concepts and builds up to the techniques in current use. The book emphasizes practical methods and contains many figures and exercises, making it accessible to a wide readership: it can serve as a graduate textbook in engineering, operations research, mathematics, computer science, and business, and as a reference for researchers and practitioners in the field. Throughout, the author aims for an exposition that is readable, rich in content, and rigorous, and that brings out the practical value of numerical methods.
Table of Contents
Preface
1 Introduction
Mathematical Formulation
Example: A Transportation Problem
Continuous versus Discrete Optimization
Constrained and Unconstrained Optimization
Global and Local Optimization
Stochastic and Deterministic Optimization
Optimization Algorithms
Convexity
Notes and References
2 Fundamentals of Unconstrained Optimization
2.1 What Is a Solution?
Recognizing a Local Minimum
Nonsmooth Problems
2.2 Overview of Algorithms
Two Strategies: Line Search and Trust Region
Search Directions for Line Search Methods
Models for Trust-Region Methods
Scaling
Rates of Convergence
R-Rates of Convergence
Notes and References
Exercises
3 Line Search Methods
3.1 Step Length
The Wolfe Conditions
The Goldstein Conditions
Sufficient Decrease and Backtracking
3.2 Convergence of Line Search Methods
3.3 Rate of Convergence
Convergence Rate of Steepest Descent
Quasi-Newton Methods
Newton's Method
Coordinate Descent Methods
3.4 Step-Length Selection Algorithms
Interpolation
The Initial Step Length
A Line Search Algorithm for the Wolfe Conditions
Notes and References
Exercises
4 Trust-Region Methods
Outline of the Algorithm
4.1 The Cauchy Point and Related Algorithms
The Cauchy Point
Improving on the Cauchy Point
The Dogleg Method
Two-Dimensional Subspace Minimization
Steihaug's Approach
4.2 Using Nearly Exact Solutions to the Subproblem
Characterizing Exact Solutions
Calculating Nearly Exact Solutions
The Hard Case
Proof of Theorem 4.3
4.3 Global Convergence
Reduction Obtained by the Cauchy Point
Convergence to Stationary Points
Convergence of Algorithms Based on Nearly Exact Solutions
4.4 Other Enhancements
Scaling
Non-Euclidean Trust Regions
Notes and References
Exercises
5 Conjugate Gradient Methods
5.1 The Linear Conjugate Gradient Method
Conjugate Direction Methods
Basic Properties of the Conjugate Gradient Method
A Practical Form of the Conjugate Gradient Method
Rate of Convergence
Preconditioning
Practical Preconditioners
5.2 Nonlinear Conjugate Gradient Methods
The Fletcher-Reeves Method
The Polak-Ribière Method
Quadratic Termination and Restarts
Numerical Performance
Behavior of the Fletcher-Reeves Method
Global Convergence
Notes and References
Exercises
6 Practical Newton Methods
6.1 Inexact Newton Steps
6.2 Line Search Newton Methods
Line Search Newton-CG Method
Modified Newton's Method
6.3 Hessian Modifications
Eigenvalue Modification
Adding a Multiple of the Identity
Modified Cholesky Factorization
Gershgorin Modification
Modified Symmetric Indefinite Factorization
6.4 Trust-Region Newton Methods
Newton-Dogleg and Subspace-Minimization Methods
Accurate Solution of the Trust-Region Problem
Trust-Region Newton-CG Method
Preconditioning the Newton-CG Method
Local Convergence of Trust-Region Newton Methods
Notes and References
Exercises
7 Calculating Derivatives
7.1 Finite-Difference Derivative Approximations
Approximating the Gradient
Approximating a Sparse Jacobian
Approximating the Hessian
Approximating a Sparse Hessian
7.2 Automatic Differentiation
An Example
The Forward Mode
The Reverse Mode
Vector Functions and Partial Separability
Calculating Jacobians of Vector Functions
Calculating Hessians: Forward Mode
Calculating Hessians: Reverse Mode
Current Limitations
Notes and References
Exercises
8 Quasi-Newton Methods
8.1 The BFGS Method
Properties of the BFGS Method
Implementation
8.2 The SR1 Method
Properties of SR1 Updating
8.3 The Broyden Class
Properties of the Broyden Class
8.4 Convergence Analysis
Global Convergence of the BFGS Method
Superlinear Convergence of BFGS
Convergence Analysis of the SR1 Method
Notes and References
Exercises
9 Large-Scale Quasi-Newton and Partially Separable Optimization
9.1 Limited-Memory BFGS
Relationship with Conjugate Gradient Methods
9.2 General Limited-Memory Updating
Compact Representation of BFGS Updating
SR1 Matrices
Unrolling the Update
9.3 Sparse Quasi-Newton Updates
9.4 Partially Separable Functions
A Simple Example
Internal Variables
9.5 Invariant Subspaces and Partial Separability
Sparsity vs. Partial Separability
Group Partial Separability
9.6 Algorithms for Partially Separable Functions
Exploiting Partial Separability in Newton's Method
Quasi-Newton Methods for Partially Separable Functions
Notes and References
Exercises
……
10 Nonlinear Least-Squares Problems
11 Nonlinear Equations
12 Theory of Constrained Optimization
13 Linear Programming: The Simplex Method
14 Linear Programming: Interior-Point Methods
15 Fundamentals of Algorithms for Nonlinear Constrained Optimization
16 Quadratic Programming
17 Penalty, Barrier, and Augmented Lagrangian Methods
18 Sequential Quadratic Programming
A Background Material
References
Index
About the Author
The author is a professor at Northwestern University in the United States and serves as editor-in-chief or associate editor of several international journals. Drawing on his experience in teaching, research, and consulting, he has written this book for both students and practitioners.