An Introduction to Optimization
A modern, up-to-date introduction to optimization theory and methods
This authoritative book serves as an introductory text to optimization at the senior undergraduate and beginning graduate levels. With consistently accessible and elementary treatment of all topics, An Introduction to Optimization, Second Edition helps students build a solid working knowledge of the field, including unconstrained optimization, linear programming, and constrained optimization.
Supplemented with more than one hundred tables and illustrations, an extensive bibliography, and numerous worked examples to illustrate both theory and algorithms, this book also provides:
* A review of the required mathematical background material
* A mathematical discussion at a level accessible to MBA and business students
* A treatment of both linear and nonlinear programming
* An introduction to recent developments, including neural networks, genetic algorithms, and interior-point methods
* A chapter on the use of descent algorithms for the training of feedforward neural networks
* Exercise problems after every chapter, many new to this edition
* MATLAB® exercises and examples
* Accompanying Instructor's Solutions Manual available on request
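To give a flavor of the unconstrained-optimization material listed above, here is a minimal sketch of steepest descent with exact line search on a quadratic objective, one of the book's core algorithms. This is an illustrative Python/NumPy rendering, not code from the book (whose examples use MATLAB); the function name and test problem are assumptions.

```python
import numpy as np

def steepest_descent(Q, b, x0, tol=1e-8, max_iter=1000):
    """Minimize f(x) = 0.5 x^T Q x - b^T x for symmetric positive
    definite Q, using steepest descent with exact line search."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = Q @ x - b                      # gradient of f at x
        if np.linalg.norm(g) < tol:        # stop when nearly stationary
            break
        alpha = (g @ g) / (g @ Q @ g)      # exact step size for a quadratic
        x = x - alpha * g
    return x

# Example problem: Q is symmetric positive definite, so the unique
# minimizer x* satisfies the linear system Q x* = b.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = steepest_descent(Q, b, np.zeros(2))
print(np.allclose(Q @ x_star, b, atol=1e-6))
```

For a quadratic with exact line search, the step size has this closed form; for general nonlinear objectives the book also covers inexact line-search rules and Newton-type methods.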
An Introduction to Optimization, Second Edition helps students prepare for the advanced topics and technological developments that lie ahead. It is also a useful book for researchers and professionals in mathematics, electrical engineering, economics, statistics, and business.
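Linear programming, the book's other major thread, can likewise be sketched briefly. The toy problem below is an assumption for illustration, solved here with SciPy's `linprog` rather than the simplex tableau computations or MATLAB exercises the book itself presents.

```python
from scipy.optimize import linprog

# Toy LP: maximize 3*x1 + 2*x2
# subject to  x1 + x2 <= 4,  x1 + 3*x2 <= 6,  x1, x2 >= 0.
# linprog minimizes, so we negate the objective coefficients.
res = linprog(c=[-3, -2],
              A_ub=[[1, 1], [1, 3]],
              b_ub=[4, 6],
              bounds=[(0, None), (0, None)],
              method="highs")
print(res.x, -res.fun)  # optimal point and (re-negated) optimal value
```

Checking the vertices of the feasible region by hand, the optimum is at (4, 0) with objective value 12, which matches the solver's answer.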
An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.