Publication:
Optimization Methods In Training Neural Networks

Date
2003-07
Authors
Sathasivam, Saratha
Abstract
In steepest descent methods, we construct a functional which, when extremized, delivers the solution. The functional is chosen to be convex, so that the vector extremizing it is the solution of the algebraic problem in question. The search for the vector at which the gradient of the functional vanishes can therefore be carried out iteratively. A special steepest descent method appropriate for the solution of linear algebraic problems is the 'Conjugate Gradient Method'.
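The idea in the abstract can be sketched as follows. For a symmetric positive definite matrix A, solving A x = b is equivalent to minimizing the convex quadratic f(x) = ½ xᵀA x − bᵀx, whose gradient A x − b vanishes exactly at the solution. A minimal NumPy sketch of the conjugate gradient iteration (the matrix, right-hand side, and tolerances below are illustrative assumptions, not from the publication):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive definite A by iteratively
    minimizing f(x) = 0.5 x^T A x - b^T x; the gradient A x - b
    vanishes at the minimizer, which is the solution."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x            # residual = negative gradient of f at x
    p = r.copy()             # first search direction: steepest descent
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:      # gradient (residual) small enough
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate direction
        rs_old = rs_new
    return x

# Example: a small symmetric positive definite system (illustrative data)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the iteration converges in at most n steps for an n×n system, because each search direction is conjugate (A-orthogonal) to all previous ones.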
Keywords
Mathematical optimization