$LastChangedDate: 16-12-2008 $
optimbase
Provides an abstract class for a general optimization
component.
SYNOPSIS
newobj = optimbase_new ()
this = optimbase_destroy (this)
this = optimbase_configure (this,key,value)
value = optimbase_cget (this,key)
value = optimbase_get (this,key)
this = optimbase_set ( this , key , value )
[ this , isok ] = optimbase_checkbounds ( this )
[ opt , isok ] = optimbase_checkx0 ( this )
this = optimbase_display ( this )
[ this , result ] = optimbase_function ( this , x , index )
[ this , hasbounds ] = optimbase_hasbounds ( this )
value = optimbase_histget ( this , iter , key )
this = optimbase_histset ( this , iter , key , value )
this = optimbase_incriter ( this )
[ this , isfeasible ] = optimbase_isfeasible ( this , x )
this = optimbase_log (this,msg)
optimbase_outputcmd ( this , state , data )
data = optimbase_outstruct ( this )
[ this , p ] = optimbase_proj2bnds ( this , x )
this = optimbase_stoplog ( this , msg )
[this , terminate , status] = optimbase_terminate (this , previousfopt , currentfopt , previousxopt , currentxopt )
Purpose
The goal of this component is to provide a building block for a large
class of specialized optimization methods. This component manages
the number of variables,
the minimum and maximum bounds,
the number of nonlinear inequality constraints,
the cost function,
the logging system,
various termination criteria,
etc.
Design
This toolbox is designed with object-oriented ideas in mind.
Features
The following is a list of features the Optimbase toolbox currently
provides:
Manage the cost function:
optional additional argument,
direct communication of the task to perform: cost function
or inequality constraints.
Manage various termination criteria, including
maximum number of iterations,
tolerance on function value (relative or absolute),
tolerance on x (relative or absolute),
maximum number of evaluations of the cost function.
Manage the history of the convergence, including
history of function values,
history of optimum point.
Provide query features for
the status of the optimization process,
the number of iterations,
the number of function evaluations,
function value at initial point,
function value at optimal point,
the optimum parameters,
etc.
Description
This set of commands allows managing an abstract optimization
method. The goal is to provide a building block for a large class of
specialized optimization methods. This component manages the number of
variables, the minimum and maximum bounds, the number of nonlinear
inequality constraints, the logging system, various termination criteria,
the cost function, etc.
The optimization problem to solve is the following:
minimize f(x)
subject to boundsmin <= x <= boundsmax
and g(x) >= 0
where
n
number of variables
nbineq
number of inequality constraints, i.e. the number of components of g
Functions
The following functions are available.
newobj = optimbase_new ()
Creates a new optimization object.
newobj
The new object.
this = optimbase_destroy (this)
Destroy the given object.
this
The current object.
this = optimbase_configure (this,key,value)
Configure the current object with the given value for the
given key.
this
The current object.
key
the key to configure. The following keys are
available.
-verbose
set to 1 to enable verbose logging. (default is
0)
-verbosetermination
set to 1 to enable verbose termination logging.
(default is 0)
-x0
the initial guess.
-maxfunevals
the maximum number of function evaluations
(default is 100). If this criterion is triggered, the
status of the optimization is set to
"maxfuneval".
-maxiter
the maximum number of iterations (default is 100).
If this criterion is triggered, the status of the
optimization is set to "maxiter".
-tolfunabsolute
the absolute tolerance for the function value
(default is 0.0).
-tolfunrelative
the relative tolerance for the function value
(default is %eps).
-tolfunmethod
the method used for the tolerance on function
value in the termination criteria.
The following values are available: "enabled",
"disabled" (default is "disabled"). If this criterion is
triggered, the status of the optimization is set to
"tolf".
-tolxabsolute
the absolute tolerance on x (default is
0.0).
-tolxrelative
the relative tolerance on x (default is
%eps).
-tolxmethod
the method used for the tolerance on x in the
termination criteria.
The following values are available: "enabled",
"disabled" (default is "enabled"). If this criterion is
triggered, the status of the optimization is set to
"tolx".
-function
the objective function, which computes the value
of the cost and the nonlinear constraints, if
any.
See below for the details of the communication
between the optimization system and the cost
function.
-costfargument
an additional argument, passed to the cost
function.
-outputcommand
a command which is called back for output.
See below for the details of the communication
between the optimization system and the output command
function.
-outputcommandarg
an additional argument, passed to the output
command.
-numberofvariables
the number of variables to optimize (default is
0).
-storehistory
set to 1 to enable the history storing (default is
0).
-boundsmin
the minimum bounds for the parameters, as an array
of values (default is empty, i.e. there are no
bounds).
-boundsmax
the maximum bounds for the parameters, as an array
of values (default is empty, i.e. there are no
bounds).
-nbineqconst
the number of inequality constraints (default is
0).
value
the value.
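For instance, a fresh object might be configured as follows (the particular settings shown are illustrative):

```scilab
// Create a new optimization object and configure it.
opt = optimbase_new ();
opt = optimbase_configure ( opt , "-numberofvariables" , 2 );
opt = optimbase_configure ( opt , "-x0" , [-1.2 1.0]' );
opt = optimbase_configure ( opt , "-maxiter" , 200 );
opt = optimbase_configure ( opt , "-verbose" , 1 );
```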
value = optimbase_cget (this,key)
Get the value for the given key. If the key is unknown,
generates an error.
this
The current object.
key
the name of the key to query. The list of available
keys is the same as for the optimbase_configure
function.
[ this , isok ] = optimbase_checkbounds ( this )
Checks that the bounds are consistent and prints an error message
if they are not. One could generate an error instead, but errors are not
testable with the current system.
this
The current object.
[ opt , isok ] = optimbase_checkx0 ( this )
Returns %T in isok if the initial guess is consistent with the
bounds and the nonlinear inequality constraints, if any.
this
The current object.
this = optimbase_display ( this )
Display the current settings in the console.
this
The current object.
[ this , result ] = optimbase_function ( this , x , index
)
Call the cost function and return the value.
If an additional cost function argument is defined in the current
object, it is passed to the function. If an index is given as an input
argument, it is passed to the function, always as the second argument.
this
The current object.
x
the current point
index
optional, the value to compute.
See below in the section "Cost function" for details on
this index.
this = optimbase_set ( this , key , value )
Set the value for the given key. If the key is unknown,
generates an error.
this
The current object.
key
the key to set. The following keys are available:
-funevals
the number of function evaluations
-iterations
the number of iterations
-xopt
the x optimum
-fopt
the optimum cost function value
-historyxopt
an array, with nbiter values, containing the
history of x during the iterations.
This array is available after optimization if the
history storing was enabled with the -storehistory
option.
-historyfopt
an array, with nbiter values, containing the
history of the function value during the
iterations.
This array is available after optimization if the
history storing was enabled with the -storehistory
option.
-fx0
the function value for the initial guess
-status
a string containing the status of the
optimization
value
the value to set
value = optimbase_get (this,key)
Get the value for the given key. If the key is unknown,
generates an error. This command corresponds to options which are
not directly available through the optimbase_configure function, but are
computed internally.
this
The current object.
key
the name of the key to query.
The list of available keys is the same as the
optimbase_set function.
[ this , hasbounds ] = optimbase_hasbounds ( this )
Returns %T if current problem has bounds.
this
The current object.
this = optimbase_histset ( this , iter , key , value )
Set the history value at given iteration for the given key. If
the key is unknown, generates an error.
this
The current object.
iter
the iteration number to set
key
the name of the key to query.
The list of available keys is the following: "-xopt",
"-fopt".
value
the value to set
value = optimbase_histget ( this , iter , key )
Returns the history value at the given iteration number for
the given key. If the key is unknown, generates an error.
this
The current object.
iter
the iteration number to get
key
the name of the key to query.
The list of available keys is the same as for the
optimbase_histset function.
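For example, assuming history storing was enabled with the -storehistory option, one might record and retrieve the optimum at iteration 1 as follows (the particular values are illustrative):

```scilab
// Store the optimum of iteration 1, then read it back.
opt = optimbase_histset ( opt , 1 , "-fopt" , 24.2 );
opt = optimbase_histset ( opt , 1 , "-xopt" , [-1.2 1.0]' );
f1 = optimbase_histget ( opt , 1 , "-fopt" );
```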
this = optimbase_incriter ( this )
Increments the number of iterations.
this
The current object.
[ this , isfeasible ] = optimbase_isfeasible ( this , x )
Returns 1 if the given point satisfies the bounds and the
inequality constraints. Returns 0 if the given point is not within the
bounds. Returns -1 if the given point does not satisfy the inequality
constraints.
this
The current object.
x
the current point
this = optimbase_log (this,msg)
If verbose logging is enabled, prints the given message in the
console. If verbose logging is disabled, does nothing.
this
The current object.
msg
the message to print
optimbase_outputcmd ( this , state , data )
Calls back the user's output command.
this
The current object.
data = optimbase_outstruct ( this )
Returns a tlist with basic optimization fields. This tlist is
suitable for use as an input argument of the output function. This
tlist may be enriched by specialized (child) optimization
methods.
this
The current object.
[ this , p ] = optimbase_proj2bnds ( this , x )
Returns a point p which is the projection of the given point x
onto the bounds.
this
The current object.
x
the current point
this = optimbase_stoplog ( this , msg )
Prints the given stopping rule message if verbose termination
is enabled. If verbose termination is disabled, does nothing.
this
The current object.
msg
the message to print
[this , terminate , status] = optimbase_terminate (this ,
previousfopt , currentfopt , previousxopt , currentxopt )
Returns terminate = 1 if the algorithm must terminate and
terminate = 0 if the algorithm must continue. If the -verbosetermination
option is enabled, messages are printed detailing the intermediate
termination steps. The optimbase_terminate function takes into account the
number of iterations, the number of evaluations of the cost
function, the tolerance on x and the tolerance on f. See below in
the section "Termination" for more details.
this
The current object.
previousfopt
the previous value of the cost function
currentfopt
the current value of the cost function
previousxopt
the previous x optimum
currentxopt
the current x optimum
terminate
1 if the algorithm must terminate, 0 if the algorithm
must continue
status
if terminate = 1, the detailed status of the
termination, as a string. If terminate = 0, the status is
"continue".
The following statuses are available:
"maxiter"
the maximum number of iterations, provided by the
-maxiter option, is reached.
"maxfuneval"
the maximum number of function evaluations,
provided by the -maxfunevals option, is reached.
"tolf"
the tolerance on the function value is reached.
This status is associated with the -tolfunmethod,
-tolfunabsolute and -tolfunrelative options.
"tolx"
the tolerance on x is reached. This status is
associated with the -tolxmethod, -tolxabsolute and
-tolxrelative options.
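A specialized solver typically calls optimbase_terminate at the end of each iteration; a minimal sketch of such a loop, using only the functions documented above (the update step itself is left abstract):

```scilab
// Evaluate the cost at the initial guess, then iterate
// until one of the termination criteria fires.
previousxopt = optimbase_cget ( opt , "-x0" );
[ opt , previousfopt ] = optimbase_function ( opt , previousxopt , 1 );
terminate = 0;
while ( terminate == 0 )
  // ... compute the new iterate xopt and its cost fopt here ...
  opt = optimbase_incriter ( opt );
  [ opt , terminate , status ] = optimbase_terminate ( opt , ...
    previousfopt , fopt , previousxopt , xopt );
  previousfopt = fopt;
  previousxopt = xopt;
end
```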
The cost function
The -function option allows configuring the cost function. The cost
function is used to compute the cost and the value of the nonlinear
inequality constraints.
In the most general case, the cost function is expected to have the
following header (the name costf is arbitrary):
function y = costf ( x , index , data )
where
x
the current point
index
optional, an integer representing the value to compute
data
optional, a user-defined data.
This argument is configured with the -costfargument
option.
y
the result
The index input parameter has the following meaning
index = 1 (or no index)
the result is the value of the cost function
index = 2
the result is the value of the non-linear inequality
constraints, as an array of values
index = 3
the result is an array whose content is the following: at
index #1, the value of the cost function; at indices #2 to the end,
the values of the nonlinear inequality constraints.
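As an illustration, a cost function honouring this index convention might be sketched as follows (the name mycostf and the particular cost and constraint are hypothetical):

```scilab
// Hypothetical cost function: Rosenbrock cost with one
// inequality constraint g(x) >= 0 (the unit disk).
function y = mycostf ( x , index )
  f = 100 * (x(2) - x(1)^2)^2 + (1 - x(1))^2
  g = 1 - x(1)^2 - x(2)^2
  select index
  case 1 then
    y = f        // the cost only
  case 2 then
    y = g        // the inequality constraints only
  case 3 then
    y = [f g]    // cost first, then the constraints
  end
endfunction
```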
In the simplest case, the cost function is expected to have the
following header (the name costf is arbitrary):
function y = costf ( x )
where x is the current point and y is the value of the cost. This
case is associated with an unconstrained problem without any additional
parameter.
The output function
The -outputcommand option allows configuring a command which is
called back at the start of the optimization, at each iteration and at the
end of the optimization.
The output function must have the following header (the name
outputcmd is arbitrary):
function outputcmd ( state , data , myobj )
where
state
a string representing the current state of the algorithm.
Available values are "init", "iter", "done".
data
a tlist containing at least the following entries
x
the current optimum
fval
the current function value
iteration
the current iteration index
funccount
the number of function evaluations
myobj
a user-defined parameter.
This input parameter is defined with the -outputcommandarg
option.
The output function may be used when debugging the specialized
optimization algorithm, so that verbose logging is produced. It may also
be used to write one or several report files in a specialized format
(ASCII, LaTeX, Excel, HDF5, etc.). The user-defined parameter may be
used in that case to store file names or logging options.
The data tlist argument may contain more fields than those
presented above. These additional fields may contain values which are
specific to the specialized algorithm, such as the simplex in a
Nelder-Mead method, the gradient of the cost function in a BFGS method,
etc.
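A minimal sketch of such an output function, which prints one line per call (the name myoutputcmd is arbitrary):

```scilab
// Hypothetical output command: log the state, the iteration
// index, the evaluation count and the current function value.
function myoutputcmd ( state , data , myobj )
  mprintf ( "%s: iteration #%d, funccount=%d, fval=%e\n", ...
    state , data.iteration , data.funccount , data.fval )
endfunction
```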
Termination
The current component takes several generic termination criteria
into account. Specialized termination criteria should be
implemented in specialized optimization algorithms, by calling the
optimbase_terminate function and adding external criteria, rather than
by modifying this function.
The optimbase_terminate function uses a set of rules to compute if
the termination occurs, which leads to an optimization status which is
equal to one of the following: "continue", "maxiter", "maxfuneval",
"tolf", "tolx". The set of rules is the following.
By default, the status is "continue" and the terminate flag is
0.
The number of iterations is examined and compared to the
-maxiter option: if the following condition
iterations >= maxiter
is true, then the status is set to "maxiter" and terminate is
set to 1.
The number of function evaluations is examined and compared to
the -maxfunevals option: if the following condition
funevals >= maxfunevals
is true, then the status is set to "maxfuneval" and terminate is
set to 1.
The tolerance on the function value is examined depending on the
value of the -tolfunmethod option.
"disabled"
the tolerance on f is simply skipped.
"enabled"
if the following condition
abs(currentfopt) < tolfunrelative * abs(previousfopt) + tolfunabsolute
is true, then the status is set to "tolf" and terminate is
set to 1.
The relative termination criterion on the function value works
well if the function value at the optimum is near zero. In that case, the
function value at the initial guess fx0 may be used as
previousfopt.
The absolute termination criterion on the function value works if
the user has an accurate idea of the optimum function value.
The tolerance on x is examined depending on the value of the
-tolxmethod option.
"disabled"
the tolerance on x is simply skipped.
"enabled"
if the following condition
norm(currentxopt - previousxopt) < tolxrelative * norm(currentxopt) + tolxabsolute
is true, then the status is set to "tolx" and terminate is
set to 1.
The relative termination criterion on x works well if the x at
the optimum is different from zero. In that case, the
condition measures the distance between two iterates.
The absolute termination criterion on x works if
the user has an accurate idea of the scale of the optimum x. If the
optimum x is near 0, the relative tolerance will not work and the
absolute tolerance is more appropriate.
Example: Setting up an optimization
In the following example, one searches for the minimum of the 2D
Rosenbrock function. One begins by defining the function "rosenbrock",
which computes the Rosenbrock function. The traditional initial guess
[-1.2 1.0] is used. The initial simplex is computed along the axes with a
length equal to 0.1. The Nelder-Mead algorithm with variable simplex size
is used. The verbose mode is enabled so that messages are generated during
the algorithm. After the optimization is performed, the optimum is
retrieved with query features.
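Since optimbase is only a building block, the following sketch sets up the object and evaluates the cost at the initial guess, using only the optimbase functions documented above; a specialized solver (such as a Nelder-Mead implementation built on optimbase) would drive the actual iterations. The particular settings are illustrative:

```scilab
// The 2D Rosenbrock function.
function y = rosenbrock ( x )
  y = 100 * (x(2) - x(1)^2)^2 + (1 - x(1))^2
endfunction

opt = optimbase_new ();
opt = optimbase_configure ( opt , "-numberofvariables" , 2 );
opt = optimbase_configure ( opt , "-x0" , [-1.2 1.0]' );
opt = optimbase_configure ( opt , "-function" , rosenbrock );
opt = optimbase_configure ( opt , "-verbose" , 1 );
// Cost at the initial guess (index = 1: cost only).
[ opt , fx0 ] = optimbase_function ( opt , [-1.2 1.0]' , 1 );
// ... run the specialized algorithm, then query the results ...
xopt = optimbase_get ( opt , "-xopt" );
opt = optimbase_destroy ( opt );
```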
Authors
Michael Baudin, 2008-2009
TODO
manage equality constraints
manage linear constraints
manage quadratic objective
manage linear objective
methods with derivatives: add a flag to manage the
derivatives of the cost function