TOMLAB Appendix D
This page is part of the TOMLAB Manual.
Global Variables and Recursive Calls
The use of globally defined variables in TOMLAB is well motivated, for example to avoid unnecessary evaluations, store sparsity patterns, handle internal communication, compute elapsed CPU time, etc. The global variables used in TOMLAB are listed in Table: The global variables used in TOMLAB.
Even though global variables are efficient to use in many cases, they cause trouble with recursive algorithms and recursive calls. Therefore, the routines globalSave and globalGet have been defined. The globalSave routine saves all global variables in a structure glbSave(depth) and then initializes all of them as empty. By using the depth variable, an arbitrary number of recursion levels is possible. The other routine, globalGet, retrieves all global variables from the structure glbSave(depth).
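To make the mechanism concrete, the following is a hedged sketch (not the TOMLAB source) of how a save/restore pair indexed by recursion depth can work. Only two representative globals, n_f and NLP_x, are shown; the real routines handle the full set listed in the table below, and each function would live in its own file.

```matlab
function globalSave(depth)
% Sketch: store the current global values at this recursion depth,
% then clear them so the subsolver starts from a clean state.
global n_f NLP_x glbSave
glbSave(depth).n_f   = n_f;
glbSave(depth).NLP_x = NLP_x;
n_f   = [];
NLP_x = [];

function globalGet(depth)
% Sketch: restore the global values saved at this recursion depth.
global n_f NLP_x glbSave
n_f   = glbSave(depth).n_f;
NLP_x = glbSave(depth).NLP_x;
```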
For some kinds of problems it can be suitable, or even necessary, to apply algorithms based on a recursive approach. A common case occurs when an optimization solver calls another solver to solve a subproblem. For example, the EGO algorithm (implemented in the routine ego) solves an unconstrained (uc) and a box-bounded global optimization problem (glb) in each iteration. To prevent the global variables from being re-initialized or given new values by the underlying procedure, TOMLAB saves the global variables in the workspace before the underlying procedure is called. Directly after the call to the underlying procedure the global variables are restored.
To illustrate the idea, the following code could be part of the ego implementation, showing where the routines globalSave and globalGet are called.
...
...
global GlobalLevel
if isempty(GlobalLevel)
   GlobalLevel = 1;
else
   GlobalLevel = GlobalLevel + 1;
end
Level = GlobalLevel;
globalSave(Level);
EGOResult = glbSolve(EGOProb);
globalGet(Level);
GlobalLevel = GlobalLevel - 1;
...
...
Level = GlobalLevel;
globalSave(Level);
[DACEResult] = ucSolve(DACEProb);
globalGet(Level);
GlobalLevel=GlobalLevel-1;
...
...
In most cases the user does not need to write the statements above. Instead, the special driver routine tomSolve performs the same global variable saving and restoring, calling the solver in between. In the actual implementation of the ego solver the code above simplifies to the following:
...
...
EGOResult = tomSolve('glbSolve',EGOProb);
...
...
DACEResult = tomSolve('ucSolve',DACEProb);
...
...
This handles the global variables safely and is the recommended approach for users who need recursive optimization calls.
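The wrapper pattern that tomSolve applies can be sketched as follows. This is an illustrative simplification under the assumption that the solver name can be dispatched with feval; the function name tomSolveSketch is hypothetical, and the real tomSolve also validates the solver name and handles additional bookkeeping.

```matlab
function Result = tomSolveSketch(Solver, Prob)
% Hypothetical sketch of the save/solve/restore pattern in tomSolve.
global GlobalLevel
if isempty(GlobalLevel)
   GlobalLevel = 1;
else
   GlobalLevel = GlobalLevel + 1;
end
Level = GlobalLevel;
globalSave(Level);              % save and clear all globals
Result = feval(Solver, Prob);   % e.g. 'glbSolve' or 'ucSolve'
globalGet(Level);               % restore the saved globals
GlobalLevel = GlobalLevel - 1;
```

A call such as EGOResult = tomSolveSketch('glbSolve', EGOProb) would then mirror the expanded code shown earlier.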
Table: The global variables used in TOMLAB
Name | Description |
---|---|
MAXCOLS | Number of screen columns. Default 120. |
MAXMENU | Number of menu items showed on one screen. Default 50. |
MAX_c | Maximum number of constraints to be printed. |
MAX_x | Maximum number of variables to be printed. |
MAX_r | Maximum number of residuals to be printed. |
n_f | Counter for the number of function evaluations. |
n_g | Counter for the number of gradient evaluations. |
n_H | Counter for the number of Hessian evaluations. |
n_c | Counter for the number of constraint evaluations. |
n_dc | Counter for the number of constraint normal evaluations. |
n_d2c | Counter for the number of evaluations of the 2nd part of the 2nd derivative matrix of the Lagrangian function. |
n_r | Counter for the number of residual evaluations. |
n_J | Counter for the number of Jacobian evaluations. |
n_d2r | Counter for the number of evaluations of the 2nd part of the Hessian for a nonlinear least squares problem. |
NLP_x | Value of x when computing NLP f. |
NLP_f | Function value. |
NLP_xg | Value of x when computing NLP g. |
NLP_g | Gradient value. |
NLP_xH | Value of x when computing NLP H. |
NLP_H | Hessian value. |
NLP_xc | Value of x when computing NLP c. |
NLP_c | Constraints value. |
NLP_pSepFunc | Number of partially separable functions. |
NLP_pSepIndex | Index for the separated function computed. |
US_A | Problem dependent information sent between user routines. The user is recommended to always use this variable. |
LS_A | Problem dependent information sent from residual routine to Jacobian routine. |
LS_x | Value of x when computing LS_r. |
LS_r | Residual value. |
LS_xJ | Value of x when computing LS_J. |
LS_J | Jacobian value. |
SEP_z | Separated variables z. |
SEP_Jz | Jacobian for separated variables z. |
wNLLS | Weighting of least squares residuals (internal variable in nlp_r and nlp_J). |
alphaV | Vector with all step lengths a for each iteration. |
BUILDP | Flag. |
F_X | Matrix with function values. |
pLen | Number of iterations so far. |
p_dx | Matrix with all search directions. |
X_max | The biggest x-values for all iterations. |
X_min | The smallest x-values for all iterations. |
X_NEW | Last x point in line search. Possible new x_k. |
X_OLD | Last known base point x_k. |
probType | Defines the type of optimization problem. |
solvType | Defines the solver type. |
answer | Used by the GUI for user control options. |
instruction | Used by the GUI for user control options. |
question | Used by the GUI for user control options. |
plotData | Structure with plotting parameters. |
Prob | Problem structure, see TOMLAB Appendix A. |
Result | Result structure, see TOMLAB Appendix B. |
runNumber | Vector index when Result is an array of structures. |
TIME0 | Used to compute CPU time and real time elapsed. |
TIME1 | Used to compute CPU time and real time elapsed. |
cJPI | Used to store sparsity pattern for the constraint Jacobian when automatic differentiation is used. |
HPI | Used to store sparsity pattern for the Hessian when automatic differentiation is used. |
JPI | Used to store sparsity pattern for the Jacobian when automatic differentiation is used. |
glbSave | Used to save global variables in recursive calls to TOMLAB. |