Quickguide MAD Problem
This page is part of the Quickguide Manual. See Quickguide.

TOMLAB /MAD is a package for general automatic differentiation of Matlab code. It is applicable to any application that needs derivatives. The package can be used standalone or as part of TOMLAB when floating-point precision derivatives are needed.

The following is a simple example of standalone use:

 >> x = 1;
 >> x = fmad(x,1);
 >> y = sin(x);
 >> y
 
 value =
 
     0.8415
 
 
 derivatives =
 
     0.5403

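The same pattern extends to functions of several variables: seeding fmad with an identity matrix makes every component an active variable, so the displayed derivatives form the full gradient. The short sketch below is only an illustration; the function and the point are chosen arbitrarily and are not part of the guide.

 % Forward-mode example (illustrative): f(x) = x(1)^2 + 3*x(2) at x = [2; 5].
 % Seeding with eye(2) treats both components as independent variables.
 x = fmad([2; 5], eye(2));
 y = x(1)^2 + 3*x(2);
 y                  % displaying y shows the value, 19, and the derivatives, [4 3]
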
An example with TOMLAB is also included in the guide. The following file defines and solves two nonlinear programming problems, one using first-order and one using second-order derivative information.

File: tomlab/quickguide/madQG.m

Open the file for viewing and execute madQG in Matlab.

 % madQG contains two examples of defining and solving nonlinear
 % programming problems using TOMLAB /MAD.
 
 Name = 'RBB Problem';
 x_0 = [-1.2 1]';     % Starting values for the optimization
 x_L = [-10;-10];     % Lower bounds for x.
 x_U = [2;2];         % Upper bounds for x.
 fLowBnd = 0;         % Lower bound on function.
 
 c_L = -1000;         % Lower bound on nonlinear constraints.
 c_U = 0;             % Upper bound on nonlinear constraints.
 
 Prob1 = conAssign('rbbQG_f', [], [], [], x_L, x_U, Name, x_0,...
                 [], fLowBnd, [], [], [], 'rbbQG_c', [], [], [], c_L, c_U);
 
 Prob2 = conAssign('rbbQG_f', 'rbbQG_g', [], [], x_L, x_U, Name, x_0,...
                 [], fLowBnd, [], [], [], 'rbbQG_c', 'rbbQG_dc', [], [], c_L, c_U);
                              
 Prob1.Warning = 0;    % Turning off warnings.            
 Prob2.Warning = 0;    % Turning off warnings.            
             
 madinitglobals;
 Prob1.ADObj  = 1; % Gradient of the objective computed by MAD.
 Prob1.ADCons = 1; % Jacobian of the constraints computed by MAD.
 Result1 = tomRun('snopt', Prob1, 1);  % SNOPT only uses first order information.
 
 madinitglobals;
 Prob2.CONOPT.options.LS2PTJ = 0;
 Prob2.ADObj  = -1; % Hessian of the objective computed by MAD.
 Prob2.ADCons = -1; % Second derivatives for the Lagrangian of the nonlinear constraints.
 Result2 = tomRun('conopt', Prob2, 1);  % CONOPT uses second order information.
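
For reference, the routines named in conAssign ('rbbQG_f', 'rbbQG_g', 'rbbQG_c', 'rbbQG_dc') follow TOMLAB's (x, Prob) calling convention, and MAD differentiates the plain Matlab code inside them when the ADObj and ADCons flags are set. The sketch below only illustrates that format with hypothetical names (myQG_f, myQG_c) and an arbitrary constraint; the actual files are shipped in tomlab/quickguide and may differ.

 % Illustrative only: a hypothetical objective/constraint pair in the
 % (x, Prob) format TOMLAB expects; not the actual rbbQG_f.m / rbbQG_c.m.
 
 % --- file myQG_f.m ---
 function f = myQG_f(x, Prob)
 % Rosenbrock-type objective written as plain Matlab code,
 % so MAD can differentiate it when Prob.ADObj is set.
 f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
 
 % --- file myQG_c.m ---
 function c = myQG_c(x, Prob)
 % One nonlinear constraint value, bounded by c_L and c_U in conAssign.
 c = -x(1)^2 - x(2);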