## Introduction

Optimization of non-linear objective functions, potentially subject to non-linear constraints, is a common problem in many scientific, engineering and business areas.

More often than not, the objective and constraint functions are not easily differentiable, which limits the possibility of applying the more efficient derivative-based optimization algorithms (of which IPOPT is a prominent open-source example).

There are several ways of solving (constrained) non-linear optimization problems, with varying time efficiency, including simulated annealing, genetic algorithms and direct search methods such as the Nelder-Mead method (a good overview of mathematical optimization, with several links, is provided here).
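To give a flavor of the direct-search family, here is a minimal compass (coordinate) search in Java. This is my own illustrative sketch, not any of the codes discussed below: it polls a step up and down along each coordinate axis and halves the step whenever no poll improves the objective.

```java
import java.util.function.Function;

public class CompassSearch {
    // Minimal direct search: poll +/- step along each coordinate axis and
    // move to any improving point; halve the step when a full sweep fails.
    static double[] minimize(Function<double[], Double> f, double[] x0,
                             double step, double tol) {
        double[] x = x0.clone();
        double fx = f.apply(x);
        while (step > tol) {
            boolean improved = false;
            for (int i = 0; i < x.length; i++) {
                for (double s : new double[] { step, -step }) {
                    double[] y = x.clone();
                    y[i] += s;
                    double fy = f.apply(y);
                    if (fy < fx) { x = y; fx = fy; improved = true; }
                }
            }
            if (!improved) step *= 0.5; // contract on an unsuccessful sweep
        }
        return x;
    }

    public static void main(String[] args) {
        // Smooth convex test function with minimum at (3, -1)
        double[] opt = minimize(
            v -> (v[0] - 3) * (v[0] - 3) + (v[1] + 1) * (v[1] + 1),
            new double[] { 0.0, 0.0 }, 1.0, 1e-8);
        System.out.printf("%.6f %.6f%n", opt[0], opt[1]);
    }
}
```

Compass search needs no derivatives at all, which is exactly the property that makes this class of methods attractive when gradients are unavailable; the price is the slow convergence discussed below.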

## Developments

### COBYLA2 for .NET

One rather popular code for solving non-linear optimization problems with (potentially non-linear) constraints is the *COBYLA2* method. Originally formulated and implemented in Fortran 77 by Michael Powell, the method has been ported to both Fortran 90 and C (through f2c conversion), and it has also been integrated into the Python scientific computing library SciPy. *COBYLA2* is simple to use, but it exhibits a rather slow convergence rate: the number of function and constraint evaluations required to locate an optimum that satisfies the constraints is often very large. As a first-attempt solver when objective and constraint function gradients are complex or time-consuming to derive, however, *COBYLA2* is often a good choice.
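The core idea in *COBYLA2* is to model the objective and constraints by linear interpolation over a simplex of n + 1 points. As a hedged illustration of that modelling step (my own sketch, not the ported code), the following Java snippet fits an affine model c + g·x through n + 1 interpolation points by Gaussian elimination:

```java
import java.util.Arrays;

public class LinearModel {
    // Fit an affine model m(x) = c + g.x through n + 1 interpolation points
    // by solving the (n+1)x(n+1) system [1 x_i][c; g] = f_i.
    // Assumes the points are affinely independent (a non-degenerate simplex).
    static double[] fit(double[][] pts, double[] f) {
        int n = pts[0].length;
        double[][] a = new double[n + 1][n + 2];
        for (int i = 0; i <= n; i++) {
            a[i][0] = 1.0;
            System.arraycopy(pts[i], 0, a[i], 1, n);
            a[i][n + 1] = f[i];
        }
        // Gauss-Jordan elimination with partial pivoting
        for (int col = 0; col <= n; col++) {
            int piv = col;
            for (int r = col + 1; r <= n; r++)
                if (Math.abs(a[r][col]) > Math.abs(a[piv][col])) piv = r;
            double[] t = a[col]; a[col] = a[piv]; a[piv] = t;
            for (int r = 0; r <= n; r++) {
                if (r == col) continue;
                double factor = a[r][col] / a[col][col];
                for (int c = col; c <= n + 1; c++) a[r][c] -= factor * a[col][c];
            }
        }
        double[] cg = new double[n + 1];
        for (int i = 0; i <= n; i++) cg[i] = a[i][n + 1] / a[i][i];
        return cg; // cg[0] = c, cg[1..n] = gradient g
    }

    public static void main(String[] args) {
        // Interpolate f(x, y) = 2 + 3x - y at the simplex {(0,0), (1,0), (0,1)}
        double[][] pts = { { 0, 0 }, { 1, 0 }, { 0, 1 } };
        double[] f = { 2, 5, 1 };
        System.out.println(Arrays.toString(fit(pts, f))); // [2.0, 3.0, -1.0]
    }
}
```

Roughly speaking, *COBYLA2* minimizes such a linear model of the objective, subject to linearized constraints, within a trust region, then replaces a simplex vertex with the new point and refits.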

Being primarily a .NET developer, though, I have until recently found it not straightforward to incorporate *COBYLA2*. It has of course been possible to build a native DLL and call *COBYLA2* via P/Invoke. This works relatively well in a full .NET Framework environment, but it is only barely applicable in Silverlight applications and completely inapplicable in Windows Phone applications. On the other hand, the Fortran code is relatively well structured and well documented, so I recently decided to port *COBYLA2* to C#.

I based the port on the Fortran 90 code, since this implementation already utilizes more C/C++/C#-like constructs. The port has been successful, and I have released the end result as an open-source project, cscobyla, on GitHub. Included in this project are unit tests demonstrating the use of *COBYLA2* in C# applications. The C# code is fully portable across all Windows technologies, so it should be possible to incorporate it in Windows Store (a.k.a. Metro), Silverlight and Windows Phone applications just as easily as in regular .NET applications. When built into a C# class library, it should be straightforward to reference the *COBYLA2* optimization from any other .NET language as well.

### COBYLA2 for Java

Encouraged by the successful porting to C#, I then embarked on a Java port. I have not been able to find a pure Java implementation of *COBYLA2*, so I considered this a good Java development exercise. Java does not provide delegates, multidimensional arrays can only be represented as jagged arrays, and Java does not support the *goto* statement, so I was forced to make some design changes. Nevertheless, the Java porting effort also succeeded, and I have made this result available as open source on GitHub as well, in the jcobyla project.

### BOBYQA for .NET

And this is not all there is! As mentioned, *COBYLA2* converges slowly. Michael Powell has made several efforts to overcome this issue for specialized cases by using a local quadratic approximation rather than the linear approximation used in *COBYLA2*. These improvements have been made available in the NEWUOA code for unconstrained optimization of non-linear objective functions and, more recently, for bound-constrained optimization in the BOBYQA (Bound Optimization BY Quadratic Approximation) code. These codes exhibit substantially improved convergence properties compared to *COBYLA2*, albeit for unconstrained or bound-constrained problems only.
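The quadratic-model idea behind NEWUOA and *BOBYQA* is easiest to see in one dimension. The following Java sketch (my own illustration, not Powell's code) fits a parabola through three points and steps to its vertex, the standard quadratic-interpolation step:

```java
public class ParabolicStep {
    // Abscissa of the vertex of the parabola through (a,fa), (b,fb), (c,fc).
    // Assumes the three abscissae are distinct and the points are not collinear.
    static double vertex(double a, double fa, double b, double fb,
                         double c, double fc) {
        double num = (b - a) * (b - a) * (fb - fc) - (b - c) * (b - c) * (fb - fa);
        double den = (b - a) * (fb - fc) - (b - c) * (fb - fa);
        return b - 0.5 * num / den;
    }

    public static void main(String[] args) {
        // For an exactly quadratic objective f(x) = (x - 2)^2 + 1, a single
        // interpolation step from any three points lands on the minimizer.
        double x = vertex(0.0, 5.0, 1.0, 2.0, 3.0, 2.0);
        System.out.println(x); // 2.0
    }
}
```

This is why a quadratic model converges so much faster near a smooth optimum than the linear model in *COBYLA2*: the model captures curvature. *BOBYQA* generalizes the idea to n dimensions, maintaining a quadratic model through (typically) 2n + 1 interpolation points and updating it as new points are evaluated.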

In particular, I consider *BOBYQA* a very interesting development, as many problems encountered in, for example, my own field of expertise, radiation therapy, can be formulated as bound-constrained problems. *BOBYQA* has not been available for the .NET platform other than through P/Invoke, so again I decided to port the Fortran code to C#. This time I had to rely on the Fortran 77 implementation, since I have not been able to identify any Fortran 90 port of *BOBYQA*. It took some additional effort, but even *BOBYQA* is now available as open source in C#. I have named this project csbobyqa and made it available on GitHub. Also written using standard C# constructs, the code should be easy to incorporate in any .NET, Windows Store (Metro), Silverlight or Windows Phone application.

### BOBYQA for Java

In the case of *BOBYQA*, there is actually already an open-source Java implementation available, as part of the Apache Commons Math project. The API for this Java class can be found here.

To confirm that my C# implementation is sufficiently efficient, I identified the longest-running unit test (bound-constrained Rosen with a varying number of interpolation points) in the Java implementation and implemented the corresponding unit test in the C# unit test library. On my laptop, the C# unit test takes around 15 seconds on each consecutive run, whereas the corresponding Java unit test ranges from 15 to 40 seconds. It is encouraging that the C# implementation consistently performs equal to or better than the Java implementation in this case.

## Using the code

The implementations are relatively faithful to the original Fortran 77 and 90 implementations. It should be noted, however, that the indexing of the variables and constraints arrays in these implementations is zero-based, i.e. for a problem with 3 variables, `x[0]`, `x[1]` and `x[2]` should be employed in the objective (and, where applicable, constraints) function evaluations.

### Calling COBYLA2 from C#

To invoke the *COBYLA2* algorithm from C#, incorporate the *Cobyla.cs* file from the *cscobyla* project in your application. Then implement a method for computing the objective function and (potentially) constraint functions with the following signature:

```csharp
void calcfc(int n, int m, double[] x, out double f, double[] con)
```

where `n` is the number of variables, `m` is the number of constraints, `x` is the variable array, `f` is the calculated objective function value and `con` is the array of calculated constraint function values.

To minimize the objective function subject to constraints, call one of the two overloaded `Cobyla.FindMinimum` methods:

```csharp
CobylaExitStatus FindMinimum(CalcfcDelegate calcfc, int n, int m, double[] x,
                             double rhobeg, double rhoend, int iprint, int maxfun,
                             out double f, out double[] con, out int iters,
                             TextWriter logger);

CobylaExitStatus FindMinimum(CalcfcDelegate calcfc, int n, int m, double[] x,
                             double rhobeg, double rhoend, int iprint, int maxfun,
                             TextWriter logger);
```

where `x` on input is the initial variable array, `rhobeg` and `rhoend` are the initial and final sizes of the simplex, `iprint` (0..3) specifies the level of output to the console, `maxfun` is the maximum allowed number of function evaluations, `f` is the objective function value and `con` the constraint function values at the variable optimum, and `iters` is the actual number of function evaluations performed in the optimization. If defined, `logger` is a text writer to which output from *COBYLA2* is logged.

On return, `x` holds the obtained optimal variable values. Both methods return the final optimization status, which is one of normal termination, maximum iterations reached or diverging rounding errors.

The latter of the two overloaded methods also implements default values as follows: `rhobeg = 0.5`, `rhoend = 1.0e-6`, `iprint = 2`, `maxfun = 3500` and `logger = null`. The method can thus optionally be called as follows, employing the default parameter values in the minimization:

```csharp
var status = Cobyla.FindMinimum(calcfc, n, m, x);
```

Here is a simple example of minimizing the product of two variables, with the variables confined to the unit disk (the optimum ends up on its boundary). First, implement the objective function and constraints computation method:

```csharp
public static void calcfc1(int n, int m, double[] x, out double f, double[] con)
{
    f = x[0] * x[1];
    con[0] = 1.0 - x[0] * x[0] - x[1] * x[1];
}
```

Implicitly, the requirement is that all constraint function values be non-negative, i.e. `con[j] ≥ 0` for all constraints `j`.
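Since every constraint is interpreted as `con[j] ≥ 0`, an equality constraint h(x) = 0 can be supplied as the pair h(x) ≥ 0 and −h(x) ≥ 0. Here is a small Java sketch of that encoding (the constraint x0 + x1 = 1 is my own illustrative choice); note that in practice the solver will only satisfy the equality to within its tolerance:

```java
public class EqualityAsInequalities {
    // COBYLA treats every constraint as con[j] >= 0. An equality h(x) = 0
    // is therefore encoded as the pair h(x) >= 0 and -h(x) >= 0.
    static void calcfc(double[] x, double[] con) {
        double h = x[0] + x[1] - 1.0; // illustrative equality: x0 + x1 = 1
        con[0] = h;
        con[1] = -h;
    }

    public static void main(String[] args) {
        double[] con = new double[2];
        calcfc(new double[] { 0.3, 0.7 }, con); // a feasible point
        System.out.println(con[0] >= -1e-12 && con[1] >= -1e-12); // true
    }
}
```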

To perform the actual optimization, define initial variable values and call `Cobyla.FindMinimum`:

```csharp
var x = new[] { 1.0, 1.0 };
var status = Cobyla.FindMinimum(calcfc1, 2, 1, x);
```

### Calling COBYLA2 from Java

The files *Cobyla.java*, *Calcfc.java* and *CobylaExitStatus.java* from the *jcobyla* project can be included in the package *com.cureos.numerics* of any Java project.

In Java, the objective function and (potentially) constraint function computations are represented by the `Compute` method of the `Calcfc` interface. Implement the interface explicitly or anonymously. The `Compute` method has the following signature:

```java
double Compute(int n, int m, double[] x, double[] con)
```

where `n` is the number of variables, `m` is the number of constraints, `x` is the variable array, and `con` is the array of calculated constraint function values. The method should return the value of the objective function.

To minimize the objective function subject to constraints, call the static `Cobyla.FindMinimum` method:

```java
CobylaExitStatus FindMinimum(Calcfc calcfc, int n, int m, double[] x,
                             double rhobeg, double rhoend, int iprint, int maxfun);
```

where `x` on input is the initial variable array, `rhobeg` and `rhoend` are the initial and final sizes of the simplex, `iprint` (0..3) specifies the level of output to the console, and `maxfun` is the maximum allowed number of function evaluations.

On output, `x` contains the obtained optimal variable values. The method returns the final optimization status, which is one of normal termination, maximum iterations reached or diverging rounding errors.

Here is a simple example taken from the book *Practical Methods of Optimization, Volume 2* by Roger Fletcher (equation 9.1.15). First, implement the `Calcfc` interface; an anonymous implementation works well here:

```java
Calcfc calcfc = new Calcfc() {
    @Override
    public double Compute(int n, int m, double[] x, double[] con) {
        con[0] = x[1] - x[0] * x[0];
        con[1] = 1.0 - x[0] * x[0] - x[1] * x[1];
        return -x[0] - x[1];
    }
};
```

Implicitly, the requirement is that all constraint function values be non-negative, i.e. `con[j] ≥ 0` for all constraints `j`. Next, define initial variable values and call `Cobyla.FindMinimum` with suitable control parameters:

```java
double[] x = { 1.0, 1.0 };
CobylaExitStatus result = Cobyla.FindMinimum(calcfc, 2, 2, x, 0.5, 1.0e-6, 1, 3500);
```
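For this problem the optimum is known analytically: by symmetry, the maximum of x0 + x1 on the unit disk is attained at x0 = x1 = 1/√2, which also satisfies the parabola constraint x1 ≥ x0², giving the objective value −√2. A small self-contained Java check of that point against the constraint and objective definitions above:

```java
public class FletcherCheck {
    public static void main(String[] args) {
        // Known optimum of Fletcher's equation 9.1.15: x0 = x1 = 1/sqrt(2)
        double x0 = Math.sqrt(0.5), x1 = Math.sqrt(0.5);
        double con0 = x1 - x0 * x0;            // parabola constraint, >= 0
        double con1 = 1.0 - x0 * x0 - x1 * x1; // unit disk constraint, ~ 0 (active)
        double f = -x0 - x1;                   // objective, equals -sqrt(2)
        System.out.printf("f(x*) = %.6f, feasible: %b%n",
                f, con0 >= 0 && con1 >= -1e-12);
        // prints f(x*) = -1.414214, feasible: true
    }
}
```

Comparing the solver's returned `x` against this point is a convenient sanity check for the port.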

### Calling BOBYQA from C#

To invoke the *BOBYQA* algorithm from C#, incorporate the *Bobyqa.cs* file from the *csbobyqa* project in your application. Then implement a method for computing the objective function with the following signature:

```csharp
double calfun(int n, double[] x)
```

where `n` is the number of variables and `x` is the variable array. The method should return the calculated objective function value.

To minimize the objective function subject to bounds, call the static `Bobyqa.FindMinimum` method:

```csharp
BobyqaExitStatus FindMinimum(Func<int, double[], double> calfun, int n, double[] x,
                             double[] xl, double[] xu, int npt, double rhobeg,
                             double rhoend, int iprint, int maxfun, TextWriter logger)
```

where `x` on input is the initial variable array, `xl` and `xu` are lower and upper variable bounds, respectively, `npt` is the number of interpolation conditions (recommended value `2 * n + 1`), `rhobeg` and `rhoend` are the initial and final values of a trust region radius, `iprint` (0..3) specifies the level of output to the console, `maxfun` is the maximum allowed number of function evaluations, and `logger` is a text writer to which *BOBYQA*'s log is output.

If `xl` and/or `xu` is set to `null`, all optimization variables are considered unbounded downwards or upwards, respectively. If `npt` is set to a non-positive value, the `npt` value applied in the optimization is `2 * n + 1`. If `rhobeg` is set to a non-positive value, the `rhobeg` value applied in the optimization is based on the variable start values and the ranges of the bounds. If `rhoend` is set to a non-positive value, the `rhoend` value applied in the optimization is one millionth of the applied `rhobeg`.
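The defaulting rules just described can be sketched as follows in Java. This is my own simplified illustration, not the csbobyqa source; in particular, the `rhobeg` fallback below is a fixed placeholder, whereas the actual library derives it from the start values and bound ranges:

```java
public class BobyqaDefaults {
    // Hypothetical helper illustrating the defaulting rules described above:
    // non-positive inputs are replaced by derived defaults.
    static double[] resolve(int n, int npt, double rhobeg, double rhoend) {
        int nptUsed = npt > 0 ? npt : 2 * n + 1;       // recommended 2n + 1
        // Placeholder only: the real default depends on start values and bounds.
        double rhobegUsed = rhobeg > 0 ? rhobeg : 0.1;
        double rhoendUsed = rhoend > 0 ? rhoend : 1.0e-6 * rhobegUsed;
        return new double[] { nptUsed, rhobegUsed, rhoendUsed };
    }

    public static void main(String[] args) {
        double[] d = resolve(5, -1, -1.0, -1.0);
        System.out.printf("npt=%d rhobeg=%s rhoend=%s%n", (int) d[0], d[1], d[2]);
    }
}
```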

The `FindMinimum` method also implements default values as follows: `xl = null`, `xu = null`, `npt = -1`, `rhobeg = -1`, `rhoend = -1`, `iprint = 1`, `maxfun = 10000` and `logger = null`. To solve an unbounded optimization problem, the method can thus be called as follows, employing the above default parameter values in the minimization:

```csharp
var status = Bobyqa.FindMinimum(calfun, n, x);
```

On output, `x` provides the obtained optimal variable values. The method returns an enumerated optimization status value, which should equal `Normal` if the optimization completed successfully.

Here is a simple example, HS05 from the Hock-Schittkowski collection. Note that the objective function need not be externally defined but can simply be included as a lambda expression in the method call:

```csharp
var xx = new[] { 0.0, 0.0 };
var status = Bobyqa.FindMinimum(
    (n, x) =>
        Math.Sin(x[0] + x[1]) + Math.Pow(x[0] - x[1], 2.0) - 1.5 * x[0] + 2.5 * x[1] + 1,
    2,
    xx,
    new[] { -1.5, -3.0 },
    new[] { 4.0, 3.0 });
```
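For reference, HS05 has a closed-form solution that the optimizer's result can be checked against: setting both partial derivatives of the objective to zero gives cos(x0 + x1) = −1/2 and x0 − x1 = 1, i.e. x* = (1/2 − π/3, −1/2 − π/3) with f* = −(√3/2 + π/3) ≈ −1.9132, which lies inside the stated bounds. A small Java check of the objective at that point:

```java
public class Hs05Check {
    // HS05 objective from the Hock-Schittkowski collection.
    static double f(double x0, double x1) {
        return Math.sin(x0 + x1) + Math.pow(x0 - x1, 2.0)
                - 1.5 * x0 + 2.5 * x1 + 1;
    }

    public static void main(String[] args) {
        // Analytic optimum: x* = (1/2 - pi/3, -1/2 - pi/3)
        double x0 = 0.5 - Math.PI / 3, x1 = -0.5 - Math.PI / 3;
        double expected = -(Math.sqrt(3) / 2 + Math.PI / 3);
        System.out.printf("f(x*) = %.6f, expected %.6f%n", f(x0, x1), expected);
        // prints f(x*) = -1.913223, expected -1.913223
    }
}
```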

## Where to find the code

A Java version of *BOBYQA* is incorporated in Apache Commons Math: http://commons.apache.org/math/

The development of *COBYLA2* for Java also inspired Reinhard Oldenburg to make a JavaScript adaptation of *COBYLA2*. So for those looking to incorporate non-linear optimization directly into a website, look no further than here.

Good luck with the derivative-free optimizing!

## History

December 14, 2012: First version, adapted from the original article on my blog.

December 19, 2012: Added latest revisions of source code to article.