Hello Maple community!
As part of a more complex (global) optimization problem I have a simple trigonometric parametrization of a classical probability distribution, i.e. just a list of positive numbers that sum up to 1. For test purposes, I now take one arbitrary element, p, of that probability list and use it as the target function for an optimization. Of course, one expects to find suitable parameter sets that minimize/maximize p. So here is my simple target function for the case of N=10 probabilities (requiring N-1=9 parameters):
objective := proc(x1, x2, x3, x4, x5, x6, x7, x8, x9)
    local phi, i, p, N;
    N := nargs + 1;      # N = 10 probabilities from N-1 = 9 angles
    phi[0] := Pi/2;      # sin(phi[0]) = 1 anchors the parametrization
    for i from 1 to nargs do
        phi[i] := args[i];
    end do;
    p := [seq((sin(phi[i-1])*mul(cos(phi[j]), j = i .. N-1))^2, i = 1 .. N)];
    evalf(p[1]);         # the arbitrary probability chosen as target
end proc:
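As a quick sanity check on the parametrization itself (a sketch; I pick a small N = 4 here so that simplify has an easy job), the probabilities should telescope to 1:

```
# symbolic check that the N probabilities sum to 1
N := 4:
phi[0] := Pi/2:   # same anchor as in objective; phi[1..3] stay symbolic
p := [seq((sin(phi[i-1])*mul(cos(phi[j]), j = i .. N-1))^2, i = 1 .. N)]:
simplify(add(p[i], i = 1 .. N));   # should simplify to 1
```

The sum telescopes because each sin(phi[i])^2 term pairs with the cos(phi[i])^2 factor of the preceding terms.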
When I now evaluate the target function at some random point, I get a value between 0 and 1, as it should be:
some_point := [seq(RandomTools[Generate](float(range = 0 .. evalf(2*Pi), method = uniform)), i = 1 .. 9)]:
objective(op(some_point));
But when I try a (local) optimization, Maple always tells me that I am already sitting at a local optimum ("Warning, no iterations performed as initial point satisfies first-order conditions"):
Optimization[Maximize](objective, seq(0..2*Pi, i=1..9), initialpoint=some_point);
This seems rather strange and unlikely to me. In particular, when I use fdiff to estimate the gradient, it is clearly not vanishing:
seq(fdiff(objective, [i], some_point), i = 1 .. 9);
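If that gradient really is nonzero, then a single hand-rolled ascent step should already beat the starting value. Here is a minimal sketch I use to convince myself the objective is not flat at some_point (the step size 0.05 is an arbitrary choice of mine):

```
# one crude gradient-ascent step as a cross-check (ad-hoc step size)
g := [seq(fdiff(objective, [i], some_point), i = 1 .. 9)]:
new_point := [seq(some_point[i] + 0.05*g[i], i = 1 .. 9)]:
objective(op(new_point)) - objective(op(some_point));   # should be positive for a small enough step
```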
So my question is: why does the Optimization package refuse to do any work here? I would like to avoid programming my own simple conjugate-gradient method, but it seems I may have to!? Does somebody see what I'm doing wrong here? Thanks in advance!