Items tagged with directsearch (Tagged Items Feed)

Hi,

The DirectSearch answer has confused me. How can I reduce the residual?

See the attached worksheet Direct.mw (Maple 15):

https://drive.google.com/file/d/0B8F2D27rfQWgVXE1alN0V3JWU1U/edit?usp=sharing

There are 3 equations to be minimized,

and I limit x between x - 5 and x + 5 as constraints.

Although f1 gives an error in the first line of commands, I type a correct command for f1 later in the script.

Hello,

Please compare the results of DirectSearch and implicitplot: which of them is correct?

Please help me.

resul.mw
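Without the worksheet's actual equations, only the general approach can be sketched. The snippet below is my own toy illustration (the equation and the point are placeholders, not the content of resul.mw): plot the root reported by DirectSearch on top of the implicitplot curve and check whether it lies on the curve.

with(plots):
eq := x^2 + y^2 = 2:                 # toy equation standing in for the one in resul.mw
pt := [1.0, 1.0]:                    # stand-in for a root reported by DirectSearch
display(implicitplot(eq, x = -2 .. 2, y = -2 .. 2),
        pointplot([pt], symbol = solidcircle, symbolsize = 15));
# if the point does not sit on the implicitplot curve, one of the two results is suspect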

 

 

How can I trust a DirectSearch solution result? Is there any criterion?

My variable is intensity.

This is my code:

ep0 := 1/(4*3.14);  el := 8.54*10^(-2);  hbar := 1;  vf := 1/300;  kb := 1;
tem := 2.586*10^(-2);  ci := 1;  p := 1.458*10^16;  beta := 2;  ai := 7.1*10^(-4);
bi := ai/sqrt(3);  enph := .196;  d := enph/(kb*tem);  n0 := 1/(exp(enph/(kb*tem))-1);
gama := hbar*vf;  intensity := 10000001;  w := 1.55;  impurity := 7.2*10^3;

g := hbar*beta/(bi^2*sqrt(2*p*enph));  aa := g^2*(n0+1)/(2*Pi*hbar*gama^2);
bb := g^2*n0/(2*Pi*hbar*gama^2);  cc := 2/(Pi*gama^2);
l := (1*hbar)*w/(2*kb*tem);  u := el^2*intensity/(32*w*hbar^2);

 

DirectSearch:-SolveEquations([op([((enph*ln(1+exp(c+enph/(kb*tem)))/(kb*tem)-polylog(2, -exp(c))+polylog(2, -exp(c+enph/(kb*tem))))*enph*(kb*tem)^2-(enph^2*ln(1+exp(c+enph/(kb*tem)))/(kb^2*tem^2)+2*enph*polylog(2, -exp(c+enph/(kb*tem)))/(kb*tem)+2*polylog(3, -exp(c))-2*polylog(3, -exp(c+enph/(kb*tem))))*(kb*tem)^3+(-exp(b)*enph*ln(1+exp(c+enph/(kb*tem)))+exp(c+d)*enph*ln(1+exp(b-d+enph/(kb*tem)))+exp(b)*kb*tem*polylog(2, -exp(c))-exp(c+d)*kb*tem*polylog(2, -exp(b-d))-exp(b)*kb*tem*polylog(2, -exp(c+enph/(kb*tem)))+exp(c+d)*kb*tem*polylog(2, -exp(b-d+enph/(kb*tem))))*enph*(kb*tem)^2/((exp(b)-exp(c+d))*kb*tem)+(exp(b)*enph^2*ln(1+exp(c+enph/(kb*tem)))-exp(c+d)*enph^2*ln(1+exp(b-d+enph/(kb*tem)))+2*exp(b)*enph*kb*tem*polylog(2, -exp(c+enph/(kb*tem)))-2*exp(c+d)*enph*kb*tem*polylog(2, -exp(b-d+enph/(kb*tem)))+2*exp(b)*kb^2*tem^2*polylog(3, -exp(c))-2*exp(c+d)*kb^2*tem^2*polylog(3, -exp(b-d))-2*exp(b)*kb^2*tem^2*polylog(3, -exp(c+enph/(kb*tem)))+2*exp(c+d)*kb^2*tem^2*polylog(3, -exp(b-d+enph/(kb*tem))))*(kb*tem)^3/((exp(b)-exp(c+d))*kb^2*tem^2))*bb+u*(1/(1+exp(-l-c))-1/((1+exp(-l-c))*(1+exp(l-b))))-(((1*enph)*(enph-2*kb*tem*ln(1+exp(-b+enph/(kb*tem))))/(2*kb^2*tem^2)+2*kb^2*tem^2*(-polylog(2, -exp(-b+enph/(kb*tem)))+polylog(2, -cosh(b)+sinh(b))))*enph*(kb*tem)^2-(enph^2*(enph-3*kb*tem*ln(1+exp(-b+enph/(kb*tem))))-6*kb^2*tem^2*(enph*polylog(2, -exp(-b+enph/(kb*tem)))+kb*tem*(-polylog(3, -exp(-b+enph/(kb*tem)))+polylog(3, -cosh(b)+sinh(b)))))*(kb*tem)^3/(3*kb^3*tem^3)-(-exp(b)*enph^2+exp(c+d)*enph^2-2*exp(c+d)*enph*kb*tem*ln(1+exp(-b+enph/(kb*tem)))+2*exp(b)*enph*kb*tem*ln(1+exp(-c-d+enph/(kb*tem)))+2*exp(c+d)*kb^2*tem^2*polylog(2, -exp(-b))-2*exp(b)*kb^2*tem^2*polylog(2, -exp(-c-d))-2*exp(c+d)*kb^2*tem^2*polylog(2, -exp(-b+enph/(kb*tem)))+2*exp(b)*kb^2*tem^2*polylog(2, -exp(-c-d+enph/(kb*tem))))*enph*(kb*tem)^2/((2*(-exp(b)+exp(c+d)))*kb^2*tem^2)-(exp(b)*enph^3-exp(c+d)*enph^3+3*exp(c+d)*enph^2*kb*tem*ln(1+exp(-b+enph/(kb*tem)))-3*exp(b)*enph^2*kb*tem*ln(1+exp(-c-d+enph/(kb*tem)))+6*exp(c+d)*enph*kb^2*tem^2*polylog(2, -exp(-b+enph/(kb*tem)))-6*exp(b)*enph*kb^2*tem^2*polylog(2, -exp(-c-d+enph/(kb*tem)))+6*exp(c+d)*kb^3*tem^3*polylog(3, -exp(-b))-6*exp(b)*kb^3*tem^3*polylog(3, -exp(-c-d))-6*exp(c+d)*kb^3*tem^3*polylog(3, -exp(-b+enph/(kb*tem)))+6*exp(b)*kb^3*tem^3*polylog(3, -exp(-c-d+enph/(kb*tem))))*(kb*tem)^3/((3*(-exp(b)+exp(c+d)))*kb^3*tem^3))*aa-u*(1/(1+exp(l-b))-1/((1+exp(-l-c))*(1+exp(l-b)))) = 0, -cc*polylog(2, -exp(b))+cc*polylog(2, -exp(-c))-impurity = 0])], tolerances = 10^(-8), evaluationlimit = 20000)
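One rough sanity check (a sketch of my own, not part of the question, assuming the list of equations above has been assigned to a name such as eqs, and that the third entry of the SolveEquations result holds the variable values, as in the output quoted in the next question): substitute the returned values back into the equations and look at the size of the residuals.

res := DirectSearch:-SolveEquations(eqs, tolerances = 10^(-8), evaluationlimit = 20000):
# back-substitute the reported unknowns into the left-hand sides of the equations
residuals := eval(map(e -> lhs(e) - rhs(e), eqs), res[3]);
# residuals that are tiny relative to the individual terms suggest the result can be trusted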

 

Hello,

1. The DirectSearch result looks like this:

[0., [0.], [x = -.400000000000000], 11]

x = -0.4 is the answer of SolveEquations (the code is in the second question); please interpret the other terms.

2. How can I save only x?

This is my code:

restart;

a := Matrix([1, 2, 3, 4, 5]);

for k from 1 by 1 to 5 do
    z := DirectSearch:-SolveEquations(a(1, k)*x + 2 = 0)   # assign each result to z
end do
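For what it is worth, here is a minimal sketch of my own (not the asker's code), assuming the third entry of the SolveEquations result holds the variable values, as in the output quoted above: the solved value of x can be extracted with eval and collected in a Vector.

xvals := Vector(5):
for k from 1 to 5 do
    res := DirectSearch:-SolveEquations(a(1, k)*x + 2 = 0);
    xvals[k] := eval(x, res[3]);   # keep only the value of x from each solution
end do:
xvals;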

 

The DirectSearch package is a powerful Maple tool. However, every piece of software has its advantages and disadvantages. In particular, DirectSearch has problems in the case of a thin feasible set in higher dimensions. I recently detected a serious bug in it: when solving an optimization problem, DirectSearch produces the error message

Warning, initial point [x1 = 1., x2 = 1., x4 = 2., y1 = 2., y2 = 3., y4 = 2.] does not satisfy the inequality constraints; trying to find a feasible initial point
Error, (in DirectSearch:-Search) cannot find feasible initial point; specify a new one
even though that initial point does satisfy the constraints.

 

restart

DirectSearch:-Search(((x2-x1)^2+(y2-y1)^2)*((x4-x1)^2+(y4-y1)^2), {seq(parse(y || j) >= -(2/3)*parse(x || j)+2, j = 1 .. 4), seq(parse(y || j) >= (1/2)*parse(x || j)-3/2, j = 1 .. 4), seq(parse(y || j) <= 4, j = 1 .. 4), seq(parse(y || j) <= -3*parse(x || j)+16, j = 1 .. 4), seq(parse(y || j) <= 2*parse(x || j)+2, j = 1 .. 4), (x2-x1)*(x4-x1)+(y2-y1)*(y4-y1) = 0, (x3-x2)*(x2-x1)+(y3-y2)*(y2-y1) = 0, (x4-x1)*(x4-x3)+(y4-y1)*(y4-y3) = 0, (x4-x3)*(x3-x2)+(y4-y3)*(y3-y2) = 0}, maximize, initialpoint = [x1 = 1, x2 = 1, x3 = 2, x4 = 2, y1 = 2, y2 = 3, y3 = 3, y4 = 2])

Error, (in DirectSearch:-Search) cannot find feasible initial point; specify a new one

 

eval({seq(parse(y || j) >= -(2/3)*parse(x || j)+2, j = 1 .. 4), seq(parse(y || j) >= (1/2)*parse(x || j)-3/2, j = 1 .. 4), seq(parse(y || j) <= 4, j = 1 .. 4), seq(parse(y || j) <= -3*parse(x || j)+16, j = 1 .. 4), seq(parse(y || j) <= 2*parse(x || j)+2, j = 1 .. 4), (x2-x1)*(x4-x1)+(y2-y1)*(y4-y1) = 0, (x3-x2)*(x2-x1)+(y3-y2)*(y2-y1) = 0, (x4-x1)*(x4-x3)+(y4-y1)*(y4-y3) = 0, (x4-x3)*(x3-x2)+(y4-y3)*(y3-y2) = 0}, [x1 = 1, x2 = 1, x3 = 2, x4 = 2, y1 = 2, y2 = 3, y3 = 3, y4 = 2])

{0 = 0, -1 <= 2, -1 <= 3, 2 <= 4, 2 <= 6, 2 <= 10, 2 <= 13, 3 <= 4, 3 <= 6, 3 <= 10, 3 <= 13, -1/2 <= 2, -1/2 <= 3, 2/3 <= 2, 2/3 <= 3, 4/3 <= 2, 4/3 <= 3}


 

Download opti.mw

Hi,

Previously I got some great help from Markiyan Hirnyk, who introduced me to the DirectSearch package. I am having a little trouble applying it to this function:

y := proc (E) options operator, arrow; -_C4*MathieuS(-a, -q, E)*(Int(MathieuC(-a, -q, E)*(-a+2*q*cos(2*E)), E))+_C4*(Int(MathieuS(-a, -q, E)*(-a+2*q*cos(2*E)), E))*MathieuC(-a, -q, E)-_C2*MathieuC(-a, -q, E)-_C3*MathieuS(-a, -q, E)-_C4*MathieuS(-a, -q, E)*MathieuCPrime(-a, -q, E)+_C4*MathieuSPrime(-a, -q, E...

Hi,

I'm using the DirectSearch package in a 10-period model, and in the first period I get these values:

> DirectSearch[SolveEquations](sys, assume = positive);
Warning, complex or non-numeric value encountered; trying to find a feasible point
[HFloat(1.1842542076623546e-32),

Vector[column](%id = 18446744078126621390), [

x1a = HFloat(4204.651582462925),

x1c = HFloat(4204.651582462925),

I'm using the DirectSearch package to solve the following system of equations (in order to find x1a, x1c, x2a, x2c):

How can I limit the solutions to only positive values of x1a, x2a, x1c, x2c? (Currently, I'm just using

Thanks

Gil

Hi,

how can I install the DirectSearch optimization package on the Mac version of Maple 16?

I have the files from Maple 15.

Gil

Here is a short wrapper which automates repeated calls to the DirectSearch 2 curve-fitting routine. It offers both time and repetition (solver restart) limits.

The global optimization package DirectSearch 2 (available from the Maplesoft Application Center) has some very nice features. One aspect which I really like is that it can do curve-fitting: fitting an expression to tabular data. By this, I mean that it can find optimal values of parameters present in an expression (formula) such that the residual error between that formula and the tabular data is minimized.

Maple itself has commands in the CurveFitting and Statistics packages for data regression, such as NonlinearFit. But those use local optimization solvers, and in the nonlinear case one quite often needs a global optimizer to produce a good fit: the nonlinear problem may have local extrema which are nowhere near globally optimal and do not provide a close fit.

Maplesoft offers the (commercially available) GlobalOptimization package as an add-on to Maple, but its solvers are not hooked into the curve-fitting commands mentioned above. One has to set up the proper residual-based objective function oneself in order to use it for curve-fitting, and some of the bells and whistles may be harder to reproduce.
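For instance, here is a minimal sketch of my own (not part of the original post) of building such a residual-based objective by hand, assuming X and Y are the data Vectors and F the formula used in the example further below:

F := a*cosh(b*x^c*sin(d*x^e)):
n := LinearAlgebra:-Dimension(X):    # number of data points
# sum of squared residuals: an expression in the parameters a, b, c, d, e only
obj := add( (eval(F, x = X[i]) - Y[i])^2, i = 1 .. n ):
# obj can now be handed, together with ranges for a, b, c, d, e, to a global minimizer

This is the kind of setup that a dedicated curve-fitting command can take care of automatically.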

So this is why I really like the fact that the DirectSearch 2 package has its own exported commands to do curve-fitting, integrated with its global solvers.

But as the DirectSearch package's author mentions, the fitting routine may sometimes exit too early. Repeated starts of the solver, for the very same parameter ranges, can produce varying results due to randomization steps performed by the solver. This post is branched off from another thread which involved such a problematic example.

Global optimization is often a dark art. Sometimes one may wish to simply have the engine work for 24 hours, and produce whatever best result it can. That's the basic enhancement this wrapper offers.

Here is the wrapper, and a few illustrative calls to it on the mentioned curve-fitting example, showing informative progress-status messages, etc. I've tried to make the wrapper fairly generic, so it could be reused for other similar purposes.

Other improvements are possible, but they might make it less generic. A target option is possible, where attainment of the target would cause an immediate stop (a standalone sketch of that idea follows below). The wrapper could also be made into an appliable module, with the running best result stored in a module local so that any error (and ensuing halt) would not wipe out the best result from potentially hours and hours' worth of computation.
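As an illustration only (this is my own standalone sketch, not the wrapper below), a target option amounts to checking the quality of the running best result after every restart and stopping as soon as it is good enough:

run_until := proc( funccall::uneval, target::numeric, {maxiter::posint:=10} )
    local best, current, i;
    best := [infinity];
    for i to maxiter do
        current := eval(funccall);              # rerun the (randomized) solver call
        if evalf(current[1]) < evalf(best[1]) then
            best := current;                    # keep the running best result
        end if;
        if evalf(best[1]) <= target then
            break;                              # requested quality reached: stop early
        end if;
    end do;
    best;
end proc:

For example, run_until(DirectSearch:-DataFit(F, [a=0..10, b=-10..10, c=0..100, d=0..7, e=0..4], X, Y, x), 1e-4) would stop restarting as soon as a residual of 1e-4 or better is found.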

restart:
randomize():

repeater:=proc(  funccall::uneval                  # unevaluated solver call; re-run on every iteration
               , {maxtime::numeric:=60}            # overall time budget, in seconds
               , {maxiter::posint:=10}             # maximum number of solver restarts
               , {access::appliable:=proc(a) SFloat(a[1]); end proc}   # extracts the quality measure from a result
               , {initial::anything:=[infinity]}   # sentinel value meaning "no result yet"
              )
          local best, current, elapsed, i, starttime;
            starttime:=time[real]();
            elapsed:=time[real]()-starttime;
            i:=1; best:=initial;                   # start from the sentinel so total failure can be detected
            while elapsed<maxtime and i<=maxiter do
              userinfo(2,repeater,`iteration `,i);
              try
                # re-run the solver, but never beyond the remaining time budget
                timelimit(maxtime-elapsed,assign('current',eval(funccall)));
              catch "time expired":
              end try;
              if is(access(current)<access(best)) then
                best:=current;                     # keep the running best result
                userinfo(1,repeater,`new best `,access(best));
              end if;
              i:=i+1;
              elapsed:=time[real]()-starttime;
              userinfo(2,repeater,`elapsed time `,elapsed);
            end do;
            if best<>initial then
              return best;
            else
              error "time limit exceeded during first attempt";
            end if;
          end proc:


X := Vector([seq(.1*j, j = 0 .. 16), 1.65], datatype = float): 

Y := Vector([2.61, 2.62, 2.62, 2.62, 2.63, 2.63, 2.74, 2.98, 3.66,
             5.04, 7.52, 10.74, 12.62, 10.17, 5, 2.64, 11.5, 35.4],
            datatype = float):

F := a*cosh(b*x^c*sin(d*x^e));

                                    /   c    /   e\\
                         F := a cosh\b x  sin\d x //

infolevel[repeater]:=2: # or 1, or not at all (i.e. 0)
interface(warnlevel=0): # disable warnings; remove this line if you want to see them

repeater(DirectSearch:-DataFit(F
                      , [a=0..10, b=-10..10, c=0..100, d=0..7, e=0..4]
                      , X, Y, x
                      , strategy=globalsearch
                      , evaluationlimit=30000
                              ));
repeater: iteration  1
repeater: new best  9.81701944539358706
repeater: elapsed time  15.884
repeater: iteration  2
repeater: new best  2.30718902535293857
repeater: elapsed time  22.354
repeater: iteration  3
repeater: new best  0.627585701120743822e-4
repeater: elapsed time  30.777
repeater: iteration  4
repeater: elapsed time  47.959
repeater: iteration  5
repeater: new best  0.627585700905294148e-4
repeater: elapsed time  55.221
repeater: iteration  6
repeater: elapsed time  60.009
 [0.0000627585700905294, [a = 2.61748237902808, b = 1.71949329097179, 

   c = 2.30924401405164, d = 1.50333106110324, e = 1.84597267458055], 4333]


# without userinfo messages printed
infolevel[repeater]:=0:
repeater(DirectSearch:-DataFit(F
                      , [a=0..10, b=-10..10, c=0..100, d=0..7, e=0..4]
                      , X, Y, x
                      , strategy=globalsearch
                      , evaluationlimit=30000
                              ));

 [0.0000627585701341043, [a = 2.61748226209478, b = 1.71949332125427, 

   c = 2.30924369227236, d = 1.50333090706676, e = 1.84597294290477], 6050]


# illustrating early timeout
infolevel[repeater]:=2:
repeater(DirectSearch:-DataFit(F
                      , [a=0..10, b=-10..10, c=0..100, d=0..7, e=0..4]
                      , X, Y, x
                      , strategy=globalsearch
                      , evaluationlimit=30000
                              ),
         maxtime=2);

repeater: iteration  1
repeater: elapsed time  2.002
Error, (in repeater) time limit exceeded during first attempt

# illustrating iteration limit cutoff
infolevel[repeater]:=2:
repeater(DirectSearch:-DataFit(F
                      , [a=0..10, b=-10..10, c=0..100, d=0..7, e=0..4]
                      , X, Y, x
                      , strategy=globalsearch
                      , evaluationlimit=30000
                              ),
         maxiter=1);

repeater: iteration  1
repeater: new best  5.68594272127419575
repeater: elapsed time  7.084
 [5.68594272127420, [a = 3.51723075672918, b = -1.48456068506828, 

   c = 1.60544055207338, d = 6.99999999983179, e = 3.72070034285212], 2793]


# giving it a large total time limit, with reduced userinfo messages
infolevel[repeater]:=1:
Digits:=15:
repeater(DirectSearch:-DataFit(F
                      , [a=0..10, b=-10..10, c=0..100, d=0..7, e=0..4]
                      , X, Y, x
                      , strategy=globalsearch
                      , evaluationlimit=30000
                              ),
         maxtime=2000, maxiter=1000);

repeater: new best  3.10971990123465947
repeater: new best  0.627585701270853103e-4
repeater: new best  0.627585700896181428e-4
repeater: new best  0.627585700896051324e-4
repeater: new best  0.627585700895833535e-4
repeater: new best  0.627585700895607885e-4
 [0.0000627585700895608, [a = 2.61748239185387, b = -1.71949328487160, 

   c = 2.30924398692221, d = 1.50333104262348, e = 1.84597270535142], 6502]

Hi,

how can I install the DirectSearch optimization package on the Mac version of Maple 15?

Gil

How do I run the Maple worksheet "Alkylation Process Model Using DirectSearch"?

(http://www.maplesoft.com/applications/view.aspx?SID=1675)

Is there some equivalence between the GlobalOptimization commands GetLastSolution and GlobalSolve and the DirectSearch commands?

Thanks

I have been using the DirectSearch package contributed by Dr. Sergey N. Moiseev. It works very well for the problems I have been working on, and I would like to use it in Matlab. Is anyone aware of a Matlab version, of a way to convert this to a Matlab file, or of a way to call Maple from Matlab?

 

Please see the solution output in blue; there are 7 answers separated by commas.

Can you please help me:

1 - What does "0.0000591392596494206" stand for?

2 - What does "the matrix 4 rows by 1 column" stand for?

and

3 - What does "460" stand for?

 

http://www...
