CygnusX1

10 Reputation

One Badge

2 years, 226 days

MaplePrimes Activity


These are replies submitted by CygnusX1

@acer 

I've got your code running pretty well in my application; I'll post it back here, probably later tonight.

But is there an easy way to feed a list of starting values into the DirectSearch optimizer, akin to NLPSolve's "initialpoint" option?  I did play with the initialpoints option under GlobalOptima(), but it did not appear to be as simple as just handing it my list of N initial points.
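
For reference, this is the NLPSolve behavior I'm hoping to mimic on the DirectSearch side: a start point handed in through initialpoint (the toy objective and the names x and y below are placeholders, not from my worksheet); what I'd like is a way to hand GlobalOptima my N candidate points in a similarly direct way.

with(Optimization):
# NLPSolve takes a single starting point as a list of name = value equations.
NLPSolve((x - 1)^2 + (y - 2)^2, x = -5 .. 5, y = -5 .. 5,
         initialpoint = [x = 0.5, y = 0.5]);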

@acer 

Thank you again for all of your help on this.  I was called away from this yesterday, but got back on it this morning, and have been using your work and improving my end of things.  I automated the definition of limits and the start point, and next I'll add an iterative process that redefines those limits and restarts the loop as it moves toward a solution.
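
To give an idea of what I mean by automating the limits and start point, this is roughly the pattern (the 0.8 and 1.25 factors and the names Zbest, ranges, startpt are placeholders, not the actual values in my worksheet):

# Center each variable's allowed range on its current best value Zbest[n],
# and use those same values as the start point; tightening the factors and
# re-running is the iterative restart mentioned above.
ranges  := seq(Zmv[n] = 0.8*Zbest[n] .. 1.25*Zbest[n], n = 1 .. N):
startpt := [seq(Zmv[n] = Zbest[n], n = 1 .. N)]: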

I came across GlobalOptima, and I'm wondering if there's any advantage in using it over GlobalSearch?

@acer

Well, Maple 14 is better.  It gets past the unapply statement with no issue.  Your bandpeak procedure works just fine in this version.  Now I'm seeing the "Warning, no iterations performed as initial point satisfies first-order conditions" response at the final nested optimization, in the Cac.mw file.  I'd be surprised if that's true, but as you noted, with so many variables this problem is likely not well-suited to local-only optimization.

The DirectSearch method appears to be working, albeit slowly.  It's 1200 seconds in, and showing activity, but no result yet.  These problems usually run in a few minutes using various algorithms (Pointer Robust or Trust Region Framework), so I didn't expect it to run in seconds.

Am I taking a slower-than-necessary approach to the inner loop by using optimization?  Maybe it would be quicker to digitize it into a few hundred points and just find the maximum point?  I'd sacrifice some accuracy for speed, at least in the initial global optimization.
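
Roughly, the sampling idea I have in mind is something like this (Gam stands for Gamma with the outer variables already substituted in, using Zstub = Z[2] as a stand-in, and 300 points is an arbitrary choice):

# Coarse alternative to the inner NLPSolve: evaluate |Gamma| on a grid of
# frequencies and take the largest sample as the band peak.
Gam  := subs(Zstub = Z[2], Gamma):
npts := 300:
peak := max(seq(evalf(eval(Gam, f = f1 + (f2 - f1)*k/npts)), k = 0 .. npts));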

@acer 

It appears I'm still having an issue with your unapply statement, although the specific error has changed now to "Error, (in unapply) incorrect number of arguments to _Inert_AND".  I suppose this may be a bug or limitation in Maple 12.  Let me see if I can get a newer version installed on this machine; I think my purchase of Maple 14 included two separate installation codes, only one of which I ever used.  Perhaps it will run there.

Unfortunately, any of the newer versions I purchased are already locked to other machines which I cannot access from my present location.

@acer Thank you!  I will spend some time with this later today, and see if I can understand everything you've done.  I really appreciate the help.

I may also need to look into buying another newer license for Maple, and look at adding DirectSearch.  I have a circuit simulator that I have traditionally used for this task, since a more global approach is often needed, but using it for this removes it from use for other requirements.  This is why I was hoping to move this optimization into Maple, if it can be made to work.

Thank you, again!

There's a chance I might be able to get to a machine with v.14, but likely nothing later than that, from where I am right now.  I've continued playing on my own, including going back to the original nested approach, but without any success.  Updated worksheet below; I think I'm facing two problems superimposed on one another:

1. My own abilities within Maple

2. A possible issue with NLPSolve() in this version of Maple.

transformer_with_short_(Primes).mw

@acer 

Sorry about that, acer.  I thought I was actually making it easier for you by not uploading a fresh doc, but just posting the additions.  In any case, here's the new worksheet.

transformer_with_short_(Primes).mw

Yes, I'm running v.12 on this machine.  I have newer versions, but they are not as easily accessible from where I am working now.  For what I normally do, I haven't noticed any difference from v.12 through v.2019, so I tend to float between versions without much notice.

I had removed numeric from the unapply statement because it was throwing "Error, (in unapply) 'numeric' option must only include names in the input parameters for the generated procedure".  I didn't really understand that, since I thought the input parameters were only names for the generated procedure, so I removed the option to get past that step in the code.

@acer 

I'm coming to the conclusion that there is something very basic I am missing, WRT procedures or NLPSolve.  In setting up the multivariate optimization, it works fine when I code it as a simple one-line NLPSolve(), but fails when I move the exact same code into your proc format.  I am not sure why.

First, I re-created Gamma using variables for all of Z[1]..Z[N] (rather than their assigned constants), as follows:

Zin := ZSp:
Zin_save := ZSp:
for i from 1 to N do
  Zin := simplify(Zmv[i]*(Zin_save+I*Zmv[i]*tan(E))/(Zmv[i]+I*Zin_save*tan(E))):
  Zin_save := Zin: # used for iteration when just plugging Zin back in doesn't work
#  print(evalf(subs(Zstub=Z[2],f=f1,Zin)));
end do:
#Zin;
Gamma := abs((Zin-ZL)/(Zin+ZL)):
#Gammaproc := unapply(Gamma,[f,Zstub,seq(Zmv[n],n=1..N)],numeric):
Gammaproc := unapply(Gamma,[f,Zstub,seq(Zmv[n],n=1..N)]):

If I follow this with a simple NLPSolve, whether maximize or minimize, it works just fine:

Zstub_start := Z[5];
plot(Gammaproc(f,Zstub_start,seq(Z[n],n=1..N)),f=f1..f2);
NLPSolve(Gammaproc(f,Zstub_start,seq(Z[n],n=1..N)),f=f1..f2);
NLPSolve(Gammaproc(f,Zstub_start,seq(Z[n],n=1..N)),f=f1..f2,maximize);

But if I try to place it into the procedure, it fails, no matter the arrangement:

vars := ps,seq(p||s,s=1..N); 
bandpeak := proc(vars)
  local f, sol:
#  if not p::numeric then return 'procname'(p); end if;
#  sol := NLPSolve(Gammaproc(f,ps,seq(p||s,s=1..N)),f=f1..f2,
#                  'evaluationlimit'=100,
#                  #'method'=':-branchandbound',
#                  #'method'=':-nonlinearsimplex',
#                  'maximize'
#                  );
  sol := NLPSolve(Gammaproc(f,ps,seq(p||s,s=1..N)),f=f1..f2,maximize);
  if sol[1]::numeric then
    return sol[1];
  else
    return Float(undefined);
  end if;
end proc:

I'm not sure what I'm doing wrong.  I'm not used to coding procedures in Maple; it's just not how I've used it most in the past.

@acer 

Well, I got your code digested, copied, and working in my worksheet.  However, it only made me realize something I should have seen before: bandpeak() will always decrease with increasing Zstub.  Oops.

I know from experience that Zstub will typically land somewhere in the range Z[2]..Z[N], more often biased toward the low end of that range.  I guess I'm best off just starting with Zstub = Z[2] as the initial value and jumping directly to the multivariate optimization of all Z[1]..Z[N].  That will be fun.

Thanks again!

@acer 

Thank you!  It will take me some time to digest this.  But just a quick note to say that the maxima are only occurring at the f1 & f2 endpoints in this first simple case of adding Zstub to the existing array Z[1]..Z[N].  After using the above to find a starting point for Zstub, I will need to unassign Z[1]..Z[N] and do a multivariate optimization on all of them (plus Zstub) to find the optimum solution.  In that case, the maxima can occur anywhere in the range f1..f2.
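
For the record, the bookkeeping I have in mind for that step is roughly this (Zcheb and startZ are just placeholder names of mine):

Zcheb  := [seq(Z[n], n = 1 .. N)]:               # keep a copy of the Chebyshev values
unassign('Z'):                                   # Z[1]..Z[N] are free variables again
startZ := [seq(Z[n] = Zcheb[n], n = 1 .. N)]:    # start point for the multivariate run
# Zstub's start value comes from the univariate step above.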

Thanks!

@one man 

Oh, definitely.  The zeros of the derivative in the f-domain will typically be the maxima, so that internal loop could be handled by finding the dG/df zero corresponding to the maximum Gamma, if that is indeed quicker.
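
As a rough sketch of what I mean, working with |Gamma|^2 via evalc so the derivative is well-defined (Gam, G2, and froot are placeholder names, and fsolve as written only hunts for one stationary point in the band):

# Find a stationary point of |Gamma|^2 in f1..f2 instead of calling NLPSolve.
Gam   := subs(Zstub = Z[2], Gamma):
G2    := evalc(Gam^2):                      # |Gamma|^2 with all names treated as real
froot := fsolve(diff(G2, f) = 0, f = f1 .. f2):
evalf(eval(Gam, f = froot));                # Gamma at that stationary point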

But the outer / secondary optimization is required to be a true optimization, as this univariate example is only the simpler first part of the problem.  After using it to find a reasonable starting point for Zstub, I plan to go back and unassign Z[1]..Z[N], and run a full multivariate optimization on |Gamma| to minimize it in the range f1..f2.  The optimum case is usually equi-ripple, as was the original Chebyshev transformation before adding Zstub in the section titled "Optimization".

Hopefully using the example provided here, I'll be able to figure out that more complex problem... we will see!

Thanks

Noting that my original internal loop was working without method=nonlinearsimplex, I initially tried the following:

bandpeak := proc(p)
  local Gam, f, sol;
  Gam := eval(Gamma, Zstub = p);         # substitute the trial Zstub value into Gamma
  printf("Gam = %a\n", Gam);             # debug: show the objective being passed in
  sol := NLPSolve(Gam, f = f1 .. f2, maximize);
  return sol[1];
end proc;

But it failed on "Error, (in Optimization:-NLPSolve) abs is not differentiable at non-real arguments".

So I added the method=nonlinearsimplex, but now it fails on "Error, (in Optimization:-NLPSolve) method=nonlinearsimplex option is available for unconstrained unbounded problems only".
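
Just to make sure I understand that restriction, with a toy objective (nothing from my worksheet): nonlinearsimplex seems fine when no bounds or constraints are given, and the f = f1 .. f2 range is what makes my call a bounded problem.

# Unconstrained, unbounded: nonlinearsimplex is accepted.
NLPSolve((x - 1)^2 + (y - 2)^2, initialpoint = [x = 0, y = 0],
         method = nonlinearsimplex);
# Adding a bound such as x = 0 .. 5 turns it into a bounded problem,
# which is rejected with the same error as above.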

I'm not sure why the original NLPSolve(subs(Zstub=Z[2],Gamma),f=f1..f2,maximize); works but the code above returns an error, or whether there's a better method I should be choosing for this application.

Thanks!

Thank you for your reply.  Here's a copy of the file, as cleaned up as I could make it:

transformer_with_short_(Primes).mw

@tomleslie 

I will try working your code into the bottom of this worksheet and see if I can get it running.  I've really had no coursework in programming, so my skills there are pretty novice; I'm pretty much self-taught on Maple.

Thanks!
