957 Reputation

10 Badges

14 years, 305 days
University of Kent


MaplePrimes Activity

These are replies submitted by casperyc

@itsme Yes, mpl is quite useful, but it may not be what I am expecting. As @Alejandro Jakubi noted, I need more complicated/long 2D output, which may not be doable at the moment.

I am quite surprised that Maple does not have a command to save as PDF; it can only be done by mouse click.

It would be handy to have a "save" command to save the worksheet, like

>save ""

or something like

>save "myMaple.PDF"



@Markiyan Hirnyk 

Here is the "general" expression for the problem. You can trust that the target function I have given in the example is the "simplest" form, for a real 8-year dataset. The target function is definitely correct.

The "real" problem is that it does not follow a "named" distribution, so there is NO "density", I think.


It would be great to have it. But if not, you could try this:


What I am doing now is that, instead of the "Code Edit Region" in Maple itself, I have decided to use Notepad++.

I write in Notepad++, which has syntax highlighting.

Then, in the same directory, I create a Maple worksheet and ask Maple to

> read "the_maple_code";


It has been working quite well for me for managing my "long" and "large" pieces of code.



@Alejandro Jakubi 

I believe I need the GUI features, such as the output from

and I will need the 2D math output.


Is there a way to achieve this, then? Batch-run worksheets (one by one) in the GUI and export them (with the results) to PDF?

But I don't need to "see" the results while it is running (if that helps). I only need to see the final results in the PDF.




@Alejandro Jakubi Thanks. I think this is the second bug I have discovered in a short period of time. There was one not long ago, here.

Yes, the elementwise operator "~" is very helpful in many places.

The "bug" does not bother me a lot, but is quit strange.


@Markiyan Hirnyk 

I am a research student on statistical models. The one I am looking at right now is a mixture, which can be found here,

but again, it does not have a specific "density or probability function", so there is no way to get the estimates symbolically. Further, the ability to estimate ALL the parameters also depends on the data. That's why the likelihood (the target function) is so "complicated".

There are a lot of details to be considered. The function I used as an example is a "small" polynomial. That's why, in practice, I can't spend too much time on the structure or maximize each one individually.


Thanks for your input!


@Axel Vogt 

What I am trying to say is that the target function is generated by another procedure, and my main interest is the maximum likelihood estimates. I can't spend too much time looking into each of them separately.

I hope to get something that works fast (maybe just under a minute for each optimization).



I keep using the previous result as the initial point, and I keep getting a bigger value of tarfun.



When you say "precision", I assume that it can be modified by "tolerances"? By default, it is 10^6 (from the help manual provided by the author of the package).

I understand that the target function value can get more accurate as I run it many, many times. But it has to end at some point, right? Say the max of the target function is -463ish?

For now, it keeps getting bigger and bigger. But is there a fast way to get to the "best" answer?
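The restart-from-the-previous-best loop described above is language-agnostic, so here is a minimal Python sketch of it. The target surface, the local search, and the tolerance are all hypothetical toys (not the real likelihood or the package's routine); the point is only the stopping rule: restart from the last best point until the improvement falls below a tolerance.

```python
# Toy illustration (not the actual target function): restart a local
# optimizer from its own previous best point until the objective stops
# improving by more than a tolerance.

def tarfun(x, y):
    # Hypothetical concave surface with a known maximum of -462 at (2, -1).
    return -462.0 - (x - 2.0) ** 2 - (y + 1.0) ** 2

def local_search(f, x, y, step=1.0, shrink=0.5, min_step=1e-8):
    """Crude coordinate hill-climb: try +/- step on each coordinate,
    keep any move that improves f, shrink the step when stuck."""
    best = f(x, y)
    while step > min_step:
        moved = False
        for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step)):
            cand = f(x + dx, y + dy)
            if cand > best:
                x, y, best, moved = x + dx, y + dy, cand, True
        if not moved:
            step *= shrink
    return x, y, best

x, y, val = 0.0, 0.0, tarfun(0.0, 0.0)
tol = 1e-6
while True:
    x, y, new_val = local_search(tarfun, x, y)   # restart from previous best
    if new_val - val < tol:                      # improvement has stalled
        break
    val = new_val

print(round(val, 3))   # close to the true maximum, -462.0
```

On a genuinely concave toy like this the loop stalls after one restart; on a multimodal likelihood it can keep creeping upward for many restarts, which matches the "keeps getting bigger" behaviour.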




@Axel Vogt 


Thanks for your input.

First of all, this is only a "small" and "simple" example of the actual problem. What I am doing is getting maximum likelihood estimates (MLEs) for a particular set of statistical models. In the end, it's an optimization problem. That's why I would like to get some standard errors (from the Hessian matrix) if possible.


Secondly, I am not criticising the package or the method that's used. I am only trying to find the best way to achieve my goal (getting the MLEs).

I followed your point about investigating the target function itself for this example. But in practice, I don't have the time to do that, and I don't think I have the knowledge either.

I think we can all agree that the optimization is a difficult task.


Thirdly, I don't trust Matlab either, not totally. I want to know whether it is possible to do the same task using Matlab's routine. If the answers agree, then I have some reassurance.

The problem is, again, that using the same "GlobalOptimal" function twice can give me a different result.

I further noticed that it comes down to the way I set the constraint. I should restrict w[1]=0..0.5.

Even so, for this example, I once got the target function to be -462ish.

So the end result may not be a global maximum.


Lastly, I do hope some clever mathematicians can improve this in the future. As we get more computing power, we can use more complicated models with more variables. Actually fitting the model is a real challenge.




@Markiyan Hirnyk Yes, I am currently investigating this.


Just as a quick random question: if I want to get "better" results by setting "tolerances", which function is it better to optimize, the original likelihood or the log likelihood?

It is only a "small and simple" example I have illustrated here.


And if you don't mind, check my top post again: with a small difference in the log likelihood, the estimate mu[p] can change a lot.


Any general suggestion on this?
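On the likelihood-versus-log-likelihood question, a hedged toy illustration (in Python, since the point is language-agnostic) of why the log likelihood is usually the safer one to hand to an optimizer: taking logs leaves the argmax unchanged, but the raw likelihood is a product of many small densities and underflows to zero, at which point tolerances have nothing to work with. The model, data, and grid below are all hypothetical:

```python
import math

# Hypothetical setup: estimate the mean mu of a Normal(mu, 1) sample by
# grid search. With many data points the raw likelihood (a product of
# densities) underflows to 0.0, while the log-likelihood (a sum) is fine.

data = [10.0] * 2000                      # toy sample, all at 10.0
grid = [i * 0.5 for i in range(0, 41)]    # candidate mu in [0, 20]

def density(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

likelihood     = [math.prod(density(x, mu) for x in data) for mu in grid]
log_likelihood = [sum(math.log(density(x, mu)) for x in data) for mu in grid]

print(max(likelihood))                                    # 0.0: underflow everywhere
best_mu = grid[log_likelihood.index(max(log_likelihood))]
print(best_mu)                                            # 10.0, the true mean
```

Here every raw-likelihood value underflows to 0.0, so its "argmax" is meaningless, while the log likelihood stays well scaled and recovers the true mean; the same scaling argument is why tolerance settings behave more predictably on the log scale.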


Many thanks for your comment!

@Markiyan Hirnyk Hi Markiyan, I have attached the document.

Basically, I am not sure which syntax is more "reliable". Using "Search" sometimes gives the "best" result, but "Global" can take a long time to end up with the "best" result.

If I run "Global" again, with its "best" result as the initial point, it can give a "better" result.

Using the above example, I reached "-462" (on the log scale of the target function) once.


So how reliable is this? Could there be a better way to optimize this?
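One way to think about the "Search"-versus-"Global" trade-off: a single local search is cheap but lands on whichever local maximum its starting point drains to, while a multistart strategy pays for several local searches and keeps the best one. A hypothetical Python sketch (toy bimodal objective and toy hill-climb, not the package's actual routines):

```python
# Toy illustration of why one local search is unreliable on a multimodal
# target while multistart is slower but more robust. The objective is
# hypothetical: local maximum -0.5 at x = -2, global maximum 0.0 at x = 1.

def f(x):
    return max(-(x - 1.0) ** 2, -((x + 2.0) ** 2 + 0.5))

def hill_climb(f, x, step=1.0, min_step=1e-7):
    """Simple 1-D local search: move +/- step while it improves, else shrink."""
    best = f(x)
    while step > min_step:
        moved = False
        for cand in (x + step, x - step):
            if f(cand) > best:
                x, best, moved = cand, f(cand), True
        if not moved:
            step *= 0.5
    return x, best

starts = [-3.0, -1.0, 0.0, 2.0]          # multistart: several initial points
results = [hill_climb(f, x0) for x0 in starts]

# A single run from a bad start gets stuck on the local maximum...
print(hill_climb(f, -3.0)[1])            # about -0.5
# ...while the best over all starts recovers the global maximum.
x_best, f_best = max(results, key=lambda r: r[1])
print(round(x_best, 4), round(f_best, 6))
```

This also explains the "-462 once" observation: a run that happens to start in the right basin finds the better optimum, and restarting from a previous best is effectively a cheap form of multistart.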



@Carl Love Yes, I am using its V2, here


But for some reason, it does not always give me a global maximum. If I run it several times, it can give me different results. And using log(targetfun) and targetfun can give me different answers as well.

I am trying to investigate it now.


@itsme Like I said, yes, in our "real" situations, either will work. The real world is just too complicated in that respect.

I had a similar situation: when a (self-written) function calls up various other functions, neither Grid nor Threads helps much.

In my case, I "create" a new procedure that only takes two inputs, so I can use Matrix() to call it in batch.

It's "faster" than a double loop over the two variables.


The double loop works, and Matrix() works; Grid just hangs and does nothing whatsoever. The double loop or Matrix() runs in under 5 minutes, while Grid can take up to 30 minutes with nothing happening.




@Carl Love First of all, I have to say that 'myfun' does call a lot of other functions and procedures and returns a single value. Does that mean I have to check every single one of them?

Just to be safe, say I assume it does not qualify for multithreading and use the Grid option instead: is that going to be less efficient? Broadly speaking, how much difference does it make?

All I know for sure is that my code runs using a double loop, and also as a single Matrix call over these two arguments. And the evaluations are all independent.

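The double-loop-versus-Matrix observation generalizes: because every evaluation of a two-argument function is independent, the double loop can be flattened into a single map over (a, b) pairs, which is exactly the shape a batch constructor or a parallel map consumes. A hedged Python sketch with a hypothetical stand-in for myfun (the thread pool here is only a stand-in for Grid-style workers, not Maple's API):

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

# Hypothetical stand-in for a procedure that takes two inputs and whose
# evaluations are all independent of one another.
def myfun(a, b):
    return a ** 2 + 3 * b

A = [1, 2, 3]
B = [10, 20]

# 1) Plain double loop over the two arguments.
double_loop = [[myfun(a, b) for b in B] for a in A]

# 2) The same work flattened into a single map over (a, b) pairs:
#    the shape that a batch constructor like Matrix(), or a parallel
#    map over workers, can consume directly.
pairs = list(product(A, B))
with ThreadPoolExecutor(max_workers=4) as pool:
    flat = list(pool.map(lambda p: myfun(*p), pairs))

# Reshape the flat results back into the grid and compare.
grid = [flat[i * len(B):(i + 1) * len(B)] for i in range(len(A))]
print(grid == double_loop)   # True: the two traversals agree
```

The flattening only works because the evaluations share no state; whether process-style workers (the Grid analogue) actually pay off then depends on how expensive each call is compared with the cost of shipping work to a worker.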

@Carl Love 

Have these two:







Maple somehow "tells" me that it can be simplified to:






I wonder if this is possible.



