JacquesC

Prof. Jacques Carette

2401 Reputation

17 Badges

20 years, 89 days
McMaster University
Professor or university staff
Hamilton, Ontario, Canada


From a Maple perspective: I first started using it in 1985 (it was Maple 4.0, but I still have a Maple 3.3 manual!). I worked as a Maple tutor in 1987 and joined the company in 1991 as the sole GUI developer, writing the first Windows version of Maple (for Windows 3.0). I founded the Math group in 1992. From fall 1993 to summer 1996 I worked remotely from France (still in Math, hosted by the ALGO project) while doing my PhD in complex dynamics in Orsay.

Soon after I returned to Ontario, I became the Manager of the Math Group, which I grew from 2 people to 12 in 2.5 years. I got "promoted" into project management (for Maple 6, the last of the releases that allowed a lot of backward incompatibilities, aka the last time that design mistakes from the past were allowed to be fixed), and then moved on to an ill-fated web project (it was 1999, after all).

After that, I worked on coordinating the output from the (many!) research labs Maplesoft then worked with, as well as some Maple design and coding (the inert form, the box model for Maplets, some aspects of MathML, context menus, a prototype compiler, and more), and some of the initial work on MapleNet. In 2002, an opportunity came up for a faculty position, which I took. After many years of being confronted with Maple's weaknesses, I had accumulated a number of ideas about how I would go about 'doing better' -- but these ideas required a radical change of architecture, which I could not make within Maplesoft. I have been working on producing a 'better' system ever since.

MaplePrimes Activity


These are replies submitted by JacquesC

I think a recent paper (see also arXiv version) relates to this.
Your problem was indeed a new Maple 11 problem, and it has been quite well addressed in the rest of this thread. My response above, on the other hand, was to Robert's post rather than yours. What he was showing is related to multiple representations, which was the starting point of my message.
The link you tried to post in your reply did not show up - I can't find your PDF file!
You are essentially exposing the fact that Maple has evolved over many years, and some objects (like intervals) do not have a canonical representation in Maple (I know of at least 3). With no canonical representation, they really become second-class citizens, which essentially means that operations on them are done by specialized commands, which are not always so easy to fathom (as you demonstrate above). Of course, fixing this would require changes that would not be backwards compatible. So even though this state of affairs dates from the Maple V era [when the third representation was added], it is unlikely to change. At least, it won't change until enough users scream that they would much prefer a consistent system over one that's compatible with last year's model. The catch is that a solution to that dilemma already exists: Mathematica. It is much more consistent, and entirely incompatible with Maple... Then again, mathematics itself is still rather idiosyncratic, and Mathematica's consistent syntax is disconcerting precisely because it is so unfamiliar. Maybe consistency just isn't that important?
Take a closer look at the author of that post...
Can you give us a few details? I know of a number of papers on differential resolvents and related work that might be interesting to you, but I would like to know a bit more to steer you towards the right ones. Maybe you could post the results at order 2 and 3 here? I am sure that a number of people on primes would have good ideas on where to look next. This is definitely an area where symbolic computation should be able to help you. In fact, there are a lot of known algorithms for (fast!) computation of various factorizations of matrices with polynomial entries! [Most of the world experts in that domain are actually loosely associated with Maplesoft.] Now, some of those algorithms are implemented in Maple itself, but they are not always easy to get at, since they require you to set up your problem in very specific ways. Again, there are readers here who can help. There are also algorithms that are not yet integrated into Maple, but they are usually in LinBox, which can be accessed from Maple, AFAIK.
You can do this more simply as map(irem, [$1..5], 13) because, by default, map(f, L, b, c, d) == map(proc(a1) f(a1, b, c, d) end, L), which is rather convenient. Note that there is map2 to map onto the 2nd argument, and in fact map[n] for the n'th.
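A quick sketch of the equivalences described above (Maple syntax; irem(a, b) is the integer remainder of a divided by b):

```maple
# extra arguments to map are passed after each element, so these agree:
map(irem, [$1..5], 13);                          # [1, 2, 3, 4, 5]
map(proc(a1) irem(a1, 13) end proc, [$1..5]);    # same result

# map2 maps over the *second* argument instead:
map2(irem, 13, [$1..5]);                         # remainders of 13 divided by 1..5
```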
Let's assume for the sake of discussion that the whole product goes open source, but that there is a strong "gatekeeper" at Maplesoft. Then my opinion is:
1) the core library would indeed get more maintenance;
2) at the same time, new releases (of the core library, and probably the kernel too) would likely introduce many more backwards-incompatible changes [because more design bugs would get fixed], which would improve the product but make upgrading harder;
3) there would be less innovation in the user interface;
4) there would probably be fewer totally new packages in the library, especially packages as thoroughly thought through and designed [like VectorCalculus (warts and all), the various Student packages, etc].
In other words, things would be very different indeed. Would they be better? I am quite uncertain, and tend to think the overall effect would be negative. That said, the library is essentially visible as it is. The only thing really 'missing' is a community process by which Maple users could submit patches to existing code, and have these patches migrate to the official code base. The biggest difficulties with that are licensing (ie IP ownership) and proper testing (ie ensuring that patches do not break existing code). This is one area where a more open process could in fact work. Perhaps I'll start a thread somewhere else on primes to cover this topic, to gauge the feeling of the community.
Unfortunately, I am not aware of any Maple-based introduction to the functional style. The various programming guides spread the information a little bit throughout. But you probably know more about functional programming than you think: if you've ever used map, select, or remove, you have programmed 'functionally'. Whenever you use a local procedure or an inline arrow procedure, you are thinking functionally. For that matter, using add, mul and seq is functional, while using the dreaded for loop is a sure sign of 'imperative' thinking. The basics of functional programming are straightforward (see John Hughes' fantastic "Why Functional Programming Matters" for more):
1) functions are first-class objects; you can pass them around just like anything else;
2) side-effects are bad and should be isolated/minimized; this includes assignments and IO actions.
If you understand why 'map' is preferable to a for loop doing the same thing, then you're already 80% of the way there. You just need to understand next how this generalizes to other structures. To me, there are two advantages to functional-style programming: the resulting code is easier to reason about, and the low-level "plumbing" gets abstracted out. Unfortunately, it is not convenient to program purely functionally in Maple: there is no equivalent to let bindings, so one is forced to use assignments anyway, a real pity.
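To make the contrast concrete, here is the same small computation written both ways (an illustrative sketch, not from the original post):

```maple
L := [1, 2, 3, 4, 5]:

# imperative style: an explicit loop with repeated assignment
S := []:
for i in L do
    S := [op(S), i^2];
end do:

# functional style: the low-level plumbing is abstracted away by map
S2 := map(x -> x^2, L):

# both S and S2 are [1, 4, 9, 16, 25]
```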
Note that I did not wish to leave the impression that maintenance work is not done - in fact I explicitly stated that my belief is that when "real work" is done to any routine, it is done properly. What I wanted to convey is that at any long-lived software company, there is a real tension between the programmers who want to "do the right thing" and management who want to "move forward to sell more". Any company that does not settle into some reasonable compromise position between these two extremes disappears pretty quickly. IMHO, Maplesoft will be around for quite a few years yet!
I quite agree that the expressions above are an even better way to write this table, thanks. If one was concerned about (re)writing a whole system (like Maple) instead of just one function, then this kind of information should in fact be stored globally, like the FunctionAdvisor database, with a convenient method of extracting it. 'Modern' Maple is really nice, in that it would allow one to (re)write Maple's own library in about 1/2 the amount of code present in the current implementation, while at the same time making it much more maintainable [and generally keeping the same efficiency profile].

What does this say about the efficiency of the current code used in Maple? Is there a performance reason to prefer one style over another?

In this particular case, I believe that the 'modern' code is more efficient. But this is not always true: there are times when a more modern, more readable piece of code may turn out to be (somewhat) less efficient. You really need to do timings to know.

Should we expect all of the Maple code to be (re-)written in the modern style? and updated as the language is updated?

I am sure that any piece of code which needs (serious) maintenance would be modernized. But to do this systematically would require manpower much beyond Maplesoft's current capacity. As an example, look at how long it took for most packages to be made into modules (which were introduced in Maple 6): there was a big push for Maple 11, but I am not even sure they all are yet. Basically, this is where the fact that Maplesoft is a commercial enterprise comes in. There is little ROI in systematic maintenance. The only people who suffer are the Maplesoft programmers who have to deal with ancient code. And it's their job, isn't it? Of course, the real cost is that maintenance on Maple takes considerably more effort than it should, for the same reasons. No manager wants to pay (on their schedule) the up-front cost of proper code hygiene, and all managers rail at the cost (on their schedule) of each new feature, as the development time is much more than it 'ought' to be. This is not Maple-specific by any means; it is thoroughly documented as a problem with all software development projects once they reach release 2.0, and it only gets worse over time whenever proper maintenance is not done. <wild_guess> While there were various point releases of Mathematica, it took them something like 6 years before they came up with major changes. When I last talked to some Mathematica developers, they were saying that the code base was unmaintainable because of various long-term structural issues. And now we have Mathematica 6, after many years of work from Wolfram Research. </wild_guess>

What are the benefits of the modern style, other than appearance and ease-of-HUMAN-understanding?

Well, those two are rather important, aren't they? Another thing is that, as new library code tends to (mostly) get written in the more modern style, any kernel optimizations made based on efficiency measurements of new code will tend to benefit the modern style much more than the older style. This is how code "bit rots": old code continues to work, but somehow works less and less well over time, because the surrounding infrastructure has shifted over the years.
There are times when closed-forms are really quite useless - and this is one of them. Even for polynomials of low degree (3,4 in terms of radicals, 5,6 in terms of hypergeometric, 7,8 in terms of Lauricella functions), the closed-forms are awful for any further computations. Maple's RootOf is a much better answer, since it can be manipulated (symbolically) quite easily. Now, it may turn out that your polynomials are highly structured, so that closed-forms are useful. In that case, I would suggest turning your polynomials into systems of differential equations (for the roots, in terms of the parameters). In some situations, dsolve is a fair bit more powerful than solve!
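A minimal sketch of what manipulating a RootOf symbolically can look like (assuming a generic quintic for illustration; evala normalizes algebraic expressions over the defining polynomial):

```maple
r := RootOf(x^5 - x - 1, x):   # an exact, symbolic root; no radicals needed
evalf(r);                      # a numerical approximation, when one is wanted
evala(r^5);                    # reduces r^5 via the defining relation r^5 = r + 1
```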
It was pointed out to me that type(infinity+infinity*x, polynom(constant,x)) returns 'true' by design. I had forgotten that 'constant' means "Maple constant" rather than "numerical constant". In other words, type(true+false*x, polynom(constant, x)) is also true, by design. It is the use of that particular type check which is ill-suited to its purpose, rather than the type check being "wrong". As others have mentioned recently, this is another case of 'computer science' and 'mathematics' clashing. While in this situation what we get is clearly a case of GIGO, it's not clear that there aren't subtle cases where this causes actual bugs. While on the topic of GIGO: solve(true*x-false, x); returns false/true, as does solve(true*x-false). More fun is that solve(infinity*x-false) returns 0! GIGO indeed!
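The calls discussed above, collected in one place for easy reproduction (these are quoted from the post, with the results it reports):

```maple
type(infinity + infinity*x, polynom(constant, x));   # true, by design
type(true + false*x, polynom(constant, x));          # also true: 'constant' means "Maple constant"
solve(true*x - false, x);                            # false/true
solve(infinity*x - false);                           # 0
```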