We have had a site license for several years, but for a lot of our teaching we are switching to the publishers' online testing, mainly because TA is so poor. We may well jump ship completely.
I completely agree about the LaTeX issues. I would add that you can't put a less-than symbol in the algorithm, or you couldn't the last time I used it. It seems to do some incredibly inept syntax checking of algorithms that flags all sorts of sensible usage as errors while passing all sorts of meaningless junk. If it can't check the algorithms properly, why not just pass them through? I could go on. And I completely agree that LaTeX authoring is, or ought to be, much quicker than using the dreadful MS equation editor.
And then the whole process of using the LaTeX converter is such a nightmare of clicking, uploading and downloading. (And the most recent version has this error connected with inserting spaces all over the place.) A grad student here wrote a Python script to automate the whole process - why on earth hasn't Maple managed to produce some little application to do this automatically on the main platforms?
Agree too about maths rendering - things like primes are almost invisible. And it doesn't scale, which is a huge accessibility problem.
And then there is the "previewer" for formula input, which, when you preview 2*(a+b+c+d), produces 2(((a+b)+c)+d) - this confuses students no end.
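Suppressing that nested-parenthesis noise is not hard: a preview printer only needs to compare operator precedences before emitting parentheses. Here is a minimal sketch (my own illustration, not Maplesoft's code, and the tree representation is an assumption) of a precedence-aware printer that would show the expression the way the student typed it:

```python
# Hypothetical sketch: a precedence-aware expression printer that emits
# parentheses only where they are actually needed, so a left-associative
# parse of 2*(a+b+c+d) prints with one set of parentheses instead of the
# fully nested 2(((a+b)+c)+d) the previewer shows.

PREC = {'+': 1, '*': 2}

def show(node, parent_prec=0, is_right=False):
    """Render a binary expression tree; parenthesize only when needed."""
    if isinstance(node, str):              # leaf: a variable or a number
        return node
    op, left, right = node
    prec = PREC[op]
    s = f"{show(left, prec)}{op}{show(right, prec, is_right=True)}"
    # Parens only if we bind looser than the parent, or we sit on the
    # right of an operator of equal precedence (to preserve grouping).
    if prec < parent_prec or (is_right and prec == parent_prec):
        return f"({s})"
    return s

# 2*(a+b+c+d), parsed left-associatively:
expr = ('*', '2', ('+', ('+', ('+', 'a', 'b'), 'c'), 'd'))
print(show(expr))   # 2*(a+b+c+d)
```

The point is that the previewer evidently prints every internal node of its parse tree with parentheses; one precedence check per node is all it takes to render what the student actually entered.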
I could go on, but I think a clear example of the lack of seriousness of Maplesoft on this is that we are at Version 7 and we STILL have the following nonsense in the algorithm generation:
$c = frac(-1,-1); produces (I kid you not) $c = --1, AND this then causes an error if passed to Maple.
produces $q= -0.
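Both of these are one-line sign-normalization fixes. As a hedged illustration (my own sketch, not the Maplesoft engine, and the function name is made up), here is what a code generator ought to do before emitting a fraction, so that frac(-1,-1) comes out as 1 and a zero numerator never prints as -0:

```python
# Hypothetical sketch: normalize signs before emitting a fraction, so the
# generated source never contains the illegal "--1" or the useless "-0".

from math import gcd

def frac_to_source(num: int, den: int) -> str:
    """Emit a syntactically valid, reduced literal for num/den."""
    if den == 0:
        raise ZeroDivisionError("denominator is zero")
    # Move any sign onto the numerator; a double negative cancels here.
    if den < 0:
        num, den = -num, -den
    if num == 0:
        return "0"                         # never emit "-0"
    g = gcd(num, den)                      # reduce to lowest terms
    num //= g
    den //= g
    return str(num) if den == 1 else f"{num}/{den}"
```

With this, frac_to_source(-1, -1) gives "1", frac_to_source(0, -5) gives "0", and frac_to_source(2, -4) gives "-1/2" - all safe to pass straight to Maple. That the shipped engine still emits --1 after seven versions is exactly the lack of seriousness I mean.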
I mean really! My guess is that the LaTeX converter and the underlying engine were inherited by Maple from whatever preceding system it was that they took over, and nobody at Maple has actually understood how either of them works well enough to fix the glaring errors.
So I too am longing to hear that all this nonsense is going to be fixed.