acer

These are replies submitted by acer

@erdem You might consider calling the EigenConditionNumbers command at default working precision (Digits=10, say), with options supplied to make it return both estimated eigenvalues and condition numbers. Then you can test the estimated condition numbers, to try to gauge whether you need to rerun at higher working precision.

You might call that command and supply the option 'balance=scale', to help keep the condition numbers smaller. (See here.)

Let me know if you need code to estimate bounds on the eigenvalues' errors, as a function of condition numbers and working precision. Or see here and here.

But most importantly, you should follow Carl's advice below and ensure that the eigen-solving is done as a floating-point computation and not as an exact one. You can do that either by forming the Matrices with datatype=float or by wrapping each Matrix in a call to `evalf` before passing it to the LinearAlgebra command.
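For example, here is a rough sketch along those lines. Caveat: the output option names 'values' and 'conditionvalues' are as I recall them from the ?LinearAlgebra,EigenConditionNumbers help page, so do check them against your version; the Matrix M is just a stand-in for your own data.

with(LinearAlgebra):

# A stand-in Matrix; form yours with datatype=float, or wrap it in evalf().
M := evalf(RandomMatrix(5, 5)):

Digits := 10:
evals, conds := EigenConditionNumbers(M, 'balance' = 'scale',
                                      'output' = ['values', 'conditionvalues']);

# Inspect conds: if the eigenvalue conditioning looks poor for your purposes
# then raise Digits and recompute.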

Christopher, you wrote,

    sqrt(a)/a is in fact simplified to 1/sqrt(a)

and there it does seem that you mean an unassigned symbolic `a`. But what is actually happening under that call to the `simplify` command is that symbolic a^(1/2)/a becomes 1/a^(1/2). Just for the sake of clarity.

Another somewhat related inconsistency is the following distinction, which can sometimes be awkward (such as when trying to do matching),

A := 1/x^(p);

                               1 
                               --
                                p
                               x 

dismantle(A);

PROD(3)
   POWER(3)
      NAME(4): x
      NAME(4): p
   INTNEG(2): -1


B := x^(-p);

                              (-p)
                             x    

dismantle(B);

POWER(3)
   NAME(4): x
   SUM(3)
      NAME(4): p
      INTNEG(2): -1

Here, the powers of symbolic `p` involved in A-B do not combine unless we issue a subsequent command to force it. Also, the value of `A` becomes that of `B` under a call to `simplify` or `combine` but not under a call to `normal` or `expand`. And the value of `B` becomes that of `A` under `expand` but not under `normal`, `simplify`, or `combine`. And A/B and B/A do not become 1 under `normal`, but they do under `simplify`, `combine`, and `expand`. I expect that some of this might confuse new users.
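Following on from the dismantle results above, those claims can be replayed directly in a session (the comments just restate the behaviour described in the previous paragraph):

A := 1/x^p:  B := x^(-p):

lprint(simplify(A));    # becomes x^(-p), the form of B
lprint(expand(B));      # becomes 1/x^p, the form of A
lprint(normal(A/B));    # the symbolic powers are not combined; not 1
lprint(simplify(A/B));  # 1
lprint(expand(A/B));    # 1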

acer

Please read my response again. I wrote a power of 1/2, not a call to sqrt.

You can enter '1/2^(1/2)' to check that. It autosimplifies to 2^(1/2)/2.

I interpreted Christopher's comments as being about power 1/2 values, and I can understand that some others in your audience (and possibly the OP) might not be fully aware of the distinctions. The fact that they prettyprint the same in 2D output doesn't help general understanding.

Unprotecting and redefining `^` is of course possible, just as is producing some other mechanism that merely prints like power 1/2. But each of those will be quite some other object, and not the same as 2^(1/2)/2 in value, which I suppose should be made clear to the audience.

It might be nice if a solution which satisfies Christopher here would also allow for simplification (`simplify` or `radnormal`) of 6^(1/2)/3^(1/2) or its reciprocal, say.

I doubt that the return result from (un-redefined, global) 1/sqrt(2), which indeed is 2^(1/2)/2, can be easily (if at all) prevented from autosimplifying. Changing how that value prints, or creating quite another value, is not the same thing.

I'm not saying that your solution might not satisfy Christopher. (You might want to demonstrate its use, though.) His original question was somewhat ambiguous, given that unevaluated 'sqrt(7)' is not what the call sqrt(7) returns. But I wasn't making a claim about autosimplification of '1/sqrt(7)', I was making it about '1/7^(1/2)'.
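For what it's worth, both points can be checked quickly in a session (lprint shows the stored, autosimplified form; the radnormal line is the kind of simplification I have in mind):

expr := 1/2^(1/2):
lprint(expr);                 # stored as 1/2*2^(1/2), i.e. 2^(1/2)/2
evalb(expr = 2^(1/2)/2);      # true

radnormal(6^(1/2)/3^(1/2));   # 2^(1/2)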

acer

@Christopher2222 If the conversion of 1/7^(1/2) to 7^(1/2)/7 is done by Maple as an automatic simplification, then you may have to resort to some kludgy representation in order to foil that mechanism. For example, by using ``(), or by replacement with manually typeset constructions.
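For example, one kludgy possibility along those lines is the empty-name ``() wrapper, which merely blocks the automatic simplification; a call to expand strips it off again when the real value is wanted:

ugly := 1/``(7^(1/2));   # prints roughly as 1/(7^(1/2)), not as 7^(1/2)/7

expand(ugly);            # removes the ``() wrapper, after which the usual
                         # autosimplified 7^(1/2)/7 form comes right back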

I misread it as Eigenvalues(evalf(A)), which it wasn't. That's preconceptions for you...

PS. Sometimes I wonder about the merit of recompiling (overloaded) LAPACK with quad precision, as a performance bridge between fast double precision and much slower (GMP-based) high precision.

acer

Do you need Digits that high?

There will be a significant slowdown when Digits rises above 15 (the value of trunc(evalhf(Digits)), which is the cutoff for using purely compiled double precision LAPACK).
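A rough way to see that cutoff in action (the timings will vary by platform; the point is just the jump once Digits exceeds evalhf(Digits)):

with(LinearAlgebra):
A := RandomMatrix(200, 'generator' = -1.0 .. 1.0):

Digits := 15:
CodeTools:-Usage(Eigenvalues(A)):   # compiled double-precision LAPACK

Digits := 20:
CodeTools:-Usage(Eigenvalues(A)):   # software floats; expect a big slowdown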

acer

@Alejandro Jakubi Could not Beta(x,1-x) be taught to immediately return the corresponding csc result if `x` is of type `fraction`? I.e., if real `x` is rational but not an integer. (That was my tentative thinking when making my earlier comment.)

Simplification, if symbolic `x` were assumed to have the corresponding property, would also be nice of course.
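For a concrete instance of what I mean, here is a numeric check of the reflection identity Beta(x,1-x) = Pi*csc(Pi*x) at a fractional point (2/7 is an arbitrary choice):

x0 := 2/7:
evalf(Beta(x0, 1 - x0) - Pi*csc(Pi*x0));   # ~ 0., up to roundoff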

@Adri van der Meer It may well be easier or more convenient to take a numerical limit than to prove using Maple that it is a contraction. But of course I was considering other, perhaps more demanding examples. One of my intended points (which I may not have conveyed adequately, of course) was indeed that using fsolve might not make best sense.

@Axel Vogt The Asker gave an example. But that might just be a simple illustration.

Some other examples might not be explicitly solvable, either by rsolve for the iterative formula or by solve for the limiting form. In such a case one might be tempted to call fsolve to attempt to compute an approximate floating-point solution. ...And if fsolve were the route taken, then it might be slightly amusing to note that fsolve's non-polynomial univariate rootfinder can attempt various inverse iteration approaches -- one of which might relate to the given iterative formula.
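As a purely hypothetical illustration (cos is not the Asker's map, just a stand-in), the limiting fixed-point equation could be handed to fsolve directly:

# fixed point of the iteration x[n+1] = cos(x[n])
fsolve(x = cos(x), x = 0 .. 1);   # about 0.7390851332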

@Alejandro Jakubi Ah, it seems as if the "normalization" of GAMMA(x) to Pi*csc(Pi*x)/GAMMA(1-x) is only done if the csc(..) can be converted to radicals (because that makes it more useful subsequently, or in hopes of additional simplifications!?).

convert(csc(Pi*2/5),'radical');

                    (1/5) 2^(1/2) (5 - 5^(1/2))^(1/2) 5^(1/2)

convert(csc(Pi*2/7),'radical');

                              /2   \
                           csc|- Pi|
                              \7   /

(It's mildly amusing how topics show up in bursts. Case in point being trig->radicals.)

But then could not `Beta` improve when it sees an appropriate pair of arguments x and 1-x, because of the ensuing multiplicative cancellation of GAMMA terms? That could happen regardless of any successful conversion to radicals.

I think that I generally agree with you about `int` methods. But how should it be improved?

@Joe Riel The only commands I can think of, offhand, that use _nresults are in the MTM package.

Some of those instances may seem to mimic Matlab functions (especially where the corresponding top-level Maple command offers an optional side-effect on a name). For example, compare Maple's MTM:-coeffs with Maple's coeffs, and with Matlab's (Symbolic Math Toolbox) coeffs.

That's partly why I mentioned _nresults, because the Question touched on Matlab vs Maple programming. I don't recall, however, whether Matlab allows user-defined functions with that kind of behaviour.
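By way of illustration, here is a small hypothetical procedure using _nresults. The behaviour is as I recall it from the ?_nresults help page: when no expected-result count is available the comparison with 2 simply fails, so the single-result branch is taken.

f := proc(x)
   if _nresults = 2 then
      x^2, x^3;   # the caller asked for two results
   else
      x^2;
   end if;
end proc:

u, v := f(3);   # u = 9, v = 27
r := f(3);      # r = 9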
