## Rescaling a projective vector...

I want to rescale a projective vector. I have been using gcd on the numerators and denominators, which works in simple situations. It doesn't work well here, though (admittedly, the points were just made up for this question). Square roots seem to make it perform badly, and I run into a lot of square roots in symbolic situations. What would be a better way? I have been wondering if `frontend` would help?

 > restart
 > Prntmsg::boolean:=true; Normalise_Projective_Point:=1; ReScl::boolean:=true;
 (1)
 >
```maple
ProjLP := overload([
    proc(A::Vector[row], B::Vector[row], prnt::boolean := Prntmsg)
        description "2 projective points to create a projective line vector";
        option overload;
        local Vp, gcdn, gcdd;
        uses LinearAlgebra;
        Vp := CrossProduct(A, B)^%T;  # print("2nd ", Vp);
        if ReScl then
            gcdn := gcd(gcd(numer(Vp[1]), numer(Vp[2])), numer(Vp[3]));
            gcdd := gcd(gcd(denom(Vp[1]), denom(Vp[2])), denom(Vp[3]));
            Vp := simplify(Vp*gcdd/gcdn);
        end if;
        if Prntmsg then
            print("Line vector from two projective points.");
        end if;
        return Vp;
    end proc,

    proc(A::Vector[column], B::Vector[column], prnt::boolean := Prntmsg)
        description "2 lines to get intersection projective point";
        option overload;
        uses LinearAlgebra;
        local Vp;
        Vp := CrossProduct(A, B)^%T;
        if Vp[3] <> 0 and Normalise_Projective_Point <> 0 then
            Vp := Vp/Vp[3];
        end if;
        if Prntmsg then
            print("Meet of two Lines");
        end if;
        return Vp;
    end proc
]);
```
 >
 (2)
 > #maplemint(ProjLP)
 > pt1:=: pt2:=: pt3:=: pt4:=:
 >
 > l1:=ProjLP(pt1,pt2)
 (3)
 > l2:=ProjLP(pt3,pt4)
 (4)
 > l3:=ProjLP(pt1,pt4)
 (5)
 > l4:=ProjLP(pt2,pt4)
 (6)
 > pl1l2:=simplify(ProjLP(l1,l2))
 (7)
 > pl2l3:=simplify(ProjLP(l2,l3))
 (8)
 > (ProjLP(pl1l2,pl2l3)); length(%)
 (9)
 > ReScl:=false
 (10)
 > # doing nothing seems to work better here than rescaling
 > (ProjLP(pl1l2,pl2l3)); length(%)
 (11)
 >
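One direction that might help, sketched below and not tested on the actual points above: `frontend` temporarily replaces non-rational subexpressions such as `sqrt(...)` by fresh names before applying a procedure, so `gcd` only ever sees rational functions of those names. The helper names `RadGcd` and `RescaleVec` are made up for illustration:

```maple
# Sketch: freeze radicals before taking gcds, so gcd never sees sqrt's.
# frontend replaces non-rational subexpressions by names, applies the
# procedure, then substitutes back. RadGcd and RescaleVec are hypothetical.
RadGcd := (a, b) -> frontend(gcd, [a, b]):
RescaleVec := proc(V::Vector)
    local gcdn, gcdd;
    gcdn := RadGcd(RadGcd(numer(V[1]), numer(V[2])), numer(V[3]));
    gcdd := RadGcd(RadGcd(denom(V[1]), denom(V[2])), denom(V[3]));
    simplify(V * gcdd / gcdn);
end proc:
```

Applying `radnormal` to the entries first may also make the `numer`/`denom` split better behaved in the presence of radicals.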

## Wigner 3j, Clebsch-Gordan...

Hi,

Is there any package out there to calculate Wigner 3j symbols and Clebsch-Gordan coefficients in Maple?

## Is this a bug or does Maple hate me? ...

I am running Maple 2023 - yes, I should update - and I found a weird "bug", if you can call it that. For different versions of the Physics package, I am getting different answers to the same problem.

This is what I was getting when I ran Version 1410:

 > restart;
 > with(Physics):

 > Physics:-Version()
 (1)
 > Setup(mathematicalnotation=true):
 > g_[arbitrary]:
 (2)
 > LG :=(g_[~mu,~nu]*Ricci[mu,nu])*sqrt(-%g_[determinant]);
 (3)
 > SG:=Intc(LG,X)
 (4)
 > EQ:=Fundiff(SG,%g_[~delta,~gamma])/sqrt(-%g_[determinant])
 (5)
 > Simplify(subs(%g_=g_,EQ))
 (6)
 >
 >

And this is what I get if I use the latest update for 2023, Version 1683:

 > restart;
 > with(Physics):
 > Physics:-Version();
 (1)
 > Setup(mathematicalnotation=true):
 > g_[arbitrary]:
 (2)
 > LG :=(g_[~mu,~nu]*Ricci[mu,nu])*sqrt(-%g_[determinant]);
 (3)
 > SG:=Intc(LG,X)
 (4)
 > EQ:=Fundiff(SG,%g_[~delta,~gamma])/sqrt(-%g_[determinant])
 (5)
 > Simplify(subs(%g_=g_,EQ))
 (6)
 >

Strange right? I bring this up because it makes me wonder about potential errors in other computations...

The answer in 1410 - equation (6) - is the correct one; this is simply a derivation of the Einstein tensor.

## Union and indexed sets...

Or some other type of calculation on indexed sets. I tried the Maple Android app with no luck. How is this done in Maple?
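Assuming "indexed sets" means a family of sets stored under an indexed name, a minimal illustration: plain `union` works both as an infix operator and in n-ary prefix form, so an indexed family can be combined with `seq`:

```maple
S[1] := {1, 2, 3}:  S[2] := {3, 4}:  S[3] := {4, 5}:

# Binary union of two members of the family
S[1] union S[2];                  # {1, 2, 3, 4}

# n-ary union over the whole indexed family
`union`(seq(S[i], i = 1 .. 3));   # {1, 2, 3, 4, 5}
```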

## Why do I have a problem?...

I'm trying to execute the DeepLearning example:

```maple
with(DeepLearning);
v1 := Vector(8, i -> i, datatype = float[8]);
v2 := Vector(8, [-1.0, 1.0, 5.0, 11.0, 19.0, 29.0, 41.0, 55.0], datatype = float[8]);
model := Sequential([DenseLayer(1, inputshape = [1])]);
# model := <`DeepLearning Model`,
#           `<keras.engine.sequential.Sequential object at 0x000001C5B6520700>`>
model:-Compile(optimizer = "sgd", loss = "mean_squared_error");
model:-Fit(v1, v2, epochs = 500);
# "<Python object: <keras.callbacks.History object at 0x000001C5CCEE9DE0>>"
convert("<Python object: <keras.callbacks.History object at 0x000001C5CCEE9DE0>>", 'symbol');
# <Python object: <keras.callbacks.History object at 0x000001C5CCEE9DE0>>
model:-Predict([10]);
```

But, finally, there is this error:

```
Error, (in Predict) AttributeError: 'CatchOutErr' object has no attribute 'flush'

['Traceback (most recent call last):\n', '  File "C:\\Program Files\\Maple 2023\\Python.X86_64_WINDOWS\\lib\\site-packages\\keras\\utils\\traceback_utils.py", line 70, in error_handler\n    raise e.with_traceback(filtered_tb) from None\n', '  File "C:\\Program Files\\Maple 2023\\Python.X86_64_WINDOWS\\lib\\site-packages\\keras\\utils\\io_utils.py", line 80, in print_msg\n    sys.stdout.flush()\n', "AttributeError: 'CatchOutErr' object has no attribute 'flush'\n"]
```

What is happening?

Thanks!

## Line integral input ?...

Can this be done better in Maple? See the attached worksheet.

## Display numbers in the contour plot...

How can numbers be displayed inside the contour plot?

```maple
restart;
with(plots);
contourplot(x*exp(-x^2 - y^2), x = -2 .. 2, y = -2 .. 2, axes = boxed);
```

like this
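Recent Maple releases added a `contourlabels` option to `contourplot` that prints the level value on each contour; this is worth verifying against your version's help page (`?plots,contourplot`). A sketch:

```maple
restart;
with(plots):
# contourlabels = true places the contour's level value on the curve
# (assumes a recent Maple release; check ?plots,contourplot on your install)
contourplot(x*exp(-x^2 - y^2), x = -2 .. 2, y = -2 .. 2,
            axes = boxed, contourlabels = true);
```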

## Unexpected results from GraphTheory:-WienerIndex? ...

OEIS A034828 and OEIS A000292 (which give the Wiener index for the cycle graph and the path graph, respectively) mention that

- the Wiener index of the cycle of length 19 is 855, and
- the Wiener index of the path with 19 edges is 1330.

However,

```
GraphTheory:-WienerIndex(GraphTheory:-CycleGraph(19));
                                38

GraphTheory:-WienerIndex(GraphTheory:-PathGraph(20));
                                38
```

So what happened here?
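As a cross-check, the Wiener index can be recomputed directly from the all-pairs distance matrix; for the cycle C_n with n odd the closed form n(n^2-1)/8 gives 19*360/8 = 855, matching OEIS A034828. A sketch:

```maple
with(GraphTheory):
G := CycleGraph(19):
D := AllPairsDistance(G):
# Wiener index = sum of distances over unordered vertex pairs
add(add(D[i, j], j = i + 1 .. 19), i = 1 .. 19);   # expect 855 (OEIS A034828)
```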

## Solving linear equations ...

I am trying to solve several problems that involve determining around 200 unknown variables from a set of around 300 second-order equations (such as a*b = c).

I just use the "solve" command.

1. Maple keeps evaluating and never returns a result. How can I make it work?

2. In some problems I do get results, but with many degrees of freedom, which I want to restrict in some way.
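For what it's worth, with large polynomial systems it sometimes helps to pass an explicit variable set to `solve`, or to preprocess the system with a Groebner basis first. A sketch on a hypothetical toy system (the names and equations below stand in for the real 300-equation set):

```maple
# Toy stand-in for the real system of second-order (bilinear) equations
sys  := {a*b - c, b*c - a, a + b + c - 1}:
vars := {a, b, c}:

# Preprocess: a total-degree Groebner basis can make solve's job easier
G := Groebner:-Basis(sys, tdeg(a, b, c)):
solve(G, vars);
```

Passing `vars` explicitly also lets you control which symbols are treated as unknowns versus parameters, which is one way to cut down the leftover degrees of freedom.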

## How does Maple perform in this comparison?...

As I learned here, Maple is also a multi-paradigm programming language.

I was wondering how Maple compares in this chart.

Even though I am not a computer scientist, I would say that Maple is on a par with the number one (though I am not sure about pipelines).

Would this claim be correct?

## Status bar: Memory - What memory is displayed...

I have noticed a substantial difference between the memory Maple displays per worksheet and what the Task Manager (red arrow) indicates. After a kernel restart it looks like this

What Maple displays does not seem to correlate with the physical memory used/allocated.

What is actually displayed, and how can we make use of this information?

Also: is the displayed Time the total process time, or the time the GUI waits for the server to reply? It is hard to tell.

## Avoiding the "roots of complex number"...

Hello,

I'm encountering an issue with the "roots of complex number" message while running my Maple code. Maple seems unable to solve this problem for me.

I would greatly appreciate any ideas or suggestions that could help me resolve this error.

Vib-code.mw

## Trouble with AI Formula Search...

Every last query I make in the AI Formula Assistant returns this message...

This happens even when I use a basic canned query shown in the use-case examples (e.g., surface area of a sphere).

I have accepted the Terms of Use.  Is there some other setting I need to enable? Thanks.