Karishma

Social Networks and Content at Maplesoft.com

Director, Academic Product Management at Maplesoft

MaplePrimes Activity


These are Posts that have been published by Karishma

Over the past year, I have spent a lot of time talking to educators, researchers, and engineers about AI. The feeling is almost universal: it is impressive, it is helpful, but you should absolutely not trust it with your math even if it sounds confident.

That tension between how capable AI feels and how accurate it actually is has been on my mind for months. AI is not going away. The challenge now is figuring out how to make it reliable.

That is where Maple MCP comes in.

Maple MCP (Model Context Protocol) connects large language models like ChatGPT, Claude, Cohere, and Perplexity to Maple’s world-class math engine.

When the AI encounters math, it can turn to Maple to handle the computation, so the results are ones you can actually trust.

It is a simple idea, but an important one: Maple does the math and the AI does the talking. Instead of guessing, the AI can be directed to call on Maple whenever accuracy matters.

Model Context Protocol (MCP) is an emerging open standard that allows AI systems to connect to external tools and data sources. It gives language models a structured way to request computations, pass inputs, and receive reliable outputs, rather than trying to predict everything in text form.
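Concretely, MCP messages are JSON-RPC 2.0. The sketch below shows the general shape of a tool-call request an MCP client might send on the model's behalf; the tool name "evaluate_expression" and its arguments are hypothetical illustrations, not Maple MCP's actual interface:

```python
import json

# Shape of an MCP "tools/call" request (JSON-RPC 2.0 envelope per the MCP
# spec). The tool name and arguments below are made up for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "evaluate_expression",
        "arguments": {"expression": "int(sin(x) + 0.5*x, x)"},
    },
}
print(json.dumps(request, indent=2))
```

The server runs the computation and returns a structured result the model can quote verbatim, rather than predicting the answer token by token.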

Here is a high-level view of how MCP fits into the broader ecosystem:

MCP Architecture Diagram

Figure 1. High-level architecture of the Model Context Protocol (MCP)
Source: modelcontextprotocol.io

MCP lets an AI system connect securely to specialized services, like Maple, that provide capabilities the model does not have on its own.

If you want to learn more about the MCP standard, the documentation is a great starting point: Model Context Protocol documentation

Here is a glimpse of what happens when Maple joins the conversation:

Figure 2. Examples of Maple MCP in action

Depending on the prompt, Maple MCP can evaluate expressions symbolically or numerically, execute Maple code, expand or factor expressions, integrate or solve equations, and even generate interactive visualizations. If you ask for an exploration or an activity, it can create a Maple Learn document with the parameters and sliders already in place.

As an example of how this plays out in practice, I asked Maple MCP:

“I'd like to create an interactive math activity in Maple that allows my students to explore the tangent of a line for the function f(x) = sin(x) + 0.5x for various values of x.”

It generated a complete Maple Learn activity that was ready to use and share. You can open the interactive version here: interactive tangent line activity.
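For readers curious about the math the activity explores: the tangent line to f at x = a is y = f(a) + f′(a)(x − a). Here is a minimal sketch of that computation, written in Python with SymPy (rather than Maple) purely so it is self-contained:

```python
import sympy as sp

x, a = sp.symbols("x a")
f = sp.sin(x) + sp.Rational(1, 2) * x   # f(x) = sin(x) + 0.5x

# Tangent line at x = a:  y = f(a) + f'(a) * (x - a)
fprime = sp.diff(f, x)
tangent = f.subs(x, a) + fprime.subs(x, a) * (x - a)

# At a = 0: f(0) = 0 and f'(0) = 3/2, so the tangent is 3x/2.
print(sp.simplify(tangent.subs(a, 0)))
```

In the generated Maple Learn activity, the slider plays the role of `a`, recomputing and redrawing this line as students drag it.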

In full disclosure, I did have to go back and forth a bit to get the exact results I wanted, mostly because my prompt wasn’t very specific, but the process was smooth, and I know it will only get better over time.

What is exciting is that this does not replace the LLM; it complements it. The model still explains, reasons, and interacts naturally. Maple simply steps in to do the math—the part AI cannot reliably do on its own.

We have opened the Maple MCP public beta, and I would love for you to try it.

Sign up today and we will send you everything you need to get started!

When we think about AI, most of us picture tools like ChatGPT or Gemini. However, the reality is that AI is already built into the tools we use every day, even something as familiar as a web search. And if AI is everywhere, then so are its mistakes.

A Surprising Answer from Google

Recently, I was talking with my colleague Paulina, Senior Architect at Maplesoft, who also manages the team that creates all the Maple Learn content. We were talking about Google’s AI Overview, and I said I liked it because it usually seemed accurate. She disagreed, saying she’d found plenty of errors. Naturally, I asked for an example.

Her suggestion was simple: search “is x + y a polynomial.”

So I did. Here’s what Google’s AI Overview told me:

“No, x + y is not a polynomial”

My reaction? HUH?!

The explanation correctly defined what a polynomial is but still failed to recognize that both x and y each have an implicit exponent of 1. The logic was there, but the conclusion was wrong.
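The correct answer takes one line to confirm with a computer algebra system. Here is a quick check with SymPy for illustration (in Maple itself, `type(x + y, polynom)` does the same job):

```python
import sympy as sp

x, y = sp.symbols("x y")
expr = x + y

# Each variable carries an implicit exponent of 1, so x + y is a
# polynomial of total degree 1 in x and y.
print(expr.is_polynomial(x, y))   # True
print(sp.total_degree(expr))      # 1
```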

Using It in the Classroom

This makes a great classroom example because it’s quick and engaging. Ask your students first whether x + y is a polynomial, then show them the AI result. The surprise sparks discussion: why does the explanation sound right but end with the wrong conclusion?

In just a few minutes, you’ve not only reviewed a basic concept but also reinforced the habit of questioning answers even when they look authoritative.

Why This Matters

As I said in a previous post, the real issue isn’t the math slip, it’s the habit of accepting answers without questioning them. It’s our responsibility to teach students how to use these tools responsibly, especially as AI use continues to grow. Critical thinking has always mattered, and now it’s essential.

 

On the very first day of class, a student once told math educator Sam Densley: “Your class feels safe.”

Open classroom door with students inside

Honestly, I can’t think of a better compliment for a teacher. I reflected on this in a LinkedIn post, and I want to share those thoughts here too.

A Story of Struggle

I rarely admit this, because it still carries a sting of shame. In my role at Maplesoft, people often assume I was naturally good at math. The truth is, I wasn’t. I had to work hard, and I failed along the way.

In fact, I failed my very first engineering course, Fundamentals of Electrical Engineering. Not once, but twice. The third time, I finally earned an A.

That second failure nearly crushed me. The first time, I told myself I was just adjusting to university life. But failing again, while my friends all passed easily, left me feeling stupid, ashamed, and like I didn’t belong.

When I got the news, I called my father. He left work to meet me, and instead of offering empty reassurances, he did something unexpected: he told me about his own struggles in school, the courses he failed, the moments he nearly gave up. Here was someone I admired, a successful engineer, admitting that he had stumbled too.

In that moment, the weight lifted. I wasn’t dumb. I wasn’t alone.

That experience has stayed with me ever since: the shame, the anxiety, the voice in my head whispering “I’m not cut out for this.” But also the relief of realizing I wasn’t the only one. And that’s why I believe vulnerability is key.

When teachers open up, something powerful happens:

  • Students stop thinking they’re the only ones who feel lost.
  • They see that failure isn’t the end; it’s part of the process.
  • It gives students permission to be honest about their own struggles.

That’s how you chip away at math anxiety and help students believe: “I can do this too.”

Why Vulnerability Matters

Abstract metallic mask with mathematical symbols

I can’t recall a single teacher in my own schooling who openly acknowledged their academic struggles. Why is that?

We tell students that “struggle is normal,” but simply saying the words isn’t enough. Students need to see it in us.

When teachers hide their struggles, students assume they’re the only ones who falter. That’s when math anxiety takes root. But when teachers are vulnerable, the cycle breaks. Students realize that struggle doesn’t mean they’re “bad at math.” It means they’re learning. Vulnerability builds trust, and trust is the foundation of a safe classroom.

What I Hear from Instructors

In my work at Maplesoft, I often hear instructors say: “Students don’t come to office hours — I wish they did.”

And I get it. Sometimes students are too anxious or hesitant to ask for help, even when a teacher makes it clear they’re available. That’s one of the reasons we built the Student Success Platform. It gives instructors a way to see where students are struggling without calling anyone out. Even if students stay silent, their struggles don’t stay invisible.

But tools can only go so far. They can reveal where students need support and even help illuminate concepts in new ways. What they can’t do is replace a teacher. Real learning happens when students feel safe, and that safety comes from trust. Trust isn’t built on flawless lectures or perfect answers. It grows when teachers are willing to be human, willing to admit they’ve struggled too.

That’s when students believe you mean it. And that’s when they’re more likely to walk through the door and ask for help.

The Real Lesson

Ultimately, what matters most in the classroom, whether in mathematics or any other subject, isn’t perfection. It’s effort.

As a new school year begins, it’s worth remembering:

  • Students don’t just need formulas.
  • They need to know struggle is normal.
  • They need to know questions are welcome.
  • They need to know the classroom is safe enough to try.

Because long after they move on, that’s what they’ll remember: not just what they learned, but how they felt.

With the launch of ChatGPT 5.0, many people are testing it out and circulating their results. In our “random” Slack channel, where we share anything interesting that crosses our path, Filipe from IT posted one that stood out. He’d come across a simple math problem, double-checked it himself, and confirmed it was real:

ChatGPT 5.0 Example

As you can see, the AI-generated solution walked through clean, logical-looking steps and somehow concluded:

x = –0.21

I have two engineering degrees, and if I hadn't known there was an error, I might not have spotted it. If I'd been tired, distracted, or rushing, I would almost certainly have missed it, because I would have assumed AI could handle something this simple.

Most of us in the MaplePrimes community already understand that AI needs to be used with care. But our students may not always remember, especially at the start of the school year if they’ve already grown used to relying on AI without question. 

And if we’re honest, trusting without double-checking isn’t new. Before AI, plenty of us took shortcuts: splitting up the work, swapping answers, and just assuming they were right. I remember doing it myself in university, sometimes without even thinking twice. The tools might be different now, but that habit of skipping the “are we sure?” step has been around for a long time.

The difference now is that general-purpose AI tools such as ChatGPT have become the first place we turn for almost anything we do. They respond confidently and are often correct, which can lead us to become complacent. We trust them without question. If students develop the habit of doing this, especially while they are still learning, the stakes can be much higher as they carry those habits into work, research, and other areas of their lives.

The example above is making its rounds on social media because it’s memorable. It’s a basic problem, yet the AI still got it wrong and in a way that’s easy to miss if you’re not paying attention.

Using it in the classroom can be a great way to help students remember that AI’s answers need to be checked. It’s not about discouraging them from using AI, but about reinforcing the habit of verifying results and thinking critically about what they see.
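One concrete habit worth modelling in that discussion: substitute the AI's answer back into the original equation. The equation from Filipe's screenshot isn't reproduced above, so the one below is a made-up stand-in; the checking step is the same for any equation:

```python
import sympy as sp

x = sp.symbols("x")

# Hypothetical stand-in equation (not the one from the screenshot).
eq = sp.Eq(3 * x + 2, 11)

# Suppose an AI confidently reported x = -0.21. Substituting it back
# immediately exposes the mistake, and solving gives the real answer.
candidate = sp.Rational(-21, 100)
print(eq.subs(x, candidate))   # False: the candidate fails the check
print(sp.solve(eq, x))         # [3]
```

The substitution step takes seconds and requires no trust in the tool that produced the answer, which is exactly the point.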

So here’s my suggestion:

  • Show this example in your class, no matter the subject. If your students are using AI, they’ll benefit from seeing it.
  • Spend 10 minutes discussing it.
  • Use it as a jumping-off point to talk about what’s OK and not OK when using AI for your course.
  • Share other examples like this throughout the year as small reminders, so “critical thinking” becomes second nature.

This isn’t just about catching an AI’s bad subtraction. It’s about building a culture of verification and reasoning in our students. The tools will keep improving, but so will the temptation to turn off our own thinking.

If we can help students get into the habit of checking, AI can be a powerful partner without putting them on autopilot.

To the MaplePrimes community: How do you talk to your students to help them build strong habits when working with AI? Do you bring in examples like this one, or use other strategies? I’d love it if you could share your thoughts, tips, and ideas.

 

We’re thrilled to announce the launch of our new Student Success Platform! Over the past several months, our academic team has dedicated itself to understanding how we can better support institutions in addressing their concerns around student retention rates. The numbers tell a concerning story: In the U.S., nearly 25% of first-year undergraduates don’t complete their studies, and in STEM fields, the numbers are even higher. In both STEM programs and non-STEM programs with math gateway courses, struggles with math are often a key reason students do not, or cannot, continue their studies. This has a profound impact on both the students’ futures and the institution’s revenue and funding.

From what we’re hearing from institutions and instructors, one of the most pressing issues is the lack of readiness among first-year students, particularly in math courses. With larger class sizes and students arriving with varying levels of preparedness, instructors face challenges in providing the personalized support that is essential. Additionally, many students don’t fully utilize existing resources, such as office hours or TA sessions, which increases their risk of falling behind and ultimately dropping out.

Our new Student Success Platform is designed to tackle these issues head-on. It combines all of our existing tools with exciting new features to help students succeed on their own terms—without adding to instructors' already busy workloads. The early feedback has been fantastic, and we can’t wait for you to see the impact it can make.

You can read more about the Student Success Platform here: https://www.maplesoft.com/student-success-platform/

 
