When AI Becomes Personal: Teaching, and the Tools of Tomorrow


Featuring

Mohamed Awwad


Artificial intelligence is not new. It’s newly accessible.

With the recent surge of interest in AI, the excitement feels familiar to us in industrial and manufacturing engineering. Years ago, data science went through the same hype cycle. Today, AI—particularly generative AI—is riding a similar wave. The algorithms have existed for decades. What’s changed is access. Generative tools like ChatGPT have made powerful computing capabilities available to anyone with an internet connection. Accessibility is transformative, but it also creates anxiety, uncertainty, and a significant trust gap.

As an academic and educator, I’m now tasked with helping students navigate that gap. Since I’ll coordinate Capstone Design projects next year, I plan to strongly encourage students to use AI tools like Enzzo to support new product design. But this isn’t just about tool adoption; it’s about responsible integration. It’s about learning how to evaluate, question, and apply AI critically.

My background straddles two worlds. I trained in mechanical design and production engineering but transitioned to industrial engineering, specializing in logistics, supply chain systems, and operations research. I’ve lived the shift from hardware to systems thinking and watched AI evolve from abstract academic theory to daily practice.


That’s why I’m both optimistic and cautious. AI can be an extraordinary teaching tool—if we lead by example. I’ve had to rethink my curriculum. In the past, students submitted written reports on case studies. With generative AI in the mix, those reports started to look eerily similar. So I turned to an AI-powered platform called Breakout Learning. It allows students to participate in live, verbal case discussions, while AI tracks and analyzes their comprehension and collaboration. It’s not about punishing AI use—it’s about teaching better engagement with it.

But we can’t ignore the real challenges. There are urgent questions around transparency, consent, and data privacy. Most people don’t know how their data is being used or even that they’ve consented to it, which fuels distrust. Then there’s the issue of hallucinations—when AI generates false but convincing information. Even if these errors improve, they make it harder for students, educators, and the public to know what to trust.


History does offer clues. The adoption of credit cards, the rise of the internet, even the first steam engine—all these revolutions began with doubt. Industry 5.0, where we now find ourselves, is meant to be human-centric, resilient, and sustainable. Trust must be built like it always has been: through transparency, public education, and careful regulation.

As educators, we need to embrace this moment. AI won’t go away. It will grow, evolve, and become as common as electricity. Our responsibility is to guide students into this new world—not by banning the tools, but by showing how to use them wisely.

And if you ask me what keeps me up at night? It’s not that students will use AI. It’s that they’ll use it uncritically, without ever learning to question what they’re given. Critical thinking is the one thing AI can’t replace. Our job is to make sure it never does.

Mohamed Awwad

Associate Professor at Cal Poly IME



2025 Enzzo, Inc. All Rights Reserved.
