In this post, we feature Dr Leonard Ng Wei Tat, an Assistant Professor from the School of Materials Science and Engineering. While his primary research focuses on data-driven techniques for printed electronics and high-throughput manufacturing, he has been pioneering the thoughtful integration of AI in education through Professor Leodar, a Singlish-speaking AI teaching assistant. In his article, Dr Ng shares crucial insights from its use in his data science and artificial intelligence course, emphasising the importance of selective deployment while maintaining traditional academic rigour. His pragmatic approach demonstrates how institutions can effectively balance technological innovation with fundamental learning principles, drawing from his expertise in both technical implementation and pedagogical needs.
“Aiyoh, you ask one very solid question lah!” When students hear this distinctly Singaporean response from our AI teaching assistant, they know they’re engaging with something unique in the rapidly evolving landscape of educational technology. But as universities worldwide rush to implement AI solutions in their classrooms, our experience at Nanyang Technological University (NTU) offers crucial insights into what makes an effective AI teaching assistant—and more importantly, why such a tool isn’t suitable for every course.
Our journey began with a fundamental question: In an era where AI tools are as ubiquitous as search engines, how do we create meaningful educational experiences while preserving the essential aspects of traditional learning? Rather than fighting the inevitable tide of AI adoption, we chose to embrace it selectively by developing our own pseudo-instructor: Professor Leodar, a Singlish-speaking AI teaching assistant with a carefully crafted personality and deep integration with our curriculum.
Since January 2024, Professor Leodar has handled 12,334 queries from 154 students, but the numbers tell only part of the story. What makes our approach distinctive isn’t just the creation of a virtual instructor with local cultural awareness—it’s our careful consideration of where and when such tools are appropriate.
Let me be absolutely clear: there is no substitute for the fundamental academic experiences that shape deep understanding. The midnight oil must still burn as students work through complex derivations by hand, synthesise information from multiple sources, and develop the critical thinking skills that come from wrestling with difficult concepts. Our AI teaching assistant is a supplement to—never a replacement for—these essential learning processes.
Our development process focused on three key elements often overlooked in the rush to implement AI solutions. First, we carefully selected technical subjects where answers have clear right/wrong outcomes as pilot courses. We deliberately avoided subjects requiring deep conceptual synthesis, complex derivations, or extensive critical analysis—areas where traditional study methods remain irreplaceable. Second, we ensured strong educator leadership throughout development, recognising that pedagogical expertise is crucial in determining appropriate use cases. Third, we built a comprehensive knowledge base from course materials before implementing any AI functionality.
The technical architecture relies on Retrieval Augmented Generation (RAG) technology deployed via Amazon Web Services (AWS), allowing real-time updates for course materials and announcements. However, we’re careful to limit its scope to appropriate applications—primarily in technical subjects where solutions follow clear patterns and methodologies.
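To make the RAG idea concrete for readers unfamiliar with it, here is a minimal, framework-free sketch of the retrieval step: given a student query, the system first fetches the most relevant snippets from the course knowledge base before any language model is involved. The snippets, scoring function, and query below are illustrative assumptions only, not Professor Leodar’s actual implementation, which runs on AWS with far richer embeddings.

```python
# Minimal sketch of RAG retrieval: rank course snippets by similarity to a
# query, then pass the best matches to the language model as context.
# The "embedding" here is a toy bag-of-words counter, not a real model.
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': lowercase term counts (stands in for a vector model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    """Return the k snippets most similar to the query."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

# Hypothetical course-material snippets standing in for the knowledge base.
knowledge_base = [
    "Gradient descent updates weights in the direction of the negative gradient.",
    "The assignment deadline for Lab 3 is Friday, 6 pm.",
    "Cross-validation splits the data into folds to estimate generalisation error.",
]

context = retrieve("When is the lab assignment due?", knowledge_base, k=1)
# The retrieved context is then prepended to the student's question in the
# prompt sent to the language model, grounding its answer in course material.
```

Because the knowledge base is plain data rather than model weights, updating course announcements is just a matter of adding or replacing snippets, which is what enables the real-time updates described above.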
The data tells a fascinating story about student engagement. During our first continuous assessment week, we recorded over 1,200 queries—but the pattern of usage proved more revealing than the volume. Our highest-performing students don’t use Professor Leodar as a crutch; they use it as one tool in a broader learning strategy. While struggling students often seek quick answers, top performers use the AI assistant to supplement their understanding after putting in the necessary groundwork through traditional study methods.
In our data science and AI course, successful students consistently demonstrated an ability to grasp fundamental concepts through traditional learning before using AI to explore applications. They spent long hours in the lab not because they were relying on AI, but because they were doing the essential work of understanding core principles, working through problems by hand, and developing deep conceptual understanding.
The impact has been significant: 97.1% of students report positive experiences with Professor Leodar. However, this success comes with an important caveat: it’s limited to specific types of courses and specific types of questions. We actively discourage its use for tasks requiring deep analytical thinking, theoretical understanding, or creative problem-solving—areas where traditional study methods remain essential.
For institutions considering their own AI teaching assistants, our experience offers several crucial warnings. First, resist the temptation to deploy chatbots across all courses—many subjects require traditional learning methods that cannot and should not be shortcut. Second, carefully evaluate where AI assistance is truly beneficial versus where it might hinder deep learning. Finally, maintain rigorous academic standards that require students to demonstrate understanding through traditional means.
As we prepare students for a workforce where AI proficiency isn’t optional, the question isn’t whether to embrace AI tools but how to integrate them meaningfully—and selectively—into education. Our experience with Professor Leodar suggests that success lies in finding the right balance: leveraging AI where appropriate while preserving the irreplaceable aspects of traditional academic rigour. The midnight oil will still burn in university libraries and laboratories, not because students are asking chatbots for answers, but because genuine understanding requires time, effort, and dedicated study.