Artificial intelligence is reshaping education, offering innovative ways to personalize learning and bridge achievement gaps. However, deploying AI in classrooms isn’t without its challenges. Two key considerations – balancing simplicity with building trust in AI systems – often go hand in hand. Together, they determine whether these tools will revolutionize learning or remain underutilized.
After all, few teachers or parents want their students to learn from tech-driven platforms that they neither understand nor trust to do the job.
For the educational technology teams working with the Learning Engineering Virtual Institute (LEVI), this lesson has bubbled to the surface during the first half of the project. By analyzing their successes and challenges, we can uncover strategies to make AI tools both accessible and trustworthy for educators and students.
The Simplicity Imperative
AI-powered educational tools thrive on personalization. Platforms like Rori and ALTER-Math can adapt to students' needs and provide them with tailored learning experiences. Yet, if this adaptability goes too far, it can veer into complexity and leave users – especially teachers – feeling confused and overwhelmed.
Early iterations of the math tutor chatbot Rori, for instance, gave students significant freedom to choose their own learning paths. But when the developers watched students using the platform, they saw that many found the choices confusing and often selected topics they had already mastered, because that path was easier and quicker than tackling more challenging work. Recognizing this, the team streamlined the user interface and made learning pathways more structured.
Such simplifications do more than improve usability – they foster confidence in the systems themselves. When students and teachers feel comfortable navigating a tool, they are more likely to trust its recommendations and analytics. In Rori’s case, simplifying the curriculum pathways reassured teachers that the platform understood their students’ needs and guided them appropriately.
Trust: The Currency of Edtech
Another emerging lesson from the LEVI project is that trust is non-negotiable. This includes both trust in the learning platform’s functionality and its ethical design. The LEVI teams have discovered that transparency and relatability are pivotal to building trust with their users.
Take MATHStream, a video-based math learning tool under development by Carnegie Learning. Developers initially believed that hyper-realistic avatars would be necessary to foster engagement with young people – who would want to learn math from a cartoon? Surprisingly, students connected just as well – or even better – with cartoonish avatars. This finding shaped their approach to designing AI-based learning agents, focusing on relatability rather than perfection and hyper-realism.
Importantly, building trust in AI with users also means being upfront about AI’s capabilities and limitations. MATHStream found that students appreciated knowing when they were interacting with a bot and understanding where it might fall short. This honesty reduced unrealistic expectations and fostered a sense of partnership between users and the technology.
The Sweet Spot: Simplicity + Trust
Simplicity and trust aren’t just complementary – they are interdependent. Simplifying user interfaces and workflows often makes AI tools more transparent and easier to understand, and that transparency, in turn, builds trust.
Eedi, another LEVI partner, developed algorithms to predict students’ performance on math skills with remarkable accuracy. By presenting these insights in actionable formats for teachers and students, Eedi has balanced the complexity of its backend with the simplicity of its user-facing interface. This transparency earned the trust of educators, who could see the direct impact of these predictions on learning outcomes.
ALTER-Math, from the University of Florida, further illustrates this synergy. The tool uses AI-powered “teachable agents” that allow students to “teach” the AI, fostering an interactive and empowering learning experience. These agents rely on a structured knowledge graph that keeps interactions simple while maintaining a high level of personalization. This approach has produced consistent learning gains across student performance levels, including among students who have traditionally struggled with educational technology. By combining simplicity and trust, ALTER-Math ensures that all students feel supported, regardless of their starting point.
Building Trust in AI: The Road Ahead
The future of AI in education isn’t just about what the technology can do – it’s about how well it can connect with the people it serves. As AI continues to evolve, the interplay between simplicity and trust will remain central to its success in classrooms. These tools must be both intuitive for students and transparent for adults. The work of the LEVI teams may serve as a blueprint, showing how deliberate design choices can build trust in AI and transform educational technology into a reliable tool for students and teachers.