
AI Literacy for All Students

By Aimee DeFoe Nov 19, 2025

Discussion about artificial intelligence (AI) surrounds us, and our students, on a daily basis. As educators, we find ourselves making frequent decisions about when and how to use AI, often without the background knowledge we need to feel confident that we are making the right choices. This is where teaching AI literacy comes in.

AI literacy is the ability to understand, evaluate, and make informed decisions about the use of AI technologies, and to use them in ways that strengthen human thinking and creativity.

Understanding AI technologies begins with the realization that AI literacy involves much more than learning how to use the large language models we are now all familiar with. For students to truly understand AI and how to use it to put good into the world, we must help them engage with how it works. We must ensure they understand that AI is not magic, but human-driven computation that they have the power to shape and direct. When students investigate how AI systems sense, classify, and act, they gain the confidence to question, refine, and responsibly harness these technologies, building skills essential for participating fully in an AI-powered world.

Additionally, AI literacy must be intentionally taught to all students to ensure that every learner can thrive in a future shaped by AI and has equitable access to careers in engineering, data science, computer science, and even related fields that do not yet exist.

Robots Make AI Learning Visible

Given the importance of understanding how AI works, how can we take core ideas of AI, such as computer perception and how computers “sense” their environments, and make them visible to students of all ages? Teaching AI literacy with a robot is an ideal way to make AI concepts tangible. Students can use sensors on a robot to collect data, then use that data to solve problems and make decisions in real-world contexts.

For example, young students can begin learning about computer perception—the extraction of meaning from sensory information—by exploring how sensors on robots interpret the world around them. When they discover that sensors like the Eye Sensor on VEX 123 or VEX GO read color as a numerical hue value, rather than the nuanced way humans perceive light, they can directly compare human and robot perception. This builds an early understanding that computers “see” differently from people. As students investigate further and notice how changes in lighting affect those hue value readings, their understanding deepens. They begin to recognize not only how sensor perception works, but also its limitations and the conditions that influence AI-driven interpretations of the world.

VEX 123 Activity for Hue Value Scavenger Hunt
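To make the hue-value idea concrete, here is a minimal Python sketch (not VEXcode; the read_hue() function is a made-up stand-in for a real Eye Sensor) showing how a robot classifies color by comparing a number against ranges, and how a lighting shift can move that number enough to flip the label:

```python
# Illustrative sketch only: read_hue() is a hypothetical stand-in for
# an Eye Sensor reading, not a VEXcode API call.

def read_hue(lighting_shift=0):
    """Pretend sensor: return a hue value in degrees (0-359).
    A real reading drifts with ambient light; lighting_shift
    simulates that drift."""
    base_hue = 120  # a green object under neutral light
    return (base_hue + lighting_shift) % 360

def classify_hue(hue):
    """Map a numeric hue to a coarse color name, the way a robot
    must: by comparing numbers against ranges, not by "seeing" green."""
    if hue < 30 or hue >= 330:
        return "red"
    elif hue < 90:
        return "yellow"
    elif hue < 150:
        return "green"
    elif hue < 270:
        return "blue"
    else:
        return "purple"

# The same object under three lighting conditions: the hue number
# moves, and near a range boundary the color label can flip.
for shift in (0, 25, -40):
    hue = read_hue(lighting_shift=shift)
    print(f"hue={hue:3d} -> {classify_hue(hue)}")
```

Running this, the same “green” object reads as green twice and yellow once, which is exactly the kind of boundary behavior students can discover for themselves by dimming the classroom lights.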

Deepening Understanding with AI Vision

VEXcode AIM AI Vision Dashboard showing the live feed from the AI Vision Sensor

As students move into upper elementary and middle school, they can extend their understanding of sensor data through AI Vision on platforms like VEX AIM, IQ, EXP, and VEXcode VR. Hands-on experiences with AI Vision help learners begin to see that computer perception involves classifying and interpreting sensor data. This opens the door to deeper conversations about accuracy, reliability, and the limits of AI.

For example, in the Trick the Sensor activity (available for VEX AIM in the VEX AIM Intro Course, as well as for VEX IQ and EXP in the 2025 Hour of AI Activities), students investigate how AI Vision identifies objects by deliberately trying to “trick” it. They test drawings, classroom items, and 2D or 3D shapes in different field locations to see whether the robot detects them and how it labels them. When students discover that the sensor confidently misidentifies certain objects or loses track of them depending on angle, size, or distance, they begin to ask exactly the kinds of questions that build AI literacy: Why did it think a marker was a barrel? What data is the AI relying on? How could we improve its ability to classify correctly? Students can follow up on this experience by watching this video about how the AIM AI Vision Sensor is trained. These investigations help students understand that classification is a data-driven process—not magic—and that humans play an essential role in shaping how AI systems interpret the world.
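One way to capture what students observe in Trick the Sensor is a simple investigation log. The sketch below is plain Python with invented labels and confidence scores, not the actual AI Vision API; it simply illustrates the kind of classification data students reason about:

```python
# Illustrative sketch only: detect_objects() stands in for an AI
# Vision snapshot; the labels and confidence scores here are invented.

def detect_objects(scene):
    """Pretend AI Vision result: a list of (label, confidence) pairs.
    A real sensor reports classifications for its pre-trained objects."""
    sample_results = {
        "orange barrel":       [("barrel", 0.97)],
        "drawing of a barrel": [("barrel", 0.81)],  # fooled by a 2D picture
        "orange marker":       [("barrel", 0.62)],  # similar color, wrong class
        "barrel far away":     [],                  # too small to detect
    }
    return sample_results.get(scene, [])

# Students log what the sensor claims for each test object, then ask:
# what data is the classifier actually relying on?
for scene in ("orange barrel", "drawing of a barrel",
              "orange marker", "barrel far away"):
    detections = detect_objects(scene)
    if not detections:
        print(f"{scene:22s} -> nothing detected")
    for label, confidence in detections:
        print(f"{scene:22s} -> '{label}' ({confidence:.0%} confident)")
```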

Middle and high school learners can continue to develop their coding skills and their understanding of AI with the AI Vision Sensors for VEX IQ and VEX EXP. Using the AI Vision Utility Window in VEXcode, students can view eight different kinds of sensor data about pre-trained objects streamed in real time, and use that data to make coding decisions. When students do this, they are evaluating the performance of AI and designing solutions that account for the sensor’s strengths and weaknesses. STEM Labs such as Clean Water Mission for EXP provide real-world context, allowing students to experience firsthand how AI, coding, and human problem-solving intertwine to accomplish meaningful tasks.
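As a sketch of what “making coding decisions based on streamed data” can look like, the following Python uses a hypothetical get_detection() stub in place of the real AI Vision stream; the idea, using an object’s apparent width as a rough distance proxy, is the same one students apply on a physical robot:

```python
# Illustrative sketch only: get_detection() and the drive helpers are
# hypothetical stand-ins, not the VEXcode AI Vision API.

def get_detection():
    """Pretend one frame of AI Vision data for a pre-trained object:
    the kinds of fields streamed in the AI Vision Utility Window."""
    return {"label": "ball", "center_x": 160, "center_y": 120,
            "width": 48, "height": 46}

def drive_forward():
    print("driving forward")

def stop_driving():
    print("stopping: the object fills enough of the frame")

# An object's apparent width grows as the robot approaches it, so
# width can serve as a rough distance proxy -- a decision rule built
# directly on streamed sensor data.
NEAR_WIDTH = 80  # pixels; tuned by experiment to account for noise

detection = get_detection()
if detection["width"] < NEAR_WIDTH:
    drive_forward()
else:
    stop_driving()
```

Probing where a rule like NEAR_WIDTH breaks down (a wide object far away, a narrow one up close) is precisely the kind of strengths-and-weaknesses analysis described above.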

Students can also practice these skills in VEXcode VR, where they can explore AI Vision on the virtual Hero Bots in the V5RC and VIQRC Playgrounds. Each virtual Hero Bot is pre-trained to detect game objects and report information such as height, width, and x/y coordinates. Students must analyze this data and transform it into strategic decisions—determining how to navigate, when to pick up an object, or how to score the most points in a time-based challenge.
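For instance, a hypothetical steering rule (sketched in plain Python below, not the VEXcode VR API) shows how the x coordinate a Hero Bot reports can become a navigation decision:

```python
# Illustrative sketch only: the detection data and image width are
# invented, not values from the VEXcode VR API.

IMAGE_CENTER_X = 160  # assuming a 320-pixel-wide camera image
DEADBAND = 20         # pixels of tolerance before turning

def steer_toward(detection):
    """Turn the robot until a detected game object is centered,
    using only the x coordinate the AI reports."""
    error = detection["x"] - IMAGE_CENTER_X
    if error > DEADBAND:
        return "turn right"
    elif error < -DEADBAND:
        return "turn left"
    else:
        return "drive straight"

# Three reported positions for a game object, from left to right:
for x in (40, 160, 290):
    print(f"object at x={x:3d} -> {steer_toward({'x': x})}")
```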

In V5RC Push Back and VIQRC Mix & Match, students learn that AI doesn’t give a robot instant or perfect awareness. The robot must gather sensor data, interpret it, and act on it through the logic students create. Errors in detection or interpretation invite students to refine their code, building the understanding that AI systems must be tested, validated, and improved over time.

VEXcode VR V5RC AI Sensor detecting game elements with labels displayed above
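One classroom-friendly way to show this sense-interpret-act loop, and the need for validation, is the sketch below. It is plain Python with an invented detection stream, not actual game code: the robot acts only once a flickering detection has been confirmed several frames in a row.

```python
# Illustrative sketch only: the frame list stands in for a live stream
# of AI detections; the labels and the "act" step are invented.

# A noisy detection stream: the object flickers in and out of view.
frames = ["mix_stack", None, "mix_stack", "mix_stack", "mix_stack", None]

REQUIRED_STREAK = 3  # act only after three consecutive confirmations
streak = 0

for frame in frames:
    # Sense: take the latest detection (None means nothing was seen).
    # Interpret: track how consistently the object has been detected.
    streak = streak + 1 if frame == "mix_stack" else 0
    # Act: only once the detection has been validated by repetition.
    if streak >= REQUIRED_STREAK:
        print("confirmed detection -> drive to the stack")
        break
    print(f"frame={frame!r} streak={streak} -> keep scanning")
```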

Exploring AI Perception with Rover Rescue

In the Rover Rescue Playground in VEXcode VR, students complete missions that depend on AI perception, such as finding minerals or detecting aliens. They observe the data streamed from the VR Rover’s AI in real time through a minimap, strengthening the connection between what the robot perceives and the decisions it makes. Each of these VEXcode VR experiences reinforces a crucial concept: AI is a tool that requires oversight, critique, and thoughtful direction from humans.

VEXcode VR Rover Rescue playground showing the AI sensor detecting minerals, obstacles, or aliens with labels above
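A mission like this ultimately reduces to branching on what the AI reports. The sketch below is hypothetical Python, not Rover Rescue’s actual interface, illustrating that decision structure:

```python
# Illustrative sketch only: the labels and responses are invented,
# not Rover Rescue's actual API or minimap data.

def respond(label):
    """Branch on what the rover's AI reports, mission-style:
    collect minerals, avoid aliens, steer around obstacles."""
    actions = {
        "mineral":  "drive to it and collect",
        "alien":    "back away and reroute",
        "obstacle": "turn to avoid",
    }
    return actions.get(label, "keep exploring")

# Labels as they might stream in from the rover's perception:
for label in ("mineral", "obstacle", "alien", "unknown"):
    print(f"AI reports '{label}' -> {respond(label)}")
```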

When students learn how AI systems sense, interpret, and act on data, they begin to see AI not as something mysterious, but as a human-designed tool they can understand, question, and improve. Across the VEX Continuum, hands-on robotics experiences make the inner workings of AI visible and accessible, giving learners the opportunity to test ideas, investigate limitations, and refine solutions. In doing so, students build the core of true AI literacy: the ability to use AI responsibly and creatively, in ways that strengthen, rather than replace, human thinking.

To celebrate CS Education Week, all Enhanced and Premium VEXcode VR Playgrounds are unlocked and free to all throughout the month of December!

To view all of the 2025-2026 Hour of AI Activities, select this link.

Want to learn more about how to teach AI literacy with VEX? Schedule a 1-on-1 with a VEX expert in PD+.