VEXID: Utilizing AI to Develop a VEX IQ Parts Identifier
In this session from the 2024 VEX Robotics Educators Conference, Mai Vu, AI Program Manager for the Innovation Center of St. Vrain Valley School District, introduces an AI-based VEX IQ Parts Identifier designed to help new students, teachers, and mentors in VEX Robotics. The tool aids in quickly identifying VEX IQ components, a crucial skill for beginners in robotics. Watch this video to gain insight into this tool as student developers Malcolm Smith and Rohan Mysore demonstrate its functionality, user experience, and impact on learners' journeys in robotics education.
So Mai and her students are going to show you something extremely innovative, something very cool, something I can guarantee you have never seen before. And I personally can't wait to use it myself. And I'm sure you guys will feel the same. So please give Mai and her students a warm welcome.
(audience applauding)
Hello. Thank you so much. And it's such an honor to be here. First off, what you're gonna see is something called VEXID, it stands for VEX Identification Product. And let's get started.
So what we have here, the students, you guys want to come up and do an introduction here? These students are from our Artificial Intelligence Leadership Project. And let me tell you a little bit about us. We are from the St. Vrain Valley School District, located in Longmont, Colorado. As I said, this is a project from our artificial intelligence team. I'm the AI program manager. To my left is Rohan Mysore, who's in 11th grade and part of the team, and Malcolm Smith, who's also part of the team.
So what is AI leadership as we see it in St. Vrain? It is a project team where students receive paid, work-based learning opportunities. These are high school students who explore potential career options through hands-on projects and mentoring from industry professionals. This was actually an internal project for our robotics leadership team. Malcolm here is the lead developer, Rohan was another developer, Grayson is our mentor, and I'm the project manager.
For the agenda today, what you're gonna hear from us is this: One, why did we create this project, and why was it necessary? Two, why did we integrate AI into robotics, with some technical discussion of the machine learning. Three, would this product be useful for our robotics teams? And finally, what's next: what do we envision after this project is over?
I want to step back and just tell you a little bit about robotics at St. Vrain Valley School District. In the last season, we had over 260 VEX teams, which is approximately 1,040 students participating in our robotics program. Not only that, the robotics leadership team also held a signature event called the NoCo RoboRidge, and 71 teams entered the competition. This achievement, again, could not have been possible without Ms. Alex Downing and Mr. Jake Roberts. We also had a strong presence at VEX Worlds. We had 19 total teams, with 13 teams in the VRC division. In the IQ division, we had six teams. We also had a team that just won the Excellence Award, which is an amazing accomplishment for us as a district. I'd also like to thank our leadership team: Axel Reitzig, who is in the audience, and, again, Ms. Downing and Mr. Jake Roberts.
When we thought of this project, I thought, what is the potential impact of VEXID? One of the big things is how can we enhance STEM education? When you take a look at robotics, you're always going to have new students learning. A lot of the students might not be familiar with the parts or where they belong. They rely on teachers, mentors, and students who've done this before. From a teacher's perspective, if you want to help kids, you might have a lineup of maybe 20 kids just waiting to ask you for help, right? There can be long queues just to get a question answered. So we thought, how can we leverage AI to help in this process? Maybe machine learning to the rescue.
Okay, so the idea was this: let's create a parts identifier to help all new students and mentors learn the vocabulary of the different robot parts. So who are the types of learners we could help?
We could help students who are non-native English speakers, students who are on IEPs, special needs students, and also independent learners, right? Sometimes they just wanna learn by themselves, right? They just wanna say, "I can do this, I wanna try this myself." And also new mentors, teachers, parents who might be learning alongside their kids. And unlike with a teacher, students can ask again and again and again, right? Because it's a machine that's helping them. So it'd be kinda like a robotic tutor. That's what we had envisioned.
So we think that with this product, we can help all students achieve success. Number one, you'll have much more self-sufficient students, right? They can do this on their own. They can say, "Huh, what does this mean? Where does this go? How may I integrate this?" And then for teachers: new teachers can learn alongside the students. But for seasoned teachers, and this is where I think integrating AI really helps, you don't have to answer the basic questions anymore, right? You can then help the students with building more complex robots or delve deeper into the programming of the robots.
So I think this also helps enhance STEM education because for one, you give the familiarity of robotic parts, and it can spark an interest in STEM, right? And number two, what we think is that early exposure to such topics can cultivate long-term passion and proficiency in this project.
With that being said, I'm gonna hand it over to Rohan, who's gonna give you a demo and just tell you a little bit more.
---
All right. Hello. Yeah. Check, check. So, hi. As Ms. Vu said, I'm Rohan. I'm an 11th grader. I'm a junior at our high school and I'm part of our Innovation Center's AI Leadership Team project team. So before we dive into a demo of the product that we made, I'm gonna take a little bit of a more personal take on this project.
So I'm a high schooler, but when I was in middle school, I did VEX. I was part of the first year of my middle school's VEX program. And to be honest, it was fun, but it was a struggle because our teachers were learning, we were learning, nobody really knew anything. So we spent over half the season learning what each individual part did. And I was doing the metal, the metal section of VEX, so what does a C channel do? What does this screw do? What does a bolt do? What does this weird looking thing do? I spent all of that time, and sure, it was a learning experience, but if I had a product like VEXID, it would've really elevated my learning experience to a whole new level. I would've been able to learn faster, learn better. And so that was the major goal of this project.
So now I'm gonna do a quick little demo of what we've built. It's still a prototype, but this is a demo of VEXID, and this is the UI that we've made. I'm gonna take on the role of a new student in VEX IQ. So let's say I have this part, give me a second. Sure, it's a simple part. I'm a new student. I have no idea what this does, right? I could go ask my teacher, but my teacher's helping another student. I could go search it up online, but I don't really know how to use the internet all that well. Or I can just put it in front of this webcam, go over here, hit the identify button, and it tells me what it is. It's a two by beam.
So as you can see, it tells me, it gives me a little picture, tells me what it is, and it gives me a little bit of a description of kind of what it is. So I'll read it out for those of you in the back. Stronger and wider than one by beams, two by beams are like the big brothers of your VEX IQ creations, giving them extra strength and stability.
They're like the superheroes that swoop in to save the day when your builds need a little extra support. So I'll do another example part. This thing, this is a little bit more of a complex part. I don't know what this does. I don't know how to use it. I'm gonna hit identify. 90 degree beams. It's a 90 degree beam.
Let's say I am a middle school student, I struggle with reading, but I still wanna learn about robotics. I can go over here and I can hit the read button. With 90 degree beams, you can make perfect corners, just like the corners of the room, in your VEX IQ builds. They're like the building blocks that give your creations a strong foundation and keep everything in place. Yeah, so it just read out the description for me. So now, as a student, I know what this part does. I kind of know how to use it, I can match it to the image, and I just learned something without having to search anything up, without having to ask my teacher. I just learned.
So I'm gonna try a couple more parts. So I'll do this thing. I don't know what this is. It looks like some sort of sensor. Touch LED. The Touch LED. And I'll read it out. The Touch LED is a cool tool that lets your robot sense when it's being touched and even light up to show different signals or messages. It's like a secret communicator between your robot and the outside world, adding an extra layer of interactivity and fun to your VEX IQ projects. And since this is still in the prototype stage, if you see on the right side of the screen, it tells you the confidence our machine learning model has in what it thinks the part is. So right now, it's a hundred percent confident this is a Touch LED sensor, which is correct.
So I'll just do one more. One by beams. Yeah, and so that's our product. We'll have a link at the end of this for our slides, and there's a video of some more parts if you'd like to see that. I'll go to the next slide.
So yeah, now we're gonna go a little bit into the product development timeline of this project. We separated this project into four phases: an ideation phase, a creation phase, prototype one (version one), and prototype two (version two).
So our ideation phase. We are on the AI team, and we love AI. We also love robotics. So we were trying to think of, as Ms. Vu said, how to merge these two. And so we thought about VEX IQ and like just part vocabulary and part naming and helping students learn. So we decided to come up with VEXID, something that can help you identify a part, learn this vocab, and just learn more about VEX.
We made four major goals at the start of this project. First, show the computer a part and have it tell you what it is. Second, present our idea to a board of teachers to see if it was feasible and actually deployable. Third, have a model that can successfully identify all VEX IQ parts; we defined success as 90% accuracy or above. Fourth, deploy this model in classrooms in our local community.
Now, I'll hand it over to Malcolm to talk a little bit more about image classification and how we built this project.
At this point, the project is a little bit all over the place, and I really wanted to solidify some of the terms and how this works, and especially what the role of AI is in this project. This is a very simple use case of AI. This is a very old problem that a lot of people have been working on for many, many years, and it's a problem called image classification. The idea is you have an image, and you want to sort it based on what the computer sees, and it'll place it into different classes based on how confident it is about which category it goes into. For example, you would have a class or a category for a cat, a dog, and a bird.
And this AI model would categorize images between the three. What we're building with VEXID is a program that would be able to classify all of the VEX parts and pictures of them into categories for all of those different parts or for all of those different types of parts so that students can very easily learn that vocabulary.
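To make the terms concrete, here is a minimal sketch of that classes-plus-confidence idea in Python. It is a toy nearest-centroid classifier over fake four-pixel "images", not the team's actual model; the class names and numbers are made up for illustration.

```python
import math

def classify(image_vec, class_centroids):
    # Score each class by closeness to its centroid, then softmax the
    # scores into a confidence distribution (the percentage VEXID displays)
    scores = {name: -math.dist(image_vec, c) for name, c in class_centroids.items()}
    top = max(scores.values())
    exps = {name: math.exp(s - top) for name, s in scores.items()}
    total = sum(exps.values())
    best = max(exps, key=exps.get)
    return best, exps[best] / total

# Toy four-pixel "images" standing in for real photos of each class
centroids = {
    "cat":  [0.9, 0.8, 0.1, 0.1],
    "dog":  [0.1, 0.2, 0.9, 0.8],
    "bird": [0.5, 0.5, 0.5, 0.5],
}
label, conf = classify([0.85, 0.75, 0.15, 0.10], centroids)
```

A real classifier learns its decision boundaries from training data rather than using fixed centroids, but the output contract is the same: a class name plus a confidence.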
So this is image classification. And what we ended up doing is we tried two different online tools for making image classification models: Apple's Create ML and Google's Teachable Machine. They're both very easy to use, and I'll be talking more about how you could maybe use them if you want to work them into your own projects.
The issue is for VEX IQ, there are 451 total parts, which means that to make a complete model that would be able to identify each of them individually, that would have to be 451 total classes. And the biggest thing about AI that you may have heard about, and it's kind of a big deal that you really want to think of whenever you're developing something like this, is the data set.
For each class, for example, we would need maybe hundreds of pictures of each individual part, and there are hundreds of parts, so that would make a massive data set, and that would be a lot of work to generate and build. So that was really one of the most difficult aspects of this: building a data set that's large enough to make a good model without having to put in too much effort and without spending a lot of time just laboring away on an endless dataset.
The last part about that is you want it to be as transferable as possible. So you want it to apply to any environment. You want it to be able to identify images that are taken in one classroom with very different lighting, and in another classroom, in all of these different environments. We want it to be deployable very widely and very accessibly, and that shaped how we built it.
So this is version one. The way that we solved the image generation problem is using a software called POV-Ray and a Python script that would generate a full complete image data set using only the CAD models that are found online for the VEX parts. We're given these models that'll look something like this. We will render them using this program. Then it will generate thousands of images for each of these different parts, and we are able to randomize everything. We randomize the background, the angle, the size, the blur, the texture, the color. That way, it'll be able to apply to any environment, any kind of camera, and any kind of thing. That made a very large data set and was able to generate it much more efficiently than just taking pictures and hoping that it works.
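The randomized rendering loop described above can be sketched roughly like this. This is a hedged reconstruction, not the team's actual script: the scene file name, parameter ranges, and `Declare=` identifiers are assumptions, though `Declare=` itself is a real POV-Ray command-line option for passing values into a scene file.

```python
import random

def random_render_spec(part_name, seed=None):
    """Build one randomized render configuration for a CAD part.
    Parameter names and ranges here are illustrative assumptions."""
    rng = random.Random(seed)
    return {
        "part": part_name,
        "yaw": rng.uniform(0, 360),      # spin the part around the vertical axis
        "pitch": rng.uniform(-30, 30),   # tilt it toward or away from the camera
        "scale": rng.uniform(0.8, 1.2),  # vary the apparent size
        "background_rgb": [round(rng.random(), 2) for _ in range(3)],
        "blur": rng.uniform(0.0, 2.0),   # simulate camera focus differences
    }

def povray_command(spec, width=224, height=224):
    """Assemble one POV-Ray CLI call; Declare= passes values into scene.pov."""
    return (
        f"povray +W{width} +H{height} scene.pov "
        f"Declare=Yaw={spec['yaw']:.1f} Declare=Pitch={spec['pitch']:.1f} "
        f"Declare=Scale={spec['scale']:.2f} "
        f"+O{spec['part']}_{spec['yaw']:.0f}.png"
    )

spec = random_render_spec("2x_beam", seed=42)
cmd = povray_command(spec)
```

Looping this over every CAD model and thousands of seeds is what turns a folder of part files into a full synthetic dataset.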
The first model successfully implemented this. We used the data generation and made a data set with about 30,000 images. It turns out you don't really need that many, but that was only for seven test parts in the very early prototype. The roadblock was that it only had 70% accuracy, and the dataset generator also ran locally on my laptop, so I just had to leave it running overnight. That wasn't very productive, and it wasn't really saving any time by generating it with a script. We weren't able to get the number of parts that we needed, and it wasn't very accurate. An example of that is parts like the angle beams, which are very similar. What we ended up doing was to place them into categories in order to increase the accuracy.
Another thing: this was my project for the World AI Youth Competition back in 2023, and it won third place worldwide.
And that was a lot for the development, and it really spoke to the promise of this project. But of course, I wasn't done. So we went on and built version two. For version two, we used online data generation. We used Amazon's AWS EC2 servers to run the same data generation script. That was able to run overnight, and it could run pretty much as long as you wanted it to. We used that to generate a dataset covering the entire second-generation VEX IQ Education Kit, so it would work for all of those projects.
We used a white uniform background and a top-down perspective. Instead of varying the angle to be any different type of angle, we just used the top-down perspective and batched them into groups. All of the angle beams that are very, very similar would all go in the same group. This is kind of the orientation problem. If you're varying the angle to be anything, you'll get something like this where all of the parts, especially these beams that are at different angles and stuff, will all line up like this and they'll all look very similar. It's impossible for the model to tell them apart.
In order to build this, we generalized and made the assumption that people would be looking at this from a top-down point of view, and we generated all of the 3D models in that orientation. There's still about 60 degrees of randomness in what the actual angle will be. But this definitely helped us, and making those assumptions about the environment this is gonna be used in definitely helped as we built the next version.
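Constraining the camera to a near-top-down cone can be sampled like this. Reading the "about 60 degrees of randomness" as plus or minus 30 degrees of tilt from vertical is an assumption; the rest is standard spherical geometry.

```python
import math
import random

def sample_topdown_angle(max_tilt_deg=30.0, rng=random):
    """Sample a camera direction inside a cone around straight-down."""
    tilt = math.radians(rng.uniform(0, max_tilt_deg))  # angle away from vertical
    azimuth = rng.uniform(0, 2 * math.pi)              # which way the tilt leans
    # Unit view vector; -z points straight down at the part
    return (
        math.sin(tilt) * math.cos(azimuth),
        math.sin(tilt) * math.sin(azimuth),
        -math.cos(tilt),
    )

vec = sample_topdown_angle()
```

Every sampled vector stays within 30 degrees of straight-down, which is what keeps the visually similar angle beams distinguishable in the renders.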
We also supplemented the dataset that we were generating online with the dataset that we were able to generate by taking pictures. Our AI Leadership Project Team at the Innovation Center all came together and were able to generate many, many images and just take pictures using our environment. That was a great way to validate the existing dataset that we had because we know that not only does it work on the generated dataset, but that same kind of information is also able to apply to the real world.
This is kind of another example. This is some of the data for one of the classes of the version two model. You can see there are some taken in different lighting; those were taken by our students, and the rest were all generated online. This had about 93% accuracy on the validation dataset, which means it passed our benchmark and was able to successfully identify most of the parts within the education kit. We also only needed an image dataset of about 10,000 images; we ended up trimming it down, actually. But we were able to generate over 50,000 different images using the online method, which really helps if we're trying to push this even farther and be even more accurate. Having that dataset to back you up is really important.
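The benchmark check itself is simple arithmetic: accuracy is the fraction of validation images labeled correctly. A toy run with hypothetical part names shows how 14 of 15 correct comes out to roughly 93%, clearing the team's 90% bar:

```python
def validation_accuracy(predictions, labels):
    """Fraction of validation images the model classified correctly."""
    correct = sum(p == t for p, t in zip(predictions, labels))
    return correct / len(labels)

# Toy run with hypothetical part names: 14 of 15 right is about 93%
preds = ["2x_beam"] * 7 + ["90_deg_beam"] * 7 + ["2x_beam"]
truth = ["2x_beam"] * 7 + ["90_deg_beam"] * 8
acc = validation_accuracy(preds, truth)
```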
And now I'm gonna hand it off to Rohan who's gonna talk a little bit more about how we built this within the community and how we presented it in our district.
Yeah, so going back to one of our four goals, one of our goals was to present VEXID to a team of teachers and VEX coaches in our community to receive feedback on whether this would be helpful or not. We were able to present our version two model to a committee of VEX coaches from our district. They're all really experienced, and their feedback was really, really insightful. Some of the major feedback we got is that this would be valuable for starting up a VEX program. This would be really valuable for accelerating learning in a new VEX environment.
Maybe for more experienced coaches and students, this would be more of a tool that they can use less for learning and more for efficiency. But they also gave us a lot of feedback on how we can optimize this, how we can deploy it, and a lot of the feedback which we took into our later steps.
So why is this helpful? Why is VEXID helpful? We covered this in the beginning, but we're going to sum it up. Kids can use VEXID as a tool to learn about VEX IQ parts in their classroom. Students will have instant access to a library of knowledge about VEX IQ parts through this model. Students will be able to quickly learn and expand their knowledge base, and new VEX coaches will be able to learn along with their students. They can just plug a part in, click it, and get it memorized in their head too. Overall, it's a quick learning experience for everybody involved with this project.
Going back to the feedback we got from the teachers, they gave us a lot of feedback on how we can deploy this. Sadly, with the time scope we were given, we were not able to deploy this in our local community, but we definitely plan to in the future. We want to integrate this model into VEX IQ programs in our local area, asking more teachers and coaches how this is helpful to them and how they could see it being used in their classroom. We also want to make the UI and the code open source so anybody can access it or build it from scratch. With access to a program like this, all you need is a little webcam and a blank sheet of paper. It's really easy, quick to use, and incredibly accessible.
So now, I'm gonna pass it to Malcolm who's gonna talk a little bit more about the next steps.
Yeah, so this project is something that's not done. Right now, we have a prototype that works on the kit, which is very useful, and is something that we're going to try to deploy in some of our local VEX programs. But there's still a lot of room to build on this project, make it more accurate, and use better systems. One reason for this is that it's really easy to generate new data. We're actually generating data for the next version, version three, as we speak. Because we're running it online, we can add classes very easily. We put a new part in by just dragging it into the folder, running the generation program, and then putting that data into the machine learning model. This means we're able to expand it very quickly and hopefully cover a lot more of the commonly used parts, making it accessible and applicable to all of VEX IQ.
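The "drop a folder in, get a class" workflow described above maps naturally onto a directory-per-class dataset layout, which is also what tools like Teachable Machine expect. A small sketch, with hypothetical folder names:

```python
import os
import tempfile

def discover_classes(dataset_dir):
    """Each subfolder of the dataset directory is one part class, so
    dropping in a new folder of images adds a class."""
    return sorted(e.name for e in os.scandir(dataset_dir) if e.is_dir())

# Demo with a throwaway directory; the part folder names are hypothetical
root = tempfile.mkdtemp()
for part in ["2x_beam", "90_deg_beam", "touch_led"]:
    os.mkdir(os.path.join(root, part))
classes = discover_classes(root)
```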
We also want to deploy this in classrooms, as Rohan was saying. This is something designed primarily for the teams in our district and for teams around the world who might benefit from it. We really want to get to the point where we're able to deploy this and ensure people are using it as soon as possible. We also want to make the code for this open source so that you're able to generate projects like this, especially the data generation program. I think things like that are very helpful as you're approaching new AI-based projects and similar kit parts or anything in robotics.
I think that this project, not only is it a project that's gonna help students within our district and VEX IQ students, but it's a project that's really helped us as students, me and Rohan, as we've been building this project. We've learned about all of the tools that are out there with AI and gone through all of the project development steps that we would go through as we were building a project later on in life.
And so, if you have students, maybe in older age groups, who are interested in this aspect of building a project, sharing it, and making some kind of contribution to the community, I would definitely recommend projects like this.
So just for us, what we used: we used Google's Teachable Machine. It's really easy. You just put your images into each of these different categories, it'll make the classes, it'll train all by itself, and it's really accessible. So if you wanna build a project like that, that's an option. Really, just be looking at these kinds of solutions and the kinds of projects you could build to help out the robotics community. I think that's a great way to go forward with projects like this.
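For reference, Teachable Machine's image export ships a `labels.txt` alongside the trained model, with one indexed class name per line (to my knowledge, lines like `0 2x Beam`). Pairing those names with the model's output probabilities is most of the UI glue a project like this needs; the part names and probabilities below are illustrative:

```python
def parse_labels(text):
    """Parse a Teachable Machine labels.txt: strip the leading index
    from lines like "0 2x Beam" and keep the class name."""
    return [line.split(" ", 1)[1] for line in text.strip().splitlines()]

def top_prediction(probs, labels):
    """Pair output probabilities with class names and return the best
    (name, confidence), the number a UI like VEXID would display."""
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

labels = parse_labels("0 2x Beam\n1 90 Degree Beam\n2 Touch LED\n")
name, conf = top_prediction([0.02, 0.03, 0.95], labels)
```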
And now I'm gonna hand it back off to Mai who's gonna give us a conclusion.
Thank you. Thank you so much for your time.
So in terms of next steps, what we'd like to do is potentially build another prototype of this and then put it into classrooms. We're ready to take any questions that you might have. We'll also have a table in the back after the presentation, so if you wanna come play with the model, we would appreciate that.
But before I do that, I'd like to call Rohan and Malcolm back on stage. One of the things we'd like to do in terms of the AI Leadership Team is to give real-world experience and also authentic work. I think we've achieved that in terms of our goal and our project. Can you guys please give a big round of applause?
(audience applauding)
They did a fantastic job. Okay, with that, we'll take any questions you might have.
So do you know when this will be live? Because I would love to use this. We have all these robotics kits, and half the battle is just getting them out. This would really fix that learning curve. So if you want any piloting, speaking for my district, we're happy to do that.
Fantastic, thank you. Just while I'm walking over, I'll mention, I think it's such a great example, like what you said, to share this with the community, because as we can see, so many people are interested in this. Have you considered adding, after you show the part and the description, another image of it connected? Some of the corner connectors and such would be helpful for kids to see how they are actually used.
Yes. Yes, we have. And that's phase two of our project there. Yes, thank you. And then Rohan would like to, go ahead.
Yeah, so that's actually a great question. That was literally in our last discussion about version three. We want to have teachers and others submit examples of where each part can be used on a robot, and that would just appear in the UI. So yeah, that's a really good point, and that's where we want to move with this project.
Can the students, is it Rohan and, what was your name?
Malcolm.
Malcolm, sorry. Can you talk a little bit more about your role in the whole project? What was your role in the whole project?
Yeah, so I started the project using Teachable Machine and wrote the data generation code. Rohan has helped a lot as well.
He's done a lot of the dataset management and made sure that we have all of the parts and that we're applying it to the right set and to the right kit, and a lot of the communication stuff as well. As we're trying to deploy this, we're talking with a bunch of teachers and we're having meetings with people in our district. Yeah, that sums it up.
Thank you so much for your presentation once again. It was really, really useful. I want to ask from the manager's perspective: how did you manage to give autonomy to these students to do this great work? And do you have this as class time, or is this a co-curricular activity where students dedicate this amount of time as an extracurricular thing?
Yeah, great question. First of all, I am blessed that I get to work with such gifted students, okay? From a project management point of view, as we said, they're part of our leadership team. We meet after school, it's work-based learning, and they actually get paid to do this. So they've been doing this for quite a while; we started late August, beginning of September. Did I answer your questions? Anything else?
Yeah, no, I'm just so mesmerized by the level of motivation and the great work that you're able to generate as a project. So I was just wondering like how it worked. Yeah, that answered my question. Thank you so much.
Awesome. Again, I just have to give compliments to the kids there, right? So I try to pick projects where the kids have intrinsic motivation.

Yeah, I'm kind of just bouncing off Ms. Vu. We're also incredibly grateful to have such good mentors and such a good opportunity to learn so much. I think Malcolm can definitely attest to that.
So for our innovators, what coding backgrounds did you guys have before you even started all of the machine learning and AI?
Yeah, so I have experience with a lot of programming languages. This is one of the first projects where I was really using Python a lot. So I'd done maybe one or two projects using Python before, but I was mostly in JavaScript, Java, C# and things like that. But then I moved into this, and this was a way to learn basically a new programming language and learn a lot of the libraries and how to use that.
Yeah, I've also done a couple projects in Python here and there with my dad, kind of for fun. I'm fluent in Java; I've taken the AP computer science classes at our school, and I think Malcolm has too. But Teachable Machine and machine learning are new to me. I joined the AI team this year, last year, I guess. It's really interesting to me, and I'm still learning.

And I also wanted to add that I am planning for the third version to use PyTorch and some of the existing machine learning libraries to generate these models. But the current version is really accessible. To build a Teachable Machine model, you don't need to know any code; it's just drag and drop, and it's very simple. So I wouldn't say a coding background is a gatekeeper, especially if you're working on machine learning projects like this.
Yeah, and if anyone doesn't know, you can literally just search up Teachable Machine on Google and it pops up right there. It's free. Just make a Google account. It's pretty easy.
Yeah, it is quite simple. And again, we're more than happy to show you afterwards if you want to stick around and we can show you how to do that.
Okay, well, in the true spirit of VEX, a very innovative and heartwarming story. Thank you so much. Let's give them another round of applause please.
(audience applauding)
(upbeat music)
Additional Resources
Like this video? Discuss it in the VEX Professional Learning Community.
Learn more about the VEX Robotics Educators Conference at conference.vex.com.