Adjusting a Color Signature
In this video, Alaina walks through adjusting a color signature after testing the Turn to Blue project in a new environment. You will learn about hue and saturation and how to use the hue and saturation range sliders in the AI Vision Utility to make a color signature more resilient.
This video references a prior AI Vision Sensor video and uses a project with a color signature already configured. Be sure to watch the Detecting a Color with the AI Vision Sensor video first.
Welcome back to the VEX Classroom. My name is Alaina, and in this video, we're going to be continuing on with what you've already done in the AI Vision Sensor video series. Now, we're going to talk about Adjusting a Configured Color. But before we can get there, we need to go over to our field to test the Turn to Blue project that you were previously running with Audra.
For a reminder, in this project, the robot turns until the AI Vision Sensor detects the color blue that was configured on the sensor, and then the robot stops moving. Because we have this in a forever loop, any time you move the object, the robot will continue to look for that configured blue. So let's go over to the field and talk about it.
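Conceptually, the project boils down to a few lines. Here is a minimal Python sketch of that logic, where detects_blue(), turn_right(), and stop_driving() are hypothetical stand-ins for the VEXcode blocks used in the video, not real VEX API calls:

```python
# A minimal sketch of the Turn to Blue logic.
# detects_blue(), turn_right(), and stop_driving() are hypothetical
# stand-ins for the VEXcode blocks in the video, not real VEX API calls.
def run_turn_to_blue():
    while True:                # the "forever" loop from the project
        if detects_blue():     # the AI Vision Sensor reports the configured blue
            stop_driving()     # stop as soon as blue is in view
        else:
            turn_right()       # otherwise keep turning to the right
```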
[Music Cue]
So now we are at our field, and we're going to run that same project that you did with Audra in the other room. We're running this project where the robot's going to turn to the right until it detects blue, but we are seeing the robot actually go right past it. It must be catching something blue off to the side, but that's definitely a green ring and not one of our blue Buckyballs. So let's talk about what just happened.
We saw that when our robot was running this project, the Turn to Blue project, and we moved from that classroom to our field, it did not work as intended. A lot of this comes down to the lighting. Our surrounding environment is different, the ambient light is different, and the detected colors are going to be different as a result. That's because color is all about the way our eyes and our brain, or in this case, our sensor, interpret different wavelengths of visible light. Color is not some inherent property of an object. It's all about how the observer, whether it be our eyes or the AI Vision Sensor, interprets that interaction with visible light.
Any change to the lighting in our environment is going to change our perception of color, especially to the sensor. We can check how the sensor perceives these colors in the AI Vision Utility again. I'm going to open up our devices window for the AI Vision Sensor, and I can go back to Configure. Remember that your EXP Brain needs to be off, and the sensor itself needs to be plugged into your computer with that USB-C cable. Now, if I move my sensor here, I can see what it's seeing. I'm seeing the edge of this blue ball register as blue for just a moment, but when I'm actually looking at this ball, it's not recognizing the color blue, except in these very specific circumstances.
We have two options here. One, we could go through the same process as before and set a new color; if we added another color, we could capture this change in color, which is slightly lighter in this situation. Or, two, we can start using these sliders down here for our hue and saturation ranges. Everything we're doing as we change these ranges is about tuning our configuration. We're making it more resilient, so it works in different conditions, not just when one particular face of the Buckyball is in one specific type of lighting.
Let's talk about the difference between hue and saturation. The first property that I want to talk about is hue. Hue is the color being perceived, as defined by its position on the color wheel, which you can see over on the right-hand side. The color wheel, like all circles, spans from 0 to 359.99 degrees, and each degree value equates to a different hue, a different color that we would see. Something like our Buckyball is probably not quite as dark as 240 degrees. It might be somewhere in this 220- to 230-degree range, versus a light green, which would be closer to 120 degrees, or red, which would wrap around from somewhere near 350 degrees to about 15 degrees. This is all based on the color of that object.
[Music Cue]
If we go back to VEXcode, our hue range is right here. If we move that slider, you can see we're starting to get more and more recognition. This range says to add or subtract 31 degrees from the hue value of this configured color that we have up in the corner. So now we're seeing what that range can do to help us. But this obviously is not enough to detect the whole Buckyball.
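To picture what that slider does numerically, here is a small Python sketch of a hue-range check, assuming an illustrative configured hue of about 225 degrees and the plus-or-minus 31 degrees from the video. Note the modular arithmetic: because the color wheel wraps around, a range centered near red (say, 355 degrees) has to match hues on both sides of 0.

```python
def hue_in_range(detected_hue, configured_hue, hue_range):
    """Return True if detected_hue is within +/- hue_range degrees
    of configured_hue, wrapping around the 360-degree color wheel."""
    diff = abs(detected_hue - configured_hue) % 360
    return min(diff, 360 - diff) <= hue_range

# Illustrative values: a Buckyball blue near 225 degrees, +/- 31 degrees.
print(hue_in_range(210, 225, 31))  # True: within the range
print(hue_in_range(120, 225, 31))  # False: green, well outside
print(hue_in_range(5, 355, 31))    # True: red wraps past 0 degrees
```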
So now we can look at something like our saturation. In contrast to hue, saturation is the intensity or the purity of the color. Something like the red that we use for VEX EXP is going to be this one all the way here on the right. That is our 100%. It is the purest version of that red. As you go down in percentage, it's going to become more muted or dull. It becomes more gray. You can see that it's less intense. It's a less pure color. So that's the range of saturation.
Saturation does not have a specific measure like hue, where each value equates to a specific degree. It's all relative amounts of saturation, which is why we use a percentage to compare those saturations. Now, back in VEXcode EXP, we can see our saturation range goes from zero to roughly one. That's the decimal equivalent of the percentages we were talking about in that other image. So this range also works as a plus or minus: we're adding or subtracting a certain percentage of saturation from the value of this initially configured color.
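Putting the two sliders together, a color-signature match conceptually looks like the sketch below: the detected hue must fall within the hue range, and the detected saturation within the saturation range. This is an illustration of the idea with made-up numbers, not the sensor's actual firmware logic; the 0-to-1 saturation values mirror the slider in VEXcode.

```python
def matches_signature(hue, sat, cfg_hue, cfg_sat, hue_range, sat_range):
    """Conceptual color-signature match: hue in degrees (0-360),
    saturation as a 0-to-1 decimal, mirroring the VEXcode sliders."""
    hue_diff = abs(hue - cfg_hue) % 360
    hue_ok = min(hue_diff, 360 - hue_diff) <= hue_range
    sat_ok = abs(sat - cfg_sat) <= sat_range
    return hue_ok and sat_ok

# Illustrative configured blue: hue 225 degrees, saturation 0.80.
print(matches_signature(230, 0.65, 225, 0.80, 31, 0.20))  # True
print(matches_signature(230, 0.40, 225, 0.80, 31, 0.20))  # False: too gray
```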
As I drag this around, I'm getting different amounts of recognition from the sensor, because we're telling it to recognize a wider range of saturations. This includes additional shades of blue, whether more intense or less intense, since that's exactly what saturation describes. Now the detection fills in as a solid color here, and we can see a very stable box around it.
Now we want to test the resiliency. I talked about making sure that our configuration is resilient and works in different conditions. If I rotate my robot and go over to this other Buckyball over here, I can see that it's still working for this Buckyball too. It is shifting a little bit more, so I can move the saturation or the hue sliders to slightly increase those ranges. Now I can see a more stable square around it.
If you need to, you can also freeze the video and change your saturation range and your hue range. So now I can freeze it, and I can see here is one very clear square that is being recognized with this particular hue and saturation range setting. Now that I know both of these Buckyballs are being detected with this new updated and adjusted configured color, I can hit the close button. Then I want to make sure that I select done to save those changes.
I can actually see the different settings that we have for our hue and saturation ranges. Once that is set, I can go ahead and download this updated project to the Brain. Now I have my robot back on the field. The new project is downloaded with the updated configuration, and I'm going to run it. We should see our robot turn and stop at that blue Buckyball. If I move that Buckyball, the robot continues to move until it sees the second one. Now I know that my robot is behaving as intended.
This was all an error in our color configuration, which we were able to fix in the utility. We tested our project, identified that the error was in our configured colors, and then tuned the color configuration to make it more resilient by adjusting the hue and saturation sliders.
So now let's have a good recap of everything that happened and why the project that worked so well right here on this table didn't work so well when we first got to the field. First, let's address that question. We talked over at the field about how lighting impacts color. Our perception of color comes from light waves, and because those are light waves, they're impacted by everything around us in terms of ambient lighting, whether it's your overhead lights or the sun shining brightly through the window. All of that is going to affect your color configuration.
In general, I will say it is best practice to configure your color where you're going to be testing your project. So if we wanted to test our Turn to Blue project over on the field, the first thing we should have done when we got to the field is reconfigure our color, or tune it, to make it more resilient in that other environment.
The way that we tuned our color and made it more resilient is by adjusting our hue and our saturation. To recap, hue is that perception of color, and it's based on a color wheel: from 0 to 360 degrees, that position is your hue value. The slider in the VEXcode utility adjusts the range of hue values that will be detected as that color, whether it's plus or minus 5 or plus or minus 35. That range is what gives us control over which colors we want our robot to be able to collect data on.
And finally, we have our saturation. Our saturation is how intense or how pure a color is. So you have something that is very pure and very bright, like this Buckyball that might have 90 to 100% saturation, or something that is more muted, more gray, that's going to have a lower saturation. So it will have the same hue value, whether it's this blue or a more muted grayish blue. But those saturation percentages can be different. So that is the other range that we were adjusting in VEXcode. Saturation is a relative measurement, which is why we talk about it in terms of percentages. And we saw that in VEXcode with the 0 to 1 range, using the decimal version of those percentage values.
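You can check this same-hue, different-saturation idea with Python's standard colorsys module: a vivid blue and a washed-out blue come out with roughly the same hue but very different saturation values. The RGB values below are illustrative, not measured from the Buckyball.

```python
import colorsys

# Illustrative RGB values on a 0-to-1 scale, not measured from the Buckyball.
vivid_blue = (0.10, 0.25, 0.85)
muted_blue = (0.45, 0.52, 0.65)

for name, rgb in [("vivid", vivid_blue), ("muted", muted_blue)]:
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    print(f"{name}: hue = {h * 360:.0f} degrees, saturation = {s:.2f}")
# vivid: hue ~228 degrees, saturation ~0.88
# muted: hue ~219 degrees, saturation ~0.31
```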
So now that we know how hue and saturation ranges affect our color configuration for the AI Vision Sensor, we can use that information as we move forward with our testing to determine if there's something we need to adjust in our color configuration, or if something is wrong in our code. We can use that to help solve these problems as we move forward using our AI Vision Sensor.
If you have any questions about hue or saturation or color configuration with the AI Vision Sensor, I would love to hear them in the PD+ community. Or you can book a 1-on-1 session and we can talk about it together.
See you next time.
Additional Resources
Want to learn about coding with the AI Vision Sensor in Python? Schedule a 1-on-1 or post in the VEX Professional Learning Community.