[Photo: an Aibo in front of ACES]

Learning to Adapt to Illumination Changes

Here, we show the results of our approach to achieving color constancy on Sony Aibo ERS-210A robots under two different lighting conditions: Bright (1500 lux) and Dark (400 lux). Our approach involves training two separate color classifiers for the different conditions, and then using a similarity measure based on KL-divergence to allow the robot to determine which classifier to use.
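The selection step described above can be sketched roughly as follows. This is a minimal illustration, not our actual implementation: it assumes color observations are summarized as discrete histograms, and the reference histogram names (`bright`, `dark`) and function names are made up for the example.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-9):
    """KL(p || q) between two discrete distributions, e.g. color histograms.
    A small epsilon avoids division by zero for empty histogram bins."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def select_classifier(observed_hist, reference_hists):
    """Pick the lighting condition whose stored reference histogram is
    most similar (lowest KL divergence) to the observed histogram."""
    return min(reference_hists,
               key=lambda cond: kl_divergence(observed_hist,
                                              reference_hists[cond]))

# Hypothetical reference histograms for the two trained conditions.
references = {
    "bright": [8, 1, 1],  # most pixels fall in the "bright" bins
    "dark":   [1, 1, 8],
}
print(select_classifier([7, 2, 1], references))  # closer to "bright"
```

In practice the observed histogram would be accumulated over recent camera frames, and the robot would switch color classifiers whenever the selected condition changes.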

We used two identical robots, one (on the left in the videos) using our approach to recognize when the lights have changed, and the other using a standard classifier for the initial lighting conditions. In the first experiment, the lights are initially off. When they are turned on, both robots are confused. However, after a short while, the robot equipped with our color constancy approach recovers and continues moving towards the ball.

In the second experiment, the lights are initially on. Again, when the lights are turned off, the robot equipped with our approach recovers, while the normal robot does not. Recovery takes somewhat longer in this case because of different parameter settings.

One of the challenge events at the next RoboCup competition will involve changing lighting conditions. Additionally, future RoboCup rules may not guarantee the stable lighting we are used to. This line of research will allow us to be competitive both in the challenge events and in future RoboCup competitions.

Full details of our approach are available in the following papers:
