Technology: Traffic light tester

Publisher: Timofey Uvarov | Publish Date: 2025-04-21 | Tag: #News#

As autonomous driving systems evolve, so too must the testing infrastructure that supports them. While much attention is paid to lane detection, object tracking, and general scene understanding, traffic light recognition remains a critical — yet under-tested — aspect of visual autonomy.

 

At Fourier Image Lab, we set out to change that.

 

A Purpose-Built Simulation System

 

We’ve developed a traffic light simulation and testing platform designed from the ground up to evaluate camera perception systems under real-world signal conditions and failure modes.

 

The product consists of a matrix of 40 LED rows, each containing red, yellow, green, and white emitters. The brightness of each individual row is independently controllable, allowing us to reproduce the entire range of lighting scenarios, from faint incandescent bulbs found in legacy intersections to high-intensity directional LED signals.



Traffic Light Simulation Array: 40 rows × 4 colors (Red, Yellow, Green, White), fully programmable



This granular control is crucial, because while modern HDR sensors may advertise 100+ dB dynamic range using 16- or even 24-bit containers, the data is ultimately tone-mapped to 8-bit before reaching the neural network. That tone mapping process introduces compression artifacts and quantization noise, often causing a loss of critical precision where it matters most — in small, bright objects like traffic lights.
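To make the quantization effect concrete, here is a minimal numpy sketch. It assumes a simple global Reinhard-style tone curve; real ISP tone mappers are proprietary and typically local and adaptive, but the highlight compression behaves similarly. Two highlight intensities that differ by a factor of two in linear light end up only a few 8-bit codes apart:

```python
import numpy as np

def tone_map_8bit(linear, white=100.0):
    """Compress linear HDR values into an 8-bit container.

    Illustrative global Reinhard curve, not an actual ISP pipeline.
    """
    mapped = linear / (1.0 + linear)            # compress highlights
    mapped = mapped / (white / (1.0 + white))   # normalize so 'white' maps to 1.0
    return np.clip(np.round(mapped * 255), 0, 255).astype(np.uint8)

# Two highlight intensities differing by 2x in linear light:
print(tone_map_8bit(np.array([40.0, 80.0])))    # -> [251 254], only 3 codes apart
```

A downstream classifier sees nearly identical pixel values for signals whose physical radiance differs by a factor of two, which is exactly the precision loss described above.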

 

Saturation, Hue Drift, Flicker, and Signal Failure


In real-world traffic scenarios, one of the most frequent failure modes is spectral channel saturation — where a red or yellow signal overwhelms one color channel but not the others. This can make it difficult or impossible to distinguish between lights of different colors, especially when the image is tone-mapped and passed through ISP pipelines.
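The hue drift that accompanies channel saturation is easy to reproduce numerically. In the sketch below, the linear channel responses of a red LED are hypothetical round numbers, not measurements: once the red channel clips while green keeps climbing, the computed hue drifts from red toward yellow.

```python
import colorsys

# Hypothetical linear response of a red LED as exposure/gain increases;
# a little green and blue leak in through the color filter array.
for gain in (1, 2, 4, 8):
    r = min(120 * gain, 255)   # red channel clips first
    g = min(25 * gain, 255)    # green keeps climbing after red clips
    b = min(10 * gain, 255)
    h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    print(f"gain {gain}: RGB=({r},{g},{b}) hue={h * 360:.0f} deg")
# hue moves from ~8 deg (red) to ~41 deg (orange/yellow) as red saturates
```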


 

To further challenge perception systems, our platform also allows precise control over flickering parameters, including frequency and duty cycle. This enables testing under regional signal conditions — from low-frequency flickering incandescent bulbs used in parts of the U.S. to high-frequency PWM-modulated LED signals typical in Asia and Europe. By mimicking these flicker styles, we can identify how various sensors and algorithms respond to temporal instability — a major source of missed or ghosted signals.
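The interaction between PWM flicker and exposure time can be sketched with a few lines of numpy. The parameters here (90 Hz PWM, 30% duty cycle, 1 ms exposure) are illustrative, not the platform's presets; the point is that a short exposure can land entirely inside the LED's off-phase and record no signal at all.

```python
import numpy as np

def captured_fraction(freq_hz, duty, exposure_s, t0, steps=10_000):
    """Fraction of an exposure window during which a PWM-driven LED is on."""
    t = t0 + np.linspace(0.0, exposure_s, steps)
    phase = (t * freq_hz) % 1.0
    return np.mean(phase < duty)

# Sweep the exposure start time across one PWM period:
for t0 in np.linspace(0.0, 1 / 90, 5):
    f = captured_fraction(90.0, 0.3, 0.001, t0)
    print(f"start {t0 * 1e3:5.2f} ms -> LED visible {f:.2f} of the exposure")
# output ranges from 1.00 down to 0.00: the light can vanish entirely
```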


 

Our system monitors:

- Which channels saturate first under varying light intensities
- The hue and saturation values of red vs. yellow signals across the entire brightness range
- Delta thresholds between hue/saturation values, triggering alerts when color signals become ambiguous or fail classification standards
- Signal dropout or misinterpretation under flickering conditions, especially in multi-exposure or rolling shutter sensor architectures


 

We treat channel saturation and flicker-induced dropout as functional failures, and flag them accordingly. This helps developers not only benchmark their sensor and ISP stack, but also improve resilience in real-world driving environments.
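As an illustration of that flagging logic, here is a minimal sketch of a hue-delta ambiguity check. The 15-degree threshold and the helper functions are assumptions made for the example, not the platform's actual API, and hue wrap-around at 360 degrees is ignored for brevity.

```python
import colorsys

AMBIGUITY_DEG = 15.0   # assumed minimum hue separation between red and yellow

def hue_deg(rgb):
    r, g, b = (c / 255 for c in rgb)
    return colorsys.rgb_to_hsv(r, g, b)[0] * 360

def flag_ambiguous(red_rgb, yellow_rgb):
    """True -> functional failure: red and yellow are no longer separable."""
    return abs(hue_deg(red_rgb) - hue_deg(yellow_rgb)) < AMBIGUITY_DEG

print(flag_ambiguous((255, 60, 20), (255, 210, 40)))   # well separated -> False
print(flag_ambiguous((255, 180, 60), (255, 210, 40)))  # drifted red   -> True
```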


 

Image Analysis and Signal Verification

The platform supports both automated and manual workflows for extracting meaningful metrics from camera-captured frames.




Automatic Detection


Automatic detection of the traffic light pattern



The system detects light positions automatically and aligns them with signal geometry.




Manual Corner Assistance


Manual corner detection


In edge cases where automatic detection fails, users can manually annotate the grid corners, and the system interpolates the individual signal points from those corners.
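A minimal sketch of that interpolation step, assuming a simple bilinear scheme over the four annotated corners (for strongly oblique views, a perspective homography would be the natural refinement):

```python
import numpy as np

def grid_points(corners, rows=40, cols=4):
    """Interpolate signal centers from annotated corners (tl, tr, br, bl)."""
    tl, tr, br, bl = (np.asarray(c, dtype=float) for c in corners)
    pts = np.empty((rows, cols, 2))
    for i in range(rows):
        v = i / (rows - 1)
        left = tl + v * (bl - tl)      # walk down the left edge...
        right = tr + v * (br - tr)     # ...and the right edge
        for j in range(cols):
            u = j / (cols - 1)
            pts[i, j] = left + u * (right - left)
    return pts

# Hypothetical corner clicks, in pixel coordinates:
centers = grid_points([(100, 50), (300, 60), (310, 900), (90, 880)])
```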


 

RGB Overlay Map


RGB values are computed by averaging pixel values at the center of each illuminated signal


Side-by-Side Comparison of Two Cameras



Naked-Eye Observations

 

In the left image, color fidelity is preserved across a wide dynamic range — with most of the signal rows retaining accurate hue and separation. Saturation only occurs at the bottom-most row, where white levels clip as expected under extreme intensity.


In contrast, the right image exhibits early saturation and color breakdown. From the 5th row downward, red, yellow, and green signals begin to blur into each other, and meaningful color distinction is lost. Only the top 3–4 rows appear properly exposed.


 

Another noticeable artifact is color bleeding, particularly visible on the top (red) and bottom (white) rows. The red signal appears orange-yellow, and the white signal exhibits an unexpected greenish tint. While the exact cause of this distortion is unclear, it may stem from local tone mapping or histogram equalization techniques (such as CLAHE) that operate non-uniformly across image regions — potentially skewing channel balance and local contrast.
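Readers who want to probe this hypothesis can reproduce the failure mode with OpenCV by applying CLAHE to each channel independently; because each channel is redistributed on its own statistics, the local color balance shifts. The file name below is a placeholder for any captured frame.

```python
import cv2

img = cv2.imread("capture.png")   # placeholder path; any BGR test frame
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))

# Per-channel CLAHE: each channel is equalized independently,
# which can skew hue around small bright regions.
out = cv2.merge([clahe.apply(ch) for ch in cv2.split(img)])

print("channel means before:", img.reshape(-1, 3).mean(axis=0))
print("channel means after: ", out.reshape(-1, 3).mean(axis=0))
```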


 

These results underscore the importance of controlled signal testing under known brightness and flicker conditions. The visual inconsistencies between sensors highlight how critical both sensor behavior and ISP tone mapping are in preserving safety-critical color information.

 

Quantitative Analysis: RGB Channel Response to Signal Intensity


 

To complement the visual comparison, we analyzed the RGB values of red and yellow signals across increasing intensity levels (rows 1–10) from each camera. These values were obtained by averaging pixel intensities from the center of each illuminated signal.
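A minimal sketch of that measurement, assuming a small fixed patch around each detected signal center; the patch size and the centers array are illustrative, not the platform's internals.

```python
import numpy as np

def mean_rgb(image, center, half=3):
    """Average a (2*half+1)^2 pixel patch around a signal center."""
    x, y = int(center[0]), int(center[1])
    patch = image[y - half:y + half + 1, x - half:x + half + 1]
    return patch.reshape(-1, 3).mean(axis=0)

# e.g. red-signal readings for rows 1-10, given detected centers:
# red_readings = [mean_rgb(frame, c) for c in centers[:10, 0]]
```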


 

Left Camera (Better Performer)



Left Camera – RGB values for Red and Yellow signals across rows 1 to 10


 

For the red signal, all three color channels (R, G, B) show gradual progression with saturation beginning only at row 7 — which corresponds with the white-out observed in the last row of the photo. This indicates a healthy dynamic response.


 

For the yellow signal, however, the green channel saturates first, followed by blue, and finally red. This staggered saturation is a critical insight, and it becomes visible when viewing cropped sections of the red and yellow rows from this camera, shown below.


Top row: a red signal that becomes yellow, then white, as intensity increases


 

We can observe the red signal temporarily adopting a yellow/orange hue before becoming fully saturated. This shift can fool a classifier into reading red as yellow — a safety-critical misclassification scenario.


 

Right Camera (Underperformer)


Right Camera – RGB values for Red and Yellow signals across rows 1 to 10


 

Here, the data tells a more alarming story. For the red signal, all RGB channels saturate fully by row 3, offering no gradient or usable color discrimination beyond that point.

 

The yellow signal is even more problematic: green is saturated from the very first row, blue reaches its maximum by row 3, and red climbs until it saturates around row 5.

 

This kind of response indicates that the camera is not capturing the color signal faithfully at all, and that color balancing or tone-mapping inside the ISP may be distorting or flattening the output. In fact, none of the signals were represented correctly — making this camera (an actual automotive-grade sensor with auto-exposure enabled) highly unsuitable for traffic light detection.


 

These plots, combined with our structured simulator and visualization tools, highlight the urgency for rigorous camera qualification in perception systems. Even among sensors marketed as “HDR-ready,” internal ISP design choices, tone-mapping, and exposure logic can lead to catastrophic misinterpretations of safety-critical cues like traffic signals.