Following up on my LED Matrix bringup post - since I'm using the Source Measure Unit (SMU) as my 5V power supply, I can quickly measure the 5V current draw of the LED matrix under different clock and pixel data conditions. Of course, these measurements come from one LED panel at room temperature - not a large enough sample set for setting specifications. But the data can serve as a rough estimate, useful if you're planning a portable LED matrix application.
Notice that the LED panel current consumption decreases as clock frequency increases. This suggests that the LED 'on time' is decreasing. I'm guessing this is caused by frequency-invariant delays in the LED driver shift registers.
For example, the output of the LED driver shift registers may be updated a fixed 50 nanoseconds after latching the data (for any clock rate), whereas I start loading the next line exactly 6 clock periods later. As I increase the clock rate (decreasing the period), the time between latch + 50 ns (LED on) and latch + 6 clock periods (LED off) becomes shorter. I need to track down a datasheet for these parts to see if this holds water.
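A quick sketch of that hypothesis in Python - note that the 50 ns latch-to-output delay and the 6-clock-period window are the guessed numbers from above, not values from a datasheet:

```python
def led_on_time_ns(clock_hz, t_delay_ns=50.0, periods_to_off=6):
    """Hypothetical on-time per line: LED turns on a fixed t_delay_ns
    after latch, and off when the next line load begins, periods_to_off
    clock periods after latch. All timing values are assumptions."""
    period_ns = 1e9 / clock_hz
    return max(periods_to_off * period_ns - t_delay_ns, 0.0)

# If this model is right, the duty cycle (on-time over the 6-period
# line window) should fall as the shift clock speeds up:
for f_mhz in (5, 10, 20, 30):
    on_ns = led_on_time_ns(f_mhz * 1e6)
    window_ns = 6 * 1e9 / (f_mhz * 1e6)
    print(f"{f_mhz:>2} MHz: on-time {on_ns:6.1f} ns, duty {on_ns / window_ns:5.1%}")
```

Under these assumed numbers, the duty cycle drops from about 96% at 5 MHz to 75% at 30 MHz, which would line up with the falling current draw.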
Less LED 'on time' per unit of time should also decrease the perceived brightness of the display; I'll take another quick video to demonstrate this effect. I didn't notice it during my initial testing, but I wasn't cycling through clock rates with a fixed display pattern (and the display is really bright, so I wasn't looking directly at it too much!).