Many makers default to standard 60 LEDs/m strips, but these often produce visible hotspots ("dotting") unless paired with thick diffusers that kill 30-40% of the light output.
I've been digging into the specs for High-Density (120 LEDs/m) strips to see if they are worth the extra cost and complexity. Here is what I found:
1. The "Dotting" Physics
With 60 LEDs/m, the pitch between sources is ~16.7mm. In a channel less than 10mm deep, that light doesn't mix well. Moving to 120 LEDs/m halves the pitch to ~8.3mm, allowing a seamless "neon" effect even with clear lenses. A quick sketch of the pitch math is below.
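Here's a minimal sketch of the pitch-vs-depth check. The 1:1 depth-to-pitch ratio is my rule of thumb, not a datasheet spec; tune it for your diffuser material:

```python
# Back-of-envelope check for LED "dotting" vs. channel depth.
# Assumption: dots blend once the diffuser sits at least ~1x the
# pixel pitch away from the LEDs (rule of thumb, not a spec).

def pixel_pitch_mm(leds_per_meter: int) -> float:
    """Center-to-center spacing between LEDs, in millimeters."""
    return 1000 / leds_per_meter

def min_mixing_depth_mm(leds_per_meter: int, ratio: float = 1.0) -> float:
    """Approximate channel depth needed for a smooth line of light."""
    return ratio * pixel_pitch_mm(leds_per_meter)

for density in (60, 120, 144):
    print(f"{density:>3} LEDs/m: pitch ~{pixel_pitch_mm(density):.1f}mm, "
          f"wants ~{min_mixing_depth_mm(density):.1f}mm of diffusion depth")
```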
2. The Current & Voltage Drop Trade-off
This is the critical part often missed:
12V Systems: Doubling the LEDs doubles the current (Amps) through the same copper traces, so the I×R voltage drop doubles with it. You might see the far end of a 2m strip dim by 20%.
24V Systems: This is the sweet spot. A 24V 120 LEDs/m strip delivers the same power at half the current of its 12V counterpart, and because resistive losses scale with I²R, halving the current cuts those losses to a quarter. That allows longer runs without power injection; the model below puts numbers on it.
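Here's a rough voltage-drop model for a strip fed from one end. Both constants are assumptions for illustration, not datasheet values: I'm assuming 28.8W/m for a 120 LEDs/m white strip at full brightness and 0.5 ohms/m of round-trip trace resistance (copper weight varies a lot between vendors):

```python
# Rough voltage-drop model for a strip fed from one end only.
# Both constants are assumed for illustration; check your strip's spec.

POWER_PER_M = 28.8    # W/m, assumed nominal draw of a 120 LEDs/m strip
TRACE_R_PER_M = 0.5   # ohms/m round trip (+V and GND traces), assumed

def end_of_strip_drop(v_supply: float, length_m: float) -> float:
    """Voltage drop at the far end of a uniformly loaded strip.

    Current tapers linearly toward the far end, so the far-end drop
    integrates to I_total * R_total / 2 rather than the full I*R.
    """
    i_total = POWER_PER_M * length_m / v_supply
    r_total = TRACE_R_PER_M * length_m
    return i_total * r_total / 2

for v in (12, 24):
    drop = end_of_strip_drop(v, 2.0)
    print(f"{v}V, 2m run: ~{drop:.1f}V drop ({drop / v:.0%} of supply)")
```

With those assumptions, the 12V 2m run loses ~2.4V (20% of supply, matching the dimming above) while the 24V run loses only ~1.2V (5%).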
3. Thermal Density
More LEDs in the same footprint means higher heat density (W/cm²). You must use aluminum channels for thermal dissipation, or you will accelerate lumen depreciation.
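A rough estimate of that heat density, assuming a 10mm-wide flexible PCB and that ~80% of the input power ends up as heat (both figures are my assumptions; LED efficiency and FPC width vary):

```python
# Heat-density estimate: watts dissipated per cm^2 of strip surface.
# Assumptions: 10mm-wide FPC, ~80% of input power converted to heat.

STRIP_WIDTH_CM = 1.0   # assumed 10mm-wide flexible PCB
HEAT_FRACTION = 0.8    # assumed fraction of power dissipated as heat

def heat_density_w_per_cm2(power_per_m: float) -> float:
    """Heat flux over one strip face, ignoring edge effects."""
    area_per_m_cm2 = STRIP_WIDTH_CM * 100  # strip face area per meter
    return power_per_m * HEAT_FRACTION / area_per_m_cm2

for label, watts in (("60 LEDs/m @ 14.4W/m", 14.4),
                     ("120 LEDs/m @ 28.8W/m", 28.8)):
    print(f"{label}: ~{heat_density_w_per_cm2(watts):.2f} W/cm^2")
```

Doubling the density doubles the flux the bare FPC has to shed, which is why the aluminum channel stops being optional.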
Has anyone else experimented with 144 LEDs/m (WS2812B style) for architectural lighting? How did you handle the data refresh rates vs. power?
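For anyone running those numbers, here's the back-of-envelope I'd start from. The timing constants (800kbps data rate, 24 bits per pixel, a >=50us latch between frames) come from the WS2812B protocol; the 60mA-per-LED figure is the commonly quoted full-white worst case, and real draw depends on your patterns and brightness cap:

```python
# WS2812B refresh-rate vs. power back-of-envelope.
# Timing is per the WS2812B protocol; the 60mA full-white current
# is the commonly quoted worst case, not a measured value.

BIT_TIME_US = 1.25    # 800kbps data rate
BITS_PER_LED = 24     # 8 bits each for G, R, B
RESET_US = 50         # minimum latch time between frames
MA_FULL_WHITE = 60    # worst-case current per LED, assumed typical figure

def max_fps(num_leds: int) -> float:
    """Upper bound on frame rate for one data line driving num_leds."""
    frame_us = num_leds * BITS_PER_LED * BIT_TIME_US + RESET_US
    return 1_000_000 / frame_us

def worst_case_amps(num_leds: int) -> float:
    """Supply current at 5V with every pixel at full white."""
    return num_leds * MA_FULL_WHITE / 1000

for meters in (1, 2, 5):
    n = 144 * meters
    print(f"{meters}m at 144 LEDs/m ({n} px): ~{max_fps(n):.0f} fps max, "
          f"~{worst_case_amps(n):.1f}A worst case at 5V")
```

Note that newer WS2812B batches reportedly need a longer (~280us) latch, which barely changes the frame rate but matters for timing-sensitive drivers.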