Hi Bill -- thanks for the reply. I haven't been working on this for a couple of months, so I apologize for the late follow-up to my earlier post.
I have been experimenting with injecting additional 5V power into my LPD8806 strips, with unexpected results.
First of all, the "before" scenario... in my setup, as I mentioned, a couple of the particularly long stretches of pixels aren't lighting correctly. I have read elsewhere that when there isn't enough voltage being delivered to one of these strips, you'll see the colors drop off toward the end of the run. In my case, that's not what's happening at all -- the entire run of about 20 feet is off. If I program RGB white (255R, 255G, 255B) for the whole strip, I get a very dim brownish red through the entire strip. I metered the voltage on the strip, and it's only reading 2.6V, so it's very clear what the problem is.
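For what it's worth, that kind of sag is consistent with ordinary wire resistance over a long feed. Here's a rough back-of-the-envelope sketch -- the wire gauge, current draw, and length below are assumptions for illustration, not measurements from my actual setup:

```python
# Rough voltage-drop estimate for a long 5V feed.
# All numbers are illustrative assumptions, not measurements from my setup.

AWG22_OHMS_PER_FT = 0.0165   # approx. resistance of 22 AWG copper per foot
run_ft = 20                  # one-way length of the feed
current_a = 2.0              # assumed strip current at full white

# Round-trip resistance: current flows out on +5V and back on ground.
loop_resistance = 2 * run_ft * AWG22_OHMS_PER_FT
drop_v = current_a * loop_resistance
print(f"Estimated drop: {drop_v:.2f} V -> {5.0 - drop_v:.2f} V at the strip")
```

Even these modest assumed numbers eat over a volt; heavier current or thinner wire would sag the strip much further, which is in the ballpark of what I'm seeing.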
So, per the suggestion below, I tried injecting some additional power down the strip. In your message below, you mention that if I'm powering from two different power supplies (rather than just feeding another line from the same power supply to the opposite end of the strip, for example), I should cut the +5V connection running between the two segments of the strip, such that the data, ground, and clock feeds are shared between the two segments, but the first segment gets +5V from one power supply, and the second segment gets +5V from the other.
When I read that, I assumed it had something to do with what would happen if you ran the power supplies in series -- rather than delivering 5V to the strip, the combined power supplies would be delivering 10V, which would destroy the strips.
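Put as bare arithmetic, my (possibly wrong) mental model of the worry was this -- it's just a picture of the reasoning, not an electrical simulation:

```python
# Series vs. parallel supplies, as simple arithmetic.
# A mental model of my worry, not a claim about how any given wiring behaves.

supply_v = 5.0

series_v = 2 * supply_v    # supplies stacked in series add: 10 V across the strip
parallel_v = supply_v      # paralleled feeds (shared ground) stay at the rated 5 V

print(series_v, parallel_v)
```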
Here's the thing, though...I tested the above scenario on a short, expendable piece of strip to see what would happen: I provided power, data and clock from the front end of the strip, and then added an ADDITIONAL +5V AND ground from a second power source. The strip immediately got a lot brighter, which was expected, although I wondered if the strip would wink out from being over-driven.
Surprise, though... when I metered the strip, it still showed +5V -- exactly what I'm shooting for. The strip seemed to be doing exactly what it was supposed to -- the colors REALLY popped and it seemed to be responding to its color information exactly as I'd programmed.
The only thing I noticed was that my LED controller (a SANDevices E682) is reporting that there's pixel power present, even with the power supply feeding the controller cut off. So the 5V is leaking backwards to the controller, but it doesn't seem to care about that.
So, the question:
In your message below, you mention cutting the +5V connection between strips when powering from two different power supplies. From my testing, at least, it does *not* appear that using two power supplies results in an over-volt condition. I do notice that power is leaking back into the controller, such that the LED controller is reporting the presence of +5V even when the main power supply is disconnected, but is that really a problem? What is the concern we're trying to avoid by cutting the +5V connection between strips?
I have tried running a second run from my main power supply down to the other end of the strip (which, presumably, doesn't require cutting any connections between the strips). It DOES work, but the strip only meters +3.4V and isn't as bright, obviously, as it would be at +5V. So I hope you see my dilemma -- I'm giving up a pretty significant amount of brightness by not using a secondary power supply. If I'm going for brightness (and I am), I'd like to do what's going to get the maximum bang for the buck, but I can't tell if there's another consideration that I'm missing.
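A toy model of why feeding both ends from the same supply helps but doesn't get me all the way to +5V (the current and wire-resistance figures are assumed values, not measurements): with feeds at both ends, the worst-case point is the middle of the strip, and each feed only has to carry roughly half the total current.

```python
# Toy comparison: feeding a run from one end vs. from both ends.
# Numbers are illustrative assumptions, not measurements from my setup.

current_a = 4.0          # assumed total strip current
feed_resistance = 0.5    # assumed round-trip resistance of one feed wire, ohms

# Single feed: the one wire carries all the current.
single_end_drop = current_a * feed_resistance

# Feeds at both ends: the current splits roughly in half per feed.
both_ends_drop = (current_a / 2) * feed_resistance

print(f"one end: {5 - single_end_drop:.1f} V, both ends: {5 - both_ends_drop:.1f} V")
```

So the second run from the same supply halves the drop rather than eliminating it, which would square with the +3.4V I'm metering instead of a full +5V.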
Thanks for your help!