The 15th and 20th LEDs are always on at bright white, no matter what. LEDs 16, 17, 18, and 19 pick random colors (though they only ever seem to be off, bright white, bright red, bright yellow, or bright purple).
I'm a bit stumped at this point. I'm wondering if the LPD6803 chip in the 15th or 16th LED isn't shifting correctly and is passing corrupt bits down the chain. My next move would be to cut those LEDs out of the strand and see if it works without them. But before I cut into my beautiful new toy, I thought I'd ask if anyone else has ideas on how to troubleshoot this.
Thanks in advance for any help! (Details below)
Details on my setup:
Power. The Arduino (UNO) is powered over USB. The LED strand is powered by a 5V external supply (a spare ATX power supply I keep for this purpose). The ATX ground is connected to a ground pin on the Arduino board and to the ground (blue) wire on the strand, and the 5V from the ATX supply goes to the red wire on the strand. I'm using pin 2 for data and pin 3 for clock.
Code. I wrote a very simple sketch while debugging. It turns on one LED for half a second, then moves to the next LED, until it has gone through all 20. As I said, the first 14 behave exactly as I expect and the last 6 do not. I have also tried adjusting the setCPUmax value up and down, to no effect. In any case, here's the sketch:
Code:
#include <TimerOne.h>
#include <LPD6803.h>

int dataPin = 2;   // 'yellow' wire on the strand
int clockPin = 3;  // 'green' wire on the strand

// 20 pixels, bit-banged on the data/clock pins above
LPD6803 strip = LPD6803(20, dataPin, clockPin);

void setup() {
  strip.setCPUmax(50);  // % of CPU time spent refreshing the strip
  strip.begin();
  strip.show();         // start with all pixels off
}

void loop() {
  for (int i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, Color(17, 17, 0));  // dim yellow
    strip.show();
    delay(500);                                // pixel on for half a second
    strip.setPixelColor(i, Color(0, 0, 0));
    strip.show();                              // pixel back off
    delay(100);                                // short gap before the next one
  }
}

// Pack 5-bit r/g/b channels into the LPD6803's 15-bit GGGGGBBBBBRRRRR word
unsigned int Color(byte r, byte g, byte b)
{
  return ((unsigned int)g & 0x1F) << 10 | ((unsigned int)b & 0x1F) << 5 | ((unsigned int)r & 0x1F);
}