Thanks mikronauts, but I read that product page before getting started on this. Unfortunately, despite what the page itself states, it doesn't actually contain or link to a datasheet for the LPD8806. I then found this LPD8806 datasheet elsewhere, and read it through as well.
I am glad that, once you were provided with a link to a product page (which had a link to the data sheet), you were able to find another data sheet.
I then found this tutorial
on controlling an LPD8806 LED strip on the Adafruit wiki, and followed it using a Netduino rather than a Raspberry Pi. It worked.
Yes, Adafruit's product page also links to some Arduino source code for the LPD8806 LED strip.
Good, we finally prodded you into doing some research for yourself.
After that, I decided to see what I could do myself and attempted to have the LED strip cycle through all color values at a fast rate. Defying all odds, I managed it, as seen here
Nope, if you can adapt sample code from somewhere, you are not defying odds, as you don't really have to fully understand what is going on if the code ports easily.
It's a mystery how I managed to achieve exactly the result I wanted without, as you say, understanding these LEDs or SPI, but here it is, somehow cycling away on my desk. I did this by using SpiDevice.Write() to write a byte array containing color values to the LED strip as often as I possibly could, as I explained in previous posts here. My guess is that the SpiDevice class takes care of the subtler timing issues and the SPI communication itself, so that you can more or less call SpiDevice.Write() whenever you want without dealing with any of that. Hence my original question: how can I call SpiDevice.Write() faster, if the only thing currently limiting me is the DispatcherTimer?
You proved you did not understand SPI, bit rates, etc. in your earlier post, where you wrote:
"Theoretically, I'd be able to change an LED's color 20,000,000 times a second, correct? I'd be able to do this if I could call SpiDevice.Write() 20,000,000 times a second, correct?"
My response was:
"Your posts make it very clear that you do NOT understand these LED's, SPI, spi speed, or the concept of one bit per clock for the data transfer."
This was due to the following:
- 20 MHz is one bit per clock; you cannot update RGB values with a single bit. You CANNOT change an LED's color per clock. With 24-bit RGB data you need at least 24 clocks (you should have known this).
- Calling SpiDevice.Write() 20,000,000 times per second is nuts, and likely not even possible. It sends at least eight bits per call, so calling it 20M times would send 160,000,000 bits. To change a single LED 20M times per second, at just 24 bits per LED (ignoring overhead etc.), you would need a 480,000,000 Hz clock.
- If you have more than one LED on the strip, multiply 480,000,000 Hz by the number of LEDs to get the clock you would need to update them all 20M times per second (ignoring overhead).
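The arithmetic in these points is easy to check for yourself. Here's a quick sanity check (plain Python, purely to confirm the numbers; the 8-bits-per-call and 24-bits-per-LED figures are the ones used above):

```python
# Sanity-check the bit-rate arithmetic from the points above.
calls_per_s = 20_000_000   # the proposed 20M SpiDevice.Write() calls per second
bits_per_call = 8          # a write call sends at least one byte
bits_per_led = 24          # 24-bit RGB data per LED (ignoring framing overhead)

# Raw bits on the wire if every call sent only a single byte:
print(calls_per_s * bits_per_call)  # 160000000 bits/s

# Clock needed to recolor just ONE LED 20M times per second:
print(calls_per_s * bits_per_led)   # 480000000 Hz = 480 MHz
```

Both results are far beyond the strip's 20 MHz clock, which is the whole point being made here.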
This is why I wrote what I did, and why I knew you did not understand any of this.
What was bothering me was that you were not even trying to understand.
I'm aware that my knowledge of SPI is minimal at best, but I'm willing to wager that part of the problem in this thread can also be attributed to a lack of knowledge of the WinIoT SpiDevice class on the part of some of those participating in this discussion, or this whole issue wouldn't have come up. "Ignoring questions, and repeating pre-conceived notions formed without understanding the subject matter" apparently works both ways.
No. I was trying to prod you into getting the information needed to help you (data sheet etc).
The WinIoT SpiDevice class was totally irrelevant until you knew how to control the device over SPI, and had figured out some basic facts like how many bytes each LED needs and what clock rate is actually required, both of which are independent of the OS or library.
Meanwhile, my original question was "How would one write to an SPI device as fast as possible in a Universal Windows App?" For the sake of anyone finding this thread in the future, here is the answer I've found elsewhere: create an endless loop on a separate thread, and use System.Diagnostics.Stopwatch to time the calls to SpiDevice.Write(). It's not much faster than DispatcherTimer, but noticeably so.
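For anyone who wants to see the shape of that pattern, here is a rough sketch in Python for illustration only (the real answer is C#/UWP using SpiDevice.Write() and Stopwatch; the write function below is a do-nothing stand-in, and all names are made up for this sketch):

```python
import threading
import time

def run_spi_loop(write_fn, interval_s, duration_s):
    """Busy loop that calls write_fn roughly every interval_s seconds,
    paced by a monotonic stopwatch rather than a coarse UI timer."""
    calls = 0
    start = time.perf_counter()
    next_due = start
    while time.perf_counter() - start < duration_s:
        if time.perf_counter() >= next_due:
            write_fn()                # stands in for SpiDevice.Write(buffer)
            calls += 1
            next_due += interval_s    # schedule the next write
    return calls

# Run the loop on its own thread so it never blocks the UI thread.
result = {}
t = threading.Thread(
    target=lambda: result.update(calls=run_spi_loop(lambda: None, 0.001, 0.1)))
t.start()
t.join()
print(result["calls"])  # roughly 100 calls in 0.1 s at 1 ms intervals
```

The key idea is the same as the C# answer: the loop spins continuously on a background thread, and a high-resolution clock decides when each write actually fires.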
Your question was very premature, as you did not understand what you were trying to control, and how to control it.
Regarding porting Arduino code and getting it running: that still does not require really understanding SPI. It is like translating from one language to another; to put it in everyday terms, most people can drive a car without understanding how it really works.
For future reference, you would have gotten help much faster if you had, in your initial post:
- identified the LED controller chip (LPD8806)
- specified how many LEDs were on the strip you were trying to control
Once you know the data format the LED needs, the rest is relatively simple.
(number of LED's) * (24 bits) * (string refresh rate per second) = minimum SPI speed needed
100 * 24 * 60 = 144000
Now, depending on framing, SPI latency between transactions, etc., you might need 200,000 bps from SPI. A far cry from 20 Mbps.
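The formula above, as a quick calculation (plain Python, just to show the arithmetic; the 24-bit and 60-per-second figures are the ones used above):

```python
# Minimum SPI bit rate needed to refresh an LED strip, per the formula above:
# (number of LEDs) * (24 bits) * (refresh rate per second)
def min_spi_bps(num_leds, bits_per_led=24, refresh_hz=60):
    return num_leds * bits_per_led * refresh_hz

print(min_spi_bps(100))  # 144000 -- 144 kbps for 100 LEDs at 60 refreshes/s
```

Even with a comfortable margin for framing and inter-transaction latency, that stays well under 1% of a 20 Mbps clock.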