Hi again Louis.
After picking up SmartMatrix again after a while, I finally got the FeatureDemo sketch working to some extent. I started from scratch and used the genuine Arduino Serial1.write() technique instead of the ESP32 FreeRTOS approach, so the crashes are gone.
My problem with replacing your HW drivers with the Glediator serial protocol is that I don't have a good understanding of the SmartMatrix structure. It was partly documented in version 2, but a lot has happened since then. I think that even for other people to write successful implementations of their ideas, it is important to understand the basic concepts behind your building blocks. Remember that good documentation matters more for widespread adoption of the library than its clever solutions. Is there more recent documentation of SmartMatrix on the web?
I understand that you have backgroundLayer, scrollingLayer and indexedLayer, and that they copy their data to their corresponding refresh layers. This happens with swapBuffers(), right?
But looking at the FeatureDemo sketch, there are things going on with no swapBuffers() call and still the LED matrix is refreshed. Is that the mysterious callback mechanism hidden underneath?
So I tried:
    rgb24 refreshRow[kMatrixWidth];
    for (int y = 0; y < kMatrixHeight; y++) {
        memset(refreshRow, 0, kMatrixWidth * 3);
        backgroundLayer.fillRefreshRow(y, refreshRow);
        // indexedLayer.fillRefreshRow(y, refreshRow);
        scrollingLayer.fillRefreshRow(y, refreshRow);
    }
But I had to comment out the indexedLayer line, since it did not work right. How does SmartMatrix fill the HW with the content of all the layers? Is my order of filling from these three layers correct with regard to the foreground/background stacking?
What is the actual purpose of the indexedLayer?
I have some more thoughts.
On my LED matrix (homemade from Chinese WS2812B strips) scrolling text doesn't have much contrast, so I can't see it well against the background. Would you consider creating a black "border" around the text?
Also, I have seen the discussion here about Art-Net. Since I am using Jinx, I hit the physical limits of the Glediator protocol (number of pixels vs. refresh rate), so I had to try something faster. My finding was that WiFi in a typical city flat drops packets very quickly, which is critical because you get no feedback that all packets were received. So you end up using wired Ethernet, which is not very practical on clothing on stage. Also, Arduinos and even the ESP32 are too slow to reliably handle such a huge, continuous stream of data, so I implemented Art-Net on a Raspberry Pi 4 instead, which works perfectly for me. Not only does it have 4 extra UARTs for my Glediator protocol, but I don't have to worry about memory limits, and I have plenty of performance for tricks like arbitrary-angle rotation, etc.
Another thing is color depth. Why are you so keen on "at least 24-bit" color, when so much of the precision is lost in the color correction? Do I understand correctly that color correction means gamma correction?
I understand that 24 bits is more comfortable to handle than 16 bits, but if you have little memory it is a luxury, and you can hardly see the difference with the naked eye.
I hope you don’t take my words as criticism. I admire your work and enthusiasm, and yes, I am a little frustrated with reading a lot of code that is far too sophisticated for me when I am just trying to understand the intention behind it.