I recently ran into this very severe moiré problem in a voxel engine I have been working on for about four years. I googled it, of course, but couldn't find anyone who had ever seen anything like it.
It took me quite a while to figure out what was happening here.
This problem is the result of a decision I made to add an ambient light level value. To do that, I encoded the day and night values into a single "float". In C++, the mantissa (the precision component) of a 32-bit "float" is limited to 23 bits, but I experimented with it and found it can represent 24-bit integers exactly (thanks to the implicit leading bit). So I came up with the bright idea of packing two 12-bit values (each 0x000 to 0xFFF) into the same float in order to compress the vertex color and intensity data. On the C++ side, accessing the upper 12 bits is just a matter of the shift operators "<< 12" and ">> 12"; in the graphics shader, I multiplied and divided by 0x1001. It was working very well.
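Roughly, the CPU-side packing looked like this. This is a simplified sketch rather than the exact engine code, and the function names are mine; the shader does the equivalent unpacking with multiplies and divides, since it receives the value as a float:

```cpp
#include <cstdint>

// Pack two 12-bit values (0x000..0xFFF each) into one float.
// The combined integer fits in 24 bits, which a 32-bit float
// can represent exactly.
float packLight(uint32_t day, uint32_t night)
{
    uint32_t packed = (day << 12) | (night & 0xFFF);
    return static_cast<float>(packed);
}

// CPU-side mirror of the shader's unpack step.
void unpackLight(float f, uint32_t& day, uint32_t& night)
{
    uint32_t packed = static_cast<uint32_t>(f);
    day   = packed >> 12;
    night = packed & 0xFFF;
}
```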
But then I attempted to create the texture above, where the vertices of the same primitive were not all the same color, and I got that crazy moiré pattern you see at the top. It took me a while to realize where it was coming from.
Let us say we encode the RRR and GGG values as 0x00RR_RGGG. If RRR is 0xFFE on one vertex and 0xFFF on another vertex of the same primitive, the GPU will interpolate between those two points. But it interpolates the packed number, not the individual channels. What it sees is 0x00FF_E000 => 0x00FF_F000, so as we progress across the primitive the interpolated value steps through 0x00FF_E001, 0x00FF_E002, and so on. The low 12 bits, the green channel, cycle through the full gamut of green shades between the two vertices, even though green should not change at all.
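Here is a tiny CPU-side illustration of what the rasterizer effectively does. The values are hypothetical and real attribute interpolation is perspective-correct, but the sweep of the low 12 bits is the same idea:

```cpp
#include <cstdint>
#include <cstdio>

// Decode the low 12 bits (the green channel) from a packed value.
uint32_t greenOf(uint32_t packed) { return packed & 0xFFF; }

int main()
{
    // Packed values at two vertices of the same primitive:
    // red = 0xFFE vs. red = 0xFFF, green = 0x000 on both.
    float a = float(0x00FFE000);
    float b = float(0x00FFF000);

    // The GPU interpolates the *packed* float linearly, so the
    // decoded green channel sweeps its whole range in between.
    for (float t = 0.0f; t <= 1.0f; t += 0.125f)
    {
        uint32_t packed = uint32_t(a + (b - a) * t);
        printf("t=%.3f  packed=0x%08X  green=0x%03X\n",
               t, (unsigned)packed, (unsigned)greenOf(packed));
    }
}
```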
It gets a lot crazier when you factor in the night and day intensity and blue/alpha encoding.
I could have fixed it with a different shader instead, and in fact I saved off the original shader before making the change. As it stands, I had to add another three floats, and each vertex now consumes 48 bytes:
12 floats: position (x, y, z), color (x, y, z, w), intensity (x, y), texture (u, v), textureArrayIndex
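In struct form it looks something like this. The field names are my own and the intensity labels are a guess, but the layout follows the list above:

```cpp
// One vertex as currently laid out: 12 floats = 48 bytes.
struct Vertex
{
    float position[3];         // x, y, z
    float color[4];            // x, y, z, w
    float intensity[2];        // presumably the day/night values
    float texCoord[2];         // u, v
    float textureArrayIndex;   // layer in the texture array
};

static_assert(sizeof(Vertex) == 48, "expected 12 floats per vertex");
```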
It is a massive cost per vertex, so I may come back to it later, possibly with a different shader.