So I should call the GPU a bad dog because it won't fetch fast enough?
But seriously, remember that nebula dissolve shader I posted here last time, the one that worked fine?
When I tested it on my laptop with a 960M, it tanked the framerate from 60 to 40 fps up close. It does only one texture fetch at the start, but apparently running this for every pixel:
color.rgb *= multColor;
// brightness-ish term that offsets the pulse phase per pixel
float avg = normSum + color.r + color.g + color.b;
float spd = m_Speed;
// pulse the alpha between roughly 0.2 and 1.1 over time
color.a *= 0.55 + sin(g_Time * spd + avg * 5.0) * 0.45 + 0.1;
is enough to completely destroy the framerate? I moved as much math as possible into the vertex shader, but it still wasn't enough.
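In case it helps anyone with the same problem, here's roughly how I split it: everything that doesn't depend on the texel gets hoisted into the vertex shader. vPhase is just a varying name I made up for this sketch, and I'm assuming normSum is already available per vertex:

// vertex shader: hoist the parts that don't need the texel
vPhase = g_Time * m_Speed + normSum * 5.0;

// fragment shader: only the per-texel offset is left
color.a *= 0.55 + sin(vPhase + (color.r + color.g + color.b) * 5.0) * 0.45 + 0.1;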
I ended up putting a checkbox in the options to disable the animation if needed.
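The checkbox just flips a boolean material parameter that gets mapped to a define in the material definition (the names below are made up for this sketch), so the disabled path skips the sin() entirely:

// ANIMATE comes from a Boolean material param mapped to a define
#ifdef ANIMATE
color.a *= 0.55 + sin(g_Time * spd + avg * 5.0) * 0.45 + 0.1;
#else
color.a *= 0.65; // middle of the pulse range, so toggling doesn't pop
#endif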
This reminds me that I should test the new planet shader on that laptop too, to see if it explodes or something before proceeding.