Hi!
I'm trying to make a shader that renders lots of points of various sizes. I have read about GLSL and I know that the vertex shader can calculate and set the gl_PointSize variable, which controls how large each point is rasterized. However, this doesn't seem to work with jME (all the points are rendered with the size set in the Java code).
I noticed that vertex point size mode has to be enabled for this to work (glEnable(GL_VERTEX_PROGRAM_POINT_SIZE_ARB)). How can I enable it in jME, or is it even possible?
Best Regards,
-Lauri
You can't do it in jME (consider that a feature suggestion); you must use LWJGL and make the call yourself.
I have been working with resizable textured points. Here is the code to make points resizable in your shader (you have to make sure OpenGL 2.0 is supported before calling this):
GL11.glEnable(GL20.GL_VERTEX_PROGRAM_POINT_SIZE);
r.draw(points);
GL11.glDisable(GL20.GL_VERTEX_PROGRAM_POINT_SIZE);
Also you might want to invalidate states (in case jme will eventually support that feature).
I checked that OpenGL 2.0 is supported by:
if(GLContext.getCapabilities().OpenGL20) {
System.out.println("OpenGL 2.0 supported!");
}
Then I extended the class that renders the points from Geometry:
public class PointRenderingClass extends Geometry {

    Point points = null;

    public PointRenderingClass(String name) {
        super(name);

        // ...
        // locationList and colorList are populated with locations and colors of the points..
        // ...

        points = new Point("points",
                (Vector3f[]) locationList.toArray(new Vector3f[0]),
                null,
                (ColorRGBA[]) colorList.toArray(new ColorRGBA[0]),
                null);

        GLSLShaderObjectsState shader =
                DisplaySystem.getDisplaySystem().getRenderer().createGLSLShaderObjectsState();

        // ...
        // shader is loaded..
        // ...

        points.setRenderState(shader);
    }

    public void draw(Renderer r) {
        GL11.glEnable(GL20.GL_VERTEX_PROGRAM_POINT_SIZE);
        r.draw(points);
        GL11.glDisable(GL20.GL_VERTEX_PROGRAM_POINT_SIZE);
    }
}
Now, it seems that the shaders won't be used and the fixed pipeline is used instead. The draw method will be called (I checked that). Was this the way you meant to do it, or am I missing something here? :)
Also you might want to invalidate states (in case jme will eventually support that feature).
Can you explain what you meant by that? I didn't get it.. :?
Best Regards,
-Lauri
If you use that approach, you have to make sure your points use QUEUE_SKIP. Otherwise the renderer uses render queues, and when it does, the OpenGL calls are made later, so calling glEnable(), queuing the geometry to be rendered and then calling glDisable() won't do anything.
You have to make sure the shader state is applied to your points, so you need to call updateRenderState() after setting the shader state.
Invalidating states:
jME keeps track of what it thinks OpenGL state is. If you modify the OpenGL state with direct GL calls, you should tell jME that it no longer has the accurate information about the OpenGL state.
I think you should subclass one of the ***Batch classes instead, which have predraw and postdraw methods in which you can apply your states.
Thanks for the replies. I got the point size mode working (i.e. I can change the size of the points from the vertex shader). It was the QUEUE_SKIP that was missing.
However, now I can't draw antialiased points (points bigger than 1 pixel are rendered as squares). I have enabled antialiasing for the points:
points.setAntialiased(true);
Also, I have AlphaState state for the points with blending enabled:
AlphaState alpha = DisplaySystem.getDisplaySystem().getRenderer().createAlphaState();
alpha.setBlendEnabled(true);
alpha.setDstFunction(AlphaState.DB_ONE);
alpha.setSrcFunction(AlphaState.SB_SRC_ALPHA);
alpha.setEnabled(true);
points.setRenderState(alpha);
After creating and setting all the render states, I update render states with:
updateRenderState();
Is there something still missing?
lex said:
If you use that approach, you have to make sure your points are using QUEUE_SKIP.
Are there other approaches to this problem?
lex said:
You have to make sure the shader state is applied to your points, so you need to call updateRenderState() after setting the shader state.
That was done.
lex said:
Invalidating states:
jME keeps track of what it thinks OpenGL state is. If you modify the OpenGL state with direct GL calls, you should tell jME that it no longer has the accurate information about the OpenGL state.
How can I actually do that? I tried to look for this and found the StateRecord class, but couldn't figure out how to get object of this class and where to invalidate it.
Momoko_Fan said:
I think you should subclass one of the ***Batch classes instead, which have predraw and postdraw methods in which you can apply your states.
Ok, do I still have to invalidate the states?
Best Regards,
Lauri
Just an off-the-hip thought here, so ignore it if you like, but for the destination function, maybe try "one minus src" instead.
renanse said:
Just an off-the-hip thought here, so ignore it if you like, but for the destination function, maybe try "one minus src" instead.
There should be DB_ONE_MINUS_SRC_ALPHA.. :) Just a mistake copying the code into the post.
Anyway, still can't get antialiased points.
-Lauri
There are always many approaches to the same problem. For example, subclassing the ***Batch class, as Momoko_Fan mentioned.
You only need to invalidate states when you toggle something jME keeps track of. Since jME does not support point size in the vertex shader, you don't have to worry about it right now. However, when/if it gets added, you will have to go back and change the code. And you will probably choose to replace the direct OpenGL calls with calls via the jME API, so you won't need to invalidate states (just remember to patch your code, or you can get very strange bugs).
Thanks for all the replies. I managed to get the point size mode working with point sprites, and that's good enough for now.
My goal was to render all the visible stars (from the Bright Star Catalogue) with 'proper' brightness by varying the size and alpha value of each point. The relative brightness (relative to the brightest star in the sky) is now calculated at loading time and stored into the alpha channel of the star color.
In the vertex shader, the actual size and alpha are calculated based on the star's relative brightness. The brightness scale of stars is logarithmic, so it was a little bit challenging to get them to look good and realistic. Basically, the size and alpha are scaled between predefined max and min values. This gave me enough control over the look of the star field.
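The loading-time part of that scaling can be sketched in plain Java. This is only my illustration of the idea, not the actual code: the class, method names and the min/max sizes are made up (the post only says "predefined max and min sizes"), and the brightness falloff uses the standard magnitude-to-flux relation.

```java
public class StarScaling {
    // Illustrative values; the post only says "predefined max and min sizes".
    static final float MIN_SIZE = 1.0f;        // point size for the dimmest stars
    static final float MAX_SIZE = 6.0f;        // point size for the brightest star
    static final float BRIGHTEST_MAG = -1.44f; // Sirius

    // Relative brightness in (0, 1]: 1.0 for the brightest star, falling off
    // logarithmically (each magnitude step is a factor of 10^0.4 in flux).
    // This is the kind of value the post stores in the alpha channel at load time.
    static float relativeBrightness(float magnitude) {
        return (float) Math.pow(10.0, -0.4 * (magnitude - BRIGHTEST_MAG));
    }

    // Linear remap of relative brightness onto the allowed point-size range,
    // analogous to what the vertex shader does with gl_PointSize.
    static float pointSize(float relative) {
        return MIN_SIZE + relative * (MAX_SIZE - MIN_SIZE);
    }

    public static void main(String[] args) {
        System.out.println(pointSize(relativeBrightness(-1.44f))); // Sirius -> MAX_SIZE
        System.out.println(pointSize(relativeBrightness(6.5f)));   // dimmest -> near MIN_SIZE
    }
}
```

Because the raw flux ratio spans three orders of magnitude, a plain linear remap like this crushes almost every star toward MIN_SIZE, which is exactly why some extra shaping between min and max is needed to make the field look right.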
Maybe some kind of physically correct calculations with atmospheric effects could be added later…
-Lauri
Sounds cool! How about a screenshot?
Here you go…
The brightest star is Sirius (magnitude -1.44) in the bottom left corner. The difference between Sirius and the dimmest stars (magnitude about +6.5) is about 600 times, so it's a little bit tricky to get them to look good.
-Lauri
Making the source code available anywhere? I would be interested in taking a look.
That's really cool! Though I couldn't tell the difference between the stars and the dust on my screen.
You can experiment with different mapping functions that convert the star intensity to the visible color range on the screen. Because each color channel is stored using 1 byte, the gray intensity (black to white) has a discrete range of 256 levels, so it's not possible to have one star exactly 600 times brighter than another. So you might as well pick a function that makes all the stars look bright.
For example: pixel_intensity = -1 + 2 / (1 + e^(-star_intensity))
Or: pixel_intensity = C + 2 / (1 + e^(-star_intensity)), where C is in [-1, 0]
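The first mapping can be tried out in plain Java (class and method names are my own, just for illustration):

```java
public class StarMapping {
    // lex's first mapping: pixel = -1 + 2 / (1 + e^(-intensity)).
    // It compresses an arbitrarily large star intensity into [0, 1),
    // so even faint stars end up well above black on screen.
    static double pixelIntensity(double starIntensity) {
        return -1.0 + 2.0 / (1.0 + Math.exp(-starIntensity));
    }

    public static void main(String[] args) {
        System.out.println(pixelIntensity(0.0));   // 0.0 for zero intensity
        System.out.println(pixelIntensity(600.0)); // saturates at 1.0 in double precision
        // With one byte per channel, the result is then quantized to 256 levels:
        System.out.println((int) Math.round(pixelIntensity(600.0) * 255.0)); // 255
    }
}
```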
lex said:
Because each color channel is stored using 1 byte, the gray intensity (black to white) has a discrete range of 256 levels, so it's not possible to have one star exactly 600 times brighter than another. So you might as well pick a function that makes all the stars look bright.
That is why I'm also varying the size of the stars. In this way, I can artificially create more contrast between the brightest and the faintest stars. If I only varied the brightness and made the brightness scale of the stars smaller, the star field wouldn't look real (you couldn't recognize the constellations, for example). If you look at the stars in the sky, you see that there are a lot of really faint ones. Actually, 45 percent of the stars are magnitude 6 or dimmer (960 times dimmer than Sirius). On the other hand, there are just 4 stars brighter than magnitude 0 (15 stars brighter than 1; 50 stars brighter than 2; 174 stars brighter than 3). The most difficult trick is to make that contrast visible to the player, and I don't think I want to just make the stars brighter, because that wouldn't look real.
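For reference, ratios like these come from the standard magnitude relation, ratio = 10^(0.4 * Δm). A quick check in plain Java (class name is my own):

```java
public class MagnitudeRatio {
    // Pogson's relation: a difference of dm magnitudes corresponds to a
    // brightness ratio of 10^(0.4 * dm); 5 magnitudes is exactly 100x.
    static double brightnessRatio(double magDim, double magBright) {
        return Math.pow(10.0, 0.4 * (magDim - magBright));
    }

    public static void main(String[] args) {
        // Magnitude 6 vs Sirius at -1.44: roughly 950x, close to the ~960 quoted above.
        System.out.println(Math.round(brightnessRatio(6.0, -1.44)));
        // 5 magnitudes apart is exactly 100x.
        System.out.println(Math.round(brightnessRatio(5.0, 0.0)));
    }
}
```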
darkfrog said:
Making the source code available anywhere? I would be interested in taking a look. :)
Not sure just yet. I'm still experimenting with this, and I feel the code isn't finished enough yet.. :)