JOGL Support (JOGL2 that is)

@nehon said:
The thing I don't get is that the mentioned test case does not use multisampled framebuffer.

Gonna look into it, and see if I can help

Thanks. In this case, the problem comes from JoglRenderer; you might find a tiny difference between this renderer and its competitor (the LWJGL one).

@normen I'm writing NewtMouseInput. I will commit in a few minutes or in an hour.

@gouessej from the opengl doc :

GL_FRAMEBUFFER_INCOMPLETE_MULTISAMPLE is returned if the value of GL_RENDERBUFFER_SAMPLES is not the same for all attached renderbuffers; if the value of GL_TEXTURE_SAMPLES is not the same for all attached textures; or, if the attached images are a mix of renderbuffers and textures, the value of GL_RENDERBUFFER_SAMPLES does not match the value of GL_TEXTURE_SAMPLES.



GL_FRAMEBUFFER_INCOMPLETE_MULTISAMPLE is also returned if the value of GL_TEXTURE_FIXED_SAMPLE_LOCATIONS is not the same for all attached textures; or, if the attached images are a mix of renderbuffers and textures, the value of GL_TEXTURE_FIXED_SAMPLE_LOCATIONS is not GL_TRUE for all attached textures.



Maybe you already checked this, but are all those variables the same?
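The consistency rule quoted above can be sketched as a small check. This is illustrative, not jME3 or JOGL API: the helper name is made up, and the sample counts it receives would in real code come from querying GL_RENDERBUFFER_SAMPLES / GL_TEXTURE_SAMPLES for each attachment.

```java
import java.util.List;

class FboSampleCheck {

    // Mirrors the completeness rule quoted above: a framebuffer is
    // GL_FRAMEBUFFER_INCOMPLETE_MULTISAMPLE when its attachments do not
    // all report the same sample count. In real code each value would be
    // queried per attachment (GL_RENDERBUFFER_SAMPLES for renderbuffers,
    // GL_TEXTURE_SAMPLES for textures).
    static boolean samplesConsistent(List<Integer> attachmentSamples) {
        return attachmentSamples.stream().distinct().count() <= 1;
    }

    public static void main(String[] args) {
        System.out.println(samplesConsistent(List.of(4, 4, 4))); // consistent
        System.out.println(samplesConsistent(List.of(4, 0, 4))); // mixed -> incomplete
    }
}
```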


@nehon I already looked at that, but I didn't try to display the values of these variables to find the root cause; we should do it.



@normen I have not tested these classes, but the NEWT canvas and the NEWT display are on the repository now. The mouse and keyboard input implementations are not very smart; they do not need java.awt.Robot, and they look like the existing AWT implementations. I don't know yet how to implement TouchInput.



Applets should work with both canvases thanks to the bridge.



Some JogAmp contributors (including me) have to work on JInput, and when it is ready, we will use JInput for JogAmp.



I still have to implement headless rendering and off-screen rendering.


@gouessej: You don't need to implement TouchInput, it's for touchscreens only. Headless rendering doesn't require an OpenGL context, so you don't have to worry about it either.

@normen Sven has just given me an example using touch input with NEWT, I just need to use a mouse listener :slight_smile:

@gouessej said:
@normen Sven has just given me an example using touch input with NEWT, I just need to use a mouse listener :)

Heh, interesting. We are looking into mapping mouse and touch via one input, but it turns out that in a real app/game one probably wants to address both environments separately. We do have a touch input interface in core for such stuff as well. But the mouse input "wrapping" is also in place, so there's use for that too :)

@gouessej I found the issue.

In updateRenderBuffer, line 1443:

http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/jogl/com/jme3/renderer/jogl/JoglRenderer.java#1443



it should be

[java]

if (fb.getSamples() > 1 …

[/java]

instead of

[java]

if (fb.getSamples() > 0 …

[/java]

In jME, the number of samples of a non-multisampled framebuffer is 1.

I tested it and the testcase works fine.
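The reasoning behind the fix can be isolated as a tiny predicate. This is a sketch, not the actual JoglRenderer code: since jME reports 1 sample for a non-multisampled framebuffer, only samples > 1 should select the multisampled storage path.

```java
class RenderBufferSamples {

    // jME reports 1 sample for a non-multisampled framebuffer, so only
    // samples > 1 may route updateRenderBuffer to the multisampled
    // storage path (glRenderbufferStorageMultisample). The original
    // "samples > 0" test treated every framebuffer as multisampled,
    // which can trigger GL_FRAMEBUFFER_INCOMPLETE_MULTISAMPLE.
    static boolean useMultisampleStorage(int samples) {
        return samples > 1;
    }
}
```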


Mhhh… there might be other issues, though. The skybox is not rendered, or at least it's black, and there is an unbearable lag when you move the cam (best noticed in 1280x720 or higher).

The framerate display shows around 200, but when the water pools are in view you can feel this lag.

This is not the case with the lwjgl renderer.

Maybe you re-attach the frame buffer on each frame?

@nehon Thanks for the fix. Maybe I made another mistake; you are probably right, I will have to investigate a bit more.



@normen I can hardly handle them separately; otherwise I have to check whether touch input is supported in order to decide whether to provide a TouchInput implementation with no MouseInput implementation, or the opposite.

@normen MouseEvent.getPressure() seems to return 0 when coming from a real mouse. Therefore, I will return both a TouchInput implementation and a MouseInput implementation, even though the former will produce no jMonkeyEngine-specific event when there is no touch screen. This TouchInput implementation will probably use two delegates, depending on whether "simulateMouse" is set to true.
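A minimal sketch of that two-delegate idea, assuming a boolean "simulateMouse" flag; every class name here (NewtTouchInput, the delegates) is hypothetical, not an actual jME3 or JOGL class:

```java
// Hypothetical sketch of a TouchInput backed by two delegates.
interface TouchDelegate {
    String describe();
}

// Forwards raw touch events as jME touch events (illustrative).
class RawTouchDelegate implements TouchDelegate {
    public String describe() { return "forwarding raw touch events"; }
}

// Translates touch events into mouse events (illustrative).
class MouseSimulationDelegate implements TouchDelegate {
    public String describe() { return "translating touch events to mouse events"; }
}

class NewtTouchInput {
    private boolean simulateMouse;

    public void setSimulateMouse(boolean simulate) {
        this.simulateMouse = simulate;
    }

    // Pick the delegate according to the "simulateMouse" flag.
    TouchDelegate delegate() {
        return simulateMouse ? new MouseSimulationDelegate()
                             : new RawTouchDelegate();
    }
}
```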


@normen Sven advised me to rely on getPointerId(int) rather than on getPressure().

@gouessej: I donā€™t think it is necessary to simulate touch input on desktop. The mouse simulation on Android is only done to be compatible with older jME3 apps before touch input was implemented.

@Momoko_Fan said:
@gouessej: I don't think it is necessary to simulate touch input on desktop. The mouse simulation on Android is only done to be compatible with older jME3 apps before touch input was implemented.

OK, thanks. I have no tablet under GNU Linux to test the NEWT X11 driver, so I can't even run any tests.

@gouessej: Regarding offscreen buffer support, you don't have to check for FBO availability. Offscreen buffers should be implemented by using pbuffers in both JOGL and LWJGL. This means that supporting OpenGL2/FBO is not required for offscreen buffer support.

@Momoko_Fan said:
@gouessej: Regarding offscreen buffer support, you don't have to check for FBO availability. Offscreen buffers should be implemented by using pbuffers in both JOGL and LWJGL. This means that supporting OpenGL2/FBO is not required for offscreen buffer support.

Ok, I will have to refine my tests, but some people might want to use the renderer based on the fixed pipeline with this off-screen buffer.
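The advice above can be captured in a tiny sketch (the helper and enum are illustrative, not engine API): offscreen buffers always go through pbuffers, so the choice does not depend on FBO availability, which also keeps the fixed-pipeline renderer covered.

```java
class OffscreenStrategy {

    enum Backend { PBUFFER, FBO }

    // Offscreen buffers use pbuffers in both the JOGL and LWJGL
    // backends, so FBO availability is irrelevant here; this keeps the
    // fixed-pipeline renderer working as well. FBOs remain the tool for
    // render-to-texture framebuffers inside an existing context.
    static Backend forOffscreenBuffer(boolean fboAvailable) {
        return Backend.PBUFFER;
    }
}
```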
@gouessej said:
Ok, I will have to refine my tests, but some people might want to use the renderer based on the fixed pipeline with this off-screen buffer.

Yeah, that's what I meant.
@Momoko_Fan said:
Yeah, that's what I meant.

Done ;)

Sven asked me to replace GL2 by GL2GL3. Why is glAlphaFunc still used in the renderer based on the programmable pipeline? It is probably the only call in this renderer that is not supported by OpenGL ES 2.0.

@gouessej: glAlphaFunc is a special case… There are many people who still use it in the RenderState even though they shouldn't, because it's gone in OGL3 and Android. All of the jME3 core shaders do it properly by using an alpha discard threshold and do not rely on glAlphaFunc. On the other hand, for OpenGL1 we have bindings that allow you to bind the alpha discard threshold parameter into the fixed binding, which ends up in a call to glAlphaFunc.



We can support it on desktop, by checking if the GL is GL2 and, if so, enabling alpha func. However, I think we should show some sort of warning so that users know not to rely on it in the future.
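A sketch of that desktop-only fallback; the helper is hypothetical, and the boolean parameter stands in for calling gl.isGL2() on a real JOGL context:

```java
import java.util.logging.Logger;

class AlphaFuncFallback {

    private static final Logger logger =
            Logger.getLogger(AlphaFuncFallback.class.getName());

    // Enable the fixed-function alpha test only on a GL2 context, and
    // warn so users migrate to an alpha discard threshold in shaders.
    // On a real JOGL context this would call:
    //   gl.glEnable(GL2.GL_ALPHA_TEST);
    //   gl.glAlphaFunc(GL2.GL_GREATER, threshold);
    static boolean applyAlphaTest(boolean isGL2, float threshold) {
        if (!isGL2) {
            // glAlphaFunc is gone in core GL3 and OpenGL ES 2.0.
            return false;
        }
        logger.warning("Fixed-function alpha test is deprecated; "
                + "prefer an alpha discard threshold in the material.");
        return true;
    }
}
```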

@Momoko_Fan said:
@gouessej: glAlphaFunc is a special case... There are many people who still use it in the RenderState even though they shouldn't, because it's gone in OGL3 and Android. All of the jME3 core shaders do it properly by using an alpha discard threshold and do not rely on glAlphaFunc. On the other hand, for OpenGL1 we have bindings that allow you to bind the alpha discard threshold parameter into the fixed binding, which ends up in a call to glAlphaFunc.

We can support it on desktop, by checking if the GL is GL2 and if so enabling alpha func. However I think we should have some sort of warning shown so that users know not to rely on it in the future.


Slightly off topic: Unshaded does not have an alpha discard threshold. It should probably be added... and BitmapFont's loader should probably be set up to turn it on so that text doesn't z-fight with itself.

As it is, alpha test is still required in some cases.

Btw @gouessej, feel free to tell @sgothel he's more than welcome to join us in this thread :wink: It seems the two of you are actively communicating, which is great, but it'd be even greater if his input was provided in this thread directly. That way there's less branching of the discussion and fewer details left out.