Paging @rickard, I could really use your help! I’m trying to push the FOV a bit using the SDK-side distortion filter, and I’m having trouble with the texture buffer size (which should be increasing with a larger FOV).
In this code,
[java]@Override
protected void initFilter(AssetManager manager, RenderManager renderManager, ViewPort vp, int w, int h) {
material = new Material(manager, "oculusvr/shaders/Oculus.j3md"); // straight quotes; the forum's curly quotes won't compile
Matrix4f projMat = OculusRiftUtil.toMatrix4f(Hmd.getPerspectiveProjection(
eyeRenderDesc.Fov, 0.1f, 1000000f, true));
vp.getCamera().setProjectionMatrix(projMat);
TextureHeader eth = eyeTexture.Header;
eth.TextureSize = hmd.getFovTextureSize(eyeIndex, eyeRenderDesc.Fov, 1.0f);
eth.TextureSize.h = 1; // debug set to 1
eth.TextureSize.w = 1; // debug set to 1
eth.RenderViewport.Size = eth.TextureSize;
eth.RenderViewport.Pos = new OvrVector2i(0, 0);
}[/java]
… I’m trying to set the texture size based on the FoV passed to getFovTextureSize(…). However, the result of that call doesn’t seem to matter: if I later set the width and height to 1 (the debug lines above), nothing changes visually. If I set them to 0, though, everything renders as a solid brown color. getFovTextureSize(…) returns h: 2705, w: 1257, yet the output is indistinguishable from h: 1, w: 1. This matters because, as the FoV goes up, the texture size needed to support it should go up too. The center of the image gets fuzzy when the FOV is increased, and @jherico suggested that’s because the texture buffer isn’t being sized up.
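To make the FOV/buffer relationship concrete, here’s a rough, self-contained sketch of why the render target has to grow with FOV. The formula and the `pixelsPerTan` constant are my illustrative assumptions, not the SDK’s actual internals: conceptually, a function like getFovTextureSize multiplies the tangents of the half-angles by the panel’s pixel density per unit tangent, so a wider FOV means more pixels.

```java
// Sketch (illustrative, not the real SDK math): required render-target
// width grows with the tangent of the FOV half-angles.
public class FovTextureSize {

    // pixelsPerTan is an assumed display-density constant for illustration.
    static int requiredWidth(double leftHalfFovDeg, double rightHalfFovDeg, double pixelsPerTan) {
        double span = Math.tan(Math.toRadians(leftHalfFovDeg))
                    + Math.tan(Math.toRadians(rightHalfFovDeg));
        return (int) Math.ceil(span * pixelsPerTan);
    }

    public static void main(String[] args) {
        // Widening the per-side FOV from 45° to 60° inflates the buffer:
        System.out.println(requiredWidth(45, 45, 600));
        System.out.println(requiredWidth(60, 60, 600));
    }
}
```

If the buffer stays at its old size while the FOV widens, the same pixel count is stretched across a larger angular span, which is exactly the center fuzziness described above.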
Here’s an image comparing the normal FOV and the increased FOV:
I asked @jherico, and he suggested these two things:
- Take a look at the code where the framebuffer textures are allocated and make sure they’re respecting the depth values passed back by the texture size function
- Make sure that after you bind the framebuffer you’re calling glViewport with the appropriate size
… but I am getting in a little over my head. Any help by anyone would be appreciated!
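As I understand it, the checklist above boils down to one invariant: the size used to allocate the framebuffer’s color texture and the size passed to glViewport after binding that framebuffer must both equal what getFovTextureSize returned. A tiny consistency check of that invariant might look like this (the class and method names are mine, not jME or SDK API):

```java
// Hypothetical sanity check: all three sizes involved in eye rendering
// must agree, or the distortion pass samples an undersized image.
public class EyeBufferCheck {

    /**
     * fovW/fovH  - size reported by getFovTextureSize for the requested FOV
     * texW/texH  - size the framebuffer color texture was actually allocated at
     * viewport   - {x, y, w, h} passed to glViewport after binding the FBO
     */
    static boolean consistent(int fovW, int fovH, int texW, int texH, int[] viewport) {
        return texW == fovW && texH == fovH
            && viewport[2] == fovW && viewport[3] == fovH;
    }

    public static void main(String[] args) {
        // The sizes from the post: getFovTextureSize reported w:1257, h:2705,
        // but the debug code forced the texture header to 1x1.
        System.out.println(consistent(1257, 2705, 1257, 2705, new int[] {0, 0, 1257, 2705}));
        System.out.println(consistent(1257, 2705, 1, 1, new int[] {0, 0, 1, 1}));
    }
}
```

In other words, setting TextureSize in the header alone may do nothing if the actual framebuffer allocation and the viewport call elsewhere in the pipeline never read it back.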