# Nonlinear perspective projection for camera

Hi,

I’m struggling with the JME architecture. I don’t know the right place to implement a nonlinear perspective projection for the camera, or how to feed it into JME.
I’ve tried a shader using the FilterPostProcessor, but it didn’t work, since the near and far clipping planes are still flat.
Fiddling with the projectionMatrix also didn’t help.

In the end I want something similar to the first image here, Figure 1 (inward cylindrical projection).

Any help appreciated

Unsure how this would be used in a game.

Can you explain what you are actually trying to do?

I can’t see how any normal 4x4 projection matrix style camera will achieve this effect… so you are likely to have to do something really strange. The linked article talks about a lot of strange camera work.

I’m also unsure how to use this in games, but I’m willing to explore it.
I guess the answer is somewhere in LWJGL.

Since your vertex shader does not need to use a projection matrix, you can definitely transform vertex positions non-linearly. E.g. instead of multiplying by a projection matrix, you apply an equation.
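To make that concrete, here is a minimal sketch (in Java, though in practice the same math would go in your vertex shader) of what such an equation could look like for an inward cylindrical projection. The class name, field-of-view constants, and coordinate conventions (camera at the origin, looking down −Z) are all illustrative assumptions, not JME API:

```java
// Hypothetical sketch: a cylindrical projection applied per point,
// replacing the usual 4x4 perspective projection matrix.
public final class CylindricalProjection {
    // Assumed fields of view; pick these to match your scene.
    static final double FOV_X = Math.PI;        // 180 degrees horizontally
    static final double FOV_Y = Math.PI / 2.0;  // 90 degrees vertically

    /**
     * Maps a view-space point (camera at origin, looking down -Z)
     * to normalized-device-like coordinates in [-1, 1].
     * Returns {ndcX, ndcY, radialDistance}.
     */
    static double[] project(double x, double y, double z) {
        double angle  = Math.atan2(x, -z);         // horizontal angle around the cylinder axis
        double radial = Math.sqrt(x * x + z * z);  // distance from the vertical axis
        double ndcX = angle / (FOV_X / 2.0);                  // angle -> screen x (linear in angle!)
        double ndcY = (y / radial) / Math.tan(FOV_Y / 2.0);   // height over distance -> screen y
        return new double[] { ndcX, ndcY, radial };
    }

    public static void main(String[] args) {
        // A point straight ahead lands at the screen center.
        double[] center = project(0.0, 0.0, -5.0);
        // A point 45 degrees to the right lands halfway to the edge (with 180-degree FOV).
        double[] side = project(5.0, 0.0, -5.0);
        System.out.println(center[0] + " " + center[1] + " | " + side[0]);
    }
}
```

Note that screen x is now linear in the viewing *angle*, not in tan(angle) as with an ordinary perspective matrix, which is exactly what makes straight edges render curved.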

One problem with nonlinear projections, though, is the following:
While you can successfully transform vertices into your desired space, the vertices form triangles.
And here is the issue: to render a 3D mesh with a nonlinear perspective, you would like to transform its triangles into nonlinear space.
A triangle transformed into nonlinear space is most probably not a triangle anymore! Yes, it can be a shape with three vertices, but the edges can be curved!

Thus achieving a nonlinear perspective with a vertex shader is problematic. It can be achieved approximately if you use small triangles, ideally with tessellation to simulate curved triangle edges.
The advantage of this is that it should allow you to render even 360 degree scenes.

An alternative, as you have mentioned, is to apply the nonlinear transform as a post-process effect. The advantage of this is that it is probably cheaper than the above and faster to implement. The disadvantage is that you are limited to your field of view and, as you mention, the near and far clipping planes are flat.
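The core of that post-process is an inverse mapping: for each pixel of the cylindrical output image, find which pixel of the ordinary flat perspective render it should sample. A minimal sketch of that mapping (Java for illustration; in a FilterPostProcessor it would live in the fragment shader, and the constant here is an assumption):

```java
// Hypothetical sketch of the post-process remap for the horizontal axis.
// The output image is linear in angle; the flat render is linear in tan(angle).
public final class CylindricalPostProcess {
    // Assumed cylindrical FOV; it must not exceed the flat render's FOV,
    // which is exactly the field-of-view limitation mentioned above.
    static final double FOV_X = Math.toRadians(120);

    /** Maps output x in [-1, 1] (cylindrical image) to x in [-1, 1] of the flat render. */
    static double flatX(double cylX) {
        double angle = cylX * (FOV_X / 2.0);              // screen x -> view angle
        return Math.tan(angle) / Math.tan(FOV_X / 2.0);   // same angle in the flat render
    }

    public static void main(String[] args) {
        System.out.println(flatX(0.0)); // center stays at center
        System.out.println(flatX(1.0)); // edge maps to edge
    }
}
```

In a shader you would use `flatX` (and an analogous vertical mapping) to compute the texture coordinate for sampling the scene texture, which squeezes the already-rendered flat image into the cylindrical shape but cannot bring in anything outside the original frustum.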

Another approach is to use ray tracing.

I’ve already implemented a fisheye look in the vertex shader, but this is not the way to go.
Another problem is culling.
There are some projects for Quake and Minecraft.

But they use an environment map and then simply use vertex shaders on the env cube to get the look.
These are still outward-looking cameras, not inward.

Yes, because it’s easier to have a camera at a single point than an infinite number of points everywhere.