I have been messing around with setViewPort on the camera/renderer and I'm not sure I understand what these functions do.
The setViewPort default is 0, 1, 0, 1.

If you change it, what is supposed to happen? I understand that if you do 0, 0.5, 0, 1, that would split the screen vertically.

But what I'm not getting is what happens when the resolution changes: the viewport for the window changes too.
At a 1024x768 resolution, the far left is somewhere around -6, but if I increase the window size it changes to a larger number; it can get to somewhere around -12 for the viewport.

The resolution of the viewport also changes how much is viewed, depending on the size of the window.
So if the window is, say, 2048 and you set the viewport size to 1024, everything is larger, which I understand: two pixels for every one. But the far left is not -6, as it is on a 1024 window with a viewport size of 1024; with a window of 2048 and a viewport size of 1024, you still have -12 for the far left of the screen.

If the viewport size is 1024, then why is the viewport far left now -12?

What I'm getting at is: is there a way to have the viewport far left be the same no matter what the screen resolution is?
The spawning of enemies has to change based on the screen resolution, because at a higher resolution you see farther. So spawning at -8 to -10 from the player is right for a smaller resolution, but at a higher resolution it spawns on the screen, and I don't like that.
But a spawn of -14 to -18 for a higher resolution is too far for a lower resolution.
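One way I'd handle the spawn range (an assumption on my part, not anything jME-specific): compute the visible world half-width from the camera frustum and the window's aspect ratio, then spawn just beyond that edge plus a margin. A rough sketch in plain Java, assuming a perspective camera with a fixed vertical FOV, where the method and parameter names are all mine:

```java
// Sketch: spawn enemies just off-screen regardless of resolution.
// Assumes a perspective camera with a fixed vertical FOV; the visible
// half-width in world units then grows with the window's aspect ratio,
// which would explain -6 becoming -12 on a wider window.
public class SpawnEdge {

    // Visible half-width in world units at a given distance from the camera.
    static float visibleHalfWidth(float fovYDegrees, float aspect, float distance) {
        float halfFovY = (float) Math.toRadians(fovYDegrees) / 2f;
        float halfHeight = distance * (float) Math.tan(halfFovY);
        return halfHeight * aspect; // half-width scales with aspect ratio
    }

    // Spawn X offset from the player: just past the visible edge, plus a margin.
    static float spawnOffsetX(float fovYDegrees, int winW, int winH,
                              float camDistance, float margin) {
        float aspect = (float) winW / winH;
        return visibleHalfWidth(fovYDegrees, aspect, camDistance) + margin;
    }

    public static void main(String[] args) {
        // Same camera, two window widths: the offset tracks the visible edge,
        // so enemies always spawn just off-screen.
        System.out.println(spawnOffsetX(45f, 1024, 768, 10f, 2f));
        System.out.println(spawnOffsetX(45f, 2048, 768, 10f, 2f));
    }
}
```

With this, the spawn offset is derived from what the camera can actually see instead of being hard-coded per resolution.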

I'm just wondering how to handle this. Before, I had my own engine, and I would change glViewport to be the same size no matter what the resolution was, so that 512, 1024, and 2048 resolutions were identical inside the code. But in JME, the viewport is always changing based on resolution.
No matter what you set setViewPort to, it reports back the values I set, but on the screen they are different.
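For reference, the fixed-viewport approach from a custom engine amounts to picking one logical size and fitting it into whatever window you get, letterboxing the rest. This is my own generic sketch of that idea in plain Java, not jME code; the class and method names are made up:

```java
// Sketch: fit a fixed logical resolution (e.g. 1024x768) into any window,
// preserving aspect ratio, and return a glViewport-style rectangle.
// Everything outside the rectangle is letterbox/pillarbox bars.
public class FixedViewport {

    // Returns {x, y, width, height} in window pixels.
    static int[] fit(int logicalW, int logicalH, int winW, int winH) {
        // Uniform scale so the logical screen fits entirely in the window.
        float scale = Math.min((float) winW / logicalW, (float) winH / logicalH);
        int w = Math.round(logicalW * scale);
        int h = Math.round(logicalH * scale);
        // Center the rectangle; the margins become bars.
        int x = (winW - w) / 2;
        int y = (winH - h) / 2;
        return new int[] { x, y, w, h };
    }

    public static void main(String[] args) {
        // A 2048x768 window pillarboxes a 1024x768 logical screen.
        int[] r = fit(1024, 768, 2048, 768);
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]); // prints "512,0,1024,768"
    }
}
```

Because the logical size never changes, world coordinates like -6 for the far left stay the same at every window resolution, which is the behavior described above.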

First read.

I found that I had to pick resolutions I wanted the player to use and create the assets accordingly for each.

Scaling things just didn't look nearly as good as creating assets for a specific resolution.

Maybe there is some magic code to do it right and others could chime in.

Having had some familiarity with raw OpenGL myself before starting with JME 10 years ago, I found most of the mappings very natural once I located the OpenGL calls and traced back up through the layers.

For example, somewhere down in the renderer, JME is probably calling glViewport(). There might be an 'aha' moment in looking at how it gets called.

Yeah, the only thing I see that calls renderer.setViewPort(x, y, w, h) is RenderManager.setViewPort(Camera).

I do not see any other code that will call glViewport, which goes through Renderer.setViewPort.

This method gets x from the camera settings: x = viewPortLeft * width, y = viewPortBottom * height, width = (width * viewPortRight) minus x, and height the same way.
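If I've read that right, the fraction-to-pixel mapping works out to something like the sketch below (my own restatement in plain Java, not the actual RenderManager source):

```java
// Sketch of the mapping described above:
// x = viewPortLeft * width, y = viewPortBottom * height,
// w = viewPortRight * width - x, h = viewPortTop * height - y.
public class ViewPortMapping {

    // Returns {x, y, width, height} in pixels for the given fractions.
    static int[] toPixels(float left, float right, float bottom, float top,
                          int camWidth, int camHeight) {
        int x = (int) (left * camWidth);
        int y = (int) (bottom * camHeight);
        int w = (int) (right * camWidth) - x;
        int h = (int) (top * camHeight) - y;
        return new int[] { x, y, w, h };
    }

    public static void main(String[] args) {
        // The default 0,1,0,1 on a 1024x768 camera covers the whole window.
        int[] full = toPixels(0f, 1f, 0f, 1f, 1024, 768);
        System.out.println(full[2] + "x" + full[3]); // prints "1024x768"
        // 0,0.5,0,1 covers only the left half, the vertical split mentioned earlier.
        int[] half = toPixels(0f, 0.5f, 0f, 1f, 1024, 768);
        System.out.println(half[2] + "x" + half[3]); // prints "512x768"
    }
}
```

Note the fractions only pick a pixel rectangle out of the window; they don't change the world-unit extents of the frustum, which would be why the far-left world coordinate still moves with resolution.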

Maybe this is why it never worked: using the standard 3D or GUI viewport never calls this method, so the viewport is not changed.

This was a two-minute search; there could be other ways, but I didn't see them in the search.