Render scale?

Render scale is a feature that has been slowly making its way into games, and it is an absolute godsend for people with weak computers or oversized desktop resolutions, mainly because it lets you sidestep the painful fullscreen mode at non-native resolutions.
It sets a custom render resolution for the scene, which is then scaled up or down to fit the real viewport.

I was wondering if there was a way to implement this in a JME game.
Two ideas came to my mind:

  • Change the camera/viewport resolution directly while keeping the frame boundaries. So far I’ve not had much luck with this: either it shrinks the viewport on screen, or it ignores the resize call on the camera and just renders at whatever size the viewport takes up.

  • Write a post processor that takes the scene from an offscreen viewport and code the scaling manually. I have little experience with post processors, and given that they can be a touchy subject on certain hardware, I wanted to look at a more lightweight solution first, if possible.

Any tips?

I guess I don’t understand why they can’t just run the game at a lower full screen resolution.

If I select full screen and 640x480 in the settings dialog, my computer switches resolution and runs that.

Is there some platform/OS specific issue that I’m unaware of?


It’s an inconvenience, and depending on the OS, can be really severe.

On Windows 7, for example (I know, technically ancient, but many gamers still use it), if I launch anything at a non-native resolution, a number of my applications resize themselves and get moved around, forcing me to put them back manually after the game is closed.

Besides, entering fullscreen mode and alt-tabbing out is not the nicest experience: it tends to flash secondary screens and takes ages for the screen to transition. Not ideal when you just want to respond to someone’s message quickly, for example.

Next, having separate resolutions for the in-game scene and the user interface can be beneficial: simple text and images are much cheaper to render, so why not keep them as crisp as possible and compromise only on the scene?

Finally, a percentage-based render scale allows hitting resolutions that are not supported by the graphics chip or monitor.
And this goes both ways: not just downsampling, but also supersampling for those who have the horsepower to drive it.
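The arithmetic behind a percentage-based scale is simple; here is a small sketch (class and method names are mine, purely illustrative, not from any engine):

```java
// Hypothetical helper: derive the offscreen render resolution from a
// percentage render scale, clamping so a tiny scale never yields 0 pixels.
public final class RenderScale {

    /** Scale one dimension of the native resolution; never below 1 pixel. */
    public static int scaled(int nativeSize, float scalePercent) {
        int size = Math.round(nativeSize * scalePercent / 100f);
        return Math.max(size, 1);
    }

    public static void main(String[] args) {
        // 1920x1080 at 50% -> 960x540 (downsampling)
        System.out.println(scaled(1920, 50f) + "x" + scaled(1080, 50f));
        // 1920x1080 at 200% -> 3840x2160 (supersampling)
        System.out.println(scaled(1920, 200f) + "x" + scaled(1080, 200f));
    }
}
```

Because the offscreen buffer is just a texture, the result doesn’t need to match any mode the monitor or GPU scaler advertises.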

Render to texture is pretty easy, and there are loads of examples of it here; I’d just go with that. I have a RenderToTexture class that does it all for you if you’d like (it was taken from this forum and I just updated it to work with current versions).
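For context, a minimal sketch of that render-to-texture approach in jME3 might look like the following. This is not the RenderToTexture class mentioned above, just an assumed outline: render the scene into an offscreen framebuffer at a scaled resolution, then draw that texture as a fullscreen `Picture`.

```java
import com.jme3.app.SimpleApplication;
import com.jme3.renderer.Camera;
import com.jme3.renderer.ViewPort;
import com.jme3.texture.FrameBuffer;
import com.jme3.texture.Image.Format;
import com.jme3.texture.Texture;
import com.jme3.texture.Texture2D;
import com.jme3.ui.Picture;

// Sketch: render the scene offscreen at 50% resolution, then stretch the
// result over the whole window. Class and viewport names are illustrative.
public class RenderScaleApp extends SimpleApplication {

    private static final float SCALE = 0.5f; // 50% render scale

    @Override
    public void simpleInitApp() {
        int w = Math.max(1, Math.round(settings.getWidth() * SCALE));
        int h = Math.max(1, Math.round(settings.getHeight() * SCALE));

        // Offscreen camera and viewport at the scaled resolution.
        Camera offCam = cam.clone();
        offCam.resize(w, h, true);
        ViewPort offView = renderManager.createPreView("scaled scene", offCam);
        offView.setClearFlags(true, true, true);
        offView.attachScene(rootNode);

        // Framebuffer with a color texture the main view can sample.
        Texture2D tex = new Texture2D(w, h, Format.RGBA8);
        tex.setMinFilter(Texture.MinFilter.Trilinear); // smooths downscaling
        FrameBuffer fb = new FrameBuffer(w, h, 1);
        fb.setDepthBuffer(Format.Depth);
        fb.setColorTexture(tex);
        offView.setOutputFrameBuffer(fb);

        // Don't render the scene a second time in the main viewport; instead
        // show the offscreen result stretched to the window size.
        viewPort.detachScene(rootNode);
        Picture display = new Picture("scaled output");
        display.setTexture(assetManager, tex, false);
        display.setWidth(settings.getWidth());
        display.setHeight(settings.getHeight());
        guiNode.attachChild(display);
    }
}
```

Since `SimpleApplication` already updates `rootNode` every frame, attaching it to the pre-view as well should be safe; only the output framebuffer changes.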

I’d be interested to take a look at that, thanks!

I did a quick test with the snippet I found in this thread of yours:

It works LIKE A CHARM!

I snapped a 100% screenshot, a 200% supersampled one (with trilinear filtering as the min filter), and a tiny 12.5% one for show.

I guess what you guys might be getting wrong is how the scale is intended to work (if I’m not getting it wrong myself):

Instead of rendering the whole image at a lower resolution, the frame buffer is masked with a kind of checkerboard pattern, so at 50% scale only every second pixel is actually rendered.
The next frame, the other pixels are rendered. That way you still get the full native resolution, at the cost of effective frame rate: a 50% scale is like running at half the native FPS, since the full screen is only updated every other frame.

Then clever temporal AA techniques are used to hide the differences in quickly changing areas, or you could keep the actual output two frames behind. That is roughly how Rainbow Six does it, to great effect (I don’t notice a difference; you can probably only see it in fast-moving scenes if you record them and watch those screen parts in slow motion).
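The alternating-pixel idea described above can be sketched as a simple parity test (purely illustrative names, not any engine’s API):

```java
// Illustrative checkerboard mask for 50% render scale: each frame shades one
// half of the pixels (a checkerboard), and the next frame shades the other
// half, so every pixel is refreshed once per two frames.
public final class CheckerboardMask {

    /** True if pixel (x, y) is freshly shaded on the given frame. */
    public static boolean shadedThisFrame(int x, int y, int frame) {
        // Even frames shade pixels with even (x + y) parity, odd frames the rest.
        return ((x + y) & 1) == (frame & 1);
    }
}
```

The reconstruction step (temporal AA, motion-vector reprojection, or simply holding stale pixels) then decides what to show for the half that was not shaded this frame.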
