Resolution-independent styling with Lemur

I’d like to use Lemur styling, but I’m also aiming for resolution independence. For example, the font size is saved in the style but doesn’t get adjusted for a different resolution… what should I do?

What would you like it to do?

The font size only scales the one font you have selected. For resolution independence, I personally just scale my GUI node based on some standard desired ‘size’. I don’t know if that works for you or not.
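In case it helps, here is a minimal sketch of that idea in a jME SimpleApplication. The 720 px “design” height is just an assumed baseline for illustration, not anything Lemur prescribes:

```java
import com.jme3.app.SimpleApplication;

public class GuiScaleExample extends SimpleApplication {

    // Hypothetical "design" height the UI was laid out for.
    private static final float DESIGN_HEIGHT = 720f;

    @Override
    public void simpleInitApp() {
        // Scale the whole GUI node so layouts authored for a 720 px tall
        // screen keep roughly the same proportions on other resolutions.
        float scale = cam.getHeight() / DESIGN_HEIGHT;
        guiNode.setLocalScale(scale);
    }
}
```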

For my project, using Nifty, for almost every graphic element in my GUI I have two versions of the same image at different resolutions. I load these images and controls using a method that determines which one to load based on the screen resolution. If the height is >= 720 then I use the larger textures and fonts, otherwise I use the smaller ones.
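Something along these lines, sketched in jME terms; the folder names (“Interface/hd/”, “Interface/sd/”) and the 720 px cutoff are just placeholders matching the numbers in this post:

```java
// Pick an asset variant by screen height, as described above.
public String pickAssetPath(String fileName, int screenHeight) {
    String folder = (screenHeight >= 720) ? "Interface/hd/" : "Interface/sd/";
    return folder + fileName;
}

// Usage, e.g.:
// assetManager.loadTexture(pickAssetPath("button.png", settings.getHeight()));
```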

Somewhat similar to how the Android API works: with Android you include graphics at differing resolutions and place them in folders such as hdpi or mdpi (high dots per inch / medium dots per inch), and when loading a graphic the Android system determines which one to load based on the screen density.

Nice to know that Android can do it automatically. I was worrying about that topic for quite a while.
For PC and Mac I currently don’t know a way to distinguish “Retina” displays from normal displays.
The solution I came up with is a slider-type switch for the user - the user selects the font size manually from 3 or 4 choices, either in the options or at the first start of the app. Always start with big text (so that the user can read it - even on a “Retina” display).

Android does it automatically if you’re using the Android API. I haven’t used jME for Android development, but I don’t think the assetLoader system uses the Android API; the assets folder in your project isn’t set up like the resources folder in a standard Android project.

I don’t know about iPhone, but Android exposes the screen resolution and pixel densities to the developer. If Mac and iPhone do the same then you could maybe multiply your ‘standard’ font size by the screen density divided by 72.

A standard computer monitor has, if I recall, 72 pixels per inch. So if you divide the reported number of pixels per inch by 72 you should get a number telling you how much larger or smaller your font needs to be in order to remain the same physical size when compared to a standard 72dpi monitor.
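As a quick sketch of that suggestion (with the caveat, discussed just below, that the reported DPI is often not trustworthy):

```java
// Grow or shrink a base font size by reportedDpi / 72.
// jME does not report DPI itself, so reportedDpi would have to come from
// elsewhere, e.g. java.awt.Toolkit.getDefaultToolkit().getScreenResolution()
// on desktop.
float scaleFontSize(float baseSize, float reportedDpi) {
    return baseSize * (reportedDpi / 72f);
}
```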

There is no way that is a thing. I have used so many different monitors of similar size running very different resolutions. My 15.4-inch Dell laptop from 10/11 years ago runs a native 1920 x 1200 (a tad niche, but still).

Agreed, definitely not a thing. Screen resolution is completely independent of screen size. Manufacturing limitations of LCDs have made common ratios more prevalent, but it’s only by accident and you can’t count on that.

@pspeed @thetoucher
It’s not really a thing anymore, but it was a thing in the early days of computing. Now you just need an arbitrary point to measure against, so why not 72?

“Since the 1980s, the Microsoft Windows operating system has set the default display “DPI” to 96 PPI, while Apple/Macintosh computers have used a default of 72 PPI…”

“Microsoft began writing its software to treat the screen as though it provided a PPI characteristic that is 4/3 of what the screen actually displayed. Because most screens at the time provided around 72 PPI, Microsoft essentially wrote its software to assume that every screen provides 96 PPI (because 72 * (1+1/3) = 96).”

For instance when you create a PNG image there is a DPI setting which is used when printing the image. I believe the default DPI for a PNG image is still 72.

Well, they may have defaulted it to “some value” but that value was guaranteed to be wrong a LOT. Worse before than now, as I remember having monitors of various sizes on the same desk but all with the same resolution. With LCDs there are at least some physical limitations that mean smaller screens are likely to have fewer pixels… but even that’s all gone now, as my phone has better resolution than any 19" CRT I ever used.

That’s just a legacy of pixel-oriented GUI toolkits. This is why everything in Windows used to become smaller and smaller with each new monitor that you bought.

Any modern UI toolkit uses pixel-independent units to avoid being resolution dependent. Then you can easily specify a mapping between virtual and physical units depending on the monitor’s DPI or user preference. With HiDPI monitors becoming the norm now (3840×2160 resolution) there’s really no excuse anymore.

@pspeed That’s where density independent pixels come in.

@Momoko_Fan Density independent pixels are measured using the formula I gave. You divide the display’s pixel density by an arbitrary density to measure against, then scale by the result. Android’s dip, or density independent pixel, uses the same formula except their default density to measure against is 160 rather than 72.

Android displays vary greatly, they just picked an arbitrary value to measure against.

If you’ve never written an Android app, you have various options for specifying the width/height and font sizes in your XML layouts. You can specify, for instance, a width of 32px, which would equate to exactly 32 pixels. This is generally discouraged, however; it is preferred to use density independent pixels, so 32dip or 32dp.

This is interpreted by Android as 32 * (PPI/160).
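Written out as a tiny helper (just restating the formula above, with 160 dpi being the mdpi baseline Android measures against):

```java
// Android's dp-to-pixel rule as described: px = dp * (screenDpi / 160).
public static float dpToPx(float dp, float screenDpi) {
    return dp * (screenDpi / 160f);
}
// e.g. dpToPx(32f, 320f) == 64f -- 32 dp becomes 64 real pixels on a 320 dpi screen
```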

Hm?
But how does Android distinguish between medium dpi and high dpi then?
There must be some info (e.g. the hardware reports its dpi to Android, or there is a vast database).
In the end, dpi = dots per inch = pixels per inch.
And since the inch is a fixed unit, dpi expresses a real physical density.

For example: 300 dpi is what you should use when designing flyers and posters.
That’s where dpi comes into play when using Inkscape (.svg) or GIMP (.png) image editors.
Until recently Inkscape used 90 dpi by default - so I had to change the size of my to-be-printed things.
Yes, PNG saves the dpi value - and you would print it at the exact same dpi on your printer to get a 1-to-1 original-sized image.

You can also convert to other units (e.g. “dots per centimeter”), the conversion factor is 2.54 because one inch is 2.54 centimeters.

Modern screens - VR goggles, smartphones, micro OLEDs, smart watches - differ significantly. There are some screens for Apple desktop devices that deviate (have a slightly higher pixel density), and the mentioned 4k displays most certainly have more dpi. Also consider video projectors, and the distance of the viewer from the screen or projection. Some people have impaired vision - which should raise the scaling factor too (better accessibility / barrier-free software).

So I think this is a topic to worry about, and my solution is the best I could come up with.

One problem that remains is that the rest of the UI would need to adapt (which can become difficult when most of your UI was drawn with a fancy image editor by an artist who was not aware of that topic). This is one more reason why designing a flexible and appealing UI is quite difficult… :chimpanzee_sad:

I would expect additional density ranges to be added with future versions of Android. When I first started programming Android apps, only ldpi, mdpi and hdpi were available.

Also check out:
http://developer.android.com/reference/android/util/DisplayMetrics.html

For things like fonts and UI element sizing you typically want to use dp or dip (dp = dip) for specifying the element’s size, or use sizes relative to parent elements. You do this for your entire UI so everything scales appropriately (a Java version of this conversion is sketched after this post).

You still do this for elements that display images such as PNGs; however, there’s an added layer for images in order to avoid blurry or blocky images on different devices. For each image you create multiple copies of it at different resolutions and store those copies, with the exact same filename, in different folders under the resources directory. Those folders are named according to the aforementioned density ranges, such as ldpi, mdpi and hdpi - one image for each density range your app supports.

When you load an image using the Android API or via the XML layout, you specify the filename of the image and the Android resource loader will look for the file in the directory whose name most closely matches the device’s screen density.

You do not have to create an image for each density range. If the folder matching the device’s density range is not found, the resource loader will just look in the next closest matching folder. Conversely, if the folder that matches the device’s density range is found but the requested filename is not in it, the resource loader will look for the filename in the folder with the next closest density range and continue doing so until the file is found.
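For completeness, the dp sizing mentioned above is also available from Java code rather than XML, via DisplayMetrics and TypedValue; a rough sketch:

```java
import android.content.Context;
import android.util.DisplayMetrics;
import android.util.TypedValue;

public final class DpUtil {

    // Convert a dp value to real pixels for whatever device the app runs on.
    public static float dpToPixels(Context context, float dp) {
        DisplayMetrics metrics = context.getResources().getDisplayMetrics();
        return TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, dp, metrics);
    }
}
```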

Just realized that scaling the GUI node doesn’t work when using tonegodgui… is there a workaround or any plan to implement an Indicator-like widget in Lemur? :slight_smile:

I’m not sure what this is… can you explain?

https://wiki.jmonkeyengine.org/doku.php/jme3:contributions:tonegodgui:indicator

Basically a glorified progress bar

Can you describe what it does that Lemur’s progress bar doesn’t?

Clipping on 4-ways?
Layered background, content, and overlay?
Alpha-mask?

Not sure… maybe it covers all of these?

Not sure, but Lemur uses built-in JME shaders and so can’t support what most people would call clipping “out of the box”. You can use custom materials that might support it.

All Lemur GUI elements are layered sets of components with some default layers like border, background, and I think overlay… it is also trivial to add additional layers as needed. (Lemur doesn’t care how many you have; it’s just that the named layers are ordered for you… and you can even set the ordering.)
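For illustration, a small sketch of those default layers in use. This is written from memory, so treat the layer-related details as approximate and check the Lemur javadoc; it also assumes GuiGlobals has already been initialized:

```java
import com.jme3.math.ColorRGBA;
import com.simsilica.lemur.Label;
import com.simsilica.lemur.component.QuadBackgroundComponent;

public class LayerExample {

    // Every Lemur panel/label is a stack of named component layers.
    // setBackground() just fills the standard "background" layer.
    // (Assumes GuiGlobals.initialize(app) was called during startup.)
    public static Label makeLabel() {
        Label label = new Label("Progress");
        label.setBackground(new QuadBackgroundComponent(ColorRGBA.DarkGray));
        // Additional layers (e.g. an overlay drawn on top of the text) are
        // managed through the element's GuiControl (layer ordering and
        // per-layer components); see GuiControl in the Lemur source for
        // the exact methods.
        return label;
    }
}
```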

I mean, no… but I’m also not sure exactly how it applies here. What may be needed is one (or two) new component implementations to make some layers behave like you want. I guess one thing is that the progress itself is shown as a quad in Lemur, and currently the only component implementations are all stretched quads. I’ve wanted a clipped quad for a while (where the texture coordinates are set relative to size instead of being constant)… and that would cover this particular case nicely. You could then just have the progress portion be any bitmap texture you’d like and it would clip according to the progress value.