Next release of the Engine

Sure. Install an SDK, build the Engine from source, and run tests from jme3-examples.

Currently, jme3-examples is configured to use jme3-lwjgl (LWJGL v2). You should also rebuild using jme3-lwjgl3 (LWJGL v3) and run tests that way.

If you have your own apps and/or libraries, you could try rebuilding them with v3.5.0 and see if anything breaks.

No, it's not an example; it's an implementation of the BlendSpace interface in which you can use the circle-area function to get your blend weight (the circle area can be sliced using the circle radius and a central angle). I have described it in detail in the comments.

Also, in this blend space you can blend between more than 2 actions easily.

I have added a test case in jme3-examples that demonstrates its usage on a primitive mesh blending between 4 actions.

If you think this should not be included, then let me know; I will convert the PR to a draft or close it and keep it as a trivial thing, no problem :slightly_smiling_face:.

If you would like to see a video of the operation as well, let me know.

I have described it in detail in the comments.

You’ve described how it calculates blend weights. What I don’t understand is why someone would want to calculate weights that way.

The same reason someone would calculate the blend weight linearly. From my point of view, it doesn't really matter how you calculate the blend weight; what matters is the result, which should be between 0 and 1, and how easy the result is to control.

Generally, someone would use a linear blend {anything} because it’s obvious and straightforward to reason about.

A visual graph of outputs vis-à-vis inputs from your function might help illustrate your point…

Thanks for the suggestion. Could you please give me an example? Do you mean graphing the function manually?

Exactly. It doesn’t have to be a very high-resolution graph - just show what this does in visual terms.

Well, generally the graphed function would be multiple inscribed circles starting from the origin and moving out to the unit circle; the blend weight would be the area of a sector of one of those infinitely many inscribed circles, determined by the angle and the radius r.

The angle defines the sector's share of the circle: 0° gives no area, and 360° gives the full inscribed circle's area (360/360 of it).

I will do a manual graph.

So, essentially this is a function of the form:

weight = theta * (r^2)

where r and theta can vary in the range 0..1?

What problem is this intended to solve? What set of circumstances would make it more intuitive or natural?
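For what it's worth, that normalized sector-area formula can be sketched in plain Java (hypothetical class and method names, not the PR's actual code): the area of a sector with angle θ radians and radius r is θ·r²/2, and dividing by the unit circle's area π gives θ/(2π)·r², i.e. t·r² where t is the fraction of a full turn.

```java
// Sketch (an assumption, not the PR's implementation) of the
// sector-area blend weight being discussed.
public class SectorBlendWeight {
    /**
     * @param theta fraction of a full turn, 0..1 (i.e. angle / 360°)
     * @param r     fraction of the unit circle's radius, 0..1
     * @return blend weight in 0..1: the sector's area divided by
     *         the full unit circle's area
     */
    static float weight(float theta, float r) {
        return theta * r * r;
    }

    public static void main(String[] args) {
        System.out.println(weight(1f, 1f));   // full circle, full radius
        System.out.println(weight(0.5f, 1f)); // half turn, full radius
        System.out.println(weight(1f, 0.5f)); // full turn, half radius
    }
}
```

This also makes the contrast with a linear blend visible: the weight grows quadratically in r, not linearly.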

Please, take a look at PR #1701. I wrote a new RectangleMesh class that is similar to Quad except that it is based on the Rectangle class, which makes it possible to independently configure its center and extent.

I don’t know how to make the normal direction and texture coordinates configurable, but I’d appreciate it if somebody could explain how.

So many of these “new mesh” proposals could be covered by a handful of utility methods that essentially boil down to a single:

setBuffer(Type.Position, 3,
                new float[] {....});

call.

There is a reason JME chose not to include 100 different Mesh subclasses. Beyond the super simple stuff, it’s best to call setBuffer() yourself or use a 3D modeling tool.

It is overkill to use a 3D modeling tool for a basic mesh.
Yes, you could do it through setBuffer(), but I can tell you very few newbies would know how to do that.

I guess it boils down to what kind of engine you want! My first impression of JME was a high-level engine built for ease of use and getting started.

Creating VBOs and VAOs is not beginner-level engine work.
Let me ask: what level is JME aiming for? The high end? At the other extreme you have libGDX, which is a low-level engine, but JME doesn’t come close to that level.

JME is too restrictive in its use of OpenGL. You don’t have easy control of the window. JME owns what I call the viewport; those dimensions should be completely up to the user, but JME forces them to be -1 to 1. That is not required, and there is no use of OpenGL scissors. You couldn’t even control the window position; it was forced to center. So low-level is out, yet you are talking about creating your own VAO/VBO data.

My 2 cents.

But in the sense of long term maintenance, every line of code has a “forever burden” and so should achieve the most impact possible.

Add offsets to Quad and 1 in 100 developers might use it. Add rotation to Quad and maybe 1 in 10 will use it… 4-5 of those by mistake because they didn’t realize they could rotate a geometry.

Then someone wants the same thing for cylinder so we add another constructor/setters/getters/block of code to that class.

Then someone wants the same thing for box so we add another constructor/setters/getters/block of code to that class.

…then for cone.

Or we stop thinking like “I didn’t know how to do this so I want to add this very specific method right here” and we start thinking about “what would help the most people?”.

For example, a MeshUtils class with static methods for translating or rotating a mesh, maybe another one for centering a mesh. These would work with quad, box, sphere, loaded geometry, etc… Now instead of a hundred methods spread across a dozen classes amounting to many hundreds of lines of code… we have three utility methods of about 4-5 lines each.

We need to stop thinking “I didn’t know how to do this thing, so we need to add a button for me right here”. That kind of thinking is what leads to bloat.
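A rough sketch of what one such utility method might look like (hypothetical names; a plain float[] of interleaved x,y,z values stands in for the mesh's Type.Position buffer, to keep the example self-contained):

```java
// Hypothetical sketch of a MeshUtils-style helper. A real version would
// read the mesh's Type.Position VertexBuffer, apply the same loop, and
// write the data back; here a raw float[] of x,y,z triples stands in.
public class MeshUtilsSketch {

    /** Translates every vertex by (tx, ty, tz), in place, and returns the array. */
    static float[] translate(float[] positions, float tx, float ty, float tz) {
        for (int i = 0; i < positions.length; i += 3) {
            positions[i]     += tx; // x component
            positions[i + 1] += ty; // y component
            positions[i + 2] += tz; // z component
        }
        return positions;
    }

    public static void main(String[] args) {
        // shift a 2-vertex strip down by one unit
        float[] verts = {0, 0, 0,  1, 0, 0};
        translate(verts, 0f, -1f, 0f);
    }
}
```

The point is that one loop like this covers Quad, Box, Sphere, and loaded geometry alike, instead of a constructor parameter per shape class.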

I don’t know how to make the normal direction and texture coordinates configurable, but I’d appreciate it if somebody could explain how.

In addition to positions and indices, a Mesh can optionally include buffers for normals and texture coordinates. Buffers are added using mesh.setBuffer(). Quad already does this. There’s also a tutorial on custom meshes in the wiki: Custom Mesh Shapes :: jMonkeyEngine Docs
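As a concrete illustration of those buffers, here is the flat per-vertex data a unit quad might carry (a sketch of the shapes involved; the exact values and winding Quad uses live in the engine source):

```java
// Sketch of the raw per-vertex data behind a quad mesh. In jME each of
// these arrays would be handed to mesh.setBuffer() with the matching
// VertexBuffer.Type (Position, Normal, TexCoord, Index).
public class QuadBuffers {
    // 4 vertices, 3 components (x, y, z) each
    static final float[] POSITIONS = {
        0, 0, 0,   1, 0, 0,   1, 1, 0,   0, 1, 0
    };
    // one normal per vertex, all facing +Z for a flat quad
    static final float[] NORMALS = {
        0, 0, 1,   0, 0, 1,   0, 0, 1,   0, 0, 1
    };
    // one (u, v) pair per vertex
    static final float[] TEX_COORDS = {
        0, 0,   1, 0,   1, 1,   0, 1
    };
    // two triangles referencing the 4 vertices by index
    static final short[] INDICES = {0, 1, 2,   0, 2, 3};
}
```

Making the normal direction configurable then amounts to filling NORMALS with a different vector, and flipping the texture amounts to reordering TEX_COORDS.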

Thank you, I’m going to read that wiki entry and make the necessary changes accordingly.

Anyway, according to Paul, this kind of class could potentially bloat the engine. I’m still going to add the suggested features, but feel free to close the PR if you too think the class is not really necessary.

Yes, but then when would such a utility class get created? Bloat can be a real issue.

I think having many people try to create meshes through raw arrays would be challenging.

Example: Particle Monkey has been around for several years, and it is still not part of the JME library, even though it is a far better particle system. From what I read, no one likes what is in JME, but nothing is done about it. It has been waiting on someone to create an editor so it can be part of the main library, and again, it has been years. From what I’ve read, someone started one but abandoned it, and it has been forgotten.

I would love for Particle Monkey to be part of the standard engine (my 2 cents). Libraries are not a very nice experience in JME. Most of them are GitHub download-and-compile-yourself, and on top of that, some of them require additional downloads and compiles to get working. You can spend hours getting a library compiled and ready to use inside JME.

Add-on libraries seem to be an afterthought, not well thought out for JME (from a newbie viewpoint). Maybe you guys who have been here for years forget what it looks like from the outside of JME when trying to learn it. Even the SDK doesn’t have a nice feature for adding libraries; I understand not supporting all libraries, but if one is posted on JME’s website, then I think a simple add-on mechanism would be great for newbies. But from what I read, the SDK is a sore spot; many like it and many hate it.

I don’t use it for programming, but I use it for the graphical interface to get things done. I don’t like NetBeans; it is a slow program (on my machine, compared to others), but the user interfaces are nice, even though some have bugs and need to be fixed and updated. But that is a huge undertaking.

Just a thought.

Nah, the best ones are generally available through maven repos. So it’s just a line in the build.gradle or the pom file.

I’ve been in this community for 11 years now. Being part of the engine is death for anything that needs to evolve. It’s big and scary for new folks who want to contribute, and then they have to wait a year or more before they actually see their change in a release (it used to be even longer).

Smaller separate projects can iterate fast and are often more approachable for the ‘new guys’.

We need a vibrant ecosystem where contributions can be made at any level and new exciting libraries can pop up and gain brain share. The SDK should include the good ones by default but users are free to pick what they need.

All that being said, if someone wanted to put the effort into making a MeshUtils for JME core that has translate(), rotate(), and center() methods… that would be a fine addition.

Quad upperLeft = MeshUtils.translate(new Quad(w, h), 0, -w, 0);

…I could have used such a thing dozens of times (but not for quads, ironically).

My game is almost entirely quads: a 3D game made from quads like the old Wolfenstein game. I use lower-left, lower-center, and center orientations all the time in my game. All the NPCs are quads too, with texture animation for the animation. No 3D models.

So many games.

Check out the libGDX DelvEdit game; they have made many games with very few models, most of them simple shapes like quads.

I do like the idea of MeshUtils for changing orientation. I don’t get the translate and rotate functions, though; I’m probably not following your logic. You can just rotate or translate in your own code, and the rotate/translate needed changes based on orientation anyway. But having something to change orientation sounds good to me.

Please explain your thinking behind rotate/translate in the mesh definition: when it returns, is the rotation/translation just baked in once? Because you could just do upperLeft.setRotate() or upperLeft.setTranslation(), unless you meant something else.

This deserves its own thread.

Mine, too. Just not Quad extends Mesh quads. I just make a buffer and put a bunch of quads in it.

Thinking something like

   private static Mesh centerMesh(Mesh mesh, BoundingVolume volume) {
      Mesh newMesh = mesh.clone();

      Vector3f center = volume.getCenter();
      VertexBuffer pb = mesh.getBuffer(Type.Position);
      if (pb != null) {
         FloatBuffer fb = (FloatBuffer) pb.getData();
         float[] positions = BufferUtils.getFloatArray(fb);

         // shift every vertex so the volume's center lands at the origin
         for (int i = 0; i < positions.length / 3; i++) {
            positions[i * 3]     -= center.x;
            positions[i * 3 + 1] -= center.y;
            positions[i * 3 + 2] -= center.z;
         }

         // apply the translated positions to the cloned mesh
         VertexBuffer newPositions = new VertexBuffer(Type.Position);
         newPositions.setupData(Usage.Static, 3, VertexBuffer.Format.Float,
                 BufferUtils.createFloatBuffer(positions));
         newMesh.clearBuffer(Type.Position);
         newMesh.setBuffer(newPositions);
         newMesh.updateBound();
      }
      return newMesh;
   }

What do you think? A small snippet of the MeshUtils class.

I tested with Box and Quad.
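For comparison, the same centering idea can be sketched without any jME types (hypothetical class name): compute the bounding box of a raw x,y,z position array yourself, then shift every vertex so the box's center lands at the origin. This is what the BoundingVolume argument provides in the snippet above.

```java
// Self-contained sketch of mesh centering on a raw float[] of x,y,z
// triples: find the axis-aligned bounding box, then subtract its center
// from every vertex, in place.
public class CenterSketch {

    static float[] center(float[] positions) {
        float[] min = { Float.MAX_VALUE,  Float.MAX_VALUE,  Float.MAX_VALUE};
        float[] max = {-Float.MAX_VALUE, -Float.MAX_VALUE, -Float.MAX_VALUE};

        // pass 1: bounding box
        for (int i = 0; i < positions.length; i += 3) {
            for (int axis = 0; axis < 3; axis++) {
                min[axis] = Math.min(min[axis], positions[i + axis]);
                max[axis] = Math.max(max[axis], positions[i + axis]);
            }
        }
        // pass 2: shift by the box's center
        for (int i = 0; i < positions.length; i += 3) {
            for (int axis = 0; axis < 3; axis++) {
                positions[i + axis] -= (min[axis] + max[axis]) * 0.5f;
            }
        }
        return positions;
    }

    public static void main(String[] args) {
        // a 2x2 quad with its lower-left corner at the origin
        float[] quad = {0, 0, 0,  2, 0, 0,  2, 2, 0,  0, 2, 0};
        center(quad); // corners now run from (-1,-1,0) to (1,1,0)
    }
}
```

Deriving the center from the positions themselves avoids needing an up-to-date bound, at the cost of an extra pass over the data.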