Can anyone help with this SurfaceNets implementation please?

Hi all,

I’ve been playing around and learning for about a week, having not done any 3D stuff before, and I think I just about understand the basics of vertices/indices/normals. I’ve been experimenting with the various isosurface demos and libraries that the community has put out or recommended over the years, as well as the voxel libraries, as I’d like to implement various destructible surfaces.

It’s helped me crystallize exactly what I want to work towards and taught me loads about JME internals that I needed to know, but I haven’t actually had much success. Between building old sources and trying to rip out isosurface functionality from demos and old threads, I’ve only been able to get block-based voxel stuff to actually render; none of the isosurface or Marching Cubes code I have found has worked out for me so far, for various reasons lol. It’s been a fun week, though.

I have found this implementation of SurfaceNets:

The demo code works to output an .obj file, and the result looks great, if only I can get the in-memory generation working. I’m wondering if this may be the simplest entry point for me to start prototyping with isosurfaces. The part that’s stumping me at the moment is how to get the appropriate values out of that SurfaceNets class to build a JME mesh.

I tried to follow [Solved] Simple Dynamic Mesh Example with JME3 - #3 by glitch83

But I am unable to make the leap in my mind between the Vertices/Faces exposed in the class and the data JME needs to build a mesh.

Can one of you 3D wizards please kindly point out the obvious stuff I am missing here? I would love to get this working and share it.

Thanks everyone

Have you looked at this tutorial:

Yeah, so I guess where I’m not sure specifically is given SurfaceNets exposes this:

public final List<double[]> vertices;
public final List<int[]> faces;

and a custom mesh wants this:

Vector3f [] vertices = new Vector3f[4];
vertices[0] = new Vector3f(0,0,0);

int [] indexes = { 2,0,1, 1,3,2 };

Vector2f[] texCoord = new Vector2f[4];

I can see I need to unpack the List of faces into the int[] indexes, keeping the order, right? Do I then need to convert the doubles to floats and wrap each x/y/z triple in a Vector3f for the vertex array? If that’s right, then maybe it’s just the texture part tripping me up: do I need one Vector2f per vertex no matter the shape?

I’m going to have another go now that I’m fresh, but a sanity check would be appreciated if I’m on the right path. Thanks!
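Writing down the unpacking I have in mind as a minimal pure-Java sketch (the helper names toPositions/toIndices are just mine, and I’m assuming each double[] holds one x/y/z vertex; if the implementation emits quad faces, each int[4] needs splitting into two triangles, which the sketch also handles):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SurfaceNetsUnpack {

    // Flatten List<double[]> (one x/y/z per entry) into the float[] that
    // a position buffer wants, e.g. mesh.setBuffer(Type.Position, 3, ...).
    public static float[] toPositions(List<double[]> vertices) {
        float[] out = new float[vertices.size() * 3];
        int i = 0;
        for (double[] v : vertices) {
            out[i++] = (float) v[0];
            out[i++] = (float) v[1];
            out[i++] = (float) v[2];
        }
        return out;
    }

    // Flatten List<int[]> faces into a triangle index array, keeping order.
    // A quad [a,b,c,d] is split into triangles (a,b,c) and (a,c,d).
    public static int[] toIndices(List<int[]> faces) {
        List<Integer> out = new ArrayList<>();
        for (int[] f : faces) {
            out.add(f[0]); out.add(f[1]); out.add(f[2]);
            if (f.length == 4) {
                out.add(f[0]); out.add(f[2]); out.add(f[3]);
            }
        }
        int[] arr = new int[out.size()];
        for (int i = 0; i < arr.length; i++) arr[i] = out.get(i);
        return arr;
    }

    public static void main(String[] args) {
        List<double[]> verts = List.of(
            new double[]{0, 0, 0}, new double[]{1, 0, 0}, new double[]{0, 1, 0});
        List<int[]> faces = List.of(new int[]{0, 1, 2});
        System.out.println(Arrays.toString(toPositions(verts)));
        System.out.println(Arrays.toString(toIndices(faces)));
    }
}
```

The resulting float[] and int[] are exactly the shapes the custom-mesh tutorial feeds into the mesh buffers, so hopefully this is the missing link — corrections welcome.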

You might also take a look at

There are many ways to construct a mesh but OpenGL deals in floats not doubles… so everything will be float in the end.

Whether your texture coordinates are two floats, or three floats, etc. depends entirely on the shader’s expectations.

Note that none of this is really specific to JME as it does a really good job of trying to be a thin wrapper around OpenGL in this case.

A mesh is a set of vertex attributes buffers.
A material is the shaders + parameters.
A geometry puts those together and relates directly to a single OpenGL draw call.
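To make the “a mesh is a set of vertex attribute buffers” point concrete, here’s a minimal sketch using only java.nio (no JME classes, so take the JME calls in the comments as the general idea rather than a full example); the vertexCount helper is just for illustration:

```java
import java.nio.FloatBuffer;

public class MeshBuffers {

    // Each attribute buffer describes the same set of vertices, just with a
    // different number of float components per vertex.
    public static int vertexCount(FloatBuffer buf, int componentsPerVertex) {
        return buf.capacity() / componentsPerVertex;
    }

    public static void main(String[] args) {
        // Positions: 3 floats per vertex, here a unit quad.
        FloatBuffer pos = FloatBuffer.wrap(new float[]{
            0, 0, 0,   1, 0, 0,   1, 1, 0,   0, 1, 0});
        // Texture coordinates: 2 floats per vertex, one pair per vertex.
        FloatBuffer tex = FloatBuffer.wrap(new float[]{
            0, 0,   1, 0,   1, 1,   0, 1});

        // Both buffers must describe the same number of vertices (4 here).
        // In JME you would then hand each to the mesh, roughly:
        //   mesh.setBuffer(VertexBuffer.Type.Position, 3, pos);
        //   mesh.setBuffer(VertexBuffer.Type.TexCoord, 2, tex);
        // and combine the Mesh with a Material in a Geometry.
        System.out.println(vertexCount(pos, 3));
        System.out.println(vertexCount(tex, 2));
    }
}
```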

Thanks dude - So in the case of:
Material mat = new Material(assetManager,

I am allowed to supply one Vector2f per vertex and it will work as expected? And if I go outside that, it will fall back to some other minimum or pattern?

I believe if you pass a 3-component vector to a shader that is expecting 2, then it will just ignore the third one. But I guess we are getting off topic.


Well, I’ve had a go today and I think I’ve got the principle, but I haven’t got SurfaceNets spitting out the right data yet, so I’m not sure whether a couple of issues remain; for example, I’m getting a lot of blank index positions.

Anyway I’ll leave this here for anyone curious and any implementation pointers are much appreciated.

SurfaceNets is spitting out the wrong shapes for (the intent of) my isodata, so I’m just getting a plane for now. This looks wrong, but it seems true to the .obj model.