I need some information for my new project! I am considering using the jMonkeyEngine for a medical application. I want to display a human brain (data from MRI, SRF format) as a single mesh with up to 300k vertices and a calculated texture based on the local bending of the mesh. I don't have any experience with working directly on a mesh, which is why I want to ask if it's possible to use the mesh data to calculate a texture that segments the brain.
What exactly is "a texture that segments the brain"? If you can define that precisely, it should be possible to generate a shader or render state that produces the desired output into the texture.
I want to use the mean curvature operator on the mesh in order to generate a texture that divides the brain into two different parts (sulci and gyri). I just don't have any experience with working directly on a mesh, which is why I'm a little afraid of hitting JME's limits. Are there any potential problems with:
- Importing a mesh with up to 300k vertices
- Doing calculations directly on that mesh
- Generating a texture and applying it accurately to the mesh
- Rendering the result at 20 FPS
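For the second point, one way to sketch the per-vertex curvature step in plain Java (no JME dependency): approximate the mean curvature normal with the uniform Laplacian ("umbrella" operator) over each vertex's one-ring, and take its sign along the vertex normal to split concave regions (sulci) from convex ones (gyri). The class and method names here are illustrative assumptions, and the umbrella operator is a crude stand-in for a proper cotangent-weighted discretisation:

```java
import java.util.*;

/** Sketch: classify mesh vertices as convex (gyri, +1), concave (sulci, -1)
 *  or flat (0) via the umbrella (uniform Laplacian) operator projected onto
 *  the vertex normal. Illustrative names, not JME API. */
public class CurvatureSign {

    /** positions/normals: flat xyz arrays (normals unit length);
     *  triangles: flat index array, three indices per face. */
    public static int[] classify(float[] positions, float[] normals, int[] triangles) {
        int n = positions.length / 3;
        // collect one-ring neighbours from the triangle list
        List<Set<Integer>> ring = new ArrayList<>();
        for (int i = 0; i < n; i++) ring.add(new HashSet<>());
        for (int t = 0; t < triangles.length; t += 3) {
            int a = triangles[t], b = triangles[t + 1], c = triangles[t + 2];
            ring.get(a).add(b); ring.get(a).add(c);
            ring.get(b).add(a); ring.get(b).add(c);
            ring.get(c).add(a); ring.get(c).add(b);
        }
        int[] sign = new int[n];
        for (int i = 0; i < n; i++) {
            Set<Integer> nb = ring.get(i);
            if (nb.isEmpty()) continue;
            // umbrella operator: mean of the one-ring minus the vertex itself
            double lx = 0, ly = 0, lz = 0;
            for (int j : nb) {
                lx += positions[3 * j];
                ly += positions[3 * j + 1];
                lz += positions[3 * j + 2];
            }
            lx = lx / nb.size() - positions[3 * i];
            ly = ly / nb.size() - positions[3 * i + 1];
            lz = lz / nb.size() - positions[3 * i + 2];
            // project onto the normal: negative means the surface bulges
            // outwards (gyrus), positive means it curves inwards (sulcus)
            double h = lx * normals[3 * i] + ly * normals[3 * i + 1] + lz * normals[3 * i + 2];
            sign[i] = h < -1e-9 ? 1 : (h > 1e-9 ? -1 : 0);
        }
        return sign;
    }
}
```

A single pass like this over 300k vertices is cheap; the expensive part is only building the one-ring adjacency, which you would do once after import.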
Look into marching cubes for mesh representations of medical data sets. However, I would suggest that you also look at techniques outside of boundary representation. Look up splatting, ray tracing, shaders, and the shear-warp factorization paper ("Fast Volume Rendering Using a Shear-Warp Factorization of the Viewing Transformation" by Philippe Lacroute and Marc Levoy) for very good info on issues and solutions in visualising large medical data sets.
None of the four points you mention should pose a problem, as far as I can see.
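One simple way to get the segmentation onto the mesh without generating a texture at all is per-vertex colouring: map each vertex's convex/concave label to an RGBA value and attach the result as a colour vertex buffer. The sketch below is plain Java and assumes a classification array of +1/-1/0 per vertex (the grey levels chosen are arbitrary); in jME3 the returned array could then be set on the mesh as a Color vertex buffer with vertex colours enabled on the material:

```java
/** Sketch: turn a per-vertex convex/concave classification into a flat
 *  RGBA float array (4 floats per vertex). Plain Java, no engine API;
 *  the grey values are arbitrary choices. */
public class CurvatureColors {
    public static float[] toColors(int[] sign) {
        float[] rgba = new float[sign.length * 4];
        for (int i = 0; i < sign.length; i++) {
            // gyri (convex, +1) light, sulci (concave, -1) dark, flat mid-grey
            float g = sign[i] > 0 ? 0.9f : (sign[i] < 0 ? 0.3f : 0.6f);
            rgba[4 * i]     = g;   // red
            rgba[4 * i + 1] = g;   // green
            rgba[4 * i + 2] = g;   // blue
            rgba[4 * i + 3] = 1f;  // opaque
        }
        return rgba;
    }
}
```

Vertex colours avoid the UV-unwrapping step a real texture would need, which sidesteps your "apply it accurately" concern entirely; a baked texture only becomes necessary if you want resolution finer than one value per vertex.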