Using JMonkeyEngine as a non-realtime renderer

Hello everyone. I want to create an application that performs distributed 3D rendering. The idea is to split the rendering process into individual frames and let each node render one frame at a time. After all frames have been rendered, I will composite them into one video file.

In fact I want to distribute the rendering process through a Java applet. Clients visit the website and download the applet; the applet requests a frame (scene file) from the server, renders it, and sends the resulting bitmap back to the server. Then the applet requests the next frame, and so on. The server collects the rendered bitmaps into a video file. The applet will contain all the libraries (.jar’s) needed to render the frames.
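Roughly, I imagine the applet's main loop working like the sketch below. The URLs, the "204 = no more frames" convention, and the renderFrame() helper are placeholders I made up for illustration, not an existing API:

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.imageio.ImageIO;

public class RenderClient {

    public static void main(String[] args) throws Exception {
        while (true) {
            // 1. Ask the server for the next frame to render (scene description).
            HttpURLConnection get = (HttpURLConnection)
                    new URL("http://example.com/nextFrame").openConnection();
            if (get.getResponseCode() == 204) {
                break; // server has no frames left to hand out
            }
            ByteArrayOutputStream sceneData = new ByteArrayOutputStream();
            try (InputStream in = get.getInputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    sceneData.write(buf, 0, n);
                }
            }

            // 2. Render the frame locally (engine-specific code not shown here).
            BufferedImage frame = renderFrame(sceneData.toByteArray());

            // 3. Upload the finished bitmap so the server can assemble the video.
            HttpURLConnection post = (HttpURLConnection)
                    new URL("http://example.com/submitFrame").openConnection();
            post.setDoOutput(true);
            post.setRequestMethod("POST");
            try (OutputStream out = post.getOutputStream()) {
                ImageIO.write(frame, "png", out);
            }
            post.getResponseCode(); // force the request to complete
        }
    }

    private static BufferedImage renderFrame(byte[] sceneData) {
        // Placeholder: this is where the engine would render the scene
        // described by sceneData into an image.
        return new BufferedImage(1280, 720, BufferedImage.TYPE_INT_RGB);
    }
}
```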

So my question: is it possible to use JMonkeyEngine for this purpose?

Possible, yes. It’s not what it’s really designed for, though.

jME is designed for realtime rendering… some of those optimisations may not be appropriate for what you are doing, but without knowing more of your use case I can’t really comment beyond that.

Like @zarch said, it is certainly possible, but jME is a realtime rendering engine and is optimized for that case. Offline rendering algorithms often use more expensive techniques like raytracing to achieve higher realism, and those computations are usually done on the CPU instead of the GPU… although there are GPU-accelerated alternatives, there is no such thing in jME… you would probably need to dive into OpenCL, compute shaders and other stuff (which jME has no built-in support for)… but it really depends on what you are trying to achieve…
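If you do go the jME route anyway, a minimal sketch of rendering without a visible window could look something like this. The scene path is an assumption about your setup, and reading pixels back (e.g. via a SceneProcessor) plus the upload code are left out; treat this as a starting point, not a complete recipe:

```java
import com.jme3.app.SimpleApplication;
import com.jme3.system.AppSettings;
import com.jme3.system.JmeContext;

public class OfflineRenderer extends SimpleApplication {

    public static void main(String[] args) {
        OfflineRenderer app = new OfflineRenderer();
        AppSettings settings = new AppSettings(true);
        settings.setResolution(1280, 720);
        app.setSettings(settings);
        app.setShowSettings(false);
        // OffscreenSurface creates a GL context without opening a window;
        // the client machine still needs a working GPU/driver for this.
        app.start(JmeContext.Type.OffscreenSurface);
    }

    @Override
    public void simpleInitApp() {
        // Load the scene received from the server (path is an example).
        rootNode.attachChild(assetManager.loadModel("Scenes/frame42.j3o"));
    }

    @Override
    public void simpleUpdate(float tpf) {
        // Advance animations by a fixed step per frame instead of wall-clock
        // time, then read the rendered frame back (e.g. with a SceneProcessor)
        // and hand the bitmap to the upload code.
    }
}
```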