Capture Live Video

I wrote some code that lets you capture live video with a perfect, constant framerate.



It involved adding a class called IsoTimer, which extends Timer, and adding setTimer and getTimer methods to Application.
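
The idea is simply a timer that ignores the wall clock and reports a fixed time step every frame. Here's a minimal sketch of that idea (assuming jME3's abstract com.jme3.system.Timer; the real class is about this simple):

[java]
import com.jme3.system.Timer;

// A timer that ignores the wall clock: every call to update() advances
// game time by exactly 1/framerate seconds, whether rendering is fast or slow.
public class IsoTimer extends Timer {

    private float framerate;
    private int ticks;

    public IsoTimer(float framerate) {
        this.framerate = framerate;
        this.ticks = 0;
    }

    public long getTime() {
        return (long) (ticks / framerate);
    }

    public long getResolution() {
        return 1000000000L; // nanoseconds
    }

    public float getFrameRate() {
        return framerate;
    }

    public float getTimePerFrame() {
        return 1.0f / framerate;
    }

    public void update() {
        ticks++;
    }

    public void reset() {
        ticks = 0;
    }
}
[/java]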



I also added a class called VideoProcessor, which implements SceneProcessor and does the actual video encoding using Xuggle.
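
In rough outline it works like this (a condensed sketch, not the exact class: error handling is omitted, and I'm passing the framerate in explicitly here for brevity, where the real class can auto-detect it; I'm assuming Xuggler's mediatool API):

[java]
import com.jme3.post.SceneProcessor;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.Renderer;
import com.jme3.renderer.ViewPort;
import com.jme3.renderer.queue.RenderQueue;
import com.jme3.texture.FrameBuffer;
import com.jme3.util.BufferUtils;
import com.jme3.util.Screenshots;
import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.video.ConverterFactory;

import java.awt.image.BufferedImage;
import java.io.File;
import java.nio.ByteBuffer;
import java.util.concurrent.TimeUnit;

public class VideoProcessor implements SceneProcessor {

    private final IMediaWriter writer;
    private final double fps;
    private Renderer renderer;
    private ByteBuffer byteBuffer;
    private BufferedImage rawFrame;
    private long numFrames = 0;
    private boolean initialized = false;

    public VideoProcessor(File output, double fps) {
        // Xuggle picks the container/codec from the file extension.
        this.writer = ToolFactory.makeWriter(output.getAbsolutePath());
        this.fps = fps;
    }

    public void initialize(RenderManager rm, ViewPort vp) {
        int width = vp.getCamera().getWidth();
        int height = vp.getCamera().getHeight();
        this.renderer = rm.getRenderer();
        this.byteBuffer = BufferUtils.createByteBuffer(width * height * 4);
        this.rawFrame = new BufferedImage(width, height, BufferedImage.TYPE_4BYTE_ABGR);
        this.writer.addVideoStream(0, 0, width, height);
        this.initialized = true;
    }

    public void postFrame(FrameBuffer out) {
        // Copy the rendered pixels and hand them to the encoder; each
        // frame advances the video clock by exactly 1/fps seconds.
        byteBuffer.clear();
        renderer.readFrameBuffer(out, byteBuffer);
        Screenshots.convertScreenShot(byteBuffer, rawFrame);
        BufferedImage bgrFrame =
            ConverterFactory.convertToType(rawFrame, BufferedImage.TYPE_3BYTE_BGR);
        long timeStamp = (long) (numFrames++ * (1e9 / fps));
        writer.encodeVideo(0, bgrFrame, timeStamp, TimeUnit.NANOSECONDS);
    }

    public void cleanup() { writer.close(); }

    public boolean isInitialized() { return initialized; }
    public void reshape(ViewPort vp, int w, int h) {}
    public void preFrame(float tpf) {}
    public void postQueue(RenderQueue rq) {}
}
[/java]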



All the details, as well as some sample videos, are here:

Capture Live Video Feeds from JMonkeyEngine



I’m new to jMonkeyEngine so please let me know what you think!



sincerely,

–Robert McIntyre


Here’s an example video:

http://www.youtube.com/watch?v=WIJt9aRGusc

Well, it looks better than anything I was able to get with Fraps :)



How does it behave if the jME application itself is kinda CPU/GPU heavy and will not reach the desired framerate?

I’ll take a look at this later.

EmpirePhoenix said:
Well, it looks better than anything I was able to get with Fraps :)

As an alternative you can try Afterburner from MSI: video and screenshot capture, with GPU monitoring as a bonus :roll:
EmpirePhoenix said:
How does it behave if the jME application itself is kinda CPU/GPU heavy and will not reach the desired framerate?


The application will take its time to calculate every frame and step the physics world exactly (1/fps) seconds each time, so no matter how CPU/GPU heavy the application is, the video will be what you would see if you were running screen-capture software on a computer that could run the application at full fps.
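
For example (a sketch, using the setTimer method mentioned above and picking 60 fps):

[java]
// in simpleInitApp(): lock game time to 60 steps per second, so physics
// and the recorded video both advance exactly 1/60 s per rendered frame
this.setTimer(new IsoTimer(60));
[/java]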

How could I get this added to JME3? If the dependency on Xuggle is a deal breaker, it may be possible to use (painfully) JMF to do the same thing.

Well, I was asking because on a client-server system the server will not slow down. I guess I will just test how well it works for me and post a video.

Dude, that’s epic!!!

That’s brilliant work, congrats.

This could be used to save cinematics to a video format to be able to play them back as cutscenes. And since it’s pre-recorded, you could put any fancy graphics stuff in the scene…




bortreb said:
How could I get this added to JME3? If the dependency on Xuggle is a deal breaker, it may be possible to use (painfully) JMF to do the same thing.

It depends, what's the Xuggle license?

LGPL, according to their site; shouldn’t be a real problem then, right?

After reverting to a 32-bit JVM (the 64-bit Windows version of Xuggle is not supported due to MSVC?) I got it to work nicely! This is indeed a better approach than recording in real time with Fraps on a super-machine.

However, I can only see the “scene”, without the shader effects from the FilterPostProcessor. The PssmShadowRenderer is visible, though. Is there a simple solution?

Also, the GUI is in another viewport (Application’s guiViewPort). Is there an easy way to combine two viewports into the same video?



Anyway, the idea looks great and has a lot of potential, but I hope you can answer my questions! :wink:


@maximusgrey:

Thank you for your input. I find it interesting that you had to revert to a 32-bit JVM to get Xuggle to work. I want video capture (with audio) to be a simple part of jMonkeyEngine, so I think I’ll rewrite VideoProcessor to use JMF or something more portable because of this. Your input is greatly appreciated.

Your shader effects are most likely not showing up because your VideoProcessor is placed before the FilterPostProcessor. In the javadoc for VideoProcessor I wrote:


[java]
/**
* <code>VideoProcessor</code> copies the frames it receives to video.
* To ensure smooth video at a constant framerate, you should set your
* application's timer to a new {@link IsoTimer}. This class will
* auto-determine the framerate of the video based on the time difference
* between the first two frames it receives, although you can manually set
* the framerate by calling <code>setFps(newFramerate)</code>. Be sure to
* place this processor *after* any other processors whose effects you want
* to be included in the output video. You can attach multiple
* <code>VideoProcessor</code>s to the same <code>ViewPort</code>.
*
* For example,
* <code>
* someViewPort.addProcessor(new VideoProcessor(file1));
* someViewPort.addProcessor(someShadowRenderer);
* someViewPort.addProcessor(new VideoProcessor(file2));
* </code>
*
* will output a video without shadows to <code>file1</code> and a video
* with shadows to <code>file2</code>
*
* @author Robert McIntyre
*
*/
[/java]

So the VideoProcessor sees the effects of every processor that comes *before* it. The physics example at http://aurellem.org/cortex/capture-video.html#sec-7-1 shows how you can add more processors to get what you want. If the ordering of SceneProcessors is not your problem, I'd love to hear more details and try to reproduce it myself.
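
For instance, for the fog example, the ordering should look something like this (just a sketch inside simpleInitApp(); the filter and output path are placeholders):

[java]
// effects first, recorder last, so the recorder sees the filtered frames
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
fpp.addFilter(new FogFilter());
viewPort.addProcessor(fpp);
viewPort.addProcessor(new VideoProcessor(new File("/tmp/fog.avi")));
[/java]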

Is the documentation good? What would have made it more clear to you documentation wise? Any suggestions would be greatly appreciated!

With regards to combining two ViewPorts into the same video, you have a few options as far as I know:
1. You could attach a VideoProcessor to your GUIViewPort, output to a separate file, and then overlay the videos.
2. I don't know if this is possible, but you may be able to create a ViewPort that "contains" or "sees" both your main ViewPort and your GUIViewPort, and then just record from that.
3. If there's not a way to combine ViewPorts into another ViewPort, maybe there should be :)


I'll make another post in the next two weeks with a more portable version of VideoProcessor that
1. doesn't use Xuggle,
2. can record audio (hopefully!),
3. addresses combining multiple ViewPorts, and
4. has many more examples,
with the hope that it may be included in jMonkeyEngine core.

sincerely,
–Robert McIntyre

Hi @bortreb

Thanks for your extensive reply. To start off, I think your documentation is great; that's not the problem :wink:


  • I needed to revert to 32-bit because there is no 64-bit Windows version of Xuggle (see: http://www.xuggle.com/xuggler/downloads/ ). I read on another forum that it’s due to licensing issues with MSVC?!? There are 64-bit versions for Linux and Macs, but although I’m typing this on a Mac, we do need a Windows version too. ;-(
  • Second point, about the ordering of the processors and having the video one last. That was the first thing that came to my mind too; however, it did not solve the problem. As mentioned in my previous post, it does record the shadows from the shadow processors but not the effects from the FilterPostProcessor. I then thought maybe our code was wrong, so I extended jME’s TestFog simpleInitApp() method by adding the VideoProcessor at the end; again, same results… Maybe this is due to the “post” part?
  • Then about the GUI: yes, I also thought about option 1, but that’s not very elegant. I will try solution 2 or 3, but that takes some time. :wink:



As for your 4 new features: great!

I have been thinking about writing something like this myself for some time; being able to render movies at high quality is very useful. So keep up the good work.

- Maxim

@maximusgrey



I figured out how to get the gui elements to appear in the video output.



All you have to do is make a new viewport, attach whatever other scene processors you want to it, and then do something like this:



[java]
ViewPort compositeViewPort = renderManager.createPostView("composite", cam);
// attach more SceneProcessors!
compositeViewPort.attachScene(this.rootNode);
compositeViewPort.attachScene(this.guiNode);
[/java]
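
Then attach the recorder to that composite view like any other processor (output path hypothetical):

[java]
compositeViewPort.addProcessor(new VideoProcessor(new File("/tmp/composite.avi")));
[/java]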



I hope this helps.



sincerely,

–Robert McIntyre


Hey bortreb,



Great stuff here! Though I too noticed it wasn’t capturing the filters I had on my scene, for instance a cartoon edge filter or, more obviously, a color filter. This was kind of a downer for me because my whole scene is done in cartoon style.



I’m not sure exactly why this is; maybe the post-process filters get applied after the framebuffer is passed to the postFrame() function? Or maybe post-processing is done independent of the given framebuffer. I’m not really sure, just tossing out some ideas.



Maybe you can make the recorder work like the TestRenderToTexture test case. I tried that one and applied some filters to the scene that’s being rendered as a texture, and they showed up correctly on the cube.
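
Something along the lines of TestRenderToTexture’s offscreen setup, maybe (just a sketch of the idea; the names and sizes are hypothetical):

[java]
// Render the filtered scene into a FrameBuffer we control; a recorder
// could then read pixels from that buffer instead of the main framebuffer.
Camera offCamera = cam.clone();
ViewPort offView = renderManager.createPreView("offscreen", offCamera);
offView.setClearFlags(true, true, true);

Texture2D offTex = new Texture2D(512, 512, Image.Format.RGBA8);
FrameBuffer offBuffer = new FrameBuffer(512, 512, 1);
offBuffer.setDepthBuffer(Image.Format.Depth);
offBuffer.setColorTexture(offTex);

offView.setOutputFrameBuffer(offBuffer);
offView.attachScene(rootNode);
[/java]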



Thanks for this, hopefully the filter situation will get sorted out :slight_smile: