Good news on the front of video support for Java: Finally: Enhanced Video Support Coming in Java 7. I recommend also reading the link just under the article's title: Chet Haase's Blog.
I was looking for a framework to quickly develop a simple NLE editor in Java. Of course, I also searched the jME forum for any discussion about video rendering, and I found this discussion and the previous one. After reading all the threads I luckily found the links above, plus another framework, FMJ, an open-source replacement for JMF for Java 5. It is API-compatible with JMF and based on FFMPEG for playback and editing (or on the default system library, if you only need playback). See the supported media formats page for details.
I hope that this can help.
If it is fully JMF-compatible, I think it should be quick and easy to port your tests and code to this framework.
FMJ is not fully compatible yet… and a long while ago it had problems with FOBS… but they were working on it! It also looks like they took over the FFMPEG for Java project, which would replace FOBS. FOBS has the advantage that (with a few changes I made) it gives access to a native ByteBuffer containing the decoded data, but this is not a requirement for my renderer plugin.
As for JMC, that's probably a big step back compared to JMF for getting video into OpenGL, since they're just going to wrap native video players. I hope it's not going to be something simple (and stupid) (and crash-prone) like wrapping DirectShow on Windows, and that they at least allow themselves to use something that gives them a better chance of turning it into a more flexible (and cross-platform) API in the future (FFMPEG), but I certainly wouldn't count on that.
llama said:
Also looks like they took over the FFMPEG for Java project, which would replace FOBS.
I do not know if they used FOBS earlier, but now they do not use it anymore: expand the "more" link of "8. FFMPEG-Java" here. They say:
The existing ffmpeg wrappers have shortcomings: (FOBS4JMF: an unnecessary C++ wrapper is in between Java and ffmpeg, hiding functionality) (jffmpeg: attempt to port ffmpeg to Java, lags behind ffmpeg). [...]
I do not know if it is possible to work with a ByteBuffer using the current FFMPEG-Java. Anyway, they say the JMF compatibility API is almost complete.
I think that, if JMC turns out to be such a "silly symphony" of useless wrappers, FMJ is going to be the only up-to-date solution. I mean, JMF is abandoned and, sooner or later, it will become completely unusable.
FOBS is just a JMF plugin, so it should work with FMJ too, now that FMJ is more complete (AFAIK, FOBS doesn't use any code from the secret sun.* packages or anything like that).
FOBS might have a "useless" C++ wrapper, but it IS by far the best FFMPEG wrapper I've ever tried. I didn't try FMJ's FFMPEG wrapper, though on their website it doesn't look completely finished yet (of course, in CVS it could be in a much better state).
Also, my code is just a plugin too; it doesn't depend on FOBS. However (using reflection, so it doesn't even require the FOBS jars at build time), you can tell it to use the FOBS ByteBuffer shortcut, bypassing the wasteful ByteBuffer -> int[] -> ByteBuffer round trip.
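For reference, here is a minimal sketch of the reflection trick described above: probing for an optional library at runtime so its jars are not needed at build time. The FOBS class name used below is an assumption for illustration; any class from the optional jar would do.

```java
// Sketch: detect an optional library (e.g. FOBS) without a compile-time
// dependency. The probed class name is an assumption for illustration.
public class OptionalDependency {

    /** Returns true if the named class can be loaded at runtime. */
    public static boolean isAvailable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Likely false unless the FOBS jars are actually on the classpath.
        boolean fobsPresent = isAvailable("com.omnividea.media.parser.video.Parser");
        System.out.println("FOBS on classpath: " + fobsPresent);
    }
}
```

Once the probe succeeds, the plugin can call into the optional code path (e.g. the ByteBuffer shortcut) via reflection; otherwise it falls back to the plain JMF path.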
Let me know if you can get FMJ to work; I won't miss having to use JMF one bit.
@nemo: I wrote some code using JMyron (it's the camera system also used by Processing; I used it because I need motion/color tracking as well). It's really basic compared to what llama did (though I think I don't fully understand what he is doing); basically I just grab frames and write them to a BufferedImage. Using a (native) buffer would be much more elegant and of course much faster… but somehow I didn't have the time to look into it. Maybe it helps you… (you can kick out the scaling/filter part of the code; I think jME scales the textures itself if they are not power of 2). Also, the FORCED_WIDTH / FORCED_HEIGHT stuff is not needed on Mac OS X, though it won't work on Windows for me without it.
JMyron project here -> http://webcamxtra.sourceforge.net/ (runs on Mac and Windows)
import java.awt.image.BufferedImage;
import java.util.Timer;
import java.util.TimerTask;
// JMyron ships with the webcamxtra distribution; ScaleFilter comes from an
// image-filter library. Adjust these imports to your setup.

public class TimedVideoGrabber extends TimerTask {

    public static int FORCED_WIDTH = 160;
    public static int FORCED_HEIGHT = 120;

    private int scaleWidth = 0;
    private int scaleHeight = 0;
    private int frameRate;
    private int[] data;
    private JMyron cam;
    private Timer t;
    private BufferedImage image;
    private boolean isScaling = false;

    public TimedVideoGrabber(int frameRate) {
        this.frameRate = frameRate;
        this.cam = new JMyron();
        this.cam.start(FORCED_WIDTH, FORCED_HEIGHT);
        cam.findGlobs(0); // disable glob (blob) tracking; we only grab frames
        cam.update();
        startTimer();
    }

    public TimedVideoGrabber(int frameRate, int width, int height) {
        this.frameRate = frameRate;
        isScaling = true;
        this.scaleWidth = width;
        this.scaleHeight = height;
        this.cam = new JMyron();
        this.cam.start(FORCED_WIDTH, FORCED_HEIGHT);
        cam.findGlobs(0);
        cam.update();
        startTimer();
    }

    private void startTimer() {
        t = new Timer(true);
        // first grab after 1s, then one grab every 1000/frameRate ms
        t.scheduleAtFixedRate(this, 1000, 1000 / frameRate);
    }

    private void createImage() {
        BufferedImage bi = new BufferedImage(FORCED_WIDTH, FORCED_HEIGHT, BufferedImage.TYPE_INT_ARGB);
        bi.setRGB(0, 0, FORCED_WIDTH, FORCED_HEIGHT, this.data, 0, FORCED_WIDTH);
        if (isScaling) {
            ScaleFilter sf = new ScaleFilter(scaleWidth, scaleHeight);
            BufferedImage out = new BufferedImage(scaleWidth, scaleHeight, BufferedImage.TYPE_INT_ARGB);
            sf.filter(bi, out);
            bi = out;
        }
        image = bi;
    }

    public BufferedImage getImage() {
        return image;
    }

    public void stop() {
        t.cancel();
        t.purge();
        cam.stop();
    }

    public void run() {
        try {
            cam.update();
            this.data = cam.image();
            createImage();
        } catch (Exception e) {
            System.out.println("fetching of image from cam failed! message: " + e.getMessage());
        }
    }
}
@renanse: is there any progress on supporting videos in jME without all the FOBS tons-of-libraries-here-and-there stuff?
All I did was write an importer for a specific type of video format (one that is only used by a handful of games). I'll look at releasing that code sometime soon if someone would like to expand on it.
hi ppl!
I am developing a program using ARToolkitPlus and need to use a camera as input; right now I am using a webcam and your classes. To be able to capture images from the webcam I created a new constructor:
public JMFVideoImage(int scalemethod) throws NoPlayerException, CannotRealizeException, IOException {
    this.scalemethod = scalemethod;
    Vector deviceList = CaptureDeviceManager.getDeviceList(new VideoFormat(null));
    System.out.println(deviceList.size());
    deviceInfo = (CaptureDeviceInfo) deviceList.firstElement();
    Manager.setHint(Manager.PLUGIN_PLAYER, new Boolean(true));
    ByteBufferRenderer.listener = this;
    this.initializeDevice();
    jmfplayer.addControllerListener(this);
}
and these two methods:
private void waitForDevice() {
    while (jmfplayer.getState() != jmfplayer.getTargetState()) {
        try {
            Thread.sleep(500);
        } catch (InterruptedException ex) {
        }
    }
}
private void initializeDevice() throws NoPlayerException, IOException {
    try {
        jmfplayer = Manager.createRealizedPlayer(deviceInfo.getLocator());
        //formatControl = (FormatControl) jmfplayer.getControl("javax.media.control.FormatControl");
        //formatControl.setFormat(formatControl.getSupportedFormats()[0]); // <-- this is the 640x480 video format
        jmfplayer.realize();
        jmfplayer.prefetch();
        waitForDevice();
        // note: formatControl is only set when the two commented lines above are enabled
        System.out.println(formatControl.getFormat().toString());
    } catch (CannotRealizeException ex) {
        Logger.getLogger(JMFVideoImage.class.getName()).log(Level.SEVERE, null, ex);
    }
}
With this I am able to get a video source from my webcam in your example, BUT it always uses the 320x240 video format even though 640x480 is the first entry of the formats array... is there any reason you can think of?
EDIT: OK, I have been messing around with JMStudio and discovered that when it uses your renderer it only renders the camera image at 320 (at 640 it just shows a black square), but when using the default settings it was able to render at 640. So I suppose the settings for 640 are not implemented in your ByteBufferRenderer; could you please help me implement them or point me in the right direction?
The renderer doesn't really look at the size… I've rendered video of many different sizes.
It does look at the pixel format (and it doesn't handle all of them), specifically the packing, which I suppose could be different (but that's rather strange since it's the same webcam).
You can check that by looking at the format info it dumps when starting and comparing it to the 320x240 case.
Secondly, the actual video data is received in the process(Buffer buf) method. You can check there whether it's delivered differently, and perhaps dump some buffer contents to see if there's an actual image in it (it's raw data, so a black image is probably all 0s).
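A minimal sketch of that debugging step: write one raw frame to disk so it can be inspected externally (e.g. opened as a .raw image). In a real JMF Renderer you would call this from process(Buffer buf) with the byte[] from buf.getData(); here the method just takes a plain byte[], so the snippet stays self-contained.

```java
import java.io.FileOutputStream;
import java.io.IOException;

// Sketch: dump one raw video frame to a file for offline inspection.
public class FrameDumper {

    /** Writes the raw frame bytes to the given path, overwriting any file there. */
    public static void dumpFrame(byte[] frameData, String path) throws IOException {
        try (FileOutputStream out = new FileOutputStream(path)) {
            out.write(frameData);
        }
    }
}
```

For a 24-bit RGB frame, the file size should equal width * height * 3; a file full of zero bytes would confirm the source is delivering black frames.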
thanks for your quick reply
This is the output from JMStudio when using your renderer:
ByteBufferRenderer: setInputFormat
Format (RGB masks, pixelstride, linestride, bbp, flipped
3
1
2
3
1920
24
true
Width: 640 Height: 480
Frame buffer size (bytes): 1228800
And these are the device details:
Name = vfw:Microsoft WDM Image Capture (Win32):1
Locator = vfw://1
Output Formats---->
0. javax.media.format.RGBFormat
RGB, 320x240, Length=230400, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=960, Flipped
1. javax.media.format.RGBFormat
RGB, 160x120, Length=57600, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=480, Flipped
2. javax.media.format.RGBFormat
RGB, 176x144, Length=76032, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=528, Flipped
3. javax.media.format.RGBFormat
RGB, 352x288, Length=304128, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=1056, Flipped
4. javax.media.format.RGBFormat
RGB, 640x480, Length=921600, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=1920, Flipped
5. javax.media.format.YUVFormat
YUV Video Format: Size = java.awt.Dimension[width=160,height=120] MaxDataLength = 28800 DataType = class [B yuvType = 2 StrideY = 160 StrideUV = 80 OffsetY = 0 OffsetU = 19200 OffsetV = 24000
6. javax.media.format.YUVFormat
YUV Video Format: Size = java.awt.Dimension[width=176,height=144] MaxDataLength = 38016 DataType = class [B yuvType = 2 StrideY = 176 StrideUV = 88 OffsetY = 0 OffsetU = 25344 OffsetV = 31680
7. javax.media.format.YUVFormat
YUV Video Format: Size = java.awt.Dimension[width=320,height=240] MaxDataLength = 115200 DataType = class [B yuvType = 2 StrideY = 320 StrideUV = 160 OffsetY = 0 OffsetU = 76800 OffsetV = 96000
8. javax.media.format.YUVFormat
YUV Video Format: Size = java.awt.Dimension[width=352,height=288] MaxDataLength = 152064 DataType = class [B yuvType = 2 StrideY = 352 StrideUV = 176 OffsetY = 0 OffsetU = 101376 OffsetV = 126720
9. javax.media.format.YUVFormat
YUV Video Format: Size = java.awt.Dimension[width=640,height=480] MaxDataLength = 460800 DataType = class [B yuvType = 2 StrideY = 640 StrideUV = 320 OffsetY = 0 OffsetU = 307200 OffsetV = 384000
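For what it's worth, the 24-bit RGB entries in that list follow a simple relationship that is handy when comparing dumps: LineStride = width * PixelStride, and Length = LineStride * height. A quick sanity check in plain Java (the helper names are my own, just for illustration):

```java
// Sanity-checking the 24-bit RGB entries from the device list:
// lineStride = width * pixelStride, frameLength = lineStride * height.
public class RgbFormatMath {

    public static int lineStride(int width, int pixelStride) {
        return width * pixelStride;
    }

    public static int frameLength(int width, int height, int pixelStride) {
        return lineStride(width, pixelStride) * height;
    }

    public static void main(String[] args) {
        // Format 4: RGB, 640x480, PixelStride=3
        System.out.println(lineStride(640, 3));       // 1920, matches LineStride=1920
        System.out.println(frameLength(640, 480, 3)); // 921600, matches Length=921600
        // Format 0: RGB, 320x240, PixelStride=3
        System.out.println(frameLength(320, 240, 3)); // 230400, matches Length=230400
    }
}
```

If the renderer's dumped stride doesn't match this arithmetic for the negotiated size, the format negotiation is the likely culprit.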
On this computer I am only using JMF and turned off the FOBS optimization (set the flag to false), and what I see is a white 640x480 window with video only in the top-right corner (320x240, I think).
I will try to test different settings and configurations!
Well, dump the contents of the buffer (probably an int[]) in the mentioned method… if that's all "white" then the source is giving you white data. Not much I can do about that then…
What format is the renderer dumping when you use jME? The same as the JMStudio dump? (If so, why not give that output in the first place?)
Also, don't bother using the renderer in JMStudio; after all, it doesn't render anything (it just puts the data in a ByteBuffer).
OK, now it's getting crazy :S I followed your advice and wrote the buffer of frame 1 to disk, then opened the output with IrfanView, and there it was: a .raw image at 640x480 :S So the capturing part works; it's the conversion to texture that must be failing, I guess :S
The output from jme is this:
ByteBufferRenderer: setInputFormat
Format (RGB masks, pixelstride, linestride, bbp, flipped
3
1
2
3
960
24
true
set Size: 320 240
FOBS found
Frame buffer size (bytes): 307200
RGB, 640x480, FrameRate=15.0, Length=921600, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=1920, Flipped
using normal quad
2/Abr/2008 1:48:40 com.jme.scene.Node attachChild
INFO: Child (quad) attached to this node (rootNode)
ByteBufferRenderer: setInputFormat
Format (RGB masks, pixelstride, linestride, bbp, flipped
3
1
2
3
1920
24
true
set Size: 640 480
FOBS found
Frame buffer size (bytes): 1228800
frame data:921600 frame: 1 time: 7 fps:142.85713 type: byte[]
It starts as 320, so my format change is only being applied AFTER the first start :S so maybe the texture is expecting the wrong size...
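As an aside, the two "Frame buffer size" values in the dump are consistent with 4 bytes per pixel for the texture buffer (an inference from the numbers, not from the renderer source), while the incoming frames are 24-bit, which makes the mismatch easy to see in plain arithmetic:

```java
// Sketch: the buffer sizes from the jME dump, reproduced arithmetically.
public class BufferSizeCheck {

    /** Assumed texture buffer layout: 4 bytes per pixel (e.g. ARGB/RGBA). */
    public static int textureBytes(int w, int h) {
        return w * h * 4;
    }

    /** Incoming 24-bit RGB frame: 3 bytes per pixel. */
    public static int frameBytes(int w, int h) {
        return w * h * 3;
    }

    public static void main(String[] args) {
        System.out.println(textureBytes(320, 240)); // 307200, the first allocation
        System.out.println(textureBytes(640, 480)); // 1228800, the second allocation
        System.out.println(frameBytes(640, 480));   // 921600, the "frame data" size logged
    }
}
```

So a 640x480 frame arriving while the texture buffer is still sized for 320x240 would explain the corrupted or partial image.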
OK people, I think I got it: I was able to set the format to the size I wanted before the player started, and now I am getting an image.
One final question: is there any way to set a texture ignoring the power-of-two rule? I need the video to be in the right proportions to be used with ARToolkitPlus.
jME doesn't support rectangular textures yet.
However, it should be very simple to make a rectangular quad and use texture coordinates to map the texture onto it properly.
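The underlying arithmetic of that workaround, sketched without any jME API (the helper below is illustrative, not part of jME): pad the video into the next power-of-two texture (640x480 into 1024x512) and clamp the quad's texture coordinates to the fraction that actually contains video, so the proportions stay correct.

```java
// Sketch: power-of-two padding math for a non-power-of-two video frame.
public class PotTexture {

    /** Smallest power of two >= n (for n >= 1). */
    public static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) {
            p <<= 1;
        }
        return p;
    }

    public static void main(String[] args) {
        int videoW = 640, videoH = 480;
        int texW = nextPowerOfTwo(videoW); // 1024
        int texH = nextPowerOfTwo(videoH); // 512
        // Only this fraction of the padded texture holds video; use these as
        // the maximum u/v values when building the quad's texture coordinates.
        float maxU = (float) videoW / texW; // 0.625
        float maxV = (float) videoH / texH; // 0.9375
        System.out.println(texW + "x" + texH + " u<=" + maxU + " v<=" + maxV);
    }
}
```

The quad itself keeps the video's aspect ratio (e.g. 4:3), and only the texture coordinates compensate for the padding.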
Greetings,
I have a problem with the TestJMFVideoImage application. It only shows the first frame of the video.
Here is the console info:
Fobs4JMF - Native shared library found
57.1667First Position: 0, 0 Duration: 57166
Frame Rate: 12
Opening Thread[JMF thread: com.sun.media.PlaybackEngine@1e0f2f6[ com.sun.media.PlaybackEngine@1e0f2f6 ] ( configureThread),9,system]
ByteBufferRenderer: setInputFormat
Format (RGB masks, pixelstride, linestride, bbp, flipped
ff0000
ff
ff00
1
352
32
false
FOBS found
Frame buffer size (bytes): 405504
ByteBufferRenderer: setInputFormat
Format (RGB masks, pixelstride, linestride, bbp, flipped
ff0000
ff
ff00
1
352
32
false
FOBS found
Frame buffer size (bytes): 405504
An important thing is that I can play the video file with the FOBS JMF player. But when I try to play the video using the FOBS JMF player from Eclipse, or using the TestJMFVideoImage app, it only shows the first frame.
Any help. Thanks in advance.
Did you try any other video files?
Hi,
Yes, I tested many video files with the same result. Only the first frame is shown.
More info,
I have been testing some video files, and the only one that runs uses the FFmpeg MPEG-1/2 codec.
Other codecs, such as ISO MPEG-4, S-Mpeg 4 version 2, and MPEG-1, only show the first frame.
Any help?
Well, that certainly is… mysterious. I've had success with a lot of other formats (XviD AVI and such)… I'm also not sure what you mean when you say the "FFmpeg" MPEG 1/2 codec works but MPEG-1 does not.
Did you use the JMF version that comes with FOBS? What platform?
Also, you might try sticking to the "older" containers (AVI and MPG, mostly)… since FOBS is pretty old, it comes with an older version of FFMPEG.
Hi,
I made a clean installation on another PC and everything works perfectly!
Hi, I get a strange error at this line:
tex.setImage(image);
in TestJMFVideoImage
Terminal dump:
Jun 17, 2008 3:01:50 AM com.jme.app.BaseGame start
INFO: Application started.
Jun 17, 2008 3:01:50 AM com.jme.system.PropertiesIO <init>
INFO: PropertiesIO created
Jun 17, 2008 3:01:50 AM com.jme.system.PropertiesIO load
INFO: Read properties
Jun 17, 2008 3:01:52 AM com.jme.system.PropertiesIO save
INFO: Saved properties
Jun 17, 2008 3:01:52 AM com.jme.app.BaseSimpleGame initSystem
INFO: jME version 1.0
Jun 17, 2008 3:01:52 AM com.jme.input.joystick.DummyJoystickInput <init>
INFO: Joystick support is disabled
Jun 17, 2008 3:01:52 AM com.jme.system.lwjgl.LWJGLDisplaySystem <init>
INFO: LWJGL Display System created.
Jun 17, 2008 3:01:52 AM com.jme.renderer.lwjgl.LWJGLRenderer <init>
INFO: LWJGLRenderer created. W: 640H: 480
Jun 17, 2008 3:01:53 AM com.jme.app.BaseSimpleGame initSystem
INFO: Running on: null
Driver version: null
ATI Technologies Inc. - ATI Radeon X1600 OpenGL Engine - 2.0 ATI-1.5.28
Jun 17, 2008 3:01:53 AM com.jme.renderer.AbstractCamera <init>
INFO: Camera created.
Jun 17, 2008 3:01:53 AM com.jme.util.lwjgl.LWJGLTimer <init>
INFO: Timer resolution: 1000 ticks per second
Jun 17, 2008 3:01:53 AM com.jme.scene.Node <init>
INFO: Node created.
Jun 17, 2008 3:01:53 AM com.jme.scene.Node <init>
INFO: Node created.
Jun 17, 2008 3:01:53 AM com.jme.scene.Node attachChild
INFO: Child (FPS label) attached to this node (FPS node)
Fobs4JMF - Native shared library found
[mpeg4 @ 0x17023460]frame skip 8
[mpeg4 @ 0x17023460]frame skip 8
[mpeg4 @ 0x17023460]frame skip 8
[mpeg4 @ 0x17023460]frame skip 8
1394.64First Position: 0, 166 Duration: 1394644
Frame Rate: 23.976
Opening Thread[JMF thread: com.sun.media.PlaybackEngine@ae717f[ com.sun.media.PlaybackEngine@ae717f ] ( configureThread),9,system]
Fobs Java2DRenderer: setInputFormat
Fobs Java2DRenderer: setInputFormat
Created player for: file:Bleach - 01.avi
using normal box
Image: JMFVideoImage@91f61c
Jun 17, 2008 3:01:53 AM com.jme.scene.Node attachChild
INFO: Child (box) attached to this node (rootNode)
Jun 17, 2008 3:01:53 AM class TestJMFVideoImage start()
SEVERE: Exception in game loop
java.lang.NullPointerException
at com.jme.scene.state.lwjgl.LWJGLTextureState.load(Unknown Source)
at com.jme.scene.state.lwjgl.LWJGLTextureState.apply(Unknown Source)
at com.jme.renderer.lwjgl.LWJGLRenderer.applyStates(Unknown Source)
at com.jme.renderer.lwjgl.LWJGLRenderer.draw(Unknown Source)
at com.jme.scene.batch.TriangleBatch.draw(Unknown Source)
at com.jme.scene.TriMesh.draw(Unknown Source)
at com.jme.scene.Spatial.onDraw(Unknown Source)
at com.jme.scene.Node.draw(Unknown Source)
at com.jme.scene.Spatial.onDraw(Unknown Source)
at com.jme.renderer.lwjgl.LWJGLRenderer.draw(Unknown Source)
at com.jme.app.SimpleGame.render(Unknown Source)
at com.jme.app.BaseGame.start(Unknown Source)
at TestJMFVideoImage.main(TestJMFVideoImage.java:199)