[SOLVED] Rendering YUV420 video data. Need help writing the material

Hello, new to jMonkey and game dev here.

I would like to render a YUV420 video buffer onto an object - a skybox, an inside-out sphere, or just a rectangular surface, for example.

The YUV420 data comes from LibVLC rendering a video into a memory buffer.

LibVLC has support for converting the YUV data to RGB using DirectX; however, I am finding that it is not efficient enough for my purposes.

As I understand it, I may be able to create a shader and a material that will render the YUV data onto an object; however, I am not yet sure whether this is possible or how to go about doing it.

This is what I’m mostly stuck on at the moment:

As I understand it, a vertex shader (which for my purposes should be the basic predefined vert shader) will give me the coordinate of a pixel in a 2D texture, and my fragment shader then needs to calculate the color at that coordinate.

And a Texture2D object has an Image object behind it describing the actual texture. As I understand it, an Image cannot be created with a YUV format behind it (the Image constructor needs a format parameter, and there isn't one for YUV). So it seems I somehow need to pass a YUV data buffer to the frag shader instead of a texture.

I am only just starting to look into all the docs available on creating custom materials and shaders, so any help/input/corrections to my understanding of this will be greatly appreciated.

I don’t think this is true… your data is your data. If the texture is YUV format then the values will be YUV.

Cut and paste Unshaded.j3md and its vert and frag shaders into your own.

Modify the frag shader to take the color from the texture and convert it from YUV to RGB.

Done. (basically)
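For illustration, putting such a copied material onto a simple quad might look roughly like this on the Java side (the "MatDefs/YuvUnshaded.j3md" path, the "ColorMap" parameter name and videoTexture are just placeholders for whatever your copy ends up using):

    // Inside simpleInitApp() of a SimpleApplication, so assetManager and rootNode exist.
    Material videoMat = new Material(assetManager, "MatDefs/YuvUnshaded.j3md"); // your copied .j3md
    videoMat.setTexture("ColorMap", videoTexture); // parameter name as declared in your .j3md

    Geometry screen = new Geometry("VideoScreen", new Quad(16, 9));
    screen.setMaterial(videoMat);
    rootNode.attachChild(screen);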

Going to start playing about with this today, although it may prove tricky given how hard shaders are to debug.

One thing that I’d like to get a clarification on:

A YUV420 image has 3 planes. The first (Y) plane has 1 byte per pixel. The second and third (U and V) planes have 1 byte per 2x2 block of pixels, i.e. 1 byte per 4 pixels.
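Just to illustrate my understanding of the layout, the plane sizes for one frame would be something like this (variable names made up, assuming tightly packed planes with no row padding):

    // Plane sizes for a width x height YUV420 frame
    int ySize = width * height;              // Y: 1 byte per pixel
    int uSize = (width / 2) * (height / 2);  // U: 1 byte per 2x2 pixel block
    int vSize = uSize;                       // V: same size as U

    ByteBuffer yPlane = BufferUtils.createByteBuffer(ySize);
    ByteBuffer uPlane = BufferUtils.createByteBuffer(uSize);
    ByteBuffer vPlane = BufferUtils.createByteBuffer(vSize);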

I can see that inside an OpenGL shader I can access each pixel of a texture with:
vec4 texture2D(sampler2D s, vec2 coord)

However, I am not clear how this function will work for my use case, since it returns a vec4, i.e. four values, whereas I only expect to have 1 value at each pixel location.

Usually it’s (R,G,B,A), so Red, Green, Blue, Alpha. In your case it’s (Y,U,V,A).

That it’s 1 byte per 4 pixels doesn’t matter - that’s how compression (chroma subsampling) works. The texture is still sampled per fragment (pixel), so you will have four values for each pixel later on.

Also, note that you get pseudo-floating-point numbers (0.0 … 1.0) in the vec4, because that’s how OpenGL and GLSL work - with normalized values, not with bytes (0 … 255). But that’s not a problem, because you can convert the number any way you want with a -clever- conversion routine (note that floating point may have rounding errors, so you may want to add a safety margin when converting numbers to a different range).
EDIT: Though, there are integer textures too, which may be used for palette-based shaders. But these are usually not used. Usually you just work with that normalized [0.0 … 1.0] interval.
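As a trivial sketch of that last point, mapping a normalized sample back to a byte value on the Java side could look like this (the clamp covers the rounding-error case mentioned above):

    // Convert a normalized sample (0.0 .. 1.0) back to a byte value (0 .. 255),
    // clamping to guard against small rounding errors outside the range.
    int toByte(float normalized) {
        int value = Math.round(normalized * 255f);
        return Math.max(0, Math.min(255, value));
    }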

Well, that was actually easier than expected to implement. Here’s what I did to implement the conversion for a sky-type material. I basically copied the sky material definition and made the following changes:

In the j3md file I created a texture parameter for each of the Y, U, and V planes, like so:

    MaterialParameters {
        Texture2D yTexture
        Texture2D uTexture
        Texture2D vTexture
        ...
    }
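On the Java side those parameters then just get set like any other material texture; roughly like this (the .j3md path is a placeholder for wherever the copied definition lives):

    Material yuvMat = new Material(assetManager, "MatDefs/YuvSky.j3md"); // copied sky material
    yuvMat.setTexture("yTexture", yTexture);
    yuvMat.setTexture("uTexture", uTexture);
    yuvMat.setTexture("vTexture", vTexture);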

Then in my frag shader I accessed the textures and did the transformation to RGB colorspace like so:

#import "Common/ShaderLib/Optics.glsllib"

uniform ENVMAP m_yTexture;
uniform ENVMAP m_uTexture;
uniform ENVMAP m_vTexture;

varying vec3 direction;

void main() {
    vec3 dir = normalize(direction);

    // Sample one byte per plane; a Luminance texture replicates it into r, g and b.
    float y = Optics_GetEnvColor(m_yTexture, dir).r;
    float u = Optics_GetEnvColor(m_uTexture, dir).r;
    float v = Optics_GetEnvColor(m_vTexture, dir).r;

    // ITU-R BT.601 limited-range YUV -> RGB conversion.
    y = 1.1643 * (y - 0.0625);
    u = u - 0.5;
    v = v - 0.5;

    float r = y + 1.5958 * v;
    float g = y - 0.39173 * u - 0.81290 * v;
    float b = y + 2.017 * u;

    // Clamp each channel to the valid [0.0, 1.0] range.
    gl_FragColor = vec4(clamp(vec3(r, g, b), 0.0, 1.0), 1.0);
}

And finally, when creating the 3 images for the Y, U, and V textures out of the 3 Y, U, V ByteBuffers, I simply had to set the image format to Luminance8, like so:

    Image yImage = new Image(Image.Format.Luminance8,
                             imageWidth,
                             imageHeight,
                             yByteBuffer, ColorSpace.Linear);
    Texture2D yTexture = new Texture2D(yImage);

    // The U (and V) planes are half the width and half the height of the Y plane.
    Image uImage = new Image(Image.Format.Luminance8,
                             imageWidth / 2,
                             imageHeight / 2,
                             uByteBuffer, ColorSpace.Linear);
    Texture2D uTexture = new Texture2D(uImage);

The Luminance image format reads the texture 1 byte per pixel and sets each RGB component of the vec4 that you get inside your shader to that single luminance value. This is exactly how I needed to read the data in the 3 Y, U, V planes.
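Since the video buffer changes every frame, the remaining piece (not shown above) is pushing fresh plane data back into the images whenever LibVLC delivers a new frame; something along these lines (just a sketch using jME's Image API):

    // Per-frame update: hand the refreshed plane buffers back to the images
    yImage.setData(0, yByteBuffer);
    uImage.setData(0, uByteBuffer);
    vImage.setData(0, vByteBuffer);
    yImage.setUpdateNeeded();  // tell jME the texture data needs re-uploading
    uImage.setUpdateNeeded();
    vImage.setUpdateNeeded();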


Yes, you can do that. It’s simply a non-alpha gray image then (“Luminance8” means the same).
Usually you would pack your Y, U, V into a texture with R, G, B - to make things more efficient.
Also … if you need the Alpha channel someday, a (Y, U, V, A) texture might come in handy.
Anyways, you solved it your way and are happy with it - that’s all that matters. 🙂