Loading YUV420/NV21/YCbCr as texture

Is it possible to load YCbCr images as textures without converting them to RGB? Or is it not supported by the graphics pipeline? I went through the formats in Image.Format but could not find anything.

um… no? yes? ask question differently?

It’s hard to properly answer this question from my POV. It’s sooo easy to blend colors in RGBA format…

so uh… no, the engine does not currently load those images without automatically converting them to RGB(A) space for the OpenGL calls.

Yes, you could do it… along the lines of: you would have to rewrite part of the OpenGL pipeline to support your color-profile math… which then gets converted back to RGB anyway, because monitors use RGB (additive light values per pixel) to display…

YCbCr is a luminance-plus-difference model… Cb and Cr are stored as differences relative to the luma Y… which makes for the kind of matrix math that is non-obvious to someone not familiar with YCbCr…

um… it’s easier for me to understand (R+G+B) = white than (Y + (Y-Cb) + (Y-Cr)) = white

or “what part of e^{i \pi} + 1 = 0 don’t you understand?”

so… uh… no, but you could make a rendering system based on that?

I’m certainly not going to try to stop you, if that’s something you’d like to do. :sweat_smile:
As long as you can have fun with it.

jME3 doesn’t expose it because it isn’t commonly supported by hardware. Even if it is supported, there might be limitations with mipmapping, wrapping modes, filtering modes, etc.
So the next logical question is, what is this for?
Are you making your own video player? In that case, why can’t you simply decode it in your video player shader?

I asked this because we use the Android camera feed to show it in the JME world. We are developing an augmented reality app ( https://play.google.com/store/apps/details?id=com.bitstars.holoplayer ) which uses JME for rendering. It works fine to render the camera preview, but we have to convert each YUV frame to RGB, and the conversion takes about 5% of the frame time. It’s not a very big deal, but since YUV is much more useful in computer vision, I just thought it might also be more efficient for the rendering pipeline. I don’t want to write my own rendering pipeline @Relic724 :wink: but I am always looking for ways to optimize performance.
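For context, the per-pixel conversion being done on the CPU each frame looks roughly like the following. This is a minimal sketch using the BT.601 full-range formula; the exact coefficients depend on the camera's color space, and the class name `YuvToRgb` is just for illustration.

```java
// Minimal sketch of the per-pixel YCbCr -> RGB conversion (BT.601 full range).
// Assumes Y, Cb, Cr are in [0, 255]; returns a packed 0xRRGGBB int.
public class YuvToRgb {

    static int clamp(double v) {
        return (int) Math.max(0, Math.min(255, Math.round(v)));
    }

    // Convert one pixel. Cb and Cr are chroma differences centered around 128.
    public static int toRgb(int y, int cb, int cr) {
        int r = clamp(y + 1.402 * (cr - 128));
        int g = clamp(y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128));
        int b = clamp(y + 1.772 * (cb - 128));
        return (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        // Neutral chroma (128) gives a gray level equal to Y:
        System.out.println(Integer.toHexString(toRgb(128, 128, 128))); // 808080
        System.out.println(Integer.toHexString(toRgb(255, 128, 128))); // ffffff
    }
}
```

Doing exactly this math in a fragment shader instead of per-frame Java code is what the replies below are suggesting.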

oh, well, shucks…

All you need to do is “shuffle data” then… if you handle the data as “raw” and just make sure to write your own shader to handle the values coming in from that “sampler” correctly… you should be able to bypass the natural automatic translation of images… (you will need to write your own class for it though)

I don’t have enough experience in android’s camera capturing format to give help beyond the generic advice on this one… but I know that shaders can accept most any type of data… it’s up to us to know what to do with it from there. That info you just supplied was the key that was needed.

Yeah, you just have to write a shader for it, and a class that gets it from raw to organized data for the shader. That should solve it nicely.

edit: I guess that you would be making a “Texture” type class… but it would definitely be a new thing. :smile:
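The “shuffle data” step could look something like this: a hedged sketch (the class name `Nv21Planes` is hypothetical) that splits an Android NV21 frame into its luma plane and its interleaved chroma plane, which could then be uploaded as two raw textures for a custom shader to recombine. NV21's documented layout is width×height bytes of Y followed by width×height/2 bytes of interleaved V,U at half resolution.

```java
// Sketch: split an NV21 camera frame into two planes suitable for upload
// as separate raw textures (e.g. one luminance texture and one two-channel
// chroma texture) that a custom shader would recombine into RGB.
// NV21 layout: width*height Y bytes, then width*height/2 interleaved V,U bytes.
public class Nv21Planes {
    public final byte[] y;   // width * height luma bytes
    public final byte[] vu;  // width * height / 2 interleaved V,U bytes

    public Nv21Planes(byte[] nv21, int width, int height) {
        int ySize = width * height;
        y = new byte[ySize];
        vu = new byte[ySize / 2];
        System.arraycopy(nv21, 0, y, 0, ySize);
        System.arraycopy(nv21, ySize, vu, 0, ySize / 2);
    }
}
```

The point is that no per-pixel math happens on the CPU here — just two bulk copies — leaving the actual YCbCr-to-RGB conversion to the shader.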

Why not just use Android’s built-in functionality for this?
You have the SurfaceTexture class which lets you stream images from the camera to OpenGL, see here:
https://developer.android.com/reference/android/graphics/SurfaceTexture.html

jME3 exposes the texture ID via Texture2D.getImage().getId().

Note: if you don’t know something, it’s ok just not to respond.

https://developer.android.com/reference/android/graphics/SurfaceTexture.html

Ugh… Looks like they require the texture to be bound to GL_TEXTURE_EXTERNAL_OES binding to work … and you need to use special shader extensions to access the image.

So I guess you will need to modify the engine after all… Maybe create a Texture2DExternal class or something that will use the proper constants.
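A sketch of the key piece such a class would need. The constant comes from the OES_EGL_image_external extension (the same value Android exposes as `GLES11Ext.GL_TEXTURE_EXTERNAL_OES`); the class name `Texture2DExternal` and the `getBindTarget()` method are hypothetical, since the real work is routing this binding target through jME3's renderer.

```java
// Hypothetical sketch: the core change is binding the texture to
// GL_TEXTURE_EXTERNAL_OES instead of GL_TEXTURE_2D. The 0x8D65 value is
// defined by the OES_EGL_image_external extension.
public class Texture2DExternal {
    public static final int GL_TEXTURE_EXTERNAL_OES = 0x8D65;
    public static final int GL_TEXTURE_2D = 0x0DE1;

    // In a real engine integration, the renderer would query this to decide
    // which target to pass to glBindTexture for this texture's image id.
    public int getBindTarget() {
        return GL_TEXTURE_EXTERNAL_OES;
    }
}
```

The matching fragment shader would also need `#extension GL_OES_EGL_image_external : require` and a `samplerExternalOES` uniform, per the extension spec.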

Good luck - if you get something working, you can submit a pull request :chimpanzee_closedgrin:

agreed, and it would save me a lot of face/pride. Pride be damned, IMO, if I can learn something new and grow. The netbeans forum… erg…

Let’s just say that there wasn’t even a “RTFM” out of that one. I made sure to go ahead and post the answers to my own questions on that one… just so it wouldn’t be like the two and three year old similar questions that were lurking untouched.

Silence is a **** for communications.

edit: and @Momoko_Fan whipped up some cool info to help solve it! woot! :smile: I am genuinely interested in the eventual solution to this one… even if only to store it in “the back of my head” for future reference. I may eventually come across this type of problem in a different context, later on.

On this forum, if a question goes more than two days without an answer, it’s for one of two reasons:

  1. it was answered five times already that week and people are tired of saying it.
  2. it was poorly asked and people are tired of asking for basic information to solve problems.

Otherwise, on this forum, even the worst-formed questions seem to get answers within a day or so.

Indeed! :heart_eyes: That is one of our strong points.

There is a third case:
it’s something so special and obscure that no one actually knows anything about it. However, that happens only rarely.

And, I recognized this topic as one of those! +1 to @Empire_Phoenix! I think that this is what happened to me on the Netbeans forums, while I was trying to find out info for DarkMonkey.

…you mean THIS topic… the one that you didn’t even really let a day pass before you responded?

[quote=“pspeed, post:14, topic:31893”]
…you mean THIS topic… the one that you didn’t even really let a day pass before you responded?
[/quote]

Ouch!

… uh… Yeah?.. I can only project information as this topic is outside of my known sphere of knowledge, and responded ASAP to get clarification/try and help… I wanted to help @simon_heinen, even if I “flailed about”… That experience with the Netbeans forums, it “scarred” me, a bit, and my responding to this topic was a “kneejerk” reflex on my part. I… I apologize? … I’ll make sure not to do it again? … :sweat_smile:

Well, I just rejected the notion that this thread was a category (3) style post. Those are so exceptionally rare here as there is already a wide breadth of experience on this forum.

Whew! Ok! :smile:

It’s all good then!

I will toodle off back to my own lil’ project now and keep crunching away at it.

Game On,
Charles

I am not convinced of cat 3. I usually think:
if Paul has not yet written a library to do the same thing for himself, you are probably doing it wrong.

LOL. Well, there are still some parts of my game very much unfinished, so… :smile:

Actually, the really obscure stuff, usually @Momoko_Fan has the answer. “Oh, that driver won’t let you do that CAP unless you swing two dead cats around your head by the full moon in February… and then only if you half-apply SP2…”
