[SOLVED] Problems With Real Time Texturing Of Meshes From Real World Camera Images

A brief explanation of what I am trying to do: I am working on an augmented-reality user interface for controlling a robot. I have a camera whose location and orientation in the world are known at all times. I am using a 3D sensor to find planar regions in the world and creating meshes in the scene at the proper locations relative to the camera. Now I want to texture those planar regions with the camera image so that it is easier to see what I am looking at in my 3D view. I first simply tried calculating the UV maps by projecting the mesh vertices onto the camera image, but I run into the issue of the textures not lining up along the seams of the triangles in the mesh.

Like in this post:
https://bitlush.com/blog/arbitrary-quadrilaterals-in-opengl-es-2-0
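To be concrete, the UV calculation I tried is just a pinhole projection of each vertex into the camera image, normalized to 0–1. A minimal sketch in Python with NumPy (the image size, intrinsics, and pose here are made-up illustration values, not my actual camera):

```python
import numpy as np

# Made-up intrinsics for illustration: 640x480 image, 500 px focal length.
W, H = 640, 480
K = np.array([[500.0,   0.0, W / 2.0],
              [  0.0, 500.0, H / 2.0],
              [  0.0,   0.0,     1.0]])

def vertex_to_uv(p_world, R, t):
    """Project a world-space vertex into the image and normalize to 0-1 UVs."""
    x, y, w = K @ (R @ p_world + t)   # world -> camera -> homogeneous pixels
    px, py = x / w, y / w             # perspective divide
    return px / W, 1.0 - py / H       # normalize and flip V for texture space

# Camera at the origin looking down +Z: a vertex on the optical axis
# should land at the image center, i.e. UV (0.5, 0.5).
u, v = vertex_to_uv(np.array([0.0, 0.0, 2.0]), np.eye(3), np.zeros(3))
```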

I think my answer is that I need to do some sort of projection mapping or write a custom shader, but I am not sure how to do either of those at the moment and could use some guidance. Or maybe I am completely off course.

I tried to draw up a quick example of what I have and what I am trying to do. I would think this would be an easy problem, but I feel like I am missing something. Any help would be greatly appreciated.

What I have

What I want

I’m sure there are many ways to achieve what you want.

If it were me, I think I would write a custom filter that replaces blue pixels with camera data and apply that filter to the 3-D view you have.


Clever, I never would have thought of that one. The only problem with that in my situation is that I would like to save the textures onto meshes that are not currently in view.


Ah. Then I think you need to write texture (UV) coordinates to the meshes at some moment when they are in view.

PS: JME meshes can have up to 9 sets of texture coordinates. Some cleverness may be needed to avoid exceeding this limit.

I was able to write what I believe were correct texture UV coordinates to the mesh. The problem is that the texture needs to be … unskewed … is that a word? … to fit on the plane, and in doing so there is a non-affine transformation on the texture causing seams in the polygon along the triangle divisions. I am pretty certain from my searching that a custom shader needs to be written, I just can't quite figure out how. :confused:
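To put the seam into numbers: across each triangle, the vertex UVs end up varying linearly over the surface, but the "true" coordinate for a surface point is the projective one you get by projecting that point through the camera, so the two disagree everywhere except at the vertices, and the error kinks along the shared diagonal of the quad. A rough sketch along one mesh edge (the depths are made up):

```python
def interpolated_u(t, u0, u1):
    # What per-triangle interpolation of plain 2-D UVs delivers:
    # vertex UVs varying linearly across the surface.
    return (1 - t) * u0 + t * u1

def projected_u(t, u0, w0, u1, w1):
    # The true texture coordinate: project the surface point through the
    # original camera. u*w and w vary linearly along the surface, so
    # interpolate those and divide.
    uw = (1 - t) * (u0 * w0) + t * (u1 * w1)
    w = (1 - t) * w0 + t * w1
    return uw / w

# Edge of a planar region: near vertex at depth 1 (u=0), far vertex
# at depth 3 (u=1). At the midpoint the two mappings disagree.
u_linear = interpolated_u(0.5, 0.0, 1.0)
u_true = projected_u(0.5, 0.0, 1.0, 1.0, 3.0)
```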


How about using the video-capture image as your texture and the video’s screen coordinates as your UV coordinates? Then I think you could use the standard Unshaded.j3md material and its shaders.

That is exactly what I did. I will try to post actual pictures from what I did when I get to work in the morning, but basically it is something like this: I am using the vertex screen coordinates and converting them to a 0–1 range for the UV map coordinates.
What I’m getting: seam
What I was expecting: projective


I believe I understand the issue now.

Subdividing the mesh should help. I’ll consider and try to come up with a better fix.

The texture is in screen space so this is always going to be weird, I think… but the requirement to save it out again is what’s making it hard. Else it’s just a filter problem as already suggested.

The issue you will always run into trying to project this as if it were a normal texture is that textures will be perspective-corrected/foreshortened/etc… and the texture is already that. And while this is pretty trivial to fix in the shader, essentially “undoing” the projection, that leaves you still unable to save anything out.
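One way to do that "undo" is the same homogeneous trick as the quadrilateral article linked above: instead of plain (u, v), store (u*q, v*q, q) per vertex, where q is the vertex's depth in the camera that took the picture, and divide in the fragment shader (GLSL's texture2DProj does exactly this divide). A numeric sketch with the same made-up edge as before, u going 0 to 1 and depth 1 to 3:

```python
def lerp(a, b, t):
    # Component-wise linear interpolation over the surface,
    # which is what the vertex attributes effectively get.
    return tuple((1 - t) * x + t * y for x, y in zip(a, b))

# Per-vertex homogeneous UVs (u*q, v*q, q), q = depth in the source camera.
near = (0.0 * 1.0, 0.0 * 1.0, 1.0)   # u=0, v=0, q=1
far  = (1.0 * 3.0, 0.0 * 3.0, 3.0)   # u=1, v=0, q=3

uq, vq, q = lerp(near, far, 0.5)  # what reaches the fragment shader
u = uq / q                        # per-fragment divide: the projectively
                                  # correct value instead of the linear one
```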

Maybe it’s time to re-examine that requirement as it could be solved a different way… perhaps only unprojecting textures when you write them out or something. Because whatever you do at runtime is going to be WAY less performant than a screen space option. (Because the texture is already in screen space… so projecting just to unproject it again is a bit of a waste.)
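"Unprojecting when you write them out" could be as simple as fitting a homography from the trapezoid the region occupies in the camera image to a rectangle and warping the pixels once at save time. A sketch of the fitting step (the corner coordinates are invented for illustration):

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography mapping each src (x, y) to dst (X, Y)."""
    A, b = [], []
    for (x, y), (X, Y) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y])
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y])
        b += [X, Y]
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply(H, pt):
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Trapezoid the planar region occupies in the camera image (made-up corners)
# and the unit square we want the baked texture to fill.
quad = [(100, 80), (520, 120), (500, 400), (120, 440)]
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
H = homography(quad, square)
```

Sampling the camera image through the inverse of H for each texel of the output rectangle bakes an unskewed texture that plain affine UVs can then handle.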

I will post a picture soon from the actual UI showing the problem, but if I am understanding you correctly, I may have used the wrong wording. I do not need to save the textures out for future use or anything. Mostly, since the robot is moving and a surface that was seen previously is no longer in view, I want that surface to still look like the real-world texture. As I make a surface, I want to take the real image, use it as the texture, and then not update the texture on that surface anymore.

OK, I think I may have a way forward that will be acceptable for what I am doing for now. First off, simply subdividing the mesh did help a bit. Here are the results of simply using an Unshaded.j3md material and setting the UV coordinates to match the vertex locations on the image. It is a little hard to understand what is going on because I am running this in sim instead of on the real robot, but the left-hand side of the UI shows the "real" video coming in from the robot, and the right-hand side shows the environment being built up from simulated 3D scans in the robot simulator. I am also overlaying that onto the video side to augment the view with what the robot can see.
https://www.youtube.com/watch?v=2Dznn702Dss

But there are a few things I don't like about this approach, mainly that the planar region has to be subdivided enough that the skewing of the image isn't noticeable.

I think I may have a better solution, though: I found an old jME3 projected-texture project on GitHub that I was able to revive. With this I will be able to color the planar regions that are out of the robot camera's view in different colors so I can still distinguish them, while anything in camera view gets the robot camera image projected onto it.
https://www.youtube.com/watch?v=yiTruRidCew&feature=youtu.be

I would still like old planar regions to keep their textures as they leave the view, but I think I will have to give up on that for now. If you guys have any other ideas, please let me know.


Yeah, to be honest, that approach will never really work until triangles are smaller than pixels… or at least equivalent in size as compared to any pixel that will be rendered for that triangle in any orientation.

I’ve been trying to think of an easy way to describe why this happens… but it requires drawing pictures and stuff. That would be a waste of time if you already know why this happens.

Yeah, I know why it happens, so no need to draw pictures for me. I think the shader solution I found with the projector is going to be what I need for now, and if I want textures to remain while out of view, I will just set up multiple projectors.
