How to get UV from collision with raytracer

Hello all,



I’ve been playing with the trace sample of JME3.



But I’m still searching for how to obtain the UV coordinates of the collision contact point. Do you know how I can get them easily?



Here’s the little sample I’ve used to obtain interesting info (contact point, material, geometry, etc.) in order to compute color and so on…

I just need to know how I can add some lines to obtain the UV at the contact point.


[java]CollisionResults results = trace(pos.clone(), dir);
if (results != null && results.size() > 0) {

    // Triangle, geometry, normal, material and contact point of the closest hit
    Triangle triangle = new Triangle();
    triangle = results.getCollision(0).getTriangle(triangle);
    Geometry geom = results.getCollision(0).getGeometry();
    Vector3f normal = results.getCollision(0).getContactNormal();
    com.jme3.material.Material material = results.getCollision(0).getGeometry().getMaterial();
    Vector3f contactPt = results.getCollision(0).getContactPoint().clone();

    // ... compute color and so on
}[/java]

Anyway, here’s a first result from the improved raytracer with a nice jeep model imported directly from Blender (thanks to the Blender importer, which works very well, good job):



https://lh6.googleusercontent.com/-_-b0uchj-_M/TpwbD9kChfI/AAAAAAAAIYM/VwBqYxRpdSw/s720/jeepJME3raytracer.jpg



Still haven’t found how to obtain UV coordinates, but I’ll work with “world coordinates” instead for the moment…

What do you need the UV for?

You can just check which triangle it is and then check the texture coords for that vertex… It’s just two buffers containing that data… I don’t know what exactly you want to “find out”; all the data is there, else the model could not be displayed :?
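In jME3 terms those are the Mesh’s Index buffer and its TexCoord buffer; a minimal sketch of grabbing them off the picked geometry (the geom variable here is assumed to come from the collision result):

[java]Mesh mesh = geom.getMesh();
VertexBuffer indexBuffer = mesh.getBuffer(Type.Index);   // vertex indices per triangle
VertexBuffer texCoords = mesh.getBuffer(Type.TexCoord);  // per-vertex texture coordinates[/java]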

I want exactly what you said :slight_smile:



I’ve understood a little bit more of what you were telling me. I should also work with the Mesh object (that was where I was first getting stuck).



Now I have this little snippet:



[java]// Obtain the targeted triangle
Triangle triangle = new Triangle();
triangle = results.getCollision(0).getTriangle(triangle);
int idx_triangle = results.getCollision(0).getTriangleIndex();

// Get the targeted Geometry
Geometry geom = results.getCollision(0).getGeometry();

// Get the mesh from the geometry
Mesh mesh = geom.getMesh();

// Get the texCoord buffer for that mesh
VertexBuffer texcoords = mesh.getBuffer(Type.TexCoord);

// Get the U/V coordinates
if (texcoords != null) {
    FloatBuffer fb = (FloatBuffer) texcoords.getData();
    fb.position(idx_triangle);
    float uvx = fb.get();
    float uvy = fb.get();
    fb.rewind();
}[/java]



Is something like this what I need?

Yes, it’s in the mesh, in the buffer, stored as floats, so best get it as a FloatBuffer. I don’t exactly remember how the data is stored though; I think it should be two floats for each vertex (or maybe three). But that is easily found out through searching the site and… :google:
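(Assuming the usual two-component layout, vertex index i would have its pair at i * 2 and i * 2 + 1; a tiny sketch, where fb is the TexCoord data as a FloatBuffer and i is a hypothetical vertex index taken from the Index buffer:)

[java]float u = fb.get(i * 2);
float v = fb.get(i * 2 + 1);[/java]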

With the mesh reference, some memory refreshing and a little searching, I’ve found this piece of code and rearranged it slightly (coming from: http://hub.jmonkeyengine.org/groups/development-discussion-jme3/forum/topic/3d-paint-in-jme3/).



[java]// Get the U/V coordinates
if (texcoords != null) {

    VertexBuffer index = mesh.getBuffer(Type.Index);
    if (index.getData() instanceof ShortBuffer) {

        // The three vertex indices of the picked triangle
        int index1 = ((ShortBuffer) index.getData()).get(idx_triangle * 3 + 0);
        int index2 = ((ShortBuffer) index.getData()).get(idx_triangle * 3 + 1);
        int index3 = ((ShortBuffer) index.getData()).get(idx_triangle * 3 + 2);

        FloatBuffer fb = (FloatBuffer) texcoords.getData();

        // Weights for the three vertices. Here they are taken from the contact
        // point's X/Y/Z; for a correct result they should be the barycentric
        // coordinates of the contact point inside the triangle.
        float w0 = results.getCollision(0).getContactPoint().getX();
        float w1 = results.getCollision(0).getContactPoint().getY();
        float w2 = results.getCollision(0).getContactPoint().getZ();

        // Per-vertex texture coordinates
        float s0 = fb.get(index1 * 2);
        float t0 = fb.get(index1 * 2 + 1);
        float s1 = fb.get(index2 * 2);
        float t1 = fb.get(index2 * 2 + 1);
        float s2 = fb.get(index3 * 2);
        float t2 = fb.get(index3 * 2 + 1);

        // Calculate the interpolated UV
        u = w0 * s0 + w1 * s1 + w2 * s2;
        v = w0 * t0 + w1 * t1 + w2 * t2;

        fb.rewind();
    }
}[/java]



And now it seems to work nearly correctly, and I can raytrace with textures. I’ll post some nice results here soon.
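For reference, the w0/w1/w2 weights that make the interpolation correct are the barycentric coordinates of the contact point inside the picked triangle. A minimal sketch of computing them with the triangle and contact point from the snippets above (assuming both are in the same space; if getTriangle() returns the vertices in model space, convert the contact point with geom.worldToLocal() first):

[java]Vector3f p = results.getCollision(0).getContactPoint();
Vector3f a = triangle.get1();
Vector3f b = triangle.get2();
Vector3f c = triangle.get3();

// Barycentric coordinates of p with respect to triangle (a, b, c)
Vector3f e0 = b.subtract(a);
Vector3f e1 = c.subtract(a);
Vector3f e2 = p.subtract(a);
float d00 = e0.dot(e0);
float d01 = e0.dot(e1);
float d11 = e1.dot(e1);
float d20 = e2.dot(e0);
float d21 = e2.dot(e1);
float denom = d00 * d11 - d01 * d01;
float w1 = (d11 * d20 - d01 * d21) / denom;
float w2 = (d00 * d21 - d01 * d20) / denom;
float w0 = 1f - w1 - w2;

// Then interpolate exactly as in the snippet above
float u = w0 * s0 + w1 * s1 + w2 * s2;
float v = w0 * t0 + w1 * t1 + w2 * t2;[/java]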

Big thanks for the help.


That’s nice, how is the performance?

How should I put it…



First, it’s certainly faster than if I had done it myself (to obtain the collision results). All the stuff in JME is very nicely done; I never thought it was possible to raytrace a scene as easily as this.



But I use a simple raytracing algorithm (resource-hungry and inefficient) and don’t really know how to tell you how the performance is.



Sometimes, depending on the scene, I get good results, and sometimes (even if it seems to have fewer polys) not…



But I can try to give you some timings next time. For these images, count on less than an hour, depending on shadows, AO or reflection depth…



One with texture (I still have some problems finding the correct UV or something, so I’ve cheated a bit):



https://lh4.googleusercontent.com/-oF2h9ceFrP4/Tp739wPanGI/AAAAAAAAIZE/-ofzCQGWSss/s720/Untitled-3.jpg



And two with a kind of ambient occlusion added:



https://lh6.googleusercontent.com/-FkRKUR_9I8Y/Tp74G8Wo99I/AAAAAAAAIZU/Ax0oCwYMvTo/s658/iiiii.jpg

https://lh3.googleusercontent.com/-dlGVAoZ0jvM/Tp74IFk8CNI/AAAAAAAAIZc/H9RrlLuAFcM/s720/JJJJ.jpg


What was this test for?

For my personal amazement :slight_smile: I also love raytracers, as well as JME, and when I found the “raytracer” sample in JME3 I was surprised and needed to try it.



Beyond that, I use JME3 for all my personal 3D experiments, but also at work when 3D is needed. For real-time presentation of something, JME3 rocks. But if you want more beautiful images to show, you need something else, like raytracing… so here I’ve decided to directly integrate a “custom” renderer into JME3 that renders better images (but slower than real time, of course ^^).



I hope I’ve answered your question :slight_smile:

Oh ok, that’s not intended for real time.

Anyway, very nice work, those shots are great!

I don’t think you can do it in real time with quality like this ^^.



The only things I can do now to optimize it are:

  • Optimize the algorithm in terms of how to raytrace efficiently.
  • Find out if it’s possible to multithread it.



But for the 2nd point, I’ve tried and got a nice “ConcurrentModificationException”… but I’ll try to “duplicate my scene” to see if I can multithread the render and gain time.



I’ll post some results from time to time, and perhaps, why not, the (or a) code :slight_smile:

You should make changes to the scene graph via Callables if you go from another thread:

http://hub.jmonkeyengine.org/groups/general-2/snippets/single/10/
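A minimal sketch of that pattern (app is assumed to be your SimpleApplication, and someSpatial plus the translation call are just placeholder scene-graph changes):

[java]app.enqueue(new Callable<Void>() {
    public Void call() throws Exception {
        // This runs on the render thread, so touching the scene graph is safe here
        someSpatial.setLocalTranslation(1f, 0f, 0f);
        return null;
    }
});[/java]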

Isn’t it possible to achieve real time with GPU acceleration, using OpenCL?

If it was, we’d see it in games all the time. There are example applications rendering maybe one model using two GPUs etc… But it’s simply not that far yet.

I have seen some real-time engines using CUDA, doing real-time ray tracing at more than 30 fps. I even played one of them on an NVIDIA 210M chipset. Don’t remember the names right now.



But I’ve always wondered: if it’s possible with CUDA, why not OpenCL? As far as I know, OpenCL is pretty mature right now. Is it just because of the lack of initiative or the lack of standard hardware?

I’ve already done a Mandelbulb and a simple sphere/triangle raytracer with GLSL, and a billiard game attempt with OptiX, but for sure the hardware still isn’t powerful enough; it will be in the future, as normen said.



But for sure, rendering everything in real time without cheating isn’t for now.

iamcreasy said:
Is it just because of the lack of initiative or the lack of standard hardware?

No, it’s because it’s an insurmountable workload computing all those rays…

Another with just AO only. It took 3/4 of an hour, I think.


