Textures in blender importer

Hi everyone,



I know I haven’t been very active recently, but I was focused on developing improvements for texture loading.

It is not finished yet but most of the functions are working now.



I managed to rework the texture loading function. Now all generated textures will be baked into the mesh and mapped as 2D textures :slight_smile:



Advantages of this solution:

  • MUCH less memory used when an object has more than one 3D texture applied
  • 3D textures can be used with all texture mapping methods (not only as a shadeless color)
  • the user will be able to mix 3D and 2D textures on one object, which was not possible before (still working on that, but I hope to succeed)
  • no need to define 3D texture shaders
  • the code is better separated (no code specific to texture or material loading will reside in MeshHelper)



I still have to finish it (especially mixing textures with different UV mappings).



Do we have any methods for performing affine transformations on images?

I can do that using AWT classes, but that forces me to export the texture to an AWT-readable format (i.e. PNG), do the transformation, and then import it back again.

Do we at least have any basic operations, like image scaling, implemented?
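For what it’s worth, a minimal sketch of that AWT round trip (assuming plain `java.awt` and a `BufferedImage` that already holds the texture data; the class and method names here are illustrative, none of this is jME API):

```java
import java.awt.geom.AffineTransform;
import java.awt.image.AffineTransformOp;
import java.awt.image.BufferedImage;

public class AwtScaleSketch {
    // Apply an affine transform (here a plain scale) with AffineTransformOp.
    // A jME texture would first have to be converted to a BufferedImage and back.
    static BufferedImage scale(BufferedImage src, double sx, double sy) {
        AffineTransform at = AffineTransform.getScaleInstance(sx, sy);
        AffineTransformOp op = new AffineTransformOp(at, AffineTransformOp.TYPE_BILINEAR);
        return op.filter(src, null); // null: let the op allocate the destination image
    }

    public static void main(String[] args) {
        BufferedImage src = new BufferedImage(4, 4, BufferedImage.TYPE_INT_ARGB);
        BufferedImage dst = scale(src, 2.0, 2.0);
        System.out.println(dst.getWidth() + "x" + dst.getHeight()); // prints 8x8
    }
}
```

Since `AffineTransformOp` accepts any `AffineTransform`, the same path could in principle carry the triangle-to-triangle mapping discussed later in the thread, not just scaling.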



Cheers,

Kaelthas
4 Likes

I use blender models, but I generate my own custom materials for every mesh. I don’t think it is a good idea to bake a texture into the mesh.



Otherwise we need a method for removing baked textures from the mesh. This is a vital thing.



Another way is to make a constructor for loading meshes without textures or without baking.

@Kaelthas , just a little reminder of the issue with TexCoord2. It would be cool if the blender importer supported TexCoord2.



http://hub.jmonkeyengine.org/groups/import-assets/forum/topic/blender-loader-texcoord2-support-request/

When I say ‘baking’ I mean generating a 2D texture from a 3D texture, using the mesh as the baking shape.

So I take every triangle, calculate what the texture looks like on that face, and then combine all the faces into one 2D texture.



Later I attach this texture to the material as you normally do.

Of course this happens only when you use a 3D texture.

When you use an image, it is used directly, as before.
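To illustrate the idea (a toy sketch, not the actual loader code; `sample3d` is a made-up stand-in for a generated Blender texture): for every texel inside a triangle’s UV footprint, the matching point on the face in 3D space is found via barycentric weights, and the 3D texture is evaluated at that point.

```java
public class BakingSketch {
    // Stand-in for a generated 3D texture: color depends only on position (x, y, z).
    static int sample3d(float x, float y, float z) {
        float v = 0.5f + 0.5f * (float) Math.sin(10 * (x + y + z));
        int c = (int) (v * 255);
        return 0xFF000000 | (c << 16) | (c << 8) | c; // opaque gray ARGB
    }

    // Bake one triangle into 'pixels' (a w*h ARGB image).
    // uv[i] = {u, v} in [0,1]; pos[i] = {x, y, z} of vertex i in 3D space.
    static void bakeTriangle(int[] pixels, int w, int h, float[][] uv, float[][] pos) {
        for (int py = 0; py < h; py++) {
            for (int px = 0; px < w; px++) {
                float x = (px + 0.5f) / w, y = (py + 0.5f) / h;
                // Barycentric weights of (x, y) with respect to the UV triangle.
                float det = (uv[1][1] - uv[2][1]) * (uv[0][0] - uv[2][0])
                          + (uv[2][0] - uv[1][0]) * (uv[0][1] - uv[2][1]);
                float a = ((uv[1][1] - uv[2][1]) * (x - uv[2][0])
                         + (uv[2][0] - uv[1][0]) * (y - uv[2][1])) / det;
                float b = ((uv[2][1] - uv[0][1]) * (x - uv[2][0])
                         + (uv[0][0] - uv[2][0]) * (y - uv[2][1])) / det;
                float g = 1 - a - b;
                if (a < 0 || b < 0 || g < 0) continue; // texel outside the triangle
                // Interpolate the matching point on the face in 3D space.
                float sx = a * pos[0][0] + b * pos[1][0] + g * pos[2][0];
                float sy = a * pos[0][1] + b * pos[1][1] + g * pos[2][1];
                float sz = a * pos[0][2] + b * pos[1][2] + g * pos[2][2];
                pixels[py * w + px] = sample3d(sx, sy, sz);
            }
        }
    }
}
```

Running this over every triangle of the mesh fills the 2D image that can then be attached to the material like any ordinary texture.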

1 Like

Does the light map need to use exactly TexCoord2?

In the new implementation the image is mapped to the first texcoord, while the normal map, light map and shininess (not sure if I remember all of them correctly) are mapped to other texcoords.

Now I understand your point about baking. OK, I take my words back. )



About TexCoord2: all I need is for TexCoord2 to be imported if it exists in the mesh. At present, only TexCoord1 is imported by the blender importer, so I cannot apply a “lightmap” to TexCoord2.



I can make a testcase if you need it.

As I said, in the new implementation several texcoords will be used, with one texture for each, of course.



I haven’t tested everything yet because texture merging is still to be done. And it is quite tricky when the textures have different UV mappings.

Hope I’ll solve it soon :slight_smile:

1 Like

Thanks for continuing the work on the blender support, @Kaelthas. Hopefully I’ll be able to work on some new models soon and try importing them into jME :slight_smile: . Lately I’ve been trying to get to grips with the latest blender, and trying to get my son involved in working on some sort of storyline, so we eventually get something going in the modeling dept :slight_smile:

OK guys,



as I see it, the new implementation has fixed several minor errors, so it seems we’re on the right track :slight_smile:



But I have one mathematical problem, and maybe someone has come across this one before.

The problem is easy to understand.



Let’s say that I have a 2D texture of size W0 x H0, and a triangle area T0 described by points (P0, P1, P2) inside that texture.

I also have another texture of size W1 x H1, with a different triangle T1 described by points (PP0, PP1, PP2).

Each point has coordinates (x, y), where x is in the range [0; W] and y in the range [0; H] (so they are simply points on the texture).

The task is to take the part of the image that is inside the first triangle and fit it into the second triangle on the second texture.

So we need to find the affine transform that moves the image described by triangle 1 onto the area described by triangle 2.

The pixels of each image that lie outside the triangles are not important.





Easy to understand and not that easy to implement.

I was trying to solve the problem with an affine transform, but the results were always poor :frowning:



Unfortunately it will be difficult to merge textures with different UVs without it, unless I think up some other approach.



Has anyone faced such a problem before?

If you find the problem still unclear, just ask and I’ll try to explain it better.
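For reference, one way to derive such an affine transform (a sketch using the post’s point names; this is not jME or importer code) is to solve for the 2x2 linear part from the two triangles’ edge vectors, then fix the translation so that PP0 lands exactly on P0:

```java
public class TriangleAffine {
    // Returns {m00, m01, m10, m11, tx, ty} such that
    //   (x', y') = (m00*x + m01*y + tx, m10*x + m11*y + ty)
    // maps PP0 -> P0, PP1 -> P1, PP2 -> P2. Each point is a {x, y} pair.
    static double[] mapTriangle(double[] pp0, double[] pp1, double[] pp2,
                                double[] p0, double[] p1, double[] p2) {
        // Edge vectors of the source triangle (PP0 as origin)...
        double ux = pp1[0] - pp0[0], uy = pp1[1] - pp0[1];
        double vx = pp2[0] - pp0[0], vy = pp2[1] - pp0[1];
        // ...and of the target triangle (P0 as origin).
        double ax = p1[0] - p0[0], ay = p1[1] - p0[1];
        double bx = p2[0] - p0[0], by = p2[1] - p0[1];
        // Linear part M = [a b] * inverse([u v]); det == 0 means a degenerate triangle.
        double det = ux * vy - vx * uy;
        double m00 = (ax * vy - bx * uy) / det;
        double m01 = (bx * ux - ax * vx) / det;
        double m10 = (ay * vy - by * uy) / det;
        double m11 = (by * ux - ay * vx) / det;
        // Translation so that PP0 maps onto P0.
        double tx = p0[0] - m00 * pp0[0] - m01 * pp0[1];
        double ty = p0[1] - m10 * pp0[0] - m11 * pp0[1];
        return new double[] {m00, m01, m10, m11, tx, ty};
    }
}
```

The resulting six coefficients are exactly the form `java.awt.geom.AffineTransform` takes, so the transform can also be applied to an image via AWT if the texture data is exported as mentioned earlier in the thread.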

I’ve managed to correctly transform the texture :slight_smile:



The result is still not satisfying, but that must be due to other, not yet found, errors.

So there is no need to focus on the above-mentioned topic :wink:

Why not use barycentric coordinates? http://en.wikipedia.org/wiki/Barycentric_coordinate_system_(mathematics)



pseudo-code (I apologize that I didn’t use your naming convention…):

input: tri0, img0, tri1, img1

let (x00,y00), (x01,y01), (x02,y02) be the three coords defining tri0

let (x10,y10), (x11,y11), (x12,y12) be the three coords defining tri1

for x,y inside tri1 (or the bounding box of tri1):

denom = (y11-y12)*(x10-x12) + (x12-x11)*(y10-y12)

alpha = ( (y11-y12)*(x-x12) + (x12-x11)*(y-y12) ) / denom

beta = ( (y12-y10)*(x-x12) + (x10-x12)*(y-y12) ) / denom

gamma = 1 - alpha - beta // alpha, beta and gamma are all in [0,1] iff (x,y) is in tri1

x’ = x00*alpha + x01*beta + x02*gamma

y’ = y00*alpha + y01*beta + y02*gamma

c = img0.getPixel( x’, y’ )

img1.putPixel( x, y, c )

this does not do any filtering of the texture

It’s a bit late now (something’s wrong with my twitter e-mail notifications) but I received two helpful replies on twitter about this:

https://twitter.com/#!/lawrencedo99/status/189920829100933120

https://twitter.com/#!/purpleboogers/status/189818735786332160



Even though it might be irrelevant now, these guys would be worth keeping in mind for later.

OK



I’m almost done with the textures.

I have successfully merged two flat textures with different UV coordinates.



So now what is left:

  • do the same to merge 3D and 2D textures (and 3D textures with different UVs); these two should be easy now
  • refactor the code (make it more readable)
  • improve the performance and reduce the resulting texture size
  • fix any bugs
  • test
  • commit



This might take me some time (I guess about 2-3 weeks), but now I am sure this will simply work and texture loading will be much improved.



Just be patient for a while yet :wink:



@qfxcoder

Thanks for the link :slight_smile: I found the same thing in other sources (already in java code) and it is working. But it’s always good to be familiar with the mathematics behind the code, so I intend to study the wiki article anyway :slight_smile:



@erlend_sh

Thanks for the links. They might be very useful when I come across math problems in the future :wink:
1 Like

I have made a commit today.

Just a short summary here :slight_smile:





Texture loading refactoring.



New features:

  • support for loading both 2D and 3D textures (they are merged and flattened to Texture2D)
  • support for colorband usage in flat textures
  • support for using color factors for flat images



Bugfixes:
  • the blend texture should now be calculated properly at the ends of the object
  • the blend texture is now properly directed
  • flat texture projections (flat, cube and tube) should now be properly directed



Todos:
  • support for the DXT1A texture type (if anyone could provide me with the format definition, that would be great :slight_smile: )
  • increase the number of texture types supported for blending and IO operations
  • improve final image compaction
  • direct the sphere projection along the Y axis
  • blend DDS texture with base texture (done)



I have also improved the code separation, which should improve readability, and I have prepared a starting point for mapping generated textures into a normal-map texture :slight_smile: I hope to work on that soon.



Loading time might have increased, but since this importer is mainly a development tool I guess that won’t be critical.



The whole texture part is quite a large topic. I am 100% sure there will be lots of bugs, so please be patient. I will try to fix everything as fast as possible :wink:
3 Likes

Very cool man! Thank you for your effort.



Blender 2.63 is out. It has a new mesh system (BMesh). Now blender supports n-gons. Do you think about blender 2.63 support?

http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.63

@mifth said:
Very cool man! Thank you for your effort.

Blender 2.63 is out. It has a new mesh system (BMesh). Now blender supports n-gons. Do you think about blender 2.63 support?
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.63

dude...let him breathe...

I’ve read about N-gons and I intend to support them, of course. :slight_smile:

But first I’ll finish with the textures.

Then I’ll do what’s left and fix bugs :wink:

Greetings. I’ve updated my version of jME today and got this strange bug: the ship loads correctly, and so does the texture, but the mapping is damaged.



http://i.imgur.com/qMCYC.png



I thought it could be an error in the UV mapping, but in blender everything looks fine:



http://i.imgur.com/o1I6c.png



Also, the import worked before I loaded the svn updates. Could this be a bug?

1 Like

@ceiphren , I have the same issue as you. I suppose something is wrong with the UVs. Nightly builds.

@Kaelthas , @ceiphren . I took a screenshot of the same issue:



http://i.imgur.com/cl91y.png







The model can be downloaded here:

http://dl.dropbox.com/u/26887202/123/jme_blender/blendermodel_issue.zip



2 or 3 weeks ago there was no such issue. I suppose @Kaelthas made some changes recently.

1 Like