I am currently playing around with the ShapeNet dataset for some research and have noticed that many of the objects do not render correctly in Blender (an issue that has been brought up on the ShapeNet forums before). For example, the object found at
ShapeNetCore/02958343/114b662c64caba81bb07f8c2248e54bc (a “NYPD Highway Patrol Dodge Charger”) looks like this in Blender:
but, in the ShapeNet Viewer (which uses jMonkeyEngine as its rendering engine), it looks fine (see my GitHub issue here for the picture). As mentioned in the forum post, the models that do not render properly in Blender have "bad topology", e.g., they contain "two-sided" faces and "flipped" faces. So my question is: HOW does jMonkeyEngine manage to render these objects correctly? I would like to replicate its process so that I can use the models in other rendering pipelines.
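For context on what repairing "flipped" faces involves, here is a minimal sketch (my own illustration, not jMonkeyEngine's actual code) of the winding-consistency pass that tools like Blender's "Recalculate Normals" perform: walk faces across shared edges and flip any face that traverses a shared edge in the same direction as its neighbor. It ignores non-manifold edges, only processes the component connected to the first face, and does not orient the mesh globally outward:

```python
from collections import defaultdict, deque

def make_windings_consistent(faces):
    """Flip face windings (BFS over shared edges) so adjacent faces agree.

    `faces` is a list of vertex-index tuples. Two faces sharing an edge
    have consistent winding when they traverse that edge in opposite
    directions. Simplified sketch: assumes a manifold, connected mesh.
    """
    faces = [list(f) for f in faces]

    # Map each undirected edge to the faces that use it.
    edge_to_faces = defaultdict(list)
    for i, f in enumerate(faces):
        for a, b in zip(f, f[1:] + f[:1]):
            edge_to_faces[frozenset((a, b))].append(i)

    def directed_edges(f):
        return set(zip(f, f[1:] + f[:1]))

    # BFS from face 0, fixing each unvisited neighbour as we reach it.
    seen = {0}
    queue = deque([0])
    while queue:
        i = queue.popleft()
        for a, b in list(directed_edges(faces[i])):
            for j in edge_to_faces[frozenset((a, b))]:
                if j in seen:
                    continue
                # Same direction on the shared edge => neighbour is flipped
                # relative to face i, so reverse its vertex order.
                if (a, b) in directed_edges(faces[j]):
                    faces[j].reverse()
                seen.add(j)
                queue.append(j)
    return [tuple(f) for f in faces]
```

For example, `make_windings_consistent([(0, 1, 2), (1, 2, 3)])` flips the second triangle to `(3, 2, 1)` because both faces originally traversed the shared edge 1→2 in the same direction. The "two-sided" face problem is separate: a renderer can sidestep it by simply disabling backface culling (in Blender 2.8+, unchecking `use_backface_culling` on the material), which may be closer to what the viewer is doing than any mesh repair.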