My house looks solid when the camera is positioned outside it, but when the camera is inside the house, it is transparent; that is, I can see the entire background right through the house walls. Ideally I should be able to see only the walls.
Long story short: every polygon has a front face and a back face. Usually only the front face is drawn, while the back face is culled to gain performance.
If you want to change that behavior, please refer to the provided link.
Normally, the mesh of an object is made up of triangles or polys that render on only one side — the side the “normal” vector points away from. Most 3D modelers will “flip normals” for you so the opposite side renders instead.
It’s done that way because extra faces inside an object are usually just wasted resources. If you do want them, however, you have to take steps to double-face the mesh.
If you combine flipped normals with a clone of the original faces, you get an object with faces pointing in both directions (inside and out), and the walls will be visible from the inside. Most modelers have an easy way to double-face an object (so you don’t also double up on vertices).
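To make the double-facing idea concrete, here is a minimal sketch (plain Java, not a jME or modeler API — the names are illustrative): given an indexed triangle mesh, append a copy of each triangle with two indices swapped, which reverses its winding. The vertices are shared, so only the index buffer grows.

```java
import java.util.Arrays;

public class DoubleFace {
    // Returns a new index buffer containing the original triangles plus
    // reversed-winding duplicates, so both sides survive back-face culling.
    static int[] doubleFace(int[] indices) {
        int[] out = new int[indices.length * 2];
        System.arraycopy(indices, 0, out, 0, indices.length);
        for (int i = 0; i < indices.length; i += 3) {
            // Swapping two indices of a triangle flips its winding
            // (and therefore which side is treated as the front face).
            out[indices.length + i]     = indices[i];
            out[indices.length + i + 1] = indices[i + 2];
            out[indices.length + i + 2] = indices[i + 1];
        }
        return out;
    }

    public static void main(String[] args) {
        // One triangle 0-1-2 becomes two: 0-1-2 and 0-2-1.
        System.out.println(Arrays.toString(doubleFace(new int[]{0, 1, 2})));
        // prints [0, 1, 2, 0, 2, 1]
    }
}
```

Note that the vertex data is untouched — only indices are duplicated, which is exactly why double-facing doesn’t double up on vertices.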
3D graphics is a series of lessons where one learns that nothing is magic and nothing is free.
In this case, the low-level rasterizer (the thing that turns your triangles into pixels) needs to operate differently based on the winding (clockwise or counterclockwise), so it’s pretty easy for it to reject one or the other. It already had to know.
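The winding test the rasterizer uses boils down to the sign of the triangle’s signed area in screen space, so rejecting back faces is a single comparison. A rough sketch (plain Java, names are illustrative; this assumes the common OpenGL-style convention that counterclockwise triangles are front-facing):

```java
public class Winding {
    // Twice the signed area of triangle (a, b, c) in 2D screen space:
    // positive for counterclockwise, negative for clockwise.
    static float signedArea2(float ax, float ay, float bx, float by,
                             float cx, float cy) {
        return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
    }

    // With CCW front faces, a clockwise triangle is back-facing and
    // can be rejected before any pixels are produced.
    static boolean isBackFace(float ax, float ay, float bx, float by,
                              float cx, float cy) {
        return signedArea2(ax, ay, bx, by, cx, cy) < 0;
    }

    public static void main(String[] args) {
        System.out.println(isBackFace(0, 0, 1, 0, 0, 1)); // CCW: false, kept
        System.out.println(isBackFace(0, 0, 0, 1, 1, 0)); // same verts CW: true, culled
    }
}
```

The same triangle with its winding reversed flips the sign, which is why flipping or doubling faces (as above) changes what survives culling.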
After that, it’s turned into pixels exactly the same way… it’s just that in one case the normals are ‘backwards’ (facing into the wall) and in the other they face out. As I recall, it’s actually possible to detect the facing inside the shader and invert the normals there, but JME’s shaders don’t do this, and there is little point in it, really.
Almost always, the inside of something will need to look different than the outside. My house would look pretty funny if I was staring at the vinyl siding when in my office looking out.