Possibility to fund development of improved WaterFilter w/ geometry?

Hi. Seeing as how the “Jobs” board doesn’t seem to get much use, I’ll try posting this here. It’s part development discussion anyway.

I started this thread two months ago:

http://hub.jmonkeyengine.org/forum/topic/rendering-realistic-sea-states-using-the-new-jme3-waterfilter-effects/

Here I tried to glean how feasible it would be to combine the “old” ProjectedGrid functionality and the new WaterFilter functionality into a common component, combining the visual effects of WaterFilter with the geometry of ProjectedGrid. As you can see in the thread, @Nehon offered his opinions. The motivation behind this is to take ProjectedGrid / ProjectedWaterProcessorWithRefraction and dial the visual effects up to 11.
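
For context, the stock WaterFilter is a screen-space post filter: the ocean it shades is an implicit flat plane at a fixed height, with no actual wave geometry underneath. Standard usage looks roughly like the minimal illustration below (plain jME3 API, not our code):

[java]
import com.jme3.app.SimpleApplication;
import com.jme3.math.Vector3f;
import com.jme3.post.FilterPostProcessor;
import com.jme3.water.WaterFilter;

public class WaterFilterContext extends SimpleApplication {

    public static void main(String[] args) {
        new WaterFilterContext().start();
    }

    @Override
    public void simpleInitApp() {
        // The stock WaterFilter shades an implicit, flat ocean plane at a fixed
        // height as a post effect; there is no real wave geometry underneath it.
        WaterFilter water = new WaterFilter(rootNode, new Vector3f(-1f, -1f, -1f).normalizeLocal());
        water.setWaterHeight(0f);

        FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
        fpp.addFilter(water);
        viewPort.addProcessor(fpp);
    }
}
[/java]

The missing piece is exactly what ProjectedGrid provides: a real, displaced mesh instead of a flat plane.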

Note that I have no idea how doable this would be or how long it would take beyond what @Nehon posted in that thread; those are topics that would need to be discussed.

Would it be possible for us to fund this development somehow, by “hiring” a developer directly or by donating to the JME project? We don’t have the specialized graphics programming expertise required to implement something like this within a reasonable time frame, so it probably makes more sense for us to pay an experienced JME developer than to take it on ourselves. We’re a company that uses a 3D open-seas simulation purely for visualization and “wow-factor” purposes, hence the focus on visual effects.

Of course, if this becomes reality, the code will find its way back into JME as well.

I would also like to see donation-funded development projects, like:

  • Geometry shaders
  • GPU instancing
  • More work on the SceneEditor
  • Soft-body physics and other Bullet features, etc.

Blender has a good development fund, for example, but the jME team has a different approach, as far as I understand.
I’m ready to donate some money toward development too, but jME’s way of doing things is different.

You can’t donate to the project to get specific code; nobody on the jME team gets any money, so you can’t accelerate anything that way. But you can easily hire a developer and contribute back the code he creates, we have zero issues with that, why would we? :lol:

If you don’t mind me asking, what are we donating to if the developers aren’t paid in any way for their work? This engine is great, is it developed 100% in people’s uncompensated spare time?

@Normen Sure, but for something like this, we’d pretty much need one of the core JME developers. HireAProgrammer.com is not going to work here.

I think the point he is making is that you can offer a core developer the money to carry out a job, but it has nothing to do with jmonkey at all. So you could, for example, reach an agreement with normen to carry out the task for a given price, but that is in your hands, nothing to do with jme in terms of responsibility, quality, or even guarantee of receipt. It’s all on you.

@stenb said: If you don't mind me asking, what are we donating to if the developers aren't paid in any way for their work?
To sustain the community: the website hosting, for example. The thing is, for a long time the website was hosted for free by a core member, but the QoS wasn't so good. We switched to more powerful hosting once we had some income. But since the community lived for a long time without spending money, we're kind of not used to it :p
@stenb said: This engine is great, is it developed 100% in peoples uncompensated spare time?
Yes and thank you :p
@stenb said: If you don't mind me asking, what are we donating to if the developers aren't paid in any way for their work? This engine is great, is it developed 100% in peoples uncompensated spare time?

@stenb said: @Normen Sure, but for something like this, we’d pretty much need one of the core JME developers. HireAProgrammer.com is not going to work here.

Yes, the engine is developed 100% in the developers’ spare time. At first we didn’t accept any donations at all, but since then some base costs like the website / traffic have risen and some people asked about donating, so we decided to change that. Now the donations help us not to spend personal money on the engine on top of the time we already spend, but they are still by far not enough to compensate our work over any sustained period of time.

What jayfella said is right, though I didn’t explicitly mean that. Of course you might find a core dev who has the time to take this job, though I doubt it; most already spend as much time on jME as they possibly can outside of their “normal” lives. The engine code is not extraordinarily complicated; you just need a programmer with GLSL and Java experience to extend it, after all the whole source code is available.

@jayfella said: I think the point he is making is that you can offer a core developer the money to carry out a job, but it has nothing to do with jmonkey at all. So you could, for example, reach an agreement with normen to carry out the task for a given price, but that is in your hands, nothing to do with jme in terms of responsibility, quality, or even guarantee of receipt. It's all on you.

Absolutely. I was just asking about the options. Any job agreement would of course be controlled by us as the employer. I just know that if I had been a developer on this project, I would have tried to get some compensation for my countless hours if the opportunity came along :). After all, the functionality in your engine is now good enough to build monetized products on.

Anyway, the (tentative) job offer stands for the right person with the necessary knowledge to tackle this “head first”. If you are that person, get in touch with me and we can start ironing out where we stand.

Maybe @ceiphren has some code or interest in implementing this?

@stenb, @normen: Aye. Give me one or two days to clean up the project on sourceforge.

@ceiphren said: @stenb, @normen: Aye. Give me one or two days to clean up the project on sourceforge.

Just shamelessly bumping this.

I know @husky has been delving into this topic as well. So @ceiphren is it up on SourceForge yet? :slight_smile:

Narf, sh**t. Sorry guys, I’m currently buried in work and forgot about it.

It will be up soon. Promised :slight_smile:

Okay guys, the repository is up and public read access is granted. If anybody wants to contribute, just tell me.

This is the URL for the SourceForge project: OceanMonkey download | SourceForge.net
This is the URL for the trunk of the SVN repository:

svn.code.sf.net/p/oceanmonkey/code/trunk

If you use NetBeans it should work out of the box. If you use Eclipse you have to declare the folders src, test and common as source folders. You’ll also have to reference the jtransforms-2.4 library (located in the lib directory) and, of course, jMonkeyEngine.

A runnable test class is “OceanWaterTest”.
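
If you’d rather start from your own test class, the skeleton is just the usual SimpleApplication pattern; the ocean-specific setup is deliberately left as a comment below, since it is best copied from OceanWaterTest directly:

[java]
import com.jme3.app.SimpleApplication;

public class MyOceanTest extends SimpleApplication {

    public static void main(String[] args) {
        new MyOceanTest().start();
    }

    @Override
    public void simpleInitApp() {
        flyCam.setMoveSpeed(30f);
        // Attach the ocean here the same way OceanWaterTest does; the exact
        // processor and material setup is left out of this skeleton on purpose,
        // copy it from OceanWaterTest in the trunk.
    }
}
[/java]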

As mentioned earlier, this is a rough prototype with some unresolved problems (for example, there is no solution yet for an ocean mesh extending to the horizon). The bobbing is inspired by @husky’s solution.

Have fun.

ceiphren

Looks very nice. The ship itself is not that stable - I managed to topple it onto its side, and it doesn’t seem to react to the y-scale of the water (it still follows the original amplitude, it seems) - but the ship is not really important; the water is great.

How hard do you think it would be to support a Z-up world? OceanProcessor and the fragment shader seem to be the main problems here. The fragment shader can probably be solved with a single define (only 2 or 3 lines would be different), but OceanProcessor seems trickier.
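
To make the “single define” idea concrete: in jME the define could probably be driven from a boolean material parameter. The sketch below assumes the ocean material definition gained a ZUp parameter mapped to a Z_UP define in its Defines block; neither exists yet, so treat it as purely illustrative:

[java]
import com.jme3.asset.AssetManager;
import com.jme3.material.Material;

/**
 * Purely illustrative: assumes the ocean .j3md gains a boolean "ZUp" parameter
 * mapped to a Z_UP define (Defines { Z_UP : ZUp }), so the fragment shader can
 * swizzle y/z in one place behind #ifdef Z_UP.
 */
public final class OceanAxisConfig {

    private OceanAxisConfig() {
    }

    public static Material createOceanMaterial(AssetManager assetManager,
                                               String matDefPath, boolean zUp) {
        Material mat = new Material(assetManager, matDefPath);
        mat.setBoolean("ZUp", zUp); // would drive the Z_UP define in the shader
        return mat;
    }
}
[/java]

The OceanProcessor side (reflection plane, projection setup and so on) would still need real work, which is probably the harder part.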

@abies:
Yes, scaling the geometry does not mean that the original mesh is scaled, which results in this strange bobbing bug.

Using z as the up vector would take a while, as the whole process has to be configurable for both axes. I’d rather recommend using y as the up vector, since the jMonkey SDK uses that as the default.

Currently I see 2 major unresolved problems:

  • No concept for a large ocean plane yet. The fast Fourier transform is very expensive and cannot be used for very large meshes. A solution could be to blend into a simple ocean plane with simplified waves in the distance (a rough sketch of that idea follows this list). In the example I copied the original plane several times to cover a larger area, but that also created an ugly repeating effect.
  • Reflection bug: if you look at the ship at a very shallow angle you’ll see that the water reflects the ship even where that should not be possible. I currently don’t know how to solve this.
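
To make the first point a bit more concrete, here is a rough sketch of the far-plane half of that idea. The blend between the FFT patch and the plane is not shown (it would need shader support), and none of the names below exist in the repository; they are purely illustrative:

[java]
import com.jme3.material.Material;
import com.jme3.math.FastMath;
import com.jme3.scene.Geometry;
import com.jme3.scene.Node;
import com.jme3.scene.shape.Quad;

/** Illustrative only: surround the expensive FFT patch with one huge, cheap quad. */
public final class FarOceanPlane {

    private FarOceanPlane() {
    }

    public static Geometry attach(Node rootNode, Material cheapWaterMaterial, float size) {
        Geometry far = new Geometry("FarOcean", new Quad(size, size));
        far.rotate(-FastMath.HALF_PI, 0f, 0f);               // lay the quad flat, facing up (y-up world)
        far.setLocalTranslation(-size / 2f, 0f, size / 2f);  // center it on the origin
        far.setMaterial(cheapWaterMaterial);                 // simple, non-FFT water shader
        rootNode.attachChild(far);
        return far;
    }
}
[/java]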

Btw, I really wish the 3D world had agreed on which axis to use as up, but unfortunately every possible convention is widely used. It’s a completely senseless problem. In the end it doesn’t matter which axis is up, or whether going right means going into negative or positive values; the only thing that matters is that everybody sticks to one decision.

@ceiphren Looks good. What I like is that the shader code is actually readable. Meanwhile, here’s a snip from the ProjectedWaterProcessor frag shader:

[java]
vec2 projCoord = invViewCoords.xy / (invViewCoords.q);
projCoord = (projCoord + 1.0) * 0.5;
vec2 projCoordDepth = viewCoords.xy / (viewCoords.q);
projCoordDepth = (projCoordDepth + 1.0) * 0.5;
if ( m_abovewater == true ) {
    projCoord.y = 1.0 - projCoord.y;
}

projCoord += (vnormal + dudvColor.xy * 0.5 + normalVector.xy * 0.2);
projCoord = clamp(projCoord, 0.001, 0.999);

projCoordDepth += (vnormal + dudvColor.xy * 0.5 + normalVector.xy * 0.2);
projCoordDepth = clamp(projCoordDepth, 0.001, 0.999);

vec4 reflectionColor = texture2D(m_reflection, projCoord);
if ( m_abovewater == false ) {
    reflectionColor *= vec4(0.8, 0.9, 1.0, 1.0);
    vec4 endColor = mix(reflectionColor, m_waterColor, fresnelTerm);
    gl_FragColor = mix(endColor, m_waterColor, fogDist);
}
[/java]

And it goes on and on. Yeah, ain’t going to try to decode that.

Maybe this shader could somehow be combined with the WaterFilter shader?

@Roger: hm, could work. I’ll try it out.

Btw, if you want write access on oceanmonkey you can have it. I just need your SourceForge name.

Greetings,

ceiphren

@ceiphren said: Yes, scaling the geometry does not mean that the original mesh is scaled, which results in this strange bobbing bug.

Won’t this be resolved when transform feedback is added? I heard that was coming in the near future, if I remember correctly (yes, “near future” is relative… but it’s still going to happen eventually).

EDIT: Um… I made an assumption that you were referring to scaling within a shader. Ignore if I misunderstood.