jME participation in Google Summer of Code 2020

I was going to suggest an asset pipeline too, but figured it would be nixed since it would require adding something that needs maintenance.

I don’t know Python, but how difficult would it be to write an add-on script for Blender that exports directly to .j3o and skips the importer altogether?

That would keep it from affecting the engine in the case where it’s abandoned.

I think the problem here is that we either already expect a shadow expert or will need an experienced mentor to guide the developer. The first seems unlikely, and the missing second is one of the reasons we don’t have in-pass shadows already.

Something to consider with all of these ideas: “Who will be the mentor that guides the development?” That’s a requirement. Then the second question is: “Is that mentor planning on maintaining the code after?”

Any idea with good answers to those two questions is a “good idea”. Anything else is “participation for participation’s sake”… which is not a zero-benefit option, just a very-low-benefit option.


Probably we should start building a list of mentor candidates with their respective domains of expertise in the engine and/or SDK, and the likely high-level topics they can mentor. No details are necessary at this point, but you are welcome to add them. This way, future students will know who to talk to if they have any project idea.

I’ve added a section in the shared Google Drive document. Add content here.


OK, being realistic: how about integrating Zay-ES with a Minie physics state, for plug-and-play network physics between server and client?

It would not be too difficult, as many of us have done networked physics in jME. Also, many of us use Zay-ES or Sim-eth-es. A good plug-and-play solution for network physics sync would be a great addition for the community.

Thoughts?
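To make the sync idea above concrete, here is a minimal sketch in plain Java with no jME, Zay-ES, or SimEthereal dependencies. All class and method names here are made up for illustration; it shows the common snapshot-interpolation technique (the client buffers timestamped server states and renders slightly in the past, interpolating between the two snapshots that bracket the render time), which a plug-and-play library would automate.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative sketch only: client-side snapshot interpolation for one
// object along one axis. A real library would handle many objects,
// full transforms, and network transport.
public class SnapshotBuffer {
    /** One received server state: timestamp + x position. */
    record Snapshot(double time, double x) {}

    private final Deque<Snapshot> buffer = new ArrayDeque<>();

    /** Called whenever a server update arrives (assumed in time order). */
    public void push(double time, double x) {
        buffer.addLast(new Snapshot(time, x));
    }

    /**
     * Sample the object's position at renderTime, which the client keeps
     * slightly in the past so two snapshots usually bracket it.
     */
    public double sample(double renderTime) {
        Snapshot prev = null;
        for (Snapshot s : buffer) {
            if (s.time() >= renderTime) {
                if (prev == null) return s.x();           // too early: snap
                double t = (renderTime - prev.time()) / (s.time() - prev.time());
                return prev.x() + t * (s.x() - prev.x()); // lerp between states
            }
            prev = s;
        }
        return prev != null ? prev.x() : 0.0;             // hold last known state
    }

    public static void main(String[] args) {
        SnapshotBuffer b = new SnapshotBuffer();
        b.push(0.0, 0.0);   // server says x=0 at t=0
        b.push(0.1, 10.0);  // server says x=10 at t=0.1
        System.out.println(b.sample(0.05)); // midway between snapshots -> 5.0
    }
}
```

The interesting design choice a GSoC project would face is hiding exactly this buffering behind the ES, so game code only ever reads position components.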


I think maybe you mean SimEthereal integration. Zay-ES on its own is a poor way to sync real-time physics information.

I think the issue is that it’s not really a “one size fits all” thing beyond the parts that are already pretty simple: the listeners.

I think there is an existing Zay-ES bullet integration for single-player that’s not that far from what one would need for networking.

Not saying it’s a bad idea necessarily but it would be tough to package anything other than an example.
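For illustration, the “listener” part mentioned above looks roughly like this in plain Java, with every type stubbed inline. The real Zay-ES and Minie APIs differ (Zay-ES’s store is called EntityData, for instance); this only shows the shape of the idea: after each physics tick, copy body transforms into immutable components keyed by entity ID.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: the names below are stand-ins, not the
// real Zay-ES or Minie API.
public class PhysicsListenerSketch {
    /** Stand-in for a Zay-ES component: an immutable position snapshot. */
    record Position(float x, float y, float z) {}

    /** Stand-in for an entity-data store (Zay-ES calls this EntityData). */
    static class EntityStore {
        final Map<Long, Position> positions = new HashMap<>();
        void setComponent(long entityId, Position p) { positions.put(entityId, p); }
    }

    /** Stand-in for a rigid body that the physics engine moves around. */
    record Body(long entityId, float x, float y, float z) {}

    /** The "listener": called once per physics tick with the active bodies. */
    static void physicsTick(EntityStore store, Iterable<Body> bodies) {
        for (Body b : bodies) {
            store.setComponent(b.entityId(), new Position(b.x(), b.y(), b.z()));
        }
    }

    public static void main(String[] args) {
        EntityStore store = new EntityStore();
        physicsTick(store, java.util.List.of(new Body(42L, 1f, 2f, 3f)));
        System.out.println(store.positions.get(42L));
    }
}
```

This part is the same whether the components are consumed locally (single-player) or shipped over the network, which is why the existing single-player integration is a reasonable starting point.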

I added my ideas to the Google doc. Here’s a snapshot:

Minie is the new physics add-on for JMonkeyEngine, based on Bullet Physics: https://github.com/stephengold/Minie

Minie includes a few tutorials and a large set of simple test/demo apps, but so far no sample games. JMonkeyEngine likewise includes tutorials, tests, and demos, but the closest thing it has to a physics-based game is RollingTheMonkey, which is less than 500 lines of code: https://github.com/jMonkeyEngine/jmonkeyengine/blob/master/jme3-examples/src/main/java/jme3test/games/RollingTheMonkey.java

A few (open-source) sample games, written in Java, might help game developers understand how to use Minie effectively and/or inspire them to learn.

A serious game-development effort might also expose shortcomings in Minie or JMonkeyEngine or their documentation, shortcomings we could then remedy. Sample games could also be used to check future releases of Minie and JMonkeyEngine for regressions.

Sample games should follow best practices, including modularity, portability, encapsulation, re-use of existing code, extensibility, view/state separation, user-friendliness, build automation, version control, and thorough testing/documentation.

Specific ideas for physics-based games:

Mentors:

Stephen Gold (sgold@sonic.net, Forum/Hub: @sgold, GitHub: stephengold); also willing to maintain the resulting project(s) if necessary


+1 for a networked physics library. Great idea.


Thanks @sgold. I’ve moved your idea to the top of the idea list.

The Google document is quite long - it is easier to navigate if you turn on the outline feature.


I’ve built a table based on the ideas proposed in this thread. The grey rows indicate users who have agreed to mentor students. Please feel free to update the content.


Please note: I’m in no way capable of mentoring someone about shadows, I’m just someone who has found a need for such functionality in the past.

I’ve removed your name from the table.

I have no time to mentor anything.


I was just kidding about Vulkan integration, that is really not feasible for a GSoC.
But I was serious about physics network sync library.

EDIT: I should clarify: as much as I would love to be a mentor, I am only available two weeks of every month because I work remotely in the Arctic Circle.


There is already this… there just isn’t a Minie example.


Only the grey ones are confirmed. I’ve removed your name from the list.

I’ve changed the title to avoid confusion :stuck_out_tongue:


I still haven’t gotten around to doing a full write-up on libwebm/libvpx, so I am doing this here in quick bullet points and will write it up in more detail when/if the idea makes it through:

Story: We all know playing media/videos in video games is painful because of codecs, licensing law, etc.
The WebM container format, using the libvpx codecs (VP8, VP9, VP10), is what drives “HTML5 video” on the web, what YouTube uses, etc. It’s open source and developed by Google. It’s also available free of charge (I think there is some confusion with x264 vs. H.264 there).

As opposed to other solutions that were suggested to me (using VLC or JavaFX), this one would be tightly integrated and could potentially (stretch goal) also work on Android.

One upside of this feature is that it guides the student through all phases:

  1. Research around libvpx, JNI (they even provide bindings already), jME/OpenGL, color spaces, etc.
  2. Design a video API that could maybe make it into core. There I’ll need Paul’s help, and we have to see if we can get away with a quad [or anything generic] and just a control.
    Still, we need to design multiple things: a) lockstep [fetch a new frame every frame]; b) fps-invariant movies [don’t slow down when frames drop; run decoding in its own thread and just push textures under a lock].
  3. Implement a libwebm parser and vpx decoder according to the interface in 2.
  4. Test, clean up, and integrate like jme3-bullet-native (automatic compilation and packaging of the native files in a jar).
    From here on, stretch goals:
  5. Android. HW accel? YUV2 -> RGB(A) conversion as a shader library/dedicated material.
  6. An example shader with some “old TV”-style distortion to show off how the shader library can be used and to prevent re-writing the YUV2 -> RGB conversion.
  7. Audio and synchronization!
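As a reference for the YUV-to-RGB conversion mentioned in steps 5 and 6: in the project this would live in a fragment shader, but the per-pixel math can be sketched in plain Java. This assumes full-range BT.601 coefficients; an actual movie may use a different matrix or range, so treat the constants as an example, not a spec.

```java
// Reference implementation of per-pixel YUV -> RGB conversion,
// assuming full-range BT.601 coefficients (an assumption; real
// streams may use BT.709 or limited range).
public class YuvToRgb {
    /** Convert one full-range BT.601 YUV pixel to packed 0xRRGGBB. */
    public static int convert(int y, int u, int v) {
        double r = y + 1.402 * (v - 128);
        double g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128);
        double b = y + 1.772 * (u - 128);
        return (clamp(r) << 16) | (clamp(g) << 8) | clamp(b);
    }

    /** Clamp a channel to [0, 255] after rounding. */
    private static int clamp(double c) {
        return (int) Math.max(0, Math.min(255, Math.round(c)));
    }

    public static void main(String[] args) {
        // Neutral chroma (u = v = 128) leaves luma unchanged -> grey ramp.
        System.out.printf("%06x%n", convert(128, 128, 128)); // 808080
        System.out.printf("%06x%n", convert(255, 128, 128)); // ffffff
        System.out.printf("%06x%n", convert(0, 128, 128));   // 000000
    }
}
```

Doing this on the GPU (step 5) is attractive precisely because the formula is a per-pixel matrix multiply with a clamp, which is what fragment shaders are good at.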

I can mentor for the Asset Pipeline/maintain the result if desired, so you can confirm that.


Thank you! I’ve marked it.


Hi all, long time no see.

Actually, I like how the Google doc lists almost every feature we have wanted or needed in jME for a long time. A few years ago this same list already showed up, but it has been waiting for an event like GSoC to get resolved.

My two cents: regarding the organization and lasting value of GSoC after the event has passed, maybe each idea in this list can become a combination of:

  1. A pull request to jME core
  2. A PR to the jME SDK
  3. Its own GitHub project, for other libs or components

So finally, when it’s done, even if the person involved abandons it, someone else can pick it up and continue. The value of the code and the idea remains…

Side story: I’ve recently moved to LA, USA. Suddenly I have a lot of time to contribute to open source again, and I want to start with jME and my ideas from the past. So the Cinematic Editor and the SDK will be at the top of my list.


Hello,

Welcome back.

Just a heads up: the wiki no longer hosts user project documentation. It’s now an engine-only wiki, or will be, in the end.

The atomix docs you had are in here.

If you wish to host your own docs I can link to your main page under user contributions.

https://wiki.jmonkeyengine.org/jme3/contributions.html

Edit: Or use the jMonkey store.
https://jmonkeystore.com/
