I think the problem here is that we either already expect a shadow expert, or we will need an experienced mentor to guide the developer. The first seems unlikely, and the missing second is one of the reasons we don’t have in-pass shadows already.
Something to consider with all of these ideas: “Who will be the mentor that guides the development?” That’s a requirement. The second question is: “Is that mentor planning to maintain the code afterward?”
Any idea with good answers to those two questions is a “good idea”. Anything else is “participation for participation’s sake”… which is not a zero-benefit option, just a very-low-benefit option.
We should probably start building a list of mentor candidates with their respective domains of expertise in the engine and/or SDK, and the high-level topics they could mentor. No detail is necessary at this point, but you are welcome to add some. This way, future students will know whom to talk to if they have a project idea.
I’ve added a section to the shared Google Drive document; please add content there.
OK, being realistic: how about an integration between Zay-ES and Minie physics state, for plug-and-play networked physics between server and client?
It would not be too difficult, as many of us have done networked physics in jME, and many of us use Zay-ES or Sim-eth-es. A good plug-and-play solution for network physics sync would be a great addition for the community.
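To make the idea concrete, here is a minimal sketch of what the synced state could look like, modeled on Zay-ES’s convention of small immutable component classes. The `BodyPosition` name and its fields are purely illustrative assumptions, not part of Zay-ES, Sim-eth-es, or Minie; it is written as plain Java so it stands alone.

```java
// Hypothetical immutable physics-state component, in the style of Zay-ES
// components (small, final, value-like). The server would publish one of
// these per rigid body per network frame; the client interpolates between
// the last two frames it received.
public final class BodyPosition {
    private final float x, y, z;        // world-space position
    private final float qx, qy, qz, qw; // orientation quaternion
    private final long frame;           // simulation frame, for interpolation

    public BodyPosition(float x, float y, float z,
                        float qx, float qy, float qz, float qw, long frame) {
        this.x = x; this.y = y; this.z = z;
        this.qx = qx; this.qy = qy; this.qz = qz; this.qw = qw;
        this.frame = frame;
    }

    public long getFrame() { return frame; }
    public float getX() { return x; }
    public float getY() { return y; }
    public float getZ() { return z; }
    public float getQx() { return qx; }
    public float getQy() { return qy; }
    public float getQz() { return qz; }
    public float getQw() { return qw; }
}
```

Keeping the component immutable is what makes it “plug and play”: the networking layer can ship it as-is, and the view side never sees a half-updated body.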
A few (open-source) sample games, written in Java, might help game developers understand how to use Minie effectively and/or inspire them to learn.
A serious game-development effort might also expose shortcomings in Minie or JMonkeyEngine or their documentation, shortcomings we could then remedy. Sample games could also be used to check future releases of Minie and JMonkeyEngine for regressions.
Sample games should follow best practices, including modularity, portability, encapsulation, re-use of existing code, extensibility, view/state separation, user-friendliness, build automation, version control, and thorough testing/documentation.
I’ve built a table based on the ideas proposed in this thread. The grey rows indicate users who have agreed to mentor students. Please feel free to update the content.
I was just kidding about Vulkan integration; that is really not feasible for a GSoC.
But I was serious about a physics network sync library.
EDIT: I should clarify: as much as I would love to be a mentor, I am only available two weeks of every month because I work remotely in the Arctic Circle.
I still haven’t gotten around to doing a full write-up on libwebm/libvpx, so I’m doing it here in quick bullet points and will write it up in more detail when/if the idea makes it through:
Story: we all know that playing media/videos in video games is painful because of codecs, licensing law, etc.
The WebM container format, using the libvpx (VP8/VP9/VP10) codecs, is what drives “HTML5 video” on the web, what YouTube uses, etc. It’s open source, developed by Google, and available free of charge (I think there is some confusion with x264 vs. H.264 there).
As opposed to other solutions that were suggested to me (using VLC or JavaFX), this one would be tightly integrated and could potentially (stretch goal) also work on Android.
One upside of this feature is that it guides the student through all phases:
1. Research around libvpx, JNI (they even provide bindings already), jME/OpenGL, color spaces, etc.
2. Design a video API, maybe to make it into core. There I’ll need Paul’s help, and we have to see if we can get away with a quad [or anything generic] and just a control. We still need to design multiple things: a) lockstep [fetch a new frame every render frame]; b) fps-invariant movies [don’t slow down when frames drop; run decoding in its own thread and just push textures, locked].
3. Implement the libwebm parser and vpx decoder according to the interface from step 2.
4. Test, clean up, and integrate like jme3-bullet-native (automatic compilation and packaging of the native files in a jar).
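Design (b) above can be sketched as a bounded frame queue between the decoder thread and the render loop. This is a plain-Java illustration under my own assumptions (the `FrameQueueSketch` and `Frame` names are hypothetical, and `Frame.pixels` stands in for a decoded texture buffer), not an existing jME API:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of fps-invariant playback: decoding runs in its own thread and
// pushes frames into a small bounded queue; the render loop drains the
// queue up to "now" and keeps only the newest ready frame, so the movie
// keeps its own pace even when the renderer drops frames.
public class FrameQueueSketch {
    public record Frame(long presentationTimeMs, byte[] pixels) {}

    private final BlockingQueue<Frame> queue = new ArrayBlockingQueue<>(4);

    /** Called from the decoder thread; returns false when the queue is
     *  full, which naturally throttles decoding. */
    public boolean push(Frame f) {
        return queue.offer(f);
    }

    /** Called from the render loop: returns the newest frame whose
     *  presentation time has passed, or null if none is ready yet. */
    public Frame pollLatest(long nowMs) {
        Frame latest = null;
        Frame head;
        while ((head = queue.peek()) != null && head.presentationTimeMs() <= nowMs) {
            latest = queue.poll();
        }
        return latest;
    }
}
```

The lockstep variant (a) would instead block on the queue for exactly one frame per render tick; both strategies can share this queue shape.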
From here on, stretch goals:
Android. HW accel? Conversion YUV2 → RGB(A) as a ShaderLib/dedicated material.
An example shader with some “old TV”-style distortion, to show off how the shader lib can be used and to prevent re-writing the YUV2→RGB conversion.
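For reference, here is the per-pixel math that YUV→RGB conversion shader would perform, written once in Java so the expected output is easy to verify. I’m assuming full-range BT.601 coefficients here; the actual shader might need the limited-range variant or a different matrix depending on what libvpx emits, and the `YuvToRgb` class is just an illustration.

```java
// Reference YUV -> RGB conversion (full-range BT.601). A ShaderLib
// version would run the same arithmetic per-fragment on the GPU.
public class YuvToRgb {
    /** Converts one full-range YUV pixel (each component 0-255)
     *  to a packed 0xRRGGBB int. */
    public static int convert(int y, int u, int v) {
        double r = y + 1.402 * (v - 128);
        double g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128);
        double b = y + 1.772 * (u - 128);
        return (clamp(r) << 16) | (clamp(g) << 8) | clamp(b);
    }

    private static int clamp(double c) {
        return (int) Math.max(0, Math.min(255, Math.round(c)));
    }
}
```

For example, a neutral pixel (Y=128, U=128, V=128) converts to mid-grey 0x808080, which makes a handy sanity check for the shader output.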
Actually, I like how the Google Doc lists almost every feature we have wanted or needed in jME for a long time. Yes, a few years ago this same list already showed up, but it has been waiting for an event like GSoC to be tackled.
My two cents: regarding organization and the lasting value of GSoC after the event has passed, maybe each idea in this list can become a combination of:
a pull request to the jME core
a PR to the jME SDK
its own GitHub project, for other libs or components
So finally, when it’s done, even if the person involved abandons it, someone else can pick it up and continue. The value of the code and the idea remains…
Side story: I’ve recently moved to LA, USA, and suddenly have a lot of time to contribute to open source again. I want to start with jME and my ideas from the past, so the Cinematic Editor and the SDK will be at the top of my list.