I was playing around creating a small world with some imported objects, physics, and effects.
Almost everything so far has been intuitive. However, when I went to add effects, I found that the next step in encapsulating the behavior of the objects I was spawning broke my understanding of how Controls are meant to be used.
Basically, I created a small bomb object and spawned it into the world on a keypress. This works great. I then created a Control, called ExplosionControl, which causes any object to ‘explode’ by watching for N seconds to elapse and then removing it from the scene. That works great, as well.
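For reference, the control I have looks roughly like this. This is a minimal sketch, assuming a fuse length passed in through the constructor; the field names and the three-second fuse are illustrative, not exact.

```java
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.ViewPort;
import com.jme3.scene.control.AbstractControl;

// Sketch of the timer-based ExplosionControl described above.
// It counts elapsed time and removes its spatial from the scene
// once the fuse runs out.
public class ExplosionControl extends AbstractControl {

    private final float fuseSeconds; // e.g. 3f; illustrative value
    private float elapsed;

    public ExplosionControl(float fuseSeconds) {
        this.fuseSeconds = fuseSeconds;
    }

    @Override
    protected void controlUpdate(float tpf) {
        elapsed += tpf;
        if (elapsed >= fuseSeconds) {
            // "Explode": detach the spatial from the scene graph.
            spatial.removeFromParent();
        }
    }

    @Override
    protected void controlRender(RenderManager rm, ViewPort vp) {
        // No render-time work needed for this control.
    }
}
```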
Now, let’s say I want to take the ExplosionControl class and add an explosion effect to the control. Unfortunately, ExplosionControl does not have access to the assetManager, and there doesn’t seem to be a way to reach one through the Spatial. My next thought was to investigate AppStates, but then I started to wonder: is it actually impossible to encapsulate the behavior of a Spatial once that behavior involves secondary assets (sounds, shaders, etc.)? Or am I missing something?
And then, if I wanted to refactor and create a “Bomb” class, which loaded the appropriate model and textures and attached the control to its Spatial, would it also require that I pass an AssetManager instance to it? Is this standard operating procedure, or is it intended that only an AppState manage resources?
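To make the question concrete, here is a sketch of the kind of “Bomb” factory I mean, with the AssetManager injected through the constructor. The model path, the fuse length, and the ExplosionControl constructor signature are all placeholders of my own, not anything from the engine.

```java
import com.jme3.asset.AssetManager;
import com.jme3.scene.Spatial;
import com.jme3.scene.control.AbstractControl;

// Hypothetical Bomb class along the lines described above:
// it loads its own model and wires up its own behavior, but
// only because an AssetManager was handed to it from outside.
public class Bomb {

    private final AssetManager assetManager;

    public Bomb(AssetManager assetManager) {
        this.assetManager = assetManager;
    }

    public Spatial spawn() {
        // "Models/bomb.j3o" is a placeholder path for illustration.
        Spatial model = assetManager.loadModel("Models/bomb.j3o");
        // Assumes an ExplosionControl with a fuse-length constructor,
        // as sketched earlier in this post.
        model.addControl(new ExplosionControl(3f));
        return model;
    }
}
```

Is this constructor-injection pattern what people actually do, or is there a more idiomatic route?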
I guess the unintuitive thing for me is this: if you have a Control that is intended to describe the behavior of an object, and that behavior involves any kind of asset (for example, a spark generator, or an NPC that plays a sound when you walk by), there should be some way to handle that situation, but it seems like you would end up with a lot of classes that depend directly on AssetManager. That feels a little odd to me. What is the appropriate way to do this?