Advice before I go much further with this

<cite>@toolforger said:</cite> You'll want to allow interaction between sounds and other entities. E.g. an NPC might have its battle cry interrupted by that squishy sound from under a rock. For a less drastic example, imagine an NPC starting to tell a joke, then noticing an approaching officer - he might cancel the joke, or continue in a whisper. Or imagine music systems that fade out the "lazy afternoon" tune and fade in the battle theme as soon as enemies are near. The basic pattern I'm seeing is: Most decisions that a game makes can be modified at the next frame; the decision to play a sound is something that will stay in effect over many, many frames, so if the next frame decides something about what sounds should play, it needs to be able to modify the previous decisions. The data should be reachable for the code that makes these decisions, so if the sound is, say, emitted by a scenegraph object, the sound information should probably be tied to that scenegraph object.

I ported the quick throw-together over to a more final version using the advice thus far. The current version uses:

A single manager for updating all tasks
The manager iterates over entities, which own their audio tasks, and calls update on any active tasks
These tasks can be interrupted; glad you brought that up with an example of why it is necessary! (A rough sketch of the structure is below.)
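
For reference, the structure is roughly what's sketched below. This is just a placeholder sketch of the idea, not actual engine code; AudioTask, AudioTaskOwner and AudioTaskManager are made-up names:

```java
// Placeholder sketch only: AudioTask, AudioTaskOwner and AudioTaskManager are
// illustrative names, not jME classes.
import java.util.List;

interface AudioTask {
    boolean isActive();
    void update(float tpf);   // advance the task by this tick's delta time
    void interrupt();         // stop or fade the task so something else can take over
}

interface AudioTaskOwner {
    List<AudioTask> getAudioTasks();   // each entity owns its own tasks
}

class AudioTaskManager {
    private final List<? extends AudioTaskOwner> entities;

    AudioTaskManager(List<? extends AudioTaskOwner> entities) {
        this.entities = entities;
    }

    // Single update point: iterate the entities and tick any active tasks they own.
    void update(float tpf) {
        for (AudioTaskOwner entity : entities) {
            for (AudioTask task : entity.getAudioTasks()) {
                if (task.isActive()) {
                    task.update(tpf);
                }
            }
        }
    }
}
```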

The way tasks are handled (as described in the footstep scenario above) works really nicely!

Going a bit overboard here, but for the sake of trying, I’m going to allow an alternate method of firing off random looped clips by syncing them directly to the current animation. So, footsteps play when the character’s foot hits the ground… attack swings play when the left or right arm swings, and so on.

Now, I may not keep this… but it will basically pass a reference to the relevant channel to the audio task, along with keyframe indexes, and use getTime() to sync the audio (if set to sync). Since animations are keyed off of actions… or run at the player’s current speed, this should be a really simple addition that (if not too cumbersome) I may keep after all.

I’m in dire need of assistance from one of you super-geniuses.

I tried to sync audio with animations… and it works… well, it works until it doesn’t. Here is the scenario; how can I fix it?

The audio manager is running on its own thread.
The audio task is set up and handed an AnimChannel and a list of times that an audio clip should be queued.
The task gets the max time of the animation and determines the ratio by which the queue times should be adjusted for the animation’s current speed.
Once the current time of the animation is within the threshold of a queue time, the audio node is adjusted and played (roughly as sketched below).
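
To make the scenario concrete, here is roughly what that task looks like. It's only a sketch: getTime() and getSpeed() are real AnimChannel methods, but the class, its fields and the threshold/window logic are just one possible way to do it (assuming the queue times are expressed in animation-local seconds):

```java
import com.jme3.animation.AnimChannel;
import com.jme3.audio.AudioNode;

class AnimSyncedAudioTask {
    private final AnimChannel channel;   // channel driving the character's animation
    private final AudioNode clip;        // clip to fire at each sync point
    private final float[] queueTimes;    // animation-local times (seconds) to fire at
    private final boolean[] fired;       // so each sync point only fires once per pass

    AnimSyncedAudioTask(AnimChannel channel, AudioNode clip, float[] queueTimes) {
        this.channel = channel;
        this.clip = clip;
        this.queueTimes = queueTimes;
        this.fired = new boolean[queueTimes.length];
    }

    // Called every tick of the audio manager; tpf is the manager's own delta time.
    void update(float tpf) {
        float animTime = channel.getTime();          // current position within the animation
        float window = tpf * channel.getSpeed();     // roughly one tick, in animation time

        for (int i = 0; i < queueTimes.length; i++) {
            if (!fired[i] && Math.abs(animTime - queueTimes[i]) <= window) {
                // volume/pitch/position adjustments would go here
                clip.playInstance();
                fired[i] = true;
            }
        }
        // (re-arming the fired flags when the animation loops or restarts is omitted for brevity)
    }
}
```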

Here is where the problem comes in… the audio manager runs considerably faster than the main app, as it has very little to do in comparison. So the audio node is queued and played via a Callable (you know… the whole managed scene graph thing). Big problem: the gap between when the audio node should be played and when the app’s task queue actually gets around to the Callable varies based on TPF and on where in the queue the call happens to land.
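
For reference, the round-trip is basically the following. app.enqueue(Callable) is the standard jME way to get work back onto the render thread; the surrounding wrapper class is purely illustrative:

```java
import java.util.concurrent.Callable;

import com.jme3.app.SimpleApplication;
import com.jme3.audio.AudioNode;

class AudioTrigger {
    private final SimpleApplication app;
    private final AudioNode clip;

    AudioTrigger(SimpleApplication app, AudioNode clip) {
        this.app = app;
        this.clip = clip;
    }

    // Called from the audio manager's thread the moment a queue time is hit.
    void fire() {
        app.enqueue(new Callable<Void>() {
            @Override
            public Void call() {
                // Runs on the render thread, but only when it next drains its task
                // queue: at least one frame late, possibly several frames under load.
                clip.playInstance();
                return null;
            }
        });
    }
}
```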

A single frame can offset the audio enough to make the idea just not work… three frames and the offset gets really bad. Add a few more frames and, depending on the time between clips, it can potentially stop a clip a single frame after it finally gets around to starting.

The whole point of threading this was both to offload work from the main render thread and to get accurate audio timing. Having to enqueue the calls to the audio nodes defeats the purpose.

Is this idea just not doable? Or is there something I am overlooking?

This is not important, by any means… now I’m just curious as to how I would make this work.

If the work of the Audio Manager is not significant then maybe some/all of it can be moved back to the render thread.

Is your game single player or networked?

Network games often have a built-in delay (I’ll avoid relinking the Valve articles because it’s off topic) and so make the task of syncing time a little easier, since there is sometimes up to 1/5th of a second of “give”. Since I already have the code, I do this in my single-player game also… though with not as big a delay. It makes it easier to decouple things like physics and then interpolate. The visualization always operates 100 ms behind the actual simulation.
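
A minimal sketch of that idea, with made-up names: the simulation (or network layer) buffers timestamped states, and the render side samples them at now minus the delay, interpolating between the two surrounding samples.

```java
import java.util.ArrayDeque;
import java.util.Deque;

import com.jme3.math.Vector3f;

class DelayedView {
    private static final double DELAY = 0.1;   // visualize 100 ms behind the simulation

    private static final class Sample {
        final double time;
        final Vector3f pos;
        Sample(double time, Vector3f pos) { this.time = time; this.pos = pos; }
    }

    private final Deque<Sample> samples = new ArrayDeque<>();

    // Simulation/network side: record each authoritative state with its timestamp.
    void addSample(double simTime, Vector3f position) {
        samples.addLast(new Sample(simTime, new Vector3f(position)));
        // (pruning samples that are no longer needed is omitted for brevity)
    }

    // Render side: return the position at (now - DELAY), interpolated between
    // the two samples that bracket it.
    Vector3f positionAt(double now) {
        double t = now - DELAY;
        Sample prev = null;
        for (Sample s : samples) {
            if (s.time >= t) {
                if (prev == null) return s.pos;
                float a = (float) ((t - prev.time) / (s.time - prev.time));
                return new Vector3f(prev.pos.x + a * (s.pos.x - prev.pos.x),
                                    prev.pos.y + a * (s.pos.y - prev.pos.y),
                                    prev.pos.z + a * (s.pos.z - prev.pos.z));
            }
            prev = s;
        }
        return prev != null ? prev.pos : null;  // no newer sample yet: hold the last one
    }
}
```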

Which is generally imperceptible (presuming you aren’t playing an instrument) and anyway it’s unavoidable in multiplayer.

Barring that, the only way to accurately sync audio with rendering is to select and play the audio during rendering. Otherwise, as you’ve discovered, you end up with at least a one-frame delay… which in the worst case could actually be seconds late.
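
For example, the same per-frame check could live in a Control attached to the character, so it runs on the render thread and can play the clip immediately instead of enqueueing it. AbstractControl, AnimChannel and AudioNode are jME classes; the rest of the names and the windowing logic are illustrative:

```java
import com.jme3.animation.AnimChannel;
import com.jme3.audio.AudioNode;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.ViewPort;
import com.jme3.scene.control.AbstractControl;

class AnimSyncedAudioControl extends AbstractControl {
    private final AnimChannel channel;
    private final AudioNode clip;
    private final float[] queueTimes;    // animation-local times (seconds) to fire at
    private final boolean[] fired;

    AnimSyncedAudioControl(AnimChannel channel, AudioNode clip, float[] queueTimes) {
        this.channel = channel;
        this.clip = clip;
        this.queueTimes = queueTimes;
        this.fired = new boolean[queueTimes.length];
    }

    @Override
    protected void controlUpdate(float tpf) {
        float animTime = channel.getTime();
        float window = tpf * channel.getSpeed();
        for (int i = 0; i < queueTimes.length; i++) {
            if (!fired[i] && Math.abs(animTime - queueTimes[i]) <= window) {
                clip.playInstance();   // no enqueue needed: we're already on the render thread
                fired[i] = true;
            }
        }
    }

    @Override
    protected void controlRender(RenderManager rm, ViewPort vp) {
        // nothing to render
    }
}
```

Attached via spatial.addControl(...), it then updates in lockstep with the same frame that advances the animation.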


The game is a network game; however, the audio renderer is completely new, and I don’t think I know enough about the specific quirks of working with OpenAL to guess whether what I have in place for handling latency is… um… I’m not even sure how to phrase this. I honestly didn’t see this problem even potentially arising, as my initial thought about audio was: queue an audio node, and as long as there is sound… it’s all good.

I think you’re 100% accurate in your statement that the audio manager does not need its own thread. As neighboring entities move beyond the threshold of what is relevant to the user, they no longer have any impact on the local physics or render threads, which means I can limit what the audio manager iterates over as well (this is not the case at the moment, as it isn’t even close to being properly integrated).

I did try this at one point and it had zero noticeable impact on the frame rate… however, I changed it back before realizing that there were smarter people than me who could give me advice on how to proceed. -.-

So! I am going to change it back from changing it back, which will remove the enqueue issue and keep me from having to rip apart my latency handling trying to account for something I’m 100% sure I don’t even half understand yet.

Thank you for smacking me upside the head with the obvious! This will definitely at least give me the ability to see if the idea is worth pursuing.