Those both sound like things that ought to be abstracted out of an editor though.
Not necessarily. The animation framework has an abstract interface (a controller) with two implementations at the moment. Both implementations are built on top of the machine learning library and exposed as components.
An animation component can then be attached to a rig, for example.
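A minimal sketch of that arrangement in Python: an abstract controller interface, two concrete controllers supplied as components, and a rig that holds one of them. All names here (`AnimationController`, `Rig`, `set_component`) are illustrative, not the framework's actual API, and the ML-backed controller is stubbed out.

```python
from abc import ABC, abstractmethod

class AnimationController(ABC):
    """Abstract interface; concrete controllers plug into a rig as components."""
    @abstractmethod
    def update(self, dt: float) -> dict:
        """Advance the animation by dt seconds and return a pose."""

class KeyframeController(AnimationController):
    """One implementation: step through a fixed list of poses."""
    def __init__(self, frames):
        self.frames = frames
        self.t = 0.0
    def update(self, dt):
        self.t += dt
        return self.frames[int(self.t) % len(self.frames)]

class LearnedController(AnimationController):
    """Second implementation: stand-in for an ML-driven controller."""
    def __init__(self, model):
        self.model = model  # hypothetical callable producing a pose
    def update(self, dt):
        return self.model(dt)

class Rig:
    """A rig that accepts an animation controller as a component."""
    def __init__(self):
        self.components = {}
    def set_component(self, name, component):
        self.components[name] = component
    def tick(self, dt):
        return self.components["animation"].update(dt)

rig = Rig()
rig.set_component("animation", KeyframeController([{"arm": 0.0}, {"arm": 1.0}]))
pose = rig.tick(0.5)
```

Because the rig only sees the abstract interface, swapping the keyframe controller for the learned one is a one-line change to `set_component`.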