Event-Driven Imager Simulator (instead of frame-based)

Hi everyone,

I’m considering using jMonkeyEngine to build a simple Dynamic Vision Sensor (DVS) simulator (a.k.a. an Address Event Representation (AER) camera).

The main difference between standard cameras and a DVS is that standard cameras are frame based, i.e. they produce a full frame every period, while a DVS produces a stream of timestamped events for the pixels that change. If you want to know more, look at

I am looking at the jMonkeyEngine code and trying to figure out the best approach to implement this.

I am trying to test it with the TestJaime demo.

As far as I can see, there are two main approaches:

  1. Reduce the speed of the action by a large factor (e.g. 1000) and then capture the difference between consecutive frames to produce the events. In the mentioned demo this is controlled by the cinematic FPS.

  2. Modify the main rendering loop so that frames are no longer tied to the real-time clock. By doing so, I could increase the temporal resolution and compute consecutive frame differences to produce the stream of pixel-change events (see the timer sketch below).

In both cases the goal is to increase the temporal resolution of the events. To clarify: in a standard frame-based approach, a fast-moving object would “jump” from one position in the frame to another distant position (more than 1 pixel away).

In a DVS system the same movement would produce a stream of events at every pixel along the path, at a much higher temporal resolution.
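To make option 2 a bit more concrete, this is a rough sketch of the kind of custom timer I have in mind: it extends jME’s com.jme3.system.Timer but advances simulated time by a fixed step on every render loop iteration, instead of following the wall clock. The class name, the field names and the resolution are just placeholders I made up.

```java
import com.jme3.system.Timer;

/**
 * Sketch of a fixed-step timer: each render loop iteration advances the
 * simulated clock by a constant amount, decoupling it from real time.
 */
public class FixedStepTimer extends Timer {

    private static final long RESOLUTION = 1_000_000L; // ticks per second (1 µs)
    private final float stepSeconds;                   // simulated time per rendered frame
    private long ticks = 0;

    public FixedStepTimer(float stepSeconds) {
        this.stepSeconds = stepSeconds;
    }

    @Override
    public long getTime() {
        return ticks;
    }

    @Override
    public long getResolution() {
        return RESOLUTION;
    }

    @Override
    public float getFrameRate() {
        // With a fixed step, the "frame rate" in simulated time is constant.
        return 1f / stepSeconds;
    }

    @Override
    public float getTimePerFrame() {
        return stepSeconds;
    }

    @Override
    public void update() {
        // Called once per render loop: advance simulated time by the fixed step.
        ticks += (long) (stepSeconds * RESOLUTION);
    }

    @Override
    public void reset() {
        ticks = 0;
    }
}
```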

Does anyone have an idea (or pointer) about which would be the best way to implement this?

Best Regards


I feel like with a post-processing shader that has access to the last frame in addition to the current frame, you could do this in real time… but I don’t know what your output needs to look like. If it can also be represented as a frame buffer, then yeah, you could definitely do this in real time.
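Something along these lines, for example (a CPU-side sketch using a SceneProcessor rather than an actual shader, just to show the frame-difference idea; the class name, the threshold and the emitEvent stub are made up, and it assumes jME 3.1+):

```java
import com.jme3.post.SceneProcessor;
import com.jme3.profile.AppProfiler;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.Renderer;
import com.jme3.renderer.ViewPort;
import com.jme3.renderer.queue.RenderQueue;
import com.jme3.texture.FrameBuffer;
import com.jme3.util.BufferUtils;

import java.nio.ByteBuffer;

/**
 * Sketch: grab the viewport's framebuffer each frame, compare it with the
 * previous grab, and turn pixels whose intensity changed beyond a threshold
 * into events. Event emission itself is left as a stub.
 */
public class FrameDiffProcessor implements SceneProcessor {

    private RenderManager renderManager;
    private ByteBuffer current, previous;
    private int width, height;
    private boolean initialized;

    @Override
    public void initialize(RenderManager rm, ViewPort vp) {
        this.renderManager = rm;
        reshape(vp, vp.getCamera().getWidth(), vp.getCamera().getHeight());
        initialized = true;
    }

    @Override
    public void reshape(ViewPort vp, int w, int h) {
        width = w;
        height = h;
        current = BufferUtils.createByteBuffer(w * h * 4);  // 4 bytes per pixel
        previous = BufferUtils.createByteBuffer(w * h * 4);
    }

    @Override
    public boolean isInitialized() {
        return initialized;
    }

    @Override
    public void preFrame(float tpf) { }

    @Override
    public void postQueue(RenderQueue rq) { }

    @Override
    public void postFrame(FrameBuffer out) {
        Renderer renderer = renderManager.getRenderer();
        current.clear();
        renderer.readFrameBuffer(out, current);    // pull the rendered pixels back to the CPU

        // Compare with the previous frame and emit an event per changed pixel.
        for (int i = 0; i < width * height; i++) {
            int cur = current.get(i * 4) & 0xFF;   // first channel as a cheap intensity proxy
            int prev = previous.get(i * 4) & 0xFF;
            if (Math.abs(cur - prev) > 15) {       // illustrative contrast threshold
                // emitEvent(i % width, i / width, cur > prev);  // hypothetical event sink
            }
        }

        // Swap buffers so this frame becomes the reference for the next one.
        ByteBuffer tmp = previous;
        previous = current;
        current = tmp;
    }

    @Override
    public void setProfiler(AppProfiler profiler) { }

    @Override
    public void cleanup() {
        initialized = false;
    }
}
```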

Thanks! I’ve done it with a combination of a viewport post-processor (to capture the result of the rendering chain) and a custom timer (that advances time at whatever rate I choose).

It works :wink:
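In case it helps anyone, the wiring is roughly the following (inside a SimpleApplication subclass, using the two sketch classes from earlier in the thread; the 0.1 ms step of simulated time per rendered frame is just an example value):

```java
@Override
public void simpleInitApp() {
    // Replace the real-time clock with the fixed-step timer sketched above,
    // so each rendered frame advances simulated time by 0.1 ms.
    setTimer(new FixedStepTimer(0.0001f));

    // Attach the frame-difference processor to the main viewport so it can
    // read back each rendered frame and compare it with the previous one.
    viewPort.addProcessor(new FrameDiffProcessor());

    // ... load the scene / cinematic as in TestJaime ...
}
```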