Unreal engine material and particle system

The first part of this video is really cool - poking a hole in a surface using a material, and how particles can use the z-buffer to fall through that hole. Really awesome! :slight_smile:

[video]http://www.youtube.com/watch?v=RURQSR788Dg[/video]

2 Likes

awesome, thanks for sharing! I love seeing these behind the scenes videos.

1 Like
@thetoucher said: awesome, thanks for sharing! I love seeing these behind the scenes videos.

yeah! me too :smiley:

Mmm, very interesting. Simple technique and very effective.

I wonder if a similar technique would work with the new particle controller - I guess the tricky bit would be getting hold of the screen depth buffer in the particle controller code.

I don’t think they are using the screen depth buffer, but rather a depth buffer rendered from the view of the emitter.
That would make more sense: if you looked at the surface from a shallow angle, you would not see the holes in the demo above.
All in all, you could probably reuse the shadow-filter code to generate the depth buffer. (You would likely only need to generate the buffer once, when the emitter is added to the scene; or, if you want it to be more complex, you could regenerate it whenever there are changes in the area of effect.)
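To make the idea concrete, here is a hypothetical CPU-side sketch (not how UE4 or jME actually implement it; all names are made up): a depth map rendered from the emitter’s view stores, per texel, the distance to the nearest surface. Where the material discards fragments to poke a hole, the texel keeps the far-plane value, so a particle reaching it falls through instead of colliding:

```java
// Hypothetical sketch of depth-buffer particle collision, as speculated
// in this thread. The depth map would come from rendering the scene from
// the emitter's view (like a shadow map); here it is just a 2D array.
public class DepthBufferCollision {
    static final float FAR_PLANE = 100f;

    // depth[u][v] = distance from the emitter to the nearest surface at
    // that texel. A particle collides when it reaches the stored depth,
    // unless the texel is a "hole" (nothing was rendered there, so the
    // far-plane clear value remains and the particle falls through).
    static boolean collides(float[][] depth, int u, int v, float particleDepth) {
        return particleDepth >= depth[u][v] && depth[u][v] < FAR_PLANE;
    }

    public static void main(String[] args) {
        float[][] depth = new float[2][2];
        depth[0][0] = 5f;        // solid surface 5 units from the emitter
        depth[0][1] = FAR_PLANE; // a "hole": fragment discarded by the material
        System.out.println(collides(depth, 0, 0, 5.1f)); // true: hits the surface
        System.out.println(collides(depth, 0, 1, 5.1f)); // false: falls through
    }
}
```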

1 Like
@zzuegg said: I don't think they are using the screen depth buffer, but rather a depth buffer rendered from the view of the emitter. That would make more sense: if you looked at the surface from a shallow angle, you would not see the holes in the demo above. All in all, you could probably reuse the shadow-filter code to generate the depth buffer. (You would likely only need to generate the buffer once, when the emitter is added to the scene; or, if you want it to be more complex, you could regenerate it whenever there are changes in the area of effect.)

Very true. I guess you would have 3 “modes”: generate once, generate every frame, and generate every X frames.
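Those three modes could be sketched as a tiny update policy (hypothetical names, just to pin the idea down):

```java
// Hypothetical sketch of the three regeneration "modes" mentioned above:
// re-render the emitter's depth buffer once, every frame, or every N frames.
public class DepthUpdatePolicy {
    enum Mode { ONCE, EVERY_FRAME, EVERY_N }

    // Decides whether the depth buffer should be re-rendered on this frame.
    static boolean shouldRegenerate(Mode mode, long frame, int n) {
        switch (mode) {
            case ONCE:        return frame == 0;     // only on the first frame
            case EVERY_FRAME: return true;           // always
            default:          return frame % n == 0; // EVERY_N: every n-th frame
        }
    }

    public static void main(String[] args) {
        System.out.println(shouldRegenerate(Mode.ONCE, 0, 1));      // true
        System.out.println(shouldRegenerate(Mode.ONCE, 5, 1));      // false
        System.out.println(shouldRegenerate(Mode.EVERY_N, 30, 10)); // true
    }
}
```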

I’m not 100% sure about that, though - if the sparks start bouncing around, they would hit problems. For example, the holes in the grid mesh from the video would essentially become tubes that the sparks could bounce around inside but never leave.

The depth buffer preview at the 1 minute mark also implies it might be using the screen depth buffer - or maybe that was just to explain what a depth buffer is?

2 Likes

You are right, they say that they use the scene depth buffer, but I can’t think of any way to extract the needed information from it. It would cause different behaviour depending on the view angle, and I don’t think that is the expected/wanted behaviour.

Using a separate depth buffer also has a lot of downsides, like the tubing effect, which might not be visible in the demo shown but is an issue depending on the geometry.

Under the hood, I think there is much more going on than a simple depth-buffer collision model. But it will be easy to test once they release UE4.

I personally find the lit fog emitter more interesting, as it adds more realism to the scene than those bouncing particles. Lit translucency is a key factor when it comes to generating such a detailed world, and I can see plenty of use cases throughout different kinds of games.
It’s also something that is hard to fake, and not possible to do in a deferred lighting pass.

The best example here is BF4: the overall lighting works well and, on a good machine, runs at a nearly constant frame rate, but once you activate a flare inside smoke, the FPS starts dropping massively. It seems that even the big engines currently don’t have a good method for calculating lighting on translucent objects.

As a side note, since I started looking deeper into game programming, my mindset when playing games has changed. I now look more closely at the details, and then I am disappointed when I spot something that is just a big, bad fake.

In this context, I would also like to mention the tech demo of the ‘Snowdrop’ engine. Also worth a look.

[video]http://www.youtube.com/watch?v=R2NJSAvuiQ0[/video]

@zzuegg said: As a side note, since I started looking deeper into game programming, my mindset when playing games has changed. I now look more closely at the details, and then I am disappointed when I spot something that is just a big, bad fake.

“Achievement unlocked”. :slight_smile:

My son got a little frustrated with me when I stood looking at the rainy surfaces in Force Unleashed 2 for too long.

2 Likes

mmmm translucency … this is the closest I’ve got in jme so far:

1 Like
@zzuegg said: Under the hood, I think there is much more going on than a simple depth-buffer collision model.

This remark made me wonder how much real geometry and how much of a real physics model you can put into shader programs.
The grand solution (yeah… megalomaniacally grand) would be to encode the scene’s geometry in some buffer object and let the shaders calculate particle trajectories from that.

Hm. And now I wonder how a shader can even preserve state from one frame to the next so that it can put the particle in the next position on its trajectory.
What’s the technique for that?
Where in the JME API do I look if I want to make such a thing happen? (I haven’t found that yet.)

… or is each shader simply checking whether it’s on the path of any currently active spark, and whether the spark is currently crossing its fragment?

@pspeed said: "Achievement unlocked". :)

My son got a little frustrated with me when I stood looking at the rainy surfaces in Force Unleashed 2 for too long.

Don’t worry, it gets worse when you start to see UV mapping errors in the cinema. :slight_smile:

@toolforger said: This remark made me wonder how much real geometry and how much of a real physics model you can put into shader programs. The grand solution (yeah... megalomaniacally grand) would be to encode the scene's geometry in some buffer object and let the shaders calculate particle trajectories from that.

Hm. And now I wonder how a shader can even preserve state from one frame to the next so that it can put the particle in the next position on its trajectory.
What’s the technique for that?
Where in the JME API do I look if I want to make such a thing happen? (I haven’t found that yet.)

… or is each shader simply checking whether it’s on the path of any currently active spark, and whether the spark is currently crossing its fragment?

Take a look at OpenCL/OpenGL interoperation. That should make it possible. LWJGL also already provides the OpenCL bindings, so just play around and see if you can get something to work.
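For the state-preservation question above: besides OpenCL, the usual plain-OpenGL trick is “ping-pong” buffering - particle state lives in two textures, the shader reads last frame’s texture and writes the new state into the other, and the two swap roles each frame. A CPU-side Java sketch of the idea (arrays standing in for the textures; not a jME API):

```java
// Hypothetical sketch of "ping-pong" double buffering for GPU particle
// state. On the GPU the two arrays would be two textures attached to
// framebuffers; the inner loop would be the fragment shader.
public class PingPongParticles {
    // Advances 1D particle positions over several frames. Each frame reads
    // from one buffer, writes the updated state into the other, then swaps
    // them - so the "shader" always sees last frame's state intact.
    static float[] simulate(float[] positions, float velocity, float dt, int frames) {
        float[] read = positions.clone();
        float[] write = new float[positions.length];
        for (int frame = 0; frame < frames; frame++) {
            for (int i = 0; i < read.length; i++) {
                write[i] = read[i] + velocity * dt; // the per-particle update
            }
            float[] tmp = read; read = write; write = tmp; // swap buffers
        }
        return read;
    }

    public static void main(String[] args) {
        float[] result = simulate(new float[]{0f, 1f}, 0.5f, 1f, 4);
        System.out.println(result[0]); // 2.0 after four frames of +0.5
    }
}
```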

I had a lot of fun watching this :

[video]Live Build Episode 4 - Exploring Unreal Engine 4 - YouTube

@iamcreasy said: I had a lot of fun watching this :

Yes, the UE editor is quite impressive. Take a look at
https://www.youtube.com/user/UnrealDevelopmentKit/videos?sort=dd&view=0&shelf_id=10
especially the Materials 1-10 ones.

Unfortunately, my impression is that it is very much oriented towards finite, static-content games. I wonder if it is possible to do reasonable Minecraft / infinite procedural world / seamless MMORPG / procedural space-sim / etc. types of games with Unreal without rewriting half of the engine.

I have that feeling as well, maybe because I have never seen any game that is even partly procedural on Unreal tech. The generated guns in Borderlands are the most procedural generation in the engine that I know of. ^^

Yes, the UE editor is quite impressive. Take a look at https://www.youtube.com/user/UnrealDevelopmentKit/videos?sort=dd&view=0&shelf_id=10 especially the Materials 1-10 ones.

You are right. I might end up purchasing UE4 just to play around with the material editor. Looks like it will be a ton of fun. Just amazing. I’ve never had this much fun watching an engine demonstration video.