Two-pass render

I’d like to write a custom render system that will use two different passes with different shaders to produce two textures, and then blend them with a custom filter.

To summarize, I would need to write two pixel shaders and two vertex shaders for the two passes, plus a final shader for blending the textures. May I use the existing renderer mechanics to glue these shaders together, or must I write a custom renderer subclass? And what would I need to do to add shadow rendering to this system?

Why don’t you start by telling us what you really want to do?

I mean, for what purpose do you think you need two-pass rendering?

We can surely help you find an easy solution with the existing rendering system… but we need to know what you want to do.

I’d like to write a non-photorealistic renderer as an exercise.

In the first pass I want to draw the base pixel colors and fill the z-buffer, then use the z-buffer to detect object borders (by running an edge-detection filter on it). Then I want to render non-photorealistic hatching effects based on those borders, using the texture computed in the first pass as part of the computation.

It’s a basic idea and it needs some experimenting to prove it out.
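For what it’s worth, the border-extraction step described above can be prototyped without any engine code at all: just run a gradient kernel over the depth values. Here is a minimal plain-Java sketch of that idea (the threshold value and the central-difference kernel are my own assumptions, not anything from the engine):

```java
// Minimal depth-buffer edge detection: marks a pixel as a border
// where the depth gradient exceeds a threshold. Plain Java, no engine.
public class DepthEdges {

    // Returns a mask: true where a silhouette/border was detected.
    static boolean[][] detect(float[][] depth, float threshold) {
        int h = depth.length, w = depth[0].length;
        boolean[][] edge = new boolean[h][w];
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                // Central differences approximate the depth gradient.
                float gx = depth[y][x + 1] - depth[y][x - 1];
                float gy = depth[y + 1][x] - depth[y - 1][x];
                edge[y][x] = Math.sqrt(gx * gx + gy * gy) > threshold;
            }
        }
        return edge;
    }

    public static void main(String[] args) {
        // A 4x4 depth buffer with a step between near (0.2) and far (0.9):
        // the detector should fire along the depth discontinuity.
        float[][] depth = {
            {0.2f, 0.2f, 0.9f, 0.9f},
            {0.2f, 0.2f, 0.9f, 0.9f},
            {0.2f, 0.2f, 0.9f, 0.9f},
            {0.2f, 0.2f, 0.9f, 0.9f},
        };
        boolean[][] edge = detect(depth, 0.3f);
        System.out.println(edge[1][1] + " " + edge[1][2]); // true true
    }
}
```

In a real shader this would run on the GPU over the depth texture, but the logic is the same.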

Yeah, you keep giving me the “how”, not the “what”, but anyway…

That sounds a lot like post-processing effects. Take a look at FilterPostProcessor and Filters; you’ll probably find valuable things there to help you achieve whatever you want to do.

I can’t give a more concrete description before checking whether accurate automatic generation of the effect is even possible.

This picture inspired me to write similar effects. It’s clear that most of the picture’s content is hand-edited and that no generic automation could produce such an image, but I want to try to write something in that spirit.

“Sorry, that page was not found.”

It does sound like you are looking for post-processing effects, though. You get the color and Z-buffer and can then generate your own output from those.

Sorry, my interface language spontaneously switched to Spanish and I could not share the picture on Picasa.

This is another link:

I’ve found some information about post-effects. There is no documentation for it, only code, although many of the existing filters could serve as good examples for writing your own.

I’ve found two call points for render passes:

postQueue and preFrame

And a placeholder for post-render passes:


If I only wanted to write a custom filter for my own use, this knowledge would be sufficient. But if I want to write a filter for common use, I need to know the precedence of all these passes when a filter stack is defined and my filter runs before or after another one.
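The ordering question matters because in a filter stack each filter consumes the previous filter’s output, so two filters that don’t commute give different results depending on their position. A tiny plain-Java sketch (the single-channel buffers and the filters themselves are hypothetical, purely to illustrate the point):

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical filter stack: each filter maps the previous pass's
// output buffer to a new one, so order matters for non-commuting filters.
public class FilterStack {

    static float[] run(float[] buffer, List<UnaryOperator<float[]>> filters) {
        for (UnaryOperator<float[]> f : filters) {
            buffer = f.apply(buffer); // each filter sees the prior result
        }
        return buffer;
    }

    public static void main(String[] args) {
        // Two non-commuting filters on a one-pixel "buffer":
        UnaryOperator<float[]> brighten = b -> new float[]{b[0] + 0.5f};
        UnaryOperator<float[]> clamp = b -> new float[]{Math.min(b[0], 1.0f)};

        float[] in = {0.8f};
        // brighten then clamp: min(0.8 + 0.5, 1.0) = 1.0
        System.out.println(run(in, List.of(brighten, clamp))[0]);
        // clamp then brighten: min(0.8, 1.0) + 0.5 = 1.3
        System.out.println(run(in, List.of(clamp, brighten))[0]);
    }
}
```

This is exactly why a filter intended for common use needs a documented place in the stack.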

The source code may explain all of this, and I have synced a personal copy for research, but it may take some time.

If all you want to do is render a scene and then operate on the z-buffer and color buffer, then look at FilterPostProcessor and effects like the cartoon filter, which do something similar to what you seem to be describing:
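As a rough illustration of what such a cartoon-style pass does per pixel (a sketch of the general technique, not the engine’s actual filter implementation): quantize the base color into discrete bands and paint the detected edges black.

```java
// Per-pixel toon shading sketch: band quantization plus black outlines.
public class ToonShade {

    // Quantize one color channel into 'bands' discrete levels, then
    // return black where an edge was detected (the outline).
    static float shade(float color, int bands, boolean edge) {
        float quantized = (float) Math.floor(color * bands) / bands;
        return edge ? 0f : quantized;
    }

    public static void main(String[] args) {
        System.out.println(shade(0.72f, 4, false)); // banded interior: 0.5
        System.out.println(shade(0.72f, 4, true));  // outlined edge: 0.0
    }
}
```

Combined with the depth-based edge mask from earlier in the thread, this is essentially a two-pass non-photorealistic pipeline in miniature.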