One important feature of any renderer implementation is alpha-masking support. Vegetation, chains and wired fences would be a nightmare to model, and are also prime candidates to become triangle-hungry monsters. The idea of alpha-masking is to decide whether a given pixel should be rendered or not using the alpha channel of a texture (we only need one channel, so we can store it in the diffuse's alpha). If the value is bigger than a threshold, we draw the pixel; otherwise it is skipped.
With the introduction of fully shader-based pipelines, even this basic behaviour must be implemented in the pixel shader. In HLSL we use the clip(value) function, which discards the pixel if value is negative. Note that the computation for that pixel is still performed; the result is just not written to the render target (or backbuffer). To effectively skip the processing, we could use dynamic branching and the [branch] attribute on Xbox (I won't go into details here).
In the pixel shader, all we need to do is clip(diffuse.a - alphaReference): any value lower than alphaReference evaluates to a negative result, discarding the pixel. We can play with the alphaReference value at run-time to make objects appear/disappear (useful for spawning effects).
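As a minimal sketch of that test (sampler and variable names here are placeholders, not the sample's actual ones):

```hlsl
// sample the diffuse texture; its alpha channel stores the mask
float4 diffuse = tex2D(diffuseSampler, input.TexCoord);

// clip() discards the pixel when its argument is negative,
// i.e. when diffuse.a < alphaReference
clip(diffuse.a - alphaReference);
```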
Now the cool section: how to integrate it into my light pre-pass pipeline (if you don't know what I'm talking about, take a look at my old posts). As usual, you can get the full source code here. Use it at your own risk!!
First, we need to do the alpha masking in 3 different stages:
- when rendering to the G-Buffer;
- when reconstructing the light;
- when drawing to a shadow map.
Thinking ahead, we may need to mix alpha-masking with fresnel/reflection/skinned meshes/multi-layer materials/etc., so it's better to adopt a solution now that prevents something like "shader_fresnel_alpha_skinned_dual_layer.fx". I introduce you to…uber shaders!!
An uber shader is just a big shader that implements lots of behaviours (fresnel/reflection/etc.), and the application decides which path to follow. I will use pre-processor parameters (#define/#ifdef) to construct the shader flow, since it's a compile-time-only process. I must confess I'm not a big fan of uber shaders, since sometimes the code gets messy, tricky to follow and not very human-readable, but for now I'm OK with it.
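As a sketch of the idea (the FRESNEL macro and ComputeFresnel function are purely illustrative; only ALPHA_MASKED is actually used later in this post), the same .fx file compiles into different variants depending on which defines the application passes to the compiler:

```hlsl
float4 PixelShaderFunction(PixelShaderInput input) : COLOR0
{
    float4 diffuse = tex2D(diffuseMapSampler, input.TexCoord);

#ifdef ALPHA_MASKED
    // only compiled into the alpha-masked variants
    clip(diffuse.a - AlphaReference);
#endif

#ifdef FRESNEL
    // hypothetical fresnel path, stripped out at compile time otherwise
    diffuse.rgb *= ComputeFresnel(input);
#endif

    return diffuse;
}
```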
I’ve added the option to enable/disable alpha-masking and also the alphaReference value in 3ds Max®, so we need to find a way to get that information and store it in our processed mesh. To accomplish that, we need to make some changes to our pipeline processors (it took me a while to get it working properly, so accept this as a good gift :p ):
- on the model processor (our LightPrePassProcessor.cs), we extract the alpha information from the original material and build a list of “defines” (for now I’m handling only alpha-masking, but the idea is to gather all kinds of information like fresnel/reflection/etc.). After that, we put this list into the material’s opaque data, like “lppMaterial.OpaqueData.Add(“Defines”, listOfDefines);”;
- we have to extend a material processor: I’ve created a class named LightPrePassMaterialProcessor to handle the “defines” we pushed in the first step and send them to the effect processor;
- we also need to extend the EffectProcessor, a job for the LightPrePassFXProcessor class. It only reads the “defines” information stored in the context’s parameters and copies it to its “Defines” property.
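The chain above can be sketched roughly like this (a simplified sketch, not the sample's full code: UsesAlphaMasking is a hypothetical helper, and the method bodies are abbreviated):

```csharp
// 1) Model processor (LightPrePassProcessor.cs): extract the alpha
//    info from the source material and store the list of defines.
protected override MaterialContent ConvertMaterial(
    MaterialContent material, ContentProcessorContext context)
{
    EffectMaterialContent lppMaterial = new EffectMaterialContent();
    // ... copy textures, effect reference, alphaReference, etc ...

    List<string> listOfDefines = new List<string>();
    if (UsesAlphaMasking(material))          // hypothetical helper
        listOfDefines.Add("ALPHA_MASKED");

    lppMaterial.OpaqueData.Add("Defines", listOfDefines);

    // 2) Hand it to our custom material processor...
    return context.Convert<MaterialContent, MaterialContent>(
        lppMaterial, "LightPrePassMaterialProcessor");
}

// 2) Material processor (LightPrePassMaterialProcessor): forward the
//    defines to the effect processor through the context parameters.
// 3) Effect processor (LightPrePassFXProcessor): read that parameter
//    and copy it into its "Defines" property before compiling.
```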
With these steps working, we can focus on the shader itself. All we need to do is put the alpha check inside an “#ifdef ALPHA_MASKED … #endif” region (ALPHA_MASKED is the key I chose for that; it’s defined in LightPrePassProcessor.cs). Here is a small snippet, from the technique that renders to the G-Buffer:

```hlsl
#ifdef ALPHA_MASKED
    // read our diffuse
    half4 diffuseMap = tex2D(diffuseMapSampler, input.TexCoord);
    clip(diffuseMap.a - AlphaReference);
#endif
```
Note that as we don’t need the diffuse for the rest of this technique (remember we only output normals/depth here), we can put the texture fetch inside the alpha-mask region. We also need to support backface lighting, since almost anything that uses alpha-masking is not a closed mesh. To do that, we use the VFACE semantic (available only on SM3+) when we detect that macro, like this:

```hlsl
struct PixelShaderInput
{
    float4 Position : POSITION0;
    float3 TexCoord : TEXCOORD0;
    float  Depth    : TEXCOORD1;
    float3 Normal   : TEXCOORD2;
    float3 Tangent  : TEXCOORD3;
    float3 Binormal : TEXCOORD4;
#ifdef ALPHA_MASKED
    float  Face     : VFACE;
#endif
};
```
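With that input, the pixel shader can flip the normal for back faces. On SM3, VFACE is negative for back-facing triangles, so a small sketch (the normal variable name is illustrative) would be:

```hlsl
#ifdef ALPHA_MASKED
    // VFACE is < 0 for back faces: flip the normal so the "inside"
    // of leaves, fences, etc. gets lit correctly
    normal = normal * sign(input.Face);
#endif
```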
At this point you’ve probably got the idea of uber shaders (this is just the beginning, though). We need to extend it to the shadow-map generation, adding texture coordinates to the vertex input/output and performing the clip() inside the pixel shader. Remember also to set the culling to none in the technique declaration.
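A sketch of what the shadow-map pixel shader ends up looking like with the masking region (ShadowDepthPS and ShadowPixelShaderInput are illustrative names, not the sample's):

```hlsl
float4 ShadowDepthPS(ShadowPixelShaderInput input) : COLOR0
{
#ifdef ALPHA_MASKED
    // same test as in the G-Buffer pass: masked-out pixels
    // must not write depth into the shadow map either
    half4 diffuseMap = tex2D(diffuseMapSampler, input.TexCoord);
    clip(diffuseMap.a - AlphaReference);
#endif
    // output the depth as usual
    return float4(input.Depth, 0, 0, 1);
}
```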
The trees on this sample were generated by Tree[d], an awesome free tool to generate trees.
I would like to thank the guys who donated some money. It’s not about the money itself: to get to the point of donating anything, someone has read my blog, downloaded the code, run it, enjoyed it, returned to the blog, and clicked the button to make a donation. This means I’m doing a good job of sharing the knowledge, and it motivates me to continue this series of samples.
Thanks guys, see you!