Textures with alpha maps

I’ve tried a variety of combinations of settings, but I just can’t seem to get geometry to render with a texture that has alpha transparency in it. How do I go about doing this?
Thanks,
Michael

Hey, check out

derivative.ca/wiki/index.php … ansparency

and let me know if you have any questions.

Thanks… I didn’t even actually have to turn on “Blending”… it started working as expected as soon as I clicked the “Discard pixels based on alpha” box. I’m hoping this is correct behavior.

That’s probably not what you want. Discarding pixels is a way of cutting the geometry out based on the texture, and you’ll notice it aliases quite a bit on the edges. It’s used to make things like chain-link fences, trees, etc. without having to use complex geometry.

You’ll get smoother edges with the blending.
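
If it helps, here’s a rough sketch of the per-pixel math behind the two options (plain Python, nothing TouchDesigner-specific; the 0.5 threshold and straight/non-premultiplied alpha are just assumptions for the example):

```python
# Sketch of the per-pixel difference between "discard pixels based on alpha"
# (alpha testing) and alpha blending. Colors are (r, g, b) tuples in 0..1,
# straight (non-premultiplied) alpha assumed.

ALPHA_THRESHOLD = 0.5  # hypothetical cutoff for the alpha test


def alpha_test(src_color, src_alpha, dst_color):
    """Alpha test: the fragment is either kept fully opaque or thrown away.
    There is no in-between, which is why the edges alias."""
    if src_alpha < ALPHA_THRESHOLD:
        return dst_color       # fragment discarded, whatever is behind shows through
    return src_color           # fragment kept, written as fully opaque


def alpha_blend(src_color, src_alpha, dst_color):
    """Standard "source over" blend: partial alpha gives a smooth mix at edges."""
    return tuple(s * src_alpha + d * (1.0 - src_alpha)
                 for s, d in zip(src_color, dst_color))


if __name__ == "__main__":
    red, grey = (1.0, 0.0, 0.0), (0.5, 0.5, 0.5)
    # A half-covered edge pixel of the texture (alpha 0.4):
    print(alpha_test(red, 0.4, grey))   # hard cutout: the grey background wins
    print(alpha_blend(red, 0.4, grey))  # smooth mix, roughly (0.7, 0.3, 0.3)
```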

I see… the problem is that geometry behind the geometry with transparency becomes invisible, so I see a “shadow” of the geometry where it cuts out whatever is behind it. You mentioned that cutting out the geometry is for things like chain-link fences. If proper alpha blending gives better results, why wouldn’t you just do a chain-link fence by alpha-blending a texture on a plane?

Nevermind… I got it working. Is cutting out the geometry faster than alpha blending?

As far as the GPU goes, it’s unlikely either will end up affecting your performance.

The big difference between the two is that blending requires sorting your geometry correctly, while discarding pixels doesn’t.

It’s totally scene-dependent. In video games it may be too much work to correctly sort a tree with a lot of branches, so they just use cutting (technically this is called alpha testing).
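
Here’s a tiny demonstration of why the order matters for blending but not for discarding (plain Python again, with two made-up translucent layers): compositing the same layers in opposite orders gives two different colours, whereas a kept/discarded fragment is fully opaque, so the depth test settles the order for you.

```python
def over(src_color, src_alpha, dst_color):
    """'Source over destination' blend, straight alpha, colors as (r, g, b)."""
    return tuple(s * src_alpha + d * (1.0 - src_alpha)
                 for s, d in zip(src_color, dst_color))


background = (0.0, 0.0, 0.0)
red,  red_a  = (1.0, 0.0, 0.0), 0.5   # two hypothetical translucent surfaces
blue, blue_a = (0.0, 0.0, 1.0), 0.5

# Blending is order dependent, so the geometry has to be drawn back to front:
print(over(blue, blue_a, over(red, red_a, background)))   # blue last -> (0.25, 0.0, 0.5)
print(over(red, red_a, over(blue, blue_a, background)))   # red last  -> (0.5, 0.0, 0.25)
```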

In Touch I’ve never actually had a use for cutting out the geometry; I’ve always used blending. But I threw the feature in there since the GPU supports it and people may run into a case where they need it.

Actually, let me elaborate on that: the actual work involved in the two features is negligible for the GPU. However, drawing in farthest-to-closest order is much slower than closest-to-farthest, so if you were planning on drawing closest-to-farthest, using blending will be slower since you can’t do that anymore.

Check out the early depth test article on the wiki to understand why closest-to-farthest is faster.
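
As a rough model of what the early depth test buys you for opaque geometry (plain Python, a single pixel, made-up depths): drawn closest-to-farthest, every fragment after the first fails the depth test before it is shaded; drawn farthest-to-closest, every fragment passes and gets shaded, only to be overwritten.

```python
# Rough model of why closest-to-farthest is faster for opaque geometry:
# with an early depth test, a fragment that is already behind the stored
# depth is rejected before the (expensive) shading step runs.

def draw(fragments):
    """fragments: list of depth values all landing on the same pixel.
    Returns how many fragments actually had to be shaded."""
    depth_buffer = float("inf")   # smaller depth = closer to the camera
    shaded = 0
    for depth in fragments:
        if depth >= depth_buffer:
            continue              # early depth test: rejected, never shaded
        shaded += 1               # passed the test, so it gets shaded
        depth_buffer = depth      # and its depth is written
    return shaded


layers = [1.0, 2.0, 3.0, 4.0]     # hypothetical depths of 4 overlapping surfaces

print(draw(sorted(layers)))                 # closest to farthest: 1 shaded
print(draw(sorted(layers, reverse=True)))   # farthest to closest: 4 shaded
```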

Right, so I ran into this problem. It seems as though the polygons in my lone mesh are sometimes drawn in the wrong order (depending upon how it’s deformed?) so that when part of it moves behind the other part, blending sometimes fails to work correctly. How can you specify the draw order for the polygons in a single piece of geometry?
Thanks,
Michael

You can use the Sort SOP to do this. However, be warned that this is an expensive operation, so hopefully you can do it before the Deform SOP (so it’s only done once).

If your deformations are causing some parts of the geometry to move back and forth in front of and behind each other, you’ll have to Sort after the deform.
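
For what it’s worth, the underlying operation is just a depth sort on the deformed positions, something like this sketch (hypothetical data layout and a stand-in deform, not the Sort SOP’s actual implementation):

```python
import math

# Hypothetical layout: each triangle is a tuple of three (x, y, z) points.
# The sort has to use the *deformed* positions, because the deform can
# change which polygon ends up in front of which.


def deform(tri, t):
    """Stand-in deformation: push each point along z with a simple wave."""
    return tuple((x, y, z + math.sin(x + t)) for x, y, z in tri)


def camera_distance(tri, camera):
    cx = sum(p[0] for p in tri) / 3.0
    cy = sum(p[1] for p in tri) / 3.0
    cz = sum(p[2] for p in tri) / 3.0
    return math.dist((cx, cy, cz), camera)


def sorted_for_blending(triangles, camera, t):
    deformed = [deform(tri, t) for tri in triangles]
    # Farthest first, so "source over" blending composites back to front.
    return sorted(deformed, key=lambda tri: camera_distance(tri, camera),
                  reverse=True)


if __name__ == "__main__":
    camera = (0.0, 0.0, -5.0)
    tris = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)),
            ((0.0, 0.0, 1.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0))]
    for tri in sorted_for_blending(tris, camera, t=0.0):
        print(tri)
```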

If you are deforming using materials, then you have no choice but to sort only once and hope that one order will suit all situations. MAT deformations happen while the geometry is rendering, so by that point it’s too late to re-order the polygons.

This is a common problem in real-time rendering; you can often see this kind of issue pop up in video games.

There have been lots of techniques developed to try to work around this, but they are all quite expensive and complex to implement (Depth Peeling, for example).

Okay. So it sounds like at this point the only option is to discard pixels or to live with a large performance hit… or possibly just deal with the occasional incorrect blend. It might not actually be that noticeable in my particular application. Thanks for your help!