The aim of this sample is to show how CySlice-generated normal maps can be used with the Maya Mental Ray renderer.
Normal mapping works by encoding a detailed surface normal directly into a texture map; this normal is then decoded by a shader and re-applied to the surface during rendering. It's more accurate than traditional bump mapping, which only estimates a surface normal from differing grey levels between adjacent pixels in the texture map.
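The decode step described above can be sketched in a few lines of Python. This is a generic illustration of how a shader maps 8-bit texel values back to a unit normal, not CySlice's or Maya's actual code:

```python
import math

def decode_normal(r, g, b):
    """Decode an 8-bit RGB normal-map texel into a unit-length normal.

    Each channel in [0, 255] is remapped to [-1, 1]; the result is
    renormalized to undo quantization error.
    """
    n = [c / 255.0 * 2.0 - 1.0 for c in (r, g, b)]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]
```

A flat-surface texel, the familiar pale blue (128, 128, 255), decodes to a normal pointing almost exactly along +Z.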
*(Image: Normal Mapped Maya Mental Ray Render)*
This scene is derived from the SUBD Displaced Normal Maps sample, but differs in a few significant ways:
1. It's set up to use Mental Ray (MR), not the Maya software renderer.
2. It's a polymesh, not a subdivision surface. The samplerInfo.tangentU calculation across texture-coordinate boundaries appears to be broken in MR for SUBD surfaces, but works correctly for polymeshes.
3. MR appears to require a world-space normal on lambert.normalCamera, even though that attribute is supposed to hold a camera-space normal. The yellow-highlighted nodes below are the additions required to make the Maya software renderer shader work with MR; the camera's worldMatrix is used to transform the normal-map normal from camera space into world space.
*(Image: Modified Shader Graph)*
4. There's no displacement mapping.
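The camera-to-world fix in point 3 amounts to multiplying the normal by the rotation part of the camera's worldMatrix. A minimal Python sketch of that transform, under the assumption of a row-major 4×4 matrix and row-vector convention (function name and layout are illustrative, not Maya API calls):

```python
import math

def transform_normal(world_matrix, n):
    """Transform a direction vector by the upper-left 3x3 (rotation)
    part of a row-major 4x4 matrix.

    The translation row is ignored, since directions are unaffected by
    translation; the result is renormalized.
    """
    out = [sum(n[j] * world_matrix[j][i] for j in range(3)) for i in range(3)]
    length = math.sqrt(sum(x * x for x in out))
    return [x / length for x in out]
```

For example, a matrix encoding a 90° rotation about X maps a camera-space +Z normal to world-space −Y, which is exactly the re-orientation the shader graph additions perform on the decoded normal-map normal.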
| File | Size (bytes) | md5sum |
|------|--------------|--------|
|      | 6,387,104    | f773356339f31ebc26b8f3314d73b4e3 |