RenderStateManager.SlopeScaleDepthBias Property
Retrieves or sets a value used to determine how much bias can be applied to coplanar primitives to reduce z-fighting.
Syntax

Visual Basic
Public Property SlopeScaleDepthBias As Single

C#
public float SlopeScaleDepthBias { get; set; }

C++
public:
property float SlopeScaleDepthBias {
    float get();
    void set(float value);
}

JScript
public function get SlopeScaleDepthBias() : float
public function set SlopeScaleDepthBias(float);
Property Value

System.Single. Floating-point value that specifies the slope scale bias to apply. The default value is 0. This property is read/write.
Remarks

Polygons that are coplanar in 3-D space can be made to render as if they were not coplanar by adding a z-bias to each one. An application can use this to help ensure that sets of coplanar polygons, such as decals on a wall, are rendered properly: the bias is added to the z-values that the system uses when rendering those polygons, reducing z-fighting.
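A brief usage sketch, not part of the original page: in practice the slope scale bias is set together with the related RenderState.DepthBias property (the constant term in the formula below) around the draw calls for the coplanar geometry. The snippet assumes a Managed Direct3D device named device has already been created, and the values shown are illustrative, not recommendations.

```cpp
// Sketch only (C++ syntax, Managed DirectX): "device" is assumed to be an
// existing Microsoft::DirectX::Direct3D::Device. Values are illustrative.
device->RenderState->SlopeScaleDepthBias = 1.0f;  // scaled by the depth slope m
device->RenderState->DepthBias = 0.0005f;         // constant offset added after scaling
// ... draw the coplanar primitives (for example, decals) here ...
device->RenderState->SlopeScaleDepthBias = 0.0f;  // restore the defaults
device->RenderState->DepthBias = 0.0f;
```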
The following formula shows how to calculate the bias to be applied to coplanar primitives.
bias = (m * SlopeScaleDepthBias) + DepthBias
where m is the maximum depth slope of the triangle being rendered, defined as:
m = max(abs(delta z / delta x), abs(delta z / delta y))
© Microsoft Corporation. All rights reserved.