5. Creating a shader

theproadam edited this page May 3, 2020 · 8 revisions

renderXF requires a shader to process and render an object, so a shader must be created before rendering.
A shader in renderXF is composed of two parts: a vertex part and a fragment part.

To begin, just declare your shader:

Shader myShader;

Creating the fragment shader

Whenever the fragment shader is executed, the function usually receives three things:

  • A pointer to the pixel that it will set: BGR
  • A pointer to the attribute data that it will read: Attributes
  • An integer that represents the index of the face currently being rendered from the buffer: FaceIndex

The attribute data can contain values interpolated from the vertex data, such as UV coordinates or face normals, and it can also contain camera-space or screen-space coordinates.

To begin creating a fragment shader, just copy and paste the following method:

unsafe void myShaderFS(byte* BGR, float* Attributes, int FaceIndex) 
{

}

To set the pixel to a color, just assign the blue/green/red channel values:

unsafe void myShaderFS(byte* BGR, float* Attributes, int FaceIndex) 
{
    BGR[0] = 255; //Blue
    BGR[1] = 255; //Green
    BGR[2] = 255; //Red
}
WARNING: Even though renderXF is based on 32bpp pixel buffers, writing past index [2] of the BGR pointer may result in stack/heap corruption.

To access attribute data, just use the attribute pointer. The following example extracts face normal data and displays it as color.

unsafe void myShaderFS(byte* BGR, float* Attributes, int FaceIndex)
{
    BGR[0] = (byte)(127.5f + 127.5f * Attributes[0]); //Blue  -> Normal I
    BGR[1] = (byte)(127.5f + 127.5f * Attributes[1]); //Green -> Normal J
    BGR[2] = (byte)(127.5f + 127.5f * Attributes[2]); //Red   -> Normal K
} 

As each normal component ranges from -1 to 1, it is remapped to the 0-255 byte range, and each color channel is set in correspondence with its assigned normal component.
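The remapping above can be written as a small standalone helper. This is just a sketch: NormalToByte is a hypothetical name, not part of renderXF.

```csharp
using System;

static class NormalColor
{
    // Hypothetical helper: maps a normal component in [-1, 1]
    // to a byte in [0, 255]. -1 -> 0, 0 -> 127, 1 -> 255.
    public static byte NormalToByte(float n)
    {
        return (byte)(127.5f + 127.5f * n);
    }
}
```

For example, NormalColor.NormalToByte(1f) yields 255 and NormalColor.NormalToByte(-1f) yields 0, matching the inline expression used in the fragment shader above.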

If you are unsure about the order of the data in the Attributes pointer, the following call will print out what each index of the attribute pointer will give you:

string s = myShader.GetFragmentAttributePreview(myBuffer);
Debug.WriteLine(s);

Example output:

Attributes[0]: Vertex Data
Attributes[1]: Vertex Data
Attributes[2]: Camera Space X Position
Attributes[3]: Camera Space Y Position
Attributes[4]: Camera Space Z Position
Please treat the Attribute pointer as read-only to prevent unwanted behavior.
WARNING: If GLRenderMode.TriangleFlat is selected, the fragment shader will only be sampled once and the Attributes pointer will be null.

Creating the vertex shader

By default, renderXF provides its own vertex shader. However, it is still possible to perform vertex transformations before each vertex is sent down the graphics pipeline. Due to performance limitations, the vertex shader is somewhat locked down.

The renderXF vertex shader has a built-in camera transformation system. Using the ForceCameraRotation() and ForceCameraPosition() functions, the object can be rendered in conjunction with the camera data. This feature can be overridden with the SetOverrideCameraTransform() function.

Whenever the vertex shader is executed, the function receives three things:

  • A pointer to the output data: OUT
  • A pointer to the input data: IN
  • An integer that represents the index of the face that the vertex belongs to: FaceIndex

The base vertex shader simply passes each input coordinate through:
unsafe void myShaderVS(float* OUT, float* IN, int FaceIndex)
{
    OUT[0] = IN[0];
    OUT[1] = IN[1];
    OUT[2] = IN[2];
}

If your vertex buffer contains any attribute data (i.e. UV coordinates, normals), it will be automatically copied into the work stack during shader execution. This can be overridden with the shader's SetOverrideAttributeCopy() function.

In this example the object being rendered is first scaled by 50, and then moved to the location of a light:

unsafe void myShaderVS(float* OUT, float* IN, int FaceIndex)
{
    OUT[0] = IN[0] * 50f + LightPosition.x;
    OUT[1] = IN[1] * 50f + LightPosition.y;
    OUT[2] = IN[2] * 50f + LightPosition.z;
}
WARNING: If you add a vertex shader, it must contain the base code, otherwise nothing will be rendered.
WARNING: Please do not index past (number of attributes - 1). Doing so may corrupt memory.
Please treat the IN pointer as read-only to prevent corrupting the vertex buffer.

Merging the two shaders

With both shaders completed, the main shader can now be created:

myShader = new Shader(myShaderVS, myShaderFS, GLRenderMode.Triangle);
| GLRenderMode | Vertex Shader? | Fragment Shader? | Comments |
| --- | --- | --- | --- |
| Line | Yes | Yes | Used for rendering lines |
| Triangle | Yes | Yes | None |
| TriangleFlat | Yes | Partial | None |
| TriangleGouraud | Yes | No | Hardcoded Gouraud |
| Wireframe | Yes | Yes | Wireframe |
| WireframeDebug | No | No | No depth test; use SetDebugWireFrameColor() to change the color. |

Now at this point it is also possible to attach extra scene data via the GLExtraAttributeData enum.

myShader = new Shader(myShaderVS, myShaderFS, GLRenderMode.Triangle, GLExtraAttributeData.XYZ_CameraSpace);
| GLExtraAttributeData | What does it add? |
| --- | --- |
| None | Returns nothing (default option) |
| XYZ_CameraSpace | Returns an XYZ position in camera space |
| XYZ_XY_Both | Returns both an XYZ position and a screen-space position |
| XY_ScreenSpace | Returns the screen-space X, Y values |
| Z_Depth | Returns the depth value (not the depth buffer value) |

Note: Z_Depth returns the true Z value, unlike the depth buffer, where z = farZ - trueZ.
For lighting calculations it is strongly recommended to rotate the light position into camera space once, rather than rotating each fragment's camera-space position back to world space.
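As a sketch of that approach, the fragment shader below shades by distance to a light that has already been rotated into camera space. Everything here beyond the shader signature is an assumption: the attribute layout (camera-space XYZ at Attributes[2..4]), the lightX/lightY/lightZ fields, and the LightMath helper are all hypothetical; confirm the real layout with GetFragmentAttributePreview().

```csharp
static class LightMath
{
    // Hypothetical helper: inverse-square distance falloff in (0, 1],
    // brightest (1.0) when the fragment sits exactly at the light.
    public static float Falloff(float dx, float dy, float dz)
    {
        return 1f / (1f + 0.01f * (dx * dx + dy * dy + dz * dz));
    }
}

// Hypothetical fields: the light position, rotated into camera space
// once per frame rather than per fragment.
float lightX, lightY, lightZ;

// Sketch only: assumes camera-space XYZ lands at Attributes[2..4]
// after two floats of vertex data.
unsafe void myLitFS(byte* BGR, float* Attributes, int FaceIndex)
{
    byte v = (byte)(255f * LightMath.Falloff(
        Attributes[2] - lightX,
        Attributes[3] - lightY,
        Attributes[4] - lightZ));

    BGR[0] = v; // Blue
    BGR[1] = v; // Green
    BGR[2] = v; // Red
}
```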

Here are a few examples of shader creation:

// No vertex shader attached:
myShader = new Shader(null, myShaderFS, GLRenderMode.TriangleFlat);

// Gouraud mode:
myShader = new Shader(myShaderVS, null, GLRenderMode.TriangleGouraud);

// Phong mode (prebuilt vertex shader):
myShader = new Shader(null, myShaderFS, GLRenderMode.TriangleFlat, GLExtraAttributeData.XYZ_CameraSpace);

Scratch space (Experimental)

Creating variables takes up valuable CPU time. For this reason renderXF allows a "scratch space" to be allocated on the stack for extra variables.

Please note that scratch space is only available for the fragment shader, and the values are not zeroed out.

To set the scratch space size, just use the SetScratchSpaceSize() method with an integer specifying its size.

myShader.SetScratchSpaceSize(size);

The current scratch space size limit is 20 floats.
