Rendering 3D Graphics in .NET with C# and Igneel.Graphics

20 Jan 2017 · CPOL · 8 min read
The article shows how to render 3D graphics with C# in .NET using the Igneel.Graphics API.

Introduction

Igneel.Graphics is an API for rendering 3D graphics on .NET. It provides an abstraction for interacting with the graphics hardware from C# code. The API was developed by combining the expressivity of C# with the power of C++. Furthermore, it combines concepts taken from the OpenGL and Direct3D specifications with unique features like C# interface and dynamic mapping to shader uniform variables. Although Igneel.Graphics shares common definitions with Direct3D10, it is not just a simple wrapper over it; it is more like a platform or middleware you can use from managed code, one that can be implemented with Direct3D11, OpenGL or OpenGL ES. Also, shader management in Igneel.Graphics is closer to the OpenGL specification than to Direct3D.

In Igneel.Graphics every programmable stage of the graphics pipeline is represented by an IShaderStage<TShader> interface. This interface can be used to create shaders and to set resources like textures, buffers or sampler states.

This article covers a sample application hosted in a Windows Forms environment in order to show how to use the API components for rendering geometry, applying textures, loading and building shader code, and supplying application values to shader uniform variables.

Background

Igneel.Graphics was developed as the low-level graphics API of Igneel Engine. It is an abstraction on .NET that supports the high-level rendering system of Igneel Engine. The API was designed to support several shader models up to SM5.0, and its design therefore allows the API to be implemented on different native platforms.

The current Igneel.Graphics implementation uses Shader Model 4.0 (SM4.0). This shader model was initially supported by Direct3D10 (on the OpenGL side it roughly corresponds to OpenGL 3.x). It completely redefined the previous shader model architecture, allowing the customization of new stages of the graphics pipeline. Later, Shader Model 5.0 added new stages like the hull and domain shader stages, which interact with the non-customizable tessellation stage. The vertex shader stage is the only required one; the others are optional. For more information take a look at the DirectX SDK or the OpenGL documentation.


Using the code

First of all we create a Windows Forms application. In the Form constructor we acquire a reference to the GraphicDevice; after that we are ready to start loading our shaders and creating the GraphicBuffer objects holding the model's geometry. Once the GraphicDevice is created we can also load the model's textures, each represented by a Texture2D.

C#
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Runtime.InteropServices;
using System.Text;
using System.Threading.Tasks;
using System.Windows.Forms;
using Igneel;
using Igneel.Graphics;
using Igneel.Windows;
using Igneel.Windows.Forms;

namespace BasicEarth
{
    public partial class Form1 : Form
    {
        /// <summary>
        /// The Sphere vertex definition. 
        /// Attributes are used to define the 
        /// vertex shader input layout
        /// </summary>
        [StructLayout(LayoutKind.Sequential)]
        public struct SphereVertex
        {
            [VertexElement(IASemantic.Position)]
            public Vector3 Position;

            [VertexElement(IASemantic.Normal)]
            public Vector3 Normal;

            [VertexElement(IASemantic.Tangent)]
            public Vector3 Tangent;

            [VertexElement(IASemantic.TextureCoordinate, 0)]
            public Vector2 TexCoord;
         
            public SphereVertex(Vector3 position = default(Vector3), 
                Vector3 normal = default(Vector3), 
                Vector3 tangent = default(Vector3), 
                Vector2 texCoord = default(Vector2))
            {
                Position = position;
                Normal = normal;
                Tangent = tangent;
                TexCoord = texCoord;               
            }            
        }

        /// <summary>
        /// Data Structure that defines a Directional Light. 
        /// The structure uses 16-byte padding for efficient transfer of the data to GPU memory
        /// </summary>
        [StructLayout(LayoutKind.Sequential)]
        public struct DirectionalLight
        {
            /// <summary>
            /// Light's direction
            /// </summary>
            public Vector3 Direction;
            private float pad0;
            
            /// <summary>
            /// Light's color
            /// </summary>
            public Color3 Color;
            private float pad1;
        }

        /// <summary>
        /// Contract to access shader uniform variables and textures.      
        /// </summary>
        public interface ProgramMapping
        {
            float Time { get; set; }

            Matrix World { get; set; }

            Matrix View { get; set; }

            Matrix Projection { get; set; }

            DirectionalLight DirectionalLight { get; set; }

            Sampler<Texture2D> DiffuseTexture { get; set; }

            Sampler<Texture2D> NightTexture { get; set; }

            Sampler<Texture2D> NormalMapTexture { get; set; }

            Sampler<Texture2D> ReflectionMask { get; set; }

            Vector3 CameraPosition { get; set; }

            float ReflectionRatio { get; set; }

            float SpecularRatio { get; set; }

            float SpecularStyleLerp { get; set; }

            int SpecularPower { get; set; }
        }


        //The Graphic Device
        private GraphicDevice device;

        //Buffer for storing the mesh vertexes in GPU memory
        private GraphicBuffer vertexBuffer;

        //Buffer for storing the triangles indices in GPU memory
        private GraphicBuffer indexBuffer;

        //The shader program mapping
        ProgramMapping input;

        //The shader program
        ShaderProgram shaderProgram;                

        //Transformation matrices
        Matrix world;
        Matrix view;
        Matrix projection;

        //Texture sampling settings
        SamplerState diffuseSampler;
       
        //Textures
        Texture2D diffuseTexture;
        Texture2D nightTexture;
        Texture2D normalMapTexture;
        Texture2D reflectionMask;

        //Camera position
        private Vector3 cameraPosition = new Vector3(0, 10, -15);


        public Form1()
        {
            SetStyle(ControlStyles.Opaque, true);

            InitializeComponent();

            Init();            

            Application.Idle += (sender, args) =>
            {
                NativeMessage message;
                while (!Native.PeekMessage(out message, IntPtr.Zero, 0, 0, 0))
                {
                    RenderFrame();
                }
            };

          
        }           

        protected override void OnResize(EventArgs e)
        {
            base.OnResize(e);

            if (device != null)
            {
                //resize the device back buffer after the form's size changed
                device.ResizeBackBuffer(Width, Height);

                //set the new render target viewport
                device.ViewPort = new ViewPort(0, 0, Width, Height);

                //create the projection matrix with the new aspect ratio
                projection = Matrix.PerspectiveFovLh((float)Width / (float)Height, Igneel.Numerics.PIover6, 1, 1000);
            }
        }


        private void Init()
        {

            //Setup shader model version and default compiling options, 
            //also set the relative directory where the shaders are located
            ShaderRepository.SetupD3D10_SM40("Shaders");

            //Create an instance of the GraphicDeviceFactory.
            //The GraphicDeviceFactory abstract class is used to create GraphicDevices without worrying about the native implementation.
            //This sample uses a Direct3D10 native implementation, therefore an instance of GraphicManager10 is created
            GraphicDeviceFactory devFactory = new IgneelD3D10.GraphicManager10();

            //A GraphicDevice is created using a WindowContext containing rendering and display settings.           
            device = devFactory.CreateDevice(new WindowContext(Handle)
            {
                BackBufferWidth = Width,
                BackBufferHeight = Height,
                BackBufferFormat = Format.R8G8B8A8_UNORM,
                DepthStencilFormat = Format.D24_UNORM_S8_UINT,
                FullScreen = false,
                Sampling = new Multisampling(1, 0),
                Presentation = PresentionInterval.Default                 
            });

            //Create a ShaderProgram using the input layout definition provided by the SphereVertex struct
            //and the code for the vertex and pixel shaders located in the VertexShaderVS and PixelShaderPS files.
            //As a convention the last 2 characters in the filename specify the type of shader to load.
            shaderProgram = device.CreateProgram<SphereVertex>("VertexShaderVS", "PixelShaderPS");

            //Get a typed mapping using the ProgramMapping interface for the ShaderProgram uniform variables and textures
            input = shaderProgram.Map<ProgramMapping>();

            //The application blending state allowing transparency blend
            device.Blend = device.CreateBlendState(new BlendDesc(
                blendEnable: true, 
                srcBlend: Blend.SourceAlpha, 
                destBlend: Blend.InverseSourceAlpha));

            //The application depth testing state
            device.DepthTest = device.CreateDepthStencilState(new DepthStencilStateDesc(
                depthFunc: Comparison.Less));

            //The application rasterizer state
            device.Rasterizer = device.CreateRasterizerState(new RasterizerDesc(
                cull: CullMode.Back,
                fill: FillMode.Solid));

            //Default texture sampling settings
            diffuseSampler = device.CreateSamplerState(new SamplerDesc(
                addressU: TextureAddressMode.Wrap,
                addressV: TextureAddressMode.Wrap,
                filter: Filter.MinPointMagMipLinear));

            //Load the textures
            diffuseTexture = device.CreateTexture2DFromFile("Textures/Earth_Diffuse.dds");
            nightTexture = device.CreateTexture2DFromFile("Textures/Earth_Night.dds");
            normalMapTexture = device.CreateTexture2DFromFile("Textures/Earth_NormalMap.dds");
            reflectionMask = device.CreateTexture2DFromFile("Textures/Earth_ReflectionMask.dds");

            //Create transformation matrices
            world = Matrix.Identity;
            view = Matrix.LookAt(cameraPosition, new Vector3(0, 0, 1), Vector3.UnitY);
            projection = Matrix.PerspectiveFovLh((float)Width / (float)Height, Igneel.Numerics.PIover6, 1, 1000);

            CreateSphere();
        }

        // ...
}

Listing 1: Initialization.

In the previous code we hook up the Application.Idle event so we can render a frame whenever there are no pending messages to process. The line SetStyle(ControlStyles.Opaque, true) avoids flickering when Windows tries to repaint the background. In addition, some structures were defined: the geometry vertex definition SphereVertex, and the light definition DirectionalLight used in the pixel shader for lighting the scene. An interface, ProgramMapping, was also defined; it is used to create a mapping between the application's code and the shader's uniforms. A shader uniform is a variable that receives its value from the application's code and is generally defined in a constant buffer.
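The 16-byte padding in DirectionalLight matters because GPU constant buffers pack data into 16-byte registers, following HLSL cbuffer packing rules. A minimal standalone sketch (using System.Numerics.Vector3 and a renamed struct as stand-ins for the Igneel types) checks the padded layout:

```csharp
using System;
using System.Numerics;
using System.Runtime.InteropServices;

// Mirrors the article's DirectionalLight: each Vector3 (12 bytes) is
// followed by 4 bytes of explicit padding so every field starts on a
// 16-byte boundary, matching HLSL cbuffer packing.
[StructLayout(LayoutKind.Sequential)]
public struct PaddedDirectionalLight
{
    public Vector3 Direction;   // bytes 0..11
    private float pad0;         // bytes 12..15
    public Vector3 Color;       // bytes 16..27 (stand-in for Color3)
    private float pad1;         // bytes 28..31
}

public static class PaddingDemo
{
    public static void Main()
    {
        // Two 16-byte GPU registers -> 32 bytes total.
        Console.WriteLine(Marshal.SizeOf<PaddedDirectionalLight>()); // 32
    }
}
```

Without the padding fields the struct would be 24 bytes and the second vector would straddle a register boundary, forcing the runtime to repack it before upload.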

In the Init method the line ShaderRepository.SetupD3D10_SM40("Shaders") tells the API the location of the shader files and the default shader compiling settings for Shader Model 4.0. Next a GraphicDeviceFactory is created; in this case an implementation based on Direct3D10 is used, so a GraphicManager10 instance is required. This is the only part of the code that is tied to a specific native implementation of the API. Then, using the factory, we create the graphic device, passing as argument a WindowContext that contains presentation settings like the width and height of the window as well as the back buffer format and multisampling.

After the device is created we can load and compile the shaders with just one line of code.

C#
shaderProgram = device.CreateProgram<SphereVertex>("VertexShaderVS", "PixelShaderPS");

The previous line creates a shader program containing a vertex shader, with the input layout specified by the SphereVertex struct, and a pixel shader. In order to simplify the code for creating shader objects, the API uses a convention to identify the type of shader to create based on the shader's filename. To accomplish this it uses the filename suffix: VS (vertex shader), PS (pixel shader), GS (geometry shader), HS (hull shader), DS (domain shader) and CS (compute shader).
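The suffix convention is easy to emulate. A hypothetical helper (not part of the Igneel API; the name and return values are illustrative) that resolves the shader kind from a file name could look like this:

```csharp
using System;

public static class ShaderNaming
{
    // Maps the two-character filename suffix used by the article's
    // convention (VS, PS, GS, HS, DS, CS) to a shader kind.
    public static string KindFromFileName(string name)
    {
        if (name.Length < 2) throw new ArgumentException("name too short");
        switch (name.Substring(name.Length - 2))
        {
            case "VS": return "VertexShader";
            case "PS": return "PixelShader";
            case "GS": return "GeometryShader";
            case "HS": return "HullShader";
            case "DS": return "DomainShader";
            case "CS": return "ComputeShader";
            default: throw new ArgumentException("unknown shader suffix");
        }
    }

    public static void Main()
    {
        Console.WriteLine(KindFromFileName("VertexShaderVS")); // VertexShader
        Console.WriteLine(KindFromFileName("PixelShaderPS"));  // PixelShader
    }
}
```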

Another unique feature of Igneel.Graphics, called shader interface mapping, is shown in the following statement:

C#
input = shaderProgram.Map<ProgramMapping>();

After retrieving the interface instance we can use it to set the shader uniform variables, like transformation matrices, lighting data and textures, just by setting C# properties, with the added benefit of IntelliSense support.

Textures are very important in computer graphics applications, so the graphic device supports several methods for loading textures from files or streams, or just reserving GPU memory to fill later. You can also load different types of textures like Texture1D, Texture2D or Texture3D. Cube textures are treated as an array of 6 Texture2D. The supported file formats are .DDS, .JPG, .PNG, .TGA and .BMP.

In computer graphics, matrices are used to transform vectors from one space to another. Therefore, during the rendering process we need matrices in the vertex shader to transform positions from the local mesh space to projection space, also called homogeneous clip coordinates. The GPU then takes care of transforming these projection coordinates to screen coordinates by dividing by the w component (the perspective divide) and applying the viewport transformation.
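The chain can be sketched with System.Numerics instead of the Igneel math types (an assumption of this sketch: System.Numerics is right-handed, while the article uses the left-handed "Lh" variants, so view-space z signs differ; the row-vector multiplication order is the same):

```csharp
using System;
using System.Numerics;

public static class TransformDemo
{
    // Transforms a local-space point all the way to normalized device
    // coordinates: local -> world -> view -> projection -> divide by w.
    public static Vector3 ProjectToNdc(Vector4 local)
    {
        Matrix4x4 world = Matrix4x4.Identity;
        Matrix4x4 view = Matrix4x4.CreateLookAt(
            new Vector3(0, 0, -15), Vector3.Zero, Vector3.UnitY);
        Matrix4x4 projection = Matrix4x4.CreatePerspectiveFieldOfView(
            MathF.PI / 6, 16f / 9f, 1f, 1000f);

        // Row-vector convention: p * W * V * P, matching mul(p, World) in HLSL.
        Vector4 clip = Vector4.Transform(local, world * view * projection);

        // The GPU's perspective divide: clip.xyz / clip.w.
        return new Vector3(clip.X, clip.Y, clip.Z) / clip.W;
    }

    public static void Main()
    {
        // A point 5 units in front of the camera, on the view axis.
        Vector3 ndc = TransformDemo.ProjectToNdc(new Vector4(0, 0, -10, 1));
        Console.WriteLine(ndc); // X == Y == 0 since the point is on the view axis
    }
}
```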

C#
world = Matrix.Identity;
view = Matrix.LookAt(cameraPosition, new Vector3(0, 0, 1), Vector3.UnitY);
projection = Matrix.PerspectiveFovLh((float)Width / (float)Height, Igneel.Numerics.PIover6, 1, 1000);

Listing 4 Create Transforms

The code for generating the sphere mesh is shown here:

C#
private void CreateSphere()
{
    var stacks = 128;
    var slices = 128;
    var radius = 10;

    var vertices = new SphereVertex[(stacks - 1) * (slices + 1) + 2];
    var indices = new ushort[(stacks - 2) * slices * 6 + slices * 6];

    float phiStep = Numerics.PI / stacks;
    float thetaStep = Numerics.TwoPI / slices;

    // do not count the poles as rings
    int numRings = stacks - 1;

    // Compute vertices for each stack ring.
    int k = 0;
    var v = new SphereVertex();

    for (int i = 1; i <= numRings; ++i)
    {
        float phi = i * phiStep;

        // vertices of ring
        for (int j = 0; j <= slices; ++j)
        {
            float theta = j * thetaStep;

            // spherical to cartesian
            v.Position = Vector3.SphericalToCartesian(phi, theta, radius);
            v.Normal = Vector3.Normalize(v.Position);
            v.TexCoord = new Vector2(theta / (-2.0f * (float)Math.PI), phi / (float)Math.PI);

            // partial derivative of P with respect to theta
            v.Tangent = new Vector3(-radius * (float)Math.Sin(phi) * (float)Math.Sin(theta), 0, radius * (float)Math.Sin(phi) * (float)Math.Cos(theta));

            vertices[k++] = v;
        }
    }
    // poles: note that there will be texture coordinate distortion
    vertices[vertices.Length - 2] = new SphereVertex(new Vector3(0.0f, -radius, 0.0f), new Vector3(0.0f, -1.0f, 0.0f), Vector3.Zero, new Vector2(0.0f, 1.0f));
    vertices[vertices.Length - 1] = new SphereVertex(new Vector3(0.0f, radius, 0.0f), new Vector3(0.0f, 1.0f, 0.0f), Vector3.Zero, new Vector2(0.0f, 0.0f));

    int northPoleIndex = vertices.Length - 1;
    int southPoleIndex = vertices.Length - 2;

    int numRingVertices = slices + 1;

    // Compute indices for inner stacks (not connected to poles).
    k = 0;
    for (int i = 0; i < stacks - 2; ++i)
    {
        for (int j = 0; j < slices; ++j)
        {
            indices[k++] = (ushort)((i + 1) * numRingVertices + j);
            indices[k++] = (ushort)(i * numRingVertices + j + 1);
            indices[k++] = (ushort)(i * numRingVertices + j);

            indices[k++] = (ushort)((i + 1) * numRingVertices + j + 1);
            indices[k++] = (ushort)(i * numRingVertices + j + 1);
            indices[k++] = (ushort)((i + 1) * numRingVertices + j);
        }
    }

    // Compute indices for top stack.  The top stack was written
    // first to the vertex buffer.
    for (int i = 0; i < slices; ++i)
    {
        indices[k++] = (ushort)i;
        indices[k++] = (ushort)(i + 1);
        indices[k++] = (ushort)northPoleIndex;
    }

    // Compute indices for bottom stack.  The bottom stack was written
    // last to the vertex buffer, so we need to offset to the index
    // of first vertex in the last ring.
    int baseIndex = (numRings - 1) * numRingVertices;
    for (int i = 0; i < slices; ++i)
    {
        indices[k++] = (ushort)(baseIndex + i + 1);
        indices[k++] = (ushort)(baseIndex + i);
        indices[k++] = (ushort)southPoleIndex;
    }

    vertexBuffer = device.CreateVertexBuffer(data: vertices);
    indexBuffer = device.CreateIndexBuffer(data:indices);

}

Listing 5 Create the vertex and index buffers

In the CreateSphere method the last two statements create the vertex buffer, storing the vertices in GPU memory, and the index buffer, containing the indices that define the mesh triangles.

C#
vertexBuffer = device.CreateVertexBuffer(data: vertices);
indexBuffer = device.CreateIndexBuffer(data:indices);

This reserves memory on the graphic device for holding the arrays containing the vertex data and indices. The memory is reserved with default settings; the method also allows passing several parameters for controlling the resource memory behavior or the CPU access type, like reading or writing.
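The buffer sizes follow directly from the sphere tessellation: (stacks - 1) rings of (slices + 1) vertices (the seam vertex is duplicated for texturing) plus the two poles, and 6 indices per quad on the inner stacks plus 3 per triangle in each pole cap. A quick check of the arithmetic used by CreateSphere:

```csharp
using System;

public static class SphereCounts
{
    public static int VertexCount(int stacks, int slices) =>
        (stacks - 1) * (slices + 1) + 2;        // rings + 2 poles

    public static int IndexCount(int stacks, int slices) =>
        (stacks - 2) * slices * 6               // inner quad bands
        + slices * 6;                           // both pole caps (3 each)

    public static void Main()
    {
        int stacks = 128, slices = 128;
        Console.WriteLine(VertexCount(stacks, slices));      // 16385
        Console.WriteLine(IndexCount(stacks, slices));       // 97536
        Console.WriteLine(IndexCount(stacks, slices) / 3);   // 32512 triangles
    }
}
```

Note that 16385 vertices fit comfortably below 65536, which is why the sample can use ushort indices.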

The code for rendering the scene is located in the RenderFrame method.

C#
private void RenderFrame()
{
    //Set the render target and the depth stencil buffers
    //for rendering to the display just set the device default
    //BackBuffer and BackDepthBuffer
    device.SetRenderTarget(device.BackBuffer, device.BackDepthBuffer);

    //Set the ViewPort used by the device during the viewport transformation
    device.ViewPort = new ViewPort(0, 0, Width, Height);

    //Clear the render target and depth stencil buffers
    device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, new Color4(0, 0, 0, 0), 1, 0);

    //Set the primitive type
    device.PrimitiveTopology = IAPrimitive.TriangleList;

    //Bind the vertex buffer to slot 0 at offset 0
    device.SetVertexBuffer(0, vertexBuffer, 0);

    //Set the index buffer
    device.SetIndexBuffer(indexBuffer);


    //Send the transformation matrices to the vertex shader
    input.World = Matrix.RotationY(-(float)Environment.TickCount / 5000.0f);
    input.View = view;
    input.Projection = projection;

    //Send the light info and other values to the pixel shader
    input.CameraPosition = cameraPosition;
    input.ReflectionRatio = 0.05f;
    input.SpecularRatio = 0.15f;
    input.SpecularStyleLerp = 0.15f;
    input.SpecularPower = 8;
    input.DirectionalLight = new DirectionalLight
    {
        Color = Color3.White,
        Direction = new Euler(45, 0, 0).ToDirection()
    };

    //Bind a texture with a sampler state. As a convention the SamplerState
    //in the shader must have the same name as the texture with 's' as prefix
    //for example in the shader the sampler state is declared
    //SamplerState sDiffuseTexture;
    input.DiffuseTexture = diffuseTexture.ToSampler(diffuseSampler);

    //Bind textures with the default sampler state (linear filtering and wrap TextureAddressMode).
    //These statements have the same behavior as calling nightTexture.ToSampler()
    input.NightTexture = nightTexture;
    input.NormalMapTexture = normalMapTexture;
    input.ReflectionMask = reflectionMask;


    //Set the shader program
    device.Program = shaderProgram;

    //Draw the geometry using the indices count, the start index and the vertex base offset
    device.DrawIndexed((int)indexBuffer.SizeInBytes / indexBuffer.Stride, 0, 0);

    //Present the render target buffer to the display.
    device.Present();
}

Listing 6 Render Frame

After setting the render and depth-stencil buffers, the ViewPort is set and the rendering buffers are cleared. The primitive type is specified as a list of triangles, and the GraphicBuffer objects storing vertices and indices are bound to the pipeline. Then shader interface mapping is used to send the shader variable values, like matrices and lighting info. Shader interface mapping can also be used to bind textures and sampling states, as in the statement:

C#
input.DiffuseTexture = diffuseTexture.ToSampler(diffuseSampler);
.....
input.NightTexture = nightTexture;

Textures and sampler states can also be set by GPU registers using the IShaderStage interface, as in the following statements where a texture and a sampler state are bound to texture register 0 and sampler register 0.

C#
device.GetShaderStage<PixelShader>().SetResource(0, diffuseTexture);
device.GetShaderStage<PixelShader>().SetSampler(0, diffuseSampler);

An IShaderStage<TShader> can be obtained by calling device.GetShaderStage<TShader>(), where TShader is a type that inherits from Shader, like VertexShader, PixelShader, GeometryShader, HullShader, DomainShader or ComputeShader. If a particular GraphicDevice implementation does not support a shader stage for a given shader type, it must return null when calling device.GetShaderStage<[Shader Type]>().
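That null contract can be pictured with a toy device; every type and name below is an illustrative stand-in, not the real Igneel API:

```csharp
using System;
using System.Collections.Generic;

// Toy stand-ins for the article's shader type hierarchy.
public abstract class Shader { }
public sealed class VertexShader : Shader { }
public sealed class GeometryShader : Shader { }

public sealed class ToyDevice
{
    // This hypothetical device only implements a vertex shader stage.
    private readonly Dictionary<Type, object> stages =
        new Dictionary<Type, object> { { typeof(VertexShader), new object() } };

    // Mirrors the contract: return the stage if supported, null otherwise.
    public object GetShaderStage<TShader>() where TShader : Shader
    {
        object stage;
        return stages.TryGetValue(typeof(TShader), out stage) ? stage : null;
    }
}

public static class StageDemo
{
    public static void Main()
    {
        var device = new ToyDevice();
        Console.WriteLine(device.GetShaderStage<VertexShader>() != null);   // True
        Console.WriteLine(device.GetShaderStage<GeometryShader>() == null); // True
    }
}
```

Base rendering code can therefore probe for a stage and fall back gracefully when an implementation does not provide it.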

On the other hand, another unique feature called dynamic shader mapping can be used instead of interface shader mapping, as in the following line of code.

 

C#
shaderProgram.Input.World = Matrix.RotationY(-(float)Environment.TickCount/5000.0f);

The type of shaderProgram.Input is dynamic, so it is not necessary to declare an interface for mapping the shader constants. The drawback of dynamic shader mapping is that it can only map primitive types like vectors, matrices or textures; it cannot map user-defined types like the DirectionalLight struct. It also cannot bind a SamplerState, so the SamplerState must be bound using an IShaderStage.
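The mechanism behind this style of mapping can be emulated with C#'s DynamicObject: member accesses are resolved by name at runtime rather than against a declared interface. A toy sketch (not the Igneel implementation) of how an assignment like Input.Time can land in a name-keyed uniform table:

```csharp
using System;
using System.Collections.Generic;
using System.Dynamic;

// Toy uniform table: member accesses on the dynamic object are routed
// into a dictionary keyed by the uniform's name, the way dynamic shader
// mapping resolves a member like Input.World at runtime.
public sealed class UniformTable : DynamicObject
{
    private readonly Dictionary<string, object> values =
        new Dictionary<string, object>();

    public override bool TrySetMember(SetMemberBinder binder, object value)
    {
        values[binder.Name] = value;   // e.g. "Time" -> 1.5f
        return true;
    }

    public override bool TryGetMember(GetMemberBinder binder, out object result)
    {
        return values.TryGetValue(binder.Name, out result);
    }
}

public static class DynamicDemo
{
    public static void Main()
    {
        dynamic input = new UniformTable();
        input.Time = 1.5f;             // no interface declaration needed
        Console.WriteLine(input.Time); // 1.5
    }
}
```

The convenience comes at the cost of compile-time checking, which is exactly the trade-off between dynamic mapping and interface mapping described above.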

 

The Vertex Shader

HLSL
struct VSInput
{
    float4 Position : POSITION; 
    float3 Normal : NORMAL;
    float3 Tangent : TANGENT;
    float2 TexCoords : TEXCOORD0;
};

struct VSOutput
{
    float4 PositionVS : SV_POSITION;
    float2 TexCoords : TEXCOORD0; 
    float3 Normal : TEXCOORD1;
    float3 Tangent : TEXCOORD2;
    float3 Binormal : TEXCOORD3;
    float3 Position : TEXCOORD4;
};

cbuffer camera
{
	float4x4 View;	
	float4x4 Projection;
};

cbuffer perObject
{	
	float4x4 World;
};


VSOutput main( VSInput input)
{
    VSOutput output;
   
    // Transform to clip space by multiplying by the basic transform matrices.
    // An additional rotation is performed to illustrate vertex animation.    
    float4 worldPosition = mul(input.Position, World);
    output.PositionVS = mul(worldPosition, mul(View, Projection));
    
    // Move the incoming normal and tangent into world space and compute the binormal.
    // These three axes will be used by the pixel shader to move the normal map from 
    // tangent space to world space. 
    output.Normal = mul(input.Normal, World);
    output.Tangent = mul(input.Tangent, World);
    output.Binormal = cross(output.Normal, output.Tangent);
    output.Position = worldPosition.xyz;
     
    // Pass texture coordinates on to the pixel shader
    output.TexCoords = input.TexCoords;
    return output;    
}
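The binormal computed above is just the cross product of the world-space normal and tangent, completing the tangent-space basis the pixel shader uses to move the normal map into world space. The same arithmetic in C# with System.Numerics (the sample point is illustrative):

```csharp
using System;
using System.Numerics;

public static class TangentFrame
{
    public static void Main()
    {
        // For a point on the sphere's equator facing +Z:
        var normal  = new Vector3(0, 0, 1);  // unit surface normal
        var tangent = new Vector3(1, 0, 0);  // direction of increasing theta

        // Same as output.Binormal = cross(output.Normal, output.Tangent);
        var binormal = Vector3.Cross(normal, tangent);

        Console.WriteLine(binormal); // the +Y axis, orthogonal to both inputs
    }
}
```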

The Pixel Shader

HLSL
struct Light
{
	float3 Direction;
	float3 Color;
};

struct VSOutput
{
    float4 PositionVS : SV_POSITION;
    float2 TexCoords : TEXCOORD0; 
    float3 Normal : TEXCOORD1;
    float3 Tangent : TEXCOORD2;
    float3 Binormal : TEXCOORD3;
    float3 Position : TEXCOORD4;
};

cbuffer cbParams
{
	float ReflectionRatio;
	float SpecularRatio;
	float SpecularStyleLerp;
	int SpecularPower;
};

cbuffer cbLight
{
	Light DirectionalLight;
	float4x4 View;
	float3 CameraPosition;
};

Texture2D DiffuseTexture;
Texture2D NightTexture;
Texture2D NormalMapTexture;
Texture2D ReflectionMask;

SamplerState sDiffuseTexture;

float4 main(VSOutput input) : SV_TARGET
{	
    float3 EyeVector = normalize(input.Position - CameraPosition );

    
    // Look up the normal from the NormalMap texture, and unbias the result
    float3 Normal = NormalMapTexture.Sample(sDiffuseTexture, input.TexCoords).rgb;
    Normal = (Normal * 2) - 1;
    
    // Move the normal from tangent space to world space
    float3x3 tangentFrame = {input.Tangent, input.Binormal, input.Normal};
    Normal = normalize(mul(Normal, tangentFrame));
    
    // Start with N dot L lighting
    float light = saturate( dot( Normal, -DirectionalLight.Direction ) );
    float3 color = DirectionalLight.Color * light;
    
    // Modulate against the diffuse texture color
    float4 diffuse = DiffuseTexture.Sample(sDiffuseTexture, input.TexCoords);
    color *= diffuse.rgb;
    
    // Add ground lights if the area is not in sunlight
    float sunlitRatio = saturate(2*light);
    float4 nightColor = NightTexture.Sample(sDiffuseTexture, input.TexCoords);
    color = lerp( nightColor.xyz, color, float3( sunlitRatio, sunlitRatio, sunlitRatio) );
       
    
    // Add a specular highlight
    float reflectionMask = ReflectionMask.Sample(sDiffuseTexture, input.TexCoords).r;
    float3 vHalf = normalize( -EyeVector + -DirectionalLight.Direction );
    float PhongSpecular = saturate(dot(vHalf, Normal));
	

    color += DirectionalLight.Color * ( pow(PhongSpecular, SpecularPower) * SpecularRatio * reflectionMask);  
    
	 // Add atmosphere
    float atmosphereRatio = 1 - saturate( dot(-EyeVector, input.Normal) );
    color += 0.30f * float3(.3, .5, 1) * pow(atmosphereRatio, 2);

    // Set alpha to 1.0 and return
    return float4(color, 1.0);	
}
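Two small pieces of the pixel shader arithmetic, reproduced in C# for clarity: the normal-map "unbias" remaps a stored channel from [0,1] back to [-1,1], and saturate(2 * light) widens the sunlit region before blending the night and day colors (the sample values are illustrative):

```csharp
using System;

public static class ShaderMath
{
    // HLSL intrinsics recreated for the sketch.
    static float Saturate(float x) => Math.Clamp(x, 0f, 1f);
    static float Lerp(float a, float b, float t) => a + (b - a) * t;

    public static void Main()
    {
        // Unbias: a texel value of 0.5 stores the "zero" normal component.
        float stored = 0.5f;
        Console.WriteLine(stored * 2 - 1);        // 0

        // Day/night blend: an N dot L of 0.5 already counts as full daylight.
        float light = 0.5f;
        float sunlitRatio = Saturate(2 * light);  // 1
        float day = 0.8f, night = 0.1f;
        Console.WriteLine(Lerp(night, day, sunlitRatio)); // day color wins
    }
}
```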

In both shaders, different constant buffers were declared. The interface and dynamic mapping mechanisms take care of managing those constant buffers efficiently: only buffers containing variables that are actually accessed are updated, and each one is opened and closed only once per rendering frame.
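That bookkeeping can be pictured as a per-frame dirty set: writing any uniform marks only its owning buffer, and a single flush pass at draw time uploads just the touched buffers. A simplified, hypothetical sketch of the idea (not the Igneel implementation):

```csharp
using System;
using System.Collections.Generic;

// Toy model of constant-buffer management: each uniform belongs to one
// buffer; setting a uniform marks only that buffer dirty, and Flush()
// uploads each dirty buffer once per frame.
public sealed class ConstantBufferCache
{
    private readonly Dictionary<string, string> ownerBuffer;
    private readonly HashSet<string> dirty = new HashSet<string>();
    public int Uploads { get; private set; }

    public ConstantBufferCache(Dictionary<string, string> ownerBuffer)
    {
        this.ownerBuffer = ownerBuffer;
    }

    public void Set(string uniform, object value)
    {
        dirty.Add(ownerBuffer[uniform]);  // mark only the owning cbuffer
    }

    public void Flush()
    {
        Uploads += dirty.Count;           // one upload per touched buffer
        dirty.Clear();
    }
}

public static class CacheDemo
{
    public static void Main()
    {
        var cache = new ConstantBufferCache(new Dictionary<string, string>
        {
            ["View"] = "camera", ["Projection"] = "camera",
            ["World"] = "perObject",
        });

        cache.Set("View", null);
        cache.Set("Projection", null);    // same buffer: still one upload
        cache.Set("World", null);
        cache.Flush();
        Console.WriteLine(cache.Uploads); // 2
    }
}
```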

Application Screens


Points of Interest

In Igneel.Graphics it is interesting to note the simplicity of creating devices and resources like shaders, buffers, textures and pipeline states. The particular management of shaders, with features like interface mapping and dynamic mapping, is also noteworthy. As a remark, you can write your base rendering code unaware of the native implementation of the API and test whether an IShaderStage is implemented for a given Shader type. I also enjoyed developing this API a lot, and I learned a great deal about writing high-performance code and integrating managed .NET/MSIL code with native unmanaged code.

In order to run the sample you must first install the DirectX redistributable that comes with the SDK, which you can download at https://www.microsoft.com/en-us/download/details.aspx?id=6812. Note that after installing the SDK you must locate the SDK installation folder (by default in Program Files) and run the redistributable installer at [SDK Folder ]/Redist/DXSETUP.exe.

Furthermore, Igneel Engine is now available on GitHub, so contributions are welcome.

About the Author

My name is Ansel Castro Cabrera. I have a bachelor's degree in Computer Science from the University of Havana, where I specialized in computer graphics, compilers and .NET development. I have also worked in other areas of computer science like machine learning, computer vision, web and Android programming. In addition I have developed neural network and convolutional neural network models for pattern recognition on images, and I have worked with OpenCV on feature tracking and extraction. I have also worked with Django, PHP, ASP.NET WebForms, ASP.NET MVC and JavaScript in web development.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


