Category Archives: Graphics
This project was done as part of my honours-stage research paper. The idea was born from realizing how little flexibility real-time animation currently offers: while animation blending, ragdoll physics, and simple IK re-targeting techniques are commonplace, virtually nothing else exists in the way of full-character motion generation. The benefits of such a system would be enormous from both a game-design and a real-time perspective. So I decided to investigate current procedural animation techniques, and developed a procedural animation system of my own, which I later named IKAS (Inverse Kinematic Animation System).
Bezier curves, and curves in general, are arguably one of the most useful tools in graphics programming. They have so many practical applications, particularly in games, that they’re almost always used one way or another: camera movement, animation, motion paths — the possibilities are endless. The demo was built upon a framework that I’ve used for most of my DirectX projects; it has sustained heavy iteration through the years and has served its purpose very well.
This demo was a quick experiment in Bezier curve manipulation: rendering composite Bezier curves and then animating an object along the curve at a speed dictated by some simple physical properties. The user places control points for a Bezier curve around a small level, then tells the application to run, and a spaceship flies along the user-defined curve at velocities dependent on the incline of the slope.
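Evaluating a point on one cubic segment of the composite curve comes down to the Bernstein form of the Bezier equation. A minimal 2D sketch (the demo itself works in 3D, and these control-point values are illustrative only):

```cpp
#include <cassert>

// Minimal 2D point; the demo uses 3D, but the maths is identical per component.
struct Vec2 { double x, y; };

// Evaluate a cubic Bezier at parameter t in [0,1] using the Bernstein form:
// B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3
Vec2 cubicBezier(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, double t) {
    double u  = 1.0 - t;
    double b0 = u * u * u;
    double b1 = 3.0 * u * u * t;
    double b2 = 3.0 * u * t * t;
    double b3 = t * t * t;
    return { b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x,
             b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y };
}
```

A composite curve is simply a sequence of these segments; sharing the last control point of one segment as the first of the next keeps the path continuous.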
W, A, S, D, PageUp and PageDown move the current control point. Selecting “Add control point” creates a new control point for placement while fixing the previous one in place. Hitting “Run” switches to view mode, where a simple model follows the path created by the user. Reset the curve by clicking “Restart.”
The aim of this demo was to create a 3D scene using the Direct3D API. The application gives the user dynamic control over numerous variables, such as the wave height, speed, size and texture scale. The code uses the concepts discussed in Frank Luna’s book ‘Introduction to 3D Game Programming with DirectX 9.0c’.
The application starts with a flat vertex grid and uses displacement mapping to offset each vertex over time. Displacement mapping associates each vertex with a displacement vector; the vertex shader then offsets (i.e. displaces) each vertex along that vector by a length stored in the displacement map. The result is what you see below.
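The per-vertex operation can be sketched on the CPU as below. The `sampleHeight` function here is a stand-in for the demo's actual displacement-map lookup — a travelling sine wave keeps the sketch self-contained, but it is an assumption, not the real map:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Stand-in for sampling the displacement map; a travelling sine wave gives an
// animated height value. Amplitude and speed mirror the user-tweakable
// wave height and speed variables.
float sampleHeight(float x, float z, float time, float amplitude, float speed) {
    return amplitude * std::sin(x + z + time * speed);
}

// Offset (displace) the vertex along its displacement vector by the sampled
// length. For a flat water grid the displacement vector is simply the up axis.
Vec3 displace(const Vec3& pos, const Vec3& dir, float height) {
    return { pos.x + dir.x * height,
             pos.y + dir.y * height,
             pos.z + dir.z * height };
}
```

In the demo this runs in the vertex shader each frame, so the CPU-side grid never changes; only `time` advances.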
My original idea for this project was to create numerous ‘types’ of water, using different rendering methods to show various styles and levels of realism. However, this was out of reach given the time frame available, so I eventually settled on displacement mapping as the method for this project. Other techniques I investigated were noise-based water simulations, and I am working on implementing these, along with added reflection/distortion effects, within the demo.
All source code for the project can be found here. You will need Visual Studio and DirectX installed to run the project. Most modern gaming PCs should be capable of running the application at a smooth frame rate.
This short demo uses Simplex Noise both on the CPU and GPU to procedurally generate terrain. Everything in the scene is created at run time – no models are loaded in at all. The scene was rendered using OpenGL, with lighting, material and multi-texturing effects applied with GLSL shaders, using a simple but flexible modular framework I built throughout the project.
Simplex Noise is Ken Perlin’s updated version of Perlin noise. While generally less well known, it has several advantages over classic Perlin Noise, such as being visually isotropic, being faster to calculate in higher dimensions, and having lower computational complexity in general, with far fewer multiplications required. In common with Perlin Noise, Simplex Noise produces the same output given the same input values, and by applying functions to it, such as taking the absolute value of the noise, a range of effects can be created.
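The absolute-value trick mentioned above is often called turbulence: summing |noise| over several octaves gives a billowy, ridged look instead of smooth rolling terrain. The sketch below uses a simple hash-based value noise as a stand-in for Simplex Noise — it shares the key contract (deterministic output for a given input) but is not Perlin's algorithm:

```cpp
#include <cmath>
#include <cstdint>

// Hash-based value noise in [-1,1]; a deterministic stand-in for simplex noise.
float latticeNoise(int x, int y) {
    uint32_t h = (uint32_t)x * 374761393u + (uint32_t)y * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    h ^= h >> 16;
    return (h & 0xFFFFFFu) / (float)0xFFFFFF * 2.0f - 1.0f;
}

// Smoothly interpolate between lattice values.
float smoothNoise(float x, float y) {
    int   xi = (int)std::floor(x), yi = (int)std::floor(y);
    float tx = x - xi, ty = y - yi;
    tx = tx * tx * (3.0f - 2.0f * tx);          // smoothstep fade
    ty = ty * ty * (3.0f - 2.0f * ty);
    float a = latticeNoise(xi, yi),     b = latticeNoise(xi + 1, yi);
    float c = latticeNoise(xi, yi + 1), d = latticeNoise(xi + 1, yi + 1);
    float top = a + (b - a) * tx, bot = c + (d - c) * tx;
    return top + (bot - top) * ty;
}

// Turbulence: sum the absolute value of each octave, halving the amplitude
// and doubling the frequency each step. Normalised to roughly [0,1].
float turbulence(float x, float y, int octaves) {
    float sum = 0.0f, amp = 1.0f, freq = 1.0f, norm = 0.0f;
    for (int i = 0; i < octaves; ++i) {
        sum  += std::fabs(smoothNoise(x * freq, y * freq)) * amp;
        norm += amp;
        amp  *= 0.5f;
        freq *= 2.0f;
    }
    return sum / norm;
}
```

Swapping `latticeNoise` for a real Simplex implementation changes nothing else; the octave-summing layer is the same either way.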
A key feature I’m currently working on is implementing water similar to my DirectX demo, and perhaps introducing a procedurally generated skybox.
All source code for the project can be found here. You will need Visual Studio 2008 or later and Microsoft’s TR1 extension to compile the application.
The aim of this demo was to show how the Vector Units of the PS2 can be used to create various lighting effects. Working on top of a framework which was provided for this university project, I implemented a hierarchical system for managing entities in the scene, as well as producing a range of interesting effects.
Animated Point Lights
Animated point lights which rapidly change colour, size and position to give a realistic effect. Their intensity fades with distance squared, acting only over a small area.
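The distance-squared falloff can be sketched as below. The exact attenuation formula and range cutoff used in the demo are assumptions; this shows the general shape of the computation:

```cpp
// Point-light attenuation: intensity falls off with distance squared, and the
// light is cut off entirely beyond its range so it only acts on a small area.
// (Illustrative formula, not the demo's exact one.)
float attenuate(float intensity, float distSq, float rangeSq) {
    if (distSq >= rangeSq) return 0.0f;      // outside the light's small area
    return intensity / (1.0f + distSq);      // inverse-square falloff
}
```

Working with squared distances avoids a square root per vertex, which matters on hardware like the PS2 Vector Units.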
Ambient and Directional Lights
Ambient and directional lighting was implemented on the Vector Unit, using the normals for each vertex to calculate intensity.
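The per-vertex intensity described above is the standard Lambert term: the dot product of the unit vertex normal with the unit direction to the light, clamped at zero, plus a constant ambient contribution. A CPU-side sketch of what the Vector Unit computes per vertex:

```cpp
// Per-vertex intensity = ambient + lightStrength * max(N.L, 0).
// Both the normal (nx,ny,nz) and the light direction (lx,ly,lz) are
// assumed to be unit length.
float vertexIntensity(float ambient, float lightStrength,
                      float nx, float ny, float nz,
                      float lx, float ly, float lz) {
    float nDotL = nx * lx + ny * ly + nz * lz;
    if (nDotL < 0.0f) nDotL = 0.0f;   // faces pointing away get ambient only
    return ambient + lightStrength * nDotL;
}
```

On the Vector Unit the same dot product and clamp map naturally onto the vector multiply-add and max instructions, one vertex per iteration.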
Texture and Alpha Blending
Using multiple layers of geometry in close proximity allowed textures to be blended, improving the overall appearance of the models. Various alpha blending algorithms were also used to alter colour values within the scene, intensifying and smoothing the models so they fit with the rest of the landscape.
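The core of layered texture blending is the standard “over” operation, applied per colour channel as each geometry layer is drawn. A minimal sketch (the demo's specific blend equations on the PS2 are not shown here; this is the common case):

```cpp
// Standard alpha "over" blend for one colour channel:
// result = source * alpha + destination * (1 - alpha).
// Drawing several close geometry layers with partial alpha composites their
// textures together.
float blendOver(float src, float dst, float alpha) {
    return src * alpha + dst * (1.0f - alpha);
}
```

Repeating this for each layer, nearest drawn last, accumulates the blended texture result in the framebuffer.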
This being on the PS2, there was no real way to record footage of the demo in action; I was able to take screenshots, however, using a built-in function within the framework.
The source code for the project can be found here. You will need a PS2 development kit to compile and run the source code.