jeudi 12 novembre 2009

Very Fast Fake Radiosity

The concept is very simple:
we have RadiantProbes inside an object (a RadiantObject) which give the lighting information to the world. RadiantProbes are like colored spheres inside an object. We just have to transform the normal of the fragment into RadiantProbe space and fetch the color with the transformed normal.
My demo only shows the concept; for a real implementation in a game, serious work on radiant probe sorting still has to be done.
An enhancement could be to store, alongside the colored spheres (RadiantProbes), other probes holding geometric information (a depth cube map transformed to SH coefficients); in that case occlusion could be achieved.



I took irrlicht as a base.
The main class is CDemo. The core of the algorithm is in RadiosityManager / RadiantObject and in the shader (sh.hlsl).

Each object which emits light is called a RadiantObject.
Each RadiantObject has a number of RadiantProbes. A RadiantProbe can be seen as a colored sphere. At each RadiantProbe, we generate a cube map, transform this cube map into SH coefficients and save them.
In the rendering step, the light manager retrieves the lights/radiant probes which concern the object to be rendered, and sends the SH coefficients for this object as shader constants.
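
As a rough sketch of the per-fragment evaluation (this is not the actual sh.hlsl; the constant names, and the choice of 9 SH coefficients, are my own assumptions), it could look like this: transform the world-space normal into probe space, then evaluate the SH with it.

// Hypothetical pixel-shader sketch: evaluate one RadiantProbe for a fragment.
// SHCoeffs are the RGB SH coefficients sent by the light manager;
// WorldToProbe brings the fragment normal into the probe space.
float3   SHCoeffs[9];
float4x4 WorldToProbe;

float3 EvaluateRadiantProbe(float3 worldNormal)
{
    float3 n = normalize(mul((float3x3)WorldToProbe, worldNormal));
    // Standard real SH basis, bands 0..2 (9 terms).
    float3 c = SHCoeffs[0] * 0.282095;
    c += SHCoeffs[1] * 0.488603 * n.y;
    c += SHCoeffs[2] * 0.488603 * n.z;
    c += SHCoeffs[3] * 0.488603 * n.x;
    c += SHCoeffs[4] * 1.092548 * n.x * n.y;
    c += SHCoeffs[5] * 1.092548 * n.y * n.z;
    c += SHCoeffs[6] * 0.315392 * (3.0 * n.z * n.z - 1.0);
    c += SHCoeffs[7] * 1.092548 * n.x * n.z;
    c += SHCoeffs[8] * 0.546274 * (n.x * n.x - n.y * n.y);
    return max(c, 0.0);
}
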
These sources are the base of my framework, which is of course not finished. I'm working on a musical shoot 'em up (Battle Mode) at the same time, during my hobby time.
I've interfaced many things with Lua for the game (not shown here).
I have some maxscripts (you can see an example in the project) which generate Lua files with data for my splines, enemy waves, weapon specs, etc. The RadiantProbes will be exported in the same way.

Source on Google Code here:
http://code.google.com/p/fakeradiosity/

Real time Multibounce Radiosity Approximation

It is a proposal (rejected) for the ShaderX8 book, which I made with my colleague Emmanuel Briney.

We present an algorithm which simulates the light transport of a radiosity solution in real time, using a very simplified representation of the scene geometry. All computations are done on the GPU.

First, in a preprocess step, we generate volume textures which contain the needed data; those textures can be low resolution (32x32x32 for example). Each texel of those volume textures corresponds to a world position xyz where we want to approximate radiosity (to keep things simple, a WorldToRadiosityVolume matrix maps between the two spaces); we will call those points radiosity probes.
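As a hedged illustration (the matrix is named in the text, but this function and the exact mapping convention are my own assumptions), going from a world position to a coordinate in the radiosity volume is a single matrix transform:

// Hypothetical HLSL sketch: WorldToRadiosityVolume maps world-space xyz into the
// [0,1]^3 texture space of the (for example) 32x32x32 radiosity volume.
float4x4 WorldToRadiosityVolume;

float3 WorldToVolumeUVW(float3 worldPos)
{
    return mul(WorldToRadiosityVolume, float4(worldPos, 1.0)).xyz;
}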

For each radiosity probe we shoot:

- an environment distance cube map (the distance from the probe to the scene in every direction); we transform this cube map into Spherical Harmonics coefficients and store those coefficients in volume textures.

- an environment normal cube map (the normals viewed from the probe, divided by the distance from the probe to the surface); we average all texels of this cube map to obtain what we can call a bent normal, and store it in volume textures.

So for order 1 distance packing we need two float4 textures (one for the SH distance and one for the bent normal), and for order 2 distance packing we need three float4 textures (9 components for the SH and 3 for the normal).
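To make the packing concrete, here is a hedged sketch for the order 2 case; the texture names, channel layout and ProbeSample struct are my own assumptions, and it reuses the WorldToVolumeUVW helper sketched above:

// Hypothetical packing: three float4 volume textures per probe texel, holding the
// 9 SH coefficients of the distance plus the bent normal (12 components in total).
sampler3D PackedVolume0;   // SH distance coefficients 0..3
sampler3D PackedVolume1;   // SH distance coefficients 4..7
sampler3D PackedVolume2;   // SH distance coefficient 8 + bent normal xyz

struct ProbeSample
{
    float4 sh03;           // SH distance coefficients 0..3
    float4 sh47;           // SH distance coefficients 4..7
    float  sh8;            // SH distance coefficient 8
    float3 bentNormal;
};

ProbeSample GetSHCoefficientsFromVolumeTextures(float3 worldPos)
{
    float3 uvw = WorldToVolumeUVW(worldPos);
    ProbeSample s;
    s.sh03 = tex3D(PackedVolume0, uvw);
    s.sh47 = tex3D(PackedVolume1, uvw);
    float4 t2 = tex3D(PackedVolume2, uvw);
    s.sh8 = t2.x;
    s.bentNormal = normalize(t2.yzw);
    return s;
}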

In the real-time process, for each fragment of the scene, we accumulate the path to the light in reverse order: from the fragment to the light. We use the normal and position of the fragment being drawn, get the distance to the nearest object from the volume textures, calculate the new position, fetch the normal from the volume texture, calculate the reflection of the incident light, and so on. With successive iterations we can approximately reconstruct the multiple bounces of the light through the scene geometry.


In order to calculate the light contribution of a single iteration, we also probably need an extra RGBA8 volume texture which contains the material properties of the sampled space; diffuse color and reflectivity are sufficient for the simplest lighting model.
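A minimal sketch of that lookup, assuming RGB holds the diffuse color and A the reflectivity (the texture name and channel layout are my assumptions), reusing WorldToVolumeUVW from above:

// Hypothetical material volume: rgb = diffuse color, a = reflectivity.
sampler3D MaterialVolume;

float4 FetchMaterial(float3 worldPos)
{
    return tex3D(MaterialVolume, WorldToVolumeUVW(worldPos));
}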


Pseudo code for the main shader:

Reflectivity = 0..1                       // material reflectivity of the first fragment
P1 = Position
N1 = Normal
Color = CalculateLighting(P1, N1) * (1 - Reflectivity)

for (i = 0; i < numberOfBounce; i++)
{
    ShCoefficients = GetSHCoefficientsFromVolumeTextures(P1)
    D1 = GetDistanceToNearestObject(ShCoefficients, N1)
    P2 = P1 + N1 * D1                     // march along N1 to the nearest surface
    N2 = GetNormalAt(P2)

    // At each step we can compute the attenuation of the light along the multiple bounces,
    // depending on material reflectivity (maybe stored in a 3D volume texture too, or considered constant).
    Reflectivity2 = MaterialReflectivity(P2, N2, P1)
    Color += CalculateLighting(P2, N2) * Reflectivity * (1 - Reflectivity2)

    // Init next step: bounce at P2 and carry the reflected fraction of the path.
    R = reflect(N1, N2)                   // HLSL reflect(incident, normal)
    N1 = R
    P1 = P2
    Reflectivity *= Reflectivity2
}
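
The helper functions are not spelled out in the proposal; a hedged sketch of how GetDistanceToNearestObject and GetNormalAt could be implemented, building on the ProbeSample fetch sketched earlier (the order 2 SH basis and the use of the bent normal as the surface normal at P2 are my assumptions), might be:

// Evaluate the SH-encoded distance in direction n (order 2 real SH basis).
float GetDistanceToNearestObject(ProbeSample s, float3 n)
{
    float d = s.sh03.x * 0.282095
            + s.sh03.y * 0.488603 * n.y
            + s.sh03.z * 0.488603 * n.z
            + s.sh03.w * 0.488603 * n.x
            + s.sh47.x * 1.092548 * n.x * n.y
            + s.sh47.y * 1.092548 * n.y * n.z
            + s.sh47.z * 0.315392 * (3.0 * n.z * n.z - 1.0)
            + s.sh47.w * 1.092548 * n.x * n.z
            + s.sh8    * 0.546274 * (n.x * n.x - n.y * n.y);
    return max(d, 0.0);
}

// The bent normal stored next to the SH distance stands in for the normal at the hit point.
float3 GetNormalAt(float3 worldPos)
{
    ProbeSample s = GetSHCoefficientsFromVolumeTextures(worldPos);
    return s.bentNormal;
}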

dimanche 6 janvier 2008

What I could put in this blog

Here I want to show some techniques I find interesting about shader optimisation, 3D game engine development, and the graphics production pipeline for real-time games.
At my work, we have developed some fun techniques for global illumination, ambient occlusion & shadows, all fake of course (it's always "fake" in real-time 3D graphics), but which should work great. I hope I can show some nice pics soon.
For now, some great links I found:
http://zeuxcg.blogspot.com/2007/10/my-own-lighting-shader-with-blackjack.html