Nikita Agafonov

Space, Rockets and GPU particles

This article explains how the effects presented in the Starmax project work. “Starmax” is a showcase project I built together with immersive studios. There are two parts to this article: one covers the technical details, the other is more philosophical, about how we arrived at the final result. I believe it’s important to cover both topics since they represent different parts of the creative process.

I recommend reading the code snippets here alongside the repository’s code (see Links).

There’s only one JS file, main.js, which contains both the scene and material setup. The rest of the code is in the /shaders directory.

We also used three.js to set up the scene and manage resources.

Links.

“Starmax” project:
https://starmax.immersivestudios.de

GPU particles repo:
https://github.com/lightest/gpuparticles

The technical part.

GPU-driven particle rendering consists of two steps:
1) Simulation, which updates point positions in 3D space.
2) Visualization, which renders the particles using positions from the simulation as input.

Particles are represented mathematically as points in 3D space, think vec3. They are set in motion using two swapping textures. The first one goes as input to the simulation shader, while the second acts as output storage. You change the positions of the 3D points and write the result to the output texture. This constitutes one simulation step. On the next frame you swap the textures and the process repeats. The textures keep revolving, carrying the positions over and thus evolving the simulation state.

[Image: Swapping textures. If you visualize them, they’ll look like grids of random color.]

You can use any motion formula you know from school physics to displace particles, like linear accelerated motion for instance. Or you can implement something more complicated, like fluid dynamics behavior using the Navier-Stokes equations. It will all work with this architecture:

  • accept previous state (positions in space) as input.
  • change positions in any way.
  • write the result to output.
  • swap.
  • repeat the process.
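
For illustration, here is how the two swapping textures could be set up in three.js. This is a minimal sketch: SIM_WIDTH and SIM_HEIGHT are the simulation texture dimensions used in the repo, while the exact render target options in main.js may differ.

import * as THREE from "three";

// Two float render targets acting as the swapping textures: one is read by the
// simulation shader, the other receives its output, then they trade places.
const computeRenderTargets = [0, 1].map(() => new THREE.WebGLRenderTarget(SIM_WIDTH, SIM_HEIGHT, {
    format: THREE.RGBAFormat,
    type: THREE.FloatType,
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
    depthBuffer: false
}));

// Index of the target the current simulation step renders into; flipped every frame.
let simStep = 0;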

The effect presented in our project is achieved by a combination of things. First, together with positions in space we also supply a lifetime for each particle. This information is used to reset an individual particle’s position to its original location and start the process again. It allows us to “emit” particles in any direction and return them to the starting point once the lifetime is exceeded. Second, we supply surface normals: 3D unit vectors pointing away from the surface at a 90 degree angle. With these, particle displacement goes gradually along a certain direction instead of being randomly juggled around. Finally, the displacement itself is driven by classic 3D Perlin noise, which gives an organic, fluid-like motion. When in doubt, use Perlin noise!

[Image: Particles taking off from the surface, in the direction of surface normals.]

In summary — we pass 3 different textures to our simulation shader:

1) Original positions in 3D space. This is needed to reset the particles once their lifetime is over.

2) Surface normals, to be able to drive motion in a certain direction, away from the original position.

3) Previous simulation step. A texture containing the output of the shader calculated on the previous frame. It contains the altered particle position and its lifetime.
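
To give an idea of how these inputs are wired up, here is a rough sketch of the simulation material. The uniform names match the simulation shader shown further down; the shader source variables and the full uniform list are illustrative, not copied from main.js.

const simShaderMaterial = new THREE.ShaderMaterial({
    vertexShader: simulationVertexShader,
    fragmentShader: simulationFragmentShader,
    uniforms: {
        // 1) Original positions sampled from the mesh surface.
        uParticlesOriginPosition: { value: originalPositionsDataTexture },
        // 2) Surface normals at those positions.
        uParticlesOriginNormal: { value: originalNormalsDataTexture },
        // 3) Output of the previous simulation step, updated every frame.
        uParticlesPositions: { value: null },
        uTime: { value: 0 },
        uDt: { value: 0 }
    }
});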

Particle visualization.

The simulation itself isn’t going to bring particles to the screen. All it does is gradually change 3D positions over time. Something has to take those positions and do the rendering itself. This is good news! It means you can use any type of renderer; whether it’s volumetric particles or vertex based makes no difference. From this standpoint, the texture containing particle data can be plugged into any renderer capable of using such input.

We simply used Points, a very handy way of displaying particles, which also lets you work with them as with any regular mesh. You have access to position, rotation and scale, which behave as you’d expect from a mesh. Further down the article we’ll refer to it as the “points mesh”.

[Image: Points mesh, rendering particles as stars.]
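
A minimal sketch of how such a points mesh can be built: one vertex per particle, with a uv attribute telling every vertex which texel of the simulation texture to read. The actual setup in main.js may differ in the details.

const particleCount = SIM_WIDTH * SIM_HEIGHT;
const pointsGeometry = new THREE.BufferGeometry();
// Placeholder positions; the real ones come from the simulation texture in the vertex shader.
const positions = new Float32Array(particleCount * 3);
const uvs = new Float32Array(particleCount * 2);

for (let i = 0; i < particleCount; i++)
{
    uvs[i * 2] = (i % SIM_WIDTH) / SIM_WIDTH;
    uvs[i * 2 + 1] = Math.floor(i / SIM_WIDTH) / SIM_HEIGHT;
}

pointsGeometry.setAttribute("position", new THREE.BufferAttribute(positions, 3));
pointsGeometry.setAttribute("uv", new THREE.BufferAttribute(uvs, 2));

const pointsMesh = new THREE.Points(pointsGeometry, materials.pointsRenderShaderMaterial);
scene.add(pointsMesh);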

Data source.

Now let’s talk about the data source. Where do the original positions come from? Depending on your purposes you can use any source of data and write it all down to a texture. You can even have a separate shader if you like, which outputs geometry positions of some generated surface, and then send that as input to the particle simulator.

In our case we used three.js tooling, specifically MeshSurfaceSampler. Put simply, it outputs a set of 3D points evenly distributed over the mesh surface. Its usage is quite straightforward and presented in the examples. We also utilize it to get surface normals. Then we pack it all into DataTextures, which gives us textures 1 and 2. The third one is going to be the render target to which the simulation renders its result.

function glbToMeshSurfacePoints(glbModel)
{
    const width = SIM_WIDTH;
    const height = SIM_HEIGHT;
    const mesh = glbModel.scene.children.find(child => child instanceof THREE.Mesh);
    const data = sampleMeshSurface(width, height, mesh);

    const originalPositionsDataTexture = new THREE.DataTexture(data.surfacePoints, width, height, THREE.RGBAFormat, THREE.FloatType);
    const originalNormalsDataTexture = new THREE.DataTexture(
        data.surfaceNormals, width, height, THREE.RGBAFormat, THREE.FloatType,
        undefined,
        undefined,
        undefined,
        THREE.LinearFilter,
        THREE.LinearFilter
    );
    originalPositionsDataTexture.needsUpdate = true;
    originalNormalsDataTexture.needsUpdate = true;

    return {
        mesh,
        originalPositionsDataTexture,
        originalNormalsDataTexture
    };
}
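
The sampleMeshSurface helper above lives in the repo and isn’t shown here. A minimal version of such a function, using MeshSurfaceSampler, could look roughly like this (RGBA layout because the DataTextures above use THREE.RGBAFormat; the import path depends on your three.js version):

import * as THREE from "three";
import { MeshSurfaceSampler } from "three/addons/math/MeshSurfaceSampler.js";

function sampleMeshSurface(width, height, mesh)
{
    const sampler = new MeshSurfaceSampler(mesh).build();
    const surfacePoints = new Float32Array(width * height * 4);
    const surfaceNormals = new Float32Array(width * height * 4);
    const position = new THREE.Vector3();
    const normal = new THREE.Vector3();

    for (let i = 0; i < width * height; i++)
    {
        // Each sample is a random point on the surface plus the normal at that point.
        sampler.sample(position, normal);
        surfacePoints.set([position.x, position.y, position.z, 1.0], i * 4);
        surfaceNormals.set([normal.x, normal.y, normal.z, 0.0], i * 4);
    }

    return { surfacePoints, surfaceNormals };
}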

Points rendering.

After the simulation has finished its first step, we can pass the result to the points material. Unlike the simulation, point positions are set in the vertex shader. This is also where the color change is calculated. The fragment stage here is simply used to apply the color and render an individual particle. We additionally use a star shape taken from this beautiful shadertoy.
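
To give an idea of that flow, a stripped-down points vertex shader might look like this. uParticlesOutput is the actual uniform fed from the simulation render target (see the update loop below); uPointSize is illustrative, and the real shader in /shaders does more.

uniform sampler2D uParticlesOutput;
uniform float uPointSize;

void main()
{
    // Read the simulated particle position for this vertex and project it as usual.
    vec4 particleData = texture2D(uParticlesOutput, uv);
    vec4 modelPosition = modelMatrix * vec4(particleData.xyz, 1.0);
    gl_Position = projectionMatrix * viewMatrix * modelPosition;
    gl_PointSize = uPointSize;

    // Color calculations (including the cursor highlight shown later) also happen here.
}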

Finally, this is how the particle simulation and points rendering setup looks:

function update()
{
    const elapsedTime = clock.getElapsedTime();
    const dt = elapsedTime - prevFrameTime;

    currentMesh = meshes[Math.round(particlesDebugObject.spawnPointMix)] || meshes[0];

    controls.update();
    updateRaycaster();

    materials.simShaderMaterial.uniforms.uTime.value = elapsedTime;
    materials.simShaderMaterial.uniforms.uDt.value = dt;

    // Execute Particle simulation program and output the result to texture (render target).
    renderer.setRenderTarget(textures.computeRenderTargets[simStep]);
    renderer.render(
        particlesComputeProgram.scene,
        particlesComputeProgram.camera
    );
    // renderer.render(particlesComputeProgram.scene, camera);

    // Set render target back to null, for the next render call to render to the screen.
    renderer.setRenderTarget(null);

    // Set output of the simulation step as input for points rendering.
    materials.pointsRenderShaderMaterial.uniforms.uParticlesOutput.value = textures.computeRenderTargets[simStep].texture;

    // Set output of the simulation step as input, to be ready for the next frame.
    materials.simShaderMaterial.uniforms.uParticlesPositions.value = textures.computeRenderTargets[simStep].texture;
    simStep = (simStep + 1) % 2;
    prevFrameTime = elapsedTime;

    animate3DCursor();

    // Constantly rotate the model, just for demo sake.
    pointsRenderProgram.pointsMesh.rotation.y += 0.003;
}

Pointer effect.

Displacement driven by the pointer is exactly the same displacement you see between model transitions, but applied locally in the proximity of the cursor. The position of the cursor is obtained using a Raycaster, which checks for intersections with an invisible mesh. It is the same mesh we used for the surface sampling, but with position, rotation and scale copied from the points mesh on every frame. We add that mesh to the scene together with the points mesh and set its visibility to false.
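
As an illustration, updateRaycaster() (called from the update loop above) might do something along these lines. pointerNDC stands for the pointer position in normalized device coordinates and is an assumed name; the repo’s implementation may differ.

function updateRaycaster()
{
    // Keep the invisible sampling mesh aligned with the visible points mesh,
    // so intersections land exactly where the particles appear to be.
    currentMesh.position.copy(pointsRenderProgram.pointsMesh.position);
    currentMesh.rotation.copy(pointsRenderProgram.pointsMesh.rotation);
    currentMesh.scale.copy(pointsRenderProgram.pointsMesh.scale);

    raycaster.setFromCamera(pointerNDC, camera);
    const intersections = raycaster.intersectObject(currentMesh);

    if (intersections.length > 0)
    {
        // World-space hit point; converted to local space before reaching the shader.
        cursor3D.copy(intersections[0].point);
    }
}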

The important bit here is that the cursor position provided by the raycaster is given in world space, so in order to use it within the simulation we must first multiply it by the inverse world matrix of the mesh. This gives us the cursor position in the local space of the particles.

// Convert cursor coordinates to the particles’ local space.
cursor3DVec4.set(cursor3D.x, cursor3D.y, cursor3D.z, 1.0);
inverseMeshWorldMatrix.copy(currentMesh.matrixWorld);
inverseMeshWorldMatrix.invert();
cursor3DVec4.applyMatrix4(inverseMeshWorldMatrix);
cursor3DVec3Sim.set(cursor3DVec4.x, cursor3DVec4.y, cursor3DVec4.z);
materials.simShaderMaterial.uniforms.uPointerPos.value = cursor3DVec3Sim;

Note that we pass the 3D cursor position to the simulation shader, not the points rendering shader! Unlike many websites you can find online utilizing particle effects, ours integrates the cursor position into the simulation step. This enables us to affect particle positions continuously, together with the rest of the simulation formulas of choice, and thus preserve the effects activated in the cursor’s proximity.

Further, for aesthetics’ sake, any effects applied in the proximity of the cursor must be applied gradually. Otherwise it will look to the user like a very abrupt change in behavior. Unless that is the goal, it’s better to use some sort of smooth curve. We use a Gaussian in its simplest form, e^(-x²). This allows any effect to gradually gain power, reach its maximum at the intersection point with the cursor, and then just as gradually fade away. Both the color change and the position deviation are applied using this curve.

[Image: Gaussian-shaped gradual displacement of particles in the direction of the surface normal.]

// Gaussian shape.
float l = exp(-pow(length(uPointerPos - modelPosition.xyz), 2.0));
vParticleColor = mix(vParticleColor, uParticleTouchColor, l);

Finally, the position deviation itself is not applied directly. Instead the position is changed along a deviated surface normal. The deviation is provided by 3D Perlin noise, which slightly adjusts the direction of the normal. The particle’s position is then displaced along the direction of that normal. This is what results in such interesting-looking behavior.

[Image: Changing the direction of the normal over time gradually changes the particle’s trajectory.]

[Image: Live example of particles following the surface normal direction, which changes over time.]

void main()
{
    // Stores particle position in 3d space and life-time.
    vec4 particleData = texture2D(uParticlesPositions, vUv);

    vec4 originParticleData = texture2D(uParticlesOriginPosition, vUv);
    vec4 originParticleNormals = texture2D(uParticlesOriginNormal, vUv);
    vec4 originParticleDataAlt = texture2D(uParticlesOriginPositionAlt, vUv);
    vec4 originParticleNormalsAlt = texture2D(uParticlesOriginNormalAlt, vUv);

    vec3 pos = particleData.xyz;
    vec3 originPos = originParticleData.xyz;
    vec3 originNormal = originParticleNormals.xyz;
    vec3 originPosAlt = originParticleDataAlt.xyz;
    vec3 originNormalAlt = originParticleNormalsAlt.xyz;

    vec3 normal = mix(originNormal, originNormalAlt, uOriginPointMix);
    float particleLifeTime = particleData.w + uDt;
    float rndVal = n1rand(pos.xy);

    float tf = uTime * 0.5f;
    vec3 p = pos * uNoiseScale;
    p = vec3(p.x + cos(tf), p.y + sin(tf), p.z + tf);
    float n0 = cnoise3(p);
    float n0Scaled = n0 * uNoiseMagnitude;
    pos += normal * n0Scaled;

    // Raycaster driven position offset.
    // Deviating direction of the normal using noise, to displace particles in an interestingly looking way.
    vec3 deviatedNormal = normal + vec3(n0);

    // Sphere shape.
    // float l = 1.0f - clamp(length(uPointerPos - pos), 0.0f, 1.0f);

    // Gaussian shape.
    float l = exp(-pow(length(uPointerPos - pos), 2.0));
    pos += deviatedNormal * l * uPointerDisplacementMagnitude;


    if (particleLifeTime > uParticlesLifetime)
    {
        pos = mix(originPos, originPosAlt, uOriginPointMix);
        particleLifeTime = rndVal * uParticlesLifetime;
    }

    // Write new position out.
    gl_FragColor = vec4(pos, particleLifeTime);
}

One final bit to mention: as you might notice, the noise is sampled at a coordinate computed with cos and sin of a time offset. This simply adds a rotation to the coordinates, which carries over to the entire displacement effect. A small touch, but without it the noise sort of passes through the mesh continuously, whereas with it the noise revolves around the mesh.

Stars background.

As mentioned earlier, the cosmic background uses Martijn Steinrucken’s shader as its foundation. He, by the way, has a great video explaining it in detail:
https://www.youtube.com/watch?v=rvDo9LvfoVE

We introduced a few modifications to better serve our needs.
First, the stars themselves: we use a simplified version that renders each star as just a glowing orb:

float Star(vec2 uv)
{
    // Distance from the star center, clamped to avoid division by zero.
    float d = max(0.001, length(uv));
    // Bright core with an inverse-distance falloff, faded out towards the edge.
    float m = 0.05 / d;
    m *= smoothstep(1.0, 0.2, d);

    return m;
}

Second, the density of the stars is tweaked. We use only two layers of stars and they are distributed more sparsely. This is to mitigate distraction; after all, the attention is supposed to be focused on the main content.
To achieve the parallax effect visible when scrolling the page, each star layer uses its own, slightly adjusted UV scale. Its y coordinate is shifted together with scrolling.
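
In shader terms the idea boils down to something like the following sketch, assuming a StarLayer() helper that tiles the Star() function above over a grid of cells; the scales and scroll speeds here are illustrative, while uScrollOffset is the real uniform used below.

vec2 uvLayer0 = uv * 3.0 + vec2(0.0, uScrollOffset * 0.05);
vec2 uvLayer1 = uv * 4.5 + vec2(0.0, uScrollOffset * 0.15);
float stars = StarLayer(uvLayer0) + StarLayer(uvLayer1);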

Finally, the nebula gradually revealing the stars is the same 3D Perlin noise we used to displace the particles. For simplicity’s sake, we’re using only two octaves of such noise. The final value you see on the screen is a noise of the noise:

// Nebula (cloud-like stuff).
// In order to achieve parallax and overall higher complexity of the underlying noise,
// we use different octaves of noise with increasing offset in time and in scrolling.
// Each next layer of noise, featuring more details, scrolls faster and changes in time faster.
// Each next layer also feeds on the previous one, thus providing an overall sense of depth and complexity.
vec2 uvScrollOffset0 = vec2(uv.x, uv.y - uScrollOffset * 0.025f);
vec2 uvScrollOffset1 = vec2(uv.x, uv.y - uScrollOffset * 0.1f);

float noiseTime = uTime * 0.075f;
float f = cnoise3(vec3(uvScrollOffset0, noiseTime));
f += cnoise3(vec3(uvScrollOffset1 * f * 2.0f, noiseTime * 2.0f));

The philosophical part.

It’s important to also lay out how we arrived at the effects presented in the project. Contrary to what one might think, they weren’t imagined in the design right away. We had a general direction, which included rockets and space. We knew we would like to mess with particles, but none of the amazing bits were anywhere near our minds in the beginning.

So how did it happen then?

Naturally, as a result of solving the problems at hand. The initial problem was GPU load being heavier than desired. The first version of the star background shader, coupled with particles rendering, ended up utilizing 40–50% of the GPU. It was clear to me right away that this wasn’t good enough and there was no reason for it to be this heavy.

Playing with the stars shader and trying to simplify existing effects, I ended up with the current version, where the smooth color change is driven by the same 3D Perlin noise. There are only two octaves of the noise, and the second one uses the output of the first as its input value. In other words, we take a noise of the noise. This gives an organic look, which evolves over time in an interesting way.

But no matter how I tried, it wasn’t possible to achieve less GPU load without losing quality and complexity of the effect. It became obvious: the resolution of the screen is what causes most of the load. There are simply too many pixels involved in the computation. Thus the next step was to render to a texture with adjustable resolution instead of the screen, so that you can scale GPU load with a resolution multiplier.

Right away, using half the resolution of the screen, the load went down to roughly 16%. Thanks to the low frequency of the noise, i.e. very gradual color change, the reduced resolution is not noticeable. The only thing that would suggest anything along those lines were the rays of the stars inherited from the original shader. Removing those simplified the shader further, making each star a simple length function with some smoothstep. This is how we got our current look.
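
In three.js terms the idea boils down to something like this sketch; the names and the multiplier are illustrative, the actual setup in main.js may differ.

// Render the background shader into a smaller texture and let it stretch to full screen.
const resolutionMultiplier = 0.5;
const backgroundRenderTarget = new THREE.WebGLRenderTarget(
    Math.floor(window.innerWidth * resolutionMultiplier),
    Math.floor(window.innerHeight * resolutionMultiplier)
);

renderer.setRenderTarget(backgroundRenderTarget);
renderer.render(backgroundScene, backgroundCamera);
renderer.setRenderTarget(null);
// backgroundRenderTarget.texture is then shown on a full-screen quad behind everything else.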

As you can see, the driving force here was the desire to gain more performance: by gradually changing and adjusting what we already had, we naturally arrived at a working solution.

A similar story goes for the pointer effect on the particles. The original plan was to just have some kind of displacement effect and maybe a color highlight that occurs as you hover the cursor over a mesh. But there was a problem: because displacement happens along the direction of the surface normal, the curvature of the surface on a low poly model reveals the faces. Since there’s one normal per face, a curved surface immediately makes the low polygon count obvious. In the case of the rocket it looks like it’s being decomposed into stripes or bands of particles, because a primitive cylinder shape was used in its construction.

So how to go about this? It was clear this effect should at least be mitigated, for aesthetics’ sake; it just doesn’t look that great. It was also clear you need to somehow alter the direction of the normal, so that particles taking off from the surface fly away in slightly different directions. That concealed the situation to a degree, but it wasn’t good enough. Overall it just added white noise to the motion and made the entire thing feel more chaotic.

Further, I tried different combinations of sin and cos to alter the direction of the normal, but it was too periodic. None of these felt right.
Frustrated, I thought: ok, fuck it, let’s plug in the Perlin noise we already have calculated for the general displacement! Immediately it was obvious: this is it! A simple vector constructed from the noise value deviated the direction of the normal just right. The bands didn’t really disappear, but due to the overall organic nature of the noise, they now looked in place! Suddenly nothing was missing, it was just good!

That’s the version of the effect you see on your screens. Again, it was discovered as a result of attempts to solve the problems at hand; in this case, a low poly mesh giving away its nature, and the solution being to embrace it and use it to your advantage.

Most things, if not all, have happened in a similar way. The grass is green because at some point it figured out that the most efficient spectrum of light to convert to chemical energy lies in the red and blue wavelengths, so green is what’s mostly reflected back. All it did was try to survive, to gain efficiency.

One guy once said: “Design is not just what it looks like and feels like. Design is how it works.” I believe our project demonstrates this principle very well. Both the cosmic background and the particles look and behave the way they do because they’re designed to be efficient in resource use and to aesthetically utilize the built-in geometry. Few things there were done with the purpose of pursuing specific looks imagined ahead of time. The most spectacular results emerged naturally from internal organization.
