Real-time dreamy Cloudscapes with Volumetric Raymarching

I spent the past few months diving into the realm of Raymarching and studying some of its applications that may come in handy for future 3D projects, and while I managed to build a fairly diverse set of scenes, all of them consisted of rendering surfaces or solid objects. My blog post on Raymarching covered several of the many impressive capabilities of this rendering technique, and as I mentioned at the end of that post, that was only the tip of the iceberg; there is a lot more we can do with it.
One fascinating aspect of Raymarching I quickly encountered in my study was its ability to be tweaked to render volumes. Instead of stopping the raymarched loop once the ray hits a surface, we push through and continue the process to sample the inside of an object. That is where my obsession with volumetric clouds started, and I think the many hours I spent exploring the many Sky Islands of Zelda Tears of the Kingdom contributed a lot to my curiosity to learn more about how they work. I thus studied a lot of Shadertoy scenes leveraging many Volumetric Raymarching techniques to render smoke, clouds, and cloudscapes, which I obviously couldn't resist trying to rebuild myself:
I spent a considerable amount of time exploring the different ways I could use Raymarching to render clouds, from fully wrapping my head around the basics of Volumetric Raymarching to leveraging physically based properties of clouds to try getting a more realistic output, while also trying to squeeze as much performance out of my scenes as possible with neat performance improvement tips I learned along the way. I cover all of that in this article, which I hope can serve you as a field guide for your own volumetric rendering experiments and learnings.
Volumetric rendering: Raymarching with a twist
In my previous blog post on Raymarching, we saw that the technique relied on:
- Signed Distance Fields: functions that return the distance of a given point in space to the surface of an object
- A Raymarching loop where we march step by step along rays cast from an origin point (a camera, the observer's eye) through each pixel of an output image, and we calculate the distance to the object's surface using our SDF. Once that distance is small enough, we can draw a pixel.
If you've practiced this technique on some of your own scenes, you're in luck: Volumetric Raymarching relies on the same principles: there's a loop, rays cast from an origin, and SDFs. However, since we're rendering volumes instead of surfaces, there's a tiny twist to the technique 👀.
How to sample a volume
The first time we got introduced to the concept of SDF, we learned that it was important not to step inside the object during our Raymarching loop to get a beautiful render. I even emphasized that fact in one of my diagrams showcasing 3 points relative to an object:
- P1 is located far from the surface, in green, representing a positive distance to the surface.
- P2 is located at a close distance ε to the surface, in orange.
- P3 is located inside the object, in red, representing a negative distance to the surface.
When sampling a volume, we'll need to actually raymarch inside our object and reframe how we think of SDFs: instead of representing the distance to the surface, we'll now use it as the density of our volume.
- When raymarching outside the volume, the density is null, or 0.
- Once we raymarch inside, it's positive.
To illustrate this new way of thinking about Raymarching in the context of volumes, here's a modified version of the widget I introduced in my blog post on the topic earlier this year.
That reframing of what an SDF represents ends up changing two core principles in our Raymarching technique that need to be reflected in our code:
- We have to march step by step at a constant step size along our rays. We no longer use the distance returned by the SDF.
- Our SDF now returns the opposite of the distance to the surface to properly represent the density of our object (positive on the inside, 0 on the outside).
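The two changes above can be illustrated outside of GLSL. Here's a minimal, hypothetical CPU-side sketch in Python of a constant-step march accumulating the density of a unit sphere (the step size and step count are placeholders, not values tuned for any of the demos):

```python
import math

MARCH_SIZE = 0.08  # constant step size, no longer driven by the SDF
MAX_STEPS = 100

def scene(p):
    """Density of a unit sphere at the origin: positive inside, 0 outside."""
    x, y, z = p
    distance = math.sqrt(x * x + y * y + z * z) - 1.0
    return max(-distance, 0.0)  # opposite of the signed distance, clamped

def march_density(ray_origin, ray_direction):
    """Accumulate density along the ray at a constant step size."""
    total = 0.0
    depth = 0.0
    for _ in range(MAX_STEPS):
        p = tuple(o + depth * d for o, d in zip(ray_origin, ray_direction))
        total += scene(p) * MARCH_SIZE
        depth += MARCH_SIZE
    return total

# A ray shot straight through the sphere accumulates density,
# while a ray that misses it accumulates none.
print(march_density((0.0, 0.0, -5.0), (0.0, 0.0, 1.0)))  # > 0
print(march_density((0.0, 3.0, -5.0), (0.0, 0.0, 1.0)))  # 0.0
```

Note how the loop never stops early: unlike surface Raymarching, we walk through the whole volume and sum what we find.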
Our first Volumetric Raymarching scene
Now that we have a grasp of sampling volumes using what we know about Raymarching, we can try implementing it by modifying an existing scene. For brevity, I'm not detailing the setup of a basic Raymarching scene. If you want a good starting point, you can head to the Raymarching setup I already introduced in a previous article.
The setup of the scene is quite similar to what we're familiar with in classic Raymarching; the changes we'll need to make are located in our scene function and our raymarching loop:
Example of SDF used in Volumetric Raymarching

```glsl
float sdSphere(vec3 p, float radius) {
  return length(p) - radius;
}

float scene(vec3 p) {
  float distance = sdSphere(p, 1.0);
  // Return the opposite of the distance: positive inside, negative outside
  return -distance;
}
```
Volumetric Raymarching loop with constant step size

```glsl
const float MARCH_SIZE = 0.08;

float raymarch(vec3 rayOrigin, vec3 rayDirection) {
  float depth = 0.0;
  vec3 p = rayOrigin + depth * rayDirection;

  for (int i = 0; i < MAX_STEPS; i++) {
    float density = scene(p);

    // ...draw the volume based on its density...

    // March forward at a constant step size, regardless of the SDF
    depth += MARCH_SIZE;
    p = rayOrigin + depth * rayDirection;
  }

  // ...
}
```
Now comes another question: what do we draw once our density is positive to represent a volume?
For this first example, we'll keep things simple and play with the alpha channel of our colors to make it proportional to the density of our volume: the denser our object gets as we march into it, the more opaque/darker it will be.
Simple Volumetric Raymarching loop

```glsl
const float MARCH_SIZE = 0.08;

vec4 raymarch(vec3 rayOrigin, vec3 rayDirection) {
  float depth = 0.0;
  vec3 p = rayOrigin + depth * rayDirection;
  vec4 res = vec4(0.0);

  for (int i = 0; i < MAX_STEPS; i++) {
    float density = scene(p);

    // We only draw the density if it's greater than 0
    if (density > 0.0) {
      vec4 color = vec4(mix(vec3(1.0, 1.0, 1.0), vec3(0.0, 0.0, 0.0), density), density);
      color.rgb *= color.a;
      res += color * (1.0 - res.a);
    }

    depth += MARCH_SIZE;
    p = rayOrigin + depth * rayDirection;
  }

  return res;
}
```
If we try to render this code in our React Three Fiber canvas, we should get the following result 👀
Drawing Fluffy Raymarched Clouds
We now know and applied the basics of Volumetric Raymarching. So far, we only rendered a simple volumetric sphere with constant density as we march through the volume, which is a good start. We can now try using that simple scene as a foundation to render something more interesting: clouds!
Noisy Volume
Going from our simple SDF of a sphere to a cloud consists of drawing it with a bit more noise. Clouds don't have a uniform shape, nor do they have a uniform density, so we need to introduce some organic randomness through noise in our Raymarching loop. If you read some of my previous articles, you should already be familiar with the concepts of:
- Noise, Perlin noise, and value noise derivatives
- Fractal Brownian Motion, or FBM
- Texture based noise
To generate raymarched landscapes, we used a noise texture, noise derivatives, and FBM to get a detailed organic result. We'll rely on some of these concepts to create organic randomness and obtain a cloud from our SDF ☁️.
Noise function for Raymarched landscape

```glsl
vec3 noised(in vec2 x) {
  vec2 p = floor(x);
  vec2 f = fract(x);

  vec2 u = f * f * (3. - 2. * f);

  float a = textureLod(uTexture, (p + vec2(.0, .0)) / 256., 0.).x;
  float b = textureLod(uTexture, (p + vec2(1.0, .0)) / 256., 0.).x;
  float c = textureLod(uTexture, (p + vec2(.0, 1.0)) / 256., 0.).x;
  float d = textureLod(uTexture, (p + vec2(1.0, 1.0)) / 256., 0.).x;

  float noiseValue = a + (b - a) * u.x + (c - a) * u.y + (a - b - c + d) * u.x * u.y;
  vec2 noiseDerivative = 6. * f * (1. - f) * (vec2(b - a, c - a) + (a - b - c + d) * u.yx);

  return vec3(noiseValue, noiseDerivative);
}
```
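As a sanity check on the `noiseValue` line above: that single-expression blend is algebraically identical to two nested linear interpolations of the four corner samples a, b, c, d. A quick Python verification, with arbitrary corner values:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def bilinear_expanded(a, b, c, d, ux, uy):
    # The single-expression form used in the shader
    return a + (b - a) * ux + (c - a) * uy + (a - b - c + d) * ux * uy

def bilinear_nested(a, b, c, d, ux, uy):
    # Two nested lerps: across x first, then across y
    return lerp(lerp(a, b, ux), lerp(c, d, ux), uy)

# Arbitrary corner samples and interpolation factors
a, b, c, d = 0.2, 0.9, 0.4, 0.7
ux, uy = 0.3, 0.8
print(abs(bilinear_expanded(a, b, c, d, ux, uy) - bilinear_nested(a, b, c, d, ux, uy)) < 1e-12)  # True
```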
For clouds, our noise function looks a bit different:
Noise function for Volumetric clouds

```glsl
float noise(vec3 x) {
  vec3 p = floor(x);
  vec3 f = fract(x);

  vec3 u = f * f * (3. - 2. * f);

  // Sample a 2D slice of the noise texture from a 3D position
  vec2 uv = (p.xy + vec2(37.0, 239.0) * p.z) + u.xy;
  vec2 tex = textureLod(uNoise, (uv + 0.5) / 256.0, 0.0).yx;

  return mix(tex.x, tex.y, u.z) * 2.0 - 1.0;
}
```
To tell you the truth, I saw this function in many Shadertoy demos without necessarily seeing a credited author or even a link to an explanation; I kept using it throughout my work as it still yielded a convincing cloud noise pattern. Here's an attempt at gathering some of its specificities from my own understanding:
- Clouds are 3D structures, so our function takes in a vec3 as input: a point in space within our cloud.
- The texture lookup differs from its landscape counterpart: we're sampling it as a 2D slice from a 3D position. The vec2(37.0, 239.0) * p.z looks a bit arbitrary to me, but from what I gathered, it allows for more variation in the resulting noise.
- We then mix two noise values from our texture lookup based on the z value to generate a smooth noise pattern and rescale it within the [-1, 1] range.
Applying this noise along with a Fractal Brownian Motion is pretty similar to what we're used to with Raymarched landscapes:
Fractal Brownian Motion applied to our Volumetric Raymarching scene

```glsl
float fbm(vec3 p) {
  // Animate the noise over time to make the cloud evolve
  vec3 q = p + uTime * 0.5 * vec3(1.0, -0.2, -1.0);
  float f = 0.0;
  float scale = 0.5;

  for (int i = 0; i < 6; i++) {
    f += scale * noise(q);
    // Classic FBM octave update: raise the frequency, halve the amplitude
    q *= 2.0;
    scale *= 0.5;
  }

  return f;
}

float scene(vec3 p) {
  float distance = sdSphere(p, 1.0);
  float f = fbm(p);
  return -distance + f;
}
```
If we apply the code above to our previous demo, we do get something that starts to look like a cloud 👀:
Adding light
Once again, we're just "starting" to see something approaching our goal, but a crucial ingredient is missing to make our cloud feel more cloudy: light.
The demo we just saw in the previous part lacks depth and shadows and thus doesn't feel very realistic overall, and that's due to the lack of diffuse light.
To add light to our cloud and consequently obtain better shadows, one might want to apply the same lighting we used in standard Raymarching scenes:
- Calculate the normal of each sample point using our scene function
- Use the dot product of the normal and the light direction
Diffuse lighting in Raymarched scene using normals

```glsl
vec3 getNormal(vec3 p) {
  vec2 e = vec2(.01, .0);
  // Estimate the gradient of the scene by sampling around p
  vec3 n = scene(p) - vec3(
    scene(p - e.xyy),
    scene(p - e.yxy),
    scene(p - e.yyx)
  );
  return normalize(n);
}

void main() {
  // ...
  vec3 ro = vec3(0.0, 0.0, 5.0);
  vec3 rd = normalize(vec3(uv, -1.0));
  vec3 lightPosition = vec3(1.0);

  float d = raymarch(ro, rd);
  vec3 p = ro + rd * d;

  vec3 color = vec3(0.0);

  if (d < MAX_DIST) {
    vec3 normal = getNormal(p);
    vec3 lightDirection = normalize(lightPosition - p);

    float diffuse = max(dot(normal, lightDirection), 0.0);
    color = vec3(1.0, 1.0, 1.0) * diffuse;
  }

  gl_FragColor = vec4(color, 1.0);
}
```
That would work in theory, but it's not the optimal choice for Volumetric Raymarching:
- The getNormal function requires multiple sample points to estimate the "gradient" in every direction. In the code above, we need 4, but there are code snippets that require 6 for a more accurate result.
- Our volumetric raymarching loop is more resource-intensive: we're walking at a constant step size along our ray to sample the density of our volume.
Thus, we need another method or approximation for our diffuse light. Luckily, Inigo Quilez presents a way to solve this problem in his article on directional derivatives. Instead of having to sample our density in every direction like getNormal, this method simplifies the problem by sampling the density at our sample point p and at an offset in the direction of the light, then taking the difference between those values to approximate how much the light scatters within our volume.
In the diagram above, you can see that we're sampling our density at p1 and at another point p1' that is a bit further along the light ray:
- If the density increases along that path, the volume gets denser, and light will scatter more.
- If the density gets smaller, our cloud is less thick, and thus the light will scatter less.
This method only requires 2 sample points and consequently requires fewer resources to give us a good approximation of how the light behaves with the volume around p1.
We can apply this diffuse formula to our demo as follows:
Diffuse lighting using directional derivatives

```glsl
for (int i = 0; i < MAX_STEPS; i++) {
  float density = scene(p);

  if (density > 0.0) {
    // Directional derivative: density difference toward the light source
    float diffuse = clamp((scene(p) - scene(p + 0.3 * sunDirection)) / 0.3, 0.0, 1.0);
    vec3 lin = vec3(0.60, 0.60, 0.75) * 1.1 + 0.8 * vec3(1.0, 0.6, 0.3) * diffuse;
    vec4 color = vec4(mix(vec3(1.0, 1.0, 1.0), vec3(0.0, 0.0, 0.0), density), density);
    color.rgb *= lin;
    color.rgb *= color.a;
    res += color * (1.0 - res.a);
  }

  depth += MARCH_SIZE;
  p = rayOrigin + depth * rayDirection;
}
```
That is, once again, very similar to what we were doing in standard Raymarching, except that now we have to include it inside the Raymarching loop: since we're sampling a volume, we have to run the calculation multiple times throughout the volume as the density may vary, whereas a surface required only one diffuse lighting computation (at the surface).
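To make the approximation concrete outside of the shader, here's a small Python sketch of the directional derivative trick. The 0.3 offset matches the GLSL snippet above; the density function is a made-up fuzzy sphere, purely for illustration:

```python
import math

def density(p):
    """Toy cloud density: a fuzzy unit sphere (made up for illustration)."""
    d = math.sqrt(sum(c * c for c in p)) - 1.0
    return max(-d, 0.0)

def clamp(x, lo, hi):
    return max(lo, min(x, hi))

def diffuse(p, sun_direction, offset=0.3):
    """Approximate diffuse term: density difference toward the light."""
    toward_light = tuple(c + offset * s for c, s in zip(p, sun_direction))
    return clamp((density(p) - density(toward_light)) / offset, 0.0, 1.0)

sun = (1.0, 0.0, 0.0)
# A point on the lit side (density drops toward the sun) receives more light
# than a point on the shadowed side (density increases toward the sun).
lit = diffuse((0.8, 0.0, 0.0), sun)
shadowed = diffuse((-0.8, 0.0, 0.0), sun)
print(lit > shadowed)  # True
```

Only two density evaluations per sample point, compared to four or more for a normal-based gradient.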
You can observe the difference between our cloud without lighting and with diffuse lighting below 👇


And here is the demo that includes the idea and code we simply launched 👀:
Morphing clouds
Let's take a little break to tweak our scene and have some fun with what we've built so far! Despite the differences between standard Raymarching and its volumetric counterpart, there are still many SDF-related concepts you can apply when building cloudscapes.
You can try to make a cloud in fun shapes like a cross or a torus, or even better, try to make it morph from one form to another over time:
Blending SDFs to morph volumetric clouds into different shapes

```glsl
mat2 rotate2D(float a) {
  float s = sin(a);
  float c = cos(a);
  return mat2(c, -s, s, c);
}

float nextStep(float t, float len, float smo) {
  float tt = mod(t += smo, len);
  float stp = floor(t / len) - 1.0;
  return smoothstep(0.0, smo, tt) + stp;
}

float scene(vec3 p) {
  vec3 p1 = p;
  p1.xz *= rotate2D(-PI * 0.1);
  p1.yz *= rotate2D(PI * 0.3);

  float s1 = sdTorus(p1, vec2(1.3, 0.9));
  float s2 = sdCross(p1 * 2.0, 0.6);
  float s3 = sdSphere(p, 1.5);
  float s4 = sdCapsule(p, vec3(-2.0, -1.5, 0.0), vec3(2.0, 1.5, 0.0), 1.0);

  float t = mod(nextStep(uTime, 3.0, 1.2), 4.0);

  float distance = mix(s1, s2, clamp(t, 0.0, 1.0));
  distance = mix(distance, s3, clamp(t - 1.0, 0.0, 1.0));
  distance = mix(distance, s4, clamp(t - 2.0, 0.0, 1.0));
  distance = mix(distance, s1, clamp(t - 3.0, 0.0, 1.0));

  float f = fbm(p);
  return -distance + f;
}
```
This demo is a reproduction of this volumetric rendering related Shadertoy scene. I really like this creation because the result is very organic, and it gives the impression that the cloud is rolling into its next shape naturally.
You can also try to render other shapes and combinations: there are many creative compositions to try!
Performance optimization
You may notice that running the scenes we built so far can make your computer sound like a jet engine at high resolution, or at least not look as smooth as they could. Luckily, we can do something about it and use some performance optimization techniques to strike the right balance between FPS count and output quality.
Blue noise dithering
One of the main performance pitfalls of our current raymarched cloudscape scene is due to:
- the number of steps we have to perform to sample our volume and the small marchSize
- some heavy computation we have to do within our loop, like our directional derivative or FBM.
This issue will only get worse as we attempt more computations to achieve a more physically accurate output in the next part of this article.
One of the first things we could do to make this scene more efficient would be to reduce the number of steps we perform when sampling our cloud and increase the step size. However, if we do that on some of our previous examples (I invite you to try), some layering becomes visible, and our volume looks more like some kind of milk soup than a fluffy cloud.
You may have encountered the concept of dithering or some images using dithering styles before. This process can create the illusion of more colors or shades in an image than are available, or be used purely for artistic ends. I recommend reading Dithering on the GPU from Alex Charlton if you want a quick introduction.
In Ray marching fog with blue noise, the author showcases how you can leverage blue noise dithering in your raymarched scene to erase the banding or layering effect due to a lower step count or less granular loop.
This technique leverages a blue noise pattern, which has fewer visible patterns or clumps than other noises and is less noticeable to the human eye, to obtain a random number every time our fragment shader runs. We then introduce that number as an offset at the beginning of the raymarched loop, shifting our sampling start point along the ray for each pixel of our output.
Blue noise dithering introducing an offset in our Raymarching loop

```glsl
uniform sampler2D uBlueNoise;

// ...

vec4 raymarch(vec3 rayOrigin, vec3 rayDirection, float offset) {
  float depth = 0.0;
  // Shift the start of the march along the ray by a per-pixel offset
  depth += MARCH_SIZE * offset;
  vec3 p = rayOrigin + depth * rayDirection;

  // ...
}

void main() {
  // ...
  float blueNoise = texture2D(uBlueNoise, gl_FragCoord.xy / 1024.0).r;
  float offset = fract(blueNoise);

  vec4 res = raymarch(ro, rd, offset);
  // ...
}
```
By introducing some blue noise dithering in our fragment shader, we can erase those artifacts and get a high-quality output while keeping the Raymarching step count low!
However, under some circumstances, the dithering pattern can be quite noticeable. By looking at some other Shadertoy examples, I discovered that introducing a temporal aspect to the blue noise can attenuate this issue.
Temporal blue noise dithering offset

```glsl
float offset = fract(blueNoise + float(uFrame % 32) / sqrt(0.5));
```
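One way to see why this particular offset helps: adding `n / sqrt(0.5)` (i.e. `n * sqrt(2)`) each frame and taking the fractional part is an additive recurrence with an irrational step, which spreads the per-frame offsets almost evenly over [0, 1) across the 32-frame cycle. A quick Python check of that spreading (an illustration, not the article's code):

```python
import math

blue_noise = 0.137  # any per-pixel blue noise value
offsets = sorted(math.modf(blue_noise + n / math.sqrt(0.5))[0] for n in range(32))

# With an irrational step, consecutive offsets never clump together:
# the largest gap over the 32-frame cycle stays small.
gaps = [b - a for a, b in zip(offsets, offsets[1:])]
gaps.append(offsets[0] + 1.0 - offsets[-1])  # wrap-around gap
print(max(gaps) < 0.1)  # True
```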
Here's a before/after comparison of a single frame of our raymarched cloud. I guess the results speak for themselves here 😄.


And here's the demo showcasing our blue noise dithering in action, giving us a softer cloud ⛅:
Upscaling with Bicubic filtering
This second improvement, recommended by @N8Programs, aims to fix some noise artifacts that remain following the introduction of the blue noise dithering to our raymarched scene.
Bicubic filtering is used in upscaling and allows smoothing out some noise patterns while retaining details by calculating the value of a new pixel from its 16 neighboring pixels through a cubic polynomial (Sources).
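As a rough illustration of the idea (this is not N8Programs' Shadertoy implementation), one common cubic weighting uses uniform cubic B-spline weights: in each dimension the fractional position yields 4 weights, and their 4×4 outer products weight the 16 neighboring pixels. The weights sum to 1, which is what keeps the filter from brightening or darkening the image:

```python
def bspline_weights(t):
    """Uniform cubic B-spline weights for the 4 neighbors in one dimension."""
    w0 = (1 - t) ** 3 / 6
    w1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6
    w2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6
    w3 = t ** 3 / 6
    return (w0, w1, w2, w3)

tx, ty = 0.25, 0.6  # fractional position between texels
wx, wy = bspline_weights(tx), bspline_weights(ty)

# 16 weights, one per neighboring pixel in the 4x4 footprint
weights = [x * y for x in wx for y in wy]
print(abs(sum(weights) - 1.0) < 1e-12)  # True
```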
I was lucky to find an implementation of bicubic filtering on Shadertoy made by N8Programs himself! Applying it directly to our existing work, however, isn't that simple. We have to add this improvement as its own step, or pass, in the rendering process, almost like a post-processing effect.
I introduced an easy way to build this kind of pipeline in my article titled Beautiful and mind-bending effects with WebGL Render Targets, where I showcase how you can use Frame Buffer Objects (FBO) to apply post-processing effects to an entire scene, which we can reuse for this use case:
- We render our main raymarched scene in a portal.
- The default scene only contains a fullscreen triangle.
- We render our main scene in a render target.
- We pass the texture of the main scene's render target as a uniform of our bicubic filtering material.
- We use the bicubic filtering material as the material for our fullscreen triangle.
- Our bicubic filtering will take our noisy raymarched scene as a texture uniform and output the smoothed-out scene.
Here's a quick comparison of our scene before and after applying the bicubic filtering:


The full implementation is a bit long and features concepts I already went through in my render target focused blog post, so I invite you to take a look at it on your own time in the demo below:
Leveraging render targets allowed me to play more with the resolution of the original raymarched scene. You can see a little selector that lets you pick the resolution at which we render our raymarched cloud. You'll notice that there aren't a lot of differences between 1x and 0.5x, which is great: we can squeeze out more FPS without sacrificing output quality 🎉.
Physically accurate Clouds
So far, we've managed to build really beautiful cloudscapes with Volumetric Raymarching using some simple techniques and combining the right colors. The resulting scenes are satisfying enough and give the illusion of big, dense clouds, but what if we wanted a more realistic output?
I spent quite some time digging through talks, videos, and articles on how game engines solve the problem of physically accurate clouds and all the techniques involved. It has been a journey, and I wanted to dedicate this last section to this topic because I find the subject fascinating: from a few physical principles of actual real-life clouds, we can render clouds in WebGL using Volumetric Raymarching!
Beer's Law
I already introduced the concept of Beer's Law in my Raymarching blog post as a way to render fog in the distance of a scene. It states that the intensity of light passing through a transparent medium is exponentially related to the distance it travels: the further into the medium light propagates, the more it is absorbed. The formula for Beer's Law is as follows: I = I0 * exp(-α * d), where α is the absorption or attenuation coefficient describing how "thick" or "dense" the medium is. In our demos, we'll use an absorption coefficient of 0.9, though I'd invite you to try different values so you can see the impact of this number on the resulting render.
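To get a quick numeric feel for the formula, here's a small Python sketch with I0 = 1 and the α = 0.9 we use in the demos:

```python
import math

ABSORPTION_COEFFICIENT = 0.9

def beers_law(dist, absorption=ABSORPTION_COEFFICIENT):
    """Fraction of light remaining after traveling `dist` through the medium."""
    return math.exp(-dist * absorption)

for d in (0.0, 0.5, 1.0, 2.0, 4.0):
    print(f"distance {d:>4}: transmittance {beers_law(d):.3f}")
# distance  0.0: transmittance 1.000
# distance  0.5: transmittance 0.638
# distance  1.0: transmittance 0.407
# distance  2.0: transmittance 0.165
# distance  4.0: transmittance 0.027
```

At distance 0 all the light survives; a couple of units into the medium, most of it has been absorbed, which is exactly the falloff we want for a dense cloud.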
We can use this formula in our GLSL code and modify the Raymarching loop to use it instead of the "hacky" transparency trick we used in the first part:
Using Beer's Law to calculate and return the accumulated light energy going through the cloud

```glsl
#define ABSORPTION_COEFFICIENT 0.9

float BeersLaw(float dist, float absorption) {
  return exp(-dist * absorption);
}

const vec3 SUN_POSITION = vec3(1.0, 0.0, 0.0);
const float MARCH_SIZE = 0.16;

float raymarch(vec3 rayOrigin, vec3 rayDirection, float offset) {
  float depth = 0.0;
  depth += MARCH_SIZE * offset;
  vec3 p = rayOrigin + depth * rayDirection;
  vec3 sunDirection = normalize(SUN_POSITION);

  float totalTransmittance = 1.0;
  float lightEnergy = 0.0;

  for (int i = 0; i < MAX_STEPS; i++) {
    float density = scene(p);

    if (density > 0.0) {
      float transmittance = BeersLaw(density * MARCH_SIZE, ABSORPTION_COEFFICIENT);
      float luminance = density;

      totalTransmittance *= transmittance;
      lightEnergy += totalTransmittance * luminance;
    }

    depth += MARCH_SIZE;
    p = rayOrigin + depth * rayDirection;
  }

  return lightEnergy;
}
```
In the code snippet above:
- We gutted the raymarching loop, so it now relies on a more physically based property: Beer's Law.
- We changed the interface of our function: instead of returning a full color, it now returns a float representing the amount of light, or light energy, going through the cloud.
- As we march through the volume, we accumulate the resulting transmittance. The deeper we go, the less light we add.
- We return the resulting lightEnergy.
The demo below showcases what using Beer's Law yields in our Raymarching loop 👀
The resulting cloud looks a bit strange, which is, once again, due to the lack of a proper lighting model.
Sampling light
Our new cloud doesn't interact with light right now. You can try changing the SUN_POSITION vector: the resulting render will remain the same. We not only need a lighting model, but a physically accurate one.
For that, we can try to compute how much light has been absorbed for each sample point of our Raymarching loop by:
- Starting a dedicated nested Raymarching loop that goes from the current sample point towards the light source (direction of the light)
- Sampling the density and applying Beer's Law like we just did
The diagram below illustrates this technique to make it a bit easier to understand:
The code snippet below is one of many implementations of this technique. We'll use this one going forward:
Dedicated nested raymarching loop to sample the light received at a given sample point

```glsl
#define MAX_STEPS_LIGHTS 6
#define ABSORPTION_COEFFICIENT 0.9

const vec3 SUN_POSITION = vec3(1.0, 0.0, 0.0);
const float MARCH_SIZE = 0.16;

float lightmarch(vec3 position, vec3 rayDirection) {
  vec3 lightDirection = normalize(SUN_POSITION);
  float totalDensity = 0.0;
  float marchSize = 0.03;

  for (int step = 0; step < MAX_STEPS_LIGHTS; step++) {
    position += lightDirection * marchSize * float(step);

    float lightSample = scene(position, true);
    totalDensity += lightSample;
  }

  float transmittance = BeersLaw(totalDensity, ABSORPTION_COEFFICIENT);
  return transmittance;
}

float raymarch(vec3 rayOrigin, vec3 rayDirection, float offset) {
  float depth = 0.0;
  depth += MARCH_SIZE * offset;
  vec3 p = rayOrigin + depth * rayDirection;
  vec3 sunDirection = normalize(SUN_POSITION);

  float totalTransmittance = 1.0;
  float lightEnergy = 0.0;

  for (int i = 0; i < MAX_STEPS; i++) {
    // The extra argument lets scene() use fewer FBM octaves for light samples
    float density = scene(p, false);

    if (density > 0.0) {
      float lightTransmittance = lightmarch(p, rayDirection);
      float luminance = density;

      totalTransmittance *= lightTransmittance;
      lightEnergy += totalTransmittance * luminance;
    }

    depth += MARCH_SIZE;
    p = rayOrigin + depth * rayDirection;
  }

  return lightEnergy;
}
```
Because of this nested loop, the algorithmic complexity of our Raymarching loop just increased, so we'll need to define a relatively low number of steps to sample our light, while also calculating a less precise density by reducing the number of octaves in our FBM, to preserve a decent frame rate (that's one easy win I implemented to avoid dropping too many frames).
All these little tweaks and performance considerations have been taken into account in the demo below:
Anisotropic scattering and phase function
Until now, we assumed that light gets distributed equally in every direction as it propagates through the cloud. In reality, light gets scattered in different directions with different intensities due to water droplets. This phenomenon is called Anisotropic scattering (vs. Isotropic when light scatters evenly), and to have a realistic cloud, we can try to take this into account within our Raymarching loop.
To simulate Anisotropic scattering in our cloud scene for each sample point of a given light source, we can use a phase function. A common one is the Henyey-Greenstein phase function, which I encountered in pretty much all the examples of physically accurate Volumetric Raymarching I could find.
The GLSL implementation of this phase function looks as follows:
Implementation of the Henyey-Greenstein phase function

```glsl
float HenyeyGreenstein(float g, float mu) {
  float gg = g * g;
  return (1.0 / (4.0 * PI)) * ((1.0 - gg) / pow(1.0 + gg - 2.0 * g * mu, 1.5));
}
```
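A property worth checking: a phase function is a probability density over directions, so integrating Henyey-Greenstein over the whole sphere should give 1 for any anisotropy g in (-1, 1). A quick numeric check in Python:

```python
import math

def henyey_greenstein(g, mu):
    gg = g * g
    return (1.0 / (4.0 * math.pi)) * ((1.0 - gg) / (1.0 + gg - 2.0 * g * mu) ** 1.5)

def integrate_over_sphere(g, n=100_000):
    # Integrate over mu = cos(theta) in [-1, 1] with the midpoint rule;
    # the 2*pi factor accounts for the azimuthal direction.
    total = 0.0
    dmu = 2.0 / n
    for i in range(n):
        mu = -1.0 + (i + 0.5) * dmu
        total += henyey_greenstein(g, mu) * 2.0 * math.pi * dmu
    return total

for g in (0.0, 0.3, 0.8):
    print(f"g = {g}: integral = {integrate_over_sphere(g):.4f}")  # ~1.0000 for each g
```

In other words, the phase function redistributes the light (forward or backward depending on g); it never creates or destroys energy.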
We now have to introduce the result of this new function in our Raymarching loop by multiplying it with the density at a given sample point, and what we obtain is more realistic lighting for our cloud, especially when the light source moves around.
Introducing the Henyey-Greenstein phase function inside our Raymarching loop

```glsl
float raymarch(vec3 rayOrigin, vec3 rayDirection, float offset) {
  float depth = 0.0;
  depth += MARCH_SIZE * offset;
  vec3 p = rayOrigin + depth * rayDirection;
  vec3 sunDirection = normalize(SUN_POSITION);

  float totalTransmittance = 1.0;
  float lightEnergy = 0.0;

  // The phase only depends on the view and light directions,
  // so we can compute it once, outside the loop
  float phase = HenyeyGreenstein(SCATTERING_ANISO, dot(rayDirection, sunDirection));

  for (int i = 0; i < MAX_STEPS; i++) {
    float density = scene(p, false);

    if (density > 0.0) {
      float lightTransmittance = lightmarch(p, rayDirection);
      float luminance = density * phase;

      totalTransmittance *= lightTransmittance;
      lightEnergy += totalTransmittance * luminance;
    }

    depth += MARCH_SIZE;
    p = rayOrigin + depth * rayDirection;
  }

  return lightEnergy;
}
```


The final demo of this article showcases our scene with all the techniques we just introduced.
The result looks really good, although I'll admit I had to add an extra value term to my light energy formula so the cloud wouldn't simply "fade away" when dense parts ended up in the shade.
Extra value added to the luminance formula

```glsl
float luminance = 0.025 + density * phase;
```
The need for a hack probably highlights some issues with my code, most likely due to how I use the resulting light energy value returned by the Raymarching loop, or an absorption coefficient that's a bit too high. Not sure. If you find any blatantly wrong assumptions in my code, please let me know so I can make the necessary edits.
Some other optimizations are possible to make the cloud look fluffier and denser, like using the Beer's Powder approximation (page 64), but it was mentioned to me that those are just used for aesthetic reasons and are not actually physically based (I also honestly couldn't figure out how to apply it without significantly altering my MAX_STEPS, MAX_STEPS_LIGHTS, and marchSize variables 😅, and the result was still not great).
@MaximeHeckel Note that beer-powder is a non-physical approximation. Maybe see:
https://t.co/LlHxW5x5sb
Conclusion
We learned several ways to render cloud-like volumes with Raymarching throughout this article, and considering that a few weeks prior, I wasn't even able to wrap my head around the concept of Volumetric Raymarching, I'm happy with the result and proud of myself given how complex and daunting this topic can be. I was also pleasantly surprised by the ability to apply some physics principles and port techniques commonly used in triple-A video game productions to WebGL to achieve realistic looking clouds.
With that, I'm excited to try combining raymarched clouds with terrains, like the ones introduced in my Raymarching article, or take on even more complex challenges like rendering a planet with realistic atmospheric scattering. Another idea I had was to build a raymarched galaxy, since we can simplify it to a big cloud in space, and many of the physics principles introduced in this article should still apply and yield beautiful renders.
I hope this article will inspire you for your own Raymarching endeavors and that it helped make this seemingly hard-to-grasp concept of Volumetric rendering a bit more welcoming 🙂.