Procedural atmospheric scattering #16314
Conversation
Wow, this is very impressive! I'll review.
Would you like reviews/feedback on this at this stage, or is it too early?
fn sample_sky_view_lut(ray_dir: vec3<f32>) -> vec3<f32> {
    let lat_long = ray_dir_to_lat_long(ray_dir);
    let uv = sky_view_lut_lat_long_to_uv(lat_long.x, lat_long.y);
    return textureSampleLevel(sky_view_lut, sky_view_lut_sampler, uv, 0.0).rgb;
Inverse trigonometric functions are really slow on GPU (and on CPU). Can you use a cubemap LUT instead? Directional data is basically why cubemap textures exist.
I believe the author did it this way because the latitude is parameterized to put most of the detail near the horizon, which does have a noticeable effect on the final result. There's probably room to speed up my math here, but I'd want to make sure a cubemap could preserve that detail before switching.
I should check how Hillaire implements the conversion, since apparently it's good enough for Unreal.
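To make the parameterization discussion concrete, here's a small CPU-side Rust sketch of a horizon-concentrating latitude mapping, similar in spirit to the square-root warp Hillaire uses for the sky-view LUT. The function names and the exact warp are illustrative assumptions, not the PR's actual code:

```rust
use std::f32::consts::FRAC_PI_2;

// Map latitude in [-π/2, π/2] to a [0, 1] texture coordinate.
// The square root compresses the range near latitude 0 (the horizon),
// so more texels land where the sky varies fastest.
fn latitude_to_v(lat: f32) -> f32 {
    let warped = (lat.abs() / FRAC_PI_2).sqrt();
    0.5 + 0.5 * lat.signum() * warped
}

// Inverse mapping, for sampling the LUT back into a latitude.
fn v_to_latitude(v: f32) -> f32 {
    let c = 2.0 * v - 1.0;
    c.signum() * c * c * FRAC_PI_2
}

fn main() {
    // A small latitude step near the horizon moves the coordinate much
    // further than the same step near the zenith.
    let dv_horizon = latitude_to_v(0.01) - latitude_to_v(0.0);
    let dv_zenith = latitude_to_v(FRAC_PI_2) - latitude_to_v(FRAC_PI_2 - 0.01);
    println!("horizon: {dv_horizon:.4}, zenith: {dv_zenith:.4}");
}
```

This non-uniformity is exactly what a plain cubemap lookup would lose, which seems to be the crux of the trade-off being discussed.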
}
// Convert uv [0.0 .. 1.0] coordinate to ndc space xy [-1.0 .. 1.0]
fn uv_to_ndc(uv: vec2<f32>) -> vec2<f32> {
This function already exists under that name in view_transformations.wgsl
I copied it because I wanted to avoid the import bringing in all the mesh view bindings, which would overlap with atmosphere/bindings.wgsl. Maybe I'm misunderstanding how the imports work, though.
}
/// Convert a ndc space position to world space
fn position_ndc_to_world(ndc_pos: vec3<f32>) -> vec3<f32> {
This function already exists under that name in view_transformations.wgsl
I copied it because I wanted to avoid the import bringing in all the mesh view bindings, which would overlap with atmosphere/bindings.wgsl. Maybe I'm misunderstanding how the imports work, though.
    return FRAC_3_16_PI * (1 + (neg_LdotV * neg_LdotV));
}

fn henyey_greenstein(neg_LdotV: f32) -> f32 {
Probably worth factoring this out somewhere so that both volumetric fog and atmosphere can use it
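For reference, here's a CPU-side Rust sketch of the two phase functions involved: the Rayleigh phase the snippet above computes, and the Henyey-Greenstein phase that could be shared with volumetric fog. The `g` parameter is an assumption on my part; the PR's WGSL version may bake it in differently:

```rust
use std::f32::consts::PI;

// `neg_l_dot_v` is cos(θ) between the light and view directions.

/// Rayleigh phase: 3/(16π) · (1 + cos²θ), normalized over the sphere.
fn rayleigh_phase(neg_l_dot_v: f32) -> f32 {
    3.0 / (16.0 * PI) * (1.0 + neg_l_dot_v * neg_l_dot_v)
}

/// Henyey-Greenstein phase with asymmetry g in (-1, 1):
/// (1 − g²) / (4π · (1 + g² − 2g·cosθ)^(3/2)).
/// g > 0 biases scattering forward, g = 0 is isotropic.
fn henyey_greenstein(neg_l_dot_v: f32, g: f32) -> f32 {
    let denom = 1.0 + g * g - 2.0 * g * neg_l_dot_v;
    (1.0 - g * g) / (4.0 * PI * denom.powf(1.5))
}

fn main() {
    // g = 0 collapses HG to the isotropic phase 1/(4π).
    println!("{:.5}", henyey_greenstein(0.3, 0.0)); // ≈ 0.07958
    println!("{:.5}", rayleigh_phase(0.0));         // ≈ 0.05968
}
```

Since both functions only depend on cos(θ) and a couple of constants, factoring them into a shared WGSL utility file seems low-risk.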
    let horizontal_rotation = mat2x2(cos_long, -sin_long, sin_long, cos_long);
    let horizontal = horizontal_rotation * vec2(-view.world_from_view[2].xz);

    return normalize(vec3(horizontal.x, sin(lat_long.x), horizontal.y));
Yeah, this would definitely be simpler as a cubemap.
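As a sanity check on the lat/long ↔ direction conversions under discussion, here's a round-trip sketch in Rust (y-up convention; the function names mirror the shader's but are illustrative, not the PR's exact math):

```rust
// Decompose a unit direction into (latitude, longitude), y-up:
// latitude from the y component, longitude in the xz plane.
fn ray_dir_to_lat_long(dir: [f32; 3]) -> (f32, f32) {
    let lat = dir[1].asin();
    let long = dir[2].atan2(dir[0]);
    (lat, long)
}

// Rebuild the unit direction from (latitude, longitude).
fn lat_long_to_ray_dir(lat: f32, long: f32) -> [f32; 3] {
    let cos_lat = lat.cos();
    [cos_lat * long.cos(), lat.sin(), cos_lat * long.sin()]
}

fn main() {
    let dir = [0.6f32, 0.0, 0.8]; // already unit length
    let (lat, long) = ray_dir_to_lat_long(dir);
    let back = lat_long_to_ray_dir(lat, long);
    println!("{:.3} {:.3} {:.3}", back[0], back[1], back[2]);
}
```

The `asin`/`atan2` pair here is what makes the lat-long route expensive on GPU, while the reconstruction direction only needs `sin`/`cos`; a cubemap would skip the trigonometry entirely at the cost of the custom latitude distribution.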
The structure of everything is pretty much final, so feedback on that would be nice :). At this stage I'm mainly trying to make everything physically correct, so things will change there, but if you see something I'm clearly doing wrong, let me know!
Some of the main assumptions made are that the center of the planet is at position
You added a new feature but didn't update the readme. Please run
I thought the Hillaire paper explicitly allows for space views of the atmosphere? Anyway, I totally understand if it is out of scope. Thanks for working on this!
It does, but the higher the camera's altitude, the worse the precision of the sky-view and aerial-view LUTs becomes, even within the atmosphere. The paper recommends switching to per-pixel raytracing for high-altitude or space shots, which is where I'd want to integrate pcwalton's volumetrics code. I'd prefer that in a followup, though :)
Implement procedural atmospheric scattering from Sebastien Hillaire's 2020 paper, "A Scalable and Production Ready Sky and Atmosphere Rendering Technique". This approach is physically based and should scale well even down to mobile hardware.
TODO
Showcase
Check the example in examples/3d/atmosphere.rs. It's currently heavily over-exposed when applied to the camera, but in a graphics debugger the LUTs themselves look fine.