
Procedural atmospheric scattering #16314

Draft · wants to merge 49 commits into main

Conversation

@ecoskey (Contributor) commented Nov 9, 2024

Implement procedural atmospheric scattering from Sebastien Hillaire's 2020 paper. This approach should scale well even down to mobile hardware, and is physically accurate.

TODO

  • implement multiscattering (followup?)
  • handle scene units better
  • fix over-exposure
  • check physical correctness
  • check that coordinates are correct (lat-long -> ray direction specifically, and that signs for depth + view dir are correct)
  • integrate with pcwalton's volumetrics code (could have volumetric shadows with accurate sky color!!) (definitely in a followup)
  • draw sun disks (get radius from shadow softness value?)
  • handle hdr/non-hdr view texture
  • set camera depth texture usages automatically

Showcase

Check the example in examples/3d/atmosphere.rs. Currently it's heavily over-exposed when applied to the camera, but in a graphics debugger the LUTs themselves look fine.

@alice-i-cecile added the M-Needs-Release-Note and S-Waiting-on-Author labels Nov 9, 2024

github-actions bot commented Nov 9, 2024

The generated examples/README.md is out of sync with the example metadata in Cargo.toml or the example readme template. Please run cargo run -p build-templated-pages -- update examples to update it, and commit the file change.

@pcwalton (Contributor) commented Nov 9, 2024

Wow, this is very impressive! I'll review.

@aevyrie (Member) commented Nov 9, 2024

Considering the Atmosphere takes parameters about the inner and outer radius of the atmosphere, maybe it would make sense if that component was on another entity? That would allow you to compute the camera's view within (or outside) the atmosphere based on the position of the atmosphere and the camera using their GlobalTransforms. Excited to see this implemented!

@BenjaminBrienen added the D-Complex label Nov 10, 2024

@pcwalton (Contributor) commented:

Would you like reviews/feedback on this at this stage or is it too early?

fn sample_sky_view_lut(ray_dir: vec3<f32>) -> vec3<f32> {
    let lat_long = ray_dir_to_lat_long(ray_dir);
    let uv = sky_view_lut_lat_long_to_uv(lat_long.x, lat_long.y);
    return textureSampleLevel(sky_view_lut, sky_view_lut_sampler, uv, 0.0).rgb;
}

@pcwalton (Contributor) commented Nov 10, 2024

Inverse trigonometric functions are really slow on GPU (and on CPU). Can you use a cubemap LUT instead? Directional data is basically why cubemap textures exist.
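
For illustration, the direction-indexed lookup being suggested here could look roughly like this (a hedged sketch; sky_view_cubemap and sky_view_cubemap_sampler are hypothetical bindings, not part of this PR):

// Hypothetical sketch: index a cubemap LUT directly with the ray direction,
// avoiding the inverse-trig lat-long conversion entirely.
fn sample_sky_view_cubemap(ray_dir: vec3<f32>) -> vec3<f32> {
    return textureSampleLevel(sky_view_cubemap, sky_view_cubemap_sampler, ray_dir, 0.0).rgb;
}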

@ecoskey (Author) commented Nov 10, 2024

I believe the author did it this way because the latitude is parameterized to put most of the detail near the horizon, which does have a noticeable effect on the final result. There's probably room to speed up my math here, but I'd want to make sure a cubemap could preserve that detail before switching.

@ecoskey (Author) commented Nov 10, 2024

I should check how Hillaire implements the conversion, since apparently it's good enough for Unreal.
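
For context, the kind of non-linear latitude mapping being discussed can be sketched roughly like this (a simplified, hypothetical remap in the spirit of Hillaire's parameterization, not the code in this PR):

// Hypothetical sketch: square-root remap of latitude so most of the LUT's
// vertical resolution is concentrated near the horizon (lat = 0).
const HALF_PI: f32 = 1.5707963;

fn lat_to_sky_view_v(lat: f32) -> f32 {
    let l = lat / HALF_PI;                        // -1..1, with 0 at the horizon
    return 0.5 * (1.0 + sign(l) * sqrt(abs(l))); // denser sampling around v = 0.5
}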

}

// Convert uv [0.0 .. 1.0] coordinate to ndc space xy [-1.0 .. 1.0]
fn uv_to_ndc(uv: vec2<f32>) -> vec2<f32> {

This function already exists under that name in view_transformations.wgsl

@ecoskey (Author) commented Nov 10, 2024

I copied it because I wanted to avoid the import bringing in all the mesh view bindings, which would overlap with atmosphere/bindings.wgsl. Maybe I'm misunderstanding how the imports work though
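
As an aside, naga_oil-style shader imports can pull in individual items rather than a whole module, e.g.:

// Illustrative only; whether this still drags in the mesh view bindings that
// view_transformations.wgsl itself imports is exactly the concern raised here.
#import bevy_pbr::view_transformations::uv_to_ndc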

}

/// Convert a ndc space position to world space
fn position_ndc_to_world(ndc_pos: vec3<f32>) -> vec3<f32> {

This function already exists under that name in view_transformations.wgsl

@ecoskey (Author) commented Nov 10, 2024

I copied it because I wanted to avoid the import bringing in all the mesh view bindings, which would overlap with atmosphere/bindings.wgsl. Maybe I'm misunderstanding how the imports work though

    return FRAC_3_16_PI * (1 + (neg_LdotV * neg_LdotV));
}

fn henyey_greenstein(neg_LdotV: f32) -> f32 {

Probably worth factoring this out somewhere so that both volumetric fog and atmosphere can use it
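
For reference, the standard Henyey-Greenstein phase function that such a shared helper could expose looks roughly like this (the name and explicit g parameter are illustrative, not the signature used in this PR):

// Hypothetical sketch of a shared Henyey-Greenstein phase helper.
// g is the asymmetry parameter in (-1, 1).
const FRAC_1_4PI: f32 = 0.07957747;

fn henyey_greenstein_phase(g: f32, neg_LdotV: f32) -> f32 {
    let denom = 1.0 + g * g - 2.0 * g * neg_LdotV;
    return FRAC_1_4PI * (1.0 - g * g) / (denom * sqrt(denom));
}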

    let horizontal_rotation = mat2x2(cos_long, -sin_long, sin_long, cos_long);
    let horizontal = horizontal_rotation * vec2(-view.world_from_view[2].xz);

    return normalize(vec3(horizontal.x, sin(lat_long.x), horizontal.y));

Yeah, this would definitely be simpler as a cubemap.

@ecoskey (Author) commented Nov 10, 2024

Would you like reviews/feedback on this at this stage or is it too early?

The structure of everything is pretty much final, so feedback on that would be nice :). At this stage I'm mainly trying to make everything physically correct, so things will change there, but if you see something I'm clearly doing wrong, let me know.

@ecoskey (Author) commented Nov 10, 2024

Considering the Atmosphere takes parameters about the inner and outer radius of the atmosphere, maybe it would make sense if that component was on another entity? That would allow you to compute the camera's view within (or outside) the atmosphere based on the position of the atmosphere and the camera using their GlobalTransforms. Excited to see this implemented!

Some of the main assumptions made are that the center of the planet is at position (view.x, -bottom_radius, view.z) and that the camera never leaves the atmosphere. So this implementation doesn't support worlds that are actually spherical, or space scenes where the camera can view the atmosphere from above. In the future, once this gets integrated with pcwalton's raytracing code, it might be worth removing these restrictions; in that case I'd agree that the atmosphere/planet should be its own entity.
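
In other words, under the flat-world assumption described above, the radial distance fed to the LUTs can be derived roughly like this (a hedged sketch with assumed names, not the code in this PR):

// Hypothetical sketch: with the planet center pinned directly below the camera
// at (view.x, -bottom_radius, view.z), the distance from the planet center is
// just the bottom radius plus the camera's height above y = 0.
fn view_radius(bottom_radius: f32, camera_world_pos: vec3<f32>) -> f32 {
    return bottom_radius + camera_world_pos.y;
}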

github-actions bot commented:

You added a new feature but didn't update the readme. Please run cargo run -p build-templated-pages -- update features to update it, and commit the file change.

@aevyrie (Member) commented Nov 11, 2024

Some of the main assumptions made

I thought the Hillaire paper explicitly allows for space views of the atmosphere? Anyway, I totally understand if it is out of scope. Thanks for working on this!

@ecoskey (Author) commented Nov 12, 2024

Some of the main assumptions made

I thought the Hillaire paper explicitly allows for space views of the atmosphere? Anyway, I totally understand if it is out of scope. Thanks for working on this!

It does, but the higher the camera's altitude, the worse the precision of the sky-view and aerial-view LUTs, even within the atmosphere. The paper recommends switching to per-pixel raytracing for high-altitude or space shots, which is where I'd want to integrate pcwalton's volumetrics code. I'd prefer to do that in a followup though :)

Labels: A-Rendering, C-Feature, D-Complex, M-Needs-Release-Note, S-Waiting-on-Author