Good answer. Especially if it could degrade gracefully at low performance budgets, without temporal artifacts. E.g., have ray-surface hits act like point projectors with approximate visibility, so indirect lighting splashes brightness across a soft region instead of creating a hotspot.
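Something like this, maybe. A toy numpy sketch, every name made up; visible() stands in for whatever cheap occlusion test the engine already has, and the hit's own BRDF/cosine is assumed folded into hit_radiance:

    import numpy as np

    def splat_projector(hit_pos, hit_radiance, origin_px, gbuf_pos, gbuf_nrm,
                        frame, visible, radius_px=24):
        # Treat one ray-surface hit as a point projector: splash its radiance
        # over a soft disk of pixels around the pixel whose camera ray found it.
        # Visibility is approximate -- one coarse test for the whole disk -- so
        # a low budget just means blurrier light, not flickering hotspots.
        h, w, _ = gbuf_pos.shape
        y0, x0 = origin_px
        if not visible(hit_pos, gbuf_pos[y0, x0]):
            return
        for y in range(max(0, y0 - radius_px), min(h, y0 + radius_px + 1)):
            for x in range(max(0, x0 - radius_px), min(w, x0 + radius_px + 1)):
                d2 = (y - y0) ** 2 + (x - x0) ** 2
                if d2 > radius_px ** 2:
                    continue
                falloff = 1.0 - (d2 ** 0.5) / radius_px   # soft kernel, no hard edge
                to_hit = hit_pos - gbuf_pos[y, x]
                dist2 = max(float(np.dot(to_hit, to_hit)), 1e-4)
                cos_r = max(float(np.dot(gbuf_nrm[y, x], to_hit)) / dist2 ** 0.5, 0.0)
                frame[y, x] += hit_radiance * cos_r / dist2 * falloff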
I think there’s a general solution to noise that’s gone unused.
Okay, Metropolis light transport jiggles the steps in each bright path to find more low-probability / high-intensity paths. Great for caustics. But it's only applied to individual pixels: it samples a lot of light coming into one spot. We want the inverse of that.
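For anyone who hasn't looked at it, the core of MLT is just Metropolis-Hastings over paths. A toy sketch of the jiggling, where path_brightness is a stand-in for a real path tracer driven by these random numbers, not any actual API:

    import random

    def mlt_chain(path_brightness, steps=100000, sigma=0.02, dims=8):
        # Metropolis-Hastings over the random numbers that drive one path.
        # The chain lingers on paths in proportion to their brightness, so
        # rare-but-bright paths (caustics) get found and re-jiggled a lot.
        u = [random.random() for _ in range(dims)]
        f = path_brightness(u)
        for _ in range(steps):
            # Jiggle every step of the current path a little (wrap to stay in [0,1)).
            v = [(x + random.gauss(0.0, sigma)) % 1.0 for x in u]
            g = path_brightness(v)
            if f <= 0.0 or random.random() < min(1.0, g / f):
                u, f = v, g
            # A real renderer would splat the current (and rejected) path into
            # its pixel here, scaled by a brightness estimate from a plain
            # path-tracing pass; that bookkeeping is what keeps it unbiased.
            yield u, f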
When light hits a point, it can scatter off in any direction, with its brightness adjusted according to probability. So… every point is a light source. It’s not uniform. But it is real light. You could test visibility from every hit along any light path, to every point onscreen, and it would remain a true unbiased render that would eventually converge.
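The dumb brute-force version of that, just to pin it down. A sketch with made-up names: scatter_weight() stands in for the BRDF-times-cosine at the vertex, visible() for a shadow ray:

    import numpy as np

    def gather_from_light_path(path_vertices, gbuf_pos, gbuf_nrm, frame,
                               scatter_weight, visible):
        # Every vertex of a traced path re-emits what it received, shaped by
        # its scattering probability -- so treat each one as a (non-uniform)
        # light source and connect it to every point onscreen. O(vertices x
        # pixels) with a shadow ray per pair, but it just keeps converging.
        h, w, _ = gbuf_pos.shape
        for vpos, vnrm, vrad in path_vertices:   # (position, normal, radiance)
            for y in range(h):
                for x in range(w):
                    to_px = gbuf_pos[y, x] - vpos
                    dist2 = float(np.dot(to_px, to_px))
                    if dist2 < 1e-6 or not visible(vpos, gbuf_pos[y, x]):
                        continue
                    d = to_px / dist2 ** 0.5
                    cos_p = max(float(np.dot(gbuf_nrm[y, x], -d)), 0.0)
                    frame[y, x] += vrad * scatter_weight(vpos, vnrm, d) \
                                   * cos_p / dist2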
The sensible reduction of that is to test visibility in a cone from the first bounce offscreen. Like if you're looking at a wall lit by the moon, the path goes eye, wall, moon, sun. Metropolis would jitter to different points on the moon to light the bejeezus out of that one spot on the wall. I'm proposing to instead check moon-to-wall visibility for that exact point on the moon, but for nearby spots on that wall. (Deferred rendering can skip testing between the wall and your eye. Pick visible spots.)
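Concretely, something like this: the same splat shape as the projector sketch above, but the exact moon point gets a real moon-to-wall shadow ray per nearby pixel (still all made-up names, deferred G-buffer assumed):

    import numpy as np

    def splat_first_offscreen_bounce(moon_pos, moon_radiance, origin_px,
                                     gbuf_pos, gbuf_nrm, frame, visible,
                                     radius_px=32):
        # Keep the exact first off-screen vertex (the "moon" point) and reuse
        # it for nearby onscreen pixels around the pixel whose camera ray found
        # it. Only the moon-to-wall segment needs a shadow ray per pixel; the
        # wall-to-eye segment is already known visible from the G-buffer.
        h, w, _ = gbuf_pos.shape
        y0, x0 = origin_px
        for y in range(max(0, y0 - radius_px), min(h, y0 + radius_px + 1)):
            for x in range(max(0, x0 - radius_px), min(w, x0 + radius_px + 1)):
                if not visible(moon_pos, gbuf_pos[y, x]):
                    continue
                to_moon = moon_pos - gbuf_pos[y, x]
                dist2 = max(float(np.dot(to_moon, to_moon)), 1e-6)
                cos_w = max(float(np.dot(gbuf_nrm[y, x], to_moon)) / dist2 ** 0.5, 0.0)
                frame[y, x] += moon_radiance * cos_w / dist2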
One spot on the moon would not accurately recreate soft moonlight, but Blender's newer Eevee renderer proves that a dozen can suffice.
One sample per pixel already gives you a million of them.
You don’t need to go from all of them, to every point onscreen, to get some pretty smooth results. Basically it’s “instant radiosity” but with anisotropic virtual point sources. It’s just a shame shadowing still needs to be done (or faked) or else applying them would be doable in screen space.
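A sketch of that last part (made-up record layout; the Phong-style lobe is only a stand-in for storing the real BRDF at the hit): each pixel connects to a random handful of the virtual sources and scales up by N/k, which stays correct on average without the all-to-all cost.

    import random
    import numpy as np

    def shade_pixel(px_pos, px_nrm, vpls, visible, k=16):
        # "Instant radiosity" flavour: vpls is the big pile of virtual point
        # sources built from path hits, each with an anisotropic emission lobe.
        # Connect this pixel to only k of them, picked at random, and scale by
        # N/k -- correct in expectation, far cheaper than all-to-all.
        n = len(vpls)
        m = min(k, n)
        total = np.zeros(3)
        for vpos, vnrm, vrad, lobe_dir, lobe_exp in random.sample(vpls, m):
            if not visible(vpos, px_pos):        # the part you'd love to skip
                continue
            to_px = px_pos - vpos
            dist2 = max(float(np.dot(to_px, to_px)), 1e-4)  # clamp the VPL singularity
            d = to_px / dist2 ** 0.5
            # Anisotropic emission: a Phong-style lobe around the direction the
            # path actually scattered, standing in for the BRDF at that hit.
            lobe = max(float(np.dot(lobe_dir, d)), 0.0) ** lobe_exp
            cos_v = max(float(np.dot(vnrm, d)), 0.0)
            cos_p = max(float(np.dot(px_nrm, -d)), 0.0)
            total += vrad * lobe * cos_v * cos_p / dist2
        return total * (n / m) if m else total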
Ez ray tracing, because ray tracing is cool.