Great work!
Fantastic, watching the development closely. If you can share some more details about which method you are implementing, it would be highly appreciated.
That is so mind-blowing, man!
Thanks for the comments!
@tekcor - I plan to add more meat to the documentation of my raytracer once it gets a public beta (alpha?) sometime in a month or two. For now, I can briefly summarize that it's a GPU-based unidirectional pathtracer that uses multiple importance sampling to relatively quickly converge on a final image without being too biased. It currently supports metallic and dielectric materials but I'll add at least glass and maybe some others before I release.
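To give a flavour of what the MIS part buys: the tracer weights light samples against BSDF samples so whichever strategy has the lower variance dominates. Below is a minimal illustrative C++ sketch of the standard power heuristic (Veach's beta = 2); names are hypothetical and this is not the actual shader code:

```cpp
// Power heuristic for multiple importance sampling (illustrative sketch;
// hypothetical names, not the project's actual shader code).
// nf/ng are sample counts, fPdf/gPdf the densities of the two strategies.
float PowerHeuristic(int nf, float fPdf, int ng, float gPdf) {
    float f = nf * fPdf;
    float g = ng * gPdf;
    return (f * f) / (f * f + g * g);
}

// Typical use when combining a light sample with a BSDF sample:
//   Lo += Li * f_bsdf * PowerHeuristic(1, lightPdf, 1, bsdfPdf) / lightPdf;
//   Lo += Li * f_bsdf * PowerHeuristic(1, bsdfPdf, 1, lightPdf) / bsdfPdf;
```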
As for geometry, it supports packing an SDF into a volume texture and tracing against that (from Field Trip even!) or importing a standard OBJ mesh. Material creation, geometry placement, and lights are done through VL and fed via dynamic buffers into the shaders. I also support some basic primitives like boxes and spheres for playing around without using real models.
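For anyone unfamiliar with tracing against a volume-texture SDF, the core loop is plain sphere tracing. Here's a small self-contained C++ sketch; in the renderer the lookup would be a trilinear fetch from the 3D texture, so an analytic unit sphere stands in below purely to keep the snippet runnable:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// Stand-in for the volume-texture lookup: signed distance to a unit sphere.
static float SampleSDF(Vec3 p) {
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - 1.0f;
}

// March along the ray; each sampled distance is a safe step size, because
// the SDF guarantees no surface lies closer than that distance.
static bool SphereTrace(Vec3 origin, Vec3 dir, float maxT, float* tHit) {
    float t = 0.0f;
    for (int i = 0; i < 128 && t < maxT; ++i) {
        float d = SampleSDF(add(origin, mul(dir, t)));
        if (d < 1e-4f) { *tHit = t; return true; }  // converged on the surface
        t += d;
    }
    return false;  // left the field or ran out of steps
}
```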
For lights, I currently support point, spot, and area.
For performance, so far I have a pretty naive BVH system that lets me trace against models with upwards of 100k vertices at near-interactive framerates. Have another screenie fo' free!
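For the curious, the traversal half of such a naive BVH is roughly the classic stack-based scheme below. The node layout and helper names are assumptions for illustration, not the project's actual code, and the leaf-level triangle test is left as a declared helper:

```cpp
#include <algorithm>
#include <cstdint>

struct Vec3 { float v[3]; float operator[](int i) const { return v[i]; } };

struct BVHNode {
    Vec3 bmin, bmax;      // axis-aligned bounding box
    uint32_t leftChild;   // interior: left child index (right is leftChild + 1)
    uint32_t triOffset;   // leaf: first triangle index
    uint16_t triCount;    // 0 marks an interior node
};

// Standard slab test against the node's bounding box.
static bool HitAABB(const Vec3& ro, const Vec3& invDir, const BVHNode& n, float tMax) {
    float t0 = 0.0f, t1 = tMax;
    for (int a = 0; a < 3; ++a) {
        float tA = (n.bmin[a] - ro[a]) * invDir[a];
        float tB = (n.bmax[a] - ro[a]) * invDir[a];
        if (tA > tB) std::swap(tA, tB);
        t0 = std::max(t0, tA);
        t1 = std::min(t1, tB);
        if (t0 > t1) return false;
    }
    return true;
}

// Assumed helper: Moller-Trumbore over a leaf's triangle range, updating
// *tHit with the nearest hit. Body omitted for brevity.
bool HitTriangles(const Vec3& ro, const Vec3& rd, uint32_t first,
                  uint16_t count, float* tHit);

// Iterative traversal; *tHit should start at tMax.
bool Traverse(const BVHNode* nodes, const Vec3& ro, const Vec3& rd,
              float tMax, float* tHit) {
    Vec3 invDir = {{1.0f / rd[0], 1.0f / rd[1], 1.0f / rd[2]}};
    uint32_t stack[64];
    int sp = 0;
    stack[sp++] = 0;  // start at the root
    bool hit = false;
    while (sp > 0) {
        const BVHNode& n = nodes[stack[--sp]];
        if (!HitAABB(ro, invDir, n, tMax)) continue;
        if (n.triCount > 0) {
            hit |= HitTriangles(ro, rd, n.triOffset, n.triCount, tHit);  // leaf
        } else {
            stack[sp++] = n.leftChild;      // interior: visit both children
            stack[sp++] = n.leftChild + 1;
        }
    }
    return hit;
}
```

A median split on the longest axis is about the simplest build strategy that pairs with a traversal like this; SAH would be the usual upgrade.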
If I find the time, I could do a VL adapter for your materials and lights so they could be used with superphysical. It would be nice to have a scene rendered with real-time rasterisation and also be able to create pathtraced renderings.
That would be awesome: design scenes in real-time with superphysical and then render a polished shot with the tracer... +1!