Realtime Pathtracing

Hi everybody,

As I'm from the photoreal 3D rendering domain, I use pathtracing very frequently. Pathtracing is a physically based rendering algorithm that gives you photoreal lighting.
Modern technology makes it possible to do pathtracing in realtime, so I think it's time to start using pathtracing in Touch. By today's 3D rendering standards Touch has quite a limited rendering engine, with no raytraced lighting, shadows, reflections or refractions…

Here is proof of real-time pathtracing: raytracey.blogspot.com/

Just look at the latest video, with the city and instances.

Can’t you just do pathtracing in a shader?

madebyevan.com/glsl-path-tracing/
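
The core of it is small enough to try directly. Here is a rough sketch of the idea as a generic GLSL fragment shader (diffuse-only sphere-and-ground scene; uResolution and uSeed are placeholder uniforms that the host app would have to supply, not anything Touch sets up for you):

```glsl
// Rough sketch of a diffuse-only path tracer in a generic GLSL fragment shader.
// uResolution and uSeed are placeholder uniforms the host app has to supply.
#version 330

uniform vec2  uResolution;   // output size in pixels
uniform float uSeed;         // per-frame random seed (e.g. the frame number)
out vec4 fragColor;

// cheap hash-based pseudo-random number
float hash(vec2 p) {
    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
}

// ray/sphere intersection: returns hit distance, or -1.0 on a miss
float iSphere(vec3 ro, vec3 rd, vec3 c, float r) {
    vec3 oc = ro - c;
    float b = dot(oc, rd);
    float h = b * b - (dot(oc, oc) - r * r);
    if (h < 0.0) return -1.0;
    return -b - sqrt(h);
}

// cosine-weighted hemisphere sample around normal n
vec3 cosineDir(vec3 n, vec2 rnd) {
    float a = 6.2831853 * rnd.x;
    float s = sqrt(1.0 - rnd.y);
    vec3 t = normalize(cross(n, abs(n.y) < 0.99 ? vec3(0, 1, 0) : vec3(1, 0, 0)));
    vec3 b = cross(n, t);
    return normalize(t * cos(a) * s + b * sin(a) * s + n * sqrt(rnd.y));
}

void main() {
    vec2 uv = (gl_FragCoord.xy * 2.0 - uResolution) / uResolution.y;
    vec3 ro = vec3(0.0, 1.0, 3.0);            // camera position
    vec3 rd = normalize(vec3(uv, -1.5));      // primary ray direction

    vec3 throughput = vec3(1.0);
    vec3 radiance   = vec3(0.0);

    for (int bounce = 0; bounce < 3; bounce++) {
        // intersect a unit sphere sitting on a ground plane, keep the nearest hit
        float tS = iSphere(ro, rd, vec3(0.0, 1.0, 0.0), 1.0);
        float tP = rd.y < 0.0 ? -ro.y / rd.y : -1.0;
        float t = -1.0;
        vec3 n = vec3(0.0);
        vec3 albedo = vec3(0.0);
        if (tS > 0.0 && (tP < 0.0 || tS < tP)) {
            t = tS;
            n = normalize(ro + rd * t - vec3(0.0, 1.0, 0.0));
            albedo = vec3(0.8, 0.3, 0.3);
        } else if (tP > 0.0) {
            t = tP;
            n = vec3(0.0, 1.0, 0.0);
            albedo = vec3(0.7);
        }

        if (t < 0.0) {                        // missed everything: add sky light
            radiance += throughput * vec3(0.6, 0.7, 0.9);
            break;
        }

        throughput *= albedo;                 // diffuse bounce
        ro = ro + rd * t + n * 1e-3;          // offset to avoid self-intersection
        vec2 rnd = vec2(hash(gl_FragCoord.xy + uSeed + float(bounce)),
                        hash(gl_FragCoord.yx - uSeed * 1.7 + float(bounce)));
        rd = cosineDir(n, rnd);
    }

    fragColor = vec4(radiance, 1.0);          // one noisy sample per pixel
}
```

Each frame of this is a single noisy sample per pixel; the demo above only looks clean because it keeps averaging samples over time.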

I think you are thinking about the Touch Rendering pipeline a bit backwards. The rendering engine seems to expose most of the capabilities of OpenGL, but implementing shader effects is largely up to the user.

The forum is replete with users (or Malcolm & Ben) posting shaders for other users. I don’t have time to look into this at the moment, but perhaps someone could import a path tracing shader such as the one linked above into TD for the OP?

If it can be done in WebGL/OpenGL ES then it should be trivial:
cg.alexandra.dk/2012/12/14/webgl … mpetition/

-FDP

Also, you should note that Touch is designed as a realtime production package meant to run at 60 fps, unlike other 3D packages, which are designed to run fast while you work on your project and then render your photorealistic environments offline, usually at a minute or more per frame.

There are a few tricks, though, for some different rendering techniques, like using a GLSL TOP as a raytracer, but that has its limitations as well.

Ah, the WebGL link above looks like it is using accumulation (probably via the alpha channel) to store data and then feed it back into its own input. In other words, again, it wouldn't be useful for realtime, because you would have to render the same frame multiple times. Actually, both links look like they are using accumulation.
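
For reference, the accumulation step itself is only a couple of lines. Something like this rough sketch (the sampler, uniform and varying names are made up; in Touch you could wire the loop up with a Feedback TOP feeding the result back in, and reset the count whenever the camera moves):

```glsl
// Sketch of the accumulation pass described above: average this frame's noisy
// sample into a history buffer that loops back from the previous frame.
// sNewSample, sHistory, uSampleCount and vUV are placeholder names.
#version 330

uniform sampler2D sNewSample;   // the fresh one-sample-per-pixel render
uniform sampler2D sHistory;     // accumulated average from the previous frame
uniform float uSampleCount;     // how many samples are already in the average
in vec2 vUV;
out vec4 fragColor;

void main() {
    vec3 fresh = texture(sNewSample, vUV).rgb;
    vec3 hist  = texture(sHistory, vUV).rgb;
    // running average: hist * N/(N+1) + fresh * 1/(N+1)
    vec3 accum = mix(fresh, hist, uSampleCount / (uSampleCount + 1.0));
    fragColor = vec4(accum, 1.0);
}
```

Which is exactly why it isn't realtime in the usual sense: every time anything changes, the average has to start over.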

There are some distance marching shaders out there, though. Check out the new Shadertoy site.
shadertoy.com/
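
The basic distance-marching loop is simple enough to drop into a GLSL TOP. A minimal sketch (uResolution is a placeholder uniform the host has to provide):

```glsl
// Minimal sketch of the "distance marching" (sphere tracing) idea used all over
// Shadertoy: step along the ray by the distance to the nearest surface.
// uResolution is a placeholder uniform the host has to supply.
#version 330

uniform vec2 uResolution;
out vec4 fragColor;

// signed distance to the scene: a unit sphere sitting on a ground plane
float map(vec3 p) {
    float sphere = length(p - vec3(0.0, 1.0, 0.0)) - 1.0;
    float ground = p.y;
    return min(sphere, ground);
}

// approximate normal from central differences of the distance field
vec3 calcNormal(vec3 p) {
    vec2 e = vec2(0.001, 0.0);
    return normalize(vec3(map(p + e.xyy) - map(p - e.xyy),
                          map(p + e.yxy) - map(p - e.yxy),
                          map(p + e.yyx) - map(p - e.yyx)));
}

void main() {
    vec2 uv = (gl_FragCoord.xy * 2.0 - uResolution) / uResolution.y;
    vec3 ro = vec3(0.0, 1.0, 4.0);            // camera position
    vec3 rd = normalize(vec3(uv, -1.8));      // ray direction

    float t = 0.0;
    for (int i = 0; i < 100; i++) {           // march at most 100 steps
        float d = map(ro + rd * t);
        if (d < 0.001 || t > 50.0) break;     // close enough, or gave up
        t += d;
    }

    vec3 col = vec3(0.6, 0.7, 0.9);           // sky
    if (t < 50.0) {
        vec3 p = ro + rd * t;
        vec3 n = calcNormal(p);
        col = vec3(0.8) * max(dot(n, normalize(vec3(0.5, 0.8, 0.3))), 0.0);
    }
    fragColor = vec4(col, 1.0);
}
```

Since each step only needs the distance function, fairly complex scenes stay cheap, which is a big part of why so many Shadertoy demos use it.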

Again, though, at higher resolutions things will be slow. I have a whole TouchDesigner project set up where I do animations at a lower resolution (hopefully at >= 30 fps) and then do offline renders at 4x the desired resolution to anti-alias (usually at 5 or 10 seconds per frame), but that is definitely not realtime work. I do it, though, because Touch is the only program (that I know of, at least) where I have access to all kinds of other data signals and types (LFOs, animation, programmable shaders).

Oh, one thing to note: the Brigade engine linked to by the OP uses 16 GTX Titan cards. I don't know that I would characterize that as realtime on "normal" hardware.

OK, I should be ashamed of myself for contradicting what I said earlier, but having recently tried out "Neon" for Rhino 3D (sorry, Houdini :wink:), I am sort of in love with Caustic's rendering hardware. It isn't quite "true" realtime, as it does need to converge, but I would say that the Caustic hardware is actually affordable (unlike 16 Titans). It seems like something that might be nice for those using TD for non-realtime/near-realtime work. Plus Houdini would probably benefit from this integration as well…

-FDP

Here is a real-time technique for creating reflections and refractions that integrates into simple rasterization rendering:

gpupro.blogspot.co.nz/2014/01/gp … g-for.html
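
To put it in context, here is the plain rasterization baseline such techniques improve on: faking reflection and refraction by sampling an environment cube map with GLSL's built-in reflect() and refract(). This is not the technique from the link, just a rough sketch for comparison; the sampler, uniform and varying names are placeholders the host has to bind.

```glsl
// Plain rasterization baseline: fake reflection and refraction by sampling an
// environment cube map with GLSL's built-in reflect() and refract().
// sEnvMap, uCamPos and the varyings are placeholder names the host has to bind.
#version 330

uniform samplerCube sEnvMap;   // surrounding environment
uniform vec3 uCamPos;          // camera position in world space
in vec3 vWorldPos;             // interpolated surface position
in vec3 vWorldNormal;          // interpolated surface normal
out vec4 fragColor;

void main() {
    vec3 n = normalize(vWorldNormal);
    vec3 v = normalize(vWorldPos - uCamPos);                 // view ray toward surface

    vec3 reflCol = texture(sEnvMap, reflect(v, n)).rgb;
    vec3 refrCol = texture(sEnvMap, refract(v, n, 1.0 / 1.33)).rgb;  // air into water

    // crude Schlick fresnel to blend the two lookups
    float f = pow(1.0 - max(dot(-v, n), 0.0), 5.0);
    float fresnel = mix(0.04, 1.0, f);

    fragColor = vec4(mix(refrCol, reflCol, fresnel), 1.0);
}
```

The catch is that the lookups only ever see the environment map, never the rest of the scene, which is exactly the kind of limitation the ray-traced approaches remove.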

I believe ray tracing is our future :ugeek:

Great find! Seems like it might be closer. It looks like this book, GPU Pro 5, is written by the same people who wrote ShaderX 7, which I recently purchased. Soon I plan on getting into some CUDA coding, so maybe after I get through ShaderX 7 I'll pick up a copy of GPU Pro 5. Further down the blog they also have some other great techniques.

gpuopen.com/announcing-real-time-ray-tracing/