Cinema 4D will get a new realtime renderer that offers instant feedback.
Tachyon Render is a realtime renderer for Cinema 4D…
Some of you might be thinking that another “new” plugin renderer for Cinema 4D might be one too many. Hold off on that thought for a second.
Uppercut Software and Animation is working on a new renderer named after theorized particles that can travel faster than light.
Watch the teaser to see what will soon be in store for Cinema 4D rendering, and visit Tachyon Render for more information.
Tachyon Render generated some excitement in the C4D community when a teaser video was posted, showing off how fast the renderer can be.
Tachyon isn’t a typical GPU renderer; it actually uses the same OpenGL-based technology that a game engine would use.
That is the exciting part. Technically, it is closer to a new viewport engine than a rendering engine.
I had an opportunity to ask the Founder and CEO of Uppercut Software and Animation, Martin Weber, a few questions on Tachyon.
What led to the development of Tachyon Render?
Martin Weber: We were using many different renderers in our productions.
The current rise of GPU rendering was intriguing, but many renderers focused on the highest possible quality with unbiased rendering. Meanwhile, we saw video games reach a visual quality that rivals elaborate renderings from just a couple of years back.
Game titles like The Order: 1886 look like pre-rendered cinematics during gameplay. Other artists are putting Unreal Engine 4 to good use, e.g. in architectural visualization.
That led us to research the current state of realtime rendering and how we could integrate it into Cinema 4D for our own projects. We are hoping to accelerate the creative workflow by dramatically reducing render times without the need to use a game engine.
We also struggle with the issue that clients often get nervous when we show intermediate results that are playblasts or reduced in quality. They often cannot visualize what the final product will look like. With high-quality realtime rendering we would be able to mitigate this.
How long has Tachyon been in production?
Martin Weber: We started research in 2014. In 2015 we started to develop a prototype to see what quality we can achieve with different approaches.
Now we are starting to work on more features, performance optimization and tighter integration into Cinema 4D. We’d love to make the workflow as hassle-free as possible.
What does Tachyon do that other renderers don’t at the moment?
Martin Weber: Tachyon Render uses OpenGL to render. In this way it is different from all other GPU renderers, which use CUDA or OpenCL to tap the compute power of GPUs.
They are mostly unbiased raytracers. Tachyon Render is based on techniques developed by the video game industry; specifically, we are using a deferred rendering method.
In contrast to video games, we focus on image quality rather than frame rate. This allows us to use more complex algorithms that are still too slow for games. In this regard we bridge the gap between GPU rendering and realtime rendering: we can still take several seconds to render a frame, but we don’t need several minutes or hours.
Rendering in a few milliseconds naturally requires some trade-offs but so far we are happy with what we can achieve with this approach.
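The deferred rendering method Weber mentions can be sketched in a few lines. This is a hypothetical, minimal illustration (the names and the tiny "framebuffer" here are mine, not Tachyon's): a geometry pass writes surface attributes into a G-buffer, and a separate lighting pass shades each pixel once by reading those attributes back, so shading cost no longer scales with how many objects overlap a pixel.

```python
# Minimal sketch of deferred shading, assuming a tiny CPU-side framebuffer.
# Real implementations (like an OpenGL renderer) do this on the GPU with
# multiple render targets; the two-pass structure is the same.

WIDTH, HEIGHT = 4, 4
LIGHT_DIR = (0.0, 0.0, 1.0)  # light shining straight down the view axis

def geometry_pass():
    """Pass 1: write surface attributes (not colors) into the G-buffer."""
    gbuffer = []
    for _y in range(HEIGHT):
        for _x in range(WIDTH):
            normal = (0.0, 0.0, 1.0)   # flat surface facing the camera
            albedo = (0.8, 0.2, 0.2)   # reddish material
            gbuffer.append({"normal": normal, "albedo": albedo})
    return gbuffer

def lighting_pass(gbuffer):
    """Pass 2: shade each pixel exactly once from the stored attributes."""
    image = []
    for px in gbuffer:
        n, a = px["normal"], px["albedo"]
        n_dot_l = max(0.0, sum(n[i] * LIGHT_DIR[i] for i in range(3)))
        image.append(tuple(c * n_dot_l for c in a))  # simple diffuse term
    return image

image = lighting_pass(geometry_pass())
print(image[0])  # (0.8, 0.2, 0.2): full diffuse contribution at this pixel
```

Because lighting is decoupled from geometry, adding lights or more expensive shading math only affects the second pass, which is why deferred pipelines can afford "more complex algorithms" than a classic forward renderer at the same frame budget.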
What are the hardware requirements?
Martin Weber: Tachyon Render requires an OpenGL-compatible graphics card. Most current Nvidia and AMD cards should work.
We are testing with a GTX970 and a Radeon 380 but will expand that once we get closer to launch.
Naturally more complex scenes require more Video-RAM. That affects both texture sizes and geometry density. We are looking into ways to optimize for that.
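To give a rough sense of why texture size matters for Video-RAM, here is a back-of-the-envelope sketch. The numbers are standard GPU texture math, not anything specific to Tachyon: an uncompressed RGBA8 texture costs width × height × 4 bytes, plus roughly one third more for a full mipmap chain.

```python
# Rough VRAM estimate for an uncompressed RGBA8 texture (assumed format,
# not Tachyon internals). A full mipmap chain adds ~1/3 on top.

def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# A single 4K texture already takes roughly 85 MB of video RAM:
mb = texture_vram_bytes(4096, 4096) / (1024 * 1024)
print(round(mb, 1))  # 85.3
```

A handful of 4K textures can therefore exhaust a mid-range card, which is why texture compression and streaming are common optimization targets.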
Where would you see Tachyon going in the future?
Martin Weber: We are a small developer team right now. We are actually still looking for OpenGL developers to join us to accelerate development.
Right now we focus on the must-have features. We are still doing research on topics like more complex shaders (currently we have GGX with a specular/glossiness workflow implemented), realtime global illumination, realtime subsurface scattering, etc.
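For readers unfamiliar with the GGX shader Weber mentions, here is a small sketch of its core piece, the GGX (Trowbridge-Reitz) normal distribution term, mapped from a specular/glossiness workflow. The function name and the glossiness-to-roughness mapping are illustrative assumptions on my part, not Tachyon's actual API.

```python
import math

def ggx_ndf(n_dot_h, glossiness):
    """GGX/Trowbridge-Reitz microfacet distribution D(h).

    n_dot_h: cosine between surface normal and half-vector.
    glossiness: spec/gloss-workflow input; roughness = 1 - glossiness
    (a common remap, assumed here).
    """
    roughness = 1.0 - glossiness
    alpha2 = (roughness * roughness) ** 2  # alpha = roughness^2 remap
    denom = n_dot_h * n_dot_h * (alpha2 - 1.0) + 1.0
    return alpha2 / (math.pi * denom * denom)

# A fully rough surface (glossiness 0) spreads the highlight evenly:
print(round(ggx_ndf(1.0, 0.0), 4))  # 0.3183, i.e. 1/pi
# A glossy surface concentrates energy into a tight, bright peak:
print(ggx_ndf(1.0, 0.9) > ggx_ndf(1.0, 0.0))  # True
```

GGX is popular in realtime engines precisely because this single term, combined with a matching geometry and Fresnel term, yields plausible highlights at per-pixel cost.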
There are also some hybrid approaches that integrate raytracing into scan-line rendering. Those are areas that we would like to research for future versions.
Right now, we hope we can deliver a tool for fellow artists that is useful in their day-to-day life. Integration into other tools like Modo and Maya is also something we are thinking about once we get the Cinema 4D version into users’ hands.
— Martin Weber (@martinweber) March 4, 2016