Ray tracing. It's a term you'll hear a lot now that Nvidia has announced professional and consumer graphics cards that use the technique to produce some of the most lifelike visuals yet possible in games and other animations. So what is ray tracing exactly, and how does it differ from current graphics rendering techniques?
The simple answer is that ray tracing models the behavior of light in real time as it intersects objects in a scene.
It's a feature that could lead to spectacular new graphics, but it has long been impractical because of its computational requirements. With a new graphics architecture called Turing, however, Nvidia tackles several of the problems that have held ray tracing back.
The first is the challenge of producing next-generation computer graphics. Ray tracing is just one of many rendering techniques, but it's a great way to add realistic real-time lighting and effects, and that's where Nvidia comes in.
The second is computational cost: the best Turing card for professional production costs $10,000, and ray tracing used to require even more expensive hardware. What's new here is that Nvidia is ready to bring ray-tracing tech to consumer-level GPUs, something that hadn't been done before.
Nvidia's current graphics technology – like most of the industry's – simulates light and how it behaves in a scene in a much simpler way, with something called rasterization. Like a painter layering paint on a canvas, objects are rendered from back to front so that the ones in front cover the ones behind them.
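The painter's-algorithm idea behind this can be sketched in a few lines of Python. This is a toy illustration, not how a real GPU rasterizer works: the objects, the 1-D "framebuffer" and all names here are hypothetical, chosen only to show the back-to-front overdraw the article describes.

```python
def paint_scene(objects):
    """Each object is (depth, span, color); a larger depth means farther away."""
    framebuffer = [None] * 10  # a tiny 10-"pixel" screen
    # Draw the farthest objects first, so nearer ones simply paint over them.
    for depth, span, color in sorted(objects, key=lambda o: o[0], reverse=True):
        for x in range(*span):
            framebuffer[x] = color
    return framebuffer

scene = [
    (5.0, (0, 10), "sky"),    # farthest: fills the whole background
    (2.0, (2, 8), "wall"),
    (1.0, (4, 6), "player"),  # nearest: drawn last, covers everything behind it
]
print(paint_scene(scene))
# ['sky', 'sky', 'wall', 'wall', 'player', 'player', 'wall', 'wall', 'sky', 'sky']
```

Note that nothing here knows anything about light: the "player" covers the "wall" purely by draw order, which is why effects like mirrors are so hard for this approach.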
This makes it difficult to model a mirror, for example, because rasterization techniques cannot trace and model light itself. Rasterization is favored for real-time scenes because current-generation hardware can't keep up with simulating a complex scene in motion at the speed something like a game or a 3D animation demands.
This next generation of light simulation can model light in far more detail without requiring as much computational effort as before. Ray tracing models the behavior of light as it intersects surfaces, materials and moving objects.
The path light takes as it moves through a scene can now be rendered in far greater detail. With ray tracing, you can simulate how light rays interact with objects to create realistic reflection, refraction and scattering effects in real time. Ray tracing can even model mirrors and the way glass bends light, work out where the light in a scene comes from, and even determine the color of that light as it passes through objects.
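At the heart of all of these effects is one operation: casting a ray into the scene and finding the first surface it hits. The sketch below shows that core step for the simplest surface, a sphere; the function name and the scene values are illustrative assumptions, and a real renderer would repeat this per pixel and then spawn secondary rays for reflections, refraction and shadows.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None for a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearer of the two intersection points
    return t if t > 0 else None            # ignore hits behind the ray's origin

# A ray from the origin pointing down the z-axis toward a sphere centered at z=5:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

The computational cost the article mentions comes from doing millions of these intersection tests per frame, which is exactly the workload Turing's dedicated ray-tracing cores are built to accelerate.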
On paper, that sounds almost too realistic to be true, but it's not exactly new – ray tracing has been used in professional industries for years. It has already appeared in popular media such as Pixar's Monsters University and Marvel's Iron Man films. What makes the announcement exciting is that the technique will finally be available on consumer hardware, an achievement that was previously too difficult and expensive.
Here's an example of real-time ray tracing at work, in a Star Wars demo running on Nvidia's professional Volta RTX graphics cards:
So what held Nvidia back from bringing this to consumers until now? Ray tracing requires an incredible amount of processing power. Nvidia CEO Jensen Huang called the new hardware "the biggest jump we've ever made in a generation".
It makes sense, then, that the Turing architecture in Nvidia's new GPUs is designed to address that processing problem. Dedicated ray-tracing cores work in conjunction with tensor cores – which use AI to deliver the "real-time" part – to run simulations six times faster than the previous Pascal platform (GTX 1080 Ti, etc.).
It's a big leap in graphics technology, and it's exciting to imagine studios and individuals finding new applications for ray tracing in animation, games and scientific simulations.
Even though the new hardware Nvidia has announced is desktop-only for now, laptops with ray-tracing technology are expected to hit the market next year, so they're not far off. In fact, certain upcoming games such as Metro: Exodus already have Nvidia RTX demos showing real-time ray tracing in action.