So I've been playing around with the denoiser this morning, and either I'm not using it correctly, my machine isn't powerful enough to notice a difference, or it doesn't do what it's supposed to do for me. I have a Dell Precision 7710, 16GB RAM w/ an Nvidia Quadro M3000M (4GB GDDR5 VRAM).
I've tested with the denoiser on and off, both in the viewport and when producing renders, and I'm not seeing the 10x faster speed. I've attached a couple of examples below. Render settings were as follows:
2700x2160
200 resolution
100 passes
No Denoiser (3 mins 18 seconds)
With Denoiser (3 mins 6 seconds) - only 12 secs faster!!
The images look identical, or at least I can't spot the difference, and there are still artifacts in the image, particularly around the glass doors. Am I doing something wrong, or am I misunderstanding what the tool is supposed to do?
Mike
For those following this thread, Alan and I worked offline and he reported the memory usage issue to internal tech support. We reproduced it and followed up with NVIDIA. It turns out to be expected behavior in SOME cases (newer hardware with certain scenes, with scene size being the deciding factor).
The following text was used to close the reported issue as "Not A Bug".
Newer generation graphics cards employ a dynamic memory allocation technique, referred to as "wavefront scheduling". Wavefront scheduling will reserve more graphics memory at scene load time, based on scene size and available graphics card memory at the time of load, in order to achieve better overall ray tracing performance. Smaller scenes may allocate a larger chunk of graphics memory to take advantage of the performance gains, while larger scenes may allocate less memory, or even skip the use of "wavefront scheduling" altogether, in order to handle the memory requirements of the scene.
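For anyone wondering why a small scene can show surprisingly high VRAM usage, here's a minimal sketch of the kind of heuristic that description implies. This is purely illustrative Python, not NVIDIA's or the renderer's actual code; the function name, thresholds, and fractions (plan_wavefront_pool, min_pool_bytes, max_pool_fraction) are all assumptions made up for the example.

```python
# Hypothetical illustration only -- not NVIDIA's or the renderer's actual code.
# Rough sketch of the heuristic described above: reserve a bigger wavefront
# pool when the scene is small relative to free VRAM, shrink it as the scene
# grows, and skip wavefront scheduling entirely when the scene itself needs
# most of the card's memory.

def plan_wavefront_pool(scene_bytes: int, free_vram_bytes: int,
                        max_pool_fraction: float = 0.5,
                        min_pool_bytes: int = 256 * 1024**2) -> int:
    """Return the number of bytes to reserve for the wavefront pool
    (0 means fall back to a non-wavefront path)."""
    # Memory left over after the scene itself is resident on the GPU.
    headroom = free_vram_bytes - scene_bytes
    if headroom <= min_pool_bytes:
        # Large scene: not enough headroom, skip wavefront scheduling.
        return 0
    # Small scene: grab a generous chunk of the headroom (capped at a
    # fraction of total free VRAM) to maximize ray tracing throughput.
    pool = min(int(free_vram_bytes * max_pool_fraction),
               headroom - min_pool_bytes)
    return max(pool, 0)


if __name__ == "__main__":
    gib = 1024**3
    # Small scene on a 4 GB card: a large chunk gets reserved up front.
    print(plan_wavefront_pool(scene_bytes=int(0.5 * gib), free_vram_bytes=4 * gib))
    # Nearly full card: the pool shrinks to zero and wavefront scheduling is skipped.
    print(plan_wavefront_pool(scene_bytes=int(3.8 * gib), free_vram_bytes=4 * gib))
```

The point of the sketch is just that the extra allocation is a deliberate trade of idle VRAM for ray tracing speed, which is why it shows up most on small scenes and disappears on large ones.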