Graphics card memory and Lightroom, DeNoise AI, Sharpen AI


BCcanuck
I note that DeNoise AI and Sharpen AI, like Lightroom, use the graphics processing unit on the graphics card. Am I correct in assuming that more graphics memory equals faster processing? If so, can anyone suggest an ideal card for a Windows system? My current card only has 1GB of memory.
 
If you have an Intel processor, look at the NVIDIA line; I've done well with GeForce. With an AMD processor, Radeon cards seem to work better. Pay attention to the power requirements: if you try to draw too much power from the power supply, you'll have lots of issues.
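A quick back-of-the-envelope check can tell you whether a card will strain your power supply. This is a minimal sketch with illustrative numbers (the wattages below are assumptions for the example, not measurements of any specific system); the common rule of thumb is to keep estimated draw under roughly 80% of the PSU's rating.

```python
# Rough PSU headroom check with illustrative (not measured) numbers.
# Rule of thumb: keep total draw under ~80% of the PSU's rated wattage.

def psu_headroom_ok(psu_watts, gpu_tdp_watts, rest_of_system_watts, margin=0.8):
    """Return True if the estimated total draw fits within the safety margin."""
    total_draw = gpu_tdp_watts + rest_of_system_watts
    return total_draw <= psu_watts * margin

# Example: a 450 W PSU, a 185 W GPU, and ~200 W for CPU, drives, and fans
print(psu_headroom_ok(450, 185, 200))  # 385 W vs 360 W budget -> False, too close
print(psu_headroom_ok(650, 185, 200))  # 385 W vs 520 W budget -> True
```

If the check fails, either step down a card tier or budget for a larger power supply along with the upgrade.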
 
I use a Radeon 570 with 4GB RAM on board and 8GB motherboard RAM for the Topaz offerings without problems. It could be a bit quicker, but I don't do much post-processing apart from a bit of denoising and sharpening now and again, if I think it is needed.
 
Higher amounts of RAM are needed primarily for video editing. With LR and similar applications the number of CUDA cores is important, and it also matters for video processing because of the decoding and encoding involved. Newer boards have GPUs with more CUDA cores, higher memory bandwidth, and faster GDDR5 GPU memory.

I only wanted to use a single PCIe slot on my motherboard, and as the RTX boards use up 3 slots, I went with an NVIDIA 1070 Ti board that has 2432 CUDA cores, 8GB GDDR5 memory, and 256 GB/sec memory bandwidth. I paid $525, but it's all I need for still and video processing.
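That 256 GB/sec figure checks out from the card's specs: theoretical memory bandwidth is the effective data rate per pin times the bus width, divided by 8 to convert bits to bytes. A minimal sketch (the 8 Gbps / 256-bit values are the published GTX 1070 Ti memory specs):

```python
# Theoretical GPU memory bandwidth:
# effective data rate (Gbps per pin) x bus width (bits) / 8 bits-per-byte
def mem_bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    """Return peak theoretical memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# GTX 1070 Ti: 8 Gbps GDDR5 on a 256-bit bus
print(mem_bandwidth_gb_s(8, 256))  # 256.0 GB/s
```

The same formula explains why GDDR6 cards are quicker: a 2060 SUPER runs 14 Gbps on the same 256-bit bus, giving 448 GB/s.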
 
I would take a look on the Topaz website for the graphics cards that they support. My new iMac has an unsupported graphics card, but by using the Intel OpenVINO option in the application preferences, both Sharpen AI and DeNoise AI move along crisply.
 
I just bought a new Dell for photo editing, put in 32 GB of memory and an NVIDIA GeForce RTX 2060 SUPER 8GB GDDR6 graphics card, and it only takes a few seconds to run DeNoise!
 
I recently built a PC and used the NVIDIA RTX 2060 Super 8GB, and it runs the Topaz apps well and renders video well. I find it interesting that the NVIDIA 20-series cards currently have two different drivers: a Game Ready driver and a Studio driver. The Studio driver is supposed to be optimized for creative apps.

This past week, while working on video, I noticed that my machine was lagging, and looking at the resource utilization, the app was not taking advantage of the GPU. I switched to the Game Ready driver and the performance was restored. The Studio driver is dated 9/17/2020 and the Game Ready driver is dated 10/07/2020. In my experience, a driver update normally updates both the Studio and Game Ready drivers, but not this time. I'm not sure what is going on, but the Game Ready driver is working fine and I don't see any noticeable performance issues.

Other than that issue, the 2060 Super seems to be a great graphics card for photo and video apps.
 
Just an FYI here, guys... On my Mac, if I go to Preferences / Advanced Settings, you can tell the app to use just the on-board CPU, or, if you have a supported video card, you can run the GPU.
There is also an Intel OpenVINO setting to use for 6th-generation CPUs. These are two settings to try changing if you are having issues.

On my Mac I ended up turning on the GPU for all my apps and it runs much faster.
 