Graphics card?


BCcanuck

Can I get some suggestions for a new graphics card to speed up Lightroom? My current card, together with the large Z9 files, is making my desktop shudder.
 
Recently, the need for a larger and more powerful GPU has grown in LR Classic with many of the newly introduced features. I cannot make specific recommendations, as I was looking to upgrade myself and have held off for now, hoping to revisit the issue when more powerful GPU options are available. Puget Systems often publishes articles about what hardware they recommend (and I believe Adobe may have updated their recommendations as well). In short, if you want to use many of the new features, you will want more than a basic GPU. I am sure you will get specific recommendations from other forum members shortly.

Good luck,

--Ken
 
Be sure the problem is only the graphics card. Large files from high-resolution sensors can slow down the machine in different ways.
When does your workstation feel slow, and what is in it (CPU/GPU/RAM and the drives used)?
 
What language are you guys talking LOL? LRC takes over 1 1/4 minutes to render Denoise, which I thought would be sped up with a faster GPU???
Ryzen 7 3700X, Asus Prime B550-Plus, 64 GB DDR4-1333, NVIDIA GTX 1050.
 
Like others have said, make sure the GPU is the problem. When I am concerned about processing speed, I like to open the Windows Task Manager and watch resource utilization while using an application. I believe there is information on the Adobe site about which LrC tasks benefit from a GPU.

I currently use an NVIDIA RTX 2060 Super and it processes my Z9 High Efficiency RAW files in under 30 seconds. I have considered upgrading my GPU, but I only have a 600-watt power supply, so I would have to upgrade that as well.
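If you want something a bit more precise than eyeballing Task Manager, here is a minimal Python sketch (assuming the third-party psutil package is installed) that logs CPU, RAM, and disk activity every couple of seconds while you run a Denoise job. psutil does not report GPU load; for that, watch the GPU tab in Task Manager, or see the nvidia-smi sketch later in this thread.

```python
# Minimal sketch: log system-wide CPU, RAM, and disk reads while LrC works.
# Assumes the third-party psutil package is installed (pip install psutil).
import psutil

def log_utilization(duration_s=60, interval_s=2):
    disk_before = psutil.disk_io_counters()
    for _ in range(duration_s // interval_s):
        cpu = psutil.cpu_percent(interval=interval_s)  # blocks, averages over interval
        ram = psutil.virtual_memory().percent
        disk_after = psutil.disk_io_counters()
        read_mb = (disk_after.read_bytes - disk_before.read_bytes) / 1e6
        disk_before = disk_after
        print(f"CPU {cpu:5.1f}%  RAM {ram:5.1f}%  disk read {read_mb:7.1f} MB")

if __name__ == "__main__":
    log_utilization()  # start this, then kick off Denoise in Lightroom
```

Start the script, run the task you are curious about, and see which resource pegs first.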
 
With Apple's M1 and M2 chips, a separate GPU may not make for a big improvement. With a Windows computer, a lot depends on the slot available inside and the space for a wide GPU board.

At least with Photoshop, I found that processing quickly fragmented my hard drive, so I added a separate physical drive to serve as the scratch drive. This prevented the application from fragmenting the data on its own drive. Going to an SSD, if one is not already in the computer, is a cheap fix, with a SanDisk 1 TB SSD selling for $50.

With a laptop, I would check the power settings, as the defaults are usually chosen to extend battery life at the expense of performance.
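On that last point, here is a small Windows-only sketch of how you could check and switch the active power plan from Python via the built-in powercfg utility. SCHEME_MIN is Windows' alias for the High performance plan; this is just one way to do it, the Control Panel works fine too.

```python
# Windows-only sketch: inspect and switch the active power plan via powercfg.
import subprocess

def show_active_plan():
    # Prints the GUID and name of the currently active power plan.
    subprocess.run(["powercfg", "/getactivescheme"], check=True)

def set_high_performance():
    # SCHEME_MIN = "minimum power savings", i.e. the High performance plan.
    subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)

show_active_plan()
set_high_performance()
show_active_plan()
```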
 
You have to search for the NVIDIA RTX 4090. You should find several brands and models built around this GPU.
Be careful of three things:
- your power supply must be able to deliver enough power, or it must be changed.
- the form factor of those cards is huge (generally three slots), so you must have enough room in your workstation.
- as with all such powerful compute cards, the cooling of your case must be adequate or upgraded (see the monitoring sketch below).
And I would add:
- never! overclock it for professional use (leave that to the gamers).

Here is the first Amazon link I found when searching:
https://www.amazon.fr/dp/B0BMQX4CMQ/
Not sure that matches your region, but if not, you will easily find equivalent links.
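To check the power and cooling points above on your current card (or a new one), a quick sketch that polls NVIDIA's nvidia-smi tool every 2 seconds while you run a heavy job. It assumes an NVIDIA GPU whose driver install provides nvidia-smi on the PATH:

```python
# Sketch: watch GPU power draw and temperature with nvidia-smi while LR works.
# Assumes an NVIDIA GPU; the driver package installs the nvidia-smi tool.
import subprocess

QUERY = "name,power.draw,power.limit,temperature.gpu,utilization.gpu"

subprocess.run([
    "nvidia-smi",
    f"--query-gpu={QUERY}",
    "--format=csv",
    "-l", "2",  # repeat every 2 seconds until interrupted with Ctrl+C
])
```

If power.draw sits near power.limit or the temperature keeps climbing during a long Denoise batch, your PSU and case airflow have no headroom.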
 
A few additional comments with regard to either the RTX 3090 or RTX 4090 GPU cards.

- For the RTX 3090, an 850 W supply will be OK unless you're really loaded up with spinning disks and using a top-of-the-line CPU...1000 W would be wise, though. I'm using an 850 W supply with an i7-8700K and I never come close to maxing it out.
- Either card takes up 3 slots, but with the exception of the NVIDIA Founders Edition RTX 3090, putting anything but a very small card in the next slot over from the GPU will block one or more of the cooling fans, making that slot unusable as well. As JoPoV mentions, these cards are long; make sure you can fit one in. Don't even think about using one of these in a case for a MicroATX motherboard...go mid-tower at a minimum.
- If you're going to do large batch processing, denoising of images, or rendering of longer video content, you absolutely need a well-ventilated case with one or more active exhaust fans, preferably driven off the motherboard to allow temperature-dependent speed control. In my case, I have 3 front-panel fans with manual speed control and one high-flow rear-panel fan under motherboard control. Keeps the case interior from getting toasty!
 
Pretty much correct on all points. I had to lose my USB expansion card to make room for the 4090. I run an ATX mid-tower with 6 intake fans and 4 exhaust fans (3 of those on the CPU cooler). This gives lots of flow and also a slight positive case pressure, so it won't suck dust into the case. Cabling can also be an issue, as you will be using either 3 or 4 of the 8-pin cables out of the power supply. The card was supplied with a funky splitter that is pretty ugly, so I ordered a 4090-specific cable from CableMod to clean things up a bit.
 
It is only Denoise that really hits the GPU in LR; mostly, LR is a very poor utilizer of the GPU. The two most important bits are disk speed, the CPU, and enough RAM, especially when importing and building previews. Also a USB 3.2 2x2 port if you are using fast cards in your camera (CFexpress Type B). The GPU is essential if you use Denoise, though: about 8 seconds for 45 MP files on my 3090.
 
That has been changing more of late. The newest versions of LRC are utilizing the GPU more than previous versions. Here is a recent and somewhat related long thread touching on the topic if you are interested: https://www.lightroomqueen.com/comm...ience-gigabyte-nvidia-rtx-4070-ti-12gb.47572/ .

--Ken
 
Yes, without doubt the new AI-based functions (Denoise and intelligent masking) hit the GPU. But Adobe has not touched the rest of it regarding processor allocation. For ingesting images, creating previews, and performing most edits, sufficient memory, a fast CPU, and fast I/O are going to give you the best bang for the buck. You can easily see this in the performance monitor when using LR. Ideally you would have all high-performance components, but budgets are not often unlimited and prioritisation is necessary.
 
Completely agree. It is mostly the new features that have been taking advantage of GPU horsepower. And this need has put me in a bind, as I am looking for a new PC and had thought a modest graphics card would suffice for a number of years. I am now rethinking that. Regarding basic functions, you are correct. There was a long thread a few months ago at LRQ about reading images and previews, and the consensus was that there was not as much gain from advances in storage technology (i.e., SSDs) as people had expected. So yes, a fast CPU and lots of RAM are still key ingredients when looking at a new computer.

--Ken
 
Not wanting to appear contrarian, but this advice needs a little refining. Regarding SSDs: advances have been enormous. The NVMe interface has enabled vast increases in speed over SATA. Further, each generation of the PCIe interface doubles the bandwidth per lane - we are now up to 5.0. This makes a vast difference to the responsiveness of nearly all functions of LR. Regarding memory, LR really does not require huge amounts of RAM; it will perform well with 32 GB of fastish RAM. However, if you frequently take a round trip into PS or perform stitching or focus stacking, then you will use every bit of RAM you can cram onto your motherboard. Regarding the CPU, yes, LR will use every bit of CPU you can throw at it. I run an i9-13900K, and LR will run it full bore when creating previews, exporting, etc.
Building a PC to run LR, I would currently choose:
- Motherboard with an LGA1700 CPU socket, at least three PCIe 4.0 or 5.0 M.2 NVMe slots, PCIe 4.0 or 5.0 expansion slots, sufficient SATA ports for spinning archive discs, USB 3.2 2x2, and 4 RAM sockets capable of taking 32 GB sticks
- i9-13900K CPU
- one or two 32 GB sticks of RAM
- one NVMe SSD for the system (OS and apps), as fast as possible; a Samsung 990 EVO is great, or a 980 if budget is tight. These are PCIe 4.0 drives - 5.0 drives are starting to appear but are rather costly right now.
- one NVMe SSD for the LR catalogue and "project" files
- several spinning SATA drives for archives and smaller files
- a case large enough to fit a top-end graphics card (3090 or 4090)
- a 1200 W+ Platinum power supply

You can add more memory and more NVMe SSDs as budget permits. You can also add a graphics card as budget permits; a used 3090 is actually a good option right now and will denoise a 50 MP raw file in less than 10 seconds. Of course, a 4090 is the ultimate.
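If you want to sanity-check the disk-speed side of this before spending money, here is a rough sketch that times sequential reads of a folder of RAW files, so you can compare, say, a SATA SSD against an NVMe drive. The folder path and .NEF pattern are placeholders; also note that the OS file cache will inflate repeat runs (more on that caveat at the end of the thread).

```python
# Rough sketch: time sequential reads of a folder of RAW files to compare drives.
# The folder path and file pattern below are placeholders, not real settings.
import time
from pathlib import Path

def time_reads(folder, pattern="*.NEF"):
    files = sorted(Path(folder).glob(pattern))
    total_bytes = 0
    start = time.perf_counter()
    for f in files:
        total_bytes += len(f.read_bytes())  # read each file front to back
    elapsed = time.perf_counter() - start
    print(f"{len(files)} files, {total_bytes / 1e9:.2f} GB in {elapsed:.1f} s "
          f"= {total_bytes / 1e6 / elapsed:.0f} MB/s")

time_reads(r"D:\photos\test_batch")  # placeholder path
```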
 
It was a nuanced discussion, and it was not so much focused on advances in storage technology, but rather on the fact that they did not make as big a difference in reading the image files as some had hoped or claimed. So it was more a statement similar to what you said above about LRC not taking advantage of the GPU for many basic features. It was also a somewhat contentious thread, but the good information that was posted was helpful. I'll try to find the link if I can.

--Ken
 
Here is the specific post in the thread that I was trying to recall: https://www.lightroomqueen.com/community/threads/lr-classic-storage-performance.47289/post-1313058 . I recommend focusing on Jim's reply and not the entire exchange. I do not think this is earth-shattering news, and it does not directly address the OP's question about GPUs, but it does call out one area where LRC does and does not benefit from newer, faster technology. Again, this needs to be considered in the greater context of how to put together a system that works best with LRC, which your comments above also touch upon.

--Ken
 
Thanks for that link. However, I cannot see in his methodology where he clears the cache between each test, or where his LR catalogue is stored. It is possible that all he is doing is reading cached data on his internal SSD for each "test". But he is the guru, so....
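For what it's worth, the cache effect is easy to demonstrate yourself: read the same files twice and compare. A much faster second pass means the OS file cache, not the drive, served the data - which is exactly the risk in a benchmark that never clears caches. A minimal sketch (placeholder path again):

```python
# Minimal sketch of the cache effect: a warm second pass that is much faster
# than the cold first pass was served from the OS file cache, not the drive.
import time
from pathlib import Path

def pass_speed(files):
    start = time.perf_counter()
    total = sum(len(f.read_bytes()) for f in files)
    return total / 1e6 / (time.perf_counter() - start)  # MB/s

files = sorted(Path(r"D:\photos\test_batch").glob("*.NEF"))  # placeholder
cold = pass_speed(files)  # may come from disk (reboot first for a true cold run)
warm = pass_speed(files)  # almost certainly served from the OS cache
print(f"cold {cold:.0f} MB/s vs warm {warm:.0f} MB/s")
```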
 