Can I get some suggestions for a new graphics card to speed up Lightroom? My current card, together with the large Z9 files, is making my desktop shudder.
RTX 4090 should be fine, of course!

I went with an MSI GeForce RTX 4090. Still huge, but it's one of the smaller 4090 cards. Figure $1,600 to $1,700. Blisteringly fast processing in LR and Topaz.
But you have to consider the power supply with those cards.
A 4090 can draw 600 watts; consider a 1000-watt power supply as a minimum.
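The power-supply sizing above is simple arithmetic: add up worst-case component draws and leave headroom. A minimal sketch, where all the component wattages are illustrative assumptions rather than measurements from a real build:

```python
# Rough PSU sizing sketch. All component wattages are illustrative
# assumptions, not measurements from a real build.
components = {
    "RTX 4090 (sustained; transients can hit 600 W)": 450,
    "high-end CPU under boost": 250,
    "motherboard + RAM": 80,
    "drives, fans, USB devices": 70,
}

total_draw = sum(components.values())   # worst-case simultaneous draw in watts
headroom = 0.20                         # ~20% margin for efficiency and aging
recommended = total_draw * (1 + headroom)

print(f"Estimated draw: {total_draw} W")
print(f"With headroom:  {recommended:.0f} W -> a 1000 W unit as a floor")
```

With these assumed figures the estimate lands just over 1000 W, which is why a 1000 W supply is a sensible floor and larger units leave room for the 4090's transient spikes.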
What is your current configuration, and when do you feel your station is slow?
With Apple's M1 and M2 chips, a separate GPU may not make for a big improvement. With a Windows computer, a lot depends on the slot available inside and the space for a wide GPU board.
What language are you guys talking LOL?
LRC takes over 1¼ minutes to render Denoise, which I thought would be sped up with a faster GPU?

This is indeed GPU related.
Can you give me a link (Amazon?) to an actual card of this type?
The Tensor cores of the 4090 are the most powerful.
A few additional comments with regards to either the RTX3090 or RTX4090 GPU cards:
- For the RTX3090, an 850 W supply will be OK unless you're really loaded up with spinning disks and using a top-of-the-line CPU; 1000 W would be wise, though. I'm using an 850 W supply with an i7-8700K and I never come close to maxing it out.
- Either card takes up 3 slots, but with the exception of the nVidia Founders Edition RTX3090, putting anything but a very small card in the next slot over from the GPU card will block one or more of the cooling fans, making that slot unusable as well. As JoPoV mentions, these cards are long; make sure you can fit it in. Don't even think about using one of these in a case for a MicroATX MB; go mid-tower at a minimum.
- If you're going to do large batch processing, denoising images, or rendering longer video content, you absolutely need a well-ventilated case with one or more active case exhaust fans, preferably driven off the motherboard to allow for temperature-dependent speed control. In my case, I have 3 front-panel fans with manual speed control and one high-flow rear-panel fan under MB control. Keeps the case interior from getting toasty!

Pretty much correct on all points. I had to lose my USB expansion card to make room for the 4090. I run an ATX mid-tower with 6 intake fans and 4 exhaust fans (3 of those on the CPU cooler). This results in lots of flow and also a slight positive case pressure, so it won't be sucking dust into the case. Cabling can also be an issue, as you will be using either 3 or 4 8-pin cables out of the power supply. The card was supplied with a funky splitter which is pretty ugly, so I ordered a cable specific to the 4090 from CableMod to clean things up a bit.
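Keeping an eye on GPU thermals during a long Denoise batch is easy to script: `nvidia-smi` can report temperature and power draw as CSV. A small sketch that parses that output is below; the sample string stands in for a live query, and the 83 °C limit is an assumption about where throttling typically begins, not a documented constant.

```python
# Sketch: parse the CSV emitted by
#   nvidia-smi --query-gpu=temperature.gpu,power.draw --format=csv,noheader,nounits
# and flag thermals during long batch jobs. SAMPLE_OUTPUT stands in for a
# live query; the 83 C limit is an assumption, not a documented constant.
SAMPLE_OUTPUT = "72, 412.35\n"

def parse_gpu_status(csv_line: str) -> tuple[int, float]:
    """Split one 'temperature, power' CSV line into (deg C, watts)."""
    temp_s, power_s = (field.strip() for field in csv_line.strip().split(","))
    return int(temp_s), float(power_s)

def running_hot(temp_c: int, limit_c: int = 83) -> bool:
    """True when the GPU is near typical throttle territory."""
    return temp_c >= limit_c

temp, power = parse_gpu_status(SAMPLE_OUTPUT)
print(f"GPU: {temp} C at {power:.0f} W, running hot: {running_hot(temp)}")
```

Run in a loop during a batch export or Denoise run, this tells you quickly whether the case airflow described above is keeping up.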
It is only Denoise that really hits the GPU in LR; mostly LR is a very poor utilizer of the GPU. The two most important bits are disk speed, CPU, and enough RAM, especially when importing and building previews. Also a USB 3.2 2x2 port if you are using fast cards in your camera (CFexpress Type B). The GPU is essential if you use Denoise, though: about 8 seconds for 45 MP files on my 3090.

That has been changing more of late. The newest versions of LRC are utilizing the GPU more than previous versions. Here is a recent and somewhat related long thread touching on the topic if you are interested: https://www.lightroomqueen.com/comm...ience-gigabyte-nvidia-rtx-4070-ti-12gb.47572/
Yes, without doubt the new AI-based functions (Denoise and intelligent masking) hit the GPU. But Adobe have not touched the rest of it regarding processor allocation. For ingesting images, creating previews, and performing most edits, sufficient memory, a fast CPU, and fast I/O are going to give you the best bang for buck. You can easily see this in the performance monitor when using LR. Ideally you will have all high-performance components, but budgets are not often unlimited and prioritisation is necessary.
Completely agree. It is mostly the new features that have been taking advantage of the GPU horsepower. And this need has put me in a bind, as I am looking for a new PC and had thought a mild graphics card would have sufficed for a number of years; I am now rethinking that. Regarding basic functions, you are correct. There was a long thread a few months ago at LRQ about reading images and previews, and the consensus was that there was not as much gain from advances in storage technology (i.e. SSDs) as people had expected. So yes, a fast CPU and lots of RAM are still key ingredients when looking at a new computer.
Not wanting to appear contrarian, but this advice needs a little refining. Regarding SSDs: advances have been enormous. The NVMe interface has enabled vast increases in speed over SATA. Further, each generation of the PCIe interface doubles the per-lane bandwidth; we are now up to 5.0. This makes a vast difference to the responsiveness of nearly all functions of LR. Regarding memory, LR really does not require huge amounts of RAM; it will perform well with 32 GB of fastish RAM. However, if you frequently take a round trip into PS or perform stitching or focus stacking, then you will use every bit of RAM you can cram onto your motherboard. Regarding CPU, yes, LR will use every bit of CPU you can throw at it. I run an i9-13900K and LR will run it full bore when creating previews, exporting, etc.
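To put rough numbers on that generation-over-generation doubling: PCIe 3.0, 4.0, and 5.0 run at 8, 16, and 32 GT/s per lane respectively, all with 128b/130b line coding, so usable bandwidth doubles each generation. A quick sketch for an x4 link, the width typical of an NVMe SSD:

```python
# Approximate usable PCIe bandwidth per generation for an x4 NVMe link.
# Transfer rates and 128b/130b line coding are from the PCIe 3.0-5.0 specs.
TRANSFER_GT_S = {"PCIe 3.0": 8, "PCIe 4.0": 16, "PCIe 5.0": 32}
ENCODING = 128 / 130   # usable fraction after 128b/130b encoding overhead

def lane_gb_s(gt_s: float) -> float:
    """Usable GB/s for one lane: GT/s * encoding efficiency / 8 bits per byte."""
    return gt_s * ENCODING / 8

for gen, rate in TRANSFER_GT_S.items():
    print(f"{gen}: {lane_gb_s(rate):.2f} GB/s per lane, "
          f"~{4 * lane_gb_s(rate):.1f} GB/s for an x4 SSD")
```

For comparison, SATA III tops out around 0.6 GB/s, which is why even a PCIe 3.0 NVMe drive at roughly 4 GB/s over x4 feels so much faster for previews and catalogue work.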
It was a nuanced discussion, and it was not so much focused on advances in storage technology, but rather that they did not make as big of a difference as some had hoped/stated in reading the image files. So it was more of a statement similar to what you had said above about LRC not taking advantage of the GPU on many basic features. It was also a somewhat contentious thread, but the good information that was posted was helpful. I'll try to find the link if I can.
Building a PC to run LR, I would currently choose:
- Motherboard with an LGA1700 CPU socket, at least three 4.0 or 5.0 M.2 NVMe slots, PCIe 4.0 or 5.0, sufficient SATA sockets for spinning SATA archive discs, USB 3.2 2x2, and 4 RAM sockets capable of taking 32 GB sticks
- i9-13900K CPU
- One or two 32 GB sticks of RAM
- One NVMe SSD for the system (OS and apps), as fast as possible; a 990 EVO is great, but a 980 if budget is tight. These are PCIe 4.0 cards; 5.0 cards are starting to appear but are rather costly right now.
- One NVMe SSD for the LR catalogue and "project" files
- Several spinning SATA drives for archive and smaller files
- A case large enough to fit a top-end graphics card (3090 or 4090)
- 1200 W+ platinum power supply
You can add more memory and more NVMe SSDs as budget permits. You can also add a graphics card as budget permits. A used 3090 is actually a good option right now and will denoise a 50 MP raw file in less than 10 seconds. Of course, a 4090 is the ultimate.
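To get a feel for what those per-image Denoise timings mean over a batch, here is the arithmetic using the numbers quoted in this thread (over 1¼ minutes per image on an older card, about 8 seconds on a 3090); the 200-image batch size is a hypothetical example:

```python
# Rough Denoise batch estimate from per-image timings quoted in the thread.
# The 200-image batch size is a hypothetical example.
def batch_minutes(seconds_per_image: float, num_images: int) -> float:
    """Total batch time in minutes."""
    return seconds_per_image * num_images / 60

OLD_CARD_S = 75   # "over 1 1/4 minutes" per image
RTX_3090_S = 8    # "about 8 seconds" per 45-50 MP file
IMAGES = 200      # hypothetical day's shoot

print(f"Older card: {batch_minutes(OLD_CARD_S, IMAGES):.0f} min")
print(f"RTX 3090:   {batch_minutes(RTX_3090_S, IMAGES):.0f} min")
print(f"Speedup:    {OLD_CARD_S / RTX_3090_S:.1f}x")
```

Roughly four hours versus under half an hour for the same batch, which is why the GPU upgrade pays off so clearly for Denoise-heavy workflows even though it matters little elsewhere in LR.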
Here is the specific post in the thread that I was trying to recall: https://www.lightroomqueen.com/community/threads/lr-classic-storage-performance.47289/post-1313058 . I recommend focusing on Jim's reply and not the entire exchange. I do not think this is any earth-shattering news, and it does not directly address the OP's question about GPUs, but it does call out one area where LRC does and does not benefit from newer and faster technology. Again, this needs to be considered in the greater context of how to put together a system that works better/best with LRC, which your above comments also touch upon.
--Ken