bill_3305731 • Fri, Mar 13, 2020 9:50 PM

Lightroom/Camera Raw: Suggestions for selecting a Video Card

Video cards are expensive, and it would help to know how much we should spend when we upgrade. So what is Adobe's statement of direction?
 
Sample question: will we benefit from an 8 GB card over a 4 GB card on a 4K monitor? What about on a dual-monitor system with two 4K displays?
 
For several years it has been true that any modest video card would match the fastest cards for Lightroom. I've personally experienced this, and it matches the published benchmarks. However, now that the Texture slider and import/export make significant use of the card, is this still true?
 
Puget Systems found a huge improvement for the Texture slider between the fastest card (2080 Ti) and more modest ones. Yet on my old Quadro K1200 4 GB, this slider performs in real time on my 4K monitor: there is no delay, and I can watch the image change as I adjust the slider. By contrast, the Noise Reduction Luminance slider has about a 3-second delay on Fuji X-T1 RAW files.
 
Decision example, ONLY for Lightroom and not other applications: I'm getting ready to replace my Quadro K1200 4 GB and am considering the Quadro RTX 4000 8 GB vs. the GTX 2080 Ti 11 GB. Ignoring batch operations and price, will I see any differences between these two cards over the next 3 years in 1) performance and 2) true 10-bit processing?

Official Solution

Employee • 2 y ago

Hi Bill,

Here are some things to keep in mind as you consider your GPU choices.

For a large display (4K and beyond) I definitely recommend more video RAM. 8 GB is better than 4 GB in this case. Lightroom caches a lot of data on the video card when doing interactive edits, and the bigger the screen the more data it has to hold.
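For a rough sense of scale, here's a back-of-the-envelope sketch; the six working buffers are an illustrative assumption for this example, not Lightroom's actual internal layout:

```python
# Rough VRAM estimate for interactive editing on 4K displays.
# The buffer count is an illustrative assumption, not
# Lightroom's actual internal layout.

def vram_estimate_mb(width, height, channels=4, bytes_per_channel=4, buffers=6):
    """Estimate VRAM (MB) for `buffers` full-resolution working
    copies of the screen image at 32-bit float RGBA."""
    bytes_per_buffer = width * height * channels * bytes_per_channel
    return buffers * bytes_per_buffer / (1024 ** 2)

print(f"single 4K: {vram_estimate_mb(3840, 2160):,.0f} MB")      # ~760 MB
print(f"dual 4K:   {vram_estimate_mb(3840, 2160) * 2:,.0f} MB")  # ~1,520 MB
```

On top of working buffers like these, caches, previews, and the OS compositor also consume video RAM, which is part of why 8 GB gives comfortable headroom on large or dual displays.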

At present, Lightroom's GPU acceleration applies to editing images interactively in Develop and to the Enhance Details feature. It doesn't apply to import/export. If you invest in a strong GPU you won't see any changes to import/export performance, at least not with current versions of Lightroom. To get improved batch preview generation and batch save (export) performance, your best option at the moment is to invest in a faster CPU with more cores (within reasonable limits); Puget has some extensive benchmarks in this area.
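To illustrate why core count is the lever for batch work, here's a minimal sketch of an embarrassingly parallel export loop; `develop_and_save` is a hypothetical stand-in for the per-image decode/render/encode work, not an actual Lightroom API:

```python
# Minimal sketch of why batch export scales with CPU cores:
# each image is an independent render/encode job, so the work
# parallelizes across processes.

import os
from concurrent.futures import ProcessPoolExecutor

def develop_and_save(raw_path):
    # Placeholder for the per-image work: decode RAW, apply
    # edits, encode output. In practice this is CPU-bound,
    # which is why core count matters for batch throughput.
    return f"{raw_path} -> exported"

def batch_export(raw_paths):
    workers = os.cpu_count()  # more cores -> more concurrent exports
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(develop_and_save, raw_paths))

if __name__ == "__main__":
    print(batch_export([f"IMG_{i:04d}.RAF" for i in range(8)]))
```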

As far as GPU choices for Develop performance go, we have generally found that this page's benchmark scores are representative of the relative performance gains we observe in our tests:

https://www.videocardbenchmark.net/directCompute.html

Both the cards you mention (Quadro RTX 4000 and 2080 Ti) are there. Possible typo correction: you mentioned "GTX 2080" whereas I think you meant "RTX 2080".

Also, be aware that at present, not all of the edit controls in Develop are GPU accelerated. For example, local corrections and spot adjustments are not currently GPU accelerated, so if you're used to adding a lot of these adjustments to your images and they are the primary bottleneck in performance for you, then adding a strong GPU is not going to help, at least not right away.

Longer term, our direction is to accelerate as much of the processing pipeline as possible (as many sliders and tools as we can) and as many workflows as we can. So, despite the caveats about applicability that I've noted above, be assured that GPU acceleration (and performance in general) is a very active area of development for us.

Peter

Great answer, thank you Chan (excuse me for leaving madman out ;) ), and you confirm my assumptions about the direction Adobe is taking.

That triggers follow-up questions. Let me introduce these. My older 1080 Ti card with 11 GB of vRAM is about as fast as a 2080 Ti. The added ray-tracing logic that caused the series name change is likely not used by Camera Raw (i.e., LR). (a- True?)

Along these lines, I do not expect a lot of improvement from a 3080 either, with its 10 GB of vRAM. From a 3090 with 24 GB of even faster vRAM, on the other hand, I would expect a lot of acceleration. (b- True?) I assume this because I was unable to comfortably run LR on an Intel 9th-series i7 with 16 GB RAM and a discrete NVIDIA 16-series card with 4 GB of vRAM.

 

Now I have to get to the "Quadro" reference that brought me here. The difference between the GTX/RTX and Quadro series may not be well understood by the general public, but the Quadro cards run their floating-point operations at much higher precision (I guess 4 times, based on the Quadro label).

Which is to say, if a 1080 and a Quadro card report the same FLOPS, the Quadro shifted a lot more bits in the process.
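To put numbers on that, here is the arithmetic under two commonly published FP64:FP32 rate ratios (1/32 for consumer Pascal cards, 1/2 for GP100-class compute parts); treat these ratios as illustrative assumptions rather than exact specifications:

```python
# Worked arithmetic behind the "same FLOPS, more bits" point.
# The FP64:FP32 rate ratios below are typical published figures
# (1/32 for consumer Pascal, 1/2 for GP100-class parts) and are
# illustrative assumptions, not exact specs for any one card.

fp32_tflops = 11.3  # GTX 1080 Ti single-precision rate (published figure)

geforce_fp64 = fp32_tflops * (1 / 32)  # consumer card at double precision
quadro_fp64  = fp32_tflops * (1 / 2)   # GP100-class card at double precision

print(f"GeForce-class FP64: {geforce_fp64:.2f} TFLOPS")  # ~0.35
print(f"GP100-class FP64:   {quadro_fp64:.2f} TFLOPS")   # ~5.65
```

That said, image-editing GPU pipelines typically run in single (or half) precision, so double-precision throughput may not matter for LR at all.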

LR produces some random color-processing noise in some of my photos, which is what triggered my in-depth research, by the way. That noise is not present in Nikon Capture NX-D, nor in Capture One 20, nor in LR when it processes a shot from the same Nikon Z camera taken with an older F-type lens instead of a new Z/S-series lens. This may be a bug (a very serious one, IMO).

(c-) Anyhow, if I run LR/CR/PS on a Quadro card, will I see improved processing? Fewer processing artifacts? Improved demosaicing (de-Bayering)?

I see a significantly improved rendition in PS when I convert to 32-bit, but that is not a convenient option when processing larger batches of photos. (-c)
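The quantization arithmetic hints at why 32-bit helps; a quick sketch of the step sizes for a normalized 0..1 signal (generic working depths, not PS's exact internal formats):

```python
# Quantization step size for a normalized 0..1 signal at common
# working depths; smaller steps mean less visible banding and
# fewer rounding errors accumulating across successive edits.

for name, levels in [("8-bit integer", 2**8 - 1),
                     ("16-bit integer", 2**16 - 1)]:
    print(f"{name:15s} step = 1/{levels} ≈ {1/levels:.2e}")

# 32-bit float is not uniformly quantized; near 1.0 its spacing
# is 2**-23, and it is far finer still near zero.
fp32_ulp_at_one = 2.0 ** -23
print(f"{'32-bit float':15s} step ≈ {fp32_ulp_at_one:.2e} (near 1.0)")
```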

 

KR

Peter

 

System details

Windows 10 Pro 64-bit, latest version, fresh install

NVIDIA studio driver (latest version)

Two 4K monitors (Eizo CS2740)

LR+PS subscription running at latest version/update/patch level.

Intel 10700K

NVIDIA GTX 1080 Ti with 11 GB vRAM

ASUS Z490 pro creator motherboard with IRST

32GB RAM

Three RAID 1 pairs of "pro" SSDs keep I/O streams logically separated (IRST), increase SSD cache, and speed up concurrent reads.

One RAID 0 array (4 SSDs) runs high-speed, low-latency I/O directly from the CPU's PCIe lanes (add-in card & W10 RAID).

 

bill_3305731 • 2 y ago

Thanks for your very detailed and useful response. It looks like the RTX 2080 Ti would be the best choice if I can address the heat issues; there is some risk, but the blower edition is probably the answer. Otherwise, the single-slot RTX 4000 would be a good choice.