Graphics Processor usage in LR6/CC and Capture One 8.3.2

Cem_Usakligil

Well-known member
You may be aware that some image processing programs can use the graphics processor on your video card, so that their performance gets an additional boost. In order to see how much difference this makes in real life, I have decided to replace my old video card, an ATI Radeon HD 6850, with a brand new NVidia GTX 960 2OC 4 GB card. I have run extensive benchmarks with the old card as well as the new one, so that I could measure the additional benefit of paying for a new top-of-the-line graphics card. Disclaimer: I don't game on my work PC, so the performance boost of the new card is measured purely for photography purposes. I have used two common programs to do the testing: Lightroom 6/CC and Capture One 8.3.2.

Just to give you an idea about the power of the old graphics card vs. the new one, here are some major specs of both.
ATI Radeon HD 6850: GPU Clock: 775 MHz, Memory Clock: 1000 MHz, Memory: 1 GB
NVidia GTX 960: GPU Clock: 1300 MHz, Memory Clock: 7010 MHz, Memory: 4 GB
On paper, the new card should run at least 2 to 3 times faster than the old one.
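For those who want to check the arithmetic, here is a small Python sketch comparing the listed specs. Keep in mind that these are two different GPU architectures, so the on-paper ratios do not translate linearly into real-world performance:

    # Ratio of the listed specs, new card vs. old card (on paper only; different
    # architectures, so real-world gains will not scale linearly with these numbers).
    old = {"GPU clock (MHz)": 775, "memory clock (MHz)": 1000, "memory (GB)": 1}   # ATI Radeon HD 6850
    new = {"GPU clock (MHz)": 1300, "memory clock (MHz)": 7010, "memory (GB)": 4}  # NVidia GTX 960
    for spec in old:
        print(f"{spec}: {new[spec] / old[spec]:.1f}x")
    # GPU clock (MHz): 1.7x, memory clock (MHz): 7.0x, memory (GB): 4.0x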

Before I give you the results of the benchmarks, let me explain a couple of things about LR6/CC, namely how much use it makes of the GPU. The short answer is: not much. LR uses the GPU only when one is working in the Develop module; it is called into action when one zooms in or changes the sliders. LR does not use the GPU in any of the other modules (such as Library). It certainly does not use the GPU when creating previews, nor when exporting the processed images. So it is very difficult to reliably measure the added value of the new GPU. I have timed certain repetitive actions, such as zooming in to 1:1 and scrolling through 50 raw images in the Develop module. I waited for each picture to be rendered and displayed correctly before moving on to the next one.

LR6/CC benchmark results:
Scroll through 50 raw images (mixed bag from 4 different cameras; Canon, Nikon, Sony, Pentax):
Without the GPU support: 4 min 10 sec
With the old GPU: 4 min 4 sec
With the new GPU: 3 min 56 sec

As can be seen, these differences are neither statistically significant nor very accurate. Still, one can conclude that the added value of a new GPU for LR6/CC is rather negligible. I guess we will have to wait for the LR developers to improve their GPU support before any significant gains become visible.
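
For illustration, here is a quick Python sketch that converts those timings into seconds and shows just how small the relative gains are:

    # LR6/CC: scroll through 50 raw images in the Develop module, timings in seconds.
    timings = {"no GPU": 4 * 60 + 10, "old GPU": 4 * 60 + 4, "new GPU": 3 * 60 + 56}
    baseline = timings["no GPU"]
    for name, t in timings.items():
        print(f"{name}: {t} s ({100 * (baseline - t) / baseline:.1f}% faster than no GPU)")
    # no GPU: 250 s (0.0%), old GPU: 244 s (2.4% faster), new GPU: 236 s (5.6% faster)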

Capture One 8.3.2 benchmark results:
Contrary to LR, C1 uses the GPU extensively for everything (including exporting), and the resulting performance increase is tangible.
Export 50 raw images (mixed bag from 4 different cameras; Canon, Nikon, Sony, Pentax):
Without the GPU support: 4 min 25 sec
With the old GPU: 1 min 53 sec
With the new GPU: 1 min 50 sec

These timings are the average of 5 export runs, so statistically they are a bit more meaningful than the LR benchmarks. Moreover, the performance increase when using a GPU is more than twofold.
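
The same quick sketch for the export timings shows that more-than-twofold speed-up, and also how little difference there is between the old and the new card:

    # Capture One 8.3.2: export 50 raw images, timings in seconds.
    timings = {"no GPU": 4 * 60 + 25, "old GPU": 1 * 60 + 53, "new GPU": 1 * 60 + 50}
    print(f"old GPU speed-up: {timings['no GPU'] / timings['old GPU']:.2f}x")   # ~2.35x vs no GPU
    print(f"new GPU speed-up: {timings['no GPU'] / timings['new GPU']:.2f}x")   # ~2.41x vs no GPU
    print(f"new vs old card:  {timings['old GPU'] / timings['new GPU']:.2f}x")  # only ~1.03x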

Conclusion:
If you are using LR6/CC, switch on GPU acceleration if LR supports your graphics card. But do not go out and buy a new, expensive GPU, since the added value is negligible.

If you are using C1, you will definitely want to use the GPU acceleration. However, even a 5-year-old graphics card will provide 90% of the performance. In order to get the remaining 10%, one could buy a new and fast graphics card, but I would not recommend investing in this since the added value is less than 10%, whereas the price of such a card will be above 200 Euro/USD.

I hope this helps you decide, in case any of you were wondering.
 

Asher Kelman

OPF Owner/Editor-in-Chief
Thanks so much for this effort and for sharing such significant findings. It seems that Adobe has some explaining to do!

Did you look at the performance in PS too?

Asher
 

Cem_Usakligil

Well-known member
Thanks Asher and Doug.

@Asher: I did not test PS since I do not use it much. Besides, testing itself would pose more of a challenge since I would have to create batch actions and run them on multiple files. No time for all that I'm afraid.

Re. "Adobe having some explaining to do", it is a well known fact that LR is only partially GPU friendly. They have already explained this, among others via Eric Chan's posts in different forums. The scope of the functions utilizing the GPU within LR will hopefully be increased in future releases. But this may mean that I don't get to benefit from that since I use LR6 and not LR CC. As you know, LR6 is a dead-end street and won't get any functionality upgrades.
 
Bart

Cem_Usakligil said:
Thanks Asher and Doug.

@Asher: I did not test PS since I do not use it much. Besides, testing itself would pose more of a challenge since I would have to create batch actions and run them on multiple files. No time for all that I'm afraid.

Hi Cem,

Thanks for the feedback about performance improvements resulting from your hardware upgrade.

It is most likely that PS, which basically uses the same Adobe Camera Raw (ACR) engine for Raw conversion, would show similar effects as far as the actual Raw conversion itself is concerned. Other, user-interface-related performance will obviously depend on the specifics of the particular application's (LR or PS) user interface.

Re. "Adobe having some explaining to do", it is a well known fact that LR is only partially GPU friendly. They have already explained this, among others via Eric Chan's posts in different forums. The scope of the functions utilizing the GPU within LR will hopefully be increased in future releases.

This is likely, since Lightroom is late to the game, as far as GPU acceleration is concerned.

Cem_Usakligil said:
But this may mean that I don't get to benefit from that, since I use LR6 and not LR CC. As you know, LR6 is a dead-end street and won't get any functionality upgrades.

I'm not 100% sure. It seems clear, so far, that additional features (such as the DeHaze function that was introduced shortly after release of the version 6 upgrade) will only be unlocked(!) for LR CC subscriptions. LR 6.x perpetual licensees (who have paid the full license fee upfront) are left out in the cold, but such is the coercion strategy that Adobe have adopted to promote their (more profitable) subscription policies.

However, it seems to me that GPU functionality support would be harder (not impossible, but harder) to deliberately lock(!) because it is much more intertwined with the core code for LR. I know one can disable the GPU acceleration from the user preferences, so it's not impossible, but it would become increasingly hard/expensive to keep up such artificial crippling of features as time goes by. But then, I'm an optimist ...

Cheers,
Bart
 