We might do something for other apps that use the GPU more (Premiere Pro, After Effects, DaVinci Resolve, etc.), but I doubt we will invest the time to test Lightroom Classic. The Ryzen 7 3700X is the next step up from the Ryzen 5 3600X in terms of performance and price. Another factor that has changed recently is that the Gigabyte B550 Vision D motherboard - with fully certified Thunderbolt support - has launched and passed our internal qualification process. Overall, Ryzen is unfortunately not a great choice for Lightroom. Future software or BIOS updates could of course fix this issue, although we saw the same behavior between the Ryzen 9 3900X and 3950X, so this is unlikely to be a simple BIOS or software bug. Thanks for the reply. After all that, we can try to track RAM timing, screen resolution, overclocking, and a number of other aspects of the system information. For me, in my example, switching between modules in Lightroom and scrolling in the Develop module is very important, as is 1:1 rendering. This limits the Ryzen platform to 64GB of RAM while the other platforms had 128GB, but since our Lightroom Classic benchmark never needs more than 32GB of RAM to run, this does not affect performance at all. Ryzen 3000 series Lightroom performance? You can still get more overall performance from the (significantly) more expensive Threadripper processors, but the Ryzen 9 5900X, in particular, is not too far behind those beefier models. How is the performance? Not sure there is anything meaningfully faster that will go into the current CPU socket. In my opinion that is a shame for Intel, AMD, and Adobe altogether and not a reason to hype anybody. Frequency can be grabbed through WMI or through the command line, but timings would need an external application, which we have tried to avoid since it makes cross-platform support much harder. Both missing pieces of information are very important for the end result.
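As a rough illustration of the WMI/command-line route mentioned above (a hedged sketch, not part of any benchmark; `parse_wmic_value` and `cpu_max_clock_mhz` are hypothetical helper names): on Windows, `wmic cpu get MaxClockSpeed` prints the property name on one line and the value on the next, which is easy to parse.

```python
import subprocess

def parse_wmic_value(output: str) -> int:
    """Parse the numeric value from `wmic <alias> get <property>` output,
    which prints a header line followed by the value."""
    lines = [line.strip() for line in output.splitlines() if line.strip()]
    # lines[0] is the property name (e.g. "MaxClockSpeed"); lines[1] is the value.
    return int(lines[1])

def cpu_max_clock_mhz() -> int:
    """Query the CPU's rated clock speed in MHz via WMI (Windows only)."""
    out = subprocess.check_output(
        ["wmic", "cpu", "get", "MaxClockSpeed"], text=True
    )
    return parse_wmic_value(out)

# Example of the output format wmic produces:
sample = "MaxClockSpeed\n3600\n\n"
print(parse_wmic_value(sample))  # 3600
```

CL timings, as noted, are not exposed this way, which is why an external tool would be needed for them.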
It shouldn't affect performance much, but good benchmarking is about removing variables to get the most accurate results possible. How about a comparison between the fastest affordable Quadro (the RTX 4000) and the RTX 2080 Ti? Or is it a problem with your benchmark? NEF export: the Intel 9960X is about the same as the 3900X/3950X, as expected. I know, I know, it's a little bit malicious :-). Even if some processes are slower, exporting and building previews can be twice as fast. Puget Systems Lightroom Classic Benchmark. Comparing applications is something we don't really try to do since there is so much more to why you would use one application over another than straight performance. I will quote from your Lightroom benchmark procedure: How does the scoring work? The scoring system used in our benchmark is based on the performance relative to a reference system with the following specifications: Intel Core i9 9900K 8-Core, NVIDIA GeForce RTX 2080 8GB, 64GB of RAM, Samsung 960 Pro 1TB, Windows 10 (1903), Adobe Lightroom Classic CC 2019 (ver. The difference can be more than 40%! So we would need to be able to detect what display the app is running on, which I don't believe we can do very easily. Is this due to another "performance optimization" by Adobe? AMD hasn't added any more cores to their new line of processors, but among other things, they are touting a 19% IPC (instructions per clock) improvement. In this article, we want to see whether the increase in core count (and price) is worth it for Adobe Lightroom Classic. There is no need for that high-end of a GPU, but in the off chance that it does make an impact, we want to make sure that the performance is being primarily limited by the CPU rather than another component. 3950X: 19 min 30 sec. Here both CPUs had 100% usage for the entire export, but despite having twice the core count, the 3950X was slower. Thank you for such a competent and detailed reply.
And 4) Lastly, AMD is saying that the TR socket will be compatible with future Threadrippers… If the two CPUs are close already, does that push the TR over the top to make it worth the added expense? AMD has said before that Threadripper wouldn't change socket, then they changed to TRX40 with the latest CPUs. A little benchmark orgy with my new Ryzen 2700 computer. Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow. Please add the Quadro RTX 4000 to your GPU test. High praise & recommendation for the current generation Ryzen CPUs. Comparison of the 2700X and 3900X at stock, rendering 550 still photos. When AMD released the first of their 3rd generation Ryzen processors back in July 2019, they were quickly established as the fastest processors for Adobe Lightroom Classic. On my system, for the Develop sliders (the only performance characteristic I care about, as I spend 90+% of my Lightroom time dragging sliders), v9.1 was a slowdown and 9.2 a huge slowdown.
I understood how you calculate the total score: (Active + Passive) / 2 * 10. 2) The system shouldn't lock up, but if it does, you can always do some trickery with Windows affinity so that Lightroom isn't allowed to use a handful of CPU cores. Lightroom: cache size 500GB, catalogue size 5-6GB, library 6TB. Settings and library are identical. We confirmed these results multiple times, and for whatever reason, Lightroom Classic simply doesn't like the 5950X at the moment. Maybe in the future we will try to figure out reliable ways to check for all those things, but for now we are more concerned with making the benchmarks reliable and ensuring they test everything we want. I haven't seen any benchmarks on the Ryzen CPUs; don't go by the hype, find some benchmarks. I think beyond a small GPU upgrade, you are going to be bottlenecked by your CPU. I'm currently speccing up a new desktop build to mostly run Lightroom and Photoshop, and have read elsewhere that there are good gains in memory performance by using 3600MHz RAM with CL16 or CL18 timing. Calibrating the monitors had no impact, as expected (Datacolor Spyder 5 Pro). I'm currently building a desktop machine for editing in Lightroom Classic based on an AMD Ryzen 9 3900X, 12 cores at 3.80GHz. The average of 87.7 and 96.5 is 92.1, which times 10 is 921. The "Passive Score" does a pretty good job of summarizing performance for tasks like that as well. No matter how you look at it, however, the AMD Ryzen 9 3950X performs very well in Lightroom Classic. I have played around with it a bit as well, and it looks like it is going to be really difficult to accurately and reliably benchmark. Yet, if I take a look at the scores of the 9900K, it's 921 (87.7 active + 96.5 passive). With the launch of AMD's new Ryzen 5000-series processors, however, it is very likely that AMD will be able to take a very solid lead over Intel in Lightroom Classic no matter what task you are looking at.
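The overall-score arithmetic discussed above - (Active + Passive) / 2 * 10 - can be verified with a couple of lines of code (a sketch of the stated formula; the 87.7/96.5 figures are the 9900K example from this thread):

```python
def overall_score(active: float, passive: float) -> int:
    """Overall Score = (Active Score + Passive Score) / 2 * 10,
    per the scoring formula quoted in the discussion."""
    return round((active + passive) / 2 * 10)

# The Core i9 9900K example from the thread:
print(overall_score(87.7, 96.5))  # 921
# A machine scoring exactly 100/100 relative to the reference:
print(overall_score(100.0, 100.0))  # 1000
```

This also answers the "why isn't the 9900K exactly 1000?" question: the reference scores were established on one specific 9900K configuration, so a different 9900K system that lands at 87.7/96.5 simply averages to 921.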
CL timings are really hard (impossible, from what I have found so far) to get directly at the level we have access to through the various Adobe APIs. 8.4). Overall Score: 1000, Active Tasks Score: 100, Passive Tasks Score: 100. I don't understand why, if everything is normalized to the 9900K, the score for the 9900K is not 1000 (100 active / 100 passive). Yeah, the comparison is really interesting. Keep in mind that the benchmark results in this article are strictly for Lightroom Classic (assuming that the 10700K in these results is on par with that old 9900K). At first look it seems like there can't be more than 5%, but :-): RAM: dual rank -> single rank, 2 DIMM -> 4 DIMM, daisy chain -> T-topology, 2666MHz -> 3600MHz -> 4400MHz, CL 19-19-19-19 -> CL 14-15-15, AMD -> Intel. Resolution: 1920 x 1080 -> 2560 x 1440 -> 3840 x 2160. Compared to the previous generation AMD Ryzen 3000-series CPUs, these new processors are all roughly 10% faster than the CPUs they are replacing. Are you going to do a Lightroom Classic 9.0 GPU performance test? It seems that Adobe has improved the GPU usage in Lightroom, and I would like to know if I should update my graphics card or not. Great article, keep up the great work. The reason I ask is that there are many reports of Lightroom not performing well if the CPU has more than 4 physical cores. I also see bad performance in Lightroom Classic, where I exported 397 files from ARW to JPG (the same files with the same edits on both systems) with quality set to 80 and got these times: 7820X: 16 min 21 sec. Now I can just take a small break and get back to work. Ah, got you, sorry I misunderstood! Can you please explain this? We will be publishing more articles as these new processors launch, so be sure to keep a close eye on our list of Hardware Articles in the coming weeks. Is anyone out there using Lightroom with i9 or Ryzen CPUs? Overall, the new Ryzen 5000-series CPUs from AMD are terrific for Lightroom Classic.
These results are then combined into an overall score to give you a general idea of how that specific configuration performs in Lightroom Classic. Hello AMD! Back again doing some real-world testing of Lightroom CC 2017 running on Windows 10 and a Ryzen 1700X. This effectively puts AMD in the lead over Intel no matter what your budget is and what parts of Lightroom Classic you want to optimize for. Soon after launch, there should be an update that adds support for AGESA 1.1.0, which is supposed to increase the performance of each Ryzen CPU by another few percent. So my questions are: 1) Given everything I've told you, which should I go with? It also gets a bit hairy for us since we are partners with many of these companies, and very few of them seem to welcome head-to-head comparisons. Is this right? First things first: thank you for the lightning-fast testing of the new 3950X! However, it is very difficult to draw meaningful conclusions without a closer look at your numbers: You seem to have tested Intel with HT on. Lightroom is my bottleneck - it's so slow it's annoying. You are of course free to do whatever you want with your own system, but we've always taken the stance that reliability is more important than getting a bit more performance, since in a production environment, system crashes and lost work cost far more money than losing a few percent of performance. Clock speed and IPC are what count. Right now our plate is pretty full, but that is pretty close to the top of my to-do list. 16GB RAM and a GTX 1080. As has been stated in the benchmarks, the video card, above a minimum level, doesn't much impact Lightroom performance (except for the Texture slider); if I upgrade from the K1200 to the RTX 4000 vs the RTX 2080 Ti, am I going to see equivalent performance with the RTX 4000? One thing we do want to note is that the pre-launch BIOS that is available for Ryzen motherboards is using AGESA 1.0.8.
It’s the Mac Pro that’s *REALLY* bad. A faster export is certainly welcome. :-) There is no information about screen resolution, and no information about RAM CL timings. I’ve narrowed it down to two top contenders: the TR 3960X and the Ryzen 9 5900X. It might not be much if you are lucky, or it might result in numerous random bluescreens or application crashes. I honestly don't know what specifically has caused that drop, but there have been a number of Intel security vulnerabilities that have been fixed at the expense of performance, and Lightroom Classic is adding more GPU acceleration, which sometimes can reduce performance at first until they get it really dialed in. I also know Puget Systems' recommendations for RAM frequency, but in the real world there are many out there with 3600MHz or more; see the Puget Systems database results :-) My working settings are moderate: CL 16-18-18-38 at 2933MHz. At a glance, then, it would appear that all of the systems reviewed here are notably slower than that old 9900K test rig - which is clearly incorrect. It is definitely one of the more "finicky" of our benchmarks (none of these apps are made for benchmarking, so we have to do some "creative" things to get them to work). We saw some odd performance issues with the Ryzen 9 5950X, but the Ryzen 7 5800X and Ryzen 9 5900X beat the Intel Core i9 10900K by a solid 14% and 21% respectively, while the Ryzen 5 5600X outperforms the similarly-priced Intel Core i5 10600K by a somewhat smaller 11%. Hence the attraction of a single-slot card. Maybe you should set up a database where people could upload their results to compare with others. Listed below are the specifications of the systems we will be using for our testing: *All the latest drivers, OS updates, BIOS, and firmware applied as of October 26, 2020. When we can, we try to have many of the tests be similar, but we first and foremost want to measure the performance for "typical" workflows in each app separately.
While our benchmark presents various scores based on the performance of each test, we also wanted to provide the individual results. So, personally, I wouldn't worry too much about future socket compatibility, especially with DDR5, PCI-E Gen 5, and who knows what else might be coming in the next several years. If you were to compare AMD and Intel processors based on price alone, AMD is anywhere from 11% to 30% faster than Intel. Even this relatively small 10% increase in performance allows the modest Ryzen 5 5600X to beat every single Intel processor we tested, although it only snuck by the Intel Core i9 10900K by a few percent. So stay tuned on that! Since the 5600X isn't out yet, there's no testing to indicate whether its supposedly faster single-core speed will help improve performance in Lightroom over a CPU like the 3700X, which is around the same price but has 2 more cores/4 more threads. A few notes on the hardware and software used for our testing: First, we have decided to standardize on DDR4-2933 memory for the Ryzen platform. Some of the active tasks are accelerated by LR through the GPU ... Perhaps the difference in CPU performance would be much clearer with a lower-end GPU. Many Lightroom users still have a Core i7-4700K in use. Turning off SMT can improve performance a bit in tasks like exporting, but in the last few versions of LrC, it also lowers performance in active tasks. The devs have also been putting a ton of work into improving many aspects of LrC that we haven't figured out a good way to test, like brush/slider lag and things like that. Maybe it is a bigger deal on older GPUs like your RX 570? Having said that, for Lightroom ONLY (and not other Adobe software, which I cannot comment on), you want the fastest 4-core CPU you can afford. Comparing the 5600X to the more similarly-priced Intel Core i5 10600K, the 5600X is a decent 11% faster in our Lightroom Classic benchmark.
Lightroom is hard to benchmark since the things that are easiest to test (importing, exporting, generating previews, etc.) are only part of what actually matters to users. As far as performance relative to older systems, that is something we've done in the past and want to do more of - we just don't have the bandwidth to do that in addition to keeping up with the latest hardware and software updates. 9.2 is at least 4 times slower than the last V8 release. We were close about a month ago, then we realized Lightroom 9.0 was going to launch during Adobe MAX, so we held off. So in general, it should be better overall to leave SMT on currently. We are still working on updating our Lightroom testing right now, so it may be a bit before we look at the new Ryzen CPUs in Lightroom. Over the last few years, AMD has been making great strides with their Ryzen and Threadripper processors, often matching - or beating - the performance from similarly priced Intel options. Definitely enough to skew results, which is why our own internal testing with locked-down configurations is always going to be more reliable than publicly uploaded results. In this article, we will be examining the performance of the new AMD Ryzen 5600X, 5800X, 5900X, and 5950X in Lightroom Classic compared to a range of CPUs including the Intel 10th Gen, Intel X-10000 Series, AMD Threadripper 3rd Gen, and the previous generation AMD Ryzen 3000-series processors. Second place gets even more interesting! As for the future, only the developers could tell you. 4) No way to really know. I would guess maybe in 2-3 weeks we can have a version for Windows up for download. That is the same reason we use an NVMe storage drive as well.
I recently upgraded from an Intel i5 2500K system to an AMD Ryzen 1800X-based machine. If you would like to skip over our test setup and benchmark sections, feel free to jump right to the Conclusion. In our testing of RAM timings, for example, we only saw around a 5% max difference between RAM speeds: https://www.pugetsystems.co... . The Quadro line is mostly about having high amounts of VRAM, which is almost never a problem for photography applications. Same with the new Ryzen - as far as I know, AMD hasn't made an official announcement, so there is no way to know for sure. It's actually slower on the new setup, and I see many people complaining about Lightroom's bad performance on CPUs with more than four cores. I see that the "active score" benchmarks are all under 100. Listed below are the specifications of the systems we will be using for our testing: *All the latest drivers, OS updates, BIOS, and firmware applied as of November 11th, 2019. If there is a specific task that is a hindrance to your workflow, examining the raw results for that task is going to be much more applicable than the scores that our benchmark calculated. The K1200 is a pretty old GPU, so you should notice some difference with the newer versions of Lightroom Classic, where they have been improving GPU acceleration support. And that "100" benchmark was established with a 9900K system. It seems like Affinity Photo is much faster in some tasks. It will probably end up being a pretty big project since we are going to have to take into account how many displays are being used as well as the resolution of each display (since that apparently is a big factor for Lightroom GPU performance). Our benchmark splits its tests into "active" tasks (scrolling through images, switching modules, etc.) and "passive" tasks (exporting, generating smart previews, etc.).
I actually had been considering the 9900K prior to the 3900X, but the link in my OP is to some benchmarks specifically related to Lightroom performance, and the 3900X shows about 25-30% gains over its Intel counterparts. Maybe once we are able to test the features that use the GPU a bit better, but for now, there is almost no chance our testing would show any difference. I notice that you perform the Lightroom benchmarks with 3200MHz CL22 memory. Thanks for all the reviews you're making, they are really useful. One of the reasons we sometimes used the Intel 10th Gen CPUs over Ryzen when the performance was similar was that only Intel platforms had passed our qualification process for Thunderbolt. Historically, many Adobe products have seemed to favor Intel processors. To get up to the same performance as an RTX 2080 Ti, you are going to need a Quadro RTX 6000, and even then it will likely be slightly slower. If you are concerned about general Lightroom performance, the Intel Core i7 7700K is significantly faster for most tasks and only ~10% slower when exporting images. Right now I'm running an Intel i7-6850, and Lightroom pretty much locks up my system (100% CPU usage) when I'm importing and creating previews or exporting. I was wondering if you had performed any testing using this faster memory, and whether further big gains were achievable for a modest investment. Thanks for the read! It is also worth noting that the 5800X and 5900X outperformed the 10900K not only in the passive tasks but in the active ones as well, which was where Intel was previously maintaining a slight edge. It may only be about 5% faster overall than the AMD Ryzen 9 3900X, but that still makes it solidly the fastest CPU we have ever tested for Lightroom Classic.
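For the lockup-during-export situation, the Windows affinity trick mentioned earlier can be sketched as follows (a hedged illustration; the choice of which cores to reserve and the executable path are assumptions, not a recommendation from the article). `start /affinity` takes a hexadecimal bitmask where bit N enables logical core N:

```python
def affinity_mask(cores) -> str:
    """Build the hexadecimal bitmask that Windows' `start /affinity` expects:
    bit N set means the process may use logical core N."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return format(mask, "X")

# Let Lightroom use only logical cores 0-11 of a 16-thread CPU,
# leaving cores 12-15 free so the rest of the system stays responsive:
print(affinity_mask(range(12)))  # FFF
# Then, from a Windows command prompt (illustrative path):
#   start /affinity FFF "" "C:\Program Files\Adobe\Lightroom Classic\Lightroom.exe"
```

Task Manager's "Set affinity" context-menu entry does the same thing interactively for an already-running process.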
You already know it better! Looking at the NEF numbers, there is really no reason to spend even a penny more for a 3950X instead of a 3900X (for Photoshop and Lightroom only). Until recently, even 3200MHz didn't meet our stability standards, and going above that is definitely going to cause more system instability. In the past, there were arguments for using an Intel processor for Lightroom Classic if you wanted to optimize for active tasks like scrolling through images, but with the new Ryzen 5000 Series CPUs, AMD takes a solid lead no matter the task. For a number of reasons which I won't go into here, there is a preference for Quadro cards. There is almost no reason to use the X-series when the Core i9 10900K is both less expensive and faster, so the true performance lead with the AMD Ryzen 5000-series peaks closer to only 20%. Even with all the improvements Adobe has made in the last couple of Lightroom versions to take advantage of the GPU, it is still primarily a CPU-driven application. It is very hard to know where you stand with performance on your current system. We've tried to work with the devs to add the functionality we need, but it can be hard to find time to add features that help us when they are busy tackling bugs and adding features that are useful for their end users. I would believe that scaling goes way down after 6 cores, though. Or does there exist a "political correctness" problem with Adobe? And it's not always straightforward, faster, and 100% utilized with more cores, as export is. Also, it helps import previews and the Develop module when you make and apply a preset with Sharpening and Noise Reduction set to 0. I NEVER delete anything. I don't think that is because any of them are scared, but rather because it is much harder to place a value on workflow optimizations than it is for things like "how long does this effect take to apply?".
In order to see how each of these configurations performs in Lightroom Classic, we will be using our PugetBench for Lightroom Classic V0.92 benchmark and Lightroom Classic version 10.0. Feel free to skip to the next sections for our analysis of these results to get a wider view of how each configuration performs in Lightroom Classic. Be sure to check our list of Hardware Articles to keep up to date on how all of these software packages - and more - perform with the latest CPUs. The officially supported RAM speed varies from DDR4-2666 to DDR4-3200 depending on how many sticks you are using and whether they are dual or single rank, and DDR4-2933 is right in the middle as well as being the fastest supported speed if you want to use four sticks of RAM. I really wouldn't advise going above 3200MHz, though. With this motherboard, Thunderbolt support is no longer as much of a factor when choosing between Intel 10th Gen and AMD Ryzen CPUs in our workstations. 3) Adobe CLAIMS it only uses 6 cores; if that's the case, do we expect them to start utilizing more cores in the future? There are quite a few things we want to test in LrC, but unfortunately the API is way behind other apps like Photoshop and Premiere Pro.
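A minimal sketch of how normalization against a reference system like this typically works (an assumption for illustration only - the exact per-task formula PugetBench uses is not spelled out in this discussion, and the numbers below are invented, not measured results): each task's completion time is compared against the reference system's time, so matching the reference yields 100 and faster runs score higher.

```python
def task_score(reference_seconds: float, measured_seconds: float) -> float:
    """Normalize one timed task against a reference machine:
    100.0 means 'as fast as the reference'; lower times score higher."""
    return round(reference_seconds / measured_seconds * 100, 1)

# Purely illustrative numbers:
print(task_score(60.0, 60.0))  # 100.0 - identical to the reference
print(task_score(60.0, 48.0))  # 125.0 - finished in 80% of the time
```

This time-ratio shape is why "the 'active score' benchmarks are all under 100" on a given run is plausible: any task that runs slower than the reference configuration drags its score below 100.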
