Interactive speed / GPU use gets worse

In LR 2015.6, and maybe (!) also in previous versions, I have a problem with the interactive speed when dragging the sliders:

Just after I start LR, and for maybe a few more minutes, the interactive speed while dragging sliders in the Develop module (with the GPU active) is perfect.

But then, all of a sudden, the interactive speed worsens perceptibly and even the GPU usage (monitored with a desktop widget) drops off. It's as if the GPU isn't being used for things any more. Unchecking and rechecking the GPU option in Preferences doesn't help. Only a restart of LR does.

I have no idea why this is happening or what I do to trigger it. I just can't repro it on demand, but it happens every time I use LR.

BTW, I'm on Win 8.1, 5930K @ 4.1 GHz, 32 GB RAM, GTX 970.

LR in general runs quite fast, but something happens after a few minutes. The program as a whole doesn't get slower; it's just the interactive speed and GPU usage that fall off.

RAM and VRAM are not maxed out when that happens. I still have at least 50-70% free.

There are reports of speed problems in 2015.6, especially with Nvidia GTX cards.

If you wish, you can roll back to 2015.5.1 until it is resolved. Here's how: http://www.lightroomqueen.com/how-do-i-roll-back-to-lightroom-2015-1-1-or-lightroom-6-1-1/

Just replace 6.1.1 with 6.5.1

Tags: Photoshop Lightroom

Similar Questions

  • Since the last update Photoshop CC will not use my GPU while Lightroom CC works. GPU is a Radeon HD 5700

    Since a recent update, Photoshop CC will not use my GPU, saying I do not have a minimum of 512 MB of VRAM, and won't let me enable it, even though I have 1 GB of VRAM...

    When I load Lightroom CC, it detects the GPU and it is already selected for use.

    I use Windows 10 with 8 GB of RAM and a Radeon HD 5700.

    Any ideas on how to fix this?

    Go to the AMD web site and download the most recent Windows 10 device driver for your graphics card.  It looks like the one you have installed is not correct.  Note that it shows no driver date and no VRAM:

    glgpu[0].GLVersion = "3.0"
    glgpu[0].IsIntegratedGLGPU = 0
    glgpu[0].GLMemoryMB = 0
    glgpu[0].GLName = "?"
    glgpu[0].GLVendor = "ATI Technologies Inc."
    glgpu[0].GLVendorID = 0
    glgpu[0].GLDriverVersion = "?"
    glgpu[0].GLRectTextureSize = 16384
    glgpu[0].GLRenderer = "AMD Radeon HD 5700 Series"
    glgpu[0].GLRendererID = 0
    glgpu[0].HasGLNPOTSupport = 1
    glgpu[0].GLDriver = "?"
    glgpu[0].GLDriverDate = "?"

  • Using the GPU for rendering? Does it make renders faster?

    I'm trying to export a 20 min video I've made in After Effects. It has a few particle effects and some Audio Spectrum effects. When I start a render, it shows 20 to 30 hours to complete. If I had several machines I would have no problem with that, but I do not.

    I tried most of the suggestions in the forums to try and speed up the rendering time:

    • Classic 3D renderer instead of Ray-traced
    • Removed unused sequences and precomps
    • Reduced the number of effects
    • Reduced the number of precomps
    • Reduced the number of expressions
    • Rendering with CUDA
    • Disabled motion blur
    • Disabled 3D layers

    Still, I can't get the rendering time any shorter. I also tried various render settings:

    • YouTube 1080p
    • YouTube 720p
    • QuickTime Animation
    • Lossless
    • JPEG sequence

    with no major difference (YouTube 1080p being the best).

    I noticed that Adobe Media Encoder (AME) does not use all the power it could. It uses around 50% of the CPU, but I read that may be because it doesn't have time to ramp up before moving on to the next frame.

    I also notice that even using CUDA acceleration, AME does not use the GPU's power at ALL. I have two GTX 780s and I'm sure that if it used them it would speed up a lot, but I don't know if they can be used, and if it's possible I don't know how to enable it.

    I'm looking for any other suggestions to speed up my rendering, or instructions for making AME use my GPU.

    Specs:

    • i7-4790K
    • 32 GB of RAM
    • 2 x GTX 780

    The GPU does not render 99% of what AE does. If you use AME to render an AE comp, then AE works in the background, so AME must leave some resources free so you can do other things. Rendering through AME is always slower than rendering through the Render Queue if you go to the same format, but for most of us that is not a problem because we don't sit and wait for renders; we work on the next shot or move on to another project. That's the beauty of AME: you never have to stop creating, which is the thing you get paid for.  If it's a rush to get the project out the door, then render with AE's Render Queue to a fast production codec and then run that through AME, AME being much more efficient and faster at transcoding video into another format.

    As far as general AE things go, temporal effects take the most time to render, and particles also take a long time. CUDA-accelerated ray-traced rendering is absolutely not supported in AME, even if you have a correctly configured NVIDIA GPU. I don't know any professionals who use the ray-traced renderer for anything whatsoever. There are much better solutions, even without 3rd party effects.

    I hope this helps. The best way to get something out the door is to render a standard production format master using the Render Queue and a fast-to-render format, making sure you use standard production formats for all of your assets in the AE project and that all of the images used in your comp are scaled to 100% at some point (a lot of people try to make HD photo slideshows with 20 MP DSLR images that are all resized way down; a pre-scaling sketch is included at the end of this reply). Once you have your standard production format master, drop that into AME to generate your deliverables.

    Currently AE will not use 100% of your system resources when rendering HD or even 4K video, because AE looks at only 1 frame at a time and any modern processor can easily process a single frame using a fraction of its power. Then everything has to be calculated again for the next frame. Certain effects use more memory, and temporal effects and particles can quickly fill the memory, but until the basic architecture of the AE render engine is completely revamped we're stuck with slow renders.

    Just a little FYI... Many of my complex comps can take 1 or 2 minutes per frame to render. Some have taken 5 or 6 minutes. My 'pencil tests' of movement are almost always done with little or no effects, just to check the look of the animation. I call them pencil tests after the traditional animators' rough pencil sketches used to check action and blocking before sending frames out to ink and paint, because it makes no sense to paint a scene when you don't yet know whether Bambi walks funny because you skipped the pencil test. In any case, my 'pencil tests' or previews usually render at about 7 to 10 frames per second. That's the range of my work. Most production companies trying to earn a living doing effects and editing have a fairly strict set of standards for render times, and they design their work to fit into that mold.
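    To make the pre-scaling advice above concrete, here is a minimal sketch, not from the original reply, that batch-resizes oversized stills before they are imported into AE. It assumes Python with the Pillow library installed and uses hypothetical folder names; adjust the target size to your comp.

    # Hypothetical batch pre-scaler: shrink oversized DSLR stills to fit the
    # comp size before importing them into After Effects, so AE doesn't have
    # to downscale 20 MP images on every frame.
    from pathlib import Path
    from PIL import Image  # assumption: Pillow is installed (pip install Pillow)

    COMP_SIZE = (1920, 1080)          # target comp resolution (HD in this example)
    SRC = Path("stills_original")     # hypothetical input folder
    DST = Path("stills_for_comp")     # hypothetical output folder
    DST.mkdir(exist_ok=True)

    for img_path in SRC.glob("*.jpg"):
        with Image.open(img_path) as im:
            im.thumbnail(COMP_SIZE)   # thumbnail() only ever shrinks, keeping aspect ratio
            im.save(DST / img_path.name, quality=95)
            print(f"{img_path.name}: resized to {im.size}")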

  • Adobe Premiere renders are not as fast as they should be; I suspect it doesn't use the discrete GPU.

    I built a PC not so long ago:

    Windows 7

    3.9 GHz CPU

    8 GB RAM

    1 TB drive at 7200 RPM (S.M.A.R.T. status: good)

    NVIDIA GTX 750 Ti 2 GB GPU

    Renders off an external drive, 1.5 TB at 7200 RPM (S.M.A.R.T. status unconfirmed)

    Renders take forever sometimes, and using Speccy (an application that shows PC specs) I see that my CPU reaches 95-105 °C while the GPU stays at a stable 25-30 °C. This makes me believe the GPU is not working as hard as it should and the CPU is doing all the rendering, which is exactly what I don't want. I want my GPU to do it.

    I already went into the BIOS and set the GPU as the main graphics adapter, and set Adobe Premiere to use Mercury GPU acceleration.

    I don't know what the problem is.

    Currently I'm rendering a clip, 270 frames, with Warp Stabilizer and the Neat Video noise reduction filter applied. It takes more than 12 minutes! The clip is 9 seconds long.

    Any ideas?

    TL;DR: CPU at 95-105 °C and GPU at 25-35 °C, and a 9 second clip takes 12 minutes to render. I suspect Adobe Premiere is not using the GPU to render at all.

    Alright, OP here again.
    I apologize for neglecting this thread a bit.

    Long story short, I found a very simple temporary solution.

    I went into the PC's power options and changed the plan to High performance, then went to where it says minimum and maximum processor state. Both were set to 100%. I changed the maximum to 99% and my CPU temperature dropped from around 95 °C to about 65 °C. However, my CPU speed dropped from 3.9 GHz to 3.6 GHz. Not a huge loss in GHz compared to the temperature drop; it's a trade-off I'm willing to take. This seems to be the only solution (the same change can also be scripted; see the sketch at the end of this post). It seems AMD and overheating are synonymous...

    I'll run the benchmark tomorrow.

    I really appreciate all your answers and insight.
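    For reference, the maximum-processor-state change mentioned above can be scripted instead of clicking through the Power Options dialog. This is a minimal sketch, not from the original post; it assumes Windows' standard powercfg aliases and an elevated (administrator) prompt.

    # Cap the maximum processor state at 99% to keep the CPU out of turbo,
    # mirroring the manual tweak described above. Assumes Windows and the
    # standard powercfg aliases; run from an elevated prompt.
    import subprocess

    def set_max_processor_state(percent: int = 99) -> None:
        # PROCTHROTTLEMAX is the "Maximum processor state" setting.
        for switch in ("/setacvalueindex", "/setdcvalueindex"):  # plugged in and on battery
            subprocess.run(
                ["powercfg", switch, "SCHEME_CURRENT",
                 "SUB_PROCESSOR", "PROCTHROTTLEMAX", str(percent)],
                check=True,
            )
        # Re-apply the active scheme so the change takes effect immediately.
        subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

    if __name__ == "__main__":
        set_max_processor_state(99)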

  • I added a router on a high speed connection using Windows Vista. When the PC is connected to the router there is no internet connection. When the PC is connected directly to the cable modem, the Internet is available.

    I added a router on a high speed connection using Windows Vista. When the PC is connected to the router there is no internet connection. When the PC is connected directly to the cable modem, the Internet is available.

    Hello

    I suggest you follow the steps in this link and check if it helps:

    http://Windows.Microsoft.com/en-us/Windows/help/wired-and-wireless-network-connection-problems-in-Windows

    It will be useful.

  • How to use the GPU of the AMD FirePro D500 in a Mac Pro 2013

    I installed Premiere Pro CC 2015 and After Effects CC 2015 on a 2013 Mac Pro with the AMD FirePro D500 graphics card. I can set GPU rendering in Premiere Pro, but it's impossible to set it in After Effects; I can only use the CPU in the preview settings. What can I do?

    AE uses only a few NVIDIA CUDA-enabled cards to speed up ray-traced rendering. This feature never quite worked (masks and effects on 3D layers are disabled), and NVIDIA has changed the direction of its development of this technology, so the ray-traced rendering feature is no longer being developed by Adobe. They are working on another technology to use the GPU, but it is not yet available. There is nothing you can do to enable that GPU acceleration unless you have a compatible NVIDIA card.

  • After Effects GPU use

    Szalam, if possible I'd like some more information on this topic, because I'm having a bit of a hard time getting my head around how After Effects uses the GPU in some situations.

    For example, I have a fairly complex UltraHD comp with a group of Element 3D layers. Because we work a lot with the Element 3D layers, we have the After Effects rendering engine set to "Classic 3D". Because I'm using a K4200 GPU, I'll just follow Adobe's advice and use the CPU for any ray tracing, since the K4200 is "unsupported".

    So now I'm assuming that After Effects will do all of its processing on the CPU, with the exception of the Element 3D work, which will be done only on the GPU, correct? And then, I guess, as long as I set "Render Multiple Frames Simultaneously" multiprocessing to something sensible (i.e. leaving 1/4, or preferably more like 1/3, of the available RAM for non-After-Effects work and not starving the processors), After Effects will try to max out the processors as much as possible for everything except the Element 3D work (which will hopefully max out the GPU). Is this correct?

    Is it worthwhile to use an untested/unsupported GPU (for example the K4200) to accelerate the ray-traced render engine in this scenario (i.e. the only 3D items in the comp are Element 3D layers, cameras and lights), or is it better to leave that on the CPU and keep the GPU solely for Element 3D?

    Thanks in advance!

    Chris Hocking wrote:

    For example, I have a fairly complex UltraHD comp with a group of Element 3D layers. Because we work a lot with the Element 3D layers, we have the After Effects rendering engine set to "Classic 3D". Because I'm using a K4200 GPU, I'll just follow Adobe's advice and use the CPU for any ray tracing, since the K4200 is "unsupported".

    Right. Your scene doesn't have any layers that use the ray-traced rendering engine at all, so leaving it on Classic 3D is the best choice. The new version of Element has "ray-traced" shadows and whatnot, but that is unrelated to AE's obsolete ray-traced renderer. AE's ray-traced render engine is essentially an effect that leans on NVIDIA's OptiX GPU library. It was just there to add depth to layers. People are often confused and think that turning the ray-traced renderer on would engage the GPU for rendering in general, but the only thing it ever accelerated was the ray-traced feature itself (which very few people have ever used).

    Chris Hocking wrote:

    So now I'm assuming that After Effects will do all of its processing on the CPU, with the exception of the Element 3D work, which will be done only on the GPU, correct?

    Yes.

    Chris Hocking wrote:

    And then, I guess, as long as I set "Render Multiple Frames Simultaneously" multiprocessing to something sensible (i.e. leaving 1/4, or preferably more like 1/3, of the available RAM for non-After-Effects work and not starving the processors), After Effects will try to max out the processors as much as possible for everything except the Element 3D work (which will hopefully max out the GPU). Is this correct?

    Pretty much, yes. There are other potential bottlenecks (if you're using footage off an external drive with a slow connection, for example), but that's how the basics will work. (A back-of-the-envelope sketch of the RAM split is included at the end of this post.)

    Chris Hocking wrote:

    Is it worthwhile to use an untested/unsupported GPU (for example the K4200) to accelerate the ray-traced render engine in this scenario (i.e. the only 3D items in the comp are Element 3D layers, cameras and lights), or is it better to leave that on the CPU and keep the GPU solely for Element 3D?

    Your scene, as you describe it, does not use the ray-traced rendering engine at all. The ray-traced rendering engine is just a way to natively add depth and dimension to layers in AE. Given that you don't use this feature, switching your comp to the ray-traced rendering engine won't help whatsoever and will instead cause longer render times.

    But feel free to test it yourself. Try a render with the standard rendering engine and one with the ray-traced renderer turned on, and see whether you notice any differences.

    Here is an official Adobe blog post on the GPU in AE: GPU (CUDA, OpenGL) features in After Effects
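    As a back-of-the-envelope illustration of the "leave 1/4 to 1/3 of RAM for other work" guideline discussed above, here is a rough sketch. The totals and the 3 GB-per-background-CPU figure are hypothetical examples, not values from this thread.

    # Rough arithmetic for AE's "Render Multiple Frames Simultaneously" settings.
    def mp_plan(total_ram_gb: float, physical_cores: int,
                reserve_fraction: float = 1 / 3, ram_per_bg_cpu_gb: float = 3.0):
        """Return (RAM reserved for other apps, background CPUs AE can feed)."""
        reserved = total_ram_gb * reserve_fraction   # left for the OS and other apps
        available = total_ram_gb - reserved          # RAM AE may hand to render processes
        # Don't configure more background processes than the available RAM can feed,
        # and leave at least one core for the foreground UI.
        bg_cpus = min(physical_cores - 1, int(available // ram_per_bg_cpu_gb))
        return reserved, bg_cpus

    reserved, bg = mp_plan(total_ram_gb=32, physical_cores=6)
    print(f"Reserve ~{reserved:.0f} GB for other apps; give AE up to {bg} background CPUs")
    # -> Reserve ~11 GB for other apps; give AE up to 5 background CPUs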

  • Using the 5K iMac's GPU with After Effects

    Hi all

    I have a 27" iMac with an AMD Radeon R9 M290 (2 GB) GPU.

    I use After Effects CC 14, but I can't get it to use the GPU (it uses the CPU). How can I solve this?

    (Using only the CPU is not the best way to work with AE.)

    Thank you.

    After Effects uses the GPU for very little.

    Your GPU will be used for all the OpenGL GPU functions. It just won't be used for the only CUDA feature: the outdated ray-traced 3D rendering engine.

    For more information, see the following page:

    GPU (CUDA, OpenGL) features in After Effects

  • Will a dedicated GPU accelerate LR4 driving a high-resolution monitor?

    I plan to order a new custom-built Win7 desktop based on an i7 3770K and an Asus P8Z77 mobo. The monitor will be a 27" wide-gamut display with a resolution of 2560 x 1440. I am aware of the complications caused by these monitors, but have decided to go with one.

    My main and most demanding application for this desktop will be LR4 (no CS6, games, videos, etc.). To optimize its performance I'll have 16 GB of RAM, an SSD for the OS and a second SSD for the LR4 app, cache, previews and catalogues, plus a 2 TB HDD for image storage.

    The current version, LR4.3, is frequently claimed to 1) be slow driving high resolution monitors and 2) not use the GPU effectively.

    The Asus mobo with the 3770K/HD4000 is specified to be able to drive a monitor of the above resolution through its DisplayPort. However, would LR4 be faster if I added a decent dedicated GPU for the monitor?

    If the answer to that is no, would adding a smaller second monitor (1920 x 1080 max resolution) to the above setup change the situation and make a dedicated GPU beneficial? Basically, the Asus mobo with the 3770K/HD4000 can by specification also drive two monitors of the above resolutions simultaneously. However, my concern is LR4 being slow...

    Thanks for any help.

    Lightroom does not use the GPU.

    You may have other reasons (video games) to add a GPU, but don't add one for Lightroom's sake.

  • Export / render video using the GPU / MPE?

    I have MPE GPU acceleration enabled in my project settings, but when I export with the H.264 YouTube Widescreen HD video preset, my GPU usage is 0% and my CPU usage is 74%.

    Is there something special I have to do to encode/export using the GPU instead of the CPU?

    MPE only gives hardware acceleration if you have accelerated effects in your timeline. If you have no supported effects in your timeline, you get no benefit.

    Despite all the hype about MPE and the supported cards, it only helps if the effects you use are supported ones. If you don't use them regularly, what is the sense of investing in a video card, cheap or not, if you can't use it? That's something every potential buyer should be aware of.

    Investing in a CUDA-supported card is only reasonable if you use supported effects regularly.

    The benefits of exporting a timeline without supported effects are still unclear, and maybe you know by now that without those effects there is no discernible benefit from a supported video card. I wish I knew more about it, but we need benchmarks to tell us where and when we can benefit from such a card.

  • When dictating long notes using Siri, Siri stops as soon as I take a short pause, so the note ends abruptly. Any solution for this?

    When dictating long notes on the Mac using Siri, Siri stops as soon as I take a short pause, so the note ends abruptly. Any solution for this?

    For long notes, you're better off using regular old Dictation, which has been part of macOS for a few years now; Siri isn't required.

    To turn it on, go to System Preferences -> Keyboard -> Dictation. Once enabled, you can press the "Fn" key twice to activate it and dictate long notes in any application.

    Siri is optimized for short, quick things. I don't think you can change that; just use whichever method better suits what you want to do at the time (short notes: use Siri; long notes: use Dictation).

  • Integrating acceleration to get velocity using Time Domain Math

    I'm trying to do something simple, but haven't found the answer to my problem.

    I have a loop that takes torque as an input and calculates the acceleration; see the attachment.

    I'm trying to calculate velocity from the acceleration.

    I used the continuous integral function of Time Domain Math, with the acceleration wired in as the signal, but the velocity it displays is not correct.

    If the acceleration is 1 in/s^2, the calculated velocity reaches tens of thousands of in/s within a second. That cannot be right.

    What am I doing wrong?

    Thank you

    Looks like I posted this in the wrong place. I don't know how to delete it, though.
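    For what it's worth, a velocity that is off by orders of magnitude when integrating acceleration usually means the integration is being done once per loop iteration without the real sample interval dt. Here is a minimal sketch of the idea (plain Python rather than LabVIEW, with a hypothetical 1 kHz sample rate), not the poster's actual VI:

    # Trapezoidal integration of acceleration samples into velocity.
    # The key point: each step must be scaled by the actual sample interval dt.
    # With a = 1 in/s^2 and dt = 1 ms, velocity after 1 s should be ~1 in/s,
    # not tens of thousands (which is what you get if dt is effectively 1).
    dt = 1e-3                       # hypothetical sample interval: 1 kHz acquisition
    n_samples = 1000                # one second of data
    accel = [1.0] * n_samples       # constant 1 in/s^2 for the example

    velocity = 0.0
    prev_a = accel[0]
    for a in accel[1:]:
        velocity += 0.5 * (prev_a + a) * dt   # trapezoidal rule: average times dt
        prev_a = a

    print(f"Velocity after {n_samples * dt:.1f} s: {velocity:.3f} in/s")  # ~0.999 in/s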

  • Will Need for Speed Hot Pursuit 2 work with Windows 7?

    Will Need for Speed Hot Pursuit 2 work with Windows 7?

    Click the following link to check the game's compatibility with the Windows 7 operating system.

    http://www.Microsoft.com/Windows/compatibility/en-us/default.aspx

  • Premiere Pro CC 2015 won't use my GPU (GTX 750 Ti)

    Hey people!

    I have an ASUS GeForce GTX 750 Ti and I can't get Premiere to use the GPU.

    When creating a project, under Project Settings -> Mercury Playback Engine, GPU acceleration (CUDA) is available and selected!

    But here's the thing: if I export any kind of footage, my ASUS control panel shows very little GPU usage!

    When I use After Effects (there, I can choose the GPU, with a note that it's not recommended), GPU usage while I work is noticeable, ~30-50%.

    It's not only the usage that makes me skeptical; it's the export time...

    Before the GTX 750 Ti I used the onboard GPU, and the export time now (with the GeForce) is the same or even longer?

    What I've tried so far:

    I added a "cuda_supported_cards.txt" with 'GeForce GTX 750 Ti' entered - the first Adobe Pro CC 2015 folder!  But nothing happened - no use of GPU when you change or made... ;(

    You guys have any suggestions for me?

    I'm really looking forward to any tips/advice.

    Greetings from Tyrol

    System:

    Win 10 Pro x64

    ASUS W97 Plus mainboard

    ASUS GeForce GTX 750 Ti

    64 GB Ballistix DDR3 RAM

    Read this doc on what CUDA does in Premiere and what it does NOT do (e.g., encoding):

    CUDA, the Mercury Playback Engine, and Adobe Premiere Pro | Premiere Pro work area

  • Use the GPU?

    Use the GPU?

    What is the official recommendation for a 1080p monitor?

    Thank you

    Hi stephenr,

    Check this link for a list of graphics cards tested and suggested for Lightroom:

    Adobe Lightroom GPU troubleshooting and FAQ

    Kind regards

    Claes

Maybe you are looking for

  • Unable to change my default message font in Mail :-(

    Hello, I'm trying to change the default font for my Mail messages. When I open Mail 'Preferences' and click 'Fonts & Colours', it says my Message font is set to "Helvetica 12". When I click the "Select" button, a pop-up window gives me a

  • Reference Dell D10S: LaserJet Pro 400 451dn software has an error during installation.

    The LaserJet works fine when plugged in via USB, but it has now been moved to a wireless location. The software will not work. I tried installing the printer on Win 10: "the driver is not available". Tried uninstalling the universal printer driver, restarting, reinstalling the HP driver,

  • Problem with Outlook Express "Message has not been downloaded."

    I'm having a problem with emails and attachments being removed once I open and view them.  More specifically, when I open an e-mail message with an attachment using Microsoft Outlook Express 6, I am able to see the file and open the attachment.  Howev

  • I no longer see the splash screen

    Hello, I was wondering if there is a problem. When I start my PC, I don't see the initial start-up screen (where you see the PC run through its checks, before Windows loads). I wanted to install a PCI card (replacing the onboard one), but

  • 2504 versus 3850 wireless controller

    We are just beginning to look at wireless on our secure networks. We have a 2504 and some 36xx WAPs that were never installed. We also ordered a couple of 38xx WAPs and the 3850 models, which include the wireless controller. We plan to use our Cisco AC