Posts posted by tamag901

  1. Upgrade from 3.x to 4.x went smoothly, but now the widgets on my Nexus are stuck with a permanent black background.

     

    On 3.x I could turn off the background toggle and have the text overlaid on the animated GIF - now these black rectangles cover it up, and toggling the background switch on and off does nothing.

     

    Is this a bug @Corsair?

    [Screenshot attached]

  2. "This is just a real basic thing that may have an answer, but I was wondering if there is a way to make the F13-F24 programmable keys actually work. I'm working with a Scimitar Pro, so I was wondering if there is a way I can make sure the keys work with most to all of the games I play. Is there a surefire way to make sure, or could it be different for each game?"

    You could set up macros instead of using F13-F24 directly. For example, you could bind something longer, like Ctrl+Alt+Shift+1, to slot 1 on hotbar 4.

     

    Then set up a macro so that F13 on the mouse triggers the keybind you set up in the game. Don't forget to add a very small delay (20ms) between each macro step - I find the macro misbehaves in FFXIV sometimes without it. I don't know how the game processes input, but you might have to raise the delay if you're playing at a lower framerate. (There's a rough sketch of the same idea in code at the end of this post.)

     

    I have the G1-G6 macro keys bound to various hotbar actions that I need to hit quickly (like Second Wind).
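
    For anyone who'd rather prototype the same idea outside iCUE, here's a minimal Python sketch using the pyautogui library (my choice for illustration - iCUE itself doesn't expose scripting like this). It sends Ctrl+Alt+Shift+1 with a small delay between every step, mirroring the 20ms gap described above:

    ```python
    import time
    import pyautogui  # pip install pyautogui

    # Delay between macro steps. Raise it if inputs get dropped
    # at lower framerates, as noted above.
    STEP_DELAY = 0.020  # 20 ms

    def fire_hotbar_keybind(number_key: str) -> None:
        """Send Ctrl+Alt+Shift+<number_key> with a delay between steps."""
        for key in ("ctrl", "alt", "shift"):
            pyautogui.keyDown(key)
            time.sleep(STEP_DELAY)
        pyautogui.press(number_key)
        time.sleep(STEP_DELAY)
        for key in ("shift", "alt", "ctrl"):
            pyautogui.keyUp(key)
            time.sleep(STEP_DELAY)

    # e.g. triggered by F13 on the mouse -> slot 1 on hotbar 4
    fire_hotbar_keybind("1")
    ```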

  3. I think I've got my OC settings right on the edge of stable - 1.30v and Mode 4 LLC. Any lower and OCCT fails.

     

     

    I just remounted and dropped my temps by quite a bit, actually - I'm now hovering in the 70C range when running the CPU-Z stress test instead of slamming straight into the 80s.

     

    I grabbed a shot of the IHS right after I removed the pump head - it looks like the thermal paste hadn't spread evenly?

    [Photo of the IHS attached]

  4. Hi,

     

    My LLC is currently set to Automatic - I was getting random crashes when trying anything from Mode 3-8. Setting it back to Auto fixed the crashing, so I believe Auto corresponds to either Mode 1 or 2.

     

    My voltage is currently set to Adaptive + Offset mode in the BIOS, with the offset at +0.025v.

     

    Here's a screenshot of the motherboard section in HWiNFO while running the CPU-Z stress test - it reports a Vcore of 1.356v.

    [HWiNFO screenshot attached]

  5. Is a 55-60C difference between the CPU package temperature and the coolant temperature normal?

     

    Got this when running the CPU-Z stress test. When the test starts, the CPU instantly shoots up to 80-85C, with the coolant in the 25C range. The coolant temp does rise by about 0.1C every 2-3 seconds, so there is some thermal transfer. Once the coolant hits 32C, the CPU is at 100C and thermal throttling. (A rough sanity check on what that rise rate implies is at the end of this post.)

     

    My chip is an i7-10700K overclocked to 5.1GHz @ 1.350v.

     

    I'm using XTM50 thermal paste that I applied using the included applicator.

     

    I've attached a screenshot of the temps from iCUE/HWMonitor.

    [Temperature screenshot attached]
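
    As a back-of-the-envelope sanity check on those numbers (the loop volume is my assumption - roughly 0.3 L of coolant is typical for a 240mm AIO), the observed rise rate suggests the coolant is only absorbing a fraction of what a 5.1GHz 10700K puts out under load:

    ```python
    # Rough estimate of heat actually reaching the coolant, from the
    # observed rise of ~0.1 C every 2-3 seconds.
    # ASSUMPTION: ~0.3 kg of water in the loop (typical for a 240mm AIO).
    coolant_mass_kg = 0.3
    specific_heat_water = 4186       # J / (kg * K)
    rise_rate_c_per_s = 0.1 / 2.5    # ~0.04 C/s, from the post above

    watts_into_coolant = coolant_mass_kg * specific_heat_water * rise_rate_c_per_s
    print(f"~{watts_into_coolant:.0f} W absorbed by the coolant")
    # -> ~50 W, versus the 200 W+ a 10700K at 5.1GHz / 1.35v can draw.
    ```

    This ignores whatever the radiator is already dissipating, so it's only a lower bound, but the gap is large enough to point at the cold plate contact rather than the radiator - which lines up with the remount result in post 3 above.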

  6. I'm inclined to just leave it be at this point - I'm sticking to using Video Lighting in some games, like MMOs, where it looks fantastic in raids and a drop of a few FPS (like from 160 to 155) doesn't really matter. If I really need to squeeze out a few extra FPS, like in Cyberpunk, I just set it to off in the game's iCUE profile.

     

    I reckon I've got the hardware to keep Video Lighting on in 90% of the things I play without noticing an impact, unless I'm staring at the FPS meter and taking notes. I treat it as just another graphical quality setting.

  7. "Will give 3DMark a go soon to see if it does anything."

     

    "Just posted in the other thread - 20k with ambient lighting off, 17.8k with it on in Time Spy, so yeah, pretty big impact."

     

    Can confirm - I also lost a bunch of points in 3DMark with Video Lighting.

     

    Without Video Lighting: 13.7k points overall (14.1k GPU, 12k CPU)

    With Video Lighting: 11.9k points overall (11.9k GPU, 11.7k CPU)

    So I also lost about 1.8k points (roughly a 13% drop) with Video Lighting active.

    [3DMark screenshots attached: without and with Video Lighting]

  8. "Did you try checking with the G-SYNC indicator (Control Panel > Display > G-SYNC indicator), just to be sure?

    Because I also have an Asus monitor and an RTX 2080 Ti, and I'm positive that Video Lighting disables G-SYNC in all my games, and also in GPU benchmarks like Superposition.

    Would be great to have a statement from Corsair directly."

     

    I just tried that out in a couple of games - it still says "G-SYNC On" when Video Lighting is (obviously) enabled. My monitor's OSD also says it's on, and I can see it changing its refresh rate according to the in-game FPS.

  9. Coming over from https://forum.corsair.com/forums/showthread.php?p=1071306#post1071306

     

    "I'm not an expert on this, but I believe this is related to the way iCUE captures colors.

    In fact, it's quite common in other ambient lighting apps as well. When I tested Hue, only one app managed to avoid the conflict, by using Desktopmagic to take the screenshots. The efficiency could be much lower than other methods, but at least it has no conflict with G-Sync. I hope Corsair could implement a similar method as a compatibility option.

    In terms of FPS losses, iCUE has to process the images to get the colors, so I guess it's just a sacrifice you have to make (unless you leave the processing, and even the capturing, to another device)."

     

    I don't seem to get any conflict between Video Lighting and G-Sync - I can have both enabled and they run well together. I've got an RTX 3070 and an Asus monitor, if that helps, and I run all games in Exclusive Fullscreen.

     

    "I suppose that makes sense. I would hope there would be a way to make it not affect performance so much. Regardless of how small or large the performance hit is, Corsair should mention it somewhere in their documentation so the consumer can decide if the performance drop is worth it prior to purchasing. I still love the effect, but depending on what game I'm playing I'll turn off video lighting so my FPS won't drop!"

     

    I suppose this is fair enough, since it has to use a few CPU cycles to sample the display (a rough sketch of what that sampling might look like is at the end of this post). I think the effect is worse in CPU-intensive games - HZD uses 60-70% of the CPU in open areas and feels the most impacted (from ~110FPS to ~90FPS, with stuttering). I tried Wolfenstein: Youngblood and the effect is much less apparent there - I went from ~160FPS to ~155FPS in open areas. In NieR: Automata there's no difference at all, but that game is locked to 60FPS. I couldn't tell a difference in The Outer Worlds either, even with that game running at 100FPS.

     

    I also did some runs in Unigine Valley, but having Video Lighting on did not seem to affect my score at all. Will give 3DMark a go soon to see if it does anything.
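
    To illustrate the kind of per-frame work that sampling involves (this is my own rough guess at the approach, not Corsair's actual implementation), here's a minimal Python sketch using Pillow: grab the screen, shrink it, and average the colors along each edge the way an ambient strip would:

    ```python
    from PIL import ImageGrab  # pip install Pillow

    # Grab one frame of the desktop.
    frame = ImageGrab.grab()

    # Downscale aggressively first; averaging a 64x36 image is far
    # cheaper than averaging millions of pixels.
    small = frame.resize((64, 36))
    pixels = small.load()

    def edge_average(coords):
        """Average RGB over a list of (x, y) coordinates."""
        r = g = b = 0
        for x, y in coords:
            pr, pg, pb = pixels[x, y][:3]
            r, g, b = r + pr, g + pg, b + pb
        n = len(coords)
        return (r // n, g // n, b // n)

    # One averaged color per screen edge, roughly what a strip
    # of LEDs behind the monitor would display.
    top    = edge_average([(x, 0)  for x in range(64)])
    bottom = edge_average([(x, 35) for x in range(64)])
    left   = edge_average([(0, y)  for y in range(36)])
    right  = edge_average([(63, y) for y in range(36)])
    print(top, bottom, left, right)
    ```

    Repeating a grab like that many times per second is where the CPU cost comes from, and the capture step itself is what can fight with Exclusive Fullscreen and G-Sync - which would explain why the app mentioned above avoids the conflict just by changing how it takes screenshots.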

  10. I've just finished installing the LS100 starter kit on my Asus monitor. It was a bit fiddly to get everything mounted, but I'm quite happy with how it turned out.

     

    Anyways, I jumped into Horizon Zero Dawn with the video lighting effect turned on (which is the reason I bought the LS100 in the first place). I usually get 80-100FPS in that game at 1440p with everything set to Ultra on my RTX 3070. With video lighting enabled, however, my FPS hovers around 60-70 instead. I know, still technically playable, but my monitor is 165Hz, so the drop is very noticeable.

     

    If I alt-tab out and switch to a different lighting mode (e.g. static), my FPS goes back to normal.

     

    I've tried a couple of other games, such as The Outer Worlds and NieR: Automata, but they don't seem to have the same obvious performance issues as Horizon.

     

    I don't seem to be having the same CPU usage issues that others are experiencing - iCUE sits at around 1-2% usage on my i7-10700K. G-Sync appears to be working normally too.

     

    This is quite disappointing, as video lighting really is a nice effect and is the main reason I got the LS100. Hoping someone knows a way to work around this?
