Corsair Community

Showing results for tags 'dominator platinum'.

  1. Hey, the question is in the title. I'd like to build my system soon. When will the Dominator be available to buy?
  2. I have the Corsair Dominator Platinum DDR4 3200MHz 64GB kit (CMD64GX4M4C3200C16) in my X99 system and have been using it for more than four years. It has been working great; however, I just put a new PSU in the system yesterday and now the system won't boot. I have the ASUS Rampage V Extreme motherboard with the 6950X CPU and this Corsair RAM kit. The debug code the mobo throws is 'b7', so it is memory related. After spending hours trying to diagnose the issue, I've narrowed it down to the following:
     1.) Default settings work fine. I can boot into Windows and everything seems to work. HOWEVER, the RAM timings are not the kit's advertised values: they should be 16-18-18-36, but instead they are 15-15-15-36. I know that default settings run at 2133MHz and not 3200MHz, but that isn't the problem here; it is the timings! They show up as C15 instead of C16 (see screenshot).
     2.) The moment I try to apply any XMP profile or even manually set the timings, it does not boot! When I go back into BIOS (after clearing CMOS), the RAM timings show up as C15.
     I have tried OC'ing the CPU and keeping the RAM at Auto/Default: still no boot/POST. I have tried setting XMP on the RAM and keeping the CPU at Auto/Default: still no boot/POST. I have also tried setting the RAM timings manually without changing anything else: still no boot/POST. Every single time, it shows the debug code 'b7'. I have tried reseating the RAM sticks several times to no avail. I also tested with just ONE RAM stick: still no boot/POST when trying XMP, manual, or OC settings. Only default settings boot, and always at C15. The moment I change anything in BIOS, it does not boot/POST. I have checked and rechecked all the connections from the PSU; everything is connected properly. I am at a loss as to why it does not even POST when using XMP or OC settings. I had the OC fine-tuned and was using it for more than four years without a hitch!
     I really need help fixing this issue. PLEASE HELP! Thanks in advance.
  3. Hey, I've been trying to set layered effects on my Corsair Dominator Platinum RGB RAM. In the YouTube videos I watch, people are able to select individual LEDs to apply effects to. I tried clicking, click & drag, and Ctrl+click, but nothing worked for me. I can only choose which RAM sticks I want to apply the effect to. I attached a screenshot of my iCUE. Thanks for the help.
  4. Hi, I just purchased the Dominator Platinum RGB 16GB (2x8GB) 3600MHz DDR4 dual-channel memory kit for Ryzen while upgrading my PC, along with a Ryzen 7 3700X and an ASUS ROG Strix X570-F Gaming motherboard. The system is perfectly stable provided I keep the RAM running at the default 2133MHz frequency; the moment I try 3600MHz or even 3200MHz, my system starts to randomly restart. Any help would be appreciated. Thanks.
  5. I can't seem to find the hardware lighting profile setting that keeps my lighting profile active after the iCUE app is closed. Basically, when I set a color and close the app, the colors go back to factory default (rainbow or something). How do I find this setting for the memory, or at least how do I make it stay the color I chose? I have 6 LL fans and they have a hardware lighting profile, so when I turn my computer on/off or close the iCUE app, they keep the color profile I chose. Thanks in advance.
  6. Blessings :) I've got non-RGB Corsair Dominator Platinum, and now that the RGB edition ("Corsair Dominator Platinum RGB") is out, I would like to swap my current non-RGB modules' top bars for the RGB ones. Is there a kit or a way to do that? I would like to see my non-RGB RAM in RGB! Thank you :)
  7. This build log is going to be a bit on the personal side. The fact is, at its core, Corsair is a cadre of geeks with shared interests trying to make cool stuff. A lot of companies want to project being “cool” or “rock stars,” but the reality here is that our products are conceived and designed by a bunch of people who are just trying to produce something they’d use. Why am I laboring over the notion that Corsair is ultimately a fairly human organization? Because, well, human things happen to us. At the end of August, I had a very good friend die in a motorcycle accident. He was in his early thirties, driving home from work as a district supervisor for DHS out of Oakland, California. Hit a bad patch of asphalt, lost control of his motorcycle, went under a semi, and that’s all she wrote. Odds are you don’t know him, but given the number of people I saw at his memorial service, I wouldn’t be surprised if one or two of you did. His name was Benjamin Moreno. Ben was a fairly serious gamer. We got into Mass Effect 3 multiplayer together, then graduated to MechWarrior Online with some of our friends. He and his wife were into Star Wars: The Old Republic and Elder Scrolls Online, and near the end had spent considerable time playing Dota 2 and Heroes of the Storm. He got me to give Dragon Age II another chance (and was right on the money). He was also a big part of my choice to join Corsair. Outside of that, he was – regardless of your politics – an exceptional cop. Tough-minded, fair, and directly responsible for saving many lives. Before that, he was in the Air Force. Through his life, he had friends who he’d set on the right path when they’d strayed, and was generous with his time and attention. There are an awful lot of people who would be far worse off today if it hadn’t been for him. Unfortunately, Ben left behind a widow, Risa, and a very young daughter, too young to really comprehend that her father’s not coming home. 
His family lives on the outskirts of the Bay Area, which unfortunately played a role in his passing due to the long commute. Gaming was and is a very large part of how they stayed in contact with friends. He and I often talked about someday building him a ritzy custom loop system when circumstances and finances permitted. Since Risa is an avid gamer and plays a healthy amount of Dota 2, it seemed like building her a proper, custom loop gaming machine was the right thing to do. It didn't have to be as fancy as his would have been, but it should have plenty of horsepower for gaming, photo editing, and coding. You're going to find the custom loop is excessive for this build, but I haven't built a custom loop for performance reasons for a long time. The fact is that it looks cool, not just to fellow geeks, but to just about everyone. With that said, here's the component breakdown for the "Blight" Memorial Build, named after his handle:

Corsair Carbide Air 240: His old gaming PC was built in an Air 540, so it seemed appropriate to go with its more compact cousin for the new one. This would also be an opportunity to show a custom loop operating inside this substantially smaller chassis.

Intel Core i7-5775C: We had a couple of spare Broadwell chips from internal testing. These are both remarkably powerful and remarkably efficient, and while it's not the latest and greatest available, the i7-5775C is mighty close. Four cores, eight threads, that massive L4 cache, second in IPC only to Skylake, and a 65W TDP. The odds of being CPU limited with this chip are very low.

ASRock Z97E-ITX/ac Mini-ITX: We did our internal testing on Broadwell using this platform and found it rock solid with good overclocking potential. Given the cramped quarters of the Air 240, it seemed necessary to go with a smaller motherboard.

Corsair Dominator Platinum 2x8GB DDR3-2400 C10 with Light Bars: In my testing, I've found 2400MHz to be the perfect speed for DDR3 on Haswell and, to a lesser extent, Broadwell. 16GB of DRAM provides plenty of memory to work with for almost any task.

EVGA GeForce GTX 970: It didn't make sense to put some monster graphics card in the build, but we definitely needed one that would be plenty powerful for gaming for the foreseeable future. NVIDIA's GeForce GTX 970 was that card, and we went with an EVGA model because of EVGA's tendency to adhere to NVIDIA's reference design (improving waterblock compatibility).

Corsair Force LS 960GB SSD: The Force LS was our budget line up until our TLC-based Force LE drives, but make no mistake: these drives, and the 960GB one in particular, are plenty fast. We're at the point now where nearly a terabyte of solid state storage is no longer outrageous, and the 960GB Force LS is a highly capable drive.

Corsair HX750i 80 Plus Platinum Power Supply: The HXi series isn't quite as popular these days with the more affordable RMi and RMx series floating around at 80 Plus Gold efficiency, but the HX750i was chosen for its compatibility with our Type 3 sleeved cables, its higher efficiency, and its ability to run fanless at the loads this system was likely to produce.

Corsair Link Commander Mini: A powerful system need not be loud. The Commander Mini lets me spin the violet SP120 LED fans in the system at minimum speed as well as control the RGB lighting strips placed on the inside of the side panel, surrounding the window.

XSPC 240mm Radiator: For this build we're looking at a rated maximum combined TDP for the CPU and graphics card of just 210 watts. Since even an H100i GTX can cool a 350W overclocked i7-5960X without too much difficulty, I felt a single 240mm radiator in the front would be fine for these highly power-efficient components.

EKWB FC970 GTX Waterblock: The PCB of the GTX 970 is so small, and the EKWB block really shows that off. The clear acrylic surface lets the end user see the coolant running through the graphics card, which is very cool. Because the block is so much shorter than the stock cooler, it affords us room in the case to optimally place the pump/reservoir combo.

XSPC Raystorm CPU Block w/ Violet LEDs: Since this build was intended to be more showy as opposed to a crushing performer, I opted for XSPC's Raystorm water block with violet LEDs to give the CPU the right glow.

EKWB D5 Vario XRES 100 Pump and Reservoir: I've had great experiences with the D5 Vario pump in my own liquid cooled build, and this combo seemed to be the perfect choice for an attractive, efficient system.

In addition to the parts used in this build, we also included a Corsair Vengeance K70 RGB keyboard, a Sabre RGB Optical mouse, and our new Void RGB headset in black. With all of the components installed, the "Blight" build looks like a fun-size version of a more beastly Air 540 liquid cooled build, which achieves exactly the intended purpose. Because of the highly efficient components, the fans never have to spin up, and everything still stays running cool and fast. The violet (which I confess can look pink in some light) coloring was chosen for its significance to both Risa and Ben, as it's their favorite color. It undoubtedly seems at least a little unusual to build a computer as a memorial for the passing of a dear friend, but gaming is fast becoming an integral part of our culture. I can think of no better tribute to a community gamer than to keep his wife connected with their friends and loved ones.
  8. The Graphite Series 380T was designed to be the ultimate LAN enclosure, with a sturdy handle on the top, easy internal access, integrated fan control, and a striking ID. For better or worse, we expanded its dimensions to allow you to install a 240mm liquid cooler for the CPU. Amusingly enough, though, what I always fixated on with it was the way the white one, with red LED fans, could wind up looking like this guy’s head: Source: Mass Effect 2 wiki. The white version of the 380T has white LED fans and white LEDs for all of the lighting, but that’s fixable. What I also wanted to do was put the most comically powerful system I could inside the case. Initially I was gunning for efficiency and planning to use Intel’s Core i7-5775C CPU, but Broadwell’s limited overclockability wound up being unappealing in the face of being able to go completely insane with the ASRock X99E-ITX/ac: Source: ASRock. While the board uses an enterprise-class socket with narrower mounting points than the traditional LGA 2011-3 socket, Asetek produces a mounting kit for this narrow socket that allowed me to install an H100i GTX, giving me all the cooling performance I could need for the Intel Core i7-5960X I was planning to use. It can be tough to scale to high DDR4 speeds on Haswell-E when you’re populating all four memory channels, but when you’re running in dual channel it takes some of the load off the controller. The result is that I have two 8GB DDR4-2800 DIMMs installed, making up some of the memory bandwidth deficit stemming from the X99E-ITX/ac’s two memory channels. The other half of the performance equation is getting a powerful graphics card, and right now the NVIDIA GeForce GTX 980 Ti is a tough card to beat. I’ve already covered how well this card overclocks when under an HG10 and it is an absolute bear. For this build, I used a reference Gigabyte GeForce GTX 980 Ti, our prototype HG10-N980 bracket, and a Hydro Series H75 cooler with two red SP120 LED fans. 
The H75 is mounted to the front of the case, and the fans are controlled and powered by the 380T's integrated fan controller. Handling storage duties are a 240GB Neutron GTX SSD as the system drive and a 960GB Neutron XT SSD as the gaming/scratch drive. If I'm going to overclock this system (and I absolutely am), I'm going to need a pretty solid power supply, and for that I turned to our recently released RM750i. This PSU necessitated ordering the PSU extension bracket, which also buys a little more breathing room internally. The extension doesn't stick too far out of the back, either, so it's not unsightly. Finally, to get the look I wanted, I needed to replace the front fans with 120mm red LED fans as well as replace the white-lit I/O board with the red I/O board from the black version of the 380T. All in all, I don't think it came out too bad. For reference, here's the list of components used in this build:

CPU: Intel Core i7-5960X
Motherboard: ASRock X99E-ITX/ac
DRAM: Corsair Dominator Platinum 2x8GB DDR4-2800MHz CAS15
Graphics Card: Gigabyte GeForce GTX 980 Ti
Storage: Corsair Neutron GTX 240GB SSD and Corsair Neutron XT 960GB SSD
CPU Cooling: Corsair Hydro Series H100i GTX with aftermarket Asetek bracket
GPU Cooling: Corsair Hydro Series HG10-N980 with Corsair Hydro Series H75
Power Supply: Corsair RM750i 750W 80 Plus Gold
Chassis: Corsair Graphite Series 380T White
Accessories: 3x Corsair SP120 LED Red fans, PSU extension bracket for 380T, 380T red I/O panel

In an upcoming blog, I'll detail overclocking and just how much performance I was able to extract from this system, especially in comparison to the extremely powerful (and much larger) "Yamamura" 750D build.
  9. Any overclockable PC component, be it a CPU, a graphics card, or system memory, essentially exists in a few grades. There's the entry level (the "good enough for government work" grade) that typically works fine for everyone. For a lot of users, that's really an i3 or non-K i5 at most, or a reference graphics card. Then there's the performance grade, where you start getting the special sauce, and that bleeds into an enthusiast grade, where you start pushing performance limits while still gunning for 24/7 use. And then, I would argue, there's competitive overclocker grade. The cream of the crop hardware. Price-performance simply isn't a consideration; even practical performance may not be a consideration. It's stuff that's the best, period. Like, for example, our Dominator Platinum Limited Edition Orange DDR4 kit: four 4GB DIMMs running at a staggering 3400MHz with latencies of 16-18-18-36 at 1.35V. Top-of-the-line ICs assembled in a kit that pushes the limits of what the host platform itself can handle. This is memory that runs at such a high clock that your CPU may not even be able to crack it. It also more or less requires a very specific motherboard: the GIGABYTE X99-SOC Champion. This kit was developed in concert with Gigabyte's motherboard, which goes down to four DIMM slots and offers a unique topology designed to squeeze every last ounce of performance out of your CPU and memory. The X99-SOC Champion also enables "Socket 2083" mode, adding pins to the socket that allow for more precise voltage control of your processor, a necessity when dealing with bleeding edge performance. Even still, because of the variable quality of host CPUs and their memory controllers, XMP may not always work with this kit. For that reason, should you decide to walk this path, you may need to adjust your memory settings manually.
Some chips will be able to handle the preferred 1.66x gear ratio to hit the kit's rated 3400MHz speed, while others will require the lower 1.25x gear ratio. We're providing settings here for either.

Setting | Using 1.25x Gear Ratio | Using 1.66x Gear Ratio
Advanced Frequency Settings
Host/PCIe Clock Frequency | 102.10MHz | 102.00MHz
CPU Clock Ratio | 24 or 25 | 19 or 20
System Memory Multiplier | 26.66 | 20.00
Advanced CPU Core Settings
CPU Clock Ratio | 24 or 25 | 19 or 20
Uncore Ratio | 24 | Matches CPU Clock Ratio
Intel® Turbo Boost Technology | Auto | Auto
Advanced Memory Settings
System Memory Multiplier | 26.66 | 20.00
Memory Timing Mode | Manual | Manual
Channel A Memory Sub Timings
CAS Latency | 16 | 16
tRCD | 18 | 18
tRP | 18 | 18
tRAS | 36 | 36
CPU Core Voltage Control
CPU System Agent Voltage | +0.300V to +0.450V | +0.300V to +0.450V
DRAM Voltage Control
DRAM Voltage (CH A/B) | 1.350V | 1.350V
DRAM Voltage (CH C/D) | 1.350V | 1.350V

This memory, developed in conjunction with Gigabyte's X99-SOC Champion motherboard and competitive overclocker HiCookie, has already been used to set records that will no doubt last for some time to come. Or at least until we break them again.
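As a rough cross-check, both columns of the settings above target the same rated speed. A minimal sketch, assuming the gear ratio (strap) scales the host clock before the memory multiplier is applied; the helper name and the 166.67/100 reading of the "1.66x" strap are illustrative assumptions, not Gigabyte's published formula:

```python
# Hypothetical helper: estimate effective DRAM frequency from the BIOS
# values in the table above. Assumes memory frequency = host clock x
# gear ratio (strap) x memory multiplier.

def mem_frequency_mhz(host_clock_mhz, gear_ratio, memory_multiplier):
    return host_clock_mhz * gear_ratio * memory_multiplier

# 1.25x gear ratio column: 102.10MHz host clock, 26.66 multiplier
low_strap = mem_frequency_mhz(102.10, 1.25, 26.66)           # ~3402MHz

# 1.66x gear ratio column: 102.00MHz host clock, 20.00 multiplier,
# reading the "1.66x" strap as its nominal 166.67/100
high_strap = mem_frequency_mhz(102.00, 166.67 / 100, 20.00)  # ~3400MHz

print(f"1.25x strap: ~{low_strap:.0f}MHz")
print(f"1.66x strap: ~{high_strap:.0f}MHz")
```

Under these assumptions, either gear ratio lands within a fraction of a percent of the kit's rated 3400MHz, which is why the choice comes down to what your particular CPU's memory controller tolerates.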
  10. At Corsair, we make all kinds of stuff, but at our core, at our heart, we've been a memory company since the beginning. So when someone comes up with what appears to be a fantastic solution for using the largesse of memory modern machines are capable of supporting, we're interested. With that in mind, I took DIMMDrive for a spin. It's been garnering very positive reviews on Steam, and the $29.99 buy-in isn't too unreasonable. I tried it on two different testbeds:

Component | Testbed #1 | Testbed #2
CPU | Intel Core i7-4790K @ 4.5GHz | Intel Core i7-5960X @ 4.4GHz
DRAM | 2x8GB Vengeance Pro DDR3-2400 | 8x8GB Dominator Platinum DDR4-2400
Motherboard | ASUS Z97-WS | ASUS X99-Deluxe
Graphics | GeForce GTX 980 | 2x GeForce GTX 980
Storage | 240GB Force LS SSD | 512GB Force LX SSD, 4x 480GB Neutron GTX SSD in RAID 0
Cooling | Hydro Series H110i GT | Custom liquid cooling loop
PSU | HX750i | AX860i
Chassis | Obsidian 450D | Obsidian 750D

What DIMMDrive does is provide a smart front end between Steam and its games and an old school RAM drive. You load it up, toggle which games you want loaded into the drive, and then toggle DIMMDrive on. And therein lies your first problem: you've just front-loaded your loading times. The games you're loading have to be copied, in their entirety, to the RAM drive, and that loading time continues to be gated by the speed of your storage. The second issue is the footprint of the modern triple-A title. While DIMMDrive offers some small allowance for this by letting you choose which individual files in a game you want copied to the drive, the solution is a clunky one.
But look at the storage requirements for these modern games:

Battlefield: Hardline – 60GB
Battlefield 4 – 30GB
Far Cry 4 – 30GB
Counter-Strike: Global Offensive – 8GB
Elder Scrolls V: Skyrim (assuming no mods) – 6GB
Watch_Dogs – 25GB
The Witcher III: Wild Hunt – 40GB
Grand Theft Auto V – 65GB
Dota 2 – 8GB
World of Warcraft – 35GB

For users playing less intensive games, you're still looking at a minimum of 16GB of system memory just to have enough to handle the game's footprint. And how does it work in practice? I tried using it with a few games that seemed like they might benefit from faster access times: Sid Meier's Civilization V has basically no loading time during the game, but takes an eon to load initially. Wolfenstein: The New Order uses id Tech 5's texture streaming and thus by its nature desperately needs all the bandwidth it can get. Even old school Left 4 Dead 2 tends to take a while to load. The biggest problem was that whether I loaded these games off of my RAIDed SSDs or just the one, the longest load time was always by and large just copying the game into memory when DIMMDrive was enabled in the first place. Switching to a single SSD from a mechanical hard disk improves virtually every aspect of the computing experience and brings game load times in line, but going beyond that to the RAID or the DIMMDrive just doesn't feel any faster. The most noticeable aspect of DIMMDrive was how long it took to load a game into RAM in the first place. Beyond that, Wolfenstein: The New Order would just crash when I tried to run it from DIMMDrive, so I wasn't able to see if DIMMDrive could at least improve the texture pop-in any. So why doesn't DIMMDrive make a home run impact on gaming and loading times? Quad-channel DDR4-2400 is, at least synthetically, capable of being almost 100x faster on read than a good SSD.
But the answer is more complex, because when a game loads, it isn't just copying data from storage into system memory. Many modern games already use system memory intelligently to smooth out load times in the first place. From there, data needs to be copied either from system memory or system storage to the graphics card's video memory, and that's going to be gated by the PCI Express interface among other things. A PCI Express 3.0 x16 slot is capable of transferring a ballpark 16GB/s. A quad-channel memory bus will outstrip that in a heartbeat, while even a more mundane dual-channel DDR3-1600 configuration is still capable of a ballpark ~25GB/s. Even then, though, actually copying/moving data between system memory, system storage, and the PCI Express bus is only a part of what a game does when it's loading. There are countless other operations to consider: compiling shaders, connection speed and latency for online games, and so on. My ultimate point is that by the time you're done taking all of these other operations into account, the amount of time DIMMDrive might save you could be a few seconds at best, or it may actually cost you the time it takes to copy the entire game into system memory in the first place. If you're on mechanical storage, DIMMDrive could definitely demonstrate an improvement, but it would still require a substantial investment in DRAM in the first place. Ultimately, getting value out of DIMMDrive (assuming you're on a platform that supports enough memory to make it viable for larger games) requires greater expense and more complexity than simply buying a high capacity SSD. While I'd love to sell you our enormous memory kits, and I continue to recommend 16GB of system memory as a baseline for those who can afford it, the more sensible option continues to be solid state storage.
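The ballpark figures above can be reproduced from the usual theoretical-peak formulas. A quick sketch (the function names are mine, and real-world throughput is lower than these peaks):

```python
# Theoretical peak bandwidths backing the ballpark figures above.

def dram_bandwidth_gbs(mt_per_s, channels, bus_bytes=8):
    # transfer rate (MT/s) x channels x 64-bit (8-byte) bus, in GB/s
    return mt_per_s * channels * bus_bytes / 1000.0

def pcie3_bandwidth_gbs(lanes):
    # PCIe 3.0: 8GT/s per lane with 128b/130b encoding, ~0.985GB/s per lane
    return lanes * 8 * (128 / 130) / 8

print(f"Quad-channel DDR4-2400: ~{dram_bandwidth_gbs(2400, 4):.1f}GB/s")
print(f"Dual-channel DDR3-1600: ~{dram_bandwidth_gbs(1600, 2):.1f}GB/s")
print(f"PCIe 3.0 x16:           ~{pcie3_bandwidth_gbs(16):.1f}GB/s")
```

Quad-channel DDR4-2400 roughly triples the dual-channel DDR3 figure and clears the x16 link several times over, consistent with the point that the PCIe hop, not DRAM, is the tighter gate when filling video memory.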
  11. Corsair DDR4 memory is among the fastest, if not outright the fastest, DDR4 you can buy. It's the kind of DRAM that can really stress the memory controller on your shiny new Haswell-E processor, especially as you start to increase capacity. Intel only includes memory straps up to 2400MHz in its new hexa-core and octa-core Core i7 processors, but we started at 2666MHz and went up from there, all the way to 3400MHz at present. That's before overclocking. The secret sauce of Corsair DDR4 below 3000MHz, though, is the inclusion of a second XMP profile beyond the first. The first profile is what your memory is rated for: a guaranteed level of performance designed to maximize compatibility. It runs at 1.2V, features tight timings, and just works. The second profile enables overclocking out of the box: it boosts voltage to 1.35V and hits the next speed grade. For testing and future articles, we're using a staggering 64GB of DDR4 specced for a relatively modest 2400MHz; specifically, this Dominator Platinum kit. This kit is attractive because it's comparatively less expensive than the other 64GB Dominator Platinum kits and is optimized for Haswell-E's highest native memory strap. Our test platform enjoys an ASUS X99-DELUXE and a Core i7-5960X overclocked to 4.4GHz. To give you some idea of what you gain going from JEDEC spec to our DDR4-2400 at CAS 14: read bandwidth jumps 7.1% and copy bandwidth jumps 13.7%. Write bandwidth remains stable, but that's normal; Haswell-E seems to have a memory write bandwidth ceiling. Keeping in mind that our DDR4-2400 is built to go higher, running it at 1.35V at 2666MHz and CAS 14 again (and remember, this is 64GB, which is really pushing the memory controller) nets us an 11.5% read bandwidth gain over JEDEC and a very healthy 22.6% increase in copy bandwidth. One of the major concerns about making the jump from DDR3 to DDR4 was latency.
Yet DDR4-2400 and 2666 latency is actually comparable to Ivy Bridge-E, meaning Haswell-E's moderate DDR4 can handily outpace its predecessor. The JEDEC-standard 2133MHz configuration's 72.9ns latency is nothing exciting, but even a modest kit of Dominator Platinums gets you 5.3% lower latency, and the built-in overclock increases that lead to a respectable 12.1%. With all that said, if you want more out of a 64GB high capacity kit, you'll need to buy a faster kit. Our 2666MHz kit contains a 2800MHz XMP profile, and the 2800MHz kit may very well hit 3000MHz depending on the quality of the kit and the amount of optimization your motherboard's BIOS has seen. But even the "entry level" is still pretty respectable and ensures your processor is never too bottlenecked by memory bandwidth.
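For readers who want to compare kits on paper, the CAS-only (first-word) latency can be computed directly from the timings. A small sketch; note this is only the CAS component, not the full round-trip latency a benchmark reports (such as the ~72.9ns figure above), so the absolute numbers differ even though the relative ordering holds:

```python
# First-word CAS latency in nanoseconds. The memory clock is half the
# transfer rate (DDR), so latency = CL / (MT/s / 2 / 1000) = 2000 x CL / MT/s.

def cas_latency_ns(cl, mt_per_s):
    return 2000.0 * cl / mt_per_s

print(f"DDR4-2133 CL15: {cas_latency_ns(15, 2133):.2f}ns")  # JEDEC baseline
print(f"DDR4-2400 CL14: {cas_latency_ns(14, 2400):.2f}ns")  # rated XMP profile
print(f"DDR4-2666 CL14: {cas_latency_ns(14, 2666):.2f}ns")  # second 1.35V profile
```

A higher transfer rate at the same CL strictly reduces first-word latency, matching the direction of the measured gains quoted above.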
  12. Recently, the actual computer part of the Obsidian Series 750D "Yamamura" custom water-cooled system began having issues with random shutdowns and reboots, as detailed in this earlier blog. Ordinarily those types of problems are a frustration, but when your system looks like this, the increased difficulty of swapping any parts out, potentially requiring you to actually drain the loop entirely, may even make you question why you built your system up like this in the first place. However, as any die-hard builder knows, part failure always has a silver lining: an excuse to upgrade. And that's what I did, giving me a chance to rectify a few pain points in the original build, things I felt like I could've done or specced better. The system was already close to unimpeachable, but we can certainly do more.

Swapping the CPU, Motherboard, and DRAM

Component | Before | After
CPU | Intel Core i7-4790K (4GHz), 4 cores / 8 threads, 84W TDP | Intel Core i7-5960X (3GHz), 8 cores / 16 threads, 140W TDP
Motherboard | ASUS Z97-WS | ASUS X99-DELUXE
DRAM | 4x8GB Dominator Platinum DDR3-2400 10-12-12-32 1.65V | 8x8GB Dominator Platinum DDR4-2400 14-16-16-31 1.2V

The only way to "upgrade" past Intel's monstrous Core i7-4790K (overclocked to 4.7GHz in our build) is to change your platform entirely, so that's what I did. While the i7-4790K tops out at between 120W and 130W when overclocked, the i7-5960X starts there and pulls considerably more when overclocking is applied. But that's fine: Yamamura enjoys a custom liquid cooling system with massive heat capacity. Changing the platform means swapping to the even more capable ASUS X99-DELUXE motherboard as well as jumping from DDR3 to DDR4. Latency does increase, but so do capacity and overall bandwidth. It's a net gain, and our DDR4-2400 kit even includes an extra XMP profile that pushes the voltage to 1.35V and the speed to 2666MHz. Incidentally, due to the spacing of the video cards, we actually lose a little bit of bandwidth to the pair of GeForce GTX 980s.
The slot arrangement results in the bottom GTX 980 only getting PCIe 3.0 x8 instead of the full sixteen lanes, but thankfully this produces virtually no measurable decrease in performance.

Upgrading the Storage

A lot of people didn't care for the way the LG Blu-ray burner broke up the front of Yamamura, and I can see why. At the same time, I also found myself needing a little bit more storage for a documentary I'm editing in my off hours. Thankfully, there's a way to serve both masters, and it comes from SilverStone. SilverStone produces a 5.25" drive bay adapter that can fit a slimline, slot-loading optical drive and four 2.5" drives. By purchasing a slimline, slot-loading Blu-ray burner and installing a spare 512GB Force LX SSD we had in house, I was able to clean up the front of the case and increase storage. Fingerprints notwithstanding, it's a lot cleaner than it was before.

Improving the Cooling and the Bling

While the original build called for a Dominator Airflow Platinum memory fan, we weren't able to find clearance for one owing to the ASUS Z97-WS's layout. Happily, the ASUS X99-DELUXE doesn't have this problem, and that meant we could add two Dominator Airflow Platinums. Because they're PWM controlled, they're a perfect match for our old Corsair Link Cooling Node, and because they use the same RGB LED connector as our other lighting kits, a single Corsair Link Lighting Node is able to control them. The end result isn't just increased bling: even at minimum speeds, the airflow from the fans helps keep the DDR4 cool (with individual DIMMs peaking at just 38C), while also shaving at least 10C off of the power circuitry surrounding the memory slots. Getting fresh airflow into the motherboard's VRMs never hurts.

Yamamura 1.5

I was immeasurably thankful that I didn't have to drain the loop to make these upgrades, thus reaffirming my belief in flexible tubing.
Hard acrylic is frequently argued as the way to go in modern builds, and people say it looks nicer, but it’s not functional. I use this computer on the daily, and I am possessed by a relentless appetite for tweaking the hardware. Given just how bloody fast the Yamamura is now (and stable, mercifully), I don’t foresee making any major changes to the system until Skylake and Big Maxwell at the earliest, at which point there may be a newer, more exciting chassis to move into…
  13. Building an expansive, gorgeous custom liquid cooling loop in your PC has its perks. For one, it looks awesome. It also gives you the opportunity to maximize and perfect the cooling capacity of your enclosure. That, in turn, gives you the opportunity to maximize and perfect the performance of your system. And honestly, again, it looks awesome. You can show it to people who don't even know anything about computers and get their eyes to bug out. Of course, this is all predicated on the idea that the system works. That the motherboard, graphics cards, memory, everything is functioning properly. And for a little while, my monstrous Obsidian Series 750D build, "Yamamura," was working perfectly. For a little while. Then the random shutdowns and reboots came. And the POST loops. Killing the overclock on the i7-4790K seemed to largely solve the problem, but it's hard to feel proud of your monster of a computer when the CPU is running at stock under a custom loop. And this is where the custom loop becomes a problem. When troubleshooting a system like this, there's only so much you can test before things get inconvenient. The DDR3 was known good and wasn't being pushed beyond XMP, so it was ruled out early. My boss handles power supplies, so I opted to blame that last. The graphics cards are part of the loop and can't be removed without draining the whole thing, so that necessitated basically hoping the cards weren't the problem. So the first thing I did was swap out the CPU for a known good one: an i7-4770K that had barely been overclocked. Swapping out the CPU was frankly very easy; you just remove the CPU block from the CPU socket. Unfortunately, that didn't solve the problem. Since I'm used to seeing POST loops as a motherboard problem, and since the board I was using had been having USB initialization issues pretty much since the get-go, it seemed like that was the culprit. Uh oh.
As it turns out, swapping out the motherboard was easier than I’d expected, and I took the opportunity to switch from Haswell to Haswell-E and give the loop a chance to really stretch its legs. Due to the long, flexible tubing and arrangement of the loop, I was able to “fold” the CPU block and graphics cards over the pump and reservoir and free up the motherboard. An alternative (and arguably smarter) route would’ve been to install spill-proof quick-release connectors around the video cards, as I had in my previous system. This would’ve isolated the graphics cards in the loop, allowing me to remove them entirely, and even replace them without draining the loop. But folding works in a pinch. Some cabling behind the motherboard tray had to be snipped and rerouted, and the 8-pin CPU line needed some extra give, but I was able to swap in the new board, CPU, and DDR4 memory. It’s not perfect. Because of the spacing of the graphics cards, one is running at PCIe x8 instead of x16, but thankfully that’s a pretty negligible difference. And imagine my delight when the system booted up! It was working perfectly fine, everything was going great, and then…it shut down again. Now if you look at that photo above, you’ll see the PSU cables are crammed very tightly between the AX860i and the bottom radiator. Unfortunately, that AX860i was the only component left that we could replace without draining the loop. …and so it was replaced. And sure enough, swapping in another AX860i actually did correct the random shutdowns and reboots. It’s hard to say what went wrong, but even the best power supplies can have bad days, especially when they were randomly picked up from the tech marketing lab and likely exposed to all kinds of hilarious and awful circumstances. Of course, with all of these changes to the system come new opportunities to upgrade, test, and improve performance…
  14. This is the fifth and final part of our build log for the Obsidian Series 750D “Yamamura.” The previous four chapters covered Part Selection, Assembly, Overclocking, and Optimization. There are essentially four reasons to build a custom liquid cooled system: the pleasure of constructing something with your hands, the unique aesthetic of a liquid cooled system, the potential for improved performance as a result of the larger heat capacity, and the ability to quiet or silence an extremely high performance system. On this front, how did the Obsidian Series 750D “Yamamura” build do? The pleasure of constructing something with your hands. Yamamura proved to be a more difficult build than I expected. While the 750D is uniquely well suited to high performance liquid cooled builds, cramming a third radiator into the bottom of the case resulted in clearance problems for deeper power supplies as well as forcing the pump/reservoir to be mounted to the motherboard tray instead of the bottom. The 750D has very healthy dimensions, but we’re still trying to cram a lot into it. Thankfully, the AX860i power supply turned out to be an all-star. Its reduced depth coupled with high capacity and best-in-class performance allows a power supply only 160mm deep to handle the demanding job of powering multiple high performance overclocked components. That, and we get to keep the third radiator. As a result of having to cut two fans and the Dominator Airflow Platinum, though, I was ultimately able to go down to just one Corsair Link Commander Mini unit. This is fortunate, as the NZXT USB 2.0 header splitter simply didn’t play well with the USB controller on the ASUS Z97-WS (note that the USB controller on my board seems to have issues with resolving hubs in general). The unique aesthetic of a liquid cooled system. This is hands down the most beautiful liquid cooled rig I have ever built. 
The 750D’s large side window allows you to really see and appreciate the glowing XSPC waterblocks, Dominator Platinum memory kit with lightbars, blue sleeved cabling, SP120 LED fans, and the XSPC Photon 170 reservoir. My girlfriend had worked with me on my last build and was skeptical that this one would look better, but Yamamura is a gorgeous beast and excellent showpiece. The Corsair Link lighting kit set to white allows all of the blue components to really pop. I have found over and over again that even people who aren’t die-hard DIY enthusiasts can still be impressed by a beautiful, well-built system with a custom loop. The potential for improved performance as a result of the larger heat capacity. While I wasn’t able to reach the mythical 4.8GHz on my Intel Core i7-4790K, nor was I able to get higher overclocks on my GeForce GTX 980s even with modded BIOSes, the waterblocks on the 980s do their job with aplomb. They may hit the same overclocks that they did on air, but those overclocks are much more stable now. XSPC's Razor GTX 980 waterblock does a stellar job of keeping every heat generating component incredibly cool. I feel better being limited by the silicon more than by the heat, and I now have two GeForce GTX 980s that spend their lives pushing 8GHz on the memory and 1.5GHz on the GPU. I’m looking forward to putting them through their paces at 4K. The ability to quiet or silence an extremely high performance system. Until we produce the greatest silent case the world has ever seen, one that effectively marries best-in-class airflow with smart acoustic design, the best way to make a quiet system is by controlling airflow. Having twelve fans and a pump decoupled from the chassis allows me to run the Yamamura extremely quietly. No high end build is complete without the Corsair Link Commander Mini. 
By employing a Corsair Link Commander Mini, I can run all of the fans at minimum speed until absolutely necessary, and this much heat capacity takes a very long time to reach a “steady state.” The result is that Yamamura is barely audible when running and certainly in no way obtrusive. Conclusion I actually have one regret as far as the Yamamura goes, and it’s a semi-silly one: I wish I had gone with Haswell-E instead of Devil’s Canyon. It arguably would’ve pushed the AX860i to its limits, but even a 4.7GHz i7-4790K feels oddly underpowered and modest in a build like this. An i7-5960X or even an i7-5930K, when overclocked, can start to really tap into the extra cooling potential of more elaborate cooling systems, while the i7-4790K can reasonably be handled by something as modest as a Hydro Series H75. Somehow I'll get by. With all that said, though, the system is still bracingly fast and handles just about anything I throw at it. I can’t complain too much. Except about my power bill.
  15. (This is the second part in a series of blogs. The first part details part selection, and is here.) Building a custom liquid cooling loop, even in a case as well-designed as the Obsidian Series 750D, seems to be inevitably more involved than you originally plan. At least if you’re a hobbyist like I am; this is only my fourth loop, and each time I’ve learned new and exciting lessons. For example, plans are adorable. Every time I’ve sat down to do one of these, I haven’t been exactly certain what order to go in. So for the Yamamura, I started out by just installing the waterblocks to the graphics cards. I’m using the XSPC Razor GTX 980, almost entirely for its lighting, but also because I’ve been continually bothered by the general lack of user-friendliness of EKWB products. Installing an EKWB block on a Radeon R9 290X was a fussy, frustrating experience. The XSPC Razor was better, but not by much. Carefully removing plastic from both sides of little pieces of thermal padding is a chore unto itself. The Razor GTX 980 can also be ordered with XSPC’s backplate, or you can re-use the one that comes with the stock 980. I opted to just re-use NVIDIA’s. XSPC’s block sure is a looker, though, and definitely more appealing than the Swiftech Komodo blocks I used on my GTX 780s. The EKWB transparent block I used on the R9 290X for my 250D build was well-suited to the task, but for the Yamamura, the Razor 980’s glowing trim is going to be killer and a real eye-catcher. The Obsidian 750D needs to almost be gutted to fit the amount of cooling capacity we’re cramming into it. The 3.5” drive cages have to go along with the stock intake and exhaust fans. I also had to temporarily remove the 2.5” drive sleds, but thankfully the smart layout of the 750D allows me to use up to four SSDs even with three radiators installed. Despite my reservations about how fiddly EKWB’s blocks can be, the Supremacy EVO is regarded on several forums as being simply the best CPU block you can buy. 
Interestingly, EKWB doesn’t necessarily employ a one-size-fits-all approach with their blocks; components within the block can be swapped out to optimize for individual platforms. The default configuration is for an LGA2011(-3) CPU, but replacing the jet plate makes it better suited to our Intel Core i7-4790K. You also want to make sure the copper microfins inside the block run parallel with the CPU die beneath the heatspreader to maximize heat transfer. And here’s where plans begin to crumble into dust. Two design choices already have to be cut or altered. Due to limited clearance and overachieving ambition, the bottom radiator can’t be configured as push-pull, so I went with push. The radiator’s fixtures also encroach on the HX1000i: the two technically fit side by side, but once the HX1000i’s cables are plugged in, there simply isn’t room. At this point I had to decide between the HX1000i and the bottom radiator; the bottom radiator won out, and the HX1000i was replaced by the higher-performing but lower-capacity AX860i, with its impressively slim 160mm depth. With radiator and component fitment sorted out, it’s time for some fresh problems. Everything installs well enough, but the 360mm radiator in the top needs to be rotated 180 degrees; otherwise its inlet and outlet overlap the primary AUX12V socket on the motherboard. The secondary socket gets covered by fans, and the third is next to the first PCIe x16 slot. Thankfully we only really need the first. The GTX 980s also wind up being a touch too long to use the stock mounting holes in the motherboard tray for the XSPC Photon 170 D5 Vario reservoir/pump combo. Note, too, that the speed control on the bottom of the pump is basically buried once installed; I tested it before installation to find the right balance of performance and noise and went with the Level 3 (of 5) setting. In order to mount the pump and reservoir, I needed to drill holes into the motherboard tray. 
Per my girlfriend’s directions (she’s much handier than I am), I covered the tray with painter’s tape to keep metal shrapnel from flying into the electronics (a smarter decision would’ve been to take the electronics out ahead of time). I also photocopied the mounting side of the Photon 170 and used it as a guide for drilling the mounting holes, and this worked fairly well. Once it was clear that the pump and reservoir assembly was going to install safely, it was time to actually cut the tubing and connect the loop. Per a suggestion from a more experienced modder, we switched from Swiftech compression fittings to Bitspower, and it was a very positive switch. Swiftech’s fittings certainly work, but the Bitspowers are much, much easier to install. Incidentally, the loop layout wound up being essentially identical to the plan: CPU block, 360mm radiator, GTX 980 #1, GTX 980 #2, bottom 240mm radiator, front 240mm radiator, pump, and back to the CPU block. Owing to the relative spaciousness of the 750D, actually attaching the fittings wound up being fairly trivial provided we warmed up the ends of the tubing before slipping them over the barbs. Of course, it’s easy for me to say it was trivial; there was still a decent amount of elbow grease involved, and my much stronger girlfriend was responsible for securing all of the compression fittings. With paper towels down, we primed the loop and left it leak testing overnight. The next morning, the paper towels were dry. With the loop itself in place, I finished the build by installing the SSDs, Commander Minis, and rear 140mm exhaust, and finishing up the cabling. I did wind up having to use a helpful little accessory from a competitor of ours: NZXT makes an adapter that splits a single internal USB 2.0 header into three (plus two USB 2.0 ports). 
Since there are two Corsair Link Commander Minis installed plus the two USB 2.0 ports from the case and only two USB 2.0 headers on the ASUS Z97-WS motherboard, the accessory came in handy. And the Yamamura is complete. I had trouble deciding whether or not to include the BD-RE drive, but we felt like the break in the drive bays was worth the utility, and the silver line on the drive is a nice accent that helps keep the front of the case from being too monochrome. The system as a whole is amazingly silent while having tremendous cooling capacity. In the next part of this build log, I’m going to talk about optimization: with all of this cooling performance, it’s time to try unlocking the GTX 980s.
  16. It’s not at all uncommon (in fact, exceedingly normal) for Corsair employees to want to tinker with our latest and greatest products just to see what we can actually do. While I was doing a single HG10-A1 build in the Carbide Series Air 240 that I was pretty proud of, one of our product engineers, Dennis Lee, was pushing things…well, a lot further. His Air 240 build borders on insane, and I’m happy to share it with you.

COMPONENTS
CPU: Intel Core i7-3820 @ 3.9GHz
Memory: Corsair Dominator Platinum 32GB (4x8GB) DDR3-1866 9-10-9-27 1.5V
Motherboard: ASUS Rampage IV Gene (X79)
Graphics: 2x AMD Radeon R9 290X
CPU Cooling: Corsair Hydro Series H75
GPU Cooling: 2x Corsair Hydro Series H75 and HG10-A1
PSU: Corsair AX860i
Storage: Corsair Neutron GTX 240GB
Enclosure: Corsair Carbide Air 240

Dennis’s build is…pretty wild. He used white SP120 LED fans and a red sleeved cable kit, then doubled down and swapped in LED-lit pump caps from H105s onto all of the H75 coolers. The result is easily one of the craziest systems we’ve ever seen and a testament to just how much power can be crammed into a Carbide Series Air 240. In all of its glory: two liquid-cooled AMD Radeon R9 290X cards on an X79 Micro-ATX board with just about everything under water. Bird’s eye view. In order to fit two H75s in the main chamber, one had to be arranged in a push-pull configuration. The H75s operate as intakes, keeping the blowers on the HG10s fed while the two top fans work as exhausts. The pair of HG10s look cramped, but were designed to allow for exactly this kind of close proximity when used with the right Hydro Series cooler. The third H75 (cooling one of the R9 290X cards) had to be mounted to the 120mm fan mount in the back chamber. Screw the H75 radiator to the side panel, close it up, and game on.
  17. It’s been a half a year since we took an Obsidian Series 250D enclosure and installed a custom liquid cooling loop into it just to prove we could. Today we’re going to do something a little more straightforward with one of the most flexible cases in our lineup: the mainstream juggernaut Obsidian Series 750D. The 750D has been an extremely popular and solid seller for us, and it’s not hard to see why. This chassis design (and to an extent its flashier derivative, the Graphite Series 760T) is a history of Corsair cases placed in a crucible, the excess burned away and only the essentials remaining. It’s large, but feature rich, maximizing its space and giving the end user tremendous flexibility. This will be a series of articles on a build I’ve dubbed “Yamamura” after the villainess of the Japanese “Ring” films, whose father is inferred to be a water demon. Today we’re going to start with the parts list. Note that this is tentative; at some point parts may be swapped in or out depending on circumstances. Chassis: Obsidian Series 750D This build’s reason for being, the 750D boasts tremendous capacity for water cooling, rivaled only by the larger Graphite 780T and Obsidian 900D cases. Combining a clean design with solid airflow, room for multiple radiators, mounting points for a pump/reservoir combo, and general ease of assembly, the 750D is really the ideal mainstream case for liquid cooling enthusiasts who don’t want to go all out with a juggernaut like the 900D. Processor: Intel Core i7-4790K It’s reasonable to suggest an Intel Core i7-5960X might be a more exciting option, but the i7-4790K is a vastly more efficient processor, even when substantially overclocked. Part of the reason we’re going with so much radiator capacity (listed later) is to be able to run the fans at low speeds; a chip like the i7-5960X that dumps an extra ~150W of heat into the loop when overclocked takes a substantial bite out of that thermal efficiency. 
Intel’s i7-4790K is a stellar processor in its own right, and our samples hit 4.7GHz on Intel’s highest performing CPU architecture. Motherboard: ASUS Z97-WS I’ve been using this board in my Haswell and Devil’s Canyon testbed and it’s been an absolute pleasure. The Z97-WS is feature complete for this generation, sporting SATA Express, M.2, a PLX switch for dual PCIe x16 SLI and CrossFire, multiple USB 2.0 and USB 3.0 headers, and even FireWire capability. There are also extra power leads for the CPU socket and the PCI Express slots. Short of an ROG board, the Z97-WS is basically as good as Z97 gets. Memory: 32GB (4x8GB) Corsair Dominator Platinum DDR3 2400MHz CAS 10 It’s tempting to go for higher speed memory, but we’ve found internally that 32GB of DDR3-2400 is really the sweetest spot for Haswell and Devil’s Canyon. This is fast memory and a lot of it, and it ensures that you’ll never be bottlenecked by your memory subsystem. This kit is hands down my favorite for Haswell and Devil’s Canyon: high speed, high capacity, low latency, peak performance. Memory Cooling: Corsair Dominator Airflow Platinum While the benefits of having active cooling over high speed memory can certainly be debated, the Dominator Airflow Platinum cooler serves double duty both as cooling and as a classy bit of bling that can be added to the build. Rather than be limited to the two light bar kit colors, the Dominator Airflow Platinum has two RGB LED fans in it that can be controlled and configured via Corsair Link. Graphics Cards: Dual NVIDIA GeForce GTX 980 4GB GDDR5 Essentially the fastest single-GPU card on the planet, the NVIDIA GeForce GTX 980 also holds the distinction of being one of the most overclockable as well. We’ve seen the GTX 980 exceed a boost clock of 1.5GHz on stock air cooling with only a minor poke to voltage; with two of these under water and modified vBIOSes to remove the TDP cap, we may be able to push these cards to new heights of performance. 
Storage: 4x Corsair Neutron Series GTX 480GB SSD in RAID 0 Previous testing has indicated that four Neutron GTX SSDs are enough to saturate Z97’s SATA bus, offering peak throughput of a staggering 1.6GB/sec. While striped RAID has its own drawbacks (if one drive fails all of the data is lost), judicious backups and good computing habits can leave you free to enjoy a tremendous amount of solid state capacity and performance. Power Supply: Corsair HXi Series HX1000i 1000W 80 Plus Platinum This selection could’ve gone either way, between the HX1000i and the AX1200i, but in the end I opted for the slightly shorter, slightly less featured, but still exceptional new HX1000i. The HX1000i gives us an extra 20mm to avoid clearance difficulties with the bottom-mounted radiator while still offering Corsair Link monitoring and control. Better yet, the blue logo ID matches the blue theme of the rest of the build (as you’ll see later). Corsair Link: Commander Mini Unit The Corsair Link Commander Mini is borderline purpose built for liquid cooling. The multitude of fans we’re planning on using for this build may necessitate a second unit, but the Commander Mini itself is useful for controlling a substantial number of fans on its own through the use of Y-cables, and we can use it to control the LED fans on the Dominator Airflow Platinum. Finally, the HX1000i can be connected directly to the Commander Mini rather than occupying its own USB port on the motherboard. Fans: One Air Series SP140 LED Blue Static Pressure Fan, 14x Air Series SP120 LED Blue Static Pressure Fans The goal is to achieve push-pull with all three radiators; research suggests it should be possible, but overall radiator clearances may prevent it. Nonetheless, our blue SP LED fans are among our most efficient fans available, and incorporating push-pull on the radiators substantially reduces the speed we have to run them at. 
CPU Waterblock: EK Supremacy EVO Blue Edition Sticking with our blue theme, we’ve selected arguably the most efficient CPU waterblock currently available. Internal testing has proven heat transfer isn’t the same issue on Devil’s Canyon that it was on conventional Haswell, opening up the possibility of using a high performance waterblock to extract the maximum amount of performance the silicon offers. GPU Waterblock: XSPC Razor GTX 980 Chosen for its illumination support, XSPC’s full cover waterblock for the GeForce GTX 980 has a clean aesthetic that meshes beautifully with the Obsidian 750D. It’s thin, attractive, and cools all of the surface components of the GTX 980, ensuring long life and quiet operation. Note that we opted not to purchase the backplate that XSPC offers; the GTX 980 stock cooler already includes an excellent backplate of its own, mitigating the need for an aftermarket one. Pump and Reservoir: XSPC D5 Photon 170 Like so many of XSPC’s kits, the Photon 170 reservoir includes lighting, keeping it in theme with the rest of the build. However, the integration of a mounting backplate and D5 Vario pump makes it easy to get exactly the placement and performance we want and need to drive our loop. Radiators: Swiftech Quiet Power 360mm and 2x Quiet Power 240mm Radiator selection is a matter of preference; I’ve traditionally been pretty happy with Swiftech’s radiators. Note that these are standard-thickness (25-30mm) radiators. Given the choice between an extra-thick 280mm front radiator or two standard 240mm radiators, I opted for the increased airflow that spreading out the surface area provides. Again, it’s a matter of preference, but a cumulative 840mm x 25mm of radiator capacity should be more than adequate for getting the job done. Stay tuned for part two, when we begin assembly of the Yamamura…
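As an aside, the storage pick's "four drives saturate the bus" claim can be sanity-checked with a quick sketch. This is illustrative only: the 1.6 GB/s ceiling is the saturation figure quoted above, while the ~450 MB/s sustained per-drive rate is an assumption for the sake of the example, not a measured Neutron GTX figure.

```python
# Sanity-check sketch: RAID 0 striping scales sequential throughput with
# drive count until the chipset's shared SATA bus becomes the bottleneck.

BUS_LIMIT_MBS = 1600  # peak throughput of Z97's SATA bus (MB/s), per the text

def raid0_throughput(drives: int, per_drive_mbs: float = 450.0) -> float:
    """Naive RAID 0 scaling: drives add up until the bus caps them."""
    return min(drives * per_drive_mbs, BUS_LIMIT_MBS)

# Three drives still fit under the ceiling; the fourth saturates it.
print(raid0_throughput(3))  # 1350.0 MB/s
print(raid0_throughput(4))  # 1600 MB/s (bus-limited)
```

Under these assumed numbers, a fourth drive is exactly what pushes the array past the bus limit, which lines up with the observed 1.6 GB/s plateau.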
  18. Just a couple of days ago, I talked about the drawbacks of having a beastly dual-GPU system featuring a custom liquid cooling loop as well as my solution to the problem in the form of my new Carbide Series Air 240 build that I dubbed “Blues.” I believe largely in balance, not overkill, though there is something to be said for the joy of assembling by hand a massively powerful machine. Knowing that my performance target wasn’t 4K but 1080p (and occasionally 3x1080p) suggested that my existing system wasn’t worth the 600W of power it consumed under gaming load, not to mention the corresponding 600W of heat it has to dissipate into a room that enjoys Californian Indian summers. Using some of our newest hardware, I opted to build a machine that would run as quietly as (if not more quietly than) my existing system while retaining the required amount of performance – but with superior performance per watt. These are the specifications of the two systems, compared. My old system was named “Ted” and it’s been with me for a while in an almost comical number of permutations.

CPU: Intel Core i7-4790K @ 4.7 GHz, 1.31V (Ted) vs. Intel Core i7-4790K @ 4 GHz, 0.975V (Blues)
Memory: Corsair Dominator Platinum 32GB DDR3-2400 10-12-12-32 1.65V (Ted) vs. Corsair Dominator Platinum 16GB DDR3-2400 10-12-12-32 1.65V (Blues)
Motherboard: ASUS Maximus VI Formula (Z87) (Ted) vs. ASUS Z97I-PLUS (Z97) (Blues)
Graphics: 2x EVGA GeForce GTX 780 3GB (980 MHz Core, 6 GHz GDDR5) (Ted) vs. AMD Radeon R9 290X 4GB (1 GHz Core, 5 GHz GDDR5) (Blues)
CPU Cooling: Custom Loop (Ted) vs. Corsair Hydro Series H75 w/ SP120 LED Fan (Blues)
GPU Cooling: Custom Loop (Ted) vs. Corsair Hydro Series HG10-A1 + Hydro Series H105 w/ 2x SP120 LED Fan (Blues)
PSU: Corsair AX860i (Ted) vs. Corsair HX750i (Blues)
Storage: 4x Corsair Neutron GTX 480GB in RAID 0 (Ted) vs. 3x Corsair Force LX 512GB in RAID 0 (Blues)
Enclosure: Corsair Carbide Air 540 (Ted) vs. Corsair Carbide Air 240 (Blues)

You can see I didn’t make a lot of brutally unkind cuts. 
I maintain that 2400MHz is the sweet spot for memory on Haswell and Devil’s Canyon, so that was worth the modest increase in power consumption. The AMD Radeon R9 290X is by no means frugal with power, but it is an incredibly fast card; had the NVIDIA GeForce GTX 980 been available when this build was assembled, that would’ve been the more sensible choice. While Blues is obviously inferior in performance to Ted, nobody would really be “slumming it” by making the transition. So what do we save in power, and what do we sacrifice in performance? Note that these games were all tested at or near their highest settings; Metro: Last Light Redux was maxed out with SSAA but with Advanced PhysX disabled, while Tomb Raider was only run with 2xSSAA and TressFX enabled. What we see is that in our synthetic video encoding benchmark, for our ~15% reduction in CPU clock speed we lose ~13% of the performance. That’s not too bad. Games run the gamut; BioShock Infinite’s minimum frame rate doesn’t change drastically, and the average stays well above 60 fps. Tomb Raider’s minimum does drop below 60 fps, but the average is above, and the single R9 290X doesn’t suffer from the rendering artifacts with TressFX that the SLI’ed 780s do. Metro: Last Light Redux is the most unpleasant hit, but still stays well above 30 fps. Finally, F1 2013 doesn’t seem to have SLI functioning correctly, but it’s irrelevant: either system maintains over 60 fps. We can use the Corsair Link connectivity of our AX860i and HX750i power supplies to see how much power each of these systems is drawing, and that’s where the difference really lies. While Blues peaks at about 365W under its most taxing load, Ted is gunning all the way up to nearly 600W. Particularly alarming is the near doubling of power consumption under the x264 benchmark for an extremely modest increase in performance. 
This is the truth of overclocking: at a certain point, substantial amounts of power become necessary to hit higher and higher speed bins. Almost entirely across the board, though, Ted is drawing substantially more power than Blues does, and arguably a lot of that extra power is wasted. Mapping performance per watt puts it all into a different perspective. Since all of our benchmarks are measured in frames per second, we can divide those results by the peak power drawn during the benchmark to come up with a rough idea of how efficiently each system is running. This isn’t the grand slam that the absolute power consumption is; performance per watt stays mostly level in every game but the odd duck F1 2013. CPU efficiency is vastly improved, though. The measure for success here is overall power consumption while maintaining acceptable performance levels, and on that front, Blues is a victory. I’ll be mothballing Ted for a while and spending more time with Blues to see if the reduced performance is really worth writing about, but for now, this has been a fun exercise in seeing how we can make our systems more efficient. We have overclocking competitions and records, but I’d love to see users trying to hit performance targets while reducing power consumption as much as possible.
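The performance-per-watt calculation described above is simple enough to sketch. The 365 W (Blues) and ~600 W (Ted) peaks come from the text; the fps values below are hypothetical placeholders for illustration, not the article's measured benchmark results.

```python
# Sketch of the efficiency metric: divide a benchmark's frames-per-second
# result by the peak power (watts) drawn while that benchmark ran.

def perf_per_watt(fps: float, peak_watts: float) -> float:
    """Frames per second delivered per watt of peak draw."""
    return fps / peak_watts

# Hypothetical fps figures; peak wattages are the ones quoted in the text.
ted_fps, ted_watts = 85.0, 600.0
blues_fps, blues_watts = 72.0, 365.0

print(round(perf_per_watt(ted_fps, ted_watts), 3))      # Ted: fps per watt
print(round(perf_per_watt(blues_fps, blues_watts), 3))  # Blues: fps per watt
```

Even when the slower system loses some absolute fps, dividing by peak draw can show it delivering noticeably more frames for every watt consumed, which is exactly the comparison being made here.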
  19. Today’s review roundup is focused on DDR4. Now that they’ve had a chance to exist in the wild, reviewers have had time to play with our Vengeance LPX and Dominator Platinum DDR4 kits…and they liked what they saw. First blood was drawn stateside with TweakTown and Overclockers.com, and those reviews are in this earlier roundup. Today we’re expanding our net and tackling international coverage. Before we visit the UK, a trip to Hardware Heaven sees our Vengeance LPX DDR4-2800 kit tested against competing kits and coming out the fastest. That earned it a Recommended award. Proclockers.com tested our more mainstream 2666MHz Vengeance LPX kit, but found that even that kit was able to hit 2900MHz with some coaxing. The ability to go from fast to faster scored a Recommended award. OCDrift.com tested our Dominator Platinum DDR4 at 2800MHz and just like Proclockers, they found there was still some gas in the tank as the kit took to 3000MHz C15 with ease. Keep reading and you’ll find this is a trend. Starting with high performance and being able to go higher still earned the kit a Gold award. Hexus.net used our Vengeance LPX DDR4-2800 kit exclusively in their review of Intel’s Haswell-E platform, then examined the kit on its own and handed off an “Approved” award in the process. Overclock3D has spent an extensive amount of time with our Vengeance LPX kit as well, using it in their i7-5960X review and then featuring it in not one, not two, but three separate videos. Continuing the trend, Hilbert Hagedoorn over at Guru3D also employed our Vengeance LPX for his maiden review of the Haswell-E platform. Over at eurogamer.net, Richard Leadbetter saw fit to outfit his entire test platform for Haswell-E with Corsair kit. That includes an RM1000, an H105 to keep the i7-5960X frosty, and of course, Vengeance LPX DDR4. You can see that review here. New friends over at Gaming Till Disconnected also came away happy with the Vengeance LPX kit; watch their video here. 
As it turns out, even outside of English-speaking countries, Corsair DDR4 is the weapon of choice for maximizing performance and stability with Intel’s new platform. Those of you who don’t speak French may want to fire up Google Translate for this next set of reviews. Reviewers from Clubic.com, Hardware.fr, 59Hardware, Ginjfo, Cowcotland, and multiple print publications all chose Corsair DDR4 for their X99 testbeds and Haswell-E CPU reviews. The reviewer at OverClex who spent a little extra time with our Vengeance LPX 2800MHz DDR4 kit was pleasantly surprised to discover a little bit of Corsair’s secret sauce: our 2800MHz kit runs at 1.2V, but a secondary XMP profile will bump the voltage to 1.35V and the speed to a brisk 3000MHz C16. That turned out not even to be the limit: his kit went up to 3200MHz C15. He gave it 4 out of 5 stars. The reviewer at Overclocking Made in France also had a chance to play with our Vengeance LPX 2800MHz DDR4 kit, and just like his peer at OverClex, he was able to hit 3200MHz. Another 4 out of 5 stars. Finally, if you expand into the Nordic countries and elsewhere in Europe, you’ll see the same thing happening: reviewers sticking with reliable Corsair DDR4 for their X99 reviews and testing. Swedish juggernauts Sweclockers used Vengeance LPX exclusively, as did Hardware.no and Tweak.dk. Likewise, over in Benelux, you’ll see the same choice at the incredibly popular Tweakers.net and especially Hardware.info, which used both Vengeance LPX and Dominator Platinum DDR4.
  20. Having a boss overclocked, dual-GPU, custom-liquid-cooled system is pretty fantastic. It’s quiet, fast, runs any games I throw at it…it’s hard to complain too much. From the outside (or at least outside California), there’s very little wrong with having something that beastly to play with. And indeed, it’s hard to complain. Except in the summer. Except when I need to work on it. Except… There are drawbacks. Power consumption is high, and that means the system has to dissipate a tremendous amount of heat. California is experiencing one of the hottest summers in history (to say nothing of our drought), and we’ve never had very low power bills. A system like mine is great right up until I run headlong into the drawbacks. In a bid to see if I could make my life easier, I decided to take advantage of some of our new products (one of which isn’t out just yet but will be very soon) and produce a leaner, more purpose-driven build. It still has to be quiet, it still has to deliver superior gaming performance at my home resolutions of 1920x1200 and 5760x1200, and it can’t feel like a substantial step down. At the same time, it has to draw a lot less power. Let me introduce you to Blues. In the interest of producing something smaller, easier to use, and still incredibly powerful, I opted to employ our new Carbide Series Air 240 enclosure. The Air 240 is particularly special because unlike many of our other cases, it wasn’t entirely planned. While we take tremendous care in all the products we develop, the Air 240 was something that we really wanted. Remember that the people designing Corsair products are die hard enthusiasts, honestly just a bunch of nerds that come to work every day and ask themselves what they want to see on the shelves. This case was a pet project, and it’s everything we hoped for. Call it cliché, but my favorite color combination continues to be the time tested black and blue. 
For me, that meant taking the black version of the Air 240 and then fitting it with a series of our new SP120 blue LED fans. But just because I opted to use efficient fans doesn’t mean I was guaranteed silence. For that, I needed to choose my components very carefully.

The CPU is Intel’s Core i7-4790K, based on the Devil’s Canyon version of their Haswell architecture. These chips are Intel’s top of the line, but rather than overclocking, I opted instead to lock the peak clock speed to a still-speedy 4GHz, allowing me to drop the Vcore to just 0.975V. This keeps temperatures low, ensuring the SP120 LED fan on the Hydro Series H75 cooler never has to spin up. Attached to the CPU is 16GB of Dominator Platinum DDR3 running at 2400MHz CAS 10 with blue Light Bar kits installed. 2400MHz is really the sweet spot for Haswell, and 16GB ensures I never run out of system memory. That hardware is all plugged into an ASUS Z97I-PLUS mini-ITX motherboard. The Z97I-PLUS has a fairly understated color scheme while being very feature-rich.

Of course, the other part of the equation is gaming performance, and that’s where our new Hydro Series HG10 comes into play. I swapped out the noisy stock cooler of an AMD Radeon R9 290X for the HG10 and then attached our 240mm Hydro Series H105 CPU cooler to it. The result? R9 290X performance, always running at the full 1GHz on the GPU, without any of the noise. It’s quiet, and it’s fast.

You can see that overall, the interior design of the Air 240 is pretty efficient. The hoses on the H75’s radiator do apply a little pressure against the memory slots, but everything fits, and it looks surprisingly neat for a small form factor build. The remainder of the primary chamber is kept cool by three more blue SP120s, but with a total of six fans and only three fan headers on the motherboard, how on earth was I going to keep the noise down?

Not pictured: cable management skills. 
I continued by employing an incredibly efficient HX750i power supply, a unit whose fan only needs to spin up under substantial stress. That HX750i, along with the six primary chamber fans, gets connected to our new Corsair Link Commander Mini. The Commander Mini’s improved hardware over the original Cooling Node allows for precise control of the six main fans, letting me run them all at their lowest speeds. Finally, you’ll see that I have a trio of 512GB Force LX SSDs handling storage duties, providing plenty of high-speed storage for gaming and video editing.

With this fairly robust system (in a fairly small footprint) on hand, the major test is whether the trade-offs are worth the reduced noise, heat, and power consumption. That’s something we’ll be looking at very soon, so stay tuned.
  21. Now that Haswell-E has been with us for about a month, we’ve had a chance to study it a bit more thoroughly and collect some data. One of the standout aspects of the architecture, particularly in its premium octal-core configuration, is its power consumption. At stock speeds, Haswell-E is an incredibly efficient architecture. However, while mainstream Haswell and Devil’s Canyon are able to see substantial overclocks without massive corresponding increases in power draw, Haswell-E can quickly lose that efficiency edge when overclocked.

We used a sample of four Intel Core i7-5960X CPUs and tested overclocking on each one to get a fairly holistic idea of what we can get. Unfortunately, the results weren’t as diverse as I’d hoped, likely owing at least partially to these samples all being from the same batch. Power consumption was measured through Corsair Link using the AX1200i power supply.

Our test platform:

CPU Cooler: Corsair Hydro Series H100i (four fans in push-pull)
Motherboard: ASUS X99-Deluxe
Memory: 4x4GB Corsair Dominator Platinum DDR4-3200 CAS 16
Video Card: eVGA GeForce GTX 780 Ti
Storage: 128GB and 256GB Force LX SSDs
Enclosure: Corsair Graphite Series 760T
Power Supply: Corsair AX1200i PSU

For peak load numbers, the CPU was kept under a sustained stress test for ten minutes.

To start, we’ll look at the voltages required to get each of our samples up to the individual overclocks. We tested them at their stock configuration, then at 3.5GHz all the way up to 4.4GHz. Note that none of these chips was able to do 4.5GHz at under 1.4V, which in this writer’s opinion is just too high for this architecture and manufacturing process. You can see there wasn’t a lot of variance between these four chips, with the exception of the last one. The fourth chip was able to hit the overclocked speeds at slightly lower voltages than its kin, but still had the same trouble getting to 4.5GHz. Voltage has a pretty linear relationship to power consumption on these chips, too. 
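That near-linear voltage/power relationship follows from basic CMOS switching behavior, where dynamic power scales roughly with frequency times voltage squared. Here's a minimal sketch of that scaling; the frequency/voltage pairs are illustrative round numbers, not the exact readings from our four samples.

```python
# Rough dynamic-power scaling model: P ~ k * f * V^2.
# The (frequency, Vcore) pairs below are illustrative, not measured values.

def relative_power(freq_ghz: float, vcore: float) -> float:
    """Dynamic CPU power, up to a constant factor."""
    return freq_ghz * vcore ** 2

baseline = relative_power(3.5, 0.975)
for freq, vcore in [(3.5, 0.975), (3.9, 1.10), (4.2, 1.25), (4.4, 1.35)]:
    scale = relative_power(freq, vcore) / baseline
    print(f"{freq:.1f}GHz @ {vcore:.3f}V -> {scale:.2f}x baseline power")
```

Even this crude model shows why the last couple of speed bins are so expensive: the voltage term is squared, so a 10% Vcore bump costs roughly 20% more power before the frequency increase is even counted.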
You can see load consumption hangs out at or slightly above 200W at stock speeds, and considering that’s driving eight cores, that’s really not too bad. Manually setting the clock speed to 3.5GHz across all eight cores and the voltage at a steady 0.975V shaves off a little consumption for most of the chips. Go to the other end of the scale, though, and suddenly you’re looking at more than 350W of load power.

When overclocking CPUs, there’s always an “inflection point.” Up to that point, getting an extra speed bin requires only a modest increase in voltage, if any. On all of these chips, the inflection point has been between 4.2GHz and 4.3GHz. There’s a steady, gradual increase in consumption up until about 4.1GHz, and then suddenly we’re cresting 300W under load at 4.2GHz. 4.4GHz, while totally doable, can add as much as 60W of additional power consumption over 4.3GHz for just a single bin.

This graph should give a clearer idea. Since all four of our samples behaved fairly similarly, I’ve isolated one of them and tracked the voltage and wattage scaling against each other. The relationship is fairly linear, with both of them spiking at 4.4GHz. Interestingly, though, the voltage and wattage climb doesn’t start to get really onerous until about 3.9GHz, and this is true across all four chips. While I desperately wanted 4GHz to be the sweet spot, it seems like 3.9GHz is really where it’s at.

When we take a look at the idle-to-load delta, we get a better idea of just how nasty power consumption becomes at high load. At 3.5GHz, the chip idles at ~104W and under load goes up to ~206W, a delta of about 102W. That’s not so bad. On the far-flung, nastiest edge (4.4GHz), though, you’ve got ~138W of idle power – still not unreasonable – but a staggering ~358W of load. Suddenly the chip is adding 220W to your power consumption under load. 
Also, as before, note that power consumption starts to really climb after 3.9GHz; up to that point, you’re getting a healthy amount of performance at fairly reasonable power.

You can see from the data here that the i7-5960X is at least reasonable with power consumption at stock speeds, but starts to really beat on your power supply once you overclock past a certain point. For our samples, that point was about 3.9GHz, which is a little disappointing since it’s shy of the magic 4GHz mark. Is a nice, round 4GHz worth an extra 20W or so? We’ll take a look at how much performance overclocking gets you, as well as performance-per-watt metrics, in a future article, so stay tuned.
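To make the idle-to-load arithmetic above explicit, here's a minimal sketch using the figures quoted in this article (Corsair Link readings on one of our i7-5960X samples):

```python
# Idle vs. load draw at two overclock points, from the measurements above.
measurements = {
    "3.5GHz": {"idle": 104, "load": 206},
    "4.4GHz": {"idle": 138, "load": 358},
}

for clock, watts in measurements.items():
    delta = watts["load"] - watts["idle"]
    print(f"{clock}: {watts['idle']}W idle, {watts['load']}W load, +{delta}W under load")
```

The delta roughly doubles (about 102W to about 220W) for less than a gigahertz of extra clock speed, which is the whole story of Haswell-E overclocking in two lines of output.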
  22. We’ve done a couple of FAQs and Q&As, but we haven’t yet painted a clear, by-the-numbers picture of what DDR4 really has to offer beyond DDR3. On the desktop side, the lower power consumption is offset somewhat by the fact that the only platform that supports it starts chugging power the instant overclocking gets involved, while 16GB DIMMs (one of the key advantages of DDR4) aren’t expected to be available until 2015. That leaves us with performance.

A lot of users are concerned that the increased timings on DDR4 make it inferior to DDR3 at similar speeds, but that doesn’t really tell the whole story. While DDR2 and DDR3 were architecturally very similar and took some time to separate, DDR4 is host to a few internal architectural changes that affect overall latency and performance. Those changes allow it to see benefits over DDR3 right out of the gate.

I want to stress that this exercise, at least right now, is academic: there is no platform currently available that supports both DDR3 and DDR4. So if you want DDR4, you’re using Haswell-E, and vice versa. That makes this comparison a little difficult, since it’s tough to quantify in apples-to-apples terms whether or not DDR4 really is “faster” than DDR3.

For testing, I used three platforms with both single rank and dual rank DIMMs. Dual rank DIMMs increase parallelization a little at the cost of a very minor hit in latency, typically about 1ns. In lay terms, denser memory DIMMs (e.g. 8GB) get a little more mileage than lower capacity, single rank DIMMs (e.g. 4GB). Single rank DIMMs pretty much have to bank on hitting higher speeds to make up the deficit. 
These are the testbeds I used:

                   Haswell               Ivy Bridge-E          Haswell-E
CPU                Intel Core i7-4790K   Intel Core i7-4930K   Intel Core i7-5960X
Motherboard        ASUS Z97-WS           ASUS P9X79 Pro        ASUS X99-Deluxe
Single Rank Kit    CMY16GX3M4A3000C12R   CMD16GX3M4A2933C12    CMD16GX4M4B3200C16
Dual Rank Kit      CMY32GX3M4A2800C12R   CMY64GX3M8A2400C11R   CMD32GX4M4A2800C16
Memory Channels    2x DDR3               4x DDR3               4x DDR4

Note that in each case, the CPU’s core clock was set to 4GHz and the uncore clock was set to 3GHz. And these are the latencies I tested with at each speed:

Speed      DDR3          DDR4
1600MHz    10-10-10-30   –
1866MHz    11-13-13-31   –
2133MHz    11-13-13-31   15-15-15-35
2400MHz    11-13-13-31   15-15-15-35
2666MHz    11-13-13-31   15-15-15-35
2800MHz    12-14-14-36   16-16-16-36
3000MHz    –             16-16-16-36
3200MHz    –             16-16-16-36

You can see I’ve tried to make it as apples-to-apples as possible, but these are different architectures and memory controllers. For bandwidth testing, I used AIDA64. I’m keen to point out before we get started that it’s tough to actually quantify “faster,” since there are essentially four disciplines you’re looking at: three that are bandwidth related and one that is latency related. It’s more sensible to look for trends.

READ (MB/s)    1600   1866   2133   2400   2666   2800   3000   3200
Haswell 1R     23233  26392  30586  34079  22850  23917  –      –
Haswell 2R     23982  27840  31833  35406  23585  –      –      –
Ivy-E 1R       41435  48444  50573  55197  –      –      –      –
Ivy-E 2R       43670  50520  57341  59831  –      –      –      –
Haswell-E 1R   –      –      54514  57664  60025  59651  60848  62407
Haswell-E 2R   –      –      56771  60231  62164  61045  –      –

So right off the bat, you can see Haswell’s dual-channel memory controller is going to have a hard time keeping up with the quad-channel memory controllers on Ivy Bridge-E and Haswell-E. What’s notable, though, is that DDR3 and DDR4 are very close at the same clock speed despite DDR4’s increased CAS latency. In fact, if you’re using single rank DIMMs, DDR4 is measurably faster than DDR3. You may also notice Haswell’s memory bandwidth taking a bath after 2400MHz; this is something independently verifiable. 
Latency continues to improve past 2400MHz, but memory bandwidth takes a consistent hit. Meanwhile, Haswell-E’s DDR4 controller takes a slight dip at 2800MHz when we have to shift to CAS16 from CAS15, but resumes climbing at 3000MHz and 3200MHz.

WRITE (MB/s)   1600   1866   2133   2400   2666   2800   3000   3200
Haswell 1R     23715  27157  30852  34819  22926  24053  –      –
Haswell 2R     25132  29222  33248  37415  24158  –      –      –
Ivy-E 1R       30537  33096  37845  42184  –      –      –      –
Ivy-E 2R       31488  52746  60432  43347  –      –      –      –
Haswell-E 1R   –      –      46711  46817  46919  46888  46927  47009
Haswell-E 2R   –      –      47758  47832  47892  47912  –      –

At this point it’s obvious Intel’s Ivy Bridge-E and Haswell DDR3 controllers just weren’t architected to handle high speeds. Ivy Bridge-E and DDR3 do offer consistently higher write speeds than DDR4 does (provided you’re running dual rank modules), while DDR4’s write speed is essentially constant and stable at about 47GB/s. While write speeds are obviously a weak point in Haswell-E’s DDR4 memory controller, they’re really the only one.

COPY (MB/s)    1600   1866   2133   2400   2666   2800   3000   3200
Haswell 1R     21557  24347  27535  30596  22256  23324  –      –
Haswell 2R     23794  27262  30494  33635  23557  –      –      –
Ivy-E 1R       39492  44368  49485  54149  –      –      –      –
Ivy-E 2R       43876  51026  58393  59667  –      –      –      –
Haswell-E 1R   –      –      52447  57749  62076  62793  65558  68990
Haswell-E 2R   –      –      56066  61703  67144  59848  –      –

Memory copy performance starts slightly behind DDR3 at 2133MHz and then pretty much soars past it at 2400MHz. Judging from the synthetics so far, it seems like users who want to start getting the most out of Haswell-E should be looking at 2666MHz kits at a minimum. Again, mainstream Haswell’s dual-channel DDR3 controller is totally outclassed by the fatter pipes of these higher-end, hexa-core and octal-core processors. 
LATENCY (ns)   1600   1866   2133   2400   2666   2800   3000   3200
Haswell 1R     60.2   57.2   53.6   48.6   46.1   45.6   –      –
Haswell 2R     61.5   58.3   53.7   49.4   46.5   –      –      –
Ivy-E 1R       78.3   72.2   65.7   60.6   –      –      –      –
Ivy-E 2R       80.8   67.7   60.1   61.7   –      –      –      –
Haswell-E 1R   –      –      71.3   66.2   62.3   63.3   61.2   56.3
Haswell-E 2R   –      –      72.9   67.5   63.3   64.6   –      –

This is probably the biggest bugbear in the transition from DDR3 to DDR4. But users expecting DDR4 to grossly underperform DDR3 due to the higher CAS latency are in for a surprise: as you ramp DDR4 to its intended speeds, latency actually drops below DDR3’s (excepting Haswell’s dual-channel controller, which is just plain lower latency than both quad-channel controllers). So while it’s true that DDR4 can be as much as 10ns slower than DDR3 at the same clock speed, it still has lower latency at its mainstream speeds, and the deficit isn’t any greater than if you were going from Haswell’s dual-channel controller to Ivy Bridge-E’s quad-channel one.

Ultimately, that’s kind of the takeaway here: DDR4 starts at very high speeds with room to scale higher, and at those entry-level speeds, it’s faster and more capable than its predecessor in almost every test. Mainstream DDR4 actually winds up with lower overall latency and higher bandwidth than mainstream DDR3. In the future we’ll be testing DDR4 in practical applications to see if there are performance gains to be had from exceeding the baseline 2133MHz, but for now it’s clear that if nothing else, DDR4 is a more than worthy successor to DDR3, and fears that the higher timings would result in substantially increased overall latency are by and large unfounded.
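The latency crossover described above is easier to see once CAS latency is converted from cycles to nanoseconds: the CAS figure counts memory bus cycles, and the bus clock is half the effective transfer rate. A quick sketch, using timings from the table earlier in this article:

```python
def cas_latency_ns(speed_mts: int, cas_cycles: int) -> float:
    """First-word CAS latency in nanoseconds.
    The memory bus clock (MHz) is half the effective rate (MT/s)."""
    bus_clock_mhz = speed_mts / 2
    return cas_cycles / bus_clock_mhz * 1000

print(f"DDR3-2400 C11: {cas_latency_ns(2400, 11):.2f} ns")  # 9.17 ns
print(f"DDR4-2133 C15: {cas_latency_ns(2133, 15):.2f} ns")  # 14.07 ns
print(f"DDR4-3200 C16: {cas_latency_ns(3200, 16):.2f} ns")  # 10.00 ns
```

So DDR4-3200 C16 nearly matches DDR3-2400 C11 on the CAS component alone, despite the much higher cycle count. Note this covers only CAS; the AIDA64 figures above also include controller and architecture overhead, which is why the quad-channel platforms measure higher overall.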
  23. With the launch of DDR4 in conjunction with Intel’s Haswell-E high-end desktop platform, it only makes sense for us to provide you with as much information as possible about this exciting new memory technology: not just in terms of why you might need it or what it might offer you, but what it is and how to use it. DDR4, the X99 chipset, and the Haswell-E platform are all brand new technology, and this is about as bleeding edge as it gets. To that end, we’ve authored a whitepaper for the more technical folks in our readership as well as this FAQ. The whitepaper is aimed at an enthusiast level: not low-level technical detail, but not overly simplified either. If that sounds like you, have a look here.

Frequently Asked Questions

Why do we need DDR4?
There are four major reasons why DDR4 is set to replace DDR3: it’s capable of hitting faster speeds, it’s capable of hitting higher densities (16GB DIMMs are expected in 2015), it has improved error correction built into the baseline specification, and it consumes less power for equivalent or better performance than DDR3. In short, while DDR3 is butting up against its limitations today, DDR4 still has a tremendous amount of room to scale.

Is DDR4 slower than DDR3?
Because DDR4 uses looser latencies than DDR3 does, it can be slightly slower than DDR3 at the same clock speeds. What makes DDR4 important is that it can easily make up for that deficit by hitting higher clock speeds than DDR3 can. Getting DDR3 to run at 2666MHz or higher requires very careful binning of memory chips and can be very expensive, while 2666MHz is the lowest speed we’re launching DDR4 at.

Is DDR4 backwards compatible with DDR3?
No. DDR4 and DDR3 have key notches in different places on the DIMM to prevent them from being mixed up, and Haswell-E and X99 are DDR4 only.

Does DDR4 have XMP?
Yes! We’ve been working hard with all major motherboard vendors to ensure compatibility with our high speed DDR4 memory, and that includes XMP. 
DDR4 employs a new specification, XMP 2.0, while DDR3 remains on XMP 1.3.

How does XMP work on DDR4?
Very similarly to DDR3, but with some caveats. For starters, Haswell-E tops out at a 2666MHz memory strap, which is very low for what DDR4 can do. Since XMP specifies speeds in excess of 2666MHz, your motherboard BIOS has to compensate somehow. Typically, when XMP tells the motherboard to use a memory speed higher than 2666MHz, the motherboard BIOS will bump the BClk strap from 100MHz to 125MHz. That’s normal, but that change will also increase the clock speed of the CPU itself; a well-designed BIOS will compensate and bring the CPU clock speed back in line.

Why are there two XMP profiles on my Corsair DDR4?
We include a pair of XMP profiles instead of just one for users who want to control how much power is consumed by the memory. The first XMP profile runs the DDR4 at its specification of 1.2V, while the second offers a higher speed at the cost of bumping the voltage to 1.35V. The first profile, then, is officially supported, while the second is not and instead offers a baseline of what the memory should be able to achieve.

Why am I encountering stability issues with XMP?
While we’ve been working around the clock with motherboard vendors to maximize compatibility and performance, these technologies are all very new. If you have trouble with stability using either XMP profile, we recommend either manually entering the speed and timings the DDR4 is rated for, or running your memory at its default speeds until your motherboard vendor provides a BIOS update to improve stability.

I’m running at the default 2133MHz speed, but my system still isn’t stable.
Double-check which memory slots your DDR4 is installed in against your motherboard’s instruction manual. We’ve found that you have to install your DIMMs in the primary set of memory channels first, in order, to ensure stability. If this checks out, please contact our tech support. 
What’s the difference between Dominator Platinum DDR4 and Vengeance LPX DDR4?
Vengeance LPX is our mainstream DDR4, utilizing a standard-height PCB and heatspreader. Dominator Platinum DDR4 adds a larger, more robust heatspreader as well as compatibility with our Light Bar Kit, Dominator Airflow Platinum fan, and Corsair Link for monitoring voltage and temperature (Airflow Pro required).

What can we expect from DDR4 in the future?
We’re launching DDR4 at speeds of up to 3000MHz and densities of 8GB per stick, but that’s just this year alone. DDR4 is expected to hit 16GB densities in 2015, allowing your X99 motherboard to support a staggering 128GB of memory (provided it has eight memory slots). In short, it’s gonna get bigger, and it’s gonna get faster.

Where can I learn more about DDR4?
As I mentioned in the introduction, we’ve authored a whitepaper that provides a much more detailed examination of this new memory technology. You can find it here.
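The BClk compensation described in the FAQ above can be sketched numerically. This is a simplified model: the 100MHz and 125MHz base clocks and the 2666MHz strap limit come from the FAQ, but the specific multiplier values are illustrative examples, not taken from any particular BIOS.

```python
def effective_clocks(bclk_mhz: float, mem_ratio: float, cpu_mult: float):
    """Memory speed and CPU clock (both MHz) derived from the base clock."""
    return bclk_mhz * mem_ratio, bclk_mhz * cpu_mult

# At the stock 100MHz BClk, the memory strap tops out around 2666MHz:
mem, cpu = effective_clocks(100, 26.66, 35)   # ~2666MHz memory, 3500MHz CPU

# To honor a 3000MHz XMP profile, the BIOS bumps BClk to 125MHz and uses a
# 24x memory ratio; a well-designed BIOS also drops the CPU multiplier
# (35 -> 28 here) so the core clock stays put instead of rising with BClk:
mem, cpu = effective_clocks(125, 24, 28)
print(f"{mem:.0f}MHz memory, {cpu:.0f}MHz CPU")
```

A BIOS that raises BClk without lowering the CPU multiplier would silently overclock the processor by 25%, which is exactly the behavior the FAQ warns a well-designed BIOS must compensate for.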
  24. A little less than a month ago, I ran into a troubling issue with my system at home. Drives on the SATA ports were starting to blink in and out while the machine was running. We checked the SSDs (all Neutrons and Neutron GTXes, naturally) internally and found no problems, and that pretty much left me with the motherboard. Ordinarily, replacing a motherboard is a nuisance but not a huge issue, but when your system looks like this:

…it’s borderline catastrophic. Since then I’ve been able to rotate the drives to different SATA ports, and so far things have been okay, but it seemed prudent nonetheless to build a system that I could swap in if things got too difficult with my primary. Luckily, we had a couple of Graphite Series 760T prototypes in house, and I opted to grab one and put together a pretty handsome beast.

It’s easy for the white version of the 760T to overshadow the black one, and the white one is the one we’ve been showing off, but I actually have a soft spot for black. I decided to take the opportunity to style a black 760T and make it my own. Here are the parts I used for this build:

Intel Core i7-4770K
Corsair Hydro Series H110 CPU Cooler
ASUS Maximus VI Hero Z87 Motherboard
4x8GB Corsair Dominator Platinum DDR3-2400 CAS 10 with Light Bar Kit
AMD Radeon HD 7990
XFX Radeon HD 7970
Corsair Neutron GTX 240GB SSD
Corsair AX1200i 1200W 80 Plus Platinum Power Supply
Corsair Link Lighting Node
Two blue AF120 LED fans and two blue AF140 LED fans

Assembly was fairly easy, and I made effective use of the resources on hand from being a Corsair employee and member of the tech marketing team. For starters, the Intel Core i7-4770K is an engineering sample: nigh identical to retail chips, but in my experience, engineering sample Haswell CPUs tend to overclock a bit better. This one did the same clocks my home chip does: 4.5GHz at ~1.22V. That’s pretty good for an i7-4770K, especially one that has 32GB of DDR3-2400 CAS 10 strapped to it. 
I was also able to scrounge up a second Y-cable and two more of the stock fans for the H110 and assembled it in a push-pull configuration to maximize performance while reducing noise.

The ASUS Maximus VI Hero motherboard was a good chance for me to play around with one of ASUS’s ROG boards and BIOSes. I’m used to overclocking on Gigabyte hardware, but one of the benefits of working here is being able to learn about everything else that’s available. I do my research and pass it on to you. The Maximus VI Hero is a fine board and very easy to use, and I may make the jump back to ASUS when I do finally rebuild my custom loop in my home system (probably circa Devil’s Canyon or Haswell-E).

On graphics duty, I was compelled to investigate the issues Tom’s Hardware reported with the AMD Radeon HD 7990, as well as to try to get a little triple CrossFire action going. Tom’s Hardware was right, though: in a multi-card configuration, the 7990 just plain can’t be the top card. For whatever reason, the fan design on the card causes the first GPU to suffocate and overheat; swapping the 7990 into the bottom slot and putting the 7970 into the top slot allowed TriFire to function without thermal issues. That said, it’s still a bit noisy: the 760T has fine air cooling performance, but we’re talking about ~625W of graphics hardware under open-air coolers.

As for performance, it’s definitely there, but microstutter is also plainly evident in Unigine Heaven. Historically, going past two GPUs has exponentially increased potential driver issues, and that’s pretty apparent with this build. If I were going to replace my home system with this build, I’d seriously consider simply removing the 7970 and sticking with just the 7990. Visiting forums suggests other users have fared better with triple Radeons, but I remain skeptical. 
In order to give my build some flair, I swapped out the red LED fans that came pre-installed in the black 760T for blue ones, as well as adding an additional 120mm blue intake fan on the bottom of the case. I used our blue braided cable kit for the AX1200i power supply as well, and swapped in blue light bars for the Dominator Platinum memory. Finally, I added two Corsair Link LED strips to the top and bottom of the case to help illuminate it. Despite blue being an extremely common LED color, blue system builds actually seem to be rare, and I’m very happy with the black/blue/red color scheme of this 760T.

We know the 760T is taking a little extra time to get to you, but we want to make sure it’s just right. The prototype I used for this build can have issues with flexing on the side panels that, frankly, we just weren’t happy with in a final product, and we think you’ll be a lot happier with the 760T when it does ship later this month.
  25. We’re concluding our run of Obsidian Series 250D coverage with a step-by-step guide on how you can build your own high performance system. Our how-to video covers assembling a system in the 250D from start to finish, complete with replacing the fan, installing a 240mm Hydro Series H100i, and even adding a bit of flash with our Corsair Link Lighting Node. In addition to the instructional video, I thought I’d at least share my thoughts on this particular build.

This is a build I put together for my girlfriend. She hasn’t gamed since the eras of Super Mario Bros. and Wolfenstein 3-D, so it’s a chance for her to get into and explore an entirely different world of gaming, as well as to experience just how enjoyable a modern computer system with high quality peripherals like the Vengeance K95 keyboard can be.

For the core of the system, I went with an Intel Core i7-4770K processor, Gigabyte GA-Z87N-WiFi motherboard, and 16GB of Dominator Platinum DDR3-2400 CAS 10 memory complete with light bar kit. The processor is admittedly overkill (especially since I overclocked it to 4.3GHz), but amusingly enough, when you’re in the industry it’s actually easier to get top-end chips than something more modest and appropriate like an i5-4670K. Our motherboard is at least more in line with that.

I’m personally a huge proponent of our Dominator Platinum line of memory. It doesn’t sell through like our Vengeance and Vengeance Pro lines and it definitely comes at a premium, but as far as I’m concerned it’s the best-looking memory on the market, and the light bar kits only improve it. Interestingly, our light bar kit was originally going to come in red, blue, and white, but the red turned out more of a blush color when illuminated. I managed to rescue these prototypes from the lab and used them for this build.

I wanted to make sure the build had a card that was powerful enough to play any game she wanted to try at 1080p with all the settings turned up. 
We had a GeForce GTX 660 Ti in the back that was going to be used for a build that never came to pass, and because it’s last-generation Kepler, it’s not that exciting for any of our more showcase builds. It’s still a fantastic card, though, and I was happy to include it here. The 256GB Neutron SSD provides at least a good starting point for any user. It’s responsive and snappy and helps complete the high performance computing experience. Powering the whole system is our quiet RM750 power supply. 750 watts is way more than enough, which means the fan will seldom if ever have to turn on, and I used our white sleeved modular cables to help properly pick up the lighting.

Our Hydro Series H100i CPU cooler remains our most popular cooler, and the 250D was architected specifically to support it. It seems only appropriate that I include the H100i here. I’m also using our purple AF series LED fans (her favorite color) for the front intake of the case and the radiator. The AF series aren’t as efficient for radiator duties as our SP series fans, but until we have SP LED fans (and I continue to shake the tree for those), they’ll do in a pinch. Finally, I added a Corsair Lighting Kit to complete the system.

The resulting build is attractive, well tuned, and was well received. I’ll be spending the next couple of weeks trying to build up a profile of her as a gamer, and hopefully I’ll be able to report back soon with what she’s been playing. In the meantime, you now have detailed step-by-step instructions on how to build in our most imaginative case to date, so you can get out there and build your own miniature gaming machine!