Corsair Community

Showing results for tags 'neutron gtx'.

Found 16 results

  1. The Graphite Series 380T was designed to be the ultimate LAN enclosure, with a sturdy handle on the top, easy internal access, integrated fan control, and a striking industrial design. For better or worse, we expanded its dimensions to allow you to install a 240mm liquid cooler for the CPU. Amusingly enough, though, what I always fixated on with it was the way the white one, with red LED fans, could wind up looking like this guy’s head. (Image source: Mass Effect 2 wiki.)

The white version of the 380T has white LED fans and white LEDs for all of the lighting, but that’s fixable. What I also wanted to do was put the most comically powerful system I could inside the case. Initially I was gunning for efficiency and planning to use Intel’s Core i7-5775C CPU, but Broadwell’s limited overclockability wound up being unappealing in the face of being able to go completely insane with the ASRock X99E-ITX/ac. (Image source: ASRock.)

While the board uses an enterprise-class socket with narrower mounting points than the traditional LGA 2011-3 socket, Asetek produces a mounting kit for this narrow socket that allowed me to install an H100i GTX, giving me all the cooling performance I could need for the Intel Core i7-5960X I was planning to use. It can be tough to scale to high DDR4 speeds on Haswell-E when you’re populating all four memory channels, but running in dual channel takes some of the load off the controller. The result is that I have two 8GB DDR4-2800 DIMMs installed, making up some of the memory bandwidth deficit stemming from the X99E-ITX/ac’s two memory channels.

The other half of the performance equation is getting a powerful graphics card, and right now the NVIDIA GeForce GTX 980 Ti is a tough card to beat. I’ve already covered how well this card overclocks when under an HG10, and it is an absolute bear. For this build, I used a reference Gigabyte GeForce GTX 980 Ti, our prototype HG10-N980 bracket, and a Hydro Series H75 cooler with two red SP120 LED fans.
The H75 is mounted to the front of the case, and the fans are controlled and powered by the 380T’s integrated fan controller. Handling storage duties are a 240GB Neutron GTX SSD as the system drive and a 960GB Neutron XT SSD as the gaming/scratch drive. If I’m going to overclock this system – and I absolutely am – I’m going to need a pretty solid power supply, and for that I turned to our recently released RM750i. This PSU necessitated ordering the PSU extension bracket, which also buys a little more breathing room internally. The extension doesn’t stick too far out of the back, either, so it’s not unsightly.

Finally, to get the look I wanted, I needed to replace the front fans with 120mm red LED fans as well as replace the white-lit I/O board with the red I/O board from the black version of the 380T. All in all, I don’t think it came out too badly. For reference, here’s the list of components used in this build:

  • CPU: Intel Core i7-5960X
  • Motherboard: ASRock X99E-ITX/ac
  • DRAM: Corsair Dominator Platinum 2x8GB DDR4-2800MHz CAS15
  • Graphics Card: Gigabyte GeForce GTX 980 Ti
  • Storage: Corsair Neutron GTX 240GB SSD and Corsair Neutron XT 960GB SSD
  • CPU Cooling: Corsair Hydro Series H100i GTX with aftermarket Asetek bracket
  • GPU Cooling: Corsair Hydro Series HG10-N980 with Corsair Hydro Series H75
  • Power Supply: Corsair RM750i 750W 80 Plus Gold
  • Chassis: Corsair Graphite Series 380T White
  • Accessories: 3x Corsair SP120 LED Red Fans, PSU Extension Bracket for 380T, 380T Red I/O Panel

In an upcoming blog, I’ll detail overclocking and just how much performance I was able to extract from this system, especially in comparison to the extremely powerful (and much larger) “Yamamura” 750D build.
  2. At Corsair, we make all kinds of stuff, but at our core, at our heart, we’ve been a memory company since the beginning. So when someone comes up with what appears to be a fantastic solution for using the large amounts of memory modern machines can support, we’re interested. With that in mind, I took DIMMDrive for a spin. It’s been garnering very positive reviews on Steam, and the $29.99 buy-in isn’t too unreasonable. I tried it on two different testbeds:

Testbed #1:
  • CPU: Intel Core i7-4790K @ 4.5GHz
  • DRAM: 2x8GB Vengeance Pro DDR3-2400
  • Motherboard: ASUS Z97-WS
  • Graphics: GeForce GTX 980
  • Storage: 240GB Force LS SSD
  • Cooling: Hydro Series H110i GT
  • PSU: HX750i
  • Chassis: Obsidian 450D

Testbed #2:
  • CPU: Intel Core i7-5960X @ 4.4GHz
  • DRAM: 8x8GB Dominator Platinum DDR4-2400
  • Motherboard: ASUS X99-Deluxe
  • Graphics: 2x GeForce GTX 980
  • Storage: 512GB Force LX SSD, 4x 480GB Neutron GTX SSD in RAID 0
  • Cooling: Custom Liquid Cooling Loop
  • PSU: AX860i
  • Chassis: Obsidian 750D

What DIMMDrive does is provide a smart front-end between Steam (and its games) and an old-school RAM drive. You load it up, toggle which games you want loaded into the drive, and then toggle DIMMDrive on. And therein lies your first problem: you’ve just front-loaded your loading times. The games you’re loading have to copy – in their entirety – to the RAM drive, and that copy time continues to be gated by the speed of your storage. The second issue is the footprint of the modern triple-A title. While DIMMDrive offers some small allowance for this by letting you choose which individual files in a game you want copied to the drive, the solution is a clunky one.
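That front-loading cost is easy to put in rough numbers. Here is a minimal sketch of the trade-off; the per-drive speed and seconds-saved figures are illustrative assumptions, not measurements from this article:

```python
# Rough model of a RAM drive's front-loaded cost: the one-time copy
# into RAM is gated by the SSD's sequential read speed, so it eats
# into whatever the faster loads later save you.

def ramdrive_breakeven(game_gb, ssd_gbps, seconds_saved_per_load):
    """Number of in-game loads before the up-front copy pays for itself."""
    copy_seconds = game_gb / ssd_gbps       # time just to fill the RAM drive
    return copy_seconds / seconds_saved_per_load

# A 60GB title over a ~0.5 GB/s SATA SSD takes 120 s to copy in.
# If each subsequent load saves ~3 s, that's 40 loads to break even:
print(ramdrive_breakeven(60, 0.5, 3))  # 40.0
```

Under these (assumed) numbers, a single gaming session rarely reloads often enough to recoup the copy time, which matches the experience described below.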
But look at the storage requirements for these modern games:

  • Battlefield: Hardline – 60GB
  • Battlefield 4 – 30GB
  • Far Cry 4 – 30GB
  • Counter-Strike: Global Offensive – 8GB
  • Elder Scrolls V: Skyrim (assuming no mods) – 6GB
  • Watch_Dogs – 25GB
  • The Witcher III: Wild Hunt – 40GB
  • Grand Theft Auto V – 65GB
  • Dota 2 – 8GB
  • World of Warcraft – 35GB

Even for users playing less intensive games, you’re still looking at a minimum of 16GB of system memory just to have enough to handle the game’s footprint. And how does it work in practice? I tried using it with a few games that seemed like they might benefit from faster access times: Sid Meier’s Civilization V has basically no loading time during the game, but takes an eon to load initially. Wolfenstein: The New Order uses id Tech 5’s texture streaming and thus by its nature desperately needs all the bandwidth it can get. Even old-school Left 4 Dead 2 tends to take a while to load.

The biggest problem was that whether I loaded these games off of my RAIDed SSDs or just the one, the longest load time was always, by and large, just copying the game into memory when DIMMDrive was enabled in the first place. Switching from a mechanical hard disk to a single SSD improves virtually every aspect of the computing experience and brings game load times in line, but going beyond that to the RAID or the DIMMDrive just doesn’t feel any faster. The most noticeable aspect of DIMMDrive was how long it took to load a game into RAM in the first place. Beyond that, Wolfenstein: The New Order would just crash when I tried to run it from DIMMDrive, so I wasn’t able to see if DIMMDrive could at least improve the texture pop-in any. So why doesn’t DIMMDrive make a home-run impact on gaming and loading times? Quad-channel DDR4-2400 is, at least synthetically, capable of being almost 100x faster on read than a good SSD.
But the answer is more complex, because when a game loads, it isn’t just copying data from storage into system memory. Many modern games already use system memory intelligently to smooth out load times in the first place. From there, data needs to be copied either from system memory or system storage to the graphics card’s video memory, and that’s going to be gated by the PCI Express interface, among other things. A PCI Express 3.0 x16 slot is capable of transferring a ballpark 16GB/s. A quad-channel memory bus will outstrip that in a heartbeat, while a more mundane dual-channel DDR3-1600 configuration is still capable of a ballpark ~25GB/s.

Even then, though, actually copying and moving data between system memory, system storage, and the PCI Express bus is only a part of what a game does when it’s loading. There are countless other operations to consider: compiling shaders, connection speed and latency for online games, and so on. My ultimate point is that by the time you’re done taking all of these other operations into account, the amount of time DIMMDrive might save you could be a few seconds at best, or it may actually cost you the time it takes to copy the entire game into system memory in the first place. If you’re on mechanical storage, DIMMDrive could definitely demonstrate an improvement, but it would still require a substantial investment in DRAM in the first place.

Ultimately, getting value out of DIMMDrive – assuming you’re on a platform that supports enough memory to make it viable for larger games – requires greater expense and more complexity than simply buying a high-capacity SSD. While I’d love to sell you our enormous memory kits, and I continue to recommend 16GB of system memory as a baseline for those who can afford it, the more sensible option continues to be solid state storage.
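The bandwidth hierarchy described above can be sketched with back-of-the-envelope arithmetic. The DDR figures follow the standard transfer-rate × 8 bytes × channels formula; the ~0.985 GB/s usable-per-lane PCIe 3.0 figure is an assumption based on its 8 GT/s rate and 128b/130b encoding:

```python
# Peak theoretical bandwidths: the memory bus outruns the PCIe link
# either way, so the link (and everything else a loading screen does)
# bounds how much a RAM drive can actually help.

def ddr_bandwidth_gbs(data_rate_mts, channels):
    """DDR peak bandwidth: MT/s * 8 bytes per transfer * channel count."""
    return data_rate_mts * 8 * channels / 1000  # GB/s

dual_ddr3_1600 = ddr_bandwidth_gbs(1600, 2)   # ~25.6 GB/s (the ~25GB/s above)
quad_ddr4_2400 = ddr_bandwidth_gbs(2400, 4)   # ~76.8 GB/s
pcie3_x16      = 16 * 0.985                   # ~15.8 GB/s usable (assumed per-lane figure)

print(dual_ddr3_1600, quad_ddr4_2400, pcie3_x16)
```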
  3. Recently, the actual computer part of the Obsidian Series 750D “Yamamura” custom water-cooled system began having issues with random shutdowns and reboots, as detailed in this earlier blog. Ordinarily those types of problems are a frustration, but when your system looks like this… …the increased difficulty of swapping any parts out, potentially requiring you to drain the loop entirely, may even make you question why you built your system up like this in the first place. However, as any die-hard builder knows, part failure always has a silver lining: an excuse to upgrade. And that’s what I did, giving me a chance to rectify a few pain points in the original build, things I felt I could’ve done or specced better. The system was already close to unimpeachable, but we can certainly do more.

Swapping the CPU, Motherboard, and DRAM

Before:
  • CPU: Intel Core i7-4790K (4 GHz), 4 cores / 8 threads, 84W TDP
  • Motherboard: ASUS Z97-WS
  • DRAM: 4x8GB Dominator Platinum DDR3-2400 10-12-12-32 1.65V

After:
  • CPU: Intel Core i7-5960X (3 GHz), 8 cores / 16 threads, 140W TDP
  • Motherboard: ASUS X99-DELUXE
  • DRAM: 8x8GB Dominator Platinum DDR4-2400 14-16-16-31 1.2V

The only way to “upgrade” past Intel’s monstrous Core i7-4790K (overclocked to 4.7GHz in our build) is to change your platform entirely, so that’s what I did. While the i7-4790K tops out at between 120W and 130W when overclocked, the i7-5960X starts there and pulls considerably more when overclocking is applied. But that’s fine: Yamamura enjoys a custom liquid cooling system with massive heat capacity. Changing the platform means swapping to the even more capable ASUS X99-DELUXE motherboard as well as jumping from DDR3 to DDR4. Latency does increase, but so do capacity and overall bandwidth. It’s a net gain, and our DDR4-2400 kit even includes an extra XMP profile that pushes the voltage to 1.35V and the speed to 2666MHz. Incidentally, due to the spacing of the video cards, we actually lose a little bit of bandwidth to the pair of GeForce GTX 980s.
The slot arrangement results in the bottom GTX 980 only getting PCIe 3.0 x8 instead of the full sixteen lanes, but thankfully this produces virtually no measurable decrease in performance.

Upgrading the Storage

A lot of people didn’t care for the way the LG Blu-ray burner broke up the front of Yamamura, and I can see why. At the same time, I also found myself needing a little more storage for a documentary I’m editing in my off hours. Thankfully, there’s a way to serve both masters, and it comes from SilverStone. SilverStone produces a 5.25” drive bay adapter that can fit a slimline, slot-loading optical drive and four 2.5” drives. By purchasing a slimline, slot-loading Blu-ray burner and installing a spare 512GB Force LX SSD we had in house, I was able to clean up the front of the case and increase storage. Fingerprints notwithstanding, it’s a lot cleaner than it was before.

Improving the Cooling and the Bling

While the original build called for a Dominator Airflow Platinum memory fan, we weren’t able to find clearance for one owing to the ASUS Z97-WS’s layout. Happily, the ASUS X99-DELUXE doesn’t have this problem, and that meant we could add two Dominator Airflow Platinums. Because they’re PWM controlled, they’re a perfect match for our old Corsair Link Cooling Node, and because they use the same RGB LED connector as our other lighting kits, a single Corsair Link Lighting Node is able to control them. The end result isn’t just increased bling: even at minimum speeds, the airflow from the fans helps keep the DDR4 cool (with individual DIMMs peaking at just 38C), while also shaving at least 10C off of the power circuitry surrounding the memory slots. Getting fresh airflow onto the motherboard’s VRMs never hurts.

Yamamura 1.5

I was immeasurably thankful that I didn’t have to drain the loop to make these upgrades, thus reaffirming my belief in flexible tubing.
Hard acrylic tubing is frequently argued to be the way to go in modern builds, and people say it looks nicer, but it’s far less practical. I use this computer daily, and I am possessed by a relentless appetite for tweaking the hardware. Given just how bloody fast the Yamamura is now (and stable, mercifully), I don’t foresee making any major changes to the system until Skylake and Big Maxwell at the earliest, at which point there may be a newer, more exciting chassis to move into…
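The DDR3-to-DDR4 latency increase mentioned in the upgrade above is easy to verify with quick arithmetic: first-word latency in nanoseconds is the CAS count divided by the I/O clock, which runs at half the data rate. A sketch using the two kits' rated timings:

```python
# First-word latency: a DDR module's I/O clock is half its data rate,
# so latency_ns = CL / (data_rate / 2) * 1000 = CL * 2000 / data_rate.

def cas_latency_ns(cl, data_rate_mts):
    return cl * 2000 / data_rate_mts

ddr3 = cas_latency_ns(10, 2400)  # outgoing DDR3-2400 C10 kit: ~8.3 ns
ddr4 = cas_latency_ns(14, 2400)  # incoming DDR4-2400 C14 kit: ~11.7 ns
print(round(ddr3, 1), round(ddr4, 1))  # 8.3 11.7
```

So absolute latency does rise by a few nanoseconds, the trade the author accepts for the capacity and bandwidth gains.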
  4. (This is the second part in a series of blogs. The first part details part selection, and is here.) Building a custom liquid cooling loop, even in a case as well-designed as the Obsidian Series 750D, seems to inevitably be more involved than you originally plan – at least if you’re a hobbyist like I am. This is only my fourth loop, and each time I’ve learned new and exciting lessons. For example: plans are adorable. Every time I’ve sat down to do one of these, I haven’t been exactly certain what order to go in. So for the Yamamura, I started out by just installing the waterblocks on the graphics cards.

I’m using the XSPC Razor GTX 980, almost entirely for its lighting, but also because I’ve been continually bothered by the general lack of user-friendliness of EKWB products. Installing an EKWB block on a Radeon R9 290X was a fussy, frustrating experience. The XSPC Razor was better, but not by much. Carefully removing plastic from both sides of little pieces of thermal padding is a chore unto itself. The Razor GTX 980 can also be ordered with XSPC’s backplate, or you can re-use the one that comes with the stock 980. I opted to just re-use NVIDIA’s. XSPC’s block sure is a looker, though, and definitely more appealing than the Swiftech Komodo blocks I used on my GTX 780s. The EKWB transparent block I used on the R9 290X for my 250D build was well-suited to the task, but for the Yamamura, the Razor 980’s glowing trim is going to be a real eye-catcher.

The Obsidian 750D needs to be almost gutted to fit the amount of cooling capacity we’re cramming into it. The 3.5” drive cages have to go, along with the stock intake and exhaust fans. I also had to temporarily remove the 2.5” drive sleds, but thankfully the smart layout of the 750D allows me to use up to four SSDs even with three radiators installed. Despite my reservations about how fiddly EKWB’s blocks can be, the Supremacy EVO is regarded on several forums as simply the best CPU block you can buy.
Interestingly, EKWB doesn’t necessarily employ a one-size-fits-all approach with their blocks; components within the block can be swapped out to optimize for individual platforms. The default configuration is for an LGA2011(-3) CPU, but replacing the jet plate makes it better suited for our Intel Core i7-4790K. You also want to make sure the copper microfins inside the block run parallel with the CPU die beneath the heatspreader to maximize heat transfer. And here’s where plans begin to crumble into dust. Two design choices already have to be cut and altered. Due to limited clearance and overachieving ambition, the bottom radiator can’t be configured as push-pull, so I went with push. The radiator’s fixtures also encroach on the HX1000i. While the HX1000i and radiator fit together, the HX1000i’s cables make it impossible. At this point I had to decide whether I wanted the HX1000i or the bottom radiator; the bottom radiator won out, and the HX1000i was replaced by the higher-performing but lower-capacity AX860i, which has an impressive 160mm depth. With radiator and component fitment sorted out, it’s time for some fresh problems. Everything installs okay enough, but the 360mm radiator in the top needs to be rotated 180 degrees; the inlet and outlet overlap the primary AUX12V socket on the motherboard. The secondary one gets covered by fans, and the third is next to the first PCIe x16 slot. Thankfully we only really need the first. The GTX 980s also wind up being a touch too long to use the stock mounting holes in the motherboard tray for the XSPC Photon 170 D5 Vario reservoir/pump combo. Note, too, that the speed control on the bottom of the pump is basically buried; I tested it before installation to find the right balance of performance and noise and went with the Level 3 (of 5) setting. In order to mount the pump and reservoir, I needed to drill holes into the motherboard tray. 
Per my girlfriend’s directions (she’s much handier than I am), I covered the tray with painter’s tape to keep metal shrapnel from flying into the electronics (a smarter decision would’ve been to take the electronics out ahead of time). I also photocopied the mounting side of the Photon 170 and used it as a guide for drilling the mounting holes, and this worked fairly well. Once I was confident that the pump and reservoir assembly was going to install safely, it was time to actually cut the tubing and connect the loop. Per a suggestion from a more experienced modder, we switched from Swiftech compression fittings to Bitspower, and it was a very positive switch. Swiftech’s fittings certainly work, but the Bitspowers are much, much easier to install. Incidentally, the loop layout wound up being essentially identical to the plan:

CPU Block → 360mm Radiator → GTX 980 #1 → GTX 980 #2 → Bottom 240mm Radiator → Front 240mm Radiator → Pump → Back to CPU Block

Owing to the relative spaciousness of the 750D, actually attaching the fittings wound up being fairly trivial, provided we warmed up the ends of the tubing before slipping it over the barbs. Of course, it’s easy for me to say it was trivial; there was still a decent amount of elbow grease involved, and my much stronger girlfriend was responsible for securing all of the compression fittings. With paper towels down, we primed the loop and left it leak-testing overnight. The next morning, the paper towels were dry. Since the loop itself was in place, I finished the build by installing the SSDs, Commander Minis, and rear 140mm exhaust, and finishing up the cabling. I did wind up having to use a helpful little accessory from a competitor of ours: NZXT has an accessory that lets you split a single internal USB 2.0 header into three (plus two USB 2.0 ports).
Since there are two Corsair Link Commander Minis installed plus the two USB 2.0 ports from the case and only two USB 2.0 headers on the ASUS Z97-WS motherboard, the accessory came in handy. And the Yamamura is complete. I had trouble deciding whether or not to include the BD-RE drive, but we felt like the break in the drive bays was worth the utility, and the silver line on the drive is a nice accent that helps keep the front of the case from being too monochrome. The system as a whole is amazingly silent while having tremendous cooling capacity. In the next part of this build log, I’m going to talk about optimization: with all of this cooling performance, it’s time to try unlocking the GTX 980s.
  5. It’s not at all uncommon (in fact, it’s exceedingly normal) for Corsair employees to tinker with our latest and greatest products just to see what we can actually do. While I was doing a single HG10-A1 build in the Carbide Series Air 240 that I was pretty proud of, one of our product engineers, Dennis Lee, was pushing things… well, a lot further. His Air 240 build borders on insane, and I’m happy to share it with you.

Components:
  • CPU: Intel Core i7-3820 @ 3.9GHz
  • Memory: Corsair Dominator Platinum 32GB (4x8GB) DDR3-1866 9-10-9-27 1.5V
  • Motherboard: ASUS Rampage IV Gene (X79)
  • Graphics: 2x AMD Radeon R9 290X
  • CPU Cooling: Corsair Hydro Series H75
  • GPU Cooling: 2x Corsair Hydro Series H75 and HG10-A1
  • PSU: Corsair AX860i
  • Storage: Corsair Neutron GTX 240GB
  • Enclosure: Corsair Carbide Air 240

Dennis’s build is… pretty wild. He used white SP120 LED fans and a red sleeved cable kit, then doubled down and swapped LED-lit pump caps from H105s onto all of the H75 coolers. The result is easily one of the craziest systems we’ve ever seen and a testament to just how much power can be crammed into a Carbide Series Air 240. In all of its glory: two liquid-cooled AMD Radeon R9 290X cards on an X79 Micro-ATX board with just about everything under water. In order to fit two H75s in the main chamber, one had to be arranged in a push-pull configuration. The H75s operate as intakes, keeping the blowers on the HG10s fed, while the two top fans work as exhausts. The pair of HG10s look cramped, but they were designed to allow for exactly this kind of close proximity when used with the right Hydro Series cooler. The third H75 (cooling one of the R9 290X cards) had to be mounted to the 120mm fan mount in the back chamber. Screw the H75 radiator to the side panel, close it up, and game on.
  6. It’s been half a year since we took an Obsidian Series 250D enclosure and installed a custom liquid cooling loop into it just to prove we could. Today we’re going to do something a little more straightforward with one of the most flexible cases in our lineup: the mainstream juggernaut Obsidian Series 750D. The 750D has been an extremely popular and solid seller for us, and it’s not hard to see why. This chassis design (and to an extent its flashier derivative, the Graphite Series 760T) is the history of Corsair cases placed in a crucible, the excess burned away and only the essentials remaining. It’s large but feature-rich, maximizing its space and giving the end user tremendous flexibility. This will be a series of articles on a build I’ve dubbed “Yamamura” after the villainess of the Japanese “Ring” films, whose father is implied to be a water demon. Today we’re going to start with the parts list. Note that this is tentative; at some point parts may be swapped in or out depending on circumstances.

Chassis: Obsidian Series 750D
This build’s reason for being, the 750D boasts tremendous capacity for water cooling, rivaled only by the larger Graphite 780T and Obsidian 900D cases. Combining a clean design with solid airflow, room for multiple radiators, mounting points for a pump/reservoir combo, and general ease of assembly, the 750D is really the ideal mainstream case for liquid cooling enthusiasts who don’t want to go all out with a juggernaut like the 900D.

Processor: Intel Core i7-4790K
It’s reasonable to suggest an Intel Core i7-5960X might be a more exciting option, but the i7-4790K is a vastly more efficient processor, even when substantially overclocked. Part of the reason we’re going with so much radiator capacity (listed later) is to be able to run the fans at low speeds; a chip like the i7-5960X, which dumps an extra ~150W of heat into the loop when overclocked, takes a substantial bite out of that thermal headroom.
Intel’s i7-4790K is a stellar processor in its own right, and our samples hit 4.7GHz on Intel’s highest-performing CPU architecture.

Motherboard: ASUS Z97-WS
I’ve been using this board in my Haswell and Devil’s Canyon testbed and it’s been an absolute pleasure. The Z97-WS is feature-complete for this generation, sporting SATA Express, M.2, a PLX switch for dual PCIe x16 SLI and CrossFire, multiple USB 2.0 and USB 3.0 headers, and even FireWire capability. There are also extra power leads for the CPU socket and the PCI Express slots. Short of an ROG board, the Z97-WS is basically as good as Z97 gets.

Memory: 32GB (4x8GB) Corsair Dominator Platinum DDR3 2400MHz CAS 10
It’s tempting to go for higher-speed memory, but we’ve found internally that 32GB of DDR3-2400 is really the sweet spot for Haswell and Devil’s Canyon. This is fast memory and a lot of it, and it ensures that you’ll never be bottlenecked by your memory subsystem. This kit is hands down my favorite for Haswell and Devil’s Canyon: high speed, high capacity, low latency, peak performance.

Memory Cooling: Corsair Dominator Airflow Platinum
While the benefits of active cooling over high-speed memory can certainly be debated, the Dominator Airflow Platinum serves double duty, both as cooling and as a classy bit of bling that can be added to the build. Rather than being limited to the two light bar kit colors, the Dominator Airflow Platinum has two RGB LED fans that can be controlled and configured via Corsair Link.

Graphics Cards: Dual NVIDIA GeForce GTX 980 4GB GDDR5
Essentially the fastest single-GPU card on the planet, the NVIDIA GeForce GTX 980 also holds the distinction of being one of the most overclockable as well. We’ve seen the GTX 980 exceed a boost clock of 1.5GHz on stock air cooling with only a minor poke to voltage; with two of these under water and modified vBIOSes to remove the TDP cap, we may be able to push these cards to new heights of performance.
Storage: 4x Corsair Neutron Series GTX 480GB SSD in RAID 0
Previous testing has indicated that four Neutron GTX SSDs are enough to saturate Z97’s SATA bus, offering peak throughput of a staggering 1.6GB/sec. While striped RAID has its own drawbacks (if one drive fails, all of the data is lost), judicious backups and good computing habits can leave you free to enjoy a tremendous amount of solid state capacity and performance.

Power Supply: Corsair HXi Series HX1000i 1000W 80 Plus Platinum
This selection could’ve gone either way, between the HX1000i and the AX1200i, but in the end I opted for the slightly shorter, slightly less featured, but still exceptional new HX1000i. The HX1000i gives us an extra 20mm to avoid clearance difficulties with the bottom-mounted radiator while still offering Corsair Link monitoring and control. Better yet, the blue logo matches the blue theme of the rest of the build (as you’ll see later).

Corsair Link: Commander Mini Unit
The Corsair Link Commander Mini is borderline purpose-built for liquid cooling. The multitude of fans we’re planning on using for this build may necessitate a second unit, but the Commander Mini itself can control a substantial number of fans on its own through the use of Y-cables, and we can use it to control the LED fans on the Dominator Airflow Platinum. Finally, the HX1000i can be connected directly to the Commander Mini instead of burning a USB port on the motherboard on its own.

Fans: 1x Air Series SP140 LED Blue Static Pressure Fan, 14x Air Series SP120 LED Blue Static Pressure Fans
The goal is to achieve push-pull on all three radiators; research suggests it should be possible, but overall radiator clearances may prevent it. Nonetheless, our blue SP LED fans are among our most efficient fans available, and incorporating push-pull on the radiators substantially reduces the speed we have to run them at.
CPU Waterblock: EK Supremacy EVO Blue Edition
Sticking with our blue theme, we’ve selected arguably the most efficient CPU waterblock currently available. Internal testing has proven heat transfer isn’t the same issue on Devil’s Canyon that it was on conventional Haswell, opening up the possibility of using a high-performance waterblock to extract the maximum amount of performance the silicon offers.

GPU Waterblock: XSPC Razor GTX 980
Chosen for its illumination support, XSPC’s full-cover waterblock for the GeForce GTX 980 has a clean aesthetic that meshes beautifully with the Obsidian 750D. It’s thin, attractive, and cools all of the surface components of the GTX 980, ensuring long life and quiet operation. Note that we opted not to purchase the backplate that XSPC offers; the GTX 980 stock cooler already includes an excellent backplate of its own, mitigating the need for an aftermarket one.

Pump and Reservoir: XSPC D5 Photon 170
Like so many of XSPC’s kits, the Photon 170 reservoir includes lighting, keeping it in theme with the rest of the build. And the integrated mounting backplate and D5 Vario pump make it easy to get exactly the placement and performance we need to drive our loop.

Radiators: Swiftech Quiet Power 360mm and 2x Quiet Power 240mm
Radiator selection is a matter of preference; I’ve traditionally been pretty happy with Swiftech’s radiators. Note that these are standard-thickness (25-30mm) radiators. Given the choice between an extra-thick 280mm front radiator or two standard 240mm radiators, I opted for the increased airflow that spreading out the surface area provides. Again, it comes down to preference, but a cumulative 840mm x 25mm of radiator capacity should be more than adequate for getting the job done.

Stay tuned for part two, when we begin assembly of the Yamamura…
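The RAID 0 saturation behavior behind the storage pick above can be sketched as a simple min() model: striped throughput scales with drive count until the controller’s shared uplink caps it. The 1.6GB/s ceiling is the article’s figure; the per-drive sequential read speed is an assumed typical SATA 6Gb/s SSD number:

```python
# Striped (RAID 0) sequential throughput: drives add up until the
# platform's shared bus becomes the bottleneck.

def raid0_throughput(drives, per_drive_gbs, bus_limit_gbs):
    return min(drives * per_drive_gbs, bus_limit_gbs)

# Assuming ~0.55 GB/s per SSD against the observed ~1.6 GB/s ceiling:
for n in range(1, 5):
    print(n, raid0_throughput(n, 0.55, 1.6))
# scaling flattens at the bus limit by the third drive, which is why
# a fourth Neutron GTX "saturates" rather than adds speed
```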
  7. Just a couple of days ago, I talked about the drawbacks of having a beastly dual-GPU system featuring a custom liquid cooling loop, as well as my solution to the problem in the form of my new Carbide Series Air 240 build that I dubbed “Blues.” I believe largely in balance, not overkill, though there is something to be said for the joy of assembling a massively powerful machine by hand. Knowing that my performance target wasn’t 4K but 1080p (and occasionally 3x1080p) suggested that my existing system wasn’t worth the 600W of power it consumed under gaming load, not to mention the corresponding 600W of heat it has to dissipate into a room that enjoys Californian Indian summers. Using some of our newest hardware, I opted to build a machine that would run as quietly as (if not more quietly than) my existing system while retaining the required amount of performance – but with superior performance per watt. These are the specifications of the two systems, compared. My old system was named “Ted,” and it’s been with me for a while in an almost comical number of permutations.

Ted:
  • CPU: Intel Core i7-4790K @ 4.7 GHz, 1.31V
  • Memory: Corsair Dominator Platinum 32GB DDR3-2400 10-12-12-32 1.65V
  • Motherboard: ASUS Maximus VI Formula (Z87)
  • Graphics: 2x EVGA GeForce GTX 780 3GB (980 MHz Core, 6 GHz GDDR5)
  • CPU Cooling: Custom Loop
  • GPU Cooling: Custom Loop
  • PSU: Corsair AX860i
  • Storage: 4x Corsair Neutron GTX 480GB in RAID 0
  • Enclosure: Corsair Carbide Air 540

Blues:
  • CPU: Intel Core i7-4790K @ 4 GHz, 0.975V
  • Memory: Corsair Dominator Platinum 16GB DDR3-2400 10-12-12-32 1.65V
  • Motherboard: ASUS Z97I-PLUS (Z97)
  • Graphics: AMD Radeon R9 290X 4GB (1 GHz Core, 5 GHz GDDR5)
  • CPU Cooling: Corsair Hydro Series H75 w/ SP120 LED Fan
  • GPU Cooling: Corsair Hydro Series HG10-A1 + Hydro Series H105 w/ 2x SP120 LED Fans
  • PSU: Corsair HX750i
  • Storage: 3x Corsair Force LX 512GB in RAID 0
  • Enclosure: Corsair Carbide Air 240

You can see I didn’t make a lot of brutally unkind cuts.
I maintain that 2400MHz is the sweet spot for memory on Haswell and Devil’s Canyon, so that was worth the modest increase in power consumption. The AMD Radeon R9 290X is by no means frugal with power, but it is an incredibly fast card; had the NVIDIA GeForce GTX 980 been available when this build was assembled, that would’ve been the more sensible choice. While Blues is obviously inferior in performance to Ted, nobody would really be “slumming it” by making the transition. So what do we save in power, and what do we sacrifice in performance? Note that these games were all tested at or near their highest settings; Metro: Last Light Redux was maxed out with SSAA but with Advanced PhysX disabled, while Tomb Raider was only run with 2xSSAA and TressFX enabled. What we see is that in our synthetic video encoding benchmark, for our ~15% reduction in CPU clock speed we lose ~13% of the performance. That’s not too bad. Games run the gamut; BioShock Infinite’s minimum frame rate doesn’t change drastically, and the average stays well above 60 fps. Tomb Raider’s minimum does drop below 60 fps, but the average is above, and the single R9 290X doesn’t suffer from the rendering artifacts with TressFX that the SLI’ed 780s do. Metro: Last Light Redux is the most unpleasant hit, but still stays well above 30 fps. Finally, F1 2013 doesn’t seem to have SLI functioning correctly, but it’s irrelevant: either system maintains over 60 fps. We can use the Corsair Link connectivity of our AX860i and HX750i power supplies to see how much power each of these systems is drawing, and that’s where the difference really lies. While Blues peaks at about 365W under its most taxing load, Ted is gunning all the way up to nearly 600W. Particularly alarming is the near doubling of power consumption under the x264 benchmark for an extremely modest increase in performance. 
This is the truth of overclocking: at a certain point, substantial amounts of power become necessary to hit higher and higher speed bins. Almost entirely across the board, though, Ted is drawing substantially more power than Blues does, and arguably a lot of that extra power is wasted. Mapping performance per watt puts it all into a different perspective. Since all of our benchmarks are measured in frames per second, we can divide those results by the peak power drawn during the benchmark to come up with a rough idea of how efficiently each system is running. This isn’t the grand slam that the absolute power consumption is; performance per watt stays mostly level in every game but the odd duck F1 2013. CPU efficiency is vastly improved, though. The measure for success here is overall power consumption while maintaining acceptable performance levels, and on that front, Blues is a victory. I’ll be mothballing Ted for a while and spending more time with Blues to see if the reduced performance is really worth writing about, but for now, this has been a fun exercise in seeing how we can make our systems more efficient. We have overclocking competitions and records, but I’d love to see users trying to hit performance targets while reducing power consumption as much as possible.
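The performance-per-watt arithmetic described above is simple division; here is a minimal sketch. The frame rates below are hypothetical placeholders, while the ~600W and ~365W peaks come from the measurements discussed in the post:

```python
def perf_per_watt(avg_fps: float, peak_watts: float) -> float:
    """Divide a benchmark result by the peak power drawn while running it."""
    return avg_fps / peak_watts

# Hypothetical frame rates for one game; only the wattages are from the post.
ted_eff = perf_per_watt(75.0, 600.0)    # Ted: dual GTX 780s, ~600W peak
blues_eff = perf_per_watt(70.0, 365.0)  # Blues: single R9 290X, ~365W peak

print(f"Ted: {ted_eff:.3f} fps/W, Blues: {blues_eff:.3f} fps/W")
```

Even with a modest frame-rate deficit, Blues comes out well ahead on frames per watt, which is the whole point of the exercise.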
  8. If you were at PAX, then you already know we had a couple of incredibly beefy gaming systems with tri-monitor surround configurations set up there. Of course, if you weren’t, then the systems we had built up for head-to-head gaming might surprise you a little…especially since we couldn’t even announce what was running in them until August 29th. But that time has passed, and now we can show you our PAX Graphite 780T red and blue configurations. We knew in advance that Intel would be using PAX Prime as their opportunity to launch their new high end desktop platform, complete with Haswell-E processors, X99 chipset, and DDR4 support. It would have been frankly embarrassing if we showed up with anything less. That’s why we got these two bad boys ready to go. These two systems were almost identically configured with the components listed below:

Processor: Intel® Core i7-5960X
Motherboard: Asus® X99-DELUXE
GPU: 2x EVGA® GeForce GTX 780 ACX Superclocked
Case: Corsair Graphite Series 780T White
PSU: Corsair HX1000i Power Supply (Blue); Corsair AX1500i Power Supply (Red)
Memory: Corsair Vengeance LPX Black DDR4 2800MHz (4x4GB)
Storage: Corsair Neutron GTX 240GB
Cooling: Corsair Hydro Series H105

Additionally, we used red and blue sleeved cables along with red and blue SP120 and SP140 LED fans to contrast the systems against each other. You can see glamour shots of the two systems below. With eight fast cores, sixteen gigabytes of new DDR4 memory, and dual GeForce GTX 780s in SLI in each system, let’s just say we didn’t have much trouble running our games at the required 5760x1080 resolution that the trio of monitors plugged into each system called for. Here’s the blue/white system in action at PAX Prime, barely breaking a sweat.
  9. While the M.2 interface has plenty to recommend it, specifically PCIe connectivity in a smaller form factor, SATA Express unfortunately seems to be a lot more underwhelming. The underpinnings are the same (typically two lanes of PCIe) but you sacrifice two SATA ports on your motherboard to use it. You get a theoretical 10Gb/sec of bandwidth for your drive, which is definitely an improvement on SATA’s 6Gb/sec…but if you put two drives in a striped RAID on those same two SATA ports, your theoretical bandwidth jumps up to 12Gb/sec. And you can still scale. For a thought experiment, for funsies, we grabbed six of our Neutron GTX 480GB SSDs and tested them in striped RAID from one drive all the way up to all six. We wanted to see where scaling would level off, and just how much performance you could get from going this route. Note that most M.2/SATA Express implementations will be limited to an effective ~850MB/sec after overhead is taken into account. In terms of pure theoretical performance, even two conventional SATA SSDs in RAID 0 should be able to meet or beat that bar. The testbed employed an Intel Core i7-4770K @ 4GHz, 8GB of DDR3-2133, and Intel’s Z87 chipset. The system drive was connected to a separate ASMedia SATA controller, allowing all six of Intel’s SATA 6Gbps ports to be used in the array. The benchmark AS SSD shows fairly steady gains in read and write performance and seems to plateau at about four drives. You’ll see this as a trend moving forward; two drives can easily match what SATA Express can do, and then moving past that is just gravy. Interestingly, write performance temporarily surpasses reads at three or four drives, but this is isolated to AS SSD. ATTO tends to produce the absolute best case scenarios. Our testing shows scaling essentially petering out at the fourth drive, just like AS SSD does. 
The numbers are impressive, though; just two drives can offer a staggering gigabyte-per-second of read speed, and it only gets faster when you add more. Crystal Disk Mark again plateaus at four drives. Two drives again meet or beat SATA Express, while three drives start to reach beyond it and four drives are mind-bogglingly fast. Finally, IOMeter’s 4K random read and write benchmarks (we’ve been largely sequential up until this point) are brutal, but enlightening. Interestingly, 4K performance flattens out on the third drive instead of the fourth. The number of operations these drives can perform running in parallel is pretty impressive, though. Users aching for faster storage than even a single conventional SSD can provide (and that’s a pretty fast experience in and of itself) will be heartened to see that Intel’s chipset can scale reasonably well up to four drives and even eke a little more out of the last two.
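The theoretical-bandwidth comparison running through this post comes down to encoding math. A quick sketch, under the usual assumptions (SATA and PCIe 2.0 both use 8b/10b line coding, and "SATA Express" here means two PCIe 2.0 lanes):

```python
def line_rate_to_mb_s(line_gbps: float) -> float:
    """Convert a raw line rate to usable MB/s, assuming 8b/10b encoding (80% efficiency)."""
    return line_gbps * 1e9 * (8 / 10) / 8 / 1e6

sata_port = line_rate_to_mb_s(6.0)         # one SATA 6Gb/s port: 600 MB/s
raid0_two = 2 * sata_port                  # two ports striped: 1200 MB/s theoretical
sata_express = 2 * line_rate_to_mb_s(5.0)  # 2x PCIe 2.0 lanes: 1000 MB/s before protocol overhead

print(sata_port, raid0_two, sata_express)
```

After protocol overhead, that ~1000MB/s PCIe figure lands near the ~850MB/s effective number cited above, which is why two striped SATA drives can already meet or beat it.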
  10. There’s a curious split in the market: people are going bigger or smaller. Far be it from me to suggest people shouldn’t buy full ATX cases or larger; these cases provide ample space for multi-GPU configurations and the increased case volume can improve cooling. Likewise, half the fun of a mini-ITX build is seeing how much power you can cram into an enclosed space, something we experimented with when we did our “God Lives Underwater” build in the Obsidian Series 250D. The funny thing is that Micro-ATX is, at least in my opinion, really the sweet spot form factor. Four expansion slots (enough for two graphics cards), four DIMM slots, and typically just as fully featured as the bigger ATX boards. Our Obsidian Series 350D may be a bit bigger than most Micro-ATX enclosures, but it’s as fully-featured as they come, even including a fifth expansion slot specifically for multi-GPU configurations. And as it turns out, you can fit a ridiculous amount of power in it. To prove it, and to give our shiny new AX1500i 1500-watt power supply a good workout, we built arguably a pretty ridiculous system in the 350D. What may surprise you more than anything is that it works and works very well for a machine that has to dissipate up to 1.5kW of heat from a fairly small enclosure. As a last hurrah to the X79 platform before Haswell-E and X99 descend upon us later this year, we used the ASUS Rampage IV Gene as the basis of our system and then plugged in an Intel Core i7-3930K. The i7-4930K is readily available, but doesn’t enjoy the kind of overclocking headroom (or heat dissipation) of its predecessor. On top of that was 32GB (4x8GB) of DDR3-1866 Vengeance Pro memory, which, when combined with Sandy Bridge-E’s quad-channel memory controller, offers a staggering 50GB/sec of memory bandwidth to keep our hexacore processor fed. Storage was handled by a 480GB Corsair Neutron GTX SSD, gingerly mounted behind the 5.25” drive cage. Why, you ask? 
Because we had to remove the 3.5” and 2.5” drive cages to make room for our Hydro Series H105 CPU cooler, installed in the front of the case in a push-pull configuration with Quiet Edition SP120 fans. The 350D only has enough clearance at the top for an H100i or H110; the H105’s radiator is just a little too thick, to say nothing of trying to install push-pull fans as well. Of course, it’s also because we had to make room for two AMD Radeon R9 295X2 graphics cards. These monsters are rated for 500W apiece, but they can easily exceed that. That’s why it’s important to use a dedicated 8-pin PCIe cable for each power connector on the R9 295X2 instead of the traditional daisy-chained ones. Those cables are fine for any other graphics card, but AMD breaks the PCIe connector spec in a very big way with the R9 295X2, so dedicated cables are needed lest you melt the connectors. The radiators for the Radeons are mounted to the top and rear of the case; the H105 serves as an intake in the front, cooling the CPU before air flows through the two R9 295X2 radiators. This creates a pretty clear and directed air flow path, and the ambient internal temperature of the case itself becomes less relevant. Under load we’re looking at a rated TDP of about 1.2kW, but it’s very easy to exceed that. The CPU is overclocked to 4.4GHz with a commensurate bump in voltage to ~1.35V; 4.5GHz simply wasn’t stable at any voltage. Actual load figures were pushing very close to the AX1500i’s rated capacity, but the system ran surprisingly quietly all things considered, and thermals were perfectly fine. We had to run FurMark for an extended period of time to get the Radeons to start throttling. Seeing results like these makes it hard to fathom a system that the Obsidian Series 350D wouldn’t be able to handle with aplomb. If you want a mid-tower-sized system with full tower performance, this certainly seems to be the way to go.
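The quad-channel bandwidth figure quoted above can be sanity-checked with simple arithmetic: DDR3-1866 performs 1866 megatransfers per second over a 64-bit (8-byte) channel, so four channels give a theoretical peak of roughly 59.7GB/s; the ~50GB/s in the post is presumably a rounded or sustained real-world figure. A quick sketch:

```python
def ddr_peak_gb_s(mt_per_s: int, channels: int, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth: transfers/sec x bytes per transfer x channels."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

quad_1866 = ddr_peak_gb_s(1866, channels=4)  # DDR3-1866, quad channel
print(round(quad_1866, 1))  # ~59.7 GB/s theoretical peak
```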
  11. A little less than a month ago I ran into a troubling issue with my system at home. Drives on the SATA ports were starting to blink in and out while the machine was running. We checked the SSDs (all Neutrons and Neutron GTXes, naturally) internally and found no problems, and that pretty much left me with the motherboard. Ordinarily replacing a motherboard is a nuisance but not a huge issue, but when your system looks like this: …it’s borderline catastrophic. Since then I’ve been able to rotate the drives to different SATA ports and so far things have been okay, but it seemed prudent nonetheless to build a system that I could swap in if things got too difficult with my primary. Luckily we had a couple of Graphite Series 760T prototypes in house, and I opted to grab one and put together a pretty handsome beast. It’s easy for the white version of the 760T to overshadow the black one, and the white one is the one we’ve been showing off, but I actually have a soft spot for black. I decided to take the opportunity to style a black 760T and make it my own. Here are the parts I used for this build:

Intel Core i7-4770K
Corsair Hydro Series H110 CPU Cooler
ASUS Maximus VI Hero Z87 Motherboard
4x8GB Corsair Dominator Platinum DDR3-2400 CAS 10 with Light Bar Kit
AMD Radeon HD 7990
XFX Radeon HD 7970
Corsair Neutron GTX 240GB SSD
Corsair AX1200i 1200W 80 Plus Platinum Power Supply
Corsair Link Lighting Node
Two blue AF120 LED fans and two blue AF140 LED fans

Assembly was fairly easy and I made effective use of the resources on hand as a Corsair employee and member of the tech marketing team. For starters, the Intel Core i7-4770K is an engineering sample; nigh identical to retail chips, but in my experience, engineering sample Haswell CPUs tend to overclock a bit better. This one did the same clocks my home chip does: 4.5GHz at ~1.22V. That’s pretty good for an i7-4770K, especially one that has 32GB of DDR3-2400 CAS 10 strapped to it. 
I was also able to scrounge up a second Y-cable and two more of the stock fans for the H110 and assembled it in a push-pull configuration to maximize performance while reducing noise. The ASUS Maximus VI Hero motherboard was a good chance for me to play around with one of ASUS’s ROG boards and BIOSes. I’m used to overclocking on Gigabyte hardware, but one of the benefits of working here is being able to learn about everything else that’s available. I do my research and pass it on to you. The Maximus VI Hero is a fine board and very easy to use, and I may make the jump back to ASUS when I do finally rebuild my custom loop in my home system (probably circa Devil’s Canyon or Haswell-E). On graphics duty, I was compelled to investigate the issues Tom’s Hardware reported with the AMD Radeon HD 7990, as well as to try to get a little triple CrossFire action going. Tom’s Hardware was right, though: in a multi-card configuration, the 7990 just plain can’t be the top card. For whatever reason, the fan design on the card causes the first GPU to suffocate and overheat; swapping the 7990 into the bottom slot and putting the 7970 into the top slot allowed TriFire to function without thermal issues. That said, it’s still a bit noisy: the 760T has fine air cooling performance, but we’re talking about ~625W of graphics hardware under open air coolers. As for performance, it’s definitely there, but microstutter is also plainly evident in Unigine Heaven. Historically, going past two GPUs has exponentially increased potential driver issues, and that’s pretty apparent with this build. If I were going to replace my home system with this build, I’d be seriously considering simply removing the 7970 and sticking with just the 7990. Visiting forums suggests other users have fared better with triple Radeons, but I remain skeptical. 
In order to give my build some flair, I swapped out the red LED fans that came pre-installed in the black 760T for blue ones, and added an additional 120mm blue intake fan on the bottom of the case. I used our blue braided cable kit for the AX1200i power supply as well, and swapped in blue light bars for the Dominator Platinum memory. Finally, I added two Corsair Link LED strips to the top and bottom of the case to help illuminate it. Despite blue being an extremely common LED color, blue system builds actually seem to be rare, and I’m very happy with the black/blue/red color scheme of this 760T. We know the 760T is taking a little extra time to get to you, but we want to make sure it’s just right. The prototype I used for this build can have issues with flexing on the side panels that frankly we just weren’t happy with in a final product, and we think you’ll be a lot happier with the 760T when it does ship later this month.
  12. A major aspect of the future of computing is heterogeneous processing. CPU and GPU hardware are radically different, and they’re capable of running radically different tasks. In the past few years, GPUs have also become increasingly programmable, to the point where we’re seeing tasks that might benefit from their massively parallel architecture. Performing work on a GPU that isn’t strictly gaming related is becoming more common. One of the companies that has been most aggressively taking advantage of this shift in the way we do work on our computers is Adobe. The Mercury Playback Engine, introduced in Premiere Pro CS5, used NVIDIA’s proprietary CUDA libraries to offload tasks onto the GPU during video editing. For older formats like HDV (a wrapper for MPEG2), this was good for accelerating certain filters. But when applied to newer, more computationally intensive compression like H.264 (used in AVCHD and many/most smartphones), the Mercury Playback Engine made fluid real time editing possible. Now with Premiere CC, the Mercury Playback Engine also functions on OpenCL. OpenCL isn’t proprietary to NVIDIA, but is instead dependent upon GPU vendors to implement. MPE will still default to CUDA on NVIDIA graphics hardware, but now people with AMD and Intel graphics can get a potential performance boost by switching on OpenCL hardware acceleration. This isn’t something Corsair benefits from in a strict and transparent way; our fast memory and storage can help alleviate bottlenecks so that your CPU and graphics hardware can take advantage of the Mercury Playback Engine, but once those bottlenecks are lifted, it’s really up to the CPU and GPU to get the job done. This was more research for my own edification, and I’m happy to share it with you. AMD’s strongest asset is also the subject of their biggest push: their GPU architecture. However much AMD might lag behind Intel on the x86 side, their GPU architecture is substantially more powerful and more advanced. 
With tremendous computational power strapped to the four Steamroller cores in the A10-7850K, it stands to reason this should be AMD’s chance to shine and beat back the Intel Core i7-4770K with its substantially less exciting HD 4600 graphics. Additional testing was done with a dedicated NVIDIA GeForce GTX 760 (to test CUDA) and a dedicated AMD Radeon HD 7970 GHz Edition (to test OpenCL.) Both systems were tested using a Force LS SSD as a source drive and a Neutron SSD as a render target, and both were running 32GB of Dominator Platinum DDR3 at 2400MHz CAS 10. Our H.264/AVCHD sample just doesn’t seem to tax GPU compute as hard as I had expected. While AMD’s A10-7850K gets an 11.6% speedup against Intel’s less exciting 7.9%, it’s still so hamstrung by x86 performance that it can’t bridge the gap. Switching the i7-4770K from its IGP to a GTX 760 or HD 7970 demonstrates no notable change in performance, either. Here’s where things get weird. Decoding and encoding HDV (MPEG2), the integrated graphics built into both the i7-4770K and the A10-7850K actually measurably increase the rendering time. Yet if you swap in a reasonably powerful dedicated graphics card, the i7-4770K gets an absolutely massive performance boost. Render times get cut down to less than half of what they were when you relied solely on the CPU. What have we ultimately proven? Under some circumstances, even having the integrated graphics handle some of the workload can offer tangible performance benefit in Adobe Creative Cloud. The all around boost we wanted to see from AMD and Kaveri just doesn’t quite seem to be here yet, though, to say nothing of the lukewarm OpenCL showing by Intel. Intel has been focusing on improving their graphics performance each generation, but they seem to only be interested in the gaming side of the equation. We’re slowly getting to the point where the GPU becomes not just an important citizen but an essential one in our computing ecosystem beyond just gaming. 
That’s the roadmap AMD has laid out, the roadmap NVIDIA is pursuing aggressively (for obvious reasons), and the one Intel seems to be dragging their feet on. With that said, we aren’t quite there yet. If you’re doing any kind of video work, though, the conclusion is abundantly clear: a good GPU is more important than ever.
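The speedup percentages cited in this comparison are just relative render-time deltas. A minimal sketch of that calculation (the render times below are hypothetical, chosen only to reproduce the 11.6% figure quoted for the A10-7850K's H.264 test):

```python
def speedup_pct(cpu_only_s: float, accelerated_s: float) -> float:
    """Percent reduction in render time from enabling GPU acceleration."""
    return (cpu_only_s - accelerated_s) / cpu_only_s * 100

# Hypothetical render times in seconds, for illustration only.
gain = speedup_pct(100.0, 88.4)
print(round(gain, 1))  # 11.6
```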
  13. Ever since I saw the Obsidian Series 250D in development, I knew I was going to have to try to do something fairly crazy with it. That’s kind of the nature of these small enclosures: they practically dare you to see just how far you can push them. George, Aaron, and Guillermo (among others) saw fit to cram a 240mm radiator mount in alongside the 120/140/200mm mount in the front of the case. I see that, and it feels like an open challenge. Challenge accepted. I’m going to attempt to put together the most powerful system I conceivably can inside the 250D with a custom watercooling loop. This is part one of a series of articles that will detail the conception, assembly, and optimization of what will hopefully be a pretty proud showpiece for us in the months to come. Or, alternatively, something I may attempt to steal and take home. Today we’re looking at the components chosen for this build. CPU: Intel Core i7-4770K The Intel Core i7-4770K is pretty much the single fastest CPU you can install in a mini-ITX system. While I continue to be underwhelmed by Haswell as a whole, having not really moved the needle on the enthusiast performance meter, it is nonetheless the best choice for this build. I don’t have amazing hopes for this chip underwater, but honestly I don’t believe in doing closed loops for CPUs alone anymore. We pop the i7-4770K into the loop because we can since we’re also watercooling… GPU: AMD Radeon R9 290X …AMD’s most monstrous GPU ever. The Radeon R9 290X’s relationship with heat has been well documented at this point, and I continue to believe that if ever a chip wanted to live under the sea, this one does. I’m hoping to get a beefy overclock out of the R9 290X, but honestly I’ll be happy just to peg it at 1GHz for the entire time it’s gaming. Choosing this card was a matter of some deliberation. 
Technically the fastest graphics card we could put in this build would be a GeForce GTX 690 or Radeon HD 7990, but the former is getting a little long in the tooth while the latter isn’t readily available anymore. The GeForce GTX 780 Ti is a strong alternative, but I believe that AMD’s new hotness (literally and figuratively) will more than come into its own given the opportunity to stretch its legs. Motherboard: ASRock Z87E-ITX This is probably going to be the most contentious hardware selection in this build. As far as high performance mini-ITX boards go, ASUS pretty much rules the roost with their Maximus VI Impact. There’s just one problem: while the 250D was designed with ASUS’s PWM riser in mind, I don’t like the idea of it either obstructing airflow or having hot air slam into the back of it. As a result, I went to Ian over at AnandTech, discussed the issue with him, and asked him for a suggestion for an alternative mini-ITX motherboard. The ASRock Z87E-ITX is the one he came back with. Memory: 8GB (2x4GB) Corsair Vengeance Pro DDR3-3000 CAS 12 Since we’re going all out, we want the best memory we can conceivably fit into the build. This is about as good as it gets: two 4GB DIMMs running at a blistering DDR3-3000 at CAS 12, our absolute fastest RAM available. That means 8GB of capacity at a blazing fast speed. Storage: 2x 120GB Corsair Neutron GTX SSD in RAID 0 I’m the first person to admit doing a striped RAID with solid state drives is…kind of ridiculous. SSDs are already fast enough on their own (especially the Neutron GTX); striping data between two of them really is overkill. We’re not trying to build something sensible here, though, we’re trying to build something ridiculously fast. 
Power Supply: Corsair AX860i 860W 80 Plus Platinum If we look at the most power this system could hope to consume, at least at stock, you’re talking about an 84W TDP for the CPU, 300W for the graphics card, and then maybe another 30W for the memory, storage, and motherboard after all is said and done. That puts us at a little over 400W, nowhere near the capacity of our smallest 80 Plus Platinum PSU, the AX760i. So why go for the AX860i? The AX860i fits in the exact same dimensions as the AX760i, we’re more likely to hit better points on its efficiency curve, and the fan is going to be less likely to kick in on the higher specced PSU. Since I’m literally at Corsair headquarters building this thing, there’s really no reason not to go with the AX860i. Radiators: AlphaCool NexXxoS ST30 30mmx240mm and UT60 60mmx140mm I did about as much research as I could, and in the end I needed a slim 240mm radiator for the side mount. Every last millimeter counted, and the AlphaCool ST30 turned out to be the best choice for the job. On the 250D, we only have 55mm of clearance to work with, so the 30mm of the ST30 is basically the limit (our fans are 25mm thick.) As for the front, though, thickness was less of an issue and an opportunity to score some extra cooling capacity. The UT60 might wind up being a hair too thick, but I suspect it’ll be fine, and in exchange we get additional surface area. Undoubtedly there are going to be opinions as to whether or not we have enough radiator capacity for the CPU and the GPU. We do. An Arctic Cooling Accelero Hybrid is able to cool an overclocked GTX 680 with minimal fan noise, and that’s using a garden variety 28mm x 120mm aluminum radiator. So we’re talking about potentially 200+ watts being handled in 120mm; here, we have 30mm x 240mm and 60mm x 140mm to work with. Pump and Reservoir: Koolance RP-1250 Koolance’s RP-1250 made the grade through a combination of features and, frankly, the fact that it fits in a single 5.25” drive bay. 
It has in-built fan control and temperature monitoring along with a powerful pump, and while the whole unit is a little bit deep, we shouldn’t have too many problems getting it installed. Waterblocks: EK-Supremacy (LGA 1150) and EK-FC R9-290X For waterblocks we went to EKWB. While the CPU block was a breeze to get, blocks for the R9 290X are surprisingly difficult to come by in the aftermarket, probably owing to the 290X’s natural proclivity as marine life. I’m not going to lie, the thought of putting this thing together is a little intimidating. Quarters in the Obsidian 250D will be a little bit cramped, but the loop is going to be fairly simple and run serially: the RP-1250 reservoir/pump, to the 240mm radiator, to the CPU block, to the GPU block, to the 140mm radiator, and back to the RP-1250. It’s not super optimized, but should get the job done more than adequately. As I mentioned, this is the first part of a series of articles as we document the assembly and optimization of this rig. In the next one we’re going to start really getting into the assembly process; in the meantime, throw us a comment here or on Facebook disputing our component choices. ;)
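The PSU sizing logic earlier in this post is simple back-of-the-envelope arithmetic. A sketch, using the TDPs from the component list (the idea that mid-curve loads are most efficient is a common rule of thumb for 80 Plus units, not a measured figure):

```python
cpu_tdp, gpu_tdp, everything_else = 84, 300, 30  # watts, from the build's spec list
estimated_load = cpu_tdp + gpu_tdp + everything_else  # ~414W at stock

for psu_watts in (760, 860):  # AX760i vs AX860i
    frac = estimated_load / psu_watts
    print(f"AX{psu_watts}i: {frac:.0%} of capacity")
# The larger unit runs at a lower fraction of its capacity, so its
# fan is less likely to spin up and the load sits nearer the middle
# of the efficiency curve.
```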
  14. One of the sideways benefits of Intel’s lack of progress on the performance front over the past couple of generations is that our “old” gaming notebooks based on Sandy Bridge chips are still very relevant. These are babies that we don’t have to throw out with the bathwater just yet, and if you’ve got a little spare bread after this holiday, you can spruce up your old gaming notebook and eke a bit of extra performance out of it.

Gaming laptops like the Alienware M17x R3, ASUS G series, and just about any Clevo or MSI are often a bit more upgradeable than usual. You’ll have to consult forums to see what you can do about the CPU or GPU (the M17x R3 can go to at least a GTX 680M like I have with modified drivers), but much, much easier is just upgrading the memory and storage. Most of these systems shipped with 8GB or less of RAM, and SSDs weren’t fully in fashion just yet, so we have plenty of extra mileage we can pull. With just about any of these systems, it’s easy enough to just pop off the bottom panel and get started. The Alienware M17x R3 has two retaining screws hidden under the battery, but just about any gaming system’s underside should be pretty easy to get to, and instructions should be readily available online if you’re having any trouble. Where you might get hung up is if your notebook has four memory slots instead of two, because two of those slots will invariably be under the keyboard and thus much harder to access. You can use a program like CPU-Z to see if you have two or four DIMM slots populated, and from there you can check under the bottom of your notebook to see if it’s the two easily accessible ones or the two under the keyboard that may make you rip your hair out. As for replacing the memory, most of the Sandy Bridge era gaming notebooks shipped with DDR3-1333, and we can do better on every front. 
Two 8GB Vengeance DDR3-2133 SO-DIMMs are enough to keep any modern game and your CPU fed without breaking Windows 7 Home Premium 64-bit’s 16GB memory limit. The performance gains are modest, but we’re looking to maximize here. At this point it’s worth mentioning that while we benefit from XMP on the desktop, there’s nothing like that on notebooks, which forces memory companies like Corsair to employ some tricks to hit speeds higher than DDR3-1600. Sandy Bridge’s memory controller is also going to be more finicky than Ivy’s or Haswell’s. What this all adds up to is that the memory speed the SO-DIMMs are specced to is more of a maximum than a guarantee, and this can’t really be helped; reviews of competing high speed SO-DIMMs demonstrate the same issues. I replaced the four pre-installed 2GB DDR3-1333 SO-DIMMs with two of our 8GB Vengeance DDR3-2133 CAS11 SO-DIMMs, and while the system would POST at DDR3-2133, it was unstable at that speed. I knocked it down to DDR3-1866, though, and it was perfectly fine, leaving me with 16GB of very fast system memory. Next is replacing the storage. A good 240GB/256GB-class SSD is enough for your operating system and your most important games, but I went whole hog and snagged a 480GB Neutron GTX anyhow. This is one of those upgrades that offers a pretty immediate and tangible benefit; the system boots faster, and everything loads much quicker. I’ve found gaming off of an SSD doesn’t improve framerates (no real reason for it to) but substantially improves loading times, a gain much appreciated when you’re dealing with games that tend to suffer from protracted loading times in the first place. If you really want to make your life easier, you can also simply clone your existing Windows and game install over to the new SSD with our cloning kit. It’s not essential, but I’ve been doing a lot of drive migration lately and have come to appreciate being able to just image drives instead of having to reinstall Windows from scratch. 
These two upgrades (particularly the SSD) can help breathe some additional life into your gaming notebook. At some point you’re likely to become GPU limited, but the pace of progress on both the CPU and GPU fronts these days has slowed down somewhat and given us some more longevity out of our core hardware. Bringing the surrounding subsystems up to snuff is a good way to bolster the rest of the system and keep it fresh, and I continue to be more than happy with my aging M17x R3.
  15. Prior to coming to Corsair, I was extremely blessed to write for AnandTech, and toward the end of my tenure there as one of the main editors I threw a Hail Mary and did a piece on building a custom liquid cooling loop. I’d never built a watercooled system before, but having cut my teeth on so many air-cooled builds before and having replaced my share of graphics card coolers, I felt like it was the right time. There was also the fact that I could. I had access to the hardware, I had the connections, so why not go for it?

That’s not the whole story, though. I had options but they had to still be fairly reasonable, and even a build where the sky is theoretically the limit still involves understanding both personal taste and what the system will be used for. I’m not just a gamer; I have my degree in Film, and I’ve made my share of mediocre short horror films. Point being: a system that could just game wasn’t going to cut it. Since building the system, it’s also been continually modified and refined. When I started work here, I made it a point to “eat our own dog food.” That meant swapping in Corsair hardware wherever possible and getting a full end user experience that I could both feed into blog content here and communicate internally to aid in product development. The timeframe of the build was such that I was choosing between Haswell and Ivy Bridge-E. Neither one is a major victory over the previous generation, but it was doubly difficult because the CPU I was coming from was an i7-3770K that managed to hit a very robust 4.6GHz for a 24/7 overclock. In the end I went with a Haswell i7-4770K; Intel’s QuickSync is a feature that doesn’t get enough love, but it’s incredibly useful for converting video content to be streamed online, and I got plenty of mileage out of it on the i7-3770K. Zoë probably would've gone with Ivy Bridge-E. Given that I use the machine for editing video, it makes sense that I would employ 32GB of extremely fast Dominator Platinum RAM. 
The four 8GB sticks run at DDR3-2400, CAS 10, and they've been immeasurably useful. A fast storage subsystem also helps; I'm using a 240GB Neutron GTX as my system and project drive, and a 512GB Neutron for mass storage (read: gaming). Source video for editing is stored on a home server, but I'll probably add another SSD for individual editing projects at some point.

The motherboard is one place where I had my pick of the litter, and in the end I went with the Gigabyte G1.Sniper 5. Built-in barbs for liquid cooling made it ideal for this build, and it's absolutely overflowing with features: PLX multiplexing for multi-GPU, Sound Blaster Core3D audio, and even a swappable op-amp. Graphics duty is handled by two OEM NVIDIA GeForce GTX 780s in SLI, modified with Swiftech Komodo waterblocks, basically making them EVGA Hydro Coppers. It's nice to watercool one video card, but a bit excessive; two is where it's at. The power supply is an AX1200i connected to a Corsair Link Commander unit, and the fan basically never has to spin up. I've actually had this PSU since before it launched, and it's still going strong.

Actual liquid cooling duties are handled by two Swiftech radiators (240mm and 360mm), a Swiftech Apogee HD waterblock on the CPU, and a Swiftech MCP35X pump-and-reservoir combo that hides in the cable nightmare that is the rear chamber of the Carbide Air 540 I'm using to house the entire build. Eight SP120 Quiet Edition PWM fans keep the radiators cool: six in push-pull on the 360mm and two on the 240mm, with both radiators configured as intakes. My inability to cable is the stuff of legend. Finally, I added a Corsair Link lighting kit. I love the way the Air 540 looks; it's one of the only cases I think demands a window, and I needed to thoroughly pimp my ride.

Display duties are handled by a trio of ASUS VS24AH-P monitors. These are 24”, 1920x1200 IPS displays, and they're an absolutely unbeatable bargain for the money.
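Those memory timings are easy to sanity-check: first-word latency in nanoseconds is just the CAS cycle count divided by the memory clock, which is half the effective transfer rate on DDR. A quick illustrative sketch (the DDR3-1600 CL9 comparison kit below is my own assumption, not a part from this build):

```python
# First-word latency for a DDR memory kit:
# latency (ns) = CAS cycles / clock (MHz) * 1000, where clock = MT/s / 2
def cas_latency_ns(transfer_rate_mts: float, cas: int) -> float:
    clock_mhz = transfer_rate_mts / 2   # DDR transfers twice per clock
    return cas / clock_mhz * 1000.0     # CAS cycles * cycle time in ns

# The Dominator Platinum kit described above: DDR3-2400 CL10
print(f"DDR3-2400 CL10: {cas_latency_ns(2400, 10):.2f} ns")  # 8.33 ns
# An assumed mainstream DDR3-1600 CL9 kit for comparison
print(f"DDR3-1600 CL9:  {cas_latency_ns(1600, 9):.2f} ns")   # 11.25 ns
```

So despite the higher CAS number, the faster kit has lower absolute latency on top of its 50% bandwidth advantage.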
I was using Corsair SP2500s for my speakers, but I've actually switched over to the Simple Audio Listens. They're expensive, but they really do sound fantastic. For keyboard and mouse, I'm rocking a Vengeance K90 and a Vengeance M65. I seldom use a headset when I game, preferring the mic built into my Microsoft LifeCam so everyone can listen to my cat meow her idiot head off on voice chat, but I keep a Vengeance 2100 wireless headset handy. Desk clutter is emblematic of a busy mind, or laziness. You decide!

And that's what I game on, edit video on, and try to work and create art on. I'm of the opinion that people who work in the tech industry, be they journalists or enthusiasts working at major companies, should be rolling top-shelf or near-top-shelf kit. It's not just a matter of having a machine you enjoy using, but of being knowledgeable about the hardware you're reviewing and recommending to others. It's important to stay informed and keep up with the Joneses, which is part of why I'm continually trying to put together builds here at Corsair that include kit I don't have immediate experience with. There are other machines in my apartment (two media centers, plus an Alienware M17x R3 that I keep Frankensteining and cut my laptop-repair teeth on), but I'm most proud of my liquid cooled beast. Now if only I could figure out how to drain it.
  16. After some recent “discussion” on our Facebook page about the lack of AMD components in our system builds, I thought I would fulfill the desires of the AMD loyalists and build an AMD system that can beat Intel in the price/performance department. When it comes to building the highest-end system, Intel is still at the top, but not everyone is planning on spending multiple thousands of dollars on their PC. For those who want to build a capable PC without breaking the bank, AMD has some excellent offerings. Below is a list of the components I used for this build, along with their prices.

CPU: AMD Athlon X4 750K (Amazon – Newegg) $79
Motherboard: ASRock FM2A75-M (Amazon – Newegg) $69
Memory: 8GB Vengeance LP 1600C9 (Black) (Corsair) $89
GPU: XFX Radeon HD 7850 (Amazon – Newegg) $149
CPU Cooler: Hydro Series H55 (Corsair) $65
PSU: CX500 (Corsair) $49
SSD: Neutron GTX 120GB (Corsair) $125
Case: Obsidian Series 350D (Corsair) $89

For a total of $714 (without tax or shipping), we were able to build a very capable gaming rig that will play modern games at respectable settings and rival next-gen consoles. Consoles may be cheaper out of the gate, but once you factor in online service subscriptions like Xbox Live or PlayStation Plus, and the fact that console games generally stay at full price much longer than PC games, spending a little more up front on an inexpensive gaming rig can be better in the long run, especially when you consider some of the other advantages of PC gaming.
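That closing cost argument can be made concrete with a quick break-even sketch. Only the $714 parts total comes from the list above; the console-side figures (launch price, subscription fee, yearly savings from cheaper PC games) are illustrative assumptions of mine, not numbers from the post:

```python
# Break-even sketch: the $714 PC build vs. a console plus recurring costs.
# Console-side numbers below are assumptions for illustration only.
parts = {
    "AMD Athlon X4 750K": 79,
    "ASRock FM2A75-M": 69,
    "8GB Vengeance LP 1600C9": 89,
    "XFX Radeon HD 7850": 149,
    "Hydro Series H55": 65,
    "CX500": 49,
    "Neutron GTX 120GB": 125,
    "Obsidian Series 350D": 89,
}
pc_cost = sum(parts.values())
assert pc_cost == 714  # matches the article's stated total

console_price = 399           # assumed console launch price
subscription_per_year = 50    # assumed Xbox Live / PS Plus fee
game_savings_per_year = 100   # assumed yearly savings from cheaper PC games

yearly_gap = subscription_per_year + game_savings_per_year
breakeven_years = (pc_cost - console_price) / yearly_gap
print(f"Break-even after ~{breakeven_years:.1f} years")  # ~2.1 years
```

Under those assumptions the PC's higher up-front cost is recovered in roughly two years; tweak the figures to match your own game-buying habits.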