Corsair Community

Very disappointed with corsair


userfromCN



My computers.


4 video cards and 2 PSUs per machine

 

The motherboard has 3 PCIe x16 slots and 1 PCIe x1 slot. When I took these pics in July, every video card was connected by an extension cable. I removed the x16 extension cables in August.

I'll hang them up again tomorrow; it cools the video cards much better.



WOW! That's a lot of graphics cards. So all six computers have two power supplies each? How are all those PSUs connected? Are they connected to one wall socket or multiple?

The 12 PSUs are connected to 4 wall sockets, and each wall socket connects to a different circuit breaker (16A).
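As a rough sanity check on the wiring above (3 PSUs per 16 A breaker), here is a minimal sketch. The mains voltage (220 V) and per-PSU wall draw (700 W, roughly a 600 W DC load at ~85% efficiency) are assumptions, not figures from the thread:

```python
# Hypothetical check of the breaker loading described above.
# Assumed values, not from the thread:
MAINS_VOLTAGE_V = 220       # typical mains voltage in China
BREAKER_RATING_A = 16       # rating of each circuit breaker
WALL_DRAW_PER_PSU_W = 700   # assumed worst-case wall draw per PSU

psus_per_socket = 12 // 4   # 12 PSUs spread over 4 wall sockets

# Current each breaker must carry at the assumed worst case
current_per_socket = psus_per_socket * WALL_DRAW_PER_PSU_W / MAINS_VOLTAGE_V
print(f"~{current_per_socket:.1f} A per {BREAKER_RATING_A} A breaker")
```

Under these assumptions each breaker sees well under its 16 A rating, which matches the poster's claim of never overloading the circuits.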


It depends, Toasted. If they stay within spec for temps and so on, they should be fine. With anything electronic, you just don't know the lifespan.

 

In the end I don't think there is much Corsair can do about this because of the way he is using them. I could be wrong about that, but when you start using them in configurations that are basically risky to begin with, I would think it would void the warranty.


I'm just trying to think where the short/overload may have occurred.

I used to have 16 HDDs on my Win2k server years ago. I had 2 PSUs for the task: one dedicated to 12 of the drives, and the other running 4 drives, the system, and a few fans.

Neither PSU seemed to interfere with the other. In this case, though, I'm wondering if there was any feedback voltage through the PCIe slot, since one PSU powered the board and the other powered the card. I could be totally off base, but you know how if you hook a fan between a 12V lead and a 5V lead you actually get a 7V source? What if one of the PSUs was a bit lower on the available voltage to the board/PCIe circuit, and the one going through the card was stronger? At some connection point there would be current driven by that voltage difference, no?

If this theory is incomprehensible, I apologize; I'm only on my 1st cup of coffee this morning.
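The arithmetic behind this theory can be spelled out: the "7V fan" trick is just the difference between the 12V and 5V rails, and the two-PSU worry is the same kind of difference between two nominally-12V rails. The specific PSU readings below are made up for illustration:

```python
# The "7 V fan" trick: wiring a fan between the 12 V and 5 V rails
# puts the fan across their difference.
rail_12v, rail_5v = 12.0, 5.0
print(rail_12v - rail_5v)  # 7.0

# The poster's worry in the same terms: if two PSUs' 12 V rails sit at
# slightly different voltages (values below are hypothetical), shared
# ground/signal paths through the PCIe slot see that difference.
psu_a_12v = 12.10  # hypothetical reading, PSU powering the board
psu_b_12v = 11.95  # hypothetical reading, PSU powering the card
print(f"{psu_a_12v - psu_b_12v:.2f} V rail difference")
```

A tenth-of-a-volt mismatch is far smaller than the 7 V fan case, which is consistent with the later reply reporting that cross-measured PSUs showed little difference.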


The GPU temps are not very high, because I set up a fan for each machine.

I never overload the PSUs, and I have an air conditioner in that room.

The PSUs are used just like in a server; do you think a server needs any rest?

I said 24/7 operation is just a dream; none of the six machines can manage it.

 

I read the manual again and again. I didn't do anything against the 9 safety instructions in the manual, so how can Corsair void the warranty?



Great point. When I planned to use two PSUs instead of one stronger PSU, I thought about this too, so I ran some tests on a machine and cross-measured the voltages of the two PSUs. While the machine was running, I used GPU-Z to display the voltage and record the changes. My conclusion is that there is little difference between the PSUs, and they can keep the machine working properly as long as both of them are stable.

I'll read up more on this theory before I sleep.
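For reference, the kind of check described above (watching voltage in GPU-Z while the machine runs) can be post-processed from GPU-Z's sensor log, which is a CSV-style file. This is a minimal sketch; the log path and the column name ("VDDC [V]") are assumptions that should be matched against the actual log header:

```python
import csv

def voltage_range(log_path, column="VDDC [V]"):
    """Return (min, max) of a voltage column from a GPU-Z sensor log.

    The column name is an assumption; check the header of your own log.
    """
    values = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            field = (row.get(column) or "").strip()
            if field:
                values.append(float(field))
    return min(values), max(values)
```

Something like `lo, hi = voltage_range("GPU-Z Sensor Log.txt")` would then show how far the rail drifted under load, which is the comparison the poster describes making between the two PSUs.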


I used to run a bunch of computers/video cards with another DC project, F@H, for thousands of work units; they ran 24/7 with no break. The way I see it, if they are kept at a steady temperature (100% load), they don't wear out as fast as a system that cycles hot, then cold, then hot, then cold...

Of course this has nothing to do with your O/T; I just mentioned it because you said you give the systems a "break" every day. Personally, I just don't see the need to let them cool down only to have them get hot again 20 minutes later.



Yes, I do agree with you.


I was told the broken PSUs will be replaced, and Corsair will tell me why it happened once they receive them.

Hmm... I think that won't be a short wait.

That's good news! FWIW, total turnaround time for RMAs is about 7-10 working days, plus a couple more days if you are overseas.

You could also call Customer Service and request an advanced RMA.


Archived

This topic is now archived and is closed to further replies.

