View Full Version : Watercooling pump question



x88x
11-25-2009, 01:16 PM
So, there's a very real possibility that I might be able to build something awesome for work, but in planning it and laying out the costs, I'm running into a bit of a problem.

Problem: need stupid amounts of GPU power.
Solution: EVGA 7-16x PCIe MBB (http://www.newegg.com/Product/Product.aspx?Item=N82E16813188059) + 7x EVGA GTX295 WC (http://www.newegg.com/Product/Product.aspx?Item=N82E16814130505) (yes, I know the MBB won't have 16x speeds if I fully populate it)

New problem:
I haven't decided on a case yet (can't do too much modification, since it's for work), but the case will largely depend on what I want to put in it. For laying out the WC loop, I'm thinking probably 4-5 3x120mm rads and a nice Eheim pump. What I'm wondering is whether one pump would be enough, or if I would be better off using multiple smaller pumps at various points in the loop for a better distribution of pressure. Also, has anyone seen something like this done before? One of my biggest concerns is that I would have to daisy-chain all the GPUs together in one long string because of the close proximity, and that the last few would have pretty horrible temps as a result.

Like I said, this is just a possibility, though it is definitely a possibility :D I'm just trying to work out some kinks in the idea before I try and sell it to my boss.

Kayin
11-25-2009, 02:36 PM
Run dual DDCs, though I can design you split loops for that as well, and that would let you run all of it off a TyphoonIII and a single D5...

x88x
11-25-2009, 04:01 PM
I'm definitely not averse to split loops but IDK if I would be able to break off the loop anywhere because of the close proximity of the cards. I know I can daisy-chain them with the 'SLI' WC connectors (http://www.dangerden.com/store/crosssli-danger-den-fittings.html), but I'm pretty sure there won't be enough space to break off and have, say, 3 cards on one loop and 4 on the other. If I can't split it, would you recommend having the two DDCs both at the same point or at two different points? If at the same point, in series or parallel? I would be using 3/8" ID tubing, unless there's a compelling reason to go up in this specific case.

jdbnsn
11-25-2009, 04:39 PM
You don't want to run water pumps in series; you can run them in parallel or in discrete loops. Just not in series.

x88x
11-25-2009, 04:49 PM
You don't want to run water pumps in series; you can run them in parallel or in discrete loops. Just not in series.

Ok, thanks...come to think of it, if I ran them in series, the second one wouldn't do much, would it...either that or it would run too fast and do horrible things.

jdbnsn
11-25-2009, 07:12 PM
Exactly! One will have far less resistance against the water pressure and the slower one will act as a resistor.

Kayin
11-25-2009, 08:14 PM
There are quite a few series pump tops to increase pumping head pressure. If you used matched pumps, any difference should be negligible.
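
(As a rough sanity check of the series vs. parallel question: with matched pumps, running them in series adds head at a given flow, while running them in parallel adds flow at a given head. A minimal sketch with made-up pump-curve numbers, not real DDC or D5 specs:)

# Toy model: a pump with a linear head-flow curve, H = H_max * (1 - Q / Q_max).
# The numbers are placeholders for illustration, not real pump specs.

H_MAX_M = 4.0      # head at zero flow, meters (hypothetical)
Q_MAX_LPH = 600.0  # flow at zero head, liters/hour (hypothetical)

def head_one_pump(q):
    """Head (m) one pump produces at flow q (L/h)."""
    return max(0.0, H_MAX_M * (1.0 - q / Q_MAX_LPH))

def head_series(q, n=2):
    """Matched pumps in series: heads add at the same flow."""
    return n * head_one_pump(q)

def flow_parallel(h, n=2):
    """Matched pumps in parallel: flows add at the same head."""
    return max(0.0, n * Q_MAX_LPH * (1.0 - h / H_MAX_M))

q = 300.0
print(f"one pump:      {head_one_pump(q):.1f} m head at {q:.0f} L/h")
print(f"two in series: {head_series(q):.1f} m head at {q:.0f} L/h")
print(f"two in parallel deliver {flow_parallel(2.0):.0f} L/h at 2.0 m head")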

Spawn-Inc
11-25-2009, 09:19 PM
series is perfectly fine so long as your pumps are matched.


there are several tops that run in series, as kayin said. XSPC even just came out with a triple series ddc pump top.


x88x, i would highly recommend checking out this thread from someone who is building the same system as you, but for folding.

http://www.ocforums.com/showthread.php?t=624109


anyhow i would run the whole loop in series using 3-4 triple rads with 2 pumps in series.

for tops check out XSPC.



and here is a review (http://www.xtremesystems.org/Forums/showthread.php?t=238389&highlight=dual+top+review) round up of various tops for DDC pumps.

jdbnsn
11-25-2009, 11:16 PM
I stand corrected!

p0Pe
11-26-2009, 12:57 PM
+1 on running 3 DDCs in series! it will give you the head pressure you need for a thing like this.

then a MO-RA 2 Pro for the radiator.

why is it that you don't go with a 5970 instead?

and for the cooling, you would have to run the cards in parallel to get some decent results! :)

the cosmos case should fit that mobo IIRC, and for the pump top i would go for the xspc triple top! :)

http://www.realredraider.com/vbulletin/showpost.php?p=83684&postcount=1

x88x
11-27-2009, 12:46 AM
Yeah, I know running the cooling for the cards in parallel will give better cooling, but I don't think there will be space to put in the connections to do that.

The reason for using the GTX295 is that with the software we're currently thinking of using, CUDA support is required, and the GTX295 is currently the most powerful CUDA card on the market.

I'll post more details once (if) it gets finalized, but for now it's still in the proposal stage.

Spawn-Inc
11-27-2009, 07:03 PM
here is a pic (http://ep.yimg.com/ca/I/yhst-65556269779593_2081_48821086) of how slot-to-slot parallel will work. swifty uses their short barbs with a piece of tubing in between, but i would grab the DD sli fittings or bp sli fittings, links below.

Danger Den sli fitting (http://www.sidewindercomputers.com/sliandcr1vid.html)

Bits power small sli (http://www.sidewindercomputers.com/bishg14simid.html)

Bits power medium sli (http://www.sidewindercomputers.com/bishg14sidse.html)

Bits power large sli (http://www.sidewindercomputers.com/bishg14si1id.html)

x88x
11-28-2009, 07:35 PM
Oooh, that kind of parallel... ok, that makes sense.

mDust
11-29-2009, 02:17 AM
What is this machine going to be doing with all this power?..and 4 to 5 rads!?! Wow! You should figure out how much heat you need to dissipate and then find out how much each rad can dissipate (allegedly, according to the salesman)...that will tell you how many rads you need. It might be easier to sell the idea to your boss if you have some hard numbers anyway...especially if s/he doesn't understand them!:D

For example:
This (http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=200&products_id=4185) rad claims to dissipate 6875 BTU per hour...whether it's true or marketing hype is not something I can say for sure. It's just a little guy so, to be on the safe side, I'll bet it dissipates 60% of that: 4125 BTU/hour. Converted to watts here (http://www.mhi-inc.com/Converter/watt_calculator.htm), it's safe to assume it dissipates at least 1208 watts. Depending on how efficient the 295's are, only a bit of the total electrical wattage will be converted to heat energy. A GTX 295 peaks at 289 watts...x 4 = 1156 watts. So even if the cards were 0% efficient and converted 100% of the electricity to heat, that one rad would dissipate it...This is still assuming that the manufacturer is a self-promoting liar regarding the 6875 BTUs in the first place. So it should easily cool those cards with muscle left to flex. I'd personally get two just to make sure though...because who knows what unholy fans/turbines they strapped to that rad to obtain that maximum dissipation measurement. And I'm not trying to sell you on a specific rad...I'm just saying you don't need 5! (...unless they'll be passive...are they passive?):D Then again, if it's corporate budget, I'd like to see 5 monster rads and what you'd do with them...
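
(For reference, that conversion works out like this; a quick sketch that takes the advertised 6875 BTU/hr and the 289 W peak at face value, with 1 BTU/hr being roughly 0.293 W:)

# Sanity check of the numbers above; the 60% derating is the same pessimistic guess used above.
BTU_HR_TO_W = 0.29307107  # watts per BTU/hr

claimed_btu_hr = 6875.0
derated_btu_hr = 0.60 * claimed_btu_hr     # pessimistic 60% of the claim
derated_watts = derated_btu_hr * BTU_HR_TO_W

gtx295_peak_w = 289.0
heat_four_cards_w = 4 * gtx295_peak_w      # worst case: all board power ends up as heat

print(f"derated radiator capacity: {derated_watts:.0f} W")        # ~1209 W
print(f"heat from four cards at peak: {heat_four_cards_w:.0f} W")  # 1156 W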

PS. If someone more knowledgeable than I says that I made a mistake then I most likely did...:whistler:

Spawn-Inc
11-29-2009, 03:02 AM
What is this machine going to be doing with all this power?..and 4 to 5 rads!?! Wow! You should figure out how much heat you need to dissipate and then find out how much each rad can dissipate (allegedly, according to the salesman)...that will tell you how many rads you need. It might be easier to sell the idea to your boss if you have some hard numbers anyway...especially if s/he doesn't understand them!:D

For example:
This (http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=200&products_id=4185) rad claims to dissipate 6875 BTU per hour...whether it's true or marketing hype is not something I can say for sure. It's just a little guy so, to be on the safe side, I'll bet it dissipates 60% of that: 4125 BTU/hour. Converted to watts here (http://www.mhi-inc.com/Converter/watt_calculator.htm), it's safe to assume it dissipates at least 1208 watts. Depending on how efficient the 295's are, only a bit of the total electrical wattage will be converted to heat energy. A GTX 295 peaks at 289 watts...x 4 = 1156 watts. So even if the cards were 0% efficient and converted 100% of the electricity to heat, that one rad would dissipate it...This is still assuming that the manufacturer is a self-promoting liar regarding the 6875 BTUs in the first place. So it should easily cool those cards with muscle left to flex. I'd personally get two just to make sure though...because who knows what unholy fans/turbines they strapped to that rad to obtain that maximum dissipation measurement. And I'm not trying to sell you on a specific rad...I'm just saying you don't need 5! (...unless they'll be passive...are they passive?):D Then again, if it's corporate budget, I'd like to see 5 monster rads and what you'd do with them...

PS. If someone more knowledgeable than I says that I made a mistake then I most likely did...:whistler:

i think those numbers are off.

here is a review (http://skinneelabs.com/Radiators/XSPC/RX360/RX360.html) for the XSPC RX360 rad; it can dissipate almost 1200 watts with fans at 2800 RPM and a water-in/air-out delta of 15C.

so in theory it can dissipate the heat from 4 cards (using mdust's numbers). but there is overclocking and decent temps to keep in mind, and those are loud fans too.

so practically speaking 2-3 triples is best for this job.
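
(Putting the review figure and the earlier per-card wattage together; a rough sketch in which the linear scaling with water-to-air delta is an approximation, and slower fans would lower the per-rad number further:)

import math

# Rough rad-count estimate; the 1200 W @ 15 C delta figure is from the linked RX360 review,
# everything else here is an assumption for illustration.
gtx295_peak_w = 289.0
num_cards = 4                    # mdust's example; change to 7 for the full build
heat_w = num_cards * gtx295_peak_w

rad_w_at_15c_delta = 1200.0      # RX360, 2800 RPM fans, 15 C water-to-air delta
target_delta_c = 8.0             # a cooler, quieter operating point
rad_w_at_target = rad_w_at_15c_delta * (target_delta_c / 15.0)  # roughly linear with delta

rads_needed = math.ceil(heat_w / rad_w_at_target)
print(f"heat load: {heat_w:.0f} W, per rad at {target_delta_c:.0f} C delta: {rad_w_at_target:.0f} W")
print(f"triple rads needed: {rads_needed}")   # 2 for four cards, 4 if you plug in 7 cards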

Trace
11-29-2009, 03:12 AM
PS. If someone more knowledgeable than I says that I made a mistake then I most likely did...:whistler:

Sig quoted.

x88x
11-29-2009, 02:11 PM
Thanks for the advice on rads; tbh, 5 was just a guess without drawing up any numbers yet. Though if we go with this setup it would be 7 GTX295s, not 4.

As for what all this is for...I'm not sure how much detail I can go into, but suffice it to say we need as much CUDA processing power as possible in one system.

Granted if this does go through it'll probably be a few months, but I'll be sure and update when I know.

Spawn-Inc
11-29-2009, 04:10 PM
Granted if this does go through it'll probably be a few months, but I'll be sure and update when I know.
PLEASE!

if this thing gets built you HAVE to show it off... that's just like proper pc etiquette. :D

mDust
11-29-2009, 04:26 PM
Though if we go with this setup it would be 7 GTX295s, not 4.
7!!!? *Passes out*


...what the hell? I always thought the maximum number of cards in SLI was 4...but I guess that's just because nobody has really crammed more than that in a mobo...that I've seen anyway. Or are all the cards going to just be running independently? Either way, this thing will be a monster, so be careful around it. I'd hate for it to tear a hole in space-time...

Oh, and can you say where you work? I'm going to be wondering about this thing all day now...:mad::D


Sig quoted. 8)

x88x
11-29-2009, 06:52 PM
7!!!? *Passes out*


...what the hell? I always thought the maximum number of cards in SLI was 4...but I guess that's just because nobody has really crammed more than that in a mobo...that I've seen anyway. Or are all the cards going to just be running independently? Either way, this thing will be a monster, so be careful around it. I'd hate for it to tear a hole in space-time...

Check the link for the MBB in the OP; there are actually two MBBs available now with 7 PCIe x16 slots...and they can run 4 at a time at full PCIe 2.0 x16 speeds, or, fully populated, 1 at 16x and 6 at 8x. And yeah, they will be run independently; what they will be used for doesn't benefit from SLI.


Oh, and can you say where you work? I'm going to be wondering about this thing all day now...:mad::D

I'm not sure I'm allowed to do that...they get real weird about non-US-citizens (which are plentiful on the internets :P ).

Spawn-Inc
11-29-2009, 11:25 PM
as far as i know there are only 2 setups that do 4-way sli: 2 of the GTX295's or 4 of the GTX 285's.

mDust
11-30-2009, 02:49 AM
I'm not sure I'm allowed to do that...they get real weird about non-US-citizens (which are plentiful on the internets).
You, sir, have piqued my curiosity...


I've seen the mobos with endless amounts of x16 slots, but I'm not entirely sure the manufacturer designed the product because they thought someone might need 7 graphics cards...It's my assumption that they never even expected all of them to be populated by something as taxing as a graphics card. I would further assume it would function properly but performance would probably be sub-par. I can't even imagine the workout that X58 chip would enjoy.;)
It seems like there shouldn't be nearly enough lanes to run all those cards...Wikipedia says the X58 supports 36 lanes. I read a little about that board from various (read: unreliable) sources which say it can support 4 cards at x16...hmmm... They also say that regardless of what the other slots are running at, the last slot is always x16...if so the remaining six still wouldn't be able to hit x4. It must be magic. I would think that 4 cards would run at x8 and the rest would be running at a pitiful x1. You might want to email EVGA to make sure 7 graphics cards will work in it before committing to this project.:think:

While absorbing information at EVGA's site I found a list (http://www.evga.com/support/faq/afmmain.aspx?faqid=58719) of cases that that beast will fit into. Most other cases will require extensive modding to make it work. I normally wouldn't consider that a limitation, but you said you didn't want to do much modding since it's for work.:)

x88x
11-30-2009, 12:46 PM
It must be magic.

Heheh, almost. What they did was they added another NB in order to get more PCIe lanes. Dell has a few of their Precision workstations that have a similar setup, though only with 4 x16 slots instead of 7 :P


You might want to email EVGA to make sure 7 graphics cards will work in it before committing to this project.:think:

Yeah, I'll definitely be doing that if we get the go-ahead to build it...make sure that it'll actually work before we drop 8 grand on the thing :P
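
(For rough context on the lane math in this thread; a back-of-the-envelope sketch that only uses figures already quoted here, with approximate bandwidth numbers:)

# Lane budget sketch: X58 reportedly has ~36 native PCIe 2.0 lanes, while a board that
# runs 1 slot at x16 and 6 at x8 presents 64 lanes' worth of slots. Bridge chips like
# the NF200 can fan lanes out to the slots, but they can't create upstream bandwidth;
# all traffic still funnels through the chipset's native links.
native_lanes = 36
slot_widths = [16] + [8] * 6

slot_lanes = sum(slot_widths)                 # 64
oversubscription = slot_lanes / native_lanes  # ~1.8x

gb_per_s_per_lane = 0.5                       # ~500 MB/s per PCIe 2.0 lane, each direction
print(f"slot lanes: {slot_lanes}, native lanes: {native_lanes}, oversubscription: {oversubscription:.1f}x")
print(f"shared upstream bandwidth ceiling: ~{native_lanes * gb_per_s_per_lane:.0f} GB/s per direction")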

Kayin
11-30-2009, 01:00 PM
Seven graphics cards most likely won't.

There's an NF200 chip that splits the lanes on each of those cards. HOWEVER, there are already two NF200 chips on that board that split the existing lanes. There was a guy that tried it with 9800GX2s and could not get anything to talk. His was with six.

Also, remember crosstalk latency. That SLI connector does some of the negotiations required to team those cards up. I know you're not running SLI, but even in CUDA apps it's used to exchange data. After a while, I'd imagine its loss would be an issue, especially in a heavily CPU-bound area like an enthusiast board with a single socket. Make sure to get a Gainestown if you do this.

tl;dr-this probably won't work unless something major has changed recently.

mDust
12-01-2009, 05:13 PM
...they added another NB in order to get more PCIe lanes.
That's just sneaky...:D I figured they probably did something extreme on the board (other than 7 PCI-E slots :facepalm: ) but I couldn't find anything technical about it. Even if this thing gets toned down a wee bit, it's still going to be a beast.

Also, I was pondering the flow through the water blocks:
http://img214.imageshack.us/img214/3860/wcingoptionscopy.jpg
I realized that the parallel setup that was suggested (A) might not work well with 7 water blocks. The flow will be highest in the blocks closest to the inlet and outlet, but there's nothing really forcing the water through the blocks that are furthest away. I modified it (B) so that all the blocks will have decent flow. The water moving past the bottom blocks will draw water through them, so no blocks will have 'sitting' water. I also figured, since you're considering multiple pumps, I'd modify a series setup (C) to make it work. What I drew up assumes at least 2 pumps, and if more are used, the blocks may be split up such as 2-2-3 instead of 4-3. Basically, there is a pump > cards in series > rad > 2nd pump > cards in series > rad...etc.
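
(A rough way to compare the layouts: treat each block as a simple fixed restriction. The numbers below are made up and real manifolds are messier, but the trend is the point:)

# Toy hydraulic comparison of series vs. parallel block plumbing (pressure drop = R * Q).
R_BLOCK = 1.0     # restriction of one water block, arbitrary units
N_BLOCKS = 7
Q_PUMP = 1.0      # total flow the pump(s) deliver, arbitrary units

# Series (layout C-ish): restrictions add, but every block sees the full flow.
r_series = N_BLOCKS * R_BLOCK
flow_per_block_series = Q_PUMP

# Parallel (layouts A/B): total restriction drops, but the flow is shared between blocks.
r_parallel = R_BLOCK / N_BLOCKS
flow_per_block_parallel = Q_PUMP / N_BLOCKS   # ideal case: identical branches, perfect manifolds

print(f"series:   restriction {r_series:.2f}, flow per block {flow_per_block_series:.2f}")
print(f"parallel: restriction {r_parallel:.2f}, flow per block {flow_per_block_parallel:.2f}")
# In practice the manifold tubing has restriction of its own, which is why blocks nearest
# the inlet/outlet see more flow in layout A and why layout B tries to even that out.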

x88x
12-01-2009, 05:19 PM
Thanks. I think B would probably be the best option. I would love to do something like C, but I seriously doubt the angle connectors would fit between the cards.

Spawn-Inc
12-01-2009, 11:57 PM
i like B. for something on this scale i think it will come down to testing; no one has done such a test that i know of.

Trace
12-02-2009, 03:10 AM
I'm voting for C, it's what I did for my dual 3870X2's