As far as I know, there are only two setups that do 4-way SLI: two GTX 295s or four GTX 285s.
CPU: Q6600 G0 3.5GHz@1.4v (4.2GHz max) / 4790k 4.8ghz @1.265v
GPU: 9800GTX /GTX780 hydrocopper
Ram: Samsung 4GB /gskill 16gb DDR3 1600
Mobo: EVGA-NF68-A1 680i (P32) /AsRock Extreme6
PSU: Enermax Galaxy 850Watt /EVGA 850 G2
HDD: OCZ 120GB Vertex4, Samsung evo 840 250GB
LCD: Samsung 32" LN32A450, Samsung 226BW 22" wide
Sound: Logitech Z-5500
CPU & GPU: 3x Swiftech MCR320, 2x MCP655, MCW60 R2, Dtek Fuzion V2, 18 high speed yates @ 5v
You, sir, have piqued my curiosity...
Originally Posted by x88x
I've seen the mobos with endless amounts of x16 slots, but I'm not entirely sure the manufacturer designed the product because they thought someone might need 7 graphics cards. It's my assumption that they never expected all of them to be populated by something as taxing as a graphics card. I would further assume it would function properly, but performance would probably be sub-par. I can't even imagine the workout that X58 chip would enjoy.
It seems like there shouldn't be nearly enough lanes to run all those cards. Wikipedia says the X58 supports 36 lanes. I read a little about that board from various (read: unreliable) sources which say it can support 4 cards at x16...hmmm. They also say that regardless of what the other slots are running at, the last slot is always x16; if so, the remaining six still wouldn't even be able to hit x4. It must be magic. I would think that 4 cards would run at x8 and the rest would be running at a pitiful x1. You might want to email EVGA to make sure 7 graphics cards will work in it before committing to this project.
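For the curious, the lane math above can be sketched quickly. This is a minimal Python sketch, assuming the chipset exposes 36 PCIe lanes and splitting them evenly with no NF200-style switch chips involved; the even-split rule and the function name are my own simplification, not how any real board necessarily allocates lanes:

```python
# Rough lane-budget sanity check for the X58 scenario discussed above.
# Assumption (from the thread): the chipset exposes 36 PCIe lanes.
CHIPSET_LANES = 36

def electrical_width(cards, lanes=CHIPSET_LANES):
    """Largest power-of-two link width each card could get if the
    chipset lanes were split evenly (no switch chips involved)."""
    per_card = lanes // cards
    width = 1
    while width * 2 <= per_card:
        width *= 2
    return width

for n in (2, 4, 7):
    print(n, "cards ->", f"x{electrical_width(n)}", "each without switches")
# 2 cards fit at x16, 4 cards at x8, but 7 cards only at x4 each,
# which is why the board needs NF200 switches to advertise more x16 slots.
```

A switch chip doesn't add bandwidth; it just lets multiple x16 slots share one uplink, so the totals above are still the real ceiling.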
While absorbing information at EVGA's site I found a list of cases that that beast will fit into. Most other cases will require extensive modding to make it work. I normally wouldn't consider that a limitation, but you said you didn't want to do much modding since it's for work.
I'll procrastinate tomorrow.
Heheh, almost. What they did was add another NB in order to get more PCIe lanes. Dell has a few of their Precision workstations with a similar setup, though with only 4 x16 slots instead of 7.
Yeah, I'll definitely be doing that if we get the go-ahead to build it...make sure that it'll actually work before we drop 8 grand on the thing
Seven graphics cards most likely won't.
There's an NF200 chip that splits the lanes on each of those cards. HOWEVER, there are already two NF200 chips on that board that split the existing lanes. There was a guy who tried it with 9800GX2s and couldn't get anything to talk, and that was with only six cards.
Also, remember crosstalk latency. That SLI connector does some of the negotiations required to team those cards up. I know you're not running SLI, but even in CUDA apps it's used to exchange data. After a while, I'd imagine its loss would be an issue, especially in a heavily CPU-bound setup like an enthusiast board with a single socket. Make sure to get a Gainestown if you do this.
tl;dr: this probably won't work unless something major has changed recently.
That's just sneaky... I figured they probably did something extreme on the board (other than 7 PCI-E slots) but I couldn't find anything technical about it. Even if this thing gets toned down a wee bit, it's still going to be a beast.
Originally Posted by x88x
Also, I was pondering the flow through the water blocks:
I realized that in the parallel setup that was suggested (A) might not work well with 7 water blocks. The flow will be highest in the blocks closest to the inlet and outlet but there's nothing really forcing the water through the blocks that are furthest. I modified it (B) so that all the blocks will have decent flow. The water moving past the bottom blocks will draw water through them, so no blocks will have 'sitting' water. I also figured, since you're considering multiple pumps, I'd modify a series setup (C) to make it work. What I drew up assumes at least 2 pumps and if more are used, the blocks may be split up such as 2-2-3 instead of 4-3. Basically, there is a pump > cards in series > rad > 2nd pump > cards in series > rad...etc.
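The flow imbalance described for layout (A) can be illustrated with a simple hydraulic-resistance analogy, treating each branch like a resistor in a parallel circuit. This is only a sketch; the resistance values, the linear flow/pressure model, and the `branch_flows` helper are all made up for illustration, not measured data for any of these blocks:

```python
# Illustrative flow-split model for parallel layout (A) above.
# Each block sits behind extra tubing the farther it is from the
# inlet/outlet, so its branch has higher hydraulic resistance.
# Flow divides in proportion to conductance (1/R), like current
# in parallel resistors (a crude laminar-flow approximation).

def branch_flows(n_blocks, block_r=1.0, tube_r=0.3, total_flow=1.0):
    """Flow through each parallel branch; branch i is i tubing
    segments (supply + return) farther from the manifold."""
    resistances = [block_r + 2 * i * tube_r for i in range(n_blocks)]
    conductances = [1.0 / r for r in resistances]
    total_c = sum(conductances)
    return [total_flow * c / total_c for c in conductances]

flows = branch_flows(7)
print([round(f, 3) for f in flows])
# The block nearest the inlet gets several times the flow of the
# farthest one -- the 'sitting water' problem layout (B) avoids.
```

With these toy numbers the nearest block sees roughly four to five times the flow of the farthest one, which matches the intuition that (B)'s routing evens things out.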
Thanks. I think B would probably be the best option. I would love to do something like C, but I seriously doubt the angle connectors would fit between the cards.
I like B. For something on this scale I think it will come down to testing; no one has done such a test that I know of.
I'm voting for C; it's what I did for my dual 3870X2s.
Have you checked out the front page lately?
Projects:
Moe's Tavern | Sponsored by: Mimo Monitors, Crucial, Thermaltake
Book Of Knowledge