
Video Card RAM



.Maleficus.
08-20-2006, 12:28 PM
Ok, so there's DDR, GDDR2, and GDDR3. Which is better? I'm guessing GDDR3, but would it be better to have more memory on a GDDR2 card, or less memory using GDDR3? I'm looking at 2 cards right now (for my new computer until Vista and DX10 come out) and they are about the same price. One has 256MB of GDDR3, and the other has 512MB of GDDR2. The 256MB one is the "Overclocked Edition", so it has higher clock speeds too. These are the cards: 512MB Card (http://www.newegg.com/Product/Product.asp?Item=N82E16814102653) and 256MB Card (http://www.newegg.com/Product/Product.asp?Item=N82E16814102019). If I get the 512MB un-overclocked card, I can overclock it myself, so don't let that sway your decision.

Also, as these are ATi cards, and I've never spent much time around them, would you suggest buying an ATi-made card, or a different brand? The 2 I linked were Sapphire, and they were the best deal I could find to just get me playing games until Vista and DX10. If you know of a better deal, preferably under $105, can you post that too?

Thank you for any help!

Silenced_Coyote
08-20-2006, 09:21 PM
Getting a Sapphire-brand card is a good choice.

Get this one:
http://www.newegg.com/Product/Product.asp?Item=N82E16814102019

When comparing cards, it is way more important to look at the clock and memory speeds than the amount of memory it has. Oh, and the 2nd one has a $20 mail-in rebate, so hurry up and buy it!

public_eyesore
08-20-2006, 09:45 PM
the newer ones overclock further

.Maleficus.
08-20-2006, 11:53 PM
Well, this isn't a permanent card, just until I feel Vista is worth my money. And if you say the second one is the best, then that shall be the one I buy.

Silenced_Coyote
08-21-2006, 12:47 AM
Don't forget the mail-in rebate! Must be purchased before the 27th to qualify.

.Maleficus.
08-21-2006, 04:22 PM
Hmmm... Now I have a new question. Which is better, the X1600PRO or the X1600XT? I can get an X1600XT 256MB GDDR2 card for about $10 more, if I don't get the rebate on the other one. It doesn't list the clock speeds for the XT, but I'm guessing it is a better card. Now which one should I get?

CanaBalistic
08-21-2006, 10:42 PM
The Radeon X1600 XT officially runs at 590 MHz and accesses its memory at 1.38 GHz (22.08 GB/s), transferring 128 bits per clock cycle.

This video card has four GDDR2 512-Mbit 1.4 ns chips from Samsung (K4J52324QC-BC14), making up its 256 MB of video memory (4 x 512 Mbit = 2,048 Mbit = 256 MB). These chips are rated to run at up to 1.4 GHz. Since this video card already accesses the memory at 1.38 GHz, there is essentially no headroom to overclock its memory within spec.

http://www.hardwaresecrets.com/article/355/1
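Those figures check out if you treat the 1.38 GHz number as the effective (DDR) data rate across the 128-bit bus. A quick back-of-the-envelope check in Python, just a sketch of the arithmetic quoted above:

# Sanity check of the quoted X1600 XT memory figures (assumes 1.38 GHz is
# the effective DDR data rate on the 128-bit bus).
bus_width_bits = 128
effective_rate_ghz = 1.38

bandwidth_gb_s = effective_rate_ghz * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gb_s:.2f} GB/s")   # -> 22.08 GB/s

# Four 512-Mbit GDDR2 chips make up the frame buffer.
chips = 4
chip_size_mbit = 512
total_mb = chips * chip_size_mbit / 8
print(f"Video memory: {total_mb:.0f} MB")                # -> 256 MB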

Slug Toy
08-21-2006, 11:23 PM
as far as the x1600pro goes, the 256MB GDDR3 card is enough. going to 512MB wont get you any more performance because the card is limited by the 128 bit interface to some extent. 512MB isnt completely necessary yet anyways... its good, but not essential. i personally dont like GDDR2 anyways... i cant really explain why... it just feels like a bastardized step between DDR and GDDR3. why anyone would put GDDR2 on a card these days is beyond me because it makes just as much, if not more, sense to use GDDR3.

seeing as the core clocks are 575 for the pro and 590 for the xt, you can save a little money and overclock the pro up to 590. a 15MHz increase should be completely possible. the ram is a little iffy though, to get up to 1.38GHz from 1.2... theres no guarantee on that. it IS only 180MHz, but a lot can happen in that span. the ram on my video card is technically rated for 1.2GHz, but i cant get it past 1.05 on a good day.
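to put those overclocks in perspective, here's the rough arithmetic on the clocks quoted above (nothing card-specific, just the relative jumps):

# Relative size of the proposed overclocks (figures from the post above).
core_stock, core_target = 575, 590     # MHz, X1600 PRO core -> XT core
mem_stock, mem_target = 1200, 1380     # MHz effective, PRO memory -> XT memory

core_bump = (core_target - core_stock) / core_stock * 100
mem_bump = (mem_target - mem_stock) / mem_stock * 100
print(f"Core overclock:   +{core_target - core_stock} MHz ({core_bump:.1f}%)")   # ~2.6%
print(f"Memory overclock: +{mem_target - mem_stock} MHz ({mem_bump:.1f}%)")      # 15.0%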

sapphire is definitely the way to go. they get good deals and generally have the lowest prices. i had a sapphire 9600xt a while ago. no complaints, it ran as good as a card can run.

.Maleficus.
08-22-2006, 08:09 AM
Then I will go with the X1600PRO. Hopefully I will be able to order it in the next few days to get that rebate.

dgrmkrp
08-22-2006, 08:46 AM
i don't know if it's too late or not.. i read a couple of reviews and articles.. look for deals on the HIS x1800gto IceQ version.. this baby can be converted to an x1800xl by some bios tweaking and has a 256-bit memory interface.. and ddr3 :) and the only problem is its price, compared to what u seek.. with the right cooling and luck, u could get a x1800xt :D

Razors Edge
08-22-2006, 11:07 AM
See, I've never been that kind of person. When I'm buying a graphics card, I go by reviews. It's about what I need now. I will be buying an AGP x1600, and I'm getting a 1600 because by the name it must be better than the x800. I don't care about the memory clock and all of that. As long as it plays my games, I'm happy.

.Maleficus.
08-22-2006, 11:28 AM
Yeah, I go by reviews a lot too, but clock speeds matter a lot. I'm getting the X1600PRO because it's the cheapest thing I could find that should play all the games I have now, and the games coming out between now and whenever I feel comfortable with Vista. I don't need all the eye candy enabled, I just need the experience, even if the graphics are most of the experience. Hell, I don't even know what AA and AF do. But I just placed my order for the card, so I'm one step closer to having my computer :D.

dgrmkrp
08-22-2006, 11:34 AM
well, razors edge, i said reviews and articles.. believe it or not, a x800 can be better than a x1800.. why? both have a pseudo 16 pipeline architecture, 256-bit memory interface.. but the 1800 is oriented on pixel shader power while the x800 has some raw power, not really used today. it will either be outperformed in newer games or will have more raw power to be used by some games.. and a 1600 isn't better by default.. think of the x1300.. cheap, good replacement for old cards, yet in its price range it can be outperformed by other chips because of the internal architecture.. true 4 pipeline GPUs will leave it behind from a power perspective.. but the 1300 can do more eye-candy and stuff, and it gets a higher mark.. of course, i don't follow the reviews by heart.. sometimes, it's better to do what u said.. if it goes, i'm happy.. but for my quirks to be happy :) i need numbers :) just me :) and nice die pictures :)

Slug Toy
08-22-2006, 03:19 PM
define raw power please. because im having trouble figuring out how a GPU with 16 pipelines and 1 shader per pipeline can do more work than a GPU with 16 pipelines and 3 shaders per pipeline. in the worst case, they should perform equally. there is no raw power to squabble about... power is power is power.

if you want to look at "raw" power, look at nvidia's architecture. they are still running on the "add more pipelines and transistors" method. its not a bad one, but its more of a brute force method.

and whats this talk about true pipelines? fact of the matter is that any pipeline is a true pipeline. theres no set rule stating that a pipeline needs this and that and a couple of those. the x1300 is a true 4 pipe GPU... theres no two ways about it. i would like to see a comparison between the x1300 and the 7300 to see if what you say is true, because it would be my guess that they would be more or less evenly matched.

also, keep in mind here... only the x1600 and x1900 have three-shader pipelines. the other cards have different numbers. the x1800 only has one. i think the x1300 has one or two. so to say that the x1300 can do more eye candy than a similarly priced card, and still somehow perform worse, is wishful thinking. it will do what a four pipeline card can do.
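for what it's worth, the pipelines-times-shaders arithmetic from the posts above works out like this (the counts are just the ones quoted in this thread, so treat them as approximate):

# Total pixel shader units = pipelines * shaders per pipeline
# (pipeline/shader counts as quoted in this thread, not official specs).
cards = {
    "x1800": (16, 1),   # 16 pipelines, 1 shader per pipeline
    "x1900": (16, 3),   # 16 pipelines, 3 shaders per pipeline -> the "48 pixel processors"
    "x1300": (4, 1),    # 4 pipelines, roughly 1 shader per pipeline
}
for name, (pipes, shaders) in cards.items():
    print(f"{name}: {pipes} x {shaders} = {pipes * shaders} shader units")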

dgrmkrp
08-22-2006, 03:51 PM
raw power: the ability to create geometry and put a texture on it.. no special effects, just plain 3D. pixel shaders are a cosmetic effect that requires something different than vertex shaders (or the future dx10 geometry shader): more math power and more complex branching mechanisms.. ati has a nice + here..

ok... what i'm talking about is the components in each "pipeline".. you need some components, but not all, in a "modern" card.. you have your texturing units, geometry units, pixel units.. if u have 1 of each unit per so-called pipeline, u get a true pipeline.. with nvidia, this (was) is mostly true.. with ati, only in higher-end models.. basically, they link several units in series to get a pipeline and then send several of them to one other unit, that takes care of the entire information up to that point.. in lower end boards, it is more economical to do so.. not much power there anyway, so why use dedicated parts?

i constantly read xbitlabs.com and tomshardware.com.. some of the stuff i based my claims on:

http://www.tomshardware.com/2006/07/31/graphics_beginners_2/ and basically the shader/architecture part of the article..
http://www.tomshardware.com/2005/10/06/ati_enters_the_x1000_promised_land/page7.html
http://www.tomshardware.com/2005/10/06/ati_enters_the_x1000_promised_land/page8.html
http://www.xbitlabs.com/articles/video/ lots of fun stuff, spread all over :)

and some tests on thg's part.. if they are wrong, i'm wrong too.. my bad :) i'm not so interested in video boards really, but what does fascinate me here is the shift of perspective: nvidia thinks a balance should exist between the geometry and the effects, while ati goes all out on pixel shaders.. take the 1900 with its 48 pixel processors..

http://img529.imageshack.us/img529/9351/clipboard01so9.jpg

http://img528.imageshack.us/img528/8672/clipboard02ks5.jpg

http://img528.imageshack.us/img528/5118/clipboard03ai3.jpg

http://img226.imageshack.us/img226/121/clipboard04vf0.jpg

basically, in age of emps 3 and serious sam 2, the x800 beats the x1800.. but when newer action-filled games are involved, like fear, the x1800 wins.. plus, the generation gap and a few +mhz give the 1800 a 50% boost in 3dmark :) of course, i'm not in favour of the x800, but it shows that in some circumstances, a different build can have different effects.. the x800 is just a shader 2 gpu, but it had more of that "raw power".. in the new model, they made up for that by increasing the freq. i love them anyway and i like studying stuff.. anyway, i kinda forgot what i was saying :rolleyes:

Slug Toy
08-22-2006, 05:33 PM
ok, i kind of see what you're saying but it still isnt completely convincing. the problem is that the x800 beats the x1800 at 1024x768 with lowest quality basically. i can see how that would happen... the x1000 series is definitely forward looking, at the expense of some backwards compatibility so to speak. it can excel in sm3, but can fall behind in sm2.

you know what, youve managed to confuse me. too many colours going on i think. this is bringing back too many memories at once now. all this talk about balance between geometry and shading... and then the comparisons and argh.

i think in this x1800 vs x800 thing, im sticking to the idea of being too complex to handle the simple tasks. kind of like... over-thinking an issue in english class or something... you're too smart for the task at hand and get held up by your awesomeness.

im also sticking to the idea that a pipeline is a pipeline is a pipeline. no matter what anyone says... it gets the job done so it qualifies as a pipeline.

i think ive got a bit of reading to do because i DO remember some interesting anomalies when the x1000 series first came out.

.Maleficus.
08-22-2006, 10:47 PM
Wow, it's funny how much of that I didn't understand lol.

dgrmkrp
08-23-2006, 04:18 AM
"i think in this x1800 vs x800 thing, im sticking to the idea of being too complex to handle the simple tasks" - i totally agree.. and about the backward compatibility.. i was a tad interested because i regularly make large .max scenes that simply cripple my cpu and gpu when it comes to walking around in them.. and from this point of view, at the beginning the x1000 series wouldn't be better.. there is no shader in the viewports.. of course, ati modified their drivers to offer more power here.. but nvidia did more and said they would allow gpu rendering of the final scene too.. i was confused and had no money to test this :) i know i'm going a lil' away from the topic, but this is what made me look at these awkward reversals of performance.. and still, the new architecture wins 90% of the time :) and yes, a pipeline is a pipeline :)
so, how's the new video board? .Maleficus., are you gonna keep it stock or do some nasty, big, coppery, windy, pipey things to it and maybe oc it? a video card is such a moddable project.. and you can invalidate your warranty instantaneously ;)

.Maleficus.
08-23-2006, 11:17 AM
Well, it hasn't arrived yet (should come Friday though), and I still won't be able to use it for a while (I'm still in the process of getting money/parts) but yes, I do plan to OC it and maybe do some cool cooling with it. I'll probably put some Arctic Silver on it right away, so I have another question. There are RAM heatsinks, right? Do I put Arctic Silver on those too?

dgrmkrp
08-23-2006, 12:21 PM
well.. how are the chips fixed to the board? 'cause if they are glued... i don't think it would be wise to put AS on them.. kinda stops the adhesive properties :) if u buy a cooler with a base that is fashioned so it has direct contact with the ram chips and mounts through the pcb holes.. then u can add AS.. but you should only use thermal paste if there is a mechanical fixing solution too, not just thermal tapes..

.Maleficus.
08-31-2006, 10:19 AM
Ok, so I got my video card, but I still don't know what half the crap that came with it is. These are the cables and stuff.


Edit: Can't get the image to upload, so here's the link (http://www.newegg.com/Product/ShowImage.asp?Image=14%2D102%2D019%2D02%2Ejpg%2C14%2D102%2D019%2D03%2Ejpg%2C14%2D102%2D019%2D04%2Ejpg%2C14%2D102%2D019%2D05%2Ejpg%2C14%2D102%2D019%2D06%2Ejpg&CurImage=14%2D102%2D019%2D02%2Ejpg&Type=OPENBOX&Description=Open+Box%3A+SAPPHIRE+100156L+Radeon+X1600PRO+256MB+GDDR3+PCI+Express+x16+Overclocked+Edition+CrossFire+Ready+Video+Card+%2D+OEM).

The Black Pumpkin
08-31-2006, 10:20 AM
Looks like the major component there is a white square with a little red x. Could be the crossfire dongle.

Can't see the pic. :rolleyes:

.Maleficus.
08-31-2006, 11:12 AM
Fixed.

The Black Pumpkin
08-31-2006, 12:40 PM
Well, there's the driver CD, the DVI/VGA adapter (tan), and the S-Video cable (black ends). That's what I know. Looks like the triple splitter goes from S-Video to a different system, maybe for projectors? The yellow adapter looks like it goes from S-Video to the cable with the yellow plugs, no clue what it's for.

dgrmkrp
08-31-2006, 02:59 PM
the weird plug is for the proprietary system ati uses.. you plug the solo end into the card and get VIVO or multiple standard video/tv outputs.. you can't of course work a tv with a short cable like that, so you get extenders :) the yellow thing looks like an adapter for something, but i don't know exactly for what..