View Full Version : It seems this is not a good time to buy real computers.
Konrad
05-01-2014, 10:55 PM
Not just not a good time. Kind of an awful time.
And by "real computers" I refer to nice high-/extreme-performance/gaming stuff. Not necessarily the record-breaking overclock platforms, but not far off. Sure, low-end value computing, where bang-for-the-buck is paramount above all else, is pretty impressive. And the middly systems range all over the middle, some of which peaks out surprisingly near the top for not-too-bad pricing.
I suspect there's basically two reasons for this:
1)
Intel and AMD traditionally leapfrog each other, alternately juggling positions on who offers the best tech - which, incidentally, forces them both to innovate more aggressively and keep their pricing tiers trimmed down. But in recent years AMD's focus has been elsewhere and they've lagged considerably in their top PC offerings, while Intel has focused, unchallenged, on numerous incremental minor refinements and finer lithography/substrate/microcode technologies which offer diminishing returns in raw performance but command exponentially inflated prices.
and 2)
PCIe 3.0 (GEN3) motherboards, processors, chipsets, and graphics cards are - as always - "just around the corner". Yes, a few offerings do exist, enough to build an entire working GEN3 system. But when you dig into the deep electronic/logic details, these all seem to be PCIe 2.0 platforms which incorporate added parts and complexity to somehow merge PCIe 2.0 components into half as many PCIe 3.0 components, with less-than-expected performance and reliability. In short - these products are basically PCIe 2.0/3.0 hybrids which are experimentally prototyping PCIe 3.0 implementations, and they aren't as good as they should be.
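Quick back-of-envelope on the raw numbers behind that trade - this is a minimal sketch using the nominal spec line rates and encodings only; real usable throughput is lower once protocol overhead kicks in:

# Nominal PCIe per-lane throughput: line rate x encoding efficiency (spec values only).
def lane_gbps(line_rate_gtps, payload_bits, total_bits):
    return line_rate_gtps * payload_bits / total_bits

pcie2 = lane_gbps(5.0, 8, 10)      # PCIe 2.0: 5 GT/s, 8b/10b encoding -> 4.0 Gbit/s (~0.5 GB/s)
pcie3 = lane_gbps(8.0, 128, 130)   # PCIe 3.0: 8 GT/s, 128b/130b encoding -> ~7.9 Gbit/s (~1 GB/s)
print(f"PCIe 2.0 lane: {pcie2:.2f} Gbit/s | PCIe 3.0 lane: {pcie3:.2f} Gbit/s | ratio: {pcie3/pcie2:.2f}x")

The ratio works out to roughly 2x per lane, which is why these bridged designs pitch "half as many GEN3 lanes" as an even swap - on paper, anyway.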
Any thoughts?
TLHarrell
05-02-2014, 12:54 AM
I'm usually not on the bleeding edge as the bang for the buck starts to fall off pretty rapidly once you get beyond about $2K.
TheMainMan
05-02-2014, 10:52 AM
I feel the same way about both posts, actually. In Canada, I would say that $2K becomes more like $2.5K, because the economies of scale don't allow prices quite as low as in the US. The last time I built a really high-end machine I spent a little over $2200, and any noticeable performance gains would have taken at least another $200 to hit.
My primary rig is still running on a Core 2 Quad Q9550, and until that dies there isn't a platform out there that really makes me feel the need to upgrade. I know anything Socket 1150 or 2011 would crush my machine, but, as Konrad mentioned, the refinements Intel worked on while AMD ventured into APUs mean that the really high-performing 2011 parts are extremely expensive, and while the 1150s are great, they're not worth a $2K+ upgrade for me.
I would guess that the PCIe cobbling-together will probably happen in the early stages of most generations. My Asus Striker II Formula was one of the first dual-x16 PCIe 2.0 boards... at least until you dive into the architecture that underlies the marketing claim. Since they assumed that anyone who cared about dual x16 and PCIe 2.0 would be planning on running an SLI setup, an extra chip was added to the board that links the PCIe slots at full x16 speeds, but that chip was only connected to the Northbridge through ...... a link half as big as the ones it provides to the attached slots! Looking at some of the current PCIe 3.0 stuff, the solutions being manufactured now seem just as thrown together as back then.
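A toy calculation shows why that stings - the per-lane rate below is the real PCIe 2.0 spec figure, but the uplink widths are just hypothetical stand-ins, so plug in whatever the board actually wires up:

# Two x16 PCIe 2.0 slots sharing one narrower uplink to the Northbridge.
PCIE2_GBPS_PER_LANE = 0.5                            # ~0.5 GB/s per PCIe 2.0 lane after 8b/10b encoding

slot_demand = 2 * 16 * PCIE2_GBPS_PER_LANE           # two x16 slots can ask for ~16 GB/s combined
for uplink_lanes in (8, 16):                         # hypothetical uplink widths
    cap = uplink_lanes * PCIE2_GBPS_PER_LANE
    print(f"x{uplink_lanes} uplink: {cap:.0f} GB/s shared cap vs {slot_demand:.0f} GB/s of slot demand")

Either way, the two slots can collectively ask for more than the uplink can carry, so "dual x16" only really holds for traffic that stays between the two cards.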
All that being said, I still like getting to build nice machines for other people :D
d_stilgar
05-02-2014, 01:58 PM
I'll agree and disagree to some extent.
I think now is a pretty exciting time to build. The Q9550 machine I've had as my HTPC has been really unstable, even after I got a new motherboard and RAM. Maybe it's a PSU issue. Maybe the hardware is just at its end of life. But the important thing is that I built that system in 2008 for around $1500 (excluding the case). Today, a $100 i3 CPU will perform faster. It's great. Progress has been great. I can get that machine running for ~$300, and it will be better than my old system in almost every way.
The bad thing has been that cryptocurrency mining has made AMD chips artificially expensive, which is frustrating, slows progress, and gives AMD incentive to cater their cards more to that purpose.
Also, 4k monitors are coming down in price really quickly, with some really good offers for relatively cheap prices (http://www.amazon.com/dp/B00IEZGWI2/ref=wl_it_dp_o_pC_nS_ttl?_encoding=UTF8&colid=2FVSUTLCGPQFM&coliid=I3K432VPB24F6Q). If I had the money, I would get a 4k monitor and upgrade my system within the next 9 months. As it is, I'll just wait until HL3 comes out (amiright?). In any case, that new resolution is really demanding, so it's a great motivator to build a new system.
But if you already have a system and all you're doing is running at 1080p, then I don't see what the big deal is. The system I built in 2008 was running a 1920x1200 monitor, and although that system won't run games at full settings anymore, they still look pretty good. Most mid-range hardware can handle pretty much all current games at 1080p on "high" or "highest" settings these days.
Not that this is about system building, but VR is finally going to become a thing in a very big and very real way. It's not something that will inspire most people to build a new system (because most systems should be able to handle it already), but if your system isn't fast enough to render out for a VR headset, then I think it's a really strong motivator to build a new system.
Konrad
05-03-2014, 04:04 PM
Hmm, I guess I'm just disillusioned because I wanted to build a nice top-tier system.
So, say, a Socket LGA2011, X79 chipset, i7 6-core Extreme system? Basically $1750-$2500 for just proc, mobo, and full RAM ... and Intel keeps promising that X99 chipset is coming soon ... and mobo offerings for GEN3 and OC look kinda kludgy and limited right now. Might as well just stick with my trusty old room-heating LGA1366 X58 i7-990X. Bah!
Or, say, a Socket AM3+, 990FX chipset, AMD FX 8-core Black Edition? $1250-$1750 for just proc, mobo (ASUS Sabertooth 990FX/GEN3 R2.0 or Gigabyte GA-990FXA-UD7?), and full RAM ... again, all the GEN3 looks really kludged and receives poor reviews, and more OC options exist but they aren't really as impressive as the Intel-compatible boards. AMD (or is it ASUS?) apparently still hasn't worked out all the troublesome HT 3.1 issues. And, I might be mistaken, but I've always felt that pure-AMD CrossFire systems gotta work a little better than their AMD-CPU/nVidia-SLI counterparts, so there's another $1K for a middly-highish pair of x16 AMD Radeons (which don't actually benchmark at true PCIe 3.0 performance levels).
Stupid computer market. Hurry up and improve!
Omega
05-04-2014, 03:36 AM
I certainly feel like things were better on the processor front back in the days when AMD and Intel were battling it out hard; you could pick up good-performing chips for pennies on the dollar because they were trying to out-do and out-sell each other. Plus, AMD-based systems were often fairly competitive with comparably priced Intel rigs.
Due to some bad experiences with the AMD end of things in recent times (with my A10-5800K), and with baseline performance being another benefit of an Intel rig, there are few reasons to go AMD unless you're seeking budget computing.
That said, I feel like the graphics war is on. Or rather, is still on. Nvidia and AMD are battling it out there hard, with R9 290Xs and 780Tis, rapid development of new cards and ever increasing insane power outputs from these things. Hell, even my GTX 660 SC does things I could have never imagined, and this is a $200 card.
Konrad
05-06-2014, 07:20 PM
True, true.
Even the crappiest mid-end tech today is vastly superior to old stuff and, unless you're an extremist, probably a little overkill for the stuff most of us actually do on our computers.
I guess it's just a sign that we're getting old.
Having said that ... I'd still overwhelmingly prefer my ancient 64K Apple II+ clone over some junky Acer/Gateway system.
Markwinstanley
05-14-2014, 03:48 AM
I don't think we can really stay up to date with technology. Every few months new tech comes out - you just bought a 1080p 3D monitor, then you hear about 4K and think "oh ****, now I have to buy 4K." We wait for the price to come down, but by the time it does, the tech isn't new anymore because newer tech has come out. So I have to say that only rich people can really stay up to date with new tech.
Twigsoffury
05-18-2014, 06:18 PM
It's been that way since the dawn of electronic time.
Konrad
03-20-2015, 12:46 AM
Necroing myself One Year Later ...
Leapfrogging competition is a thing of the distant past. Intel is two leaps ahead of AMD, NVidia is two leaps ahead of AMD, and both are working out their next tick-tocks while AMD's focus is making new middly cheap little APUs to outperform previous cheap middly APUs.
The FX-9590 is a whole lotta hot CPU but it just can't bench as much as an i7-5930K (let alone an i7-5960X), plus it can (at best) be run on an outdated RD990FX+SB950 chipset. AMD's just-announced-latest-greatest (and not quite available) "ultra-enthusiast" R9-390X GPU card appears to be yet another unspectacular rebadge/tweak of the ancient HD79xx Tahiti with some fancy new post-GDDR5 memory technology and not a whole lot else.
Press releases speculate (with good reason) that next iterations of PlayStation and Xbox consoles will be built around NVidia GPUs, not AMD APUs/GPUs. More and more enthusiasts, overclockers, modders, and gamers appear to be abandoning the Red Team - pure-AMD fanatics are now becoming a visible minority. Even game engines are now predominantly optimized to run on Intel/NVidia. I'm not sure how AMD fares in the workstation/enterprise market, aside from noticing that Xeon and NVidia offerings appear a whole lot more visible and powerful in comparison.
I'm thinking this is the end of AMD, at least in terms of high-end computing. They will one day be just another low-end junk maker like Acer and Gateway.
d_stilgar
03-20-2015, 08:31 AM
I had written a long post about how you shouldn't put AMD out to pasture yet. Then something happened and I lost it all. Here's the short version.
Nvidia Titan X is a beast, but it's also $1000. There's going to be a small market for that.
AMD keeps up well enough in terms of comparable cards:
http://www.hardocp.com/article/2015/03/16/asus_rog_poseidon_gtx_980_platinum_vs_amd_r9_295x2/8
The price of AMD cards is finally back to normal after the coin mining craze artificially raised prices.
4K is coming, and will be a huge deal. The biggest thing to look for in cards for ultra-high-res gaming is going to be VRAM.
The other major innovation that will make 4K playable will be Freesync/G-Sync. Freesync is AMD's, an open standard, and will have no licensing costs. G-Sync requires an Nvidia GPU and a special monitor, which adds ~$200 to the monitor's cost. This is going to be hugely limiting for people who want to get into 4K on a budget. AMD might very well win out, even though I don't think Freesync is quite as good as G-Sync (I'll need to look up the specific differences more to be sure).
Same goes for VR. Whatever card can cater to the different VR hardware requirements is going to be at an advantage. Something like G-Sync/Freesync for VR headsets will be really important, with a minimum refresh rate and framerate of 75Hz being critical.
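Putting numbers to that, the frame-time budget is just simple arithmetic:

# Per-frame time budget at a few refresh rates; the renderer (and display) have to
# finish inside this window or the previous frame repeats, which is very jarring in VR.
for hz in (60, 75, 90, 120):
    print(f"{hz:>3} Hz -> {1000.0 / hz:.1f} ms per frame")

So 75Hz leaves only about 13.3ms to render each frame, which is why the card needs real headroom.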
Konrad
03-21-2015, 12:43 AM
FreeSync and G-Sync both add component cost and complexity to the panel interface logic, a price premium built into every compatible monitor. I don't know if G-Sync's cost is determined by technical or by licensing/legal factors, although it would seem counterproductive for NVidia to actively restrict/oppose availability of a feature which encourages people to buy GeForce (and only GeForce) cards.
I'm still not entirely convinced that 3D and VR display technologies will ever be more than a niche market (or collection of niche markets). They've both been around in some form forever, long predating the currently available AMD and NVidia implementations. Yet they never really caught on. I think cost has been an important factor in this, but not by any means the only factor. 3D displays just don't really deliver intuitive/realistic 3D, they don't even work well for everyone. Immersive VR displays turn people into frozen zombies, hardly an interactive improvement over old flat panels. But then again, all it takes is one Killer App to bring demand for the tech into every household.
Everybody uses the terms "4K" and "120Hz" these days, along with some vague understanding that these are important things to look for in a display, although very few people bother to know what these terms really mean or why they are important. Still, I agree that these notions will drive the market for displays and GPU capabilities.
d_stilgar
03-21-2015, 08:18 PM
Well, high refresh rate (and corresponding frame rate) is really important for VR that doesn't make you sick. Super high res displays (like 4k) will help reduce aliasing by the nature of there being more subpixels to represent a line.
Freesync will be cheaper for a lot of reasons, but the biggest is that the way panels draw an image is a vestige of the days of CRT, where an electron beam would literally scan across phosphors to light up a screen. The rate at which that beam could scan determined the refresh rate. Screens don't work that way anymore, but they still "work" that way, which also brings about lots of issues, including tearing. There's no reason why screens today shouldn't have a dynamic refresh rate (I've been talking about this for 10 years) except for BIOS/firmware. Freesync started as a hack of existing hardware, no changes needed. G-Sync requires a proprietary board. So yes, it will be more expensive no matter what.
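Here's a little toy sketch of the difference - the render times are invented and real scanout is messier than this, but it shows why a fixed refresh makes finished frames wait (or repeat/tear) while an adaptive one doesn't:

# Fixed 60 Hz scanout: a finished frame waits for the next refresh tick.
# Adaptive sync: the panel refreshes whenever the frame is ready.
import math

scanout_ms = 1000.0 / 60.0                               # fixed 60 Hz window (~16.7 ms)
ready_ms = 0.0
for i, render_ms in enumerate([14, 18, 15, 22, 16]):     # hypothetical per-frame render times
    ready_ms += render_ms
    fixed_ms = math.ceil(ready_ms / scanout_ms) * scanout_ms   # next fixed refresh tick
    print(f"frame {i}: ready {ready_ms:5.1f} ms | fixed 60 Hz shows {fixed_ms:5.1f} ms | adaptive shows {ready_ms:5.1f} ms")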
I think VR is coming in a much bigger way than lots of people think, including many hardcore gamers. Sony will have the PS4/PC-compatible Morpheus. There's the Rift. There's Razer's open-source VR project. There's HTC/Valve's Vive. The race is on, and the impact on the gaming industry will be huge, but then it will spread to other media as well. The issues with VR in the past were that the resolution was too low, the refresh rate was too low, there was no head tracking or it had too much lag, the optics gave people headaches, there were no low-persistence displays, they were too expensive, and they were too heavy. We're at a point where all these technical challenges can finally be overcome.
The phone industry brought us very small, very high resolution displays produced in extremely large numbers. In VR prototypes these have been overclocked and hacked for low persistence and high refresh rate. These displays are lightweight. Optics are still being worked on, but they're getting better quickly. Before now, optics weren't the weak link in the chain, so they didn't get worked on (even though they're arguably the one technology that wouldn't have been prohibitively expensive to fix a long time ago). Low-lag head tracking is almost perfected. Gyroscopes, webcams, and lasers have gotten much smaller and cheaper in the past 25 years. All the vectors for really good VR are crossing. The thing about VR is that when a minimum standard isn't met it fails completely. But when those things are done well, it is completely engrossing.
I know I'm getting the HTC Vive when it comes out this Christmas. My wife is already planning on the expense. The consumer Rift will be $350 or less. I'm thinking the Vive might be as much as twice that, but it comes with two controllers and two satellite "lighthouses" that can track people within a room, so there's a lot more to it than the rift.
Konrad
03-21-2015, 08:43 PM
AMD has basically abandoned development of FreeSync. It's open source now ... I hope that works out well, even if it means some semi-official standards authority needs to coalesce. Who knows, maybe FreeSync will become a de-facto standard driven by low-cost market availability and demand? I expect FreeSync/G-Sync will become standard features on all but the cheapest displays soon enough. (And, having bought into G-Sync, I really hope NVidia doesn't stranglehold it with an iron fist, lol.)
Regardless, it seems like AMD will happily support whatever new (non-proprietary) display technologies emerge, but they're withdrawing from the race to pioneer new thresholds. A little worrisome to me that technological development in this area might essentially default to whatever a single company decides to work on.
AMD doesn't seem to be doing so well on the processor front, either. Intel is too dominant and too far ahead. Less worrisome, to me, since a lot of other non-Intel products are beginning to assert themselves. Intel always makes the best toys - but damn, they can be expensive!
d_stilgar
03-22-2015, 07:26 PM
Yeah, Intel does certain things well, and other than one build I did 10+ years ago, I've always bought Intel, but I read some reviews that more or less concluded that, unless you are planning on live streaming your gaming sessions, there's no reason to use Intel over AMD. And given the price difference, it's essentially a recommendation to use AMD.
Really, when comparing products on a performance/dollar scale, I'd say AMD and Nvidia are more or less on par. Nvidia has been the performance leader, but they've also commanded the higher price for their stuff. AMD and Nvidia also use different AA methods in their cards, which is qualitative (down to preference), and some people just prefer the AMD tech.
I still use AMD because I've found Eyefinity to be less cumbersome than Nvidia Surround. Now, it's been a few years since I built my computer, so maybe Surround is less annoying now, but I still have that memory. Also, I bought an Nvidia Shield around Christmas, and I've enjoyed it very much. It came with a nice controller too, which can be used on your PC if you have it wired . . . but only if you have an Nvidia video card. This is an unnecessary block and leaves a bad taste in my mouth. There's no reason that you shouldn't be able to use the controller over USB with an AMD video card other than the fact that they chose to run the driver through their video card software. Does it make me more likely to buy an Nvidia card? Honestly, no. It more just pisses me off a little bit about Nvidia as a whole.
Anyway, I'm no AMD fanboy, I just don't think things are as dire as you're making them out to be. People are fickle, and the video card buying decision is multidimensional. The decision on "what is the best card" has a lot more factors than it used to. People realize that and are buying cards taking all those factors into consideration.