
View Full Version : And everyone told me it wouldn't happen :(



overdosedelusion
08-04-2006, 11:41 AM
Oh bugger... (http://www.bit-tech.net/hardware/2006/07/24/how_it_all_shakes_out_for_4/1.html)

Well, I told people, but they didn't believe me.

High-end gaming is going to be cleared out to make way for the CPU/GPU integrated chip. This unfortunately means that games will no longer be able to be played with all the sparkle and luster of a 7900GT; instead, we'll be left with mediocre graphics just so that we can have PCs the size of a palmtop for "family" entertainment. And to make it all worse, we won't even be able to choose our components. Fans of the ATi/Intel and AMD/nVidia combos will not be best pleased. AMD have surely dug a hole for themselves with this merger.

By 2008/2009 we are going to start seeing a dramatic change in the games market. I mean, what are they going to do, produce games solely for the high end? Or mass-market the mid end? Wonderful, eh? Guess we'll have to get used to playing games with blotchy aliasing, poor rendering, and physics in league with MoH: Allied Assault. And I don't know about you guys, but I can't just jump onto console. I need a fecking keyboard and mouse, not some dodgy controller that you practically have to guess with.

It's a sad, sad day :( I need to game, my spirits are low.

DaveW
08-04-2006, 01:05 PM
Hmm... I have my doubts. In order to build the GPU into the CPU, it would take a complete rewrite of everything we know about computer architecture. Games will probably be less compatible, and will probably be classed as 'legacy systems' while the architecture changes. But this has been happening for a while now. I don't think you need to worry about this; it's only a prelude to a huge war between Intel and AMD. Everything we've seen so far is only scuffles and arguments.

Which means massive price drops and technological leaps for us.

Conroe vs. the new giant? Now THAT will be a war. What worries me is the longevity of my brand new X1900XT. Will future games be compatible with it?

If you ask me, this means that nVidia has pretty much won the graphics card battle now. The competition is owned by AMD, and by the time they've sorted out their affairs, nVidia will be miles ahead technologically.

However, if AMD/ATI were to buy Ageia...THAT would be a very interesting development indeed.

-Dave

Omega
08-04-2006, 01:43 PM
If AMD/ATi were to buy Ageia, then it's game over for Intel and nVidia, unless they team up and find another physics processor company...

overdosedelusion
08-04-2006, 01:44 PM
Actually, everything in the article confirms what my mum's bf has been telling me; he's an Intel engineer (number 1 in the UK, apparently :S). Anyhow, he told me all this before the partnership even happened. Intel actually plan to get rid of their gaming side, as it's such a small market compared to business use. I passed off what he said as complete bull, like everyone is saying to me. But I do believe it now that this partnership has arisen.

If this is supposed to be a war, then surely AMD should have snapped up nVidia, being the leading graphics card provider? nVidia have now also licensed Intel (according to my mum's bf) to use their SLI technology. So basically, we have an Intel/nVidia vs AMD/ATi, and with the new Conroe chips, Intel will slaughter AMD. Thus, Intel will have their way and we will start seeing CPU/GPU architecture become standard. My mum's bf was saying that AMD are just trying to be clever (I hope to god this isn't true, as I'd be happy with either ATi or nVidia) and that Intel are going to seize the opportunity. nVidia will slowly become nothing (along with all their partners, who will either have to switch to the red team, or die out).

I too worry about the long-term use of my 7950 card; in 1-2 years it won't be worth the silicon it's made from.

Though I do hope that new companies will rise from the ashes and produce hardware for gaming. Perhaps nVidia could create their own CPU? It would be hard. ATi are out the window, but it could open a window for 3Dlabs or Matrox. nVidia have the technology to make their own motherboards, CPUs and graphics cards. I just hope this doesn't spell the end for PC gaming, but I think there is about a 98% chance that it will.

nil8
08-04-2006, 02:00 PM
The end of PC gaming? WHAT?!
The first rule of business is to provide your product to your customers. Gaming is a massive industry and PC-related companies definitely thrive off large parts of it.
Come on, the end of gaming?

What I'm predicting is that as the integrated technology advances, it will be used in business-class desktops, and different equipment will be available for enthusiasts.
Nothing will realistically change in either market, just technology doing what it always does. End of gaming...bah.

I'm glad AMD did this. They needed to branch out, and ATI has a solid customer base and a good product. Let's hope AMD can keep their research dept up against nVidia.

Besides this, game developers will design games to run within certain specs, just like they do now. I highly doubt developers will 'utilize' ATI or nVidia more than they do now. That would cut down on the customers willing to spend the dough for the game and possibly a new graphics card.

You have to view what companies do with a dollar sign in mind. They do.

overdosedelusion
08-04-2006, 02:13 PM
The end of PC gaming? WHAT?!
The first rule of business is to provide your product to your customers. Gaming is a massive industry and PC-related companies definitely thrive off large parts of it.
Come on, the end of gaming?

What I'm predicting is that as the integrated technology advances, it will be used in business-class desktops, and different equipment will be available for enthusiasts.
Nothing will realistically change in either market, just technology doing what it always does. End of gaming...bah.

Actually, it's a minuscule market. Just think: your local school probably has more PCs than everyone on this forum combined. Now multiply that by how many schools there are in the district, and multiply that by how many districts there are in the world. And that's just schools; think about huge companies. The gaming market for PCs is so small that it wouldn't be missed. Companies will simply produce games for consoles instead.

GT40_GearHead
08-04-2006, 02:28 PM
Just think: your local school probably has more PCs than everyone on this forum combined


Be serious, man! .....

You want to tell me that your school has 5,295 comps? It may be America, but that I don't believe!
BTW: tell me what everybody does sooner or later on their home PC. They PLAY, so every PC owner is a potential customer...
I don't see how you can call that negligible.

overdosedelusion
08-04-2006, 02:46 PM
Be serious, man! .....

You want to tell me that your school has 5,295 comps? It may be America, but that I don't believe!
BTW: tell me what everybody does sooner or later on their home PC. They PLAY, so every PC owner is a potential customer...
I don't see how you can call that negligible.

Hm, maybe I underestimated how many people are on this forum. But seriously, my college has about 1,000, and there are about 8 colleges in my little area.

Silenced_Coyote
08-04-2006, 02:47 PM
This quote was taken off the bottom of page 1.

As functionality aligns and re-aligns, it makes more and more sense to create an integrated chip at the heart of the computer that handles all processing, be it system related, memory related, graphics or physics related. As we gain the ability to put more and more cores on CPUs, expect to see some of those cores dedicated to graphics processing, especially as 3D graphics capability becomes more important (cf Windows Vista).

From what I can tell, it doesn't sound like the end of PC gaming to me. You talk about a future game that doesn't have all the "sparkle and luster", yet why would they add more cores so that the CPU can handle more 3D-intensive stuff if that were the plan? You might be thinking more along the lines of what integrated graphics on the motherboard did. Just because they are putting graphics and physics on the CPU, it doesn't mean that it is going to suck for PC gaming. That is why they are adding more cores.

I don't see why it can't be good for PC gaming like the cache on CPUs and the integrated memory controller.

overdosedelusion
08-04-2006, 02:50 PM
Well, I'm just going off what my mum's bf has been saying to me, and I seriously have no reason not to believe him. I see an end, or at least a dismal future, for PC gaming. And we will lose high-definition gaming. Don't come crying to me though :P I warned you. If I'm wrong I'll be jumping for joy. I wouldn't say it if I didn't believe it.

Omega
08-04-2006, 02:53 PM
Be serious, man! .....

You want to tell me that your school has 5,295 comps? It may be America, but that I don't believe!
BTW: tell me what everybody does sooner or later on their home PC. They PLAY, so every PC owner is a potential customer...
I don't see how you can call that negligible.


And that's assuming we only have one computer each.

I have *thinks* 4 running computers alone.



Gaming will never die. Just because personal and professional computers are getting smaller doesn't mean that enthusiast computers will, plus nothing says we can't have power.

Silenced_Coyote
08-04-2006, 02:59 PM
I still don't see how it is that dismal.

I'm not that bright of a kid and this is how I see it. :)

4 GPUs + 4 cores on the CPU.
Throw them all together in a baking sheet (lined with foil).
Cook for 6 years.
Now we've got a super CPU with 8 cores. It does all the stuff a quad-core CPU can do and the stuff quad SLI can do too!

.Maleficus.
08-04-2006, 03:04 PM
Well, in bigger cities, overdosedelusion may be right. Think of cities like Chicago and New York City. When you have that many students, you need that many computers. Not to mention that each classroom has at least 1 computer, sometimes 4 or 5.

The community of "hardcore gamers" is very small. Yes, lots of people play games, but a lot of that is Flash games and games that don't require ultra-fast systems. It may seem like a lot of people are willing to spend $3000+ on a gaming computer, but 100,000 people isn't all that many when you really think about it. How many people are in the world? It's something like 5 or 6 billion. Next to that, the "hardcore gaming" market seems very small. Most average people these days buy computers for a few things: internet, media (music, movies, etc.) and work (school, jobs, etc.). Thinking that way, it looks OK to have an integrated GPU. Not many people need the power of a 7950 GX2, so why continue making them? You will still probably be able to build a good gaming rig by today's standards in the future, but you'll also end up playing today's games.

If game companies do start making games for the new integrated GPU/CPUs, then they will either be taking a step back in graphics, or AMD/ATI will have to find a way to get the power of an X1900XT into that small a chip and still have room for the rest of the CPU. And if they take that step back, then they are very, very dumb.

I hope the games coming out soon are good, because they look like the last of the good-graphics games to me.

(BTW, I don't read much about any of this stuff, so if any of it is wrong or dumb sounding, please tell me.)


Edit: Wow, 4 posts before I finished writing mine lol.

GT40_GearHead
08-04-2006, 03:29 PM
The big problem I see is upgradeability, or the lack of it.

nil8
08-04-2006, 03:57 PM
I understand the numbers and the differences between work machines and gaming machines. I understand the needs of both, and I will tell you that PC gaming doesn't have a bleak future or a weak future or anything of the sort. Gaming as a market is growing in both the console and PC spaces.

Besides, how many diehard gamers buy an HP or Dell? Not many. I'd go so far as to say it's rare. Manufacturers deal with their large contracts and then focus on the rest of us. You know, the ones that build stuff and are interested because we enjoy it. We're the people that need to be kept happy as customers, because we are the group that knows what the hell we're talking about and knows when a company doesn't treat us right.

Larger colleges can easily have 6,000 PCs, but a lot of them are owned by the students, and a decent number of students are PC gamers. Colleges, hospitals, accounting, research firms, etc. Anywhere there is a lot of data retention or use, you will find a lot of PCs.

Looking at the world's population is an unfair estimate. Lots of people in the world don't have one computer, much less a gaming rig.

If the two do become integrated, and core technology keeps improving the way it should, we could just have one really large processor, which would simplify things.

Cyber cafés will require good hardware; pro gamers, enthusiasts, and business types will always demand better, faster, smoother. The world of 8, 16, 32-bit is gone. It won't come back, and everyone will demand more. They always have.

Despite all this, PC gaming itself has had a HUGE revolution in the past few years. How many people in the world have played CS or WoW? How many of those people will continue to play games? What about the professional circuits that have risen in the past few years?

When you're doing something interesting, people pay attention to what you're doing.

That statement sums up PC gaming over the past 8 years or so, and it will keep doing so.
PC games as a valid form of entertainment are here to stay, and until you prove your case otherwise, I won't believe you.

The statement you're making is like saying that digital cameras will disappear because one major company has changed the way their cameras take pictures. Nothing disappears, it just shifts. Some people integrate the technology into their scheme, some don't.

And I run 6 boxes, 2 of which I can game on.

overdosedelusion
08-04-2006, 07:31 PM
I hope the games coming out soon are good, because they look like the last of the good-graphics games to me

Believe me, they are. Gaming PCs WILL BECOME MID-RANGE, and eventually a gaming machine will become obsolete. Extreme gamers are already an extreme minority, and the minority is going to get smaller. Deal. I was even advised not to get my current rig (from my mother's bf... again... he knows a lot, okay?) as upgrades will cease to exist for it; unless of course nVidia bring out some new cards, I won't be able to upgrade any further at all, except for, say, the FX-62 or quad SLI.

PS. I'm not trying to jinx it or anything :P

.Maleficus.
08-04-2006, 10:35 PM
The statement you're making is like saying that digital cameras will disappear because one major company has changed the way their cameras take pictures.

Actually, that's not a very good comparison. You have to take into consideration the number of digital camera manufacturers versus the number of CPU and graphics card manufacturers. When there are 2 CPU manufacturers and 2 major graphics card manufacturers, any change is a big one. There are tons of companies that make cameras, so, you're right, the whole market won't change. But when 50% of an entire industry changes, there is a big difference. I guess a huge chip for both the CPU and GPU is possible, but I'm not sure about that.

But, I could be wrong, and I hope I am.

mikeroq
08-04-2006, 11:13 PM
I really hate AMD buying ATI; if anything, they should have bought nVidia. You always see the Intel/ATI and AMD/nVidia combos; SLI came out on AMD boards first. I for one will die when computer games become half-assed games comparable to a PlayStation.

overdosedelusion
08-04-2006, 11:19 PM
I really hate AMD buying ATI; if anything, they should have bought nVidia. You always see the Intel/ATI and AMD/nVidia combos; SLI came out on AMD boards first. I for one will die when computer games become half-assed games comparable to a PlayStation.

Is two years from now OK with you? ;)

Slug Toy
08-05-2006, 02:39 AM
Everyone seems to be talking about this like it's the end of the freaking world. It's not... just so you know.

Sure, this means that there could be shake-ups, 180-degree turns in the way things are done... new business models and new hardware. That's old news though. The Athlons WERE something new and they WERE a big change when they were new. Pentiums WERE a big deal when they first came out. And of course, the first video cards WERE something to talk about. I didn't see the world end because of any of those things though.

It does mean that we will start to see more AMD-ATi combinations, but I'm sure they won't make it so that if you have an AMD processor you HAVE to have an ATi GPU... that cuts out a lot of market share right there and it doesn't make any goddamned business sense.

It means Intel probably won't like ATi anymore, and I've already seen that they are planning to pull ATi from their list of chipset suppliers. It probably means nVidia won't like AMD that much anymore either, but I haven't seen any signs of action taken yet.

You know what else this means? Wild speculation, as usual. There was another thread about this a while ago where I shared my hopes for the future of this AMD-ATi deal, but just because I said it doesn't mean any of it's true. I'm pretty sure all we have to go on right now is guesswork and rumours.


I was even advised not to get my current rig (from my mother's bf... again... he knows a lot, okay?)

I'll bet you the reason he advised against getting anything at the time is because he knew Conroe was in the works. I've been telling everyone to hold off until about now, and it's really paid off. Yes, he was trying to do you a favour, but maybe not for the reasons you think. I have my doubts that upgrades will disappear and things will dry up. If the money is there (which it is) and the demand is there (again, which it is)... the show will go on.


I for one will die when computer games become half-assed games comparable to a PlayStation

You should be dead many times over then. For every good game (FEAR, Half-Life 2 to a certain extent) there are quite a few games that just take an engine and throw in some textures, models, and maps. Heck, with the Hammer editor in Half-Life 2 and the mod options, you basically ARE creating your own game, and it really isn't all that hard once you run through it a few times. For instance... in three nights I created a 75% destructible Italian villa for CS: Source with proper lighting, wind, and ambient sound. It looked good and played well, but it really was half-assed on my part. It just goes to show that it's easy to make a simple game, and there are lots out there.


So... don't get yourself worked up over anything just yet. This whole AMD-ATi deal is new, and only the people in the loop at the companies know what's going to happen. Just sit back and enjoy the ride.

Cool1Net6
08-05-2006, 05:02 AM
... blah blah blah ...
Thank you! You got to this before me (and more in depth). People, it's not the end of the world. PC gaming won't die, nor return to pre-2000 status. Think of it this way...

There are different grades of CPUs. Some are for doing light work, like surfing the web; others are for heavy, intensive tasks like video rendering and other number-crunching. Then there are the middle-of-the-road CPUs. Like the article said, most likely graphics rendering technology will end up being rolled in as a dedicated core. This means that the low-end CPUs may have a weaker GPU core, or none at all; the middle-of-the-road CPUs will have a middle-of-the-road GPU core; and the high-end CPUs will have the high-end GPU cores. You still get to game like hell, and your graphics will always match up perfectly with your processor.

You'll just have one less card to worry about.

-Cool-

mikeroq
08-05-2006, 01:03 PM
Knowing how companies work these days, AMD will probably make it so you have to use their CPU/GPU chips if they integrate it all into one chip. On the other hand, they should just socket graphics and give them some DDR slots so you can upgrade your GPU and your GPU RAM.

DaveW
08-05-2006, 02:08 PM
What you're saying is ludicrous. If AMD weren't interested in gaming, then they wouldn't have bought ATI, full stop. Would you buy a company as powerful as ATI then run it into the ground? I don't think so. You're panic-mongering, and really, I don't think your statistics are anything more than speculation.

-Dave

nil8
08-06-2006, 06:13 AM
I blather on and on post after post and Dave sums up what I've been trying to say in 2 sentences. Bah.

overdosedelusion
08-06-2006, 01:56 PM
Well, of course we can still play Far Cry at medium settings, but it will probably end there.

"We are not going to see bleeding-edge graphics on processors, because there will not be capability or capacity. We will see low and medium end, but not high. But this will, by definition, make high-end graphics an increasingly niche product which could push prices up and decrease configuration flexibility. When you can have a tiny system with almost everything on one chip, why add cost to the system by including a graphics expansion slot? In the long run, we could see a situation where, as gaming becomes popularised by the increased availability of 3D graphics through integration, high-end gaming becomes increasingly niche."

EDIT - been asked to remove some of this by Richard.

Slug Toy
08-06-2006, 03:00 PM
"We are not going to see bleeding-edge graphics on processors, because there will not be capability or capacity. We will see low and medium end, but not high. But this will, by definition, make high-end graphics an increasingly niche product which could push prices up and decrease configuration flexibility. When you can have a tiny system with almost everything on one chip, why add cost to the system by including a graphics expansion slot? In the long run, we could see a situation where, as gaming becomes popularised by the increased availability of 3D graphics through integration, high-end gaming becomes increasingly niche."

Man, where are you getting these quotes from? It looks like something from the Inquirer. A lot of doomsday talking.

The fact is that you probably can't create super GPUs for a CPU socket because of bandwidth issues. The best you could get away with is mid-range. That sure as hell doesn't mean that high-end stuff is just going to shrivel up and blow away. Gamers wouldn't let that happen; game makers wouldn't let that happen. If Intel started making boards without expansion slots... they would probably be shooting themselves in the foot, and then the head.

The only way you can get away with having high-end graphics on a motherboard is if you do something like in the 360. Even then, you need the space and the parts to do it, and that means you can't upgrade it.

I think you might be jumping to conclusions about Intel's plans for boards without expansion slots. If you have no way to put a good sound card and video card in, there has to be something to make up for it. No one will buy a board without the capability to do anything... that's just retarded.

All in all... stop it before everyone goes crazy. Let's just go "ooh, ah... good times man."

overdosedelusion
08-06-2006, 05:01 PM
I think you might be jumping to conclusions about Intel's plans for boards without expansion slots. If you have no way to put a good sound card and video card in, there has to be something to make up for it. No one will buy a board without the capability to do anything... that's just retarded

I'm sure A LOT of people would rather buy something cheap that has everything, rather than lots of different parts: schools, businesses, families wanting small entertainment systems with all the basics, something people can do small tasks on. And surely if everything is onboard, it DOES have the capability to do everything a standard system has anyway, only as cheaply as ITX boards. Anyhow, I'm not getting into heated debates about it lol, I just hear what Mr Intel tells me, and then slowly wonder what's gonna happen.

EDIT - been asked to remove some of this by Richard.

nil8
08-06-2006, 11:23 PM
You can have full integration now. Almost all business-class manufactured desktops do.

Doesn't mean anything to the hobbyists.

MitaPi
08-10-2006, 09:26 PM
Is the main concern here high-end gaming? If it is... then as long as we (the gamers) demand the further production of high-end GPUs, CPUs, etc., there won't be an end. I have always thought that graphics in games will stop improving only when they can't be made any better (aka so real that you can't tell the difference between the game and real life). So I guess what I am saying is... as long as gamers who want to push graphics to the edge and then some exist in quantities enough to persuade a company (AMD & ATI) to keep producing better and better graphics, we won't see a drop in quality. IF that is what everyone is saying... I'm not entirely sure. I feel lazy right now and didn't want to read through it all.

If I am wrong then please just ignore me ^_^ I usually don't know what I am talking about but still like to throw in my half cent.

d_stilgar
08-11-2006, 12:34 AM
Even if chips are combined into CPU, GPU, PPU and anything else you can think of, I think we will be OK in the gaming community. Processors are moving more and more to multiple cores. You get a quad-core or an eight-core (octo-core?) processor. Hmm... 2.66 cores per job? We will have amazing performance in the palm of our hands.

GT40_GearHead
08-11-2006, 02:19 AM
We will have amazing performance in the palm of our hands.


But almost no upgrade potential.

MitaPi
08-11-2006, 04:51 AM
Well... in the palm of our hand... I don't see why you would want to upgrade it if it was literally in the palm of your hand. lol But I don't think that's what he meant, huh? You can upgrade iPods... but they're not very cool upgrades and they don't really make a difference lol. Oh well...

d_stilgar
08-11-2006, 05:14 PM
If we have multicore, then it's kinda like RAID. I can have two cores working on one thing, two on another, and four doing the main number crunching. If I have eight cores at 2.4GHz and I need more speed, I just buy a new CPU at 3.4GHz.

RAM is upgradeable too. 4 gigs of DDR3 in quad channel; RAID for our RAM. We can also decide how much is dedicated to what: 1 gig for GPU, 1 for PPU, and 2 for CPU. And all of this in a laptop.

We are looking at the future of gaming as a very affordable thing, and much more efficient and better looking than it is today.

Why is it that GPUs haven't broken 1GHz yet? One reason is the form factor: it limits how well we can cool the chips. Another is the bus speed from the card to the mobo. Putting all this onto the same chip would end that problem, and gaming performance would come down to software and tweaks.

But that is just if all this happens.
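The "two cores on one thing, four on another" idea above can already be sketched in software today. Below is a purely illustrative sketch (the function name and the workload are made up for this example, not anything from the thread): an OS won't literally dedicate cores to a GPU, but splitting independent chunks of work across however many cores the chip has is exactly how multi-core processors get used.

```python
# Hypothetical sketch of dividing independent jobs across CPU cores.
from multiprocessing import Pool, cpu_count

def crunch_chunk(chunk_id):
    # Stand-in workload (physics, rendering, whatever a core is "dedicated" to).
    return sum(i * i for i in range(chunk_id * 1000, (chunk_id + 1) * 1000))

if __name__ == "__main__":
    cores = cpu_count()  # e.g. 8 on the imagined octo-core chip
    with Pool(processes=cores) as pool:
        # The pool hands each chunk to whichever core frees up next.
        results = pool.map(crunch_chunk, range(cores * 2))
    print(f"{cores} cores crunched {len(results)} chunks")
```

The nice part is that the same code simply finishes faster on a chip with more cores, which is the whole appeal of the multi-core future being discussed here.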

DaveW
08-11-2006, 07:48 PM
Interesting point of view, d_stilgar. I actually hadn't thought of it from that angle.

In my head, I wouldn't have put them on the same chip, but now that you've pointed out the advantages to me... I see how it would work pretty well.

-Dave

nil8
08-12-2006, 03:02 AM
Yes, the only drawback is heat. Lots and lots of heat.

meticoeus
08-12-2006, 03:25 AM
OK guys, has anyone stopped to think about the pointlessness of this thread? If for no other reason than the rather silly assumptions that seem to be being made (granted, I'm assuming this, so I guess it can't be helped). Basically, it doesn't seem anyone is taking into account the advances that the near and relatively near future should bring.

Since we are nearing the limits of current tech in many areas, new methods must be utilized. This isn't new info; we've known it for some time, and progress is being made. Quantum mechanics (at least, our increasing understanding of it) is being utilized to find better ways of building computer components, which will ultimately lead to much smaller and much faster computers. Why would any steps be taken backwards?

Even in the worst-case scenario, where everything becomes standardized and nothing can be upgraded easily, we are still talking about technology that is kilometres or even miles beyond what we are currently familiar with. Sure, most games of the present will eventually be relegated to the realm of emulators (if that works), just as nearly any game from a decade ago is difficult to play unless you get together the right software or even hardware to run it.

I agree and think this is nothing but panic-mongering, and it's kind of pointless because no one here really knows what the future holds (unless someone here is part of a research team making whatever that might be, but you've been pretty quiet so...)

DaveW
08-13-2006, 06:53 AM
Even in the worst-case scenario

Do not utter those words here!


this is nothing but panic-mongering

There is an element of panic-mongering, but I think this is mostly just contemplating the future of the industry. Although I suspect all the points were made a couple of posts ago ;)

-Dave

meticoeus
08-14-2006, 02:36 AM
Do not utter those words here!

Wow, oops. I was completely not thinking about the site name when I was typing that, lol.

monoflap
08-14-2006, 01:02 PM
"We are not going to see bleeding-edge graphics on processors, because there will not be capability or capacity. We will see low and medium end, but not high. But this will, by definition, make high-end graphics an increasingly niche product...
I know I'm a little late with this one :redface: but you're forgetting about the 800-pound gorilla in the room: the movie industry! Millions of people watch movies with all sorts of computer animation, and directors sure as hell aren't going to wait 5 years for a scene to finish rendering. OK, NOW all the points are made :p.