
View Full Version : More Power?



progbuddy
06-17-2009, 11:38 AM
Here's something I've started thinking about recently. Why is it that computers keep needing more power, and more power, and more power as we go along? If you graphed this power draw over time, the curve would look like a bathtub. Sure, computers are becoming more efficient, but we still haven't tapped the vast majority of the performance they already contain through better software, compilers, etc. The majority of programs are still single-threaded and don't use the extra 1-3 cores. Sure, OpenCL is starting to be supported by some companies, but it isn't really growing at the rate it should be.

I remember my old Micron Millennia ran on 225 watts from an FSP power supply. Now, my desktop uses all of the 550 watts it can get, and PSU makers are pushing 1600-watt units at gamers (approaching the 15-amp breaker limit :think: ).
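(Rough back-of-the-envelope check, assuming a standard 120 V household circuit and a typical ~80% efficient PSU of that era: the breaker allows 15 A x 120 V = 1800 W at the wall, and the 1600 W rating is DC output, so a fully loaded 1600 W unit could pull around 1600 / 0.8 = 2000 W from the outlet. So yes, that really is past what a single 15-amp circuit is meant to supply.)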

Pretty soon, we're going to start having computers with their own self-contained nuclear reactors to supply 3 Gigawatts to the mainboard.

Airbozo
06-17-2009, 12:03 PM
That's because it's easier to throw hardware at a problem than it is to re-train coders to take advantage of the existing hardware. This cycle has been going on since I started in computers (30 years ago) and likely will continue.

When I worked at SGI, I attended many trade shows and training sessions, and THE hardest thing was convincing programmers to re-learn how to code so they would utilize the features already present in the hardware and make things work better and faster.

progbuddy
06-17-2009, 02:37 PM

Yeah. What I just don't get is why they don't write one program that makes all the others utilize the cores more efficiently.

x88x
06-17-2009, 03:03 PM

Because that's a lot harder than it sounds. Any application that wanted to take advantage of that program would probably have to be written specifically to work with it, and throwing that extra layer in there would definitely cost some efficiency, and possibly some performance as well. But the main point is that programmers would still have to change how they code, and if you're gonna change anyway, you might as well just learn to multi-thread.
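For the curious, here is roughly what "learning to multi-thread" looks like at its simplest in C with POSIX threads. This is just a sketch; the thread count, array size, and work-splitting scheme are made up for illustration, but the fan-out/fan-in pattern is the part that actually spreads work across cores:

/* A minimal fan-out/fan-in sketch: split an array sum across a few threads.
   Compile with: gcc -std=c99 -pthread sum.c -o sum
   (thread count and array size are arbitrary, just for illustration) */
#include <pthread.h>
#include <stdio.h>

#define N_THREADS 4          /* pretend the machine has 4 cores */
#define N_ITEMS   1000000

static double data[N_ITEMS];
static double partial[N_THREADS];   /* one slot per thread, so no locking needed */

/* Each thread sums its own slice of the array. */
static void *sum_slice(void *arg) {
    int id = *(int *)arg;
    int chunk = N_ITEMS / N_THREADS;
    int start = id * chunk;
    int end = (id == N_THREADS - 1) ? N_ITEMS : start + chunk;
    double s = 0.0;
    for (int i = start; i < end; i++)
        s += data[i];
    partial[id] = s;
    return NULL;
}

int main(void) {
    pthread_t threads[N_THREADS];
    int ids[N_THREADS];

    for (int i = 0; i < N_ITEMS; i++)
        data[i] = 1.0;

    /* Fan out: one worker thread per (assumed) core. */
    for (int i = 0; i < N_THREADS; i++) {
        ids[i] = i;
        pthread_create(&threads[i], NULL, sum_slice, &ids[i]);
    }

    /* Fan in: wait for every thread, then combine the partial sums. */
    double total = 0.0;
    for (int i = 0; i < N_THREADS; i++) {
        pthread_join(threads[i], NULL);
        total += partial[i];
    }
    printf("total = %.1f\n", total);   /* should print 1000000.0 */
    return 0;
}

In principle that keeps four cores busy instead of one; the hard part in real programs is deciding how to split the work and how the threads share data, which is exactly the re-learning being talked about above.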

As for the power, the main power hogs in modern PCs are the graphics cards. I think my GTX260 pulls about as much power as the rest of my rig, if not more. Also, manufacturers push PSUs WAY more powerful than their systems need. For example, if I had purchased my current rig from an OEM, they probably would have put in at least an 800W PSU, if not more... and yet it churns away happy as can be on a 550W. Or take a system I built for work a while ago: dual quad-core Opterons, 32GB of RAM, and 6 SATA HDDs. Dell would probably put their 1kW or 1.2kW PSUs in it, but it's perfectly happy on an 850W, and it usually only pulls about 300-400W. So, yes, PC power usage is going up, but not at anywhere near the rate that manufacturers would want you to believe.
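(To put numbers on that last one: 300-400 W is only about 35-47% of that 850 W unit's rated capacity, so there's enormous headroom, and that mid-load range is also roughly where most supplies of that era were at their most efficient.)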

progbuddy
06-17-2009, 04:05 PM

But that small loss in efficiency would look negligible next to the huge amount of extra processing power you'd get from a program that used every core and processing unit inside the computer. Also, graphics cards don't really pull as much power as people think; it just seems that way because they run off the 12-volt rail. In actuality, the 4870 in my desktop pulls all of about 120-130 watts max from the power supply, whereas the CPU pulls about 95 watts max. The GPU core can push out around 1.2 TeraFLOPS, whereas the CPU manages about 1/12 of that, yet the CPU still does most of the work.

Big question mark.
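(Running those rough numbers: 1.2 TFLOPS / ~125 W works out to roughly 10 GFLOPS per watt for the 4870, while 1/12 of that (~0.1 TFLOPS) / 95 W is only about 1 GFLOPS per watt for the CPU, so on paper the GPU is around ten times more efficient per watt, at least for the kind of math it's good at.)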

x88x
06-17-2009, 06:05 PM
But that small loss in efficiency would look negligible next to the huge amount of extra processing power you'd get from a program that used every core and processing unit inside the computer.

You're probably right here, but unless it were integrated at the OS level, below all the handles that developers normally use, there would still be the problem that programs have to be written (or rewritten) to call that middle-man program instead of dealing with the OS. So, personally, if it were my program, I would much rather redo it multi-threaded than redo it to call this special program. Same problem (i.e., having to learn a new way of doing the same things), just a different end point.


Also, graphics cards don't really pull as much power as people think; it just seems that way because they run off the 12-volt rail. In actuality, the 4870 in my desktop pulls all of about 120-130 watts max from the power supply, whereas the CPU pulls about 95 watts max. The GPU core can push out around 1.2 TeraFLOPS, whereas the CPU manages about 1/12 of that, yet the CPU still does most of the work.

Big question mark.

The problem there is partly one of hardware and partly one of coder laziness, but mainly a problem of hardware. What it really comes down to is that CPUs are much more versatile than GPUs. Yes, GPUs beat the sh** out of CPUs in certain areas, but in other areas CPUs excel. The other hardware problem is that GPUs do not have a unified architectural interface like CPUs do. Part of the x86 standard is a unified assembly language interface. Yeah, it's not the most organized thing, but it is a standard: you go to any x86 CPU (AMD, Intel, VIA, whatever) and you know you can use x86 assembly. GPUs, on the other hand, do not have a unified standard yet. AMD has FireStream, nVidia has CUDA, Intel has... actually, I don't think Intel has anything yet, probably because they don't really make very powerful GPUs.

More important, though, is that only a minority of the computing population has GPUs powerful enough to make implementing such a thing useful. This is changing, though, as more capable GPUs get cheaper (like the GeForce 9400), and as a result there have been applications popping up recently that take advantage of them (such as BadaBoom (http://www.badaboomit.com/)). I think as more and more manufacturers start integrating powerful GPUs into their designs, this trend will most likely accelerate, but that is where it really needs to start. Nobody wants to start making performance tires if there aren't any roads yet (not a perfect analogy, I know, but hey). Basically, the CPU does most of the work because it's more versatile and it's been around a lot longer.
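Worth noting that OpenCL, which came up at the start of the thread, is an attempt at exactly that kind of vendor-neutral interface: the same code is supposed to run on AMD, nVidia, or even a plain CPU. Just as a rough sketch of what it looks like from C (the file name and the trivial "multiply every element" kernel are made up for illustration, and real code would check every error value):

/* A rough OpenCL sketch: run a trivial "multiply every element by 2" kernel
   on whatever device the runtime offers (GPU if present, otherwise CPU).
   Compile with something like: gcc -std=c99 vecscale.c -lOpenCL
   Error checking is omitted to keep the shape of the code visible. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/cl.h>
#else
#include <CL/cl.h>
#endif

/* The kernel: one work-item per array element, same source for any vendor. */
static const char *src =
    "__kernel void scale(__global float *data, float factor) {\n"
    "    int i = get_global_id(0);\n"
    "    data[i] = data[i] * factor;\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float host[N];
    for (int i = 0; i < N; i++) host[i] = (float)i;

    cl_platform_id platform;
    cl_device_id device;
    cl_int err;
    clGetPlatformIDs(1, &platform, NULL);
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS)
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    /* The kernel source is compiled at run time, for whatever device we got. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kern = clCreateKernel(prog, "scale", &err);

    /* Copy the array to the device, run N work-items, read the result back. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(host), host, &err);
    float factor = 2.0f;
    clSetKernelArg(kern, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(kern, 1, sizeof(float), &factor);

    size_t global = N;
    clEnqueueNDRangeKernel(queue, kern, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, sizeof(host), host, 0, NULL, NULL);

    printf("host[10] = %.1f (expected 20.0)\n", host[10]);

    clReleaseMemObject(buf);
    clReleaseKernel(kern);
    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}

The catch, and the reason this hasn't taken over yet, is all that setup boilerplate plus the fact that the kernel itself still has to be written and tuned separately from the rest of the program, which is the same "re-learn how to code" hurdle as with plain multi-threading.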


Point of interest, your CPU also runs on a 12-volt rail ;)

Airbozo
06-17-2009, 07:15 PM
Great discussion here.

BTW, Intel does have something similar to CUDA: it's the x86 instruction set with some additional instructions specifically for the GPU. It's for the next generation of CPUs that will have the GPU on-die, called Larrabee. From the early looks it will give nVidia and ATI a run for their money in the low-end graphics arena initially, then start pushing the performance toward the high end. From the briefings I have been in, if the performance is only half of what Intel claims, it will be a threat to the big GPU makers. In time. This is only good news for us computer modders.

Another BTW: Intel holds some patents on some very high-end graphics stuff that has historically been used only in SGI supercomputers.

progbuddy
06-17-2009, 09:21 PM
The new version of OS X supposedly has OpenCL built in, along with a fully 64-bit architecture. Yay for unlocking GPU power.

x88x
06-17-2009, 10:05 PM
Great discussion here.

BTW, Intel does have something similar to CUDA: it's the x86 instruction set with some additional instructions specifically for the GPU. It's for the next generation of CPUs that will have the GPU on-die, called Larrabee. From the early looks it will give nVidia and ATI a run for their money in the low-end graphics arena initially, then start pushing the performance toward the high end. From the briefings I have been in, if the performance is only half of what Intel claims, it will be a threat to the big GPU makers. In time. This is only good news for us computer modders.

Another BTW: Intel holds some patents on some very high-end graphics stuff that has historically been used only in SGI supercomputers.

Interesting; I knew they were planning on pushing their GPUs further with Larrabee, but I didn't know they were going that far. Should be an interesting year.

Drum Thumper
06-17-2009, 11:28 PM
I can remember my godfather's brother (who happened to work for Intel at the time) telling me about what was on the horizon back in 1992, when Clinton gave the order to declassify a whole lot of military affairs. He couldn't be all that precise, but he basically said that they (the military) were about two generations ahead of what the civilians were getting to use.

He also had a very funny story about a P90 that he fried with a drop of sweat. Long story short, he was stress testing the machine and took the heat sink off. Drop of sweat off the brow, no more computer.

x88x
06-17-2009, 11:34 PM

Yeah, kinda like how universities are all supposed to be either two generations ahead or two generations behind... Smaller group + better funding + easy access to the people developing the new stuff + able to test stuff better than the general populace = getting better toys faster. Doesn't make it any less frustrating though :D