
Thread: The black art of silicon

  1. #11
x88x — Will YOU be ready when the zombies rise?
    Join Date
    Oct 2008
    Location
    MD, USA
    Posts
    6,334

Re: The black art of silicon

    Good grief, dude, enough with the conspiracy fearmongering bull**** (not you, Konrad). There are so many things wrong with those posts that I don't even know where to start...aside from pointing out that if this is a serious attempt at opening a conversation on the topic it should be in its own thread, not hijacking this one.

    EDIT:
If you are interested in having a serious conversation on the topic, by all means, open up a new thread and we can chat there... just not going to get into it right here, right now...

    EDIT2:
    Sorry if I came across a bit strong at first in this post. I stand by my statements, but the wording was not in the spirit of the culture that we try to promote on this site (no, I was not contacted by any mods, or anyone else for that matter).
    That we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours, and this we should do freely and generously.
    --Benjamin Franklin
    TBCS 5TB Club :: coilgun :: bench PSU :: mightyMite :: Zeus :: E15 Magna EV

  2. #12
Konrad — Anodized. Again.
    Join Date
    Aug 2010
    Location
    Canada
    Posts
    1,060

Re: The black art of silicon

    Can I hijack my own thread?

    I do happen to agree with the planned obsolescence argument. Not necessarily with all the specific examples given above, nor with all the specific conclusions drawn from them - yeah, it's a bit too conspiracy theory for my tastes, and it looks to be strongly biased from a single (and selectively overinformed) source. But it's still a valid argument.

    But back to Dark Silicon ...

    Now it's actually become an important part of integrated circuit design.

A fine example is NVidia's Maxwell vs Kepler architectures. Basically the same old design on the same old 28nm TSMC fab, but far more transistors packed onto a much larger die - all with far greater power-budget efficiency. A GTX 980 (GM204 GPU) essentially benchmarks about the same as its GTX 780 Ti (GK110B GPU) predecessor (actually just a little better, largely because of minor ASIC refinements and new DX12 feature support) - but it pulls 165W instead of 250W to achieve those results.

    It seems this is mostly accomplished by establishing an internally partitioned, trickle-down hierarchy: a group of "master" units (in this case the GPCs) controls a number of "slaved" units (in this case the SMM warp schedulers), which in turn control all the actual processing taking place (CUDA cores plus a sundry variety of dedicated function units). The whole idea is to channel electrical power only towards those circuit blocks which are actually being utilized from moment to moment. It seems like an overcomplicated way to do things, but real-world results do not lie.
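The "channel power only to blocks in use" idea can be sketched as a toy model: every block always pays a small leakage cost, but only the blocks the scheduler has actually lit up draw dynamic power. The block names and wattages here are made up for illustration, not real GM204 figures.

```python
# Toy model of dark-silicon power gating: idle circuit blocks are gated
# down to leakage only; active blocks also draw dynamic power.
# All names and wattages are illustrative, NOT real GPU numbers.

DYNAMIC_W = {            # per-block dynamic power when active
    "gpc0": 30.0,
    "gpc1": 30.0,
    "smm_sched": 10.0,
    "cuda_cores": 80.0,
    "video_engine": 15.0,
}
LEAKAGE_W = 2.0          # static leakage per block, paid even when gated

def package_power(active_blocks):
    """Total power: leakage for every block, dynamic only for active ones."""
    leakage = LEAKAGE_W * len(DYNAMIC_W)
    dynamic = sum(DYNAMIC_W[b] for b in active_blocks)
    return leakage + dynamic
```

Under this model a full gaming load (`{"gpc0", "gpc1", "smm_sched", "cuda_cores"}`) costs 160W, while pure video playback (`{"video_engine"}`) costs only 25W - the same die, but the power bill tracks what is actually being used.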

    NVidia has, of course, now moved on to combining the greater power efficiency from this "Dark Silicon engineered" approach with a greater (old school) power budget. So now we have a 250W Maxwell GPU (on the Titan X), and soon we will have more of these magnificently beastly but also power efficient GPUs.

Intel has moved in the same direction over time, though less deliberately - mostly because the various power-efficient design tricks they learn on their mobile CPUs end up being adopted into subsequent desktop/enterprise "tock" releases. World-class pioneering research aside, Intel doesn't seem quite as ready to embrace "Applied Dark Silicon" implementations. It seems that Intel's philosophy is that any and all circuitry they build into the package has to be maximally utilized (because if it's not, then they must be doing it wrong) - unless a power-saving sleep state happens to be initiated. I suppose they are aware enough of NVidia's crazy results to consider shifting their design paradigm - I expect (well, I hope) to see future CPUs which apply a master/slave hierarchy across great numbers of simple processing units. (I can see future computing moving towards large-density, simple parallel processing tasks. The sorts of "streaming" things more of us run through GPGPU now could easily be stacked into a main processor package for this purpose - not as some gutless little integrated GPU which simply wastes the power budget.)
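That master/slave split for "streaming" workloads boils down to: one master partitions a large, uniform task across many simple workers, each of which only touches its own slice. A minimal sketch of that structure (names and the trivial kernel are illustrative, not any real GPU or Phi API):

```python
# Sketch of a master/slave data-parallel split: the "master" only
# partitions the work and reassembles results; each "slave" worker runs
# the same simple kernel on its own slice. Purely illustrative.

from concurrent.futures import ThreadPoolExecutor

def worker(chunk, scale):
    # Each slave unit performs the same trivial streaming operation.
    return [x * scale for x in chunk]

def master(data, scale, n_workers=4):
    # Partition the input into one chunk per worker (ceiling division).
    step = (len(data) + n_workers - 1) // n_workers
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    # Dispatch to the pool of slaves, then flatten the results in order.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(worker, chunks, [scale] * len(chunks))
    return [x for part in results for x in part]
```

The master does no arithmetic itself - which is exactly why a hierarchy like this maps so naturally onto lots of small, individually power-gated processing units.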
    My mind says Technic, but my body says Duplo.

  3. #13
x88x — Will YOU be ready when the zombies rise?
    Join Date
    Oct 2008
    Location
    MD, USA
    Posts
    6,334

Re: The black art of silicon

    Quote Originally Posted by Konrad View Post
    Not necessarily with all the specific examples given above, nor with all the specific conclusions drawn from them - yeah, it's a bit too conspiracy theory for my tastes, and it looks to be strongly biased from a single (and selectively overinformed) source. But it's still a valid argument.
That was my main issue with it (other than being tired and short-tempered when I first read it). The core concept is not wrong, but wildly inaccurate and misinformed conclusions were drawn from it, based on assumptions that were either patently false or fundamentally misunderstood.

    Quote Originally Posted by Konrad View Post
    But back to Dark Silicon ...
    Agreed.

    Quote Originally Posted by Konrad View Post
    I suppose they are aware enough of NVidia's crazy results to consider shifting their design paradigm - I expect (well, I hope) to see future CPUs which apply a master/slave hierarchy towards great numbers of SPUs. (I can see future computing moving towards large-density simple parallel processing tasks. The sorts of "streaming" things more of us run through GPGPU now could easily be stacked into a main processor package, for this purpose, not as some gutless little integrated GPU which simply wastes the power budget.)
    IIRC, this is kind of how the Xeon Phi CPUs[1] work. As I understand it, they are designed to attack the same class of problems that GPGPU has been addressing.

    [1]http://www.intel.com/content/www/us/en/processors/xeon/xeon-phi-detail.html

  4. #14
Konrad — Anodized. Again.
    Join Date
    Aug 2010
    Location
    Canada
    Posts
    1,060

Re: The black art of silicon

    You're right, lol.

    Look at me, decades behind the tech curve.

  5. #15
x88x — Will YOU be ready when the zombies rise?
    Join Date
    Oct 2008
    Location
    MD, USA
    Posts
    6,334

Re: The black art of silicon

Well, to be fair, the Phis are a pretty niche product at the moment. I think they are currently used mainly in certain corners of the HPC industry. I've never actually had the chance to work with any myself.

  6. #16

Re: The black art of silicon

    Quote Originally Posted by Konrad View Post
AMD has done well with a planned-longevity approach, though. Each new CPU socket is designed for backward compatibility with older CPUs, and each new CPU is designed for backward compatibility with older sockets. As much as possible, at least - sometimes an entirely new platform needs to be designed to accommodate entirely new tech, and it's just time to let go of the legacy anyhow. Just saying that AMD is a surprisingly small company, yet this strategy lets them float in the same water as megacorporate Intel. And we all know how AMD and NVidia have both recycled and rebranded and rebinned countless revisions of the same tired old "disposable" GPUs - a special example which seems to flaunt planned obsolescence.
Well, it's very true that AMD has been very innovative when it comes to CPUs.
