
View Full Version : Super Diamonds in Computers



TheGreatSatan
09-22-2010, 02:01 PM
Diamonds (http://www.abazias.com/diamondblog/diamond-education/building-the-first-super-computer-with-diamonds) will soon be the new silicon

Diamonds can carry 30x more power than silicon and operate 3x faster. They also resist temperatures up to 1800 F, which would melt silicon.
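A quick back-of-the-envelope check on that temperature figure (my own conversion, not from the linked article):

```python
# Convert the quoted 1800 F rating to Celsius.
def f_to_c(f):
    return (f - 32) * 5 / 9

diamond_rating_c = f_to_c(1800)
print(f"1800 F is about {diamond_rating_c:.0f} C")  # ~982 C

# For comparison: bulk silicon melts around 1414 C, but silicon
# *devices* typically fail far below that, so the practical gap in
# operating temperature is larger than the melting points alone suggest.
```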

(embedded YouTube video: AWW5Pa1h_1g)

x88x
09-22-2010, 03:58 PM
Interesting, but last I checked quantum computing wasn't expected to be of much use for everyday computing needs.

dr.walrus
09-26-2010, 08:15 PM
Last time I checked, diamonds were known for being a little bit expensive?

mDust
09-26-2010, 08:45 PM
Last time I checked, diamonds were known for being a little bit expensive?

That's only because of criminal organizations like De Beers. Natural diamonds aren't perfect, and near-perfect crystals are what computing requires. The diamonds proposed for computing are manufactured in a lab, layer by layer. I heard that insane clock speeds (like 60 GHz insane) would be possible with diamond. I'm already wondering how much further it could be overclocked...:D
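For scale, here's what a 60 GHz clock would mean physically (my own arithmetic, not from the post):

```python
C = 2.998e8  # speed of light in vacuum, m/s

freq_hz = 60e9                 # the rumored 60 GHz clock
period_s = 1 / freq_hz         # duration of one clock cycle
light_mm = C * period_s * 1e3  # distance light covers per cycle, in mm

print(f"Clock period: {period_s * 1e12:.1f} ps")      # ~16.7 ps
print(f"Light travels ~{light_mm:.1f} mm per cycle")  # ~5 mm

# On-chip signals propagate at well below c, so a 60 GHz clock leaves
# only a few millimeters of wire per cycle -- one reason raw clock
# speed alone doesn't tell the whole story.
```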

dr.walrus
09-26-2010, 08:57 PM
That's only because of criminal organizations like De Beers. Natural diamonds aren't perfect, and near-perfect crystals are what computing requires. The diamonds proposed for computing are manufactured in a lab, layer by layer. I heard that insane clock speeds (like 60 GHz insane) would be possible with diamond. I'm already wondering how much further it could be overclocked...:D

I was talking about lab-produced diamonds. The cost of diamonds could never, ever match that of silicon, even taking into account the decreased amount of material required.

Frankly, we're going to hit the quantum tunneling limit soon, so it'll be interesting to see what innovations that pushes. Intel reckons we have until 2017.

mDust
09-26-2010, 09:26 PM
I was talking about lab-produced diamonds. The cost of diamonds could never, ever match that of silicon, even taking into account the decreased amount of material required.
Not anytime soon, at least. The cost of manufacturing diamond will drop once companies start competing in the market. Besides, large companies will drop ridiculous amounts of cash on technology if they need it; in such cases the cost isn't important. So expect diamond processors in top-end servers in 10-15 years, migrating into top-end PCs 15 years after that. Maybe it'll be mainstream in 30 years.

dr.walrus
09-26-2010, 09:57 PM
Not anytime soon, at least.

Not ever, as far as I can see. Silicon is very easy and cheap to manufacture, and even in the highly purified form required for semiconductors, it doesn't need anything like the energy and machinery required to produce diamonds. And I very much doubt those processes are ever gonna be comparable.

TheGreatSatan
09-27-2010, 05:56 PM
The main problem is that they can't find an easy way to make it into a wafer

Konrad
10-06-2010, 05:52 PM
A "revolutionary" new sci-fi wondercomputer technology is announced every year. Typically something involving a breakthrough in material engineering, a cleverly quirky little device, a nanosomething, or bizarre quantum logic.

In 2010 they promise diamond
In 2009 they promised graphite substrates (http://www.gizmag.com/graphit-mass-data-storage-circuit-design/12788/)
In 2008 they promised memristors (http://discovermagazine.com/2009/jan/065)
In 2007 they promised optical microchips (http://web.mit.edu/newsoffice/2007/optics.html)
In 2006 they promised quantum logic (http://www.michigandaily.com/content/quantum-chip-could-revolutionize-information-processing)
In 2005 they promised graphene nanocircuits (http://discovermagazine.com/2009/jun/04-life-after-silicon)
In 2004 they promised exotic crystals (http://www.physorg.com/news164289676.html)
...
I could go on forever, I suppose.

The point is, all of these things create a research frenzy and eventually spin off some useful technologies, but they rarely work out to fundamentally improve computing hardware. They all turn out to be useful in minor ways, mostly little incremental evolutionary improvements. Hardly ever anything truly revolutionary. Most end up being overhyped pocket lint.

mDust
10-07-2010, 01:30 AM
A "revolutionary" new sci-fi wondercomputer technology is announced every year. Typically something involving a breakthrough in material engineering, a cleverly quirky little device, a nanosomething, or bizarre quantum logic.

In 2010 they promise diamond
In 2009 they promised graphite substrates (http://www.gizmag.com/graphit-mass-data-storage-circuit-design/12788/)
In 2008 they promised memristors (http://discovermagazine.com/2009/jan/065)
In 2007 they promised optical microchips (http://web.mit.edu/newsoffice/2007/optics.html)
In 2006 they promised quantum logic (http://www.michigandaily.com/content/quantum-chip-could-revolutionize-information-processing)
In 2005 they promised graphene nanocircuits (http://discovermagazine.com/2009/jun/04-life-after-silicon)
In 2004 they promised exotic crystals (http://www.physorg.com/news164289676.html)
...
I could go on forever, I suppose.

The point is, all of these things create a research frenzy and eventually spin off some useful technologies, but they rarely work out to fundamentally improve computing hardware. They all turn out to be useful in minor ways, mostly little incremental evolutionary improvements. Hardly ever anything truly revolutionary. Most end up being overhyped pocket lint.

It just boils down to cost vs. benefit. Every one of those promises has either cost too much or not been beneficial enough to matter. Companies can't invest in unprofitable technology...even if it is freakin' cool.

The problem with diamond chips is cost. They cost way too much per chip to produce, and they can't yet be reliably mass-produced. dr.walrus was right that the price is way too high compared to silicon. However, I believe some new technology in the near future will allow a drastic cost reduction that makes diamond chips more feasible.

Konrad
10-07-2010, 02:04 AM
Yeah, all these things are cool. I still don't see any quantum-optical computers running on exotic nanographene crystals yet, not even in server rooms, lol, stupid overenthusiastic media reporters. And where's that flying car they promised me 25 years ago?

Artificial diamond films are actually quite easy to manufacture, and not especially expensive*. Not at all suited for jewellery or even industrial cutting uses, but absolutely perfect for silicon-on-diamond (SOD) circuit substrates. Of course that's assuming the foundries can figure out how to do it cost-effectively, and that it turns out to actually have real advantages over conventional approaches.

I'm always more interested in the less exciting "revolutionary" computer tech ... smaller lithographic processing, denser memory wafers, the next-generation chipsets and processors. That stuff manages to hit my desk within 5 years instead of 50.

* semiconductor-grade pure carbon + energy bill + tons of ridiculous equipment which technically doesn't exist yet.
How much more will it cost than silicon? Maybe $5 per die? No doubt Intel will still charge $1000 per unit so it seems financially feasible. Even if the electrical/thermal properties of the diamond dies only offer a measly +10% performance advantage.
This all assumes that it won't cost $billions in R&D to migrate existing mature technologies onto diamond. And that the first specimen doesn't condemn the entire concept with a random unforeseen bug.
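Plugging those numbers into a quick calculation (the $5, $1000, and +10% figures are illustrative guesses from the post above, not real data):

```python
die_cost_extra = 5.0  # hypothetical added cost per diamond die, USD
unit_price = 1000.0   # hypothetical retail price per CPU, USD
perf_gain = 0.10      # hypothetical +10% performance advantage

cost_increase = die_cost_extra / unit_price  # fraction of retail price
value_ratio = perf_gain / cost_increase      # performance gained per cost added

print(f"Extra die cost is {cost_increase:.1%} of the retail price")  # 0.5%
print(f"Performance gain outweighs the cost bump {value_ratio:.0f}x over")
```

On these (made-up) numbers the economics look easy; the real uncertainty is the R&D bill and yield, as the post notes.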

mDust
10-07-2010, 11:38 AM
Yeah, all these things are cool. I still don't see any quantum-optical computers running on exotic nanographene crystals yet, not even in server rooms, lol, stupid overenthusiastic media reporters. And where's that flying car they promised me 25 years ago?

Artificial diamond films are actually quite easy to manufacture, and not especially expensive*. Not at all suited for jewellery or even industrial cutting uses, but absolutely perfect for silicon-on-diamond (SOD) circuit substrates. Of course that's assuming the foundries can figure out how to do it cost-effectively, and that it turns out to actually have real advantages over conventional approaches.

I'm always more interested in the less exciting "revolutionary" computer tech ... smaller lithographic processing, denser memory wafers, the next-generation chipsets and processors. That stuff manages to hit my desk within 5 years instead of 50.

* semiconductor-grade pure carbon + energy bill + tons of ridiculous equipment which technically doesn't exist yet.
How much more will it cost than silicon? Maybe $5 per die? No doubt Intel will still charge $1000 per unit so it seems financially feasible. Even if the electrical/thermal properties of the diamond dies only offer a measly +10% performance advantage.
This all assumes that it won't cost $billions in R&D to migrate existing mature technologies onto diamond. And that the first specimen doesn't condemn the entire concept with a random unforeseen bug.
There's a limit to how small we can make electronic circuits though. Electron tunneling and other quantum phenomena are a big ol' wall that chip manufacturers are racing towards. If they can't find a way around, over, or underneath it then they'll have to compete in other ways...probably by just introducing those cool technologies mentioned above.
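A crude projection of the wall described above, assuming the classic ~0.7x linear shrink every two-year generation, starting from the roughly 32 nm node of 2010, with an assumed ~5 nm floor where tunneling leakage dominates (both endpoints are rough assumptions, not sourced figures):

```python
node_nm = 32.0  # roughly the leading production node in 2010
year = 2010
SHRINK = 0.7    # classic ~0.7x linear shrink per two-year generation
LIMIT_NM = 5.0  # assumed scale where electron tunneling dominates

# Project forward until the next shrink would cross the tunneling floor.
while node_nm * SHRINK >= LIMIT_NM:
    node_nm *= SHRINK
    year += 2
    print(f"{year}: ~{node_nm:.1f} nm")

# This naive extrapolation runs out of room around 2020 -- the same
# ballpark as the "Intel reckons 2017" figure mentioned earlier.
```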

And it's those tons of ridiculous equipment and other related costs that make diamond chips expensive to produce. At this point they aren't made on an assembly line, which means they're Expensive with a capital 'E'.