
View Full Version : When will Moore's Law hit a big hard atomic wall?



Oneslowz28
07-08-2010, 08:00 PM
I was reading an article in a past issue of EE Times where they were speculating that Moore's Law may hit an atomic wall by 2020. This means that our electronics will no longer shrink. So does that mean that, to get faster, we will have to go back to increasing the size of our devices?

So when do you think it will happen?

x88x
07-08-2010, 08:30 PM
Well, strictly speaking, Moore's law doesn't deal with shrinking stuff, just with chips becoming more powerful.

As for the issue though, personally I think one of two things is gonna happen:

1) Computers get so powerful that it doesn't really matter if they keep doubling. If we're only using 10% of the available power, what good does it do to double and only use 5%?

2) The cloud takes off, and we move all our processing to central locations, where it doesn't matter if they get bigger.

Looking at stuff lately, I think for a large portion of the population it's gonna be #2. With the rise of mobile devices that are always connected to the internet with increasingly fast wireless connections, it won't be long until working off the cloud won't feel any different than working locally.

I think the only real holdouts will be gamers. Until some sort of remote rendering gets perfected, I don't see PC gamers moving away from operating locally on as powerful a machine as possible.

knowledgegranted
07-08-2010, 09:14 PM
You guys are missing the point of Moore's Law here.

Moore's Law was based on electricity and electrical components. Biocomputers are the next big thing. They could be faster and better than our own human brains.

EDIT:

There is also gonna be another programming revolution, to fit more functionality into smaller spaces.

mtekk
07-08-2010, 10:02 PM
Well, strictly speaking, Moore's law doesn't deal with shrinking stuff, just with chips becoming more powerful.

Moore's Law specifically deals with the doubling of transistor counts in commercial chips every 22 months or so (due to physics, we realize this via feature-size shrinks).
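
Just to show how fast that compounds, here's a quick back-of-the-envelope (the ~1 billion transistor starting count for 2010 is an assumption for illustration, not a real chip's spec):

# Back-of-the-envelope: compound a ~22-month doubling forward from an
# assumed ~1 billion transistors in 2010 (illustrative numbers only).
months_per_doubling = 22
start_year, start_count = 2010, 1e9

for year in range(2010, 2031, 5):
    doublings = (year - start_year) * 12 / months_per_doubling
    print(year, f"{start_count * 2 ** doublings:.1e}")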


You guys are missing the point of Moore's Law here.

Moore's Law was based on electricity and electrical components. Biocomputers are the next big thing. They could be faster and better than our own human brains.


Well, if you want to talk about biocomputers, let's talk about how they will be larger than silicon-based systems (physics pretty much dictates this). And don't forget to mention the strict environmental control they require (they can't be too hot, cold, wet, or dry, otherwise they die). IMHO, the only interesting thing about biological computers is that they self-assemble/self-program.

A circa-1990s CPU can do math faster than a human, but that doesn't make it better. The brain is a marvelous thing; its ability to reorganize and process information is amazing. This is what we can learn from. We will have a programming revolution (or renaissance, if you will), as many of today's programs are lazy and bloated. When a hardware upgrade is not available, optimizing software will be the norm, as it was before the mid-1990s.

x88x
07-09-2010, 01:26 AM
(due to physics, we realize this via feature-size shrinks).

I read his comment about devices growing as meaning the size of the entire chip growing once we inevitably hit the limit of how small we can shrink silicon semiconductors.

Along those lines, I remember reading somewhere recently that a group was having great success making circuits using, I think, a graphite/silicon blend? They were making circuits smaller than would have been possible with straight silicon. ...I wish I could remember where I saw that...

silverdemon
07-09-2010, 04:25 AM
That would be 'graphene' (don't know if it is spelled right; in Dutch it is 'grafeen'), which is (I believe) a sheet of graphite (or carbon) a single atom thick.

This stuff makes for higher clock speeds and possibly a bit smaller feature sizes.
However, to get back to the topic question: it doesn't matter what material you use; protons, neutrons, and electrons stay the same size, so that's probably a physical barrier we will encounter...

...unless we're able to craft machines from smaller parts, like quarks or something, but I don't see that happening very soon.


1) Computers get so powerful that it doesn't really matter if they keep doubling. If we're only using 10% of the available power, what good does it do to double and only use 5%?

That might be possible, but I think we will hit this physical barrier (someone said 2020) before our computers are 'too powerful'. And for scientific use a computer can't be too powerful anyways...

dr.walrus
07-09-2010, 08:48 AM
I was reading an article in a past issue of EE Times where they were speculating that Moore's Law may hit an atomic wall by 2020. This means that our electronics will no longer shrink. So does that mean that, to get faster, we will have to go back to increasing the size of our devices?

So when do you think it will happen?

Funnily enough, I had this question in an exam a few months ago.

People have been saying since the 80s that Moore's law will cease to hold true, and it simply hasn't happened. There are a few things worth noting, though; for one, an increasing number of transistors doesn't directly equate to higher speed.

Increasing the size of processors causes the major problem of increased power consumption and heat. We can make processors much more effective by using more appropriate architectures. A question not asked often enough is 'what alternatives do we have to x86 in the desktop market?' The x86 architecture is a sort of bodge that just DOES everything we want it to do, but often with frightening inefficiency.

And it's unfair to place that onus solely on the hardware industry. Software is incredibly bloated and unoptimised, and in some cases it could be argued that there is an implied level of collusion between software and hardware manufacturers. Want to use new software? Buy a new computer - which comes with more software... There are much better engineering solutions than 'add more transistors', as we have been doing. Our move to parallel processing is meant to avoid the power dissipation of improving individual functional units, but what it has actually done is reduce the amount of everyday performance we get per transistor. It seems strange, but software is presenting a major barrier in itself.
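
To put a rough number on that parallelism point, here's an Amdahl's law sketch (my own illustration; the 90%-parallel workload figure is an assumption, not a measurement of any real program):

# Amdahl's law: speedup from N cores when only part of the work can run in parallel.
def amdahl_speedup(parallel_fraction, n_cores):
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / n_cores)

for cores in (1, 2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
# With 16 cores, a 90%-parallel task only reaches ~6.4x, and can never exceed 10x,
# which is why everyday performance per transistor shrinks as we add cores.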

Back to the question: in terms of shrinking lithography feature sizes, it's pretty much agreed that we've got about ten years or so left, because of the effect of quantum tunnelling. By then, we're gonna need a new technology to manufacture ICs, or whatever their replacements will be. And even then, we're a long way from approaching the maximum density of computing power.

The following from Wikipedia sums up the accepted technical wisdom about the theoretical potential of computing density quite nicely:


Futurists (http://en.wikipedia.org/wiki/Futures_studies) such as Ray Kurzweil (http://en.wikipedia.org/wiki/Ray_Kurzweil), Bruce Sterling (http://en.wikipedia.org/wiki/Bruce_Sterling), and Vernor Vinge (http://en.wikipedia.org/wiki/Vernor_Vinge) believe that the exponential improvement described by Moore's law will ultimately lead to a technological singularity (http://en.wikipedia.org/wiki/Technological_singularity): a period where progress in technology occurs almost instantly.[49] (http://en.wikipedia.org/wiki/Moore%27s_law#cite_note-Kurzweil_2005-48)
Although Kurzweil (http://en.wikipedia.org/wiki/Ray_Kurzweil) agrees that by 2019 the current strategy of ever-finer photolithography (http://en.wikipedia.org/wiki/Photolithography) will have run its course, he speculates that this does not mean the end of Moore's law:

Moore's law of Integrated Circuits was not the first, but the fifth paradigm (http://en.wikipedia.org/wiki/Paradigm) to forecast accelerating price-performance ratios. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census (http://en.wikipedia.org/wiki/U.S._Census,_1890), to [Newman (http://en.wikipedia.org/wiki/Max_Newman)'s] relay-based "[Heath] Robinson (http://en.wikipedia.org/wiki/Heath_Robinson_%28codebreaking_machine%29)" machine that cracked the Lorenz cipher (http://en.wikipedia.org/wiki/Lorenz_cipher), to the CBS vacuum tube computer (http://en.wikipedia.org/wiki/UNIVAC_I) that predicted the election of Eisenhower (http://en.wikipedia.org/wiki/Dwight_D._Eisenhower), to the transistor-based machines used in the first space launches (http://en.wikipedia.org/wiki/Space_launch), to the integrated-circuit-based personal computer.[50] (http://en.wikipedia.org/wiki/Moore%27s_law#cite_note-49)
Kurzweil speculates that it is likely that some new type of technology (possibly optical (http://en.wikipedia.org/wiki/Optical_computer) or quantum computers (http://en.wikipedia.org/wiki/Quantum_computers)) will replace current integrated-circuit technology, and that Moore's Law will hold true long after 2020.

Lloyd (http://en.wikipedia.org/wiki/Seth_Lloyd) shows how the potential computing capacity of a kilogram of matter equals pi times energy divided by Planck's constant. Since the energy is such a large number and Planck's constant is so small, this equation generates an extremely large number: about 5.0 × 10^50 operations per second.[49] (http://en.wikipedia.org/wiki/Moore%27s_law#cite_note-Kurzweil_2005-48)
He believes that the exponential growth (http://en.wikipedia.org/wiki/Exponential_growth) of Moore's law will continue beyond the use of integrated circuits into technologies that will lead to the technological singularity (http://en.wikipedia.org/wiki/Technological_singularity). The Law of Accelerating Returns (http://en.wikipedia.org/wiki/Law_of_Accelerating_Returns) described by Ray Kurzweil has in many ways altered the public's perception of Moore's Law. It is a common (but mistaken) belief that Moore's Law makes predictions regarding all forms of technology, when it has only actually been demonstrated clearly for semiconductor (http://en.wikipedia.org/wiki/Semiconductor) circuits (http://en.wikipedia.org/wiki/Integrated_circuit). However, many people, including Richard Dawkins, have observed that Moore's law will apply - at least by inference - to any problem that can be attacked by digital computers and is in its essence also a digital problem. Therefore, progress in genetics, where the coding is digital (the genetic coding of GATC), may also advance at a Moore's law rate. Many futurists still use the term "Moore's law" in this broader sense to describe ideas like those put forth by Kurzweil, but do not fully understand the difference between linear problems and digital problems.
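
Just to sanity-check that figure myself, here's the arithmetic taking the quoted formula at face value, with 'energy' as E = mc^2 for one kilogram (rough numbers of my own, not Lloyd's actual derivation):

import math

# Quoted rule of thumb: pi * E / h, with E = m*c^2 for 1 kg of matter.
m = 1.0                      # kg
c = 2.998e8                  # speed of light, m/s
h = 6.626e-34                # Planck's constant, J*s

E = m * c ** 2               # rest-mass energy, ~9e16 J
print(f"{math.pi * E / h:.1e} ops/s")   # ~4e50, the same order as the ~5e50 quoted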

mtekk
07-09-2010, 09:18 AM
I read his comment about devices growing as meaning the size of the entire chip growing once we inevitably hit the limit of how small we can shrink silicon semiconductors.

Along those lines, I remember reading somewhere recently that a group was having great success making circuits using, I think, a graphite/silicon blend? They were making circuits smaller than would have been possible with straight silicon. ...I wish I could remember where I saw that...

Well, increasing the physical size causes manufacturability issues; hell, just ask Nvidia how easy large dies are to make (they can't make a 512-shader GF100 due to defects, as it's over a 500mm^2 die). Not to mention it sucks down gobs of power.

People have said the next thing is to go to different materials, but if they have to jump from material to material for each drop in size, it will be very expensive (some say GaAs is the future; my semiconductors professor said it will always be the future). Others have said optical processors are the next big thing. Unfortunately, they're right: they will be big (physically), but they do have some benefits. Some are backing spintronics (storing states in the "up" and "down" spin of electrons), which is really cool, but who knows when that will be viable. Others believe we will hit a wall, but that it won't matter by then, as we'll have moved on to focusing on creating organic semiconductors that cost much less.

mDust
07-09-2010, 11:48 AM
I believe photons will replace electrons altogether. Physicists have been working on how to manipulate the speed of photons for decades. In 1999 they slowed light to a crawl, brought it to a complete stop a couple of years later, and in 2007 they figured out how to trap the entire spectrum and bring it to a halt. A few years from now they'll be able to manipulate the photons efficiently, and a few years after that they will demonstrate how their process can store and manipulate data. I would expect that several large computer manufacturers would rabidly snatch up this tech to bring it to market a few years later... which would be around 2020. In another decade we will literally be computing at the speed of light! :)
http://www.livescience.com/strangenews/071114-trapped-rainbow.html

I was reading about this elsewhere (no link, sorry), where they talked about how they were developing a logic system at the nanoscale. It was pretty impressive! The various wavelengths can all be processed simultaneously, while individual photons within a wavelength can be slowed to allow more time-sensitive data through. Unfortunately, they weren't imaginative enough to reinvent how the logic gates or physical parts of an IC work, so they ended up with an analog of current tech that simply computed with a different particle... but much, much faster.

Airbozo
07-09-2010, 11:54 AM
...
A circa 1990s CPU can do math faster than a human, but that doesn't make it better. ....

One thing people forget is that even though the calculations _may_ be faster than most human brains, there is still the step of input that is done by a human. This is where humans can be faster than a computer. I know people who can calculate large numbers faster than you can type the problem into a computer.

x88x
07-09-2010, 01:15 PM
for scientific use a computer can't be too powerful anyways...

This is very true, but for scientific use size doesn't always matter as much as it does for consumer use. TBH, I think we'll eventually move to a hybrid of the two states I mentioned, where you can have a crazy powerful computer in your pocket, but if you need even more insane power, you connect to a remote system that you either own or rent time on (kinda coming full circle, in a way). For pretty much everything except for graphics this is stupid easy to do even now (especially with Linux), but integrating it into consumer devices and making it easy for the proverbial grandmother to use will be the biggest hurdle, imo.

I know I'll probably look back on this in a few years time and laugh at myself, but I really think we're approaching a different wall in the consumer market, where the amount of processing power required by the vast majority of people is far outstripped by the power available with hardware of the time. I think we're mostly there already with the desktop market, except for gamers. Even PC graphics, I think, will hit a wall in the next 5-10 years, where you can't really improve the graphics quality anymore, and we can live-render true photorealistic video on our desktops. The problem will be with mobile devices, but I think with the recent improvements in the ARM architecture, and if we can integrate a good, standardized, remote processing architecture, that could easily hit this wall as well.

Of course, scientific, enterprise, and a few other computing markets will continue to require more and more power, but for the other 80-90% of the market, I really think we'll reach a point where it doesn't matter.

dr.walrus
07-09-2010, 03:05 PM
One thing people forget is that even though the calculations _may_ be faster than most human brains, there is still the step of input that is done by a human. This is where humans can be faster than a computer. I know people who can calculate large numbers faster than you can type the problem into a computer.

I would seriously question this, apart from very specific situations. The vast majority of the information to be processed is machine-produced anyway, and you should know full well how fast a bulk insert via SQL is 8)

dr.walrus
07-09-2010, 03:07 PM
I really think we're approaching a different wall in the consumer market

'The wall' is the atomic limit of the current lithography processes, where there's enough quantum tunnelling to render existing processes totally ineffective. Intel estimates that to be 16nm transistors with 5nm gates; that's no different between home PCs and workstations, matey!

Airbozo
07-09-2010, 03:39 PM
I would seriously question this, apart from very specific situations. The vast majority of the information to be processed is machine-produced anyway, and you should know full well how fast a bulk insert via SQL is 8)

I agree, but if you are comparing computer calculations to human calculations you have to level the playing field since the human brain does not YET have a way to input an SQL database.

lol

Go John Henry GO!

dr.walrus
07-09-2010, 03:46 PM
I agree, but if you are comparing computer calculations to human calculations you have to level the playing field since the human brain does not YET have a way to input an SQL database.


I think you'll find a bulk insert loads a .csv file, not an SQL database itself :banana:
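
(If anyone wants to see what I mean, here's a minimal Python/sqlite3 sketch of that kind of bulk CSV load; the file and table names are made up for the example:)

import csv, sqlite3

# Toy bulk load of a CSV into a table; 'numbers.csv' and 'problems' are made up.
conn = sqlite3.connect("scratch.db")
conn.execute("CREATE TABLE IF NOT EXISTS problems (a REAL, b REAL)")

with open("numbers.csv", newline="") as f:
    rows = [(float(a), float(b)) for a, b in csv.reader(f)]

conn.executemany("INSERT INTO problems VALUES (?, ?)", rows)  # one batched insert
conn.commit()
conn.close()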

x88x
07-09-2010, 04:20 PM
'The wall' is the atomic limit of the current lithography processes, where there's enough quantum tunnelling to render existing processes totally ineffective. Intel estimates that to be 16nm transistors with 5nm gates; that's no different between home PCs and workstations, matey!

Yes, I know the same processes are used in consumer and enterprise markets. If you read more than just the first half of the first sentence of what I wrote, you would see that's not what I was talking about. The 'wall' I was referring to there is a matter of demand, not technical capability.

Airbozo
07-09-2010, 04:40 PM
I think you'll find a bulk insert loads a .csv file, not an SQL database itself :banana:

lol

Well then we need a way to input a .csv file directly to the brain's math calculation center...

dr.walrus
07-09-2010, 04:50 PM
Yes, I know the same processes are used in consumer and enterprise markets. If you read more than just the first half of the first sentence of what I wrote, you would see that's not what I was talking about. The 'wall' I was referring to there is a matter of demand, not technical capability.
Lol, I didn't mean it to sound so patronising - I was just making a joke of how this seems to read:


I think we're mostly there already with the desktop market, except for gamers. Even PC graphics, I think, will hit a wall in the next 5-10 years, where you can't really improve the graphics quality anymore, and we can live-render true photorealistic video on our desktops. The problem will be with mobile devices, but I think with the recent improvements in the ARM architecture, and if we can integrate a good, standardized, remote processing architecture, that could easily hit this wall as well.

x88x
07-09-2010, 04:57 PM
lol

Well then we need a way to input a .csv file directly to the brain's math calculation center...

Sweet! Think I could get a couple USB 3.0 controllers built into my brain? That would be awesome! :D

slaveofconvention
07-09-2010, 05:00 PM
Sweet! Think I could get a couple USB 3.0 controllers built into my brain? That would be awesome! :D

Then you'd have to choose if you want the ports in your nostrils or ears.... Either would suit you, I'm sure :p

Oneslowz28
07-09-2010, 05:03 PM
Sweet! Think I could get a couple USB 3.0 controllers built into my brain? That would be awesome! :D

I have a few FTDI Serial - USB chips here. We can experiment if you like....

dr.walrus
07-09-2010, 05:06 PM
I have a few FTDI Serial - USB chips here. We can experiment if you like....

I've got an angle grinder you can borrow?

x88x
07-09-2010, 05:12 PM
Then you'd have to choose if you want the ports in your nostrils or ears.... Either would suit you, I'm sure :p
Actually, I was thinking nape of the neck, à la The Matrix. Maybe on either side of my spine? ...Hmmm.... *runs off to find a decent medical resource for unused nerve clusters*

Oneslowz28
07-09-2010, 05:56 PM
Hey, we do have a Dr. on staff here too.... This is turning out to be a good possibility. I will draft a release tonight that absolves TBCS of any and all legal ramifications of said act.

If this works I see the following happening.

1. Get Underpants
2. ???
3. Profit!

x88x
07-09-2010, 06:07 PM
Hahaha, awesome business plan. :D

In all seriousness though, this actually started me thinking, and I wonder if some sort of serial communication might actually be possible. I've seen direct-nerve-control and feedback in prosthetics, I've seen the touch nerves in the tongue reroute their signals to the vision center of the brain, I've even seen someone develop a 6th sense (of sorts) using vibratory input from a GPS receiver. What's to say we couldn't do a similar thing with computer communication? The actual hardware should be fairly simple, especially if you only need one-way communication (input). You just find a nice nerve cluster, and send small electric pulses at it. You would want to keep the current extremely low, for safety's sake, but it would definitely be feasible. The hard part would then be training your brain to interpret the signals as some sort of sensory input. Think of it sort of like someone tapping out Morse code on your leg. Hmmmm.....where can I pick up some medical electrodes? :think:
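
Just to show how dead simple the signalling side would be, here's a toy sketch of the Morse-style pulse idea (pure software; pulse() is a placeholder rather than a real electrode driver, and the timings and letter subset are made up):

import time

# Toy sketch of the 'Morse code on your leg' idea: turn text into timed pulses.
MORSE = {"s": "...", "o": "---", "e": "."}   # tiny made-up subset for the demo

def pulse(duration):
    print(f"pulse {duration:.1f}s")   # placeholder for actually driving an output
    time.sleep(duration)

def send(text, unit=0.1):
    for ch in text.lower():
        for symbol in MORSE.get(ch, ""):
            pulse(unit if symbol == "." else 3 * unit)
            time.sleep(unit)              # gap between pulses
        time.sleep(3 * unit)              # gap between letters

send("sos")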

mDust
07-09-2010, 11:12 PM
Hmmmm.....where can I pick up some medical electrodes? :think:
...and that was the last time anyone ever heard from x88x...

blaze15301
07-10-2010, 12:22 AM
Hahaha, awesome business plan. :D

In all seriousness though, this actually started me thinking, and I wonder if some sort of serial communication might actually be possible. I've seen direct-nerve-control and feedback in prosthetics, I've seen the touch nerves in the tongue reroute their signals to the vision center of the brain, I've even seen someone develop a 6th sense (of sorts) using vibratory input from a GPS receiver. What's to say we couldn't do a similar thing with computer communication? The actual hardware should be fairly simple, especially if you only need one-way communication (input). You just find a nice nerve cluster, and send small electric pulses at it. You would want to keep the current extremely low, for safety's sake, but it would definitely be feasible. The hard part would then be training your brain to interpret the signals as some sort of sensory input. Think of it sort of like someone tapping out Morse code on your leg. Hmmmm.....where can I pick up some medical electrodes? :think:

and this is where Armageddon starts, one crazy man trying to get USB 3.0 in his brain lol. :whistler:

I do believe there is a man in London who has something like this already, though. When he goes into his workplace, the doors open, lights come on, and he can pretty much control computers around him. I heard about this a few years back.

diluzio91
07-10-2010, 12:25 AM
Of course, we all know where this is going to go if it happens... you're sitting at your desk, with your flash drive plugged in... watching brain porn. lol

crenn
07-10-2010, 02:23 AM
and this is where Armageddon starts, one crazy man trying to get USB 3.0 in his brain lol. :whistler:

I do believe there is a man in London who has something like this already, though. When he goes into his workplace, the doors open, lights come on, and he can pretty much control computers around him. I heard about this a few years back.
Professor Kevin Warwick.
http://en.wikipedia.org/wiki/Kevin_Warwick

x88x
07-10-2010, 03:18 AM
Thanks crenn! I'll have to see if I can find more detailed info about his work.


There is no way I want to stay a mere human.
A man after my own heart. :D

dr.walrus
07-10-2010, 07:34 AM
...

http://img.photobucket.com/albums/v26/paul_brack/walrus/walrussurgery.jpg

mDust
07-10-2010, 07:37 AM
...

http://img.photobucket.com/albums/v26/paul_brack/walrus/walrussurgery.jpg

lol, this looks shady...

dr.walrus
07-10-2010, 07:45 AM
http://img.photobucket.com/albums/v26/paul_brack/walrus/walrussurgery2.jpg

mDust
07-10-2010, 07:58 AM
http://img.photobucket.com/albums/v26/paul_brack/walrus/walrussurgery2.jpg

In that case, I'll take 3 brain-embedded computer chips please. I'd also like a referral for my 6th and 7th sense installation surgery. Thanks!

dr.walrus
07-10-2010, 08:02 AM
In that case, I'll take 3 brain-embedded computer chips please. I'd also like a referral for my 6th and 7th sense installation surgery. Thanks!
http://img.photobucket.com/albums/v26/paul_brack/walrus/walrussurgery4.jpg

blaze15301
07-10-2010, 12:49 PM
http://img.photobucket.com/albums/v26/paul_brack/walrus/walrussurgery4.jpg

go for it. he looks legit :up:

simon275
07-10-2010, 10:23 PM
2) The cloud takes off, and we move all our processing to central locations, where it doesn't matter if they get bigger.

Looking at stuff lately, I think for a large portion of the population it's gonna be #2. With the rise of mobile devices that are always connected to the internet with increasingly fast wireless connections, it won't be long until working off the cloud won't feel any different than working locally.


Wow this thread went off topic.

Yes, the cloud will replace a lot of things. There will still be some client-side processing to be done for latency-sensitive applications such as games and perhaps CAD, because even with really fast packet routing and optical switching you still can't get around the speed of light down the fibre cable.
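
A quick back-of-the-envelope on that last point (my own rough numbers, assuming light in fibre travels at about two thirds of c and an illustrative 12,000 km long-haul route):

# Rough fibre latency: light in glass travels at roughly 2/3 of c.
c = 3.0e8                     # m/s in vacuum
v_fibre = (2 / 3) * c         # ~2e8 m/s in fibre
distance_m = 12_000 * 1000    # illustrative long-haul route, 12,000 km

one_way_ms = distance_m / v_fibre * 1000
print(round(one_way_ms, 1), "ms one way,", round(2 * one_way_ms, 1), "ms round trip")
# ~60 ms one way / ~120 ms round trip -- already noticeable for games.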