What will replace silicon chips?

  • #1
KingKai
Moore's Law will not continue indefinitely because the silicon-based processing units can only go so small.

If computing plateaus at the point where the next computer is only as fast as the last one, what will drive consumer demand? Why would you buy a new computer that is no faster than the one you bought last year?

Will this cause economic turbulence? Or will a new type of computing, such as molecular, quantum, or DNA computing, be used to keep improving and adhering to Moore's Law?

Also, what is the future of the keyboard? Will it become obsolete as we turn more to voice-activated, or even thought-activated, systems?
 
  • #2
KingKai said:
Moore's Law will not continue indefinitely because the silicon-based processing units can only go so small.

Nothing will, for a long time. We've all seen predictions for the last 10 years that silicon is dead, but why do we need to replace it? Silicon also makes a great substrate for quantum computing, spintronics, and a host of future technologies.

The current limitations are mainly financial, not technical. The projected cost of a complete 450 mm-wafer, 14-10 nm-process fab for 2016 is huge ($10 billion to $15 billion). I don't see anyone shifting that level of investment to something else for a long time, when silicon can still be pushed to 7-5 nm before the current technology dead-ends.

You can take my keyboard from my cold, dead hands.
http://www.youtube.com/watch?feature=player_detailpage&v=JSmGjB-G6v8#t=69
 
  • #3
KingKai said:
If computing plateaus at the point where the next computer is equally as fast as the last one, what would drive consumer demand? Why would you buy a new computer that is just as fast as the one you got last year?

Things don't have to continue improving to be useful. Why do you buy a new car that only goes as fast as the last one?
 
  • #4
phyzguy said:
Things don't have to continue improving to be useful. Why do you buy a new car that only goes as fast as the last one?
Because if I go any faster, I'll get a speeding ticket...
 
  • #5
Is it silicon as a substrate or silicon for transistors or their equivalents? What's being worked on? How far has the work gone?

I remember some work on Josephson junctions some years back, but that did not get very far.


My first thought would be to try to expand vertically, to have several computing layers, for lack of a better term. That's most easily done by having each chip package contain a stack of chips; otherwise one risks driving the defect rate up to unacceptable levels.


From what I can tell, over the last decade CPU clock speed has been increasing much more slowly than before. I also notice a trend toward multicore CPUs. If it's difficult to expand vertically, one can instead expand horizontally.

Multiple cores are effectively a MIMD system, and making the most of such a system requires programmers to do multithreading, which can be tricky. Another thing we've been seeing a lot of is SIMD, either as add-ons like Intel x86's SSE or as additional cores like those in the IBM Cell chip. While SIMD systems can outperform traditional CPUs with the same number of components, they have a certain rigidity. For branching, one must follow both directions and then use each branch-condition value to select which results to use (see the sketch below).

So it's getting a lot trickier to program for the highest performance.
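To make that branch-free "select" pattern concrete, here is a minimal sketch using Intel's SSE/SSE4.1 intrinsics; the particular computation is just an illustration, with _mm_cmpgt_ps doing the lane-wise compare and _mm_blendv_ps doing the per-lane select:

```cpp
// Branch-free SIMD "select": y = (x > 0) ? 2*x : 3*x, four floats at a time.
// Both branches are evaluated for every lane; a mask then picks the result per lane.
#include <smmintrin.h>   // SSE4.1 intrinsics (compile with -msse4.1)
#include <cstdio>

int main() {
    __m128 x       = _mm_set_ps(4.0f, -1.0f, 2.0f, -3.0f);
    __m128 mask    = _mm_cmpgt_ps(x, _mm_setzero_ps());     // lane-wise test: x > 0
    __m128 branch1 = _mm_mul_ps(x, _mm_set1_ps(2.0f));      // "then" branch: 2*x
    __m128 branch2 = _mm_mul_ps(x, _mm_set1_ps(3.0f));      // "else" branch: 3*x
    __m128 y       = _mm_blendv_ps(branch2, branch1, mask); // keep branch1 where mask is set

    float out[4];
    _mm_storeu_ps(out, y);
    std::printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```

Both branches get computed for all four lanes, and the mask throws away the results each lane didn't need; that's the rigidity, and also the speed.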
 
  • #7
The development cost of nanotube CPU technology looks rather daunting. I think we are stuck with silicon for another 10 years.
 
  • #8
KingKai said:
Because if I go any faster, I'll get a speeding ticket...

Your car can only go the speed limit?
 
  • #9
http://spectrum.ieee.org/semiconductors/devices/the-status-of-moores-law-its-complicated

Moore’s Law, when reflected through the steady march of node names, might seem easy and inexorable. But today a plague of intense manufacturing and design problems is forcing compromises that are sometimes sobering. And some analysts suggest that regardless of what we call the next generation of chips, the transition from old to new no longer provides nearly the kind of payoff—in cost or performance—that it used to.
 
  • #10
There's no question the benefit from each new generation is less, so the gains are slowing down. I think it is inevitable that they eventually will plateau. The point I was trying to make in my earlier post was that this isn't really a crisis. It just means that information technology will eventually look like most other technologies, where you have to pay more to get more, and you won't keep getting more essentially for free.
 
  • #11
The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software

"Andy giveth, and Bill taketh away."
Andy Grove, the head of Intel, whose CPU-chip designers design faster and faster chips.
Bill Gates, the head of Microsoft, whose programmers write slower and slower code.

Figure 1 in it shows what's been happening. It shows results for Intel x86 and Itanium CPU chips, but other companies' chips show similar trends. The number of transistors per chip continues to increase exponentially with time, but clock speed started to level off around 2000 and had flattened out by about 2005 at roughly 3 GHz. Instruction-level parallelism leveled off even earlier.

Concurrency? That's doing stuff in parallel, as I'd posted about earlier.
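As a concrete (if toy) example of what that turn toward concurrency asks of programmers, here is a minimal C++11 sketch that splits a summation across hardware threads; the chunking scheme is purely illustrative:

```cpp
// Summing a large array by hand-splitting the work across std::thread workers.
// Each thread writes its own partial sum; the main thread joins and combines them.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<double> data(1000000, 1.0);
    unsigned n_threads = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> partial(n_threads, 0.0);
    std::vector<std::thread> workers;
    std::size_t chunk = data.size() / n_threads;

    for (unsigned t = 0; t < n_threads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end   = (t + 1 == n_threads) ? data.size() : begin + chunk;
        workers.emplace_back([&, t, begin, end] {
            // Each thread sums only its own slice of the data.
            partial[t] = std::accumulate(data.begin() + begin, data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("sum = %.0f using %u threads\n", total, n_threads);
    return 0;
}
```

None of that parallel bookkeeping existed in the single-threaded version; that's the "free lunch" the article says is over.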
 
  • #12
ref What will replace silicon chips?

My guess is this.
http://fc04.deviantart.net/fs70/i/2010/020/2/c/dandelion_snowglobe_II_by_mondspeer.jpg

1) A very stable, porous crystal scaffolding

2) Those pores are filled with large 'semi-biological' molecules that act as simple, or perhaps complex, logic gates

3) A combination of electricity and molecular lasers will be used to power the computation and store data

4) Simple software will become hardware that is simply switched off when not needed (i.e., a chip that sorts)

You will probably have to wait until 2300 AD before we have mastered the fab techniques required to make all this happen.



I'm seeing several factors:

1) Uber-reliability (self-healing of simple subunits)

2) Uber-low power requirements

3) Uber-compactness: 3D chips with high surface area... think of a sea urchin immersed in a liquid vortex


For the next 50 years we will probably still be using silicon or its analogues, just highly polished and tweaked.

Remember, when stuff first gets invented it's usually ugly, square, and boxy. It is usually unnecessarily modular, with the working parts kept distinct. Over time, greater integration and unification occur. Today we are seeing rounded heat-sink fins and water cooling emerging. We are seeing sound cards and network cards integrated into motherboards, and graphics cards integrated into CPUs. Tomorrow you will see the integration of the computer chip into the cooling fins.






ref Also, what is the future of the keyboard?

The keyboard of the future will have loads more buttons and keys. Read: "one-stroke solutions to common problems". This extra-large keyboard will mostly be for power users; it may have up to 1,000 keys and use much of the universal iconography we use today.

In a sense the Chinese kind of got it right, but that future keyboard will use the sense of touch and colour to aid navigation, rather than a mish-mash of black-and-white flowery strokes.

Qwertyuiop is a 'general solution', a legacy that gets you by, but it is a very long way from being optimal for power users or specialist applications.


When HD computer screens are 6 ft by 3 ft in size, you will want a keyboard input system that matches the visual display. Today, communicating with computers is much harder than it needs to be because we cannot afford the $$$ for very large screen acreage and very large keyboard acreage... this is especially true for power users, hence multiple monitors, the poor man's choice today.

A more ergonomic multicoloured version of this
http://img.xcitefun.net/users/2012/09/306260,xcitefun-chinese-keyboard-2.jpg



*note

While a great deal of computation can be (and is) made parallel, you just can't beat the simplicity and utility of GHz.

My quad-core 4 GHz(ish) computer does a passable impression of a 10 GHz computer, but a significant element of computation is, and always will be, serial by nature, so a 'true 10 GHz' computer would whoop mine in the key areas of 'significance' (see the sketch below).

AMD's latest 8-core CPU was nicknamed 'Faildozer' because crippled cores don't beat GHz in a straight-line race.
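That "serial by nature" point is essentially Amdahl's law: if a fraction s of the work can't be parallelised, N cores give a speedup of 1 / (s + (1 - s)/N), capped at 1/s no matter how many cores you add. A tiny sketch with an assumed 10% serial fraction (the number is purely illustrative):

```cpp
// Amdahl's law: speedup(N) = 1 / (s + (1 - s) / N) for serial fraction s.
#include <cstdio>

int main() {
    const double s = 0.10;  // assumed serial fraction -- illustrative only
    const int core_counts[] = {1, 2, 4, 8, 16, 1000};
    for (int n : core_counts) {
        double speedup = 1.0 / (s + (1.0 - s) / n);
        std::printf("%5d cores -> %.2fx speedup\n", n, speedup);
    }
    return 0;
}
```

With s = 0.10, four cores give roughly a 3.1x speedup and even a thousand cores top out just under 10x, whereas a genuinely faster clock speeds up the serial part too.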
 
  • #13
...lol
 

FAQ: What will replace silicon chips?

1. What are the limitations of silicon chips that require a replacement?

Silicon chips have been the backbone of computing technology for decades, but they have limitations in terms of speed, power efficiency, and size. As our demand for faster and more efficient devices increases, there is a need for a replacement material that can overcome these limitations.

2. What are the possible alternatives to silicon chips?

Researchers are exploring various alternatives to silicon chips, such as gallium nitride, graphene, and carbon nanotubes, as well as entirely different paradigms such as quantum computing. Each of these has unique properties that could potentially improve the performance of electronic devices.

3. How will the replacement of silicon chips affect the electronics industry?

The replacement of silicon chips will have a significant impact on the electronics industry. It will open up new possibilities for faster and more efficient devices, leading to advancements in fields such as artificial intelligence, virtual reality, and quantum computing. However, it will also require significant investments and changes in the manufacturing processes and supply chains.

4. When can we expect the replacement of silicon chips to become a reality?

The development and implementation of a new material to replace silicon chips is a complex process and can take years, if not decades. While there have been promising advancements in materials such as graphene and carbon nanotubes, it is difficult to predict an exact timeline for when they will become commercially available.

5. Will the replacement of silicon chips make current devices obsolete?

It is likely that the replacement of silicon chips will lead to significant improvements in the performance of electronic devices. However, it is unlikely that it will make current devices completely obsolete. Many of the existing technologies can still be integrated with new materials, and there will be a transition period before the new technology becomes the standard.
