How Small Can a Transistor Be and Still Work Reliably?

In summary, the current trend in computer chip manufacturing is towards making transistors smaller and smaller in order to increase computing speed and possibly energy efficiency. However, this also creates problems such as signal leaks and overheating. Two major remaining issues are quantum tunneling, which causes occasional computer malfunctions, and cosmic rays or other energetic particles hitting transistors and causing them to malfunction. These problems become more frequent as transistor sizes continue to shrink, leading to reduced reliability of computers. There is ongoing research and development to address these challenges, but there is no hard limit on the smallest size for a reliably working transistor. The trade-off between size and reliability must be carefully considered in chip design.
  • #1
OscarCP
TL;DR Summary
Several factors conspire against the usefulness of packing more and more transistors and diodes into a chip, however desirable the advantages of doing so may be. Some of these obstacles have been overcome successfully; others remain, among them these two: quantum tunneling in tiny transistors, and strikes on them by energetic cosmic-ray particles such as iron nuclei. The latter is not exclusive to small transistors, but can have more serious consequences in chips that contain more of them.
If this is not the correct forum, perhaps someone would be so kind as to move it to a more appropriate one? Thanks.

The current trend in computer chip manufacturing is towards making transistors smaller and smaller, so more and more can be packed in a single chip.
This has a number of advantages, for example in computing speed and possibly in energy efficiency.
However, shrinking the size creates problems, a number of which have been solved, at least for now: signal leaks, overheating, etc. But two remain that have to do with two basic physical facts:

(1) Quantum tunneling, which lets electrons and holes jump the potential barriers set up to keep them flowing as intended through transistors, so some end up in places where they do not belong, resulting in occasional computer malfunctions. This is not necessarily a grim prospect: there are ideas for using quantum tunneling to make transistors work better! (https://phys.org/news/2013-06-harnessing-potential-quantum-tunneling-transistors.html)

(2) Cosmic rays and other forms of energetic particles hitting a transistor and causing it to malfunction and, along with it, the computer itself. (https://www.sciencealert.com/rogue-...-space-are-causing-havoc-with-our-smartphones)

These two problems are not necessarily catastrophic. At present transistor sizes their most likely effect is that the user gets some weird results now and then, or that rare mistakes occur in calculations that cannot be detected easily, or in time to correct them.
But they are likely to become more frequent, reducing the reliability of computers, as sizes continue to shrink. At present the effort is to come up with working chips that have 5 nm transistors, roughly the length of a chain of 20 or 30 average atoms lined up one after another, and there is the intention to try for even smaller sizes. Some ideas have already been tried to increase the number of transistors in chips of the same size while getting around the tunneling problem, such as stacking larger components in vertical piles resembling high-rise, multi-storey buildings rather than placing smaller ones side by side horizontally as usual, but they are expensive to implement.
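A quick back-of-envelope check of the "20 or 30 atoms" figure above; the silicon atomic spacing used here is an approximate value, assumed just for illustration:

```python
# Back-of-envelope check of the "20 or 30 atoms" figure for a 5 nm feature.
# The atomic spacing is an approximate value for silicon, used for illustration.

feature_size_nm = 5.0
si_atomic_spacing_nm = 0.235  # roughly the Si-Si bond length, in nm

atoms_across = feature_size_nm / si_atomic_spacing_nm
print(f"~{atoms_across:.0f} atoms span a {feature_size_nm} nm feature")  # ~21
```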

Any thoughts?
 
  • Like
Likes Delta2
  • #2
I am not an EE (though I have studied classical EM a lot), I am a mathematician, but I think a lower bound on the size of a transistor is the size of one molecule of semiconductor material. Of course this is a very relaxed lower bound; we just can't make an operating transistor out of a single molecule, but just saying.

(OK, maybe my answer is a bit dumb, I don't know; sorry anyway.)
 
  • #3
Thanks, Delta2. Maybe one could make a PNP transistor with just one molecule, if it consists of enough atoms, with three parts: one of P (acceptor) semiconductor, another of N (donor), and a third of P again. Each part would be made of some semiconductor group with a few appropriate doping atoms of, say, phosphorus (N) or gallium (P) arranged just right, with three atoms of gold as contacts for the P emitter, N base and P collector ... (But it probably won't work, even if the chemical formula describing it makes sense at all.)

I am afraid that I might not have posed well the question I used to name this thread. It would have been clearer if called something like:

What is the limit of smallest size for a reliably working transistor in a computer chip?
 
Last edited by a moderator:
  • Like
Likes Delta2
  • #4
OscarCP said:
I am afraid that I might not have posed well the question I used to name this thread. It would have been clearer if called something like:

What is the limit of smallest size for a reliably working transistor in a computer chip?
Done!

Next time, simply report your post and ask for a title change. Mentors will have a look at it.
 
  • Like
Likes OscarCP and Delta2
  • #5
OscarCP said:
What is the limit of smallest size for a reliably working transistor in a computer chip?
There is no hard limit. Big transistors are more reliable, but they are bigger, slower, and cost more to run. How unreliable a transistor will you tolerate?
 
  • #6
DrClaude said:
Done!

Next time, simply report your post and ask for a title change. Mentors will have a look at it.
DrClaude, thanks. I am new to PF and have not yet found the place where the way one is supposed to do things here is explained. So, for example, I do not know, yet, how to "report a post", along with, I would imagine, a number of other things.
 
  • #7
Baluncore said:
There is no hard limit. Big transistors are more reliable, but they are bigger, slower, and cost more to run. How unreliable a transistor will you tolerate?
My question is about the limits imposed by quantum tunneling, cosmic-ray hits and other fundamental issues, as explained in my opening comment. A chip design whose transistors are too small to function reliably, causing random faults in the results of computations, will eventually become a serious problem.
I was hoping that those taking an interest in this thread might come up with ideas and suggestions on this question. I am aware that there has been quite a bit of research on this, and there must be plenty already known, as well as known to be still unknown, so good engineering and scientific ideas about this topic are out there.

So, you see, I am not trying to get help with solving a problem of mine, but to start a conversation on something I would like to know more about and that I think is an interesting topic.
Thank you for your interest.
 
  • #8
Baluncore said:
There is no hard limit. Big transistors are more reliable, but they are bigger, slower, and cost more to run. How unreliable a transistor will you tolerate?
I don't quite understand here: the transistors inside a modern CPU are not 100% reliable? (Unless the CPU overheats or is overvolted or something like that.)
 
  • #9
Delta2 said:
I don't quite understand here: the transistors inside a modern CPU are not 100% reliable? (Unless the CPU overheats or is overvolted or something like that.)
Please, have a close look at my previous comments. Thank you.
 
  • #10
OscarCP said:
Please, have a close look at my previous comments. Thank you.
OK, I see. To my mind, overheating is the main cause of a transistor not being reliable, but according to your OP, cosmic rays and quantum tunneling, as transistor size decreases, can cause unreliability too.
 
Last edited:
  • #11
OscarCP said:
I do not know, yet, how to "report a post"
[Attached screenshot showing the "Report" link on a post]
 
  • Like
Likes berkeman and DrClaude
  • #12
You can get a chip with a 7 nm feature size today at your local shop. There are 2 nm feature chips in the lab. That's about 10,000 atoms, so it can't be made hugely smaller. You will, however, need to specify how reliable is "reliable". One error in a billion calculations? A trillion? A quadrillion?

You tell me the error rate you can tolerate, and I'll tell you your feature size.
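As a sketch of how that trade-off could be framed, here is a toy calculation relating a per-transistor upset probability to the chance of at least one error per operation across a whole chip; the probability and the transistor count are made-up placeholders, not figures for any real process node:

```python
import math

# Illustrative only: how a per-transistor upset probability compounds across a
# whole chip. Both numbers below are made-up placeholders, not measured figures.

def chip_error_probability(p_per_transistor: float, n_transistors: int) -> float:
    """Probability that at least one transistor misbehaves during one operation,
    assuming independent upsets (a strong simplification)."""
    # Computed via log1p/expm1 to avoid floating-point underflow for tiny p.
    return -math.expm1(n_transistors * math.log1p(-p_per_transistor))

# Hypothetical example: 10 billion transistors, each with a 1e-22 chance of an
# upset per clock cycle, gives roughly a 1e-12 chance of an error per cycle.
print(chip_error_probability(1e-22, 10_000_000_000))
```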
 
  • Like
Likes protonsarecool, berkeman and Delta2
  • #13
Vanadium 50 said:
You will, however, need to specify how reliable is "reliable". One error in a billion calculations? A trillion? A quadrillion?

You tell me the error rate you can tolerate, and I'll tell you your feature size.
I have no idea. I believe that is an interesting question, so that is the main point of what I would like this conversation to be about. This forum is about condensed matter physics, which includes solid state physics, so the question is also about what solid state physics might have to say about solid-state transistor size. The question I am asking is not about how to solve some problem I have.
 
Last edited by a moderator:
  • #14
Just out of curiosity, what is the error rate of a transistor in a modern CPU (7 nm), at normal temperature of course?
 
  • #15
Delta2 said:
OK, I see. To my mind, overheating is the main cause of a transistor not being reliable, but according to your OP, cosmic rays and quantum tunneling, as transistor size decreases, can cause unreliability too.
Right. For tiny enough transistors. So the question could be phrased as "How tiny is tiny enough for those things to become a serious problem?" Or "How close is present-day bleeding-edge chip-making technology to 'tiny enough'?"
 
  • #16
  • Like
Likes Delta2 and berkeman
  • #17
Vanadium 50 said:

A single-event upset (SEU), also known as a single-event error (SEE), is a change of state caused by one single ionizing particle (ions, electrons, photons...) striking a sensitive node in a micro-electronic device, such as in a microprocessor, semiconductor memory, or power transistors.

I think if I ever edited Wiki articles, I'd delete "power transistors" from the list, LOL...
 
  • #18
berkeman said:
I think if I ever edited Wiki articles, I'd delete "power transistors" from the list, LOL...
They are a bit too big, aren't they?
As to size and the odds of SEEs happening: it occurs to me that the smaller the transistors, the more transistors in a chip, and the more chances that a transistor may fail, with a chance of causing the chip to fail some calculation. The number of transistors in a chip of a given size increases, I think, roughly as the inverse of the square of the size of each transistor, if all are of equal size and each takes up an area that is more or less square.
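A rough numerical illustration of that inverse-square scaling; the die area and the area-per-transistor factor used here are arbitrary assumptions, not actual process parameters:

```python
# Rough illustration of the inverse-square scaling: halving the transistor's
# linear size roughly quadruples how many fit on the same die.
# Die area and the area-per-transistor factor are arbitrary assumptions.

DIE_AREA_MM2 = 100.0  # hypothetical die area

def transistors_on_die(feature_size_nm: float, area_factor: float = 50.0) -> float:
    """Assume each transistor occupies roughly area_factor * (feature size)^2."""
    area_per_transistor_nm2 = area_factor * feature_size_nm ** 2
    die_area_nm2 = DIE_AREA_MM2 * 1e12  # 1 mm^2 = 1e12 nm^2
    return die_area_nm2 / area_per_transistor_nm2

for size_nm in (14, 7, 5):
    print(f"{size_nm} nm -> ~{transistors_on_die(size_nm):.2e} transistors")
```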
 
  • #19
OscarCP said:
They are a bit too big, aren't they?
Yes, exactly the reason for my comment. I don't know of any power transistor (even smaller ones in TO-92 packages) that could be messed up in any way by a single particle event.

OscarCP said:
As to size and the odds of SEEs happening: it occurs to me that the smaller the transistors, the more transistors in a chip, and the more chances that a transistor may fail, with a chance of causing the chip to fail some calculation.
Yeah, the issue is a single bit error, and what-all havoc that can cause. One of the worst ones would be an incorrect instruction fetch operation, with the incorrectly-interpreted instruction causing a vector into the middle of some other code. That can cause all kinds of corruptions before a watchdog timer causes a reset.

That's why for critical computers you will see triple-redundant processing with voting between the modules to be able to survive a single-bit error and keep running normally.
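A minimal sketch of that voting idea, assuming simple software-level majority voting over three independently computed results (real radiation-tolerant systems typically vote in hardware, at much finer granularity):

```python
from collections import Counter

# Minimal sketch of triple redundancy with voting: three independent modules
# compute the same result, and the majority answer is taken as correct.

def vote(result_a, result_b, result_c):
    """Return the majority result; raise if all three modules disagree."""
    value, count = Counter([result_a, result_b, result_c]).most_common(1)[0]
    if count >= 2:
        return value
    raise RuntimeError("All three modules disagree: uncorrectable fault")

# One module suffers a single-event upset, but the majority answer survives:
print(vote(42, 42, 43))  # -> 42
```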
 
  • #20
The effects of high-energy particles from cosmic radiation and solar flares hitting the transistors in a chip, causing not only soft errors (a.k.a. single-event upsets, or SEUs) but also actual damage to electronic components, are a reason why chips used in the computers of artificial satellites and space probes are "radiation hardened" and are often older, slower chips with larger and fewer transistors.

Also used for this reason is computing redundancy, where two or more computers, or two or more CPU cores in the same chip, work independently of each other, each carrying out the same calculations as the others, with their results compared to detect discrepancies and flag adverse events in one or more of them when they occur.

These strong measures, which slow down computation considerably compared with home-use and business-use computers nowadays, are so far not necessary at the Earth's surface or at lower altitudes (except in nuclear stations and other special cases), where error-checking methods already in use for years often suffice. But cosmic radiation starts to become more likely to cause problems at the greater heights (around 10 km, or some 30,000 feet) at which airliners cruise, particularly on long international and transoceanic flights.

https://en.wikipedia.org/wiki/Radiation_hardening

"Radiation hardening is the process of making electronic components and circuits resistant to damage or malfunction caused by high levels of ionizing radiation (particle radiation and high-energy electromagnetic radiation), especially for environments in outer space (especially beyond the low Earth orbit), around nuclear reactors and particle accelerators, or during nuclear accidents or nuclear warfare."

https://root-nation.com/en/articles-en/tech-en/en-spacecraft-processors/

"Currently, when creating new processors, more modern technological processes are used, and we can easily buy microscopic processors made using 7 nm lithography. The smaller the chip, the less voltage is needed to turn it on and off. In space, this can cause serious problems. The fact is that when exposed to radiation particles, there is a possibility of unplanned switching of the state in which the transistor will be located. This, in turn, can lead to the fact that the latter will stop working at the most unexpected moment, or calculations performed using such a processor will be inaccurate. And in space, this is unacceptable and can lead to tragic consequences."

One could say that current space hardware is ahead, in part, by being backwards when it comes to preventing problems that become serious with smaller and faster transistors, at least for now in challenging environments such as that of outer space.

So ... what do you know about this?
 
Last edited by a moderator:
  • Like
Likes Delta2

FAQ: How Small Can a Transistor Be and Still Work Reliably?

What is the current limit of the smallest transistor size in computer chips?

The current limit of the smallest transistor size in computer chips is around 5 nanometers. This is the size of transistors currently being produced by leading semiconductor companies.

Will the smallest transistor size continue to decrease in the future?

There is ongoing research and development in the field of nanotechnology to push the limits of transistor size even smaller. However, there are physical limitations that may prevent transistors from becoming much smaller than they are currently.

How does the size of a transistor affect the performance of a computer chip?

The smaller the transistor size, the more transistors can fit on a single chip, allowing for more processing power and faster speeds. Smaller transistors also use less power, making them more energy-efficient.

What are the challenges in creating smaller transistors?

As transistors become smaller, it becomes more difficult to control the flow of electrons and maintain stability. This requires advanced materials and manufacturing techniques, which can be costly and complex.

Are there any alternatives to traditional silicon-based transistors for smaller sizes?

Scientists are exploring alternative materials such as graphene and carbon nanotubes for use in transistors. These materials have the potential to create even smaller transistors, but more research is needed to overcome technical challenges and make them commercially viable.
