Can someone give me a better intuition of bandwidth?

In summary, bandwidth is the range of frequencies that a signal or wave occupies. The term is used in two ways: the range of frequencies taken up by the data itself, and the range of radio frequencies occupied by the transmission.
  • #1
pj33
TL;DR Summary
Intuition of bandwidth
Can someone give me a better intuition of bandwidth?
The way I see it is that the bandwidth is the range of frequencies which a signal/wave is allowed to have. This doesn't feel complete though.
For example, how can I explain that TDMA, FDMA and CDMA are similar in this sense? As far as I know these 3 methods use equivalent bandwidth, but the above definition doesn't seem sufficient to explain this unless I have misunderstood something.

Thank you in advance!
 
  • #3
pj33 said:
Summary:: Intuition of bandwidth

The way I see it is that the bandwidth is the range of frequencies which a signal/wave is allowed to have.
The word "allowed" is where your problem lies. The bandwidth of many signals can't be stated in the simple way you are implying. Take any phase or frequency modulated signal: its spectrum can extend to infinity, but no receiver is designed to accept the whole spectrum - for obvious reasons.

It's a convention that the half-power points (or the difference between the two, for an RF modulated signal) define the bandwidth. Truncating a signal will introduce distortion, but it's 'sufficient' to define it that way.
For any signal transmission exercise, the noise that's introduced is always relevant, so you have to balance the distortion resulting from truncation of the spectrum against the noise admitted by any filtering. In practice, the 3dB bandwidth works fine; many references simply assume that.
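To make the 3dB convention concrete, here is a minimal sketch (my own, not from the post; the filter type, centre frequency and sample rate are arbitrary choices) of reading the half-power bandwidth off a band-pass response numerically:

```python
# Minimal sketch: measure the 3 dB (half-power) bandwidth of a band-pass
# response. The Butterworth filter and sample rate are arbitrary choices.
import numpy as np
from scipy import signal

fs = 1e6                                    # sample rate, Hz
# 4th-order Butterworth band-pass; its design edges are its -3 dB points
b, a = signal.butter(4, [90e3, 110e3], btype="bandpass", fs=fs)
f, h = signal.freqz(b, a, worN=8192, fs=fs)

power_db = 20 * np.log10(np.abs(h) + 1e-12)
peak = power_db.max()
inside = np.where(power_db >= peak - 3.0)[0]   # region within 3 dB of the peak
bw_3db = f[inside[-1]] - f[inside[0]]
print(f"3 dB bandwidth = {bw_3db/1e3:.1f} kHz")  # ~20 kHz by construction
```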
 
  • #4
I agree with #3 that the word "allowed" is confusing.

The term bandwidth is used in two ways: either the spectrum taken up by the data to be transmitted, or the radio-frequency spectrum occupied by the transmission. These are not the same thing.

For the first, look at a bit stream such as 1010101010. It can be considered as five cycles of alternating current, with a frequency of five cycles divided by the elapsed time. So if the ten bits took 1 microsecond, the frequency is ##5/(10^{-6}\,\text{s}) = 5\ \text{MHz}##. Now look at another bit stream, 101110100100111. In this case you might notice that other frequencies are present, and for such a bit stream we expect it to occupy not just the single frequency at 5 MHz but all frequencies from zero up to that figure, in the manner of white noise. So this is the bandwidth according to the first definition (which is generally viewed with horror by radio engineers).
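To make that concrete, a small sketch (my own illustration; the sampling figures are arbitrary) that takes the FFT of both bit streams shows the alternating stream concentrating its energy at 5 MHz and the irregular one spreading energy from near zero up to that region:

```python
# Sketch: spectra of the two bit streams discussed above, via the FFT.
import numpy as np

bit_rate = 10e6              # 10 bits per microsecond
samples_per_bit = 100        # arbitrary oversampling factor
fs = bit_rate * samples_per_bit

def spectrum(bits):
    wave = np.repeat(np.array(bits, dtype=float), samples_per_bit)
    spec = np.abs(np.fft.rfft(wave))
    freqs = np.fft.rfftfreq(wave.size, d=1/fs)
    return freqs, spec

f1, s1 = spectrum([1,0,1,0,1,0,1,0,1,0])             # pure alternation
f2, s2 = spectrum([1,0,1,1,1,0,1,0,0,1,0,0,1,1,1])   # irregular stream

# The alternating stream peaks at a single line (5 MHz, ignoring the DC bin)
print("alternating stream peak:", f1[np.argmax(s1[1:]) + 1] / 1e6, "MHz")

# The irregular stream spreads its AC energy over 0..~5 MHz and beyond
ac, f_ac = s2[1:], f2[1:]
frac = (ac[f_ac <= 5e6] ** 2).sum() / (ac ** 2).sum()
print(f"irregular stream: {frac:.2f} of AC energy lies below 5 MHz")
```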

If we now want to send these bits over a radio system, we use the bit stream to modulate a carrier. In a simple case, where we make the bits turn the carrier on and off to represent 1 or 0, we find that side frequencies are created on each side of the carrier, and for the present example they extend at least 5 MHz each side, usually more. So the radio spectrum occupied by the transmission is at least 10 MHz wide, and this is the bandwidth of the radio transmission. By changing the type of modulation it is possible to narrow the spectrum a little; alternatively, an intentionally wideband signal can be used, as in CDMA. This is the second meaning of bandwidth.
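A matching sketch (again mine; the 50 MHz carrier and sample rate are assumed for illustration) of simple on-off keying shows the baseband spectrum reappearing as sidebands around the carrier:

```python
# Sketch: on-off keying a carrier and inspecting the occupied RF spectrum.
import numpy as np

fs = 400e6                                 # sample rate, Hz
bit_rate = 10e6
bits = np.array([1, 0, 1, 1, 1, 0, 1, 0, 0, 1], dtype=float)
baseband = np.repeat(bits, int(fs / bit_rate))
t = np.arange(baseband.size) / fs
ook = baseband * np.cos(2 * np.pi * 50e6 * t)   # carrier keyed on and off

spec = np.abs(np.fft.rfft(ook))
freqs = np.fft.rfftfreq(ook.size, d=1/fs)

# Energy straddles the 50 MHz carrier; the +/-5 MHz band holds most of it,
# but rectangular keying leaks well beyond ("at least 5 MHz, usually more").
band = (freqs > 45e6) & (freqs < 55e6)
print("fraction of power within 50 +/- 5 MHz:",
      round(float((spec[band] ** 2).sum() / (spec ** 2).sum()), 3))
```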
 
  • #5
I think bandwidth is easiest to understand when we think of multiple messages sent simultaneously on different frequencies. How close can neighboring frequencies be before the messages interfere with each other?

Your choice of modulation scheme may affect how much interference you can stand, but your neighbors on either side may not be using the same scheme.
 
  • #6
anorlunda said:
I think bandwidth is easiest to understand when we think of multiple messages sent simultaneously on different frequencies. How close can neighboring frequencies be before the messages interfere with each other?

Your choice of modulation scheme may affect how much interference you can stand, but your neighbors on either side may not be using the same scheme.
As you say, co-channel and adjacent-channel interference are both relevant, but channel noise always needs to be considered. When all the signals in a given channel (or rather, band of channels) are using optimal modulation systems (which means they are probably wideband, noise-like signals), each of the receivers is effectively discriminating only its wanted signal from noise. In that case, the bandwidth to consider would be the noise bandwidth, defined as lying between the 3dB points on the channel response. Putting a signal on a carrier really makes very little difference to the necessary occupied bandwidth (when done properly).

Any good spectrum plan would be based on sharing any particular band among similar signals. That's what was done for decades, with whole bands allocated to TV and sound broadcasting. Things didn't change much, and users were forced to make sure any system changes were compatible. With the migration of virtually every service to digital modulation, we're heading towards a situation where virtually all information is carried by a common system. A few very specialised channels will still do their own thing (like military comms to submarines, etc.)
 
  • #7
I think Bandwidth can be confusing because it's used in many different contexts. It may best be thought of as a "region of interest" in the RF spectrum (frequency domain). Yes, that's a pretty sloppy definition, but that's my point. You need to know the context for a precise definition.

Bandwidth may be what is allowed by law for a transmitter.
Bandwidth may be the actual frequency content of a signal (transmitted or not).
Bandwidth may refer to the characteristics of the receiver.
Bandwidth may describe a complete communication link, i.e. the baseband signal that gets all the way to its destination.
Bandwidth may describe the data rate your optic nerve can carry.
etc.

It also may refer to information capacity in general, in the vernacular. Like my limited bandwidth to understand a PF post after two beers.
 
  • #8
DaveE said:
You need to know the context for a precise definition.
Channel filtering (which defines the bandwidth of the signal itself) is shared between the transmitting and receiving equipment. In practice, it's much easier to put the pulse-shaping filter in the receiver, because many transmitting systems can't support complicated specs.

The detailed pulse response is particularly relevant for digital signals because, in order to maximise capacity, inter-symbol interference needs to be minimised. Not only the bandwidth but the details of the overall filter characteristic are important. Basic theory usually considers simple binary data streams, but most data uses more than two levels these days. This link discusses Nyquist's work on channel bandwidth and overall filter requirements.
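To illustrate the pulse-shaping point (a sketch of my own, not taken from the linked reference), here is the raised-cosine pulse, the textbook Nyquist pulse shape: it is zero at every other symbol instant, so ideally-timed samples of neighbouring symbols don't interfere:

```python
# Sketch: raised-cosine pulse, zero at all nonzero multiples of the symbol
# period T, which is what eliminates ISI at the sampling instants.
import numpy as np

def raised_cosine(t, T=1.0, beta=0.35):
    """Unit-peak raised-cosine impulse response; beta is the roll-off."""
    t = np.asarray(t, dtype=float)
    num = np.sinc(t / T) * np.cos(np.pi * beta * t / T)
    denom = 1.0 - (2.0 * beta * t / T) ** 2
    safe = np.abs(denom) > 1e-8
    # The removable singularity at |t| = T/(2 beta) has a known finite limit
    limit = (np.pi / 4) * np.sinc(1.0 / (2.0 * beta))
    return np.where(safe, num / np.where(safe, denom, 1.0), limit)

T = 1.0
instants = np.arange(-4, 5) * T            # neighbouring symbol instants
print(np.round(raised_cosine(instants, T), 6))   # 1 at t=0, 0 elsewhere
```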
 
  • #9
pj33 said:
Summary:: Intuition of bandwidth

Can someone give me a better intuition of bandwidth?
The way I see it is that the bandwidth is the range of frequencies which a signal/wave is allowed to have. This doesn't feel complete though.
For example, how can I explain that TDMA, FDMA and CDMA are similar in this sense? As far as I know these 3 methods use equivalent bandwidth, but the above definition doesn't seem sufficient to explain this unless I have misunderstood something.

I don't quite understand what you mean: how do we use the concept of bandwidth to explain why TDMA, FDMA and CDMA use equivalent bandwidth?

In any case, as far as digital communication is concerned, there are two basic theorems worth knowing.

Nyquist theorem – gives the theoretical maximum symbol rate in a noiseless channel with bandwidth BW: ##\text{Max Symbol Rate} = 2\,BW~~\text{[symbols per second]}##

Shannon's law – gives the maximum transmission rate over a channel with bandwidth BW, Gaussian-distributed noise, and signal-to-noise ratio SNR = S/N:
##\text{Max Capacity of the channel} = BW \log_2\left(1+\frac{S}{N}\right)~~\text{[bits per second]}##

No matter how advanced modern digital communication technology becomes (spread-spectrum modulation, orthogonal frequency-division multiplexing, data compression, error detection and correction, etc.), it is still limited by these two basic results.
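A quick numeric illustration of the two limits (the channel figures are assumed for the example, not taken from any real system):

```python
# Sketch: Nyquist symbol-rate limit vs Shannon capacity for an assumed
# 1 MHz channel with a 30 dB signal-to-noise ratio.
import math

BW = 1e6                       # channel bandwidth, Hz
snr = 10 ** (30.0 / 10)        # 30 dB -> linear SNR of 1000

max_symbol_rate = 2 * BW                   # Nyquist, noiseless channel
capacity = BW * math.log2(1 + snr)         # Shannon limit, bits per second

print(f"Nyquist max symbol rate: {max_symbol_rate/1e6:.1f} Msymbol/s")
print(f"Shannon capacity:        {capacity/1e6:.2f} Mbit/s")
# ~9.97 Mbit/s at 2 Msymbol/s means ~5 bits per symbol (e.g. 32-point
# constellations) would be needed to approach the Shannon limit.
```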
 
  • #10
alan123hk said:
Nyquist theorem – gives the theoretical maximum symbol rate in a noiseless channel with bandwidth BW: Max Symbol Rate = 2 BW [symbols per second]
That theorem, about a noiseless channel, is not really relevant in a noisy channel (and you can't avoid noise). The Nyquist theorem tells you that the original waveform can only be reproduced exactly if the sample rate is high enough. Basically, if you limit the bandwidth, the eye pattern becomes degraded and the noise performance suffers. But that doesn't matter if the transmission system is adequate. You have to 'start with' what Nyquist says, of course, but you need to be aware of what its consequences are.
alan123hk said:
it is also limited by the above two basic theories.
No - only the Shannon limit applies.
 
  • #11
sophiecentaur said:
That theorem, about a noiseless channel, is not really relevant in a noisy channel (and you can't avoid noise)
Although the Nyquist theorem is defined for a noiseless channel, I believe it implies the same restriction in the case of a noisy channel, and a worse situation may occur there: the maximum symbol rate may be less than twice the bandwidth.

sophiecentaur said:
No - only the Shannon limit applies.
Of course, the Nyquist theorem cannot determine the final maximum channel capacity. It relates to inter-symbol interference and only gives the maximum symbol rate, and each symbol can represent one or more bits. The maximum actual information transmission rate is limited by Shannon's theorem and is measured in bits per second.
 
  • #12
alan123hk said:
I believe it implies the same restriction in the case of a noisy channel,
It's true to say that the 'best' performance is related to the eye pattern, but ISI is only a form of distortion and can be equalised for - same as any distortion (Edit: many distortions). Equalisation is used in many systems and the 'Nyquist limit' is frequently exceeded in practice. I think I'm right in pointing out that the sort of diagrams used to illustrate the consequences of non-ideal frequency response are probably over-stressed early on in courses on signalling theory. It's just one of many factors that affect overall channel capacity.
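A toy sketch of that equalisation point (my own illustration; the two-tap channel is invented): once the channel is characterised, its ISI can be inverted exactly in the noiseless case:

```python
# Sketch: zero-forcing equalisation of a channel that smears each symbol
# into the next. Noiseless case only; inverting also amplifies noise.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=20)    # binary antipodal symbols

h = [1.0, 0.5]                                # channel: 50% spill into next symbol
received = signal.lfilter(h, [1.0], symbols)  # symbols convolved with channel

# Equalise by filtering with the channel's exact inverse, 1/H(z)
equalised = signal.lfilter([1.0], h, received)

print(np.allclose(equalised, symbols))        # True: the ISI is fully undone
```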
 
  • #13
sophiecentaur said:
Equalisation is used in many systems and the 'Nyquist limit' is frequently exceeded in practice.

As I understand it, equalization techniques to eliminate inter-symbol interference only bring the symbol rate as close as possible to the Nyquist condition (maximum symbol rate = 2 BW). Therefore, the system designer must be aware of this basic restriction imposed by the Nyquist theorem.

Are you saying that through some advanced equalization or signal processing techniques, the maximum symbol rate can exceed 2 BW?

If this is the case, I need to correct my previous thoughts.
 
  • #14
sophiecentaur said:
Equalisation is used in many systems and the 'Nyquist limit' is frequently exceeded in practice.
I think formally this statement is incorrect (although I am willing to be dissuaded).
For instance mpeg (or Dolby for that matter) can provide music that sounds far better than uncompressed music sampled at the Nyquist limit. Visual compression schemes that provide effectively superior images are legion. These all rely on context that allows a smaller information content to be expanded upon receipt in a prescribed manner.
This does not imply that the information has exceeded Nyquist.
 
  • #15
hutchphd said:
I think formally this statement is incorrect (although I am willing to be dissuaded).
For instance mpeg (or Dolby for that matter) can provide music that sounds far better than uncompressed music sampled at the Nyquist limit. Visual compression schemes that provide effectively superior images are legion. These all rely on context that allows a smaller information content to be expanded upon receipt in a prescribed manner.
This does not imply that the information has exceeded Nyquist.
I have to ask what you understand by the words in the Nyquist Criterion. Remember, it's a criterion and not a 'rule' or a 'law'.
Your comment about mpeg takes us a long way away from the Nyquist Criterion.

mpeg takes an analogue signal and codes it in a very complicated way, in an attempt to redistribute coding artefacts in such a way as to make them 'minimally unacceptable'. I have no idea how one could quantify the 'information content' of the mpeg signal because it is coloured by human psychology.

Nyquist considerations may come into the system twice, aamof. Once in any process of analogue-to-digital conversion and once in the process of transmitting the digital data (which is now an analogue signal in its own right, of course). Nyquist applies to the transmitted data signal because it imposes a condition of maximum eye height - or equivalent - for a given symbol rate. But it's only a guideline. If you have loads of carrier-to-noise to spare, you can degrade the eye almost without limit, if you are prepared to take long enough (processing delay) in digging the data out of the received analogue signal.
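For anyone who hasn't met an eye pattern, a quick sketch (mine; the filter order and cut-off are arbitrary) that overlays many two-symbol slices of a band-limited random bit stream; tightening the filter closes the eye, which is the degradation I'm describing:

```python
# Sketch: build an eye pattern by overlaying two-bit-long slices of a
# band-limited random NRZ stream. Arbitrary filter and bit-rate choices.
import numpy as np
import matplotlib.pyplot as plt
from scipy import signal

rng = np.random.default_rng(1)
spb = 32                                    # samples per bit
bits = rng.integers(0, 2, 400).astype(float)
wave = np.repeat(2 * bits - 1, spb)         # NRZ levels of +/-1

# 4th-order Butterworth low-pass at 0.7x the bit rate (2/spb of Nyquist)
b, a = signal.butter(4, 0.7 * 2 / spb)
filtered = signal.lfilter(b, a, wave)

plt.figure()
for start in range(10 * spb, wave.size - 2 * spb, spb):
    plt.plot(filtered[start:start + 2 * spb], color="tab:blue", alpha=0.15)
plt.title("Eye pattern: Butterworth low-pass at 0.7x bit rate")
plt.show()
```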

We can get very close to answering the original question by using the simple definition of bandwidth as the frequency range occupied by a channel between the 3dB points. Bandwidth in the vernacular is not really a term that we should be using for engineering. "Data rate" fits in after any messing around with the input information (coding) and before decoding it. "Bandwidth" has been hijacked by journalists and salespeople, and what they mean could be absolutely anything.
 
  • #16
sophiecentaur said:
Nyquist considerations may come into the system twice, aamof. Once in any process of analogue-to-digital conversion and once in the process of transmitting the digital data (which is now an analogue signal in its own right, of course). Nyquist applies to the transmitted data signal because it imposes a condition of maximum eye height - or equivalent - for a given symbol rate. But it's only a guideline. If you have loads of carrier-to-noise to spare, you can degrade the eye almost without limit, if you are prepared to take long enough (processing delay) in digging the data out of the received analogue signal.

After further research on this issue, I found that what you said makes sense, and I now very much agree that the Nyquist condition (max symbol rate = 2BW) is just a guideline.

To be precise, even if the symbol rate > 2BW, it only means that inter-symbol interference cannot be completely eliminated. This may increase the bit error rate due to the reduction of noise margin, but it is only one of the influencing factors.
 
  • #17
alan123hk said:
To be precise, even if the symbol rate > 2BW, it is only theoretically impossible to completely eliminate the inter symbol interference. This may increase the bit error rate due to the reduction of noise margin, but it is only one of the influencing factors.
What does "only theoretically impossible" mean?? In my vernacular that means not possible. If there are larger sources of error then it may not be important, granted. But this seems a tortured way to do science.

But I guess I seldom worry about Heisenberg uncertainty when trying to hit a baseball, either.

 
  • #18
hutchphd said:
What does "only theoretically impossible" mean?? In my vernacular that means not possible.
Okay, I changed it to "it only means that inter-symbol interference cannot be completely eliminated."
 
  • #19
alan123hk said:
Okay, I changed it to "it only means that inter-symbol interference cannot be completely eliminated."
Inter-symbol interference is not, in itself, a problem. It can be 'undone' in many cases, once the channel has been characterised.

Imo, it's important to remember that the work Nyquist and Shannon did was a long time ago. They established some very useful basics about communications, and a lot of the work was done using Morse code as the system to work with. The Shannon theorem operates at one end of the topic (maximum information transfer possible in the presence of noise) and the Nyquist criterion considers no system noise at all. All practical systems operate somewhere in the middle, and yet people talk as if one or both of these principles must apply to any system they want to discuss. This isn't surprising when you read what many basic textbooks have to say about it. Friction and noise are often neglected in basic studies. Also, there's such a lot of material thrown up by search engines which over-simplifies pretty much all science and technology.
PF does its best to redress the balance.
 
  • #20
pj33 said:
Summary:: Intuition of bandwidth

For example, how can I explain that TDMA, FDMA and CDMA are similar in this sense? As far as I know these 3 methods use equivalent bandwidth, but the above definition doesn't seem sufficient to explain this unless I have misunderstood something.
Did you google for an explanation of these 3?

from wiki ...
Time-division multiple access (TDMA) is a channel access method for shared-medium networks. It allows several users to share the same frequency channel by dividing the signal into different time slots.[1] The users transmit in rapid succession, one after the other, each using its own time slot. This allows multiple stations to share the same transmission medium (e.g. radio frequency channel) while using only a part of its channel capacity. Dynamic TDMA is a TDMA variant that dynamically reserves a variable number of time slots in each frame to variable bit-rate data streams, based on the traffic demand of each data stream.

Frequency-division multiple access (FDMA) is a channel access method used in some multiple-access protocols. FDMA allows multiple users to send data through a single communication channel, such as a coaxial cable or microwave beam, by dividing the bandwidth of the channel into separate non-overlapping frequency sub-channels and allocating each sub-channel to a separate user. Users can send data through a subchannel by modulating it on a carrier wave at the subchannel's frequency. It is used in satellite communication systems and telephone trunklines.

FDMA splits the total bandwidth into multiple channels. Each ground station on the Earth is allocated a particular frequency group (or a range of frequencies). Within each group, the ground station can allocate different frequencies to individual channels, which are used by different stations connected to that ground station. Before the transmission begins, the transmitting ground station looks for an empty channel within the frequency range that is allocated to it and once it finds an empty channel, it allocates it to the particular transmitting station.

Code-division multiple access (CDMA) is a channel access method used by various radio communication technologies. CDMA is an example of multiple access, where several transmitters can send information simultaneously over a single communication channel. This allows several users to share a band of frequencies (see bandwidth). To permit this without undue interference between the users, CDMA employs spread spectrum technology and a special coding scheme (where each transmitter is assigned a code).[1][2]

CDMA optimizes the use of available bandwidth as it transmits over the entire frequency range and does not limit the user's frequency range.

CDMA allows several users to share a band of frequencies without undue interference between the users. It is used as the access method in many mobile phone standards. IS-95, also called "cdmaOne", and its 3G evolution CDMA2000, are often simply referred to as "CDMA", but UMTS, the 3G standard used by GSM carriers, also uses "wideband CDMA", or W-CDMA, as well as TD-CDMA and TD-SCDMA, as its radio technologies.
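A toy illustration of the CDMA idea (my own sketch, not from the article above): two users transmit at the same time in the same band and are separated purely by orthogonal spreading codes:

```python
# Sketch: two CDMA users sharing one channel via orthogonal Walsh codes.
import numpy as np

code_a = np.array([1,  1, 1,  1])      # length-4 Walsh codes:
code_b = np.array([1, -1, 1, -1])      # dot product is zero (orthogonal)

bits_a = np.array([ 1, -1, 1])         # antipodal data bits, user A
bits_b = np.array([-1, -1, 1])         # antipodal data bits, user B

# Spread: each data bit becomes one full code word
tx_a = np.concatenate([b * code_a for b in bits_a])
tx_b = np.concatenate([b * code_b for b in bits_b])
channel = tx_a + tx_b                  # both occupy the whole band at once

# Despread: correlate the summed signal against each user's own code
rx_a = [int(np.sign(chunk @ code_a)) for chunk in channel.reshape(-1, 4)]
rx_b = [int(np.sign(chunk @ code_b)) for chunk in channel.reshape(-1, 4)]
print(rx_a, rx_b)                      # [1, -1, 1] and [-1, -1, 1] recovered
```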
 
  • #21
davenn said:
Did you google for an explanation of these 3
Obviously well worth knowing about. However, they are not, basically, methods of sampling or modulation. They are very advanced coding ('Codulation') systems and they're well outside what the OP needs to sort out first. The schemes are effectively methods of getting closer to the Shannon limit in that they allow many different information streams to be combined in a way that they interfere minimally with each other and minimise the effects of channel noise - good value, I would say but definitely what we would refer to as 'Third Year Work'.
 
  • #22
davenn said:
Did you google for an explanation of these 3?
Yes, I know how each works.
What I meant is: if someone uses the full bandwidth for a certain amount of time (TDMA), is that equivalent to someone using part of the bandwidth for more time (FDMA)?
 
  • #23
pj33 said:
Yes, I know how each works.
What I meant is: if someone uses the full bandwidth for a certain amount of time (TDMA), is that equivalent to someone using part of the bandwidth for more time (FDMA)?
The basics of comms theory are independent of specific systems. The differences between comms systems are very much in the implementation and in the commercial and political requirements. The total information transferred will be governed by the time t and bandwidth B - along the lines of Bt. When you say "equivalent", do you imply "all things being equal"? How could you be sure about that?
There's not a lot you can say about such a comparison - or about any comms channel until you specify the transmit power and the received noise / interference. If the systems were all on a par under all conditions then many people would just use binary PSK in their own individual systems (as radio amateurs tend to do). But there is a huge advantage in multiplexing many information streams into one channel because it can minimise mutual interference BUT all the basics still apply.
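As a back-of-envelope version of the 'Bt' argument above (an idealised sketch that ignores guard times, guard bands and synchronisation overhead): share a band ##B## among ##N## users for a total time ##T##. A TDMA user gets the whole band for ##T/N## seconds; an FDMA user gets ##B/N## hertz for the whole ##T## seconds. If the achieved SNR is the same in both cases, Shannon's formula gives each user the same number of bits:
##\frac{T}{N}\,B\,\log_2(1+\text{SNR}) = T\,\frac{B}{N}\,\log_2(1+\text{SNR})##
In that idealised sense the two schemes are equivalent; real systems differ in overheads and in how the SNR actually behaves, which is why the comparison can't be settled in the abstract.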
 
  • #24
A more objective method is to compare bandwidth efficiency ##~(\frac {\text{bit/s}} {\text{Hz}})~## and the ratio of energy per bit to noise power spectral density ##~ (\frac {\text{joules}} {\text {watts per hertz}})~##.
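A small numeric sketch of those two metrics (the link figures are invented for the example):

```python
# Sketch: bandwidth efficiency and Eb/N0 for an assumed 2 Mbit/s link
# occupying 1 MHz, with assumed received power and noise density.
import math

bit_rate = 2e6            # bits per second
BW = 1e6                  # hertz
signal_power = 1e-9       # watts at the receiver (assumed)
noise_density = 1e-16     # N0, watts per hertz (assumed)

efficiency = bit_rate / BW                   # bit/s per Hz
energy_per_bit = signal_power / bit_rate     # joules per bit
eb_n0_db = 10 * math.log10(energy_per_bit / noise_density)

print(f"bandwidth efficiency = {efficiency:.1f} bit/s/Hz")
print(f"Eb/N0 = {eb_n0_db:.1f} dB")          # ~7.0 dB for these figures
```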
 

FAQ: Can someone give me a better intuition of bandwidth?

What is bandwidth and why is it important?

In everyday networking usage, bandwidth refers to the amount of data that can be transmitted over a network connection in a given amount of time; in signal processing, it refers to the range of frequencies a signal occupies. The two are linked by Shannon's theorem: more frequency bandwidth permits a higher data rate. Bandwidth is important because it determines the speed and efficiency of data transfer, which can impact the overall performance of a network.

How does bandwidth affect internet speed?

Bandwidth directly affects internet speed because it determines how much data can be transferred at a given time. The higher the bandwidth, the higher the achievable internet speed.

How is bandwidth measured?

In the data-rate sense, bandwidth is measured in bits per second (bps) or its multiples, such as kilobits per second (Kbps) or megabits per second (Mbps). In the frequency sense, it is measured in hertz (Hz).

What factors can affect bandwidth?

There are several factors that can affect bandwidth, including the type of network connection, the number of devices connected to the network, and the amount of data being transferred. Network congestion and interference can also impact bandwidth.

How can I improve my bandwidth?

To improve bandwidth, you can upgrade to a higher speed internet plan, optimize your network settings, and limit the number of devices connected to the network. Using a wired connection instead of wireless can also improve bandwidth.
