Calculating throughput for serial vs parallel

In summary: a parallel interface with a width of 16 bits and a latency of 200 microseconds will take at least 200 μs to transmit a message of 16 bits. A serial interface with a latency of 20 microseconds will take 20 μs to transmit a message of 1 bit, but to work out how long it will take to transmit a message of 16 bits we also need to know its data rate, or throughput (also called, erroneously, bandwidth). If it has a data rate of 100 kbps (i.e. 1 bit every 10 μs), then it will take 16 × 10 μs + 20 μs = 180 μs to transfer 16 bits.
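To make the summary's arithmetic easy to check, here is a minimal Python sketch of the transfer-time model it uses. The 100 kbps serial data rate is the summary's own illustrative assumption, not part of the original problem:

    # Transfer-time model from the summary above.
    # Assumption (illustrative, not problem data): the serial line runs at
    # 100 kbps, i.e. one bit every 10 microseconds.

    def serial_time_us(n_bits, latency_us=20, bit_time_us=10):
        # Latency before the first bit, then one bit-time per bit.
        return latency_us + n_bits * bit_time_us

    def parallel_time_us(n_bits, width=16, latency_us=200):
        # Each latency period moves one full word of `width` bits.
        words = -(-n_bits // width)  # ceiling division
        return words * latency_us

    print(serial_time_us(16))    # 20 + 16 * 10 = 180 microseconds
    print(parallel_time_us(16))  # one 16-bit word -> 200 microseconds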
  • #1
ver_mathstats
Homework Statement
We are given two choices for interfacing a serial interface with a latency of 20 microseconds and a parallel interface with a width of 16 bits and a latency of 200 microseconds. How long would it take to transfer 16 bits over each interface?
Relevant Equations
throughput = data item / latency
I know that a serial interface has a single data line whereas a parallel interface can have several data lines. Could someone check over my work please? Would it just be, for a serial interface, 16 bits x 20 microseconds for 320 bits per microsecond, and then for parallel it would just take 200 microseconds to transfer 16 bits? So would it be better to go with the parallel interface?

Thank you and help would be appreciated
 
  • #2
Your serial answer should be 320 microseconds, not 320 bits per microsecond.

20 usecs/bit x 16 bits = 320 usecs

so yes 200 usecs is faster than 320 usecs.
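A quick check of that arithmetic in Python, using the thread's model that the serial line costs its 20 μs latency for every bit:

    serial_us = 16 * 20    # 16 bits at 20 microseconds per bit -> 320
    parallel_us = 200      # one 16-bit word over the parallel interface
    print(serial_us, parallel_us, parallel_us < serial_us)  # 320 200 True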
 
  • #3
jedishrfu said:
Your serial answer should be 320 microseconds, not 320 bits per microsecond.

20 usecs/bit x 16 bits = 320 usecs

so yes 200 usecs is faster than 320 usecs.
Sorry, I just realized that, but thank you. I've got it now.
 
  • #4
And @ver_mathstats just to be sure you know, this is a textbook problem and VERY unrealistic. It would be a very odd circuit technology indeed that could manage 20 microseconds for a serial line but require 200 microseconds each for the parallel lines. PLUS, the serial has to be turned from parallel into serial, transmitted, then turned back into parallel.
 
  • #5
phinds said:
And @ver_mathstats just to be sure you know, this is a textbook problem and VERY unrealistic. It would be a very odd circuit technology indeed that could manage 20 microseconds for a serial line but require 200 microseconds each for the parallel lines. PLUS, the serial has to be turned from parallel into serial, transmitted, then turned back into parallel.
Is that the brief on SATA vs PATA? :smile:
https://en.wikipedia.org/wiki/SATA
 
  • Haha
Likes pbuk
  • #6
Oh dear, this seems to have gone off-track. I think the problem starts here:

ver_mathstats said:
Relevant Equations: throughput = data item / latency
This is not correct. I don't know what "data item" means in this context, but throughput and latency are not related in this (or any similar) way.

Now
ver_mathstats said:
a parallel interface with a width of 16 bits and a latency of 200 microseconds
will take at least 200 μs to transmit a message of 16 bits, and
ver_mathstats said:
a serial interface with a latency of 20 microseconds
will take 20 μs to transmit a message of 1 bit, but in order to work out how long it will take to transmit a message of 16 bits we need to know its data rate, or throughput (also called, erroneously, bandwidth). If it has a data rate of 100 kbps (i.e. 1 bit every 10 μs) then it will take 16 x 10 μs + 20 μs = 180 μs to transfer 16 bits.
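To make that concrete, here is a small Python sketch of this model, total time = latency + bits / data rate. The data rates tried are illustrative assumptions only, since the problem does not state one:

    def serial_total_us(n_bits, latency_us, rate_bps):
        # Latency until the first bit arrives, then n_bits at the line's data rate.
        bit_time_us = 1e6 / rate_bps
        return latency_us + n_bits * bit_time_us

    for rate_bps in (100_000, 1_000_000, 10_000_000):
        print(rate_bps, serial_total_us(16, 20, rate_bps))
    # 100000   180.0  <- the 100 kbps example above: beats the 200 us parallel bus
    # 1000000   36.0
    # 10000000  21.6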
 
  • #7
phinds said:
And @ver_mathstats just to be sure you know, this is a textbook problem and VERY unrealistic. It would be a very odd circuit technology indeed that could manage 20 microseconds for a serial line but require 200 microseconds each for the parallel lines. PLUS, the serial has to be turned from parallel into serial, transmitted, then turned back into parallel.
In case @256bits' excellent piece of humour is lost on anyone, note that you can in general achieve much higher frequencies on serial cables than parallel cables because you don't have the problem of signals interfering with one another (crosstalk). Also, the "problem" of turning your data from (16 bits) parallel to serial and back again only exists if your data is 16 bits wide in the first place, and ends up 16 bits wide at the other end.
 
  • #8
pbuk said:
This is not correct. I don't know what "data item" means in this context, but throughput and latency are not related in this (or any similar) way.
I did not use the formula "throughput = data item / latency" for this particular problem, as I am told I have to calculate throughput in a different part. In my class I was taught that a data item is denoted d_i and can be, for example, a bit, and that latency is denoted Δt; the throughput is then calculated as |d_i| / Δt. For this particular problem that gives the throughput in bits per millisecond.
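For illustration only, here is the class's formula applied to the serial line as ver_mathstats describes it, treating the 20 μs latency as the time per one-bit data item (this is exactly the usage pbuk disputes above):

    # Class's formula: throughput = |d_i| / delta_t.
    d_i_bits = 1         # one data item = one bit
    delta_t_us = 20      # latency per item, in microseconds
    throughput = d_i_bits / delta_t_us
    print(throughput)    # 0.05 bits per microsecond, i.e. 50 kbps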
 
  • #9
It is very difficult to understand what you are saying. Try separating different thoughts into different sentences. One thing that is clear is that you do not understand the difference between latency and throughput: this Wikipedia page is quite useful https://en.wikipedia.org/wiki/Network_performance#Performance_measures.
  • #10
pbuk said:
It is very difficult to understand what you are saying. Try separating different thoughts into different sentences. One thing that is clear is that you do not understand the difference between latency and throughput: this Wikipedia page is quite useful https://en.wikipedia.org/wiki/Network_performance#Performance_measures.
No, I do understand the difference between throughput and latency. I've edited the other post as well.
 

FAQ: Calculating throughput for serial vs parallel

What is the difference between serial and parallel throughput calculations?

Serial throughput measures how much data can be moved over a single data line, one bit at a time, while parallel throughput measures how much data can be moved over several data lines at once, one bit per line in each transfer.

How is throughput calculated for serial systems?

Serial throughput is calculated by dividing the total number of bits (or bytes) transferred by the total time taken to transfer them. This gives the throughput rate in bits (or bytes) per second.

How is throughput calculated for parallel systems?

Parallel throughput is calculated the same way, but each transfer moves as many bits as the interface is wide. Equivalently, the per-line rate can be multiplied by the number of data lines to give the overall throughput rate.
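A minimal sketch of both calculations; all numbers are made-up illustrations, not values from the thread's homework problem:

    # Throughput = data transferred / time taken.
    bits_moved = 16_000
    time_s = 0.1
    serial_bps = bits_moved / time_s      # 160,000 bits per second

    # A parallel bus moves `width` bits per transfer, so its overall rate
    # is the per-line rate times the number of data lines.
    width = 16
    parallel_bps = serial_bps * width     # 2,560,000 bits per second
    print(serial_bps, parallel_bps)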

Which system typically has a higher throughput, serial or parallel?

In principle a parallel interface has the higher throughput, because it transfers several bits at once while a serial interface transfers them one at a time. In practice, however, serial links can often be clocked much faster, since crosstalk between parallel lines limits their speed; this is why serial SATA replaced parallel PATA.

How does the number of processors affect parallel throughput?

The number of data lines can greatly affect parallel throughput: doubling the width doubles the bits moved per transfer. However, there are diminishing returns, because wider buses suffer more from crosstalk, skew between lines, and other signalling overheads, which limit how fast the bus can be clocked. A toy model of this is sketched below.
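Here is that toy model in Python, assuming the usable clock rate falls as the bus gets wider because of crosstalk and skew. The derating factor is a purely illustrative assumption, not measured data:

    def bus_throughput_bps(width, base_clock_hz=1_000_000, derating=0.05):
        # Assume each extra line beyond the first shaves `derating` off the clock.
        clock_hz = base_clock_hz / (1 + derating * (width - 1))
        return width * clock_hz

    for width in (1, 2, 4, 8, 16, 32):
        print(width, round(bus_throughput_bps(width)))
    # Throughput still grows with width, but ever more slowly.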
