ver_mathstats
- Homework Statement
- We are given two choices for interfacing a serial interface with a latency of 20 microseconds and a parallel interface with a width of 16 bits and a latency of 200 microseconds. How long would it take to transfer 16 bits over each interface?
- Relevant Equations
- throughput = data item / latency
I know that a serial interface has a single data line, whereas a parallel interface can have several. Could someone check over my work, please? For the serial interface, would it be 16 bits x 20 microseconds = 320 microseconds, while the parallel interface would take just 200 microseconds to transfer all 16 bits? So would it be better to go with the parallel interface here?
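To sanity-check the arithmetic, here is a minimal Python sketch of the comparison. It assumes the 20 microsecond serial latency applies per bit (sent one at a time) and the 200 microsecond parallel latency applies per 16-bit transfer; the function names are my own.

```python
import math

def serial_time_us(n_bits, latency_per_bit_us=20):
    # Assumption: the serial interface sends one bit at a time,
    # and each bit incurs the 20 us latency.
    return n_bits * latency_per_bit_us

def parallel_time_us(n_bits, width_bits=16, latency_per_word_us=200):
    # Assumption: the parallel interface moves width_bits per transfer,
    # and each transfer incurs the 200 us latency.
    return math.ceil(n_bits / width_bits) * latency_per_word_us

print(serial_time_us(16))    # 16 * 20 = 320 microseconds
print(parallel_time_us(16))  # one 16-bit transfer = 200 microseconds
```

Under those assumptions the parallel interface wins for a 16-bit transfer (200 us vs 320 us), which matches the conclusion above.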
Thank you, any help would be appreciated.