3 January, 01:30

Calibrations on a recent version of an operating system showed that on the client side, there is a delay of at least 0.5 ms for a packet to get from an application to the network interface and a delay of 1.4 ms for the opposite path (network interface to application buffer). The corresponding minimum delays for the server are 0.20 ms and 0.30 ms, respectively.

What would be the accuracy of a run of Cristian's algorithm between a client and server, both running this version of Linux, if the round-trip time measured at the client is 6.6 ms?

Answers (1)
  1. 3 January, 01:32
    4.2 ms

    Explanation:

    Minimum request delay (client application to client network interface, plus server network interface to server application) = 0.5 + 0.3 = 0.8 ms

    Minimum reply delay (server application to server network interface, plus client network interface to client application) = 0.2 + 1.4 = 1.6 ms

    Total minimum (calibrated) delay = 0.8 + 1.6 = 2.4 ms

    Measured round-trip time = 6.6 ms

    The part of the round trip not accounted for by these minimum delays is 6.6 - 2.4 = 4.2 ms, so the time reported by the server is known only to within a window 4.2 ms wide. If the client sets its clock to the midpoint of that window, the accuracy of this run is ±2.1 ms.
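
The same arithmetic as a short Python sketch (a minimal illustration, assuming the client sets its clock to the midpoint of the uncertainty window; the variable names are only illustrative):

# Delays from the question, in milliseconds
client_send = 0.5   # client application -> client network interface
client_recv = 1.4   # client network interface -> client application
server_send = 0.2   # server application -> server network interface
server_recv = 0.3   # server network interface -> server application
rtt = 6.6           # round-trip time measured at the client

# Minimum one-way delays implied by the calibration figures
min_request = client_send + server_recv   # 0.8 ms, client -> server
min_reply = server_send + client_recv     # 1.6 ms, server -> client

# Width of the window in which the server's clock reading must lie
# when the reply reaches the client
window = rtt - (min_request + min_reply)  # 6.6 - 2.4 = 4.2 ms

# Choosing the midpoint of the window halves the maximum error
accuracy = window / 2                     # 2.1 ms either way

print(f"window: {window:.1f} ms, accuracy: +/- {accuracy:.1f} ms")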