
Suppose a program takes 1000 machine instructions to run from start to end, and can do that in 10 microseconds when no page faults occur. How long will this same program take if 1 in every 100 instructions has a page fault and each page fault takes 100 milliseconds to resolve?

Answers (1)
  1. Total time = (10^6 + 9.9) microseconds ≈ 1.00001 seconds

    Explanation:

    Given:

    Total number of machine instructions = 1000

    Number of page faults per 100 instructions = 1

    Number of page faults in 1000 instructions = 10

    Time to serve one page fault = 100 milliseconds

    Time to serve ten page faults = 10 × 100 milliseconds = 1000 milliseconds = 10^6 microseconds

    Number of instructions that execute without a page fault = 1000 - 10 = 990 (this treats the CPU time of each faulting instruction as already included in its 100 ms service time)

    Time required to run 1000 instructions = 10 microseconds

    So, time required to run 990 instructions = 10 × (990/1000) microseconds = 9.9 microseconds

    So, the total time required to run the program = (10^6 + 9.9) microseconds ≈ 1.00001 seconds
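
    For anyone who wants to verify the arithmetic, here is a minimal Python sketch of the same calculation. The variable names are illustrative, not from the original answer, and it keeps the answer's assumption that a faulting instruction's CPU time is absorbed into its 100 ms fault-service time.

    # Sketch of the page-fault timing arithmetic (all times in microseconds).
    US_PER_MS = 1_000

    total_instructions = 1000
    base_runtime_us = 10.0                  # 1000 instructions, no page faults
    fault_rate = 1 / 100                    # one page fault per 100 instructions
    fault_service_us = 100 * US_PER_MS      # 100 ms per fault, in microseconds

    faults = int(total_instructions * fault_rate)         # 10 faults
    fault_time_us = faults * fault_service_us             # 1,000,000 microseconds
    non_faulting = total_instructions - faults            # 990 instructions
    cpu_time_us = base_runtime_us * non_faulting / total_instructions  # 9.9 microseconds

    total_us = fault_time_us + cpu_time_us
    print(f"{total_us} microseconds")       # 1000009.9 microseconds (about 1 second)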