14 November, 11:09

When did vaccines become mandatory in the US?

Answers (1)
  1. 14 November, 11:30
Vaccines became mandatory in the United States in 1978.