15 August, 10:48

How have changes in women's rights affected American society? Consider the family structure, economic health, and the strength of the workforce. Use specific moments in history as examples. In your opinion, has greater gender equity improved American society? Consider how the changing role of women has altered the identity of American society, both in Americans' eyes and in the world's.

(answer has to be at least a paragraph)

Answers (1)
  1. 15 August, 11:12
    In the early 1900s, women had almost no rights at all, from being unable to vote to being discouraged from holding jobs alongside their husbands. That changed on August 18, 1920, when the 19th Amendment, passed by Congress the year before, was ratified, guaranteeing women the right to vote. The country changed that day, and women everywhere were overjoyed. Many men, however, opposed the amendment; they saw themselves as the dominant gender and believed women should not have those rights. Today women make up about fifty percent of voters, and with roughly 162 million women in the U.S., they form a decisive share of the electorate.