ESSENTIAL QUESTION: HOW HAVE SEXIST SOCIETAL ROLES CHANGED FOR THE BETTER?
- "The 1960's: A Decade of Change for Women"
- By: Kenneth T. Walsh
This source discusses the sweeping changes that came about in the 1960s. Women began entering the paid workforce, which sparked the major debate over equal pay between the genders. Along with equal pay, feminists' goals included ending domestic violence and sexual harassment and sharing housework responsibilities with men. "Over time, the feminist trends of the sixties took hold and over the subsequent decades changed relationships between the genders." Many women began to have the same expectations as men, and it was no longer surprising to see women in male-dominated fields such as television production (talk shows, Oprah Winfrey). "Even conservative Republicans recruited female candidates and urged them to be aggressive on the stump." This source supports my essential question because all of these changes in gender roles helped women better themselves by becoming independent from the shadow of men.