Women's Roles from Post-WWI to Post-WWII

Although the war proved detrimental to the US overall, the conflict sparked contemplation of an expanded role for women in society.