The role of women in American society has been largely overlooked in recent decades, but people are now beginning to view it from a broader perspective. Still, women are often seen as wives intended to cook, clean, and take care of the kids. But we believe that women are more than that.