Thursday, September 2, 2010

American Gender Roles

For as long as I can remember, men in American society have been considered dominant. Men have been labeled as strong, muscular, the money-maker, aggressive, dominant, and the list goes on, while women have simply been the home-maker, caregiver, mommy, and wife. Nowadays women have the freedom to accomplish as much as men and beyond. Women are now moving out into society, becoming more educated and getting involved in top decisions that were usually left to the men's say-so.
