Thursday, December 7, 2017
Women's Rights
Throughout history, women and men have been treated unequally, especially in the workforce. Society has certainly progressed: women have gone from being barred from most jobs to making up almost half of America's workforce, but they are still not treated the same. Women are paid less, and gender is still weighed when people apply for jobs. Why is this so? Women and men are equally capable of doing the same work and having the same impact on a company, so why are people with the same abilities discriminated against because of their gender? It seems as if the CEO or boss of every big company is a man, and that image is ingrained in society's mind. What can be done to improve gender equality?