
Other than that, I see all these shows about making yourself better because you're not good enough naturally. Talk shows like Oprah, Maury, Tyra, The View, and Dr. Oz have talked about health, which is good, but they also send women off for cosmetic surgeries because they're 'not good enough.' Afterward they're applauded for how great they look; to me that's a scream of "I need to be beautiful for people to like me." All kinds of other shows, reality TV, The Girl Next Door, Extreme Makeover, mainly show girls and some women either dressing provocatively to get looked at or being sent off to get surgery to make them look good. So all the women around us are seeing these images of 'you need to be sent away to get fixed into someone socially beautiful... oh, and then you can come back.' Is that healthy for women to see? It just hits me how much that must affect a lot of women's self-esteem and body image, especially since they're pouring out thousands and thousands of dollars to be 'socially accepted.'
