I feel so alienated from our American culture, as if I was supposed to be born 50 years ago.
We're living in a time where the "self" is greater than anything else, and our problems trump all others. How we look, how much we weigh, how much money we have; all these things have taken greater precedence in our lives than any other problem. And yet we complain about the decay of our world. We recognize it as a problem, but not a problem deserving of our time and effort. I can only feel like this is how our culture wants us to act.
After all, it makes the people who run our society wealthy. They project the image of a "perfect" woman who is slim, beautiful, and often promiscuous. In order to match this image of perfection, the women of our world buy their products, convinced that if they look like the girls in the commercials, they too will be perfect. They project the image of a "perfect" man: he is muscular, always into sports, confident with women, and loves beer. So in an effort to match this image, we do as the men we see on TV do, believing that they are perfect and greater than us.
With all this time spent on ourselves, we ignore the greater problems all around us. We become blinded by the hatred we have for our bodies, and it makes us lose sight of what is really important. I am just as guilty of this as anyone else, but I am recognizing it, and I'm going to change it. I want to change this world and fix its problems, just like everyone else does. But the only way we can accomplish that is by recognizing that the world's problems are bigger than us, and only by recognizing this and acting on it will we make the changes we want to see in the world.