Why does our society promote being things you're not?
Why do we spend our lives 'trying' and aspiring to be people we're really not deep down?
Instead of growing and realizing who we are, and trying to better ourselves in those ways, we think that looking like a celebrity and living the first class lifestyle is what it's all about.
People won't wear certain clothes if they're out of season or if they're not a specific brand or designer, same with shoes, cars, etc.
We spend SO much on vanity, and at the end of it we're still competitive, miserable, unhappy, and never satisfied. MIND YOU, there are also poor people around the world starving.
How can we justify or even excuse these behaviors?!
We're brainwashed and conditioned to think that skinny, great-looking people have it made, and that we should aspire to be like them.
Television, movies, and magazines rule our heads and set the standard, and unless we LOOK like the people in those media, we feel like we're less than them.
We let these businesses steal our souls, and we pay them for it. They sell us things to make us feel better, when in reality they're brainwashing us to take our money.
America is F*cking with each and every one of us, for $$$$.
And we, like blind sheep, let it.
W H Y?