Everything You Ever Knew is Wrong
As I was growing up, I believed the United States was the best country on the planet. Boundless opportunities awaited anyone not afraid of hard work, and justice always prevailed thanks to our sacred Constitution.
As the generous benefactor of less prosperous nations and a beacon of excellence for all to emulate, we were the bearers of light and hope to the entire world.
Being born in the USA was akin to hitting the reincarnation lottery. You, the average American citizen, are envied by all. Freedom! Democracy!
That’s what I believed because that’s what I was taught by every adult in my life. Why wouldn’t I believe it? Hell, why wouldn’t they believe it? It’s all they’d been fed their entire lives too. American awesomeness was merely the natural order of things. You didn’t have to question it, you just knew it, and accepted it.
God was in Heaven, and Heaven was called America.
And all was well.
Until at some point in your life (if you’re paying attention at all) the truth hits you as violently as an anvil on Wile E. Coyote’s head in a Road Runner cartoon.
You may have already known the whole system was an elaborate lie since you were a kid. I’m in that camp myself. Reading voraciously and idolizing John Lennon had me radicalized by fifteen. Many others share similar experiences of the truth gradually dawning on them.
But then there is that precise moment when some giant, hideous truth bomb bitch slaps you, and you’re never the same again. No preamble, no advance warning.
In a split second, you realize everything you ever knew is wrong. As the proverbial glowing light bulb hovers over your head, you see right through the window dressing and propaganda to the face of the beast.
It could have been Vietnam, Watergate, Iran-Contra, 9/11, Iraq, Standing Rock, Pussy Grabbing — whatever. There are as many catalysts as there are disillusioned people, and then some. What’s happening here is the domino effect of one great truth opening the floodgates, and all the pieces suddenly falling into place.
It goes something like this:
We are not the greatest country in the world in 2018. We are a mouth-breathing, universally despised global bully. Bringing freedom and democracy is code for invading innocent countries for their resources. Everyone seems to know that but us.
But what can you expect from a nation that doesn’t even take care of its own? Millions die as a direct result of poverty so a handful of oligarchs can lead lives of unimaginable luxury. We don’t even want to keep our people healthy or educate our kids adequately.
You can’t even get ahead by being smart and driven anymore. If you’re among the lucky few who manage to attend college, all you have to look forward to is an insultingly low paying job that won’t even put a dent in all those student loans.
This is what we’re expected to be proud of and swear undying allegiance to? Are we supposed to mindlessly embrace the false narrative of what America was and is?
Because this is not the greatest country in the world by any objective reckoning. In fact, it’s not even a great country at all, or even a good one, no matter what we’ve been indoctrinated to believe.