OK, this is going to be a difficult read for some, and hardly like the rest of the blogs I've done on this forum but...I need to get this off my chest.
In short, recent news (that I've seen from The Young Turks and Secular Talk, both talk shows on YouTube) has left me very pissed off at what's happening here in America. Unarmed black guys getting killed by cops, riots breaking out, cities like Charlotte being turned into pseudo-warzones, and more innocent people getting hurt. This is, of course, ignoring the whole Trump/Hillary thing.
Just... What in the flying FUCK, America!? A part of me wonders if "united" means anything anymore. We're just one step away from killing each other all over again. United? Ha! From what I've seen, we're far from that. Some part of me wonders if, deep down, we were just looking for a reason to start killing/maiming each other...that we never truly, actually liked each other. Give me a few examples, if you please, to show that we genuinely give two shits about each other.
I feel ashamed to be American, ashamed to be in this country. I want to live somewhere else, Europe specifically, the UK, but I don't have the finances to do that. I want to like my country, I want to believe that there are good things about America, that we Americans aren't just deranged animals looking for a reason to tear each other's throats out...but the news is making it very hard for me to see that.
So that's that. :/ I've completely lost faith in America and my own people.