Something that's crossed my mind recently. Given the political climate in the US today (Trump/Clinton, race relations, etc.), do you think the US is unified as a country and a people, or are we becoming more and more polarized as the years go on? A part of me even theorizes that we're already fighting a second civil war -- only it's one fought with words, not bullets and artillery like the first one. Every time you turn on the news, it's nothing but bad news and Americans screaming at each other, over each other, without ever trying to come to a solution that works for both sides. It's like we've forgotten what compromise means and have devolved into black-and-white thinking: “I'm the good guy, you're the bad guy! I won't hear anything you say because I hate you!” A part of me wonders if we Americans even truly tolerate each other anymore, if the news is to be believed. What are your thoughts on this? Sorry if I come across as offensive; it's just something that's been sitting in my gut and I wanted to get it out.