I'm not sure where to put this, but since it mostly involves fiction, I'm placing it here for now; please move it if it's in the wrong place.

Reflecting on some things I've written in the past, things I've read, and what I've heard about, dystopias feel safe in a way, especially in comparison to utopias. Utopian settings have felt more controversial, judging by the reactions I've seen them draw, and, depending on the work, more thought-provoking. On another forum, I followed a timeline (not a story per se) called Reds!, which chronicles a Communist America that actually works. Really well. Now, whether you agree with that vision is another matter; however, I think it's worth bringing up how much reaction I've seen to it, and probably will continue to see just by mentioning it. It seems, at times, as if the idea of something actually working is what draws the strongest reaction. I'm not saying dystopias are uncontroversial, to say the least. But are they really as controversial anymore?

Sorry if I say something hard to follow or ridiculous here. I want to go more in depth with this, but I've got to go right now. I'm really curious about it, and I posted here because I want to know whether books you've read confirm or deny this. What made you react more, or dare I say, think more: a dystopia or a utopia?