Quote Originally Posted by KROENEN View Post
I feel like the Walking Dead ended up being the thematic opposite of the Living Dead movies. Day of the Dead embraced an almost nihilistic acceptance that society would not be rebuilt and that they would never understand the plague or overcome the dead, that whatever they did it would make little difference; it stood completely against the 80's optimism of everyone is special and could change the world. The Walking Dead is kind of saying it just takes one man to turn it all around no matter how desperate everything seems. It's overly optimistic, almost to the point where it ceases being about horror and more about leaving everyone with the "feel goods" at the end. The more I think about it the more I really dislike it...
It's optimistic in a really odd way, though. We see a society that has gone from what was basically a commune under Rick to a capitalist society with money, private property, and presumably taxes to pay for the railroad, a world where you get preferential treatment from the law if you know the right people. That could have been presented as humanity slipping back into the pre-apocalyptic status quo, but instead it was framed as a sign of hope, even though at certain points in the story (especially with Michonne) that pre-apocalyptic world was described as a time when people never truly lived.

It was a woolly, fluffed ending for me. A bigger point could have been made, but it was avoided.