Originally Posted by
KROENEN
I feel like The Walking Dead ended up being the thematic opposite of the Living Dead movies. Day of the Dead embraced an almost nihilistic acceptance that society would not be rebuilt, that they would never understand the plague or overcome the dead, and that whatever they did would make little difference; it stood completely against the '80s optimism that everyone is special and can change the world. The Walking Dead is kind of saying it just takes one man to turn it all around, no matter how desperate everything seems. It's overly optimistic, almost to the point where it ceases being about horror and becomes more about leaving everyone with the "feel goods" at the end. The more I think about it, the more I really dislike it...