With the 20th anniversary of the 9/11 terror attacks approaching, I have been looking back on life before and after the events. To me, life seemed a bit more innocent back then. Terror attacks were things that happened elsewhere, and hate wasn't something that could transform a country like America. After the attacks, though, the ways America changed, and has kept changing since, surprised me. I never saw the laws that would be enacted or the wars that would follow coming. Nor did I see the country becoming so divided that, to this day, one half seemingly wants to see the other half wiped off the face of the earth.
How about you? How have your views changed?