I'm trying to understand how calling almost every country "post-war" makes any sense when WWII ended 75 years ago, but there are countries where a war ended just last year.
Sure, it doesn't make sense, this is a shit thread. It sometimes feels like the only point of reference for the NON-WEST is Japan. Like, Japan has had other shit to deal with since WW2, namely 3/11 and Kobe in 1995, it's just that people don't really care about those because they're not as sensational as a nuke.
Sometimes it feels like WW2 is so fondly remembered in some parts of the world because it was the last time there was a clear "good guys vs bad guys" narrative.
Like, nobody brings up the Korean War, or the Vietnam War, or the Afghanistan War as examples, 'cause all those sucked.
So people keep going back to WW2, and somehow they think Japan has to do the same?