I lived in South Korea for a year while I was still in college, and when I asked a local college professor why there was so much sexist advertising, I was told it helped empower women to look their best. I wonder if it's the same in Japan and China, because that response was just... Was it just that the professor was an asshole? Is the society ignorant of how that shit sounds, or what? And it seems like it's getting worse lately, like some entrenched extremism, with attitudes turning toward the same crazy conservatism of forcing people into gender roles and stereotypes. It's as if oppression that's shared is fine, even when it's clearly oppression.