Sep 22, 2024
Rape culture is an outgrowth of mainstream masculine social norms: be aggressive, be dominant, control women and get what you want from them. Certainly not every guy buys into that to the same extent, but all of them receive that message, every day and in every way. Until those norms change - hopefully led by men - rape culture will continue.