I must've missed something - are there actually people out there claiming that Western women have always had rights that were respected and valued by their culture? If so, they are completely delusional and turning a blind eye to the fact that even 50 years ago women were second-class citizens under US law, with hundreds fewer legal rights than men.
Marital rape wasn't illegal in all 50 US states until 1993 - and there are still states with loopholes. Bodily autonomy is a fundamental American right, with decades of case law establishing that you cannot be forced to give blood or bone marrow against your will, even if doing so would save a life. You cannot have medical treatments you don't want forced on you, and even a dead body's organs cannot be harvested without prior consent - all because of the fundamental right to control what happens to your own body. That right extends to all Americans, with the glaring exception of pregnant women.