I’ve been thinking about this for a while and I’m curious what others think. Some people argue that sex work gives women autonomy over their own bodies, their income, and their sexuality — that it can be empowering, liberating, and even feminist.
Others say it’s the opposite: that it reinforces objectification, exploitation, and unequal power dynamics, and that it ultimately harms women and the feminist movement by turning empowerment into a commodity.
So I’m wondering — where do you stand? Do you think sex work for women is empowering and liberating, or do you think it’s destructive and demoralizing for women and the feminist cause in general?


I think women are discouraged from sex work mostly because men resent the economic advantage it gives women as they start their adult lives.
If women gained greater economic power, that would be good for women’s rights.