I’ve been thinking about this for a while and I’m curious what others think. Some people argue that sex work gives women autonomy over their own bodies, their income, and their sexuality — that it can be empowering, liberating, and even feminist.
Others say it’s the opposite: that it reinforces objectification, exploitation, and unequal power dynamics, and that it ultimately harms women and the feminist movement by turning empowerment into a commodity.
So I’m wondering: where do you stand? Do you think sex work is empowering and liberating for women, or destructive and demoralizing for women and the feminist cause more broadly?


The only thing wrong with sex work is the fact that it is work.