Sometimes I feel like women have lost the meaning of self-respect. I mean, women didn't fight for the right to wear shorts just so that when you bend over, people can see your butt. We fought for respect, to be seen as humans, not objects. It really freaks me out when people take pictures to show off their clothing and end up showing off their chest or body instead. Not that you should be super modest, but you should showcase your clothes, not your body. Your body is a work of art; save it for your husband, not a boyfriend who might leave in a week.