“If tomorrow, women woke up and decided they really liked their bodies, just think how many industries would go out of business.”
This morning I woke up and decided I wasn't going to blog today, and that I would spend some time doing a bit more self-care. I meditated, sipped my coffee, had some avocado toast (my FAVE!!!), and had a killer workout with my brother. THEN, I saw the above quote on IG, and regrammed it. WOW! Pretty dang powerful, huh?
After pondering for quite some time (and deciding I needed to write it out), I researched it a bit, and found out that Dr. Gail Dines had said it. Now, I am not a feminist, but I definitely feel that this sentiment rings especially true for women. Wouldn't you agree?
SOAPBOX MOMENT: I feel that the media and advertising industries have gone so above and beyond to try and make us feel like we need to look or act a certain way to be beautiful, "get" a partner, make a certain amount of money, etc. I call BULLSHIT!
Stepping down a bit...an outside source cannot determine our self-worth, or who we are at the core of our being. Only we can do that. The topic of worthiness is a whole other can of worms, but I know for myself that this did not happen overnight, and it most certainly did not involve any input from the beauty industry. Does this ring true for anyone else?
Does this mean that I'm going to stop wearing make-up, working out, shaving my legs, or buying cute clothes? No! Those things are fun, and I enjoy doing them. I'm not perfect, and issues definitely still come up for me constantly, but the difference now is that I don't let them own me. I don't let that small outside (or inside) voice determine who I am. I am strong. I am beautiful. I am intelligent. I am kind. I am me.
Definitely a little different from my normal posts, but I felt it was necessary. I'd love to hear your thoughts on this issue, and anything that came up for you.