What makes you woman?
Who told you when you became woman?
Is it your body development, your natural ability to nurture and love everything in sight?
Is it Western culture that tells you when you hit 21 you are woman?
Is it traditional gender roles that tell you once you know how to cook, clean, and have children you’ve reached womanhood?
I am just so curious to know how you became woman, and what criteria you are holding on to.