Do we always have to smile?

As a huge lover of smiles (“smiling is my favorite”), this may seem like a weird post. Recently, at the ballpark, someone yelled at a girl to smile while she was batting. Her concentration was broken, and she smiled. Then they shouted, “You can’t hit the ball if you’re not smiling.” Until that moment, I hadn’t thought much about the constant demands to smile.

I may not look like it, but I played softball in my youth. I didn’t smile, and I hit the ball plenty of times. In fact, I didn’t smile most of the game. I concentrated on the game and smiled when I was happy. I laughed and joked with my friends during the game. I loved playing. Did I smile all the time? No.

That was the first time I really thought about how we demand smiles from girls when we would never yell at a young boy batting in a baseball game to “smile.” I’m not sure why it struck me so profoundly last evening. As women, we’re constantly reminded to smile when we don’t feel like it. Why do we do that to girls? I’m sure I did it to my girls plenty of times. I’m sorry now that I did.

Making a good first impression during an interview is one thing, yet to tell a girl or woman to smile when she’s sad, nervous, upset, or angry is not okay. Must we always deny our emotions? We have to control them at times, but we should never feel we have to deny them.
Buddy the Elf smiled because he was happy, not because someone told him to do it to make them feel better or like him more. I smile at people when I greet them at my job. I’m not against smiles! However, if I’m sad or concentrating, don’t tell me to smile. You could give me a reason to smile, like offering chocolate. That would be acceptable. 🙂

Have you ever been told to smile, and you weren’t feeling it? What can we say to someone to help them understand that smiling isn’t what we want to do at that time? Comment below!