I take vitamins every day. But if you asked me if I want to take vitamins, the answer would probably be no. What the heck am I talking about?
Look, here's the deal: we're not actually meant to take vitamins, people. We're meant to eat the fruits and vegetables that come from the planet. If you eat a balanced diet of fruits, vegetables, and whole grains, theoretically you should get all of the vitamins and minerals your body needs to maintain optimum health - no supplements needed. But if you eat chips and fast food and drink soda, you probably take supplements "just in case" you don't get all the nutrients your body needs.
Problem is, even though over half of all Americans buy vitamins, there's no real research to show that vitamins and nutritional supplements actually work. We do know that our bodies absorb nutrients from food far better than the ones that come from pills, but I can say from a purely anecdotal standpoint that I feel a lot better when I take my Stress Complex B vitamins and evening primrose oil.
So...what's a girl to do? Lately I've been too busy and too careless to eat healthy enough to feel confident I'm meeting all of my nutritional needs. So I take vitamins as a sort of "insurance policy," since the research just isn't clear. I have a feeling that if I ate healthier regularly, I'd feel better, sleep better, and live better. So give me broccoli - I favor it over hard, ugly pills that smell funny anyway.