I was wondering what others believe. Can a positive attitude help us heal our bodies? Do you prefer traditional doctors and medicine over natural methods and positive thinking? I would like to share a Facebook post:
Love this from Dr. Lissa Rankin:
“Based on my research and my experience seeing patients, I would argue that caring for your body in traditional ways, like eating a healthy diet, exercising, and taking your vitamins, may be the least important part of your health. Sure, those things are crucial, and I do love my green juice and daily hikes. But they’re not enough. I’ve cared for loads of health nuts with a laundry list of health conditions, and I’ve cared for people with horrible health habits who enjoy perfect health.
So what’s the most important part of your health? The nature of your thoughts, beliefs, and feelings. If your mind is filled with anger, resentment, pessimism, loneliness, sadness or anxiety, you’re bathing every cell in your body with stress hormones, and the body’s self-repair mechanisms can’t function properly.
But when you fill your mind with love, hope, optimism, positive belief, a sense of Oneness with the Universe, and other healing thoughts, your body’s self-repair mechanisms flip on, and the body can heal itself.”
How’s the health of your mind today?
Many studies have been done on this subject. There is plenty of evidence that our thinking can help make us healthy, or it can help make us sick. I believe that God has given us the ability to heal ourselves. A healthy diet and exercise are extremely important to a healthy body. I also believe in using natural cures when possible. But a healthy mindset is just as important. We need to watch our thoughts. What we believe, we receive. What we focus on, we give light and power. Make your thoughts positive and healthy.