Research suggests that women are the healthier sex. Women are more likely to investigate the ingredients in their food and are significantly more conscientious about visiting their doctors for regular check-ups. But that doesn't mean they can't learn something from the opposite sex. Here are 10 lessons women can learn from the men in their lives.
Text: Bauer/Good Health. Additional reporting: Shenielle Aloysis.