
The Gluten-Free Diet – When It’s Not a Fad

8 Jun

Written By: Emily Farrell



Picture courtesy of Emily Farrell


The gluten-free diet (GFD) has quickly become one of the most popular self-implemented diets in North America, leading to a proportionate increase in the gluten-free food industry (Reilly, 2016). A GFD is defined as a diet devoid of all gluten-containing food products, namely the gluten-containing cereals (rye, barley, wheat, and triticale), gluten-containing food additives, and foods contaminated with gluten-containing products. Many people view the GFD as a healthy lifestyle choice and perceive health benefits from eating gluten-free.

The general population’s recent devotion to the GFD has created a perception that it is nothing but a “fad diet”. The GFD, however, is considered a clinical nutrition therapy for several conditions related to gluten sensitivity (Pietzak, 2012). Celiac disease, a genetic immune-mediated disorder in which the body’s immune system attacks normal tissue in response to gluten, requires individuals to follow a strict lifelong GFD. It occurs in 0.5-1% of the population, and diagnoses are increasing in prevalence due to growing awareness within the scientific community (Pietzak, 2012). Individuals with celiac disease have an increased risk of gastrointestinal cancers, and if gluten makes its way into their diet it can significantly increase their risk of nutritional deficiencies due to malabsorption (Pietzak, 2012). Other gluten-related conditions include, but are not limited to, wheat allergy, gluten ataxia, dermatitis herpetiformis (celiac disease of the skin), and non-celiac gluten sensitivity (El-Chammas and Danner, 2011). Non-celiac gluten sensitivity is a condition in which individuals react negatively to gluten-containing products in the absence of celiac disease. It has recently been hypothesized, however, that individuals with non-celiac gluten sensitivity are reacting to a component of wheat other than gluten; even so, these individuals still benefit from a GFD because it eliminates wheat entirely (Escudero-Hernández et al., 2016).

Despite these clinical conditions, many individuals in the general population appear to follow a GFD for reasons unknown even to themselves, potentially just wanting to join the “fad”. A large proportion of individuals believe that gluten-free products are “healthier options” than their gluten-containing counterparts. The gluten-free food industry is growing exponentially and is now almost a billion-dollar business. This growth is driven not by increasing awareness of gluten-related sensitivities, but by the increasing popularity of the GFD as a “fad diet” (Reilly, 2016). A 2015 survey of more than 1500 American adults found that the most common reason for consuming gluten-free products was “no reason” (35% of respondents) (The Hartman Group I, 2015). This response was followed by “a healthier option” (26%), “digestive health” (19%), and “weight loss” (13%), with the least common reason being “I have a gluten sensitivity” (8%) (The Hartman Group I, 2015). The question, therefore, is: does the general population actually benefit from following a GFD?

That is a question some researchers have sought to answer. A 2016 literature review evaluated the nutritional quality of the GFD, focusing on nutritional deficiencies and over-consumption of unhealthy food components (Vici et al., 2016). A GFD is often low in vital vitamins and minerals, including vitamin B12, vitamin D, folate, calcium, zinc, magnesium, and iron (Vici et al., 2016). It is also often low in dietary fiber, which is important for gastrointestinal health and for the prevention of diabetes and cancer (Vici et al., 2016). Gluten-free products often have a higher fat content than their gluten-containing counterparts, which can lead to greater consumption of lipids, including saturated fats (Vici et al., 2016). These nutritional inadequacies may be attributed to the poor nutritional quality of gluten-free products and to the avoidance of foods naturally rich in nutrients, such as whole grains. Individuals who are not required to follow a GFD therefore often do not benefit from eating gluten-free.

While the general population may not benefit from a GFD, many individuals do; in fact, they need a GFD to maintain a reasonable level of health and quality of life. Because the diet is now stigmatized as a “fad”, however, individuals requesting that their food be prepared gluten-free may be taken less seriously. This introduces health hazards for those who must strictly follow a GFD: it is now often assumed that a gluten-free request reflects the “fad”, and less caution may be used in preparing meals to ensure zero contamination.

It remains to be seen how long the GFD fad will last. Nonetheless, those who choose to eat gluten-free, whether out of necessity or personal preference, should do so under the guidance of a registered dietitian to ensure that they are getting adequate nutrition.


El-Chammas, K., and Danner, E. (2011). Gluten-free diet in nonceliac disease. Nutrition In Clinical Practice, 26(3): 294-299.

Escudero-Hernández, C., Peña, A.S., and Bernardo, D. (2016). Immunogenetic pathogenesis of celiac disease and non-celiac gluten sensitivity. Current Gastroenterology Reports, 18(7): 1-11.

Pietzak, M. (2012). Celiac disease, wheat allergy, and gluten sensitivity. Journal of Parenteral and Enteral Nutrition, 36(supplement 1): 68S-75S.

Reilly, N. (2016). The gluten-free diet: recognizing fact, fiction, and fad. The Journal of Pediatrics, 175: 206-210.

The Hartman Group I. “The Hartman Group’s Health & Wellness 2015 and Organic & Natural 2014 reports.” Accessed February 2017.

Vici, G., Belli, L., Biondi, M., and Polzonetti, V. (2016). Gluten free diet and nutrient deficiencies: A review. Clinical Nutrition, 35(6): 1236-1241.

Is Vitamin D the Anti-Cancer Vitamin?

30 Apr

By Elizabeth Miller


Photo Courtesy of Elizabeth Miller

According to current research, Vitamin D may play a role in reducing cancer risk and mortality. Synthesized in the body upon exposure to the sun’s ultraviolet B rays, Vitamin D has long been recommended by health experts for its role in bone health. Because ultraviolet rays from the sun are a leading cause of skin cancer, other sources of Vitamin D, such as fortified foods and supplements, are often favored for the prevention of osteoporosis. More recently, epidemiological and clinical research has demonstrated that sufficient levels of Vitamin D (from the diet, sun, or a supplement) may also be associated with a decreased risk of some forms of cancer.

As with many theories in science, the association between Vitamin D and cancer was first noticed as a population-based geographical trend, back in 1980. Researchers at Johns Hopkins University observed variation in colorectal cancer incidence that correlated with geographical location across the United States (Garland and Garland, 1980). Regions with greater sunlight exposure, like New Mexico and Arizona, had lower colorectal cancer rates than cooler, less sunny climates such as New York and New Hampshire. This observation motivated further research into the association between ultraviolet B light and other types of cancer, and strikingly, increased sunlight was associated with lower rates of 15 other cancers (Grant and Garland, 2006). Multiple meta-analyses have demonstrated an association between higher Vitamin D levels and increased survival rates in patients with colorectal, hematological, and breast cancer (Mohr et al., 2015; Maalmi et al., 2014; Vaughan-Shaw et al., 2017). Patients with the highest blood levels of 25-hydroxyvitamin D, the main circulating form of Vitamin D, had the lowest risk of cancer mortality (Wang et al., 2014; Vaughan-Shaw et al., 2017).

Despite these interesting findings, observational studies can only provide circumstantial evidence of a correlation between Vitamin D and cancer risk; they cannot confirm a cause-and-effect relationship. Perhaps, for example, the associations between sunlight exposure and decreased cancer risk and mortality are explained by other factors, such as increased outdoor exercise during warmer weather or increased fruit and vegetable consumption. To truly determine whether Vitamin D levels affect cancer incidence and mortality, randomized controlled trials of Vitamin D supplementation in humans need to be conducted.

To date, only a few clinical trials have demonstrated a reduction in cancer risk with Vitamin D supplementation. A 2014 meta-analysis examined four randomized controlled trials of Vitamin D supplementation at doses of 400-1100 IU/day for durations of 2-7 years (Keum and Giovannucci, 2014). The meta-analysis found no significant effect on overall cancer incidence, yet saw a consistent 12% reduction in overall cancer mortality across all four studies (Keum and Giovannucci, 2014). The authors suggested that the positive results for overall cancer mortality could be related to Vitamin D’s role in altering specific cancer processes such as metastasis, apoptosis, and cell proliferation. These mechanisms concur with observations from cell culture and animal models demonstrating an anti-cancer role for Vitamin D (Moukayed and Grant, 2017). It is still unclear whether Vitamin D has anti-carcinogenic effects only in populations with cancer or can also prevent cancer in healthy individuals. Further research is required to understand the complex relationship between the sunshine vitamin and cancer prevention, but the evidence seems promising.

As we await the results of more randomized controlled trials, we should not discount the importance of Vitamin D in the diet, as this vitamin is essential to bone health. For those not lucky enough to live close to the equator, individuals deficient in Vitamin D are advised to supplement their diet or to consume fortified foods such as dairy and nut milks. Health Canada recommends that individuals aged 1-70 consume 600 IU of Vitamin D per day, which can easily be reached with fortified foods, a supplement, or sitting in the sun for about 15 minutes (Health Canada, 2012). Luckily, we are finally reaching the end of the dreary dark days of winter, so next time you see the sun shining, go outside and reap the benefits from the ball of life in the sky.


Garland, C. F., & Garland, F. C. (1980). Do sunlight and vitamin D reduce the likelihood of colon cancer? International Journal of Epidemiology, 9(3), 227-231. doi:10.1093/ije/9.3.227

Giovannucci, E., Liu, Y., Rimm, E. B., Hollis, B. W., Fuchs, C. S., Stampfer, M. J., & Willett, W. C. (2006). Prospective study of predictors of vitamin D status and cancer incidence and mortality in men. Journal of the National Cancer Institute, 98(7), 451–459.

Grant, W. B., & Garland, C. F. (2006). The association of solar ultraviolet B (UVB) with reducing risk of cancer: multifactorial ecologic analysis of geographic variation in age-adjusted cancer mortality rates. Anticancer Research, 26, 2687-2700.

Health Canada (2012). Vitamin D and calcium updated dietary reference intakes.

Keum, N., & Giovannucci, E. (2014). Vitamin D supplements and cancer incidence and mortality: a meta-analysis. British Journal of Cancer, 111, 976-980. doi: 10.1038/bjc.2014.294

Maalmi, H., Ordóñez-Mena, J. M., Schöttker, B., & Brenner, H. (2014). Serum 25-hydroxyvitamin D levels and survival in colorectal and breast cancer patients: Systematic review and meta-analysis of prospective cohort studies. European Journal of Cancer, 50(8), 1510-1521. doi:10.1016/j.ejca.2014.02.006

Mohr, S. B., Gorham, E. D., Kim, J., Hofflich, H., Cuomo, R. E., & Garland, C. F. (2015). Could vitamin D sufficiency improve the survival of colorectal cancer patients? Journal of Steroid Biochemistry & Molecular Biology, 148, 239-244. doi: 10.1016/j.jsbmb.2014.12.010

Moukayed, M., & Grant, W. B. (2017). The roles of UVB and vitamin D in reducing risk of cancer incidence and mortality: A review of the epidemiology, clinical trials, and mechanisms. Reviews in Endocrine and Metabolic Disorders, 1-16. doi:10.1007/s11154-017-9415-2

Vaughan-Shaw, P. G., O’Sullivan, F., Farrington, S. M., Theodoratou, E., Campbell, H., Dunlop, M. G., & Zgaga, L. (2017). The impact of vitamin D pathway genetic variation and circulating 25-hydroxyvitamin D on cancer outcome: systematic review and meta-analysis. British Journal of Cancer, 116, 1092–1110. doi:10.1038/bjc.2017.44

Wang, B., Jing, Z., Li, C., Xu, S., & Wang, Y. (2014). Blood 25-hydroxyvitamin D levels and overall mortality in patients with colorectal cancer: A dose–response meta-analysis. European Journal of Cancer, 50(12), 2173-2175. doi:10.1016/j.ejca.2014.05.004



Soy Isoflavones: Is age of exposure affecting your breast cancer risk?

23 Jan

By Emily Farrell


Edamame – immature soybeans that are steamed or boiled. Image courtesy of Emily Farrell.

Does increasing your level of soy consumption decrease your risk of breast cancer? Are the hormone levels in your body playing a role? Is the hormonal impact of soy responsible for increasing or decreasing your risk of breast cancer? The existing inconsistencies surrounding soy intake and breast cancer risk may soon be clarified by an exciting hypothesis that the age at which soy consumption begins may play a role (Messina, 2016).

Soy isoflavones are phytochemicals found naturally in soybeans (Messina, 2016). Better known as phytoestrogens, they can bind to estrogen receptors in the body and induce estrogenic or anti-estrogenic effects (Messina, 2016). Isoflavones are much less potent activators of the estrogen receptor than estrogen itself, and when they are present in the body in higher amounts than estrogen (typically following the consumption of soy), they can exert an overall anti-estrogenic effect (Messina, 2016). These anti-estrogenic properties of soy isoflavones are thought to contribute to a reduction in breast cancer risk. So why are there still inconsistencies in the data? The age at which soy consumption begins may be the culprit!

An animal study conducted in 2002 investigated the impact of soy isoflavones on breast cancer risk in female rats following exposure at different time points during the life cycle (Lamartiniere et al., 2002). The study concluded that, in order for soy isoflavones to have a protective effect, exposure must occur during mammary gland development, that is, exposure must occur before puberty (Lamartiniere et al., 2002). However, the most protective effect was observed when intake occurred both before puberty and during adulthood (Lamartiniere et al., 2002). More recently, a study performed in 2011 also found that soy isoflavones reduce breast cancer incidence following exposure before puberty in mice (de Assis et al., 2011).

What about human exposure to soy isoflavones before puberty? The Shanghai Women’s Health Study, published in 2009 and involving more than 70,000 Chinese women, investigated the association between breast cancer incidence and dietary soy intake during adolescence and adulthood (Lee et al., 2009). The study concluded that a high level of soy consumption (approximately 50 mg of soy isoflavones/day) during adulthood was related to a reduced risk of breast cancer, and adolescent soy consumption showed similar results (Lee et al., 2009). The most pronounced association, however, was observed in women who consumed consistently high levels of soy during both adolescence and adulthood (Lee et al., 2009). These results concur with the animal data and provide evidence of a reduction in adult cancer related to adolescent soy consumption (Lee et al., 2009). A follow-up study using the same data investigated the incidence of premenopausal versus postmenopausal breast cancer based on age of exposure (Baglia et al., 2016). It found that high soy consumption during both adolescence and adulthood was related to a reduced risk of premenopausal breast cancer, while high soy consumption solely during adulthood was related to a reduced risk of postmenopausal breast cancer (Baglia et al., 2016). These findings support the idea that hormonal status plays a role in the timing-of-exposure hypothesis.

Research is ongoing in this area to understand the underlying biological mechanisms by which soy intake can decrease breast cancer risk, to advance the knowledge surrounding soy intake and its association with breast cancer risk, and to possibly improve a breast cancer prognosis following a diagnosis (Messina, 2016).

If you’re looking to reduce your breast cancer risk and you’re already past your adolescent glory days, increasing your soy consumption now likely won’t hurt! I know I’ll be adding some more soybeans to my diet.


Baglia, M.L., Zheng, W., Li, H., Yang, G., Gao, J., Gao Y-T., and Shu X-O. (2016). The association of soy food consumption with the risk of subtype of breast cancers defined by hormone receptor and HER2 status. International Journal of Cancer, 139: 742–748.

de Assis, S., Warri, A., Benitez, C., Helferich, W., and Hilakivi-Clarke, L. (2011). Protective effects of prepubertal genistein exposure on mammary tumorigenesis are dependent on BRCA1 expression. Cancer Prevention Research, 4(9): 1436–1448.

Lamartiniere, C. A., Cotroneo, M.S., Fritz, W.A., Wang, J., Mentor-Marcel, R., and Elgavish, A. (2002). Genistein chemoprevention: timing and mechanisms of action in murine mammary and prostate. The Journal of Nutrition, 132(3): 552S-558S.

Lee, S-A., Shu, X-O., Li, H., Yang, G., Cai, H., Wen, W., Ji, B-T., Gao, J., Gao, Y-T., and Zheng, W. (2009). Adolescent and adult soy food intake and breast cancer risk: results from the Shanghai Women’s Health Study. The American Journal of Clinical Nutrition, 89(6): 1920-1926.

Messina, M. (2016). Soy and health update: Evaluation of the clinical and epidemiologic literature. Nutrients, 8(12).

To choose or not to choose… Cilantro?

2 Dec

By Laura Barnes


Image courtesy of Laura Barnes

The other night I went out to dinner with a group of friends. As we were choosing the main course, our unique food preferences were displayed through the various dishes we selected. I was surprised when my friend specifically requested that her dish not include cilantro. Subsequently, another friend asked for the same. As a self-professed cilantro lover, I was curious about their dislike of my beloved herb. This led to the question: why do some people love the herb while others hate it?

Cilantro (Coriandrum sativum) is also known as Chinese parsley and historically as coriander. Coriander has been found to have several potential health benefits, including antihyperglycemic, hypotensive, antihyperlipidemic, and antioxidant properties (Gupta, 2010). References to the herb date back to 2000 B.C., originating in the Mediterranean and Middle East. The herb’s cultivation and use expanded across the globe, and it became a staple in various cuisines, including many European dishes (Leach, 2001). By the end of the sixteenth century, however, cilantro use was all but eliminated by European chefs. The reason? They found the smell of the herb resembled the particular scent of Cimex lectularius, aka bed bugs. Really! Thus the herb’s repulsive reputation was born (Leach, 2001). Fortunately for my taste buds and those of other cilantro lovers, globalization has led to the re-introduction of cilantro into many cuisines, including foods common in the current North American diet (Leach, 2001).

Cilantro, however, still has the ability to elicit strong reactions. With chemical eradication, bed bugs are much less common in developed nations (Liu and Liu, 2015), and any association with the herb surprises most people. Yet there are people with a particular distaste for cilantro who claim the taste resembles soap (Eriksson et al., 2012). In fact, the distinct fragrance common to bed bugs, soap, and cilantro is the result of particular molecules called aldehydes (McGee, 2010). As odour dictates most of what we taste (Spence, 2015), the common theory is that it is the odour that drives dislike of the herb (Eriksson et al., 2012). Interestingly, not everyone carries the gene that codes for the ability to smell the aldehydes present in cilantro (Eriksson et al., 2012). In one study, participants from East Asian and European-American backgrounds were found to have a higher prevalence of dislike for cilantro (Mauer and El-Sohemy, 2012). Indeed, some of the dislike for the herb may be based on cultural norms, since it is theorized that how we perceive odours is a learned behaviour (Herz, 2006), beginning in infancy.

So, what about you? Are you a cilantro lover? Has the herb caused heated debates around your kitchen table? If any lesson can be gleaned from it, it is that our taste preferences reflect our own personal uniqueness and global diversity. Scientifically, neither side is “right” or “wrong”. We are just us, and I, for one, would not have my friends any other way.


Eriksson, N., Wu, S., Do, C. B., Kiefer, A. K., Tung, J. Y., Mountain, J. L., . . . Francke, U. (2012). A genetic variant near olfactory receptor genes influences cilantro preference. Flavour, 1(22), 1-7.

Gupta, M. (2010). Pharmacological properties and traditional therapeutic uses of important Indian spices: A review. International Journal of Food Properties, 13(5), 1092-1116.

Herz, R. (2006). I know what I like: Understanding odor preferences. In J. Drobnick (Ed.), The smell culture reader (pp. 190-203). Oxford.

Leach, H. (2001). Rehabilitating the “stinking herbe”: A case study of culinary prejudice. Gastronomica, 10-15.

Liu, F., & Liu, N. (2015). Human odorant reception in the common bed bug, Cimex lectularius. Scientific Reports, 5, 1-14.

Mauer, L., & El-Sohemy, A. (2012). Prevalence of cilantro (Coriandrum sativum) disliking among different ethnocultural groups. Flavour, 1(8), 1-5.

McGee, H. (2010, April 13). Cilantro haters, it’s not your fault. New York Times, D1.

Spence, C. (2015). Just how much of what we taste derives from the sense of smell? Flavour, 4(30), 1-10.