Monday, July 30, 2012

Fibre, farts and faeces


Some 2,500 years ago, Hippocrates declared that “wholemeal bread cleans out the gut and passes through as excrement; white bread is more nutritious as it makes less faeces”. For a long time afterwards, fibre was regarded as a non-nutrient of no importance. Thomas Richard Allinson[1] was a medical doctor who advocated vegetarianism and wholemeal bread, and in 1892 he was struck off the medical register in the UK for his anti-establishment views. In 1936 the American Medical Association formally condemned the use of bran, a view which would dominate for the next three decades.
Fibre was, of course, of enormous importance in animal nutrition, a science that was far more developed than human nutrition. To understand why, it is best to first consider fibre from the botanical viewpoint. When a seedling finally reaches the sunshine of boundless photosynthetic energy, it is green and very leafy. As it grows, its stem moves from a predominantly photosynthesising function to the dual function of physical support and the transport of water and minerals to the growing leafy top. Its chemistry changes: it develops strong, fibrous cell walls for support and grows less green as the harvest nears. The leafy top gives rise to the seeds, which are in turn wrapped in a fibrous outer husk. For animal nutritionists, the higher the fibre content of forage, the lower its energy value. The most abundant carbohydrate on the planet is cellulose, which, like starch, is a polymer of glucose. The glucose units in cellulose are linked slightly differently from those in starch, such that the starch-digesting enzymes in our gut cannot break cellulose down. Animals that depend on a cellulose-rich diet have developed a very sophisticated association with gut bacteria that can break down this cellulose: the bacteria use some of it as energy and the leftovers are used by the host animal. It's a win-win situation. In most grazing animals, such as cows and sheep, the bacteria are found at the start of the digestive tract, in the rumen, while in humans, who don't have to forage on cellulose, the gut bacteria are at the end of the gut, in the colon.
Some 30 years after the AMA had downgraded bran to junk status, a small number of distinguished medical doctors challenged conventional wisdom and fibre entered the lexicon of human nutrition. The most notable was Denis Burkitt, a County Fermanagh man educated at Trinity College Dublin. He raised huge interest in faecal matters in relation to diseases of the digestive tract, ranging from constipation to colon cancer. Higher fibre diets lead to higher faecal outputs, and this can be achieved in either of two ways. Some fibres, bran in particular, are fairly resistant to any form of microbial degradation but have a huge capacity to absorb water. Taken in sufficient quantity, they promote a high faecal output of undigested, water-laden fibre, leading to a soft stool. Indeed, Burkitt used to talk of two types of stool, floaters and sinkers, the former on high fibre diets and the latter on low fibre diets. The second route to increased stool output is to provide the colonic microflora with a very fermentable fibre, such that the bacterial biomass increases, leading to more frequent excretion of softer stools. However, if the fibre is very water soluble and fermented very rapidly, as happens for example with the small sugar-like fibres in beans, flatulence is promoted, leading to the ditty: “Beans, beans, good for your heart. The more you eat, the more you fart. The more you fart, the better you feel, so beans, beans at every meal”!

The evidence that dietary fibre reduces the risk of colon cancer is strong. A recent meta-analysis of all studies in this field showed that colon cancer risk is 17% lower with three servings of wholegrains per day, and that increasing fibre intake from 5 g/d to 30 g/d led to a decline of about 30% in risk. Cereal fibre is way ahead of fruit and vegetable fibre in reducing this risk[2]. Colon cancer accounts for about 10% of all cancers and is of course the most preventable, not simply by diet but also by endoscopic screening.

Fibre has taken its rightful place in human nutrition, but now there is a slightly different twist: the colonic bacteria have raced to the top of biomedical fashion. We started with the genome (the complete set of our genes), then moved to the next level, the proteins, which we called the proteome, and soon after we had the metabolome. Now we have the microbiome. In my recent book, I devote an entire chapter to this topic because it is a very interesting and potentially important one. However, I fear that it will go down the route of many great biomedical fashions and deliver very little. Many of the studies in this field are built on bizarre animal models which involve the breeding of germ-free animals without any gut bacteria. Many others are built on association studies which reveal that some pattern of colonic microbial flora is biologically superior to some other pattern. However, few human intervention studies have been conducted and, within that limited literature, the data are not unanimous. Still, the hype rolls on. To some extent, and I have been in this business a long time, it reminds me of the antioxidant theory, a veritable bio-Klondike: it rose, dominated and fizzled out. Beware of theories that explain everything, well, almost everything.

An adequate intake of dietary fibre is important. I remain a healthy sceptic about the promise of the microbiome.


[1] His flour company still flourishes, if you'll excuse the pun.
[2] Aune D et al (2011). Dietary fibre, whole grains, and risk of colorectal cancer: systematic review and dose-response meta-analysis of prospective studies. BMJ, 343, d6617.

Monday, July 23, 2012

Iodine - now a problem in developed countries


When we think of hunger, we think of the gaunt and emaciated children of sub-Saharan Africa. There is another form, known as “hidden hunger”, which requires the services of a biochemist to detect deficiency symptoms in blood or urine. The three main nutrients of hidden hunger are iron, deficiency of which leads to anaemia; vitamin A, leading to blindness; and iodine, leading to goiter. Today's blog focuses on the last of these[1]. Iodine is required as an element within the molecular structure of thyroid hormone, which plays an essential role in every cell, where it regulates metabolic rate. Iodine also has a major role in brain development during pregnancy and the very early years of life. If iodine intakes are inadequate, goiter emerges: the thyroid gland in the neck swells as it seeks to extract every last drop of iodine from the blood for thyroid hormone synthesis. Most importantly, iodine deficiency in pregnancy will lead to impaired cognitive ability and, if severe, to severe mental retardation, a condition known as cretinism.
Iodine is a rare element in the earth's crust and, being highly water-soluble, is concentrated in the oceans. It is returned to land in rainwater, and the greater the distance from the sea, the lower the impact of this iodine-rich rainfall. Inland mountainous regions are particularly susceptible to low soil iodine levels, as rainwater and streams wash the iodine away, returning it to the sea. Thus there are large tracts of land, known as goiter belts, where soil iodine is sufficiently low to create a high prevalence of goiter. Iodine deficiency remains one of the great global nutritional challenges, with the WHO estimating that 1.9 billion people, almost one third of humanity, had inadequate intakes of iodine over the period 1994 to 2006, with approximately 20 million cases of impaired mental function. Great progress has been made in reducing this figure through the fortification of salt with iodine. In contrast to the improving situation in developing countries, inadequate iodine intake is now becoming an issue in developed countries.
Without any planned intervention, the problem of inadequate iodine intakes in certain developed countries was considerably reduced from the 1960s onwards because of the introduction of iodised cleansing agents (iodophors) into modern milking parlours, and also the introduction of iodine-enriched salt licks to ensure adequate iodine intake by dairy cows. Milk became one of the main sources of iodine, and milk consumption was encouraged, particularly in children. However, the use of iodophor sanitising agents in dairy farming has declined dramatically in recent times, and the use of table salt, including iodised salt, has fallen, given the negative nutritional messages about salt and health. In a recent survey of the iodine status of UK schoolgirls using urinary iodine levels, 51% were deemed to have mild iodine deficiency, 16% moderate deficiency and 1% severe deficiency; just about one third had adequate iodine status[2]. The UK is now ranked eighth in the top ten iodine-deficient countries according to the International Council for Control of Iodine Deficiency Disorders. Two studies have shown that if young children with poor iodine status receive iodine supplements, their cognitive function rises significantly.

From a nutrition policy point of view, the issue of iodine adequacy is of greatest importance in women of childbearing age. The standard method of assessing iodine status in populations is to take a single urine sample from each person, the average value being compared with WHO reference ranges for mild, moderate and severe iodine deficiency. However, whereas that approach can tell us whether a population overall has a problem with iodine status, it cannot tell us who within that population has a true problem: a recent paper has shown that ten different urine samples from one individual are needed to assess that individual's iodine status. Thus, at present, the need is to ensure that all women of childbearing age receive an adequate iodine intake. If, as the data indicate, inadequate iodine status is such a serious issue, especially as it relates to early-life cognitive function, then iodine must become a very important public health nutrition issue.
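Why does a single spot urine sample work for populations but not for individuals? A minimal simulation makes the point. All the parameters below (the true population mean, its spread, and the day-to-day noise on a single sample) are assumed, illustrative values, not data from the paper; the 100 µg/L cut-off is the commonly used WHO threshold for adequacy.

```python
import random

random.seed(42)

ADEQUATE = 100   # commonly used WHO cut-off for adequate status, ug/L
N_WOMEN = 10_000

def spot_sample(true_uic, day_to_day_sd=60):
    """One spot-urine measurement: the true habitual urinary iodine
    concentration plus large day-to-day noise (assumed SD)."""
    return max(0.0, random.gauss(true_uic, day_to_day_sd))

# A population whose true habitual status is adequate on average
true_levels = [random.gauss(130, 25) for _ in range(N_WOMEN)]

single = [spot_sample(t) for t in true_levels]
ten_avg = [sum(spot_sample(t) for _ in range(10)) / 10 for t in true_levels]

def pct_flagged(values):
    """Percentage of individuals who would be labelled deficient."""
    return 100 * sum(v < ADEQUATE for v in values) / len(values)

# The population median is recovered well from single samples,
# but individual classification from one sample is very noisy.
print(f"Median of single samples: {sorted(single)[N_WOMEN // 2]:.0f} ug/L")
print(f"Flagged deficient from 1 sample:   {pct_flagged(single):.0f}%")
print(f"Flagged deficient from 10 samples: {pct_flagged(ten_avg):.0f}%")
```

With these assumed numbers, far more women are flagged as deficient from one sample than from the average of ten, even though the underlying population is the same, which is exactly why repeated samples are needed to judge an individual.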

Ironically, the many pregnant women who shift to organic foods in the belief that this will help ensure as healthy a baby as possible will see a very significant fall in iodine intake. Organic animal production greatly restricts the use of mineral and vitamin supplements in animal feeds. A recent survey of the iodine content of milk from organic and conventional farms showed that organic milk is 42% lower in iodine than conventional milk, and milk accounts for almost half of UK iodine intake. In fact, pregnant women should be counselled to avoid organic milk.


[1] This blog was prompted by an excellent paper presented to the recent Nutrition Society meeting by Sarah Bath, a PhD student at the University of Surrey
[2] Vanderpump MPJ et al (2011). Iodine status of UK schoolgirls: a cross-sectional survey. The Lancet, 377, 2007-12.

Monday, July 16, 2012

Nature, nurture and the control of food intake


Some weeks ago, my blog was entitled “Ever seen a fat fox?”, the gist of which was that whereas biology can tell us a lot about the control of food intake in animals, in man, with his large pre-frontal cortex and highly social existence, social aspects may be far more important than the biology. I return to this theme today in light of some intriguing research carried out jointly by researchers at the Universities of Washington and Toronto[1]. The research centres on the phenomenon of restrained eating, so let me first explain what this means. As I have previously pointed out, overweight and obesity are not usually the outcome of any conscious decision to get fat (Sumo wrestlers excluded). Restrained eating, on the other hand, is a very conscious decision, as the name implies, to count calories in order not to gain weight or to maintain a weight loss. Restrained eaters have frequently experienced some weight gain and have then made a lifetime commitment to being fussy eaters.

We know that obesity is highly heritable: based on studies of identical and non-identical twins, the figure for heritability is as high as 90%. So it seemed a reasonable question to ask whether brain function in response to food cues in restrained eating is under a similar level of genetic control. The researchers therefore used the University of Washington's register of twins. In 2006, the twins took part in a health survey that involved the completion of a ten-question restrained-eating questionnaire, effectively examining the subjects' concerns about chronic dieting and weight gain. The researchers then mined the database to identify sets of identical twins that differed radically in their restrained eating patterns: effectively, each twin pair had one serious restrained eater and one with no restraint as regards eating. By using identical twins, the study immediately eliminated any genetic difference, since identical twins are genetic clones. The participants were all female and in their early thirties. The twins were shown photographs of “fattening” and “non-fattening” foods and then underwent a brain scan using a system known as functional Magnetic Resonance Imaging (fMRI). This was then repeated after the volunteers had consumed a milkshake.
The first test showed that the twin with the restrained-eating personality showed much higher brain activity following exposure to pictures of high-calorie “fattening” foods than the twin with no tendency to food restraint. Specifically, the areas most affected were the amygdala (involved in the emotional processing of external cues), the occipital lobe (involved in visual processing) and the right thalamus (a relay station for sensory signals). Interestingly, there was no difference between restrained and non-restrained eaters in the prefrontal cortex. This is somewhat surprising to a non-neurobiologist such as myself, since this is the area of the brain associated with all the higher complex brainpower of humans, and since humans are the only species that exhibits restrained eating, one could be forgiven for thinking that the pre-frontal cortex would be involved. Things get more complex after the subjects had consumed the milkshake. In the restrained eaters, all those areas of enhanced activity seen in the first test with pictures of high-calorie “fattening” foods were greatly reduced. Strangely, however, the sight of the non-fattening foods now switched on several parts of the brain, which was not observed in the non-restrained twins.
The authors point out that the fMRI data are consistent with other data showing that restrained eaters are highly sensitive to external cues, which in this study led to enhanced brain activity in the three regions mentioned. They see this as a conflict between food appeal and the desire to restrain eating. After the milkshake, this enhanced activity diminished, which the authors say may have been “cognitively driven”. I take that to mean that the intake of the milkshake “woke them up”, in a sense, so as to inhibit their natural desires.
For me, the big deal in this paper is that an acquired habit regarding the regulation of food intake, restrained eating, is not entirely genetically driven. Moreover, it is not associated with the pre-frontal cortex, which makes man man and mice mice. Previous work by the same authors in a very large twin study found that restrained eating was only partly heritable, with a heritability of 35% to 50% (95% confidence interval). So some element of restrained eating is inherited and the rest is learned. Such is the utter complexity of the regulation of food intake in humans. So, the next time you are at a conference and someone starts to discuss the regulation of food intake in rats and mice, take out the smartphone and do your e-mails!
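For readers curious how twin studies yield figures such as the 90% heritability of obesity or the 35-50% for restrained eating, the classical approach is Falconer's formula, which doubles the difference between the trait correlations of identical and fraternal twins. The correlations below are hypothetical, chosen only to show the arithmetic, not values from either study.

```python
def falconer_heritability(r_mz, r_dz):
    """Classical Falconer estimate: h^2 = 2 * (r_MZ - r_DZ).

    r_mz: trait correlation between identical (monozygotic) twins
    r_dz: trait correlation between fraternal (dizygotic) twins
    MZ twins share ~100% of their genes and DZ twins ~50%, so doubling
    the difference in correlations attributes it to genetics.
    """
    return 2 * (r_mz - r_dz)

# Hypothetical correlations, chosen only to illustrate the arithmetic:
print(round(falconer_heritability(0.85, 0.40), 2))  # 0.9, BMI-like
print(round(falconer_heritability(0.60, 0.40), 2))  # 0.4, restrained-eating-like
```

The formula is a simplification (it assumes, among other things, equal shared environments for both twin types), but it captures the logic: the more the identical twins resemble each other beyond what fraternal twins do, the more genetic the trait.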



[1] Schur EA et al (2012). Acquired differences in brain responses among monozygotic twins discordant for restrained eating. Physiology & Behavior, 105, 560-567.

Monday, July 9, 2012

Calorie counting on menus ~ The US experience


The Minister for Health here in Ireland wants to introduce calorie counts on menus and has given the industry six months to implement the proposal; if they fail to do so, legislation will be introduced. All packaged food requires full nutrition labeling, so it would seem quite reasonable to require the food service sector to follow suit. Calorie counts on menus were first introduced in New York in 2008 and, in 2010, the US Congress passed an act requiring menu labeling in all restaurant chains with 20 or more locations. Researchers at the University of North Carolina conducted a systematic review of the impact of this legislation on actual average caloric intake in the US food service sector. A systematic review sets out very clearly the criteria that a published paper must meet in order to be considered by the reviewers. In this case, the studies had to have an experimental or quasi-experimental design comparing a calorie-labeled menu with a menu without any caloric data. The review considered only studies with data on either consumption or purchase and, of course, only English-language publications[1].
They identified 164 titles, of which only 32 appeared from the title to meet the entry criteria. Having read the abstracts of these 32, the reviewers read 18 papers in full, of which 7 were included in the review. Two reported reductions in calorie intake with calorie labeling, 3 reported no change and 1 reported an increase. The final study examined the 11 largest fast food chains: 3 showed a decrease in average calories purchased (McDonald's -44, Au Bon Pain -80 and KFC -59), 1 showed an increase (Subway +133) and 7 showed no change (Burger King, Wendy's, Popeye's, Domino's, Pizza Hut, Papa John's and Taco Bell)[2].
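Taking the eleven per-chain figures just quoted at face value, and treating the “no change” chains as zero, a quick unweighted average shows how the single increase at Subway nearly cancels the three decreases. This ignores each chain's sales volume, so it is an illustration of the arithmetic, not the review's own statistic.

```python
# Per-chain change in average calories purchased after menu labeling,
# as quoted above (kcal); "no change" chains entered as 0.
changes = {
    "McDonald's": -44, "Au Bon Pain": -80, "KFC": -59, "Subway": +133,
    "Burger King": 0, "Wendy's": 0, "Popeye's": 0, "Domino's": 0,
    "Pizza Hut": 0, "Papa John's": 0, "Taco Bell": 0,
}

# Unweighted mean across the 11 chains -- a rough sketch only, since
# it gives every chain equal weight regardless of market share.
mean_change = sum(changes.values()) / len(changes)
print(f"Mean change across chains: {mean_change:+.1f} kcal")  # -4.5 kcal
```

A net swing of under five calories per purchase is, for practical purposes, no change at all, which is the review's overall message.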

This might sound like music to the ears of the food service sector, but before the rapture begins let's just ask ourselves whether the data should surprise us. Research shows that consumers do not rate obesity and overweight as an important risk for them personally, but they do see it as a risk to society as a whole. The explanation is that although a consumer may be fat, they feel themselves to be in control of the situation: if and when they decide to lose weight, they can manage that without any doubt. However, they are not convinced that the rest of society has such marvelous self-control, and hence weight is a societal issue but not a personal one. So if they are asked for their opinion on the listing of calories on menus, they will see the value. Now, when consumers go out to dine in a cafeteria, fast food outlet or restaurant, those that are in the mode of counting calories will be able to benefit from menu labeling. Since this is a minority, we should not be surprised that, on average, there was no impact of caloric labeling. If the research were confined to those who wanted to lose weight, then almost certainly the outcome would be positive.

One useful and informative paper comes from Tacoma-Pierce County in Washington and was written by the members of the County Health Department who set out to promote menu labeling in a programme called SmartMenu[3]. The programme targeted locally owned restaurants rather than the chains, specifically to see how local restaurants, with fewer resources than the chains, could cope with the challenge. Of the 600 restaurants contacted, only 24 agreed to participate and, of these, only 18 finally posted the data on their menus. By far the biggest barrier was getting the menu items into a standardised format that could be entered into nutritional analysis software. In the words of the authors: “The challenge for locally owned restaurant owners who are not using standardized recipes to participate in this programme cannot be overstated”. The average time from a signed agreement to participate to the posting of the menus was 8 months, and the costs for the restaurants ranged from $1,500 to $8,400. Besides time, complexity and cost, other barriers included the perceived business risk of labeling (the “I got fat eating in your restaurant, which mislabeled the caloric value of my favourite dish” lawsuit) and the low perceived demand for calorie labeling.

As ever, things are not as straightforward as first imagined. That does not mean we shouldn't try to label menus if indeed we believe it will help those who are dieting and who are generally weight-conscious. A few further observations can be made. We must be aware that there are consumers who rate monetary value higher than health when purchasing foods, and who might in fact opt for the best value in terms of calories per euro. Based on the experience of Tacoma-Pierce County, someone is going to have to invest in this if it is going to work; if it is the restaurant owners, then guess who is ultimately going to pay for the service. Then again, when you go out for dinner, cost is not really an issue. Finally, last night we ate in a delightful Thai restaurant and nobody ate everything they were served. How many calories were left on the plate?



[1] English is the language of scientific publication, and papers published in other languages are usually excluded from systematic reviews. To use them would require a full translation, with the help of the authors to retain accuracy, and that is not feasible. Moreover, journals in non-English languages tend to have a local focus and a very low impact factor.
[2] Swartz JJ et al (2011). International Journal of Behavioral Nutrition and Physical Activity, 8, 135.
[3] Britt JW et al (2011). Health Promotion Practice, 12, 18-24.

Tuesday, July 3, 2012

Fructose - challenging the myths


In the past, the sugar in most sweetened beverages came from either sugar beet or sugar cane. In both, the sugar is sucrose: one molecule of glucose linked to one molecule of fructose. In the late 1960s, an alternative to sugar was developed which was cheaper and less prone to the volatility of the global sugar market in terms of price and volume. This alternative, known as High Fructose Corn Syrup (HFCS), is produced in two stages: ordinary cornstarch is first broken down to its basic constituent, glucose, and some of that glucose is then treated with a natural enzyme that converts it to fructose. The two sugars, glucose and fructose, can then be blended in varying ratios; for the soft drinks industry, the ratio is 55% fructose to 45% glucose.

For some time now, various “experts” have come down heavily on fructose for its supposedly deleterious effects on health. Google the word ‘fructose’ and within the top three sites you find “Fructose: sweet but dangerous” and “Sugar may be bad but fructose is far more deadly”. The latter has a link to a YouTube video by Robert Lustig, a professor of pediatrics at the University of California, San Francisco, entitled “Sugar: the bitter truth”. In the video, he outlines his belief that our modern food supply is “adulterated, contaminated, poisoned and tainted”, and that the culprit is fructose. It really is difficult to understand the use of these terms by a medical professional: they are intended to scare people and grab their attention so that the learned doctor can give you the diagnosis and the cure. He goes on to ask the audience: “What is in Coke?” First he tells us that there is caffeine, which, being a mild diuretic, will make us thirsty, and he then says that a can of Coke has 55 mg of sodium, which “is like drinking liquid pizza”. As far as I can tell from a 30-second search of the USDA online food composition database, an average serving of pizza from a fast food chain contains about 800 mg of sodium. Why does the good doctor choose to distort the facts? He argues that the sodium and the caffeine are there to make you thirsty. He then asks his audience why sugar is added to Coke. Why, of course, it's to mask the salt content!

The good doctor goes on to explain that what fructose does to your blood proteins is what happens to your steak when it goes brown. Wrong again, doc. When red meat is cooked, the iron-containing protein in muscle, myoglobin, begins to lose its iron, and as the temperature rises the myoglobin loses more and more iron, turning brown. White meats such as chicken and pork do not brown in the same way because they contain far less myoglobin. What Prof Lustig probably meant was that the effect of fructose on blood proteins is, at a considerable stretch of the imagination, similar to the browning of toast, where sugars react with proteins under strong heat. Of course, you don't toast your blood, so the analogy is far-fetched.
From the scientific point of view, a very recent systematic review of controlled human feeding studies shows that among diabetics, in whom high blood sugar levels can react with proteins, causing undesirable clinical consequences, substituting fructose for other forms of carbohydrate actually lowers this process, known as glycation[1]. This work was funded by the Canadian Institutes of Health Research and completed by Professor David Jenkins, a world authority on diet and blood glucose. In a previous exchange of correspondence in the Journal of the American Dietetic Association, Jenkins challenged Lustig's hypothesis that fructose promotes the accumulation of fat in the liver and adipose tissue, in an article entitled “Is fructose a story of mice but not men?” The authors point out that animal studies need to be very carefully interpreted before their findings are extrapolated to man. In mice and rats, the conversion of carbohydrate to fat can reach 70%; in man, the efficiency of this conversion is much lower, at about 5%. They point out that very high intakes of fructose in humans can double the rate of conversion of carbohydrate to fat, but the level remains at about 10%, a fraction of what is seen in experimental animals. Moreover, animal models often use levels of fructose well above (upwards of 6 times) those normally associated with human diets.

The idea that carbohydrate conversion to fat in man is low is often greeted with surprise, if not disbelief, so a little explanation is worthwhile. Many animals readily convert carbohydrate to fat, but when physiologists studied this in humans, using live-in calorimeters to measure precise changes in the oxidation of fat and carbohydrate, they found that when excess calories were consumed as carbohydrate, the oxidation of body fat stores fell dramatically: the excess carbohydrate calories were used for fuel, thus sparing the fat. Thus, if you require 2,500 calories a day from a typical mixed diet (fat 35%, carbohydrate 50% and protein 15%) and you consume an additional 200 calories of carbohydrate, then instead of burning all the ingested fat (35% of calories, which would be about 875 calories or 97 grams), you would burn only 675 calories from fat, allowing 22 grams of fat to be spared. So carbohydrate spares the burning of fat in man, but it is poorly converted to fat, even as fructose and even at high doses of fructose.
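The back-of-envelope sum in this paragraph can be checked directly, using the standard factor of 9 kcal per gram of fat:

```python
KCAL_PER_G_FAT = 9  # standard Atwater energy factor for fat

def fat_spared(total_kcal=2500, fat_fraction=0.35, extra_carb_kcal=200):
    """Reproduce the worked example above: surplus carbohydrate calories
    displace fat oxidation roughly one-for-one, so the fat that would
    have been burned is stored ("spared") instead."""
    fat_kcal = total_kcal * fat_fraction          # 875 kcal eaten as fat
    fat_burned_kcal = fat_kcal - extra_carb_kcal  # 675 kcal still oxidised
    spared_g = extra_carb_kcal / KCAL_PER_G_FAT   # ~22 g of fat spared
    return fat_kcal, fat_burned_kcal, spared_g

fat_kcal, burned, spared = fat_spared()
print(f"Fat intake:   {fat_kcal:.0f} kcal ({fat_kcal / KCAL_PER_G_FAT:.0f} g)")
print(f"Fat oxidised: {burned:.0f} kcal; fat spared: {spared:.0f} g")
```

The one-for-one displacement is itself a simplification of the calorimeter findings, but it is the assumption the text's arithmetic rests on, and the numbers (875 kcal, 97 g, 675 kcal, 22 g) come out as quoted.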

Sugar bashing is popular and fructose bashing is even more fun, but it is basically built on very bad science. Then again, why let bad science get in the way of a good story, or of a nice video with 2.6 million viewings?




[1] Effect of fructose on glycemic control in diabetes: a systematic review and meta-analysis of controlled feeding trials. Diabetes Care (2012), 35.

Monday, June 25, 2012

Obesity and social disadvantage


Frequently, when chatting with my middle-class friends about nutrition and health, I have to argue long and hard against some preconceived notion of the truth behind the topic of discussion. Of all of these issues, the one I encounter most frequently, and the one that meets most resistance to change, is the view that the problem of obesity is really a problem of the lower socio-economic groups. A frequent argument put forward is that “if you look at their shopping trolleys in supermarkets, they are laden with all sorts of junk foods”. So let me give you the facts. Taking the Irish population as a whole and using the IUNA database, mean body mass index (BMI, kg/m²) is 26.8 among professional workers, 27.4 among non-manual workers, 28.4 among skilled workers and 26.0 among the unskilled workforce. An acceptable BMI is 25, and I should add that the variances (standard deviations) of these figures are broadly similar. Now you can look at this and say: “See, I told you so. There is a graded rise in BMI from professional to skilled workers”, and this goes nicely with the social class stereotype, even though the unskilled group has the lowest mean BMI of all. The socio-economically disadvantaged are seen as lacking money to buy healthy food, as so poorly health-aware as not to know good food choices from bad, and, in some cases, as lacking the literacy skills to read labels correctly. This leads the debate on public health nutrition to shift into policy decisions in which actions toward social issues begin to dominate. In case you think these Irish data are unique, please consult the Health and Social Care Information Centre's report, Statistics on Obesity, Physical Activity and Diet: England, 2011.
To quote the report directly: “Table 7.3 on page 128 of the HSE 2009 report shows that there are very little differences in mean BMI by equivalised household income for men with the exception of those in the lowest income quintile who had slightly lower BMI; in contrast for women, those in the lower income quintiles had a higher mean BMI than women in the highest quintile. Among women, the proportions who were obese were higher in the lowest three income quintiles (ranging from 27%-33%) than women in the highest two quintiles (ranging from 17%-21%). The relationships between BMI and income for men were less clear”. Canadian data are quite similar, but US data[1] do show quite a different pattern, with obesity rising more rapidly among the socially disadvantaged. However, these data, when carefully examined, reveal some intriguing facts. Among white men and women, the rise in the percentage obese has been broadly equal across socio-economic status (SES) groups over the 30 years from 1970 to 2000. In 2000, the lowest SES group had an obesity rate of 28.3% in men, compared with 23.9% in the highest SES group; for women, the figures were 36.3% and 26.6% respectively. However, among black males, the percentage obese jumped from 4% to 33% over that 30-year period in the highest SES group, while the middle and lowest SES groups started at 15%, which grew to 24%. For black women, the reverse was seen. How does one begin to make sense of that?


I should add that if you look at dietary patterns across socio-economic status in the Irish IUNA data, you see no biologically meaningful difference in the percentage of energy from fat or sugar, and this is borne out by data from the Household Budget Survey, which tracks expenditure on food and shows no difference in food purchasing patterns across socio-economic status.


There is a bottom line here, and it is that obesity is everywhere. To argue over one unit of BMI between the haves and the have-nots is quite simply to miss the point. Statisticians can construct models which show that, controlling for age, gender, smoking and so on, the relative risk of obesity rises with lower socio-economic status. They are welcome to that, but if it begins to drive public health nutrition policy toward purely social solutions, then they are being unhelpful. Of course, social disadvantage needs to be a factor we consider in all aspects of public health. But what is driving the increased adiposity of judges, teachers, doctors and so forth? It is not a lack of knowledge, nor of literacy, nor of income. The do-gooders of public health nutrition need to read the real population statistics and make appropriate recommendations.


[1] Wang Y and Beydoun MA. The obesity epidemic in the United States: gender, age, socioeconomic, racial/ethnic, and geographic characteristics. A systematic review and meta-regression analysis. Epidemiologic Reviews 2007;29:6–28.

Monday, June 18, 2012

Brain food ~ get it early


The human brain is, pro rata, bigger and far more complex in structure than that of any other species. It is a very busy organ, consuming about 25% of the daily caloric intake of an average adult. This rises to about 50% of caloric intake in children aged 1 to 6 years, reaches 55% in 4- to 6-month-olds and a staggering 75% in newborn babies. From around the age of 30, the human brain begins to shrink at the rate of 1 milligram per year, and if that seems a small rate of decline (a lifetime reduction of 8% in volume), the evolution of the human brain to its present size also proceeded at 1 milligram per year. Without question, the biggest fear people have on entering old age is a loss of cognitive function, with Alzheimer’s disease the worst-case scenario. The issue is so important that it attracts all manner of snake-oil merchants promising this or that diet to stave off any decline in cognitive function.

The best place to start the task of ensuring a healthy brain throughout adult life is during pregnancy. During the third trimester of pregnancy there is a growth spurt in brain development, which continues for the first 24 months of life. The human brain is about 60% fat, and so when it comes to any discussion of nutrition and brain function, fat is bound to dominate. Specifically, the brain is rich in long-chain, highly polyunsaturated fatty acids abbreviated to EPA (eicosapentaenoic acid) and DHA (docosahexaenoic acid). These fats cannot be synthesised by the human body and therefore have to be obtained from our diet. Fatty fish are by far the best source of these fatty acids, and this raises an interesting question: if the human brain was so important in our evolution, why did we not develop the capacity to synthesise these fatty acids ourselves? Stephen Cunnane, author of “Survival of the Fattest”, makes the case that man migrated from the savannah to the shoreline of lakes, rivers and deltas, where an abundance of fish, shellfish, eggs, birds and wildlife existed. Such a food chain is rich in the brain-type fatty acids EPA and DHA, so there would have been no evolutionary advantage in maintaining the energy-demanding metabolic pathways to manufacture these fatty acids ourselves.

The growing foetus is totally dependent on a maternal supply of these fatty acids for brain development, and when they enter the mother’s blood supply after a meal, they are preferentially transferred to the foetus. Other types of fats might be shared with the mother’s own fat reserves, but not these precious fats. After birth, breast milk should contain adequate levels of these fatty acids, and so too should infant formula. The problem arises when post-natal nutrition is inadequate. The first 24 months of life see a tremendous growth in brain complexity, especially in the frontal cortex, through which the new baby acquires the social norms and language of its environment. Inadequate nutrition in this period will greatly diminish an individual’s intellectual capital for the rest of their life.

It goes without saying that an adequate intake of these fatty acids is required throughout adulthood, and there are ample studies showing that inadequate intakes of fish oil type fatty acids are associated with a higher risk of loss of cognition in later adulthood. However, when the putative link between these fatty acids and cognitive decline is tested in dietary intervention studies, the evidence just evaporates. One possible reason for this lies in the genetic predisposition to Alzheimer’s disease. There is a protein strongly involved in fat transport and distribution, abbreviated to Apo E, which exists in three different forms (Apo E2, E3 and E4); we inherit one variant from each parent. Thus 60% of the population has E3/E3, and they account for 65% of all Alzheimer’s cases. A smaller fraction (23%) of the population has the E4 variety, either alone (E4/E4) or in combination (E4/E3), and this group accounts for nearly half of all cases of Alzheimer’s disease. With such a strong genetic dimension, intervention studies will eventually have to be conducted in which the individual’s genetic make-up is taken into account. Moreover, the duration of the studies will have to become much longer if we are to identify a truly protective effect, should such an effect actually exist. In addition to the fish oil type fatty acids, there are similar data for some of the B vitamins, most notably folic acid and vitamin B12. Again, the association data seem very strong, but again, when dietary intervention studies are conducted, little supporting evidence emerges. A higher body mass index in middle age is also a risk factor for Alzheimer’s disease, and my guess is that this arises because the more adipose tissue you have, the more EPA and DHA is diverted to that tissue and away from the blood, which would normally be their route to the brain.
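To see how strongly those Apo E percentages skew risk, you can compare each group’s share of Alzheimer’s cases with its share of the population; a minimal back-of-envelope sketch in Python, using only the figures quoted in the paragraph above (nearly half of cases taken as 50%):

```python
# Enrichment = share of Alzheimer's cases / share of the population,
# i.e. how over- or under-represented a genotype group is among cases.
# Figures are the ones quoted in the text above.
population_share = {"E3/E3": 0.60, "E4 carriers": 0.23}
case_share = {"E3/E3": 0.65, "E4 carriers": 0.50}

for group in population_share:
    enrichment = case_share[group] / population_share[group]
    print(f"{group}: {enrichment:.2f}x")
# E3/E3: 1.08x  (barely over-represented among cases)
# E4 carriers: 2.17x  (more than twice over-represented)
```

The point of the arithmetic is simply that E3/E3 carriers fall among cases roughly in proportion to their numbers, while E4 carriers are over-represented by more than twofold, which is why intervention trials that ignore genotype may wash out any real effect.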

In Celtic mythology, the salmon was referred to as the fish of knowledge, and maybe, even as these mythologies evolved, there was anecdotal evidence that fish was good for the brain. In modern Celtic Ireland we have a low intake of EPA and DHA and, remarkably, 75% of our intake of these vital nutrients comes from fish oil capsules rather than fish. Some achievement for an island race!