Food Fortification in a Globalized World
Ebook, 1,403 pages (approximately 14 hours)


About this ebook

Food Fortification in a Globalized World outlines experiences over the past 50 years—and future potential—for the application of food fortification across a variety of foods in the industrialized and developing world. The book captures recent science and application trends in fortification, including emerging areas such as biofortification, nutraceuticals, and new nutrient intake recommendations, standards, policy, and regulation. The book proposes a balanced and effective food fortification strategy for nations to adopt. By covering the most technical scientific details in an approachable style, this work is accessible to a range of practitioners in industry, government, NGOs, academia, and research.

Food fortification has become an increasingly significant strategy to address gaps in micronutrient intakes in populations, with measurable impact in both industrialized and developing countries. While the positive impacts are well recognized, there are new concerns in some countries that excessive fortification of foods, outdated nutritional labeling rules, and misleading marketing tactics used by food manufacturers may result in young children consuming harmful amounts of some vitamins and minerals.

  • Presents the latest science on fortification for the prevention of micronutrient deficiencies
  • Includes emerging areas such as biofortification, nutraceuticals and new nutrient intake recommendations, standards, regulations, practices and policies from around the world
  • Summarizes evidence of application of food fortification and measured impact on public health
  • Discusses how public policy impacts fortification of foods and nutritional deficiencies
  • Considers the complex economics of and market for fortified foods
Language: English
Release date: Jun 29, 2018
ISBN: 9780128028971


    Book preview

    Food Fortification in a Globalized World - M.G. Venkatesh Mannar


    Section I

    Need and Approach

    Outline

    Chapter 1 Food Fortification: Past Experience, Current Status, and Potential for Globalization

    Chapter 2 Prevalence, Causes, and Consequences of Micronutrient Deficiencies. The Gap Between Need and Action

    Chapter 3 Developing National Strategies to Prevent and Control Micronutrient Deficiency: The Role of Food Fortification

    Chapter 1

    Food Fortification

    Past Experience, Current Status, and Potential for Globalization

    M.G. Venkatesh Mannar¹ and Richard F. Hurrell²,    ¹University of Toronto, Toronto, ON, Canada,    ²Swiss Federal Institute of Technology, Zurich, Switzerland

    Abstract

    In the first half of the 20th century, food fortification programs in Europe and the United States were highly successful in virtually eliminating micronutrient deficiency diseases such as goiter, cretinism, pellagra, rickets, and xerophthalmia, which had caused high levels of morbidity and mortality, particularly in children. This success gave the impetus to the fortification of further staple foods, including cereal flours, milk products, condiments, and fats and oils, in many other countries worldwide, to the development of targeted fortified foods particularly for infants and young children, and to the introduction of fortified manufactured foods such as breakfast cereals that are market driven as well as public health driven. The efficacy of these programs in improving intake and status of iodine, iron, folate, vitamin D, and vitamin A is proven. However, widespread micronutrient deficiencies still occur, particularly in low- and middle-income countries. While large-scale fortification of staple foods is on the rise, overall coverage remains relatively low. And while the scientific and technical issues of fortification are well known, the implementation science for scaling up new national programs in low- and middle-income countries is still under development.

    Keywords

    Micronutrients; deficiency diseases; fortified foods

    Chapter Outline

    1.1 Background

    1.2 Early Successes With Food Fortification

    1.3 Types of Fortification

    1.4 Selection of Vehicles

    1.5 Biofortification

    1.6 Current Situation, Issues and Challenges

    1.7 Concluding Thoughts

    References

    1.1 Background

    Micronutrient deficiencies are a major global public health problem that can affect all age groups in both industrialized and developing countries. One hundred years ago, multiple micronutrient deficiencies were common in poor rural and urban communities of industrialized countries. They were largely eliminated as economic conditions improved and diets improved to include micronutrient-fortified foods as well as greater access to animal source foods. Some deficiencies, however, such as those of iron and iodine, still persist, while others, such as those of folic acid, vitamin B12, calcium, and vitamin D, have emerged or reemerged.

    At the present time, micronutrient deficiencies in the developing world are far more severe than in industrialized countries and are a major impediment to the future development of many nations. Some 2 billion people, mainly women and children in developing countries, are reported to suffer from iron, iodine, vitamin A, and zinc deficiencies. Such deficiencies lead to a range of disabilities including impaired brain development and cognition, impaired immunity against disease, poor pregnancy outcome, poor growth, impaired work capacity, blindness, and even death. Multiple micronutrient deficiencies often occur in the same individual and are primarily due to the regular consumption of plant-based diets that include little or no animal source foods or, in the case of iodine and selenium deficiencies, to low levels of these micronutrients in soil leading to low levels in plant and animal foods. Such diets provide intakes for a range of micronutrients that are below the individual’s metabolic needs. These low micronutrient intakes, coupled with widespread infections, poor hygiene, and poor sanitation in developing countries, lead to a variety of poor health outcomes that restrict the intellectual potential of the individual, reduce the earning power of the family, and decrease the gross domestic product (GDP) of the country. This situation calls for urgent action.

    There are several approaches to increasing micronutrient intake. They include the fortification of staple foods, condiments, infant foods, and some industrial products; the biofortification of food staples by plant breeding techniques; dietary diversification; and supplementation with pharmacological doses. In developing countries, additional public health interventions including infection control, improved hygiene and sanitation, and promotion of breast-feeding may also be necessary if micronutrient status is to be improved. Dietary diversification is easier for the more affluent populations who can afford animal source foods, and biofortification is most useful for low-income populations in developing countries who consume mainly locally grown foods and have little access to processed foods. These food-based approaches are primarily designed to prevent micronutrient deficiencies. Periodic supplementation with pills or capsules containing pharmacological doses of micronutrients can be used to prevent or treat deficiencies and has been commonly used to provide additional vitamin A and a combination of iron and folic acid.

    Fortification of widely consumed foods with vitamins and minerals is a public health strategy to enhance nutrient intakes of the population without increasing caloric intake. Food fortification is a medium- to long-term solution to alleviate specific nutrient deficiencies in a population. National fortification programs involve the addition of measured amounts of nutrient-rich premix containing the required vitamins and minerals to commonly eaten foods during processing. Populations with lower purchasing power consume mainly staple foods and condiments, making these foodstuffs the ideal vehicles to provide micronutrients and to prevent the development, or to decrease the prevalence, of micronutrient deficiencies. The foods identified for fortification must be commonly eaten foods that are centrally processed. This allows the fortification process to be dovetailed into the existing food production and distribution systems. In this way, existing food patterns do not change and there is no need for special compliance by the individual. In most developing countries, the choice of vehicles is limited to a handful of staple foods and condiments such as cereals, oils and fats, sugar, salt, and sauces. The vitamins and minerals used for fortification typically include vitamins A, D, folic acid and other B-complex vitamins, iodine, iron, and zinc. The start-up cost of food fortification is relatively low for the food industry, and often the recurrent costs can be passed on to the consumer. The benefits of fortification can extend over the entire human life cycle. Food fortification is thus one of the most cost-effective means of overcoming micronutrient malnutrition, which has helped secure its place in public policy. According to the World Bank, "…probably no other technology available today offers as large an opportunity to improve lives and accelerate development at such a low cost and in such short a time" (World Bank, 1994).

    There is, however, no single model appropriate for all population segments, making it imperative to design and implement complementary approaches to ensure the greatest penetration of fortified food products. There are also specific situations where large-scale food fortification can be enhanced by targeted fortification to reach vulnerable population subgroups, such as home fortification for vulnerable families, complementary foods for infants and young children (micronutrient powders, lipid-based nutrient supplements, fortified blended foods, etc.), and special foods for older children and pregnant and lactating women (biscuits, yogurt, beverages, etc.) (Moench-Pfanner et al., 2012).

    Many national food fortification programs have been introduced in both industrialized and developing countries over the last 70 years and have played an important role in improving public health. In the United States and Canada, enriched and/or fortified foods contribute a large proportion of the intakes of vitamins A, C, and D as well as thiamine, iron, and folate. Micronutrient deficiencies have been greatly decreased or, as with iodine, virtually eliminated on a global scale. Progress has accelerated in the past decade. Today there are salt iodization programs in approximately 140 countries worldwide; 83 countries have mandated at least one type of cereal grain fortification, 20 countries fortify edible oils, nine countries fortify sugar, and several others fortify rice, milk, or condiments. The current low levels of iron deficiency in the United States have been attributed to fortified foods, with almost one-quarter of iron intake in the US diet coming from fortified foods, much of that from cereal products. Nevertheless, while many well controlled scientific studies have demonstrated the efficacy of iron-fortified foods, the impact of large-scale iron fortification of cereal flours on improving iron status in national populations has only recently been confirmed (Barkley et al., 2015; Martorell et al., 2015).

    Another success has been folic acid fortification: since 1998, following the introduction of mandatory folic acid fortification of cereal-grain products in the United States, Canada, and Chile, there has been a 30%–70% reduction in neural tube defects (NTDs) in newborns, encouraging some 75 other countries to add folic acid to flour. A few countries have resisted, however, due to concerns over consumer safety. Market-driven industrial foods have also played a role in alleviating micronutrient malnutrition in industrialized countries. In Europe, a comparative analysis of dietary surveys suggests that fortified foods, especially voluntarily fortified breakfast cereals in France, Ireland, the United Kingdom, and Spain, have usefully contributed to increasing vitamin and mineral intakes during childhood and adolescence.

    The introduction of dietary reference values in 1942 by the United States gave the first clear indication of the quantities of micronutrients needed in diets to maintain optimum health. These have been updated and extended several times by the United States, WHO, and many other countries, and are the yardstick for defining the level of micronutrients added to fortified foods. The introduction of a tolerable upper limit for most nutrients protects the consumer from overfortification. A major step forward in standardizing food fortification practices was the publication of the WHO guidelines for the fortification of foods with micronutrients. These appeared in 2006 (WHO/FAO, 2006) and were updated for wheat and maize flour in 2009 (WHO, FAO, UNICEF, GAIN, MI, & FFI, 2009). The guidelines made evidence-based recommendations with respect to fortification compounds and fortification vehicles and, importantly, described how to define a fortification level. The guidelines also discuss monitoring and evaluation of fortification programs, introducing fortification legislation, the need for advocacy, and cost-effectiveness based on the expected health benefits.

    The Lancet Maternal and Child Nutrition Series (Maternal and Child Nutrition Study Group et al., 2013), the Copenhagen Consensus, and the Scaling Up Nutrition (SUN) Movement all recognize and endorse staple food fortification as a sustainable, cost-effective intervention with a proven impact on public health and economic development. The Copenhagen Consensus Center is a think tank that uses cost–benefit analysis to establish priorities for advancing global welfare. Each year a range of global problems are evaluated and ranked by a panel of economists that includes Nobel laureates. In 2008, the expert panel considered 10 great global challenges. The panel noted the exceptionally high ratio of benefits to costs of micronutrient interventions and ranked micronutrient supplementation (vitamin A and zinc) as the top priority, with micronutrient fortification (iron and salt iodization) as priority number three, and biofortification as priority number five. In 2012, the Copenhagen Consensus recommended increasing micronutrient intake by one or more of the different strategies as the number one priority for the greatest return on investment (Copenhagen Consensus, 2012). They noted that GDP losses from undernutrition can be 2%–3% per year. According to the Copenhagen Consensus, the return on investment of food fortification is one of the highest development dividends. For example, in the case of iodine, the cost of salt iodization is less than 20 US cents per person per year, and for every $1 spent, as much as $30 is saved in medical and nonmedical expenditures. A rough estimate for low- and middle-income countries suggests the cost–benefit ratio of fortification is around 30:1.
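
    To make the arithmetic behind such ratios concrete, the short Python sketch below divides the monetized benefit of a hypothetical national iodization program by its cost. Only the roughly 20 US cents per person per year and the up-to-$30-saved-per-$1-spent figures come from the text above; the population size (and the per-person saving derived from those two figures) are illustrative placeholders, not program data.

# Illustrative benefit-cost arithmetic for a salt iodization program.
# cost_per_person (~US$0.20/year) and the ~$30-per-$1 relationship are taken
# from the text above; the population figure is a hypothetical placeholder.

def benefit_cost_ratio(cost_per_person, saving_per_person):
    """Return the ratio of savings to spending per person."""
    return saving_per_person / cost_per_person

if __name__ == "__main__":
    population = 50_000_000          # hypothetical country of 50 million people
    cost_per_person = 0.20           # ~20 US cents per person per year (text)
    saving_per_person = 0.20 * 30    # up to $30 saved for every $1 spent (text)

    total_cost = population * cost_per_person
    total_saving = population * saving_per_person
    ratio = benefit_cost_ratio(cost_per_person, saving_per_person)

    print(f"Annual program cost: US${total_cost / 1e6:.0f} million")
    print(f"Annual savings:      US${total_saving / 1e6:.0f} million")
    print(f"Benefit-cost ratio:  {ratio:.0f}:1")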

    Although food fortification is common in industrialized countries, there are still important issues to resolve. These include the safety of folic acid fortification, the need for vitamin D fortification, and ensuring that fortification does not provide excess levels of micronutrients and cause negative health consequences. It is in low- and middle-income countries, however, that we are entering a new era of scaling up fortification programs, for while the potential health impacts are well appreciated, the implementation science is less well developed. The first Global Summit on Food Fortification, held in Arusha, Tanzania in September 2015 (The #FutureFortified Global Summit on Food Fortification, 2016), called for national governments to invest more in technical support, oversight, and compliance of food fortification programs. The Summit stressed the need for enforcement of fortification standards, better advocacy to governments on cost-effectiveness, and more evidence to guide fortification policy and program design. To ensure more transparent accountability, it also called for an annual report on the state of fortification globally.

    All aspects of global food fortification are covered in the following chapters, beginning with the current global prevalence of micronutrient deficiencies, daily recommended micronutrient intakes, the different interventions that can increase micronutrient intake, and food fortification technologies. This is followed by a discussion of the different delivery models, including large-scale government-mandated programs, industry market-driven foods, food aid and publicly distributed foods, and biofortification of staple foods. The main food fortification vehicles and the critical micronutrients are discussed individually, and special emphasis has been placed on scaling up and implementing new national programs. This includes financial and business considerations, cost-effectiveness, public–private partnerships, consumer awareness, advocacy, quality control, regulatory monitoring, the role of government, and impact evaluation.

    This overview chapter sets the scene for food fortification, describing the historical development and first successes, the different types of fortification, the choice of the food vehicle, and the development of large-scale national programs, and it points out current issues and challenges.

    1.2 Early Successes With Food Fortification

    We tend to forget that the widespread micronutrient deficiencies that are reported today in low- and middle-income countries were once common in the poor urban and rural populations of Europe and the United States. Goiter, cretinism, anemia, rickets, pellagra, and xerophthalmia were common illnesses until the early 20th century, at a time when vitamins were being discovered, and when low intakes of vitamins and minerals were being linked to the common diseases that so increased morbidity and mortality. Food fortification with micronutrients was a part of the public health response to prevent these illnesses and was rewarded with some remarkable successes (Semba, 2012).

    The first micronutrient deficiency to be targeted by public health programs was iodine, and the first fortified food to be introduced was iodized salt to prevent goiter and cretinism. Fortification of salt with iodine was introduced in Switzerland in 1923 and in Michigan, USA in 1924. Its success led to the voluntary iodization of salt throughout the United States and the virtual elimination of iodine deficiency as a serious public health problem by the late 1930s. At about the same time, it was reported that iron-fortified milk decreased anemia prevalence in infants, although, in 1911–14, the United States had taken an alternative approach to treat anemia in school children in the rural South. They targeted the widespread hookworm infections and took measures to improve sanitation and hygiene, resulting in increased hemoglobin concentrations, increased growth, and better performance on mental development tests, providing a strong reminder that anemia has multiple causes and may need multiple interventions to eliminate completely.

    By the end of the 1930s, the chemical structures of the major vitamins were known, and most could be synthesized enabling their addition to food. At that time, vitamin A deficiency was widespread in Europe especially in Denmark where it resulted in a high mortality of children. Vitamin A was first added to margarine voluntarily in the United Kingdom in 1927 and this practice became mandatory during the Second World War in order to achieve nutritional equivalence to butter.

    Rickets was common in children who lived in the industrial cities of North America and Europe from the 17th until the early 20th century, when over 85% of the children living in these areas had rickets, primarily due to lack of sunshine and insufficient production of vitamin D in the skin. As soon as vitamin D was synthesized in the 1930s, it was used to fortify milk in Europe and North America, which resulted in the eradication of rickets as a major health problem in children.

    In the early 20th century, pellagra was common in the maize-eating populations of the southeast United States. At the peak of the epidemic (1928–30), 7000 individuals died per year from pellagra due to niacin deficiency. Lime treatment of maize, which is commonly used in Central America and releases niacin from its nonbioavailable form, was not practiced. Voluntary enrichment of bread and other grain products with niacin was implemented in 1938 and mandatory fortification followed in 1940; as a result, pellagra had become almost nonexistent by 1950. Iron, thiamine, niacin, and riboflavin were required to be added to wheat flour and other cereal products to replace nutrients lost during the milling process and to reduce the risk of anemia, beriberi, pellagra, and riboflavin deficiency, respectively (Semba, 2012). These early successes paved the way in the latter half of the 20th century for widespread fortification in industrialized countries of flour, salt, milk, infant foods, and manufactured foods such as breakfast cereals and beverages, and are the impetus for the globalization of food fortification and the implementation of new programs in low- and middle-income countries.

    1.3 Types of Fortification

    In 1987, the Codex Alimentarius Commission outlined general principles for adding nutrients to foods (Codex Alimentarius, 1987). It used the terms fortification and enrichment interchangeably, with the following definition: "Fortification or enrichment means the addition of one or more essential nutrients to a food whether or not it is normally contained in the food for the purpose of preventing or correcting a demonstrated deficiency of one or more nutrients in the population or specific population groups." The United States has made the most consistent efforts in establishing food fortification policy and guiding fortification programs. The FDA currently endorses the addition of nutrients to food under four conditions. These are: correcting nutritional deficiencies (e.g., salt iodization); restoration of nutrient losses (such as the addition of micronutrients to white wheat flour); improving the quality of a replacement food (the original rationale for fortifying margarine); and balancing the nutrient content of industrially fabricated foods that replace large proportions of the natural diet. The FDA endorses a standard profile of 22 nutrients for addition to these new foods.

    Fortified foods and fortification programs can be designed, delivered, and controlled in different ways depending on the extent of involvement of the private and public sectors. Programs can be designed for mass fortification, targeted fortification, or market-driven fortification. Mass fortification refers to the addition of one or more micronutrients to staple foods or condiments that are widely consumed by a general population that has an unacceptable public health risk of being deficient in these micronutrients. Flour fortification with iron and folic acid, and salt iodization, are good examples. This type of fortification is usually led by governments but may be voluntary or mandatory. It reaches all sections of the population, including the most at-risk groups such as women and children, but mass fortification also provides micronutrients to population groups, such as adult men, who are already consuming enough micronutrients to meet their requirements. This creates the possibility of excess intake and the potential for negative health consequences if not well controlled. The additional cost of the micronutrient premix for mass fortification is a major factor in low- and middle-income countries when introducing a program, as is ensuring the collaboration of all stakeholders, particularly those in government and industry, but also academia, nongovernmental organizations (NGOs), and consumers.

    Targeted fortification refers to fortified foods that are designed for, and targeted at, a specific population group. The population group is most often infants and young children, but any population group could be targeted, including adolescents, young women, pregnant women, or even recipients of food aid such as displaced persons. Unlike mass-fortified foods, targeted fortified foods are rarely fortified with a single or even a small number of micronutrients. They are usually fortified with a range of critical micronutrients for which the targeted population is at risk of deficiency. Industrially manufactured complementary foods for young children, often based on cereals with milk or legumes, are the main targeted fortified food in this category. Complementary foods are consumed from 6 months of age, as the infant moves from breast milk as the sole source of nutrition to an intermediate weaning diet, until around 2 years when the young child moves onto the normal family diet. The composition of manufactured complementary foods is recommended and regulated by government and international agencies so as to ensure that the child receives adequate nutrition in the period between breast feeding and consuming the family diet. Infant formulas, which may be needed to replace breast milk, are also strictly regulated in their composition of micro- and macronutrients and are formulated to cover all the nutrient requirements of the infant.

    In Europe, North America, and other industrialized countries, some mothers wean their children from breast milk with home-produced complementary foods; however, others purchase manufactured fortified complementary foods from shops or pharmacies. In developing countries, many families cannot afford to purchase manufactured complementary foods, and the home-produced cereal gruels that are fed as complementary foods to young children are deficient in many micronutrients and often also in energy. Public health programs that supply fortified complementary foods to infants and young children have been introduced by international agencies in some developing countries and may also be distributed by national governments through targeted or subsidized programs. Poorer families in the United States can also obtain complementary foods through a government-organized public distribution system (the Special Supplemental Nutrition Program for Women, Infants and Children (WIC)). In recent years, a common and low-cost way of fortifying complementary foods for young children in the developing world has been through the distribution of sachets of micronutrient powders. The powder is sprinkled daily onto the gruel at the time of consumption. Such products contain a multimicronutrient mixture designed to provide all the micronutrients missing from the regular diet. More recently, fortified lipid-based supplements have been similarly added to gruels fed to children so as to provide both the missing micronutrients and additional energy.

    Market-driven fortification refers to manufactured foods that are fortified both for the marketing advantage of the company and for the benefit of the consumer: the food manufacturer uses the nutritional benefit to the consumer as a marketing advantage, while complying with government regulations with respect to the nature and quantity of the specific nutrients added. The fortified foods are targeted at specific population groups. Breakfast cereals and chocolate drink powders, for example, are targeted at children and adolescents. They are commonly fortified with a range of micronutrients at around 30% of the daily requirement and, when widely consumed, can provide useful amounts of those micronutrients often lacking in the diet. In low-income countries, targeted fortified foods are usually out of reach for the poorer communities that are most in need of micronutrients; however, cost is generally not an issue for the higher socioeconomic groups in the developing world or in industrialized countries. In general, because of the epidemic of overweight in North America and Europe, and the double burden of undernutrition and overweight in low- and middle-income countries, high-energy snack foods such as carbonated beverages and confectionery are not considered suitable as targeted fortification vehicles.

    Food fortification is governed by national regulations whether it is mandatory or voluntary. Government legislation mandates many national mass fortification programs, describing the food vehicle and the nature and level of the micronutrients to be added. National fortification policies may also provide guidance on when it is appropriate to add nutrients to foods (e.g., restoration; correcting dietary insufficiency; avoiding nutritional inferiority; and maintaining a balanced nutrient profile in a food such as a meal replacement). Monitoring and enforcement of the regulations is not always strong in developing countries; however, mandatory mass fortification programs usually have a better chance of success. Government regulations and international recommendations also exist for voluntary fortification and allow the food industry to add micronutrients to foods as long as they conform to specifications. Market-driven fortification is always voluntary, whereas targeted fortification can be either mandatory or voluntary.

    1.4 Selection of Vehicles

    When a country or region is ready to implement food fortification, the process begins by identifying the commonly eaten foods that can act as vehicles for one or more micronutrients. To better define the fortification level, WHO (WHO/FAO, 2006) recommends dietary surveys to define micronutrient intake and consumption of potential food vehicles in these different population groups. Using these guidelines, fortification programs provide meaningful levels of the micronutrients (e.g., 30%–50% of the daily adult requirements) at average consumption of one or more food vehicles. The levels also need to take into account variations in food consumption so that the safety of those at the higher end of the scale and impact for those at the lower end are ensured. They should also consider prorated intakes by young children to ensure efficacious and safe dosages. Cost, bioavailability, sensory acceptability, and storage stability are some of the criteria that determine the best match between the nutrient and food vehicle.
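
    The back-of-the-envelope Python sketch below illustrates this logic: given an average daily consumption of a vehicle and a target contribution within the 30%–50% band mentioned above, it derives an addition rate for the premix and then checks the resulting intake of a high consumer against an upper limit. The EAR, upper limit, and consumption figures used are hypothetical placeholders, not the WHO values or formulas.

# Sketch of turning a target nutrient contribution into a vehicle addition
# rate, following the general logic described above (not the WHO formulas).
# The EAR, UL, and consumption figures are hypothetical placeholders.

EAR_MG = 8.0   # hypothetical adult Estimated Average Requirement, mg/day
UL_MG = 40.0   # hypothetical tolerable upper intake level, mg/day

def addition_rate_mg_per_kg(target_fraction, mean_intake_g_per_day):
    """mg of nutrient to add per kg of vehicle so that an average consumer
    receives target_fraction of the EAR from the fortified vehicle."""
    target_mg_per_day = EAR_MG * target_fraction
    return target_mg_per_day / (mean_intake_g_per_day / 1000.0)

def intake_mg_per_day(rate_mg_per_kg, intake_g_per_day):
    """Daily nutrient intake from the vehicle at a given consumption level."""
    return rate_mg_per_kg * intake_g_per_day / 1000.0

if __name__ == "__main__":
    rate = addition_rate_mg_per_kg(target_fraction=0.4,        # within the 30%-50% band
                                   mean_intake_g_per_day=200)  # hypothetical flour intake
    high = intake_mg_per_day(rate, intake_g_per_day=400)       # a high consumer
    print(f"Addition rate: {rate:.1f} mg per kg of flour")
    print(f"High-consumer intake: {high:.1f} mg/day "
          f"({'below' if high < UL_MG else 'at or above'} the hypothetical UL of {UL_MG} mg)")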

    Common food vehicles that can be fortified include wheat and wheat products, maize, rice, milk and milk products, cooking oils, salt, sugar, and condiments. As processed foods such as breakfast cereals and chocolate drink powders gain popularity and market reach in low- and middle-income countries, they offer new channels for micronutrient delivery. If potential food vehicles are represented as a pyramid, staple foods are at the base of the pyramid, as they are cost-effective to fortify on a mass scale. Basic foods, such as breads and biscuits, packaged cereals and flours, and dairy products, are in the middle; and market-driven fortified foods such as convenience and ready-to-eat foods are at the top. Condiments such as salt, sugar, fish and soy sauce, and bouillon cubes fit at different levels of the pyramid, depending on the relative cost increase that fortification adds to what is originally an inexpensive foodstuff.

    Fortifying less expensive staple foods at the base of the pyramid results in broader dissemination of micronutrients throughout the population, particularly to the poor. Also, fortifying foods at the base of the pyramid has a better chance of fortifying products through the other tiers of the pyramid because staple foods are generally used to produce basic and value-added foods.

    Each food vehicle offers specific opportunities and constraints:

    Cereals: Staples such as rice, corn, and wheat that are milled at centralized locations have the potential to reach large populations and are used in several countries as vehicles for multiple nutrients. Staple cereals milled at the community level pose a challenge because of quality and safety constraints. For fortifying whole grain cereals such as rice, there is now a technology to extrude a simulated rice grain premix.

    Fats and oils: Cooking fats and oils offer an option to deliver fat soluble vitamins such as vitamin A and D. While they have an advantage in that they are often centrally refined and packed, there is still the challenge of a large proportion being sold in an unbranded form. Packaging in opaque containers is critical to protect the vitamins from degradation.

    Condiments: Salt, sugar, spices, and sauces are attractive carriers. Some are processed centrally and consumed in regular quantities and offer great potential. Recent studies show the promise for salt double fortified with iron and iodine.

    Dairy products: In areas where milk is processed in dairies, it may offer an option for fortification with both vitamins and minerals.

    Market-driven fortified foods: Given the global demographic shifts from rural to urban areas, a larger proportion of the population can now be reached via commercially processed foods and value-added products. However, the most vulnerable populations consume these higher priced products only sporadically.

    A multifaceted approach of fortifying more than one food vehicle is a good strategy, especially when a universally consumed vehicle is not available. When multiple foods are fortified, each with a portion of the estimated average requirements per single serving, the possibility of consuming unsafe levels of a micronutrient through excess consumption of a single food becomes more remote.
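
    A minimal Python sketch of this risk argument follows: it splits the same fortification target across one, two, or three vehicles and compares total intake with a tolerable upper intake level when a consumer eats twice the average amount of a single vehicle. The target, upper limit, and consumption pattern are hypothetical placeholders chosen only to make the point visible.

# Sketch: splitting a fortification target across several vehicles reduces the
# risk that over-consuming any one food pushes intake above the upper limit.
# TARGET_MG and UL_MG are hypothetical placeholders, not recommended levels.

TARGET_MG = 8.0   # daily amount delivered through fortification, split evenly
UL_MG = 12.0      # hypothetical tolerable upper intake level

def intake_when_one_vehicle_doubled(n_vehicles):
    """Total daily intake if the consumer eats twice the average amount of
    exactly one of the n vehicles that share the fortification target."""
    per_vehicle = TARGET_MG / n_vehicles
    return TARGET_MG + per_vehicle  # the doubled vehicle contributes one extra share

for n in (1, 2, 3):
    total = intake_when_one_vehicle_doubled(n)
    status = "within the UL" if total <= UL_MG else "exceeds the UL"
    print(f"{n} fortified vehicle(s): {total:.1f} mg/day ({status})")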

    1.5 Biofortification

    Systematic planning and research over the past two decades suggest that biofortification, the process by which the nutritional quality of food crops is improved through agronomic practices, conventional plant breeding, or modern biotechnology, can raise essential nutrient content and offer a long-term solution to improving intakes and preventing micronutrient deficiencies. Biofortification differs from conventional fortification in that it aims to increase nutrient levels in crops during plant growth rather than through manual means during postharvest processing of the crops. Key concepts underlying the rationale for staple food biofortification are the achievement of sustained nutrient enrichment of local staple crops, a potential for improved crop resilience, productivity, and agronomic value, and a structure for introduction into the community aimed at reaching the rural poor. Biofortification may therefore present a way to reach populations where supplementation and conventional fortification activities may be difficult to implement or may be limited (Bouis et al., 2011). Examples of biofortification projects include: iron-biofortification of rice, beans, sweet potato, cassava, and legumes; zinc-biofortification of wheat, rice, beans, sweet potato, and maize; and provitamin A carotenoid-biofortification of sweet potato, maize, and cassava. A novel characteristic of biofortification may be its permanence in nutrient enrichment: once a nutrient-enriched staple crop has been bred, adapted, and grown in a region, the nutrient increment is, without continued plant breeding innovations, perpetual.

    1.6 Current Situation, Issues and Challenges

    Notwithstanding the considerable progress in food fortification over the past decades, there are major challenges to ensure that undernourished people especially in low- and middle-income countries receive meaningful amounts of micronutrients through improved access to fortified foods. The following section discusses the current situation with respect to fortification programs designed to prevent specific micronutrient deficiencies, and highlights some remaining issues and challenges.

    Vitamin A: Guatemala’s sugar fortification program has virtually eliminated vitamin A deficiency, and big reductions in vitamin A deficiency have also been reported in El Salvador and Honduras, where fortification was combined with supplementation. Similar approaches in Zambia beginning in 1998 demonstrated success in urban areas. Since the poorer segments of the population in Africa and Asia do not consume as much sugar as in Latin America, countries such as Nigeria, Morocco, Yemen, Bangladesh, and Pakistan are implementing national programs to fortify cooking oils with vitamin A. Because of the high efficacy of vitamin A fortification, safety is a concern, and care must be taken not to overfortify.

    Iodine: The most successful global fortification experience has been the fortification of salt with iodine. Adding iodine to salt is a simple manufacturing process costing no more than 4 cents per person annually. A significant proportion of the populations in more than 120 countries have access to iodized salt. As of 2015, nearly 76% of salt consumed in the world is being iodized, protecting nearly 80 million newborns each year from the threat of mental impairment caused by iodine deficiency (UNICEF State of the World’s Children, 2015). Successful salt iodization has reduced the incidence of goiter and cretinism, prevented mental retardation and subclinical iodine deficiency disorders, and contributed to improved national productivity. Building on the success with iodization, double fortification of salt with iodine and iron is gaining ground and can be integrated with established iodization processes. Double fortified salt is currently being produced in India and has the potential to be distributed through commercial channels and public programs to reach economically weaker sections of the population in many countries.

    Nevertheless, despite the relative success of salt iodization, there are population groups in many countries still without access to iodized salt. These groups are often those most vulnerable and are in the greatest need of protection against iodine deficiency. While the relatively easier task of getting compliance with iodine fortification guidelines from the large- and medium-scale salt industry units has been achieved, compliance by small- and some medium-scale salt producers continues to pose challenges. Thus, the strategies used to achieve 70% coverage of iodized salt globally will not necessarily result in addressing the challenge for the remaining 30% of the population. The time needed for a fortification intervention to become effective in low- and middle-income countries is likely to be much longer than in developed countries because in the former, such vehicles as salt are often processed in a large number of widely-dispersed cottage-scale industries that are less professionally managed.

    Iron: The global prevalence of iron deficiency is high, but, unlike with iodine and vitamin A, it has been much more difficult to demonstrate conclusively that national iron fortification programs have increased iron status and improved health. One difficulty in demonstrating the impact of iron fortification has been that iron deficiency does not lead to an easily identifiable deficiency disease that can be eradicated in the same way as goiter or xerophthalmia. Iron deficiency leads to retarded brain development, poor pregnancy outcome, decreased work performance, and anemia, all of which have multiple etiologies.

    Another difficulty has been the choice of the iron fortification compound and the definition of the iron fortification level. A major problem has been that the more bioavailable, soluble iron compounds often cause color and flavor changes in some food vehicles, whereas the organoleptically acceptable, more insoluble compounds are less well absorbed. Another challenge is that cereal flours, the major iron fortification vehicle, are high in phytate, a potent inhibitor of iron absorption. Solutions have been found in recent years as a result of iron absorption studies in women and children and long-term efficacy studies that have identified alternative iron compounds and have devised ways of overcoming the inhibitory effects of phytic acid. A recent systematic review of 60 efficacy trials concluded that consumption of iron-fortified foods results in an improvement in hemoglobin, serum ferritin, and iron nutrition (Gera et al., 2012). Additionally, Costa Rica recently demonstrated clearly that a national program fortifying milk powder and maize flour with iron markedly decreased anemia prevalence in women and children (Martorell et al., 2015). Unfortunately, the use of anemia prevalence to monitor iron fortification programs can also be problematic if the observed anemia has other etiologies in addition to iron deficiency. Hookworm, malaria, hemoglobinopathies, and inflammatory disorders are major causes of anemia in many African and Asian countries, and they may overlap with iron deficiency. Clearly, other interventions in addition to iron fortification are necessary to decrease anemia prevalence in these countries.

    Folic acid: NTDs occur when the neural tube fails to close early in pregnancy, resulting in spina bifida and anencephaly. In 1991, it was reported that supplemental folic acid reduced the recurrence of NTDs in women with a previous history of an NTD pregnancy. To prevent NTDs, folate must be consumed before conception. The situation is complicated, however, by some women having genetic polymorphisms in folate metabolism, resulting in higher folate requirements than the general female population. Flour was fortified with folic acid in the United States in 1998, and by 2014 some 75 countries were likewise fortifying wheat, maize, or rice with folic acid to reduce the risk of folic acid-preventable spina bifida and anencephaly. Market-driven foods fortified with folic acid are also common. Mandatory folic acid fortification of flour has been described as the most important science-driven nutrition and public health intervention in decades. Folic acid status has markedly improved in many countries, and NTDs have decreased dramatically, by 19%–32% in the United States and from 19% to 55% in a range of other countries. Bell and Oakley (2009) estimated that 27% of the world’s population has access to folic acid-fortified flour, but that only 10% of preventable birth defects are currently prevented, due mainly to poor coverage in low- and middle-income countries.

    There are, however, some issues. Folic acid fortification is targeted at young women with an increased requirement for folate, not at a general population with low folate intakes or with a reported low folate status. In Canada, for example, which has fortified flour with folic acid, <1% of Canadians were reported to be folate deficient and 40% had high red cell folate concentrations. The blood folate concentration needed to achieve a maximum reduction in folate-sensitive NTDs, however, is unknown, although it is considered to be much higher than the levels set for folate deficiency. There are potential adverse health effects of high folate intakes, and concern has focused on the possibility that increased folic acid could mask the anemia caused by vitamin B12 deficiency, resulting in neurological damage and a higher risk of memory impairment. The potential for folic acid fortification to increase colorectal cancer has also been raised, although after almost 20 years of folic acid fortification in the United States no evidence has emerged to support this possibility. Recent focus has shifted to the possibility that folic acid fortification, because of its role in methylation reactions, might lead to changes in epigenetic patterns and might explain different health outcomes among individuals with similar genetic backgrounds.

    Vitamin D: Synthesis in the skin is the primary source of vitamin D; however, many people, particularly the elderly and those at northerly latitudes, rely on dietary vitamin D to maintain an adequate status. Vitamin D, however, is not widespread in foods and is found naturally at low concentrations in only a few foods. Vitamin D intakes in Europe, North America, and many other countries are far below dietary reference intakes established assuming minimal sun exposure, and vitamin D deficiency is currently reported in many parts of the world. One complication is the disagreement on the reference ranges for 25-hydroxyvitamin D that represent adequate vitamin D status. Lower cut-off values have been recommended for the prevention of rickets and osteomalacia, whilst much higher cut-off values have been proposed for the prevention of falls and fractures in the elderly. The choice of vehicles is also problematic, and even in populations where fluid milk or margarine are fortified voluntarily or by mandate, much of the population still consumes less vitamin D than is recommended, as milk is now less widely consumed. Alternative, or perhaps multiple, food vehicles are required for vitamin D fortification. In the United States, in addition to milk, yoghurt, butter, margarine, cheese, orange juice, and bread have been voluntarily fortified, and wheat flour has been suggested as another option.

    1.7 Concluding Thoughts

    The sound science base that has resulted from much research in recent years, and the vast experience with fortification programs in industrialized countries, means that food fortification is ready for globalization, and ready to target those micronutrient deficiencies highly prevalent in low- and middle-income countries, as well as those still not eradicated in the industrialized world.

    We should proceed with care however, for while the addition of micronutrients to foods can help maintain and improve the nutritional quality of diets, indiscriminate fortification of foods could lead to overfortification or underfortification of micronutrients, and could cause a nutrient imbalance in the diet. Any changes in food fortification policy for micronutrients must therefore be considered within the context of the impact the changes will have on all segments of the population, and whether policy changes need technology changes or influence safety considerations (Dwyer et al., 2014).

    In addition to these programmatic challenges, there are differences in perceptions concerning fortification. While it is well established that food fortification has a positive impact on a population’s health and well-being that by far outweighs any potential risk, historically there has been public opposition in some countries to the addition of a foreign substance to food or water. Opponents of fortification argue that nutritional education with respect to a well-balanced diet is a more logical approach than fortification. At the other end of the scale, the nutritional supplement and vitamin industry promotes the view that it is better for people to consume multivitamin supplements. Other objections include the potential risk for negative health outcomes.

    It is important to understand these different viewpoints, but equally important to move forward in a responsible way with what is most beneficial to the largest numbers of people whose lives would otherwise be compromised without the essential vitamins and minerals in their diet. What is needed is a balanced approach. Together with food fortification programs, public health interventions should focus on the elimination of other underlying causes of micronutrient deficiencies. In the developing world, for example, these could include improvements in sanitation that would decrease hookworm infection and improve iron status through reduced blood loss; vaccination against measles, which is more severe in children whose immunity is compromised by vitamin A deficiency; and birth control, which should improve the standard of living of a family and result in a better quality diet.

    References

    1. Barkley JS, Wheeler KS, Pachón H. Anaemia prevalence may be reduced among countries that fortify flour. Br J Nutr. 2015;114(2):265–273.

    2. Bell KN, Oakley GP. Update on prevention of folic acid-preventable spina bifida and anencephaly. Birth Defects Res Part A. 2009;85(1):102–107.

    3. Bouis HE, Hotz C, McClafferty B. Biofortification: a new tool to reduce micronutrient malnutrition. Food Nutr Bull. 2011;32(Suppl. 1):S31–S40.

    4. Codex Alimentarius, 1987. General principles for the addition of essential nutrients to foods. Available at www.codexalimentarius.org.

    5. Copenhagen Consensus. Expert Panel Findings. Copenhagen, Denmark: Copenhagen Consensus Center; 2012.

    6. Dwyer JT, Woteki C, Bailey R, et al. Fortification: new findings and implications. Nutr Rev. 2014;72(2):127–141.

    7. Gera T, Sachdev HS, Boy E. Effect of iron-fortified foods on hematologic and biological outcomes: systematic review of randomized controlled trials. Am J Clin Nutr. 2012;96:309–324.

    8. Martorell R, Ascencio M, Tacsan L, et al. Effectiveness evaluation of the food fortification program of Costa Rica: impact on anemia prevalence and hemoglobin concentrations in women and children. Am J Clin Nutr. 2015;101(1):210–217.

    9. Maternal and Child Nutrition Study Group, Black RE, et al. Maternal and child nutrition: building momentum for impact. Lancet. 2013;382:372–375.

    10. Moench-Pfanner R, Laillou A, Berger J. Large-scale fortification, an important nutrition-specific intervention. Food Nutr Bull. 2012;33.

    11. Semba RD. The historical evolution of thought regarding multiple micronutrient nutrition. J Nutr. 2012;142:S143–S156.

    12. The #FutureFortified Global Summit on Food Fortification. Event Proceedings and Recommendations for Food Fortification Programs. Sight & Life Magazine Supplement. 6 July 2016.

    13. UNICEF. The State of the World’s Children 2015. New York: UNICEF; 2015.

    14. WHO, FAO, UNICEF, GAIN, MI, & FFI. Recommendations on Wheat and Maize Flour Fortification. Meeting Report: Interim Consensus Statement. Geneva: World Health Organization; 2009.

    15. WHO/FAO. Guidelines on Food Fortification with Micronutrients. Geneva: World Health Organization; 2006.

    16. World Bank. Enriching Lives: Overcoming Vitamin and Mineral Malnutrition in Developing Countries. Development in Practice series. Washington, DC: World Bank; 1994.

    Chapter 2

    Prevalence, Causes, and Consequences of Micronutrient Deficiencies. The Gap Between Need and Action

    Ian Darnton-Hill¹,²,    ¹University of Sydney, Sydney, NSW, Australia,    ²Tufts University, Boston, MA, United States

    Abstract

    The global gap between micronutrient deficiencies and effective measures to address them remains large with common estimates of around 1.6–2 billion people affected. Women, young children, and adolescents are particularly at risk, especially in disadvantaged settings, and increasingly the elderly. The causes include poverty and resulting poor diets, concomitant infectious diseases with increased requirements and losses, limited health services capacity and resources, and sociocultural practices. The consequences are poorer health outcomes, impaired growth and development, and reduced economic well-being. The actual figures of the gaps are not known with accuracy, partly because of insufficient data in recent years, a lack of national and subnational surveys, and often inadequate biomarkers, meaning that much of the current data are incomplete or out-of-date. There are fewer national surveys being conducted, e.g., for vitamin A deficiency, than a decade ago.

    Rapid survey methodologies for assessing disaggregated national deficiency levels, as well as coverage of fortification (and of other interventions such as dietary diversity and supplementation), are urgently needed to provide evidence of effective coverage and impact of interventions such as food fortification among diverse geographic and income groups. In the meantime, existing figures for prevalence, causes, and consequences are given together with indicative information on the gap between the need and public health action.

    Keywords

    Micronutrients; vitamins and minerals; deficiencies; global prevalence of micronutrient deficiencies; population gaps in requirements; health and economic consequences; prevention and management strategies

    Chapter Outline

    2.1 Introduction

    2.2 The Gap in Micronutrient Intakes at Population Level and the Resultant Deficiency Outcomes Being Addressed

    2.2.1 Iron Deficiency and Anemia

    2.2.2 Iodine

    2.2.3 Folate and Neural Tube Defects (NTDs)

    2.2.4 Vitamin A Deficiency

    2.2.5 Zinc

    2.2.6 Other Micronutrients

    2.3 Conclusions

    References

    Further Reading

    This chapter draws on a much longer overview of food fortification prepared for the Micronutrient Forum (available on their website, currently maintained by the Micronutrient Initiative, www.mnf.org).

    2.1 Introduction

    Deficiencies of micronutrients (vitamins and minerals/trace elements), and the resulting negative consequences of such deficiencies, continue to be very significant public health problems in much of the world (WHO, 2017; Black et al., 2013; Darnton-Hill et al., 2017). Women and young children in low- and middle-income country (LMIC) populations (WHO, 2017; Allen, 2005) and female adolescents (Thurnham, 2013) are especially at risk (Darnton-Hill et al., 2017; Bailey et al., 2015). Micronutrient malnutrition has widespread and important consequences for both national health and economic well-being (Horton et al., 2008; The World Bank et al., 1994; Bhutta et al., 2013), with a small but important contribution to the total global burden of disease (Bhutta and Haider, 2009; Darnton-Hill, 2012). The World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations (FAO) have identified four main strategies for addressing micronutrient malnutrition: nutrition education leading to increased diversity and quality of diets; food fortification; supplementation; and disease control measures (WHO/FAO et al., 2006). It is now also widely recognized that without parallel changes in socioeconomic and sociocultural norms, these strategies are unlikely to be fully effective or sustained (FAO/WHO, 2014). An increasingly important part of the overall strategy is the large-scale fortification of staple foods regularly eaten in diets consumed around the world (Darnton-Hill et al., 2017; European Commission, 2017).

    2.2 The Gap in Micronutrient Intakes at Population Level and the Resultant Deficiency Outcomes Being Addressed

    Over 1.6 to 2 billion people globally are estimated to be at risk of micronutrient deficiencies such as anemia (WHO/FAO et al., 2006; European Commission, 2017; McLean et al., 2009; Stevens et al., 2013). A systematic review of all studies published between 1988 and 2008 that reported on micronutrient intakes of women in resource-poor settings found that over half of the studies reported mean/median intakes of all the micronutrients measured as below recommended intakes (vitamins A and C and niacin had especially low intakes at 29%, 34%, and 34% of Estimated Average Requirement (EAR), respectively) (Torheim et al., 2010). While regional differences were apparent, overall the review identified that women living in resource-poor settings of LMIC commonly have inadequate intakes of one or more micronutrients (Torheim et al., 2010), confirming earlier studies (Allen, 2005), particularly in pregnancy (Darnton-Hill and Mkparu, 2015). The deficiencies result in considerable social and economic costs, usually with a negative gender bias against females (Darnton-Hill et al., 2005).

    Globally, the gap can be represented by the estimated prevalence of deficiencies of each micronutrient (Fig. 2.1). The burden of the global figure of around two billion (WHO/FAO et al., 2006) is borne mostly by women, including adolescents, and children (Bailey et al., 2015). This leads, amongst other consequences, to the risk of less than optimal development in 40%–60% of children in the 6–24 month age group growing up in LMIC (Alderman and Horton, 2007) and contributes to over 600,000 stillbirths or neonatal deaths and over 100,000 maternal deaths during pregnancy (Rowe and Dodson, 2012). At the same time, some 18 million newborns are estimated to be born intellectually impaired as a result of maternal iodine deficiency (Iodine Global Network IGN, 2015a; Rohner et al., 2014). Insufficient intake of vitamin A results in approximately 350,000 cases of childhood blindness, with half of the affected children dying within 12 months of losing their sight, and in compromised immune systems leading to at least 157,000 early childhood deaths due to diarrhea, measles, malaria, and other infections each year (WHO, 2009; Palmer et al., 2017). It has been estimated that each year, 1.1 million children under the age of five die because of vitamin A and zinc deficiencies (Micronutrient Initiative et al., 2009). Due to maternal folate deficiency, over 300,000 children were estimated in 2006 to be born each year with severe birth defects (March of Dimes et al., 2006). Micronutrient deficiencies alone have been estimated to cause an annual GDP loss of 2%–5% in LMIC (Horton et al., 2008; Horton, 2006), with direct costs estimated at between US$20 and US$30 billion every year (Horton, 2006).
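
    To give those percentages a sense of scale, the short Python sketch below applies the 2%–5% annual GDP-loss range quoted above to a hypothetical LMIC economy; the US$200 billion GDP figure is an illustrative placeholder, not a figure from the chapter.

# Illustrative only: what a 2%-5% annual GDP loss (the range cited above)
# would mean for a hypothetical LMIC with a GDP of US$200 billion.

gdp_billion_usd = 200.0              # hypothetical national GDP
for loss_fraction in (0.02, 0.05):   # the 2%-5% range quoted in the text
    loss = gdp_billion_usd * loss_fraction
    print(f"{loss_fraction:.0%} of GDP -> about US${loss:.0f} billion lost per year")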

    Figure 2.1 Magnitude of prevalence of micronutrient deficiencies worldwide (note: prevalence of low urinary iodine is based on a single spot urine sample) (Muthayya et al., 2013).

    Other outcomes of the relatively poorer diets and compromised well-being and health of women and young children in many LMIC are the substantially higher rates of maternal mortality, stillbirth, and neonatal mortality in the lowest compared to the highest income countries; 98% or more of these adverse outcomes occur in low-income countries (Barros et al., 2015; Goldenberg and McClure, 2012). Within countries, the costs of micronutrient malnutrition differ by the socioeconomic status of subpopulations; in the Philippines, for example, costs attributed to micronutrient deficiencies in the poorest third of households were estimated to be five times higher than in the wealthiest third (Wieser et al., 2013). Such disparities add increased financial burdens to often already-overloaded and underresourced health systems (WHO, 2010). While the reasons for disparities are not always known, they at least partly relate to differences in access to health care and resources and to behavioral factors such as poor care-seeking behaviors with respect to both health care and specific interventions (Boerma et al., 2008). Consequently, interventions like fortification, which generally require less active health- and nutrition-seeking behaviors and/or increase the availability of or access to improved dietary intakes, could be expected to have an important impact (Darnton-Hill et al., 2017).

    2.2.1 Iron Deficiency and Anemia

    Anemia is the most common and widespread nutritional disorder in the world, affecting over 1.62 billion people in both affluent countries and LMIC (Pasricha et al., 2013; Branca et al., 2014). Iron deficiency occurs when physiological demands are not met due to inadequate intake, absorption, or utilization, or to excessive iron losses (Pena-Rosas et al., 2015), and has negative impacts even before developing into actual iron deficiency anemia. While iron deficiency is thought to be the most common cause of anemia globally (Petry et al., 2016), other nutritional deficiencies (particularly folate, vitamin B12, vitamin A, and copper); parasitic infections (including malaria, helminths such as hookworm, and schistosomes); chronic infection-associated inflammation, including HIV; and genetic disorders, such as hemoglobinopathies like sickle cell disease, can all cause anemia (WHO and FAO, 2004).

    Recently estimated global anemia rates are 29% (496 million) of nonpregnant women, 38% of pregnant women (32 million), and 43% of young children under five years (273 million), but the ranges vary enormously by socioeconomic and geographical location (Stevens et al., 2013). As most of those affected are women of reproductive age or young children in LMIC (Stevens et al., 2013; WHO, 2015), every second pregnant woman and about 40% of preschool children in LMIC are anemic. Rates for children under 5 years of age go as high as 70%, 74%, and 80% in South Asia, East Africa, and Central and West Africa respectively, compared with 11% in high-income regions (Stevens et al., 2013). The corresponding figures for pregnant women are 23% in high-income countries compared with 53%, 46%, and 61% in South Asia, East Africa, and Central and West Africa. In Latin America and the Caribbean, prevalence rates of anemia among children under 6 years of age ranged from 4.0% in Chile to a severe public health problem of over 40% in Bolivia, Guatemala, and Haiti, and for women of childbearing age from 5.1% in Chile to the highest rates in Panama (40%) and Haiti (45.5%) (Martorell et al., 2015).
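
    As a simple consistency check on these headline numbers, the Python sketch below back-calculates the group sizes implied by each prevalence and case count quoted above (Stevens et al., 2013); the arithmetic is illustrative only, and the rounded outputs should not be read as official population estimates.

# Back-of-the-envelope check: the group sizes implied by the prevalence
# figures and anemia case counts quoted above (Stevens et al., 2013).

GROUPS = {
    # group: (anemia prevalence, anemic individuals in millions)
    "nonpregnant women":   (0.29, 496),
    "pregnant women":      (0.38, 32),
    "children under five": (0.43, 273),
}

for group, (prevalence, cases_millions) in GROUPS.items():
    implied_group_size = cases_millions / prevalence
    print(f"{group}: ~{implied_group_size:,.0f} million people implied in the group")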

    It has been estimated that, on average, 50% of anemia in women is due to iron deficiency, rising to 60% for pregnant women, with about 42% in children (Stevens et al., 2013). The proportion directly attributable to iron deficiency is very geographically variable, and a recent review suggests there is large heterogeneity between countries, so the proportion may be nearer to 25% for children and 37% for nonpregnant women of reproductive age (WRA) in many LMIC (Petry et al., 2016; Zimmermann and Hurrell, 2007). Estimates of anemia prevalence derived from hemoglobin concentration measurements alone do not allow properly for the contribution of iron deficiency or the role of other causes of anemia (Petry et al., 2016). Currently available iron indicators are more difficult to interpret in populations in LMIC due to this multifactorial etiology of anemia (Lynch, 2012). Current estimates, using hemoglobin levels, are nevertheless shown in Fig. 2.2 as they reflect the severity and geographic extent of the problem, even
