Update on Nutritional Recommendations for the Pediatric Patient


Frank R. Greer, MD


Wisconsin Perinatal Center, Meriter Hospital, University of Wisconsin School of Medicine and Public Health, 202 S Park Street, Madison, WI 53715, USA

E-mail address: frgreer@pediatrics.wisc.edu

In the last several years, a number of reports have provided new nutritional recommendations for infants and children. These reports include new evidence on the prevention of atopic disease, suggesting that the established practice of restricting the introduction of complementary foods to infants may actually promote atopic disease [1]. There are new recommendations for iron supplementation of infants and toddlers that point out the importance of introducing iron-rich complementary foods much earlier in the diet, particularly for exclusively breastfed infants [2]. There are also new recommendations for increased vitamin D supplementation for infants, children, and adolescents [3,4]. In addition, there are new nutritional guidelines for children at risk for overweight and hypercholesterolemia [5], and a new American Academy of Pediatrics (AAP) statement questions the benefits of adding probiotics and prebiotics to the diet of healthy children [6]. These new recommendations are reviewed in this article for pediatric health care providers.

Prevention of atopic disease with nutritional interventions early in life

In the past 30 years, the incidence of atopic diseases such as asthma, atopic dermatitis, and food allergy has increased dramatically [1]. The incidence of peanut allergy has increased threefold, and the incidence of asthma has increased 160%. It has long been believed that diet in early childhood, including the diet of the mother during pregnancy and lactation, plays an important role in the development of atopic disease. The AAP has previously recommended that lactating mothers of infants at high risk for developing allergy avoid peanuts, cow milk, and fish in their diets while lactating [7].

However, studies have not supported a protective effect of a maternal exclusion diet during pregnancy or lactation on the development of atopic disease in infants [1]. Even though dietary food allergens can be detected in breast milk, it cannot be concluded from published studies to date that the exclusion of eggs, cow milk, fish, or peanuts from the diet of lactating women prevents atopic disease in the infant [1].

Breastfeeding has long been believed to prevent atopic disease in children. Although there is evidence that exclusive breastfeeding protects against wheezing in early life (generally believed to be related to infection and airway size), evidence that exclusive breastfeeding prevents asthma in children older than 6 years is unconvincing [1]. There is evidence that mothers with asthma who exclusively breastfeed increase the risk of asthma in their children [1]. For breastfeeding, the most convincing evidence to date is for the prevention of atopic dermatitis. For infants at high risk of developing atopic disease (parent or sibling with allergic disease) there is evidence that exclusive breastfeeding for at least 4 months, compared with feeding intact cow milk protein formula, decreases the incidence of atopic dermatitis and cow milk allergy during the first 2 years of life. However, there is not enough evidence to conclude that exclusive breastfeeding for at least 4 months prevents the development of food allergy (discussed later) [1].

The role of special infant formulas in preventing atopic disease has also been reviewed by the AAP and other groups [1]. There is no convincing evidence that the use of soy-based formula prevents allergy [1,8], and the use of amino acid–based formulas for the prevention of atopic disease has not been studied [1].

The role of hydrolyzed formulas is less clear, and most of the studies have been done in infants at high risk of developing allergy. No studies have compared completely or partially hydrolyzed formula with exclusive breastfeeding, so there is no evidence that the use of these formulas is any better than human milk in the prevention of atopic disease [1]. One of the largest studies, the German Infant Nutritional Intervention Program, showed the complexity of comparing various hydrolysate formulas [9]. This study included 945 infants who were at high risk for allergy and who were initially breastfed. The study compared a cow milk formula with 1 of 3 other formulas: a partially hydrolyzed whey-based formula, an extensively hydrolyzed whey-based formula, and an extensively hydrolyzed casein-based formula. These formulas were started with weaning or when breast milk was not available. This study showed that different hydrolysates have different effects on atopic disease in infants who were initially breastfed and at high risk for the development of atopic disease. The best results for prevention of atopic disease (atopic dermatitis, urticaria, food allergy) were found for the extensively hydrolyzed casein-based formula compared with cow milk formula (odds ratio [OR] 0.51; 95% CI 0.28–0.92; P<.025). Lesser effects were seen with the partially hydrolyzed whey-based formula (OR 0.65; 95% CI 0.38–1.1) and the extensively hydrolyzed whey-based formula (OR 0.86; 95% CI 0.52–1.4), compared with cow milk formula [9]. Why the extensively hydrolyzed whey-based formula was less effective than the partially hydrolyzed whey-based formula was not answered in this study. Follow-up of these infants has shown that the preventive effect of hydrolyzed infant formulas persists, to a lesser degree, until 6 years of age [10]. However, the higher cost of the hydrolyzed formulas must be considered in any decision-making process for their use. The most convincing data supporting their use are only for children at high risk for atopic disease who were initially receiving breast milk.

Although there have been many studies of the duration of exclusive breastfeeding and its effect on atopic disease, fewer studies have examined the timing of the introduction of complementary foods as an independent risk factor in breastfed or formula-fed infants. Most groups have recommended exclusive breastfeeding for between 4 and 6 months. However, there is no current convincing evidence that delaying the introduction of complementary food beyond this period has a significant protective effect on the development of atopic disease, regardless of whether infants are fed cow milk protein formula or human milk [1]. In contrast, there is evidence that the early introduction of some complementary foods between 4 and 6 months may prevent food allergy [1,11], including foods that are highly allergenic, such as wheat/gluten, fish, eggs, and foods containing peanut protein [1,11–19]. For instance, introducing cereal grains before 6 months of age (compared with later introduction) has been shown to be protective against the development of wheat-specific immunoglobulin E [14,15]. For gluten, there is similar evidence that introduction between 4 and 6 months of age prevents celiac disease [11,17]. There is also evidence that introducing highly allergenic foods while still breastfeeding may be protective against food allergy. Thus, the risk of celiac disease in children less than 2 years of age is reduced if they are still breastfeeding at the time of introduction of dietary gluten (OR 0.59; 95% CI 0.42–0.83), and is reduced even further by continuing to breastfeed after introducing gluten (OR 0.36; 95% CI 0.26–0.51) [11]. Similarly, the timing of egg introduction in the first year of life affects allergic sensitization to egg [12,13]. Compared with introduction at 4 to 6 months, introducing egg at 7 to 9 months increases the risk for egg allergy (OR 1.4; 95% CI 1.2–3.0), and introducing egg at 10 to 12 months increases the risk even more (OR 6.5; 95% CI 3.6–11.6) [13].

A major concern remains the effect of early introduction of peanut-containing products on the development of peanut allergy [16,18,19]. A recent epidemiologic study found that the prevalence of peanut allergy was tenfold lower among Jewish children living in Israel, where peanut butter is used in the weaning food Bamba, than among London-based Jewish children with much less exposure to peanut-containing products [18]. At present, there are 2 ongoing studies looking at the effect of early peanut exposure on the development of peanut allergy at 3 to 6 years of age, and their results are eagerly awaited [16].


Iron

The AAP recently published its first official report on the diagnosis and prevention of iron deficiency (ID) and iron deficiency anemia (IDA) in infants and toddlers [2]. In the United States, the prevalence of IDA has declined, but IDA still accounts for about 20% of all anemias (hemoglobin [Hb] <11.0 g/dL) in young children 12 to 35 months of age, with a prevalence of 2.1% [2] (Table 1). However, the prevalence of ID in this age group is much higher: it affects 9.2% of children 12 to 35 months of age in the general US population and up to 15% of Mexican American children of the same age (see Table 1) [2].

Table 1 Iron deficiency, IDA, and anemia in NHANES 1999 to 2002, children aged 12 to 35 months

IDA represents the severe form of iron deficiency and is associated with irreversible neurodevelopmental delay [2]. Given the likelihood that ID without anemia also has effects on long-term neurodevelopment and behavior that may be irreversible, ID itself remains a serious concern [2]. Iron is prioritized to red blood cells over other body tissues, reflecting that oxygen transport by red cells is its most critical function. This prioritization of iron to red cells, even ahead of the brain, would account for the adverse neurodevelopmental effects of iron deficiency with or without anemia in the infant. Iron is essential for neuronal proliferation, myelination, energy metabolism, neurotransmission, and various enzyme systems in the central nervous system [20].

The iron supplied in breast milk (averaging 0.27 mg/d, assuming a breast milk intake of 0.78 L/d), coupled with the iron stores present at birth in the healthy term infant, meets the infant's iron needs only until the infant doubles its birth weight, and thus its blood volume, usually by 4 months of age. Between 4 and 6 months of age, there is a dramatic increase in the iron requirement, to 11 mg/d, for the remainder of the first year of life [2]. Thus, because of concerns about ID, the AAP has recommended that breastfed infants be supplemented with 1 mg/kg/d of oral iron beginning at 4 months of age and continuing until appropriate iron-containing complementary foods (including iron-fortified cereals) are introduced into the diet [2]. The best source of iron is the heme iron found in red meat. Because breast milk contains no heme iron, the early introduction of complementary foods containing meat is advised. It is acknowledged that this would require a significant paradigm shift in the way complementary foods are generally introduced into the diets of infants by pediatric health care providers in the United States. For formula-fed infants, the iron needs for the first 12 months of life can be met by a standard infant formula (iron content 10–12 mg/L) and the introduction of iron-rich complementary foods after 4 to 6 months of age [2]. Whole milk should not be used before 12 completed months of age.
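As a rough illustration of the arithmetic behind these figures, the sketch below derives the breast milk iron concentration implied by the two quoted values and applies the 1 mg/kg/d supplement dose; the 6 kg body weight used in the example is a hypothetical assumption and does not come from the AAP report.

```python
# Illustrative arithmetic only; values marked as assumptions are not from the AAP report.

BREAST_MILK_INTAKE_L_PER_DAY = 0.78       # intake assumed in the text
IRON_FROM_BREAST_MILK_MG_PER_DAY = 0.27   # average daily iron supply cited in the text

# Iron concentration implied by the two figures above (0.27 / 0.78 ~ 0.35 mg/L)
implied_concentration_mg_per_l = (
    IRON_FROM_BREAST_MILK_MG_PER_DAY / BREAST_MILK_INTAKE_L_PER_DAY
)

def breastfed_iron_supplement_mg_per_day(weight_kg: float) -> float:
    """Oral iron supplement for a breastfed infant from 4 months of age,
    at the recommended 1 mg/kg/d."""
    return 1.0 * weight_kg

if __name__ == "__main__":
    print(f"Implied breast milk iron concentration: {implied_concentration_mg_per_l:.2f} mg/L")
    # The 6 kg body weight is a hypothetical example, not a value from the text.
    print(f"Supplement for a hypothetical 6 kg infant: {breastfed_iron_supplement_mg_per_day(6.0):.0f} mg/d")
    print("Total iron requirement from 4 to 6 months onward (per the text): 11 mg/d")
```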

For toddlers 1 to 3 years of age, the AAP recommends an iron intake of 7 mg/d [2]. Again, this is best delivered by eating red meats, whole-grain cereals fortified with iron, vegetables that contain iron (eg, legumes), and fruits with vitamin C, which augments iron absorption. An alternative source of iron is liquid iron supplements up to 36 months of age, after which chewable multivitamins can be used.

Because 80% of iron stores are accumulated during the last trimester of pregnancy, preterm infants need an iron intake of 2 mg/kg/d through 12 months of age, which can be supplied by iron-fortified formulas [2,21]. If fed human milk, these infants should receive an iron supplement of 2 mg/kg/d by 1 month of age and this should be continued until the infant is weaned to iron-fortified formula or eats enough complementary foods to supply 2 mg/kg/d of iron. An exception to this practice would be the preterm infant who has received an iron load from multiple packed red blood cell transfusions.

The AAP has also recommended universal screening for IDA and ID at 12 months of age, with a determination of Hb concentration and an assessment of risk factors associated with ID/IDA [2]. These risk factors include low socioeconomic status (especially in infants of Mexican American descent), a history of prematurity or low birth weight, exposure to lead, exclusive breastfeeding beyond 4 months of age without supplemental iron, and weaning to whole milk or complementary foods that do not include iron-fortified cereals or foods naturally rich in iron. Other risk factors are feeding problems, poor growth, and the inadequate nutrition typically seen in infants with special health care needs. If, as a result of this screen, the Hb is less than 11.0 g/dL, further work-up is required to establish the cause of the anemia [2]. Regardless of the Hb, if there is a high risk of dietary ID, further testing should be performed, given the potential adverse effects of ID on neurodevelopmental outcomes.
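The screening decision described above reduces to a simple rule: any Hb below 11.0 g/dL triggers a work-up for the cause of anemia, and a high risk of dietary ID triggers further testing regardless of the Hb. The following minimal sketch illustrates that flow, assuming risk factors are recorded as a simple set of labels; the function and data representation are illustrative and are not part of the AAP statement.

```python
# Minimal sketch of the 12-month screening flow described above; the function
# and data representation are illustrative, not drawn from the AAP statement.

HB_ANEMIA_THRESHOLD_G_PER_DL = 11.0  # Hb below this value defines anemia at this age

RISK_FACTORS = (
    "low socioeconomic status",
    "prematurity or low birth weight",
    "lead exposure",
    "exclusive breastfeeding beyond 4 months without supplemental iron",
    "weaning to whole milk or complementary foods low in iron",
    "feeding problems, poor growth, or special health care needs",
)

def needs_further_testing(hb_g_per_dl: float, present_risk_factors: set) -> bool:
    """Return True when the screen calls for further work-up: Hb < 11.0 g/dL,
    or a high risk of dietary ID regardless of the Hb value."""
    anemic = hb_g_per_dl < HB_ANEMIA_THRESHOLD_G_PER_DL
    high_risk_of_dietary_id = len(present_risk_factors) > 0
    return anemic or high_risk_of_dietary_id

# Example: a 12-month-old with Hb 10.6 g/dL and no identified risk factors
# still needs further evaluation to establish the cause of the anemia.
print(needs_further_testing(10.6, set()))  # True
```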
