Figure 2 legend (box and whisker plots of hematological and biochemical markers in iron-deficient vs non–iron-deficient infants). Iron deficiency was defined as transferrin saturation of less than 10%. In each box and whisker plot, the lower and upper edges of the box indicate the 25th and 75th percentiles; the horizontal line inside the box indicates the median; and the whiskers indicate either the minimum and maximum values or a distance of 1.5 × the interquartile range from the edge of the box (whichever distance is smaller). Circles beyond the whiskers indicate extreme values (outliers) for each marker. P = .001 for reticulocyte hemoglobin content, hemoglobin, mean corpuscular volume, red blood cell distribution width, and mean corpuscular hemoglobin; P = .005 for zinc protoporphyrin; and P = .02 for ferritin.
Ullrich C, Wu A, Armsby C, et al. Screening Healthy Infants for Iron Deficiency Using Reticulocyte Hemoglobin
Content. JAMA. 2005;294(8):924–930. doi:10.1001/jama.294.8.924
Context Current clinical practice relies on hemoglobin to detect iron deficiency,
which misses infants not yet anemic and places them at higher risk for neurocognitive
impairment. Reticulocyte hemoglobin content (CHr) has never been compared
with hemoglobin for screening healthy infants.
Objectives To evaluate CHr for detecting iron deficiency without anemia in healthy
9- to 12-month-old infants and to compare CHr with hemoglobin in screening
for iron deficiency in this population. A secondary objective was to explore
the association between CHr and subsequent development of anemia.
Design, Setting, and Patients A prospective observational cohort study of 202 healthy 9- to 12-month-old
infants from an urban, hospital-based, primary care clinic in Boston, Mass,
who were screened for iron deficiency between June 2000 and April 2003, and
followed up for a median of 5.6 months.
Main Outcome Measures Iron deficiency (transferrin saturation <10%) and anemia (hemoglobin <11 g/dL).
Results Of 202 infants enrolled, 23 (11.4%) had iron deficiency and 6 (3%) had
iron deficiency and anemia. Iron-deficient and non–iron-deficient infants
had significantly different values for all measured hematological and biochemical
markers for iron deficiency. Optimal CHr cutoff for detecting iron deficiency
was 27.5 pg (sensitivity, 83% and specificity, 72%); a hemoglobin level of
less than 11 g/dL resulted in a sensitivity of 26% and a specificity of 95%.
Reticulocyte hemoglobin content was more accurate overall than hemoglobin
was for detecting iron deficiency (area under the receiver operating characteristic
curve, 0.85 vs 0.73; P = .007). A CHr of
less than 27.5 pg without anemia at initial screening was associated with
subsequent anemia when screened again in the second year of life (risk ratio,
9.1; 95% confidence interval, 1.04-78.9; P = .01).
Conclusions A CHr of less than 27.5 pg is a more accurate hematological indicator
of iron deficiency compared with hemoglobin of less than 11 g/dL in these
healthy 9- to 12-month-old infants. Further studies are warranted to determine
whether CHr should be the preferred screening tool in the early detection
of iron deficiency in infants.
Iron deficiency is the most common nutritional deficiency in the world.1 Insufficient dietary iron, variable absorption, and
rapid growth put infants at particular risk for this problem. The association between iron-deficiency anemia in infancy and impaired mental and motor development has been demonstrated previously.2 Iron deficiency without anemia is also associated
with adverse effects on neurocognitive development,3-6 and
evidence that these effects may be permanent is mounting.2,7-11 For
example, children who had chronic iron deficiency in infancy had poorer mental
and motor functioning when compared with their non–iron-deficient counterparts
more than 10 years later.8 Detection and treatment
of iron deficiency, before it progresses to anemia, may be crucial in the
prevention of neurocognitive impairments. However, the lack of a simple and
reliable screening tool to detect this condition has in part made iron deficiency
difficult to eradicate.
Iron deficiency is usually diagnosed using biochemical parameters (eg,
serum iron, ferritin, and transferrin saturation).12-16 However,
transferrin saturation and other biochemical tests are impractical for screening
in the ambulatory setting owing to biological variability, such as diurnal variation, fluctuation with dietary intake, and, because these analytes are acute phase reactants, alteration in inflammatory states. In addition, they are expensive and frequently not
available in the ambulatory setting. Zinc protoporphyrin is a biochemical
marker that has been used in the ambulatory setting to detect evolving iron
deficiency, but it has been shown to be a poor predictor for iron deficiency.17,18 Hematological tests are more widely
used to screen for iron deficiency. Hemoglobin is the most commonly used hematological
screening test, but it is derived from the entire population of red blood
cells, each with a lifespan of about 120 days, and therefore takes some time
to be altered by iron deficiency. Consequently, relying on hemoglobin for
screening will delay the detection of iron deficiency in infants who are not
yet anemic but for whom adverse neurological consequences may have already
begun to occur.
With the lifespan of reticulocytes in the circulation being only 24
to 48 hours, reticulocyte-dependent parameters provide a more real-time view
of bone marrow iron status.19 In the initial
phases of iron deficiency, before the development of anemia, fluctuations
in the iron supply to the bone marrow yield decreased hemoglobin production
in reticulocytes, resulting in reticulocytes with less hemoglobin and an overall
reduction in reticulocyte hemoglobin content (CHr).20-22 A
recent study has shown CHr to be a better predictor of iron stores than ferritin,
transferrin saturation, or mean corpuscular volume when bone marrow analysis
is used as the criterion standard.23 Unlike
biochemical studies, CHr requires no extra tubes of blood to be drawn; CHr
is reported as part of the reticulocyte count by the hematology analyzer used
in this study and is provided without any additional cost.
A screening approach for iron deficiency based on reticulocyte analysis
is appealing for its consistency in various biological states, direct real-time
assessment of iron metabolism, and ease of collection. The optimal CHr threshold
for predicting iron deficiency in healthy children has not been prospectively
determined and CHr has yet to be compared with hemoglobin as a screening tool
in the pediatric population. The primary objectives of our study were to establish
an optimal CHr threshold for detecting iron deficiency without anemia in 9-
to 12-month-old infants and to compare CHr with hemoglobin in screening for
iron deficiency in this population. A secondary objective was to explore the
association between CHr and subsequent development of anemia. The 9- to 12-month-old
age group was chosen because this age group is already routinely screened
since it is at particular risk for iron deficiency and its consequences.
Healthy 9- to 12-month-old infants presenting to an urban, hospital-based,
primary care practice in Boston, Mass, for scheduled well child or nonurgent
visits and due for iron deficiency screening as recommended by the American
Academy of Pediatrics24 and the US Centers
for Disease Control and Prevention25 were considered
as potential enrollees. To be included, study infants must have been born
at 37 or more weeks’ gestation; must not have been diagnosed with otitis
media, gastroenteritis, or an upper respiratory tract infection or have had
a temperature of more than 38.0°C in the 28 days before the visit; and
must not have taken antibiotics for an acute infection or a course of steroids lasting 14 or more days within the past 28 days. In addition, study participants
could not have had a known hemoglobinopathy, history of anemia, or have ever
received a blood transfusion or iron supplement.
Parents or guardians reported the race/ethnicity of their infant by
indicating all that applied to their child: black/African American, Asian,
white, Hispanic/Latino, and other/unknown. Race/ethnicity was assessed because
values for hemoglobin in African American individuals are 0.5 to 1.0 g/dL
lower than values in comparable white populations, although whether this is
a racial characteristic or due to a higher frequency of certain genetic traits
or nutritional differences is debated.26-28 Race/ethnicity
data were collected to ensure that representation of these groups did not
change between the initial screening visit and the follow-up screening, and
to evaluate the generalizability of the study.
Only families whose children met the aforementioned inclusion criteria,
as determined by chart review and/or primary care clinician, parent, or guardian
contact, were approached by the study research nurses for possible enrollment
in the study. Family history and state newborn screen results were not reviewed
for the presence of hemoglobinopathy; this more closely simulates real-world
pediatrics in which these results do not influence the current clinical practice
of screening all 9- to 12-month-old infants.
This study was approved by the Committee on Clinical Investigation of
Children’s Hospital, Boston, Mass. At the initial screening, written
informed consent was obtained from the parent or guardian and infant demographics,
including height, weight, birth weight, and race/ethnicity, were documented.
Five milliliters of blood was obtained by venipuncture for measurement of
biochemical (ferritin, iron, total iron-binding capacity, and zinc protoporphyrin)
and hematological (mean corpuscular volume, red blood cell distribution width,
hemoglobin, reticulocyte count, and CHr) parameters, as well as C-reactive
protein. Transferrin saturation (iron/total iron-binding capacity) and mean
corpuscular hemoglobin [(hemoglobin × 10)/red blood cell count]
were calculated from these measurements. Iron deficiency was defined as transferrin
saturation of less than 10%; this is the biochemical parameter that is considered
to most accurately reflect the iron available to the bone marrow for erythropoiesis,
with a transferrin saturation of less than 16% having been shown to reflect
an undersupply of iron to developing erythrocytes.29 Moreover,
this threshold specifically applies to the age range of the infants in our
study and is the lower limit of the range of transferrin saturation values
widely used clinically to define iron deficiency.1 Participants
with a hemoglobin level of less than 11 g/dL were deemed anemic in accordance
with the American Academy of Pediatrics24 and
Centers for Disease Control and Prevention25 guidelines
and were referred to their primary care clinicians for clinical management
(ie, iron supplementation). These infants and those with insufficient samples
to determine hemoglobin were excluded from further study. The remaining participants
returned for follow-up screening at least 3 months from enrollment but before
their second birthday. Enrollees were again required to meet the same inclusion/exclusion
criteria at follow-up screening for measurement of the same biochemical and
hematological parameters. The original target sample size was 250 participants,
chosen to provide 90% power to detect a 0.7-SD difference in the mean CHr
level among iron-deficient vs non–iron-deficient patients using a t test with 2-tailed α=.05 significance level, assuming
that the prevalence of iron deficiency would be 10%. The study was closed
before meeting the target sample size due to slower than expected accrual.
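For clarity, the derived indices and case definitions above reduce to simple arithmetic. The following minimal sketch (Python; the function and variable names are ours for illustration and are not part of the study protocol) applies the transferrin saturation and mean corpuscular hemoglobin formulas and the study thresholds for iron deficiency and anemia.

def derived_indices(iron_ug_dl, tibc_ug_dl, hemoglobin_g_dl, rbc_millions_per_ul):
    # Transferrin saturation (%) = serum iron / total iron-binding capacity x 100
    transferrin_saturation = 100.0 * iron_ug_dl / tibc_ug_dl
    # Mean corpuscular hemoglobin (pg) = (hemoglobin x 10) / red blood cell count,
    # with hemoglobin in g/dL and the red cell count in millions per microliter
    mch_pg = (hemoglobin_g_dl * 10.0) / rbc_millions_per_ul
    return transferrin_saturation, mch_pg

def classify(transferrin_saturation, hemoglobin_g_dl):
    # Study definitions: iron deficiency, transferrin saturation <10%; anemia, hemoglobin <11 g/dL
    return transferrin_saturation < 10.0, hemoglobin_g_dl < 11.0

# Hypothetical infant: serum iron 30 ug/dL, TIBC 400 ug/dL, hemoglobin 11.5 g/dL, RBC 4.5 x 10^6/uL
tsat, mch = derived_indices(30, 400, 11.5, 4.5)
print(round(tsat, 1), round(mch, 1))   # 7.5 (%) and 25.6 (pg)
print(classify(tsat, 11.5))            # (True, False): iron deficient but not anemic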
Erythrocyte and reticulocyte indices were measured with an automated
hematology analyzer (ADVIA 120, Bayer Diagnostics, Tarrytown, NY), which quantifies
mean values and distributions for cell volume, hemoglobin concentration, and
hemoglobin content in both erythrocytes and reticulocytes. Serum iron and
total iron-binding capacity (based on a transferrin immunoassay) were measured
using a chemistry analyzer (Hitachi 917, Roche Diagnostics, Indianapolis,
Ind). C-reactive protein was measured on a BNII nephelometer (Dade-Behring
Inc, Deerfield, Ill). Zinc protoporphyrin was measured in whole blood with
a hematofluorometer (Aviv Biomedical, Lakewood, NJ) and expressed as μmol/mol heme.
All data were entered into a Microsoft Excel spreadsheet (Microsoft
Corp, Redmond, Wash). Each entry was double-checked and data analysis was
performed using SPSS version 11 (SPSS Inc, Chicago, Ill), S-PLUS version 4.5
(Insightful Corp, Seattle, Wash), and Stata version 6 (StataCorp LP, College
Station, Tex). All significance testing was 2-tailed and statistical significance
was defined as P<.05. Any participant with a sample
insufficient to determine transferrin saturation or CHr was excluded from
analyses for that particular study visit. Characteristics of infants who were
included and excluded from analyses were compared using the χ2 test
for categorical variables and the t test for continuous
variables. Point estimates and exact binomial 95% confidence intervals (CIs)
were calculated for the prevalence of iron deficiency (transferrin saturation
<10%) and iron-deficiency anemia (transferrin saturation <10% and hemoglobin
<11 g/dL) at initial screening.
Box and whisker plots were created to display the distributions of hematological
and biochemical marker levels of non–iron-deficient and iron-deficient
infants, and median values were compared using the Wilcoxon rank sum test.
Receiver operating characteristic (ROC) analysis was used to evaluate the
sensitivity and specificity of all possible CHr, hemoglobin, and mean corpuscular
hemoglobin thresholds for detecting iron deficiency (using transferrin saturation
<10% as the criterion standard). A priori, the minimum requirements of
a screening test for iron deficiency were defined to be 80% sensitivity and
50% specificity, and the optimal CHr cutoff was defined to be the one with
the highest sensitivity and specificity among all thresholds meeting the minimum
requirements. The overall accuracy of CHr, hemoglobin, and mean corpuscular
hemoglobin in detecting iron deficiency was summarized using the area under
the ROC and compared using a nonparametric test for comparing areas under
correlated ROC curves.30 Ninety-five percent
CIs for the sensitivity, specificity, positive predictive value, and negative
predictive value were calculated using exact binomial methods. The association
of CHr at initial screening with the incidence of anemia at follow-up screening
was estimated by calculating the risk ratio of those above vs below the optimal
CHr cutoff at initial screening, with an exact 95% CI. The Fisher exact test
was used to assess statistical significance of this association.
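One way to implement the cutoff-selection rule described above is sketched below (Python with NumPy and scikit-learn). The data here are simulated for illustration only, and the rule of retaining, among thresholds meeting the minimum 80% sensitivity and 50% specificity requirements, the one maximizing combined sensitivity and specificity is one reasonable reading of the a priori definition.

import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Simulated data for illustration (not the study data): 1 = iron deficient
# (transferrin saturation <10%), 0 = not iron deficient; chr_pg holds CHr values.
rng = np.random.default_rng(0)
iron_deficient = rng.integers(0, 2, size=200)
chr_pg = np.where(iron_deficient == 1,
                  rng.normal(26.5, 1.5, 200),
                  rng.normal(29.0, 1.5, 200))

# Lower CHr indicates iron deficiency, so the score passed to roc_curve is -CHr.
fpr, tpr, thresholds = roc_curve(iron_deficient, -chr_pg)
sensitivity, specificity = tpr, 1 - fpr

# A priori minimum requirements from the study: sensitivity >= 80%, specificity >= 50%.
qualifies = (sensitivity >= 0.80) & (specificity >= 0.50)
best = np.argmax(np.where(qualifies, sensitivity + specificity, -np.inf))
print("optimal CHr cutoff (pg):", round(-thresholds[best], 1))
print("sensitivity:", round(sensitivity[best], 2), "specificity:", round(specificity[best], 2))
print("area under the ROC curve:", round(roc_auc_score(iron_deficient, -chr_pg), 2))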
Two hundred nineteen infants were enrolled between June 2000 and April
2003 (Figure 1). Seventeen infants (8%)
had insufficient samples to determine hemoglobin, transferrin saturation,
or CHr at initial screening and could not be included in data analyses; their
demographics did not differ significantly from those of the 202 infants with
available data. The baseline characteristics of all infants with available
data for initial and follow-up screenings are shown in the Table. Ninety-six percent of C-reactive protein measurements were
within the normal range at enrollment.
Of those infants with complete data at initial screening, 14 (7%) had
anemia and were excluded from further study participation. Of the remaining
188 infants who were eligible for a second screening, 32 (17%) did not return
for follow-up screening and 9 (5%) returned but did not have usable data (3
received iron supplementation by primary care clinician or parental initiative
and 6 had insufficient samples to determine hemoglobin). The remaining 147
infants had complete data for both screenings, with a median time interval
between screenings of 5.6 months. The baseline characteristics for the 147
infants with complete data for both screenings were similar to the characteristics
of the 202 infants at initial screening (Table). With the exception of age (mean, 9.8 vs 10.2 months; P = .02), the demographic characteristics of the 55 infants
who did not have follow-up screening did not differ significantly from the
characteristics of the 147 infants who did have follow-up screening.
Of the 202 evaluable infants at initial screening, 23 had iron deficiency
(prevalence, 11.4%; 95% CI, 7.4%-16.6%) and 6 had iron deficiency and anemia
(prevalence, 3%; 95% CI, 1.1%-6.4%). The mean CHr value was 28.1 pg (SD, 2.3;
95% CI, 27.8-28.5). The median hemoglobin and CHr values were significantly
different for non–iron-deficient compared with iron-deficient infants;
median values of other hematological and biochemical markers also differed
significantly between non–iron-deficient compared with iron-deficient
infants (Figure 2). Zinc protoporphyrin
was missing for 1 infant (not iron deficient) and ferritin was missing for
3 infants (none were iron deficient).
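The exact binomial (Clopper-Pearson) confidence intervals for these prevalence estimates can be recomputed from the counts alone; a minimal sketch (Python with SciPy; the helper function is ours, not the study's code) is shown below.

from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    # Exact binomial (Clopper-Pearson) confidence interval for a proportion k/n
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper

# Iron deficiency: 23 of 202 infants; iron deficiency with anemia: 6 of 202
print(clopper_pearson(23, 202))  # approximately (0.074, 0.166), ie, 7.4%-16.6%
print(clopper_pearson(6, 202))   # approximately (0.011, 0.064), ie, 1.1%-6.4%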
By ROC analysis (Figure 3), the
2 CHr thresholds that met the minimum sensitivity and specificity requirements
for detecting iron deficiency in infants at initial screening were 27.5 pg,
with a sensitivity of 83% (detected 19 of 23 iron-deficient infants; 95% CI,
61%-95%) and a specificity of 72% (95% CI, 65%-78%), and 28.3 pg, with a sensitivity
of 87% (detected 20 of 23 iron-deficient infants; 95% CI, 66%-97%) and a specificity
of 58% (95% CI, 50%-65%). Of the 2 candidate CHr thresholds, 27.5 pg was chosen
as the optimal one due to its higher specificity and similar sensitivity.
The positive predictive value of the 27.5-pg CHr threshold was 28% (95% CI,
17%-40%) and the negative predictive value was 97% (95% CI, 92%-99%).
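These predictive values follow from the 2 × 2 table implied by the reported sensitivity and specificity at the 27.5-pg cutoff. The short check below (Python) uses cell counts reconstructed from those figures; the counts themselves are our inference and are not tabulated in the article.

# Reconstructed 2 x 2 table at the CHr <27.5 pg cutoff (inferred, not tabulated in the article)
tp, fn = 19, 4            # 19 of 23 iron-deficient infants had CHr <27.5 pg; 4 were missed
tn = round(0.72 * 179)    # about 129 of 179 non-iron-deficient infants had CHr >=27.5 pg
fp = 179 - tn             # about 50 non-iron-deficient infants flagged by the cutoff

ppv = tp / (tp + fp)      # about 0.28, consistent with the reported 28%
npv = tn / (tn + fn)      # about 0.97, consistent with the reported 97%
print(round(ppv, 2), round(npv, 2))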
Using a hemoglobin level of less than 11 g/dL resulted in a sensitivity
of 26% (detected 6 of 23 iron-deficient infants; 95% CI, 10%-48%), a specificity
of 95% (95% CI, 91%-98%), a positive predictive value of 43% (95% CI, 18%-71%),
and a negative predictive value of 91% (95% CI, 86%-95%). The area under the
ROC curve for CHr was significantly larger than that for hemoglobin (0.85
vs 0.73, P = .007), indicating that CHr
was a more accurate marker overall for the detection of iron deficiency than
hemoglobin. For example, a higher hemoglobin cutoff level of 12.3 to 12.4
g/dL would have comparable sensitivity with that of a CHr of less than 27.5
pg (78%-87%), but lower specificity (39%-45%), while a hemoglobin cutoff level
of 12.0 g/dL would have comparable specificity with that of a CHr of less
than 27.5 pg (73%) but lower sensitivity (61%). Reticulocyte hemoglobin content
was more accurate in detecting iron deficiency than mean corpuscular hemoglobin,
another parameter derived from the entire population of erythrocytes; the
area under the ROC curve for the latter was 0.73, nearly identical to that
of hemoglobin, and significantly smaller than that for CHr (P = .006). Using CHr less than mean corpuscular hemoglobin to screen for iron deficiency yielded a specificity of 93%.
Of the 45 nonanemic infants with CHr of less than 27.5 pg at initial
screening who were evaluated at follow-up screening, 4 (9%) developed anemia.
In contrast, of the 102 nonanemic infants with CHr of at least 27.5 pg at
initial screening who were evaluated at follow-up screening, only 1 (1%) developed
anemia (risk ratio, 9.1; 95% CI, 1.04-78.9; P = .01).
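The risk ratio can be checked directly from these counts; the brief sketch below (Python with SciPy) recomputes the point estimate and runs a Fisher exact test on the same 2 × 2 table. The exact confidence interval method used in the article is not reimplemented here, and recomputed P values may differ slightly depending on test conventions.

from scipy.stats import fisher_exact

# 4 of 45 infants with CHr <27.5 pg vs 1 of 102 infants with CHr >=27.5 pg developed anemia
risk_low_chr = 4 / 45
risk_high_chr = 1 / 102
print(round(risk_low_chr / risk_high_chr, 1))   # 9.1, the reported risk ratio

# Two-sided Fisher exact test on the table [[anemia, no anemia] for each CHr group]
odds_ratio, p_value = fisher_exact([[4, 41], [1, 101]])
print(round(p_value, 3))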
In this prospective study, we have identified a CHr threshold of less
than 27.5 pg, with its promising sensitivity and specificity profile, for
detecting iron deficiency before anemia in healthy 9- to 12-month-old infants.
This threshold also was associated with the development of subsequent anemia.
These findings suggest that CHr may prove to be a valuable screening tool
for iron deficiency in the ambulatory primary care setting.
Unlike biochemical parameters, hematological parameters, such as CHr
and hemoglobin, are advantageous for screening for iron deficiency because
they are easily obtained, inexpensive, and free from biological variability
that affects iron, total iron-binding capacity, and ferritin measurements.
Reticulocyte hemoglobin content reflects the iron availability at the time
that today’s reticulocytes were made in the bone marrow. Because of
the short duration of the reticulocyte stage in erythropoiesis, evaluation
of CHr can reveal states of iron deficiency that are clinically significant
but not reflected by parameters derived from the entire red blood cell population.
In fact, both hemoglobin and mean corpuscular hemoglobin showed significantly
lower accuracy than CHr in detecting iron deficiency.
A decrease in hemoglobin is a late finding in the development of iron
deficiency, significantly detracting from its use as a preventive screening
tool. Furthermore, as the prevalence of iron-deficiency anemia is declining
in children, the value of anemia as a predictor of iron deficiency will diminish
further.24 Like hemoglobin, hematocrit also
would not be expected to detect iron deficiency with adequate sensitivity.
In 1 study of 321 infants, a hematocrit of 33%, corresponding with a hemoglobin
of 11 g/dL, did not detect any infants with iron deficiency, defined as a
ferritin level of less than 10 μg/L.31 We
defined iron deficiency as a transferrin saturation of less than 10%. Transferrin
saturation is a measure of transported iron and a commonly used biochemical
indicator for iron deficiency. A decrease in transferrin saturation can indicate
a stage of iron-restricted erythropoiesis that has not yet resulted in frank
anemia.1 Although a difference in the biochemical
standard used to define iron deficiency between the aforementioned study and
our study likely explains the difference in sensitivities of hemoglobin/hematocrit,
both studies highlight that the sensitivity of hemoglobin/hematocrit is insufficient
for their use as screening tools for iron deficiency.
A retrospective study involving an older pediatric population (mean
[SD] age, 2.9 [2.0] years) also suggested CHr was the strongest predictor
of iron-deficiency anemia in children compared with biochemical parameters
used to estimate iron stores (ferritin, soluble transferrin receptor, and
zinc protoporphyrin) and hematological parameters (mean corpuscular volume,
mean corpuscular hemoglobin, and red blood cell distribution width).18 In this previous retrospective study, a CHr threshold
of 27.5 pg had a sensitivity (86%) nearly identical to that in our younger study population but a lower specificity (38%).
Our infant population’s mean CHr was significantly lower than
the mean CHr reported for healthy adults in 1 study,23 but
not significantly lower than in another study.32 This
may be related to the transient physiologic decreases in mean corpuscular
volume and mean corpuscular hemoglobin observed during the first 2 years of
life, as well as to differences in instrumentation and methods among the studies.
There is little else known about how CHr changes with age, and further work
to address this question is needed.
Our study does have some limitations. First, CHr is a sensitive marker
that is specific for iron deficiency even in states of inflammation. However,
as a hematological parameter, its specificity is limited by other hematological
conditions, such as thalassemia trait and symptomatic thalassemia syndromes,
which also cause microcytosis and low hemoglobin content of both reticulocytes
and mature erythrocytes. Although we did not evaluate potential participants
for thalassemia, several different erythrocyte indices have been described
that are helpful in discriminating thalassemia trait from iron deficiency,33 including the Mentzer formula34 and
the microcytosis-hypochromia ratio.35 The microcytosis-hypochromia
ratio is a feature available on the hematology analyzer used to measure CHr
and has been shown to accurately discriminate iron deficiency from thalassemia
trait based on the fact that iron-deficient erythropoiesis is characterized
by more pronounced hypochromia, whereas in thalassemia trait the erythrocytes
are more microcytic. These indices facilitate the distinction of iron deficiency
from thalassemia in the setting of microcytosis and a low CHr.
Second, CHr can only be measured by 2 of the major hematology analyzers
in use today. A growing number of analyzers can determine reticulocyte cellular
indices, including alternatives such as mean reticulocyte volume and RET-Y,
a raw reticulocyte measure dependent on size and content of the cell. RET-Y
correlates well with CHr36 and similar to CHr
may be useful in the assessment of iron-deficient states.37 Reticulocyte
indices and their possible applications are growing areas of active research,
and CHr remains the best characterized of all the indices in use today.
Third, 17% of the infants were lost to follow-up; however, given the
characteristics of the infants analyzed at the initial and follow-up screenings,
the 2 groups remained comparable. Fourth, the size of the study was relatively
small with only 23 iron-deficient infants at initial screening. Fifth, the
cohort was mainly black and Hispanic, which may limit the generalizability
of this study to all pediatric practices. Finally, the age intervals between
initial and follow-up screenings may have influenced the results. This is
largely because CHr has not yet been studied in healthy infants and children
to allow establishment of age-adjusted normal ranges. These varied age intervals
in follow-up were largely a product of the stringent exclusion criteria that
were applied to the infants at the follow-up screening. If they were not eligible
for follow-up screening due to illness that could interfere with the accuracy
of biochemical tests obtained, they were required to wait until they were
once again eligible.
A particular strength of our study is that the infants were carefully
screened so that those with known possible infection or inflammation were
excluded. Although transferrin saturation is considered the criterion standard
for iron deficiency,1 it is derived from iron
and transferrin, 2 acute phase reactants that can be altered in states of
inflammation. The ability of our criteria to exclude those infants with inflammation
is confirmed by the rarity of even minimally increased C-reactive protein
in our study infants. Although adult studies have shown that CHr remains an
accurate marker of iron status in states of infection and inflammation,38,39 infants whose biochemical parameters
might not accurately reflect their true iron status were excluded in our study.
This is supported by the fact that the prevalence of iron deficiency and iron-deficiency
anemia in our population are concordant with prevalence described elsewhere.1,40
In healthy adults, CHr has been shown to be an early indicator of response
to therapy in iron-deficiency anemia.19,22 Although
it is likely to have the same predictive ability in the pediatric population,
further studies are needed to evaluate CHr changes in response to iron therapy.
Studies are also needed to evaluate how CHr values change with increasing
age and to explore cost/benefit analyses of CHr as a screening tool for iron-deficiency
anemia in infants. Further studies are also needed to determine if the use
of CHr to detect and treat iron deficiency significantly decreases the subsequent
development of anemia and neurocognitive deficits.
The US Department of Health and Human Services initiative Healthy People 2010 aims to decrease the prevalence of iron deficiency.41 Our study suggests that CHr is a sensitive screening
tool for detecting iron deficiency. Although its modestly lower specificity
compared with hemoglobin may lead to overtreatment, CHr shows promise in the
identification of children with iron deficiency solely on the basis of hematological
parameters. Larger multicenter studies will be necessary to determine whether
CHr should be the preferred screening tool in the early detection of iron
deficiency, bringing us a step closer to reducing the prevalence of this treatable condition.
Corresponding Author: Christina Ullrich,
MD, Department of Pediatric Oncology, Dana Farber Cancer Institute, 44 Binney
St, Boston, MA 02115 (email@example.com).
Author Contributions: Dr Bernstein, as principal
investigator of this study, had full access to all of the data in the study
and takes responsibility for the integrity of the data and the accuracy of
the data analysis.
Study concept and design: Wu, Armsby, Brugnara,
Acquisition of data: Ullrich, Wu, Armsby, Rieber,
Analysis and interpretation of data: Ullrich,
Armsby, Rieber, Brugnara, Shapiro, Bernstein.
Drafting of the manuscript: Ullrich, Rieber,
Brugnara, Shapiro, Bernstein.
Critical revision of the manuscript for important
intellectual content: Ullrich, Wu, Armsby, Rieber, Wingerter, Brugnara,
Statistical analysis: Ullrich, Rieber, Shapiro.
Obtained funding: Wu, Bernstein.
Administrative, technical, or material support:
Ullrich, Rieber, Brugnara, Bernstein.
Study supervision: Brugnara, Bernstein.
Financial Disclosures: Dr Brugnara currently
has a consulting agreement with Bayer Diagnostics and Dr Bernstein has had
a consulting agreement with Bayer Diagnostics in the past.
Funding/Support: Bayer Diagnostics funded this
study and provided laboratory reagents. The Department of Laboratory Medicine,
Children’s Hospital Boston, performed all laboratory analyses. Bayer
Diagnostics is the manufacturer of the ADVIA 120 hematology analyzer.
Role of the Sponsor: Bayer Diagnostics played
no role in the design and conduct of the study; collection, management, analysis,
or interpretation of the data; and preparation, review, or approval of the
manuscript. Statistical analyses were performed by Drs Ullrich and Shapiro
and Ms Rieber; none of them has any financial or personal connection with Bayer Diagnostics.
Acknowledgment: We thank the Children’s
Hospital Primary Care Center and the research nurses, Anne Bailey, RN, Sandra
Smith, RN, Christine Bourgeois, RN, CPNP, MS, and Erin O’Mahony, RN,
for their valuable assistance with infant recruitment, enrollment, and follow-up;
Rebecca Stoltz, BA, for her assistance with data entry and manuscript review;
and Ellis Neufeld, MD, PhD, for his thoughtful insights.