Research Review - Final – Jan. 14, 2013
Biomarkers for Assessing and Managing Iron Deficiency Anemia in Late-Stage Chronic Kidney Disease: Future Research Needs
Archived: This report is more than 3 years old. Findings may be used for research purposes but should not be considered current.
Anemia is a common complication of chronic kidney disease (CKD); it develops early in the course of CKD and becomes increasingly severe as kidney function deteriorates. Iron deficiency is a continuous process that evolves in three stages. The first is depletion of storage iron (stage I), in which total body iron is decreased but hemoglobin (Hb) synthesis and red cell indices remain unaffected. Both of these indices change when the supply of iron to the bone marrow becomes inadequate (iron-deficient erythropoiesis, or stage II). In stage III, the iron supply is insufficient to maintain a normal Hb concentration, and iron deficiency anemia develops.
The management of anemia in CKD patients must strike an appropriate balance between stimulating the generation of red blood cells (erythropoiesis) and maintaining sufficient iron for optimal Hb production. Assessing iron stores and the availability of iron for erythropoiesis is therefore integral to both iron and anemia management in CKD patients. The major cause of iron deficiency, particularly in dialysis patients, is blood loss. Dialysis patients are in a state of continuous iron loss from gastrointestinal bleeding (very common), blood drawing, and, most importantly in hemodialysis, the dialysis treatment itself. A CKD patient treated with erythropoiesis-stimulating agents (ESAs) for anemia often develops iron deficiency, because the iron required to achieve a response to ESA treatment usually cannot be met by mobilizing the patient's iron stores alone. Supplemental iron therapy, given orally or intravenously, is therefore often needed in dialysis patients receiving recombinant human erythropoietin (EPO) or darbepoetin alfa. Thus, iron management (iron status assessment and iron treatment) is an essential part of treating the anemia associated with CKD, particularly given concerns about the adverse effects of both elevated ESA doses and supplemental (intravenous or oral) iron.
Classical iron status tests, of which ferritin and transferrin saturation (TSAT) are the most widely used, reflect either the level of iron in tissue stores or the adequacy of iron for erythropoiesis. Though widely used, classical laboratory biomarkers of iron status have drawbacks in CKD patients: CKD is a pro-inflammatory state, and the biological variability of serum iron, TSAT, and ferritin is known to be large in the context of underlying inflammation. Furthermore, a meta-analysis of 55 studies (published before 1990) found that ferritin radioimmunoassay was the most powerful test for diagnosing iron deficiency in adult patients (mean area under the receiver operating characteristic curve 0.95; 95% CI 0.94 to 0.96), compared with mean cell volume, TSAT, and red cell distribution width, but test performance varied between patients with and without inflammatory conditions (e.g., CKD) or liver disease. Accurate assessment of iron status depends on the validity and reliability of laboratory test results, and these differences in test performance pose a dilemma regarding the most appropriate test to guide treatment decisions.
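Of the two classical markers, TSAT is a derived quantity: serum iron expressed as a percentage of total iron-binding capacity (TIBC). A minimal sketch of that calculation is shown below; the function name and the example values are illustrative only, and no diagnostic cutoff is implied, since (as noted above) interpretation varies with inflammation.

```python
def transferrin_saturation(serum_iron_ug_dl: float, tibc_ug_dl: float) -> float:
    """Percent transferrin saturation (TSAT): serum iron as a
    percentage of total iron-binding capacity (TIBC), both in ug/dL."""
    if tibc_ug_dl <= 0:
        raise ValueError("TIBC must be a positive concentration")
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

# Illustrative values: serum iron 50 ug/dL with TIBC 400 ug/dL
tsat = transferrin_saturation(50.0, 400.0)
print(f"TSAT = {tsat:.1f}%")  # prints "TSAT = 12.5%"
```

Because both inputs share the same units, the ratio is dimensionless, which is why TSAT is reported as a percentage rather than a concentration.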