Iron intake and its effects on neurological functioning are an area of growing research interest. Of particular importance is the relationship between iron homeostasis in the central nervous system (CNS) and neurodegeneration in Alzheimer’s disease (AD).
AD is a progressive, degenerative neurological condition in which neuron death results in brain shrinkage. According to the amyloid hypothesis, AD begins with abnormal processing of the transmembrane amyloid-β precursor protein, causing accumulation of β-amyloid plaques outside neurons. Subsequently, the microtubule-associated tau protein inside neurons becomes abnormally hyperphosphorylated and forms neurofibrillary tangles that disrupt neuronal function. The signature pathologies of AD are therefore the progressive accumulation of extracellular β-amyloid plaques and intraneuronal tau tangles, eventually accompanied by nerve cell damage and death.
In the CNS, iron is crucial to processes such as oxygen transport, myelin synthesis and neurotransmitter production. In the brains of AD patients, iron accumulates without the normal age-related increase in the iron storage protein ferritin. At the time the study by Ayton et al. (2015) was published, researchers had shown that in brain specimens of both AD patients and transgenic mouse models, amyloid deposits and neurofibrillary tangles co-localised with neuronal iron accumulation.
Moreover, it was widely acknowledged that dyshomeostasis of cerebral iron contributed to the neuropathology of AD. Yamamoto et al. (2002) showed in post-mortem adult AD tissue that elevated ferric ion levels induced the cluster-like formations of α-synuclein and hyperphosphorylated tau protein observed in neurodegenerative disorders. Similarly, in rat hypothalamic cells the reduction of ferric iron was accompanied by the oxidation of dopamine to highly reactive, toxic quinones, causing cell death through apoptosis and ferroptosis and thereby neurodegeneration (Alzheimer’s Association, 2017). Another mechanism by which evidence at the time of publication pointed to iron dysregulation as a contributor to AD pathogenesis was the production of reactive oxygen species (ROS). The impact of ROS is widespread: they can cause neurodegeneration by damaging DNA and mtDNA, epigenetically affecting gene expression, or by releasing further iron from mitochondrial iron–sulphur cluster proteins via the Fenton reaction. Research at the time was therefore clear that changes in iron homeostasis could lead to iron-mediated neurotoxicity.
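The Fenton chemistry mentioned above, by which ferrous iron and hydrogen peroxide generate the highly reactive hydroxyl radical, can be summarized as:

```latex
\mathrm{Fe^{2+} + H_2O_2 \longrightarrow Fe^{3+} + OH^{-} + {}^{\bullet}OH}
```

The hydroxyl radical produced is among the most reactive of the ROS, attacking lipids, proteins and nucleic acids indiscriminately, which is why intracellular free iron is so tightly regulated.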
Well before this study was conducted, limited preliminary studies in animals using lipophilic brain-permeable iron chelators such as VK-28, HLA-20 and M3083 had shown neuroprotective effects, supporting the causative role of iron in neurodegeneration. However, there was a need for more evidence in vivo to aid in determining whether quantitative assessment of CNS iron levels could predict cognitive trajectory in AD. There is currently no cure or effective intervention to slow progression of AD and understanding the implications of CNS iron levels could offer a target for preventative drug therapies. Such analysis could also further our understanding of brain iron metabolism and the molecular mechanisms causing iron accumulation in specific brain areas, hopefully presenting alternative methods of AD prevention and treatment.
The study by Ayton et al. (2015) is a prospective longitudinal cohort study. Unique to this study is that the data were collated from the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a large, multi-site, extensively funded, longitudinal cohort study. This means that to assess the validity of the data in the study by Ayton et al. (2015), the data collection processes and protocols of the ADNI study must be analyzed. Data were collected from 302 older individuals (average age ~75 years) from the ADNI cohort: 91 with normal cognition (NC); 144 with mild cognitive impairment (MCI); and 67 with Alzheimer’s disease (AD). While this is a considerable number of participants, it is notable that the authors did not use all of the data freely available to them.
The recruitment of the study population and its segregation into NC, MCI and AD groups has been regarded by many as successful due to the heterogeneity of the cohorts and their applicability to other populations. Of concern, though, is the high rate of polypharmacy within the ADNI cohort, as investigated by Epstein, Saykin, Risacher, Gao, and Farlow (2010): 85% of participants were taking more than four medications. The use of these medications, in particular the commonly used cholinesterase inhibitors and memantine, was associated with greater cognitive impairment at baseline and greater decline over two years according to follow-up research. This introduces a source of bias and may affect the interpretation of research outcomes.
Another factor which should have been accounted for is the effect of an elderly person’s body mass index (BMI) on the rate of cognitive decline. In an assessment by Cronk (2010), using the same ADNI data set from which the study by Ayton et al. (2015) was derived, lower BMI at baseline was associated with increased rates of cognitive decline over one year in the MCI group. This decline was clinically recognizable and important, as it was detected using the Alzheimer’s Disease Assessment Scale – Cognition, the same scale used by Ayton et al. (2015) to determine the association between baseline cerebrospinal fluid (CSF) ferritin levels and clinical outcomes.
One shortcoming of the study by Ayton et al. (2015) was that magnetic resonance imaging (MRI) was only performed on NC and MCI subjects; all AD subjects were excluded from the MRI analyses due to low numbers and short follow-up. Nevertheless, for the participants who underwent MRI, scans were performed at baseline, 6 months, 1 year and then yearly for six years. Such regular measurements would have provided sufficient data for accurate analysis. In addition, choosing MRI as a primary measurement tool was astute: MRI is a promising outcome assessment technique, and there have been significant increases in the resolution of iron distribution images. A downside of the serial MRI scans used in ADNI is that they often introduce artifacts into the results because of non-uniformity of signal intensity. Fortunately, the ADNI MRI protocol included optimization of pulse sequences. Moreover, tensor-based morphometry was applied, which produces statistically analyzable maps of anatomical change and improves reliability and reproducibility. According to Leow et al. (2005), who investigated MRI scan types for mapping brain changes using tensor-based morphometry, should this study be replicated it would be useful to use spoiled gradient recalled echo (SPGR) sequences for the most robust results.
While the statistical analyses were performed correctly in this study, the use of natural logarithmic transformation introduces bias for the variables ferritin, factor H, tau and hemoglobin. If a set of data values is skewed, taking the natural log of each value yields a roughly symmetric, approximately normal data set, and the log transformation can also stabilize variability. The result is more interpretable data that more closely satisfy the assumptions of inferential statistics. However, while log-transformation is widely used in research, the results of standard statistical tests performed on log-transformed data may not carry over to the original, untransformed data. Log-transformed data cannot usually support inferences about the original data, since the two scales share little in common. Estimates obtained by taking antilogs of the transformed data are common; however, a bias is inherent in this procedure, because the largest values are compressed on the logarithmic scale and so tend to have less “leverage” than small values. Successive approximations to correct for such bias have been outlined, but they are seldom used in practice; applying them might have given more accurate results in this study. For many applications it has been suggested that, instead of finding an appropriate statistical distribution or transformation to model the observed data, it would be better to switch to modern distribution-free methods. For example, the generalized estimating equations technique avoids many of the problems discussed.
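The retransformation bias described above can be illustrated with a short sketch. The data below are simulated, not from the ADNI cohort; the point is that naively exponentiating the mean of the logs recovers the geometric mean, which understates the arithmetic mean of a skewed variable, and that a correction such as Duan’s smearing estimator removes much of that bias:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical right-skewed biomarker values (e.g. a ferritin-like
# quantity) -- illustrative data only, not taken from the study.
ferritin = rng.lognormal(mean=2.0, sigma=0.8, size=500)

# Log-transforming a skewed variable gives a roughly symmetric,
# approximately normal data set suitable for linear modelling.
log_ferritin = np.log(ferritin)

# Naive back-transform: exponentiating the mean of the logs yields
# the geometric mean, which is below the arithmetic mean.
naive_back = np.exp(log_ferritin.mean())

# Duan's smearing estimator rescales by the mean of the exponentiated
# residuals, correcting the retransformation bias (exactly so in this
# intercept-only example; approximately in a full regression).
residuals = log_ferritin - log_ferritin.mean()
smearing = np.exp(log_ferritin.mean()) * np.exp(residuals).mean()

print(naive_back, smearing, ferritin.mean())
```

In a regression with covariates the same idea applies to the fitted residuals; distribution-free alternatives such as generalized estimating equations sidestep the issue entirely by modelling the untransformed outcome.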
Adding confidence to the study results, Ayton et al. (2015) tested the conditions necessary for applying multiple regression models. The data were analyzed and reported to pass the eight assumptions required for multiple regression. The researchers in particular noted the absence of multicollinearity, which occurs when two or more independent variables are highly correlated. Establishing this was important, as multicollinearity in a data set makes it difficult to discern which independent variable contributes to the variance explained in the dependent variable.
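A standard way to screen for the multicollinearity discussed above is the variance inflation factor (VIF), where VIF above roughly 5–10 is usually taken as problematic. The sketch below uses simulated predictors loosely named after the study’s variables (illustrative assumptions only, not the ADNI data):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X.

    VIF_j = 1 / (1 - R^2_j), where R^2_j is from regressing column j
    on the remaining columns plus an intercept.
    """
    n, p = X.shape
    factors = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        factors.append(1.0 / (1.0 - r2))
    return np.array(factors)

# Simulated predictors: age is independent, while "log_tau" is built
# to be highly correlated with "log_ferritin" to mimic collinearity.
rng = np.random.default_rng(1)
age = rng.normal(75, 5, 300)
log_ferritin = rng.normal(2.0, 0.5, 300)
log_tau = 0.9 * log_ferritin + rng.normal(0, 0.1, 300)

X = np.column_stack([age, log_ferritin, log_tau])
print(vif(X))  # age near 1; the correlated pair well above 10
```

A VIF near 1 means the coefficient’s variance is essentially unaffected by the other predictors; large values signal that the regression cannot cleanly attribute explained variance to one variable.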
CSF ferritin is proposed by Ayton et al. (2015) as a possible index of brain iron load. This claim is based on research on cells grown in culture systems. Though a linchpin of modern research, it has long been known that cells in the laboratory behave differently from those in the body, affecting our understanding of diseases and drug development. Researchers have discovered that progression to an altered cell state, evidenced by epigenetic and transcriptional reprogramming, occurs significantly earlier than previously assumed; this epigenetic remodeling occurs in parallel with gene expression changes.
Further emphasizing the effect of cell culture on the epigenome, a recent large-scale genome-wide study of DNA methylation found that normal human tissues and primary cell lines form two separate and robust clades. That is, the methylomes of primary cell lines from multiple different tissues more closely resembled each other than the tissues from which they were derived. The application of findings from cell lines previously considered faithful mimics of in vivo epigenetic and physiological processes must therefore be reconsidered. Overall, this evidence questions the validity of the experimental models presenting CSF ferritin as an index of brain iron load, as stated by Ayton et al. (2015). As such, while the data are accurate, the conclusions of the study could be questioned.
One of the primary goals of the ADNI was to develop biomarkers to facilitate clinical trials of AD therapeutics. Its research explored hypotheses that could yield groundbreaking insights into neurodegeneration and billion-dollar drug developments, suggesting possible conflicts of interest. It is well publicized that approximately 20 million dollars was contributed by a combination of the ADNI and private industry to fund the study. With such a large amount invested, it is conceivable that investors had input into study procedures, skewing results towards those favorable to them.
Nevertheless, a few factors largely mitigate these conflict-of-interest concerns. Firstly, open disclosure is the best mechanism for dealing with this issue, and the ADNI website lists all the groups and companies funding ADNI data collection and sharing. Secondly, pharmaceutical manufacturers would be motivated by the profitability of scientific discovery into a disease with no current therapeutic strategies, which encourages accuracy and applicability of findings. Initially concerning, though, was the specific mention by Ayton et al. (2015) that lowering CSF ferritin, as achieved by the drug deferiprone, could delay MCI conversion to AD by up to three years. However, deferiprone is manufactured by the pharmaceutical giant Apotex, which is not listed as a sponsor of the ADNI, so it is unlikely the study could be used as a platform to promote the company’s own products. There is also significant evidence from a number of different studies supporting the use of deferiprone; for example, the trials by Giovanni et al. (2011) and Shachar et al. (2004) concluded it was safe, tolerable and may improve neurological manifestations associated with iron accumulation.
The current recommendations for iron intake are available from (30) and are said to be in excess of actual requirements. There are difficulties in assessing iron intake, iron losses, iron bioavailability and the amount of iron needed to maintain iron stores, so further research is needed to determine optimal iron intake recommendations. Interestingly, this excess does not correspond to population iron intake levels: almost one-quarter of the population worldwide is affected by anemia, of which iron deficiency is the primary cause. Therefore, at a population level it would be inappropriate to warn consumers that toxic levels of iron consumption can contribute to neurodegeneration. Instead, I see the value of the research by Ayton et al. (2015) in furthering our understanding of AD pathogenesis so that innovative drug therapies to combat neurodegeneration can be developed.
Nevertheless, scientists do have a duty of care to the public and I would suggest a warning about the dangers of excess iron consumption be provided on iron supplement packaging. It is difficult to reach toxic levels of iron intake from food sources but iron supplements taken in excess are concerning. It is well documented that excessive iron can cause harm by mechanisms such as the erosion of the gastrointestinal mucosa, oxidative damage of proteins and DNA, or by shifting the balance of a host–pathogen relationship in favor of the pathogen. Coupled with the recent links between iron and neurodegeneration, there should be more information where appropriate about the pathological role of iron.