By Deborah Borfitz
December 1, 2020 | The eyes could be easily and inexpensively imaged to reveal the presence of neurodegenerative diseases, potentially years before telltale clinical signs appear, a growing body of research suggests. While currently limited to investigational use, retinal imaging as a diagnostic tool in the clinic may well be a reality within a few years, according to Maya Koronyo-Hamaoui, Ph.D., associate professor of neurosurgery and biomedical sciences at Cedars-Sinai Medical Center, whose pioneering work (with Yosef Koronyo, M.Sc., LLB, and Keith L. Black, M.D.) a decade ago showed amyloid beta (Aβ) aggregates in the retinas of Alzheimer’s patients.
Amyloid beta is a hallmark pathology required for a diagnosis of Alzheimer's dementia and, when sufficiently elevated, triggers the spread of neurofibrillary tangles made of abnormal tau throughout the brain. But, outside of an autopsy, a definitive diagnosis generally involves the use of radioisotopes or a spinal tap, Koronyo-Hamaoui says.
It is also well known that severe retinal damage is a pathological sign of Alzheimer’s disease. However, until Aβ was implicated, it was difficult to rule out other potential culprits since cell degeneration and vascular abnormalities may occur with many types of ocular and neurological conditions, says Koronyo-Hamaoui.
In addition to associating retinal Aβ aggregation with Alzheimer’s disease, in 2010 Koronyo-Hamaoui and her team showed how they spotted the protein by shining a light on the retinas of animals given curcumin, a compound that binds specifically to Aβ. Subsequently, in a 2017 landmark paper, they described using live imaging of patients given a special oral formulation of curcumin and unique histological techniques to pinpoint Aβ in regions of the human retina not previously studied in Alzheimer’s patients.
Most recently, in a study published in Alzheimer's & Dementia: Diagnosis, Assessment & Disease Monitoring (DOI: 10.1002/dad2.12109), the researchers describe how they used an iPad-operated ophthalmic device (Retia, CenterVue SpA; NeuroVision Imaging) to home in on a predefined anatomical region where Aβ aggregates. Notably, they were able to detect micrometer-sized Aβ deposits, far smaller than the millimeter-scale protein accumulations that can be seen with amyloid PET imaging, Koronyo-Hamaoui notes.
Retia camera-captured images and NeuroVision Imaging software automatically quantified the retinal amyloid count in patients with mild cognitive impairment, an early clinical stage of Alzheimer’s disease. Their retinal amyloid index, a quantitative measure of increased curcumin fluorescence, was found to inversely correlate with hippocampal volumes, Koronyo-Hamaoui explains. While it's normal for the hippocampus to shrink with age, the shrinkage is much more pronounced in people with mild cognitive impairment or Alzheimer's disease.
Participants who had the worst score on a cognitive screening test (Montreal Cognitive Assessment) had higher levels of amyloid deposits in the retina in general, but the most predictive region of Aβ accumulation was in the proximal mid‐periphery field, she adds.
Koronyo-Hamaoui and three of the study’s other co-authors are founders of NeuroVision Imaging, which is running multiple clinical trials across the U.S., Australia, and Europe in hopes of getting their imaging approach cleared for marketing by the U.S. Food and Drug Administration (FDA) as an early warning system for Alzheimer’s disease. Other research groups are focused on the vascular basis of brain degeneration, says Koronyo-Hamaoui, adding that her team is experimenting with structural measurement of retinal vessels plus plaque abundance to better predict Alzheimer’s disease risk.
Quantitative retinal amyloid imaging, as currently envisioned, will be possible on a low-cost, patient-friendly device that can be used by a trained operator—no neurology or ophthalmology expertise required—to noninvasively measure the Aβ burden in patients in under 15 minutes, Koronyo-Hamaoui says. Users will be guided on where to look in the retina for deposits as well as the specific spectral imaging signature of Aβ plaque that distinguishes it from other aggregates that can occur in the brain and retina of aged individuals, including drusen, calcifications, and lipofuscins.
NeuroVision Imaging software embedded in the device automatically selects the eight highest quality images from each eye as well as quantifies the retinal amyloid count, and everything gets transmitted to the cloud, she adds. The test could be periodically repeated to follow the progression of Aβ in central nervous system tissue and assess the transient or longer-term effect of an intervention.
The threshold level of Aβ needed to trigger development of Alzheimer’s disease remains under investigation, she says. But it is accepted worldwide as a hallmark biomarker of this devastating neurodegenerative condition. The goal is to give providers a decision support tool for ruling out an Alzheimer’s diagnosis or moving on to more invasive procedures such as a PET scan or cerebrospinal fluid sampling to rule out vascular or Parkinson’s dementia.
AI Helps Diagnose Parkinson’s
Artificial intelligence (AI) is commonly paired with imaging because, “as the old saying goes, a picture is worth a thousand words,” says Maximillian Diaz, a biomedical engineering Ph.D. student at the University of Florida (UF). Mathematically speaking, an image is a set of pixel coordinates, each carrying color values.
For a recent study co-authored by Diaz, presented as an abstract at this year’s meeting of the Radiological Society of North America (and detailed in a subsequent press release), each 255-by-255 retinal image, with its three color channels, produced roughly 195,000 data points for the AI to analyze. The U.K. Biobank provided 238 images from Parkinson’s patients and a matched number from control (normal) patients; a UF clinic added 72 Parkinson’s images and 28 control images (from spouses and significant others), supplemented by 44 controls from the biobank, Diaz says.
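As a back-of-the-envelope check, the data-point count follows directly from the image dimensions, assuming three color channels per pixel:

```python
import numpy as np

# A placeholder 255-by-255 image with 3 color channels (RGB), standing in
# for a real fundus photo.
image = np.zeros((255, 255, 3), dtype=np.uint8)

# Flattening yields one feature value per pixel per channel.
features = image.reshape(-1)
print(features.size)  # 255 * 255 * 3 = 195,075, i.e. roughly 195,000
```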
The task at hand was to classify Parkinson’s disease based on retina vasculature, using support vector machine learning—a 30-year-old AI tool, he says. Using pictures of the back of the eye, Diaz and his UF collaborators trained an algorithm to detect signs suggestive of Parkinson’s disease, with the key features being smaller blood vessels. The algorithm achieved above 70% accuracy in detecting Parkinson’s disease from both datasets. A manuscript on the study will be submitted for publication in the coming weeks, he adds.
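A minimal sketch of that classification step, using scikit-learn's support vector classifier on synthetic stand-in data (the random feature vectors and labels below are illustrative only, not the study's actual vessel features):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Random vectors stand in for per-image vasculature features;
# labels: 1 = Parkinson's, 0 = control (all hypothetical).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 200))
y = rng.integers(0, 2, size=300)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# A support vector machine, the roughly 30-year-old tool the study used.
clf = SVC(kernel="rbf").fit(X_train, y_train)
score = accuracy_score(y_test, clf.predict(X_test))
print(f"held-out accuracy: {score:.2f}")
```

With real vessel features instead of noise, this is the pattern that produced the above-70% accuracy the study reports.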
The study was under the direction of Ruogu Fang, Ph.D., who leads UF’s Smart Medical Informatics Learning and Evaluation Lab and is spearheading efforts (along with Adolfo Ramirez-Zamora, M.D., at UF’s Center for Movement Disorders) to develop a dataset to validate the findings. The longer-term goal is to fine-tune the technique for predicting disease development and severity as well as diagnosing Parkinson’s.
“The healthcare community is starting to be more open and accepting of AI-based diagnostics, especially for screening and community-level detection,” Fang says. But it could take her team a while to get an FDA-approved diagnostic to market with an algorithm that is robust, explainable, and trustworthy. “We don’t want to just give doctors a result, but also show them how the algorithm makes decisions.”
The research is among a growing body of work focused on new approaches to diagnosing the neurodegenerative condition that causes tremors and rigidity and ultimately robs the body of its ability to move. Many studies of late have involved the collection of raw data and application of machine learning to detect diagnostic signals, says Diaz, including research out of UF’s Center for Movement Disorders on the timing of deep brain stimulation to coincide with when a tremor is about to start.
Elsewhere, researchers are using machine learning to improve the ability to diagnose Parkinson’s from a simple drawing or the way a person walks or their sense of smell, Diaz continues. Another avenue of investigation for eye tracking technology and AI is to spot otherwise undetectable alterations in normal eye movements as a precursor to the development of physical symptoms.
The eyeball is directly connected to the brain not only by the optic nerve but also by the ophthalmic, posterior communicating, and cerebral arteries that feed the brain, Diaz points out. Since arterial blood flow for these vessels connects to and diverges from the carotid artery, downstream changes to one vessel invariably impact the others, even if the causal event in the brain (e.g., blockage or decrease in blood consumption) is unclear.
The “big innovation” with the latest study is using a convolutional neural network (a common deep learning approach) to segment blood vessels in a regular fundus image, followed by support vector classifiers for diagnostics, says Diaz. Fundus images can be captured using equipment commonly available in eye clinics or, as was the case in the proof-of-concept study, a smartphone equipped with a special lens costing about $6,000.
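The two-stage pipeline can be sketched as follows. Here a simple gradient threshold stands in for the trained convolutional segmentation network, and the hand-picked summary features and synthetic images are assumptions for illustration, not the study's actual design:

```python
import numpy as np
from sklearn.svm import SVC

def segment_vessels(image):
    # Stand-in for the trained CNN segmenter: threshold the gradient
    # magnitude to get a binary vessel mask. A real pipeline would run
    # a trained network here instead.
    grad_rows, grad_cols = np.gradient(image.astype(float))
    edges = np.hypot(grad_rows, grad_cols)
    return (edges > edges.mean()).astype(float)

def vessel_features(mask):
    # Summarize the mask as a small feature vector (hypothetical choice):
    # overall vessel density plus coarse row/column density profiles.
    return np.concatenate(
        [[mask.mean()], mask.mean(axis=0)[::16], mask.mean(axis=1)[::16]])

# Synthetic grayscale "fundus" images and random labels, illustration only.
rng = np.random.default_rng(1)
images = rng.random((60, 64, 64))
labels = rng.integers(0, 2, size=60)

# Stage 1: segment vessels; stage 2: classify with a support vector machine.
X = np.array([vessel_features(segment_vessels(im)) for im in images])
clf = SVC().fit(X[:40], labels[:40])
preds = clf.predict(X[40:])
```

The design point is the division of labor: the deep network handles the hard perceptual task (finding vessels), leaving a classical, data-efficient classifier to make the diagnostic call.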
The test can be completed in less than a minute and is far less costly than a CT or MRI machine, says Diaz. That might qualify it to become a yearly screening to catch more cases sooner. The portability of the technology also makes it ideal for resource-limited settings, including places where electricity is in short supply.
Multiple studies have shown you can truly see the brain through the eyes, making them a “promising window” into neurological degeneration and the resulting diseases, says Fang. A simple eye exam when coupled with AI could in fact have broad applicability to the detection of diseases that affect the structure of the brain, among them Alzheimer’s disease and autoimmune diseases such as multiple sclerosis.