By Jill Goetz, College of Engineering
University of Arizona researchers are developing technology that converts smartphones into powerful eye-examining instruments that could prevent millions of people from going blind.
Wolfgang Fink, professor of electrical and computer engineering and biomedical engineering, is principal investigator of a new project, funded by the National Science Foundation’s Partnerships for Innovation: Building Innovation Capacity program, to create “smart ophthalmoscopes,” specialized instruments for examining various parts of the eye’s interior. The devices, which attach to any smartphone, and their accompanying software will enable health care providers, particularly in remote areas, to quickly and easily determine whether patients are at risk of losing their vision.
“Our hand-held ophthalmoscopes will permit eye exams in places where they would otherwise be impossible,” said Fink, the Edward and Maria Keonjian Endowed Chair and director of the UA Visual and Autonomous Exploration Systems Research Lab. “These are not passive recording instruments, but investigational tools with sophisticated data-processing and analytical capabilities.”
Fink said the devices would be comparable to the eye exam equipment typically found in an eye doctor’s office, yet affordable, highly portable and easy to use.
“All that’s needed is a person on a bicycle with a smart ophthalmoscope. They can visit and examine clients of any age, in any language -- anywhere, anytime. No trucks, heavy equipment or extensive training required. I believe this portable vision-screening capability will revolutionize the availability and economy of rural health care, and the field of ophthalmology at large.”
Here’s how it works: The user -- who might be a health care provider, aid worker, nurse, paramedic or caregiver -- attaches the ophthalmoscope to a smartphone, points it at the eye and takes a picture. Taking advantage of the smartphone’s ability to take high-resolution pictures, the ophthalmoscope captures detailed images of the interior segments and surfaces of the patient’s eye, with no need for dilating eye drops, chin rests or other gear typically used for an eye exam.
Next, the user runs a custom app on the smartphone that relays these images to a remote “expert system” -- which uses intelligent software to suggest diagnoses much like a human medical expert -- for processing and analysis. In seconds, the results are relayed back to the user and displayed on the smartphone’s screen.
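The relay described above follows a simple client-server pattern. The sketch below illustrates that round trip in Python; all function names, fields and the stubbed “expert system” response are illustrative placeholders and are not taken from the project’s actual software, which is not public.

```python
import base64
import json

# Illustrative sketch only: the real system's APIs, data formats and
# image-analysis logic are not public; everything here is a placeholder.

def package_exam(image_bytes: bytes, patient_id: str) -> str:
    """Phone side: bundle the captured eye image for upload."""
    return json.dumps({
        "patient_id": patient_id,
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    })

def expert_system(payload: str) -> dict:
    """Server side (stub): in the real system, image-analysis algorithms
    would process the image and suggest findings, much like a human
    medical expert. Here we return a canned placeholder assessment."""
    exam = json.loads(payload)
    return {
        "patient_id": exam["patient_id"],
        "assessment": "refer for specialist follow-up",  # placeholder
    }

# Round trip: phone -> remote expert system -> phone
report = expert_system(package_exam(b"<raw image bytes>", "patient-001"))
print(report["assessment"])
```

In the actual workflow the payload would travel over a network connection and the results would be rendered on the smartphone’s screen; the stub simply makes the two-way relay concrete.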
A single health care provider could conduct as many as 100 initial assessments in one day and immediately put patients on the fast track to accurate diagnosis and treatment for potentially vision-robbing ailments.
Fink stressed that smart ophthalmoscopes are no substitute for examinations and diagnoses by a trained eye specialist. However, in the absence of such a specialist, he said, people in the field can make initial assessments -- flagging suspected cataracts or glaucoma, for example -- and refer patients for follow-up.
The National Science Foundation has awarded $800,000 for the three-year research project through its Partnerships for Innovation: Building Innovation Capacity program. The transdisciplinary study has three main parts, to be tackled in parallel:
• In collaboration with an optical engineering design firm, Fink is designing and building prototype smartphone attachments that will soon be tested on patients in the UA College of Medicine, under the direction of Dr. Joseph Miller, the project’s co-investigator and head of the Department of Ophthalmology and Vision Science.
• Senior research scientist Mark Tarbell and Fink will create a framework for a central expert system that can extract, process and analyze the data from smartphones and relay information back to the smartphones.
• Fink and Tarbell will implement image analysis algorithms to provide medical reports that will help ophthalmologists and other eye-care specialists make diagnoses and recommendations for patients.
In 2012 Fink was inducted into the College of Fellows of the American Institute for Medical and Biological Engineering. He holds more than a dozen issued patents and several pending patent applications -- many for vision-related products -- some of which constitute the background intellectual property for this project.
Fink has brought several partners on board for the new NSF project. They include Breault Research Organization, an optical engineering design firm; the Center for Military Medicine Research at the University of Pittsburgh; Tech Launch Arizona; and Caltech, where he holds an appointment as visiting associate in physics.
The Vanguard of Telemedicine
Fink is a pioneer of teleophthalmology, a fast-growing branch of telemedicine that merges mobile technology and medical services. Smartphones are already being used to monitor blood pressure, blood glucose levels and heart rate. Soon they may be widely used to assess eye health not only on Earth, but also on long-duration space missions and even on the International Space Station; Fink has made presentations to NASA proposing his visual field test for use on the station.
Fink’s work in telemedicine recently led to an invitation to participate in a panel, Telemedicine Pioneers, at the Western Pennsylvania Healthcare Summit on Nov. 18 outside Pittsburgh, Pennsylvania, and to present at an invitation-only MIT-NSF workshop, Smarter Service Systems, on Nov. 20-21 in Cambridge, Massachusetts.
Biomedical engineering major Jerri-Lynn Kincade is participating in the UA research project for her senior design project.
“Dr. Fink has had a huge impact in helping me pursue my career goal of applying biomedical engineering to improve people’s lives,” said Kincade, a Black Alumni Scholar and vice president of the UA chapter of the National Society of Black Engineers, which Fink advises. “I’m getting valuable hands-on experience. Not only am I learning about the functional requirements for developing a biomedical device, I am also learning about software and hardware development and different systems processes.”