A professor’s desire to help a student with impaired eyesight leads to a project that could affect countless others.
Jimmy Vivar navigates the Loyola Marymount University campus with a white cane in his hand and a smile on his face.
The sophomore civil engineering major has severely limited vision. He wears wire-rim glasses, but they do little to alleviate the worst of the problems.
Since childhood, Vivar’s right eye has been afflicted with amblyopia, commonly known as lazy eye. The eye wanders and is, for all practical purposes, blind. Both eyes are affected by retinitis pigmentosa, an inherited disease that causes a gradual decline in night and peripheral vision.
Hossein Asghari, Ph.D., could tell that Vivar was struggling to read the whiteboard in his electric circuits course.
“I saw his potential and wanted to help him,” Asghari, a visiting assistant professor of electrical engineering and computer science in the Frank R. Seaver College of Science and Engineering, said of his student.
An app developer whose field is optics, Asghari spotted an opportunity to help Vivar and others with impaired vision by using cost-effective technology typically reserved for gaming and entertainment.
Asghari seized on the idea of using Google Cardboard, a fold-out headset that holds a mobile phone and works with virtual reality and augmented reality apps. The cost? $12.
In his free time, Asghari looked to existing open-source software applications with augmented reality features. He verified the usefulness of those features with Vivar: the ability to expand his field of view (simulating a fisheye lens) and to adjust the image so that lines on a board appear thicker or darker.
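For a sense of what that kind of adjustment can look like in software, here is a minimal sketch of one way to thicken and darken lines in a single image, assuming the open-source OpenCV library. The function name and parameter values are illustrative assumptions, not the app Asghari and Vivar tested.

```python
# Illustrative sketch only -- not the LMU project's code.
# Thicken and darken line work (e.g., whiteboard writing) in one frame using OpenCV.
import cv2
import numpy as np

def thicken_and_darken_lines(frame, thickness=3, darkness=0.6):
    """Return a copy of `frame` with detected edges dilated and darkened."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Find line-like features such as board writing and outlines.
    edges = cv2.Canny(gray, 50, 150)
    # Dilate the edge map so thin strokes become thicker.
    kernel = np.ones((thickness, thickness), np.uint8)
    mask = cv2.dilate(edges, kernel, iterations=1)
    # Darken the original image wherever the thickened edges fall.
    out = frame.copy()
    out[mask > 0] = (out[mask > 0] * (1.0 - darkness)).astype(np.uint8)
    return out

if __name__ == "__main__":
    img = cv2.imread("whiteboard.jpg")  # hypothetical input image
    cv2.imwrite("enhanced.jpg", thicken_and_darken_lines(img))
```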
This semester, Asghari is taking the project to the next level. He has the support of his department and the help of two talented electrical engineering seniors with concentrations in computer engineering. For their senior capstone project, Sean Cunniff and Patrick Foster will bring together visually impaired students across LMU to serve as a focus group and then develop a prototype app.
“In a nutshell,” Foster said, “what we’re doing is developing an app to run on a mobile device that will fit into a headset. …It’s a deviation from what most people know as virtual reality. Instead of creating an artificial world, you have a live feed of what’s happening around you, augmented through software.”
Cunniff elaborated: “For people who are near-sighted, it’s a refraction problem. All it takes is simple lens modification. For the people we’re targeting, that solution will not work. We’re going specifically for people whose problems cannot be solved with the use of glasses or contact lenses.”
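Conceptually, the "live feed, augmented through software" that Foster describes could be approximated by a camera loop that processes each frame and renders it twice, side by side, for a Cardboard-style viewer. The sketch below is only an illustration under those assumptions, not the capstone prototype; the enhance step stands in for whatever augmentation the team develops.

```python
# Illustrative sketch only -- an assumed approximation of a live, software-augmented
# camera feed for a Cardboard-style headset, not the students' prototype.
import cv2
import numpy as np

def enhance(frame):
    """Placeholder augmentation step (e.g., the line-enhancement sketch above)."""
    return frame

def run_headset_preview(camera_index=0):
    cap = cv2.VideoCapture(camera_index)  # phone rear camera or webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        augmented = enhance(frame)
        # Show the same augmented view twice, side by side, the way a
        # Cardboard viewer presents one image to each eye.
        stereo = np.hstack([augmented, augmented])
        cv2.imshow("headset preview", stereo)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    run_headset_preview()
```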

The two seniors are finding the process stimulating and a bit nerve-racking.
“It’s quite an undertaking because it’s never been done before,” Foster said. “This is really new.”
Asghari acknowledged that the seniors have a challenging task ahead. “It’s not an undergraduate-level project,” he said emphatically. “It’s Ph.D. level.”
Still, the three are excited by the prospect of creating a tool that could help millions of Americans with visual impairments.
“We think this could have a powerful educational use,” Asghari said. “This is just the beginning.”