Computer science student creates innovative, award-winning glucose monitoring app
Bryan Chiang, a first-year computer science student, developed EasyGlucose, an application that allows diabetic patients to monitor their glucose levels on their phones. He won first place in the Microsoft Imagine Cup in early May for the application. (Creative Commons photo by ImagineCup via Flickr)
May 31, 2019 12:42 a.m.
A UCLA student developed an application that allows diabetic patients to monitor their glucose levels on their phones.
Bryan Chiang, a first-year computer science student, created EasyGlucose, a mobile application designed to monitor diabetic patients’ glucose levels using a picture of their eye taken on their smartphone. Chiang won first place in the Microsoft Imagine Cup, an international competition for computer science students, in early May for the application.
EasyGlucose allows diabetic patients to forgo traditional methods of monitoring their glucose levels, such as pricking their finger to acquire a blood sample.
Glucose levels in the human eye are correlated with the glucose levels in blood, Chiang said.
Changes in the blood glucose level lead to changes in the glucose concentration in the fluid of the eye, said David Myung, an assistant ophthalmology professor at the Byers Eye Institute at Stanford.
“This leads to very subtle differences in how the iris appears,” Myung said in an email statement. “(Chiang’s) idea is to use artificial intelligence to analyze these fluctuations and to one day measure blood glucose with a simple photograph.”
Researchers have tried devising methods that can predict glucose levels based on features of the eye. However, most of these methods have relied on large machinery or lasers to observe the eye, Chiang said.
These approaches are less portable and less accurate, Chiang said. Furthermore, the complicated machinery can make such systems less reliable, since they depend on intricate components that can fail more easily.
Chiang said the method behind EasyGlucose is more effective because it utilizes deep learning, a subset of machine learning, to accurately analyze and classify images of the eye.
“That was something unique that I brought to the problem because most of the people working on glucose monitoring are not going to be from a computer science background,” Chiang said.
He said his background in computer science allowed him to come up with a more effective method of noninvasive glucose monitoring. Chiang was inspired to create this app when he found out his grandmother had been diagnosed with diabetes. He said he saw this as an opportunity to apply his knowledge of technology and computer science.
For the project, Chiang said he entered around 15,000 images of an eye along with the person’s glucose levels at the time the images were taken into the machine learning algorithm.
“Because the eye is so complex with so many structures, we don’t really know which parts are going to be most predictive of your blood sugar level,” Chiang said. “What we do with deep learning is that we actually just give it a bunch of eye images; we tell it, ‘On your own, go find which parts of the image and structures inside these eye images are going to be most predictive of blood sugar levels.'”
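The setup Chiang describes is standard supervised learning: eye images paired with measured glucose values, and a network left to discover which image features predict blood sugar. The sketch below illustrates that idea on synthetic data; the tiny one-hidden-layer numpy network, the image size and all hyperparameters are illustrative assumptions standing in for the real deep model, which the article does not specify.

```python
import numpy as np

# Illustrative sketch only: a toy stand-in for the kind of supervised
# training Chiang describes. Each "image" is paired with a glucose label,
# and a small network is fit to predict glucose from pixels. Shapes,
# data, and hyperparameters are invented for demonstration.

rng = np.random.default_rng(0)

n_images, pixels = 200, 64            # e.g. 8x8 grayscale crops, flattened
X = rng.normal(size=(n_images, pixels))
true_w = rng.normal(size=pixels)
glucose = X @ true_w + 100.0          # synthetic labels, roughly mg/dL scale

# One hidden layer with ReLU, trained by gradient descent on squared error.
hidden = 32
W1 = rng.normal(scale=0.1, size=(pixels, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=hidden)
b2 = 0.0

def forward(X):
    h = np.maximum(0.0, X @ W1 + b1)  # hidden activations (ReLU)
    return h, h @ W2 + b2             # predicted glucose per image

lr = 1e-3
_, pred0 = forward(X)
initial_loss = np.mean((pred0 - glucose) ** 2)

for _ in range(500):
    h, pred = forward(X)
    err = pred - glucose                      # prediction error per image
    # Backpropagate the squared-error gradient through both layers.
    gW2 = h.T @ err / n_images
    gb2 = err.mean()
    dh = np.outer(err, W2) * (h > 0)          # gradient through the ReLU
    gW1 = X.T @ dh / n_images
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X)
final_loss = np.mean((pred - glucose) ** 2)
```

The point of the sketch is the workflow, not the architecture: the model is never told which parts of the eye matter; it only sees (image, glucose) pairs and adjusts its weights to reduce prediction error, which is what lets a deep network find the predictive structures on its own.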
Matthew Freeby, an assistant clinical professor of medicine at the David Geffen School of Medicine and director of the Gonda Diabetes Center, said he thinks patients might be interested in a less invasive way to measure blood sugar.
“If this were to come out for human use and it was noninvasive, it would definitely be a breakthrough and absolutely wonderful for our patients,” Freeby said.
Chiang said he hopes to gather more data to increase the accuracy of EasyGlucose and eventually publish his research. Chiang added he hopes to begin clinical trials this summer, seeking approval from the Food and Drug Administration before releasing EasyGlucose for commercial use.