Creating a new tool for self-diagnosis
In the spring of 2018, Antonio Mallia, a PhD candidate in computer science at NYU, decided to explore something far removed from information retrieval, his primary research interest: he pitched his teammates the idea of an independent early-detection tool for skin cancer. They were immediately receptive to the notion of empowering users. “The idea is to have a tool that allows a patient to perform a basic skin mole analysis at home, with the use of artificial intelligence,” he says.
Srivastava notes that skin rashes may seem benign, “but they can turn out to be very dangerous if not taken care of. Our plan was to spare patients a visit to the doctor every time they are suspicious about a skin rash or mole. A preliminary scan would be very helpful in that case, just like checking blood pressure.”
“Much of the time,” says Habeeb, “patients aren’t aware of a disease until very late in the process. By the time they visit a doctor, it might be too late. With our application, patients can figure out if they need to see a doctor. Even if it’s a false positive, there is no harm in making that appointment.”
He says that the team first heard about Google’s GPU-powered VMs from Google engineers at the hackathon. “Having that information right away helped us train our model on a complete dataset within the time frame of the hackathon,” he says. (Srivastava adds: “Thanks to the technical mentors from Google, we got started on Google Cloud’s Compute Engine within minutes.”)
Mallia set about designing the app architecture and helped solve specific problems the team encountered along the way.
At HackNYU, they learned how to quickly prototype a cross-platform mobile app (Android/iOS). They also retrained the last layer of Google’s Inception image classification model to categorize images based on their own dataset of skin moles.
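The retraining step the team used is a standard transfer-learning pattern: keep the pretrained network frozen as a feature extractor and fit only a new final softmax layer on its output features. The sketch below illustrates that idea in plain NumPy; it is an assumption-laden illustration, not the team's actual code. The random 8-dimensional vectors stand in for the high-dimensional penultimate-layer activations Inception would produce for each mole photo, and the two classes are a hypothetical benign/suspicious split.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_last_layer(X, y, n_classes, lr=0.5, epochs=200):
    """Fit a new softmax layer on frozen features via gradient descent.

    X: (n_samples, n_features) feature vectors from the frozen network.
    y: integer class labels.
    """
    n, d = X.shape
    W = np.zeros((d, n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]               # one-hot labels
    for _ in range(epochs):
        logits = X @ W + b
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)  # softmax probabilities
        grad = (p - Y) / n                 # cross-entropy gradient
        W -= lr * (X.T @ grad)
        b -= lr * grad.sum(axis=0)
    return W, b

def predict(W, b, X):
    return np.argmax(X @ W + b, axis=1)

# Toy stand-in "features" for two hypothetical classes of mole images.
X = np.vstack([rng.normal(-1.0, 1.0, (50, 8)),
               rng.normal(1.0, 1.0, (50, 8))])
y = np.array([0] * 50 + [1] * 50)

W, b = train_last_layer(X, y, n_classes=2)
accuracy = (predict(W, b, X) == y).mean()
```

Because only the small final layer is trained, this approach needs far less data and compute than training a full network, which is why it fit within a hackathon's time frame.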