Identifeye Health, a startup looking to make retinal imaging more accessible, is preparing for the launch of its first product. The company plans to start selling a retinal camera in the first quarter of 2025, and is preparing a separate regulatory submission for an algorithm to screen for diabetic retinopathy, CEO Vicky Demas said.
The condition, a complication of diabetes, is one of the leading causes of blindness in working-age adults, according to the Centers for Disease Control and Prevention. However, it can be prevented with early treatment.
Demas joined the Redwood City, California, and Guilford, Connecticut-based startup — formerly known as Tesseract Health — in 2021. The company raised $80 million in funding that year. Prior to that, Demas was a platform product manager at cancer testing company Grail and was one of the first five members of Google’s Life Sciences team, now known as Verily.
Demas spoke with MedTech Dive about why she joined Identifeye and what’s next for the company.
This interview has been edited for length and clarity.
MEDTECH DIVE: Tell me about your background. How did you start working on retinal imaging?
VICKY DEMAS: I have developed and launched products or platforms in several distinct areas of diagnostics, from infectious disease at the point of care to leveraging AI tools for cancer diagnostics to surgical robotics.
I was part of the team that started the life sciences initiative at Google X, which we then spun out and rebranded as Verily. And that's when I was introduced to the idea of the power of the retina as a diagnostic platform.
It fascinated me because, if you can automate retinal screening and then use image analytics on top of it, you can see how we can really democratize access. After Verily, I left to work on cancer diagnostics at Grail with some very dear colleagues and mentors.
How did you get started with Identifeye?
About 3 ½ years ago, I was contacted by Jonathan Rothberg, the founder of an incubator called 4Catalyzer. He had started a company called Tesseract Health.
He had heard about me from the lead investor of the last funding round. I believed in this space, we had credible investors, Jonathan is a very charismatic individual and I felt that I could come in and help solve a problem that I felt super passionate about.
We did a little bit of a focus reset, built the right team and then rebranded the company to Identifeye Health.
What problems are you looking to solve?
Not everybody goes to see a retinal specialist; maybe you go to an optometrist when you need to renew a prescription. There is a really big problem in this space, specifically for patients living with diabetes.
About 30% of people with diabetes actually develop diabetic retinopathy and it is the leading cause of blindness for adults of working age.
There is a screening recommendation to go and get an exam, but there is very low compliance because it is inconvenient and because of a lack of awareness.
We’re trying to enable retinal screening closer to patients to address some of these problems. We want to do this by leveraging AI and automation to make it so simple that it fits in with existing workflows.
For example, a medical assistant, while setting up the patient and taking vitals, can also capture good-quality retinal images that can then be interpreted initially with a teleretinal service. We're also working on future products that leverage AI to do the analysis as well, so you can directly give patients actionable feedback.
We really want to bring it to primary care, because the idea is that 80% to 90% of patients living with diabetes will go to primary care to get their A1c checks and prescription refills.
What does retinal screening currently look like, and how is your device different?
Taking a good photo of the retina is like looking through a keyhole and being in the right position to focus and take a photo of the back of the room. So with the patient, you have to make sure that their eyelids are open, that you can align with their pupil and then focus and capture good-quality images of the retina.
You also have to make sure that they’re looking in the right direction, because if their gaze is drifting, then you might not capture the right field of view.
There's a lot of manual knowledge, skill and retention that we are specifically trying to substitute for, and we're also using AI to give guidance.
We wanted the first impression you have as a patient coming in to be a friendly, fun device. So you will hear this audio that walks you through and says, “Now we're going to take four images.”
And then we will tell you, “You are going to see a red dot,” and the AI guidance is very specific, because you have to look at it and fixate on it. It will tell you that there is a bright flash that happens because we are taking a photo.
How does the AI screening component work?
Our first AI screening product is going to be something that can look at an image and say, “Does this patient have more than mild diabetic retinopathy? Should we refer them to see a specialist?”
We've licensed hundreds of thousands of images to teach the algorithm what images from a person with diabetes who doesn't have diabetic retinopathy look like versus images from patients at different stages of the disease. We are currently finishing that development. We're also preparing to operationalize our validation and precision studies, which will be used for a 510(k) submission.
What are the next steps for the camera part of the device?
We have finished the development path, and we are working with a contract manufacturer. We are preparing for that first pilot batch that we’ll use for commercialization.
We did submit a 510(k) for the camera itself, and the FDA told us that without the screening AI, we can actually market the device as an exempt device. We are preparing for that and just announced that we plan to commercialize in early Q1.
What feedback have you gotten from patients and clinicians so far?
A high-volume optometry clinic in New York has helped us with some of our studies. They recommended adding different languages, because for a lot of patients in the clinic, English isn't their first language, so the team is working on a Spanish version. People are also very excited to see that we are listening and that we're incorporating feedback.