U of T startup finds 400 subtle neurological health markers

Posted by PanamericanWorld on April 03, 2017

Faculty of Arts & Science PhD student Katie Fraser discovered hundreds of speech and language markers for neurological health. Now, she’s a partner in WinterLight Labs – a new business venture with her colleagues at the Department of Computer Science, Liam Kaufman, Maria Yancheva and Assistant Professor Frank Rudzicz, who is also a rehabilitation scientist at the University Health Network. With combined expertise in linguistics and machine learning, their team has built a tool that pinpoints early signs of Alzheimer’s disease and dementia with high accuracy.

Coming to market soon, the tool will help clinicians track patients’ health objectively and start therapy promptly. Eventually, WinterLight plans to extend the technology to tests for depression and anxiety.

“Speech and language are among the most accurate lenses into somebody’s state of mind,” says Rudzicz.

How did research conducted at U of T lead to the creation of WinterLight Labs?

Frank Rudzicz: This research really started with Katie Fraser, who was working on aphasia – a loss of the ability to understand or express speech that can be caused by brain damage or stroke. She was applying machine-learning methods to analyze the words and grammar that might indicate the disorder.

Then we moved into Alzheimer’s disease because of its increasing prevalence worldwide and the need to assess and monitor individuals who might be at risk of developing it, so that they can receive better care.

[Master’s student] Maria Yancheva expanded a lot of that work, looking at the meanings of words and topics, and how language changes over time for individuals with language disorders.

Then, by extreme coincidence, the stars aligned and Liam Kaufman, who had a lot of experience in both computer science and Alzheimer’s disease, joined us. He also had entrepreneurial experience, so it was a perfect mix to transfer this research out of the lab at U of T and into our startup, WinterLight Labs.

Why focus on speech? What does it reveal about our health?

Frank Rudzicz: We view speech and language as one of the most accurate lenses into a person’s state of mind. We’re focusing now mainly on pathologies and dementia, but speech and language are such a rich source of information that we are excited about the potential to look at mood and other aspects of someone’s well-being.

Can you describe the tool WinterLight has created?

Liam Kaufman: As we’ve been exploring and meeting with clinicians, we’ve realized there is a range of possibilities with this technology. Step 1 is quantifying speech and language. Step 2 is diagnosing, and Step 3 is predicting dementia and Alzheimer’s. We’re breaking these goals into stages so that we can deliver something useful right now and progressively achieve bigger things over time.

Right now, the people who are most interested in quantifying speech and language are speech-language pathologists. They work with people with speech impairments: someone who’s had a stroke, someone with aphasia, a traumatic brain injury or a concussion.

To understand the nature of a patient’s speech or language impairment, they do a number of different assessments. One of them is called a picture-description task. The patient looks at a picture for two minutes and talks about everything they see. Then, the speech-language pathologist spends 30 minutes transcribing what the patient has said and scoring it.

With our technology, we can take this task and do it for them. The assessments will be quicker and evidence-based, and therefore more objective. It’ll also enable clinicians to see more patients, or help patients avoid a visit to a tertiary care centre altogether.
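As a rough illustration only, the sketch below shows the kind of lexical scoring such a tool might automate. It is not WinterLight’s actual pipeline: the function and feature names are hypothetical, and the measures shown (type-token ratio, filler rate, word length) are simply standard examples of what a clinician would otherwise tally by hand from a transcript.

```python
# A minimal, hypothetical sketch of automated transcript scoring; not
# WinterLight's actual pipeline. Given the transcript of a picture-description
# task, it computes a few standard lexical measures of the kind a
# speech-language pathologist would otherwise tally by hand.
import re

FILLERS = {"um", "uh", "er", "hmm"}

def score_transcript(transcript: str) -> dict:
    """Compute simple lexical features from a raw transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return {"word_count": 0}
    return {
        "word_count": len(words),
        # Vocabulary richness: unique words / total words.
        "type_token_ratio": len(set(words)) / len(words),
        # Hesitation rate: proportion of filler words.
        "filler_rate": sum(w in FILLERS for w in words) / len(words),
        # Average word length as a crude proxy for lexical complexity.
        "mean_word_length": sum(len(w) for w in words) / len(words),
    }

if __name__ == "__main__":
    sample = "Um, there is a boy, uh, taking a cookie from the cookie jar."
    print(score_transcript(sample))
```

A real system would, of course, put automatic speech recognition in front of this step and extract many more acoustic and syntactic features.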

So, is the product software?

Liam Kaufman: Yes, it’s a service. The general form of the assessment right now is on a tablet, but it could be on a desktop computer. It could potentially even be done over the phone.

Someone just has to ask the patient or subject to describe a picture they’re looking at, which could be on a tablet or computer, or to speak spontaneously in some other way. It doesn’t work as well if someone is just reading text off of a page.

Right now, we use one picture to conduct the assessments, but it could be any image – even family photographs – to make it an activity a patient wants to do on a regular basis.

We’re piloting [our speech- and language-quantifying technology] with several senior care homes in Ontario, and soon in Nova Scotia as well. They are interested in using it to monitor the progress of the disease in patients. They want to provide better care. The first phase, which is for clinician use, could launch as soon as three months from now. The second phase (diagnosis) may take eight months to a year, and the third phase (prediction) will happen when the data permits.

How accurate are your tests in diagnosing and predicting dementia and Alzheimer’s?

Liam Kaufman: We’ve achieved 90 per cent accuracy for the diagnosis of dementia. But right now, in our first phase, we’re focusing on simply quantifying speech and language.

If someone is very early on in the course of the disease, there are some medications that have a very small impact for some people. Getting medications to those people sooner can be beneficial so they can lead a fuller life for at least those first few years [after early diagnosis].

It also helps [patients and their families] with planning. While they’re still of sound mind, they can do things like go on vacations, write a book or do financial planning before it’s too late. Knowing in advance is helpful.

From a business standpoint, pharmaceutical companies are really interested in early detection. In clinical trials, they’ve administered test treatments to people with advanced dementia, [but the drugs had] no impact. Then, they started looking at people with mild dementia – still no impact. And now they’re looking at people with pre-dementia.

Basically, the earlier you administer these medications, the bigger the impact on quality of life. [Pharmaceutical companies] really want assessments [like ours] that can identify people a year, two years or even decades in advance, so they can start giving these drugs to them as soon as possible.

Frank Rudzicz: Eighty-one per cent is probably our most conservative baseline… We’re getting to a point where machine learning [artificial intelligence] is about as accurate as people are at doing this. There’s a bit of variance in how accurate clinicians are in diagnosing Alzheimer’s disease – I think about 86 per cent of Alzheimer’s diagnoses are confirmed post-autopsy.

What is it about artificial intelligence/machine learning that makes your technology possible?

Frank Rudzicz: One of the key things is that computers allow for the collection of vast amounts of data. Algorithms allow us to analyze these data in a quantitative way. Increasingly, modern machine learning and modern artificial intelligence allow us to combine those two things. That gives us the ability to find patterns in data that humans couldn’t find before.

Even with a couple hundred individual samples, we could pull out certain aspects of speech and language that seem to be indicative of cognitive disorders, which no literature had ever really identified before.
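As a rough illustration of that combination of data and algorithms, the toy sketch below trains a simple classifier on linguistic features and scores it with cross-validation. It is an assumption-laden example, not WinterLight’s model: the feature matrix is synthetic placeholder data, and logistic regression merely stands in for whatever model a production system would use.

```python
# A toy sketch of the general approach described above: training a
# classifier on linguistic features. NOT WinterLight's model; the data
# below is a synthetic placeholder, and logistic regression stands in
# for whatever model a real system would use.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder: 200 samples x 3 features (e.g. type-token ratio, filler
# rate, mean word length), with labels 0 = control, 1 = impairment.
X = rng.normal(size=(200, 3))
y = rng.integers(0, 2, size=200)

# Cross-validation estimates how well the features separate the groups;
# on random data like this it should hover around chance (about 0.5).
scores = cross_val_score(LogisticRegression(), X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```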

What WinterLight does is new to most people. Is the field crowded with competitors?

Liam Kaufman: When we first started [in September 2015], I did a competitor analysis and there was one company. Over the last year and a half, a couple more have popped up.

It’s not just AI that’s hot right now. It’s also speech. You’re probably familiar with Amazon Echo? It’s a little cylinder that listens to you at all times of the day and night. You can talk to it. And there’s Siri on the iPhone, and Google has its own AI chatbot-type thing. All of these companies are investing heavily in speech, and a lot of people are saying this is going to be the next UI [user interface].

I think we’re going to be at a point sometime in the future where people are talking to computers to interact with them much more than we do now. It seems natural to use that information for other things – like diagnosing and detecting. If someone is talking to their Amazon Echo and could be diagnosed [with dementia or Alzheimer’s] 20 years in advance, that could be incredibly useful.

What’s it been like to see your research transform into a startup? Did you ever think you would be in business?

Katie Fraser: Absolutely not! This was totally unexpected for me. When I first started my PhD, I thought I wanted to be a pure academic and just stay in the lab and do my research.

Once I saw the potential for this kind of research and the impact it could have, it took a little bit of convincing [to start a business] because it was very unfamiliar to me. But it was one of the best decisions, I think, that I’ve ever made. It’s been incredibly fulfilling and motivating to actually see the work translate into something that real people are using now.

Maria Yancheva: I’m just really excited about the possibilities in the future. There is no treatment for Alzheimer’s right now, but I think the path towards finding a treatment is to collect a lot more data, to detect it early on and then learn more in the earlier stages.
