Emmanuel Akindele, founder and CEO of Blue Guardian, in Toronto on April 6. Christopher Katsarov/The Globe and Mail

When Emmanuel Akindele was in high school, he was afraid to speak openly about his struggle with anxiety, fearing he wouldn’t receive the support he was looking for.

“I remember the first time I actually shared it with an educator. They straight up just laughed in my face,” he said. “That was fairly disappointing.”

Now an economics student at Western University, Mr. Akindele is the co-founder of a new app, Blue Guardian, that uses artificial intelligence to detect early signs of mental-health issues in youth. He hopes the technology, created with fellow student Kyle Lacroix, can provide the kind of help he couldn’t find when he was younger.

Blue Guardian will launch in Ontario on May 1 to coincide with the start of Mental Health Week in Canada.

Mr. Akindele likens the technology to spell-checking software for mental health. By downloading the app, youth between the ages of 7 and 17 will allow its AI to monitor the text they type on their devices. Any such content, whether in social-media posts, text messages or Google searches, will be observed by the AI for potential mental-health cues.

Instead of focusing on specific words, Mr. Akindele said the AI model the app uses has been trained to pick up on subtle differences in speech patterns between a person with a “healthy mind” and a person struggling with mental-health issues such as anxiety or depression.

Once the text data is collected, the app will provide its user with emotional insights such as “happy,” “sad” or “neutral.” It might also raise flags if the AI has detected signs of depression or anxiety in the language the user types. If flags are raised, the app will also suggest resources, such as a counselling service, based on the data it has collected and the biographical information the user has provided.
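The article does not describe Blue Guardian’s model, thresholds or code, but the flow it outlines – classify typed text, surface an emotional insight, raise flags past some confidence level and suggest resources – can be sketched roughly as below. Every name, score and threshold here is a hypothetical stand-in for illustration, not the app’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Scores:
    sentiment: str    # "happy", "sad" or "neutral"
    anxiety: float    # model confidence, 0.0-1.0
    depression: float

def classify(text: str) -> Scores:
    # Stand-in for the trained model the article describes; a real system would
    # score subtle speech-pattern differences rather than return fixed values.
    return Scores(sentiment="neutral", anxiety=0.1, depression=0.1)

FLAG_THRESHOLD = 0.7  # assumed cut-off, not reported in the article

def emotional_insights(text: str) -> dict:
    scores = classify(text)
    flags = [name for name, value in (("anxiety", scores.anxiety),
                                      ("depression", scores.depression))
             if value >= FLAG_THRESHOLD]
    # Resources are suggested only when a flag is raised.
    resources = ["counselling service"] if flags else []
    return {"insight": scores.sentiment, "flags": flags, "resources": resources}

print(emotional_insights("example typed message"))
```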

The child can subsequently decide if they want to share those emotional insights and flags with their parent by allowing them to scan a QR code available on the app, Mr. Akindele said.

Both the child and the parent will be able to see only the emotional insights and flags on the app. Any text collected by the app is encrypted and completely inaccessible, including to the user and the developers. After the encrypted text is processed and emotional insights are generated, Mr. Akindele said the text is stored for about a week before being deleted.
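The article says the typed text is encrypted, turned into insights, then kept for roughly a week before deletion. A minimal sketch of that retention pattern, assuming a symmetric cipher from the third-party `cryptography` package and a simple timestamped in-memory store (details the article does not specify), could look like this:

```python
import time
from cryptography.fernet import Fernet  # third-party "cryptography" package

RETENTION_SECONDS = 7 * 24 * 60 * 60    # roughly the one-week window described

key = Fernet.generate_key()  # illustration only; a real system would manage keys separately
cipher = Fernet(key)

# (timestamp, encrypted text) pairs; the plaintext is never stored
store: list[tuple[float, bytes]] = []

def ingest(text: str) -> None:
    # Keep only the encrypted form of what the user typed.
    store.append((time.time(), cipher.encrypt(text.encode())))

def purge_expired(now: float | None = None) -> None:
    # Drop anything older than the retention window.
    cutoff = (now if now is not None else time.time()) - RETENTION_SECONDS
    store[:] = [(ts, blob) for ts, blob in store if ts >= cutoff]
```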

Carolyn McGregor, research chair in artificial intelligence for health and wellness at Ontario Tech University, said consent is key when dealing with technology geared towards helping youth maintain their mental health.

Ontario’s Health Care Consent Act states that a person capable of understanding the information relevant to making a decision about treatment of their own mental health is legally allowed to make that decision without a parent or guardian’s consent. This gives young people the agency to choose whether their parents are involved in decisions about their mental health – which Dr. McGregor said is important to keep in mind if a child chooses to download this app onto their device.

Her concerns are less about what information the AI is observing on youth’s devices, and more about what it isn’t picking up on.

“If it’s purely reading text, there’s a whole genre of communication that they utilize that is going to be missed,” she said.

A lot of young people use visualizations such as memes or gifs to communicate, Dr. McGregor said, which this technology would not pick up on. Girls are also more likely to communicate with visuals than boys are, because of differing levels of emotional intelligence, she said, which could introduce questions of bias in the AI’s data-collection methods.

Misty Pratt, a parent of two children aged 10 and 13, said this technology could help monitor her children’s activities online. Right now, her eldest has a phone with TikTok. Ms. Pratt said she also has an account on the social-media app to share videos with her daughter and keep an eye on what she’s posting – but she wouldn’t mind the extra help.

With her children’s consent, Ms. Pratt said she would consider downloading Blue Guardian onto their phones to gain a better understanding of their mental health. She has previously waited close to a year for an appointment with a psychologist for one of her children, and if this app could help her avoid having to seek professional help again in the future, she said she would welcome that.

“If you let it build and build and worsen and worsen, that’s when things can get really bad,” she said. “But if you’re able to get in there a little bit earlier and give them the tools they need to cope with those big feelings … the hope is it doesn’t progress into anything more severe.”
