<span class="field field--name-title field--type-string field--label-hidden">How AI is helping to predict and prevent suicides</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>noreen.rasbach</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2018-03-28T18:04:53-04:00" title="Wednesday, March 28, 2018 - 18:04" class="datetime">Wed, 03/28/2018 - 18:04</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">A 2018 pilot project between the Public Health Agency of Canada and Advanced Symbolics will use social media posts as a resource to predict regional suicide rates (photo by Shutterstock)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/sidney-kennedy" hreflang="en">Sidney Kennedy</a></div> <div class="field__item"><a href="/news/authors-reporters/trehani-m-fonseka" hreflang="en">Trehani M. 
Fonseka</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-medicine" hreflang="en">Faculty of Medicine</a></div> <div class="field__item"><a href="/news/tags/health" hreflang="en">Health</a></div> <div class="field__item"><a href="/news/tags/psychiatry" hreflang="en">Psychiatry</a></div> <div class="field__item"><a href="/news/tags/research-and-innovation" hreflang="en">Research and Innovation</a></div> <div class="field__item"><a href="/news/tags/technology" hreflang="en">Technology</a></div> <div class="field__item"><a href="/news/tags/conversation" hreflang="en">The Conversation</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">The Conversation with U of T's Sidney Kennedy and Trehani M. Fonseka</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Suicide is a growing public health concern. 
<a href="https://www.canada.ca/en/public-health/services/publications/healthy-living/suicide-canada-infographic.html">In Canada, suicide claims 4,000 lives each year</a> – that is 10 lives per day.</p> <p>For every one of these suicide deaths, there are five people hospitalized following self-injury, 25 to 30 suicide attempts, and seven to 10 people affected by the loss, according to analysis by the Public Health Agency of Canada.</p> <p>Suicide rates are highest among certain groups – such as <a href="http://www.who.int/mediacentre/factsheets/fs398/en/">Indigenous Peoples, immigrants and refugees, prisoners and the lesbian, gay, bisexual, transgender, intersex (LGBTI) community</a> – and are on the rise.</p> <p>The impacts of suicide are felt widely. The Toronto Transit Commission (TTC) recently reported an increase in transit suicides at the end of 2017, with eight attempts in December alone, and <a href="https://www.theglobeandmail.com/news/toronto/rise-in-toronto-subway-suicides-takes-a-toll-on-ttc-staff/article37954198/">a corresponding rise in rates of stress leave by TTC employees, due to the toll this took on staff</a>.</p> <p>Could artificial intelligence (AI), or intelligence demonstrated by machines, possibly help to prevent these deaths?</p> <p>As researchers in psychiatry with the Canadian Biomarker Integration Network in Depression, we are collecting clinical and biological data during treatment interventions for people with major depression. 
We are exploring early clues to changes in behaviour and mood states using mobile health technologies.</p> <p>One of our goals is to identify early predictors of relapse and of increased risk of suicidal behaviour.</p> <p>Here we review other promising applications of AI to suicide prevention, and draw attention to the barriers within this field.</p> <h3>AI predicts suicide rates</h3> <p>Early in 2018, the Public Health Agency of Canada announced <a href="http://www.cbc.ca/news/canada/nova-scotia/feds-to-search-social-media-using-ai-to-find-patterns-of-suicide-related-behaviour-1.4467167">a pilot project with Advanced Symbolics</a>, an Ottawa-based AI company <a href="http://www.metronews.ca/news/ottawa/2016/11/21/ottawa-firms-us-election-prediction-not-far-off-mark.html">that successfully predicted Brexit, Trump’s presidency and the results of the 2015 Canadian election</a>.</p> <p>The project will <a href="https://globalnews.ca/news/3942921/ottawa-backs-artificial-intelligence-program-looking-for-early-suicide-warning-signs/">research and predict regional suicide rates by examining patterns in Canadian social media posts</a>, including suicide-related content, although user identity will not be collected.</p> <p>The program will not isolate high-risk cases or intervene at the individual level. 
Instead, findings will be used to inform mental health resource planning.</p> <h3>Facebook alerts emergency responders</h3> <p>In 2011, Facebook developed a manual suicide reporting system where users could upload screenshots of suicide content for review.</p> <p>In 2015, the system <a href="http://www.huffingtonpost.ca/entry/facebook-suicide-prevention_n_6754106">allowed users to “flag” concerning content</a>, which would prompt Facebook staff to review the post and respond with supportive resources.</p> <p>Due to the tool’s success, <a href="http://www.businessinsider.com/facebook-suicide-prevention-artifical-intelligence-2017-11">Facebook has begun expanding its AI capabilities to automatically detect suicide-related content and alert local emergency responders</a>. There are also more <a href="https://www.cnbc.com/2018/02/21/how-facebook-uses-ai-for-suicide-prevention.html">language options, and an extension into Instagram</a>.</p> <h3>Chatbots give therapy for depression</h3> <p>AI has been used in health care since the 1990s <a href="https://www.cnbc.com/2017/05/11/from-coding-to-cancer-how-ai-is-changing-medicine.html">to improve disease detection and various indices of wellness</a>.</p> <p>Within mental health, AI has enhanced the speed and accuracy of diagnosis, and applied “decision trees” to guide treatment selection.</p> <p>A new approach to “therapy” involves conversational bots (or chatbots), which are computer programs designed to simulate human-like conversation using voice or text responses.</p> <p>Chatbots can deliver psychological interventions for depression and anxiety based on cognitive behavioural therapy (CBT). Since chatbots respond uniquely to the dialogue presented to them, <a href="https://www.verywellmind.com/using-artificial-intelligence-for-mental-health-4144239">they can tailor interventions to a patient’s emotional state and clinical needs</a>. 
<a href="http://onlinelibrary.wiley.com/doi/10.1111/exsy.12151/full">These models are considered quite user-friendly</a>, and the chatbots’ user-adapted responses have been well reviewed.</p> <p>Similar technology is being added to smartphones to allow voice assistants, like the iPhone’s Siri, to recognize and respond to user mental health concerns with appropriate information and supportive resources. However, <a href="https://www.ctvnews.ca/sci-tech/i-don-t-know-what-you-mean-how-siri-responds-to-questions-about-rape-1.2816383">this technology is not considered reliable and is still in its preliminary stages</a>. Other smartphone applications even <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5222787/">use games to improve mental health-care education</a>.</p> <p>AI technology has also been integrated into suicide management to improve patient care in other areas. AI assessment tools have been shown to predict short-term suicide risk and make treatment recommendations that are as good as those of clinicians. The <a href="https://www.ncbi.nlm.nih.gov/pubmed/27314465">tools are also well-regarded by patients</a>.</p> <h3>AI models predict individual risk</h3> <p>Current evaluation and management of suicide risk is still highly subjective. To improve outcomes, more objective AI strategies are needed. Promising applications include suicide risk prediction and clinical management.</p> <p>Suicide is influenced by <a href="https://suicidepreventionlifeline.org/how-we-can-all-prevent-suicide/">a variety of psychosocial, biological, environmental, economic and cultural factors</a>. 
AI can be used to explore the association between these factors and suicide outcomes.</p> <p>AI can also model the combined effect of multiple factors on suicide, and use these models to predict individual risk.</p> <p>As an example, <a href="http://journals.sagepub.com/doi/abs/10.1177/2167702617691560">researchers from Vanderbilt University recently designed an AI model</a> that predicted suicide risk, using electronic health records, with 84 to 92 per cent accuracy within one week of a suicide event and 80 to 86 per cent within two years.</p> <h3>Moving forward with caution</h3> <p>As the field of suicide prevention using artificial intelligence advances, there are several potential barriers to be addressed:</p> <ol> <li> <p><strong>Privacy:</strong> Protective legislation will need to expand to include risks associated with AI, specifically the collection, storage, transfer and use of confidential health information.</p> </li> <li> <p><strong>Accuracy:</strong> AI accuracy in correctly determining suicide intent will need to be confirmed, particularly with regard to system biases or errors, before labelling a person as high (versus low) risk.</p> </li> <li> <p><strong>Safety:</strong> It is essential to ensure AI programs can appropriately respond to suicidal users, so as not to worsen their emotional state or accidentally facilitate suicide planning.</p> </li> <li> <p><strong>Responsibility: </strong>Response protocols are needed on how to properly handle high-risk cases flagged by AI technology, and what to do if AI risk assessments differ from clinical opinion.</p> </li> <li> <p><strong>Lack of understanding: </strong>There is a knowledge gap among key users on how AI technology fits into suicide prevention. 
More education on the topic is needed to address this.</p> </li> </ol> <p>Overall, AI technology is here to stay in many aspects of health care, including suicide screening and intervention delivery.</p> <p><em><span><a href="https://theconversation.com/profiles/sidney-kennedy-442755"><strong>Sidney Kennedy</strong></a>&nbsp;is a professor of psychiatry and Arthur Sommer Rotenberg Chair in suicide and depression studies at the&nbsp;<a href="http://theconversation.com/institutions/university-of-toronto-1281">University of Toronto</a>.&nbsp;<strong><a href="https://theconversation.com/profiles/trehani-m-fonseka-448244">Trehani M. Fonseka</a></strong>&nbsp;is a&nbsp;research associate in psychiatry at the University Health Network and project lead for the Canadian Biomarker Integration Network in Depression (CAN-BIND)’s Indigenous Program at St. Michael’s Hospital.</span></em></p> <p><em>This article was originally published on <a href="http://theconversation.com">The Conversation</a>. Read the <a href="https://theconversation.com/how-ai-is-helping-to-predict-and-prevent-suicides-91460">original article</a>.</em></p> <p><img alt="The Conversation" height="1" src="https://counter.theconversation.com/content/91460/count.gif?distributor=republish-lightbox-basic" width="1" loading="lazy"></p> </div>