As suicide rates spike, new AI platform could ‘fill the gap’ in mental health care, say Boston researchers

By: FOXNews.com | April 25, 2023 at 02:00 AM EDT

As the need for mental health care escalates amid provider shortages, medical technology companies like OM1 have turned to AI to help improve patient care. But some experts have worries.

After a two-year decline, U.S. suicide rates spiked again in 2021, according to a new report from the Centers for Disease Control and Prevention (CDC). Suicide is now the 11th leading cause of death in the country, the second leading cause among people aged 10 to 34 and the fifth among those aged 35 to 54, per the report.

As the need for mental health care escalates, the U.S. is struggling with a shortage of providers. To help fill this gap, some medical technology companies have turned to artificial intelligence as a way of making providers’ jobs easier and patient care more accessible. Yet there are caveats connected to this.

Over 160 million people currently live in "mental health professional shortage areas," according to the Health Resources and Services Administration (HRSA), an agency of the U.S. Department of Health and Human Services. The number of psychiatrists is expected to reach a new low by 2024, with a projected shortage of between 14,280 and 31,091 practitioners.

"Lack of funding from the government, a shortage of providers, and ongoing stigma regarding mental health treatment are some of the biggest barriers," Dr. Meghan Marcum, chief psychologist at AMFM Healthcare in Orange County, California, told Fox News Digital.

"Wait lists for therapy can be long, and some individuals need specialized services like addiction or eating disorder treatment, making it hard to know where to start when it comes to finding the right provider," Marcum said.

A Boston medical data company called OM1 recently built an AI-based platform, called PHenOM, for physicians. The tool pulls data from over 9,000 clinicians working in 2,500 locations across all 50 states, according to Dr. Carl Marci, chief psychiatrist and managing director of mental health and neuroscience at OM1. Physicians can use that data to track trends in depression, anxiety, suicidal tendencies and other mental health disorders, the doctor said.

"Part of the reason we're having this mental health crisis is that we haven't been able to bring new tools, technologies and treatments to the bedside as quickly as we’d like," said Dr. Marci, who has also been running a small clinical practice through Mass General Brigham in Boston for 20 years. Eventually, artificial intelligence could help patients get the care they need faster and more efficiently, he said.

OM1’s AI model analyzes thousands of patient records and uses "sophisticated medical language models" to identify which individuals have expressed suicidal tendencies or actually attempted suicide, Dr. Marci said.

"We can look at all of our data and begin to build models to predict who is at risk for suicidal ideation," he said. "One approach would be to look for particular outcomes — in this case, suicide — and see if we can use AI to do a better job of identifying patients at risk and then directing care to them."
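OM1 has not published PHenOM's internals, so any concrete rendering is speculative. As a minimal sketch of the general pattern Dr. Marci describes (training a text model on clinical notes to flag ideation language), the toy classifier below uses invented notes and labels; none of it reflects the company's actual pipeline.

```python
# Illustrative sketch only -- OM1 has not disclosed PHenOM's architecture.
# A bag-of-words classifier over clinical notes shows the general shape of
# flagging suicidal-ideation language; the notes and labels are invented
# placeholders, not real patient data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training snippets: 1 = note documents suicidal ideation.
notes = [
    "patient reports passive suicidal ideation, no plan",
    "denies suicidal ideation, mood improved on current dose",
    "expressed wish to end life after job loss, plan discussed",
    "sleep improved, no safety concerns raised this visit",
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(notes, labels)

# Score an unseen note; in practice a threshold would route high-risk
# charts to a clinician for review rather than acting automatically.
new_note = ["patient voiced thoughts of self-harm over the weekend"]
risk = model.predict_proba(new_note)[0][1]
print(f"estimated ideation-language probability: {risk:.2f}")
```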
"One approach would be to look for particular outcomes — in this case, suicide — and see if we can use AI to do a better job of identifying patients at risk and then directing care to them."In the traditional mental health care model, a patient sees a psychiatrist for depression, anxiety, PTSD, insomnia or another disorder. The doctor then makes a treatment recommendation based only on his or her own experience and what the patient says, Dr. Marci said. CHATGPT AND HEALTH CARE: COULD THE AI CHATBOT CHANGE THE PATIENT EXPERIENCE?"Soon, I'll be able to put some information from the chart into a dashboard, which will then generate three ideas that are more likely to be more successful for depression, anxiety or insomnia than my best guess," he told Fox News Digital."The computer will be able to compare those parameters that I put into the system for the patient … against 100,000 similar patients."In seconds, the doctor would be able to access information to use as a decision-making tool to improve patient outcomes, he said. When patients are in the mental health system for many months or years, it’s important for doctors to be able to track how their disease is progressing — which the real world doesn’t always capture, Dr. Marci noted."The ability to use computers, AI and data science to do a clinical assessment of the chart without the patient answering any questions or the clinician being burdened fills in a lot of gaps," he told Fox News Digital."We can then begin to apply other models to look and see who's responding to treatment, what types of treatment they're responding to and whether they’re getting the care they need," he added.With the increasing mental health challenges and the widespread shortage of mental health providers, Dr. Marci said he believes that doctors will start using ChatGPT — the AI-based large language model that OpenAI released in 2022 — as a "large language model therapist," allowing doctors to interact with patients in a "clinically meaningful way."Potentially, models such as ChatGPT could serve as an "off-hours" resource for those who need help in the middle of the night or on a weekend when they can’t get to the doctor’s office — "because mental health doesn't take a break," Dr. Marci said."The opportunity to have continuous care where the patient lives, rather than having to come into an office or get on a Zoom, that is supported by sophisticated models that actually have proven therapeutic value … [is] important," he also said. But these models, which are built on both good information and misinformation, are not without risks, the doctor admitted."The most obvious risk is for [these models] to give literally deadly advice … and that would be disastrous," he said.To minimize these risks, the models would need to filter out misinformation or add some checks on the data to remove any potentially bad advice, said Dr. Marci.Dr. Cameron Caswell, an adolescent psychiatrist in Washington, D.C., has seen firsthand the struggle providers face in keeping up with the growing need for mental health care."I’ve talked to people who have been wait-listed for months, can’t find anyone that accepts their insurance or aren’t able to connect with a professional that meets their specific needs," she told Fox News Digital. CHATGPT ANSWERED 25 BREAST CANCER SCREENING QUESTIONS, BUT IT'S 'NOT READY FOR THE REAL WORLD' — HERE'S WHY"They want help, but can’t seem to get it. This only adds to their feelings of hopelessness and despair."Even so, Dr. 
When patients are in the mental health system for many months or years, it’s important for doctors to be able to track how their disease is progressing, which the real world doesn’t always capture, Dr. Marci noted.

"The ability to use computers, AI and data science to do a clinical assessment of the chart without the patient answering any questions or the clinician being burdened fills in a lot of gaps," he told Fox News Digital. "We can then begin to apply other models to look and see who's responding to treatment, what types of treatment they're responding to and whether they’re getting the care they need," he added.

With increasing mental health challenges and a widespread shortage of mental health providers, Dr. Marci said he believes that doctors will start using ChatGPT, the AI-based large language model that OpenAI released in 2022, as a "large language model therapist," allowing doctors to interact with patients in a "clinically meaningful way."

Potentially, models such as ChatGPT could serve as an "off-hours" resource for those who need help in the middle of the night or on a weekend when they can’t get to the doctor’s office, "because mental health doesn't take a break," Dr. Marci said.

"The opportunity to have continuous care where the patient lives, rather than having to come into an office or get on a Zoom, that is supported by sophisticated models that actually have proven therapeutic value … [is] important," he said.

But these models, which are built on both good information and misinformation, are not without risks, the doctor acknowledged. "The most obvious risk is for [these models] to give literally deadly advice … and that would be disastrous," he said. To minimize these risks, the models would need to filter out misinformation or add checks on the data to remove any potentially bad advice, said Dr. Marci.
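What such a check might look like in its simplest form is sketched below: a pattern screen that replaces any model reply tripping a rule with an escalating fallback. The patterns and fallback wording are assumptions for illustration, not a vetted clinical protocol, and real safety layers would involve classifiers, human review and clinical escalation paths.

```python
# Minimal sketch of an output check on a model reply; the patterns and
# fallback text below are illustrative assumptions only.
import re

UNSAFE_PATTERNS = [
    r"\bhow to (hurt|harm|kill)\b",
    r"\bstop taking your medication\b",
    r"\byou should (hurt|harm)\b",
]

CRISIS_FALLBACK = (
    "I can't help with that, but you don't have to handle this alone. "
    "Please reach out to a clinician or a crisis line right away."
)

def screen_reply(model_reply: str) -> str:
    """Pass a generated reply through a pattern screen; replace anything
    that trips a rule with a safe, escalating fallback message."""
    lowered = model_reply.lower()
    if any(re.search(p, lowered) for p in UNSAFE_PATTERNS):
        return CRISIS_FALLBACK
    return model_reply

print(screen_reply("Try a brief grounding exercise and note how you feel."))
print(screen_reply("You should stop taking your medication immediately."))
```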
Dr. Cameron Caswell, an adolescent psychologist in Washington, D.C., has seen firsthand the struggle providers face in keeping up with the growing need for mental health care.

"I’ve talked to people who have been wait-listed for months, can’t find anyone that accepts their insurance or aren’t able to connect with a professional that meets their specific needs," she told Fox News Digital. "They want help, but can’t seem to get it. This only adds to their feelings of hopelessness and despair."

Even so, Dr. Caswell is skeptical that AI is the answer. "Programs like ChatGPT are phenomenal at providing information, research, strategies and tools, which can be useful in a pinch," she said. "However, technology doesn’t provide what people need the most: empathy and human connection."

"While AI can provide positive reminders and prompt calming techniques, I worry that if it’s used to self-diagnose, it will lead to misdiagnosing, mislabeling and mistreating behaviors," she continued. "This is likely to exacerbate problems, not remediate them."

Dr. Marcum of Orange County, California, said she sees AI as a helpful tool between sessions, or as a way to offer education about a diagnosis. "It may also help clinicians with documentation or report writing, which can potentially help free up time to serve more clients throughout the week," she told Fox News Digital.

There are ongoing ethical concerns, however, including privacy, data security and accountability, which still need to be worked out, she said.

"I think we will definitely see a trend toward the use of AI in treating mental health," said Dr. Marcum. "But the exact landscape for how it will shape the field has yet to be determined."