Digital Therapy in the Age of AI: Opportunity, Risk, and Ethical Oversight

By Dr. Floyd Godfrey

In recent years, digital mental health tools have expanded rapidly, especially those leveraging artificial intelligence such as conversational agents (chatbots), virtual therapists, and other generative AI models. This growth responds to rising demand for scalable mental health support, provider shortages, and accessibility gaps. At the same time, these tools raise critical questions about safety, ethics, regulation, and the extent to which they can or should substitute for or augment traditional therapy.

Promise and Potential
AI-enabled mental health tools offer several advantages. They provide 24/7 access, immediate feedback, anonymity, and lower cost compared to in-person therapy. For individuals in remote areas or those facing mobility or stigma-related barriers, these tools may offer essential first-line support. Some tools are already being used for wellness monitoring, mood tracking, or guided self-help. Their capacity to adapt over time through machine learning also suggests they may become increasingly personalized and responsive.

Emerging Risks and Ethical Concerns
However, several risks accompany this promise. Many AI mental health tools can produce unpredictable outputs. Responses might be misleading, insensitive, or fail to recognize serious risks such as suicidal or self-harming ideation. Another concern involves bias in the training data. Systems trained on unrepresentative populations may be less accurate or helpful for marginalized groups. Privacy, data security, transparency, and informed consent are also pressing issues. Users need to know how their data is used, stored, and shared.

Regulatory Landscape and Oversight
U.S. regulators are beginning to focus more intently on AI in digital mental health. The Food and Drug Administration (FDA) has scheduled a meeting of its Digital Health Advisory Committee in November 2025 to evaluate AI-enabled digital mental health devices (Reuters, 2025). The committee will examine the benefits of these tools, such as scalability and prompt intervention, while also addressing their risks. Additionally, the FDA is updating its frameworks for AI and machine learning software used as medical devices to better manage transparency and safety (U.S. Food and Drug Administration, 2024).

What Clinicians and Developers Should Consider

  • Risk stratification: Determine when a tool is appropriate, such as mild-to-moderate distress versus crisis situations.
  • Human oversight: Clarify when and how clinicians are involved or alerted.
  • Transparency: Users should understand what the AI does, its limitations, and how their data is handled.
  • Evaluation and evidence: Tools must be validated for efficacy, safety, and potential biases.
  • Ethical frameworks: Designers should apply principles such as fairness, nonmaleficence, and justice throughout development.

Moving Forward with Hope
AI offers promising avenues for expanding access to mental health support. With thoughtful oversight, transparent practices, and collaborative efforts among clinicians, developers, and regulators, we can realize these benefits while minimizing harm. The goal is not to replace human care, but to responsibly extend its reach.

Floyd Godfrey, PhD, is a Certified Mental Health Coach and has been guiding clients since 2000. He currently speaks and provides consulting and mental health coaching across the globe. To learn more about his services, please visit his website: www.FloydGodfrey.com.

References

Center for Devices and Radiological Health. (n.d.). Artificial intelligence in software as a medical device. U.S. Food and Drug Administration. https://www.fda.gov/medical-devices/software-medical-device-samd/artificial-intelligence-software-medical-device

Reuters. (2025). FDA panel to weigh in on AI mental health devices. https://www.reuters.com/business/healthcare-pharmaceuticals/fda-panel-weigh-ai-mental-health-devices-2025-09-11/

 
