The impact of artificial intelligence in psychodiagnosis and psychotherapy: Innovations, limitations, and ethical dilemmas
DOI: https://doi.org/10.55312/op.v17i1.7267

Abstract
Artificial Intelligence (AI) is transforming the field of psychology by introducing significant changes in psychological diagnosis and psychotherapy. Advanced AI systems based on deep learning algorithms can analyze clinical data, identify behavioral patterns, and suggest personalized diagnoses or treatments. Meanwhile, the use of AI in psychotherapy, through chatbots and digital platforms, raises new concerns regarding authenticity, empathy, and ethical considerations in psychological therapy. One major challenge is the existential anxiety experienced by patients guided by a machine, which affects their trust and the effectiveness of treatment. Furthermore, ethical dilemmas related to data privacy, clinical responsibility, and the boundaries between humans and technology remain open for discussion. This study aims to explore the impact of AI in psychodiagnosis and psychotherapy by analyzing the innovations, limitations, and ethical challenges associated with its application in mental health.

Keywords: artificial intelligence, psychodiagnosis, psychotherapy, digital therapy, ethics, mental health.
References

1. American Psychological Association. (2021). Ethical principles of psychologists and code of conduct. https://www.apa.org/ethics/code
2. Bennett, C. C., & Hauser, K. (2013). Artificial intelligence framework for simulating clinical decision-making: A Markov decision process approach. Artificial Intelligence in Medicine, 57(1), 9–19.
3. Bickmore, T., & Picard, R. (2005). Establishing and maintaining long-term human-computer relationships. ACM Transactions on Computer-Human Interaction, 12(2), 293–327.
4. Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19.
5. Luxton, D. D. (2016). Recommendations for the ethical use and design of artificial intelligent care providers. Artificial Intelligence in Medicine, 71, 1–9.
6. Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 3(2), 1–21.
7. Reddy, S., Allan, S., Coghlan, S., & Cooper, P. (2020). A governance model for the application of AI in health care. The Lancet Digital Health, 2(11), e503–e504.
8. Schore, A. N. (1994). Affect regulation and the origin of the self: The neurobiology of emotional development. Lawrence Erlbaum Associates.
9. Siegel, D. J. (2012). The developing mind: How relationships and the brain interact to shape who we are (2nd ed.). Guilford Press. (Original work published 1999)
10. Yalom, I. D. (1980). Existential psychotherapy. Basic Books.
11. Fonagy, P., Gergely, G., Jurist, E. L., & Target, M. (2002). Affect regulation, mentalization, and the development of the self. Other Press.
12. Rizzolatti, G., & Sinigaglia, C. (2008). Mirrors in the brain: How our minds share actions and emotions. Oxford University Press.