The Dangers of Having ChatGPT Be Your Counselor
Artificial Intelligence (AI) is being used in more and more professions and areas of life. Many of us turn to OpenAI's ChatGPT to answer our questions and to make tasks like writing emails easier. It can certainly be useful, but many are starting to wonder whether it can be dangerous as well.
AI and Counseling
Asking ChatGPT general questions can be helpful, but what about asking it for advice? Many people are turning to AI for support and companionship, and some are even using chatbots in place of therapy services.
It is no secret that there is a shortage of mental health professionals and that counseling can be difficult to obtain. You may be deterred by cost, long waiting lists, or simply not knowing whether therapy is right for you. Whatever the barriers, it can be difficult to find a provider who is right for you. Unfortunately, these challenges have led some to turn to AI for support.
The Problem with ChatGPT as Therapy
Using ChatGPT as your therapist raises a number of concerns, such as:
- It Is NOT Private: OpenAI CEO Sam Altman warned ChatGPT users against using the chatbot as a “therapist” because of privacy concerns. AI is NOT confidential and does not follow HIPAA guidelines like a human therapist would. Real therapists are licensed and must follow a code of ethics that supports confidentiality, protecting your privacy. They are bound by an ethical and legal code; none of this applies to an AI bot.
- Inauthentic Connection: Chatting with an agreeable bot that is actively trying to keep you coming back can feel good, but it isn’t an authentic connection. What feels like connection is not, and it can distort your idea of what a healthy therapeutic relationship truly is. In therapy your counselor will support you AND challenge you to do better. Blind agreement cannot lead to true growth.
- Missed Information: Chatbots are VERY literal and unable to dig into meaning. This can cause valuable information to be missed and can even create safety risks. There are many times we say we are okay when we are in fact not. A real human therapist can pick up on these discrepancies; a bot simply cannot.
- Too Much Validation: Validation feels great and can really help us feel better, but too much of anything can be a bad thing. AI is always working toward agreement; the algorithm is designed to make you feel good and keep you coming back. It often does this through excessive validation and agreement, even when it’s unhealthy.
Unfortunately, there have been a growing number of cases of individuals harming themselves or others after being encouraged to do so by AI chatbots. Using AI in place of a licensed professional can be dangerous for many reasons beyond the examples listed above.
Finding a Real Therapist
Carolina Counseling Services in Sanford, North Carolina contracts with skilled licensed therapists who care. Our goal is to meet you where you are and help you reach your goals. We aren’t here to tell you what to do but to guide you through life’s challenges.
You deserve real therapy geared toward your unique needs. Providers are in network with most major insurance plans, including Aetna, Aetna State Health Plan, Blue Cross and Blue Shield (Blue Cross NC), and many more. Click here to learn more about how you can start your journey with CCS. Call today!

Jaime Johnson Fitzpatrick, LCMHCS, LCAS, is one of the Owners and Vice Presidents of Carolina Counseling Services. She is a Licensed Clinical Mental Health Counselor and Licensed Clinical Addictions Specialist in the State of North Carolina, as well as a Licensed Mental Health Counselor in the State of New York. Jaime is also certified in Dialectical Behavior Therapy and utilizes various other approaches in her practice.
