
The TeleWellness Hub Podcast
The TeleWellness Hub podcast is hosted by Marta Hamilton, a licensed therapist, certified wellness professional, and founder of the TeleWellness Hub directory. The TeleWellness Hub podcast brings wellness outside of the private consultation room and straight to listeners in an honest, trustworthy, and simple approach! It's a place to practice self-care by hearing and learning directly from leading wellness experts who share wellness tips, tools, research, and ways to connect with them. We also feature guests who share real-life wellness journeys we can relate to. In a modern world of busyness, TeleWellness Hub is here to be a partner in your health and wellness journey.
As a reminder, everything we talk about on this podcast is meant for general information only and is not personal advice. Please consult a licensed professional with any personal questions related to topics discussed on our podcast episodes.
Navigating the Rise of AI-Powered Mental Health Support
Artificial intelligence is reshaping mental health care at breathtaking speed, but are we moving too quickly to embrace technology at the expense of human connection? As a licensed professional counselor since 2011, I've spent weeks researching this transformative trend to understand what it means for both providers and clients.
The appeal is undeniable. AI-driven mental health platforms offer 24/7 support, affordability, and unprecedented accessibility. Studies show these tools can reduce symptoms for some users, and their popularity has skyrocketed with a 500% increase in usage since 2020. As someone who understands the midnight need for support when therapy appointments are days away, I see the potential benefits.
Yet serious concerns emerge upon closer examination. Most AI systems provide mechanical responses that leave users feeling unheard. Some platforms have suggested actively harmful coping strategies, raising alarm bells about safety. Without a human provider's clinical judgment, AI cannot properly diagnose, personalize treatment, or adapt to complex emotional needs. Perhaps most troubling are the privacy issues – many AI mental health tools collect sensitive data that may be used for marketing or sold to third parties, a far cry from the confidentiality guaranteed by licensed professionals bound by HIPAA.
For mental health providers, this shift creates both existential fears and opportunities. Many professionals who've invested years in education and training worry about being replaced by algorithms. However, I believe the path forward lies not in resistance but in ethical integration. We need mental health professionals at the table shaping how AI is implemented in our field, ensuring technology supports rather than replaces human care. That's why platforms like Telewellness Hub are creating communities where providers can navigate these changes together while advocating for human-first, ethical mental health solutions.
Subscribe to join our conversation about the future of mental health care. Together, we can ensure that in our rush to innovate, we don't lose the irreplaceable human connection at the heart of healing.
Providers, check out https://www.namhp.org/
We are happy and honored to be part of your life-changing health and wellness journey:
https://telewellnesshub.com/explore-wellness-experts/
Welcome, friends, to the Telewellness Hub podcast, where we dive into the latest trends, challenges, and innovations in mental health and wellness. I'm Marta Hamilton, a licensed professional counselor based in Texas, and today we are talking about a major shift in the industry that I think is important to discuss. I've taken a break from recording because I've been trying to educate myself on this topic. I've been concerned about it, and I've been excited about it when thinking about Telewellness Hub and how we can utilize this for the better of humanity. The topic is AI-powered mental health support. This is causing a big shift in our industry, and in many industries, and I think it's important to talk about AI in terms of mental health. AI-driven chatbots and personalized support systems are gaining popularity. I've been spending the week doing a lot of research on this topic, and I've been really considering: what do these AI-driven chatbots mean in terms of real providers, real clients, and the future of mental health care? AI offers accessibility and efficiency. I've used all types of AI support systems, even ones integrated within this podcast platform through Buzzsprout; AI can now handle transcriptions and summaries of podcasts. AI is everywhere, and it can make things a lot easier. But I think it's important to also look at the risks. The technology is evolving so quickly, and it makes things so efficient and easy, that it can be easy to overlook some of the risks.
And for me, as a licensed professional counselor who's been fully licensed in this industry since 2011, who has also worked as a supervisor with an LPC supervisor certification and supervised interns while they become independently practicing providers, I can't help but think about how important it is that we, as professionals working in this space, really sit down and take a look at what AI really is and what it means for the future of our clients, those individuals seeking care. AI isn't the solution to every mental health challenge, and I want to encourage all of you, both providers and clients, to think critically about the role of AI in mental health and why real, human-driven care is still irreplaceable, particularly with something as important as our mental health. Let's get started. I have a lot of notes, so today you'll see me reading a lot.
Speaker 1:So what is AI-powered mental health support? AI in mental health care is growing at an unprecedented rate. Platforms like Woebot, Wysa (I think I'm saying this correctly), and Replika use artificial intelligence to simulate therapeutic conversations. Some AI systems analyze user data and suggest coping strategies, and others even attempt to predict a mental health crisis. And the great thing is, these tools are available 24/7, making mental health support more accessible than ever before. I myself know that feeling of wishing I could talk to a therapist in the moment, at 1 am, when I was a single mom, wondering, the day before my actual therapy appointment: okay, what am I going to talk about? I think a lot of people have had that experience, where the idea of an on-demand therapeutic support system is incredible. So that's one of the great things about AI.
Speaker 1:When it comes to AI in this space, some facts: a 2023 study published in the Lancet Digital Health found that AI chatbots reduced symptoms of depression and anxiety in 30% of users after just four weeks. I did not do a deep dive into the power of this research, so again, this is a big disclaimer to think critically about all of this. Statistics matter, and just because something is published in research doesn't necessarily mean it can be generalized to all populations. You really have to look at the individual variables. But I thought that was interesting when I looked at the abstract. Additionally, according to the World Economic Forum, AI-driven mental health apps saw a 500% increase in usage between 2020 and 2024. So the demand is there, without a doubt.
Speaker 1:And at first glance, this sounds promising, right? Who wouldn't want affordable, instant, around-the-clock support? I myself would say yes, sign me up. But as we dig deeper, the reality is a lot more complicated. Of course, I'm biased: I'm a mental health professional, and I see the value in my human therapist. But I want to empower people. So, if you're someone seeking mental health support, AI-driven tools might seem like an easy, low-cost, on-demand option, but here are some things I would encourage everyone to really consider before using them.
Speaker 1:Number one: AI is not a replacement for professional therapy. While AI can offer general coping strategies, it cannot diagnose, it cannot personalize, and it can't adapt to specific emotional and mental health needs the way a human provider can, right? The same goes for a dentist or a medical provider. My husband unfortunately had to go to the emergency room and had a bunch of lab work done, and there are AI apps that can scan and read lab work. There's amazing technology going on in the medical field, which mental health is part of; we are a clinical field. And it still cannot replace the human eyes of someone taking the whole picture in, making decisions, and collaborating with the patient. So that's something really important to think about: AI cannot do those things the way a human provider can. A fact: a 2024 study from JAMA Psychiatry found that 68% of users felt AI chatbots provided mechanical or generic responses to their issues, making them feel unheard. You can Google that (2024, JAMA Psychiatry, AI) and dive deep into that research.
Speaker 1:Number two: AI can give inaccurate or even harmful advice. I have used ChatGPT a ton, but I know that it's given me false information, and they have a disclaimer that you need to double-check things on your own. I think it's important, especially when it comes to mental health, that you are aware the advice or recommendations from AI can be inaccurate or even harmful (a human therapist doesn't necessarily give advice; they give insight or recommendations). AI chatbots are really only as good as the data they are trained on, and I am learning so much while I'm building Telewellness Hub about the language programming that is involved in AI. Thanks to the brilliant product and technology officer at Telewellness Hub, I've been really inspired to dive deep into AI and learn about this industry, because it's in every field. It's been really interesting to see how AI is trained; it even uses positive reinforcement, like how you would give a treat to a dog. It's a similar thing.
Speaker 1:So AI chat bots are actually only as good as the data they are trained on, and they can misinterpret a user, potential client's input and provide responses that are misleading or, even worse, inappropriate. So there's an example In 2023, a chatbot was designed for mental health support and it actually suggested harmful coping mechanisms to users. I won't say the platform, but it led to major public concern and public shutdowns. So when you hear this, I mean it's the benefits of AI are there, but when it just takes that one life every life is important that one life that can have received inappropriate, damaging insight, and you know it's not worth it. Okay, number three privacy concerns and data security risks.
Speaker 1:Many AI mental health platforms collect sensitive user data. I cannot emphasize this enough. Big tech companies can be wonderful in their mission, but many times they are not created or founded by mental health providers, and ultimately it becomes a business of collecting and selling data. So many AI mental health platforms are collecting sensitive user data, without strict regulations and sometimes without users knowing; this has happened on many major mental health platforms. I know I'm representing Telewellness Hub, and we are a mom-and-pop shop. I'm talking about big platforms, with millions or billions of dollars behind them.
Speaker 1:Your private conversations could be at risk of being recorded, being used for marketing, being used for research, and, even worse (it has happened), being sold to third-party companies. I, as a mental health provider, believe that my clients' sensitive information is certainly not to be sold. Their sensitive information is protected by HIPAA. It is my duty, through my license, my years of education and licensure, and the continuing education hours that I pay for and invest in, to do everything in my power to protect my clients and their information. To me, this is one of the biggest concerns. A New York Times report in 2024 revealed that some mental health apps were selling their user data anonymously to third-party companies, raising serious ethical concerns. I don't necessarily want to throw any companies under the bus, but you can take a look (New York Times, 2024, mental health apps selling data), just so that you're informed and aware.
Speaker 1:Another thing clients should consider before using AI mental health platforms is the risk of self-diagnosis and over-reliance on AI. Some users may start to rely solely on AI for their mental health needs instead of seeking professional help when they truly need it. And AI doesn't replace the accountability and therapeutic relationship that comes from working with a licensed professional who understands you as a whole person. There is nothing like that therapeutic relationship: someone who knows and understands your story, your values, your challenges, who is there to cheer you on, to help you reflect, to consider alternatives. We deeply care about our clients. Deeply, deeply, deeply.
Speaker 1:Human care and I know that some people have had some hurt done by therapy. I've I've heard some of that. I understand that there's challenges in that, but there are so many amazing providers out there that really do everything, dedicate their lives to their clients' well-beings and and that's that's their livelihood, that's everything they've worked so hard for. So something to think about. When it comes to. It's important. If you don't find the right therapist because it's happened to me I didn't feel like it was the right fit and I knew that I deserved to find the right fit and I kept searching until I found the right fit. You deserve to find the right fit and I kept searching until I found the right fit. You deserve to find the right fit. And, side note, with this revamp of Telewellness Hub, my goal is to make it even easier at first glance to know if that provider is the right fit. So more to come on that at a separate time. But so four things that clients should consider before using an AI mental health platform.
Speaker 1:Ai is not a replacement for professional therapy. Ai can give inaccurate or even harmful advice. Privacy concerns and data security risks amount and the risk of self-diagnosis and over-reliance on AI is there. So take all that and just consider that right Again. This is meant to empower and to have empower clients and providers, and for you to just take those things into consideration for critical thinking of. Is an AI-powered platform the best for me? And maybe it is, but just having those things in mind when doing so is important.
Speaker 1:Now for mental health professionals. AI also greatly impacts us, and it presents both opportunities and significant concerns. I'm going to talk about three major ones: job displacement fears, ethical dilemmas in AI-assisted care, and the need for a community-driven response. These are all things that I'm seeing in my communities. These are things that I feel myself, and I think that if we can empower ourselves by understanding AI more and really being able to think critically about it, it will help us in the long run. So, when it comes to AI replacing us: AI is often marketed as a cost-effective alternative to therapy, and I understand that this can make it even harder for providers to compete with free or low-cost AI solutions.
Speaker 1:A fact is a little research. Fact is a survey from the American Psychological Association, the APA go APA in 2024 found that 42% of mental health professionals felt threatened by the rise of AI in therapy. So mental health providers out there are in flight or fight about AI. Let me tell you there is a lot of concern out there, for will we even exist as a job? Will we be replaced? So, just like many other industries, we are worried about that too. And something to take into consideration is most mental health providers go into huge debt in order to have that credential after our name. We go to graduate school, we complete thousands of hours to be credentialed, we take tests, exams Every year, we have to do continuing education units, which costs money and take time just to make sure we're doing the right thing for our clients, and and we have overhead. That's a whole.
Speaker 1:There may be a whole other conversation, a different episode for providers, but it is hard for us. It is hard for us. And reimbursement rates. So people might think, oh, like you can bill insurance, you can make a lot of money. The fact is we know the whole health system is broken with insurance in the United States and actually our reimbursement rates haven't changed significantly in decades, so it hasn't caught up to the to inflation. So it's it's. It is tough out there, even if you're accepting private insurance and, um, you could do private pay. But that is a challenge also, right, because it has its own unique challenges. We'll say that and, again, maybe this needs to be its own separate episode.
Speaker 1:But this fear for mental health professionals is very real and I think there's an opportunity, though, for therapists to become empowered by AI when we come together. Now, also AI when it's marketed as cost-effective, as an alternative to therapy by these platforms. You have to remember some of these platforms have like millions of dollars to support their marketing. Most mental health providers that you see do not have millions of dollars. I don't want to speak for everybody, but I know for myself and, I think, most providers out there, we are trying to figure out how to market on a budget, because we're spending our money on continuing education and other things that enhance the experience for our clients, right, whether it's the telehealth platform, our office space, an additional certification or training to help a client just really break through barriers and meet their goals. So the job displacement fear is real. So, when you support your mental health provider, when you choose a mental health provider, a human one, and it's their private practice, let's say, their solo practitioner. You're really empowering small business we forget. You know, ultimately they're a small business. You know, ultimately they're a small business and so if supporting small businesses is important to you, I encourage you to support a small mental health business and support your mental health providers if you're seeking mental health support. Okay, enough of that.
Speaker 1:Number two ethical dilemmas exist in AI-assisted care, so some providers are being asked to integrate AI into their practice. Sometimes it happens slowly, like through maybe like a software they're using to keep track of notes for their clients. Um, sometimes it happens when maybe sessions are being like listened to through certain like integrations and being transcribed so that it can be turned into notes easily. Um, there are different things going on with AI, but without clear guidelines on its ethical use, they may be at risk of liability, and mental health providers already carry liability insurance and it's just something that we can have concerns about. Right, we are concerned about our clients' confidentiality and there are many HIPAA compliant AI platforms, but still something to think about. So if you have a mental health provider who is old school and they have a piece of paper and they're writing their notes on a piece of paper and they're filing their notes in like a folder. Just know that it doesn't mean they're like too old school and not innovative and won't know the latest and greatest techniques to support your mental health. Just know that actually a lot of mental health providers are turning to that in an effort to really protect their clients and to protect their own liability and to do the most ethically responsible thing that also supports our mental health industry. So just a little for clients listening out there, just know that there's a shift into that right. So, just like there's a shift into kind of going back to old school, simple times of making your own sourdough, of like homesteading, keeping things simple, there's also a shift in that in our field.
Speaker 1:And a third topic I wanted to talk about in terms of the impact of AI on mental health providers is the need for a community-driven response. So instead of resisting AI entirely, because I don't think that's the answer like AI is here, whether we like it or not, because I don't think that's the answer, like AI is here, whether we like it or not, providers must stay informed. I want to advocate that we, as mental health providers, advocate for ethical implementation and I would love to invite all mental health providers to help shape how AI is used in mental health care. So this next section is going to be a little bit about um for providers specifically. Um. So if you're, uh, our regular listener who wants to know a little bit more of an insider's view on providers, just keep on listening. If not, um, I appreciate you listening and taking into consideration these things. I think it's important, um, as a consumer of mental health support systems, um, to know these things, to know what's happening in our industry, to know that there are big shifts, because it directly impacts your care. So I'm going to encourage you to listen, um, but just know that behind the scenes, there are a lot of mental health providers trying to figure out how we can best serve you through this shift.
Speaker 1:So why independent providers and clients need a supportive community. As we continue to see AI expand in mental health, it's more important than ever for both clients and providers to have a space where they can navigate these changes together. Uh, platforms like Latin therapy and um, uh, latinx therapy, I'm sorry. Uh, liberatory wellness network. Alex, let me double check that that's correct. Um, let me see here. Hold on, okay, alex, you might want to like take out the whole thing about. Like don't listen if you don't want to. So anyway, you can take all that out if you want, so okay. Why independent providers and clients need a supportive community is my last little segment here on this topic. As we continue to see AI expand in mental health, it's more important than ever for both clients and providers to have a space where they can navigate these changes together. I want to share about some platforms like Telewellness Hub, but also the Liberatory Wellness Network, therapy Den I'm using my phone right now to make sure I have Latinx Therapy Network Therapist Networks. These platforms are designed to support independent providers and ensure that clients have access to ethical, human-first care. I will share a list of some platforms that I know are specific to supporting independent providers and human-first care in the show notes, so take a look there if you're looking to join one of these directories or seek a provider through there. Join one of these directories or seek a provider through there. So for clients, it's like it's really essential to find real licensed professionals who can provide personalized and confidential support.
Speaker 1:In my opinion, rather than relying solely on AI driven tools, I think there's a great space for AI driven tools to support the work, and I've even had clients. I've even had clients talk about what they entered into ChatGPT for like journaling topic ideas, and I've even had clients I've even had clients talk about what they entered into ChatGPT for like journaling topic ideas, and we talk about it in session. Together we can talk about the tools of AI and utilize those tools as a resource for our sessions. So I'm not saying it's bad, I'm just saying it's something to really think critically about and, I think, something that you can bring and navigate with a therapist in a perfect world.
Speaker 1:I realized, like I said earlier, that our, our health system in the United States is is um is facing a lot of challenges. So I understand that I that sometimes that's not an option and that's also why tele wellness hub is hoping to break those, those barriers and there's those challenges to access by providing clients who providing clients with providers who also have resources available, whether it's books, digital downloads, podcast episodes, youtube channel videos, youtube videos, podcast episodes, youtube channel videos, youtube videos, um. So, yeah, we want to make there are a lot of people out there who want to make mental health accessible and I think for some I understand that an AI chatbot might be the best option at the time on your way to find the ideal provider. Um, for providers, having a professional network is key to staying ahead of industry shifts, advocating for ethical care and ensuring that independent providers have a strong voice in the future of mental health services. I want to talk about one network, the National Alliance of Mental Health Professionals, an incredible platform that I have recently joined that I love and is let's see how many people are in this group over 8,000 people, and it's just people having a dialogue about AI, about big tech corporations and venture capitalists taking over our space. So just something to think about for providers. Maybe take a look at some of these organizations or even locally, see who's talking about this and joining in At Telewellness Hub, I will say we're creating a space, a digital space, a community where providers can really examine AI and other emerging industry trends together, along with our amazing technology team, and my goal is to have your voice.
Speaker 1:Help shape the future of Telewellness Hub and help make decisions when it comes to AI with our directory and platform, so that we can have an ethical, human-driven and effective mental health solution for providers and clients alike. So I welcome everyone to join the conversation. Shape the future of mental health together to join the conversation, shape the future of mental health together. Before I wrap up, I just want to take a moment to thank everyone who has been listening, who has provided feedback on our since the day I am on Telewellness Hub, the platform telewellnesshubcom since I first launched on a WordPress site and then, um you know, made updates and changes and the launch and especially with our previous launch um, I was so excited about it and it was was amazing to see the feedback and the signups and we I got some really great insights that have helped shaped and refine a platform that truly serves providers and clients in a more meaningful way. So I am so thankful for everyone who's taking the time to give feedback, because that is so helpful for shaping the future of mental health, not just tele-wellness hub, but for mental health.
Speaker 1:We're currently rebuilding, finalizing our custom web application. Our goal is to ensure that independent providers have a space where they can offer their expertise, connect with others in the field and truly make an impact, offer their therapy services, their workshops, showcase their media content, their products for sale and really offer clients the opportunity to do wellness mental health wellness their way. So getting help isn't always like scheduling an appointment right away. Right, it's navigating who are some experts in the field, and Telewellness Hub is here to highlight the experts in this field, which are human, independent providers. So if you're a provider and you're looking for a community that values ethical client first care without the burnout, I invite you to stay connected.
Speaker 1:Please share your thoughts, be a part of this evolving conversation. We're building a channel within Telewellness Hub for all providers to share their thoughts, to vote on features, to help shape the directory itself so that it's ours as a collective. I'm a big believer that together we can really create a difference and a huge impact. So if you're a provider or a client who wants to better understand the role of AI in mental health and ensure that real human first care remains a priority, please subscribe to this podcast and join our growing community. Let's work together to navigate these shifts and ensure that technology supports, rather than replaces, real mental health care. I'm going to have more topics on this in the future. I'm going to be inviting some experts in this field in AI and research and media, media specialists, technology specialists, community specialists. So please subscribe, stay tuned, because there's some good stuff coming. Thank you for tuning in and I'll see you next time.