A phone with an icon of a brain on it with floating icons around to display interconnectedness

AI tools can bridge accessibility gaps for people with disabilities, ease surface-level stress, and support day-to-day organization.

Imagine having an emotional support companion with you wherever you go, one that never sleeps, replies instantly, and never judges you. You may think we are many years away from this, but it is already a reality.

Thank you for tuning into this week’s blog post, where I will explore the topic of Artificial Intelligence in the college setting. 

As I shared in previous posts, mental health struggles are rising among college students while counseling services on campuses remain scarce, so many students are turning to AI for support in times of need.

At first glance, it may seem that AI-powered tools can provide therapy services and wellness support. Still, concerns remain about whether these tools are effective, offer the right kind of support, or carry the human and emotional depth that counseling requires, especially for students with disabilities.

What Attracts Students to AI for Mental Health Support

Accessibility

AI-powered tools are available around the clock, unlike campus counseling resources, which operate only at certain hours and often only on weekdays. These apps also allow users to remain anonymous, making it more comfortable to share problems and insecurities. Finally, they eliminate the need to schedule appointments, letting students access support on their own time.

Affordability

Many AI tools are free or very low cost and require no insurance, which makes them especially appealing to students facing financial constraints who still want emotional support.

User-Friendly

These AI tools rely on text-based communication, which can make students with social anxiety or autism more comfortable sharing their problems and experiences. The apps also tend to offer customizable settings and sensory-friendly designs, making them accessible to a broader range of college students.

Popular AI Tools 

  • ChatGPT
    • Although ChatGPT was not explicitly designed for mental health, some college students use it to process emotions, ask for advice on social situations, or organize their tasks.
  • Woebot and Wysa
    • Apps like these are specifically designed to act as trained mental health support companions, using techniques from Cognitive Behavioral Therapy (CBT). They track users’ emotions over time, identify patterns, and suggest tools for managing specific emotions. While this is a great start and can help with everyday stress, these apps lack personalization and are not equipped to address deeper issues such as PTSD or suicidal ideation. 
  • Replika and Youper
    • These apps act as a “virtual friend” that is more personalized, adapting to the user’s feelings and chats. They can adjust their communication to the user’s learning style and offer positive feedback and prompts for self-reflection. While this can be great for immediate support, excessive use can create a false sense of security and an emotional dependence that replaces real social interaction. 

Risks of AI

  • Lack of personalization: AI bots cannot fully understand trauma or human emotion; because they are not human and lack lived experience, they struggle to respond in the “correct” way. 
  • False sense of support: These apps can lead college students to avoid seeking professional help when it is necessary, which can have serious consequences for those who need it. 
  • Privacy concerns: AI companies tend to collect the data people enter into their systems, which raises questions about who has access to your data and to details about your mental health. 

AI and Students with Disabilities

Positives

AI tools can bridge accessibility gaps for people with disabilities, ease surface-level stress, and support day-to-day organization. For example, I use Otter.ai, which transcribes conversations and relieves the stress of being unable to catch what somebody says. 

Negatives

Most AI tools are not trained to understand the lived experiences of people with disabilities, so they often offer generic responses that can invalidate or even insult their users. 

  • I once tried using an AI tool to discuss my frustration with wearing cochlear implants in noisy restaurants: how often I feel overwhelmed, and how difficult it is to follow conversations over loud background noise. The AI immediately responded, “Have you tried changing seats or asking to move outside?” While that suggestion may seem helpful, it completely dismissed my feelings of exhaustion and social isolation. AI can offer solutions, but it cannot truly comprehend what it means to have a disability. 

This example shows how AI can be harmful: a dismissive response like that can leave a college student with a disability feeling even more isolated, especially if they already feel misunderstood by peers or professors. 

Closing Thought

AI can be extremely useful for simple, everyday tasks, but when it comes to mental health support, it has a long way to go before it is truly beneficial. People in need of support should continue to seek professional help instead of turning to AI. Human connection is crucial in times of need, and AI does not provide it. For students with disabilities especially, finding a support group of people who share the same disability can offer the sense of connection and belonging they need. Thus, professional help remains the best answer for mental health support. 

Conclusion

Thank you for taking the time to read this blog post. This is one of the more serious topics in my blog series due to the rise of Artificial Intelligence in all aspects of our lives. Next week, I will be discussing living in a culture of constant productivity.