Imagine having an emotional support companion available wherever you go, one that never sleeps, replies instantly, and never judges you. You may think we are many years away from this, but it is already a reality.
Thank you for tuning into this week’s blog post, where I will explore the topic of Artificial Intelligence in the college setting.
With the rise of mental health struggles among college students and the lack of counseling services on campuses, as I shared in previous posts, many students are turning to AI for support in times of need.
At first glance, it may seem promising that AI-powered tools can provide therapy and wellness services. Still, questions remain about whether these tools are effective, offer appropriate support, or provide the human and emotional depth that counseling requires, especially for students with disabilities.
AI-powered tools are available around the clock, unlike campus counseling services, which typically operate only during weekday business hours. These apps also allow users to remain anonymous, making it easier to share problems and insecurities. Finally, they eliminate the need to schedule appointments, letting students access support on their own time.
Many AI tools are free or low cost and require no insurance, making them especially appealing to students facing financial constraints who still want emotional support.
These AI tools rely on text-based communication, which can make students with social anxiety or autism feel more comfortable sharing their problems and experiences. Many of these apps also offer customizable settings and sensory-friendly designs, making them accessible to a broader range of college students.
AI tools can also bridge accessibility gaps for people with disabilities and ease surface-level stress around everyday organization. For example, I use Otter.ai, which transcribes conversations, relieving the stress of not catching what somebody says.
However, most AI tools are not trained to understand the lived experiences of people with disabilities, so they often produce generic responses that can invalidate or even insult their users.
A generic response like this can be harmful: it can leave a college student with a disability feeling even more isolated, especially if they already feel misunderstood by peers or professors.
AI can be extremely useful for simple, everyday tasks, but when it comes to mental health support, it has a long way to go before it is truly beneficial. People in need of support should continue to seek professional help rather than turning to AI. Human connection is crucial in times of need, and AI cannot genuinely provide it. For students with disabilities in particular, joining a support group of people who share the same disabilities can provide the sense of connection and belonging they need. Thus, professional help remains the best answer for mental health support.
Thank you for taking the time to read this blog post. This is one of the more serious topics in my blog series, given the rise of Artificial Intelligence in all aspects of our lives. Next week, I will be discussing living in a culture of constant productivity.