Students Need Human Relationships to Thrive. Why Bots May Stand in the Way

In August, OpenAI released the system card for GPT-4o, its latest model. It’s a highly technical and fairly bleak read, detailing risks and safety concerns that generative artificial intelligence could create or amplify. At the very bottom of the document, OpenAI enumerates “societal impacts” it intends to study further. First on its list? Anthropomorphization and emotional reliance.

That’s a fancy way of saying bots are increasingly capable of sounding human, putting humans increasingly at risk of bonding with them. OpenAI admits its own catch-22: These improvements create “both a compelling product experience and the potential for overreliance and dependence.” In short, as the tech gets better, the social risks get worse.

Ed tech tools are not immune to this challenge. As AI floods the market, technology companies and district leaders alike must start asking the hard question: If bots are increasingly built to emulate human relationships — if they are being engineered to sound human — are they also being designed to help connect students to actual humans?

If not, AI tools risk displacing students’ human connections. That poses long-term risks to students’ well-being, their ability to maintain human relationships and their access to networks that open doors to opportunities.

In a new report, Navigation and Guidance in the Age of AI, Anna Arsenault and I set out to analyze whether and how that question is being addressed in AI-enabled college and career guidance.

This is a domain where chatbots are especially likely to take hold. On average, high schools have one guidance counselor for every 385 students. Research suggests a mere 6% of high school counselors’ time is spent on career advising. Such scarce human resources create gaps where chatbots can help, offering students on-demand personalized advice about applying to college, graduating and launching careers.

Our report features insights from founders, CEOs and chief technology officers at over 30 technology companies that build and implement chatbots to support students as they apply to college and continue through to their careers.

Based on these interviews, OpenAI’s warnings about anthropomorphization — attributing human characteristics to non-human things — ring true. For example, most college and career bots have names and are designed to mimic cheerful, upbeat personalities. Many go beyond informational support to offer students emotional and motivational assistance when counselors can’t.

Are students over-relying on these bots? It’s too early to tell. But while most of the leaders we interviewed envision a system of hybrid advising that gives students access to both bots and human coaches, the majority admitted that some students gravitate toward bots in hopes of avoiding human interaction altogether.

In short, the possibility that students may start to bond with and rely on bots, rather than humans, is very real.

Luckily, a number of these leaders are taking steps to build bots that foster relationships rather than merely mimic them. Here are five examples of efforts to ensure that authentic human connection is an outcome, rather than a casualty, of AI products:

Promoting frequent social interaction offline: Axio AI, which spun out of Arizona State University’s student-led Luminosity Lab, is an AI companion that supports students’ personal growth. The bot has been trained to learn about the relationships in students’ lives. If students tell the bot they are struggling or bored, it will suggest reaching out to specific friends or family members. Axio has also worked to curb overreliance by limiting students’ time on the app.

Involving students’ families and friends: Uprooted Academy is a nonprofit that operates a virtual community center where students can interact with AI-powered coaches that help them apply to college. To ensure that those digital relationships don’t replace real ones, Uprooted Academy asks students to identify up to five supportive individuals in their lives when they enroll. The tool automatically updates those five people by text message every two weeks with recommendations for supporting students’ college progress.

Prompting conversations — even the hard ones: CollegeVine, an AI counselor and tutor, coaches high school students throughout the college application process. Through conversations with the students, CollegeVine’s bot, Sage, keeps track of how they describe their interactions with advisers and teachers. When it comes time to ask for recommendation letters, the bot can coach students on whom to ask and how to address any challenges they might have faced in interacting with those adults.

Matching students and mentors: Backrs is a platform that recruits online volunteer mentors to coach high school students on projects related to their academic and extracurricular interests. The company recently released an AI success coach, Lubav, which helps students find the right mentors on the platform and craft messages to them.

Practicing networking through online role-playing: Coach by CareerVillage.org is a chatbot designed to help students practice asking for help. The platform includes a series of career development activities in which students practice for interviews with the bot and draft networking and job-hunting emails, social media messages and letters asking for references.

These examples highlight AI’s potential to strengthen human connections. However, the incentives to build relationship-centered AI tools are weak. Few schools are asking for these social features or evaluating tools for their social impacts.

If things remain as they are, the more that anthropomorphic bots simulate relationships across guidance, tutoring and student support, the more they could foster student isolation.

But that outcome is not inevitable. Research underscores the importance of relationships in youth development and economic mobility. To live up to their mission, schools should prioritize human connection, ensuring that AI tools work for — not against — expanding students’ networks. Information technology coordinators and purchasers, superintendents, principals and educators who are involved in procuring new technologies should demand evidence that AI enhances relationships and implement data systems to track that progress. Entrepreneurs taking steps to safeguard and expand connections should be rewarded for their efforts.

Otherwise, ed tech companies risk the same catch-22 as OpenAI: building artificial intelligence that gets better and better, but to the detriment of the human relationships students need to thrive.

Disclosure: Julia Freeland Fisher serves as an unpaid adviser to Backrs.
