How to Cover AI on Campus

Experts discussed what AI is, what it can be used for and where reporters should look for red flags.

Photo credit: C Vector Studio/Bigstock

Generative artificial intelligence has caused panic in higher education since ChatGPT debuted almost two years ago and opened a new avenue for cheating. But that’s not the only way AI is being used on college campuses.

Panelists at EWA’s 2024 Higher Education Seminar at the University of Pennsylvania’s Graduate School of Education in Philadelphia busted some myths about AI, discussed how colleges are using it and offered story ideas for reporters. 

Who Was There?

Taylor Swaak, a senior reporter who regularly covers AI at The Chronicle of Higher Education, moderated this panel. Panelists were: 

  • Jenay Robert, a senior researcher at EDUCAUSE, which conducts surveys about how colleges are using AI.
  • Joe Sabado, deputy chief information officer at the University of California, Santa Barbara, co-director of the university’s AI Community of Practice and a member of the University of California’s AI Council.
  • Kofi Nyarko, a professor at Morgan State University and director of the Center for Equitable Artificial Intelligence & Machine Learning Systems.

What Is AI? 

Artificial intelligence is not new. Sabado pointed out that it has been on college campuses since the 1950s. 

Generative AI is what has caught fire more recently. In the past, he said, AI was used to analyze data. By contrast, generative AI produces new content based on the data it was trained on.

Panelists discussed both old and new uses for AI. Nyarko got the conversation started by explaining that AI is “any computational system that is able to complete tasks that would typically require human intelligence.” 

Computation, here, is key. Computers find patterns in very large datasets. That’s their main value to humans, who can keep only so much information in their heads at a time.

Historically, colleges have used AI, or “machine learning,” for its predictive abilities. Predictive systems take in lots of information about how things happened in the past and then make predictions about how things are likely to happen in the future. This has long helped colleges manage energy use on campus and make the best use of space and resources, panelists pointed out. It has also helped colleges identify students who are at risk of dropping out.
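For a concrete picture of how such a system works, here is a minimal sketch in Python, using entirely made-up records and hypothetical features (GPA, credits attempted and financial aid status are illustrative, not drawn from any real campus system):

```python
# A minimal sketch of a dropout-risk predictor trained on made-up
# historical records. All features and numbers here are hypothetical.
from sklearn.linear_model import LogisticRegression

# Past students: [GPA, credits attempted, received financial aid (0/1)]
past_students = [
    [3.8, 15, 0], [2.1, 9, 1], [3.2, 12, 1],
    [1.9, 6, 0], [3.5, 15, 1], [2.4, 9, 0],
]
dropped_out = [0, 1, 0, 1, 0, 1]  # what actually happened

model = LogisticRegression().fit(past_students, dropped_out)

# The trained model scores a current student from the same features.
current_student = [[2.8, 12, 1]]
risk = model.predict_proba(current_student)[0][1]
print(f"Estimated dropout risk: {risk:.0%}")
```

The point for reporters: the model learns only from what happened before, so whatever shaped those past outcomes also shapes its predictions.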

More recently, higher education has turned to generative AI. This type of computer model is trained on as much data as possible, and it is designed to identify patterns that ultimately equip it to mimic human language. Out of this come chatbots, automated note taking, and other forms of AI-generated writing. Colleges and universities have created chatbots to help students get quick answers about admissions, financial aid and other student services. 

Sabado said a University of California AI Council survey found the university’s health centers have started using generative AI for billing, note taking and transcriptions. A recent EDUCAUSE survey showed more than half of respondents in higher education said AI is being used across their entire institutions – teaching and learning, research and insights, daily operations and business. 

The stated goal of all these uses is to improve administrative efficiency and the student experience. What remains unclear is whether those gains will be worth the cost of developing AI tools.

What Are the Risks?

Using any form of AI carries risk. Humans are conditioned to trust what computers say. But generative AI is not like a calculator, giving the right answer every time. Generative AI tools merely offer a guess: the most likely answer based on everything the model has processed before.
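To make that concrete, here is a toy sketch, assuming an invented probability table rather than a real model: a language model scores every possible next word and emits a likely one, with no notion of whether it is true.

```python
# Toy next-word "prediction" for the prompt "The capital of France is".
# These probabilities are invented for illustration; a real model
# computes them from patterns in its training data.
import random

next_word_probs = {
    "Paris": 0.72,
    "Lyon": 0.15,
    "beautiful": 0.10,
    "Belgium": 0.03,  # unlikely but possible: a confident wrong answer
}

# The model samples from this distribution; truth never enters into it.
word = random.choices(
    population=list(next_word_probs),
    weights=list(next_word_probs.values()),
)[0]
print("The capital of France is", word)
```

Run it a few times and it will occasionally print a wrong answer with the same confidence as the right one.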

In describing early warning systems used to identify students at risk of dropping out, Nyarko discussed the “black box problem.” AI systems don’t show their work. Once a model is trained, people can see only the answers it provides, not every piece of data used to reach them. Big, complex models simply use too much data along the way to make that practical.

And biased data can lead to biased predictions, Nyarko said, creating vicious cycles in outcomes. He described a hypothetical brilliant student, doing well in school despite coming from a disadvantaged background. But the student gets labeled “at-risk” and goes to a counselor to try to figure out why. The counselor doesn’t know, but now both the student and the counselor have a kernel of doubt about the student’s chance of success.

Sometimes predictive models are trained on biased data, so they are left free to conclude that all Black students are at higher risk of dropping out because Black students dropped out at higher rates in the past. Nothing corrects the model to make clear that race itself is not the driver of that outcome.
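Here is a minimal sketch of that failure mode, with entirely fabricated data: when group membership correlates with past dropout rates, a model trained on it will replay the disparity even for two students who are otherwise identical.

```python
# Entirely fabricated data. Each record is [group (0/1), GPA]; group 1
# students dropped out more often in the past for reasons (funding gaps,
# weaker advising) that never appear anywhere in the dataset.
from sklearn.linear_model import LogisticRegression

X = [[0, 3.6], [0, 2.9], [0, 3.1], [1, 3.6], [1, 2.9], [1, 3.1]]
y = [0, 0, 0, 1, 1, 0]  # 1 = dropped out

model = LogisticRegression().fit(X, y)

# Two students with the same GPA get different risk scores purely
# because of group membership.
for student in ([0, 3.4], [1, 3.4]):
    risk = model.predict_proba([student])[0][1]
    print(f"group {student[0]}, GPA {student[1]}: risk {risk:.0%}")
```

The model is faithfully reproducing the historical pattern; that fidelity is exactly the problem Nyarko described.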

Sabado pointed out that sometimes the data used to train AI models is simply incorrect. “Garbage in, garbage out,” he said. Colleges and universities have to make sure they use good, clean data to train AI systems decision-makers can trust. 

Equity in Access and Training

Campuses have responded differently since ChatGPT’s release. Some, like Arizona State University and the University of Michigan, have leaned into generative AI and made it available to their entire institutions. 

Morgan State University, where Nyarko works, has developed trainings about AI through his center. Not all institutions have the money to make ChatGPT widely accessible or to help people understand what AI is and what it can be used for. And many don’t have the desire to try. 

But panelists said that work is critical.

“If a campus doesn’t have a[n AI] literacy program for faculty, staff, and students, that should be a bit of a red flag,” Nyarko said. “Good campuses are developing these literacy programs, not saying they’re going to ban it.” 

Resources

Here’s how to keep up with AI news: 

  • The TLDR AI daily newsletter. 
  • Follow people on social media who work on campuses or research AI, then follow the people they follow in this space.
  • EDUCAUSE’s research about AI use in higher education, including its 2024 AI Landscape Study.
  • Look at what students are talking about on Reddit.

Story Ideas 

  • Ask colleges and universities for their AI policies and whether they have any internal guidelines. 
  • Mine National Science Foundation (NSF) grant announcements for information about which institutions are getting money to experiment with AI and how. 
  • Probe AI uses for mental health. 
    • Panelists had a good discussion about how an AI chatbot can be a gateway to mental health services for students wary of counseling. But there are privacy concerns, as well as questions about whether the AI was trained on good enough data to serve diverse student groups and meet their mental health needs on the way to getting them into a counseling center.
  • Investigate any example of generative AI being used for predictive systems, including those leading to admissions or financial aid decisions. 
    • Large language models, which underpin generative AI systems, are not designed to produce reliable predictive analytics the way other machine learning models are.
  • Talk to people who are neither strongly opposed to nor strongly supportive of generative AI, and incorporate their voices into your stories. Reach out to students and campus staff.

And one final idea: Tell the stories of people who are harmed by algorithmic bias. 

“The more of those stories you tell, it hits home that it’s more than just computers running programs,” Nyarko said. “It’s people’s lives.”
