Higher Ed Grapples with AI’s Impact

From blanket bans to curriculum acceptance, universities are taking different approaches to address the growing popularity of artificial intelligence in and out of the classroom.

By: Aila Boyd

Although they had been looming on the horizon for some time, large language models sent shockwaves through the higher education world in late 2022 when OpenAI released ChatGPT. The release led many to question the impact AI would have on the classroom experience, for better and for worse.

Due to the fluidity of the situation, many in higher ed have realized that what they know about AI today may not hold true tomorrow. This has led institutions to develop general guidelines, instead of rules that are set in stone. What is apparent to most educators, however, is that AI will continue to be a topic of discussion for years to come.

Policies

A May UNESCO global survey of more than 450 schools and universities found that fewer than 10% of them had developed institutional policies or formal guidance related to AI. 

“It’s all ad hoc. It’s all piecemeal,” said Dr. Tony Kashani, affiliate faculty and dissertation chair for the doctor of education in educational and professional practice at Antioch University.

Similarly, a June report from Tyton Partners noted that “only 3% of institutions have developed a formal policy regarding the use of AI tools, and most (58%) indicated they will begin to develop one ‘soon.’”

Although Gonzaga University hasn’t developed an AI-specific policy, its academic integrity policy does cover cheating with the use of AI in a broad sense. 

“Theoretically, students are not allowed to use the tool that is not specified by the professor to help them in their work,” said Dr. Justin Marquis, director of instructional design and delivery at the university.

Likewise, Dr. Melissa Vito, vice provost of academic innovation at The University of Texas at San Antonio, noted that when her institution started considering AI policies in early 2023, most of the discourse centered on the concern that AI would lead to cheating. The university has since moved toward a more engaged, educational approach to the issue.

“I didn’t want to start with policies because we don’t really know exactly what we’re dealing with,” said Vito. “We did decide not to go down a policy route where it was either going to be this or that but to keep faculty in their role to govern and guide their courses. It’s a dynamic issue. It’s not like a one-and-done. It continues to evolve.” 

As a result, a group of “faculty champions” worked to develop broad guidelines for faculty on the use of AI, as well as best practices that can be updated as understanding of the models evolves. Vito explained that faculty can bar the use of AI in their classes by stipulating as much on their syllabi.

National Louis University (NLU) has issued guidance to its faculty regarding AI; however, it isn’t being prescriptive in its recommendations.

“We believe in academic freedom and our faculty having the opportunity to make those choices,” said Dr. Bettyjo Bouchey, vice provost for digital strategy and operations at the university. If faculty allow the use of AI, she added, learning assessments should be reoriented in a way that the AI doesn’t diminish the learning process. 

Like others, Barnard College believes in faculty autonomy. For that reason, it’s up to individual faculty members to decide if and how they want to allow AI in their classes.

“It’s a spectrum; there’s no mandate to use it or not to use it,” said Dr. Melanie Hibbert, director of instructional media and academic technology services and the Sloate Media Center at the college.

The Barnard College Center for Engaged Pedagogy has developed resources to help faculty reach a decision about the use of AI in their classes. One graphic walks them through various considerations, helping them arrive at one of four stances: closed, restricted, conditional or open. For example, the restricted stance encourages faculty to “consider which learning outcomes may be negatively impacted.”

Sample syllabus statements were also developed, both for faculty who forbid AI and for faculty who are open to it. The AI-permissive statement notes that students will be told when and how to use tools like ChatGPT, while also advising that uses outside the stipulated guidelines are not permitted.

Detecting AI

Although educators feel that it’s important for students to acknowledge the role AI plays in their work, they tend to feel conflicted when it comes to detection. 

The Tyton report noted that preventing student cheating “jumped to the top instructional challenge reported by instructors in 2023, up from the 10th in 2022.” The concern may be well-founded: the report also indicated that 51% of students say they will continue to use AI even if their instructors prohibit them from doing so.

Turnitin started work on its AI indicator software roughly three years ago. 

“We started by building out a model that is based on a representative sample spread over AI-generated text, as well as academic writing,” explained Patti West-Smith, senior director of customer engagement at Turnitin. “One of the tenets of our work is the core belief that AI detection for an academic setting has to be trained on academic writing. It isn’t enough for it to be trained generally on content on the Internet.” 

She added that writing from underrepresented groups was deliberately overrepresented in the training data to ensure the model didn’t contain bias.

Because the model had already been in the works, Turnitin was able to roll it out in April. West-Smith said that in the early days, instructors simply wanted Turnitin to tell them whether AI had been used, not how or to what extent.

“It’s shifting. Customers are looking to us to help them figure out not is it right or wrong to use the tool, we’ve tried to move away from that, and simply provide the data. We want to give the educators the insight so that they can see when there’s indications that the tool is being used, to what extent,” said West-Smith. “It’s our core belief that it’s the educator’s professional discretion to situate that into a context that is particular to their classroom, their assignment and an individual student.” 

Turnitin has developed a resource center that offers guidance to faculty on topics such as developing their own policies, as well as student-facing information. 

Detection Accuracy

At the time of the April launch, CEO Chris Caren said, “We are able to detect the presence of AI writing with confidence.” 

He added that the company had been careful to adjust its detection capabilities to minimize false positives, which it says occur at a rate of “less than 1% for documents with 20% or more AI writing.” Dr. Eric Wang, Turnitin’s vice president of AI, released an August white paper laying out how the model operates.

West-Smith explained that the model looks for sentence-level indications that a submission was AI-generated. When a submission doesn’t meet the confidence threshold, the model doesn’t flag it.

“We’re okay with that,” she stressed. “If we miss some in a way that is fair and appropriate for students and doesn’t flag writing where we’re not sure, then that really fits into our core values as a company.” 
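
West-Smith’s description suggests a simple threshold logic: score each sentence, flag the document only when enough sentences cross a confidence cutoff, and leave borderline cases unflagged. The sketch below is a hypothetical illustration of that idea, not Turnitin’s actual model; the per-sentence classifier `score_sentence` and both threshold values are assumptions.

```python
from typing import Callable

def flag_submission(
    sentences: list[str],
    score_sentence: Callable[[str], float],  # assumed classifier returning 0.0-1.0
    sentence_cutoff: float = 0.9,            # per-sentence confidence required (assumed value)
    document_threshold: float = 0.2,         # share of flagged sentences required (assumed value)
) -> bool:
    """Flag a submission only when enough sentences look AI-generated.

    Deliberately biased toward false negatives: documents below the
    threshold are left unflagged, mirroring the stated preference for
    missing some AI text over wrongly flagging student writing.
    """
    if not sentences:
        return False
    flagged = sum(1 for s in sentences if score_sentence(s) >= sentence_cutoff)
    return flagged / len(sentences) >= document_threshold

# Demo with a dummy scorer (a stand-in, not a real detector).
demo = ["The sky is blue.", "In conclusion, the aforementioned factors converge."]
print(flag_submission(demo, lambda s: 0.95 if "aforementioned" in s else 0.1))  # True
```

The asymmetry is the point: anything under the threshold is treated as human writing, which is why a tool tuned this way will, by design, miss some AI-generated text.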

She added that Turnitin will continue to release updates to meet the demands of its customers, specifically based on the feedback it receives through its educator network.

In an August statement, Vanderbilt University said that after testing Turnitin’s detection tool, it would disable it for the foreseeable future. 

“Fundamentally, AI detection is already a very difficult task for technology to solve (if it is even possible) and this will only become harder as AI tools become more common and more advanced,” the university said. “Based on this, we do not believe that AI detection software is an effective tool that should be used.”  

Barnard has also taken an institutional stance against surveilling students’ use of AI, Hibbert explained. Because of that stance, it disabled the Turnitin feature.

“There’s really no way to perfectly detect artificial intelligence in writing,” said Bouchey. “The algorithms are far too sophisticated and continue to evolve even on a minute-to-minute basis.” 

If NLU faculty suspect that students have relied too heavily on AI, they are encouraged to talk it over with them.

Marquis’s office reached a similar conclusion after experimenting with Turnitin and other detection models.

“We’re finding that those tools are not reliable and/or very easily defeated,” he said. 

He pointed to a recent example: a professor was convinced a student had used AI, but when the paper was run through an AI detector, it came back as only 10% AI. Marquis then took the professor’s prompt, placed it in ChatGPT and ran the output through the detectors. Even though he knew the text had been generated entirely by AI, the detectors couldn’t verify that fact.

Institutional Approaches

“As a university, we can’t legally hold students accountable for something that we couldn’t prove,” said Marquis. “There’s no way for us to without a doubt prove a student has used AI to cheat.”

Vito said, in an ideal world, it would be nice to know when AI is used, but she doesn’t have enough confidence in the tools that are currently available to recommend using them. She does, however, think detection tools will become a greater part of the AI ecosystem over time. 

“Turnitin will always exist because institutions always want to see who’s cheating and who’s not cheating and who’s progressing based on merit,” said Kashani. 

However, he believes that policing the use of AI isn’t going to be the best way forward. Instead, Kashani hopes the focus will be on the useful applications of AI, such as tutoring. 

Instead of focusing on detection, NLU is looking at how it can design assignments and assessments in ways that incorporate AI responsibly and ethically. However, Bouchey said the institution wants to give students agency, since some of them may not want to use AI for their assignments.

With the limitations of detection in mind, Gonzaga’s academic integrity board has realized it needs to reach students before they cheat. It is doing so through a large-scale campaign to educate students about what AI is, how it can be used and what the continuing value of humans is.

“They’re responsible for whatever comes out. AI is interesting and fun, but it really produces about C work, and there’s a ton of bias built into the system,” Marquis said. “If they’re using it for their papers or their jobs down the road, they are responsible. They’re the ones that are going to get fired if they put out biased press releases.”

As an example, he said his wife, who also teaches at Gonzaga, asked her students to write a paper about a Gabriel García Márquez story. She received a student submission that she immediately recognized as having been written by AI because of her expertise in the author’s work.

“What AI didn’t understand were the boundaries of that story. It was pulling in characters from other stories and events from other stories and smushing them all together,” Marquis said. “Faculty experts in their field have no problem in identifying really the C, D or F work that AI puts out.”

Citing the Use of AI

Bouchey said her institution wants students to acknowledge their use of AI in the same way they credit the work of others.

“Attribution would be critical and that’s a first skill for us to teach,” she said. 

She pointed to the APA guidelines for citing ChatGPT as a good starting point. APA says to credit OpenAI in both in-text and reference citations when ChatGPT is used. MLA, meanwhile, advises students to “cite a generative AI tool whenever you paraphrase, quote, or incorporate into your own work any content that was created by it.” Included in the guidance is the notion that writers can modify the recommendations if they feel doing so is called for.
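
As a concrete illustration, APA’s published guidance formats the reference entry roughly as follows, paired with the in-text citation (OpenAI, 2023); the version date should reflect whichever model release the writer actually used:

OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat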

“It requires transparency at the level of method,” said Dr. Melissa Wright, executive director of the Center for Engaged Pedagogy at Barnard College. “It’s a metacognitive awareness of how this tool is supporting research, which is really exciting for me because I don’t think we always think that closely about how we use tools and interface with them.” 

She added that citation practices will likely be discipline-specific.

Marquis is advising faculty to have students submit the chat transcripts from their sessions with the AI model.

“There’s an export button on ChatGPT,” he said. “Give me the chat so I can determine where your work came from.” 

Faculty Understanding of AI

Faculty members who have experimented with generative AI are far more likely to recognize its potential value in education and to advocate for policies and practices that enable its responsible use in teaching and learning, Tyton found. However, that group appears to be in the minority: 71% of surveyed faculty had not yet tried AI.

The faculty at Gonzaga, Marquis noted, are all over the map with their feelings on AI. Some are integrating it into every component of their classes, while others simply want to ban it outright. “I think there’s a lot of hesitancy to embrace it right now,” he said. Despite that, he observed many of them are starting to view it as a tool like a calculator or the Internet. 

“It’s not going to end the world,” said Marquis. “It’s certainly not going to save the world. It’s going to be somewhere in between, and we need to consider it as the tool that it is.” 

He added that although AI can give answers, it can’t make connections for students and explain the “why” behind the answers. 

“We can offload some of the memorization of facts and information if we trust the AI not to hallucinate too much and we as faculty can focus on the who and how, connect it to the world and connect it in innovative ways,” he said. 

Kashani explained that the fear some faculty have of AI is one of the reasons it’s so difficult to develop concrete guidelines. 

“You’re accomplished people. You’re obviously very intelligent. You have a wealth of knowledge under your belt given your level of education and profession. Don’t be afraid. People were afraid of print technology,” he stressed. “It’s not difficult to understand the logic of generative AI. They don’t need to learn how to write code, they need to learn how this works, so they’ll be able to use it as a tool for their students.”

He hopes educators will move forward with the understanding that AI is simply an extension of established pedagogies. 

NLU is offering a course for its academic leadership this fall that was designed as a community of practice to delve deeper into the use of AI. 

“We want to equip them with the knowledge, skills and abilities,” Bouchey said.

She added that the introduction of AI has been a lot for faculty to take in, especially considering that many of them are still struggling with the long-lasting impacts of COVID-19. 

“Probably we have some faculty that are at a place of ‘How much more can I really put on my plate?’” she said. 

Alternatively, Vito speculated that COVID-19 made those in higher ed more resilient when it comes to disruptions. 

“All of us in higher ed need to stay engaged and stay flexible,” she advised. “Now is not the time to make big pronouncements.”

Faculty Use of AI

Although AI can help further students’ education, professors can use it as well. In March, Tyton found that more than 50% of faculty who currently use AI employ it to generate materials like prompts and assessments.

Kashani noted that the idea of a professor obtaining administrative help isn’t new, with many of them having teaching assistants for things like grading and holding discussions. 

“AI can be an assistant to the professor 24/7 and work as a liaison between the student and the professor,” he said. 

NLU is supportive of its online faculty using AI to help generate announcements for students. Bouchey said faculty can also use it to expedite feedback, provided they acknowledge as much to students.

“We look at it as being augmented, but certainly not a replacement to,” she said. 

Galvanizing Effect of AI

It’s hard to deny that the arrival of AI has galvanized disparate parts of the higher ed community to come together and grapple with its far-reaching impacts.

Wright said that the release of AI has allowed her center to partner with faculty it hadn’t engaged with previously. 

“It’s really exciting to build new relationships,” she said. 

“We’re seeing a lot of optimism around working together and potentially getting to a solution faster by coming together with colleagues and having these critical conversations so that we’re not just in our silos trying to process and figure this out all by ourselves,” Bouchey said of NLU.

She hopes that the discourse going forward will focus on developing a collective effort to utilize AI for the betterment of higher ed, students and society. Although many of the larger institutions are putting a lot of time and effort into developing approaches to AI, Bouchey noted that it’s important that they reach back and engage smaller schools, like community colleges. 

Aila Boyd

Reporter

Aila Boyd is a Virginia-based journalist and educator. As a journalist, she has written for and edited daily and weekly newspapers and magazines. She has taught English at several colleges and universities and holds an MFA in writing.

