As Annika Culver - a Florida State University professor of East Asian history - reviewed one of her students' writing assignments, she knew something was off: the writing sounded "too sophisticated" and almost robotic.
When she ran the text through a detection tool, it flagged 49% of the writing as generated by artificial intelligence, better known as AI.
"When I read a paper where a student used AI, there's something weird about it - each sentence will look wonderful and sound great, but it's all fluff, it doesn't really say anything and there's weird citations at the end," Culver said. "I can pretty much tell when students are using it, and then they'll confess."
With artificial intelligence at the top of discussions in higher education, FSU and Florida A&M University are making moves to address the state of AI on their campuses and to determine how to move forward with the quickly emerging - and yet daunting - technology.
The capabilities of AI have made the technology a pressing issue on a national scale as universities and other entities grapple with the prospects of privacy violations and unethical shortcuts to success.
FSU created an Artificial Intelligence in Education Advisory Committee in July and is currently working on a generative AI policy with guidelines and principles for faculty to follow. The committee's recommendations will be presented to the Faculty Senate and the Office of the Provost in December.
"What we need is something that will help faculty make their own decisions by giving them suitable guidelines - and guide rails, maybe - as to how to make informed choices while also giving our students skills they need to use AI effectively and ethically," said FSU Associate Vice Provost for Academic Innovation Paul Marty, who is also a professor at the university's School of Information.
Similarly, FAMU announced Thursday that it formed a 20-member Artificial Intelligence Advisory Council to assess the integration of AI in the classroom and in campus-wide programs.
While the university will have a policy protecting FAMU from encroachment on personal and professional uses of AI, the advisory council will look at ways to cultivate AI in teaching and learning spaces. The interdisciplinary group will start meeting at the beginning of next year.
"This AI council will combine thought leaders from across every discipline from the arts, humanities and social science to the STEM areas and engineering," FAMU's Provost and Vice President of Academic Affairs Allyson Watson said. "We want faculty to embrace the good things that AI brings and to be cognizant of some of the factors that could be harmful."
What is AI, and what are universities doing about it?
AI is an emerging technology in which machines are programmed to learn, reason and perform in ways that imitate human intelligence, and one of the main factors driving college-wide AI policies is data security.
If an AI platform receives information such as an individual's financial data and uses it as further training data, that information could potentially find its way into answers given to another user later.
According to a recent article in the Chronicle of Higher Education, securing sensitive information is what led leaders of Babson College in Wellesley, Massachusetts, to establish an overarching AI policy last year - a process that took several months and resulted in a three-page policy focused exclusively on data security.
In Florida, similar policies and guidelines are on the websites of State University System institutions such as Florida Atlantic University, Florida Gulf Coast University and the University of Florida - which has an Interdisciplinary Informatics and Artificial Intelligence Research Institute that aims to build a stronger AI research community across its campus.
UF also offers undergraduates an AI Fundamentals and Applications Certificate covering the basics of AI, its applications to real-world problems and the ethical and professional responsibilities surrounding it.
The topic of AI came up in a Florida Board of Governors meeting over the summer, where one suggestion was to develop a central repository for AI tools, knowledge and best practices, but the board does not yet have a universal policy in place for the SUS.
Watson says FAMU's new AI council will be responsible for analyzing and identifying areas of weakness related to AI, such as cheating.
"You don't want a master's student writing a thesis paper completely with AI, and you don't want a student studying journalism to write a story completely derived from artificial intelligence thoughts and not their own," Watson said. "This council means we're taking a step in the right direction, we're listening to what national research tells us and our ear is to the pulse of innovation and the future of higher education."
In addition, AI's growing dominance includes popular tools like ChatGPT (Generative Pre-trained Transformer) - a chatbot released in November 2022 that responds to questions and commands from users to provide detailed, human-like answers through text interactions.
In response to the release of ChatGPT, a Generative AI Task Force was formed at FSU and has been meeting informally since January 2023 to discuss AI on campus, Marty said.
He explained how hard it is to determine where to draw the line in AI, but since a university policy has not been established yet, it is mainly up to FSU's faculty members to decide to what extent their students can use AI.
"Faculty governance is always a core tenet of the university system, and faculty decide what happens in their classrooms," Marty said. "That includes the use of artificial intelligence."
'It's really concerning'
Professors like Culver have statements in their syllabi that say students are not allowed to use AI for writing assignments or quizzes. If they do, it constitutes plagiarism and violates the honor code.
"They're not supposed to do it, so they just don't get credit for that assignment if they do," Culver said. "That's my policy."
With college students using AI for writing-intensive and research-heavy subjects such as English, social studies and humanities, Turnitin - a well-known tool that over 16,000 institutions use to verify authenticity and detect plagiarism - introduced a new AI detection tool last year called Originality to assist instructors in identifying AI-generated text.
An Inside Higher Ed article says Turnitin asserts the tool can detect 97% of writings generated by ChatGPT.
According to an April article in Education Week, 11% of assignments run through Turnitin's AI detection tool showed evidence of AI use in at least 20% of the writing, and 3% of assignments were made up of 80% or more AI writing.
Since the new tool is built into the course management software Canvas, many faculty members at both FSU and FAMU have writing assignments filtered through it automatically, which also lets students know what the AI detection results reveal.
"They get pressed for time, and they think they can slip one through," said Culver, who uses the AI detection tool.
FAMU Professor of English Zachary Showers said he has caught at least five of his students using AI as a cheating tool.
"I'll assign a paper about a problem students encounter here at FAMU, and I'll expect the students to talk about trouble with housing or trouble with financial aid," Showers said. "But then, I'll get a paper that's about student loan funding in Virginia or something - just completely out in left field."
Because Showers warns students in his syllabus about AI usage, he gives a zero to any student who uses it to write an assignment.
"It's frustrating because it shows that the student doesn't really care about the assignment," Showers said. "They just throw something together and turn it in. I think every university is having this trouble right now."
Troy Spier, another FAMU professor of English, says about a quarter of the students in his freshman composition courses utilized AI in their writing this semester. He described the AI-generated work as "very dry."
"It doesn't have a human touch," Spier said. "It feels like a computer is speaking to Congress, or like it has an internal checklist and it's trying to make sure that each item is checked off."
FSU's Allen Morris Professor of History Andrew Frank says the shift AI has created is kind of worrisome, and he explained two instances of AI usage in his classes from last year.
"They were reflections of what you'd imagine the internet would give you as the conventional wisdom for an answer, as opposed to an answer that relates to the materials from the course," Frank said, referring to student papers he received in a course about the history of the Seminoles.
Despite the negative uses of AI, some professors encourage AI usage to a certain extent. In Culver's class, although her students are not allowed to use it for writing, she says it can be used as a tool for finding research materials.
"I just don't want them to use it for writing," Culver said. "It's really concerning because then, they won't know how to write on their own."
Other FSU, FAMU initiatives to address AI
While university policies at both FSU and FAMU are still in the works, Watson says the Teaching and Learning Center on FAMU's campus has held at least 30 workshops over the past two years on how faculty can identify whether students are using AI and how they can help students think critically.
FAMU faculty were also given an AI for Educators resource guide this fall to help them understand how to use AI, the best practices and things to look out for, Watson said.
Meanwhile, FSU has Faculty Innovator Coffee Chats hosted by the university's Innovation Hub that are held every other Wednesday with 50 to 100 active members who gather to discuss how AI is reshaping academia. Topics have included ethics, academic integrity, medicine, K-12 education, qualitative data analysis and quantitative research.
"There's a lot of interesting potential, but I think it's still such a new tool," Culver said about AI. "We're still learning how to handle it and how to make it a tool that serves our purposes rather than it being a scary thing that will cause harm."