College x machina: Students may see an even greater (but more regulated) presence of AI in Oregon State University classrooms starting next fall.
Inara Scott, senior associate dean with OSU’s College of Business, said that the college is currently reviewing proposals from professors for updating their course curriculums to include AI.
Proposals could receive funding of up to $5,000 within the next couple of weeks.
This is only one of the many complex ways in which OSU’s use of AI might evolve next year.
“The university seems ready to go ‘all in’ with AI as a research subject and generally is encouraging us to integrate it into our own research and teaching as well,” said Evan Gottlieb, a professor for OSU’s School of Writing, Literature and Film. “That being said, the university does seem to realize that the impacts and values of AI are going to vary widely across disciplines.”
According to Ehren Pflugfelder, an associate professor for SWLF, AI could be required for some assignments while prohibited for others, and he suggested these policies would be stated in course syllabi.
Gottlieb said that OSU’s Center for Teaching and Learning has provided three template syllabus statements for AI: one permitting the use of AI in the class, one permitting its use for only certain assignments and one prohibiting it altogether.
“An ‘ethic of transparency’ seems to be the key phrase here,” Gottlieb said. “It’s inevitably labor intensive to determine whether a student has cheated by using AI when they’re not supposed to.”
Gottlieb added that he knows professors who don’t feel like it’s their job to police it either, while others think it should be treated the same as traditional forms of plagiarism.
Scott said that professors are creating many different AI tools for their classrooms next year.
“Some faculty are trying to build chatbots to support student success,” Scott said. “We’ve had a couple faculty in the College of Business working on those types of applications.”
Scott said that the university has also obtained a license for Copilot, a Microsoft AI product.
“The university has been trying really hard to get ahead of security issues,” Scott said, referring to a worry that data like student information could potentially enter AI databases that could be made public or used to train other models.
Scott added that it would be safe for students to access Copilot through their university accounts.
“I think what the university is doing right now is trying to create plans for AI that are consistent with our strategic plan,” Scott said.
The strategic plan, called “Prosperity Widely Shared” and covered in more detail in a previous Barometer article, mentions focusing on “enabling domains of AI.”
“Part of Prosperity Widely Shared is that they’ve identified four signature areas,” Scott said. “One of those is robotics. I participated in a task force related to that and there’s a lot of intersections with AI.”
Scott said that her personal goals for AI implementation were to build effective tools that enhance student learning and increase societal equity.
“I think there’s a lot of concerns that AI has the potential to magnify inequity,” Scott said.
Scott argued that not exposing people to the technology has the potential to add to that issue.
“I don’t worry so much about students being replaced by a computer, but I do worry about students competing for jobs with people who are really good at using AI,” Scott said.
Scott explained that AI tutors, for example, could give students the kind of one-on-one attention that has traditionally been a privilege of the wealthy. AI will impact learning tremendously, according to her.
“AI has absolutely changed learning,” Pflugfelder said, agreeing with Scott on that point at least. “Frustratingly, for students who want to simply finish an assignment that has little meaning to them, ‘busy work,’ or work that can be easily shortcut via an AI model, it provides an option aside from deep engagement with materials.”
Pflugfelder also remarked that no AI model has been created “ethically.”
“There’s a lot of ethical concerns that people still have about the technology,” Scott said, mentioning a controversy about OpenAI allegedly using Scarlett Johansson’s voice for their AI without permission as an example.
Besides nonconsensual use of a person’s likeness, concerns about AI include its environmental impact, since it has been shown to use massive amounts of water and electricity, according to The Conversation and MIT Technology Review.
“From my perspective, and that’s not everyone’s perspective in SWLF, it’s not going to magically ‘go away’ and it provides some interesting opportunities,” Pflugfelder said. “As well as some challenges.”
Pflugfelder said that instructors can also critique AI text to teach students how to do better than the models can.
Gottlieb said that he allows students in his 400- and 500-level courses to use AI in longer research papers.
“Even then, I tell students that they need to footnote every instance of their AI usage, explaining why they used AI at that juncture, what they asked it and what limitations they can see in its response,” Gottlieb said.
His students have only chosen to do this a handful of times, he said.
“The biggest danger with student AI usage in the humanities is not that it’s going to do the sentence-level writing for students,” Gottlieb said. “When students use AI to ‘do their thinking for them,’ they’re robbing themselves of their own education.”
Pflugfelder emphasized one way in which AI can help students learn: by asking questions.
“One of the positives of some AI integration is that good AI prompt questions are often good questions, full-stop,” Pflugfelder said. “Prompts to an AI can mimic the rhetorically savvy questions we want students to ask, and then think with, as they create their own writing.”
“Right now what we don’t see, I would say, is a lot of uniformity,” Scott said. “We’re not at a point (where) everybody is doing the same thing the same way, we’re really at a point of innovation.”
Scott said that she’s excited to support those innovators.
“Think about the way the internet has changed your daily life. Now think about that with AI,” Scott said.