California college professors have mixed views on AI in the classroom



Cal State Long Beach lecturer Casey Goeller wants his students to know how to use AI before they enter the workforce.

Tasmin McGill/EdSource

Since OpenAI’s release of ChatGPT in 2022, artificial intelligence (AI) chatbots and models have found their way into California’s college systems. These AI tools include language models and image generators that produce responses and images based on user prompts.

Many college professors have spoken out against AI’s use in college coursework, citing concerns about cheating, inaccurate responses, student overreliance on the tools and, as a consequence, diminished critical thinking. Universities across the U.S. have implemented AI-detection software like Turnitin to deter cheating with AI tools.

However, some professors have embraced the use of generative AI and envision its integration into curricula and research in various disciplines. To these professors, students learning how to use AI is critical to their future careers.

An October 2024 report from the University of Southern California’s Marshall School of Business found that 38% of the school’s faculty use AI in their classrooms.

Ramandeep Randhawa, professor of business administration and data science at USC, was one of the report’s 26 co-authors and organized the effort. 

“As companies increasingly integrate AI into their workflows, it is critical to prepare students for this AI-first environment by enabling them to use this technology meaningfully and ethically,” Randhawa said. “Universities, as bastions of knowledge, must lead the way by incorporating AI into their curricula.”

All in on AI

At California State University, Long Beach, gerontology lecturer Casey Goeller has incorporated AI into his course assignments since fall 2023.

Students enter Goeller’s Perspectives on Gerontology course with varying levels of AI experience. Based on a show of hands, Goeller estimates the class is usually split evenly among students who have no experience, those who have dabbled with AI and those who have used it extensively.

Goeller aims to help students understand how AI can be beneficial to them academically, whether it be assisting with brainstorming, organizing, or acting as a 24/7 on-call tutor.

To that end, one of Goeller’s assignments has students use an AI tool of their choice to address his feedback on their essays, covering criteria such as content, flow and plagiarism concerns. Another assignment, worth 15% of their grade, emphasizes the importance of prompt engineering by having students use AI-generated questions to interview an older person in their life.

While Goeller gets a lot of questions from fellow faculty members about how AI works and how to implement it, he also hears plenty of hesitation.

“There’s a lot of faculty who’s still riding a horse to work, I call it,” Goeller said. “One of them said, ‘I am never going to use AI. It’s just not going to happen.’ I said, ‘What you should do if you think you can get away with that is tomorrow morning, get up really early and stop the sun from coming up, because that’s how inevitable AI is.’”

Goeller acknowledges the difficulty of establishing a single way to incorporate AI into curricula, given the range of academic disciplines and styles of learning, but he recognizes the growing presence of AI in the workforce. Today, AI fills various roles across industries, from analyzing trends in newsrooms and grocery stores to generating entertainment, a point of contention for SAG-AFTRA members during 2023’s Hollywood strikes.

“If we don’t help our students understand AI before they escape this place, they’re going to get into the workforce where it’s there,” Goeller said. “If they don’t know anything about it or are uncomfortable with it, they’re at a disadvantage compared to a student with the same degree and knowledge of AI.”

Because she considers AI inevitable in the workforce, California State University, Northridge, journalism lecturer Marta Valier has students use ChatGPT to write headlines, interview questions and video captions in her Multimedia Storytelling and Multi-platform Storytelling classes.

The goal of the implementation is to teach students how AI algorithms operate and how journalists can use AI to assist their work. Not using it, she said, “would be like not using ink.”

“I absolutely want students to experiment with AI because, in newsrooms, it is used. In offices, it is used,” Valier said. “It’s just a matter of understanding which tools are useful, for what and where human creativity is still the best and where AI can help.”

AI tools such as ChatGPT and Copilot are frequently updated, so Valier emphasizes flexibility when teaching with them.

“I basically change my curriculum every day,” Valier said. “I think it reminds me as a professional that you need to constantly adapt to new technology because it’s going to change very fast. It’s very important to be open, to be curious about what technology can bring us and how it can help us.”

However, Valier acknowledges AI’s problems with data privacy and factual accuracy. She reminds students that it is their responsibility to verify the information ChatGPT provides by doing their own research or rechecking results, and to avoid over-reliance on the platform.

“Be very careful with personal information,” Valier said. “Especially if you have sources, or people that you want to protect, be very careful putting names and information that is sensitive.”

Valier sees a clear difference in the quality of work produced by students who combine AI with their own skills, versus those who rely entirely on artificial intelligence.

“You can tell when the person uses ChatGPT and stays on top of it, and when GPT takes over,” Valier said. “What I am really interested in is the point of view of the student, so when GPT takes over, there is no point of view. Even if [a student] doesn’t have the best writing, the ideas are still there.”

Balancing AI use in the classroom

Many AI-friendly instructors seek to strike a balance between AI-enriched assignments and AI-free assignments. 

At USC, professors are encouraged to develop AI policies for each of their classes. Professors can choose between two approaches, as laid out in the school’s instructor guidelines for AI use: “Embrace and Enhance” or “Discourage and Detect.”

Bobby Carnes, an associate professor of clinical accounting at USC, has struck a balance between the two approaches in his Introduction to Financial Accounting course.

“I use it all the time, so it doesn’t make sense to tell (students) they can’t use it,” Carnes said.

An avid user of AI tools like ChatGPT, USC associate professor of clinical accounting Bobby Carnes encourages AI experimentation for some assignments, but prohibits students from using it on exams. (Christina Chkarboul/EdSource)

Carnes uses AI to refine his grammar in personal and professional work and to develop questions for tests. 

“I give ChatGPT the information that I taught in the class, and then I can ask, ‘What topics haven’t I covered with these exam questions?’ It can help provide a more rich or robust exam,” Carnes said.

He doesn’t allow students to use AI in exams that test for practical accounting skills, though. 

“You need that baseline, but we’re trying to get students to be at that next level, to see the big picture,” he said.

Carnes said he wants his students to take advantage of AI tools that are already changing the field, while mastering the foundational skills they’ll need to become financial managers and leaders. 

“The nice thing about accounting is that the jobs just become more interesting (with AI), where there’s not as much remedial tasks,” Carnes said. 

Preserving foundational learning

Olivia Obeso, professor of education and literacy at California State Polytechnic University, San Luis Obispo, believes establishing foundational knowledge and critical thinking skills through AI-free teaching is non-negotiable.

Obeso enforces her own ban on ChatGPT and other AI tools in her Foundations of K-8 Literacy Teaching class to prepare her students for challenges in their post-collegiate lives.

“AI takes out the opportunity to engage in that productive struggle,” Obeso said. “That means my students won’t necessarily understand the topics as deeply or develop the skills they need.”

Obeso is also concerned about ChatGPT’s environmental impact: For an in-class activity at the start of the fall 2024 semester, she asked students to research the software’s energy and water use. 

The energy required to power ChatGPT emits 8.4 tons of carbon dioxide per year, according to Earth.Org; the average passenger vehicle produces 5 tons per year. Asking ChatGPT 20 to 50 questions uses about 500 milliliters (16.9 ounces) of water, roughly the volume of a standard plastic water bottle.
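Put in per-question terms, those figures work out to roughly 10 to 25 milliliters of water per prompt. The short Python sketch below is only a back-of-the-envelope illustration of that conversion; the constants are the Earth.Org estimates cited above, not new measurements.

```python
# Back-of-the-envelope arithmetic using the Earth.Org figures cited above.
# The constants are reported estimates, not independent measurements.
WATER_ML_PER_BATCH = 500                # ~500 ml of water per batch of questions
QUESTIONS_LOW, QUESTIONS_HIGH = 20, 50  # reported range: 20 to 50 questions

# Dividing the same 500 ml across more questions gives the low per-question figure.
per_question_low = WATER_ML_PER_BATCH / QUESTIONS_HIGH   # 10.0 ml per question
per_question_high = WATER_ML_PER_BATCH / QUESTIONS_LOW   # 25.0 ml per question

print(f"Estimated water use: {per_question_low:.0f}-{per_question_high:.0f} ml per question")
```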

By the end of the exercise, Obeso said her students became “experts” on ethical considerations concerning AI, sharing their findings with the class through a discussion on what they read, how they felt and whether they had new concerns about using AI. 

“You are a student and you are learning how to operate in this world, hold yourselves accountable,” Obeso said. 

Jessica Odden, a senior majoring in child development, said Obeso’s class helped her, as an aspiring teacher, understand AI use in the classroom.

“For people that are using (AI) in the wrong ways, it makes people reassess how people might be using it, especially in classes like this where we are training to become teachers,” Odden said. “What are you going to do when you actually have to lesson-plan yourself?” 

Odden makes sure she learns the fundamentals of teaching on her own so that she will be prepared for her first job.

AI in curricula

At the University of California, San Diego, some faculty members have echoed concerns about AI’s encroachment on independent learning.

Academic coordinator Eberly Barnes is interested in finding a middle ground that incorporates AI into curricula where it complements students’ critical thinking, rather than replaces it.

Barnes oversees the analytical writing program, Making of the Modern World (MMW), where her responsibilities include revising the course’s policy on AI use in student work.

The current policy allows students to use AI to stimulate their thinking, reading and writing for their assignments. However, it explicitly prohibits using the software to replace any of those skills or to produce the written piece itself.

Despite the policy’s qualified acceptance of AI, Barnes expressed her own hesitancy about AI’s role in the social sciences and in building the research and writing skills needed to work in the field.

“One of the goals in MMW is to teach critical thinking and also to teach academic writing. And the writing is embedded in the curriculum. You’re not going to learn to write if you’re just going to machine,” Barnes said. “The policy is inspired by the fact that we don’t think there’s any way to stop generative AI use.”

When Barnes designs the writing prompts for the second and third series in the MMW program, she collaborates with teaching assistants to make assignment prompts incompatible with AI analysis and reduce the likelihood that students will seek out AI’s help for passing grades.

“Students feel absolutely obsessed with grades and are very pressured to compete,” Barnes said. “That’s been around. I mean it is definitely worse here at UCSD than it was at other colleges and universities that I’ve been at.”

A tool, not a cheat code


Celeste Pilegard is a professor of cognitive science and educational psychology at UCSD. She has taught introductory research methods since 2019, focusing on foundational material that prepares students for higher-level topics in the field.

Educators like Pilegard have been struggling to adapt since the widespread adoption of AI tools.

“For me and a lot of professors, there’s fear,” Pilegard said. “We’re holding onto the last vestiges, hoping this isn’t going to become the thing everyone is using.”

Pilegard is concerned that students rely on AI tools to easily pass their intro-level courses, leaving them without a firm understanding of the content and unable to properly assess AI’s accuracy.

“It’s hard to notice what is real and what is fake, what is helpful and what is misguided,” Pilegard said. “When you have enough expertise in an area, it’s possible to use ChatGPT as a thinking tool because you can detect its shortcomings.”

However, Pilegard does believe AI can assist in learning. She likens the current situation with AI to the advent of statistical analysis software back in the 1970s, which eliminated the need to do calculations by hand. 

At that time, many professors argued for the importance of students doing work manually to comprehend the foundations. However, these tools are now regularly used in the classroom with the acceptance and guidance of educators. 

“I don’t want to be the stick in the mud in terms of artificial intelligence,” Pilegard said. “Maybe there are some things that aren’t important for students to be doing themselves. But when the thing you’re offloading onto the computer is building the connections that help you build expertise, you’re really missing an opportunity to be learning deeply.”




