California Lutheran University currently has no campus-wide stance on the use of Artificial Intelligence for its students and faculty. However, specific departments have their own stances that are unique to their own classes.
AI is a tool that has seen massive and rapid growth over the past few years. Dean of the College of Arts and Sciences Tim Hengst said many academic institutions don't quite know how to deal with it.
"We've done a lot of work on AI and generative Artificial Intelligence. As a university, we don't have a set policy," Hengst said.
The School of Management has created its own policy, which states, "We believe that AI tools can be used to support student learning, but that they should not replace or displace that learning."
The policy also states that students can use AI for assignments, but are instructed to do so in specific ways as determined by the professor. Those ways vary from department to department.
"We've kind of relied on each department to define how they want to use it. I know the English Department has expressed the most concerns about it because they have, you know, they're focused on writing intensive assignments," Hengst said.
Assistant Professor of Finance John Garcia said he has been able to create and train a chatbot for his classes that students can ask questions and get immediate answers from, rather than waiting for him to respond. He also said AI has been useful for his students when generating code or organizing financial data.
"Some of my machine learning classes, they'll use it to generate some code and then kind of build off that and another, some of my finance classes will use it to basically summarize kind of the key risk from a 10-K," Garcia said.
Garcia also said the School of Management's policy encourages students to use AI but does not require it, and it is up to the professor to decide the best ways to integrate AI as a tool in their classroom.
"We can't ignore it. You wanna get students' guidance as to what's fair and what's not fair, so that you can feel comfortable using it in the right context, because in the real world, you're gonna use it," Garcia said.
Garcia also said classes focused specifically on how to interact with particular AI tools would be interesting to develop.
"I very much like to think through what an AI major would look like, like an AI in Society [major]," Professor of Political Science Jose Marichal said.
Marichal said that in order to have a major focused specifically on AI, it would need to include both how to use it and how to critique it. One critique that Garcia, Marichal and Hengst all shared was that AI is unable to think critically.
"Of course, ChatGPT knows how to do some basic things, but that doesn't mean you don't have to learn," Garcia said. "There's some things that you want to learn so you can build the ability to critically think and connect those to do something that's different, something that enables you to think critically."
Garcia said the controversy over AI also affects instructors because it is becoming very hard to tell whether certain work was done by the student or by an AI, but it is not completely undetectable.
"If you're using a free version, it's clear if you've used AI without even applying a tool to check, but if you're using more of the paid tools like GPT-4, Gemini Advanced, then it's much more difficult, but there's still some similarities," Garcia said.
Garcia said the smaller class sizes at Cal Lutheran give professors the opportunity to get to know their students better, which in turn helps them to understand if using an AI platform for assignments would be out of character for them or not.
"That's one of the good things about having small class sizes," Garcia said. "You're able to have, you know, better understanding of your students and that enables you to more effectively identify potentially if AI is being used for different things."