Student developing chatbot to determine when using AI violates Honor Code

Tom Coyne

Aidan Ji '24 works on his customized chatbot. 


A Culver Military Academy first classman is developing a customized chatbot that will help students understand when they are allowed to use artificial intelligence to complete homework and other assignments under the Culver Honor Code.

“I think AI should be used for the benefit of students, not just for the bad parts,” said Aidan Ji ’24, who is from Chicago. “It’s about teaching and instructing the students how we can work with AI.”

Anyone familiar with the importance Culver Academies places on honor and integrity understands the concerns both teachers and students have about the pervasive availability of generative AI such as ChatGPT, Microsoft’s Bing chatbot and Google’s Bard, which can answer questions, create essays on almost any topic (though the output frequently contains incorrect information), write poetry, solve science and math problems and produce computer code.

The school’s Honor Code, known by all students, is straightforward: “I will not lie, cheat, or steal, and I will discourage others from such actions.” What constitutes cheating is not so straightforward as artificial intelligence rapidly transforms the tools available to students to complete homework and other assignments.

During a faculty meeting in August, Becky Strati, Huffington Library director and a Culver Girls Academy Honor Code advisor, told teachers how Ji and Celeste Gram ’24, CGA first rotation Honor Council chair, had been in touch over the summer because of their concerns about AI and the need to let students know what is and isn’t allowed.

Culver faculty were told during that meeting that they should make clear in every assignment given to students whether use of AI is allowed, and if it is, in exactly what manner it can be used.

“You can't ban it. It's out there,” said Don Fox ’75, a senior instructor in leadership education and a CMA Honor Council advisor. “All you can do is, I think, put up guardrails about when it is appropriate, how it can be appropriate and when it's not.”

Honor Council leaders have reminded classmates twice during all-school meetings this year about the importance of following the Honor Code. At a recent all-school meeting, Gram and Gilii De Villiers ’24 reminded students that the consequences of a bad grade on a test or a late assignment are much less severe than those of an Honor Code offense.

Ji believes AI is going to be transformational when it comes to life at Culver, including in the classroom, in athletics, in leadership programs and in extracurricular activities. 

“I think the sooner we realize that and the sooner we start to adapt with it the better,” Ji said.

Ji is developing the chatbot as part of his honors computer science class project. He has three versions and is testing which one is most accurate. The chatbots can be trained, said David Lawrence, a senior instructor who teaches honors computer science. A chatbot is a computer program designed to answer a question or solve a problem without the need for a human operator.

Lawrence said the chatbot should be able to answer clear-cut questions, which makes it different from programs like ChatGPT, where the answers aren’t vetted. He said students taking honors classes at Culver are expected to produce work on par with a research project by a college sophomore.

“He’s clearly passionate about it,” Lawrence said.

Ji said that during his time serving on the Honor Council he has often heard cadets say they either didn’t understand that what they were doing violated the code or didn’t know the rule they were accused of violating.

“What I am working to develop is a chatbot in which users can ask the question regarding the regulations of the honor system and then it will output an answer that will allow the user to understand based on their question,” he said.

Ji gave the example of a history teacher telling students to complete an assignment using their reading as the main source to answer the questions. A student might use AI to help them understand, he said, but end up with answers that rely more heavily on the AI than on the assigned text.

“The teacher clearly laid out in the assignment that it should be done with the book, not AI,” he said.

Generative AI can create text, images, audio and video on demand by drawing on volumes of text gathered from published writing and information online. But it can’t discern whether that information is accurate, which is why it sometimes provides incorrect information.

Ji is creating a chatbot that will draw its answers only from a 50-page PDF that a subcommittee of the Honor Council is compiling. He hopes to make the program available through Veracross.
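To picture the approach, here is a minimal sketch in Python of how a document-grounded chatbot might take in its source material: the PDF is read page by page and split into paragraph-sized passages that can later be searched. The file name, library choice (pypdf) and passage-splitting rule are assumptions for illustration, not details of Ji’s actual program.

```python
# Illustrative sketch only: load the Honor Council reference PDF and split it
# into passages a chatbot could search. The file name is hypothetical, since
# the real 50-page document is still being assembled by the subcommittee.
from pypdf import PdfReader


def load_passages(pdf_path: str) -> list[str]:
    """Extract text from each page and split it into paragraph-sized passages."""
    reader = PdfReader(pdf_path)
    passages = []
    for page in reader.pages:
        text = page.extract_text() or ""
        # Treat blank-line-separated blocks as individual passages.
        for block in text.split("\n\n"):
            block = block.strip()
            if block:
                passages.append(block)
    return passages


if __name__ == "__main__":
    passages = load_passages("honor_council_reference.pdf")  # hypothetical file name
    print(f"Loaded {len(passages)} passages")
```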

Ji said the document will include the Culver Code of Conduct, the Culver Honor Code, the Culver Student Handbook, samples of cases the Honor Council has previously heard, various definitions, and the history and purpose of the Honor Council, which is responsible for administering the honor system.

Ji said the program would give “the most reliable and accurate response” using algorithms such as keyword detection and semantic search. He’s working to make sure the program doesn’t give incorrect answers.
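As a rough illustration of what those two ideas can look like in code, the Python sketch below scores each passage from the document both by keyword overlap with the question and by TF-IDF cosine similarity, a lightweight stand-in for true semantic search (which would more typically use sentence embeddings). The function names, weighting and sample passages are assumptions made here for illustration, not taken from Ji’s program.

```python
# Illustrative sketch of keyword detection plus a simple similarity search
# over passages from the Honor Council document. Not Ji's actual code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def keyword_score(question: str, passage: str) -> float:
    """Fraction of the question's words that also appear in the passage."""
    q_words = set(question.lower().split())
    p_words = set(passage.lower().split())
    return len(q_words & p_words) / max(len(q_words), 1)


def best_passage(question: str, passages: list[str]) -> str:
    """Rank passages by a blend of keyword overlap and TF-IDF similarity."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(passages + [question])
    # The last row is the question; compare it against every passage.
    sims = cosine_similarity(matrix[len(passages)], matrix[:len(passages)]).flatten()
    scores = [0.5 * keyword_score(question, p) + 0.5 * sims[i]
              for i, p in enumerate(passages)]
    return passages[scores.index(max(scores))]


if __name__ == "__main__":
    sample = [
        "Students must ask the instructor whether AI may be used on an assignment.",
        "The Honor Code states: I will not lie, cheat, or steal.",
    ]
    print(best_passage("Can I use AI on my history homework?", sample))
```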

He said he hopes it will educate Culver students not only about what is and isn’t allowed, but also about the punishments they could face for violating the Honor Code. Violators could be required to resign their rank and lose any leadership positions, would likely face disciplinary confinement, and could become ineligible for academic awards, for nominations to academic societies such as Blue Key or Cum Laude, and for the opportunity to receive a Gold or Silver A.

“I really want this chatbot to be used as a means of education as opposed to ‘Can I do this? Can I do this?’ ” he said. “I want people to engage in understanding exactly the different cases of cheating and what causes them and what the effects are and then a big part of this document that we're developing is how these cases can be avoided in the first place.”

He said the chatbot will provide suggestions on ways to avoid violating the Honor Code.

“It will give tips like, ask your teacher before every assignment whether you can use AI or coordinating with your friends or utilizing the resources that are posted on the bulletin boards across the regiment, across the girls’ dorms, because those things are there.”

Fox said he is not surprised students want to make sure AI is being used properly.

“If you look at the students that we have on the Honor Council, both on the CMA and CGA sides, they're generally high achieving students who have a high sense of morality or they wouldn't have stood for election for these positions. They work hard to achieve the things that they do. Things like cheating, plagiarism and copying offend them,” Fox said.

Ji, who has been awarded an Army ROTC scholarship, said he was encouraged by his older brother, who is in Army intelligence, to create the chatbot because he has seen how the Army is using AI.

“He said the opportunities are just continuing to expand, and I took great interest in that,” Ji said. “What he challenged me to do is to look into how AI is impacting Culver.”

Ji said he wants to leave a legacy of promoting AI’s benefits while mitigating its drawbacks. He hopes to have the chatbot available to students after Christmas break.
