Culver Academies adopts AI policy to give guidance to students, faculty, staff

Tom Coyne

Cali Miller, chairperson of the computer science and engineering department, works with students. (Photo by Scott Johnson)

Culver Academies has adopted its first comprehensive artificial intelligence policy, providing guidance to students, faculty and staff on when AI can and can’t be used and how to use AI tools responsibly and ethically.

Cali Miller, chairperson of the computer science and engineering department, said the AI policy was challenging because AI is evolving so quickly.

“It’s an ever-moving target and lots of people have lots of different feelings about the topic,” she said.

Miller worked for the past year with Alexa Gardner ’06 W’01, Culver’s director of information technology, to develop a policy that gives everyone at Culver a guide to best practices. It is not meant to tell a teacher how to use AI in the classroom. Miller and Gardner shared drafts of the policies with faculty during June week last school year, and that feedback was used to create the current student and faculty policies. They also received feedback from the Academic Affairs Committee.

“The feedback from both of these groups made a big difference in what the faculty and student policies look like and what is included,” Miller said. “Our goal was to try to create something that gave everyone an idea of where they should start from – not where they should go to – but where they should start from.”

She said Culver, a top boarding school in the United States, recognizes the increasing role that generative AI plays in education and in the workforce and wanted a policy that outlines guidelines to use AI tools effectively and according to the Culver Honor Code.

The guideline for students reads: “If a student is uncertain whether an AI tool can be used in a class, it is the student’s responsibility to obtain clarification from the teacher before using the tool. Misuse will result in disciplinary action as outlined in the Culver Honor Code.”

“It’s the student’s fault if they don’t ask,” Miller said.

The two-page document also tells students to check the course syllabus and the Academic Honor Template to see how AI can be used.

“The syllabus should say, ‘You can use AI for this, this and this. This is why you can use AI, and here’s where I definitely don’t want you to use AI, and here’s why,’” Miller said.

She said teachers can eventually give students more freedom by allowing students to use AI for specific assignments.

“That way they can do it as a class, learn from it as a class, and the teacher can help guide them,” she said.

She said no matter how comfortable students feel with technology in general, when it comes to AI most are “pretty horrible at it.”

“I think our teachers understand that we owe our students a service at some level so that when they leave here they are more competent with AI and can use AI in ways to study, to get explanations, to get help,” she said.

Cali Miller, chairperson of the computer science and engineering department, helped develop Culver's AI policy.

Miller said she and Gardner are working on ways to help teachers accomplish that goal.

The policy also specifies which AI platforms can be used because they adhere to Culver’s values, privacy standards and technological infrastructure.

The policy also tells students to record their conversations with any AI tool in a document they can share with their teacher.

“Your teacher should be able to see those if they ask,” she said.

It also advises students they should review and check content for accuracy, appropriateness, bias and for “hallucinations,” where AI simply makes up information.

“They even make up sources,” Miller said. “So students will say to me, ‘Oh, it has sources.’ But if they click on the source, it’s gobbledygook, or it doesn’t take you anywhere.”

Teachers have sheets to provide students that explain how to check for accuracy, appropriateness, bias and hallucinations.

The policy also advises students they should never input personal information, such as names, addresses, phone numbers or birth dates into AI tools.

The policy also states which AI platforms students, faculty and staff can use that safely protect Culver Academies’ data.

The policy also prohibits students from using AI tools to manipulate media to impersonate others for bullying, harassment or any form of intimidation. It also tells students they should not submit AI-generated work as their own.

The faculty policy encourages, but does not require, teachers to use AI tools to enhance teaching methods, personalize learning and support innovation in the classroom.

The policy states that the syllabus must explicitly state whether students can use generative AI tools in the class. It also states that if students are allowed to use generative AI tools, teachers should be transparent about why and how the AI tools will or could be used.

Teachers also should disclose in the syllabus ways they may use an AI tool for the course.

The policy also states that if an assignment allows students to use AI in ways beyond what is stated in the syllabus, teachers are expected to share the allowed uses for the assignment through the Academic Honor Template.

Areas that are included in the template with respect to generative AI are: research, assisted thinking, assisted feedback and explanation of concepts.

The staff policy states that Culver employees can use AI tools to enhance productivity while upholding ethical standards and academic integrity, respecting privacy, promoting transparency and upholding the values of Culver’s school community.

Some other high schools and colleges have lengthy policies, up to 70 pages long, that can be confusing, Miller said.

“That’s too much. It makes it harder to understand the key pieces: ‘What do I need to know?’ We wanted it boiled down to something manageable and to get the priorities right for our school.”

The policy will be reviewed annually, Miller said.

“It’s going to change,” she said. “We’re in this together. We’re going to have to share what works and what needs to be changed. We’re in this together.”
