Top 10 Ethical AI Practices to Teach K-12 Students

Empower your students to use ChatGPT responsibly instead of cheating.

If you’ve never had students cheat or plagiarize… you will. And you’ll handle it.

And if you haven’t had students cheat or plagiarize using ChatGPT… you will. And you’ll handle that too.

We’ve entered the era of generative AI chatbots, and yes, students using them to cheat is a valid concern. At the same time, ChatGPT isn’t going away, and ignoring or prohibiting it won’t change that fact. So let’s teach students how to use AI properly.

Teaching students how to use AI ethically is like teaching teens safe driving habits. We know they may be tempted to speed, text while driving, or otherwise be reckless, but we don’t prevent them from driving. We teach them the traffic laws, tell them to keep their eyes on the road, and trust them to use good judgment.

As we help students learn to use AI ethically and responsibly, and trust them to make the right decisions, we'll equip them for powerful, lifelong learning.

Here are 10 ethical AI practices to teach your students.

1. Think critically

Show your students that although AI can answer their questions, those answers shouldn’t be accepted at face value.

From bias to inaccuracies (more on those below), there are plenty of reasons not to thoughtlessly rely on ChatGPT.

Because AI chatbots feel like a trove of knowledge, remind your students that they are large language models designed to generate plausible-sounding text, not interactive encyclopedias.

To engage students’ critical thinking skills, consider requiring them to ask ChatGPT for multiple perspectives on topics they’re researching, rather than automatically going with the first response.

You can also establish guidelines so that students can only use ChatGPT after they’ve learned about a topic in class or with verified sources. That way, they can note any discrepancies or omissions in ChatGPT’s response based on what they already know, whether they’re learning about the causes of WWII or how to solve two-step inequalities.

2. Use ChatGPT as a tool, not a crutch

Encourage students to use AI as a tool to support their own thinking, not a replacement for it.

If allowed by your classroom AI policy, guide your students towards using ChatGPT as a starting point in research or a brainstorming partner—not a ghostwriter.

For example, if you’re exploring climate change, you can allow your students to use AI as part of their research, and then assign them to come up with their own unique solutions. Or if you’re a math teacher, you might encourage students to use ChatGPT to explain concepts to them, but they’ll need to solve problems on their own and explain their reasoning.

If you give your students the support and resources they need to feel confident in their own writing and thinking abilities, hopefully, they’ll feel empowered to use ChatGPT to extend, not replace, their learning.

3. Analyze bias in AI

Educate your students about bias in AI and how it can produce harmful ideas, omit key information, and perpetuate stereotypes.

Because ChatGPT and other large language models use content from biased humans to generate responses, AI-generated content will reflect those biases.

A 2023 Bloomberg report titled “Humans Are Biased. Generative AI Is Even Worse” found that an AI image generator produced images of light-skinned people almost exclusively in high-paying jobs, like lawyers, and dark-skinned people in low-paying jobs, like fast-food workers.

Support your students in developing the critical eye they’ll need to spot biases in ChatGPT’s output.

4. Confirm accuracy

Generative AI is imperfect, and students need to learn how to verify the accuracy of its content with reputable sources. If you were a teacher in the mid-2000s, this should sound familiar—it’s like dealing with Wikipedia all over again!

ChatGPT's unreliability shows up across subjects. For example, ChatGPT has been getting worse at answering math problems: a 2023 Stanford study found that its accuracy on a simple math task dropped from 98% to 2% over just a few months.

ChatGPT will also hallucinate book titles, invent legal cases, and give dangerous health advice.

Arm students with this knowledge about AI’s limitations so they can decide whether they need to rely on their own skills or cross-reference ChatGPT’s answers against more trustworthy sources.

5. Be transparent

Help your students understand that if they’re allowed to use ChatGPT for an assignment, they should be upfront about how they used it.

To get them into the habit of disclosing AI usage, you might want to require students to note whether they used it for proofreading, outlining, or brainstorming. To encourage metacognitive skills, you might also have them reflect on what was and wasn’t helpful in using ChatGPT or other AI chatbots.

Be transparent with your students, too. If they understand that the reason you’re asking for these disclosures is so you can better help them learn and not get them in trouble, they’ll be more likely to trust you enough to be honest.

6. Understand what is considered cheating

Whether students use AI or not, academic dishonesty is not okay. But students don’t intuitively know everything that counts as cheating and plagiarism—we need to teach them.

For example, even before ChatGPT, students needed to be explicitly instructed, “Even if you change a few words, using a sentence from a source is still plagiarism.”

Now, we need to explicitly teach things like “Even if you prompted it with your idea, copying and pasting a ChatGPT answer is still plagiarism.”

Some teachers will have different standards for what’s acceptable and what’s cheating. You may be okay with students using ChatGPT to proofread or outline a paper, but your colleague down the hall might not.

So make sure you're crystal clear about what your AI policy is. In fact, one study found that students felt anxious because they lacked guidance on responsible AI use—so your students will probably appreciate getting it from you!

7. Ask questions about using ChatGPT

Make sure your students know it’s okay to ask you and other teachers about acceptable and unacceptable ways to use ChatGPT.

This technology is as new to them as it is to you—but they're learning how to navigate ChatGPT at the same time they're first getting introduced to the nuances of academic cheating and plagiarism. You may be the best guide they have!

Give them opportunities to ask questions by fostering a safe and open space to have these discussions. Ask your students what they think about AI, how they use it, and what they think is acceptable ChatGPT usage in school. Then have them ask you questions. It’s bound to be an illuminating conversation for everyone.

8. Respect privacy and consent

Teach your students that once they enter information into ChatGPT or other AI chatbots, it’s no longer private information.

Many chatbots collect user data and analyze their responses to improve their services (although some do offer the option to opt out).

The risks are not just hypothetical. At a recent hacker convention, attendees competed to get various AI chatbots to go rogue—and one person got AI to give up a credit card number that belonged to someone else.

If students share any personal information—about themselves or others—it could put their cybersecurity at risk. Equipped with this knowledge, students can be wiser about what information they share with AI.

9. Be ethical digital citizens

Make sure students understand that being ethical, respectful digital citizens isn’t just important on social media—it also applies to using AI.

Remind students that how they share, create, or engage with AI-generated content can impact others.

Because the adolescent brain is impulsive and hyper-values peer acceptance, middle school and high school students are at heightened risk of potentially damaging behavior.

When technology makes it easy to spread misinformation about people or even bully others, your students may need explicit encouragement to think about the potential impact of their online actions on others and their future selves.

Ultimately, we want to support students in being respectful and empathetic to others, with or without ChatGPT at their fingertips.

10. Educate others about ethical AI practices

Your students are lucky enough to have you—a teacher who wants to help them build the skills and knowledge they need to use AI ethically and responsibly. Not all of their peers, friends, and family will have that kind of mentorship.

Empower them to play that role for others. The world is evolving quickly, and we’re all in this together.

The future of ChatGPT: More learning, less cheating

As generative AI technology evolves, it will continue to provide benefits and risks to students, both in school and in their future careers.

By teaching them how to use ChatGPT ethically and responsibly, we can empower them to harness the benefits of AI while upholding academic integrity.

This may not eliminate cheating altogether—nothing ever will. But as we trust students to make the right decisions, they’ll learn how to use AI chatbots the right way, and their confidence and learning skills will grow. After all, fostering growth is what educators do best!

The American Consortium for Equity in Education, publisher of the "Equity & Access" journal, celebrates and connects the educators, associations, community partners and industry leaders who are working to solve problems and create a more equitable environment for historically underserved pre K-12 students throughout the United States.

