In the 2002 blockbuster film Spider-Man, a young Peter Parker has a conversation with his Uncle Ben in the car. Uncle Ben offers him the iconic advice: "With great power comes great responsibility."
For educators and members of society alike, these words remain relevant today amid the current artificial intelligence revolution. Artificial intelligence offers powerful tools and capabilities like never before, but it also presents unique challenges. With such great power and potential, how can we teach our students to use AI responsibly?
Here are seven strategies to empower our students to become ethical and responsible AI users:
Tip 1: Communicate Expectations
Would you hand over the keys to your car to a teenager on their 16th birthday without any driving lessons? The idea is absurd, yet this is happening in many classrooms with generative AI. Students are given access to a powerful tool without clear guidance on what is expected of them, when it should or should not be used, and why it is important. Just like driving, setting clear expectations and communicating them effectively is crucial. It may be beneficial to co-create a list of expectations with your students, adding to it as necessary.
Tip 2: Model Responsible Use
Generative AI can be a fantastic tool, but it is crucial to teach students how to use it responsibly, just as students in the 1990s had to be taught to use web browsers and online resources responsibly. AI should support creativity, critical thinking, and learning, NOT replace them. For example, we must teach students to generate original content and use AI to brainstorm or provide feedback, much as we learned to use the built-in spell-check and grammar features of word-processing applications. Share with your students when and why you have used generative AI yourself.
Tip 3: AI is a Tool, Not a Human Being
While it may seem obvious, many students are unaware that AI lacks feelings or emotions. It’s crucial to help them understand that generative AI is a tool—not a friend, confidant, or counselor. Misunderstanding this can lead to misplaced trust or over-reliance on AI for emotional support or guidance. Encourage students to build connections with trusted adults to navigate the complexities of adolescence.
Tip 4: AI is Biased
It is well-documented that all generative AI models contain biases, stemming from incomplete, imbalanced, or historically skewed data. We must teach students that AI is not the ultimate authority and must be fact-checked for accuracy and bias. Teaching students to critically evaluate AI outputs and question their validity empowers them to use AI both responsibly and thoughtfully.
Tip 5: AI Can Be Wrong
While AI tools may seem to provide definitive answers, they are not always accurate. In many instances, AI can produce incorrect, inappropriate, or entirely fictional responses, a phenomenon known as hallucination. One of my colleagues recently shared a story about one of these hallucinations. She was looking for a podcast on a specific topic and struggled to find what she needed through Spotify and Apple Podcasts. She turned to ChatGPT, which happily gave her titles and episode numbers from reputable sources like The MindShift Podcast (KQED) and EdSurge On Air.
When she searched for these specific titles on Spotify, nothing came up. She then asked ChatGPT to provide links to the podcasts, which it did, “http://” and all! Each of those links led to a 404 error (page not found). Frustrated, she told ChatGPT that she couldn’t find any of these podcasts and that she felt like they were made up. Its response? “I apologize for the confusion earlier. You're right—the podcast episodes and links I provided were not real, and I made an error in suggesting them.”
Therefore, it is crucial for students to treat AI-generated information as a starting point and to conduct further research for verification. One effective strategy is to use tools like Copilot, Gemini, and Perplexity, which include citations and links to additional resources for confirmation. Always check each link, as AI tools have been known to cite fictional sources.
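For technically inclined readers, or a computer science class looking for a quick project, the "check each link" habit can even be automated. The following is a minimal sketch in Python, assuming the third-party requests library is installed; the URLs are placeholders standing in for links an AI tool might suggest, not real sources.

```python
# Minimal sketch: verify that AI-suggested links actually resolve.
# Assumes the third-party "requests" library is installed (pip install requests).
import requests

# Placeholder URLs standing in for links suggested by a chatbot.
suggested_links = [
    "https://example.com/podcast/episode-42",
    "https://example.com/podcast/episode-87",
]

for url in suggested_links:
    try:
        # A HEAD request is enough to check whether the page exists.
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code == 404:
            print(f"NOT FOUND (404): {url}")
        else:
            print(f"Reachable ({response.status_code}): {url}")
    except requests.RequestException as error:
        print(f"Could not reach {url}: {error}")
```

Keep in mind that a page that loads is not proof the citation is accurate; students still need to read the source and confirm it says what the AI claims.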
Tip 6: Avoid Sharing PII
We teach our students not to share personally identifiable information (PII), such as their names, addresses, or contact details, with strangers. The same caution should apply when using generative AI. It is essential to protect students from inadvertently sharing sensitive information, like names, addresses, email addresses, or birth dates. Once shared, this data becomes part of the information processed by AI models, which can lead to privacy risks and heighten the chances of unauthorized access or data breaches, potentially causing identity theft, fraud, or other harmful activities.
Tip 7: Protect Student Data
As educators, it is our responsibility to carefully review the terms of use for AI tools and ensure that student data is handled with care. While a new tool may seem like an exciting addition to a lesson, it could also pose risks to our students. Some AI tools are developed by companies that may not prioritize or fully understand the significance of student privacy. For example, if students log into an application using their Google account, it could grant the tool access to personally identifiable information (PII), files, location, and more.
It is essential to understand the terms of use, privacy policies, and compliance with regulations such as FERPA and COPPA in order to protect our students. Additionally, sharing this information with students and parents is crucial. Organizations like Common Sense Media provide valuable resources for evaluating the privacy policies of tools.
Teaching students to use generative AI responsibly is an essential skill that prepares them for the future. Employers are beginning to place emphasis on AI fluency when making hiring decisions. Not preparing students to practice using AI may be convenient in the short term, but it could actually hurt them in the long term. Classrooms offer a unique, low-pressure environment where students can experiment, learn, and even make mistakes as they develop the skills needed for responsible AI use. By providing clear guidance, setting expectations, and offering ongoing support, we ensure that students can confidently navigate the complexities of AI. This approach equips them with the tools and strategies to use AI ethically and effectively, both in school and beyond.
Before introducing AI into the classroom, ensure you have the necessary training to not only leverage it for elevating your instruction but also to use it as a tool for making learning more accessible and inclusive.
- Take an on-demand course: Learn how to connect UDL and AI (Intro course) or strategies to boost learner engagement with AI
- Schedule a workshop, training or professional learning series for you and your colleagues