Educators teaching AI literacy to students should include ethics in their lesson plans, scholars say
As educators incorporate AI literacy into their curriculums, some are teaching their students how to use the tech ethically and efficiently.
- Artificial intelligence is becoming more prevalent in education.
- Initiatives such as Stanford University's CRAFT program are promoting AI literacy in schools.
- Concerns about AI misuse have prompted instructors to teach students how to use the tech ethically.
- This article is part of "Build IT," a series about digital tech and innovation trends that are disrupting industries.
Assertions about expanding the use of artificial intelligence typically fall into one of two schools of thought: "Be ahead of the curve" or "Don't fix what isn't broken." Whichever camp someone is in, it's clear that the rapidly emerging technology isn't going anywhere anytime soon.
AI is everywhere now: healthcare, marketing, social media, music, hair care, and beyond. As the technology gains a foothold in industry after industry, professionals are taking notice, including educators.
A report by Allied Market Research projected AI in the education market would reach $88.2 billion globally by 2032. In 2022, North America gathered the highest share of revenue in the global market for AI in education and is projected to hold that dominance, the report said.
Last month, Arizona State University announced a partnership with OpenAI to "enhance student success" by bringing ChatGPT into its classrooms. ASU is the first higher-education institution to collaborate with the AI company, according to a press release from the school.
While some optimists focus on AI's benefits in education, others fear that using AI in classrooms could catalyze cheating and misinformation.
This is where AI literacy can be useful.
Projects such as Stanford's Classroom-ready Resources about AI for Teaching, or CRAFT, initiative are working to help fill the knowledge gap and combat apprehension.
Created within the Stanford Graduate School of Education, CRAFT is a collaborative effort of Stanford education researchers, software developers, and curriculum developers. Its goal is to give high-school instructors easy access to AI resources they can adapt to their respective subjects and lesson plans.
Victor Lee, an associate professor at the Stanford Graduate School of Education who helped develop CRAFT, told Business Insider the initiative's team was working to build knowledge in multiple subject areas and "demystify and really contextualize" the areas in which AI affects the way we work in classrooms.
"Those who are less familiar with AI and what it is being used for are especially anxious and feel very cautious," Lee said. "Providing more specific knowledge about AI, both in examples of how it can be used productively and how it can exacerbate long-standing problems, helps alleviate some concerns."
Lee is one of the CRAFT developers who recognized how important it is for young students to learn the fundamentals of AI, especially as more workplaces lean into the technology. Resources such as CRAFT could also help narrow the equity gap, which may widen if underserved students lack equal access to AI initiatives.
Advising students to use AI tools ethically and cautiously
When educators teach AI literacy, they should have previous experience with the platforms they're using in a class, said Matthew Ratz, a professor at Montgomery College.
Before introducing AI in his classroom, Ratz had been using AI for his writing, leveraging Copy.ai, which features tools to generate and simplify content, and Jasper, an AI writing service that helps with marketing copy. Since Montgomery College allows its professors to decide whether to use AI in their classrooms, Ratz opted to do so.
"If we use it, they want us to teach how to use it ethically," Ratz told BI.
He added: "If my view was to prepare students for the 21st-century workplace, I knew that teaching them how to use AI effectively was going to be part and parcel of that experience."
Lee also told BI that AI literacy in classrooms "should involve recognition of where AI can be effective and where it requires extra vigilance." Because AI's use cases vary across subjects, Lee said, "it is important to focus on specific uses more than broad generalities at the moment."
One of the tools Ratz introduced to his classroom is Perplexity AI, a search engine that answers questions in natural language. It helps his English-composition students, even those with no prior research experience, find peer-reviewed sources and learn how to cite them properly, he said.
He also designed his assignment prompts to be open-ended and to elicit original thought so that AI couldn't answer for the students. For example, he had his students compare the rhetoric of two TED Talks, a task that required independent thinking and lessened the likelihood of AI doing the work.
ChatGPT in particular works to Ratz's advantage: he uses it to help his students meet their objectives and improve their writing, he added.
Last semester, he fed papers into the AI tool and asked it to identify errors that his students were making. "I actually used the feedback from ChatGPT to give lessons on transition sentences and specific use of punctuation — specifically, the comma, semicolon, ellipses — which students were struggling with," he said.
Concerns about AI in classrooms
While Ratz is pleased with how AI supports his students' academic growth, he said, "the ethical concerns behind AI are really legitimate." These include the racism and bias that can be embedded in AI systems.
Erin Reddick is one of many with these concerns. She launched ChatBlackGPT, an AI chatbot that pulls insights from Black, African American, and African sources, as she worked "to ensure technological advancements are ethically grounded and beneficial for all communities."
Another concern is that there are no federal laws regulating AI, and relatively few state laws address it at all. Without legislation to enforce responsible and ethical AI use, it's hard to prevent the misuse and misappropriation of the technology.
"Is it ethical for a teacher to grade a student's work with AI? Probably not, but who's actually saying, 'You can create this test with AI, but you can't grade it with AI?' Who's in there drawing that line?" Reddick said.
Still, Reddick said, she's aware that more jobs are demanding some level of AI knowledge — and that means institutions have a responsibility to introduce it to students. However, people need to fully comprehend that AI is simply a tool, she added, and it's important to use critical thinking when interacting with it.
"I encourage people never to go into a generative-AI conversation blindly," Reddick said. "You need to have intentions about the information you want to receive. Otherwise, it can have too much influence over what you came there for in the first place."
The future of AI in education
A 2023 report from the education publisher and platform Houghton Mifflin Harcourt found that 38% of educators planned to adopt AI tools in the 2023-24 school year.
One deterrent slowing teachers' enthusiasm for AI is uncertainty about how to incorporate it into their curriculums safely and productively. Programs such as CRAFT aim to remedy this disconnect and support more educators across the US.
Lee told BI that Stanford plans to develop an internal apprentice program to help students "engage more with the topic of AI in education."
He said the school also hopes to grow CRAFT's teacher co-design fellowship, through which fellows develop AI-literacy lessons. This, he added, would help "ensure more voices from areas around the country facing different needs and circumstances can shape what gets made."