OpenAI releases a teacher’s guide to ChatGPT, but some educators are skeptical
OpenAI envisions teachers using its AI-powered tools to create lesson plans and interactive tutorials for students. But some educators are wary of the technology — and its potential to go awry.
Today, OpenAI released a free online course designed to help K-12 teachers learn how to bring ChatGPT, the company’s AI chatbot platform, into their classrooms.
Created in collaboration with the nonprofit organization Common Sense Media, with which OpenAI has an active partnership, the one-hour, nine-module program covers the basics of AI and its pedagogical applications.
OpenAI says that it’s already deployed the course in “dozens” of schools, including the Agua Fria School District in Arizona, the San Bernardino School District in California, and the charter school system Challenger Schools. Per the company’s internal research, 98% of participants said the program offered new ideas or strategies that they could apply to their work.
“Schools across the country are grappling with new opportunities and challenges as AI reshapes education,” Robbie Torney, senior director of AI programs at Common Sense Media, said in a statement. “With this course, we are taking a proactive approach to support and educate teachers on the front lines and prepare for this transformation.”
But some educators don’t see the program as helpful — and think it could in fact mislead.
Lance Warwick, a sports lecturer at the University of Illinois Urbana-Champaign, is concerned that resources like OpenAI’s will normalize AI use among educators unaware of the tech’s ethical implications. While OpenAI’s course covers some of ChatGPT’s limitations, like that it can’t fairly grade students’ work, Warwick found the modules on privacy and safety to be “very limited” — and contradictory.
“In the example prompts [OpenAI gives], one tells you to incorporate grades and feedback from past assignments, while another tells you to create a prompt for an activity to teach the Mexican Revolution,” Warwick noted.
“In the next module on safety, it tells you to never input student data, and then talks about the bias inherent in generative AI and the issues with accuracy. I’m not sure those are compatible with the use cases.”
Sin á Tres Souhaits, a visual artist and educator at The University of Arizona, says that he’s found AI tools to be helpful in writing assignment guides and other supplementary course materials.
But he also says he’s concerned that OpenAI’s program doesn’t directly address how the company might exercise control over content teachers create using its services.
“If educators are creating courses and coursework on a program that gives the company the right to recreate and sell that data, that would destabilize a lot,” Tres Souhaits told TechCrunch. “It’s unclear to me how OpenAI will use, package, or sell whatever is generated by their models.”
In its ToS, OpenAI states that it doesn’t sell user data, and that users of its services, including ChatGPT, own the outputs they generate “to the extent permitted by applicable law.” Without additional assurances, however, Tres Souhaits isn’t convinced that OpenAI won’t quietly change its policies in the future.
“For me, AI is like crypto,” Tres Souhaits said.
“It’s new, so it offers a lot of possibility — but it’s also so deregulated that I wonder how much I would trust any guarantee.”
Late last year, the United Nations Educational, Scientific and Cultural Organization (UNESCO) pushed for governments to regulate the use of AI in education, including implementing age limits for users and guardrails on data protection and user privacy. But little progress has been made on those fronts since — and on AI policy in general.
Tres Souhaits also takes issue with the fact that OpenAI’s program, which OpenAI markets as a guide to “AI, generative AI, and ChatGPT,” doesn’t mention any AI tools besides OpenAI’s own. “It feels like this reinforces the idea that OpenAI is the AI company,” he said. “It’s a smart idea for OpenAI as a business.
But we already have a problem with these tech-opolies — companies that have an outsize influence because, as the tech was developed, they put themselves at the center of innovation and made themselves synonymous with the thing itself.”
Josh Prieur, a classroom teacher-turned-product director at educational games company Prodigy Education, had a more upbeat take on OpenAI’s educator outreach. Prieur argues that there are “clear upsides” for teachers if school systems adopt AI in a “thoughtful” and “responsible” way, and he believes that OpenAI’s program is transparent about the risks.
“There remain concerns from teachers around using AI to plagiarize content and dehumanize the learning experience, and also risks around becoming overly reliant on AI,” Prieur said. “But education is often key to overcoming fears around the adoption of new technology in schools, while also ensuring the right safeguards are in place to ensure students are protected and teachers remain in full control.”
OpenAI is aggressively going after the education market, which it sees as a key area of growth.
In September, OpenAI hired former Coursera chief revenue officer Leah Belsky as its first GM of education, and charged her with bringing OpenAI’s products to more schools. And in the spring, the company launched ChatGPT Edu, a version of ChatGPT built for universities.
According to Allied Market Research, the AI in education market could be worth $88.2 billion within the next decade. But growth is off to a sluggish start, in large part thanks to skeptical pedagogues.
In a survey this year by the Pew Research Center, a quarter of public K-12 teachers said that using AI tools in education does more harm than good.
A separate poll by the Rand Corporation and the Center on Reinventing Public Education found that just 18% of K-12 educators are using AI in their classrooms.
Educational leaders have been similarly reluctant to try AI themselves, or introduce the technology to the educators they oversee. Per educational consulting firm EAB, few district superintendents view addressing AI as a “very urgent” need this year — particularly in light of pressing issues such as understaffing and chronic absenteeism.
Mixed research on AI’s educational impact hasn’t helped convince the non-believers. University of Pennsylvania researchers found that Turkish high school students with access to ChatGPT did worse on a math test than students who didn’t have access. In a separate study, researchers observed that German students using ChatGPT were able to find research materials more easily, but tended to synthesize those materials less skillfully than their non-ChatGPT-using peers.
As OpenAI writes in its guide, ChatGPT isn’t a substitute for engagement with students. Some educators and schools may never be convinced it’s a substitute for any step in the teaching process.