Originally published in Forbes on April 12, 2023.
If you’ve been following tech trends for decades like me, you probably remember the 1996 launch of Ask Jeeves: a search engine that promised quick answers to burning questions. Not ringing a bell? Well, I’m quite certain you remember the early 2000s’ explosion of Google: “a company that started as a novel search engine” and now manages a range of popular products. Then, of course, came Siri, Apple’s trailblazing smart voice assistant, and Alexa, Amazon’s “artificially intelligent virtual assistant.” What followed was a dizzying collection of other products and services that pledged to help us do things better, smarter and faster.
However, alongside the excitement over each of these inventions came fear of what their widespread use would mean for education, particularly among leaders of academic institutions and the companies that serve them. The questions abounded: Will kids forget how to do simple math problems if a search engine can do it for them? Will students remember that a complete sentence needs a subject and a verb? Will they rely too much on a computer instead of their own ingenuity? These questions, and many others, seemingly resolved themselves in time.
But here in 2023, it seems the education industry has identified a new nemesis in the form of ChatGPT: an artificial intelligence tool that, as NPR explains, can “solve math problems, churn out college essays and write research papers.” Since its November launch, and 100 million monthly active users later, countless schools and educators across the country have expressed their concerns that the growing popularity of large language models (LLMs) could increase the likelihood of cheating, plagiarism and other unethical behavior.
While these concerns are certainly not unfounded, I think it’s important for each of us—particularly those in the education and edtech spaces—to consider the ways that we can leverage tech tools to actually advance students’ academic goals rather than impede them.
We’ve seen this work before. Once outlawed in the classroom, calculators are now commonplace, perhaps because we’ve learned that math is about more than just teaching students about calculation; rather, it’s about problem-solving and “showing” your work. YouTube—which offers everything from science and math lessons to informational school-specific webinars—has become a resource and communication tool for schools, students, education companies and families in every corner of the nation. And companies have created applications to bring real-life work-based experiences into the classroom with virtual work and college tours, virtual chats with working professionals, and much more.
So when it comes to thinking about AI, it’s vital that we as industry leaders remember that technology, and the application of it in the classroom, certainly isn’t anything new. Like calculators, like YouTube and like Google, tech-infused advancements simply aren’t ever going away. And I’m not sure we’d want them to.
While it’s still too early to predict what AI’s impact will be on education, I think it’s also too early to thwart its potential impact. Instead, we should find ways to adapt to it and other new technologies. If we don’t, we run the risk that our institutions and companies will be left behind.
In fact, no matter what kind of innovative technology emerges in the future, edtech leaders should remain committed to supporting students’ individual critical thinking and problem-solving skills, which currently cannot be replicated by technology. We should continue to advance schools’ curricula, services and programs in ways that support this goal. And we should remain steadfast in our commitment to ensuring the learners of today hone the skills they need to succeed in the world of tomorrow.
Along the way, each of us—particularly those of us in the edtech space—should continue to adhere to three principles:
1. Uphold an unwavering commitment to technological advancement and embrace it wholeheartedly.
2. Uphold the tenets of student authenticity, originality, creativity, transparency and academic integrity by discouraging the misuse of AI for school-related work or assignments.
3. Find new avenues to adapt to AI applications in ways that support our respective institutions’ and companies’ values and missions.
AI doesn’t have to be a setback for education; it can be a gift. We just need to figure out how to wrap it.
James Rhyu is the Chief Executive Officer of Stride, Inc.