I was at a talk about generative AI over the weekend, and someone asked about what it meant for students and homework assignments.
The speaker said something I can’t stop thinking about: that if ChatGPT can regurgitate information as well as a student, and the teacher has no idea, what’s really the problem?
Something I’m trying to teach my children, even at their young ages, is critical thinking. I don’t want them to memorize something. I want them to think about a problem, analyze it, and come to a solution.
I want them to question rules that don’t make sense. And I want them to force people in authority positions to explain themselves.
Our education system in the United States doesn’t teach those skills, and the fear of what ChatGPT means for homework grossly exposes that.
Who cares if ChatGPT can write a paper about the Battle of Gettysburg at a 5th grade level?
If my kid successfully leverages tools like ChatGPT to do their homework and save time, they’ve gained a far more useful skill than memorizing that the Battle of Gettysburg was a major turning point in the American Civil War, lasting from July 1 to July 3, 1863, with the Union Army defeating the Confederate Army led by General Robert E. Lee.
We’re still in a culture where students are taught to obey arbitrary rules, learn enough to pass standardized tests, and then promptly forget what they’ve memorized.
And it’s so very broken. It’s why 98% of 5-year-olds are considered “genius-level creative,” and by age 15 that number is down to 12%.
The good teachers will celebrate generative AI. They’ll teach their students how to leverage it to do proper research in a fraction of the time.
I know what you’re thinking, and you’re right.