
Kids Don’t Need to Be ‘Dancing With Robots’ in School – The American Spectator

There’s a universal truth about technology that most of us are somewhat unwilling to admit: Kids are better at it than adults are.

At this point, it’s a stereotype: the grandparents who squint at their phones and poke at them while grumbling about “newfangled technologies” or the parents who hand their kids their phone to figure out a settings adjustment. It’s kids who are asking ChatGPT the important questions like, “Is my wife the most beautiful woman in the world?” (an actual question my 21-year-old brother-in-law thought was worth asking) or “Are you [ChatGPT] a good wingman?” (RELATED: Mom, Meet My New AI Girlfriend)

Nevertheless, the Trump administration reportedly thinks that kids need their public school teachers’ help in figuring out how to use chatbots.

Last week, the Washington Post reported that it had seen a draft of an executive order that would allegedly “create a policy integrating artificial intelligence into K-12 education.” Reportedly, the executive order — which may never even see the light of day — would require federal agencies to “train students in using AI and to incorporate it into teaching-related tasks.”

While the White House doesn’t seem to be inclined to comment on the draft order (nothing is official yet), it seems like the government is considering allocating funds to train teachers on how to use AI effectively, and then having those same teachers provide homework assignments incorporating chatbot responses.

Of course, even if the White House never signs an executive order on the matter, plenty of educators are already looking for ways to integrate chatbots on their own. In an article published by the Harvard Graduate School of Education two years ago, Elizabeth Ross argued that educators should simply embrace the new technology. “Where we want to get to is a place where you’re dancing with it, dancing with robots,” lecturer Houman Harouni posited at the time.

The American Psychological Association pointed out in January that one in seven adolescents is using generative technologies to “help with homework.” Not only that, but the same article argued that teachers haven’t really had a choice: Kids are at the cutting edge of technology, and teachers need to be integrating it into the classroom too.

The arguments for using tools like Claude and ChatGPT in the classroom are many and varied. For one, kids in high school today will likely come of age in an economy that has started to replace roles like “software developer” or “breaking news journalist” with generalists who wear many hats and use chatbots to churn through tedious tasks. Learning how to use generative AI, on this view, is really just job training. (READ MORE: Trump Hates EU Tech Meddling. Why Is He Letting His FTC do the Same?)

Additionally, at this point, it’s impossible not to recognize that this newfangled tech is everywhere a student turns, even if he never asks it for help with his math teacher’s tedious homework: It writes and corrects emails, revises grammar in word processors, and even reads math equations off notepads (if you’re an Apple user, anyway). Teachers, the argument goes, might as well embrace it wholeheartedly.

At the risk of sounding like an anti-progressive old geezer (I’m 24), we should hold off on using generative technologies in the classroom on a regular basis, if we ever decide that using them in the classroom is a good idea at all. 

No matter how hard teachers try, kids are going to be more adept at using ChatGPT than they are. It is just a fact of life that kids — whose whole job is to discover everything there is to discover about the world — are better at asking questions than adults who’ve become accustomed to the way they think the world ought to work. But that’s hardly the whole reason.

Educators are in the business of shaping and connecting neurons. To do that, they need to instill facts, processes, and critical thinking skills into the minds of their students — and it just so happens that, until relatively recently, we were quite good at that. An eighth-grade exam in 1895 asked students to calculate the levy a school district needed and to name and describe places like Monrovia, Hecla, and Aspinwall. Meanwhile, today, college students in the U.S. are totally stumped by basic civics questions.

How, you ask, could students graduating from the eighth grade more than 100 years ago manage to pass exams that most of us couldn’t pass today? Their teachers were interested in instilling facts in their brains and then teaching them how to use those facts to get answers. Today, most of us just ask a chatbot to feed us the answer. (READ MORE: Regarding AI, Is Sin Contagious?)

Nobody, at this point, is naive enough to think that students won’t turn to chatbots while studying. They will; it’s unavoidable. But while teachers need to recognize that there’s a high chance a five-page essay assignment will get brainstormed and written by a chatbot (and that they need to respond creatively to that inevitability), they also need to teach their students that there are just some facts they need to know and ideas they need to analyze on their own. 

The last thing a kid needs is a 40-year-old adult to teach him how to dance with the chatbots; he’s already better at it than his teacher will ever be.

