Entering her first year of teaching as a graduate assistant at Bowling Green State University, Sydney Koeplin had more on her mind than how to relate to her students. She was worried about how to deal with generative AI.
At first, Koeplin took a “hard line” against allowing students to use AI beyond basic grammar and spelling checks. (The school’s curriculum allowed conditional AI use, but left those conditions to each professor to define.) After several students in her first semester used AI to generate assignments, Koeplin changed her approach. She moved away from traditional grading to “contract grading,” where a student’s final grade was based on how much effort they put into the work. Koeplin didn’t receive any more AI-generated papers.
“I would tell my students, ‘the world wants to hear your voice,’” Koeplin said. “The whole point of writing is to give a piece of yourself to the world, and if you’re relying on a machine to think for you, then you’re not having freedom of thought.”
As students return to school this fall, they will step into a landscape transformed by AI. Educators are reimagining teaching, while students must learn to use these tools critically, collaborating with AI without outsourcing their own judgment and self-expression. Acquiring knowledge and skills, after all, is the true goal of learning. Learning is so important that, as a society, we dedicate much of the first couple of decades of our lives (sometimes more) to it. Yet education faces the specter of unavoidable change in how we consume and digest information, how we study and how we think — all while an entire generation’s cognitive development hangs in the balance.
## Schools are trying to catch up with students’ AI use
Students have been quicker to adopt AI than their schools have been to prepare for and regulate it.
The numbers are staggering. In one survey, 86% of students globally reported using AI tools in their schoolwork. And it’s not just college students: 46% of students in grades 10 to 12 reported using AI tools for academic and non-academic activities.
Many are using AI tools not just for homework assistance but as study partners, research aids and writing collaborators. For instance, Grammarly recently introduced new specialized AI “agents” along with a writing platform called Grammarly Docs. These tools are built to help with tasks ranging from drafting essays to refining workplace emails to estimating your grade on an assignment.
But schools and educators are scrambling to catch up, developing training and policies that address AI use in schoolwork.
Only 26% of school districts planned to offer AI training during the 2024–2025 school year. Now, around 74% of districts plan to train teachers by Fall 2025, according to findings from the American School District Panel.
A large majority of grade school teachers and college professors (84%) are already embracing AI, with or without school district or university-led training. Some are going so far as to require AI use in their classrooms.
The trouble is the lack of comprehensive guidelines and policies around how students are allowed to use AI in their learning. For now, it’s largely up to individual instructors and teachers to decide when and how students can use AI tools, and what penalties apply if students lean on chatbots to finish assignments, write essays and more.
This gap reveals a critical disconnect. Students, particularly digital natives, have embraced AI as naturally as they once adopted smartphones, the internet and social media. But are they being taught how to use it?
## Can schools use AI responsibly?
Welcomed or not, AI is already embedded in student workflows, often invisibly, making it urgent for schools and universities to establish clear policies that balance AI’s benefits with the cognitive demands essential for deep learning.
Earlier this year, a training effort called the National Academy for AI Instruction — a $23 million initiative backed by Microsoft, OpenAI, Anthropic and the American Federation of Teachers — launched to build educators’ capacity for effective and ethical AI integration.
Experts say effective integration means designing AI use that complements, not replaces, the mental effort required for lasting learning. Research on “desirable difficulties” shows that when learning feels too easy, long-term retention and critical thinking suffer.
As a student, you may feel like the priority is a stellar GPA and general perfection when it comes to completing assignments (which can heighten the temptation to use AI tools). Really, the goal of learning is genuine understanding, pushing yourself to think in new and different ways and expanding what you thought you knew already. AI may offer quick solutions, but the goal of going to school isn’t to get the right answer. To learn, you’ve got to understand how to get to that right answer. And to understand how to get there, you generally need to get things wrong first.
“For most of my students, if not all of them, it was their first semester of college and so they were really worried about writing a perfect paper and getting a good grade,” Koeplin said. “I tried to reiterate from the beginning that the class was really about process, not about a final product. Writing is a journey that is mostly thinking, and machines can’t think for you.”
Across majors, schools and grades, there is consensus that certain types of AI use are no-gos. (vittaya pinpan/Getty Images)
## AI learning do’s and don’ts
As a high school or university student, the temptation to use AI tools might be weighing heavily on your mind. You may be wondering: Is there a way to use AI “creatively” and “responsibly”? How would you even know what “responsible” AI use looks like?
I spoke to John Robinson, a former professor at the Hussman School of Journalism and Media at UNC-Chapel Hill and an editor with 37 years in the newspaper business, who has created a lecture about how writers can responsibly use AI tools (specifically ChatGPT). Robinson shared the following tips for how students can use AI to write ethically:
* Narrowing down broad story angles
* Taking a piece of your writing to analyze its tone and voice
* Getting recommendations for sources to interview for an article
Robinson said that essentially, these are all tips a good editor or journalism professor would be able to share with you, but they can be helpful in moments when you don’t have immediate access to one. A good rule of thumb, he said, is to use AI as you would if you were brainstorming with a classmate, or asking your dormmate to read an assignment for grammar or clarity before you turn it in.
Robinson said that when it comes to the ethics of incorporating these tools into schoolwork, students already know right from wrong. “We spent some time talking about the ethics of (using AI), and my position was always that they were well-versed in journalism ethics, and knew what was right and what was wrong,” he said.
A recent blog post from Studocu, a digital platform for study materials, also lays out some ethical ways AI can be used to study, such as structuring essays, building presentations or paraphrasing a body of text for reference.
I reviewed several rubrics from STEM and medical students and found that other acceptable uses of AI in classwork include:
* Using AI to brainstorm or template ideas for assignments and homework
* Submitting work you’ve already created/completed for iteration or improvement
* Utilizing Grammarly’s editing function for punctuation and spelling checks
Across majors, schools and grades, there is general agreement that these AI use cases are no-nos:
* Submitting all or part of an assignment rubric to an AI platform
* Incorporating any part of an AI-generated response in an assignment
* Assuming that the generated responses are accurate
* Breaching any university policies on plagiarism and cheating
* Misrepresenting your own work and skills
* Using tools like Grammarly to write whole sentences for you
A screenshot from Koeplin’s WRIT1120 course rubric that addresses AI use. (Sydney Koeplin)
## Memory, learning and ‘cramming’
What about the cognitive difference between using AI tools for schoolwork and studying versus traditional cramming? If students aren’t internalizing information because of AI, isn’t that similar to cramming for a test and then moving on? Emerging research from MIT suggests that AI-assisted work, when not properly structured, may be even less effective for building lasting knowledge, reasoning and decision-making.
The key difference lies in the nature of cognitive engagement.
Cramming, while intensive and stressful, still requires active mental effort. Students must organize information, make connections and engage working memory. AI-assisted learning, particularly when used passively, can reduce this cognitive load to such an extent that meaningful learning doesn’t occur.
“The convenience of instant answers that LLMs provide can encourage passive consumption of information, which may lead to superficial engagement, weakened critical thinking skills, less deep understanding of the materials and less long-term memory formation,” the MIT study’s authors wrote. “The reduced level of cognitive engagement could also contribute to a decrease in decision-making skills and in turn, foster habits of procrastination and ‘laziness’ in both students and educators.”
However, the picture isn’t entirely doom and gloom. When used strategically in teaching and learning, AI can enhance rather than replace learning. The technology excels at providing immediate feedback, personalizing instruction to individual learning styles and helping students identify knowledge gaps. How we integrate AI into education now will shape the minds of future generations; the wisdom with which we weave it into the fabric of learning will be the crucial factor.