Most American students now use artificial intelligence. Meet the ones who just say no.
When OpenAI launched ChatGPT in 2022, it sparked a firestorm among educators. Here was a tool that, with just a few lines of prompting, could gather large amounts of information, compose human-like sentences, and provide an answer to seemingly any question. Surely, many educators thought, students would use it to cheat.
As AI-powered chatbots have grown in popularity, so have concerns about their misuse. In March, The Wall Street Journal told parents, “There’s a good chance your child will use AI to cheat.” New York Magazine declared that “Everyone is cheating their way through college.”
For many students, these headlines ring true. But not for everyone.
Why We Wrote This
As artificial intelligence becomes entangled with everyday life, some students are pushing back. Their reasons range from the profound to the practical, and speak to maintaining a sense of community and humanity.
“What’s the point of going to college if you’re going to depend on this thing to give you the right answers?” says Mary Norkett, a student at St. John’s College in Santa Fe, New Mexico. “You’re not improving your mental abilities.”
Ms. Norkett is among a cadre of students who have chosen not to use AI in their studies. Their reasons are both philosophical and practical. Ms. Norkett, for example, is concerned not only about how such shortcuts might dull her critical thinking skills, but also about the accuracy of what is produced by AI bots, which pull vast amounts of information from the internet to mimic human cognition.
Such students are a minority on campus. In a September survey of college students conducted by Copyleaks, the maker of an AI plagiarism detector, 90% of participants said they use AI in their schoolwork. Not all of these students were using it to cheat, of course: The most common uses reported were brainstorming (57%) and drafting outlines (50%).
Still, like many teachers, some AI abstainers worry that chatbots make cheating easier. In an internal OpenAI report on ChatGPT usage, nearly a quarter of 18- to 24-year-olds – the most active of the bot’s more than 700 million weekly users – said they had used it for “exam answers.” A September report from Discovery Education found that 40% of middle and high school students have used AI without their teacher’s permission, and nearly two-thirds of middle and high school teachers say they’ve caught students using chatbots to cheat.
The true extent of the cheating problem is up for debate. Decades of research have put the rate of students who cheat at between 60% and 80%, says Victor Lee, an assistant professor of education at Stanford University. That rate has “remained fairly stable” since ChatGPT came on the scene.
Regardless, it is clear that students are turning to the technology more often. That reflects a variety of tensions: Students feel tremendous pressure to achieve academically while juggling school, extracurriculars, jobs, and social obligations.
“There are also situations where [students] aren’t clear on where the line is between acceptable and unacceptable use,” Professor Lee adds.
Still, some students have resisted the pressures that lead their peers to use AI, whether legitimately or not. They have charted a path through an old-fashioned education that, for them, is fulfilling, meaningful, and decidedly human.
“The full expression of a human being is not a robot. It is a creative and interactive force,” says Caleb Langenbrunner, another St. John’s College student. Simply receiving answers from AI “doesn’t quite feel like what it is to be human,” he says.
Maintaining a sense of community
Unlike at many universities, students at St. John’s say they rarely see their classmates using AI. That may be due to the school’s unusual approach to teaching. It offers only one liberal arts degree, and its entire curriculum consists of a four-year reading list of what the college calls the “greatest books” in history. Titles include volumes such as Plato’s The Republic and Aristotle’s Politics.
St. John’s students are not the only ones who see overreliance on AI among their peers as a problem. Ashanti Rosario, a high school student from New York, says she does not use artificial intelligence, and she hopes her classmates won’t either.
“I think we lose a sense of community in the classroom if we are not actively engaged in whatever work we are given,” she says. When students use AI instead of turning to their peers, it “harms not only the person using it, but others who might gain a different perspective that enhances their learning.”
The rapid rise of AI-generated writing and art has also deepened fears about the future of the humanities. The technology arrived at a difficult moment for creative majors: The number of students graduating from college with degrees in the humanities fell by 24% between 2012 and 2022, according to the American Academy of Arts and Sciences.
“A big part of the humanities and arts is original thinking [and] creativity,” Ms. Rosario says. “That is something that cannot be replicated, especially by a machine. So, I believe that in order to keep this cycle going – of art and spreading culture – it has to come from within.”
A question of credibility
Abera Heitinga, a third-year student studying philosophy and psychology at the University of New Mexico, says he doesn’t use AI because of the “harm” it would do to his future self. He also took classes in logic and critical thinking that shaped his outlook. He says students in one class checked the accuracy of ChatGPT’s answers to various questions, and he didn’t like what the chatbot produced.
Sometimes, when ChatGPT gave him a questionable answer, he would prompt it to explain its reasoning. Mr. Heitinga found that the bot often “just predicts what you want it to say.”
OpenAI has acknowledged that older models tended to tell users what they wanted to hear, even if that meant providing incorrect information. “That shaped my belief in it as far as its credibility,” Mr. Heitinga says. OpenAI says it has updated ChatGPT to address the sycophancy.
As a writing instructor at the Center for Teaching and Learning at the University of New Mexico, Mr. Heitinga has seen firsthand how overreliance on chatbots can keep students from learning how to craft a persuasive argument.
“[AI] takes away the ability to formulate an argument,” he says. “You lose that crucial ability – to brainstorm ideas, organize a paper, know where to put your arguments, how to formulate a thesis, [and] other crucial writing skills, too.”
Stanford University’s Professor Lee says charting a path to more sustainable use of AI may start with how schools engage with the tools – though he acknowledges it can be difficult for teachers to juggle the learning needs of dozens of students. Some teachers have already turned to old-fashioned testing methods, such as asking students to put pen to paper and handwrite essays in class.
Another strategy, he says, “is to develop students’ knowledge of artificial intelligence to help them learn how to use it responsibly, and what its capabilities and limitations are.”
The students interviewed say AI chatbots do have potential uses. For example, they can be a helpful place to start research, because they gather and summarize huge amounts of information quickly.
Ultimately, Mr. Langenbrunner, of St. John’s, says he enjoys learning and working out the answers himself — and doesn’t want to miss out on a good time.
“You know, I guess [AI is] kind of boring,” he says with a laugh. “If I were to use AI to write all my papers, it would take away all the fun.”