
Implications of AI for schools

https://xcancel.com/karpathy/status/1993010584175141038

One of my students recently came to me with an interesting dilemma. His sister had written (without AI tools) an essay for another class, and her teacher told her that an "AI detection tool" had classified it as having been written by AI with "100% confidence". He was going to give her a zero on the assignment.

Putting aside the ludicrous confidence score, the student's question was: how could his sister convince the teacher she had actually written the essay herself? My only suggestion was for her to ask the teacher to sit down with her and have a 30-60 minute oral discussion on the essay so she could demonstrate she in fact knew the material. It's a dilemma that an increasing number of honest students will face, unfortunately.

an hour ago | ubj

I agree. Most campuses use a product called Turnitin, which was originally designed to check for plagiarism. Now they claim it can detect AI-generated content with about 80% accuracy, but I don’t think anyone here believes that.

an hour ago | vondur

80% is catastrophic though. In a classroom of 30 honest pupils, will 6 get a 0 mark because the software says it's AI?

an hour ago | phh

80% accuracy could mean 0 false positives and 80% false negatives.

My point is that accuracy is a terrible metric here; sensitivity and specificity tell us much more relevant information for the task at hand. In that formulation, a specificity < 1 is going to produce false positives, and it isn't fair to those students to have to prove their innocence.

an hour ago | kelseyfrog

It depends on their test dataset. If the test set was written 80% by AI and 20% by humans, a tool that labels every essay as AI-written would have a reported accuracy of 80%. That's why other metrics such as specificity and sensitivity (among many others) are commonly reported as well.

Just speaking in general here -- I don't know what specific phrasing TurnItIn uses.

an hour ago | CaptainNegative
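The skewed-test-set point above can be sketched with a toy calculation (the 80/20 split is hypothetical, not anything TurnItIn has published):

```python
# Toy illustration: on a test set that is 80% AI-written, a useless
# detector that flags *everything* as AI still scores 80% "accuracy".
labels = ["ai"] * 80 + ["human"] * 20   # hypothetical test set
predictions = ["ai"] * 100              # flag every essay as AI

# Accuracy: fraction of all essays labeled correctly.
accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)

# Sensitivity: fraction of AI essays correctly flagged.
sensitivity = sum(
    p == "ai" for p, y in zip(predictions, labels) if y == "ai"
) / 80

# Specificity: fraction of human essays correctly cleared.
specificity = sum(
    p == "human" for p, y in zip(predictions, labels) if y == "human"
) / 20

print(accuracy, sensitivity, specificity)  # 0.8 1.0 0.0
```

The "detector" gets 80% accuracy while clearing zero honest writers, which is why a single accuracy number tells a student almost nothing.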

The promise (not saying that it works) is probably that 20% of people who cheated will not get caught. Not that 20% of the work marked as AI is actually written by humans.

an hour ago | yoavm

I suppose 80% means you don't give them a 0 mark just because the software says it's AI; you only do so if you have other evidence reinforcing the possibility.

an hour ago | v9v

I think it means every time AI is used, it will be detected 80% of the time. Not that 20% of the class will be marked as using AI.

an hour ago | j45
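The disagreement above over what "80%" means can be made concrete with Bayes' rule. All three numbers below are assumptions chosen for illustration, not published figures for any real detector:

```python
# Hypothetical numbers: what a flag means for one accused student.
base_rate = 0.20    # assume 20% of submitted essays really are AI-written
sensitivity = 0.80  # detector catches 80% of AI essays
specificity = 0.90  # detector clears 90% of human essays

# Overall probability an essay gets flagged (true + false positives).
p_flagged = base_rate * sensitivity + (1 - base_rate) * (1 - specificity)

# Bayes' rule: probability a *flagged* essay is actually AI-written.
p_ai_given_flag = (base_rate * sensitivity) / p_flagged

# With these assumed numbers, p_ai_given_flag is about 2/3, i.e. roughly
# one in three flagged students actually wrote their essay themselves.
print(p_flagged, p_ai_given_flag)
```

So even granting the tool's headline number, a flag on its own is weak evidence against any individual student unless the false-positive rate is far lower than assumed here.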

I had Turn It In mark my work as plagiarism some years ago and I had to fight it. It was clear the teacher wasn't doing their job and was blindly following the tool.

What happened is that I did a Q&A worksheet but in each section of my report I reiterated the question in italics before answering it.

The reiterated questions of course came up as 100% plagiarism because they were just copied from the worksheet.

an hour ago | tyleo

This matches my experience pretty well. My high school was using it 15 years ago and it was a spotty, inconsistent morass even back then. Our papers were turned in over the course of the semester, and late into the year you’d get flagged for “plagiarizing” your own earlier paper.

26 minutes ago | pirates

When I was in college, there was a cheating scandal for the final exam where somehow people got their hands on the hardest question of the exam.

The professor noticed it (presumably via seeing poor "show your work") and gave zero points on the question to everyone. And once you went to complain about your grade, she would ask you to explain the answer there in her office and work through the problem live.

I thought it was a clever and graceful way to deal with it.

an hour ago | huevosabio

This is a nice approach. The students who know the material, or who even prepare manually before seeing the prof, achieve the objective of learning.

an hour ago | j45

Doesn't Google Docs have fairly robust edit history? If I were a student these days I'd either record my screen while doing my homework, or at least work in Google Docs and share the edit history.

an hour ago | neom

This still leaves many options open for plagiarism (for example, a second, separate device).

38 minutes ago | HelloUsername

Yes. When I was an educator, reviewing version history was an obvious way to clarify if/how much students plagiarized.

an hour ago | germinalphrase

Now imagine this but it's a courtroom and you're facing 25 years.

an hour ago | bad_haircut72

Family law judges, in my small experience, are so uninterested in the basic facts of a case that I would actually trust an LLM to do a better job. Not quite what you mean, but maybe there is a silver lining.

We are already (in the US) living in a system of soft social-credit scores administered by ad tech firms and non-profits. So “the algorithms says you’re guilty” has already been happening in less dramatic ways.

an hour ago | stocksinsmocks

Easy if one of these options is available to the writer:

- Write it in Google Docs and share the edit history; it is date- and time-stamped.

- Make a video of yourself writing it in the Google Docs tab.

If this is available, and sufficient, I would pursue a written apology to remind the future detectors.

Edit: clarity

an hour ago | j45

Edit history in Google Docs is a good way to defend yourself from accusations of AI tool use.

an hour ago | johanam

Ironic that one of the biggest AI companies is also the platform to offer a service to protect yourself from allegations of using it.

an hour ago | andrewinardeer

I seriously think the people selling AI detection tools to teachers should be sued into the ground by a coalition of state attorneys general, and that the tools should be banned in schools.

an hour ago | hiddencost

In my CS undergrad I had Doug Lea as a professor, a really fantastic professor (the best teacher I have ever had, bar none). He had a really novel way to handle homework hand-ins: you had to demo the project. So you sat down with him, ran the code, and he would ask you to put in some inputs (ones highly likely to be edge cases that would break it). Once that was sufficient, he would ask you how you did different things and have you walk him through your code. Then when you were done he told you to email the code to him, and he would grade it. I am not sure how much of this was an anti-cheating device, but it required that you knew the code you wrote and why you wrote it that way.

I think AI has the possibility of weakening some aspects of education, but I agree with Karpathy here. In-class work, in-person defenses of work, verbal tests: these were cornerstones of education for thousands of years and have been cut out over the last 50 years or so outside of a few niche cases (the thesis defense), and it might be a good thing if they come back.

an hour ago | ecshafer

Yep, it's easy to counter AI plagiarism, but you need time. In most universities around the world (online universities especially), the number of students is way too big, while professors get more and more pressure from publishing and bureaucracy.

an hour ago | mercacona

I did my masters in GaTech OMSCS (ChatGPT came out at the very end of my last semester). Tests were done with cameras on; they were recorded and then watched, I think by TAs. Homework was done with automated checking and a plagiarism checker. Do you need in-person proctoring via test centers or libraries? Video chats with professors? I am not sure. Projects are important, but maybe they need to become a minority of the grade, with more based on theory, to circumvent AI?

34 minutes ago | ecshafer

So we are screwed once we get brain-computer interfaces?

an hour ago | SirMaster

[deleted]

35 minutes ago

> The students remain motivated to learn how to solve problems without AI because they know they will be evaluated without it in class later.

Learning how to prepare for in-class tests and writing exercises is a very particular skillset which I haven't really exercised a lot since I graduated.

Never mind teaching the humanities, for which I think this is a genuine crisis: in-class programming exams are basically the same thing as leetcode job interviews, and we all know what a bad proxy those are for "real" development work.

an hour ago | sharkjacobs

> in-class programming exams are basically the same thing as leetcode job interviews, and we all know what a bad proxy those are for "real" development work.

Confusing university learning for "real industry work" is a mistake and we've known it's a mistake for a while. We can have classes which teach what life in industry is like, but assuming that the role of university is to teach people how to fit directly into industry is mistaking the purpose of university and K-12 education as a whole.

Writing long-form prose and essays isn't something I've done in a long time, but I wouldn't say it was wasted effort. Long-form prose forces you to do things that you don't always do when writing emails and powerpoints, and I rely on those skills every day.

6 minutes ago | yannyu

I use it every day.

Preparing for a test requires understanding what the instructor wants. Concentrate on the wrong thing and you get marked down.

Same applies to working in a corporation. You need to understand what management wants. It’s a core requirement.

an hour ago | iterateoften

It's a fair question, but there's maybe a bit of US defaultism baked in? If I look back at my exams in school they were mostly closed-book written + oral examination, nothing would really need to change.

A much bigger question is what to teach assuming we get models much more powerful than those we have today. I'm still confident there's an irreducible hard core in most subjects that's well worth knowing/training, but it might take some soul searching.

an hour ago | qsort

It seems like a good path forward is to try to replicate the idea of "once you can do it yourself, feel free to use the tool from then on" (knowing how various calculator operations work before you let the calculator do them for you).

I'm curious if we instead gave students an AI tool, but one that would intentionally throw in wrong things that the student had to catch. Instead of the student using LLMs, they would have one paid for by the school.

This is more brainstorming than a well-thought-out idea, but I generally think "opposing AI" is doomed to fail. If we follow a Montessori approach, kids are naturally inclined to want to learn things; if students are trying to lie/cheat, we've already failed them by turning off their natural curiosity for something else.

an hour ago | KerryJones

I agree. I think schools and universities need to adapt; just like calculators, these things aren't going away. Let students leverage AI as a tool and come out of uni more capable than we did.

AIs _do_ currently throw in an occasional wrong thing. Sometimes a lot. A student's job needs to be verifying and fact-checking the information the AI is telling them.

The student's job becomes asking the right questions and verifying the results.

28 minutes ago | jay_kyburz

In other words, learn to use the tool BUT keep your critical thinking. Same with all new technologies.

I'm not minimizing Karpathy in any way, but this is obviously the right way to do this.

2 hours ago | ekjhgkejhgk

"You have to assume that any work done outside classroom has used AI."

That is just such a wildly cynical point of view, and it is incredibly depressing. There is a whole huge cohort of kids out there who genuinely want to learn and want to do the work, and feel like using AI is cheating. These are the kids who, ironically, AI will help the most, because they're the ones who will understand the fundamentals being taught in K-12.

I would hope that any "solution" to the growing use of AI-as-a-crutch can take this cohort of kids into consideration, so their development isn't held back just to stop the less-ethical student from, well, being less ethical.

an hour ago | mark242

> There is a whole huge cohort of kids out there

Well, it seems the vast majority don't care about cheating and are using AI for everything. And this goes from primary school to university.

It's not just that AI makes it simpler; many pupils cannot concentrate anymore. TikTok and the others have fried their minds. So AI is a quick way out for them. Back to their addiction.

an hour ago | tgv

What possible solution could prevent this? The best students are learning on their own anyways, the school can't stop students using AI for their personal learning.

There was a Reddit thread recently that asked whether all students are really doing worse, and it basically said that there are still top performers performing toply, but that the middle has been hollowed out.

So I think, I dunno, maybe depressing. Maybe cynical, but probably true. Why shy away from the truth?

And by the way, I would be both. Probably would have used AI to further my curiosity and to cheat. I hated school, would totally cheat to get ahead, and am now wildly curious and ambitious in the real world. Maybe this makes me a bad person, but I don't find cheating in school to be all that unethical. I'm paying for it, who cares how I do it.

People aren't one thing.

an hour ago | techblueberry

Sure, but the point is that if 5% of students are using AI then you have to assume that any work done outside classroom has used AI, because otherwise you're giving a massive advantage to the 5% of students who used AI, right?

an hour ago | sharkjacobs

I’ve been following this approach since last school year. I focus on in-class work and home-time is for reading and memorization. My classmates still think classrooms are for lecturing, but it's coming. The paper-and-pen era is back to school!

an hour ago | mercacona

This doesn't address the point that AI can replace going to school. AI can be your perfect personal tutor, helping you learn things 1:1. Needing to have a teacher and prove to them that you know what they taught will become a legacy concept. That we have an issue of AI cheating at school is, in my eyes, a temporary issue.

an hour ago | charcircuit

ChatGPT just told me to put the turkey in my toaster oven legs facing the door, and you think it can replace school. Unless there is a massive architectural change that can be provably verified by third parties, this can never be. I'd hate for my unschooled surgeon to check an LLM while I'm under.

an hour ago | alariccole

What's the alternative if someone doesn't know something during a procedure? Just wing it? Getting another data point from an LLM seems beneficial to me.

an hour ago | charcircuit

Ask a human who does. If there are no competent humans on-call before the procedure starts, reschedule the procedure.

19 minutes ago | axus

Just curious, not being a turkey SME, what's the downside to positioning the turkey that way?

an hour ago | CamperBob2

Most turkeys of my acquaintance would not fit into a toaster oven without some percussive assistance.

an hour ago | patrickmay

It is considered valuable and worthwhile for a society to educate all of its children/citizens. This means we have to develop systems and techniques to educate all kinds of people, not just the ones who can be dropped off by themselves at a library when they turn five, and picked up again in fifteen years with a PhD.

an hour ago | sharkjacobs

Sure. People who are self-motivated are the ones who will benefit earliest. If a society values ensuring every single citizen gets a baseline education, it can figure out how to get an AI to persuade or trick people into learning better than a human could.

44 minutes ago | charcircuit

For someone who wants to learn, I agree with this 100%. AI has been great at teaching me about hundreds of topics.

I don't yet know how we get AI to teach unruly kids, or neurodivergent kids. Perhaps, though, the AI can eventually be vastly superior to an adult because of the methods it can use to get through to the child, keep the child interested, and present the teaching in a much more interactive way.

an hour ago | qingcharles

This is the correct take. To contrast with the Terence Tao piece from earlier (https://news.ycombinator.com/item?id=46017972), AI research tools are increasingly useful if you're a competent researcher who can judge the output and detect BS. You can't, however, become a Terence Tao by asking AI to solve your homework.

So, in learning environments we might not have an option but to open the floodgates to AI use, but abandon most testing techniques that are not, more or less, pen and paper, in-person. Use AI as much as you want, but know that as a student you'll be answering tests armed only with your brain.

I do pity English teachers who have relied on essays to grade proficiency for hundreds of years. STEM fields have an easier path through this.

2 hours ago | trauco

Yesterday's Doonesbury was on point here: https://www.gocomics.com/doonesbury/2025/11/23

Andrej and Garry Trudeau agree that "blue book exams" (i.e., the teacher gives you a blank exam booklet, traditionally blue, to fill out in person, after confiscating devices) are the only way to assess students anymore.

My 7 year old hasn't figured out how to use any LLMs yet, but I'm sure the day will come very soon. I hope his school district is prepared. They recently instituted a district-wide "no phones" policy, which is a good first step.

2 hours ago | wffurr

Blue books were the norm for exams in my social science and humanities classes well after every assignment was typed on a computer (probably a laptop, by that time) with Internet access.

I guess high schools and junior highs will have to adopt something similar, too. Better condition those wrists and fingers, kids :-)

an hour ago | phantasmish

I'm oldish, but when I was in college in the late 90s we typed a huge volume of homework (I was a history & religious studies double major as an undergrad), but the vast majority of our exams were blue books. There were exceptions where the primary deliverable for the semester was a lengthy research paper, but lots and lots of blue books.

an hour ago | eitally

Oh how I hated those as a student. Handwriting has always been a slow and uncomfortable process for me. Yes, I tried different techniques of printing and cursive as well as better pens. Nothing helped. Typing on a keyboard is just so much faster and more fluent.

It's a shame that some students will again be limited by how fast they can get their thoughts down on paper. It's such an artificial limitation, totally irrelevant to real-world work now.

an hour ago | nradov

Obviously the solution is to bring back manual typewriters.

26 minutes ago | o11c

Maybe this is a niche for those low-distraction writing tools that pop up from time to time. Or a school-managed Chromebook that's locked to the exam page.

an hour ago | wffurr

New York State recently banned phones statewide in schools.

an hour ago | ecshafer

It is, but it does not matter, because:

1. Corporate interests want to sell product.

2. Administrators want a product they can use.

3. Compliance people want a checkbox they can check.

4. Teachers want to be able to continue what they have been doing thus far within the existing ecosystem.

5. Parents either don't know, don't care, or do care but are unable to provide a viable alternative, or can and do provide one.

We have had this conversation (although without the AI component) before. None of it is really a secret. The question is really what the actual goal is. Right now, in the US, education is mostly in name only, unless you are involved (which already means you are taking steps to correct it) or are in the right zip code (which is not a guarantee, but it makes your kids' odds better).

an hour ago | A4ET8a8uTh0_v2

This couldn't have happened at a better time. When I was young, my parents found a schooling system with minimal homework so I could play around and live my life. I've since moved to a country with a lot less flexibility. Now that my kids will soon be going to school, compulsory homework will be obsolete.

Zero homework grades will be ideal. Looking forward to this.

an hour ago | renewiltord

If AI gets us reliably to a flipped classroom (research at home, work through problems in class) then I'm here for it. Homework in the traditional sense is an anti-pattern.