> The parents' case hangs largely on the student handbook's lack of a specific statement about AI, even though that same handbook bans unauthorized use of technology. "They told us our son cheated on a paper, which is not what happened," Jennifer Harris told WCVB last month. "They basically punished him for a rule that doesn't exist."
I'm going out on a limb here, but if this is the viewpoint of the people who raised him, then I'm not surprised he cheated.
If this were my son and the facts were the same, he'd be grounded in addition to whatever consequence the school deems appropriate.
> I'm going out on a limb here, but if this is the viewpoint of the people who raised him, then I'm not surprised he cheated.
At my child's elementary school, and now middle school, the teachers openly encourage using AI chat for schoolwork, to help the kids understand both the usefulness and the uselessness of it.
I think that's great. AI isn't going away, so pretending students won't be using it at all times, both now and in their lives, is pretty naive and hopeless. Better to embrace it and teach them what it can and can't do.
From the article:
> the students had indiscriminately copied and pasted text from the AI application, including citations to nonexistent books (i.e., AI hallucinations).
They should not be punished for using AI. They should, however, just get a very bad grade for "including citations to nonexistent books". Which would also be a good lesson in why AI is useful but you can't trust it at all without validation.
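To make "validation" concrete, here is a minimal sketch of the kind of sanity check a student (or teacher) could run against an AI-generated bibliography. It assumes the public Open Library search endpoint and the Python requests library; a title match in one catalog is only a rough filter, not proof the citation is accurate.

```python
# Minimal sketch: check whether an AI-cited book title matches anything
# in Open Library's public catalog (no API key required).
# Assumption: the search endpoint at openlibrary.org/search.json;
# a real workflow would also compare authors, years, and other catalogs.
import requests

def title_found(title: str) -> bool:
    resp = requests.get(
        "https://openlibrary.org/search.json",
        params={"title": title, "limit": 1},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("numFound", 0) > 0

for citation in ["Pride and Prejudice", "A Totally Fabricated AI Source"]:
    print(citation, "->", "found" if title_found(citation) else "no match")
```

Even a crude check like this would have flagged citations to books that simply do not exist.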
Agreed.
Only someone trying to cheat would use the excuse that it wasn’t explicitly stated that AI was cheating.
This reminds me of the court case where they asked the court to define child pornography and they said “I can’t define it, but I know it when I see it.”
Imagine saying with a straight face that some pictures you have of a minor are fine because this particular pose, facial expression, and clothing wasn't specifically designated as child porn. It would instantly make you sound like a pedophile, just as he sounds like a cheater.
> This reminds me of the court case where they asked the court to define child pornography and they said “I can’t define it, but I know it when I see it.”
If you’re referring to the famous statement from Justice Potter Stewart’s concurrence in Jacobellis v. Ohio, that comment was in reference to defining “hardcore pornography,” not child pornography.
Exactly, it wasn't about CP (in particular) at all, just pornography. Which makes it a really horrible ruling: at least with CP, you can use age limits, though there are still huge controversies about 1) pictures by parents of their kids doing stuff like taking a bath and 2) artwork/animation that isn't even photography and never involved any actual children.
Stewart's ruling was ridiculous: how is anyone supposed to know whether something is "pornographic" or not if it can just be determined by some judge arbitrarily and capriciously, and there's no objective standard whatsoever? They could just rule any picture of an unclothed person to be illegal, even if it's something medical in nature. Heck, they could even rule a medical diagram to be porn. Or anything they want, really.
I think Germany has exclusions for pictures of your kids taking baths. That's just pretty common, and midwives will tell you to take a picture when they show you how to bathe your baby for the first time.
Back in the 70's there was a sex-ed book called "Show Me" which featured very frank photographs of both adults and children, completely naked.
It was subject to litigation at the time it was released and was upheld as not being obscene. But it continued to cause moral panics and was eventually taken out of print by the publisher.
But it was never banned and can still be found as a collector's item. And in the mid 70's it was completely normal to walk into a book store and see it on display.
Highly successful people like Elon Musk and Donald Trump succeed in part because they push the rules really hard, and they win enough in the cases when it matters. So it's a proven strategy, though it doesn't always work.
Criminals push the rules really hard too. It's just that they're only labeled criminals by society if they're caught, prosecuted, and punished.
People who "push the rules" (I call this cheating, if I was playing a tabletop game with someone who did this, I would say they were cheating too) have an unfair advantage over the people who do follow the rules.
It's like sociopaths: smart sociopaths become successful businesspeople and politicians, while stupid sociopaths end up in prison.
People who are good at pushing the rules and have some kind of knack for knowing which rules to push and how far, end up succeeding over the rule-followers, while the people who lack this talent and push the wrong rules, or push too far, end up failing somehow (and maybe in prison).
It's no good cherry-picking success stories to evaluate a strategy. You have to know how often it fails too.
Anyway, "pushing the rules" is so vague, and it could be done intelligently, ethically, or otherwise.
In this particular case the student copy-pasted AI output into an assignment and tried to pass it off as his own work. I mean, come on... the fact this went to court is just absurd.
The facts do not seem clear at all if you examine them carefully, and the article does a very poor job actually covering the details in a non-prejudicial way.
What facts? The facts are context dependent without identity. Revision time is meaningless. People still write rough drafts up on real paper, then copy the document in and put the final touches on it. How do you differentiate two processes, where part of the process is not technology driven, and not visible?
Is it right to say that revision time tracked by software is the whole time spent and correct in supporting AI use?
If it is a large project, it is often allowed to use previous material done in past assignments as components, given the strict time limits. If the teacher verbally OK'ed this, that is explicit approval, but how do you prove it after the fact? The court can only judge based on facts that are in evidence. How do you prove otherwise when there is no clear pattern of abuse? Academia has had decades to fine-tune its legal teams for this, with free money provided by you in the form of taxes.
The academic policy cited is indirectly self-referential. It both allows and disallows activity based on the teacher's discretion of what constitutes explicit permission. There is also no internationally recognized standard for referencing AI-derived work (arbitrary).
This is also not uniquely defined, nor does it prescribe a valid process in adversarial environments, violating due process.
Academia has a long history of engaging in coercive behavior and violating due process. Many people working in academia view any investigation into an issue as creating a hostile work environment among co-workers. There is no duty to investigate, and in investigating, the person involved (the chairperson or whatever) is creating a hostile work environment with co-workers. It's something they don't do unless forced, and when forced there is naturally a conflict of interest, which violates due process. They are incentivized to seek to prove a false narrative in which the investigation is not a hostile work environment (i.e., it was the student).
Vexatious process, lack of controls or agency to correct, and unclear instruction are structural elements that lead to "struggle sessions".
Struggle sessions are a circular Maoist thought reform practice based in torture (Maoism is a derivative of Marxism/Communism). It is all about psychologically torturing the subject without due process, accountability or defense, and forcing them to engage in the arbitrary circular process that breaks them.
The nature of corrupt systems is often that you have no choice but to cheat, because the guidelines are designed to disadvantage certain people, are arbitrary, self-referential, and circular, and act as a sieve/filter preventing future options. For example, about 20 years ago, engineering fields required that you go through a course of physics (3 classes) as a roadblock. They had set up the notorious 3-question test, where individual students who were deemed acceptable by the professor were told prior to the test how to answer correctly.
Everyone else failed, and it went like this: the answer to the first question was needed to solve the second, and the second to solve the third. Each question involved a specific set of significant digits, and the instructions said to follow the practices for significant digits. The trick, which was arbitrary, was in how you rounded between the problems. Because each answer feeds the next, rounding error compounds, and grading was instead based on an arbitrary process for rounding. In other words, only those who were told beforehand passed. If they followed the textbook process to reduce rounding error, they failed. There have been a number of similarly indirect schemes promoted by teachers' union representatives at various conferences over the years.
These things go to the core basis of the Prussian model of education, which was focused only on creating loyal, unthinking soldiers. The model we have today focuses on loyal, unthinking workers. You get these types by breaking them down psychologically, in effect destroying the individual going through the process.
In my opinion, you are far too trusting of authority.
I'd highly suggest you read Joost Meerloo and Robert Lifton to recognize these torture structures. They are everywhere and they have great impact, and damage the subject, but are subtle and hard to prove without having been exposed to it.
That’s a lot of words that say nothing to defend high school plagiarism.
If you turn in work that isn’t yours and represent it as yours, it is plagiarism, whether accidental or intentional.
You clearly didn't understand the meaning of the words, and rather seem to have embraced circular reasoning outside rational foundations. This is deluded, and leads to delusion.
Delusion is inherently slothful and steeped in darkness and destructive outcomes, is that really something you want to be passing on to your children?
The indirection making this circular is based on the definition of work, ownership, and the concept of the originality of ideas. The identity of such is based in the question, "according to whom and by what measure?". Without knowing these, no equal measure can be made one way or the other, and hence it is arbitrary. Systems that end up determining important parts of your future shouldn't be arbitrary.
The definition you provide changes for each person you ask, and when it comes down to an authority, without defining such things, enforcement becomes arbitrary.
Everything would be plagiarism under that definition, and be arbitrary since not everyone is punished. All information processed originally comes from outside yourself, and not even adult professionals can get this right.
Is it really fair or reasonable to deprive children of their future opportunities over this? Is it fair or reasonable to destroy children's futures using this type of claim?
I'll pray you get what you need to become a good role-model for your children. As it stands, I don't think you actually have children.
Delusion that persists is a strong indicator of schizophrenia, I hope you aren't a schizophrenic parent. You can read more about how that impacts the children here: https://fherehab.com/learning/parent-schizophrenia
As long as we're casting diagnoses, do you suffer from bouts of mania or are you just on black market Adderall with nothing productive to do? You still need to sleep you know, I know with Adderall you can still feel good 24 hours in, but your reasoning abilities degrade.
Anyway, yes, children should be disciplined in order to prevent them from plagiarizing a.k.a. putting their name on work that's not theirs, otherwise they grow up to be liars and con artists.
You can't call people delusional, or insinuate they might be schizophrenic.
> Not letting kids plagiarize is authoritarian, also if you disagree with me you have schizophrenia and are unfit to parent.
Bravo, my friend, this is a masterclass in HN-style trolling. There is absolutely no way a thinking human typed this completely seriously. Green account with a post that perfectly ticks every required box for an excellent HN parody? Come on, now.
What's the relevance? Are you going to embark on a "Let's Educate The Users" mission for parenting?
It would be futile. Parents and children are now united in not wanting to be educated.
This has got to be specific to the US, yes? None of my overseas colleagues have this attitude toward education, and my wife (now an American, but an immigrant) certainly doesn't.
A lot in France too, and it’s not just minorities who are uncivilized anymore.
SAD!
What is unauthorized use of technology? Is the light I need to read not technology? Is using the internet to find more about a topic not technology? Where is the line that makes AI forbidden?
The lack of implicit or explicit authorization. As the school has lights, you may assume they are authorized implicitly.
This is unproductive and unenlightening pedantry.
I think the throwaway actually raises the valid point about the rule being an exceedingly broad catchall. The type primed for selective and weaponized enforcement.
That said, the kids are clearly defenseless in this situation, for blatant plagiarism as well as for just being factually incorrect in their report.
> The type primed for selective and weaponized enforcement
Theoretically true, but irrelevant because this particular case isn't that.
Yes, it is broad, and probably a bad rule. That said, there is more than that simple rule in this case that points toward intentional academic dishonesty. If he were my son, getting off on a technicality wouldn't be exoneration in my house.
This is like saying that they have an AI policy, but that using an LLM isn’t in violation since it is just calculus applied to words and not true intelligence.
Courts aren’t stupid for the most part. Most of them are happy to interpret things in terms of what a ‘reasonable person’ would think, for better or worse. Right now, most reasonable people would say that using an LLM to do your work without disclosing it is in violation of the spirit of the student handbook.
I could fine tune an AI, and name it after myself, and put my own name at the top of the paper as an attribution, but no reasonable person would say that was following the spirit of the law if I turned that in as a class assignment.
The part where students were given explicit guidance on the use of AI resources and told how to cite it appropriately. Besides, even aside from the use of technology it’s still a blatant case of plagiarism as he passed off work he did not write as his own.
How could you even cite AI “appropriately”? That makes about as much sense as citing Google.
But no paper (even high school level) which does that should ever be accepted.
No paper?
If you're clear what your sources are, why does it matter who (or what) you quote?
Boris Johnson and GWB are known (for different reasons) for spouting gibberish sentences, yet cite them and, if the quote was appropriate, then no foul; all fiction is made up and that too can be quoted when appropriate.
When I was at school in the UK 24 years back, media studies was denigrated as "mickey mouse studies", but in retrospect the ability to analyse and deconstruct media narratives would have been useful for the country.
Now AI is a new media.
Sometimes it will be the right thing to quote; sometimes it will be as much of an error as citing Wuthering Heights in an essay about how GWB handled the War On Terror.
Have you actually read the piece? The answers to those are in the written policy the student was given. But even without the policy, it should be pretty clear that passing off others' work as your own (be they people or AI) is academic dishonesty.
As the judge said, "the emergence of generative AI may present some nuanced challenges for educators, the issue here is not particularly nuanced"
Is what I wrote here mine or not? I used the autocorrect suggestions almost exclusively, wrote few letters only.
Then, no. This isn’t text you generated. No one cares on Internet forums though.
Who came up with the words? If autocorrect is acting as a typist, transferring your words to screen, you are the author.
What if I first asked ChatGPT what should I say? And what's the difference from just copy pasting it?
The question is who comes up with words. If you re-type textbook, you are plagiarizing. Same happens if you re-type ChatGPT output.
On the other hand, if you read some text first (be it ChatGPT's output, or a textbook) and then rephrase it yourself, then you are the author.
How much do you have to rephrase? Is changing every other word to a synonym enough? That's actually a gray area, and it depends on the teacher. Most teachers would expect you to at least change the sentence structure. But in this case it's completely irrelevant, as we know the students did copy/paste.
I really don't see why you are trying to present ChatGPT as something special re plagiarism. Copying others' work is copying. Paying $10 to someone to do your homework and then copying their answer as-is is cheating. So is using ChatGPT to do it for free.
ChatGPT is not someone. It's a tool.
So is a textbook. Still not OK to copy homework from it.
Textbook has an author that you can copy. You can't copy output of an auto suggest, it's just yours.
It does not matter if author is human or computer.
If there is a spelling bee, but student is secretly using spellcheck on the phone, they are cheating.
If there is a math speed competition, but student is using a calculator on the phone, they are cheating.
If it's a calculus exam, but student is using Wolfram Alpha (or TI-89) to calculate integrals and derivatives, it is cheating.
If it's a written exam but student is using ChatGPT to write the text, it is cheating as well. Not that different from previous cases.
There is no difference. They’re not your words.
These are games no one in the real world is interested in playing.
How come they're not my words? I thought of sending them, not the keyboard. Same with ChatGPT, it doesn't do anything on its own - even if it could, it's a tool, not a person.
If he had turned in the prompt he wrote, then this would be his words.
If your position is correct, then I suppose asking another person to write an essay for you is in fact your writing as well. Which is absurd.
This pedantry is useless. Ask any person if an essay produced by AI was written by the person writing the prompt and I think a majority will say “no”. If AIs writing essays for you isn’t plagiarism, then nothing is.
How could it be the same if another person wrote it?
Situation A - a person uses a tool. They are the author.
Situation B - a person contracts another person. They are not the author. The other person might be using a tool, but it doesn't matter.
This is not a theoretical debate. This is how the current legal framework works, and if you expect someone to behave differently, you have to be explicit. Change the law if you want the default to be different. This is how it works now.
> This pedantry is useless. Ask any person if an essay produced by AI was written by the person writing the prompt and I think a majority will say “no”.
I'm an expert in the field. I don't need to ask mainstream people about their sci-fi-influenced opinions; they don't matter to me nor to the law. This is how it works: it's not magic, it's not sci-fi, it's not a person, it can't be an author, and thus it can't be plagiarized by the user, nor can the user violate the tool's copyright by using the output.
It's a tool that authors can use to create their works, and even if all they did is push a single button, they are the author. Compare this to me applying a filter on a blank canvas to produce an abstract art wallpaper in Photoshop - am I the author or is Photoshop? Let me tell you, don't try to steal my work. I did it, not Photoshop, that was just a tool that made it easier for me.
Same with ChatGPT - this morning I used it to write an architectural proposal. It's my proposal, there is no way I somehow "plagiarized" something. Doesn't matter that I pushed like 50 buttons in total; it's my work, I am the author.
--
And if schools don't actively teach children to use this technology, they should be reformed. If this school was in my city, I'd vote for the party that will cancel it - for the extremely serious offense of letting the children think it's wrong to use tools. At least they should explain that the goal is elsewhere and why a particular tool shouldn't be used for that task. It's not like children will just know, you know.
This is just like when teachers at my own school told us that Wikipedia is a bad source and we can't use it because it can't be trusted. That's total bullshit, just be honest - this assignment is supposed to test your writing skills, not tool usage skills.
> This is how the current legal framework works, and if you expect someone to behave differently, you have to be explicit.
The classroom teacher was explicit in their expectations. The student did not follow the instructions. Did you RTFA?
> It's a tool that authors can use to create their works
This isn't about authorship or ownership. It doesn't matter whether the words are "yours" or not - that you keep making that a point of contention is a semantic sideshow. This isn't a commercial or artistic enterprise. If the student wants to publish their copy-pasted, error-ridden slop (or masterful slop!), then by all means, they can go for it.
Rather, this is a classroom, where the goal is to learn, think, and demonstrate understanding. Copy-pasting responses from AI does not do this (even setting aside the explicit instructions not to do so). Similarly, plagiarism isn't bad in high school because it hurts other authors; it's bad because it's a lazy end-run around the process of learning and thinking that undermines those goals and creates bad habits to avoid critical thinking.
> a person uses a tool
If you simplify and trivialize everything to an extreme degree, then yes, you might have a point…
> Wikipedia is a bad source and we can't use it because it can't be trusted
No, but you still shouldn’t use it for anything besides research (i.e. you should be citing the sources the Wikipedia article is based on directly).
> just be honest - this assignment is supposed to test your writing skills,
Isn’t that more than perfectly obvious already?
It's important to understand that all ChatGPT knows is what's in its training set and what was in the prompt. I see it all the time in programming - if you are doing boring, run-of-the-mill websites or plugging other people's libraries together, then AIs like Copilot have a very high success rate. But try something new or unusual (even a bit unusual, like QNX), and the lack of training data means the amount of hallucination becomes incredibly high, and the resulting code is often outright wrong.
A lot of times that's fine - after all, there are programmers who spend their careers never writing an original algorithm, and they could definitely use the help. And in a lot of cases an original architecture is actually a bad idea - stick to the boring technology, create your architecture proposal based on re-blending previous projects with a tiny bit of changes. Nothing wrong with that.
But school tries to be better than that (it does not always succeed, but it at least tries). That's why students are taught how to multiply numbers by hand, and only then are they allowed calculators. And they must be taught how to find their primary sources before being allowed to use Wikipedia. And they must be taught how to write their thoughts in their own words before being allowed to use ChatGPT.
Sure, some of them will go on and never multiply numbers by hand, nor ever read a primary source nor ever create a highly original proposal. That's fine and even expected. Still, the school aims high even if not everyone can get there.
Somewhere in there is the problem. ChatGPT & Co should be viewed as tools - they can do research, they can discuss and echo your ideas, they can write FOR YOU. (Or at least that's the idea, when they don't go nuts). And if they are your tool, you are doing the work - you are the author.
If ChatGPT copies the Wikipedia article, you are now in trouble. Your tool caused you to plagiarize without even noticing, but that won't save you. Same if you include complete nonsense generated by your tool. Still your fault (although some lawyers have been getting away with it).
If you copy the Wikipedia article yourself, at least you know you cheated.
But equating the tool to the plagiarizing is absurd. What is possible is that perhaps the problem is now outperformed / overpowered by the tool. It may now be trivial to answer the problem. A few clicks?
But again it's long tradition for schools to outlaw some tools in the effort to teach something specific that the tool would replace: calculator out of bounds, or only basic calculator and no computer, no books, sliderule only, etc.
Yes, exactly right. If your tool caused you to plagiarize, it's your problem - you should've checked the output. You can't point to ChatGPT, it'd be like pointing to Word. You are the author and thus you are responsible.
Schools need to adapt. Explain why a tool is forbidden - it will improve the efficiency of learning if the kids know what the purpose is. Use different, more fun methods to teach mental skills, and teach overcoming problems using various always-evolving hi-tech tools - instead of the rigid lesson/homework style of today. Teach them that change is constant and that they need to adapt. This is not something theoretical - there are state-sponsored schools operating in this way where I live. It's great.
In education, the goal is internalizing to the individual the knowledge required for them to bootstrap into a useful, contributing member of society. Things like composition of written works, organizing one's thoughts into communicable artifacts, doing basic mathematics, familiarity with the local and encompassing polity and it's history, how to navigate and utilize institutions of research (libraries) etc... Any technology employed that prevents or sidesteps that internalization is unauthorized.
It ain't that hard to connect the dots unless you're going out of your way to not connect the dots.
If that is the goal of a specific task, be explicit about that. From my personal experience, it's just what old teachers say because they can't deal with the new reality. The teachers who incorporated AI and other tech into their lessons are much more successful and better rated - by parents as well as children, and the final exams too.
They were explicit about that! There was an AI policy that the student knew about and blatantly ignored. I am not sure how much more you wanted.
Here is a relevant quote from the TFA:
> Although students were permitted to use AI to brainstorm topics and identify sources, in this instance the students had indiscriminately copied and pasted text from the AI application, including citations to nonexistent books (i.e., AI hallucinations).
(And re better ratings: sadly, making classes less useful usually improves the rating. I'm sure if there were a class where they were just watching cartoons, with super-simple quizzes at the end that everyone could answer, it'd have the highest ratings from most students, as well as high ratings from many parents, and the best scores on finals. Now, it might not work that hot in the real world, but by that time the school would be long over...)
The problem is that the school simply didn't teach them about the tool enough, and they're turning something that should be just another lesson into disciplinary action.
Why do you expect children to know math only after months of tries, but to understand the perils of AI after hearing a one-sentence regulation? That's not going to help the kids. You need to spend time practicing using the tool with them, showing them the pitfalls in practice, and only after enough time can you start rating them.
The school handed them a gun and they're Pikachu-surprised a kid got shot, and now they're blaming it on the kid who had it in hand - but the school is to blame. And it's certainly not newsworthy, or where is the math exam results article?
> they're turning something that should be just another lesson into disciplinary action.
Remember the "disciplinary action" here is giving the student a bad grade, with the opportunity to redo the assignment.
Are you seriously asserting they should've gotten a good grade for a paper that cites sources that don't exist? In an advanced placement class no less?
If anything they're getting off lighter than students who did a bad job on their own without using AI. I know I was never given a chance to redo a paper I phoned in.
I know my paper was never in the national news.
The students are not expected to understand the perils of AI, they are expected to follow the policies the teacher gave them. And the policies were very clear: they can use AI for inspiration and web search, but they cannot use it to write text to be submitted.
What you are describing might make sense for "ChatGPT class", but that wasn't it, that was AP History.
(And that's how schools work in general: in real life, no one integrates matrices by hand; and yet calculus classes do not teach CAS systems or their perils)
The current trend in education is to blend subjects and methods together and create cohesive interdisciplinary lessons practicing multiple skills at once. "ChatGPT lesson" is the 19th century way.
Citation Needed.
Both in and out of education, "it's" always means "it is" or "it has".
Come on, it's not that complicated. The judge determined that the student had copy/pasted the AI output and submitted that. That's not the same as using AI for research, just as you would use Google Search for research.
Don't think yourself into a hole there bud
What's sad is that the school district spending its limited money to fight frivolous lawsuits like these directly impacts all other students who are just trying to get a good education. Stuff like this won't stop unless the helicopter parents are made to pay the school's legal bills.
This isn’t helicopter parenting, it’s bulldozer parenting. One wonders how they convince themselves that this is good for their child.
[deleted]
Hopefully this sets a precedent that discourages other similar lawsuits or has them dismissed out of hand.
[deleted]
Sadly, the parents could still win. What a precedent that would be. Though I'm not sure there's any avoiding a future with mass societal atrophy of reading and writing skills.
In the midst of the perennial "woe, the kids today" rant, it's worth considering that this generation is the first in the 300,000-year history of Homo sapiens sapiens to widely utilize reading and writing as a primary means of daily communication.
Yes, they'll be able to grasp the beauty of Les Misérables or The History of the Decline and Fall of the Roman Empire so long as it is fed to them as a stream of <280 character tweets, dohohoho.
Mr. Munroe is falling victim to the unfortunate phenomenon where people believe their popularity means that their opinions outside of their areas of expertise are well-informed. Whether they can spell better or not, minds weaned on little chunks of text laced with memes and emoji are going to struggle with chapters, let alone full books.
Given the comic is an echo of conversations like this, I think perhaps that if he is guilty of that then so too are you and I and all others here.
Myself, I say that to equate the modern proclivity for tweets with a degradation of intellectual rigor is as fallacious as imagining that the concise elegance of Tacitus foretold a dulling of Roman wit. Or something like that.
Will they (kids these days) like the style of old classics? Of course not, just as few native English speakers alive today wish to speak (or write) in the style of Shakespeare — breaking a long thing up into tweet-sized chunks is simply style, no more relevant than choice of paragraph or sentence length.
But to dismiss a generation’s capacity for engagement with monumental works (be they Les Mis, The Decline and Fall etc., Shakespeare, Dickens, or any other) on the basis of their chosen tools of communication betrays not only an ignorance of how often communication has changed so far — when Edward Gibbon wrote the Decline and Fall, literacy in the UK was somewhere around the 50% mark, but still the illiterates could watch plays and listen to tales — but also modern attention spans when we also have binge-watching of entire series of shows as a relatable get-to-know-you-better topic on dating apps.
The reading/writing part of the brain is a repurposed area that is otherwise responsible for recognizing faces, facial emotions, etc. Given the already noticeable decrease in in-person skills in the young "texting" generations, we can speculate where it may go.
They might all become hyper-nerds interested in D&D?
Sounds cool.
But then, my brother was already into D&D in the late 80s back when I was learning to read from the Commodore 64 user manual, so of course I'd think that :P
The student was not punished for "using AI", but for plagiarism:
>The incident occurred in December 2023 when RNH was a junior. The school determined that RNH and another student "had cheated on an AP US History project by attempting to pass off, as their own work, material that they had taken from a generative artificial intelligence ('AI') application," Levenson wrote. "Although students were permitted to use AI to brainstorm topics and identify sources, in this instance the students had indiscriminately copied and pasted text from the AI application, including citations to nonexistent books (i.e., AI hallucinations)."
So they were caught due to the citations to non-existent books. It seems fine and unconnected to all of the controversial news stories about using AI detection, which has a high rate of false positives.
The teacher noticed a few irregularities, such as the time the student spent inside the submitted project. The cheating student only had around 55 minutes logged, whereas everyone else had 7-8 hours. That set off alarm bells for the teacher, who then actually looked into the paper more, noticed fake citations, and found it was flagged as "AI" generated when they ran it through a few AI detectors.
And for AP US History... a college level course. Yikes.
Good. Insofar as the point of a school paper is to make the student do the thinking and writing, taking it from ChatGPT is plagiarizing from OpenAI.
Probably unpopular opinion here but families, usually wealthy, that use the legal system like this to avoid consequences are parasites. It reveals not only a poor job of raising their children, but also the poor character of the parents.
Glad the courts didn’t grant a similar “affluenza” ruling here. The student plagiarized, short and simple.
> Probably unpopular opinion here but families, usually wealthy, that use the legal system like this to avoid consequences are parasites
100% agree. Plus it diverts school resources -- in other words, taxpayer money -- to fight the court case, meaning they have less to spend on educating children, which is what taxpayers are funding.
The parents should be on the hook for the school's legal fees.
I think the only unpopular part of that is that you'd think it's an unpopular opinion
It depends on the audience. On HN, it's probably a popular opinion. Among typical American parents these days, it's probably not.
+1.
The article hints at a worry about college applications by the student. So perhaps this was an attempt to clean the student's slate of the cheating record. Still, I can't support the approach taken.
Agreed. Ironically it’s parents such as these who are the most loudly self-proclaimed staunch supporters of meritocracy, except when it comes to their children who are somehow not subject to it at all.
What's striking to me is that the parents sued. RNH passed off AI-generated text as their own when they knew how to cite AI generated works and were versed in academic integrity. It wouldn't occur to me to sue the school if this was my kid.
They're not optimizing for the kid's education. They're optimizing for the credentials the kid is able to get.
Filing the lawsuit is an asymmetric bet:
- win, and increase college admissions odds
- lose, and be no worse off than without the suit
> lose, and be no worse off than without the suit
This kid should change his name, given his initials, high school and parents’ names are public record next to a four brain cell cheating attempt.
Do you think college admissions officers follow the news and use what they learn to maintain a naughty list?
Perhaps a business idea?
Unless he has someone who is very sympathetic to his cause, the teacher/counselor recommendation will wreck him.
This guy needs to go to a JuCo that feeds into a decent state school — he’s screwed for competitive schools.
> Do you think college admissions officers follow the news and use what they learn to maintain a naughty list?
College admissions, no. College students and colleagues and employers, being able to use a search engine, absolutely.
If you search the student's name on Google, you probably won't find this lawsuit.
Admissions know his name and the name of the school, which helps find specific students.
It’s easy to miss, but I wouldn’t be surprised if searching “Hingham High School Harris” brings up the relevant info. Further, his parents suing may be a larger issue for a college than his behavior.
school recommendations
I'm guessing at some point there will be LLMs trawling through news items to put together profiles for people, and as the cost comes down, it won't just be available to three letter agencies and ad platforms, but schools and employers will start to use them like credit scores.
Nope. I just replied above with a similar story when I was in school. My classmate got expelled for cheating and sued the school. tv segment, articles about him, etc.
Zero effect on his college outcomes. Got into really good schools.
Understand, though, that the kid is bearing the implications of the parents' decision.
> win and increase college admissions odds
+ also gain funds for the parents
> lose, and be no worse off than without the suit
If I were in college admissions then I'd probably think twice about admitting the candidate with a widely reported history of trying to sue their school on frivolous grounds when things don't go their way.
> - win, and increase college admissions odds
Wouldn't this decrease them? I wouldn't want to admit a litigious cheating student - whether they won or lost. This was a pure money play by the parents.
Do you think colleges thoroughly check the background of each applicant, such that they would discover this?
> - win, and increase college admissions odds
Will it, though? Like if the college happens to know about this incident?
It does strike me that the purpose in attending college is the credential you get; education is a far second.
It strikes me that this is a foolish take to adopt.
I saw lots of students acting a bit like this but I was grateful that I could dedicate myself primarily to my schooling and took as much advantage as I could to learn as much as I could.
The credential gets used as a heuristic for the learning you do, but if you show up without the knowledge, then everything is harder and your labor less fruitful.
I know some people don't care and that there are degenerate workplaces but you'll still be left with having been a lot less useful in your life than you were capable of being.
So what would you do in the parents' shoes?
Teach my kids good ethics and to take responsibility for their actions, instead of jeopardizing their chances at college for the prospect of money.
> What's striking to me is that the parents sued
And the kid was even offered a redo!
On the other hand, the school caved on National Honor Society after the parents filed. So maybe the best move would have been (tactically, not as a parent) to show the school the draft complaint but never file it.
Almost zero downside. I knew a student who plagiarized 3x so they got kicked out. His parents sued. It was even on the tv news because they were asking for hundreds of thousands in compensation. He lost and the school kept him expelled.
I was expecting the bad press coverage to hurt his college chances since there were several articles online about him getting kicked out for cheating and then suing.
Nope! Dude got into a really good school. He even ended up texting asking me for past essays I wrote to turn in as his own to his college classes.
And the kicker was he then transferred to one of the prestigious military academies that supposedly upholds honor and integrity.
So. There is almost zero downside for suing even if it gets you tons of negative publicity.
I don't think we can claim zero downside from one anecdote. There are always outliers that can occur from extenuating circumstances.
- The family potentially has the financial resources or possibly connections to 'make things happen'.
- Perhaps the student is especially charismatic and was able to somehow right the situation. Some people have that con-artist mindset where they're able to cheat/commit fraud through their life with seemingly minimal consequences.
- Perhaps they just got lucky and the administration didn't do their due diligence.
Anecdata is data if you do not have other data or anecdotes to back your claims up. They present an example, whereas you present speculation.
> Perhaps they just got lucky and the administration didn't do their due diligence.
Are universities supposed to google every applicant?
I mean I haven't been in academia for a decade, but back when I was I certainly never browsed a 17-year-old girl's instagram before making an admission decision.
Not every applicant, but for the ones in the accepted pool it strikes me as odd that there isn't some basic amount of vetting.
Instagram? No (although, wouldn't be surprised)... but doing a gut check with the school admin and looking at public records? Sure.
you present a very interesting specific example of Instagram which is completely unrelated. story time?
If I put some kid's name into Google, shouldn't I expect social media to come up?
> His parents sued. ...
> He even ended up texting asking me for past essays I wrote to turn in as his own ...
> he then transferred to one of the prestigious military academies
...
>> There is almost zero downside for suing even if it gets you tons of negative publicity.
Sounds like the caveat here should be, "when your parents/family is connected".
How on earth do you get caught plagiarizing and still pass with a C+?
I once had a student in a U.S. history class that literally copy-pasted almost the entirety of a paper from a Wikipedia article (incidentally, an article that was only tangentially related to what he was supposed to write about, which only made it more glaringly obvious something was wrong). After confronting him he told me he "had no clue" how the copying could have happened! I gave him a 0 on the paper, which caused him to fail the course, and reported the incident. But the school admins changed his grade so that he would pass. This was at a for-profit college that thankfully no longer exists (I quit after that experience).
This is how it works at most universities it seems.
I think it depends. At least at the major public university I went to grad school at, if an undergrad had pulled that there would have been extremely serious repercussions. Failing the class would have been the minimum. The bigger issue then was that students with money could just buy their papers and take-home work, which was often impossible to catch. This was before LLMs started hurting paper mills' bottom lines, and a lot has changed in the past few years though.
I used to pick up pocket money writing essays for dumb rich kids in college. If I cared, there’s enough of a paper trail to invalidate more than one degree, at least by the written rules. In the real world I doubt that the cheaters would face real consequences, and have concerns that I would lose my own credentials.
It's high school, and a public one at that. Cheating can be rampant in some schools or with some individuals.
What I find ridiculous is the parents are suing over a C+ vs B grade and a detention on the record. Like where do you see your cheating kid going in life that you're going to waste your resources and the district resources on this?
I find it completely unsurprising that parents who would sue to change a C+ into a B raised a kid who would cheat.
We elected a felon and frequent financial cheat as President, so the sky is the limit, I suppose.
Presumably, this one assignment wasn't the entire grade and the C+ was for the entire course.
> he received Saturday detention and a grade of 65 out of 100 on the assignment
The student still received a passing grade for the assignment despite some of the assignment being AI hallucinated text. From my experience, plagiarism is an automatic zero for the entire assignment or course, but there are tons of counterexamples when the teacher/professor doesn't want to deal with the academic integrity process.
- "but there are tons of counterexamples when the teacher/professor doesn't want to deal with the academic integrity process"
That's a good point: in this particular case, the teacher of the course was subpoenaed to federal court and compelled to testify about their grading. Incredible burden, for someone else's problem.
I have had the rare privilege to see up close examples of how at several US universities, when professors are presented with irrefutable proof that a student has cheated (well beyond any reasonable doubt) the professor will most often do nothing. In the best case they will meet with the student and give them a stern talking to.
The whole system is set up to disincentivize any effort to actually hold students accountable for cheating in a significant way (fail assignment, fail course, expulsion, etc.)
When we read about cases of students being held accountable it's generally the exception not the rule.
Last I checked, 65 was a D-, not a C+. So the C+ was for the course.
Grading scales vary. 65% could correspond to any letter grade.
Fail to meaningfully discipline students due to fear of litigious moron parents, get sued by litigious moron parents anyway.
There need to be counter claims to cover these costs instead of it falling on taxpayers.
also disciplinary action and very probable expulsion
At a university that takes their rules seriously, perhaps. Absolutely not expulsion at any k-12 school.
[deleted]
Is it plagiarizing when you copy stuff that isn't even factual?
Merriam-Webster:
> : to steal and pass off (the ideas or words of another) as one's own : use (another's production) without crediting the source
I guess so.
I think you briefly forgot the whole genre of fiction exists
No, it's academic dishonesty: representing work that you did not do as your own work.
ETA: "Academic dishonesty" also covers things like falsifying data and willfully misattributing sources, which is a closer approximation to this case.
There are times I have passed students simply because it isn't worth it to not.
We wonder why our society is going to shit. This is one of those thousand cuts.
AI-hallucinated citations to non-existent publications.
In this case, the AI should publish the cited hallucinated works on Amazon to make them real.
Not that it would help us, but the AI will have its bases covered.
Then they could train the next generation of models on those works. Nothing to scrape or ingest, since they already have the text on hand!
How do you get students to engage in creative writing assignments in age of AI?
How do you get them to dive into a subject and actually learn about it?
I'm thirty something. How did my teachers engage me in doing math? How did they engage me in rote-memorizing the multiplication tables when portable calculators were already a thing, being operated by coin-cells or little solar panels?
Part of teaching is getting kids to learn why and how things are done, even if they can be done better/faster/cheaper with new technology or large scale industrial facilities. It's not easy, but I think it's the most important part of education: getting kids to understand the subjacent abstract ideas behind what they're doing, and learning that there's value in that understanding. Don't really want to dichotomize, but every other way kids will just become non-curious users of magic black boxes (with black boxes being computers, societal systems, buildings, infrastructure, supply chains, etc).
The same way you did so before LLMs existed - you rely on in-class assignments, or take-home assignments that can't be gamed.
Giving out purely take-home writing assignments with no in-class component (in an age where LLMs exist), is akin to giving out math assignments without a requirement to show your work (in an age where calculators exist).
Many years before LLMs were ever a thing, I recall being required to complete (and turn in) a lot of our research and outlining in class. A plain "go home and write about X topic" was not that common, out of fear of plagiarism.
Sure, use AI for research, just like using the Internet for research.
But don't copy/paste AI generated content in the same way that you don't copy/paste a chapter from a book and pass it off as your own.
Invert the assignment: provide a prompt to supply to an essay-writing AI of the student's choice, but make the assignment to critique the veracity and effectiveness of the generated essay.
It would seem that what was put into the report is clearly wrong (in this case from generative AI, but regardless of where it came from, it would still be wrong), so it is still legitimate to mark those parts as wrong. There are other things too which can be called wrong, whether or not the use of this generative AI is permitted (and it probably makes sense to not permit it in the way that it was used in this instance), so there are many reasons why it should be marked wrong.
However, if the punishment is excessively severe, then the punishment would be wrong.
He didn't get detention for hallucinating facts. He got detention for plagiarizing hallucinations without attribution.
The parents seem absolutely unhinged.
Poor kid.
Yet another “affluenza” raised child joining the ranks of society. Probably will become a future C-level exec at an American company.
> the Harris’s lawsuit against the Hingham school committee remains alive
What does this mean if the judge already ruled in the school's favor? Will the parents appeal?
The parents asked for a preliminary injunction to remove the cheating from the kids record. A judge could do this prior to trial if he believes the suit likely to succeed. The judge refused the injunction because he believes the school district was likely acting in good faith and did nothing illegal.
Oh, I see, so the case is still going to court. What a waste of taxpayer money. Elon wants to cut government waste? Make it more difficult to sue by setting a higher bar by which your suit can even be accepted.
Yep, case is still alive. This just indicates the judge doesn’t see it as a slam dunk for the parents/cheater.
While the parents assert the C+ might keep their kid out of Stanford, the more likely impact is that being known for a nationally notorious lawsuit over a minor infraction is what will keep him out of Stanford.
Also, he's not getting into Stanford with the B grade that the parents are suing for anyway. You can't even get into Stanford with all A's these days.
> Also, he's not getting into Stanford with the B grade that the parents are suing for anyway. You can't even get into Stanford with all A's these days.
None of this is true.
Grades are just one part of the picture.
The folks who think a B is what kept them out of an elite school are just engaging in wishful thinking.
The number of people who get into elite schools like Harvard or Stanford with multiple Bs would surprise you.
I think you might get in with multiple Bs and a good story about your interest in the subject you're pursuing (or suitably connected family)
"good story" probably doesn't include being too uninterested to write your own answers despite parents so committed to you going to Stanford they're prepared to litigate to get you a B...
> "good story" probably doesn't include being too uninterested to write your own answers despite parents so committed to you going to Stanford they're prepared to litigate to get you a B...
So true.
This kid is a persona non grata for elite schools at this point.
As I said in the other thread, his best bet is to go to a JuCo that feeds into a decent state school, and just lay low for two years.
He can go to an elite school for a graduate degree if he wants the club membership.
Of course it's possible, but you have to have something truly extraordinary to make up for it (or be a legacy admit, rich parents who donated to the school, etc.). The B will certainly work against you.
> but you have to have something truly extraordinary to make up for it
Flip that, and you’re closer to correct for everyone.
You have to do something truly extraordinary to get in, with the things you listed as being some of the least common types.
Grades just need to be directionally correct rather than perfect.
Also, a side note about legacy admits…
While the admission rate of legacies is about 33% at Harvard (12% at Yale, 30% at Princeton, and 14% at Stanford), that doesn’t mean that being a legacy was the primary reason they got in.
First, 67% of legacies still get denied — that’s quite a bit.
Second, the folks who get into elite schools often know how to raise their kids in a way that increases their chances to get into an elite school. It’s an advantage, but much more often than not, the applicant put in the effort to make themselves a strong applicant.
The legacy “advantage” comes into play almost purely at the margin, when someone is borderline admit/waitlist or waitlist/deny, and the legacy status will push them to the favorable side. You’re not going to see a substantial difference in the folks who were rated comparably.
People seem to want it to be that legacies are freeloading off of their parents and aren’t really qualified admits, and that largely isn’t true. The exceptions are examples like z-list applicants (which you mentioned) or recruited athletes who also happen to be legacies.
I wanna see how many Asian men get in with B's
> I wanna see how many Asian men get in with B's
Please stop perpetuating this myth.
Asians are not held to a different standard.
Anecdotally (with a truckload of anecdotes), Asian-Americans (to be specific) frequently seem to be held to a widely known standard that either they aren't aware of or don't believe in.
Note that this is not exclusive to Asian-Americans — plenty of upper-middle class white people fall into this category as well — but that was the group you mentioned.
I have made an open offer to HN, and it still holds:
If you show me the application of an Asian that you felt was held to a different standard for elite school admissions, then I will give you the reason why they most likely didn’t get in.
[deleted]
that’s not much of an offer. one can easily always find (especially when specifically looking for it to prove a point) whatever it is they are looking for :)
I personally know there is Asian-American bias (not just Asian-American…) in admissions at at least one elite school, via one of my best friends who works in the admissions office.
> I personally know there is Asian-American bias (not just Asian-American…) in admissions at at least one elite school, via one of my best friends who works in the admissions office.
Oh, interesting.
What is the specific bias they claim exists?
Fwiw, they did a full body-cavity search on Harvard admissions, and the best that they could come up with was describing an applicant (accurately) using race-based shorthand — something like “standard Asian reach applicant”, which (iirc) meant something like high grades, high standardized test scores… and almost nothing else. This is a complete nothing burger.
Note that this stereotype exists for a reason. It’s not exclusive to Asians, but it’s much more common with Asian applicants than other races.
Edited to add:
> that’s not much of an offer. one can easily always find (especially when specifically looking for it to prove a point) whatever it is they are looking for :)
Almost every time I’ve done this face-to-face, it wasn’t some subtle oversight — it was a glaring omission or weakness in the application.
The times that it wasn’t obvious, the person got into an elite school, just didn’t get into their elite school of choice, and that’s a different issue.
Curiously, a nationally notorious lawsuit is not enough to keep you out of Stanford [0].
Citing nonexistent sources should lower your grade whether you used ai or not.
After reading this article, it is hard to say who is in the right here. The court could easily be wrong because it can only judge based on the facts at hand and on presumptions it has already settled on.
On one hand, the school referenced academic honesty policy in their defense, but there are no international standards for referencing AI, many AI detection measures have false positives, and they both disallow and allow the same behavior seemingly based upon the teacher's discretion.
If you were a malign individual in a position of authority (i.e. the classic teacher being out to get a troublemaker), you could easily set up circumstances that are unprovable under these guidelines.
There is also a vested interest in academia to not create a hostile work environment, where there is no duty to investigate. They are all in it together. This has been an ongoing problem for academia for decades.
There were also several very prejudicial aspects referenced, such as the revision changes, but some people write their stuff out on paper first, and then copy what's written into a document from there. This is proof of nothing because it's apples to oranges.
Finally, there are grievances made about lack of due process, and other arbitrary matters which are all too common in academia, but academia makes it very difficult to be caught in such matters short of recording every little thing, which may potentially be against the state's laws.
For example, you may be given written instructions for an assignment that are unclear, and ask for clarification, and the teacher may say something contradictory. Should students be held accountable for a teacher lying verbally (if it happened)?
It is sad that it had to come down to court, but that is just how academia operates with free money from the government. They don't operate under a standard business loss function, nor do they get rid of teachers who don't perform once they reach permanent faculty status. The documentary Waiting for Superman really drives this home, and it's only gotten worse since that documentary came out.
There are plenty of people in the comments who are just rabid against the parents, and that's largely caused by poor journalism seeking to rile people up into irrational fanatic fervor. These people didn't look at the details, they just hopped on the bandwagon.
Be rational, and realize that academia has been broken for decades, and what you read isn't necessarily the whole truth of the matter.
The parents had several valid points which went ignored because there is no smoking gun, and that is how corruption works in centralized systems, and indicates a rule by law rather than a rule of law.
One of the hallucinated authors is literally named "Jane Doe". Our society is about to become powerfully stupid.
"Doe" is actually a real surname, with a few thousand of them in the US. I'd guess that there probably have been people actually named "Jane Doe". I wonder if that causes many problems for them?
what sorts of problems do you imagine this causing?
The name is widely used as a placeholder. Here's how Wikipedia describes it [1]:
> John Doe (male) and Jane Doe (female) are multiple-use placeholder names that are used in the British and US-American legal systems, and also generally in the United Kingdom and the United States, when the true name of a person is unknown or is being intentionally concealed.
I'd imagine that could lead to some difficulties when someone really named Jane Doe has to deal with some system that uses that name as a placeholder. Similar to the way people whose surname is Null sometimes run into problems because of poorly written computer systems.
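To make that failure mode concrete, here is a minimal hypothetical sketch (TypeScript, not any real system) of a display helper that round-trips a value through text and then treats the literal string "null" as "no value":

    // Hypothetical helper with the bug described above.
    function displayLastName(lastName: string | null): string {
      // Sloppy normalization loses the difference between the value
      // null and a person who is actually named "Null".
      const normalized = String(lastName).toLowerCase();
      return normalized === "null" ? "(unknown)" : String(lastName);
    }

    console.log(displayLastName(null));   // "(unknown)" -- as intended
    console.log(displayLastName("Null")); // "(unknown)" -- a real surname erased
    console.log(displayLastName("Doe"));  // "Doe"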
DoE is the department of energy. The department of education is ED.
I laughed out loud when I saw that McMahon was his pick. A fucking wrestling star for the department of education. This is Idiocracy.
Also I laughed because otherwise the fear takes over.
In legal cases that is how one can choose to remain anonymous.
See, there's stuff even geniuses don't know.
Why do you think the previous poster found that name notable? Just because it's inherently funny sounding or something?
That's not relevant to this. It's a direct quote from the work the students handed in.
I am not a lawyer, but the student's defense is akin to "Ain't no rule says a dog can't play basketball" from Air Bud. There are clear rules against plagiarism, and the student copied stuff verbatim from an online source without any citations.
This would have been illegal in Italy, as their 1970 worker protection law against automated management would have killed this AI.
[deleted]
AI is the new calc
Using AI in school today is heresy, yet give it a few years and "yesterday's heresy is today's canon".
Back when I was in high school CD-ROMs were brand new and you could buy encyclopedias on disc.
I made dozens of dollars selling book reports and history papers to my fellow honors class peers. Every paper was a virtually unaltered copy & paste job from Microsoft Encarta. Copy, paste into Word, format using some “fancy font”, add my “customer's” name, date and class to the top… print! Boom. Somebody buys me lunch.
I mean how else was I gonna have time to write shitty Visual Basic programs that used every custom control I could download in order to play/pause the CDROM’s music stuff?
A microcosm of society. Helping others cheat for profit.
Nothing modern whatsoever about it. Students at Oxford nearly a thousand years ago sold their talents to other students.
[deleted]
It was high school.
Hence the microcosm
[dead]
I just used ChatGPT to code an HTML/CSS/JavaScript solution in an hour for coworkers who were having trouble. They were like, wow, that was fast, we were trying to figure this out for a few days. I'm skilled / an expert, but that would've taken me many hours vs. a few back-and-forths with GPT.
Overall, I feel my HTML/CSS/JavaScript skills aren't as valuable as they were.
I guess in this instance I cheated too, or is it that my developer peers haven't gotten into using GPT, or that they are more moral? Maybe this is just the new normal...
This has nothing to do with cheating at school.
The rules for working are very very different from being at school.
No, you were not cheating; you did what was expected of you. But you knew that.
How so? And isn't AI changing the rules everywhere? Today it seems not good, yet tomorrow it's just how things are...
The goals are very different. It was like this also before AI.
The goal in school is to learn things. To learn to write you can't just copy an article from a paper and say it is yours. You have not learned.
At work, the goal is to get things done.
In our field you needed, and still need, to learn new things to stay relevant, yet now the new thing does almost all of it for you.
Also, if one generation is using AI to get things done, why wouldn't a younger generation do the same? "Do as I say and not as I do" has never held up well over time.
But you already learned the web stack--school kids haven't. Your mental model is what prepared you to use LLMs well to solve a problem. So if they're going to do as you did, they need to learn the subject first and then learn how to extend their reach with LLMs. Otherwise, they're just cheating in school.
You don't need that knowledge, as I just went to GPT and asked it ...
"I need to create a dropdown for a website can you help me make it?"
And then I asked,
"How do I make what you wrote above work?"
It detailed the things one needs to do: copy/paste each block of code into three separate Notepad files and save each one accordingly (index.html, style.css and script.js), all in one folder. Once that's done, double-click index.html to run the dropdown.
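For scale, the thing being described really is tiny. A minimal hypothetical sketch (TypeScript/DOM, not the actual ChatGPT output) of what the script part might look like, assuming a page with a <div id="app"> to mount into:

    // Build a simple dropdown and react to the user's selection.
    const app = document.getElementById("app");

    if (app) {
      const select = document.createElement("select");

      for (const label of ["Home", "About", "Contact"]) {
        const option = document.createElement("option");
        option.value = label.toLowerCase();
        option.textContent = label;
        select.appendChild(option);
      }

      // Log the choice; a real page would navigate or update content here.
      select.addEventListener("change", () => {
        console.log(`Selected: ${select.value}`);
      });

      app.appendChild(select);
    }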
And your colleagues really spent a few days trying to figure this out?
The kids are going to be in a different world than we are. Just like it was useful for us to learn a foreign language (still being taught in schools, but those days are numbered), for kids these days it is a waste of time (I am sure there are many studies that say being bi/tri/… lingual has benefits beyond communication, but you get my point).
I think while we may think “they need to learn the subject first…”, do they really? And if they do, why? E.g. someone teaching their kid “web development” in soon-to-be 2025 is insane given the tools we have now… so while there are for sure things kids should learn, it is not easy to figure out what those things actually are.
Yeesh this is full of red flags…
What is? The new normal of using AI to do your job, or to help you get it done quicker? The comment above shows it could be the new normal...
No. This attitude of being better than coworkers, coming in and saving the day. It had nothing to do with using AI. It’s about “I am better than you” instead of helping people out, or teaching them these things you know.
It’s just a passing internet comment missing all the context, so what do I know.
My comments are meant to be controversial… To get people to think… What is the future with AI and using it as such… If I told my coworkers how I achieved it, would they not think less of me today… What about in a few years or more, when it's the norm and mine and everyone's HTML, CSS, and JavaScript skills are less valuable… This example shows that AI will definitely take people's jobs, including my own if I do not ramp up my skills
You ramping up your skills will do nothing for you if a machine can otherwise be delegated your job, due to the overhead of a human worker vs. just owning a machine's output. Not having to negotiate is extremely valuable to a business owner. Mark my words. Until people realize that the whole innovation around AI is to sidestep the labor class, things'll continue getting much darker before they brighten.
And the saddest thing is, the fools think it'll work in their favor, and won't blowback with massive unintended consequences.
Back in the 70's there was a sex-ed book called "Show Me" which featured very frank photographs of both adults and children, completely naked.
It was subject to litigation at the time it was released and was upheld as not being obscene. But it continued to cause moral panics and was eventually taken out of print by the publisher.
But it was never banned and can still be found as a collector's item. And in the mid 70's it was completely normal to walk into a book store and see it on display.
<https://en.wikipedia.org/wiki/Show_Me!>
Highly successful people like Elon Musk and Donald Trump succeed in part because they push the rules really hard, and they win enough in the cases when it matters. So it's a proven strategy, though it doesn't always work.
Criminals push the rules really hard too. It's just that they're only labeled criminals by society if they're caught, prosecuted, and punished.
People who "push the rules" (I call this cheating, if I was playing a tabletop game with someone who did this, I would say they were cheating too) have an unfair advantage over the people who do follow the rules.
It's like sociopaths: smart sociopaths become successful businesspeople and politicians, while stupid sociopaths end up in prison.
People who are good at pushing the rules and have some kind of knack for knowing which rules to push and how far, end up succeeding over the rule-followers, while the people who lack this talent and push the wrong rules, or push too far, end up failing somehow (and maybe in prison).
It's no good cherry-picking success stories to evaluate a strategy. You have to know how often it fails too.
Anyway, "pushing the rules" is so vague, and it could be done intelligently, ethically, or otherwise.
In this particular case the student copy-pasted AI output in an assignment and tried to pass it off as his own work. I mean, come on... the fact this went to court is just absurd
The facts do not seem clear at all if you examine them carefully, and the article does a very poor job actually covering the details in a non-prejudicial way.
What facts? The facts are context dependent without identity. Revision time is meaningless. People still write rough drafts up on real paper, then copy the document in and put the final touches on it. How do you differentiate two processes, where part of the process is not technology driven, and not visible?
Is it right to say that revision time tracked by software is the whole time spent and correct in supporting AI use?
If it is a large project, it is often allowed to use previous material done in past assignments as components given the strict time limits. If the teacher verbally OK'ed this, that is explicit approval but how do you prove it after the fact? The court can only judge based on facts that are in evidence. How do you prove otherwise when there is no clear pattern of abuse? Academia has had decades to fine tune their legal teams for this and free money provided by you in the form of taxes.
The academic policy cited is indirectly self-referential. It both allows and disallows activity based on the teacher's discretion of what constitutes explicit permission. There is also no internationally recognized standard for referencing AI-derivative work (arbitrary).
This is also not uniquely defined, nor does it prescribe a valid process in adversarial environments, violating due process.
Academia has a long history of engaging in coercive behavior and violating due process. Many people working in academia view any investigation into an issue as creating a hostile work environment among co-workers. There is no duty to investigate, and by investigating, the person involved (the chairperson or whatever) is creating a hostile work environment with co-workers. It's something they don't do unless forced, and when forced there is naturally a conflict of interest which violates due process. They are incentivized to seek to prove a false narrative in which the investigation is not a hostile work environment (i.e. it was the student).
Vexatious process, lack of controls or agency to correct, and unclear instruction are structural elements that lead to "struggle sessions".
Struggle sessions are a circular Maoist thought reform practice based in torture (Maoism is a derivative of Marxism/Communism). It is all about psychologically torturing the subject without due process, accountability or defense, and forcing them to engage in the arbitrary circular process that breaks them.
The nature of corrupt systems is often that you have no choice but to cheat, because the guidelines are designed to disadvantage certain people, are arbitrary, self-referential, and circular, and act as a sieve/filter preventing future options. For example, about 20 years ago, engineering fields required that you go through a course of physics (3 classes) as a roadblock. They had set up the notorious 3-question test, where individual students who were deemed acceptable by the professor were told prior to the test how to answer correctly. Everyone else failed, and it went like this: the answer to the first question was needed to solve the second, and the second for the third. Each question involved a specific set of significant digits, and the instructions said to follow the practices for significant digits. The trick, which was arbitrary, was in how you round between the problems. Causality factors into rounding error, and grading was instead based on an arbitrary process for rounding. In other words, only those who were told beforehand passed. If they followed the textbook process to reduce rounding error, they failed. There have been a number of similarly indirect schemes promoted by teachers' union representatives at various conferences over the years.
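Whatever one makes of the anecdote, the arithmetic point underneath it is real: when answers are chained, the rounding convention applied between steps changes the final digits. A minimal hypothetical sketch (TypeScript, made-up numbers, not the actual exam):

    // Round x to `sig` significant figures.
    function roundSig(x: number, sig: number): number {
      if (x === 0) return 0;
      const scale = Math.pow(10, sig - Math.ceil(Math.log10(Math.abs(x))));
      return Math.round(x * scale) / scale;
    }

    // Carrying full precision through three chained steps.
    const a = 1 / 3;      // step 1
    const b = a * 3;      // step 2 uses step 1 -> 1
    const c = b * 100;    // step 3 uses step 2 -> 100

    // Same chain, rounding each intermediate to 3 significant figures.
    const aR = roundSig(a, 3);          // 0.333
    const bR = roundSig(aR * 3, 3);     // 0.999
    const cR = roundSig(bR * 100, 3);   // 99.9

    console.log(c, cR); // 100 vs 99.9 -- same work, different final answer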
These things go to the core basis of the Prussian model of education, which was focused only on creating loyal, unthinking soldiers. The model we have today focuses on loyal, unthinking workers. You get these types by breaking them down psychologically, in effect destroying the individual going through the process.
In my opinion, you are far too trusting of authority.
I'd highly suggest you read Joost Meerloo and Robert Lifton to recognize these torture structures. They are everywhere and they have great impact, and damage the subject, but are subtle and hard to prove without having been exposed to it.
That’s a lot of words that say nothing to defend high school plagiarism.
If you turn in work that isn’t yours and represent it as yours, it is plagiarism, whether accidental or intentional.
You clearly didn't understand the meaning of the words, and rather seem to have embraced circular reasoning outside rational foundations. This is deluded, and leads to delusion.
Delusion is inherently slothful and steeped in darkness and destructive outcomes, is that really something you want to be passing on to your children?
The indirection making this circular is based on the definition of work, ownership, and the concept of the originality of ideas. The identity of such is based in the question, "according to whom and by what measure?". Without knowing these, no equal measure can be made one way or the other, and hence it is arbitrary. Systems that end up determining important parts of your future shouldn't be arbitrary.
The definition you provide changes for each person you ask, and when it comes down to an authority, without defining such things, enforcement becomes arbitrary.
Everything would be plagiarism under that definition, and be arbitrary since not everyone is punished. All information processed originally comes from outside yourself, and not even adult professionals can get this right.
Is it really fair or reasonable to deprive children of their future opportunities over this? Is it fair or reasonable to destroy children's futures using this type of claim?
I'll pray you get what you need to become a good role-model for your children. As it stands, I don't think you actually have children.
Delusion that persists is a strong indicator of schizophrenia, I hope you aren't a schizophrenic parent. You can read more about how that impacts the children here: https://fherehab.com/learning/parent-schizophrenia
As long as we're casting diagnoses, do you suffer from bouts of mania or are you just on black market Adderall with nothing productive to do? You still need to sleep you know, I know with Adderall you can still feel good 24 hours in, but your reasoning abilities degrade.
Anyway, yes, children should be disciplined in order to prevent them from plagiarizing a.k.a. putting their name on work that's not theirs, otherwise they grow up to be liars and con artists.
You can't call people delusional, or insinuate they might be schizophrenic.
https://news.ycombinator.com/newsguidelines.html
> Not letting kids plagiarize is authoritarian, also if you disagree with me you have schizophrenia and are unfit to parent.
Bravo, my friend, this is a masterclass in HN-style trolling. There is absolutely no way a thinking human typed this completely seriously. Green account with a post that perfectly ticks every required box for an excellent HN parody? Come on, now.
What's the relevance? Are you going to embark on a "Let's Educate The Users" mission for parenting?
It would be futile. Parents and children are now united in not wanting to be educated.
This has got to be specific to the US, yes? None of my overseas colleagues have this attitude toward education, and my wife (now an American, but an immigrant) certainly doesn't.
A lot in France too, and it’s not just minorities who are uncivilized anymore.
SAD!
What is unauthorized use of technology? Is the light I need to read not technology? Is using the internet to find more about a topic not technology? Where is the line that makes AI forbidden?
The lack of implicit or explicit authorization. As the school has lights, you may assume they are authorized implicitly.
This is unproductive and unenlightening pedantry.
I think the throwaway actually raises the valid point about the rule being an exceedingly broad catchall. The type primed for selective and weaponized enforcement.
That said, the kids are clearly defenseless in this situation, for blatant plagiarism as well as for just being factually incorrect in their report.
> The type primed for selective and weaponized enforcement
Theoretically true, but irrelevant because this particular case isn't that.
Yes, it is broad, and probably a bad rule. That said, there is more than enough beyond that simple rule in this case that points toward intentional academic dishonesty. If he were my son, getting off on a technicality wouldn't be exoneration in my house.
This is like saying that they have an AI policy, but that using an LLM isn’t in violation since it is just calculus applied to words and not true intelligence.
Courts aren’t stupid for the most part. Most of them are happy to interpret things in terms of what a ‘reasonable person’ would think, for better or worse. Right now, most reasonable people would say that using an LLM to do your work without disclosing it is in violation of the spirit of the student handbook.
I could fine tune an AI, and name it after myself, and put my own name at the top of the paper as an attribution, but no reasonable person would say that was following the spirit of the law if I turned that in as a class assignment.
The part where students were given explicit guidance on the use of AI resources and told how to cite it appropriately. Besides, even aside from the use of technology it’s still a blatant case of plagiarism as he passed off work he did not write as his own.
How could you even cite AI “appropriately”? That makes about as much sense as citing Google..
Like e.g. this:
“Clarity isn’t found in answers; it’s carved from the questions we dare to ask.” - ChatGPT, https://chatgpt.com/share/67439692-b098-8011-b1df-84d3761bba...
I know that being obtuse is sometimes fun..
But no paper (even high school level) which does that should ever be accepted..
No paper?
If you're clear what your sources are, why does it matter who (or what) you quote?
Boris Johnson and GWB are known (for different reasons) for spouting gibberish sentences, yet cite them and, if the quote was appropriate, then no foul; all fiction is made up and that too can be quoted when appropriate.
When I was at school in the UK 24 years back, media studies was denigrated as "mickey mouse studies", but in retrospect the ability to analyse and deconstruct media narratives would have been useful for the country.
Now AI is a new media.
Sometimes it will be the right thing to quote; sometimes it will be as much of an error as citing Wuthering Heights in an essay about how GWB handled the War On Terror.
Have you actually read the piece? The answers to those are in the written policy the student was given. But even without the policy, it should be pretty clear that passing off others' work as your own (be they people or AI) is academic dishonesty.
As the judge said, "the emergence of generative AI may present some nuanced challenges for educators, the issue here is not particularly nuanced"
Is what I wrote here mine or not? I used the autocorrect suggestions almost exclusively, wrote few letters only.
Then, no. This isn’t text you generated. No one cares on Internet forums though.
Who came up with the words? If autocorrect is acting as a typist, transferring your words to screen, you are the author.
What if I first asked ChatGPT what should I say? And what's the difference from just copy pasting it?
The question is who comes up with the words. If you re-type a textbook, you are plagiarizing. The same happens if you re-type ChatGPT output.
On the other hand, if you read some text first (be it ChatGPT's output, or a textbook) and then rephrase it yourself, then you are the author.
How much do you have to rephrase? Is changing every other word to a synonym enough? That's actually a gray area, and it depends on the teacher. Most teachers would expect you to at least change the sentence structure. But in this case it's completely irrelevant, as we know the students did copy/paste.
I really don't see why you are trying to present ChatGPT as something special re plagiarism. Copying others' work is copying. Paying $10 to someone to do your homework and then copying their answer as-is is cheating. So is using ChatGPT to do it for free.
ChatGPT is not someone. It's a tool.
So is a textbook. Still not OK to copy homework from it.
A textbook has an author that you can copy. You can't copy the output of an auto-suggest; it's just yours.
It does not matter if author is human or computer.
If there is a spelling bee, but student is secretly using spellcheck on the phone, they are cheating.
If there is a math speed competition, but student is using a calculator on the phone, they are cheating.
If it's a calculus exam, but student is using Wolfram Alpha (or TI-89) to calculate integrals and derivatives, it is cheating.
If it's a written exam but student is using ChatGPT to write the text, it is cheating as well. Not that different from previous cases.
There is no difference. They’re not your words.
These are games no one in the real world is interested in playing.
How come they're not my words? I thought of sending them, not the keyboard. Same with ChatGPT, it doesn't do anything on its own - even if it could, it's a tool, not a person.
If he had turned in the prompt he wrote, then this would be his words.
If your position is correct, then I suppose asking another person to write an essay for you is in fact your writing as well. Which is absurd.
This pedantry is useless. Ask any person if an essay produced by AI was written by the person writing the prompt and I think a majority will say “no”. If AIs writing essays for you isn’t plagiarism, then nothing is.
How could it be the same if another person wrote it?
Situation A - a person uses a tool. They are the author.
Situation B - a person contracts another person. They are not the author. The other person might be using a tool, but it doesn't matter.
This is not a theoretical debate. This is how the current legal framework works, and if you expect someone to behave differently, you have to be explicit. Change the law if you want the default to be different. This is how it works now.
> This pedantry is useless. Ask any person if an essay produced by AI was written by the person writing the prompt and I think a majority will say “no”.
I'm an expert in the field. I don't need to ask mainstream people about their sci-fi influenced opinions; they don't matter to me nor to the law. This is how it works: it's not magic, it's not sci-fi, it's not a person, it can't be an author, and thus it can't be plagiarized by the user, nor can the user violate the tool's copyright by using the output.
It's a tool that authors can use to create their works, and even if all they did is push a single button, they are the author. Compare this to me applying a filter on a blank canvas to produce an abstract art wallpaper in Photoshop - am I the author or is Photoshop? Let me tell you, don't try to steal my work. I did it, not Photoshop, that was just a tool that made it easier for me.
Same with ChatGPT - this morning I used it to write an architectural proposal. It's my proposal, there is no way I somehow "plagiarized" something. Doesn't matter that I pushed like 50 buttons in total; it's my work, I am the author.
--
And if schools don't actively teach children to use this technology, they should be reformed. If this school was in my city, I'd vote for the party that will cancel it - for the extremely serious offense of letting the children think it's wrong to use tools. At least they should explain that the goal is elsewhere and why a particular tool shouldn't be used for that task. It's not like children will just know, you know.
This is just like when teachers at my own school told us that Wikipedia is a bad source and we can't use it because it can't be trusted. That's total bullshit, just be honest - this assignment is supposed to test your writing skills, not tool usage skills.
> This is how the current legal framework works, and if you expect someone to behave differently, you have to be explicit.
The classroom teacher was explicit in their expectations. The student did not follow the instructions. Did you RTFA?
> It's a tool that authors can use to create their works
This isn't about authorship or ownership. It doesn't matter whether the words are "yours" or not - that you keep making that a point of contention is a semantic sideshow. This isn't a commercial or artistic enterprise. If the student wants to publish their copy-pasted, error-ridden slop (or masterful slop!), then by all means, they can go for it.
Rather, this is a classroom; the goal is to learn, think, and demonstrate understanding. Copy-pasting responses from AI does not do this (even setting aside the explicit instructions not to do so). Similarly, plagiarism isn't bad in high school because it hurts other authors; it's bad because it's a lazy end-run around the process of learning and thinking that undermines those goals and creates bad habits of avoiding critical thinking.
> a person uses a tool
If you simplify and trivialize everything to an extreme degree, then yes, you might have a point…
> Wikipedia is a bad source and we can't use it because it can't be trusted
No, but you still shouldn’t use it for anything besides research (i.e. you should be citing the sources the Wikipedia article is based on directly).
> just be honest - this assignment is supposed to test your writing skills,
Isn’t that more than perfectly obvious already?
It's important to understand that all ChatGPT knows is what's in its training set and what was in the prompt. I see it all the time in programming: if you are doing boring, run-of-the-mill websites or plugging other people's libraries together, then AIs like Copilot have a very high success rate. But try something new or unusual (even a bit unusual, like QNX), and the lack of training data means the amount of hallucinations becomes incredibly high, and the resulting code is often outright wrong.
A lot of times that's fine - after all, there are programmers who spend their careers never writing an original algorithm, and they could definitely use the help. And in a lot of cases an original architecture is actually a bad idea - stick to the boring technology, create your architecture proposal based on re-blending previous projects with a tiny bit of changes. Nothing wrong with that.
But school tries to be better than that (it does not always succeed, but it at least tries). That's why students are taught how to multiply numbers by hand, and only then are they allowed calculators. And they must be taught how to find their primary sources before being allowed to use Wikipedia. And they must be taught how to write their thoughts in their own words before being allowed to use ChatGPT.
Sure, some of them will go on and never multiply numbers by hand, nor ever read a primary source nor ever create a highly original proposal. That's fine and even expected. Still, the school aims high even if not everyone can get there.
Somewhere in there is the problem. ChatGPT & Co should be viewed as tools - they can do research, they can discuss and echo your ideas, they can write FOR YOU. (Or at least that's the idea, when they don't go nuts). And if they are your tool, you are doing the work - you are the author.
If ChatGPT copies the wikipedia article, you are now in trouble. Your tool caused you to plagiarize and not even notice but that won't save you. Same if you include complete nonsense generated by your tool. Still your fault (although some lawyers have been getting away with it.)
If you copy the wikipedia article yourself, at least you know you cheated.
But equating the tool to the plagiarizing is absurd. What is possible is that perhaps the problem is now outperformed / overpowered by the tool. It may now be trivial to answer the problem. A few clicks?
But again it's long tradition for schools to outlaw some tools in the effort to teach something specific that the tool would replace: calculator out of bounds, or only basic calculator and no computer, no books, sliderule only, etc.
Yes, exactly right. If your tool caused you to plagiarize, it's your problem - you should've checked the output. You can't point to ChatGPT, it'd be like pointing to Word. You are the author and thus you are responsible.
Schools need to adapt. Explain why a tool is forbidden - it will improve efficiency of learning if the kids know what is the purpose. Use different more fun method to teach mind skills, and teach overcoming problems using various always evolving and changing hi-tech tools - instead of the rigid lesson/homework style of today. Teach them that change is constant and that they need to adapt. This is not something theoretical - there are state sponsored schools operating in this way where I live. It's great.
In education, the goal is internalizing to the individual the knowledge required for them to bootstrap into a useful, contributing member of society. Things like composition of written works, organizing one's thoughts into communicable artifacts, doing basic mathematics, familiarity with the local and encompassing polity and it's history, how to navigate and utilize institutions of research (libraries) etc... Any technology employed that prevents or sidesteps that internalization is unauthorized.
It ain't that hard to connect the dots unless you're going out of your way to not connect the dots.
If that is the goal of a specific task, be explicit about that. From my personal experience, it's just what old teachers say because they can't deal with the new reality. The teachers who incorporated AI and other tech into their lessons are much more successful and better rated - by parents as well as children, and the final exams too.
They were explicit about that! There was an AI policy that the student knew about and blatantly ignored. I am not sure how much more you wanted.
Here is a relevant quote from the TFA:
> Although students were permitted to use AI to brainstorm topics and identify sources, in this instance the students had indiscriminately copied and pasted text from the AI application, including citations to nonexistent books (i.e., AI hallucinations).
(And re better rating: sadly, making classes less useful usually improves the rating. I'm sure if there were a class where they were just watching cartoons, with super-simple quizzes at the end that everyone could answer, it'd have the highest ratings from most students, as well as high ratings from many parents, and the best scores on finals. Now, it might not work that hot in the real world, but by that time the school would be long over...)
The problem is that the school simply didn't teach them enough about the tool, and they're turning something that should be just another lesson into disciplinary action.
Why do you expect children to learn math only after months of attempts, but to understand the perils of AI after hearing a one-sentence regulation? That's not going to help the kids. You need to spend time practicing with the tool alongside them, showing them the pitfalls in practice, and only after enough time can you start rating them.
The school handed them a gun and now they're surprised-Pikachu that a kid got shot, blaming it on the kid who was holding it, but the school is to blame. And it's certainly not newsworthy; where is the article about the math exam results?
> they're turning something that should be just another lesson into disciplinary action.
Remember the "disciplinary action" here is giving the student a bad grade, with the opportunity to redo the assignment.
Are you seriously asserting they should've gotten a good grade for a paper that cites sources that don't exist? In an advanced placement class no less?
If anything they're getting off lighter than students who did a bad job on their own without using AI. I know I was never given a chance to redo a paper I phoned in.
I know my paper was never in the national news.
The students are not expected to understand the perils of AI, they are expected to follow the policies the teacher gave them. And the policies were very clear: they can use AI for inspiration and web search, but they cannot use it to write text to be submitted.
What you are describing might make sense for "ChatGPT class", but that wasn't it, that was AP History.
(And that's how schools work in general: in real life, no one integrates matrices by hand; and yet calculus classes do not teach CAS systems or their perils)
The current trend in education is to blend subjects and methods together and create cohesive interdisciplinary lessons practicing multiple skills at once. "ChatGPT lesson" is the 19th century way.
Citation Needed.
Both in and out of education, "it's" always means "it is" or "it has".
Come on, it's not that complicated. The judge determined that the student has copy/pasted the AI output and submitted that. That's not the same as using AI for research just like you use Google Search for research.
Don't think yourself into a hole there bud
What's sad is that the school district spending its limited money to fight frivolous lawsuits like these directly impacts all other students who are just trying to get a good education. Stuff like this won't stop unless the helicopter parents are made to pay the school's legal bills.
This isn’t helicopter parenting, it’s bulldozer parenting. One wonders how they convince themselves that this is good for their child.
Hopefully this sets a precedent that discourages other similar lawsuits or has them dismissed out of hand.
Sadly, the parents could still win. What a precedent that would be. Though I'm not sure there's any avoiding a future with mass societal atrophy of reading and writing skills.
In the midst of the perennial "woe the kids today" rant, it's worth considering that this generation is the first generation in the 300,000 year history of homo sapiens sapiens to widely utilize reading and writing as a primary means of daily communication.
Ob XKCD: https://xkcd.com/1414/
Yes, they'll be able to grasp the beauty of Les Misérables or The History of the Decline and Fall of the Roman Empire so long as it is fed to them as a stream of <280 character tweets, dohohoho.
Mr. Munroe is falling victim to the unfortunate phenomenon where people believe their popularity means that their opinions outside of their areas of expertise are well-informed. Whether they can spell better or not, minds weaned on little chunks of text laced with memes and emoji are going to struggle with chapters, let alone full books.
Given the comic is an echo of conversations like this, I think perhaps that if he is guilty of that then so too are you and I and all others here.
Myself, I say that to equate the modern proclivity for tweets with a degradation of intellectual rigor is as fallacious as imagining that the concise elegance of Tacitus foretold a dulling of Roman wit. Or something like that.
Will they (kids these days) like the style of old classics? Of course not, just as few native english speakers alive today wish to speak (or write) in the style of Shakespeare — breaking a long thing up into tweet-sized chunks, that is simply style, no more relevant than choice of paragraph or sentence length.
But to dismiss a generation’s capacity for engagement with monumental works (be they Les Mis, The Decline and Fall etc., Shakespeare, Dickens, or any other) on the basis of their chosen tools of communication betrays not only an ignorance of how often communication has changed so far — when Edward Gibbon wrote the Decline and Fall, literacy in the UK was somewhere around the 50% mark, but still the illiterates could watch plays and listen to tales — but also an ignorance of modern attention spans, when we also have binge-watching of entire series of shows as a relatable get-to-know-you-better topic on dating apps.
The reading/writing part of the brain is a repurposed region otherwise responsible for recognizing faces, facial emotions, etc. Given the already noticeable decrease in in-person skills among the young "texting" generations, we can speculate where it may go.
They might all become hyper-nerds interested in D&D?
Sounds cool.
But then, my brother was already into D&D in the late 80s back when I was learning to read from the Commodore 64 user manual, so of course I'd think that :P
The student was not punished for "using AI", but for plagiarism:
>The incident occurred in December 2023 when RNH was a junior. The school determined that RNH and another student "had cheated on an AP US History project by attempting to pass off, as their own work, material that they had taken from a generative artificial intelligence ('AI') application," Levenson wrote. "Although students were permitted to use AI to brainstorm topics and identify sources, in this instance the students had indiscriminately copied and pasted text from the AI application, including citations to nonexistent books (i.e., AI hallucinations)."
So they were caught due to the citations to nonexistent books. It seems fine and unconnected to all of the controversial news stories about using AI detection, which has a high rate of false positives.
The teacher noticed a few irregularities, such as the time the student spent inside the submitted project. The cheating student only had around 55 minutes logged, whereas everyone else had 7-8 hours. That set off alarm bells for the teacher, who then actually looked into the paper more and noticed fake citations, and it was flagged as AI-generated when they ran it through a few AI detectors.
And for AP US History... a college level course. Yikes.
Good. Insofar as the point of a school paper is to make the student do the thinking and writing, taking it from ChatGPT is plagiarizing from OpenAI.
Probably unpopular opinion here, but families, usually wealthy, that use the legal system like this to avoid consequences are parasites. It reveals not only a poor job of raising your children, but also the poor character of the parents.
Glad the courts didn’t grant a similar “affluenza” ruling here. The student plagiarized, short and simple.
> Probably unpopular opinion here but families, usually wealthy, that use the legal system like this to avoid consequences are parasites
100% agree. Plus it diverts school resources -- in other words, taxpayer money -- to fight the court case, meaning they have less to spend on educating children, which is what taxpayers are funding.
The parents should be on the hook for the school's legal fees.
I think the only unpopular part of that is that you'd think it's an unpopular opinion
It depends on the audience. On HN, it's probably a popular opinion. Among typical American parents these days, it's probably not.
+1.
The article hints at a worry about college applications by the student. So perhaps this was an attempt to clean the student's slate of the cheating record. Still, I can't support the approach taken.
Agreed. Ironically it’s parents such as these who are the most loudly self-proclaimed staunch supporters of meritocracy, except when it comes to their children who are somehow not subject to it at all.
What's striking to me is that the parents sued. RNH passed off AI-generated text as their own when they knew how to cite AI generated works and were versed in academic integrity. It wouldn't occur to me to sue the school if this was my kid.
They're not optimizing for the kid's education. They're optimizing for the credentials the kid is able to get.
Filing the lawsuit is an asymmetric bet:
- win, and increase college admissions odds
- lose, and be no worse off than without the suit
> lose, and be no worse off than without the suit
This kid should change his name, given his initials, high school and parents’ names are public record next to a four brain cell cheating attempt.
Do you think college admissions officers follow the news and use what they learn to maintain a naughty list?
Perhaps a business idea?
Unless he has someone who is very sympathetic to his cause, the teacher/counselor recommendation will wreck him.
This guy needs to go to a JuCo that feeds into a decent state school — he’s screwed for competitive schools.
> Do you think college admissions officers follow the news and use what they learn to maintain a naughty list?
College admissions, no. College students and colleagues and employers, being able to use a search engine, absolutely.
If you search the student's name on Google, you probably won't find this lawsuit.
Admissions know his name and the name of the school, which helps find specific students.
It’s easy to miss, but I wouldn’t be surprised if it comes up as “Hingham High School Harris” brings up the relevant info. Further, his parents suing may be a larger issue for a college than his behavior.
school recommendations
I'm guessing at some point there will be LLMs trawling through news items to put together profiles for people, and as the cost comes down, it won't just be available to three letter agencies and ad platforms, but schools and employers will start to use them like credit scores.
Nope. I just replied above with a similar story from when I was in school. My classmate got expelled for cheating and sued the school. TV segment, articles about him, etc.
Zero effect on his college outcomes. Got into really good schools.
Understand, though, that the kid is bearing the implications of the parents' decision.
> win and increase college admissions odds + also gain funds for the parents
> lose, and be no worse off than without the suit
If I were in college admissions then I'd probably think twice about admitting the candidate with a widely reported history of trying to sue their school on frivolous grounds when things don't go their way.
> - win, and increase college admissions odds
Wouldn't this decrease it? I wouldn't want to admit a litigious cheating student, whether they won or lost. This was a pure money play by the parents.
Do you think colleges thoroughly check the background of each applicant, such that they would discover this?
> - win, and increase college admissions odds
Will it, though? Like if the college happens to know about this incident?
It does strike me that the purpose in attending college is the credential you get; education is a far second.
It strikes me that this is a foolish take to adopt.
I saw lots of students acting a bit like this but I was grateful that I could dedicate myself primarily to my schooling and took as much advantage as I could to learn as much as I could.
The credential gets used as a heuristic for the learning you do, but if you show up and don't have the knowledge, then everything is harder and your labor more fruitless.
I know some people don't care and that there are degenerate workplaces but you'll still be left with having been a lot less useful in your life than you were capable of being.
So what would you do in the parents' shoes?
Teach my kids some good ethics and to take responsibility for their actions, instead of jeopardizing their chances at college for the prospect of money.
> What's striking to me is that the parents sued
And the kid was even offered a redo!
On the other hand, the school caved on National Honor Society after the parents filed. So maybe the best move would have been (tactically, not as a parent) to show the school the draft complaint but never file it.
Almost zero downside. I knew a student who plagiarized 3x so they got kicked out. His parents sued. It was even on the tv news because they were asking for hundreds of thousands in compensation. He lost and the school kept him expelled.
I was expecting the bad press coverage to hurt his college chances since there were several articles online about him getting kicked out for cheating and then suing.
Nope! Dude got into a really good school. He even ended up texting asking me for past essays I wrote to turn in as his own to his college classes.
And the kicker was he then transferred to one of the prestigious military academies that supposedly upholds honor and integrity.
So. There is almost zero downside for suing even if it gets you tons of negative publicity.
I don't think we can claim zero downside from one anecdote. There are always outliers that can occur from extenuating circumstances.
- The family potentially has the financial resources or possibly connections to 'make things happen'.
- Perhaps the student is especially charismatic and was able to somehow right the situation. Some people have that con-artist mindset where they're able to cheat/commit fraud through their life with seemingly minimal consequences.
- Perhaps they just got lucky and the administration didn't do their due diligence.
Anecdata is data if you do not have other data or anecdotes to back your claims up. They present an example, whereas you present speculation.
> Perhaps they just got lucky and the administration didn't do their due diligence.
Are universities supposed to google every applicant?
I mean I haven't been in academia for a decade, but back when I was I certainly never browsed a 17-year-old girl's instagram before making an admission decision.
Not every applicant, but for the ones in the accepted pool it strikes me as odd that there isn't some basic amount of vetting.
Instagram? No (although, wouldn't be surprised)... but doing a gut check with the school admin and looking at public records? Sure.
you present a very interesting specific example of Instagram which is completely unrelated. story time?
If I put some kid's name into Google, shouldn't I expect social media to come up?
> His parents sued. ...
> He even ended up texting asking me for past essays I wrote to turn in as his own ...
> he then transferred to one of the prestigious military academies ...
>> There is almost zero downside for suing even if it gets you tons of negative publicity.
Sounds like the caveat here should be, "when your parents/family is connected".
Here are some of the case documents:
https://www.courtlistener.com/docket/69190839/harris-v-adams...
How on earth do you get caught plagiarizing and still pass with a C+?
I once had a student in a U.S. history class that literally copy-pasted almost the entirety of a paper from a Wikipedia article (incidentally, an article that was only tangentially related to what he was supposed to write about, which only made it more glaringly obvious something was wrong). After confronting him he told me he "had no clue" how the copying could have happened! I gave him a 0 on the paper, which caused him to fail the course, and reported the incident. But the school admins changed his grade so that he would pass. This was at a for-profit college that thankfully no longer exists (I quit after that experience).
This is how it works at most universities it seems.
I think it depends. At least at the major public university I went to grad school at, if an undergrad had pulled that there would have been extremely serious repercussions. Failing the class would have been the minimum. The bigger issue then was that students with money could just buy their papers and take-home work, which was often impossible to catch. This was before LLMs started hurting paper mills' bottom lines, and a lot has changed in the past few years though.
I used to pick up pocket money writing essays for dumb rich kids in college. If I cared, there’s enough of a paper trail to invalidate more than one degree, at least by the written rules. In the real world I doubt that the cheaters would face real consequences, and have concerns that I would lose my own credentials.
It's high school, and a public one at that. Cheating can be rampant in some schools or with some individuals.
What I find ridiculous is the parents are suing over a C+ vs B grade and a detention on the record. Like where do you see your cheating kid going in life that you're going to waste your resources and the district resources on this?
I find it completely unsurprising that parents who would sue to change a C+ into a B raised a kid who would cheat.
We elected a felon and frequent financial cheat as President, so the sky is the limit, I suppose.
Presumably, this one assignment wasn't the entire grade and the C+ was for the entire course.
> he received Saturday detention and a grade of 65 out of 100 on the assignment
The student still received a passing grade for the assignment despite some of the assignment being AI hallucinated text. From my experience, plagiarism is an automatic zero for the entire assignment or course, but there are tons of counterexamples when the teacher/professor doesn't want to deal with the academic integrity process.
- "but there are tons of counterexamples when the teacher/professor doesn't want to deal with the academic integrity process"
That's a good point: in this particular case, the teacher of the course was subpoenaed to federal court and compelled to testify about their grading. Incredible burden, for someone else's problem.
I have had the rare privilege to see up close examples of how at several US universities, when professors are presented with irrefutable proof that a student has cheated (well beyond any reasonable doubt) the professor will most often do nothing. In the best case they will meet with the student and give them a stern talking to.
The whole system is set up to disincentivize any effort to actually hold students accountable for cheating in a significant way (fail assignment, fail course, expulsion, etc.)
When we read about cases of students being held accountable it's generally the exception not the rule.
Last I checked, 65 was a D-, not a C+. So the C+ was for the course.
Grading scales vary. 65% could correspond to any letter grade.
Fail to meaningfully discipline students due to fear of litigious moron parents, get sued by litigious moron parents anyway.
There need to be counter claims to cover these costs instead of it falling on taxpayers.
also disciplinary action and very probable expulsion
At a university that takes their rules seriously, perhaps. Absolutely not expulsion at any k-12 school.
Is it plagiarizing when you copy stuff that isn't even factual? Merriam-Webster:
> to steal and pass off (the ideas or words of another) as one's own : use (another's production) without crediting the source
I guess so.
I think you briefly forgot the whole genre of fiction exists
No, it's academic dishonesty: representing work that you did not do as your own work.
ETA: "Academic dishonesty" also covers things like falsifying data and willfully misattributing sources, which is a closer approximation to this case.
There are times I have passed students simply because it isn't worth it to not.
We wonder why our society is going to shit. This is one of those thousand cuts.
ai hallucinated citations to nonexistent publications.
in this case, the ai should publish the cited hallucinated works on amazon to make it real.
not that it would help us, but the ai will have its bases covered.
Then they could train the next generation of models on those works. Nothing to scrape or ingest, since they already have the text on hand!
Discussed before the ruling:
https://news.ycombinator.com/item?id=41861818
How do you get students to engage in creative writing assignments in age of AI?
How do you get them to dive into a subject and actually learn about it?
I'm thirty something. How did my teachers engage me in doing math? How did they engage me in rote-memorizing the multiplication tables when portable calculators were already a thing, being operated by coin-cells or little solar panels?
Part of teaching is getting kids to learn why and how things are done, even if they can be done better/faster/cheaper with new technology or large-scale industrial facilities. It's not easy, but I think it's the most important part of education: getting kids to understand the underlying abstract ideas behind what they're doing, and learning that there's value in that understanding. I don't really want to dichotomize, but any other way, kids will just become non-curious users of magic black boxes (with black boxes being computers, societal systems, buildings, infrastructure, supply chains, etc.).
The same way you did so before LLMs existed - you rely on in-class assignments, or take-home assignments that can't be gamed.
Giving out purely take-home writing assignments with no in-class component (in an age where LLMs exist), is akin to giving out math assignments without a requirement to show your work (in an age where calculators exist).
Many years before LLMs were ever a thing, I recall being required to complete (and turn in) a lot of our research and outlining in class. A plain "go home and write about X topic" was not that common, out of fear of plagiarism.
Sure, use AI for research, just like using the Internet for research.
But don't copy/paste AI generated content in the same way that you don't copy/paste a chapter from a book and pass it off as your own.
Invert the assignment: provide a prompt to supply to an essay-writing AI of the student's choice, but make the assignment to critique the veracity and effectiveness of the generated essay.
It would seem that what was put into the report is clearly wrong (in this case it came from generative AI, but regardless of where it came from, it would still be wrong), so it is legitimate to mark those parts as wrong. There are other things that can be called wrong too, whether or not the use of this generative AI is permitted (and it probably makes sense not to permit it in the way it was used in this instance), so there are many reasons why it should be marked wrong.
However, if the punishment is excessively severe, then the punishment would be wrong.
He didn't get detention for hallucinating facts. He got detention for plagiarizing hallucinations without attribution.
The parents seem absolutely unhinged.
Poor kid.
Yet another “affluenza”-raised child joining the ranks of society. Probably a future C-level exec at an American company.
> the Harris’s lawsuit against the Hingham school committee remains alive
what does this mean if the judge already ruled in the school's favor? parents will appeal?
The parents asked for a preliminary injunction to remove the cheating from the kid's record. A judge could do this prior to trial if he believes the suit is likely to succeed. The judge refused the injunction because he believes the school district was likely acting in good faith and did nothing illegal.
Oh, I see, so the case is still going to court. What a waste of taxpayer money. Elon wants to cut government waste? Make it more difficult to sue by setting a higher bar by which your suit can even be accepted.
Yep, case is still alive. This just indicates the judge doesn’t see it as a slam dunk for the parents/cheater.
While the parents assert the C+ might keep their kid out of Stanford, the more likely impact is that being known for a nationally notorious lawsuit over a minor infraction is what will keep him out of Stanford.
Also, he's not getting into Stanford with the B grade that the parents are suing for anyway. You can't even get into Stanford with all A's these days.
> Also, he's not getting into Stanford with the B grade that the parents are suing for anyway. You can't even get into Stanford with all A's these days.
None of this is true.
Grades are just one part of the picture.
The folks who think a B is what kept them out of an elite school are just engaging in wishful thinking.
The number of people who get into elite schools like Harvard or Stanford with multiple Bs would surprise you.
I think you might get in with multiple Bs and a good story about your interest in the subject you're pursuing (or suitably connected family)
"good story" probably doesn't include being too uninterested to write your own answers despite parents so committed to you going to Stanford they're prepared to litigate to get you a B...
> "good story" probably doesn't include being too uninterested to write your own answers despite parents so committed to you going to Stanford they're prepared to litigate to get you a B...
So true.
This kid is a persona non grata for elite schools at this point.
As I said in the other thread, his best bet is to go to a JuCo that feeds into a decent state school, and just lay low for two years.
He can go to an elite school for a graduate degree if he wants the club membership.
Of course it's possible, but you have to have something truly extraordinary to make up for it (or be a legacy admit, rich parents who donated to the school, etc.). The B will certainly work against you.
> but you have to have something truly extraordinary to make up for it
Flip that, and you’re closer to correct for everyone.
You have to do something truly extraordinary to get in, with the things you listed as being some of the least common types.
Grades just need to be directionally correct rather than perfect.
Also, a side note about legacy admits…
While the admission rate of legacies is about 33% at Harvard (12% at Yale, 30% at Princeton, and 14% at Stanford), that doesn’t mean that being a legacy was the primary reason they got in.
First, 67% of legacies still get denied — that’s quite a bit.
Second, the folks who get into elite schools often know how to raise their kids in a way that increases their chances to get into an elite school. It’s an advantage, but much more often than not, the applicant put in the effort to make themselves a strong applicant.
The legacy “advantage” comes into play almost purely at the margin, when someone is borderline admit/waitlist or waitlist/deny, and the legacy status will push them to the favorable side. You’re not going to see a substantial difference in the folks who were rated comparably.
People seem to want it to be that legacies are freeloading off of their parents and aren’t really qualified admits, and that largely isn’t true. The exceptions are examples like z-list applicants (which you mentioned) or recruited athletes who also happen to be legacies.
I wanna see how many Asian men get in with B's
> I wanna see how many Asian men get in with B's
Please stop perpetuating this myth.
Asians are not held to a different standard.
Anecdotally (with a truckload of anecdotes), Asian-Americans (to be specific) frequently seem to be held to a widely known standard that either they aren't aware of or don't believe in.
Note that this is not exclusive to Asian-Americans — plenty of upper-middle class white people fall into this category as well — but that was the group you mentioned.
I have made an open offer to HN, and it still holds:
If you show me the application of an Asian that you felt was held to a different standard for elite school admissions, then I will give you the reason why they most likely didn’t get in.
that’s not much of an offer. one can easily always find (especially when specifically looking for it to prove a point) whatever it is they are looking for :)
I personally know there is Asian-American bias (not just Asian-American…) in admissions at at least one elite school, via one of my best friends who works in the admissions office.
> I personally know there is Asian-American bias (not just Asian-American…) in admissions at at least one elite school, via one of my best friends who works in the admissions office.
Oh, interesting.
What is the specific bias they claim exists?
Fwiw, they did a full body cavity search on Harvard admissions, and the best that they could come up with was describing an applicant (accurately) using race-based shorthand — something like “standard Asian reach applicant”, which (iirc) meant something like high grades, high standardized test scores… and almost nothing else. This is a complete nothing burger.
Note that this stereotype exists for a reason. It’s not exclusive to Asians, but it’s much more common with Asian applicants than other races.
Edited to add:
> that’s not much of an offer. one can easily always find (especially when specifically looking for it to prove a point) whatever it is they are looking for :)
Almost every time I’ve done this face-to-face, it wasn’t some subtle oversight — it was a glaring omission or weakness in the application.
The times that it wasn’t obvious, the person got into an elite school, just didn’t get into their elite school of choice, and that’s a different issue.
Curiously, a nationally notorious lawsuit is not enough to keep you out of Stanford [0].
[0] https://www.popehat.com/p/an-anti-slapp-victory ("An Anti-SLAPP Victory")
Citing nonexistent sources should lower your grade whether you used ai or not.
After reading this article, it is hard to say who is in the right here. The court could easily be wrong, because it can only judge based on the facts at hand and on presumptions it has already settled on.
On one hand, the school referenced academic honesty policy in their defense, but there are no international standards for referencing AI, many AI detection measures have false positives, and they both disallow and allow the same behavior seemingly based upon the teacher's discretion.
If you were a malign individual in a position of authority (i.e. the classic teacher being out to get a troublemaker), you could easily set up circumstances that are unprovable under these guidelines.
There is also a vested interest in academia to not create a hostile work environment, where there is no duty to investigate. They are all in it together. This has been an ongoing problem for academia for decades.
There were also several very prejudicial aspects referenced, such as the revision changes, but some people write their stuff out on paper first and then copy what's written into a document from there. This is proof of nothing because it's apples to oranges.
Finally, there are grievances made about lack of due process and other arbitrary matters which are all too common in academia, but academia makes it very difficult to be caught in such matters short of recording every little thing, which may potentially be against the state's laws.
For example, you may be given written instructions for an assignment that are unclear, and ask for clarification, and the teacher may say something contradictory. Should students be held accountable for a teacher lying verbally (if it happened)?
It is sad that it had to come down to court, but that is just how academia operates with free money from the government. They don't operate under a standard business loss function, nor do they get rid of teachers who don't perform once they reach permanent faculty status. The documentary Waiting for Superman really drives this home, and it's only gotten worse since that documentary came out.
There are plenty of people in the comments who are just rabid against the parents, and that's largely caused by poor journalism seeking to rile people up into irrational fanatic fervor. These people didn't look at the details, they just hopped on the bandwagon.
Be rational, and realize that academia has been broken for decades, and what you read isn't necessarily the whole truth of the matter.
The parents had several valid points which went ignored because there is no smoking gun, and that is how corruption works in centralized systems, and indicates a rule by law rather than a rule of law.
One of the hallucinated authors is literally named "Jane Doe". Our society is about to become powerfully stupid.
"Doe" is actually a real surname, with a few thousand of them in the US. I'd guess that there probably have been people actually named "Jane Doe". I wonder if that causes many problems for them?
what sorts of problems do you imagine this causing?
The name is widely used as a placeholder. Here's how Wikipedia describes it [1]:
> John Doe (male) and Jane Doe (female) are multiple-use placeholder names that are used in the British and US-American legal systems, and generally in the United Kingdom and the United States, when the true name of a person is unknown or is being intentionally concealed.
I'd imagine that could lead to some difficulties when someone really named Jane Doe has to deal with some system that uses that name as a placeholder. Similar to the way people whose surname is Null sometimes run into problems because of poorly written computer systems.
[1] https://en.wikipedia.org/wiki/John_Doe
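For a sense of how a real surname like "Null" (or an actual "Jane Doe") trips up sloppy software, here is a minimal, purely hypothetical JavaScript sketch of the kind of check that causes it; the function name and the placeholder strings are my own, not taken from any real system:

```javascript
// Hypothetical sketch: a sloppy "is this field missing?" check that
// misfires on real people named "Null" or "Jane Doe". Illustrative only.
function looksMissing(value) {
  return (
    value == null ||                            // a genuinely absent value
    value.trim() === "" ||                      // an empty field
    value.trim().toLowerCase() === "null" ||    // also rejects the real surname "Null"
    value.trim().toLowerCase() === "jane doe"   // also rejects an actual Jane Doe
  );
}

console.log(looksMissing("Null"));      // true  -- a real surname treated as missing data
console.log(looksMissing("Jane Doe"));  // true  -- a real person treated as a placeholder
console.log(looksMissing("Smith"));     // false
```

The bug usually creeps in when some upstream layer serializes a null or a placeholder name into the same string field as genuine user input, and nothing downstream distinguishes the two.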
DoE is the department of energy. The department of education is ED.
I laughed out loud when I saw that McMahon was his pick. A fucking wrestling star for the department of education. This is Idiocracy.
Also I laughed because otherwise the fear takes over.
In legal cases that is how one can choose to remain anonymous.
See, there's stuff even geniuses don't know.
Why do you think the previous poster found that name notable? Just because it's inherently funny sounding or something?
That's not relevant to this. It's a direct quote from the work the students handed in.
I am not a lawyer, but the student's defense is akin to "Ain't no rule says a dog can't play basketball" from Air Bud. There are clear rules against plagiarism, and the student copied stuff verbatim from an online source without any citations.
This would have been illegal in Italy, as their 1970 worker protection law against automated management would kill this AI.
AI is the new calc
Using AI in school today is heresy, yet give it a few years and "yesterday's heresy is today's canon".
Back when I was in high school CD-ROMs were brand new and you could buy encyclopedias on disc.
I made dozens of dollars selling book reports and history papers to my fellow honors class peers. Every paper was a virtually unaltered copy & paste job from Microsoft Encarta. Copy, paste into Word, format using some “fancy font”, add my “customer's” name, date and class to the top… print! Boom. Somebody buys me lunch.
I mean how else was I gonna have time to write shitty Visual Basic programs that used every custom control I could download in order to play/pause the CDROM’s music stuff?
A microcosm of society. Helping others cheat for profit.
Nothing modern whatsoever about it. Students at Oxford nearly a thousand years ago sold their talents to other students.
It was high school.
Hence the microcosm
I just used ChatGPT to code an HTML/CSS/JavaScript solution in an hour for coworkers who were having trouble. They were like, wow, that was fast, we were trying to figure this out for a few days. I'm skilled / an expert, but that would've taken me many hours vs. a few back-and-forths with GPT.
Overall, I feel my HTML/CSS/JavaScript skills now aren't as valuable as they were.
I guess in this instance I cheated too, or is it that my developer peers haven't gotten into using GPT, or that they are more moral? As well, maybe this is just the new normal...
This has nothing to do with cheating at school.
The rules for working are very very different from being at school.
No you were not cheating, you did what was expected from you. But you knew that.
How so? And isn't AI changing the rules everywhere? Today it seems not good, yet tomorrow it's just how things are...
The goals are very different. It was like this also before AI.
The goal in school is to learn things. To learn to write you can't just copy an article from a paper and say it is yours. You have not learned.
At work, the goal is to get things done.
In our field you needed, and still need, to learn new things to stay relevant, yet now the new thing does almost all of it for you.
As well, if one generation is using AI to get things done, why wouldn't a younger generation do the same? Do as I say, not as I do... that has never held up well over time.
But you already learned the web stack--school kids haven't. Your mental model is what prepared you to use LLMs well to solve a problem. So if they're going to do as you did, they need to learn the subject first and then learn how to extend their reach with LLMs. Otherwise, they're just cheating in school.
You don't need that knowledge. I just went to GPT and asked it...
"I need to create a dropdown for a website can you help me make it?"
And then I asked,
"How do I make what you wrote above work?"
It detailed the things one needs to do: copy/paste each block of code into three separate Notepad files and save each one accordingly (index.html, style.css, and script.js), all in one folder. Once that's done, double-click index.html to run the dropdown.
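For anyone curious what the result of those instructions looks like, here is a minimal sketch of such a dropdown. To be clear, this is not the actual GPT output from the comment above; the markup, class names, and IDs are my own, and the three files it describes are collapsed into a single index.html for brevity:

```html
<!-- Minimal dropdown sketch. Save as index.html and double-click to open it in a browser.
     The <style> block stands in for style.css and the <script> block for script.js. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <style>
    .dropdown { position: relative; display: inline-block; }
    .dropdown-menu { display: none; position: absolute; background: #fff; border: 1px solid #ccc; }
    .dropdown-menu.open { display: block; }
    .dropdown-menu a { display: block; padding: 4px 12px; color: #333; text-decoration: none; }
  </style>
</head>
<body>
  <div class="dropdown">
    <button id="toggle">Menu</button>
    <div id="menu" class="dropdown-menu">
      <a href="#">Option 1</a>
      <a href="#">Option 2</a>
      <a href="#">Option 3</a>
    </div>
  </div>
  <script>
    // Show or hide the menu each time the button is clicked.
    document.getElementById("toggle").addEventListener("click", function () {
      document.getElementById("menu").classList.toggle("open");
    });
  </script>
</body>
</html>
```

A real project would keep the CSS and JS in separate files, as the comment describes; inlining them here just keeps the sketch self-contained.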
And your colleagues really spent a few days trying to figure this out?
the kids are going to be in a different world than we are. just like it was useful for us to learn a foreign language (still being taught in schools, but those days are numbered), for kids these days it is a waste of time (I am sure there are many studies that say being bi/tri/…lingual has benefits beyond communication, but you get my point).
I think while we may think “they need to learn the subject first…” do they really? and if they do why? e.g. someone teaching their kid “web development” in soon-to-be 2025 is insane given the tools we have now… so while there are things for sure kids should learn it is not easy to figure out what those things actually are
Yeesh this is full of red flags…
What is? The new normal of using AI to do your job, or to help you get it done quicker? The comment above shows it could be the new normal...
No. This attitude of being better than coworkers, coming in and saving the day. It had nothing to do with using AI. It’s about “I am better than you” instead of helping people out, or teaching them these things you know.
It’s just a passing internet comment missing all the context, so what do I know.
My comments are meant to be controversial… to get people to think… What is the future with AI, and with using it like this? If I told my coworkers how I achieved it, would they not think less of me today? What about in a few years or more, when it's the norm and mine and everyone's HTML, CSS, and JavaScript skills are less valuable? This example shows that AI will definitely take people's jobs, including my own if I do not ramp up my skills.
You ramping up your skills will do nothing for you if a machine can otherwise be delegated your job, given the overhead of a human worker vs. just owning a machine's output. Not having to negotiate is extremely valuable to a business owner. Mark my words. Until people realize that the whole innovation around AI is to sidestep the labor class, things'll continue getting much darker before they brighten.
And the saddest thing is, the fools think it'll work in their favor and won't blow back with massive unintended consequences.