this post was submitted on 22 Oct 2024
119 points (88.4% liked)

Not The Onion


The lawsuit says the Hingham High School student handbook did not include a restriction on the use of AI.

"They told us our son cheated on a paper, which is not what happened," Jennifer Harris told WCVB. "They basically punished him for a rule that doesn't exist."


cross-posted from: https://lemmy.zip/post/24633700

Case file: https://storage.courtlistener.com/recap/gov.uscourts.mad.275605/gov.uscourts.mad.275605.8.0.pdf
Case file: https://storage.courtlistener.com/recap/gov.uscourts.mad.275605/gov.uscourts.mad.275605.13.0.pdf

top 25 comments
[–] Humanius@lemmy.world 61 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

I'm guessing they probably have rules against plagiarism, or against passing off other people's work as your own.
So then I guess it comes down to whether using AI (without disclosure?) counts as plagiarism or not.

[–] 667@lemmy.radio 55 points 3 weeks ago* (last edited 3 weeks ago) (4 children)

Most of the larger LLMs state that the model's output stemming from the user’s prompt intellectually belongs to the user.

It’s a massive grey area, and the sum of these kinds of cases is what will define ownership of LLM output for the next ~50 years.

Don’t get me wrong, the kid absolutely did not comply with the spirit of the assignment.

E: @Blue_Morpho@lemmy.world makes an excellent point:

If the student hired someone to write their essay and the author assigned all copyrights to the student, it's still plagiarism.

Who legally owns the work isn't the issue with plagiarism.

[–] BananaTrifleViolin@lemmy.world 24 points 3 weeks ago (2 children)

The LLMs can claim whatever they like; it holds no weight or value. They are basically advanced plagiarism engines, and the law has already made it clear that you cannot copyright the output of an LLM.

This particular case will go nowhere, but there are plenty of legal cases between content creators and AI makers that are slowly moving through the legal system that will go somewhere.

[–] hedgehog@ttrpg.network 3 points 3 weeks ago

the law has already made it clear you cannot copyright the output of an LLM.

That’s true in this context and often true generally, but it’s not completely true. The Copyright Office has made it clear that the use of AI tools has to be evaluated on a case-by-case basis, to determine if a work is the result of human creativity. Refer to https://www.copyright.gov/ai/ai_policy_guidance.pdf for more details.

For example, they state that the selection and arrangement of AI outputs may be sufficient for a work to be copyrightable. And that’s without doing any post-processing of the AI’s outputs.

They don’t talk about situations like this, but I suspect that, if given a prompt like “Rewrite this paragraph from third person to first person,” where the paragraph in question is copyrighted, the output would maintain the same copyright as the input (particularly if performed faithfully and without hallucinations). Such a revision could be made with non-LLM technology, after all.

[–] Flax_vert@feddit.uk 2 points 3 weeks ago

So who owns the copyright then? Is the output just public domain?

[–] Blue_Morpho@lemmy.world 20 points 3 weeks ago* (last edited 3 weeks ago)

It doesn't matter what the LLM license states. Replace the LLM with a person doing exactly what the LLM does and ask yourself if it is plagiarism.

If I do your homework for you and say, "Because you prompted me with the questions, the answers belong to you," that isn't a free 'get out of plagiarism' card for you. What I tell you isn't relevant.

It's not gray at all.

Edit: that's weird. I got a personal message but the reply showed up here.

[–] Blue_Morpho@lemmy.world 10 points 3 weeks ago* (last edited 3 weeks ago)

If the student hired someone to write their essay and the author assigned all copyrights to the student, it's still plagiarism.

Who legally owns the work isn't the issue with plagiarism.

[–] spankmonkey@lemmy.world 9 points 3 weeks ago* (last edited 3 weeks ago)

Most of the larger LLMs state the results of the model stemming from the user’s prompt intellectually belong to the user.

Who cares what they say to avoid being sued for copyright infringement?

[–] saltesc@lemmy.world 18 points 3 weeks ago (2 children)

I sometimes use an LLM to "tidy up" my work and paste a bunch of writing in to see if it comes up with anything better. Some parts it will, others it won't, and I'll use or tweak some of it. I wonder if that counts? It's all my work going in, but it's using other people's work to make adjustments.

[–] Blue_Morpho@lemmy.world 17 points 3 weeks ago (1 children)

Replace LLM with a person. If it was a person editing your work, does it make it plagiarism?

A common proofreading technique is to give your work to another person to read and make comments. That's not plagiarism.

[–] Saik0Shinigami@lemmy.saik0.com 6 points 3 weeks ago (1 children)

People who proofread generally only make recommendations for edits. LLMs often "rewrite" the vast majority of the document.

If I give my editor the concept of my paper and only about 20-30% of the content that ends up in the final paper... it sounds to me like someone else wrote the paper.

It all comes down to how you're using the tool. Lots of kids out there will simply tell ChatGPT to write something for them. Others will just ask for basic proofreading. It's a bitch to tell the difference on the grading side.

[–] Blue_Morpho@lemmy.world 2 points 3 weeks ago (1 children)

Yes, that's exactly my opinion on the subject. ( I realize this is a contentless reply but I didn't want you to think I downvoted you.)

[–] Saik0Shinigami@lemmy.saik0.com 1 points 3 weeks ago

I didn’t want you to think I downvoted you.

I'm admin on my small instance. I can see the votes. No worries. In this case the downvote is from xektop@lemmy.world.

Anyway, the most I ever use LLMs for professionally is to help rearrange content for better flow, or to convert the more rambly bits into something concise. I tend to be more verbose than I need to be (mostly because my documentation is wildly verbose, since I tend to forget stuff, which is great for documentation... not always great for talking something through with a client).

[–] dharmacurious@slrpnk.net 3 points 3 weeks ago

I write my own papers, but I'll put paragraphs through an LLM (normally Grammarly's 'AI') and ask how they can be improved. Sometimes I take its advice, but half the time I dislike what it's done. Sometimes I give it a bunch of information on what I need to write, it spits something out, and I'll sort of use that as a skeleton for my paper. But to be honest, it's kind of shit, regardless of which one I've tried. And it lies. So much.

[–] BonerMan@ani.social -1 points 3 weeks ago

But those rules don't apply here.

[–] DandomRude@lemmy.world 43 points 3 weeks ago

This reminds me of a story a friend of mine, a teacher, recently told me: one of his students was so nervous during an oral exam that he could barely form a complete sentence. So my friend, in consultation with the exam board, gave the poor guy a second chance on the same day. That didn't go particularly well either, but it was enough to pass. The parents of the nervous student sued because this procedure did not comply with the examination regulations. They won and managed to get the exam repeated a third time, with the examination board unchanged. You can perhaps imagine how that went for the student, who was understandably even more nervous the third time around. In the end, he didn't graduate. Not because the examiners were vindictive, but because they had to grade him purely on his performance, and it wasn't good enough: the poor guy couldn't get a coherent sentence together again. If his parents hadn't sued, he would have graduated.

[–] breakingcups@lemmy.world 20 points 3 weeks ago (2 children)
[–] federalreverse@feddit.org 8 points 3 weeks ago

I guess you're right in that the headline is not Onion-worthy. But I find "it's not cheating to cheat using a machine, let's sue" a rather creative approach.

[–] Akasazh@feddit.nl 1 points 3 weeks ago

Iirc the suing parents are teachers themselves

[–] solsangraal@lemmy.zip 16 points 3 weeks ago (2 children)

lol AI has written many thousands of words for me at work. the real life skill is how to not get caught using it

[–] Hideakikarate@sh.itjust.works 32 points 3 weeks ago

Don't quote Wikipedia, instead quote the citations in Wikipedia.

[–] Solumbran@lemmy.world 3 points 3 weeks ago (1 children)

Sounds like your employer is lucky...

[–] solsangraal@lemmy.zip 9 points 3 weeks ago (1 children)

yea, lucky i even show up, with the bullshit pittance they're calling "wages"

[–] AmidFuror@fedia.io 1 points 3 weeks ago (1 children)

It seems like with your skill set, you should be able to get a better job.

[–] Saik0Shinigami@lemmy.saik0.com 6 points 3 weeks ago

I think their skillset might be limited to what ChatGPT can produce.