this post was submitted on 13 Aug 2023
783 points (97.7% liked)

Technology


College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

top 50 comments
[–] AlmightySnoo@lemmy.world 113 points 1 year ago* (last edited 1 year ago)

I think that's actually a good idea? Sucks for e-learning as a whole, but I always found online exams (and also online interviews) to be very easy to game.

[–] HexesofVexes@lemmy.world 105 points 1 year ago (52 children)

Prof here - take a look at it from our side.

Our job is to evaluate YOUR ability, and AI is a great way to mask poor ability. We have no way to determine whether you did the work or an AI did, and if called into court to certify your expertise, we could not do so beyond a reasonable doubt.

I'm not arguing exams are perfect, mind you, but I'd rather doubt a few students' inability (maybe it was just a bad exam for them) than always doubt their ability (is any of this their own work?).

Case in point: ALL students on my course with low (<60%) attendance this year scored 70s and 80s on the coursework and 10s and 20s in the OPEN BOOK exam. I doubt those 70s and 80s are a real reflection of those students' ability, but they do suggest they can obfuscate AI work well.

[–] mwguy@infosec.pub 54 points 1 year ago (2 children)

They're about to find out that Gen Z has horrible penmanship.

[–] aulin@lemmy.world 51 points 1 year ago (5 children)

There are places where analog exams went away? I'd say Sweden has always been at the forefront of technology, but our exams were always pen-and-paper.

[–] Leroy@lemmy.world 8 points 1 year ago

Same in Germany

[–] neptune@dmv.social 42 points 1 year ago (6 children)

This isn't exactly novel. Some professors allow a cheat sheet. But that just means that the exam will be harder.

A physics exam that allows a cheat sheet asks you to derive the law of gravity. Well, OK, you write the answer at the bottom, pulled from your cheat sheet. Now what? If you recall how it was originally derived, you probably write Newton's three laws at the top of your paper... and then start doing some math.
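For the curious, here's a rough sketch of the sort of derivation that might be expected (assuming the usual circular-orbit plus Kepler's-third-law route; any given exam may want a different one):

```latex
% Rough sketch (my assumption, not from the thread):
% gravitation from circular orbits plus Kepler's third law
\begin{align*}
F &= \frac{m v^2}{r}  && \text{centripetal force on an orbiting mass (Newton's 2nd law)} \\
v &= \frac{2\pi r}{T} && \text{speed of a circular orbit of radius } r \text{ and period } T \\
\Rightarrow\ F &= \frac{4\pi^2 m r}{T^2} \\
T^2 &= k r^3          && \text{Kepler's third law} \\
\Rightarrow\ F &= \frac{4\pi^2 m}{k\, r^2} \propto \frac{m}{r^2}
  && \text{symmetry in the two masses then gives } F = \frac{G m_1 m_2}{r^2}
\end{align*}
```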

A calculus exam that lets you use Wolfram Alpha? Just a really hard exam where you must show all of your work.

Now, with ChatGPT, it's no longer enough to have a take-home essay to force students to engage with the material, so you find new ways to do so. Written, in-person essays are certainly a way to do that.

[–] Rozz@lemmy.sdf.org 41 points 1 year ago (5 children)

Am I wrong in thinking students can still generate an essay and then copy it by hand?

[–] CrimsonFlash@lemmy.ca 50 points 1 year ago (3 children)

Not during class. Most likely a proctored exam. No laptops, no phones, teacher or proctor watching.

[–] Mugmoor@lemmy.dbzer0.com 31 points 1 year ago (14 children)

When I was in college for computer programming (about six years ago), I had to write all my exams on paper, including code. This isn't exactly a new development.

[–] whatisallthis@lemm.ee 23 points 1 year ago (1 children)

So what you’re telling me is that written tests have, in fact, existed before?

What are you, some kind of education historian?

[–] UsernameIsTooLon@lemmy.world 31 points 1 year ago (1 children)

You can still have the AI write the paper and copy it from screen to paper. If anything, this will make AI harder to detect, because it's now AI plus human error introduced during transcription rather than straight copying and pasting.

[–] TimewornTraveler@lemm.ee 28 points 1 year ago (3 children)

Can we just go back to calling this shit Algorithms and stop pretending it's actually Artificial Intelligence?

[–] WackyTabbacy42069@reddthat.com 28 points 1 year ago (20 children)

It actually is artificial intelligence. What are you even arguing against, man?

Machine learning is a subset of AI, and neural networks are a subset of machine learning. Saying an LLM (based on neural networks for prediction) isn't AI because you don't like it is like saying rock and roll isn't music.

[–] Four_lights77@lemm.ee 25 points 1 year ago (1 children)

This thinking just feels like moving in the wrong direction. As an elementary teacher, I know that by next year all my assessments need to be practical or interview-based. LLMs are here to stay, and the quicker we learn to work with them, the better off students will be.

[–] pinkdrunkenelephants@sopuli.xyz 25 points 1 year ago (1 children)

And forget about having any sort of integrity or explaining to kids why it's important for them to know how to do shit themselves instead of being wholly dependent on corporate proprietary software whose accessibility can and will be manipulated to serve the ruling class on a whim 🤦

[–] Not_Alec_Baldwin@lemmy.world 18 points 1 year ago* (last edited 1 year ago) (1 children)

It's insane talking to people that don't do math.

You ask them any mundane question and they just shrug, and if you press them they pull out their phone to check.

It's important that we do math so that we develop a sense of numeracy. By the same token it's important that we write because it teaches us to organize our thoughts and communicate.

These tools will destroy the quality of education for the students who need it the most if we don't figure out how to rein in their use.

If you want to plug your quarterly data into GPT to generate a projection report I couldn't care less. But for your 8th grade paper on black holes, write it your damn self.

[–] thedirtyknapkin@lemmy.world 22 points 1 year ago

As someone with wrist and hand problems that make writing a lot by hand difficult, I'm so lucky I finished college in 2019.

[–] jordanlund@lemmy.one 22 points 1 year ago

ChatGPT: answer this question, add 4 consistent typos. Then hand-transcribe it.
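For what it's worth, the joke literalized as a minimal sketch using the OpenAI Python client (the model name, prompt, and word count are placeholders of mine, not anything from the thread):

```python
# Minimal sketch: ask a chat model to answer a question and bake in a few
# consistent typos, so the hand-copied result doesn't look machine-perfect.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "user",
            "content": (
                "Answer this exam question in about 500 words, and deliberately "
                "include 4 consistent typos: <question goes here>"
            ),
        }
    ],
)

print(response.choices[0].message.content)  # then hand-transcribe it
```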

[–] Mtrad@lemm.ee 20 points 1 year ago (5 children)

Wouldn't it make more sense to find ways to utilize AI as a tool and set up criteria that incorporate its use?

There could still be classes / lectures that cover the more classical methods, but I remember being told "you won't have a calculator in your pocket".

My point is, they should be prepping students with the skills to succeed with the tools they'll have available, and then give them the education to cover the gaps that AI can't solve. For example, you basically need to review what the AI outputs for accuracy. So maybe a focus on reviewing output and better prompting techniques? Training on how to spot inaccuracies? Spotting possible bias in a system skewed by its training data?

[–] revv@lemmy.blahaj.zone 13 points 1 year ago

Training how to use "AI" (LLMs demonstrably possess zero actual reasoning ability) feels like it should be a separate pursuit from (or subset of) general education to me. In order to effectively use "AI", you need to be able to evaluate its output and reason for yourself whether it makes any sense or simply bears a statistical resemblance to human language. Doing that requires solid critical reasoning skills, which you can only develop by engaging personally with countless unique problems over the course of years and working them out for yourself. Even prior to the rise of ChatGPT and its ilk, there was emerging research showing diminishing reasoning skills in children.

Without some means of forcing students to engage cognitively, there's little point in education. Pen and paper seems like a pretty cheap way to get that done.

I'm all for tech and using the tools available, but without a solid educational foundation (formal or not), I fear we'll end up a society of snake-oil users in search of blinker fluid.

[–] Atomic@sh.itjust.works 13 points 1 year ago (8 children)

That's just what we tell kids so they'll learn to do basic math on their own. Otherwise, you'll end up with people who can't even do 13 + 24 without having to use a calculator.

[–] settxy@lemmy.world 10 points 1 year ago

There are some universities looking at AI from this perspective, finding ways to teach proper usage of AI. Then building testing methods around the knowledge of students using it.

Your point on checking for accuracy is spot on. AI doesn't always puke out good information, and ensuring students don't just blindly believe it NEEDS to be taught. Otherwise you end up being these guys... https://apnews.com/article/artificial-intelligence-chatgpt-courts-e15023d7e6fdf4f099aa122437dbb59b

[–] SocialMediaRefugee@lemmy.world 14 points 1 year ago (7 children)

Might as well go back to oral exams and ask the student questions on the spot.

[–] HexesofVexes@lemmy.world 9 points 1 year ago

That's actually something that is done (PhD viva). If I had the budget to hire another 6 assistant profs to viva my 120 students, I'd probably do it for my module too!

[–] HawlSera@lemm.ee 9 points 1 year ago* (last edited 1 year ago) (3 children)

Isn't this kind of ableist? I remember when I was in school I had special accommodations to type instead of write, because I had wrists too weak to write legibly, but fingers fast enough to type expediently, they legitimately thought that I was a really stupid kid, until they realized that my spelling tests were not incorrect.

They just couldn't read that I had spelled it correctly. Somehow I wrote the word fly, and the teacher mistook my y for a v. I went from being the dumbest kid to the smartest kid as soon as the accommodation was put in place.

[–] wholeofthemoon@lemmy.world 19 points 1 year ago (6 children)

Your comment is full of errors, interestingly enough...

[–] Krachsterben@feddit.de 8 points 1 year ago* (last edited 1 year ago)

It's so bad lol. There's multiple errors in each sentence

[–] Water1053@lemmy.world 18 points 1 year ago (1 children)

You became the smartest kid because everyone else had a stroke trying to read what you wrote.
