this post was submitted on 16 Dec 2024
599 points (98.7% liked)

Technology

[–] Warl0k3@lemmy.world 27 points 1 week ago (3 children)

As a prof, it's getting a little depressing. I'll have students who really seem to be getting to grips with the material, nailing their assignments, and then when they're brought in for in-person labs... yeah, they can barely declare a function, let alone implement a solution to a fairly novel problem. AI has been hugely useful while programming, I won't deny that! It really does make a lot of the tedious boilerplate a lot less time-intensive to deal with. But holy crap, when the crutch is taken away people don't even know how to crawl.
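To give a sense of the level I mean - a function declaration plus a small, slightly novel problem. A sketch in Python (the problem and names are mine, purely illustrative):

```python
def longest_run(items):
    """Return the length of the longest run of consecutive equal items."""
    best = current = 0
    previous = object()  # sentinel that never compares equal to real data
    for item in items:
        # extend the current run, or start a new one
        current = current + 1 if item == previous else 1
        best = max(best, current)
        previous = item
    return best

print(longest_run([1, 1, 2, 2, 2, 3]))  # 3
print(longest_run([]))                  # 0
```

Nothing exotic - a loop, a comparison, a counter. That's the kind of thing that falls apart in the lab when the crutch is gone.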

[–] rottingleaf@lemmy.world 7 points 1 week ago (1 children)

There seem to be two problems. One is obvious; the other is that such tedious boilerplate exists at all.

I mean, all engineering is divide and conquer. Doing the same thing over and over for very different projects seems like a fault in the paradigm. When making a GUI with Tcl/Tk you don't really need that boilerplate, but with Qt you do.

I'm biased as an ASD+ADHD person who hasn't become a programmer despite a lot of trying, because there are a lot of things that don't seem necessary yet are huge, and they turn off my brain through both overthinking and boredom.

But still - students don't know which parts of the work they must do for an assignment are absolutely necessary for the core task, and which are maybe not, just practically required. So they can't even correctly interpret the help that an "AI" (or some anonymous helper) is giving them. And thus, ahem, prepare for labs ...

[–] Entropywins@lemmy.world 6 points 1 week ago (1 children)

If you're in school, everything being taught to you should be considered a core task and practically required. You can reassess once you've graduated and are a few years into your career, as you'll then know what you need, what you like, and what you should know. Until then, you have to trust the process.

[–] rottingleaf@lemmy.world 2 points 1 week ago (1 children)

People are different. For me personally "trusting the process" doesn't work at all. Fortunately no, you don't have to, generally.

[–] Warl0k3@lemmy.world 2 points 1 week ago (1 children)

I have never had a student with this attitude pass my program, and I've had a great many students with this attitude. Take from that what you will.

[–] rottingleaf@lemmy.world -1 points 1 week ago (1 children)

Then you are a bad instructor, obviously.

Because it's often not like this and the difference is usually in the instructor.

That's what I take from that.

(Other than common sense about meaningless mimicking versus gradual understanding from small steps, confirmed by plenty of research about didactics.)

[–] Warl0k3@lemmy.world 1 points 1 week ago* (last edited 1 week ago) (1 children)

I'm going to be totally honest, on a re-read I do not understand what you're trying to say here.

[–] rottingleaf@lemmy.world 0 points 1 week ago

Not sure which particular parts are confusing, so I'm going to guess and rephrase like this:

People are obviously different, and a single process obviously can't fit everyone, so if there's a kind of "attitude" with which that process fails, then the problem can lie with the process as much as with the attitude.

And in my personal experience there are processes which work just fine with that attitude.

Processes are built for human needs; humans are not built for processes.

So the problem is with the process, which includes the instructor, who seems to think it's not.

[–] Omega_Jimes@lemmy.ca 5 points 1 week ago (1 children)

This semester I took a basic database course, and the prof mentioned that LLMs are useful for basic queries. A few weeks later we had a no-computer, closed-book paper quiz, and he was like "You can't use GPT for everything, guys!"

Turns out a huge chunk of the class was relying on gpt for everything.
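For context, the kind of "basic query" a closed-book quiz expects you to write by hand is roughly this (a sketch using Python's built-in sqlite3; the table and names are made up for illustration):

```python
import sqlite3

# Throwaway in-memory database with a toy table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT, grade INTEGER)")
conn.executemany("INSERT INTO students VALUES (?, ?)",
                 [("Ada", 91), ("Ben", 74), ("Cy", 88)])

# The hand-writable part: a filtered, ordered SELECT
rows = conn.execute(
    "SELECT name FROM students WHERE grade >= 85 ORDER BY name"
).fetchall()
print(rows)  # [('Ada',), ('Cy',)]
```

If an LLM is doing even that part for you, the paper quiz is going to hurt.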

[–] Warl0k3@lemmy.world 5 points 1 week ago* (last edited 1 week ago)

Yeeeep. The biggest adjustment I/my peers have had to make to address the ubiquity of students cheating using LLMs is to make them do stuff, by hand, in class. I'd be lying if I said I didn't get a guilty sort of pleasure from the expressions on certain students when I tell them to put away their laptops before the first thirty-percent-of-your-grade in-class quiz. And honestly, nearly all of them shape up after that first quiz. It's why so many profs are adopting the "you can drop your lowest-scoring quiz" policy.

Yes, it's true that once they get to a career they'll be free to use LLMs as much as they want - but much like with a TI-86, you can't grasp the concepts your calculator can't handle without first understanding the ones it can.

[–] thefactremains@lemmy.world 5 points 1 week ago

When AI achieves sentience, it'll simply have to wait until the last generation of humans that know how to code die off. No need for machine wars.