this post was submitted on 30 Aug 2023
204 points (95.9% liked)

[–] TurnItOff_OnAgain@lemmy.world 18 points 1 year ago (2 children)

Oof. I've tried it with a few PowerShell things and it has recommended cmdlets that don't exist, parameters that don't exist, or incorrect usage of real cmdlets.
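A quick sanity check before running anything it suggests: `Get-Command` returns nothing for a cmdlet that doesn't exist. A minimal sketch (using one of the hallucinated names from this thread; the `ExchangeOnlineManagement` module is just an example of where you might search):

```powershell
# Check whether a suggested cmdlet actually exists before trusting it.
# Set-MailboxAddressBook is a hallucinated name, so this prints the warning.
if (Get-Command Set-MailboxAddressBook -ErrorAction SilentlyContinue) {
    Write-Host "Cmdlet exists"
} else {
    Write-Warning "Cmdlet does not exist - it was made up"
}

# Or search a module for what's actually available:
Get-Command -Module ExchangeOnlineManagement -Name *AddressBook*
```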

[–] FMT99@lemmy.world 13 points 1 year ago

It's really limited to basic, junior-level programming assistance, and even then it's not 100% reliable. Any time I've tried asking it something more advanced, it takes a lot of coaxing to get it to output reasonable code. But it's sometimes helpful for boilerplating basic code.

[–] CharlestonChewbacca@lemmy.world 5 points 1 year ago (3 children)

Have you tried 3.5 or 4?

I haven't had many issues in 4. Occasionally it does what you're saying, and I just say "bro, that doesn't exist" and it's like "oh, my bad, here you go" and gives me something that works.

[–] zero_spelled_with_an_ecks@programming.dev 1 points 1 year ago (2 children)

Just yesterday I had 4 make up a Jinja filter that didn't exist. I told it that, and it returned something new that also didn't work but had the same basic form. 4 sucks now for anything I'd like to be accurate.

[–] CharlestonChewbacca@lemmy.world 1 points 1 year ago (1 children)

What kind of prompts are you giving?

I find results can be improved quite easily with better prompt engineering.

[–] zero_spelled_with_an_ecks@programming.dev 0 points 1 year ago (1 children)

It makes things up whole cloth and it's the user's fault for not prompting it correctly? Come on.

It's not a person. It's a tool.

[–] Spellbind8558@lemmy.world 1 points 1 year ago

Both models have definitely decreased in quality over time.

[–] TurnItOff_OnAgain@lemmy.world 1 points 1 year ago (1 children)

I don't remember what version. I just gave up trying.

[–] CharlestonChewbacca@lemmy.world 1 points 1 year ago (1 children)

Well don't expect it to just give magical results without learning prompt engineering and understanding the tools you're working with.

[–] TurnItOff_OnAgain@lemmy.world 1 points 1 year ago

Set-MailboxAddressBook doesn't exist.

Set-ADAttribute doesn't exist.

Asking for a simple command and expecting to receive something that actually exists is magical?
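For reference, a sketch of what the real cmdlets for those two tasks look like (assuming the Exchange and ActiveDirectory modules are what was being asked about; identities and values below are placeholders):

```powershell
# Real equivalent of the invented "Set-MailboxAddressBook":
# the address book policy is a parameter on Set-Mailbox.
Set-Mailbox -Identity "jdoe@contoso.com" -AddressBookPolicy "All Users ABP"

# Real equivalent of the invented "Set-ADAttribute":
# arbitrary AD attributes are set via Set-ADUser -Replace.
Set-ADUser -Identity jdoe -Replace @{extensionAttribute1 = "VIP"}
```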

[–] radau@lemmy.dbzer0.com 0 points 1 year ago* (last edited 1 year ago)

I used GPT-4 for Terraform and it was kind of all over the place, suggesting fully deprecated methods. It felt like a nice jumping-off point, but honestly it probably would've been less work to just write it up from the docs in the first place.

I can definitely see how it could help someone fumble through it and come up with something working without knowing what to look for though.

Was also having weird issues with it truncating outputs and needing to split them, but even telling it to split would cause it to kind of stall.