this post was submitted on 15 Feb 2024
429 points (95.5% liked)
Technology
In my experience as a game designer, the code that LLMs spit out is pretty shit. It won't even compile half the time, and when it does, it won't do what you want without significant changes.
The correct usage of LLMs in coding, imo, is one use case at a time, building up to what you need from scratch. It requires skill: talking to the AI so it gives you what you want, knowing how to build up incrementally, reading the code it spits out so you notice when it goes south, and actually knowing how to assemble the bigger-picture software from little pieces. But if you are an intermediate dev who is stuck on something, it is a great help.
That, or for rubber-duck debugging; it's also great at that.
That sounds like more effort than just... writing the code.
It's situationally useful.
ChatGPT once insisted my JSON was actually YAML.
Technically it is, but I agree that is imprecise and nobody would say so IRL. Unless they are being a pedantic nerd, like I am right now.
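For context on the pedantry: YAML 1.2 was designed so that any valid JSON document is also valid YAML. A minimal sketch, assuming the third-party PyYAML library is available (the fallback keeps it runnable without it); the sample document is hypothetical:

```python
import json

# A JSON document is also a valid YAML 1.2 document, so a YAML parser
# will happily load it -- which is what ChatGPT was (pedantically) claiming.
doc = '{"name": "example", "tags": ["a", "b"]}'

as_json = json.loads(doc)

try:
    import yaml  # PyYAML, third-party; may not be installed
    # Both parsers produce the same Python structure for this input.
    assert yaml.safe_load(doc) == as_json
except ImportError:
    pass
```

The reverse does not hold: most YAML (block syntax, anchors, comments) is not valid JSON, which is why calling a JSON file "YAML" is technically true but practically useless.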
You should refine your thoughts more instead of dumping a stream of consciousness on people.
Essentially what this stream of consciousness boils down to is "Wouldn't it be neat if AI generated all the content in the game you are playing on the fly?" Would it be neat? I guess so, but I find it incredibly unappealing, much like how AI art, stories and now video are unappealing. There's no creativity involved. There's no meaning to any of it. Sentient AI could probably be creative, but what people like you who get overly excited about this stuff don't seem to understand is how fundamentally limited our AI actually is right now. LLMs are among the most advanced AI systems we have, and yet all they do is predict text. They have no knowledge, no capacity for learning. They're very advanced autocorrect.
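To make "all it does is predict text" concrete, here's a toy sketch of next-token prediction: a bigram counter over a hypothetical made-up corpus that always emits the most frequent continuation. Real LLMs use learned neural networks over subword tokens rather than word counts, but the inference loop is the same idea: pick a likely next token, append it, repeat.

```python
from collections import Counter, defaultdict

# Hypothetical training corpus for illustration only.
corpus = "the cat sat on the mat the cat ran".split()

# Count which word follows which (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(word):
    # Emit the single most frequent continuation seen in the corpus.
    # No understanding, no world model -- just frequency statistics.
    return following[word].most_common(1)[0][0]

print(predict("the"))  # "cat" ("the cat" appears twice, "the mat" once)
```

The model never "knows" what a cat is; it only knows that "cat" followed "the" more often than anything else, which is the autocorrect-on-steroids point in miniature.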
We've seen this kind of hype with crypto, with NFTs, and with Metaverse bullshit. You should take a step back and understand what we currently have, and how incredibly far away the stuff that has you excited actually is.
I don't mean to be dismissive of your entire train of thought (I can't follow a lot of it, probably because I'm not a dev and not familiar with a lot of the concepts you're talking about) but all the things you've described that I can understand would require these tools to be a fuckload better, on an order we haven't even begun to get close to yet, in order to not be super predictable.
It's all wonderful in theory, but we're not even close to what would be needed to even half-ass this stuff.