this post was submitted on 25 Nov 2023
Machine Learning

you are viewing a single comment's thread
[–] ILikeCutePuppies@alien.top 1 points 10 months ago (8 children)

I think we'll get better models by having LLMs filter lower-quality data out of the training set, and also by using more machine-generated data, particularly in areas like code, where an AI can run billions of experiments and use the successes to better train the LLM. All of this is going to cost a lot more compute.

i.e., for coding: the LLM proposes an experiment, the experiment is run, and it keeps trying until it succeeds. Good results are fed back into LLM training, and it is penalized for bad results. Learning to code has actually seemed to help LLMs reason better in other ways, so improving that should help significantly. At some point, if its coding is good enough, it might be able to write its own, better LLM system.
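The propose-run-feedback loop described above can be sketched roughly as follows. This is a hypothetical toy illustration, not any real training system: `propose_candidates` stands in for an LLM sampling code, candidates are executed against tests, and each attempt is labeled +1 or -1 as training signal.

```python
import random

def propose_candidates(n=5):
    """Stand-in for an LLM sampling n candidate solutions.
    Here: random guesses at a one-line function body, for illustration."""
    ops = ["a + b", "a - b", "a * b"]
    return [f"def solve(a, b):\n    return {random.choice(ops)}" for _ in range(n)]

def run_candidate(code, tests):
    """Execute a candidate and check it against unit tests (the 'experiment')."""
    namespace = {}
    try:
        exec(code, namespace)
        return all(namespace["solve"](a, b) == out for a, b, out in tests)
    except Exception:
        return False

def collect_feedback(tests, rounds=20):
    """Keep sampling until a candidate passes; label every attempt
    +1 (reward) or -1 (penalty) for later fine-tuning."""
    labeled = []
    for _ in range(rounds):
        for code in propose_candidates():
            passed = run_candidate(code, tests)
            labeled.append((code, 1 if passed else -1))
            if passed:
                return labeled  # success found: return the training signal
    return labeled

random.seed(0)
data = collect_feedback(tests=[(1, 2, 3), (2, 5, 7)])
```

In a real system the labeled attempts would feed a reinforcement-learning or fine-tuning step that updates the model, closing the loop; that step is omitted here.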

[–] sergeyzenchenko@alien.top 0 points 10 months ago (5 children)

Coding is not important for making a better LLM; it's all about math.

[–] ILikeCutePuppies@alien.top 1 points 10 months ago (1 children)

There was a paper, which I can't find at the moment, saying that in the early stages of GPT, adding code to its training data made it better at reasoning. I think math might help in some other ways, but code can be used to solve math problems, and to do more than math in any case.

[–] farmingvillein@alien.top 1 points 10 months ago (1 children)

I think OP is responding to (without commenting on correctness...)

At some point, if coding is good enough, it might be able to write its own better LLM system.

[–] sergeyzenchenko@alien.top 1 points 10 months ago

Yes, it’s all about math problems; code is just a tool to express them.
