This post was submitted on 18 Mar 2024
35 points (84.3% liked)

Technology

top 6 comments
[–] CyberSeeker@discuss.tchncs.de 18 points 8 months ago (2 children)

Who cares if the code is open source, or the pre-training weights are released? Virtually every CS Master's student in 2024 is building this from scratch. The differentiator is the training dataset, or at worst, the weights after fine-tuning the model.
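To the commenter's point that the architecture itself is commodity knowledge: a bare-bones decoder block really is only a few dozen lines. The PyTorch sketch below is purely illustrative; the dimensions and layer choices are placeholders and not Grok-1's actual configuration.

```python
# Minimal decoder-style transformer block in PyTorch, to illustrate that the
# architecture itself is textbook material. Dimensions here are arbitrary
# placeholders, NOT Grok-1's actual configuration.
import torch
import torch.nn as nn

class DecoderBlock(nn.Module):
    def __init__(self, d_model: int = 512, n_heads: int = 8, d_ff: int = 2048):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
        self.ln1 = nn.LayerNorm(d_model)
        self.ln2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Causal mask: True entries are positions a token is NOT allowed to attend to.
        seq_len = x.size(1)
        mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool, device=x.device), diagonal=1)
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
        x = x + attn_out                 # residual connection around attention
        x = x + self.ff(self.ln2(x))     # residual connection around the feed-forward layer
        return x
```

The hard part, as the comment says, is not this code but the data and compute behind the released weights.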

[–] General_Effort@lemmy.world 4 points 8 months ago

Training something like this costs millions. It's not going to be useful to many people because of its size and the cost of doing anything with it.

I don't think that giving alms justified the existence of feudal lords. In the same way, I think the fact that nation states can't keep up with the research expenditure of a few rich men (think space travel) shows us that we have a problem. That said, it does represent a fairly generous donation to science, start-ups, or whatever.

[–] BetaDoggo_@lemmy.world 17 points 8 months ago

Its size makes it basically useless. It underperforms models even in its active weight class. It's nice that it's available, but Grok-0 would have been far more interesting.
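For context on "active weight class": Grok-1 is a mixture-of-experts model, so only the routed experts run for any given token and the active parameter count is much smaller than the headline total. The back-of-the-envelope sketch below uses a guessed shared/per-expert split for illustration, not xAI's published breakdown.

```python
# Back-of-the-envelope: total vs. active parameters in a mixture-of-experts model.
# Only k_active of n_experts experts run per token, so active << total.
# The shared/per-expert split below is a guessed illustration, not Grok-1's real breakdown.
def param_counts(shared_b: float, per_expert_b: float, n_experts: int, k_active: int) -> tuple[float, float]:
    """Return (total, active) parameter counts in billions."""
    total = shared_b + per_expert_b * n_experts
    active = shared_b + per_expert_b * k_active
    return total, active

# Hypothetical split: 10B shared (attention, embeddings) plus 8 experts of 38B each,
# with 2 experts routed per token.
total_b, active_b = param_counts(shared_b=10, per_expert_b=38, n_experts=8, k_active=2)
print(f"total ~ {total_b:.0f}B params, active ~ {active_b:.0f}B per token")
```

So even though the checkpoint is enormous to store, it competes (as the comment notes) with much smaller dense models in terms of compute per token.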

[–] autotldr@lemmings.world 4 points 8 months ago

This is the best summary I could come up with:


It is not fine-tuned for applications such as natural language dialog, and represents the raw base model checkpoint from the pre-training phase, which concluded in October 2023.

Grok will be familiar to users of Musk's social media platform, X, and subscribers have been able to ask the chatbot questions and receive answers.

If a user flicks through a dog-eared copy of The Hitchhiker's Guide to the Galaxy radio scripts, the following definition can be found lurking in Fit the Tenth: "The Hitchhiker's Guide to the Galaxy is an indispensable companion to all those who are keen to make sense of life in an infinitely complex and confusing universe, for though it cannot hope to be useful or informative on all matters, it does make the reassuring claim that where it is inaccurate, it is at least definitively inaccurate."

The release comes on the first anniversary of the launch of OpenAI's GPT-4 model, and Musk's legal spat with his former AI pals remains in the background.

OpenAI responded by releasing a trove of emails, claiming Musk was fully aware of its plans and wanted it folded into Tesla.

By opening up the weights behind Grok-1, Musk is attempting to plant a flag in the opposite camp to the proprietary world of OpenAI.


The original article contains 639 words, the summary contains 210 words. Saved 67%. I'm a bot and I'm open source!