this post was submitted on 10 Jun 2023
77 points (100.0% liked)

Technology

37745 readers
480 users here now

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago

One of Spez's answers in the infamous Reddit AMA struck me

Two things happened at the same time: the LLM explosion put all Reddit data use at the forefront, and our continuing efforts to rein in costs...

I am beginning to think all they wanted to do was get their share of the AI pie, since we know Reddit's data is one of the major datasets for training conversational models. But they are such a bunch of bumbling fools, as well as being chronically understaffed, that the whole thing exploded in their faces. At this stage their only chance of survival may well be to be bought out by OpenAI...

[–] j4k3@lemmy.world 6 points 1 year ago (1 children)

The value of LLMs has shifted drastically in favor of open source since the Meta weights leak. The proprietary model looks pretty much wrecked now, at least as far as I understand the leaked internal memo from a Google researcher last month.

https://www.semianalysis.com/p/google-we-have-no-moat-and-neither

[–] gotofritz@beehaw.org 2 points 1 year ago (2 children)

Oh, I'm not saying they are doing the right thing or that it was the correct decision. I'm just speculating whether LLMs are what kicked off the whole thing.

[–] j4k3@lemmy.world 1 points 1 year ago

I'm saying the premise that LLMs have anything to do with it is either an incompetent failure to keep up with LLM developments, or a pack of lies.

[–] gotofritz@beehaw.org 2 points 1 year ago* (last edited 1 year ago)

I disagree; it's still too early, and a bit presumptuous, to make such conclusive statements.