msgraves

joined 10 months ago
[–] msgraves@lemmy.dbzer0.com 4 points 2 days ago (1 children)

She holds some blame, but at the end of the day there are nearly 18 million people who didn't vote for her but did vote for Biden. The only reasonable conclusion to draw is, well, she's black and, even worse, a woman.

(second part is /s)

[–] msgraves@lemmy.dbzer0.com 18 points 3 months ago (2 children)

One of the worst parts of this boom in LLMs is the fact that they can "invade" online spaces and control a narrative. For example, just go on Twitter and scroll to the comments on any Tagesschau (German news outlet) post: it's all right-wing bots and crap. LLMs do have uses, but the big problem is that a bad actor can basically control any narrative with the sheer amount of crap they can output. And OpenAI does nothing, even though they are the biggest provider. It earns them money, after all.

I also can't really think of a good way to combat this. If you verify people using an ID, you basically nuke all semblance of online anonymity. If you use some sort of captcha, it will probably be easily bypassed; it doesn't even need to be tricked. Just pay some human in a country with extremely cheap labour to solve it for your bot. It really sucks.

[–] msgraves@lemmy.dbzer0.com 0 points 3 months ago (3 children)

same energy (and impact) as "X formerly known as Twitter"

[–] msgraves@lemmy.dbzer0.com 7 points 3 months ago

Exactly, this isn't about any sort of AI, this is the old playbook of trying to digitally track images, just with the current label slapped on. Regardless of your opinion on AI, this is a terrible way to solve this.

[–] msgraves@lemmy.dbzer0.com 1 point 5 months ago

An actual example, please, not like your Luddite friend in the other comment.

[–] msgraves@lemmy.dbzer0.com 3 points 5 months ago

disregarding the fact that the model learns and extrapolates from the training data, not copying,

have fun figuring out which model made the image in the first place!

[–] msgraves@lemmy.dbzer0.com 10 points 5 months ago (4 children)

you’re gonna have a bad time restricting software

[–] msgraves@lemmy.dbzer0.com 7 points 6 months ago

I wish that would stop Nintendo.

[–] msgraves@lemmy.dbzer0.com 2 points 7 months ago

Eh, surprisingly they often break less on Arch in my experience.

[–] msgraves@lemmy.dbzer0.com 1 point 7 months ago

Undertale. It's the best game I've ever played, and I can never play it again. This game lives rent-free in my head, in my fanworks, in the music I listen to and make. It's a game that combines technology and art.

[–] msgraves@lemmy.dbzer0.com 6 points 10 months ago

Ok, fair; but do consider the context that the models are open-weight. You can download them and use them for free.

There is a slight catch though, which I'm very annoyed at: it's not actually Apache. It's this weird license where you can use the model commercially until you have 700M monthly active users, at which point you have to request a custom license from Meta. Ok, I kinda understand them not wanting companies like ByteDance or Google using their models just like that, but Mistral releases their model weights under Apache-2.0, so the context should definitely be reconsidered, especially for Llama 3.

It's kind of a thing right now: publishers don't want models trained on their books, "because it breaks copyright", even though the model doesn't actually remember copyrighted passages from the book. Many arguments hinge on the publishers being mad that you can prompt the model to repeat a copyrighted passage, which it can do. IMO this is a bullshit reason.

Anyway, it will be an interesting two years as (hopefully) copyright gets turned inside out :)

[–] msgraves@lemmy.dbzer0.com 10 points 10 months ago (14 children)

Oh no, my copyright!!!! How will the publisher megacorps make a record quarter now??? Think of the shareholders!
