stsquad

joined 2 years ago
[–] stsquad@lemmy.ml 3 points 1 day ago

I also have a diverter which heats up my hot water tank which saves on gas, especially in the summer.

[–] stsquad@lemmy.ml 4 points 1 day ago

It will be fun watching those users who first make the jump to the new project.

[–] stsquad@lemmy.ml 3 points 1 day ago (2 children)

Export to the grid, for every kWh I export during the day I can afford two kWh overnight.
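That 2:1 ratio just falls out of the relative tariff rates. A minimal sketch of the arithmetic, using made-up placeholder rates (not my actual tariff):

```python
# Hypothetical tariff figures in pence per kWh, purely illustrative.
export_rate_p = 15.0     # paid for each kWh exported during the day
overnight_rate_p = 7.5   # charged for each kWh imported overnight

# How many overnight kWh one exported kWh pays for:
kwh_afforded = export_rate_p / overnight_rate_p
print(kwh_afforded)  # 2.0
```

Any export rate at least double the overnight import rate gives the same result.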

[–] stsquad@lemmy.ml 12 points 1 day ago (2 children)

If it's finding valid vulnerabilities, then it's just another tool, like static analysis, fuzzers and sanitizers. There definitely seems to be a difference in quality compared to the earlier generations that were behind the avalanche of sloppy reports.

[–] stsquad@lemmy.ml 3 points 2 days ago
[–] stsquad@lemmy.ml 1 points 4 days ago (1 children)

They don't have to be. They know what they asked the LLM to do. They know how much they adapted the output. You usually have to work to get the models to spit out significant chunks of memorised text.

[–] stsquad@lemmy.ml 1 points 5 days ago (3 children)

No, that's why the author asserts that with their Signed-off-by. It's what I do if I use any LLM content as the basis of my patches.

[–] stsquad@lemmy.ml 1 points 5 days ago (5 children)

If the 2-10% is just boilerplate syscall-number defines or trivial MIN/MAX macros, then it's just the common way to do things.

[–] stsquad@lemmy.ml 3 points 6 days ago (1 children)

If you are using MakeMKV when ripping, you can override the filename template. So I name them, for example, "Show s01e04+" based on the disc I'm ripping. Then once encoded it's relatively quick to rename the files with the full episode number. I personally use dired in Emacs because a macro makes short work of the renaming, but I'm sure other solutions are possible.
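The renaming step can also be scripted. A rough sketch of the idea, assuming (hypothetically) that MakeMKV has appended `_t00`-style track suffixes to the template name, with the "+" marking the first episode on the disc; actual output naming may differ:

```python
import re

def renumber(names):
    """Given ripped names like 'Show s01e04+_t00.mkv' (the '+' marks
    the first episode on the disc, _tNN the track number), return a
    {old_name: new_name} map with full sequential episode numbers."""
    pattern = re.compile(r"^(.+ s\d+e)(\d+)\+_t(\d+)\.mkv$")
    renames = {}
    for name in sorted(names):
        m = pattern.match(name)
        if m:
            # first episode number plus the track index on the disc
            episode = int(m.group(2)) + int(m.group(3))
            renames[name] = f"{m.group(1)}{episode:02d}.mkv"
    return renames

print(renumber(["Show s01e04+_t00.mkv", "Show s01e04+_t01.mkv"]))
# {'Show s01e04+_t00.mkv': 'Show s01e04.mkv', 'Show s01e04+_t01.mkv': 'Show s01e05.mkv'}
```

A dired macro does the same job interactively, which is handy when the disc order doesn't match the episode order.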

[–] stsquad@lemmy.ml 18 points 1 week ago

My kids are growing up in this environment and they already have an eye for AI slop. I suspect it's the same thing that led to OpenAI's TikSlop "product" getting canned. Once society has gotten over the sugar-rush excitement of new and shiny toys, I suspect the interest will fade and people will crave the connection you get from real art made by real people.

At least I hope that is what will happen. We might have to do something to hold the tech companies accountable for their dopamine trigger machines though.

[–] stsquad@lemmy.ml 3 points 1 week ago (9 children)

Where are you seeing the 2-10% figure?

In my experience code generation is most affected by the local context (i.e. the codebase you are working on). On top of that, a lot of code is purely mechanical; code generally has to have a degree of novelty to be protected by copyright.

[–] stsquad@lemmy.ml 9 points 1 week ago

I was glad to see Niko publish his initial work and look forward to seeing how it's gone.

9
submitted 6 months ago* (last edited 6 months ago) by stsquad@lemmy.ml to c/videos@lemmy.world
 

A fairly deep dive about how you can cherry pick stats to push a narrative.

 

For virtualization there are improvements for VirtIO, vfio and LoongArch CPU hotplug. On the emulation side there are additions for Arm and RISC-V, and even some speed-ups for x86 string ops. On the documentation side a whole bunch of work has been done on the QMP API to make it clearer and more navigable.

 

I was trying to add a Matter device from my phone but it kept saying I needed to install the companion app from the Play Store, even though I was in the companion app (from F-Droid). I've installed the Bluetooth proxy app as well but it made no difference.

Does anyone know what's going on?

 

It always seemed to me that QAnon was some sort of online LARP on 4chan that got out of control and metastasized. It left a trail of broken families and swept into the mainstream, with branding and everything. After the failed predictions of Trump's return to power following Jan 6th, it seems to have fizzled out. Did Q stop posting? Did the adherents just glom onto the next crazy theory? How many followers now disavow the theories of QAnon?

 