bamboo

joined 1 year ago
[–] bamboo@lemm.ee -2 points 6 months ago

As many as they can possibly require so that this doesn’t finish before Trump is inaugurated next year. Then he can kill it and the democrats can recycle the campaign material in 2028.

[–] bamboo@lemm.ee 3 points 6 months ago

Being democratic and inclusive is pointless if it prevents progress. People want good, affordable healthcare more than they care about whether it was achieved with Republicans at the table too. The democratic process is a means to an end; it’s not sacred and should be disposed of when it can’t work.

[–] bamboo@lemm.ee 2 points 6 months ago (1 children)

The point is that it’s always one person’s fault, and that everyone else was certainly a good guy. The blame is pinned on a single individual and not on the party that failed as a whole.

[–] bamboo@lemm.ee 8 points 6 months ago (2 children)

I can understand why a project might want to do this until the law is fully implemented and tested in court, but I can tell most of the people in this thread haven’t actually figured out how to use LLMs productively. They’re not about to replace software engineers, but as a software engineer, tools like GitHub Copilot and ChatGPT are excellent at speeding up a workflow. ChatGPT, for example, is an excellent search engine that can give you a quick understanding of a topic. It’ll generate small amounts of code more quickly than I could write it by hand. Of course I’m still going to review that code to ensure it’s of the same quality that hand-written code would be, but overall this is still a much faster process.

The Luddites who hate on LLMs would have complained about the first compilers too, because they could write marginally faster assembly by hand.
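
For a concrete sense of that workflow, here’s a minimal sketch using the OpenAI Python SDK; the model name and prompt are just placeholders, not a recommendation:

```python
# Ask the model to draft a small, well-scoped utility, then review the result
# like any other code before it goes anywhere near a PR.
# Assumes the `openai` package (v1.x) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[
        {"role": "system", "content": "You are a senior Python engineer."},
        {"role": "user", "content": "Write a function that parses an ISO 8601 "
                                    "timestamp string and returns a UTC datetime."},
    ],
)

draft = response.choices[0].message.content
print(draft)  # the human review step still happens after this
```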

[–] bamboo@lemm.ee 1 points 6 months ago

Llama 2 70B can run on a specced-out current-gen MacBook Pro. Not cheap hardware in any sense, but it isn’t a large data center cluster.
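
If you’re curious what that looks like in practice, here’s a rough sketch with the llama-cpp-python bindings and a quantized GGUF build of the model (the filename and settings are assumptions; adjust to whatever you downloaded):

```python
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-70b-chat.Q4_K_M.gguf",  # ~40 GB of quantized weights
    n_gpu_layers=-1,  # offload every layer to the GPU via Metal
    n_ctx=4096,       # context window
)

out = llm("Q: What is unified memory? A:", max_tokens=128)
print(out["choices"][0]["text"])
```

At 4-bit quantization the weights alone are around 40 GB, which is why you need one of the higher-memory configurations.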

[–] bamboo@lemm.ee 1 points 6 months ago (1 children)

In this case I was referring to bandwidth and latency, which on-package memory helps with. It does make a difference in memory-intensive applications, but the majority of people would never notice. Also, Apple will absolutely give you a ton of memory; you just have to pay for it. They offer 128GB on the MacBook Pro, and it’s unified, so the GPU has full access to it, which makes it surprisingly good for running LLMs locally, for example.
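
As a sketch of why that matters, PyTorch’s MPS backend will hand the GPU a tensor far bigger than most discrete laptop GPUs’ VRAM, because it’s all one memory pool (the size here is illustrative, and MPS does enforce a configurable allocation cap):

```python
import torch

assert torch.backends.mps.is_available(), "requires Apple Silicon and a recent PyTorch"
device = torch.device("mps")

# ~16 GiB of float32 on the GPU: more than most discrete laptop GPUs carry,
# but only a fraction of a 128GB unified-memory machine.
x = torch.empty(4 * 1024**3, dtype=torch.float32, device=device)
print(f"{x.element_size() * x.nelement() / 2**30:.0f} GiB on {x.device}")
```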

[–] bamboo@lemm.ee 4 points 6 months ago

Also advertising a low rent and then tacking on sometimes hundreds of dollars in “admin fees” or similar. The advertised price should match, to the penny, the amount of the monthly check.

[–] bamboo@lemm.ee 4 points 6 months ago

You misunderstand. This is protectionism, plain and simple. US car companies are horribly inefficient. Worse yet, the US car cartel eliminated most of their budget models to push trucks and SUVs that are more expensive. It doesn’t take much to undercut them, so the US government is banning the competition.

[–] bamboo@lemm.ee 5 points 6 months ago

To their credit, Safari’s extension support on iOS is reasonably good. Not Firefox good, but compared to Chrome it’s excellent!

[–] bamboo@lemm.ee 25 points 6 months ago

Which parts of Firefox are proprietary?

[–] bamboo@lemm.ee 4 points 6 months ago

It’s work, I don’t get much of a choice here. I do get paid for the hassle though.

[–] bamboo@lemm.ee 1 points 6 months ago

Debian or Ubuntu are usually the best choice if you depend on glibc. Alpine is definitely more compact but musl isn’t always an option.
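
If you want to check which libc a given Python build is linked against before committing to a base image, the standard library can tell you (musl detection is spotty, so treat an empty result as “probably musl”):

```python
import platform

# On Debian/Ubuntu this typically reports ('glibc', '2.xx'); on Alpine/musl
# it usually comes back empty, since libc_ver() only knows how to spot glibc.
libc, version = platform.libc_ver()
print(libc or "non-glibc (likely musl)", version or "unknown")
```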
