sukhmel

joined 2 years ago
[–] sukhmel@programming.dev 3 points 1 week ago

But people gained the ability to edit photos

[–] sukhmel@programming.dev 3 points 1 week ago (2 children)

I can only find something called ‘crime index’ on Numbeo, and it's 11–83 for all the cities in their DB, with 11 being the safest

[–] sukhmel@programming.dev 1 point 1 week ago

You just need to rearrange the bits so that each bit answers a practical question (rough sketch after the list):

  1. Is it sleep time
  2. Is it time for food
  3. etc.
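
For the literal-minded, a minimal sketch of what that could look like — the flag names and hour thresholds below are mine, purely illustrative:

```python
# Purely illustrative: one bit per practical question.
SLEEP_TIME = 1 << 0  # bit 0: is it sleep time?
FOOD_TIME  = 1 << 1  # bit 1: is it time for food?
WORK_TIME  = 1 << 2  # bit 2: is it time for work?

def classify(hour: int) -> int:
    """Map a 24h hour onto answer-bits (thresholds are made up)."""
    flags = 0
    if hour >= 23 or hour < 7:
        flags |= SLEEP_TIME
    if hour in (8, 13, 19):
        flags |= FOOD_TIME
    if 9 <= hour < 18:
        flags |= WORK_TIME
    return flags

print(bool(classify(13) & FOOD_TIME))   # True: lunchtime
print(bool(classify(3) & SLEEP_TIME))   # True: go to bed
```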
[–] sukhmel@programming.dev 29 points 1 week ago (1 child)

Looks like ethically sourced artisanal lines to me

[–] sukhmel@programming.dev 17 points 1 week ago (1 child)

Or because their agent wiped all the data on production

[–] sukhmel@programming.dev 2 points 1 week ago

I had to keep most of the first letters, and sometimes if all the vowels are removed there's nothing left

But yeah, we need to form a committee and come up with a standard for that
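
In the meantime, a minimal sketch of one candidate standard — the rule (always keep a word's first letter so all-vowel words survive) is just my reading of this thread, nothing ratified:

```python
VOWELS = set("aeiouAEIOU")

def disemvowel(word: str) -> str:
    """Candidate rule: keep the first letter (so all-vowel words
    don't vanish entirely), then drop the remaining vowels."""
    if not word:
        return word
    return word[0] + "".join(c for c in word[1:] if c not in VOWELS)

print(" ".join(disemvowel(w) for w in "you can save a lot of space".split()))
# -> "y cn sv a lt of spc"
# ('of' keeps its 'o' under this rule; the joke version dropped it,
#  which is exactly why a standard is needed)
```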

[–] sukhmel@programming.dev 14 points 2 weeks ago (1 child)

You've transcended programming

[–] sukhmel@programming.dev 8 points 2 weeks ago (3 children)

Yh, y cn sv a lt f spc wtht ths unncssr vwls

[–] sukhmel@programming.dev 18 points 2 weeks ago

I think the point is that Haskell is more of a CS-theoretical language than a practical one, and that anyone who uses it (or any other FP language) has never written a single line of production code (that last statement is even in the meme)

[–] sukhmel@programming.dev 1 point 2 weeks ago

Yeah, I understand that this case doesn't require QA, but in the wild, companies increasingly seem to think that developers are still necessary (for now), but QA surely isn't

It's not even about bad engineers; it's just squeezing productivity as dry as possible, as I see it

[–] sukhmel@programming.dev 3 points 2 weeks ago (2 children)

Yeah, if only QA weren't the first to be ‘replaced’ by AI 😠

[–] sukhmel@programming.dev 3 points 2 weeks ago (4 children)

I have a feeling that their test case is also a bit flawed. Trying to get index_value instead of index value is something I can imagine happening, and asking an LLM to ‘fix this but give no explanation’ is asking for a bad solution.

I think they're still correct in their assumption that the output becomes worse, though
