CommanderCloon

joined 1 year ago
[–] CommanderCloon@lemmy.ml 35 points 1 month ago (4 children)

It's not UI backsliding; it's Microsoft being incompetent. I have no idea how they're still in business, and I'm astounded at their valuation. It seems like everything they manage to push out is just barely functioning.

[–] CommanderCloon@lemmy.ml 10 points 2 months ago* (last edited 2 months ago) (1 children)

Also, even if it were permanent, it would still just be something like a `permanently_removed` flag set to TRUE in a database. License keys are probably one of those things no company ever truly deletes from its records.
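
To illustrate, here's a minimal sketch of that kind of "soft delete"; the `license_keys` table and `permanently_removed` column are made-up names, not anyone's actual schema:

```python
import sqlite3

# Hypothetical schema: license keys are never deleted, only flagged.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE license_keys (
        key TEXT PRIMARY KEY,
        permanently_removed INTEGER NOT NULL DEFAULT 0
    )
""")
conn.execute("INSERT INTO license_keys (key) VALUES ('ABCD-1234')")

# "Permanent" removal is just an UPDATE; the row stays in the database.
conn.execute(
    "UPDATE license_keys SET permanently_removed = 1 WHERE key = ?",
    ("ABCD-1234",),
)

# Lookups simply filter the flag out.
row = conn.execute(
    "SELECT key FROM license_keys WHERE permanently_removed = 0"
).fetchone()
print(row)  # None: the key looks gone, but the record still exists
```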

[–] CommanderCloon@lemmy.ml 14 points 2 months ago (1 children)

The policy isn't there just to be extra nice; it's there because otherwise the patient dies without a liver.

Since she was too sick for a partial liver transplant, and not eligible for a full liver transplant from a deceased donor, she would simply have died.

It might seem cruel, but the same is done for a lot of other procedures: if the chance of you dying in surgery is way too high, doctors won't take the risk; they're not executioners.

It's not a moral judgement about her alcoholism; the same would have been true if she'd had a cancer no surgeon would take on.

[–] CommanderCloon@lemmy.ml 106 points 2 months ago (2 children)

A partial liver transplant wasn't viable for someone this sick: if the partial transplant failed, they would have had to fall back on a full transplant from a deceased donor, or she would have died on the operating table.

Since she wasn't eligible for a deceased-donor transplant, a partial transplant was just a death sentence.

[–] CommanderCloon@lemmy.ml 22 points 2 months ago

No. A partial liver transplant wasn't viable for someone this sick: if the partial transplant failed, they would have had to fall back on a full transplant from a deceased donor. But she wasn't eligible, so a partial transplant was just a death sentence.

[–] CommanderCloon@lemmy.ml 8 points 2 months ago* (last edited 2 months ago) (2 children)

"so to get AI generated CSAM... it had to have been fed some amount of CSAM"

No, actually: it can combine concepts that aren't present together in the dataset. Does it know what a child looks like? Does it know what porn looks like? Then it can generate child porn without ever having had CSAM in its dataset. See the corn dog comment as an argument.

Edit: corn dog

[–] CommanderCloon@lemmy.ml 0 points 2 months ago (1 children)

If you hash in the browser, it means you don't salt your hashes. You should absolutely salt your hashes; not doing so makes them very little better than plaintext.
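
For context, a minimal sketch of server-side salted hashing with Python's standard library (PBKDF2 here as an example; bcrypt or argon2 are better choices in practice). The point is that the salt is generated and kept on the server, never shipped to the browser:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    # A fresh random salt per user means identical passwords
    # produce different hashes, defeating precomputed rainbow tables.
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("hunter3", salt, digest))  # False
```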

[–] CommanderCloon@lemmy.ml 0 points 2 months ago (5 children)

Because that means you either don't salt your hashes, or you distribute the salt to the browser so it can hash. Either way, that's bad.

[–] CommanderCloon@lemmy.ml 11 points 2 months ago (2 children)

Sounds like DNS blocking. Use DoH; it won't be as good as a VPN, but it will stop the DNS sniffing that lets them block domains.
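
To illustrate why DoH dodges that kind of sniffing, here's a raw lookup against Cloudflare's public DoH JSON endpoint (any DoH resolver would do; in practice you'd just flip the DoH setting in your browser or OS rather than query it by hand):

```python
import json
import urllib.request

# DNS-over-HTTPS lookup against Cloudflare's public resolver.
# The query travels inside ordinary TLS on port 443, so an ISP
# snooping plaintext port-53 DNS can't see (or filter) it.
url = "https://cloudflare-dns.com/dns-query?name=example.com&type=A"
req = urllib.request.Request(url, headers={"Accept": "application/dns-json"})

with urllib.request.urlopen(req) as resp:
    answer = json.load(resp)

for record in answer.get("Answer", []):
    print(record["name"], record["data"])
```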

[–] CommanderCloon@lemmy.ml 10 points 3 months ago (2 children)

"Art should not represent themes of sexual assault" Wtf are you on

[–] CommanderCloon@lemmy.ml 1 points 3 months ago

That's because "Firefox" (or "Chrome", for that matter) on Apple products is just a reskin of Safari. Apple does not allow third-party browser engines in its App Store.

That's because third-party browser engines might not suck ass, which would allow OWA apps in your browser, which would circumvent Apple's 30% cut on everything. So they kneecap their own browser engine and don't allow any other engine on their devices.

[–] CommanderCloon@lemmy.ml 9 points 3 months ago (2 children)

I'm under 30; I have no idea what Winamp is, but I figured from the skins' pics that it's some music software. I imagine it must have been popular for it to have a museum thing about user-created skins.

(I haven't googled anything yet)
