anarchoilluminati

joined 1 year ago
[–] anarchoilluminati@hexbear.net 2 points 1 day ago (1 children)

Liberalism.

Liberals are welcome here!

Unironically, Hexbear.

[–] anarchoilluminati@hexbear.net 23 points 1 month ago (2 children)

Then: Don't download applications or run executables you don't fully trust.

Now: Download everyone's snazzy new app just because, and scan everything with the phone that contains all your most private information so you can unlock a surprise!

Oh, that would be a good rewatch.

I rewatched Reservoir Dogs last year or so for the first time in like 15 years and it was amazing.

I'm a bit sick so I comfort watched Rush Hour 1 and 2 last night.

Some aspects didn't age great, but they're still really fun movies by '90s sensibilities.

I do wish Pop!_OS had a better hibernate/sleep mode.

My computer just stays on until I turn it off or the battery dies.

Your crush leaving you for someone else.

Ikiru. Harakiri. The Human Condition. Mishima: A Life in Four Chapters.

Kumite!

Loved his movies as a kid. Glad to hear he cleaned up.

I was also thinking I'd love to hear some adagios being sung.

[–] anarchoilluminati@hexbear.net 4 points 1 month ago (1 children)

Parking enforcement?

[–] anarchoilluminati@hexbear.net 13 points 1 month ago

But, ironically, the Chinese Room Argument you're bringing up supports what others here are saying: that LLMs do not 'understand' anything.

It seems to me that you're defining 'understanding' in functionalist terms, so that matching inputs to outputs counts as understanding and the measurable process itself can be said to 'understand'. But that's not what Searle, or seemingly the others here, mean by 'understanding'. As Searle argues, what's in question is not purely the syntactic manipulation but the semantics. In other words, LLMs do not "know" the information they provide; they are just repeating it based on the input/output process they were programmed with. LLMs do not project or internalize any meaning onto that process. If they had some reflexive consciousness and any 'understanding', they could critically approach the meaning of the information and assess its validity against the facts, rather than just naïvely proclaiming that cockroaches got their name because they like to crawl into penises at night. Do you believe LLMs are conscious?
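For what it's worth, Searle's point is easy to make concrete. Here's a minimal, purely illustrative Python sketch (my own toy example, not anything from the thread): a rule-book program that returns the "right" Chinese output for a Chinese input by pure symbol lookup. The rule book and phrases are invented for the example; nothing in the program represents what any of the symbols mean.

```python
# A toy "Chinese Room": fluent-looking answers produced by pure
# symbol matching. No step anywhere in this program involves the
# meaning of the symbols being shuffled around.

RULE_BOOK = {
    "你好吗?": "我很好, 谢谢.",          # "How are you?" -> "I'm fine, thanks."
    "今天天气怎么样?": "今天天气很好.",    # "How's the weather?" -> "It's nice today."
}

def room(symbols: str) -> str:
    """Look the input string up in the rule book and return the
    paired output string. Correct behavior, zero comprehension."""
    return RULE_BOOK.get(symbols, "对不起, 我不明白.")  # "Sorry, I don't understand."

if __name__ == "__main__":
    print(room("你好吗?"))  # Prints a fluent reply the program never "understood".
```

From the outside, the room passes the input/output test; on the inside there is only syntax. That's the gap between functional behavior and semantic understanding the argument is pointing at.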
