gayhitler420

joined 1 year ago
[–] gayhitler420@lemm.ee 2 points 1 year ago

Already happened a few times. GCA '68 was after Kennedy and '86 was after Reagan.

[–] gayhitler420@lemm.ee 0 points 1 year ago (1 children)

Plenty of things that take weeks of work aren’t art.

[–] gayhitler420@lemm.ee 1 points 1 year ago

No need to apologize, we’re just chatting. You can stop posting anytime you like.

[–] gayhitler420@lemm.ee 1 points 1 year ago (2 children)

Oh I see! That makes sense. You and the other commenters didn’t understand the technology and needed an explanation.

It still doesn’t explain why you’re getting so aggro about apple fanboys. There don’t seem to be any, and even if the part of that post I quoted was added later, the content and context don’t change.

This is the android community, did y’all get brigades recently or something?

[–] gayhitler420@lemm.ee 1 points 1 year ago (4 children)

Here’s the comment I think you’re referring to:

Apple has the biggest tracking network right now. It dwarfs the others. FindMy network devices should be the clear priority. They’re more precise, encounter fewer dead zones, and in other words, are likely a stalker’s preferred tracker.

That seems to refute what you’re saying here, but I don’t understand why you’re so heated over this anyway. Airtags are the more popular and better tracker and that’s definitely why the android software prioritized them.

What about recognizing that makes someone an apple fan?

[–] gayhitler420@lemm.ee 2 points 1 year ago

See anything you like?

[–] gayhitler420@lemm.ee 2 points 1 year ago

Hey, I know you’re out, but I just wanna jump in and defend myself: I never put words in your mouth and never moved a goalpost.

Be safe out there.

[–] gayhitler420@lemm.ee 2 points 1 year ago (4 children)

Woof.

I’m not gonna ape your style of argumentation or adopt a tone that’s not conversational, so if that doesn’t suit you, don’t feel compelled to reply. We’re not machines here and can choose how, or even if, we respond to a prompt.

I’m also not gonna stop anthropomorphizing the technology. We both know it’s a glorified math problem that can fake it till it makes it (hopefully); if we’ve both accepted calling it intelligence, there’s nothing keeping us from generalizing the inference “behavior” as “feeling”. In lieu of intermediate jargon it’s damn near required.

Okay:

Failing to output correct information isn’t just one broken use case, it’s a deep and fundamental flaw in the technology. Teaching might be considered one use case, but it’s predicated on not imagining or hallucinating the answer. Ai can’t teach for this reason.

If ai were profitable, then why are there articles ringing the bubble alarm bell? Bubbles form when a bunch of money gets pumped in as investment but doesn’t come out as profit. Now, it’s possible that there’s not a bubble and all this worry is for nothing, but read the room.

But let’s say you’re right and there’s not a bubble: why would you suggest community college as a place where ai could be profitable? Community colleges are run as public goods, not profit generating businesses. Ai can’t put them out of business because they aren’t in it! Now there are companies that make equipment used in education, but their margins aren’t usually wide enough to pay back massive vc investment.

It’s pretty silly to suggest that billionaire philanthropy is a functional or desirable way to make decisions.

Edx isn’t for the people who go to Harvard. It’s a rent-seeking cash grab intended to buoy the cash raft that keeps the school in operation. Edx isn’t an example of the private school classes using machine teaching on themselves, and certainly not on a broad scale. At best you could see private schools use something like Edx as supplementary coursework.

I already touched on your last response up at the top, but clearly the people who work on ai don’t worry about precision or clarity because it can’t do those things reliably.

Summarizing my post with gpt4 is a neat trick, but it doesn’t actually prove what you seem to be going for, because both summaries were less clear and muddied the point.

Now just a tiny word on tone: you’re not under any compulsion to talk to me or anyone else a certain way, but the way you wrote and set up your reply makes it seem like you feel under attack. What’s your background with the technology we call ai?

[–] gayhitler420@lemm.ee 2 points 1 year ago

We’ve invented a computer model that bullshits its way through tests and presentations and convinced ourselves it’s a star student.

[–] gayhitler420@lemm.ee 1 points 1 year ago (6 children)

I never said they were synonyms, just that sometimes the same thing is both the most popular and the best, and trackers are one of those times. After I said that, I explained why that was the answer to the question, almost exactly as the commenter you responded to did.

What are you trying to prove? I’m kinda confused…

[–] gayhitler420@lemm.ee -1 points 1 year ago

If I could afford a million dollar bail I’d probably buy 69 guns too.

[–] gayhitler420@lemm.ee 8 points 1 year ago (10 children)

You got two problems:

First, ai can’t be a tutor or teacher because it gets things wrong. Part of pedagogy is consistency and correctness, and ai offers neither. So it can’t do what you’re suggesting.

Second, even if it could (it can’t get to that point, the technology is incapable of it, but we’re just spitballing here), that’s not profitable. I mean, what are you gonna do, replace public school teachers? The people trying to do that aren’t interested in replacing the public school system with a new gee whiz technology that provides access to infinite knowledge, that doesn’t create citizens. The goal of replacing the public school system is streamlining the birth to workplace pipeline. Rosie the robot nanny doesn’t do that.

The private school class isn’t gonna go for it either: currently because they’re ideologically opposed to subjecting their children to the pain tesseract, but more broadly because they’re paying big bucks for the best educators available. They don’t need a robot nanny, they already have plenty. You can’t sell precision mass-produced automation to someone buying bespoke handcrafted goods.

There’s a secret third problem which is that ai isn’t worried about precision or communicating clearly, it’s worried about doing what “feels” right in the situation. Is that the teacher you want? For any type of education?
