venustrapsflies

joined 11 months ago
[–] venustrapsflies@alien.top 1 points 9 months ago (1 children)

Real brains aren't perceptrons. They don't learn by back-propagation or by evaluating performance on a training set. They're not mathematical models, or even mathematical functions in any reasonable sense. This is a "god of the gaps" scenario, wherein there are a lot of things we don't understand about how real brains work, and people jump to fill in the gap with something they do understand (e.g. ML models).
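For concreteness, here is a toy sketch of the learning procedure being contrasted with real brains: a tiny two-layer perceptron trained by back-propagation to fit a fixed training set (XOR). Everything in it (layer sizes, learning rate, the numpy implementation) is purely illustrative, not taken from any particular model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: XOR of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.], [1.], [1.], [0.]])

# Parameters of a 2-8-1 network (illustrative sizes).
W1 = rng.normal(size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)
lr = 1.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(10000):
    # Forward pass: evaluate the model's performance on the training set.
    h = sigmoid(X @ W1 + b1)        # hidden activations, shape (4, 8)
    p = sigmoid(h @ W2 + b2)        # predictions, shape (4, 1)

    # Backward pass: propagate the loss gradient back through the layers.
    d2 = (p - y) / len(y)           # dLoss/dlogit for binary cross-entropy
    dW2, db2 = h.T @ d2, d2.sum(axis=0)
    d1 = (d2 @ W2.T) * h * (1 - h)  # chain rule through the sigmoid hidden layer
    dW1, db1 = X.T @ d1, d1.sum(axis=0)

    # Gradient descent step on the training loss.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(p).ravel())  # typically converges to [0. 1. 1. 0.]
```

Whatever neurons are doing, it isn't this: there's no fixed labeled dataset, no global loss being differentiated, and no weight-transport of error signals backwards through layers.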

[–] venustrapsflies@alien.top 1 points 10 months ago (11 children)

It’s not a fallacy at all. It is just statistics, combined with some very useful inductive biases. The fallacy is trying to smuggle some extra magic into the description of what it is.

Actual AGI would be able to explain something that no human has understood before. We aren’t really close to that at all. Falling back on “___ may not be AGI yet, but…” is a lot like saying “rocket ships may not be FTL yet, but…”

[–] venustrapsflies@alien.top 1 points 10 months ago

I came in here to say the Cowboys are worse than their record, and I'm only half-joking. They have yet to beat a good team, and they lost to the Cardinals.

Admittedly, they've looked good at times, blown out some of those bad teams (ahem), and played the Eagles close. But the record itself is a pretty lightweight 6-3.