this post was submitted on 21 Nov 2023
It depends on the level of abstraction at which you are claiming deterministic behaviour. As stated elsewhere in the thread, at the upper level of qualia it’s hard to say whether something that looks and feels like a decision made with free will is or isn’t one.
Likewise, if you move down to the lower levels of bit patterns, electron flow or quantum events, the behaviour looks non-deterministic to an outside observer.
So only at the absurd level of abstraction that posits that symbols manipulated by executing software are real phenomena could you argue that neural nets are deterministic.
But at what point and to which observer does complexity become indistinguishable from randomness?
It’s a shaky argument to claim determinism based on the perfect functioning of an idealised computer, when we know that in practice abstraction levels bleed into each other, form strange loops, and the Blue Screen of Death is only ever a couple of bit flips away, especially when the sun flares and you’re not using ECC RAM.
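That last point is easy to demonstrate concretely. Here is a minimal Python sketch (the `flip_bit` helper is my own illustration, not a library function) of how flipping a single bit in a 64-bit float, as a stray cosmic ray might in non-ECC RAM, can leave the value nearly unchanged or destroy it entirely, depending on which bit it hits:

```python
import struct

def flip_bit(x: float, bit: int) -> float:
    # Reinterpret the float's IEEE 754 bit pattern as an integer,
    # flip one bit, and reinterpret the result as a float again.
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    (y,) = struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))
    return y

w = 1.0
print(flip_bit(w, 0))   # lowest mantissa bit: 1.0000000000000002
print(flip_bit(w, 62))  # highest exponent bit: inf
```

A flip in the low mantissa bits perturbs a network weight by one part in 2^52; a flip in the exponent turns it into infinity, which then propagates through every subsequent computation. The "ideal computer" abstraction holds only as long as no such event occurs.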