
I wholeheartedly disagree. Logic is inherently statistical due to the very nature of empirical sampling, which is the only method we have for verification. We will eventually find that it's classical, non-statistical logic which was the (useful) approximation/hack, and that statistical reasoning is the more "pure" and robust approach.

I went into a little more detail here last week: https://news.ycombinator.com/item?id=42871894

> My personal insight is that "reasoning" is simply the application of a probabilistic reasoning manifold on an input in order to transform it into constrained output that serves the stability or evolution of a system.

> This manifold is constructed via learning a decontextualized pattern space on a given set of inputs. Given the inherent probabilistic nature of sampling, true reasoning is expressed in terms of probabilities, not axioms. It may be possible to discover axioms by locating fixed points or attractors on the manifold, but ultimately you're looking at a probabilistic manifold constructed from your input set.
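To make the "fixed points or attractors" idea a bit more concrete, here's a minimal sketch under toy assumptions: treat the learned manifold as a stochastic map over a small state space and look for a distribution the map leaves unchanged. The 3-state transition matrix and everything else here is an illustrative assumption, not a claim about how such a manifold would actually be built.

    import numpy as np

    # Toy stand-in for a "probabilistic reasoning manifold": a map T that
    # sends one distribution over 3 states to another (row-stochastic matrix).
    T = np.array([
        [0.9, 0.1, 0.0],
        [0.2, 0.7, 0.1],
        [0.0, 0.3, 0.7],
    ])

    # Start from an arbitrary input distribution and iterate the map.
    p = np.array([1.0, 0.0, 0.0])
    for _ in range(1000):
        p = p @ T

    # p is now (approximately) a fixed point: applying T leaves it unchanged.
    # In this toy reading, such attractors are the closest the probabilistic
    # system comes to "axioms": outputs the dynamics keep returning to
    # regardless of the starting input.
    print(p, np.allclose(p, p @ T))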

I've been writing and working on this problem a lot over the last few months and hopefully will have something more formal and actionable to share eventually. Right now I'm at the "okay, this is evident and internally consistent, but what can we actually do with it that other techniques can't already accomplish?" phase that a lot of these metacognitive theories get stuck on.




> Logic is inherently statistical due to the very nature of empirical sampling, which is the only method we have for verification.

What? I'm sorry, but this is ridiculous. You can make plenty of sound logical arguments in an empirical vacuum. This is why we have proof by induction - some things can't be verified by taking samples.
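A standard worked example: the identity 1 + 2 + ... + n = n(n+1)/2 holds for every natural number n. No finite amount of sampling can establish the universal claim, but induction settles it in two steps:

    \text{Base case: } \sum_{i=1}^{1} i = 1 = \tfrac{1 \cdot 2}{2}.
    \text{Inductive step: if } \sum_{i=1}^{k} i = \tfrac{k(k+1)}{2}, \text{ then }
    \sum_{i=1}^{k+1} i = \tfrac{k(k+1)}{2} + (k+1) = \tfrac{(k+1)(k+2)}{2}.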


I'm speaking more about how we assess the relevance of a logical system to the real world. Even if a system is internally self-consistent, its utility depends on whether its premises and conclusions align with what we observe empirically. And because empirical observation is inherently statistical due to sampling and measurement limitations, the very act of verifying a logical system's applicability to reality introduces a statistical element. We just typically ignore this element because some of these systems seem to hold up consistently enough that we can take them for granted.
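As a toy illustration of that statistical element, assuming a uniform prior and independent observations (both assumptions for the sake of the sketch): even if a premise has held in every observation so far, a Bayesian update only ever pushes its probability toward 1, never to 1.

    # Laplace's rule of succession: after N observations in which a premise
    # has held every time, with a uniform Beta(1, 1) prior on its reliability,
    # the posterior probability that it holds on the next observation is
    # (N + 1) / (N + 2): it approaches certainty but never reaches it.
    for N in (10, 1000, 10**6):
        print(N, (N + 1) / (N + 2))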



