Paraconsistent Consensus
Standard federated averaging destroys the most valuable signal in distributed learning: genuine disagreement between nodes. This paper introduces a paraconsistent aggregation function built on Belnap's FOUR-valued logic, in which every parameter can be true, false, both, or neither, together with a BlockDAG checkpoint scheme that synchronizes model state across the network. When nodes contradict each other, the contradiction itself becomes computable information, letting the network distinguish frontier knowledge from noise.
Key Contributions
- Belnap four-valued logic applied to federated learning
- Contradiction-preserving aggregation functions
- BlockDAG checkpoint-based model synchronization
Explainers
What is Belnap logic?
A four-valued logic system developed by Nuel Belnap where propositions can be true, false, both true and false simultaneously, or neither. Unlike classical logic, it does not collapse under contradiction — it reasons through it, making it uniquely suited for environments where information sources legitimately conflict.
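The four values can be modeled as sets of classical evidence, with contradiction arising naturally when evidence for truth and falsity is combined. Below is a minimal sketch of this idea; the names `Four` and `join` are illustrative, not taken from the paper:

```python
from enum import Enum

class Four(Enum):
    """Belnap's four truth values, encoded as sets of classical evidence."""
    NEITHER = frozenset()             # no evidence either way
    TRUE = frozenset({True})          # evidence for truth only
    FALSE = frozenset({False})        # evidence for falsity only
    BOTH = frozenset({True, False})   # conflicting evidence

def join(a: Four, b: Four) -> Four:
    """Knowledge-order join: accumulate evidence from two sources.

    Combining TRUE with FALSE yields BOTH rather than an error,
    which is what makes the logic paraconsistent.
    """
    return Four(a.value | b.value)
```

For example, `join(Four.TRUE, Four.FALSE)` evaluates to `Four.BOTH`: the contradiction is recorded as a value instead of collapsing the system.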
Why preserve contradiction?
When federated nodes disagree on a learned parameter, standard averaging produces a value no node actually believes. By preserving the disagreement as a BOTH value, the network retains the full information landscape and can identify where models are exploring frontier knowledge versus where consensus has already formed.
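One way to make this concrete: map each node's reported parameter update to a sign verdict and combine the verdicts four-valuedly instead of averaging. The function name `aggregate` and the threshold `tau` below are hypothetical choices for illustration, not the paper's actual operator:

```python
def aggregate(votes, tau=0.1):
    """Four-valued verdict for one parameter across federated nodes.

    votes: per-node parameter updates (floats).
    tau: significance threshold below which an update counts as no signal
         (illustrative parameter, not from the paper).
    """
    pos = any(v > tau for v in votes)    # some node pushes the parameter up
    neg = any(v < -tau for v in votes)   # some node pushes it down
    if pos and neg:
        return "BOTH"      # genuine disagreement is preserved, not averaged away
    if pos:
        return "TRUE"
    if neg:
        return "FALSE"
    return "NEITHER"       # no node reports a significant signal
```

Where plain averaging of `[0.5, -0.4]` would yield `0.05`, a value no node believes, this sketch returns `"BOTH"`, flagging the parameter as a site of active disagreement.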
How do BlockDAG checkpoints fit in?
The DAG's natural partial ordering provides synchronization points where model states are aggregated without requiring global consensus on every update. Checkpoints capture the four-valued state of every parameter at that moment, creating an auditable history of how model agreement evolves over time.
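A checkpoint in this scheme can be sketched as a content-addressed record that points at its parent checkpoints (giving the DAG its partial order) and stores the four-valued verdict for each parameter. The `Checkpoint` class below is an assumed shape for illustration, not the paper's data structure:

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass(frozen=True)
class Checkpoint:
    """One synchronization point in the BlockDAG."""
    parents: tuple   # digests of parent checkpoints (the DAG edges)
    states: dict     # parameter name -> four-valued verdict at this point

    def digest(self) -> str:
        # Deterministic hash over parents and states, so the chain of
        # checkpoints forms an auditable history of model agreement.
        payload = json.dumps(
            {"parents": list(self.parents), "states": self.states},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

# A genesis checkpoint and a child that records later disagreement on "w0".
genesis = Checkpoint(parents=(), states={"w0": "TRUE"})
child = Checkpoint(parents=(genesis.digest(),), states={"w0": "BOTH"})
```

Because each checkpoint names its parents by hash, an auditor can replay the DAG and watch a parameter move from `TRUE` to `BOTH` as frontier disagreement emerges, without any node having required global consensus on every intermediate update.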