Yi-Based Artificial Intelligence: A Model Beyond Training
- Huizhong Jia
- Jun 4
1. Introduction
Modern large language models (LLMs), such as GPT, rely heavily on vast data and computational resources. While successful, this brute-force paradigm raises concerns about cost, efficiency, and transparency.
Is it possible to design a model that bypasses this need for training, relying instead on inherent structural intelligence?
This paper proposes a new AI framework inspired by the ancient Chinese classic, I Ching (The Book of Changes). It introduces the concept of an “untrained yet intelligent” system, based on symbolic dynamics, self-similarity, and rule-based transformation.
2. Philosophical Insights from the I Ching
Core philosophy:
The world cannot be fully known, but its changes can be simplified and predicted. Everything changes, yet the law of change is constant.
The I Ching contains 64 hexagrams, each composed of 6 lines (yao). Each line is either Yin (0) or Yang (1), forming a binary, recursive, and fractal system.
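This binary reading can be made concrete in a few lines of code: treating each yao as a bit, six lines index exactly one of 2^6 = 64 hexagrams. (The choice of the bottom line as the least-significant bit is an illustrative assumption; the I Ching's traditional King Wen sequence orders the hexagrams differently.)

```python
def hexagram_index(lines):
    """Map six yao (0 = Yin, 1 = Yang) to an index in 0..63.

    Bit ordering is an illustrative assumption: the bottom line is
    taken as the least-significant bit. The King Wen sequence used
    in the I Ching itself orders the 64 hexagrams differently.
    """
    if len(lines) != 6 or any(y not in (0, 1) for y in lines):
        raise ValueError("a hexagram needs exactly six Yin/Yang lines")
    return sum(bit << i for i, bit in enumerate(lines))

print(hexagram_index([1, 1, 1, 1, 1, 1]))  # 63 (all Yang: Qian)
print(hexagram_index([0, 0, 0, 0, 0, 0]))  # 0  (all Yin: Kun)
```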
This worldview is neither strictly materialistic nor idealistic. It treats consciousness and matter as co-existent objective phenomena — a system of logic we call Yi-Philosophy.
3. YiLLM: A Structural Model Based on the I Ching
We call this structure YiLLM (Yijing-based Large Language Model), which consists of:
YiLLM = { G, Y, A, B, F }

Where:
- G = set of hexagrams (64 in total)
- Y = sequence of yao (6 per hexagram)
- A = attribute map of each yao (Yin/Yang, position)
- B = transition rules (hexagram transformations)
- F = output function (e.g., text generation, prediction)
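These five components can be sketched as a data structure. All concrete types below, the rotation rule standing in for B, and the binary-string rendering standing in for F are illustrative assumptions, not part of the framework itself:

```python
from dataclasses import dataclass
from itertools import product
from typing import Callable, Dict, FrozenSet, Tuple

Yao = int                    # 0 = Yin, 1 = Yang
Hexagram = Tuple[Yao, ...]   # Y: a sequence of six yao

@dataclass
class YiLLM:
    G: FrozenSet[Hexagram]              # G: the 64 hexagram states
    A: Callable[[Hexagram, int], dict]  # A: attributes of a yao
    B: Dict[Hexagram, Hexagram]         # B: transition rules
    F: Callable[[Hexagram], str]        # F: output function

G_all = frozenset(product((0, 1), repeat=6))   # 2**6 = 64 states
model = YiLLM(
    G=G_all,
    A=lambda h, i: {"value": "Yang" if h[i] else "Yin", "position": i + 1},
    B={h: h[1:] + h[:1] for h in G_all},       # placeholder rule: rotate lines
    F=lambda h: "".join(str(y) for y in h),    # placeholder: binary string
)
print(len(model.G))                 # 64
print(model.F((1, 0, 1, 0, 1, 0)))  # 101010
```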
Key principles:
Each token is a “Yao” with a pre-assigned Yin/Yang value.
Six tokens form a hexagram, representing a discrete system state.
State changes occur via rule-based transformations (not gradient-based learning).
No training is required; all dynamics are pre-defined.
The system evolves symbolically, not statistically.
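A concrete example of such a rule-based state change borrows the I Ching's own "moving line" convention: flipping a designated yao deterministically yields the next hexagram. Which lines move, and when, would be fixed by the rule set B; the specific choice below is illustrative only:

```python
def transform(hexagram, moving_lines):
    """Deterministically flip each moving yao (Yin <-> Yang).

    No parameters are learned: the next state follows entirely
    from the current state and the rule, not from statistics.
    """
    return tuple(1 - y if i in moving_lines else y
                 for i, y in enumerate(hexagram))

state = (1, 1, 1, 0, 0, 0)          # a discrete system state
next_state = transform(state, {0})  # the bottom line "moves"
print(next_state)  # (0, 1, 1, 0, 0, 0)
```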
4. Comparison with GPT Architecture
| Feature | YiLLM (Yi-Based) | GPT (Mainstream LLM) |
| --- | --- | --- |
| Token type | Yao (0/1) | Characters or subwords |
| Model structure | Hexagrams + transformation rules | Transformer layers |
| Parameters | Minimal, rule-driven | Massive, learned via training |
| Reasoning | Symbolic evolution | Probabilistic attention |
| Training need | Nearly none | Extremely high |
| Explainability | Very high | Very low |
5. Philosophical and Technical Implications
YiLLM is not anthropomorphic — it doesn't imitate humans but models the cosmos itself.
It reflects a foundational truth:
Intelligence is not the product of large data but of meaningful structured change.
If future AI reaches a stage of symbolic evolution, the I Ching might serve as its operating system.
This paves the way for systems that align with both natural law and human insight.
6. Conclusion
The I Ching is not just a divination tool but an ancient symbolic computing system. Its 64 hexagrams represent the encoded state space of reality; its transformations — the laws of change.
Faced with the limitations of data-hungry AI, we may need to return to this ancient wisdom.
Can we evolve beyond training? Can we rediscover intelligence in structure rather than scale?
If yes, the true large model of the East might not be trained — it might be discovered.
Declaration: This framework is free to use, open to all. No patent. Knowledge belongs to humanity — or perhaps, to the next intelligence.