One way to approach the topic of bypass-thinking and combinatorial research is through a side entry. A useful introductory concept is the extraction of truth from a given constellation of incomplete or incorrect data.
The term truth, as used here, refers to an exact and correct model of reality. This definition will be used throughout the following sections.
Truth can be extracted from a constellation of incomplete or partly incorrect data.
I refer to this procedure as the extraction of truth from the negative space of a constellation of "lies" or half-truths.
This extracted truth is equivalent to the holographic, multi-dimensional consensus of multiple contradictory statements. The more data there is, whether fully incorrect, partly incorrect, fully correct or incomplete, the more visible the holographic consensus becomes as an effect of emergence.
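As a toy illustration of this consensus effect, consider extracting a per-attribute majority from several partly incorrect reports. This is a hedged sketch of my own, not the author's procedure; the reports and attributes are invented:

```python
from collections import Counter

# Four contradictory reports about the same object; three of them are
# partly incorrect, one is fully correct.
reports = [
    {"color": "red",  "size": "big"},    # partly incorrect
    {"color": "red",  "size": "small"},  # partly incorrect
    {"color": "blue", "size": "big"},    # partly incorrect
    {"color": "red",  "size": "big"},    # correct
]

def consensus(reports):
    """Majority vote per attribute: the 'truth' that emerges from the
    constellation of half-truths."""
    keys = {k for r in reports for k in r}
    return {k: Counter(r[k] for r in reports if k in r).most_common(1)[0][0]
            for k in keys}

# The correct model of reality emerges even though no single wrong
# report was discarded by hand:
assert consensus(reports) == {"color": "red", "size": "big"}
```

With more reports, the majority signal grows stronger relative to the individual errors, which mirrors the emergence effect described above.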
The actual difficulty lies in adequately creating the constellation, which starts with grouping according to primary principles. This is the basic form of the concept; it will be worked out in detail below.
This procedure can be carried out using a principle I call bypass-thinking.
The underlying thinking processes can be grouped into logical induction and logical deduction, each of which has an implicit and an explicit approach.
Logical induction is a way of integrating facts or details into a hypothesis or big picture. It is a bottom-up approach, as it generates the overall message by integrating the tiny parts.
It can be described as a principle of focal concentration: a way of convergence.
It is also similar to encryption or compression, which works like a centripetal force.
This system produces inward growth.
Logical deduction is a way of differentiation, as it generates the details out of the big picture; it uses a top-down approach.
It can be described as a principle of focal dispersion: a way of divergence.
It is similar to decryption or expansion, which works like a centrifugal force.
This system produces outward growth.
Parallelism in thought is the primary component of bypass-thinking.
The key concept of bypass-thinking lies in two (level 1) or four (level 2) major thinking processes that run in opposite directions to each other.
First-level bypass-thinking consists of two parallel thought processes. It comes in two variants, namely:
A) Integration: generation of the holistic picture from reductionistic elements in processes of infinitesimal steps; primarily inductive reasoning (bottom-up approach), accompanied by a nested subset of deductive reasoning (top-down approach).
B) Differentiation: extraction of the reductionistic details out of the holistic picture; primarily deductive reasoning, accompanied by nested inductive reasoning.
Second-level bypass-thinking consists of four parallel thought processes, i.e. it combines level-1A and level-1B reasoning.
Furthermore, within this parallelism there exists a kind of simultaneity.
Simultaneity does not mean that every thinking process receives the same amount of focus at all times.
It merely refers to switching the focus at a rapid rate; this rate is itself flexible and thus allows acceleration as well as deceleration.
The focal switching rate is like the rhythm of weaving ideas.
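The flexible switching rate can be sketched as a tiny scheduler. The dwell times and the acceleration rule below are illustrative assumptions of mine, not taken from the text:

```python
def focus_schedule(steps, base_dwell=3, accel=1):
    """Yield which of two parallel processes ("A" or "B") holds the focus
    at each step. The dwell time shrinks as switching accelerates."""
    schedule = []
    proc = "A"
    dwell = base_dwell  # steps spent on one process before switching
    held = 0
    for _ in range(steps):
        schedule.append(proc)
        held += 1
        if held >= dwell:
            proc = "B" if proc == "A" else "A"  # switch focus
            held = 0
            dwell = max(1, dwell - accel)       # acceleration of the rate
    return schedule

# Focus alternates faster and faster, like a rhythm of weaving:
assert focus_schedule(8) == ["A", "A", "A", "B", "B", "A", "B", "A"]
```

A negative `accel` would model deceleration, i.e. increasingly long dwell times on each process.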
Important to consider: details consist of subordinated big pictures, and big pictures are themselves details of superordinate levels.
There exists an equivalence:
Hn = Dn±1 or Dn = Hn∓1 | n ∈ ℕ
where n = level, H = the holistic big picture, D = the details
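The level equivalence can be made concrete with a small sketch. Assuming a simple tree of nodes (the class and labels are hypothetical), the same node counts as H relative to its children and as D relative to its parent:

```python
class Node:
    """A node in a nested hierarchy of ideas."""
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)

# A two-level hierarchy: "theory" is the big picture H1; its children
# are the details D1, which are themselves big pictures H0 of their own
# subordinate parts.
h0_a = Node("observation A")
h0_b = Node("observation B")
h1 = Node("theory", [h0_a, h0_b])

def as_big_picture(node):
    """The node seen as H: the whole that its children compose."""
    return node

def as_detail(node, parent):
    """The very same node seen as D: one part of its parent's whole."""
    assert node in parent.children
    return node

# H0 of "observation A" is identical to D1 within "theory" -- the H/D
# distinction is a matter of perspective, not of the object itself:
assert as_big_picture(h0_a) is as_detail(h0_a, h1)
```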
Integration is a method of connecting the details to form the big picture.
Thought process one wanders into the superordinate levels, i.e. from the details towards the holistic big picture (inductive reasoning).
Thought process two wanders into the subordinate levels, i.e. from the detail (equivalent to the nested big picture) towards the nested details (deductive reasoning).
Thought process one therefore goes from detail level 1 (D1) to big-picture level 1 (H1):
D1 → H1 | because D1 = H0, therefore H0 → H1
Thought process two goes from the subordinate big picture level 0 (H0), which is equivalent to detail level 1 (D1), to the subordinate detail D0:
H0 → D0 | because H0 = D1, therefore D1 → D0
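A minimal sketch of the two reverse-flowing processes, assuming a nested dict as the H/D hierarchy (names and numbers are invented): one process aggregates details upward into the big picture, the other unfolds the same structure downward into its details.

```python
# H1 contains the details D1a/D1b, which in turn contain their own details.
tree = {"H1": {"D1a": {"d0a": 1, "d0b": 2}, "D1b": {"d0c": 3}}}

def integrate_up(node):
    """Inductive direction: combine subordinate details into one whole."""
    if not isinstance(node, dict):
        return node
    return sum(integrate_up(child) for child in node.values())

def differentiate_down(node):
    """Deductive direction: unfold a node into its finest details."""
    if not isinstance(node, dict):
        return [node]
    details = []
    for child in node.values():
        details.extend(differentiate_down(child))
    return details

assert integrate_up(tree["H1"]) == 6             # D -> H direction
assert differentiate_down(tree["H1"]) == [1, 2, 3]  # H -> D direction
```

Running both traversals on the same structure, as the two parallel thought processes do, covers the hierarchy from opposite ends at once.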
Differentiation is a method of decrypting the holistic big picture to gather the details.
Thought process one wanders into the superordinate levels, i.e. from the holistic big picture towards the details (deductive reasoning).
Thought process two wanders into the subordinate levels, i.e. from the holistic big picture (equivalent to the nested detail) towards the nested holistic big picture (inductive reasoning).
Thought process one therefore goes from holistic big-picture level 1 (H1) to detail level 1 (D1):
H1 → D1 | because H1 = D0, therefore D0 → D1
Thought process two goes from the subordinate detail level 0 (D0), which is equivalent to holistic big-picture level 1 (H1), to the subordinate holistic big picture H0:
D0 → H0 | because D0 = H1, therefore H1 → H0
This thinking process is a mixture of two parallel level-1 bypass methods and can be regarded as a form of quadrupolar thinking.
This quadrupolar bypass-thinking consists of implicit and explicit induction and deduction.
Each of these reasoning modes has indirect symmetries with the others.
Often these heavily intertwined thought processes interact and can create a neat interweaving of ideas. In practice, however, the parallel thought processes sometimes knot unluckily, and the entire thought construction tends to become a dysfunctional mess.
To intervene against this unwanted knotting, one can introduce special recycling loops, called error-detecting loops, which will be discussed in the following section.
A helical motion pattern emerges in bypass-thinking.
Error-detecting loops, as the name suggests, detect errors in one's own thinking processes. Because these loops are adaptable, that is, able to learn, and therefore flexible in both structure and process, they need nested error-detecting loops of their own, since they too can get "infected by a kind of Trojan virus".
The more error-detecting loops are nested into each other, the more stable the system will be.
These error-detecting loops relate to the level they operate on: the most superordinate loop is responsible for the main frame, while the loop one level below is responsible for the first layer of details.
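A hedged sketch of such nested error-detecting loops, assuming each loop checks its own layer and then delegates to the loop one level below; the checker functions are hypothetical placeholders:

```python
def make_loop(check, inner=None):
    """Build an error-detecting loop from a level-specific check,
    optionally wrapping a nested loop for the next-finer level."""
    def loop(structure):
        errors = check(structure)       # detect errors on this level
        if inner is not None:
            errors += inner(structure)  # delegate to the nested loop
        return errors
    return loop

# Level-1 loop guards the main frame; level-0 loop guards the details.
frame_check = lambda s: [] if "frame" in s else ["missing frame"]
detail_check = lambda s: [] if s.get("details") else ["missing details"]

detail_loop = make_loop(detail_check)
frame_loop = make_loop(frame_check, inner=detail_loop)

assert frame_loop({"frame": True, "details": [1]}) == []
assert frame_loop({"details": []}) == ["missing frame", "missing details"]
```

Deeper nesting is obtained by wrapping further loops via `inner`, which mirrors the claim that more nesting yields more stability.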
This can be illustrated by generating a regular fractal such as the Sierpinski triangle or the Koch curve, starting with the biggest shape and then generating ever finer parts.
Each iteration is hence a creation of the error-detecting loop of the same level and of all preceding levels.
Dn = Σ Lk! for k = 1…n | n ∈ ℕ (n = iteration level, D = amount of details, L = error-detecting loops). Meaning: the amount of detail at iteration level n is given by the factorials of the error-detecting loops of level n and of all preceding levels.
The more error-detecting loops are nested into each other, the more details can be generated.
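Assuming the (reconstructed, hence uncertain) reading Dn = Σ Lk! of the formula above, together with the Koch curve named as the fractal example, the growth of detail per iteration can be sketched as:

```python
from math import factorial

def koch_segments(n):
    """Number of line segments of the Koch curve after n iterations:
    each iteration replaces every segment with 4 smaller ones."""
    return 4 ** n

def detail_amount(loops_per_level):
    """Assumed reading of Dn: sum of L! over all nesting levels k = 1..n,
    where loops_per_level[k-1] is the loop count of level k."""
    return sum(factorial(L) for L in loops_per_level)

assert koch_segments(3) == 64
assert detail_amount([1, 2, 3]) == 1 + 2 + 6  # deeper nesting, more detail
```

Both functions illustrate the same qualitative point: each added level multiplies the amount of recoverable detail.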
⇦Go to previous chapter "Axiomatic networks and non-binary logic"
↫↬Return to chapter overview