Luiz de Carvalho
Recife – PE 2007
MINISTÉRIO DA EDUCAÇÃO
UNIVERSIDADE FEDERAL RURAL DE PERNAMBUCO
Category Theory Language
2 Logic of transdisciplinarity/complexity
Conclusion
Presentation: a brief historical-scientific introduction
Logic is the practical study of valid arguments. For centuries, following the Elements, the geometric model of theory prevailed, until the emergence of the problems of completeness and motion, linked to the fifth postulate (that, if a straight line falling on two straight lines makes the interior angles on the same side less than two right angles, the two straight lines, if produced indefinitely, meet on that side on which the angles are less than the two right angles) and to the congruence axiom, that is:
For many years, two tendencies tried to solve these problems: adding further axioms to complete geometry, or showing that this attempt would be impossible. Archimedes stood for the first tendency, and others for different versions of the fifth postulate (given any straight line and a point not on it, there exists one and only one straight line that passes through that point). Nonetheless, Lobachevsky, Gauss, and Bolyai argued that geometry does not represent physical reality, and that its postulates and theorems are not necessarily true in the physical world.
In this way, the evolution of the deductive axiomatic system left the Aristotelian grammatical and discursive scope to present itself as an algebra of reasoning with G. Boole (1815-1864). After that, how is it possible to establish the consistency of the axioms in a reliable way? It is necessary to build up mathematics from itself. On the other hand, the congruence axioms of Book V of the Elements, from which Eudoxus's (408 B.C.-355 B.C.) method of exhaustion and the Cauchy-Riemann integral emerged, initially inspired the conception of the real numbers as a geometric abstraction (Axiom XI: all right angles are equal); it was necessary to give the reals a consistency free from the infinite approach of the infinitesimals, which Berkeley called the ghosts of departed quantities. This was accomplished by Weierstrass, with his formal theory of limits, and by Dedekind and Cantor, with their constructions of the real numbers based on the natural numbers, known as the arithmetization of analysis. Such constructions involve some use of mathematical infinity. In this context David Hilbert's program arose: he wanted mathematics to be formulated on a solid and complete logical foundation.
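The formal theory of limits mentioned above replaced infinitesimals with the epsilon-delta condition. As a purely numerical illustration (our own sketch in Python, not from the source), the following checks the condition for lim_{x→2} x² = 4 with the standard choice δ = min(1, ε/5):

```python
# Numerical illustration (our own sketch, not from the source) of Weierstrass's
# epsilon-delta definition: lim_{x→2} x² = 4. For each eps we take
# delta = min(1, eps/5), since |x² - 4| = |x - 2|·|x + 2| <= 5·|x - 2|
# whenever |x - 2| < 1.
def within_limit(f, a, L, eps, delta, samples=1000):
    """Check |f(x) - L| < eps at sample points with 0 < |x - a| < delta."""
    for i in range(1, samples + 1):
        dx = delta * i / (samples + 1)        # 0 < dx < delta
        for x in (a - dx, a + dx):
            if abs(f(x) - L) >= eps:
                return False
    return True

f = lambda x: x * x
for eps in (1.0, 0.1, 0.001):
    delta = min(1.0, eps / 5)
    assert within_limit(f, 2.0, 4.0, eps, delta)
print("epsilon-delta condition verified at sampled points")
```

The sampling only spot-checks the condition; the inequality above is what actually guarantees it for every x in the interval.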
During the first half of the 20th century, Kurt Gödel published his incompleteness theorems, which would limit Hilbert's program:
C: For any consistent formal theory that proves basic arithmetical truths, it is possible to construct an arithmetical statement that is true but not provable in the theory. That is, any consistent theory of a certain expressive strength is incomplete. The meaning of "it is possible to construct" is that there is some mechanical procedure that produces such a statement.
1) A ∧ B   [hypothesis]
2) A ∧ B → A   [separation]
3) A ∧ B → B   [separation]
4) A   [modus ponens, 1 and 2]
5) B   [modus ponens, 1 and 3]
6) A ∨ B   [introduction, 4 and 5]
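The derivation above can be checked semantically with a small truth-table sweep. This sketch (our own, in Python) confirms that in every valuation where the hypothesis A ∧ B holds, the derived lines hold as well:

```python
# A semantic sanity check (our own sketch) of the derivation above: in every
# valuation of A and B where the hypothesis A ∧ B holds, lines 4), 5) and 6)
# hold too, so each step is truth-preserving.
from itertools import product

def derivation_sound():
    for A, B in product((False, True), repeat=2):
        if A and B:                        # 1) hypothesis A ∧ B
            if not A:                      # 4) A
                return False
            if not B:                      # 5) B
                return False
            if not (A or B):               # 6) A ∨ B
                return False
    return True

print(derivation_sound())  # → True
```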
We can conclude that light presents particle and wave features at the same time. The first two experiments clearly suggest the concept of continuity which, according to Nicolescu, means that it is not possible to pass from one extreme to the other without passing through the intermediate space. However, the third experiment breaks continuity and places non-classical probability as a suitable mathematical tool to deal with diffraction interference.
To deal with levels of reality from a logical point of view, which is our goal, we need the theory of locality studied within category theory: the local topos. Lawvere gave an axiomatic characterization of categories similar to the category of sets, the so-called toposes, in reference to the toposes introduced by Grothendieck for algebraic geometry (LAWVERE, 1975). Levels of reality together with levels of perception then form a local topos. A topos is a kind of category with the following properties, similar to those of the category of sets:
- Should be able to form the Cartesian product A × B of two objects and the exponential object C^B of the functions from B to C;
- Should have a subobject classifier Ω, the analogue of the set of truth values in the category of sets.
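As a concrete illustration (our own sketch, not from the source), both constructions above can be spelled out in the category of finite sets, modelled here with Python sets:

```python
# Sketch (our own illustration, not from the source) of the two properties in
# the category of finite sets: the Cartesian product A × B and the exponential
# object C^B, i.e. the set of all functions from B to C.
from itertools import product

A = {0, 1}
B = {'x', 'y'}
C = {'t', 'f'}

# Cartesian product A × B: all ordered pairs
prod_AB = set(product(A, B))
assert len(prod_AB) == len(A) * len(B)

# Exponential C^B: each function is encoded as a frozenset of (input, output)
exp_CB = {frozenset(zip(sorted(B), outputs))
          for outputs in product(sorted(C), repeat=len(B))}
assert len(exp_CB) == len(C) ** len(B)   # |C^B| = |C|^|B|

print(len(prod_AB), len(exp_CB))  # → 4 4
```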
From this point, the issues raised by Nicolescu about the sense of the open unity of the world are essential to understand what he calls the Gödelian structure, the correlation of levels of reality, for example: What is the nature of the theory that can describe the passage from one level of reality to another? Is there coherence, or even a unity, in the ensemble of levels of reality? What should be the role of the subject-observer in the existence of an eventual unity of all the levels of reality? Is there a privileged level of reality? Would the unity of knowledge, if any, be of a subjective or objective nature? What is the role of reason in the eventual unity of knowledge? What would be, in the field of reflection and action, the predictive power of the new model of reality? Would it be possible to understand this present world? In the same text, Nicolescu refers to the ensemble of levels of reality as an evolutive process that would possess a self-consistency which, by Gödel's second theorem, implies a contradiction through self-reference, and a consequent separation into levels of reality, undoing that self-consistency in local collapses within which classical logic holds. As we said above, what underlies this process of self-consistency is an interpretation of the measurement problem in quantum physics. In what, then, does the measurement problem consist?
Needless to say, this paradox is linked to the double-slit experiment and to its wave/particle paradox. In short, the question is how, during measurement, a quantum superposition may turn into states that do not superpose. Von Neumann's answer is the projection postulate accompanying any act of measurement; formally:
ζ) φ(τ) → Ω φ(τ), where Ω is a unitary operator and the state vector is the superposition φ = α₁φ₁ + α₂φ₂, with φ₁ and φ₂ eigenstates of an observable. Measurement reduces the state to η) φ₁ ∨ φ₂, depending on the observation/measurement outcome, the probability of each outcome being θ) |α₁|² and |α₂|². The unitary operator composed with its adjoint, which equals its inverse, gives the identity: Ω†Ω = 1.
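The projection postulate just stated can be illustrated with a minimal simulation (our own sketch, not part of the source): a normalized superposition α₁φ₁ + α₂φ₂ collapses to one eigenstate with the Born probabilities |α₁|² and |α₂|²:

```python
# Minimal simulation (our own sketch, not part of the source) of the
# projection postulate: the normalized state φ = α1·φ1 + α2·φ2 collapses to
# eigenstate φ1 or φ2 with the Born probabilities |α1|² and |α2|².
import math
import random

def measure(alpha1, alpha2, rng=random.random):
    """Return 1 or 2: the index of the eigenstate the state collapses to."""
    p1, p2 = abs(alpha1) ** 2, abs(alpha2) ** 2
    assert math.isclose(p1 + p2, 1.0), "state vector must be normalized"
    return 1 if rng() < p1 else 2

# Equal superposition: α1 = α2 = 1/√2, so each outcome has probability 1/2
a = 1 / math.sqrt(2)
random.seed(0)  # reproducible run
outcomes = [measure(a, a) for _ in range(10_000)]
freq1 = outcomes.count(1) / len(outcomes)
print(round(freq1, 2))  # empirically close to 0.5, as |α1|² predicts
```

The simulation only reproduces the statistics of outcomes; it says nothing about the characterization and completeness problems discussed next.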
From that solution, two problems appear. One is CHARACTERIZATION: what are the conditions for applying the projection postulate and, since it is a condition inherent to measurement, what characterizes a measurement and/or observation? The other is COMPLETENESS: could the projection postulate be derived from other principles of quantum physics, together with a physical model suitable for the measurement process? Without going into the details of the various currents that proposed to solve this problem, we note that a new thermodynamic axiom, linking a large number of particles and incoherent states, was introduced to address the problem of completeness.
Such an addition, called thermodynamic amplification, could characterize more specifically the observation of atomic phenomena based on records obtained through amplification devices performing irreversible work. This program would later prove unsuccessful, so that the problem of completeness is unsolvable and thermodynamic models do not provide an exact solution to the measurement problem. If the completeness problem were solvable, it would provide an example of measurement under certain conditions:
Let:
Σ – unitarity condition
Λ – measurement
Ξ – solvability
Φ, Σ, Ξ ├ ¬Λ
Accepting Φ and Σ, there is no measurement that meets the condition of solvability: Φ, Σ, Λ ├ ¬Ξ.
- Or self-referentiality does not infringe the axiom of regularity, and the local/non-local correlation would be inconsistent.
So, in the view of a positive response, the system, broadly, would be equal to the local/non-local correlation, which collapses by self-reproduction into consistent and incomplete local toposes. Answering the questions above, we could say that transdisciplinarity is consistent and inconsistent, validating the contradictory pairs that experiment and scientific theory, after quantum physics, had seen appear: local and non-local causality; reversibility and irreversibility; separability and non-separability; wave and particle; continuity and discontinuity; symmetry and broken symmetry; and so on. But Nicolescu himself answered such questions, particularly the first, about the nature of the theory that could describe the transition from one level to another, saying that no one has managed to find a mathematical formalism that allows the passage from one strict world to another. We must, however, differentiate logical-epistemic formalization from the formal mathematical structuring of model theories. A logical-epistemic formalization has no commitment to the standard algebraic model, as the formal mathematical structure does. Logical-epistemic formalization works with advanced features of category theory, creating logical-topological spaces of possible worlds from the Tarskian conception of logics as consequence relations. In this way, we believe that the logic underlying transdisciplinarity is a combination of classical and inconsistent logics, a kind of non-Hegelian temporal dialectic, because we are always in some reality, and the subject's self-position, as Hegel believed, is derived. However, the criticism of the Hegelian succession of contradictions seems to lack sense because, if there is a contradiction, at some lapse of time this contradiction is simultaneous. In any case, transdisciplinary self-reference is not closed in the immanence of the self-position of the idealistic subject, which posits the non-self and comprehends it. The complexity of transdisciplinary self-reference opens onto the multiplicity of historical-existential experience of a subject belonging to the worlds in which it is included. Two other properties are at the core of transdisciplinarity:
In the chapter "Transdisciplinarity and the open unity of the world", what Nicolescu calls the ensemble of levels of reality is, under the Gödelian conditions of its structure, a local topos in which Kolmogorovian classical probability is valid. On the other hand, in "The included third: from quantum physics to ontology", the problem, if it is not one of translation, is that the relationship between one level of reality and another does not preserve coherence, and then the application of Kolmogorovian probability to a structurally incoherent ensemble of levels of reality is not valid. So we can say that:
Let Cn be the ensemble of levels of reality.
OBJECTIVE
L = L1 ... Ln, in the case where L is known and we seek its decomposition into simpler logics.
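Since the text argues that the logic underlying transdisciplinarity combines classical and inconsistent logics, one candidate "simpler logic" Li in such a decomposition is a paraconsistent logic. The following sketch (our own illustration; the source names no particular system) encodes Priest's three-valued logic LP, in which a contradiction can hold without every formula following from it:

```python
# Sketch (our own; the source names no particular system) of Priest's
# three-valued logic LP, a paraconsistent candidate for one of the simpler
# logics L_i: the value 'both true and false' lets a contradiction be
# designated without entailing an arbitrary formula.
T, B3, F = 2, 1, 0                 # true / both / false; designated: T and B3

def neg(a):  return 2 - a          # ¬ swaps true and false, fixes 'both'
def conj(a, b): return min(a, b)   # ∧ takes the lower value
def disj(a, b): return max(a, b)   # ∨ takes the higher value
def designated(a): return a >= B3  # LP designates both T and B3

A = B3                             # a "glut": A is both true and false
assert designated(conj(A, neg(A))) # A ∧ ¬A is designated ...
C = F
assert not designated(C)           # ... yet an unrelated C need not be
print("no explosion:", designated(conj(A, neg(A))), designated(C))
```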
The question that arises before combining logics is: what kind of structure do we want to capture with the concept of a logical system, a proof system such as tableaux, an axiomatic system, or natural deduction, or semantic methods such as logical matrices, valuations, and Kripke semantics? As is usual in this kind of methodology, A. Tarski's concept of logic enables more flexibility in the combined treatment of logics, and it is independent of the concept of valid formulas. What is common to any system of logic is the concept of logical consequence, denoted by ├, defined on a set of sentences or formulas of L.
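Tarski's abstract view of a logic as a consequence relation can be made concrete with a toy closure operator. This sketch (hypothetical atoms and rule, our own) checks the three Tarskian properties, reflexivity, monotonicity, and idempotence:

```python
# Toy sketch (hypothetical atoms and rule, our own) of Tarski's concept of a
# consequence operator Cn over sets of formulas, checking reflexivity,
# monotonicity and idempotence.
def Cn(premises):
    """Close a set of atomic formulas under one rule: from 'a','b' infer 'c'."""
    closed = set(premises)
    if {'a', 'b'} <= closed:
        closed.add('c')
    return frozenset(closed)

X = {'a', 'b'}
assert X <= Cn(X)                    # reflexivity: X ⊆ Cn(X)
assert Cn(X) <= Cn(X | {'d'})        # monotonicity: X ⊆ Y implies Cn(X) ⊆ Cn(Y)
assert Cn(Cn(X)) == Cn(X)            # idempotence: Cn(Cn(X)) = Cn(X)
print(sorted(Cn(X)))  # → ['a', 'b', 'c']
```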
Definition:
JUSTIFICATION
There is no doubt that the twentieth century was the stage of great changes in all areas of knowledge, particularly in the scientific universe, resulting in remarkable achievements in every field of technique and production. In spite of that, the splitting and compartmentalization of knowledge, inherited from a tradition shaped by the thought produced between the fifteenth and nineteenth centuries, was no longer enough to create the epistemological references required to address the questions of knowledge itself.
Many problems seemed to lie outside the entangled theoretical systems, and a kind of blindness hovered over the attempts to understand many of nature's fundamental problems, as well as its more common ones. However, debates about topics such as the structure of matter, objectivity, or the possible relationships between the observer and the levels of reality brought to light insurmountable paradoxes. Therefore, new ideas emerged, such as the Theory of Relativity, Quantum Mechanics, and Complexity Theory, among others (Nicolescu, 2003).
In any case, the organization of the contents concerning Complexity Theory suggests a revaluation of the systems for selecting and determining conceptualization, as well as of the systems that configure the logical operations. This extends even to the structures that designate the fundamental categories of intelligibility, and to the mechanisms that control their application.
It is accepted, therefore, that thinking about complex systems implies accepting the need to overcome major challenges, starting from the implications of multidimensional and hologrammatic structures, or even from the interconnectivity and inseparability of the "one" and the "multiple" pertaining to them.
Despite the expectation that this could open up new possibilities of understanding, and the sociocultural transformations it could bring about, it is known that it would be impossible to embrace the study of complex systems only from within the established contemporary disciplinarities.
During the First World Congress of Transdisciplinarity, held at the Convento da Arrábida, Portugal, on 2-7 November 1994, a Letter of Intent was prepared, outlining a set of fundamental principles, which exalted the need to consolidate transdisciplinary and transcultural thinking as the best way to approach the different aspects of complexity in distinct systems (Nicolescu, 2001).
Transdisciplinarity would then represent a conception of research based on a new framework of comprehension, shared among several disciplines and accompanied by the mutual interpretation of the disciplinary epistemologies. Cooperation, in this case, would be directed at problem solving, with transdisciplinarity emerging to build a new model that brings the realities of the object of study closer (Hernández, .....).
Turning to Article 14 of the Transdisciplinarity Letter, whose terms refer to rigor of argument as the best barrier against possible conceptual deviations in articulating the data of a problem situation, we can justify the need to develop a logical-formal and axiomatic characterization of the theory of transdisciplinarity, especially when we consider the need for another epistemic perspective suited to the changes in ways of thinking of the last 200 years.
Further, considering that a logical system is always useful for choosing the prevailing operations that are relevant and evident within its domain (exclusion-inclusion, disjunction-conjunction, implication-negation), these logical constructions are intended to foster the inclusion of other features of the transdisciplinary way of thinking, such as "Opening", comprising the acceptance of the unknown, the unexpected, and the unpredictable; and "Tolerance", as an exercise in recognizing the right of ideas and truths opposed to our own.
Finally, considering that logic keeps close connections with metaphysics, mathematics, philosophy, and linguistics, we can appraise the impact of this project, especially regarding the reexamination of the disruption of almost every classical logical principle that sustains the ground theory in all the fields of study mentioned above.
In Brazil, many research groups are involved in the study of logic, notably the group of Prof. Newton da Costa, in which I take part and which has specialized in non-classical logics. Regarding the outline of a broad field of work, I intend, through this proposal, to seek deeper foundations for transdisciplinarity from a more authentic source, hoping that it will constitute, in the near future, an important reference for the development of theoretical and technological studies on semiotics. We therefore expect that the outcomes of this project will allow a contextualization in this promising scenario and serve as support for approaches that explore new horizons, particularly those immersed in the invisible zone of the paradigms, so that we can overcome the determinism of the current explanatory models, which are associated with systems of convictions and beliefs in every scope. This should contribute to the collapse of cognitive and intellectual conformisms.
Bibliography
Aerts, D., Aerts, S., Broekaert, J. and Gabora, L., 2000a, The violation of Bell
inequalities in the macroworld. Foundations of Physics 30: 1387-1414.
Aerts, D., 1985, The physical origin of the EPR paradox and how to violate Bell
inequalities by macroscopical systems. In P. Lathi and P. Mittelstaedt (eds) On
the Foundations of Modern Physics (World Scientific: Singapore), pp. 305-320.
Aerts, D., Broekaert, J. and Gabora, L., 2000b, Intrinsic contextuality as the crux
of consciousness. In K. Yasue (ed.) Fundamental Approaches to
Consciousness. (Amsterdam: John Benjamins Publishing Company), pp.173-
181.
Melhuish, G., 1959, The Paradoxical Universe. (Bristol: Rankin Bros Ltd).
Priest, J. P.Van Bendegem eds.). Research Studies Press. Baldock, UK: 149 –
163. (2000).
[Blok – Pigozzi] Blok, W., Pigozzi, D. Abstract algebraic logic and the deduction theorem. The Bulletin of Symbolic Logic. (To appear.)
[Blok – Pigozzi, 1986] Blok, W., Pigozzi, D. Protoalgebraic logics. Studia Logica, 45:337–369. (1986).
(With Oron Shagrir) “The Church-Turing Thesis and Hyper Computation”, Minds
and Machines 13, 87-101 (2003).