Friday, August 10, 2012

from complicated to complex T@O

Internet nodes and routes mapping
Scientific apparatus offers a window to knowledge,
but as they grow more elaborate,
scientists spend ever more time washing the windows.
Isaac Asimov

For systems with a very high number of networked elements and many dynamic processes that interact with each other, the main distinction is between complicated and complex systems.
Generally a system may be very complicated but not complex, while a relatively simple system may exhibit high complexity. Furthermore, if one introduces just one complex element or process into a system, simple or complicated, the whole system becomes complex.

As an example of the emergence of complexity in a very complicated system it is possible to use the most complicated one existing today: the Internet.
  •  Complicated
Network structure
As a telecommunications network, the Internet has a partially or fully meshed structure, in which the elements, the network nodes (approximately 100 million known nodes were estimated in 2006), are partly connected directly to one another. Its connection and control structure is an evolution of a hierarchical structure of the type:

 which represents the maximum possible degree of complexity, in this case of complication:

from: Yaneer Bar-Yam, Introducing Complex Systems
    Fundamentals
    The Internet, like any communication system, has its basis in the information theory founded by Claude Shannon in 1948:

    In his study he defined a theoretical framework applicable to any communication channel. Shannon refers to the following quite general model:

    where a message must be transmitted from the information source on the left to the destination on the right. The communication channel involves encoding the message into a signal by a transmitter, transmission through a physical medium, and reception by a receiver, which decodes the signal and delivers the message to the destination. The received signal is generally not identical to the transmitted one because of distortion and random noise added by the channel. Shannon's theory defines the meaningful parameters and the capacity of the transmission channel, and it is fully general for any transmission system, from the smoke signals of Native Americans to voice/fax telephone networks, from Assyrian-Babylonian scripts to global data networks such as the Internet.
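    As a worked illustration of the channel capacity Shannon's theory defines, here is a minimal sketch of the Shannon-Hartley formula C = B log2(1 + S/N); the 3 kHz bandwidth and 30 dB signal-to-noise ratio are assumed values, roughly those of a classic analog telephone channel, not figures from the text:

```python
from math import log2

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity C = B * log2(1 + S/N), in bit/s."""
    return bandwidth_hz * log2(1 + snr_linear)

# Assumed example: a ~3 kHz analog voice channel with ~30 dB SNR.
snr = 10 ** (30 / 10)                                  # 30 dB as a linear ratio (1000)
print(f"{shannon_capacity(3_000, snr):,.0f} bit/s")    # ~29,900 bit/s
```

    The result, about 30 kbit/s, is consistent with the speeds the last generations of dial-up modems approached on such lines.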
    The second basis is technological: the extensive research conducted for about two decades, starting in the 1970s, by all the telephone companies of the world, together with several research centres and universities, to develop optical fiber communication systems; a research effort that involved investments of tens of billions of dollars and that, perhaps uniquely, can be considered complete. This development was formally acknowledged by the award to one of its pioneers, Charles K. Kao, of the 2009 Nobel Prize in Physics "for groundbreaking achievements concerning the transmission of light in fibers for optical communication";

    Proc. IEE, 1966
    Systems with transmission capacities up to tens of thousands of billions of bits per second (bit/s) have been demonstrated:

    and about 420,000 km of intercontinental submarine fiber cable had been installed and deployed by 2006:

    Communication
    Currently the Internet is a composition of public networks (such as the WWW, the World Wide Web, the service that historically drove the explosion of the Internet, conceived and designed in 1990-91 by Tim Berners-Lee and Robert Cailliau at CERN) and private networks (intranets), with a by-now almost unknowable number of network devices and subscribers, estimated on the order of billions; the number of subscribers in 2011, for example, was estimated at around 2 billion.

    A small portion of the Web surrounding the Wikipedia website
     In its most synthetic form the Internet may be represented as:
    where two users are shown connected by fixed or mobile devices, each of which the network treats as a host, and which use two classic services: the first is browsing a webpage through a web server, in the classic client/server exchange mode; the second can be a chat, an email exchange, or a peer-to-peer (P2P) link between the two users. The Internet is drawn as a cloud, indicating that at this level one does not go into the details of how these services and links are implemented.
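    To make the client/server modality concrete, here is a minimal sketch of the client side of a webpage request; the target example.com is just a placeholder standing in for any public web server:

```python
import socket

# Minimal client side of the client/server exchange: open a TCP connection
# to a web server and request a page with a bare HTTP/1.1 GET.
HOST = "example.com"   # placeholder host; any public web server would do

with socket.create_connection((HOST, 80)) as s:
    s.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = b""
    while chunk := s.recv(4096):
        reply += chunk

print(reply.split(b"\r\n")[0])   # the status line, e.g. b'HTTP/1.1 200 OK'
```

    Everything below this call, from TCP segments to IP packets to Ethernet frames, is handled by the protocol stack described next.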
    The fundamental distinctions between a telecommunications network such as the Internet and traditional telephony lie in whether the information signal is transported in analog or digital form, and in the information transfer mode: packet switching versus circuit switching.
    In the traditional plain old telephone service (POTS) network the transmitted signal is analog, possibly sampled, that is, digitally converted, and the link between the two users is made by switching, based on the dialed telephone number, over a physical circuit established (in historical progression) manually, mechanically, electro-mechanically or electronically. This type of switching is called circuit switching, since the two users are directly linked by a circuit.

    Hierarchical protocol stack
    In a data network such as the Internet the transmitted signal is digital (or discrete), divided and assembled, according to rules set by a communication protocol, into blocks of digital data (sequences of "1" and "0" bits encoding information) of variable or fixed length, called data packets. The link between the two users is made by reading specific fields of the data packet, the source and destination addresses, which allow the network to determine where to transfer the information to and from, in what is called packet switching.
    In the case of the Internet the communication protocol suite that performs packet switching is known as the TCP/IP suite, a set of several tens of protocols used to establish connectivity and manage the network equipment; in particular the interconnection protocol is the Internet Protocol (IP), structured according to a hierarchical stack of the type:
    The seven levels on the left are stacked according to a historical reference classification, the ISO/OSI model; those on the right are the ones actually operating in the network and define its hierarchical structure, from level 1 (physical) through level 4 (transport) to level 7 (application). The structuring of the communication protocol by levels, which is reflected directly in the hardware architecture, is one of the factors, together with the network structure with its very high number of elements, that drives a system like the Internet to the border of complexity.
    The use of the TCP/IP protocol to connect the network hosts is schematized in the following two figures. The first:
    illustrates how data packets are organized at the different protocol levels. At the highest level the host requests some information, for example the browsing of a webpage through a browser; the request is passed to the application-level protocol, in this case HTTP, which adds at the front of the data packet a specific block called a header and passes it to the next level, transport, where TCP in turn adds its own header; the packet then passes to the next level, network, where the IP protocol adds a further header, and arrives at the data link layer where, in this example, the layer 2 protocol Ethernet (eth) adds a final header. At this point the overall packet is passed to the physical level with a specific protocol, for example Ethernet itself.
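    A toy sketch of this encapsulation (the header strings are placeholders, not real protocol formats): each layer simply prepends its own header to what it receives from the layer above:

```python
def encapsulate(payload: bytes) -> bytes:
    """Walk a payload down the stack, one placeholder header per layer."""
    http  = b"HTTP|" + payload    # application layer: e.g. a webpage request
    tcp   = b"TCP|"  + http       # transport layer: ports, sequence numbers
    ip    = b"IP|"   + tcp        # network layer: source/destination addresses
    frame = b"ETH|"  + ip         # data link layer: MAC addresses
    return frame

print(encapsulate(b"GET /index.html"))
# b'ETH|IP|TCP|HTTP|GET /index.html'
# the receiver strips the headers in reverse order (decapsulation)
```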
    The use of the TCP/IP protocol stack for connectivity between two users, each using a network host, is as follows:
    where user A, through host A, requests a link with host B and user B. Host A encapsulates the data packets as in the previous figure; these are transmitted through the network, transiting several zones (for example, net1 and net3 may be access areas and net2 a long-distance transport area), directed by the equipment whose specific job is routing the packets through the different networks, called switches at level 2 (data link) or routers at level 3 (IP network), and which play the same role as telephone exchanges in circuit-switched networks such as traditional telephony. Once the overall packet is delivered to host B's network interface, it is decapsulated by the inverse of the procedure followed by A, up to the application used by the hosts. The transition from the host level (hardware) to the user level (humans) marks the border between the complicated network system and the complex interactive social/virtual human one.
    Connectivity at the geographical level and the way a packet is transported across the network are illustrated as:
    where host A belongs to an Ethernet LAN (local area network), typically a business setting, which sends the eth packet to dedicated switches at this level; this part of the network is the access network, which converges into the transport network, where the packet is brought up to the IP level and routed and then, by the inverse process, delivered to the destination host B.
    The protocol structure of the IP suite is rather complex, as illustrated in the following, where some of the main protocols and applications used over the 5 network levels are summarized:
    •  At the application layer, IP provides the basic applications for the elementary services, like HTTP for browsing, SMTP for email, FTP for file transfer, and so on.
    •  At transport layer 4 sit the TCP and UDP protocols, which manage specific tasks like packet checking, flow control and congestion control.
    •  At network layer 3 works the IP protocol, specifically dedicated to interconnection inside the network. At this level there is also a set of routing protocols, like BGP and OSPF, which let the routers exchange information among themselves to build routing tables; these give the routers the knowledge of the network topology needed for correct routing toward the destination.
    •  At data link layer 2 there are protocols for the access network, like Ethernet, ATM (a past protocol still present but no longer deployed) and the Point-to-Point Protocol (PPP), used to connect residential ADSL users over the conventional telephone line, as well as protocols used in the long-distance core/transport network, such as MPLS, which has been added to IP to improve Quality of Service (QoS) and allow the traffic engineering that plain IP does not, specifically for voice transport as Voice over IP (VoIP), one of the fundamental services of modern IP networks, where telephone traffic is digitalized and transported as IP packets. At this level there are also the protocols for long-distance transport, typically over optical fiber: SDH in Europe and SONET in North America.
    •  Finally, at physical layer 1 there is all the hardware used as transmission media, from fixed or mobile telephone lines to HFC (hybrid fiber-coaxial) cables for network access, to optical fiber systems and satellite links for transport inside the network.
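    The routing tables just mentioned drive a simple rule: among all routes matching a packet's destination address, the most specific (longest) prefix wins. A minimal sketch, with an invented four-entry table (a real BGP table holds some 400,000 routes, as noted further below):

```python
import ipaddress

# A toy routing table: prefix -> next hop (entries invented for illustration).
routes = {
    ipaddress.ip_network("0.0.0.0/0"):   "upstream ISP",   # default route
    ipaddress.ip_network("10.0.0.0/8"):  "national core",
    ipaddress.ip_network("10.1.0.0/16"): "regional POP",
    ipaddress.ip_network("10.1.2.0/24"): "access router",
}

def next_hop(destination: str) -> str:
    """Longest-prefix match: the most specific matching route wins."""
    addr = ipaddress.ip_address(destination)
    best = max((net for net in routes if addr in net), key=lambda n: n.prefixlen)
    return routes[best]

print(next_hop("10.1.2.42"))   # access router (the /24 beats /16, /8 and /0)
print(next_hop("8.8.8.8"))     # upstream ISP (only the default route matches)
```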
    A more detailed schematic view of the access and transport/core architecture for several user typologies is:
    showing the access network on the left and the long-distance transport/core network on the right; between the two sits a Point of Presence (POP), which interconnects the two networks and collects and distributes traffic at the regional and national level. POPs are divided into primary POPs, connected by high-bandwidth optical links, and secondary POPs, which collect user areas and are connected to the primaries; in nations like Italy, France, Germany and the United Kingdom, for example, the primary POPs may number a few units and the secondaries a few tens, each serving some millions of users. User types are commonly divided into residential mass-market and business users. The first includes traditional telephony, collected by the POTS network and transformed into IP datagrams; residential users with ADSL lines, collected by DSLAM equipment; and mobile device users, collected by the mobile network. These typologies are then collected by a metropolitan (MAN) Ethernet aggregation network connected to the access router (BNAS) of the metropolitan POP. Business users with their LANs are connected to the POP through two specific switches or routers, one on the user side, the Customer Edge (CE), and one on the POP's Internet Service Provider side, the Provider Edge (PE). Business traffic is then generally associated with a virtual private network (VPN), which gives institutions or organizations with widely distributed locations or branches (typically government offices and banks) the possibility of having a private intranet whose nodes are linked through the public national network.
    The access traffic entering the POP (if not local) goes to the transport network, with high-end routers (giga- and tera-routers) with processing capabilities on the order of billions to hundreds of billions of bits per second, and is typically transmitted over optical fiber rings with the SDH/SONET protocol by ADM equipment; on high-traffic national links it is also possible to transmit several optical channels in a single fiber at different wavelengths using DWDM technologies; there are, for example, DWDM systems in which a single optical fiber can carry all the world's fixed-to-fixed calls. If the link required by the user is international, typically the browsing of a webpage or peering with a server farm, specific national network POPs connect and exchange traffic with international networks, the union of which is globally named the Internet, or the WWW for its public portion. Furthermore, the national ISP network operator must collect voice and data traffic coming from other fixed or mobile telecommunications operators (Other Licensed Operators, OLO) through an Internet Exchange Point (IX or IXP).
    The national network is divided by the routing protocols into several areas at the national level (by OSPF) and into autonomous systems (AS), also called domains, at the international level (by BGP). At the global level the Internet can also be represented as the union of the existing domains:
    Ref.: S. Boccaletti et al.: "Complex networks: Structure and dynamics", Physics Reports (2006) 
     Cactus Group, Catania, Italy
    Network evolutions, involutions and coevolutions
    The schematic evolution of the Internet from the 1970s to the medium- and long-term future can be shown as:

    The historical evolution of the network, and particularly of the level 3 IP protocol, could not take into account, back in the 1970s-80s, its enormous future growth at the world level. This has resulted in several present critical points and in present and future developments, among which:
    the IP protocol was developed with small-to-medium network dimensions in mind, typically for connecting some tens of hosts across metropolitan locations, and certainly not billions of machines. In particular, the two address fields (destination and source) of the protocol were coded on 32 bits, which allows an address space of 2^32, that is, about 4.3 billion addresses, a number considered at the time largely sufficient, if not overestimated, compared to the foreseen developments, mainly in research. These addresses were further divided into classes, which further limited the number of available and assignable addresses, distributed by IANA (the Internet Assigned Numbers Authority, today operated by ICANN) which, in the early years, assigned them freely and easily to the few entities that requested them. Starting from the early 90s, with the development of the Web and the beginning of commercial applications, the explosion of the global demand for classes of IP addresses from ISPs and agencies of every kind has been leading to the saturation and exhaustion of the available addresses, as shown in the following figure, which reports the global Internet routing table as the aggregated routes announced by the BGP protocol, around 400,000 at present.
    Internet global routing table: BGP Routing Information Base entries to date 
    To remedy this situation, an evolution of the protocol named IP version 6 (IPv6) has been proposed and developed since 1998 as an updated release of the present version, IP version 4 (IPv4). In IPv6, besides the introduction of a number of improvements over IPv4, the address field is coded on 128 bits, that is, an address space of 2^128, about 3.4 × 10^38; the typical example showing the difference is to note that for every square meter of the Earth's surface there are 655,570,793,348,866,943,898,599 unique IPv6 addresses (that is, about 655,571 billion billion), but only 0.000007 IPv4 addresses (that is, only 7 IPv4 addresses per million square meters). If IPv6 solves the assignable-address problem forever, it raises another very serious one: IPv6 and IPv4 are not compatible with each other (if for nothing else, because of the different size of the address field), so the introduction of IPv6 must provide for a migration between the two protocols through one of several proposed solutions, the choice of which is left to the ISPs and institutions that, if they grow and deplete their IPv4 addresses, must necessarily switch to IPv6.
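    The arithmetic behind these figures is easy to reproduce; in this sketch the Earth's surface is assumed to be about 510 million square kilometers, which gives values in line with those quoted above:

```python
ipv4 = 2 ** 32      # IPv4 address space: 4,294,967,296 (~4.3 billion)
ipv6 = 2 ** 128     # IPv6 address space: ~3.4e38

earth_m2 = 510_000_000 * 1_000_000   # assumed Earth surface, ~5.1e14 m^2

print(f"IPv4 addresses in total: {ipv4:,}")
print(f"IPv6 addresses per m^2:  {ipv6 / earth_m2:,.0f}")   # ~6.7e23
print(f"IPv4 addresses per m^2:  {ipv4 / earth_m2:.7f}")    # ~0.0000084
```

    Any small discrepancy with the figures in the text comes from the exact surface value assumed.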

    The Internet was born and historically developed as an overlay of the telephone network, where data transmission was performed by a modem that allowed digital data to be transmitted, in a limited way, over lines designed for analog voice. With the development of IP networks the situation has reversed: telephony is increasingly becoming a "downlay" of the Internet, starting from the higher network stages down to the lower ones: international, national, long-distance and urban. Carrying analog voice, digitally converted, over a network designed for data, as VoIP does, raises a number of problems, first of all the need to introduce into the network a Quality of Service (QoS) that strongly privileges real-time voice, particularly in terms of delay and packet loss, over other services; for example, if a webpage takes a few seconds to load the user does not perceive any particular malfunction, while the same delay applied to audio/video streaming such as YouTube makes it difficult to use, and applied to a phone conversation makes it unintelligible. For this reason several technologies have been added to the IP protocol, which alone does not solve the problem since it was not originally designed for voice transport: new protocols at the interface between level 3 (IP) and level 2 (data link), such as MPLS, designed specifically to privilege the stringent requirements of voice over those of streaming and browsing.
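    The essence of such QoS mechanisms can be caricatured with a strict-priority scheduler: voice packets always leave the queue before best-effort data. A minimal sketch (the two traffic classes and packet names are invented for illustration):

```python
import heapq

VOICE, DATA = 0, 1    # lower value = higher priority, as in strict-priority QoS

queue = []
for seq, pkt in enumerate(["data-1", "voice-1", "data-2", "voice-2"]):
    prio = VOICE if pkt.startswith("voice") else DATA
    heapq.heappush(queue, (prio, seq, pkt))   # seq keeps FIFO order within a class

while queue:
    _, _, pkt = heapq.heappop(queue)
    print("sending", pkt)   # voice-1, voice-2, data-1, data-2
```

    Real QoS schedulers are less brutal (weighted fair queuing avoids starving data entirely), but the principle of class-based precedence is the same.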

    Finally, Internet development until around 2000 was mainly aimed at connectivity, that is, at making all network resources always reachable, a target largely achieved today. Once connectivity is taken for granted and user-transparent, the next step follows from the fact that the network is largely used for information search and retrieval, mainly of multimedia files: text, music, video and so on. Following the example of peer-to-peer and file-sharing applications like eMule and BitTorrent, which are now an Internet overlay, users search for information based on its content rather than on where that content resides. Development is therefore moving toward a distributed network intelligence, built on connectivity, that produces a content-based network, the Content Delivery Network (CDN), where the present P2P overlay follows the earlier example of the Internet over the telephone network and incorporates it.
    • from Complicated to Complex: Dynamical Networks
    The high number of nodes, understood as domains/autonomous systems (groups of routers), routers and hosts, the intrinsic presence of a protocol hierarchy in data transmission, and the geographically distributed meshed structure make the Internet, globally, very close to a complex system.

    Ref.: S. Boccaletti et al.: "Complex networks: Structure and dynamics", Physics Reports (2006) 
     Cactus Group, Catania, Italy
    At this level the Internet and the WWW have been investigated for the statistical properties of their structure and connectivity. For example, a 2007 study determined that both the probability distribution of incoming and outgoing hypertext links of a web document (URL) and the distribution of the number of links that a router's domain has with the other domains follow a power-law distribution, characterized by scale invariance:

    Ref.: A. Barabási: "The Architecture of Complexity", IEEE Control System Magazine (2007)
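    A standard mechanism producing such power laws, the one Barabási is best known for, is preferential attachment: new nodes link preferentially to already well-connected ones. A minimal sketch follows (the parameters n and m are arbitrary choices for illustration); the resulting degree distribution falls off roughly as k^-3:

```python
import random
from collections import Counter

def barabasi_albert(n, m=2, seed=1):
    """Grow a graph by preferential attachment; returns node degrees."""
    rng = random.Random(seed)
    degrees = Counter()
    targets = list(range(m))   # the first new node attaches to nodes 0..m-1
    repeated = []              # each node listed once per unit of its degree
    for node in range(m, n):
        for t in set(targets):           # link the new node to its targets
            degrees[node] += 1
            degrees[t] += 1
            repeated += [node, t]
        # sampling from `repeated` picks nodes with probability ~ their degree
        targets = [rng.choice(repeated) for _ in range(m)]
    return degrees

hist = Counter(barabasi_albert(10_000).values())
for k in sorted(hist)[:6]:
    print(k, hist[k])   # frequencies drop steeply with degree k
```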
    This dynamical network structure is shared by a large number of systems, from the relational structure of family systems (in the example, the Medici family in Renaissance Florence):

    Ref.: S. Boccaletti et al.: "Complex networks: Structure and dynamics", Physics Reports (2006) 
     Cactus Group, Catania, Italy.
    to the networks of proteins interacting in the cell cycle in molecular biology:

     to metabolic networks:
    Ref.: G. Caldarelli: "Lectures on Complex Networks" (2008)
    to drug users networks:

    Ref.: S. Boccaletti et al.: "Complex networks: Structure and dynamics", Physics Reports (2006) 
     Cactus Group, Catania, Italy.
    to the network of ingredients for a cooking recipe:

    Portion of the full ingredient network for cooking recipes.
    From: L. Adamic et al., "Recipe recommendation using ingredient networks", WebSci 2011, ArXiv.org.
    ending with the most complex system of all: the human brain, a system with evident emergent phenomena, whose cerebral cortex comprises about one hundred billion neurons with about one hundred thousand billion synaptic interconnections, organized in at least two networks: one axonal, with medium- and long-range links, and the other dendritic, considered short-range.
    A typical cerebral interconnection among different areas is of the type:
    Ref.: S. Boccaletti et al.: "Complex networks: Structure and dynamics", Physics Reports (2006) 
     Cactus Group, Catania, Italy.
    with particular dynamics of interaction, for example the enhancement of synchronization in a local neuronal network due to the addition of long-range interneurons:
    Ref.: S. Boccaletti et al.: "Complex networks: Structure and dynamics", Physics Reports (2006) 
     Cactus Group, Catania, Italy.
    •  Complex: Social Networks and Virtual Communities
    Networks on networks on networks: virtual communities over users over Internet
    The full complexity of the Internet is reached when users are added, who interact in a virtual relation equivalent to the "real" one except for the impossibility of bodily contact. The study of social dynamics and of the diffusion of information and opinion through the Internet has been particularly active in the virtual-community environment, for example on social sites like Second Life, Facebook and Twitter, which emulate social networks. The situation can be described as social networks emerging from virtual networks emerging from a global physical-logical network, the Internet.
    Ref.: S. Boccaletti et al.: "Complex networks: Structure and dynamics", Physics Reports (2006) 
     Cactus Group, Catania, Italy.
    A recent study on information diffusion in a virtual community, with results on the kind of relationships established, was carried out by diffusing and tracking the propagation on Facebook of a meme of the type:
    Do any of us really know everybody on our friend list?
    Here is a task for you. I want all my fb friends to comment
    on this status about how you met me. After you
    comment, copy this to your status so I can do the same.
    You will be amazed at the results you get in 12 hours.

    Its diffusion from July 6 to 9, 2010 was as follows:
    A small portion of the diffusion of the meme. Nodes are users who posted the meme as their Facebook status. Edges are drawn between nodes if one user commented on the meme post of another. The colors denote the time at which the status update was posted, starting on the 6th of July 2010 (red) and ending on the 9th of July 2010 (blue).
    The meme continued propagating past this point, eventually reaching millions of users.
    Ref.: Adamic and Facebook: "How you met me" (2012).

    providing various other data, such as popularity over time and the age and location of the distributors, which help describe the establishment of the "friendship" relation typical of Facebook. Another study, from 2011, modeled the observed relation between the evolution of the virtual network structure and the information it carries, showing that the network structure alone can be extremely revealing about the diversity and novelty of the information and contents being communicated. Networks with a higher conductance in their link structure exhibit higher information entropy, that is, higher information disorder, while unexpected network configurations can be tied to information novelty; for example, an online user who announces a scoop, true or false, rapidly enlarges his fan-group network for a certain period of time.
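    As a crude toy analogue of reading information content off structure alone, the sketch below computes the Shannon entropy of how total degree is shared among the nodes of two six-node graphs; this degree-share entropy is an assumed stand-in, far simpler than the measures used in the study cited above:

```python
from collections import Counter
from math import log2

def degree_entropy(edges):
    """Shannon entropy of the distribution of total degree across nodes."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    total = sum(deg.values())
    return -sum((d / total) * log2(d / total) for d in deg.values())

star = [(0, i) for i in range(1, 6)]          # hub-and-spoke: degree concentrated
ring = [(i, (i + 1) % 6) for i in range(6)]   # ring: degree spread evenly

print(f"star: {degree_entropy(star):.3f} bits")   # ~2.16: a dominant hub
print(f"ring: {degree_entropy(ring):.3f} bits")   # ~2.58: maximal evenness
```

    Structurally different configurations thus map to measurably different entropies, the kind of signal the cited study exploits on a far larger scale and with dynamics included.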





    Monday, July 16, 2012

    the complexity from KaliYuga to Tao - I

    *Presented at the Colloquium “Intelligence de la complexité: épistémologie et pragmatique”, Cerisy-La-Salle, France, June 26th, 2005

    Why has the problematic of complexity appeared so late? And why would it be justified?

    1. The three principles of the rejection of complexity by ‘classical science’

    Classical science rejected complexity in virtue of three fundamental explanatory principles:
    1. The principle of universal determinism, illustrated by Laplace's demon, capable, thanks to his intelligence and extremely developed senses, not only of knowing all past events, but also of predicting all events in the future.
    2. The principle of reduction, that consists in knowing any composite from only the knowledge of its basic constituting elements. 
    3. The principle of disjunction, that consists in isolating and separating cognitive difficulties from one another, leading to the separation between disciplines, which have become hermetic from each other.
    These principles led to extremely brilliant, important, and positive developments of scientific knowledge up to the point where the limits of intelligibility which they constituted became more important than their elucidations.
    In this scientific conception, the notion of “complexity” is absolutely rejected. On the one hand, it usually means confusion and uncertainty; the expression “it is complex” in fact expresses the difficulty of giving a definition or explanation. On the other hand, since the truth criterion of classical science is expressed by simple laws and concepts, complexity relates only to appearances that are superficial or illusory. Apparently, phenomena arise in a confused and dubious manner, but the mission of science is to search, behind those appearances, the hidden order that is the authentic reality of the universe.
    Certainly, western science is not alone in the search of the “true” reality behind appearances; for Hinduism, the world of appearances, the māyā is illusory; and for Buddhism the saṃsāra, the world of phenomena, is not the ultimate reality. But the true reality, in the Hindu or Buddhist worlds, is inexpressible and in extreme cases unknowable. Whereas, in classical science, behind appearances, there is the impeccable and implacable order of nature.
    Finally, complexity is invisible in the disciplinary division of the real. In fact, the first meaning of the word comes from the Latin complexus, which means what is woven together. The peculiarity, not of the discipline in itself, but of the discipline as it is conceived, non-communicating with the other disciplines, closed to itself, naturally disintegrates complexity. 
    For all these reasons, it is understood why complexity was invisible or illusory, and why the term was rejected deliberately.

    2. Complexity: A first breach: irreversibility
    However, a first breach is made within the scientific universe during the nineteenth century; complexity would appear from it de facto before starting to be recognized de jure.
    Complexity would make its appearance de facto with the second law of thermodynamics, which indicates that energy degrades into caloric form: this principle lies within the scope of the irreversibility of time, while until then physical laws were in principle reversible, and even in the conception of life the fixism of species had no need of time.
    The important point here is not only the irruption of irreversibility, thus of time, but also the apparition of disorder, since heat is conceived as the agitation of molecules; the disordered movement of each molecule is unpredictable, except at the statistical scale, where distribution laws can effectively be determined.
    The law of the irreversible growth of entropy has given rise to multiple speculations and, beyond the study of closed systems, to a first reflection about the universe, where the second law leads toward dispersion and uniformity, and thus towards death. This conception of the death of the universe, long ago rejected, has reappeared recently in cosmology with the discovery of dark energy, which will lead to the dispersion of galaxies and would seem to announce that the universe tends toward a generalized dispersion. As the poet Eliot said: "the universe will die in a whisper"...
    Thus, the arrival of disorder, dispersion, disintegration, constituted a fatal attack to the perfect, ordered, and determinist vision.
    And many efforts will be needed (we are not there yet, precisely because it goes against the reigning paradigm) to understand that the principle of dispersion, at work since the birth of the universe with that incredible deflagration improperly named the big bang, is combined with a contrary principle of bonding and organization, which is manifested in the creation of nuclei, atoms, galaxies, stars, molecules, and life.


    3. Interaction Order/Disorder/Organization

    How is it that both phenomena are related?
    This is what I tried to show in the first volume of La Methode (The Method). We will need to associate the antagonist principles of order and disorder, and associate them making another principle emerge that is the one of organization.
    Here is in fact a complex vision, which one refused to consider for a very long time, for one could not conceive that disorder could be compatible with order, and that organization could be related to disorder at all, being antagonistic to it.
    At the same time as that of the universe, the implacable order of life was altered. Lamarck introduced the idea of evolution; Darwin introduced variation and competition as motors of evolution. Post-Darwinism, if in certain cases it attenuated the radical character of this conflict, brought another antinomy of order: chance, I would even say a vice of chance. Within the neo-Darwinian conception, to avoid calling "creation" or "invention" the new forms of living organization such as wings or eyes (one is very afraid of the word "invention" and of the word "creation"), chance was put at the prow. One can understand this fear of creation, because science rejects creationism, that is, the idea that God is a creator of living forms. But the rejection of creationism ended up masking the creativity that manifests itself in the history of life and in the history of humanity. And, from the philosophical point of view, it is only rather recently that Bergson, and then in another way Castoriadis, put the idea of creation at the centre of their conceptions.
    In addition, at the beginning of the twentieth century, microphysics introduced a fundamental uncertainty into the universe of particles, which ceases to obey the conceptions of space and time characteristic of our universe, called macro-physical. How then are these two universes, which are the same but at different scales, compatible? One begins today to conceive that one can pass from the micro-physical universe to ours, since between them a certain number of quantum elements are connected, by virtue of a process called decoherence. But there remains a formidable logical and conceptual hiatus between the two physics.
    Finally, at a very large, mega-physical scale, Einstein's theory discovers that space and time are related to one another, with the result that our lived and perceived reality becomes only meso-physical, situated between the micro-physical reality and the mega-physical reality.


    4. Chaos

    All this meant that the dogmas of classical science were undermined, but only de facto: although increasingly mummified, they remained.
    Yet a certain number of strange terms appeared. For example, the term "catastrophe", suggested by René Thom to try to make intelligible the discontinuous changes of form; then the fractals of Mandelbrot; then the physical theories of chaos, which stand apart from the rest, since today it is thought that the solar system, which seems to obey an absolutely impeccable order, measurable with the most extreme precision, is, when its evolution over millions of years is considered, a chaotic system comprising a dynamic instability modifying, for example, the Earth's rotation around itself or around the Sun. A chaotic process may obey deterministic initial states, but these cannot be known exhaustively, and the interactions developed within such a process alter any prediction. Negligible variations have considerable consequences over large time scales. The word chaos, in these physics, has a very limited meaning: that of apparent disorder and unpredictability. Determinism is saved in principle, but it is inoperative, since one cannot know the initial states exhaustively. We are in fact, since the original deflagration and forever, plunged in a chaotic universe.

    Bruce Torrence, Lisbon Oriente Station, Panoramic Photograph, 2011

    5. The emergence of the notion of complexity
    However, complexity remained always unknown in physics, in biology, in the social sciences. Admittedly, after more than half a century the word complexity irrupted, but in a domain that also remained impermeable to the human and social sciences, as well as to the natural sciences themselves. It is in the bosom of a sort of nebulous spiral of mathematicians and engineers that it emerged at about the same time, and became connected at once, in the forties and fifties, with Information Theory, Cybernetics, and General Systems Theory. Within this nebula, complexity appears with Ashby to define the degree of variety in a given system. The word appears, but does not contaminate, since the new thinking remains quite confined: the contributions of Von Neumann and Von Foerster remain completely ignored, and still remain so in the disciplinary sciences closed on themselves. One can also say that Chaitin's definition of randomness as algorithmic incompressibility becomes applicable to complexity. Consequently, the terms chance, disorder, and complexity tend to overlap one another and sometimes to be confused.
    There were breaches, but still not an opening.
    This would come from the Santa Fe Institute (1984), where the word became essential to define, as "complex systems", dynamical systems with a very large number of interactions and feedbacks, inside which processes take place that are very difficult to predict and control, and which the classical conception was unable to consider.
    Thus, the dogmas or paradigms of classical science began to be disputed.
    The notion of emergence appeared. In Chance and Necessity, Jacques Monod gives great prominence to emergence, that is, to the qualities and properties that appear once the organization of a living system is constituted, qualities that evidently do not exist in its components taken in isolation. This notion is taken up, here and there, more and more, but as a simple observation, without being really questioned (whereas it is a conceptual bomb).
    It is in this way that we arrived at the complexity I call "restricted": the word complexity is incorporated into "complex systems theory"; in addition, here and there the idea of "sciences of complexity" was introduced, encompassing the fractalist conception and chaos theory.
    Restricted complexity has spread rather recently, and after a decade in France many barriers have been jumped. Why? Because more and more a theoretical vacuum was being faced, because the ideas of chaos, fractals, disorder, and uncertainty had appeared, and at this moment a word was needed that could encompass them all. Only this complexity is restricted to systems that can be considered complex because, empirically, they present a multiplicity of interrelated processes, interdependent and retroactively associated. In fact, complexity is never questioned nor thought epistemologically.
    Here the epistemological cut between restricted and generalized complexities appears because I think that any system, whatever it might be, is complex by its own nature.
    Restricted complexity made possible important advances in formalization and in the possibilities of modeling, which themselves favor interdisciplinary potentialities. But one still remains within the epistemology of classical science. When one searches for the "laws of complexity", one still attaches complexity as a kind of wagon behind the locomotive of truth, that which produces laws. A hybrid was formed between the principles of traditional science and the advances toward what lies beyond it. Actually, one avoids the fundamental problem of complexity, which is epistemological, cognitive, paradigmatic. To some extent, one recognizes complexity, but by decomplexifying it. In this way, the breach is opened, then one tries to clog it: the paradigm of classical science remains, only fissured.

    6. Generalized complexity
    But then, what is “generalized” complexity? It requires, I repeat, an epistemological rethinking, that is to say, bearing on the organization of knowledge itself.
    And it is a paradigmatic problem in the sense that I have defined “paradigm”. Since a paradigm of simplification controls classical science, by imposing a principle of reduction and a principle of disjunction to any knowledge, there should be a paradigm of complexity that would impose a principle of distinction and a principle of conjunction.
    In opposition to reduction, complexity requires that one try to comprehend the relations between the whole and the parts. The knowledge of the parts is not enough, and the knowledge of the whole as a whole is not enough if one ignores its parts; one is thus led to go back and forth in a loop to gather the knowledge of the whole and of its parts. Thus, the principle of reduction is replaced by a principle that conceives the relation of mutual whole-part implication.
    The principle of disjunction, of separation (between objects, between disciplines, between notions, between subject and object of knowledge), should be substituted by a principle that maintains the distinction, but that tries to establish the relation.
    The principle of generalized determinism should be replaced by a principle that conceives a relation between order, disorder, and organization; it being understood, of course, that order means not only laws, but also stabilities, regularities, organizing cycles, and that disorder is not only dispersion and disintegration, but can also be blockage, collisions, irregularities.
    Let us now take again the word of Weaver, from a text of 1948, to which we often referred, who said: the XIXth century was the century of disorganized complexity and the XXth century must be that of organized complexity.
    When he said “disorganized complexity”, he thought of the irruption of the second law of thermodynamics and its consequences. Organized complexity means to our eyes that systems are themselves complex because their organization supposes, comprises, or produces complexity.
    Consequently, a major problem is the relation, inseparable (shown in La Methode 1) , between disorganized complexity and organized complexity.
    Let us speak now about the three notions that are present, but to my opinion not really thought of, in restricted complexity: the notions of system, emergence, and organization.

    7. System: It should be conceived that “any system is complex”
    What is a system? It is a relation between parts that can be very different from one another and that constitute a whole at the same time organized, organizing, and organizer.
    Concerning this, the old formula is known that the whole is more than the sum of its parts, because the addition of qualities or properties of the parts is not enough to know those of the whole: new qualities or properties appear, due to the organization of these parts in a whole, they are emergent.
    But there is also a subtractivity which I want to highlight, noticing that the whole is not only more than the sum of its parts, but also less than the sum of its parts.
    Why?
    Because a certain number of qualities and properties present in the parts can be inhibited by the organization of the whole. Thus, even though each of our cells contains the totality of our genetic inheritance, only a small part of it is active, the rest being inhibited. In the relation between individual and society, the possibilities of liberty (delinquent or criminal in the extreme) inherent in each individual are inhibited by the organization of the police, the laws, and the social order.
    Consequently, as Pascal said, we should conceive the circular relation: "one cannot know the parts if the whole is not known, but one cannot know the whole if the parts are not known".
    Thus, the notion of organization becomes capital, since it is through organization of the parts in a whole that emergent qualities appear and inhibited qualities disappear.

    8. Emergence of the notion of emergence
    What is important in emergence is the fact that it cannot be deduced from the qualities of the parts, and is thus irreducible; it appears only starting from the organization of the whole. This complexity is present in any system, starting with H2O: the water molecule has a certain number of qualities or properties that hydrogen and oxygen, separated, do not have, and they in turn have qualities that the water molecule does not have.
    A recent issue of the journal Science et Avenir was devoted to emergence; to relate emergence and organization, one wonders whether it is a hidden force in nature, an intrinsic virtue.
    With the discovery of the structure of the genetic inheritance in DNA, it appeared that life is constituted from physicochemical ingredients present in the material world. From the moment it is clear that there is no specifically living matter, no specifically living substance, no élan vital in Bergson's sense, but only physicochemical matter which, with a certain degree of organizing complexity, produces the qualities of the living, among them self-reproduction, self-repair, and a certain number of cognitive or informational aptitudes, then vitalism is rejected, reductionism should also be rejected, and it is the notion of emergence that takes on cardinal importance, since a certain type of organizing complexity produces qualities specific to self-organization.
    The spirit (mens, mente) is an emergence. It is the brain-culture relation that produces, as emergences, the psychic, mental qualities, with all that language, consciousness, etc. involve.
    Reductionists are unable to conceive the reality of the spirit and want to explain everything starting from the neurons. The spiritualists, incapable of conceiving the emergence of the spirit starting from the brain-culture relation, make of the brain at most a kind of television set.


    9. The complexity of organization
    The notion of emergence is a capital notion, but it redirects to the problem of organization, and it is organization which gives consistence to our universe. Why is there organization in the universe? We cannot answer this question, but we can examine the nature of organization.
    If we already think that there are problems of irreducibility, of non-deducibility, of complex relations between parts and whole, and if we think moreover that a system is a unit composed of different parts, we are obliged to unite the notion of unity and that of plurality, or at least diversity. We then realize that it is necessary to arrive at a logical complexity, because we should link concepts which normally repel each other logically, like unity and diversity. Even chance and necessity, disorder and order, need to be combined to conceive the genesis of physical organizations, as in the plausible hypothesis that the carbon atom, necessary for the creation of life, was constituted in a star prior to our sun by the meeting at exactly the same time (an absolute coincidence) of three helium nuclei. Thus, in stars where there are billions of interactions and meetings, chance made these nuclei meet; but when this chance occurs, a carbon atom is necessarily constituted.
    You are obliged to connect all these notions, disjoined in the understanding that was unfortunately inculcated in us since childhood: order, disorder, organization.
    We then manage to conceive what I have called the self-eco-organization, i.e. the living organization.

    10. The self-eco-organization
    The word self-organization had emerged and had been used as of the end of the 50’s by mathematicians, engineers, cyberneticians, neurologists.
    Three important conferences had been held on the topic of "self-organizing systems", but, a paradoxical thing, the word had not taken root in biology, and it was a marginal biologist, Henri Atlan, who took up this idea again, in great intellectual isolation within his profession, in the 70s. Finally the word emerged in the 80s-90s at Santa Fe as a new idea, whereas it had already existed for nearly half a century. And it has still not imposed itself in biology.
    I call the living organization self-eco-organization, according to the idea that self-organization depends on its environment to draw energy and information from it: indeed, since it constitutes an organization that works to maintain itself, it degrades energy through its work, and therefore must draw energy from its environment. Moreover, it must seek its food and defend itself against threats, and thus must possess a minimum of cognitive capacities.
    One arrives at what I logically call the complex of autonomy-dependence. For a living being to be autonomous, it must depend on its environment for matter and energy, and also for knowledge and information. The more autonomy develops, the more multiple dependencies develop. The more my computer allows me an autonomous thought, the more it depends on electricity, networks, and sociological and material constraints. One arrives then at a new complexity in conceiving living organization: autonomy cannot be conceived without its ecology. Moreover, we must see in it a self-generating and self-producing process, that is to say, the idea of a recursive loop which obliges us to break with our classical ideas of product and producer, of cause and effect.
    In a self-generating or self-producing or self-poietic or self-organizing process, the products are necessary for their own production. We are the products of a process of reproduction, but this process can continue only if we, individuals, couple to continue it. Society is the product of interactions between human individuals, but society is constituted, with its emergences, its culture, its language, so that it retroacts on individuals and thus produces them as individuals, supplying them with language and culture. We are products and producers. Causes produce effects that are necessary for their own causation.
    The loop idea had already been released by Norbert Wiener with the idea of feedback, negative as well as positive (in the end mainly negative); it was then generalized without any real reflection on the epistemological consequences it carried. Even in the most banal example, that of a heating system with a boiler that heats a building, we have this idea of the inseparability of cause and effect: thanks to the thermostat, when 20° is reached the heating stops; when the temperature falls too low, the heating starts again. It is a circular system, in which the effect itself intervenes in the cause, allowing the thermal autonomy of the whole with respect to a cold environment. That is to say, feedback is a process that complexifies causality. But the consequences of this were not drawn at the epistemological level.
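    The thermostat loop just described can be written out in a few lines; a minimal sketch, with assumed heating and cooling rates:

```python
SETPOINT = 20.0   # degrees: the 20° threshold in the example

temp, heating_on = 15.0, True
for step in range(12):
    temp += 0.8 if heating_on else -0.5   # assumed gain vs. loss to a cold room
    heating_on = temp < SETPOINT          # the effect regulates its own cause
    print(f"t={step:2d}  temp={temp:4.1f}  heating={'on' if heating_on else 'off'}")
# the temperature oscillates narrowly around the setpoint: deviations are cancelled
```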
    Thus feedback is already a complex concept, even in non-living systems. Negative feedback is what makes it possible to cancel the deviations that unceasingly tend to form, like the fall in temperature below the standard. Positive feedback develops when a regulation system is no longer able to cancel the deviations; these can then be amplified and run away, a kind of generalized disintegration, which is often the case in our physical world. But we can see, following an idea advanced more than fifty years ago by Magoroh Maruyama, that positive feedback, that is, increasing deviation, is an element that allows transformation in human history. All the great transformation processes started with deviations: the monotheistic deviation in a polytheistic world; the religious deviation of the message of Jesus within the Jewish world, then, deviation within the deviation, its transformation by Paul within the Roman Empire; the deviation of the message of Mohammed, driven out of Mecca and taking refuge in Medina. The birth of capitalism is itself a deviation within a feudal world. The birth of modern science is a deviating process from the XVIIth century. Socialism is a deviating idea in the XIXth century. In other words, all these processes start as deviations that, when they are not suffocated or exterminated, are able to produce chain transformations.

    11. The relationship between local and global
    In logical complexity, you have the relation between the local and the global.
    One believed it possible to subsume the two truths of the global and the local under axioms of the style "think globally and act locally". In reality, I believe, one is constrained in our global age to think locally and globally together, and to try to act at the same time locally and globally. Also, and this too is complex, local truths can become global errors. For example, when our immune system rejects with the greatest energy the heart grafted onto it, as a nasty foreigner, this local truth becomes a global error, because the organism dies. But one can also say that global truths can lead to local errors. The truth of the need to fight terrorism can lead to interventions which favor even more the development of terrorism; just look at Iraq.