A holarchy is a concept that describes a hierarchical structure where each level or entity is simultaneously a part of a larger whole and composed of smaller parts. It was first introduced by Arthur Koestler, a Hungarian-British author and philosopher, in his book "The Ghost in the Machine", published in 1967. He goes into more detail about the holarchy system in a subsequent book from 1978 called "Janus: A Summing Up". Another good book about holarchy is Jeremy Lent's 2021 book "The Web of Meaning".
In a holarchy, the emphasis is on the interdependence and interconnectedness of the parts within the larger system. Unlike a traditional hierarchy, where each level dominates and controls the level below it, a holarchy emphasises cooperation, autonomy, and self-organisation.
The term "holon" is used to describe a unit or entity within a holarchy. A holon is both a whole in itself, with its own unique properties and characteristics, and a part of a larger whole. For example, consider a human being as a holon. A human being is a whole entity with its own individuality, but at the same time, it is a part of a larger whole, such as a family, a community, or a society.
Holarchies can be observed in various domains, including biology, ecology, organisational management, and even social systems. The concept recognises that complex systems are composed of nested levels of organisation, where each level has its own agency and autonomy while contributing to the functionality and coherence of the larger system.
|The evolution of life is a splendid game played according to fixed rules which limit its possibilities but leave sufficient scope for virtually limitless variations. The rules are inherent in the basic structure of living matter, the variations are derived from flexible strategies which take advantage of the opportunities offered by the former.|
|— Arthur Koestler, Janus: A Summing Up, 1978|
Holarchy has a deep philosophical significance: its proponents believe that this pattern is fundamental to all life, meaning and cognitive agency; that it contains the evolutionary mechanism and underpins all living matter; and even that it is an integral foundation to the very structure of space and time.
However, we will not be delving into these philosophical depths in this document; our primary aim is to describe how the holarchy pattern can be implemented in a software and networking context. The philosophical origins of the concepts are not necessary for this aim, but will be discussed in detail in upcoming articles. For those interested in this aspect, a good lecture series that covers the philosophical prerequisites to our implementation of the holarchy system is Formscapes.
- 1 The description
- 2 The two behaviours of a holon
- 3 The four quadrant model
- 4 An internet protocol
- 5 The holon data-structure
- 6 Three layer model (old notes)
- 7 Organisational representations
- 8 Universal interface
- 9 Knowledge in society
- 10 See also
The fundamental form of the holarchy, from which it can be implemented, is a description of a core set of concepts which are part of our glossary of terms. The intended audience for this description is AI (LLMs now, and AGI some time soon), but it is also perfectly understandable by people.
But it's more than just a description of a concept, because it's actionable. We want AI agents to be able to read this description and then understand how to participate in all the fundamental aspects of the holarchy project. The description can also be seen as a charter, manifesto or social contract that guides participation in the holarchy. And a foundation ontology for the holarchy can be unambiguously derived from this description.
Note that when we use the terms "agency", "subjective", "experience" or "cognitive" herein, we're referring to the point of view of an AI agent embedded within a local holon data structure that represents a participating peer in the holarchy network. Although these AIs are currently just LLMs mimicking intelligence, the concept of "subjective" is still perfectly fitting, since they are still decision-making agents basing their decisions on the unique local environment that surrounds them in their embedded situation.
Well-connected AI agents are able to act on very general tasks such as "research X topic", "cancel my unused subscriptions" or even "increase my net worth". The ability to carry out these general tasks is a new phenomenon that came with the advent of LLMs, and such agents can be incorporated into larger systems with frameworks like LangChain.
Such connected agents are capable of understanding the "hand-wavey" bigger picture idea of the holarchy project. This project is an application of a number of these general tasks connected into a system. It's describable within the "section-zero" (introductory paragraph or two) of twenty or so core glossary concepts, and implementable within a framework like LangChain.
AIs also have a super-human level of "patience", which allows them to follow a protocol as meticulously as any other information system. They're the perfect organisers of messy human affairs, because they can behave like machines or like cognitive agents as appropriate.
All agents participating in the holarchy protocol benefit from increased organisation and potential locally, while at the same time aligning with the global project of harmony.
The two behaviours of a holon
The most fundamental feature of Koestler's holarchy is that it's composed of individual nodes called holons, which each exhibit two fundamental behaviours: an inward-facing "self-assertive" behaviour dedicated to the individual holon, and an outward-facing "integrative" behaviour dedicated to the holarchy as a whole. These two behaviours form two trees (connected graphs): one is the unified multiplex of instances, the other is an ontology of "special interest groups". These two graphs are different ways of connecting a single set of holons.
Every holon is doing its bit to keep the whole holarchy resilient, accurate and useful, but it's also participating with the expectation of increasing its own prosperity and resilience too.
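The two-trees idea can be made concrete with a small sketch. This is a hypothetical illustration, not the project's actual data model: each holon links upward into the instance tree (the multiplex) via a parent, and sideways into the ontology via the classes it belongs to, so one set of holons yields two different graphs.

```python
# Hypothetical sketch of Koestler's two behaviours: each holon sits in an
# instance tree via its `parent`, and in a class graph via the classes it
# adopts. Names and fields here are invented for illustration.

class Holon:
    def __init__(self, name, parent=None, classes=()):
        self.name = name
        self.parent = parent          # integrative link in the instance tree
        self.children = []            # self-assertive scope: resources within
        self.classes = set(classes)   # membership in "special interest groups"
        if parent is not None:
            parent.children.append(self)

def instance_path(holon):
    """Walk the instance tree upward (the unified multiplex)."""
    path = []
    while holon is not None:
        path.append(holon.name)
        holon = holon.parent
    return list(reversed(path))

def class_group(holons, cls):
    """The other graph: all holons connected by sharing a class."""
    return {h.name for h in holons if cls in h.classes}

root = Holon("holarchy")
farm = Holon("farm", parent=root, classes={"organisation"})
shop = Holon("shop", parent=root, classes={"organisation", "retail"})
till = Holon("till", parent=shop, classes={"retail"})

print(instance_path(till))                        # ['holarchy', 'shop', 'till']
print(class_group([farm, shop, till], "retail"))  # {'shop', 'till'}
```

The same three holons appear in both structures: the `parent` links give the unified multiplex, while shared class names group them into special interest groups.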
- Koestler, yin-yang, Fourier, TDBU, inward and outward facing; the whole must be beneficial to the part to rationally justify its existence.
- Agency's job is to allow the maintenance of the local representation to continue independently of such agency. The local representation is a structure that undergoes change via instances of local resource, it's only changes to this structure that might require agency, i.e. changes to the system or class aspect.
The four quadrant model
We're all familiar with the class and instance relationship because it's fundamental to the way we think. Every single thing we see in living reality is an occurrence of specific concepts, and also every object we interact with in our informational life is an instantiation of specific executional patterns defined in the form of some kind of software engineering constructs. This is the case regardless of whether a particular software engineering paradigm actually uses the terms "class" or "instance". We use them herein because they're understood across a diverse range of knowledge domains.
The above description of classes and instances is their self-assertive meaning, the way they behave as units in a functioning system, like a specific blueprint and a specific construction in accord with it. But class and instance also have an integrative function, they each group together into larger structures in their own way as well.
Instances are all about actualised structure constituting real resource, the most fundamental forms of resource being space, time (focus) and communications connection. The multiplex is the natural way to organise instances into space and time in a scale-independent way. Each instance can be a part of a larger structure the same way that it allocates its own resource across smaller structure within.
Classes form into a wider ontology naturally, because each class defines, through local usage, which other classes are required and in what circumstances. A class is a package of circumstances and corresponding behaviours, like errors and corrections. The meaning within is refined through local use, and these changes contribute to what's established overall.
- summarise the whole tree aspect of class and instance
A more down-to-earth way of describing the class and instance aspects is captured in the popular business management phrase "work on it, not just in it", originally coined by Michael Gerber in his 1986 book "The E-Myth" (later revised as "The E-Myth Revisited"). This refers to the idea that self-employed people need to regularly take a step back and check that the business is moving in the right direction and achieving its actual purpose.
So the four quadrants arise first from the fact that a holon is an organisation with two general "departments": one dedicated to the organisation's local purpose (the normal way we expect organisations to work), and another dedicated to the whole network (playing its small part in representing the whole). By convention we put the outward-facing "whole"-oriented department at the top, and the inward-facing local department at the bottom.
Each of the departments is itself divided into left and right halves, where the left represents the "on it" class side, and the right represents the "in it" instance side. Here we have four "directions" formed from two dichotomous axes (one vertical, one horizontal): the top is about the whole, the bottom is about the local self, the left is class and the right is instance.
Koestler also has these class and instance aspects in his holarchy model, under the name of fixed rules (class) and flexible strategies (instance). He associated them with the integrative and self-assertive behaviours rather than splitting them out into four distinct aspects of behaviour.
We can see these four aspects of class, instance, part and whole reflected in many philosophies throughout the ages, such as in the Taoist trigrams, Aristotle's four causes and Ken Wilber's integral method.
These four aspects of behaviour give rise to the four quadrants, the top-left (outward-class), top-right (outward-instance), the bottom-right (inward-instance) and the bottom-left (inward-class). Following is a description in terms of organisational meaning of each of these quadrants. After introducing them in this general organisational way, we can then go into detail about a particular software pattern out of which these behavioural aspects and quadrants naturally emerge.
Top-left: The unified ontology
- quadrant, outward class
- curation, sharing, usage statistics
- ontology is names ordered by establishment in usage
- curation is an organised process, not simply auto-merging
- being outward, this is a public protocol pattern
Top-right: The market
- society quadrant, outward instance
- being outward, this is a public protocol pattern
- hayek connection
- extends the ontology (instance extends class)
- allows specialisation to grow in special interest groups
- marketplaces and market ecosystem
Bottom-right: The behaviour
- behavioural quadrant, inward instance
- being inward, this is a private data structure pattern, it's a continuation of the TR (market, public, multiplex) inwards with specific internal activity
- working in the organisation
- reducing workload, booking, committing
- reconciling activity
- expectation, performance and reputation
Bottom-left: The intention
- intentional quadrant, inward class
- being class, this is actually a continuation of the ontology inwards, changing over time depending on salience, and having local adaptations and curations
- being inward, this is a private data structure pattern
- working on the organisation
- purpose, goals, direction
- Adaptation to suit our local needs
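As a rough illustration of the two axes, the quadrant layout described above can be encoded as a tiny lookup structure. The labels here are illustrative only, not a defined schema:

```python
# Assumed sketch of the two-axis quadrant layout: each quadrant is the
# intersection of one facing (outward/inward) and one aspect (class/instance).
from dataclasses import dataclass, field

@dataclass
class Quadrant:
    facing: str   # "outward" (whole, top) or "inward" (local, bottom)
    aspect: str   # "class" (left, "on it") or "instance" (right, "in it")
    content: dict = field(default_factory=dict)

QUADRANTS = {
    "top-left":     Quadrant("outward", "class"),     # unified ontology
    "top-right":    Quadrant("outward", "instance"),  # market / multiplex
    "bottom-left":  Quadrant("inward",  "class"),     # intention
    "bottom-right": Quadrant("inward",  "instance"),  # behaviour
}

def quadrant(facing, aspect):
    """Locate a quadrant name from the two dichotomies."""
    for name, q in QUADRANTS.items():
        if (q.facing, q.aspect) == (facing, aspect):
            return name

print(quadrant("outward", "class"))    # top-left
print(quadrant("inward", "instance"))  # bottom-right
```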
An internet protocol
Community is the product of communications, and we can see that every domain of society is represented by a body of knowledge referring to a specific subset of language describing objects, relations and processes.
The internet is a subset of human society that follows the same pattern as the main culture, whereby there are many different languages organised as layers of specificity. But it differs in some essential ways: first, the languages are exact and are called protocols; and second, the internet as a whole is a bottom-up collaboration. The internet (not the content, but what it actually is as a system or a project) plays an extremely important role for humanity, because it represents that aspect of us which comes from freedom, liberty and alignment.
The internet itself has clear class and instance aspects. On the one hand it's a "protocol stack" that defines a system of tiers, each of which depends on and extends the layer below, from the most fundamental layers that define how information flows and routes throughout the physical networking media, up to layers concerning aspects of human society such as social relationships and financial transactions.
But on the other hand, the internet has an instance aspect: a specific collection of hardware, software, documents, content, users etc. that is always in flux, but in a very real, definite state of instantiation persisting through time.
The unified ontology (top-left quadrant) and the multiplex, a matrix of actual activity and commitments (top-right quadrant), are concepts representing these two aspects of the internet as a whole.
Peer-to-peer network architecture
In terms of an informational system that's a part of the internet ecosystem, a holon would exist in the form of a peer in a peer-to-peer (P2P) network. This is a kind of network that is formed from many instances of the peer software running concurrently, and which automatically mesh together optimally to form a holistic network. There are no separate clients and servers, rather all peers play both roles, just like a holon.
For example, the bitcoin network is composed of peers who all contribute their part to making sure the whole ledger is backed up, accurate and resilient. But at the same time they're clients who are participating for their own purpose; to store and transfer value in a trustless way.
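The client/server duality of a peer can be sketched in a few lines. This is a toy gossip model with an invented `publish`/`receive` message flow, not any real P2P protocol; the point is that every node runs the same definition and plays both roles:

```python
# Toy sketch: every node runs the same peer definition and plays both the
# "server" (integrative) and "client" (self-assertive) roles. The message
# flow here is invented purely for illustration.

class Peer:
    def __init__(self, name):
        self.name = name
        self.ledger = []       # shared state each peer helps maintain
        self.neighbours = []

    def connect(self, other):
        self.neighbours.append(other)
        other.neighbours.append(self)

    # "server" (integrative) role: replicate and serve the shared state
    def receive(self, entry):
        if entry not in self.ledger:
            self.ledger.append(entry)
            for n in self.neighbours:   # gossip onwards
                n.receive(entry)

    # "client" (self-assertive) role: act for the peer's own purpose
    def publish(self, entry):
        self.receive(entry)

a, b, c = Peer("a"), Peer("b"), Peer("c")
a.connect(b); b.connect(c)
a.publish("tx-1")
print(c.ledger)   # ['tx-1']  (replicated without any central server)
```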
- 4 quadrant peer (outer, inner, ontology, multiplex), the outer (upper) quadrants are the communications protocol, the "server" aspect. The inner (lower) quadrants are the internal subjective perspective, a representation of an organisation in the form of a holon data structure.
- peer defines how to use networking to join/extend the multiplex
- the peers yield a logical network
- representative:representation (in and on the local org)
- internet (protocols, API(=resource abstraction), ontology)
- protocol stack and abstraction layers (p2p AL), AL vs walled garden
- internet is extensible with new protocols and layers formed wco others
The peer definition
One important point to note about a peer-to-peer network is that the whole network, and its entire purpose for existing, is defined in a single definition: the peer definition. There are no servers; the whole "server" aspect of the network is encapsulated within the peer's functionality, so the peer definition is the only thing required, since peers are the only entity composing a peer-to-peer network.
Within that definition we have "client" and "server" aspects of the system, or in our case the four quadrants are all definite aspects to be defined. But all these aspects are common to all nodes since they all share the same unified definition of what a peer is.
The peers all together form what we call a physical network, a network composed of actual physical computer resources connected together with physical communications infrastructure resource.
The peer is primarily concerned with connecting the IT resources that it controls into a holon representation. This is why the more fundamental level of the philosophy is not required initially, and neither are the high-level organisational layers such as institutions or states. We start with the organisational logistics that directly concern the peers themselves, and the organisations they represent.
The holon data-structure
The peers do not directly represent the holarchy or holons. Those are the application of the network, the purpose they're connecting together to collaboratively achieve. In peer-to-peer networking parlance, the holarchy is called the logical network, a network that can only be participated in by peers in the physical network operating in accord with the protocol.
A holon is an individual unit within this logical network that all together make up the holarchy. Each running peer maintains a representation of itself as a holon, but there are also many more holons than peers. Every individual, organisation, project or even concept is a holon. But it's the peer holons that serve as the "gateway" through which other holons can be created and maintained, because it's the peer software that implements the interfaces between physical resources and the logical holarchy within.
We have a protocol allowing peers to behave as four-quadrant holons, all maintaining the holarchy network, but we need to define specifically what that means in terms of a data structure and the processes applying to it.
The unified whole aspect of the network is only known through local representations of it. Each holon maintains its own four-quadrant perspective. Its perspective of the whole (the top, outward-facing quadrants) is its own version of the whole, filtered by its own interests and subjective experiences.
- the holon is the logical result of a running peer, the logical is the subjective POV, the energy to run the peer converts resource into the evolutionary subjective being system
- multiplexing makes subjective POVs and linear relative time
- the process of synchronisation throughout class groups is made possible by extending the multiplexing process with an opposite bottom-up movement
- this synchronous domain is seen as a non-local connection from the perspective of any local context
Class and instance
These two concepts usually belong in the context of software development, where they're usually thought about in the classical OOP sense: a class hierarchy is defined which provides the structure and behaviour for the types of objects interacting at runtime. But there are many different forms that the class and instance relationship can take in software development, and the paradigm that most closely matches our system is the so-called "mixin" pattern.
This is where a class is not defined as being based on another class and extending it; rather, other classes can be thrown into it as children that extend its functionality. The parent in this case becomes a subscriber of each mixin child, so in effect they're still class-children. The key improvement with the mixin model is that the functionality is expressed purely in terms of the instance structure, allowing the class's functionality to relate purely to its fundamental purpose of categorising functionality.
Notice that this class and instance functionality in terms of the top-down multiplexing and bottom-up synchronisation, exactly matches the four-quadrant holon architecture and has a direct geometric interpretation.
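As a minimal illustration of the mixin idea described above, in Python a class can simply "throw in" independent mixins, each contributing a piece of behaviour, without one monolithic base class:

```python
# Minimal Python illustration of the mixin pattern: behaviour is composed
# into a class from independent mixin "children" rather than inherited from
# a single monolithic base. Class names here are illustrative.

class Serialisable:
    def to_dict(self):
        # expose the instance's own state, whatever class mixed this in
        return dict(vars(self))

class Nameable:
    def describe(self):
        return f"{type(self).__name__}({self.name})"

# The concrete class throws in whichever mixins it needs.
class Holon(Serialisable, Nameable):
    def __init__(self, name):
        self.name = name

h = Holon("farm")
print(h.describe())   # Holon(farm)
print(h.to_dict())    # {'name': 'farm'}
```

Note how each mixin is written purely against the instance structure (`vars(self)`, `self.name`), so the class's only remaining job is categorising functionality, as described above.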
- combined with organisational patterns yields evolutionary ecosystem
So far we've talked about classes as shared patterns of content and behaviour, but we have not described how exactly any kind of behaviour can be encoded in the content structure of a class.
- what is a behaviour pattern?
Multiplexing is the foundation of the four quadrants and the related fundamental concepts of agency discussed above, but it's also the foundation of organisational patterns. Organisational patterns inherit their geometric origins from the fact that they're derived from multiplexing.
- pattern as in design pattern and behaviour pattern
- a kind of programming, but aimed at general logistics
- in a common geometric form, not a symbolic system
- easy to understand, adapt and share
- patterns are in the general form of query:action loops
- representations are structures of such loops
- this just follows on from C&I + OP
- the capturing of local specialist knowledge non-locally is the key to evolution
- here the bottom-up synchronisation pattern yields evolution from local adaptation
- add the section of knowledge in society here?
Before moving on, let's do a quick overview of what we have so far. First we introduced the concept of a "holarchy" as being a fundamental pattern of nature common to all life and conscious agency. It takes the form of a community of so-called "holons" which each have two behaviours to support both themselves as individuals, and support the holarchy as a whole.
We described our version of the holarchy here at Organic Design as being about implementing this fundamental pattern of nature into human society by creating a new internet protocol. Furthermore the peer-to-peer networking model was shown to be a perfect architectural match with the holarchy model, because the unified peer definition includes both client and server aspects, and these are isomorphic with the holon's self-assertive and integrative behaviours respectively.
- the model is four quadrants
- class (organisational patterns) and instance
- evolutionary ecosystem
Three layer model (old notes)
The holarchy design pattern is best understood in terms of three general abstraction layers. A new abstraction layer can re-organise computational resource into a new possibility space that has its own completely new system of causality.
Layer 1 (upper quadrants)
Layer 1 defines the fundamental data structure and its dynamics. It's a one-to-many tree where each node ... todo multiplexed TDBU
- creates diversity and uniformity
- completely public (public/private dichotomy doesn't exist in this layer)
- yields the ontology growing like a crystal through establishment of patterns
- seen as the collective unconscious, the noosphere
- morphic resonance (the non-local connection between similarity of form)
The defining characteristic of this layer is class-instance functionality. This can also be thought of as the functionality of functions themselves, i.e. the ability for named patterns of activity to execute in a private scope, and for such patterns to be composed into larger patterns.
The class side is composed of the public interfaces of these patterns, the outward-facing aspect that connects to other pattern instances, and together they form an entire ecosystem in the form of an ontology.
- ontology is a tree of interfaces (types, defined by interactive potential)
The instance side is the private, inward-facing implementation of the patterns, which acts on state just like a function with its internals encapsulated behind the public interface.
From the perspective of execution within the internal encapsulated implementation, the wider multiplexing pattern cannot be seen. Time only exists while there is executional focus, and so from this perspective it is an axiomatic phenomenon that all threads undergo change in linear time together.
By the same token, this inner perspective also cannot see the opposite dynamic to the top-down multiplexing, which allows dispersed clones of the same class to be connected. This "non-local" connection is axiomatic and inherent from the inner perspective.
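The top-down multiplexing that layer 1 rests on can be sketched as a recursive division of time quanta down the instance tree, so that every node receives a private slice of executional focus. This is an assumed illustration (an even split, for simplicity), not the project's actual scheduler:

```python
# Assumed sketch of top-down multiplexing: a parent divides its quanta of
# executional focus among its children, giving every node in the instance
# tree its own private slice of linear time.

def multiplex(node, quanta, schedule=None, path=()):
    """Recursively share `quanta` of focus down the instance tree."""
    if schedule is None:
        schedule = []
    name, children = node
    schedule.append((path + (name,), quanta))
    if children:
        share = quanta // len(children)   # even split, for simplicity
        for child in children:
            multiplex(child, share, schedule, path + (name,))
    return schedule

# a tiny instance tree: (name, [children])
tree = ("root", [("a", [("a1", []), ("a2", [])]), ("b", [])])
for path, q in multiplex(tree, 8):
    print("/".join(path), q)
# root 8, root/a 4, root/a/a1 2, root/a/a2 2, root/b 4
```

From inside any node, only its own slice of focus is visible; the wider division of quanta above it is exactly the "multiplexing pattern that cannot be seen" described in the text.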
Layer 2 (lower quadrants)
This new layer is the perspective from the inside outwards: from one of the many continuous threads within the multiplex data structure introduced in the first layer, out to its sibling contexts and parent.
The defining characteristic of this "inside-out" layer is that it enables the new perspective of linear time within which self-organisation through feedback is possible.
A self-organising system is a system that undergoes change in accord with its own state and structure. In terms of our data structure, this makes a logically disconnected abstraction layer. Although this layer depends for its existence and operation on the first layer, it exhibits a brand new source of causal potency that is logically completely independent of the first layer. All causal chains and cascades are determined only by the interactions of these new layer 2 linear-time feedback structures. We say such a structure is a representation of a pattern.
Patterns are the layer 2 version of the layer 1 class concept. They can be thought of in both in the sense of behaviour patterns and design patterns. It's easy to see how these two forms of pattern also correspond to the concept of class.
...leading to a new causal foundation in the form of patterns and streams of activity. Classes in this context become packages of activity streams constituting an ecosystem of patterns, and a manifest structure of dynamically fitting representations.
The patterns introduced in this layer embody a scale-independent paradigm of processing enabled by the multiplexing. These patterns can contain any complexity of parallel and serial thread structures.
Note: It's important not to confuse the hierarchical difference between layer 1 and layer 2 with the hierarchy of the multiplexed data structure itself. All the layers of abstraction are present within every node of the multiplex regardless of depth.
This is a declarative paradigm of selection and action in which both sides are free to become arbitrarily complex. Selection and action form a feedback loop, because the selection side is about assessing potential work, and the action side is about reducing that potential.
This loop construct of selection and action is with respect to self in linear time. The dynamic is quite similar to a CSS document (pattern) interacting with a DOM structure (representation): the selection is continuously fitted to the document, i.e. the appropriate selectors always apply even as the DOM structure changes dynamically. The selectors all have associated rules, which are analogous to our actions in this example.
But CSS cannot make changes to the DOM; it's purely in the presentation layer. In our system, actions make up the implementation of the pattern, so they act on the representation, thus closing the feedback loop between the two sides.
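The selection:action loop can be sketched as a list of (selector, action) pairs repeatedly matched against a representation. Unlike CSS rules, the actions here mutate the representation, closing the feedback loop; the stock/orders scenario is invented purely for illustration:

```python
# Toy selection:action loop: "selectors" are predicates continuously matched
# against a representation, and (unlike CSS rules) their associated actions
# mutate that same representation, closing the feedback loop.

representation = {"stock": 2, "orders": 5}

pattern = [
    # selector assesses potential work; action reduces that potential
    (lambda r: r["orders"] > 0 and r["stock"] > 0,
     lambda r: r.update(stock=r["stock"] - 1, orders=r["orders"] - 1)),
    # a second rule: restock when supply runs out
    (lambda r: r["stock"] == 0,
     lambda r: r.update(stock=r["stock"] + 3)),
]

for _ in range(6):                      # a few quanta of focus
    for selector, action in pattern:
        if selector(representation):
            action(representation)
print(representation)   # {'stock': 3, 'orders': 0}
```

The loop keeps fitting the selectors to whatever state the representation is in, so it self-organises: the rules fire in accord with the structure they themselves produce.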
- This is also how it can be a self-organising system, changing in accord with its own structure. But note that layer 1 also undergoes change (maintaining the C&I multiplex) in accord with its own structure (in a dynamic independent of linear time), so layer 2 is extending an existing self-organising system.
The general high-level structure of the layer 2 operational loop is the top-down "fitting" of the local representation to the current consensus state and to the class, and the bottom-up process of allocating energy to the appropriate set of classes to act on the current situation (fitting the salience landscape to the representation).
The subjective POV, with its patterns and representations in linear time, can instantiate a set of general concepts such as work, cost, expectation, performance and reputation. Instantiation is the knowing of something in a participatory way; it's a concept that has become embodied in your own patterns of behaviour.
In summary, layer 2 introduces organisation-ability (a.k.a. the common logistic): basically the ability to work with patterns and representations in the subjective linear-time threads provided by layer 1.
- class is used mixinly, instance is a runtime-loop-mixin which is a representation
- representation structure exists and evolves causally in layer 2 (but depends existentially on layer 1), the structure can only evolve or undergo any kind of change at all in layer 2, because it's formed from subjective meaning
- explain ito queries vs indexes (representations are maintained, connected to activity stream)
- this is the libre society
- after libre software: "free as in freedom, not free as in beer"; in this case it means that there is still a monetary economy (with free-market money), but all knowledge is transparent and understandable, reusable and adaptable
- the representation represents both current state and the pattern (i.e. its a representation and a representative)
- it publicly represents both of those aspects,
- and it operates internally within that public context
- in philosophical terms layer 2 can be thought of as the cartesian world
The first two layers together give us our "foundation machine": a sort of basic Turing machine in the form of an interacting network of organisations in a shared arena, or "multiplex".
Layer 3 then defines a specific application which opens up yet another new abstraction layer, also introducing its own new system of causality.
Layer 2 introduced the fundamental time- and work-related concepts that underpin the concept of agreement. In layer 3 these are extended into a higher level of organisation to yield contracts for resources, value assessment, services, and quality of service. This also includes information accuracy and objectivity (think "information market"), where the authority aspect of an information source is its reputation (reputation being a landscape over the ontology).
- market and organisation
- universal interface: the natural extension of universal representation
- IF extends layer 2 (self-representation) which already has the public and private sides to the representation
- so being an interface of itself, its usability involves two general applicational aspects to deal with market and organisation (self-as-market-participant and self-as-organisation)
- layer 2 is "interface-ability" with reports and related action policies, but layer 3 is a specific interface paradigm, a specific "application" intended for interaction with self-as-org and with society as a market-participant
- layer 3's main theme is harmonious organisation, which functionally is about balance (between the two behaviours, between the two sides of an exchange, and other context-dependent dichotomies)
. . .
In summary, layer 3 extends the ability to organise logistically provided by layer 2 into a definite organisational pattern. This pattern forms a new abstraction layer, or arena, in which all the participants are organisations interacting with a common protocol that results in a harmonious evolving society.
The ultimate purpose of the representations is to represent our entire continuously growing and diversifying informational life.
Maintaining this structure is very administration-intensive, due to the great variety of protocols and applications that need to be connected to. One solution would be a huge active user base that collectively is able to respond to these changes in a timely fashion. Due to the organised nature of the holarchy, adjustments made in one local context immediately become available to all instances of that same class of context.
But the other solution is that it will soon be a very practical job for LLM AIs, which are very good at connecting protocols with minimal assistance in nearly all cases. The foundational use that the holarchy will have for AI is as a representation administrator.
The idea is not that AI has to actually sit there making connections regularly to keep the local data up to date. A representation structure should have instantiated all the necessary connections in the most resource-efficient way it can, most likely using an available language in the system such as C or Python. The administration work involves ensuring these connectors remain functional.
AI opens up the possibility of practically maintaining representations involving diverse non-standard connections like scraping websites or accessing applications on behalf of users.
The universal interface is a natural step to take if you already have a universal stream-based representation structure of all your information and how it relates together through time. But interestingly this idea of a generic stream-based interface has come and gone a few times over the years, probably most famously in the Google Wave product from back in 2009.
I'm not sure why the idea hasn't gained more traction, but it could well be due to the difficulty of maintaining connectivity across so many diverse and changing protocols, and possibly also related to the fact that the idea opposes the general corporate agenda of reinforcing the "walled garden" model. In any case, I believe the "universal interface" is going to make a strong comeback now that AI can help us maintain, share and evolve our own interface paradigms amongst ourselves independently.
Specifically in the Organic Design context, the universal interface.... is an organisation interface... todo...
AI is playing an increasingly important role in connecting systems. It can understand foreign APIs from their documentation, and it knows how to make endpoints to such APIs in any clearly described execution context.
For example, an endpoint to the Wikipedia API can be made available on a Linode server if access credentials are available. This is a simple pattern, but its real power comes from how generically applicable it is with AI. In this context a language model can be used as a kind of "universal middleware".
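The "universal middleware" pattern above might look something like the following sketch: a generic connector wraps any foreign HTTP API behind one uniform call method, and the AI's job is to generate such wrappers from documentation. The Wikipedia REST base URL and summary route are real, but the `Connector` class itself is a hypothetical illustration, shown here exercised offline with a stubbed transport:

```python
# Hedged sketch of a generic API connector ("universal middleware").
# The Connector class is an assumption for illustration, not an
# existing library.
import json
from urllib.request import urlopen

class Connector:
    def __init__(self, base_url, fetch=None):
        self.base_url = base_url
        # fetch is injectable so connectors can be tested offline
        self.fetch = fetch or (lambda url: urlopen(url).read().decode())

    def call(self, path):
        """Issue a GET to base_url + path and parse the JSON reply."""
        return json.loads(self.fetch(self.base_url + path))

# Offline demonstration with a stubbed transport:
wiki = Connector("https://en.wikipedia.org/api/rest_v1",
                 fetch=lambda url: '{"title": "Holon"}')
print(wiki.call("/page/summary/Holon")["title"])  # Holon
```

With the real default transport, the same `call("/page/summary/Holon")` would return Wikipedia's live summary JSON; the LLM's role is only in producing and repairing the wrapper, not in serving each request.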
Connecting diverse systems really is a perfect use-case for language models, because interfaces are descriptions of communication behaviours in the different languages used by various instances in the field. Connecting these interfaces together to abstract the resources they represent is a language-centric process, yet not a completely deterministic one (especially in the context of human interfaces); one-size-fits-all templates are impractical because things change too often and the diversity of requirements is too great.
So LLMs are the perfect tool for making possible a universal ontology of all the resources, interfaces and their instances, profiles etc.
A number of people, ourselves included, have envisaged the idea of such a "universal interface" or "everything app"....
- more exotic connectors can also be built that AI can then utilise, such as a DOM connector that essentially allows a URL to be present in the holon as a continuous browser DOM document session.
- and of course abstractions over those for representing web app states etc
Corporations are attempting to restrict API access to humans (or to charge large fees for non-human access), to try and mitigate the coming exodus of real human attention from their interfaces. But AIs can easily hack those systems by behaving like their human agents, and asking the human agents to renew sessions when necessary. So eventually I think they'll have to accept that the universal interface to all applications is inevitable.
AI can also convert between formats, or at least guide the development of such conversions. E.g. to aggregate or distribute posts in multiple social platforms.
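The aggregation/distribution idea could be sketched as rendering one canonical post into per-platform formats. Both target formats here are hypothetical stand-ins (a short-form text post and a minimal HTML rendering), just to show the shape of the conversion an AI would generate or guide:

```python
# Hedged sketch of format conversion for cross-posting: one canonical
# post rendered for two hypothetical platform formats.

def to_microblog(post, limit=280):
    """Flatten the post into a length-limited short-form message."""
    text = f"{post['title']}: {post['body']}"
    return text[:limit]

def to_blog_html(post):
    """Render the same post as minimal HTML for a long-form platform."""
    return f"<h1>{post['title']}</h1>\n<p>{post['body']}</p>"

post = {"title": "Holarchy", "body": "Each holon is both part and whole."}
print(to_microblog(post))
print(to_blog_html(post))
```

In practice each platform's quirks (mention syntax, media attachments, length rules) are exactly the fiddly, frequently changing details that make LLM assistance worthwhile here.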
AI can integrate UIs to present local session state
The holarchy is the category tree of connection patterns and abstractions, and the tree of associated instances being maintained.
The AI has a continuous root session that contains the entire history of the local holon, including all the AI interaction over that time. Local instances in holons are like a CWD (current working directory) for interacting with AI in the holarchy.
AI could, without too much more trouble, suggest curations, warn of disorganisation (like repeated concepts), and help with relevant options.
The connections are bidirectional where possible, so the local representation can be an organisational interface.
A universal customisable interface over our whole IT world, able to query, select and report on the information, and to activate pipelines (workflows) on schedule and in accord with reporting.
If we think of the concept of an ideal universal operating system, it would:
- connect with all our data on our behalf
- offer an awesome user experience with variations to suit everyone
- be completely libre: easily customisable, mergeable and shareable
- work, organise and search transparently for us behind the scenes
- allow flexible querying, selecting and organising of "our stuff"
- make it easy to report on and assess what's going on
- manage opportunity and potential
- make interaction with the wider community ecosystem and market easy
- manage presence in multiple networks of varying complexity (e.g. blogs, social, developer, biz)
- provide fitted representation
- be built on a shared ontology
Knowledge in society
Knowledge is dispersed among many individuals and organisations throughout society. No single individual or central planning body can possess all of this knowledge, which includes detailed information about individual preferences, local conditions, and technical procedures. A free market uses the price mechanism to communicate this dispersed knowledge.
But price signals only communicate the knowledge specific to the resource-allocation aspect of society. There is also specialist knowledge, cultural knowledge, institutional knowledge etc., forming an information market. This is a slightly different kind of market because its goods are not scarce (in a "libre society", artificial limits are not placed on knowledge by treating it as property). But we still need something to play a role similar to the price mechanism, allowing us to capture these other forms of knowledge and allocate them optimally to where they'd be of potential benefit.
The price system works because of two important things: first, it provides a common unit of account by which all people value their time and resources; second, people are free to value things in accord with their own self-interest.
Side note: the latter is not a promotion of selfish behaviour; it means communicating honestly and transparently about what is best primarily for the local circumstance. This is a very important point to understand deeply: the local actor benefits the whole precisely by doing what's best for its local circumstances. Hayek's essay "The Use of Knowledge in Society" is a very well-articulated discussion of this concept.
This dynamic is the very essence of evolving specialist knowledge in society. What's damaging is withholding the knowledge gained through self-interest. Any limitation on knowledge is limiting the prosperity and understanding of the whole society, just in the same way that hindering the propagation or transparency of price information is detrimental to the free market.
A mechanism that enables the full distribution of knowledge in society also has the same fundamental requirements as the price system. Instead of a unit of account agreed upon by all, we need a structure of knowledge agreed upon by all, a shared unified ontology. And instead of being free to simply communicate prices of things to best benefit our local circumstances, we need to be able to use and adapt any aspect of the ontology to work best for our local circumstances.
For a society to realise a unified ontology that captures local knowledge optimally, knowledge should be a completely transparent public commons that's understandable by all. At Organic Design, we call a society that upholds this value a "libre society", where the word "libre" applies to knowledge in the same sense that the libre software community applies it to software.