Grid computing is an emerging computing model that achieves high-throughput computing by harnessing many networked computers to form a virtual computer architecture capable of distributing process execution across a parallel infrastructure. Grids use the resources of many separate computers, connected by a network (usually the Internet), to solve large-scale computational problems. They can perform computations on large data sets by breaking them into many smaller ones, or run many more computations at once than would be possible on a single computer, by modeling a parallel division of labour among processes. Today, resource allocation in a grid is governed by service level agreements.
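The division-of-labour idea above can be illustrated with a minimal sketch: a large computation is split into chunks, each chunk is handed to a separate worker, and the partial results are combined. This is only an illustration of the principle (using local threads for brevity); a real grid would dispatch the chunks to separate machines, and the function names here are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker computes over its own slice of the data set.
    return sum(x * x for x in chunk)

def grid_sum_of_squares(data, workers=4):
    # Break the large data set into smaller chunks, one per worker,
    # then combine the partial results -- the parallel division of
    # labour described in the text.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

In a grid setting the same decomposition applies; only the transport changes, with chunks shipped over the network to peer machines instead of to local threads.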
In the nodal model, a nodal network is formed from many separate peers, each running the nodal reduction algorithm and connected by the communications nodal organisation. Each peer has a local node space, which consumes RAM and CPU resources, and has access to various external storage and processing resources such as hard drives, databases, compilers and people.
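The peer structure described above could be sketched as a simple data structure: a peer holding a local node space plus handles to external resources. This is a speculative illustration only; the class and attribute names (`Peer`, `node_space`, `resources`, `reduce`) are hypothetical and the `reduce` method is a stand-in, not the actual nodal reduction algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class Peer:
    # Hypothetical sketch of a peer in the nodal model: a local node
    # space (consuming RAM and CPU) plus handles to external resources
    # such as hard drives, databases or compilers.
    name: str
    node_space: dict = field(default_factory=dict)
    resources: dict = field(default_factory=dict)

    def reduce(self):
        # Placeholder for the nodal reduction algorithm: here it merely
        # resolves each node by calling the function stored under it.
        return {key: fn() for key, fn in self.node_space.items()}
```

For example, a peer whose node space maps `"answer"` to a function returning 42 would reduce to `{"answer": 42}`; in the actual model, reduction would instead traverse and interpret nodal structure across the network.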
The node spaces and nodal networks are unified, so that the runtime environment is a seamless continuation of the high-level application and organisational environment. The nodal reduction algorithm is both an interpreter and a compiler of nodal structure, so that all functionality is described without syntax and is therefore a seamless continuation of its conceptual model.