
Natural Machine Logic

Logic is used within its own discipline, as a shorthand that lends conciseness and precision to philosophical discussion, and in practical applications for the computational control systems that guide our tools, appliances, factory machinery, vehicle subsystems, and the like. Of the billions of microprocessor "logic engines" fabricated each year, 98% are used in these "low-end" control systems, with the rest performing in our personal computers and even more complex systems.

Natural Machine Logic (NML) rests on a foundation of revised and expanded fundamental logic operators and corresponding hardware logic elements. It also provides a new language and methodology by which a process specification, particularly one for a real-time or safety-critical system, can be implemented quickly and easily in existing FPGA (field-programmable gate-array) hardware. NML technology yields mostly-hardware constructions that respond directly to the events and conditions in the physical process being monitored and controlled. It is mostly-hardware (not necessarily all-hardware) because NML technology is compatible with, and can coordinate, use, and manage, subsystems created in the conventional way (hardware plus software). In sum, NML is an all-encompassing process-management technology.

The investigation of process-control technology that led to the creation of NML, together with years of experience using it, has yielded additional insights and information useful in its practice.

As a young engineer I was struck by the absence of a simple temporal theory applicable to process control. Out of necessity, I invented several temporal logic elements to solve common machine-control problems and continued to perform self-sponsored research on the subject. Some of my observations and findings during this investigation brought to light specific characteristics of the computational method, listed below, that act as impediments. These problematic attributes are countered and resolved by the system and practice of Natural Machine Logic.

Impediments of Computation

1. There are no verbs, dynamic operators, or temporal logic in the fundamental computer logic that has now completely pervaded our daily lives.

2. Only two primitive logic operations are necessary and sufficient, in combination, to perform the 16 possible static Boolean operations between two operands: AND, a conjunction, and NOT, the operator of negation, an adverb. These can be used or combined to perform logical AND, NOT, NAND, OR, NOR, XOR, and XNOR, as well as to produce certainty (1) and null (0). The set can also perform binary arithmetic. All of these operations are conjunctive, or coincident, in both space and time: "A AND B" is true only if both are present at one and the same time. When performed by physical logic elements, the operations are considered to be executed in a null-time zone, because the evaluations are ready at the next live moment (usually the next clock pulse or instruction), which is designed to occur after any contributing settling times or gate delays have run to completion. Boolean logic used in this manner is static, unobservant of change, and can be said to inhabit the space domain. The time domain is an untapped resource.
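
As a concrete illustration of this completeness claim (my sketch, not part of the original text), the following C program builds NAND, OR, NOR, XOR, and XNOR from nothing but AND and NOT and prints their truth table; every evaluation is a purely static, coincident operation with no notion of before or after.

#include <stdio.h>
#include <stdbool.h>

/* The two primitive operations named in the text. */
static bool AND(bool a, bool b) { return a && b; }
static bool NOT(bool a)         { return !a;     }

/* Everything else is a combination of the two primitives. */
static bool NAND(bool a, bool b) { return NOT(AND(a, b)); }
static bool OR  (bool a, bool b) { return NOT(AND(NOT(a), NOT(b))); } /* De Morgan */
static bool NOR (bool a, bool b) { return NOT(OR(a, b)); }
static bool XOR (bool a, bool b) { return AND(OR(a, b), NAND(a, b)); }
static bool XNOR(bool a, bool b) { return NOT(XOR(a, b)); }

int main(void) {
    /* Print a truth table: each row is a static, coincident evaluation
       of "A present at the same time as B", with no before/after.     */
    printf(" a b  AND NAND OR NOR XOR XNOR\n");
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf(" %d %d   %d    %d   %d   %d   %d    %d\n",
                   a, b, AND(a, b), NAND(a, b), OR(a, b),
                   NOR(a, b), XOR(a, b), XNOR(a, b));
    return 0;
}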

3. Another operation useful (and necessary) to computing is STORE, the memory operator. STORE is a transitive verb, yet it is not supported by any formal logic, all of which is static rather than dynamic.

All higher-level computer languages (i.e., in software) are ultimately decomposable to, hence built up from, sequences and combinations of the Boolean operations and STORE. In machine language, those operations are used to determine explicitly: a) the locations from which to acquire the numerical or conditional operands, b) what Boolean operations to perform, c) where to put the results, and d) the next step in the program. Every step is, and must be, predetermined.
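
To make points a) through d) concrete, here is a toy sketch in C (my illustration, not any real machine's instruction set) of such a predetermined program: each step names where its operands come from, which operation to perform, where the result goes, and which step comes next.

#include <stdbool.h>
#include <stdio.h>

/* A toy "machine" restricted to the primitives the text names: AND,
   NOT, STORE, and an explicit next-step address.                     */
enum op { OP_AND, OP_NOT, OP_STORE, OP_HALT };

struct instr {
    enum op op;         /* b) what Boolean operation to perform       */
    int src_a, src_b;   /* a) where to acquire the operands           */
    int dst;            /* c) where to put the result                 */
    int next;           /* d) the next step, fixed in advance         */
};

static bool mem[8];

int main(void) {
    const struct instr program[] = {
        { OP_AND,   0, 1, 2, 1 },   /* mem[2] = mem[0] AND mem[1] */
        { OP_NOT,   2, 0, 3, 2 },   /* mem[3] = NOT mem[2]        */
        { OP_STORE, 3, 0, 4, 3 },   /* mem[4] = mem[3]            */
        { OP_HALT,  0, 0, 0, 0 },
    };

    mem[0] = true;
    mem[1] = false;

    /* Every step of the walk through the program is predetermined. */
    for (int pc = 0; program[pc].op != OP_HALT; pc = program[pc].next) {
        const struct instr *i = &program[pc];
        switch (i->op) {
        case OP_AND:   mem[i->dst] = mem[i->src_a] && mem[i->src_b]; break;
        case OP_NOT:   mem[i->dst] = !mem[i->src_a];                 break;
        case OP_STORE: mem[i->dst] = mem[i->src_a];                  break;
        default: break;
        }
    }
    printf("mem[4] = %d\n", mem[4]);   /* NOT(true AND false) -> 1 */
    return 0;
}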

4. The tasks performed in modern times by computers (and microprocessors and microcontrollers) are no longer confined to translation or transformation of one set of static symbols into another, which was the (presumed) original intent of Alan Turing. Computers are now used to monitor and control dynamic physical processes, while the computers themselves can perform only series of static operations as means to those dynamic ends. The memory operator STORE, used to log into memory samples of a process that changes over time, thereby performs a succession of time-to-space translations. This frame-by-frame treatment of a continuous process allows the desired static operations to be executed on static, discrete values either recalled from memory or sampled directly from external sources. Process-control results are then shifted from static repositories within the computing device to the output ports (a space-to-time translation). Such approximations of temporal processes can become quite complicated, hard to understand, and, in the final analysis, not indisputably correct. Boolean logic and Turing-machine principles are anything but fundamental when they are used to deal with time.
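
The following C sketch (illustrative only; the sensor and actuator here are simulated stand-ins for real I/O) shows this frame-by-frame pattern: each pass samples the process into memory (time to space), does only static Boolean work on the stored samples, and then pushes the result back out to an output (space to time).

#include <stdbool.h>
#include <stdio.h>

#define FRAMES 16

/* Stub sensor: a simulated input that turns on partway through the run.
   In a real controller this would be a hardware read.                  */
static bool read_sensor(int frame) { return frame >= 5; }

/* Stub actuator: in a real controller this would drive an output port. */
static void write_actuator(bool on) { printf("%d", on); }

int main(void) {
    /* STORE: each pass logs a sample into memory, translating an
       ongoing (time-domain) condition into a stored (space-domain) value. */
    bool history[FRAMES] = { false };

    for (int frame = 0; frame < FRAMES; frame++) {
        history[frame] = read_sensor(frame);        /* time -> space  */

        /* Only static Boolean work on the stored, discrete samples:
           "sensor on now AND on in the previous frame".               */
        bool held = (frame > 0) && history[frame] && history[frame - 1];

        write_actuator(held);                       /* space -> time  */
    }
    printf("\n");
    return 0;
}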

5. The order of events and the chain of cause and effect are usually much more
important than how many microseconds each condition lasts, or at what clock-
time each occurred in a process. Physical processes start, continue, and stop.
They have beginnings and existence extended in time. They end. They repeat.
Several conditions can overlap, with different start and stop times for each. A
natural language narrative (say, in English) can precisely describe a process
having these characteristics no matter how that process twists and turns over
time and in space, and all without reference to clock time, the increments of
which, in any case, are arbitrary.

Given the above observations, how do we "tell the process stories" using computers equipped only with AND and NOT (and their combinations) plus STORE? It is demonstrably difficult, and it is no wonder that software production for large systems is only 50% efficient and cannot ship product guaranteed to be error-free.

6. In the early days, the active elements of computers (vacuum tubes) were large, expensive, and heat-producing. With the present level of integration, and with integrated circuits containing up to millions of transistors, there is little reason to continue sharing common resources directed by a central command point such as a central processing unit (CPU). Competition for resources slows the performance of control functions. By their very nature, linear-sequential operations actively obstruct parallel-concurrent operations.

7. The desire to conserve processing activity and avoid needless expenditure of energy suggests dealing with the elemental conditions and events of a process as close to their original form and substance as possible. It is important, therefore, to minimize the digitization and storage of process parameters, thereby avoiding massive data-processing later, with a consequent saving of energy and time.

8. Computer technology heavily favors the discrete, using frames of information and samples instead of anything continuous. This is partly due to the nearly exclusive use of the linear-sequential Turing-machine (TM) paradigm for computation, which has taken over not only all static computational tasks but continuous-process control as well.

9. Software designers live and work in native three-dimensional space and in the time domain, yet they are required to translate every temporal process reference into the space domain for data processing, after which they must transfer the results from internal memory (space-domain locations) to the active outputs and thereby rejoin the ongoing time-stream. It is no wonder they have a tough time shipping bug-free product. (See the reference "Software is Mostly Unnecessary.")

10. Conventional techniques make everything discrete through sampling, digitization, and storage in memory. Even time is treated as an ever-increasing set of arbitrary numbers, recorded at selected points or attached to parameters that have been stored in memory. Scientists and engineers then assume that everything can be dealt with as discrete items in a series of conjunctive expressions or static frames. That is only part of the story in dynamic processes: given a condition that differs between separate frames, cause and effect and temporal order cannot be computed without additional information or interpretation.
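
A small C illustration of that last point (mine, not from the original text): two stored frames can show that two conditions both changed, yet they contain nothing from which to compute which changed first or whether one caused the other.

#include <stdbool.h>
#include <stdio.h>

/* Two successive stored frames of two process conditions, A and B.
   Between the frames, both conditions went from false to true.       */
struct frame { bool a, b; };

int main(void) {
    struct frame before = { false, false };
    struct frame after  = { true,  true  };

    /* Static comparison of the frames tells us that A changed and that
       B changed, but nothing in the stored data says which changed
       first, or whether one caused the other; that ordering was lost
       when the continuous process was reduced to discrete samples.   */
    bool a_changed = before.a != after.a;
    bool b_changed = before.b != after.b;

    printf("A changed: %d, B changed: %d, order: unknown\n",
           a_changed, b_changed);
    return 0;
}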

The ten characteristics above describe impediments that designers should take
pains to design out of any new control systems and theories.

NML is an expanded system of logic together with a methodology that addresses and remedies each of the ten areas listed above. NML includes words, operators, and corresponding hardware logic elements that describe and reckon change, as well as the ordinary ones for static conditions. This novel process-control logic and language embodies the dynamic human concepts of ongoing process, including verbs, instead of being limited to the static Boolean operations (and STORE) of current step-by-step computing devices that follow the Turing paradigm. The new system operates in a fundamentally parallel-concurrent manner rather than the usual linear-sequential mode, and its architecture is flexible rather than fixed, uniquely determined by the individual process to be controlled. The process description in the language of NML is itself the design for the process's hardware controller.

Ordinary logic begins and ends with static existence, non-existence, and conjunction in both space and time. NML incorporates those factors of existence and also manages the when of ongoing process. The underlying assumptions of NML are:

1. No change happens without a causative event. All things have an initiation in space, time, or space-time, at which cause-point the creation of that effect takes place.

2. Once created or begun, a condition persists until it ends (is changed into something else, is nullified, or is destroyed).

3. There are more temporal operations and functions than conjunction. Among these are creation and destruction (as mentioned above), precedence and succession, initiation, continuation, cessation, repetition, and concurrency.

NML is a logic system of thought and action that incorporates all of these temporal concepts, and more, upon a background of continuous real time.
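
As a software analogue of assumptions 1 and 2 (my illustration, not NML itself), the C sketch below models a condition that is created by a causative event, persists with no further action, and ends only when a later event terminates it; hardware engineers will recognize the set/reset-latch character of this behavior.

#include <stdbool.h>
#include <stdio.h>

/* A condition is created by a causative event, persists on its own,
   and ends only when a later event changes it.                       */
struct condition {
    bool active;
};

static void on_begin_event(struct condition *c) { c->active = true;  } /* creation  */
static void on_end_event  (struct condition *c) { c->active = false; } /* cessation */

int main(void) {
    struct condition motor_running = { false };

    on_begin_event(&motor_running);                   /* causative event: start */
    printf("running: %d\n", motor_running.active);    /* persists: 1            */

    /* Nothing has to "refresh" the condition; it simply continues
       until an ending event occurs.                                  */
    on_end_event(&motor_running);                     /* causative event: stop  */
    printf("running: %d\n", motor_running.active);    /* ended: 0               */
    return 0;
}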

These three underlying assumptions of NML, together with remedies to the perceived impediments of computation described in items 1 through 10 above, generate the technology and practice that is Natural Machine Logic. The tables of rules, functional relationships, and worked examples in the following pages describe and illustrate this new technology and demonstrate how easy it is to use compared with microprocessors and software.

Copyright 2010
by c.moeller@ieee.org
