$$\mbox{father}(X,Y) \wedge \mbox{father}(Y,Z) \rightarrow \mbox{grandfather}(X,Z).$$

The use of variables makes this formula rather long and *unnatural*, and long formulas are difficult to learn via inductive learning.
My latest idea is to introduce procedural elements into logic so that certain operations intuitive to the human mind can be carried out via very short instructions.

## Working memory structure

Cognitive science studies have shown that the human brain can hold approximately "7 plus or minus 2" items in **working memory**.
Imagine that working memory is like a data structure (e.g., a linked list, tree, stack, queue, or graph) that can be programmed with micro-instructions.

In logic, every problem must be fashioned as **deduction**. This makes certain simple operations complicated to define. For example, this is the definition of "append" in Prolog:$$\mbox{append [Head|Tail] to List gives [Head|List2]} ← \mbox{append Tail to List gives List2}$$

which is 13 symbols long.
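As a rough illustration only (a hypothetical Python transcription of mine, not Genifer code), the recursive structure of that Prolog clause looks like:

```python
# A direct transcription of the Prolog "append" clause into Python.
# Base case: appending [] to List gives List itself.
# Recursive case: append [Head|Tail] to List gives [Head|List2],
# where List2 is the result of appending Tail to List.
def append(xs, ys):
    if not xs:                         # append [] to List gives List
        return ys
    head, tail = xs[0], xs[1:]         # [Head|Tail]
    return [head] + append(tail, ys)   # [Head|List2]

print(append([1, 2], [3, 4]))  # → [1, 2, 3, 4]
```

Even in this form, the definition needs an auxiliary decomposition of the list and a recursive call, mirroring the length of the logical formula.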

Notice that the variables are inter-dependent among the conjuncts of the formula, which creates complications in pattern matching (unification) and substitution.

Suppose that Genifer's working memory has a linked-list tool kit, with **micro-instructions** such as:

- focus on first list item
- move to next list item
- replace focused list item with X
- insert X before current focus
- etc...

Such instructions are very short and are usually atomic or take only one argument, which is a significant advantage for machine learning.
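A minimal sketch of such a tool kit (the class and method names are my own assumptions; Genifer's actual instruction set is not specified here):

```python
# A toy working memory: a list with a "focus" pointer, manipulated
# only through micro-instructions that are atomic or take one argument.
class WorkingMemory:
    def __init__(self, items):
        self.items = list(items)
        self.focus = 0

    def focus_first(self):        # focus on first list item
        self.focus = 0

    def move_next(self):          # move to next list item
        if self.focus < len(self.items) - 1:
            self.focus += 1

    def replace(self, x):         # replace focused list item with X
        self.items[self.focus] = x

    def insert_before(self, x):   # insert X before current focus
        self.items.insert(self.focus, x)

wm = WorkingMemory(["a", "b", "c"])
wm.move_next()
wm.replace("B")
wm.insert_before("x")
print(wm.items)  # → ['a', 'x', 'B', 'c']
```

Each operation here is a single step with at most one argument, so a learner searching over sequences of such instructions faces a much smaller branching factor than one searching over long logical formulas.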

## Eliminating variables in logic

Consider again the grandfather formula:
$$\mbox{father}(X,Y) \wedge \mbox{father}(Y,Z) \rightarrow \mbox{grandfather}(X,Z).$$

**Relation algebra** (not to be confused with "relational algebra" in database theory) offers a way to eliminate variables. It can be regarded as a form of **combinatory logic** (the major theory of eliminating variables), focused on relations.

For example, the grandfather example is formulated as:

$$\mbox{father} \circ \mbox{father} = \mbox{grandfather}$$

which is similar to the natural-language statement "father's father is grandfather". Notice that the above is a complete statement about the operators "father" and "grandfather".
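As an illustration only (encoding relations as Python sets of pairs is my own assumption, not Genifer's representation), relational composition eliminates the middle variable $Y$ mechanically:

```python
# Relations as sets of (x, y) pairs, where (x, y) reads "x is the father of y".
father = {("john", "paul"), ("paul", "pete")}

def compose(r, s):
    # (x, z) is in r ∘ s iff some y links (x, y) in r and (y, z) in s.
    return {(x, z) for (x, y1) in r for (y2, z) in s if y1 == y2}

grandfather = compose(father, father)
print(grandfather)  # → {('john', 'pete')}
```

The middle element (here "paul") never appears in the result, just as the variable $Y$ disappears from the variable-free equation.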

Look at the following inference carried out in relation algebra, using equations purely and **substitution of equals for equals**:
$\mbox{john = father paul}$

$\mbox{paul = father pete}$

$\mbox{john = father father pete = grandfather pete}$

Also notice how similar the above derivation is to natural language.

## Genifer logic

I suggest using an eclectic mix of logical and procedural elements; such a set will almost certainly be Turing-universal if it is not too impoverished. The logical elements may include:

- equations
- probabilistic conditioning (Bayesian arrow)
- subsumption
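A hypothetical sketch of how such a mixed formula language might be represented (all class and field names are my own assumptions, not Genifer's):

```python
from dataclasses import dataclass

# Hypothetical AST nodes for an eclectic formula language.
@dataclass
class Equation:        # lhs = rhs, e.g. "father . father" = "grandfather"
    lhs: str
    rhs: str

@dataclass
class Conditional:     # Bayesian arrow: consequent ← antecedent, with probability p
    antecedent: str
    consequent: str
    p: float

@dataclass
class Subsumption:     # the general concept subsumes the specific one
    general: str
    specific: str

kb = [
    Equation("father . father", "grandfather"),
    Conditional("cloudy", "rain", 0.7),
    Subsumption("animal", "dog"),
]
print(len(kb))  # → 3
```

The point is only that the three kinds of statement can live side by side in one knowledge base, each with its own inference rule (rewriting, conditioning, and subsumption checking respectively).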

## Learning

A hybrid formula of the form:

$$\mbox{conditions} \rightarrow \mbox{action}$$

is neither logical nor procedural, but it can be learned via logical inductive learning or reinforcement learning.

- In logical learning, the above conditional statement is construed as "if __conditions__ are satisfied, it would be appropriate to carry out the __action__". Such a statement becomes declarative and thus can be learned via logical inductive learning.
- In the procedural setting, when a reward is received, the sequence of actions leading to it would be recorded, and that sequence would include the above hybrid formula. The formula can then be up-scored as usual.

These two learning processes may occur independently.
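A toy sketch of the procedural (reinforcement) side, with hypothetical rule names and scoring details of my own:

```python
# Toy up-scoring of hybrid rules: when a reward arrives, every rule that
# fired in the recorded action sequence has its score increased.
scores = {"conditions -> action": 0.0, "other -> action2": 0.0}

def reward(fired_rules, r, lr=0.1):
    # Up-score each rule that participated in the rewarded sequence.
    for rule in fired_rules:
        scores[rule] += lr * r

reward(["conditions -> action"], r=1.0)
print(scores["conditions -> action"])  # → 0.1
```

A logical inductive learner could adjust the same scores from declarative evidence, which is what allows the two processes to run independently over one rule base.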
