
1. Definitions
Three basic functions, which will help us generate all numbers:
Zero function: the point of the zero function is to let us obtain all constants with a minimal amount of machinery. The zero function z, which always returns 0, will be used here interchangeably with the symbol 0. It has to be included in this definition so that this use of terminology is unambiguously rigorous.
Successor function: s(0) = 1, s(1) = 2, s(2) = 3, and in general s(n) = n + 1. One reason this successor function is important is that with just the symbols s, (, ), and 0 (or z) we can represent every natural number.
Identity function: id^n_i(x_1, . . . , x_i, . . . , x_n) = x_i selects the i-th entry from an n-tuple.
The point of these basic functions is that we want any set of functions to have at least this much machinery - at the very least, however we define our class of functions, we want to be able to produce constants, take successors, and select numbers from a finite list.
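To make this concrete, here is a minimal Python sketch of the three basic functions (the names z, s, and idn are my own choices, not notation fixed by the notes); note, for instance, that 3 is just s(s(s(0))).

    def z(x):
        # zero function: always returns 0
        return 0

    def s(x):
        # successor function: s(n) = n + 1
        return x + 1

    def idn(n, i):
        # identity (projection) function id^n_i: picks the i-th of n arguments
        def proj(*xs):
            assert len(xs) == n
            return xs[i - 1]
        return proj

    print(s(s(s(0))))          # 3, built from successors and 0
    print(idn(3, 2)(7, 8, 9))  # 8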
Of course, we want our class of primitive recursive functions to be much more powerful than these three functions - we need to be able to combine them in certain ways. So we will take the class of primitive recursive functions to be (the smallest one that is) closed under two operations:
Composition closure: If f is a k-ary PRF, and g_1, . . . , g_k are n-ary PRFs, then
Cn[f, g_1, . . . , g_k](x_1, . . . , x_n) = f(g_1(~x), . . . , g_k(~x))
is an n-ary PRF. We might think of Cn as a generalization of the familiar composition f ∘ g to a k-ary f and a k-tuple of n-ary functions g_i.
Primitive Recursion closure: this is the powerful closure, which allows us to define functions recursively. In short, it says that if f is n-ary and g is (n + 2)-ary, then
Pr[f, g](~x, 0) = f(~x)
Pr[f, g](~x, s(y)) = g(~x, y, Pr[f, g](~x, y))
defines an (n + 1)-ary PRF, taking inputs x_1, . . . , x_n, y. So the class of PRFs includes functions defined by this kind of recursion.
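The two closure operations can likewise be sketched as Python higher-order functions (Cn and Pr are my renderings of the notation above; the predecessor check at the end is only an illustration):

    def Cn(f, *gs):
        # composition: Cn[f, g_1, ..., g_k](x) = f(g_1(x), ..., g_k(x))
        def composed(*xs):
            return f(*(g(*xs) for g in gs))
        return composed

    def Pr(f, g):
        # primitive recursion: Pr[f, g](x, 0) = f(x),
        # Pr[f, g](x, y + 1) = g(x, y, Pr[f, g](x, y))
        def h(*args):
            *xs, y = args
            if y == 0:
                return f(*xs)
            return g(*xs, y - 1, h(*xs, y - 1))
        return h

    # Small check: the predecessor function as Pr[0, id^2_1],
    # with a 0-ary constant as the base case
    pred = Pr(lambda: 0, lambda y, prev: y)
    print(pred(7), pred(0))  # 6 0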
2. Examples
So, now that we have gone through the formal definition of what constitutes a primitive recursive function, we review some elementary examples that were discussed in class. Here I use these examples to develop the methodology of finding the appropriate functions whose compositions or primitive recursion closures construct a desired function. Later, when we talk about some more interesting examples of primitive recursive functions, we will freely use the fact that all of these functions are p.r.
Addition
(1) +(x, 0) = x
(2) +(x, s(y)) = s(+(x, y))
And so + = Pr[id^1_1, Cn[s, id^3_3]].
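As a quick sanity check, the two equations can be transcribed directly into Python (a sketch only, with an illustrative name) and they do compute ordinary addition:

    def add(x, y):
        if y == 0:
            return x                # +(x, 0) = x
        return add(x, y - 1) + 1    # +(x, s(y)) = s(+(x, y))

    print(add(3, 4))  # 7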
Multiplication
(1) *(x, 0) = 0
(2) *(x, s(y)) = +(x, *(x, y))
Thus, * = Pr[z^1, Cn[+, id^3_1, id^3_3]].
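Similarly, a direct transcription of the multiplication equations, reusing addition as the already-established PRF (names are illustrative):

    def add(x, y):
        # addition, as in the previous sketch
        return x if y == 0 else add(x, y - 1) + 1

    def mul(x, y):
        if y == 0:
            return 0                   # *(x, 0) = 0
        return add(x, mul(x, y - 1))   # *(x, s(y)) = +(x, *(x, y))

    print(mul(3, 4))  # 12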
sg (signum)
(1) sg(0) = 0
(2) sg(s(y)) = 1
Therefore, sg = Pr[z, Cn[One^1, id^2_1]].
Zero test (\overline{sg})
(1) \overline{sg}(0) = 1
(2) \overline{sg}(s(y)) = 0
And so \overline{sg} = Pr[One, Cn[z^1, id^2_1]].
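Both one-argument functions are easy to transcribe; sg_bar below is my name for \overline{sg} in this sketch:

    def sg(y):
        # sg(0) = 0, sg(s(y)) = 1
        return 0 if y == 0 else 1

    def sg_bar(y):
        # zero test: sg_bar(0) = 1, sg_bar(s(y)) = 0
        return 1 if y == 0 else 0

    print(sg(0), sg(5), sg_bar(0), sg_bar(5))  # 0 1 1 0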
3. More interesting/useful examples
General summation: Before, we defined ordinary addition, which was +(x, y). But often we want to find sums of series, or more generally, the sum of a list of things. Can we be sure that all such summations can be represented by PRFs? Yes:

g(~x, y) = f(~x, 0) + . . . + f(~x, y) = Σ_{i ≤ y} f(~x, i)

can be defined recursively as

g(~x, 0) = f(~x, 0)
g(~x, y + 1) = g(~x, y) + f(~x, y + 1)
Let's try writing this at a more basic-function level. The step function is

h = Cn[+, id^{n+2}_{n+2}, Cn[f, id^{n+2}_1, id^{n+2}_2, . . . , id^{n+2}_n, Cn[s, id^{n+2}_{n+1}]]],

so that h(~x, y, w) = +(w, f(~x, s(y))), and we take

g = Pr[Cn[f, id^n_1, . . . , id^n_n, z^n], h].
Is it clear that this is actually the general summation we want? Yes. To write it out:

g(~x, 0) = f(~x, 0)
g(~x, s(y)) = h(x_1, . . . , x_n, y, g(~x, y)) = +(g(~x, y), f(~x, s(y)))
As desired. Now, we could write it in even more basic notation to get rid of the +; but we showed already that addition is a PRF, and it should be clear (perhaps it could be an exercise for the class) that we can rewrite h purely in terms of Pr, Cn, f, id, s, 0.


The point of this example is to show that our quick-and-dirty recursive definitions, like
g(~x, 0) = f(~x, 0)
g(~x, y + 1) = g(~x, y) + f(~x, y + 1)
are really valid - what is produced is actually a PRF, constructible from our basic functions and closure operations. For the rest of the examples we will not go through this tedium.
An example of a sum that we can produce with PRFs is Σ_{i ≤ x} i^3, where here f(i) = i^3 (f is 1-ary).
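As a sketch of the recursion above (with f passed in as a Python function, xs standing for the parameters ~x, and bounded_sum a name of my choosing):

    def bounded_sum(f, xs, y):
        if y == 0:
            return f(*xs, 0)                          # g(x, 0) = f(x, 0)
        return bounded_sum(f, xs, y - 1) + f(*xs, y)  # g(x, y+1) = g(x, y) + f(x, y+1)

    # The example from the text: f(i) = i^3, summed over i <= x
    cube = lambda i: i ** 3
    print(bounded_sum(cube, (), 3))  # 0 + 1 + 8 + 27 = 36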

General product: It is clear that since multiplication is a PRF, we can generate

Π_{i ≤ y} f(~x, i)

with PRFs: let

g(~x, 0) = f(~x, 0)
g(~x, y + 1) = *(g(~x, y), f(~x, y + 1))
Clearly each step is a PRF; in particular we could write

h = Cn[*, id^{n+2}_{n+2}, Cn[f, id^{n+2}_1, id^{n+2}_2, . . . , id^{n+2}_n, Cn[s, id^{n+2}_{n+1}]]]

and so on; but that's unnecessary to see that this general product is a PRF.
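The same sketch as for the bounded sum, with multiplication in place of addition (again just illustrative):

    def bounded_prod(f, xs, y):
        if y == 0:
            return f(*xs, 0)                           # g(x, 0) = f(x, 0)
        return bounded_prod(f, xs, y - 1) * f(*xs, y)  # g(x, y+1) = g(x, y) * f(x, y+1)

    print(bounded_prod(lambda i: i + 1, (), 4))  # 1 * 2 * 3 * 4 * 5 = 120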
Closure properties of conditions
In class, we discussed how to define a function using definition by cases:

f(~x) =  g_1(~x)   if C_1
         . . .
         g_k(~x)   if C_k

Clearly, this definition requires the conditions and all the g_i's to be p.r. In fact, these are the only requirements, as we mentioned in class:
Theorem 1. Suppose {C_i}_{i=1}^k is a family of mutually exclusive and jointly exhaustive conditions on ~x. Let also c_i be the characteristic function of C_i. Then, if {g_i}_{i=1}^k is a family of primitive recursive functions, and all the c_i's are primitive recursive, then so is f.
Now we can actually justify it: since the conditions are mutually exclusive and jointly exhaustive, exactly one c_i(~x) equals 1 and the rest are 0, so f(~x) = c_1(~x) * g_1(~x) + . . . + c_k(~x) * g_k(~x), and this is p.r. as an easy consequence of the general summation formula.
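A small sketch of this justification: given (characteristic function, branch) pairs for mutually exclusive, jointly exhaustive conditions, the sum of products picks out exactly the active branch. The function name by_cases and the example conditions are illustrative only.

    def by_cases(cases, *xs):
        # cases: list of (c_i, g_i) pairs; exactly one c_i(x) is expected to be 1
        return sum(c(*xs) * g(*xs) for c, g in cases)

    # Example: f(x) = |x - 3| split into the cases x >= 3 and x < 3
    f = lambda x: by_cases([(lambda x: 1 if x >= 3 else 0, lambda x: x - 3),
                            (lambda x: 1 if x < 3 else 0, lambda x: 3 - x)], x)
    print(f(1), f(5))  # 2 2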

After that discussion a reasonable question emerges: what can we do with primitive recursive conditions (i.e. ones whose characteristic function is primitive recursive)? We now show that they are closed with respect to the standard operations ¬, ∧, ∨:
(1) ¬C_1 has characteristic function \overline{sg}(c_1(~x))
(2) C_1 ∧ C_2 has characteristic function c_1(~x) * c_2(~x)
(3) C_1 ∨ C_2 is equivalent to ¬(¬C_1 ∧ ¬C_2), so it reduces to (1) and (2)
Bounded quantification: Two important symbols, or functions, or pieces of machinery that we might hope to have in our logic system are ∀ and ∃. In our definition of primitive recursive functions we are restricted to the bounded versions of these quantifiers:
(∀i ≤ y), (∃i ≤ y)
Given a primitive recursive condition C(~x, y) as defined in the previous problem, we can certainly construct the desired quantifiers as PRFs using previously described functions:

(∀i ≤ y) C(~x, i) = u(~x, y) := Π_{i ≤ y} c(~x, i)

Here, we know that generalized products are PRFs, and c is a PRF that is 1 if the condition is true and 0 otherwise; if the condition is true for all i ≤ y, then of course this returns 1, and it returns 0 otherwise.
Similarly we can write

(∃i ≤ y) C(~x, i) = e(~x, y) := sg(Σ_{i ≤ y} c(~x, i))

We have checked that each of these component functions (sg, Σ, c) is a PRF, so we should feel comfortable using the shorthand (∀i ≤ y), (∃i ≤ y) when defining other PRFs.
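A sketch of both bounded quantifiers, with c an assumed 0/1-valued characteristic function and the function names my own:

    def sg(v):
        return 0 if v == 0 else 1

    def forall_leq(c, xs, y):
        # (forall i <= y) C(x, i): product of c(x, i) over i <= y
        out = 1
        for i in range(y + 1):
            out *= c(*xs, i)
        return out

    def exists_leq(c, xs, y):
        # (exists i <= y) C(x, i): sg of the sum of c(x, i) over i <= y
        return sg(sum(c(*xs, i) for i in range(y + 1)))

    is_even = lambda i: 1 if i % 2 == 0 else 0
    print(forall_leq(is_even, (), 4), exists_leq(is_even, (), 4))  # 0 1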
Bounded minimization
This will be the last example for the day. The idea is the following: suppose h(~x, y) is p.r. Show that so is

f(~x) =  min{y : y ∈ [n], h(~x, y) = 0}   if such a y exists
         n                                 otherwise

The construction is a little bit more involved, but we can play the same spiel again (define the value at 0, and find an appropriate recursion producing p.r. f, g for Pr[f, g]):
(1) g(~x, 0) = 0
(2) g(~x, y + 1) = g(~x, y) + sg(h(~x, g(~x, y)))
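Finally, a sketch of this recursion in Python. I am assuming the convention [n] = {0, . . . , n-1} and that the finished construction takes f(~x) = g(~x, n), which is not spelled out above; g advances by 1 only while h is nonzero, so it stalls at the first zero it finds.

    def sg(v):
        return 0 if v == 0 else 1

    def bounded_min(h, xs, n):
        g = 0                      # g(x, 0) = 0
        for _ in range(n):         # unroll the recursion up to y = n
            g = g + sg(h(*xs, g))  # g advances only while h(x, g) != 0
        return g                   # least y < n with h(x, y) = 0, else n

    # Illustrative only: the least y < 10 with (y - 2) * (y - 6) = 0 is 2
    print(bounded_min(lambda y: (y - 2) * (y - 6), (), 10))  # 2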
